"not a great thing to remove the obviously stupid answers. They perform a salutatory task of educating the public about the limitations and giving them memorable hooks to remember not to trust these things." #llm
1270193
"Google’s poor testing is hardly in doubt."

Also Microsoft's. Either massive overconfidence, or they know "80%" is easy to achieve and going further isn't worth it for the hype train. They just need the checkbox checked off. #llm
1270194
Current goal is to reduce the *obviously* stupid answers (or un-human answers) to questions, so that nobody notices the lack of intelligence/context/thinking behind it all.
1270195
"pre-alpha trash that’s worse than traditional search" google #llm
1270196
@33MHz Isn’t it kind of impressive how bad this is? Like, what the heck?! How are these jackholes allowed to continue making decisions for the company?
1270224
"Google put $1.5B into crypto." now big companies pivoting to use that worthless crypto hardware for something.

MS reporting a significant increase in its own greenhouse gas emissions b/c of #llm use?
1270203
#LLM the perfect workhorse, b/c we will need *more* people to keep the rider on the horse, not fewer. A centralized, mono-/mega-product that you keep putting more into until you win or lose.
1270204
The gift that keeps on taking.
1270205
@33MHz Yah, you know, this LLM hyping really has reminded me of the height of the Crypto/NFT hype train. Facebook was the "Metaverse" company but is now suddenly the AI company. Sure.
1270221
@bayprogrammer @33MHz I think the turning point for me was getting a machine I could run models on locally. Running llama3:70b and mixtral 8x22b on my own laptop has actually been a productivity boost in many ways. Importantly, none of my data leaves my machine.
1270295
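A minimal sketch of what querying a locally run model like llama3:70b might look like, assuming it's served through something like Ollama's HTTP API on localhost:11434; the endpoint, model tag, and prompt here are illustrative assumptions, not anything specified in the posts above.

```python
import requests

# Minimal sketch: ask a locally served model a question without any data
# leaving the machine. Assumes an Ollama-style HTTP API on localhost:11434
# (an assumption, not stated in the thread).
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"

def ask_local_model(prompt: str, model: str = "llama3:70b") -> str:
    """Send a single chat turn to the local model and return its reply."""
    resp = requests.post(
        OLLAMA_CHAT_URL,
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # one complete JSON response instead of a stream
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

if __name__ == "__main__":
    print(ask_local_model("Summarize why local inference keeps data private."))
```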
@kyle @bayprogrammer 👍👍 yah, that's nice to have. And to be able to tune it to react the way you want, maybe.
1270297
@33MHz That will come later; still figuring out all of this and how RAG/embeddings/etc all work. https://github.com/ItzCrazyKns/Perplexica is pretty cool though. Building my own assistant now that works how I want / doesn't export my data @bayprogrammer
1270301
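A rough local-only sketch of the RAG/embeddings idea mentioned above: embed a few notes, retrieve the closest ones by cosine similarity, and hand them to the chat model as context. The /api/embeddings endpoint, the nomic-embed-text embedding model, and the example notes are all illustrative assumptions (this is not how Perplexica itself is built).

```python
import requests
import numpy as np

# Rough local-only RAG sketch: embed documents, retrieve by cosine similarity,
# and feed the matches to the chat model as context. Assumes an Ollama-style
# server on localhost:11434 with an embedding model (here "nomic-embed-text")
# already pulled; both model names are assumptions for illustration.
BASE = "http://localhost:11434"

def embed(text: str) -> np.ndarray:
    """Embed one piece of text with the local embedding model."""
    r = requests.post(f"{BASE}/api/embeddings",
                      json={"model": "nomic-embed-text", "prompt": text})
    r.raise_for_status()
    return np.array(r.json()["embedding"])

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k docs whose embeddings are most similar to the query."""
    q = embed(query)
    vecs = [embed(d) for d in docs]
    sims = [float(q @ v / (np.linalg.norm(q) * np.linalg.norm(v))) for v in vecs]
    top = sorted(range(len(docs)), key=lambda i: sims[i], reverse=True)[:k]
    return [docs[i] for i in top]

def answer(query: str, docs: list[str]) -> str:
    """Stuff the retrieved notes into the prompt and ask the local chat model."""
    context = "\n\n".join(retrieve(query, docs))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    r = requests.post(f"{BASE}/api/chat",
                      json={"model": "llama3:70b",
                            "messages": [{"role": "user", "content": prompt}],
                            "stream": False})
    r.raise_for_status()
    return r.json()["message"]["content"]

if __name__ == "__main__":
    notes = ["Mixtral 8x22B is a sparse mixture-of-experts model.",
             "Keeping inference local means no data leaves the laptop."]
    print(answer("Why run models locally?", notes))
```

For anything bigger than a handful of notes, the per-query re-embedding above would be swapped for a persistent vector store, but the retrieval-then-prompt shape stays the same.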
@33MHz fun fact: I broke the precursor to the precursor to Gemini by asking it questions in Zalgo text.
1270296