Reading the word “hallucination” in an ML paper is really depressing.
-
It’s not a technical or scientific term; it’s not an apt comparison or metaphor for what is occurring. It did not arise naturally from how generative ML models actually work.
It’s a stolen fucking term that came from a marketing department to imply that LLMs and other generative ML models are human or sentient in some fashion. They are not. All LLM output is a “hallucination”; by their very nature, these models generate text. That some of it happens to be correct does not make it mechanistically different from the incorrect text. It should not be used in publications; all it does is distort the fucking process and do a little propaganda. ffs
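(If anyone wants that point made concrete: here’s a toy sketch, with entirely made-up prompts and probabilities rather than anything from a real model. The “right” answer and the “wrong” answer fall out of the exact same sampling loop; factual correctness never enters the mechanism.)

```python
import random

# Toy stand-in for a trained model's next-token distribution.
# The prompts and probabilities here are invented for illustration only.
NEXT_TOKEN_PROBS = {
    "The capital of France is": {"Paris": 0.7, "Lyon": 0.3},
    "The capital of Australia is": {"Sydney": 0.6, "Canberra": 0.4},
}

def generate(prompt: str) -> str:
    """Sample one continuation. Nothing in this loop checks whether it's true."""
    dist = NEXT_TOKEN_PROBS[prompt]
    tokens, weights = zip(*dist.items())
    return random.choices(tokens, weights=weights, k=1)[0]

if __name__ == "__main__":
    random.seed(0)
    for prompt in NEXT_TOKEN_PROBS:
        # A correct continuation and an incorrect one come out of the
        # exact same mechanism; one just happens to match the world.
        print(prompt, generate(prompt))
```
-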
The fact that all these LLM booster “AGI” shits are eugenicists who would happily suggest anyone having actual hallucinations should just die is something that really should dissuade people from using this term.
-
The vision of the world these assholes want is one where people in need are already dead.
-
@aud hallucination is when i go on a colonialist ayahuasca trip and have yet another business idea that doesn't pan out but it doesn't matter because daddy underwrites me anyway
-
Julian Crosson-Hill, ACC replied to Asta [AMP]
@aud I see it being added to edtech products and shudder. It absolutely should not be used in teaching.
-
@[email protected] I was just in the shower and thinking about how maybe they think it’s not a terrifying concept because they totally had one during their ayahuasca trip