



It doesn’t; it generates incorrect information. That’s because AI doesn’t think or dream: it’s a generative technology that outputs information based on whatever went in. It can’t hallucinate because it can’t think or feel.
Trying to strip the “emotional baggage” out of a word is not how language works. A term means something and applies somewhere, and generative models do not have the capacity to hallucinate. If you need to apply a human term to a non-human technology that pretends to be human, “confabulate” is the better fit: hallucination is a false perception in response to a stimulus, while confabulation is, in simple terms, confidently making things up — bullshitting.
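To make the “outputs based on whatever went in” point concrete, here’s a minimal toy sketch of next-token sampling (everything in it — the probability table, the strings — is invented for illustration; no real model works off a lookup table like this, but the core loop is the same shape). Notice that nothing in it ever checks a fact: plausibility under the learned statistics is the only criterion, which is exactly the profile of confabulation rather than hallucination.

```python
import random

# Hypothetical next-token statistics "learned" from some corpus.
# A real model computes these with a neural network, but the point
# stands: it stores likelihoods of continuations, not truths.
NEXT_TOKEN_PROBS = {
    "the capital of": {"France": 0.6, "Atlantis": 0.4},
    "France": {"is": 1.0},
    "Atlantis": {"is": 1.0},
    "is": {"Paris.": 0.5, "Poseidonia.": 0.5},
}

def generate(prompt: str, steps: int = 3) -> str:
    tokens = [prompt]
    for _ in range(steps):
        probs = NEXT_TOKEN_PROBS.get(tokens[-1])
        if probs is None:
            break
        # Sample proportionally to learned frequency.
        # There is no truth check anywhere in this loop.
        choices, weights = zip(*probs.items())
        tokens.append(random.choices(choices, weights=weights)[0])
    return " ".join(tokens)

print(generate("the capital of"))
# Can print "the capital of France is Poseidonia." --
# statistically well-formed, factually empty.
```

The failure mode isn’t a corrupted perception of reality; it’s fluent output assembled with no concept of reality at all.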