I agree, bring on the weird. I don’t need accurate, I want hallucinated novelty. This is like people who treat an LLM like a dictionary or search engine and then complain about inaccuracy. They don’t understand that this is to be expected of a synthesized answer.
Hallucinations are an essential part of the value these things bring.