• FatCrab@lemmy.one · 1 day ago

    Grok is closed source, I believe, so it’s hard to say. But, ignoring unknown architecture or latent-space details, this could be a lot of things. The way you seem to be using the term “hallucination” effectively applies to EVERY output of a GPT. These models reason probabilistically across a space with billions of dimensions that maps language components, with individual dimensions taking on semantic values through a sort of mathematical differentiation during training. So this particular behavior could be the result of influence from any number of things, tbh.
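
    To make the “every output is probabilistic” point concrete, here’s a toy sketch (not Grok’s or any real model’s actual code; the vocabulary and logit values are made up for illustration): the model scores every token in its vocabulary, softmaxes those scores into probabilities, and samples the next token from that distribution. Every token it emits comes from a draw like this, which is why hallucination in that broad sense applies to all of the output, not just the obviously wrong bits.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical tiny vocabulary and made-up logits, standing in for the
    # per-token scores a GPT's final layer would produce at one step.
    vocab = ["the", "cat", "sat", "on", "mat", "moon"]
    logits = np.array([2.1, 0.3, 1.7, 0.9, 1.2, 0.4])

    def sample_next_token(logits, temperature=1.0):
        """Softmax the logits and draw one token index from the distribution."""
        scaled = logits / temperature
        probs = np.exp(scaled - scaled.max())  # subtract max for numerical stability
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs), probs

    idx, probs = sample_next_token(logits)
    print("sampled:", vocab[idx])
    print({t: round(float(p), 3) for t, p in zip(vocab, probs)})
    ```

    Even the most likely token here only gets a fraction of the probability mass, so whether the model says something sensible or something weird, it got there by the same sampling process.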