• barsoap@lemm.ee · 10 months ago

    What does it mean to “know what a word means”?

    For one, ChatGPT has no idea what a cat or dog looks like, and no understanding of how their characters of movement differ. Lacking that kind of non-verbal understanding, when analysing art that actually is in its domain, that is, poetry, it couldn’t even begin to make sense of the question “does this poem have feline or canine qualities?” The best it can do is recognise that there are neither cats nor dogs in it and, being stumped, make up some utter nonsense. Maybe it has heard of “catty”, and that dogs are loyal, and will look for those themes – but feline and canine as in elegance? Forget it, unless it has read a large corpus of poetry analysis that uses those terms: it can parrot that pattern matching, but it can’t do the pattern matching itself. It cannot transfer knowledge from one domain to another when it has no access to one of those domains.
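    (For anyone who wants to actually run the probe: a minimal sketch, assuming the `openai` Python client with an OPENAI_API_KEY in the environment. The model name and prompt wording are placeholders of mine, not anything from this thread; you’d paste in a poem that mentions neither cats nor dogs.)

        # Probe sketch: ask a chat model the "feline poetry" question and see
        # whether the answer engages with elegance/movement or merely
        # keyword-matches on cats and dogs.
        from openai import OpenAI

        client = OpenAI()  # reads OPENAI_API_KEY from the environment

        poem = "..."  # placeholder: a poem mentioning neither cats nor dogs

        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{
                "role": "user",
                "content": "Does this poem have feline or canine qualities, "
                           "as in elegance and character of movement? Why?"
                           "\n\n" + poem,
            }],
        )
        print(response.choices[0].message.content)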

    And that’s just the tip of the iceberg. As humans we’re not really capable of purely symbolic thought, so it’s practically impossible for us to appreciate just how limited those systems are, given that they’re not embodied.

    (And, yes, Stable Diffusion has some understanding of feline vs. canine as in elegance – but it’s an utter moron in other areas: it can’t even count to one.)


    Then, that all said, and even more fundamentally: ChatGPT (like all other current AI algorithms we have) is a T2 system, not a T3 system. It comes with rules for how to learn; it doesn’t come with rules enabling it to learn how to learn. As such it never thinks – it cannot think, as in “mull over”. It reacts with what passes for a gut in AI land, and never with “oh, I’m not sure about this, so let me mull it over”. It is in principle capable of not being sure, but that doesn’t mean it can rectify the situation.
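    (To make the T2/T3 distinction concrete, a loose toy sketch in Python – the names and the learning-rate trick are my own illustration, not a standard formulation: a T2 learner’s update rule is hard-wired, while a T3 learner can also revise the rule it learns by.)

        # T2: the learning rule itself (a plain gradient step with a fixed
        # learning rate) is baked in and can never change.
        def t2_step(weight: float, gradient: float, lr: float = 0.1) -> float:
            return weight - lr * gradient

        # T3 (toy version): the learner also watches how learning is going
        # and rewrites its own rule – here, crudely, by shrinking the
        # learning rate whenever progress stalls.
        class T3Learner:
            def __init__(self) -> None:
                self.lr = 0.1
                self.prev_loss = float("inf")

            def step(self, weight: float, gradient: float,
                     loss: float) -> float:
                if loss >= self.prev_loss:  # learning stalled:
                    self.lr *= 0.5          # change *how* we learn
                self.prev_loss = loss
                return weight - self.lr * gradient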

      • barsoap@lemm.ee · 10 months ago

        Coming up with questions that would be trivial for any human to answer but impossible for ChatGPT is quite tricky.

        Which is why I came up with the “feline poetry” example. It’s quite a simple concept for a human, even one not particularly poetry-inclined, yet if no one has ever written about the concept it’s going to be an uphill battle for ChatGPT. And, no, I haven’t tried; I also didn’t mean it as some kind of dick-measuring contest. I simply wanted to explain what kind of thing ChatGPT really has trouble with.

        Have you ever actually seen an iceberg or just read about them?

        As a matter of fact, yes, I have. North Cape, Norway.

        ChatGPT doesn’t learn. It’s a completely static model that doesn’t change.

        If you ask me, ChatGPT is also its training procedure, the same way humanity is also its evolution.
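        (A sketch of what “completely static” means in practice, assuming PyTorch and a stand-in network of mine: at inference time the weights are frozen and no gradients flow, so nothing the model “experiences” can change it.)

            import torch

            model = torch.nn.Linear(8, 8)  # stand-in for the real network
            model.eval()                   # inference mode

            with torch.no_grad():          # no gradients, no weight updates
                output = model(torch.randn(1, 8))

            # Whatever "learning" the system embodies happened earlier, in
            # the training loop; the deployed artifact only replays its
            # result.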