• silverlose@lemm.ee · edit-2 · 3 days ago

    FWIW speech to text works really well on Apple stuff.

    I’m not exactly sure what info you’re looking for, but: my gaming PC is headless and sits in a closet. I run ollama on it and connect using a client called “ChatBox”. It’s got an RTX 3060, which fits the whole model, so it’s reasonably fast. I’ve tried the 32b model and it does work, but slowly.

    Honestly, ollama was so easy to set up that if you have any experience with computers, I recommend giving it a shot. (Could be a great excuse to get a new GPU 😉)
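
    For anyone curious, the setup described above looks roughly like this. This is a sketch, not the commenter’s exact commands; the model name and the server IP are placeholders you’d swap for your own:

    ```shell
    # On the headless PC: install ollama (official install script)
    curl -fsSL https://ollama.com/install.sh | sh

    # Listen on all interfaces so other machines on the LAN can reach it
    OLLAMA_HOST=0.0.0.0:11434 ollama serve &

    # Pull a model small enough to fit in the GPU's VRAM (example model)
    ollama pull llama3.1:8b

    # From the client (ChatBox, curl, etc.), point at the server's address:
    curl http://192.168.1.50:11434/api/generate \
      -d '{"model": "llama3.1:8b", "prompt": "Hello", "stream": false}'
    ```

    In ChatBox you’d just enter the server’s address and port in its ollama provider settings instead of using curl.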

    • tupalos@lemmy.world · 3 hours ago

      Yeah, I think Apple’s speech to text is pretty decent, but ChatGPT uses the Whisper API to return the text, and it just seems a lot more reliable, especially when it comes to understanding random words in context.

      How much VRAM do you have on the 3060 to be able to fit the whole thing on the GPU?

      • silverlose@lemm.ee · 1 hour ago

        True. Honestly, Apple’s software is just getting worse by the day. It’s sad.

        It’s the version with 12 GB of VRAM. I use it to game, though. If you want a real GPU for this, I hear the Tesla P40 is the best.
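
        A rough rule of thumb for why 12 GB fits some models and not others (my own back-of-envelope sketch, not from this thread): a 4-bit-quantized model needs about half a byte per parameter for weights, plus some overhead for the KV cache and runtime:

        ```python
        def fits_in_vram(params_b: float, bits: int = 4,
                         overhead_gb: float = 1.5, vram_gb: float = 12.0) -> bool:
            """Rough estimate: weights take params * bits/8 bytes, plus cache/overhead.

            params_b is the parameter count in billions; overhead_gb is a guessed
            allowance for KV cache and runtime buffers.
            """
            weights_gb = params_b * bits / 8  # billions of params * bytes/param ≈ GB
            return weights_gb + overhead_gb <= vram_gb

        # A 14B model at 4-bit: ~7 GB of weights, fits in 12 GB
        print(fits_in_vram(14))   # True
        # A 32B model at 4-bit: ~16 GB of weights alone, spills into system RAM
        print(fits_in_vram(32))   # False
        ```

        That spillover into system RAM is why the 32b model “works but slowly” on a 12 GB card.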