TheBigBrother@lemmy.world to Selfhosted@lemmy.world · edited 4 months ago

What's the bang-for-the-buck, go-to setup for AI image generation and LLM models?
kata1yst@sh.itjust.works · edited 4 months ago

KoboldCPP or LocalAI will probably be the easiest out-of-the-box options that offer both image generation and LLMs. I personally use vLLM and HuggingChat, mostly for vLLM's efficiency and speed.
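Both KoboldCPP and vLLM can expose an OpenAI-compatible HTTP API (vLLM via `vllm serve`, KoboldCPP via its built-in server), so one client script works against either. A minimal stdlib-only sketch, assuming vLLM's default port 8000; the model name and prompt here are placeholders, not values from the thread:

```python
import json
import urllib.request

# Assumed endpoint: vLLM's OpenAI-compatible server defaults to port 8000;
# KoboldCPP serves a similar API on port 5001 by default.
API_URL = "http://localhost:8000/v1/chat/completions"


def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def ask(prompt: str, model: str = "local-model") -> str:
    """POST the prompt to the local server and return the reply text."""
    payload = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        API_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Requires a running local server, e.g.:
#   reply = ask("Name one self-hosted image generation tool.")
```

Because the payload format is the same, switching backends is just a matter of changing `API_URL`.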
DarkThoughts@fedia.io · 4 months ago

It's probably dead, but Easy Diffusion is IMO the easiest for image generation. KoboldCPP can be a bit weird here and there, but it was the first thing that worked for me for local text gen + GPU support.
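For the "first thing that worked" route with KoboldCPP, a quick-start looks roughly like the following. This is a sketch under assumptions: a Linux box with a CUDA GPU, a GGUF model file already downloaded, and a release binary name that may vary between releases — check the project's releases page for the current asset.

```shell
# Grab a prebuilt KoboldCPP binary (asset name is an assumption; verify on
# the project's GitHub releases page) and make it executable.
wget https://github.com/LostRuins/koboldcpp/releases/latest/download/koboldcpp-linux-x64
chmod +x koboldcpp-linux-x64

# Load a local GGUF model with CUDA offload; --gpulayers controls how many
# layers go to the GPU. The web UI then comes up on port 5001.
./koboldcpp-linux-x64 --model ./model.gguf --usecublas --gpulayers 35
```

On machines without an NVIDIA GPU, dropping `--usecublas` falls back to CPU inference, which is slower but still works.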