

Oh no, how did we use it for decades and were even able to talk to Google Chat users.




As others said, it all depends on what you expect. I run Stable Diffusion on my gaming PC with 32 GB RAM and an AMD 9070 XT, and it works fine. It also did on a 6800 XT before that one died. A GPU with 16 GB of VRAM helps a lot; I'd say 12 GB is the minimum. Less will limit you in both model choice and speed.
For LLMs, just try it out: smaller models work fine without special hardware, as long as you're the only user. There are tools like Jan or LM Studio that make it easy to run them.
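To get a feel for why 12 to 16 GB matters, here's a rough back-of-the-envelope sketch (my own estimate, not from Jan or LM Studio): weight memory is roughly parameter count times bytes per weight, plus some headroom for the KV cache and activations.

```python
# Hypothetical helper: rough VRAM estimate for running an LLM locally.
# weights ≈ params × bytes-per-weight, plus ~20% overhead for KV cache
# and activations (the 20% figure is an assumption, not a measurement).
def approx_vram_gb(params_billion: float, bits_per_weight: int) -> float:
    weights_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits ≈ 1 GB
    return round(weights_gb * 1.2, 1)  # ~20% headroom

# A 7B model quantized to 4 bits fits comfortably in 12 GB of VRAM;
# the same model at 16-bit floats would not.
print(approx_vram_gb(7, 4))   # ≈ 4.2
print(approx_vram_gb(7, 16))  # ≈ 16.8
```

This is why quantized models (4-bit or 5-bit) are what most people actually run on consumer GPUs.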
Sorry about that! I didn't mean to offend you at all. The thing is, you're spot on. So think of my comment as the rant of a bitter, nostalgic guy who lived through the days of XMPP, IRC, and all that, and saw what could be done. I just hold a bit of a grudge against the way instant messaging and chat work now. Everything's locked into closed systems, it's slowly turning to rubbish, and decent solutions don't catch on because they're "too complicated", ignoring all the good stuff we'd get if we managed to get past that first hurdle.