  • If your focus is LLMs, get a 3090 GPU. VRAM is the most important spec here because it determines which models you can load and run at a decent speed, and having 24 GB lets you run the mid-range models that specifically target that amount of memory, since it’s a very standard amount for hobbyists to have. Those mid-range models are viable for coding; the smaller ones less so. Looking at prices, it seems you can get this card for 1–2k depending on whether you go used or refurbished. I don’t know if better price options are coming soon, but with the RAM shortage and huge general demand it doesn’t seem likely.

    If you want to focus on image or video generation instead, I understand there are advantages to newer-generation cards, since certain features and raw speed matter more there than VRAM alone, but I know less about this.
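
    The “VRAM determines what you can load” point above boils down to simple arithmetic: weight memory is roughly parameter count times bits per weight divided by 8, plus some overhead for the KV cache and runtime. A minimal sketch (the 2 GB overhead figure is my own rough assumption, not a measured number):

    ```python
    def vram_gb(params_billion, bits_per_weight, overhead_gb=2.0):
        """Rough VRAM estimate: weights (params * bits / 8 bytes) plus a
        fixed overhead guess for KV cache and runtime buffers."""
        weights_gb = params_billion * bits_per_weight / 8
        return weights_gb + overhead_gb

    # A 30B-class model quantized to 4 bits: ~17 GB, fits in a 3090's 24 GB.
    print(round(vram_gb(30, 4), 1))   # 17.0
    # The same model at 16-bit: ~62 GB, far beyond any single consumer card.
    print(round(vram_gb(30, 16), 1))  # 62.0
    ```

    Real usage varies with context length and backend, but this is why 24 GB lands right at the sweet spot for quantized mid-range models.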