coffee_with_cream@sh.itjust.works to Privacy@lemmy.ml • Why does my calculator need a privacy policy? It's a calculator!
24 · 1 month ago
We send your calculations to your fourth grade teacher
You probably want 48 GB of VRAM or more to run the good stuff. I recommend renting GPU time instead of using your own hardware, via AWS or other vendors; runpod.io is pretty good.
Imo it’s worthwhile to just rent the expensive GPU time and run the biggest model available. It still amounts to very little overall, and you get much better results. Project dependent, of course.
Uncensored models are so much better, too. ChatGPT is like one of those plastic children’s toy hammers, while real models are titanium hammers.
For anyone doing a serious project, it’s much more cost-effective to rent a node and run your own models on it. You can spin nodes up and down as needed, cache often-used queries, etc.
Put yourself on public camera streams first as a test. I bet this guy is not always on his best behavior.