bobburger@fedia.io to Selfhosted@lemmy.world • Any of you have a self-hosted AI "hub"? (e.g. for LLM, stable-diffusion, ...)
6 months ago
Llamafile is a great way to run an LLM locally. Inference is incredibly fast on my ARM MacBook and RTX 4060 Ti; it's okay on my Intel laptop running Ubuntu.
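For anyone who hasn't tried it: a llamafile is a single self-contained executable, so getting started is just download-and-run. A minimal sketch (the filename here is a placeholder; grab an actual .llamafile from the project's releases or from Hugging Face):

```shell
# Make the downloaded llamafile executable (it bundles the model
# weights and the llama.cpp runtime in one file).
chmod +x model.llamafile

# Run it; by default it starts a local web server and opens a
# chat UI in your browser at http://localhost:8080.
./model.llamafile
```

On Linux you may need to register binfmt support or run it through sh the first time, depending on your distro; the project README covers those cases.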
I'm not sure what features you're looking for, but Quarto has a lot of nice features that make it easy to self-host a blog.