In my server I currently have a 9th-gen Intel i7 CPU with integrated Intel graphics.

I don’t use or need AI or LLM stuff, but we use Jellyfin extensively in the family.

So far Jellyfin has always worked perfectly fine, but I could add an NVIDIA 2060 or a 1060 for free. Would it be worth it?

As for power consumption, will the increase be noticeable? Should I do it or pass?
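A rough way to answer the power question is to multiply the card's extra idle draw by hours per year. The wattage and electricity price below are assumptions for illustration (a discrete GPU often idles around 10 W; check your card's actual draw with `nvidia-smi`), not measured figures:

```python
# Rough estimate of the extra electricity cost of leaving a discrete GPU
# in an always-on server. IDLE_WATTS and PRICE_PER_KWH are assumptions --
# measure your card's real idle draw (e.g. with nvidia-smi) and use your
# local tariff.

IDLE_WATTS = 10.0       # assumed extra idle draw of the GPU, in watts
PRICE_PER_KWH = 0.30    # assumed electricity price, in EUR per kWh

extra_kwh_per_year = IDLE_WATTS * 24 * 365 / 1000   # watt-hours -> kWh
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH

print(f"~{extra_kwh_per_year:.1f} kWh/year, ~{extra_cost_per_year:.2f} EUR/year")
# -> ~87.6 kWh/year, ~26.28 EUR/year under these assumptions
```

So at an assumed 10 W of extra idle draw, the increase is real but modest; transcoding bursts add more, but only while streams are actually being transcoded.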

  • InverseParallax@lemmy.world · 4 days ago
    Google is pushing AV1 because of patents, but H.266 is just plain better tech, even if it’s harder to encode.

    This same shit happened with H.265 and VP9, and before that with Vorbis/Opus and AAC.

    They’ll come back to it because it’s a standard and has higher quality.

    Maybe this is the one time AV1 somehow wins out on patents, but I’m encoding AV1 and I’m really not impressed; it’s basically dressed-up HEVC, maybe a 10% improvement at most.

    I’ve seen VVC and it’s really flexible: it shifts gears on a dime between high motion and fine detail, which is basically what your brain notices most. To me AV1 is actually kind of worse than HEVC at that; it’s sluggish at those shifts, even if it is better overall.