• jordanlund@lemmy.world

    “Now wait for 1,000 Hz content and capable GPUs.”

    Forget the content and GPU, you need an input port capable of that.

    HDMI 2.1 and DisplayPort 1.4 cap out at, what? 240?

      • Fermion@mander.xyz

        So you just need three 4090s with one DisplayPort each to the monitor, and a whole new version of SLI.

          • Fermion@mander.xyz

            I vaguely remember that being a thing for early commercial 8k projectors, but I don’t know anything about the implementation.

            • cabb@lemmy.dbzer0.com

              Two ports at once were used for Samsung’s 5120x1440 240 Hz monitors: each port refreshes half of the screen, and there are two scanlines going from left to right. Using the calc here, you might be able to use two DP 2.1 UHBR80 cables with DSC and nonstandard timings to run 4K 1000 Hz 10-bit.
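A rough sanity check of that bandwidth claim (the ~3:1 DSC ratio is an assumed typical figure, blanking overhead is ignored, and "80 Gbit/s per cable" follows the comment's numbers):

```python
# Back-of-the-envelope bandwidth check for 4K 1000 Hz 10-bit over two DP links.
# Assumes a typical ~3:1 DSC compression ratio and ignores blanking overhead.
def data_rate_gbps(width, height, hz, bits_per_channel, channels=3):
    """Uncompressed video data rate in Gbit/s."""
    return width * height * hz * bits_per_channel * channels / 1e9

raw = data_rate_gbps(3840, 2160, 1000, 10)   # ~248.8 Gbit/s uncompressed
with_dsc = raw / 3                           # ~82.9 Gbit/s after ~3:1 DSC
# Two 80 Gbit/s DP 2.1 links, minus 128b/132b channel-coding overhead:
available = 2 * 80 * (128 / 132)             # ~155.2 Gbit/s usable

print(f"need ~{with_dsc:.1f} Gbit/s, have ~{available:.1f} Gbit/s")
```

So on paper two such links with DSC do look sufficient, with headroom left for blanking and nonstandard timings.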

      • Rexios@lemm.ee

        Isn’t 4K 360 Hz equivalent to 1080p 1440 Hz? I wouldn’t expect 1000 Hz at 4K any time soon, but 1080p in competitive FPS is easy.
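The equivalence checks out for raw pixel throughput, at least: 3840x2160 is exactly four 1920x1080 frames, so 4K at 360 Hz pushes the same pixels per second as 1080p at 1440 Hz (actual GPU cost per frame doesn’t scale this cleanly, though):

```python
# Raw pixel throughput comparison: 4K@360 Hz vs. 1080p@1440 Hz.
def pixel_rate(width, height, hz):
    """Pixels pushed to the display per second."""
    return width * height * hz

assert pixel_rate(3840, 2160, 360) == pixel_rate(1920, 1080, 1440)
print(pixel_rate(3840, 2160, 360))  # 2,985,984,000 pixels per second either way
```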

        • iopq@lemmy.world

          Not really? Modern hardware gets almost 1000 fps in Rocket League. You don’t need exactly 1000 fps to get a benefit; even 800 fps will give you a smoother experience.

    • istanbullu@lemmy.ml

      “Now wait for 1,000 Hz content and capable GPUs.”

      Now wait for humans who can see the difference

    • kevincox@lemmy.ml

      I’m sure some people will demand it, but for 99.9% of the population you don’t need 1000 Hz content. The main benefit is that whatever framerate your content runs at, it will not have notable delay from the display refresh rate.

      For example, if you are watching 60 Hz video on a 100 Hz monitor you will get bad frame pacing. But on a 1000 Hz monitor, even though it isn’t perfectly divisible, the ~1/3 ms delay isn’t perceptible.
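The frame-pacing effect can be sketched with a toy model: without VRR, each source frame waits for the next refresh tick, so each frame lands at most one refresh interval late (the model and numbers are illustrative, not a simulation of any real compositor):

```python
import math

# Toy model of fixed-rate content on a fixed-rate display without VRR:
# each frame is shown at the first refresh tick at or after its ideal time.
def frame_display_times(content_fps, display_hz, n=6):
    """Actual on-screen times (ms) of the first n content frames."""
    refresh = 1000 / display_hz
    ideal = [i * 1000 / content_fps for i in range(n)]
    return [math.ceil(t / refresh) * refresh for t in ideal]

# 60 fps on 100 Hz: frames land at 0, 20, 40, 50, 70, 90 ms -- uneven 20/10 ms cadence.
print(frame_display_times(60, 100))
# 60 fps on 1000 Hz: 0, 17, 34, 50, 67, 84 ms -- at most 1 ms off the ideal cadence.
print(frame_display_times(60, 1000))
```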

      VRR can help a lot here, but it can fall apart if you have different content at different frame rates. For example, a notification pops up and a frame is rendered, but then your game finishes its frame and has to wait until the next refresh cycle. Ideally the compositor would have waited for the game frame before flushing the notification, but it doesn’t really know how long the game will take to render its next frame.

      So really you just need your GPU to be able to composite at 1000 Hz; you probably don’t need your game to render at 1000 Hz, and it isn’t really going to make much difference if it does.

      Basically, at this point faster refresh rates just improve frame pacing when multiple things are on screen, much like VRR does for a single source.

  • JackGreenEarth@lemm.ee

    Who needs a 1000 Hz refresh rate? I understand it’s impressive, but 120 Hz already looks smooth to the human eye.

    • giloronfoo@beehaw.org

      Competitive (professional) gamers?

      It seems there are diminishing returns, but at least some gains are measurable at 360 Hz.

      • Fushuan [he/him]@lemm.ee

        I thought that 60 Hz was enough for most games, and that 120 or 144 Hz was better for shooters and other real-time games. However, it reaches a point where the human eye can’t notice a difference even if it tried.

        Honestly, going up in framerate too much is just a waste of GPU power and electricity.

        • narc0tic_bird@lemm.ee

          A better way to look at this is frametime.

          At 60 FPS/Hz, a single frame is displayed for 16.67 ms. At 120 Hz, a single frame is displayed for 8.33 ms. At 240 Hz, a single frame is displayed for 4.17 ms. A difference of over 8 ms per frame (60 vs. 120) is quite noticeable for many people, and over 4 ms (120 vs. 240) is as well, but the impact is only half as large. So you get diminishing returns pretty quickly.

          Now I’m not sure how noticeable 1000 Hz would be to pretty much anyone, as I haven’t seen a 1000 Hz display in action yet, but you can definitely make a case for 240 Hz and beyond.
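Those frametimes fall out of a one-line formula, which makes the diminishing returns easy to see:

```python
# Frame time in milliseconds for a given refresh rate.
def frametime_ms(hz):
    return 1000 / hz

for hz in (60, 120, 240, 1000):
    print(f"{hz:>4} Hz -> {frametime_ms(hz):5.2f} ms per frame")
# Each doubling of the refresh rate halves the frametime, so the absolute
# gain shrinks: 60->120 saves ~8.3 ms, 120->240 only ~4.2 ms.
```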

        • jsomae@lemmy.ml

          It’s pretty easy to discern refresh rate with the human eye if one tries. Just move your cursor back and forth really quickly. The spacing of the ghost cursors in the trail it leaves behind (which, by the way, exist only in perception) is inversely proportional to the refresh rate.
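That cursor-trail observation follows from simple geometry: the gap between successive ghost images is the distance the cursor covers in one refresh interval (the sweep speed below is an assumed example value):

```python
# Gap between successive "ghost" cursor images: distance moved per refresh.
def ghost_gap_px(cursor_speed_px_per_s, refresh_hz):
    return cursor_speed_px_per_s / refresh_hz

# For an assumed sweep speed of 3000 px/s:
print(ghost_gap_px(3000, 60))    # 50.0 px gaps at 60 Hz
print(ghost_gap_px(3000, 240))   # 12.5 px gaps at 240 Hz
```

Higher refresh rates pack the ghosts closer together until the trail reads as a smooth smear rather than discrete cursors.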

          • Fushuan [he/him]@lemm.ee

            Sure, but wasting double or triple the resources for that is not fine. There are very few games where it’s even a gain, because outside those super-competitive titles it doesn’t really matter.

            • jsomae@lemmy.ml

              Yeah, I agree with you, but I was just refuting your claim that it’s not perceivable even if you try.

              • Fushuan [he/him]@lemm.ee

                Oh yeah, I’ve read and heard of plenty of people saying that they definitely notice it. I’m lucky enough not to, because most ARPGs don’t run at 60 FPS in intense combat, let alone 120 FPS on an RTX 3080 lmao.

                I was talking more about the jump to 240 and beyond, where I find it surprising that people notice the upgrade during intense gaming encounters rather than while calmly checking or testing. I guess there are people who do notice, but again, running games at such a high framerate is very expensive for the GPU and a waste most of the time.

                I’m just kind of butthurt that people act like screens below 120 Hz are bad, when most games I play hardly run at a smooth 60 FPS. The market will follow, and in a few years we will hardly have what I consider normal monitors, while the cards just eat way more electricity for very small gains.

    • You999@sh.itjust.works

      The obvious answer would be VR and AR, where the faster the refresh rate, the less likely you are to get motion sick. A display with a refresh rate that high would be showing a frame every millisecond, meaning that if the rest of the hardware could keep up, a headset using this display would be able to properly display the micro-movements your head makes.

  • Baggie@lemmy.zip

    I would be happy with a 240 Hz 4K monitor that doesn’t have a subtle hum when it’s going that hard. It’s hard to test for, because shops are too loud to hear it, but in a quiet office it gets very noticeable.

    • Dippy@beehaw.org

      I’m hoping that people stop giving an actual fuck at around 400 Hz, so that manufacturers can simply produce that and stop.

      • grrgyle@slrpnk.net

        The genuine answer is that it’s just not necessary. Current displays are sharp and smooth enough. I’d rather have a display that lasts for a few decades, since the only reason to replace one is when it breaks down.

      • madcaesar@lemmy.world

        Your eyes can’t possibly tell the difference. We’re past the max eye resolution at this point.

          • Pyr_Pressure@lemmy.ca

            I imagine it was a typo*, but this article in Nature reports that in specific circumstances the median maximum refresh rate at which people can perceive a difference may be around 500 Hz, with the maximum in their test possibly being as high as 800 Hz.

            Normally, though, it seems closer to 50-90 Hz, but I’m on the road and haven’t delved too deeply into it.

            Edit: Type to Typo

              • Pyr_Pressure@lemmy.ca

                Not the original commenter you replied to, and I had a typo when trying to spell typo 😂 Just adding to the conversation. I wasn’t disputing you; I just meant they may have meant refresh rate instead of resolution. Easy mistake. It’s still quite disputed how well eyes can tell the difference in refresh rates.

  • LaggyKar@programming.dev

    So it’s not really a 4K 1000 Hz screen then, if it’s just toggleable between being a 4K 240 Hz screen and a 1080p 1000 Hz screen.

    • Confetti Camouflage@pawb.social

      From what I understand of the article, the prototype TCL panel being demonstrated is actually 4K@1000 Hz. They mention a few competitors with multiple modes right after, which could be where the confusion comes from.

  • pedz@lemmy.ca

    After having a TCL smart TV that constantly smells like burning plastic, even after a year of use, I’m not sure I would want another of their products in my home.

    • Dorkyd68@lemmy.world

      Mine burnt out half the LED strips in 3 years. I will never buy one again; I don’t care how affordable they are. I miss when appliances and electronics were built to last, not to break after a few years.

  • mindbleach@sh.itjust.works

    Hitting a locked 1000 Hz and nitpicking the frame pacing is not the point of high refresh rates. The point is that they make the exact framerate irrelevant. Even for mundane double-digit framerates, this would work the same way FreeSync does: frames would appear the instant they are ready, and there would be no difference between 60 FPS and 59 FPS.

    You can limit an efficient game to 240, and if it doesn’t hit that, who gives a shit.

  • pete_the_cat@lemmy.world

    “Create your own penis showing game”

    That’s what the tech world has come to recently, especially with monitors and smartphones.

    • Lojcs@lemm.ee

      I don’t understand this comment; are they not supposed to improve?

      • Fades@lemmy.world

        They would rather everyone shut up about it so they don’t have to battle FOMO lol

    • glimse@lemmy.world

      Recently?

      Tech has always been about pushing boundaries. And that’s not a bad thing

      • pete_the_cat@lemmy.world

        Screen technology for a lot of things has gotten to the point where your eyes literally can’t tell the difference, but sure, dump money into a placebo.

        • glimse@lemmy.world

          I agree that 1000 Hz is ridiculous; I have a 165 Hz monitor and can’t tell the difference past 120 or so. But that’s not really the point. This will never be a mainstream product, but the technology may lead to useful advancements in the future.

          • pete_the_cat@lemmy.world

            That is the point: most people don’t do research and just see “ahh, a bigger number, it must be better!” A 1 kHz refresh rate may be a niche thing now, but in two years every company will be pushing something similar.

            • glimse@lemmy.world

              Let “bigger is better” people waste their money like they already do on other products. It makes things cheaper for the rest of us while opening up new avenues for display technology in the future.

  • Etterra@lemmy.world

    JFC, nobody needs that kind of refresh rate. Your eyes literally can’t tell much past, what, 150 Hz? And 60-ish is good enough.