Back in 2013, Nvidia introduced a new technology called G-Sync to eliminate screen tearing and stuttering effects and reduce input lag when playing PC games. The company accomplished this by tying your display’s refresh rate to the actual frame rate of the game you were playing, and similar variable refresh-rate (VRR) technology has become a mainstay even in budget monitors and TVs today.

The issue for Nvidia is that G-Sync isn’t what has been driving most of that adoption. G-Sync has always required extra dedicated hardware inside of displays, increasing the costs for both users and monitor manufacturers. The VRR technology in most low-end to mid-range screens these days is usually some version of the royalty-free AMD FreeSync or the similar VESA Adaptive-Sync standard, both of which provide G-Sync’s most important features without requiring extra hardware. Nvidia more or less acknowledged that the free-to-use, cheap-to-implement VRR technologies had won in 2019 when it announced its “G-Sync Compatible” certification tier for FreeSync monitors. The list of G-Sync Compatible screens now vastly outnumbers the list of G-Sync and G-Sync Ultimate screens.
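In rough pseudocode terms, the core idea behind any of these VRR schemes looks something like the sketch below (the 48–144 Hz panel window and the helper name are made-up examples, not any vendor's actual implementation; real VRR ranges vary by display):

```python
def vrr_refresh_hz(frame_time_ms: float, panel_min_hz: float = 48.0,
                   panel_max_hz: float = 144.0) -> float:
    """Refresh rate a VRR panel would run for a single frame (illustrative)."""
    implied_hz = 1000.0 / frame_time_ms  # e.g. an 18 ms frame implies ~55.6 Hz
    # Clamp to the panel's supported window; outside it the driver has to fall
    # back to tricks like frame doubling (LFC) or fixed-rate behaviour.
    return max(panel_min_hz, min(panel_max_hz, implied_hz))

for ms in (6.9, 16.7, 18.0, 25.0):
    print(f"{ms:4.1f} ms frame -> panel refreshes at {vrr_refresh_hz(ms):5.1f} Hz")
```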

    • conciselyverbose@sh.itjust.works · 1 month ago

      This is silly.

      G-Sync solved a problem that couldn’t be solved before Nvidia built it. They stayed committed to that solution until an alternative reached a reasonable level of performance, then supported both until they could get close without the expensive extra hardware.

      Was it worth it? For most people, no. But it’s still technically superior today, and there are loads of options without the extra cost.

      • barsoap@lemm.ee · 1 month ago

        VESA Adaptive-Sync goes back to the eDP (Embedded DisplayPort) standard from 2009. AMD simply took that and said, “Hey, why aren’t we doing this over external DisplayPort?” And they did.

        So instead of over-engineering a solution nobody asked for in order to create vendor lock-in that nobody (but fanboys with Stockholm syndrome) wants, they exposed functionality that many panels already had anyway, because manufacturers don’t use completely different control circuitry for laptop (eDP) and stand-alone monitors.

        And, no, Nvidia’s tech is not superior. From what I gather they have stricter certification requirements, but that’s it.

        • AngryMob@lemmy.one · 1 month ago

          G-Sync modules have a lower sync window floor before LFC kicks in (usually around 30 Hz), and faster pixel response (overdrive) anywhere in the sync window. Those are benefits for both high-framerate and low-framerate content.

          Even today FreeSync usually bottoms out around 48 Hz. That constantly puts you at the LFC boundary in a lot of AAA games if you’re on a popular midrange graphics card and aiming for a 60 fps average.
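          As a rough illustration of that boundary problem, here is a minimal Python sketch (the 60 fps average, the variance, and both floor values are made-up example numbers, not measurements):

          ```python
          import random

          # Illustrative only: a game averaging ~60 fps still produces many individual
          # frames slower than that, and every frame that lands below the panel's VRR
          # floor pushes the driver into LFC (frame-doubling) territory.
          random.seed(0)
          frame_rates = [random.gauss(60, 12) for _ in range(10_000)]  # per-frame fps

          for floor_hz, label in ((48, "typical FreeSync floor"),
                                  (30, "typical G-Sync module floor")):
              below = sum(fps < floor_hz for fps in frame_rates)
              share = below / len(frame_rates)
              print(f"{label} ({floor_hz} Hz): {share:.1%} of frames fall below it")
          ```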

          • frezik@midwest.social · 1 month ago

            Just to address this from a high level, I see this as typical of the Nvidia and AMD approaches. Nvidia makes something that’s engineered to perfection, but adds a bunch of requirements that make it expensive and support vendor lock-in. Even if you’re willing to put up with that to have The Best, you might hesitate when finding out what assholes Nvidia are about everything.

            AMD then makes something 95% as good, and it’s cheap and you can work with them without yelling.

            See also: FSR vs DLSS.

      • Tetsuo@jlai.lu · 1 month ago

        The problem was solved by Nvidia, then AMD made it cheap and accessible, without requiring a dedicated hardware module.

        For years and years Nvidia artificially inflated the price of many G-Sync screens by up to 150 euros, for no legitimate reason. Initially there was NO compatibility with FreeSync at all.

        Nvidia wasn’t kindly solving a gamers’ problem, at least not after the first year or so of that tech’s release. They were forcibly selling expensive hardware modules nobody needed or wanted, long after FreeSync showed you could do it just as well without that expensive requirement.

        The hardware module they insisted on selling wasn’t solving a technical problem but a money one.

        I don’t even think anyone was ever able to tell the difference in quality between the various “sync techs”.

        • conciselyverbose@sh.itjust.works · 1 month ago

          There absolutely was a legitimate reason. The hardware of the time was not capable of processing the signals. They didn’t use FPGAs on a whim; they did it because they were necessary to handle the signals properly.

          And you just haven’t followed the tech if you think they were indistinguishable. G-Sync has supported a much wider range of frame times over its entire lifespan.

        • Overshoot2648@lemm.ee · 1 month ago

          They could literally just transition to Vulkan with a Metal wrapper for pre-existing software at any time, but no, they have to keep their ecosystem locked down for some reason.

    • GamingChairModel@lemmy.world · 1 month ago

      They always win, unless they don’t. History is littered with examples of the freer standard losing to the more proprietary standard, with plenty of examples going the other way, too.

      Openness is an advantage in some cases, but tight control can be an advantage in some other cases.

        • Vik@lemmy.world · 1 month ago

          Up until relatively recently, it was great to see that Vulkan and DX12 were in a practically even split.

          Still great to see some of the best talent in terms of visual fidelity showcasing Vulkan, like RDR2 (originally defaulted to DX12, now Vulkan), Doom Eternal and so on. I fully expect the next GTA to as well.

          Stadia was derp, but it forced interested publishers to get acquainted with Vulkan. I think it ended up doing more good for the industry as a failure than it would have done harm by succeeding and locking subscribers into such a restrictive game “ownership” paradigm.

    • Psythik@lemmy.world · 1 month ago

      I’ll buy an AMD GPU once they have an answer to the 4090 (actually the 5090 at this point). I need AI upscaling, SDR-to-HDR conversion for videos, and way better ray tracing performance. Until that happens, my PC will unfortunately remain a mixed-breed bastard.

  • ScampiLover@lemmy.world · 1 month ago

    TL;DR: The stuff the dedicated module is doing will go inside specific MediaTek chips on specific premium monitors.

    Really weird it’s taken this long - I remember reading that the modules were expensive and assumed it was just because they were early generations and Nvidia was still working things out

    • melroy@kbin.melroy.org · 1 month ago

      Haha. I wouldn’t, but yes, please sell your stocks. I will buy them. We still have a $300B inflow of AI shizzle (bubble) going into Nvidia.

  • Vik@lemmy.world · 1 month ago

    Good for them if it helps eliminate the markup on displays advertising G-Sync Ultimate. I have my doubts, but it’d make sense if they’re no longer using dedicated boards with FPGAs and RAM.

    One has to wonder if VESA will extend their VRR standard to support refresh rates as low as 1 Hz.

    • AngryMob@lemmy.one · 1 month ago

      Yeah, it feels premature since so many FreeSync displays still only go down to 48 Hz.

      Maybe if the MediaTek chip can go down to 30 Hz, then VESA will update.

      • Vik@lemmy.world · 1 month ago

        I think below that range they can frame-double (low framerate compensation, LFC) to go as low as 24 fps.
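        A minimal sketch of how LFC is usually described to work (illustrative only, not any vendor's actual algorithm; the 48–144 Hz window is an assumed example):

        ```python
        def lfc_multiplier(fps: float, panel_min_hz: float = 48.0,
                           panel_max_hz: float = 144.0) -> int:
            """Smallest whole number of repeats that keeps fps * n inside the panel's window."""
            n = 1
            # Repeat each frame until the effective refresh rate clears the floor,
            # as long as the doubled/tripled rate still fits under the ceiling.
            while fps * n < panel_min_hz and fps * (n + 1) <= panel_max_hz:
                n += 1
            return n

        for fps in (24, 30, 40, 60):
            n = lfc_multiplier(fps)
            print(f"{fps} fps -> show each frame {n}x -> panel refreshes at {fps * n} Hz")
        ```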