• sugar_in_your_tea@sh.itjust.works · 2 days ago

    it would work

    And that’s probably enough. I don’t know enough about HDR to know if it would look anything like the artist imagined, but as long as it’s close enough, it’s fine if it’s not optimal. Having things completely break is far less than ideal.

    • AnUnusualRelic@lemmy.world · 2 days ago

      You’d probably get some colours that end up quite off target, but you’d still get an image to display. So in the end it depends on how much “not optimal” you’re ready to accept.

      • sugar_in_your_tea@sh.itjust.works · 2 days ago

        Right, and it depends on what “quite off target” means. Are we talking about greens becoming purples? Or dark greens becoming bright greens? If the image is still mostly recognizable, just with poor saturation or contrast or whatever, I think it’s acceptable for older software.

        • The_Decryptor@aussie.zone · 7 hours ago

          So it depends on the specific HDR encoding used. Rec2020 is the most common one you’ll see (it’s meant for “pure” setups where the source and output are tightly linked, e.g. gaming consoles or Blu-ray), and its raw data won’t look great on an SDR display. Something like HLG (Hybrid Log-Gamma), on the other hand, is designed for better fallback (it’s meant for TV broadcast, where the output device is “whatever TV the user has”), so it should just look dimmer.
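
          To make the fallback difference concrete, here’s a minimal Python sketch comparing the two transfer curves. It assumes PQ (SMPTE ST 2084) as the transfer function paired with Rec2020 (as in HDR10), which isn’t stated above, and it only looks at single luminance values rather than full images.

          # Why PQ/Rec2020 falls apart under a naive SDR decode while HLG mostly just dims.
          import math

          def pq_eotf(e):
              """PQ (ST 2084) code value in [0, 1] -> absolute luminance in cd/m^2."""
              m1, m2 = 2610 / 16384, 2523 / 4096 * 128
              c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
              p = e ** (1 / m2)
              return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

          def hlg_oetf(scene):
              """HLG (BT.2100) scene-linear light in [0, 1] -> code value in [0, 1]."""
              a, b = 0.17883277, 0.28466892
              c = 0.5 - a * math.log(4 * a)
              return math.sqrt(3 * scene) if scene <= 1 / 12 else a * math.log(12 * scene - b) + c

          def naive_sdr_decode(e, peak_nits=100):
              """What an SDR-only viewer effectively does: treat the code value as gamma-2.2."""
              return peak_nits * e ** 2.2

          # PQ 0.5 really means ~92 nits but a naive gamma decode shows ~22 nits;
          # PQ 0.75 really means ~1000 nits but shows as ~53 nits, so highlights collapse.
          for code in (0.5, 0.75):
              print(f"PQ {code:.2f}: {pq_eotf(code):7.1f} nits, "
                    f"naive SDR decode shows {naive_sdr_decode(code):5.1f} nits")

          # HLG's lower range is a square-root curve, close to SDR camera gamma,
          # so an SDR decode of HLG mostly just looks a bit dim and flat.
          for scene in (0.05, 0.25, 0.5, 1.0):
              print(f"HLG encodes scene light {scene:.2f} as code {hlg_oetf(scene):.3f}")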

          This is an HDR screenshot I took of Destiny 2, which uses Rec2020, tone mapped to SDR.

          And here’s the raw screenshot data from before tonemapping.

          If the second image had all the right HDR metadata, and the viewer supported it properly, then both images would match.
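
          As a rough illustration of the “tone mapped to SDR” step, here’s a sketch that squeezes a decoded HDR luminance value into an SDR code value with a simple Reinhard curve. It’s a generic operator, not the tone mapper Destiny 2 or any particular screenshot viewer actually uses, it skips the Rec2020-to-Rec709 gamut conversion a real pipeline would also do, and the sdr_white_nits value is an assumed paper-white level.

          # Tone-map decoded HDR luminance (e.g. from pq_eotf above) down to SDR.
          def tonemap_to_sdr(nits, sdr_white_nits=200.0):
              """Decoded HDR luminance in cd/m^2 -> SDR code value in [0, 1]."""
              relative = nits / sdr_white_nits          # map "paper white" to 1.0
              compressed = relative / (1.0 + relative)  # Reinhard: roll highlights off smoothly
              return compressed ** (1 / 2.2)            # re-encode with SDR display gamma

          # Highlights far above paper white get compressed instead of clipping,
          # which is why the tone-mapped screenshot keeps detail in bright areas.
          for nits in (5, 100, 200, 1000, 4000):
              print(f"{nits:5d} nits -> SDR code {tonemap_to_sdr(nits):.3f}")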