• Resonosity@lemmy.dbzer0.com · +7 · 6 hours ago

    Just picked up my first Framework 13. Moves like this are why I’m increasingly trusting of their mission and vision.

    Hopefully they stay private, or better yet, change their corporate charter into a cooperative. Never go public.

    • sugar_in_your_tea@sh.itjust.works · +1 · 20 minutes ago

      I’d pay it if they had a few things I’m looking for:

      • Trackpoint (Thinkpad nipple)
      • physical mouse buttons, including middle mouse button

      Basically, I want the ThinkPad keyboard on a Framework laptop.

      • raspberriesareyummy@lemmy.world · +1 · 5 minutes ago

        My main problem with those prices is not even the value for money (although imo there is quite a disparity there), but the financial damage when it’s lost or stolen. For private use, I just don’t want to carry around any device that is substantially more expensive than ~1000 EUR.

        And the only thing Framework has to offer me that no one else inside the EU has is a 15+ inch model without a numpad and with a centered touchpad. The market share of laptops with numpads (and accordingly off-center touchpads) is infuriatingly high, and tells me 99.5% of people either have a tiny left hand or do no serious typing on the keyboard.

        • sugar_in_your_tea@sh.itjust.works · +1 · 41 seconds ago

          I’m not worried about theft at all, but I hate typing on crappy keyboards with no travel and I don’t like using the trackpad unless I absolutely have to (and I have a Macbook Pro at work, which supposedly has the best trackpad, and I still don’t like using it).

  • silasmariner@programming.dev · +14 · 1 day ago (edited)

    FFS I was just about to buy myself one and now I’m obviously gonna have to wait until November

    Oh, wait, I can just upgrade it. Nbd.

  • BlameTheAntifa@lemmy.world · +106 / -2 · 2 days ago

    I’d prefer an AMD 9000 series because I refuse to support Nvidia, but the upgradability is still an amazing achievement. I’m glad to see Framework delivering.

      • BlameTheAntifa@lemmy.world · +8 · 1 day ago (edited)

        Their website still lists the RX 7000M & S series, but I don’t know of a single *other* laptop brand that currently offers them. There is certainly no hint of a 9000 series mobile GPU, which is a shame. I probably won’t buy another laptop until AMD is back in the mobile GPU game. Not that they’re perfect, but they are significantly less evil than Nvidia.

      • tempest@lemmy.ca · +7 · 1 day ago

        Most people don’t need them. The gaming and workstation laptop market is smaller than ever. Integrated graphics have been “good enough” for a while now.

        • Dremor@lemmy.world · +8 · 1 day ago (edited)

          Especially since the Steam Deck and derivatives mostly killed the gaming laptop niche market.

          • boonhet@sopuli.xyz · +4 · 1 day ago

            High-end gaming laptops needed desktop GPUs anyway, because at least for Nvidia, once you get past the **60 range, the mobile versions offer only small jumps in performance compared to their desktop counterparts.

            At some point it’s cheaper to get a gaming desktop and a cheap laptop lol

    • bassomitron@lemmy.world · +17 / -13 · 1 day ago (edited)

      Out of curiosity, why do you refuse to support Nvidia? AMD isn’t some saint, they’re a shitty corporation just like Nvidia. They got lucky when Jim Keller saved their asses with the Ryzen architecture in the mid-2010s. They haven’t really innovated a god damn thing since then and it shows.

      Edit: I get it, I get it, Nvidia is a much shittier company and I agree. I was pretty drunk last night before bed, please pardon the shots fired

      • Domi@lemmy.secnd.me · +48 · 2 days ago

        Besides what was mentioned below, it’s not about making competitive products but about Nvidia being an absolute asshole since the 2000s, and they’ve gotten even worse since the crypto and AI craze started. AMD and Nvidia are both corporations, but they are not even playing the same game when it comes to being anti-competitive.

        There’s a reason why Wikipedia has a controversies section on Nvidia: https://en.m.wikipedia.org/wiki/Nvidia#Controversies

        That list is far from exhaustive. There’s so much more about Nvidia that you should remember vividly if you were a PC gamer in the 2000s and 2010s with an AMD GPU, like:

        • When they pushed developers to use an unnecessary amount of tessellation because they knew tessellation performed worse on AMD
        • When they pushed their GameWorks framework, which heavily gimped AMD GPUs
        • When they pushed their PhysX framework, which automatically offloaded to the CPU on AMD GPUs
        • When they disabled their GPUs in their driver if they detected an AMD GPU was also present in the system
        • When they cheated in benchmarks by adding optimizations specific to those benchmarks
        • When they shipped an incomplete Vulkan implementation but claimed they were compliant

        Nvidia has been gimping gaming performance and visuals forever, both for AMD GPUs and for their own customers, and we haven’t even gotten to DLSS and ray tracing yet.

        I refuse to buy anything Nvidia until they stop abusing their market position at every chance they get.

      • amorpheus@lemmy.world · +52 / -2 · 2 days ago

        they’re a shitty corporation just like Nvidia

        Neither of them are anyone’s friend, but claiming they’re the same level of nasty is a bit of a stretch.

        • Crashumbc@lemmy.world · +1 / -6 · 2 days ago

          Not saying that supporting the underdog isn’t good.

          I just don’t think AMD is less “nasty”; the only thing stopping them is the lack of power to do the same.

          • sugar_in_your_tea@sh.itjust.works · +1 · 11 minutes ago

            Right, and since they’re not dominant, they’re less nasty. If they become dominant, consider switching to whoever is the underdog at that point.

        • Redex@lemmy.world · +4 · 2 days ago

          Except that AMD doesn’t support HDMI 2.1 on Linux (not their fault to be fair, but still)

            • Redex@lemmy.world · +3 · 1 day ago

              I personally don’t have a need for it, but if someone has a 4K 120Hz TV or monitor without DisplayPort that they want to use as such, it’s kinda stupid that they can’t.
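
              For a rough sense of the numbers involved, here’s a back-of-the-envelope check (a sketch; it ignores blanking overhead, which only widens the gap):

```python
# Why 4K 120Hz doesn't fit through HDMI 2.0: pixel data alone already
# exceeds the usable link rate, before any blanking overhead.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 120, 24  # 8-bit RGB

pixel_data_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
hdmi_2_0_usable_gbps = 14.4  # 18 Gbit/s TMDS minus 8b/10b encoding overhead
hdmi_2_1_max_gbps = 48.0     # FRL signalling

print(f"4K 120Hz pixel data: {pixel_data_gbps:.1f} Gbit/s")   # ~23.9
print(f"HDMI 2.0 usable:     {hdmi_2_0_usable_gbps} Gbit/s")  # doesn't fit
print(f"HDMI 2.1 max:        {hdmi_2_1_max_gbps} Gbit/s")     # fits
```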

              • WhyJiffie@sh.itjust.works · +4 · 1 day ago

                yeah, but that’s the fault of the HDMI standards group. AMD cards could only support HDMI 2.1 if they closed their driver down. I guess this can’t be fixed with a DP to HDMI adapter either, right?

                my opinion: DisplayPort is superior, and if I have an HDMI-only screen with supposed 4K 120Hz support, I treat it as false info.

          • naitro@lemmy.world · +1 · 1 day ago

            Is that the case on mobile APUs as well? I’m pretty sure my laptop with a 7840U does 4K 120Hz.

        • bassomitron@lemmy.world · +3 · 2 days ago

          That’s completely valid, I haven’t had issues on Linux myself with nvidia, but I know it’s definitely a thing for a lot of people.

      • Frezik@lemmy.blahaj.zone · +3 / -1 · 1 day ago

        Haven’t innovated? 3D chip stacking?

        CPU companies generally don’t change their micro-architecture, especially when it works.

      • Bluewing@lemmy.world · +1 · 12 hours ago

        It’s getting harder and harder to afford high-end computers. I have already decided my next new computer will be a mini desktop. They are noticeably cheaper, can be well specced, and are powerful with a small footprint.

  • FackCurs@lemmy.world · +11 / -1 · 1 day ago

    OK I’m a bit confused. I have a Framework 16” that I bought earlier this year, without the GPU extension bay. I don’t care that much about the expansion bay as without it, the laptop is already huge. I have an eGPU to play on when I need it.

    What upgrade options does this announcement offer to me?

    I’m dissatisfied with:

    • the webcam
    • screen colors / brightness
    • key stability on the keyboard (the keys are a bit wobbly)
    • speaker sound quality (I’m not expecting the best, but something better than what it shipped with)

    They are announcing a new webcam; will it be backwards compatible?

    Otherwise I’m really happy with it. I absolutely love the modular I/O; being able to swap which side the audio jack is on is amazing. Happy to support this push for repairability.

  • humanspiral@lemmy.ca · +7 / -2 · 1 day ago

    For laptops, either get a last-generation (or older) 8-core CPU with a 5070/4070, or be happy with an AI 300 series iGPU. Buy more memory instead; you might one day want local AI/LLMs.

    • brucethemoose@lemmy.world · +2 · 1 day ago (edited)

      Problem is almost no laptop has Strix Halo. Not even the Frameworks.

      And rumors are its successor APU may be much better, so the investment could be, err, questionable.

      • DacoTaco@lemmy.world · +1 · 51 minutes ago

        Iirc this was due to the design of the chip. Framework said the memory bandwidth Strix Halo needs is waaaaaaayyyyy higher than what the SO-DIMMs in laptops provide.
        Hence they made the desktop; it was the only thing they could think of doing with that class of APUs.
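
        To put rough numbers on that (a sketch, assuming the commonly reported 256-bit LPDDR5X-8000 configuration for Strix Halo versus dual-channel DDR5-5600 SO-DIMMs):

```python
# Peak memory bandwidth = (bus width in bytes) x (transfer rate in MT/s)
def bandwidth_gb_s(bus_width_bits: int, transfer_rate_mts: int) -> float:
    return bus_width_bits / 8 * transfer_rate_mts / 1000

strix_halo = bandwidth_gb_s(256, 8000)   # soldered LPDDR5X-8000, 256-bit bus
sodimm_ddr5 = bandwidth_gb_s(128, 5600)  # dual-channel DDR5-5600 SO-DIMMs

print(f"Strix Halo:    ~{strix_halo:.0f} GB/s")   # ~256 GB/s
print(f"DDR5 SO-DIMMs: ~{sodimm_ddr5:.0f} GB/s")  # ~90 GB/s
```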

        • brucethemoose@lemmy.world · +1 · 44 minutes ago

          It’s just soldered LPDDR5X. Framework could’ve fixed it to a motherboard just like the desktop.

          I think the problem is cooling and power. The laptop’s internal PSU and heatsink would have to be overhauled for Strix Halo, which would break backwards compatibility if it was even possible to cram in. Same with bigger AMD GPUs and such; people seem to underestimate the engineering and budget constraints they’re operating under.

          That being said, way more laptop makers and motherboard makers could have picked up Strix Halo. I’d kill for a desktop motherboard with a PCIe x8 GPU slot.

  • inclementimmigrant@lemmy.world · +16 / -3 · 1 day ago

    So I’m going to be skeptical here. I had an older 9xx-series MSI laptop that was touted as having a replaceable and “upgradable” GPU for the next generation at the time.

    That ended up as a big ol’ whoops: the replacement screwed with thermals, it turned out you couldn’t actually upgrade for all kinds of reasons, and the whole thing resulted in a class action suit.

    Just color me skeptical on these types of things.

    • BombOmOm@lemmy.world · +24 · 1 day ago

      Framework has been pretty consistent on upgradability. You can even put the newest motherboards/CPUs in the oldest laptops, since they kept the form factor identical. They sell those motherboards on their website.

      • inclementimmigrant@lemmy.world · +4 / -1 · 1 day ago (edited)

        GPUs are a bit of a different monster, since there’s no such thing as a standard socket; you’re bound by the manufacturer’s spec for the pinout.

        And that was the case with the MSI/Nvidia partnership, when Nvidia went full Darth Vader and changed the terms of the deal.

        I mean, more power to them if they can actually deliver modules that can be upgraded, and if I see a generation or two of this actually working, I’ll be on board. But once bitten: they can’t fool me again for the time being.

        • BombOmOm@lemmy.world · +8 · 1 day ago

          The standard is PCI-e, and it is interchangeable. This is the second dedicated video card you can get in a Framework laptop, and they can be swapped out with each other. The other video card is even an AMD Radeon.

          • inclementimmigrant@lemmy.world · +1 / -4 · 1 day ago

            Again, that’s great if they can continue to update and release their GPU module to work with additional and future GPUs. I’ll believe it when I see it updated with the next generation of GPUs, because, just like their press release said, others have tried this and failed.

  • chiliedogg@lemmy.world · +133 · 2 days ago (edited)

    The more impressive thing is that they managed to get the Nvidia upgrade to be backwards compatible with existing Framework 16 models.

    That’s the push I need to really, truly believe they’re committed to the goal of upgradability. Too many “modular” products have come out where the “upgraded” modules were only available if you bought the newest version of the base product.

    In the next year or so, I’ll probably be buying a new laptop, and this has convinced me that Framework is probably the way to go.

    • AliasVortex@lemmy.world · +25 · 2 days ago

      I’ve been rocking a Framework 16 for about a year now and would happily recommend it. It costs a bit more upfront, but I love knowing that I can fix or replace just about anything on it (pretty affordably, too). It’s just so refreshing to not have to worry about dumb shit like an obscure power adapter or port forcing my laptop into an early retirement.

      It’s not the lightest laptop I’ve ever had, but realistically not all that much different from my last gaming laptop. Now that I’m not a full time student anymore I could probably get away with one of the smaller models, but the form factor is pretty nice.

      Overall, no major complaints!

    • Chloé 🥕@lemmy.blahaj.zone · +15 · 2 days ago

      i’ve had a framework 13 from a time before there was any other type of framework, and it’s a great laptop honestly. i’ve yet to do big upgrades, but just being able to repair it myself is awesome. one time i dented the chassis around where the power button was. no worries, just changed the input cover and bam, 5 minutes later it’s like new.

      my only complaint is that the battery life is atrocious. i heard it’s better (but still not great) on newer models tho

      • SeeFerns@programming.dev · +5 · 1 day ago (edited)

        I have a newer gen 13 and yeah battery life is mediocre. I love literally everything else about it though so it’s ok.

        I can’t remember the last time I wasn’t near an outlet though tbh.

        • Damage@feddit.it · +1 · 1 day ago

          I’ve got an HX 370 one, and aside from the battery, my only other complaint is the screen: max brightness isn’t much, and I miss my previous laptop’s touch screen.

          • SeeFerns@programming.dev · +1 · 6 hours ago

            Oh interesting, the brightness hasn’t been an issue for me, but different strokes for different folks.

            I’ve never had a touch screen laptop, so I can’t miss it lol. I bet it’s convenient sometimes though.

      • randombullet@programming.dev · +6 · 2 days ago

        I have a 7840U with a 55 WHr battery. I can squeeze out 7 hours; if I’m using it heavily, then 5-6 is typical. With the 63 WHr battery, you’ll get about 15% more runtime.
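
        A quick sanity check of that estimate, assuming runtime scales roughly linearly with capacity:

```python
# Runtime scales roughly with battery capacity for the same workload.
old_wh, new_wh = 55, 63
scale = new_wh / old_wh

print(f"Capacity gain: {scale - 1:.0%}")                        # ~15%
print(f"7 h light use  -> ~{7 * scale:.1f} h")                  # ~8.0 h
print(f"5-6 h heavy use -> ~{5 * scale:.1f}-{6 * scale:.1f} h") # ~5.7-6.9 h
```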

        • Chloé 🥕@lemmy.blahaj.zone · +3 · 1 day ago (edited)

          yea, that’s what i meant when talking about newer gens being better

          i have an i5-1240P (with a 55 WHr battery) and i’m lucky to get 5 hours while on power saver reading PDFs

      • SkaveRat@discuss.tchncs.de · +6 · 2 days ago

        I have two Intel frameworks, and they both suck in regards to battery life

        Buuut, I just have a big power bank in my backpack. Gives me at least 1 full charge when I’m on the go. And at home I just have a lighter laptop due to smaller battery

        The only thing that pisses me off about Framework is their abysmal software and communication in that regard. It’s basically impossible to get them to acknowledge or fix problems in their firmware.

        • notthebees@reddthat.com · +4 · 1 day ago

          Out of curiosity, what CPU? I had an i5-1135G7 laptop that I motherboard-swapped with a Ryzen 7 5825U motherboard. The battery life on the i5 was atrocious: I got 2 hours out of it doing note taking, maybe 3 when it was new and I had the full battery capacity to work with. After the motherboard swap, I got basically double the battery life under the same conditions.

          (HP pavilion 15-eg050wm and then I put a 15-eh2085cl motherboard in it)

          • SkaveRat@discuss.tchncs.de · +2 · 1 day ago

            i5-1340P and i7-1260P

            Both FW13

            Both get maybe 3 hours if I’m lucky, although they are a couple years old now. A fresh battery got me maybe 4 when lucky.

            I have a 25k power bank, so I can extend the runtime quite a bit. The “at least once” above is quite conservative; it’s probably closer to 2, and that includes using it while charging.

            I heard the ryzens are a lot better regarding power, so it doesn’t surprise me that the runtime basically doubles

            • notthebees@reddthat.com · +1 · 1 day ago

              I’d recommend disabling boost and setting cooling to passive.

              On Windows, if you set maximum processor usage to 99% in the advanced power plan settings, it will disable boost. You can set the cooling policy there as well. Repasting is probably also beneficial: the more efficient your cooling system is, the less the fans need to run, and you’ll get better battery life as a result.
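
              For example, both tweaks can be scripted with powercfg; a rough sketch (the setting aliases and the passive/active index values are assumptions worth verifying against `powercfg /query` on your machine):

```python
# Sketch: cap the max processor state at 99% (disables turbo boost) and set a
# passive cooling policy via Windows powercfg. Run from an elevated prompt.
import subprocess

def set_processor_option(alias: str, value: int) -> None:
    # Apply to both AC and DC (battery) settings of the active power scheme.
    for flag in ("/setacvalueindex", "/setdcvalueindex"):
        subprocess.run(
            ["powercfg", flag, "SCHEME_CURRENT", "SUB_PROCESSOR", alias, str(value)],
            check=True,
        )

set_processor_option("PROCTHROTTLEMAX", 99)  # max processor state 99% -> boost off
set_processor_option("SYSCOOLPOL", 0)        # assumed: 0 = passive cooling policy
subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)  # re-apply
```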

              That’s what I noticed on the i5 laptop: it would kick on the fans while doing basically nothing and kill the battery. When the fans were off, the runtime estimates were higher. Disabling the P cores in both machines might also be beneficial.

    • SatyrSack@lemmy.sdf.org · +15 / -1 · 2 days ago

      The only downside I have seen is that GSYNC will not work. The newer display supports it, but anyone upgrading an older Framework 16 with the new NVIDIA card will have to buy the screen upgrade as well if they need GSYNC.

      • halcyoncmdr@lemmy.world · +34 · 2 days ago

        That’s not unexpected. Variable refresh rate (GSYNC and Freesync) has always needed the display to support it first.

        • malwieder@feddit.org · +8 · 2 days ago

          Yeah, but the old display supports VRR via VESA Adaptive-Sync. Nvidia supports that as well, though I’m not sure whether their mobile GPUs do so for built-in displays.

          If it is supported, I don’t see any advantage of Gsync over standard VRR.

          If not, that’s a shame. Pretty wasteful having to buy the same display with different firmware just to get adaptive sync working.

        • potustheplant@feddit.nl · +6 / -1 · 2 days ago

          Nowadays they’re the same thing. Nvidia uses a different name because they like appropriating things, I guess.

          • tekato@lemmy.world · +2 · 1 day ago

            They are not the same thing. GSYNC requires the monitor to have an embedded NVIDIA controller.

            • potustheplant@feddit.nl · +1 · 1 day ago

              It does not. You’re talking about the original version of GSYNC, which required a hardware module. That’s no longer the case.

                • potustheplant@feddit.nl · +1 · 2 hours ago

                  “G-Sync compatible” monitors are still advertised as “G-Sync”. So while you’re technically right that Nvidia’s marketing differentiates the versions, manufacturers only put that in the fine print. Also, going back to the original question (“Will freesync work with it?”): if G-Sync works, FreeSync works as well, regardless of which variant of G-Sync you have.

                  Lastly, framework mentions “we’ve updated our 165Hz 2560x1600 panel to support NVIDIA G-SYNC®” but I’m not sure they’re referring to actually including a coprocessor. It most likely refers to just adding VRR support.

    • tankplanker@lemmy.world · +1 · 2 days ago

      Yeah, it pushed me to finally put in an order. Got to wait till December now, as I’m in the third batch.

      I wanted to wait till we had proof that the graphics card would be upgradable and that a better one would be available, as their AMD card is a bit too lightweight for me.

      I would rather it had been a better AMD card, as I have a 7900 XTX in my desktop, but I will take what I can get at this point, especially as I know I can upgrade later.

  • ZeroOne@lemmy.world · +29 · 2 days ago

    Now if only Framework did that with AMD & Intel GPUs, then we’d all be balling.

    Also please make it available in the East

  • iopq@lemmy.world · +33 / -1 · 2 days ago

    That’s it, every other gaming laptop is finished. Even though I have the older CPU I can get the newest GPUs now. Nobody can claim that right now. No other company is doing this.

    • DacoTaco@lemmy.world · +22 · 2 days ago (edited)

      The other laptops aren’t finished yet. Framework is super expensive, even compared to other gaming laptops.
      I think it’s worth it, but that’s not the opinion of a lot of casual people.
      And had I not gotten one via my job, I would not have gotten a Framework 16 because of the price.

      • woelkchen@lemmy.world · +10 · 2 days ago

        Well, the idea is that you can upgrade components without replacing everything, so the initial cost is higher but the long term cost is lower.

        That said, they took their time. The 1st generation is old now. The Radeon dGPU is probably weaker than, or on a similar level to, the new Ryzen iGPU. There is no Radeon dGPU upgrade path other than “just use the old one”. They have a better upgrade cadence with the 13 inch model.

          • woelkchen@lemmy.world · +1 · 1 day ago

            What are you talking about? Of course there is newer hardware than a Radeon RX 7700. The 7900 specifically.

            There is also no Ryzen 395 CPU option, which Framework sources for their non-modular desktop PC.

            • brucethemoose@lemmy.world · +1 · 1 day ago (edited)

              The 7900 specifically.

              They have to stay within the TDP. Their only option is something newer and ~100W (like the 5070).

              And I’m pretty sure the 7000 series is going out of production anyway…

              Also (while no 395 is disappointing), it is a totally different socket/platform, and the 395 has a much, much higher effective TDP, so it may not even work in the Framework 16 as it’s currently engineered. For instance, the cooling or PSU just may not be able to physically handle it. Or perhaps there’s no space on the PCB.

              • woelkchen@lemmy.world · +1 · 14 hours ago

                A new power brick is needed anyway. That’s why FW now has a much more powerful one as well.

                The 395 obviously would throttle if heat or power become a problem.

                If GPD can put the 395 in a handheld, Framework can put it in a 16" chassis.

        • DacoTaco@lemmy.world · +3 · 2 days ago

          Oh I know, and I agree. I’m in talks with my boss to maybe upgrade the mainboard prematurely to the latest one. I’m using the lower costs as an argument haha

          • woelkchen@lemmy.world · +3 · 2 days ago

            Surely there will be a desktop case for the old mainboard, as with the case for the 13" mainboard. Then you can do a little yoink and have yourself a good desktop PC.

            • DacoTaco@lemmy.world · +2 · 2 days ago (edited)

              The yoink is not needed. We have a policy to get a new laptop every 4 years. After that, the laptop is all ours once it’s formatted on site (to make sure no company or customer details get leaked). This is how my brother got my old Dell XPS, which he really needed for his education.
              Edit: apparently they are working on it, same with a case for the GPU to convert it into an eGPU.

  • ErableEreinte@lemmy.ca · +23 / -2 · 2 days ago

    And still no OLED screen… why Framework, why?

    I got one of the latest Framework 13s a couple months ago for work, and while I’m happy about the prospects of future repairability and upgradability down the line, it’s not a great laptop given its price point.
    The build is subpar, with the screen flexing a ton; the keyboard and trackpad are lacklustre and pretty uncomfortable; but the worst is the screen: it’s dim, with poor colour reproduction, and 3:2 is frankly not for me. And fractional scaling is a mess with XWayland, while it was much better on my 2019 XPS 13.

    I love what Framework are pushing for and actually achieving, but tradeoffs are very much at play. I’m hoping for an OLED screen replacement in the near future though.

    • AlecSadler@lemmy.blahaj.zone · +9 / -1 · 2 days ago

      I’ve yet to use an OLED monitor that didn’t make text look shitty and I’ve used $1000+ OLED displays with high ratings.

      Don’t get me wrong, OLED colors and blacks are gorgeous. I love OLED.

      Even my Samsung Pro whatever latest laptop with an OLED display…the text just looks off. Which was disappointing because my Samsung phone text is fine.

      LG C2/3/4, also gross looking text.

      Alienware OLED $750+ monitor? Text was bad.

      I love OLED but I’ve yet to find one that works for productivity.

      • firebingo@lemmy.zip · +13 · 2 days ago

        Almost all OLED displays use a different subpixel layout than traditional LCD displays, and subpixel font rendering is designed for the standard LCD layout. Depending on your OS, you may be able to configure the font rendering to look better on most OLEDs. But some people are just more sensitive to this than others.

        • AlecSadler@lemmy.blahaj.zone · +5 · 2 days ago

          Yeah, unfortunately I might be one of those people. I can also see some monitors flickering which gives me a headache in sub 3 minutes.

          It’s a curse. Especially with in-office pairing.

          • randombullet@programming.dev · +6 · 2 days ago

            I think that’s PWM dimming vs DC dimming.

            PWM dimming turns pixels on and off to make them darker, so at 50% brightness the panel is off 50% of the time. Higher-end panels flicker much faster, which helps mitigate perceived flicker; I think 500 Hz and above is preferred.

            DC dimming instead just uses voltage to control the brightness, with no flickering involved.
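
            To illustrate why a faster PWM frequency helps (a sketch with representative numbers):

```python
# At a fixed brightness (duty cycle), a higher PWM frequency means the "off"
# gaps are shorter and harder to perceive, even though the on/off ratio is the same.
def off_time_ms(pwm_hz: float, duty_cycle: float) -> float:
    period_ms = 1000 / pwm_hz
    return period_ms * (1 - duty_cycle)

for pwm_hz in (240, 500, 1000):
    print(f"{pwm_hz:>4} Hz PWM at 50% brightness: "
          f"dark for {off_time_ms(pwm_hz, 0.5):.2f} ms per cycle")
# 240 Hz -> ~2.08 ms, 500 Hz -> 1.00 ms, 1000 Hz -> 0.50 ms
```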

          • bluecat_OwO@lemmy.world · +2 · 2 days ago

            I have a friend who has always been picky about displays; I thought he was just being nitpicky,

            since normally my eyes can’t distinguish between 480p and 1080p, and flicker goes unnoticed.

      • bassomitron@lemmy.world · +12 · 2 days ago

        Aren’t phone screens AMOLED? I’m definitely not an expert, but I thought it was a variation of OLED, which would explain why text looks better.

        That being said, I also have an OLED Steam Deck and I can read text on it just fine if the scaling is set correctly in the game or just browsing the web normally in desktop mode.

        • AlecSadler@lemmy.blahaj.zone · +6 · 2 days ago

          Ah, true, thanks for the correction.

          Maybe I’ve just had bad batches of displays? I don’t know. I got 3 really nice Asus ProArts and the text clarity and colors are fantastic.

          Still wish I had blacker blacks.

      • ErableEreinte@lemmy.ca · +5 · 2 days ago

        Yep, text is definitely not handled well on most OLED monitors (or TVs) because of their subpixel structure. It’s usually been better on Linux for me and I essentially don’t notice it anymore, but I also haven’t used Windows in years so I can’t compare.

        • AlecSadler@lemmy.blahaj.zone · +2 · 2 days ago

          Hmm, I am working on converting all my things over to Linux so maybe I’ll give it another shot.

          Windows always has this weird ghosting going on, super odd.

      • SkunkWorkz@lemmy.world · +3 · 2 days ago

        Did you turn on PC-Mode with your LGs?

        I use an LG NanoCell TV as a PC monitor, and the fonts didn’t look good until I set the HDMI input type to PC. And of course you need to play around with font rendering tools like ClearType in Windows.

    • AlexisFR@jlai.lu · +1 / -9 · 2 days ago (edited)

      OLED does not belong on a computer. Also it’s a dead end technology.