• horse@lemmy.world
    3 months ago

    There is exactly one reason why they do this: So they can charge you $200 to upgrade it to 16GB and in doing so make the listed price of the device look $200 cheaper than it actually is. Or sometimes $400 if it’s a model where the base model comes with a 256GB SSD (the upgrade to 512GB, the minimum I’d ever recommend, is also $200).
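    That math, as a quick sketch (the $200 upgrade steps are from above; the base price is a made-up placeholder):

```python
# Sketch of the pricing math described above. The $200-per-step upgrade
# costs come from the comment; BASE_PRICE is a hypothetical placeholder.
BASE_PRICE = 1599   # hypothetical listed price of a base model
RAM_UPGRADE = 200   # 8GB -> 16GB
SSD_UPGRADE = 200   # 256GB -> 512GB

usable_config = BASE_PRICE + RAM_UPGRADE + SSD_UPGRADE
hidden_markup = usable_config - BASE_PRICE
print(f"Listed: ${BASE_PRICE}, usably configured: ${usable_config} (+${hidden_markup})")
```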

    The prices Apple charges for storage and RAM are plain offensive. And I say that as someone who enjoys using their stuff.

    • Jesus_666@lemmy.world
      3 months ago

      That’s why I dropped them when my mid-2013 MBP got a bit long in the tooth. Mac OS X, I mean OS X, I mean macOS is a nice enough OS but it’s not worth the extortionate prices for hardware that’s locked down even by ultralight laptop standards. Not even the impressive energy efficiency can save the value proposition for me.

      Sometimes I wish Apple hadn’t turned all of their notebook lines into MacBook Air variants. The unibody MBP line was amazing.

      • ebc@lemmy.ca
        3 months ago

        Sometimes I wish Apple hadn’t turned all of their notebook lines into MacBook Air variants. The unibody MBP line was amazing.

        Typing this from an M2 Max MacBook Pro with 32GB, and honestly, this thing puts the “Pro” back in the MBP. It’s insanely powerful; I rarely have to wait for it to compile code, transcode video, or run AI stuff. It does all of that while sipping battery, without even breaking a sweat. Yes, it’s pretty thin, but it’s by no means underpowered. Apple really is onto something with their M* lineup.

        But yeah, selling “Pro” laptops with 8GB in 2024 is very stupid.

  • BilboBargains@lemmy.world
    3 months ago

    As engineers, we should never insert proprietary interfaces into our designs. We shouldn’t obfuscate the design.

    The motivation for these toxic practices comes from the business side because it’s profitable. These people won’t share the profits with you because they are psychopaths. Ultimately we are making more waste when electronics cannot be upgraded, maintained and repaired. It’s bad for people and it’s bad for the environment.

    • TheGrandNagus@lemmy.world
      3 months ago

      So much stuff in both the hardware and software world really annoys me and makes me think our future is shit the more I think about it.

      Things could be so much better. Pretty much everything could be open and standardised, yet it isn’t.

      Software can be made in a way that isn’t user-hostile, but that’s not the way of things. Hardware could be repairable and open, without OEMs having to navigate a minefield of IP and patents, much of which shouldn’t have been granted in the first place, or users having no ability to repair or upgrade their devices.

      It’s all so tiresome.

      • rottingleaf@lemmy.zip
        3 months ago

        I think Napoleon said something similar to “the army is commanded by me and the sergeants”?

        Well, that’s not true anymore today. All this connectivity and processing power, however inefficiently they seem to be used, allow the world to be centralized more than it ever could be. There’s no need to consider what the sergeants think.

        (Which also means no Napoleons, because far more average, grey, unskilled, and generally unpleasant and uninteresting people are in charge now.)

        It’s about power and it happened in the last 15 years.

        I think it’s a political tendency, very intentional on the part of those making the decisions, not a “market failure” or other smartassery. It comes down to elites making laws. I feel that, all over the world today, they are more similar to Goering than to Hitler.

        This post may seem nuts, but our daily lives significantly depend on things more complex and centralized in supply chains and expertise than nukes and spaceships.

        We don’t need desktop computers that can’t be fully made in, say, Italy, or at least in a few European countries taken together. Yes, this would mean going back to roughly late-90s computing power per PC, but we waste so much of it on useless things that our devices do less now than they did then.

        We trade a lot of unseen security for comfort.

  • kamen@lemmy.world
    3 months ago

    Yeah, sure. Even if what they say about the OS’s resource usage is true, the OS is only a fraction of total usage. A lot of multiplatform software will use the same resources regardless of the OS. Many apps eat RAM for breakfast, whether it’s content creation or software development. Heck, even smartphones these days have this much RAM or more.
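    For a sense of scale, here’s one quick (Linux-only) way to see how much resident memory a single process actually holds; a sketch, not a benchmark:

```python
# Quick Linux-only check of how much resident memory this process holds,
# by reading the VmRSS field from /proc/self/status. Other OSes would
# need something like psutil instead.
def rss_kb() -> int:
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # reported in kB
    raise RuntimeError("VmRSS not found")

print(f"This Python interpreter alone holds ~{rss_kb() / 1024:.0f} MB")
```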

    I won’t argue, I just won’t buy an Apple product in the near future or probably ever at all.

    • KillingTimeItself@lemmy.dbzer0.com
      3 months ago

      buys [insert price] laptop, top of the line, flagship, custom silicon, built ground up to be purpose specific.

      Opens final cut pro: crashes

      ok…

      • Retrograde@lemmy.world
        3 months ago

        Especially paired with Apple’s 128GB integrated, non-replaceable storage. Whoops, you installed all of Microsoft Office? Looks like you have no room to save any documents :(

        • KillingTimeItself@lemmy.dbzer0.com
          3 months ago

          Ah yes, we can’t forget the proprietary drives that use the M.2 form factor but aren’t actually NVMe drives; they’re just raw flash without a controller.

            • KillingTimeItself@lemmy.dbzer0.com
              3 months ago

              It’s “NVMe” only in the sense that it’s non-volatile flash, probably even higher quality than most existing NVMe SSDs out there today.

              The thing is that it’s literally just the flash, on a card with an M.2 pinout that fits into an M.2 slot. It doesn’t have a storage controller or use any of the standardized communication methods that already exist. It’s literally a proprietary SSD in a standard form factor.

              The controller is integrated onto the SoC die itself; there is no storage controller on the storage module.

  • Blackmist@feddit.uk
    3 months ago

    8GB RAM is what my phone has.

    Having that in a laptop shows what they think of people buying their kit. They think you’re only buying it so you can type easier on Facebook.

    • macrocephalic@lemmy.world
      2 months ago

      My phone was manufactured in 2022, cost under USD 250, and has 8GB of RAM. New phones generally come with 12GB or more.

        • KillingTimeItself@lemmy.dbzer0.com
          3 months ago

          Nothing requires 8GB of RAM, lol.

          I’ve played the entirety of Java Minecraft on an old ThinkPad with 4GB of RAM. It didn’t crash (I don’t use swap).

          There literally shouldn’t be anything capable of using that much memory.

          • greedytacothief@lemmy.world
            3 months ago

            Is this bait? Because, like, you could be rendering, simulating, or running virtual machines. Lots of things that aren’t web browsers also eat RAM.

              • greedytacothief@lemmy.world
                3 months ago

                I was trying to mention things that aren’t just web browsers, since the comment seemed to be about programs that use more RAM than they seemingly need to.

                Edit: There’s like photogrammetry and stuff that happens on phones now!

                • AdrianTheFrog@lemmy.world
                  3 months ago

                  There’s like photogrammetry and stuff that happens on phones now!

                  No, the photogrammetry apps all use cloud processing. The LiDAR ones don’t, but that’s only on Apple phones, and the actual mesh quality is pretty bad.

                • KillingTimeItself@lemmy.dbzer0.com
                  3 months ago

                  I suppose photo editing would be one? Maybe? I’m not sure how advanced photo editing is on mobile; it’s not like you’re going to load up the entirety of GIMP or something.

                  As for photogrammetry, I’m not sure that would consume very much RAM. It could; I honestly don’t think it would be that significant.

                • woelkchen@lemmy.world
                  3 months ago

                  it’s not like most people are chronically browsing the web on their phones.

                  Yes, they are.

            • AdrianTheFrog@lemmy.world
              3 months ago

              you could be rendering, simulating, running virtual machines

              On a phone? I guess you could, although 4GB is probably enough for any video game that any meaningful number of people play.

              • woelkchen@lemmy.world
                3 months ago

                People use phone apps for photo and video editing these days. The common TikTok kid out there doesn’t use Adobe Premiere on a desktop workstation.

                Phone apps often are desktop applications with a specialized GUI these days.

                • KillingTimeItself@lemmy.dbzer0.com
                  3 months ago

                  I mean, yeah, but even then those aren’t significant filters, and what makes you think TikTok isn’t running a render farm somewhere in China to collect shit tons of data? They’re already collecting the data; they might as well provide a rendering service to make the UI nicer. But I don’t use TikTok, so don’t quote me on it.

                  Those are also all built into TikTok, and I’m pretty sure TikTok doesn’t require 8GB of RAM to open.

              • dustyData@lemmy.world
                3 months ago

                My man, have you been to selfhosted? People are using smartphones for all kinds of crazy stuff. They are basically mini ARM computers, particularly the flagships: they can do things like editing video and rendering digital drawings, and after they end their use life they can host AdGuard, torrent to a NAS, or host Nextcloud. You name it.

                • AdrianTheFrog@lemmy.world
                  3 months ago

                  It sounds a lot more cost effective to get a used mini-pc than a flagship phone for any sort of server stuff.

                • pythonoob@programming.dev
                  3 months ago

                  Something like the Samsung DeX app, which basically turns your phone into a mini computer with a keyboard, mouse, and monitor, honestly wouldn’t be too bad for most people. Take all your shit with you in your pocket and dock it at home or at work or whatever.

                • KillingTimeItself@lemmy.dbzer0.com
                  3 months ago

                  Yeah, I literally self-host a server running like eight different services; I’m quite acclimated to it by now. A phone is the wrong device for this kind of thing. A Chromebook is going to be a better alternative, and you can probably get one cheaper anyway.

                  A big problem with phones is that they just aren’t designed for that kind of use: leave a phone plugged in constantly and its battery is going to spicy-pillow itself. That’s before even trying to do it on something that isn’t an Android. I can’t imagine the hell of self-hosting on an Android, let alone an iPhone.

                  I could see a use case for one as a network relay if you need a hyper-portable node or something. GLHF with the dongling if you need those ports.

                  Unfortunately, if you already have a server, it’s better to just spin up a new task on it, since the cost of running a new device outweighs the cost of using an existing one that’s already running. You can also get something like a Raspberry Pi or Le Potato pretty cheap; not very powerful, but probably more utility, especially given the IO.

          • IthronMorn@sh.itjust.works
            3 months ago

            What about running a chrooted *nix install and using VNC to connect to it? While web browsing and playing a video in the background? Just because you don’t use your RAM doesn’t mean others don’t. And no, I don’t use all my RAM, but a little overhead is nice.

            • KillingTimeItself@lemmy.dbzer0.com
              3 months ago

              On a phone? I mean, I suppose you could do that, but VNC is not a very slick tool for anything other than, well, remote access. The latency and speed over Wi-Fi would be a significant problem. I suppose you could stream from your phone to your TV, but again, most TVs today are smart TVs, so that’s a non-issue.

              My example was using a computer rather than a phone, to show that even desktop computing tasks don’t really use all that much RAM.

              • IthronMorn@sh.itjust.works
                3 months ago

                Well, then by that logic, since desktop computing tasks don’t really use all that RAM, we shouldn’t need more than 8GB in a desktop, ever. Yes, my example of VNC-ing into your own VM on your phone was a tad extreme, but my point was that phones are becoming capable of replacing traditional computers more and more. A more realistic example: when I was using Samsung DeX the other day, I had 80-ish Chrome tabs open, a video chat, and a terminal SSH’d into my computer to fix it. I liked having RAM headroom above me. Was I even close to 12GB? No. But it gave me room if I wanted another background program, or had to spin something up quickly, without disrupting my flow or lagging out/crashing.

                • KillingTimeItself@lemmy.dbzer0.com
                  3 months ago

                  Well, then by that logic, since desktop computing tasks don’t really use all that ram: we shouldn’t need more than 8GB in a desktop ever.

                  If this is the logic we’re using, then we shouldn’t have phones at all, since clearly they do nothing a computer can’t. Or we shouldn’t have desktops/laptops at all, because clearly they do nothing a phone can’t.

                  I understand that phones are more capable; my point is that they have no reason to be. 99% of what you do on a phone is going to be the same whether you spend $200 on it or $2,000.

      • Blackmist@feddit.uk
        3 months ago

        Yeah, but if you have plenty of RAM on Android, there’s a chance those apps you left in the background will still be running when you go back to them, rather than doing the usual Android thing of just restarting them.

        • KillingTimeItself@lemmy.dbzer0.com
          3 months ago

          Yeah, I get that, but I often only have like two apps open on my Android phone (maybe three). And even if you didn’t have enough RAM, there’s no reason Android couldn’t cache old apps to a page file or something. Then you don’t need to restart them, just load them back from the page file. Given how fast modern phone storage is, this should be pretty negligible.
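          The cache-to-storage idea is basically LRU eviction. A toy sketch of it (all names here are made up for illustration):

```python
from collections import OrderedDict

# Toy LRU sketch of the idea above: keep the most recent apps "in RAM"
# and, instead of killing the oldest one, page its state out to storage
# so it can be restored without a full restart.
class AppCache:
    def __init__(self, max_in_ram: int):
        self.max_in_ram = max_in_ram
        self.in_ram = OrderedDict()  # app name -> state, oldest first
        self.paged_out = {}          # state parked "on flash"

    def open(self, app: str) -> str:
        if app in self.in_ram:
            self.in_ram.move_to_end(app)  # mark as most recently used
            return "already running"
        if app in self.paged_out:
            state, result = self.paged_out.pop(app), "restored"
        else:
            state, result = f"{app}-state", "cold start"
        self.in_ram[app] = state
        if len(self.in_ram) > self.max_in_ram:
            old, old_state = self.in_ram.popitem(last=False)
            self.paged_out[old] = old_state  # page out instead of killing
        return result

cache = AppCache(max_in_ram=2)
cache.open("maps"); cache.open("camera"); cache.open("browser")  # "maps" paged out
print(cache.open("maps"))  # → restored
```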

  • anhydrous@lemmy.world
    3 months ago

    My X220 and T520 each have 16GB. The designed maximum was “only” 8GB, but it turns out 16GB works fine. I replaced the RAM modules myself without asking Lenovo for permission. Those models came out in 2011.

    • jaschen@lemm.ee
      3 months ago

      My HP Omen 17” was designed for a maximum of 32GB of RAM. I’m currently running 64GB in it.

    • Duamerthrax@lemmy.world
      3 months ago

      This was also true for Apple computers before they started soldering the RAM in place. I remember going way over spec in my old G4 tower. Hell, I doubt the system would crash if you found larger RAM chips and soldered them in.

      • Klause@discuss.tchncs.de
        3 months ago

        I doubt the system would crash if you found larger ram chips and soldered them in.

        You can’t even swap components with official ones from other upgraded models. Everything is tied down with verification codes and shit nowadays, so I doubt you could solder in new RAM and get it to work.

    • Valmond@lemmy.world
      3 months ago

      Yeah, lol. My ThinkCentre with a 6th-gen Intel chip had only 8GB (I paid under €100 for it), so I went shopping on a second-hand site to double that. But the prices for a 4, 8, or 16GB DDR4 SO-DIMM stick (there seems to be a flood of used ones) were all about the same, around €30 with shipping included, so now I have 24GB.

  • mightyfoolish@lemmy.world
    3 months ago

    I get that upgrades help the bottom line, but considering that 8GB of RAM chokes the silicon they are allegedly so proud of… it seems like a slap in the face to their own engineers (and the customer as well, but that is not my point).

    • Raz@lemm.ee
      3 months ago

      Like the upper management and C-suite give a fuck about any of their employees.

  • Alien Nathan Edward@lemm.ee
    3 months ago

    Tim Apple be like “We’ve tried charging more money. Have we tried charging more money and delivering less stuff in exchange?”

    • goatman360@lemmy.world
      3 months ago

      Yes, they do, constantly. Yet people still keep buying. I hate that I have to use Apple for my job because the software and interface are exclusive to it.

      • sugar_in_your_tea@sh.itjust.works
        3 months ago

        Yup, same. I really don’t like macOS, but that’s what we’ve standardized on. I’m a Linux guy and use Linux at home for everything.

      • Alien Nathan Edward@lemm.ee
        3 months ago

        I really like my MacBook for dev work, and I think that now that macOS is essentially a Linux distro it’s quite nice, but it’s not that much better than the free distros, and it’s getting worse while they get better. Right now the only thing keeping me on a Mac at work is that they gave it to me, and the only thing keeping me on a Mac at home is that it’s already paid for.

        • KillingTimeItself@lemmy.dbzer0.com
          3 months ago

          Do you want to expand on why you think it’s basically a Linux distro? Last I heard, macOS was more closely based on BSD than on Linux, and that was ages ago. Unless they rewrote it without my knowledge, it really shouldn’t be much like either of the two.

  • sugar_in_your_tea@sh.itjust.works
    3 months ago

    Well yeah, they’re enough to meet the minimum use cases so they can upsell most people on expensive RAM upgrades.

    That’s why I don’t buy laptops with soldered RAM. That’s getting harder and harder these days, but my needs for a laptop have also gone down. If they solder RAM, there’s nothing you can (realistically) do if you need more, so you’ll pay extra when buying so they can upcharge a lot. If it’s not soldered, you have a decent option to buy RAM afterward, so there’s less value in upselling too much.

    So screw you Apple, I’m not buying your products until they’re more repair friendly.

    • akilou@sh.itjust.works
      3 months ago

      I had an extra stick of RAM available the other day, so I went to open my wife’s Lenovo to see if it’d take it, and the damn thing is screwed shut with the smallest Torx screws I’ve ever seen, smaller than any driver I have. I was so annoyed.

      • tal@lemmy.today
        3 months ago

        smallest torx screws I’ve ever seen

        Torx is legitimately useful for small screws, because it’s more resistant to stripping than Phillips.

        Now, if they start using Torx security bits or some oddball shapes, then they’re just being obnoxious. But there are not-trying-to-obstruct-the-customer reasons not to use Phillips.

      • sugar_in_your_tea@sh.itjust.works
        3 months ago

        I bought the E495 because the T495 had soldered RAM and one RAM slot, while the E495 had both RAM slots replaceable. Adding more RAM didn’t need any special tools. Newer E-series and T-series both have one RAM slot and some soldered RAM. I’m guessing you’re talking about one of the consumer lines, like the Yoga series or something?

        That said, Lenovo (well, Motorola in this case, but Lenovo owns Motorola) puts all kinds of restrictions on your rights if you unlock the bootloader of their phones (PDF version of the agreement). That, plus going down the path of soldered RAM, gives me serious concerns about the direction they’re heading, so I can’t really recommend their products anymore.

        If I ever need a new laptop, I’ll probably get a Framework.

        • tal@lemmy.today
          3 months ago

          I keep looking at the Frameworks, because I’m happy with the philosophy, but the problem is that the parts that they went to a lot of trouble to make user-replaceable are the parts that I don’t really care about.

          They let you stick a fancy video card on the thing. I’d rather have battery life – I play games on a desktop. If they’d stick a battery there, that might be interesting.

          They let you choose the keyboard. I’m pretty happy with current laptop keyboards, don’t really need a numpad, and even if you want one, it’s available elsewhere. I’ve got no use for the LED inserts that you can stick on the thing if you don’t want keyboard there.

          They let you choose among sound ports, Ethernet, HDMI, DisplayPort, and various types of USB. Maybe I could see putting in more USB-C then some other vendors have. But the stuff I really want is:

          • A 100Wh battery. Either built-in, or give me a bay where I can put more internal battery.

          • A touchpad with three mechanical buttons, like the Synaptics ones that the Thinkpads have.

          The fact that they aren’t soldering in the RAM and NVMe drive is nice, in that they’re committing to not charging much more than market rate, so I guess they should get credit for that. But they are certainly not the only vendor that avoids soldering those.

          • sugar_in_your_tea@sh.itjust.works
            3 months ago

            Yeah, ThinkPad used to allow either a CD drive or an extra battery in their T-series. They stopped offering the extra battery and started soldering RAM, so I got the cheaper E-series (might as well save cash if I can get what I want).

            I think there’s a market there. Have an option for a hot-swap battery to bring on trips and use the GPU at home. Serious travelers could even bring a spare battery to keep working for longer.

            touchpad with three mechanical buttons

            Yes please! And give me the ThinkPad nipple as well. :) If they had those, I’d not bother with even looking at Lenovo. The middle button is so essential to my normal workflow that any other laptop (including my fancy MacBook for work) feels crappy.

            I’m guessing the things they made modular are just the low hanging fruit. It’s pretty easy to make a USB-C to whatever port, it’s a bit harder to make a pluggable battery in a slot that can also support a GPU.

    • scarabic@lemmy.world
      3 months ago

      These days I don’t realistically expect my RAM requirements to change over the lifetime of the product. And I’m keeping computers longer than ever: 6+ years where it used to be 1 or 2.

      People have argued millions of times on the internet that Apple’s products don’t meet people’s needs and are massively overpriced. Meanwhile, they keep selling like crazy and people love them. I think the issue comes from having pricing expectations set in the race-to-the-bottom world of commoditized Windows/Android trash.

      • sugar_in_your_tea@sh.itjust.works
        3 months ago

        I upgraded my personal laptop a year or so after I got it (started with 8GB, which was fine until I did Docker stuff), and I’m probably going to upgrade my desktop soon (16GB, which has been fine for a few years, but I’m finally running out). My main complaint about my work laptop is RAM (16GB I think; I’d love another 8-16GB), but I cannot upgrade it because it’s soldered, so I have to wait for our normal cycle (4 years; will happen next year). I upgraded my NAS RAM when I upgraded a different PC as well.

        I don’t do it very often, but I usually buy what I need when I build/buy the machine and upgrade 3-4 years later. I also often upgrade the CPU before doing a motherboard upgrade, as well as the GPU.

        Meanwhile they just keep selling like crazy and people love them. I think the issue comes from having pricing expectations set over the in race-to-the-bottom world of commoditized Windows/Android trash.

        I might agree if Apple hardware was actually better than alternatives, but that’s just not the case. Look at Louis Rossmann’s videos, where he routinely goes over common failure cases that are largely due to design defects (e.g. display cable being cut, CPU getting fried due to a common board short, butterfly keyboard issues, etc). As in, defects other laptops in a similar price bracket don’t have.

        I’ve had my E-series ThinkPad for 6 years with no issues whatsoever. The USB-C charge port is getting a little loose, but that’s understandable since it’s been mostly a kids’ Minecraft device for a couple of years now, and kids are hard on computers. I had my T-series before that for 5-ish years until it finally died of water damage (a lot of water).

        Apple products (at least laptops) are designed for aesthetics first, not longevity. They do generally have pretty good performance though, especially with the new Apple Silicon chips, but they source a lot of their other parts from the same companies that provide parts for the rest of the PC market.

        If you stick to the more premium devices, you probably won’t have issues. Buy business class laptops and phones with long software support cycles. For desktops, I recommend buying higher end components (Gold or Platinum power supply, mid-range or better motherboard, etc), or buying from a local DIY shop with a good warranty if buying pre built.

        Like anything else, don’t buy the cheapest crap you can, buy something in the middle of the price range for the features you’re looking for.

    • BorgDrone@lemmy.one
      3 months ago

      That’s why I don’t buy laptops with soldered RAM.

      In my opinion, the disadvantages of user-replaceable RAM far outweigh the advantages. The same goes for discrete GPUs. Apple moved away from both, and I expect PC manufacturers to follow Apple’s move in the next decade or so, as they always do.

      • sugar_in_your_tea@sh.itjust.works
        3 months ago

        Here’s how I see the advantages of soldered RAM:

        • better performance
        • less risk of physical damage
        • more energy efficient
        • smaller

        The risk of physical damage is so incredibly low already, and the energy use of RAM is also incredibly low, so neither of those seems important.

        So that leaves performance, which I honestly haven’t found good numbers for. If you have this, I’m very interested, but since RAM speed is rarely the bottleneck in a computer (unless you have specific workloads), I’m going to assume it to be a marginal improvement.

        So really, I guess “smaller” is the best argument, and I honestly don’t care about another half centimeter of space, it’s really not an issue.
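
        Whether RAM speed matters at all depends on whether your workload is actually memory-bound. A rough, hypothetical way to probe that on your own machine (all names and the 256 MiB size are just illustrative choices, not from any benchmark suite) is to time large buffer copies and see what effective bandwidth you get:

        ```python
        import time

        # Rough effective-bandwidth probe: time large buffer copies.
        # Purely illustrative; a real benchmark would control for
        # caches, NUMA, and allocator behavior.
        SIZE = 256 * 1024 * 1024  # 256 MiB
        buf = bytes(SIZE)

        copies = 4
        start = time.perf_counter()
        for _ in range(copies):
            _ = buf[:]  # full copy: one read + one write pass over the buffer
        elapsed = time.perf_counter() - start

        # Each copy touches the data twice (read source, write destination).
        gb_moved = copies * 2 * SIZE / 1e9
        print(f"~{gb_moved / elapsed:.1f} GB/s effective copy bandwidth")
        ```

        If the number you get is far below your RAM’s theoretical peak, your bottleneck is elsewhere and faster soldered RAM wouldn’t buy you much.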

        • BorgDrone@lemmy.one
          link
          fedilink
          English
          arrow-up
          0
          ·
          3 months ago

          So that leaves performance, which I honestly haven’t found good numbers for. If you have this, I’m very interested, but since RAM speed is rarely the bottleneck in a computer (unless you have specific workloads), I’m going to assume it to be a marginal improvement.

          This is where you’re mistaken. There is one thing that integrated RAM enables that makes a huge difference for performance: unified memory. GPU code is almost always bandwidth-limited, which is why on a graphics card the RAM is soldered on and physically close to the GPU itself; that proximity is needed to meet the GPU’s bandwidth requirements.

          By having everything in one package, the CPU and GPU can share the same memory, which means you eliminate any overhead of copying data to/from VRAM for GPGPU tasks. But there’s more to it: unified memory doesn’t just apply to the CPU and GPU, but also to the other accelerators that are part of the SoC. What’s becoming increasingly important is AI acceleration. UMA means the neural engine can access the same memory as the CPU and GPU, again with zero overhead.

          This is why user-replaceable RAM and discrete GPUs are going to die out. The overhead and latency of copying all that data back and forth over the relatively slow PCIe bus is just not worth it.
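
          You can mimic the difference in plain Python, as a loose analogy (not Apple’s actual mechanism): `bytes(buf)` duplicates the data, like staging a buffer into VRAM over PCIe, while `memoryview(buf)` shares the same memory with zero copying, like a CPU and GPU reading one unified pool:

          ```python
          import time

          # Analogy only: copy overhead vs. zero-copy sharing.
          SIZE = 128 * 1024 * 1024  # 128 MiB payload
          buf = bytearray(SIZE)

          start = time.perf_counter()
          copied = bytes(buf)       # full copy of the payload
          copy_time = time.perf_counter() - start

          start = time.perf_counter()
          shared = memoryview(buf)  # zero-copy view of the same memory
          view_time = time.perf_counter() - start

          print(f"copy: {copy_time*1e3:.2f} ms, view: {view_time*1e6:.2f} µs")
          # The view is effectively free; the copy scales with payload
          # size, which is the per-transfer cost unified memory removes.
          ```

          The view costs the same whether the payload is 1 KB or 10 GB; the copy doesn’t. That’s the overhead that disappears when every accelerator reads the same physical memory.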

          • sugar_in_your_tea@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            0
            ·
            edit-2
            3 months ago

            Do you have actual numbers to back that up?

            The best I’ve found is benchmarks of Apple silicon vs Intel+dGPU, but that’s an apples to oranges comparison. And if I’m not mistaken, Apple made other changes like a larger bus to the memory chips, which again makes comparisons difficult.

            I’ve heard about potential benefits, but without something tangible, I’m going to have to assume it’s not the main driver here. If the difference is significant, we’d see more servers and workstations running soldered RAM, but AFAIK that’s just not a thing.

            • Turun@feddit.de
              link
              fedilink
              English
              arrow-up
              0
              ·
              3 months ago

              I understand the scepticism, but without links to what you’ve found, or which claims in particular you consider dubious (RAM speed can be increased when soldered, higher speeds lead to better performance, etc.), it comes across as “I don’t believe you, because I choose not to believe you.”

              LTT has made a comparison video on ram speeds: https://www.youtube.com/watch?v=b-WFetQjifc

              Do you need proof that soldered RAM can be made to run faster?

              • sugar_in_your_tea@sh.itjust.works
                link
                fedilink
                English
                arrow-up
                0
                ·
                3 months ago

                Yes, and the result from that video (I assume; I skimmed it, but have watched similar videos) is that the difference is negligible (like 1-10 FPS) and you’re usually better off spending that money on something else.

                I look at the benchmarks between the Intel MacBook Pro and the M1 MacBook Pro, and both use soldered RAM, yet the M1 gets so much better performance, even on non-GPU tasks (e.g. memory-heavy unit tests at work went from 3-5 min to 45-50 sec going from the latest Intel to the M1). Docker build times saw a similar drop. But it’s hard for me to separate memory changes from CPU changes. I’d have to check, but I’m guessing there was also a generational memory switch in there, which changes speeds and channel counts.

                The claim is that proximity to the CPU explains it, but I have trouble quantifying that. For me, a 1-10 FPS gain isn’t enough to justify reduced repairability and expandability. Maybe it is for others, but if that’s the difference, it’s a lot less than what’s being claimed.
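
                One way to sanity-check the bus-width point is back-of-envelope math: peak bandwidth is roughly transfer rate times bus width. The figures below are commonly published specs (DDR4-2666 dual-channel for the last Intel MBPs, a 128-bit LPDDR4X-4266 bus for the M1), so treat them as approximations, not measurements:

                ```python
                # Peak memory bandwidth, back-of-envelope:
                # transfer rate (MT/s) x bus width in bytes.
                def peak_gbps(mt_per_s: int, bus_bits: int) -> float:
                    return mt_per_s * (bus_bits // 8) / 1000  # GB/s

                # Last Intel MBPs: DDR4-2666, two 64-bit channels
                intel_mbp = peak_gbps(2666, 128)
                # Apple M1: LPDDR4X-4266 on a 128-bit bus
                m1 = peak_gbps(4266, 128)

                print(f"DDR4-2666 dual-channel: ~{intel_mbp:.1f} GB/s")
                print(f"M1 LPDDR4X-4266:        ~{m1:.1f} GB/s")
                ```

                That works out to roughly 43 vs 68 GB/s, a real but modest gap, which suggests the M1’s wins come from more than just memory placement.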

                • Turun@feddit.de
                  link
                  fedilink
                  English
                  arrow-up
                  0
                  ·
                  edit-2
                  3 months ago

                  The video has a short section on productivity (i.e. rendering or compiling). That part is probably the most relevant for most people. Check the chapter view in YouTube to jump directly to it.

                  I think a 2x performance improvement is plausible when comparing non-soldered RAM to Apple Silicon, which goes even further and puts the memory on the same package as the SoC. If, of course, RAM is the limiting factor.

                  The advantages of upgradable, expandable RAM are obvious. But let’s face it: most people don’t need that capability, and even fewer use it.

  • GlobalMind@lemm.ee
    link
    fedilink
    English
    arrow-up
    10
    ·
    3 months ago

    I also can’t figure out why so many companies are selling them with only a 500GB drive, SSD or HDD.

  • Dr. Moose@lemmy.world
    link
    fedilink
    English
    arrow-up
    6
    ·
    3 months ago

    Apple has been really stretching their takes lately. Nice to see some fire under their ass, though it’s not going to matter. Too many ignorant people falling for likeable propaganda.

  • NostraDavid@programming.dev
    link
    fedilink
    English
    arrow-up
    6
    ·
    3 months ago

    I haven’t used 8GB since… 2008 or so? TBF, I’m a power user (as are most people on any Lemmy instance, I presume), but still…

    And sure, macOS presumably uses less RAM than Windows, but the applications don’t.

  • mhague@lemmy.world
    link
    fedilink
    English
    arrow-up
    5
    ·
    3 months ago

    Isn’t “it’s good enough for most users” a little too close to “it’s good enough to be bought, used for a bit, and then tossed”? Eventually, computers that were adequate for X stop being able to do X. There’s little to no margin here, and you can’t upgrade it.

  • June (she/her) 🫐@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    4
    ·
    3 months ago

    I was using my 2016 (or so) MacBook Air the other day and getting low memory errors. I thought, wow, this thing only has 8GB, maybe it’s time to upgrade, just to see this 😐

    • realitista@lemm.ee
      link
      fedilink
      English
      arrow-up
      1
      ·
      3 months ago

      My 2009 Mac mini had 8GB of RAM. And it wasn’t even very expensive to upgrade when I did it in ~2013. A couple hundred bucks max.