I'm running a box I put together in 2014 with an i5-4460 (3.2 GHz), 16 GB of RAM, a GeForce 750 Ti, a first-gen SSD, and an ASRock H97M Pro4 motherboard, with a reasonable PSU, case, and a number of fans. All of that, parted out at the time, came to $700.
I've never been more fearful of components breaking than I am today. With GPU and now memory prices being crazy, I hope I never have to upgrade.
I don't know how, but the box is still great for everyday web development with heavy Docker usage, and for video recording / editing with a 4K monitor and a second 1440p monitor hooked up. Light gaming is OK too; for example, I picked up Silksong last week and it runs very well at 2560x1440.
For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.
For a DDR3-era machine, you'd be buying RAM on eBay, not Newegg.
I have an industrial Mini-ITX motherboard of similar vintage that I use with an i5-4570 as my Unraid machine. It doesn't natively support NVMe, but I was able to get a dual-M.2 expansion card with its own splitter (no motherboard bifurcation required), and that let me get a pretty modern-feeling setup.
Don't worry, if you are happy with those specs you can get corporate e-waste Dell towers on eBay for low prices. Search "Dell Precision tower"; I just saw a listing with 32 GB of RAM and a 3.6 GHz Xeon for about 300 USD.
Personally, at work I use the latest hardware; at home I use e-waste.
I agree with you on SSDs, that was the last upgrade that felt like flipping the “modern computer” switch overnight. Everything since has been incremental unless you’re doing ML or high-end gaming.
The only other time I had this feeling, besides switching to an SSD, was when I got my first multi-core system, a Q6600 (confusingly branded a Core 2 Quad). Had a great time with that machine.
I know it's not the same, but I think a lot of people had a similar feeling going from Intel MacBooks to Apple Silicon. An insane upgrade that I still can't believe.
This and high resolution displays, for me at least.
> I've never been more fearful of components breaking than current day.
The mid 90s was pretty scary too. Minimum wage was $4.25 and a new Pentium 133 was $935 in bulk.
> The mid 90s was pretty scary too.
If you fast forward just a few years though, it wasn't too bad.
You could put together a decent fully parted out machine in the late 90s and early 00s for around $600-650. These were machines good enough to get a solid 120 FPS playing Quake 3.
Are you sure? From what I can tell it's more like 500 USD RRP on release, boxed.
Either way, it was the 90s: two years later that was a budget CPU because the top end was two to three times the speed.
If you were on minimum wage in the 90s, your livelihood likely didn't rely on Pentium processors.
Also, it is frightening how close that is to the current-day minimum wage.
I was an unemployed student then -- a generous family member gifted me my first Windows PC, and it cost about the same as a used car.
A few years later but similarly - I am still running a machine built spur-of-the-moment in a single trip to Micro Center for about $500 in late 2019 (little did we know what was coming in a few months!). I made one small upgrade in probably ~2022 to a Ryzen 5800X w/ 64GB of RAM, but it's otherwise untouched. It still flies through basically anything & does everything I need, but I'm dreading when any of the major parts go and I have to fork out double or triple the original cost for replacements...
Am I crazy for thinking that anyone who uses a computer to do their job and make their income should have a $5k/year computer hardware budget at a minimum? I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year, but compared to revenue it seems silly to be worrying about a few thousand dollars per year delta.
I buy the best phones and desktops money can buy, and upgrade them often, because, why take even the tiniest risk that my old or outdated hardware slows down my revenue generation which is orders of magnitude greater than their cost to replace?
Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top spec machine which you can then use to go and earn 100x that.
Spend at least 1% of your gross revenue on your tools used to make that revenue.
This is a crazy out of touch perspective.
Depending on salary, two orders of magnitude on $5k is $500k.
That amount of money for the vast majority of humans across the planet is unfathomable.
No one is worried about if the top 5% can afford DRAM. Literally zero people.
The vast majority of humans across the planet aren’t making their money with their computer, which was the qualifier in the first line of my comment.
Furthermore, even if they did, the vast majority of them still won’t be using their own computer to generate revenue - they’ll be using an employer-provided one, and the things I’m talking about have nothing to do with them.
What is the actual return on that investment, though? This is self-indulgence justified as “investment”. I built a pretty beefy PC in 2020 and have made a couple of upgrades since (Ryzen 5950X, 64GB RAM, Radeon 6900 XT, a few TB of NVMe) for like $2k all-in. Less than $40/month over that time. It was a game-changing upgrade from an aging laptop for my purposes of being able to run multiple VMs and a complex dev environment, but I really don’t know what I would have gotten out of replacing it every year since. It’s still blazing fast.
Even recreating it entirely with newer parts every single year would have cost less than $250/mo. Honestly it would probably be negative ROI just dealing with the logistics of replacing it that many times.
> maybe $250/month (...) which you can then use to go and earn 100x that.
25k/month? Most people will never come close to earning that much. Most developers in the third world don't make that in a full year, but they are still affected by rises in PC part prices.
I agree with the general principle of having savings for emergencies. For a Software Engineer, that should probably include buying a good enough computer for them, in case they need a new one. But the figures themselves seem skewed towards the reality of very well-paid SV engineers.
Yes? I think that's crazy. I just maxed out my new Thinkpad with 96 GB of RAM and a 4 TB SSD and even at today's prices, it still came in at just about $2k and should run smoothly for many years.
Prices are high but they're not that high, unless you're buying the really big GPUs.
Where can you buy a new Thinkpad with 96GB and 4TB SSD for $2K? Prices are looking quite a bit higher than that for the P Series, at least on Lenovo.com in the U.S. And I don't see anything other than the P Series that lets you get 96GB of RAM.
Typing this on a similar-spec P16s that was around 2.6k or so. So if you call anything under 3k simply 2k, then it was 2k.
That's in Germany, from a corporate supplier.
I agree with the general sentiment - that you shouldn't pinch pennies on tools that you use every day. But at the same time, someone who makes their money writing with a pen shouldn't need to spend thousands on pens. Once you have adequate professional-grade tools, you don't need to throw more money at the problem.
Yes, that's an absolutely deranged opinion. Most tech jobs can be done on a $500 laptop. You realise some people don't even make your computer budget in net income every year, right?
Most tech jobs could be done on a $25 ten year old smartphone with a cracked screen and bulging battery.
That’s exactly my point. Underspending on your tools is a misallocation of resources.
That's a crazy spend for anyone making sub-$100K.
I try to come at it with a pragmatic approach. If I feel pain, I upgrade and don't skimp out.
======== COMPUTER ========
I feel no pain yet.
Browsing the web is fast enough where I'm not waiting around for pages to load. I never feel bound by limited tabs or anything like that.
My Rails / Flask + background worker + Postgres + Redis + esbuild + Tailwind based web apps start in a few seconds with Docker Compose. When I make code changes, I see the results in less than 1 second in my browser. Tests run fast enough (seconds to tens of seconds) for the size of apps I develop.
Programs open very quickly. Scripts I run within WSL 2 also run quickly. There's no input delay when typing or performance related nonsense that bugs me all day. Neovim runs buttery smooth with a bunch of plugins through the Windows Terminal.
I have no lag when I'm editing 1080p videos even with a 4k display showing a very wide timeline. I also record my screen with OBS to make screencasts with a webcam and have live streamed without perceivable dropped frames, all while running programming workloads in the background.
I can mostly play the games I want, but this is by far the weakest link. If I were more into gaming I would upgrade, no doubt about it.
======== PHONE ========
I had a Pixel 4a until Google busted the battery. It ran all of the apps (no games) I cared about and Google Maps was fast. The camera was great.
I recently upgraded to a Pixel 9a because the repair center that broke my 4a in a number of ways gave me $350, and the 9a was $400 a few months ago. It also runs everything well and the camera is great. In my day to day it makes no difference from the 4a, literally none. It even has the same amount of storage, of which I have around 50% left with around 4,500 photos saved locally.
======== ASIDE ========
I have a pretty decked out M4 MBP laptop issued by my employer for work. I use it every day and for most tasks I feel no real difference vs my machine. The only thing it does noticeably faster is heavily CPU-bound tasks that can be parallelized. It also loads the web version of Slack about 250ms faster; that's the impact of a $2,500+ upgrade for general web usage.
I'm really sensitive to skips, hitches and performance-related things. For real, as long as you have a decent machine with an SSD, using a computer feels really good, even for development workloads where you're not constantly compiling something.
One concern I'd have is that if the short-term supply of RAM is fixed anyway, then even if all daily computer users increased their budgets to match the new pricing, demand would exceed supply again and the price would just keep rising in response until it got unreasonable enough that demand dropped back to supply.
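As a toy illustration of that reasoning (all numbers and the demand curve below are invented for the sketch, not market data): with supply fixed in the short run, everyone raising their budget only shifts the demand curve, and the clearing price rises until the quantity demanded falls back to the fixed supply.

    # Toy model: a fixed short-run supply meets a shifted demand curve.
    # Every number here is made up purely for illustration.

    FIXED_SUPPLY = 1_000_000  # RAM modules available this quarter

    def quantity_demanded(price, budget_multiplier=1.0):
        """Simple downward-sloping demand; budget_multiplier models
        everyone raising their hardware budget at once."""
        base_buyers = 3_000_000
        return max(0, base_buyers * budget_multiplier - 20_000 * price)

    def clearing_price(budget_multiplier):
        """Raise the price until demand falls back to the fixed supply."""
        price = 0
        while quantity_demanded(price, budget_multiplier) > FIXED_SUPPLY:
            price += 1
        return price

    print(clearing_price(1.0))  # baseline clearing price: 100
    print(clearing_price(1.5))  # everyone budgets 50% more: price rises to 175

Same fixed quantity gets sold either way; the bigger budgets just end up in the price.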
I don't spend money on my computers from a work or "revenue-generating" perspective because my work buys me a computer to work on. Different story if you freelance/consult ofc.
I mean, as a frontline underpaid rural IT employee with no way to move outward from where I currently live, show me where I’m gonna put $5k a year into this budget out of my barren $55k/year salary. (And, mind you - this apparently is “more” than the local average by only around $10-15k.)
I’m struggling to buy hardware already as it is, and all these prices have basically fucked me out of everything. I’m riding rigs with 8 and 16GB of RAM and I have no way to go up from here. The AI boom has basically forced me out of the entire industry at this point. I can’t get hardware to learn, subscriptions to use, anything.
Big Tech has made it unaffordable for everyone.
The bright side is the bust is going to make a glut of cheap used parts.
You’re an employee. You don’t use any of your computers to make your revenue; your employer provides them. Obviously I wasn’t speaking about you.
To be fair, Samsung's divisions having guns pointed at each other is nothing new. This is the same conglomerate that makes their own chip division fight for placement in their own phones, constantly flip-flopping between using Samsung or Qualcomm chips at the high end, Samsung or MediaTek chips at the low end, or even a combination of first-party and third-party chips in different variants of ostensibly the same device.
To be honest, this actually sounds kinda healthy.
Yeah, makes absolute sense.
A bit like Toyota putting a GM engine in their car, because the Toyota engine division is too self-centered, focusing too much on efficiency.
Sears would like to have a word about how healthy intra-company competition is.
Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.
It makes absolutely no sense to apply the lessons from one to the other.
I think what the GP was referring to was the "new" owner of Sears, who reorganized the company into dozens of independent business units in the early 2010s (IT, HR, apparel, electronics, etc). Not departments, either; full-on internal businesses intended as a microcosm of the free market.
Each of these units were then given access to an internal "market" and directed to compete with each other for funding.
The idea was likely to try and improve efficiency... But what ended up happening is siloing increased, BUs started infighting for a dwindling set of resources (beyond normal politics you'd expect at an organization that size; actively trying to fuck each other over), and cohesion decreased.
It's often pointed to as one of the reasons for their decline, and worked out so badly that it's commonly believed their owner (who also owns the company holding their debt and stands to immensely profit if they go bankrupt) desired this outcome... to the point that he got sued a few years ago by investors over the conflict of interest and, let's say "creative" organizational decisions.
> Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.
Sears was hardly horizontal. It was also Allstate insurance and Discover credit cards, among other things.
It's a forcing function that ensures the middle layers of a vertically integrated stack remain market-competitive and don't stagnate because they are the default/only option.
This is why the western idea of 'many companies competing' is a bit of a myth.
Countries with these huge conglomerates just adopt competitive modes internally. There's no magic that really happens when there's another company present. If competent, then these companies will naturally fall into this dynamic.
Or how one-party states like China, Cuba, Vietnam, the old SU, or Japan (unofficially) are just as financially successful as Westminster parliamentary states. Or how your company only needs one union, not a dozen competing.
The opposite, nepotism, is very unhealthy, so I think you're correct.
> two versions of the same phone with different processors
That's hilarious, which phone is this?
Basically every Galaxy phone comes in two versions: one with Exynos and one with Snapdragon. It's regional, though. The US always gets the Snapdragon phones while Europe and most of Asia get the Exynos version.
My understanding is that the Exynos is inferior in a lot of ways, but also cheaper.
In the past using Snapdragon CPUs for the U.S. made sense due to Qualcomm having much better support for the CDMA frequencies needed by Verizon. Probably no longer relevant since the 5G transition though.
Not one phone, they did this all over the place. Their flagship line did this starting with the Galaxy S7 all the way up to Galaxy S24. Only the most recent Galaxy S25 is Qualcomm Snapdragon only, supposedly because their own Exynos couldn't hit volume production fast enough.
The S23 was also Snapdragon-only as far as I know[1]. The S24 had the dual chips again, while as you say S25 is Qualcomm only once more.
[1]: https://www.androidauthority.com/samsung-exynos-versus-snapd...
The S23 too was Snapdragon only, allegedly to let the Exynos team catch its breath and come up with something competitive for the following generation. Which they partly did, as the Exynos S24 is almost on par with its Snapdragon brother. A bit worse on photo and gaming performance, a bit better in web browsing, from the benchmarks I remember.
This was the case as recently as the S24: phones can come with Exynos or Snapdragon, with Exynos usually featuring worse performance and battery life.
Several high end Galaxy S's AFAIK.
Apple is going to be even more profitable in the consumer space because of RAM prices? I feel like they are the only player with the supply chain locked down enough not to get caught off guard: good prices locked in far enough in advance, and suppliers unwilling to antagonize such a big customer by backing out of a deal.
Kdrama on this when?
You make more money selling the good stuff. It's like this in just about every industry.
When RAM gets so expensive that even Samsung won’t buy Samsung from Samsung, you know the market has officially entered comic mode. At this rate their next quarterly report is just going to be one division sending the other an IOU.
Overleverage / debt, and refusing to sell at a certain price, are actually very different things though. OpenAI might be a tire fire, but Samsung is the gold pan seller here, and presumably has an excellent balance sheet.
The manufacturers are willing to quadruple the prices for the foreseeable future but not change their manufacturing quotas a bit.
So much for open markets; somebody should check their books and manufacturing schedules.
Most of the things people say about efficient markets assume low barriers to entry. When it takes years and tens of billions of dollars to add capacity, it makes more sense to sit back and enjoy the margins. Especially if you think there's a non-trivial possibility that the AI build out is a bubble.
In their defense, how many $20 billion fabs do you want to build in response to the AI ... (revolution|bubble|other words)? It seems very, very difficult to predict how long DRAM demand will remain this elevated.
It's dangerous for them in both directions: Overbuilding capacity if the boom busts vs. leaving themselves vulnerable to a competitor who builds out if the boom is sustained. Glad I don't have to make that decision. :)
I don’t think they’re working at 100% capacity, or that they don’t have any other fab they could utilize for the other low-profit stuff.
Let’s check their books and manufacturing schedule to see if they’re artificially constraining the supply to jack up the prices on purpose.
I'd take the opposite bet on this. They're diverting wafer capacity from lower-profit items to things like HBM, but all indications are that wafer starts are up a bit. Just not up enough.
For example: https://chipsandwafers.substack.com/p/mainstream-recovery
"Sequentially, DRAM revenue increased 15% with bit shipments increasing over 20% and prices decreasing in the low single-digit percentage range, primarily due to a higher consumer-oriented revenue mix"
(from June of this year).
The problem is that the DRAM market is pretty tight - supply or demand shocks tend to produce big swings. And right now we're seeing both an expected supply shock (transition to new processes/products) as well as a very sudden demand shock.
If it’s an AI bubble, it would be stupid to open new manufacturing capacity right now. Spend years and billions spinning up a new fab, only to have the bottom of the market drop out as soon as it comes online.
This seems to be for chips put in phones in 2026? I thought these orders were booked further in advance, or is that only for processors?
In the 90s, Motorola Mobile used Cypress SRAMs and not Motorola SRAMs.
Pricing.
I feel we have a RAM price surge every four years. The excuses change, but it's always when we see a generation switch to the next gen of DDR. Which makes me believe it's not AI, or graphics cards, or crypto, or gaming, or one of the billion other conceivable reasons, but price-gouging when new standards emerge and production capacity is still limited. Which would be much harder to justify than 'the AI/Crypto/Gaming folks (who no-one likes) are sweeping the market...'
But we're not currently switching to a next gen of DDR. DDR5 has been around for several years, DDR6 won't be here before 2027. We're right in the middle of DDR5's life cycle.
That is not to say there is no price-fixing going on, just that I really can't see a correlation with DDR generations.
Regardless of whether it is Crypto/AI/etc., this would seem to be wake-up call #2. We're finding the strangle-points in our "economy"—will we do anything about it? A single fab in Phoenix would seem inadequate?
If 'the West' were half as smart as it claims to be, there would be many more fabs in friendly territory. Stick a couple in Australia and NZ too for good measure; it is just too critical a resource now.
It's a political problem: do we, the people, have a choice in what gets prioritized? I think it's clear that the majority of people don't give a damn about minor improvements in AI and would rather have a better computer, smartphone, or something else for their daily lives than fuel the follies of OpenAI and its competitors. At worst, they can build more fabs simultaneously to have the necessary production for AI within a few years, but reallocating it right now is detrimental and nobody wants that, except for a few members of the crazy elite like Sam Altman or Elon Musk.
Micron is bringing up one in Boise, Idaho as well.
What will we do with that fab in two years when nobody needs that excess RAM?
Sell it at lower prices. Demand is a function of price, not a scalar.
Tax write-off donations to schools and non-profits, too.
There has never been 'an excess of RAM', the market has always absorbed what was available.
Why is this downvoted? This is not the first time I've heard that opinion expressed, and every time it comes up there is more evidence that maybe there is something to it. I've been following the DRAM market since the 4164 was the hot new thing and it cost - not kidding - $300 for 8 of them, which would give you all of 64K of RAM. Over the years I've seen the price surge multiple times, and usually there was some kind of hard-to-verify reason attached to it, from flooded factories to problems with new nodes and a whole slew of other issues.
RAM being a staple of the computing industry, you have to wonder if there aren't people cleaning up on this; it would be super easy to create an artificial shortage given the low number of players in this market. In contrast, the price of gasoline, say, has been remarkably steady, with one notable outlier that had a very easy-to-verify and direct cause.
This industry has a history of forming cartels.
https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal
AI companies must compensate us for this outrage.
A few hours ago I looked at RAM prices. I bought some DDR4, 32GB only, about a year or two ago. I kid you not - the local price here is now 2.5 times what it was back in 2023 or so, give or take.
I want my money back, OpenAI!
This is important to point out. All the talk about AI companies underpricing is mistaken. The costs to consumers have just been externalized; the AI venture as a whole is so large that it simply distorts other markets in order to keep its economic reality intact. See also: the people whose electric bills have jumped due to increased demand from data centers.
I think we're going to regret this.
Americans are subsidizing AI by paying more for their electricity so the rest of the world can use ChatGPT (I'm not counting the data centers of Chinese models and a few European ones, though).
DDR4 manufacturing is being spun down due to lack of demand. The prices on it would be going up regardless of what's happening with DDR5.
Yup.
And even more outrageous is the power grid upgrades they are demanding.
If they need the power grid upgraded to handle the load for their data centers, they should pay 100% of the cost for EVERY part of every upgrade needed for the whole grid, just as a new building typically pays to upgrade the town road accessing it.
Making ordinary ratepayers pay even a cent for their upgrades is outrageous. I do not know why the regulators even allow it (yeah, we all do, but it is wrong).
I bought 2x16 (32GB) DDR4 in June for $50. It is now ~$150.
I'm kicking myself for not buying the mini PC that I was looking at over the summer. The cost nearly doubled from what it was then.
My state keeps trying to add Data Centers in residential areas, but the public seems to be very against it. It will succeed somewhere and I'm sure that there will be a fee on my electric bill for "modernization" or some other bullshit.
I am so glad I built my PC back in April. My 2x16 GB DDR5 sticks cost $105 all-in then; now it’s $480 on Amazon. That is ridiculous!
The problem is further upstream. Capitalism is nice in theory, but...
"The trouble with capitalism is capitalists; they're too damn greedy." - Herbert Hoover, U.S. President, 1929-1933
And the past half-century has seen both enormous reductions in the regulations enacted in Hoover's era (when out-of-control financial markets and capitalism resulted in the https://en.wikipedia.org/wiki/Great_Depression), and the growth of a class of grimly narcissistic/sociopathic techno-billionaires - who control way too many resources, and seem to share some techno-dystopian fever dream that the first one of them to grasp the https://en.wikipedia.org/wiki/Artificial_general_intelligenc... trophy will somehow become the God-Emperor of Earth.