If you've been trying to build a PC lately, you've probably noticed RAM prices going absolutely nuts: DDR5 kits doubling in price, and some high-end kits jumping as much as 619% in some regions. It's frustrating, especially when you're trying to build or upgrade on a budget. Now, with Christmas here, we're getting free games for PC or laptop from Epic Games, but it's kinda sad that we can't afford the RAM to run them properly.

This feels like a vicious cycle: AI drove up GPU prices first, and now it's RAM. How sad is it to have games ready to play, finally manage to grab a decent GPU, and then not be able to afford the RAM? The worst part is that a lot of companies seem focused on AI workloads and data centers, while regular users (gamers, students, creators) are the ones feeling locked out. We're still the customers for both games and AI tools, but right now it doesn't really feel that way.
Note: Now you can watch this blog on YouTube.

I think GPUs are aging like milk right now. Every year there's a "new god GPU" on the market that makes the previous one feel instantly outdated. We've already seen this: first the H100, then the H200, and now the Blackwell B200, each bringing insane performance jumps. In my view, buying a new flagship GPU is often a waste of money. Let me tell you why: you buy a flagship card, and 6–12 months later its resale value is trash because new software demands even more VRAM and TFLOPS again.

The situation in data centers is even worse. Companies are pouring hundreds of billions into GPUs that run at crazy power levels all day. A significant chunk of them either die or become "old" within just 1–2 years. That is a lot of silicon going straight to the graveyard.

For normal people like us, the lesson is simple: don't treat your laptop or PC like a mini data center. If you try to run AI workloads 24/7, you will cook your hardware in 1–2 years.
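If you do run long AI jobs on your own machine, at least keep an eye on temperatures. Here's a minimal sketch (not from this post's workflow, just an illustration) that polls GPU temperature via nvidia-smi so a long-running job can back off before it cooks the card. It assumes an NVIDIA GPU with nvidia-smi on the PATH; the 80 °C threshold and the 30-second poll interval are arbitrary values I picked for the example, not recommendations.

```python
import subprocess
import time

TEMP_LIMIT_C = 80   # assumed threshold for this sketch; tune for your own card
POLL_SECONDS = 30   # how often to check

def gpu_temperature_c() -> int:
    """Read the current GPU temperature (Celsius) for the first GPU via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    # nvidia-smi prints one line per GPU; we just take the first one here
    return int(out.stdout.strip().splitlines()[0])

if __name__ == "__main__":
    while True:
        temp = gpu_temperature_c()
        if temp >= TEMP_LIMIT_C:
            # In a real setup you would pause or throttle the workload here
            print(f"GPU at {temp} C - too hot, let it cool down")
        else:
            print(f"GPU at {temp} C - OK")
        time.sleep(POLL_SECONDS)
```

Even something this simple makes the difference visible: a card sitting at its thermal limit all day is exactly the "mini data center" treatment that shortens its life.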