The GPU Wars Heat Up: Former Intel CEO's Shot at NVIDIA Misses the Mark
The tech world is buzzing over former Intel CEO Pat Gelsinger’s recent comments that NVIDIA’s AI GPUs are “10,000x too expensive” for inference tasks. Sitting in my home office, looking at the rather modest Intel Arc GPU in my secondary machine, I can’t help but find the irony in these statements a bit rich.
Let’s be real here - NVIDIA’s pricing is absolutely eye-watering. The cost of their enterprise AI GPUs would make even the most seasoned tech procurement manager break into a cold sweat. But to suggest this is merely a case of Jensen Huang “getting lucky” with AI timing completely misses the mark.
The GPU Arms Race: When Home AI Servers Get Ridiculous
Reading about someone’s 14x RTX 3090 home server setup this morning made my modest 32GB VRAM setup feel like I’d brought a butter knife to a nuclear war. This absolute unit of a machine, sporting 336GB of total VRAM, represents perhaps the most extreme example of the local AI computing arms race I’ve seen yet.
The sheer audacity of the build is both impressive and slightly concerning. We’re talking about a setup that required dedicated 30-amp 240-volt circuits installed in their house - the kind of power infrastructure you’d typically associate with industrial equipment, not a home computer. The cooling requirements alone must be enough to heat a small neighbourhood.
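Just for scale, here’s a rough back-of-the-envelope sketch in Python. The per-card figures are standard RTX 3090 specs (24GB of VRAM, 350W reference TDP), not numbers from the original build post, and the 80% figure is just the usual continuous-load rule of thumb for household circuits.

```python
# Rough numbers for a hypothetical 14x RTX 3090 build.
# Assumptions: 24 GB VRAM and ~350 W TDP per card (RTX 3090 reference specs),
# and an 80% continuous-load derating on the household circuit.

NUM_GPUS = 14
VRAM_PER_GPU_GB = 24       # RTX 3090 spec
TDP_PER_GPU_W = 350        # reference TDP; factory-overclocked cards can draw more

total_vram_gb = NUM_GPUS * VRAM_PER_GPU_GB
gpu_power_w = NUM_GPUS * TDP_PER_GPU_W

circuit_voltage_v = 240
circuit_current_a = 30
usable_watts = circuit_voltage_v * circuit_current_a * 0.8  # continuous-load rule of thumb

print(f"Total VRAM: {total_vram_gb} GB")                      # 336 GB, matching the build
print(f"GPU power at full load: {gpu_power_w} W")             # ~4.9 kW before CPU, fans, PSU losses
print(f"Usable power on one 30A/240V circuit: {usable_watts:.0f} W")  # ~5.8 kW
```

In other words, the GPUs alone sit close to the practical limit of a dedicated 30-amp 240-volt circuit before you even count the rest of the system, which is exactly why this build needed electrical work most of us associate with workshops, not home offices.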