Original Reddit post

Nvidia has put its name behind a fledgling effort to place mini data centers beside people's homes in boxes that look like HVAC units. It's a "power" play: the main bottleneck to building out more data center capacity is not money or chips, but retrofitting the electrical grid to supply the power. The idea, put forward by Span, a California smart utility box company, is to put the GPUs where the power has already been allocated: at the home.

Span says the average household uses only about 42% of the electricity allotted to it and rarely reaches peak usage. Span's smart utility boxes detect that headroom and steer the extra available power to the GPUs, which live inside a "node" that sits beside the house and looks something like an HVAC unit. Each box contains 16 Nvidia GPUs, 4 AMD CPUs, 4 terabytes of memory, and a cooling system. Once a large number of homes have these, Span says, the servers could be networked together to run distributed computing workloads.

In exchange for hosting a node, Span pays a big chunk of the homeowner's electricity and broadband internet bills. It's a cool idea on paper, but it's almost completely unproven in real-world use: Span has been prototyping the units but has yet to install any of them beside real homes. I asked Span VP Chris Lander whether his company has done technical studies showing that its brand of distributed computing will be fast and robust enough to handle real AI workloads. "We've done a bunch of technical studies internally [and] a bunch of modeling for different kinds of workloads, both from the business point of view [and] the product point of view and from the technical architecture point of view," he replies.
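The headroom logic the article describes can be sketched roughly as follows. This is a hypothetical illustration, not Span's actual control code: the service-allocation and safety-margin figures are assumptions (a 100 A / 240 V residential panel, i.e. 24,000 W), and `available_for_gpus` is an invented name. The only number taken from the article is the ~42% average utilization claim.

```python
# Hypothetical sketch of headroom-based power steering: compare the home's
# live draw against its service allocation and offer the surplus to the
# GPU node. All names and fixed numbers below are illustrative assumptions.

SERVICE_ALLOCATION_W = 24_000  # assumed: 100 A x 240 V residential service
SAFETY_MARGIN_W = 2_000        # assumed reserve so household spikes are safe

def available_for_gpus(household_draw_w: float) -> float:
    """Watts the node may draw right now (never negative)."""
    surplus = SERVICE_ALLOCATION_W - SAFETY_MARGIN_W - household_draw_w
    return max(0.0, surplus)

# At the article's quoted ~42% average utilization:
avg_draw = 0.42 * SERVICE_ALLOCATION_W       # 10,080 W
print(available_for_gpus(avg_draw))          # 11920.0 -> watts of headroom
print(available_for_gpus(23_500))            # 0.0 -> near peak, node backs off
```

A real controller would of course sample draw continuously and ramp the node's load rather than switch it, but the core idea is just this subtraction: the GPUs only ever consume capacity the house is not using.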

Originally posted by u/_fastcompany on r/ArtificialInteligence