TL;DR: Jensen Huang and China's data chief both declared tokens a "commodity" and "settlement unit" in the same week. They're not talking about compensation or tech specs. They're building the pricing infrastructure that turns AI from a money-losing subscription service into a functioning economy where token consumption is an investment with measurable returns, priced like energy or raw materials.

Two things happened in the same week that are more connected than they first appear. At GTC, Jensen Huang called tokens "the new commodity" and proposed giving Nvidia engineers token budgets worth half their base salary. Days later, Liu Liehong, head of China's National Data Administration, called tokens a "settlement unit" and a "value anchor for the intelligent era." China even coined an official term, "ciyuan," combining "word" with "yuan," its currency unit.

Two very different actors arrived at the same framing independently. Why, and why now? Because the AI industry is at the point where tokens need to be understood as what they actually are: units of productive output, not just a cost center. When Jensen says he'd be "deeply alarmed" if a $500,000 engineer consumed only $5,000 in tokens, he's saying the tokens are where the value gets created. An engineer plus $250K in token consumption produces dramatically more than that same engineer working without them. The token spend is an investment with a return, the same way a manufacturer investing in better equipment expects higher output per worker. (A toy version of that math is sketched below.)

The problem isn't that tokens cost money. It's that the current pricing model doesn't reflect their productive value. AI companies have been giving away tokens below cost to build market share, the way ride-sharing companies subsidized every trip for years. OpenAI is projecting $17B in cash burn this year. Anthropic is spending roughly $19B against break-even revenue. That's not sustainable, but it also doesn't mean tokens are overpriced. It means they're underpriced relative to the value they generate.

That's why the commodity framing matters. When both Jensen and China's data chief independently call tokens a commodity and a settlement unit, they're laying the foundation for a pricing model that connects cost to value. Once organizations budget for tokens the way they budget for energy, cloud compute, or raw materials, the price can find a level that reflects what tokens actually produce rather than what a subscription marketing strategy dictates.

The analogy to energy markets runs deeper than you might expect. The compute that produces tokens (GPU cycles, electricity, data center capacity) is fungible at the base layer, the same as crude oil regardless of origin. Tokens are the refined product. Like gasoline, they come in grades: lightweight inference is regular, deep reasoning is premium, multimodal is high-octane. What matters to the end user is the output, not the molecular composition of the fuel.

Once you see it this way, the competitive landscape snaps into focus. China is playing the low-cost producer: converting cheap renewable energy into tokens through efficient model architectures. MiniMax and Moonshot charge $2-3 per million output tokens vs. roughly $15 for comparable US models. US providers are playing the premium tier: better reliability, data sovereignty, deeper reasoning. Both approaches work because different applications demand different grades of token, just as different vehicles need different grades of fuel.
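To make the "token spend as an investment" arithmetic concrete, here's a back-of-envelope sketch in Python. The $500K salary and the roughly half-salary token budget come from Jensen's example above; the baseline output value and the productivity multiplier are invented assumptions, there only to show how the comparison works, not to claim real numbers.

```python
# Back-of-envelope: token consumption as an investment, not a cost.
# Salary and token-budget figures come from the post; baseline output
# value and the productivity multiplier are invented for illustration.

SALARY = 500_000           # engineer base salary (from the post)
TOKEN_BUDGET = 250_000     # ~half of base salary in tokens (Jensen's proposal)

BASELINE_OUTPUT = 900_000  # assumed value of the engineer's output with no AI
MULTIPLIER = 2.0           # assumed output uplift from heavy token use

def roi(value: float, cost: float) -> float:
    """Simple return on investment: (value - cost) / cost."""
    return (value - cost) / cost

# Scenario A: engineer working without tokens
cost_a, value_a = SALARY, BASELINE_OUTPUT

# Scenario B: same engineer plus a large token budget
cost_b, value_b = SALARY + TOKEN_BUDGET, BASELINE_OUTPUT * MULTIPLIER

print(f"Engineer only:     cost ${cost_a:,}, value ${value_a:,.0f}, ROI {roi(value_a, cost_a):.0%}")
print(f"Engineer + tokens: cost ${cost_b:,}, value ${value_b:,.0f}, ROI {roi(value_b, cost_b):.0%}")
```

Under those made-up numbers the extra $250K in tokens more than pays for itself, which is the whole point: treating the token line item as an investment forces you to actually measure the multiplier.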
Goldman Sachs found in March that AI delivers roughly 30% productivity gains on targeted tasks like customer support and software development. Those gains translate into real returns for organizations willing to invest in token consumption. The companies figuring out which tasks generate the highest return per token spent are building a genuine competitive advantage, not just running up a bill. The race isn't just to build better models. It's to define how the output of those models gets priced, traded, and valued. Jensen and Liu Liehong both seem to understand that whoever wins that framing contest shapes the economics of AI for the next decade.
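To show what "return per token spent" could look like as a metric, here's a similarly hedged toy comparison. The task names, per-task values, and token counts are invented; only the per-million price points loosely echo the $2-3 vs. roughly $15 tiers mentioned above.

```python
# Toy "return per token spent" comparison. Task values and token counts
# are invented; the per-million prices loosely echo the low-cost vs.
# premium tiers mentioned in the post.

tasks = [
    # (description, value created per task ($), output tokens, price per 1M output tokens ($))
    ("support ticket, low-cost model",  4.00,   2_000,  2.50),
    ("support ticket, premium model",   4.50,   2_000, 15.00),
    ("code review, premium model",     60.00,  20_000, 15.00),
]

for name, value, tokens, price_per_million in tasks:
    token_cost = tokens / 1_000_000 * price_per_million
    return_per_dollar = value / token_cost
    print(f"{name:33s} token cost ${token_cost:.3f}  "
          f"value ${value:6.2f}  return per $1 of tokens: {return_per_dollar:,.0f}x")
```

The absolute numbers are meaningless; the point is that once token spend gets a price that reflects value, picking the right grade of token for each task becomes an ordinary procurement decision.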
Originally posted by u/Neobobkrause on r/ArtificialInteligence
