Most of us have been discussing how far behind the curve Apple seems to be when it comes to AI. On the other hand, we're also familiar with Apple's tradition of not always being first to a given market segment but usually getting it right - more right than the competition - in the end.
With both points in mind, I offer this excerpt from a MarketWatch newsletter that hints Apple may be closer to the second scenario than the first when it comes to AI:
>>Investors have greeted the news that Scott Bessent will be President-elect Donald Trump’s choice to be Treasury secretary with relief, hoping the hedge-fund manager will smooth out the rough edges of the incoming president’s policies.
But this year’s stellar stock-market returns have less to do with Trump and more with hopes for artificial intelligence. Lux Capital counts AI-focused companies including Together AI and Hugging Face in its portfolio, and in its latest letter to shareholders, the New York venture-capital firm discussed how power-hungry AI is.
“Today’s largest AI clusters use around 100,000 Nvidia H100 chips and consume roughly 100 megawatts of power. Next year’s clusters will scale to 300,000 to 500,000 chips requiring nearly a gigawatt of power — about the consumption of a small city,” the firm said. That’s imposing real physical constraints, as “the tech giants who built empires on weightless bits and bytes are now grappling with atoms — steel, copper, water rights, and critically, natural gas,” according to the firm.
While there have been some AI-fueled nuclear power deals, Lux said “abundant natural gas from the Texas Permian seems a wiser bet” to power AI-related demand.
The firm talked about the need for companies to do more with less. Some of that is talking the firm’s book — Together AI makes software to run generative AI workloads more efficiently — but Lux also discussed Apple Inc. (AAPL). “Apple has quietly published research showing how to run large language models directly on devices with limited memory. By leveraging novel storage techniques and the inherent sparsity in AI models, the company can reduce memory requirements by 50% while maintaining performance. Advances like these could enable future iPhones and Macs to run sophisticated AI models locally and privately — a development that could radically reshape the entire AI infrastructure landscape,” the firm said.<<
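Two quick notes on the technical claims in that excerpt. First, the power figures pencil out: 100 megawatts spread across 100,000 chips comes to about 1 kW per chip, which is plausible given the H100's roughly 700 W rating once you add cooling and networking overhead.

Second, the Apple research Lux describes is almost certainly the "LLM in a flash" paper Apple published in late 2023. The core idea: the large feed-forward weight matrices stay in flash storage rather than RAM, and because ReLU-style feed-forward layers are sparse (most neurons output zero for any given input), a small predictor guesses which neurons will fire and only those rows get pulled into memory. Here's a rough Python sketch of that idea - to be clear, the dimensions, the predictor, and the threshold below are toy stand-ins of my own, not anything from Apple's actual paper:

    import numpy as np

    # Toy illustration of sparsity-aware weight loading, in the spirit of
    # Apple's "LLM in a flash" paper. Dimensions, predictor, and threshold
    # are all invented for this sketch.

    D_MODEL, D_FF = 1024, 4096                 # toy sizes, not Apple's
    rng = np.random.default_rng(0)

    # The big feed-forward matrix lives in "flash" (a memory-mapped file
    # on disk), not in RAM.
    np.save("w_up.npy", rng.standard_normal((D_FF, D_MODEL)).astype(np.float32))
    flash_W_up = np.load("w_up.npy", mmap_mode="r")

    def predict_active_neurons(x, threshold=0.5):
        # Stand-in for the paper's trained low-rank predictor: a cheap
        # partial product over the first 32 input dims guesses which
        # ReLU neurons will fire for this input.
        scores = flash_W_up[:, :32] @ x[:32]
        return np.nonzero(scores > threshold)[0]

    def sparse_ffn_up(x):
        active = predict_active_neurons(x)
        W_active = flash_W_up[active]          # copies only these rows into RAM
        return active, np.maximum(W_active @ x, 0.0)  # ReLU on the active subset

    x = rng.standard_normal(D_MODEL).astype(np.float32)
    active, h = sparse_ffn_up(x)
    print(f"loaded {len(active)} of {D_FF} rows "
          f"({100 * len(active) / D_FF:.0f}% of the layer) into RAM")

On a random toy matrix like this, the threshold keeps roughly half the rows out of RAM, which lines up with the 50% memory reduction the letter mentions.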