They spent two years telling their AI team to catch up to rivals, and then gutted what was already a hardware compromise.
Everyone else is running huge server farms filled with the latest 400 W, 2,000 GB/s NVIDIA hardware; the lower rung is doing its best on the latest 750 W, 5,300 GB/s AMD hardware. The Apple team asks for dogfooded Mac hardware topping out at 300 W, 800 GB/s, and THAT allotment gets chopped in half? Yeah, no **** morale divebombed.
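To put those bandwidth numbers in rough perspective, here's a back-of-envelope sketch of the memory-bandwidth ceiling on LLM decode throughput. The per-accelerator bandwidth figures are the ones from the paragraph above; the 70B-parameter fp16 model at batch size 1 is purely an assumed example workload, not anything tied to Apple's actual models.

```python
# Rough back-of-envelope: why memory bandwidth dominates LLM decode speed.
# Bandwidth figures are the ones cited above; the 70B/fp16 workload is an
# assumed example, not a real Apple workload.

GB = 1e9

# peak memory bandwidth per accelerator, GB/s (from the comment above)
hardware = {
    "NVIDIA (400 W class)": 2_000,
    "AMD (750 W class)":    5_300,
    "Mac (300 W class)":      800,
}

# assumed workload: dense 70B-parameter model, fp16 weights, batch size 1,
# so every generated token has to stream all weights from memory once
params = 70e9
bytes_per_param = 2
weight_bytes = params * bytes_per_param  # ~140 GB

for name, bw_gbs in hardware.items():
    # bandwidth-bound upper limit on tokens generated per second
    tokens_per_sec = (bw_gbs * GB) / weight_bytes
    print(f"{name}: ~{tokens_per_sec:.1f} tokens/sec ceiling")
```

Run it and the gap is stark: roughly 38 tokens/sec on the 5,300 GB/s part, ~14 on the 2,000 GB/s part, and under 6 on the 800 GB/s Mac-class hardware, before you even get to interconnect or cluster scale.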
They're behind on sparsity-capable GPU shaders, behind on memory bandwidth for those GPUs, behind on cache architecture for those GPUs, comically behind on network interconnect between the systems hosting those GPUs, and behind on power curves for GPUs plugged into the wall. And they somehow expect their AI team to outpace companies running actually good hardware.