I tried asking it about itself and it got confused about whether it was built on OpenAI's framework or DeepSeek's; it even apologised for the confusion when I quoted the App Store description to it, accepting that it had believed it was built on OpenAI's but would now reconsider its view.
The same has happened with Gemini and Claude. It's because they have been trained on lots of ChatGPT output.
 
But you do need some quite beefy hardware to run the LLMs yourself. It'll be very hard to run their biggest model.
You need beefy hardware, but it's not that hard. People are running the 671B model on 32-core EPYC systems with 384 GB of RAM.
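For a rough sense of why 384 GB of RAM is the right ballpark for the 671B model, here is a back-of-envelope memory estimate (a sketch only; the 4-bit quantization figure is an illustrative assumption, and real quantization formats average slightly more bits per weight):

```python
# Back-of-envelope RAM estimate for running a 671B-parameter model locally.
# Assumes ~4-bit quantization (0.5 bytes per parameter); actual formats
# like Q4_K_M use a bit more, and the KV cache needs extra headroom.

def quantized_model_gb(n_params: float, bits_per_param: float = 4.0) -> float:
    """Approximate weight storage in GB (1 GB = 1e9 bytes)."""
    return n_params * bits_per_param / 8 / 1e9

weights_gb = quantized_model_gb(671e9)
print(f"~{weights_gb:.0f} GB of weights")  # ~335 GB, fitting in 384 GB with headroom
```

At roughly 335 GB of weights alone, a 384 GB system leaves only modest headroom for the OS and the KV cache, which is why posters report exactly that class of hardware.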
 
It's the age of surveillance capitalism.
It's crazy how our lives are contained on hundreds of databases across multiple companies and traded/sold.

My Dad just passed away on the weekend, and he always said cash was king, he had a bankbook, no credit card, when he got paid he'd take all his money out and hide it around as he didn't want anyone to know how much he had or what he spent. To him it wasn't anyone else's business, and he is right. Seems impossible in this digital age, but I think about the method to his madness a lot.
 
So sorry for your loss, Jonny.
 
It's what happens when you don't give a 💩 what it's trained on, because everything you can get your hands on is yours. Unlike western techbro-led enterprises, which eventually have to bow to some sort of consumer protection, that probably won't change as fast: all that matters is whether you're harming the CCP or not.
 
LLMs really don't use much power for inference. Training, sure, but once that is done they don't take much power, which is why, if you've got enough VRAM, you can run models at home.

I'm not sure where this notion came from that every time you ask an LLM something, half a rainforest is destroyed.
Just because evaluating a single prompt doesn't weigh much in environmental terms doesn't mean the whole operation of offering an inference API is ecologically fine. These models run on very power-hungry GPUs, many thousands of them, all running 24/7 at close to 100% utilization. It doesn't matter whether they're executing training code or inference code; if they're used, they're used. You can be pretty sure that a model such as GPT-4o has burned more energy during inference, integrated over its lifetime, than its training did.
Additionally, running LLMs at home is an incredibly wasteful endeavor by comparison. The only reason inference at server-farm scale is so efficient is that it can run batched, meaning one instance of the model processes multiple prompts at once. At home, you'll likely only ever process a single prompt at a time, which is very inefficient energetically.
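The batching point can be made concrete with a toy energy-per-token calculation (every number below is an illustrative assumption, not a measured figure): a GPU draws roughly fixed power whether it serves one prompt or many, so higher aggregate throughput means less energy attributed to each token.

```python
# Toy model: energy per generated token, single-prompt vs. batched inference.
# Power draw and throughput figures are illustrative assumptions only.

GPU_WATTS = 700.0  # assumed steady power draw of one inference GPU

def energy_per_token_j(tokens_per_second: float, watts: float = GPU_WATTS) -> float:
    """Joules spent per token at a given aggregate throughput."""
    return watts / tokens_per_second

single = energy_per_token_j(tokens_per_second=30.0)     # one prompt at a time (home use)
batched = energy_per_token_j(tokens_per_second=2000.0)  # many prompts batched (server)

print(f"single: {single:.1f} J/token, batched: {batched:.2f} J/token")
```

Under these assumed numbers, batched serving spends well under a hundredth of the energy per token that a single-prompt home setup does, which is the poster's efficiency argument in miniature.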
 
Honestly what can China do with your dancing video data?
Would you rather have your data given to US companies so they can pass it to your government, which can then get any info it wants or quickly take action against you?
They use such mundane data to feed their propaganda machine, so whatever division they're attempting to sow seems genuinely homegrown. You need to know what very specific demographics like, their very specific parlance, etc., to be even more effective. It's even more important as more complicated things like AI videos are added to their arsenal.
 
I like using ChatGPT as a search engine. When you have it installed in macOS, you just press OPTION+SPACEBAR to open the prompt, similar to COMMAND+SPACEBAR (for Spotlight Search).

I decreased my use of a search engine (DuckDuckGo and Yandex) considerably. I will try DeepSeek to see how it compares to ChatGPT.

The Privacy Policy mentions that your data will be sold to advertisers. I tried to register using disposable emails to avoid giving out my personal one, but they're blacklisted. Screw that.
 

US tech stocks got hammered Monday morning. Nvidia (NVDA), the leading supplier of AI chips, whose stock more than doubled in each of the past two years, fell 12% in premarket trading. Meta (META) and Alphabet (GOOGL), Google’s parent company, were also down sharply, as were Marvell, Broadcom, Palantir, Oracle and many other tech giants.

Best community comment
US AI enterprises seem to be more about finance, power accumulation, and wealth extraction than about computing.
 
Huge win for open-source AI and competition. I'm already using this to search for code solutions. It still can't solve anything complex, but it's useful as a fast Google alternative for simple things.
 