I'm liking the summarization feature. AI is not going away and will eventually be part of our fabric. Apple Intelligence is not going away either, and I look forward to seeing how it evolves. All of this will improve. Fake news by the BBC, I say.

It might not be going away, but I'm keeping it off until further notice.

Apple doesn't need to be messing with headlines. That isn't a case of fake news; it's a case of Apple's AI changing the wording, the context, and the meaning. How is that on the BBC? It's like you tell me something, and I repeat it back completely changed, but then blame you.
 
AI is just the ruling class wanting to decouple profit from 'customers'. Businesses used to need customers buying things, or they'd fail. Most tech businesses are now supported not by selling things but by an endless torrent of venture capital.
 


Apple is once again under scrutiny after its AI-powered notification system generated false news summaries on Friday, including an erroneous claim about a darts player winning a championship and a fabricated story about tennis star Rafael Nadal.

[Image: Apple Intelligence notification summaries of BBC News alerts]

BBC News reported that Apple Intelligence incorrectly notified users that Luke Littler had won the PDC World Championship before the final match had even taken place. In a separate incident on the same day, the system falsely claimed that Rafael Nadal had come out as gay, misinterpreting a story about a Brazilian tennis player.

These latest mishaps follow previous concerns raised by Reporters Without Borders (RSF), which called for Apple to remove the AI summary feature last month after it generated misleading headlines about a high-profile murder case. The journalism organization warned that such AI-generated summaries pose "a danger to the public's right to reliable information."

The BBC has demanded urgent action from Apple, claiming that the recurring issue threatens the credibility of trusted news organizations. "It is essential that Apple fixes this problem urgently - as this has happened multiple times," a BBC spokesperson said.

Apple Intelligence is available on iPhone 15 Pro, iPhone 16 models, and select iPads and Macs running iOS 18.1 or later. Amongst other things, the AI features aim to simplify notification management by condensing multiple alerts into brief summaries. The feature includes a reporting mechanism for inaccurate summaries, but Apple has not publicly addressed the ongoing concerns or disclosed the number of reports it has received.

Article Link: BBC Calls Out Apple's AI Feature for Creating More Fake News Headlines
No media outlet pushes out more fake news in the UK than the BBC... Having said that, clearly trying to summarise summaries doesn't work, so Apple should disable it.
 
LLMs are complete trash with no viability in the short or long term.

Some of the traditional ML stuff is useful (classifiers etc).

That's where it is.

LLMs are a tool. Use them correctly, and they can be highly beneficial. Some people are content to use a shovel and dig a large hole, but maybe it would be faster to use a backhoe loader. That's what LLMs can be like in some situations -- upgrading from a shovel to a tractor.

You can ignore them all you want, but they literally save me months of work on each project. Granted, I mostly use them for scripting/coding (I'm not a computer scientist, just a normal Ph.D.-holding scientist), but they help me code my scientific work and statistical analyses. What I do is highly study- and analysis-specific. I need to create my own workflows using multiple validated external applications. These pipelines/workflows will likely only ever be used once (by me) because they are specific to my data. This means that while I incorporate what others do and have produced, there is a lot of unique code to write in order to work with my data.

Once produced, all final code is made publicly available on my GitHub account for independent verification during the peer review process and after publication. All statistical analyses are also independently run by a statistician in SAS. To date, with my LLM-assisted R code and her 'traditionally' written SAS code, we've had the same results 100% of the time. We've even pulled in a third person a few times to run the analyses independently in other software, and they get the same results. That's not strictly necessary, because I'm using standard R packages and statistical approaches, but I'm a stickler for validation.

My other generated code is primarily Bash and Python and calls independently developed and validated research tools (as I mentioned previously) from within the scripts. Before LLMs, I figured all the scripting out on my own, which was fine, but it took days, weeks, or sometimes months of web searches, digging through Stack Exchange, and occasionally contacting the developers of a tool (rarely needed, but it happened). Now I can produce much more elegant code in a fraction of the time without having to do most of that. Why don't I hire someone to code for me? I wouldn't mind if NIH or my university gave me the money for it. They don't, so I get to do it myself (which I enjoy anyway, though I also enjoy being much more efficient).
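Just to illustrate the kind of glue I mean, here's a minimal, made-up sketch, not code from an actual project: the tool name (`validated_tool`), its flags, and the file paths are all placeholders. The point is that the external, validated tool does the real analysis; the script just runs it on study-specific input and sanity-checks the output before the next stage.

```python
#!/usr/bin/env python3
"""Hypothetical one-off pipeline step: wrap an external, independently
validated command-line tool, run it on study-specific input, and
sanity-check the output. Tool name, flags, and paths are placeholders."""

import subprocess
import sys
from pathlib import Path

INPUT = Path("data/study_cohort.tsv")      # study-specific input (placeholder)
OUTPUT = Path("results/step1_output.tsv")  # tool's output file (placeholder)

def run_step() -> None:
    OUTPUT.parent.mkdir(parents=True, exist_ok=True)

    # The validated external tool does the actual analysis.
    cmd = ["validated_tool", "--in", str(INPUT), "--out", str(OUTPUT)]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        sys.exit(f"validated_tool failed:\n{result.stderr}")

    # Basic sanity check before the next stage consumes the output.
    lines = OUTPUT.read_text().splitlines()
    if len(lines) < 2:  # expect a header plus at least one data row
        sys.exit("Output looks empty; stopping the pipeline here.")

    print(f"Step finished: {len(lines) - 1} rows written to {OUTPUT}")

if __name__ == "__main__":
    run_step()
```

The real scripts are longer, but the shape is the same: the validated tools do the heavy lifting, and the LLM just helps me write this kind of glue much faster.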

I was talking with a colleague who has also been impressed with LLMs. He mentioned that his multi-year postdoc consisted almost entirely of coding some specific analyses; with LLMs, he could now do the same work in a couple of weeks.

Is what we do with LLMs "complete trash"? Not remotely. It's better and even more reproducible than what we were able to do before because the LLMs help make the code more generalizable.

Broadly gesticulating and saying, "LLMs are complete trash" is simply wrong. Maybe there's a more tactful way of saying that, maybe they are worthless to you, but to me and many other people, they are life-changing for the better. If you don't like them, don't use them. I'm getting more and better work done now than I ever was. And no, the LLMs are not "hallucinating" data. My use for them is to accelerate my coding. I know the input and output. I'm not generating new data with LLMs, I'm scripting more efficiently.
 
The tech bros love churning out lies and misinformation. Apple, Musk, Altman, et al. are only interested in advancing their billionaire/trillionaire status; if that means feeding the rest of us freshly constructed 💩, so be it.
 
In its current form, it works like a middle school student who forgot what they read and just mashes everything together. If there is more than one piece of info in the notification, it more often than not screws it up. It works about as well as Siri does, lol.
 
Apple has clearly not vetted this feature or done any due diligence to ensure its AI output makes sense. Apple should be held accountable. I would sue if I were falsely accused of anything by an AI. I hope these news agencies win out with real-world reporting over AI BS.
 
Glad the BBC provided the original headline, not just the summarization, for at least the darts story. The headline in question is actually quite confusing and could easily have been misinterpreted even by a human reader.

That being said, Apple should disable this by default for new apps and/or give developers more control over the summaries.
 
But to say that AI is trash and has no value is missing a huge tectonic shift.

Nah

It’s just getting massively overhyped and so everybody is assuming that there is some big new paradigm shift here

The general public is getting played as pawns by big tech money and interests

Are you banking with crypto?
How are NFTs going?
Are you reading this in your space in the meta-verse?

It’s just one blown up load of BS after another from the tech industry
 
It might not be going away, but I'm keeping it off until further notice.
Sure, and I understand that sentiment.
Apple doesn't need to be messing with headlines. That isn't a case of fake news; it's a case of Apple's AI changing the wording, the context, and the meaning. How is that on the BBC? It's like you tell me something, and I repeat it back completely changed, but then blame you.
So how do you summarize many headlines or emails or messages into tiny boxes? Or, as many say, maybe we don't really need this functionality at all.

I like the feature; it may not be perfect, but it will get better. And I don't take what I read at face value; if I want to see what's behind a summary, I click through.
 
The issue is clickbait push notifications. Write better push notifications and the summaries will suck much less.
 
These days, in this climate, we should just assume that everyone is gay... people can come out as straight. BTW, I am not coming out as straight, cuz I am not.
 
It might not be going away, but I'm keeping it off until further notice.

Apple doesn't need to be messing with headlines. That isn't a case of fake news; it's a case of Apple's AI changing the wording, the context, and the meaning. How is that on the BBC? It's like you tell me something, and I repeat it back completely changed, but then blame you.
It's because the BBC is worried that people will see the messed-up summaries and think those were the BBC's headlines, without realising that Apple's AI got the message wrong.

The AI hype is getting out of hand now (see the launch of MS Copilot on 2025 LG & Samsung TVs!) and I think companies are running the risk of destroying the promising technology by destroying the public's trust in it. LLMs do have a place in productivity, but show someone who hasn't used ChatGPT some of these summaries and they will just write off AI as a load of rubbish. I'm still not sure whether the AI boom in CE devices is driven by companies trying to sell new hardware or simply by fear of missing out.
 
The AI hype is getting out of hand now (see the launch of MS Copilot on 2025 LG & Samsung TVs!) and I think companies are running the risk of destroying the promising technology by destroying the public's trust in it.

I agree
There are some useful scenarios for LLMs and "AI" (it's not really "intelligence", thus my use of quotes)

The problem is that all the tech companies, even Apple, are now focused on nothing but trying to hit grand slams.

Everyone is wildly overfocusing on something that should be rolled out very gradually, feature by feature, in specific use cases, over time, only as the capabilities and performance warrant it.

Just jamming it into everything as fast as possible is a huge mistake
 
I'm liking the summarization feature. AI is not going away and will eventually be part of our fabric. Apple Intelligence is not going away either, and I look forward to seeing how it evolves. All of this will improve. Fake news by the BBC, I say.
See Apple Maps: it launched in a half-finished state and was widely mocked and ignored by the public for years, even though it was as good as or better than Google Maps a couple of years later. By rushing out a half-baked product, Apple effectively lost years of user engagement to its competitor.
 