Apple is already deploying AI in its products, it just doesn't call it AI: think automatic translation of text in pictures and smart subject cropping in Photos. AI in the ChatGPT/Midjourney sense of the word is a flash in the pan that is amazingly hot and amazingly flashy. "AI" (OK, actually applied statistics) was hyped up as being amazing, and now it's crashing down hard.

Apple will never, ever, release a competitor to ChatGPT or Midjourney. It's too bad for the environment, it's impossible to develop a "blue ocean" around it, and it isn't as good as people believe it is.

I wonder if they could build an autocorrect based on machine learning? That should be possible.
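For what it's worth, a bare-bones version of a frequency-driven autocorrect is easy to sketch (Norvig-style: generate single-edit candidates and rank them by corpus frequency; the tiny `WORD_FREQ` table here is invented for illustration, not anything Apple actually ships):

```python
# Toy spelling corrector: generate all single-edit candidates of a
# typo and pick the known word with the highest (made-up) frequency.
import string

WORD_FREQ = {"the": 500, "hello": 120, "world": 100, "apple": 80, "learning": 60}

def edits1(word):
    """All strings one edit (delete/transpose/replace/insert) away from word."""
    letters = string.ascii_lowercase
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    replaces = [a + c + b[1:] for a, b in splits if b for c in letters]
    inserts = [a + c + b for a, b in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

def correct(word):
    """Return the most frequent known word among single-edit candidates."""
    if word in WORD_FREQ:
        return word
    candidates = [w for w in edits1(word) if w in WORD_FREQ] or [word]
    return max(candidates, key=lambda w: WORD_FREQ.get(w, 0))

print(correct("helo"))  # -> "hello"
print(correct("aple"))  # -> "apple"
```

A real learned autocorrect would replace the static frequency table with a context-aware language model, but the candidate-generation-plus-ranking skeleton is the same.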
 
Intel has a separate AI coprocessor in next year's release of the Meteor Lake processors.

So I don't think any of the Mac chips have a separate AI coprocessor, including the upcoming 3nm M3 chips.

So the future of AI at Apple looks a long way off. Much like touchscreen Macs.

And how does that work? In fact, what exactly is "AI" in this context? I consider "AI" to be marketing fluff at this point.
 
Apple is already deploying AI in its products, it just doesn't call it AI: think automatic translation of text in pictures and smart subject cropping in Photos. AI in the ChatGPT/Midjourney sense of the word is a flash in the pan that is amazingly hot and amazingly flashy. "AI" (OK, actually applied statistics) was hyped up as being amazing, and now it's crashing down hard.

Apple will never, ever, release a competitor to ChatGPT or Midjourney. It's too bad for the environment, it's impossible to develop a "blue ocean" around it, and it isn't as good as people believe it is.

I wonder if they could build an autocorrect based on machine learning? That should be possible.
The funny thing is that Apple refers to it as what all of this actually is: machine learning.

AI became a stock market buzzword that companies have mainstreamed by slapping it onto products and names to get a temporary pop in their stock price.

What we’re *calling* AI today is nowhere near what we *think* AI is supposed to be: artificial general intelligence. The funnier part is that these language models and machine learning are a complete dead end in the pursuit of AGI. AGI will require entirely new techniques and technologies.
 
👆🏾

What everyone misses is that Siri is little more than a series of badly-coded if statements. It barely qualifies as AI. The real AI work is in the apps and the chips, and that’s not the Samsung type of AI where using AI to “enhance” your picture means finding a better one on the internet and replacing it with that.
A false statement, as has been proven countless times.
Apple isn't better than Samsung at AI in the first place anyway.
 
A false statement, as has been proven countless times.
Apple isn't better than Samsung at AI in the first place anyway.
How is that a false statement? That's how Samsung's "moon mode" works. Someone took a picture of the moon, removed a bunch of information from the picture in Photoshop, took a picture of the screen with a Samsung phone camera in "moon mode," and the information reappeared.

source: https://www.theverge.com/2023/3/13/23637401/samsung-fake-moon-photos-ai-galaxy-s21-s23-ultra
 
How is that a false statement? That's how Samsung's "moon mode" works. Someone took a picture of the moon, removed a bunch of information from the picture in Photoshop, took a picture of the screen with a Samsung phone camera in "moon mode," and the information reappeared.

source: https://www.theverge.com/2023/3/13/23637401/samsung-fake-moon-photos-ai-galaxy-s21-s23-ultra
It is a false statement.
Here it's explained in detail how the feature works. IT DOESN'T REPLACE THE ORIGINAL PICTURE WITH ANOTHER ONE FROM THE INTERNET.
 
The funny thing is that Apple refers to it as what all of this actually is: machine learning.

AI became a stock market buzzword that companies have mainstreamed by slapping it onto products and names to get a temporary pop in their stock price.

What we’re *calling* AI today is nowhere near what we *think* AI is supposed to be: artificial general intelligence. The funnier part is that these language models and machine learning are a complete dead end in the pursuit of AGI. AGI will require entirely new techniques and technologies.
100% agreed. ChatGPT "AI" (or applied statistics) is not just a technological dead end (because of hallucinations) and not just a business dead end (because of the inability to create a "blue ocean" and develop profits), but also a dead end because of the ecological harm it generates (source: https://www.theguardian.com/technol...l-intelligence-industry-boom-environment-toll).

I think by this time next year people will have moved on from this type of AI system, and attention will be drawn to something else that's new and flashy.
 
I for one am glad. Apple rushing into AI like Google did with Bard would just be an embarrassment. I'd rather they perfect it first.
I agree. I want to be sure that my answer is right and not a hallucination. However, that doesn't mean that Apple gets a free ride where Siri is concerned. If it takes six weeks to make a change, then hire six more engineers, and every week there should be a change in its capabilities. It doesn't have to be AI, but I think Apple device owners should be able to expect reliable and consistent information from Siri, as well as new features. And as I am always saying, tell us about them; don't expect us to just happen upon them. Apple should toot Siri's horn, not expect someone else to do it for them.
 
It is a false statement.
Here it's explained in detail how the feature works.
Of course Samsung says it's not replacing information and their system actually works as advertised 🙄. Do you have a better trusted source, or should I just block/ignore you?

I read your posting history, and to save us both time, I decided just to block you. Bye.
 
Well, if it's powered by Siri, which it would be, Apple must be in crisis mode for letting Siri languish so much rather than putting effort into making it even remotely useful. This might just be Tim Cook's biggest failing: letting Siri get so far behind. I suppose the Neural Engine in their chips is designed for AI computing, so I would think that much is already built into the hardware, at least?
It IS Tim Cook's biggest failing. If he were not so focused on Vision Pro, which a large number of Apple users cannot afford, and wouldn't want if they could, then he might see that this feature, which is available on every device and is the face of those devices, should be the absolute best thing about all of them.
 
I made an account here so I could "like" this comment. She doesn't tell you; she says "I found some information on the web." If I wanted to look at my phone, I wouldn't be asking you, Siri.
Exactly, it's so annoying when she doesn't just give an answer.
 
Of course Samsung says it's not replacing information and their system actually works as advertised 🙄.
Of course it works as advertised. I have an S23 Ultra and I actually know how it works. I can also take moon shots without Scene Optimizer, so no AI; the hardware is perfectly capable, and multiple people have already proven it.

Do you have a better trusted source, or should I just block/ignore you?
LOL, official documentation isn't "a trusted source" 🤣🤣🤣. Yeah, you can block me, alright; you don't have any real arguments anyway.

I read your posting history, and to save us both time, I decided just to block you. Bye.
Great thank you! Sayonara.
 
I honestly have zero issues with Siri. I use it every day for all sorts of things. Not discounting anyone else's experience but mine has been great.
Is Siri only working well on devices with the neural engine? I have a 2017 iPad as well as an M1 iPad Pro. Siri works comparatively well on the iPad Pro but is absolute garbage on the 2017 iPad.
 


The progress of Apple's generative AI technology is significantly behind its competitors and there is no sign that the company plans to launch AI services next year, according to analyst Ming-Chi Kuo.


In a new post on Medium outlining how Apple's imminent earnings report will affect Apple stocks and the supply chain, Kuo explained that the company will likely not dedicate much time to discussion of AI during its earnings call due to its lack of progress in the area. There is reportedly no sign that Apple has plans to launch or integrate AI computing or hardware products in 2024, indicating that AI is unlikely to boost the company's stock price or supply chain in the immediate future.

Last month, Bloomberg's Mark Gurman said that Apple was working on "Apple GPT" artificial intelligence projects that could rival OpenAI's ChatGPT. Apple does not yet have a "clear strategy" for creating a product for consumers, and while it could be planning to make a "significant" AI announcement in 2024, Gurman claims it has no concrete plans as of yet.

During Apple's May earnings call, Tim Cook said there are a "number of issues that need to be sorted" with AI, and that it's important to be "deliberate and thoughtful" in the development approach. Cook also said that Apple views AI as "huge," and plans to "continue weaving it in products on a very thoughtful basis."

Article Link: Kuo: 'No Sign' of Apple Generative AI Technology Coming in 2024
I used to buy Apple products every single year, hardcore. Now I'm really considering giving Samsung a chance going forward. Apple's lack of progress in certain areas is poor.
 
Yes. The simple solution is to learn to speak Siri's language; Siri will never learn real English.

For example, I can say "Hey Siri, set kitchen lights to 85%" and the dimmer is adjusted to the desired brightness. If I change any of those words, there is a chance of error. So I learn to say the phrases exactly as Siri wants. Same with sending texts and setting a destination in the Maps app: you need to use exactly the "correct" words and speak each word clearly.
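That rigidity behaves a lot like template matching; a hypothetical sketch of why off-template wording fails (the pattern and accepted phrasing are invented for illustration, not how Siri actually parses commands):

```python
# Hypothetical rigid command matcher: only the exact template parses.
import re

PATTERN = re.compile(r"^set (\w+) lights to (\d{1,3})%$", re.IGNORECASE)

def parse(command):
    """Return (room, brightness) if the command fits the template, else None."""
    m = PATTERN.match(command.strip())
    if not m:
        return None
    return m.group(1).lower(), int(m.group(2))

print(parse("Set kitchen lights to 85%"))      # -> ('kitchen', 85)
print(parse("Dim the kitchen lights to 85%"))  # -> None: off-template wording fails
```

Any word outside the template ("dim", "the", "please") breaks the match entirely, which is roughly the experience of speaking "Siri Speak" imprecisely.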

Yes, it could be better, but it is not so hard to learn "Siri Speak." Well, it is not hard for me, perhaps because I've been a software developer for five decades and I've gotten used to using the computer's language.

We have come so far. I first started studying speech recognition and AI in the mid-1980s as a graduate student, and the fact that you could do it at all was exciting. But now people complain that Siri has less than human ability.

What if Apple used GPT-like systems? You could ask, "Please tell me about George Washington's first grandchild, Susan," and you'd get a nice biographical summary, even though GW had no children or grandchildren. It is also fun to ask about Beethoven's 10th symphony. Generative models are VERY good at generating text, but they have basically a negative IQ. They are worse than stupid. Even worse: if you publish this made-up stuff, it gets used in the training data for the next version of GPT. So we get authoritative answers about made-up facts.

Apple and Google have a really hard problem on their hands. These systems are good with language but are far less intelligent than my dog. The only solution is FAR beyond the current state of the art: these systems must be able to understand what they are saying. We are not even close to that yet. No one has a clue how to do it.

So, what we have is useful already, and be careful what you ask for.
My problems with Siri are not about voice recognition, at least not on my M1 iPad Pro. My problem is Apple giving Siri functions and then taking them away. Two prime examples: Siri can no longer search your photos for objects, pets, locations, and dates; now it only opens the Photos app, which is exceedingly unhelpful when you have thousands of photos. Another example is that it used to be able to add data to a note, and they took that function away too. These were very helpful things, but I'm sure for those whose speech recognition is a problem, that is indeed more important.
 
What we’re *calling* AI today is nowhere near what we *think* AI is supposed to be: artificial general intelligence. The funnier part is that these language models and machine learning are a complete dead end in the pursuit of AGI. AGI will require entirely new techniques and technologies.

Along with a functional definition of "intelligence".

That's been the problem with AI since the 60's. "Intelligence" remains a mirage always on the horizon because we're driven to believe that humanity holds some special ability that's not easily replicated.
 
It is a false statement.
Here it's explained in detail how the feature works. IT DOESN'T REPLACE THE ORIGINAL PICTURE WITH ANOTHER ONE FROM THE INTERNET.
Your problem is your narrow definition of "replace". If you think Samsung has a series of PNGs that it crops and pastes over your moon, that's not what's happening. What's happening is that the same data is encoded in its "moon fixer" network, which makes it easier to shift, reproject, and color-match.

But if you actually looked at the Verge article that @JustAnExpat linked to, Samsung is creating information that simply doesn’t exist in the source image. That’s not super-resolution or anything of the sort, it’s replacing your picture with data from other, better, pictures.
 
100% agreed. ChatGPT "AI" (or applied statistics) is not just a technological dead end (because of hallucinations) and not just a business dead end (because of the inability to create a "blue ocean" and develop profits), but also a dead end because of the ecological harm it generates (source: https://www.theguardian.com/technol...l-intelligence-industry-boom-environment-toll).

I think by this time next year people will have moved on from this type of AI system, and attention will be drawn to something else that's new and flashy.
The staggering energy demands of these systems, which we've thrown into the public with no clear direction, really are a worrisome aspect of all this. It takes something like 10x the energy to run an "AI" session as it does a search. Really, really concerning, given how hard they're trying to drive adoption of it.
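Back-of-the-envelope on what that ratio implies at scale (the 0.3 Wh/search figure is an oft-cited rough estimate, and the query volume below is invented purely for illustration):

```python
# Rough energy comparison: classic web search vs. an LLM query,
# assuming the ~10x ratio claimed above. Illustrative numbers only.
SEARCH_WH = 0.3                  # rough estimate for one web search, Wh
AI_WH = SEARCH_WH * 10           # ~3 Wh per "AI" query under the 10x claim

queries_per_day = 1_000_000_000  # a billion queries, for scale
extra_kwh = queries_per_day * (AI_WH - SEARCH_WH) / 1000

print(f"Extra energy per day: {extra_kwh:,.0f} kWh")  # ~2.7 million kWh/day
```

Even with generous error bars on both per-query numbers, the gap compounds quickly once usage reaches search-engine volumes.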
 
The staggering energy demands of these systems, which we've thrown into the public with no clear direction, really are a worrisome aspect of all this. It takes something like 10x the energy to run an "AI" session as it does a search. Really, really concerning, given how hard they're trying to drive adoption of it.
I tend to think that’s the main distinction between human intelligence and artificial intelligence at this point: power efficiency. We do a lot with 20W.
 
I absolutely respect Apple 100% for not jumping on the generative-AI bandwagon.

I'm dreading the update for Windows Copilot. Just more half-baked feature bloat chasing headlines instead of addressing fundamental issues in the OS. I'll use Copilot as much as I used Cortana: never!
Maybe Copilot in Windows is not that useful. Still, Apple could improve Pages, Numbers, and Keynote with generative AI, like Microsoft did with Copilot in Office. What I have seen looks very useful, especially for businesses and enterprises.
 
Your problem is your narrow definition of "replace". If you think Samsung has a series of PNGs that it crops and pastes over your moon, that's not what's happening. What's happening is that the same data is encoded in its "moon fixer" network, which makes it easier to shift, reproject, and color-match.
I think the problem is that users who obviously don't own the phones they hate on (and can't test them for themselves) keep insisting on the same stuff even though months have passed and no new information to reinforce their beliefs, aka "opinions," has appeared. Actually, I periodically see random posts that disprove the whole "moon shots are fake" drama.

Also, he clearly said: "that's not the Samsung type of AI ..." meaning "finding a better one on the internet" and "replacing it" with that. So what narrow definition of "replace"? He simply said Samsung's AI replaces the original photos with third-party photos (there's no nuance; that's what he meant), and it's obviously not true.

But if you actually looked at the Verge article that @JustAnExpat linked to, Samsung is creating information that simply doesn’t exist in the source image. That’s not super-resolution or anything of the sort, it’s replacing your picture with data from other, better, pictures.
The Verge article is outdated (not to mention users generally weren't able to recreate the same extreme example), and its premise (what it tries to suggest) has been disproven by multiple user tests.
It's not replacing anything; it recognizes what it sees and enhances it as best it can (upscaling resolution and detail), which is why results vary. Google and Samsung have AI-based features that fix blurred photos; does that mean the resulting photos are fake? NO, ABSOLUTELY NOT. Samsung has a feature that removes shadows or reflections from photos; does that mean the result is a fake photo?
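One way to see the distinction both sides are circling: plain interpolation can only blend values already present in the source, so any crisp detail an "enhanced" output gains beyond that has to come from a prior the network learned from other images. A minimal 1-D sketch (pure Python; the "image" is just a row of invented brightness values):

```python
# Linear interpolation doubles resolution but never produces values
# outside the range already present in the source signal.
def upscale_2x(pixels):
    """Insert the midpoint between each pair of neighboring samples."""
    out = []
    for a, b in zip(pixels, pixels[1:]):
        out.extend([a, (a + b) / 2])
    out.append(pixels[-1])
    return out

src = [10, 10, 200, 10]  # one bright "crater" sample in a dark row
up = upscale_2x(src)
print(up)  # -> [10, 10.0, 10, 105.0, 200, 105.0, 10]

# Every interpolated sample stays within the source's min/max:
assert min(up) >= min(src) and max(up) <= max(src)
```

A learned enhancer is not bound by that constraint, which is exactly why its output can look better than the optics alone allow, and also why people argue about whether that counts as "replacing" anything.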
 
...
So I don't think any of the Mac chips have a separate AI coprocessor, including the upcoming 3nm M3 chips.

So the future of AI at Apple looks a long way off. Much like touchscreen Macs.


"The Rundown On Intel's Meteor Lake Chip" (www.howtogeek.com ...): "The details on Intel's 14th-gen processors are largely speculative, but here's what we do know about the upcoming Meteor Lake release."
It says here that Apple's Neural Engine only handles media manipulation, not a true AI coprocessor.



You are about as clueless as Kuo is on this topic. Your posts are flawed on several fronts.


First, Intel's VPU has a media-focused, AI-augmented history.


The 'V' in the VPU name comes from 'Vision'. So wagging your finger at the Apple NPU, saying it is heavily tasked with video AI compute, is a bit like the pot calling the kettle black.


Second, Apple has both the NPU and AMX (Apple Matrix Extensions in the Arm cores); there are two subsets of matrix work that Apple does well. That holds up very well versus Intel's somewhat hobbled AVX (no full AVX-512 in Meteor Lake) and VPU. (Intel has an Advanced Matrix Extensions unit, also named AMX, coming, but it won't arrive in client chips until after Meteor Lake; I think it is Granite Lake first for that.) Apple's NPU cores are not its only AI silver bullet.



Third, the assertion that Intel has a "separate AI chip" is not really true. The SoC tile in Meteor Lake handles a variety of stuff: the DRAM memory controller, security, two "low power" cores (even lower power than the 'E' cores; probably not aimed at most user-level apps at all, mostly background work while the system is idle/'sleeping'), the VPU, and a few other things (display controller, media handling: AV1, video processing/inferencing, etc.).

Some overview here

and here

[Slide: Hot Chips 34, Intel Meteor Lake SoC tile]





So it really is not a whole chip dedicated to AI. Nor is its TOPS (tera-ops-per-second) figure really going to be much better than that of Apple's NPU cores.
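For reference, a vendor TOPS number is just arithmetic over the MAC array; a toy calculation (the unit count and clock below are invented for illustration, not Apple's or Intel's real figures):

```python
# Peak TOPS = (number of MAC units) x 2 ops per MAC x clock rate.
# Each multiply-accumulate counts as two operations by convention.
def tops(mac_units, clock_ghz):
    """Peak tera-operations per second for a simple MAC array."""
    return mac_units * 2 * clock_ghz * 1e9 / 1e12

print(tops(8_192, 1.0))  # a hypothetical 8K-MAC engine at 1 GHz
```

Marketing TOPS figures come from exactly this kind of peak math, which is why they say little about sustained real-world throughput.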



[There are three other chiplets: Compute (the P and E CPU cores), GPU (the GPU cores), and I/O (Thunderbolt controllers, USB, PCIe, Ethernet/Wi-Fi, embedded DisplayPort, etc.).]


Intel isn't really 'buying' a whole lot here by using a different die, and Apple is not 'losing' by not going chiplets. Intel is going to pay a bigger perf/watt penalty than Apple is. Apple is going to make a bigger TSMC chip than any of Intel's TSMC chiplets. There probably will not be some massive "AI processor" die-area allocation gap between the two.
Compared to the likely contemporary Apple M3, the Intel SoC tile will be using an older TSMC process.

"...
Intel Meteor Lake Tile/Chiplet   Manufacturer / Node
CPU Tile                         Intel / 'Intel 4'
3D Foveros Base Die              Intel / 22FFL (Intel 16)
GPU Tile (tGPU)                  TSMC / N5 (5nm)
SoC Tile                         TSMC / N6 (6nm)
IOE Tile                         TSMC / N6 (6nm)
..."



So Intel is unlikely to beat Apple on core count per unit of allocated area. The M2 is on TSMC N5P, and the M3 is likely on some variant of N3. Intel's VPU is going on N6 (and Apple has not been on an N7-family process for several YEARS).
 