Apple should have been working on a smarter Siri ages ago. They shouldn't have waited until they found themselves flat-footed; Siri is on all their devices, and they should have made it the absolute best it could be from the beginning, constantly improving it.
Yup. So much for Apple skating to where the puck is going...lol. At least with regards to AI.
 
It's actually getting worse in some cases.

Some have reported that GPT-4 is actually eight smaller expert models combined rather than one monolithic network. That makes it extremely inefficient for what you are getting out of it. Literally scorching the planet to death just to play with a word machine.

Some have also noticed model decay (sometimes called model collapse). This is what happens when AI models are trained on data created by other AI models. It's similar to what happens when a VHS tape is copied again and again and again: the quality degrades with each generation.

They go from being trained on high quality data to being trained on lower and lower quality data.
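That feedback loop is easy to demonstrate with a toy simulation (this is an illustrative sketch, not any real training pipeline): repeatedly fit a Gaussian to a small sample drawn from the previous generation's fit, and the estimated spread collapses toward zero over the generations.

```python
import random
import statistics

def degrade(generations: int = 300, sample_size: int = 5,
            start_sigma: float = 1.0, seed: int = 42) -> list[float]:
    """Each 'model' is just a Gaussian fit to samples from its predecessor.

    With small samples, the fitted spread drifts downward generation after
    generation: a toy analogue of models training on each other's output.
    """
    rng = random.Random(seed)
    sigma = start_sigma
    history = [sigma]
    for _ in range(generations):
        # draw "synthetic data" from the previous generation's model
        samples = [rng.gauss(0.0, sigma) for _ in range(sample_size)]
        # re-fit the next generation on that synthetic data
        sigma = statistics.stdev(samples)
        history.append(sigma)
    return history

history = degrade()
print(f"generation 0 spread: {history[0]:.3f}")
print(f"final spread:        {history[-1]:.3g}")
```

The spread never recovers because each fit can only lose information relative to the distribution it sampled from; that is the VHS-copy effect in miniature.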

So imagine a future where an AI is surfing an internet full of AI-generated garbage: fake articles, fake images, and general enshittification.
You just described modern journalism. 😄
 
I for one am not on the AI hype bandwagon. I just see no reason the average person needs to manipulate or generate every little thing. With all these AI features on the Pixel, for example, at what point does the picture you took no longer exist, with everyone just sharing fake photos that never actually happened the way they're pictured? AI has its uses; I just don't think those uses should be available to the masses at the scale they are, or are planned to be. I've already read so many users talk about how they don't do any actual work (school or employment) because they just use AI services like ChatGPT. Yes, they have to look it over and examine it, but they're effectively not learning anything or doing any real work. That's just how I personally feel.
 
By the time Macs get really good at generative AI and doing your work for you, they will also be great at making sure the images and works you create on your own are up to the standards of that time. No need for downvotes or moderators in that brave new world: every place you post anything will already have a full understanding of the meaning and context of any image the moment you post it. That is what this type of software can really do, and it's worth the millions of investment dollars it takes to bring it to the public.
I always wondered how we eventually get to the Orwellian population with the glazed over, expressionless face. This explains it.

Once in a while I try something like “Hey Siri, activate the alarm on my house,” and she will answer, “OK, I have set an alarm for 7:00 tomorrow.”

Maybe it would be better in English; I'm speaking Danish to her.

And if I'm in a conversation with someone and raise my hand, the Apple Watch will think I'm talking to Siri and she will interrupt us. I know that can probably be deactivated, but it's still stupid as s***.
Trust me, it's not any better in English. Siri can handle “Call my wife, mobile” but is incapable of “Call my wife, home,” because Siri doesn't think there is a home number. 🤦‍♂️

There are so many stupid examples of stuff like this which just makes me laugh any time I read one of these articles.
 
We saw instances of cyberbullying even before the iPhone was released -- Megan Meier was cyberbullied (on MySpace) and committed suicide in 2006.
People have been doing bad things on the internet for much longer than that. I had to study the 1993 incident titled "A Rape In Cyberspace" in computer ethics class. Here is the wiki article giving the details: https://en.wikipedia.org/wiki/A_Rape_in_Cyberspace

(Edit to add: this was decades ago when I was getting my BSCS. Are ethics classes still mandatory for CS majors?)
 
Federighi's team is also looking at integrating AI into Xcode to help developers write code more quickly, bringing it in line with services like Microsoft's GitHub Copilot.
This should be the top priority for Apple. GitHub Copilot has dramatically increased both my personal productivity and that of my entire engineering team. Using it well is definitely a skill you have to learn, because until you do, it just seems like fancy autocomplete.

But for Apple, increasing the productivity of everyone who uses Xcode this dramatically should be the number-one corporate goal.
 
If you don't believe that, then just look at what mobile phones and social media have done to our youth. Every social scientist, psychologist, anthropologist, and mental health expert is in agreement that these ‘tools’ have caused havoc.
They aren't in agreement. For one thing, the situation with social media is complicated and nuanced. It's provided connection and support for a lot of young people that they might not have found otherwise.
 
I love playing with generative AI (Stable Diffusion), but Apple's hardware is complete garbage for AI right now. Nvidia is the AI king, and since there's zero support for eGPUs with Apple Silicon, Apple has painted itself into a corner for now.

The current state of Apple Silicon is the only thing holding me back right now from switching back to Mac.
 
I am all in for a smarter Siri. For now it's a joke-telling machine that my 10-year-old son enjoys asking questions it cannot answer correctly.

If I had put millions of dollars into building that, I would be disappointed.
More than a few million, IMO. Reminder: they acquired Siri back in 2010. From Wikipedia:
Siri's original release on iPhone 4S in 2011 received mixed reviews. It received praise for its voice recognition and contextual knowledge of user information, including calendar appointments, but was criticized for requiring stiff user commands and having a lack of flexibility. It was also criticized for lacking information on certain nearby places and for its inability to understand certain English accents. In 2016 and 2017, a number of media reports said that Siri lacked innovation, particularly against new competing voice assistants.

So yeah: mildly unimpressive in 2011 (lack of flexibility, etc.), and poor against everyone else by 2016/2017(!!).
That was 6-7 years ago, yet the same criticisms from 2011 and 2016/2017 still apply.
 
I for one am not on the AI hype bandwagon. I just see no reason the average person needs to manipulate or generate every little thing. With all these AI features on the Pixel, for example, at what point does the picture you took no longer exist, with everyone just sharing fake photos that never actually happened the way they're pictured? AI has its uses; I just don't think those uses should be available to the masses at the scale they are, or are planned to be. I've already read so many users talk about how they don't do any actual work (school or employment) because they just use AI services like ChatGPT. Yes, they have to look it over and examine it, but they're effectively not learning anything or doing any real work. That's just how I personally feel.
But just think of all of the 'positive uses' for government and political posturing and nonsense... :(
 
I love playing with generative AI (Stable Diffusion), but Apple's hardware is complete garbage for AI right now. Nvidia is the AI king, and since there's zero support for eGPUs with Apple Silicon, Apple has painted itself into a corner for now.

The current state of Apple Silicon is the only thing holding me back right now from switching back to Mac.
True for training, but not necessarily for low-power inferencing, although of course Nvidia plays there as well with Jetson, Xavier, etc. AMD has its ROCm stack, which effectively lets AMD hardware 'work like Nvidia' at some levels to try to compete, but Nvidia is certainly the gorilla in the room for many cases.
 
This may turn into a rant, sorry in advance.

Why is it called Artificial Intelligence when there is literally nothing on Earth that is “artificial”? Anything and everything made on Earth is not artificial. Humans are not exempt from this, so anything “created” by humans is natural.
 
#1 most critical new feature needed: more accurate voice recognition, for Siri commands and voice-to-text. This is absolutely paramount. It must also include the ability to “teach” Siri how you say certain words, and how to make distinctions between words like “pitchers” (in baseball) and “pictures”, so that she doesn’t write the wrong thing 15 times while you’re watching a game and texting friends.

Also, Google is kicking butt with photo features like Magic Eraser and Audio Magic Eraser, as well as Best Take and other great new tools. These features seem like things Apple could and should have come out with first. Even the names are Apple-like. Apple camera users shouldn't have to get by with second best.

I'm looking forward to what Apple comes up with, but I'm also kind of surprised that they seem to be so far behind in an area whose importance seems pretty obvious. Love that they want to lead in spatial computing and automobiles, but future success in those areas will almost certainly be linked to AI excellence.
 
AMD has their whole ROCm interface allowing effectively AMD to 'work like NVidia' at some levels

Sadly, ROCm is, at least currently, only supported on Linux.

Interesting comparison: I get roughly 18-20 it/sec (iterations per second) on my Nvidia RTX 4070, and about 1-2 it/sec on my Radeon RX 6700 XT. The M2 Ultra currently performs even worse (less than 1 it/sec).
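For anyone comparing such numbers, it/sec in these Stable Diffusion tools is just denoising steps divided by wall-clock time, so figures are only comparable at the same step count, resolution, and sampler. A minimal, backend-agnostic sketch of the measurement (the `step` callable here is a hypothetical stand-in for one denoising iteration):

```python
import time
from typing import Callable

def iters_per_second(step: Callable[[], None], iterations: int = 20,
                     warmup: int = 3) -> float:
    """Time `iterations` calls to `step` and report iterations per second.

    A few warmup calls are discarded first, since the first iterations of a
    GPU workload typically include one-off compilation/allocation costs.
    """
    for _ in range(warmup):
        step()
    start = time.perf_counter()
    for _ in range(iterations):
        step()
    elapsed = time.perf_counter() - start
    return iterations / elapsed

# Example with a dummy step standing in for a real denoising iteration:
rate = iters_per_second(lambda: sum(range(10_000)))
print(f"{rate:.1f} it/sec")
```

That warmup is why single-image timings can make a GPU look slower than its steady-state it/sec figure.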
 
I’m looking forward to what Apple comes up with, but also kinda surprised that they seem to be so far behind in this area where the importance seems pretty obvious.

Honestly, I'm not really shocked. GPU power has always been underwhelming on Apple hardware, which is why game devs mostly skip Mac ports these days. Even Blizzard, which has long done simultaneous Mac/PC releases, skipped the Mac for its latest games like Diablo 4.

I really hope things improve with the M3, but I'm not holding my breath.
 
I just wish that they were not waiting until possibly June to show us anything new with Siri. Those of us who use Siri regularly have been waiting quite a long time to see improvement.
 
#1 most critical new feature needed: more accurate voice recognition, for Siri commands and voice-to-text. This is absolutely paramount. It must also include the ability to “teach” Siri how you say certain words, and how to make distinctions between words like “pitchers” (in baseball) and “pictures”, so that she doesn’t write the wrong thing 15 times while you’re watching a game and texting friends.

Also, Google is kicking butt with photo features like Magic Eraser and Audio Magic Eraser, as well as Best Take and other great new tools. These features seem like things Apple could and should have come out with first. Even the names are Apple-like. Apple camera users shouldn't have to get by with second best.

I'm looking forward to what Apple comes up with, but I'm also kind of surprised that they seem to be so far behind in an area whose importance seems pretty obvious. Love that they want to lead in spatial computing and automobiles, but future success in those areas will almost certainly be linked to AI excellence.
This is John Gruber's take with respect to Apple and AI. I've quoted the main gist, but you can read the whole thing if you want:

"What I have heard from little birdies in Cupertino is not that there was a miss on this already. Apple is almost never at the forefront of stuff like this. They’re a deliberate company. Their goal, as with any new technology, is to integrate it into products in meaningful ways best, not first. That’s why there’s no internal anxiety that they’ve already missed anything related to AI."

"The anxiety inside Apple is that many people inside do not believe Apple’s own AI/ML team can deliver, but that the company — if only for privacy reasons — is only going to use what comes from their own AI/ML team."

 
Sadly, ROCm is, at least currently, only supported on Linux.

Interesting comparison: I get roughly 18-20 it/sec (iterations per second) on my Nvidia RTX 4070, and about 1-2 it/sec on my Radeon RX 6700 XT. The M2 Ultra currently performs even worse (less than 1 it/sec).

Yes on ROCm, but a good chunk of the backend systems used for training are Linux in one form or another, often in some form of pooled container/virtual architecture for GPU/compute resource management, so it makes sense for them, as ROCm is no small investment in time and money.

We did a few rounds looking to make use of a few AMD GPUs left over from a specific project, but ROCm has been kind of a moving target for a while; we always seemed to be chasing down dependencies. AFAIK, they're still actively working on it. I don't immediately remember the model of the AMD cards or the comparison numbers, as it was more of a 'can we use these for smaller projects' exercise versus all of the V100s and the like (too many cards to keep replacing, so it's more a matter of rolling updates/additions), and I was curious whether the relatively lower-cost AMDs with ROCm were at least reasonable in performance per dollar. Going from memory, they did OK overall (not max performance), but we ran into too many annoyances to use them reliably in any of our GPU pools.

What did you use for the benchmark comparison?
That doesn't seem too far off the mark for the 4070 vs. the 6700 XT; random Stable Diffusion generation results posted online put the gap at at least 10x. I did a quick search for MLPerf training results, but it's hell trying to find and compare them on an iPad…

Not so stellar on the M2. Might be time for some rainy-day benchmarks or searching. I guess the 6700 XT isn't that old, but I'm wondering where the M2 family lands across different types of workloads.
 
Where was Apple 10 years ago? Siri is completely useless compared with Google Assistant and has been since the beginning.
A couple of years removed from losing Steve Jobs, and a year from firing Scott Forstall and Richard Williamson over Maps decision-making.

I'm not claiming that Siri would definitely be the leader in digital assistants had these things not occurred, but I don't doubt they contributed to the feeling that Siri was left to wither on the vine after that.
 