Finally they’re getting the ball rolling. It’s quite difficult to sell an Apple Intelligence phone without Apple Intelligence. 🤷
> The problem with this is you can take a screenshot and that hides the metadata. They can put a watermark on the images, but they have AI tools online to remove watermarks.

At least this was a better argument than what other people have said against it. Still, it would only take a Google search after looking at the photo, contacting the person if you know them, or other things to find out the truth of the photo. Unless the whole world tries to gaslight you into believing that it is true, which would be kinda funny.
> Lol do Brits actually care that much? I wouldn’t mind at all if something were only available in UK English (though I’m admittedly more well-versed in it than probably 90%+ of Americans).

I do, because if I wanted to use Apple Intelligence in the UK I had to switch to the US region, which broke some of my shortcuts and meant the HomePods didn’t know who I was.
Anyone get off of the wait list yet?
> Yes, because metadata is trivially easy to modify and/or delete. Nobody has developed a reliable system of IDing AI-generated images, and it causes (so far relatively minor) problems every day.

So because metadata can be modified, and because some people are more adept than others at creating fake photos, that is why we can’t have it? Is it too hard to run a Google search or contact the person in question to see if the image holds any truth? I would imagine that a celebrity, politician, etc. would be quick to jump on social media to confirm or deny a photo that’s not true. I still do not think the arguments you are making hold up against it. It just sounds like protecting people from themselves because you don’t trust that most people will have fun with it rather than use it for bad.
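The "metadata is trivially easy to modify and/or delete" point is easy to demonstrate: EXIF data (camera model, GPS, any provenance tags) lives in a JPEG's APP1 segment, and anything that rewrites the file without copying that segment silently drops it — which is effectively what a screenshot does. A minimal sketch in Python (`strip_exif` is a hypothetical helper written for this post, not part of any product being discussed, and it ignores JPEG edge cases like restart markers):

```python
import struct

def strip_exif(jpeg: bytes) -> bytes:
    """Remove APP1 (EXIF/XMP) segments from a JPEG byte stream."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(jpeg[:2])
    i = 2
    while i + 4 <= len(jpeg) and jpeg[i] == 0xFF:
        marker = jpeg[i + 1]
        if marker == 0xDA:  # SOS: compressed image data follows, copy the rest
            out += jpeg[i:]
            return bytes(out)
        # Each segment header: 0xFF, marker, 2-byte big-endian length
        # (the length includes its own two bytes).
        (length,) = struct.unpack(">H", jpeg[i + 2:i + 4])
        segment = jpeg[i:i + 2 + length]
        if marker != 0xE1:  # keep every segment except APP1 (EXIF/XMP)
            out += segment
        i += 2 + length
    out += jpeg[i:]
    return bytes(out)
```

Re-saving, screenshotting, or uploading through most social platforms discards this data in exactly the same way, which is why metadata-based provenance labels are so easy to lose.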
Yes. A very small minority of people are adept enough at Photoshop to make convincing fakes; even fewer are willing/able to pay for the requisite software. Versus Apple giving the ability to make convincing fakes to literally tens of millions of people.
I don't. But unfortunately plenty of unscrupulous people do. Apple doesn't want to give those people even more tools to spread lies and deception, and they're 100% correct.
People can have fun and will have fun with Apple's decidedly cutesy version of generative AI. There's absolutely no reason for Apple to make them photorealistic, particularly since there are plenty of other tools out there already. And it kinda does mean that, especially where, as here, the risks and downsides far outweigh the marginal utility.
> Quick feedback on 18.2 Beta 1:
>
> - The ChatGPT integration with Siri asks you to confirm ChatGPT use every single time it's needed. So annoying! Why can't it just go straight to ChatGPT when Siri needs it?
> - Siri doesn't read the output from ChatGPT aloud, it just displays the text. So annoying!
>
> They NEED to fix these two things before launching 18.2.

The first part of your post is what Apple has specifically said would happen every time Siri tries to tap into ChatGPT. That’s not a bug.
> The first part of your post is what Apple has specifically said would happen every time Siri tries to tap into ChatGPT. That’s not a bug.

Okay, it actually started working for me. Confirmed it doesn't bother reading the ChatGPT output out loud. But you can actually turn off the "Request access to ChatGPT" prompt in the options.
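For what it's worth, the behavior being debated here is just a per-request consent gate with a user-facing toggle. A toy Python model of it (names like `confirm_external_requests` are invented for illustration and are not Apple's API):

```python
from dataclasses import dataclass, field

@dataclass
class Settings:
    # Mirrors the beta's toggle for confirming each ChatGPT hand-off
    # (name invented for this sketch).
    confirm_external_requests: bool = True

@dataclass
class Assistant:
    settings: Settings = field(default_factory=Settings)
    prompts_shown: int = 0  # how many confirmation dialogs the user has seen

    def ask_external(self, query: str, user_approves: bool = True):
        """Route a query to an external model, gated on per-request consent."""
        if self.settings.confirm_external_requests:
            self.prompts_shown += 1  # dialog appears on EVERY request by design
            if not user_approves:
                return None          # user declined this hand-off
        return f"external answer to: {query}"
```

With the toggle on, two queries mean two prompts; switching it off routes straight through, which is the settings workaround mentioned above — the repeated prompt is intended behavior, not a bug.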
> At least this was a better argument than what other people have said against it. Still, it would only take a Google search after looking at the photo, contacting the person if you know them, or other things to find out the truth of the photo. Unless the whole world tries to gaslight you into believing that it is true, which would be kinda funny.

I’m just a very strong believer that deepfake images and videos should not only be heavily regulated, but illegal. There are very serious consequences to this technology. It can be used to defame people, steal their identity, and cause personal harm. And I don’t think I’m exaggerating one bit.
> UK language support at bloody last!

Since the UK is not in the EU, it seems to be going better for a few things...
> Mail categories are live. Haven't seen anyone post that yet. Been waiting for that one.

It was posted on the front page of MacRumors five minutes before you commented.
> A notable mention that keeps being forgotten is that all Apple Intelligence is US-only, and the rest of the world has to wait unless you change your region, which will in turn play with your language and spelling.

That's why I'm skipping the iPhone 16 lineup and sticking with my iPhone 12, which is basically identical to every release since then, minus niche camera features. There's tons of stuff to be sorted before I invest in Apple's AI promise.
> Is it too hard to run a Google search or contact the person in question to see if the image holds any truth?
I would bet a ton of money that if a developer comes along that uses the power of Image Playground to produce realistic images in their app, if that would be possible with the API, then that would gain a lot more traction and user involvement than the “cutesy” cartoons of Image Playground.
> I’m just a very strong believer that deepfake images and videos should not only be heavily regulated, but illegal. There are very serious consequences to this technology. It can be used to defame people, steal their identity, and cause personal harm. And I don’t think I’m exaggerating one bit.

It’s not like this would prevent lawsuits from happening. I would be on board with regulations against using people and children in nefarious ways, like a politician being decapitated, or photos of a female celebrity naked who has never been naked on camera before, or something similar. There is such a thing as the Streisand Effect though, and the more people try to remove or prevent this stuff from taking off, the more people are going to gravitate towards it.
> For most people? Yes, absolutely. As we've seen again and again and again, people believe things they read and see online, even when they shouldn't. The vast majority of people have little to no media literacy. It's an enormous problem, it's only going to get worse, and there's truly no reason whatsoever for Apple to add to it. Is it too hard for you, someone who is obviously tech-savvy, to use any number of other tools to generate realistic images?

It’s not hard for me to make a decision about something I see or read online. If I see a photo that doesn’t look right, then I typically find the source to see if it’s corroborated by other people or other sites. Same with, say, news articles.

> I would bet a ton of money that if a developer comes along that uses the power of Image Playground to produce realistic images in their app, if that would be possible with the API, then that would gain a lot more traction and user involvement than the “cutesy” cartoons of Image Playground.

Maybe. But Apple obviously doesn't want "traction," or at least doesn't want it at the expense of safety and its reputation. Again, there are dozens if not hundreds of easily accessible tools you can use to generate realistic images if you so desire. Apple doesn't want to facilitate that.
> That's already in 18.1, which you can grab the release candidate of right now if you don't want to wait till next week.

I’ve not seen it?