PSA: If you are concerned about having your chats used to train the model, you can turn that off in ChatGPT's settings. Or you can use Bard, which explicitly doesn't save your data.
 
I’m not sure how Apple can enforce this rule. It’s a super useful program and unfortunately Pandora’s box has been opened.
It's a super-hallucinating app for anything beyond high-school-level tech stuff.
 
"Apple is concerned that AI tools could leak the company's confidential data"

WOW... if your IQ is under, say, 90, surely you have problems writing prompts that don't include anything project- or company-specific. But that's not a problem for anyone with a higher IQ - and that's the vast majority of people outside the third world.

I too use GPT-4 for programming (it REALLY speeds everything up!) but I make absolutely sure no confidential info is left in any code or questions I post as prompts.

Apple is shooting itself in the foot, since it's now reducing the effectiveness and speed of its coders.
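A minimal sketch of the kind of scrubbing described above (all the "sensitive" names here are hypothetical stand-ins invented for illustration, not real identifiers, and a real workflow would pull the list from project naming conventions):

```python
import re

# Hypothetical project-specific identifiers to scrub before pasting
# code into a prompt. These names are made up for the example.
SENSITIVE_NAMES = ["ProjectTitan", "InternalFrameworkAPI", "secret_endpoint"]

def redact(source: str) -> str:
    """Replace project-specific names with generic placeholders."""
    for i, name in enumerate(SENSITIVE_NAMES):
        source = re.sub(re.escape(name), f"Placeholder{i}", source)
    return source

snippet = "result = InternalFrameworkAPI.fetch(secret_endpoint)"
print(redact(snippet))  # result = Placeholder1.fetch(Placeholder2)
```

This only catches names you remember to list, of course; it's a habit, not a guarantee, which is presumably why companies ban the tools outright rather than trust everyone to redact.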
What programming are you using it for? For anything beyond some React/Node and basic Python, it hallucinates nonexistent libraries and functions.
 
I can see some Apple employee asking ChatGPT to summarize a research document about an upcoming product. There goes the secrecy. I don't blame Apple for not trusting Microsoft or any other competitor.
 
If Tim had been President of the Twelve Colonies instead of the idiot Adar, Caprica Six would never have been able to compromise the CNP.😤 We saw what happened as a result of the Cylons inserting code into the CNP.😒
At first it seems like a spiteful decision (forcing them to eat their own dog food), but the reasoning actually makes a lot of sense.
It makes lots of sense. Programmers who have to use the development software they write will add the features they need and fix the bugs that bother them.

If Apple had forced their accounting department to use Numbers instead of Excel those 20-odd years back, we'd have a viable competitor to Excel. They could have bought Improv from Lotus and built on that. Improv was THE killer spreadsheet of its time. M$ copied a lot of its features to grow Excel into the industry standard.
 
Quite a few companies have banned this for privacy and data-harvesting reasons.

Not surprised that Apple has banned it too.
 
I work for a large healthcare company. They recently blocked company-wide access to ChatGPT. Employees who need it have to request special access and be approved. Of course, this only applies inside the company firewall and on company-owned hardware. They couldn't block me from installing/using it on my personal device.
Very reasonable, as it could present a risk. But on a personal device for personal use? Mmmm....
 
Then the problem is people using it on their phones and then emailing the output to themselves at work.
The problem is with those people who will soon find themselves out of a job.

But those working on secretive projects won't be using ChatGPT anyway. There are strict protocols and processes that have been in place for years to keep them and their work siloed away from the rest of the company.
 
"Apple is concerned that AI tools could leak the company's confidential data"

WOW... if your IQ is under, say, 90, surely you have problems writing prompts that don't include anything project- or company-specific. But that's not a problem for anyone with a higher IQ - and that's the vast majority of people outside the third world.

I too use GPT-4 for programming (it REALLY speeds everything up!) but I make absolutely sure no confidential info is left in any code or questions I post as prompts.

Apple is shooting itself in the foot, since it's now reducing the effectiveness and speed of its coders.

Just curious, do you have any tips for someone moving from QA to Dev who may want to use GPT-4 but deals with sensitive data in prompts? (Really just trying to learn as much as I can when it comes to best practices.)
 
Just curious, do you have any tips for someone moving from QA to Dev who may want to use GPT-4 but deals with sensitive data in prompts? (Really just trying to learn as much as I can when it comes to best practices.)
Ask for generic information (e.g., "Show me a recursive function in Python.") and reimplement the answer to suit your specific use case. I still think books and manuals are more useful in the long term but at least this usage doesn't leak private information and forces you to understand the answer. There's nothing so awkward as being asked why you implemented something one way and only being able to respond "that's what the AI told me to do." (Though it's kinda fun asking the question.)
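The kind of generic answer that advice describes might look like this textbook sketch (nothing project-specific, just the standard recursive example you'd then adapt to your own use case):

```python
def factorial(n: int) -> int:
    """Classic recursive function of the sort a generic prompt returns."""
    if n <= 1:              # base case stops the recursion
        return 1
    return n * factorial(n - 1)  # recursive step

print(factorial(5))  # 120
```

The point of reimplementing rather than pasting is exactly what the comment says: you end up able to explain every line yourself.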
 
Ask for generic information (e.g., "Show me a recursive function in Python.") and reimplement the answer to suit your specific use case. I still think books and manuals are more useful in the long term but at least this usage doesn't leak private information and forces you to understand the answer. There's nothing so awkward as being asked why you implemented something one way and only being able to respond "that's what the AI told me to do." (Though it's kinda fun asking the question.)
Double-edged sword, using AI to do your work.
 
Anyway, not unexpected. It would probably lead to some of the juicy secret internal rumors being milled around in the models.
It could leak all sorts of information about unannounced products. Imagine Apple starts developing for a particular support chip (not an A-series or M-series chip - think of, say, a screen controller or modem/radio chip), and developers start asking ChatGPT to write support code for that chip. That could leak info about the direction Apple is heading with future products.

There was a case, years back, where one of the track-your-runs app companies (Strava?) ended up with precise maps of the perimeters of a bunch of military bases (some secret?) because on-base personnel would go for a daily run around the perimeter of the base for exercise. People rarely put enough thought into the privacy/security implications of things like this: being stationed on a base happens, going for a run is a good thing, tracking your runs can be helpful, but when you put all the pieces together...
 
Ask for generic information (e.g., "Show me a recursive function in Python.") and reimplement the answer to suit your specific use case. I still think books and manuals are more useful in the long term but at least this usage doesn't leak private information and forces you to understand the answer. There's nothing so awkward as being asked why you implemented something one way and only being able to respond "that's what the AI told me to do." (Though it's kinda fun asking the question.)
This. It's the difference between looking on Stack Overflow to get an idea ("oh, sorting the data this way is helpful") and then implementing that idea in your own code, versus copying/pasting large blocks of code from Stack Overflow, where (a) you're not the author, and (b) you don't really know how it works or why it's the best solution.
 
Apple just wants to avoid a Tron situation, the last thing they need are programs fighting each other on The Grid.
 