PSA: If you are concerned about having your chats used to train the model, you can just turn that off in ChatGPT's settings. Or you can use Bard, which explicitly doesn't save your data.
> I'm not sure how Apple can enforce this rule. It's a super useful program and unfortunately Pandora's box has been opened.

It's a super-hallucinating app for anything above high-school-level tech stuff.
> "Apple is concerned that AI tools could leak the company's confidential data"

What programming are you using it for? For anything beyond some React/Node and basic Python, it hallucinates non-existent libraries and functions.
WOW... if your IQ is under, say, 90, surely you have problems creating prompts that DON'T include anything project- or company-specific. But that's not a problem for anyone with a higher IQ - and that's the VAST MAJORITY of any non-third-world country.
I too use GPT-4 for programming (it REALLY speeds everything up!) but I make absolutely sure no confidential info is left in any kind of code or questions I post as prompts.
Apple is shooting itself in the foot, as it's now reducing the effectiveness/speed of its coders.
> At first it seems like a spiteful decision (eat your own dog food by force) but the reasoning actually makes a lot of sense.

It makes a lot of sense. Programmers who have to use the development software they write will add the features they need and fix the bugs that bother them.
> I work for a large healthcare company. They recently blocked company-wide access to ChatGPT. Employees who need it have to request special access and be approved. Of course this is only when inside the company firewall and on company-owned hardware. They couldn't block me from installing/using it on my personal device.

Very reasonable, as it could present a risk. But on a personal device for personal use? Mmmm....
> I'm not sure how Apple can enforce this rule. It's a super useful program and unfortunately Pandora's box has been opened.

Just blacklist the website/app on company devices & networks. Easy.
> "Hey ChatGPT, using the Swift programming language write a rock-solid OS that never breaks and is lightning fast for use on iPhones."
> "Done."

Oh, and make it just different enough from iOS so as to not get sued....
> Just blacklist the website/app on company devices. Easy.

Then the problem is people using it on their phones and then emailing it to themselves at work.
> Then the problem is people using it on their phones and then emailing it to themselves at work.

The problem is for those people, who will soon find themselves out of a job.
> Apple has restricted employee use of ChatGPT and other external artificial intelligence utilities amid the development of its own similar technology, The Wall Street Journal reports.

Yes, they don't want MS receiving data on what Apple is up to.
> Yes, they don't want MS receiving data on what Apple is up to.

Let's face it, it's not going to be much.
"Apple is concerned that AI tools could leak the company's confidential data"
WOW... if your IQ is under, say, 90, surely you have problems creating prompts NOT having anything project/company-specific. But not for anyone with higher IQ - and that's the VAST MAJORITY of anything non-third-world country.
I too use GPT-4 for programming (it REALLY speeds up everything!) but make absolutely sure no confidential info is left in any kind of code / questions I post as prompts.
Apple shoots themselves in their foot as they're now reducing the effectiveness / speed of their coders.
> Just curious, do you have any tips for someone moving from QA to Dev that may want to use GPT-4 and deals with sensitive data for prompts? (Really just trying to learn as much as I can when it comes to best practices.)

Ask for generic information (e.g., "Show me a recursive function in Python.") and reimplement the answer to suit your specific use case. I still think books and manuals are more useful in the long term, but at least this usage doesn't leak private information and forces you to understand the answer. There's nothing so awkward as being asked why you implemented something one way and only being able to respond, "that's what the AI told me to do." (Though it's kinda fun asking the question.)
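To illustrate, the kind of generic, project-free answer that prompt might return looks something like this minimal sketch (purely illustrative; the function and its details are invented for the example, not tied to any real codebase):

```python
def factorial(n: int) -> int:
    """Recursively compute n! -- a generic example that contains
    nothing project- or company-specific, safe to ask an AI for."""
    if n < 0:
        raise ValueError("n must be non-negative")
    if n <= 1:  # base case: 0! and 1! are both 1
        return 1
    return n * factorial(n - 1)  # recursive step

print(factorial(5))  # -> 120
```

The point of the advice above is that you then reimplement the same pattern (base case plus recursive step) in your own code for your own data, rather than pasting the answer in verbatim.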
> Ask for generic information (e.g., "Show me a recursive function in Python.") and reimplement the answer to suit your specific use case. …

Double-edged sword, using AI to do your work.
> Anyways, huh. Not unexpected. It would probably lead to some of the secret juicy internal rumors being milled around in the models.

It could leak all sorts of information about unannounced products. Imagine Apple starts developing for a particular support chip (not an A-series or M-series chip; think of, say, a screen controller or a modem/radio chip), and developers start asking ChatGPT to write support code for that chip. That alone can leak info about the direction Apple is heading with future products.
> Ask for generic information (e.g., "Show me a recursive function in Python.") and reimplement the answer to suit your specific use case. …

This. It's the difference between looking on Stack Overflow to get an idea ("oh, sorting the data this way is helpful") and then implementing that idea in your own code, versus copying/pasting large blocks of code from Stack Overflow, where (a) you're not the author, and (b) you don't really know how it works or why it's the best solution.