Is it not?

Do you want to monitor every step he takes too? Maybe even monitor every time he breathes?
Mentally?
Of course the data can be encrypted... you think Apple doesn't already perform this in Screentime?

What you just proposed makes zero sense. A token would simply authenticate the two devices for the purpose of allowing access. The data itself cannot be encrypted because then the app would not be able to use it. And if the data is not encrypted then the developer would be able to view it and store it.
The only solution is for Apple to crunch the data then provide it via the API in a way that it is anonymized, but obviously that won't work either. And even if it was not anonymized, Apple would be providing the same data it already provides directly to parents via Screen Time. Then you'd have the equivalent of a fart app with just a different icon.
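For what it's worth, a pairing token and end-to-end encryption aren't mutually exclusive. Here is a minimal sketch (purely illustrative, assuming the parent and child devices exchange public keys when they pair; the OS hooks to feed it a usage report are hypothetical) of how a report could be readable only on the two paired devices, with any relay server seeing nothing but ciphertext:

import CryptoKit
import Foundation

// Hypothetical illustration: each device generates a key-agreement key pair at
// pairing time and sends the public half to the other device. Any server in the
// middle only ever relays opaque sealed boxes.

// Child device's long-term key pair, created once at pairing time.
let childPrivateKey = Curve25519.KeyAgreement.PrivateKey()
let childPublicKey = childPrivateKey.publicKey   // sent to the parent device at pairing

// Parent device does the same.
let parentPrivateKey = Curve25519.KeyAgreement.PrivateKey()
let parentPublicKey = parentPrivateKey.publicKey // sent to the child device at pairing

// Child device: derive a shared symmetric key and encrypt a usage report.
func encryptReport(_ report: Data) throws -> Data {
    let sharedSecret = try childPrivateKey.sharedSecretFromKeyAgreement(with: parentPublicKey)
    let symmetricKey = sharedSecret.hkdfDerivedSymmetricKey(
        using: SHA256.self,
        salt: Data("screen-time-pairing".utf8),   // hypothetical, fixed pairing label
        sharedInfo: Data(),
        outputByteCount: 32
    )
    // The relay only sees this sealed box, never the report contents.
    // .combined is non-nil for the default 12-byte nonce used here.
    return try AES.GCM.seal(report, using: symmetricKey).combined!
}

// Parent device: derive the same key from its own private key and decrypt.
func decryptReport(_ ciphertext: Data) throws -> Data {
    let sharedSecret = try parentPrivateKey.sharedSecretFromKeyAgreement(with: childPublicKey)
    let symmetricKey = sharedSecret.hkdfDerivedSymmetricKey(
        using: SHA256.self,
        salt: Data("screen-time-pairing".utf8),
        sharedInfo: Data(),
        outputByteCount: 32
    )
    return try AES.GCM.open(try AES.GCM.SealedBox(combined: ciphertext), using: symmetricKey)
}

The "encrypted data is useless to the app" objection only holds if the app's own device doesn't hold the key; whether Apple would ever expose the hooks to feed such a report is a separate question.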
As a parent of 2 kids who uses OurPact, I sure hope they get this resolved.
OurPact has been the gold standard for parental control for some time. The interface is Apple-simple and intuitive... though the setup process isn't. But to be fair - has anyone taken a look at what it *really* takes to lock down a kid's phone using screen time and parental controls from Apple? It's a heck of a slog to get it all set up... and you pretty much have to have the device in hand to make changes.
Nice thing about OurPact is that I can make changes in allowed apps, and grant and block use any time from my device... theirs could be a continent away.
My wife and I both use OurPact and have for quite a while now. It gives us a good level of comfort that our kids are safe on the internet (Safari is blocked!), and the "just one more thing..." in Minecraft has been cut to nothing... when time is up, it's up.
I realize you can do similar/the same things in the Apple controls, but OurPact is popular for a reason... it's easy to use and does what it says.
Apple can do whatever they want in terms of making screen time available for third parties.

As long as they have enough money to pay the hefty fines such companies are charged by the EU and others for unfair business practices.

Apple has more money than any publicly traded company.
It’s doing more harm than good.

If you are not teaching your children to responsibly use technology, you may find that they are quickly left behind their peers.
Working in the education sector and using MDM to manage our fleet of iPads, I can say MDM is way more granular than Screen Time, allowing you to manage almost all aspects of the iPad itself. What's more, MDM allows you to see where an iPad is, the content on the iPad, and so many other things. I think that this is a good move by Apple, protecting the end user - the naive end user who doesn't understand exactly what can be divulged to a company with an MDM profile installed on the device. These companies will probably lose the ability to sell off analytics gathered by the MDM profiles to other companies.
Because the vendors of those enterprise MDM apps aren’t giving themselves access to the customer devices, it’s the device owners who get that access. Unlike the Screen Time type apps, the vendor has no access to devices. Do you really not understand the difference?

Enterprise MDM apps are being commercially marketed. I’d like to understand why Apple does not feel they are a privacy and security concern too.

I read what you wrote, both posts. Do you remember what you wrote?

Please read my point again. I’m not speculating anything. I’m wondering if MR (Joe) had the presence of mind to ask Apple if they have the same privacy and security concerns for a commercially marketed app for enterprise use.
MR (Joe), did Apple really say that?
If Apple actually said what MR (Joe) claimed
Spotify has the same level of access as any other developer.

As Apple?

I don’t know exactly what Siri features are implemented in either Spotify or Apple Music, but there does seem to be a misunderstanding on the current status of Siri integration with the Spotify app. In its response statement, Apple states the following:

“We’ve worked with Spotify frequently to help them bring their service to more devices and platforms:
- When we reached out to Spotify about Siri and AirPlay 2 support on several occasions, they’ve told us they’re working on it, and we stand ready to help them where we can.”

To me, that sounds like the ball is in Spotify’s court. Maybe Spotify has denied that’s the case; if so, I’m not aware of it. There may be other issues with respect to Siri integration; I really have no idea.
Because the vendors of those enterprise MDM apps aren’t giving themselves access to the customer devices, it’s the device owners who get that access. The vendor has no access to devices, the device owner does. Do you really not understand the difference?
...deleted...
First, I appreciate the help with my trying to understand this.
Second, lightening up your tone would be appreciated. If that’s impossible then I invite you to cease trying to help.
With that out of the way, I believe it would help to work backward.
I am an employer.
I want to purchase a license for an Enterprise MDM application from a third party company to monitor and control iOS devices. These iOS devices are (1) company owned and (2) employees’ personally owned devices used as BYOD.
My question: What oversight/controls has Apple placed on these third party development companies to verify they are not giving themselves access to my company’s data and the iOS devices?
As Apple?
No, Apple shouldn’t give 3rd party apps access to the same internal APIs its own apps use; that would be a massive security hole.
Apple MIGHT be able to give some/similar abilities through APIs to developers, but everyone who assumes it will be easy/won’t involve considering the security implications has clearly never spent any time developing real software (or alternatively is very, very bad at it). I certainly think it would be fine for Apple to consider that, but depending on the demand for these types of apps it may or may not be worth it.
What Apple should be doing (and I hope they are) is investing more money/resources in the app review process to try and find ways to catch apps that are using features like MDM in non-approved ways sooner. And developers who were using such functionality in non-approved ways shouldn’t be complaining; they should be thanking their lucky stars Apple didn’t just shut them all down and ban them from the App Store to begin with. They had absolutely NO business using MDM, none. It was clearly a violation of the terms of the developer agreement.
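To put a rough shape on what "some/similar abilities through APIs" could look like: here is a sketch of a hypothetical, entitlement-gated surface. None of these types exist in the iOS SDK; the point is only that limits and schedules could be exposed while enforcement and raw usage data stay inside the OS:

import Foundation

// Hypothetical parental-controls API. None of these are real SDK symbols;
// this just illustrates the shape such an API could take.

struct AppLimit {
    let bundleIdentifier: String
    let dailyAllowance: TimeInterval
}

struct DowntimeSchedule {
    let start: DateComponents   // e.g. 21:00
    let end: DateComponents     // e.g. 07:00
}

protocol ManagedChildDevice {
    // The third-party app declares limits; the OS enforces them.
    func apply(limits: [AppLimit]) throws
    func apply(schedule: DowntimeSchedule) throws

    // Usage comes back only as an aggregate summary, delivered to the paired
    // parent device rather than to the developer's servers.
    func requestUsageSummary(completion: @escaping (Result<[String: TimeInterval], Error>) -> Void)
}

Even in this toy form, the hard part is exactly what's described above: deciding who gets the entitlement, and making sure the aggregate reporting can't be abused.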
1. Well, what else would you say?

I'm a developer at one of these companies. Not sure why you assume we sell data, but let me tell you that you are fundamentally wrong. We make our living out of user subscriptions and nothing more.
Because the vendors of those enterprise MDM apps aren’t giving themselves access to the customer devices, it’s the device owners who get that access. Unlike the Screen Time type apps, the vendor has no access to devices. Do you really not understand the difference?
In one case, a device owner is installing software on their own device which gives “unrestricted access”—including location and camera permissions. The analogous situation would be a parent who installs an MDM app, giving them unrestricted access to their own child’s device, including the ability to read emails, check their child’s location, look through their browser history, etc.
But that’s not what’s happening. It’s not the parent (device owner) but instead a vendor (a third party) who has been given that unrestricted access. Apple doesn’t want apps on the App Store that, when installed, permit the vendor to have unrestricted access to customer devices. They do not have a problem with employers being given that same access—which has privacy and security implications—to devices used by their employees.
Actually the use case is very similar. Companies want to make sure devices they own are used appropriately and be able to take action if necessary. Parents want to make sure their children use their device appropriately and take action if necessary.

I can’t imagine you don’t see how these are two completely different uses (of the same technology).
If you still don’t understand the difference, let me ask you a few questions: 1) would you be ok with having various people, who you do not know, having access to the real-time location of your child? 2) Would you be ok with those same people being able to remotely install any software they wished on your child’s device? 3) Would you be ok with persons unknown to you having control over the camera permissions of your child’s device?
1) “MDM gives a third party control and access over a device and its most sensitive information including user location, app use, email accounts, camera permissions, and browsing history”.

Are you not aware that many of the major MDM vendors offer cloud based solutions? No big difference. The major MDM companies (Jamf, VMware, ZuluDesk, etc.) would all have the ability to see the data on devices enrolled in cloud instances. In addition, there are companies that offer managed services utilizing these same MDMs. These are usually small businesses that offer to manage the MDM for customers. They would have the same (if not more) access to the data coming from the devices as the companies we are talking about here.

Wrong, possible but not unlimited, and wrong. First of all, no company, either the MDM vendor or even the organization, will have UNRESTRICTED access. That is flat out false. You cannot "read emails" or "look through a browser history" utilizing MDM. Apple has been very clear with their MDM vendors about what data they can access. For the most part, MDM focuses on a few key functions:
- Initial setup of the device, including app installation
- App installation of both App Store apps and enterprise apps
- Ability to set restrictions on usage and control data access (i.e., ensure that data on the device is not shared in an unsafe manner)
- Ability to collect certain usage data and data about the device (apps installed, phone time, etc.)

As far as tracking a child's location, if a parent is concerned about that, they can easily turn off location tracking for the app on the iPhone and utilize Find My Friends.

"Unrestricted" - you keep using that word; I don't think it means what you think it means. No company, MDM vendor, or person ever has unrestricted access to an iOS device. Apple has done a great job with their MDM protocol of balancing the needs of a company to manage their devices and data while maintaining the privacy of the users.

Actually the use case is very similar. Companies want to make sure devices they own are used appropriately and be able to take action if necessary. Parents want to make sure their children use their device appropriately and take action if necessary.

For #2 & #3, you are 100% wrong. An MDM cannot install any app without permission UNLESS the device has been Supervised. For 99% of supervised devices, that means they are enrolled in Apple's DEP program, which is ONLY available to enterprise and educational customers. These parental control companies would only be able to install apps if the user approved them. The camera claim is COMPLETELY false. MDM can only disable access to the camera (for security purposes). No MDM option exists to enable the camera. Any app that asks for camera use will require approval from the user.

Apple has spent significant time and effort refining their MDM protocols and balancing the need of companies to control their devices and data while maintaining the privacy of end users. Apple continues to move the vast majority of the more restrictive and intrusive MDM functions to only be available on Supervised devices. And Apple's unofficial roadmap indicates this trend will continue.
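To make the restrictions point concrete: MDM delivers restrictions as configuration profile payloads with a fixed set of documented keys. A simplified sketch follows, written as a Swift dictionary for illustration (the payload identifier is made up; allowCamera and allowSafari are keys from Apple's Configuration Profile Reference). It is a list of allow/deny switches, not a window into mail or browsing history:

import Foundation

// Simplified sketch of a Restrictions payload, mirroring keys documented for
// the "com.apple.applicationaccess" payload type. A real profile carries more
// metadata and is delivered as a plist; this dictionary is illustrative only.
let restrictionsPayload: [String: Any] = [
    "PayloadType": "com.apple.applicationaccess",
    "PayloadIdentifier": "com.example.restrictions",   // hypothetical identifier
    "PayloadUUID": UUID().uuidString,
    "PayloadVersion": 1,

    // Allow/deny switches: the MDM can turn the camera or Safari off, but
    // there is no key here that exposes email contents or browsing history
    // to the MDM server.
    "allowCamera": false,
    "allowSafari": false
]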
Seems easier, cheaper, and better in other ways to just not give the kids smartphones.

As a parent of 2 kids who uses OurPact, I sure hope they get this resolved.
This still doesn't make sense. You say the app has access to the data, either on the parent's phone or the child's (there's no other way this would work). What's stopping the app from sending it to the dev's servers?

Of course the data can be encrypted... you think Apple doesn't already perform this in Screentime?
I don't think you're understanding what people want here.
While Screentime APIs could provide access to the data and parental control functions, app developers could build more functionality on top of them.
App developers could make something that actually works in a more coherent manner, as opposed to the Screentime app, which is limited in things like scheduling options and whose navigation is a hot mess.
There's tons of room for improvement.
And again, there would be no reason for the developer to capture or even see the control data between the child and parent's device.
Only the on device app needs to be able to access the data.
We develop in these scenarios all the time. There's no need (and where I work, no legal option) for me to see the customer data our apps are using.
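One way to get that separation in practice (not necessarily what any of these vendors actually do) is to sync the parent's rules through the family's private CloudKit database, which lives in the user's iCloud account and is not readable by the developer's servers, while raw usage data stays on the child's device. A rough sketch, with a made-up record type and field names:

import CloudKit
import Foundation

// Sketch: the parent's app writes a control rule into the user's private
// CloudKit database. The data lives in the family's iCloud account; the
// developer's own backend never sees it.
func publishRule(blockingBundleID bundleID: String, until date: Date) {
    let record = CKRecord(recordType: "ControlRule")   // hypothetical record type
    record["bundleID"] = bundleID as NSString           // hypothetical fields
    record["blockedUntil"] = date as NSDate

    let database = CKContainer.default().privateCloudDatabase
    database.save(record) { _, error in
        if let error = error {
            print("Failed to publish rule: \(error)")
        }
    }
}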
Learning at age 18 or using the school-provided machines seems like not an issue. I was in 7th grade when the iPhone 3G came out, and within a few years nearly everyone at my school had an iPhone, so I lived right through this. Smartphones ruined a lot of aspects of being a kid, and nobody learned self-control until like halfway through college.

How do you define "little"? Do you define a 12 or 13 year old as little? Plenty of 12 year olds have smartphones these days. They are increasingly becoming a necessity in today's society. I have one relative who told me that his children have had to use their smartphones in class to take a test when they didn't have enough computers to go around. Elementary school children are routinely using iPads and Chromebooks in class to do work.
If you are not teaching your children to responsibly use technology you may find that they are quickly left behind their peers.
It's impossible unless Apple implements a new level of sandboxing for these apps that makes them pretty much hermetic except for access to this API, and that's hard. Otherwise, nothing stops an app with access to this data from sharing it.

Sure there is. Some combination of Family Sharing and multi-user authentication could easily provide security.
Well, there's nothing wrong with their argument this time. I don't think I'd update my iOS if they added screentime APIs as described here.

Apple is blowing smoke up everyone's ass. The term "security" has become everybody's bitch - it's always used as an excuse for someone to do (or not do) something they shouldn't be doing. Apple is more and more frequently doing stupid ****. They're slowly becoming the Microsoft of old we used to despise.
Learning at age 18 or using the school-provided machines seems like not an issue. I was in 7th grade when the iPhone 3G came out, and within a few years nearly everyone at my school had an iPhone, so I lived right through this. Smartphones ruined a lot of aspects of being a kid, and nobody learned self-control until like halfway through college.
That and there's no need or compelling reason to have one as a kid. A dumbphone is legitimately useful, though. I'm worried because today's parents didn't have this tech as children and don't know how messed up things are.
1) “MDM gives a third party control and access over a device and its most sensitive information including user location, app use, email accounts, camera permissions, and browsing history”.
As I said in an earlier post, that’s what Apple says. I didn’t make that up; it’s directly from their press release. It clearly states MDM gives access to email accounts and browsing history. If you have a disagreement with that, maybe that’s best taken up with Apple.
2) “Unrestricted access” is also from the press release. Schiller uses the term as well in his email:
“No one, except you, should have unrestricted access to manage your child’s device, know their location, track their app use, control their mail accounts, web surfing, camera use, network access, and even remotely erase their devices”.
Apparently you and Apple differ on what “unrestricted access” means.
3) As I said in an earlier post, Apple doesn’t have a problem when enterprises use MDM to access and exert control over devices if that company has a right to all of the data and use of those devices.
(Apple also doesn’t have a problem with hosted/third party/managed service providers of MDM servers. Companies choose to outsource many different aspects of IT, including VPN, Exchange/email, DNS/AD, etc. Any use of service providers for these types of services can bring with it privacy and security concerns, however enterprises that outsource those functions are well aware of the potential issues.)
4) Apple definitely has a problem with a private, consumer-focused app business installing MDM control over a customer’s device.
5) I’m sure you understand the difference between an allowable internal-only enterprise use vs. a prohibited consumer-facing app. OP did not.
6) #3 is why MDM exists. #4 has been against App Store rules since 2017 and is why the offending apps were removed. Apps that didn’t abuse MDM remain.
7) For any of the companies behind the pulled apps to pretend they had no idea what the problem was with their apps, or what they needed to do to be compliant, is disingenuous and not at all believable. It’s simple: don’t use MDM in a non-enterprise setting.
8) For vendors of the apps abusing MDM to call Apple’s actions anticompetitive doesn’t make any sense when Apple doesn’t have a competing app for sale. Instead of benefiting from those app removals, Apple actually loses money—15/30% of all the revenue those apps make.
9) I can’t imagine that the EU complaints will result in any action against Apple, but you might disagree.
10) Apps that don’t want to get pulled from the App Store shouldn’t flagrantly violate the rules. But when you get caught, please do us a favor and don’t insult our intelligence by claiming to be caught off guard/not given sufficient notice/unaware of the reason you got pulled/don’t know what you have to change in order to be compliant/etc. You knew you were violating App Store rules (and no doubt the T&C for MDM as well) and have been on borrowed time for months/years. It finally caught up with you, and crying to the EU and writing snarky open letters to Apple is not likely to be helpful.
2012 is not relevant. The rules were updated in 2017, giving these vendors plenty of time to become compliant. Some did, and remain in the App Store. Those that refused to stop installing MDM profiles were blocked/removed.

You are simply taking the CEO's and executives' words as truth. These are PR words designed to mislead and invoke fear. They differ significantly from the actual MDM documentation that developers have access to. It's simply a lie, or misleading at best, to say "unrestricted". They were also the ones that lied about battery throttling until they got caught.
Fact: MDM has been used for Parental Control since at least 2012 - over 6 years ago. Not a word from Apple.
Fact: iOS 12 is released with Screen Time for parental control in 2018, and suddenly the same apps get removed.
You can read more on Apple executives' PR vs. the MDM documentation here: https://medium.com/@ourpactapp/there-used-to-be-an-app-for-that-41344f61fb6f
So this raises four questions:
1. Why did Apple allow non-business and non-enterprise developers to use MDM for the last 6 years?
2. How is it possible that this "unrestricted" MDM feature was used and escaped detection for 6 years?
3. Why is it now a problem only after screen time is released?
4. Why do Apple executives contradict their own MDM documentation?
So IMO Apple was either incompetent in their vetting of this "god mode" use of MDM, or they simply did not care until now, with the release of Screen Time. I am going with the latter. Now that they have Screen Time, these apps are no longer beneficial to the ecosystem - so they are removed for MDM violations going back six years (which Apple allowed).