Not sure which rock you're living under, but companies aren't going with Apple (or Windows) - the new model is all BYOD, and in the office it's just lightweight thin clients... that's how you really save costs. Everything is hosted in the cloud, so there's no more supporting individual devices. Soon enough the vast majority of work-related computing will be handled by the cloud - even a Chromebook is enough compute to get you by.

A coworker came to my office to ask about his corporate MacBook Pro, as it wouldn't sit flat on his table. I had a look at it and determined that he had a bulging battery and that he should shut it down and get it serviced. He talked to our Mac IT guy; it was out of warranty, but we have enterprise support, which is good for five years, so he's getting a swap. The Apple enterprise service provides on-site, by-next-day service. So he did a backup to CrashPlan, and I assume that he's up and running on his new system today. The IT guy showed me the Apple apps that manage corporate systems and the status of the system in terms of warranty and service.

If you're allowing BYOD as a company, you are absolutely nuts. We do allow BYOD at work, but the employee has to install a bunch of corporate security software to be allowed on the network, and that includes admin access to your system. I use two personal systems on the corporate network, and they both have the corporate security software installed. That's a fair trade for the flexibility that I get with my systems.

If there were no enterprise customers using Apple hardware, why would Apple have enterprise support? There must be quite a bit of demand for this service for them to move from a consumer-only model to enterprise support, as it would mean building up a support organization country-wide and maybe worldwide.

I'm a cloud developer, so I do know a little about the cloud. I sit above a cloud data center and know how much money we spent to build it, along with what kinds of services and workloads there are. You could argue that you need even more security and control in the cloud, because it's homogeneous and someone breaking in has a lot more to gain with success.
So you're saying employers are no longer supplying their employees with computers and they have to provide their own? To be honest, I've never heard of that with computers; every company I know of provides their employees with the tools needed to do their job.

Yeah, even at McDonald's. The people that work there don't have to bring their own cooking equipment, electronic ordering systems, and headsets to talk to people in the drive-through. This guy has never worked at Google, Microsoft, Oracle, Facebook, Adobe, Mozilla, IBM, etc.
 
So you're saying employers are no longer supplying their employees with computers and they have to provide their own?

Yup - it's been happening for some years now. They might provide some allowance for tech, but the bring-your-own-device trend is becoming far more common. All you need to run is a browser and maybe download their thin client software of choice; then everything else is hosted and executed on servers.

Since Apple killed BlackBerry, companies these days hardly give out phones anymore either... again it's BYOD, and all you get is a discount that the company might negotiate with your carrier. If you're out a lot (e.g. sales) they'll usually give a phone allowance, but it's still BYOD most times.
 
I'll be honest: at one point in history, I too used to think and say publicly that ___ laptops are not worth the money compared to others. I believe otherwise now.

The angle I'm coming from is not a part-for-part, chip-for-chip comparison. I'm looking at things from the most basic and most practical angle, which is... it's your money. You can't really put a price tag on anything that makes you happy. Some people are just fine spending $500-1000 more for a MBP than a PC; it's not for anyone to tell someone how to spend their money. You buy what you want, let others buy what they want.

Heat will always be an issue with any laptop based around a slim design. It's not exclusive to Macs. My Razer Blade 15 is cooler than a comparable MBP in terms of temps at max load, but it's not like it's ice cold to the touch when gaming.

The downtrend in MBP sales in general works like this: nobody wants to invest in a product where, under normal use, the keyboard may fail. I'm not sure if Apple realizes how much of a showstopper this really is. When a $600 Lenovo has a much better typing experience and a significantly more reliable keyboard, you have serious problems.

To sum things up, this is really about Apple's internal battle between features and benefits. Features are not necessarily benefits. The focus should be on benefits to the user first, and the features to prioritize are increased reliability, then productivity.
 
I'm currently doing work for a large bank, and they're encouraging BYOD wherever possible; you then just connect over VPN (using their Wi-Fi) to a virtual desktop, meaning they don't have to support multiple different types of hardware and builds.

They actively got rid of laptops from their workforce about 3 years ago.

There are still lots of desks with traditional thin clients available, but more and more people are taking their own portable devices in and just using those.
 
Yup - it's been happening for some years now. They might provide some allowance for tech, but the bring-your-own-device trend is becoming far more common. All you need to run is a browser and maybe download their thin client software of choice; then everything else is hosted and executed on servers.

Since Apple killed BlackBerry, companies these days hardly give out phones anymore either... again it's BYOD, and all you get is a discount that the company might negotiate with your carrier. If you're out a lot (e.g. sales) they'll usually give a phone allowance, but it's still BYOD most times.

That explains why so many companies get hacked.

These CIOs should be fired.
I'm currently doing work for a large bank, and they're encouraging BYOD wherever possible; you then just connect over VPN (using their Wi-Fi) to a virtual desktop, meaning they don't have to support multiple different types of hardware and builds.

They actively got rid of laptops from their workforce about 3 years ago.

There are still lots of desks with traditional thin clients available, but more and more people are taking their own portable devices in and just using those.

We provide phones for some workers. It allows them to keep personal and work life separate. One important aspect of the company phones is remote wipe. We're paranoid about security here (employees have to take regular computing and networking security courses) but that's a good thing.
 
I think there are misconceptions here. Most companies who have BYOD for desktops are only allowing those devices to be used as thin clients/dumb terminals, i.e. you actually connect to a remote desktop hosted on Citrix or whatever in a data centre somewhere, or in the cloud. The data itself is still secured in a managed desktop environment.

What is also happening is that people are being allowed to use their own mobile phones, moving away from an MDM scenario, where a company installs a heavyweight, completely controlling device profile to ensure you don't leak data everywhere, to a lighter MAM or app-based approach (this requires the apps in question to be MAM-aware). For example, you can access company data, but only in, say, the core Microsoft apps of Outlook, Word, OneDrive and so on, and they have been configured via a policy to not allow that data outside of the secure containers in their apps. BlackBerry Work and related apps work in a similar way. This approach is only really applicable to phones/tablets, though, as they have the secure app ecosystem that allows MAM. It doesn't really work with desktop OSes as far as I am aware - hence the first scenario.

Where you have moved to cloud services like Office 365, it is possible to lock those down too, to prevent data leakage and to control access by particular devices, policies, networks, etc.
 
That explains why so many companies get hacked.

These CIOs should be fired.

I've been bringing my own device to work for almost 20 years. I have always supplied better hardware for my work than the company is willing to invest in. There is nothing worse than being excited about starting a new job and getting a piece of **** to work on.

As long as the company has policies and you follow basic security protocols you are no more vulnerable than if the company supplied the device.
 
I think there are misconceptions here. Most companies who have BYOD for desktops are only allowing those devices to be used as thin clients/dumb terminals, i.e. you actually connect to a remote desktop hosted on Citrix or whatever in a data centre somewhere, or in the cloud. The data itself is still secured in a managed desktop environment.

What is also happening is that people are being allowed to use their own mobile phones, moving away from an MDM scenario, where a company installs a heavyweight, completely controlling device profile to ensure you don't leak data everywhere, to a lighter MAM or app-based approach (this requires the apps in question to be MAM-aware). For example, you can access company data, but only in, say, the core Microsoft apps of Outlook, Word, OneDrive and so on, and they have been configured via a policy to not allow that data outside of the secure containers in their apps. BlackBerry Work and related apps work in a similar way. This approach is only really applicable to phones/tablets, though, as they have the secure app ecosystem that allows MAM. It doesn't really work with desktop OSes as far as I am aware - hence the first scenario.

Where you have moved to cloud services like Office 365, it is possible to lock those down too, to prevent data leakage and to control access by particular devices, policies, networks, etc.

How do you protect data on a device where someone can capture screen contents?

We have intrusions into infrastructure right now. My question: why don't you just remove all network access to nuclear power plants, water control stations, and electrical control systems? Some of this stuff is running software based on MS-DOS and old versions of Windows NT. Do you want network access to these old systems?

We're not allowed to use software like Citrix or set up any kind of internal servers. As I said, we're paranoid about security. And that's a good thing.
I've been bringing my own device to work for almost 20 years. I have always supplied better hardware for my work than the company is willing to invest in. There is nothing worse than being excited about starting a new job and getting a piece of **** to work on.

As long as the company has policies and you follow basic security protocols you are no more vulnerable than if the company supplied the device.

I'm the same way. I've used my own equipment for decades. But my company requires what most would consider intrusive access to and control of my personal devices. Given the security context of our work, I think that's entirely reasonable.

My point is that most companies aren't getting the job done on security.
 
Going back on topic, are Macs really slower than PCs? Well, it depends, and again it's not always easy to do a true one-to-one comparison under identical conditions.

Benchmarks. I have a love/hate relationship with benchmarking software. I don't know of anyone that buys or builds a machine just to be the benchmarking king/queen as their top priority. I've seen situations where a benchmark for content creation (video compiling) looked a little low, but the time to complete in the actual application showed otherwise.
Benchmarking was built around the goal of giving the user a general idea of how the machine would perform in a given situation. It was never meant to be the final word on actual performance.

Temps. You can control the fans on both Macs and PCs if the native control isn't sufficient. PC users sometimes use undervolting to help control temps in order to achieve less (or zero) throttling.

Software and hardware optimization. This one matters, and it's not always easy to compare PCs to Macs. Adobe's software was for a while better optimized for PCs, but when they started to improve their Mac-based products, that gap narrowed. Even if a PC was 2-3% faster, a lot of people would say they don't mind, simply because that margin doesn't justify giving up macOS.

Linus is a performance-oriented YouTuber. He's the kind of guy that has no problem shelling out over $1000 to RAID-stripe a couple of NVMe drives to game on. It's fun and cool to watch, but in the end it really has no practical value other than bragging rights. Take his videos with a grain of salt. I'm a PC user myself (and a Mac user), and even I don't find a lot of value in most of his videos.
 
How do you protect data on a device where someone can capture screen contents?

We have intrusions into infrastructure right now. My question: why don't you just remove all network access to nuclear power plants, water control stations, and electrical control systems? Some of this stuff is running software based on MS-DOS and old versions of Windows NT. Do you want network access to these, old systems?

We're not allowed to use software like Citrix or setup any kind of internal servers. As I said, we're paranoid about security. And that's a good thing.

Both Android and iOS allow you to disable the taking of screenshots with app policies. You obviously can't prevent someone from taking pictures of the monitor or screen with a camera, though.
 
Going back on topic, are Macs really slower than PCs? Well, it depends, and again it's not always easy to do a true one-to-one comparison under identical conditions.

Benchmarks. I have a love/hate relationship with benchmarking software. I don't know of anyone that buys or builds a machine just to be the benchmarking king/queen as their top priority. I've seen situations where a benchmark for content creation (video compiling) looked a little low, but the time to complete in the actual application showed otherwise.
Benchmarking was built around the goal of giving the user a general idea of how the machine would perform in a given situation. It was never meant to be the final word on actual performance.

Temps. You can control the fans on both Macs and PCs if the native control isn't sufficient. PC users sometimes use undervolting to help control temps in order to achieve less (or zero) throttling.

Software and hardware optimization. This one matters, and it's not always easy to compare PCs to Macs. Adobe's software was for a while better optimized for PCs, but when they started to improve their Mac-based products, that gap narrowed. Even if a PC was 2-3% faster, a lot of people would say they don't mind, simply because that margin doesn't justify giving up macOS.

Linus is a performance-oriented YouTuber. He's the kind of guy that has no problem shelling out over $1000 to RAID-stripe a couple of NVMe drives to game on. It's fun and cool to watch, but in the end it really has no practical value other than bragging rights. Take his videos with a grain of salt. I'm a PC user myself (and a Mac user), and even I don't find a lot of value in most of his videos.

There used to be a particular JavaScript benchmark, and I used to participate in contests to get the best time. I had a Compaq r3300z laptop, and I was using Firefox to compete with other Firefox users and builders, along with users of other browsers, though there weren't many browsers at that time; I think Internet Explorer was the other major one. I was competing against desktop systems with far more horsepower. So I overclocked the laptop, but I also built my own custom Firefox browser and did assembler-based optimizations to improve performance on the specific JavaScript tests. And I was frequently on top of the heap even though I had a much slower system.

Software matters for performance. A lot.

Just taking advantage of hardware performance features can make a big difference. Intel and AMD added vector operations in the late 1990s and early 2000s and have added increasing functionality in this area through the years. These vector operations can greatly improve performance on certain kinds of workloads, but they used to require custom assembler or machine-code programming. Things are better today with packages from Intel for these specific workloads. I see more people (though the number is still tiny) who know how to use these vector instructions, and they are used in operating systems today. There are intrinsics to make SIMD programming easier as well, with automatic register allocation.
Both Android and iOS allow you to disable the taking of screenshots with app policies. You obviously can't prevent someone from taking pictures of the monitor or screen with a camera, though.

I would assume that iOS is secure in this area outside of a jailbroken device, but I wouldn't assume that someone couldn't modify Android to spy on what the user is doing. There is also the old hack of using a surreptitious video camera.
 
That explains why so many companies get hacked.

It's actually way more secure - nothing is stored or processed locally on your laptop. All you're getting is an encrypted screen dump of a remote desktop/workspace. Issuing laptops is a far bigger security risk. The old days of people trying to steal data via USB sticks and such are history too.
 
It's actually way more secure - nothing is stored or processed locally on your laptop. All you're getting is an encrypted screen dump of a remote desktop/workspace. Issuing laptops is a far bigger security risk. The old days of people trying to steal data via USB sticks and such are history too.

I would definitely disagree.

If you get into the server side, then there's far more that's exposed.

I don't agree that the thin-client model is more secure. I prefer the security-paranoid model.
 
Can you explain to us what you mean by this security-paranoid model?

- All employees are educated on computing and network security risks, including social engineering.
- All devices are secured, including remote wipe and surveillance.
- Internet access is disabled; if you need the internet, you connect to a separate network.
- WAN access is removed unless it's required.
- Physical security of servers.
- Server redundancy.
- The ability to get back up and running if one of your cloud sites takes a hit.
- The ability to move back and forth between cloud and on-premises, or hybrid.
- A security team that does random internal hacking attempts.
- Secure development training.
- More, but that's off the top of my head.
 
- All employees are educated on computing and network security risks, including social engineering.
- All devices are secured, including remote wipe and surveillance.
- Internet access is disabled; if you need the internet, you connect to a separate network.
- WAN access is removed unless it's required.
- Physical security of servers.
- Server redundancy.
- The ability to get back up and running if one of your cloud sites takes a hit.
- The ability to move back and forth between cloud and on-premises, or hybrid.
- A security team that does random internal hacking attempts.
- Secure development training.
- More, but that's off the top of my head.

So if we follow all those principles you outlined there - why have devices in the wild (company laptops with company data) when you can keep everything on secure servers where people can't get to it? Laptops are inherently less secure than the centralized model, because you have a lot more endpoints and gaps to secure.
 
So if we follow all those principles you outlined there - why have devices in the wild (company laptops with company data) when you can keep everything on secure servers where people can't get to it? Laptops are inherently less secure than the centralized model, because you have a lot more endpoints and gaps to secure.

You offer laptops because employees are hard to come by in this industry and you have to accommodate their wishes on work-location flexibility. We do have walled-off LANs, and you have to come into the office to use them or reboot them when they go down.
 
As an OSCE, I can't help but sigh when anyone uses the term "hack" incorrectly. I blame Hollywood and the media for this.

To get an idea of what we do in our company: it's normal and common for us to load up a machine (and other devices) with malware and exploit packages and deploy them within the infrastructure. If you're worried about what may happen, the lesson is that no amount of security measures applied to the client side of the infrastructure will be sufficient to protect the rest of the infrastructure.

It would be like focusing on making a car very secure while intending to drive it across large wooden bridges made mostly of termites holding hands.
 
Yup - it's been happening for some years now. They might provide some allowance for tech, but the bring-your-own-device trend is becoming far more common. All you need to run is a browser and maybe download their thin client software of choice; then everything else is hosted and executed on servers.

Since Apple killed BlackBerry, companies these days hardly give out phones anymore either... again it's BYOD, and all you get is a discount that the company might negotiate with your carrier. If you're out a lot (e.g. sales) they'll usually give a phone allowance, but it's still BYOD most times.
I can see phones, but not computers. One issue my company has is ensuring that computers are locked down and not at risk of malware. Using someone's personal computer opens the door to a number of security risks as I see it. I guess that's neither here nor there, as I work for an organization that is very conservative about allowing devices on its network.
 
As an OSCE, I can't help but sigh when anyone uses the term "hack" incorrectly. I blame Hollywood and the media for this.

To get an idea of what we do in our company: it's normal and common for us to load up a machine (and other devices) with malware and exploit packages and deploy them within the infrastructure. If you're worried about what may happen, the lesson is that no amount of security measures applied to the client side of the infrastructure will be sufficient to protect the rest of the infrastructure.

It would be like focusing on making a car very secure while intending to drive it across large wooden bridges made mostly of termites holding hands.

One really easy way to hack is to use really old-fashioned technology that has worked for thousands of years. Well, maybe two ways.
 
Hacking is based on the idea that someone takes something apart, whether it's physical or digital, in hopes of understanding how things work in order to facilitate and/or implement change.

A radio controlled car may have an issue where the batteries for the remote may lose contact periodically. A hack could mean that someone realized what was happening and implemented a change to mitigate/eliminate that issue.

For example, Bradley Manning didn't hack classified systems when he got caught sending stuff to WikiLeaks. He was able to copy content on a system he already had access to. It's what he did with the content that was the violation. That's not really a hack.

If someone were to gain access to your Instagram account, that's also not a hack, even if they found a way to circumvent existing security measures. That's what we call exploiting, meaning you used something, whether it's a software package/script or a process, to take advantage of a known vulnerability.
 
Hacking is based on the idea that someone takes something apart, whether it's physical or digital, in hopes of understanding how things work in order to facilitate and/or implement change.

A radio controlled car may have an issue where the batteries for the remote may lose contact periodically. A hack could mean that someone realized what was happening and implemented a change to mitigate/eliminate that issue.

For example, Bradley Manning didn't hack classified systems when he got caught sending stuff to WikiLeaks. He was able to copy content on a system he already had access to. It's what he did with the content that was the violation. That's not really a hack.

If someone were to gain access to your Instagram account, that's also not a hack, even if they found a way to circumvent existing security measures. That's what we call exploiting, meaning you used something, whether it's a software package/script or a process, to take advantage of a known vulnerability.

I thought that hacking was any way to break into a system, maintain access to it, or disable access to it, as in a DDoS.

I used to be a hacker - I broke into systems at a wide variety of places and I modified operating system software to maintain access. That was a long time ago though.
 
I thought that hacking was any way to break into a system, maintain access to it, or disable access to it, as in a DDoS.

I used to be a hacker - I broke into systems at a wide variety of places and I modified operating system software to maintain access. That was a long time ago though.

Not in a professional sense, no. A DDoS is simply referred to as that because it's a specific kind of attack. Not all DDoS attacks are the same, however.

If I were to back down a little and let some stuff slide, I would accept (to a point) that a hack could simply mean "unauthorized access", but if you think about it, it's kind of silly.

Here's an analogy. If your girlfriend is changing and in her underwear and you took a peek in her bedroom when the door was closed, that would constitute a hack because that door was closed for a reason.

I'm not doubting what you did in the past, but I hardly consider that hacking in either the professional or the social sense. It's not hard to take advantage of a situation where supervision is clearly lacking and security measures are insufficient to prevent people with some knowledge from gaining access to machines. This isn't a jab at you or anything, please don't misunderstand.

There's this idea that "hackers" are people sitting at a computer, typing or running apps in a terminal window, trying to crack something in order to gain arbitrary access. That may've been true a long time ago, but if someone said that today... I'd probably giggle unless I saw it first-hand.
I can see phones, but not computers. One issue my company has is ensuring that computers are locked down and not at risk of malware. Using someone's personal computer opens the door to a number of security risks as I see it. I guess that's neither here nor there, as I work for an organization that is very conservative about allowing devices on its network.

I guess it really is about what you're protecting and what you're trying to protect it from.

Phones may seem okay, but they do have the ability to record and share information. In a SCIF, it's a big no-no. In a Banana Republic retail store, probably okay.

A lot of companies and orgs rely too heavily on compartmentalization. When you take that too far, it provides very little security benefit, but it impairs the user's ability to use the system(s), making matters way too complex to work with, especially for simple tasks.

When I'm about to do an analysis on anything, I often start with an interview where we exchange questions and answers. It's important to be aware of what's critical, their intent with the systems, and how the systems are being used.

Using my Banana Republic example: securing cash at the registers, cameras for shoplifting and crime, how often the safe is used, how many people are on shift, the credit card machines, etc. all have to be looked at in order to form a plan of action for security improvements.

One of the biggest problems in almost every company is the notion that admins need full access to everything. I'll say this: unless it's your own personal property, there's never a time when any admin requires full access to everything.
 
Not in a professional sense, no. A DDoS is simply referred to as that because it's a specific kind of attack. Not all DDoS attacks are the same, however.

If I were to back down a little and let some stuff slide, I would accept (to a point) that a hack could simply mean "unauthorized access", but if you think about it, it's kind of silly.

Here's an analogy. If your girlfriend is changing and in her underwear and you took a peek in her bedroom when the door was closed, that would constitute a hack because that door was closed for a reason.

I'm not doubting what you did in the past, but I hardly consider that hacking in either the professional or the social sense. It's not hard to take advantage of a situation where supervision is clearly lacking and security measures are insufficient to prevent people with some knowledge from gaining access to machines. This isn't a jab at you or anything, please don't misunderstand.

There's this idea that "hackers" are people sitting at a computer, typing or running apps in a terminal window, trying to crack something in order to gain arbitrary access. That may've been true a long time ago, but if someone said that today... I'd probably giggle unless I saw it first-hand.

Most people don't have the knowledge to break into server operating systems, or can't read crash dumps, write machine code, or read assembler to understand what a piece of executable code is doing. I would guess that a few people do, and that they write tools to infiltrate systems.

As I wrote, there's an easier way to break into systems and it's thousands of years old.
 
I've been bringing my own device to work for almost 20 years. I have always supplied better hardware for my work than the company is willing to invest in. There is nothing worse than being excited about starting a new job and getting a piece of **** to work on.

As long as the company has policies and you follow basic security protocols you are no more vulnerable than if the company supplied the device.

All of my personal hardware would run circles around my work equipment, but I'm not buying my own device for work. I'd be willing to go down that road if work 100% paid for the device I picked, but they should pay for the tools that get the job done.

Shame companies still don't fully get it.
 