I never understood how this works. In the early 2000s I used to browse the internet and it looked pretty much the same as it does now; even YouTube was the same, except it probably topped out at 480p. How did it all become so resource-intensive?
I remember in the late '90s when people would try to shrink images from around 40 KB down to 10 KB so pages would load faster.
I've heard about this before, but I don't believe it. So you have access to a powerful machine, and you can do intensive 3D graphics and 4K video editing from a weaker computer like the MacBook as if you were running it locally? No slowdowns, no frame drops?
Actually, YES. With the speed of cable modems and LTE, it is very easy to edit files on a remote machine, even with precision. You aren't going to be able to use a stylus on a remote machine, but you can get in there and really work.
There will be some lag if you play back video from the remote machine on your local machine, though a faster connection could help alleviate that. As I did mention, if you are a videographer you will have a hard time. But I usually just put my machines to work and then leave them to do the task at hand.
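For what it's worth, here's roughly how I hand a long job to the remote box and walk away (just a rough sketch; the host name and the ffmpeg export command are made-up placeholders, swap in whatever your own setup uses):

```swift
// remote_job.swift — a rough sketch of handing a long task to a remote Mac.
// "studio-mac.local" and the ffmpeg command are placeholders for whatever
// host and export job you actually run; this just launches it over ssh
// and returns, so the remote machine keeps grinding after you walk away.
import Foundation

let job = Process()
job.executableURL = URL(fileURLWithPath: "/usr/bin/ssh")
job.arguments = [
    "studio-mac.local",                         // hypothetical remote machine
    "nohup ffmpeg -i raw_cut.mov -c:v libx264 final_cut.mp4 > export.log 2>&1 &"
]

do {
    try job.run()
    job.waitUntilExit()   // only waits for ssh to hand off the command, not for the export itself
    print("Export kicked off on the remote machine; check export.log later.")
} catch {
    print("Could not reach the remote machine: \(error)")
}
```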
Then I come back here to another 250 posts, UGH... haha
I guess you were living in a cave when the first-gen Retina MacBook Pro launched in 2012, even though it was still using a lot of the standard components that Apple used in the conventional MacBook Pro at the time.
- Faulty Retina displays were a big deal
- Performance issues because of the resources needed to drive that Retina display
- Yellowing of the display was also an issue.
Some of these issues even affected the redesigned iMac when it launched. Even the 2013 revisions of the MacBook Pro continued to experience problems. So my recommendation does have merit.
Last I checked, the MacBook's butterfly keyboard is not great; the reviews out there, and my own time playing with one, say as much.
- Here is the difference: the MacBook Air and MacBook both use x86 processors, which means all desktop macOS software is guaranteed to work, regardless of the model.
- The iPhone SE is a 4-inch smartphone; I don't see how that could be confused with a larger iPhone. It's about meeting market demand. A lot of people out there don't like or want 4.7-inch or 5.5-inch devices.
- iPad Air 2 and iPad Pro: I agree, but the fundamental point is that they are the same platform; neither uses some alternative architecture, they are both still iPads. They run the same iOS apps, with small distinctions such as Pencil support, True Tone, a better camera, and an improved display.
An ARM MacBook would be running an ARM-based version of macOS for which there exist no desktop apps other than those Apple bundles and releases through the Mac App Store. Considering how bad the Mac App Store is today just for x86, I bet it would be even worse under ARM. Sure, Apple could enable some switch to make iOS apps run out of the box in a windowed mode, but it would still not be x86 macOS that can run traditional desktop apps. The market has proven this with the Surface RT: it looks like Windows, but it can't run Windows software.
There is only one thing you seem not to have in your memory or recollection...
Apple went from chip to chip:
6502 (Apple //e, among other machines)
68000 (dropping support for 6502)
68020 (leaving 68000 unable to run newer software)
68030 (leaving the non-PMMU guys out in the cold)
601 (with emulation for 680xx, but leaving them in the dust with...)
604 (even leaving some 601 in their wake)
G3/G4/G5 (same thing: all previous chips unable to run the NEW wares)
x86 (with PPC emulation)
x64 (leaving the 32-bit guys out to dry)
but mainly
6502 to Motorola 680xx to PPC to Intel and now to Arm
Why? Experience in making CPU shifts.
Microsoft and most PC companies have stuck with one chip architecture the WHOLE time. Windows RT on ARM was the first time a PC company (Microsoft) even really tried to switch people over. Believe me, they wanted to, and they thought they had something cutting edge. But the plain and simple fact is they couldn't cut it...
Apple has proper libraries this time, with the OS X frameworks and compilers, which should make it fun and almost effortless. It's one of their greatest advantages, really.
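To give a rough idea of what that buys you (just an illustrative sketch, nothing Apple has announced for ARM Macs): the same Swift source compiles for different CPU targets, and the compiler's built-in architecture conditions cover the rare spots where the chip actually matters.

```swift
// arch_demo.swift — a minimal sketch of architecture-agnostic Swift.
// The same source builds for x86_64 or arm64; #if arch() handles the
// rare places where the target CPU makes a difference.
func currentArchitecture() -> String {
    #if arch(x86_64)
    return "x86_64"
    #elseif arch(arm64)
    return "arm64"
    #else
    return "something else"
    #endif
}

print("Running on \(currentArchitecture()), pointers are \(MemoryLayout<Int>.size * 8)-bit")
```

During the PPC-to-Intel switch, Xcode could package both slices into a single universal binary; presumably the same approach would carry over to any future architecture change.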
I mean think about this:
If a "new" chip came out and was like 10 times faster and 2 times cooler, wouldn't you want to be able to just SWITCH? and yeah you leave the other chips behind, but this is TECH you have to be able to go from Technology to the next Technology, even if it's a CPU.
It's really just like ADB to USB, SCSI to FireWire, FireWire to Thunderbolt, etc
Just way more encompassing and involved. But hopefully Apple still has some 60-year-old guys in the back rooms who can guide the younger programmers and engineers through this and pass on the knowledge.
It's what it takes to simply be the best, while all of the rest stay on the EOL x86_64 architecture.
The main reason, though, is that Apple wants developers to get paid through the App Stores: minimal piracy, making bootlegs a thing of the past for the new generations...
NOTE: I forgot to mention [making software that requires the latest and greatest video chips, leaving prior GPUs to cry and want more]. So they have no problem making newer versions of programs that older machines can't run... and only newer ones can...