> unless, of course, you're not experiencing stutter on the mac you own (my imac, macbook, or previous macbook pro... for example).

Every Mac I've used had animation stutter to some degree (even my 15" 2018 one); they run at high resolutions with, at best, a mediocre GPU.

You won't really realize that almost every MacBook stutters in its animations until you try a Hackintosh with a genuinely high-end GPU, where everything is butter smooth.

You clearly didn't try a Hackintosh with a high-end GPU (or an iMac Pro with a Vega 64), so you won't even notice it.
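To put rough numbers behind the "high resolution, mediocre GPU" point, here's a back-of-the-envelope sketch. The resolutions are the standard panel specs; the 60 Hz rate and the particular panels listed are just my illustrative choices:

```swift
import Foundation

// Back-of-the-envelope: pixels the GPU must composite per second
// to keep UI animations at full rate on a few common panels.
struct Panel {
    let name: String
    let width: Int
    let height: Int
}

let panels = [
    Panel(name: "1080p external display",  width: 1920, height: 1080),
    Panel(name: "15\" MacBook Pro Retina", width: 2880, height: 1800),
    Panel(name: "5K iMac / iMac Pro",      width: 5120, height: 2880),
]

let refreshHz = 60.0 // the common refresh rate

for p in panels {
    let megapixels = Double(p.width * p.height) / 1_000_000
    print(String(format: "%@: %.1f MP per frame, %.0f MP/s at 60 Hz",
                 p.name, megapixels, megapixels * refreshHz))
}
```

The 5K panel works out to roughly 7x the pixel load of a 1080p display, so the same mid-range GPU that coasts at 1080p has far less headroom per frame on a Retina desktop.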
> You clearly didn't try a Hackintosh with a high-end GPU (or an iMac Pro with a Vega 64), so you won't even notice it.

I have no need (or desire) to do so; I'm having a great experience on my iMac & MacBook. It's a non-issue...
It's like 30 fps vs. 60 fps, back when most people claimed there was no difference (and that the human eye can't see beyond 30 fps, and all that ****): you won't believe it until you experience it yourself (or compare side by side).
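The arithmetic behind that comparison is worth spelling out; nothing below is measured, it's just the reciprocal of the frame rate:

```swift
import Foundation

// Frame budget at each rate: interval = 1000 ms / fps.
// A single dropped frame at 30 fps holds the same image on screen for ~67 ms,
// a hitch long enough that most viewers notice it.
for fps in [30.0, 60.0, 120.0] {
    let frameMs = 1000.0 / fps
    print(String(format: "%3.0f fps -> %4.1f ms per frame (one dropped frame = %4.1f ms)",
                 fps, frameMs, 2 * frameMs))
}
// e.g. 30 fps -> 33.3 ms per frame; 60 fps -> 16.7 ms per frame
```

Per-frame motion also doubles: anything sliding across the screen moves twice as far between 30 fps frames as between 60 fps frames, which is exactly what reads as judder.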
> Funny funny, I still don't get this 60/144/240 fps thing; it's nonsense, proven repeatedly by double-blind A/B/C/D testing to be a placebo effect. Not even professional gamers can pick out the frame rate of a monitor past 30 fps in play testing. You can ramp up to 600 or 1000 fps and it is still impossible for the eye, and especially the brain, to process or extract image quality, density, or rate of motion beyond the 30 fps frame. Multiplying the rate benefits the system in some cases through spec-sheet overhead capacity, though in real-world scenarios it really only lessens the burden, because of the current fashion of trying to overdrive hardware past its actual performance spec; the only indicator it's actually happening is benchmarks and on-screen displays rattling off an fps counter.
>
> Otherwise, it's a method to extract cash from your credit card and spend your life over at the Geekbench site with people who need validation that they spent ridiculous money on the best card, which the next person after them bought too. It's a marketing tactic, and a little crazy to bring an industry's (Windows gaming) marketing campaign into any discussion of what performance is on a Mac. People are having a fit over the Mac Pro starting at $6K. They should have had a fit about the RTX 2080 starting at $1,200 (much higher in reality) for a marginal speed bump and some unsupported, performance-sapping marketing gimmicks (RTX, which is just a DXR extension with a couple of dedicated compute units given to it; funny how they don't tout how many, and it will drop the oh-so-holy fps by a third to half). You want to see performance improvements? Check out the ray-tracing panel at WWDC on the dev page, where it was easily leveraging at least half, possibly all, of the CUs to produce the results shown.
>
> Your Hackintosh isn't butter smooth; it's a mashup of parts in a Mel Brooks Young Frankenstein configuration, some overperforming, some underperforming, all of them mismatched and running from a really iffy and insecure EFI boot dongle. Your "buttery smooth" is probably because the monitor you're using is over-processing its output. I doubt it's even the Vega 64 helping, because you are going to be hard pressed to make that Vega do more than system defaults/limits are going to allow.

I've never mentioned anything regarding 144/240 Hz. The difference between 60 and 120/144/240 is controversial, but between 30 and 60? C'mon, man!
> Funny funny, I still don't get this 60/144/240 fps thing; it's nonsense, proven repeatedly by double-blind A/B/C/D testing to be a placebo effect. Not even professional gamers can pick out the frame rate of a monitor past 30 fps in play testing.

WTF are you talking about? It's very easy to notice the frame-rate difference between 30 fps and 60+. Ridiculously easy.
> WTF are you talking about? It's very easy to notice the frame-rate difference between 30 fps and 60+. Ridiculously easy.

Already covered above. "Wow" is exactly the right word for the arguments about it, a lot of the time.
At least in 3D games (not VR). I can't comment on 144/240 fps, but 30 vs. 60+? It's day and night.

I'm not saying games have to be 60+, but to claim (a BS claim, btw) that gamers can't tell the difference... just wow.
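For anyone who'd rather measure than argue, here's a minimal sketch of a stutter probe for macOS. It uses CVDisplayLink to timestamp each display refresh and flags intervals that overrun the frame budget; the 60 Hz assumption and the 1.5x hitch threshold are my own illustrative choices, not anything Apple prescribes:

```swift
import CoreVideo
import Foundation

// Minimal stutter probe: log any refresh interval that overruns the
// nominal frame budget by more than 50%. Run as a macOS command-line tool.
var linkOut: CVDisplayLink?
CVDisplayLinkCreateWithActiveCGDisplays(&linkOut)
guard let link = linkOut else { fatalError("could not create display link") }

// Convert mach host-time ticks to nanoseconds.
var timebase = mach_timebase_info_data_t()
mach_timebase_info(&timebase)
let ticksToNs = Double(timebase.numer) / Double(timebase.denom)

let budgetMs = 1000.0 / 60.0 // assuming a 60 Hz panel
var lastHostTime: UInt64 = 0 // touched only from the display-link thread

CVDisplayLinkSetOutputHandler(link) { _, inNow, _, _, _ in
    let now = inNow.pointee.hostTime
    if lastHostTime != 0 {
        let deltaMs = Double(now - lastHostTime) * ticksToNs / 1_000_000
        if deltaMs > budgetMs * 1.5 {
            // This refresh arrived late: a frame missed its slot, i.e. a visible hitch.
            print(String(format: "hitch: %.1f ms (budget %.1f ms)", deltaMs, budgetMs))
        }
    }
    lastHostTime = now
    return kCVReturnSuccess
}

CVDisplayLinkStart(link)
RunLoop.main.run() // keep the process alive while the link fires
```

On a machine that really is butter smooth this should print essentially nothing; a GPU struggling to drive a 5K desktop will log hitches whenever the compositor misses its slot.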
> Great question. I feel macOS is 2nd priority for Apple. Their first priority is iOS. Considering macOS only accounts for < 10% of their total revenue, it makes sense that they don't allocate as many resources as the iOS counterpart gets. It wouldn't surprise me if the macOS team is only a fifth the size of the iOS team. It's a shame, because macOS far predates iOS.
>
> The only way to compel Apple to invest more in macOS is to pressure them with direct competition from Microsoft, just like how Intel was forced to add more cores to desktop and laptop CPUs as a direct result of competition from AMD. When Intel didn't have sizeable competition from AMD in the pre-2016 era, year-to-year performance gains were marginal at best. Ever since AMD introduced Ryzen with more cores, Intel was forced to respond, and that is when we started seeing decent incremental gains and more cores: 6 and 8 cores for the last 2 years, and we will be seeing 10 cores on mainstream desktops this year. We were stuck on quad-core CPUs for God knows how many years. Intel didn't have any incentive to work hard because they were basically dominating a monopoly market. So why bother?
>
> It's a different story now. Apple is facing increased competition in the laptop segment, and consumers have a lot more choices when it comes to metal-enclosure laptops. Apple still has an edge because they are the only company that sells laptops with macOS, which still allows them to charge more. But Microsoft is improving Windows 10 every 6 months, and a lot of die-hard macOS fans have already migrated over to it.
>
> I think the recent price cuts on Macs are proof that Apple is acknowledging the Windows competition and the fact that consumers have more choices than ever before. Competition is always good for consumers and bad for companies.

You're right on several points. Intel is facing pressure on the processor front, and that's going to end up in a pricing and competitive cores war on the horizon. (AMD is still about 2 generations behind in IPC right now, and Infinity Fabric isn't a substitute for on-chip interconnect, but it is challenging Intel on the low end. AMD still lags behind on high-end and professional application implementations, so Intel keeps the edge in that regard.) But it will be enough to force Intel to jump ahead again, as they did with their core strategy. My best bet would be they jump directly past Sunny Cove and go to their 64-bit, non-legacy-burdened designs somewhere in the next two years, if they can get their 7nm/5nm process under control.