Chrome? Edge?
UI wrappers around the exact same WebKit used by Safari, because Apple doesn’t allow third-party browser engines in the App Store, and they don’t allow mass distribution of non-App Store apps on iOS.

For years, third-party browser apps actually had to use a limited version of WebKit that lacked some performance improvements Apple had brought to Safari. Eventually, Apple relented and made the same version of WebKit that Safari uses available as a public API.
 
  • Like
Reactions: ErikGrim
You’re building pages that only work in Chrome, JUST like, long ago, folks built pages that ONLY worked in IE. Pages that are NOT built SPECIFICALLY for Chrome work fine in non-Chrome browsers.

Right now, there are websites that ONLY work properly in Chrome, just like back then there were websites that ONLY worked properly in IE. To my knowledge, there are no sites that work well in a WebKit renderer that DON’T work in a Chrome renderer.

Safari isn’t the new IE; Chrome is. However, those that develop websites want folks to think otherwise, so that, when they code a site SPECIFICALLY for Chrome and it doesn’t work in non-Chrome browsers, they can say “oh, Safari is the new IE”.
To begin with, I'd say this rather ignores that there are enough inconsistencies between the WebKit, Blink and Gecko engines that indeed, if you design specifically for Safari, there is a good chance elements will be broken in Chrome and Firefox. But arguing the specifics ignores the flawed premise.

No one I know would code specifically for Chrome, just as we didn't code specifically for Firefox 15 years ago. We try to design for all browsers, but the one that gives us the best developer tools is naturally the first browser we test in. Arguably, that is currently Chrome though historically it has been Firefox (and still is in some people's opinions).

For Apple, providing a better environment for development wouldn't be just about the tools built into the browser (though those could seriously use an upgrade ...). They'd need to allow people who don't own Apple products to use the browser.

As I said before, as long as using Safari is tied to owning specific hardware, it's never going to be the targeted browser. You can't even legally virtualize macOS except on Apple hardware. Before bringing Edge to Linux and macOS, at least Microsoft offered free virtual machine packages for Web development. Services like BrowserStack are great for showing where there's a problem, but not terribly effective for diagnosing and fixing the problem. You're never going to get the Linux fans (of which there are many in the Web developer community), or those in corporate environments standardized on Windows, to develop on Safari when Apple doesn't allow them to run Safari.
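To make the cross-browser testing point concrete, the standard alternative to targeting any one browser is feature detection: ask the runtime whether a capability exists instead of sniffing the browser's name. A minimal sketch (the `hasFeature` helper and the stub environment are my own illustration, not code from the thread):

```javascript
// Feature detection: ask the environment whether a capability exists,
// instead of guessing from the browser's name or version.
// `env` is any object shaped like a browser global (e.g. `window`).
function hasFeature(env, path) {
  // Walk a dotted path like "navigator.clipboard" and report
  // whether every step along the way is present.
  return path.split('.').every(key => {
    if (env == null || !(key in Object(env))) return false;
    env = env[key];
    return true;
  });
}

// Usage with a stubbed environment (in a real page, pass `window`):
const fakeWindow = { navigator: { clipboard: {} } };
console.log(hasFeature(fakeWindow, 'navigator.clipboard')); // true
console.log(hasFeature(fakeWindow, 'navigator.share'));     // false
```

The same check works identically in Safari, Chrome, and Firefox, which is the whole point: the page adapts to what the browser supports rather than to which browser it is.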
 
I suppose it is true that if devs coded specifically for Safari
It’s not required that anyone code specifically for Safari, just that no one code to any specific browser. But web devs want to code specifically for Chrome, just like those that wanted to code specifically for IE, making Chrome-specific developers the new IE-specific developers. Can’t stop people doing what they want to do!
 
there is a good chance elements will be broken in Chrome and Firefox. But arguing the specifics ignores the flawed premise.
I’m guessing that it’s not a “good chance,” as I haven’t seen any sites that work in a WebKit renderer fail in a Chrome renderer. However, as indicated in this thread, there are many examples of sites that only work in Chrome.

No one I know would code specifically for Chrome, just as we didn't code specifically for Firefox 15 years ago. We try to design for all browsers, but the one that gives us the best developer tools is naturally the first browser we test in.
“No one would code specifically for Chrome, but, you know, the developer tools are best in Chrome, so… yeah, we code specifically for Chrome.” :)

And just because developers today have slightly different reasons for doing the same things IE developers did years ago doesn’t make Chrome any less the new IE.
 
It’s not required that anyone code specifically for Safari, just that no one code to any specific browser. But web devs want to code specifically for Chrome, just like those that wanted to code specifically for IE, making Chrome-specific developers the new IE-specific developers. Can’t stop people doing what they want to do!

Developers code to web standards. They aren't coding specifically for Chrome. They are using standards that are commonly supported by browsers. In the example I gave, the standard was released in January of 2017. Safari support came at the end of 2021. When was it OK for developers to use WebGL 2.0? In January of 2017, when Chrome and Firefox started supporting it? In October 2021, so Safari users could benefit? At some future date when every browser in the world has WebGL 2.0 support?

Why not hold Safari developers responsible for keeping their product up to date, rather than demanding websites support out-of-date browsers?
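The WebGL 2.0 gap described above is usually handled in practice with a runtime fallback rather than a hard requirement, so a page can still degrade gracefully on browsers whose support lags. A sketch of that common pattern (the function name and the stub canvas are illustrative assumptions; in a real page the canvas would come from the DOM):

```javascript
// Try WebGL 2 first, then fall back to WebGL 1, so the page degrades
// gracefully on browsers (like older Safari) that lag the standard.
function getBestGLContext(canvas) {
  const gl2 = canvas.getContext('webgl2');
  if (gl2) return { version: 2, gl: gl2 };
  const gl1 = canvas.getContext('webgl');
  if (gl1) return { version: 1, gl: gl1 };
  return null; // no WebGL at all: show a 2D fallback or a message
}

// Stubbed canvas standing in for a real <canvas> element that only
// supports WebGL 1 (as Safari did for a stretch of this timeline):
const webgl1Only = { getContext: name => (name === 'webgl' ? {} : null) };
console.log(getBestGLContext(webgl1Only).version); // 1
```

Whether a site can reasonably take this path depends on how central WebGL 2 features are to it, which is exactly the tension the posts above are arguing about.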
 
“No one would code specifically for Chrome, but, you know, the developer tools are best in Chrome, so… yeah, we code specifically for Chrome.” :)

And just because developers today have slightly different reasons for doing the same things IE developers did years ago doesn’t make Chrome any less the new IE.
What a clever retort. Whoever wrote "No one would code specifically for Chrome, but, you know, the developer tools are best in Chrome, so… yeah, we code specifically for Chrome" is surely going to be embarrassed.
 
Developers code to web standards. They aren't coding specifically for Chrome. They are using standards that are commonly supported by browsers. In the example I gave, the standard was released in January of 2017. Safari support came at the end of 2021. When was it OK for developers to use WebGL 2.0? In January of 2017, when Chrome and Firefox started supporting it? In October 2021, so Safari users could benefit? At some future date when every browser in the world has WebGL 2.0 support?
It was OK for them as soon as the rendering engine they wanted to code for supported it. In this, and many, cases, that means Chrome. And they KNOW that, when they code for Chrome, it’s not going to work in Safari. Nothing’s wrong with coding to Chrome if that’s what they want to do; they’re just using the IE playbook. Which, also, is fine. We can rest assured that Safari will NEVER be in a position where developers are coding ONLY for it.

Why not hold Safari developers responsible for keeping their product up to date, rather than demanding websites support out-of-date browsers?
In looking up how one would enable WebGL on Windows, I read that the ability to turn WebGL on/off is not included in the standard Settings interface of Google Chrome. This sounded strange to me for something that should have been available in 2017, right? Looking further, as of October last year, it was under the ‘Experiments’ interface of the Chrome browser. Has it been made non-experimental in Chrome yet? Because if it’s still experimental, then it still hasn’t been “released”.
 
For testing older versions of Safari, or any other browser, have you tried BrowserStack? Not as convenient as the real thing for development of course. We use Browserstack for our automated testing too.
We do use BrowserStack, especially because of Safari. But as you mentioned, hunting a layout bug on an old Safari is a real burden here too. Also, the WebDriver for Safari has its own quirks.
 
Weird. Safari is not the new IE, it's Chrome. The problem with IE was that due to its majority of browser market share, it dictated site compatibility against the actual web standard. That's what Chrome has done as well, with sites checking compatibility with Chrome instead of the actual web standard. The fact that some sites only work correctly with Chrome is the obvious sign.
We have one solution for this:
They should ship Safari for Windows/Linux.

As web developers, we just want things to work, to bring the idea to the customer. Don't get me wrong, I also think Chrome-only pages are the wrong way; they stifle innovation in an indirect way.
But I definitely would not say that Chrome is the new IE. Back then it was a real battle with closed-source software and big interests; now we have open-source fundamentals and everybody can complain about the source.
 
We have one solution for this:
They should ship Safari for Windows/Linux.

As web developers, we just want things to work, to bring the idea to the customer. Don't get me wrong, I also think Chrome-only pages are the wrong way; they stifle innovation in an indirect way.
But I definitely would not say that Chrome is the new IE. Back then it was a real battle with closed-source software and big interests; now we have open-source fundamentals and everybody can complain about the source.
Yeah, I have always wondered why Apple stopped releasing Safari for Windows.
 
I wouldn’t exactly hold up Python as a shining beacon of semver adherence.
Well, no, Python doesn’t use SemVer; I’m just using a real-world example of major breaking changes that caused a lot of compatibility pain. I’m only arguing that, had Python used Chrome-style versioning, it would be even worse (i.e., is version 25 a major breaking change, or 27?).
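To make the versioning comparison concrete: under SemVer, the major component alone is supposed to signal a breaking change, while a single ever-incrementing Chrome-style number carries no such signal. A toy illustration (my own sketch, not anyone's actual release policy):

```javascript
// SemVer: an upgrade is breaking exactly when the major number changes.
function isBreakingSemver(from, to) {
  const major = v => parseInt(v.split('.')[0], 10);
  return major(to) > major(from);
}

console.log(isBreakingSemver('2.7.18', '3.0.0')); // true  (Python 2 -> 3 style jump)
console.log(isBreakingSemver('3.9.0', '3.10.0')); // false (minor release)

// Chrome-style single numbers: 96 -> 97 looks exactly like 95 -> 96,
// so the version number alone can't tell you whether anything broke.
```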
 
It was OK for them as soon as the rendering engine they wanted to code for supported it. In this, and many, cases, that means Chrome. And they KNOW that, when they code for Chrome, it’s not going to work in Safari. Nothing’s wrong with coding to Chrome if that’s what they want to do; they’re just using the IE playbook. Which, also, is fine. We can rest assured that Safari will NEVER be in a position where developers are coding ONLY for it.


In looking up how one would enable WebGL on Windows, I read that the ability to turn WebGL on/off is not included in the standard Settings interface of Google Chrome. This sounded strange to me for something that should have been available in 2017, right? Looking further, as of October last year, it was under the ‘Experiments’ interface of the Chrome browser. Has it been made non-experimental in Chrome yet? Because if it’s still experimental, then it still hasn’t been “released”.
I don’t think WebGL gets much use in general; it largely gets grouped under the “experimental features” toggles in every browser. Reminds me of my college graphics class back in 2017: one of the units actually did cover WebGL. For what it’s worth, I was able to run it on my iPad in Safari. I’m not sure what WebGL 2.0 adds that the older version Apple did support (in an experimental fashion) lacked. Also, I’ve yet to find a good use of WebGL outside of source ports of games like Quake to the web, so I find it hard to believe that it’s a technology a lot of web devs are eager to use. (Never mind my animosity towards 3D graphics programming in general; that graphics class was kinda rough!)
 
  • Like
Reactions: Unregistered 4U
It was OK for them as soon as the rendering engine they wanted to code for supported it. In this, and many, cases, that means Chrome. And they KNOW that, when they code for Chrome, it’s not going to work in Safari. Nothing’s wrong with coding to Chrome if that’s what they want to do; they’re just using the IE playbook. Which, also, is fine. We can rest assured that Safari will NEVER be in a position where developers are coding ONLY for it.


In looking up how one would enable WebGL on Windows, I read that the ability to turn WebGL on/off is not included in the standard Settings interface of Google Chrome. This sounded strange to me for something that should have been available in 2017, right? Looking further, as of October last year, it was under the ‘Experiments’ interface of the Chrome browser. Has it been made non-experimental in Chrome yet? Because if it’s still experimental, then it still hasn’t been “released”.

Web developers code to standards, not to browsers. Browser devs code to standards, not websites. Browsers support thousands of standards. Web browsers are not like word processors, with proprietary formats. Browsers render code built to agreed standards.

WebGL has been a standard since 2011. Apple is a member of the group that develops it, and has been from the beginning. Safari, Chrome, and Firefox supported WebGL until WebGL 2.0 was released in January 2017. At that time, Chrome and Firefox released versions that supported the new WebGL standard. Safari dropped WebGL support entirely. In mid-2020, Safari started supporting WebGL again as an experimental feature. In September 2021, Safari returned to fully supporting WebGL. "The ability to turn WebGL on/off is not included in the standard Settings interface of Google Chrome" because Chrome just renders WebGL content. Do you really want thousands of toggles in your browser settings, so you can choose which web standards are rendered?

Chrome is like the old IE in that it is the dominant browser. But there is nothing inherently wrong with being a market leader. People seek out Chrome. It isn't the native browser on Windows, Macs, iPhones, iPads, Samsung mobile devices, and probably a lot more devices. But people intentionally install it. That is just the market speaking.

Safari is like the old IE in that it is the most poorly supported and developed of the major browsers. Safari is like the old IE in that its market share is almost entirely from people using the default browser that came with their Apple device. When you put those two things together, you have a degraded experience for those users.
 
I don’t think WebGL gets much use in general; it largely gets grouped under the “experimental features” toggles in every browser. Reminds me of my college graphics class back in 2017: one of the units actually did cover WebGL. For what it’s worth, I was able to run it on my iPad in Safari. I’m not sure what WebGL 2.0 adds that the older version Apple did support (in an experimental fashion) lacked. Also, I’ve yet to find a good use of WebGL outside of source ports of games like Quake to the web, so I find it hard to believe that it’s a technology a lot of web devs are eager to use. (Never mind my animosity towards 3D graphics programming in general; that graphics class was kinda rough!)
WebGL isn't under "experimental features" in Chrome. To disable it, you need to launch Chrome with "chrome.exe --disable-webgl".

FWIW, my group was using Foundry VTT when we were struggling with Safari support of WebGL. It is a common virtual table top system for role playing games (like D&D). I'm not saying that WebGL is some singularly important web standard. It was just one example of a standard where Safari support lagged the competition.

To be clear, as of October of last year, Safari began full support of WebGL again (including 2.0). From mid-2020 until then, it kind of worked as an experimental feature.
 
I'll tell you what I DO want Safari to do; If I'm on clientsite.com and click on a link - be it Citrix, Zoom, Teams, etc, I'd like Safari to ask, "Do you want to open this one time or always for this website?" so that when I go back into a client's domain every few hours, I don't have to take the extra step to say "Yes, open this link in Citrix/Zoom/etc" each time.
 
But I definitely would not say that Chrome is the new IE. Back then it was a real battle with closed-source software and big interests; now we have open-source fundamentals and everybody can complain about the source.
It’s likely for different reasons, but, just like with IE, folks code just for it and aren’t concerned about incompatibility. And again, just like there were IE-only sites back then, there are Chrome-only sites now. There are no “Safari-only” sites.
 
Chrome just renders WebGL content.
Chrome for Windows has WebGL 2.0 enabled by default? That’s what I couldn’t confirm. (I don’t have a Windows device.)
Safari is like the old IE in that it is the most poorly supported and developed of the major browsers. Safari is like the old IE in that its market share is almost entirely from people using the default browser that came with their Apple device. When you put those two things together, you have a degraded experience for those users.
Those two things PLUS web developers coding specifically for Chrome. Remove that, and the experience is fine (just like the experience of all the Chrome users on sites that render fine in Safari).
 
I really like the current macOS Safari, but my main issue with it is that the extension API is too restrictive. uBlock Origin hasn’t worked on Safari in years as a result and alternatives like AdGuard work nowhere near as well (the current version breaks YouTube completely on my 14” MBP). As such, I end up using Firefox whenever I want to watch a YouTube video.
 
  • Love
Reactions: turbineseaplane
Chrome for Windows has WebGL 2.0 enabled by default? That’s what I couldn’t confirm. (I don’t have a Windows device.)

Those two things PLUS web developers coding specifically for Chrome. Remove that, and the experience is fine (just like the experience of all the Chrome users on sites that render fine in Safari).

WebGL is not some special feature. It is just a web standard that browsers should be able to interpret. Support is baked into the browser.

Maybe you could be more specific about what you mean by "coding specifically for Chrome"? They code to standards like HTML5 and CSS3. Each of those standards has a large number of standards contained within. For example, desktop Chrome supports 518 HTML standards. Desktop Safari supports 471, the worst of all the major browsers.

So what would it mean to not code specifically for Chrome? Should web devs only use standards that every browser in existence supports? Do browser devs have no responsibility for their product? If Safari can't properly read the web that exists now, the web that is defined by international standards, how is that not a problem with Safari?

If I create a browser that doesn't support, say, HTML5 video, then my browser would not be able to display most video on the web, including YouTube. Who would be at fault? Me, for creating a browser that doesn't meet current standards? Or all the web developers who failed to make websites that supported HTML5 instead of TheSapientVideoPlayerSuperGood?
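The HTML5-video hypothetical above maps onto a real standard API: pages typically ask the media element which source types it can play via `canPlayType()`, rather than assuming a particular browser. A small sketch (the helper function and the stub element are my own illustration):

```javascript
// Pick the first source the current browser reports it can play,
// using the standard HTMLMediaElement.canPlayType() API, which
// returns '', 'maybe', or 'probably' for a given MIME type.
function pickPlayableSource(videoEl, sources) {
  return sources.find(s => videoEl.canPlayType(s.type) !== '') || null;
}

// Stub standing in for document.createElement('video') in a browser
// that only handles MP4:
const stub = { canPlayType: t => (t === 'video/mp4' ? 'probably' : '') };
const chosen = pickPlayableSource(stub, [
  { src: 'clip.webm', type: 'video/webm' },
  { src: 'clip.mp4', type: 'video/mp4' },
]);
console.log(chosen.src); // clip.mp4
```

This is the same logic the `<video>` element applies to multiple `<source>` children, which is why a standards-supporting browser gets video without any browser-specific code on the site's side.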
 
Maybe you could be more specific about what you mean by "coding specifically for Chrome"?
Developers want their jobs to be as easy as possible, and coding ONLY for Chrome helps them in that. So the fact that Chrome is the new IE is not a bad thing; it’s the natural result of taking the easy path, like the developers that ONLY supported IE. And as long as there are sites that ONLY work in Chrome and NO sites that ONLY work in WebKit, that will be the case.
 
I've preferred Safari for the longest time, but I think I've about had enough. Sure, initial loading of pages is plenty fast in Safari, but on sites where you have the same page open for a long time, like Gmail and Monday.com, the performance degrades at a constant rate.

On iOS it's the stupid "download our app" banner that Apple generates. I've turned off "Safari Suggestions", which seems to get turned back on on its own every so often, and the banners STILL. KEEP. APPEARING. ON. EVERY. SINGLE. PAGE. I. VISIT. ON. A. SITE.
If I dismiss a suggestion notification immediately more than 5 times, guess what: I'm never going to do what the suggestion is presenting.

Meanwhile, of all browsers, I've been using Edge for a couple weeks. Haven't had either of those issues at all.
 
  • Like
Reactions: turbineseaplane
It’s likely for different reasons, but, just like with IE, folks code just for it and aren’t concerned about incompatibility. And again, just like there were IE-only sites back then, there are Chrome-only sites now. There are no “Safari-only” sites.
Safari-only would also mean that everybody has to buy a Mac. And on the iPhone, it is already Safari/WebKit-only.
 