Nolan, I'm not familiar with what you do or have done, but have you ever tried to make a website back in the day? Ever tried to support IE6? 7? 8? IE literally broke the web. CSS rules which worked in every other browser wouldn't work in IE. Entire webpages wouldn't format correctly, and you had to spend hours, even days, on workarounds. Comparing Safari to IE is just wrong. Safari doesn't break the web; it 'just' lacks support for some new JavaScript features, rather than breaking standards that had been in place for decades.
What rules that had been there for tens of years were broken by IE6? IE6 was the best browser available when it was released, and the web standards ecosystem was nothing like it is today.
I think the analogy is very clear. If you want to code like it's 2000, supporting IE6 is easy, because IE6 is a very good 2000 browser. If you want to code like it's 2010, supporting Safari is easy, because Safari is a very good 2010 browser. But in both cases, if you want to use any new features that have been standardized and are supported by other browsers, you're fucked.
The main difference I see is that IE6 didn't get any updates at all. Safari gets updates, but many new features are missing or broken.
EDIT: Another difference is that MS never blocked you from installing a better browser, but Apple does that on iOS. That policy is becoming increasingly ridiculous...
Absolutely the truth. There was a reason the only answer was tables for a very long time.
A major source of hatred for IE was the fact that the IE team shipped features ahead of the browser standards, a practice which is now standard operating procedure. Where would we be without Ajax?
"Another difference is that MS never blocked you from installing a better browser, but Apple does that on iOS. That policy is becoming increasingly ridiculous..."
Why has that not resulted in anti-competitive behavior lawsuits? MS got into a fair bit of trouble for similar things, and they didn't even stop you from installing an alternative.
Netscape used to be a paid product* before Microsoft made IE free. What Microsoft did was make a free competitor and install it on every machine running their OS.
So what Microsoft did that was illegal was use their monopoly power to destroy an existing player. That's how they were "anti-competitive".
While Apple not allowing any competing rendering engines on iOS is anti-competitive in the dictionary sense that it prevents competition from arising, it isn't the type of behavior targeted by anti-trust law.
(IANAL, but my understanding is that if Apple had removed Spotify, Pandora and other music apps from the App Store while releasing their streaming music service, that would be an anti-trust violation.)
The real issue is just whether Apple has a monopoly or not. IANAL, but as I understand it what's illegal is using a monopoly in one market to muscle into another market via bundling deals. Just doing bundling deals if you're not a monopoly is perfectly OK.
If Apple sold the operating system used on the vast majority of computers you could have a case, as did the government when 95% of PCs were shipped with Windows.
Even by the most generous definition of "computer", Apple holds 20% market share at most.
For the same reason, Samsung won't get sued for anti-competitive practices just because you can't switch the browser on their so-called smart TV.
They also provided a free graphical text editor, thereby destroying the market for commercial text editors. It boggles the mind that they escaped punishment for notepad.exe.
There were also the facts that IE was deeply integrated with the OS to the point where it was impossible to remove, and that MS had anti-competitive agreements with OEMs preventing them from bundling Netscape.
MS was deemed to have a monopoly on desktop software, Apple is not. You're allowed to hinder competition for software on your operating system, if the user can reasonably easily choose another operating system.
Apple has a monopoly of taste – a self-defining market that won't switch OSes even if that constrains their other choices. But they could, so the "market of Apple users" isn't a discrete market under the law.
It doesn't need to be. MS in the late 90's dominated, and you could not do work with someone using MS products unless you also used MS products. One person having an iOS device and another having an Android device does not cause any issues between the two, they are fully able to communicate by the nature of cell phones. I would bet that the lawsuits against MS would not go through if they were brought up today (and, of course, MS was still doing the same things), because it's actually possible to switch desktop operating systems without ending up unable to collaborate with your coworkers and family.
I can't tell you how many times I've had to help people who switched away from their iPhone and then saw eternal, intermittent issues with not receiving text messages because their friends' iPhones were convinced they should be sending iMessages and not SMS. The situation gets even worse with group messages.
Even tech-savvy users (read: me) sometimes get bitten by that. I wonder if it would make sense to auto-disable iMessage when a phone's SIM is removed? The problem probably isn't even on the radar, though (iPhone users never switch!).
Yeah, I know. In some cases this somehow still doesn't always work for some people. Even after you delete the old conversation, which also sometimes has to be done.
But the very idea that someone has to know this has to be done and disable iMessage is insane to me, and I suspect part of the reason it is the way it is is because Apple doesn't really mind what (to the average Joe) is a major annoyance when switching phones.
To be fair, matters were even worse before Apple released that tool. But this has been an ongoing issue for something like four years.
Maybe, maybe not. That is what the person I was responding to claimed though.
I don't believe it is reasonably easy to choose/install another operating system on an i device. I could be wrong.
>One person having an iOS device and another having an Android device does not cause any issues between the two, they are fully able to communicate by the nature of cell phones.
By the nature of cell phones? What is the nature of cell phones?
Ethernet existed in the 90's, and I personally set up networks using it that allowed Windows and Linux systems to communicate with each other.
So on one hand people say that Android has more market share; on the other hand, Apple is supposedly being anti-competitive with iOS and Safari. Is Apple preventing you from using another device? Is Apple using its position to put Chrome out of business?
This will ultimately hurt Apple. They're giving away market share among end users who want or need to use some browser other than Safari.
The web browser is arguably the single most important app on any phone. Or a computer. How many browsers do you currently have installed on yours?
I should be able to install any browser on iPhone, just like I can on my Macbook.
While I was still using an iPhone, that forced me to always reach for an Android phone when I wanted to read any page whose font size was wrong for me. Which is pretty often.
IMHO, there's just one good mobile browser, and it's Opera. Not the Mini version, but the proper mobile one.
The big feature? It can always reflow any div (paragraph) text to 100% screen width. Always the font size I want and never any horizontal scrolling.
I hope Apple will finally allow other browsers that are more than just embedded mobile Safaris, including full JavaScript JIT support, because that's what the modern web requires. Currently the only iOS app that can allocate executable pages is Safari.
Let's define it as apps that actually use a different browser engine, and aren't merely a reskin of the exact same engine used in Safari. In short, something that is actually capable of supporting new features that Safari doesn't.
This is not possible on iOS without a jailbreak, because Apple simply doesn't allow alternative browser engines to be used.
Why not? Apple also doesn't allow Flash, so the question is, why? Does Apple make money from Safari? Is there a technical reason that might preclude another browser engine?
My understanding was that they don't want to allow executing downloaded code that hasn't gone through the App Store review (such as JavaScript in web pages). I think they allow Lua for scripting in games, though.
So, basically, a 3rd party JavaScript engine is not allowed. I guess you could write your own browser engine that doesn't support JavaScript.
I guess Flash was not allowed for performance and security reasons (lots of vulnerabilities, right?)
>>> Why has that not resulted in anti competitive behavior lawsuits?
IANAL, but folks over on reddit explained it like this:
Microsoft was a software-only vendor, but Apple is both the hardware and the software vendor. US and EU law allows a hardware manufacturer full control over its own ecosystem, because of past industry behaviour and because of who the end consumer pays their money to.
Also ironic given that most people have adopted some of that "broken CSS box model" in their CSS.
box-sizing: border-box;
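For anyone who never hit it: the difference between the two box models is just whether padding and border count inside the declared width. A rough JavaScript sketch with illustrative numbers (`renderedWidth` is a made-up helper for illustration, not a browser API):

```javascript
// Rough sketch of the two box models; `renderedWidth` is a made-up
// illustrative helper, not a browser API.
function renderedWidth(cssWidth, padding, border, boxSizing) {
  if (boxSizing === "border-box") {
    // Old IE model, now opt-in everywhere: padding and border fit inside.
    return cssWidth;
  }
  // CSS1 "content-box" model: padding and border are added on the outside.
  return cssWidth + 2 * padding + 2 * border;
}

console.log(renderedWidth(200, 10, 2, "content-box")); // 224
console.log(renderedWidth(200, 10, 2, "border-box"));  // 200
```

The IE model is arguably the one designers actually wanted, which is why `border-box` is now the near-universal reset.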
Now there were many other problems, but I'm not sure people appreciate how such features eventually got standardized. Similar things happened around DOM serialization with APIs like innerHTML, which Netscape refused to adopt, pointing at the standards. Developers ended up adopting the idea and it was later standardized. XHR is another case. There are many more.
In the case of Safari, I can think of canvas, touch (love or hate it vs pointer events it was before any of the alternatives), DPI independence, &c.
It's some of these quirks that seem to show how newer web standards might be rushed through by increasingly aggressive vendor involvement. Apple hasn't changed all that much from when it first released Safari; it's only our expectations for the pace of new additions that have.
It's so easy to be cavalier about random popular facts. I would love for someone to go back and load up the alternative browsers available at the time on some vms and take stock of their featuresets. Factoring in popularity at the time I'm pretty sure not much would have changed.
No, you can't. You can install wrappers around Safari, but you can't install any app that uses the rendering engine of Chrome, Firefox, IE, or any other browser better than Safari.
Chrome for iOS is a wrapper for Safari? Are you sure? Safari and Chrome definitely render certain pages differently when both are running on the same iOS device.
I'm also sure. At least for mobile devices; I can't speak to desktop Chrome. That's why it does the same weird rendering Safari does, which has become more noticeable since the split from WebKit.
There's one discernible difference between Chrome and Safari on iOS 8: try scrolling in both and you'll see they behave differently, much faster on Chrome and less so on Safari. Prior to iOS 8, scrolling in both browsers was very much the same.
Apple does not block people from installing better browsers (I know a couple). They did, back in the day, when they couldn't find a way to make Nitro (their web engine) secure.
If I'm not wrong, Apple allows you to install Chrome on iOS? Of course, for a long time they didn't allow other browsers to use their Nitro engine (deliberately forcing them to stay slower), but it seems now they allow that as well ...
Again, it doesn't allow them to be made as the 'default' browser - which in itself is a big reason to criticize them.
Aah. I knew it 'used to be' the case, but didn't know that it is still the case, especially after Apple started allowing other apps to use the Nitro Engine (WKWebView). Just came to know that due to other technical limitations by Apple, Chrome still can't use it [0].
I built websites during that entire period. I saw the rise and fall of Netscape, the "dark ages" of browser development when Microsoft let IE rot - I witnessed the birth of Firefox and the re-ignition of the "browser wars" when Google launched Chrome, and Apple got serious with Safari by moving to WebKit.
The author is right, the details aren't the same but the same attitude exists. The author is simply speaking up before the differences become extreme.
Microsoft didn't "break the web"; they created their own web and then let it twist in the wind because they were pissed about the government beat down.
Apple didn't get their hand slapped, but they did build something so lucrative that evolving Safari became a smaller priority, which some would argue aligns well with their push to the App/closed ecosystem model.
Not only was Safari always WebKit based, WebKit is Apple's browser engine (forked from KHTML and open sourced) that Chrome used to use until they forked it.
Apple had little choice other than to open source WebKit unless they wanted to violate the LGPL licence, which forces the source code of any derivative work to be released back to the user.
My mistake, my comment was penned rather quickly and emotionally (typical). I should have verified my memory with actual homework, the revised timeline does not change my opinion ;)
Offtopic, but your first paragraph sounds like Roy Batty's final monologue in Blade Runner. I watched c-beams glitter in the dark near the Tannhauser Gate! :)
> Microsoft didn't "break the web" they created their own web
> and then let it twist in the wind because they were pissed
> about the government beat down.
I was around as well. My recollection is that they let it twist in the wind because SaaS competed with their two interdependent quasi-monopolies: Windows and Office.
Which is another instance of the Innovator's Dilemma: How can a successful company embrace technology that disrupts themselves?
> they let it twist in the wind because SaaS competed with their two interdependent quasi-monopolies
It wasn't called SaaS back then, but in any case -- they let it "twist in the wind" because their market had become the enterprise, and that's a market that wants stability, predictability and no updates if they can be avoided. By then they had killed all competition, so their incentives were to keep businesses locked-in through backward compatibility and security fixes. Add to that the big move to .Net and an effort to replace Flash...
IMHO it wasn't about the antitrust or platform control (they could have broken all the SaaS apps they wanted with each update; they had 95%+ of the market and were the no. 1 target for every website out there). They just had other priorities: the browser war had been won and attention was now on the enterprise market. Around that time, IIRC, Gates also retired, leaving Ballmer in charge; Ballmer was not the sort of technologist to lose much sleep over "the future of the web"...
I had to do some work[1] with, basically, HTML4. Eventually I reached the stage whereby the feature worked in every other browser (IE8+, including Opera), using clean and completely hack-free code. In 3 days I had done my work.
Then I tested it in Safari. Two weeks later I gave up and basically had to `if (Safari)`. I don't know if the code still has that, but it probably does. This wasn't anything 'lacking'. Sure, it was contenteditable-related but the fact remains that I found a consistent subset in every browser except Safari.
So, yes, Safari broke my web - and it wasn't something that was missing, it was something that behaved very differently compared to other browsers. I can't remember what specifically broke it, but even ~2 years ago I had a significantly better development experience with IE than Safari with no new HTML5 features in sight.
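For what it's worth, an `if (Safari)` branch like the one described above usually ends up as a user-agent sniff, because Chrome's UA string also contains "Safari" and has to be excluded explicitly. A minimal sketch (a last resort; feature detection is preferable whenever the behavioral difference is actually detectable):

```javascript
// Last-resort UA sniff for Safari. Chrome (and Chrome for iOS, "CriOS")
// also report "Safari" in their UA strings, so they must be excluded.
function isSafari(ua) {
  return /Safari/.test(ua) && !/Chrome|Chromium|CriOS/.test(ua);
}

isSafari("Mozilla/5.0 (Macintosh) AppleWebKit/600 Version/8.0 Safari/600"); // true
isSafari("Mozilla/5.0 (Windows) AppleWebKit/537 Chrome/43.0 Safari/537");   // false
```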
If your target browser is IE8 and you write a bunch of html4 that looks great in IE8, I would expect it to not work anywhere else.
Not only is IE8 a snowflake but html4 is particularly hard to get a good layout working. Lots of nontrivial layouts in pure html4 (and css 1/2) require tons of browser specific hacks.
You missed his point. He mentioned IE8+ and Opera as working targets to highlight the fact that Safari was being lame. He said every browser; that would include compliant ones like Firefox and Chrome. He also specifically said he avoided browser-specific hacks.
Saying "I didn't use any browser-specific hacks" with HTML4 is a strange statement, seeing as how many non-trivial layouts in html4 (and I'm assuming the older CSS standards that would typically go along with it) require browser-specific hacks to work.
But honestly, I can't know because he's not naming anything specific anyway, just an anecdote about how IE8 was great at being standards-compliant and safari wasn't. Which is such an extraordinary claim (and is so contrary to established precedent) that without any extraordinary evidence, I have a lot of trouble accepting it.
(And it's amazing how many people think "I'm not doing any browser hacks" while they're using something like jQuery, which is a library 100% devoted to doing browser hacks for you while abstracting you from that fact.)
> Saying "I didn't use any browser-specific hacks" with HTML4 is a strange statement
I was working on top of a huge CSS framework - the majority of styling work I did was tiny tweaks to styles where the appropriate hacks had already been solved.
Also I didn't mention layout. The specific pain point was contenteditable in combination with JS - by the time I got to the code everything surrounding my little problem-space had been solved by the rest of the team. Again, specifically, Safari demonstrated wildly different behavior within that specific space, where other browsers only behaved marginally different (a relative term, contenteditable is a complete mess, sadly, still in HTML5) in a way that I could find a viable subset of functionality that worked consistently.
Well, that's the whole thing then, isn't it? The huge CSS framework probably had tons of hacks (as you say) to make things work in IE8... it's a bit disingenuous to say you wrote everything to be clean and without browser hacks, if the hacks all exist but are buried under a "huge CSS framework".
Your whole experience could probably be restated as "I was working with a CSS framework that did a bunch of cross-browser hacks for me, and the tweaks I made to it didn't work in safari", which is a pretty uninteresting statement, because it's equally likely that any issues you had in safari are the fault of the CSS, and not the browser. (In fact, much much more likely.)
Opera went out of its way to ensure non-standard stuff targeted towards specific IE versions also worked in Opera. The downsides of being a browser no one cares about.
Something working in opera doesn't mean it is standards compliant.
I use Chrome as my primary development browser and Safari and Firefox as my second and third choice. When I do get around to testing in IE 9+, it's almost an afterthought. Is it surprising then that IE is the browser that gives me the most trouble?
All browsers have quirks. It's inevitable when you have many competing implementations from companies that have different priorities. The browser you use the most is naturally the one whose quirks you will get used to.
In the same vein, Chrome does break the web by introducing various custom features that can't be found in other browsers. See all those Google demos from the past several years: "You need Chrome to view this" sounds very much like those "IE5+" warnings.
It's one thing to be the first to implement standards, it's another to completely ignore them for years.
The "Chrome Experiments" was a way to show off the WebGL capability in chrome that no other browsers had at the time. (Now those demos will mostly work in all browsers)
Plus you are forgetting that those are experiments. They were never really meant to be actual "apps". Making an experiment/demo for one browser is fine IMO. Especially when only 1 or 2 browsers have implemented the standard currently.
It's like WebRTC. I made a "demo" WebRTC app about a year ago, and it only worked in Chrome OR Firefox (but they couldn't talk to each other). That's not Chrome or Firefox breaking the web; that's the other browsers not being at the cutting edge, but I don't blame them. WebRTC was (and still is) VERY new; it wasn't even finalized, and I was using the app as a bit of a tech demo to see what may be possible in the near future.
The chrome experiments were the same way. WebGL wasn't fully finalized, it was one of the first implementations of it, and people wanted to try it out, so Chrome Experiments was born.
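Incidentally, demos like those don't strictly need a "You need Chrome" banner; they can check for the capability itself. A sketch (the function takes the global object as a parameter purely so it can be exercised outside a browser; the vendor-prefixed names are the ones Chrome and Firefox actually shipped at the time):

```javascript
// Capability check instead of a browser check. Takes the global object as
// a parameter only so it can run outside a browser; webkit/moz prefixes
// were the real names shipped by Chrome and Firefox at the time.
function supportsWebRTC(win) {
  return !!(win.RTCPeerConnection ||
            win.webkitRTCPeerConnection ||
            win.mozRTCPeerConnection);
}

// In a page you'd call supportsWebRTC(window) and show a fallback
// message rather than demanding a specific browser.
```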
WebRTC is not really new any more - a couple of years old (and existed in other forms long before), and still not widely adopted. In fact, not adopted in a single further browser in that time, and the original two are not compatible.
I'm thinking it's almost time to consider WebRTC a 'failure to achieve traction'?
Not an expert, but hopeful that WebRTC will be a success in the end. Maybe I'm just overly optimistic?
> still not widely adopted
Certainly everyone involved with WebRTC wishes that it had the support of IE and Safari, but Chrome + Firefox + Opera is nothing to sneeze at, particularly if you're targeting a savvy audience.
> Original two are not compatible
Maybe I'm confused, are you saying that Firefox and Chrome aren't compatible via WebRTC? I've used them together and had success, but just for basic video calling.
> not adopted in a single further browser in that time
This is fair, but if Microsoft's latest moves come to fruition that will be a big change. There are hints that Edge will likely support WebRTC[0], though I don't think there has been any official word. Maybe that's just for ORTC?
All the important scalable features are incompatible - so-called 'bundling' where many streams use a single port etc. Aggregation of error feedback so an MCU can sensibly report bandwidth limits. Slow connect/reconnect times. No 'supernode' support. Pretty much anything that takes it from a toy with 3 or 4 p2p connections, to a product that can handle up to 100 streams with audio/video switching realtime through an MCU or P2P dynamically.
Other products can do this. WebRTC suffers from being based on RTP/RTCP and SIP-style signaling. There's not enough juice there to get the job done.
Yes, 2 years after it was released. And there are many levels of bundling. Will they both support all of them? Will the MCUs follow quickly? The WebRTC landscape is a disjoint map of features and support that evolves very slowly.
I'm sure as hell not married to the idea of WebRTC, but I desperately want something like it.
The ability to do peer-to-peer communication from the browser is an amazing tool. It allows the creation of client-side-only web apps which can easily send massive amounts of data to others. With WebRTC (or any service like it) you can add video chat to a web app with very little work and no real server-side infrastructure, in a way that is more secure than the alternatives.
If it needs to be redone in the name of getting it implemented widely, that's fine with me and I welcome it, but if it's just NIH syndrome at Microsoft holding it back, that's not okay.
Mostly it's an unprofessional wad of code written by some PhDs. It went through some hands and ended up at Google, who mercifully refactored it somewhat and renamed it.
I've been dealing with the code base for 5 years, and never liked it. Had to rewrite the APIs every time we got a new source version, to fit our model which isn't the 'conference call' model. I'd be glad to see something more app-agnostic that just dealt with negotiating P2P streams, without so many assumptions about what/why/when the streams are.
IMO this is directly related to the post. WebRTC is a technology that has huge potential on mobile (given phones have cameras etc) but Apple haven't/won't implement it.
WebGL is a good comparison here - it was pretty dead before Apple implemented it. When they did (in iOS8) you started seeing a ton more WebGL things popping up.
Still another conference-call-specific take on p2p communications. Application assumptions are laced throughout the design. Which in my experience is the usual problem with open source IP - its tied up with so many assumptions as to be brittle.
Speech Recognition and Synthesis are in Safari, though I don't know whether they're as complete as Chrome's. Safari definitely lagged behind though...
> See all those Google demos from past several years: "You need Chrome to view this" sounds very much like those "IE5+" warnings.
I'm in sympathy with the spirit of this post, but I'm not sure that I agree with this specific example. I think that there's a big difference between an 'everyday' web page requiring a specific browser, and the creator of a browser showing off what they see as its exceptional capabilities.
However much designers (and users) like adherence to standards, I think that it's clear that a lot of the innovations we take for granted have come from browsers leading the way; and there's not much point in having those innovations in the browser if the user doesn't know about them.
(I should be clear that I'm not advocating a functionality free-for-all, though; probably much more harm than good has come from this behaviour.)
I have to support IE11 for corporate software, websites, etc. I'll use it as my main browser for a day or so to test things out, and it's just incredible how it's become a 2nd-class citizen and how you pretty much need Chrome or Safari nowadays to get by. I'm partial to Firefox and it works well enough, but designers only have so much time and they'll hit big targets first (Chrome/Safari) and leave IE be, with FF support just working because Gecko and WebKit render very, very alike. Things are pretty much the opposite of what they were just a few years ago. Obviously, supporting standards is good and IE11 is better at standards than anything before, but there's still bad blood here.
That said, I was just playing with Edge on Win10. It's like a pre-bundled Chrome: fast, compliant, etc. It hilariously even breaks MS software; for example, I can't open the address book in Outlook Web Access because of some deprecated HTML control that Chrome no longer supports. So if Chrome no longer supports it, Edge doesn't either!
I guess we'll see if extensions ever take off for it. If they do, I could see it hurting Chrome's market share. Personally, I think the Chrome/Google juggernaut needs to be knocked down a peg or two. I hope Edge gets popular just to keep Google's sometimes anti-consumer ambitions in check. For example, using FF on Android means Play Store links don't work right, and a million other things. Google isn't incentivized to make alternative browsers work well on Android. They want us to use Chrome with all of its spying/tracking/marketing stuff cooked in.
A lot of that is new features that are later implemented in competing browsers though, sometimes features that aren't even officially declared stable in Chrome yet. I'll happily allow "works best in X" for a tech demo. For a site/app I want to use for doing something, this is different of course.
While the title is a bit click-baity, I think the overall point is: Company sees the value in the internet. Company wants to "play too". Company builds good browser that is fast and feature filled. Company realizes that they can throw their weight around and have company specific features. Company starts to care less and less about accepted standards, and cares more about their pet standards. Company refuses to let other, more capable browsers run on their OS. Company thinks that because they're the biggest and richest, that everyone should get onboard with their choices. It'll last for awhile. Something else will come along and give users and developers a better option. Let's hope Apple doesn't screw around for as many years as MS did before creating a decent browser.
We've got one or two last perf fixes for the Edge dev tools going in before RTM, but the current perf in Edge (Tech Preview) is about what to expect. Have you tried the latest builds? Always happy to take a message about what to fix or add (this username @microsoft.com)
Yah, although some more than others. Microsoft looks ironically (or perhaps fittingly) to be the first into the next lean cycle with Microsoft Edge.
The browser game has changed and become commoditized. Before, browsers trended toward being monoliths with everything from email to apps and games, various services, etc. Those parts are being usurped by stand-alone services and apps and no longer need to be integrated into the browser. I think that's good! Browsers should focus on being browsers. Better to be à la carte than to bundle everything under the sun.
Also, I think the vision of HTML as a "universal" app/game platform is slowly dying. We're coming to terms with the fact that HTML is really good for somewhat basic presentation and interfaces, but not much more than that.
Although doing advanced stuff is possible, it seems we're coming to the realization that it's neither practical nor performant.
> Firefox Dev Edition is broken or slow every other build
Then just use release? Dev Edition is the alpha version with some different defaults, based on the assumption that "developers" are OK with an alpha quality build that has some new features faster. If that bothers you, use the release. It also contains the dev tools!
Only in the sense that it's missing all the features. People seem to forget, when calling current browsers "bloated", that the bloat comes from the webpages containing all the cruft.
If it lacks very basic functionality, isn't the effect the same as breaking the web?
I think one unfortunate aspect of the article is that it focuses on some of the less common deficiencies of Safari. I would have called out mobile Safari being unable to download files, or Apple's holding out against video codecs that all the other vendors are aligned on.
Well, I mentioned the two that I am most keenly aware of. I only know what I've come across, so I'm assuming there's lots more.
To my knowledge, file downloads worked in all browsers prior to mobile Safari being initially released. So Apple did release a browser that broke some quite basic web functionality.
I don't know how I missed that part of your post. Sorry about that. Still, I find it quite a stretch to argue that not supporting file downloads "breaks" the web. Users don't perceive websites to be broken just because they can't download files. It would be akin to claiming that a device with no camera breaks the web by not supporting the camera API. Or that the Lynx browser breaks the web by not supporting CSS and JavaScript.
Regarding Apple's insistence on not supporting Ogg (Vorbis|Theora) (I assume that's what you're alluding to), those codecs were briefly recommended by the HTML5 standard, but were later dropped.[1] Again, I think it's a stretch to say that Apple broke anything simply because they resisted a proposed standard. I know a lot of developers were disappointed, but that's how committees work: you don't always get your way.
Given that the author self-identifies on his blog as a twenty-something, I'm wondering if he's old enough to have ever really had to heavy-lift IE v. 4-7.
IE was a vast pile of shit for about a decade. It was a pile of shit so high that Microsoft took out ads saying, in so many words, that its latest version "doesn't suck, honest". I tore my hair out for years over IE bugs. Fucking kids these days. Mobile Safari might not be as magical as you would desire, but it is not even in the same league as the virtual genocide/superfund/WMD that IE was.
Oh the pain of having to use nested tables, various css parsing hacks, browser sniffing, two different box models, and changing significant-insignificant whitespace to support IE 5, 5.5, and 6. All with zero debugging tools.
Yep, I have had to support IE7 and 8. IE6 was, thankfully, a bit before my time.
And yeah, the title is pretty clickbaity; guilty as charged! But my blog has no ads, so it's not like I'm making money off of it. I'm just trying to draw attention to what I think is an important issue.
As others have pointed out, it's all a matter of perspective. If you're trying to build for Web 2.0, then Safari is a fine browser. But if you're trying to build for Web 3.0 (or the "next web" or "appy web" or whatever you want to call it), then prepare to be disappointed.
The worst thing MS did with IE was implementing proprietary, and very aggressively protected, solutions to common problems. Developers, excited about the new features, rushed to support what they felt was the next generation of web technologies, only to get caught in Microsoft's trap. Once they held the marketshare majority, they stopped innovating.
Safari is nowhere close to a majority player in the market (not even on mobile). Thus, it could never be considered equivalent to the evils of IE.
What your comment says to me is that you were not actually there back in the day. When IE6 was released it was the best, most compliant, fastest browser. That wasn't the problem. The problem is that they then didn't update it for 5 years, and people eventually wanted more, which IE couldn't deliver; just like Chrome and Firefox really suck at "HTML6" at the moment.
Ha whoops, on mobile it looked like you were responding to the OP article instead of a top-level commenter. Mea culpa. It's just something of a pet peeve that there are people who call 'ad hominem' at every insult.
Yeah there would definitely appear to be a generation of developers now on the scene that were doing something other than web development in the 2000s. 100% this. Web development was fucking hard ten years ago.
Yeah, I was a big believer in "screw it we'll just use tables" back then. When CSS finally caught on it basically was the end of dynamic-width websites for a few years.
The point is that Safari is the browser that destroys the ability to universally use new standards. It is the odd one out that makes it hard for all the others.
Considering that Safari fails to start on a regular basis (amongst a ton of bugs and problems with it), I'd say that breaks the web a lot more than IE6.
But I'm sure this title gets you more clicks.