It's hard to remember, but even though Windows 3.11 was extremely dominant at the time, it was by no means assured that Windows 95 would be the success that it was. The very first version missed wildly in some big ways (MSN was a folder integrated into the desktop, for example, and no TCP/IP support [*Edit: yes there was - I misremembered.]), but the core, underlying redesign of the GUI was so profoundly good it propelled Microsoft into a new level of ubiquity. Compare it to other GUIs at the time, like CDE, IBM's Presentation Manager, or even Mac OS 8, and there's no comparison. Windows 95 solidified Microsoft's dominance, but could just as easily have eroded it had they dropped the ball.
Even though I've used a Mac daily for the past decade or so, I still miss the taskbar and window-oriented GUI of Windows. I still get frustrated on OSX when I minimize a window and have to hunt around for it. I wouldn't switch back because of the underlying crap that is the Windows OS and file system, but I still miss the interface.
Edit: Found this fantastic PDF "Chicago Reviewers Guide" which goes over all the new stuff in Win95. So much stuff I had forgotten - TrueType fonts, Plug and Play, registry settings, right-click properties, long file names... Basically everything that makes Windows what it is today.
The one thing I don't understand that Mac has never adopted is being able to use open and save dialogues as mini file explorers (move stuff around and rename, specifically). Having to switch to Finder to move or rename a file that has the same name as the file I'm trying to save is ridiculous. Of course I never need to do this anymore since I only work on text files under revision control, but it still seems odd to me that it was never introduced. I really just miss Windows Explorer A LOT since moving to Mac. I don't hate macOS, but Finder is a bit of a joke.
To be fair, Windows' file open/save dialogues are so far ahead of everything else that the competition seems like unusable garbage to me. I'm glad KDE/Qt chose to emulate these very closely on Linux. Wouldn't want a desktop where my only choice is Gnome's take at this.
(On the flip side, Windows' select-a-directory dialogue of the same vintage is such an utter piece of garbage that I can't imagine there being any overlap of designers between the two dialogues.)
@blattimwind: "Windows' file open/save dialogues are so far ahead of everything else that the competition seems like unusable garbage to me. I'm glad KDE/Qt chose to emulate these very closely on Linux"
I hadn't realized that KDE was copied from Windows 95. I'm surprised no one here has mentioned NeXTSTEP. Here's a demo by Steve Jobs from 1992: https://www.youtube.com/watch?v=gveTy4EmNyk
It’s awesome that his goal has always been to allow “mere mortals” to be able to use computers. I can’t believe it’s from 1992. Why is it still so hard to build a database powered app in 2018?
To remedy this:
File > Options > Save > Save to Computer by Default (yes, agreed that this is ridiculous).
A habit that I've developed since my earliest computer classes in elementary school is to save the file in the location you want as soon as it's named, so that from then on Ctrl+S saves it with no hassle.
Word 2016 also has "Don't show backstage while opening or saving files" in the same options dialog, which basically hides the Backstage UI[1] unless you specifically invoke it from the File menu, and shows a plain old Open/Save dialog instead.
Absolutely true. I use a MATE desktop too, and I can't even rename files in the save dialog. And I don't want to look it up either, because it is so inconsistent all the time...
I would extend your argument to Windows Explorer in general.
Gnome's save dialog allows you to rename or delete files. Maybe MATE should take some inspiration from the newer version of their desktop environment...
I don't really care whose fault it is, or which DE allows this. Such basic stuff MUST work.
Sometimes I feel like the developers don't want stuff to work similar to Windows. Because that would be admitting that Microsoft actually did something right.
Windows 3.0 didn't have any standard system dialogs; these were introduced in 3.1, but that is not the main point. In Windows 3.1 the directory selection dialog was exactly the same as the file open/save dialog (Figure 8 in the article) but didn't let you choose files (and there are still places in Windows where the standard open dialog is used for choosing a directory).
What is typically used as the "standard directory selection dialog" actually isn't even a documented standard WinAPI dialog, but was originally an internal dialog of the Explorer shell (and it would not surprise me if it was not present in the original Windows 95 RTM but introduced in some slightly later version). Also, it does not select directories (i.e. a pathname as a string), but shell folders (i.e. an ITEMIDLIST).
Because that is extremely non-intuitive. It's an "Open" and "Save" dialog. That is what it should do. Joe Public is not going to know it does anything else, yet it does.
Actually, you completely missed my point. The Windows dialogue is superior not only because it allows power users to do what they need, but also because it's actually usable for regular users.
For example: I literally don't know how to save a file using Gnome's dialogue in the general case. Say the dialogue opens at /foo, and I navigate to /foo/bar using the dialogue, but then go back to /foo ("bar wasn't the right place after all"). Now I can't save the file there any more: "bar" will be selected, and clicking "Save" while a directory is selected will not save, but navigate instead. Now I'm in "bar" again. I go back to /foo and try to click something else, say a file. This changes the to-be-saved file's name to the selection.
After I asked someone who uses Gnome as their main desktop they told me "that's easy: you just have to ctrl+click on the selected directory to de-select it, then you can save in the current directory".
That, my dear readers, is indeed unusable garbage.
As a long-time Gnome user, I've only realized how bad it is after reading this comment ;) (No irony here, I agree it's truly horrible now.)
But, I only realized it now because almost none of the apps I regularly use make use of Gnome's default file open/save dialog. Linux's extreme inconsistency has some benefits ;)
Oh, and talking about Gnome: the only reason anyone uses it is that after configuring a few basic things, you can completely ignore it, just run your apps, and forget that there's an actual OS with a GUI somewhere beneath them. Heck, if even the applications running on it ignore Gnome and its "standard widgets", then its biggest strength is that it can be easily ignored!
> I go back to /foo and try to click something else, say a file. This changes the to-be-saved-file's name to the selection.
That is the most annoying part to me, and Windows does this as well. Wanting to overwrite a file is such a rare thing for me that I find it irritating when I accidentally click on a file name instead of a folder and suddenly lose the file name that I wanted to save as. Usually I just cancel and start again. So stupid.
Whereas I would find it irritating if it didn't have that behaviour. Sure it's not the most common operation, but it's still quite common to want to overwrite an existing file.
This is the excuse that's always used, but I rarely see it play out in practice. Many people don't notice advanced features that are present, that's true. But since they don't notice them, it doesn't matter. Those who do can easily discover advanced features by exploring the interface.
Apple have a tradition of being really bad at this. Many of their (slightly more) advanced features are completely hidden behind undiscoverable key combinations or buried deep in the UI. The slide-to-reveal pattern on iOS (now mostly fixed with augmentations) is a good example. Middle-clicking the titlebar in Finder to reveal the parent directories is another.
In Windows's save dialog, right-clicking on a file gives me the same list it does in Explorer. In Mac's save dialog, right-clicking on a file does nothing.
It's not like the Mac way saves screen real estate or anything. It's like the way Firefox lets me close a tab by middle-clicking on it. Most users don't know it's possible, but it doesn't actively harm them to have the feature there.
I have similar problems with the Finder. Right-click a file on the Desktop and you have the option to compress it. Open the desktop folder in the Finder, and that option isn't present; it's greyed out in the Finder's File menu as well.
I really like the fact that the Windows save dialog is basically Windows Explorer, and I really miss those features when an application uses the older file save API (which gives a more Windows 95 / 3.11 interface).
OSX Finder is horrible. Classic MacOS Finder was far better - but still had open/save dialogs.
RISC OS had it right - the Open dialog didn't exist - you would open the folder and double-click the file. And the Save dialog didn't exist - the document would reduce to an icon which you would then place into the correct folder. (You can sort of do this on the various versions of macOS by dragging the icon from the title bar, but it's inconsistent.)
A few years ago, I actually searched around for an implementation of the classic finder, so this is very amusing for me to see actually implemented!
What I was interested in back then was the idea that there was a direct correspondence between the folder window and the data structure on the hard disk, and that to the user these concepts should be indistinguishable. One part of the illusion is that a folder always appears in the same place with the same window size, to give the sense that the folder is a tangible thing with permanence.
It's got a loooonnggg way to go (I never realized how many small but vital details are in a file explorer application), but it's strangely exciting to see it draw itself on modern macOS, especially on a retina screen!
And yes, the spatial UX! I'm still working through getting all of that implemented (I just completed persisting window locations/positioning days ago). I have recently been reading through some of John Siracusa's writings from the turn of the Mac OS X era, and that's been hugely insightful and helpful. The level of detail (both of the original Finder and his writings) is impressive and surprising.
I also used to love the fact that the equivalent of kernel extensions (totally forgotten the name) could be disabled by moving them to a different folder. The file system IS the computer.
Coming from the other direction, it kills me when I'm using Windows with a save dialog open and I drag the file I want into it from Explorer: on Windows this moves the file to a possibly new location, while on OSX the save dialog selects the file you dropped on it (without moving it).
I have the opposite experience WRT Finder vs Explorer. I use a Mac as my primary machine, but have a PC for games. Trying to use a file browser without Miller columns[0] drives me crazy.
I love being able to drag a file up one level in the tree without cutting and navigating to the destination to paste, or having 2 windows open to almost the exact same location.
> I love being able to drag a file up one level in the tree without cutting and navigating to the destination to paste, or having 2 windows open to almost the exact same location.
While Miller columns certainly support that, so does the tree view in the left panel in the default Windows explorer layout.
Honestly I didn't think of tree view before seeing this (and neither did the friends I asked who are more regular Windows users than I am). I'm gonna give it a shot.
I've been using bitCommander, which is a Windows file manager that supports (among other things) Miller columns.
Compared to the tree view, Miller columns use more horizontal space, but have less vertical overflow. Which is superior probably depends a lot on your usage patterns, but both will support “drag to (grand, etc.) parent of current folder” fairly simply.
Another good approach might be to do it the other way around: Get rid of the pseudo-file-manager dialogs entirely and let applications integrate with the regular file manager.
I think RISC OS (?) did this. Open documents in applications had icons representing them which you could drag to the file manager to save. (And perhaps to other applications to open?) Mac OS also has (or had?) this to some extent – many document-based applications show an icon in the title bar, which is for dragging and dropping the document in question.
Even Windows has an example of catering to this way of working, in Explorer, where the folder icon in the location bar represents the current folder and can be dragged and dropped. They even have some custom behavior to prevent the window from being raised when you drag from it, so that it works more like classic Mac OS and lets you drag to an overlapping window.
Windows Explorer can be extended; the example I remember is browsing a folder of email message files, with both the metadata attributes displayed in list view and a functional content pane for viewing the message itself.
BeOS did this, too; I got the impression that Windows was inspired by that, both for the filesystem (BeFS/NTFS) and the desktop filesystem UX.
You can use most open/save dialogues to do this - a right-click is all it takes in most places. But I definitely agree with your sentiment. Finder feels like a badly-designed toy. The latest things they've added - tabs and labels iirc - require more steps to do the same tasks in the new way versus the old way.
While a fair point, it’s important that feature-rich OS components be designed correctly (including security) and this wasn’t the case with Windows.
How many exploits were just a matter of tricking an extra-fancy OS dialog into popping open something it’s not supposed to, escalating permissions alongside it?
I would normally agree with you here, except that I've accidentally renamed files on Windows so many times, only to lose them because something was in focus that I didn't intend to be. I use a combination of mouse and keyboard when browsing, and the fact that I can accidentally click on a file (which renames my current file to that file) and save, or hit Enter thinking I'm saving when it actually starts renaming the file, is utter garbage in an Open or Save dialog box. I feel like the only things I should be able to do are open a file or save a file, not rename other files. Neither is a good solution, so it comes down to preference, but I definitely see the reason for it, because it happens to me on Windows all the time.
From someone traveling in the opposite direction at the moment: the Windows folder selection dialog is the crustiest open/save relic of them all, and "open files are locked even to administrators" is a UX catastrophe. Lesser pains: it hurts to be missing consistent "jump to enclosing folder" semantics everywhere (Command-click on the Mac; on Windows you only get an equivalent context menu 30% of the time) and to not be able to snap open/save dialogs to a particular file or folder with drag and drop.
The door swings in both directions, in other words :)
Windows Explorer, on the other hand, still has no "expand folder" functionality in the right pane. I have to go back and forth to sort garbage trees, opening multiple windows or temporarily pinning all the folders into favorites. Given that their names may intersect, the problem gets worse. In Finder you just expand the interesting folders and DnD until it's done.
What really is a joke is a left pane that combines favorites, libraries (die, die, DIE) and disk trees. It was never usable, except for favorites.
Haha, damn ok, it's been so long since I've had to do this that I didn't realize it is now possible (I used to want this all the time, and it was the bane of my existence in the late 90s / early 2000s when using a Mac). You still can't seem to move files into sibling directories, though, only to directory links in the sidebar. Anyway, I LEARNED SOMETHING TODAY (about macOS dialogues and about testing things before I open my mouth).
There absolutely was TCP/IP support in Windows 95 from the very beginning. It was not installed by default but it was trivial to add via the Network applet in Control Panel. SLIP/PPP was also supported and you had basic utilities like Telnet and FTP included so you could connect to the Internet right out of the box.
No web browser, though. Internet Explorer 1.0 shipped with the optional Plus! pack.
Since a large part of my work involved building networks of pre-DOCSIS cable modems at the time, I can tell you that the first Windows to support TCP/IP was Windows 3.11 (Windows for WorkGroups). This was carried into Windows 95 and by Windows 98 there was even Internet Explorer.
But in windows 3.11 the TCP stack was not part of the installation or even on the installation media, it had to be installed separately together with win32s. In Windows 95 IIRC everything you needed for TCP/IP was on the installation media.
...and it had preemptive multitasking and 32-bit drivers; we built a Mac (AppleTalk) server product on top of that with pretty good performance. 3.11 was a huge step forward.
Probably inertia. Windows 3.0 and 3.1 didn't ship with a TCP/IP stack. You had to either have something like Trumpet or install Internet Explorer 2.1 (which, AFAIK, was a separate purchase) to get Winsock. Or you could try to use DOS drivers. Even Windows for Workgroups 3.1 only shipped with NetBEUI and IPX/SPX. It wasn't until Windows for Workgroups 3.11 that the OS shipped with TCP/IP.
> Compare it to other GUIs at the time, like CDE, IBM's Presentation Manager, or even Mac OS 8 and there's no comparison.
Well, I agree that there was no comparison with System 8, but not in the sense you mean. I think that the Mac back then was head-and-shoulders a better system than Windows. It might still be, but they're both so painful to use now that it's very difficult to pick a winner.
The Macintosh system was very understandable, very clean. Extensions were an easy-to-understand way to extend one's system, and easy-to-disable too. The window system itself was better-thought-out and less-confusing than Windows's was. The Finder was much more straightforward than the Windows equivalent (was it called the File Explorer back then?). The way that the Mac associated programmes with files (with a creator code & a type code) was much better than the extension-based naming of Windows. The way that the Mac used its files' resource fork was great.
Programming a Mac back then was very clean & straightforward. I don't think there's anything today as nice, except maybe Cocoa, maybe. Certainly not the Windows 95 API!
I was using Macs back in the late '90s, and none of the things you say ring true.
Extensions could easily bring down the entire system because there was no memory protection. Full OS crashes (what modern macOS calls kernel panics) were a daily occurrence for the typical Mac-using professional who ran complex software.
The window system was often difficult to understand because apps tended to use a plethora of little panel windows that could overlap even from different apps. Windows preferred large windows that contained the entire app UI, and users typically maximized them. The Windows 95 Task Bar was much better for actually keeping track of your tasks than whatever the MacOS 8 thing was.
File extensions were always a hack, but one that Apple adopted too for Mac OS X. The days of Mac's file-specific associations were numbered when the Internet happened, because Unix servers wouldn't keep track of that metadata, so you needed file extensions anyway.
Besides, the file-specific associations were often super annoying because they were created by the editor app even for exported files. You saved a JPEG file from Photoshop, and it forever insisted on launching the full Photoshop when you double-clicked on it, instead of your preferred lightweight image viewer. This would happen even when you copied the file to someone else because the association was in the file metadata.
Windows NT 4 and its next version Windows 2000 were just heads and shoulders above MacOS 8 and 9 in terms of performance, stability and usability.
(And programming in Mac OS 8... Ugh. No memory protection, no multitasking, APIs originally designed in Pascal.)
> Extensions could easily bring down the entire system because there was no memory protection.
Yes, they were quite unstable. I didn't say that they were stable; I said that they were easy-to-understand and easy-to-disable, which they were: each extension had a distinct icon displayed at system boot; disabling one was as easy as dragging it to another folder; disabling all was a matter of, IIRC, holding down Command as you booted.
> Windows preferred large windows that contained the entire app UI, and users typically maximized them.
As a Mac user at the time, I much preferred the multi-window mode: it meant that I could customise my desktop as I liked. The Windows single-window mode was terrible, as it meant that I couldn't layer windows properly.
> You saved a JPEG file from Photoshop, and it forever insisted on launching the full Photoshop when you double-clicked on it, instead of your preferred lightweight image viewer.
We considered that a plus at the time: it meant that different files of the same type could be opened up by different apps by default. One could always, IIRC, Save As if one wanted to change the file type — or use ResEdit.
> Windows NT 4 and its next version Windows 2000 were just heads and shoulders above MacOS 8 and 9 in terms of performance, stability and usability.
Stability, probably. Performance, maybe. But usability? Never! That was back when Apple cared about UX.
Windows NT 4.0 had the same GUI as Windows 95 and was released in 1996. That's the version of Windows that sealed the deal for professional applications.
The late '90s I remember had a mishmash of NT, NetWare, and some (or maybe several) variant(s) of Unix on the bits that lived in the room with all the air conditioners.
Windows 9x and maybe OS/2 were far and away the dominant OSes on actual workstations. Macs and even aging Amigas at some creative shops, some more Unix workstations at places where most people could recognize and identify the purpose of (if not actually use) a slide rule. But what I essentially never saw on any desktops was Windows NT. Lack of driver support and inability to run many business applications kept it out of that space.
I suppose it depends a lot on geography and business. I remember Windows NT/2000 rapidly displacing Macs in the creative fields, and being widely used as developer workstations.
The funny thing I have always thought about Mac vs. Win is that Mac was so set on doing it "their way (think different)" that they ignored UX that really was intuitive and "just worked".
Mac was so set on being different that they eschewed UX tropes that were natural, and had to spend ridiculous amounts of resources trying to convince people that their way was the right way, but clearly it was not.
This, IMO, is where the “fanboi” concept evolved.
Brainwashing.
Who cares - in the long term - where the UX and UI elements came from; the point is to make machines immediately accessible to humans' creative desires, not to mold workflows to a corporate ego...
So, Apple figured out how to develop products that were managed through the extension of the user's desires, but they are still struggling with the requirement from Jobs to be "different". And Ive's perception of a common user is skewed toward "Ive has stated that this is how it should be done" type design, which I find completely ironic, given that the whole "think different, so long as it's exactly how I am designing you to think" campaign is a hypocrisy that goes up to 11.
What Apple have done with both first and second generations of the Macintosh -- System 1 and OSX -- is pick a basic GUI metaphor and stick with it for at least fifteen years.
The original Mac was released in 1984 and the system was produced through 2000, sixteen years. OSX was released in 2000 and is now in its eighteenth year. Whilst each system has seen some evolution over time, the general metaphors and interfaces have remained consistent.
Apple have realised and internalised a core concept of GUIs: change is bad. There is a far higher cost to changing interfaces than can be gained through efficiency, and the retraining and unlearning costs are exceedingly high relative to benefits.
This is a message apparently lost on Microsoft and most of the leading Linux desktops.
Mind: I write this as someone who, whilst using a Mac presently, doesn't much care for the interface. My preferred desktop remains WindowMaker (itself based on Aqua's predecessor, NeXTSTEP), which has the key advantage of having changed almost not at all in the 20+ years that I've been using it. It's also configurable in ways I find useful, and I schlep a configuration directory around to new systems as needed.
Microsoft hasn't been too different in that regard. In terms of interfaces with any real userbase to speak of, the only real Microsoft UI systems I can come up with could be characterized as
* MS-DOS/Windows 3.1-like
* Windows 95-like
* Windows 8-like
Windows 8 basically was born and died in a couple of years, to be replaced with Windows 10, which is very much the same basic set of metaphors as Windows 95. There's a start menu, a little dock of pinned icons next to it, a taskbar, a clock and some little icons for what are basically background processes. There's a maximize and minimize and close button on windows. There's a File/Edit/Whatever set of menus. You can right-click for context items, many of which are consistent with Windows 95 20+ years ago. That basic system is now something like 21 years old.
You can maybe argue that Microsoft forgot your message five or so years ago, but they obviously rediscovered it.
I'm largely familiar with the DOS -> Win2K period, and have made little use of Microsoft operating systems since.
Windows 3, 95, NT, and 2K each saw significant changes in where and how major system functionality was presented.
During the same period I was using numerous Unix and Linux platforms (and still do). Those have largely seen far less substantive change at the shell and system level, with a few notable exceptions.
I'm not discussing Linux GUIs, which have been all over the goddamned map. I've used twm, fvwm, fvwm2, VUE, CDE, WindowMaker (my preferred option), GNOME and KDE through multiple generations, Enlightenment, various of the 'boxes (black, open, flux, ...), ion, xfce4, ... And those are the ones I've trialed to some significant extent. I've at least fired up and looked at virtually all the options mentioned on the XWinMan page: http://www.xwinman.org/
There are several fairly central components which have changed fairly markedly. The shift from telnet to ssh, multiple iterations of firewalling, various scripting languages of preference (bash, perl, python, an oddment of others), mailers (sendmail, qmail, anything reasonably sane, mostly exim and/or postfix now), and of course, the whole init replacement clusterfuck.
But the notional concepts of files, filesystem, shell, utilities, pipes, etc., have remained consistent, and even across several utility / server replacements (particularly ssh and mailers), command-level compatibility has been preserved to a remarkable extent with previous options (e.g., rsh and sendmail syntax).
If Microsoft can only be relied on for five-year stints of "having learnt this lesson" then they have not learnt this lesson.
I fear though that command level compatibility is under attack these days, as fewer and fewer see shell scripting as something positive (never mind trying to do more and more via dbus rather than pipes and such).
And already with 8.1 the "start screen" could behave like a very large start menu.
Hell, you can today configure Windows 10 to behave much like 8.1. The one thing I see some people miss with the 8.1-to-10 transition is the Charm bar, in particular that it gave easy access to printing and such.
Seems to me that Apple and MS focus on different kinds of change.
While Apple may retain the UI across time, they are more than willing to change APIs etc on a whim.
MS on the other hand may change the UI (though outside of 8.x, the core layout and behavior has remained much the same, and even 8.x could to a large degree behave like the older UI) but they bend over backwards to maintain APIs across time.
I get the feel that stable APIs are undervalued as a user retention element.
Being able to get a new computer but install from the same software library (I can hear the _sec people getting hissy already) as was used on the old one makes people more likely to pick the same "platform" over time.
Right, I see that point and would have acknowledged it more explicitly had I time earlier. That's the interesting part of this.
The counter is that Apple caters to a smaller software development community, though several of the tools also see extensive use and support (particularly Photoshop). But there's a heck of a lot of fundamental functionality on Apple's platforms that you can get without relying on third-party software, or at least, third-party proprietary software. Given the dynamics of proprietary software markets, particularly toward adware, nagware, and malware, this seems a possibly positive development.
(I've made much the same observation in recent years about the Android marketplace, which I see as a growing cesspit, and of the Windows application space, particularly at the peak of its crapware / spyware / adware period in the decade of the 2000s.)
Linux solves the software compatibility problem by allowing for recompiling of software for which the source is freely available, for the most part. This isn't a perfect solution, and there are complex systems which tend to not be particularly forward-compatible. One possible argument is that such complex systems are themselves inherently problematic and ought perhaps be avoided. You may not agree with the argument, but I'd expect you'd admit to its existence.
Microsoft was addressing a different space, and one in which there was a massive focus on desktop-distributed client software, much of it aimed at very specific business applications. This is a major application area for computers, though it's also one that's shifted significantly toward client-server Web-based solutions (or app-based, now). Which presents its own set of features and limitations.
And again, all this is what I was hinting at earlier with noting that you'd presented a very interesting point. I'll be thinking about this for a while.
Windows 95 was no paragon of stability when you started loading it up with tray apps and running big nasty applications or trying to use the damn printer with its piece of shit driver. We are spoiled these days by how stable our computers are.
I agree with you that keeping the file metadata in a separate fork is far superior to keeping the file metadata in a three character extension, but sadly the world zagged on this one when Apple zigged. I especially like that you could have two files of the same type associated with different applications, so if you created a text file in your IDE it would open back up in that IDE instead of launching the word processor. And you could change it (somewhat clunkily) at will.
Win95 did have some advantages though. The Start Menu was a better organization system than Apple's Application folder for example. Macs of that era were also slow and badly overpriced.
> The Start Menu was a better organization system than Apple's Application folder for example.
These were not actually that different. The original Start Menu was just a menuized view of a folder hierarchy (which mostly contained shortcuts but could contain any document).
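For illustration, that mapping is simple enough to sketch: the menu structure was literally the directory structure under the Start Menu folder, with shortcuts (and any other documents) as leaf items. Here's a rough, hypothetical sketch in Python; the helper name and the idea of returning nested dicts are mine, not anything Windows actually did internally:

```python
import os

def build_menu(folder):
    """Mirror a folder hierarchy into nested menu entries, roughly the way
    the classic Start Menu presented the Start Menu folder's contents."""
    menu = {}
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if os.path.isdir(path):
            menu[name] = build_menu(path)             # subfolder -> submenu
        else:
            # a file (typically a .lnk shortcut) -> a clickable menu item,
            # shown without its extension
            menu[os.path.splitext(name)[0]] = path
    return menu
```

The point of the sketch is just that there was no separate "menu database": renaming or moving a folder on disk changed the menu on the next open.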
The resource fork idea is great if everyone agrees on it and everyone preserves them, even across OSes.
Or if you're going to Very Deliberately Ignore the Other OSes and go do things your own way. VDIing like that seems to be a very Apple trait.
My point is, I'm not sure how well the resource fork model could have ever survived prolonged and sustained contact with the Internet and modern pervasive networking.
> I agree with you that keeping the file metadata in a separate fork is far superior to keeping the file metadata in a three character extension
Since we're talking about Win95, the three-character limit doesn't apply (long filenames were a major new feature of this OS, after all). .jpeg and .html were relatively common at the time, for example, and worked fine.
I still find the extension system kludgy, but arguing that it was worse partly because of the limited pool is incorrect starting from Win95.
Yet for the same compatibility reasons most people stuck with .jpg and .htm, at least in the Windows world. Even now it's very unusual to see a .jpeg extension on a filename.
The metadata that Macs kept in the resource fork went way beyond file type and creator, too. It included things like the file's icon, creation/modification information (so it would survive a trip over the Internet!), loads of stuff for applications (menus, graphics, sounds, etc.), formatting for plain-text documents (so they fall back to plain text on unsupported systems), and so much more.
Fun fact: NTFS supports the concept of a resource fork on files, but almost nothing in Windows uses it. I think I've seen more malware hiding stuff in there than legitimate uses in the wild. Worse, even in the obvious case of loading a Mac file on a Windows machine, it usually fails and falls back to creating the clunky separate directory instead.
NTFS does not have resource forks in the MacOS sense, nor extended attributes in the Unix sense. Instead it allows a file to have multiple named contents that are accessible through the same file I/O API (in essence, the file can behave like a simplified directory). There is no distinction between data and metadata stored this way. In the late 90s MS even intended not to use the OLE compound storage file format (i.e. what Office 97/2000 formats are built on) on NTFS drives and instead write the objects into separate streams (reportedly it was never implemented because Windows would then have had to somehow transparently reconstruct the compound storage whenever you copied such a file to a non-NTFS drive or uploaded it to the Internet). Today, apart from malware hiding, the only major use multiple streams have is the "this file was downloaded from the Internet, are you sure you want to open it?" prompt, which stores the internet-ness of the file in a secondary stream.
Conceptually the MacOS Resource Fork is basically a directory where all of your filenames have to be exactly 4 characters long. The only difference is that each "file" might be a stack of "files". So you might have a CODE resource that has multiple CODE segments in it.
One thing I loved about old MacOS apps is opening them up in ResEdit and seeing so much of how the thing was built.
> Conceptually the MacOS Resource Fork is basically a directory where all of your filenames have to be exactly 4 characters long. The only difference is that each "file" might be a stack of "files". So you might have a CODE resource that has multiple CODE segments in it.
Sort of. It would be more accurate to say that each filename was a 4-letter type code and a 16-bit ID. Each resource could also have a name, but that was less frequently used (and didn't have to be present, let alone unique).
More importantly, resource forks didn't exist in isolation. They were loaded into a chain of active resource files -- for instance, while working with a Hypercard stack, the resource chain would include the active stack, the Home stack, the Hypercard application, and the System suitcase. A stack could use resources (like icons or sounds) from any of those sources.
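To make that structure concrete, here's a tiny Python sketch (my own model for illustration, not Apple's actual on-disk format) of a resource map keyed by a 4-character type code and a 16-bit ID, with an optional name:

```python
# A minimal model of the classic Mac OS resource fork described above:
# each resource is addressed by a 4-character type code plus a 16-bit
# ID, and may carry an optional (non-unique) name.

class ResourceFork:
    def __init__(self):
        self._resources = {}  # (type_code, res_id) -> (name, data)

    def add(self, type_code, res_id, data, name=None):
        assert len(type_code) == 4, "type codes are exactly 4 characters"
        assert 0 <= res_id <= 0xFFFF, "IDs fit in 16 bits"
        self._resources[(type_code, res_id)] = (name, data)

    def get(self, type_code, res_id):
        return self._resources[(type_code, res_id)][1]

    def by_type(self, type_code):
        # All resources of one type, e.g. every CODE segment.
        return [rid for (t, rid) in self._resources if t == type_code]

fork = ResourceFork()
fork.add("CODE", 0, b"jump table")
fork.add("CODE", 1, b"segment 1")
fork.add("ICN#", 128, b"icon bitmap", name="app icon")

print(fork.by_type("CODE"))  # [0, 1]
```

The "stack of files" observation above falls out naturally: `by_type("CODE")` returns every CODE segment, each distinguished only by its ID.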
The filename extension is the bare minimum of metadata for a file and not easy to extend.
Even then, Unix systems will skip even that minimal metadata and force you to messily search for magic numbers at the start of the file and make a guess.
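For illustration, here's roughly what that magic-number guessing looks like. The prefixes below are the real well-known signatures; the function itself is just a toy version of what file(1) does:

```python
# Toy magic-number sniffing: guess a file type from its leading bytes,
# since nothing else on a bare Unix filesystem tells you what it is.

MAGIC = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
    b"%PDF-": "pdf",
}

def sniff(data: bytes) -> str:
    for prefix, kind in MAGIC.items():
        if data.startswith(prefix):
            return kind
    return "unknown"

print(sniff(b"\x89PNG\r\n\x1a\n...chunk data"))  # png
print(sniff(b"hello world"))                     # unknown
```

Note that it really is a guess: nothing stops a text file from starting with `%PDF-`, which is exactly the kind of ambiguity typed metadata avoids.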
> I think that the Mac back then was head-and-shoulders a better system than Windows.
I completely disagree.
First of all, Windows 95 had preemptive multitasking (the Amiga was the only computer that had this at the time), Mac OS was single task and used a terrible scheduling and it would be years before Mac OS gained preemptive multitasking because of terrible architecture choices that made this extremely challenging.
From a GUI standpoint, Windows 95 was a total revolution and made Mac OS look completely antiquated: rendering, scrolling speed, font types, menu items and dialogs, etc...
The Amiga was exotic hardware, like the BeBox. If you want to include those, you might as well include SGI and Sun hardware. Preemptive multitasking was common on non-PC hardware.
Cocoa is pretty painful compared to web UI alternatives like Electron. I feel stupid for mastering it considering there’s no jobs there and the Mac App Store is probably impossible to make any money off of anymore.
Windows has always been head-and-shoulders above the Mac and that's why it won the 90s desktop wars and that's why just about every business on earth uses Windows and not Macs.
You are in a tiny minority if you think Mac window and file management are any good. File extensions are the most pragmatic way of dealing with associations and that just about sums up the difference between Mac and Windows: Microsoft made Windows to be practical whereas Apple has always focused on style above all else.
> I still get frustrated on OSX when I minimize a window and have to hunt around for it
Windows user here. Honest curiosity: does anyone know why the minimize / maximize works on the Mac the way it does? I mean, what's the rationale to design it like this?
Different ways of working, mostly stuck in our own ways of doing things.
I rarely, if ever, use Minimize on the Mac. Minimize comes from Windows (and other windowing systems) where the window minimizes to an icon or button on the task bar.
"Maximize" also comes from Windows (and other windowing systems.) As others have noted, in newer macos (which I don't use,) I think it oddly makes the window go full screen. Full screen is a recent macos feature -- I once asked about making my Application go full screen and Apple developer support said that going full screen did not follow their Human Interface guidelines. Something seems to have changed at Apple since I asked about that decades ago.
There was no Maximize on the Mac, it is called "Zoom." The idea is that the window has two sizes and you zoom between the two sizes: One size is the size the user has resized the window to (often with much difficulty) and the other size is an ideal compact size ("optimally fit content") without hiding anything and hopefully where scroll bars do not appear -- it is a UI feature that is/was rarely, if ever, done very well by applications other than the Finder.
By the way, Command-Tab to switch tasks was once an add-on from Microsoft for Mac OS. Go Microsoft! (Mac fanboy here :)
All correct. In daily use, I almost never use those buttons - I end up manually sizing/placing windows.
The MacOS scheme of doing this leads to a sort of organic, emergent window layout - I always end up with windows staggered to display relevant bits. With Windows (and with most X window managers), things always end up either strictly tiled or stacked.
I'm very used to the Mac way and prefer it, but that could just be the result of long use. It doesn't waste space (Windows apps always seem to have lots of dead space to me and make me work to be able to see parts of other windows) and forces me to switch windows far less. But it is a fairly subtle thing.
The amount of tedious manual work required to resize and arrange several windows on MacOS (6+ to the latest OSX) bothered me the whole time I had the misfortune of having to use it. It does not even have something like edge snapping.
My WM of choice is now Xfwm, which lets me rearrange windows easily without gaps, make windows fill the available space (vertically and horizontally, separately), or tile them by dragging to corners.
On OSX, things like Spectacle and Magnet sort of help.
The lack of window and edge snapping are exactly the reason I prefer MacOS behavior. If it did that, partly revealing lower windows would be far more tedious/impossible.
For me, the goal is not to maximize available space to the frontmost windows; it is to maximize the use of the monitor to display what's currently relevant. It allows me to do things like keep a Finder window open with just a bit peeking through to drag things to, keep an eye on a few lines of a terminal tail -f, see the mailbox pane and room pane in Mail/Slack so I can see if anything new happened I should respond to, etc. all while working on whatever I'm working on.
With the Windows-ish "maximize the window", that is replaced, almost invariably, by useless window background.
Again, I expect this is mostly what one is accustomed to, and the Macish approach is more idiosyncratic to the user. Works for me.
Edge snapping does not prevent the behavior you describe; sometimes it even makes it easier to do. You can still rearrange and resize the windows manually as you wish. But when you want two edges to fit snugly, it's easy to do.
Also, you can maximize a window for a moment, look at it, and then unmaximize it back to its previous size.
Interesting. Now that you've made me think about it, I've realized that I use Windows more like a single-threaded operating system. I only ever flip between applications (ignoring my second monitor). I never, ever, combine multiple applications on the same monitor - this is crazy because this was the Windows 95 promise. Not that it's a bad thing, I'm used to the workflow and love it.
I absolutely agree with your guess as to why this is.
in Maximized state the window is set to maximal available size (so you are not wasting any part of the screen) while you are still provided with fast and easy access to relevant OS UI elements
in full screen you explicitly tell the system that you don't want to be distracted by OS UI elements (typically in situations where you know that you won't need them for an extended period of time, or if you REALLY need every single pixel of the screen)
I feel your pain. I was using BetterTouchTool to remap the default behavior of that green +, but eventually decided it was silly to use an add-on for something I should be able to change via `defaults`, so I just trained myself to hit Option when I wanted to maximize.
I came from ~18 years of Windows & Linux usage, and everyone always told me macOS is THE OS with the best usability.
But I can't confirm this.
Minimization of windows is shitty and maximization even more so.
When I maximize, often just the height is changed; when I go back to "normal", the height and the width are changed, so I always have to adjust the width manually.
When I minimize a few windows, it's impossible to get back the right one without luck.
I know what you're talking about. Since I've switched back to Linux I can't imagine working without workspaces - each one dedicated to a specific task/app, and always kept in the same order, e.g. workspace 1: browser, 2nd: code editor, 3rd: terminal, 4th: file explorers, etc. I'm so used to it that I automatically use shortcuts to access them immediately - switching between minimized windows using alt+tab is a nightmare.
Windows 10 actually supports this feature, still can't get used to it though... win+tab -> switch between virtual desktops, or ctrl+win+left/right arrows.
and if you have many windows open (in win 10 at least) you don't have to press alt+tab 10 times in a row to choose your desired program, you can hold alt+tab and use arrows.
Yes you can, it was added in the Creators update. Hit win+tab and right-click the window, you can choose to show that single window across all desktops or all windows from the app across all desktops.
You can accomplish (mostly) the same thing with Spaces on Mac. Granted, I don't believe you can really script any of it. All manual, but it still works pretty well for me.
Some days I still really miss dwm, but having Photoshop, Ableton, and several other things Just Work™ makes it worth it.
The problem with spaces is that any time you cmd-tab between spaces, those spaces will be repositioned relative to each other to be adjacent. This also happens whenever a window opens a dialogue that forces you to switch to it. This means that you can't reliably keep spaces in a strict order.
Yes, this is something I do on Linux too, and it's great, particularly when you're doing something that gets quite messy with lots of windows open - when you're working with lots of files, or the program (like GIMP) opens up several windows. If you need to do something else, it's so nice to just leave it all and move onto a nice clean workspace without having to minimize everything.
You do have multiple workspaces on Windows 10... Though if you mean "automatically assign application X to workspace Y", which many Linux WMs are able to do, you're out of luck.
I used to use VirtuaWin[1] on Windows, which adds virtual desktops to it. It's possible it doesn't work in the most recent Windows (although, given Win compatibility, it could just as well work), but until Windows 7 (when I stopped using Windows) it was a life-saver. I used Enlightenment (E16) on some of my computers back then and after working with multiple desktops I just couldn't live without them. I mostly use 3x3 layout, with the main application I work with at the center, and other applications to the sides. Works great for me!
After using a mac for ~6 months now, I finally understand what happened to gnome3/unity, clearly designed by mac users.
I mean it's perfectly usable once I got the basic gestures down and installed the app that lets me independently set the trackpad and mouse-wheel scroll directions, but I will go so far as to say that both Xfce and MATE are objectively better at window management.
Partly I think their hands were tied by the too-late-to-change decision for the always-there contextual top menu bar. Or this is just 25 years of using Win95 clones talking, and I'm set in my ways.
I wouldn't say Gnome is by or for Mac users. In fact, in many ways it is more similar to Metro than to OSX. Gnome 3 just broke the common Windows 95 workflow; then again, many others did before it.
Personally I love it, but it took me way more than 10 minutes to realize why :) IMO it is highly underestimated, mostly because you need to change your workflow, which takes time, but once you do it feels super productive.
You wouldn't judge i3 or other completely different approaches after only a few minutes.
I feel absolutely the same about Gnome 3. One of the biggest things, I think, is that it puts workspaces absolutely in your face, so using them is a much more natural part of the workflow than in Gnome 2. It's also more keyboard-friendly than Gnome 2 (though it still could use some work in this area). Despite being rather large (gnome-shell on Wayland is typically the second-biggest RAM user on my laptop), it feels minimalistic, and is almost always fast, and stays out of the way of whatever I'm working on.
> it feels minimalistic, and is almost always fast, and stays out of the way of whatever I'm working on.
absolutely :)
What I like the most is that you are just a Super+Tab away from basically everything, so focusing on a single thing feels natural and right. There is no way to lose anything either; it's all there. Always.
I'm going to have to give it another go then, though I do love my MATE desktop.
The last time I tried it, I got frustrated when working with a lot of pdf sources - hitting the super key just presented me with a myriad of white rectangles where open windows would frequently rearrange requiring a slow manual search to find the file I was looking for. This can be less of a problem with a taskbar, as the filename is the main identifier, and being 1 dimensional it is easier to scan and preserves its position better.
What's strange on a mac is somehow having the ability to completely lose windows.
Another frustration is trying to view two apps together on the screen at the same time, if one of the apps itself contains multiple windows. I'm not at my desk to try this, but let's say you have multiple Chrome windows open, all with their own tabs, and you want to view your current Chrome window overlaid on a window from another app. To do this you have to manually minimise all of your other Chrome windows one by one so they will all move out of the way, to allow you to switch between apps and view them both at the same time.
Four fingers up, expose.
Drag the windows you need up top into a new desktop.
Command (or control? Or option? Or a combination?) Plus the right arrow to switch desktops.
Try all the combinations until I get to the right desktop or throw the damn thing out a window.
Somehow it is able to open up the device manager and THEN give the window in the background focus when I switch to my editor and back to Xcode with cmd+tab.
So I have to move the Xcode window down to grab that background thing that seems to be part of Xcode.
Classic Mac OS (System 7 was contemporary to Windows 95) didn't really have any concept of minimisation or even maximization as such.
There was no task bar or dock or anything else really to minimise to. IIRC there were addons for 7.1 that added "window shading": a button on the window title bar that reduces the window to just the title bar.[1]
The closest thing to a maximise button in classic Mac OS was more like a size-to-fit button: the application gave the window manager a hint which was the appropriate size for the document displayed, be it a file folder, a word processor document or whatever. Having a single window fill the entire screen wasn't as common as it was on Windows.
None of this was particularly strange to me back then.
Sure it did, namely Windows 1.0, 2.10/2.11 (actually Windows/286 and Windows/386), but very few people used them.
Windows 3.0 was the first one to see some diffusion, but it had very limited capabilities, and its adoption was slow because of the increased PC specifications it required; in any case it was not comparable with the later wide adoption of 3.1.
> Having a single window fill the entire screen wasn't as common as it was on Windows.
Yup! For the longest time I liked to work with a half-width browser window to match my half-width editor & word processor windows. It drove me crazy the number of websites which set their body text to some fraction of the window width, which looked good with a fullscreen window but terrible with a halfscreen one.
Eventually I just gave up. The whole point of the web was device-independent information transfer, but somehow we allowed device-dependence to sneak in.
MacOS System 7 did have a function like minimize for Apps (not individual windows). In the finder menu on the upper right, you could "hide" or "show" a program and use the same menu to switch apps, similar to the task bar in Win95.
macOS has Hide & HideOthers in addition to Minimize. I go weeks at a time without using minimize because of those.
IMO there's a whole generation of people who did their early computing on MS Windows (including myself) and so internalised that that is how GUIs are "supposed to work". When moving to something else later in life there's a feeling that it is "wrong", but it (e.g macOS)'s way of doing things is also correct and is just a divergent evolution to MS Windows. Research, open-mindedness and experimentation are necessary when using something different.
I'm new to macOS. Thank you, I need these tips. The differences are really annoying, but I'm sure there are more hidden things that are useful. I suspect I am not alone.
Thanks, this is the first answer to this question that sort of makes sense from a Windows user point of view. I mean I still prefer the "Windows way", but this is at least viable reasoning for the "Mac way".
Depending on how you've configured your Dock, Minimize is pretty simple. Either the window just moves to the Dock, or it minimizes into the application icon (then you can right-click on the Dock icon to view a list of the windows that are minimized, or click to open the last minimized window)
Maximize on Mac was NOT designed for the window to fill the whole screen, but rather to resize the window to optimally display its contents. E.g. maximizing a Preview window with a PDF document changes the width of the window to the PDF page width. (Good tip for getting along with your Mac: stop trying to maximize everything.)
In a recent macOS version, Apple changed the green "maximize" button to full-screen, which is very different from maximize. Now double-clicking on most window chromes will execute the old maximize behavior.
> Maximize on Mac was NOT designed for the window to fill the whole screen, but rather to resize the window to optimally display its contents.
The problem with this approach is I definitely do not need someone else making the decision of what is "optimal" for me.
I've been using macOS for about as long as I used to use Windows now, and at this point, macOS seems to have largely abandoned the concept, which is great. Applications get either a full screen in their own isolated context, or option-click for a full-screen in a regular windowed context. The options now are a very windows-95/98 like: "either go completely full screen, or resize to whatever you like," which gives me full control of what I find optimal for any given application.
The difference is that Windows never really embraced universal drag and drop and the Mac did.
Macs used drag and drop for file management between windows representing separate locations on disk. Windows users tended to select files and choose cut or copy then navigate to the second location and paste.
The same held true for moving content between documents in an application or moving content between applications. Mac users preferred to use drag and drop, while Windows users relied on copy and paste.
The problem with keeping every window maximized is that you're giving up system wide drag and drop as the primary user interaction method.
That's a great distinction I hadn't thought of before, and it definitely makes sense - if you're focusing on drag-and-drop, you want as many windows visible somewhere on the screen as possible to maximize possible destinations.
Personally, I find drag-and-drop handy sometimes, but it's very constraining. You have to go through non-standard motions to complete any move that is more than trivial, always holding down the primary mouse button and thereby losing your primary way of interacting with the interface. In other words, sure, if you have a clear view of your destination, then yeah, drag and drop is fine, but in all other instances it becomes clunky.
Cut/paste is incredibly quick and doesn't sacrifice usability of your interface or input methods between the two ends of the transaction. Windows seemed to balance this out well, where you could drag and drop most of the time, but you could also ALWAYS cut/paste. I despise that I can't cut/paste in finder. Which is why I use PathFinder instead.
The danger cut/paste DOES pose is that it fundamentally unlinks the start of the transaction from the end. In between, you can do literally anything, which may mean losing track of what's in your paste. Still, I'd call this a fair trade-off, specifically because it is non-destructive for files. You won't lose a file to a forgotten paste. It just stays put.
> In between, you can do literally anything, which may mean losing track of what's in your paste.
On Windows Ditto, and on Linux CopyQ (among others, and there has to be something like that for Mac) solve this problem, by giving you a preview of what's in the clipboard as well as the history copies you made.
I've seen users cut files from one location and forget what they are doing before they manage to find the destination they intended to move those files to.
Then they are shocked later when they paste those files into some random location and can't figure out where they went.
Dragging and dropping does not have that issue. Users find it much easier to learn.
Open the source window. Open the destination window. Drag.
>The problem with this approach is I definitely do not need someone else making the decision of what is "optimal" for me.
And that's the crux. The whole feature (in its original incarnation) rested on the false assumption that there's a singular "optimal" state at any given time.
The Apple Human Interface Guidelines have been a state-of-the-art reference for good UI for a long time, but the part about the zoom button always baffled me, as it directly contradicted several Core Principles laid out in Part I of the book.
The reason this is done is of course that most applications don't have content that fill the entire screen, so maximizing, in most cases, is meaningless - and hinders the usability of the system.
It makes more sense to leave some space over for other apps than have a big empty area on both sides of the screen.
Interestingly, I noticed that it's only Windows-switchers who complain about this. People who've used Mac for a long time don't give this any thought.
> most applications don't have content that fill the entire screen, so maximizing, in most cases, is meaningless
It's a matter of where you place responsibility. It's like saying, "most websites aren't responsive, so naturally it makes sense to restrict the size of your browser window and leave space for other apps." But most would laugh at this and say it's the responsibility of the website/webapp to build a responsive layout. Why should we hold desktop applications to a different standard?
I completely agree that it is probably almost entirely Windows-switchers who complain about it. I'd, obviously, self-aggrandizingly suggest it's because we've tasted something better. People don't complain about the taste of food they've never tasted ;)
>It's like saying, "most websites aren't responsive, so naturally it makes sense to restrict the size of your browser window and leave space for other apps."
I think that's a misrepresentation. It's not a static size. It's not an artificial limit. If the document that's open has content to fill the entire screen, the window will fill the entire screen.
For minimize I can't say. But the maximize works the way it works because (and this is according to the platform ideology, not some general truth) you are not supposed to maximize windows in the Windows sense of the word.
The macOS interface is based around floating and overlapping windows. If you put a window over the whole screen then it could be as well maximized. This gets a bit hairy on smaller screens but really shines on huge monitors. In general macOS is more optimized around having one big screen rather than a multi-monitor setup.
I do not know how the interface looked back then (I was not even born), but I can imagine that at that time a lot in interface design had yet to be discovered.
But seriously, minimising windows is a reflex learnt from Windows. In Windows you often need to minimise one thing to find something else. Especially the desktop. On a Mac you can usually find something more quickly in the dock. In Windows you’re far more likely to have an app maximised by default, and minimising is a natural way to switch tasks. On a Mac, minimising is not a natural way of task switching.
It drove me nuts when they updated the maximize button behavior to full screen.
I use ShiftIt, a neat little Open Source tool that help me manage windows sizes and positions (including minimizing and maximizing): https://github.com/fikovnik/ShiftIt
I've recommended it to pretty much every Mac user I've met.
Haven't used ShiftIt, but I've been using Spectacle to accomplish the same thing. They seem pretty similar, overall. Works pretty well and I've had no issues.
I used Spectacle for years, then was forced to switch to ShiftIt at a new employer. ShiftIt's default hotkey combos didn't conflict with other apps the way Spectacle's did.
No idea... And a few versions ago Apple changed the maximize button to go full screen, which made it useless for the 95% of us who don't use one app at a time. Minimized windows also used to show up at the bottom of the screen when using "Expose", but then Apple changed the name to "Mission Control" and removed them. Just one more example of them slowly but surely driving its Mac customers away.
It depends on the definition of 'at the same time', but I think that many people in a work environment at the very least use an e-mail client and a productivity application at the same time. Also throw in a calendar for good measure.
I think that Apple thought that many regular users would switch to full screen apps on the Mac, combined with Launchpad (it's just like an iPad/iPhone). But virtually all non-tech-savvy Mac users that I know do not use Launchpad, nor fullscreen apps.
I think the problem with Launchpad as with Spotlight search [1] is that they are not very discoverable on the Mac. Having search in an application menu (like recent Windows versions and some Linux desktops) is far more discoverable.
I guess people don't use fullscreen apps because they equate desktops/laptops to the 'WIMP' interface paradigm.
[1] If I received a penny every time I see even experienced Mac users launch applications by clicking on a Dock icon or by navigating to the Applications folder in Finder, rather than using Spotlight, I would be rich.
Only after reading your comment did it occur to me that I should set up applications in the Launchpad the same way I have them organized on my iPhone (or like Win 3.1's "Program Groups" or Start Menu folders) and stop using the Dock to hold the applications I use most often. Maybe that would help keep track of which applications I have open, and where my minimized apps keep going off to!
You sure you are not confusing it with iOS? That's where you always have strictly only one app in fullscreen!
(Yes, yes. It is a lame joke. But who uses only one Window at a time? What is the point of that? Though I remember when I used OS X you could swipe left (or right) to switch back to all other apps. So I think that was good enough.)
That's a very un-Classic-Mac-like way of using windows.
In the olden days, Mac users would tend to have lots of overlapping windows. Dragging anywhere on the window edges would move the window, so it was easy to arrange them as you wanted - almost like shuffling bits of paper on your desktop (strangely enough) and if you wanted to move something out of the way, you could fold it up (window shade). There was no need for maximising or minimising and "zooming" just meant "make this as large as makes sense for this particular document", not "expand to take up all the space on my desktop"
As OSX/macOS has developed, all that document/desktop-style behaviour has been lost.
They're talking about a window sized to the maximum space as if you click and dragged its edges out as far as they can go or just used a tool like Divvy.
You're talking about the "fullscreen" feature which I always found very weird. For example, ever forget your video was "fullscreen"ed as you try to alt-tab to it only to realize it's a 4-finger swipe to pull it back up. Making the user have to differentiate will always be bizarre to me.
Speak for yourself. Spaces in macOS is one of the best window management features I've encountered. Of course, it's been common in Linux window managers, but since I use a Mac for work, its a godsend that spaces and multiple desktops is implemented. I'm a developer, and being able to organize my windows and split them (fullscreen isn't limited to one app per screen anymore) is so freaking useful that I feel lost whenever I try to use Windows again.
> Minimized windows also used to show up at the bottom of the screen when using "Expose", but then Apple changed the name to "Mission Control" and removed them.
You may not recall, but when Win95 came out, there were lines blocks long of people waiting to buy it, much like when the iPhone came out.
I recently sold several shrink-wrapped original copies of Windows 95 on eBay (the OS on 32 3.5” floppy disks), as original pieces of computer history.
Win 95 was monumental and great. Aside from Outlook and Excel, the greatest product MS ever made.
I'm predominantly a Linux user, and I switched away from Mac OSX specifically because of the underlying file system. I'll admit that finding windows was sometimes annoying, but the case-preserving, case-insensitive file system made no sense.
Case-preserving, case-insensitive is good for less sophisticated users: it prevents accidentally misplacing or duplicating files through capitalization, and that constraint simplifies media listing and sorting (no worrying about jpg vs JPG).
When a file refuses to be renamed because the OS thinks the new name is the same. When you have multiple files that differ only in case, and they overwrite each other because the OS treats them as the same file.
Really, the only place case-insensitive filenames make sense is when you are searching. It makes no sense for any other reason.
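The overwrite hazard described above is easy to sketch: a case-preserving, case-insensitive namespace stores the name exactly as you typed it, but compares names case-blind. A minimal Python sketch of just that lookup rule (not HFS+/APFS internals; `CaseBlindDir` is a made-up illustration):

```python
# Sketch of a case-preserving, case-insensitive namespace: names are
# stored as given, but compared case-blind, like HFS+/NTFS defaults.

class CaseBlindDir:
    def __init__(self):
        self._entries = {}  # casefolded name -> (display name, contents)

    def write(self, name, data):
        # Silently replaces any existing entry that differs only in case.
        self._entries[name.casefold()] = (name, data)

    def read(self, name):
        return self._entries[name.casefold()][1]

    def names(self):
        # Case is preserved: you see whatever casing was written last.
        return [display for display, _ in self._entries.values()]

d = CaseBlindDir()
d.write("Readme.TXT", "first")
d.write("readme.txt", "second")   # clobbers Readme.TXT, no error

print(d.names())              # ['readme.txt'] -- only one file survives
print(d.read("README.txt"))   # 'second' -- any casing finds it
```

This is exactly the trade the parent comments are arguing about: any casing finds the file (nice for searching and for casual users), but two names differing only in case cannot coexist.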
Hmmm... You're right of course. I didn't describe that well. I meant Unix file names and way of working with files vs. Windows. In other words, forward slashes, symlinks, mounts, sane permissions, etc. I hate dealing with drive letters, etc.
What's a good name for this?
NTFS/Windows actually has all of that stuff you want, too. NTFS's permission system, for example, is extremely feature-rich and integrates nicely with the user system (ACL support by default rather than as an add-on, for example). The octal user-group-other permissions you're probably used to are pretty crude by comparison.
It's more likely that you're unfamiliar with it than that it's actually missing anything.
But if you want, you can just pretend C:\ is equivalent to / and mount all your other drives at C:\mnt\whatever; that's completely doable (with a GUI to configure it, even).
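The octal-vs-ACL comparison above can be sketched in Python. This is an illustration of the two models, not Windows' actual ACE format; the principals ("alice", "builds", "guests") and rights names are made up:

```python
import stat

# The classic Unix mode is just nine bits: rwx for user, group, other.
mode = 0o640
print(stat.filemode(mode | stat.S_IFREG))   # -rw-r-----

# An NTFS-style ACL, by contrast, is an ordered list of entries, each
# naming a principal and its rights -- sketched here as plain tuples.
acl = [
    ("alice",  {"read", "write", "delete"}),
    ("builds", {"read"}),
    ("guests", set()),                      # explicit nothing-allowed entry
]

def allowed(acl, principal, right):
    """First matching entry wins, loosely like Windows ACE evaluation."""
    for who, rights in acl:
        if who == principal:
            return right in rights
    return False

print(allowed(acl, "builds", "read"))    # True
print(allowed(acl, "builds", "write"))   # False
```

The point of the sketch: octal modes can only say things about three fixed classes of user, while an ACL can name arbitrarily many principals with arbitrarily fine-grained rights, which is the extra expressiveness the comment is pointing at.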
>Even though I've used a Mac daily for the past decade or so, I still miss the task bar, and window-oriented GUI of Windows. I still get frustrated on OSX when I minimize a window and have to hunt around for it. I wouldn't switch back because of the underlying crap that is the Windows OS and file system, but I still miss the interface.
OT but have you tried Witch[1] as a task switcher? It switches between windows, which made my life SO much easier
Wow, looks like a very polished Witch and then some. The addition of search looks interesting. Any area or use case where Witch is better? Is it as snappy? I like that Witch is instant.
I remember using Witch, but a very long time ago. I remember stopping using it, though the reason escapes me.
I've been using Contexts for what feels like a year, and I keep using it. It's that good.
Not sure why you’d have to hunt for minimized windows because there is a separate area on the dock reserved for them, and the animation clearly shows the window moving towards it.
> I still get frustrated on OSX when I minimize a window and have to hunt around for it.
OSX took the NextStep/OpenStep interface and dumbed it down to great detriment, then added new things back (spaces, zooming) which were logically inferior but required less 'thinking' about how the system works and offered more 'shiny looks' to potential customers.
IMHO, hands down the best mouse-oriented window management paradigm to exist to date is the NextStep/OpenStep style, over and above Windows and OSX. Though I will admit Windows has improved things with its sort of hybrid 'classic'-Windows-plus-'Mac-ish' updates, and some of the newer UI things (e.g. window thumbnails) haven't made it into the current flagship of that lineage, which is the open source WindowMaker.
Since the 'official' lineages are dead, I am hoping the WindowMaker people continue to innovate and move this paradigm forward, as they have been doing for the last N years...
> it was by no means assured that Windows 95 would be the success that it was
I remember these times well. It was considered a huge break. People were whining about how stupid the Start menu was compared to just seeing your apps in front of you all the time :-D I love that we eventually came full circle to a Windows 3.1 Program Manager-esque approach with iOS nowadays!
To me Windows is bloatware. But I also make the OSX dock as small as possible and auto-hidden. I launch everything through Spotlight, though, as I abhor unnecessary point and click (synonymous with hitting the Windows key and typing a couple of letters of the application to launch).
I remember Windows 95 as a complete disaster of crashes, data loss, failing installations, incompatible applications, missing drivers and countless other problems which were only fixed with the release of Windows 98 (maybe even SE), which was much much better. I have memories of people sticking with DOS and 3.x, only having 95 as a nondefault boot option in case they wanted to watch the Buddy Holly video or launch the new Encarta cd-rom.
Most of the above are talking about the UI which is arguably separate from the underlying OS. You can like a UI even if it's a UI to a system that crashes a lot :)
> still get frustrated on OSX when I minimize a window and have to hunt around for it.
Learn to use the Dock? The taskbar on Win 95 was evolutionary rather than revolutionary, and the Dock from NeXT was one of the influences. It's the same Dock we have today in macOS.
You can still have personal preference, of course. But if you have trouble using macOS to find minimized windows, that's because you haven't learned to use it, not because it's not possible.
Edit: Found this fantastic PDF "Chicago Reviewers Guide" which goes over all the new stuff in Win95. So much stuff I had forgotten - TrueType fonts, Plug and Play, registry settings, right-click properties, long file names... Basically everything that makes Windows what it is today.
http://tech-insider.org/windows/research/acrobat/940601.pdf