It was a golden age for the first-person immersive simulation game. Wolfenstein 3D, Doom, Quake, Thief, System Shock II, Half-Life, Deus Ex - there was a run of classics with large improvements and innovations from one game to the next, where the game design space was explored just as the technology to render it became available.
First-person immersive sims are generally best played on the PC. When consoles capable of decent graphics came into their own (i.e. once the pace of hardware evolution slowed), third-person control worked better with gamepads, and for commercial reasons PC games often ended up as second-rate ports of console games, with compromised controls. You really need a mouse for first person.
And many of those early games (Wolf3D, Doom, Hexen) didn't have mouse look on by default; you played them "closed" (arrow keys and Right Ctrl/Alt) or "open" (arrow keys and Left Ctrl/Alt).
It really wasn't until Quake/QuakeWorld that FPSes made the shift to mouse+WASD. And even then, in Quake 1, mouse look was off by default.
Improvements in aim assist really make console FPS games more approachable. Still not up to the usability of a mouse, but if the game is thoughtfully designed it meets a "good enough" bar.
I wasn't a big fan of Rare's cursor-aim functionality, though; it felt very clumsy.
Personally I'm a fan of sticky aim assist (Halo) or snap-to-body aim assist (Call of Duty, GTA). Systems like these work well enough that players may not even realize they are being helped.
If the question is which is the more effective control scheme for an FPS, mouse+keyboard is clearly superior to dual analog sticks, especially in competitive matches.
But my point is simply that GoldenEye 64 was a console FPS where the controls weren't clunky (for its time).
I didn't even realize mouse + keyboard was a thing until I downloaded the Killcreek vs. Romero demo. I asked my friend, "How are they looking around so fast?" Until then I had been using Page Up to look up.
I know what FPS stands for, but Thief most definitely isn't a shooter, and System Shock II is less of a shooter than a creepy exploration game. Deus Ex is half an RPG.
FPS is too reductive. I'm using the phrase immersive sim in the same way as these articles:
No, it doesn't really help explain how Doom, Wolfenstein, Quake, etc. are lumped together with games like Thief as simulations. FPS is certainly not reductive. A game can be more than one genre, but sometimes an FPS is just an FPS, and more than half the games you listed are pretty much just FPSes.
They weren't even referred to as FPS games at the time; they were Doom clones. The FPS moniker came along later, once it settled into a genre.
Immersive sim is a genre: games with strong narratives that react to the player, often in significant ways. Some of the games the grandparent listed aren't immersive sims.
As the comments below observe, I think the types and number of experiences built on the back of new (to mainstream audiences anyway) technology were transformative.
In '92 the Amiga 1200 came out: a wonderful computer with great hardware that unfortunately never made it big. Four years later, in 1996, mass-market PCs were shipping with Pentium processors, 3dfx accelerators, high-resolution bitmap displays, internet connections, and powerful general-purpose sound adapters.
Today we have orders of magnitude more computing power, but essentially it is powering the same experiences, albeit far more polished ones.
If we look back four years, to 2014, not much has really changed. The big game releases are largely the same games (in the 90s they'd have been called mission packs).
The magic of experiencing something brand new, completely alien, has gone away. Though I am hopeful VR/AR is going to deliver that again. I don't know how far off it is, but it would be fitting if it's Carmack and Abrash behind it. Again.
I sometimes wonder what could be built using today's tech if it were focused the way the Amiga was back in the day: a single hardware spec and a dedicated OS just for that platform, without regard for backwards compatibility - like BeOS at the time.
What would a revolutionary Amiga 1000 look like in 2018, with modern hardware and the latest OS/system ideas?
Gaming consoles could be considered examples of this, but they are kind of single purpose.
So, if I were attempting something like this today, I would:
- make the graphics preeminent in the system. In other words, keep the graphics card and throw away the legacy CPU. Let it drive PCI devices.
- on its own, doing that makes it much harder to program, so put considerable work into making the instruction set and programmer's model open and well documented
- find a DAW engineer and let them build the audio subsystem with an obsessive focus on low latency. Let's aim for no more than 10 samples latency between input and output processing and see where that gets us.
- full multitasking in which nothing is ever allowed to block anything else unrelated, through resource reservation (q.v. the Nemesis research operating system). Having an Electron app on the system should not impair anything else, and the system's default editor should also be focused on low latency.
- apps are by default fully security-partitioned from each other. The operating system would maintain a CRDT-based, record-oriented personal data storage system, incorporating lessons from PalmOS. This gives both native sync and automatic persistence across power-off.
- low-latency, non-USB keyboard and mouse. PS/2 would actually work, but we could go for something really surprising like gigabit Ethernet or optical TOSLINK.
(Low latency is a good example of a feature that is extremely hard to retrofit; you end up redesigning the system around it.)
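For a sense of scale, here is a hedged back-of-the-envelope sketch (plain Python; nothing here comes from the original post except the 10-sample figure, and the sample rates are just common standard values). Converting a latency budget from samples to wall-clock time shows how aggressive the target is:

```python
# Convert an audio latency budget from samples to milliseconds.
# Illustrative arithmetic only: the 10-sample budget is the figure
# proposed above; the sample rates are common standard values.

def latency_ms(samples: int, sample_rate_hz: int) -> float:
    """Milliseconds of delay represented by `samples` at the given rate."""
    return samples / sample_rate_hz * 1000

for rate in (44_100, 48_000, 96_000):
    print(f"{rate:>6} Hz: 10 samples = {latency_ms(10, rate):.3f} ms")
```

At 48 kHz the 10-sample target works out to roughly 0.21 ms; for comparison, a fairly ordinary 512-sample buffer at the same rate already represents over 10 ms before any processing happens.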
> keep the graphics card and throw away the legacy CPU
If you throw away the CPU, how do you run general purpose programs (of which games are one type)? If the graphics card can run them, why are you not calling it a CPU?
The Amiga had a CPU :) Never owned one, but I understand a key feature was that graphics were very easy to program (mapped to main memory, or so a friend who owned one told me).
Given that, outside of the Switch, the consoles are AMD-made x86 SoCs, even consoles have bowed to the PC.
In hindsight I feel the Amiga's custom chips were actually an Achilles' heel. Yes, they allowed each model to sprint out of the gate, but they also limited how far the user could upgrade it.
In contrast, the only things on the motherboard of a PC of that era (once we hit the 386 and later) were the CPU and RAM. Everything else lived on replaceable boards hooked to a bus. And as the Pentium era showed, even the bus could be "replaced" by placing a new one (PCI) side by side with the old one (ISA).
All this allowed a continuation of sorts, where a humble DOS box could live on into the Win9x era in some way or other.
> ...a single hardware spec and a dedicated OS just for that platform. Without regard to backwards compatibility in OS, like BeOS at the time.
I would argue the closest mainstream machines currently available that meet this requirement are Macs and iOS devices. Sure, there's a handful-ish amount of different configurations, but they do not vary greatly. Backwards compatibility certainly isn't much of a concern in macOS.
I always dislike it when the Amiga 1200 is mentioned as a forerunner of modern computing while the wonderful, albeit buggy, Atari 1040ST, released in 1986, is totally forgotten about.
With its Motorola 68000 (32-bit internal / 16-bit external buses) and color bit-mapped graphics, my ST, hardware-wise, totally blew away the other PCs of its day from a technical POV, and it just annoys me when the Amiga gets all this love and the ST gets none whatsoever.
The ST was not as good as the Amiga; that's why. The only advantage it had over the Amiga was the slightly higher CPU speed, and that's because the Amiga's clock was reduced to keep in sync with its custom hardware - the very hardware it needed to trounce the ST. You only have to look at the range and quality of games on each to see what I mean. ST ports always had that muddy look to them.
I still own and love both :) I mentioned the A1200 because it was basically the apex of that line of computers (and indeed that entire approach to computing).
There are some great videos on YouTube (particularly the Computerphile ones) about the Tramiel-era Atari computers. They're great little machines for learning computer organization: powerful, yet still simple enough that you could fit the whole machine in your head.
The Amiga had a hardware blitter - this made the difference in games. If you played the same game on both systems, the Atari port seemed like a slideshow in comparison.
I'm a musician as well, and I remember its MIDI specs were top of the line. I never really got the chance to use it that way, but thinking about it almost makes me want to pick one up and give it a go...
Not GP, but I agree. The world of PC gaming was moving so fast in the 90s - sound cards, CD-ROM, "multimedia", the internet, 3D accelerators etc. (I appreciate that most of these technologies were invented earlier, but the 90s was when they really came of age and hit mass adoption).
I grew up in that era and agree that it was a golden age, but of course I'm aware that nostalgia plays a big part in that. If you look at, say, 2000-2010 or even 2010 to now it seems things have slowed down - which isn't necessarily a bad thing.
I think a strictly technological view of videogames is very restrictive.
For a variety of reasons, over the last decade (or the last couple), videogames have matured as a narrative medium, which is a radical difference from the 90s. One could argue that, from an overall - including artistic - perspective, the 2010s are the golden era.
I don't think there is a golden age, though. There are still radical improvements to come in the next decade(s), and it will take a long time to be able to judge what characterized video games/development in each era.
Largely because of improvements in graphics, game budgets have gone way up and studios are afraid to take risks. Hence you get mostly sequels, online cash cows, linear on-rails shooters/cinematic experiences that are easy to sell. Content is expensive to produce so studios are afraid of "wasting" it. Yet I was still discovering hidden things in the original Deus Ex 10 years after I first played it.
1997-2004, I think, was the golden age of gaming, when technology was good enough that cool things could be made, yet bad enough that small-ish creative studios could compete. Games were popular enough to make it profitable for studios to make them yet not too mainstream to dumb them down for the lowest common denominator.
System Shock 2, Deus Ex, Metal Gear Solid, Silent Hill 1-3, Resident Evil 1-3, No One Lives Forever, Age of Empires, Warzone 2100, Mafia 1, Half-Life, Syphon Filter, Parasite Eve - in no particular order; I am sure I am forgetting many many more great titles from that era.
I agree; luckily we have games from indie developers to fill that gap. The tradeoff is that you sacrifice good graphics and accept a perpetual alpha/beta development cycle in exchange for gameplay. There have been so many great indie games: Minecraft, RimWorld, Stardew Valley, Terraria, Prison Architect, Rust, Undertale, etc. In the good ol' PC days I used to buy 10-15 AAA titles per year on release day. Now I only buy a couple, and only in a heavily discounted Steam sale.
The last AAA game I bought on release day was SimCity (5). That was a huge disappointment, and I wouldn't do it again. I should have known better.
> [...] small-ish creative studios could compete. Games were popular enough to make it profitable for studios to make them yet not too mainstream to dumb them down for the lowest common denominator.
I would venture that Ninja Theory with "Hellblade: Senua's Sacrifice" fulfills this - they were[1] a small studio with a game you can't really call dumbed down, and it was profitable on a relatively small budget (<$10M).
All good points. You're certainly right about the costs involved in 'triple-A' development. Indie games are out there, though. Some of them even look great, just not photorealistic (such as Limbo).
> Yet I was still discovering hidden things in the original Deus Ex 10 years after I first played it
Ever play the PS2 port?
They had to rework the maps to cope with the PS2's memory constraints. Everything's familiar, but a bit different. Worth a shot if you want another hit.
How many games have you played in the last, say, 5 years?
I suspect that many people in this discussion reference big names because they played in the past but don't play anymore, and therefore talk about what they see advertised.
> Largely because of improvements in graphics, game budgets have gone way up and studios are afraid to take risks [...] 1997-2004 [...] good enough that cool things could be made yet bad enough that small-ish creative studios could compete
Big studios are only a part of the gaming production landscape.
There is a lot going on in the small/indie studio segment; I randomly picked the first link for "best 2017 PC games", and roughly half of it was not AAA.
Nowadays, with the availability of game frameworks, the entry barriers to game development are practically non-existent, to the point that at least one critically acclaimed game was made with... Game Maker (and, in fact, lauded for its narrative).
I agree on the indie games. I enjoyed VVVVVV and Doki Doki Literature Club. Among recent AAA titles, Prey was very encouraging, GTA V was pretty damn good, and the recent Hitman is quite good too. I just think the period I mentioned had a higher number/density of great AAA titles.
Wasn't Vice City the Grand Theft Auto of 2002/03? ;)
More to the original point, the first GTA remains, along with Age of Empires, one of my all-time favourite games. It's easy to forget, in light of the franchise it became, what a novel and crazy little game that was.
What I find absolutely fascinating is that there are only 16 years between the release of GTA1 and GTA5. How could such amazing progress have occurred in such a short timeframe? It's unbelievable.
>* For a variety of reasons, in the last decade (or in the last couple), videogames matured as a narrative medium, which is a radical difference from the 90s. One could argue that, from an overall - including artistic - perspective, the 2010s are the golden era.*
In an era when most mainstream movies are way dumbed down, I doubt games got much more "artistic" and "narrative".
(Especially if one considers the text and graphic adventure games in the 80s/90s).
At best they got some ersatz narrative qualities, but nothing to write home about.
Agreed. Just look at franchises that were around in the 1990s and compare them to their 2010s counterparts. Almost none of them look good in comparison to their antecedents.
Deus Ex -- You play as a cybernetic anti-terrorist operative in a prosaic and cynical vision of the future fueled entirely by the conspiracies of the mid-90s BBS scene. The world is complex, coherent, and fleshed out. You have the illusion that your choices matter.
Deus Ex: Human Revolution -- You play as a super bad-ass private security operative in a neo-Renaissance world fueled by the hyperbole surrounding transhumanism. The developers deliberately made it less complex than the original for the sake of streamlining, used banal pop culture for their "inspirations," and intentionally designed the game so that player choices are irrelevant. [0] It can be boiled down to "We want to make a western Metal Gear Solid."
I'll just grab a quote from a random Fallout 3 retrospective here -- "[...] Fallout 1 and 2 were defined by complex storylines, detailed characters and far-reaching consequences to the player's actions. And that these elements are less prominent in FO3, while faster action and stunning visuals have been brought to the forefront." [1]
Or a Thief (2014) review: "The three major strengths of past Thief titles - wide open mission design, sound propagation and narrative - are this game’s biggest weaknesses. That is a fundamental problem it cannot hope to overcome." [3]
Video games "matured" in the sense that they became more like movies interspersed with interactive segments, but the notion that that made them more artistic is pretty unfounded.
I think an article in PC Gamer back in the day quoted Warren Spector on Deus Ex and its complexities: he would regularly go into his office, sit down, and figuratively bang his head on the desk, asking why they made things so hard on themselves over and over.
The decision that the player should be able to win via multiple paths, be it sneaking, gunplay, or something else entirely, really made it a beast to work on.
Damn it, there is a whole sub-section of one of the maps where you can encounter mobs you normally only encounter toward the later end of the game. And you get there by following up on a missing person and finding a way into the sewers.
That said, I keep coming back to a quote found over at Filfre, where one of the people who worked on the early LucasArts (still Lucasfilm Games back then) titles muses about how game developers have a bad habit of getting distracted by new toys.
Meaning that whenever some new hardware came along that allowed more of something - more colors, more sprites, more anything - they would invariably churn out a mass of shallow action games or similar to show off how many sprites or colors they could make the hardware push. Effectively, the industry gets rolled back by several years of development practice (and I dare say something similar happened with mobile tech when the iPhone was released).
And it may well be that, as AMD and Nvidia keep pushing out new GPUs, we are stuck in a rut of colorful, well-rendered but bland games.
Never mind that, with the gamepad being the more likely input device, many interfaces are hampered (Deus Ex inventory Tetris, anyone?).
Modern video games too often have far more linear gameplay, because AAA budgets mean assets get wasted if they don't get consumed. See the entire CoD series, for example - tedious to play if you're used to a different kind of game.
Many people want a movie-like experience from their game, but that isn't what I played games for. It was more about emergent situations, rather than scripted. That was also what made games replayable - it's very rare for me to want to replay a modern game these days.
Can I ask, what does "assets get wasted if they don't get consumed" mean?
Also, I agree about emergent situations. As a gamer, I'd call myself an "explorer": I'll check every nook and cranny. Often, with modern games, I end up breaking things because I go places I shouldn't yet, instead of following what the developer expected of me.
As for replayability, I agree it has mostly been lost, but I don't feel that's a bad thing. I have a backlog of games on various systems to last me several more years, and I continue to buy games at a pace that will sustain it for some time to come.
Coupled with increasingly less time to game as I get older, I struggle to play some games at all.
Fallout 3 and Oblivion were two of my favs; New Vegas and Skyrim have sat in their plastic wrap since day one, as I haven't found the time to commit to them. Fallout 4 got some 60 hours of my time, compared to Fallout 3, in which I spent over 400 hours.
I don't play COD, or similar games, their "experience" is too shallow and linear, and it's easy to just keep them out of my backlog.
I want a game with a definite ending that I can shelve when I'm done, so I can move on in the backlog. The really great ones get a special "shelf" where they'll come back out, or be given to friends with similar taste.
> Can I ask, what does "assets get wasted if they don't get consumed" mean?
I think the parent poster means this: because each "scene" in an AAA game costs a lot to produce, they want every player to experience it. If it were entirely optional, some players would miss it, and then how could you explain to them why the game was so expensive? So they must see the scene; to ensure this, the game becomes more linear, with fewer optional missions/situations/paths.
Think of it as a big-budget movie: they filmed the action-oriented, CGI-laden scene, so they want you to watch it.
> Fallout 3 and Oblivion were two of my favs, New Vegas and Skyrim sit in their plastic wrap since day one, as I haven't found the time to commit to them. Fallout 4 got some 60 hours of my time compared to Fallout 3 in which I spent over 400 hours.
None of these games are scripted in the same way CoD is. RPGs certainly have a lot of scripting, but they're at the other end of the spectrum - broad scripting rather than deep scripting. CoD has scripted experiences, where almost every detail of a scene is pre-planned, so that if you have two players in two different rooms, and they meet up later to talk about the game, they'll have had similar experiences and the same sequence of events.
It's the deep scripting, for complex cinematic scenes, that the game directors are afraid of players missing. These are what the players are buying. If the players miss out, they get a substandard play experience.
Like the other reply, thanks for explaining. I see what it means now. This is one of the reasons I explore games so thoroughly, don't want to miss anything.
For me, the early 90s, with their new technology, brought a lot of experimentation and new kinds of games. For example, one of my favourite developers, Bullfrog, made games like Populous, Syndicate, Magic Carpet, Theme Park, and Dungeon Keeper. All radically different game styles, and all excellent games.
The late 90s and 00s, to me, were all about dominant single games (e.g. StarCraft, Quake, Half-Life). A lot of it was about having the latest graphics, with big studios winning. And spinoff games were built on an 'engine', which often made them look/feel/respond like the original. You also saw a lot of sequels and series taking over from new game concepts because of this. And most of them were in a few genres (FPS and RTS especially).
I agree it's a good time now though; we are so flooded with different types of games that you have to do something interesting to get noticed.
Really? Currently, I think video games are a terrible way to tell a story. Much worse than a book, tv show, or movie.
The way games are used as a narrative medium is by shoving movies inside them and forcing you to watch them instead of actually playing the game. It's like going to the movie theater, being handed a book, and being told to read the book while the movie is paused between two scenes.
Obviously some games are exceptions, but they are few and far between.
Games are certainly a different way to tell a story, but I wouldn't say "terrible".
> The way games are used as a narrative medium is by shoving movies inside them
If that's how a game is presenting its story, then yes; they likely would be better served by simply making a movie. Games are an interactive medium, but that just means that storytelling in games will be different and use different tools than other mediums, while allowing for entirely new kinds of engagement with a story being told.
The memorable video game stories in my experience have been those that were engaged, collaborative experiences that I felt physically involved in as a player. The wholly emergent stories from Dwarf Fortress, the dynamically simulated open world with a strong interactive narrative in Star Control II, or the incredible physical connection to the tightly presented stories of Brothers: A Tale of Two Sons or What Remains of Edith Finch; these are all different ends of a spectrum of interactive narrative design.
The kind of games you mention are certainly terrible examples of storytelling, but to dismiss the medium as being a "terrible way to tell a story" is to miss out on some of the most interesting interactive stories being told.
The reason I say it is a terrible way to tell a story is not because you can't tell a good story in a video game, but because most stories don't benefit from having gameplay elements thrown into them. In fact, I think it takes away from the story because the storyteller loses control over important things like pacing and the person experiencing the story has to constantly switch between story-mode and gameplay-mode.
Brothers: A Tale of Two Sons is pretty much the only game I can think of that does this successfully, and I don't think it's something that can be done with any story. Books and movies, on the other hand, can tell pretty much any story. They are universal, while video games are limited. I believe that to be inherent to the mediums, and not just because we are still learning how to tell stories in video games.
Emergent stories are not storytelling and have no bearing on video games' effectiveness as a narrative medium. They are stories, yes, but they are not being told. They are being created in real time by the people playing the game, which is quite fun, but it is not storytelling.
> The reason I say it is a terrible way to tell a story is not because you can't tell a good story in a video game, but because most stories don't benefit from having gameplay elements thrown into them.
Interesting that I have exactly the opposite view. Gameplay elements, done well, create a tight feedback loop between the game and the player: an illusion that the player is part of the story, not just an observer.
Like a sibling comment says, in the 2010s videogames have "gone Hollywood". My perspective is similar to yours, but points in the opposite direction: I think videogames in the 90s experimented more and were less focused on graphics, even though of course there were very technical programmers like Carmack. In the 2010s, videogames seemed to treat graphics as the most important improvement; but even worse, every game designer thought of himself as an amateur movie director. Unfortunately, most game designers would make terrible directors. I'm going to single out David Cage of Quantic Dream as someone who writes embarrassingly bad plots for his videogames (Indigo Prophecy, Heavy Rain) and actually calls himself a "director".
The problem with "games as movies" is that most games tell a story that would feel amateurish or childish if told as a movie. But also, as one piece in the Atlantic controversially argued [0], games are a different medium than movies; trying to "tell a story" with a game, in the traditional sense, is failing to take advantage of the medium.
> I'm going to single out David Cage of Quantic Dream as someone who writes embarrassingly bad plots for his videogames (Indigo Prophecy, Heavy Rain), and actually calls himself a "director".
To be fair, "director" has been a title in video game credits for 30+ years. Shigeru Miyamoto (or rather "S. Miyahon") is listed as a director on the credit roll of The Legend of Zelda, for example. I don't know how common it is relative to other titles like "designer" or "planner", but in any case Cage isn't breaking any more ground here than he is in his storytelling.
I agree things are more 'cinematic', but text based adventures were arguably just as 'narrative' if not more so, and these are some of the earliest games in existence.
Then there were the 'graphical adventure' games like King's Quest, etc., which were hugely popular, then things like Tomb Raider, and so on.
Taking these as a lineage, one could say it's really just the presentation that has matured, which would in turn make the argument a 'technological view'.
I would argue that Brothers: A Tale of Two Sons or What Remains of Edith Finch are both examples of games that use inherent features of physically playing a game to deliver their stories in a really unique way.
I would say that kind of thing, finding ways of using interactivity to tell stories in ways that no other medium can, is an example of the field "maturing". (Beyond just "pass this test of skill or strategy to see the next part of the story, presented in an otherwise conventional manner").
I would agree that the 90s had huge fundamental leaps. More recently, on the innovative technology side, you do have VR, SSDs, and steady performance increases across the board. But I would say the biggest leaps have come from better code and asset generation rather than brute hardware improvements. PBR was a massive change in the way assets are generated and in how real they can look and feel. Likewise, games are reaching really impressive levels of mechanical fidelity and visual depth. GTA V still looks amazing today, and that's down to the sheer amount of unique assets and texture variance in the game. Games now tend to flourish on gameplay depth, scale, and freedom over ever more impressive visuals. Which is great.
It was my teenagerhood, of course it was a golden age.
More seriously, the rate of technical change in games development was huge during the 90s. We went from 3D being barely possible without textures to Half-Life 1's cinematic-style monorail introduction (which was visually better than most of the "full motion video" we'd seen in games up to that point). Graphics accelerators became a consumer product.
The development of the tech also made a wide open space for innovation in art direction - each new game looked noticeably different as well, in a way that's been attenuated among all the brown gritty shooters of today.
Doom also popularised multiplayer gaming (not really a thing on PCs at that point, and limited to splitscreen on consoles). And it was the first game I encountered that really embraced modding - the Doom editors allowed the reuse and remix of the game's assets into your own levels and all sorts of strange Doom-flavoured experiences.
Doom was the first networked PC game I played, and one of the first of its kind. It opened up a world of connected gaming that lasted at least a decade.
It's not surprising at all - what comes after the very early stages of development in many areas of science and technology could indeed usually be called "a golden age." (In math it was the period of around 100 years after Newton; in physics, the second half of the 19th century plus the first half of the 20th.) As for computers and software, we are now also well past the golden age, unfortunately...
For PC gaming, the move from pre- to post-Voodoo cards was a revolutionary leap I have yet to see again in that space. Everything since has been evolutionary. Carmack and glQuake were among the things leading the charge. Each new game, new video card, and new processor at the time brought so much new power and excitement. It was the golden age.
CoD 4 is widely recognized as the last good CoD and, arguably, the best one. CoD was and is huge - not among your peers, but among youth. We're no longer in "the gaming scene" as old farts.
> Games from that era are still fun and enjoyable to pick up and play.