I'm teaching introductory psychology and working on my prelims (dissertation proposal), so I honestly shouldn't even be wasting time on this, but I need to get my fingers moving and some ideas flowing. So here's one I've been chewing on for a while (even though I know I have a cliffhanger on my decision-making post).
Video game design faces a fascinating problem that you simply don't see in human-computer interaction or human factors: the issue of challenge. (Perhaps outside the realm of error prevention, where you want to make screwing something up difficult to do or easy to undo. But even then, it's a much more discrete and categorical issue.)
Normally, good design in software and technology means that the user doesn't have to think. A good design anticipates the user's needs and expectations and hands them what they want on a silver platter. Unfortunately, that's almost exactly the opposite of what the core gaming audience wants. (A metaphor I'll admit was inspired by this rather apt comic floating around the web:)
I did my damnedest to find out where this came from to give credit where credit's due, but couldn't find a source. If you know who made it, leave a note in the comments.
A better way to distinguish the hardcore and casual markets, I believe, is the time investment required to appreciate the core experience of what makes the game fun. Angry Birds is a casual game because it takes very little time to appreciate the fun in chucking stuff at other stuff to make it fall down. Whether you're good or bad at that is another issue entirely. Compare that to Skyrim, where simply learning all the options available for creating your character is at minimum a half-hour investment in and of itself. You can play either game in short bursts or in multi-hour marathon runs, and even log the same number of hours on both, but the initial cost of entry is dramatically different.
As a side note to my side note, I thought about how games like Super Meat Boy (which Steam brazenly calls "casual") and Smash Bros. fit into this scheme, as they're easy-to-learn but difficult-to-master games. Like chess, there's a low cost of admission, but there are depths to them that most people will never see. You can jump into those sorts of games and have fun immediately, but there's still a question of how fully you've experienced the game after a limited time. But that's another discussion for another day.
Anyway, I digress. The issue I wanted to talk about here is that applying human-computer interaction principles to video games is tricky. On the one hand, following the principles of good design is necessary for a comfortable and intuitive experience. On the other, it's possible to over-design a game to the point of stripping it of exactly what makes the experience fun.
I'm going to use two super simplified examples to illustrate my point.
One obvious place you want to remove obstacles for the player is in control input. Your avatar should do what you want it to, when you want it to, and the way you expected it to. Many an NES controller has been flung across the room over games where an avatar's reflexes (seemingly, at the very least) lagged behind the player's. (Or maybe that's just a lie we all tell ourselves.) A slightly more recent example is the original Tomb Raider franchise (back when Lara Croft's boobs were unrealistically proportioned pyramids). Lara had a bit of inertia to her movement, never quite starting, stopping, or jumping immediately when you hit a button, but rather with a reliable delay. You learned to compensate, but that kind of delay limits the sort of action you can introduce to the environment, since it's impossible to react to anything sudden. Bad control is not only frustrating; it limits the experiences your game can provide to players.
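To see why input delay constrains design and not just comfort, here's a back-of-the-envelope sketch. The numbers and function name are mine, purely for illustration - the point is that lag adds directly to the player's reaction time, so every hazard has to telegraph itself from farther away:

```python
# Back-of-the-envelope check: with fixed input lag, how close can a hazard
# appear and still be avoidable? (Numbers are illustrative, not measured.)

def min_warning_distance(speed, reaction_time, input_lag):
    """Distance at which a hazard must appear for the player to dodge it.

    Anything closer is an unavoidable hit: the avatar cannot respond
    until the player's reaction time plus the input lag have elapsed.
    """
    return speed * (reaction_time + input_lag)

# An avatar moving 10 units/sec, human reaction time roughly 0.25 s:
responsive = min_warning_distance(10, 0.25, 0.0)   # no input lag
laggy      = min_warning_distance(10, 0.25, 0.3)   # 300 ms of "inertia"

print(responsive)  # 2.5 units of warning needed
print(laggy)       # ~5.5 units -- hazards must telegraph twice as far ahead
```

With a 300 ms delay, the minimum warning distance more than doubles, which is exactly why a laggy avatar rules out anything sudden in the level design.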
Underlying this principle is the fact that humans generally like feedback. We like to know our actions have had some effect on the system we're interacting with, and we want to know it immediately. When your avatar isn't responding to your input in real time, or you don't know whether you're taking damage or dealing it, or you're otherwise unsure of what your actions are accomplishing, that's just poor feedback. This is a foundational concept in HCI and human factors, and it has a huge impact on how well a game plays. Just look at Superman 64 or Atari's classic E.T., which is currently filling a patch of earth in the New Mexico desert. People complained endlessly about how unresponsive the controls felt and how the games simply failed to convey what you were supposed to do or how to do it.
Ironically, the game appeared to be all about getting out of holes in the ground (though no one is entirely sure - we think that's what's going on).
The trickiness comes in when trying to distinguish what's a frustrating obstacle to actually enjoying the game and what's an obstacle that makes the game enjoyable. You want to present the core experience of the game to the player and maximize their access to it, but at the same time manufacture a sense of accomplishment. It's an incredibly difficult balance to strike, as good games ("hardcore" games especially) stand on the precipice of "frustrating." You want to remove obstacles to what makes the game an enjoyable experience, but you also risk removing the obstacles that create the enjoyable experience.
I believe no one is more guilty of this right now than Blizzard. It's been immensely successful for them, tapping into the human susceptibility to variable ratio reinforcement schedules, but the core gaming crowd doesn't talk about Blizzard's games these days with affection.
Blizzard has successfully extended a bridge from what were hardcore gaming franchises into Casual-land. Pretty much anyone can pick up Diablo III or World of Warcraft and experience the majority of what makes the game fun: killing stuff for loot. But if you listen to the talk around these games, you'll find they're regarded as soulless and empty experiences. So what went wrong?
Blizzard's current gen games follow a core principle of design, which is to find the core functional requirements of your product and design everything around gently guiding your users toward them without requiring effort or thought. The one thing that keeps people coming back to Blizzard games is their behavioral addiction to the variable ratio reinforcement of loot drops for killing things.
Blizzard wants you to keep coming back, and they have clearly taken steps to optimize that experience and minimize obstacles to accessing it. Your inventory in Diablo III is bigger than ever, and returning to town incurs absolutely no cost - two things that interrupted the flow of action in previous incarnations of the franchise. Having to spend resources to return to town just to make room for more stuff is an incredibly tedious chore that keeps you from doing what you'd rather be doing in the game: killing demons in satisfyingly gory ways. Hell, one of my favorite features of Torchlight, Runic Games' Diablo clone, is that you can have your pet run back to town to handle all those tedious housekeeping duties that normally pull you out of the demon-slaying action.
Torchlight II even lets your pet fetch potions for you from town.
If you look at these games, everything is geared toward keeping you killing monsters for loot, much like the design of a casino is geared toward keeping players at the slots. Sure, there are a lot of accessories and side attractions, but they're all part of, or related to, getting you into the Skinner box. The ultimate side effect is that the game feels like a one-trick pony. You practically habituate to the killing, and soon what used to be fun simply becomes tedious. And yet the periodic squirts of dopamine you get from blowing up a crowd of monsters or picking up a legendary piece of equipment keep you doing it past the point where you really feel like you're enjoying yourself.
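The Skinner-box mechanic above is simple to sketch in code. This is my own toy model of a variable-ratio schedule, not anything from an actual loot system - each kill independently rolls for a drop, so rewards arrive after an unpredictable number of responses, which is precisely the schedule behaviorists found produces the most persistent behavior:

```python
# A minimal sketch of a variable-ratio reward schedule -- the loot-drop
# logic behind the Skinner-box feel. Function names and the 5% drop
# chance are invented for illustration.
import random

def kill_monster(drop_chance=0.05, rng=random):
    """Each kill independently rolls for loot: a variable-ratio schedule."""
    return rng.random() < drop_chance

def kills_until_drop(drop_chance=0.05, rng=random):
    """Count kills until the next drop (geometrically distributed)."""
    kills = 1
    while not kill_monster(drop_chance, rng):
        kills += 1
    return kills

rng = random.Random(42)  # seeded so the sketch is reproducible
samples = [kills_until_drop(0.05, rng) for _ in range(10_000)]
print(sum(samples) / len(samples))  # ~20 kills on average, but any single
                                    # drop can come on kill 1 or kill 90
```

The average is predictable, but no individual drop is - and it's that unpredictability, not the average payout, that keeps the lever getting pulled.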
Granted, this was made before any major updates.
Like their behaviorist heroes, the Blizzard (Activision?) business team doesn't seem to care about your internal experience; they just care about your overt behavior: buying and playing their games. I personally don't think this is a sustainable approach for the video game industry as a whole; it would be akin to food manufacturers putting crack cocaine in everything they make so that people keep buying their products. Your addicted consumers will continue giving you their money, but they'll hate you for it and tear themselves up over it in the process. And that's fine if you don't have a soul, but I think that's hardly what anyone wants to see.
Anyhow, the point is, these games are not creating the obstacles that the hardcore crowd expects. A good game, hardcore or casual, rewards you for mastering it with a sense of accomplishment. A major problem is that the hardcore crowd requires much more challenge to feel the same sense of accomplishment. Like junkies, they've desensitized themselves to (and perhaps developed the skills necessary to overcome) typical game challenges. Just to complicate things more, it's not as simple as making enemies deadlier - think monsters that randomly turn invincible or become exponentially stronger after a time limit, as in Diablo III's Inferno mode. Blizzard operationalized challenge as "how often your character dies," and they had to overhaul Inferno mode because everyone hated it - it was a cheap means of manufacturing an artificial challenge.
Oh, fuck you.
Challenge in video games is a hugely difficult problem. A good challenge is one in which the solution is foreseeable but not obvious, difficult but attainable, and furthermore provides a sense of agency to the player. I believe these are features all great games - that is, the ones you return to and replay for hundreds of hours instead of shelving immediately after completion or (gasp!) boredom - have in common. When the player overcomes the challenge, a good game leaves you with the feeling that you did it by mastering some new skill or arriving at some insight you didn't have when you started - not because you grinded (is that correct syntax?) your way to some better stats or more powerful equipment.
Not a simple problem to solve.
One potential solution I believe already exists is flexibility - providing an optimal experience for multiple tiers of players - though even this is only a step toward the answer. Flexibility traditionally took the form of adjustable difficulty levels, but that seems like a clunky approach. A player may not know which tier of difficulty is best for them, and the means of modulating the difficulty can very easily feel cheap (like making monsters randomly turn invincible). That's often where sudden difficulty spikes come from: a developer introduces some new obstacle to crank up the difficulty without any sense of scale or calibration for the player. Another reason why aggressive user testing is necessary and important.
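One direction past fixed difficulty tiers is to adjust difficulty continuously but gently. Here's a toy sketch of that idea; every name and tuning constant is invented for illustration, not taken from any shipped game. The key design choices are small steps and hard clamps, which are exactly what rule out the sudden, uncalibrated spikes that feel cheap:

```python
# A sketch of smooth, bounded dynamic difficulty adjustment.
# All tuning constants here are invented for illustration.

def adjust_difficulty(current, recent_deaths, target_deaths=1,
                      step=0.05, lo=0.5, hi=2.0):
    """Nudge a difficulty multiplier toward the player's skill level.

    Small steps and hard clamps keep the adjustment gradual, avoiding
    the sudden spikes that make scaling feel cheap.
    """
    if recent_deaths > target_deaths:      # player is struggling: ease off
        current -= step
    elif recent_deaths < target_deaths:    # player is cruising: push back
        current += step
    return max(lo, min(hi, current))

d = 1.0
for deaths in [0, 0, 0, 3, 3, 0]:  # a session log: cruising, then a wall
    d = adjust_difficulty(d, deaths)
print(round(d, 2))  # 1.1 -- ramped up while cruising, eased off at the wall
```

Even a scheme this simple needs the calibration the paragraph above calls for: the step size, the target death rate, and the clamps all have to come from watching real players, not from a designer's guess.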
I'm not gonna lie, I have some opinions on methods for overcoming this problem. But like the Joker says, if you're good at something, never do it for free.
Anyway, I should get back to my prelims.