Arcade Fire | Small Planet

The UX patterns established in 1980s-era video game arcades are all around us today.

Small Planet
6 min read · Mar 1, 2023

We can trace so much of our relationship with digital devices back to the video arcades of 40 years ago. Seriously. In fact, it’s difficult to talk about dark or gray patterns in UX today without referencing the subtle (and not-so-subtle) patterns that began in gaming’s origin story.

Pitfalls and Pole Positions

Darkened, buzzy rooms in shopping malls of varying reputations provided the first mass experience with consumer digital products. For a price, of course. The arcade environment rested on a very basic financial premise: feeding quarters into a machine. That was the goal, and it was a huge cash cow for arcade owners, who were essentially offering up games designed to extract money from kids.

Arcades provided no save states and maximum visual and aural stimulus. There were few areas for socialization, and no chairs (unless they were part of the game). The arcade games themselves, even the earliest ones, had a cinematic flair in their branding and narratives. Characters, colors, and marketing all blurred together. Even then the crossover appeal of properties was real (looking at you, Tron). You as a player were part of it all, taking the first step into an immersive digital experience.

The patterns around the gameplay itself incentivized playing longer. Each time you died (which was spectacularly easy to do), you learned pattern recognition. To progress, to become the best, or to complete a game, you had to keep feeding in quarters to keep practicing.

Dopamine hits came along with loud noises of approval and pixelated mayhem. It was really stimulating, especially for kids. It was also slightly predatory because the incentives around getting better included a certain social status. Being the best was very real in that world.

Pay-to-play is everywhere

You can trace a lot of modern digital interactions back to arcades. Like, say, the whole idea of “pay-to-play.” Many older games, once you know the patterns and have memorized the play drill, are really only 60 minutes long. But reaching that kind of excellence and speed-run pace takes countless hours and countless quarters.

That kind of cash-based “improvement incentivization” is everywhere now, ingrained into mobile and console games of all stripes. In-game stores feel almost mandatory, offering more weapons, coins, fuel, levels, mods, expansion packs, and add-ons to make your gaming journey better.

In-game purchasing is essential to the economic model of game development studios. That’s understandable, but more than a little depressing. You become part of a content cycle, which can make you feel like you’re being ceaselessly hustled after your initial purchase.

Everything autoplays

Another good example is Netflix’s Next Episode autoplay feature. Arcade games always prompted you to continue after unfortunate encounters with asteroids or millipedes, never offering a hard break or a clean restart as long as you were willing to engage further with time and money.

HBO doesn’t require quarter insertion to watch the next episode of “The Last of Us” (though that would be both hilarious and horrible if it did), but the principle is still firm: keep the game going, whether with episodes of this show or a recommended one in the same genre.

On one hand, it’s very convenient. If you’re planning to watch more than one episode, you don’t have to repeat the same user flow to select the next one. It’s already queued for you, and you can back out if there’s something else you need to do or you’re finished watching.

On the other hand, it very easily enables binge-watching, which definitely has downsides. We all do it, but now it’s easier than ever. That one reads more as a “gray” pattern than a dark pattern, because the harm is murky.

The gamification of learning

I was obsessed with Duolingo because of how they gamified the experience. But I realized that aside from the vocabulary, I wasn’t really learning French or how to converse and listen in a way that would help me if I were to go to France. I also noticed the mechanics involved in keeping streaks going, not starting over, and staying in the game … so old-school.

Also, I really wanted to keep my Duolingo diamond-level status. I became focused on earning points more than actually learning the language, and so I would forgo the more challenging lessons to earn more points … those quick hits that I knew would help me maintain my rank. I then got so burnt out on earning points that I stopped using it. Now, I’m paying ~$70 a year for an app that I don’t even use.

Video game prompts can be effective in teaching someone a language, but they tap into that dopamine loop, too. At what point does that become detrimental for the user? Does the benefit of learning the vocabulary outweigh the effects of constant notifications and the fear of losing your streak? Or the cost incentive (keep those quarters coming) to pay to protect your streak?

Casino takeovers

One of the most obvious direct descendants of the arcade is the casino floor. Never a stranger to overstimulation, gambling establishments have long understood the UX and “keep them playing” ethos of Frogger’s ’80s heyday.

What’s so stunning is how quickly the shift from analog to digital occurred in slot machines and related games. The mimicry of video game aesthetics is amazing: fast-moving action, overblown visuals when you win, downbeat tones when you lose, and the crossover of characters to other entertainment IPs. Ditto for arcade aesthetics: no windows, garish neons, analog buttons connecting to digital actions, and a vague yet palpable sense of status when you win.

Pattern recognition

I spent a little time in arcades, but they were on their way out by the time I was the target audience, the audience that played at home. My game growing up was Donkey Kong Country. I sank years of my life into that game. Now I can get through the whole thing very quickly, because I developed this excellence around a game that just spoke to me.

This is where context really comes into play, because the amount of time required to learn, excel at, and complete a game can be viewed as a dark pattern: the player’s time becomes a resource to be fully leveraged.

But it’s based on the player’s experience as well. I loved that game. Is it a dark pattern if I’m enjoying myself, but I’m sinking a lot of time into this one game? I would argue not, but I’m sure there are others that would say yes.

Even defining “dark pattern” is a challenge, since everything depends on context and transparency. A pattern that one individual perceives as helpful, one that even the company may perceive as helpful, can carry unintended consequences that create harm in the short term or the long term.

As designers and developers, whether in the gaming industry or another, it’s important to take a step back and consider the purpose of common conventions and their origins. How did this pattern come to be? Is there a better way? Is this in service of the business at the expense of the user? Are these choices ethical? What are the potential unintended consequences? The list goes on.

By asking ourselves these questions, we become advocates for the user. And by knowing the history of seemingly common conventions, we can better understand how to break from them, or better yet, improve upon them for our fast-evolving digital world.

Originally published at https://smallplanet.com on March 1, 2023.

