Video Game Crash of 1983


The video game crash of 1983, also known as the Atari shock in Japan,[1] was a massive recession of the video game industry that lasted from 1983 to 1985. Revenues, which had peaked at around $3.2 billion in 1983,[2] fell to around $100 million by 1985, a drop of almost 97 percent. The crash brought an abrupt end to what is considered the second generation of console video gaming in North America.

The crash almost destroyed the then-fledgling industry and led to the bankruptcy of several companies producing home computers and video game consoles in the region, including Atari, at that point the fastest-growing U.S. company in history.[3] The downturn lasted about two years, and many business analysts of the time expressed doubts about the long-term viability of video game consoles. The video game industry was revitalized a few years later, mostly due to the widespread success of the Nintendo Entertainment System (NES), which was soft-launched in New York City in late 1985 and had become extremely popular in North America by 1987.[4]

There were several reasons for the crash, but the main cause was saturation of the market with hundreds of mostly low-quality games which resulted in the loss of consumer confidence. The full effects of the industry crash would not be felt until 1985.[5]

Causes and factors

The North American video game console crash of 1983 was caused by a combination of factors. Although some were more important than others, all played a role in saturating and then imploding the video game industry.

Flooded console market

At the time of the U.S. crash, there were numerous consoles on the market, including the Atari 2600, the Atari 5200, the Bally Astrocade, the ColecoVision, the Coleco Gemini (a 2600 clone), the Emerson Arcadia 2001, the Fairchild Channel F System II, the Magnavox Odyssey2, the Mattel Intellivision (and its just-released update with several peripherals, the Intellivision II), the Sears Tele-Games systems (which included both 2600 and Intellivision clones), the Tandyvision (an Intellivision clone for Radio Shack), and the Vectrex.

Each of these consoles had its own library of games, and many had large third-party libraries. Likewise, many of these same companies announced yet another generation of consoles for 1984, such as the Odyssey3 and the Atari 7800.[6]

Adding to the industry's woes was a glut of poor titles from hastily financed startup companies. These games, combined with weak high-profile Atari 2600 titles such as Pac-Man and the adaptation of the hit movie E.T. the Extra-Terrestrial, seriously damaged the reputation of the industry.[7][8][9] Finally, Atari's market-leading 2600, then in its sixth year, was approaching saturation.

Competition from home computers

The first microcomputers, such as the Altair 8800 and Apple I, were primarily toys for electronics hobbyists and usually required assembly from a kit. Starting in 1977, however, a new breed of pre-assembled machines with the BASIC programming language in ROM became available, the most famous being the "Trio of '77": the Apple II, the Commodore PET, and the TRS-80 Model I. The latter two retailed for under $1,000, and the TRS-80 benefited from Radio Shack's chain of electronics stores: anyone shopping there could see it on display, whereas many personal computers of the time had to be mail-ordered from the manufacturer. In 1979, Atari unveiled the Atari 400 and 800 computers, built around a chipset originally meant for use in a game console and retailing at prices matching their model numbers. By 1982, personal computer sales were booming: the TI-99/4A and the Atari 400 were both priced at $349, Radio Shack's Color Computer sold for $379, and Commodore had just reduced the price of the VIC-20 to $199 and the Commodore 64 to $499.[10][11][12]

Because these and other home computers generally had more memory available, and better graphic and sound capabilities than a console, they permitted more sophisticated games and could also be used for tasks such as word processing and home accounting. Also, their games were often much easier to copy, since they came on floppy disks or cassette tapes instead of ROM modules (though many of them continued to use ROM modules extensively). The use of a writable storage medium also allowed players to save games in progress, a feature useful for the increased complexity of computer games, and one not available on the consoles of the era.

In a strategy that directly affected its home computer arch-rival Atari, Commodore explicitly targeted video game players in its advertising by offering trade-ins toward the purchase of a Commodore 64 and suggesting that college-bound children would need to own computers, not video games.

Commodore also had a huge competitive advantage in its ownership of a chip fabricator, MOS Technology. Because it could manufacture integrated circuits in-house, the VIC-20 and C64 sold for much lower prices than competing home computers. The result was a ruinous price war that led to the downfall of Atari and Texas Instruments (Radio Shack was not affected, thanks to its own distribution chain, but quickly embraced the IBM PC standard with the launch of the Tandy 1000 and 3000 lines). In addition, European and Japanese computer manufacturers avoided the US market because of the unfavorable conditions Commodore had created, so computers like the MSX and the Amstrad CPC never fully entered the US market.

Despite advertisements extolling the virtues of personal computers over consoles, the home computer market was also dragged down by the video game crash. Gaming had always been a major catalyst in computer sales, but after the crash, gaming was widely seen as passé. Much like the console market, the computer industry suffered from oversaturation and an excess of competing platforms. During the first half of the 1980s, computer sales rose steadily each year, setting a record in 1984 and effectively saturating the market until newer computer technologies spurred demand for upgrades.

The market for home computers was not the only one affected; the swift rise of the IBM PC in the corporate world (aided by the IBM-exclusive spreadsheet program Lotus 1-2-3) spelled the end of Kaypro, Morrow, and other computer manufacturers who targeted the business market. IBM's stranglehold on that market ended when Compaq developed the first legal IBM clone BIOS using reverse engineering and clean-room design. The arrival by 1987 of low-cost Taiwanese IBM PC compatibles from companies such as Acer and Leading Edge ensured the final triumph of the IBM-compatible architecture.


Inflation and arcade coin values

The American game industry lobbied in Washington, D.C. for a smaller $1 coin, closer to the size of a quarter, arguing that inflation (which had reduced the quarter's spending power by a third by the early 1980s) was making it difficult to prosper.[13] Arcade machines in Japan had standardized on ¥100 coins, worth roughly $1, which industry veteran Mark Cerny proposed as a reason for the relative stability of Japan's game industry at the time.[13]

Loss of publishing control

Activision was founded by Atari programmers who left the company in 1979 because Atari did not allow credits to appear on the games and did not pay employees a royalty based on sales. At the time, Atari was owned by Warner Communications, and the developers felt that they should receive the same recognition that musicians, directors, and actors got from Warner's other divisions. After Activision went into business, Atari quickly sued to block sales of Activision's products, but never won a restraining order and ultimately settled the case in 1982.[14] This court case legitimized third-party development, encouraging companies such as Quaker Oats (with their US Games division) to rush to open video-game divisions, hoping to impress both stockholders and consumers. Companies lured away each other's programmers or used reverse engineering to learn how to make games for proprietary systems. Atari even hired several programmers from Mattel's Intellivision development studio, prompting a lawsuit by Mattel against Atari that included charges of industrial espionage.

Despite the lesson Atari had learned in losing its programmers to Activision, Mattel continued to avoid crediting game designers. Rather than reveal the names of its Intellivision designers, Mattel required that a 1981 TV Guide interview with them conceal their identities. ColecoVision designers worked in similar obscurity, feeding more departures to upstart competitors.

Unlike Nintendo, Sega, Sony, or Microsoft in later decades, the hardware manufacturers in this era lost exclusive control of their platforms' supply of games. With it, they also lost the ability to ensure stores were never overloaded with products. Activision, Atari, and Mattel all had experienced programmers, but many of the new companies rushing to join the market did not have enough experience and talent to create the games. Titles such as Chase the Chuck Wagon (about dogs eating food, bankrolled by the dog food company Purina), Skeet Shoot, and Lost Luggage were examples of games made in the hopes of taking advantage of the video-game boom. While heavily advertised and marketed, these games were perceived to be of poor quality and did not catch on as hoped, further damaging the industry.

BYTE magazine stated in December 1982, "In 1982 few games broke new ground in either design or format ... If the public really likes an idea, it is milked for all it's worth, and numerous clones of a different color soon crowd the shelves. That is, until the public stops buying or something better comes along. Companies who believe that microcomputer games are the hula hoop of the 1980s only want to play Quick Profit."[15]

High-profile failures

A core cause of the crash was the failure of two high-profile Atari 2600 titles. In 1982, Atari attempted to capitalize on the craze surrounding the arcade game Pac-Man by releasing a version for the 2600. Development was rushed so that the game would be out in time for the 1982 Christmas season. Although the game sold well in absolute numbers, Atari had grossly overestimated demand, and critics and gamers universally panned the port as nothing like the lively, colorful original. In the end, Atari sold only a little over half the cartridges it produced. Production cost overruns, combined with the costs of a big marketing campaign for the game, resulted in huge losses for Atari. With Atari in full meltdown by the second half of 1983, CEO Ray Kassar was fired and later investigated by the SEC for embezzling company funds, although the charges were dropped.[16]

Atari also released its widely advertised E.T. game, once again manufacturing millions of units in anticipation of a major hit. Concerned about making the Christmas season, Atari again rushed the game to market after a mere six weeks of development. The result is widely considered one of the worst video games ever made.[17][18] According to long-standing rumors, Atari eventually buried its unsold copies in a landfill in New Mexico to clear its inventory. Combined with the high cost of the movie license, E.T. became another financial failure for Atari. Eventually the company's arcade and home divisions were split into two separate companies, the latter taken over by Commodore founder Jack Tramiel and his sons after their departure from that company.

Fallout effects

Immediate effects

The release of so many new games in 1982 flooded the market. Most stores had insufficient space to carry new games and consoles. As stores tried to return the surplus games to the new publishers, the publishers had neither new products nor the cash to issue refunds to the retailers. Many publishers, including Games by Apollo and US Games, quickly folded. Unable to return unsold games to defunct publishers, stores marked down the titles and placed them in discount bins and on sale tables. Recently released games that had initially sold for US$35 ended up in bins for $5.[19] By June 1983, the market for the more expensive games had shrunk dramatically and was replaced by a new market of rushed-to-market, low-budget games.

A massive industry shakeout resulted. Magnavox and Coleco abandoned the video game business entirely. Imagic withdrew its IPO the day before its stock was to go public; the company later collapsed. The largest third-party developer, Activision, survived in part because it also developed games for home computers to offset its console losses, and in part thanks to the then-legal ability to average its income and recover millions of dollars in past tax payments from the IRS. A patent-infringement lawsuit in the early 1990s left Activision financially weakened, and it was bought out by an investment group headed by Robert Kotick; the company still exists, following a merger with Vivendi Games in 2007, and is considered a major video game publisher on personal-computer platforms. Most of the smaller software development houses supporting the Atari 2600 closed.

Additionally, the toy retailers who controlled consumer access to games had concluded that video games were a fad that had now ended, and reassigned the shelf space to other products; as a result, many retailers ignored video games for several years. This was the most formidable barrier confronting Nintendo as it tried to market its Famicom system in the US. Retailers' opposition to video games was directly responsible for Nintendo branding its product an "Entertainment System" rather than a "console", for its use of terms such as "control deck" and "Game Pak", and for its production of a toy robot called R.O.B. to convince toy retailers to allow the system in their stores.[20][21]

Home video game sales dropped considerably during this period, from $3 billion in 1982 to as low as $100 million in 1985, driving many game companies of the time into bankruptcy. Following the release of the Nintendo Entertainment System in 1985, the industry began recovering, with annual sales exceeding $2.3 billion by 1988, 70 percent of which was controlled by Nintendo.[22] In 1986, Nintendo president Hiroshi Yamauchi noted that "Atari collapsed because they gave too much freedom to third-party developers and the market was swamped with rubbish games." In response, Nintendo limited the number of titles that third-party developers could release for its system each year, and promoted its "Seal of Quality", which publishers meeting Nintendo's standards were allowed to use on games and peripherals.[23]

Long-term effects

The North American video game crash had two long-lasting results. The first was that dominance in the home console market shifted from the United States to Japan. When the video game market recovered by 1985, Nintendo's NES was by far the dominant console, leaving only a fraction of the market to a resurgent Atari, which was soon battling Sega's Master System for the number-two spot. By 1989, home video game sales in the United States had reached $5 billion, surpassing the 1982 peak of $3 billion during the previous generation. A large majority of the market was controlled by Nintendo, whose NES had sold over 30 million units in the United States by 1989, exceeding the sales of other consoles and personal computers by a considerable margin.[24] Other Japanese companies soon rivaled Nintendo's success in the United States: Sega with its Mega Drive/Genesis in 1989, and Sony with the PlayStation in 1995. Atari never truly recovered and could not match the success of its competitors or of its own 2600 console; it finally stopped producing game systems in 1996 after the failure of the Atari Jaguar. It was not until 2001, when Microsoft released the Xbox, that a U.S. manufacturer became competitive in the home console market again, albeit with heavy losses throughout the first few years.

A second, highly visible result of the crash was the institution of measures to control third-party software development. Using secrecy to combat industrial espionage had failed to stop rival companies from reverse engineering the Mattel and Atari systems and hiring away their trained game programmers. While Mattel and Coleco implemented lockout measures to control third-party development (the ColecoVision BIOS checked for a copyright string on power-up), the Atari 2600 was completely unprotected, and once information on its hardware became available, little prevented anyone from making games for it. Nintendo therefore instituted a strict licensing policy for the NES that included equipping the cartridge and console with lockout chips, which were region-specific and had to match for a game to work. Besides preventing the use of unlicensed games, the system was also designed to combat piracy, which was rare in the US and Europe but rampant in East Asia.

Several publishers, notably Tengen (Atari), Color Dreams, and Camerica, challenged Nintendo's control system during the 8-bit era by producing unlicensed NES games, and Accolade achieved a technical victory in one court case against Sega challenging such controls, even though it ultimately yielded and signed the Sega licensing agreement. The concept of such a control system remains in use on every major video game console produced today, even with fewer cartridge-based consoles on the market than in the 8/16-bit era. Replacing the security chips in most modern consoles are specially encoded optical discs that cannot be copied by most users and, under normal circumstances, can only be read by a particular console.

Nintendo reserved a large part of NES game revenue for itself by limiting most third-party publishers to only five games per year on its systems (some companies tried to get around this by creating additional company labels like Konami's Ultra Games label). It also required all cartridges to be manufactured by Nintendo, and to be paid for in full before they were manufactured. Cartridges could not be returned to Nintendo, so publishers assumed all the risk. As a result, some publishers lost more money due to distress sales of remaining inventory at the end of the NES era than they ever earned in profits from sales of the games.

Nintendo portrayed these measures as intended to protect the public against poor-quality games, and placed a golden seal of approval on all licensed games released for the system. These strict licensing measures backfired somewhat after Nintendo was accused of antitrust behavior.[25] In the longer run, they drove many third-party publishers such as Electronic Arts to actively support competing consoles such as the Sega Genesis. Most of Nintendo's platform-control measures were adopted by later console manufacturers such as Sega, Sony, and Microsoft, although less stringently. Indeed, consoles from Nintendo's rivals have generally enjoyed much stronger third-party support than Nintendo's, which relied more heavily on first-party games.

On the computer side, the crash marked the end of the 8-bit computer era in North America and the beginning of the 16-bit era, in which virtually every architecture except IBM PC clones and Apple machines disappeared. Computer gaming also shifted its emphasis toward RPGs, sports simulations, strategy games, and adventures, and away from arcade-style games.

Effects on world gaming markets

In Europe, the early years of personal computing (1981–1985) were spearheaded by the very aggressive marketing of inexpensive home computers under the theme "Why buy your child a video game and distract them from school when you can buy them a home computer that will prepare them for university?"[26] Market research for both the gaming and home-computer industries tracked the change as millions of consumers shifted their intended purchases from game consoles to low-end computers that retailed for similar prices but still offered comparable games, while also being useful for other tasks such as word processing, calculation, and programming. As a result, video game consoles had already been largely marginalized in Europe by 1984, leaving the North American video game crash with little effect on the European market.

Although the European home computer market started later than the American one, it took off extremely fast. The Commodore 64 arrived in Europe during 1983, the same year the ZX Spectrum launched in the UK. With disk drives unaffordable for home users there, nearly all software came on cassette tape; Germany and other countries on the European mainland, however, quickly adopted disk drives for their home computers.

Having destroyed most of its low-end competition in North America, Commodore survived there to the end of the decade, with the C64 and Amiga marketed as niche gaming platforms. By that point, however, the company's primary focus was on Europe, where it was a leading computer manufacturer until a series of poor business decisions led to bankruptcy in 1994.

While the US market shifted away from 8-bit computers and cassette storage during the mid-1980s, this was not the case in Europe, where technology moved more slowly and IBM compatibles did not gain a significant foothold. Because video game consoles held only a minor share of the European market until nearly the end of the 1980s, computer gaming became the established tradition there; Nintendo failed to make an impact with the NES, although Sega was quite successful with the Master System and Mega Drive. Overall, however, console games never became as popular as computer games in Europe, and PC compatibles did not become dominant until Commodore's demise in the mid-1990s and the greater availability of non-English-language software.

Japan's home market, meanwhile, was effectively insulated from the rest of the world, and the crash had little more effect there than in Europe. High prices and the small size of Japanese homes meant that personal computers never became widespread, and consoles remained the dominant form of gaming. A number of Japanese computers did achieve fair success, however, such as NEC's machines, the Sharp X68000, and the Fujitsu FM Towns, although the majority of games released for them were adult-oriented titles that would not have met the licensing requirements of Nintendo and other console manufacturers. While Japanese companies did not attempt to sell their computers in North America, some were released in Europe with varying degrees of success.

See also

  • Video gaming in Canada
  • Video gaming in the United States


Further reading

  • DeMaria, Rusel & Wilson, Johnny L. (2003). High Score!: The Illustrated History of Electronic Games (2nd ed.). New York: McGraw-Hill/Osborne. ISBN 0-07-222428-2.
  • Gallagher, Scott & Park, Seung Ho (2002). "Innovation and Competition in Standard-Based Industries: A Historical Analysis of the U.S. Home Video Game Market". IEEE Transactions on Engineering Management, vol. 49, no. 1, February 2002, pp. 67–82. doi: 10.1109/17.985749

External links

  • Article at The Dot Eaters, a chronicle of the Great Videogame Crash
  • Classic Gaming Expo site - biographies and history of the era
  • Official Intellivision History Site - by the original programmers
  • The History of Computer Games: The Atari Years - written by Chris Crawford, a game designer at Atari during the crash
  • Detailed C64 Chronology - events and game release dates (1982–1990)
  • The Great Video Game Crash Of 1983 - Television Tropes & Idioms