How the Walkman and Poisoned Fish Made the Modern Tech Industry

Written By Brian Hicks

Posted May 27, 2015

There’s a famous story about how the video game Space Invaders caused a nationwide shortage of 100-yen coins in Japan.

According to the story, so many people were putting their money in the new video arcade game that the government had to triple production of the coins that the machines accepted.

It’s a good story, but it’s apocryphal. Nobody has ever confirmed whether that shortage actually happened or whether the government really had to produce more coins.  The story keeps getting retold because it’s quaint.

Around that same time, another crazy story was brewing in the tech world, and this one is real. Unquestionably real. It focuses not on a shortage, but on an excess. An excess of batteries.

We’re still living through the story today.

The first Sony Walkman

In 1979, consumer electronics company Sony launched the Walkman. Today we know it as an icon of a bygone era, a fond reminder of 1980s fashion and technology.

But history also recognizes the Walkman as a wonder of miniaturization. It modernized the bulky magnetic cassette player design by shrinking it down to a super-portable 15-ounce package.

With it, people could conveniently listen to their favorite music everywhere they went.

It was a major culture-shifting hit.

Without the Walkman, we’d never have had the mobile tech boom.  The Walkman begot the Discman, which begot the iPod, which begot the iPhone that is the archetype for almost every smartphone today.

Sony expected to sell only 5,000 Walkmen a month in its first year, but the popular cassette player ended up in the hands of 50,000 consumers in its first two months on the market.

The Walkman was powered by two AA batteries, and as it — and other copycat portable cassette players — grew in popularity, so did the consumption of AA batteries.

At the time, miniaturization was the leading trend in consumer technology, and cameras, watches, calculators, and portable games were all eating up dry-cell batteries at a similar pace.

The result was a massive overconsumption of batteries. They'd go into a device, their charge would be drained, and they'd go in the trash. By 1984, Japan was seriously worried about battery pollution caused by consumer technology. It was a battery waste crisis.

Crisis Leads to Innovation

According to a New York Times article from June 1984:

Mercury, a toxic metal used in most batteries, is starting to seep into the soil around garbage dumps. The leakage has raised fears that Japan is slowly being contaminated by the dry cells that power its calculators, cameras, portable stereos and watches.

In 1983, each Japanese citizen used an average of 15 batteries a year. With a population of roughly 119 million at the time, that works out to about 1.78 BILLION dry cell batteries a year.

Understanding why this was a crisis requires a quick history lesson…

Japan in the ’80s was very well acquainted with the terrors of mercury poisoning. A condition known as Minamata disease, mercury poisoning from eating seafood taken from mercury-contaminated waters, killed 2,265 Japanese and sickened more than 10,000 in the decades prior to the battery crisis. The two outbreaks of Minamata disease, in 1956 and 1965, were considered two of the biggest environmental crises in the history of Japan.

The horrors of those outbreaks were still fresh in the public consciousness. The Japanese did not want to risk public health again.

The solution had to come from changing the way batteries were made and used. Maybe different chemistries were needed; maybe better recycling. Whatever the answer, Japanese companies sprang into action to prevent another Minamata-style outbreak.

Sony began working on its own batteries. It had established its Eveready joint battery venture with United States chemical company Union Carbide at around the same time the Walkman debuted.

Responding to the public fear of mercury poisoning, Sony’s scientists strove to create a battery that could be reused instead of thrown away.  The idea of a rechargeable battery wasn’t new, but it was one that needed improvement.

Sony knew lightweight lithium was an excellent battery material. The problem was that Union Carbide was opposed to extracting, storing, and handling the highly reactive metal.

By 1986, Union Carbide had backed out of Eveready, and Sony pursued lithium-based rechargeable batteries on its own.

One year later, its researchers found that a layered compound, lithium cobalt oxide, made an excellent cathode material, yielding a cell with roughly one and a half times the cycle life of the existing NiCad (nickel-cadmium) rechargeable battery.

This was the first lithium-ion battery, an invention considered as important to technology as the transistor itself. It came to market in 1991.

Today, smartphones, tablets, laptops, and cameras of all brands are powered by Li-ion battery cells. Electric cars use them, Tesla’s new Powerwall home battery is made of them, and solar energy is going to be stored in them.

Innovations in electrolyte chemistry promise to improve the thermal profile of new generations of rechargeable mobile batteries, but lithium remains king. Every industry that uses batteries forecasts growth in the consumption and application of lithium, especially as technology shrinks down to the nanoscale.

Think about it: We live in the lithium age, and we have the Walkman and poisoned seafood to thank for it.

Good Investing,


Tim Conneally

Follow @TimConneally on Twitter

For the last seven years, Tim Conneally has covered the world of mobile and wireless technology, enterprise software, network hardware, and next generation consumer technology. Tim has previously written for long-running software news outlet Betanews and for financial media powerhouse Forbes.
