Your Face is the Next Input Device

Written By Brian Hicks

Posted January 19, 2015

If you have a webcam pointed at you right now, this story might make your skin crawl.

It’s a story about the scary future of analytics.

Maybe you want to put a piece of tape over your webcam before we start.

Don’t worry; I know lots of people who do that already.

Maybe it’s because I am of a certain age, but I can totally understand how people would think it’s unsettling to have a camera pointing at you while you’re at your computer. Even when you think it’s off, someone could be surreptitiously watching you.

They’re a huge security vulnerability — hacked webcams have been making headlines for years.

Most recently, the news came out that unsecured footage from thousands of webcams was being accessed and restreamed on a third-party website “to highlight security weaknesses.”

From a single site, users could browse through security cameras across the world and even view video baby monitors.

But this story is a different kind of creepy.

It’s about how computers can tell your emotions through your webcam.

Face Off

Whether we know it or not, we communicate our state of mind through our facial expressions.

If we’re tired, our faces droop and our eyelids get heavy. If we’re frightened, our eyes widen and our pupils dilate. If we’re sexually stimulated, our faces flush and our pupils dilate as well.

It’s a major component of face-to-face communication, and it’s extremely important feedback that could be used to enrich digital interaction.

Now think about the term “interactive” for a moment.

There was a huge push in the late ’90s where every tech company had something it declared to be “interactive.”

But really, what was the interaction? It was the machine seeking a conscious interaction with the human. In other words, it was the technology — computer, video game, TV, or whatever — ASKING for input.

It was interactive, but it wasn’t proactive.

Today, there’s a new interface push on the cutting edge of technology. It’s not a keyboard or mouse or joystick or touchscreen. The new human interface device is the human face.

With automated facial coding, computers can read emotion directly from the face. They can watch facial reactions in real time and change their responses based upon what they observe.
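For the technically curious, here’s a minimal sketch of that loop, assuming OpenCV for the face detection. The classify_expression function is a hypothetical placeholder for whatever emotion model a vendor like Emotient or Affectiva would supply:

```python
# Minimal facial-coding loop: detect a face in each webcam frame, then
# hand the face crop to an expression classifier. OpenCV's bundled Haar
# cascade does the detection; classify_expression is a hypothetical
# stand-in for a real emotion model.
import cv2

face_finder = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_img):
    """Placeholder: a real model returns scores like {'joy': 0.8, ...}."""
    return {"neutral": 1.0}

cam = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_finder.detectMultiScale(gray, 1.3, 5):
        scores = classify_expression(gray[y:y + h, x:x + w])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # ...here the application would change its response based on scores
    cv2.imshow("facial coding", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cam.release()
cv2.destroyAllWindows()
```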

It’s a whole new level of interactive, and tons of companies are competing to bring the tech to market.

Facial Analysis

We’ve got facial recognition everywhere. Even our smartphone cameras can pinpoint and focus on faces or use our faces for biometric authentication.

The next step beyond facial recognition is facial analysis: measuring changes in facial characteristics to indicate emotion.

Companies like Emotient, Affectiva, and Sension each offer computer vision software designed for a different application.

Affectiva, for example, uses a computer’s webcam for prediction and optimization in advertising and entertainment. Based upon facial expressions, it can predict when viewers will skip an advertisement, or it can help build movie trailers around the most emotionally engaging moments.

Sension has applied its facial analytics software to online education to track student engagement and optimize the learning environment. When the software knows the student’s engagement level, it can time questions appropriately and alter its pace to be most accommodating to different learners.
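A rough sketch of what that pacing logic could look like follows; the thresholds and function names are my own illustration, not Sension’s actual product:

```python
# Toy pacing policy driven by an engagement score in [0, 1].
# Thresholds are illustrative, not taken from any real product.
from statistics import mean

def next_action(engagement_history, window=10):
    """Decide how the lesson should proceed from recent engagement scores."""
    recent = mean(engagement_history[-window:])
    if recent < 0.3:
        return "slow down and recap the last concept"
    if recent < 0.6:
        return "pose a quick question to re-engage the student"
    return "continue at the current pace"

print(next_action([0.4, 0.3, 0.2, 0.2, 0.1]))  # -> slow down and recap...
```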

Imagine if your 8 a.m. organic chemistry class could have been re-tooled to be forgiving to your tired teenage mind; maybe your grades would have improved.

The therapeutic applications are vast as well. Affectiva and Sension have both done work on facial-expression recognition for people with autism. Affectiva showcases a Google Glass application that gives the user a live display of the engagement of the person he or she is talking to.

For autistic people who might not be able to pick up on nonverbal facial cues, a head-up application that illustrates interest would be an incredibly helpful tool.

Cameras + Sensors + Data Analysis = The Future

Right now, you probably think of Facebook (NASDAQ: FB) as more of a time sink than a scientific battleground.

But really, Facebook is doing big things in emotional analysis.

In mid-2014, the social network participated in a controversial study about emotional contagion. By manipulating news feeds, Facebook data scientists showed how emotional sentiment could spread between people with no physical interaction.

This study sparked outrage, but it also showed how big data and analytics are really the critical elements in deepening human/machine interaction.

And that level of interaction is going to be extremely important with Facebook’s acquisition of Oculus, a virtual reality company that banks on increased emotional interaction through deep immersion.

Early games optimized for the Oculus Rift’s head-mounted display, such as Amnesia: The Dark Descent and Among The Sleep, rely heavily on creating a sensation of fear in the player.

Fear, it turns out, is actually one of the easier sensations to detect. Researchers at the Korea Advanced Institute of Science and Technology (KAIST) published research in June on a wearable patch they created that can detect goose bumps.

“In the future, human emotions will be regarded like any typical biometric information, including body temperature or blood pressure,” one of the researchers said in a media statement.

The next generation of video games will do well to incorporate more biometric sensors that affect gameplay.

Stanford doctoral candidate Corey McCall crafted an Xbox controller with sensors that can detect excitement through heart rate, respiration, perspiration, and body temperature. With this data available, the game can maximize the excitement for players or provide a useful wind-down to slowly detach users from the game.
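Here’s an illustrative sketch of that kind of biofeedback loop; the sensor scaling and tuning constants are invented for the example, not taken from McCall’s controller:

```python
# Illustrative biofeedback loop: estimate player excitement from sensor
# readings and nudge game intensity toward a target level. All constants
# here are invented for the example.
RESTING_HR = 70.0      # assumed baseline heart rate, beats per minute
TARGET_AROUSAL = 0.7   # where we want the player's excitement to sit

def arousal(heart_rate, skin_conductance, resp_rate):
    """Crude normalized excitement estimate in [0, 1]."""
    hr = min(max((heart_rate - RESTING_HR) / 50.0, 0.0), 1.0)
    sc = min(skin_conductance / 10.0, 1.0)  # microsiemens, scaled
    rr = min(max((resp_rate - 12.0) / 18.0, 0.0), 1.0)
    return (hr + sc + rr) / 3.0

def adjust_intensity(current, heart_rate, skin_conductance, resp_rate):
    """Move pacing up or down a small step to chase the target arousal."""
    error = TARGET_AROUSAL - arousal(heart_rate, skin_conductance, resp_rate)
    return min(max(current + 0.1 * error, 0.0), 1.0)

intensity = 0.5
intensity = adjust_intensity(intensity, heart_rate=95,
                             skin_conductance=6.0, resp_rate=20)
```

The same loop run with the error term inverted would give the wind-down behavior: easing intensity off until the player’s excitement settles back toward baseline.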

When we give an immersive environment the ability to react to our biological state, we finally add real interactivity to the long-promised ideal of virtual reality.

These are scary capabilities for sure, but their combined power is astounding and could thoroughly change our interaction with digital media.

Good Investing,

Tim Conneally

Follow @TimConneally on Twitter

For the last seven years, Tim Conneally has covered the world of mobile and wireless technology, enterprise software, network hardware, and next generation consumer technology. Tim has previously written for long-running software news outlet Betanews and for financial media powerhouse Forbes.
