New computers can gauge users’ boredom

By SCOTT ZHENG | March 10, 2016

Brian Kerrigan/CC BY-SA 3.0: Computers can tell how engaged you are by reading and quantifying your body language.

Imagine that you have just gotten home after a long day of classes. It’s a Tuesday afternoon, and you plop down on your couch, log on to Facebook, browse the first six pages of Reddit and then open up some lecture notes. The next thing you know, you wake up and realize you’ve been asleep for three hours, and all of a sudden you remember that you have a midterm tomorrow and a paper due on Friday. This scenario may soon be a thing of the past, because new research could lead to computer screens that keep you alert when you need to be.

A research team led by Dr. Harry Witchel, Discipline Leader in Physiology at the Brighton and Sussex Medical School in Brighton, England, has discovered that computers can tell how engaged people are with what is on the screen by reading and quantifying their body language.

When people are uninterested in what is on their computer screens, they generally exhibit very small, involuntary movements called non-instrumental movements, such as fidgeting with the computer’s mouse. The opposite is also true: When someone is totally engaged and absorbed in what is displayed on the screen, these non-instrumental movements stop.

In the study, participants held a trackball to minimize their instrumental (deliberate) movements while they watched three minutes of on-screen content that varied in how stimulating it was, ranging from intense computer games to monotonous readings. The subjects’ non-instrumental movements were measured through video motion tracking. During the more engaging activities, these tiny movements decreased by more than 40 percent.
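
Witchel’s team used a dedicated video motion-tracking setup, but the basic idea of quantifying fidgeting can be illustrated with a short script. The sketch below is a minimal illustration, not the study’s actual pipeline: it assumes OpenCV is available, uses hypothetical file names and scores movement simply as the average frame-to-frame pixel change in a recording of the viewer.

    import cv2
    import numpy as np

    def motion_energy(video_path: str) -> float:
        """Return the mean frame-to-frame pixel change for a video clip.

        Higher scores mean more movement on camera; in the study's terms,
        an engaged viewer should score lower than a bored one.
        """
        cap = cv2.VideoCapture(video_path)
        ok, prev = cap.read()
        if not ok:
            raise ValueError(f"Could not read video: {video_path}")
        prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

        diffs = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Average absolute intensity change between consecutive frames
            diffs.append(cv2.absdiff(gray, prev).mean())
            prev = gray
        cap.release()
        return float(np.mean(diffs))

    # Hypothetical recordings of one participant in the two conditions
    boring = motion_energy("participant_reading.mp4")
    engaging = motion_energy("participant_game.mp4")
    print(f"Movement during the game is {engaging / boring:.0%} of baseline")

Under a measure like this, an engaged viewer would produce a lower score than a bored one, mirroring the more-than-40-percent drop the researchers observed with their tracking system.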

“Our study showed that when someone is really highly engaged in what they’re doing they suppress these tiny involuntary movements. It’s the same as when a small child, who is normally constantly on the go, stares gaping at cartoons on the television without moving a muscle,” Witchel said in a press release.

This discovery could lead to a variety of artificial intelligence applications in the future, such as self-adapting online tutorial programs that change their style of tutoring to best suit the user’s interests. Another potential development that could stem from this research is companion robots that can read how engaged a person is and provide them with tailored support. Witchel believes that such technology could be developed in the near future.

“Being able to ‘read’ a person’s interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process,” Witchel said. “Further ahead it could help us create more empathetic companion robots, which may sound very ‘sci-fi’ but are becoming a realistic possibility within our lifetimes.”

This discovery could also be useful to movie directors, since it offers a way to objectively observe whether viewers are engaged. Rather than subjectively asking viewers which scenes they found interesting, directors could quantify which scenes were the most absorbing and which needed more work by tracking viewers’ body language. The same technology could be applied to video games to make them more absorbing and interactive.

