New AI-Based Software Turns Any Smartphone Into an Eye-Tracking Device
For the past 40 years, eye-tracking technology, which can determine where in a visual scene people are directing their gaze, has been widely used in psychological experiments and marketing research. But it has required pricey hardware that has kept it from finding consumer applications.
Researchers led by an Indian-origin scientist have developed software that can turn any smartphone into an eye-tracking device, an advance that could aid psychological experiments and marketing research. In addition to making existing applications of eye-tracking technology more accessible, the system could enable new computer interfaces or help detect signs of incipient neurological disease or mental illness.
Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory and the University of Georgia hope to change that with software that can turn any smartphone into an eye-tracking device. They describe their new system in a paper they are presenting on June 28 at the Computer Vision and Pattern Recognition conference.
“The field is kind of stuck in this chicken-and-egg loop,” says Aditya Khosla, an MIT graduate student in electrical engineering and computer science and co-first author on the paper. “Since few people have the external devices, there’s no big incentive to develop applications for them. Since there are no applications, there’s no incentive for people to buy the devices. We thought we should break this circle and try to make an eye tracker that works on a single mobile device, using just your front-facing camera.”
The researchers built their eye tracker using machine learning, a technique in which computers learn to perform tasks by looking for patterns in large sets of training examples. Their training set contains examples of gaze patterns from 1,500 mobile-device users, Khosla said. Previously, the largest data sets used to train experimental eye-tracking systems had topped out at about 50 users.
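The article does not spell out the team’s network design, but the general recipe is standard supervised learning: camera frames of the user’s face go in, and a model is trained to predict where on the screen the user is looking. The sketch below, in PyTorch, is purely illustrative; the model, layer sizes, image resolution, and training loop are assumptions, not the researchers’ actual architecture.

```python
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    """Toy CNN mapping a face crop to an (x, y) gaze point on the screen.
    An illustrative stand-in, not the model used by the MIT/UGA team."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, 2),  # (x, y) gaze location in screen coordinates
        )

    def forward(self, face):
        return self.regressor(self.features(face))

model = GazeNet()
loss_fn = nn.MSELoss()  # penalizes distance from the true gaze point
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on a dummy batch: 8 face crops with known gaze points.
faces = torch.randn(8, 3, 64, 64)   # stand-in for camera frames
gaze_targets = torch.randn(8, 2)    # stand-in for on-screen dot positions
loss = loss_fn(model(faces), gaze_targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```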
In their paper, the researchers report an initial round of experiments that used training data drawn from 800 mobile-device users. On that basis, they were able to get the system’s margin of error down to 1.5 centimeters, roughly a twofold improvement over previous experimental systems.
They have since collected data on another 700 people, and the extra training data has reduced the margin of error to about a centimeter.
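The margin-of-error figures above are distances on the screen. The article does not give the exact evaluation protocol, but one common way to arrive at such a number is the average Euclidean distance, in centimeters, between predicted and actual gaze points, as in this small sketch:

```python
import numpy as np

def mean_gaze_error_cm(predicted_xy, actual_xy):
    """Average Euclidean distance, in centimeters, between predicted and
    actual on-screen gaze points -- one plausible way to compute the kind
    of 1.5 cm / 1 cm figures quoted above."""
    predicted = np.asarray(predicted_xy, dtype=float)
    actual = np.asarray(actual_xy, dtype=float)
    return float(np.linalg.norm(predicted - actual, axis=1).mean())

# Three predictions that miss their targets by 1 cm, 2 cm, and 1 cm:
print(mean_gaze_error_cm([[1.0, 0.0], [0.0, 2.0], [0.6, 0.8]],
                         [[0.0, 0.0], [0.0, 0.0], [0.0, 0.0]]))  # ~1.33
```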
To assemble such data sets, “most other groups tend to call people into the lab,” Khosla says. “It’s really hard to scale that up. Calling 50 people in itself is already a fairly tedious process. But we realized we could do this through crowdsourcing,” he added.
The application flashes a small dot somewhere on the device’s screen, attracting the user’s attention, then briefly replaces it with either an “R” or an “L,” instructing the user to tap either the right or left side of the screen. Correctly performing the tap confirms that the user has actually shifted his or her gaze to the intended location.
During this procedure, the device camera constantly captures images of the user’s face.
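Taken together, the last two paragraphs describe a simple trial loop: show a dot at a known position, record face images while the user looks at it, then use the R/L tap as a check that the user was actually paying attention. The following is a hypothetical sketch of that logic; the callback names (show_dot, show_letter, wait_for_tap, capture_frames) are placeholders for the app’s UI and camera hooks, not real APIs.

```python
import random

def run_calibration_trial(show_dot, show_letter, wait_for_tap, capture_frames):
    """One crowdsourcing trial, roughly as described above.  The four
    callbacks are hypothetical hooks into the app's UI and camera."""
    # 1. Flash a dot at a random screen position to draw the user's gaze there.
    dot_xy = (random.random(), random.random())  # normalized screen coordinates
    show_dot(dot_xy)

    # 2. Record face images while the user is (presumably) looking at the dot.
    frames = capture_frames()

    # 3. Briefly replace the dot with "R" or "L" and ask for the matching tap.
    expected_side = random.choice(["R", "L"])
    show_letter(expected_side)
    tapped_side = wait_for_tap()

    # 4. Keep the frames only if the tap confirms the user really looked there.
    if tapped_side == expected_side:
        return {"gaze_target": dot_xy, "frames": frames}
    return None  # discard trials where attention could not be verified
```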