New Technology Could Allow Us to “Feel” Objects Through Touchscreens - Unite.AI





A team of researchers at Texas A&M University is working to advance technology that could lead to enhanced touchscreens that let us “feel” objects. This technology would go beyond a device simply sensing and reacting to touch, and the team is pursuing it by better defining how the finger interacts with such a device.

The team is led by Dr. Cynthia Hipwell, a professor in the Department of Mechanical Engineering at the university.

The research was published last month in the journal Advanced Materials.

New Type of Human-Machine Interface 

The team’s goal is to develop a human-machine interface that gives touch devices the ability to provide users with a more interactive touch-based experience. They are achieving this by developing technology that can mimic the feeling of physical objects. 

According to Hipwell, there are many potential applications, such as a more immersive virtual reality (VR) platform and tactile display interfaces like those in a motor vehicle dashboard. The technology could also enable a virtual shopping experience in which users actually feel the texture of materials through the device before purchasing them.

“This could allow you to actually feel textures, buttons, slides and knobs on the screen,” Hipwell said. “It can be used for interactive touch screen-based displays, but one holy grail would certainly be being able to bring touch into shopping so that you could feel the texture of fabrics and other products while you’re shopping online.”

Refinement of Haptic Technology

Hipwell says that the “touch” aspect of current touch screen technology is actually there more for the screen than the user. However, that relationship between user and device can now be more reciprocal thanks to the emergence and refinement of haptic technology.

Adding touch as a sensory input can enrich virtual environments and ease the communication load that is currently carried by audio and visuals alone.

“When we look at virtual experiences, they're primarily audio and visual right now and we can get audio and visual overload,” Hipwell said. “Being able to bring touch into the human-machine interface can bring a lot more capability, much more realism, and it can reduce that overload. Haptic effects can be used to draw your attention to make something easier to find or easier to do using a lower cognitive load.”

The team is dealing with an incredibly complex interface that changes depending on the user and environmental conditions.

“We're looking at electro-wetting effects (the forces that result from an applied electric field), electrostatic effects, changes in properties of the finger, the material properties and surface geometry of the device, the contact mechanics, the fluid motion, charge transport — really, everything that's going on in the interface to understand how the device can be designed to be more reliable and higher performing,” Hipwell said. “Ultimately, our goal is to create predictive models that enable a designer to create devices with maximum haptic effect and minimum sensitivity to user and environmental variation.”
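One of the electrostatic effects Hipwell mentions is the basis of a common surface-haptics technique: applying a voltage across the screen's insulating layer pulls the finger toward the glass, which raises friction and creates the sensation of texture. As a rough illustration only (not the team's model, and with all parameter values hypothetical), a parallel-plate capacitor approximation captures the idea:

```python
# Illustrative sketch of electrostatic friction modulation in surface haptics.
# Simplified parallel-plate model; all parameter values below are hypothetical
# placeholders, not measurements from the Texas A&M work.

EPS_0 = 8.854e-12  # vacuum permittivity, F/m


def electrostatic_force(voltage, contact_area, gap, rel_permittivity):
    """Parallel-plate approximation: F = eps0 * eps_r * A * V^2 / (2 * d^2)."""
    return EPS_0 * rel_permittivity * contact_area * voltage**2 / (2 * gap**2)


def finger_friction(normal_force, voltage, contact_area, gap,
                    rel_permittivity, mu):
    """Friction grows as the electrostatic attraction adds to the finger's
    own pressing force: F_fric = mu * (N + F_electrostatic)."""
    f_e = electrostatic_force(voltage, contact_area, gap, rel_permittivity)
    return mu * (normal_force + f_e)


# Hypothetical scenario: 0.5 N finger press, 1 cm^2 contact patch,
# a few-micrometre effective gap, and a 100 V actuation signal.
baseline = finger_friction(0.5, 0.0, 1e-4, 3e-6, 3.0, 0.8)
actuated = finger_friction(0.5, 100.0, 1e-4, 3e-6, 3.0, 0.8)
print(f"friction off: {baseline:.3f} N, friction on: {actuated:.3f} N")
```

Modulating the voltage as the finger slides produces spatial friction patterns that the brain reads as edges, bumps, or fabric-like texture, which is why the real design problem involves all the coupled effects Hipwell lists, from finger moisture to charge transport.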

Hipwell believes that these features will begin to be implemented into common devices in the next few years. 

“I think early elements of it will definitely be within the next five years,” Hipwell said. “Then, it will just be a matter of maturing the technology and how advanced, how realistic and how widespread it becomes.”

Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.