Scientists Develop Smart Artificial Hand Combining User Control and Automation

Scientists from École Polytechnique Fédérale de Lausanne (EPFL) are working on new ways to improve the control of robotic hands, especially for amputees. They have developed a way to combine individual finger control with automation in order to improve grasping and manipulation. They tested this approach, which merges neuroengineering and robotics, on three amputees and seven able-bodied subjects. The results of the study were published in Nature Machine Intelligence.

This newly developed technology combines two previously separate fields of robotic hand control, a first that places the work in the emerging field of shared control in neuroprosthetics.

One of the concepts comes from neuroengineering: the intended finger movement is decoded from muscular activity on the amputee’s stump and used for individual finger control of the prosthetic hand. The other comes from robotics: the robotic hand can grasp objects and maintain contact with them automatically.

“When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react,” explains Aude Billard, who leads EPFL’s Learning Algorithms and Systems Laboratory. “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”

The process starts with the algorithm learning how to decipher the user’s intention and translate it into finger movement of the prosthetic hand. To make this possible, the amputee first trains the machine learning algorithm by performing a series of hand movements while sensors on the stump record the muscular activity. The algorithm learns to associate each hand movement with its corresponding muscular activity, until it can recognize the user’s intended finger movements and control the individual fingers of the prosthetic hand.
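The paper’s decoder isn’t reproduced here, but the training loop described above (record muscle signals while the user performs known movements, then fit a model that maps those signals to finger commands) can be sketched roughly as follows; the features, the random-forest classifier, and all names are illustrative assumptions, not the EPFL implementation:

```python
# Illustrative sketch only -- not the EPFL decoder.
# Assumes windowed EMG recordings from the stump sensors, each window
# labeled with the finger movement the user was asked to perform.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(window: np.ndarray) -> np.ndarray:
    """Common time-domain EMG features per channel: mean absolute
    value, waveform length, and zero-crossing count."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(np.diff(np.signbit(window).astype(int), axis=0) != 0, axis=0)
    return np.concatenate([mav, wl, zc])

def train_decoder(emg_windows, labels):
    """Training phase: the user performs a scripted series of movements.
    emg_windows: (n_windows, samples_per_window, n_channels);
    labels: the intended movement per window, e.g. "index_flex"."""
    X = np.stack([extract_features(w) for w in emg_windows])
    return RandomForestClassifier(n_estimators=200).fit(X, labels)

def decode(clf, window):
    """Online phase: each new EMG window becomes a finger command."""
    return clf.predict(extract_features(window)[None, :])[0]
```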

Katie Zhuang is the first author of the publication. She spoke about the machine learning algorithm. 

“Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” she said.

The scientists then engineered the algorithm so that robotic automation kicks in when the user tries to grasp an object: when an object comes into contact with the sensors on the surface of the prosthetic hand, the algorithm tells the hand to close its fingers around it. This automatic grasping adapts a previous study in which robotic arms learned to identify the shape of objects and grasp them based solely on tactile information, without relying on visual signals.
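A minimal sketch of that shared-control handoff, assuming a hypothetical hand API and made-up thresholds (only the 400-millisecond reaction window comes from the article):

```python
# Illustrative sketch of shared control -- not the published controller.
# The decoded intent drives the fingers until contact is sensed; the
# automation then maintains the grasp and tightens if slip is detected.
CONTACT_THRESHOLD = 0.2   # normalized pressure indicating contact (assumed)
SLIP_DROP = 0.05          # sudden pressure drop treated as slip (assumed)

def control_step(hand, decoded_intent, prev_pressures):
    pressures = hand.read_pressure_sensors()   # hypothetical hand API
    if max(pressures) < CONTACT_THRESHOLD:
        # Phase 1: the user steers the individual fingers.
        hand.set_finger_positions(decoded_intent)
    else:
        # Phase 2: automation takes over grasp maintenance.
        slipping = any(prev - cur > SLIP_DROP
                       for prev, cur in zip(prev_pressures, pressures))
        if slipping:
            # The article says the hand reacts within 400 ms, well
            # before the brain perceives the slip.
            hand.tighten_grip()                # hypothetical hand API
    return pressures
```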

There are still challenges to overcome before this technology can be used effectively and become a commercially viable option for amputees seeking prosthetic hands. Even so, it is a big step forward for the field, and it continues to push the merging of humans and robotics. For now, the algorithm is still being tested on a robot.

“Our shared approach to control robotic hands could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces, increasing the clinical impact and usability of these devices,” says Silvestro Micera, EPFL’s Bertarelli Foundation Chair in Translational Neuroengineering, and Professor of Bioelectronics at Scuola Superiore Sant’Anna.

 

AI Makes it Easier for Drones to Scan and Excavate Terrain

Researchers from Aarhus University (AU) and the Technical University of Denmark (DTU) have collaborated on a project that aims to decrease the costs of measuring and documenting gravel and limestone quarries, while at the same time being faster and easier than the traditional method. 

The project uses artificial intelligence (AI) to take over from the human-controlled drones that are currently relied on to complete the task.

Erdal Kayacan is an associate professor and expert in artificial intelligence and drones at the Department of Engineering at Aarhus University. 

“We’ve made the entire process completely automatic. We tell the drone where to start, and the width of the wall or rock face we want to photograph, and then it flies zig-zag all the way along and lands automatically,” says Kayacan.

Limitations of Human-Controlled Drones

The current method of measuring and documenting gravel and limestone quarries, cliff faces, and other natural and human-made formations relies on drones to photograph the area. A computer then automatically converts the recordings into a 3D terrain model.

One downside of this method is that drone pilots are expensive and the measurements are time-consuming. During a survey, the pilot has to keep the drone at a constant distance from the wall while keeping the camera perpendicular to it, a complex and difficult task.

For the computer to convert the images into a 3D model, the images must overlap by a specific amount. This is the main part of the process that the artificial intelligence automates, drastically reducing the complexity of the task.
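As a rough illustration, the required overlap directly fixes the spacing between photo positions along the zig-zag path; the footprint and overlap values below are assumptions, not the project’s parameters:

```python
# Illustrative sketch: spacing of zig-zag passes from a required image
# overlap. Values are assumptions, not the AU/DTU project's parameters.
def pass_spacing(footprint_m: float, overlap: float) -> float:
    """Distance between adjacent passes so that consecutive images
    share `overlap` (e.g. 0.7 = 70%) of their footprint."""
    return footprint_m * (1.0 - overlap)

def zigzag_waypoints(wall_width_m, wall_height_m, footprint_m, overlap):
    """Waypoints sweeping a wall face left to right, then right to left."""
    step = pass_spacing(footprint_m, overlap)
    waypoints, y, left_to_right = [], 0.0, True
    while y <= wall_height_m:
        xs = (0.0, wall_width_m) if left_to_right else (wall_width_m, 0.0)
        waypoints += [(xs[0], y), (xs[1], y)]
        y += step
        left_to_right = not left_to_right
    return waypoints

# A 3 m image footprint at 70% overlap gives one pass every 0.9 m.
print(zigzag_waypoints(50.0, 10.0, 3.0, 0.7)[:4])
```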

“Our algorithm ensures that the drone always keeps the same distance to the wall and that the camera constantly repositions itself perpendicular to the wall. At the same time, our algorithm predicts the wind forces acting on the drone body,” says Kayacan.

AI Overcomes Wind Problem

The artificial intelligence also helps overcome the wind, which is one of the biggest challenges with autonomous drone flight. 

Mohit Mehndiratta is a visiting Ph.D. student in the Department of Engineering at Aarhus University.

“The designed Gaussian process model also predicts the wind to be encountered in the near future. This implies that the drone can get ready and take the corrective actions beforehand,” says Mehndiratta.

When a human-controlled drone performs this task, even a light breeze can push it off course. With the new technology, wind gusts and the overall wind speed can be accounted for.

“The drone doesn’t actually measure the wind, it estimates the wind on the basis of input it receives as it moves. This means that the drone responds to the force of the wind, just like when we human beings correct our movements when we are exposed to a strong wind,” says Kayacan.
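A minimal sketch of the kind of Gaussian process regression Mehndiratta describes, using scikit-learn on synthetic data (the features, training data, and kernel choice are illustrative assumptions, not the published model):

```python
# Illustrative Gaussian process sketch on synthetic data -- not the
# published wind model. Training pairs map recent motion features to the
# disturbance force inferred from commanded-versus-actual acceleration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 4))          # e.g. velocity and attitude
true_wind = 2.0 * np.sin(X[:, 0]) + X[:, 1]    # synthetic wind force (N)
y = true_wind + rng.normal(0.0, 0.1, 200)      # noisy observations

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, y)

# In flight, the controller would query the GP at the current state and
# compensate for the predicted force (and its uncertainty) in advance.
mean, std = gp.predict(X[:1], return_std=True)
print(f"predicted wind: {mean[0]:.2f} N (std {std[0]:.2f})")
```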

The research was completed in collaboration with the Danish Hydrocarbon Research and Technology Centre at DTU, and the results of the project will be presented in May 2020 at the European Control Conference. 

 


Researchers Create Soft Robot Able to Change Shape and Roam

One of the challenges surrounding soft robots is that most of them have to be attached to an air compressor or plugged into a wall. Researchers from Stanford set out to overcome this limitation.

Nathan Usevitch is a graduate student in mechanical engineering at Stanford. 

“A significant limitation of most soft robots is that they have to be attached to a bulky air compressor or plugged into a wall, which prevents them from moving,” said Usevitch. “So, we wondered: What if we kept the same amount of air within the robot all the time?”

The team developed a human-scale soft robot capable of changing its shape, which allows it to latch onto and handle objects and to roll in controllable directions.

The research was published in Science Robotics on March 18. 

“The casual description of this robot that I give to people is Baymax from the movie Big Hero 6 mixed with Transformers. In other words, a soft, human-safe robot mixed with robots that can dramatically change their shape,” said Usevitch.

Simple Version

This soft robot was developed by combining three different types of robots. The simple version of the team’s invention is called an “isoperimetric robot,” since its shape changes while the total length of its edges and the amount of air inside stay the same.

The isoperimetric robot was developed out of soft robots, truss robots, and collective robots. Each category of robotics brought a different advantage: soft robots are lightweight and compliant, truss robots can change shape, and collective robots are small and collaborate. 
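The isoperimetric idea can be seen in miniature: hold a triangle’s perimeter fixed, redistribute length among its edges, and the enclosed shape changes. A toy sketch, not the Stanford kinematics:

```python
# Toy illustration of the isoperimetric idea -- not Stanford's kinematics.
# The perimeter (total edge length) stays fixed while length is
# redistributed among the edges, changing the enclosed shape.
import math

def triangle_vertices(a, b, c):
    """Place a triangle with side lengths a, b, c via the law of cosines."""
    cos_a = (a**2 + b**2 - c**2) / (2 * a * b)
    angle = math.acos(max(-1.0, min(1.0, cos_a)))
    return [(0.0, 0.0), (a, 0.0), (b * math.cos(angle), b * math.sin(angle))]

PERIMETER = 9.0  # fixed, like the fixed amount of air in the robot

for sides in [(3.0, 3.0, 3.0), (4.0, 4.0, 1.0)]:
    assert math.isclose(sum(sides), PERIMETER)
    print(sides, "->", triangle_vertices(*sides))
```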

Sean Follmer is an assistant professor of mechanical engineering and co-senior author of the paper. 

“We’re basically manipulating a soft structure with traditional motors,” said Follmer. “It makes for a really interesting class of robots that combines many of the benefits of soft robots with all of the knowledge we have about more classic robots.”

Complex Version

The team also developed a more complex version of the robot by attaching several triangles together. They were able to coordinate the movements of the different motors, which allowed the robot to carry out desired behaviors, such as picking up a ball. 

Elliot Hawkes is an assistant professor of mechanical engineering at the University of California, Santa Barbara and co-senior author of the paper. 

“A key understanding we developed was that to create motion with a large, soft pneumatic robot, you don’t actually need to pump air in and out,” said Hawkes. “You can use the air you already have and just move it around with these simple motors; this method is more efficient and lets our robot move much more quickly.”

Space Exploration

According to Zachary Hammond, a graduate student in mechanical engineering at Stanford and co-lead author of the paper, one of the possible uses for this soft robot is space exploration.

“This robot could be really useful for space exploration — especially because it can be transported in a small package and then operates untethered after it inflates,” said Hammond. “On another planet, it could use its shape-changing ability to traverse complicated environments, squeezing through tight spaces and spreading over obstacles.”

The researchers are now trying out different shapes, and they want to test the robot in water. 

Allison Okamura is a professor of mechanical engineering and co-author of the paper. 

“This research highlights the power of thinking about how to design and build robots in new ways,” said Okamura. “The creativity of robot design is expanding with this type of system and that’s something we’d really like to encourage in the robotics field.”

 


Marcio Macedo, Co-Founder of Ava Robotics - Interview Series

Marcio Macedo is Co-Founder and VP of Product and Marketing at Ava Robotics, a recent spin-off of iRobot that focuses on autonomously navigating robots for enterprise, commercial, and industrial environments.

Having previously worked at iRobot, what were some of the interesting projects that you worked on?

At iRobot we were fortunate to be designing and pioneering applications of telepresence, including an FDA-certified telemedicine robot for intensive care environments and the Ava telepresence product in partnership with Cisco.

 

Ava Robotics is a spin-off of iRobot. What was the inspiration behind launching a new company instead of keeping it in the iRobot family?

With iRobot’s strategic focus shifting to home products, Ava Robotics spun off to operate independently and better address the needs of our nascent markets. As an independent company we gain more flexibility in meeting our customers’ needs while enjoying the support of technology developed originally at iRobot.

 

The Ava telepresence robot can be remotely controlled by users, and its autonomous technology lets the robot simply move itself to a designated area. Could you walk us through the machine learning that allows the robot to navigate an environment without bumping into new objects?

When an Ava is installed at a location, it learns its operating environment and creates a realistic topological map of the site. This map can be further annotated to force specific behaviors, such as speed zones, keep-out zones, etc.

Ava has built-in obstacle detection and obstacle avoidance (ODOA) capabilities, which leverage multiple sensors in the robot body so that Ava will not bump into people or objects in its path. Furthermore, if the most direct path to its destination is blocked, the Ava will search for and navigate through an alternative path if one is available.
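For illustration, the rerouting behavior described here can be sketched as a toy grid search in which annotated keep-out zones are removed from the search space; the map, the zones, and the breadth-first planner are assumptions, not Ava’s software:

```python
# Toy illustration of annotated-map navigation -- not Ava's planner.
# The site map is a grid; annotated keep-out zones (1s) are removed from
# the search space, so a blocked direct path forces a reroute.
from collections import deque

def plan(grid, start, goal):
    """Breadth-first search over free cells (0 = free, 1 = blocked)."""
    rows, cols = len(grid), len(grid[0])
    prev, queue = {start: None}, deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                 # walk the parent links back
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in prev):
                prev[nxt] = cell
                queue.append(nxt)
    return None                          # no alternative path exists

# A keep-out zone blocks the direct route; the search goes around it.
site = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 0]]
print(plan(site, (0, 0), (2, 2)))
```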

 

What navigation sensors are used? Is the system reliant on LiDAR or regular cameras?

Ava’s robotic navigation technologies use a variety of sensors (3-D cameras, LiDAR, and IMU), which are combined for all actions, such as localization, planning, collision avoidance, cliff detection, etc. We operate in medium- and large-size spaces, so we think LiDAR is a very valuable part of a sensing package for real-world commercial spaces.
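As a generic illustration of why such sensors are combined, the simplest fusion scheme is a complementary filter that blends a fast but drifting inertial estimate with occasional absolute fixes; this is a textbook sketch, not Ava’s localization stack:

```python
# Generic 1-D complementary-filter sketch -- not Ava's localization stack.
# High-rate IMU integration drifts over time; occasional absolute fixes
# (e.g. from LiDAR scan matching) pull the estimate back toward truth.
ALPHA = 0.98  # weight on the integrated IMU estimate (assumed)

def fuse_step(position, velocity, accel, dt, fix=None):
    """One filter step: integrate the IMU, then blend in a fix if any."""
    velocity += accel * dt
    position += velocity * dt
    if fix is not None:
        position = ALPHA * position + (1.0 - ALPHA) * fix
    return position, velocity
```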

 

The telepresence robot looks like it would be extremely useful in the hospitality sector. Could you walk us through some of these potential use-cases?

Visits to Executive Briefing Centers provide access to senior-level executives and deliver value in the form of hands-on briefings, strategy reviews, product demonstrations and opportunities for relationship building. Customer Experience Centers offer organizations the opportunity to wow customers and show off their latest products and services. But with so many busy schedules, getting the right people to attend is not always easy.

For meeting planners, Ava provides the ability to “walk” the hotel and visit the meeting spaces, conference rooms and ballrooms that are available for their conference or event. In this application, the property’s sales and marketing team gain a unique tool to accelerate their sales cycles.

When invitees and guests can’t get to the conference or event, Ava allows them to attend and move around as if they were there. Whether it’s a business meeting, conference exhibit hall, or social event, Ava provides an immersive experience with freedom to move around.

 

What are some of the use-cases that are being targeted in the corporate sector?

Businesses benefit from Ava in many ways. The robot allows freedom of movement and access to meetings, corporate training, factory inspections, manufacturing sites, labs and customer experience settings.

Natural, face-to-face, ad-hoc conversations are critical to moving a business forward. Yet today’s globally distributed businesses have employees telecommuting from home or from across the world, who miss these vital interactions. With Ava, you unlock the ability to bring everyone back together as if they’re sitting together in the office and can walk up and interact naturally.

Use Case examples include:

  • Agile Product Development: Agile product development teams come together for scheduled and unscheduled meetings, looking to achieve high levels of collaboration and communication. When remote workers are part of the team, existing collaboration tools are challenged to meet the need. With Ava, remote team members can actively participate in stand-up meetings, sprint planning and demos, and project reviews as if they were co-located with the team.
  • Manufacturing: In manufacturing, remote visits by management, collaboration between experts at headquarters and staff at the plant, and remote tours by customers or suppliers are frequent – and necessary – events. Ava increases collaboration between those on the design team or in engineering and those building and delivering the final product on the plant floor. Also, imagine that the manufacturing plant is experiencing a production-line problem, but the person who knows how to fix it is thousands of miles away. In such a case, the technician needs to freely walk to different parts of the manufacturing floor to meet with someone or see something. Ava can help by delivering that critical physical presence right to the factory floor. Ava allows the remote person to immediately connect via the robot as if she were physically present, put eyes on the problem, and communicate with the local team on the floor. As a result, she can deliver immediate insight into the problem and quickly resolve the issue.
  • Laboratories and Clean Rooms: Those who work in laboratories and clean rooms work hard to ensure they are kept sterile and clean. While necessary, this can be a time-consuming process for employees entering and leaving these spaces repeatedly during the day. Due to the risks of potential contamination, companies often limit tours by customers and other visitors. Ava brings people right into a laboratory or a clean room without compromising the space. With Ava, remote visitors can easily move around as if they were there in person, observing the work being done and speaking with employees.

 

Ava Robotics recently partnered with Qatar Airways to introduce smart airport technologies at QITCOM 2019. Could you share some details about this event and how those in attendance reacted?

We have been fortunate to work with Hamad International Airport in Qatar and Qatar Airways, via our strategic partner Cisco, building applications for robots in airports for a variety of use cases. Showing our work at QITCOM 2019 was a good opportunity to expose the IT community to the applications that are now possible across different verticals and industries.

 

Is there anything else that you would like to share about Ava Robotics?

In these times of challenges to global travel, we have seen increased demand for solutions like telepresence robotics. Customers are just beginning to realize the potential of applications that truly empower remote workers to collaborate as if they were physically present at a distant location.

To learn more, visit Ava Robotics.
