Robotics

Engineers Design Humanoid Hand for Gripping Fragile Objects

A team of engineers at Michigan State University has designed and developed a humanoid robotic hand that is far better at delicate tasks than the grippers traditionally used. The new hand can handle objects that are fragile, light, and irregularly shaped.

Robots in the Industrial Setting

The industrial setting is one of the areas most impacted by developments in robotics, as the technology is often used for the repetitive grasping and control of objects. 

The area of the robot where a human hand would be located is called an end effector or gripper, and that was the focus of the new research, which was published in Soft Robotics and is titled “Soft Humanoid Hands with Large Grasping Force Enabled by Flexible Hybrid Pneumatic Actuators.”

Changyong Cao was the lead author and is the director of the Laboratory for Soft Machines and Electronics at MSU. Cao is also an assistant professor in Packaging, Mechanical Engineering, and Electrical and Computer Engineering. 

“The novel humanoid hand design is a soft-hard flexible gripper. It can generate larger grasping force than a traditional pure soft hand, and simultaneously be more stable for accurate manipulation than other counterparts used for heavier objects,” said Cao. 

Soft-Hand Grippers

Soft-hand grippers are normally used whenever an object is fragile, light, or irregularly shaped. However, they have notable downsides, including vulnerability to sharp surfaces, poor grasping stability under unbalanced loads, and a weaker grasping force for heavy loads.

Cao and his team looked toward human-environment interactions, including fruit picking and sensitive medical care, when designing the new humanoid hand. They determined that most gripping systems are not effective in the areas where they are most needed, such as tasks that require firm interaction with fragile objects.

According to the team, the prototype demonstrated a responsive, fast, and lightweight gripper that could handle various tasks. 

Each finger of the soft humanoid hand is built around a flexible hybrid pneumatic actuator, or FHPA, and its bending is controlled by pressurized air. As a result, each digit can be controlled independently of the others.
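
To make the independent-digit control concrete, here is a minimal Python sketch of per-finger pneumatic actuation. The PressureRegulator and SoftHand classes, the channel layout, and the pressure values are hypothetical stand-ins for illustration; the MSU team's actual control interface is not described in the article.

class PressureRegulator:
    """Stands in for one digit's pneumatic pressure regulator."""
    def __init__(self, channel):
        self.channel = channel
        self.kpa = 0.0

    def set_pressure(self, kpa):
        # Clamp to an assumed safe range before commanding the valve.
        self.kpa = max(0.0, min(kpa, 200.0))
        print(f"channel {self.channel}: {self.kpa:.1f} kPa")


class SoftHand:
    """Five FHPA digits, each driven by its own regulator."""
    def __init__(self):
        names = ["thumb", "index", "middle", "ring", "pinky"]
        self.fingers = {n: PressureRegulator(i) for i, n in enumerate(names)}

    def grasp(self, pressures_kpa):
        # Independent air channels mean any subset of fingers can curl
        # while the rest stay relaxed.
        for name, kpa in pressures_kpa.items():
            self.fingers[name].set_pressure(kpa)


hand = SoftHand()
hand.grasp({"thumb": 80.0, "index": 80.0})  # pinch grip; other digits stay idle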

“Traditional rigid grippers for industrial applications are generally made of simple but reliable rigid structures that help in generating large forces, high accuracy and repeatability,” Cao said. “The proposed soft humanoid hand has demonstrated excellent adaptability and compatibility in grasping complex-shaped and fragile objects while simultaneously maintaining a high level of stiffness for exerting strong clamping forces to lift heavy loads.”

The FHPA

The FHPA consists of hard and soft components.

“They combine the advantages of the deformability, adaptability and compliance of soft grippers while maintaining the large output force originated from the rigidity of the actuator,” Cao said.

According to Cao, the newly developed humanoid hand could be used in tasks like fruit picking, automated packaging, medical care, and surgical robotics.

The team will now look to combine this work with Cao’s previous developments, which include ‘smart’ grippers. They also want to integrate printed sensors into the gripping material and to combine the hybrid gripper with ‘soft arm’ models, which would bring robots closer to an accurate reproduction of human actions.

 


Robotics

Researchers Develop Self-Healing Soft Robot Actuators

Credit: Demirel Lab, Penn State

A team of researchers at Penn State University has developed a solution to the wear that repeated activity inflicts on soft robotic actuators: a self-healing, biosynthetic polymer based on squid ring teeth. The material is beneficial to actuators, but it could also be applied anywhere tiny holes cause problems, such as in hazmat suits.

According to the report in Nature Materials, “Current self-healing materials have shortcomings that limit their practical application, such as low healing strength and long healing times (hours).” 

Drawing inspiration from self-healing creatures in nature, the researchers created high-strength synthetic proteins that can self-heal both minute and visible damage.

Melik Demirel is a professor of engineering science and mechanics and holder of the Lloyd and Dorothy Foehr Huck Chair in Biomimetic Materials.

“Our goal is to create self-healing programmable materials with unprecedented control over their physical properties using synthetic biology,” he said. 

Robotic Arms and Prosthetics

Some robotic machines, such as robotic arms and prosthetic legs, rely on joints that are constantly moving, which requires a soft material; the same is true for ventilators and various types of personal protective equipment. These materials, and any others that undergo continual repetitive motion, are at risk of developing small tears and cracks and eventually breaking. With a self-healing material, these tiny tears can be repaired quickly, before any serious damage is done.

DNA Tandem Repeats

The team of researchers created the self-healing polymer using a series of DNA tandem repeats made up of amino-acid sequences produced by gene duplication. Tandem repeats are typically short series of molecules that can repeat themselves an unlimited number of times.
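
As a rough illustration of the tandem-repeat idea, the short Python sketch below builds a protein-like sequence by duplicating a single motif a chosen number of times. The motif string and repeat counts are made up for illustration and are not the actual squid ring teeth sequence.

def tandem_repeat(motif, n):
    # A tandem repeat is simply n back-to-back copies of a short unit.
    return motif * n


MOTIF = "PAAASVSTVHHP"  # hypothetical repeat unit (one-letter amino acid codes)

# The team tuned the material's behavior by adjusting the repeat count;
# this loop just shows how sequence length grows with repeats.
for n in (4, 11, 25):
    sequence = tandem_repeat(MOTIF, n)
    print(f"{n:>2} repeats -> {len(sequence)} residues")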

Abdon Pena-Francesch is the lead author of the paper and a former doctoral student in Demirel’s lab.

“We were able to reduce a typical 24-hour healing period to one second, so our protein-based soft robots can now repair themselves immediately,” Pena-Francesch said. “In nature, self-healing takes a long time. In this sense, our technology outsmarts nature.”

According to Demirel, the self-healing polymer can heal itself with the application of water, heat, and even light. 

“If you cut this polymer in half, when it heals it gains back 100 percent of its strength,” Demirel said.

Metin Sitti is director of the Physical Intelligence Department at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany.

“Self-repairing physically intelligent soft materials are essential for building robust and fault-tolerant soft robots and actuators in the near future,” Sitti said.

By adjusting the number of tandem repeats, the team created a rapidly healing soft polymer that retains its original strength. At the same time, they were able to make the polymer 100% biodegradable and 100% recyclable back into the same polymer.

Petroleum-Based Polymers

“We want to minimize the use of petroleum-based polymers for many reasons,” Demirel said. “Sooner or later we will run out of petroleum and it is also polluting and causing global warming. We can’t compete with the really inexpensive plastics. The only way to compete is to supply something the petroleum based polymers can’t deliver and self-healing provides the performance needed.”

According to Demirel, many petroleum-based polymers can be recycled, but only into something different.

The biomimetic polymers, by contrast, can biodegrade, and mild acids such as vinegar can recycle them into a powder that can then be manufactured back into the original self-healing polymer.

Stephanie McElhinny is a biochemistry program manager at the Army Research Office. 

“This research illuminates the landscape of material properties that become accessible by going beyond proteins that exist in nature using synthetic biology approaches,” McElhinny said. “The rapid and high-strength self-healing of these synthetic proteins demonstrates the potential of this approach to deliver novel materials for future Army applications, such as personal protective equipment or flexible robots that could maneuver in confined spaces.”

 


Interviews

Adam Rodnitzky, COO & Co-Founder of Tangram Robotics – Interview Series

Adam Rodnitzky is the COO & Co-Founder of Tangram Robotics, a company that specializes in helping robotics companies integrate sensors quickly and maximize uptime.

What initially attracted you to Robotics?

I’ve always loved mechanical things, and I’ve always loved cutting-edge technology. Robots sit right at the intersection of those two interests. Beyond that foundation of what they are, however, is what they can do. For the longest time, robots were largely relegated to factory settings, where they worked under relatively constrained circumstances. That meant that for most, robots were something they knew about, but never experienced. It’s only been recently that robots have started to play a larger role in society, and that is largely because the technology required to let them operate safely and consistently in the human world is just now becoming viable. The future of robotics is being built as we speak, and the level of interaction between them and humans is going to grow exponentially in the next decade. I’m very excited to witness that.

 

You were a mentor at StartX, a seed-stage accelerator out of Stanford University, for over a decade. What did you learn from this experience?

Being a company founder comes with a lot of uncertainty, as you face challenges you’ve never encountered before and try to pattern-match on prior experience to make sense of the day-to-day realities of running a new company. Looking to mentors for guidance is a natural response to that uncertainty. But there is a challenge in taking advice from mentors: they will prescribe advice based on their own past experiences, yet those experiences occurred in different contexts, at different company stages, and for different reasons. As a mentor, you’ve got to remember this when giving advice. You may have the best intentions, but you might lead a company astray by not properly contextualizing advice based on past experience. I’ve tried to keep this in mind as I mentor companies at StartX.

 

You previously worked as a General Manager for Occipital which develops state-of-the-art mobile computer vision applications and hardware. Could you tell us what this role involved in a day to day setting?

When I was at Occipital, our core product was the Structure Sensor and SDK, which made it simple to add 3D sensing to mobile devices, and develop applications to take advantage of that 3D data stream. On a day-to-day basis, I saw my role as combining a short-term tactical and long-term strategic pursuit of revenue and revenue growth. For instance, the SDK was free, and therefore it generated no revenue on a daily basis. However, as developers used the SDK to create apps to use Structure Sensor, there was a direct relationship between the number of apps published on our platform and the rate of sensor sales. So on a daily basis, I’d pursue these indirect revenue opportunities around developer community support, while also setting up programs to sell our sensors in as many channels as possible – including directly through those developers.

 

When did you first get the idea to launch a robotics startup?

Much of the credit here goes to my co-founder, Brandon Minor. Brandon is a co-founder of Colorado Robotics, and has had his finger on the pulse of the robotics community as long as I have known him. We had both left Occipital independently with the idea of starting companies. Earlier this year, we met up and he proposed that we join forces to build on our past experience with robots, computer vision and sensors. And that is how Tangram Robotics was created.

 

Could you tell us what Tangram Robotics does?

Tangram Robotics offers sensors-as-a-service to robotics platforms. All robots need perception sensors, but not all of those sensors meet the performance needs of robotics. We infuse trusted hardware with Tangram software that makes integration, calibration, and maintenance a breeze during development and deployment. This means that roboticists don’t need to make any trade-offs; they can start using the best sensors for their platform from day one, and keep that momentum as they deploy.

 

What are some of the existing challenges companies face when it comes to the integration of Robotic Perception Sensors?

Our interviews with robotics companies of all types have led us to the conclusion that hardware companies make great hardware, but marginal software. The process of developing the right streaming and integration software for a sensor therefore falls to the robotics company itself and can take months to get right. Not only that, but every robotics company is going through this same process, for the same sensors, over and over as they build up their perception stack. This results in a major loss of engineering time and customer revenue. We’ve set up our solution so that it can help robotics companies at any stage, from design through development and ultimately to deployment.

 

Could you discuss Tangram Robotics web-based diagnostics and monitoring systems?

Tangram understands that the key to improvement is in metrics, both during development and in the field. With that in mind, we are creating remote diagnostics systems that work on top of our integration software that allow robotics developers to better understand what’s happening during operation. This includes data transmission rates, processing time, and metrics directly related to other aspects of our platform. Setting this up over a web portal means that decisions can be made competently without needing the physical presence of an engineer.

 

One of the solutions Tangram Robotics is working on is developing full-stack tools for robotic companies to add to their project. Could you discuss the vision behind these tools?

Sensor integration is much more than streaming. We look at sensors from a holistic perspective, focusing on the tools needed to develop faster and work longer. This includes competent calibration tools that work in the field, as well as diagnostics and monitoring of data and performance. By solving the base requirements of many robot platforms out-of-the-box, Tangram’s tools dramatically improve time-to-market. We anticipate that various other tools will be requested as our platform matures.

 

Is there anything else that you would like to share about Tangram Robotics?

As we’ve gone through the process of talking with roboticists, we’ve been blown away at the diversity of applications that robotics companies are pursuing. We’ve spoken to companies building all sorts of wild solutions, from strawberry pickers to sous chefs to boat captains to groundskeepers!

Thank you for the interview. I believe sensors are something many companies overlook, and I look forward to following your progress. Readers who wish to learn more should visit Tangram Robotics.


Robotics

Tiny Robotic Cameras Give First-Person View of Insects

Credit: Mark Stone/University of Washington

Many people across generations have wondered what the world looks like from the viewpoint of insects and other small organisms, a perspective often portrayed in movies but never demonstrated in real life, until now.

Researchers at the University of Washington have created a wireless steerable camera that is capable of being placed on the back of an insect, bringing that viewpoint to the world. 

Insect Camera 

The camera on the back of the insect can stream video to a smartphone at 1 to 5 frames per second, and it is mounted on a mechanical arm that allows a 60-degree pivot. The technology provides high-resolution, panoramic shots, as well as the possibility of tracking moving objects.

The entire system weighs around 250 milligrams and was demonstrated on the backs of live beetles and insect-sized robots.

The work was published on July 15 in Science Robotics.

Shyam Gollakota is the senior author and a UW associate professor in the Paul G. Allen School of Computer Science & Engineering.

“We have created a low-power, low-weight, wireless camera system that can capture a first-person view of what’s happening from an actual live insect or create vision for small robots,” said Gollakota. “Vision is so important for communication and for navigation, but it’s extremely challenging to do it at such a small scale. As a result, prior to our work, wireless vision has not been possible for small robots or insects.”

Smartphone Cameras

There are a few reasons why the researchers had to develop a new camera rather than use the small ones currently found in smartphones. Those cameras are lightweight, but the batteries they require would make the system too heavy to ride on the back of an insect.

Sawyer Fuller is co-author and a UW assistant professor of mechanical engineering. 

“Similar to cameras, vision in animals requires a lot of power,” Fuller said. “It’s less of a big deal in larger creatures like humans, but flies are using 10 to 20% of their resting energy just to power their brains, most of which is devoted to visual processing. To help cut the cost, some flies have a small, high-resolution region of their compound eyes. They turn their heads to steer where they want to see with extra clarity, such as for chasing prey or a mate. This saves power over having high resolution over their entire visual field.”

Modeled After Nature

The newly developed camera was inspired by nature: the researchers used an ultra-low-power black-and-white camera to mimic an animal’s vision. The camera can sweep across a field of view with the help of the mechanical arm, which the team steers by applying a high voltage that causes the arm to bend and move the camera.
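
That steering step can be sketched in a few lines of Python, assuming a simple linear voltage-to-angle relationship; the arm's actual drive voltage and calibration are not given in the article, so the numbers below are placeholders.

MAX_ANGLE_DEG = 60.0   # pivot range reported in the article
MAX_VOLTAGE_V = 200.0  # assumed drive-voltage ceiling (placeholder)


def angle_to_voltage(angle_deg):
    # Map a requested pan angle to a drive voltage, assuming linearity.
    angle = max(0.0, min(angle_deg, MAX_ANGLE_DEG))
    return angle / MAX_ANGLE_DEG * MAX_VOLTAGE_V


for angle in (0, 30, 60):
    print(f"{angle:>2} deg -> {angle_to_voltage(angle):.0f} V")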

The camera and the arm can be controlled via Bluetooth from a smartphone up to 120 meters away.

Testing the Camera

The researchers tested the camera on two different species of beetles, which lived for at least a year after the experiment.

“We made sure the beetles could still move properly when they were carrying our system,” said Ali Najafi, co-lead author and UW doctoral student in electrical and computer engineering. “They were able to navigate freely across gravel, up a slope and even climb trees.”

“We added a small accelerometer to our system to be able to detect when the beetle moves. Then it only captures images during that time,” said co-lead author Vikram Iyer, a UW doctoral student in electrical and computer engineering. “If the camera is just continuously streaming without this accelerometer, we could record one to two hours before the battery died. With the accelerometer, we could record for six hours or more, depending on the beetle’s activity level.”
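
The accelerometer-gated capture Iyer describes can be sketched as a simple loop: grab a frame only when motion is detected, and idle otherwise. The threshold, sensor read, and capture call below are hypothetical placeholders rather than the UW team's firmware.

import random
import time

MOTION_THRESHOLD_G = 0.05  # assumed wake threshold, in g (placeholder)


def read_accel_magnitude():
    # Stand-in for an accelerometer read; returns jitter around rest.
    return abs(random.gauss(0.0, 0.04))


def capture_frame():
    print("frame captured")


def run(duration_s=2.0, fps=5.0):
    interval = 1.0 / fps  # the article reports streaming at 1 to 5 fps
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        if read_accel_magnitude() > MOTION_THRESHOLD_G:
            capture_frame()   # beetle is moving: spend power on a frame
        time.sleep(interval)  # otherwise idle to conserve the battery


run()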

According to the researchers, this technology could be applied in the areas of biology and exploration, and they hope for future versions to be solar-powered. However, the team does recognize certain privacy concerns could arise due to the technology. 

“As researchers we strongly believe that it’s really important to put things in the public domain so people are aware of the risks and so people can start coming up with solutions to address them,” Gollakota said.

 
