Robotics

Tiny Robot Could Clean Particles From Water and Transport Cells


Researchers at Eindhoven University of Technology have developed a tiny plastic robot that can be used to attract and capture particles in the water. It could also be used to transport cells for analysis in diagnostic devices.

The research was published in the journal PNAS.

The Robot

The tiny robot is made of responsive polymers that can be controlled by light and magnetism. Termed a “wireless aquatic polyp,” it is modeled on the coral polyp, the small tentacled organism that builds coral reefs.

In the real world, living polyps make a specific movement with their stem to create a current that attracts food particles.

According to doctoral candidate Marina Pilz Da Cunha, “I was inspired by the motion of these coral polyps, especially their ability to interact with the environment through self-made currents.”

The newly developed artificial polyp measures 1 by 1 cm; its stem reacts to magnetism, while its tentacles are controlled by light.

“Combining two different stimuli is rare since it requires delicate material preparation and assembly, but it is interesting for creating untethered robots because it allows for complex shape changes and tasks to be performed,” says Pilz Da Cunha.

To control the tentacles, light of different wavelengths is shone on them: under UV light the tentacles ‘grab,’ and under blue light they ‘release.’

Underwater

The artificial polyp is capable of grabbing and releasing objects underwater. The new robot is an advance on the light-guided package-delivery mini-robot the researchers presented earlier in the year.

That land-based robot could not operate underwater because its polymers act through photothermal effects: it was driven by the heat that the light generated rather than by the light itself.

“Heat dissipates in water, which makes it impossible to steer the robot underwater,” Pilz Da Cunha said.

With this in mind, the researchers developed a photomechanical polymer material that can be controlled by light alone, without relying on heat.

Another major advance is that the new robot can hold its deformation after being activated by light. Once the stimulus is removed, a photothermal material returns to its original shape, whereas the molecules in the photomechanical material take on a new state. Because of this, different stable shapes can be maintained for longer periods of time.

“That helps control the gripper arm; once something has been captured, the robot can keep holding it until it is addressed by light once again to release it,” says Pilz Da Cunha.
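
Conceptually, the tentacles behave like a light-addressed latch. Below is a minimal sketch of that behavior; the state names and the representation of light pulses as strings are ours, not the authors'.

```python
from enum import Enum

class Tentacles(Enum):
    OPEN = "open"
    CLOSED = "closed"

def step(state, light):
    """One control step: UV closes ('grab'), blue opens ('release'),
    and with no light the photomechanical material holds its shape."""
    if light == "uv":
        return Tentacles.CLOSED
    if light == "blue":
        return Tentacles.OPEN
    return state  # deformation is retained until re-addressed by light

state = Tentacles.OPEN
for pulse in ("uv", None, None, "blue"):
    state = step(state, pulse)
    print(pulse, "->", state.value)
```

Note the key difference from a photothermal actuator: with no stimulus, the state simply persists rather than relaxing back to a default.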

Attracting Particles

A rotating magnet located underneath the robot allows the stem to circle around its axis.

According to Pilz Da Cunha, “It was therefore possible to actually move floating objects in the water towards the polyp, in our case oil droplets.”

The fluid flow can be changed by the position of the tentacles.

“Computer simulations, with different tentacle positions, eventually helped us to understand and get the movement of the stem exactly right. And to ‘attract’ the oil droplets towards the tentacles,” says Pilz Da Cunha.
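
As a rough illustration of the principle (this is a toy model of ours, not the authors' simulation), the stirred fluid can be approximated as a swirl with a weak radial inflow, through which a floating droplet is advected; all parameters are invented.

```python
import numpy as np

def velocity(p, omega=2.0, inflow=0.2):
    """Toy 2D flow around a polyp at the origin: solid-body swirl
    from the rotating stem plus a weak radial inflow."""
    x, y = p
    r = np.hypot(x, y) + 1e-9
    v_swirl = omega * np.array([-y, x])    # circulating component
    v_in = -inflow * np.array([x, y]) / r  # inward pull
    return v_swirl + v_in

droplet = np.array([1.0, 0.0])  # oil droplet starting 1 unit away
dt = 0.005
for step in range(5000):        # simple forward-Euler advection
    droplet = droplet + dt * velocity(droplet)
    if np.hypot(*droplet) < 0.05:          # within tentacle reach
        print(f"captured after {step} steps")
        break
```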

The robot can operate regardless of the composition of the surrounding liquid. This sets it apart from the hydrogels often used for underwater applications, which are sensitive to their environment.

“Our robot also works in the same way in salt water, or water with contaminants,” says Pilz Da Cunha. The polyp could even remove such contaminants from the water by catching them with its tentacles.

The researchers are now working on getting multiple polyps to collaborate, with the possibility of one polyp passing a package to another. They are also working on swimming robots that could be used for biomedical applications.

 


Robotics

Researchers Develop Self-Healing Soft Robot Actuators


Credit: Demirel Lab, Penn State

A team of researchers at Penn State University has developed a solution to the wear on soft robotic actuators due to repeated activity: a self-healing, biosynthetic polymer based on squid ring teeth. The material is beneficial to actuators, but it could also be applied anywhere that tiny holes could cause problems, such as hazmat suits.

According to the report in Nature Materials, “Current self-healing materials have shortcomings that limit their practical application, such as low healing strength and long healing times (hours).” 

Drawing inspiration from self-healing creatures in nature, the researchers created high-strength synthetic proteins that can self-heal both minute and visible damage.

Melik Demirel is a professor of engineering science and mechanics and the holder of the Lloyd and Dorothy Foehr Huck Chair in Biomimetic Materials.

“Our goal is to create self-healing programmable materials with unprecedented control over their physical properties using synthetic biology,” he said. 

Robotic Arms and Prosthetics

Some robotic machines, such as robotic arms and prosthetic legs, rely on joints that are constantly moving. This requires a soft material, and the same is true for ventilators and various types of personal protective equipment. These materials, and any that undergo continual repetitive motion, are at risk of developing small tears and cracks and eventually breaking. With the use of self-healing material, these tiny tears can be quickly repaired before any serious damage is done.

DNA Tandem Repeats

The team of researchers created the self-healing polymer by using a series of DNA tandem repeats made up of amino acids produced through gene duplication. Tandem repeats are short series of molecules that can be repeated any number of times.
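
The repeat structure itself is simple to picture: the same short motif concatenated n times, with n as a tuning knob. A toy illustration (the motif below is a made-up placeholder, not the actual squid ring teeth sequence):

```python
# A tandem repeat is one short motif concatenated n times; the team
# tuned the number of repeats to balance strength and healing speed.
def tandem_repeat(motif: str, n: int) -> str:
    return motif * n

MOTIF = "PAVTHG"  # hypothetical amino-acid motif, for illustration only
for n in (2, 4, 8):
    print(n, tandem_repeat(MOTIF, n))
```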

Abdon Pena-Francesch is lead author of the paper and a former doctoral student in Demirel’s lab.

“We were able to reduce a typical 24-hour healing period to one second, so our protein-based soft robots can now repair themselves immediately,” Pena-Francesch said. “In nature, self-healing takes a long time. In this sense, our technology outsmarts nature.”

According to Demirel, the self-healing polymer can heal itself with the application of water, heat, and even light. 

“If you cut this polymer in half, when it heals it gains back 100 percent of its strength,” Demirel said.

Metin Sitti is director of the Physical Intelligence Department at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany.

“Self-repairing physically intelligent soft materials are essential for building robust and fault-tolerant soft robots and actuators in the near future,” Sitti said.

The team was able to create the rapidly healing soft polymer by adjusting the number of tandem repeats. The polymer retains its original strength, and the researchers were also able to make it 100 percent biodegradable and 100 percent recyclable into the same polymer.

Petroleum-Based Polymers

“We want to minimize the use of petroleum-based polymers for many reasons,” Demirel said. “Sooner or later we will run out of petroleum and it is also polluting and causing global warming. We can’t compete with the really inexpensive plastics. The only way to compete is to supply something the petroleum based polymers can’t deliver and self-healing provides the performance needed.”

According to Demirel, many petroleum-based polymers can be recycled, but only into something different.

The biomimetic polymers are able to biodegrade, and acids like vinegar can recycle them into a powder, which can then be manufactured back into the original self-healing polymer.

Stephanie McElhinny is a biochemistry program manager at the Army Research Office. 

“This research illuminates the landscape of material properties that become accessible by going beyond proteins that exist in nature using synthetic biology approaches,” McElhinny said. “The rapid and high-strength self-healing of these synthetic proteins demonstrates the potential of this approach to deliver novel materials for future Army applications, such as personal protective equipment or flexible robots that could maneuver in confined spaces.”

 


Interviews

Adam Rodnitzky, COO & Co-Founder of Tangram Robotics – Interview Series


Adam Rodnitzky is the COO & Co-Founder of Tangram Robotics, a company specializing in helping robotics companies integrate sensors quickly and maximize uptime.

What initially attracted you to Robotics?

I’ve always loved mechanical things, and I’ve always loved cutting-edge technology. Robots sit right at the intersection of those two interests. Beyond that foundation of what they are, however, is what they can do. For the longest time, robots were largely relegated to factory settings, where they worked under relatively constrained circumstances. That meant that for most, robots were something they knew about, but never experienced. It’s only been recently that robots have started to play a larger role in society, and that is largely because the technology required to let them operate safely and consistently in the human world is just now becoming viable. The future of robotics is being built as we speak, and the level of interaction between them and humans is going to grow exponentially in the next decade. I’m very excited to witness that.

 

You were a mentor at StartX, a seed-stage accelerator out of Stanford University, for over a decade. What did you learn from this experience?

Being a company founder comes with a lot of uncertainty, as you face new challenges you’ve never faced, and try to pattern match on prior experience to make sense of the day-to-day realities of running a new company. Looking to mentors for guidance is a natural response to having that uncertainty. But there is a challenge in taking advice from mentors. Mentors will prescribe advice based on their own past experiences. Yet those experiences occurred in different contexts, at different company stages and for different reasons. As a mentor, you’ve got to remember this when giving advice. You may have the best intentions, but you might lead a company astray by not properly contextualizing advice based on past experience. I’ve tried to keep this in mind as I mentor companies at StartX.

 

You previously worked as a General Manager for Occipital which develops state-of-the-art mobile computer vision applications and hardware. Could you tell us what this role involved in a day to day setting?

When I was at Occipital, our core product was the Structure Sensor and SDK, which made it simple to add 3D sensing to mobile devices, and develop applications to take advantage of that 3D data stream. On a day-to-day basis, I saw my role as combining a short-term tactical and long-term strategic pursuit of revenue and revenue growth. For instance, the SDK was free, and therefore it generated no revenue on a daily basis. However, as developers used the SDK to create apps to use Structure Sensor, there was a direct relationship between the number of apps published on our platform and the rate of sensor sales. So on a daily basis, I’d pursue these indirect revenue opportunities around developer community support, while also setting up programs to sell our sensors in as many channels as possible – including directly through those developers.

 

When did you first get the idea to launch a robotics startup?

Much of the credit here goes to my co-founder, Brandon Minor. Brandon is a co-founder of Colorado Robotics, and has had his finger on the pulse of the robotics community as long as I have known him. We had both left Occipital independently with the idea of starting companies. Earlier this year, we met up and he proposed that we join forces to build on our past experience with robots, computer vision and sensors. And that is how Tangram Robotics was created.

 

Could you tell us what Tangram Robotics does?

Tangram Robotics offers sensors-as-a-service to robotics platforms. All robots need perception sensors, but not all of those sensors meet the performance needs of robotics. We infuse trusted hardware with Tangram software that makes integration, calibration, and maintenance a breeze during development and deployment. This means that roboticists don’t need to make any trade-offs; they can start using the best sensors for their platform from day one, and keep that momentum as they deploy.

 

What are some of the existing challenges companies face when it comes to the integration of Robotic Perception Sensors?

Our interviews with robotics companies of all types have led us to the conclusion that hardware companies make great hardware, but marginal software. The process of developing the right streaming and integration software for a sensor therefore falls to the robotics companies themselves and can take months to get right. Not only that, but every robotics company is going through this same process, for the same sensors, over and over as they build up their perception stack. This results in a major loss of engineering time and customer revenue. We’ve set up our solution so that it can help robotics companies at any stage, from design through development and ultimately to deployment.
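
To make the duplicated effort concrete, the per-sensor layer each team ends up rebuilding looks roughly like the interface sketched below; this is our own illustration, not Tangram's actual API.

```python
from abc import ABC, abstractmethod
from typing import Iterator

class PerceptionSensor(ABC):
    """The integration surface a robotics team typically
    re-implements per sensor: setup, streaming, calibration."""

    @abstractmethod
    def connect(self, device_uri: str) -> None:
        """Open the device and configure its streaming mode."""

    @abstractmethod
    def frames(self) -> Iterator[bytes]:
        """Yield raw frames until the device is closed."""

    @abstractmethod
    def calibrate(self) -> dict:
        """Return intrinsics/extrinsics for the perception stack."""
```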

 

Could you discuss Tangram Robotics web-based diagnostics and monitoring systems?

Tangram understands that the key to improvement is in metrics, both during development and in the field. With that in mind, we are creating remote diagnostics systems that work on top of our integration software that allow robotics developers to better understand what’s happening during operation. This includes data transmission rates, processing time, and metrics directly related to other aspects of our platform. Setting this up over a web portal means that decisions can be made competently without needing the physical presence of an engineer.
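
As a sketch of the kind of per-sensor record such a portal might aggregate (the field names, numbers, and reporting path are our assumptions, not Tangram's system):

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SensorHealth:
    sensor_id: str
    frames_received: int
    bytes_received: int
    window_s: float            # length of the measurement window
    mean_processing_ms: float  # average per-frame processing time

    def throughput_mbps(self) -> float:
        return self.bytes_received * 8 / self.window_s / 1e6

report = SensorHealth("depth_cam_0", frames_received=300,
                      bytes_received=45_000_000, window_s=10.0,
                      mean_processing_ms=6.2)
payload = {**asdict(report),
           "throughput_mbps": report.throughput_mbps(),
           "timestamp": time.time()}
print(json.dumps(payload))  # in practice, sent on to the web portal
```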

 

One of the solutions Tangram Robotics is working on is developing full-stack tools for robotics companies to add to their projects. Could you discuss the vision behind these tools?

Sensor integration is much more than streaming. We look at sensors from a holistic perspective, focusing on the tools needed to develop faster and work longer. This includes competent calibration tools that work in the field, as well as diagnostics and monitoring of data and performance. By solving the base requirements of many robot platforms out-of-the-box, Tangram’s tools dramatically improve time-to-market. We anticipate that various other tools will be requested as our platform matures.

 

Is there anything else that you would like to share about Tangram Robotics?

As we’ve gone through the process of talking with roboticists, we’ve been blown away at the diversity of applications that robotics companies are pursuing. We’ve spoken to companies building all sorts of wild solutions, from strawberry pickers to sous chefs to boat captains to groundskeepers!

Thank you for the interview. I believe that sensors are often overlooked by companies, and I look forward to following your progress. Readers who wish to learn more should visit Tangram Robotics.


Robotics

Tiny Robotic Cameras Give First-Person View of Insects


Credit: Mark Stone/University of Washington

For generations, people have been curious about the viewpoint of insects and other small organisms, which is often portrayed in movies. Until now, however, it had never been demonstrated in real life.

Researchers at the University of Washington have created a wireless steerable camera that is capable of being placed on the back of an insect, bringing that viewpoint to the world. 

Insect Camera 

The camera on the back of the insect can stream video to a smartphone at 1 to 5 frames per second, and it is placed on a mechanical arm that allows a 60-degree pivot. The technology provides high-resolution, panoramic shots, as well as the possibility of tracking moving objects.

The entire system weighs around 250 milligrams, and it was demonstrated on the back of live beetles and insect-sized robots.

The work was published on July 15 in Science Robotics.

Shyam Gollakota is the senior author and a UW associate professor in the Paul G. Allen School of Computer Science & Engineering.

“We have created a low-power, low-weight, wireless camera system that can capture a first-person view of what’s happening from an actual live insect or create vision for small robots,” said Gollakota. “Vision is so important for communication and for navigation, but it’s extremely challenging to do it at such a small scale. As a result, prior to our work, wireless vision has not been possible for small robots or insects.”

Smartphone Cameras

There are a few reasons why the researchers had to come up with a new camera rather than use the small ones currently found in smartphones. Those cameras are lightweight, but the batteries required to power them would make the system too heavy to place on the back of an insect.

Sawyer Fuller is co-author and a UW assistant professor of mechanical engineering. 

“Similar to cameras, vision in animals requires a lot of power,” Fuller said. “It’s less of a big deal in larger creatures like humans, but flies are using 10 to 20% of their resting energy just to power their brains, most of which is devoted to visual processing. To help cut the cost, some flies have a small, high-resolution region of their compound eyes. They turn their heads to steer where they want to see with extra clarity, such as for chasing prey or a mate. This saves power over having high resolution over their entire visual field.”

Modeled After Nature

The newly developed camera was inspired by nature, and the researchers used an ultra-low-power black-and-white camera to mimic an animal’s vision. The camera can sweep across a field of view with the help of the mechanical arm, which the team controls by applying a high voltage that causes the arm to bend and move the camera.
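
In control terms, steering reduces to mapping a requested camera angle onto a drive command within the arm's roughly 60-degree range. A hypothetical sketch follows; the linear map, peak voltage, and symmetric range split are our assumptions, not the published drive electronics.

```python
V_MAX = 200.0      # hypothetical peak drive voltage
HALF_RANGE = 30.0  # degrees to either side of center (~60 total)

def angle_to_voltage(angle_deg: float) -> float:
    """Clamp the request to the arm's mechanical range, then map it
    linearly to a signed drive command."""
    angle = max(-HALF_RANGE, min(HALF_RANGE, angle_deg))
    return V_MAX * angle / HALF_RANGE

for a in (-45.0, -10.0, 0.0, 25.0):
    print(f"{a:+.0f} deg -> {angle_to_voltage(a):+.0f} V")
```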

The camera and the arm can be controlled via Bluetooth from a smartphone up to 120 meters away.

Testing the Camera

The researchers tested the camera on two different species of beetle, which went on to live for at least a year after the experiment.

“We made sure the beetles could still move properly when they were carrying our system,” said Ali Najafi, co-lead author and UW doctoral student in electrical and computer engineering. “They were able to navigate freely across gravel, up a slope and even climb trees.”

“We added a small accelerometer to our system to be able to detect when the beetle moves. Then it only captures images during that time,” said co-lead author Vikram Iyer, a UW doctoral student in electrical and computer engineering. “If the camera is just continuously streaming without this accelerometer, we could record one to two hours before the battery died. With the accelerometer, we could record for six hours or more, depending on the beetle’s activity level.”
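
The power saving comes from a simple gating loop: poll the accelerometer and only wake the camera while the beetle is moving. A minimal sketch, where read_accel() and capture_frame() are hypothetical stand-ins for the real drivers and the motion threshold is our guess:

```python
import math
import time

MOTION_THRESHOLD = 0.15  # deviation from 1 g at rest (assumed value)

def is_moving(ax: float, ay: float, az: float) -> bool:
    """True when total acceleration departs noticeably from gravity."""
    return abs(math.sqrt(ax*ax + ay*ay + az*az) - 1.0) > MOTION_THRESHOLD

def capture_loop(read_accel, capture_frame, period_s=0.2):
    """Poll at ~5 Hz; spend camera and radio power only during motion."""
    while True:
        if is_moving(*read_accel()):
            capture_frame()
        time.sleep(period_s)  # otherwise idle, extending battery life
```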

According to the researchers, this technology could be applied in the areas of biology and exploration, and they hope for future versions to be solar-powered. However, the team does recognize certain privacy concerns could arise due to the technology. 

“As researchers we strongly believe that it’s really important to put things in the public domain so people are aware of the risks and so people can start coming up with solutions to address them,” Gollakota said.

 
