Scientists are developing networks of autonomous robots that cooperate to create smart satellites, which could then be used to repair other satellites in space. At present, little can be done about a broken satellite, and breakdowns happen often. With no practical repair option, these expensive machines orbit Earth for years until gravity eventually pulls them back into the atmosphere.
Ou Ma, a professor at the University of Cincinnati, is engineering robotics technology to service orbiting satellites before they fail. He runs the Intelligent Robotics and Autonomous Lab at the university, and he would like to create robotic satellites that are capable of docking with other satellites for repairs and refueling.
According to Ma, the ideal repair satellite will be capable of performing multiple tasks. He has had a long career working on projects involving robotic arms for the International Space Station and the former space shuttle program.
In the lab, Ma and UC senior research associate Anoop Sathyan are working on robotic networks that work independently and collaboratively on a common task.
In their latest study, the pair tested a group of robots on a novel game in which strings move an attached token to a target spot on a table. Each robot controls one string, so it needs the others' help to move the token to the right spot: each one slackens or tightens its string in response to the other robots' actions.
Using an artificial intelligence technique called genetic fuzzy logic, the team was able to get three robots, and later five, to move the token to the desired spot.
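The article does not give the paper's controller details, but the cooperative idea can be sketched with a toy simulation: each robot may only pull the token toward its own anchor point, so it reels in only when doing so moves the token closer to the target, and otherwise pays out slack so the other robots can act. The anchor positions, step size, and greedy rule below are all illustrative assumptions, not the study's genetic-fuzzy controller.

```python
import math

# Hypothetical setup: three robots at fixed anchors around a table, each
# holding one string tied to a shared token. Each robot can only pull the
# token toward its own anchor (shortening its string).
anchors = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)]  # assumed robot positions
token = [5.0, 1.0]
target = (5.0, 4.0)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

for _ in range(200):
    for ax, ay in anchors:
        # Unit step from the token toward this robot's anchor.
        d = dist((ax, ay), token)
        moved = (token[0] + (ax - token[0]) / d * 0.05,
                 token[1] + (ay - token[1]) / d * 0.05)
        # Reel in only when that pulls the token closer to the target;
        # otherwise pay out slack and defer to the other robots.
        if dist(moved, target) < dist(token, target):
            token[0], token[1] = moved

print(round(dist(token, target), 2))  # prints 0.0: token reaches the target
```

No single robot can place the token alone, since each can only pull toward itself; it is the combination of pulling and slackening across the group that positions the token, which mirrors the collaborative behavior the study describes.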
The results of the research and experiments were published in the journal Robotica this month.
When the researchers used five different robots, they learned that the task can be completed even if one of them malfunctions.
“This will be especially true for problems with larger numbers of robots where the liability of an individual robot will be low,” the researchers concluded.
According to Ma, every satellite launch carries the risk of countless problems, and once the satellite is deployed, it is almost always impossible to do anything about them.
Earlier this year, a $400 million Intelsat satellite, the same size as a small school bus, malfunctioned after reaching a high elliptical orbit. Some of the first 60 Starlink satellites launched by SpaceX also malfunctioned this year. In the case of SpaceX, the satellites were designed to orbit Earth at a low altitude, causing them to decay after a few years.
The most famous malfunction of all took place in 1990 when the Hubble Space Telescope was deployed. NASA later learned that the telescope's mirror was flawed, and a repair mission aboard the space shuttle Endeavour followed in 1993. That mission installed corrective optics, allowing sharp images of the universe to make it back to Earth.
Sending humans to space in order to repair satellites is extremely expensive, according to Ma. The missions can cost billions of dollars and are difficult to complete.
These risks grow with every satellite launched.
“Big commercial satellites are costly. They run out of fuel or malfunction or break down,” Ma said. “They would like to be able to go up there and fix it, but nowadays it’s impossible.”
NASA plans to launch a satellite in 2022 that is capable of refueling others in low Earth orbit; it will set out to intercept and refuel a U.S. government satellite. The project, called Restore-L, is expected to serve as the proof of concept for autonomous satellite repairs, according to NASA.
Maxar, a company out of Colorado, will be responsible for the spacecraft infrastructure and robotic arms for the project.
According to John Lymer, chief roboticist at Maxar, most satellites fail because they run out of fuel.
“You’re retiring a perfectly good satellite because it ran out of gas,” he said.
“Ou Ma, who I’ve worked with for many years, works on rendezvous and proximity operations. There are all kinds of technical solutions out there. Some will be better than others. It’s about getting operational experience to find out whose algorithms are better and what reduces operational risk the most.”
AI Makes it Easier for Drones to Scan and Excavate Terrain
Researchers from Aarhus University (AU) and the Technical University of Denmark (DTU) have collaborated on a project that aims to decrease the costs of measuring and documenting gravel and limestone quarries, while at the same time being faster and easier than the traditional method.
The project used artificial intelligence (AI) to take over from the human pilots who currently control the drones that perform the task.
Erdal Kayacan is an associate professor and expert in artificial intelligence and drones at the Department of Engineering at Aarhus University.
“We’ve made the entire process completely automatic. We tell the drone where to start, and the width of the wall or rock face we want to photograph, and then it flies zig-zag all the way along and lands automatically,” says Kayacan.
Limitations of Human-Controlled Drones
The current method of measuring and documenting gravel and limestone quarries, cliff faces, and other natural and human-made formations relies on drones to photograph the area. A computer then processes the recordings and automatically builds a 3D terrain model.
One downside of this method is that drone pilots are expensive and the measurements are time-consuming. In an excavation, the pilot has to keep the drone at a constant distance from the wall while also keeping the camera perpendicular to it, making the flight a complex and demanding task.
For the computer to reconstruct a 3D model from the images, consecutive photographs must overlap by a specific amount. Maintaining this overlap is the main process the artificial intelligence automated, drastically reducing the complexity of the task.
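The overlap requirement translates directly into waypoint spacing along the wall. The following sketch shows the geometry; the distance, field of view, and overlap fraction are illustrative assumptions, since the article doesn't give the project's camera settings.

```python
import math

# Illustrative numbers only -- not the project's actual camera or overlap
# settings. Photogrammetry needs a fixed overlap between consecutive images;
# the required spacing between shots follows from the image footprint.

def photo_spacing(distance_m, fov_deg, overlap):
    """Spacing between shots so consecutive images share `overlap` (0..1)."""
    footprint = 2 * distance_m * math.tan(math.radians(fov_deg) / 2)
    return footprint * (1 - overlap)

# Drone 10 m from the wall, 60-degree horizontal field of view,
# 75% overlap between consecutive photos (assumed values).
spacing = photo_spacing(10, 60, 0.75)
print(round(spacing, 2))  # prints 2.89 (metres between camera positions)
```

Holding this spacing, plus a constant stand-off distance and a perpendicular camera, is exactly the multi-constraint flying task that makes manual piloting hard and that the AI automates.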
“Our algorithm ensures that the drone always keeps the same distance to the wall and that the camera constantly repositions itself perpendicular to the wall. At the same time, our algorithm predicts the wind forces acting on the drone body,” says Kayacan.
AI Overcomes Wind Problem
The artificial intelligence also helps overcome the wind, which is one of the biggest challenges with autonomous drone flight.
Mohit Mehndiratta is a visiting Ph.D. student in the Department of Engineering at Aarhus University.
“The designed Gaussian process model also predicts the wind to be encountered in the near future. This implies that the drone can get ready and take the corrective actions beforehand,” says Mehndiratta.
When a human pilot flies this task, even a light breeze can push the drone off course. With the new technology, wind gusts and the overall wind speed can be accounted for.
“The drone doesn’t actually measure the wind, it estimates the wind on the basis of input it receives as it moves. This means that the drone responds to the force of the wind, just like when we human beings correct our movements when we are exposed to a strong wind,” says Kayacan.
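The idea of estimating wind from motion rather than measuring it can be sketched as a simple one-dimensional disturbance observer: any force the commanded thrust doesn't explain is attributed to wind. This is a crude stand-in for the team's Gaussian-process model (whose details aren't given in the article); the mass and filter gain are assumed values.

```python
# Toy 1-D disturbance observer: infer wind force as the residual between
# commanded force and observed acceleration, smoothed over time.

MASS = 1.5   # kg, assumed drone mass
ALPHA = 0.2  # exponential filter gain, assumed

def update_estimate(prev_estimate, commanded_force, observed_accel):
    """Force not explained by the command is attributed to wind."""
    residual = MASS * observed_accel - commanded_force
    return (1 - ALPHA) * prev_estimate + ALPHA * residual

# Simulate a constant 0.9 N headwind the controller doesn't know about.
true_wind = -0.9
estimate = 0.0
for _ in range(50):
    commanded = 2.0                            # N, thrust command
    observed = (commanded + true_wind) / MASS  # acceleration the IMU reports
    estimate = update_estimate(estimate, commanded, observed)

print(round(estimate, 2))  # prints -0.9: estimate converges to the wind force
```

This matches Kayacan's description: the drone never senses the wind directly, it infers it from how its motion deviates from what its inputs should have produced, then corrects for it.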
The research was completed in collaboration with the Danish Hydrocarbon Research and Technology Centre at DTU, and the results of the project will be presented in May 2020 at the European Control Conference.
Researchers Create Soft Robot Able to Change Shape and Roam
One of the challenges in soft robotics is that most soft robots must be tethered to an air compressor or plugged into a wall. Researchers from Stanford set out to overcome this limitation.
Nathan Usevitch is a graduate student in mechanical engineering at Stanford.
“A significant limitation of most soft robots is that they have to be attached to a bulky air compressor or plugged into a wall, which prevents them from moving,” said Usevitch. “So, we wondered: What if we kept the same amount of air within the robot all the time?”
The team developed a human-scale soft robot capable of changing its shape. This lets the robot latch onto and handle objects and roll in controllable directions.
The research was published in Science Robotics on March 18.
“The casual description of this robot that I give to people is Baymax from the movie Big Hero 6 mixed with Transformers. In other words, a soft, human-safe robot mixed with robots that can dramatically change their shape,” said Usevitch.
This soft robot was developed by combining three different types of robots. The simple version of the team’s invention is called an “isoperimetric robot,” since the shape changes while the total length of the edges and the amount of air inside stays the same.
The isoperimetric robot was developed out of soft robots, truss robots, and collective robots. Each category of robotics brought a different advantage: soft robots are lightweight and compliant, truss robots can change shape, and collective robots are small and collaborate.
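The isoperimetric constraint can be illustrated numerically: when a motor shifts tube length from one edge of a triangle to the others, the perimeter (and hence the enclosed air) is unchanged even though the shape, and therefore the area, changes. The edge lengths below are illustrative, not from the paper.

```python
import math

# Hedged sketch of the "isoperimetric" idea: motors redistribute tube length
# between a triangle's edges while the total perimeter stays fixed.
# Numbers are illustrative, not from the paper.

def triangle_area(a, b, c):
    """Heron's formula for a triangle with edge lengths a, b, c."""
    s = (a + b + c) / 2
    return math.sqrt(s * (s - a) * (s - b) * (s - c))

perimeter = 3.0  # total tube length in metres (assumed)

before = (1.0, 1.0, 1.0)    # equilateral starting shape
after = (1.3, 0.85, 0.85)   # motor shifted 0.3 m onto one edge

assert abs(sum(before) - perimeter) < 1e-9
assert abs(sum(after) - perimeter) < 1e-9  # perimeter is unchanged

print(round(triangle_area(*before), 3),
      round(triangle_area(*after), 3))  # prints 0.433 0.356
```

Because no air is pumped in or out, only small motors moving material along the tube are needed to reshape the robot, which is the efficiency advantage Hawkes describes below.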
Sean Follmer is an assistant professor of mechanical engineering and co-senior author of the paper.
“We’re basically manipulating a soft structure with traditional motors,” said Follmer. “It makes for a really interesting class of robots that combines many of the benefits of soft robots with all of the knowledge we have about more classic robots.”
The team also developed a more complex version of the robot by attaching several triangles together. They were able to coordinate the movements of the different motors, which allowed the robot to carry out desired behaviors, such as picking up a ball.
Elliot Hawkes is an assistant professor of mechanical engineering at the University of California, Santa Barbara and co-senior author of the paper.
“A key understanding we developed was that to create motion with a large, soft pneumatic robot, you don’t actually need to pump air in and out,” said Hawkes. “You can use the air you already have and just move it around with these simple motors; this method is more efficient and lets our robot move much more quickly.”
According to Zachary Hammond, a graduate student in mechanical engineering at Stanford and co-lead author of the paper, one of the possible uses for this soft robot is space exploration.
“This robot could be really useful for space exploration — especially because it can be transported in a small package and then operates untethered after it inflates,” said Hammond. “On another planet, it could use its shape-changing ability to traverse complicated environments, squeezing through tight spaces and spreading over obstacles.”
The researchers are now trying out different shapes, and they want to test the robot in water.
Allison Okamura is a professor of mechanical engineering and co-author of the paper.
“This research highlights the power of thinking about how to design and build robots in new ways,” said Okamura. “The creativity of robot design is expanding with this type of system and that’s something we’d really like to encourage in the robotics field.”
Marcio Macedo, Co-Founder of Ava Robotics – Interview Series
Marcio Macedo is Co-Founder and VP of Product and Marketing at Ava Robotics, a recent spin-off of iRobot that focuses on autonomous navigating robots for enterprise, commercial and industrial environments.
Having previously worked at iRobot, what were some of the interesting projects that you worked on?
At iRobot we were fortunate to be designing and pioneering applications of telepresence, including an FDA-certified telemedicine robot for intensive care environments and the Ava telepresence product in partnership with Cisco.
Ava Robotics is a spinoff of iRobot, what was the inspiration behind launching a new company instead of keeping it in the iRobot family?
With iRobot’s strategic focus shifting to home products, Ava Robotics spun off to operate independently and better address the needs of our nascent markets. As an independent company we gain more flexibility in meeting our customers’ needs while enjoying the support of technology developed originally at iRobot.
The Ava Telepresence robot can be remotely controlled by users and features autonomous technology to have the robot simply move itself to a designated area. Could you walk us through the machine learning that is used to have the robot navigate through an environment without bumping into new objects?
When an Ava is installed at a location it learns its operating environment and creates a realistic topology map of the site. This map can be further annotated to force specific behaviors, such as speed zones, keep-out zones, etc.
Ava has built-in obstacle detection and obstacle avoidance (ODOA) capabilities, which leverage multiple sensors in the robot body so that Ava will not bump into people or objects in its path. Furthermore, if the most direct path to its destination is blocked, the Ava will search for and navigate through an alternative path if one is available.
What are the navigation sensors that are used, is it reliant on LiDAR or regular cameras?
Ava’s robotic navigation technologies use a variety of sensors (3-D cameras, LiDAR, and IMU) and they are combined for all actions, such as localization, planning, collision avoidance, cliff detections, etc. We operate in medium- and large-size spaces, so we think LiDAR is a very valuable part of a sensing package for real-world commercial spaces.
The telepresence robot looks like it would be extremely useful in the hospitality sector. Could you walk us through some of these potential use-cases?
Visits to Executive Briefing Centers provide access to senior-level executives and deliver value in the form of hands-on briefings, strategy reviews, product demonstrations and opportunities for relationship building. Customer Experience Centers offer organizations the opportunity to wow customers and show off their latest products and services. But with so many busy schedules, getting the right people to attend is not always easy.
For meeting planners, Ava provides the ability to “walk” the hotel and visit the meeting spaces, conference rooms and ballrooms that are available for their conference or event. In this application, the property’s sales and marketing team gain a unique tool to accelerate their sales cycles.
When invitees and guests can’t get to the conference or event, Ava allows them to attend and move around as if they were there. Whether it’s a business meeting, conference exhibit hall, or social event, Ava provides an immersive experience with freedom to move around.
What are some of the use-cases that are being targeted in the corporate sector?
Businesses benefit from Ava in many ways. The robot allows freedom of movement and access to meetings, corporate training, factory inspections, manufacturing sites, labs and customer experience settings.
Natural, face-to-face, ad-hoc conversations are critical to moving a business forward. Yet today’s globally distributed businesses have employees telecommuting from home or from across the world, who miss these vital interactions. With Ava, you unlock the ability to bring everyone back together as if they’re sitting together in the office and can walk up and interact naturally.
Use Case examples include:
- Agile Product Development: Agile product development teams come together for scheduled and unscheduled meetings, looking to achieve high levels of collaboration and communication. When remote workers are part of the team, existing collaboration tools are challenged to meet the need. With Ava, remote team members can actively participate in stand-up meetings, sprint planning and demos, and project reviews as if they were co-located with the team.
- Manufacturing: In manufacturing, remote visits by management, collaboration between experts at headquarters and staff at the plant, and remote tours by customers or suppliers are frequent – and necessary – events. Ava increases collaboration between those on the design team or in engineering and those building and delivering the final product on the plant floor. Also, imagine that the manufacturing plant is experiencing a production-line problem, but the person who knows how to fix it is thousands of miles away. In such a case, the technician needs to freely walk to different parts of the manufacturing floor to meet with someone or see something. Ava can help by delivering that critical physical presence right to the factory floor. Ava allows the remote person to immediately connect via the robot as if she were physically present, put eyes on the problem, and communicate with the local team on the floor. As a result, she can deliver immediate insight into the problem and quickly resolve the issue.
- Laboratories and Clean Rooms: Those who work in laboratories and clean rooms work hard to ensure they are kept sterile and clean. While necessary, this can be a time-consuming process for employees entering and leaving these spaces repeatedly during the day. Due to the risks of potential contamination, companies often limit tours by customers and other visitors. Ava brings people right into a laboratory or a clean room without compromising the space. With Ava, remote visitors can easily move around as if they were there in person, observing the work being done and speaking with employees.
Ava Robotics recently partnered with Qatar Airways to Introduce Smart Airport Technologies at QITCOM 2019. Could you share with us some details in regards to this event and how those in attendance reacted?
We have been fortunate to work with Hamad International Airport in Qatar and Qatar Airways, via our strategic partner Cisco, building applications for robots in airports for a variety of use cases. Showing our work at QITCOM 2019 was a good opportunity to expose the IT community to the applications that are now possible across different verticals and industries.
Is there anything else that you would like to share about Ava Robotics?
In these times of challenges to global travel, we have seen increased demand for solutions like telepresence robotics. Customers are just beginning to realize the potential of applications that truly empower remote workers to collaborate as if they were physically present at a distant location.