
Robotics

Facebook Creates Method That May Allow AI Robots To Navigate Without a Map


Facebook has recently created an algorithm that enhances an AI agent’s ability to navigate an environment, letting the agent determine the shortest route through new environments without access to a map. While mobile robots typically have a map programmed into them, the new algorithm that Facebook designed could enable the creation of robots that can navigate environments without the need for maps.

According to a post by Facebook researchers, a major challenge in robot navigation is endowing AI systems with the ability to navigate through novel environments and reach programmed destinations without a map. To tackle this challenge, Facebook created a reinforcement learning algorithm distributed across multiple learners, called decentralized distributed proximal policy optimization (DD-PPO). DD-PPO was given only compass data, GPS data, and access to an RGB-D camera, yet was reportedly able to navigate a virtual environment and reach a goal without any map data.
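
A minimal sketch of what "distributed PPO" looks like in code may help here. The snippet below implements the clipped PPO objective for a toy linear-softmax policy and averages gradients across several simulated workers; the random rollouts and in-process "workers" stand in for the GPU processes and all-reduce used in the real DD-PPO system, so treat it as an illustration of the idea rather than Facebook's implementation.

```python
# Toy sketch of distributed PPO: each "worker" collects its own rollout,
# computes the gradient of the clipped PPO objective, and the gradients are
# averaged (standing in for an all-reduce). All data here is random.
import numpy as np

def ppo_clip_grad(theta, states, actions, advantages, old_logp, eps=0.2):
    """Gradient of the clipped PPO surrogate for a linear-softmax policy."""
    logits = states @ theta                                  # (N, n_actions)
    logits -= logits.max(axis=1, keepdims=True)
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    logp = np.log(probs[np.arange(len(actions)), actions])
    ratio = np.exp(logp - old_logp)                          # pi_new / pi_old
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps)
    active = ratio * advantages <= clipped * advantages      # where the unclipped term is the min
    weight = ratio * advantages * active
    onehot = np.eye(probs.shape[1])[actions]
    grad = states.T @ ((onehot - probs) * weight[:, None])   # d(objective)/d(theta)
    return -grad / len(actions)                              # gradient of the loss

rng = np.random.default_rng(0)
n_workers, n_steps, obs_dim, n_actions = 4, 64, 8, 3
theta = np.zeros((obs_dim, n_actions))

for update in range(10):
    grads = []
    for _ in range(n_workers):                               # each worker: its own rollout
        states = rng.normal(size=(n_steps, obs_dim))
        actions = rng.integers(0, n_actions, size=n_steps)
        advantages = rng.normal(size=n_steps)
        old_logp = np.full(n_steps, -np.log(n_actions))      # log-prob under a uniform old policy
        grads.append(ppo_clip_grad(theta, states, actions, advantages, old_logp))
    theta -= 0.01 * np.mean(grads, axis=0)                   # stands in for the all-reduce step
```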

According to the researchers, the agents were trained in virtual environments like office buildings and houses. The resulting algorithm was capable of navigating a simulated indoor environment, choosing the correct fork in a path, and quickly recovering from errors if it chose the wrong path. The virtual environment results were promising, and it’s important that the agents are able to reliably navigate these common environments, as in the real world an agent could damage itself or its surroundings if it fails.

The Facebook research team explained that the focus of their project was assistive robots, as proper, reliable navigation for assistive robots and AI agents is essential. The research team explained that navigation is essential for a wide variety of assistive AI systems, from robots that perform tasks around the house to AI-driven devices that help people with visual impairments. The research team also argued that AI creators should move away from map usage in general, as maps are often outdated as soon as they are drawn, and real-world environments are constantly changing and evolving.

As TechXplore reported, the Facebook research team made use of the open-source AI Habitat platform, which enabled them to train embodied agents in photorealistic 3-D environments in a timely fashion. The platform provided access to a set of simulated environments realistic enough that the data generated by the AI model can be applied to real-world cases. Douglas Heaven in MIT Technology Review explained the intensity of the model’s training:

“Facebook trained bots for three days inside AI Habitat, a photorealistic virtual mock-up of the interior of a building, with rooms and corridors and furniture. In that time they took 2.5 billion steps—the equivalent of 80 years of human experience.”

Due to the sheer complexity of the training task, the researchers reportedly culled the weak learners as training continued in order to speed up training time. The research team hopes to take the current model further and create algorithms that can navigate complex environments using only camera data, dropping the GPS and compass inputs, since indoors those signals can be thrown off, be too noisy, or simply be unavailable.
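
The post does not spell out how that culling worked. One common pattern in synchronous distributed training is to preempt straggling workers once a fixed fraction of them have finished collecting experience, so that a single slow simulation does not hold up the whole update; the sketch below illustrates that idea with made-up timings and is not the researchers' actual setup.

```python
# Hypothetical illustration of straggler preemption: once 60% of workers have
# finished their rollouts, the rest stop early and contribute partial rollouts.
# All timings are simulated; nothing here comes from the DD-PPO codebase.
import numpy as np

rng = np.random.default_rng(1)
n_workers, target_steps, preempt_frac = 8, 128, 0.6

step_times = rng.lognormal(mean=0.0, sigma=0.5, size=(n_workers, target_steps))
finish_times = step_times.cumsum(axis=1)              # time at which each worker reaches each step

n_done = int(np.ceil(preempt_frac * n_workers))       # workers required to finish fully
cutoff = np.sort(finish_times[:, -1])[n_done - 1]     # moment when that many have finished

steps_collected = (finish_times <= cutoff).sum(axis=1)   # stragglers are cut off at the deadline
print("steps per worker:", steps_collected)
print("wall-clock without preemption:", round(finish_times[:, -1].max(), 2))
print("wall-clock with preemption:   ", round(cutoff, 2))
```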

While the technology has yet to be tested outdoors and has trouble navigating over long distances, the development of the algorithm is an important step toward the next generation of robots, especially delivery drones and robots that operate in offices or homes.


Robotics

AI Makes it Easier for Drones to Scan and Excavate Terrain


Researchers from Aarhus University (AU) and the Technical University of Denmark (DTU) have collaborated on a project that aims to decrease the costs of measuring and documenting gravel and limestone quarries, while at the same time being faster and easier than the traditional method. 

The project used artificial intelligence (AI) to take over the drones that are currently flown by human pilots to complete the task. 

Erdal Kayacan is an associate professor and expert in artificial intelligence and drones at the Department of Engineering at Aarhus University. 

“We’ve made the entire process completely automatic. We tell the drone where to start, and the width of the wall or rock face we want to photograph, and then it flies zig-zag all the way along and lands automatically,” says Kayacan.

Limitations of Human-Controlled Drones

The current method of measuring and documenting gravel and limestone quarries, cliff faces, and other natural and human-made formations relies on drones to photograph the area. A computer then processes the images and automatically converts them into a 3D terrain model.

One of the downsides of this method is that drone pilots cost a lot, and the measurements are time-consuming. In an excavation, the drone pilot has to make sure that the drone keeps a constant distance from the wall. At the same time, the drone camera has to be kept perpendicular to the wall, making it a complex and difficult task. 

In order for the computer to convert and create a 3D figure out of the images, there has to be a specific overlap in the images. This is the main process that was automated by artificial intelligence, and it drastically reduced the complexity of completing the task. 
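
As a back-of-the-envelope illustration of that overlap requirement, the footprint each photo covers on the wall follows from the camera's field of view and the distance to the wall, and the allowed spacing between shots follows from the desired overlap. The numbers below are invented, not the project's parameters.

```python
# Rough geometry of image overlap for wall scanning: footprint per photo and
# the maximum spacing between photos for a given overlap. Numbers are illustrative.
import math

def footprint_width(distance_m, horizontal_fov_deg):
    """Width of wall covered by one photo taken perpendicular to the wall."""
    return 2.0 * distance_m * math.tan(math.radians(horizontal_fov_deg) / 2.0)

def spacing_for_overlap(distance_m, horizontal_fov_deg, overlap_frac):
    """Maximum distance between consecutive photos to keep the required overlap."""
    return (1.0 - overlap_frac) * footprint_width(distance_m, horizontal_fov_deg)

# Example: 5 m from the wall, an 84-degree lens, 80% overlap between images.
print(round(footprint_width(5.0, 84.0), 2), "m of wall per photo")        # ~9.0 m
print(round(spacing_for_overlap(5.0, 84.0, 0.80), 2), "m between photos")  # ~1.8 m
```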

“Our algorithm ensures that the drone always keeps the same distance to the wall and that the camera constantly repositions itself perpendicular to the wall. At the same time, our algorithm predicts the wind forces acting on the drone body,” says Kayacan.
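
The two goals in that quote, holding a fixed stand-off distance and keeping the camera perpendicular to the wall, can be sketched with a simple proportional controller. The gains, toy dynamics, and starting conditions below are assumptions for illustration, not the Aarhus/DTU control or wind-prediction algorithm.

```python
# Minimal wall-following sketch: proportional control on distance-to-wall and
# on the yaw error from perpendicular. Gains and dynamics are invented.
TARGET_DIST = 5.0            # desired stand-off distance from the wall, metres
K_DIST, K_YAW = 0.8, 1.5     # proportional gains (illustrative)

def control_step(dist_to_wall, yaw_error_rad):
    """Return (lateral velocity toward wall, yaw rate) from two measurements."""
    lateral_v = K_DIST * (dist_to_wall - TARGET_DIST)    # close the distance error
    yaw_rate = -K_YAW * yaw_error_rad                    # rotate back to perpendicular
    return lateral_v, yaw_rate

# Toy simulation: the drone starts 7 m out and 0.3 rad off-perpendicular.
dist, yaw_err, dt = 7.0, 0.3, 0.1
for _ in range(50):
    v, w = control_step(dist, yaw_err)
    dist -= v * dt           # moving toward the wall reduces the distance
    yaw_err += w * dt
print(round(dist, 2), "m,", round(yaw_err, 4), "rad")    # converges toward 5.0 m and 0 rad
```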

AI Overcomes Wind Problem

The artificial intelligence also helps overcome the wind, which is one of the biggest challenges with autonomous drone flight. 

Mohit Mehndiratta is a visiting Ph.D. student in the Department of Engineering at Aarhus University.

“The designed Gaussian process model also predicts the wind to be encountered in the near future. This implies that the drone can get ready and take the corrective actions beforehand,” says Mehndiratta.

When a human-controlled drone is completing this task, even a light breeze can alter its course. With the new technology, wind gusts and the overall wind speed can be accounted for. 

“The drone doesn’t actually measure the wind, it estimates the wind on the basis of input it receives as it moves. This means that the drone responds to the force of the wind, just like when we human beings correct our movements when we are exposed to a strong wind,” says Kayacan.
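
As a rough illustration of the Gaussian-process idea mentioned above, the sketch below fits a GP to a short history of estimated wind forces and extrapolates a few seconds ahead, returning a mean prediction and an uncertainty for each future time. The kernel, noise level, and data are invented and considerably simpler than the published model.

```python
# Toy Gaussian-process regression: predict near-future wind force from recent
# in-flight estimates. Kernel, noise, and data are illustrative assumptions.
import numpy as np

def rbf(a, b, length=1.0, var=1.0):
    """Squared-exponential kernel between two 1-D arrays of time stamps."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

# Estimated wind force (N) over the last ten seconds of flight (simulated).
t = np.arange(10.0)
wind = 2.0 + 0.5 * np.sin(0.6 * t) + 0.05 * np.random.default_rng(2).normal(size=t.size)

noise = 1e-2
K = rbf(t, t) + noise * np.eye(t.size)
t_future = np.array([10.0, 11.0, 12.0])          # predict a few seconds ahead
K_s = rbf(t_future, t)

mean = K_s @ np.linalg.solve(K, wind)            # GP posterior mean at future times
cov = rbf(t_future, t_future) - K_s @ np.linalg.solve(K, K_s.T)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # guard against tiny negative values
print("predicted wind:", mean.round(2), "+/-", std.round(2))
```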

The research was completed in collaboration with the Danish Hydrocarbon Research and Technology Centre at DTU, and the results of the project will be presented in May 2020 at the European Control Conference. 

 


Robotics

Researchers Create Soft Robot Able to Change Shape and Roam


One of the challenges surrounding soft robotics is that most soft robots must be attached to an air compressor or plugged into a wall. Researchers from Stanford set out to overcome this challenge. 

Nathan Usevitch is a graduate student in mechanical engineering at Stanford. 

“A significant limitation of most soft robots is that they have to be attached to a bulky air compressor or plugged into a wall, which prevents them from moving,” said Usevitch. “So, we wondered: What if we kept the same amount of air within the robot all the time?”

The team was able to develop a human-scale soft robot that is capable of changing its shape. By doing this, the soft robot can latch onto and handle objects, and it is able to roll in controllable directions. 

The research was published in Science Robotics on March 18. 

“The casual description of this robot that I give to people is Baymax from the movie Big Hero 6 mixed with Transformers. In other words, a soft, human-safe robot mixed with robots that can dramatically change their shape,” said Usevitch.

Simple Version

This soft robot was developed by combining three different types of robots. The simple version of the team’s invention is called an “isoperimetric robot,” since the shape changes while the total length of the edges and the amount of air inside stays the same. 

The isoperimetric robot was developed out of soft robots, truss robots, and collective robots. Each category of robotics brought a different advantage: soft robots are lightweight and compliant, truss robots can change shape, and collective robots are small and collaborate. 

Sean Follmer is an assistant professor of mechanical engineering and co-senior author of the paper. 

“We’re basically manipulating a soft structure with traditional motors,” said Follmer. “It makes for a really interesting class of robots that combines many of the benefits of soft robots with all of the knowledge we have about more classic robots.”

Complex Version

The team also developed a more complex version of the robot by attaching several triangles together. They were able to coordinate the movements of the different motors, which allowed the robot to carry out desired behaviors, such as picking up a ball. 

Elliot Hawkes is an assistant professor of mechanical engineering at the University of California, Santa Barbara and co-senior author of the paper. 

“A key understanding we developed was that to create motion with a large, soft pneumatic robot, you don’t actually need to pump air in and out,”  said Hawkes. “You can use the air you already have and just move it around with these simple motors; this method is more efficient and lets our robot move much more quickly.”

Space Exploration

According to Zachary Hammond, a graduate student in mechanical engineering at Stanford and co-lead author of the paper, one of the possible uses for this soft robot is space exploration.

“This robot could be really useful for space exploration — especially because it can be transported in a small package and then operates untethered after it inflates,” said Hammond. “On another planet, it could use its shape-changing ability to traverse complicated environments, squeezing through tight spaces and spreading over obstacles.”

The researchers are now trying out different shapes, and they want to test the robot in water. 

Allison Okamura is a professor of mechanical engineering and co-author of the paper. 

“This research highlights the power of thinking about how to design and build robots in new ways,” said Okamura. “The creativity of robot design is expanding with this type of system and that’s something we’d really like to encourage in the robotics field.”

 


Interviews

Marcio Macedo, Co-Founder of Ava Robotics – Interview Series


Marcio Macedo is Co-Founder and VP of Product and Marketing at Ava Robotics, a recent spin-off of iRobot that focuses on autonomously navigating robots for enterprise, commercial, and industrial environments.

Having previously worked at iRobot, what were some of the interesting projects that you worked on?

At iRobot we were fortunate to be designing and pioneering applications of telepresence, including an FDA-certified telemedicine robot for intensive care environments and the Ava telepresence product in partnership with Cisco.

 

Ava Robotics is a spinoff of iRobot, what was the inspiration behind launching a new company instead of keeping it in the iRobot family?

With iRobot’s strategic focus shifting to home products, Ava Robotics spun off to operate independently and better address the needs of our nascent markets. As an independent company we gain more flexibility in meeting our customers’ needs while enjoying the support of technology developed originally at iRobot.

 

The Ava Telepresence robot can be remotely controlled by users and features autonomous technology to have the robot simply move itself to a designated area. Could you walk us through the machine learning that is used to have the robot navigate through an environment without bumping into new objects?

When an Ava is installed at a location it learns its operating environment and creates a realistic topology map of the site. This map can be further annotated to force specific behaviors, such as speed zones, keep-out zones, etc.

Ava has built-in obstacle detection and obstacle avoidance (ODOA) capabilities, which leverage multiple sensors in the robot body so that Ava will not bump into people or objects in its path. Furthermore, if the most direct path to its destination is blocked, the Ava will search for and navigate through an alternative path if one is available.
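
Ava's planner itself is not public, but the behavior described here, planning over an annotated map and searching for an alternative route when the direct path is blocked, can be illustrated with a generic grid search. The breadth-first-search sketch below uses a toy grid with keep-out cells; none of the names, numbers, or data come from Ava Robotics.

```python
# Generic illustration of planning and replanning on a grid with keep-out cells.
# This is not Ava Robotics' algorithm; it only mirrors the described behavior.
from collections import deque

def plan(grid, start, goal):
    """Breadth-first search; returns a list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev, frontier = {start: None}, deque([start])
    while frontier:
        cell = frontier.popleft()
        if cell == goal:                      # walk the chain of predecessors back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cell
                frontier.append(nxt)
    return None                               # no route exists

# 0 = free space, 1 = keep-out zone annotated on the map.
grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
print("initial route:  ", plan(grid, (0, 0), (2, 3)))

grid[2][1] = 1                                # an obstacle appears on the planned route
print("replanned route:", plan(grid, (2, 0), (2, 3)))   # replan from the robot's current cell
```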

 

What are the navigation sensors that are used, is it reliant on LiDAR or regular cameras?

Ava’s robotic navigation technologies use a variety of sensors (3-D cameras, LiDAR, and IMU) and they are combined for all actions, such as localization, planning, collision avoidance, cliff detections, etc. We operate in medium- and large-size spaces, so we think LiDAR is a very valuable part of a sensing package for real-world commercial spaces.

 

The telepresence robot looks like it would be extremely useful in the hospitality sector. Could you walk us through some of these potential use-cases?

Visits to Executive Briefing Centers provide access to senior-level executives and deliver value in the form of hands-on briefings, strategy reviews, product demonstrations and opportunities for relationship building. Customer Experience Centers offer organizations the opportunity to wow customers and show off their latest products and services. But with so many busy schedules, getting the right people to attend is not always easy.

For meeting planners, Ava provides the ability to “walk” the hotel and visit the meeting spaces, conference rooms and ballrooms that are available for their conference or event. In this application, the property’s sales and marketing team gain a unique tool to accelerate their sales cycles.

When invitees and guests can’t get to the conference or event, Ava allows them to attend and move around as if they were there. Whether it’s a business meeting, conference exhibit hall, or social event, Ava provides an immersive experience with freedom to move around.

 

What are some of the use-cases that are being targeted in the corporate sector?

Businesses benefit from Ava in many ways. The robot allows freedom of movement and access to meetings, corporate training, factory inspections, manufacturing sites, labs and customer experience settings.

Natural, face-to-face, ad-hoc conversations are critical to moving a business forward. Yet today’s globally distributed businesses have employees telecommuting from home or from across the world, who miss these vital interactions. With Ava, you unlock the ability to bring everyone back together as if they’re sitting together in the office and can walk up and interact naturally.

Use Case examples include:

  • Agile Product Development: Agile product development teams come together for scheduled and unscheduled meetings, looking to achieve high levels of collaboration and communication. When remote workers are part of the team, existing collaboration tools are challenged to meet the need. With Ava, remote team members can actively participate in stand-up meetings, sprint planning and demos, and project reviews as if they were co-located with the team.
  • Manufacturing: In manufacturing, remote visits by management, collaboration between experts at headquarters and staff at the plant, and remote tours by customers or suppliers are frequent – and necessary – events. Ava increases collaboration between those on the design team or in engineering and those building and delivering the final product on the plant floor. Also, imagine that the manufacturing plant is experiencing a production-line problem, but the person who knows how to fix it is thousands of miles away. In such a case, the technician needs to freely walk to different parts of the manufacturing floor to meet with someone or see something. Ava can help by delivering that critical physical presence right to the factory floor. Ava allows the remote person to immediately connect via the robot as if she were physically present, put eyes on the problem, and communicate with the local team on the floor. As a result, she can deliver immediate insight into the problem and quickly resolve the issue.
  • Laboratories and Clean Rooms: Those who work in laboratories and clean rooms work hard to ensure they are kept sterile and clean. While necessary, this can be a time-consuming process for employees entering and leaving these spaces repeatedly during the day. Due to the risks of potential contamination, companies often limit tours by customers and other visitors. Ava brings people right into a laboratory or a clean room without compromising the space. With Ava, remote visitors can easily move around as if they were there in person, observing the work being done and speaking with employees.

 

Ava Robotics recently partnered with Qatar Airways to introduce smart airport technologies at QITCOM 2019. Could you share some details regarding this event and how those in attendance reacted?

We have been fortunate to work with Hamad International Airport in Qatar and Qatar Airways via our strategic partner Cisco, building applications for robots in airports for a variety of use cases. Showing our work at QITCOM 2019 was a good opportunity to expose the IT community to the applications that are now possible across different verticals and industries.

 

Is there anything else that you would like to share about Ava Robotics?

In these times of challenges to global travel, we have seen increased demand for solutions like telepresence robotics. Customers are just beginning to realize the potential of applications that truly empower remote workers to collaborate as if they were physically present at a distant location.

To learn more, visit Ava Robotics.
