
Robotics

Robotic Fish Created to Control Invasive Species


NYU Tandon School of Engineering Professor Maurizio Porfiri and his team of researchers from NYU Tandon and the University of Western Australia have demonstrated how robotic fish can be used to fight one of the world's most problematic invasive species, the mosquitofish. Controlling invasive species is a challenge for countries around the world, and some of the biggest problems arise in lakes and rivers, where fish and other species face barriers that prevent them from escaping predators. These new developments, from a laboratory that studies biomimetic robots and animal behavior, could provide solutions to big problems everywhere.

Mosquitofish populations have soared in freshwater lakes and rivers around the world, causing serious problems for native fish and amphibian species. Traditional control methods, such as toxicants and trapping, are often unsuccessful or harmful to local wildlife.

The team of researchers led by Porfiri conducted experiments to see whether a biologically inspired robotic fish could cause fear-related changes in mosquitofish behavior. They concluded that even brief exposure to a robotic largemouth bass, the mosquitofish's biggest predator, caused stress responses: the fish avoided certain behaviors and underwent physiological changes tied to the loss of energy reserves. These changes may in turn reduce reproduction rates among the fish.

The paper was published in the Journal of the Royal Society Interface. It is titled “Behavioral and Life-History Responses of Mosquitofish to Biologically Inspired and Interactive Robotic Predators.” 

“To the best of our knowledge, this is the first study using robots to evoke fear responses in this invasive species,” Porfiri said. “The results show that a robotic fish that closely replicates the swimming patterns and visual appearance of the largemouth bass has a powerful, lasting impact on mosquitofish in the lab setting.”

The experiments involved exposing groups of mosquitofish to a robotic largemouth bass for 15 minutes, once per week for six consecutive weeks. Each session involved a different behavior from the robotic bass. In some of the trials, the robotic fish was programmed to provide real-time feedback, mimicking predatory behavior by swimming at a faster speed. The researchers analyzed the data and found links between the robot's degree of biomimicry and the stress levels of the live mosquitofish. The fear-related behaviors that the mosquitofish exhibited included freezing in place, increased wariness of unfamiliar and unexplored spaces, and erratic swimming patterns.
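The real-time feedback described above can be pictured as a simple proximity-triggered controller: the robot tracks the live fish and accelerates into an "attack" when they come close. The sketch below is purely illustrative; the speeds, threshold, and function names are assumptions, not parameters from the study.

```python
import math

# Hypothetical closed-loop controller for a robotic predator. All names,
# thresholds, and speeds below are illustrative, not the study's values.
CRUISE_SPEED = 0.05   # m/s, calm patrolling
ATTACK_SPEED = 0.20   # m/s, burst toward prey
ATTACK_RADIUS = 0.30  # m, distance at which the robot "attacks"

def distance(a, b):
    """Euclidean distance between two (x, y) positions in the tank."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def robot_speed(robot_pos, prey_pos):
    """Pick the robot's speed from real-time tracking of the live fish."""
    if distance(robot_pos, prey_pos) < ATTACK_RADIUS:
        return ATTACK_SPEED  # mimic a predatory lunge
    return CRUISE_SPEED      # otherwise swim naturally
```

In this toy model, the "degree of biomimicry" the researchers varied would correspond to how closely such a control rule matches a real bass's hunting behavior.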

Another area the researchers focused on was the physiological parameters of the stress response. They measured each fish weekly to track changes in weight and length. Weight loss can be a response to predators, and it leaves the fish with lower energy reserves. Such fish do not survive as long and have less energy to devote to reproduction.

Whenever the robotic bass mimicked aggressive attack behavior, the mosquitofish showed the highest levels of behavioral and physiological stress responses.

“Further studies are needed to determine if these effects translate to wild populations, but this is a concrete demonstration of the potential of robotics to solve the mosquitofish problem,” said Giovanni Polverino, Forrest Fellow in the Department of Biological Sciences at the University of Western Australia and the lead author of the paper. “We have a lot more work going on between our schools to establish new, effective tools to combat the spread of invasive species.”

The Dynamical Systems Laboratory, run by Porfiri, often pairs biomimetic robots with live fish to study collective animal behavior, including mating preferences and leadership. Another benefit of this approach is that it reduces the need to use live animals in experiments.

These developments are yet another example of robotics making its way into every aspect of society and the environment. It isn't hard to imagine a future in which robotic fish, dedicated to specific tasks and run by artificial intelligence, live in our environment alongside live species.

 


Robotics

Scientists Repurpose Living Frog Cells to Develop World’s First Living Robot


In a remarkable cross between biological life and robotics, a team of scientists has repurposed living frog cells to develop “xenobots.” The cells came from frog embryos, and the xenobots are just a millimeter wide. They are capable of moving toward a target, potentially picking up a payload such as medicine to be delivered inside a human body, and healing themselves after being cut or damaged.

“These are novel living machines,” according to Joshua Bongard, a computer scientist and robotics expert at the University of Vermont who co-led the new research. “They’re neither a traditional robot nor a known species of animal. It’s a new class of artifact: a living, programmable organism.”

The scientists designed the bots on a supercomputer at the University of Vermont, and a group of biologists at Tufts University assembled and tested them. 

“We can imagine many useful applications of these living robots that other machines can’t do,” says co-leader Michael Levin who directs the Center for Regenerative and Developmental Biology at Tufts, “like searching out nasty compounds or radioactive contamination, gathering microplastic in the oceans, traveling in arteries to scrape out plaque.”

The research was published in the Proceedings of the National Academy of Sciences on January 13.

According to the team, this is the first research ever that “designs completely biological machines from the ground up.”

It took months of processing time on the Deep Green supercomputer cluster at UVM's Vermont Advanced Computing Core. The team, which included lead author and doctoral student Sam Kriegman, relied on an evolutionary algorithm to develop thousands of candidate designs for the new life-forms.

Tasked with a goal set by the scientists, such as locomotion in one direction, the computer would continuously reassemble a few hundred simulated cells into different forms and body shapes. As the programs ran, the most successful simulated organisms were kept and refined. The algorithm ran independently a hundred times, and the best designs were picked for physical testing.
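The loop described above, keeping the most successful simulated organisms each generation and refining them, is a standard evolutionary algorithm. The sketch below is a simplified illustration: the binary cell encoding, mutation scheme, and all parameters are assumptions, standing in for the study's actual physics simulator.

```python
import random

# Illustrative evolutionary loop. The cell encoding, mutation scheme, and
# parameters are simplified assumptions, not the study's actual simulator.
N_CELLS = 200      # "a few hundred simulated cells"
POPULATION = 30
GENERATIONS = 50

def random_design():
    # Each cell is passive (0, like skin) or contractile (1, like heart muscle).
    return [random.randint(0, 1) for _ in range(N_CELLS)]

def mutate(design):
    child = design[:]
    i = random.randrange(N_CELLS)
    child[i] = 1 - child[i]  # flip one cell's type
    return child

def evolve(fitness):
    """Keep the most successful designs each generation and refine them."""
    population = [random_design() for _ in range(POPULATION)]
    for _ in range(GENERATIONS):
        ranked = sorted(population, key=fitness, reverse=True)
        survivors = ranked[: POPULATION // 2]          # keep the best half
        offspring = [mutate(random.choice(survivors))  # refine survivors
                     for _ in range(POPULATION - len(survivors))]
        population = survivors + offspring
    return max(population, key=fitness)
```

In the study, fitness was net displacement scored by a physics simulation, and the whole loop ran independently a hundred times before the best designs went to the lab; here any scoring function can be plugged in.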

The team at Tufts, led by Levin and aided by microsurgeon Douglas Blackiston, then took up the project and brought the designs to life. The team gathered stem cells harvested from embryos of the African frog Xenopus laevis. Single cells were separated out and left to incubate. Using tiny forceps and an electrode, the team cut the cells and joined them under a microscope into the designs created by the computer.

The cells, assembled into all-new body forms, began to work together. The skin cells formed a more passive structure, while the heart muscle cells created ordered forward motion as guided by the computer's design. The robots were able to move on their own thanks to spontaneous self-organizing patterns.

The organisms could move in a coherent way and lasted days or weeks exploring their watery environment, relying on embryonic energy stores. They stopped working, however, once flipped onto their backs.

“It’s a step toward using computer-designed organisms for intelligent drug delivery,” says Bongard, a professor in UVM’s Department of Computer Science and Complex Systems Center.

Since the xenobots are living technologies, they have certain advantages. 

“The downside of living tissue is that it’s weak and it degrades,” says Bongard. “That’s why we use steel. But organisms have 4.5 billion years of practice at regenerating themselves and going on for decades. These xenobots are fully biodegradable,” he continues. “When they’re done with their job after seven days, they’re just dead skin cells.”

These developments will have big implications for the future. 

“If humanity is going to survive into the future, we need to better understand how complex properties, somehow, emerge from simple rules,” says Levin. “Much of science is focused on controlling the low-level rules. We also need to understand the high-level rules. If you wanted an anthill with two chimneys instead of one, how do you modify the ants? We’d have no idea.”

“I think it’s an absolute necessity for society going forward to get a better handle on systems where the outcome is very complex. A first step towards doing that is to explore: how do living systems decide what an overall behavior should be and how do we manipulate the pieces to get the behaviors we want?”

“This study is a direct contribution to getting a handle on what people are afraid of, which is unintended consequences, whether in the rapid arrival of self-driving cars, changing gene drives to wipe out whole lineages of viruses, or the many other complex and autonomous systems that will increasingly shape the human experience.”

“There’s all of this innate creativity in life,” says UVM’s Josh Bongard. “We want to understand that more deeply — and how we can direct and push it toward new forms.”

 


Robotics

Skin-like Sensors Help Advance AISkin


A group of researchers from the University of Toronto has developed super-stretchy, transparent, and self-powering sensors that will help advance artificial ionic skin. The sensors can record the complex sensations of human skin, which had been one of the big barriers to developing artificial skin similar to the real thing.

The new technology is called AISkin, and the researchers believe it will be important in wearable electronics, personal health care, and robotics.

Professor Xinyu Liu’s lab is working on the breakthrough areas of ionic skin and soft robotics.

“Since it’s hydrogel, it’s inexpensive and biocompatible — you can put it on the skin without any toxic effects. It’s also very adhesive, and it doesn’t fall off, so there are so many avenues for this material,” according to Professor Liu.

The AISkin is adhesive and consists of two oppositely charged sheets of a stretchable substance known as hydrogel. The researchers overlay negative and positive ions to create a “sensing junction” on the surface of the gel.

When the AISkin is subjected to strain, humidity, or changes in temperature, controlled ion movements occur across the sensing junction, which can be measured as electrical signals such as voltage or current.
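As a rough illustration of turning such an electrical signal into a strain reading, one could interpolate over a calibration curve recorded for the sensor. The voltages and strain values below are invented for illustration only; they are not published AISkin data.

```python
# Hypothetical calibration: (voltage in mV, strain as a fraction of rest length).
# These points are invented for illustration, not measured AISkin data.
CALIBRATION = [
    (0.0, 0.0),
    (1.0, 0.5),
    (2.5, 1.5),
    (4.0, 3.0),
]

def voltage_to_strain(v_mv):
    """Map a measured voltage to strain by piecewise-linear interpolation."""
    pts = sorted(CALIBRATION)
    if v_mv <= pts[0][0]:
        return pts[0][1]           # clamp below the calibrated range
    for (v0, s0), (v1, s1) in zip(pts, pts[1:]):
        if v_mv <= v1:
            return s0 + (s1 - s0) * (v_mv - v0) / (v1 - v0)
    return pts[-1][1]              # clamp above the calibrated range
```

A real device would likely need per-sensor calibration and compensation for temperature and humidity, since those also move ions across the junction.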

“If you look at human skin, how we sense heat or pressure, our neural cells transmit information through ions — it’s really not so different from our artificial skin,” says Liu.

The AISkin is both tough and stretchable.

Binbin Ying is a visiting PhD candidate from McGill University, and he is leading the project in Liu’s lab. 

According to Ying, “Our human skin can stretch about 50 percent, but our AISkin can stretch up to 400 percent of its length without breaking.” 

The researchers published their findings in Materials Horizons.

The new AISkin can lead to the development of certain technologies such as skin-like Fitbits that are capable of measuring multiple body parameters. Other technologies include an adhesive touchpad that is able to stick onto the surface of your hand. 

“It could work for athletes looking to measure the rigour of their training, or it could be a wearable touchpad to play games,” according to Liu.

The technology could also measure the progress that is made in muscle rehabilitation. 

“If you were to put this material on a glove of a patient rehabilitating their hand for example, the health care workers would be able to monitor their finger-bending movements,” says Liu.

The technology could also play a role within the field of soft robotics, or flexible bots made out of polymers. One of the uses could be with soft robotic grippers that handle delicate objects within factories.

The researchers hope that AISkin will be integrated onto soft robots in order to measure data, such as the temperature of food or the pressure required to handle certain objects.

The lab will now work on advancing AISkin and decreasing the size of the sensors. Bio-sensing capabilities will be added to the material, which will allow it to measure biomolecules in body fluids. 

“If we further advance this research, this could be something we put on like a ‘smart bandage,'” says Liu. “Wound healing requires breathability, moisture balance — ionic skin feels like the natural next step.”

 


Interviews

Deniz Kalaslioglu, Co-Founder & CTO of Soar Robotics – Interview Series


Deniz Kalaslioglu is the Co-Founder & CTO of Soar Robotics, a cloud-connected robotic intelligence platform for drones.

You have over 7 years of experience operating AI-backed autonomous drones. Could you share with us some of the highlights of your career?

Back in 2012, drones were mostly perceived as military tools by the majority. On the other hand, the improvements in mobile processors, sensors and battery technology had already started creating opportunities for consumer drones to become mainstream. A handful of companies were trying to make this happen, and it became obvious to me that if correct research and development steps were taken, these toys could soon become irreplaceable tools that help many industries thrive.

I participated exclusively in R&D teams throughout my career, in automotive and RF design. I founded a drone service provider startup in 2013, where I had the chance to observe many of the shortcomings of human-operated drones, as well as their potential benefits for industries. I led two research efforts over a span of 1.5 years, in which we addressed the problem of autonomous outdoor and indoor flight.

Precision landing and autonomous charging were other issues I tackled later on. Solving these issues meant fully autonomous operation with minimal human intervention throughout the operation cycle. At the time, solving fully autonomous operation was huge: it enabled us to create intelligent systems that don't need a human operator to execute flights, which resulted in safer, more cost-effective, and more efficient flights. The “AI” part came into play later, in 2015, when deep learning algorithms could be effectively used to solve problems previously tackled through classical computer vision and learning methods. We leveraged robotics to enable fully autonomous flights and deep learning to transform raw data into actionable intelligence.

 

What inspired you to launch Soar Robotics?

Drones lack sufficient autonomy and intelligence to become the next revolutionary tools for humans. In the hands of a human operator they become inefficient, primitive tools, both in terms of flight and post-operation data handling. Moreover, these robots have very little access to the real-time and long-term robotic intelligence they could consume to become smarter.

As a result of my experience in this field, I have come to the understanding that the current commercial robotics paradigm is inefficient and is limiting the growth of many industries. I co-founded Soar Robotics to tackle some very difficult engineering challenges and make intelligent aerial operations a reality, which in turn will provide high-quality, cost-efficient solutions for many industries.

 

Soar Robotics provides a fully autonomous, cloud-connected robotic intelligence platform for drones. What types of applications are best served by these drones?

Our cloud-connected robotics intelligence platform is designed as a modular system that can serve almost any application by utilizing the specific functionalities implemented within the cloud. Some industries such as security, solar energy, construction, and agriculture are currently in immediate need of this technology.

  • Surveillance of a perimeter for security,
  • Inspection and analysis of thermal and visible faults in solar energy,
  • Progress tracking and management in construction and agriculture.

These are the main applications with the highest beneficial impact that we focus on.

 

For a farmer who wishes to use this technology, what are some of the use cases that benefit them versus traditional human-operated drones?

As with all our applications, we also provide end-to-end service for precision agriculture. Currently, the drone workflow in almost any industry is as follows:

  • the operator carries the drone and its accessories to the field,
  • the operator creates a flight plan,
  • the operator turns on the drone, uploads the flight plan for the specific task in hand,
  • the drone arms, executes the planned mission, returns to its takeoff coordinates, and lands,
  • the operator turns off the drone,
  • the operator shares the data with the client (or the related department if hired in-house),
  • the data is processed accurately to become actionable insights for the specific industry.

It is crucial to point out that this workflow has proven to be very inefficient, especially in sectors such as solar energy, agriculture, and construction, where collecting periodic, objective aerial data over vast areas is essential. A farmer who uses our technology can get measurable, actionable, and accurate insights on:

  • plant health and rigor,
  • nitrogen intake of the soil,
  • optimization and effectiveness of irrigation methods,
  • early detection of diseases and pests.

All of this comes without the hassle described above, and without even clicking a button each time. I firmly believe that enabling drones with autonomous features and cloud intelligence will provide considerable savings in terms of time, labor, and money.

 

How will the drones be used for solar farm operators?

We handle almost everything that needs counting and measuring in all stages of a solar project. In the pre-construction and planning period, we generate topographic models, hydrologic analyses, and obstacle analyses with high geographic precision and accuracy. During construction, we generate daily maps and videos of the site. After processing the collected media, we measure the installation progress of the piling structures, mounting racks, and photovoltaic panels; take position, area, and volume measurements of trenches and inverter foundations; and count the construction machinery, vehicles, and personnel on site.

When construction is over and the solar site is fully operational, Soar's autonomous system continues its daily flights, this time generating thermal maps and videos along with visible-spectrum maps and videos. From thermal data, Soar's algorithms detect cell-, multi-cell-, diode-, string-, combiner-, and inverter-level defects. From visible-spectrum data, they detect shattering, soiling, shadowing, vegetation, and missing panels. Soar's software then generates a detailed report of the detected faults, marks them on the as-built and RGB maps of the site down to the cell level, and lists every detected error in a table indicating string, row, and module numbers with geolocations. It also estimates the client's total loss due to the inefficiencies these faults cause and prioritizes each fault by importance and urgency.
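A fault report like the one described might be assembled along these lines: collect detected faults with their location metadata, then rank them by estimated impact. The field names, loss figures, and example faults below are hypothetical illustrations, not Soar's actual schema or data.

```python
from dataclasses import dataclass

# Illustrative data model for a solar-site fault report. Field names, loss
# figures, and the example faults are assumptions, not Soar's actual schema.
@dataclass
class PanelFault:
    kind: str           # e.g. "diode", "soiling", "missing panel"
    string: int
    row: int
    module: int
    lat: float
    lon: float
    est_loss_kw: float  # estimated power loss attributed to this fault

def build_report(faults):
    """Rank faults so the most costly ones are addressed first."""
    ranked = sorted(faults, key=lambda f: f.est_loss_kw, reverse=True)
    total_loss = sum(f.est_loss_kw for f in ranked)
    return total_loss, ranked

faults = [
    PanelFault("soiling", 3, 1, 12, 37.77, -122.41, 0.4),
    PanelFault("diode",   1, 2, 5,  37.78, -122.42, 1.2),
]
total, ranked = build_report(faults)
# ranked[0] is the diode fault (largest estimated loss); total is about 1.6 kW
```

Each ranked entry carries the string, row, and module numbers with geolocations, mirroring the table the article describes.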

 

In July 2019, Soar Robotics joined NVIDIA's Inception Program, an exclusive program for AI startups. How has this experience influenced you personally and shaped how Soar Robotics is managed?

Over the months, this has proven to be an extremely beneficial program for us. We had already been using NVIDIA products both for onboard computation and on the cloud side. The program has many perks that have streamlined our research, development, and testing processes.

 

Soar Robotics generates recurring revenue with a Robotics-as-a-Service (RaaS) model. What is this model exactly, and how does it differ from SaaS?

It has many similarities with SaaS in its application and its effects on our business model. The RaaS model is especially critical since hardware is involved; most of our clients don't want to own the hardware and are only interested in the results. Cloud software and the new generations of robotics hardware blend together more each day.

This is driving fundamental changes in industrial robotics, which used to be about stationary robots performing repetitive tasks that didn't require much intelligence. Operating under this mindset, we provide our clients with robot connectivity and cloud robotics services to augment what their hardware would normally be capable of achieving.

Therefore, Robotics-as-a-Service encapsulates all the hardware and software tools we use to create domain-specific robots for our clients' purposes, in the form of drones, communications hardware, and cloud intelligence.

 

What are your predictions for drone technology in the coming decade?

Drones have clearly proven their value for enterprises, and their usage will only continue to increase. We have witnessed many businesses trying to integrate drones into their workflows, with only a few achieving great ROI and most failing due to the inefficient nature of current commercial drone applications. As the drone-industry hype began to fade, we saw rapid consolidation in the market, especially in the last couple of years. I believe this was a necessary step for the industry, one that opened the path to real productivity and better opportunities for products and services that are actually beneficial to enterprises. The addressable market that commercial drones will create by 2025 is expected to exceed $100B, which in my opinion is a fairly modest estimate.

 

  • We will see an exponential rise in “Beyond Visual Line of Sight” flights, which will be the enabling factor for many use cases of commercial UAVs.
  • The advancements in battery technology such as hydrogen fuel cells will extend the flight times by at least an order of magnitude, which will also be a driving factor for many novel use cases.
  • Drone-in-a-box systems are still perceived as somewhat experimental, but we will definitely see this technology become ubiquitous in the next decade.
  • Companies of various sizes have been conducting tests in the urban air mobility market, which can be broken down into roughly three segments: last-mile delivery, aerial public transport, and aerial personal transport. The commercialization of these segments will definitely happen in the coming decade.

 

Is there anything else that you would like to share about Soar Robotics?

We believe that the feasibility and commercialization of autonomous aerial operations mainly depend on solving the problem of aerial vehicle connectivity. For drones to be able to operate Beyond Visual Line of Sight (BVLOS) they need seamless coverage, real-time high throughput data transmission, command and control, identification, and regulation. Although there have been some successful attempts to leverage current mobile networks as a communications method, these networks have many shortcomings and are far from becoming the go-to solution for aerial vehicles.

We have been developing a connectivity hardware and software stack that can form ad hoc drone networks. We expect these networking capabilities to enable seamless, safe, and intelligent operations for any type of autonomous aerial vehicle. We are rolling out the alpha and beta releases of the hardware in the coming months to test our products with larger user bases under various usage conditions and to start forming these ad hoc networks to serve many industries.

To learn more, visit Soar Robotics, or to invest in the company, visit its Crowdfunding Page on Republic.
