Researchers at the University of Washington have developed a new system that can monitor factory and warehouse workers and warn them of risky behaviors in real time. The system relies on machine learning to do this.
According to the U.S. Bureau of Labor Statistics, there were about 350,000 incidents of workers taking time off due to muscle, nerve, ligament, or tendon injuries, and factory and warehouse workers accounted for the highest number of them.
These injuries are typically musculoskeletal disorders that develop when certain tasks put strain on the body. The researchers looked for a way to detect these behaviors so that workers can be more aware of them.
The system's algorithm divides tasks, such as lifting a box off a high shelf or carrying an object, into individual actions, then calculates a risk score for each one.
Ashis Banerjee, an assistant professor in both the industrial & systems engineering and mechanical engineering departments at the UW, is one of the senior authors.
“Right now workers can do a self-assessment where they fill out their daily tasks on a table to estimate how risky their activities are,” Banerjee said. “But that’s time consuming, and it’s hard for people to see how it’s directly benefiting them. Now we have made this whole process fully automated. Our plan is to put it in a smartphone app so that workers can even monitor themselves and get immediate feedback.”
Current self-assessments rely on snapshots of a task being performed: the position of each joint is scored, and the scores are summed into an overall risk score. The new algorithm simplifies this considerably by scoring an entire action rather than a single snapshot.
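The snapshot approach can be sketched in a few lines. This is a minimal illustration in the spirit of checklist-style ergonomic assessments such as RULA or REBA, not the published method: the joint names, angle thresholds, and score tables below are all illustrative assumptions.

```python
# Illustrative snapshot-based ergonomic scoring: each joint's posture gets
# a score from a threshold table, and the scores are summed into one risk
# score for the frame. Thresholds and joints here are made-up examples.

def score_joint(angle_deg: float, thresholds: list) -> int:
    """Return the score for the first angle band the joint falls into."""
    for limit, score in thresholds:
        if angle_deg <= limit:
            return score
    return thresholds[-1][1] + 1  # beyond every band: worst score

# Hypothetical per-joint tables of (max angle in degrees, score).
THRESHOLDS = {
    "trunk_flexion":   [(5, 1), (20, 2), (60, 3)],
    "upper_arm_raise": [(20, 1), (45, 2), (90, 3)],
    "knee_bend":       [(30, 1), (60, 2)],
}

def snapshot_risk(pose_angles: dict) -> int:
    """Sum the per-joint scores for one snapshot of a posture."""
    return sum(score_joint(pose_angles[j], t) for j, t in THRESHOLDS.items())

# A stooped lift with the arm above shoulder height scores high:
risk = snapshot_risk({"trunk_flexion": 45.0,
                      "upper_arm_raise": 95.0,
                      "knee_bend": 10.0})  # → 8
```

Scoring every snapshot of a long task this way is exactly the tedious, frame-by-frame bookkeeping the new algorithm automates.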
The team tested the algorithm on a dataset of 20 three-minute videos of people performing 17 activities common in warehouses and factories.
“One of the tasks we had people do was pick up a box from a rack and place it on a table,” said first author Behnoosh Parsa, a UW mechanical engineering doctoral student. “We wanted to capture different scenarios, so sometimes they would have to stretch their arms, twist their bodies or bend to pick something up.”
The dataset was captured with a Microsoft Kinect camera, which records 3D videos, letting the researchers determine what was happening to each person's joints during the tasks.
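Kinect-style skeleton tracking reports a 3D coordinate for each tracked joint, and joint angles can then be recovered geometrically. A small sketch of that step, with made-up coordinates (the specific joint positions are illustrative, not from the study):

```python
# Recover the angle at a joint (e.g., elbow flexion) from three 3D points,
# as returned by a depth camera's skeleton tracker. Coordinates in meters.
import math

def joint_angle(a, b, c):
    """Angle in degrees at point b, formed by the segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# Illustrative shoulder, elbow, and wrist positions:
angle = joint_angle((0.0, 1.4, 0.0), (0.3, 1.1, 0.0), (0.3, 0.8, 0.3))  # → 120.0
```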
The algorithm first determined risk scores for each video frame. Eventually, it learned to tell when a task started and finished, so that it could assign a risk score to the entire action.
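The aggregation step described above can be sketched simply. This assumes the action boundaries have already been detected; the boundary values and the take-the-worst-frame aggregation rule below are illustrative assumptions, not the paper's method.

```python
# Summarize per-frame risk scores into one score per detected action.
# Here each action's score is the worst frame inside its segment; the
# paper's actual aggregation may differ.

def action_scores(frame_scores, segments):
    """frame_scores: per-frame risk values.
    segments: (start, end) frame indices per action, end exclusive."""
    return [max(frame_scores[s:e]) for s, e in segments]

frames = [1, 1, 2, 5, 6, 2, 1, 1, 3, 4, 2]
# Two hypothetical detected actions: a lift (frames 2-5) and a carry (8-10).
scores = action_scores(frames, [(2, 6), (8, 11)])  # → [6, 4]
```

The hard part, which the machine learning handles, is detecting those start and end boundaries in the first place.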
The team's next step is to develop an app that factory workers and supervisors can use, one that detects and warns of both moderately risky and high-risk actions.
In the long term, the researchers hope that robots deployed in these facilities could use the algorithm to help keep workers safe.
“Factories and warehouses have used automation for several decades. Now that people are starting to work in settings where robots are used, we have a unique opportunity to split up the work so that the robots are doing the risky jobs,” Banerjee said. “Robots and humans could have an active collaboration, where a robot can say, ‘I see that you are picking up these heavy objects from the top shelf and I think you may be doing that a lot of times. Let me help you.'”
The research was published in IEEE Robotics and Automation Letters on June 26, and it will be presented at the IEEE International Conference on Automation Science and Engineering in Vancouver, British Columbia on August 23.