Artificial Intelligence Is Now Being Used To Detect Cyberbullying in School Children
Data about bullying, self-harm and cyberbullying among schoolchildren around the world is alarming. As presented by Jun Wu, US data from 2017 shows that, according to the National Center for Education Statistics and Bureau of Justice, about 20% of students aged 12–18 experienced bullying. According to the Centers for Disease Control and Prevention, 19% of students in grades 9–12 reported being bullied on school property in the 12 months preceding the survey.
Even more alarming is the rise of cyberbullying, which spreads beyond the school grounds themselves. As Wu points out, “harassment in online forums, by emails, and on social media platforms can often be more damaging to the victim’s mental health than in-person bullying. Cyberbullying can often be an escalation from the school bullying. At the same time, bullying can start on social media, then work its way into the classroom.”
In Australia, researchers are reporting on a phenomenon named Momo: reports that “cyber predators were taking on a persona called Momo and contacting children via social media asking them to hurt themselves” had “sent ripples of concern through schools across the country.”
The need to prevent such bullying, and the possible self-harm caused by depression, has prompted a number of artificial intelligence developers to seek solutions to this widespread problem.
As Sky News reports, some British schools have started using an AI tool called AS Tracking, developed by a company called STEER; it is now in use at 150 schools in Britain. The tool has students take an online psychological test, and in September 2019 the test will be taken by 50,000 schoolchildren.
As is explained, the test asks students to imagine a space they feel comfortable in, then poses a series of abstract questions, such as “how easy is it for somebody to come into your space?” The child responds by clicking a button on a scale that runs from “very easy” to “very difficult”. The results are sent to STEER, “which compares the data with its psychological model, then flags students who need attention in its teacher dashboard.”
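STEER has not published the details of its psychological model, but the flagging step it describes can be illustrated with a minimal sketch: code the scale responses as numbers, compare each student's answers against an assumed “typical” band, and flag those who fall outside it. All names, thresholds and data below are invented for illustration.

```python
# Hypothetical sketch of the flagging logic described above.
# Responses are coded 1 ("very easy") to 5 ("very difficult").
REFERENCE_RANGE = (2, 4)  # assumed "typical" band; not STEER's actual model

def flag_students(responses):
    """Return IDs of students whose mean response falls outside the band."""
    low, high = REFERENCE_RANGE
    flagged = []
    for student_id, answers in responses.items():
        mean = sum(answers) / len(answers)
        if mean < low or mean > high:
            flagged.append(student_id)
    return flagged

scores = {
    "s001": [2, 3, 3],  # within the typical band
    "s002": [5, 5, 4],  # consistently "very difficult" -> flagged
}
print(flag_students(scores))  # -> ['s002']
```

In a real deployment the flagged IDs would surface in a teacher dashboard rather than be printed, and the model would be far richer than a single mean-response band.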
According to Dr. Jo Walker, co-founder of STEER, “our tool highlights those particular children who are struggling at this particular phase of their development and it points the teachers to how that child is thinking.” Walker adds that “since introducing it the college has seen a 20% decrease in self-harm.”
In her analysis, Wu mentions a number of AI developers in the US that are helping with the problem. Securly uses AI to create “web filtering, cyberbullying monitoring, and self-harm alerts for schools. Schools can issue Apple devices and Chromebooks to students while monitoring the student’s cyber activities. Parents can also use the apps on their home devices to monitor their children’s online activities.” Bark uses AI to monitor text messages, YouTube, emails, and 24 different social networks to alert parents to potential safety concerns. SN Technologies Corp goes a step further: its AI solutions use facial recognition to track ‘blacklisted’ students using footage from surveillance cameras in the schools themselves.
In Australia, cybersecurity startup Saasyan developed an AI method, Assure, that can help teachers detect when students are watching “Momo Challenge” videos. Greg Margossian, head of Saasyan, said that his company “just made sure to ensure that the ‘Momo' keyword was in all the client’s databases, without them even thinking about it.” The company “offers a subscription software that can be added to all devices at school to create a historical footprint of each student's computer use and ping teachers if any risks, from bullying to possible self-harm or violence, emerge.”
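The core of the approach described, matching a watchlist keyword such as “Momo” against a log of each student's activity and alerting teachers on a hit, can be sketched in a few lines. The keyword list, log format and names below are assumptions for illustration, not Saasyan's actual implementation.

```python
# Hypothetical sketch of keyword-based activity monitoring.
RISK_KEYWORDS = {"momo", "self-harm"}  # assumed watchlist, not a real database

def scan_activity(log):
    """Return (student, entry) pairs whose entry mentions a risk keyword."""
    alerts = []
    for student, entry in log:
        text = entry.lower()
        if any(keyword in text for keyword in RISK_KEYWORDS):
            alerts.append((student, entry))
    return alerts

activity = [
    ("s001", "Momo Challenge compilation"),  # matches the watchlist
    ("s002", "photosynthesis homework"),     # no match
]
print(scan_activity(activity))  # -> [('s001', 'Momo Challenge compilation')]
```

A production system would also keep the full historical footprint of each device and push alerts to teachers, rather than simply returning the matches.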