Kids are just adorable, aren't they? That is, unless you're an isolated shopping mall robot in Japan, in which case children – especially small groups of children – might just gang up on, surround, and even start attacking you. Researchers have shared footage of this disturbing phenomenon, which led them to devise a new algorithm to help their robots evade abuse at the hands of children wherever possible.

While the idea of a free-roaming robot making its way around a suburban shopping mall might in itself seem unusual to some of us, in Japan it's not uncommon for robots like this Robovie II model to be out and about, assisting shoppers in public places. Most of the time their work is perfectly safe, but harassment from children is clearly an occupational hazard, especially when kids band together.

This video from researchers at ATR Intelligent Robotics and Communication Laboratories and Osaka University shows the different ways in which children will impede and abuse such robots. When obstructed, the robots are programmed to ask people to step aside. Most children oblige, but some don't. The problem is more likely to occur when a number of children gather around a robot, especially if no parents or adults are around to reprimand the youngsters.

When this happens, the children will deliberately block the robot's path; the video, filmed in a mall in Osaka, shows a group of children linking hands to prevent a robot from escaping. If the kids' behaviour escalates, they may turn violent: the footage depicts children grabbing, pulling, hitting, kicking, and even throwing objects at one of the machines.

What's the solution? While many who have seen the video online offer droll suggestions ("Provide robots with tasers. That will teach those brats," says one commenter), the researchers have developed an algorithm designed to lessen robot abuse by making the machines avoid parentless children in the first place.

The researchers built a simulator of pedestrian behaviour that models how children move through a mall environment and used it to estimate the probability of abuse. That likelihood rises the longer a single child remains in the robot's immediate vicinity, and rises further when more than one child is present. From the robot's point of view, a child is anybody shorter than 1.4 metres (about 4 ft 7 in); anybody taller counts as an adult.
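To make that idea concrete, here is a minimal sketch of how such a height-and-proximity heuristic might look in code. The 1.4-metre threshold comes from the article, but the data structure, weights, and function names are illustrative assumptions rather than the researchers' actual model.

```python
from dataclasses import dataclass

CHILD_HEIGHT_THRESHOLD_M = 1.4  # people shorter than this are treated as children


@dataclass
class Pedestrian:
    height_m: float         # estimated height from the robot's sensors
    seconds_nearby: float   # how long this person has lingered near the robot
    has_adult_nearby: bool  # whether an adult is close to this person


def is_child(p: Pedestrian) -> bool:
    """Classify a pedestrian purely by height, as described in the article."""
    return p.height_m < CHILD_HEIGHT_THRESHOLD_M


def abuse_probability(nearby: list[Pedestrian]) -> float:
    """Illustrative risk score: grows with the number of unaccompanied children
    nearby and with how long they have stayed around the robot."""
    children = [p for p in nearby if is_child(p) and not p.has_adult_nearby]
    if not children:
        return 0.0
    # Hypothetical weights -- the real model was fitted to observed mall data.
    risk = 0.1 * len(children) + 0.02 * sum(p.seconds_nearby for p in children)
    return min(risk, 1.0)
```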

If a high probability of abuse is detected – for example, if the robot becomes aware that it is being persistently followed by a group of children – the robot will change its destination or move towards adults, in a bid to lower the risk of attack. The research, presented at this year's ACM/IEEE International Conference on Human-Robot Interaction, found that real-world trials of the child-evasion algorithm resulted in fewer instances of abuse.
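Building on the abuse_probability sketch above, a planner that reacts to that risk estimate might look roughly like the following; the risk threshold, the fallback destinations, and the retreat-towards-adults rule are spelled out here as assumptions for illustration, not the paper's actual planner.

```python
import math

RISK_THRESHOLD = 0.6  # illustrative cut-off for "high probability of abuse"


def distance(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Straight-line distance between two points on the mall floor."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def choose_next_goal(robot_pos, current_goal, nearby, adult_positions, fallback_goals):
    """Keep the current goal while risk is low; otherwise retreat towards the
    nearest adult or, failing that, head for an alternative destination."""
    if abuse_probability(nearby) < RISK_THRESHOLD:
        return current_goal
    if adult_positions:
        # Children were observed to back off when adults are close by,
        # so steering towards the nearest adult lowers the risk of attack.
        return min(adult_positions, key=lambda pos: distance(robot_pos, pos))
    return fallback_goals[0] if fallback_goals else current_goal
```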

In a follow-up study on why children abuse robots, the same researchers interviewed the young aggressors (with their parents' consent) after observing them impede or attack the robot. The majority of children who abused the robot perceived the machine as human-like (74 percent), and said they did so for enjoyment (35 percent), out of curiosity (22 percent), or because their behaviour was triggered by others (17 percent).

Damning findings. But at least we now may have some new leads on the perps responsible for hitchBOT's untimely demise.