
Online child protection guidance falls short

A ruling by the Supreme Court of Canada makes it clear that it is the responsibility of internet users to ascertain the age of people they chat with – or, more specifically, chat up – but falls short of saying how.

Michell Rayal Levigne, a 46-year-old Edmonton man, told the court that he was convinced that “Jessy G” – the moniker used by an undercover police officer – was at least 18 because the chat room where they met was restricted to adults. Even though “Jessy” protested that he was only 13, Levigne repeatedly propositioned him for oral sex and arranged to meet him at a local restaurant.

In its unanimous ruling, the court found that Levigne did not take “reasonable steps” to confirm that he was soliciting an adult, but didn't say what might be considered reasonable.

“Nothing in this decision gives us guidance,” says Avner Levin, director of the Privacy & Cyber Crime Institute at Ryerson University's Ted Rogers School of Management. “It leaves us with the challenge of how chat rooms can be set up to ensure participants are of age.”

Levin says the current protections are not sufficient, and that everyone – chat room operators, predators and law enforcement agencies alike – recognizes the reality that kids will get around the online walls set up to protect them.

“We need to have something to grasp that will give us some comfort that children are not at risk when they're online.”

Levin says he recognizes that many people will say it is the responsibility of parents to ensure their children are not putting themselves in danger.

“That's a very challenging proposition, especially with the rise of mobile devices. Sure, parents can take steps to put digital safeguards in place, so kids can't go where they're not supposed to, but that requires parents to be more technically savvy than many are.”

The other approach, he says, is to make online service providers liable for allowing minors in.

“Google has illustrated that you can create an algorithm that predicts what is likely copyrighted material, and you could likely apply the same logic to determining who is underage. But Google put those safeguards in place because they feared being sued by some powerful content creators. Where's the impetus for chat room operators to take the same kind of steps? The Supreme Court saying that people need to take reasonable measures doesn't solve the problem of underage children going online. That will take a legislative approach.”

Levin isn't alone in his concern. On the same day that Canada's highest court ruled against the Edmonton stalker, Jessica Rich, deputy director of consumer protection at the U.S. Federal Trade Commission (FTC), was testifying before a congressional subcommittee on consumer protection about the dangers young people face online. In addition to putting themselves at risk through “impulsive” behavior, Rich said, young people now face added risk from the location-based services built into many smartphones.

Like Levin, Rich is proposing that legislative change be considered as a way of protecting minors. The FTC is currently holding a series of roundtable discussions about online privacy, with a special emphasis on what she called “the ever-increasing challenges” posed to teens by online technology.

Levin says any action taken against corporations in the U.S. is likely to have a significant impact on how much danger Canadian kids face online.

“It is a real problem for us to do anything in isolation. We always have to look at what the impact will be on U.S. companies.”

If Canada is the only country putting legislation in place, he concludes, predators like Michell Rayal Levigne will likely have no problem finding vulnerable victims online.
