Facebook's Filters Fall Short in Blocking Pedophiles

Facebook is failing to prevent child predators from posting suggestive and potentially illegal photographs of children on its website, a weeks-long investigation by FoxNews.com reveals, despite the company's claim that it is doing all it can to keep pedophile material from being displayed.


The world's largest social network employs content filters that automatically scan for basic keywords that the National Center for Missing and Exploited Children (NCMEC) identifies as commonly associated with child-exploitative material. Those filters, if properly employed, should flag much of the offensive material found on the site, cybersecurity experts say.

But in a lengthy telephone interview on Oct. 6, FoxNews.com took two Facebook executives on a click-by-click tour of their own website, bringing them face-to-face with some of the vile content it hosts and forcing them to admit that their efforts to block child predators were not working.

During a 90-minute phone interview with Facebook spokesman Simon Axten and the company's chief security officer, Joe Sullivan, the two executives were guided by FoxNews.com through the site’s seamy subculture — an encounter that left Sullivan sounding dumbfounded, unaware of and unable to explain the extremely graphic content on the site.

In the interview, FoxNews.com told the executives to enter "PTHC" in the website's search box. The term “PTHC” — short for “Pre-Teen Hard Core” — is frequently found in connection with child sexual exploitation activity and materials, law enforcement officials say. Multiple sources confirmed that “PTHC” is on the NCMEC list of keywords.

Having searched for "PTHC," the two Facebook executives were then instructed to click on the first result — a public group Page called “PTHC,” with 197 members. That's when the executives came face-to-face with a post directing users to a video purportedly featuring an 8-year-old boy being sexually abused.

Then, when asked to click on the profile of any of the group’s members, the executives were ushered into a subculture dedicated to using Facebook to traffic child pornography and to target and interact with children.

At this point, there was silence for nearly a full minute, except for the sound of furious, rapid typing. Axten and Sullivan sounded stunned, unable to explain why this happened and how their filters could have failed.

Facebook later said it had launched an investigation into the pages, profiles and video links uncovered during the interview. That same day, the “PTHC” page and others were removed from the website.

But much of what FoxNews.com found in its investigation remains active.

During the interview, the Facebook executives emphasized that identifying and removing content that may exploit children is a top priority. They said material flagged by the NCMEC keywords filter is evaluated and, if merited, promptly removed.

“We’re constantly looking to improve our filter system. As we get more information and tactics, we’ll use that to inform our system to make it even better,” Sullivan said.

“Believe me, it’s incredibly frustrating to all of us that they’re trying to share this, I’m so repulsed by the fact — I have three daughters — we have a large number of people who care greatly about these issues throwing a lot of money and technology at them.”

But despite their efforts, FoxNews.com found an entire underworld of widely recognized terms, code words and abbreviations on Facebook — hundreds of pages with “PTHC” and “Incest” in their titles, and many others that are unprintable. Both terms are on the NCMEC keywords list, sources said, and they were found on Facebook's public, private, group and profile pages. Many of those pages purported to host video links to child pornography, and many had been active for months.

Most if not all of the content appears to be in clear violation of Facebook’s terms of use, and cyber safety experts said much of it may be illegal.

“A fair question to ask after a period of time is, why is it still up?” said NCMEC President Ernie Allen, who told FoxNews.com his organization had been unaware that Facebook was filtering by its keyword list but was encouraged to learn that it was.

He said there could be three explanations for the content's presence:

— Law enforcement officials asked Facebook to keep the content up on the site for investigative purposes;

— Facebook’s internal investigation concluded the words used, in the context being used, did not rise to an actionable level;

— Facebook might have just missed it.

“With half a billion members, my guess is they haven’t seen everything,” Allen said.

But the problem, in some ways, is just the tip of the iceberg: In the same way that Facebook users can “like” a page dedicated to a television show, they can “like” pages that suggest or host photos of pedophile activities. And Facebook’s automatic aggregation system pulls “global posts” on certain topics to community pages dedicated to those same topics.

This means that someone who innocently writes on her wall that she's going out partying with "her girls" can wind up appearing on a "young girls” community page created, “liked” and monitored by a pedophile.

Microsoft’s collaboration with Facebook has also inadvertently brought links to what presumably is hardcore child porn onto the social network: search for a user with “PTHC” in his name, and the Bing search results at the bottom of Facebook's results page link out to what appear to be child porn websites.

Following the phone interview, and in response to a follow-up request to clarify whether and how “PTHC” is used by the filtering system, Facebook’s spokesman said via email:

“We're constantly improving how we integrate NCMEC's list and others into our proactive systems. As we explained on the phone, these systems use keywords to either block content from being created or flag it for review by our team of investigators. Some terms on these lists, including code words and acronyms, have multiple meanings, which makes it difficult to block them upfront without also preventing legitimate uses. We do a careful evaluation of each term and consider both the potential for abuse and the frequency with which the term is used in other contexts when making decisions on whether to block or flag. We're reviewing this term (“PTHC”) again to make sure we have the right implementation.”

But the mass of pedophile content on the site would have been rooted out if Facebook were doing its job properly, said Hemanshu Nigam, co-chairman of President Obama's Online Safety Technology Working Group.

“The fact that Facebook missed the most basic terms in the terminology of child predators suggests that they’ve taken a checkbox approach instead of implementing real solutions to help real problems facing children online,” Nigam said.

“To not be focusing in on a word like 'PTHC' or 'Incest' means you’re not stopping the problem proactively.”

He said that if Facebook were using the keyword list as it should — as a set of clues to follow and as part of a multi-dimensional approach — it would need to take a deep dive into the network of friends, groups and fan pages surrounding the keyword hits.

“Rather than use the list to find the match and delete that one single match, Facebook should say we found one, now let’s look under the covers,” said Nigam, who headed security divisions at Microsoft and MySpace and now, in addition to his co-chairmanship position, runs SSP Blue, an online security consulting firm. (SSP Blue consults for News Corp., which is the parent company of Fox News.)

“You either look under the covers or close your eyes and check the box.”

But Facebook executives said they face greater challenges than any other social networking site, many of which can be tied to the evolution of what was once a closed network for college students into a global behemoth facing real-world criminal threats. Add increasingly savvy criminals and the sheer volume of content — more than 1 billion files shared daily by its half-billion users — and the challenges grow.

Plus, Sullivan said, scrubbing users from Facebook will be a game of whack-a-mole until law enforcement figures out how to track down predators who hide behind computers.

“We have a pretty big team focused on keeping the site clean of child exploitative content, and we’re always looking to get better,” Sullivan said.

“We really believe there is no place for this on this site.” (FoxNews.com)

