Facebook is enticing new members of all ages to sign up by suggesting pornographic profiles to anyone casually browsing the site.
The social media giant is promoting the explicit profiles - many of which appear to be inviting contact from people wanting sexual encounters - when some users, including children, enter the site looking for something legitimate, such as the name of a friend. Internet users not already registered with Facebook are presented with a string of pictures of naked women and men simply by typing the name of someone they are looking for into a search engine.
Facebook's algorithm apparently identifies that profiles with pictures of naked women elicit more traffic than others and so pushes them at selected new users by suggesting them as alternatives to the people they are actually searching for. In order for the casual browser to see more, or send messages, they need to become users themselves by inputting their personal data and signing up.
The revelation shows that Facebook is not only often ignoring its own rules on sexual content and profiles soliciting for sex, but it is actually benefiting from them being breached. Last night Facebook said it was investigating the issue.
Many of the suggested profiles have the hallmarks of being run by scammers, who use explicit profile pictures to tempt people into contacting them via Facebook. They may then take the conversation on to private email in order to try to fleece them for money. Others may effectively act as adverts for prostitutes.
Dr Marco Bastos, senior lecturer in media and communications at City University, described the discovery that Facebook is allowing the promotion of such material as "pretty horrifying".
He said: "It's not just shocking, it's also tremendously sloppy for a company the size of Facebook to do this. It goes against the grain of Facebook's own community standards and is concerning and worrying. They've made considerable efforts to put in place standards, and to see their own algorithm violating them is very troubling."
The revelation will fuel calls for social media platforms to face up to the full extent of their duty of care. Following a campaign by the Daily Telegraph, the government is now legislating to regulate the tech giants, with the threat of prosecution if they allow online harm to occur. In the drive for accountability, MPs have called for greater transparency into the algorithms being used by organisations such as Facebook.
Chi Onwurah, the shadow minister for digital, said the latest revelation highlighted something that was "clearly harmful". She said: "This concerns people who aren't logged in so Facebook should be taking even more care because it doesn't know how old they are or how vulnerable. What if you are not logged into Facebook because you are a child and you don't have a Facebook account?
"Because Facebook keeps their algorithm secret they are never going to get it fully tested because they are testing it in their interest. Their algorithm is driven by driving advertising and funds to Facebook and if that means that children are ending up with semi-pornographic ads that's not their priority.
"Facebook must either publish its algorithm or else do much, much better at keeping people safe."
She added that the revelation also highlighted the need for the forthcoming legislation to be robust. "Does the government's duty of care apply in this case when it's clearly those who aren't logged into Facebook that are being targeted by this disreputable and clearly harmful thing?"
Lord Gilbert of Panteg, Conservative chair of the Lords Communications and Digital Committee, echoed the point: "This is something that the government needs to seriously consider when it works out how to implement the duty of care, to ensure that it would cover people who are not signed in."
He said a duty of care would "require big tech like Facebook not just to remove images when they are there, but to systematically do everything reasonable to prevent it happening in the first place."
Baroness Kidron, the influential chair of 5Rights, a charity which lobbies for better protection of children online, said: "The tech sector themselves give the best argument for a 'Duty of Care' - they are so extraordinarily careless. I accept the argument that millions of people want to click on sexy Facebook profiles - the bit that still bewilders me is why the hell Facebook think that they should provide algorithms that entice children to click on them?"
The graphic profiles appear as suggestions only when an unregistered user, who could be either an adult or a child, goes on to the site to search for someone they want to find or contact. After inputting a name and the word 'Facebook' on Google, the searcher is presented with a list of profiles which either match the search terms or are reasonably close to them.
If a profile is clicked upon, the searcher is then given a further list of 'others with a similar name' by Facebook. These further profiles, which originate from all over the world and often bear little resemblance to the name initially searched for, often feature explicit or even pornographic photos as their thumbnails. Anyone clicking upon them is directed to the page with the photo, but if they choose to contact the owner of the profile they are obliged to first sign up to Facebook. Additionally, they are offered yet another list of 'others with a similar name' whose thumbnail images are also likely to be explicit.
In order to test Facebook's algorithm, the Telegraph searched Facebook for dozens of female names, chosen by combining first names from the list of the most popular girls' names in England and Wales with the most common British surnames.
In more than 95% of instances, clicking upon the first name offered resulted in a list of 'similar names' which included one or more profiles featuring suggestive or outright pornographic photographs. Most had little additional detail, suggesting that they were probably fake profiles set up to lure casual browsers into contacting them, probably with a view to scamming them in some way.
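For readers curious about the mechanics, the Telegraph's sampling approach - pairing popular first names with common surnames to build search queries - might be sketched as follows. The name lists below are illustrative placeholders, not the actual rankings the newspaper used:

```python
from itertools import product

# Placeholder lists for illustration only; the Telegraph drew on the
# official rankings of girls' names in England and Wales and the most
# common British surnames.
first_names = ["Olivia", "Amelia", "Isla"]
surnames = ["Smith", "Jones", "Williams"]

# Cross every first name with every surname to produce the
# search queries entered into the search engine.
queries = [f"{first} {last} Facebook"
           for first, last in product(first_names, surnames)]

for q in queries:
    print(q)
```

Each generated query was then entered into Google, and the resulting Facebook profile suggestions were inspected by hand.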
One suggested by Facebook, whose name was only a 50% match to the original search and with a woman's crotch as its profile picture, stated "Basically I'm a little bisexual nymphmaniac...and not shy about it;)."
A number of the profiles suggested by Facebook contravened its own standards on nudity by featuring close-up photographs of fully naked buttocks and bare breasts. One, suggested by Facebook in response to a search for the woman's name Ayla Brown, showed a man pushing his naked buttocks at the camera in a sexually suggestive way. Another, suggested by the algorithm following a search for the name Amelia Jones, is illustrated with photographs of a woman's naked breasts. Facebook's own community standards state that 'fully nude close-ups of buttocks' are not allowed and neither are 'uncovered female nipples' except in the context of breastfeeding, birth or health-related situations.
Ironically, Facebook, which reported annual revenue last year of more than $70 billion, recently congratulated itself on cracking down on instances of adult nudity on its network. In its Community Standards Enforcement Report, published last November, it said: 'Prevalence of content with violations of adult nudity dropped in Q2 and Q3 2019, due to improvements to our proactive detection technology and adjustments to our methodology for measuring prevalence.' It claimed that it reduced adult nudity and sexual activity violations to around 0.05%.
After being confronted by The Daily Telegraph, Facebook said it had launched an investigation into why it was promoting the pornographic profiles.
A Facebook company spokesperson said: "We have strict rules around adult nudity and we don't allow fake accounts. We have removed the accounts that violated our Community Standards and we are investigating this matter. It is not in our interests to have profiles, pictures and posts that violate our rules on Facebook."