Facebook will use artificial intelligence and new human review processes to detect disguised, or ‘cloaked,’ links that breach its policies.
Facebook continues to crack down on the shady side of its social network.
On Wednesday, Facebook announced that it has started fighting back against advertisers and Page owners who link to sites that violate Facebook’s rules but disguise those destinations from Facebook’s reviewers.
In the past, these offenders would disguise the actual destination of a link attached to an ad or post, or they would dupe Facebook’s reviewers by directing them to a dummy page during vetting while sending people using Facebook’s mobile app to the offending page. Now Facebook says it has figured out how to detect these so-called “cloaking” schemes.
“We are utilizing artificial intelligence and have expanded our human review processes to help us identify, capture, and verify cloaking. We can now better observe differences in the type of content served to people using our apps compared to our own internal systems,” said Facebook product management director Rob Leathern and software engineer Bobbie Chang in a company blog post published on Wednesday.
Any advertiser or Page that Facebook finds disguising links that violate its Advertising Policies or Community Standards will be banned, the company said. Pages that don’t use cloaking shouldn’t be affected. Since it began targeting cloaked links a few months ago, Facebook has fended off “thousands of these offenders,” according to the blog post.
Facebook’s latest move follows several others the company has made in the past year to cut down on low-quality links in people’s news feeds. Last year, the company started considering how much time people spend on a page when ranking links to that page and prioritized shares from friends when evaluating Pages’ organic posts. It has also stepped up its whack-a-mole war against clickbait links, such as by pinpointing individual posts. And this year, it has gone after spammy links and the spammers that share them.