Facebook is cracking down on scammers who use malware and fraud to generate phony “Like” votes.
The company said that it will increase its use of automated tools to detect and remove fraudulent Likes. The effort will include updated security components that can better identify suspect behaviour.
“On average, less than one per cent of Likes on any given page will be removed, providing they and their affiliates have been abiding by our terms,” the company said.
“These newly improved automated efforts will remove those Likes gained by malware, compromised accounts, deceived users, or purchased bulk Likes.”
The use of fraudulent Likes, or “Like-jacking,” has become a popular method for scammers and malware writers to spread their attacks and divert users to advertising or malware-serving sites, as well as to advertising affiliate pages.
Just as click-jacking attacks trick users into visiting a page and artificially inflating traffic numbers, automated Like operations can artificially inflate the perceived popularity of a page or company, allowing the attacker to collect a higher affiliate rate from an advertiser.
Facebook noted that any effort to buy, sell, or otherwise artificially generate “Like” activity on the site is a violation of its terms and conditions.
“These improvements to our site integrity systems benefit both users and brands alike,” the company said.
“Users will continue to connect to the Pages and Profiles they authentically want to subscribe to, and Pages will have a more accurate measurement of fan count and demographics.”
This article was originally published on V3.