OnlyFans bribed Meta to put porn stars on terror watchlist: lawsuits
OnlyFans squashed competitors in the online porn industry with the help of a bizarre scheme in which Meta employees were bribed to throw thousands of porn stars onto a terrorist watchlist, according to a group of explosive lawsuits.
Adult performers who sold X-rated photos and videos on rival sites saw their Instagram accounts falsely tagged as containing terrorist content — crippling their ability to promote their business and devastating their incomes, according to the suits.
Sellers of smutty pictures were then “shadowbanned” across Instagram, Facebook, YouTube, Twitter and other sites, the suits allege. Targeted accounts also included businesses, celebrities, influencers and others who “have nothing to do with terrorism,” according to the suits.
“When I heard that my content may be listed on the terror watch list, I was outraged,” Alana Evans, an adult performer and one of the plaintiffs in the California suit, told The Post. “I was angry because it affected my income when my social media traffic dropped significantly, and I was angry because I am the daughter of a veteran who fought for this country.”
Evans and others were all allegedly placed in a database of terror-linked accounts run by the Global Internet Forum to Counter Terrorism, or GIFCT, a nonprofit group intended to stop the spread of mass shooting videos and other terrorist content across social media sites.
After adult performers who used rival sites had their names added to the GIFCT’s list, traffic to those sites drastically fell, the suits allege. Meanwhile, OnlyFans’ traffic and profits soared as the site became a household name.
Law firm Milberg Coleman Bryson Phillips Grossman is representing the plaintiffs in suits filed against Meta and OnlyFans. The lawyers claim they have acquired a list of more than 21,000 Instagram accounts that they say were unfairly tagged as potential terrorists, previously unreported California Superior Court filings show.
In a statement to The Post, Milberg partner David Azar called on Meta and the GIFCT to “open up” their records “to help figure out whether our clients or their content are indeed on any databases intended for terrorists, and how to get them off.”
In a statement to The Post, OnlyFans said, “We are aware that these cases have been filed. We are not aware of any evidence which supports these allegations. The alleged participants have all publicly stated that these cases have no merit.”
Meta did not respond to requests for comment but told the BBC, which first reported the bribery allegations, that it had investigated and found no evidence the terror database had been abused.
“These allegations are without merit and we will address them in the context of the litigation as needed,” Meta said.
The GIFCT likewise did not respond to The Post but told the BBC that it was “not aware of any evidence to support the theories presented in this lawsuit between two parties with no connection to GIFCT.”
“Our continuing work to enhance transparency and oversight of the GIFCT hash-sharing database is the result of extensive engagement with our stakeholders and has no connection to these claims,” the GIFCT added.
The plaintiffs claim the scheme dates back to 2018, when they say one or more Meta employees — potentially including an unnamed senior executive — took bribes from OnlyFans.
They claim the bribes were routed from OnlyFans’ parent company, Fenix International, through a secret Hong Kong subsidiary into offshore Philippine bank accounts set up by the crooked Meta employees.
The suits — which also name OnlyFans majority owner Leonid Radvinsky as a defendant — claim the bribes paid off around October 2018, when people who sold content through OnlyFans’ rivals were allegedly hit with a “massive spike in content classification/filtering activity” that limited their reach. Meanwhile, OnlyFans users enjoyed a “mysterious immunity” to the crackdown, the plaintiffs claim.
“The blacklisting of plaintiff and others has caused OnlyFans to achieve a drastically enlarged market share while its competitors stagnated or declined,” attorneys in a class action led by OnlyFans competitor JustFor.Fans wrote in an August court filing in California state court. “The defendants engaged in a scheme to misuse a terrorist blacklist to obtain a competitive advantage.”
The suits include the California Superior Court filing on behalf of JustFor.Fans and a California federal court suit on behalf of a group of women led by the Adult Performing Artists Guild. In June, Meta asked a judge to throw out the federal suit. Hearings in both cases are slated for September.
Another suit, filed in Broward County, Florida, on behalf of adult site FanCentro, lists OnlyFans as a defendant but does not name Meta.
The GIFCT was formed by Meta, Microsoft, Twitter, and Google’s YouTube in 2017 in a joint effort to stop the spread of mass shooting videos and other terrorist material online. When a member of the group flags a photo, video or post as terrorist-related, a digital fingerprint called a “hash” is shared across all its members.
In effect, that means a bikini pic wrongly flagged as jihadist propaganda on Instagram can also be quickly censored on Twitter or YouTube, all without the poster or public knowing that it was placed on the list — much less how or why.
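In rough terms, the mechanism works like a shared blocklist of fingerprints. The sketch below is a hypothetical simplification (the real system relies on perceptual hashing of media, which matches visually similar files rather than exact bytes), but it shows how a single flag by one member can propagate to every other member:

```python
import hashlib

# A simplified stand-in for the GIFCT hash-sharing database.
# (Hypothetical sketch: the real system uses perceptual hashes,
# which match visually similar media, not exact-byte hashes.)
shared_hash_db: set[str] = set()

def fingerprint(content: bytes) -> str:
    """Compute a digital fingerprint ("hash") of a piece of media."""
    return hashlib.sha256(content).hexdigest()

def flag_as_terrorist_content(content: bytes) -> None:
    """One member platform flags content; its hash joins the shared list."""
    shared_hash_db.add(fingerprint(content))

def is_blocked(content: bytes) -> bool:
    """Every other member platform checks uploads against the same list."""
    return fingerprint(content) in shared_hash_db

# One platform's mistaken flag...
bikini_pic = b"...image bytes..."
flag_as_terrorist_content(bikini_pic)

# ...now suppresses the same content everywhere,
# with no notice to the poster.
print(is_blocked(bikini_pic))  # True on every member platform
```

Because members share only the fingerprint, not the reasoning behind the flag, a mistaken classification carries across platforms with no built-in review.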
“Due to the proliferation of the GIFCT database, any mistaken classification of a video, picture or post as ‘terrorist’ content echoes across social media platforms, undermining users’ right to free expression on several platforms at once,” Electronic Frontier Foundation researchers Svea Windwehr and Jillian C. York wrote in 2020.
“While [the GIFCT’s system] sounds like an efficient approach to the challenging task of correctly identifying and taking down terrorist content, it also means that one single database might be used to determine what is permissible speech, and what is taken down — across the entire Internet,” the researchers added.