Inside TikTok’s transparency center

TikTok is staring down the barrel of an outright ban in the US. It has already been prohibited on federal employee devices and blocked by dozens of universities across the country, and lawmakers are calling for its removal from US app stores.

It’s with that context that I and a handful of other journalists were invited to the company’s Los Angeles headquarters earlier this week for the first media tour of its “Transparency and Accountability Center.” It’s a space that, like the political discussion about TikTok these days, seems more about virtue signaling than anything else. Company officials say the center is designed for regulators, academics, and auditors to learn more about how the app works and its security practices. We were told that a politician-who-would-not-be-named had toured it the day before. TikTok eventually plans to open more centers in Washington, DC, Dublin, and Singapore.

Our tour was part of a multi-week press blitz by TikTok to push Project Texas, a novel proposal to the US government that would partition off American user data in lieu of a complete ban. The CEO of TikTok, Shou Zi Chew, was in DC last week giving a similar pitch to policymakers and think tanks. In March, he is expected to testify before Congress for the first time.

What you see when you first enter TikTok’s transparency center.
Photo by Allison Zaucha for The Verge

TikTok isn’t the first embattled tech company to lean on the spectacle of a physical space during a PR crisis. In 2018, Facebook invited journalists to tour its election “War Room,” which was really just a glorified conference room packed with employees staring at social media feeds and dashboards. Photos were taken, stories were written, and then the War Room was closed about a month later.

In a similar way, TikTok’s transparency center is a lot of smoke and mirrors designed to give the impression that it really cares. Large touchscreens explain how TikTok works at a high level, along with a broad overview of the kind of trust and safety efforts that have become table stakes for any large platform.

A key difference, however, is a room my tour group wasn’t allowed to enter. Behind a wall with Death Star-like mood lighting, TikTok officials said a server room houses the app’s source code for outside auditors to review. Anyone who enters is required to sign a non-disclosure agreement, go through metal detectors, and lock away their phone in a storage locker. (It wasn’t clear who exactly would be permitted to enter the room.)

A room where you can interact with a mock version of the moderation software TikTok uses.
Photo by Allison Zaucha for The Verge

The interactive part of the center I was allowed to experience included a room with iMacs running a mock version of the software TikTok says its moderators use to review content. There was another room with iMacs running “code simulators.” While that sounded intriguing, it was really just a basic explanation of TikTok’s algorithm that seemed designed for a typical member of Congress to grasp. Close-up photos of the computer screens weren’t allowed. And despite it being called a transparency center, TikTok’s PR department made everyone agree not to quote or directly attribute comments made by employees leading the tour.

At the moderator workstation, I was shown some potentially violating videos to review, along with basic information like the accounts that posted them and each video’s number of likes and reshares. When I pulled up one of a man talking into the camera with the caption “the world bringing up 9/11 to justify Muslims as t3rrori$ts,” the moderation system asked me to select whether it violated one of three policies, including one on “threats and incitement to violence.”
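In other words, the mock tool presents a queued video, some account and engagement metadata, and a short menu of candidate policies for the reviewer to choose from. The sketch below is a minimal, hypothetical model of that decision flow; the class names, the second and third policy names, and the field layout are my own assumptions, not anything TikTok showed or confirmed.

```python
# Hypothetical sketch of a single moderation decision like the one described
# on the tour. Only "threats and incitement to violence" was named by TikTok;
# every other name here is an illustrative placeholder.
from __future__ import annotations

from dataclasses import dataclass
from enum import Enum, auto


class Policy(Enum):
    THREATS_AND_INCITEMENT_TO_VIOLENCE = auto()  # named during the tour
    HATEFUL_BEHAVIOR = auto()                    # hypothetical placeholder
    HARASSMENT_AND_BULLYING = auto()             # hypothetical placeholder


@dataclass
class QueuedVideo:
    video_id: str
    account: str
    caption: str
    likes: int
    reshares: int


@dataclass
class ReviewDecision:
    video: QueuedVideo
    violates: Policy | None  # None means the reviewer found no violation


def review(video: QueuedVideo, chosen: Policy | None) -> ReviewDecision:
    """Record a single moderator's call on a queued video."""
    return ReviewDecision(video=video, violates=chosen)
```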

At the code simulator iMac in the other room, I was hoping to learn more about how TikTok’s recommendations system actually works. This was, after all, a physical place you had to travel to. Surely there would be some kind of information I couldn’t find anywhere else?

What I got was this: TikTok starts by using a “coarse machine learning model” to select “a subset of a few thousand videos” from the billions hosted by the app. Then, a “medium machine learning model further narrows the recall pool to a smaller pool of videos” it thinks you’ll be interested in. Lastly, a “fine machine learning model” makes the final pass before serving videos to your For You page.
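That description maps onto a fairly standard multi-stage retrieval-and-ranking funnel: each stage applies a more expensive model to a smaller candidate pool. Here is a rough sketch of that coarse-to-fine narrowing, assuming invented scoring functions and pool sizes; TikTok did not share how its actual models score or rank anything.

```python
# A rough sketch of the coarse -> medium -> fine funnel described above.
# The scoring functions and pool sizes are invented for illustration only.
from typing import Callable, Sequence


def top_k(videos: Sequence[str], score: Callable[[str], float], k: int) -> list[str]:
    """Keep the k highest-scoring candidates under the given model."""
    return sorted(videos, key=score, reverse=True)[:k]


def recommend(catalog: Sequence[str],
              coarse_score: Callable[[str], float],
              medium_score: Callable[[str], float],
              fine_score: Callable[[str], float]) -> list[str]:
    # Stage 1: a cheap "coarse" model pulls a few thousand candidates
    # from the full catalog.
    candidates = top_k(catalog, coarse_score, k=3000)
    # Stage 2: a "medium" model narrows the recall pool further.
    candidates = top_k(candidates, medium_score, k=300)
    # Stage 3: the most expensive "fine" model makes the final pass
    # before videos are served to the For You page.
    return top_k(candidates, fine_score, k=30)
```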

The information displayed was frustratingly vague. One slide read that TikTok “recommends content by ranking videos based on a combination of factors, including the interests that new users convey to TikTok the first time they interact with the app, as well as changing preferences over time.” That’s exactly how you would expect it to work.

Eric Han, head of USDS Trust and Safety at TikTok.
Photo by Allison Zaucha for The Verge

TikTok first tried to open this transparency center in 2020, when then-President Donald Trump was trying to ban the app and Kevin Mayer was its CEO for all of three months. But then the pandemic happened, delaying the center’s opening until now.

In the past three years, TikTok’s trust deficit in DC has only deepened, fueled by a growing anti-China sentiment that started on the right and has since become more bipartisan. The worst revelation was in late December, when the company confirmed that employees improperly accessed the location data of several US journalists as part of an internal leak investigation. That same month, FBI director Chris Wray warned that China could use TikTok to “manipulate content, and if they want to, to use it for influence operations.”

TikTok’s answer to these concerns is Project Texas, a highly technical, unprecedented plan that would wall off most of TikTok’s US operations from its Chinese parent company, ByteDance. To make Project Texas a reality, TikTok is relying on Oracle, whose billionaire founder Larry Ellison leveraged his connections as an influential Republican donor to personally secure Trump’s blessing in the early phase of negotiations. (No one from Oracle was present at the briefing I attended, and my request to speak with someone there for this story wasn’t answered.)

Photo by Allison Zaucha for The Verge

I was given a brief overview of Project Texas before the tour, though I was asked to not quote the employees who presented directly. One graphic I was shown featured a Supreme Court-like building with five pillars showing the issues Project Texas is meant to address: org design, data protection and access control, tech assurance, content assurance, and compliance and monitoring.

TikTok says Project Texas has already taken thousands of people and over $1.5 billion to build. The effort involves TikTok creating a separate legal entity, dubbed USDS, with a board independent of ByteDance that reports directly to the US government. More than seven outside auditors, including Oracle, will review all data that flows in and out of the US version of TikTok. Only American user data will be available to train the algorithm in the US, and TikTok says there will be strict compliance requirements for any internal access to US data. If the proposal is approved by the government, it will cost TikTok an estimated $700 million to $1 billion per year to maintain.

Whether Project Texas satisfies the government or not, it certainly seems like it will make working at TikTok more difficult. The US version of TikTok will have to be fully deconstructed, rebuilt, and published by Oracle to US app stores. Oracle will also have to review every app update. Duplicate roles will be created for TikTok in the US, even if the same roles already exist for TikTok elsewhere. And app performance could suffer when Americans are interacting with users and content in other countries since American user data has to be managed inside the country.

Photo by Allison Zaucha for The Verge

One name that wasn’t uttered during the entire briefing: ByteDance. I got the impression that TikTok employees felt uncomfortable talking about their relationship with their parent company. 

While ByteDance went unacknowledged, its ties to TikTok weren’t hidden, either. The Wi-Fi network for the building I was in was named ByteDance, and conference room screens in the transparency center displayed Lark, the in-house communications tool ByteDance developed for its employees around the world. At one point during the tour, I tried asking what would hypothetically happen if, once Project Texas is greenlit, a ByteDance employee in China made an uncomfortable request of an employee in TikTok’s US entity. I was quickly told by a member of TikTok’s PR team that the question wasn’t appropriate for the tour.

Ultimately, I was left with the feeling that, like its powerful algorithm, TikTok built its transparency center to show people what it thinks they want to see. The company seems to have realized that it won’t save itself from a US ban on the technical merits of its Project Texas proposal. The debate is now purely a matter of politics and optics. Unlike the tour I went on, that’s something TikTok can’t control.

