Portman Presses Meta Official on Policies Allowing Exploitation of Children

WASHINGTON, DC – Today, in response to questioning from Ranking Member Rob Portman (R-OH) at the second panel of a Senate Homeland Security and Governmental Affairs Committee hearing examining social media’s impact on homeland security, Chris Cox, the Chief Product Officer at Meta, refused to commit to revising Meta’s policies that result in higher rates of child sexual abuse material on the platform.

Portman has been a leader in combating the exploitation of children, most recently with his Stop Enabling Sex Traffickers Act (SESTA), which was signed into law in 2018. SESTA reformed Section 230 by removing barriers to both criminal prosecution and civil suits against websites that knowingly facilitate online sex trafficking. Portman expressed his concern that Meta’s policy fails to effectively moderate child sexual abuse material, resulting in underreporting to the National Center for Missing and Exploited Children and to law enforcement.

Portman also pressed the officials on the need for increased transparency. All of the officials in attendance (Mr. Cox; Neal Mohan, Chief Product Officer of YouTube; Vanessa Pappas, Chief Operating Officer of TikTok; and Jay Sullivan, General Manager of Bluebird at Twitter) agreed with the need for transparency and signaled support for Portman’s bipartisan Platform Accountability and Transparency Act (PATA). The legislation would require social media companies to provide certain platform data to vetted, independent researchers so that these researchers may study and improve the public’s understanding of the inner workings of social media platforms.

A transcript of the questioning can be found below.

 

Portman: “Thank you, Mr. Chairman. I look forward to getting into this issue of the balance between free speech and the hate speech that leads to violence because it is a line that has to be drawn. I know it’s not easy, but I’m going to talk about one that I think is easier, and that is child sexual exploitation. I talked about it in my opening statement a little bit. We all know that the spread of this sexual abuse material is a persistent threat. In fact, we know that last year over 29 million reports came in of child sexual exploitation. That was a 35 percent increase from just 2020. So this is an increasing problem across the board, but particularly with regard to our kids. That’s why I thought it was so unfortunate, Mr. Cox, when I learned about the Meta policy directing content moderators, and I quoted this earlier, but it’s to err on the side of the person involved in sexual exploitation being an adult when they are unsure about the age of the person. Let me give you a chance to respond to that. This has been in the public media. Doesn’t mean that it’s true, I suppose. But is that truly what you directed your content moderators to do?”

 

Chris Cox, Chief Product Officer for Meta: “Senator, I know that, we…first of all, this is an incredibly serious issue, and I appreciate your work on this issue. As a father of two kids, this is something I personally care about making sure that we pay attention to as well. The work that we do here is in consultation with NCMEC, the National Center for Missing and Exploited Children. We’ve been the most aggressive of the tech companies there. We’ve referred more content to them, I believe, than all the other tech platforms combined. That’s both through the work we do on WhatsApp and Messenger, as well as across the family of apps. My understanding on this specific question is that we received direction from NCMEC to prioritize known CSAM content, which was the nudge that they gave us on where they wanted us to focus our time. I haven’t been focused on that specific conversation, but I’d be happy to have the team follow up.”

 

Portman: “Yeah. So let me just be sure I understand this. You’re blaming the National Center for Missing and Exploited Children for changing your approach of moderators, saying that we’re going to assume that kids are adults if we don’t know? I mean, NCMEC has said you have a responsibility, all of you do, to report all images that appear to involve a child so that law enforcement can intervene to stop the abuse and prosecute the perpetrators. Period. And I can’t believe that you’re saying that NCMEC would want you guys to send out instructions to your moderators saying err on the side of this being an adult if you’re not sure. Did I misunderstand what you said?”

 

Mr. Cox: “Senator, I haven’t been in that specific conversation with NCMEC, but I’d be happy to follow up on the details. I agree. It’s a very important issue.”

 

Portman: “Well, given your role, would you commit to, one, getting back to me on it, and two, ensuring that if that’s true, that you change that policy?”

 

Mr. Cox: “Senator, I could commit to getting into the details of the policy and make sure we follow up with the teams who work on it.”

 

Portman: “Okay. You’re the Chief Product Officer. I would hope that this is one that you would follow up on and ensure it is not the direction you’re giving your moderators, because that’s what’s been publicly reported. With TikTok, we talked about this earlier again in the opening statement: nearly half of American kids use TikTok. As you know, that’s your audience. There are a lot of risks there to privacy and national security, in my view. Ms. Pappas, I understand that TikTok is subject to the laws of the United States, but it’s also subject to the laws of other countries in which it operates, the United Kingdom, Germany. But with regard to China, is it true, yes or no, does TikTok have an office and employees in Beijing?”

 

Vanessa Pappas, Chief Operating Officer of TikTok: “So I think this is another one for clarification…”

 

Portman: “Just yes or no.”

 

Ms. Pappas: “…of which TikTok does not operate in China. And you are right in saying that TikTok is subject to the laws in the United States, as we are incorporated in the U.S. and California.”

 

Portman: “Do you have employees in Beijing?”

 

Ms. Pappas: “Yes, we do, as do many global tech companies, including those on this panel.”

 

Portman: “That’s fine, I’m just asking, do you have an office in Beijing?”

 

Ms. Pappas: “Yes.”

 

Portman: “Okay. And is your parent company, ByteDance, headquartered in China?”

 

Ms. Pappas: “No, they are not.”

 

Portman: “ByteDance is not headquartered in China?”

 

Ms. Pappas: “No. ByteDance was founded in China, but we do not have an official headquarters as a global company.”

 

Portman: “And where is the headquarters of ByteDance?”

 

Ms. Pappas: “We’re a distributed company. We have offices around the world. Our leadership team is largely in Singapore, but we don’t have an official headquarters.”

 

Portman: “You have to be headquartered somewhere, and I think it’s the Cayman Islands, is that correct?”

 

Ms. Pappas: “So the parent company was incorporated in the Cayman Islands. That is correct.”

 

Portman: “Okay, so you’re headquartered somewhere, and it’s the Cayman Islands, but you have a presence in China, and of course, you comply with Chinese law with regard to your people presence in China, correct?”

 

Ms. Pappas: “That is not correct. So, just again, TikTok does not operate in China. The app is not available. As it relates to our compliance with law, given we are incorporated in the United States, we comply with local law.”

 

Portman: “Do you believe that the Chinese Communist Party has the right to access data collected by your company because you have a presence in China?”

 

Ms. Pappas: “Sorry again, Senator Portman, TikTok, the app, is not available in China.”

 

Portman: “No. You said you have an office in Beijing and you have employees in Beijing. That’s a presence.”

 

Ms. Pappas: “Yes. So as we’ve said on the record, we do have employees based in China. We also have very strict access controls around the type of data that they can access and where that data is stored, which is here in the United States. And we’ve also said under no circumstances would we give that data to China.”

 

Portman: “Well, I’m glad that you say that. It doesn’t seem to square with what we know about the Chinese national security law, but I appreciate that approach. The U.S. military banned its own service members from using TikTok for this reason, as you know. And just last month, the House of Representatives warned lawmakers of the risk of using TikTok. These were members of Congress that were told not to use it. Our military is told not to use it out of concern for the user’s privacy and national security. Do you think that those decisions were wrong?”

 

Ms. Pappas: “I wouldn’t opine on the needs for an entertainment platform on federal devices, but I would say that TikTok is an entertainment platform first and foremost, and this is part of the joy that we bring to millions of people around the world. We are very much committed to the security of our U.S. users and citizens, which is why we’re investing so heavily in this area.”

 

Portman: “According to leaked audio obtained by BuzzFeed News, which I’m sure you saw, there are TikTok and ByteDance employees in China who can gain access to U.S. user data. So this Committee will now be looking into the assurance of what you said, that TikTok would not give U.S. data to China. Do you have any response to the BuzzFeed News story?”

 

Ms. Pappas: “Yes, those allegations were not found. There was talk of a master account which does not exist at our company, period.”

 

Portman: “Will TikTok commit to cutting off all data and metadata flows to China, Chinese-based TikTok employees, ByteDance employees, or any other party located in China that might have the capability to access information on U.S. users?”

 

Ms. Pappas: “Again, we take this incredibly seriously in terms of upholding trust with U.S. citizens and ensuring the safety of U.S. user data. As it relates to access and controls, we are going to be going above and beyond in leading initiative efforts with our partner, Oracle, and also to the satisfaction of the U.S. government through our work with CFIUS, which we do hope to share more information on.”

 

Portman: “Can you make the commitment, though, that I just asked you to make that you will commit to cutting off all data and metadata flows to China, Chinese-based TikTok employees, ByteDance employees, or any other party located in China?”

 

Ms. Pappas: “What I can commit to is that our final agreement with the U.S. government will satisfy all national security concerns, yes.”

 

Portman: “But you won’t make a commitment to agree to what I have now twice asked you about.”

 

Ms. Pappas: “Sorry, given the confidentiality of CFIUS, I’m not able to talk specifically about that agreement, but happy to share more when available.”

 

Portman: “Forget CFIUS. I’m not talking about CFIUS. I’m asking whether you would make that commitment today. Will you make that commitment?”

 

Ms. Pappas: “I am committing to what I’ve stated, which is we are working with the United States government on a resolution through the CFIUS process, in which we will continue to minimize that data, as well as working with Oracle to protect that data in the United States.”

 

Portman: “This is part of the United States government too. This is our oversight function.”

 

Ms. Pappas: “I appreciate that.”

 

Portman: “And I’m concerned that you’re not able to answer the question except to say that you will not make the commitment to cutting off this data to China. We think that all data collected relating to Americans and then accessed in China is a problem. We think it should be safe from exploitation by the Chinese Communist Party. And if the data is accessible in China, as you have testified, then it could be exploited. So that concerns us. I’ve gone over my time, I apologize, Mr. Chairman, but I thought it was important to get the answers.”

 

 

Portman: “Thank you, Mr. Chairman. Not to leave Twitter out, I wanted to ask a question regarding the sexual material online we talked about earlier. As I said, this Committee has been a leader in trying to stop human trafficking and specifically sex trafficking of underage kids. And we’ve passed some legislation that’s making a difference. Based on a website called BARK, which I see advises parents on how to keep their kids safe, among the top five severe sexual content sites was Twitter. This year it was widely reported that Twitter considered monetizing sexual content, meaning, as I understand it, people could actually get paid for pornography, basically, for putting sexual content online. My understanding is this project has now been put on ice because a group of Twitter’s employees found that the platform couldn’t effectively separate out the child exploitation content. And I appreciate you did not go forward with this plan. According to The Verge, the Twitter employees have said that despite executives knowing about the child sexual exploitation problems on the platform, they’ve not committed sufficient resources to detect, remove, and prevent this harmful content. There’s a news story which I’d like to ask be made part of the record, so there are lots of issues here. One is you made the right decision not to monetize this explicit content at this time, which is really pursuing a pornography scheme as I see it. But I wonder if you can give us a commitment today to halting this program indefinitely so as to prevent the platform and bad actors from making money off of child sexual material.”

 

Jay Sullivan, General Manager of Bluebird at Twitter: “First, may I say that we abhor CSAM, the sharing of sexual material. I appreciate your work there. I worked on this here and also at Meta, so I’ve been working on this for years. I made that decision to pause this idea. It wasn’t a product. It was a set of people who had an idea that they thought they might want to pursue. I said I want to look at all the information here and learn about where we stand, what the risks could be, and I think this is how the system should work. We looked at a product in its very early ideation and did the analysis and got the perspectives and said this is not appropriate for us to be doing. So that’s how the process went.”

 

Portman: “Okay, so you’ve made a commitment today not to pursue it.”

 

Mr. Sullivan: “We are not pursuing that.”

 

Portman: “And you made a commitment not to pursue it in the future.”

 

Mr. Sullivan: “We have no plans to pursue monetization of adult content. That’s correct.”

 

Portman: “You have no plans to do it. Can you just tell us you’re not going to do it? Can you just give us a plain answer?”

 

Mr. Sullivan: “I’m not planning to do it.”

 

Portman: “You’re not planning to?”

 

Mr. Sullivan: “I don’t know how to say this more clearly.”

 

Portman: “Just say you’re not going to do it.”

 

Mr. Sullivan: “We’re not planning to do it, no.”

 

Portman: “Yeah. Can’t get a ‘planning’ out of there. Okay. Not to again leave anybody out, Mr. Mohan, we haven’t had a chance to talk yet. I want to ask you about something that is important to this Committee and I hope a way forward in terms of legislating and regulating platforms. Your platform’s algorithms have been described as a black box, according to experts and researchers, meaning there’s little to no transparency in the algorithms. I’m sure you’ve heard that before. And the question is, is there a way to come up with a transparency approach that makes sense as calls grow for Congress to pass legislation? I like the idea of having much better information than we have, getting behind the curtain and getting into that black box. That’s why, along with Senator Chris Coons, I drafted this legislation called the Platform Accountability and Transparency Act, or PATA. And it would require the largest tech platforms to share data with vetted, independent researchers, and other investigators so that we can know exactly what’s happening with regard to the privacy issues we’ve talked about today, or content moderation, product development, sexual exploitation issues, key industry practices. So my question for you, Mr. Mohan: would you be supportive of legislation like PATA to get at this need for transparency, for us to be able to legislate with better information?”

 

Neal Mohan, Chief Product Officer at YouTube: “Yes, Senator, I would be supportive of the spirit behind that regulation, and the reason why is because I agree with you. I do think that transparency around our practices, how we go about them, is an important thing. It’s the reason why we’ve invested so heavily in our quarterly transparency report, which you may be familiar with. It’s also the reason why we just a few weeks ago launched the YouTube Research Program, which is similar, in my understanding, to what the act that you’re referring to is trying to get at, which is giving academic researchers access to our raw data, obviously in a user privacy-sensitive way, where they can derive metrics or derive insights of their own based on that data. And we’ve taken it a step further where we will also provide technical support that these researchers might need to get at the insights that they’re looking for. So I’m very bullish about that transparency program and based on the feedback that we hope to get from researchers, look forward to enhancing it in the future as well.”

 

Portman: “We are following your YouTube researcher program carefully. We’re glad you created it. We want to see what the results are. We want to be sure these are independent individuals who will give actual information about what the algorithms are, again, what’s in the black box so that citizens can understand it better, and as legislators, we can legislate better. So I think that’s a positive step. With regard to PATA, can I hear from the other members of the panel how you feel about this legislation? We’ve shared it with all of you, hope to introduce it soon, and, again, it would be bipartisan and it would be one that would, I hope, give us a way forward as a first step. Mr. Cox?”

 

Mr. Cox: “Senator, thanks. I know our teams have been in contact with yours on this. We are aligned that more transparency about content on our platform is a good thing. It’s a good thing for the public, it’s a good thing for the company. We also have an academic research program called FORT, where we’ve designed privacy-protected ways of sharing information with outside academics and researchers. We’ve also released a Widely Viewed Content Report, which helps folks get access to which content is seen the most times on the platform. We also publish a quarterly Community Standards Enforcement Report, which gets into categories of content by region and shows the work we’re doing every day. So we’re committed to working with you on this.”

 

Portman: “Okay, you’ve talked about regulations needed. Ms. Pappas, yes or no?”

 

Ms. Pappas: “Senator, transparency builds trust. We were the first platform to open our own Transparency and Accountability Center, for that specific reason: so people could take a look at our content moderation systems and recommendation systems as well. Last month, we announced that we’ll be opening our API to researchers as well. So we’d be happy to support that legislation.”

 

Portman: “Okay. Thank you. Mr. Sullivan?”

 

Mr. Sullivan: “Yes, we’ve been publishing data to researchers for years, and we’re very open to anything that improves transparency, especially as AI moves forward. It’s going to be very important.”

 

Portman: “It is important, and it is needed. Thank you, Mr. Chairman. Thank you all.”

 

###

 
