
Current, Former Social Media Execs Testify on National Security - CSPAN - September 14, 2022, 11:08am-11:51am EDT

11:08 am
and proudly serves as a historic landmark in the fifth district of south carolina. may god continue to bless this church and its congregation. i yield back. the speaker pro tempore: pursuant to clause 12-a of rule 1, the chair declares the house in recess until noon today.
11:09 am
11:10 am
a facebook employee thanked the c.d.c. for responding to misinformation queries. we'll get moving on removing that one claim as soon as that authorization happens. on july 28, 2022, that's this year, facebook reached out to c.d.c. about doing a monthly
11:11 am
misinfo/debunking meeting. on may 11, 2021, facebook employees organized a be-on-the-lookout for c.d.c. officials. in july, 2021, clark humphrey of the white house asked, any way we can get this pulled down, and cited an instagram account. within 46 seconds facebook replied, quote, yep. we are on it, end quote. and down the account went. is that normal in your time at facebook? >> i don't have experience around that. senator hawley: you don't have any knowledge, and then, presto, it happened after you left in 2020? >> i don't have personal experience. senator hawley: you never heard of anything like this happening? >> i don't. senator hawley: that's remarkable. i thought you were the former vice president of marketing,
11:12 am
strategic operations and analytics at facebook. mr. boland: that's true. senator hawley: what do you think drove this kind of collaboration where you have facebook becoming
11:13 am
11:14 am
11:15 am
another committee i sit on that 4,000 engineers at twitter had access to all of the personal information, user data, geolocations of twitter users. is that accurate? mr. roetter: so i never met him, and he joined the company after i left, so i don't know if that is accurate. senator hawley: he said all the engineers. you're an engineer. did you have access to data? mr. roetter: when i was there i -- senator hawley: did you have access to user data? mr. roetter: i was an engineer. senator hawley: did you have user data access?
11:16 am
mr. roetter: i think i could have gotten it. senator hawley: you did have access? mr. roetter: that's probably right. senator hawley: did you access any user data? mr. roetter: no. senator hawley: lots to unpack there. thank you. >> thank you. >> thank you for the witnesses here today. transparency and accountability, those are the words of the day, because we know that social media companies -- and of course, i am a former computer programmer -- know that data is power. they gather the demographic, the behavioral data from consumers in order to enhance the predictive algorithms
11:17 am
to target consumers with ads, recommendations, perceived interests, or even vulnerabilities. maybe not so great when you're on an extremist or violent website or harmful content. and so when it comes to that, there has to be greater transparency into the platforms' promotion mechanisms, how the content spreads from platform to platform. and i'll say consumers are not just individuals -- we have small businesses, hospitals, schools, everyone on these platforms in some form or fashion. when we say consumer, we can go from the individual right up to our full national security. it matters that we understand better the algorithms that amplify the content and how these things reach their feeds.
11:18 am
so some social media platforms, for example, have standards in place for removing content that promotes holocaust denial or distortion. they're inconsistent in implementing the policies, though, and the content flourishes. so i am going to kind of cut to the chase. mr. roetter and then mr. boland, is there a difference now in how predictive user engagement algorithms engage with harmful content versus other content, and how might we modify an agnostic algorithm to identify illegal or extremist content? how do we take the agnostic out of the math? mr. roetter. mr. roetter: so today the algorithms are doing what they are built to do, which is maximize attention on the
11:19 am
platform. if you changed what those companies were accountable for -- these companies are very smart. they have a lot of engineers and a lot of computational power. they would change what the algorithms do. for example, if companies were penalized for sharing a certain type of content, they would no longer promote that content, because it would not be optimal for them to do so. the extra benefit they get from the attention and the usage would be outweighed by whatever the penalty was. so this is all possible. i think the two takeaways are, one, without transparency we are not going to know what it's doing today. two, they will behave optimally in the face of the incentive structure they have. they are doing what you would expect. you could change that. mr. boland: i think it's important to note not only do we not know what's happening on the platforms, the platforms don't know what's happening on the platforms. the turning point for me from having
11:20 am
concerns to being publicly vocal about my concerns was when facebook said nothing on january 6 happened on our platform, and then we found there was a lot of stop the steal content on the platform. in order to change these algorithms, part of it is understanding what is happening and then, as a society, having a conversation about what we think the right distribution is. facebook has proven with things like qanon, after the fact, after the fire was lit and burned through, they could then adjust it and actually manage the distribution of that type of content. so it is possible when we know what we're managing towards. the problem is it's all after the fact. it's after all the damage is done that you go back and say, ok, there's been this set of articles or conversations, and finally we go back to address it. rather than saying we have a whole research team of people that could spot things and adjust things.
11:21 am
i'm sympathetic to the fact that human speech is very complicated. senator rosen: the platforms have a willingness -- they actually want to have this lack of understanding so they have some deniability on the back end, if what you're saying is true. oh, my gosh, it happened. by not doing their analysis ahead of time and not understanding their platform, they're setting themselves up for deniability, in my estimation. we'll move on to cybersecurity because i have a few minutes left. we know the whistleblower complaint from twitter's former head of security alleged failures to protect influential figures from spam and security breaches. the complaint alleged company servers were running out-of-date and vulnerable software and
11:22 am
withheld information about breaches and lacked protections for user data. i'm really concerned about cybersecurity. companies are laser focused on growth, not laser focused on protection, in my opinion. small businesses, hospitals, schools, critical infrastructure, all of those things we're responsible for here are at potential risk. so based on, again, both your experience working at facebook and twitter, is cybersecurity a high enough priority for the large social media platforms? and do the social media platform security teams work alongside product development, application development, to protect against cyberattacks? are you looking for these breaches? how are you working that? and how does this threaten our own security, even our national security? mr. roetter: so the teams, they do work alongside engineering.
11:23 am
you need to build something to drive usage and revenue and make it secure enough. so in terms of your question -- is it a high enough priority -- the answer can only be known if you know the nature of the threat and whether the bad actors are being successful. senator rosen: so there are no operations built in for people trying to breach the data? there's no kind of hunt forward? there's no way that people are really actively looking for data breaches, you're finding it after the fact in many cases? mr. roetter: no. there is some penetration testing, and people breaking in to learn, that happens. mr. boland: my sense from a meta standpoint is they're quite good and invested in protecting people's data from a cybersecurity standpoint, which goes to show you that where there is a will and desire to make progress on issues, they can. in my experience, it's quite
11:24 am
strong. senator rosen: thank you. my time is up. >> senator hassan. senator hassan: thank you, mr. chair and ranking member portman, for this hearing, and to the witnesses for being here today. i really appreciate it. i want to start with a question that builds on what senator rosen was discussing, and this is to mr. boland and mr. roetter. terrorists have horrifically livestreamed their attacks on social media. these in turn inspire other individuals to commit future attacks. are there ways for social media companies to quickly identify terrorist content so it's not shared in real time? mr. boland, i'll start with you. mr. boland: i know for livestream videos that meta has put considerable resources and a.i. into trying to spot these
11:25 am
attacks and take them down quickly. they have gotten better. it's an incredibly hard problem. i am not an expert on the extent of what's possible there. i do think they've made strides. mr. roetter: it's certainly possible. it's a hard technical challenge, but you can build algorithms to figure out in real time or near real time the content of videos. they won't be perfect, like any segmentation algorithm. senator hassan: this is an ongoing issue. we've seen the acceleration from idea to action because of the influence of social media, too. i thank you for that and look forward to following up with you on that. another question for the two of you. facebook is currently running an advertising campaign touting the thousands of employees and billions of dollars that the company says it spends on safety and security. these numbers, however, are pretty meaningless without proper context, right? what specific information or metrics should these companies provide this committee to help us fully understand their actual
11:26 am
commitment to safety and security? mr. boland, again, we'll start with you. mr. boland: you're 100% correct that the context of the numbers matters. i think when they announced $13 billion over five years, it was in the context of $50 billion in stock buybacks. a massive amount of money. they give you a number of employees, but if you think of these issues, you have employees who are nontechnical, who can be in what would be a review queue to look at content, versus engineering resources. on how many engineers are put on these problems, i would probably get from the companies an understanding of where they allocate their engineers for these types of problems, right? they don't need to show you their entire org chart. these are the numbers working on the safety and security issues.
11:27 am
and this is how they're allocated by country, by topic, etc. i think that's justifiable to ask, to understand and feel like we have a sense of whether that's adequate relative to the total number of engineering employees. senator hassan: thank you. mr. roetter. mr. roetter: we need metrics of the form that show what results they're getting, not metrics that basically equate to, we're trying real hard, give us a break. that won't work on wall street. you can't say, we tried to make a profit this quarter. you have to show what the results are. if you have transparency over the content, how the content is shared, and the engagement on that content, independent people will be able to study it and see that certain content spreads very widely and other content does not. then, after this investment has been made, has this changed or not? so we need metrics where we can measure the actual result, not just, i tried really hard so please be happy. mr. boland: one more quick
11:28 am
thought there. i worry a lot of times -- because it's so painful, we focus on these extreme examples of content like the livestreamed shootings. there is a broad swath of content that influences people that doesn't feel as scary. that's the stuff that terrifies me. that's the stuff we don't get to see without transparency. senator hassan: i thank you both. i thank the whole panel for your testimony. i'm very grateful for this hearing, and i yield back. >> thank you, senator hassan. during your opening statements, i think each of you discussed the product development process at these companies, and we've talked at length about that process through the hearing. mr. boland, you discussed how facebook does not incentivize limiting the spread of harmful content but, of course, prioritizes growth and revenue. so could you tell the committee generally what goes into metrics and employee compensation at the company?
11:29 am
mr. boland: so employees at facebook, you kind of receive two -- it's about rewards, right? so the rewards you receive are your cash compensation, stock compensation, and promotions. generally, that comes from building products, and that product's success is identified by some sort of metrics, whether that product is being used more. let's pretend you're building a video product. the things you care about are the metrics: what are the total watch hours or hours spent watching videos, what's the user growth, how many people are using that video product, where does it spread geographically, etc. you're incentivized on those hard metrics. then you're not incentivized around, well, what kind of content are you growing your video with? what's the stuff underneath the hood that's showing up, driving this growth? it's not your problem. it's somebody else's problem.
11:30 am
there may be companywide goals, ok, for the company -- i wasn't aware of any safety or trust goals. they could theoretically create one. the problem is it doesn't drive individual behavior. company goals are there. you don't think about them. you think about what you individually and your team deliver. that's always metrics. that's always product growth metrics and success metrics of the product, and not success in, are we keeping people safe. senator peters: there's not a -- mr. boland: for the safety team, they've moved into a central team. i did not experience products like videos or others carrying a metric that is incentivizing trust and safety. senator peters: that's not there? is that the case as well? mr. roetter: yes. i agree there's that promotion system, compensation system, review system. the problem with trust and safety metrics is typically companies may have top-level
11:31 am
goals, and maybe one is trust and safety. the problem is it's at odds with the other metrics, and the other metrics always win. if i'm an engineer building a livestream product, if i launch it and it gets some usage, that's a feather in my cap. that's something i can say, that's what i did. it will help me with promotions, compensation, career advancement. if at the last minute i decide not to launch that product because i realize i can't control some of the safety aspects, and we shouldn't do it if we can't do it without certain safeguards, i get zero credit for that. it's as if i've done nothing for the company over the last x months. senator peters: you're effectively punished, and your future advancement will be questionable as well? mr. roetter: it's no different -- a product i build and don't launch because it might not be safe, it's no different than if i didn't show up to work, in terms of the reward i get. senator peters: not a good place
11:32 am
for an employee to be. mr. boland: you can change incentives and change the way people show up, not even just through goals but through process. there's an example where, when facebook started with the desktop site and moved to mobile, mark zuckerberg required that all products that were demoed to him showed mobile in their demonstration. they had designs around that. he kicked the first team out that came in without that design. suddenly, everybody was thinking about mobile designs. if in your process you create an incentive where, as part of every product design discussion, you ask what are all the harmful ways this product can drive hate or drive extremism or drive polarization, you would have a radical change in the way people showed up to those meetings and, in the process, thought about the negative impacts of your product. senator peters: it's my understanding that you voiced objections about how facebook's recommendation algorithms were actually promoting hateful and extreme content, is
11:33 am
that correct? mr. boland: yes. it was around racist content. senator peters: what was the -- mr. boland: disappointing. i raised issues around the distribution of racist content, and my concern was that we didn't understand it, and i brought steps forward to help mitigate the problem. one, more internal researchers. two, more external researchers. and, sharing more information. i had a range of responses. from, you're wrong, that's not the case -- with no evidence, mind you, just, i believe you're wrong, and no counterevidence. to some, yeah, this might be a problem, but not something we're working on right now. senator peters: when you're saying no evidence -- you work for a company that looks at a lot of data and makes decisions based on data, but this
11:34 am
is something they wanted to ignore, basically? mr. boland: when i had my moment where i really came to terms with believing that the product could be causing harm, i started to look at a variety of things that research teams were doing internally to understand what they were saying. the internal dialogue and the internal documents, many of which were shared, were troubling. there was a particular document that was an overview of polarization research, i think june, 2020, that talked about political polarization, and one of the lines was that we have not researched and have very little understanding of racial, ethnic, or religious polarization. that was particularly concerning to me. senator peters: one of the documents submitted by a twitter whistleblower to the s.e.c. was a 2021 study he commissioned of the site integrity team's capabilities. the study found that twitter
11:35 am
planned to launch a new product, fleets, just weeks before the 2020 elections. the integrity team, according to that document -- i'm quoting the document -- quote, had to beg a product team not to launch before the election because they did not have the resources or capabilities to take action on misinformation or disinformation on the new product, unquote. the report also found, quote, while product teams elicit feedback for new launch products, they're incentivized to ship new products as quickly as possible and thus willing to accept security risks, end of quote. are these findings consistent with the pattern of decision-making that you saw? mr. roetter: with the caveat that that specific example happened after i was there and i can't speak to it, that's absolutely consistent. in fact, i would be surprised, given the incentives at play, if the product team had done
11:36 am
anything else. the product managers are, quote-unquote, c.e.o.'s of their product. it's their decision to launch or not. again, there is no possible credit or reward from not launching, whereas there is a possible credit or reward from launching. they would get at least some usage and potentially drive revenue, so there's every reason to launch and not worry about the other issues. senator peters: thank you. thank you, ranking member portman. any other questions? senator portman: thank you. it was allegedly rolled out in such a rush, to your point, that it hadn't been fully tested for safety. twitter lacked real-time audio content moderation capabilities when they launched it. we're told in the wake of our
11:37 am
withdrawal from afghanistan that it was exploited by the taliban, and taliban supporters talked about how cryptocurrency can fund terrorism. first off, is that accurate? mr. roetter, i'll start with you. and second, why would twitter launch without content moderation capabilities? you said sometimes they are under pressure to ship products as soon as possible. was that why this happened? mr. roetter: so it is accurate that they're under pressure to ship products as soon as possible. twitter in particular has a history of being worried about user growth and revenue growth. it's not the runaway success that facebook or google are. there were extreme pressures to launch things. a saying we had: if you walk around and ask enough people if you can do something, eventually you'll find someone that says no. the point of that was to really
11:38 am
emphasize, you just need to get out and do something. again, the overwhelming metrics are usage. you would never get credit or be held up as an example or promoted or get more compensation if you didn't do something because of potential negative consequences on the safety side or otherwise. in fact, you would probably be viewed as someone that just says no or has a reason not to take action. there's a huge bias towards taking action and launching things at these companies. senator portman: are you aware of the taliban having exploited it? mr. roetter: that specific example i am not. senator portman: do you think -- assuming my example is correct, which i think it is -- that it would be helpful to get behind the curtain and know why decisions have been made? mr. roetter: i haven't read the draft of that. from my understanding, it's correct. having more understanding of what these products do and what sort of content is promoted and what the internal algorithms are that
11:39 am
drive both decision-making and usage of the products would be extremely valuable. without any of that, i would expect examples such as this to keep happening. senator portman: on this trust and safety issue, and specifically the product development and business decision-making processes, mr. boland, let me direct this to you. meta disbanded its responsible innovation team, it was announced just last week. did you see that? mr. boland: i did. extremely disappointing. senator portman: they had been tasked with addressing harmful product processes, so you're saying it was concerning to you. why are you concerned about it? and tell us about how you interacted with integrity teams while you were at facebook. mr. boland: i know the people who led that team. very, very high integrity. very intentional about responsible design of products, as the team was named.
11:40 am
without that kind of center of excellence that's helping to shape other teams, i fear that meta is not going to continue to have that as a part of their conversation. you can think about that group as influencing and indoctrinating, if you will, how to start to think about some of these issues. it's less hard-coded into the incentive structure, which i think is a missing element, but it would have driven really important conversations on how to ethically design products. disbanding that unit -- and i don't believe them when they say they are making it a part of everything, that they'll interweave it in the company. that's a very convenient way to dodge the question, in my view. i don't believe they're going to continue to invest in it if there isn't a team. this comes at a time when meta is building the metaverse. we don't know how it will play out. i'm extremely concerned because
11:41 am
the paradigms of the past around content and content distribution are very different in the metaverse. that's an area where, if i were this committee, i would spend a lot of time really trying to understand the risks of the metaverse. it feels very risky to me. it feels like the next space. not having that team helping to guide it, that's concerning. senator portman: same question to mr. roetter on how to evaluate these trust and safety issues and the responsible innovation team and the impact it's having. do you think it will be helpful to have the platform accountability and transparency act? mr. roetter: i think so, if we get from that more information to illuminate what the incentive structures are. they are operating in a vacuum. we see a lot of conversation about this where people will cherry-
11:42 am
pick and use it as evidence of whatever their theory is of what these companies are doing. of course, it must be true because here is one example. the fact of the matter is these companies are so massive, and there is so much content, they can cherry-pick whatever they want. and without broad-scale, representative data from which we can compute what is being promoted, and then reverse engineer what the incentives might be, we will never see these things. senator portman: what are your thoughts on that, mr. boland? mr. boland: i think the issue we face today is that we have to trust the companies, and having a robust set of data to understand what's happening and make these public conversations, not company conversations, is critical. meta would like to tell you they would not like to put their thumb on the scale when it comes to algorithmic distribution. these algorithms were built in such a way -- they're already doing a lot to shape discourse
11:43 am
and to shape what people experience. we don't get to see it. we have to trust the companies to share with us information that we know they're not sharing. as i said earlier, i think the platform accountability and transparency act is a critical, critical first step to understanding what is actually happening on these platforms, and we need to do it quickly because these things are accelerating. senator portman: i believe you have the last word. mr. cain: it will have significance not only for our democracy within america but for the position of america in the world. there have been major changes i've seen personally, having been in china and russia and recently ukraine, in the world of technology and the world of social media. my greatest concern is we're ceding too much ground to authoritarian regimes that seek to undermine us, to undermine and to malign us in whatever way they can. the software we're
11:44 am
using, the a.i., the apps, these are
11:45 am
11:46 am
11:47 am
11:48 am
11:49 am
[captions copyright national cable satellite corp. 2022] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. visit ncicap.org]
11:50 am
>> this senate committee hearing now going into recess to allow committee members to head over to the senate chamber to attend votes on judicial nominees. remember, at noon eastern, about 10 minutes from now, we'll have our coverage of the house.
