C-SPAN Washington Journal: Chris Riley (October 31, 2021, 2:04am-2:27am EDT)
washington journal continues.
host: we are back with chris riley, the internet governance senior fellow at the r street institute. good morning.
guest: thank you for having me.
host: remind our viewers exactly what the r street institute is.
guest: it's a nonprofit, free-market think tank. we work on a lot of different issues, and i am happy to be part of the technology and innovation team.
host: where does it get its funding?
guest: a mix of sources, which is pretty typical of d.c. organizations. during my time -- [indiscernible] -- my department has been supported entirely by the knight foundation.
host: you were the public policy director for mozilla. remind us what that is, and tell us how your work there translates to what you are doing at r street.
guest: i had the pleasure of working there for almost seven years. i was the first public policy hire on the corporation side -- mozilla has a corporation, which pays taxes, and a nonprofit foundation. in that role i was inside the tech industry, but in a rare position that gave me the perspective to look broader than just the products i was working on, and to really build a team and a strategy for making the internet better. this is what i have built my whole career toward: trying to make the internet better through work at the intersection of public policy, technology, and law. it was natural for me to go from one to the other. the think tank style is a deeper investigative dive into these rapidly evolving and, i believe, very important technology issues.
host: now that we have your background, let's get into the topic we have you here to speak about. tell our viewers exactly what the facebook papers are and why they are important.
guest: perhaps we should call them the meta-memoranda now, with the corporate name change. at this point, several whistleblowers and former facebook employees have taken materials with them and made them available to the
securities and exchange commission as well as newspaper reporters. most recently, and most prolific, was the former employee frances haugen, who captured materials, most of which are internal screenshots of the workplace tool -- facebook's version of facebook, used by employees in what had been a very open internal culture of commentary. the net result is that a lot of internal research that has been done at facebook over the past few years -- on the effects these products have on society around the world, not just in the united states, and also to some extent on the products internally and the effects those had on engagement -- is now in the hands of the u.s. government and many different investigative outlets around the country.
host: how did that information get out of facebook?
guest: i don't know exactly what
mechanics were involved. i believe a lot of it was screenshots taken from an internal computer. facebook internally has had a very open culture, where a lot of information is free to share and employees were allowed to express anything on any topic and communicate with each other in a way that i think has many positive aspects to it. it used to be commonplace within many different technology companies, most of which have since locked down a bit. there has been some interesting coverage suggesting facebook may take the same trajectory.
host: what were the big takeaways from what we found out from the facebook papers and the testimony?
guest: i think what this has revealed, at least from my perspective, is not substantially different from many of the things we had already learned about the effects of a social network at the scale of facebook over several years. it's an increase in scale rather than substance. as a recent example, prior to these papers becoming widely known, an investigative journalist here in california showed on september 22, through research they were doing, that posts from the far-right afd party in germany were appearing three times as often as those from other parties in the german elections. what we are seeing is really detailed evidence of the outputs and outcomes of the recommendation engines that power how and what you see when you use facebook. as a facebook user you are connected to hundreds or maybe thousands of people, and the amount of information available to you is vast. long ago, facebook and many other similarly situated companies were shifting away from
presenting that information to you and built extensive recommendation engines -- the algorithms -- that are working on your behalf in the background to sort and prioritize information for you. we have learned a lot more through the papers about how these work, and about the research facebook has been doing internally to understand how they work in the real world.
host: i'm glad you brought that word up; we hear it over and over, the algorithm. can you tell our viewers what they are talking about when they talk about the algorithm, and what it actually does to shape what they see on facebook and social media?
guest: it's a hobbyhorse of mine as well. prior to public policy i was in computer science, and as a grad student at johns hopkins in baltimore i taught introduction to algorithms twice, so i consider myself an expert on the subject. the algorithm itself is the formula, the recipe, that underlies the code. facebook has written extensive and thoroughly researched technology that seems to say: you liked this post from a democrat; you liked this other post from a democrat; maybe you want to see more content aligned with democratic interests. it's that kind of learning that happens in the background, based on data from a variety of sources, which ends up powering what you see and what you don't see.
host: what's the worry that congress has about what facebook is doing?
guest: at the end of the day, it's any powerful system that we don't have an understanding of, and don't believe we have control of, that i think powers the backlash we are seeing. i am fond of saying that 20 or so years ago, when members of
congress and policymakers used the word computer -- it might be 30 years ago now -- they used it as a synonym for magic, or even black magic. it was a force that was not understood, operating in ways that exerted clear power over the world, and they didn't know how to get a handle on it. things are better now with the word computer, but the same dynamic exists around the algorithm: the idea that it is this scary and powerful apparatus operating in the background, about which we will never have a detailed understanding or ability to control. there's a big element of fear going into this. it's understandable. we need to invest in technology expertise; progress has been made on that, and more would be helpful. there's the recognition of the power of this system -- because it is a system that runs on 3.5 billion people around the world -- coupled with this gap in understanding and a perceived lack of control.
host: let's take a break to remind our viewers that they can take part in this discussion about the facebook papers and the future of big tech. we are opening up our regular lines. for republicans, (202) 748-8001. for democrats, (202) 748-8000. for independents, (202) 748-8002. keep in mind you can also text us at (202) 748-8003. and we are always reading on social media, on twitter and on facebook.
chris, one of the complaints about facebook is how they reacted to january 6 and the push on disinformation. first, talk about what was said
about facebook and what they did following the insurrection of january 6. i want to read this to you: facebook discussed developing extreme measures to limit misinformation, calls for violence, and other material that could disrupt the 2020 presidential election. when former president donald trump and his supporters tried to stop joe biden from being declared president on january 6, 2021, facebook employees complained these measures were implemented too late or stymied by technical and bureaucratic hang-ups. how much do we look at facebook and say, you should have done more on january 6? you already had procedures for this, but it doesn't look like they worked.
guest: that's right. i look back at the events of that tragic day and say yes, i think facebook should have done more. i wish they had done more. there's a deep challenge: institutional myopia is a real problem, inherent no matter the institution, its size, its scale, or how many sociologists work on staff. it's hard to figure out how to draw very sensitive balances like this by yourself. but it's not something that needs to be the forever situation. we are talking about speech; that's why the balance is important. you don't want to err on the side of restricting speech too quickly. free expression is important. it's protected strongly in this country, and it's a universal human right. companies take it on themselves to celebrate free expression, as well they should. so we are talking about a careful balance that is difficult to draw precisely and correctly in every circumstance. it's hard to say, how do we prevent that thing from happening? looking back, i wish facebook had done more to trigger some of these circuit breakers and other methods. it's my hope that the
development of professionalization within the trust and safety field continues. we have seen this over the past year, with two separate 501(c)(6) organizations focused on trust and safety formed to improve shared knowledge. i look at the emergence of those, as well as outside research and advocacy groups, as the dynamics that will help give us the perspective to avoid the bad consequences of institutional myopia in the future.
host: social media companies like facebook and twitter insist they are not media companies and don't have a responsibility for what people say on their platforms. do you agree with their contention? and do you think there should be some regulation of what companies like facebook, twitter, instagram, and other social media companies can put out to the world?
guest: you are talking about one of the hottest issues these days. section 230 of united states law protects facebook and companies like it, as intermediaries, from being liable for the actions of their users. i believe there is an important distinction to be made between companies whose work and business are centered around facilitating the expression of others, in contrast to media companies, many of which are successful, that work to cultivate the content that comes out through their services, which is much more hands-on and direct. i think it is probably correct that historically there has been a level of protection for intermediaries. the same is true in europe; there is a similar law in the european union that protects intermediaries from being held liable for the actions of their users. however, i think it is right and proper to consider what role government can play at this important point in time
to put more emphasis and perspective on investing in responsibility. i think this distinction is important and should be preserved, and there's a role for government to be engaged.
host: let's let viewers take part. we will start with gilbert, calling from raleigh, north carolina, on the democrats line. good morning.
caller: good morning. chris, i wanted to get your thoughts on the issue regarding facebook and how they have been perceived and investigated around suppressing information on their platform regarding political campaigns, in support of candidates and belief systems that they believe in. mark zuckerberg's foundation writes million-dollar checks toward democratic candidates. this is an issue that has been going on for a long time and has been investigated. do you see facebook really doing this? is anything really happening with all these investigations? are they being broken up or regulated? and other social media companies -- i'm skeptical. i think they have a lot of politicians in their pockets, lobbyists spending millions of dollars, and i don't see anything happening. i could be wrong. i would like to be wrong.
guest: i'm not sure what evidence or stories you are referring to. my understanding of facebook is that they have a number of serious, excellent, well-credentialed political veterans from both parties operating internally. if anything, their senior policy leadership is more closely linked to the republican party than the democratic party. i don't see a particular bias in the outputs of the company from my perspective. there is a difference between
intentional actions and the ramifications and repercussions that can arise from these complex systems, and i acknowledge there might be cases where we see that. i mentioned earlier that in germany the far-right party has three times as much visibility as the other parties. i don't believe that was an intentional decision by facebook to promote the far-right party in germany; it was a consequence. on the subject of investigation: i think facebook is being actively investigated by a number of different federal agencies, including the securities and exchange commission, obviously -- i suppose i should start calling them the meta-memoranda; that's appropriate at this point in time -- as well as a number of agencies outside the united states. some have broad regulatory powers to get more of this internal information, which should give us a clearer picture of what's going on and what was intended to go on.
host: so we're not saying whether it's true or false -- you don't think they are pushing out far-right material -- but if they wanted to, could they? they are a private company; this is not a government-owned entity. if they wanted to be more liberal or more conservative, is that their right as a private company?
guest: i agree. they have every right under american law to take a stance on political issues, on any or all political issues. we have seen quite a lot of companies adopt explicitly progressive positions. i don't think facebook has done that, and i think it would be harder for a company like facebook to do that. a company could not do that secretly; they would need to be very overt. at facebook more than most, there would be
quite a lot of whistleblowers coming out of the woodwork if the company were trying to secretly take a stance.
host: roger is calling from raleigh, north carolina, on the republican line. good morning.
caller: my question relates to whether or not facebook is subject to -- i think it's the 1934 federal broadcasting act, i can't remember the name of it specifically -- or the patriot act, things like that, that govern the extent to which the government can monitor what goes on on facebook. also, to what extent are facebook groups actually private?
guest: that's an excellent question. the law you are referring to is the 1934 communications act, the provisions of which set up the federal communications commission as a regulator. i've spent some of my career working at the fcc. there's a robust debate still going on among scholars and researchers in the community about how to approach the internet as a whole -- not just the more traditional fields at this point, like communications companies, but all of the emerging technology. from my perspective there are three options, and i don't have a particular preference. one is to regulate internet companies, especially those that feel a little more like communications services, including social network services, under traditional communications-law paradigms; there are many calls for that. in the united kingdom, a regulator has been chosen to implement a duty-of-care law to try to govern online content and limit the harms: they have chosen their equivalent of the fcc as the government body to look at and implement this. the
united states has gone more toward the federal trade commission, a general competition and consumer protection regulator -- it depends on how you interpret it. there is also a call for a new, third kind of agency that would challenge and not accept the past paradigms. there are pros and cons to any of these. i don't see a clear legal pathway by which a statutory law of the fcc, or its existing authority, could be used to govern this kind of online activity. so when former president trump issued an executive order trying to pressure the independent agency, the fcc, to step up, i, like others, did not feel that was so