British Committee Hearing on Fake News: Facebook Panel, C-SPAN, February 10, 2018, 3:43pm-5:19pm EST

>> Members of the British committee traveled to the United States this week for a hearing on fake news and the media. On this panel, witnesses from Facebook testified on how the social media site was handling the distribution of misinformation and preventing fake accounts from interfering in the election process. From George Washington University, this is 90 minutes.
Damian: We will go straight into the second panel. I would like to ask Monika some background questions about the Senate intelligence investigation into Russian interference. Some of that work has been quite insightful for us and led to us contacting Facebook and asking for similar work to be done as to whether there was any Russian activity in the U.K. I want to ask about the information that Facebook has provided. Is it correct to say that all the evidence so far, and the number of people who were exposed to content created by the Internet Research Agency in St. Petersburg, all of that announced so far, comes from the analysis Facebook did identifying payments made in
rubles related to advertising around that content?
Monika: The testimony we gave to the committee is the best source of information for the details about that. In that testimony, Colin did speak to how Facebook conducted the investigation. Payment in rubles was one of the signals; the testimony is the best record of that.
Damian: But from the information we received from the Senate Intelligence Committee, they are clear that everything has been extrapolated from those accounts where ruble payments have been made. The Facebook analysis was looking at the lowest-hanging fruit, accounts where ruble payments had been made, and there has been no wider analysis done as to whether similar
activity was taking place.
Monika: I can refer you to the statements we did put out, including in April 2017 and then several over the summer, including one in August and one in September, where we talked about how we did look at that sort of content.
Damian: I am pleased that Facebook has agreed to conduct an analysis of whether Russian agencies were distributing information on the platform. Just to note, our expectation will be that it won't just be based on whether ruble payments were made, or whether it is likely to have come from a Russian agency. Payment for advertising is one tool, but there are many, and we hope the analysis will be
wider than that.
Simon: I can tell you that, as I explained previously in my letter, a second investigation is underway. It has not yet been completed, but I can tell you that we do expect to report back to the committee by the end of February with the results of that investigation, and we will be prepared to share with you, possibly in private because we do not want to tip off the bad actors, exactly how that work was undertaken.
Damian: We look forward to seeing that at the end of the month. To what extent do you feel the company has a responsibility to see that your customers know the source of the information they are seeing?
Monika: We feel very responsible for letting our community know that they are in a safe community, and some of the questions that were put to the previous panel went to questions
of safety, and we can speak about that. That is a huge priority. It is also a priority for us to help people connect with authentic information. We know from talking to our community that it is something they care about, and we are investing a lot, not just in the policies for keeping our communities safe, which tend to be, in a sense, fairly black and white, in that content either crosses the line into something that is unsafe or it does not, but also in the area of fake news, which is different. We have developed a four-pronged approach where we are trying to make sure that when people are engaging with news on Facebook it is reliable news and they have the ability to make decisions that are informed.
Damian: Do you feel that, with information people might share, community pages that might have political content, it should be clear to people where those pages are being administered
from? If I am seeing a page about Kent, where I live in England, is that being administered by somebody who lives in England?
Monika: There is a spectrum of what people might call news and information, and one of the things that is notable about social media is that it has given a voice to people in areas of the world where news outlets do not necessarily reach. There is a spectrum of information. Our job is to make sure people connect with reliable information and make their own decisions.
Damian: Isn't one of the ways you empower people to make those decisions to ensure they understand where the content is being created, and that the creators are who they claim to be?
Monika: There are a couple of things we do to try to increase transparency. On Facebook, and this is distinct from other services, we have a policy that
requires that you use your real name. In terms of removing false news, a lot of that comes from fake accounts: if you think about the worst types of false news, the financially motivated spam that has links that take people to ad farms, that is often propagated by fake accounts. That transparency requirement is important to removing those accounts. There is another thing we do, which is that we are looking at using context as a way of informing people about their news source. This is something we are testing right now. When people see information from a news source, if there is any signal to us that that source might be unreliable, they can see an icon, which we released in November of 2017, and from that icon they are
taken to information gathered from across the internet about the source and its reliability.
Damian: People do set up fake accounts, and you have identified fake accounts.
Monika: We remove them every day.
Damian: What would be the harm in making clear the origin of material as you see it on the platform, where it is being created from? That may be a big signal as to whether it is a source of information you should trust.
Monika: People can see who is publishing, if they are using their real name; and if they are not, we take that account down.
Damian: It could be there for a while, and people do not necessarily know the location the page is being administered from, the country and so on.
Monika: You're right that we do not catch every fake account at its inception. We do find and remove many of these fake accounts every day.
This is also an area of tremendous technical investment for us. In the run-up to the French election, the German election, and the U.K. election, we were using our tools to remove fake accounts, not that those were necessarily related to spreading misinformation, but they were fake accounts, and we are using technical tools to reduce the chance they might be used to spread disinformation.
Damian: I will ask the same question I asked YouTube: what percentage of your revenues do you reinvest in identifying bad content?
Monika: This is something that thousands of employees work on. We just had an earnings call where our CEO said that more than 14,000 people working at Facebook are working on safety and security issues. That includes the engineers who are working on the technical systems, who identify fake accounts, who identify terror
propaganda or other violating material; it also includes the work of our content reviewers, who are looking at this sort of content that has been reported to us.
Damian: YouTube said they were spending, they thought, tens of millions of dollars. How much is the investment for Facebook in money terms?
Monika: I do not have a number to give you. This is such a priority that more than 14,000 people are working on it. It is these people's jobs.
Damian: Do you know what the ad revenue for Facebook is?
Paul: All of my questions are free, so hopefully the U.K. taxpayer will get the best value from my travel. You just mentioned thousands of
fake accounts in connection with the U.S. election. The number I have seen so far is 470 connected with advertising. Could you elaborate on that statement?
Monika: My statement was that in the run-up to the French election, we removed thousands of accounts using these enhanced fake-account technical tools. We have been investing in this area for a long time. This is not something new: the real-name policy is not a new policy, and using tools to find fake accounts is not new. We have been doing this for years. We have had significant advancements in the past year, and that is what has allowed us to remove those thousands of accounts.
Paul: The thousands were in relation to the French and not the U.S. election? Why only 470 with the U.S. election? Who is better at French, you or the Russians?
Monika: I can refer you to the
comments that were put to the committee by my colleague. That is the best place to find information about that. It is part of an ongoing investigation. We are cooperating with the relevant authorities, and that is the best source of information.
Paul: Do you do these sweeps in relation to specific events like elections, or do you do them all the time?
Monika: We are doing them all the time.
Paul: How many thousands of fake accounts have you suspended that have no connection with specific events like elections?
Monika: Without knowing precisely, that is probably the most common scenario. We remove many false accounts every day, and many of those are created for the purpose of sending out spam links or engaging in other bad behavior.
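The kind of "created en masse" signal described here can be pictured with a toy sketch. This is purely illustrative and not Facebook's actual system: the threshold, the data layout, and the idea of keying on a shared signup IP are all invented for the example.

```python
# Toy illustration only: flag accounts that appear to be registered in
# bulk from one source, a crude stand-in for "created en masse" signals.
# The threshold and data layout are invented for this example.
from collections import Counter

def flag_bulk_signups(signups, threshold=3):
    """Return the source IPs that registered `threshold` or more accounts."""
    counts = Counter(ip for _account, ip in signups)
    return {ip for ip, n in counts.items() if n >= threshold}

signups = [("acct1", "10.0.0.5"), ("acct2", "10.0.0.5"),
           ("acct3", "10.0.0.5"), ("acct4", "192.168.1.9")]
print(flag_bulk_signups(signups))  # {'10.0.0.5'}
```

A real system would combine many signals, including naming patterns and behavior after signup, as the testimony goes on to describe.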
Some of them we can catch at the time of creation, and we stop them from creating the account. Others we can remove quickly after identifying them.
Paul: If you cannot give us the numbers now, can you provide us with a briefing as a follow-up?
Monika: When you remove accounts quickly, you do not necessarily know what the purpose of those accounts might have been. Accounts we find that have been created en masse, or where there are signals that they are not being accurate in a name or are engaging in a false way, we remove regardless of why they have come to Facebook.
Paul: The 470 accounts, in relation to the dollars or thousands of rubles or equivalent, why did
you accept that money?
Monika: That is part of an ongoing investigation; we are continuing to cooperate with U.S. authorities. What I can do now is refer you to the comments to the Senate Judiciary Committee.
Paul: The question is unanswered: why did you accept the money? Do you not make any efforts to know your user or your advertiser?
Monika: We do. With regard to our systems generally, we can speak about ads and user-generated content. When it comes to ads, every ad that comes to Facebook is reviewed by automated or manual review before it goes live. An important component of advertising on social media is that it does happen quickly. We try to use these systems to find things like bad content, or an ad that might have a certain word in it that would suggest we should take the time to review it before it goes live.
There is a combination of signals that might lead us to review an ad again after it goes live; the review does not stop.
Paul: You've taken the money in the first place. If you do not know your advertiser, or your user, how can you be sure you are not in breach of international sanctions or money laundering regulations? What responsibility do you take?
Monika: We have a team that works very hard to make sure that when it comes to taking money we are complying with all laws, such as those concerning sanctioned individuals and countries. As I mentioned before, we also have a policy that requires accounts to be authentic, and the advertiser must have an account before they can purchase an ad on Facebook. Our advertising is a self-service model, and what that means is that if you use Facebook and you have an account, you can run an
advertisement. If you do so, that advertisement will be reviewed in some fashion before it goes live, and we will look at additional signals after the ad goes live, including how people are interacting with that ad.
Paul: My son opened his Facebook account at the age of nine and has found himself unable to change his birthday. So your check on who signs up stumbles at the first block. It is important for users looking at the integrity of Facebook, and for advertisers, to be reasonably confident they are dealing with real people, not fake people. What can you do to improve your game in making sure that people are dealing with real people,
not fakes?
Monika: That is a very important part of what we do. When people come to Facebook, they expect that they are interacting with real people. That is the cornerstone of our service. The minimum age to come to Facebook is 13, and we do have automated systems, and things we can follow up on in private. We do have systems that try to detect when a person is putting in a false birthdate, and we have systems that restrict people changing their birthdates. We catch many people who try to come online earlier. This is something we are constantly investing in and constantly improving.
When we enforce those safety-related policies, such as not allowing people to bully others, some of that can be done by technical tools, as Google and YouTube described. We are seeing real gains from that. A lot of it is also contextual and requires human review. There are technical tools that allow us to identify that an account may be fake; if it does appear to be fake, we will remove it.
>> If I go to Facebook and complain that my identity has been taken over, how long, in terms of your policy, would it take for
that page, post, or site to be removed by Facebook, so that I can be satisfied you are reacting?
Monika: The vast majority of complaints we get are reviewed within 24 hours. If there is something we think is safety-related, it goes to the front of the queue. If you are seeing an imposter account, that is something we would attempt to respond to very quickly; an imposter account can wreak havoc. That is not always a simple inquiry. I have sat with our reviewers who do this work. You may have one account and another account, and you have to look not only at the dates the accounts were created,
but at the activity of the accounts, to figure out which one is right; you may have to ask for uploads of forms of identification. We don't always hit that mark.
>> You think you can do much better. I wanted to ask, to combine something you said: have you done an analysis of which countries these fake accounts come from, when talking about them being taken down?
Monika: There are various signals our team uses. We can follow up privately on those signals.
>> It matters for people who are trying to influence our elections, because all such material is very regulated. It's really important, so I would appreciate it if you could. We had a very controversial referendum in the U.K. Do you have information on where this content came from?
Simon: We understand why people are concerned about it. It's one of the reasons why the Electoral Commission approached us in October of last year and said: we would like you to assess whether or not there was indeed misinformation coming from another country. And we were keen to cooperate on that.
We reported some findings from an initial investigation to the chair of the committee. I think it's fair to say, having reflected on that, that we hadn't done enough work. The answer is we won't be able to tell you until that work is completed. We are committed to telling the committee the outcome of those results. The one thing I would say is that, unlike the U.S. election, we have not seen any intelligence reports that suggest there was outright Russian interference using Facebook in the Brexit referendum. The U.S. is different, where there is an intelligence report demonstrating it.
>> Obviously the referendum was about membership of the European Union going forward.
Countries within the EU, not just Russia, had more of a vested interest. Are you looking at any other country?
Simon: I have not been aware of any parliamentary debate or newspaper stories, or anything else, suggesting that countries other than Russia might be doing this.
>> It is not whether anyone has suggested anything; I'm asking, as a company, are you looking?
Simon: No, we are not.
>> There was clearly misleading information on Facebook during that referendum. Anecdotally, any time you knocked on a door and somebody told you something, you would ask where that information came from. They would say Facebook. That was every time. What do you think the company can do to stop that proliferation of false information that was getting shared and re-shared?
I think people were buying into it because the people they were sharing it with were trusted. Goodness only knows. What, as a company, can you do to stop that kind of proliferation of information?
Monika: I want to be clear that we do not accept that sort of proliferation of false content on Facebook. I want to make sure we are distinguishing between sorts of content like extremist content and bullying content, where there is a bright line of leave it up or take it down, versus fake news. When people hear the term fake news, as you pointed out, many people will share stories of things they see online. Those range from the financially motivated scammers, which is the most common type of false news we see on the platform, down the
line to sensationalist headlines, where the underlying story may be based in truth but uses certain words to get people to click on a headline. We can't have one policy that addresses all of that. Since this has become a topic of interest and concern over the course of the past year and a half or so, we have developed a four-pronged approach to this. You could take this to the point where you say: what if we had a policy that allowed only true content? We would be in no position to know, about an individual post, whether it is true or false, and it would inhibit the type of speech that we all engage in: debating things, speculating about
things, sharing opinions on things. The first prong is that we remove false accounts and known bad actors; that takes care of a lot of that content. The next thing we try to do is disrupt the financial incentive for the sorts of bad actors that come to Facebook. I mentioned earlier that they may post links to what look like exciting articles that take you off site to an ad farm. We are getting better at detecting those ad farms, or something that looks like a video that plays but is a ruse to get people to click on it. Our systems are detecting and removing that. The next thing we are trying to do is prioritize the visibility
of content that is trustworthy, and specifically deemed trustworthy by our community, we are very interested in this, and reduce the visibility of content where we have a reason to suspect it is unreliable. People can report to us when they believe news to be fake, and we are using a system of fact checkers. If we have an indication that news is fake, we reduce its visibility by 80% in the News Feed. I really want to underscore this, because I think this is so important to us long-term: we are trying to improve the ability of the community, not just people using Facebook but journalists, policymakers, parents, to fight false news by recognizing it, distinguishing among news sources, and being able to make those responsible choices. We are doing that in the U.K.,
where we are working not only with young people, where we have ambassadors talking about how to recognize false news; we did this in the run-up to the U.K. election, where we ran, with Full Fact, tips for how to spot fake news. We actually went out in traditional media and published these, helping people make responsible choices. It is something we have been doing with the News Literacy Trust in the U.K., where we try to research and understand the best way to help people recognize false news and disrupt it.
>> There is, though, this misinformation that is on your platform as a company. It gets re-shared and everything,
therefore influencing elections, and I would say having a very negative effect on the democratic process and the debate between people.
Monika: I mentioned the broader initiative. I want to emphasize that we are engaging with the community and trying to learn more about what they are seeing as this news manifests itself. In addition to that, we are testing a way of giving people information through related articles: if you see a particular topic, underneath that you will see related articles.
We are also looking at ways of incorporating brand logos, so that if you find content on Facebook, you know exactly who it is coming from and whether or not you recognize or trust that news source. This is definitely an area where we are investing and learning and understanding and testing different options.
Damian: To come back to a couple of points Simon Milner raised, for the record, my complaint about the analysis Facebook did is that all Facebook had done, in looking at the Brexit referendum, was go back to the accounts that had already been identified as part of the U.S. Senate investigation and look to see whether those selfsame accounts had been active in the U.K. during the referendum, and nothing else.
Simon: You were very clear about your assessment, and the views of other parliamentarians were a significant factor in why we are doing another investigation.
Damian: You also mentioned that the American work was based on intelligence that Facebook was given.
Simon: I said there was an intelligence report about Russian interference when it comes to the U.S. election. There was an intelligence report produced by the U.S. authorities about attempted Russian interference in the U.S. election. We have not had a similar report about Russian interference in the Brexit vote.
Damian: As far as I am aware, it was not the intelligence services seeking to identify accounts that were
problematic. The accounts were identified in response to widespread pressure Facebook was facing. This wasn't based on a dossier of intelligence. When we first discussed this, I felt that Facebook should conduct its own research and not rely on people giving you intelligence. This is your system and your platform.
Simon: I was not suggesting that this had a bearing on our ability to look at the systems. I was explaining the wider context of the U.K. situation compared to the U.S. situation.
Damian: You were insinuating there was a lack of intelligence, and that the absence of intelligence meant the work had already been done. In America, it was pressure from Congress.
>> I am pleased that Facebook is at least prepared to initiate that research in the U.K. The earlier work didn't necessarily give the clearest view of what had happened. What problem led you to that decision?
Monika: When we looked at the advertisements in the wake of the U.S. election, one of the questions was: are we doing enough to identify when ads may be coming from bad actors? Another was: is our policy in the right place? We decided to tighten those
policies to make sure we're not allowing ads that inadvertently contain hate speech.
>> This was in response to evidence?
Monika: This was taking a broad look at our advertising system and asking: how do we prepare going forward? Not necessarily because of anything we saw in the related U.S. investigation. Another question is: is our review process holistic enough? We don't want one review to look at the content and another to look at how the ad is being targeted; we want to make sure we have one source of information, who is behind the ad and who it is targeting. Part of that requires investing more in our reviewers.
>> Mr. Milner mentioned, in response to an earlier question, that there was a lack of intelligence or evidence about the possibility that U.K. elections and referendums may be impacted by political advertising from sources in other countries that have mischief in mind. Presumably there was no intelligence either way; you just don't know. There was presumably no evidence either way.
Simon: When the Electoral Commission approached us to say they would like us to look into this, I said to
them: have you got any examples? Has any member of the public or party highlighted to you a page or an ad that was seen during that election that they thought was fishy, that was coming from somewhere they didn't recognize? Until we complete this investigation, we will not know. What we haven't had is information that has enabled us to home in on a particular page or particular phenomenon from another source. That doesn't mean we are not looking very thoroughly.
>> One of the things we know as parliamentarians in the U.K. is the effectiveness of your targeted advertising.
It clearly works, because you are encouraging us to use it most of the time. It enables me to target people over the age of 65 with an interest in fishing. It is an almost subliminal method of political advertising. How do you ensure that you are within electoral guidelines, particularly in relation to political ads?
Simon: We have a very good relationship with the Electoral Commission. We work with them on things like reminding people it is election day. We have a good relationship with them, and of course they are the responsible body.
We absolutely agree with you that there is an issue around the transparency of political advertising. Can you see what your opponent in your constituency is saying? That's one of the reasons we are now rolling out a system of transparency. In the next election you will be able to see every ad being run, by both the main campaign pages and all candidates. We are going to introduce a radical new level of transparency that has never been seen before. We recognize it as something that will be very valuable.
>> You would have seen the PM
saying that the online platforms are no longer simply hosting the opinions of others, when you look at the liability for illegal content on their sites; this reflects comments made elsewhere. Do you see that as the age of unregulated social media coming to an end?
Simon: Certainly we don't think of ourselves as unregulated social media. We are responsible for an incredibly detailed set of regulations. They are self-imposed, but I can give you an example of how they fit into the broader structure.
>> We are talking about regulation which is democratic and transparent.
Simon: All of our rules are public. It is accountable in that we let people know, when they complain about content, what decision we made and why, and if your content is removed, why we have done it. There isn't a single body globally that has said: these are the rules that are applied to Facebook from the outside. It's very hard to see how that would work. There are countries which have applied laws or rules which we take account of, and we make sure people do not break those rules on Facebook.
>> That strikes me as a preamble to opposing anything that may resemble state regulation. If legislation did come forward
to apply to your business, in the light of overwhelming evidence, would you cooperate fully with that, or would you resist?
Monika: We would certainly want to be part of the dialogue, because we do see legislation that results in unintended consequences that aren't good for anybody. I was a federal prosecutor for more than a decade before I came to Facebook. All of the criminal behavior and other things we try to find and remove from our service are very much in line with the incentives of policymakers.
We do have a process for complying with government regulations. The first thing we do is see if the content violates our community standards. If it doesn't, if it is content that violates a law but does not quite violate Facebook's standards, we would look at the legal process. We do have a system in effect. I would note that sometimes regulations can stay in place while there are broader societal concerns about the content we would be removing. It is something we would want to be a part of the dialogue on.
>> Isn't the clever part of Facebook that you get people to sign away their data and the rights to their data?
>> Not only do we not sell it...
>> I'm not saying you sell it. They don't necessarily know that you are harvesting data about people every single time they use Facebook.
>> We are very clear in our data use policy about how we use data. If you go to Facebook, you can go to "Download Your Information."
>> Whether they are playing a
game, the videos they watch...
>> The way that targeted advertising works, which we do allow, is that advertisers will say: I want to target this particular audience, for instance people who have liked their page. We will provide an audience; we won't provide data, but we will provide an audience.
>> Isn't this a massive surveillance operation?
>> No, this is a system where
people can come in and communicate with one another. Advertisers can target those who have engaged on the site.
>> A headline says: "How to win 2 billion friends and destroy civilization." This is by a well-known journalist. He is suggesting that it is this data position: you hold so much data that you are now very powerful.
>> We are definitely regulated in many different ways. We are regulated by data protection law; everybody is covered by data protection law.
We are absolutely accountable to it. It is wrong to suggest otherwise about how we handle the data.
>> Take a news outlet: it has massive hits, 2 million viewers. They are going to get loads and loads of hits. You are sharing the data of who was looking at this. Will you share that data with them?
>> Unlike other services, we do not place ads in this content.
>> That's not what they told me.
>> If you are an advertiser, you
create a target audience, and then people who meet that audience will see an advertisement in their news feed.
>> To be clear, the ads appear around the content. There is advertising adjacent to it; that is the point we are making.
>> We don't run ads on pages. In the news feed, if somebody is in the audience for an advertiser, meeting the criteria, then they may see an ad.
>> They can put some of your adverts in the middle of their videos.
>> We are not aware of that specifically.
>> Sometimes these videos are taken by other users and pasted on and changed and altered.
>> We have policies against anybody infringing on others' intellectual property rights, and we have systems in place to detect that. And certainly we have a notice-and-takedown procedure.
>> There was a rather unpleasant incident of child pornography getting onto the network through the Messenger service, and it was spread far and wide. How did that get through your systems?
>> The way we deal with any child sexual abuse imagery is that we have systems in place to automatically identify such content. The systems work on matching
known images. If we become aware of an image of child exploitation, we reduce that to a digital fingerprint and detect it across Facebook. We don't have the means to stop new images of child exploitation that we have not seen before from being uploaded. As soon as we become aware of such an image, whether it is reported to us or to law enforcement, we will take steps to stop the dissemination of it.
>> How did that happen?
Simon: I can talk about that, because I spoke with the editor of Channel 4 News.
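The "digital fingerprint" matching described here can be sketched in miniature. A hedge is needed: production systems use perceptual hashes (in the style of PhotoDNA) that survive resizing and re-encoding, whereas the plain SHA-256 below is only a stand-in; all data in the example is made up. It does, however, show the stated limitation that brand-new images are not caught.

```python
# Illustrative sketch only: block uploads whose fingerprint matches a
# known banned image. SHA-256 stands in for the perceptual hashing that
# real systems use; the image bytes here are made up.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

KNOWN_BANNED = {fingerprint(b"previously-identified-image")}

def block_upload(image_bytes: bytes) -> bool:
    """True if the upload matches a known banned fingerprint."""
    return fingerprint(image_bytes) in KNOWN_BANNED

print(block_upload(b"previously-identified-image"))  # True
print(block_upload(b"never-seen-before-image"))      # False: new images slip through
```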
This is something where we ought to use this opportunity: if you ever come across a piece of content like this, do not share it; report it. They will make sure it gets taken off the internet. Unfortunately, a lot of people shared it to condemn it. Every person who did that broke the law in the U.K. by doing so. We were made aware of it, and once they told us which inboxes it was in, we removed it and prevented thousands of shares of that horrible material.
>> Does that indicate how powerful you have become and what a Pandora's box you have opened up? Does it perhaps demonstrate that it's time for regulation, time for rules to set out responsibility?
4:34 pm
>> i would encourage you to ask the iwf about this. they have a much more significant concern about that aspect of the internet. despite facebook's size, these kinds of instances are incredibly unusual. >> i wanted to know, how long do you hold people's data? >> as long as they want us to. we have held his data for as long as he wants us to. if he wants to look at data from when he first joined, he can.
4:35 pm
we will remove that content. >> he was a child when he signed up. how long do you hold data on children? >> it depends on whether they want us to. we are holding data for them that is precious. it may be photos of family moments and important moments in their lives. it is for them to decide if and when that data is removed from facebook. >> you are custodians of that data? >> for as long as they want, as long as they have a facebook account. >> we look after it. >> we would delete that in accordance with --
4:36 pm
>> you said something interesting concerning election law. in the 2015 and 2017 elections and the 2016 referendum, facebook was recognized as extremely important. do you agree it was not possible to stop this if another candidate's purchase of facebook advertising was bought from outside the u.k.? >> i don't understand what you mean. >> my opponent can purchase advertising from facebook. >> it is unlawful for someone to pay for advertising from outside the u.k.
4:37 pm
all the information has to be recorded within my particular district. it is not possible for me as a candidate to check where that advertising is bought from, or whether it actually was a man or a woman. >> they have imprints on them. >> i am not sure it is any different from any other form of advertising. >> i don't understand what the question is. >> can you assure me that foreign donors do not pay for campaign advertising purchased in britain today? >> i can't assure you of that. >> do you have that information on facebook? >> no.
4:38 pm
let me try to think about this scenario. if somebody is buying ads, we can see the account that has paid for the ads. you want to know where the money has come from to go into that account. we will have information from the people who paid for those ads. >> you are aware of that, so do you prevent that from happening? >> we don't at the moment. this is a matter for the electoral commission. they have to ensure campaigns comply with the law. >> it's a matter for you, because you are not complying with the law either. >> i have never heard of that analysis before.
4:39 pm
>> the problem is, you have everything, you have all of the information. we have none of it. >> we are moving forward with new forms of political advertising transparency. >> there is a problem. it would be wrong to suggest that suddenly we have determined that the problem is on facebook; there is a wider issue of public policy, in which the electoral commission is interested and which parliament would look at: whether or not british election law, particularly around the transparency of spending, needs to be modernized for a different era of political advertising. rather than wait for that, we will do what we can to provide greater
4:40 pm
transparency, not because of particular examples where somebody was using foreign money to pay. >> i'm pleased you recognized it is a problem, and i welcome the assurances you gave. how will we check that the rules you are announcing are being complied with? >> you will be able to look at the pages of your opponents and see what ads they are running. you will be able to ask questions of them. the electoral commission has the power to require that person to provide information, and to fine people for illegal election spending. they have those powers.
4:41 pm
we will help you, but it won't necessarily resolve everything you are concerned about. >> what about advertising from third parties? do you think i would then have access to sufficient information to check that? >> we are hopeful that with our new transparency around political advertising, you will have much more information than ever before about the nature of the advertising that has been deployed during an election. >> it is extraordinary. if facebook were a bank and someone was laundering money through that bank, you could not claim to be merely the platform.
4:42 pm
even when you know money is sourced outside the country, the systems aren't picking that up at all. this change in policy is a consequence of the policy requiring disclosure about who is placing political advertising. >> that is a bill. we don't want to wait to see what governments are doing here. >> it may have been helpful if he put the context in place as well. it came rather as a consequence of a very public -- >> this is something we
4:43 pm
undertook voluntarily. we took a very hard look at how our advertising system works. that's what i described in response: making sure we are doing a better job in reviewing advertisements. we are rolling out those initiatives. >> not because you think it is a good idea? >> this is something we undertook voluntarily after the election. we tried to identify where we could do better. we do that across all of our policies all of the time, any time we see a type of conduct that is a new type of behavior that we didn't previously have a policy in place for. we are constantly seeing how we can improve.
4:44 pm
>> it is quite clear, if the system isn't picking up payments from abroad, people can place ads in others' names. the question is not whether we know it's going on; the question is that it could be. >> we have not seen evidence of that during the brexit vote, despite investigative journalism that has looked at this issue, and there are lots of campaigns. >> you haven't looked. >> just to move on, given the controversy that has surrounded this issue in the last couple of years, has facebook ever done an analysis of how profitable fake news or the deliberate dissemination of disinformation has been to the company? has it been very profitable, or has it been hugely profitable?
4:45 pm
>> we are looking at research and that is ongoing. i haven't looked at the financial aspect of it. we have shown a willingness to remove false accounts. if something violates our policies, why would we take money from it? >> i had wondered whether there had been any analysis of just how profitable -- >> i can't answer that. we look to understand how false news manifests itself. i can't say we have looked at how much money we have gained. >> would you expect us to believe that the dissemination of fake news and false information has been a financial drain on the company?
4:46 pm
your reactions and policies would have been the same as they are? >> we do think that because people want to come to facebook for a safe environment, it is against our financial interest to have that content on our site. we think it is bad for our community. long term, people don't want to be in a place where they think it is not safe or where they cannot connect with reliable information. it comes down to the trust our community has in us. >> in the absence of analysis, i would guess that the propagation of fake news has been hugely profitable to facebook. the more sensational the story, the more people are driven towards the advertising that surrounds it.
4:47 pm
if you haven't crossed that line already, how far away do you think you are from crossing that line? and how will you know, by your own measure, when you have crossed that line? >> i'm very happy to speak to that. my job at facebook is to manage policies. we draw a line for every bad behavior. there are many different lines we have to draw. if somebody crosses a line, even if it is a facebook account with many thousands of followers, we remove it. if people cross that line, their content comes down. >> you make it sound as if
4:48 pm
facebook is the biggest player in the dissemination of information. what i was asking about was your role in helping regulate that. are you happy with the regulatory framework as it currently exists, or where do you see that debate going, to the benefit not just of facebook but of society? >> because of the breadth of content that is called fake news
4:49 pm
, there are things we would not call news at all. there are things where we should not be in the business of deciding whether they are fake or true. our community wouldn't want a private company to be the arbiter of truth. there are other types of content that we know come from fake accounts: bad actors trying to send people to ad farms, whose incentives are very clearly aligned. in terms of where we are on regulation, we want to be part of that dialogue. one thing i see when we talk to policymakers is that sometimes a solution may seem great in theory, and then when we talk about the practical ramifications of it, we and the policymakers will
4:50 pm
say there are some unintended consequences. >> there was a government committee. you have been accused of actively working around the regulatory framework, and particularly electoral law. why do you stand accused of that? >> we have reached out to the electoral commission since the u.s. election in 2016 to say, here are some of the things we are going to do to respond to the threat of false news. that's an important part of getting this right. >> thank you.
4:51 pm
you are in fact the largest disseminator and publisher of news in the world. you make publishing decisions every day. you designed the algorithms, and the algorithms decide what we see on facebook every day, and they inherit the biases of the people who developed them. i want to ask you, how many developers do you employ to design this algorithm? >> i don't know the number of developers. we have a team that focuses on those algorithms. the factors include the recency of a piece of content, how new it is, and whom you tend to interact with.
4:52 pm
we recently came out with an announcement that we are prioritizing content from friends and family. we know this is going to cause people to spend less time on facebook. those are the sorts of things that go into the news feed algorithm. >> you don't know how many developers facebook deploys? >> i can speak broadly. we have many engineers in the company, and some of them work on the team that refines that algorithm. >> would you tell us privately how many are working on that? >> we are happy to follow up on that. >> your algorithms enable hyper-
4:53 pm
personalization, and your advertising helps to do that. most individuals don't even realize you are doing it. they don't realize what comes up in their facebook timeline is what you are targeting towards them. there's a huge power imbalance there, and the person receiving it doesn't have any control over it. it reminds me of an abusive relationship where there's coercive control going on. can you see the parallels? can you see why i would be concerned about that? >> people have a lot of control over their news feed, but people don't always understand exactly how much control they have. we have tried to make that more visible for people.
4:54 pm
the reason we have a news feed algorithm is, if you come to facebook, you have a bunch of different friends and pages you interact with. we try to prioritize based on the factors i mentioned, like relevance. you can go into your news feed preferences and say, i want to see the content in reverse chronological order. we try to make the news feed algorithm something that is providing the information that you want to see. >> how do people get to know they can do that? you say there are mechanisms by which they can stop that from happening. the reality is they will take
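The ranking described here — prioritizing posts by recency and relevance, with a user preference to fall back to reverse chronological order — might be sketched as a toy model like this. The scoring weights, field names, and fixed clock are invented for illustration and bear no relation to Facebook's actual ranking system.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created: datetime
    affinity: float  # how often the viewer interacts with this author, 0..1

def rank_feed(posts, reverse_chronological=False):
    """Order a news feed. With reverse_chronological=True, mimic the
    'most recent first' preference the witness describes; otherwise
    blend recency with relevance (the 50/50 weights are made up)."""
    if reverse_chronological:
        return sorted(posts, key=lambda p: p.created, reverse=True)
    now = datetime(2018, 2, 10)  # fixed clock so the sketch is deterministic
    def score(p):
        age_hours = (now - p.created).total_seconds() / 3600
        recency = 1 / (1 + age_hours)  # newer posts score higher
        return 0.5 * recency + 0.5 * p.affinity
    return sorted(posts, key=score, reverse=True)
```

The point of the two code paths is the one made in the testimony: the ranked feed is a default that blends signals, while the reverse-chronological preference is an explicit user override.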
4:55 pm
some information from the site. you have the ability to control who sees what you post. all of the settings are included within that, really designed to give people control over their experience. we try to make that visible. >> do you know what percentage of your developers are women? it seems there is a heavy male bias. only 35% of facebook's total employees are women.
4:56 pm
i talked about inherent biases. if you have male developers writing algorithms, there is going to be a bias in those algorithms. does that worry you? >> we are certainly concerned about any type of bias, gender bias or other forms of bias, that could affect the way things work in our company: things like making decisions and enforcing our policies. we want our workforce to be diverse and reflect the community. we have initiatives to try to develop talent in underrepresented communities.
4:57 pm
>> when did the initiative start? did you set yourselves any targets? >> we are improving, but we have work to do. we, along with others, are confronting this issue. >> but have you set yourselves targets? >> we have more information we could provide. this is important to the company. this is another way we are tackling the issue: the recognition that everybody has these biases in their mind. finally, when it comes to the development of our algorithms, we do
4:58 pm
have checks in place. with the way we enforce our policies, we try to make sure those policies are sufficiently granular so we don't leave room for anybody to interpret them one way or another. is that to say we are perfect on this? no. >> a question on another issue. you recognize there's a lot of work to be done, and you talk about prioritization. why not work with a commission so they can independently research? >> we are working across industry. we are now doing that as well when we think about tackling those issues and elections.
4:59 pm
that is something we are doing in the u.k. we will continue to find ways of partnering with these organizations, consistent with our terms and the laws, but it is something we do. >> i would like to ask you about facebook's relationship with cambridge analytica. >> right now we have some colleagues meeting with the ico, who are undertaking an inquiry. i'm afraid i'm not the expert on that, but i would be happy to follow up. >> let's see if you are able to
5:00 pm
answer. have you ever passed any user information to cambridge analytica or any of its associated companies? >> no. >> but they do hold a large chunk of facebook's user data, don't they? >> no. it may be data about people who are on facebook, but it is not data we have provided. >> how do they gather that data? >> it can be all kinds of things that these organizations do. we have no insight on that. >> you may well do in the future, seeing how the inquiry progresses. is it the case that third-party developers, or whatever we might call them, could ask for a facebook user's data and collect data on facebook and bank it? >> that is part of the platform
5:01 pm
policies we have. >> is it not the case that when i agree to give my data, it takes my friends' data as well? >> we have platform policies that govern how these applications can use facebook data. the way it works is they have to tell each facebook user who is going to use their app: here is the data we are requesting. an example is, we are requesting your hometown and your email address. they have to give you the possibility to opt out of any non-necessary data. you see the elements of data they require to run their app and which ones they do not, and you make those choices.
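The permission flow described in this exchange — an app declares which data fields it needs, the user can opt out of the non-necessary ones, and only the remaining fields are shared — could be sketched like this. The field names and API shape are illustrative assumptions, not Facebook's actual Graph API.

```python
# Hypothetical app data request: the app declares required and
# optional fields, the user opts out of any optional ones, and
# only the remaining fields are shared.

def grant_app_data(profile: dict, required: set, optional: set,
                   opted_out: set) -> dict:
    """Return only the fields the user has agreed to share.
    Required fields (needed to run the app) are always included;
    optional fields are dropped if the user opted out of them."""
    allowed = required | (optional - opted_out)
    return {field: profile[field] for field in allowed if field in profile}

profile = {"hometown": "London", "email": "user@example.com",
           "friends": ["alice", "bob"], "name": "Sam"}
shared = grant_app_data(profile,
                        required={"name"},
                        optional={"hometown", "email"},
                        opted_out={"email"})
# 'friends' was never requested, and 'email' was opted out of,
# so neither is shared with the app.
```

The design choice mirrored here is the one Bickert describes: the gate sits at request time, with the distinction between required and optional fields made visible to the user before any data changes hands.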
5:02 pm
you do not take data that is your or your friends' personal data with you. once you have data that the user has agreed to give you, you have certain responsibilities under our policies. if you were to give that data to some third party or engage in any of that activity, that would violate our policy, and if we found out about that we would enforce upon it. >> how might you find out about it? monika: we can follow up on this privately, but there are things we do to discover that behavior, and if somebody raises the flag we would investigate. >> when was the last time an incident like that happened? monika: i do not have an answer. >> how often does an incident like that happen? monika: that is not common behavior. we are transparent with developers about what the expectations are.
5:03 pm
we expect them to comply and take steps to ensure that they do comply. >> has facebook ever provided advisors to assist with political campaigns in the u.k., putting your advisors into campaigns to advise on micro-targeting? simon: in the u.k. we have two teams. one team is called our politics and government team, which sits with public policy and advises policymakers on how to use our products. that team is focused on products which are free: how to set up a page, how to handle your inbox, how to manage your affairs. they also provide guidance on how to keep safe on facebook and how to deal with abusive behavior, which i know is a very important concern to all of us. there is a separate team who are involved in selling advertising, and they obviously operate on a
5:04 pm
demand basis. people want to buy advertising, and they want to do targeting during elections, and those teams will work with those campaigns, but they're not embedded in campaigns. >> during the referendum campaign, did you provide that kind of advertising service to the two principal campaigns? simon: we did, to both. >> and you did so as well with the scottish referendum? simon: yes. >> you had a success story on your site recently and replaced that story with one about a less controversial campaign in finland. simon: i'm aware of this issue, and there has been a news story written about it as if it is big news. those kinds of case studies we are quite often refreshing,
5:05 pm
and this is a genuine case of someone making a mountain out of a molehill. we are very proud of the work that our teams do to help campaigns that want to make use of our products and to reach people with messages. we think that is a fundamental part of how democracy works. >> let's look at the advertisements you provided. if we look at the general election campaign of 2017, would you be able to provide all the advertisements, or would you be able to identify them and where necessary provide them, all the advertisements used to influence the course and nature of the campaign, including the ads that were specifically targeted? simon: we are moving towards a system which will enable that. we are hoping that as part of it we will be able to provide not just what ads are being shown
5:06 pm
right now but also an archive of political advertising. i'm not able to tell you how far back that will go. we think what is particularly important is that we focus on moments of democracy which are happening now. we cannot revisit previous moments of democracy and run them again, but we can focus our efforts on upcoming elections, upcoming referendums, and help people understand what is going on. >> you have identified that there has been a problem there? simon: i think there is an understandable concern that campaigns and candidates have that they cannot respond to or see the ads that their opponents are running, and that is why we are introducing new transparency to enable that. >> let's go back to the referendum. do you hold information on how much money was spent on facebook during the eu referendum campaign by cambridge analytica
5:07 pm
or its subsidiaries? simon: i am sure we will have information with respect to the campaigns and how much the different campaigns spent. all of those campaigns have provided that information to the electoral commission, and that information is accepted. the electoral commission will be producing a report quite soon. >> my understanding is that that is not quite the case. it was channeled through parties in northern ireland, where the rules on political reporting are different, and that information is not forthcoming. you might be able to provide it. simon: we are cooperating with the electoral commission and with the ico, who are looking separately at these issues. we are helping them as best we can. these are matters for the electoral commission rather than
5:08 pm
for us. >> i will refer you back to the earlier statements on that. do you hold information regarding the content of online advertisements referring to the eu referendum, and who paid for them? can you identify those? simon: that is one of the things we're doing with respect to this investigation. we are particularly focused on the issue of whether there was russian-backed advertising directed around the issues during the referendum campaign. >> dr. martin moore of kings college london suggested that in the united states presidential election in 2016, there would have had to be around 50,000 bot
5:09 pm
accounts in order to send out that volume of postings. would that be a figure that either of you would concur with? monika: i cannot confirm that number. >> one final question. has facebook ever been successfully hacked? monika: we have seen individual accounts that have been compromised, and that is usually because somebody gives away their password. >> has facebook centrally ever been compromised? monika: not to my knowledge. >> would it be possible to hack and alter the underlying algorithms that are used to manage facebook? monika: we have a dedicated security team that works to stop any unauthorized access to facebook.
5:10 pm
they are identifying the best way to do that every day. simon: we can send you a copy of the report that was published by our security team in april last year on this very issue, something called information operations, which are attempts to hack systems. >> a couple of quick questions. as of the end of last year, you had 2.2 billion monthly users. i was probably one of them, because i had to check up on someone who had said a few things about me. i must say i have cut down on facebook because, on the brexit issue, i was getting inundated by people at various points on the lunacy scale from the abusive to the deranged, which i can cope with because i can just
5:11 pm
not read them. i have a thick skin, but what i cannot cope with is the people who do read them, who are friends of mine, who then say, do you realize what such and such a person is saying about you? at some point you have to stop. if i have to take that view, i wonder what effect facebook's scale of success has on the mental health of children, quite frankly. of the 2.2 billion users, just to inform your efforts, what is the guess within the organization as to the percentage of those that are fake and non-genuine accounts? monika: i believe our estimates take that into account. we adjust for accounts that we think may be fake that we have not yet identified. we can certainly follow up on that. i do want to address --
5:12 pm
>> what is the scale of the adjustment? monika: i do not know. >> could you follow up? monika: absolutely. i want to address what we do around hate speech. it is a real area of concern for policymakers. here is our approach. for private individuals, we do not allow bullying of any sort. we would remove any such content we become aware of. for public individuals, and these would include elected officials, we would remove any sort of hate speech or direct threats. we do allow robust discussion and debate about public figures, and we recognize exactly what you mentioned, that sometimes people will be saying things that are off-topic and irrelevant. we provide people the controls so that if it is on your page, you can control those sorts of
5:13 pm
comments and make sure you can use your page the way you want to use it. >> i am aware of that. it is just simpler for me not to use it. the second statistical question: of the total number of posts, and i have not got a figure for that, again to inform your internal efforts, what percentage of those do you estimate are made by bot accounts? monika: i do not have an answer. simon: i would say almost none. that is not really an issue on facebook. >> a final question. i see that the u.k. is number 10 in terms of your users, and ahead of the united states is india. could i ask what efforts you make in your biggest user market
5:14 pm
to make sure your platform in the world's biggest democracy is not used for electoral manipulation or to foment social unrest? monika: just as we do on safety issues, we take a global approach. i spoke about removing fake accounts; that is something we are doing around the world. outreach with industry and with government commissions is something we do around the world. something we have not talked about is what we are doing to prioritize good news and reliable journalism. part of what we are doing is the facebook journalism project. we have worked with more than 2,600 publishers to identify the ways that reliable news can best succeed on facebook. this includes product fixes, making it easier for news media organizations to attract subscribers or to have
5:15 pm
advertisements that work within their content. >> perhaps we can follow up about your regional approaches in different countries? >> we have a session later on with some of the news media organizations. i have a couple of clarification questions i want to ask before we finish, going back to the discussion on facebook developers. you said that it was not true that the developers had facebook user data, but they may have data about people on facebook. >> i was initially assuming the question was, has facebook provided data. we do not provide data to anyone without your permission. for a developer, the system that was being discussed does allow people to decide, i am prepared to let them have some of my data in order to get the service. >> i just wanted to be clear
5:16 pm
what you meant. facebook developers gather data about facebook users. if that facebook user then tries to leave facebook, do the developers keep the data they have gathered? monika: no. under our policy they have to delete the data. you do not have to leave facebook to make that decision. if you have interacted with an app on facebook and you decide you do not want to do that anymore, you can go into your settings on facebook and turn that off. >> the issue of dark ads came up. with these new changes of policy you have put in place, where you can see who the advertiser is, does that include dark ads? monika: any time you see an ad
5:17 pm
you will be able to see what the page is behind the ad and all of the other ads they are running. >> let's say you want to see what ads a particular page has run; you will be able to see those ads? this is a u.k.-specific story, but i would be interested in terms of facebook policy as a whole. there was an investigation where they bought advertising on facebook for the 20-to-34-year-old audience, and they were told the reach of that audience was 17 million people, but there are only 12 million people in that age group in the country, so there was a disparity. does the audience that facebook is selling not tally with the audience in the country of people who are actually there? monika: i do not have an answer for you right now. i want to make sure we get to the bottom of that, and we will follow up with you on that. >> it is quite a big discrepancy on a number like that.
5:18 pm
many people in the advertising industry would say that if that discrepancy is real, it is basically fraud, it is mis-selling of an audience to an advertiser, when a lot of that number could be fake accounts or people who have been wrongly ascribed into that category. monika: we want to make sure we are being honest in giving the right numbers, so i will follow up with you on what happened there. >> thank you very much. >> that same committee also heard from academic and media researchers who discussed the spread of fake news and its impact. this is just under one hour.

