
tv   Cato Institute 2018 Surveillance Conference Part 2  C-SPAN  December 18, 2018 7:24am-9:03am EST

7:24 am
meaningful and substantive way and i think having spent time at the agency, i am a cia whistleblower, there is no love lost between myself and my former employer, but unlike most critics i'm happy to admit when i think they are right, and on the question of whether the russians intervened in the election i have complete confidence they are right. yes? >> i retired from the department of defense and you brought up a comment about moving documents from donald trump's desk. if we could define a shell of the president's appointees with other doubts, do they have a function to act quickly when everyone else does not have time to act?
7:25 am
is there any connection between the deep state described and the people who seek to control the president? >> i will say, having worked for a member of congress for over a decade, the interesting thing about staff is they do have the ability, depending on the member's personality and characteristics, and this applies to presidents too, to shape the agenda and essentially shape what is in front of the president. the bureaucracy has the ability to do this too. every time somebody is making a decision about what goes into the presidential daily brief, which i hear he doesn't read that much, they are making a judgment call ultimately about what they think is important to the president or what the president
7:26 am
ought to know. does that qualify as deep state activity? i don't think so. it is a qualitative judgment about what is important and whether something is a threat, but there's no question staff absolutely help shape policy and shape the decisions and choices that elected and appointed officials make. that is one of the reasons you do have to be concerned about personnel and about who holds a particular position. and we have time, barely, for one more. the gentleman on the wing over here with glasses, in what looks like a navy jacket and blue shirt. name and affiliation? >> yes, thank you. i am steve dewey, federal government retiree. the question i have: you talked
7:27 am
about how the bureaucrats in the federal government, civil servants, don't really affect public policy. in my experience, that is not true at all. i have seen people in senior management positions who do affect public policy. the question i have: there is public information out there, you can find it on federal election commission websites. if you look at political donations, it reveals that over 95% of justice department employees donated to hillary clinton, and the same with the state department and many other federal agencies. >> your question is?
7:28 am
>> do the overwhelming donations of federal employees to a particular political candidate influence their execution of public policy? >> of course they affect public policy, they deeply affect public policy; the question is whether they are able to disentangle their personal bias. that is something to be careful of with fbi agents and political donations, with federal employees who have political views, and we shouldn't pretend they don't have them. we should acknowledge that they are able to put those views to the side and do their job with integrity as the law requires. i would frame it as a slightly different presentation. >> we should be clear the data
7:29 am
you are talking about represents those employees in the department of justice or any other federal agency who elected to make campaign contributions; that is not necessarily an accurate reflection of the overall political worldview of the department of justice. and we are going to conclude the panel and take a 15 minute break, and give my friends a shout out. [applause] [inaudible conversations]
7:30 am
>> welcome back, thank you so much. as i mentioned in my introductory remarks, there are so many fascinating issues surrounding surveillance and new technologies that if we were to cover them all, this would be a conference lasting approximately three weeks, and even i lack the capacity to focus on these issues for that long. so for the last couple of years we have been inviting scholars and activists to present flash talks that focus tightly on a single subject and present a deeper analysis, to give a sense of the
7:31 am
range of hard questions for citizens and policymakers. our morning flash talks cover issues from facial recognition to social media surveillance to the global war on encryption on various fronts. i will introduce the speakers briefly; if you want fuller biographies, look to the conference website, where you will find in the agenda links from the speakers' names to more extensive biographies. we begin with an analysis of recently passed legislation in australia that seeks to mandate law enforcement access to encrypted software and encrypted messaging tools, the first of its kind, which could be a model for regulation elsewhere. i want to invite sharon bradford franklin from new america.
7:32 am
>> thank you. i am with new america's open technology institute. if you had told me a year ago that i would be here today talking to you about australia, i would have thought you were joking, but i'm glad to have the opportunity to speak to you today about the law passed earlier this month in australia and how it could allow the united states to look down under for an encryption back door. let me get the clicker working. there we go. for those of you familiar with the long-standing encryption debate, it pits security
7:33 am
against security. the us justice department argues it is, quote, going dark due to the increasing use of encryption. it complains it can no longer access electronic communications, even with a valid court order. many tech companies have deployed encryption by default in their products and services, so they simply do not have access to users' encrypted communications. the justice department and fbi want to require that tech companies guarantee the government exceptional access, through what they have now started calling the use of so-called responsible encryption, so they are able to access encrypted messages. otherwise, they say, they are hampered in their ability to keep americans safe from terrorists and other criminals. but tech companies and privacy advocates have pointed out this amounts to an encryption back
7:34 am
door that could be exploited by others. there is no way to guarantee that only the us government would be able to use any such mechanism; rather, this amounts to deliberately building vulnerabilities into products and services, undermining device security for all. it would harm everyone's privacy and cybersecurity and create new threats of which we will all collectively be victims. in addition, as we explored in a forum oti hosted last month, encryption protects economic security, personal security, and the freedom of journalists and of individuals in vulnerable communities, including victims of domestic violence. this debate, which has been going on for years in the united states, has gone global, heating up quickly down under in australia. this past august the australian
7:35 am
government released what it called an exposure draft of the telecommunications and other legislation amendment, or assistance and access, bill 2018. unlike the u.s. congress, which takes months and months or more likely years to pass anything, the australian parliament managed to wrap up its consideration of the bill in a matter of four months. following a public comment period on the exposure draft, a slightly modified version of the bill was introduced in parliament and referred to the parliamentary joint committee on intelligence and security, or pjcis, which opened a new public comment period. the open technology institute organized an international coalition of organizations, tech companies and trade associations and filed three rounds of public comments on the bill outlining the concerns i will describe in a moment. the pjcis held a series of hearings, and at the beginning of last week it issued a report recommending passage of the
7:36 am
bill with certain amendments included. early in the morning just last thursday, the parliament released an updated version of the bill including 173 amendments no one had ever seen before, and by the end of the day the australian parliament passed the bill into law. so what does the australian law actually do? as one commentator put it, quote, thanks to the combined stupidity and cowardice of the coalition and labor, any it product, hardware or software, made in australia is automatically too risky to use for anyone concerned about cybersecurity. we are focusing on schedule one of the australian law, which is the part that undermines the safeguards of encryption. there are also other sections of the law that create additional privacy concerns, such as powers for government hacking, but we are
7:37 am
focusing on schedule one, which relates to encryption. the law includes what appears to be an encouraging statement that purports to prohibit the government from demanding the creation of encryption backdoors. i have on the slide here section 317zg, which says the government may not request or require communications providers, quote, to implement or build a systemic weakness or systemic vulnerability, and also must not prevent a communications provider from rectifying a systemic weakness or vulnerability. however, the law grants new authority to the australian government that undermines this. specifically, the law creates three new and powerful tools for the australian government: technical assistance requests, technical assistance notices and technical capability notices.
7:38 am
the requests are supposed to be voluntary, whereas the notices are mandatory, and the differences depend on which government officials are authorized to issue the notices. all of these authorize the australian government to request or demand any, quote, listed thing. that is a long list in the bill, and it includes things like removing one or more forms of electronic protection that are or were applied by, or on behalf of, the provider, and also includes modifying, or facilitating the modification of, any of the characteristics of a service provided by the designated communications provider. in short, these are powers to force tech companies to weaken the security
7:39 am
features of their products. for example, the australian government can now make the same demand of apple that the fbi made in the 2015 san bernardino shooter case: that it build a new operating system to circumvent iphone security features. apple explained that building the requested software tool would've made the technique widely available, thereby threatening the cybersecurity of others. as we know, in the lawsuit in the us, the united states government argued that under the obscure all writs act, which dates to 1789, it was permitted to make that demand; apple, supported by other tech companies and privacy advocates, argued the demand was unconstitutional. the justice department withdrew the demand before the court could resolve the legal questions because the fbi was able to pay an outside vendor to hack into the phone, but in australia they now have specific authority to make these demands. another worrisome scenario is that
7:40 am
australia may seek to use this authority in the same way the united kingdom is looking to use its powers. last month ian levy and crispin robinson of gchq, the uk's counterpart to the nsa, put out a proposal under which tech companies would be asked or required to add a silent participant to end-to-end encrypted chats, and the tech company would suppress the notification to the user. they argue that, quote, you don't even have to touch the encryption to add a ghost user to an encrypted chat. there are several other threats posed by the new australian law's approach to encryption. in our coalition comments, in addition to explaining the breadth of the new powers created by the law, we addressed three other key concerns.
7:41 am
first, the law lacks any requirement for independent review or adequate oversight. many features of the new law, such as the authorization for technical capability notices, were modeled on the uk's investigatory powers act, passed in 2016, which raises its own threats to digital security and human rights. but section 254 of the uk act does require judicial commissioners to review and approve proposed technical capability notices before they may be issued. although we still have questions about the adequacy and independence of the review under the uk law, australia's authority poses a greater threat to cybersecurity and individual rights because there is no provision requiring any type of prior independent review.
7:42 am
in addition, australia has no bill of rights. although there are procedures through which tech companies may challenge government requests, these challenges will be more difficult: tech companies do not have the same legal arguments available to them, based on protecting individual rights, as they would in countries like the uk and the us. second, the law enforces secrecy. although it requires statistical transparency reporting by the government and permits statistical transparency reporting by tech companies, it also includes a strict nondisclosure requirement whenever the government issues a request or notice to a tech company. violation of the secrecy rules is a criminal offense punishable by five years in prison, and there are no limits on the duration of these gag orders, such as we have here in the us, ending them when the reason for confidentiality no longer exists. third, the law's definition of covered, quote, designated communications providers is overbroad and includes anyone who provides an
7:43 am
electronic service that has one or more users in australia. this means any tech company doing business in australia, or anyone providing electronic services in australia, is subject to government demands that they weaken the security features of their products and services. this is bad for australia, but what does it mean for us in the united states? australia's legislation appears to be part of a coordinated effort by the five eyes. for those who may not be familiar with that term, the five eyes is an intelligence alliance comprised of australia, canada, new zealand, the united kingdom and the united states that dates back to world war ii. since 2013 these five nations have formed a five country ministerial, which is an annual convening on strategy and information sharing on law enforcement and national security. for the past two years they have focused on strategies and policies to weaken encryption. just this past august, august 2018, the five countries
7:44 am
released a statement of principles on access to evidence and encryption, and that statement includes a threat: if these governments continue to, quote, encounter impediments in their efforts to access communications, they may pursue legislative mandates for encryption backdoors. the very same month that statement came out, australia released the exposure draft of its encryption bill. so now australian law provides the united states and other governments a back door to an encryption back door. australia now has the authority to tell providers to create encryption backdoors, and once providers are forced to build weaknesses into their products, other governments can exploit those weaknesses. i mentioned the example of apple versus the fbi. if australia issued a technical capability notice to compel apple to build a new operating system to circumvent iphone security features, which is what
7:45 am
the fbi demanded, and apple built that system, apple could no longer argue it lacked the capacity to turn over data to the us government in similar cases. similarly, if australia forced facebook to reengineer whatsapp's encrypted chats to be accessible in response to australian legal demands, those chats would be vulnerable to other governments. finally, there is a risk the us government will seek to expand its own powers by pointing to australia as the new model for, quote, responsible encryption legislation. so whether as a pathway or as a model, the australian law creates risks to cybersecurity and privacy that extend well beyond australia's borders. thank you. [applause] >> thanks so much, sharon.
7:46 am
next up: the french philosopher michel foucault is known for his analysis of the relationship between surveillance and training, or discipline. his book, usually translated into english as discipline and punish, is in french surveiller et punir, which could be rendered as to surveil and to punish. very naturally, close monitoring is always a key part of training and indoctrination. children are often closely watched as we are teaching them, perhaps an inevitable part of raising children safely, but it also means we need to worry about whether we are training them to be compliant with surveillance.
7:47 am
as the technological capability to monitor children ever more closely becomes a reality and comes into use, i often wonder whether we are preparing children to accept as normal a world in which everything they do is closely scrutinized. to look at one aspect of this, social media surveillance, here is rachel levinson-waldman. >> thank you so much. that is the perfect introduction, and i will be coming back to that point near the end of my presentation. my name is rachel levinson-waldman, senior counsel with the liberty and national security program at the brennan center for justice. what i'm talking about today is social media surveillance of k-12 students. to start this off i want to talk for a moment about the prevalence, the deep saturation
7:48 am
of kids' lives online. according to a pew internet study from last month, 97%, 97%, of 13-17-year-olds use at least one major online social media platform. 95% have access to a smartphone, and many report being online almost constantly. that is a lot of time teens and younger kids are spending online, and with that social media presence comes social media monitoring. preventing bullying, preventing school shootings, and catching signs of online distress have become big business.
7:49 am
looking at contracts with nine major social media monitoring companies, there are spikes, mountains and valleys, but overall a pretty massive increase starting in 2010, rising to a spike in 2016 and a big spike in the summer of 2018, potentially driven by the shooting in parkland, florida. public school districts are spending more and more money on automated social media monitoring. this is similarly reflected by keyword searches for social media monitoring in contracts between public schools and private companies, which show these spikes over the last several years: significant increases and a major spike in 2018. so increasingly, a lot of public money is being spent on these services. based on these statistics, you
7:50 am
might think schools are getting more dangerous, but in fact the opposite is true. schools are getting safer. it is true this country faces a unique risk of school shootings among developed countries, and obviously a single shooting or even a single bullying incident is one too many, but the overall crime decline in this country holds true in schools as well. the odds that a k-12 student will be shot and killed in a public school are one in 614 million. by way of contrast, the odds of choking are one in 3,400. in 1995, 10% of students age 12 through 18 reported being the victim of a crime at school in the previous six months. in the 2015-16 school year, just 3% of students did. in that 20-year period it went from 10% down to 3%.
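to put those two odds side by side, here is a back-of-the-envelope calculation (not part of the talk itself; both figures are taken as quoted):

```python
# Compare the quoted odds: 1 in 614 million (a K-12 student shot and
# killed at a public school) versus 1 in 3,400 (choking).
shooting_odds = 1 / 614_000_000
choking_odds = 1 / 3_400

# How many times more likely is the choking risk?
ratio = choking_odds / shooting_odds
print(f"choking is about {ratio:,.0f}x more likely")  # about 180,588x
```

the gap between the two risks spans roughly five orders of magnitude, which is the speaker's point about how safe schools actually are.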
7:51 am
less than 3% of youth homicides and youth suicides occur at school. of course, part of the hope with social media monitoring may be that it will pick up risks off of school grounds as well, but by any measure school is a pretty safe place to be. now to the one state in the country that has legislated social media monitoring. i am sure everyone is familiar with the shooting in parkland, florida, last february, when nikolas cruz, a former student at marjory stoneman douglas high school, shot and killed 17 students and staff members and injured 17 others. in the wake of that shooting the florida legislature passed a law that included the creation of an office of safe schools in the department of education. that office is required to coordinate with the florida department of law enforcement on a centralized database with a wide range of information, including social media data.
7:52 am
the law also created a public safety commission, which recommended the development of protocols around social media monitoring; it doesn't look like that has been done yet, but it is likely to happen in the next year. nikolas cruz had posted online about his intentions before the shooting. people contacted local police three times about disturbing posts, and a separate caller flagged youtube posts in which the user said he wanted to become a professional school shooter. the poster wasn't identified as cruz until after the shooting. so there certainly were warning signs on social media; it wasn't the case that the district was flying blind. people were seeing those warning signals and trying to act on them, and what failed
7:53 am
those students wasn't a failure to see those posts. according to a review of the school district's actions that came out in august, it was more that the district itself failed at nearly every turn to provide cruz with the educational and support services he needed. florida is embarking on a national experiment when it comes to social media data. the question is, why not? if a single school shooting is one too many, and social media monitoring could catch one future nikolas cruz, one future suicidal student, why not do it, if the stakes are that high? what's the harm? there are a lot of reasons for being cautious about this kind of monitoring. the first is a real concern about the accuracy of social media monitoring, and this plays out in a couple of different directions.
7:54 am
one way it is inaccurate is through overreach: these tools are likely to pull in much more information than is related to public safety or indicative of criminal activity. when police in jacksonville, florida set up a social media monitoring tool, one of the keywords was the word bomb, on the theory that it would turn up bomb threats. it turns out there were no bomb threats online. instead the tool was inundated with posts about things like pizza that was the bomb and photo bombs. a lot of stuff coming in, very little of use. the second issue is under-reach, by which i mean the kinds of risks social media monitoring tools would like to find aren't going to appear online at all.
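the jacksonville-style over-matching described above is easy to reproduce. a minimal sketch of naive keyword flagging (the keyword list and the posts are invented for illustration):

```python
# Naive keyword flagging: any post containing a watched word is reported.
KEYWORDS = {"bomb"}

posts = [
    "that pizza was the bomb",
    "epic photo bomb at graduation lol",
    "study group at 7 in the library",
]

def flag(post: str) -> bool:
    """Return True if any watched keyword appears as a word in the post."""
    words = post.lower().split()
    return any(k in words for k in KEYWORDS)

flagged = [p for p in posts if flag(p)]
# Both harmless "bomb" posts are flagged; the tool has no sense of context.
print(flagged)
```

the flagged list contains only the two harmless posts, which is exactly the noise-without-signal pattern the speaker describes.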
7:55 am
in cruz's case, people saw his posts and reported them, so it is not clear what the extra value of monitoring software would have been, and it turns out that to some extent he was the exception. the brennan center did an informal survey of major school shootings, to the extent that is a category, major school shootings since the sandy hook shooting in 2012. there was only one other perpetrator, according to the public reporting, who had put up social media postings that strongly indicated an interest in school violence, and that was the shooter in newtown, at sandy hook itself, who had posted in a discussion forum about the columbine high school shooting and followed similar accounts about other school shooters. fellow users were able to see these, and while they may not have known whether to take them seriously, it is hard to imagine these wouldn't have
7:56 am
been reported directly to authorities if they had seemed credible; that is what we saw happen with nikolas cruz. and beyond the individuals whom concerned users reported, the online profiles of school shooters generally don't show much for an automated tool to flag. the shooter in oregon had a facebook page showing he liked first-person shooter and military games like call of duty, and various knife and gun pages. in retrospect, these look like warning signs that something was going on, but in fact the official facebook page for call of duty world war ii has 24 million followers and the remington arms facebook page has 1.3 million likes, so sending up a red flag about every single
7:57 am
person with these pastimes would create a huge quantity of noise for very little signal. automated social media monitoring tools also have built-in shortcomings. a report called mixed messages, which surveys the research on this, shows that automated monitoring tools work best when posts are in english and the tools are looking for something very concrete. they can be easily fooled by lingo, slang and pop culture references. the best example comes from the trial of the boston marathon bomber. during the trial, the fbi produced as evidence tweets from his twitter account
7:58 am
to show he wasn't just following his brother's orders. he had tweeted, and this is one of the things the agent brought up, a quote that says i shall die young, which maybe was suggesting something about his intent, but it was also a quote from a russian pop song, and he had linked to the pop song in the tweet. the agent hadn't bothered clicking on the link in the tweet to see that this was a song lyric. other quotes the fbi relied on were from jay-z songs and south park episodes, among other things. social media is incredibly contextual, and neither automated tools nor human analysts are that great at parsing it out. the second major concern is the risk of discrimination, and this comes in two forms. the first is that the keywords themselves that the tools are set to flag on will be discriminatory. for instance, an aclu report found that when the boston police department set up a social media monitoring tool, the hashtags flagged included black lives matter, ferguson, muslim lives matter and the arabic word for
7:59 am
community. needless to say, these words aren't signs of a public safety threat. these tools are only as good as the people using them, and there are a lot of ways to use them to further a discriminatory mindset. the second risk of discriminatory impact is that whatever keywords are flagged, there is going to be a huge amount of discretion in what is done with the results, including which students are brought in, who is punished by the school, and who is subjected to criminal justice consequences. we already know that students of color at every level of schooling experience harsher discipline than white students, even for the same infractions and when they commit infractions at lower rates. there is a real concern social media monitoring could contribute to the school-to-prison pipeline.
8:00 am
i suspect everyone remembers ahmed mohamed, who brought a homemade clock to his dallas-area high school and was then arrested on the suspicion that it concealed a bomb. he was well-known at his school for bringing in electronics, tinkering, and fixing other people's electronics, and he told his teachers and the principal repeatedly that it was in fact a clock. .. even though blacks made up only 40% of the student body. not surprisingly, where people are mistakenly identified as posing a threat because of their social media posts, the consequences can be serious. one connecticut teenager posted
8:01 am
a picture of a toy airsoft gun that resembled a real rifle. in his words, in terms of why, he thought it was awesome and he knew his friends would also think it was awesome. another student saw the post and was worried about it, so he reported it to school officials. this does not strike me as a crazy thing to do, although, as has been noted, if officials had googled the manufacturer's name on the side of the gun they would have seen it was a toy gun, even though it did bear a resemblance to a real one. instead of discussing it with him and resolving the issue, potentially with some lessons about responsible social media use and thinking before you post things, he was not only suspended for the day but arrested for breach of peace, a misdemeanor offense. now, because it's so hard to reliably pinpoint individual social media posts that actually indicate some kind of real
8:02 am
threat, monitoring companies have a perverse incentive: they are incented to sweep up everything so they can assure their customers they will spot that needle in the haystack. at the same time, they have very little reliable way of gauging their effectiveness. a 2015 investigation revealed that none of the three major school social media monitoring companies it looked into had metrics for measuring effectiveness, and at least one said, basically, we know we succeeded when we get a call from a school saying something we sent them was interesting. it is a perfect storm for a mindset of more, more, more. at the same time, parents and students often know very little about these tools. researchers and social media monitoring companies may assume students know they are being tracked by virtue of posting on public sites, but students more often believe that
8:03 am
companies are prohibited from sharing personal information with third parties. there's a real lack of information about how these programs operate, or rather there's asymmetrical information. and finally, and this goes to the point at the beginning, it's worth thinking about what it means for students to be under constant surveillance online. as a practical matter, they may just stop posting, or start posting less, or move to more private forums, which will simply blunt any effectiveness these tools would have had. maybe more concerning, it teaches students to expect surveillance and even to anticipate an authority figure's opinion and react accordingly. some of this you could say is good digital hygiene: when we do something publicly, we need to think before we post about what it looks like, who might see it now and who might see it in the future. but it's not clear it's healthy for students who are learning about a
8:04 am
citizen's role in a democracy to know that they are under that surveillance all the time and to be acting accordingly. so what does this all mean? at the very least, before a school or school district rolls out a social media monitoring program, it's incumbent upon officials to weigh the costs and the benefits and to involve parents and students in a frank discussion of what it means. and if they decide not to set forth on a monitoring program, they should remember they are most likely not going dark: there are concerned people who will spot posts. thank you so much. [applause] >> thanks so much, rachel. it does remind me of an acquaintance, a science-fiction writer, who has a more optimistic view of this: this is great, because we are training our children to develop habits of
8:05 am
sophisticated counterintelligence tradecraft just to be able to have a normal childhood, so the next generation will be very sophisticated about evading surveillance. i suppose we will find out. the next two talks focus on privacy in public, in a sense: the myriad ways that just walking down an ordinary city street we are being observed in ways we may not recognize, and also the ways existing networks of surveillance, like closed-circuit cameras, can be transformed in fairly deep ways by existing infrastructure becoming a platform for new methods of monitoring. the first of these is going to be an examination of camera networks for facial recognition surveillance from jake
8:06 am
laperruque. >> thank you so much for having me here. i'm jake laperruque, senior counsel at the constitution project, where i focus on surveillance issues, and i'm really excited to be talking about facial recognition, and a specific aspect of facial recognition: how cameras in their various forms can empower and grow facial recognition surveillance into dragnets. as a quick level-set about facial recognition surveillance itself, this is no longer the sci-fi technology of minority report that we might see in the distant future. it is happening now. the fbi conducts over 4,000 facial recognition searches every month on average. a quarter of all state and local police departments have the
8:07 am
ability to conduct facial recognition scans as well. customs and border protection has a biometric exit program that uses facial recognition for outgoing flights. they plan to spread this to airports in general as well as seaports and land ports across the country, and ice is looking to buy facial recognition technology as well. so that is the state of facial recognition. facial recognition depends on three key factors to be a powerful force for surveillance. first, you need a database of photos of identified people; the government already has half of all american adults in its photo databases. second, you need very powerful software that can scan across hundreds of millions of photos and scan faces rapidly; lots of companies are developing this technology, and the government as well. and third, you need a network of cameras that you can tap into and use to see people's
8:08 am
faces all the time. there are four areas where you have the potential to build these camera networks: first, government surveillance cameras, cctv; second, police body cameras; third, private security cameras; and last, social media photo databases. let's start with the government surveillance programs, the cctv programs. about a decade ago, then-chicago mayor richard daley said he expected one day we'd have a police camera basically on every corner. i want you to keep that quote in mind as we talk more and more about cctv in american cities. but first let's go to where we truly have a cctv dragnet and where it seems big brother status has been achieved, and that is china. china has by far the most powerful network of government surveillance cameras in the world, with an estimated 200 million government-run surveillance cameras throughout the country, and the effects of
8:09 am
this are quite profound. if you look at cities, these networks are incredibly dense, incredibly powerful. beijing maintains over 46,000 cctv cameras blanketing the city. state media and police in beijing boast that this network allows them to have 100% coverage of the city and see everything that is going on all the time. this can have real impacts when paired with facial recognition. recently a bbc reporter asked to test the system. he went to a city of 3.5 million people, gave his photo to the government to input into its system, and asked them to find him. using the cameras and automated facial recognition software, the system tracked him down in that entire city of 3.5 million people in seven minutes. so that is surveillance cameras at their peak, but cctv is in america to a strong degree too.
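the kind of face matching that let the system in that anecdote pick one person out of millions comes down to comparing numeric face embeddings against a gallery of enrolled identities. here is a minimal python sketch of that matching step; the names, 4-dimensional vectors, and threshold are all invented for illustration, and real systems use neural-network embeddings with hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(probe, gallery, threshold=0.8):
    """Return the best-matching enrolled identity above threshold, else None."""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy gallery of enrolled embeddings (hypothetical values).
gallery = {
    "alice": [0.9, 0.1, 0.2, 0.1],
    "bob":   [0.1, 0.9, 0.1, 0.2],
}
probe = [0.88, 0.12, 0.21, 0.09]  # face captured from a camera frame
print(identify(probe, gallery))   # prints "alice"
```

in a deployed dragnet this comparison would run continuously against every face every camera sees, which is what turns a camera network into a tracking network.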
8:10 am
it is being instituted in large cities such as new york, chicago, washington and los angeles. in new york there is a cctv network called the domain awareness system. the way it works is that you hook all the cameras into a centralized network that can be subject to real-time viewing, analysis, and other tools; facial recognition could become one of those in the future. oakland considered building its own domain awareness system that would have hooked up cameras all across the city used by government, involving everything from the port authority to police cars to cameras outside schools. smaller cities such as st. louis and new orleans also have mass cctv networks and central hubs they use to watch them. but the city that has gone largest by far with cctv is chicago. chicago is the closest to achieving big brother status in america. chicago maintains a police surveillance network of cameras that is over 30,000 total
8:11 am
cameras in the city. this in some ways surpasses the level of surveillance dragnet you will see in china. although 30,000 cameras in chicago is less than the total 46,000 in beijing, if you look at area density, the 128 cameras per square mile on average in chicago is far, far higher than in that dragnet that boasts 100% coverage of the population. this can have really powerful effects for facial recognition, and it is starting in america. we're seeing this first in orlando, which is currently running a pilot program with amazon's real-time rekognition system. the way this system works is that you have cameras scattered throughout the city that will try to scan faces, find people, identify them, and flag any persons of interest, whatever 'persons of interest' means there. so that is government cctv.
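the density comparison is simple arithmetic and worth checking. only the camera counts below come from the talk; the land areas are approximate figures i'm supplying as assumptions (roughly 234 square miles for chicago and about 6,300 for beijing's municipality).

```python
def cameras_per_sq_mile(cameras, area_sq_mi):
    """Average camera density for a city."""
    return cameras / area_sq_mi

# Camera counts from the talk; land areas are approximate assumptions.
chicago = cameras_per_sq_mile(30_000, 234)    # ~128 cameras per square mile
beijing = cameras_per_sq_mile(46_000, 6_300)  # ~7 cameras per square mile

# Fewer cameras overall, but far denser coverage per square mile.
print(round(chicago), round(beijing))  # prints: 128 7
```

the point of the exercise is that total camera counts understate how blanketed a dense city can be; density per square mile is the figure that matters for tracking someone street to street.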
8:12 am
next i want to look at police body cameras. this is probably the area of greatest risk in terms of establishing video surveillance dragnets in the united states. the simple reason for that is that body cameras are becoming incredibly popular in american police departments. axon, the largest body camera producer in the united states, has systems already in over half of america's largest cities. this is no surprise, because they offer the body cameras to police departments for free, so long as you then use axon's media storage system. studies from recent years of police departments indicate that 97% of the largest police departments in america either have body camera programs in place, are in pilot or testing stages, or, if they don't have them yet, are planning to build them in the future. this is going to be a universal phenomenon: police wearing body cameras, and that being a common
8:13 am
thing we'll see on the streets as beat cops walk by. why is this a big deal for the proliferation of government surveillance cameras? it's because cities have lots of police. on average they have between 16 and 24 police officers for every 10,000 residents, but when you look at big cities this number gets much higher. plenty of cities have as many as 40 officers for every 10,000 residents, or more; d.c. is over 50. and if you look at area density, some cities are very populated with police officers. for example, ten different cities have over 20 police officers per square mile, topped by new york city, which has over 100 police officers for every square mile. in terms of facial recognition we have seen a bit of progress. axon recently backtracked on a long-term plan to put facial recognition in its body cameras. they acknowledged the fact that this tech is in a lot of ways very flawed, very prone to
8:14 am
misidentification, so they scrapped plans, which might have happened as soon as this year, to put facial recognition in their systems. but not all vendors are taking that cautious approach. some are charging ahead with facial recognition in body cameras, and it's only a matter of time before companies like axon are satisfied that it's good enough for their purposes and begin to institute it. after all, an axon vp described their interest in body camera facial recognition by saying that putting facial recognition in body cameras would make every cop in america a robocop. this is very worrying, because while virtually all police departments are charging ahead with police body cameras, very few are setting rules and standards for facial recognition. according to a scorecard on body cameras maintained by upturn and the leadership conference, basically no cities operating body camera programs have effective rules on facial recognition, and that is many, many cities that are not acting with
8:15 am
appropriate standards. so that's police body cameras. next i want to talk about private surveillance cameras and the capacity to build government surveillance networks from them. co-opting private surveillance cameras is, similar to cctv, another way government could build up video surveillance networks, but do so with very little work, without the infrastructure, and at a fraction of the cost. we may not have the 200 million surveillance cameras that china does, but america does have over 30 million privately owned security cameras throughout the country. given that, and the potential to tap into these instead of building your own cameras, it's no surprise government may want to turn to this. as a quick aside, some of those cameras are amazon ring doorbells, a video doorbell system. just last night news broke that amazon patented technology to build facial recognition into
8:16 am
those doorbells, connect them to police networks, and notify police whenever anyone 'suspicious' came to the door. so, another fun innovation from amazon. police departments are not just thinking about this idea. they are proactively soliciting owners of private security cameras, asking them to register their cameras and to enter formal agreements whereby those cameras can be accessed and readily used by law enforcement in video surveillance networks. i mentioned new york before and the domain awareness system they have there that allows real-time streaming of video cameras. of the 6,000 cameras connected to new york's network, two-thirds are privately owned cameras with agreements that allow the new york police department to access and use them. washington, d.c. and a lot of other cities offer incentives to try to get people to hook their surveillance cameras into police networks. here, for example, is d.c. mayor muriel bowser saying: please
8:17 am
purchase security cameras and connect them into our networks; we will pay you to do this. excellent use of emojis, mayor bowser. so that is privately owned security cameras. in terms of effect it's similar to government cctv: a network of stationary cameras that could potentially be co-opted for facial recognition. but it's a network being built with a pretty severe risk, given that we don't have the option of simply stopping the government from building these cameras; we have to worry about law enforcement tapping into cameras that already exist. last, i want to talk about social media photos. this is a bit of a different network. we're not talking about cameras taking images, but rather images that are already being stockpiled. nonetheless, social media photos are potentially the greatest risk in terms of a photographic database that could be used or co-opted by government for facial recognition, and this is
8:18 am
because of the sheer size of these photo databases. we have seen facial recognition used on social media to a limited degree by the firm geofeedia. a few years ago they got caught, and admitted, that during protests in baltimore they had run social media photos through facial recognition technology to find individuals with outstanding warrants and directly arrest and remove them from the crowd. luckily, when this came out through aclu research, companies responded promptly: they blocked geofeedia and shut down its access to their services. it's important that social media companies continue to be vigilant on this front and limit their apis to prevent photo databases from becoming a means of government facial recognition surveillance. but i think it's important that companies think not just about data scanning and harvesting on their platforms through api access but also about court orders that go at those
8:19 am
means too. we've seen similar things in the recent past. for example, a couple years ago yahoo received and complied with a court order asking that they scan all email content in their databases for specific bits of content the government was looking for. it's not hard to imagine the government coming with a similar court order to some of these companies' photo databases and asking for a mass scan to find particular face prints. google and facebook maintain very large photo databases: google has over 200 million users storing photos in its cloud photo services, including 24 billion selfies, and facebook has over 350 million photos uploaded every single day. it would be great if these companies continued to build up what are already very good transparency reports, and they're getting better all the time, to think about
8:20 am
including facial recognition, so that if the government ever does come with this sort of broad access order saying, we want you to start scanning all your photos for facial recognition purposes, we will get a heads-up and can start acting. with that, i want to conclude by talking about what actions we can take if we start to see these activities, and how we should respond. there's a lot of potential at the local level. before, i mentioned oakland's proposed domain awareness system that would have connected all of their government cameras into a hub. this was a great success story: oakland activists, when they heard about it, got organized, got mad, talked a lot to the city government about it, and got it shut down. that's the sort of thing we can see in other cities if we take action. i want to give a shout-out to a great program going on right now, the ccops campaign, an effort to improve transparency and limit surveillance properly in cities all across the country. i'm sure as the campaign goes on it is going to continue to do a lot of great work to limit video surveillance and limit advanced
8:21 am
surveillance tools like facial recognition being built into cameras. on the federal level there's a lot of potential in terms of limiting and conditioning funds. we talked a little bit about government cctv. a lot of the funds for local government cctv networks don't come from the localities; they come from the federal government. doj funds cctv through grants very often, for example. orlando, which is running a real-time cctv facial recognition network, originally received funds for its cctv from the department of justice. it would be great if, in the future, when doj handed out funds for cctv and video surveillance networks, it said you cannot use this for facial recognition, or set guidelines and limits on how it could be used. dhs funds surveillance cameras, perhaps to the largest degree, as well. again, this is another opportunity where setting rules, guidelines and limits could be a very effective way of stopping these video surveillance networks from being turned into mass
8:22 am
facial recognition location tracking and scanning networks. finally, the department of justice also issues grants of tens of millions of dollars every year for police body cameras, but again, we see virtually no departments putting in good rules for facial recognition on body cameras. it would be a vast improvement if, when doj was handing out its grants for body cameras, it said you need to put in effective rules, guidelines and limits to protect privacy before we give you all this money. so those are some actions we should take, and i think it is important we take them now, because we are very quickly approaching the point where, on a daily basis, we will all be much like that bbc reporter: tracked by an automated computer system, monitored by a million little eyes. thank you very much. you can read more about our work online, and i'm looking forward to the rest of the conference. [applause]
8:23 am
>> so the classic feature of surveillance that makes it a mechanism of power is that it is unequal. in jeremy bentham's panopticon, the prisoners under the constant surveillance of the prison know that they're under potential observation: they can be seen but can't see the observers. so when it comes to public networks of cameras monitoring us, maybe one of the most effective things we can do to encourage people to react to the changes happening around them is to make them aware of those changes. and so i'm fascinated by a tool the electronic frontier foundation has built to help you practice noticing the ways in which surveillance in public is exploding around us. to talk about that i want to invite dave maass.
8:24 am
>> thank you for having me today. my name is dave maass and i'm with the electronic frontier foundation. if you're not familiar with us, we are based in san francisco, we've been around since 1990, and we exist to make sure our rights and liberties continue to exist as our society's use of technology advances. i work in particular on eff's street-level surveillance project, which aims to ensure there is transparency, regulation, and public awareness of the various technologies that law enforcement is deploying in our communities. a lot of times that work looks like filing public records requests. so, for example, with license plate readers, eff teamed up with another organization to file hundreds of public records requests around the country to find out how law enforcement agencies were sharing license plate reader data amongst
8:25 am
themselves. or take drones: we'll file a public records request for mission log reports on how uc berkeley police used drones to surveil protesters in 2017. or we'll file a public records request with a district attorney's office to get a spreadsheet with the geolocation of every surveillance camera in their database, similar to what jake was just talking about. the problem is that too often our work looks like this: we're posting public records and saying, here you go, here's the document on documentcloud, or here's a white paper we wrote, or a 3,000-word blog post. or even worse, it's me standing in front of you doing a powerpoint presentation, and if we're lucky i have a funny cartoon to go with it. i don't have one today, so i had to use this one. really, our work should look like this to the public: contextualized within their
8:26 am
community. if i could, i would run a walking tour company where i could take people around and show them the surveillance technology around them. but i'm a very busy person, and i don't know that doing tours for groups of six or seven people is really the most effective way to get our message across. however, maybe this concept can transfer over to something like virtual reality. to take a step back: when we look at virtual reality and law enforcement technology, police are already working on virtual reality. there is a company out of georgia called motion reality that has a warehouse-sized space where police officers put on virtual reality helmets, are given realistic-feeling fake electronic firearms, get wired up, and run scenarios. the scenarios can be replayed so they can see what they did right and what they did wrong. one of my favorite things about
8:27 am
this is that they are also covered in electrodes, so if they are shot in the simulation they get shocked and that part of the body is immobilized. the company has taken one of these modes and modified it to work as a replacement for field sobriety tests, so the whole follow-the-flashlight thing would happen within a vr visor. then there's a surveillance aspect: something called bounce imaging, a little ball covered with cameras that a swat team officer might throw into a hostage situation or whatever; then somebody could sit outside, looking around in virtual reality before they go in, and record a 360-degree view of everything that is going on. i have been looking at what we can do on the other side with vr. to give you a quick background, a brief history: this is one of our founders, john perry barlow, a lyricist for the grateful dead as well as
8:28 am
an internet pioneer. he wrote an essay after he visited some of the early vr companies and came back amazed, comparing it to a psychedelic experience. he thought a lot of things were psychedelic experiences back then, because he was on psychedelics quite a big chunk of the time. this, he said, was the next big thing: welcome to virtual reality. now jump ahead 25 years, because not a lot happened in between, to 2015, when we finally saw vr start to move toward the mass commercial market. this was the oculus rift, the htc vive, the playstation vr; they all came out by early 2016. for our organization there were two big questions we were looking at. first, what are the digital rights implications of virtual reality technology on our society? and second, what is the potential for virtual reality as an advocacy and educational tool? on the privacy element, "the intercept" had a great piece hypothesizing that
8:29 am
virtual reality might be the most nefarious kind of internet surveillance yet. i tend to agree with it. it voices a lot of the concerns we had been having and talking about amongst ourselves but had not seen floated publicly yet. the reason is biometrics. virtual reality tends to rely on our physical characteristics in order to function. on a very basic level that is how your head is moving, the distance between your hand and your head, how long your arms are, whether you are left- or right-handed. even something as simple as how your head moves in virtual reality can potentially be correlated to mental health conditions. more advanced vr technology is starting to involve devices that measure your breath, track your eyes, or map your facial expressions, and that's a whole other can of worms. one of the creepiest things is vr companies that, in order to
8:30 am
gather these sorts of reactive biometrics, throw stimuli at you fairly quickly and without saying why, so they can find something measurable in how you respond. we will not get too much into augmented reality, which presents even more problems, because a lot of those devices are scanning the world around you as content. something interesting that came out recently is a research study by the extended mind and pluto vr that found that, in the current state of play, 90% of vr users are taking some sort of step to protect their privacy, whether that is adjusting their facebook settings or using an ad blocker. three-quarters of users were okay with companies using their biometric data for product development, but the overwhelming majority were opposed to that information being sold, anonymized or not, to other entities. as far as advocacy tools go, we're
8:31 am
not the first to try this. planned parenthood has an experience called across the line that puts people in the position of a woman trying to seek reproductive health services at a clinic where there is a whole lot of angry protesters. peta has a couple of experiences that challenge people to step inside a factory farming situation: what is it like to be a calf or a chicken at a factory farm. and there are some groups out of brooklyn that worked with the united nations environment assembly to do virtual reality visualizations of data on air pollution; they took that and ran a bunch of u.n. delegates in nairobi through it. so that brings us to our spot the surveillance project. this is a virtual
8:32 am
reality experience that uses a basic simulation to teach people about the various technologies that police may deploy in their communities. when we were starting to pursue this, in the early stages, we had some considerations. we wanted it to be a meaningful experience; we wanted to not collect biometric information; we wanted it, as an organization that favors open source and accessible technology, to work on multiple platforms and not just the oculus rift or the vive; and we wanted it to function on a modest budget, because we're a nonprofit and we are not sony. when i say a meaningful advocacy experience, i mean we didn't want to rely on the novelty factor of vr. you can take anything, put it in vr, and if somebody is using vr for the first time it will be like, wow, amazing, regardless of what it is. we wanted to make sure ours presented our research in a way that only vr could allow. we didn't want people to just be watching a movie; we wanted them to be doing something, interacting with the
8:33 am
world, and to be challenged by it. we wanted people to learn information in the virtual world that they would carry back to the real world. the concept is that somebody puts the headset on (people can demo it during the lunch break) and is placed in a street scene in the western addition neighborhood of san francisco, where there is a police encounter going on between a young citizen and two police officers. you look around, and as you find each piece of surveillance technology you get a pop-up and a voiceover explaining what it is. it's not meant to be about how quickly you can go through it and score points on surveillance technology; it is supposed to be an educational tool. there were four goals. number one was, can we even do a virtual reality experience, and can we do it cheaply? if we pull it off the first time, maybe we can do other things down the road. number two was simply to educate people about the forms of surveillance. then we also wanted to help them
8:34 am
figure out where those technologies are in their own communities. finally, we had this thought: police encounters are stressful situations, and protests are sometimes very stressful situations. things move quickly, but it can be useful for people to take note of what surveillance technology they saw in those scenes. our hope is that by putting people in a simulation in a controlled environment, where they are able to gain practice looking for these technologies, that practice might carry over to those higher-stress situations. we decided not to go with a computer-generated environment and instead use a 360-degree photo. you can see the camera here; it's also on the screen. it has two convex lenses, one on each side, and each captures just beyond 180 degrees; the camera then stitches the two images together. if i used it right now you would get all of this room; the one thing you might not get is just the very base of the tripod underneath the camera.
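for the curious, the stitching described here relies on each lens covering slightly more than a hemisphere, so the two images overlap at the seam. here is a rough python sketch of the geometry for one lens, mapping a direction on the sphere to normalized fisheye image coordinates under an idealized equidistant lens model; the 190-degree field of view and the model itself are assumptions for illustration, not the specs of the actual camera.

```python
import math

def direction_to_fisheye(lon, lat, fov_deg=190.0):
    """
    Map a direction on the sphere (longitude/latitude in radians) to
    normalized (x, y) coordinates on a forward-facing fisheye lens using
    an equidistant fisheye model. Returns None if the direction falls
    outside this lens's field of view (it belongs to the rear lens).
    """
    # Unit direction vector; the lens looks down +z.
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    theta = math.acos(max(-1.0, min(1.0, z)))  # angle off the lens axis
    half_fov = math.radians(fov_deg) / 2
    if theta > half_fov:
        return None
    r = theta / half_fov           # equidistant projection: radius ~ angle
    phi = math.atan2(y, x)         # direction around the lens axis
    return (r * math.cos(phi), r * math.sin(phi))

print(direction_to_fisheye(0.0, 0.0))  # center of view, prints (0.0, 0.0)
```

directions the forward lens cannot see return None and are taken from the rear lens instead; the sliver beyond 180 degrees that both lenses see is where the stitching software blends the two images.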
8:35 am
this helped us get past what people refer to as the uncanny valley in video games: the more realistic a simulated human looks, the creepier it feels to people. by using an actual photo of a real scene, with a few things photoshopped in, we bypassed that altogether. this is what the photo we took looks like. once you're in the virtual reality headset it wraps all the way around you. you can see the scenario going on, and you can kind of see us at the bottom; i'll show you in a little bit. this is what it looked like behind the scenes, and you don't see this in the game, so this is a behind-the-scenes exclusive. we were hiding outside a police station hoping police would come outside, and eventually they did, and it being san francisco, nobody questioned two people with every piece of technology on the
8:36 am
street, which was great because it was kind of the perfect shot for us. for those of you who will not have a chance to try it today, this is what it looks like when you look over at the body camera: you get a pop-up that explains what it is, and it has a voiceover, because while vr is such a visual medium, we didn't want you to have to be fully sighted to enjoy the experience or to learn from it. if you're blind in one eye or have limited visibility, but you have a certain amount of awareness of your environment, you can go in and still learn things through the audio. we did our beta launch on november 5th at the internet archive, at an international hackathon. that is brewster kahle, the founder of the internet archive, testing it, which was a real honor. for the most part we're looking at having a table like this. at this point not a lot of people have these devices in their homes, even though headsets like this one just dropped to two hundred dollars recently.
8:37 am
not a lot of people have them, but it is something we can take to conferences, and our grassroots activists, when they go to visit community groups, can bring it with them just as they would bring one-pagers or brochures. we've run it through probably 500 people in the last month, which, if you think about it in terms of an activism organization, is incredible: if you can spend nine minutes with somebody getting them to focus exclusively on surveillance, that's a lot of time. it is also available on the internet, and one of the things i found gratifying is that portland, maine, which is about as far from san francisco as you can get while staying in the united states, has hackerspaces checking this out and having people demo it. we started to see social media respond to it as well. my favorite tweet is the one in the middle.
8:38 am
that's exactly what we're going for with this, so we feel good about that. as far as next steps for us, we are still in beta mode and will continue doing demos to gather feedback. once we have done that and taken it in, we may come up with an education curriculum teachers can use. after that, we want to look at what the next version of this project would be. we have a few ideas. maybe we do an internet of things version first: a home office where you look around and see the nest, a printer, all these devices in your home through which you might be surveilled. or maybe we do a protest scene, or recreate what it's like in
8:39 am
new york city, and build the same for various areas. or maybe we abandon vr and go to ar, and have a way for people's phones to project things into the world for them. it all depends on how the technology develops, the kind of interest we get, whether there's a return on investment, and the kinds of grants that are out there. it's a new world and we don't know what it's going to be in five years. but i do know what it's going to be at lunchtime, and that's set up just outside at the luncheon, where you can come try it out. i'm happy to show you how the camera works or anything like that. and that is all i have. if you do have a headset at home, or you want to play with it in your computer browser, it's at eff.org/spot. >> i love the concept, the idea that playing games, especially
8:40 am
pattern-matching games like this, spills over into non-game life: you start seeing shapes everywhere and think about how they can fit together. a game where someone relives a simulation of his ancestors' lives and takes on their sort of superhuman murder and stealth abilities seems both unrealistic and undesirable, but it might be desirable to imagine a population trained on games that teach them about spotting surveillance technology in the world around them. a more useful version of the tetris effect. turning back to the question of encryption: as we heard from sharon bradford franklin earlier, law enforcement has for years now been complaining that the spread of encryption is causing them to go dark, making it more difficult to conduct electronic surveillance of communications. a fascinating report from the
8:41 am
center for strategic and international studies points out that a lot of the difficulties law enforcement is having with intercepting electronic communications really don't have much to do with the need for backdoors, and that there is a lot of low-hanging fruit being left on the table that we ought to examine before we talk about legislating breaches in the tools we rely on to secure our platforms. to talk about that i want to invite jennifer daskal to discuss the report, which i believe you will find on the table outside. >> thank you, julian. as julian said, the focus of my talk today is the range of
8:42 am
challenges that law enforcement faces in accessing digital evidence, separate and apart from the encryption-related challenges. this talk stems from a report that i worked on with a co-author, will carter, under the auspices of the center for strategic and international studies, or what many of us know as csis. the debates about encryption undoubtedly will continue, but it was, is, and is emphatically more so our view after writing this report that while encryption and the debate about encryption have taken up so much of the limelight, there are a range of other challenges that law enforcement faces that need to be dealt with, that can be dealt with relatively easily, and that need to be dealt with now. and these challenges will continue no matter what happens with respect to encryption, no
8:43 am
matter if, in fact, there ever were a clear encryption mandate; there would still be these other ongoing challenges that need to be dealt with. and so, as our title 'low hanging fruit' indicates, these are problems we think can be relatively easily solved. not completely; nothing in this space ever lends itself to a complete solution, and we make a mistake if we assume that we are seeking a complete solution or that we are trying to eliminate totally some of the friction in the process. some of the friction is, in fact, healthy. but some of the friction is unnecessary and actually collectively harmful to both security and privacy, and minimizing that friction is not only a laudable goal but one that is eminently achievable. i will note the report we worked on was endorsed by a number of individuals and also groups and entities. it was endorsed by the former cia director john brennan, former fbi general counsel ken
8:44 am
wainstein, a former deputy attorney general, the former boston police commissioner ed davis, and former assistant attorney general for national security david kris. it's been praised by a number of different groups and providers, and several providers have already introduced a number of reforms consistent with what we called for in this report. now that i've given you the hard sell, i'm going to spend the remainder of my time talking about the substance: a little bit about the methodology that we used in doing this report, a little bit about our findings, and our ultimate recommendations. this report stems from about a year's worth of research, including a series of qualitative interviews with state, local, and federal law enforcement officials, prosecutors, representatives from a range of different tech companies, and members of the civil society community. it also involved a quantitative survey of state, local, and federal law enforcement
8:45 am
officials. the survey results are notable, hopefully you can all read every single bit of this. those surveyed reported difficulties accessing, analyzing, and utilizing digital evidence in over a third of their cases. we believe that's a problem that is only going to continue to grow as digital information becomes more and more ubiquitous and digital evidence is needed in just about every criminal investigation. this chart shows the responses to the question: what is the biggest challenge that your department encounters in using digital evidence? accessing data from service providers was ranked as the key challenge among our respondents, separate and apart from questions about interpretation. identifying which service provider has the data was reported as the number one challenge; 30% of our respondents ranked it as their biggest problem. obtaining the data once it was
8:46 am
identified was reported as the number two challenge; between 9% and 25% ranked it as their biggest challenge. accessing data from a device: 19% ranked it as the biggest challenge they face. and then collectively, analyzing data from devices and analyzing data that has been disclosed by providers, which are two separate things, combined, that's about 24%. these are problems that can be fixed, or at least largely reduced, without huge changes in the system, but with more resources and more dedicated, systematic thought about addressing these problems. so to the extent that law enforcement doesn't know where to go to get data of interest, that is a problem that can be solved with better information flows and better training.
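as a rough sketch of the survey shares just described, the breakdown can be tallied directly. note that the "obtaining data" share was given only as a range (9% to 25%), so the midpoint used below is an assumption, not a figure from the talk:

```python
# Rough tally of the "biggest challenge" survey shares cited in the talk.
# The obtaining-data share was given as a range (9%-25%); its midpoint
# of 17% is an assumption for this sketch.
shares = {
    "identifying which provider has the data": 0.30,
    "obtaining identified data from a provider": 0.17,  # assumed midpoint
    "accessing data from a device": 0.19,
    "analyzing data (devices + provider disclosures)": 0.24,
}

# The top-ranked challenge matches the talk: finding the right provider.
top = max(shares, key=shares.get)
print(top)                             # identifying which provider has the data
print(round(sum(shares.values()), 2))  # 0.9 -- remainder spread over other answers
```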
8:47 am
to the extent that law enforcement faces challenges in obtaining data, that is a bigger challenge, and we heard two very different stories from the law enforcement officials we talked to and the provider community. law enforcement talked about what they perceived as very long delays in getting information back from service providers, service providers dragging their feet, service providers having insufficient resources to respond to the volume of requests, and requests being slow-walked or turned down in what they perceived to be invalid circumstances. providers, on their side, told us a very different story. they complained about what they saw as overbroad requests, about law enforcement asking for things that simply were not available, and about delays being the fault of law enforcement as they were internally debating and deciding whether or not to get nondisclosure orders that would prohibit the provider from telling
8:48 am
their customer or subscriber that the customer's data had been sought, with providers holding off on law enforcement requests, not turning over the data until they learned whether or not they had permission to tell the customer or the subscriber. the data, interestingly, supports both sides of the story. this chart shows the requests that u.s. law enforcement issued to six key companies, facebook, microsoft, twitter, google, yahoo! and apple, over time. this is based on the companies' own transparency reporting; there is no other good source of this data. not surprisingly, you see from this chart a pretty dramatic increase in requests over a pretty short time. the chart shows requests in six-month intervals. in the six-month period ending in 2013 there were about 400,000 requests to these six
8:49 am
u.s.-based providers. by december of 2017, in the six months before that, the number had almost doubled, or at least increased by a significant amount, to about 650,000, almost 700,000 requests in the prior six months. what's interesting about this chart is that the grant rates have hovered more or less at about the same level, at about 80%. they've been consistent over time in terms of the percentage of requests or demands that providers complied with. but that also means the absolute number of requests being turned down, the number of disclosure demands not complied with, is higher, given that there's a bigger volume of actual requests. to some extent law enforcement is frustrated because they are sensing this bigger number of request denials, whereas providers are saying, we are pretty consistent in how we've been treating this over time.
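the dynamic described here, a steady grant rate over a growing request volume, can be checked with quick arithmetic. the volumes below are the approximate six-month totals cited in the talk, and the 80% rate is treated as exact purely for the sketch:

```python
# A constant grant rate applied to a growing volume of requests still
# yields more denials in absolute terms -- which is what law enforcement
# experiences, even though provider behavior (the rate) hasn't changed.
GRANT_RATE = 0.80  # approximate, per the transparency reports

requests_2013 = 400_000  # six-month period ending in 2013
requests_2017 = 650_000  # six months ending december 2017

denied_2013 = round(requests_2013 * (1 - GRANT_RATE))  # 80,000 denials
denied_2017 = round(requests_2017 * (1 - GRANT_RATE))  # 130,000 denials

print(denied_2017 - denied_2013)  # 50000 more denials per period
```

so both sides can be right at once: providers' denial rate is flat, while law enforcement sees tens of thousands more denials per reporting period.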
8:50 am
the chart only shows the requests that were made, not the requests that were never made because law enforcement did not know where to go or was otherwise stymied. and the grant rates say nothing about the legitimacy of either the requests or the grounds for rejecting them. there is, and there should be, some ongoing disagreement about the appropriate scope of requests. this is an area where some friction is not only healthy, it's productive, and it's going to persist inevitably because there are different views about the appropriate scope of these requests. but there are also a number of areas, with respect to grant rates and law enforcement's issuance of requests to providers, where there is unnecessary friction. some of the reduction in that friction can support both privacy and security at the same time. some of the things that can be helpful in this regard are better, up-to-date law
8:51 am
enforcement guides provided by the providers, better resourcing of the law enforcement response teams at the providers, better training and dissemination of that training to state and local law enforcement officers, and better training of the judges that review and approve the range of requests subject to court orders or warrants. this has obvious security benefits in the sense that it gives law enforcement a more streamlined ability to access the data of interest, but it also has privacy benefits to the extent it leads to better tailored, more privacy-protective, and more narrow requests. to the extent that law enforcement cannot interpret data that is disclosed, this is a problem that stems in part from encryption, but also from what we heard over and over again: the absence of technical tools to decipher non-encrypted
8:52 am
data that was disclosed. this is a problem that stems, in part, from the absence of tools, and it is also a distribution problem. sometimes some of the bigger law enforcement entities would have access to the appropriate tools, but those tools were not disseminated to the 18,000 state and local law enforcement entities that exist around the country. so despite what appears to be a pretty clear need and a pretty easy-to-identify solution with respect to resourcing, training, and dissemination of tools, the sole federal entity with an explicit mission to better facilitate cooperation between law enforcement and providers is the fbi's national domestic communications assistance center. it has a budget of just $11.4 million this fiscal year, and that is spread out among several different programs designed to
8:53 am
distribute knowledge about service providers' policies and products, develop and share technical tools, train law enforcement, and maintain a 24/7 hotline, among many other initiatives. that is a drop in the bucket given the need that is out there. one of the key, most highly regarded training centers, the national computer forensics institute, which is run by the secret service, fights for appropriations every year. it is funded at about 1.9 million, enough to train 1,200 students. if it were fully funded, it could train over 3,000, but that is just a drop in the bucket when you consider that there are 18,000 federal, state, and local entities across the country. and that's just the number of entities, not the number of individuals working at those entities. there are a range of state and local training centers and resources, and some federal resources, that have arisen to fill some of these gaps, but as you can see, they are not
8:54 am
geographically distributed evenly. there's much more concentration on the east coast, and to some extent the west coast, with a big swath in the middle where there's not much in terms of resources and training centers. and there's no central entity for determining what's out there, what works, what doesn't, and how to best allocate these resources. so this gets me to our recommendation, which is the creation of a national digital evidence office, authorized and resourced by congress, that would sit in doj and would do the kind of work that is needed both to assess what is out there and to ensure a more efficient and reasoned distribution of resources. so: develop a national digital evidence policy. coordinate ongoing efforts, including grantmaking; there are a lot of different grantmaking bodies that are not well coordinated currently.
8:55 am
identify some of the gaps. establish and promote a consistent set of standards for securing and minimizing the data that is collected. develop authentication systems to ensure that the person who is asking for the data is, in fact, who they claim to be and is entitled to receive that data. coordinate with some of the interesting international efforts that are ongoing. and report to congress and promote transparency about what is going on. we've also called on congress to authorize the ndcac, which does not have an independent authorization at the moment, and to adequately resource the ndcac to do its job. it would sit within the broader digital evidence office, so that the technologists who are aware of what's going on with the tech tools and of what's going on in the field and the challenges in
8:56 am
the field sit alongside some of the policy folks. and that would allow the ndcac to do what it already is trying to do, but has been trying to do on a very slim budget, which is conduct and disseminate training, gather and disseminate information about service providers, develop and disseminate technical tools, and provide a hotline system. we've also offered a series of recommendations to providers. we call on providers to step up some of their training efforts, and having a centralized body like the ndcac could help facilitate that. one of the things we heard from providers over and over again is: we do trainings, but there are 18,000 federal, state, and local law enforcement entities across the country, and there's rapid turnover. it's like a cat and mouse game. having some sort of central place to do some of the training, which would then lead to better warrants and requests, is helpful both for the law enforcement folks and for the provider folks. so: provide training, maintain online portals, facilitate the
8:57 am
request process, help with the identification problem to some extent, provide explanations for rejections so there can be a dialogue, ensure appropriate staffing to meet the needs, and provide rapid responses. what counts as an adequate response time will change based on what is being requested, so we don't include specific time limits in the report. but also maintain the transparency reports that providers, at least the big providers, are already producing with respect to the law enforcement requests they get, and break that reporting down even more in terms of the categories of requests, over time, and a range of other smaller categories as well. and finally, it's important to think as well about some of the bigger providers working with, and helping to develop best practices for, the range of smaller providers that are
8:58 am
increasingly coming onto the market and will have to deal with this bucket of issues as well. so i will end by saying the challenges are only going to grow over time. we think this is low-hanging fruit. these are structures and resources that need to be put in place now, because the needs are only going to expand as we move forward, and in our view this has benefits both for security and for privacy, and it allows us to do something while the debate about encryption goes on. so thanks. [applause] >> that reminds me of a comment i heard from somebody who worked for a tech company, who recounted getting a kind of confused e-mail from a law enforcement officer: these files you sent me that we requested, they're all encrypted. we need you to help us.
8:59 am
and the reply: they are not encrypted. it's a spreadsheet. you need to open it in excel. [laughing] i think the pattern we see is that very often there are automatic calls for greater authority to solve problems that are really more about institutional knowledge and the ability to navigate changing technological structures. i want to thank our flash talkers, and i want to invite you to join us upstairs for lunch. i hope you'll join us in the afternoon session as well, and in particular that you'll stick around, or return at the end of the day, when we will be leading a group over to the art museum for a tour of the exhibit. please join in thanking our speakers one last time. [applause]
9:00 am
[inaudible conversations] >> here's some of our live coverage tuesday. at 10:30 a.m. eastern on c-span, the wilson center takes a look at the state of u.s.-china relations.
9:01 am
i don't think he's a racist. i think it's the way he looks at people: everyone is either a friend or an enemy, and you can change categories. the america first thing, i think, is an idea he holds dear, that our country has been shortchanged in its dealings with the rest of the world. the trade policies, in the minds of many supporters in middle america, have hurt them, and i believe that's a sincere set of beliefs on his part. >> on c-span's q & a.
9:02 am
>> when congress comes back in january, it will have its youngest and most diverse membership. watch live coverage starting on january 3rd. >> current government funding is set to expire on friday at midnight. congress and the president need to reach an agreement to keep the government funded. here are speeches from the senate floor, where senators spoke about the potential government shutdown. >> earlier this month, congress sent the president another continuing resolution to allow more time to resolve a partisan impasse that has us on the brink of a government shutdown once again. a continuing resolution allows agencies to continue to spend money without knowing how much they will actually get to spend. the current episode is yet another example of the breakdown of what should be the basic

