
tv   House Oversight Hearing on Facial Recognition Technology  CSPAN  June 4, 2019 9:59am-12:54pm EDT

9:59 am
on men's faces than women's faces. they still performed better on lighter skin than darker skin. we still see a bias. >> and you know what the collateral damage can be through misidentification. i've fought for years to free people from prison who were wrongfully convicted, and that's just where this is going because of lax rules and regulations around the technology. >> absolutely. we don't even have reporting requirements. at least in the u.k., where they have done pilots of facial recognition technology, there are reported results, and you have false positive match rates of over 90%. there's a big brother watch u.k. report that came out that showed more than 2,400 innocent people had their faces misidentified. so this is building on what the aclu said, right. we already see this in the real world where performance metrics
10:00 am
are required. >> all in the name of profits. mr. chairman, my time is up. i yield back. >> the committee will come to order. without objection, the chair is authorized to declare a recess of the committee at any time. this is our second hearing on facial recognition technology. i now recognize myself for five minutes to make an opening statement. today the committee is holding our second hearing on the use of facial recognition technology. we will be examining the use of this technology by law enforcement agencies across the federal government. our first hearing was a broad survey of the full range of issues raised by this technology. we heard from a number of
10:01 am
experts about the benefits and the dangers of this technology across government and the entire private sector. the stark conclusion after our last hearing was that this technology is evolving extremely rapidly without any real safeguards. whether we're talking about commercial use or government use, there are real concerns about the risks that this technology poses to our civil rights and liberties and our right to privacy. the other conclusion from our last hearing was that these concerns are indeed bipartisan. as we saw at our last hearing, among conservatives and liberals there is wide-ranging agreement that we should be conducting oversight of this
10:02 am
issue to develop common sense, concrete proposals in this area. i truly appreciate the ranking member's commitment to working together on this issue again in a bipartisan way. today we will focus on the use of facial recognition technology by our government. our committee has broad jurisdiction over all government agencies, so we are uniquely situated to review how different agencies are using this technology on the american people. for example, today we will hear from the federal bureau of investigation. in april the government accountability office sent a letter to the department of justice with open
10:03 am
recommendations on the fbi's use of facial recognition technology. as that letter stated, the fbi had not implemented these recommendations despite the fact that gao initially made them three years ago. we'll also hear from gao not only on the importance of these recommendations, which focus on transparency and accuracy, but also on the dangers associated with failing to implement them. we will also hear from the transportation security administration, which has launched pilot programs in u.s. airports that subject american citizens to a facial recognition system. finally, we will hear from the national institute of standards and technology, which has been the standard
10:04 am
bearer for biometric accuracy for the past 20 years. nist will discuss the state of the technology, the rapid advancement of this technology, accuracy challenges this technology still faces and future plans for testing and monitoring progress. hearing from all of these relevant actors and building this record of information is important as we begin to assess the use of facial recognition technology by government and private actors and potentially develop legislative solutions. we will continue to hear from stakeholders through our subcommittees, each of which is tasked with a specialized focus such as safeguarding civil rights and liberties, protecting consumers, examining our government's acquisition of this
10:05 am
technology and reviewing national security concerns. i anxiously look forward to hearing from all of our witnesses today. with that i recognize the distinguished ranking member of our committee, mr. jordan, for his opening statement. >> thank you for this hearing. we fight a lot on this committee. i think we may have a little vigorous debate tomorrow morning, but today's subject matter is one where we have a lot of agreement and a lot of common ground. so i genuinely appreciate the chairman's willingness to have a second hearing on this important subject. two weeks ago we learned some important things. when facial recognition technology is implemented, all kinds of mistakes are made, and those mistakes disproportionately impact african-americans. there are first amendment and fourth amendment concerns when it's used by the fbi and the federal government.
10:06 am
there are due process concerns when it's used by the fbi and the federal government. we learned that the bureaus of motor vehicles in over 20 states have just given the fbi access to their photo databases. no individual signed off on that when they renewed their driver's license. they didn't sign any waivers. no elected officials voted to allow that to happen, no state assemblies, no bills, no governors signing a bill to say it's okay for the fbi to have this information. and now we learn that when gao did their investigation and study into how the fbi implemented this, there were all kinds of mistakes the fbi made in how it was implemented. i think there were five recommendations that gao said you were supposed to follow that the fbi didn't follow. all this happened and it's been
10:07 am
three years for some of those, and they still haven't corrected and fixed the concerns that gao raised with the implementation of facial recognition technology. and all this happens in a country with 50 million surveillance cameras. so this is an important subject. i appreciate the chairman's willingness to have a second hearing and willingness to work together in a bipartisan fashion to figure out what we can do to safeguard american citizens' fourth amendment and first amendment and due process rights as we go forward. with that, mr. chairman, i yield back. >> thank you very much. i now want to welcome our witnesses. dr. greta goodwin is director of
10:08 am
homeland security and justice at the united states government accountability office. dr. charles romine is director of the information technology laboratory at the national institute of standards and technology, and mr. austin gould is the assistant administrator of requirements and capabilities analysis at the transportation security administration. if you would, please stand and raise your right hand and i will swear you all in. do you swear or affirm that the testimony you're about to give is the truth, the whole truth and nothing but the truth, so help you god? let the record show that the witnesses answered in the affirmative. thank you very much, you may be seated. the microphones are very sensitive, so please speak directly into them. make sure they're on when you
10:09 am
speak, please. without objection, your written statements will be made a part of the official record of this committee. with that, you're now recognized to give your statement for five minutes. >> thank you. my name is kimberly del greco. i lead the information services branch with the fbi. thank you for the opportunity to appear before the committee. i am testifying today regarding the fbi's use of facial recognition for law enforcement purposes. it is crucial that authorized members of the law enforcement and national security communities have access to today's biometric technologies to investigate, identify, apprehend and prosecute terrorists and criminals. the fbi's next generation identification, or ngi, system which includes facial
10:10 am
recognition aids in our ability to solve crimes. facial recognition is an investigative tool that can greatly enhance law enforcement capabilities and protect public safety. at the fbi, trust is crucial. protecting the privacy and civil liberties of the american people is part of our culture. this is why when the fbi developed its facial recognition technologies, it also pioneered a set of best practices to deploy these technologies for public safety in keeping with the law and without interfering with our fundamental rights. there are two separate programs: the fbi's interstate photo system and the facial analysis, comparison and evaluation, or face, services unit. specifically, the ngi ips allows law enforcement agencies the ability to use the investigative tool of facial recognition by searching criminal mug shots.
10:11 am
law enforcement has performed photo line-ups for decades. while this practice is not new, the efficiency of such searches has significantly improved using automated facial recognition. policies and procedures emphasize that photo candidates returned are not to be considered positive identification, that the searches are of photos and only result in a ranked listing of candidates. the policy places legal, training and security requirements on law enforcement users of the ngi ips, including a prohibition against submitting photos that were obtained in violation of the first and fourth amendments. photos are solely criminal mug shots acquired by law
10:12 am
enforcement partners with criminal fingerprints associated with an arrest. the fbi face services unit provides investigative lead support to fbi offices, operational divisions and legal attaches by using trained face examiners to compare face images of persons associated with open assessments or active investigations against facial images available in state and federal facial recognition systems through established agreements with state and federal authorities. the face services unit only searches photos that have been collected pursuant to the attorney general guidelines. they are not retained. this service does not provide positive identification but rather an investigative lead. since the gao review and the last oversight hearing in 2017, the fbi has taken significant steps to advance the fbi's facial recognition technology.
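the ranked-candidate search ms. del greco describes -- a probe photo scored against a gallery of mug shots, returning a ranked list rather than a positive identification -- can be sketched roughly as follows. the gallery format, the cosine-similarity scoring and the ids here are illustrative assumptions, not the fbi's actual system:

```python
import math

def cosine(a, b):
    # cosine similarity between two face-embedding vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search_gallery(probe, gallery, top_k=50):
    # one-to-many search: score every gallery entry against the probe
    # and return a ranked candidate list -- an investigative lead only,
    # never a positive identification
    ranked = sorted(gallery.items(), key=lambda kv: cosine(probe, kv[1]),
                    reverse=True)
    return ranked[:top_k]

# toy 3-dimensional "embeddings" standing in for real face templates
gallery = {
    "mugshot_001": (0.9, 0.1, 0.0),
    "mugshot_002": (0.1, 0.9, 0.1),
    "mugshot_003": (0.8, 0.2, 0.1),
}
probe = (0.85, 0.15, 0.05)
for subject_id, emb in search_gallery(probe, gallery, top_k=2):
    print(subject_id)
```

a human examiner would then review each returned candidate against the probe photo before anything becomes a lead.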
10:13 am
at the end of 2017, the fbi validated the accuracy rate at all list sizes. in early 2018 the fbi required law enforcement users to have completed facial recognition training consistent with the face standards prior to conducting facial recognition searches in the ngi ips. additionally, the fbi collaborated with nist to perform the facial recognition vendor test and determined the most viable option to upgrade its current ngi ips algorithm. i am proud to be working alongside so many mission focused staff protecting the country against horrific crimes.
10:14 am
thank you. >> thank you very much. dr. goodwin. >> chairman cummings, ranking member jordan and members of the committee. i'm pleased to be here today to discuss gao's work on the fbi's use of face recognition technology. over the past few decades this technology has advanced rather quickly and it now has wide-ranging usage, from accessing a smart phone to social media to helping law enforcement in criminal investigations. however, questions exist regarding the accuracy of the technology, the transparency in its usage and the protection of privacy and civil liberties when that technology is used to identify people based on certain characteristics. today i will discuss the extent to which the fbi has ensured
10:15 am
adherence to privacy laws and policies regarding its use of face recognition technology, as well as whether the fbi has ensured its face recognition capabilities are sufficiently accurate. in our may 2016 report we noted that two legally required documents, the privacy impact assessment and the system of records notice, were not being published in a timely manner. these documents are vitally important for privacy and transparency because the pia analyzes how personal information is collected, stored, shared and managed, while the sorn informs the public about the very existence of the systems and the types of data that are being collected. doj has taken actions to
10:16 am
expedite the development process of the pias but it has yet to update the sorns. specifically, we found that the fbi conducted limited assessments of the accuracy of the face recognition searches before they accepted and deployed the technology. the face recognition system generates a list of the requested number of photos. the fbi only assessed accuracy when users requested a list of 50 possible matches. it did not test smaller list sizes, which might have yielded different results. additionally, these tests did not specify how often incorrect matches were returned. knowing all of this, the fbi still deployed the technology. the fbi often uses face recognition systems operated by
10:17 am
21 state and two federal external partners to enhance criminal investigations. we reported that the fbi had not assessed the accuracy of these external systems. as a result they cannot know how accurate these systems are. yet the fbi keeps using them. moreover, we found that the fbi did not conduct regular reviews to determine whether the searches were meeting users' needs. we made recommendations to address all of these accuracy concerns. doj has yet to implement these. we issued our annual priority recommendations report, which provided an overall status of doj's open recommendations and outlined those that gao believes should be given high priority. this report included six recommendations related to face recognition. as of today, five of those six remain open. the use of face recognition
10:18 am
technology raises potential concerns about both the effectiveness of the technology in aiding law enforcement and the protection of privacy and individual civil liberties. this technology is not going away and it is only going to grow. so it will be important that doj takes steps to ensure the transparency of the systems so that the public is kept informed about how personal information is being used and protected, that the implementation of the technology protects individuals' privacy, and that the technology and systems used are accurate and being used appropriately. this concludes my remarks. i'm happy to answer any questions you have. >> thank you very much. dr. romine. >> i'm chuck romine, director of the information technology laboratory at the department of
10:19 am
commerce's national institute of standards and technology. thank you for the opportunity to appear before you today to discuss nist's role. nist research provides guidance to industry and government agencies that depend upon biometric recognition. nist leads national and international standards activities in biometrics such as facial recognition technology but also
10:20 am
cryptography. nist biometric evaluations advance the technology by identifying and reporting gaps and limitations of current technologies. nist evaluations advance measurement science by providing a scientific basis for what to measure and how to measure. nist evaluations also facilitate development of consensus-based standards by providing quantitative data for development of scientifically sound, fit-for-purpose standards. nist conducted the face recognition grand challenge and multiple biometric grand challenge programs to challenge the facial recognition community to break new ground solving research problems on the biometric frontier. since 2000, nist's face recognition vendor testing
10:21 am
program, or frvt, has assessed capabilities of algorithms for one-to-many identification and one-to-one verification. nist broadened the scope of its work in this area to understand the upper limits of human capabilities to recognize faces. historically and currently, nist biometrics research has assisted the federal bureau of investigation and department of homeland security. nist's research was used by dhs in its transition to ten prints. nist is working with fbi and dhs to analyze face recognition capabilities, including performance impacts due to image quality and demographics, and
10:22 am
provide algorithms, optimal thresholds and match gallery creation. nist's face recognition vendor test program was established in 2000. significant progress has been made in algorithm improvements since the program was created. nist is researching how to measure the accuracy of forensic examiners matching identity across different photographs. the study measures face identification accuracy for an international group of forensic face examiners working under circumstances approximating real world case work. the findings showed that examiners and other human face specialists, including forensically trained facial reviewers and untrained super recognizers, were more accurate than the control groups on a challenging test of face identification.
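the one-to-one verification testing dr. romine refers to is typically summarized by two error rates at a chosen decision threshold: the false non-match rate (genuine same-person comparisons rejected) and the false match rate (impostor comparisons accepted). a minimal sketch, with made-up similarity scores:

```python
def verification_error_rates(genuine_scores, impostor_scores, threshold):
    # false non-match rate: genuine comparisons falling below the threshold
    fnmr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    # false match rate: impostor comparisons meeting or exceeding it
    fmr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return fnmr, fmr

# hypothetical similarity scores from a 1:1 verifier
genuine = [0.91, 0.88, 0.95, 0.70, 0.93]   # same-person comparisons
impostor = [0.10, 0.35, 0.82, 0.20, 0.05]  # different-person comparisons
fnmr, fmr = verification_error_rates(genuine, impostor, threshold=0.85)
print(fnmr, fmr)
```

raising the threshold trades false matches for false non-matches, which is why an "optimal threshold" depends on the application.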
10:23 am
it also presented data comparing state of the art facial recognition algorithms with the best human face identifiers. optimal face identification was achieved only when humans and machines collaborated. rigorous testing can increase efficiency in government and industry, expand innovation and competition, broaden opportunities for international trade, conserve resources, provide consumer benefit and choice, improve the environment and promote health and safety. thank you for the opportunity to testify on nist activities in facial recognition. >> mr. gould. >> good morning. thank you for inviting me.
10:24 am
i'm austin gould. i would like to thank the committee for working with tsa as we continue to improve the security of transportation systems and particularly for your support of our officers at airports nationwide. tsa's establishment in 2001 charged the agency with providing transportation systems security. a key component of performing this mission is positively identifying passengers boarding aircraft and directing them to the appropriate level of physical screening. this primarily occurs when passengers enter a checkpoint and present themselves to a security officer. since inception, tsa has strived to carry out that role as effectively and efficiently as possible, recognizing the need to positively identify passengers in an era where fraudulent means of identification are becoming more prevalent.
10:25 am
tsa's biometrics road map identifies the steps that the agency is taking to test and potentially expand biometric identification capability at tsa checkpoints. the road map has four major goals: partner with customs and border protection on biometrics for international travelers, operationalize biometrics for tsa precheck passengers, potentially expand biometrics to domestic travelers and develop the infrastructure to support these efforts. consistent with the biometrics road map, tsa has conducted pilots. these pilots are being used to evaluate the applicability of biometrics for tsa operations.
10:26 am
each pilot has been supported by a privacy impact assessment, and passengers always have the opportunity to not participate. in those cases the standard manual identification process is used. i have observed the pilot currently underway in terminal f in atlanta for international passengers. of note, virtually every passenger chose to use the biometric identification process. the facial capture camera was in active mode, meaning it only captured a facial image after the passenger was in position and the officer activated it. in that regard, biometrics represents a unique opportunity for tsa. this capability can increase security effectiveness for the entire system by using biometric identification while also increasing throughput at the checkpoint and enhancing
10:27 am
the passenger's experience. we experienced our busiest travel day ever on 24 may. tsa is committed to addressing accuracy, privacy and cyber security concerns. in that regard, and pursuant to section 1919 of the tsa modernization act, tsa is preparing a report that will address accuracy, error rates and privacy issues associated with biometric identification. looking ahead, tsa plans to continue to build upon the success of past pilots to refine requirements. these pilots will be supported by privacy impact assessments,
10:28 am
clearly identified through airport signage, and passengers will always have the opportunity to choose not to participate. to close, tsa is in the process of a systematic assessment of biometric identification. this identification process will enhance aviation security while also increasing passenger throughput and making air travel a more enjoyable experience. tsa's system will be used for passenger identification and to determine the appropriate level of screening only. it will not be used for law enforcement purposes. as always, passengers will have the opportunity to not participate. thank you for the opportunity to address this important issue before the committee, and i look forward to answering your questions. >> i now recognize myself. ms. del greco, in 2017 the government accountability office testified before our committee that the fbi has signed
10:29 am
contracts with at least 16 states to be able to request searches of their photo databases. gao stated that most of these systems access driver's license photos, but several states also include mug shots or corrections photos. can you explain how the fbi decides to search a state database versus when it searches its own system and how this policy is determined? >> i would be happy to explain that. at the fbi we have a service called the face services unit. they process background checks and perform facial recognition searches of the state dmv photos. they do this in accordance with
10:30 am
the attorney general guidelines. a field office has to have an open assessment or active investigation. we launch the search to the state. the state runs the search for the fbi and provides a candidate list back. with regard to the ngi ips, the interstate photo system, the face services unit will utilize that repository as well as the dmv photos. however, state and local and federal law enforcement agencies only have access to the ngi interstate photo system. these are the fbi mug shots that are associated with a ten-print criminal card and a criminal arrest record. >> do individuals who consent to having their faces in the noncriminal database also consent to having their faces searched by the fbi for criminal
10:31 am
investigations? for example, when applying for a driver's license, does someone consent at the dmv to being in a database searchable by the fbi? >> the fbi worked diligently with the state representatives in each of the states that we have mous with. we did so under the states' authority to allow photos to be used for criminal investigations. we also abided by the federal driver's license privacy protection act. and we consider that a very important process for us to access those photos to assist the state and local law enforcement and our federal agencies. >> you just said state authority allows you to do this. one question that our ranking member has been asking over and over again is, do you know
10:32 am
whether in these states any elected officials have anything to do with these decisions? in other words, where is that authority coming from? we're trying to figure out, with something affecting so many citizens, whether elected officials have anything to do with it. do you know? >> i do. only in one state, the state of illinois, did an elected official sign the mou. in the other states the mous were signed by state representatives. this is state law that's established at the state level prior to facial recognition and our program getting started. we're just leveraging that state law. that state law is already in place. we did work with the office of general counsel at the fbi and the attorneys at the state level. >> well, it was prior to facial recognition coming into
10:33 am
existence. i'm just wondering, do you think that whatever law you're referring to anticipated something like facial recognition? >> it's my understanding that the states established those laws because of fraud and abuse of driver's licenses. we are just reviewing each of the state laws and working with the representatives in those states to ensure we can leverage that for criminal investigation. >> so when you say leverage, i guess you're saying that there were laws that were out there. these laws did not anticipate something like facial recognition, and now the fbi has decided it would basically take advantage of those laws. is that a fair statement? >> the federal driver's license privacy protection act, it
10:34 am
allows the state to disclose personal information, including a photo or image obtained in connection with the motor vehicle record. >> we have seen significant concern among states about providing this level of access to the fbi. for example, during our may 22nd hearing, we learned that vermont suspended the fbi's use of its driver's license database in 2017. is that correct? >> i'm not aware of that, sir. >> well, it is accurate. how many states have provided this level of direct access to the fbi? >> we do not have direct access. we submit a probe to the state. there are 21 states. >> okay. >> what we did, sir, in the last two years since the last oversight hearing, our office of
10:35 am
general counsel reviewed every single m.o.u. to ensure that it met the federal and state authorities. >> does the fbi have plans to increase the number of states that provide the fbi with access to their databases? >> that would be up to the states, sir. we have reached out to all the states, but it's up to them if they want their data used. it's optional for them. >> when states agree to provide this level of access to their databases, are they aware of the fbi policies in searching their systems and any changes that are made to these policies? >> it is made extremely clear to each of the states how the information will be used, the retention. we purge all photos coming back to us from the state. we ask that the state purge all of the probe photos that we send them. >> how do you make them aware? >> we have active discussions
10:36 am
and it's in the m.o.u., sir. >> is the fbi undergoing any current negotiations to expand the information available for fbi face services photo searches? if so, can you please describe these negotiations. >> i'm not aware of any current negotiations right now, sir. >> finally, we also heard reports that the fbi can search photo databases of other agencies, including the department of state. are there any limits to this access? >> the searches of the state department's photos are in accordance with an active fbi investigation and are only done under the attorney general guidelines followed by the fbi. >> and can the fbi perform a face recognition search for any american with a passport? >> for an open assessment or an
10:37 am
active investigation only by the fbi, sir. >> thanks for bringing this important issue to the forefront. now, i know we don't have border patrol here on their use of facial recognition to meet the congressional mandate for biometrics. also, i'm from the state of arizona and our department of transportation uses this technology to combat fraudulent driver's license applications. can you give us a little bit more information and details on some of the successes with partners that you have been working with? >> the successes that we've had, the majority are with state and local law enforcement. the fbi's system does not provide a positive identification. it provides investigative leads
10:38 am
out to law enforcement and to our fbi field offices. some of those successes are assisting with the capture of a terrorist in boston, assisting with putting the pieces together to identify the location of a pedophile who had been trying to avoid law enforcement for 20 years, and also assisting in identifying a person who was on the ten most wanted list for homicide. >> mr. gould? >> our greatest success in terms of partnering has been with customs and border protection. we leverage their traveler verification system at our checkpoints. we're doing this solely on a pilot basis, but it has increased throughput at our checkpoints. >> at our last hearing we heard some disturbing facts about
10:39 am
the accuracy of facial recognition. can you give us some idea, from what you see, of how we're going to be able to be much more accurate in that application? >> yes, sir. the most recent testing that we've conducted demonstrates significant improvement over previous tests. we conducted tests in 2010 and 2014 that demonstrated certain limitations associated with facial recognition accuracy. the most recent test results will be published this month for the frvt one-to-many evaluation that is being readied, but the results so far suggest substantial increases in accuracy. >> what sort of accuracy rates are you finding in the different algorithms' ability to match an
10:40 am
image against a larger gallery of images? >> we have many different participants who have submitted algorithms, approximately 70 participants in our testing. the best algorithms are performing at a rate of approximately 99.7 percent in terms of accuracy. there's still a wide variety, or wide variance, across the number of algorithms. some of the participants fared significantly poorer, but the best algorithms are in the 99.7 category. >> are there algorithms you've tested that you would recommend for law enforcement? >> we don't make recommendations. we provide the data necessary about how an algorithm will perform in the field.
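the accuracy figure dr. romine cites for one-to-many search is commonly computed as a rank-1 hit rate: the fraction of probe searches whose top-ranked candidate is the correct identity. a sketch with toy trial data (the subject ids are illustrative only):

```python
def rank_one_accuracy(trials):
    # trials: list of (true_id, ranked_candidate_ids) pairs from a
    # one-to-many evaluation; a hit means the top candidate is correct
    hits = sum(1 for true_id, ranked in trials
               if ranked and ranked[0] == true_id)
    return hits / len(trials)

# hypothetical evaluation results
trials = [
    ("subject_a", ["subject_a", "subject_c"]),
    ("subject_b", ["subject_b", "subject_a"]),
    ("subject_c", ["subject_a", "subject_c"]),  # a miss at rank 1
    ("subject_d", ["subject_d"]),
]
print(rank_one_accuracy(trials))  # 3 of 4 searches hit at rank 1
```

a "99.7" result under this metric means roughly 3 searches in 1,000 fail to put the right person at the top of the candidate list on the test data.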
10:41 am
for law enforcement, for example, accuracy rates are one important aspect that needs to be considered, but there are other aspects that have to be taken into consideration for procurement or acquisition of such technology. >> going back to the development of algorithms, really the biases can be built in by those who are manufacturing or building the algorithms, is that true? >> it is true that the algorithms, depending on the way they have been developed, can have biases associated with them. in many cases the improvement that we see in the performance of these algorithms, the dramatic improvement, comes from machine learning algorithms; that is what has made the difference.
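the demographic bias dr. romine describes is usually quantified by computing error rates separately per group, for example the false match rate among impostor comparisons within each demographic group. a sketch with fabricated scores (the group labels and numbers are illustrative only):

```python
from collections import defaultdict

def fmr_by_group(impostor_trials, threshold):
    # impostor_trials: list of (group_label, similarity_score) pairs for
    # different-person comparisons; returns false match rate per group,
    # exposing any demographic differential at the chosen threshold
    totals = defaultdict(int)
    false_matches = defaultdict(int)
    for group, score in impostor_trials:
        totals[group] += 1
        if score >= threshold:
            false_matches[group] += 1
    return {g: false_matches[g] / totals[g] for g in totals}

# fabricated impostor scores showing a differential between two groups
trials = [("group_1", 0.2), ("group_1", 0.9), ("group_1", 0.1), ("group_1", 0.3),
          ("group_2", 0.9), ("group_2", 0.88), ("group_2", 0.2), ("group_2", 0.4)]
print(fmr_by_group(trials, threshold=0.85))
```

a gap between groups in this table is exactly the kind of differential performance the hearing's opening exchange raised.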
10:42 am
we evaluate these as black boxes. my assertion there is from discussions that we've had with vendors and not from examination of the algorithms themselves. the training of those algorithms determines the level of bias that may exist within the algorithms themselves. >> thank you very much. mr. lynch. >> thank you, mr. chairman. i want to thank you for holding this second hearing on facial recognition. i thank the ranking member as well. it's good to have bipartisan interest on this issue. ms. del greco, i certainly understand the dynamic at play when there is an active fbi investigation ongoing and you're reviewing mug shots of known criminals. but mr. gould, according to the biometrics road map released by
10:43 am
tsa in september 2018, tsa seeks to expand the use of facial recognition technology to, quote, the general flying public, close quote. in specific locations, but the general flying public. and tsa envisions the use of the technology on domestic flights as well as international, which would capture the faces of mostly american citizens. going back to the chairman's original question, what's the legal basis -- i'm not talking about a situation with the fbi where you might have probable cause. where does the tsa find its justification, its legal justification, for capturing the facial identity of the flying public? >> yes, sir. in accordance with the aviation and transportation security act of
10:44 am
2001, tsa is charged with positively identifying passengers boarding aircraft. >> let me stop you right there. so we all fly at least a couple times a week. >> yes, sir. >> so we have -- now you have to have a certified license. you can't go with the old version that your state had. now we have much more accurate licenses. we surrender that oftentimes in the airport; during the boarding process you've got to show it a couple of times. so you're doing that right now. >> yes, sir. >> you have been doing that for a long, long time. >> manually, yes, sir. >> right. so now you're saying you're going to do these pilot programs and you're going to herd people -- you're saying voluntarily. but i could imagine you could either agree to surrender your
10:45 am
right to anonymity. >> with respect to the general traveling public, we anticipate using a one to one matching capability at the checkpoint. you produce your credential, you stick it in the machine and the machine returns a match result. that will then allow you to proceed. should you decide not to participate, we will always have the option to do that process manually. >> but to match you've got to have that data in the -- on board to match something with, right? >> that data is embedded in your credential. the photograph is on your driver's license, for example. there's a digital recording of that image in the credential. when your picture is captured by
10:46 am
the camera, it is matched to the photograph on the credential. it does not depart the checkpoint for any database search or anything like that. that's the one to one identification we intend to use for the broader traveling public. >> that's it. you don't anticipate using a database or collecting a database of information within tsa with which to identify passengers. >> for international travelers who have a passport photo of record we will match them to a gallery. but for the general traveling public that does not participate in those programs and merely has a credential -- >> what's the size of the gallery? if anybody engages in international travel are they going to be in that, or are they foreign nationals who travel to the u.s.? >> sir, the gallery that we use
10:47 am
right now with tvs, the traveler verification service, includes anyone who is traveling internationally and who has a photo on record. >> here's the problem. we had a problem with opm, where 20 million individuals had their personal information, social security numbers, everything that they submitted on federal documents to opm, stolen by, we think, the chinese. i'm just curious and concerned that we don't have a great track record in protecting people's personal information. >> yes, sir. cybersecurity under the rules associated with the program is something we take very, very seriously. >> i hope so. my time has expired. i yield back. >> mr. higgins. >> thank you, mr. chairman. ladies and gentlemen, thank you for appearing before the committee today. mr. chairman, i ask unanimous consent to enter into the record a document from the security
10:48 am
industry association, that's an association for cybersecurity providers, just a general knowledge document. >> without objection, so ordered. >> during this emerging era of digital technologies, i think it's important that we refer to technologies that we've had existing for quite some time. in 2005, as a police officer in the city that i patrolled, we had access to a camera -- a series of cameras -- that was disguised as a transformer on an electric pole where we had large numbers of complaints and crimes in portions of the city. the citizenry themselves wanted these crimes solved and investigated. we would have the linemen for
10:49 am
the electric company install this camera. we solved many crimes. crimes would go down. this was 15 years ago. we have license plate readers right now. madam, gentlemen, i'm quite sure you're familiar with license plate readers. we use them from sea to shining sea. if your vehicle is on a public road, you're subject to a license plate reader. these cameras are available to any citizen. there's a cross reference to the dmv and they'll know exactly what vehicle passed in front of that camera. these cameras have been used to successfully investigate and solve crimes, some of them heinous crimes. i have in my home 11 smart
10:50 am
cameras. these cameras are connected to software, to high resolution digital cameras. if a familiar person the cameras have learned is a constant visitor to my home, or myself, my wife, my son, et cetera, there's no alert sent to the security company. if it's not a familiar person, a human being receives a prompt and looks at the camera feed to my home. everyone here wants to protect the fourth amendment rights and privacy rights of american citizens. none of us want our constitutional protections violated. the fact is this emerging technology of facial recognition is coming, and it reflects just
10:51 am
the advancement of our digital technologies we have already employed across the country, deployed in public areas, including airports. miss del greco, like any technology it has a chance for abuse, would you concur? >> we feel at the fbi that following policies and procedures is extremely important. >> thank you. these are human beings following policies and procedures, correct? >> we require all state and local law enforcement entities to adhere to the required training. >> thank you. so the technologies that we're viewing, these cameras don't make arrests, do they? they just add to the data of a case file or to the strength of
10:52 am
an investigation, and then a human being, an investigator, must follow up on that and determine if you have probable cause for arrest, is that correct? >> our system doesn't capture real time. a photo has to be submitted to the ngi-ips by law enforcement, and they have to have authority to access our system for a law enforcement purpose. >> the concern of this committee, as it should be, is the potential abuse of this technology, and i believe the point that we should clarify in my remaining ten seconds here is that human beings are ultimately in control of the investigative effort, and the technology that's viewed is part of a much larger totality of circumstances in any criminal investigation. would you concur with that, ma'am? >> for the fbi we're very strict on the use of our system and the authorities that are provided to those law enforcement entities.
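the two workflows described in this hearing differ in kind: tsa's checkpoint match is one-to-one verification against the credential photo, while a search of the ngi interstate photo system is one-to-many and returns a ranked candidate list rather than a match. a minimal sketch, with hypothetical embeddings and a hypothetical threshold (no real system's representation is shown):

```python
# Sketch: one-to-one verification vs. one-to-many identification.
# The embeddings and similarity function are hypothetical stand-ins for
# whatever a real face-recognition system computes.
def similarity(a, b):
    # cosine similarity of two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def verify(live, credential, threshold=0.9):
    """1:1 -- is the live capture the person on the credential?"""
    return similarity(live, credential) >= threshold

def identify(probe, gallery, k=2):
    """1:N -- return the top-k candidates, never a definitive match."""
    ranked = sorted(gallery.items(),
                    key=lambda item: similarity(probe, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# hypothetical gallery of enrolled embeddings
gallery = {"alice": [1.0, 0.0], "bob": [0.0, 1.0], "carol": [0.7, 0.7]}
print(verify([0.9, 0.1], gallery["alice"]))  # checkpoint-style 1:1 check
print(identify([0.8, 0.6], gallery, k=2))    # ngi-ips-style candidate list
```

note that `identify` returns a list of leads for a human examiner, mirroring the testimony that the system "doesn't make a match."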
10:53 am
>> mr. chair, my time has expired. >> what do you mean by strict? what does that mean? >> since the last hearing in 2017 -- the fbi, we take this seriously, sir. we went out to our advisory policy board, made up of over 100 state, local, federal and tribal entities, and we talked to them about the gao findings and we talked about photos and the first and fourth amendments, and we require state and local and federal and tribal entities to have training to submit a photo to the ngi-ips. we restrict the access unless they're authorized to have it. we also put out the ngi policy and implementation guide and told the states they must follow the standards identified in the facial identification working group standards. >> mr. clay. >> thank you. i want to thank you and the ranking member for conducting this hearing and the witnesses
10:54 am
for being here. let me start with this: the gao recommended in may of 2016 that the fbi make changes to ensure transparency of its use of facial recognition technology. in april 2019 gao released a letter to the department of justice highlighting the recommendations and recommending, and i quote, doj determine why, number one, a privacy impact assessment and, number two, a system of records notice were not published as required, and implement corrective actions, end of quote. doj did not agree with either of these recommendations and the fbi still has not fully implemented the two open recommendations offered by gao. dr. goodwin, can you explain the importance of transparency when
10:55 am
it comes to the fbi's use of facial recognition technology? >> yes, thank you, sir. as you mentioned, we made six recommendations, three of them related to privacy, three related to accuracy. only one of those has been closed as implemented. the ones we made related to privacy and accuracy focus on the privacy impact assessment, and that is a requirement under the e-gov act of 2002 that it be conducted to help determine the privacy implications and evaluate the protections, and the doj has disagreed with that. we know that they are concerned about privacy and transparency, but they disagree with our recommendation. these are legally required documents that they have to submit. the system of records notice is required under the privacy act. that provides information -- any time there's a change to the
10:56 am
system or a change to the technology, they have to make that information publicly available so that the public knows what's going on. so we stand behind those recommendations because those speak to transparency, those speak to privacy. >> and to this date, those documents have not been made public? >> that is correct. >> miss del greco, can you explain why the fbi disagrees with these transparency-focused recommendations? >> i believe doj disagrees with gao's assessment of the legal requirements. the fbi did publish both. during the initial development of face recognition, we had privacy attorneys embedded in our process to develop the protocols and procedures, and we have submitted continued updates and provided updates to gao. >> so what steps do you take to
10:57 am
protect privacy when conducting face recognition searches? >> the fbi monitors appropriate use with audits of the state, local, federal and tribal entities, and we look at our system requirements. we provide outreach to our users, and to date we have not had any violations or notice from the public that they feel like their rights are violated. >> and to what extent do you share the steps you take with the public? >> with regard to those, those are on behalf of the department of justice, and i would have to take that question back to them. >> would you get back to us with a response? >> yes, sir. >> i'm concerned that fbi is not fully complying with the notice obligation when it comes to the use of facial recognition. miss del greco, when the fbi arrests an individual based on a lead generated by face recognition, does it notify the
10:58 am
defendant of that fact? >> those are through fbi open assessments or active investigations, and they're done conforming to and following the attorney general guidelines. and that would be for an active fbi investigation. >> so how many times has the fbi provided notice to criminal defendants that face recognition was used in their case? >> as part of a criminal investigation, i don't believe that's part of the process. >> what about when it gets to trial? gets through discovery, they get that? >> so the fbi face services unit, that's the department that i represent, in clarksburg, west virginia -- we provide a candidate back to the fbi field office. two or more candidates. they make the determination whether that is a match or not, or their person of interest that they're looking for. >> so does the fbi provide other
10:59 am
candidate matches to the defendant as part of brady evidence or discovery? >> i'm not aware of any other information other than a candidate back from a search of the facial -- ngi interstate photo system. >> okay. what steps are the fbi taking to ensure that its use of the technology is as transparent as possible by showing proper notification? >> the fbi provides policy and procedures out to state and local entities that they must follow and they have to follow the standards that we establish and they have to make sure that they do so in accordance with authorized law enforcement purposes. >> so how does the public know whether their face image might be subject to searches you conduct? >> the law enforcement entity would have to have the authority to do so for criminal justice
11:00 am
purposes in order to access the ngi interstate photo system. >> i see. my time has expired and i yield back, mr. chairman. >> thank you. dr. goodwin, did the fbi meet all requirements of the e-government law? >> so as i mentioned earlier, the pia, the e-government -- >> did it meet all the requirements? i was looking for a yes or no. did it meet all the requirements when they implemented? >> no. we have -- we still have open recommendations -- >> i understand. dr. goodwin, did the fbi publish a privacy impact assessment when it implemented frt in 2011? >> no. >> did the fbi file proper notice, the system of records notice, in a timely fashion when it implemented facial recognition technology? >> no. >> did the fbi conduct proper
11:01 am
testing of the next generation interstate photo system when it implemented frt? >> proper in terms of determining accuracy for use? >> yes. >> no. >> did the fbi test the accuracy of the systems it interfaced with? >> no. >> didn't follow the e-government law, didn't follow the impact assessment notices like it was supposed to, didn't provide timely notice, didn't conduct proper testing of the system it had, and didn't check the accuracy of the state systems it was going to interface with, right? >> that is correct. >> miss del greco said we have strict standards, you can count on us. we've got memorandums of understanding with the respective states to safeguard people. that's what she told us. when they started the system, stood up the system, there were five key things they were supposed to follow that they didn't, and my understanding is they still haven't corrected all
11:02 am
those, is that accurate? >> that is correct. >> they still haven't fixed the five things they were supposed to do when they first started. >> five open recommendations. >> but we're supposed to believe, don't worry, everything is just fine? and we haven't even got to the fundamentals yet, to the first amendment concerns, the fourth amendment. we're just talking about the process for implementing standing up the system? miss del greco, you said earlier to the chairman, i think you used the word strict policies we follow. how are we supposed to have confidence in strict policies that you're going to follow when you didn't follow the rules when you set the thing up in the first place? >> sir, the fbi published both the pia and the sorn. the department of justice disagrees with the gao on how they interpret the legal assessment of the -- >> you disagree in one area or all five? >> i believe in the three areas
11:03 am
of the findings. >> if you have five problems. >> the accuracy of the system was tested, and we disagree with gao, and since the last hearing in 2017 the fbi went back and we evaluated our current algorithm again at all list sizes, and the accuracy was above 90 percent, higher than what we had reported initially in the hearing. we care about the accuracy of the system and the testing. >> earlier, you said when the chairman was asking questions that memorandums of understanding were signed -- someone at the fbi signed a document and someone in the 21 respective states who allow access to their databases signs these memorandums. who are the people signing away the rights of the citizens in their respective states? who are those individuals? >> our office of general counsel works with the state representatives in the states that garner those authorities. >> but not state representatives
11:04 am
in the sense they're elected to the general assembly in those states -- some person designated by somebody to sign away -- in ohio, i think i said this two weeks ago, there are 11 million people in our state, and my guess is 8, 9, 10 million drive, so someone is signing away access to those 9 million people's facial images, their pictures and everything else in that database. who is that individual? >> the state authorities are public documents that anyone could get access to. we work with the appropriate state officials. we review those documents very carefully. we talk about the use of the data and we make sure they're in accordance with the federal driver's privacy protection act as well. >> okay. >> mr. chairman, i come back to the basics: there were five key things they were supposed to do when they started implementing the system in 2011, if i read the material correctly, that they didn't
11:05 am
follow, yet we're supposed to believe don't worry, everything is fine -- all this happening in an environment, as we said earlier, where we learned two weeks ago there are 50 million surveillance cameras around the country. again, i appreciate the chairman's willingness to have a second hearing and work with the minority party in trying to figure out where we go down the road. i yield back. >> what is your disagreement, by the way, with gao? you said there's a disagreement. what is it? >> with regard to privacy? >> yeah. yes. >> doj, i understand, disagrees with their legal assessment of the reporting of such. i would have to take that specifically back to doj to respond. >> would you do that for us? >> i will, sir. >> thank you very much. miss maloney. >> thank you to all the panelists for being here at this important hearing.
11:06 am
i have read that facial recognition technology is susceptible to errors that can have grave ramifications for certain vulnerable populations. i've read that for some reason it is more difficult to recognize women and minorities, or -- i would like a private meeting with members that are interested in this on why this was reported, and whether it was reported correctly. i want to follow up on the ranking member's questions on the scope and accountability of this program. mrs. del greco, how many searches has the fbi run in the next generation photo system to date? do you have that information? >> i have the numbers from fiscal year 2017 to april of 2019. >> does the fbi track if the results of this system are
11:07 am
useful in your investigations? >> we do ask our state, local, federal, and tribal partners to provide feedback on the services that we provide. to date we have not received any negative feedback. >> but have they said that it's been successful? can you get back to me in writing? one thing is not getting any negative feedback; the other is, is there any proof this system has been helpful to law enforcement in any way? has it led to a conviction? and get it to me in writing. how many of the fbi's searches have led to arrests and convictions? do you have that information? >> i do not. >> you do not. how many of the fbi's searches have led to the arrests of innocent people? >> for facial recognition, the law enforcement entity must have the authorized access to our system and they must do so for -- >> but they -- my question was, has it led to the arrest of any
11:08 am
innocent people? yes or no? >> not to my knowledge, ma'am. >> you don't know anything about any innocent person being arrested? >> our system is not built for identification. we provide two or more -- >> maybe we should change your system. we need accountability if the system is working or not or if it's just abusing people. the fbi database i read contains over 600 million photos of individuals primarily of people who have never been convicted of a crime, and my question is, why does the fbi need to gather photos of innocent people? >> we do not have innocent people or citizens in our database. we have criminal mugshot photos associated with a criminal arrest. >> well then i -- my information that i read in the paper must be wrong.
11:09 am
i'm going to follow up with a letter for clarification because i was told you had 600 million photos of innocent people in your database. so to me it's extremely important that we know whether the use of this technology leads to any benefits for society, especially in determining whether there are crimes that it is helping to solve, or whether we are just weighing on the constitutional rights of people and creating real constitutional risks. and we cannot know this unless there is sufficient data from the law enforcement that uses this. my question is, what are the current reporting requirements regarding the fbi's use of facial recognition technology? are there any oversight reporting requirements on the use of this technology? >> the fbi monitors appropriate uses of our technology through
11:10 am
audits. we have a robust audit -- >> do you have a database that tracks whether or not this is actually working? is it helping law enforcement arrest people? is it arresting innocent people? is it keeping information on innocent people? do you have a database that basically tells us what this program is doing and what the benefits or penalties are to our society? >> no, we do not. >> well, i think you should have one. i'm going to go to work on one right now. i'm very concerned about it. the american people deserve government accountability, and i actually agree with the questioning of the minority party leadership on this, in that you don't have answers on how it's working, how it was set up, what's coming out of it, whether it's hurting people, helping people. you don't even have information on whether it's aiding law enforcement in their goal of hunting down terrorists.
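the accountability database proposed here could be as simple as one structured record per search. a hedged sketch only; every field name below is hypothetical and describes no real system:

```python
# Sketch: a minimal audit record for each face-recognition search, of the
# kind of accountability reporting discussed above. All field names are
# hypothetical.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class SearchAuditRecord:
    agency: str
    purpose: str              # authorized law enforcement purpose cited
    candidates_returned: int  # size of the candidate list sent back
    led_to_arrest: bool       # outcome tracking the questioner asks for
    timestamp: str

record = SearchAuditRecord(
    agency="example-pd",
    purpose="robbery investigation",
    candidates_returned=2,
    led_to_arrest=False,
    timestamp=datetime.now(timezone.utc).isoformat(),
)
print(asdict(record))
```

aggregating such records would answer exactly the questions posed: how often searches lead to arrests, and how often they touch people never charged.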
11:11 am
we need more accountability, and i yield back. >> thank you very much. i'm sorry, i recognize -- >> just real quick. >> unanimous consent. >> sent from the consumer technology association to chairman cummings about this issue. >> without objection, so ordered. mr. massie. >> thank you, mr. chairman. dr. romine, you reported on the accuracy of the algorithms tested. you said they're 99 to 99.7% accurate. did you test -- well, first of all, on that accuracy rating, i can imagine two ways the algorithm fails. one would be a false positive and one would be failing to recognize an actual match. which number are you reporting? >> so for the -- let me double check, because i want to be sure that i get this right. the accuracy at 99.7, i believe, is false negative rates, but i'm
11:12 am
going to have to double check and get back to you on that. >> you can get back to me later. did you test certain conditions like siblings, the accuracy for siblings? >> we do have -- perhaps the most relevant data i can give you is we do know that there is an impact on twins in the database or in the testing, whether they are identical twins or even fraternal twins. >> let me give you the point i have, i have two sons, one is 2 1/2 years younger than the other. he can open his brother's phone. they don't look that much alike. they look like brothers. he furrows his eyebrows and changes the shape of his mouth to the way he thinks his brother looks and opens his phone every single time. so that accuracy is not 99%. that is 0%. now that may be an older algorithm. i'm sure they've improved in a couple years since this
11:13 am
happened. i want to submit for the record an article in "forbes" by thomas brewster called "we broke into a bunch of android phones with a 3d printed head." >> without objection, so ordered. >> thank you. so i think these aren't as accurate for certain conditions -- somebody wearing a mask or makeup, or a sibling -- the accuracy does not or may not approach 99% with some of these algorithms. what do you think? >> the situations you're describing are situations where there is intent to deceive, either through lack of liveness -- >> do you think there's intent to deceive in the world? >> i certainly do. >> yeah. >> that's what we're worried about at tsa, is intent to deceive, not the honest actor. let me go to something else here. this question is for miss del greco.
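the distinction mr. massie draws matters because a single "accuracy" number hides two different failure modes. a minimal sketch computing both rates from hypothetical labeled trials:

```python
# Sketch: the two failure modes distinguished above, from hypothetical data.
# false positive: different people declared a match (e.g. the sibling case)
# false negative: the same person not recognized
trials = [
    # (declared_match, actually_same_person)
    (True, True), (True, False), (False, True),
    (False, False), (True, True), (False, False),
]

false_positives = sum(d and not a for d, a in trials)
impostor_pool = sum(not a for _, a in trials)   # different-person trials
false_negatives = sum((not d) and a for d, a in trials)
genuine_pool = sum(a for _, a in trials)        # same-person trials

fpr = false_positives / impostor_pool   # false positive rate
fnr = false_negatives / genuine_pool    # false negative rate
print(fpr, fnr)
```

a system can report 99.7% on one of these rates while performing far worse on the other, which is why the question "which number are you reporting?" is the right one to ask.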
11:14 am
the supreme court case brady v. maryland held that due process rights require the government to promptly disclose potentially exculpatory evidence to the defense. so in a case where multiple photos are returned -- there may be nine possible matches -- does the defense get access to, or knowledge of, the fact that there were other possible matches? let me give you an example. in a prior hearing, somebody testified to us about a sheriff's office where a person with 70% confidence was the person they ended up charging, even though the algorithm thought somebody else was at 90% confidence. so they charged the person that the algorithm said was 70% likely and passed over the one that was 90% likely in this case. can you guarantee us that the fbi would provide that type of information to the defense? >> first, the fbi doesn't make a match. we provide an investigative lead
11:15 am
to our law enforcement partners. with all evidence obtained during an investigation -- >> do you ever provide more than one lead? >> we provide more than one lead sometimes, yes, sir. >> okay. >> two or more. it depends on the state. some states want 20 candidates back, some want two back. it depends on the state system. >> does the defense get access to the knowledge that there were other leads, and do you assign a probability to the lead, or a confidence level, with that facial recognition? >> i think the prosecution team must determine that on a case-by-case basis. >> so you're not sure if they always get that? >> no, i'm not. we don't provide a true match, an identification, back. it's up to the law enforcement entity to make that decision. >> a quick question: how many photos does the face database have access to, including the state driver's license databases? >> that changes daily. i don't have that. >> is it in the millions, tens
11:16 am
of millions? >> i don't know, sir. i can provide that. >> do you have access to kentucky's database? >> i can check for you, sir. we do not. yes, we do, sir. >> so you -- you have access to all the photographs in the driver's license database in kentucky. which elected official agreed to that? >> i believe we worked with the state authorities in kentucky to establish the mou. >> but not an elected official? >> the state authority is public and it's predetermined, established prior to face recognition. >> you said -- so you say the laws that you're relying on were passed before facial recognition became -- >> they were, they were, sir. >> that's, i think, a problem. i yield back, mr. chairman. >> thank you very much. mr. rouda. >> thank you, mr. chairman. dr. goodwin, in may of 2016, the gao made six recommendations to the fbi, three related to
11:17 am
privacy, one of which i believe was implemented, and three related to accuracy. can you talk briefly about the five that were not yet implemented? >> yes, sir. the three related to privacy focused on developing the process so that it is more aligned with the requirements. another focuses on publishing in a timely manner -- devising and developing a process and making certain that those documents are published in a timely fashion. the other three are accuracy related and about testing, you know, expanding or testing the candidate list size. as you know, on the list size, we took issue with the fact that they didn't test the smaller list sizes. so that's one of them. another is regularly assessing whether the ngi-ips meets their needs. that's an accuracy concern.
11:18 am
the other one, on the face database, is making certain that those are also meeting the needs. those three questions related to accuracy speak to this conversation here. the information that the fbi is using needs to be accurate if they're using it for their criminal investigations. it's important that the information they're using be accurate. >> the recommendations were made three years ago. why has the lack of implementation been the case for three years? >> that probably is a question better left to the fbi. >> i'll come around. dr. romine, you stated 99.7% plus accuracy, but that is for specific algorithms. when you look at the breadth of algorithms used, i then assume, based on your statement, that there are accuracy rates much lower than that depending on the algorithm. can you elaborate on that? >> yes, sir. the range of performance in terms of accuracy for the
11:19 am
algorithms is pretty broad. some of the participants have made substantial progress and have remarkably accurate algorithms, in terms of the 99 and above percent for false negative rates. others are, i believe, about 60-fold less accurate than that. those are from a variety of sources, including one university algorithm submitted for research participation. >> is there data -- i'm going to ask you this as well as miss del greco -- is there data showing facial recognition accuracy versus traditional photographs and enhanced photography? >> i'm not quite sure i understand your question, sir. >> well, whether it's an old-fashioned technology of just using photographs versus facial
11:20 am
recognition. >> i see. >> is there any data that we have available that shows facial recognition is a large step in the right direction, even with the challenges we're having here? >> we do have tests of human performance in facial recognition through comparison of photographs. interestingly, what we find, and i refer to my testimony, is that if you combine two humans, you don't really do much better than either one individually. if you combine two algorithms, you don't really do much better than either individually. if you combine a human and a facial recognition algorithm, you do substantially better than either. >> okay. miss del greco, going to you, you can answer the same question, but also i would like to pivot back as to why the fbi has not implemented the five other recommendations of the gao. >> the two recommendations regarding the pia and the sorn,
11:21 am
doj disagrees with gao's assessment. we had privacy attorneys embedded in our process the whole time. we published a pia and a sorn and continue to update those accordingly, and we've provided updates to gao. with regard to the candidate list size, since the last hearing in 2017, the fbi conducted a test of our current accuracy in the system at all list sizes and we were able to validate that the percentage was higher than what we published in 2017. >> one more quick question: if a bad actor with bad intentions has the skill set to use disguises, doesn't that circumvent the entire process? >> we provide a candidate back and we use trained fbi examiners,
11:22 am
as dr. romine alluded to. the system provides a better response back to the law enforcement entity. >> thank you. i yield back, mr. chairman. >> thank you very much. >> thank you, mr. chairman. miss del greco, does the fbi use real-time face recognition on live video feeds, or have any plans to do so in the future? >> we do not. >> has the fbi ever experimented with real-time face recognition? >> not to my knowledge, sir. >> do any of the fbi's domestic law enforcement partners utilize or plan to utilize real-time face recognition technology? >> not for criminal justice purposes. >> does the department of justice believe the fbi has statutory authority to do real-time face recognition itself? >> not to my knowledge. >> does the department of justice believe the fbi has statutory authority to give grants to support real-time face recognition? >> no, sir.
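the candidate-list sizes raised earlier (gao's concern that smaller list sizes went untested, and states requesting anywhere from two to 20 candidates back) can be illustrated directly: the chance the true identity appears somewhere in the list rises with list size, so accuracy measured only at a large list size can overstate performance at a small one. a sketch with hypothetical search results:

```python
# Sketch: why candidate-list size matters, per the gao recommendation
# discussed above. The searches are hypothetical: each value is the rank
# at which the true identity appeared (None = not retrieved at all).
true_ranks = [1, 3, 2, 15, 1, None, 4, 40, 2, 1]

def hit_rate(ranks, list_size):
    """Fraction of searches whose true identity appears in the top-k list."""
    hits = sum(1 for r in ranks if r is not None and r <= list_size)
    return hits / len(ranks)

for k in (2, 10, 50):
    print(k, hit_rate(true_ranks, k))
```

with this toy data the hit rate climbs as the list grows, which is why testing only at large list sizes leaves open the question of how the system performs for the states that request just two candidates back.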
11:23 am
>> miss del greco and mr. gould, name the companies who lobby or communicate with your agencies about face recognition products they would like to provide. >> we have the testing that we've done through nist, but those are the only agencies that we're familiar with, and we would defer to the nist vendors that participated in the facial recognition test in 2018. >> sir, the system that tsa is prototyping in conjunction with cbp uses an algorithm developed by nec. >> nec would be the -- >> that's -- >> the only company? >> that's the company that we're working with right now with cbp. >> mr. gould, how many air passengers have participated in tsa's face recognition pilots? >> sir, i would have to get back to you with a number on that for the record. >> and you couldn't tell us how
11:24 am
many participants are u.s. citizens? >> no, sir. >> under what statutory authority does tsa use face recognition technology on american citizens? >> we use the authority of the aviation transportation security act which requires us to positively identify passengers who are boarding aircraft and proceeding through the checkpoint. >> and can you tell me what statutory authority tsa uses for face recognition technology on domestic travelers generally? >> sir, i would say it's the same authority the aviation transportation security act. >> does tsa have any plans for real-time face recognition technology in airports? >> sir, if you mean real-time by facial capture and matching at the checkpoint, then yes, that is what we're pursuing. >> and has tsa considered the privacy implications of real-time face recognition technology? >> yes, sir. absolutely. we've done privacy impact
11:25 am
assessments associated with this. there's signage at the airports that identifies that we're using facial recognition technology in a pilot manner to identify passengers. we don't store any photographs on the camera. >> and will travelers be able to opt out? >> yes, sir. travelers will always have the opportunity to not participate in the program. >> and you think that's true now and into the foreseeable future? >> yes, sir. >> do you have plans to implement face recognition technology at additional points in airports beyond -- besides gates or security checkpoints? >> we are prototyping facial recognition technology at bag drops. when you drop a bag off to be placed on an aircraft we can use facial -- we're exploring the use of facial technology there. then for tsa purposes, the only other location is the checkpoint. >> okay. i yield. >> will the gentleman yield? >> i yield. >> mr. gould, let me come back,
11:26 am
if you're doing it at bag drops, that's not a one-to-one comparison? i mean, what are you comparing it to? if you're looking at changing -- checking facial recognition at bag drops, there wouldn't be necessarily the identification that you were talking about earlier. what pilot program are you working with on that? >> the pilot program in place right now is with delta airlines and cbp and tsa in atlanta's terminal f, a matching of the passenger's bag against their identification or their photograph on the system. >> well, that contradicts your earlier testimony, mr. gould. what you said you were doing was just checking the biometrics within the identification against a facial recognition, but it sounds like you're doing a lot more than that. >> sir, this is for international travelers. >> i understand. i just came back, i came through jfk, i didn't see any of the signs you're talking about. all right. i guess what i'm saying is, what
11:27 am
statutory authority gives you the ability to do that? you refer to a 2001 law. i actually am on the transportation committee, and i can tell you, we never envisioned any of this, and i'm looking at the very statute myself here. how can you look and suggest that the statute gives you the ability to invade the privacy of american citizens? >> the gentleman's time has expired but you may answer the question. >> i'm sorry, sir? >> you may answer the question. >> thank you. sir, with respect to the pilot in atlanta, it's international travelers. the purpose of that pilot is to positively match, using biometrics, the passenger to that bag at the bag drop. the only -- the travelers -- the photograph is captured, the image captured is transmitted to the cbp system for matching and it returns a match result. that's it. no privacy information or any
11:28 am
other data associated with it. with respect to jfk, there's no pilot going on there right now. it is solely in atlanta in terminal f. >> miss hill? >> thank you, mr. chairman. i want to follow up actually on several of these questions. mr. gould, does the tsa record how many american citizens' faces it captured during the pilot, and if so, do you know the numbers? >> no, ma'am, i don't know the numbers. i would have to submit that for the record. >> yes, please. also, tsa uses the facial recognition systems of customs and border protection, cbp, which may not restrict how private corporations use passenger data. according to an august 2018 article from the "new york times," cbp, quote, has said it cannot control how the companies use the data because they are not collecting photographs on cbp's behalf. an official stated that, quote, he believed that commercial carriers had an interest in keeping
11:29 am
or retaining the data they collect and the airlines have said they are not doing so. if they did that would be up to them. tsa has said that it intends to pursue innovative models of public/private partnerships to drive collaboration and coinvestment. if they use systems to scan the faces of american citizens, how can it ensure that the private data of these passengers is not stored or sold by private airlines? >> ma'am, i would have to refer to cbp for any assessment for the security and privacy of that system. with respect to the public/private partnership, when we refer to that, we're talking about partnering with industry, airline and airports solely on the front end capture system. basically the cameras being utilized. >> but you talk about co-investment. >> so in accordance with tsa's authorities, we are allowed to enter into agreements with airports and/or airlines to procure equipment on our behalf and that equipment would be the camera system only.
11:30 am
solely for the capture; the matching and the database, that's a government system, and right now we're using the cbp system. >> so have you thought about how you would ensure that the private data is not stored or sold by airlines? >> it is encrypted and sent off for matching. the database that cbp uses and the tsa system, that is cyber secure to applicable standards. we do not transfer any private -- any personally identifiable information between us and the airlines. >> dr. goodwin, what regulations do you believe should be put in place in order to prevent the abuse of passenger data by airlines and other private companies? >> so as you know, at gao, we wouldn't have an answer -- we wouldn't provide an answer to that question. the way we think about it is we have issued recommendations related to privacy and accuracy and if
11:31 am
those are implemented that would go a long way to meeting some of the needs of the public as well as the needs of this committee. >> sorry. can you clarify some of those? >> we have those six recommendations. >> right. >> related to privacy and accuracy. only one has been implemented. >> we believe if the remaining five were implemented that would address some of the concerns around privacy for the citizens and accuracy for the data that are being collected. >> and mr. gould, do you have issues with those recommendations? is there something preventing tsa from incorporating those? >> so as i stated before, in accordance with section 1919 of the modernization act, we've executed in conjunction with cbp a thorough review of the privacy impacts associated with biometric identification at the airport as well as any error rates and security concerns associated with that. that report will come from dhs in the near future. >> great.
11:32 am
"the washington post" further stated that around 25,000 passengers traveled through atlanta's airport pilot program terminal each week. according to the article, quote, only about 2% of travelers opt out. even assuming that the systems used by tsa are 99% accurate, which they're likely not, the high volume of traffic would still mean hundreds are inaccurately identified each week. does tsa keep metrics of american citizens inaccurately identified? >> in accordance with our analysis, they were capturing match rates and nonmatch rates. with americans that do not return a positive match rate, i would have to submit something for the record. >> what would be the most effective way for tsa to measure how accurate its facial recognition systems are when testing the identity of american citizens? >> we're not expert in testing full systems. we test algorithms for -- we evaluate those algorithms for
11:33 am
accuracy of matching. the entire system is something that's a little bit outside my purview. >> i understand the value of the technology, but i think we need to have some clearer regulations, and guidance is essential to prevent the abuse of data collected and to protect our privacy. while i appreciate the gao's recommendations, i think we need some more teeth to ensure that those are implemented. thank you. i yield back. >> mr. roy? >> thank you, mr. chairman. thank you to my colleague from georgia for letting me go now. i appreciate you all taking the time to testify today and your service to our nation. as a former federal prosecutor, i appreciate the commitment to law enforcement and what you're trying to do to keep the united states and its citizens safe. i think there have been
11:34 am
important issues on privacy raised on both sides of the aisle. one of the lines of questions from my colleague from michigan asked a little bit about real-time use of this technology and i want to explore that a little further, maybe asking a simple, maybe not all that informed, question. is the united states government, in any way, based on the knowledge of anybody at the table, using facial recognition technology on american citizens without their knowledge today, and if so, where and how? miss del greco? >> the fbi systems are not designed for real-time capture of the american people. >> so to your knowledge, the united states government, to your base of knowledge, is not using facial recognition technology to capture information of american citizens, using it and processing it without their knowledge? >> i can speak on behalf of the
11:35 am
fbi. we would require it for criminal purpose only in accordance with a law enforcement purpose. >> mr. gould? >> sir, with respect to tsa, we're doing it solely with the passenger's consent. the cameras are visible and the passenger needs to assume a position in front of the camera for accurate facial capture. >> any other witnesses? dr. goodwin? are you aware of anything? >> we are not. in the work that we've done, that's been beyond the scope. >> okay. sir? >> it's also outside of the scope. >> are there any plans, do you know of any plans, to use that technology without consent of the american -- an american citizen? >> not with respect to tsa, sir. >> fbi? >> the fbi will not develop technology outside of a criminal purpose. >> miss del greco, a quick question, you said in response to mr. amash about real-time use, quote, not for criminal
11:36 am
justice purposes. can you explain, expand on that caveat? >> that we only collect a photo in conjunction with criminal justice. our law enforcement partners, the state and local, federal entities, must be authorized to have access to our system and they must have a criminal justice purpose in order to search our system, the ngi interstate photo system. >> i would like to yield to my colleague from louisiana. >> i thank my colleague for yielding a bit of his time. miss del greco, according to fbi records, in 2017, 10,554,985 criminal arrests were made, and ran about a 59% conviction rate. i think that this body and the american people witnessing must be reminded that every american that's arrested by probable
11:37 am
cause, a standard of probable cause, much less than that of conviction. is that true? >> that is correct, sir. >> would the totality of circumstance and corroborative evidence be used in the course of a criminal investigation, and any technology, including facial recognition technology, would that be added as a tool in the toolbox to add to perhaps the strength or weakness of that case? >> state and local entities have the option to submit a probe photo in accordance with a criminal investigation. >> moving quickly, one of my colleagues mentioned that there was a 70% match on a subject and that's the subject that was arrested, versus a 90% match that was not arrested. does not arrested mean not investigated? >> i'm not aware of that, sir. we provide candidates back to law enforcement. >> during the course of a regular criminal investigation is reasonable suspicion grounds for investigation of any
11:38 am
citizen? >> i am not a law enforcement officer, sir. >> all right. well, i am and it is. probable cause is a standard for arrest; beyond a reasonable doubt or a shadow of a doubt is a standard for a conviction. i very much appreciate everyone's testimony today. this is an emerging technology. mr. chairman, mr. ranking member, we should watch this technology closely and protect the rights of american citizens. we should also recognize this can be a valuable tool for law enforcement and to fight crime in our country, and i yield. >> ms. norton. >> thank you, mr. chairman. look, we recognize, i think, the advancements that science is making. perhaps this particular facial recognition advancement
11:39 am
such as it is, is not ready for prime time, and that's what we're trying to ascertain here. yet it's being used as if it were. the fbi, dr. goodwin, uses this facial recognition system but cannot tell us, we've learned today, much about its accuracy. the gao, and we rely heavily on the gao, of course, has said doj officials stated there is value in searching all available external databases, regardless of the level of accuracy. that's where my question goes. regardless of their level of accuracy.
11:40 am
the fbi has said, miss del greco, that the facial recognition tool is used for investigative leads only. now, what's the value of searching inaccurate databases? i can see the downside. mistaken identity. misidentification. why is -- why is there any value in searching whatever database appears to be the case, is available to you, based on investigative leads only? >> the fbi uses our trained
11:41 am
face examiners to look at candidates that come back on a search for an fbi open investigation, and it evaluates all of the candidates and provides the search back. >> can an investigative lead lead to conviction? >> the fbi field office and the fbi agent is the one that's primary to that case. they know all the details about the case. we would not be making that decision. it would be up to them to use that as a tool. >> it could, as far as you know, lead to a conviction or maybe not? >> that's correct, ma'am. i agree. >> so we -- not only could it lead to a conviction, it might lead to inaccurate convictions? >> we hope not, ma'am. >> it could lead to a conviction, and perhaps they would be inaccurate, since we're using the database for investigative purposes alone, i'm sorry, not alone but as well. here's what bothers me most,
11:42 am
there's been a prominent study done, which included an fbi expert, miss del greco, that found leading facial recognition algorithms, like ones sold by amazon, microsoft, and ibm, were more inaccurate when used on darker skinned individuals, women, and people aged 18 to 30 when compared with white men. so we do have some indication when we look at what our population is. dr. romine, do you agree with the findings of this study? >> there are demographic effects. this is very time dependent. it depends on the time at which this evaluation was done and the
11:43 am
algorithms evaluated. nist is prepared to release demographic information or -- >> what my concern is, that there is excessive overpolicing in minority communities, i understand why, but it has resulted in african-americans being incarcerated at four times the rate of white americans. african-americans are overrepresented in mugshots that some facial recognition systems scan for potential matches. miss del greco, do you agree that both the presence, the overrepresentation of african-americans in mugshot photos, and the lower accuracy rates that facial recognition
11:44 am
systems have on assessing darker skinned people, such as african-americans, make it possible that false convictions could result from the fbi's use of these external systems if they are not audited? >> the lady's time has expired. you may answer the question. >> the fbi retains mugshot photos in our repository, but they are associated with a criminal arrest and a finger -- ten print fingerprint. we do provide a -- >> are they audited? >> yes, they are, ma'am. we have a robust audit process with the state, federal, local, and tribal agencies. we send auditors out to those agencies and look at security requirements in accordance with fbi security. we look at the policies, the procedures, and the standards to
11:45 am
ensure that they are receiving required training and following our process. >> thank you, mr. chairman. i think we all are very much aware of the effects of surveillance on people. their behavior certainly changes. noncriminal speech, noncriminal behavior, it alters the way people behave when there's surveillance. even as a pastor for many years, i know of the prying eyes of the irs and how that has had a chilling effect on speech even within non-profit organizations and churches. so this is an extremely serious thing when we know the possibility of surveillance is out there. miss del greco, has the fbi ever -- you mentioned a while ago the facial services unit or something to that effect. does that particular unit or any other unit in the fbi farm for
11:46 am
images, photographs, or other i.d.-type information on american citizens through social media or whatever platform? >> we do not, sir. >> does the fbi, have they ever purchased from a third-party contractor or wherever else images, photographs, i.d. information? >> no, sir. the fbi retains only criminal mugshot photos. >> okay. mr. chairman, i would like to ask that this be submitted to the record, an article by joseph cox of vice news on sociospyder, a tool bought by the fbi to monitor social media. >> without objection so ordered. >> i would like to submit for the record an archived copy of the sociospyder.com web domain
11:47 am
that states that this software is used for on-demand or automated collection of social media user data. i would like that to be submitted. >> without objection so ordered. >> thank you, mr. chairman. finally, i would also like to submit for the record the purchase order logs of the fbi's sociospyder software and service agreement and individual user license purchased by allied associates international. >> without objection so ordered. >> thank you. >> miss del greco, there has been software purchased by the fbi and i don't know where you're coming from to not be aware of that. >> sir, i would have to find out from the other entities within the fbi. i represent the technology that's used for criminal justice purposes at the division. >> there's another avenue of facial recognition technology
11:48 am
taking place within the fbi that you know nothing about? >> not that i'm aware of, sir. >> if you don't know anything about this, there is. >> we can look into it, sir. >> we most certainly can. so are you saying then that to your knowledge, there's no software, although there is, that's being used by the fbi to collect information on u.s. citizens? >> i'm only aware of the use of our system for criminal justice purposes, sir. >> okay. and your system would include the systems of the driver's license databases of multiple states? >> our system does not retain driver's license photos. >> you have access to it. you have your internal system and this system that you can access. >> we do not have direct access. we -- >> a 2016 study by georgetown's law center on privacy and technology found that you do have access to that. a total of 117 million americans
11:49 am
which comes to about one out of every two adults, that you have access to that information? >> that is incorrect, sir. we disagree with that. the fbi, through an active fbi investigation, can submit a photo to our face -- >> how many do you have access to? >> we can submit a probe photo to the state dmvs and they provide a candidate back. we do not have access to the photos. >> the study disagrees with you. there's a precrime database, if you will. i have a little bit of time and want to yield to the ranking member the remaining time. thank you. >> i thank the gentleman. miss del greco, just to go to this real-time surveillance. so has the fbi or any other federal agency to your knowledge ever used real-time surveillance, sort of a continuous look at, say, a group of people at some location? has that ever been done? >> no, sir, not to my knowledge. >> to your knowledge no other federal agency has done that,
11:50 am
the irs, any other agency has not done that? >> i cannot speak on behalf of the other agencies. >> real quick if i could, mr. chairman, the numbers, dr. goodwin. how many photos does the fbi have access to in just their database? >> in just their database, a little over 36 million. >> 36 million. and in the databases they can send information to, that are screened and used and there's interface, interaction with at the state level, what is the total number of photos in all those databases? >> so access to photos across all of the repositories, about 640 million. >> 640 million photos. only 330 million people in the country. wow. all this from -- the fbi has access to 600-some million photos, and this is the fbi that
11:51 am
didn't comply with the five things they were supposed to comply with when they set up the system. >> so if you think about the face services system and then all of the searchable repositories, that's over 640 million photos, and the fbi really only searches for criminal. they're looking for the criminal photos. they're doing -- they're looking through all of this for their criminal investigations. across all the repositories, we're talking over 600 million. >> thank you. >> we're talking about people who have been arrested, right? not necessarily convicted. is that right? >> arrested, by searching these databases, sir? >> yes, ma'am. >> we would have to go back and do a survey. we do every 90 days go out to state and local agencies to see if there's any input they could provide to us. we do know there are arrests made, but it's not on the identification of the photo. it's a tool to be part of the
11:52 am
case that they have. >> yes. >> if i could add one more thing about the 640 million. so most of those are civil photos, but those are -- >> that's what scares me. >> say that again. >> those are primarily civil photos. so we're talking about passports and driver's licenses. >> just regular everyday people. wow. ms. kelly. >> thank you, mr. chairman, for holding this second hearing on facial recognition. with the government's use of facial recognition increasing, it is important this nascent technology is not rushed to market and that all communities are treated equally and fairly. dr. romine, in your testimony, you mentioned the report that is due for publication this fall on demographic effects and mug shots. can you talk a little bit about this report and your objectives? >> the objective is to ensure complete transparency with regard to the performance of the algorithms that we evaluate and
11:53 am
to see if we can use rigorous statistical analysis to demonstrate the presence or absence of demographic effects. that statistical analysis has not been completed yet. we have preliminary data that have suggested that demographic effects such as difference in age, across ages, difference in sex, and difference in race can affect or can have differences in terms of the performance of the algorithms. however, the increased performance across the board for the best performing algorithms is, we expect, diminishing that effect overall. so in the fall we'll have the final report on demographic effects.
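The per-group error analysis described here can be illustrated with a short sketch. This is not NIST's actual methodology; the match scores, group labels, and threshold below are all hypothetical, but they show the basic idea of tabulating a false match rate (FMR) separately for each demographic group, which is the kind of statistic that would reveal a demographic effect.

```python
# Illustrative sketch only: given impostor comparisons (pairs of DIFFERENT
# people) labeled with a demographic group, tabulate the false match rate
# per group at a fixed similarity-score threshold. Unequal rates across
# groups would indicate a demographic effect.
from collections import defaultdict

def false_match_rates(impostor_scores, threshold):
    """impostor_scores: iterable of (group, score) pairs; a score at or
    above threshold counts as a false match."""
    totals = defaultdict(int)
    false_matches = defaultdict(int)
    for group, score in impostor_scores:
        totals[group] += 1
        if score >= threshold:
            false_matches[group] += 1
    return {g: false_matches[g] / totals[g] for g in totals}

# Hypothetical data: group_b sees twice the false match rate of group_a.
scores = [("group_a", s) for s in (0.2, 0.4, 0.95, 0.3)] + \
         [("group_b", s) for s in (0.1, 0.92, 0.97, 0.3)]
rates = false_match_rates(scores, threshold=0.9)
print(rates)  # group_a: 1/4 = 0.25, group_b: 2/4 = 0.5
```

A real evaluation would also need the matching direction of this statistic (false non-match rates on genuine pairs) and a significance test before concluding an effect is present, which is the rigorous analysis the witness says is still underway.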
11:54 am
>> i commend you for looking into this. when you're doing evaluations for companies, are you testing for demographic consistency? >> we do -- we don't test for specific companies on their behalf. we test or evaluate the algorithms that are submitted to us through this voluntary program. so we don't test specifically for individual algorithms' demographic effects. we're talking about the demographic effects across all the algorithms that are submitted. >> and what are you doing to make sure no categories of people are suffering from lower rates of accuracy? >> the best we can do in that is to ensure transparency and public access to data about the level of the demographic effects. we have no regulatory authority
11:55 am
to do anything about that other than just make the data available for policymakers to make appropriate decisions. >> did you have a comment about that? no, okay. mr. gould, tsa has been partnering with cbp on biometrics for international travelers. how much training did operators receive prior to the pilot program at jfk and l.a.x.? >> the training was significant. i would say multiple days of training in how the system works, how to analyze the match results, and how to effectively use the system. >> what were the top complaints that were received during this pilot -- >> the complaints from the public? >> the top complaints, yeah. >> ma'am, i'm really honestly not aware of any specific category of complaints that rose to the surface. in general, the public seems to enjoy the enhanced passenger experience from using biometrics. >> any complaints by employees? >> i would say employees in
11:56 am
general, when you introduce new technology, the change can be somewhat challenging to use. but having just been down to atlanta and talked to many of the operators down there as well as the federal security director in charge of the airport, they embrace the technology and find it to be a significant enhancement to security at the checkpoint. >> okay. the report on disparities is due on july 2nd, 2019. are you on schedule for publication, and are there any previews you can share? >> i don't have any previews available that i can share. >> the report has been completed in accordance with section 1919 of the tsa modernization act, correct? >> the report has been compiled and it's on its way through the department to congress, yes, ma'am. >> thank you very much. i yield back. >> thank you. >> thank you very much. mr. meadows. >> thank you, mr. chairman. i'm not going to beat up on you, but i want to come back and give you two pieces of advice.
11:57 am
one is -- and it's the same advice i give to every witness that sits in that seat right next to gao. if gao isn't happy, i'm not happy. so here's what i would recommend on the five outstanding things: that you work with gao to close those out, the five recommendations that they have. are you willing to do that? >> absolutely, sir. >> all right. the fact that you only closed one of them out last week prior to this hearing is what i understand. is that not accurate? i can tell you were smiling, so you didn't agree with that statement. >> not that i disagree. we have been completing audits. we completed 14 of the 21, and i think gao felt that was enough to satisfy the issue. >> all right. well, dr. goodwin, if you'll report back to this committee, what i would like in the next 60 days is the progress we're making. that's as gracious as i can be when it comes to that. listen, we want you to have all the tools to accurately do what
11:58 am
you need to do. the second thing that i would mention is you mentioned not having any real-time systems. yet we had testimony a couple weeks ago from georgetown that indicated the chicago police department and detroit police department have real-time. they purchased it, where they're actually taking real-time images. do they ping the fbi to validate what they've picked up in real time against what you have in your database? >> i mean, there are authorized law enforcement entities that have access to our system. we train them. we expect them to follow our policy. we audit them. >> i get that. but what i'm saying is, i'm concerned about real-time. you have police departments in chicago and detroit that are doing real-time surveillance and getting you to authenticate that through your database. is that correct? >> they submit a probe photo in accordance with a criminal -- >> from real-time surveillance. >> not to my knowledge. i'm not aware -- >> well, that's opposite of the
11:59 am
testimony. what i want you to do -- and did they purchase that real-time surveillance technology with federal funds? so if you'll get back to the committee on that. >> absolutely. yes, sir. >> all right. thank you. mr. gould, i'm going to come to you. some of your testimony -- and actually, i've been to dulles where we looked at cbp, actually looking at real-time facial recognition when travelers come in and out. so i guess you're saying that right now you're not doing that at dulles anymore. is that correct? because you mentioned only atlanta and -- >> sir, i can't comment on the cbp program because they do it for entry and exit purposes for international travel. tsa is not doing it there. >> okay. so here's what i would recommend. out of all the priorities the tsa has and all the inefficiencies that this committee and other committees have actually identified, facial recognition certainly cannot be the top priority in terms of what we're looking at to make
12:00 pm
sure our traveling public is safe here. would you say that's the top priority that you have in terms of your achilles heel? >> sir, positive identification of travelers is -- >> that's not the question i asked. is that the top priority? yes or no. >> it's one of multiple significant priorities for tsa. >> what's your top priority? >> i would say -- >> there can only be one top, mr. gould. this is a softball question. >> i would say at this point enhanced property screening at the checkpoint, ct machines for the checkpoint to do better assessment of carry-on baggage. >> you mentioned the fact you potentially have actually taken photos of american citizens dropping off their bags. is that correct? in my questioning earlier, you talked about the fact you might have -- part of tsa is looking at the screening process, where it's not just one-to-one. you're actually taking photos of people at bag drops. is that correct? >> only if they choose to participate and only in one location, and that's terminal f in atlanta. >> so you can guarantee --
12:01 pm
because i've flown out of terminal or concourse f, i think is what it is. but i've flown out of that on delta. so you can guarantee that i was not photographed? because i've never given anybody my permission on international travel to my knowledge. so can you guarantee that i'm not picked up in that? >> unless you were photographed while you were dropping off the bag -- >> that's my question. but that's my question. i gave no one permission to take my picture while i'm dropping off my bag. i'm an american citizen. what rights, what legal rights do you have to take that photo? >> you should not have been photographed. >> so you can't guarantee that i wasn't. so here's what i would recommend, mr. gould. i am all about making sure that we have screening, but i can promise you i have gone through screening more than most americans, and there are inefficiencies and tsa problems that have nothing to do with facial recognition.
12:02 pm
until you get that right, i would suggest that you put this pilot program on hold, because i don't know of any appropriations that specifically allowed you to have this pilot program. are you aware of any? because you keep referring back to a 2001 law. >> yes, sir. >> i'm not aware of any appropriations that have given you the right to do this pilot program. >> i'm not aware of any specific appropriations. >> exactly. so i would recommend that you stop it until you find out your statutory authority. i yield back. >> thank you very much. before we go to ms. lawrence, let me follow up on the gentleman's request of ms. del greco and dr. goodwin. one thing i've noticed after being on this committee for 23 years is that what happens so often is people say they're going to get things done and
12:03 pm
they never get done. so mr. meadows, in the spirit of efficiency and effectiveness, has made a very reasonable request that you get together so that we can get some of these items resolved. so i'm going to call you all back in about two months maybe. i'll figure it out. because i'm worried that this is going to go on and on. and in the meantime, i'm sure we'll be able to come up with some bipartisan solutions. but the american citizens are, i think, being placed in jeopardy as a result of a system that's not ready for prime time.
12:04 pm
and so we'll call you all back. so i hope that you all get together as soon as possible. again, i say this because i've seen it over and over again. we'll be in the same position or worse in three years, five years, ten years. by that time, so many citizens may have been subjected to something that they should not be. with that, i call on -- >> mr. chairman, i just want to say, i appreciate your leadership on that and appreciate your follow-up. >> no problem. i now call on distinguished lady from michigan, ms. lawrence. >> thank you, mr. chair. doctor, do you think that third-party testing is important for the safe deployment of facial recognition technology? and i want you to know that i sit on the criminal justice appropriations committee and
12:05 pm
funding for nist is something that i have a responsibility for. so i would really like the response to these questions. >> i think independent assessment of new technologies, particularly if they're going to be used in certain ways is an essential part and one of the things we're privileged to do. >> and how dependent are government agencies on nist's findings? how dependent? >> it's hard for me to assess that. i think we certainly have collaborative relationships with dhs, with fbi, with other federal agencies, part of our statutory requirement is working with other agencies on advancement of technologies and evaluation of technologies. >> is there a way that we can
12:06 pm
move forward that you can do an assessment so that we would know when we're talking about the findings, which is a critical factor right now, is there a way we can move forward so we can assess how -- what is the role that you play -- that is played by the third party? >> with respect to facial recognition, we have ongoing evaluations on a rolling basis. so participants can submit algorithms at any time. we continue to provide open, public, transparent evaluation methodology so that everyone, federal agencies and the public and private sector, can see the results of our testing and make determinations on effectiveness of the algorithms.
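The rolling, open evaluation described here scores one-to-many identification algorithms by how often the correct identity appears among the top-ranked candidates returned for a probe image. A minimal sketch of that rank-k hit-rate metric, using entirely hypothetical probe and candidate data (this is illustrative only, not NIST's actual evaluation code):

```python
# Illustrative sketch of a rank-k identification metric. All names and
# data below are hypothetical, not NIST's evaluation pipeline.

def rank_k_hit_rate(results, ground_truth, k):
    """Fraction of probes whose true identity appears in the top-k candidates.

    results      -- dict: probe_id -> ranked list of candidate identity ids
    ground_truth -- dict: probe_id -> true identity id
    k            -- candidate-list depth reviewed (e.g. 1 or 50)
    """
    hits = sum(
        1 for probe, candidates in results.items()
        if ground_truth[probe] in candidates[:k]
    )
    return hits / len(results)

# Hypothetical output of a submitted algorithm: ranked candidate lists.
results = {
    "probe1": ["idA", "idB", "idC"],
    "probe2": ["idX", "idY", "idZ"],
    "probe3": ["idQ", "idM", "idN"],
}
truth = {"probe1": "idA", "probe2": "idY", "probe3": "idN"}

print(rank_k_hit_rate(results, truth, 1))  # only probe1's true id is ranked first
print(rank_k_hit_rate(results, truth, 3))  # 1.0: all true ids appear in the top 3
```

Reporting the hit rate at several depths is what makes later figures in the hearing, such as a rank-1 accuracy versus a rank-50 accuracy, comparable: it measures how much reviewing a longer candidate list helps.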
12:07 pm
>> which organizations are currently equipped to test new facial recognition technologies? >> we're certainly equipped to do that at nist. i don't have any information about other entities that might be equipped to do that. >> do you believe that nist currently has sufficient funding and resources to carry out your work as the standard bearer of the facial recognition industry? >> yes, we have sufficient resources today to be able to execute the program that we have in biometrics. >> to carry out -- that's the word you're saying. this is evolving and we're looking at the challenges.
12:08 pm
do you have enough funding for the r&d and for the checks and balances for you to be the standard bearer of the facial recognition industry? nothing frustrates me more than for you to come before congress and say i have everything i need and then when you don't do the job, well, we didn't have the funding. so i'm asking this question, and i need you to be very honest with me. >> i would make two remarks. one is we have a long track record of delivering high-quality evaluations in biometrics for nearly 60 years. the second part of it is it's a bit awkward for me in front of congress or any federal official to speak about funding levels. i'll just make the comment that any research organization can do more with more.
12:09 pm
i'll leave it at that. >> well, for me to do my job, i have to have facts that are accurate, and you have to have a plan and a directive. i just want to ask if anyone on the panel wanted to comment on the organizations and the ability to accurately test new facial recognition technologies. are there any comments from any of you? no? thank you. >> thank you very much. ms. miller. >> thank you, chairman cummings and ranking member jordan. thank you all for being here today. america has been a leader and an innovator in the technology sector. american companies have pioneered many of the technologies deployed around the world. however, as this sector continues to grow, we need to ensure that our government agencies are accurately
12:10 pm
deploying this technology within the bounds of law. this past week i was in china and saw facial recognition technology deployed on a massive scale from the moment i was getting ready to get on the airplane. there were cameras everywhere. alibaba recently instituted a program where customers can smile to pay using facial recognition technology. i also saw cameras at street crossings that can pinpoint certain individuals who are breaking traffic laws. it was rather daunting to see the government shaming individuals so publicly, which is a stark contrast to our privacy and liberty in america. they would flash your face right there. seeing this use of facial recognition technology in china poses many questions to the united states about the appropriate use of this technology. ms. goodwin, dr. goodwin, what
12:11 pm
steps can our government take to ensure facial recognition technology is being deployed in a way that is accurate? >> so thank you for that question. you know, i will always go back to the recommendations that we made when we did this work a few years ago that doj is still working through. accuracy and transparency are key and vital when we're talking about this technology, as well as just making certain we are protecting privacy rights. to go back to the recommendations, we want doj to pay more attention to the list sizes they're testing. we want them to regularly assess the ngi/ips, whether that information is accurate. we also want them to assess and have some understanding of whether the information that they're getting from their external partners is also accurate. >> thank you. to your knowledge, has the fbi had any misidentifications of
12:12 pm
individuals when utilizing facial recognition technology? >> i'd like to go back to the statement by dr. goodwin. we did test all -- since the last hearing in 2017, the fbi did test all of the list sizes and saw improvements in the accuracy. we conducted the facial recognition vendor test with nist and are implementing a new algorithm. and we work continuously with our state and federal and local partners on their use of our system. we've also commissioned nist to do a 2019 and onward -- it's called ongoing facial recognition test, where we'll be able to test the accuracy of the system yearly. with regard to misidentification, i am not aware of any. thank you. >> okay. then basically, my next question sort of falls right in line. does the fbi have any plans to assess the rate of misidentifications generated by the next generation
12:13 pm
identification interstate photo system? >> so the system was designed to return two or more candidates. we provide an investigative lead back to law enforcement. we require training by law enforcement to follow the ngi interstate policy and implementation guide and the facial identification scientific working group standards. so anyone running a search through the ngi interstate photo system must comply with the policies and standards, and they're audited by our fbi annually. >> can you discuss the regulations in place that allow for an agent to utilize facial recognition technology and how strictly these regulations are enforced? >> i do know for the fbi face services unit, an fbi field office must have an open assessment or an active investigation, and they must follow the attorney general guidelines associated with that
12:14 pm
for us to be able to receive a probe photo from them and then submit the probe photo for a search. >> okay. dr. goodwin, to your knowledge, has the fbi been adhering to these regulations? >> so we're working very closely with the fbi, if i can go back to something said earlier. so the testing that they're currently doing, the new information that they're providing, until we see that, we won't be closing our recommendations. we need to make -- we need to make certain that they are meeting the recommendations as we have put forward to them. >> okay. thank you. i yield back my time. >> mr. gomez. >> thank you, mr. chairman. in the history of this country, we've always had this debate and this goal of trying to balance security with liberty. but in the era of facial recognition, i feel that we're
12:15 pm
stumbling into the future without really understanding how much liberty we're giving up for how much security. it's really with that understanding we have to kind of set up guidelines that really dictate the use of this technology. so that's where my concern comes from. this is pride month. june is pride month. i think about the transgender and nonbinary communities. we've seen reports that show that black women are more likely to be misidentified than any other group. so when you layer on top of that transgender, nonbinary, black individual, what happens to those results? dr. romine, have you seen any data when it comes to the lgbtq
12:16 pm
community, specifically the transgender community? >> we haven't done an analysis of accuracy rates for the transgender community. i'm not sure how we would obtain the relevant data that we can use to do that. but i am aware of -- i've been made aware of concerns in the transgender community about the potential for problematic use here. >> okay. now, i appreciate that. a lot of this also revolves around training. i know nist has pointed out that people are likely to believe computer generated results. those who aren't specially trained in facial recognition have problems in identifying people they don't know, even if they perform face identifications as part of their
12:17 pm
work. so i'm kind of keeping that in mind with my questions i'm about to ask. first, what's the confidence interval level the fbi uses when it comes to running the program for the matches? is it 80%? 85%, 95%? >> our quoted accuracy rate -- and we don't have matches. let me clarify that first, sir. it's an investigative lead. it's two or more candidates. our system is not built to respond to one response. currently, we have an 85% accuracy rate. although, since the last hearing -- >> that's not what i'm asking. i'm asking when you run the program, is it set to a high level that it needs to be accurate to a 95% confidence level, that the computer recognizes that this individual is 95% likely to be this person, or is it 80%? like amazon sells their program at 80% default.
12:18 pm
>> what do you run your program at? >> because we don't conduct an identification match, we don't look at that, sir. we have an accuracy rate we rely on. we are currently implementing the new nist vendor recognition test results at 99.12% at a rank one, and it's 99.72 at a rank 50. those are the new -- that's the new algorithm. because it's not a true identification, we don't report that. >> okay. how does the fbi combat human tendency to trust computer generated results? >> through the testing with nist for sure. then we also use other agencies and entities, universities to provide testing results to us. >> do you train the fbi personnel to perform facial comparisons of persons that are unknown to them? >> we receive probe photos from an active investigation from the fbi field office, an fbi agent.
12:19 pm
they process that probe photo against our mug shot repository and receive a candidate back. they are trained to evaluate. >> so is the fbi training personnel on the inaccuracy and biases of algorithms? >> bias? no, sir. >> the fbi does publish -- and why is that? >> i think the employees -- i mean, our system doesn't look at skin tone and features. it's a mathematical computation that comes back. they're to look at the mathematical features of the face. >> okay. i understand. you're basically describing facial recognition technology. but outside studies have shown there is a bias when it comes to certain populations, that the error rate is a lot higher. were you aware the aclu conducted a match of different members of congress at an 80% confidence interval level
12:20 pm
and members of congress, including myself, were falsely matched with mug shot photos? >> so the technology you're referring to is an identification. that's a match. we do not do that. >> so you do broader. >> we do 2 to 50 candidates back. our employees look at two candidates or more. we do not look at one-to-one match. it's not a match. >> all right. the fbi publishes that it trains third parties in a manner consistent with the guidelines and recommendations outlined by the facial identification scientific working group. the facial identification scientific working group does not endorse a standard certifying body for facial comparison, in contrast to the ten-print certifications for personnel that analyze fingerprints. these programs require hours of training before a person can be certified. since there's no formal certification process that the working group endorses, what standards does the fbi require?
12:21 pm
>> we require all law enforcement entities that have access to the interstate photo system to follow the fbi's policy and implementation guide and the standards. they have to follow both. >> the gentleman's time has expired. thank you very much. >> thank you, mr. chairman. >> ms. pressley. >> thank you, mr. chairman. it has been abundantly clear that facial recognition technology is flawed by design, unlawfully producing false matches due to algorithmic bias, including of everyday americans. in fact, even members of congress -- representative gomez was one of those and was just speaking to that. there's growing and i believe credible concern over the unauthorized use of this technology in public spaces, such as airports, schools, and courthouses. these systems could certainly be subject to misuse and abuse by law enforcement.
12:22 pm
we know this technology is often used without consent. given that there are no real safeguards, no guardrails here, and this is not fully developed, i just want to take a moment to say that i appreciate the leadership of the city of somerville in my district, who have passed a moratorium on this surveillance software because it is not developed and there are no safeguards and no guardrails. much of my line of questioning has already been asked, but i did just want to pick up on a couple things in the space of consent, because i wanted to get some accuracy questions in and better understand for the purposes of the record here. mr. gould, do you keep data on how many people opt out of use of the facial recognition technology? >> ma'am, i'm not aware that we're actually collecting data
12:23 pm
on people who choose not to participate. i don't think we're collecting it. no, ma'am. >> okay. so you have no idea how many people have opted out of previous tsa facial recognition pilot programs? >> no, ma'am. >> okay. do you know how many passengers were notified of tsa's use of facial recognition technology? >> ma'am, the notification at the airport consists of signage and also verbal instructions from the officers. so if they're in a lane where facial technology is being piloted, i would say 100% of the people would be aware that it's being used. and they actually have to assume a suitable pose to actually have the camera capture their image. >> so again, how can a -- so if this is based on signage, which in many ways can be arbitrary, how are folks even aware of the option to opt out other than signage? and how do they opt out? >> it's signage.
12:24 pm
it's announced. if you'd like to have your picture taken for your identification, you know, please stand right here. otherwise, can i please see your credential, your hand-carried identification. >> okay. and is that communicated in multiple languages? >> for the purposes of pilot, ma'am, it has not been communicated in multiple languages. >> okay. again, just for the purposes of the record, i guess i overspoke based on my own desires that the municipality in my district, the massachusetts 7th, passed an ordinance to ban but has not yet passed a moratorium. i just wanted to correct that for the purposes of the record. let me just for a moment get back into some questions regarding government benchmarking for facial recognition. are you aware of how many government agencies use or possess facial recognition
12:25 pm
technology? anyone. >> i don't know that answer. >> nor do i. i do also want to put in front of everyone, so we -- the gao does have ongoing work right now looking at the use of frt at cbp and tsa. so we will be following up on the information here. >> okay. and so -- okay. so there isn't -- is there a standardized sort of comparison benchmark as to the accuracy of these programs and how they compare with other programs? >> we are not aware of that as of yet. >> okay. did nist present any red flags to agencies about inaccuracies in any particular system used by a government agency that you're aware of? >> nist doesn't interpret the scientific data in terms of red flags. instead, we just ensure that
12:26 pm
everyone who is using facial recognition technology has access to the scientific data we publish openly about the performance of the algorithms that have been voluntarily submitted to our program. >> okay. all right. i think that's it for me for now. i yield. thank you. >> let me just ask you this, dr. goodwin. you said there's ongoing work. what's happening there? >> so we have ongoing work at the request of both the senate and house homeland committees to look at the use of face recognition technology at dhs and in particular tsa and cbp. we also have ongoing work looking at the commercial uses of face recognition technology. if i could just circle back to congresswoman pressley's comment about consent. there is the senate bill that will look at consent, but it only looks at consent from the
12:27 pm
standpoint of commercial usage, not federal usage. so we have those ongoing jobs. then gao does have a request in to look at face recognition technology across the rest of law enforcement. >> going back to ms. pressley's questions about the whole idea of language, do you all feel comfortable -- i mean, i assume you've looked at tsa already, right? >> we're just starting that engagement. >> so you haven't looked at the pilot program? >> not as of yet. i imagine that will be part of what we examine. but that engagement -- that work just started at gao. >> one of the things i'm hoping you'll look at is that whole question. you know, people in an area, trying to get to where they got to go. a lot of people don't even know what facial recognition is. they don't have a clue. put all the signs up you want.
12:28 pm
then if you got a language problem, that's even more, mr. gould. have you all thought about that? >> yes, sir. i was remiss when i answered the question before. one of the reasons we're doing these pilots is to assess the efficiency with how we communicate with passengers. can the signage be better? multiple languages in certain areas, is that something we should be looking at? all that will be assessed with respect to these pilots before making a decision moving forward. >> ms. tlaib. >> thank you, mr. chairman. i got to tell you, and i hope this is okay, this stuff freaks me out. i'm a little freaked out by facial recognition, mr. chairman. i hope that's okay. >> yeah, that's okay. >> thank you. my residents in michigan's 13th congressional district have been subjected to increased surveillance and overpolicing for decades. the city of detroit rolled out a realtime video
12:29 pm
surveillance program called project green light in 2016 to monitor crime at late night businesses like gas stations and liquor stores. but now the system has expanded to over 500 locations, including parks, churches, schools, women's clinics, addiction treatment centers, and now public housing buildings. without notice or public comments from residents, the detroit police department added facial recognition technology to project green light, which means the detroit police department has the ability to locate anyone who has a michigan driver's license or an arrest record in realtime using video cameras mounted across the city and a database of over 50 million photos. in january of 2019, reports emerged that the fbi had begun a pilot of amazon rekognition, amazon's controversial software that can match faces in realtime video, similar to project green light. rekognition, like realtime facial surveillance programs, has
12:30 pm
dangerously high error rates for women of color as compared to white males. in the 13th congressional district, residents will disproportionately bear the harms of facial recognition misidentification. so what policies does the fbi have in place regarding the use of realtime facial recognition technology? i heard claims you all are not using it, but there's a pilot program, correct? >> no, there is not. for the amazon rekognition software, to the best of my knowledge, verified before i came today, the fbi does not have a contract with amazon for their rekognition software. we do not perform realtime surveillance. >> through the chair, if i may, if you can produce that documentation and that information to our committee, i would really greatly appreciate that. >> we will do so. >> now, can you explain how the fbi -- so the fbi is not currently using amazon rekognition at all. >> we are not. >> good. so in march 2017, nist released
12:31 pm
a report on the accuracy of facial recognition systems. the report found significantly higher error rates for realtime use of recognition, with accuracy rates as low as 60%. so doctor, do you think that the use of realtime facial recognition technology is ready for law enforcement usage? >> that's a judgment that nist is not prepared to make. that's a policy judgment that should be predicated on the best available scientific data, which is our position. >> what does your scientific data say? >> the scientific data verifies that facial recognition accuracy is highly dependent on image quality and on the presence of injuries. both of those things can affect the ability to have -- >> so is there any viable solution to improving the realtime capabilities? >> i can't predict how accurate
12:32 pm
the systems will be in the future as they continue to develop. currently, systems that use facial images that are in profile -- that are not straight on like mug shot images are -- or facial images that are indistinct or blurred have a much lower ability to match. >> dr. goodwin, do you have any information about the inaccuracies? i know you all had several recommendations. but can you talk a little bit more about my question in regards to whether this is fixable? >> so in regards to your question about the amazon rekognition technology, that was not something we looked at for the purposes of our report. so i won't be able to speak to that. >> but in regards to the current usage of facial recognition and its inaccuracy, you all had like six recommendations about transparency and so forth. i was just talking to some of my colleagues. how do you fix something like this where you dump so many
12:33 pm
innocent people into a database? the numbers are -- i heard 411 million. i think i heard from you 600 million people are now in this database that is being used for criminal justice purposes, and i'm not sure what the definition of that is. >> so i'll kind of start at the beginning. for the ngi/ips, there are 36 million photos in the criminal part of that. there are 21 million photos for the civil part of that. and as you look across all of the searchable databases or repositories that face has access to, that's over 600 million. that's what i was talking about earlier. the recommendations that we made, those three recommendations that we made related to accuracy, we feel like this would go a long way toward helping doj better ensure that the data they're
12:34 pm
collecting, the way they're using the information, that that's accurate. as of yet, as you've heard, doj has yet to close those recommendations. we'll work closely with them to get those closed because the issues around privacy and accuracy are very important and vitally important when you're talking about using this technology. >> thank you. and mr. chairman, if possible, this is very important to my district and to others, if we can get some follow-up and confirmation that indeed the current administration does not have any pilot program going on with the amazon rekognition program. >> thank you very much. i don't know if you heard me earlier. we're going to bring folks back in six weeks to two months, somewhere in that area. i'm hoping that before then, they'll have those questions resolved. definitely we'll check back then. ms. ocasio-cortez. >> thank you, mr. chair. in the fourth amendment, our
12:35 pm
founding fathers endowed us with the right -- quote, the right of the people to be secure in their persons, houses, papers, and effects against unreasonable searches and seizures. the fourth amendment guarantees us that these areas shall not be unreasonably intruded upon, with most searches founded upon a warrant. over the last few weeks, whether from the private sector or the public, we've heard about facial recognition technology being used in airports, at protests, being purchased off of social media, et cetera. you're with the fbi. does the fbi ever obtain warrants before deploying the use of facial recognition technology? >> the criminal mug shots are searched by our law enforcement partners, and all photos are collected pursuant to an arrest with a criminal ten-print
12:36 pm
fingerprint. >> and in use of facial recognition, it's beyond just the search of the criminal database, but scanning a person's face, i would say, is akin to searching their face in order to match it to a database. does the fbi ever obtain a warrant to search someone's face using facial recognition? >> we do not do realtime searching. >> okay. do you require your external partners to obtain a warrant? >> they must do so with a criminal law enforcement interest. >> does the fbi use any information from any other agency with respect to facial recognition? >> we share our records with other federal agencies with regard to law enforcement purposes. >> in our may 22nd hearing, chairman cummings stated he was present at the 2015 baltimore protest following the death of freddie gray. at those protests, the baltimore county police department allegedly used facial recognition technology to identify and arrest certain
12:37 pm
citizens present at the protest, exercising their first amendment rights. has the fbi ever used facial recognition deployed at or near a protest, political rally, school, hospital, courthouse, or any other sensitive location? >> no, we have not. >> and do you think that generalized facial surveillance should be permissible? do you think that undermines the first amendment? >> i do think that protecting the american people is extremely important to us. the fbi absolutely wants the best, most fair system. we want to make sure that we're following the guidelines, process, protocols, and standards we put in place for law enforcement. >> okay. thank you. mr. gould, you're with the tsa. the tsa has outlined proposals to collaborate with private companies, including delta and jetblue to develop and implement their facial recognition search
12:38 pm
systems. is this correct? >> ma'am, we've issued a security program amendment to delta to allow them to use biometric identification at their bag drop. in terms of partnering with them to develop the back end matching system, that is something that we're still engaged with cbp on. >> and the bag drop, those are the computers that folks check in and get their boarding pass from? >> that would be -- i would use the term kiosk for that. >> kiosk. >> delta uses that technology at their kiosk. where we have equity is at our check point and the bag drop, where we're required to ensure the passengers match to their bag. >> do individuals know that that is happening and do they provide explicit consent? is it opt in? >> passengers have the opportunity to not participate. >> so it's opt out but not opt in. >> it is. yes, ma'am. >> so it's possible that jetblue and delta are working with the tsa to capture photos of passengers' faces without their explicit opt-in consent.
12:39 pm
>> ma'am, i was down in atlanta last week and watched the delta check-in process, the bag drop process. it was very clear while i was down there that passengers were afforded the opportunity: if you'd like to use facial capture for identification, please stand in front of the camera and we'll do so. there was no automatic capture of passengers or anything like that. >> and this capture is not saved in any way but is -- correct, right? >> no, ma'am. the camera captures the image. the image is encrypted. it is sent to the tvs matching system, which is what cbp uses, solely for the purpose of match. then that match result is sent back to the operator. >> is that captured image destroyed? >> it's not retained at all, no, ma'am. >> so it's sent but not retained. >> it's not retained on the camera, no, ma'am. >> okay. could these companies potentially be using any part of this process to either capture the algorithm or data? >> no, ma'am.
12:40 pm
i don't see that happening currently with the pilots we're doing right now. >> okay. thank you very much. i yield back. >> thank you very much. >> thank you very much, mr. chairman. when we had our hearing on may 22nd in this committee, there was an mit researcher who was testifying about data sets that nist uses and that they may not adequately test for the full range of diversity that is present in the u.s. population. she said, quote, in evaluating benchmark data sets from organizations like nist, i found some surprising imbalances. one prominent data set was 75% male and 80% lighter skinned, or what i like to call a pale male data set, end quote. so can you discuss how representative data sets are when it comes to race, gender,
12:41 pm
and age? >> sure. the data we obtain is from multiple sources. the largest amount of data we get -- first i need to make a distinction between data that we are releasing as part of the ability for vendors to determine whether they are able to submit their algorithms to our system, to our evaluation process. so we provide them with data for that. the rest of our data, the vast majority of it, is sequestered. it is not made public. it is solely for the purposes of evaluation. most of that data is fbi image data that we sequester and protect from release. there are some other image data related to creative commons, to images that we have received with full institutional review
12:42 pm
that involves permissions and also deceased data sets. in all cases, if you look at the full suite of data, it is true that it is not representative of the population as a whole. however, we have a large enough data set that our evaluation capabilities can be statistically analyzed to determine demographic effects of race, age, or sex. and we're in the process of doing that now and will release that report in the fall. >> so i gather that since the last hearing, you've been testing for differential error rates on the facial recognition systems between races and genders. can you talk a little bit more about the error rates of the algorithms you tested between different races and genders?
12:43 pm
>> sure. i can share a little preliminary information, but i want to stress that the full statistical analysis, the rigorous analysis, is not completed yet. the report will be released in the fall that outlines the full conclusions that we have with regard to demographic effects broadly speaking. we can say that there are still remaining differences, even with the extraordinary advances in the algorithms over the last five years. there are still differences remaining that we can detect. we don't yet know whether those differences, whether it's with regard to race, sex, or age, are significant. we don't know yet until we've completed that analysis. >> so you understand the concern. there's sort of two -- at least two levels of analysis that are the themes here today. one is the threshold question of
12:44 pm
whether we like or don't like this technology given the general threat that it can pose to civil liberties. the second theme, recognizing that the technology is barreling ahead anyhow and is being adopted and applied increasingly across many different platforms and uses, is whether it's being developed in a way that ensures that when it's used, it's not being used in a discriminatory fashion, not being applied unfairly, et cetera. and that depends on the algorithms being developed in a way that is respectful of accurate data. and we're not there yet.
12:45 pm
so we're going to be paying a lot of attention. i'm glad the chairman's going to have you all come back. i think he's right that this is sort of a moving target here. we're going to be paying a lot of attention to how the data gets digested and how the algorithms that flow from that data are being applied, whether they're accurate and so forth. so we appreciate your testimony, but obviously this is not the end of the inquiry. with that i yield back. >> a while ago we were told that the basis for a lot of these agreements between the fbi and the states were -- well, the authorization was actually, regulations, whatever, were put together before facial recognition came about.
12:46 pm
we talk about a moving target. so it wasn't even anticipating this. we still haven't caught up. that's part of the problem. thank you very much. mr. jordan. >> thank you. thank you, mr. chairman. i want to thank our witnesses for being here today. appreciate the time and expertise you bring -- that you brought to this important hearing. i think you understand that from both sides of the aisle there's a real concern. i hope you understand how serious i think everyone is on this committee with this issue. i think you've got to understand the framework. i mean, you talked about strict standards in place. there were strict standards in place -- at least people from our side of the aisle view it this way. strict standards on how people go to the fisa court and put information in front of the fisa court. the attorney general of the united states has tapped u.s. attorney john durham to look at potential spying done by the fbi
12:47 pm
of one presidential campaign. so this is the context and the framework in which many on our side see this happening, and it's happening when gao -- not jim jordan, not republicans -- found that when you guys started using this, you didn't follow the e-government act. you didn't do a privacy impact assessment like you're supposed to. you didn't provide timely notice, didn't conduct proper testing, and didn't check the accuracy of the state systems you were going to interact with. so that's the backdrop. that's the framework. so when republicans talk about we're concerned and working with democrats -- and i do appreciate the chairman's focus on two hearings and now a third hearing in looking at legislation that we may attempt to pass here. this is the framework. so i hope you'll tell the folks back at the fbi, you know, we appreciate the great work that fbi agents do every single day protecting our country and stopping bad things from happening and finding bad people who did bad things. but the framework and the context is very serious. and that's why we come at it
12:48 pm
with the intensity that i think you've seen both two weeks ago in that hearing and in today's hearing. so again, mr. chairman, thank you for your leadership on this, and i would thank our witnesses again for being here. >> i, too, want to thank the witnesses for being here for almost three hours. we really do appreciate your testimony. of all the issues that we've been dealing with, this probably will receive the most intense scrutiny of them all. the ranking member referred to the fact we are bringing -- but we also have two subcommittees that are also looking into this. because we want to get it right. it's just that important. so i thank you. without objection, the following shall be a part of the hearing record. face recognition performance,
12:49 pm
role of demographic information, scientific study, dated december 6th, 2012. face-off, law enforcement use of face recognition technology, white paper by the electronic frontier foundation. gao priority open recommendations, department of justice letter to a.g. barr and gao. ongoing face recognition vendor test, part one, verification, nist report. ongoing face recognition vendor test, part two, nist report. face in video evaluation, face recognition of noncooperative subjects, nist report. coalition letter calling for a federal moratorium on face recognition.
12:50 pm
and the coalition letter from privacy, civil liberties, civil rights, investor, and faith organizations, including the aclu, georgetown, the lgbt technology partnership, and the naacp. again, i want to thank our witnesses for being here today. without objection, all members will have five legislative days within which to submit additional written questions for the witnesses to answer, which will be forwarded to the witnesses for their responses. i would ask that our witnesses please respond as promptly as possible. with that, this hearing is adjourned.
12:51 pm
12:52 pm
the complete guide to congress is now available. it has lots of details about the house and senate for the current session of congress. contact and bio information about every senator and representative. plus information about congressional committees, state governors and the cabinet. the 2019 congressional directory is a handy, spiral-bound guide. order your copy from the c-span online store for $18.95. our live coverage will continue here on c-span3 at 2:00 eastern this afternoon when we bring you the hearing on federal response to white supremacy. a house oversight and reform subcommittee talks with officials from the fbi and
12:53 pm
homeland security. again, that starts at 2:00 p.m. eastern. and congress is in session today. the house is meeting to consider the dream act, a democratic proposal to grant legal status and a path to citizenship for undocumented migrants who came here illegally. a confirmation vote is expected at the end of the day on the nomination of the confirmation director. see the senate on c-span2, and a reminder you can watch all these programs online at c-span.org or listen with the free c-span radio app. the house will be in order. >> for 40 years,
