Facial recognition is a system built to identify a person from an image or video. This technology has been around for decades, but its usage has become more noticeable, and accessible, in the past few years as it now powers innovative solutions, such as personal photo applications and secondary authentication for mobile devices. To understand these emerging capabilities, let’s first discuss how facial recognition works.
Facial analysis capabilities, such as those available in Amazon Rekognition, allow users to understand where faces exist in an image or video, as well as what attributes those faces have. For example, Amazon Rekognition can analyze attributes such as eyes open or closed, mood, and hair color. These detected attributes become increasingly useful for customers that need to organize or search through millions of images in seconds using metadata tags (e.g., happy, glasses, age range) or to identify a person (i.e., facial recognition using either a source image or a unique identifier).
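As an illustration, detected attributes can be turned into searchable metadata tags. The sketch below processes a hand-written response modeled on the shape of an Amazon Rekognition DetectFaces result; the field names, values, and the `tags_for_face` helper are illustrative assumptions, not output captured from the live API.

```python
# Sketch: deriving searchable metadata tags from face-analysis attributes.
# sample_response is a simplified, hand-written example modeled on the shape
# of an Amazon Rekognition DetectFaces response; values are illustrative.

sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 25, "High": 35},
            "Eyeglasses": {"Value": True, "Confidence": 98.1},
            "EyesOpen": {"Value": True, "Confidence": 99.0},
            "Emotions": [
                {"Type": "HAPPY", "Confidence": 95.2},
                {"Type": "CALM", "Confidence": 3.1},
            ],
        }
    ]
}

def tags_for_face(face, min_confidence=90.0):
    """Derive simple tags (e.g. 'happy', 'glasses', an age range) from one
    face's attributes, keeping only high-confidence signals."""
    tags = []
    top_emotion = max(face.get("Emotions", []),
                      key=lambda e: e["Confidence"], default=None)
    if top_emotion and top_emotion["Confidence"] >= min_confidence:
        tags.append(top_emotion["Type"].lower())
    glasses = face.get("Eyeglasses", {})
    if glasses.get("Value") and glasses.get("Confidence", 0) >= min_confidence:
        tags.append("glasses")
    age = face.get("AgeRange")
    if age:
        tags.append(f"age:{age['Low']}-{age['High']}")
    return tags

all_tags = [tags_for_face(face) for face in sample_response["FaceDetails"]]
print(all_tags)  # [['happy', 'glasses', 'age:25-35']]
```

Tags like these are what make it practical to search millions of images in seconds, since the search runs over compact metadata rather than over the images themselves.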
Facial recognition is useful across many applications and industry verticals. Today, we see this technology helping news organizations identify celebrities in their coverage of significant events, providing secondary authentication for mobile applications, automatically indexing image and video files for media and entertainment companies, all the way to allowing humanitarian groups to identify and rescue human trafficking victims.
Marinus Analytics, for example, uses artificial intelligence with Amazon Rekognition to provide agencies with tools, such as Traffic Jam, that assist them in identifying and locating victims of human trafficking. Investigators save invaluable time by using image analysis to search automatically through millions of records in seconds, which previously required individual analysis by investigators.
Another example is Aella Credit, a financial services company based in West Africa that provides banking services via a mobile app for underbanked individuals in emerging markets. Using Amazon Rekognition’s ability to detect and compare faces, Aella Credit can provide identity verification, without any human intervention. This simple use of facial recognition allows for more individuals to receive access to banking services than was ever previously possible. You can find other examples of customers using Amazon Rekognition here: Amazon Rekognition Customers.
Facial recognition should never be used in a way that violates an individual’s rights, including the right to privacy, or to make autonomous decisions in scenarios that require human analysis. For example, when a bank uses a tool like Amazon Rekognition in a financial application to verify its customers’ identity, the bank should always clearly disclose its use of the technology and ask for the customer’s approval of the terms and conditions. Regarding public safety and law enforcement, we think that governments should work with law enforcement agencies to develop acceptable use policies for facial recognition technologies that both protect the rights of citizens and enable law enforcement to protect the public’s safety.
In all public safety and law enforcement scenarios, technology like Amazon Rekognition should only be used to narrow the field of potential matches. The responses from Amazon Rekognition allow officials to quickly get a set of potential faces for further human analysis. Given the seriousness of public safety use cases, human judgment is necessary to augment facial recognition, and facial recognition software should not be used autonomously.
As stated by Dr. Matt Wood, “Machine learning is a very valuable tool to help law enforcement agencies, and while being concerned it’s applied correctly, we should not throw away the oven because the temperature could be set wrong and burn the pizza. It is a very reasonable idea, however, for the government to weigh in and specify what temperature (or confidence levels) it wants law enforcement agencies to meet to assist in their public safety work.”
Rekognition face matching is built using ML and computer vision technologies. It works as follows: (1) Locate the portion of an input image that contains the face. (2) Extract the image region containing the head, and align the region so the face is in a “normal” vertical position, outputting cropped face images. (3) Convert each cropped face image to a “face vector” (technically, a mathematical representation of the image of a face). Note that the collections searched by SearchFaces are sets of face vectors, not sets of face images. (4) Compare the source and target face vectors and return the system’s similarity score for the face vectors. See the developer documentation for details of the API calls.
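The comparison in step (4) can be sketched with a toy face-vector similarity. The short vectors and the cosine-similarity metric below are illustrative stand-ins: Rekognition’s actual face vectors are opaque to callers, and its scoring function is internal to the service.

```python
import math

# Toy sketch of step (4): comparing two "face vectors".
# Real face vectors are produced by the service and are much larger; these
# short vectors and cosine similarity are stand-ins for illustration only,
# not the service's actual representation or scoring function.

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

source_vector = [0.12, 0.87, 0.45, 0.33]   # vector from the source face image
target_vector = [0.10, 0.85, 0.48, 0.30]   # vector from a face in a collection

score = cosine_similarity(source_vector, target_vector)
print(f"similarity: {score:.3f}")  # near 1.0 for near-identical vectors
```

The key point this illustrates is that the search operates on numeric vectors, not on stored photographs, which is why collections searched by SearchFaces contain face vectors rather than face images.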
A similarity score is a statistical measure of how likely it is that two faces in an image belong to the same person, as analyzed by Amazon Rekognition. A similarity score of 95%, for instance, indicates that among all the faces Rekognition analyzed, this face had a 95% similarity with the face being searched for. The higher the similarity score, the more likely the two images are of the same identity. That said, even a 99% similarity score does not guarantee a positive match.
That is because Rekognition is a probabilistic system: its determinations are predictions, not findings of absolute accuracy.
This is where the similarity threshold comes into play. A similarity threshold is the lowest similarity score that the application using Rekognition is willing to accept as a possible match. The choice of threshold has a fundamental impact on the search results that are returned: the number of misidentifications (sometimes called ‘false positives’) a customer can afford is a direct result of the threshold setting. Customers select the appropriate setting based on their needs and the use case of their application.
We recommend a 99% threshold setting for use cases where highly accurate face similarity matches are important. In public safety and law enforcement scenarios for example, this is often a key first step to help narrow the field and allow humans to expeditiously review and consider options using their judgment.
On the other hand, many scenarios don’t require human review of Amazon Rekognition responses. For example, in secondary-factor authentication, an employee badge can be paired with a face recognized by Amazon Rekognition at a high (99%) similarity. In a personal photo collection application, where a few incorrect matches can be tolerated, a lower threshold of 80% may be acceptable. Customers can tune the similarity threshold to the specifics of their use case and needs.
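The effect of the threshold choice can be sketched as a simple filter over candidate matches. The face IDs and similarity scores below are invented for illustration; in practice they would come from a face-search response.

```python
# Sketch: how the similarity threshold narrows a set of candidate matches.
# The face IDs and scores are invented for illustration; in practice they
# would come from a face-search API response.

candidate_matches = [
    ("face-001", 99.4),
    ("face-002", 97.8),
    ("face-003", 88.6),
    ("face-004", 81.2),
    ("face-005", 64.9),
]

def filter_matches(matches, threshold):
    """Keep only matches whose similarity meets the application's threshold."""
    return [(face_id, score) for face_id, score in matches if score >= threshold]

# A 99% threshold returns a short list suitable for human review in public
# safety scenarios; an 80% threshold suits tolerant uses like photo apps.
print(filter_matches(candidate_matches, 99.0))       # [('face-001', 99.4)]
print(len(filter_matches(candidate_matches, 80.0)))  # 4
```

Raising the threshold trades recall for precision: fewer candidates are returned, but each one is more likely to be a true match.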
Celebrity Detection is designed to identify potential famous people across different movie scenes and environments. Since celebrities often play different characters (wearing different makeup, wigs, and other alterations to their appearance), this Amazon Rekognition feature has been trained on pre-labeled data to return the most probable matches within a specific list of famous people. By design, this use case allows for a higher number of false positives and should not be used in public safety or law enforcement use cases.
In contrast, Rekognition’s Face Search feature is designed to tell you the precise amount of similarity between two faces – and it can be optimized for precise matches and used in security and public safety applications, such as finding missing children and reuniting them with their parents, authorizing employee access to a building, or identifying and rescuing victims of human trafficking.
These two features are completely different regarding the underlying technology they use, the use cases they solve, and the customers they serve.
Yes. Let’s examine some common misconceptions about facial recognition and how it works.
First, some believe that people can match faces to photos better than machines can. However, the National Institute of Standards and Technology (NIST) recently shared a study of facial recognition technologies that are at least two years behind the models used in Amazon Rekognition and concluded that even those older technologies can outperform human facial recognition capabilities.
Second, as in all probabilistic systems, the mere existence of false positives doesn’t mean facial recognition is flawed. Rather, it emphasizes the need to follow best practices, such as setting a reasonable similarity threshold that correlates with the given use case. Also, one of the advantages of this technology is that it continuously learns and improves, so false positives can be reduced over time.
Today, many successful customers, such as Thorn, VidMob, Marinus Analytics, and POPSUGAR are using facial recognition in simple ways that have a powerful impact.
AWS provides 10-minute tutorials and in-depth documentation with prescriptive guidance to help you get started using facial recognition.
If you suspect that Amazon Rekognition is being used in a manner that is abusive or illegal, or that infringes on your rights or the rights of other people, please report this use and AWS will investigate the issue.