Algorithmic bias occurs when an algorithm produces repeatable, undesirable errors that disadvantage certain groups based on factors that should be irrelevant. Bias in algorithms is typically unintentional and can arise for several reasons: for example, a learning algorithm may be trained on data that excludes a racial group with different characteristics, or a system may be tested by only one racial group during development. One example of algorithmic bias is facial recognition software failing to recognize people of African descent because of darker skin tones or different facial features. This leads to situations where some smartphones' facial recognition features do not work as reliably for darker-skinned people.
Facial recognition software makes decisions by first using a camera to capture a picture of your face; it then maps the geometry of your face and marks unique key features, turning your face into something like a virtual signature. When a person's face is later scanned to access something, a program either verifies that the face matches the face enrolled on the device or searches a database of faces to find the person's specific "virtual signature," which then grants that person access to their personal data.
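The two matching modes described above can be sketched in code. This is a minimal, illustrative sketch, not a real biometric pipeline: the `face_signature`, `verify`, and `identify` functions and the landmark-list input are all hypothetical stand-ins for what production systems do with learned embeddings, and the distance threshold is an arbitrary assumed value.

```python
import math

def face_signature(landmarks):
    """Toy 'virtual signature': flatten and normalize a list of (x, y)
    landmark coordinates so vectors from different images are comparable."""
    flat = [coord for point in landmarks for coord in point]
    norm = math.sqrt(sum(v * v for v in flat)) or 1.0
    return [v / norm for v in flat]

def distance(sig_a, sig_b):
    """Euclidean distance between two signatures (lower = more similar)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

def verify(probe_sig, enrolled_sig, threshold=0.1):
    """1:1 verification: does the scanned face match the one enrolled
    on this device?"""
    return distance(probe_sig, enrolled_sig) <= threshold

def identify(probe_sig, database, threshold=0.1):
    """1:N identification: search a database of named signatures for the
    closest match, or return None if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, sig in database.items():
        d = distance(probe_sig, sig)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None
```

In this framing, unlocking a phone is the 1:1 `verify` path, while looking someone up in a watchlist or photo library is the 1:N `identify` path; the choice of threshold trades off false accepts against false rejects.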
Many companies use a limited pool of subjects when testing facial detection software, which further leads to biases in the algorithms once they are used in practice. Three types of bias that facial detection algorithms may be susceptible to are: an individual's features may not be represented in the body of training images, a dark-skinned individual may have more trouble being recognized than a light-skinned individual, and the software may misidentify who a person really is. Biases such as these make the software and its output undependable.
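One way to see why limited test populations hide bias is to compare aggregate accuracy with per-group accuracy. The numbers below are purely hypothetical, assumed for illustration: when one group dominates the test set, the overall accuracy figure looks strong even though the under-represented group fares far worse.

```python
# Hypothetical recognition results broken down by demographic group.
# group_a is over-represented in testing; group_b is under-represented.
results = {
    "group_a": {"correct": 950, "total": 1000},
    "group_b": {"correct": 30, "total": 50},
}

def accuracy(correct, total):
    return correct / total

# Aggregate accuracy pools every trial together, so the dominant group
# drives the headline number.
overall = accuracy(
    sum(g["correct"] for g in results.values()),
    sum(g["total"] for g in results.values()),
)

# Per-group accuracy exposes the disparity the aggregate hides.
per_group = {name: accuracy(g["correct"], g["total"])
             for name, g in results.items()}
```

With these assumed figures the overall accuracy is about 93%, while the under-represented group's accuracy is only 60%, which is why audits of facial recognition systems report error rates disaggregated by group rather than a single number.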
One possible effect, if this bias is left unchecked, is that some people with facial deformities will not be able to use facial recognition on their phones and computers. Another possible effect is private devices being opened and exposed by someone who should not have access, due to a similarity in people's features. The phone or computer may also fail to recognize members of a certain race if no one from that group was included in the sample data, leaving them unable to unlock the device.