"Face-reading AI will be able to detect your politics and IQ, professor says.
Professor whose study suggested technology can detect whether a person is gay
or straight says programs will soon reveal traits such as criminal predisposition."
"Using photos, AI will be able to identify people’s political views, whether
they have high IQs, whether they are predisposed to criminal behavior,
whether they have specific personality traits and many other private,
personal details that could carry huge social consequences, he said."
"Faces contain a significant amount of information, and using large datasets
of photos, sophisticated computer programs can uncover trends and learn
how to distinguish key traits with a high rate of accuracy. Kosinski’s
“gaydar” AI, an algorithm trained on online dating photos, correctly
identified sexual orientation 91% of the time for men and 83% for women,
just by reviewing a handful of photos.
Kosinski’s research is highly controversial and has faced a huge backlash
from LGBT rights groups, which argued that the AI was flawed and that
anti-LGBT governments could use this type of software to out gay people
and persecute them."
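The "handful of photos" detail matters statistically: even a modestly accurate per-photo classifier becomes considerably more accurate when several photos of the same person are pooled, for instance by majority vote. A minimal sketch of that effect (the 0.81 per-photo accuracy and five-photo pool are illustrative assumptions, not figures from Kosinski's study):

```python
import math

def majority_vote_accuracy(p, k):
    """Probability that a majority of k independent per-photo
    classifications, each correct with probability p, is correct.
    Sums the binomial probabilities of more than half being right."""
    return sum(
        math.comb(k, i) * p**i * (1 - p)**(k - i)
        for i in range(k // 2 + 1, k + 1)
    )

# Illustrative: pooling five photos lifts a 0.81 per-photo
# accuracy to roughly 0.95 under the independence assumption.
print(round(majority_vote_accuracy(0.81, 5), 3))
```

The independence assumption is generous (photos of the same person share errors), so real gains would be smaller, but the sketch shows why aggregating photos can push headline accuracy well above per-photo accuracy.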
"… preliminary results showing that AI is effective at guessing people’s
ideologies based on their faces. This is probably because political views
appear to be heritable, as research has shown, he said. That means political
leanings are possibly linked to genetics or developmental factors, which
could result in detectable facial differences.
Kosinski said previous studies have found that conservative politicians
tend to be more attractive than liberals, possibly because good-looking
people have more advantages and an easier time getting ahead in life."
"Facial recognition may also be used to make inferences about IQ, said
Kosinski, suggesting a future in which schools could use the results of
facial scans when considering prospective students. This application
raises a host of ethical questions, particularly if the AI purports to
reveal whether certain children are genetically more intelligent."
"There are, however, growing concerns that AI and facial recognition
technologies rely on biased data and algorithms and could cause great
harm. This is particularly alarming in the context of criminal justice,
where machines could make decisions about people’s lives – such as
the length of a prison sentence or whether to release someone on bail –
based on biased data from a court and policing system that is racially
prejudiced at every step."