27 September 2023

Could AI prove more trustworthy than people?

Mark Travers* explains why AI-synthesized faces might be perceived as more trustworthy than real faces.


A new study published in PNAS highlights the threat AI-generated faces might pose to society: people tend to find them more trustworthy than real human faces.

“We’ve seen incredible advances in technology and the use of artificial intelligence to synthesize content,” says psychologist Sophie Nightingale, lead author of the research from Lancaster University in the United Kingdom.

“It is particularly exciting but also worrying.”

To understand how AI-generated faces are perceived by humans, the researchers used state-of-the-art software, trained on photographs of real people, to synthesize 400 “artificial” faces.

Then they recruited participants to rate the real and artificial faces on attributes such as trustworthiness.

They also asked participants to guess whether each face was real or computer-generated.

Interestingly, they found that people were generally unsuccessful in telling a real face from a fake one.

Moreover, respondents tended to view the artificial faces as more trustworthy.

“In addition to finding that naive respondents were at chance in determining if a face was real or synthetic, we also found that additional training and feedback improved performance only slightly,” says Nightingale.

“Perhaps most interestingly, we found that not only are synthetic faces highly realistic, they are deemed more trustworthy than real faces.

“As a result, it is reasonable to be concerned that these faces could be highly effective when used for nefarious purposes.”
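
The study’s statistics aren’t reproduced in this article, but the “at chance” claim has a standard form. Below is a minimal sketch, in Python, of how accuracy might be tested against the 50 per cent chance level; the trial counts are hypothetical placeholders, not figures from the paper.

# A minimal sketch of testing classification accuracy against chance.
# The numbers below are illustrative placeholders, not the study's data.
from scipy.stats import binomtest

n_trials = 128   # hypothetical number of real-vs-synthetic judgments
n_correct = 62   # hypothetical number answered correctly (about 48%)

# Two-sided exact binomial test against the 50% chance level.
result = binomtest(n_correct, n_trials, p=0.5)
print(f"accuracy = {n_correct / n_trials:.1%}, p = {result.pvalue:.3f}")

# A large p-value means the observed accuracy is statistically
# indistinguishable from guessing.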

Nightingale offers two possible explanations for why the synthetic faces were deemed more trustworthy.

The first has to do with familiarity.

“Synthesized faces tend to look more like average faces,” says Nightingale.

“This more average appearance is an artifact of how the synthesis technique favors average faces as it is synthesizing a face.

“We also know that people show a preference for average or typical-looking faces because this provides a sense of familiarity.

“Therefore, it might be this sense of familiarity that elicits, on average, higher trust for the synthetic faces.”

Other research shows that people find faces from their own culture to be more trustworthy, which also lends credibility to the familiarity hypothesis.
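
This article doesn’t name the synthesis technique, but generators in the StyleGAN family, a common choice for this kind of work, make the pull toward average faces easy to see via the so-called truncation trick. The Python sketch below is an illustration under that assumption, with the generator itself stubbed out: sampled latent codes are pulled toward the mean code, so the resulting faces skew average.

# A minimal sketch of the "truncation trick" used by StyleGAN-style
# generators; one plausible mechanism for the averageness effect.
import numpy as np

rng = np.random.default_rng(0)
latent_dim = 512                      # typical StyleGAN latent size

w_mean = rng.normal(size=latent_dim)  # stand-in for the average latent code
w = rng.normal(size=latent_dim)       # a randomly sampled latent code

truncation_psi = 0.7                  # values < 1 pull samples toward the mean
w_truncated = w_mean + truncation_psi * (w - w_mean)

# The truncated code sits closer to the mean, so the generated face looks
# more "average" -- and, per the study, perhaps more trustworthy.
print(np.linalg.norm(w - w_mean))            # distance before truncation
print(np.linalg.norm(w_truncated - w_mean))  # distance after truncation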

Smiling might also contribute to the difference in perceived trustworthiness.

This echoes other research showing that emotionally neutral faces are perceived as more trustworthy when their features resemble an expression of happiness.

The researchers were surprised that training people on the differences between AI-generated faces and real faces did little to improve performance.

“At this time, I’m not aware of a reliable way for an average person to identify if a face has been AI-synthesized,” says Nightingale.

“However, we’ll continue to conduct research to try to help.”

A next step for the authors is to consider which computational techniques can be developed to discriminate real from synthetic images, and to think carefully about what ethical guardrails should be put in place to protect people from the dangers these technologies pose.
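
The authors don’t describe a particular detector, but one common family of approaches can be sketched: extract frequency-domain features, where GAN imagery often leaves artifacts, and train a simple classifier on them. Everything below is a hypothetical Python illustration; random noise stands in for real face crops so the sketch runs, and a working detector would need real data and far more care.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def spectral_features(img, n_bins=16):
    # Radially averaged power spectrum: GAN pipelines often leave
    # telltale artifacts in the high-frequency bands.
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = power.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h / 2, x - w / 2).astype(int)
    counts = np.bincount(r.ravel())
    radial = np.bincount(r.ravel(), weights=power.ravel()) / np.maximum(counts, 1)
    return np.log1p(np.array([b.mean() for b in np.array_split(radial, n_bins)]))

# Placeholder data: random noise standing in for grayscale face crops.
rng = np.random.default_rng(0)
real_imgs = rng.random((100, 64, 64))
fake_imgs = rng.random((100, 64, 64))

X = np.array([spectral_features(img) for img in np.concatenate([real_imgs, fake_imgs])])
y = np.array([0] * 100 + [1] * 100)   # 0 = real, 1 = synthetic
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))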

“Given the rapid rise in sophistication and realism of synthetic media (i.e., deep fakes), we propose that those creating these technologies should incorporate reasonable precautions into their technology to mitigate some of the potential misuses in terms of non-consensual porn, fraud, and disinformation,” adds Nightingale.

“More broadly, we recommend that the larger research community consider adopting best practices for those in this field to help them manage the complex ethical issues involved in this type of research.”

*Mark Travers, Ph.D., is an American psychologist with degrees from Cornell University and the University of Colorado Boulder.

This article first appeared at forbes.com.
