Deep and cheap fakes: effects on audiences' attitudes, knowledge, and literacy

This research aims to advance the understanding of deepfake perception by adopting a communicative approach. Specifically, it seeks to help fill the gap in empirical evidence about the psychological relationship users establish with deepfake technology. The work packages will observe user attitudes and knowledge about fake technologies and characterize the impact of deepfakes on users, particularly youths. The project is funded under the NUS Centre for Trusted Internet and Community (CTIC) Collaborative Project Grant (Project ID: CTIC-CP-21-07).

Source: https://www.mdpi.com/1424-8220/21/21/7367/htm

Project Overview

Deepfakes (DFs) are fake visual, audio, or audio-visual productions generated by deep learning methods. Deepfake technology (DT) is still at an early stage of development and use. However, rapid advances in synthesis technology and its growing accessibility already allow users to make videos and clips of individuals doing and saying things they never did or said. Nowadays, users can synthesize an individual's voice from transcripts, produce an entirely new video of a person speaking by lip-syncing synthetic audio to their face, or swap one person's face onto another person's body.

Although developments in DFs may have many benefits, the emphasis is now placed on how they can be used for unethical and malicious purposes and might affect the integrity of many social domains. DFs are mainly considered a powerful form of disinformation, and they might increase the difficulty of differentiating between what is real and what is fake. Debates about DFs mainly focus on their potential for future disruption rather than on examples of their actual effects; such narratives about the impact of DFs rest not on empirical evidence but on discourse. The truth is that DFs are a nascent area of research, and their implications are only starting to emerge. There is little empirical knowledge about DFs; in particular, the psychological processes and consequences associated with DFs remain largely unstudied. Thus, researching DFs from a human communication perspective is both opportune and necessary.

Research Objectives

This research aims to advance the understanding of deepfake perception by adopting a communicative approach. It seeks to help fill the gap in empirical evidence about the psychological relationship users establish with DTs. In doing so, this research offers valuable information to academia and industry and might inform policies and media literacy initiatives. The research is divided into three work packages (WPs), each with its own objectives:

  • WP1) To uncover and measure audiences' attitudes toward and knowledge of DFs: Within this WP, this research aims to identify and measure the attitudes of different audiences toward fake technologies, and to determine to what extent such audiences know about the technical aspects, distribution, and potential effects of fake (deep and cheap) technologies.
  • WP2) To observe the impact of deepfakes on receivers: Within this WP, this research aims to advance the characterization of audiences' perceptions of fake videos and audios by analyzing their subjective evaluations when detecting and processing audio-visual fake content. This WP also seeks to observe the impact of different variables on the detection of fake audios and videos by different audiences, and to determine the extent to which the reception of fakes affects memory of events.
  • WP3) To advance the design of deepfake literacy programs: Building on the results of the previous WPs, this research aims to obtain knowledge on how to design and implement effective literacy programs on DTs, particularly for adolescents.

Methodology

To achieve its objectives, this research will adopt a mixed-methods communicational approach, gathering information through qualitative and quantitative techniques such as focus groups, experiments, and surveys. The methodology also involves creating audio and video DFs for experimental and learning purposes.

Research Team

Dr. María T. Soto-Sanfiel
Principal Investigator

Associate Professor

Department of Communications and New Media

Centre for Trusted Internet and Community

National University of Singapore

Dr. Terence Sim
Co-Principal Investigator

Associate Professor

Department of Computer Science

National University of Singapore

Dr. Saifuddin Ahmed
Co-Principal Investigator

Assistant Professor

Wee Kim Wee School of Communication and Information

Nanyang Technological University

Sanjay Saha
Research Assistant

PhD Student

Department of Computer Science

National University of Singapore

Contact Us

Centre for Trusted Internet and Community

Innovation 4.0, #04-04
3 Research Link
Singapore 117602

ctic.deepfakes@gmail.com
ctic@nus.edu.sg