Abstract
It is now possible for anyone with rudimentary computer skills to create a pornographic deepfake portraying an individual engaging in a sex act that never actually occurred. These realistic videos, called “deepfakes,” use artificial intelligence software to superimpose a person’s face onto another person’s body. While pornographic deepfakes were first created to depict celebrities, they are now being generated to feature other nonconsenting individuals, such as a friend or a classmate. This Article argues that several tort doctrines and recent nonconsensual pornography laws are unable to handle published deepfakes of non-celebrities. Instead, a federal criminal statute prohibiting these publications is necessary to deter this activity.
Citation
Douglas Harris, Deepfakes: False Pornography Is Here and the Law Cannot Protect You, 17 Duke Law & Technology Review 99-127 (2019)
Available at: https://scholarship.law.duke.edu/dltr/vol17/iss1/4