Federico Bergaminelli



Taken from Police Magazine, online edition of 04/10/2020

Lexamp News Team Note: We are glad to republish, on these pages as well, a recent commentary by Prof. Avv. Federico Bergaminelli on the theme of deepfakes: a technique based on artificial intelligence that allows images to be combined or superimposed to create a fake, yet perfectly credible, video.


*    *    *


Revenge porn, public humiliation, nude images spread online. Violence against women is now exercised through the "deepfake" technique as well, and the problem is surely destined to worsen in the future.

Who knows whether Hermione Granger, aka Emma Watson, starring in "Harry Potter and the Philosopher's Stone" in 2001, would ever have imagined that, in 2020, she would become the subject of a disconcerting episode of deepfake porn, viewed no fewer than 23 million times in a handful of days.

This is exactly what happened to her last June.

Deepfake is a technique based on artificial intelligence that allows images to be combined or superimposed to create a fake but perfectly credible video.

It all began on Reddit (a social news aggregator, a mixture of discussion platform and link distributor, divided into forums), when a user called "deepfakes" began posting film clips to the site in which the actors' faces had been replaced with those of other people. Using machine learning tools and open-source programs, the user soon attracted widespread attention, in particular from Vice's Motherboard (a news website), which dedicated an article to him in December of last year.

We are talking about an alarming phenomenon that generates, every month, over 1,000 "photomontages" published on the planet's largest porn sites, pages with record views. And, to be clear, we are talking about websites that, in terms of visitors, rank behind only Google, YouTube, Facebook, Twitter and Instagram among the most popular addresses in the world, and in some cases receive even more visits than Wikipedia, Yahoo or Amazon.

These data come from a study carried out by Sensity, which was, among others, commented on with dismay by my colleague Guido Scorza, now a member of the Italian Privacy Authority, in a recent post.

Of the 14,678 deepfake videos found online, 96% consist of pornographic material.

The main problem (at least for now) is not fake news, but the exploitation of women's images.

Imagine the distress for women, because, obviously, the phenomenon is almost entirely "dedicated" to them: suddenly discovering that one's face has been cut out and pasted onto the body of a porn star, then launched into the universe of porn sites. A genuine act of sexual violence, with consequences for the victim's life that are probably comparable.

It is likely that, of the millions of porn-site patrons who think they recognize a face, known for quite different reasons, in that sex scene, many will form (it matters little whether rightly or wrongly) an idea of her that simply does not correspond to reality.

Few, of course, will tell her they have seen her on a porn site, and even fewer, when she tries to explain that it is a montage, however convincing, will really believe her.

And her life will, in many cases, be destroyed, and her future compromised, because those images will presumably continue to circulate on the web forever.

Sensity's study is absolutely pessimistic about the possibility of successfully combating this upsetting and rampant phenomenon: the technology for making deepfakes, starting with sexually oriented ones, is increasingly affordable.

It must be said, however, that the operators of porn sites, albeit with varying degrees of sensitivity and commitment, are adopting useful remedies, at least guaranteeing that the victim, if she becomes aware of it, can request and obtain the timely removal of the video or images.

But will all this be enough?

Before the unfortunate woman realizes that she has been subjected to this upheaval of her person, the video will have been viewed and shared across hundreds of thousands of digital channels.

At this point the question arises: how long before we have to deal with an adolescent victim whose ex-boyfriend, or a group of schoolmates, decides to play a tasteless "prank" with dramatic consequences, perhaps without even realizing it?

There is no time to waste and, above all, as has sometimes happened, we cannot let ourselves be deterred by the scabrous nature of the phenomenon: pornography and its target market.

The operators of porn sites are by now genuine multinationals of the web, and they must be treated in the same way as Google, Facebook, Apple, Amazon or Twitter.

And it is from them that we must obtain the installation of automatic deepfake-detection systems, which are now starting to spread, at least to flag the suspicious nature of a video being uploaded: first of all to the uploader himself, who may be unaware of the fake or who might, in any case, think twice before violating a woman via the web.

And then, of course, to the moderators, so that they can verify the content and, where necessary, block its publication.

I repeat what I have expressed in previous posts: it is essential to have laws that punish these new digital sexual abusers in an exemplary manner.

In Italy, for example, we finally have a good law on revenge porn, but it is difficult, given the sacrosanct principle of strict construction in criminal law, to apply it to the case in which a photomontage ends up online. It is a law that must certainly be revisited in light of the phenomenon discussed here, and reframed as "open" to the stigmatization of similar phenomena.

Education, a culture of personal rights, including in their digital dimension, and dialogue at school, in the family and at university must make the difference.

I’d like to conclude with an invitation: help me make a dream come true!

Together, let us carry across Italy a reflection dedicated to the "dark side of the web and social media", precisely to make everyone more aware: parents, teachers, our children, educators, all those who "depend" on the excessive power of digital communication...

