The Ethics of Deepfake Technology

Deepfake videos, audio and images are fabricated yet appear real. They are produced by generative AI using machine learning and deep learning techniques. The process involves three steps: extraction, training and creation. Deepfakes require huge data sets to work. This data has to be extracted, typically scraped from the open web. A machine-learning model, usually a neural network, is then trained on this data to synthesize the desired video, audio or image. We may ask an important question: because something is technologically possible, is it morally or ethically permissible? We do not say that we have to throw the baby out with the bathwater; the issue is complex. Besides, it is hard to distinguish between what is real and what is fake.
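The three-step pipeline above can be sketched in miniature. The code below is a purely illustrative toy, not a real deepfake system: the "frames" are short lists of numbers standing in for face images, and the "model" is just a per-pixel average rather than the deep neural network a real system would train.

```python
# Toy sketch of the deepfake pipeline: extraction -> training -> creation.
# All names and the 1-D "faces" are hypothetical stand-ins; real systems
# train autoencoders or GANs on large sets of cropped face frames.

def extract(frames):
    """Extraction: gather raw samples (here, lists of numbers standing in
    for cropped face images) into a usable training set."""
    return [f for f in frames if f]  # drop empty/unusable frames

def train(samples):
    """Training: learn a crude 'identity model' -- here just the per-pixel
    mean of the samples. A real model would be a deep neural network."""
    n = len(samples)
    width = len(samples[0])
    return [sum(s[i] for s in samples) / n for i in range(width)]

def create(model, driving_frame, blend=0.5):
    """Creation: synthesize a new frame by blending the learned identity
    with a driving frame that supplies the pose/expression."""
    return [blend * m + (1 - blend) * d for m, d in zip(model, driving_frame)]

if __name__ == "__main__":
    frames = [[1.0, 2.0, 3.0], [3.0, 2.0, 1.0], []]
    model = train(extract(frames))         # per-pixel mean: [2.0, 2.0, 2.0]
    fake = create(model, [0.0, 0.0, 0.0])  # blended output: [1.0, 1.0, 1.0]
    print(model, fake)
```

The point of the sketch is only the division of labour: extraction cleans and collects data, training condenses it into a model of the target, and creation drives that model with new input.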

There are claims that this technology has a good side. It can bring our dead back to life on screen or on the web. The film industry may continue producing films featuring deceased artists by means of deepfake technology. It can also reproduce the videos and voices of dead persons so that their loved ones can cope with their loss. It can be used to bring iconic artwork to life: we already have a video animating the head, face, mouth and lips of the Mona Lisa. It is also said to be cost-saving, producing cheap movies, advertisement clips and games. It may even enable identity protection in sensitive matters.

Besides these and other good sides, there is a dark side to deepfakes. They can be used to target innocent persons. Since the technology enables us to swap faces and even bodies, it may be employed to tarnish someone's good image by creating pornographic videos, or doctored videos and audio in which a person says or does things they never did, thus harming innocent persons, especially public figures. Some may even impersonate others by cloning their voices in order to commit frauds and scams.

But the good news is that deepfake media can be detected. Close scrutiny of a video can reveal its fault lines. We may trace inconsistencies such as the apparent age of the eyes and face, the smoothness of the skin, or facial hair, and so pick out the real from the fake. This means a deepfake can betray itself. Besides, it generally fails to reproduce the scene of action faithfully; hence shadows, eye movements and the like may also help us distinguish the real from the fake. But with time deepfake technology will evolve and make it very difficult to tell fake media from real. Hence, with every passing day, the ethical concerns regarding deepfakes will grow.
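One of the cues above can be caricatured in code. The sketch below is a minimal, hypothetical consistency check, not a production detector: it flags clips whose frame-to-frame variation is implausibly low, on the assumption (stated here for illustration only, with an invented threshold) that some synthetic footage is suspiciously smooth; real detectors use trained classifiers.

```python
# Hypothetical "fault line" check: natural footage shows frame-to-frame
# variation, while an overly smooth clip may betray synthesis. The 0.05
# threshold is invented for illustration, not an established value.

def frame_differences(frames):
    """Mean absolute pixel change between consecutive frames."""
    diffs = []
    for a, b in zip(frames, frames[1:]):
        diffs.append(sum(abs(x - y) for x, y in zip(a, b)) / len(a))
    return diffs

def looks_suspiciously_smooth(frames, threshold=0.05):
    """Flag a clip whose average inter-frame change falls below the
    (hypothetical) threshold."""
    diffs = frame_differences(frames)
    return sum(diffs) / len(diffs) < threshold

jittery = [[0.1, 0.9], [0.8, 0.2], [0.1, 0.9]]  # natural-looking variation
flat = [[0.5, 0.5], [0.5, 0.5], [0.5, 0.5]]     # implausibly static
print(looks_suspiciously_smooth(jittery))  # False
print(looks_suspiciously_smooth(flat))     # True
```

The same structure, with better features (blink frequency, shadow direction, eye movement), is the shape of the forensic screening the article goes on to recommend.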

We cannot restrict the use of deepfakes, but we can regulate it. Copyright issues arise when an image or face is used without the consent of the person concerned, or of whoever holds the rights, say to an iconic artwork. Deepfake attacks need to be checked. We can trace that these attacks choose politicians, celebrities and businesspeople as their victims. But there is no moral difference between using the images of a public figure and those of a private individual: every person has equal dignity. For now, public persons happen to be the main targets because many of their images and videos are on the web and are easily available for manipulation. These attacks can be extremely harmful to social wellbeing and to the mental and physical health of the victim. Hence, we need ethics to prevent the technology from being maliciously weaponized to harm people.

Deepfakes can harm not just individuals; they can harm businesses, political parties and our democracy at large. Hence, we have an ethical imperative to develop an ethical response to a technology that can be destructively employed by malicious people. Because of the acceleration at which we are living, as well as the speed at which deepfakes can be created and circulated, we need an ethics that responds to them dynamically. Service providers have the responsibility to act at the production level, so that each product of synthetic media is labelled as artificially produced. Besides ensuring the consent of the persons involved, they have to be sensitive to the good of the audience. If a deepfake becomes a potential cause of violence or riots among people, the content has to be immediately recalled from the web. Other regulations, like limits on sharing on platforms such as WhatsApp, have to continue. Even down-ranking of potentially harmful videos, audio and images has to be done.

Above everything, the primacy of the wellbeing of every individual person has to be the guiding principle for ongoing interventions by service providers. This does not absolve the individual users of the technology: the same principle has to be employed to make responsible use of deepfake technology. The Government has the responsibility to establish forensic labs that would enable citizens to screen deepfakes and tag them as such, so that ordinary citizens are neither deceived nor harmed.

The ethical implications are enormous. Our ethics is static and deals with an action only after it is already done. Our challenge is to think out and generate an ethics that sensitively guides the dynamic impact of deepfakes. Since they are synthetically developed and continuously in operation, producing their impact on our society, their potential to harm keeps changing over time, just as the power of the pieces on a chessboard keeps changing over the course of play. Abuse of deepfakes can indeed decrease the trust levels of our society. They can cause irreparable damage to people, businesses, institutions and our nation as a whole. But when used responsibly and ethically, they have the potential to do a world of good for us. Hence, we need to promote the judicious and ethical use of deepfakes.



- Fr Victor Ferrao