Responding to the challenges of artificial intelligence imposes ethical reflection

Published on 13 Jul, 2017

Artificial intelligence specialist Jean-Gabriel Ganascia warns of the importance of ethical reflection in the face of the massive influx and use of data by artificial intelligence systems.


The first issue concerns forgetting. “In the light of devices that track and preserve information, it is not possible to forget,” he explained. “On a personal level, the information collected will follow a person for his or her entire life without that person having any control over it, even though he or she will not be the same ten or twenty years down the line. Collectively, pardon and peace are linked to a certain element of forgetfulness when it comes to information.” He wonders how these opposites can be reconciled.


The second issue involves delegation to machines: “under the pretext of efficiency, there is a huge risk that a machine-based decision will take precedence over a human decision because it avoids taking responsibility”. Because artificial intelligence decisions are based on autonomous learning, they can be unpredictable. Such decisions are taken surreptitiously, which warrants embedding a certain number of human values in these systems. It is also vital “to make human beings responsible for their acts and not to delegate those tasks to machines”.


This calls for immediate reflection. In fact, “in some areas of the United States, predictive justice is already based on predicting recidivism to determine the sentence, which is worrying”. Furthermore, the baseline indicators are unreliable and may lead to “a form of implicit discrimination in terms of data collection”. In the insurance field, groups will establish individual risk on “perfectly discriminatory bases”.


These applications warrant the creation of a regulatory framework, a key question at a time when “conventional standards are floundering”. “Therefore, something needs to be changed”. This is a demanding challenge in a context where technology is gaining ground and accelerating knowledge leads to “a kind of intoxication”.

Le Monde, Laure Belot (04/07/2017)
