How does Europe plan to impose greater transparency on the tech giants' algorithms?
By requiring the web giants to be more transparent about their algorithms, Europe aims to shed light on the black boxes that govern information, entertainment and consumption on the Internet, a project that raises many questions.
What does Brussels want to do?
The European executive intends to use the future European regulation on Internet platforms (the Digital Services Act), to be presented on December 15, to require more transparency from the web giants about the sources of disinformation and the workings of their algorithms. An algorithm is a set of computational rules that makes personalized decisions on a very large scale: displaying the results of a web search in a certain order, organizing the flow of posts on a social network, placing stock-market orders at very high speed, and so on.
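As an illustration only, a feed-ordering algorithm of the kind described here can be sketched in a few lines. The posts, weights and affinity scores below are invented for the example; real platform ranking systems use far more signals and are not public.

```python
# Minimal sketch of a personalized feed-ranking algorithm.
# All data and weights are hypothetical, chosen only to illustrate
# how a few signals can be combined into one ordering decision.
posts = [
    {"id": 1, "likes": 120, "recency": 0.9, "author_affinity": 0.2},
    {"id": 2, "likes": 15,  "recency": 0.5, "author_affinity": 0.9},
    {"id": 3, "likes": 300, "recency": 0.1, "author_affinity": 0.1},
]

def score(post):
    # Hypothetical weighting: engagement, freshness, personal affinity.
    return (0.4 * post["likes"] / 300
            + 0.3 * post["recency"]
            + 0.3 * post["author_affinity"])

# The user's feed is simply the posts sorted by descending score.
feed = sorted(posts, key=score, reverse=True)
print([p["id"] for p in feed])  # → [1, 3, 2]
```

Even in this toy version, changing one weight silently reorders what every user sees, which is precisely the kind of opaque editorial choice the regulation targets.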
Large platforms "will have to tell us how they decide what information and products they recommend, and what they hide, and give us the ability to influence those decisions. They will have to tell us who is paying for the ads we see and why we have been targeted," European Commission Vice-President Margrethe Vestager declared at the end of October.
Why regulate algorithms?
Social networks say they are politically neutral, "but are they really? Nobody can observe their behavior today," Benoit Loutrel, who led a study mission of senior French officials at Facebook, explained recently at the Médias en Seine conference. "When the platforms say 'we are reducing the visibility' of certain contentious content" (as Facebook and Twitter did during the US election campaign), "we don't know by how much they reduce this visibility, we don't know what that means exactly," observes Christine Balagué, professor at the Institut Mines-Télécom.
In a note circulated by the Jean-Jaurès Foundation, MP Paula Forteza argues that social networks are neither media (subject to editorial rules) nor simple hosting platforms, but "public spaces" where "business secrecy" cannot be invoked. "When 53% of the French population has accessed a place open without restriction to meet and discuss, it is called a public space, and it is the general interest that takes precedence," she emphasizes.
What means can be implemented?
Many reports on the subject recommend creating a regulator that would involve technical communities, scientists and journalists in its work. The NGO Algorithm Watch suggests several levels of scrutiny, ranging from examining the training data of an artificial-intelligence model to reviewing the code itself, including a "test" comparing input and output data. "In the case of very complex machines, even the engineers who developed them do not necessarily know why a given result comes out, so to determine whether a system perpetuates biases, one needs to be able to run experiments," Mackenzie Nelson, who heads the organization's platform governance project, told AFP.
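The input/output "test" described above can be sketched as a black-box audit: probe the system with paired inputs that are identical except for one sensitive attribute, and measure how much the outputs diverge. The model, attribute and scoring below are invented for illustration (the stand-in system is deliberately biased so the audit has something to detect); this is not any real platform's code.

```python
# Sketch of a black-box input/output audit, assuming no access to the
# system's internals: only its inputs and outputs are observed.

def opaque_model(profile):
    # Hypothetical system under audit: scores a profile between 0 and 1.
    base = 0.5 + 0.1 * profile["experience"]
    if profile["gender"] == "F":  # deliberate bias the audit should surface
        base -= 0.05
    return base

def paired_audit(model, profiles, attribute, values):
    """Average output gap across probes that differ only in `attribute`."""
    gaps = []
    for p in profiles:
        outputs = [model({**p, attribute: v}) for v in values]
        gaps.append(max(outputs) - min(outputs))
    return sum(gaps) / len(gaps)

# Probe with five otherwise-identical profiles, varying only gender.
probes = [{"experience": e, "gender": "M"} for e in range(5)]
disparity = paired_audit(opaque_model, probes, "gender", ["M", "F"])
print(f"average disparity: {disparity:.3f}")
```

A nonzero disparity does not by itself prove intent, but it gives a regulator a measurable, reproducible signal to question, which is exactly what code review alone cannot provide for a system this opaque.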
"There are so many parameters that reading the code does not tell you what will happen," adds Guillaume Chaslot, one of the first to warn about bias in YouTube's recommendation algorithm. These systems need to be observed "day by day" to check what they produce, and "if the platforms keep repeating the same mistakes, we have to come to sanctions."
What are the platforms’ objections?
The platforms would accept an ongoing dialogue with a regulator, but "we don't want someone to tell us in advance, for each innovation, what we can or cannot do," an official at one web giant, who asked not to be identified, told AFP. In its responses to the public consultation on the draft regulation, Facebook argues that "sharing details about how algorithms work (…) could allow bad actors to bypass detection mechanisms more effectively."
It also warns of the risk of personal-data leaks: "we believe it is essential to have a protection framework for any information shared with the authorities, clearly defining the conditions of data sharing and the persons who will have access to it, and ensuring the confidentiality of the data."