Inform Users on Machine Learning Usage

Governance practice 42 of 45. This practice was not ranked.


Make users aware that the application uses machine learning, what it is used for, and what its limitations are. This allows users to better understand how, and whether, to use the application.


Machine learning systems should not represent themselves as humans to users. Humans have the right to know that they are interacting with a machine learning system.


Such user communication should be provided for any machine learning application.


The following is an extract from the EU Ethics Guidelines for Trustworthy AI:

[AI] users should be able to make informed autonomous decisions regarding AI systems. They should be given the knowledge and tools to comprehend and interact with AI systems to a satisfactory degree and, where possible, be enabled to reasonably self-assess or challenge the system. [Moreover, …] AI systems should not represent themselves as humans to users; humans have the right to be informed that they are interacting with an AI system. This entails that AI systems must be identifiable as such. In addition, the option to decide against this interaction in favour of human interaction should be provided where needed to ensure compliance with fundamental rights.

The potential impact of artificial intelligence and machine learning on society calls for mature and responsible use and regulation of these technologies. Since machine learning can be deployed to shape or influence human behaviour through mechanisms that can be difficult to detect, failing to inform users of its use may violate human rights. Communicating that decisions affecting users are made through machine learning increases transparency and helps users make better decisions.

Users can be informed through labels that disclose machine learning usage, and through public descriptions of the machine learning system (e.g. model cards).
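As an illustration, the disclosure described above could be driven by a small, machine-readable description of the model. The sketch below is hypothetical (the field names, model name, and wording are illustrative assumptions, not a standard model card schema); it renders a user-facing notice stating that machine learning is used, for what purpose, with which limitations, and how to reach a human alternative.

```python
# Hypothetical sketch: rendering an ML-usage disclosure from a minimal,
# model-card-style description. All names and fields are illustrative.

MODEL_CARD = {
    "name": "LoanRiskScorer",  # hypothetical model name
    "purpose": "Estimates credit risk to support, not replace, human review.",
    "limitations": [
        "Trained on historical data; may underperform for new customer segments.",
        "Scores are probabilistic and can be wrong in individual cases.",
    ],
    "human_alternative": "Applicants may request review by a human officer.",
}

def render_disclosure(card: dict) -> str:
    """Render a user-facing notice that the application uses machine learning."""
    lines = [
        f"About {card['name']}",
        "This feature uses a machine learning model; "
        "you are not interacting with a human.",
        f"Purpose: {card['purpose']}",
        "Known limitations:",
    ]
    lines += [f"- {item}" for item in card["limitations"]]
    lines.append(f"Human alternative: {card['human_alternative']}")
    return "\n".join(lines)

print(render_disclosure(MODEL_CARD))
```

Keeping the disclosure text next to the model description makes it easier to update the notice whenever the model's purpose or limitations change.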

