Assess and Manage Subgroup Bias

21 / 46 · Training
This practice was ranked as advanced. It addresses requirements from the EU guidelines for trustworthy ML.


Intent

Avoid bias and unfair decisions within subgroups.

Motivation

Ensuring fairness between two top-level groups can still leave fairness violated within their subgroups.

Applicability

Subgroup bias should be assessed and managed for all applications which process data regarding groups and subgroups of individuals.

Description

Subgroup bias can arise when groups are divided improperly, often as a side effect of partitioning the data to avoid group bias, or when data is lacking. For example, consider an application where we split the data by location into New York and Amsterdam. After the split, it may turn out that the New York data is predominantly female and the Amsterdam data predominantly male. Such a division introduces subgroup bias, which ultimately leads to socially biased models.
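To make the motivation concrete, here is a minimal Python sketch (the approval counts and variable names are illustrative assumptions, not from the catalog) showing how parity between two gender groups can hold in aggregate while being clearly violated inside each location subgroup:

```python
# Hypothetical approval counts as (approved, total) per (city, gender) cell.
cells = {
    ("New York",  "F"): (4, 8), ("New York",  "M"): (2, 2),
    ("Amsterdam", "F"): (2, 2), ("Amsterdam", "M"): (4, 8),
}

# Aggregate approval rate per gender, across both cities.
totals = {}
for (city, gender), (approved, total) in cells.items():
    a, t = totals.get(gender, (0, 0))
    totals[gender] = (a + approved, t + total)
overall_rates = {g: a / t for g, (a, t) in totals.items()}
# Both genders are approved at the same overall rate, so group
# fairness appears satisfied in aggregate.

# Per-city (subgroup) rates tell a different story.
subgroup_rates = {k: a / t for k, (a, t) in cells.items()}
# Within each city, one gender is approved far more often than the other.
```

Auditing only the aggregated rates would miss the within-city disparity, which is exactly the failure mode this practice targets.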

To avoid subgroup bias, test, assess, and calibrate the models just as you would for social bias.

Follow the references to learn more about technical approaches that ensure fair predictions for every sub-population identifiable within a set of groups.
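As a starting point for such an assessment, a per-subgroup selection-rate audit can be sketched as follows. The `subgroup_selection_rates` helper and the toy records are assumptions for illustration, not part of the catalog or any specific library:

```python
from collections import defaultdict

def subgroup_selection_rates(records, group_keys, outcome_key):
    """Positive-outcome rate for every subgroup formed by the
    combination of the given group attributes (hypothetical helper)."""
    counts = defaultdict(lambda: [0, 0])  # subgroup -> [positives, total]
    for r in records:
        key = tuple(r[k] for k in group_keys)
        counts[key][0] += r[outcome_key]
        counts[key][1] += 1
    return {k: pos / total for k, (pos, total) in counts.items()}

# Toy data mirroring the New York / Amsterdam example above.
records = [
    {"city": "New York",  "gender": "F", "approved": 1},
    {"city": "New York",  "gender": "F", "approved": 1},
    {"city": "New York",  "gender": "M", "approved": 0},
    {"city": "Amsterdam", "gender": "M", "approved": 1},
    {"city": "Amsterdam", "gender": "M", "approved": 0},
    {"city": "Amsterdam", "gender": "F", "approved": 0},
]

rates = subgroup_selection_rates(records, ["city", "gender"], "approved")
# A large gap between the best- and worst-treated subgroup flags
# potential subgroup bias worth investigating and calibrating away.
disparity = max(rates.values()) - min(rates.values())
```

Dedicated fairness toolkits offer richer, statistically grounded versions of this kind of disaggregated evaluation; the sketch only shows the basic shape of the check.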

Adoption

Related

Read more


