Description

Profiling may threaten values that the law aims to protect and undermine goals that the law aims to achieve. Profiling involves the automated processing of personal or other data to develop profiles that can be used to make decisions about people. Profiling is used in many contexts. For instance, (i) with retail price discrimination, online shops charge different consumers different prices for the same or similar products; (ii) lenders use profiling to estimate a consumer's creditworthiness, and can adapt interest rates for certain consumers or refuse to lend to them; (iii) predictive policing refers to the use of profiling technology to predict criminal behaviour.

However, profiling has drawbacks. For instance, profiling can discriminate unintentionally when an algorithm learns from data reflecting biased human decisions. Additionally, profiling is opaque: people may not know why they are treated differently. Making profiling transparent is difficult, among other reasons because of the complexity and the possibly ever-changing nature of algorithms.
The project’s overarching research question is: considering the rationales for the rules in different sectors, is additional regulation needed, and if so, how should profiling be regulated? The project aims to develop guidelines for regulating profiling.
Acronym: EU535
Status: Finished
Effective start/end date: 1/01/18 – 31/12/18

Flemish discipline codes

  • Human rights law

Research areas

  • profiling, regulation, human rights

ID: 31331959