Paragraph 1 of this article lays down a general prohibition on the use of automated decision-making that produces legal or similarly significant effects (as discussed above). This prohibition is intended to serve as a safeguard, ensuring that decisions of this kind are not taken without due consideration and oversight.
This implies that the controller should not undertake the processing described in Article 22(1) unless one of the exceptions listed below applies.
(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;
The use of automated decision-making for contractual purposes may be the most appropriate way to achieve the desired outcome in certain situations, especially where routine human involvement is impractical or impossible due to the large volume of data. In such instances, the controller must be able to demonstrate that the processing is necessary, taking into account whether a less privacy-intrusive method could be employed. If alternative methods exist that are equally effective and less intrusive, automated decision-making is not considered 'necessary'.
Moreover, automated decision-making within the meaning of Article 22(1) may also be necessary for pre-contractual processing under this exception. Controllers must consider the privacy implications of their automated decision-making processes, ensuring that any processing is necessary and proportionate and that sufficient safeguards are in place to protect individuals' data rights.
For instance, it may be necessary to use automated decision-making to identify a shortlist of suitable candidates where an exceptionally high volume of applications has been received for an open position. This is done with the intention of entering into a contract with the data subject in order to progress the recruitment process.
(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or
Automated decision-making under Article 22(2)(b) may be authorised by Union or Member State law, provided that suitable measures are in place to safeguard the data subject's rights. Recital 71 notes that such authorisation may cover monitoring and preventing fraud and tax evasion, or ensuring the security and reliability of a service provided by the controller.
Article 22 of the GDPR also makes an exception where the data subject has given explicit consent to significant automated individual decision-making. Because such processing poses serious privacy risks, a higher level of individual control over personal data is deemed appropriate.
However, 'explicit consent' is not defined in the GDPR. For this reason, the WP29 guidelines on consent provide important guidance. These guidelines emphasise that consent must be demonstrated through a clear affirmative action, such as ticking a box when visiting a website or choosing technical settings for an online service.
3. In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.
According to the Article 29 Working Party Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679 (2018), controllers should implement measures that include providing a way for the data subject to obtain human intervention, express their point of view and contest the decision.
Recital 71 further emphasises the need for transparency around processing: appropriate safeguards should include providing the data subject with specific information and the right to obtain an explanation of, and to challenge, the decision reached after assessment. Furthermore, the controller must provide an easy way for the data subject to exercise these rights, since data subjects can only adequately challenge a decision or express their view if they understand how it was made and on what basis.
Errors in data or in automated decision-making can lead to incorrect classifications and inaccurate projections that harm individuals. Controllers should therefore regularly assess their data sets for bias and decide how to handle any prejudicial elements, review their algorithms to verify accuracy and the absence of bias, and check the underlying data to ensure that automated decisions are based on valid and reliable information.
Controllers should establish regular procedures to prevent errors, inaccuracies, and discrimination during both the design and production stages.
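As an illustration of what such a regular check might look like in practice, the sketch below computes per-group approval rates and a simple disparate-impact ratio over a set of automated decisions. This is a minimal, hypothetical example: the record layout, the `group` attribute, and the ratio threshold are assumptions for illustration, not a prescribed audit method.

```python
# Hypothetical sketch of a periodic bias audit over automated decisions.
# The record fields ("group", "approved") and the use of a disparate-impact
# ratio are illustrative assumptions, not requirements of the GDPR or the
# WP29 Guidelines.

def approval_rates(records, group_key, outcome_key="approved"):
    """Compute the approval rate for each group in a list of dict records."""
    totals, approvals = {}, {}
    for rec in records:
        group = rec[group_key]
        totals[group] = totals.get(group, 0) + 1
        approvals[group] = approvals.get(group, 0) + (1 if rec[outcome_key] else 0)
    return {g: approvals[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest to the highest group approval rate.
    Values well below 1.0 flag a potential discriminatory effect
    that warrants closer review of the algorithm and its data."""
    lo, hi = min(rates.values()), max(rates.values())
    return lo / hi if hi else 1.0

# Toy decision log for two groups.
decisions = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
]

rates = approval_rates(decisions, "group")
ratio = disparate_impact_ratio(rates)
print(rates, round(ratio, 2))
```

A controller would run such a check on real decision logs at regular intervals and investigate any group whose outcomes diverge markedly from the rest.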
The Article 29 Working Party recommended the following measures in its Guidelines:
4. Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place.
Concern: Request to object to automated decision
Dear Madam, Dear Sir,
I am subject to a decision made by your [company | organization | etc.] based solely on [automated processing | profiling | etc.].
[…]
ISO/IEC 27701, adopted in 2019, extends ISO/IEC 27002 with additional guidance for PII controllers.
Here are the paragraphs relevant to Article 22 GDPR:
7.2.2 Identify lawful basis
Control
The organization should determine, document and comply with the relevant lawful basis for the processing of PII for the identified purposes.
Implementation guidance
Some jurisdictions require the organization to be able to demonstrate that the lawfulness of processing was duly established before the processing.
[…]
(71) The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes 'profiling' that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject's performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her. However, decision-making based on such processing, including profiling, should be allowed where expressly authorised by Union or Member State law to which the controller is subject, including for fraud and tax-evasion monitoring and prevention purposes conducted in accordance with the regulations, standards and recommendations of Union institutions or national oversight bodies and to ensure the security and reliability of a service provided by the controller, or necessary for the entering or performance of a contract between the data subject and a controller, or when the data subject has given his or her explicit consent.
In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision. Such measure should not concern a child.
In order to ensure fair and transparent processing in respect of the data subject, taking into account the specific circumstances and context in which the personal data are processed, the controller should use appropriate mathematical or statistical procedures for the profiling, implement technical and organisational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimised, and secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject, and that prevents, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation, or processing that results in measures having such an effect. Automated decision-making and profiling based on special categories of personal data should be allowed only under specific conditions.
(72) Profiling is subject to the rules of this Regulation governing the processing of personal data, such as the legal grounds for processing or data protection principles. The European Data Protection Board established by this Regulation (the 'Board') should be able to issue guidance in that context.
Article 29 Working Party, Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679 (2018).
European Commission, Commission Guidance on the application of Union data protection law in the electoral context, A contribution from the European Commission to the Leaders’ meeting in Salzburg on 19-20 September (2018).
EDPB, Guidelines 8/2020 on the targeting of social media users (2020).
European Commission, Guidance on Apps supporting the fight against the COVID-19 pandemic in relation to data protection, Brussels (2020).
ICO, Data sharing: a code of practice (2020).
Spanish Data Protection Agency (AEPD), Guide on use of cookies (2021).
The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. In such cases, the data subject shall have the right to obtain human intervention, to express his or her point of view, to contest the decision and to have it reconsidered.
Scope of the Right
The applicability of this article is limited to automated processing where the decisions have a significant impact on data subjects. According to the Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679, this article establishes a general prohibition on decisions based solely on automated processing, which applies regardless of whether the data subject takes any action.
In a nutshell, the Article 22(1) prohibition applies only where a decision based solely on automated processing, including profiling, has a legal effect on someone or similarly significantly affects them. Even in these cases, specified exceptions allow such processing to take place.
Automated Processing
An automated process can produce a recommendation about a data subject. If a person reviews the recommendation and takes other elements into account when making the final decision, the decision is not based solely on automated processing.
The controller cannot avoid the Article 22 requirements by fabricating human involvement. For example, if someone routinely applies automatically generated profiles to individuals without any actual influence on the result, that is still a decision based solely on automated processing.
To qualify as human involvement, the controller must ensure that any oversight of the decision is meaningful rather than a mere formality. It should be carried out by someone who has the authority to change the decision and the competence to consider all the relevant data.
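One way to make this concrete is to route every automated recommendation through a reviewer who can confirm or override it, and to require a documented reason for overrides so the oversight leaves a trace. The sketch below is a hypothetical illustration of that pattern; the class names, fields, and the `review` function are assumptions, not any prescribed design.

```python
# Illustrative sketch of meaningful human review of an automated
# recommendation. All names here (Recommendation, FinalDecision, review)
# are hypothetical; the point is that the automated output is only a
# suggestion, the reviewer is identified, and overrides must state a reason.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Recommendation:
    subject_id: str
    score: float           # output of the automated model (assumed)
    suggested: str         # e.g. "approve" or "refuse"

@dataclass
class FinalDecision:
    subject_id: str
    outcome: str
    decided_by: str        # reviewer identity, recorded for accountability
    override_reason: Optional[str] = None

def review(rec: Recommendation, reviewer: str,
           outcome: Optional[str] = None,
           reason: Optional[str] = None) -> FinalDecision:
    """A human confirms or overrides the automated suggestion.
    Overriding the suggestion requires a documented reason."""
    if outcome is None or outcome == rec.suggested:
        return FinalDecision(rec.subject_id, rec.suggested, reviewer)
    if not reason:
        raise ValueError("an override must state a reason")
    return FinalDecision(rec.subject_id, outcome, reviewer, reason)

rec = Recommendation("ds-001", score=0.41, suggested="refuse")
decision = review(rec, reviewer="case-officer-7",
                  outcome="approve",
                  reason="additional documents show stable income")
print(decision.outcome, decision.override_reason)
```

Because the reviewer can change the outcome and must justify doing so, the human step has an actual influence on the result, rather than rubber-stamping the automated suggestion.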
Significant Effect
Even if a decision-making process does not have an effect on people’s legal rights it could still fall within the scope of Article 22 of the GDPR if it produces an effect that is equivalent or similarly significant in its impact. This means that even if there is no legal change, the data subject could still be impacted enough to require the protections under this provision. The GDPR introduces the word ‘similarly’ to the phrase ‘significantly affects’ in order to provide a threshold for significance that is similar to that of a decision producing a legal effect.
A legal effect occurs when a decision based solely on automated processing affects someone's legal rights, such as the freedom to associate with others, to vote, or to take legal action, or produces legal consequences such as cancellation of a contract, entitlement to or denial of a social benefit, denial of admission to a country, or refusal of citizenship.
According to Recital 71, typical examples of other similarly significant effects could include ‘automatic refusal of an online credit application’ or ‘e-recruiting practices without any human intervention’.
For data processing to significantly affect someone, the effects of the processing must be sufficiently great or important. This could include decisions that affect someone's financial circumstances, such as their eligibility for credit; decisions that affect someone's access to health services; decisions that deny someone an employment opportunity or put them at a serious disadvantage; or decisions that affect someone's access to education, for example, university admissions.
In many typical cases, the automated decision to present targeted advertising based on profiling will not have a similarly significant effect on individuals. However, it is possible for data profiling to have an effect on individuals depending on the characteristics of the case. This includes the intrusiveness of the profiling process, the expectations and wishes of the individuals, and the knowledge of the vulnerabilities of the data subjects. Even if it has little effect on some individuals, it can have a significant impact on certain groups, such as minority groups or vulnerable adults.
Similarly, automated decision-making that results in differential pricing based on personal data or personal characteristics could also have a significant effect if, for example, prohibitively high prices effectively bar someone from certain goods or services.