1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
Paragraph 1 of this article lays down a general ban on the use of automated decision-making that has legal or similarly significant effects (as mentioned above). This prohibition is intended to serve as a safeguard, ensuring that decisions of this kind are not taken without due consideration and oversight.
This implies that the controller should not undertake the processing described in Article 22(1) unless one of the exceptions listed below applies.
(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;
The use of automated decision-making processes for contractual purposes may be the most appropriate way to achieve the desired outcome in certain situations. This is especially true where routine human involvement is impractical or impossible because of the sheer volume of data involved. In such cases, the controller must be able to demonstrate that the processing is necessary, taking into account whether a less privacy-intrusive method could be employed. If alternative methods are equally effective and less intrusive, automated decision-making is not considered to be ‘necessary’.
Moreover, automated decision-making may also be necessary for pre-contractual processing under the exception in Article 22(2)(a). Controllers must consider the privacy implications of their automated decision-making processes, ensuring that any processing is necessary and proportionate, and that sufficient safeguards are in place to protect individuals’ data rights.
For instance, it may be necessary to use automated decision-making to identify a shortlist of suitable candidates where an exceptionally high volume of applications is received for an open position. This is done with the intention of entering into a contract with the data subject in order to progress the recruitment process.
(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or
Article 22(2)(c) of the GDPR provides an exception where the data subject has given explicit consent to significant automated individual decision-making. Because such processing poses serious privacy risks, a higher level of individual control over personal data is deemed appropriate.
However, ‘explicit consent’ is not defined in the GDPR. For this reason, the WP29 guidelines on consent (see the “Related” tab for Art.22(2b)) provide important guidance. These guidelines emphasize that consent must be demonstrated through clear affirmative action, such as ticking a box when visiting an internet website or choosing technical settings for an online service.
3. In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.
According to the Article 29 Working Party Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679 (2018), controllers should implement measures that include providing a way for the data subject to obtain human intervention, express their point of view and contest the decision.
Recital 71 further emphasizes the need for transparency around processing, as it outlines that appropriate safeguards should include providing the data subject with specific information and the right to obtain an explanation and to challenge the decision reached after assessment. Furthermore, the controller must provide an easy way for the data subject to exercise these rights, as this ensures they are able to adequately challenge a decision or express their view if they understand how it was made and on what basis.
Errors in data or automated decision-making can lead to incorrect classifications and inaccurate projections that harm individuals. Controllers should therefore regularly assess their data sets for bias and develop ways to address any discriminatory elements. They must regularly review their algorithms to ensure accuracy and the absence of bias, and review the underlying data to ensure that automated decisions are based on valid and reliable information.
Controllers should establish regular procedures to prevent errors, inaccuracies, and discrimination during both the design and production stages.
The Article 29 Working Party recommended a number of such measures in its Guidelines.
4. Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are in place.
The latest consolidated version of the Regulation with corrections by Corrigendum, OJ L 127, 23.5.2018, p. 2 ((EU) 2016/679). Source: EUR-Lex.
ISO/IEC 27701, adopted in 2019, extends ISO/IEC 27002 with additional guidance for PII controllers.
Here are the relevant paragraphs relating to Article 22 GDPR:
7.2.2 Identify lawful basis
The organization should determine, document and comply with the relevant lawful basis for the processing of PII for the identified purposes.
Some jurisdictions require the organization to be able to demonstrate that the lawfulness of processing was duly established before the processing.
(71) The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes ‘profiling’ that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject's performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her. However, decision-making based on such processing, including profiling, should be allowed where expressly authorised by Union or Member State law to which the controller is subject, including for fraud and tax-evasion monitoring and prevention purposes conducted in accordance with the regulations, standards and recommendations of Union institutions or national oversight bodies and to ensure the security and reliability of a service provided by the controller, or necessary for the entering or performance of a contract between the data subject and a controller, or when the data subject has given his or her explicit consent. In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision. Such measure should not concern a child.
In order to ensure fair and transparent processing in respect of the data subject, taking into account the specific circumstances and context in which the personal data are processed, the controller should use appropriate mathematical or statistical procedures for the profiling, implement technical and organisational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimised, secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation, or that result in measures having such an effect. Automated decision-making and profiling based on special categories of personal data should be allowed only under specific conditions.
(72) Profiling is subject to the rules of this Regulation governing the processing of personal data, such as the legal grounds for processing or data protection principles. The European Data Protection Board established by this Regulation (the ‘Board’) should be able to issue guidance in that context.
Article 29 Working Party, Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679 (2018).
European Commission, Commission Guidance on the application of Union data protection law in the electoral context, A contribution from the European Commission to the Leaders’ meeting in Salzburg on 19-20 September (2018).
EDPB, Guidelines 8/2020 on the targeting of social media users (2020).
European Commission, Guidance on Apps supporting the fight against the COVID-19 pandemic in relation to data protection (Brussels, 2020).
ICO, Data sharing: a code of practice (2020).
The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. In such cases, the data subject shall have the right to obtain human intervention, to express his or her point of view, to contest the decision and to have it reconsidered.
Scope of the Right
The applicability of this article is limited to automated processing where the decisions have a significant impact on data subjects. According to the Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679, this article establishes a general prohibition on decision-making based solely on automated processing, which applies regardless of whether the data subject takes any action regarding the processing of their personal data.
In short, the Article 22(1) prohibition applies only where a decision based solely on automated processing, including profiling, has a legal effect on, or similarly significantly affects, someone. Even in these cases, there are specified exceptions which allow such processing to take place.
An automated process may produce a recommendation concerning a data subject. If a human reviews that recommendation and takes other elements into account in making the final decision, the decision is not ‘based solely’ on automated processing.
The controller cannot avoid the Article 22 requirements by fabricating human involvement. For example, if someone routinely applies automatically generated profiles to individuals without any actual influence on the result, this is still a decision based solely on automated processing.
To qualify as human involvement, the controller must ensure that any oversight of the decision is meaningful, rather than a mere formality. It should be carried out by someone who has the authority to change the decision and the competence to consider all the relevant data.
Even if a decision-making process does not affect people’s legal rights, it can still fall within the scope of Article 22 of the GDPR if it produces an effect that is equivalent or similarly significant in its impact. This means that even where there is no change to the data subject’s legal position, they could still be affected sufficiently to require the protections under this provision. The GDPR adds the word ‘similarly’ to the phrase ‘significantly affects’ in order to set a threshold for significance comparable to that of a decision producing a legal effect.
A legal effect occurs when a decision based solely on automated processing impacts someone’s legal rights, such as freedom of association, the right to vote or the right to take legal action, or produces legal consequences such as cancellation of a contract, entitlement to or denial of social benefits, or denial of admission to a country or refusal of citizenship.
According to Recital 71, typical examples of other similarly significant effects could include ‘automatic refusal of an online credit application’ or ‘e-recruiting practices without any human intervention’.
For data processing to significantly affect someone, the effects of the processing must be sufficiently great or important. This could include decisions that affect someone’s financial circumstances, such as their eligibility for credit; decisions that affect someone’s access to health services; decisions that deny someone an employment opportunity or put them at a serious disadvantage; or decisions that affect someone’s access to education, for example, university admissions.
In many typical cases, the automated decision to present targeted advertising based on profiling will not have a similarly significant effect on individuals. However, it is possible for data profiling to have an effect on individuals depending on the characteristics of the case. This includes the intrusiveness of the profiling process, the expectations and wishes of the individuals, and the knowledge of the vulnerabilities of the data subjects. Even if it has little effect on some individuals, it can have a significant impact on certain groups, such as minority groups or vulnerable adults.
Similarly, automated decision-making that results in differential pricing based on personal data or personal characteristics could also have a significant effect if, for example, prohibitively high prices effectively bar someone from certain goods or services.