AI discussing AI: Siri on SyRI
A GDPR-analysis of SyRI
SyRI is an instrument that uses personal data to predict the likelihood that an individual will violate the law. It runs on data collected by various public authorities. However, SyRI (allegedly) violates the right to privacy and data protection. This fictional Siri conversation analyzes GDPR issues relating to SyRI.
Diderik Bierens de Haan
What can I help you with?
Go ahead, I’m listening.
Okay, let me put it this way: what can you tell me about the use of AI and big data by the government for social security purposes?
You mean SyRI?
SyRI, short for System Risk Indication, is a fraud prevention system in the Netherlands. It allows public authorities entrusted with the (administrative) supervision of the social security systems to investigate and detect criminal activities, including social security fraud. Combining big data and AI has the potential to be fast, cheap and effective to this end.
Okay – is there a catch though?
Well, a few concerns in relation to privacy have been raised… Civil rights organizations, privacy advocates and human rights activists have recently even taken a case to court, challenging the lawfulness of SyRI. Also, SyRI has the (rather dubious) honor of “winning” this year’s Big Brother Award for “biggest privacy violation”.
But, isn’t the government allowed to use AI and big data on personal data? Surely, it has a lawful basis to do so?
Well, being the government, it was able to create a legal basis for itself: art. 65 of the “Wet structuur uitvoeringsorganisatie werk en inkomen”, which allows personal data to be linked and subsequently analyzed to generate risk reports. But even with a legal basis in place, there are arguments that important GDPR principles have not been adhered to.
Namely, the principles relating to transparency, proportionality and automated decision-making (“ADM”).
What about transparency?
This is a key GDPR principle. Personal data must be processed lawfully, fairly and in a transparent manner. The controller must provide information to the data subject “in a concise, transparent, intelligible and easily accessible form, using clear and plain language”.
Individuals who are subjected to SyRI’s screening are not informed of the processing of their personal data. Since it is unclear which individuals are subject to SyRI’s analysis and what happens to their personal data, SyRI violates the principle of transparency.
The Secretary of State informed parliament that she would not make the risk model publicly available. She claimed this could compromise SyRI, since suspected fraudsters would be able to anticipate the risk indicators and consequently circumvent the system. She also refused to have the algorithm technically audited.
The Minister for Legal Protection is of the opinion that SyRI is not required to meet the standards of transparency prescribed in art. 5 GDPR.
He claims the exception in art. 23 GDPR applies, which states that these obligations (including those regarding transparency) may be restricted by member state law, including for the monitoring of social security.
And proportionality?
This is another key GDPR principle. SyRI’s proportionality has been criticized since it uses negative personal characteristics (e.g. debt, offenses and other signals of fraud). The Dutch Data Protection Authority emphasized that the use of such personal data requires extra scrutiny. The criticism regarding proportionality consists of the following elements:
- Very large data set: SyRI scans the personal data of tens of thousands of individuals. These individuals provided their personal data to the government for administrative purposes, not for the purpose of fraud detection. This makes the processing unfocused and undefined.
- Purpose limitation and categories of personal data: SyRI’s purposes, as well as the categories of data being used, are unspecific and broadly defined. It therefore risks violating the principle of purpose limitation under art. 5(1)(b) GDPR, which states that personal data shall be collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes.
- Subsidiarity: instead of analyzing complete data sets, SyRI is also able to analyze a subset, which is limited in size. This would significantly decrease the amount of personal data analyzed by SyRI.
- Effectiveness: very much the subject of debate. Despite high expectations, SyRI’s effectiveness has turned out to be much lower than anticipated. In one program, 63,000 individuals were screened, yielding 42 cases of fraud (a success rate of 0.07%). In another program, 117 cases of fraud were detected after screening 119,000 individuals (a success rate of 0.1%).
As a result, since SyRI was developed, only five municipalities have asked the ministry to make use of the system. SyRI’s limited effectiveness makes the processing disproportionate.
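The success rates above follow directly from the screening figures. A minimal Python sketch (using only the numbers cited in the text; the program labels are placeholders) reproduces the calculation:

```python
# Recompute the reported SyRI success rates:
# detected fraud cases as a share of individuals screened.
programs = [
    ("program 1", 63_000, 42),    # 63,000 screened, 42 fraud cases
    ("program 2", 119_000, 117),  # 119,000 screened, 117 fraud cases
]

for name, screened, cases in programs:
    rate = cases / screened * 100  # success rate in percent
    print(f"{name}: {cases} cases / {screened:,} screened = {rate:.2f}%")
```

Rounded, this yields the 0.07% and 0.1% figures quoted above.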
And automated decision making?
Art. 22 (1) GDPR concerns automated decision-making and profiling. ADM concerns the making of decisions which are based solely on automated processing: “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”
Art. 22(2)(b) GDPR provides for an exception if the decision “is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests”.
It remains unclear whether SyRI falls under the scope of art. 22 GDPR. The Dutch Minister for Legal Protection argues SyRI merely generates a “notification of increased risk”, which requires further investigation by the relevant public authority. This investigation leads to a decision made by a human being. In other words: since SyRI itself does not make any decisions, it can be argued that art. 22 GDPR does not apply to it.
Interesting stuff. Any final thoughts?
SyRI – and similar projects used or being developed by public authorities – could have a great negative impact on the private lives of Dutch citizens. Its large-scale processing and its secretive nature are ingredients for turning the Netherlands into a ‘digital welfare state’. These signs call for extra vigilance from the public, and scrutiny by courts and supervisory authorities is therefore essential.
Any news on the court ruling?
On February 5, 2020, the court in The Hague ruled (ECLI:NL:RBDHA:2020:865) that SyRI violates the right to respect for private and family life (article 8(2) ECHR). According to the court, SyRI does not strike a fair balance between the social need to prevent social security fraud and the interference with the right to privacy that this entails. The court specifically takes into consideration the basic GDPR principles of transparency, purpose limitation and data minimization. As a result, the court effectively invalidates the use of SyRI in its current form.
This ruling is likely to set a precedent for the future use of data sets and algorithms for fraud detection by the government. It is clear that the GDPR principles must be taken into account.
Moreover, it will be interesting to keep a close watch on the Dutch Data Protection Authority (Autoriteit Persoonsgegevens, or AP). It recently announced that for 2020 to 2023 it will focus on three areas. Two of them, “digital government” and “artificial intelligence and algorithms”, are relevant to SyRI, which means that projects similar to SyRI will be under the scrutiny of the AP in the near future.
Kamerstukken II 2017-2018, 32 761, nr. 122, p. 2.
Kamerstukken II 2017-2018, 26 643, nr. 543, p. 29.
Art. 23(1)(e) and (h) GDPR.
College Bescherming Persoonsgegevens, Advies conceptbesluit SyRI, 18 February 2014, p. 4.
Brief by the United Nations Special Rapporteur on extreme poverty and human rights as Amicus Curiae in the case of NJCM c.s./De Staat der Nederlanden (SyRI) before the District Court of The Hague (case number C/09/550982 / HA ZA 18/388) (“Amicus brief”), p. 4.
Amicus brief, p. 5.
Amicus brief, p. 1.
Focus AP 2020-2023, Data protection in a digital society, p. 24 (in Dutch).