The algorithm that decided the destiny of thousands of families

Ricardo Zapata Lopera
Dec 6, 2019

Or how inscrutable code can amplify human mistakes. A case study of La Buona Scuola teachers’ mobility algorithm in Italy.

Teachers protest against the Buona Scuola educational reform (Source: Contropiano)

What can happen when a simple but inscrutable allocation algorithm is poorly implemented in the midst of a hectic political context? As part of a teachers’ mobility program in 2016, an algorithm was supposed to assign vacant positions across Italy to existing and new teachers. After its implementation, thousands of teachers found themselves wrongly placed. News coverage began to unveil the personal drama of separated families and the insufficient explanations offered by the Ministry of Education. The case was taken to the courts, teachers were reassigned to new posts, and it was not until early 2019 that Italy’s Council of State ruled on the matter, creating valuable jurisprudence in favor of better algorithmic accountability. This case brings automated public decisions to the center of debates around digital human rights when writing ‘rules as code’.

An education reform is born

La Buona Scuola (‘the Good School’ in English) was an educational reform promoted in 2015 by the reformist, center-left government of Matteo Renzi in Italy. The reform aimed to transform the school system by introducing a series of actions across three main pillars.

Table 1 — Buona Scuola reform summary (Source: self-construction, based on Universo Scuola[1])

The reform was finally approved as Law 107 of 2015, containing a set of provisions to be developed over the following two years. From the moment it was debated, it sparked protests around the country and was heavily criticized by teachers’ unions and student groups[2], who claimed it introduced neoliberal principles into the school system.

It is worth noting that Matteo Renzi’s reforms went through extensive debate around the country. The climax came when he lost the referendum on a wide-ranging constitutional reform,[3] which led to his resignation in December 2016. Even though the Democratic Party remained in government with Paolo Gentiloni until mid-2018, the implementation of the Buona Scuola was still only halfway through. In April 2017 the Minister of Education issued the eight implementing decrees that Law 107 required.

In the midst of this process, a new scandal came to light. As a product of the 100,000-teacher recruitment plan (see Table 1) and a teachers’ mobility program, teachers were to be assigned to vacant positions around the country during the 2016–2017 school year, based on their merits and location preferences. Around 210,000 transfers were requested[4]. Of those, at least 10,000 teachers demanded explanations for why the program had placed them differently than they expected. After a two-year social and legal struggle to scrutinize the algorithm used, society was once again reminded that automating public decisions without proper operational and ethical frameworks can amplify negative consequences.

The technological promise

Under a 117 million EUR contract, Italy’s Ministry of Education hired Hewlett-Packard (HP) Italia and Finmeccanica to manage various IT processes. Within that contract, as investigative journalists later discovered, the contractors were tasked with creating software to manage the teachers’ mobility program. The cost of that item totaled 444,000 EUR[5].

As the following diagram explains, HP Italia and Finmeccanica were supposed to create an algorithm that assigned teachers to vacant positions around the country. To do so, it had to consider seat availability together with teachers’ scores in merit-based tests and their location preferences.

Logical sequence of teachers’ mobility algorithm (Source: La Sapienza and Tor Vergata universities[6])
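The audit never managed to reveal the Ministry’s actual assignment logic, but the task as described — matching ranked teachers to scarce seats by merit and location preference — can be sketched as a simple greedy, merit-order allocation. The sketch below is purely illustrative: the data shapes, names and the merit-order rule are my assumptions, not the real system’s method.

```python
def assign_posts(teachers, seats):
    """Greedy merit-order allocation: teachers are processed from
    highest to lowest merit score, and each one receives the first
    location on their preference list that still has an open seat.

    teachers: list of dicts with "name", "score", "preferences"
    seats:    dict mapping location -> number of open seats
    Returns a dict mapping teacher name -> assigned location (or None).
    """
    remaining = dict(seats)  # copy so the caller's seat counts survive
    placements = {}
    for t in sorted(teachers, key=lambda t: t["score"], reverse=True):
        for loc in t["preferences"]:
            if remaining.get(loc, 0) > 0:
                remaining[loc] -= 1
                placements[t["name"]] = loc
                break
        else:
            # no preferred seat left: unplaced (a real system would
            # fall back to some default assignment rule here)
            placements[t["name"]] = None
    return placements


teachers = [
    {"name": "anna", "score": 90, "preferences": ["Roma", "Milano"]},
    {"name": "bruno", "score": 80, "preferences": ["Roma"]},
]
print(assign_posts(teachers, {"Roma": 1, "Milano": 1}))
```

Even a scheme this simple shows where things can go wrong: a bug in the sort key or in how preference lists are read would silently misplace every teacher processed afterwards, which is precisely the amplification risk discussed below.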

Nevertheless, as teachers began complaining about the program’s inconsistent results, the ‘Gilda degli Insegnanti’ (Teachers’ Union) filed a legal complaint against the Ministry of Education after the latter refused to supply information explaining how the algorithm worked. When asked, the contractors likewise refused, citing the confidentiality of commercial contracts: “We do not comment on private commercial contracts. I think that the best opportunity will be to get in contact with the Ministry and see if they are willing to provide additional answers”[7].

It was Lazio’s Regional Administrative Court that required the Ministry of Education to provide all the technical material needed for an audit by the affected parties. The documentation provided did not allow the algorithm’s operation to be replicated in full, but after a preliminary evaluation of its source code, the auditors established “that the most basic programming criteria that are known to apply have not been observed. In fact — the analysts write in the report — even in light of the simplicity of the requested operation, it is not clear what reasons led the programmer to create a bombastic, redundant and non-maintenance-oriented system”[8]. In particular, the use of the COBOL programming language led the auditors to speculate that the algorithm was badly adapted from an older program. COBOL was widespread in the 1960s and 70s, and today only legacy systems keep using it because of the high cost of migrating to other languages. For a new system like the teachers’ mobility software, it was an inexplicable choice.

Involved actors (Source: self-construction)

The technical audit was ultimately unable to determine how the algorithm worked, and therefore could not establish exactly how it produced placements from merit scores and preferences.

“given the lack of [input data (rankings, teacher data, etc.)], it is not possible to understand whether these data were transferred to the company that developed the algorithm and processed by it or whether, once the algorithm was set up, it was transferred to the Ministry of Education which managed the operation (data input and output) until the appointment of the teachers. It is evident that the lack of such details, as well as the lack of the files referred to in the code, the database, the files that the software uses to read and write data (not so much in content as in form), as well as technical specifications, configures a conduct that is not very transparent, despite the order to show the data and documents by the TAR (Tribunali Amministrativi Regionali), against the ministry.”[9]

The legacy of La Buona Scuola’s allocation algorithm

The ‘Buona Scuola’ algorithm case offers two important lessons for algorithmic decision-making. First, it illustrates the social and ethical implications of automation and digitisation, reminding us of a simple but striking risk: the amplification of mistakes through technological solutions. Once a system is programmed, any wrong instruction will replicate undesired outcomes every time data is processed. This is particularly problematic when public decisions are at stake.

Nevertheless, one must remember that, since technology is ultimately created by humans, mistakes will happen. Coping with this may require no more than adjusting the mechanisms already defined by contemporary, analogue administrative law. After finding a lack of impartiality, transparency and publicity in the teachers’ allocation mechanism, the Council of State’s final ruling in April 2019[10] called for compliance with the principles of reasonableness, proportionality, publicity and transparency[11]. As governments digitise and public decisions become more driven by data and automated procedures, they should be subject to ex ante and ex post control and scrutiny.

A second lesson of this case is that technification and digitisation do not necessarily correlate with neutrality. In other words, discretion is still present. Moreover, because of the increased complexity, detecting the embedded principles and criteria for decision-making can be harder. This is important to bear in mind as public-sector digitisation happens amid profound ongoing debates: in the Buona Scuola case, for instance, there was resistance and scepticism towards automation, given the general context of market-oriented management of education.

These two lessons suggest that public-interest algorithms should be knowable in all aspects, “from its authors to the procedure used for its elaboration, to the decision mechanism, including the priorities assigned in the evaluation and decision-making procedure and the data selected as relevant” (Consiglio di Stato, sentenza n. 2270, 8 aprile 2019). A radical and comprehensive stance for algorithmic transparency is a first but necessary step for the delivery of real public value by automated systems.
