Challenges for the IT Industry: Complications of Developing Electronic Justice Systems

Yuri Filatov
4 min read · Feb 25, 2022
Problems in creating a robot judge

Filing an application with a court electronically and attaching the relevant materials is a task most people can manage. Virtually anyone, if necessary, can take part in a remote legal procedure with online communication between participants and document transfer.

What is much harder to imagine today is an electronic justice system in which the entire proceeding is conducted by a completely autonomous digital judge. Nevertheless, e-courts have many advantages, such as faster processing of materials, a reduced workload for certain categories of claims, the elimination of some purely human factors, and so on. It is safe to assume that development in this direction will continue, as will government investment in such projects.

What are the challenges for developers?

If we treat the creation and launch of AI systems in justice as a technical task, the difficulties and problems faced by developers of this technology can be grouped as follows:

  • problems of interaction between humans and electronic systems;
  • problems stemming from the imperfection of current algorithms for processing and interpreting data, and the related limitations;
  • problems of developing AI as an autonomous moral agent (AMA) acting based on the concepts of law, morality, and justice.

Each group of problems complicates the implementation of the final product in its own way and requires a solution at one of the following levels:

  • system design;
  • transformation of social practices and their connection to digital technologies;
  • significant modification of the current legal, moral, and ideological systems of the state and society with further global consensus.

Problems of interaction

Certain difficulties in the interaction between people and online court systems became apparent as soon as remote justice came into intensive use. The main problem for most countries’ legal systems is the proper identification of citizens participating in a court hearing.

Even in digitally advanced countries, not all residents have national IDs that can reliably and securely identify them through electronic devices and store their data. This may be partly explained by large numbers of undocumented migrants or by citizens’ deep distrust of the state institutions that issue such identifiers. At the same time, small states such as Estonia are betting on the effectiveness of citizens’ interaction with electronic systems.
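
To make the identification gap concrete, below is a minimal Python sketch of the kind of check an e-court backend might run when a participant presents a government-issued identity assertion. The HMAC scheme, the verify_identity_assertion function, and the key are purely illustrative assumptions; real national ID infrastructures, such as Estonia’s e-ID, rely on public-key certificates on smart cards rather than shared secrets.

```python
import hmac
import hashlib

# Hypothetical shared secret between the ID authority and the court system.
# This only sketches the idea of verifying that an identity assertion was
# issued by a trusted authority; it is not how real e-ID schemes work.
AUTHORITY_KEY = b"demo-secret-key"

def verify_identity_assertion(citizen_id: str, signature: str) -> bool:
    """Check that the signature over citizen_id matches the authority's key."""
    expected = hmac.new(AUTHORITY_KEY, citizen_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# A participant without a national ID simply has no valid assertion to
# present, which is exactly the coverage gap described above.
```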

Another difficulty is the unequal availability of technology across different segments of the population, also known as ‘the digital divide.’ For example, in the opinion of Elena Avakyan, Advisor to the Federal Chamber of Lawyers of the Russian Federation, using biometric authentication to identify participants in the proceedings could deepen this inequality:

“This is nothing short of turning the judicial system into an elite one. You can count on one hand the individuals who will have access to it.”

In these circumstances, not every state can guarantee all potential participants fair access to electronic legal proceedings.

A less obvious difficulty is the incorrect format in which data is presented to the system. Wendy Chang, a judge of the Los Angeles Superior Court, puts it directly:

“In my experience in judging, especially with a self-represented litigant, most of the time people don’t even know what to tell you.”

In this case, the digital analyzer needs more than the ability to receive, process, and store data: it must also reconstruct the correct structure of that data against a background of information noise.
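
As an illustration of what “establishing the correct format” could involve, here is a minimal sketch of an intake validator. The field names and noise heuristics are hypothetical; a production analyzer would need language detection, document classification, and entity extraction on top of this.

```python
import re

# Hypothetical required fields for an e-filing; illustration only.
REQUIRED_FIELDS = {"claimant_name", "respondent_name", "claim_type", "claim_text"}

def validate_filing(filing: dict) -> list[str]:
    """Return a list of problems with a submitted e-filing, empty if acceptable."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - filing.keys()]
    claim = filing.get("claim_text", "")
    # Crude noise heuristics: real systems would need far richer analysis.
    if len(claim.split()) < 20:
        problems.append("claim_text too short to analyze")
    if not re.search(r"[.!?]", claim):
        problems.append("claim_text contains no complete sentences")
    return problems

issues = validate_filing({"claimant_name": "A. Smith", "claim_text": "help me"})
print(issues)  # e.g. ['missing field: respondent_name', ...]
```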

Imperfection of algorithms

Even today, designing ML-based systems for automated data processing involves technical difficulties that frustrate developers. Big Data and decision-making systems often rely on algorithms that are opaque to an external observer, leaving users no option but to trust the integrity of the systems’ manufacturers. Moreover, such systems can inherit the biases and mistaken views of their human creators.
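
A minimal sketch of how that inheritance happens: a model fitted to skewed historical decisions simply reproduces the skew. The data below is invented for illustration.

```python
from collections import defaultdict

# Toy historical decisions: the human-made labels are already skewed
# against group "B" (hypothetical data, illustration only).
history = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 30 + [("B", 0)] * 70

# "Training": estimate the approval rate per group straight from the labels.
counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
for group, label in history:
    counts[group][0] += label
    counts[group][1] += 1

def model(group: str) -> float:
    """Predicted approval probability; it simply mirrors past decisions."""
    pos, total = counts[group]
    return pos / total

print(model("A"), model("B"))  # 0.8 vs 0.3: the historical bias is reproduced
```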

For example, the COMPAS algorithm, which predicts the risk of ex-prisoners reoffending in the USA, has been criticized for its low accuracy, as evidenced by ProPublica’s report. Only 20% of Florida defendants flagged as likely to commit violent crimes actually went on to do so; when offenses of all kinds were counted, the prediction accuracy was roughly three times higher. The system also showed a clear bias against African Americans, whose risk of reoffending it systematically overestimated.
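
The core of ProPublica’s method was comparing error rates across demographic groups. The sketch below computes a per-group false positive rate on hypothetical records; the numbers are chosen only to echo the rough magnitudes reported for COMPAS, not taken from the actual data.

```python
def false_positive_rate(records):
    """FPR: share of people who did NOT reoffend but were flagged high-risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = sum(1 for r in non_reoffenders if r["high_risk"])
    return flagged / len(non_reoffenders)

# Hypothetical audit records; rates chosen to echo the magnitudes ProPublica
# reported (roughly 45% vs. 23%), illustration only.
group_a = ([{"high_risk": True,  "reoffended": False}] * 45 +
           [{"high_risk": False, "reoffended": False}] * 55 +
           [{"high_risk": True,  "reoffended": True}]  * 30)
group_b = ([{"high_risk": True,  "reoffended": False}] * 23 +
           [{"high_risk": False, "reoffended": False}] * 77 +
           [{"high_risk": True,  "reoffended": True}]  * 30)

print(false_positive_rate(group_a))  # 0.45
print(false_positive_rate(group_b))  # 0.23
```

A large gap between these two rates means one group is far more often wrongly flagged, even if overall accuracy looks acceptable.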

However, developers of intelligent systems in other areas run into similar errors. Recall the recently published studies highlighting how speech recognition systems perform markedly worse for certain groups of speakers. Similar problems were identified in 2018 in Amazon’s experimental recruiting system, which discriminated against women. Another example is the facial recognition systems from Microsoft and IBM, whose accuracy in determining gender varied with skin color. All these errors are largely rooted in how the AI programs were trained: on datasets that underrepresent or otherwise disadvantage certain groups of the population.
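
One way such dataset-level discrimination is caught is a simple representation audit before training. The sketch below uses hypothetical metadata with a skew similar in spirit to what the Gender Shades study found in common face benchmarks.

```python
from collections import Counter

def audit_representation(samples, attribute):
    """Report each group's share of the training set for a given attribute."""
    counts = Counter(s[attribute] for s in samples)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

# Hypothetical face dataset metadata, illustration only.
dataset = [{"skin_tone": "lighter"}] * 790 + [{"skin_tone": "darker"}] * 210
print(audit_representation(dataset, "skin_tone"))
# {'lighter': 0.79, 'darker': 0.21} -> a model trained here sees far fewer
# darker-skinned faces, one root cause of the accuracy gaps described above.
```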

Conclusion

Developers of AI systems for justice face a number of challenges in finding a suitable communication interface. It should be democratic in use and acceptable in terms of security, allow a person to interact fully with the e-court, and enforce a correct format for the data presented to the system. The ML-based programs used today also require optimization to eliminate errors and developer bias.

However, these technical problems can be solved by technical means. The last group of problems, those associated with training AI systems and turning them into AMAs, is more complicated. So far, it has provoked unceasing arguments among theorists, AI developers, representatives of the justice system, and public opinion leaders. But this issue requires a deeper, separate analysis.
