Columns
AI and the future of justice in Nepal
We must embrace innovations in a way that upholds the values of our legal system.
Sunil Kumar Yadav
Following the issuance of the concept paper on the use of Artificial Intelligence (AI) by the technical committee formed by the Ministry of Communication and Information Technology on July 3, 2024, law practitioners and leaders of the judiciary have turned their attention to AI in the administration of justice. Until now, the law, legal logic and legal reasoning have largely been products of human conscience and history, and concerns remain about how to balance human intelligence with AI. Integrating AI into the administration of justice will remain a distant dream unless we carefully treat AI as a complement to, rather than a wholesale alternative to, human judgment in the legal system.
The development of AI tools is one of the most revolutionary and powerful phenomena of the 21st century, and it can potentially transform the fate of the judiciary. AI can process vast amounts of data with remarkable speed, handle routine administrative tasks like the automatic publishing of cause lists and even help with the automation that the Nepal Bar Association and judicial reformists have sought since Cholendra Shumsher Rana was the chief justice.
It can also offer insights and predictions based on the analysis of available data. For example, AI can modernise case management by automating document review and scheduling. Legal research, too, can be enhanced by swiftly retrieving and analysing relevant precedents and statutes. By reducing backlogs, a menace to the judiciary, AI can significantly improve efficiency and accuracy and aid in the speedy delivery of justice.
Human intelligence
It is important to understand, and to reaffirm, that law is a product of social experience and human conscience. Despite all the scientific advancements and the potential of AI, it cannot replace human intelligence.
Legal practice is not limited to processing information. It also involves interpreting laws within socio-cultural contexts, human experiences, and ethical considerations. When interpreting laws and legal principles, judges and legal professionals empathise, employ moral reasoning, and develop legal logic with cultural sensitivity, abilities that AI lacks. Human judgment remains essential in Nepal, as cultural and social dynamics are deeply ingrained in legal processes.
Overcoming challenges
Integrating AI into Nepal’s judicial system presents several challenges. The absence of an organised database is a major concern: unorganised and scattered data restrict the optimal use of AI and prevent us from reaping its full benefits. Furthermore, the development of AI is West-centric, which is also likely to limit its implementation in Nepal’s judicial system. Therefore, Nepal must focus on developing a system for organising data, and AI developers should work to eliminate barriers that hinder AI’s functionality here. The concern that AI algorithms may be biased cannot be dismissed: if trained on biased historical data, AI systems may reinforce or amplify those biases.
Since Nepal’s judiciary handles private and sensitive data, data security and privacy are crucial concerns when integrating AI into the judiciary. Creating a strong legal framework is therefore a prerequisite to safeguarding data privacy and guaranteeing the ethical application of AI.
Collaborative framework
It is essential to develop a collaborative approach in which human intelligence benefits from AI’s assistance in legal reasoning. Decision-making and the interpretation of law must remain solely the domain of human judges’ conscience, whereas AI can contribute data-driven insights and complete routine tasks, such as automating cause lists.
This collaborative approach requires a shift in the perspective of legal practitioners. They must be trained to interpret the data and insights generated by AI tools, and they must be encouraged to preserve the human element in administering justice while using AI outputs. A lack of critical scrutiny of AI-generated material may lead to fake citations of sources and literature, as happened in a case from the US, where the respondent submitted a non-existent judicial opinion with fake quotes and citations created by ChatGPT.
Before embracing AI in Nepal’s judicial system, training AI on local data, such as case law in the Nepali language, is imperative. Safeguarding against data theft, implementing AI sensibly and holding rigorous dialogue with stakeholders will enhance the capacity of the judicial system and of those involved in justice delivery. It is safe to assert that the future of the judicial system in Nepal depends on how well the authorities manage to use AI. The right balance of human intelligence and AI will make the judicial system fairer and more accessible. We must embrace innovations in a way that upholds the values of our legal system.