Background & Context:
The framers of the Evidence Act 2011 (“Principal Act”), which introduced the admissibility of computer-generated evidence under Section 84, could scarcely have imagined the enormous technological innovations of the near future, most notably the emergence of Artificial Intelligence (AI). The amendment of the Principal Act produced the Evidence Act 2023. The prevailing Act has expanded the spectrum of admissibility of computer-generated evidence and provided a more robust definition of computer-generated evidence that covers not just “documents” but also “electronic records”, comprising: “data, record or data generated, image or sound stored, received, or sent in an electronic form or microfilm” [1].
However, with continued advances in digital technology, particularly deepfakes and AI-generated media, the existence of doctored, manipulated and distorted material poses a grave threat to the administration of justice in Nigeria. There is therefore a need to interrogate Nigeria’s corpus juris to determine whether prevailing laws have sufficiently catered for these innovations, and for innovations yet to come, so as to ensure that our courts are adequately “fire-proofed” against distorted facts and are not used as vehicles for affirming manipulated evidence.
ESSENTIAL CONTEXT OF SECTION 84 OF THE EVIDENCE ACT AT A GLANCE:
Section 84 of the Evidence Act 2023 [2] sets out the criteria central to the presentation and admissibility of computer-generated evidence.
Precisely, Section 84(2) provides [3] [paraphrased]:
For computer-generated evidence to be admissible, the following conditions must be satisfied:
(a) that the document or electronic record was produced by the computer during a period over which the computer was used regularly to store or process information for the purposes of any activities carried on over that period;
(b) that during that period, information of the kind contained in the electronic record, or of the kind from which the information so contained is derived, was regularly fed into the computer in the ordinary course of those activities;
(c) that throughout the material part of that period the computer was operating properly or, if it was out of operation during any part of that period, that this did not affect the production of the document or electronic record or the accuracy of its contents; and
(d) that the information contained in the statement reproduces or is derived from information supplied to the computer in the ordinary course of those activities.
Section 84(4) goes further to provide that in any proceeding where it is desired to give a statement in evidence by virtue of this section, the statement must be accompanied by a certificate of compliance.
In Brila Energy Ltd v FRN,[4] the Court held that: “where such certificate is not produced, it has been held that oral evidence of a person familiar with the operation of the computer can be given of its reliability and functionality; and that such a person need not be a computer expert.”
In Kubor v Dickson,[5] the Supreme Court held that: “a party that seeks to tender in evidence a computer-generated document needs to do more than just tendering same from the bar. Evidence in relation to the use of the computer must be called to establish the conditions set out under S. 84(2) of the Evidence Act, 2011. In this case, no witness testified before tendering the documents so there was no opportunity to lay the necessary foundations for their admission as e-documents.”
EMERGENCE OF ARTIFICIAL INTELLIGENCE AND EVIDENTIARY CHALLENGES:
Historical Perspective:
As highlighted, the enormous impact of AI, while advantageous, also portends great danger. What commenced in the 1950s as the brainchild of Alan Turing, who proposed the Turing test to determine whether a machine could mimic human intelligence, gained momentum in the 1960s with the development of the first AI programming language, LISP, by John McCarthy. Initial AI systems were centered on symbolic reasoning and rule-based approaches, paving the way for the emergence of expert systems during the 1970s and 1980s. [6]
During the 1990s, there was a significant shift towards machine learning and data-driven methodologies, fueled by greater access to digital data and improvements in computing power. This era marked the emergence of neural networks and support vector machines, enabling AI systems to learn from data and enhance their performance and flexibility. In the 2000s, AI research broadened its scope to include fields such as natural language processing, computer vision, and robotics, setting the foundation for the AI revolution we are experiencing today.
According to Bernard Marr, “the recent explosion of AI is largely attributed to the development of deep learning techniques and the emergence of large-scale neural networks, such as the Generative Pre-trained Transformer (GPT) series by OpenAI. GPT-3, released in 2020, is a prime example of how AI has evolved, boasting 175 billion parameters and demonstrating unprecedented natural language understanding and generation capabilities.” [7]
ACKNOWLEDGED AND UN-ACKNOWLEDGED AI EVIDENCE
According to Megan Carpenter, Dean and Professor of Law at the University of New Hampshire’s Franklin Pierce School of Law, AI evidence falls into two distinct categories. The first is acknowledged AI-generated evidence: evidence that is disclosed and noted as having been modified by AI. [8]
The other is unacknowledged AI evidence, which is presented as authentic and not disclosed as having been modified by any AI tool. Examples include deepfake videos, fabricated receipts, and manipulated photographs. This type of manipulated evidence poses significant challenges for detection and authentication in court proceedings.
In State of Washington v Puloka,[9] the court declined to admit in evidence video exhibits “enhanced by artificial intelligence” for use in a jury trial. The State objected to the admission of the AI-enhanced video, arguing that it did not satisfy the admissibility requirements established in Frye v United States,[10] which mandates that evidence based on novel scientific methods or principles must be generally accepted within the relevant scientific community. The State’s expert, a certified forensic video analyst, contended that the AI tools employed by the defense rendered conventional forensic examination of the video unfeasible.
In the Nigerian context, AI-generated content, such as deepfakes or synthetically enhanced videos, threatens the very foundational legal position that “a fact is said to be proved when after considering the matter before the court, the court comes to the conclusion that a given state of fact exists or its existence is probable.” [11] Section 4 of the Evidence Act gives the court a blanket power to presume certain facts and treat them as proved unless the contrary is proved; this was the position in Olagundoye & Anor v Albert & Ors.[12] Therefore, AI-generated content challenges the court’s ability to distinguish genuine exhibits from algorithmic fabrications.
AI tools may further undermine the constitutionally guaranteed right to fair hearing enshrined under Section 36 of the Constitution of the Federal Republic of Nigeria, 1999. [13] Defendants may be unable to effectively challenge the provenance or accuracy of evidence generated by opaque algorithms.
Recommendations.
Expert Evidence:
Section 68 of the Evidence Act provides that when the court has to form an opinion upon a point of foreign law, customary law or custom, or of science or art, or as to the identity of handwriting or finger impressions, the opinions upon that point of persons specially skilled in such foreign law, customary law or custom, or science or art, or in questions as to the identity of handwriting or finger impressions, are admissible. [14]
The emergence of AI requires thorough interpretation and analysis of materials presented before the court as evidence, especially documents or electronic records that fall under the purview of Section 84 of the Evidence Act. As can be gleaned from State of Washington v Puloka, the position adopted in Frye v United States, namely that evidence based on novel scientific methods or principles must be generally accepted within the relevant scientific community, should be the general benchmark for admitting computer-generated evidence in this era of AI. Thus, the court is encouraged to suo motu require expert evidence to determine the authenticity of computer-generated evidence presented before it. Subjecting such evidence to forensic analysis and scrutiny would sufficiently unravel its accuracy and authenticity.
In Oyakhire v Obaseki,[15] the court held: “If the evidence of an expert is not shaken under cross-examination and is uncontradicted it should be admitted.”
It is important to point out that experts invited by the court in this regard should be persons qualified to speak with some authority by reason of their special training, skill and mastery of the subject matter in question. See Rabiu v Amadi.[16]
Professional Obligation:
Ethical considerations must be taken into account by legal practitioners in the course of presenting evidence before the court, particularly since a cardinal duty of a legal practitioner is to uphold and observe the rule of law at all times. [17] The Technology and Law Committee of the Nigerian Bar Association’s Section on Legal Practice has created robust guidelines governing the use of AI in legal practice.
Collaboration with Relevant Stakeholders in the Technology Ecosystem:
There is a need for cross-fertilisation of ideas with experts and stakeholders in the technology ecosystem (public and private sector) in order to identify gaps and loopholes. This is because the emergence of AI threatens the very foundation of the administration of justice.
Conclusion.
In conclusion, the reality of the present and the future indicates that AI is here to stay. The AI infrastructure market is projected to grow from USD 58.78 billion in 2025 to USD 356.14 billion by 2032,[18] confirming that the future is AI. There is a need for collaboration and for targeted capacity-building training for judicial officers to address the threats posed by AI.
It is also recommended that the National Assembly, as the highest law-making body of the country, revisit the Evidence Act and amend it further, making it more robust and reflective of the emergent changes and dynamism associated with AI.
FOOTNOTES:
1. Evidence Act, 2023.
2. Ibid.
3. Ibid.
4. LPELR-43926 (CA).
5. [2013] 4 N.W.L.R (Pt.1345) at 534, pages 577-578
6. Bernard Marr, ‘The Evolution Of AI: Transforming the World One Algorithm at a Time’. Available at https://bernardmarr.com/the-evolution-of-ai-transforming-the-world-one-algorithm-at-a-time/
7. Ibid.
8. Megan Carpenter, ‘Deepfakes on trial: How judges are navigating AI evidence authentication’. Available at https://www.thomsonreuters.com/en-us/posts/ai-in-courts/deepfakes-evidence-authentication/
9. (2024) No. 21-1-04851-2 KNT
10. 293 F. 1013 (D.C. Cir. 1923)
11. Section 121, Evidence Act.
12. (2014) LPELR-22980 (CA)
13. Constitution Federal Republic of Nigeria, 1999.
14. Evidence Act (n1)
15. (1986) 1 NWLR (Pt. 19) 735 at 742
16. [2013] 2 N.W.L.R (Pt. 1336) 36 at 52, paras D-E
17. Rule 1, General Responsibility of a Lawyer. Rules of Professional Conduct, 2023.
18. Fortune Business Insights, ‘AI Infrastructure Market Size, Share & Industry Analysis, By Offering (Hardware and Software), By Deployment (On-premises, Cloud, and Hybrid), By End-User (Enterprises, Government Organizations, and Cloud Service Providers), and Regional Forecast, 2024 – 2032’. Available at https://www.fortunebusinessinsights.com/ai-infrastructure-market-110456
Author:
Omogbolahan Sheba is a lawyer based in Abuja, Nigeria. Gbolahan can be reached via email: gbolahansheba@gmail.com or phone: +234 7039075997.