One-Day Workshop on AI & Evidential Inference
(in memory of Craig Callen)
in Conjunction with
ICAIL 2011, Pittsburgh, Pennsylvania, June 10, 2011
Workshop Chairs: Giovanni Sartor & Peter Tillers
Program Committee: Henry Prakken, Giovanni Sartor, Douglas Walton, & Peter Tillers
For more information please contact either Giovanni Sartor - giovanni.sartor at gmail.com - or Peter Tillers - peter.tillers at gmail.com
Panelists: Ronald J. Allen, Rainhard Bengez, Floris Bex, Scott Brewer, James Franklin, David Hamer, Bruce Hay, Joseph Laronge, D. Michael Risinger, Michael Pardo, Federico Picinali, Henry Prakken, Boaz Sangero, Giovanni Sartor, Peter Tillers, Bart Verheij, Douglas Walton
Subject: Computational methods and evidential inference in legal settings such as pretrial investigation and trials. Two foci of discussion will be (i) stories, narrative, or rhetoric, and evidential argument; and (ii) burdens of proof. Panelists will also be free to consider other topics, including, for example, (iii) evidential inference and statistical methods, and (iv) cognitive science, psychology, and inference.
Biographical details about panelists (listed in alphabetical order):
Ronald J. Allen, John Henry Wigmore Professor of Law at Northwestern University, in Chicago, Illinois. He did his undergraduate work in mathematics at Marshall University and studied law at the University of Michigan. He is an internationally recognized expert in the fields of evidence, procedure, and constitutional law. He has published five books and approximately eighty articles in major law reviews. The New York Times referred to him as one of the nation's leading experts on evidence and procedure. He has been quoted in national news outlets hundreds of times, and appears regularly on national broadcast media on matters ranging from complex litigation to constitutional law to criminal justice.
Topic: Taming Complexity: Rationality, the Law of Evidence, and the Nature of the Legal System
Rainhard Z. Bengez is a lecturer in philosophy and mathematics, and scholar in residence at TU München, Germany. After seven years of religious education he studied and graduated in mathematics, medicine, and physics. In his PhD in mathematics he worked on formal systems, their theoretical limits, and their limits in practical application. Together with Lothar Phillips he developed the concept of quantitative justice and fairness and its impact on decision systems and on the idea of humanity. The overarching theme of his research is questions and problems concerning the structure of epistemic phenomena and how to gain epistemic insight by problem transformation and by using multiple (even unconventional) methods. Some of his recent projects:
(i) Integration of technology (esp. automata) in society, esp. a Theory of Trust; (ii) Morality, emotions, and machines (chances and limits); (iii) Structure of energy systems (esp. in Europe, and there esp. in Germany, we have reached a point at which we cannot base our conclusions on historical data and assumptions, as we are trying to change the historical (infra-)structure completely. Thus, we have to rethink given ontic and ontological concepts and extend our set of terms. This will, or may, enable our engineers to design and create new structures with a deeper connection to society and to include questions concerning complex systems.); (iv) Philosophy for children (children develop questions concerning philosophy of science, technology, and society by using issues of natural sciences and technology according to their school curriculum); and (v) Art & Science – the many ways of epistemic insight.
He has been awarded grants by German states for the development of new ways of teaching mathematics to functional illiterates, for the Gilgamesh project together with Marie-Cecile Bertau, and for his Theory of Trust project.
Topic: On the Computable Structure of the Logocratic Method and Analyses Specific to Evidence Law
Floris Bex, postdoctoral research assistant, Argument Research Group, University of Dundee. Bex has written a comprehensive dissertation on reasoning with legal evidence and proof. In this dissertation he presents an informal theory, which caters to a philosophical and legal audience, as well as a more formal logical theory aimed at an AI-oriented audience. His work has been presented frequently at the relevant conferences (Jurix, ICAIL). Since 2003, Floris has published four major journal papers on the subject of reasoning with evidence. At the University of Groningen (2005-2009) Floris taught law students the basics of thinking about evidence and scenarios, and he was recently (2010) invited by the court of appeals in Arnhem to give a seminar on reasoning with evidence. Together with Henry Prakken, Floris gave workshops on evidential reasoning at the IVR legal theory conference in Cracow (2007) and at a recent conference for Dutch judges and legal professionals (2010).
Topic (with Douglas Walton): Combining Evidential and Legal Reasoning with Burdens and Standards of Proof
Scott Brewer, professor of law, Harvard Law School. Research Interests: Philosophical Aspects of Legal Thought. Education: SUNY at Stony Brook B.A. 1979, Philosophy and Religious Studies; Yale University M.A. 1980, Philosophy; Yale Law School J.D. 1988; Harvard University Ph.D. 1997, Philosophy. Appointments: Lecturer on Law, 1988; Assistant Professor of Law, 1991; Professor of Law, 1998. Representative Publications: Brewer, Scott. "Scientific Expert Testimony and Intellectual Due Process," 107 Yale Law Journal 1535 (1998); Brewer, Scott. "Exemplary Reasoning: Semantics, Pragmatics, and the Rational Force of Legal Argument by Analogy," 109 Harvard Law Review 923 (1996).
Topic: Representing Legal Arguments: The Centrality of Abduction
Craig Callen, Judge John D. O'Hair Professor of Evidence & Procedure, Michigan State University School of Law.
James Franklin, professor, School of Mathematics and Statistics, University of New South Wales. Brief History. Franklin's undergraduate work was at the University of Sydney (1971-75). He completed his PhD in 1981 at Warwick University, on algebraic groups. Since 1981 he has taught in Mathematics at UNSW. His book What Science Knows: And How It Knows It (Encounter) was published in 2009. His book Catholic Values and Australian Realities appeared in 2006. Franklin's book Corrupting the Youth: A History of Philosophy in Australia was published by Macleay Press in 2003. His book The Science of Conjecture: Evidence and Probability Before Pascal (Johns Hopkins University Press) appeared in 2001. Franklin's research areas include the structuralist philosophy of mathematics and the 'formal sciences' (He is a member of the Sydney School), Australian Catholic history, the parallel between ethics and mathematics, restraint, the quantification of rights in applied ethics, and the analysis of extreme risks.
Topic: How much of commonsense and legal reasoning is formalizable? A review
David Hamer is Associate Professor in the Law Faculty of the University of Sydney. He has undergraduate degrees in both science and law from the Australian National University and a PhD from the University of Melbourne. His dissertation examined probabilistic models of burdens and standards of proof. He has published a series of articles in leading journals in Australia and the UK applying probability theory to various aspects of evidence, proof and justice, including causation, the right to silence, civil and criminal standards of proof, delayed complaints and double jeopardy.
Topic: A probabilistic model of the relationship between the quantity (weight) of evidence, and its strength
Bruce Hay, professor of law, Harvard Law School. Research Interests: Economics of Procedure and Litigation; Evidence; Legal Theory. Education: University of Wisconsin B.A. 1985, Political Science and French; Harvard Law School J.D. 1988. Appointments: Assistant Professor of Law, 1992; Professor of Law, 1998. Representative Publications: "Manufacturer Liability for Harm Caused by Consumers to Others," 94 American Economic Review 1700 (2005) (authored with K. Spier); "Sting Operations, Agents Provocateurs, and Entrapment," 70 Missouri Law Review 387 (2005); "'Sweetheart' and 'Blackmail' Settlements in Class Actions: Reality and Remedy," 75 Notre Dame Law Review 1377 (2000) (authored with D. Rosenberg); "Burdens of Proof in Civil Litigation: An Economic Perspective," 26 Journal of Legal Studies 413 (1997); "Allocating the Burden of Proof," 72 Indiana Law Journal 618 (1997).
Topic: Roughly Two Conceptions of the Trial
Joseph Laronge, Senior Assistant Attorney General, Oregon Department of Justice: Laronge has been a trial and appellate attorney for 35 years. For the last ten years, he has made extensive use of argument mapping (e.g., Rationale software and embodied metaphoric argumentation visual languages) in trial illustrative exhibits and court briefs as ancillary support for factual and legal inferential arguments. During this period, he has applied these argument mapping approaches to teaching advanced legal reasoning skills as an adjunct professor of law in Advanced Argumentation at Lewis & Clark Law School and as a former associate with Austhink Consulting. He previously taught the fundamentals of legal reasoning as an adjunct professor of law in Legal Research & Writing at Willamette University Law School. From these experiences, Laronge found that the multiplicity of modes of inference and argument schemes was an impediment to the enhancement of law students’ legal reasoning skills and to the clear and rigorous representation of such argumentation in court. To help overcome this obstacle, he developed a single generalizable structure of inferential proof named defeasible class-inclusion transitivity (DCIT). Since 2005, Laronge has used DCIT successfully as a persuasive universal structure of factual and legal inferential reasoning in real-world trial and appellate court applications. Its theoretical foundation and its generalizable application in court are discussed in “A Generalizable Argument Structure Using Defeasible Class-inclusion Transitivity for Evaluating Evidentiary Probative Relevancy in Litigation,” J Logic Computation exp066, first published online December 2, 2009, doi:10.1093/logcom/exp066.
Topic: Evaluating Universal Sufficiency of a Single Logical Form for Inference in Court
Michael Pardo, associate professor of law, University of Alabama School of Law, writes and teaches in the areas of evidence, criminal procedure, civil procedure, and jurisprudence. His scholarship explores a variety of philosophical issues in these areas, with a particular focus on epistemological issues regarding evidence and legal proof. His recent scholarship also examines philosophical and evidentiary issues pertaining to law and neuroscience. Professor Pardo is the author of several publications in law reviews, including the Boston College, Illinois, Northwestern, Texas, and Iowa Law Reviews, among others, and in peer-reviewed journals, including Legal Theory, Law and Philosophy, and the Journal of Legal Studies, among others. His article, “The Field of Evidence and the Field of Knowledge,” was presented at the Stanford/Yale Junior Faculty Forum in the jurisprudence and philosophy category. Professor Pardo is also a co-author of the fifth edition of Evidence: Text, Problems, and Cases (Aspen, forthcoming, with Allen, Kuhns, Swift, and Schwartz) and a forthcoming book on law and neuroscience (with Dennis Patterson). Professor Pardo is currently the Chair-Elect of the Association of American Law Schools Section on Evidence. He also serves as the U.S. book review editor of International Commentary on Evidence. Professor Pardo joined the Alabama Law Faculty in 2005. Prior to joining the faculty, he was a visiting assistant professor at Chicago-Kent College of Law and at Northwestern University School of Law. Professor Pardo received his JD from Northwestern University School of Law.
Topic: Relevance, Sufficiency, and Defeasible Inferences: Comments on Modeling Legal Proof
Federico Picinali, LL.M. student at the Yale Law School. He earned a Ph.D. in criminal law and criminal procedure at the Università degli Studi of Trento, Italy, and a degree in legal sciences and a specialized degree in law at the Università degli Studi of Milan, Italy. Former Visiting Researcher at UC Hastings, Cardozo School of Law, and Penn Law. Former Exchange Student at the UC Berkeley Boalt Hall School of Law. His research interests and publications concern the influence of fact finding on substantive criminal law principles, inferential reasoning, the beyond-a-reasonable-doubt standard, and the comparison between "legal reasoning" and "factual reasoning."
Topic: Structuring inferential reasoning in criminal cases. An analogical approach
Henry Prakken, lecturer in the Intelligent Systems Group of the computer science department at Utrecht University, and professor of Law and IT at the Law Faculty of the University of Groningen. Prakken has master's degrees in law (1985) and philosophy (1988) from the University of Groningen. In 1993 he obtained his PhD degree at the Free University Amsterdam with a thesis titled Logical Tools for Modelling Legal Argument. His main research interests concern logical and dialogical aspects of argumentation, and the application of argumentation in legal reasoning, multi-agent systems, and other domains. Prakken was the ICAIL program chair in 2001 and the ICAIL president in 2008-9. He has had papers accepted at every ICAIL conference since 1991 except in 1999 (when he gave a tutorial). Prakken co-organised workshops on legal reasoning about evidence at ICAIL 2001 and IVR 2007 and a conference on a similar topic in New York. He has published regularly on evidence since 2001.
Topic: Can non-probabilistic models of legal evidential inference learn from probability theory?
D. Michael Risinger, John J. Gibbons Professor of Law. Seton Hall University School of Law. Risinger holds a B.A., magna cum laude, from Yale University, and a J.D., cum laude, from Harvard Law School. He clerked for the Honorable Clarence C. Newcomer of the United States District Court for the Eastern District of Pennsylvania. He is a past chair of the Association of American Law Schools Section on Civil Procedure, the immediate past chair of the AALS Section on Evidence, and a life member of the American Law Institute. He was also a member of the New Jersey Supreme Court Committee on Evidence for 25 years, which was responsible for the current version of the New Jersey Rules of Evidence. Professor Risinger moved to Seton Hall Law School in 1973. He served as a visiting senior fellow on the law faculty of the National University of Singapore from 1985-1986. Professor Risinger has published in the areas of evidence and civil procedure. He is the co-author of Trial Evidence, A Continuing Legal Education Casebook and the author of two chapters in Faigman, Kaye, Saks and Cheng, Modern Scientific Evidence (“Handwriting Identification” and “A Proposed Taxonomy of Expertise”). Professor Risinger was selected as one of Seton Hall’s two inaugural Dean’s Research Fellows (2002-2004) and was named the John J. Gibbons Professor of Law in May 2008. His scholarship has recently concentrated on wrongful convictions as well as expert evidence issues.
Topic: Against Symbolization—Some reflections on the limits of formal systems in the description of inferential reasoning and legal argumentation
Boaz Sangero, professor of law and head of the Criminal Law & Criminology Department at the Academic Center of Law & Business, Israel. He received his LL.D. from the Hebrew University of Jerusalem in 1994. He has written over 40 articles and books in Israel, England, and the United States. His recent book Self-Defence in Criminal Law (Hart Publishing, 2006) has been cited many times and reviewed in the Oxford Journal of Legal Studies and the Cambridge Law Journal. Prof. Sangero’s areas of research include substantive criminal law, criminal procedure, criminal evidence, and sentencing. He looks in particular for ways to reduce miscarriages of justice. Representative publications: "A New Defense for Self-Defense", 9 Buffalo Crim. L. Rev. 475 (2006); "Miranda Is Not Enough: A New Justification for Demanding 'Strong Corroboration' to a Confession", 28 Cardozo L. Rev. 2791 (2007); "Why a Conviction Should Not Be Based on a Single Piece of Evidence: A Proposal for Reform" (with Dr. Mordechai Halpert), 48 Jurimetrics: The Journal of Law, Science, and Technology 43 (2007); "From a Plane Crash to the Conviction of an Innocent Person: Why Forensic Science Evidence Should Be Inadmissible Unless It Has Been Developed as a Safety-Critical System" (with Dr. Mordechai Halpert), 32 Hamline L. Rev. 65 (2009); "Heller's Self-Defense", 13 The New Criminal Law Review 449 (2010).
Topic (with Mordechai Halpert): Proposal to Reverse the View of a Confession: From Key Evidence Requiring Corroboration to Corroboration for Key Evidence
Giovanni Sartor, professor of legal informatics and legal theory at the European University Institute of Florence and at the University of Bologna. He obtained a PhD at the European University Institute (Florence), worked at the Court of Justice of the European Union (Luxembourg), was a researcher at the Italian National Council of Research (ITTIG, Florence), held the chair in Jurisprudence at Queen’s University of Belfast (where he is now an honorary professor), and was Marie Curie professor at the European University Institute of Florence. He is President of the International Association for Artificial Intelligence and Law. Sartor has published widely in legal philosophy, computational logic, legislation technique, and computer law. His publications include Corso di informatica giuridica (Giappichelli, 2008), Legal Reasoning: A Cognitive Approach to the Law (Springer, 2005), The Law of Electronic Agents (Oslo: Unipubskriftserier, 2003), Judicial Applications of Artificial Intelligence (Dordrecht: Kluwer, 1998), Logical Models of Legal Argumentation (Dordrecht: Kluwer, 1996), and Artificial Intelligence in Law (Oslo: Tano, 1993).
Topic (with Giuseppe Contissa): Evidence arguments in air traffic safety. A model for the law?
Peter Tillers, professor of law, Cardozo School of Law, Yeshiva University. Tillers is a reviser of John Henry Wigmore's multi-volume treatise on the law of evidence and has published a variety of articles on evidence, inference, and investigation. He is an editor of the Oxford journal Law, Probability and Risk. He is former chairman and secretary of the Evidence Section of the Association of American Law Schools. He was a Fellow of Law & Humanities at Harvard University and an Alexander von Humboldt & Senior Max Rheinstein Fellow at the University of Munich. He was a visiting professor at Harvard Law School in the spring semester of 2002. Tillers was legal adviser for the Latvian mission to the United Nations during the 48th Session of the General Assembly. He maintains a website with discussion of a wide range of general issues of evidence. Tillers' scholarship focuses on evidential inference and fact investigation in legal settings. He maintains that multiple methods of marshaling and analyzing evidence are important in trials, in pretrial investigation and informal fact discovery, and in other domains. He believes that inference networks offer a useful window into investigative discovery and proof at trial. But he believes that subjective, synthetic, and gestalt-like perspectives on evidence, inference, and proof are also essential.
Topic: A Rube Goldberg Approach to Fact Investigation, Evidential Inference, and Factual Proof in Legal Settings
Bart Verheij, lecturer and researcher at the University of Groningen, Department of Artificial Intelligence and a member of the ALICE institute. He participates in the Multi-agent systems research program. His research interests include argumentation, rules and law, with emphasis on defeasible argumentation, legal reasoning and argumentation software. As research methods, he uses formal analysis (in the styles of logic and analytic philosophy), software design, algorithm implementation, agent-based social simulation, controlled experiment, observation, and thinking and exploring. His research field is interdisciplinary, and includes artificial intelligence, argumentation theory and legal theory.
Topic: Can the argumentative, narrative and statistical perspectives on legal evidence and proof be integrated?
Douglas Walton holds the Assumption University Chair in Argumentation Studies and is Distinguished Research Fellow of the Centre for Research in Reasoning, Argumentation and Rhetoric (CRRAR) at the University of Windsor. He serves on the editorial boards of several journals, including Informal Logic, Argument and Computation, and Artificial Intelligence and Law. He is the author of forty-five books and three hundred refereed papers in the areas of argumentation, logic, and artificial intelligence. His books include Witness Testimony Evidence (Cambridge University Press, 2008), Fundamentals of Critical Argumentation (Cambridge University Press, 2006), and Legal Argumentation and Evidence (Penn State Press, 2002).
Topic (with Floris Bex): Combining Evidential and Legal Reasoning with Burdens and Standards of Proof
Abstracts
Ronald J. Allen, "Taming Complexity: Rationality, the Law of Evidence, and the Nature of the Legal System"
This essay explores the implications of complexity for understanding both the law of evidence and the nature of the legal system. Among the propositions critically analyzed is that one significant way to understand the general problem of the meaning of rationality is that it has involved a multivariate search for tools to understand and regulate a hostile environment. The law of evidence is conceptualized as a subset of this effort, at least in part, as involving a search for tools to regulate the almost infinitely complex domain of potentially relevant evidence and at the same time to accommodate policy demands. The proposition is then considered that the legal system of which the evidentiary system is a part has emergent properties that may not be deducible from its component parts and that suggest that it may be, or at least has properties highly analogous to, a complex adaptive system. One implication of this analysis is that the tools of standard academic research, which rely heavily on the isolation and reduction of analytical problems to manageable units so that they can be subjected to standard deductive methodologies, may need to be supplemented with analytical tools that facilitate the regulation of complex natural phenomena such as fluid dynamics. This has direct implications for such things as the conception of law as rules, and thus for the Hart/Dworkin debate that has dominated jurisprudence for 50 years. That debate may have mischaracterized the object of its inquiry, and thus the Dworkinian solution to the difficulties of positivism is inapplicable. Even if that is wrong, it can be shown that the Dworkinian solution is not achievable and cannot rationally be approximated. Solutions to legal problems within the legal system as a whole (as compared to any particular node within the legal system) are arrived at through a process of inference to the best explanation that occurs within a highly interconnected set of nodes that has similarities to a neural network.
Rainhard Bengez, "On the Computable Structure of the Logocratic Method and Analyses Specific to Evidence Law"
My contribution aims at discussing the computational structure of Scott Brewer’s logocratic method specific to evidence law. It intends to contribute to this fundamental methodology by developing four interrelated goals. One is to provide a meta-logical and computable framework and foundation for the process of specification and the design of a domain-specific language. The second is to employ this framework to introduce and study the bivalent structure of enthymemata (logical-syntactical structure, and semantic or interpretative structure obtained by assigning a certain measure). The third is to provide some algorithms and data structures for practical training software; this software concept and tool can be used in education or for assisting analysts. And the last one is to use these models and concepts empirically to analyze the time-dependent structure of evidence and arguments used in legal practice. This extends the notion of the logocratic method to the more general arena of mathematical modeling and epistemic logic, where the aim is to build formal models from specifications in order to formulate algorithms and procedures for semi-autonomous systems (weak AI). In short, by formulating a computable structure for the logocratic method we discover a meta-structure, gain a deeper insight into our own practice of evaluating enthymemata, can investigate the time-dependent structure of evidence law, and can provide some algorithms for a software base that can be used for simulation and semi-autonomous reasoning.
Floris Bex (with Douglas Walton), "Combining Evidential and Legal Reasoning with Burdens and Standards of Proof"
In this paper, we provide a formal logical model of evidential reasoning with proof standards and burdens of proof that enables us to evaluate evidential reasoning by comparing stories on either side of a case. It is based on a hybrid inference model that combines argumentation and explanation, using inference to the best explanation as the central form of argument. The model is applied to one civil case and two criminal cases. It is shown to have some striking implications for modeling and using traditional proof standards like preponderance of the evidence and beyond reasonable doubt.
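As a rough, purely illustrative sketch (not Bex and Walton's formal model; the stories, scores, and weightings below are invented for illustration), one might compare two competing stories by how much of the evidence each explains and how much contradicts it, and read proof standards as comparative tests over those scores:

# Toy Python sketch only; all inputs and thresholds are assumptions.
def score(story, evidence):
    explained = sum(1 for e in evidence if e in story["explains"])
    contradicted = sum(1 for e in evidence if e in story["contradicted_by"])
    return explained - 2 * contradicted   # assumed weighting

def preponderance(story_a, story_b, evidence):
    # assumed reading: story A's score merely exceeds its rival's
    return score(story_a, evidence) > score(story_b, evidence)

def beyond_reasonable_doubt(prosecution, defence, evidence, margin=3):
    # assumed reading: the prosecution story must beat its rival by a wide
    # margin and must not be contradicted by any item of evidence
    s_p, s_d = score(prosecution, evidence), score(defence, evidence)
    return s_p - s_d >= margin and not (prosecution["contradicted_by"] & set(evidence))

evidence = {"fingerprints", "eyewitness", "alibi_receipt"}
prosecution = {"explains": {"fingerprints", "eyewitness"},
               "contradicted_by": {"alibi_receipt"}}
defence = {"explains": {"alibi_receipt"},
           "contradicted_by": set()}

print("preponderance met:", preponderance(prosecution, defence, evidence))
print("beyond reasonable doubt met:", beyond_reasonable_doubt(prosecution, defence, evidence))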
Scott Brewer, "Representing Legal Arguments: The Centrality of Abduction"
James Franklin, "How much of commonsense and legal reasoning is formalizable? A review"
After decades of experience with Artificial Intelligence, it is clear that commonsense and legal reasoning share a number of obstacles to formalization - obstacles not found in some other areas such as mathematics or pathology test interpretation. We review some of the main difficulties, including: the fuzziness or open texture of concepts, leading to borderline cases and problems of similarity (e.g. of cases to precedents); the problem of relativity to context with the difficulty of representing context; the subtleties of causation and counterfactuals; problems of probabilistic and default reasoning including reference class problems. Having surveyed those problems, we extract from them two higher-order issues that are responsible for much of the trouble: discreteness versus continuity (the mismatch between the discreteness of formal symbols and the continuous variation of commonsense concepts), and understanding (the need for genuine human understanding to make the first step in correct classification). It is concluded that full formalization of legal reasoning is unachievable, though there are prospects for systems that behave intelligently by harvesting the results of human understanding in the manner of Google.
David Hamer, "A probabilistic model of the relationship between the quantity (weight) of evidence, and its strength"
A recurrent problem for the probabilistic representation of proof in legal and other contexts is its purported failure to adequately reflect the quantity or weight of evidence. There are cases (such as the naked statistical evidence hypotheticals) where a slight body of evidence appears, probabilistically, to provide a strong measure of support; however, a common intuition is that the evidence would lack sufficient weight to constitute legal proof. Further, if the probability measure has no relation to the weight of evidence, it would appear to express scepticism about the value of evidence. Why bother considering fresh evidence, increasing the weight of evidence, if it provides no epistemic benefit?
Probability theory can avoid the spectre of scepticism. There are probabilistic measures which can be expected to increase as the quantity of evidence increases – utility, certainty, and the probability score. In this paper I prove the relationship between quantity of evidence and these expected increases, and provide a computer model of the relationship. The model highlights some interesting features. As the weight of evidence increases, greater certainty can be expected. Equivalently, on numerous runs of the model, greater certainty is achieved on average. But in particular cases, certainty may remain the same or decrease. The average and expected increase will be more accentuated where more decisive evidence is available, but the result still holds for situations where the evidence is merely probative. Notwithstanding that certainty can be expected to increase, the probability measure expected from considering fresh evidence is, by definition, equal to the prior probability measure.
These results provide a response to the opponents of probabilistic measures of proof. Probability theory is not sceptical of the value of evidence, and it does provide motivation for considering fresh evidence. However, it makes no assumption that the fresh evidence will necessarily put the fact-finder in a stronger epistemic position.
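The claims in the two preceding paragraphs can be checked numerically. The sketch below is a minimal simulation, not Hamer's actual model; the prior, likelihoods, and run counts are assumptions chosen only for illustration. It shows that the expected posterior stays at the prior while the average Brier (probability) score improves as the quantity of evidence grows:

# Minimal Python simulation sketch (assumed parameters, not Hamer's model).
import random

def posterior_after_n_items(h_true, n, prior=0.5, p_e_h=0.7, p_e_not_h=0.4):
    # Bayesian updating on n conditionally independent binary items of evidence.
    p = prior
    for _ in range(n):
        e = random.random() < (p_e_h if h_true else p_e_not_h)
        like_h = p_e_h if e else 1 - p_e_h
        like_not_h = p_e_not_h if e else 1 - p_e_not_h
        p = p * like_h / (p * like_h + (1 - p) * like_not_h)
    return p

def expectations(n, runs=20000, prior=0.5):
    post_sum, brier_sum = 0.0, 0.0
    for _ in range(runs):
        h_true = random.random() < prior
        p = posterior_after_n_items(h_true, n, prior)
        post_sum += p
        brier_sum += (p - (1.0 if h_true else 0.0)) ** 2
    return post_sum / runs, brier_sum / runs

for n in (0, 1, 5, 20):
    mean_post, mean_brier = expectations(n)
    # Mean posterior stays near the prior (0.5); mean Brier score falls as n grows.
    print(f"items={n:2d}  mean posterior={mean_post:.3f}  mean Brier score={mean_brier:.3f}")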
Bruce Hay, "Roughly Two Conceptions of the Trial"
Joseph Laronge, "Evaluating Universal Sufficiency of a Single Logical Form for Inference in Court"
Inference in court is subject to scrutiny for structural correctness (e.g., deductive or nonmonotonic validity) and probative weight in determinations such as probative relevancy and sufficiency of evidence. These determinations are made by judges or, informally, by jurors, who typically have little, if any, training in formal or informal logical forms. This paper explores the effectiveness of a single intuitive categorical natural-language logical form (i.e., Defeasible Class-Inclusion Transitivity, DCIT) for facilitating such determinations, and its universal sufficiency for constructing any typical inferential network in court. This exploration includes a comparison of the functionality of hybrid branching tree-like argument frameworks with the homogeneous linear-path argument framework of DCIT. The practicality of customary dialectical argument semantics and of conceptions of probative weight is also examined, with alternatives proposed. Finally, the use of DCIT for depicting the reasoning of legal cases typically used in AI research is considered.
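A toy sketch, assuming a much-simplified reading of DCIT (this is not Laronge's formalism, and the example chain below is hypothetical): a conclusion is carried along a linear chain of class-inclusion links only if every link survives its defeaters.

# Toy Python sketch only; the Link structure and the example chain are assumptions.
from dataclasses import dataclass, field

@dataclass
class Link:
    sub: str                                        # narrower class
    sup: str                                        # wider class it is included in
    defeaters: list = field(default_factory=list)   # reasons the inclusion fails

    def holds(self):
        return not self.defeaters

def chain_conclusion(item, links):
    # Follow the chain item -> sub -> sup -> ...; report where (if anywhere) it breaks.
    for link in links:
        if not link.holds():
            return False, f"link '{link.sub} -> {link.sup}' defeated by {link.defeaters}"
    return True, f"'{item}' falls within class '{links[-1].sup}'"

# Hypothetical example chain
links = [
    Link("D fled the scene", "conduct showing consciousness of guilt"),
    Link("conduct showing consciousness of guilt", "circumstantial evidence of guilt",
         defeaters=["D fled because of an unrelated outstanding warrant"]),
    Link("circumstantial evidence of guilt", "evidence relevant to the charged offense"),
]

print(chain_conclusion("D fled the scene", links))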
Michael Pardo, "Relevance, Sufficiency, and Defeasible Inferences: Comments on Modeling Legal Proof"
This paper discusses criteria for formal models of legal proof at two levels: the micro-level issue of the relevance of particular items of evidence, and the macro-level issue of the sufficiency of the evidence as a whole to satisfy particular proof standards. At both levels, I examine criteria along two different dimensions. First, I explore a content-based distinction—whether the relationships between evidence and contested propositions ought to be modeled based on probabilistic or explanatory criteria. Second, I explore a structural distinction—whether the inferences being modeled ought to be modeled as defeasible or non-defeasible. I conclude that modeling legal proof based on defeasible, explanatory criteria provides a more plausible avenue than the alternatives, but also that there appear to be significant limitations on the utility of such models.
Federico Picinali, "Structuring inferential reasoning in criminal cases. An analogical approach"
The paper proposes a normative theory of inferential reasoning in criminal cases. The straightforward approach adopted in the work essentially consists in structuring factual inference and then exploring the dynamics of the structure as they are influenced by the requirements of evidence law, including, in particular, the standard of proof.
Factual inference is conceived of as having three components: a generalization, a probability statement attached to the generalization, and an analogy. The third component of this three-part structure is understood in terms of its classical meaning of “resemblance of relations”. Analogy performs a pivotal role: it puts the structure into “motion” by translating a general statement into a singular one.
While analogical reasoning has been widely studied in connection with legal reasoning and legal adjudication, little attention has been devoted to it by evidence law scholars. This paper aims to show that an analogy-based theory of inferential reasoning is a useful tool for investigating the characteristics of factual inference. In particular, the work claims that viewing juridical fact finding through the lens of the proposed normative theory has the following three main merits.
First, the theory sketched in this paper makes it possible to incorporate in a single framework the important insights of different approaches to “reasoning under uncertainty” in a way that is consistent with the demands of the evidence law.
Second, the theory of inference presented here helps the assessment of some evidential problems that have been widely discussed by scholarship in recent years. By considering these problems in light of the tripartite inferential structure herein described, the paper attempts to clarify their nature and provides either tentative solutions or a solid foundation for further discussion.
Third, the proposed conceptualization allows for a functional taxonomy of reasonable doubts, a taxonomy that can automatically be derived from the tripartite inferential structure sketched in the paper.
The discussion here is limited to fact finding in criminal trials. Indeed, the peculiar standard of proof that the evidence law demands in this area necessarily informs any contextual normative theory of inference, influencing both its ends and its constituents.
Henry Prakken, "Can non-probabilistic models of legal evidential inference learn from probability theory?"
Recent miscarriages of justice in the Netherlands have led to increasing interest in Dutch legal practice in scientifically founded ways of thinking about evidence. In the resulting debate between academics and legal practitioners there is a tendency to conclude that the only scientifically sound way to perform legal evidential inference is in terms of probability theory. However, as is well known in our research communities, the languages of probability theory and the law are miles apart, which creates the danger that when courts attempt to model their evidential reasoning as probabilistic reasoning, the quality of their decisions will not increase but decrease.
For this reason many have proposed alternative models of evidential legal inference. Recently it has been claimed that AI accounts of argumentation and scenario construction are easier to apply in legal settings than probability theory. However, this raises the question of to what extent such models violate the insights of probability theory. In this talk I will discuss this question by comparing alternative models of some examples.
D. Michael Risinger, "Against Symbolization—Some reflections on the limits of formal systems in the description of inferential reasoning and legal argumentation"
"There are few, if any, useful ideas in economicsthat cannot be expressed in clear English."John Kenneth Galbraith, The New Industrial State 419 (3rd ed, 1978)
When Jeremy Bentham wanted to summarize his felicific calculus, he wrote a mnemonic verse. When modern rational choice and expected utility theorists do the same, they write a formally symbolized expression. I want to suggest that Bentham’s instinct in this regard was superior. Formal symbolization, and its implication of an underlying mathematizability, has great and fecund power when something approaching defensible numerical values is available or can be made available. This power is what justifies the loss in general availability that rendering things in specialized symbolic language entails. But when such values are not available, and are not likely to become available, then symbolization becomes an act of mystification with very little benefit and the potential for much mischief.
Boaz Sangero (with Mordechai Halpert), "Proposal to Reverse the View of a Confession: From Key Evidence Requiring Corroboration to Corroboration for Key Evidence"
Both case law and legal literature have recognized that all evidence, and not just clearly statistical evidence, is probabilistic. Therefore, we have much to learn from the laws of probability with regard to the evaluation of evidence in a criminal trial. The present article focuses on the confession. First, we review legal and psychological literature and show that the probability of a false confession and, consequently, a wrongful conviction, is far from insignificant. In light of this, we warn against the cognitive illusion, stemming from the fallacy of the transposed conditional, which is liable to mislead the trier of fact in evaluating the weight of a confession. This illusion occurs when the trier of fact believes that, if there is only a low probability that an innocent person would falsely confess, then there is also only a low probability of innocence in each and every case where a person does confess guilt. The surprising truth is that even if there is little doubt regarding the credibility of confessions in general, in some cases there remains considerable doubt regarding the certainty of a conviction. We demonstrate this through the case of George Allen, who was convicted in 1983 of the rape and murder of Mary Bell. This is an example of a case in which the fallacy reaches extreme proportions, since nothing connected the accused to the crime apart from his confession. Following this, we turn to a Bayesian calculation of probability for evaluating the weight of a confession. The probabilistic calculation that we perform dictates a new and surprising conclusion that calls for a significant reversal in how we view the confession: a confession should be treated only as corroboration of other solid evidence—if it exists—and not as key evidence for a conviction. Given the real danger of convicting innocents, we call on law enforcement officials to refrain from interrogating a person with the aim of extracting a confession when there is no well-established suspicion against that person, even when the law allows for such an interrogation. Moreover, we call on legislatures to amend the law so that such an interrogation would not be possible, and to provide that a confession is insufficient to constitute the sole, or key, evidence for a conviction, but can be used only as corroboration for other key evidence—if it exists.
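A minimal worked example of the transposed-conditional point, with invented numbers (these are not figures from Sangero and Halpert): Bayes' theorem shows that a low probability of a false confession by an innocent person is compatible with a substantial probability that a confessing suspect is innocent.

# Illustrative Python sketch; all probabilities below are assumptions.
p_confess_given_guilty = 0.50    # assumed
p_confess_given_innocent = 0.02  # "low probability that an innocent person confesses"
p_guilty = 0.10                  # assumed share of guilty persons among those interrogated

p_confess = (p_confess_given_guilty * p_guilty
             + p_confess_given_innocent * (1 - p_guilty))
p_innocent_given_confess = p_confess_given_innocent * (1 - p_guilty) / p_confess

print(f"P(confession | innocent) = {p_confess_given_innocent:.2%}")
print(f"P(innocent | confession) = {p_innocent_given_confess:.2%}")
# With these assumed inputs, a 2% false-confession rate still leaves roughly a
# 26% chance that a confessing suspect is innocent.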
Giovanni Sartor (with Giuseppe Contissa): "Evidence arguments in air traffic safety. A model for the law?"
Eurocontrol, the European institution in charge of air traffic control, requires a safety assessment both for an on-going operation (Unit Safety Case) and for major changes to that operation (Project Safety Case). This assessment is based on the idea of a Safety Case, understood as the "presentation of Argument and Evidence that an overall claim is true". In such an assessment, safety must be demonstrated, with the support of relevant evidence, with regard both to Success Cases and to Failure Cases. In this contribution we will consider whether this argument-based approach may also be appropriate for presenting legally relevant evidence (on safety and on its failures), and how safety arguments could be improved with argument models and structures developed within evidence and AI & law research. In particular, we shall address the prospects of the ALIAS project, which aims at developing a Legal Case, understood as a method for providing evidence that operations, or changes to them, comply with legal requirements and provide a proper allocation of liabilities.
Peter Tillers, "A Rube Goldberg Approach to Fact Investigation, Evidential Inference, and Factual Proof in Legal Settings"
There is no single logical or analytical process that characterizes effective human deliberation about ambiguous, incomplete, and inconclusive evidence about facts. Some or many of the early proponents of AI knew this: They knew or believed that intelligent creatures -- like those who possess mechanisms like our neural brains -- have a variety of distinct information processing mechanisms (e.g., sensory mechanisms, mechanisms for storing sensory data, etc.) that are clumped together and that, taken together, somehow manage to generate sweet inferential and epistemic music. The work that I did with David Schum led me to an analogous conclusion: Investigating factual hypotheses and drawing inferences about factual hypotheses involve a variety of distinct marshaling processes that, despite their distinctiveness (or, perhaps one might even say, incommensurability), work together in a way not specifiable by any kind of recipe or strict logic to produce elegant and frequently accurate factual inferences. However, it does not follow and it is not true that structured deliberation about evidence is pointless.
Bart Verheij, "Can the argumentative, narrative and statistical perspectives on legal evidence and proof be integrated?"
In the study of legal evidence and proof, three theoretical perspectives can be discerned: argumentative, narrative, and statistical. In the argumentative perspective, it is considered how hypothetical facts are supported or attacked by the arguments that can be based on the available evidence. A key theoretical issue in this perspective is how a bundle of related argumentative elements (e.g. in the form of a Wigmorean chart) determines which hypothetical facts are justified and which are not. The issue has led to an intricate web of theoretical and mathematical studies of what might be called argumentation logics. As a result, it has become a tricky question how (and which) argumentation logics are relevant for the analysis of arguments as they occur in practice. In the narrative perspective, it is acknowledged that hypothetical facts never come alone, but instead occur in a context of relevance that takes the form of a coherent story that fits the evidence. The narrative perspective emphasizes the holistic nature of the judgment of evidential data and their analysis. A key theoretical puzzle for the narrative perspective is how to avoid the dangers of the persuasive properties of stories – cf. the warning that good stories can push out true stories – in particular by explicating the connection of narrative considerations with the context of justification that counts when evaluating the evidence. Another issue is how alternative stories are to be constructed and compared, thereby preventing tunnel vision and allowing a well-balanced investigation and decision about the facts of a case.
The statistical perspective focuses on quantitative analyses, in particular on the basis of the careful collection of empirical evidence. The statistical perspective builds on well-established mathematical and methodological theory and practice, and can for instance explain how the conditional probabilities that connect evidence and hypotheses (as they arise from empirical investigation or expert judgment) change in the light of new evidence. A key issue for the statistical perspective, with both theoretical and practical connotations, is that normally only a fragment of the necessary quantitative input information is available. A second issue is how the descriptive focus of the statistical method is to be connected to the normative context in which evidential decision making takes place.
In this talk, the question is investigated whether and to what extent it is possible to develop an integrating theoretical perspective in which argumentative, narrative and statistical considerations about legal evidence and proof find a natural unification.
Douglas Walton (with Floris Bex), "Combining Evidential and Legal Reasoning with Burdens and Standards of Proof"
In this paper, we provide a formal logical model of evidential reasoning with proof standards and burdens of proof that enables us to evaluate evidential reasoning by comparing stories on either side of a case. It is based on a hybrid inference model that combines argumentation and explanation, using inference to the best explanation as the central form of argument. The model is applied to one civil case and two criminal cases. It is shown to have some striking implications for modeling and using traditional proof standards like preponderance of the evidence and beyond reasonable doubt.
&&&
I expect there will be a bit of a donnybrook as well as more tempered scholarly discussion and deliberation.