Friday, April 08, 2011

The Government Shutdown & Essential Employees


Are Supreme Court Justices essential employees? Or can we do without them and will they all go to Maine for an early summer vacation?


&&&


The dynamic evidence page
It's here: the law of evidence on Spindle Law. See also this post and this post.

Thursday, April 07, 2011

Spindle Law Interview: Susan Crawford




See Interview with Susan Crawford



&&&

It's here: the law of evidence on Spindle Law. See also this post and this post.

Spindle Law Interview: Kenneth Feinberg



See Interview with Kenneth Feinberg

&&&

The dynamic evidence page
It's here: the law of evidence on Spindle Law. See also this post and this post.

Spindle Law Interview: Piper Hoffman



See Interview with Piper Hoffman of Outten & Golden

 




&&&

The dynamic evidence page
It's here: the law of evidence on Spindle Law. See also this post and this post.

Spindle Law Interview: Yvette Alberdingk Thijm

See Interview with Yvette Alberdingk Thijm

Yvette Alberdingk Thijm, Executive Director of WITNESS



&&&

The dynamic evidence page
It's here: the law of evidence on Spindle Law. See also this post and this post.

Spindle Law Interview: Burt Neuborne



See Interview with Burt Neuborne 



&&&

The dynamic evidence page
It's here: the law of evidence on Spindle Law. See also this post and this post.

Monday, April 04, 2011

The Hart-Fuller Debate Resurrected: What Is a Vehicle in a Park?

J. David Goodman, "Caught Between Sidewalk and Street," NYTimes (April 2, 2011):
“What is a bicycle?” asked the title of a panel discussion at an international bike conference last month in Spain. “An engine-less car or a pedestrian on wheels?”

...

In Central Park, the odd, middling nature of the bike was put into stark relief during the police crackdown that started in January and focused on the running of red lights. Because the ticketing effort included hours when the park was closed to cars, many cyclists saw it as a form of harassment.

&&&

The dynamic evidence page

It's here: the law of evidence on Spindle Law. See also this post and this post.

Tuesday, March 29, 2011

Rube Goldberg and Factual Proof in Legal Settings

My abstract for a One-Day Workshop on AI & Evidential Inference in Conjunction with ICAIL 2011, Pittsburgh, Pennsylvania, June 10, 2011:

Peter Tillers, "A Rube Goldberg Approach to Fact Investigation, Evidential Inference, and Factual Proof in Legal Settings"

There is no single logical or analytical process that characterizes effective human deliberation about ambiguous, incomplete, and inconclusive evidence about facts. Some or many of the early proponents of AI knew this: They knew or believed that intelligent creatures -- like those who possess mechanisms like our neural brains -- have a variety of distinct information processing mechanisms (e.g., sensory mechanisms, mechanisms for storing sensory data, etc.) that are clumped together and that, taken together, somehow manage to generate sweet inferential and epistemic music. The work that I did with David Schum led me to an analogous conclusion: Investigating factual hypotheses and drawing inferences about factual hypotheses involve a variety of disparate marshaling processes that, despite their distinctiveness (or, perhaps one might even say, incommensurability), work together in a way not specifiable by any kind of recipe or strict logic to produce elegant and frequently accurate factual inferences. However, it does not follow and it is not true that structured deliberation about evidence is pointless.



The dynamic evidence page

It's here: the law of evidence on Spindle Law. See also this post and this post.

Wednesday, March 23, 2011

The Nature of the Modern American University

Stanley Fish, "We're All Badgers Now," NYTimes (March 21, 2011):
If “universities are not corporations” ever was a good argument, it isn’t anymore because universities, always corporations in financial fact, become increasingly corporate in spirit every day; and if I and my colleagues are not employees, from whom do we receive salaries, promotions, equipment, offices, etc., and to whom are we responsible in the carrying out of our duties? (If it looks like a duck . . . .) It’s not God and it’s not (despite some claims to the contrary) students, and it’s not awestruck admirers of our dazzling intellects.
Of course, in a technical legal sense almost all American universities have been "corporations" for quite some time. The question here is whether American universities today are much like profit-seeking commercial corporations. I agree with my colleague Stanley Fish that in many respects and in most instances the answer is "yes."

&&&

The dynamic evidence page

It's here: the law of evidence on Spindle Law. See also this post and this post.

Friday, March 18, 2011

Tillers on Evidence and Inference on Tillers on Evidence and Inference

&&&

The dynamic evidence page

Update on Workshop on AI & Evidence, ICAIL 2011, Pittsburgh, June 2011


One-Day Workshop on AI & Evidential Inference in Conjunction with
ICAIL 2011, Pittsburgh, Pennsylvania, June 10, 2011

Workshop Chairs: Giovanni Sartor & Peter Tillers
Program Committee: Henry Prakken, Giovanni Sartor, Douglas Walton, & Peter Tillers
For more information please contact either Giovanni Sartor - giovanni.sartor at gmail.com - or Peter Tillers - peter.tillers at gmail.com
Panelists: Ronald J. Allen, Rainhard Bengez, Floris Bex, Scott Brewer, Craig Callen, James Franklin, David Hamer, Bruce Hay, Joseph Laronge, D. Michael Risinger, Michael Pardo, Federico Picinali, Henry Prakken, Boaz Sangero, Giovanni Sartor, Peter Tillers, Bart Verheij, Douglas Walton, Nanning Zhang
Subject: Computational methods and evidential inference in legal settings such as pretrial investigation and trials. Two foci of discussion will be (i) stories, narrative, or rhetoric, and evidential argument; and (ii) burdens of proof. Panelists will also be free to consider other topics, including, for example, (iii) evidential inference and statistical methods, and (iv) cognitive science, psychology, and inference.
If time allows, the Program Committee will review papers by other people for possible presentation at the workshop. Information about this possibility will be posted later.
Publication: The Oxford journal Law, Probability and Risk will publish those workshop papers that pass peer review.
Biographical details about panelists (listed in alphabetical order):
Ronald J. Allen, John Henry Wigmore Professor of Law at Northwestern University, in Chicago, Illinois. He did his undergraduate work in mathematics at Marshall University and studied law at the University of Michigan. He is an internationally recognized expert in the fields of evidence, procedure, and constitutional law. He has published five books and approximately eighty articles in major law reviews. The New York Times referred to him as one of the nation's leading experts on evidence and procedure. He has been quoted in national news outlets hundreds of times, and appears regularly on national broadcast media on matters ranging from complex litigation to constitutional law to criminal justice.
Topic: Taming Complexity: Law as Optimization

------

Rainhard Z. Bengez is a lecturer in philosophy and mathematics, and scholar in residence at TU München, Germany. After seven years of religious education he studied and graduated in mathematics, medicine, and physics. In his PhD in mathematics he worked on formal systems, their theoretical limits, and their limits concerning practical application. Together with Lothar Phillips he developed the concept of quantitative justice and fairness and its impact on decision systems and on the idea of humanity. The overall theme of his research interests is questions and problems concerning structures of epistemic phenomena and how to gain epistemic insight by problem transformation and by using multiple (even unconventional) methods. Some of his recent projects:
(i) Integration of technology (esp. automata) in society, esp. Theory of Trust(!); (ii) Morality, emotions, and machines (chances and limits); (iii) Structure of energy systems (esp. in Europe, and especially in Germany, we have reached a point at which we cannot base our conclusions on historical data and assumptions because we are trying to change the historical (infra)structure completely; thus, we have to rethink given ontic and ontological concepts and extend our set of terms, which will or may enable our engineers to design/create new structures with a deeper connection to society and to include questions concerning complex systems); (iv) Philosophy for children (children develop questions concerning philosophy of science, technology, and society by using issues of natural sciences and technology according to their school curriculum); and (v) Art & Science – the many ways of epistemic insight.
He has been awarded grants by German states for developing new ways of teaching functional illiterates mathematics, together with Marie-Cecile Bertau for her Gilgamesh project, as well as for his Theory of Trust project.
Topic: On the Computable Structure of the Logocratic Method and Analyses Specific to Evidence Law

------

Floris Bex, postdoctoral research assistant, Argument Research Group, University of Dundee. Bex has written a comprehensive dissertation on reasoning with legal evidence and proof. In this dissertation he presents an informal theory, which caters to an (informal) philosophical and legal audience, as well as a more formal logical theory, aimed at an AI-oriented audience. His work has been frequently presented at the relevant conferences (Jurix, ICAIL). Since 2003, Floris has published four major journal papers on the subject of reasoning with evidence. At the University of Groningen (2005 - 2009), Floris taught law students the basics of thinking about evidence and scenarios, and he was recently (2010) invited by the court of appeals in Arnhem to give a seminar about reasoning with evidence. Together with Henry Prakken, Floris gave workshops on evidential reasoning at the IVR legal theory conference in Cracow (2007) and at a recent conference for Dutch judges and legal professionals (2010).
Topic (with Douglas Walton): Combining Evidential and Legal Reasoning with Burdens and Standards of Proof

------

Scott Brewer, professor of law, Harvard Law School. Research Interests: Philosophical Aspects of Legal Thought. Education: SUNY at Stony Brook B.A. 1979, Philosophy and Religious Studies; Yale University M.A. 1980, Philosophy; Yale Law School J.D. 1988; Harvard University Ph.D. 1997, Philosophy. Appointments: Lecturer on Law, 1988; Assistant Professor of Law, 1991; Professor of Law, 1998. Representative Publications: Brewer, Scott. "Scientific Expert Testimony and Intellectual Due Process," 107 Yale Law Journal 1535 (1998); Brewer, Scott. "Exemplary Reasoning: Semantics, Pragmatics, and the Rational Force of Legal Argument by Analogy," 109 Harvard Law Review 923 (1996).
Topic: Representing Legal Arguments: The Centrality of Abduction

------

Craig Callen, Judge John D. O'Hair Professor of Evidence & Procedure, Michigan State University School of Law. Callen's research in evidence for the last three decades has focused on the lessons of psychology, math, logic, statistics and AI for evidence, with particular emphasis on information processing theory.

------

James Franklin, professor, School of Mathematics and Statistics, University of New South Wales. Brief History. Franklin's undergraduate work was at the University of Sydney (1971-75). He completed his PhD in 1981 at Warwick University, on algebraic groups. Since 1981 he has taught in Mathematics at UNSW. His book What Science Knows: And How It Knows It (Encounter) was published in 2009. His book Catholic Values and Australian Realities appeared in 2006. Franklin's book Corrupting the Youth: A History of Philosophy in Australia was published by Macleay Press in 2003. His book The Science of Conjecture: Evidence and Probability Before Pascal (Johns Hopkins University Press) appeared in 2001. Franklin's research areas include the structuralist philosophy of mathematics and the 'formal sciences' (He is a member of the Sydney School), Australian Catholic history, the parallel between ethics and mathematics, restraint, the quantification of rights in applied ethics, and the analysis of extreme risks.
Topic: How much of commonsense and legal reasoning is formalizable? A review

------

David Hamer is Associate Professor in the Law Faculty of the University of Sydney. He has undergraduate degrees in both science and law from the Australian National University and a PhD from the University of Melbourne. His dissertation examined probabilistic models of burdens and standards of proof. He has published a series of articles in leading journals in Australia and the UK applying probability theory to various aspects of evidence, proof and justice, including causation, the right to silence, civil and criminal standards of proof, delayed complaints and double jeopardy.
Topic: A probabilistic model of the relationship between the quantity (weight) of evidence, and its strength

----------

Bruce Hay, professor of law, Harvard Law School. Research Interests: Economics of Procedure and Litigation; Evidence; Legal Theory. Education: University of Wisconsin B.A. 1985, Political Science and French; Harvard Law School J.D. 1988. Appointments: Assistant Professor of Law, 1992; Professor of Law, 1998. Representative Publications: "Manufacturer Liability for Harm Caused by Consumers to Others," 94 American Economic Review 1700 (2005) (authored with K. Spier); "Sting Operations, Agents Provocateurs, and Entrapment," 70 Missouri Law Review 387 (2005); "'Sweetheart' and 'Blackmail' Settlements in Class Actions: Reality and Remedy," 75 Notre Dame Law Review 1377 (2000) (authored with D. Rosenberg); "Burdens of Proof in Civil Litigation: An Economic Perspective," 26 Journal of Legal Studies 413 (1997); "Allocating the Burden of Proof," 72 Indiana Law Journal 618 (1997).
Topic: ...

--------

Joseph Laronge, Senior Assistant Attorney General, Oregon Department of Justice: Laronge has been a trial and appellate attorney for 35 years. For the last ten years, he has made extensive use of argument mapping (e.g., Rationale software and embodied metaphoric argumentation visual languages) in trial illustrative exhibits and court briefs as ancillary support for factual and legal inferential arguments. During this period, he has applied these argument mapping approaches for teaching advanced legal reasoning skills as an adjunct professor of law in Advanced Argumentation at Lewis & Clark Law School and as a former associate with Austhink Consulting. He previously taught the fundamentals of legal reasoning as an adjunct professor of law in Legal Research & Writing at Willamette University Law School. From these experiences, Laronge found that the multiplicity of modes of inference and argument schemes was an impediment to the enhancement of law students’ legal reasoning skills and the clear and rigorous representation of such argumentation in court. To help overcome this obstacle, he developed a single generalizable structure of inferential proof named defeasible class-inclusion transitivity (DCIT). Since 2005, Laronge has used DCIT successfully as a persuasive universal structure of factual and legal inferential reasoning in trial and appellate court real-world applications. An explanation of its theoretical foundation and its generalizable application in court is discussed in “A Generalizable Argument Structure Using Defeasible Class-inclusion Transitivity for Evaluating Evidentiary Probative Relevancy in Litigation,” J Logic Computation, first published online December 2, 2009, doi:10.1093/logcom/exp066.
Topic: Evaluating Universal Sufficiency of a Single Logical Form for Inference in Court

------

Michael Pardo, associate professor of law, University of Alabama School of Law, writes and teaches in the areas of evidence, criminal procedure, civil procedure, and jurisprudence. His scholarship explores a variety of philosophical issues in these areas, with a particular focus on epistemological issues regarding evidence and legal proof. His recent scholarship also examines philosophical and evidentiary issues pertaining to law and neuroscience. Professor Pardo is the author of several publications in law reviews, including the Boston College, Illinois, Northwestern, Texas, and Iowa Law Reviews, among others, and in peer-reviewed journals, including Legal Theory, Law and Philosophy, and the Journal of Legal Studies, among others. His article, “The Field of Evidence and the Field of Knowledge,” was presented at the Stanford/Yale Junior Faculty Forum in the jurisprudence and philosophy category. Professor Pardo is also a co-author of the fifth edition of Evidence: Text, Problems, and Cases (Aspen, forthcoming, with Allen, Kuhns, Swift, and Schwartz) and a forthcoming book on law and neuroscience (with Dennis Patterson). Professor Pardo is currently the Chair-Elect of the American Association of Law Schools Section on Evidence. He also serves as the U.S. book review editor of International Commentary on Evidence. Professor Pardo joined the Alabama Law Faculty in 2005. Prior to joining the faculty, he was a visiting assistant professor at Chicago-Kent College of Law and at Northwestern University School of Law. Professor Pardo received his JD from Northwestern University School of Law.
Topic: Relevance, Sufficiency, and Defeasible Inferences: Comments on Modeling Legal Proof

------

Federico Picinali, LLM student at the Yale Law School; PhD student at the Università degli Studi of Trento, Italy. Former Visiting Researcher at UC Hastings, Cardozo School of Law and Penn Law. Former Exchange Student at the UCB Boalt Hall School of Law. Picinali received a Degree in legal sciences and a Specialized degree in law at the Università degli Studi of Milan, Italy. His research interests and publications concern the influence of fact finding on substantive criminal law principles, inferential reasoning, the beyond a reasonable doubt standard, the comparison between "legal reasoning" and "factual reasoning."
Topic: Structuring inferential reasoning in criminal cases. An analogical approach

------

Henry Prakken, lecturer in the Intelligent Systems Group of the computer science department at Utrecht University, and professor of Law and IT at the Law Faculty of the University of Groningen. Prakken has master's degrees in law (1985) and philosophy (1988) from the University of Groningen. In 1993 he obtained his PhD degree at the Free University Amsterdam with a thesis titled Logical Tools for Modelling Legal Argument. His main research interests concern logical and dialogical aspects of argumentation, and the application of argumentation in legal reasoning, multi-agent systems and other domains. Prakken was the ICAIL program chair in 2001 and the ICAIL president in 2008-9. He has had papers accepted at every ICAIL conference since 1991 except in 1999 (when he gave a tutorial). Prakken co-organised workshops on legal reasoning about evidence at ICAIL 2001 and IVR 2007 and a conference on a similar topic in New York. He has published regularly about evidence since 2001.
Topic: Can non-probabilistic models of legal evidential inference learn from probability theory?

------

D. Michael Risinger, John J. Gibbons Professor of Law. Seton Hall University School of Law. Risinger holds a B.A., magna cum laude, from Yale University, and a J.D., cum laude, from Harvard Law School. He clerked for the Honorable Clarence C. Newcomer of the United States District Court for the Eastern District of Pennsylvania. He is a past chair of the Association of American Law Schools Section on Civil Procedure, the immediate past chair of the AALS Section on Evidence, and a life member of the American Law Institute. He was also a member of the New Jersey Supreme Court Committee on Evidence for 25 years, which was responsible for the current version of the New Jersey Rules of Evidence. Professor Risinger moved to Seton Hall Law School in 1973. He served as a visiting senior fellow on the law faculty of the National University of Singapore from 1985-1986. Professor Risinger has published in the areas of evidence and civil procedure. He is the co-author of Trial Evidence, A Continuing Legal Education Casebook and the author of two chapters in Faigman, Kaye, Saks and Cheng, Modern Scientific Evidence (“Handwriting Identification” and “A Proposed Taxonomy of Expertise”). Professor Risinger was selected as one of Seton Hall’s two inaugural Dean’s Research Fellows (2002-2004) and was named the John J. Gibbons Professor of Law in May 2008. His scholarship has recently concentrated on wrongful convictions as well as expert evidence issues.
Topic: Against Symbolization—Some reflections on the limits of formal systems in the description of inferential reasoning and legal argumentation

------

Boaz Sangero, Professor of law and Head of the Criminal Law & Criminology Department at the Academic Center of Law & Business, Israel. He received his LL.D. from the Hebrew University of Jerusalem in 1994. He has written over 40 articles and books, in Israel, England, and the United States. His recent book Self-Defence in Criminal Law (Hart Publishing, 2006) has been cited many times and reviewed in the Oxford Journal of Legal Studies and in the Cambridge Law Journal. Prof. Sangero’s areas of research include substantive criminal law, criminal procedure, criminal evidence, and sentencing. He particularly looks for ways to reduce miscarriages of justice. Representative publications: "A New Defense for Self-Defense", 9 Buffalo Crim. L. Rev. 475 (2006); "Miranda Is Not Enough: A New Justification for Demanding “Strong Corroboration” to a Confession", 28 Cardozo L. Rev. 2791 (2007); "Why A Conviction Should not be Based on A Single Piece of Evidence: A Proposal for Reform" (co-writer – Dr. Mordechai Halpert), 48 Jurimetrics: The J. of L., Science, and Technology 43 (2007); "From a Plane Crash to the Conviction of an Innocent Person: Why Forensic Science Evidence Should Be Inadmissible Unless it has been Developed as a Safety-critical System" (co-writer – Dr. Mordechai Halpert), 32 Hamline L. Rev. 65 (2009); Heller's Self-Defense, 13 The New Criminal Law Review 449 (2010).
Topic (with Mordechai Halpert): Proposal to Reverse the View of a Confession: From Key Evidence Requiring Corroboration to Corroboration for Key Evidence

-----

Giovanni Sartor, professor of Legal Informatics and Legal Theory at the European University Institute of Florence and at the University of Bologna. He obtained a PhD at the European University Institute (Florence), worked at the Court of Justice of the European Union (Luxembourg), was a researcher at the Italian National Council of Research (ITTIG, Florence), held the chair in Jurisprudence at Queen’s University of Belfast (where he is now honorary professor), and was Marie-Curie professor at the European University Institute of Florence. He is President of the International Association for Artificial Intelligence and Law. Sartor has published widely in legal philosophy, computational logic, legislation technique, and computer law. His publications include Corso di informatica giuridica (Giappichelli, 2008), Legal Reasoning: A Cognitive Approach to the Law (Springer, 2005), The Law of Electronic Agents (Oslo: Unipubskriftserier, 2003), Judicial Applications of Artificial Intelligence (Dordrecht: Kluwer, 1998), Logical Models of Legal Argumentation (Dordrecht: Kluwer, 1996), and Artificial Intelligence in Law (Oslo: Tano, 1993).
Topic: ...

------

Peter Tillers, professor of law, Cardozo School of Law, Yeshiva University. Tillers is a reviser of John Henry Wigmore's multi-volume treatise on the law of evidence and has published a variety of articles on evidence, inference, and investigation. He is an editor of the Oxford journal Law, Probability and Risk. He is former chairman and secretary of the Evidence Section of the Association of American Law Schools. He was a Fellow of Law & Humanities at Harvard University and an Alexander von Humboldt & Senior Max Rheinstein Fellow at the University of Munich. He was a visiting professor at Harvard Law School in the spring semester of 2002. Tillers was legal adviser for the Latvian mission to the United Nations during the 48th Session of the General Assembly. He maintains a website with discussion of a wide range of general issues of evidence. Tillers' scholarship focuses on evidential inference and fact investigation in legal settings. He maintains that multiple methods of marshaling and analyzing evidence are important in trials, in pretrial investigation and informal fact discovery, and in other domains. He believes that inference networks offer a useful window into investigative discovery and proof at trial. But he believes that subjective, synthetic, and gestalt-like perspectives on evidence, inference, and proof are also essential.
Topic: A Rube Goldberg Approach to Fact Investigation, Evidential Inference, and Factual Proof in Legal Settings

------

Bart Verheij, lecturer and researcher at the University of Groningen, Department of Artificial Intelligence and a member of the ALICE institute. He participates in the Multi-agent systems research program. His research interests include argumentation, rules and law, with emphasis on defeasible argumentation, legal reasoning and argumentation software. As research methods, he uses formal analysis (in the styles of logic and analytic philosophy), software design, algorithm implementation, agent-based social simulation, controlled experiment, observation, and thinking and exploring. His research field is interdisciplinary, and includes artificial intelligence, argumentation theory and legal theory.
Topic: Can the argumentative, narrative and statistical perspectives on legal evidence and proof be integrated?

-----

Douglas Walton holds the Assumption University Chair in Argumentation Studies and is Distinguished Research Fellow of the Centre for Research in Reasoning, Argumentation and Rhetoric (CRRAR) at the University of Windsor. He serves on the editorial boards of several journals, including Informal Logic, Argument and Computation, and Artificial Intelligence and Law. He is the author of forty-five books and three hundred refereed papers in the areas of argumentation, logic and artificial intelligence. The books include Witness Testimony Evidence (Cambridge University Press, 2008), Fundamentals of Critical Argumentation (Cambridge University Press, 2006), and Legal Argumentation and Evidence (Penn State Press, 2002).
Topic (with Floris Bex): Combining Evidential and Legal Reasoning with Burdens and Standards of Proof

Tuesday, March 15, 2011

Breathless

The law school world breathlessly awaits the 2011 US News & World Report rankings. They're out! We're number xx. [smile, frown]

When you approach death's door, law professor, will you care that your law school was ranked, say, 34th rather than 39th or, say, 50th rather than 54th? Or will it matter more to you that you contributed something to the world (or that you did not)?

Such jeremiads aside, the rankings are probably useful to law school applicants.

&&&

The dynamic evidence page

It's here: the law of evidence on Spindle Law. See also this post and this post.

Thursday, March 10, 2011

Legal History in Judicial Decisionmaking

Thomas Y. Davies, 100 Journal of Criminal Law & Criminology 940 (2010) (footnotes omitted):
The essential rule for recovering authentic legal history is to never take judicial statements about that history at face value. Judges routinely innovate and change existing doctrine, but they typically cover up their innovations by inventing fictional accounts of precedent and history. Sycophantic academics then come along and embellish the judicial fictions. (How else are commentators to get “cited” in Supreme Court opinions?) To complete the cycle, the justices then cite the commentaries as confirmations of their own inventions. The overall result is that the conventional doctrinal history that is derived from judicial claims often turns out to be drastically different from the authentic history.

&&&

The dynamic evidence page

It's here: the law of evidence on Spindle Law. See also this post and this post.

Fabricated DNA

Kristen Bolden, Note: "DNA Fabrication, A Wake Up Call: The Need to Reevaluate the Admissibility and Reliability of DNA Evidence," 27 Georgia State Law Review 409 (2011) (footnotes omitted):
In June 2009, Israeli forensic science researchers published a ground breaking study that put credence to the possibility of creating artificial Deoxyribonucleic Acid (DNA) that can fool current forensic testing procedures. The researchers asserted that anyone with the proper equipment and basic understanding of molecular biology could create artificial DNA in virtually unending amounts. Furthermore, the research demonstrates that the current American forensic science system utilized by law enforcement is incapable of distinguishing between artificial and genuine DNA.


&&&

The dynamic evidence page
It's here: the law of evidence on Spindle Law. See also this post and this post.



Justice Scalia and the Future of Crawford

Linda Greenhouse has some thoughts here about this matter. But perhaps Greenhouse is a bit too hard on Scalia. After all, even if Crawford's and Scalia's formalistic and originalist solution to the riddle of the meaning of the Confrontation Clause makes no sense (which I happen to believe is true), there is something quite admirable about this supposedly conservative Justice's dedication to principle (as he sees it), principle that, in the new case that Greenhouse discusses, happens to favor a criminal defendant.

&&&

The dynamic evidence page

It's here: the law of evidence on Spindle Law. See also this post and this post.

quantius.org




You might be interested in quantius.org

Sunday, March 06, 2011

The Structure and the Logic of Proof in Trials

Discussion Paper: The Structure and the Logic of Proof in Trials
(Draft; final paper: advance access, Law, Probability and Risk, 24 February 2011, doi:10.1093/lpr/mgq014)

If one travels across time, continents, and cultures, one needs to attend to the interests and attitudes of one's audience. If one doesn't do so, one risks being thought strange. Of course, the same may hold true if one travels just from New York City to Texas (a mere 1,600 miles or so). Many years ago I traveled from New York City, where I was teaching at the time, to Houston, Texas, to give a talk at one of the law schools in Houston. At the time I was much preoccupied with the work that had been done by people in artificial intelligence. I was interested in such stuff because I wondered about the possible applications of that research to the study of evidentiary processes in litigation and adjudication. In my talk in Houston I tried to explain my tentative conclusion that artificial reasoning methods (including statistical methods) could not supplant ordinary methods of reasoning. After the talk, it soon became apparent that some or much of the audience viewed me as closely akin to a man from Mars. [1. I didn't get the job. But I had the good sense to withdraw my candidacy before the school had a chance to reject me.] In this paper I will use some words – words such as “ontology” – that may seem strange or repellent to some or many readers. So I thought I would begin by talking briefly about some of my motivations for talking in ways that may seem strange or unpleasant to some or many readers.

Several decades ago I revised the first volume of John Henry Wigmore's multi-volume treatise on the law of evidence.[2. 1 & 1A Wigmore on Evidence (Little Brown & Co.: Peter Tillers rev. 1983).] Being a reviser is both harder and easier than being an author of one's own book. It is easier because one can, if one wishes, take the position of a critic and commentator rather than that of author. And in dealing with some or many of the theoretical portions of Wigmore's magnum opus, I often did exactly that. However, after finishing my revision, I agreed to write a successor to my revision. Now, after a delay of many years, I am doing that. The job of this successor volume, like part of its precursor, is to examine “theoretical considerations” that bear on the law of evidence and proof. This means I can no longer just be a commentator. Now I must present my own “theory of proof.”

In looking at what others had done in trying to develop a theory of the law of evidence and proof, I saw many impressive accomplishments. I also saw a variety of approaches. On the one hand, some of the authors I read seemed to focus on the abstract logic of uncertain reasoning; and their theory of the law of evidence and proof (a theory that they often called a theory of “relevance”) effectively amounted to their view of the nature of logical thinking about uncertain factual propositions.[3. See, e.g., Edmund Morgan, Basic Problems of Evidence 183-188 (Joint Committee on Continuing Legal Education, ALI & ABA: 2d ed., 1962).] It seemed to me then – and it seems to me now – there is something wrong with this approach. Above all, I wondered and I wonder still how it can be confidently said that the methods of reasoning and demonstration in a particular legal system (such as the American one) rest on and express such logic.

On the other hand, other American Evidence scholars (too numerous to mention here) seemed to do little more than catalogue some of the features they thought and said were characteristic of the American law of evidence and American methods of proof. These catalogues, taken as such, were sometimes interesting. But I couldn't see and I still cannot see how such catalogues, in and of themselves (if taken at face value), could be considered a “theory” of the American law of evidence or American methods of proof. Catalogues, taken as such, are mere lists of things, not explanations.

In more recent years some legal scholars have charted a middle course between these two extremes. Legal scholars such as William Twining, Paul Roberts, and Adrian Zuckerman have not tried to reduce the systems of proof they studied either to logic or to contingency. These scholars have instead emphasized what might be called the force of cultural and normative ideals in the workings of the law of evidence and of evidentiary processes in settings such as criminal trials.[4. In Criminal Evidence (Oxford: 2004) Paul Roberts and Adrian Zuckerman invoke five central principles to explain, they say, the main features of the law of criminal evidence in England. Id. at pp. 18-22. In Rethinking Evidence: exploratory essays (2nd ed., 2006) (as well as elsewhere) William L. Twining describes what he calls the “rationalist” tradition of evidence scholarship. At p. 76 of that book he even provides a helpful table that summarizes the properties that that scholarly tradition ascribes to the methods of inference and proof used in trials following the common law tradition. (But he views these attributes of inference and proof as forming an “ideal type” rather than an actual and precise characterization of any actually-existing system of juridical proof.)] However, these same scholars (particularly W.L. Twining) are generally not willing to regard the law of evidence in this or that country as nothing more than a cultural artifact. For example, Twining apparently thinks that logic is also at work in the proof process in law – at least sometimes and to some degree.[5. In various works William Twining advocates the use of neo-Wigmorean analytical methods to advance what he presumably views as “rational” methods for lawyers to participate in the process of juridical proof. See, e.g., Terence Anderson, David Schum & William Twining, Analysis of Evidence (2nd ed., 2005). However, Twining has relatively little to say about the rationality or irrationality of specific rules of evidence such as the hearsay rule and the best evidence rule.]

I think this third way of looking at juridical proof – as being neither pure logic nor pure accident – is, roughly speaking, the correct one. But to say that is not to say a whole lot. Where do we go from here? More specifically, where do I go from here? What if anything can I add to what insightful and masterful scholars such as Twining, Roberts, and Zuckerman have said and written?

I yearn to extract timeless lessons – or, in any event, relatively timeless lessons – from my study of juridical proof in America. If I am to have any hope of doing that, I think I must turn to ontology – that is to say, I think I must talk about the fundamental nature of things, including the fundamental nature of the human animal. However, if an ontology is to be of any substantial use to me, it cannot amount to the teasing out of the necessary consequences or implications of the unchanging nature of things and human beings. That sort of ontology would likely generate an ideal model of juridical proof, but not an explanation of actual systems and practices of juridical proof. I want and need an ontology – a theory of nature and of human nature – that allows for contingency as well as necessity. In addition to that, however, I yearn for an ontology that allows reason to exist in contingency and accident (to exist, that is, in contingency, not just co-exist with it).

Can I have all that I want? That remains to be seen.

We all know what some of the necessary starting points must be. We must concede, I think, that human beings have limited amounts of time and limited resources. Furthermore, we must now concede, I think, that all or almost all factual questions have uncertain answers and that nothing we can do can eliminate all uncertainty about most factual hypotheses.

So far so good, yes? But what does this tell us about the nature of juridical proof?

Perhaps it tells us quite a bit. One might argue that given the realities of human existence that have been recognized so far, we know that a system of juridical proof must draw uncertain factual inferences about factual questions in a limited amount of time and with limited resources – and, knowing that, we at least know that if we are to understand factual proof we must understand the logic of uncertain inference and the workings of the logic of inference under resource and time constraints. So, to understand proof, we must understand the logic and economics of uncertain inference. There are, of course, quarrels about the nature of the logic of uncertain inference and about how scarcity constrains and channels uncertain inference. But at least – so it might be argued – we know what we have to study and understand if we are to understand juridical proof.

But there is something wrong with this hypothesis. The error is hinted at by one question: How do we know that actual systems of juridical proof (if, that is, they deserve to be called “systems”) aim at establishing the truth about the world? Furthermore, even if we concede that truthfinding is one of the aims of any system of juridical proof, how do we know how important – how comparatively important – that aim is?

These questions point to an important feature of actual systems of juridical proof: Proof practices in legal settings are social and cultural phenomena that have multiple purposes; when viewed from the perspective of the norm of truthfinding, juridical proof has many “accidental” features – and nothing in heaven (or on earth) dictates what those “accidental” purposes are, how important they are, or what the tradeoffs are between such accidental purposes and truthfinding. Given these realities, it is probably not possible to deduce the necessary characteristics of juridical proof (except at a very abstract level, one from which deductions about specific historical proof practices can rarely be drawn).

Are we then reduced to embracing the question-begging proposition that ontology reveals the nature of juridical proof to the extent that juridical proof seeks to establish the truth about the world?

I think that is not the limit of what ontology has to teach us about the actual and necessary workings of juridical proof. I say that because modern ontology teaches us that the human animal is an evolving intelligent organism. This feature of our existence (in addition to the features of time and resource constraints) also has some necessary implications for the workings of rational juridical proof.

I cannot spell out all of the implications in this short paper. Permit me to mention just two possible implications of this fact about the present character of human existence.

First, because human beings are natural organisms, human beings will and must use tacit, ingrained, and subterranean knowledge and “information processing mechanisms” to reach conclusions about the world.[6. The epistemological and inferential theory I sketch here builds on the neo-Aristotelian theory I mentioned in my essay Peter Tillers, Crime, Procedure, and Evidence in a Comparative and International Context 179 (Hart, 2008).] This fact in turn has a variety of implications. For example, it generally means that no conceptual apparatus can hope to replace the inferential mechanisms that human beings use to draw conclusions about the world; and it means that, in general, the job of explicit inferential methods is, to the extent possible, to make the relatively implicit, the partially submerged, more explicit and less submerged. The person who more than any other has adopted roughly this perspective on representations of evidential inference is Timothy van Gelder. Van Gelder views such representations as tools that can “augment” existing human cognitive capacities – rather than as devices that replace defective human cognitive processes. At the very beginning of a seminal article [7. Timothy van Gelder, “The Rationale for Rationale,” 6 Law, Probability and Risk 23 (2007).] about his software Rationale – and, more generally, about formal representations of evidential inference – van Gelder tellingly quotes, with approval, a passage by D.A. Norman:

The power of the unaided mind is highly overrated. Without external aids, memory, thought, and reasoning are all constrained. But human intelligence is highly flexible and adaptive, superb at inventing procedures and objects that overcome its own limits. The real powers come from devising external aids that enhance cognitive abilities.[8. Id. at p. 24 (quoting D.A. Norman, Things That Make Us Smart: Defending Human Attributes in the Age of the Machine (Reading, MA: Addison Wesley, 1994)).]
As this quotation makes plain, van Gelder most definitely does not abjure logic. But he does believe that representations of logical evidential inference can complement naïve cognitive capacity. In an e-mail conversation with me he referred to such representations as “extrospection” – as contrasted with introspection. This neologism (which he claims he did not invent) evokes what I have in mind when I talk about the implications of thinking of individual human beings as “evolving intelligent organisms,” for the enterprise of constructing formal representations of evidential inference.

Second, the material that the human mind must excavate to guess at the proper workings of the human mind is not just the workings of one’s own psyche and mental processes. The philosophical investigator must entertain the hypothesis that at least some human social practices, like individual mental processes, are to some extent rational truthfinding practices and that social factfinding processes, like individual psychic and mental phenomena, can suggest or hint at important characteristics of the proper logic of evidential argument, or factual inference.[9. In an abstract of another paper (a still-unwritten paper that prompted the argument sketched in this paper), I offer three examples of the possible epistemological lessons of actual human social-legal practices. For example, I suggest that American proof practices, if viewed as resting on truthfinding considerations, harbor some possibly very important epistemological lessons about the relationship between truthfinding, multiple investigative hypotheses, and resource constraints. See Abstract, http://tillers.net/abstract.html (July 19, 2010). Cf. Peter Tillers, “The Fabrication of Facts in Investigation and Adjudication,” (1995, 1998 & 2007) (see esp. “§5. Implications of Interpersonal Variability in the Formation of Conjectures and Hypotheses: Let a Hundred (Discordant?) Flowers Bloom in Investigation and Proof?”), at http://tillers.net/fabrication.html]

My vision of (wo)man and his (her) world is neo-Aristotelian. I do not believe in the radical separation of descriptive inferential theory and normative inferential theory. I believe that the ideal workings of human inference must be and are rooted in the actual workings of human inference and that human intelligence consists in part of the ability to see when actual inference works well and when natural inference works in a degraded or imperfect fashion. The function of reflection and conscious thought is, to the extent possible, to perfect – and, very occasionally, to transcend – the excellence of natural human thought.

&&&

This – what I have just said – is the nub of what I wanted to say here. After I shared an earlier version of this argument with several close friends, I got two very interesting reactions.

One friend (Bruce Hay) wondered about – well, in truth, he vigorously challenged – my treatment of “the question of universality/necessity vs. accident/contingency.” He wrote that he wondered whether my premises imply that legal factfinding systems would evolve toward the same end, that they would all eventually become more or less the same or whether, instead, my principles or premises suggest or imply that “our limitations naturally evolve us toward very different cultural results.” Am I suggesting, that is, that “if we were all infinitely rational beings we would presumably have the same practices; but we aren't, so we should expect very different, localized, contingent, accidental adaptations in the matter of proof, as in other matters”?

I answered in part by saying the following:

You raise a question I didn't try to answer, the question of the universality or non-universality of my theory of proof. I recently told a good friend … that I had made a mistake in originally conceiving of my Sydney talk as sketching the outlines of a theory of proof.

My focus is in part on human social-legal practices as offering "hints" of rationality. This business of hints has an obvious and intended link to the Peircean … idea that evidentiary trifles are sources of inspiration for abductive inferences.

Decades ago I was a neo-Hegelian. (That was before I decided that any kind of Hegelian logic is a dead end.) If I were still a neo-Hegelian, I might venture to guess that different societies largely-unwittingly experiment with, or at least try out, different visions of inferential rationality.[10. Earlier in this paper I spoke of my yearning to extract timeless lessons – relatively timeless lessons – from ontology. One possible (relatively) timeless lesson from the evolving nature of human creatures and societies may be that different individuals and societies can and will entertain different ideas about how to best find the truth about facts and about how best to reconcile the search for the truth with other objectives, preferences, and aspirations. Cf. my concluding comments here about the question of the eventual convergence or non-convergence of individual or socio-legal methods for getting at the truth about facts. Simply stated, I am agnostic on the question of where all of us are headed.]

Two other friends – good friends both (Scott Brewer and Federico Picinali) – raised another question. They raised this second question in different ways but they raised essentially the same question.

One of these two good friends (Scott Brewer) was particularly upset by what he thought were the anti-critical implications of my argument, by what he thought was the implication that human beings should tolerate and accept their error-prone ways of reasoning about evidence and facts. The other friend raised this same question in a different way. I answered as follows:

The question of the relationship between native or inherited reasoning, on the one hand, and artificial or new forms of reasoning, on the other hand, is central. It certainly is the case that in some domains (e.g., the realm of chemistry) we have improved our reason. It is also the case in other domains (such as law) that we hope to improve on our prior and inherited reasoning. It is rare that we can entirely escape from inherited (and often tacit) reasoning. But we can improve or we hope to improve how well our inherited conceptual, reasoning, sensory etc. equipment works. I see the human animal as in part a self-organizing creature. But the human creature must work with [the equipment] it has at any given moment. There is a mystery here: The human creature has the power to use what it has to become more than it was. This applies to reasoning and inferential ability. But history proves that this can happen. Else how does one explain the existence and power of methods such as calculus?
These two general questions – one question deals with humanity writ relatively large (socio-legal methods of factual inference and proof), the other with single human creatures (the methods individuals do and should use to draw inferences from evidence) – these two general questions may be related. I confess I hesitate to discuss how they may be related because I fear that I am wading into deep philosophical questions that are better addressed by theologians or cosmologists than by parochial lawyers such as me. But the persistent nagging of my friends has forced me into this corner. So please bear with me while I venture a few extremely speculative thoughts.

In both cases – both in the case of humanity writ large and in the case of humanity taken singly – I assert that it may be possible, appropriate, and perhaps even necessary to wrest rational methods for dealing with uncertain factual propositions out of our existing or inherited human thought-practices. The fragility of this hypothesis in either case (in the collective case or the case of the individual) is exposed by the following question: By what right can we or should we believe that anything in our existing way of thinking or in our existing way of doing things (dealing with factual issues) is rational and what is the process by which we supposedly improve on the hypothetically half-baked rationality of our existing modes of thought and action, which may not partake of rationality to begin with?

My answer to this is a concession that I have no demonstrably-correct solution to this difficulty. But I do say that what we see in ourselves (either taken singly or taken collectively) does sometimes appear to us to be rational and sensible and that when we reflect on what we presently do and the way we presently think (process information), we sometimes seem to make our half-conscious but rational inferential practices more explicit and thereby – it seems to us – sometimes make our existing ways of thinking and acting work better – and that sometimes we are even able to decide to modify the way we think and act (as well as improve the working of our ways of thinking and acting) and that sometimes (but not always) by doing so we are able to become more rational in the way we draw conclusions about facts. I also say that I am not alone in believing that such things happen. But I readily concede that this is not conclusive proof that such things do actually happen. Moreover, I concede that even if I am correct – even if such things do happen – I am not in a position to say whether we are all evolving toward becoming better and more rational beings or whether something like a divine or cosmic spirit or substance has implanted within us some budding rationality together with the ability to develop our incipient and imperfect rationality and sometimes even transcend it. But I am entitled to hope!

&&&

The comments of Federico Picinali, Mike Redmayne, and Paul Roberts about this paper are also being published in Law, Probability and Risk.

Tuesday, March 01, 2011

Notes for a Talk in Florence - on Trial by Mathematics


European University Institute, 25 February 2011, http://qajf.wordpress.com/
Trial by Mathematics - Reconsidered
(footnotes omitted; rough draft – please do not quote)
by Peter Tillers
(for eventual publication in Law, Probability and Risk, http://lpr.oxfordjournals.org/ )

In 1970 Michael O. Finkelstein (with William B. Fairley) proposed that under some circumstances a jury in a criminal trial might be invited to use Bayes' Theorem to address the issue of the identity of the criminal perpetrator. In 1971 Laurence Tribe of Harvard Law School responded to this proposal with a rhetorically-powerful and multipronged attack on what he called "trial by mathematics." Professor Tribe, who went on to have a distinguished career as a scholar of American (U.S.) constitutional law, argued that any use of probability theory in trials (particularly in criminal trials) to regulate the drawing of inference from evidence has a variety of vices.
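
For readers who have not encountered the theorem in this setting, here is a minimal sketch (the notation and the numbers are my own illustrative assumptions, not the formulation or figures in the Finkelstein-Fairley article). Let S be the hypothesis that the defendant is the source of the incriminating trace and E the reported evidentiary match. Bayes' Theorem then converts a prior probability into a posterior probability:

P(S \mid E) = \frac{P(E \mid S)\,P(S)}{P(E \mid S)\,P(S) + P(E \mid \neg S)\,P(\neg S)}

For example, if a juror's prior were P(S) = 0.25 and the reported match were ten times more probable if the defendant were the source than if he were not (P(E | S) = 0.5, P(E | not-S) = 0.05), the posterior P(S | E) would be roughly 0.77.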


Tribe's article focused on the probability calculus in part because he was responding to a proposal to use a theorem of probability theory, Bayes' Theorem. But in that same article Tribe said that his objections to the use of Bayes' Theorem in legal trials have broader implications. The subject matter of his article was, as he put it, "the entire family of formal techniques of analysis that build on explicit axiomatic foundations, employ rigorous principles of deduction to construct chains of argument, and rely on symbolic modes of expression calculated to reduce ambiguity to a minimum."

Tribe argued that the use of mathematics to model factual inference and proof in trials (particularly in criminal trials) is a bad idea because:
1. Bayes' Theorem makes precise what is inherently imprecise; 
2. Bayes' Theorem makes objective what is subjective; 
3. Trial by mathematics and statistics is morally and socially offensive; 
4. Lay triers of fact cannot understand matters such as Bayes' Theorem; and 
5. Numbers tend to dwarf soft variables: considerations expressible in numbers swamp unquantifiable considerations, doubts, and uncertainties.
The debate about "trial by mathematics" – or, more broadly, the debate about the use of formal analysis to model evidence and inference in legal proceedings – took many twists and turns after Tribe's formidable, rhetorically-powerful assault.

Finkelstein and Fairley published several brief rejoinders to Tribe. One of the more interesting rejoinders was made jointly by Fairley and the eminent Harvard statistician Frederick Mosteller. Fairley and Mosteller argued that Tribe's most technical objection to Bayesian analysis of identification – Tribe's claim that Bayesian analysis cannot accommodate uncertain evidential premises – was incorrect; they argued that the product rule for dependent conditional events can accommodate the sorts of uncertainties (and the redundancies) that Tribe mentioned in his article. The attempted rebuttals by Finkelstein and Fairley had little effect. It seemed to most American legal scholars at the time (to those, in any event, who weren't entirely mystified by the debate) that Tribe had killed the baby – Bayesian analysis of evidence in legal trials – practically at the moment of its birth; it seemed that the Bayes-Baby was stillborn.
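
The technical point at issue can be sketched generically (my notation, not Fairley and Mosteller's; the symbols H, E, and A are illustrative assumptions). The product rule for dependent events, P(A and B) = P(A) P(B | A), allows uncertainty about an intermediate evidential premise to be carried through the calculation rather than assumed away. If H is the identification hypothesis, E is the reported match, and A is the proposition that the report is accurate, then conditioning on both possibilities gives

P(H \mid E) = P(H \mid E, A)\,P(A \mid E) + P(H \mid E, \neg A)\,P(\neg A \mid E)

so a premise that is itself only probable (here, A) discounts the force of the evidence instead of being treated as certain or being thrown out altogether.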

But it turned out that the baby that Tribe had attacked was hard to kill. In 1975 Professor Richard Lempert published an influential article that did much to resurrect interest in the use of mathematics and probability theory to model factual inference and proof in legal proceedings. Although Lempert said he agreed with Tribe that the "costs of attempting to integrate mathematics into the factfinding process of a legal trial outweigh the benefits," Lempert argued that Tribe had overlooked the possibility of heuristic use of mathematical models of inference. Lempert argued that judges and legal scholars, for example, could and should use subjective Bayesian logic to explore their own thinking and reasoning about inferences from evidence.

But two years after Lempert published his influential article, the debate took a different turn. L. Jonathan Cohen, an Oxford philosopher, published the influential book The Probable and the Provable (Oxford, 1977). In that book and elsewhere he called accounts of inference and proof that rest on the standard probability calculus "Pascalian," and he argued that another way of thinking about inference, induction, and proof, which he called "Baconian," was also, at a minimum, valid and important.

Six years later the debate about the nature of inference and factual proof took yet another direction. In 1983 three social psychologists -- Reid Hastie, Steven D. Penrod, and Nancy Pennington -- published the book Inside the Jury (Harvard, 1983). There and elsewhere they advanced what they later called the "story model" of proof: They argued that American juries typically evaluate evidence in part by constructing stories.

Some observers who were uncomfortable with Bayesian accounts of inference and proof in legal trials found solace and support both in the work of L.J. Cohen and in the work of Hastie, Penrod, and Pennington. Some or most of these observers thought that Cohen's theory was anti-mathematical and some or many observers thought that an account of inference that emphasizes story-telling is incompatible with, or at least fundamentally different from, a Bayesian model of evidential inference.

Many of the protagonists and participants in these debates and discussions, together with some others, came together in a conference at Boston University School of Law in 1986. At this conference many of these participants discovered, or so they said, that the differences between their various approaches and theories were not really as stark or as fundamental as many observers had supposed. However, some of the participants in the conference -- in the main several newcomers to the debate -- stuck to or picked up their guns and insisted that mathematical analysis of evidence in trials, while not necessarily invidious, is radically incomplete and imperfect. In particular, Professor Ronald Allen argued that only a non-mathematical theory that emphasizes stories and storytelling can provide an accurate model or picture of juridical proof. (Professor Allen invoked both L.J. Cohen's theory and Inside the Jury to support his theory.)

The debate and the discussion about trial by mathematics, of course, continued after the 1986 conference. (It did so in part at several conferences that I organized.) However, as the years passed, it seemed increasingly apparent to some observers that the debate about trial by mathematics was becoming unproductive and sterile. I was one of those people.

It seemed to me that many of the opponents of Bayesian analysis of inference in legal trials and, more broadly, of mathematical analysis of proof in trials never really understood (and still fail to understand) what at least some of the proposals for mathematical analysis were all about, and that because of this failure of understanding most of the counterattacks (against mathematical and formal models of factual inference) were made against a straw theory. For example, it was a mistake for critics of trial by mathematics to suppose that Cohen's Baconian theory is anti- or non-mathematical; it was a mistake to suppose that story-telling is inconsistent with Bayesian or mathematical analysis of evidence; it was a mistake to suppose that Bayesian analysis is equivalent to objective or statistical analysis; it was a mistake to suppose that formal or mathematical analysis is necessarily "mechanical" or that formal mathematical analysis necessarily amounts to an "algorithm"; it was perhaps even a mistake to suppose that non-mathematicians and ordinary human beings could never be made to understand Bayesian logic or any kind of mathematically grounded account of inference and proof; and, finally, it was a profound mistake to suppose that the debate over trial by mathematics was really only a debate about the uses of mathematics in or about factfinding in the legal process rather than (as Tribe himself recognized) a debate about the broader question of the uses and limits of formal argument about evidence, factual inference, and factual proof in legal proceedings. (These points, it seemed to me, had all been very clearly made by the unjustifiably modest polymath David Schum. I had also tried to make some of these points. I just could not understand why law professors could not seem to understand these basic points, the ones I have just mentioned.)

At about the time that I was thinking such sour and possibly-arrogant thoughts, I began thinking again and at more length about Lotfi Zadeh's theory of fuzzy sets. I was convinced (and I am still convinced) that fuzzy logic potentially has much to say about how human beings do and should reason about evidence in legal contexts (and about legal reasoning in general). However, although I felt this way, I found that I could not escape the suspicion that fuzzy logic does not address some important features of argument about evidence in American trials and legal proceedings. Although I was not sure and I still am not sure that my understanding of fuzzy logic was correct, I had the sense that fuzzy logic is in the main a theory about the natural behavior of concepts and words in much the way that meteorology is a theory of the behavior of the atmosphere. I could not readily see how fuzzy logic could be used to portray the sort of argument one often hears in courtrooms and trials about the inferences to be drawn from some collection of evidence, some set of evidential premises. As I thought about this question -- as I puzzled over both the power and the limits of fuzzy logic -- I became even more convinced than I had been before that it is very important to keep in mind that formal argument about evidence (whether the argument uses numbers or not) can serve quite distinct purposes.
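The flavor of fuzzy logic as a theory about "the natural behavior of concepts and words" can at least be suggested with a toy example. The predicate and the breakpoints below are entirely made up; the sketch merely illustrates how a vague predicate (say, that the getaway car was "near" the bank) can take degrees between 0 and 1 rather than being simply true or false.

# A toy fuzzy-membership example. The predicate and the numerical
# breakpoints are invented for illustration only.

def membership_near(distance_in_meters: float) -> float:
    """Degree to which a location counts as 'near' the bank:
    1.0 within 50 m, 0.0 beyond 500 m, linearly graded in between."""
    if distance_in_meters <= 50.0:
        return 1.0
    if distance_in_meters >= 500.0:
        return 0.0
    return (500.0 - distance_in_meters) / 450.0

for d in (30.0, 100.0, 275.0, 600.0):
    print(f"{d:5.0f} m  ->  membership {membership_near(d):.2f}")

Nothing in this toy example, of course, says how such degrees of membership should be combined with the kind of inferential argument from evidence that one hears in courtrooms; that, roughly, is the difficulty described above.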

Much of the fear that many legal scholars and judges have of mathematical and formal argument may be rooted in two intuitions, one of which seems valid to me and one of which does not. The valid intuition or sentiment that quite possibly lies at the root of much of the distrust by "legal professionals" of mathematical and formal analysis of evidence is the belief that in legal proceedings argument from and about evidence must be "transparent" to "ordinary" people such as judges and jurors. This intuition, or "prejudice," is in part rooted in the sentiment that the ultimate decision makers in legal proceedings must be human beings and in the correlative sentiment or belief that decision making about evidential inferences cannot be handed over to a logic that ordinary judges and jurors cannot follow and whose trustworthiness such judges and jurors therefore cannot assess.

The invalid intuition or suspicion on which legal professionals' fear of formal analysis rests is the notion that formal analysis is necessarily mechanical -- "mechanical" in the sense that mathematical or formal analysis is necessarily removed from and impenetrable to ordinary human judgment and intuition and therefore necessarily runs "on its own," beyond the control or effective supervision of ordinary mortals.

But the responsibility for this "mistake" -- for the mistaken notion that formal argument necessarily runs on its own and beyond the control of the personal judgments of ordinary human beings -- is not entirely or even primarily attributable to the supposed naiveté of ordinary people. It is in fact the case that most complex argument about inferences from evidence rests on almost innumerable personal or subjective judgments. The mistake in thinking that formal argument necessarily works irrespective of or independent of such personal or subjective judgments and intuitions is probably largely attributable to the failure of the practitioners of formal analysis to show how formal arguments can be made intelligible to ordinary people (i.e., to non-logicians and non-mathematicians) and their failure therefore to show how at least some formal arguments can be rooted in and made responsive to subjective human sentiments and judgments of ordinary people.

These are the views and sentiments that led me to develop the following table or list of possible purposes of mathematical and formal argument about inference from evidence:
1. To predict how judges and jurors will resolve factual issues in litigation. 
2. To devise methods that can replace existing methods of argument and deliberation in legal settings about factual issues. 
3. To devise methods that mimic conventional methods of argument about factual issues in legal settings. 
4. To devise methods that support, or facilitate, existing, or ordinary, argument and deliberation about factual issues in legal settings by legal actors (such as judges, lawyers, and jurors) who are generally illiterate in mathematical and formal analysis and argument. 
5. To devise methods that would capture some but not all ingredients of argument in legal settings about factual questions. 
6. To devise methods that perfect – that better express, that increase the transparency of – the logic or logics that are immanent, or present, in existing ordinary inconclusive reasoning about uncertain factual hypotheses that arise in legal settings. 
7. To devise methods that have no practical purpose – and whose validity cannot be empirically tested – but that serve only to advance understanding – possibly contemplative understanding – of the nature of inconclusive argument about uncertain factual hypotheses in legal settings.
In the abstract for this talk and paper I explained the purpose of my list in the following way:
Before any further major research project on "trial by mathematics" is begun, interested researchers in mathematics, probability, logic, and related fields, on the one hand, and interested legal professionals, on the other hand, should try to reach agreement about the possible distinct purposes that any given mathematical or formal analysis of inconclusive argument about uncertain factual hypotheses might serve. Putting aside the special (and comparatively trivial) case of mathematical and formal methods that make their appearance in legal settings because they are accoutrements of admissible forensic scientific evidence, I propose that discussants, researchers, and scholars of every stripe begin by carefully considering the possibility that mathematical and formal analysis of inconclusive argument about uncertain factual questions in legal proceedings could have any one (or more) of the ... distinct purposes [that I enumerate in my abstract].
Over the years I happen to have been most interested in the fourth possible purpose of formal argument about and from evidence: "To devise methods that support or facilitate existing, or ordinary, argument and deliberation about factual issues in legal settings by legal actors (such as judges, lawyers, and jurors) who are generally illiterate in mathematical and formal analysis and argument." Making and assessing arguments is hard work. Probing the strengths and weaknesses of arguments, including arguments about evidence and inference, is also hard work. Nothing will ever change that. But I believe that people who study mathematics and formal logic have it in their power to make many of their propositions about logic and to make many of their formal arguments intelligible to people such as judges and jurors. For example, I believe that pictures and picture-thinking may be one way in which the worlds of the formal and informal sciences can learn to communicate effectively with each other. If that is the case, the day may yet come when rigorous formal argument about evidence, factual inference, and factual proof looks and feels warm and friendly to ordinary and mathematically illiterate people such as me.
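As one small illustration of what "picture-thinking" might mean here, consider a chain of inference drawn as a picture and then evaluated numerically. The chain and all the probabilities below are invented; the only point is that the picture and the arithmetic can be kept side by side, so that the formal argument remains visible to, and adjustable by, the ordinary person making it.

# A two-link chain of inference, shown as a picture and evaluated by
# marginalizing over the intermediate proposition. All probabilities
# are hypothetical.
#
#   testimony  --->  "defendant was at the scene"  --->  "defendant is the doer"
#
p_scene_given_testimony = 0.8    # how far the testimony supports presence at the scene
p_doer_given_scene = 0.6         # how far presence at the scene supports the ultimate hypothesis
p_doer_given_not_scene = 0.05    # residual probability if the defendant was not at the scene

p_doer_given_testimony = (p_scene_given_testimony * p_doer_given_scene
                          + (1.0 - p_scene_given_testimony) * p_doer_given_not_scene)
print(p_doer_given_testimony)    # 0.49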


&&&


The dynamic evidence page
It's here: the law of evidence on Spindle Law. See also this post and this post.

Wednesday, February 16, 2011

Circling around Reasonable Doubt

In part of its opinion in Victor v. Nebraska, 511 U.S. 1 (1994), the Supreme Court of the United States approved the following line of reasoning:
Question: What level of proof is required for conviction of crime?
Answer: Proof beyond a reasonable doubt. 
Question: Does a showing of a "strong probability" of criminal guilt constitute proof beyond a reasonable doubt?
Answer: Under some circumstances. 
Question: Under what circumstances?
Answer: When the probability is so strong that it excludes reasonable doubt.
Except to the extent that the Court acknowledges that probabilities (of some kind and in some way) can figure in the trier's assessment of criminal guilt, the Court's reasoning has an Alice-in-Wonderland air:

"It's all very clear, my dear: Proof beyond a reasonable doubt is proof that removes all reasonable doubt. Why are you troubled by this self-evident proposition?"



&&&

The dynamic evidence page
It's here: the law of evidence on Spindle Law. See also this post and this post.