My comments about "two cultures" (see post, 9/28/2003) provoked a friend of mine to e-mail me some comments. This friend is a member of the community sometimes known as UAI -- uncertainty in artificial intelligence. After several exchanges, my friend focused on the reception of science and computer science in the law school world. The following is part of what this UAI person said:
1) Regardless of the validity of a "humanities vs. scientific" dichotomy in university curricula, I believe that there may be another valid and relevant distinction among graduate-level academic disciplines: those that mostly train professors versus those that primarily train practitioners. For example, I would bet (meaning that I don't know the facts) that, on average, a much larger percentage of doctorates in physics, biology and pure mathematics become professors than of those in chemistry, architecture or any engineering area - e.g. electrical, mechanical, civil, etc. It may still be true that humanities disciplines such as sociology, political science, English or history train professors at a higher rate than the corresponding scientific, or perhaps more accurately, technologically focused disciplines. What I like about this conjecture is that it should be relatively easy to check the facts (if one were willing to take the time and energy).
2) Assuming conjecture (1) above is true, i.e. borne out by the statistics, then I would further conjecture that the primary reason is the societal-economic need for practitioners in industry and other business, e.g. the law profession, medical profession, architecture profession, etc. For example, most graduate study in chemistry is aimed at practical experimentation, not theoretical exploration, because most chemistry grad students want to get jobs in industry where they can make useful things. It is not unusual, I believe, for chemistry grad students who become more fascinated with the nature of molecules than with how to get them to do stuff to transfer into a physics curriculum, where that is the primary focus of the study of matter. After that, of course, they're not fit for anything but teaching and basic research ;^>. This theory implies that at least one of the reasons law schools don't teach much of the state-of-the-art scientific theories of evidence, i.e. including their technological implications and possibilities, is that such material isn't useful when one goes into practice. If this conjecture were true, then i) it follows that it *should* be practitioners, not professors, who have the greatest interest in shifting how things are done in practice, primarily for financial motivations, and ii) when courts start admitting evidence based on such arguments and techniques, we ought to see a corresponding major shift in law school curricula.
3) This point is peripheral to the issue of law school curricula, but relevant to your basic argument, I believe, i.e. the cultural dichotomy of "humanity versus technology" or the like. To the extent that thesis holds water, and my gut feeling is that it does, or did anyway, I believe we may be in the midst of a paradigm shift wrought by the very progress of technological innovation itself. Innovations such as genetic manipulation, artificial stupidity, er, intelligence I mean, and nanotechnology, for example, raise critical issues in society that come back to basic humanities-focused questions - such as the definition of human life and its corresponding protections under governments and their legal and other institutions. And I believe that there is a corresponding shift in technology-focused curricula to incorporate the teaching of philosophy and ethics in particular, as a consequence of this phenomenon.