Keep it up Jack!
As I recall, Shafer was equally skeptical of inflated claims about:

hunger in America -- One Massachusetts organization managed to get such figures very high by defining (it told me) "hungry children in Massachusetts" as the number of children in Massachusetts who experienced two or more "episodes of hunger" in a given calendar year.

child abuse in America -- I recall that the "statistics" bandied about one year seemed to suggest that 25% or more of all children had been "abused" in any given year. Many organizations got child and wife abuse figures into the stratosphere by adopting very broad definitions of "abuse" or -- just as often -- by quoting figures made up by some self-professed and oft-quoted "expert."

Bad things -- child abuse etc. -- do happen, of course. But if we want to say how often such bad things happen, let's use good numbers. So let's actually count -- carefully. And when we do count, let's be quite clear about what is being counted.
8 comments:
Continuing with the quantification issue..
Suppose that the Bar Association of some state passes a rule that requires every lawyer, upon learning the facts of a case, to provide the client a confidential, quantitative estimate of the probability that the case will succeed. Would such a rule change the economics of lawyering, and if yes, then how?
Suppose that a single law firm decides to implement such a rule independently of other firms. Would this add to or subtract from the innovator's economic success?
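A quantitative success estimate of this kind would feed directly into a client's decision calculus. A toy sketch in Python (all numbers, and the settle-versus-litigate framing, are hypothetical illustrations, not from the comment):

```python
# Toy decision: litigate or settle, given the lawyer's confidential
# probability estimate that the case succeeds. All figures are invented.

def expected_value_of_litigating(p_success, award, legal_costs):
    """Expected net recovery from going to trial."""
    return p_success * award - legal_costs

p = 0.6            # lawyer's estimated probability of success
award = 100_000    # damages recovered if the case succeeds
costs = 20_000     # legal costs incurred either way
settlement = 45_000  # the opposing side's standing settlement offer

ev = expected_value_of_litigating(p, award, costs)
print(ev)               # 40000.0
print(ev > settlement)  # False: on these numbers, the client should settle
```

On these (invented) numbers, a client who receives the quantitative estimate settles, while one who hears only "we have a good case" might litigate; that difference is one channel through which such a rule could change the economics of lawyering.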
1. Bar Association Question. I know a little about a little. I know nothing about a lot. I once took several courses in economics, including microeconomics. Since then I have looked at economic theory now and then. But not much. And the field of economics has changed dramatically over the years. More important, I see nothing but complications and curlicues when I face the question of whether or how something or other will affect actual human (individual or social) behavior. There are too many variables -- including evidential uncertainty and incompleteness -- for me to cope with. (More in my next message.)
More on the Bar Association Question. The following point is immaterial but revealing: I can't imagine that a bar association would voluntarily adopt a rule requiring lawyers to provide clients with quantitative estimates of "success." By the way, keep in mind that "success" is a bit ambiguous in this context, and keep in mind that a lawsuit has many possible outcomes, and many of them are hard even for the participating lawyers to anticipate. This point (inability to foresee possible outcomes) is one thing that impedes effective application of decision theory to litigation problems. (Ward Edwards in his book with von Winterfeldt [sp?] tells a story that illustrates the difficulty.) One of my online papers deals in rather elaborate detail with this same point.
On the question about a single law firm: Here I will refer to my earlier comment about my extensive ignorance about most things. When I practiced law, law firms did not give quantitative estimates. However, even in those long-distant days I had the sense that insurers were very interested in having quantitative estimates (of "failure" and "success") and perhaps they extracted such estimates even in those days. (I was only rarely involved in classic insurance litigation, and I never had an insurance company as a client.) The pressures toward "rationalization" of the "management" of litigation have grown since I practiced law -- and everything I know suggests that corporate clients are much more demanding of their lawyers now -- and so perhaps quantitative estimates of some kind have become de rigueur for certain kinds of litigation. I will have to look into this -- if only to reduce the extent of my massive ignorance.
Under the protection of the lower standards for writing in blog communication, my quick thoughts:
A Theory of Quantification would be much needed, not only for lawyers but for people who
envision and implement IT systems for the law.
IT systems are usually easier to develop and more effective for processes that have a high level of quantification. IT itself drives the formalization (and quantification) of all fields that use computers. (This is often a very serious negative side effect of computerization: it promises to make your work more effective, but in the end your work does not become more effective – it becomes more regulated.)
AN OUTLINE OF A THEORY OF QUANTIFICATION
A Theory of Quantification probably should look at the whole system of law, because
processes of quantification can be interrelated. Such a theory should help establish and
understand the levels of quantification in various parts of the system of law. The theory
should also provide guidelines for identification and evaluation of drivers for quantification
(and dequantification). It should help understand and influence (if possible) the complex
processes of quantification and dequantification.
Some branches of semiotics have been developing theories of this kind. (Unfortunately I do not know the semiotics of law, though there certainly is such a field.) One school of semiotics (Yu. Lotman) would state that:
a) Any semiotic system is heterogeneous with respect to its degree of organization. Highly organized (formalized) layers co-exist with fluid, nebulous layers (in the system of law, too, areas differ in their degree of elaboration).
b) Semiotic systems also differ in their level of formalization (cultures differ in the level of codification and formalization of their systems of law).
c) Semiotic systems are dynamic. The system is in continuous development. (The changes can be quite slow, though.) Some elements move toward higher formalization. Others 'loosen' their degree of organization. Changes from one form of high organization to another are possible.
[The introduction of visual material into the trial process can be viewed as dequantification (discrete, textual information is partly converted to analogue, iconic visual information). But dequantification in this use should not suggest that visual information has a lower level of organization than textual information.]
[continued]
SOME SPECIFICS OF QUANTIFICATION PROCESSES IN LAW
Quantification processes in law seem to have certain specifics:
a) A strong striving toward quantification, eventually countered by equally strong counterforces or constraints. Obviously, quantification is generally useful and even essential for justice. Therefore, a systematic effort toward quantification is characteristic of systems of law. But there seem to be (hidden) forces that counteract quantification.
b) Quantification with safeguard mechanisms. It is also very interesting that systems of law, while striving for high formalization and quantification, often have 'back doors' built into the system – back doors that allow the operator of the system to stop the blind mechanism of formal reasoning. Usually, these 'back doors' take the form of general (and rather vague) rules that can be used to overrule more specific rules (in practice this can happen even contrary to the meta-rule that gives priority to the more specific rule).
DRIVERS FOR QUANTIFICATION (DEQUANTIFICATION)
– Driver of technical rationality (T) – quantification can simplify the administration of the law. This includes considerations of the internal consistency of the whole system of law. For example, the EU requires its member states to vastly expand their national laws, with the justification of 'harmonization' of law in the EU (which allows greater efficiency in the administration of law at the supra-national level).
– Driver of economic utility (E) – quantification can change the balance of power between
the participants and produce different economic outcomes for them.
– Psychological (cognitive) driver (P) – this rationale is used when one claims that humans, by the inherent limitations of their psyche, cannot correctly articulate the matter in numbers, or cannot comprehend numbers properly. But psychological justification can be used to support quantification as well: one can well argue that in many situations numbers are cognitively processed more efficiently than words.
– Cultural drivers (C) – the level of quantification can be substantially affected by the internal dynamics of the whole cultural system. The mechanism of any large-scale cultural system can be only partly understood and only partly designed. (The system develops partly by its own internal logic.) Accordingly, a culture may put high symbolic value on numbers; or a culture may regard words as a symbol of competence. These attitudes also differ across different areas of practice.
– Rationale of justice (J) – the claim that quantification (or dequantification) is instrumental or even essential for achieving justice (equity). For example, we may imagine a legal system where the measure of punishment is left to the discretion of the judge – with the justification that only a human is able to weigh and fuse the unique circumstances of the case. Of course, the same rationale can be used to justify a highly standardized system where the judge has little freedom of choice.
– Irrational reasons, or unknown reasons (I) – For example, in the Middle Ages, a peasant who fled to a free town and lived there for one year and one day became a free man. The term of one year and one day seems to have no motivation other than some poetic or mystic impulse (one year, a round number, would have been technically easier to handle).
SYSTEMATIC ANALYSIS OF QUANTIFICATION LEVELS AND QUANTIFICATION
PROCESSES IN THE SYSTEM OF LAW
With these preparations, a scheme can be sketched for a study of the actual level of quantification in the system of law: element of the legal system – level of quantification (or direction of the (de-)quantification process) – rationale (justification) for the chosen level of quantification.
I think that such an analysis would be quite interesting (but probably too academic for a practitioner). Being no professional in law, I can just touch on a couple of elements:
Legal age – It appears highly unjust that two persons who differ in age by only one day can fall into two different categories (minor or not minor) in which their very similar violations are handled very differently. Quantification: high. Rationale: technical rationality (T)?
Terms of imprisonment – In some countries prison terms are given in full years or half-years. Other countries may have finer granularity (months, days). Some countries seem to have a 'fake' quantification here – the term announced with the verdict and the actual term can differ significantly, because of relatively little-quantified systems of pardon. I would say that we can see a trend toward dequantification with respect to this element. I think that in practice, prison management, not the court, can determine the final length of the term.
Form of presentation – In some systems, the Supreme Court does not hear parties in person but only studies submitted documents. Quantification: medium to high. Rationale: technical rationality (T)?
Due dates (for submission of documents to court) – Most countries have strict rules that are
followed, I suppose. Quantification: high; Rationale: technical rationality (T)?
Verdict – From a process-temporal point of view (the case as it proceeds from beginning to end), quantification probably reaches its highest point in the verdict. Probably all countries use the binary system (guilty/not guilty). But theoretically, 'partially guilty' is well possible: a formula '50% guilty x $100 penalty' is equivalent to 'guilty x $50 penalty'. Quantification: high. Rationale: justice (J)?
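The proposed scheme (element – level of quantification – rationale) can be sketched as a small data structure. A hypothetical illustration in Python: the elements and driver codes come from the comment above; the exact level labels and the grouping step are my own assumptions.

```python
from collections import defaultdict

# element of the legal system -> level of quantification -> driver code.
# Driver codes follow the comment: T, E, P, C, J, I ("?" where the
# comment offered no rationale).
scheme = [
    {"element": "legal age",             "level": "high",           "driver": "T"},
    {"element": "terms of imprisonment", "level": "dequantifying",  "driver": "?"},
    {"element": "form of presentation",  "level": "medium to high", "driver": "T"},
    {"element": "due dates",             "level": "high",           "driver": "T"},
    {"element": "verdict",               "level": "high",           "driver": "J"},
]

# The 'partially guilty' point: a fractional verdict times a full
# penalty equals a full verdict times a scaled penalty.
assert 0.5 * 100 == 1.0 * 50

# Group elements by driver to see which rationale dominates.
by_driver = defaultdict(list)
for row in scheme:
    by_driver[row["driver"]].append(row["element"])
print(dict(by_driver))
```

Even in this toy form, the tabulation makes the pattern visible: technical rationality (T) accounts for most of the highly quantified elements, while the justice rationale (J) appears only at the verdict.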
pr has no need to apologize: He seems to have sketched a general theory of society & epistemology and of the possible place of law within that system. I wonder who this pr is. (I have a guess. But it's a weak guess.)
Yes, there are people who do semiotics of law. For reasons having to do mostly with my personal turn, decades ago, away from German Idealism and almost all that goes along with GI, I have stayed away from semiotics. And some of the people who do semiotics & law are indeed thoroughly post-modern. However, even just for purposes of my narrow interest (in inference) there is value in semiotics, I think. There is, first, the consideration that the paterfamilias of modern semiotic theory is Charles Peirce, a complete if personally erratic genius. There is, second, the consideration that my erstwhile collaborator, David Schum, thinks semiotic theory is important -- and Dave Schum is a wise man. Beyond this, I think if one wants to take the middle path between subjective and objective theories of knowledge (and inference) -- and this is a wise path to take -- one must impute signaling properties to events in the world that serve as "evidence"; one must see in events properties that reach beyond themselves -- hints -- and suggest things (hypotheses and such) beyond themselves. This kind of notion of signs is practically mystical and religious. But perhaps some sort of quasi-mystical and quasi-religious notion such as this -- a notion of signs or hints in the cosmos -- is necessary to explain how it is possible that people manage to draw conclusions -- often magnificent conclusions (e.g., E = mc² and the like) -- about the cosmos. [But one must remember that the mind also makes its contribution and that the mind is sometimes somehow magically aligned with the cosmos. I think now of the wonderful story of the Russian mathematician Perelman, who wants nothing of the prizes and the $1,000,000 he was offered. See the NYTimes. His work may help tell us what the shape of the universe is!]
Re: postmodernism
I'm just in the process of reconceptualizing my information management courses along the modernism–postmodernism axis. I teach two information courses. The first one, a bachelor-level course, is about the 'modernist' approach to information management. The goal is to learn the rational (hard) system development methods. Only after learning the rational process is the student prepared to benefit from postmodern theories of information management, including semiotics (the second, masters-level course). [Probably the same applies to learning the disciplines of law as well. One must learn the 'mechanics' of law. But even the best study of the hard-edged methods is not sufficient to meet the real world. One needs philosophy as well, a study of a dialectical or another system.]
RULES VERSUS JUDGMENT CALLS
One may ask: if a rational, rule-based legal process is generally regarded as the ideal (would not the 'rule of law' mean, in part, the 'rule of rules'?) and great efforts have been made, over history, to codify and quantify almost all elements of the process – then why do some elements still seem to resist codification (quantification)?
Is a fully rational system possible? If not, then to what extent should we strive for rationality?
**
This is an issue of philosophical scale, and therefore, as expected, emerges in more than one science or technical field.
In respect of software systems, David Parnas perhaps has made one of the most elegant contributions to the issue.
David Parnas and Paul Clements (1985) "A Rational Design Process: How and Why to Fake It"
http://en.wikipedia.org/wiki/David_Parnas
http://web.cs.wpi.edu/~gpollice/cs3733-b05/Readings/FAKE-IT.pdf#search=%22Clements%20fake%22
Wikipedia says that Parnas' position is that of 'technical realism': "If we have identified an ideal process but cannot follow it completely, we can still follow it as closely as possible and we can write the documentation that we would have produced if we had followed the ideal process. This is what we mean by “faking a rational design process”."
It seems that we can substitute for "software design" in the first three sections of Parnas and Clements' article the names of processes from various other fields - intelligence analysis, trial, performance assessment, student grading, electronic medical records - and get meaningful, timely discussion points for these other fields:
I The search for the philosopher's stone: why do we want a rational design process?
II Why will a software design "process" always be an idealisation?
III Why is a description of a rational idealised process useful nonetheless?
**
A simple model of the problem could be:
D -> P -> P -> ... -> I
- Raw data (D) is acquired from sources and then processed in a number of steps (P). The output - information (I), in the form of recommendations, plans, decisions, or orders - is delivered to the human users of the information.
Processing can be made as rational (rule-based) as possible (R) - by establishing work procedures, rules, and measurements:
D -> R -> ... -> R -> I
However, even if the construction of a fully rational system can be finished, its use can raise problems: a) the output can turn out to be not as effective as expected; b) some output can be outright unrealistic; c) the users can find themselves generally unhappy with the output of the rational system.
Irrationality, or what is probably better labeled by the English term 'judgement call', has to be slipped into the system at some point to make the system more satisfying, or even to make it function at all. This can happen at the end of the process
D -> R -> ... -> R -> I -> Judgement Call -> I1
or in other points. Lotman's theory posits that fully rational systems (all R's) are possible only when the system is relatively simple (non-intelligent) or small. Any really complex cultural system is a dynamic mix of R's and J's, according to his view.
Therefore, a more realistic view would be of a system where rational (highly codified) elements (R) are combined with 'judgement calls' (J) - elements of less formalization.
D -> R -> J -> R -> J -> I
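A minimal sketch of this mixed pipeline in Python. Only the D → R → J → I shape comes from the comment; the step names, the sample tariff rule, and the mitigating-circumstances override are invented for illustration:

```python
# Model a process as a chain of steps applied to data. Rule-based steps
# (R) apply a fixed, codified transformation; judgment-call steps (J)
# may override the rule-based result using context the rules omit.

def rule_based_step(data):
    # R: a codified rule, here a fixed tariff per unit of harm.
    data["penalty"] = data["harm_units"] * 10
    return data

def judgment_call_step(data):
    # J: a human override; mitigating circumstances halve the penalty.
    if data.get("mitigating"):
        data["penalty"] = data["penalty"] / 2
    return data

def run_pipeline(raw_data, steps):
    # D -> R -> J -> ... -> I
    result = dict(raw_data)  # leave the raw data untouched
    for step in steps:
        result = step(result)
    return result

case = {"harm_units": 10, "mitigating": True}
info = run_pipeline(case, [rule_based_step, judgment_call_step])
print(info["penalty"])  # 50.0: the rule alone would have said 100
```

The design point matches the argument above: the rule-based step is reproducible and auditable, while the judgment-call step is where the 'back door' lives, and removing it makes the output more consistent but not necessarily more satisfying.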
**
The dialectic of 'rules versus judgment calls' can be seen in the disputes about how to build a good intelligence analysis system. For example, in a NYT article a government official criticizes an intelligence agency: “The people in the community are unwilling to make judgment calls" http://www.nytimes.com/2006/08/24/washington/24intel.html
While part of the dysfunction may be the end users' desire to manipulate the intelligence analysis system's output, it can also be that a fully rational (rule-based) system of intelligence analysis is inferior, for inherent reasons, to a system that has 'judgment call' elements.
(An important idea in Lotman's theory is bipolarity: any body or system, to exhibit intelligence, has to have at minimum two sub-systems that are built in fundamentally different ways yet are capable of holding a dialogue with each other. This would even make a case for two intelligence agencies - as was the case in the Soviet Union (GRU and KGB).)
**
Hypothetically, in the legal process, the ban on hearsay evidence may be one of the (few remaining?) places where a 'judgment call' element is retained, and it somehow balances the other, more rational (rule-based, mechanical) elements of the system.