Prior odds — their meaning and significance

The concepts of ‘prior odds’, a.k.a. prior probabilities or simply priors, and ‘posterior odds’ come up in most discussions about the evaluation of evidence.  The significance and meaning of both terms become clear when viewed in the context of a “Bayesian approach”, or the logical approach, to evidence evaluation.  That approach has been discussed at length elsewhere and relates to the updating of one’s belief about events based upon new information.  A key aspect is that some existing belief, encapsulated as the ‘prior odds’ of two competing possibilities or events, is updated on the basis of new information, encapsulated in the ‘likelihood-ratio’1 (another term you will undoubtedly have seen), to produce a new belief, encapsulated as the ‘posterior odds’ of those same competing possibilities.
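To make the relationship concrete, it can be expressed in the odds form of Bayes’ theorem. The following is a minimal sketch using generic symbols of my own choosing: H1 and H2 stand for the two competing propositions and E for the new findings.

```latex
\underbrace{\frac{\Pr(H_1 \mid E)}{\Pr(H_2 \mid E)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{\Pr(E \mid H_1)}{\Pr(E \mid H_2)}}_{\text{likelihood ratio}}
\;\times\;
\underbrace{\frac{\Pr(H_1)}{\Pr(H_2)}}_{\text{prior odds}}
```

Read from right to left: whatever belief was held beforehand (the prior odds) is scaled by the strength of the new information (the likelihood ratio) to give the belief held afterwards (the posterior odds).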

But what precisely do these terms, ‘prior odds’ and ‘posterior odds’, mean and how do they relate to the work of a forensic examiner?

Hilton and Mathematical Probability

In 1958 Ordway Hilton participated in Session #5 of the RCMP Seminar Series. His article was originally published in that series by the RCMP, and subsequently republished in 1995 in the International Journal of Forensic Document Examiners.1

The later republication included the following abstract:

In every handwriting identification we are dealing with the theory of probability. If an opinion is reached that two writings are by the same person, we are saying in effect that with the identification factors considered the likelihood of two different writers having this combination of writing characteristics in common is so remote that for all practical purposes it can be disregarded. Such an opinion is derived from our experience and is made without formal reference to any mathematical measure. However, the mathematician provides us with a means by which the likelihood of chance duplication can be measured. It is the purpose of this paper to explore the possibility of applying such mathematical measure to the handwriting identification problem to see how we might quantitatively measure the likelihood of chance duplication.
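As a rough illustration of the sort of ‘mathematical measure’ the abstract alludes to (a toy example of my own, with hypothetical numbers, not figures taken from Hilton): if individual writing characteristics could be treated as independent, the probability of chance duplication would be approximated by multiplying their individual population frequencies.

```latex
\Pr(\text{chance duplication}) \approx p_1 \times p_2 \times \cdots \times p_n,
\qquad \text{e.g.}\quad 0.10 \times 0.25 \times 0.05 = 0.00125
```

Whether assumptions of that kind, independence in particular, are defensible for handwriting is, of course, part of what any such exercise must confront.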

Hilton’s article comprises eight main sections plus references, and is followed by a discussion among the seminar participants. Today’s review will discuss each section of the article in turn.

Huber, Headrick & Bayes…

Like many document examiners I consider Huber and Headrick’s 1999 textbook, Handwriting Identification: Facts and Fundamentals, to be a seminal work.1

In my opinion, it is the best textbook written to date on the topic of handwriting identification. The authors provide a comprehensive overview as well as some less conventional perspectives on certain concepts and topics.  In general, I tend to agree with their position on many things. A bit of disclosure is needed here: I was trained in the RCMP laboratory system, the same system in which Huber and Headrick were senior examiners and very influential.  Hence, I tend to be somewhat biased towards their point of view.

But that does not mean I think their textbook is perfect. While it is well written and manages to present a plethora of topics in reasonable depth, some parts are incomplete or misleading, particularly when we take into account developments that have occurred since it was written.

One area of particular interest to me relates to the evaluation of evidence, specifically evaluation done using a coherent logical (or likelihood-ratio) approach.  I have posted elsewhere on the topic, so I’m not going to rehash the background or details any more than necessary.


This post will look at the topic of ‘Bayesian concepts’ as discussed by Huber and Headrick in their textbook.  These concepts fall under the general topic of statistical inference found in Chapter 4, “The Premises for the Identification of Handwriting”.  The sub-section of interest is #21, where the authors attempt to answer the question, “What Part Does Statistical Inference Play in the Identification Process?”  Much of their answer in that sub-section relates to Bayesian philosophy in general and to the application of the logical approach to evidence evaluation.  However, while they introduce some things reasonably well, the discussion is ultimately very flawed and very much in need of correction. Or, at least, clarification.

Propositions — key to the evaluation process

One of the key elements in the logical approach to evidence evaluation is the set of propositions used for the evaluation. Propositions are, in a certain sense, the most important part of the whole process. At the same time, they are also one of the least understood.


Today’s post explores the concept of propositions. I will attempt to describe what they are, how they are used, why we don’t change them once set and why they matter so much, among other things… all from the perspective of forensic document examination (and other forensic disciplines).