The 11th International Conference on Forensic Inference and Statistics, or ICFIS 2023, is set for June 12-15 of this year. It will be held at the Faculty of Law (Juridicum) of Lund University, Lund, Sweden. While I am saddened that I cannot attend this particular meeting, several years ago I had the pleasure of attending the 9th iteration of the conference, ICFIS 2014, and I wrote a blog post about that meeting at the time.
I can say, based on past experience alone, that this meeting is well worth attending. That’s particularly true if you are interested in the logical approach to evidence evaluation, but it would benefit any forensic scientist. You will not find a better collection of brilliant people all focused on forensic inference, in the broadest sense.
Forensic scientists, lawyers, and academics will all be there.
Years ago, in 2013 to be precise, I was invited to speak at the ICA conference held in Montréal, Québec. The conference had a special session on “distinguishing between science and pseudoscience in forensic acoustics”. Now, I am definitely not an expert in forensic acoustics. In fact, I know almost nothing about the field other than what I’ve read from time to time. So I wasn’t there to tell the audience anything about forensic acoustics, per se.
I recently published an editorial in the Journal of the Canadian Society of Forensic Science. Two versions were published almost simultaneously: the original, written in English, and a French translation, entitled respectively “CSFS Document Section Position on the Logical Approach to Evidence Evaluation and Corresponding Wording of Conclusions” and “La position de la Section des documents de la SCSJ sur l’approche logique de l’évaluation de la preuve et le libellé des conclusions”.
I wrote these in my capacity as the sitting chairman of the Documents section of the CSFS, on behalf of the members of that section. The impetus for writing them was to introduce the “logical approach” and related topics to the Canadian forensic community in a ‘formal’ way (hopefully resulting in ongoing discussion), and to provide the public and the courts with the perspective of forensic practitioners who have reviewed the literature and studied this issue in depth. To that end, the document references many initiatives relating to the topic. I will note that it’s not a perfect document, but it covers the main points reasonably well.
Please note that this position paper was first written a few years ago. There was considerable delay in publication relating to the production of an acceptable French-language translation of the document. I must thank Julie Binette, who was invaluable in that process. The delay, however, means the references provided in the paper are not fully up to date with the very latest developments in this area.
Nonetheless, that shortcoming doesn’t detract from the position expressed. Today there is even more support and justification for that position than is outlined in the paper.
David H. Kaye (DHK) is one of my favourite writers. He is truly prolific and always manages to provide great insights for the reader. His grasp of statistics, logic, and the law is second to none, and his ability to communicate those very challenging topics to his audience is equally impressive.
As a mini introduction, David “…is Distinguished Professor, and Weiss Family Scholar in the School of Law, a graduate faculty member of Penn State’s Forensic Science Program, and a Regents’ Professor Emeritus, ASU.” If you would like to see a list of his publications check out http://personal.psu.edu/dhk3/cv/cv_pubs.html
Yes, DHK has written many things on many topics. But I would like to focus on his less formal writings from his blog Forensic Science, Statistics & the Law.
Forensic Document Examination is a complex area involving many different topics and abilities. I am always looking for useful resources that can help me do this work, and some of that information can be found online.
In time I would like to provide a more comprehensive list of online resources pertaining to the different facets of this work, but that is going to take a while to compile and it will be an ongoing project. Still, there are already a few websites I consider to be particularly interesting and useful. I’ve compiled them into a list to serve as a starting point for a more complete and general list.
Some of these relate to Forensic Document Examination, some to logic and reasoning, and some pertain to programming and statistics (i.e., my main areas of interest). They are not listed in any particular order. Other categories, and more sites, may be added from time to time. In the meantime, I hope that you find them as interesting and useful as I have. If you know of other sites that you think might be included here, please let me know via the contact page. Enjoy!!
The concepts of ‘prior odds’ (closely related to prior probabilities, or simply ‘priors’) and ‘posterior odds’ come up in most discussions about the evaluation of evidence. The significance and meaning of both terms become clear when viewed in the context of a “Bayesian approach”, or the logical approach, to evidence evaluation. That approach has been discussed at length elsewhere and relates to the updating of one’s belief about events based upon new information. A key aspect is that some existing belief, encapsulated as the ‘prior odds’ of two competing possibilities or events, is updated on the basis of new information, encapsulated in the ‘likelihood ratio’ (another term you will undoubtedly have seen), to produce a new belief, encapsulated as the ‘posterior odds’ of those same competing possibilities.
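For readers who like to see the relationship written out, the odds form of Bayes’ theorem expresses this updating in a single line. This is standard notation, not specific to any one paper: here H₁ and H₂ stand for the two competing propositions, E for the new information (the evidence), and I for the relevant background information.

$$ \underbrace{\frac{\Pr(H_1 \mid E, I)}{\Pr(H_2 \mid E, I)}}_{\text{posterior odds}} \;=\; \underbrace{\frac{\Pr(E \mid H_1, I)}{\Pr(E \mid H_2, I)}}_{\text{likelihood ratio}} \;\times\; \underbrace{\frac{\Pr(H_1 \mid I)}{\Pr(H_2 \mid I)}}_{\text{prior odds}} $$

In words: the new belief equals the strength of the evidence multiplied by the old belief.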
But what precisely do ‘prior odds’ and ‘posterior odds’ mean, and how do they relate to the work of a forensic examiner?
In 1958 Ordway Hilton participated in Session #5 of the RCMP Seminar Series. His article was originally published in that series by the RCMP, and subsequently republished in 1995 in the International Journal of Forensic Document Examiners.
The 1995 republication included the following abstract:
In every handwriting identification we are dealing with the theory of probability. If an opinion is reached that two writings are by the same person, we are saying in effect that with the identification factors considered the likelihood of two different writers having this combination of writing characteristics in common is so remote that for all practical purposes it can be disregarded. Such an opinion is derived from our experience and is made without formal reference to any mathematical measure. However, the mathematician provides us with a means by which the likelihood of chance duplication can be measured. It is the purpose of this paper to explore the possibility of applying such mathematical measure to the handwriting identification problem to see how we might quantitatively measure the likelihood of chance duplication.
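To give a rough sense of the kind of calculation Hilton has in mind, here is a purely illustrative example (the numbers are mine, not taken from his paper): suppose three writing characteristics occur in roughly 1 in 10, 1 in 20, and 1 in 50 writers, respectively. If the characteristics were statistically independent, the probability of a different writer exhibiting all three by chance would be

$$ \frac{1}{10} \times \frac{1}{20} \times \frac{1}{50} = \frac{1}{10\,000}. $$

Whether handwriting characteristics can reasonably be treated as independent, and where such frequency figures would come from, are of course the hard questions lurking behind any such calculation.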
Hilton’s article is organized into eight main sections plus references, and is followed by a discussion among the seminar participants. Today’s review will discuss each section of the article in turn.
Like many document examiners I consider Huber and Headrick’s 1999 textbook, Handwriting Identification: Facts and Fundamentals, to be a seminal work.
In my opinion, it is the best textbook written to date on the topic of handwriting identification. The authors provide a comprehensive overview as well as some less conventional perspectives on certain concepts and topics. In general I tend to agree with their position on many things. A bit of disclosure is needed here: I was trained in the RCMP laboratory system, the same system in which Huber and Headrick were senior examiners and very influential. Hence, I tend to be somewhat biased towards their point of view.
But that does not mean I think their textbook is perfect. While it is well written and manages to present a plethora of topics in reasonable depth, some parts are incomplete or misleading, particularly when we take into account developments that have occurred since it was written.
One area of particular interest to me relates to the evaluation of evidence; specifically, evaluation done using a coherent logical (or likelihood-ratio) approach. I have posted elsewhere on the topic, so I’m not going to rehash the background or details any more than necessary.
This post will look at the topic of ‘Bayesian concepts’ as discussed by Huber and Headrick in their textbook. These concepts fall under the general topic of statistical inference found in Chapter 4 “The Premises for the Identification of Handwriting”. The sub-section of interest is #21 where the authors attempt to answer the question, “What Part Does Statistical Inference Play in the Identification Process?” Much of their answer in that sub-section relates to Bayesian philosophy, in general, and the application of the logical approach to evidence evaluation. However, while they introduce some things reasonably well, the discussion is ultimately very flawed and very much in need of correction. Or, at least, clarification.
One of the key elements in the logical approach to evidence evaluation is the set of propositions used for the evaluation. They are, in a certain sense, the most important part of the whole process. At the same time, they are also one of the least understood.
Today’s post explores the concept of propositions. I will attempt to describe what they are, how they are used, why we don’t change them once they are set, and why they matter so much, among other things… all from the perspective of forensic document examination (though much of it is applicable to other forensic disciplines).
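To make this a little less abstract, consider a purely illustrative proposition pair of the sort a document examiner might encounter (the wording here is my own generic example; in a real case the propositions would be framed around the actual case circumstances): Hp, the questioned signature was written by Mr. A; and Hd, the questioned signature was written by some person other than Mr. A. Under the logical approach, the examiner’s task is then to assess the probability of the handwriting observations E under each proposition, yielding the likelihood ratio

$$ LR = \frac{\Pr(E \mid H_p, I)}{\Pr(E \mid H_d, I)}, $$

where I again denotes the relevant background information. Notice that the propositions are fixed before the evaluation begins; the evidence changes the odds, not the propositions.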