When someone “opens a can of worms” it usually spells trouble. For many people, that phrase evokes a powerful image of a writhing mess of worms escaping from a previously sealed, but now opened, can or container, with the result being serious problems for the owner of said can, often of an unanticipated or uncertain nature. In the context of our work as Forensic Document Examiners, I sometimes hear this coming up in discussions of how to handle questions on the stand. The advice goes along the lines of ‘keep your answers simple and say as little as possible in order to limit any opportunity for questions from the other side.’
It is suggested that lengthy or complex answers will only lead to more questions and more discussion. The latter are the proverbial “can of worms” that one must strive to avoid opening.
That makes little sense to me.
After all, communication is a critical part of our work insofar as we must be able to effectively and accurately communicate with our clients and the courts. If we cannot do this, then our opinion cannot possibly be understood. Instead, there is a real risk that our opinion will be mis-valued or mis-applied to the matter at hand. I suspect the real question that should be addressed is what type, and how much, information should or must be provided in order to ensure proper understanding and comprehension. I would have no difficulty with this recommendation if it went along the lines of ‘keep your answers simple and say only what is required for a full understanding of the issue at hand.’
I am not suggesting that an examiner go on at length about irrelevant or unimportant things—we need only provide relevant information that the trier needs in order to evaluate our evidence fully and properly. Yet that type of information is precisely what people often try to avoid.
The term “can of worms” bothers me because it implies that information ‘hidden’ in the can is itself somehow negative. Sure, worms are not terribly pleasant things for most people—especially if you are not a nematologist, oligochaeteologist, or some other worm fancier—so I suppose it’s understandable that they might be seen in a negative light. In our context, the information we are keeping hidden in the can isn’t, in itself, negative in any way whatsoever.
The only negative in our situation is that the examiner has to do more work and be a good communicator. They have to be prepared to provide information, sometimes quite detailed and complicated, about what they’ve done. I argue that, in general, that is a positive thing. The work we do is not trivial, simple, or easy to understand. So every opportunity we get to discuss that work fully and completely is a good thing.
Some examiners seem to suggest that providing more information will lead to confusion, rather than better understanding. I find that suggestion nonsensical. It might happen, of course, if the person providing the information is incompetent or unqualified.
We are the experts when it comes to our evidence so we really should be able to provide explanations in terms both understandable and appropriate for the audience at hand. That is one of the key functions of any ‘expert witness’. Sure, the subject matter is complex, but that doesn’t mean we should try to avoid talking about it.
Of course, there are the not-so-trivial, and possibly unpleasant, issues that derive from the fact that FDEs are much less consistent and coherent, as a group, than we might like.3 That reality may make any potential discussion… well, challenging, to say the least. But, again, as experts we really should be prepared to discuss and explain everything about our work. Consider, for example, the way we express our conclusions or opinions.
The panel discussion at the 2014 ASQDE-ASFDE meeting disclosed the somewhat controversial nature of conclusion wording used by different examiners and labs. While many groups purport to use a similar approach (and wording) for conclusions, there really is no single homogeneous approach used by FDEs everywhere (and, yes, there are many reasons why those different approaches are taken).
Without belabouring the point on this particular topic, I feel that whatever wording is used for conclusions must: 1) reflect the actual examination and all of the limitations that exist in that process, and 2) be logically coherent and sound. More specifically, and using terminology developed by far brighter minds than mine, there are four essential requirements that must be met by any evaluation and reporting scheme:4
- ‘Balance’ meaning that the evidence/findings should be evaluated given at least one pair of competing propositions; ideally with the first proposition based upon one party’s account of the events and the second based upon an alternative account.
- ‘Logic’ meaning that the evaluation process must be one that speaks first to the probability of the evidence/findings given the propositions (plus relevant background information), and not the probability of the propositions given the evidence/findings (plus background information). This is essential to ensure there is no inappropriate or unjustified transposition of the conditional, as the short numerical sketch after this list illustrates.
- ‘Robustness’ meaning simply that the evaluation process must be capable of sustaining scrutiny by other experts through peer review or cross-examination. It should be based upon sound knowledge and experience of the evidence type including the use, when available, of pertinent databases, published data or ad-hoc case-based experimentation. In other words, ‘robustness’ refers to the scientist’s ability to explain the grounds for their opinion based upon their degree of understanding of the particular evidence type and its probability of occurrence in the relevant ‘population’ relating to each of the competing propositions.
- ‘Transparency’ meaning the entire process should be demonstrable and recorded so as to permit proper review and assessment. This applies to all facets of the examination and evaluation. To achieve this, worknotes should clarify all relevant aspects of the evidence including the interpretation and evaluation of that evidence in terms of the competing propositions. And the report should be written in a way that is suitable for a varied audience (i.e., participants in the justice system).
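To make the ‘Logic’ requirement concrete, here is a minimal numerical sketch in Python. Every figure in it is invented purely for illustration, and the proposition labels (Hp, Hd) are just the conventional shorthand for a pair of competing propositions; the point is only that the probability of the findings given the propositions is not the same thing as the probability of the propositions given the findings.

```python
# Minimal numerical sketch of the 'Logic' requirement above.
# All numbers are invented for illustration only -- not from any real case.

# The examiner assesses the probability of the findings (E) under each of
# two competing propositions:
#   Hp: the suspect wrote the questioned signature
#   Hd: someone else wrote it
p_E_given_Hp = 0.80   # P(findings | Hp) -- assumed value
p_E_given_Hd = 0.05   # P(findings | Hd) -- assumed value

# The examiner's contribution is the likelihood ratio for the findings:
likelihood_ratio = p_E_given_Hp / p_E_given_Hd   # = 16

# The probability of the propositions given the findings also depends on the
# prior odds, which come from the rest of the case -- not from the document
# examination. Transposing the conditional quietly skips this step.
for prior_odds in (0.1, 1.0, 10.0):
    posterior_odds = prior_odds * likelihood_ratio
    p_Hp_given_E = posterior_odds / (1 + posterior_odds)
    print(f"prior odds {prior_odds:>4}: P(Hp | findings) = {p_Hp_given_E:.2f}")
```

The same findings, and the same likelihood ratio, lead to quite different conclusions about the propositions (roughly 0.62, 0.94 and 0.99 in this toy example) depending on the prior odds. That is precisely why the examiner speaks to the probability of the findings given the propositions and leaves the rest to the trier of fact.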
A correct and proper opinion should be able to withstand any ‘attack’ (i.e., it should be ‘robust’, as per point 3 above). The examiner should also be open about how they reached that opinion—they must be able to answer questions like what features were observed, what do they mean, how were they weighed and evaluated, what ‘alternatives’ were considered, and so on.
Questions about all of this are an opportunity, not a negative. The process in its entirety should be exposed for review and discussion (i.e., it must be ‘transparent’, as per point 4 above).
Now, to bring this post back to the original topic, I feel that the “can of worms” argument, in any form, runs contrary to the idea of full and complete transparency. I understand the desire to limit debate or attack from counsel for the ‘other side’. However, as an expert testifying on the stand I want things to be clear and unambiguous. Simple would be nice as well, but simple should never trump proper understanding.
That is why I personally prefer to have more discussion, not less, when it comes to our work. We use words and terminology that we believe are effective means of communication. I don’t believe that is necessarily true. Too many studies looking into the understanding of our conclusion wording have shown a significant disconnect between what we mean to convey and what is understood by our audience. In my opinion, the best way to ensure we are successful in our attempt to communicate is to provide as much information as possible about the examination process, what it entailed and what the results mean.
To that end I say, “go ahead — open the can of worms!!”
Footnotes
3. Remember that there are no controls or constraints on who calls themselves a ‘forensic document examiner’.
4. See, for example, Association of Forensic Science Providers, “Standards for the formulation of evaluative forensic science expert opinion”, Science & Justice, Vol. 49, Iss. 3, Sept. 2009, pp. 161–164, ISSN 1355-0306 (http://dx.doi.org/10.1016/j.scijus.2009.07.004).