The Advantage Blog

Theodore O. Prosise Ph.D.

TEN KEY QUESTIONS: EVALUATING THE QUALITY OF MOCK TRIAL RESEARCH

Updated: Jul 14, 2023


It is important to recognize that mock jury and focus group research projects come in all shapes and sizes; not all are created equal. This article offers ten key questions to help you assess the quality of mock trial research in the manner a Jury Consultant might. It is essential to remember that the quality of the findings—the output—is wholly dependent upon the quality of the methodological choices—the inputs.

Q1: What is the sample size of the research?

Sample size is key in determining how much weight to ascribe to research results. A larger sample increases confidence that the data accurately reflect the strengths and weaknesses of the case, and it serves as a control to ensure that particular findings are not the product of atypical jurors or panels. The best research findings are derived from testing the case with three or more panels of eight to twelve jurors each.

Q2: Is the case presentation identical for all groups?

Case presentation should be identical for multiple groups if the findings are to be reliable. It is critical to assess the consistency of findings across randomly recruited and randomly assigned groups, and to understand why consistency or inconsistency between groups exists, based on what the jurors say about the case and how they say it. This is particularly important if one of the primary research questions involves a measure of damages, arguably the least predictive element of any pre-trial mock jury research.

Q3: How were the participants recruited and screened for security and eligibility?

Randomized recruiting methods like random digit dialing (RDD) result in a more representative sample than other methods. Widespread methods of recruitment include obtaining participants through newspaper ads, Internet ads, or market research databases. These methods are particularly vulnerable to self-selection bias: participants who voluntarily respond to ads have different characteristics from the population as a whole. You do not want quasi “professional” focus group participants in your research. Such participants are not representative, are often too familiar with the focus group or mock trial process, and thus respond differently to the mock presentations of themes, evidence, and law than a realistic jury panel would.

At a minimum, research participants should be screened to ensure (1) that they are qualified to serve as jurors in the trial venue and (2) that they pose no confidentiality or security risks.

Q4: How is the case presented to participants?

Attorney presentations, whether live or videotaped, are effective ways to present case facts to participants. However, other methods (e.g., facilitator-read scripts) can be useful depending on the goals of the research. Presentations should be more argumentative than most opening statements and should explicitly argue the appropriate application of jury instructions to the verdict form questions. The goal is to mimic a jury audience’s experience in the courtroom as closely as possible.

Q5: Are you testing the case in the right way? Will the data be valid?

Jury verdicts are products of small group deliberation. They are the result of collective reasoning, conversations, and persuasion in deliberation. What motivates and arms jurors? What is effective in the small group? Answering these questions is key. Perception dials may generate revenue, but they don’t test the right thing. Jurors listen to the lawyers and testimony, observe evidence, make physical or mental notes, but don’t turn dials. It is how the themes, testimony, and evidence resonate, are recalled (or not), and how they are used in face-to-face deliberation that is critical. Again, the idea is to mimic as closely as possible the ways in which juries actually deliberate to verdicts.

A facilitator can have substantial influence on the conclusions of a group. If the goal is to understand how a jury will likely assess and evaluate two opposing case theories in light of the evidence, themes, and jury instructions, it is best to let the mock jurors deliberate for a good deal of time unimpeded by the facilitator. The mock trial design can then include a focus group discussion at the end, where unanswered questions can be addressed.

Q6: Are the presentations balanced (or weighted in favor of the opposition)?

Ensuring balanced presentations is essential to a good and reliable test. You learn the least from a “win” where the strengths of the opposition’s case and style were not front and center. Test the toughest case against you. Questions germane to the issue of balance include:

  • Is the evidence shown to the participants the most important evidence? Deciding what will be front and center in the mock trial is a critical part of the process. You cannot present everything, so what should make the cut?

  • Were actual witnesses used (as opposed to actors)? If so, how were the witnesses represented? Actors can produce misleading findings, as so much of people’s impressions is influenced by an individual’s unique style, look, and demeanor.

  • Since video depositions are a common stimulus for jury research, are both good and bad portions of the depositions included?

  • Are the presenters equally experienced, skilled, or knowledgeable? Consider having the most experienced attorney present the opposition’s case to increase the likelihood of having equal presentations.

Q7: Who conducts the research?

Quality of insight and execution can vary greatly between practitioners. Most trial consultants have advanced degrees in social sciences (e.g., psychology or communication). There are certainly those without advanced degrees who can offer exceptional service; however, the quality of findings may depend on the researcher’s understanding of the nature of communication, advocacy, social science, small group research, and juror behavior. What are the qualifications of your consultant team? Are they trained to analyze the right things in the data you are collecting (the communication dynamics in the deliberation, based on the messaging in the mock trial stimuli)? Can they interpret, analyze, and assess the most meaningful findings, distill them, and provide practical solutions, advice, and recommendations?

Is it a team focused on the job, with expertise and personal and institutional knowledge and memory, or a cobbled-together set of independent contractors with other jobs and interests? Are these independent contractors free of conflicts? Are they focused on the job, or counting their hourly pay as they work on the project?

Quality research takes care to prevent participants from discovering which party is conducting and funding the research. If the research participants know who they are working for, the “good subject” phenomenon, in which participants try to give the answers they think the researcher wants to hear, could influence the outcome. Care and tact need to be the rule in the execution of a mock trial.

Can you afford a failure? A great deal of the trial team’s, staff’s, and client’s time and expense is invested in the research process, in addition to the expenses of the trial consulting team fielding your mock trial. Is there a track record of reliability, execution, and implementation of the logistical elements so critical and foundational to any assessment of the findings? These logistics include ensuring confidentiality, ensuring an adequate number of participants show up, and providing an adequate facility for deliberations and observations, all necessary elements of a successful mock trial that cannot be taken for granted.

Q8: How are the results of the research collected, presented, and used to stimulate solutions?

An active and engaging process can be the secret sauce of a productive and meaningful mock trial or focus group. Is the data analyzed actively as it is collected, or only passively afterward? What is done with the data and findings at the time of collection, not just in later analysis, is a distinguishing element of a good mock trial. Active engagement with the findings, and an interactive, dynamic process of strategic development while the ideas are fresh and creativity and focus are at their highest, is the way to get the most out of the hard work and expense that the trial team and client have put into a mock trial. High-risk juror profiles, demonstrative/visual advocacy concepts (conceptualized in a war room and tested out in the last section of the mock), and strategic thematic and witness “headlines” can all be considered, distilled, and even tested immediately within the project.

Q9: What is the final mock trial product?

A good report should provide enough information to allow you to judge the quality of the research methodology. At a minimum, it should include the sample size, participant recruitment procedures, an overview of how the project was conducted, and an explanation of the data analysis. If any of these are missing, ask why.

The report should not be a data dump, nor should its insights amount to the “two-handed economist’s” view of the case. Findings should translate into clear, data-driven recommendations; practical strategic and tactical advice should be the norm. The recommendations should be based on unique findings from the mock trial and informed by the tools and techniques in the trial consulting team’s vast arsenal of experience about what has and has not worked in the past.

Q10: Is the budget clear and inclusive?

Are all fees and expenses clearly included in the budget, or will there be surprises based on unclear or unrealistic understandings of expenses? We call this the infamous “page 8 question”: a vague reference to expenses to be determined later, rather than spelled out before the project is authorized.

In short, the quality of your research findings depends upon the quality of the methodology, and the quality of the assessment and interpretation of those findings.

This article is a variation of an August 2007 DRI For the Defense article. For a copy of the original, contact Tsongas. Additionally, if your firm is interested, Tsongas conducts seminars and CLEs on this and related topics.
