Why Your 20-Minute Voir Dire Misses Everything, with Valerie Hans, Jessica Salerno, and John Campbell
Episode Summary
Is there evidence that giving attorneys more time to ask more detailed questions would better predict juror bias? “We found a resounding ‘yes,’” says Valerie Hans, who joins Jessica Salerno and John Campbell to discuss their groundbreaking research. Valerie, of Cornell Law School and a leading authority on the jury system; Jessica, a social psychologist in Cornell University's psychology department; and John, a trial lawyer turned researcher and co-author of JuryBall, published their findings in a 2021 paper titled “The Impact of Minimal Versus Extended Voir Dire and Judicial Rehabilitation on Mock Jurors' Decisions in Civil Cases.” Hosts Alicia Campbell, Nick Schweitzer, and Kevin Doran unpack the significance of their study, including that extended voir dire with case-specific questions identified 42% of jurors as potentially excludable, while minimal voir dire captured less than 2%.
Learn More and Connect
☑️ Trial by Data and Cornell University
☑️ Valerie Hans
☑️ Kevin Doran
☑️ Subscribe: Apple Podcasts | Spotify | YouTube
Transcript
Voice Over (00:00:03):
Every trial lawyer knows that moment when you've built what feels like an airtight case, but you're still lying awake, wondering: what will the jury actually think? Jury research was once a luxury reserved for cases that could support a big data study bill. Not anymore. Join trial lawyer and trial scientist Alicia Campbell, empirical legal scholar Nick Schweitzer, and data guru Kevin Doran as they break down the barriers between you and the minds of your jury. This is the Fred Files, produced and powered by Law Pods.
Alicia Campbell (00:00:38):
Hi, welcome everybody, back to the Fred Files. We are thrilled to be here again. Of course, it's me, Alicia Campbell, and Nick Schweitzer and Kevin Doran, and we're all super, super excited to have three guests. Today we're going to talk about one of our favorite topics, which is voir dire, and really how it's gone wrong, how judges are getting it wrong. I guess I'm going to go out on a limb and say that today, the things they're not doing in court to make sure that it's a better process. So we have Valerie Hans, Jess Salerno, and John Campbell, who for some of you will be familiar, because we have passed around the Law and Human Behavior article many, many times in the hopes that you will file it with motions, asking for a questionnaire, asking for no rehab, asking for a fair process, so that a judge could be a little bit more informed and that would be part of a record if something goes up on appeal. But today you're going to hear from the authors themselves, and it's going to be awesome. So I'm excited. Everybody knows Nick, me, and Kevin. Yes, we've done like a grand total of five of these now, so everybody's like, hey, yeah, Alicia, Kevin, and Nick, I know. Well, if you guys would go ahead and tell us a little bit about yourselves, starting with Valerie, that would be awesome, so that everybody knows who you are.
Valerie Hans (00:01:57):
Sure. I'm the Charles F. Rechlin Professor of Law at Cornell Law School. I'm actually a social psychologist by training, and I've been studying juries for decades. Fascinating subject. Really, really interesting. I've been so interested in how our country selects juries, so this is a great opportunity to talk about that with the rest of you.
Alicia Campbell (00:02:19):
Awesome. Jess, you want to go?
Jessica Salerno (00:02:22):
Yeah. So I am Jess. I am also a social psychologist at Cornell University, but I'm in the psychology department here at Cornell. I've also technically been studying juries for decades now; I started as an undergraduate. My undergraduate thesis was running little mock juries, trying to help them understand science in court better, and I've been doing it ever since. A lot of my research is on how juries deliberate, how they make judgments, and how to improve how we select juries. So I'm also super excited to be here. This is one of my favorite papers I've done, so I'm happy to get a chance to keep talking about it.
Alicia Campbell (00:02:54):
And here, I thought you were going to say it was one of your favorite podcasts.
Jessica Salerno (00:02:59):
That's obvious, Alicia, that doesn't need to be said.
Alicia Campbell (00:03:02):
So tell me about this paper. What were the results?

Jessica Salerno (00:03:03):
Yeah, well, let's start with the headline before we get into the details. So the paper was essentially designed to see if we could find some evidence that giving attorneys more time to ask more detailed questions would better predict juror bias than the kind of minimal voir dire that typically happens when it's limited. And we found a resounding yes: extended voir dire questions predicted jurors' verdicts and their damages, and none of the minimal voir dire questions did. Almost half the sample admitted to bias when we asked the right questions, when only 2% did when we asked them to identify their own biases in sort of yes-or-no questions. We also found no evidence that judicial rehabilitation helped the situation at all.
Alicia Campbell (00:03:42):
And then John Campbell, which if anybody listened to the Res Ipsa podcast, you get to know him really well with Kevin. But what is your role here, John Campbell?
John Campbell (00:03:51):
Hey. My role probably is best described as somebody who was in the practicing bar, but also working in the academy, who recognized that although academics knew a lot about jury selection, and attorneys had a lot of opinions about jury selection, there were some bridges that probably needed to be built, or gaps filled. So this paper that we're going to talk about is really an interesting model, because there were private practice attorneys who were struggling with the fact that when they wanted to pick a jury, they were facing what they viewed as challenges that prevented them from rooting out people who might have really strong biases. Somebody who might, if given the chance, say, I don't like this kind of lawsuit because, or I have a real concern about, or I don't like non-economic damages, whatever. But in some states and in some federal courts, those questions weren't being asked, and still aren't asked.
(00:04:43):
And so I had attorneys that approached me while I was at the University of Denver who said, we would gladly fund research into jury selection, because we think that we know what's happening, but the judge isn't going to listen to our opinion. In fact, they think it's slanted and biased because we're just trying to get a better jury. And so it sort of seeded this idea of: what if we took funding and did an honest and clear academic study on jury selection and how we identify bias, existing methods and maybe better methods, and, while we're at it, looked at rehabilitation, the process of asking jurors if they can be fair, which is very common. And so then my job when I got this funding was to say, what would the dream team look like to study this issue, so that we do it well and so that any results we have are credible and clear and precise. And so I reached out to Valerie and to Jessica and to the late Lee Ross, and we got to work. And so I think you'll learn about that paper as we go along. But my role maybe was bringing the team together to do the work and then sort of tagging along with people a lot smarter than me as we did it.
Valerie Hans (00:05:53):
John, can I also say, I think your presence was absolutely essential, because left to our own devices, the experiments and studies we designed might not have actually reflected the way real lawyers were operating in the real world and real jury selection. So you really kept us tightly linked to that world as we developed our experimental methods.
Jessica Salerno (00:06:19):
And I think one of the reasons I mentioned this was one of my favorite papers, the big reason, is that it was the first paper where I started collaborating with John and Alicia and real attorneys. And I remember when John called to pitch this study. There's a real unfortunate thing that happens where a lot of people in academia, psychologists, for example, are incentivized by academic publishing, and that tends to require that you discover something brand new about psychology. And when John called and said, well, I think these attorneys might be really struggling because they're not being allowed to ask specific questions, they're not being given time, and when the judge asks jurors if they can set aside their biases and they say yes, we think that might not work, my reaction at first was, well, yeah, but we already know that. But we know that in the basic psychology world, and practicing attorneys and academics don't get together very often to really zero in on what are the important questions that will actually help attorneys.
(00:07:12):
How do we take a lot of the science we have and actually translate it in a helpful way, rather than just keeping it in an ivory tower and not really applying it to any real-world context? So it started a great program of research now where we all work together a lot on things, where I think John and Alicia's role is really bringing us the important questions, making sure we're not suggesting wild things that don't make sense for the real world. So it's a really special collaboration that I think doesn't happen as often as it really should.
Alicia Campbell (00:07:39):
Awesome. And so yeah, we're all here to talk. What's the name of the paper? Good question.
Valerie Hans (00:07:46):
I've got it right here. “The Impact of Minimal Versus Extended Voir Dire and Judicial Rehabilitation on Mock Jurors' Decisions in Civil Cases.” Here it is.
Alicia Campbell (00:07:56):
Wonderful. And it was published in Law and Human Behavior. Yes.
Valerie Hans (00:07:59):
Yeah,
Alicia Campbell (00:08:00):
Perfect. Can somebody tell me, because a lot of times when lawyers think about publishing, we harken back to our law school days and law journals. What's the difference between a law journal and where this was published?
Jessica Salerno (00:08:13):
I'll throw that to Valerie because she publishes a lot in both law journals and psychology journals, and they are very different processes. I don't even understand the law world.
Valerie Hans (00:08:20):
Well, in a law journal, you are pitching to your students and hoping that you've got something that's exciting enough, sexy enough, that they will bite on it and agree to publish your work. But in a peer-reviewed journal like Law and Human Behavior, the editor takes your paper, with your names and identification and all your affiliations absolutely removed, so it's blind, and sends it out to other experts in the field. They review it and point out all the things that perhaps you've overstated, problems with your analysis, or strengths of the paper, and they come back and make a recommendation to the editor about what changes might need to be made before the paper can be published. And that's the process that we followed in this particular case. And we were lucky to get it published in the top journal in psychology and law.
Alicia Campbell (00:09:12):
And how many people were studied?
Jessica Salerno (00:09:14):
We studied over 2000 mock jurors. So we recruited thousands of mock jurors to play the role of a juror and read one of three different civil cases. So they were kind of divided up across three different cases. One was a malpractice case, one was an insurance bad faith case, and one was a wrongful birth case. So we have probably around 700 jurors all reading the same case, so that we could measure things like: how do the attitudes that we asked about in voir dire predict how they interpreted that case and judged that case? Because we have the benefit of seeing how hundreds and hundreds and hundreds of jurors read the same case.
John Campbell (00:09:51):
I was just going to say, Jess and Val do more of this than any of us, but for those who aren't used to research, I was late to this, sort of learning from people around me. One of the things that's become more talked about and important in research like this is the idea of replication: you might get effects in a study, and they might even be statistically significant, but it's important to see that that happens more than once. I would just point out, for the judges who might listen to this podcast, or for attorneys talking to judges or legislators about this, that one of the things we wanted to do with this study was make it robust enough that, whatever we found (we had hypotheses, but we didn't know what we'd find), we had addressed the replication issue to some degree even within the paper.
(00:10:37):
And so it is unusual to see a study where you have three different cases being tested, and each of those cases is powered with 700 participants, and then you're looking at effects across all of those. I would just say that our goal, because we were lucky to have funding to do it, was to run a really robust study. There are bigger jury studies, like the Chicago Jury Project in the fifties and others, but it is somewhat unusual to see a jury study that has 2000 people. And so that was really the goal, that this would be unimpeachable, at least in terms of having multiple cases and lots of participants.
Alicia Campbell (00:11:22):
That's awesome. So then what's the best way to get started with how you conducted the study? You guys tell me where we should start in talking about that. What's the best way to go through it?
Jessica Salerno (00:11:33):
I can jump in with just sort of a broad overview of the methodology that we used. We conducted what we often refer to as a mock jury experiment. So a lot of us here are social psychologists, who aren't the first type of psychologist people think of. They often think of therapists and clinical psychologists, but we're what's called experimental psychologists. So our specialty is conducting highly controlled scientific experiments to determine causality, meaning our basic methods are very similar to those of medical researchers who are testing the efficacy of a drug, for example. But we just do it in the context of juries. So the basic methodology is pretty simple. We recruited thousands of mock jurors online. There are services online, very useful for researchers, that will recruit people who want to participate in surveys. So we recruited thousands of them, we brought them to our site, and we had them all serve as mock jurors.
(00:12:29):
So what that meant was they would engage in some form of voir dire, we had three different versions, which I'll mention in a second, and then they read a summary of real cases. So again, we used three different cases. And then they made judgments as if they were real jurors. They decided whether or not they thought the defendant was liable, and they decided damage awards. And what we did in our study is we randomly assigned all of these thousands of people to go through different versions of voir dire. So first we manipulated what type of questioning they got: we randomly assigned them to get either no voir dire at all before they judged the case, or a minimal voir dire version. And this was really the type of voir dire that we were concerned about, where attorneys are highly limited by time and they're forced to use very few questions, but also what we call minimal voir dire questions, which involve things like, have you been a juror before?
(00:13:21):
Have you ever been sued before? Demographics? And a certain type of question that attorneys who don't have much time are forced to use, which is where you ask the jurors to self-identify their bias. So you ask a yes-or-no question, something along the lines of, Valerie, can you think of any reason you might be prejudiced or biased in this case? And they say yes or no. So they're sort of forced to do this. They don't have much time to ask about specific things. So one third of our jurors got that type of voir dire before they read the case. And then the final third got what we thought was a much better version of voir dire, an extended voir dire, where they got those minimal questions but also a whole bunch of case-relevant attitude questions. We knew as psychologists that these types of questions are typically much more effective in revealing people's biases.
(00:14:07):
So we asked all kinds of things, and we drew a lot from both Valerie's previous research and from John and Alicia, who are amazing consultants in a lot of these cases. So they had a lot of questions that they found to be effective in voir dire. These are things like how they feel about civil cases in general, how they feel about non-economic damages, whether they trust lawyers, whether they're a Trump supporter. It just ran the gamut, but the questions were much more specific. So one third of the jurors got those questions before they judged the case. And then they were again randomly assigned to either go through judicial rehabilitation or not. So half of the jurors went through our version of this, where we had a video of a judge talking about judicial rehabilitation, telling them that it's very important to not be biased and to set aside any biases, and asking them if they thought they would be able to do that. And they had to say yes or no. And half of the jurors did not go through that judicial rehabilitation. Then they read the case, they made their judgments, and we asked them some questions at the end about how biased they thought they were, whether they're aware of the biases that they might hold. And then we analyzed all that data. So that's sort of a broad overview of how the study worked. I don't know if anyone else has anything they want to add, but that's the basic gist of it.
Valerie Hans (00:15:18):
I just want to underscore that the kinds of questions we asked really came directly from the field, from John and Alicia's work and from some other work that I did with actual jurors. So that, I think, really adds to the importance of the study, because it suggests that what lawyers could be doing in the real world actually could affect the outcomes of the cases they try. But we're getting ahead of ourselves. We haven't given results yet.
John Campbell (00:15:48):
I was just going to add, for those listening, to kind of situate the questions that we used that were sort of the minimal voir dire, really general questions: I actually pulled most of those off of a site I found where a federal judge had said, here are the questions I'll allow, or that I ask the jurors. And they are representative of what many attorneys have experienced in states where, at the time we did this study, which was funded largely by Colorado lawyers, there was a statute in Colorado that you got 20 minutes; in some federal courts, you either get 20 minutes, 10 minutes, or in some courts no minutes. And so we really modeled the general, high-level questions off what happens when that's all the time you have. And then, like Valerie said, the more detailed questions, I would break 'em into two categories just so people can imagine 'em: how do people view lawsuits, do they have feelings about non-economic damages, lawsuits, tort reform, caps on damages; and then questions that were specific to our cases. So for example, if it's a bad faith case, how do they feel about insurers? If it's a med mal case, what are your views on doctors? Things that you would think, and that I think many attorneys intuitively believe, will help them get at people's feelings in a more specific way for the case they're working.
Alicia Campbell (00:16:59):
Hey, Kevin, do you know what a judge rehab is?
Kevin Doran (00:17:02):
No.
Alicia Campbell (00:17:03):
In voir dire, have you ever sat on a jury?
Kevin Doran (00:17:06):
No.
Alicia Campbell (00:17:07):
Do you know anybody who has?
Kevin Doran (00:17:09):
A couple of my friends.
Alicia Campbell (00:17:10):
Yeah. What did they say about it?
Kevin Doran (00:17:13):
They were happy to participate, but they did not enjoy it.
Alicia Campbell (00:17:16):
They did not enjoy it. Really? Why?
Kevin Doran (00:17:19):
Because of the case material.
Alicia Campbell (00:17:21):
Oh,
Kevin Doran (00:17:22):
It was criminal cases? Yeah.
Alicia Campbell (00:17:23):
Okay. So they just didn't like the subject matter, but they thought the process was really good.
Kevin Doran (00:17:27):
Yeah, I was surprised they were fine to take off work. They were okay that it was sort of a net loss financially for them. They made lots of jokes about getting a bus ticket or something like that. And they actually did not disclose anything about the case either, which was funny. This was actually very recent: one of them served just recently, and then a year ago another one did. So I never have, and I'm always paranoid. With my living situation, I'm always in and out of the country, and I'm always paranoid that I'm going to miss a letter or something. I wish they knew what email was. But yeah, someday I will.
John Campbell (00:17:57):
Kevin just sounded like a narco trafficante. "My situation, in and out of the country." We might want to clarify some of this.
Valerie Hans (00:18:07):
Alicia, I actually have served on a jury. I don't know if anybody else on our panel has. Really? Yeah. It was years ago, when I lived in the state of Delaware, which I think at the time was known for the most efficient voir dire, on the order of 15 to 20 minutes. And so I slipped onto the jury even though I had already written a book on juries, and I think lawyers who knew that might have been a little nervous about it. But in contrast to your friends, Kevin, I loved every moment. It was fascinating. It was really interesting, and I feel like I've really drawn on it a lot in better understanding this institution that we all study.
John Campbell (00:18:47):
I agree. I was thinking the same thing. I was like, man, this is low-hanging fruit, right? Have you ever written books about juries, maybe, or something? But no, no detection of that, Valerie. I had the opposite experience. I was called for jury duty, and you have to write down what you do for a living. And I wrote down plaintiff lawyer, law professor, jury researcher, and the judge called me up and said, you can go. That was it. They looked at the forms. You can go, no question.
Jessica Salerno (00:19:15):
I feel like that is the truest sign I've ever heard that voir dire was not sufficient. If they let Valerie Hans, the biggest expert on juries, onto a jury, something is wrong.
Alicia Campbell (00:19:26):
Yeah. I sat on a panel and I was in the box, and the only question that I got, it was a criminal case, was from the prosecutor, who asked me how old I was, and I said 20. And they were like, thank you. And after that I got dismissed. So that's as close as I've been. I guess I was too young. I don't know.
Jessica Salerno (00:19:44):
I've never even been called. I would kill to be on a jury. I would pay to be on a jury, but I've never even been called, never even gone through jury selection. I would love it.
John Campbell (00:19:51):
There's a movie about paying to be on a jury, or working your way onto a jury. I would recommend it. I think it's a documentary. So you could try that.
Alicia Campbell (00:19:59):
Nick. Yeah. What about you?
Nick Schweitzer (00:20:01):
Oh, me. My jury experience is I've been called multiple, multiple times. I don't know how I'm lucky enough to always be called. Often I have to show up, and I've started voir dire three or four times over my life and always been cut.
Alicia Campbell (00:20:20):
Oh, wow.
Nick Schweitzer (00:20:21):
Yeah, I don't know. So yeah, I've never actually gotten that experience except I've seen voir dire actually plenty now. So
Jessica Salerno (00:20:28):
I've actually daydreamed about how I could honestly answer the questions but not reveal just how much I know about juries. And I think the one thing that would get me is if they asked what my dissertation was about, or the names of my articles. But I've got a few I could pull up. I think I could claim I'm a psychologist. But I've daydreamed about how I could somehow get on there. I don't think it's ever going to happen, though.
John Campbell (00:20:50):
I think that ship has sailed, because it's now, if not standard, certainly recommended, as long as you're ethically allowed to in the state, to Google jurors. So even if you didn't volunteer it, I suspect it gets turned up.
Nick Schweitzer (00:21:03):
Val, I wanted to ask, just because it's something that we were all talking about: we were all in different places when these things happened. And I was thinking, because formerly I lived in Phoenix, a very large city, and now I live in Ithaca, New York, a very small place. Are there different pressures on making sure you seat a jury, because you have a much smaller number of people to draw from when you're in a small place like where Jess and I are now?
John Campbell (00:21:31):
Well, maybe I'll take that to start, but I think there'll be more than one answer here. I will tell you, Nick, that I think in most places, barring really small counties, there are probably enough people. I think it's the pressure on seating a jury that judges feel sometimes. So maybe we should back up. You see judges rehabilitate jurors, and to be sure we're all talking about the same thing: you'll see a judge hear a juror say something that I think everyone agrees might be bias that has to do with the case. And then the judge says some version of, okay, I understand you've expressed these views, but could you follow the law and instructions as I give them to you, and put those biases aside? And maybe it begs the question, why is the judge trying to save a juror who's expressed bias?
(00:22:20):
I don't think it's usually the population of the venue. I think it's the size of the panel, because many judges, for efficiency and time, will call a panel of 40 people. But they need, for example, to seat a jury of 12 with three alternates, and they need to give each side peremptory strikes. And so once you count even just three peremptory strikes per side, and another strike or two for alternates, plus the 15 seated, that would already eat up 22 people or more. So then often, and Alicia and I have seen this in real life, a judge first starts striking people for cause because they expressed a bias. They said, I don't like lawsuits. Adios. I would always vote for the plaintiff because I feel sorry for people who are hurt and I just couldn't turn 'em away.
(00:23:07):
Adios. And then by the middle of the panel, the panel starts looking thin, and the judge starts rehabilitating or saving people. So we're really jumping ahead. But to me, one of the things attorneys need to do proactively, and I think our paper would support this, is encourage the court to call panels big enough that we don't have this concern that we're going to bust the panel, so that we don't see jurors as something that, no matter how many biases they express, must be rehabilitated and saved, because otherwise we'll have to call in a new group of 40 and start all over and waste another day or two. That, to me, is probably driving it more than the fact that there are fewer people in some places.
Alicia Campbell (00:23:47):
Well, does your paper contemplate that? What happens a lot now is they call a panel of 40, and then the new thing, I've been in various federal courts where this seems to be, I don't know, going around, maybe they have a federal judicial listserv, is it's like, you know what, we only technically need six, so we're going to seat nine. And then you have three alternates, but they also are going to participate, so it's actually a bigger jury. But if you have some fall-off with jurors, because either somebody gets sick or somebody's inappropriate on the jury, meaning they've already come to a conclusion way, way before they're supposed to, you can lose people and end up with a jury of six. And what that does to the jury math is mean that 40 is sustainable, at least in a federal judge's eyes. If I get 40 and I only need nine or eight, then I've got alternates, and I can at least have six, worst-case scenario.
John Campbell (00:24:44):
Yeah, I know what you're talking about, Alicia. I'll just mention, we see this regularly, that judges will actually sort of pressure attorneys to say, we're going to seat eight, but I want you to agree that even if a couple get sick, we'll just proceed with six. Or, we're going to seat 12, but if somebody gets sick, or a couple people don't show up, or somebody's late and we can't wait, you'll just proceed with 11 or 10. And if both parties agree, in most states and courts, down to a certain number, that's acceptable. So yeah, in federal court in particular, you see incredibly small juries, juries of six.
Valerie Hans (00:25:16):
Yeah, it's so interesting. Since the US Supreme Court made what I think was a wrong decision some years ago to permit smaller juries of size six, as opposed to the traditional size of 12, we have lots of research now on the superiority of larger juries. First of all, there's just no question that they are more representative of the community. So if you want to get everybody in there and have everyone have a voice in resolving disputes, civil or criminal, larger juries are really the way to go. But in addition, their judgments are more stable. So I think, since the research has accumulated and is quite definitive, federal judges looking at that thought, well, at the very least we've got alternates here, even though the minimum number is six; if we've got alternates here, we should allow them to participate in the jury deliberation. So it's been a kind of interesting thing. Of course, what I would've liked to have seen was, let's go to 12-person juries, but that hasn't happened, at least not yet.
John Campbell (00:26:21):
Valerie, what would you say, if I had to put you on the spot, is the right size jury? Not necessarily 12, which I know is a common number. What do you think the right size jury is?
Valerie Hans (00:26:32):
I like 12. I think 12 makes a lot of sense. If you look at representation of racial or ethnic groups, or other particular minority groups in a community, you still have a shot at getting those represented on a 12-person jury. So I've been a fan of juries of size 12 for some time. I think the only larger jury in the world is in Scotland, which has a jury of size 15 and a majority verdict, so they can decide a case with just eight people. With seven people opposing, that could still be a verdict in Scotland.
John Campbell (00:27:13):
Wow. Putting my plaintiff attorney hat on, I'm going to start filing cases in Scotland. Eight to seven sounds a lot better than anything I know of. The lowest I know of, at least in a 12-person jury, is nine to three. You see that pretty often, that nine-three is good enough, in California and Missouri and other places. Some states are 10-2, too. And then you have states like Maine that have 12-person juries, but you need all 12.
Alicia Campbell (00:27:33):
What were the results? Let's start with the results of the paper dealing with voir dire and then we can dig into the nitty gritty from there.
Jessica Salerno (00:27:39):
Sure. I would say the paper actually has a lot of results, so I would organize it into a couple of sections. The first was one of our main hypotheses, which was whether allowing the more extended voir dire would be more effective than the minimal voir dire. The second category of findings we had was about how effective judicial rehabilitation was. And then we have some follow-up findings that ended up being really interesting, which were a little more exploratory. So in terms of whether extended voir dire was more effective than minimal, we found a resounding yes. I mean, I don't think the results could have possibly been any more clear in favor of the hypothesis. The way that we analyzed these data was we looked to see how strong a predictor those minimal questions were of how the jurors actually decided the case, compared to how predictive the case-specific ones were.
(00:28:31):
What does that mean? The minimal questions, again, were questions like: Have you been a juror before? Demographic questions, and those self-identifications of bias, where we asked them, are you prejudiced, are you biased? We found that literally none of those predicted how they judged the case. Not one of them was related in any way to how they interpreted the case evidence and decided their verdict or their damage award. But in contrast, almost all of the extended voir dire questions were statistically significant predictors of verdicts and damages. When I say statistically significant predictors, what that actually means in practical terms is that we had 700 jurors reading the exact same case, and how they answered our questions was related to how they judged it. What that means is that they were clearly filtering how they interpreted and weighed the evidence through those preexisting biases.
(00:29:23):
For a specific example, this was one of John's questions, which ended up being, I thought, a really powerful illustration. He offered up a question that I believe he and Alicia use in a lot of their work, which is to explain to jurors what non-economic damages are and ask them whether they're willing to give them or not. And I thought that this was really insightful, because as a quick side note, when I've observed voir dire in civil cases, I'm always surprised at how much the court assumes naive jurors have any idea what's going on. They'll say something like, oh, non-economic damages, do you support this or not? Jurors don't know what that means. They haven't had time to think about it. And so this question explained to them what non-economic damages were and just asked them, before they judged the case, whether they thought they would be willing to give those or not.
(00:30:09):
We had a split. There was a good chunk of people who said they were not willing to give out non-economic damages, and we found that that was a very strong predictor of verdicts. So on average, if they said they were willing to give non-economic damages, they awarded around 14 million in one of the cases. If they said they were unwilling, they gave about 10 million. A pretty big gap there. But what I found really disturbing is that attitude towards non-economic damages also predicted how they judged liability in the case. This should not have been related: whether they have a general philosophy that non-economic damages aren't good should not factor in any way into how they judge liability. But it did. We found a pretty big effect, actually: people who were on board with non-economic damages voted liable 65% of the time, and those who were unwilling voted liable 36% of the time, which is a big difference.
(00:31:02):
And that was all predicted by just their attitude towards non-economic damages, and we found lots of effects like that. So the main take-home here was that lots and lots of the questions clearly shaped how jurors judged the case. And I'll throw out one other quick finding before we move on to judicial rehabilitation. Another thing we wanted to do, to get at this question of whether extended voir dire is more effective, was to classify jurors into what we called potential excludables and non-excludables. What we meant by that was we looked at how they answered the questions to see if, in answering any of them, they basically admitted that they were either unwilling to follow the law or that they were pretty severely biased in one direction or the other.
(00:31:45):
So if they said they were unwilling to award non-economic damages, if they said the burden of proof was wrong and either too high or too low, if they admitted that they think the plaintiff or the defense would have a really hard time convincing them — things like that. These are things that, if they admitted them in court, could probably be used to argue a cause challenge, or could be fodder for a peremptory challenge. And we found huge differences between those and how they answered the minimal question of, can you think of a reason you're prejudiced? These are all the same jurors, the same people answering these questions. When we asked these 2,000 people, can you think of a reason you might be biased or prejudiced, we got less than 2% of people saying yes.
(00:32:29):
So they're either unaware or unwilling to admit bias with that question. But if we looked at our excludable category based on the specific questions, that jumped up to almost half of the participant pool: 42% of those jurors did admit to something that clearly was evidence of a bias. So it's not that jurors are unwilling to admit this; they just have to be asked the right questions. You can't simply ask them to identify their own biases. I think another big difference is using words like, are you prejudiced, are you biased? People don't want to say yes to that. But if you ask them questions that might reveal a bias, rather than asking them to label themselves as biased, you clearly identify many more people who are going to read the case in a biased manner.
Valerie Hans (00:33:11):
That's just a great overview, Jess. The questions that I found so intriguing, and that I think really form a pattern with the hostility to non-economic damages, are the questions we asked that suggest people who don't want non-economic damages also just do not trust plaintiffs. They don't trust litigation; they have concerns about the motivations of the plaintiffs who are bringing lawsuits. And all of that affects not only the verdict, as you were saying, but also the damages that you could expect to get from individuals like this.
Jessica Salerno (00:33:46):
Absolutely. And so the next question we wanted to test, now that we've established that these jurors clearly have preexisting, case-relevant biases that could be revealed during voir dire, was whether judicial rehabilitation is effective at what it's supposed to do, which is to break that link, or reduce the relationship, between these preexisting attitudes and how they judge the case. So here we're looking at whether the relationships between the biases jurors revealed and how they judged the case are diminished at all for the half of the jurors who, you'll recall, we made go through judicial rehabilitation. And across the board, we found that judicial rehabilitation does not reduce any of these links or relationships. What it does do, we found, is make jurors think they're less biased.
(00:34:34):
So at the end of the study, we asked jurors, how much do you think your attitudes towards litigation or lawyers affected your judgments? And almost everyone, of course, said not at all, or maybe a little bit. But those who went through judicial rehabilitation thought their biases were significantly less influential than those who did not, which suggests it's giving them a false sense of security. It's making them feel kind of credentialed: I'm good, I've set aside my biases. But the data show that that is a false sense of security; judicial rehabilitation is not actually making them less biased, it's just making them confident that they're less biased.
John Campbell (00:35:10):
Before we go further, I thought I'd zoom out. I know as a lawyer, when I listen to people talk about research, even when Jess is explaining it very plainly, sometimes you kind of get lost in the weeds. So I'll look at this more as a lawyer than a researcher, because that's probably my first hat, and imagine talking to a judge, and how you might use this research to help them understand. I believe most judges want to do this right, but judges aren't psychologists; they don't study human behavior that often. So they may not have thought about these issues much. And I think a lot of 'em genuinely believe, look, I can just say to somebody, take your bias and put it aside, promise me you'll just follow the instructions — and they will. And they also believe that, look, to the extent we're going to find bias, we're going to find it in a few questions.
(00:35:55):
We don't need to talk to 'em for hours. So if I were to say, well, what did you find? First of all, we found we need bigger panels, because for any given case, there are a lot of people who have biases relevant to that case. Now, it doesn't mean they can't serve as jurors, but it might mean they can't serve as jurors on this case. You can imagine somebody who really hates non-economic damages. Well, they might be great on a contract case or a criminal case, but they might not be great on an injury case. And somebody who has a very specific view of doctors, because a doctor saved their child and they could never imagine holding a doctor liable, might not be the right juror for a med mal case, but they might be great on a patent case or another injury case.
(00:36:36):
And so one of the takeaways for me was: we need panels big enough that we can find the people who are not a good fit for this particular case, and we need a recognition that that is something that can be found, if questions are allowed about both views on lawsuits and about the case in particular. And that doesn't have to take forever, but it does mean that we have to get past the basic questions. So that was encouraging, but also kind of a reminder to attorneys and to judges: we need panels big enough, and we need to let the questions be asked. And then the next part is, we'll solve the rehabilitation problem that way too, if we understand that you can't ask people to take 30, 40, 50, 60, 70 years of beliefs — because biases, in the simplest terms, are just core beliefs, feelings, genuinely held beliefs — and simply set them aside because you asked them to.
(00:37:29):
And even if they made an honest effort to try to do that, they might not know how, or how much, to adjust. And so instead — and I want to be clear, I'm a plaintiff's lawyer, but our paper is entirely neutral on this point — if somebody says, I'll always vote for the plaintiff because I feel bad for people, and I can't imagine ever voting against a plaintiff because they're hurt and they need the money and I care about people, that person probably shouldn't sit, because they've said they can't consider the defense. And if somebody says, I hate lawsuits and I hate non-economic damages, and I think people who bring lawsuits are ruining society, they probably shouldn't sit, because they're not going to listen to the plaintiff. And if we had enough people and we allowed this process, we would remove these outlying folks.
(00:38:14):
We'd leave people who, for this case, can listen to the evidence and render a decision based on the facts of this case. And one of our conclusions was that that is necessary to satisfy the constitutional requirements in federal court and many state courts, and certainly the precedential and statutory requirements in many courts. And so I don't want to understate it or overstate it, but I think our paper strongly supports the idea that failing to allow meaningful jury selection, and then failing to exclude people who have expressed clear biases, risks seating juries that don't meet constitutional and statutory requirements and that decide cases based on something other than the facts. And so then the last thing I'll say, and I'll shut up, is: if you really think about it as a judge or an attorney, what in the world did you do discovery and motion hearings and development of case themes and visuals on both sides for?
(00:39:11):
What was all that for, if you then sat a jury that couldn't consider that evidence, because they had beliefs strong enough that they weren't going to be able to get past them? The trial was theater. And so to me, what I would want courts to hear is: there's not really a genuine debate about this among people who study juries. We need to have real jury selection. That doesn't benefit one side or the other; it benefits the cause of justice. And as we'll talk about, there are efficient ways to do it that don't take days or weeks, but guarantee that we seat a real panel that will decide cases on the facts — which I don't think anybody could actually be opposed to.
Jessica Salerno (00:39:50):
And I think that's a great point, John. Also, something our paper touches on, and kind of secondarily contributed, is that — I know bringing in bigger panels obviously takes a lot more time and resources — what we discovered in our paper is that online questionnaires are super helpful. You can get all kinds of information about these jurors and exclude a lot of them before they even show up to the courthouse. And there's a lot of basic psychology behind this too: people are going to be more honest, more thoughtful, when they're sitting in the comfort of their own home. They're not in front of an intimidating judge and a bunch of strangers, having to admit how biased they are. You might get more thoughtful, more honest answers on a questionnaire. And our paper actually was heavily relied on in Arizona when Arizona banned peremptory challenges.
(00:40:37):
There were a lot of concerns raised that attorneys weren't going to be able to conduct voir dire in effective ways. And so they used our paper to institute a lot of really great reforms in Arizona. They took our recommendations to discourage judicial rehabilitation, to give enough time for case-specific attitudes, to call bigger panels, as John mentioned — but they also really encouraged and started instituting online questionnaires. And I've spoken with judges since these changes were made, and they really liked the online questionnaires. Even some of the judges who hated the idea and were very skeptical have come around and said it makes the whole thing more efficient. We can bring in a lot more viable jurors if we've already screened people out for hardship, cause challenges, things like that. So I think both bigger panels and online questionnaires can make what would normally be a more cumbersome jury selection process a bit more efficient.
Alicia Campbell (00:41:33):
So I have a question. With your research and the results, would you say that a question about whether a person supports lawsuits in general — whether that be through caps, there are too many, I don't like them, I'm going to be hard on the plaintiffs, I don't think you should sue — don't you think that's a characteristic similar to, hey, do you know any of the plaintiffs, do you know any of the plaintiff's attorneys? Because when we do jury selection, those are common questions: Do you know any of the witnesses? Do you know any of the defendants? Do you know any of the plaintiffs? Do you know any of the lawyers? Do you know any of the judge's staff? It's implicit that we understand that could be a biased view, because you're seeing through the lens of knowing a person who's directly involved — just like hating lawsuits. Are some of these questions, in your minds, based on the research, similar to that? Does it put up such a blind spot that leaving those jurors on would be like leaving on someone who knows one of the parties?
Valerie Hans (00:42:36):
Wow, I've never thought of it that way, but I guess I do see them somewhat differently. Personal connections to parties are often encountered, and judges have no trouble excluding people on those bases — and that kind of goes without even finding out what kind of relationship it is, whether they love or hate the person. But the other kinds of things — attitudes toward lawsuits, attitudes toward litigation, especially hostility toward people bringing litigation — are attitudinal, and can affect so many things in terms of the interpretation of the evidence they're going to see: the plaintiff who is injured, the severity of the injury. In a different research project, I interviewed civil jurors in a northeastern state, and I asked them questions that are very similar to, and overlapping in some ways with, the questions we asked in this project. I was able to interview seven out of every 12 jurors on average. I took what I call their litigation explosion attitudes — that is, hostility to litigation, hostility to plaintiffs — and calculated a kind of jury average. And this kind of surprised me, even though I had predicted it: that set of collective attitudes was actually linked to the damage award amount the jury decided. So we're really talking during this podcast about mock jury research, but it clearly feeds into, and is similar to, what is actually going on in the courts.
John Campbell (00:44:10):
Yeah, that's fascinating, Valerie. So when you say their litigation explosion measure — how they view lawsuits, or maybe call it their tort reform measure or whatever — am I understanding correctly that as people expressed more skepticism, let's call it, about lawsuits, or negative feelings about them, they gave lower damages?
Valerie Hans (00:44:28):
Absolutely. Lower awards.
John Campbell (00:44:30):
Yeah.
Valerie Hans (00:44:31):
And those who were open to the possibility that the plaintiff was really injured and they didn't automatically have doubts about whatever the plaintiff was saying, those were people who were more generous collectively on their awards.
John Campbell (00:44:45):
We're talking around something that maybe we should say explicitly, too, which is that we hear this not just from judges, and we don't just see this in selection — we hear it from lawyers. Alicia and I talk to a lot of lawyers, and we sometimes hear lawyers say, well, I'm not too worried about my jury, because I know that between the way I present, and my personality, and the evidence, I'll persuade anybody. I know I can persuade 'em. And I always say to them — and to me this has been helpful to imagine — well, maybe you ought to think about what you're really saying here. Imagine somebody who has, for example — I do a morning coffee meet-up; I have coffee with some dads here in Madrid regularly — imagine somebody who has a group like that. They meet up at the cafe and have coffee, and imagine that for the last 15 years, once or twice a month, the subject of lawsuits comes up, and they talk about how they hate 'em.
(00:45:35):
And everybody's kind of agreed in the group that lawsuits are hurting America, and that people are whiny babies and they sue too much, and a lot of people are just looking for handouts or lotteries, and we need to get rid of this stuff, and I'm sick of it. And they talk about the McDonald's hot coffee case, and they say, can you believe that lady got a billion dollars for spilling her own coffee? And they talk about all this, and that's how they view lawsuits. And then they show up to court, and they have to sit on a lawsuit, and they're asked to decide if they should award a significant damage award for somebody who's badly hurt in a case where there is liability. How do they go back to their group after that conversation of decades and say, you know what I did? I gave $30 million to this lady who was hurt.
(00:46:18):
I think it is not realistic that most of them do — not because they're dishonest, or because they're not trying, but because we are social. Our beliefs, our stated beliefs, the things we've said to people in the past exert real pressure on our behavior, and it would be tough to go back and have that conversation. And so the idea that that person, in the course of a week or two or three in trial, will be so persuaded by the evidence — and maybe the attorney's charm, in the case of some of these attorneys who believe this — that they will set aside decades of conversations with friends, and risk going back to those conversations looking like they didn't uphold their beliefs? To me, thinking about it like that helps me understand why biases are maybe not totally intractable, but certainly not easily moved. And so maybe it's useful to say this to attorneys too, because to judges we're saying, essentially, I think very directly: look, trust that when people express bias, it will impact the case.
(00:47:11):
And let's not seat people with extreme bias. I would say to attorneys too: don't believe that your case, whether you're the defense or plaintiff, is so good that bias couldn't impact it. Instead, work to get a fair procedure — cite this paper and others, and ask the court for time, a questionnaire, the right questions. And then, whether you're a plaintiff or defense attorney, be fair. So I'm going to call out our friend Sean Claggett, just briefly. Sean is a great lawyer, and one of the things I love about Sean is, if he's talking to a juror and he says, tell me how you feel about lawsuits, and they say, I think they're good and I think they're helpful to society, and he says, well, okay, in this case, this person has sued a company, this big company — do you think you could return a verdict for either side based on the evidence?
(00:47:56):
And they say, no, I don't like big companies, I don't like this company in particular, and I think I'm going to find for the plaintiff — I mean, I can't imagine not finding for the plaintiff. Sean doesn't say, well, could you set that aside and be fair? He says, judge, I'd like to exclude this juror — even though that juror is good for Sean's case. And he says, judge, the reason I'd like to exclude the juror is because, as much as I'd love to have her sit, she just expressed a bias that I think means she can't give the defense a fair trial. To me, that is what should happen from both sides: we should aim to seat juries that don't have these expressed biases, and then trust the jurors we get. And I think the literature and science and everything else is really good evidence that jurors will work really hard to give a just verdict when we seat a fair jury.
Alicia Campbell (00:48:42):
Well, I mean, that's what I meant with my question: there are some questions that judges believe they should always ask, because they think the answers always matter. If you know someone — I don't think I've ever appeared, or watched anybody do voir dire, where they didn't ask that question. And then the judge is like, wait, who do you know? Okay, great, thank you — and excuses them. But I think what your paper demonstrates is that there are some other characteristics that should be taken just as seriously. You may not know someone, but if you hate lawsuits, the idea that you could sit on a jury and not see the case through that lens is, I think the research demonstrates, impossible.
Jessica Salerno (00:49:23):
So, Alicia, I think you're right. To the degree that people had intuitions about these things before — some people thought these attitudes would make an impact, the other side thought they wouldn't — now we have data showing they do have an impact. That's exactly what our data show: people who hold certain attitudes judge the case differently. Whether it's impossible to set aside or not, I think this is kind of interesting, because part of this gets to the nature of different types of bias — what people think bias means versus what it can often mean. I think part of why judges might not think these questions are as important is because they're thinking of a certain type of bias: the really explicit, direct type, like, I hate lawsuits and I will never, ever find for the plaintiff.
(00:50:04):
A judge might be able to say, okay, you're off. But if it sounds like something that they could set aside — so if they say, okay, you hate lawsuits, but are you going to listen to the case? — most people will say, yeah, I'll listen, I'll judge it fairly. But the type of bias these questions are capturing is more akin to a confirmation bias, which is what John was starting to talk about. It's indirect. People can walk into the trial with the best of intentions: I'm going to listen to everything carefully, I'm going to judge it. But what they don't realize is that, operating in the background, this attitude is predisposing them to put more weight on one side versus the other, to be a little more skeptical of one side than the other, maybe to judge the credibility of the witnesses a little differently.
(00:50:50):
And those kinds of things are what end up leading to a different verdict — not because jurors are being so blatant as to say, I hate lawsuits, so I'm not even going to listen to one side. That's not what's happening. And I think judges don't necessarily have the intuition that those confirmation biases can operate just as insidiously. They're just not as aware of them, and they're not going to sound like the type of bias of, oh, well, that's my best friend, I would never vote against them. It's a different kind of bias, and I think that might be part of why it's so hard to accept.
John Campbell (00:51:22):
Hey, Jess, I'm going to tell you a quick discouraging story. Alicia and I were on a Zoom in a jury trial up in Washington, and we were helping with selection. It was against an airline, and a woman said, I am a frequent flyer on that airline — the airline was the defendant. She says, I'm a frequent flyer on the airline, I have a card from the airline, I have points from the airline, my best friend is the director of marketing at the airline. And the plaintiff's lawyer is sort of like —
Alicia Campbell (00:51:48):
Wait, and you had stock?
John Campbell (00:51:49):
Yeah, and I own stock in the airline — all these things. And the plaintiff's lawyer is sort of like, judge, do I need to ask any more questions? The judge looked right at her and says, well, I understand those things, but do you think you could still be fair and return a verdict for the airline if the evidence supported it? And what did she say? Of course she said, I think so. And like you said, she wasn't lying. She was saying, I think so, I'll sure try.
Jessica Salerno (00:52:12):
And these are unrealistic questions. How does anyone know how to answer that? If I were asked that, I don't know — what is your proof? How do you answer that question? It's kind of an unanswerable question, but with a clear right answer.
John Campbell (00:52:25):
The plaintiff ended up using a peremptory strike because the judge would not exclude the person for cause. It was a case where I think part of the concern was that the panel wasn't big enough, and the judge thought, oh no, I'm going to save this person. But you can imagine that that meant the plaintiff didn't have a peremptory strike for somebody else who'd expressed some concerning views but whom the judge wouldn't exclude. And so it's stuff like that. I would love — and this is an open call to anybody who listens to this — I would love for judges in state courts and federal, the federal Judiciary Commission, to invite people like Valerie and Jessica to come and talk about this stuff candidly, including how to do this efficiently so that your court still runs and you're not wasting time, but you are seating juries that meet the constitutional standard. Because I do think that there's a way to do it that's quick and efficient — it doesn't need to be weeks, it doesn't even need to be days — and we would get better results. And it would mean that all the work the court and both sides did to prepare the case, which often means years of work, is heard by people who can listen to it and make a fair decision.
Jessica Salerno (00:53:31):
And some of it isn't even more time consuming; it's just changing how you word your questions. Anytime you ask someone, are you biased, are you prejudiced, can you not be impartial — those are questions where, honestly, most people will say, no, I'm going to try; almost no one says, I don't think I can do it. But again, in this project, Lee Ross came up with what I think is a great question that ended up being really effective at predicting verdicts, and it's pretty straightforward. It's literally just asking people: when you're judging this case — you've read or heard a little bit about what it is — do you think one side might have a harder time convincing you than the other? You're not asking the person to label themselves as biased, but it's basically the definition of a bias.
(00:54:16):
It's not using a loaded label, but it's essentially asking them, are you going to read this case in a biased way? And lots of people admitted it: I think the plaintiff's side is really going to have a hard time convincing me, or, I think the defense is going to have a bit of a harder time. That's literally admitting a bias. You're just not forcing them to use these loaded, negative words — because I don't think people think they're biased, even when they are. And that doesn't take more time. That's just changing some wording.
Valerie Hans (00:54:42):
Well, John, you mentioned this, but I think I can also speak for Jess to say we'd love to talk to judges and lawyers about this work, and about how it can be implemented in a way that doesn't waste the court's time, is most efficient, and actually deals with the very serious issue of juror bias.
Alicia Campbell (00:55:01):
Do you think there's ever a scenario based on your research where an attorney should not be asking any questions?
Valerie Hans (00:55:08):
Nothing comes to mind. Are you thinking of something?
Alicia Campbell (00:55:11):
No, I'm just curious. I've got lawyers going to trial on October 6th in federal court, and they literally won't address the jury at all during voir dire. They'll have no input into the questions, and they won't get a chance to ask anything. It's just hard to understand.
Jessica Salerno (00:55:26):
That's such a shame. There is — and I think Valerie is more familiar with this research — there is a lot of research showing that attorney-led voir dire is more effective. There's no one who knows better what might bias someone against their case than the attorney who's presenting it. Again, I don't know, but I assume judges would go the more generic route. They probably have a set of questions, like the one John mentioned, that they typically ask, and they're somewhat generic to the case, asking jurors to come up with their own reasons why they might be biased. The attorneys are the ones who know what to ask and what the most biasing factors are going to be. And I know there are data behind attorney-led voir dire being more effective.
Valerie Hans (00:56:05):
And one of the most — I'm going to say chilling — findings was an older study about questioning during death penalty cases. It showed that if you limit attorneys who are representing the defendant, and don't allow them to ask a range of questions in a death penalty case, it actually turns out their clients are sentenced to death more frequently — under limited questioning of the sort that we represented in our minimal voir dire questions.
Nick Schweitzer (00:56:34):
The thing that I think about is, we talked about using juror questionnaires, and the time in the process. So in the spirit of efficiency — Val, Jess, John — do you think, in order to really do this in a practical way, that these online or written questionnaires, however they're administered, are the way it has to be done to ask all of these different sorts of things? Or is this something you think can actually be done in person, in the court, as they're calling in panels?
Valerie Hans (00:57:04):
I think it can work both ways. It's more efficient to have prospective jurors fill out a questionnaire in advance, and allow the jury commissioner to look through it and spot anyone who's clearly not going to be able to serve, or is barred from serving because, for example, they know individuals in the lawsuit. So that's the most efficient way. But jurors, when they are called for jury duty, spend a lot of time hanging around the jury room, and that can also be an opportunity for them to fill out questionnaires. Now, the attorneys would, I think, benefit more if they had these questionnaires and the responses in advance, so they could plan their jury selection. So I think it's better all around for both sides, and for the judge, to have it in advance — but it can be done either way.
Jessica Salerno (00:57:55):
And I think I'll give a slightly nerdy stats answer here, which is that we did an analysis in the paper to see, so we had a lot of questions, and a lot of questions revealed biases that impacted how jurors judged the case, but a good number of them kind of overlapped. So we did one analysis where we put them all in one model together to see how many of these are really uniquely helpful. Do you need all of these, or could you ask a smaller subset? And there were a handful of them that were unique predictors. And this is making me think, I feel like we should write some kind of practitioner's guide to this article, where we kind of zero in on: here are the most helpful questions, here are the ones that are unique, so that if you're not able to convince the judge to give you the long in-person process, here are the best five to ask.
(00:58:36):
And that's probably going to capture a lot of what we had here. So yeah, I agree with Valerie. I think it can be done in person, but the questionnaires really help. And I mean, I know that at least in Arizona, it was a tough process to get them all instituted because they kind of created a common one. And then attorneys in specific cases could add case specific ones to it, and they had to build a system, which was hard, but once it was built, made everything more efficient. So I know there are barriers, but it really just seems like questionnaires would really help the situation.
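Jessica's point about overlapping questions is, at bottom, a regression idea: two questions can each correlate with verdicts, yet one may add little unique predictive value once the other is accounted for. Here is a minimal sketch of that idea with simulated data (the variable names and numbers are ours for illustration, not the paper's actual measures or analysis):

```python
# Illustration only, with made-up data: when two voir dire questions tap
# the same underlying bias, each correlates with the verdict on its own,
# but the second adds little once the first is controlled for.
import random

random.seed(1)
n = 500
bias = [random.gauss(0, 1) for _ in range(n)]      # latent juror bias
q1 = [b + random.gauss(0, 0.5) for b in bias]      # question 1: noisy read on bias
q2 = [b + random.gauss(0, 0.5) for b in bias]      # question 2: overlaps with q1
verdict = [b + random.gauss(0, 1) for b in bias]   # case judgment driven by bias

def corr(x, y):
    """Pearson correlation of two equal-length lists."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def residualize(y, x):
    """Residuals of y after a simple one-predictor regression on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return [b - (my + slope * (a - mx)) for a, b in zip(x, y)]

r1 = corr(q1, verdict)   # simple correlation, question 1
r2 = corr(q2, verdict)   # simple correlation, question 2
# q2's unique contribution after controlling for q1 (partial correlation)
r2_unique = corr(residualize(q2, q1), residualize(verdict, q1))

print(f"q1 vs verdict: {r1:.2f}")
print(f"q2 vs verdict: {r2:.2f}")
print(f"q2 unique (controlling for q1): {r2_unique:.2f}")
```

On its own, each question predicts the verdict reasonably well, but the partial correlation for the second question is much smaller, which is the sense in which only some questions are "unique predictors."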
Alicia Campbell (00:59:06):
Awesome. Well, thank you guys for sharing time with us today. It was awesome. This is a great, great topic and one we should come back to, frankly. I mean, lawyers are obsessed with voir dire. How can they do it better? How can they do it when they're not really allowed? What's the best way of going about that? I mean, I get that question all the time. Okay, if I'm not allowed to ask any questions and I'm basically going to sit there, what is it that I can find out about these jurors? What do you suggest? And at that point, it gets really hard. But I think this article calls for, I don't want to use the word reform, but certainly maybe some enlightenment of the way that people have been handling the process, because judges do get a ton of discretion most places, and it would be good for them to know the effects of some of the routines or procedures they have in their court. I think most of them think it doesn't matter very much.
Jessica Salerno (01:00:04):
Yeah. Well, thank you for giving us the chance to continue to talk about the paper. We take every option or every chance we can get to explain it to people and hopefully put it in people's hands so that they can make their jury selection better. So thank you so much for the opportunity to chat.
Alicia Campbell (01:00:20):
No, it's great. Thank you guys for coming. Kev, Nick, always great to see you. We're going to schedule another podcast soon about animations and whether or not you should have them in your case and what they do. That'll be with Seaton Claggett. He doesn't know it yet, but I'm calling him out. You'll get an email from me now. Hey, Seat, you want to be on the Fred Files? Too bad, I already told everybody you're doing it, so it'll work out well. Alright guys. Thank you so much.
Voice Over (01:00:50):
Thank you for listening to the Fred Files. If you found value in today's discussion, please subscribe and share this episode with your colleagues. To explore how Fred can transform your case preparation, visit us at focuswithfred.com. Produced and powered by Law Pods.