Seventh Circuit Advises Against Daubert "Gatekeeping ... With Too Much Vigor"

Where does the line lie between a "reliable" methodology under FRE 702, assessed by the judge, and "reliable" conclusions drawn from that methodology, assessed by the jury? The Seventh Circuit recently examined this distinction in an insurance contract dispute, reversing and remanding a summary judgment for the defendant insurance company. The circuit warned of the danger that arises when a "district court usurps the role of the jury ... if it unduly scrutinizes the quality of the expert's data and conclusions rather than the reliability of the methodology the expert employed." Manpower, Inc. v. Insurance Co. of Pennsylvania, __ F.3d __ (7th Cir. Oct. 16, 2013) (No. 12-2688).

There is a line between a methodology and the conclusions reached by employing that methodology. The Seventh Circuit acknowledged that "this is not always an easy line to draw," yet it is one that must be drawn in every case in which expert testimony is admitted into evidence. The issue arose most recently in a case in which plaintiff Manpower sought compensation from its insurer, Insurance Co. of Pennsylvania (ICOP), for losses from business interruptions suffered after the building housing its subsidiary collapsed. The major issue was not the defendant's liability to pay the plaintiff's insurance claim under the insurance contract, but rather how to calculate the amount of lost business the defendant would have to pay the plaintiff under the policy. Manpower, Inc., __ F.3d at __.

The Daubert Issue

As the circuit noted, determining whether an expert's testimony is reliable is "primarily a question of the validity of the methodology employed by an expert, not the quality of the data used in applying the methodology or the conclusions produced." The circuit found substantial support for this distinction in a number of its other recent decisions, including:

  • Smith v. Ford Motor Co., 215 F.3d 713, 718 (7th Cir. 2000) ("The soundness of the factual underpinnings of the expert's analysis and the correctness of the expert's conclusions based on that analysis are factual matters to be determined by the trier of fact, or, where appropriate, on summary judgment.")
  • Stollings v. Ryobi Technologies, Inc., 725 F.3d 753, 765-66 (7th Cir. 2013) ("Rule 702's requirement that the district judge determine that the expert used reliable methods does not ordinarily extend to the reliability of the conclusions those methods produce—that is, whether the conclusions are unimpeachable.")

Manpower, Inc., __ F.3d at __.

Locating The Methodology-Data Line

In explaining why the trial court crossed the line in Manpower, the circuit traced how the trial court's assessment shifted from the proper focus on the expert's methodology to a focus on the conclusions that methodology generated. The trial judge began appropriately, asking whether the methodology could estimate the loss the plaintiff suffered from the destruction of the building, and determined that the expert's methodology

was sound. Indeed, the district court sought out a scholarly treatise and cited its endorsement of this methodology. Having drawn those conclusions, the district court's assessment of the reliability of the methodology ought to have ceased.... Instead, the district court drilled down ... to assess the quality of the data inputs [expert witness] Sullivan selected in employing the growth rate extrapolation methodology. What the district court took issue with was not Sullivan's growth-rate extrapolation methodology, but rather his selection of certain data from which to extrapolate. Indeed, the district court effectively acknowledged that its problem was not with Sullivan's methodology but with his data selection when it stated that "had Sullivan not chosen such a short base period for calculating lost revenues, I might have found his analysis reliable." The district court thought Sullivan should have selected different data, covering a longer period, as the base for his projection, but the selection of data inputs to employ in a model is a question separate from the reliability of the methodology reflected in the model itself.

Manpower, Inc., __ F.3d at __ (emphasis added).
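The base-period dispute can be made concrete with a minimal, purely hypothetical sketch of growth-rate extrapolation. All figures and the `projected_revenue` helper below are invented for illustration and are not drawn from the case record; the point is that the methodology (fit an average growth rate, compound it forward) is identical in both runs, and only the data-input choice of base period changes the result.

```python
def projected_revenue(history, base_months, horizon):
    """Average month-over-month growth over the last `base_months` intervals,
    compounded over `horizon` future months; returns total projected revenue."""
    base = history[-(base_months + 1):]
    rates = [(b - a) / a for a, b in zip(base, base[1:])]
    avg_rate = sum(rates) / len(rates)
    total, level = 0.0, history[-1]
    for _ in range(horizon):
        level *= 1 + avg_rate
        total += level
    return total

# Invented monthly revenue figures (in thousands), ending in a growth spurt.
revenue = [100, 101, 102, 103, 110, 121]

# Same methodology, different base periods, materially different estimates.
short_base = projected_revenue(revenue, base_months=2, horizon=6)
long_base = projected_revenue(revenue, base_months=5, horizon=6)
print(round(short_base), round(long_base))
```

Under Manpower, the choice between the short and long base period goes to the weight of the projection and is tested on cross-examination; only the extrapolation procedure itself is the judge's gatekeeping concern.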

Key Factors

The circuit noted common characteristics in cases where a court had invaded the role of the fact-finder. First, the "district judge had agreed that the expert correctly employed a valid methodology but found the expert's opinion unreliable only because he concluded that one of the key data inputs he used was not sufficiently reliable." Second, the court had taken over the task of the jury: even though the data input in question "was undoubtedly a rough estimate," "[t]he judge should have let the jury determine how the uncertainty about [the accuracy of the data input] affected the weight of [the expert's] testimony."
Manpower, Inc., __ F.3d at __.

The circuit emphasized that the court should not have considered whether the expert could have used better data in applying the proper model. That was "a question for the jury, not the judge. Assuming a rational connection between the data and the opinion—as there was here—an expert's reliance on faulty information is a matter to be explored on cross-examination; it does not go to admissibility." Manpower, Inc., __ F.3d at __ (citing Walker v. Soo Line R.R. Co., 208 F.3d 581, 589 (7th Cir. 2000)). The circuit noted that testing the soundness of expert conclusions turns in large part on the fact that "[o]ur system relies on cross-examination to alert the jury to the difference between good data and speculation." Manpower, Inc., __ F.3d at __ (citation omitted).

Conclusion - The Acceptance of Regression Analysis

The circuit provided another example of the difference between reliable methodology and reliable data: the treatment of regression analysis. It noted the "latitude we afford to statisticians employing regression analysis, a proven statistical methodology used in a wide variety of contexts," while the reliability of the underlying data presents a different question. According to the circuit, "[r]egression analysis permits the comparison between an outcome (called the dependent variable) and one or more factors (called independent variables) that may be related to that outcome. As such, the choice of independent variables to include in any regression analysis is critical to the probative value of that analysis. Nevertheless, the Supreme Court and this Circuit have confirmed on a number of occasions that the selection of the variables to include in a regression analysis is normally a question that goes to the probative weight of the analysis rather than to its admissibility." The circuit cited the following cases in support of this proposition:

  • Bazemore v. Friday, 478 U.S. 385, 400 (1986) (reversing lower court's exclusion of regression analysis based on its view that the analysis did not include proper selection of variables)
  • Cullen v. Indiana Univ. Bd. of Trustees, 338 F.3d 693, 701–02 & n. 4 (7th Cir. 2003) (citing Bazemore in rejecting challenge to expert based on omission of variables in regression analysis)
  • In re High Fructose Corn Syrup Antitrust Litigation, 295 F.3d 651, 660–61 (7th Cir. 2002) (detailing arguments of counsel about omission of variables and other flaws in application of the parties' respective regression analysis and declining to exclude analysis on that basis)
  • Adams v. Ameritech Servs., Inc., 231 F.3d 414, 423 (7th Cir. 2000) (citing Bazemore in affirming use of statistical analysis based solely on correlations—in other words, on a statistical comparison that employed no regression analysis of any independent variables at all)
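The methodology/variable-selection distinction the cases above describe can be sketched with a small synthetic example. Everything here is hypothetical (the variable names, the simulated data, and the `ols` helper); the point is that both models run the identical ordinary least squares procedure, and only the analyst's choice of independent variables differs, a choice that under Bazemore goes to weight rather than admissibility.

```python
import numpy as np

# Simulate a simple salary model with two invented independent variables.
rng = np.random.default_rng(0)
n = 200
experience = rng.uniform(0, 20, n)   # independent variable 1
education = rng.uniform(10, 18, n)   # independent variable 2
salary = 30 + 2.0 * experience + 1.5 * education + rng.normal(0, 1, n)

def ols(columns, y):
    """Ordinary least squares with an intercept: the same methodology
    regardless of which independent variables the analyst selects."""
    X = np.column_stack([np.ones(len(y)), *columns])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

full = ols([experience, education], salary)     # both variables included
omitted = ols([experience], salary)             # education omitted: weaker
                                                # probative value, same method
```

Cross-examination about the omitted variable attacks the probative weight of `omitted`; it does not make least squares itself an unreliable methodology.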

An adage from the early days of computing taught "garbage in, garbage out." The Manpower approach bears a rough resemblance to that model: assessing the reliability of the computer (the methodology) is the task for the court, while probing the data inputs, and whether they amount to garbage in, is left to the jury in weighing the soundness of the results.


