A picture may be worth a thousand words, but trial lawyers do not have a set of ready principles for the development of the pictures – graphics – we use at trial. A bulleted list of points displayed in PowerPoint is hardly a substitute for a well-designed graphic that communicates to the jury. Where the budget allows, lawyers will work with trial-graphics firms to assist in making pretty pictures. But the case should remain that of counsel, which means that counsel must take responsibility for the development of graphics for trial.
The need to express information clearly by way of graphics is hardly new to trials, or to the modern age, or to insurance-coverage trials. In the cases I have worked up for trial over the years, we have devoted considerable effort to developing the graphics. Through this effort, and through reading of graphic-design literature, I have developed a personal working set of principles that assist in developing trial graphics.
What do I mean by trial graphics? Especially today with large-screen plasma or LCD displays, there are a number of ways to present exhibits and information to juries via a computer. In trials today, it is common to project documentary exhibits on a screen while a witness is being examined. Through the use of trial-presentation software, which I strongly endorse and which is easy to use, the examining attorney is able to flash the document up on a screen and enlarge a key paragraph, all while simultaneously asking questions of the sponsoring witness. (In some courtrooms, or for technically less-savvy lawyers, a paralegal or trial-presentation specialist will operate the computer and manipulate the images on the examining lawyer’s behalf.) From the perspective of graphic design, this is a reasonably straightforward process that does not require any real artistic presentation.
Trial graphics, however, means something more, and typically involves either (i) demonstrative exhibits, such as computer-created simulations of reality (e.g., a computer animation showing the impact of forces when the two automobiles involved in an accident collided), or (ii) illustrative exhibits that illustrate principles or loosely (but not misleadingly) model the facts (e.g., a schematic of an intersection). A demonstrative exhibit is offered to the jury itself as substantive evidence; as a result, it needs to be independently admissible, and it then can go into the jury room in deliberations.
In contrast, an illustrative exhibit is meant to assist the testimony of a witness, and it is the witness testimony that is “the” evidence, not the illustration. The illustrative exhibit is not itself moved into evidence or given to the jury, but instead is used by a witness to help him or her explain to the jury what he or she is there to talk about. A proper foundation for use of the exhibit must be laid, establishing that the exhibit would help the witness explicate the testimony and would aid the jury in understanding it (and the exhibit can’t be misleading, inaccurate, etc.). While the trial judge, in the judge’s discretion, controls the use of illustrative exhibits at trial, e.g., Fed. R. Evid. 611(a), if a modicum of care is used in preparing the illustrations most judges will allow their use.
But just because a graphic may be allowed by a judge does not mean that it is either effective or good. Lawyers are taught many subjects in law school, but even in trial-advocacy courses the work of developing effective trial graphics is not taught much, if at all (I certainly don’t remember it from my jury trial advocacy course in law school).
For me, the two most important principles in developing trial graphics are:
1. Trial graphics should be color-blind neutral.
2. The graphics should have a consistent look-and-feel and iconography.
As to the first point, in a twelve-person jury it is quite likely that at least one juror has difficulty discerning differences between certain pairs of colors (red-green color deficiency alone affects roughly eight percent of men). Given that as an advocate I want the vote of each juror, the last thing I want is to risk any juror’s misunderstanding a graphic or, worse, being mad that he (the incidence of color deficiency is greater among men) cannot understand or process the chart placed before him and about which I or the witness is yammering. Moreover, trial-graphics firms seem not to have gotten this message and routinely create graphics that do not account for it. So I routinely have to double-check my graphics (even after explaining all this to the designer), and more often than not I have to ask the design firm to rework them on this basis (at its expense, since I made this a specification of its retention). There are a handful of websites that show color-blind-safe palettes, and I recently saw a write-up about a program that aids specifically in making discernible graphics. At all events, I can’t fathom a reason not to make my graphics color-blind friendly.
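In practice the fix is to start from a palette designed to survive color-vision deficiency rather than from a program’s default colors. As a minimal sketch (the Okabe-Ito palette below is a widely used color-blind-safe set; the helper functions and their names are my own, added for illustration):

```python
# The Okabe-Ito palette: eight colors chosen to remain distinguishable under
# the common forms of color-vision deficiency. The helper functions below are
# illustrative additions, not part of any particular graphics package.

OKABE_ITO = {
    "black":          "#000000",
    "orange":         "#E69F00",
    "sky blue":       "#56B4E9",
    "bluish green":   "#009E73",
    "yellow":         "#F0E442",
    "blue":           "#0072B2",
    "vermillion":     "#D55E00",
    "reddish purple": "#CC79A7",
}

def hex_to_rgb(hex_color):
    """Convert '#RRGGBB' to an (R, G, B) tuple of 0-255 integers."""
    h = hex_color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in range(0, 6, 2))

def relative_luminance(rgb):
    """Approximate perceived brightness on a 0.0 (dark) to 1.0 (light) scale."""
    r, g, b = (c / 255 for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
```

A quick sanity check when a designer delivers a chart is to confirm that colors used side by side also differ in brightness, so the distinction survives even a grayscale photocopy.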
I also believe that graphics should feel consistent. In part, I am trying to make sure that when the jury looks at a graphic they know it is from my side, not the other side. I think this makes the case feel coherent and consistent, and I think it helps jurors understand my exhibits if they become generally familiar with how they look. But the principle applies more broadly than having a consistent look-and-feel. When similar concepts or ideas appear in more than one graphic, I think that, all things being equal, the concept or idea should be presented consistently. So, if we illustrate a widget in one graphic and a later graphic is to depict the widget, presumptively the same icon should be used (i.e., the widget should be drawn the same way). All rules have exceptions (except this rule), and there may be a reason to depict the widget differently. In the absence of any reason for variation, however, the same icon should be used.
But this rule of graphic design can be applied more powerfully to help organize a lot of information in a case. I handled one fidelity insurance case where my client (the insured company) suffered a substantial loss from a rogue employee who hijacked a subsidiary and engaged in a number of criminal schemes for the purpose of embezzlement and to cover his tracks. These different schemes operated both independently and interdependently: one aspect of the overall scheme was to solicit increasing volumes of new business in order to have new cost centers against which to (falsely) account for sales and losses. We called this the “cheapest guy” scheme, as in “call the [bad guy] to bid on this job for us because he’ll do it for cheaper than will anybody else.” Thus, this scheme was for him to be the “cheapest guy.” Other schemes involved creating false and fraudulent invoices, using whiteout, photocopying, and the like, and submitting those to the parent company for reimbursement. Another scheme involved use of “ghost employees”, nonexistent workers for whom the subsidiary would also obtain payroll funds and the like from the parent company.
Our coverage-maximizing theory of the case was that all dozen or so schemes together were part of a larger effort to obtain money for the bad guys at the expense of the parent company. Certain coverage questions were presented by the “cheapest guy” scheme, for example: maybe the loss that resulted was simply from bad business judgment in formulating bids for jobs, rather than from the scheme’s being an integral part of the larger fraud. Putting each element together was essential for us to defeat the insurer’s effort to disaggregate and evaluate coverage and loss for each sin of the employee.
We worked for months on creating and revising our overall summary graphic to depict on one graphic the Rube Goldberg model of all the schemes together, resulting in the multi-million dollar loss (for which we were seeking insurance recovery). We worked very hard to lay out an effective graphic, but the goal was not that the graphic would be self-explanatory (it was an illustrative exhibit) but rather that it was to anchor the presentation of our side of the case.
We needed to have witnesses, documents, and experts to explain how each of the schemes worked a fraud or combined to facilitate or cover up the fraud from the parent (insured). Our graphic-design strategy involved keying each of the subsidiary issues to the main summary chart. On the summary chart we had a “ghost employees” box and an icon (no, not a ghost, we thought that was way too kitschy). Each supporting graphic then used that same icon in the upper corner of the graphic to help the jurors relate the particular graphic to the summary fraud chart as a whole. (The summary chart is available on the trial-graphics vendor’s website under its “best of” section, though the description of the case by the vendor isn’t quite right and this link should not be taken as a particular endorsement (and if it were it certainly would be an uncompensated one).)
The idea was to take each of the multitudinous schemes we depicted, and on average we probably had two or three separate illustrative graphics for each of the schemes, and then tie those all back to the main graphic that served to organize the entire case.
These two main principles by which I create graphics in my cases do not really guide the creation of particular graphics. The only way to create individual graphics is to have brainstorming sessions with the team, with the result being (one hopes) a series of stick-figure or schematic drawings that we can hand off to our firm’s in-house designer or to the trial-graphics firm to generate something pretty. We then do the same thing again – and again and again. (Our summary graphic in our fidelity case set the record at the particular graphics firm for most revisions.)
In formulating the design principles governing individual graphics, no doubt the single most important influence on my thinking is a series of books by Edward R. Tufte. Professor Tufte has published four relevant volumes (and promises a fifth), each of which is beautiful and written in an exceptionally engaging and clear way. All trial lawyers should read at least two of these, The Visual Display of Quantitative Information (1983, 2001) and his recently published Beautiful Evidence (2006). These books are worth savoring even apart from their use value, but through discussion and illustration Tufte distills a number of principles that help guide the development of graphics for presenting information. (In Visual Explanations (1997), Tufte offers a now well-known discussion of the Space Shuttle Challenger disaster, showing that the flight engineers had data indicating the shuttle should not be launched but lacked an effective way of presenting them to make clear the unreasonableness of, and risk associated with, launching when they did (pp. 38-54). The other book is Envisioning Information (1990).)
Perhaps the most important concept Tufte articulates is the “data:ink” ratio: that is, assess how much ink is used to depict data. This leads to my third principle, and to a fourth that is in some dimension an entailment of the third:
3. Maximize the amount of data relative to the ink used in its graphical analogue.
4. Everything on the graphic must point to the information/advocacy objective that prompted the graphic’s creation in the first place; put differently, don’t waste an opportunity to have each element of the graphic serve an information-conveying function.
Tufte’s data:ink metric leads to a virtually wholesale condemnation of pie charts, for example. Everyone seems to like pie charts, and their use makes us think that we are “doing graphics.” Pie charts and bar graphs are built into MS Excel, and many lawyers believe themselves to be accomplished Excel-nauts if they use the program’s built-in chart wizard. But typically these charts fail a data:ink analysis.
So, what’s the problem with (mom-and-apple) pie charts? Let’s begin by analyzing what we’re trying to accomplish with a pie chart: (i) tabulate the data and (ii) indicate their relative weight. A pie chart uses a lot of ink to accomplish those objectives, and, as Tufte argues, a simple table will almost always convey the same information as effectively, without distracting noise and mental effort. I certainly cannot easily discern from a distance the difference between two similarly sized pie slices. And when a pie has many slices, the visual comparisons and discernment required are surely beyond most average folks (maybe some artists can keep it all straight). It is no answer to say that this is why the pie slices are labeled, or that using a pie shows the slices total 100 percent. It is almost always more efficient to lay out the data in a table with each percentage share and a total of 100 at the bottom. There is just no justification for making the viewer – here, jurors – use mental calipers to compare the slices, rather than making it easy by just telling the viewer the data in a straightforward manner. (And all this is at least doubly true when the slices are not directly labeled and a color-coded key is instead used alongside a rainbow-segmented pie.)
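Tufte’s table-over-pie point can be made concrete. As a hypothetical sketch (the categories and percentages below are invented for illustration), the same shares a pie would encode as wedge angles can simply be printed as aligned rows with an explicit total:

```python
# Invented example data: the shares a pie chart would show as wedges.
shares = [
    ("Category A", 40),
    ("Category B", 35),
    ("Category C", 15),
    ("Category D", 10),
]

def share_table(rows):
    """Render (label, percent) rows as an aligned text table with a total."""
    width = max(len(label) for label, _ in rows)
    lines = [f"{label:<{width}}  {pct:>3}%" for label, pct in rows]
    lines.append(f"{'Total':<{width}}  {sum(p for _, p in rows):>3}%")
    return "\n".join(lines)

print(share_table(shares))
```

The total row does the work the pie’s closed circle is supposed to do (showing the parts sum to 100 percent), and the viewer reads exact values instead of estimating wedge angles.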
Maybe entertainment value is the only saving grace for pie charts: they have pretty colors. And I’ll confess that I’ve allowed a pie chart to sneak in, but in general pie charts simply flunk any analysis of the amount of ink needed to convey information or, more importantly, of the work we are asking the viewer to do in order to understand the point we’re trying to convey. A variation of the pie chart I used once in a first-party bad-faith case took a dollar bill and showed what portion of the policy premium was siphoned off by the various intermediaries: brokers, managing general agents, etc. The point was to illustrate that the particular program was set up to line the intermediaries’ pockets and did not leave enough money to accumulate reserves to pay claims. As a result – with less than fifty percent of the premium left to pay claims, policy limits equaling more than ten times the premium, and an expected short period of time between collecting premiums and paying claims – the insurer adopted the approach of denying all claims, engaging in a rope-a-dope strategy with its policyholders, and paying only those policyholders with the gumption, wherewithal, and stick-to-it-iveness to sue for coverage. This pie chart (the segmented dollar bill) was appropriate because the metric – the dollar bill – was understandable, the particular shares of the intermediaries and their relative compensation were not so important, and the key point – that not enough money was left over to pay expected claims after sales expenses – was easily comprehensible. (I keep a copy of this one on the wall of my office.)
The data:ink analysis relates to the fourth point: don’t waste an opportunity to advocate (er, to convey information). Put differently, look at your proposed graphic: is each pixel or color dot being used to convey a point? Or, as Tufte puts it somewhat differently: “What are the content-reasoning tasks that this display is supposed to help with?” An example: in an environmental-coverage matter, we were seeking recovery of defense and indemnity costs at a very complex environmental site, where the company had spent tens of millions of dollars on investigation and then remediation. A huge property was involved, the remedial plan was multi-pronged, and different projects were proceeding concurrently. We had submitted a huge amount of money to the insurers for defense-cost reimbursement (holding indemnity amounts in abeyance), and the various defense and investigation costs were accumulated in a number of concurrent projects that morphed over time into remedial work (indemnity).
To try to unscramble the omelet, we held a big meeting with the insurers following our submission of a set of well-organized notebooks of information and bills. We prepared a nine-foot wide timeline graphic. The timeline bisected the chart horizontally, and defense costs were put above the horizontal line and indemnity, below. Each project then was shown as its own horizontal bar floating above the timeline (or, below, in the case of indemnity). So, project 3 was shown beginning at one point on the timeline and continuing as costs were incurred. (Eventually, a project would transform to indemnity (as it moved from investigation to implementation of a remedy), so the bar would stop above the line, and the same colored bar would then start below the line and continue so long as invoices were being incurred. By using the same color, the viewer could follow the bouncing ball as a defense project became indemnity.)
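The structure of that timeline chart reduces to a simple data model: each project carries a defense interval (the bar above the line) and, once remediation begins, an indemnity interval (the same-colored bar below it). A hypothetical sketch, with all names and years invented for illustration:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Illustrative model of the timeline chart's data: one bar above the line
# (defense/investigation) and, once remediation begins, one below (indemnity).

@dataclass
class Project:
    name: str
    defense_years: Tuple[int, int]               # bar above the timeline
    indemnity_years: Optional[Tuple[int, int]]   # bar below; None if still defense

    def handoff_is_consistent(self):
        """The below-the-line bar should start where the defense bar stops."""
        if self.indemnity_years is None:
            return True
        return self.indemnity_years[0] >= self.defense_years[1]

# Invented example: a project investigated 1988-1993, then remediated 1993-1999.
project_3 = Project("Project 3", (1988, 1993), (1993, 1999))
```

Modeling the chart this way also gives a mechanical check that each project’s “bouncing ball” from defense to indemnity is drawn consistently.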
This chart helped the viewer (in this instance the insurers’ outside counsel and claims handlers) understand what we were talking about when discussing, e.g., project 6. Along the timeline itself we marked when various EPA orders (106 orders and RODs) occurred. In this way, the viewer could correlate the date of an EPA order and then see our responsive effort (commencing investigation and ultimately remediation). And note that we were managing many simultaneous projects at the property, and different EPA orders addressed one or more of the projects.
By taking the mass of costs we incurred over time, plotting them as concurrent streams of projects, and illustrating the relationship between the EPA orders and our responsive work, we were able to put together the story for which the millions of dollars of invoices were artifacts. (We also were conveying the message that we were taking the case seriously, had an effective means of presenting the morass of information, and were gearing up for trial – all of which conduced to settlement.)
But we missed a further explanatory opportunity in how we put this chart together. The various differently colored horizontal bars differentiated the numerous projects, which was helpful. The jump from above the line to below it showed the move from defense (the cost of investigating a problem) to indemnity (the cost of fixing it). But what we missed was a means of showing (i) how costs at a project were accumulating over time and (ii) the relative importance or costliness of one project versus another. Tufte’s data:ink principle would have helped us discern this: we could have started each project as a point above the line and grown it in a wedge shape reflecting the accumulation of costs. So, starting at the point, we could have drawn the wedge larger and larger, in proportion to how much money we were spending on that project at the site.
So, instead of a timeline with horizontal colored bars above and below the timeline, we would have had in place of the bars a number of isosceles triangles beginning with a point and opening up moving left to right (following time) in proportion to the amount of money that had been spent. (And we could have illustrated that a project was closed at the time of our presentation by drawing a vertical line closing off the opening wedge of the triangle and having no vertical line closing off the triangle for projects that were still ongoing.)
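The wedge idea reduces to plotting cumulative spend: the height of a project’s wedge at any point is simply its total cost to date, so the shape grows in proportion to the money incurred. A minimal sketch with invented quarterly invoice amounts:

```python
from itertools import accumulate

# Invented quarterly invoice amounts for a single project.
quarterly_invoices = [120_000, 250_000, 410_000, 90_000]

# The wedge's height at the end of each quarter is the running total of spend;
# a steeper wedge means a faster rate of incurring costs.
wedge_heights = list(accumulate(quarterly_invoices))
print(wedge_heights)  # → [120000, 370000, 780000, 870000]

# An open wedge (no closing vertical line) marks a project still ongoing;
# the final height of a closed wedge is the project's total cost.
total_cost = wedge_heights[-1]
```

Plotting these running totals as the upper edge of each triangle encodes both the project’s size (final height) and its spending rate (slope), the two pieces of information the flat bars left out.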
Our nine-foot-wide chart was effective and crucial in accomplishing our objective of presenting tens of millions of dollars’ worth of invoices, a series of EPA orders, and the classification of each project’s costs between defense and indemnity. Our chart laid this all out clearly and helpfully. But it was also a missed opportunity of sorts to create a truly “data rich” presentation of information that would have shown the viewer the relative importance of different projects, the rate at which costs were being incurred at each, and which projects were closed and which ongoing.
This anecdote is meant to give some concreteness to the process by which trial graphics should be created and to equip lawyers with a set of principles by which to create and, more importantly, to revise the graphics with which they work. Tufte’s books The Visual Display of Quantitative Information and Beautiful Evidence summarize a number of his design principles, which I’ll put together as my point 5 (liberally quoting and paraphrasing him):
5. Follow Tufte’s lead:
a. Above all else show the data
- Do not clutter the presentation of information with “chart junk”: garish decoration, unnecessary grids, or embroidery of graph plots.
b. Maximize the data-ink ratio
c. Erase non-data-ink.
d. Erase redundant data-ink.
e. If the nature of the data suggests the shape of the graphic, follow that suggestion.
- Otherwise, move toward horizontal graphics about 50 percent wider than tall.
f. The representation of numbers, as physically measured on the surface of the graphic itself, should be directly proportional to the numerical quantities represented.
g. Clear, detailed, and thorough labeling should be used to defeat graphical distortions and ambiguity. Write out explanations of the data on the graphic itself. Label important events in the data.
h. Show data variation, not design variation.
i. In time-series displays of money, deflated and standardized units of monetary measurement are nearly always better than nominal units.
j. The number of information-carrying (variable) dimensions depicted should not exceed the number of dimensions in the data.
k. Graphics must not quote data out of context.
l. Most explanatory and evidential images should be mapped, placed in an appropriate context for comparison, and located on the universal grid of measurement (that is, they should have a clear marked scale).
m. Focus on causality, including making transparent where the data are uncertain.
i. Credibility must be earned afresh locally by means of specific evidence demonstrating the relevance and explanatory power of the idea in its new application.
n. Indicate the sources and levels of data.
o. Annotate linking lines.
p. Nouns in diagrams should be labeled, annotated, explained, described.
q. Clunky boxes, cartoony arrows, amateur typography, and colorful chartjunk degrade diagrams.
- In an organizational chart, the boxes are unnecessary; the typographic placement of the title or name on the two-dimensional space is sufficient.
r. Completely integrate words, numbers, images, diagrams.
s. Thoroughly describe the evidence used. Provide a detailed title, indicate the authors and sponsors, document the data sources, show complete measurement scales, point out relevant issues.
No doubt, the foregoing is difficult to implement. The development of effective trial graphics takes time, effort, and creativity. Like a brilliant brief, effective trial graphics are not first drafts. And I certainly would not grade all my past efforts at trial graphics as “A’s”. But I try: it is an effort that I owe my clients, but perhaps more importantly it is an effort that I owe the viewer: judge, jury, opposing counsel, the insurer. As an advocate, I am an intermediary and medium of communication. This is no less true with graphics – which I am using as a strategy for communication – than it is with words. Tufte (again) sums up the point nicely in what he calls his Sixth Principle for the analysis and display of data:
“Analytical presentations ultimately stand or fall depending on the quality, relevance, and integrity of their content.”