How McDermott Saved Constellation Brands Millions

Background

McDermott Will & Emery is an international law firm with a diversified business practice. With more than 1,100 attorneys, the firm has 18 offices around the globe and a strategic alliance with MWE China Law Offices in Shanghai. McDermott’s specialties include tax, health care, benefits, intellectual property, and other business litigation matters across all industries. A top-notch antitrust practice, tier-one e-discovery group, and diverse experience allow McDermott to provide sophisticated and cost-effective legal services for its clients.

Martha Louks is a consultant in McDermott’s discovery practice group and is experienced in the use of technology during litigation. Martha supports McDermott’s case teams by providing a blend of legal and technological insight in each matter.

The Case

In 2012, brewers Anheuser-Busch InBev and Grupo Modelo announced a proposed merger that would combine two of the largest beer suppliers in the world.

Constellation Brands—a wine, beer, and spirits company with an extensive portfolio of brands in the beverage space—was also involved in the deal. As part of the terms of the merger, Constellation would purchase Modelo’s 50 percent stake in a fourth entity, Crown Imports, which had been a joint venture between Constellation and Modelo. By purchasing the remaining 50 percent stake in Crown, Constellation would own full distribution rights in the U.S. for Corona and other Modelo beer brands.

[DOJ’s] second request left the merging parties with just over two months to sift through millions of documents.

The deal, however, was stalled by a second request for information from the U.S. Department of Justice, which was tasked with determining whether the merger would give too much market share to a single entity. The second request left the merging parties with just over two months to sift through millions of documents and begin producing information.

As counsel for both Constellation and Crown, the team at McDermott had a lot of work ahead of them.

“Fortunately, this case was well-suited for a computer-assisted review workflow,” said Martha. “It was a great opportunity to use Relativity Assisted Review.”

Prioritizing a Cooperative Approach

Computer-assisted review had been an important tool in McDermott’s arsenal for some time. A case team had recently used it on a related matter, and several members of the McDermott team had already begun discussing the technology with the U.S. DOJ. With a strong relationship already in place, McDermott quickly got to work establishing an acceptable workflow for the case.

“We were very clear and transparent with the Department of Justice from the beginning,” said Martha. “It was important to establish an approach that everyone was comfortable with, and we were all aware of the potential benefits in terms of time and cost savings.”

Martha and Warren Rosborough—an antitrust partner at McDermott who had served in the U.S. DOJ’s antitrust division for four years prior to joining the firm—shared many calls with the U.S. DOJ to discuss the potential of Assisted Review in the case.

“The DOJ’s team was excited about the possibility, too. They recognized that computer-assisted review could mean smaller productions with better quality information. As a precaution, they just wanted some control over the process to ensure its accuracy,” said Martha.

McDermott and the U.S. DOJ held regular calls to establish a mutually agreeable methodology for the case. Early culling nearly halved the original data volume, but still left well over a million documents to be reviewed.

“We agreed not to cull data with search terms before we ran it through Assisted Review,” Martha explained. “Instead, we culled only with traditional methods like de-duplication, date restrictions, and de-NISTing. From there, we relied on Assisted Review to identify documents for production.”
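
In code terms, that traditional culling pass might look something like the sketch below. It is a minimal illustration, not McDermott's actual pipeline: the document records, the hash choice, the date window, and the stand-in for the NIST NSRL hash list are all assumptions.

```python
import hashlib
from datetime import date

# Illustrative document records; in practice these would come from a
# collection/processing platform, not hand-built dicts.
documents = [
    {"id": "DOC-001", "text": "Q3 distribution forecast ...", "sent": date(2012, 3, 14)},
    {"id": "DOC-002", "text": "Q3 distribution forecast ...", "sent": date(2012, 3, 14)},  # duplicate
    {"id": "DOC-003", "text": "system32 driver package ...", "sent": date(2011, 8, 2)},
]

# Hypothetical set of known-file hashes; the real de-NISTing step checks
# file hashes against NIST's NSRL list of standard system files.
KNOWN_SYSTEM_HASHES = {hashlib.md5(b"system32 driver package ...").hexdigest()}

# Assumed relevant date window for the matter (illustrative).
DATE_START, DATE_END = date(2011, 1, 1), date(2012, 12, 31)

def cull(docs):
    """Apply the traditional culling steps Martha describes: de-duplication,
    de-NISTing, and date restriction -- no search-term filtering."""
    seen = set()
    kept = []
    for doc in docs:
        digest = hashlib.md5(doc["text"].encode()).hexdigest()
        if digest in seen:                      # de-duplication
            continue
        if digest in KNOWN_SYSTEM_HASHES:       # de-NISTing
            continue
        if not (DATE_START <= doc["sent"] <= DATE_END):  # date restriction
            continue
        seen.add(digest)
        kept.append(doc)
    return kept

print([d["id"] for d in cull(documents)])  # ['DOC-001']
```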

It was also decided well in advance that any documents Assisted Review left uncategorized would be reviewed linearly, and both teams outlined which issues should be considered responsive to the second request.

“We agreed to treat the documents collected by Crown and those by Constellation separately,” said Martha. “The DOJ requested that we use a responsiveness standard for one party that was broader than the standard for the other, so we set up two separate Assisted Review projects. This case was a great opportunity to see the results of computer-assisted review with two very different approaches, and we felt that the best way to achieve accurate results was to approach each data set independently.”

With an agreed-upon protocol in place, the McDermott team was able to start on the case quickly. Over the weeks that followed, they tackled the data and kept lines of communication with the U.S. DOJ wide open.

A Tailored Technique

“The DOJ recognized that computer-assisted review could mean smaller productions with better quality information.”
MARTHA LOUKS, Discovery Consultant

Data from Constellation and Crown was loaded into Relativity in mid-September 2012. From the start, McDermott’s team understood the difference that a small, cohesive group of expert reviewers can make in a computer-assisted review project.

“We had a group of about five key people who were all subject-matter experts in this case—and, for the first several days, everyone worked very closely together,” said Martha. “We knew it was crucial that all five of these reviewers were on the same page: when you’re training the system, you have to maintain consistency.”

During the initial training rounds, reviewers frequently discussed what responsiveness meant for each data set, held one another accountable for coding protocols, and worked as a tight-knit team. In a daily team phone call, reviewers shared insights and asked questions to help keep everyone focused.

“It was going well, and DOJ was continuously aware of our progress,” said Martha. “It was very beneficial to have everyone on the same page.”

The workflow went smoothly, though the team did notice differences between the two data sets. For example, using a broader definition of responsiveness required more training rounds to achieve stability.
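
Stability in a project like this is commonly judged by how often reviewers overturn the engine's calls from round to round. The sketch below computes a simple overturn rate; the threshold and the coding values are illustrative assumptions, not the protocol McDermott and the DOJ agreed to.

```python
def overturn_rate(predicted, reviewed):
    """Fraction of sampled documents where a human reviewer overturned
    the engine's responsiveness call in the latest training round."""
    assert predicted.keys() == reviewed.keys()
    overturned = sum(1 for doc_id in predicted if predicted[doc_id] != reviewed[doc_id])
    return overturned / len(predicted)

# Hypothetical round results: engine call vs. reviewer call per sampled doc.
round_predictions = {"D1": "responsive", "D2": "non-responsive", "D3": "responsive"}
reviewer_calls    = {"D1": "responsive", "D2": "responsive",     "D3": "responsive"}

rate = overturn_rate(round_predictions, reviewer_calls)
print(f"Overturn rate: {rate:.0%}")  # 33% here, so keep training

STABILITY_THRESHOLD = 0.05  # illustrative; parties would agree on a target
if rate <= STABILITY_THRESHOLD:
    print("Project has stabilized.")
else:
    print("Run another training round.")
```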

“We ran these projects simultaneously, responding to the same document request, so it was interesting to see the differences in each emerge along the way,” said Martha. “We made sure the DOJ was kept apprised of those differences.”

Beyond the separate data sets, McDermott needed to handle additional server data that did not become available until the project was already in progress.

“The server data took longer to collect, so it wasn’t incorporated into the data sets until later,” said Martha. “But with Assisted Review, we were able to use the documents we had already coded to fine-tune the system and quickly integrate the extra documents.”

In the first round that included the server data, the team relied on the documents already coded in earlier samples to categorize the new material. For subsequent rounds, they pulled fresh samples to better train the system. In the end, the addition of server data didn’t compromise the team’s results. After just 10 days and a handful of training rounds, their first production was ready to be delivered to the U.S. DOJ.
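
Relativity's categorization engine is proprietary, but the general idea of reusing already-coded documents to score late-arriving data can be sketched with an off-the-shelf text classifier. Everything below (the TF-IDF-plus-logistic-regression model and the example documents) is an illustrative assumption, not the product's actual mechanics.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Documents already coded in earlier training rounds (illustrative).
coded_texts = [
    "Corona import pricing and distribution terms",
    "Crown joint venture board minutes on market share",
    "office holiday party signup sheet",
    "IT ticket: password reset request",
]
coded_labels = [1, 1, 0, 0]  # 1 = responsive, 0 = non-responsive

# Late-arriving server data, not available at project kickoff.
server_texts = [
    "Modelo brand distribution rights in the U.S.",
    "cafeteria menu for the week",
]

# Fit a simple model on the existing coding decisions ...
vectorizer = TfidfVectorizer()
X_coded = vectorizer.fit_transform(coded_texts)
model = LogisticRegression().fit(X_coded, coded_labels)

# ... then score the new server documents with it.
X_server = vectorizer.transform(server_texts)
scores = model.predict_proba(X_server)[:, 1]

# Highest-scoring documents can be routed to reviewers first; the same
# ranking idea supports prioritizing linear review for privilege coding.
for text, score in sorted(zip(server_texts, scores), key=lambda t: -t[1]):
    print(f"{score:.2f}  {text}")
```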

With a rolling production schedule in place, McDermott’s team was also able to use the analytics indexes they had fine-tuned with Assisted Review to prioritize linear review for privilege and issue coding. All responsive documents were produced to the U.S. DOJ by October 26th—only six weeks after McDermott had begun training Assisted Review.

Validated Results

The U.S. DOJ and McDermott continued their collaboration even after the final production was out the door. To validate the results, the U.S. DOJ wanted to review a random sample of non-responsive documents to confirm that the responsive documents had indeed been produced. Several of the DOJ’s litigation support professionals visited McDermott’s office to observe Martha creating a series of sample sets. The U.S. DOJ chose a set at random and reviewed the documents for responsiveness; a clean team, with no member actively working on the matter, performed the exercise. In the end, the U.S. DOJ was satisfied with the results of its review.
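
This kind of check is often called an elusion test: sample the discard pile and measure how many responsive documents slipped through. A minimal sketch, assuming a simple random sample and invented figures:

```python
import random

def draw_elusion_sample(non_responsive_ids, sample_size, seed=None):
    """Draw a simple random sample from the documents the system
    categorized as non-responsive, for human validation."""
    rng = random.Random(seed)
    return rng.sample(non_responsive_ids, sample_size)

def elusion_rate(sample_calls):
    """Fraction of sampled 'non-responsive' documents that a human
    reviewer found actually responsive (i.e., that eluded the system)."""
    return sum(sample_calls) / len(sample_calls)

# Illustrative numbers only; not the actual matter's figures.
non_responsive_ids = [f"DOC-{i:07d}" for i in range(800_000)]
sample = draw_elusion_sample(non_responsive_ids, sample_size=400, seed=42)

# 1 = reviewer found the sampled document responsive after all.
reviewer_calls = [0] * 398 + [1] * 2  # pretend review outcome
print(f"Elusion rate: {elusion_rate(reviewer_calls):.2%}")  # 0.50%
```

A low elusion rate is what lets both sides accept the non-responsive pile without reviewing it in full.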

[McDermott’s] quick work and increased accuracy saved their clients more than $2 million.

As a result of a cooperative working relationship, a fast and effective review, and quality productions, the McDermott team received direct validation from the U.S. DOJ for the Assisted Review workflow.

All in all, McDermott was able to target just 21 percent of the original data set for manual review. The quick work and increased accuracy saved their clients more than $2 million. In the end, the merger received U.S. DOJ approval in spring 2013.
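
The article doesn't itemize the $2 million figure, but the arithmetic behind that kind of savings is straightforward: cutting manual review to 21 percent of the corpus removes most of the per-document review cost. A back-of-the-envelope sketch, with every input a hypothetical stand-in:

```python
# All numbers are hypothetical; the article reports only the 21 percent
# manual-review rate and the >$2 million in savings.
original_corpus = 2_000_000             # "millions of documents" collected
culled_corpus = 1_100_000               # "well over a million" after culling
reviewed = int(original_corpus * 0.21)  # 21% of the original data set
cost_per_doc = 3.00                     # assumed blended $/doc for manual review

baseline = culled_corpus * cost_per_doc  # linear review of everything culled-in
actual = reviewed * cost_per_doc         # Assisted Review workflow

print(f"Documents manually reviewed: {reviewed:,}")
print(f"Estimated savings: ${baseline - actual:,.0f}")  # about $2.0 million
```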

“Both our team and our clients were very happy with the results of this case,” said Martha. “We’re excited to evaluate each new matter for its potential with Assisted Review, and we look forward to continuously making our work more effective for our clients.”
