by Louis Martin - iDiscovery Solutions, Inc.
on December 15, 2016
Analytics & Assisted Review
e-Discovery professionals rely on analytics tools for cases of all sizes and have celebrated some big wins for clients as a result. As important as it is to share success stories, it’s also important to share close calls, from which we can sometimes learn far more.
In this season’s spirit of gratitude, I wanted to offer up my appreciation to analytics for coming to the rescue in the three near-crises that follow, and pay it forward by sharing my take on our lessons learned—pain and panic not included.
Picture a board room with a beautiful morning view of a city skyline. As you turn your gaze away from the window, you’re faced with a conference room full of decision makers for a law firm—all staring at you. Gulp. The topic at hand? Analytics.
This was the opportunity—and challenge—faced by my team at a firm that had never used analytics before. The partners equated analytics only with technology-assisted review or “predictive coding.” To help build awareness of the full toolset in the analytics family, we prepared a demonstration with real data from the current case, showcasing Relativity Analytics and some of its important features like categorization, email threading, concept searching and, almost as an afterthought, clustering.
We walked through several of the analytics features and detailed explanations of technology-assisted review, with crickets from the crowd. Nearing the conclusion of our presentation, we moved on to visualization tools—specifically, an explanation of clustering. One of the senior attorneys in the back of the room pointed at a specific cluster and asked a question about it. I enlarged the cluster and the concepts within it, organizing them into a list view. We ran a quick search and discovered …
The data we were looking at was from a completely different case.
It wasn’t an error on our side. The data had been accidentally included in the set by opposing counsel, and its parent case was very similar to the one we were addressing. Uncovering those documents proved extremely important to our client’s case, as the stray matter involved an issue closely related to the one we were there to discuss. The attorneys in the room were thrilled. Our demonstration of the hypothetical value of analytics became instantly tangible. Who can say whether that data would have been discovered otherwise?
After the demo, we hadn’t even reached the parking lot when the client called us and asked how soon they could start applying analytics to their cases. The case in question was resolved very soon afterward.
Historically, meet and confers haven’t always showcased the preparedness of opposing counsel. I’ve been in meetings that revealed the other side had only the vaguest sense of the data they held.
As I’m sure my fellow practitioners are aware, the 2015 amendments to the Federal Rules of Civil Procedure have raised the bar on meet and confer expectations. Parties are encouraged to use these conferences early on to negotiate an e-discovery plan and talk through strategy. The tone for the entire case can be set by entering those meetings prepared, aware of the cards you hold, and ready to ask for what you want.
In one matter, we were going to collect from numerous locations: personal devices, network shares, laptops, and legacy systems. You name it, we were collecting from it. Given the broad scope and variety of data types, we knew this review would be challenging. After looking at the first round of collected data, it was clear that traditional e-discovery techniques like keyword searching and date filtering would not provide enough insight and would not sufficiently narrow the data set for review.
We initiated discussions with opposing counsel extremely early on in the matter about using analytics to generate needed insight into our client’s data. Everyone agreed on technology-assisted review, since opposing counsel was very familiar with it. We worked together to determine and document what the workflow would look like and what items we would disclose. We agreed on a specific margin of error and, after each round of Relativity Assisted Review, we provided a summary report and had a phone discussion about the results.
Leveraging analytics in advance allowed the attorneys to see what was in the collected data, including important concepts and issues. By making the most of the meet and confer, both parties were able to produce a complete work plan that included analytics, an agreement that saved the client an estimated six to eight months of review time.
As the hours ticked down before a Thanksgiving holiday weekend, we received a frantic call from a client: they had received short notice about a second request regarding their company’s pending merger. They had less than two weeks to provide the requested documents in a specific production format or their merger would not go through. The data set included more than seven million documents that would need to be reviewed over the holiday weekend.
As an additional burden, our client’s counsel had already set unrealistically high expectations with opposing counsel on the precision, recall, and margin of error values we’d use with Relativity Assisted Review. This document set did not meet those standards.
There was no way we’d be able to manually review a document population of that size without throwing a lot of people at it, and the long weekend and upcoming holidays would make it difficult to source a large enough group for the project.
We moved quickly to pull together a strong project and review team, contacting a trusted reviewer group so they could get started right away. Together, we examined the document set and determined what would be handed over to opposing counsel. The client had a set of pre-coded documents from earlier in the process; we revisited those with subject matter experts to confirm they still provided a good sampling of the overall data set, then used them as the seed set to kick-start our technology-assisted review project.
We spent considerable time discussing with counsel what precision and recall are, what the margin of error means, and how changing the margin of error can make a major difference in the number of documents you must manually review to reach the desired goals. This is important: you must be able to explain, in understandable terms, what you’re seeing and what these metrics mean, so counsel can in turn explain them to others who are important to the case but may not be technologically savvy.
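For readers who want the arithmetic behind those conversations, here is a minimal sketch of the underlying math. The numbers are hypothetical, not from the matter described, and the sample-size formula is the standard normal approximation at 95 percent confidence with a worst-case proportion of 0.5.

```python
import math

def precision_recall(tp, fp, fn):
    """Precision: fraction of retrieved documents that are relevant.
    Recall: fraction of relevant documents that were retrieved."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

def sample_size(margin_of_error, z=1.96, p=0.5):
    """Documents to sample to estimate a proportion within the given
    margin of error (normal approximation, worst case p = 0.5)."""
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

# Hypothetical round: the system flags 8,000 documents as relevant;
# 7,200 of those truly are, and 1,800 relevant documents were missed.
p, r = precision_recall(tp=7200, fp=800, fn=1800)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.90 recall=0.80

# Tightening the margin of error from 5% to 2.5% roughly
# quadruples the number of documents to review by hand.
print(sample_size(0.05))   # 385
print(sample_size(0.025))  # 1537
```

That last point is usually what lands with attorneys: halving the margin of error quadruples the manual sampling workload, which is why the agreed-upon value matters so much in negotiations like the ones above.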
It was contentious for a while, as counsel really wanted to hit the numbers they had promised. Ultimately, we came to an agreement on different, yet acceptable, precision and recall scores by leveraging previously reviewed sets of documents that had been scored and dissecting the reports from those sets.
In the end, leveraging technology-assisted review and a talented project team, we completed the project on time and on budget, allowing our client’s merger to continue as planned.
The ultimate lesson learned from all three scenarios is that technology expands the realm of possibility. It has significantly increased the speed and reduced the cost of review. With the right expertise and technology, yesterday’s “impossible” project is today’s successful one—and where warranted, no case is too large or small to investigate.
Louis Martin, a senior managing consultant at iDiscovery Solutions, Inc. (iDS), has over 15 years of experience in project management and in consulting with law firms and legal departments on a wide range of matters involving relational databases, ESI, database development, data analysis, e-discovery, and computer networking.