Real-World Stories of Relativity Assisted Review



by Constantine Pappas on October 31, 2012

Analytics & Assisted Review, Community, Law Firm, Litigation Support, Product Spotlight

Have you used Relativity Assisted Review? You may know it by a different name, as the industry has used various phrases for computer-assisted review for some time. Assisted Review is the process of amplifying attorneys' review decisions to suggest coding values for every document in a universe, and then validating those suggestions with statistical analysis.
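
To make the validation step concrete, here is a minimal sketch (not Relativity's actual implementation) of one common statistical approach: estimating an overturn rate from a sample of machine-coded documents that attorneys re-reviewed, with a normal-approximation confidence interval for a proportion. The sample figures are hypothetical.

```python
import math

def overturn_rate_ci(sample_size: int, overturns: int, z: float = 1.96):
    """Estimate the overturn rate from a validation sample, with a
    normal-approximation confidence interval (z = 1.96 -> ~95%)."""
    p = overturns / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    return p, max(0.0, p - margin), min(1.0, p + margin)

# e.g. attorneys re-review 600 machine-coded documents and disagree with 18
rate, low, high = overturn_rate_ci(600, 18)
print(f"overturn rate: {rate:.1%} (95% CI: {low:.1%} - {high:.1%})")
```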

Last week at Relativity Fest, our clients shared their own stories of using Assisted Review in the field during our Real-World Stories of Relativity sessions. Assisted Review has been used in internal investigations, environmental matters, and, most commonly, litigation. It can help reduce review costs for large volumes of documents, ensure all reviewers share the same understanding of the issues, or quickly analyze a case. Fest was an excellent opportunity for our clients to step into the spotlight and share these experiences.

Dave Herman of UHY Advisors shared his experience using Assisted Review for an internal investigation. A large Fortune 500 investigation under the Foreign Corrupt Practices Act had an initial scope of 200 custodians spanning 15 countries. The data was globally de-duplicated during processing, before being imported to Relativity, but, interestingly, it was neither keyword searched nor culled. That common culling step was skipped to help ensure nothing would be missed. Relativity Analytics was critical in finding slang uses of key terms and similar documents that might otherwise have been eliminated. Searching was used to break the databases into groups for prioritized Assisted Review training and to quickly identify good example documents, a valuable step since feeding the system high-quality examples greatly speeds up the process.

Adam Wright Strayer of Complete Discovery Source (CDS Legal) used Assisted Review on a large environmental matter. There were 4.5 million documents produced among 12 defendants and six plaintiffs, and the goal was to find the key items in the production. Eight different law firms represented the defendants. While one firm managed the project, all eight trained the system with examples of responsive and non-responsive documents. This created a great deal of inconsistency in the review, which Assisted Review kept surfacing as overturns: documents where the attorney-applied value and the system-suggested value did not align, all reported in Assisted Review's overturn details report. This is where CDS began to get things on track.

Discussions commenced in which the disparate groups aligned on what was responsive; gaining this team alignment was important. To get the review back on track after 12 rounds had already been completed, the team used strategic sampling and keywords to identify better examples and focused on the key documents within similar-document groups. Overly complicated documents were also removed from the training set. This brought the overturn values down, but not yet to expected levels. The key to success was breaking the document set down further: the responsive overturn rate was high, but the non-responsive overturn rate was only 3 percent. That allowed the client to concentrate on the responsive documents and eliminate the review of a very large dataset. There were a large number of false positives, but an even larger group of documents never had to be reviewed at all, because the team was confident in the non-responsive overturn percentage. Ignoring 3.5 million documents proved Assisted Review to be their light at the end of a very long tunnel.
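
The asymmetry CDS relied on, a high responsive overturn rate alongside a low non-responsive one, falls out of tallying overturns per system-suggested category. Here is a minimal, hypothetical sketch of that tally; the sample pairs are invented for illustration and are not CDS's data.

```python
from collections import Counter

def category_overturns(docs):
    """docs: iterable of (system_value, attorney_value) pairs from a
    validation round. Returns the overturn rate per system category."""
    totals, overturns = Counter(), Counter()
    for system, attorney in docs:
        totals[system] += 1
        if system != attorney:
            overturns[system] += 1
    return {cat: overturns[cat] / totals[cat] for cat in totals}

# hypothetical round: broad agreement on non-responsive documents,
# far more disagreement on responsive ones
sample = (
    [("responsive", "non_responsive")] * 30
    + [("responsive", "responsive")] * 70
    + [("non_responsive", "responsive")] * 3
    + [("non_responsive", "non_responsive")] * 97
)
print(category_overturns(sample))  # responsive ~30%, non_responsive ~3%
```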

Dean Anderson of NightOwl Discovery used Assisted Review on a case of 2.5 million documents. The challenge was that the team had only 45 days to produce. An initial keyword search brought back 350,000 documents; the team started there, feeding the system example documents from the keyword group. At the end of the Assisted Review workflow, only 11,000 documents were ultimately reviewed by a person. Just 1 percent of the database went uncoded by the system, and 73 percent of the total population was eliminated from linear review. The production went out, and the team felt confident in the documents produced.
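
Using the figures above as round numbers, a quick back-of-the-envelope calculation shows why the savings were so dramatic. This snippet is just hypothetical arithmetic, not output from a Relativity report.

```python
total = 2_500_000        # documents in the database
human_reviewed = 11_000  # documents a person ultimately read

eliminated = int(total * 0.73)  # share eliminated from linear review
uncoded = int(total * 0.01)     # ~1% left uncoded by the system

print(f"human eyes on {human_reviewed / total:.2%} of the database")
print(f"{eliminated:,} documents skipped linear review entirely")
print(f"{uncoded:,} documents were never coded by the system")
```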

It’s easy to think Assisted Review is the future of review, but that future is now. Each of these scenarios succeeded because of the technology. It was interesting for us to hear our clients share how Assisted Review gave them the leverage to tackle large volumes of documents, get review teams on the same track, and verify productions, and we hope sharing their stories is beneficial to all of you.

If you need assistance learning the Assisted Review process or strategizing a project, feel free to contact us at advice@relativity.com.
