by Tracy Nguyen and Clara Skorstad – Kilpatrick Townsend & Stockton on January 19, 2016
Our firm processes hundreds of thousands of files on a daily basis, and over the years, we’ve learned what it takes to manage such large amounts of data. As in most facets of life, success in processing comes from a combination of planning and organization.
At first glance, processing data can seem like a complex undertaking, manageable only by a subset of niche technologists. Every processing tool out there does things a little bit differently and has its own idiosyncrasies, and the practice itself even seems to have its own language—dedupe? deNIST? OCR?—making it even more cryptic to beginners.
However, with the right preparation, your team can become a processing powerhouse. Here are six best practices we’ve discovered over the years to be truly successful with processing.
1. Assemble the Right Team
Successful processing requires the coordination of several people, all working toward the goal of providing superior service to your case teams and clients. To accurately and efficiently process data, your team should have skill sets for every stage of the project. For example, you will need:
- An experienced project manager to draft processing instructions and manage the overall project
- A system administrator to handle technical tasks
- Experienced analysts to push the data through the processing tool and QC the deliverables
Additionally, make sure you are adequately covered for troubleshooting. You never know when issues will arise, but we can tell you—it’s often during off hours. We have analysts on call every weekend and evening to ensure issues get resolved as they occur.
2. Give New Team Members Comprehensive One-on-one Training
When someone on your team is first learning to process data, you’ll likely be inclined to have them attend a training session hosted by your software vendor. While these sessions are good ways to get your feet wet, the best resources for really mastering processing are often found within your own organization.
Make it a policy that new members of your core processing team shadow a project manager or processing specialist who can help them become comfortable with all the processing essentials, down to the mouse click. And we’re using the term “shadow” loosely—the new team member should be the one sitting at the computer, making each move.
Schedule this training for a couple of hours per day for a few weeks, starting with simple tasks, such as learning common processing terms, before diving into topics such as how to preserve metadata and how to handle corrupt files.
3. Create a System for Handling Data
One of the most important things you can do when developing an internal processing protocol is create and document a system for handling the data. For example, what does the folder structure look like? Will your team organize data by custodian name or job name? Or maybe chronologically by load date?
Without a system in place, it is very easy to become disorganized and misplace or lose data (which could have serious legal ramifications). Further down the line, it is also difficult to guarantee that you are creating a complete archive or complying with court- or client-ordered data-destruction directives.
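To illustrate what a documented convention might look like, here is a minimal Python sketch. The matter-then-custodian-then-load-date layout, and all names in it, are our own hypothetical example, not a standard your tool will impose:

```python
import tempfile
from datetime import date
from pathlib import Path

def intake_folder(root: str, matter: str, custodian: str, load_date: date) -> Path:
    """Create (if needed) and return a consistently named intake folder.

    Layout: <root>/<matter>/<custodian>/<YYYY-MM-DD load date>
    A documented convention like this makes it easy to audit what was
    loaded, when, and for whom, and to honor destruction directives later.
    """
    path = Path(root) / matter / custodian / load_date.isoformat()
    path.mkdir(parents=True, exist_ok=True)
    return path

# Example: data for custodian "jdoe" on matter "ACME-0042" (names are invented)
root = tempfile.mkdtemp()
target = intake_folder(root, "ACME-0042", "jdoe", date(2016, 1, 19))
print(target)  # ends in .../ACME-0042/jdoe/2016-01-19
```

Whatever layout you choose matters less than writing it down and applying it every time, so anyone on the team can locate a given custodian’s data months later.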
4. Report on Progress
Typically, processing tools will give you the ability to generate numerous reports throughout the job. Knowing what your tool’s reporting capabilities are and using those reports can help you relay important information to your client.
At our firm, we always run the following reports:
- A report on file types and data sizes to help calculate and present the estimated time and cost of a processing job
- A report that outlines exceptions—files that were not processed—such as password-protected files, corrupted files, media files, or empty files. If problems arise, this report paints a picture for your client of what wasn’t processed and why.
- A de-duplication report that provides clients with a list of files that were duplicates
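To make the first and third reports concrete, here is a minimal Python sketch of how file counts and sizes by type, plus hash-based duplicates, might be tallied from a folder of collected files. It is our own illustration, not any vendor tool’s API; real processing tools typically dedupe on a hash such as MD5 or SHA-1, as this sketch does:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def inventory(root: str):
    """Summarize file counts/sizes by extension and flag hash duplicates."""
    by_type = defaultdict(lambda: [0, 0])  # extension -> [count, total bytes]
    by_hash = defaultdict(list)            # MD5 digest -> list of paths
    for f in Path(root).rglob("*"):
        if f.is_file():
            ext = f.suffix.lower() or "(none)"
            by_type[ext][0] += 1
            by_type[ext][1] += f.stat().st_size
            # Reads each file fully into memory; fine for a sketch,
            # but hash in chunks for production-sized collections.
            by_hash[hashlib.md5(f.read_bytes()).hexdigest()].append(f)
    dupes = {h: paths for h, paths in by_hash.items() if len(paths) > 1}
    return dict(by_type), dupes
```

The type/size summary feeds your time-and-cost estimate up front, while the duplicate list backs the de-duplication report you hand the client at the end.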
5. Create a QC checklist
You cannot call a processing job “complete” until you’ve QC’d all the data and ensured all processing steps were completed properly. To make QC’ing easier and ensure you haven’t missed anything, it’s helpful to create a written checklist of the items included in your QC. This may include:
- Checking the consistency of the metadata, images, and text
- Exception tagging—did you tag all exceptions?
- Checking OCR (optical character recognition) to ensure files have text output
- Comparing filters with the data—if a filter was applied prior to processing, does the data reflect the criteria?
- Did you process everything your client asked to be processed? It never hurts to double check.
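Some checklist items lend themselves to automation. As one example, here is a minimal Python sketch of the OCR check above; it assumes a hypothetical deliverable layout where the extracted or OCR’d text for each native sits beside it with the same stem and a .txt extension, so adjust it to your tool’s actual export structure:

```python
from pathlib import Path

def missing_text(deliverable_dir: str) -> list:
    """QC check: list natives that lack a non-empty sibling .txt text file.

    Assumes text output lives next to each native as <same name>.txt,
    which is an illustrative convention, not a universal export format.
    """
    missing = []
    for native in sorted(Path(deliverable_dir).rglob("*")):
        if native.is_file() and native.suffix.lower() != ".txt":
            text = native.with_suffix(".txt")
            if not text.exists() or text.stat().st_size == 0:
                missing.append(native)
    return missing
```

Running a script like this before sign-off turns a manual spot check into a complete sweep, and anything it flags goes straight onto your exception report.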
6. Call for Backup
If we could leave you with just one easy-to-remember piece of advice, it’s this: Prior to telling your client a processing job is good to go, get a second set of eyes on it.
Processing data in-house, with our team of experienced project managers and analysts working cohesively under our standard written protocols, has allowed us to manage timing, processes, and costs. It has also enabled us to offer flexibility and data security to our clients.
Remember, with processing, the end goal is to get the data into the review team’s hands as quickly as possible. The more streamlined your internal operations, the faster you’ll be able to do this.
We’re sure there are other tips and tricks from other “processing powerhouses” out there. Please share them with us in the comments.
Tracy Nguyen is a senior analyst within Kilpatrick Townsend’s e-discovery team. She works directly with project managers, attorneys, and clients on all phases of the EDRM. Her areas of expertise include data preparation, loading, processing, production, and quality control.
Clara Skorstad is a senior project manager within Kilpatrick Townsend’s e-discovery team. In her 15+ year project management career, she has regularly consulted with attorneys and clients on all phases of the EDRM and project planning. Clara has also been involved as a technical resource on a significant number of large, highly sensitive matters.