VLM by vdiscovery

vdiscovery's Legal AI, VLM, is a generative AI tool integrated directly into Relativity. VLM can summarize, categorize, and answer questions about documents within a Relativity workspace. It is built on both locally running language models and Microsoft Azure's enterprise GPT implementation to maintain a security and privacy focus.

VLM graphic

Generative AI integrated into Relativity, designed to enhance document review.

vdiscovery’s Legal AI, VLM, brings the power of generative AI directly into any Relativity workspace. Through an intuitive interface, it can create summaries, automatically categorize documents, and answer questions about them. VLM’s deep integration with Relativity makes full use of fields to store generated content, so functions such as summarization can be run across thousands of documents and the resulting summaries or categorizations can be included in any document list view. VLM is built with a security and privacy focus and employs a hybrid approach, using language models running locally behind our firewall as well as Microsoft Azure’s enterprise implementation of GPT.
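The hybrid approach described above can be pictured as a simple routing policy: sensitive material stays with a model behind the firewall, while other documents may use the Azure GPT deployment, with results keyed by document ID so they can be stored back into Relativity fields. The sketch below is purely illustrative; the function and field names are hypothetical and not vdiscovery's actual implementation.

```python
# Illustrative sketch of a hybrid local/Azure routing policy.
# All names here are hypothetical, not VLM's actual code.
from dataclasses import dataclass


@dataclass
class Document:
    doc_id: str
    text: str
    sensitive: bool  # e.g. privileged or confidential material


def summarize_local(doc: Document) -> str:
    # Placeholder for a call to a language model running behind the firewall.
    return f"[local summary of {doc.doc_id}]"


def summarize_azure(doc: Document) -> str:
    # Placeholder for a call to an Azure enterprise GPT deployment.
    return f"[azure summary of {doc.doc_id}]"


def summarize(doc: Document) -> str:
    """Route each document to the appropriate backend."""
    return summarize_local(doc) if doc.sensitive else summarize_azure(doc)


def summarize_batch(docs: list[Document]) -> dict[str, str]:
    # Results keyed by document ID, ready to be written back to a
    # Relativity field and reused in document list views.
    return {d.doc_id: summarize(d) for d in docs}
```

In a real deployment, the placeholder functions would wrap actual model calls, and the batch output would be persisted via Relativity's APIs rather than returned in memory.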

Key benefits of VLM include:

  • Generative AI integrated directly into Relativity
  • Built-in summarization and categorization prompts
  • Privacy focus using both local language models and Azure GPT
  • Field integration for reusable results
  • Question answering on documents in an intuitive interface