Case Analytics: Evaluating Your Project

This article describes how to use Everlaw's Case Analytics feature for high-level project evaluation. The Case Overview, Case Size, and Review Progress tabs in Case Analytics let you monitor your project from a bird's-eye view. You can also combine various metrics to build custom analytical reports that meet your particular needs.

Table of Contents

  • Case Overview
  • Case Size
  • Review Progress
  • Custom
  • Data Collection and Metrics Details

Case Overview

Case Overview provides high-level statistics about the progress of review.

  • Documents coded: The percent of documents that have been coded in the project.
  • Documents viewed: The percent of documents that have been viewed in the project.
  • Documents rated: The percent of documents that have been rated in the project, and a breakdown of the distribution of hot, warm, and cold ratings.
  • Total Documents: The total number of documents in the project, including duplicates.
  • Docs Reviewed/Week: The average number of documents reviewed per week. This metric is based on the cumulative number of documents that have been viewed in your project; it does not take ratings or codes into account.
  • ETA to View All Docs: The estimated amount of time it would take to view all the documents in the project at the current review pace (a sketch of this arithmetic appears after this list).
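
For reference, here is a minimal sketch of the arithmetic behind Docs Reviewed/Week and ETA to View All Docs, assuming a simple average over the life of the project. The function names, the averaging window, and the example numbers are illustrative assumptions, not Everlaw's internal implementation.

    # Minimal sketch, assuming Docs Reviewed/Week is a simple average over the project's life.
    def docs_reviewed_per_week(cumulative_viewed: int, weeks_elapsed: float) -> float:
        """Average number of documents viewed per week since the project started."""
        return cumulative_viewed / max(weeks_elapsed, 1.0)

    def eta_to_view_all_docs(total_docs: int, cumulative_viewed: int, weeks_elapsed: float) -> float:
        """Estimated weeks remaining to view every document at the current pace."""
        pace = docs_reviewed_per_week(cumulative_viewed, weeks_elapsed)
        remaining = total_docs - cumulative_viewed
        return remaining / pace if pace else float("inf")

    # Example: 12,000 documents, 3,000 viewed over 6 weeks -> 500 docs/week, 18 weeks remaining.
    print(eta_to_view_all_docs(12_000, 3_000, 6))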

[Image: Case Overview statistics]

Return to table of contents

Case Size

The Case Size tab shows you the current and historical size of your database on Everlaw. You can also disaggregate the information into the total GB of native data and the total GB of processed data. Since billing is often based on project size, this tab can help you understand your bill. Only projects started on or after August 16, 2015 have project size information; if your project started earlier, this tab will not be visible.

[Image: Case Size chart]

Return to table of contents

Review Progress

The Review Progress page charts the documents viewed and documents rated over the lifetime of the project (for projects that were in Everlaw before analytics was released on 7/13/2012, the chart begins on that date). You can hover your mouse over the graph to see any week since the project was created; the tooltip shows how many documents had been viewed and how many had been rated as of that week. Both series are cumulative, so the number of documents that have been viewed and rated should rise over time as review activity progresses on your project. (Note that there may not be a one-to-one correspondence between documents that have been viewed and those that have been rated. Some documents may have been viewed but never rated, and it is also possible to rate documents without viewing them, through batch actions.)
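
As a rough illustration of this cumulative behavior, the sketch below accumulates made-up weekly activity counts into the kind of non-decreasing series the chart plots; the numbers are invented for illustration.

    # Illustrative weekly counts (invented); only the accumulation logic reflects the chart.
    from itertools import accumulate

    weekly_viewed = [120, 300, 90, 0, 450]   # documents viewed for the first time each week
    weekly_rated = [80, 250, 60, 0, 400]     # documents rated each week (manually or in batch)

    cumulative_viewed = list(accumulate(weekly_viewed))
    cumulative_rated = list(accumulate(weekly_rated))

    # Both series only ever rise, but they need not match: a document can be viewed without
    # being rated, or batch-rated without ever being opened.
    print(cumulative_viewed)  # [120, 420, 510, 510, 960]
    print(cumulative_rated)   # [80, 330, 390, 390, 790]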

This graph is similar to the Overview tab, except that you can see the entire history of the project. You can add or remove parameters, or series, by clicking the corresponding checkboxes in the legend. The checkbox next to the legend title selects or deselects all of the series at once; if only some series are selected, that checkbox appears gray, but it is still clickable and will add all of the series to the graph.

You may hover over one of the series in the legend to emphasize that parameter in the graph by bringing it into the foreground.

[Image: Review Progress chart with series legend]

To see the data a little more clearly, click on the Y-axis of the graph. The graph will resize to display the documents as a percentage instead of an absolute number, which gives a better view of the data. For example, if you have rated 4.1% of the documents in your project and viewed 8.4%, the Y-axis will resize to display units from 0-10%. This makes the current percentages and the historical progression of viewed and rated documents easier to visualize. Click on the Y-axis again to return the axis labels to absolute numbers. The image below shows a graph that has been resized by clicking on the Y-axis.

[Image: Review Progress chart resized to the percentage view]
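
As a rough illustration, the sketch below converts absolute counts to percentages and rounds the largest series up to a round axis maximum, reproducing the 0-10% example above. The document counts and the rounding rule are assumptions chosen for illustration, not a description of how Everlaw computes the axis.

    import math

    # Sketch of the percentage view: convert counts to percentages, then pick a round upper bound.
    def as_percent(count: int, total_docs: int) -> float:
        return 100.0 * count / total_docs

    def y_axis_upper_bound(*percentages: float, step: float = 10.0) -> float:
        """Round the largest percentage up to the next multiple of `step`, capped at 100%."""
        return min(100.0, math.ceil(max(percentages) / step) * step)

    total = 50_000                          # assumed project size
    rated_pct = as_percent(2_050, total)    # 4.1%
    viewed_pct = as_percent(4_200, total)   # 8.4%
    print(y_axis_upper_bound(rated_pct, viewed_pct))  # 10.0 -> axis displays 0-10%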

Return to table of contents

Custom

If the built-in reports do not provide all of the information you want, you can create your own custom reports on the Custom page. You can select any metrics you want to compare, as well as any users in the project. Select the metrics from the dropdown menus under the "Custom Visualization" title, and the users you wish to analyze from the Users key. You may hover over one of the users in the legend to emphasize that user's data by bringing the user's line into the foreground. For a detailed description of the available statistics, see the Data Collection and Metrics Details section at the end of this page.

Under the graph is a bar that allows you to set which time period you would like to view. The default view is the last 4 weeks, but it is easily adjusted by dragging the markers along the bar. You may also toggle between weeks and months to narrow down or expand any time frame you wish to see. The graph below is a visualization of review time by user by week, over a period of a month.

[Image: Custom visualization of review time by user by week]
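
If you pull the underlying statistics out of Everlaw (for example, via an export), a custom report like the one above boils down to filtering and grouping weekly per-user records. The record layout below is a hypothetical stand-in for illustration, not an Everlaw data format.

    from datetime import date

    # Hypothetical weekly per-user records; the field names are illustrative assumptions.
    records = [
        {"user": "alice", "week_of": date(2016, 3, 7), "review_time_hours": 12.5},
        {"user": "bob", "week_of": date(2016, 3, 7), "review_time_hours": 8.0},
        {"user": "alice", "week_of": date(2016, 3, 14), "review_time_hours": 10.0},
    ]

    def review_time_by_user(records, start, end):
        """Sum review time per (user, week) for weeks falling inside [start, end]."""
        report = {}
        for r in records:
            if start <= r["week_of"] <= end:
                key = (r["user"], r["week_of"].isoformat())
                report[key] = report.get(key, 0.0) + r["review_time_hours"]
        return report

    # Review time by user by week over a one-month window, as in the graph above.
    print(review_time_by_user(records, date(2016, 3, 1), date(2016, 3, 31)))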


Data Collection and Metrics Details

The raw statistics that we collect (and that are available in the Custom Visualization widget) are listed below. "New" means not previously viewed by the user:

  • Review Time: The length of time the user has spent in review. This timer is active whenever the user has a review window open and is viewing page images or native pages. To prevent idling or forgotten review windows from inflating the numbers, a maximum of five minutes is logged on a page if there is no intervening activity (a sketch of this capping rule appears after this list). Time spent on the home page, searching, or viewing search results is not logged.
  • Docs Viewed: A count of the total number of times the user opened a document in the review window (even if the user had previously viewed that document).
  • Pages Viewed: A count of the total number of pages the user viewed in the review window (even if the user had previously viewed that page).
  • New Docs: A count of the total number of new documents the user opened.
  • New Pages: A count of the total number of new non-native pages the user viewed.
  • New Natives: A count of the total number of new native files the user opened.
  • Docs With Natives: A count of the total number of new documents the user opened that had natives.
  • Pages Opened: A count of the total number of non-native pages in the new documents the user opened (e.g., if the user opened document A, which has 3 pages, and document B, which has 2, this count would be 5; which pages the user actually views does not matter here).
  • Native Pages Viewed: A count of the total number of native pages the user viewed.
  • New Native Pages: A count of the total number of new native pages the user viewed.
  • Rated Hot/Warm/Cold/Unrated: A count of the number of documents the user manually rated.
  • Batch Rated Hot/Warm/Cold/Unrated: A count of the number of documents the user rated using the Batch Rate tool.
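
To make the five-minute cap on Review Time concrete, here is a minimal sketch of the capping rule described above. The event format (activity timestamps in seconds) is an illustrative assumption; only the per-gap cap itself comes from the description.

    # Cap the credit for any gap between activity events at five minutes.
    MAX_IDLE_CREDIT_SECONDS = 5 * 60

    def logged_review_time(activity_timestamps_sec):
        """Total review time credited for a session, capping each idle gap at five minutes."""
        total = 0
        for previous, current in zip(activity_timestamps_sec, activity_timestamps_sec[1:]):
            total += min(current - previous, MAX_IDLE_CREDIT_SECONDS)
        return total

    # Activity at 0s and 60s, then nothing until 40 minutes in: only 5 of those 39 minutes count.
    print(logged_review_time([0, 60, 2400]) / 60)  # 6.0 minutes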

Return to table of contents
