Evaluating Your Reviewers Using Project Analytics

Project Analytics on Everlaw provides several ways to evaluate reviewer activity on a group or individual level. You can use these tools to visualize trends and pinpoint areas for future improvement among your reviewers.

Note

Metrics in the Project Analytics page are based on the project time zone. For example, if an action takes place at 11pm PST but the project time zone is EST, it is reported in Project Analytics as happening the following day.
The underlying data is reaggregated every 15 minutes. To see refreshed data, reload the page or change your filter values.

Access Project Analytics

Required permissions: You must have Project Admin permissions or be an Organization Admin to access project analytics.

To access the Project Analytics page, go to Project Management > Project Analytics.

Platform Time

Any time a user spends on your project is logged and reported under Platform Time. After 15 minutes with no activity, Everlaw considers the account idle and stops tracking time until an action is performed. An action is either of the following:

  • Clicking any button on the page
  • Scrolling the page

platform time.png

Platform time is separated into seven categories. For each category, time is tracked based on time spent in a subset of Everlaw pages:

  • Total platform: The total amount of non-idle time a user has spent logged in
  • Review time:
    • Results table
    • Using Quick Review
    • Search page
  • Project management:
    • Project Settings
    • Advanced Settings
    • Production Notifications Tab
    • Database Settings
    • Project Analytics
    • Assignments page
  • Document analytics:
    • Clustering
    • Predictive Coding
    • Search Term Reports
    • Data Visualizer
    • Legal Holds
  • Uploads:
    • Native uploads
    • Processed uploads
    • File transfers
  • Productions:
    • Creating a production
    • Creating a production protocol
    • Production Protocol page
    • Productions page
  • Storybuilder:
    • Timeline
    • Dashboard
    • Draft
    • Deposition

Filter and organize Platform Time

You can filter the data by date range, user group, or specific user:

  • To filter by date range, you can either:
    • Select the dropdown next to Date range and choose from the options
    • Enter a custom date range, with the beginning date in From and the end date in To
  • To filter by user, select the dropdown next to User. Select the name of each user you want to include. You can type a name to filter the list.
  • To filter by User group, select the dropdown next to User group. Select the user group(s) you want to include. You can type the name of a user group to filter the list.

Select a column header to sort that column in descending order. Select it again to sort in ascending order.

Export Platform Time

To export an Excel file of the Platform Time table, select the three-dot menu at the top of the table and choose Export.
export platform time.png

Applied filters (such as date range, user group, and user), sorting, and any columns you have added or removed are reflected in your export.


The export includes separate tabs for Total, Monthly, Weekly, and Daily platform time:

  • The Totals tab shows all users, including those with no recorded activity
  • The Monthly, Weekly, and Daily tabs only include users with activity data.

Note: The smallest unit of time is one second; users with data values of 0 may have activity that amounts to less than one second.
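
If you want to work with the exported data outside of Everlaw, you can load the workbook programmatically. The sketch below is a minimal Python example using pandas; the file name, tab names, and column headers are assumptions, so check them against your actual export before running it.

    import pandas as pd

    # Load every tab of the Platform Time export into a dict of DataFrames
    sheets = pd.read_excel("platform_time.xlsx", sheet_name=None)

    totals = sheets["Totals"]    # all users, including those with no activity
    daily = sheets.get("Daily")  # only users with recorded activity

    # Example: list users with no recorded platform time on the Totals tab
    # ("User" and "Total platform" column names are assumptions)
    idle_users = totals[totals["Total platform"] == 0]["User"]
    print(idle_users.tolist())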

Review Activity

The Review Activity table shows the raw statistics for all reviewers on the project. The bottom row includes a Total summed across all reviewers. These stats include:

  • Total documents viewed
  • Total pages viewed
  • Documents viewed per hour
  • Pages viewed per hour
  • Repeat views, as a percentage of all documents viewed
  • The percentage of rated documents that were rated hot, warm, or cold

review_activity.png

Filter Review Activity

You can filter the data by date range, user group, or specific user.

To filter by date range, you can either:

  • Select the dropdown next to Date range and choose from the options
  • Enter a custom date range, with the beginning date in From and the end date in To

To filter by user, select the dropdown next to User. Select the name of each user you want to include. You can type a name to filter the list.

To filter by User group, select the dropdown next to User group. Select the user group(s) you want to include. You can type the name of a user group to filter the list.

Export Review Activity

To download a CSV file to your computer, select the three-dot menu at the top of the table and choose Export. The exported CSV reflects any filters you have applied to the table.

You can use this data to help diagnose performance, like tracking how many documents or pages per hour each reviewer has moved through to understand comparative efficiencies. This can help you adjust review assignments and even future team composition.

The columns displayed in the Review Activity table are:

  • Total Docs: A count of the total number of times the user opened a document in the review window (even if the user had previously viewed that document). A "view" is counted regardless of how long the document was open.
  • Pages Viewed: A count of how many native or non-native pages the user viewed (including repeat views)
  • Docs Per Hour: Docs Viewed divided by Review Time (see the sketch after this list)
  • Pages Per Hour: Pages Viewed (from this table) divided by Review Time.
  • Repeat Views: The percentage of all Docs Viewed that the user had previously viewed. A previous view counts if the document was viewed at any time, regardless of how long it was open.
  • Ratings: A count of the total number of documents the user rated (manually or batch rated, hot/warm/cold)
  • Hot/Warm/Cold: The percentage of documents the user rated hot/warm/cold out of the total number they rated (e.g. if they rated 10 docs - 5 hot, 2 warm, and 3 cold - the percentages would be 50% hot, 20% warm and 30% cold)
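
Because these derived columns are simple combinations of the raw counts, you can recompute them yourself from an exported CSV. The sketch below is a minimal Python example using pandas; the file name and column headers are assumptions, so adjust them to match the headers in your export.

    import pandas as pd

    df = pd.read_csv("review_activity.csv")

    # Recompute the per-hour and percentage columns from the raw counts
    df["Docs Per Hour"] = df["Docs Viewed"] / df["Review Time (hrs)"]
    df["Pages Per Hour"] = df["Pages Viewed"] / df["Review Time (hrs)"]
    for rating in ["Hot", "Warm", "Cold"]:
        df[f"% {rating}"] = 100 * df[rating] / df["Ratings"]

    # Rank reviewers by throughput to compare efficiencies
    print(df.sort_values("Docs Per Hour", ascending=False).head())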

Reviewer Accuracy

Reviewer accuracy statistics compare a reviewer's ratings against a Project Admin's ratings to measure how accurate the reviewer's ratings are. Project Admins can re-review previously reviewed documents to calculate a reviewer's accuracy. To do so:

  1. Open a results table of a portion of the documents a reviewer has rated.
  2. Rate the documents yourself, changing any incorrect ratings applied by that reviewer.
  3. Based on this information, Everlaw calculates an estimated error rate for the reviewer and provides a statistical confidence measure for that estimate (see the illustrative sketch after this list).
    • Any document a (non-admin) reviewer has rated counts as reviewed by the reviewer
    • Any document viewed or rated by an admin counts as reviewed by the admin
    • If a Project Admin opens a document that a reviewer has rated, even briefly, and does not change the rating, the document will count in favor of the reviewer's accuracy
      analytics6.png
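
Everlaw's exact statistical model for the confidence measure is not documented here, but a useful intuition is that an error rate estimated from a sample behaves like a binomial proportion: the more documents an admin reviews, the tighter the confidence interval around the estimate. The sketch below is an illustrative Python approximation using a Wilson interval, not Everlaw's actual formula.

    from math import sqrt

    def wilson_interval(errors: int, admin_reviewed: int, z: float = 1.96):
        """Approximate 95% confidence interval for an error rate, given the
        number of errors found among admin-reviewed documents (illustrative only)."""
        if admin_reviewed == 0:
            return (0.0, 1.0)
        p = errors / admin_reviewed
        denom = 1 + z**2 / admin_reviewed
        center = (p + z**2 / (2 * admin_reviewed)) / denom
        margin = z * sqrt(p * (1 - p) / admin_reviewed
                          + z**2 / (4 * admin_reviewed**2)) / denom
        return (max(0.0, center - margin), min(1.0, center + margin))

    # e.g. 9 errors in 100 admin-reviewed documents: ~9% estimated error rate,
    # with roughly a 5-16% interval; reviewing more documents narrows it.
    print(wilson_interval(9, 100))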

The Reviewer Accuracy table displays:

  • The statistical confidence the system has in the predicted error rate. This is represented by the green circle in the leftmost column. The more filled-in the circle is, the higher the confidence level. Hovering over the circle will provide the exact percentage
  • The reviewer's name
  • The predicted error rate as a percent (i.e., for Demo Reviewer 2 in the table above, the system estimates that they make an error on 9% of the documents they review)
  • Next steps

Next steps allow Project Admins to start taking action on the information they've gathered. These might include:

  • Generating predictions
  • Improving the accuracy of the predictions
  • Reviewing the documents that have registered errors

You can verify documents or review errors. Admin-verified documents are documents that have been viewed or rated by an admin. Verify x docs refers to the number of documents the admin must view or rate in order to achieve 95% confidence in the reviewer's accuracy.


Select more --> in the upper right of the table to reveal additional columns. Select less <-- to hide these columns again.

analytics7.png

Select any of the numbers in the Rated, Needs Review, Admin Reviewed, or Errors column to open a results table with those documents.

By default, the statistics are generated from the beginning of the project. To use a different start date to calculate the statistics, you can input a new date into the filter by starting date box. To permanently use a different start date, select Set Date as Default after choosing a date to filter by.

starting date.png

Rating Conflicts

The Rating Conflicts page displays all the rating conflicts in the project, broken down by type of conflict.


Admin-Admin conflict: If a Project Admin opens a document that a reviewer has rated and does not change the rating, the Admin is "agreeing" with that rating. If another Admin changes the rating later, this creates an Admin-Admin conflict.

To resolve conflicts, select the card or bar graph of the conflict type or user you want to resolve conflicts for. This opens a results table of the documents with a column for the conflict type. To resolve conflicts, open a document and modify the rating. The conflict statistics will update accordingly.

Alternatively, you can affirm the documents' current ratings and resolve conflicts in a batch. 

To do so:

  1. Select Batch > Modify. This opens the batch modification panel.
  2. Select Resolve Rating Conflicts.
    resolve conflicts.png
  3. Select Apply.
  4. Select Apply again to confirm the action.

Review Progress

Review Progress charts the documents viewed and documents rated over the lifetime of the project. Note: For projects that were in Everlaw before analytics was released on 7/13/2012, the review progress chart begins on that date.

Hover your mouse over the graph to see any week since the project was created; the tooltip shows how many new documents were viewed that week, and how many documents were rated. Both parameters are cumulative: over time, the number of documents that have been viewed and rated should rise as review activity progresses on your project.

review progress.png
Note: There may not be a one-to-one correspondence between documents that have been viewed and those that have been rated. For example, some documents may have been viewed but not given a rating. Additionally, it is possible to rate documents without viewing them through batch actions.

This graph is similar to the Overview tab, except that you can see the entire history of the project. There are two interactive components on this page:

  • To add or remove parameters, or series, from the graph, select the appropriate checkbox in the legend.

    The checkbox next to the legend title selects or deselects all the series. If only some of the series are selected, the checkbox appears gray, but it is still clickable and will add all the series to the graph. Hover over one of the series in the legend to emphasize that parameter in the graph by bringing it into the foreground.
  • To see the data more clearly, click on the Y-axis of the graph. The graph resizes to display the documents as a percentage instead of an absolute number. For example, if you have rated 4.1% of the documents on your project and viewed 8.4% of the documents, the Y-axis will resize and display units from 0-10%. This makes the current percentage and historical progression of viewed and rated documents easier to visualize. Click on the Y-axis again to return the axis labels to an absolute number. The image below shows a graph that has been resized by clicking on the Y-axis.
    analytics4.png

Rating Trends

Rating trends give you insight into how reviewers are applying ratings. The patterns can help you identify reviewers you might want to retrain, or whose work you should consider QAing. Each rating is represented as a separate bar chart. Select or deselect the boxes next to the names in the User table to add or remove users from the visualization.

rating_trends.png

Each bar represents the percentage of the total number of documents rated by that user with that rating. Hover over a bar to see the number of individual documents that user rated with that rating, and what percentage of all their rated documents that number represents.


These comparative charts can help you understand, for example, whether someone rates documents hot more liberally than their colleagues. You might also find that a reviewer rates documents "hot" or "cold" more often than they do "warm." In that case, you might want to clarify your criteria for the various rating categories to ensure consistency.

Reviewer Pace

The Reviewer Pace scatter plot graphs the number of documents viewed over the lifetime of the project. You can select users' data to appear in the graph by selecting them in the User key. If there are overlapping lines, hover over one of the users in the key to bring the user's line into the foreground.

Under the graph is a bar that allows you to set which time period you would like to view. You can also toggle between weeks and months to narrow down or expand the time frame you wish to see.

analytics13.png

Custom

If built-in reports do not provide all of the information you want, you can make your own custom reports on the Custom page. To create a visualization:

  1. Select the Y-axis metric in the left dropdown.
    Y axis.png
  2. Select the X-axis metric in the right dropdown.
    X azis.png
  3. Select the users you want to visualize in the User table.
  4. Choose the time period to view using the bar at the bottom. You can also toggle between weeks, months, or Total.
    time bar.png

Hover over a user's name in the legend to bring their line into the foreground.

analytics5.PNG

 

Metrics for Custom Visualizations

The raw statistics that we collect (and that are available in the Custom Visualization widget) are listed below; here, "new" means not previously viewed by the user. A small illustration of how these counts relate follows the list:

  • Review Time: The length of time the user has spent in review. This timer counts any time a user spends in the review window, on the results table, or on the search page.
  • Docs Viewed: A count of the total number of times the user opened up a document in the review window (even if the user had already previously viewed that document).
  • Pages Viewed: A count of the total number of pages the user viewed in the review window (even if they had already viewed that page).
  • New Docs: A count of the total number of new documents the user opened
  • New Pages: A count of the total number of new non-native pages the user viewed.
  • New Natives: A count of the total number of new native files the user opened
  • Docs With Natives: A count of the total number of new documents the user opened that had natives
  • Pages Opened: A count of the total number of non-native pages in the new documents the user opened (e.g. if the user opened document A, which has 3 pages, and document B, which has 2, this count would be 5 - which pages the user views doesn't matter here)
  • Native Pages Viewed: A count of the total number of native pages the user viewed.
  • New Native Pages:  A count of the total number of new native pages the user viewed.
  • Rated Hot/Warm/Cold/Unrated: A count of the number of documents the user manually rated.
  • Batch Rated Hot/Warm/Cold/Unrated: A count of the number of documents the user rated using the Batch Rate tool.
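
As a small illustration of how the "new" counts relate to repeat activity, the snippet below uses made-up numbers for one reviewer. The relationship shown is an interpretation of the definitions above, not an official Everlaw formula.

    docs_viewed = 240   # every open, repeats included
    new_docs = 180      # opens of documents this user had not viewed before

    # Repeat views are the opens that were not of new documents
    repeat_views = docs_viewed - new_docs
    repeat_view_pct = 100 * repeat_views / docs_viewed
    print(f"{repeat_views} repeat views ({repeat_view_pct:.0f}% of all views)")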