Reporting

To view your and your team members' performance over time, go to Reports > Overview. If you are an individual reviewee, the data on this page is based only on the evaluations you have received. If you are an admin or evaluator, the data on this page covers all of your group members.

Reading the Dashboard

The dashboard lets you filter by period (quarter, month, week, or day), Completed At date range, source (workload name), evaluator (type or individual), scorecard, and ticket group.

When viewing by week, the selected start date automatically rounds back to the previous Sunday. For example, if you select Tuesday, October 15, the data shown will start from Sunday, October 13.
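The rounding rule above can be sketched in a few lines of Python; the function name is hypothetical and is not part of the product, it only illustrates the "round back to the previous Sunday" behavior:

```python
from datetime import date, timedelta

def round_to_previous_sunday(d: date) -> date:
    """Round a date down to the most recent Sunday; a Sunday is unchanged."""
    # Python's date.weekday(): Monday = 0 ... Sunday = 6
    return d - timedelta(days=(d.weekday() + 1) % 7)

# Tuesday, October 15, 2024 rounds back to Sunday, October 13, 2024.
print(round_to_previous_sunday(date(2024, 10, 15)))
```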

You can save your selected filters as a custom view by clicking Save as a new view. This is useful if you have a specific set of filters you use frequently.

The Export Report button downloads an .xlsx file containing all metrics displayed on the page.

On this page, you can see histograms of the following:

  • performance

  • passed evaluations and pass rate

  • failed evaluations and fail rate

  • critical criterion and fail rate

The x-axis displays the dates on which evaluations were completed. The left-hand y-axis displays the metric as a percentage, and the right-hand y-axis displays the metric as a count.

To see the exact metric number, hover over the histogram bar.

On the top left, you can see the two metrics for the entire time period covered by the bar chart on the top right. The table at the bottom displays the two metrics by criterion.

Clicking on a histogram bar will open a new tab to the Evaluations page with your filters applied, allowing you to identify exactly which evaluations contributed to that metric.

This page also shows a heat map of each of the following:

  • performance by criterion

  • performance by criterion group

  • performance by reviewee groups

  • performance by individual reviewees

  • performance by ticket group

Each score is color-coded as defined in Workflows > Reports > Average score scale midpoint. For example, with the default midpoint of 50%, the color scale is as follows:

  • Green: > 75%

  • Light Red: > 50% and ≤ 75%

  • Red: ≤ 50%
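The default scale above amounts to a simple threshold check. The sketch below is illustrative only (the function name and return values are hypothetical, not product API), assuming the default 50% midpoint and 75% upper threshold:

```python
def score_color(score: float) -> str:
    """Map an average score (0.0-1.0) to the default color scale:
    green above 75%, light red above 50% up to 75%, red at or below 50%."""
    if score > 0.75:
        return "green"
    if score > 0.50:  # default midpoint
        return "light red"
    return "red"

print(score_color(0.80))  # green
print(score_color(0.60))  # light red
print(score_color(0.40))  # red
```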

Clicking on a score will open a new tab to the Evaluations page with your filters applied, allowing you to identify exactly which evaluations contributed to that score.

The Groups tab lets you view the performance of each group by metric, criterion, or criterion group.

The Individuals tab lets you view the performance of individual reviewees.

The Ticket Groups tab lets you view the performance of each ticket group.

The Evaluation Insights tab lets you view your team's performance by criterion. You can identify trends or issues that might need immediate attention and break the results down by group or individual. This shows which groups or individuals contribute most to each root cause and helps you understand whether an issue is driven by specific individuals or is an org-level issue.
