Labellerr’s Quality Metrics Dashboard is a dedicated feature for monitoring and improving the performance of annotation teams. It gives managers, reviewers, and clients detailed, actionable insight into annotator and reviewer productivity, review accuracy, and quality control throughout the data labeling process. The dashboard also makes it easy to track client rejections and identify areas for improvement, ensuring higher data quality and operational efficiency.

Accessing the Quality Metrics Dashboard

  • Open any recent project in Labellerr.
  • Navigate to the Quality Metrics option from the project interface.
  • Set your desired date range to filter results for a relevant time period.

Dashboard Sections and Key Features

The dashboard consists of three main tabs:

1. Annotator Tab

Monitors the individual performance of each annotator:
  • User Name & Email: Identify who performed the annotation.
  • Total Files Annotated: The number of files each annotator worked on during the selected period.
  • Accepted Files: Count of files accepted by reviewers with no changes needed.
  • Reviewer Rejections: Files rejected by reviewers due to quality issues.
  • Client Rejections: Files rejected by clients even after review.
  • Files Skipped: How many files were skipped by the annotator.
  • Acceptance Rate: Percentage of files accepted out of total annotated, a direct measure of accuracy and reliability (see the sketch below).
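
The Acceptance Rate column follows directly from the other counts in this tab. The helper below is a minimal sketch of that relationship; the function name and zero-handling are illustrative and not part of Labellerr's product:

  def acceptance_rate(accepted_files: int, total_files_annotated: int) -> float:
      """Percentage of annotated files that reviewers accepted with no changes."""
      if total_files_annotated == 0:
          return 0.0  # avoid dividing by zero when an annotator has no files
      return 100.0 * accepted_files / total_files_annotated

  # Example: 180 accepted files out of 200 annotated
  print(f"{acceptance_rate(180, 200):.1f}%")  # 90.0%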

2. Reviewer Tab

Tracks the performance of each reviewer:
  • Reviewer Name & Email: Who reviewed the files.
  • Total Files Reviewed: Volume of work checked within the time range.
  • Accepted During Review: Number of files accepted as is.
  • Marked as Rejected: Files that the reviewer flagged for changes.
  • Client Rejections: Files that clients still rejected after the reviewer had accepted them.
  • Files Skipped: Count of files skipped or not reviewed.
  • Review Accuracy: Calculated from the share of reviewed files ultimately accepted by the client, helping identify how consistent and stringent each reviewer is (see the sketch after this list).
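
The dashboard computes Review Accuracy for you; as a minimal sketch, assuming it is the share of reviewer-accepted files that the client did not later reject, it could look like this (the names and exact formula are assumptions for illustration):

  def review_accuracy(accepted_during_review: int, client_rejections: int) -> float:
      """Share of reviewer-accepted files that the client ultimately accepted."""
      if accepted_during_review == 0:
          return 0.0  # no accepted reviews to measure against
      return 100.0 * (accepted_during_review - client_rejections) / accepted_during_review

  # Example: 150 files accepted during review, 6 later rejected by the client
  print(f"{review_accuracy(150, 6):.1f}%")  # 96.0%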

3. Client Rejected Tab

Provides detailed insights into files rejected by clients:
  • Unique File ID: Traceability to the particular file rejected.
  • Annotator Name: Who originally annotated the file.
  • Reviewer Name: Who performed the review.
  • Rejection Reason/Feedback: Client’s exact reason for rejection (if provided), helping teams learn and adjust.
  • Rejection Date: When the rejection occurred, for record-keeping and trend analysis (see the sketch after this list).
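
Once exported, these fields lend themselves to simple trend analysis. The snippet below is a sketch using pandas; the file name and column labels (Annotator Name, Rejection Reason/Feedback, Rejection Date) are assumptions about the export layout, not a documented schema:

  import pandas as pd

  # Load a hypothetical export of the Client Rejected tab
  rejections = pd.read_csv("client_rejections.csv", parse_dates=["Rejection Date"])

  # Most frequent rejection reasons per annotator, to guide targeted feedback
  by_annotator = (
      rejections.groupby(["Annotator Name", "Rejection Reason/Feedback"])
      .size()
      .sort_values(ascending=False)
  )
  print(by_annotator.head(10))

  # Weekly rejection counts, to spot whether quality is trending up or down
  weekly = rejections.set_index("Rejection Date").resample("W").size()
  print(weekly)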

Additional Features

  • CSV Download: Export all metrics as a CSV for external analysis or reporting (see the sketch after this list).
  • Date Range Selection: Custom time frames for granular performance tracking.
  • Full Traceability: Every status (from annotation to review to client acceptance/rejection) is traceable by user and file.
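
Because the metrics can be exported as CSV, team-level summaries are easy to compute outside Labellerr. A minimal sketch, assuming an Annotator tab export with columns such as User Name, Accepted Files, and Total Files Annotated (these column names are illustrative, not a documented schema):

  import pandas as pd

  # Load a hypothetical CSV export of the Annotator tab
  metrics = pd.read_csv("annotator_metrics.csv")

  # Team-wide acceptance rate across the selected date range
  team_rate = 100.0 * metrics["Accepted Files"].sum() / metrics["Total Files Annotated"].sum()
  print(f"Team acceptance rate: {team_rate:.1f}%")

  # Rank annotators by individual acceptance rate to spot who may need support
  metrics["Acceptance Rate"] = 100.0 * metrics["Accepted Files"] / metrics["Total Files Annotated"]
  print(metrics.sort_values("Acceptance Rate")[["User Name", "Acceptance Rate"]])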

Benefits

  • Spot Performance Trends: Quickly identify top performers and those needing support.
  • Enhance Quality Control: Pinpoint whether errors occur during annotation, review, or client acceptance.
  • Data-Driven Management: Use concrete metrics for team improvement, feedback, and project reporting.
  • Audit-Ready: Maintain an auditable record of all actions and decisions.

Best Practices

  • Regularly review the dashboard to catch issues early.
  • Use client feedback to retrain annotators and reviewers.
  • Export and share CSVs for collaborative analysis and quality meetings.