Overview
The Magic Wand Confidence Filtering feature lets you upload pre-annotations with confidence scores and dynamically filter them during review. It helps your annotation team focus review effort on the predictions that need the most attention while keeping the workflow efficient.
Smart Filtering
Filter annotations by confidence levels in real-time with an interactive slider
Quality Control
Focus review efforts on low-confidence predictions that need human validation
Workflow Efficiency
Accept high-confidence predictions quickly while carefully reviewing uncertain ones
How It Works
When you upload pre-annotations with confidence buckets, the Magic Wand feature automatically appears in the annotation interface, allowing annotators to filter annotations based on their confidence levels.
1. Upload with Confidence Bucket: upload your pre-annotations using the SDK with a specified confidence bucket parameter.
2. Magic Wand Appears: the slider control automatically appears in the annotation interface for files with confidence-tagged annotations.
3. Filter by Confidence: annotators use the slider to show or hide annotations based on confidence thresholds.
4. Review & Refine: focus on reviewing lower-confidence predictions while quickly accepting high-confidence ones.
Visual Workflow
The slider control appears automatically when annotations are uploaded with confidence buckets. Drag the slider to show/hide annotations based on your selected confidence threshold.
SDK Implementation
Upload Pre-annotations with Confidence Bucket
The conf_bucket parameter accepts one of three values, each corresponding to a confidence tier:
- Low Confidence
- Medium Confidence
- High Confidence
For the low-confidence bucket:
Use Case: Model predictions with low certainty (< 50% confidence) that require careful human review
Best For: Uncertain predictions, edge cases, or challenging images that need thorough human validation
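The mapping from raw model confidence scores to bucket names can be sketched as a small helper. This function is illustrative, not part of the Labellerr SDK; the thresholds follow the < 50% / 50-80% / > 80% convention used throughout this guide:

```python
# Illustrative helper (not part of the Labellerr SDK): maps a raw model
# confidence score in [0.0, 1.0] to the conf_bucket string expected at
# upload time, using this guide's 50% / 80% boundaries.
def to_conf_bucket(score: float) -> str:
    if not 0.0 <= score <= 1.0:
        raise ValueError(f"confidence score out of range: {score}")
    if score < 0.5:
        return "low"
    if score < 0.8:
        return "medium"
    return "high"

print(to_conf_bucket(0.35))  # low
print(to_conf_bucket(0.65))  # medium
print(to_conf_bucket(0.92))  # high
```

Passing the returned string as the conf_bucket value keeps your upload code and your model's score scale in one place.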
Complete Workflow Example
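A minimal sketch of such a workflow, assuming the guide's 50% / 80% thresholds. The upload_bucket function below is a hypothetical stand-in for the real SDK upload call (whose exact name and signature are not shown here); the segmentation logic is the reusable part:

```python
# Segment model predictions by confidence score, then upload each group
# with the matching conf_bucket value.
from collections import defaultdict

def segment_by_confidence(predictions):
    """Group predictions into 'low' / 'medium' / 'high' buckets."""
    buckets = defaultdict(list)
    for pred in predictions:
        score = pred["score"]
        if score < 0.5:
            buckets["low"].append(pred)
        elif score < 0.8:
            buckets["medium"].append(pred)
        else:
            buckets["high"].append(pred)
    return buckets

def upload_bucket(bucket_name, preds):
    # Hypothetical stand-in for the SDK upload call that accepts a
    # conf_bucket argument; replace with the actual client method.
    print(f"uploading {len(preds)} pre-annotations with conf_bucket={bucket_name!r}")

predictions = [
    {"label": "car", "score": 0.95},
    {"label": "bicycle", "score": 0.42},
    {"label": "person", "score": 0.67},
]

buckets = segment_by_confidence(predictions)
for name in ("low", "medium", "high"):
    if buckets[name]:
        upload_bucket(name, buckets[name])
```

Uploading each group separately is what gives the Magic Wand distinct confidence levels to filter on.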
Segment your model’s predictions by confidence score and upload each group with the matching conf_bucket value.
Confidence Bucket Guidelines
Choose the appropriate confidence bucket based on your model’s prediction confidence scores:
Low Confidence
Confidence Range: 0% - 50%
Characteristics:
- Uncertain predictions
- Multiple possible labels
- Challenging image conditions
- Edge cases
- Thorough manual review required
- Expect corrections needed
- Focus annotator attention here
Medium Confidence
Confidence Range: 50% - 80%
Characteristics:
- Likely accurate predictions
- Some ambiguity present
- Standard image conditions
- Common object types
- Quick verification recommended
- Minor adjustments expected
- Balanced review effort
High Confidence
Confidence Range: 80% - 100%
Characteristics:
- Very confident predictions
- Clear, unambiguous objects
- Ideal image conditions
- Well-trained categories
- Spot-check validation
- Minimal corrections needed
- Fast-track approval
Magic Wand Usage in Annotation Interface
Once annotations are uploaded with confidence buckets, annotators can leverage the Magic Wand slider.
Accessing the Magic Wand
- Open a file with uploaded pre-annotations
- Look for the Magic Wand icon/slider in the left sidebar
- The tool appears automatically when confidence-tagged annotations exist
Using the Confidence Slider
- Drag the slider to adjust the confidence threshold
- Annotations below the threshold are hidden
- Annotations above the threshold remain visible
- Use this to focus on reviewing specific confidence ranges
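The slider's show/hide behavior can be modeled as a simple rank filter over the three buckets. This is a sketch of the behavior described above, not the interface's actual implementation:

```python
# Model of the slider behavior: buckets are ordered low < medium < high,
# and annotations whose bucket ranks below the selected threshold are hidden.
BUCKET_RANK = {"low": 0, "medium": 1, "high": 2}

def visible_annotations(annotations, threshold):
    """Return only the annotations at or above the threshold bucket."""
    cutoff = BUCKET_RANK[threshold]
    return [a for a in annotations if BUCKET_RANK[a["conf_bucket"]] >= cutoff]

anns = [
    {"id": 1, "conf_bucket": "low"},
    {"id": 2, "conf_bucket": "medium"},
    {"id": 3, "conf_bucket": "high"},
]
print([a["id"] for a in visible_annotations(anns, "low")])     # [1, 2, 3]
print([a["id"] for a in visible_annotations(anns, "medium")])  # [2, 3]
print([a["id"] for a in visible_annotations(anns, "high")])    # [3]
```

Dragging the slider right corresponds to raising the cutoff, leaving only higher-confidence annotations visible.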
Review Workflow Optimization
Efficient Review Strategy:
- Start with low confidence threshold (show all annotations)
- Review and correct low-confidence predictions first
- Gradually increase threshold to review medium confidence
- Quickly verify high-confidence predictions
- Submit when all confidence levels are validated
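The strategy above amounts to processing buckets in ascending confidence order. An illustrative ordering helper (outside the SDK) for scripted review tooling:

```python
# Order the review queue so low-confidence annotations come first:
# correct uncertain predictions, then verify medium, then spot-check high.
REVIEW_ORDER = {"low": 0, "medium": 1, "high": 2}

def review_queue(annotations):
    return sorted(annotations, key=lambda a: REVIEW_ORDER[a["conf_bucket"]])

anns = [
    {"id": 1, "conf_bucket": "high"},
    {"id": 2, "conf_bucket": "low"},
    {"id": 3, "conf_bucket": "medium"},
]
print([a["id"] for a in review_queue(anns)])  # [2, 3, 1]
```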
Parameter Reference
conf_bucket Parameter
Specifies the confidence level for uploaded pre-annotations.
Valid Values:
- 'low': predictions with low confidence (typically < 50%)
- 'medium': predictions with medium confidence (typically 50-80%)
- 'high': predictions with high confidence (typically > 80%)
- None: no confidence filtering applied
Troubleshooting
Magic Wand Not Appearing
Symptom: The slider control doesn’t appear in the annotation interface.
Resolution Steps:
- Verify annotations were uploaded with the conf_bucket parameter specified
- Confirm the confidence bucket value was set to ‘low’, ‘medium’, or ‘high’
- Check that pre-annotations were successfully applied (no file name mismatches)
- Refresh the annotation interface
- Ensure you’re using the latest version of Labellerr
All Annotations Same Confidence
Symptom: The slider doesn’t filter anything; all annotations appear at the same level.
Resolution Steps:
- Verify you uploaded different confidence buckets in separate uploads
- Check that you didn’t upload all predictions with the same conf_bucket value
- Ensure your prediction pipeline correctly segments by confidence
- Re-upload with properly categorized confidence levels
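A quick sanity check before re-uploading: count how many predictions fall into each bucket and flag a pipeline that produces only one. This is an illustrative helper, not an SDK utility:

```python
# Count predictions per confidence bucket (50% / 80% thresholds) and
# warn when every prediction lands in a single bucket.
from collections import Counter

def bucket_distribution(scores):
    def bucket(s):
        return "low" if s < 0.5 else "medium" if s < 0.8 else "high"
    return Counter(bucket(s) for s in scores)

scores = [0.95, 0.96, 0.91, 0.88]
dist = bucket_distribution(scores)
print(dict(dist))
if len(dist) == 1:
    print("warning: every prediction landed in one bucket; "
          "check your segmentation thresholds")
```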
Incorrect Confidence Categories
Symptom: Annotations appear in the wrong confidence categories.
Resolution Steps:
- Review your confidence score thresholds in preprocessing
- Verify the mapping between confidence scores and bucket labels
- Check for any errors in your prediction segmentation logic
- Consider recalibrating your model if systematic misalignment exists
Related Documentation
Upload Pre-annotations
Complete guide to uploading pre-annotations via SDK
Create Project
Set up projects programmatically for annotation workflows
Segment Anything
Learn about Meta’s SAM integration for one-click segmentation
Support
For technical assistance with the Magic Wand Confidence Filtering feature, contact [email protected]

