10 min read

Optimizing Your Graders

Fine-tune scoring criteria based on conversion data to improve lead quality predictions over time.

Overview

Grader optimization is an ongoing process of analyzing performance data, identifying patterns, and adjusting criteria to improve lead quality predictions. This guide covers systematic approaches to optimization.

Performance Analysis

Key Metrics to Monitor

| Metric             | Target   | Description                                          |
|--------------------|----------|------------------------------------------------------|
| Precision          | >75%     | Percentage of high-scored leads that convert         |
| Recall             | >80%     | Percentage of converting leads that were scored high |
| Conversion Rate    | Varies   | Overall lead-to-customer conversion rate             |
| Score Distribution | Balanced | Proper spread across Hot/Warm/Cold categories        |
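The precision and recall targets above can be computed directly from tracked conversion outcomes. A minimal sketch, assuming each lead record carries a score and a conversion flag (field names and the 80-point Hot threshold are illustrative):

```python
# Sketch: compute precision and recall for "Hot"-scored leads from
# conversion outcomes. Field names and threshold are illustrative.
def grader_metrics(leads, hot_threshold=80):
    """Precision: share of Hot-scored leads that converted.
       Recall: share of converted leads that were scored Hot."""
    hot = [l for l in leads if l["score"] >= hot_threshold]
    converted = [l for l in leads if l["converted"]]
    true_pos = [l for l in hot if l["converted"]]
    precision = len(true_pos) / len(hot) if hot else 0.0
    recall = len(true_pos) / len(converted) if converted else 0.0
    return precision, recall

leads = [
    {"score": 87, "converted": True},
    {"score": 91, "converted": False},
    {"score": 62, "converted": True},
    {"score": 45, "converted": False},
]
precision, recall = grader_metrics(leads)  # 0.5, 0.5
```

Recomputing these weekly over a rolling window is usually enough to spot drift before it hurts conversion rates.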

Data Collection

// Track conversion outcomes
{
  "submissionId": "sub_xyz789",
  "initialScore": 87,
  "convertedToCustomer": true,
  "conversionDate": "2023-10-30T14:30:00Z",
  "revenue": 25000,
  "conversionTime": 14 // days
}

A/B Testing Framework

Split Testing Setup

  1. Define Hypothesis - "Increasing job title weight will improve precision"
  2. Create Test Groups - 50% Control, 50% Variant
  3. Set Success Metrics - Primary: conversion rate, Secondary: sales velocity
  4. Determine Sample Size - Minimum 1,000 leads per group
  5. Set Test Duration - Typically 30-60 days
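The group-assignment step works best when it is deterministic, so a lead always lands in the same bucket no matter how many times it is scored. A minimal sketch using a hash-based 50/50 split (the function name is hypothetical):

```python
import hashlib

# Sketch: deterministic 50/50 assignment of leads to control/variant.
# Hashing the submission ID keeps a given lead in the same group forever,
# which avoids contaminating the test with re-assigned leads.
def assign_group(submission_id: str) -> str:
    digest = hashlib.sha256(submission_id.encode()).hexdigest()
    return "variant" if int(digest, 16) % 2 else "control"
```

Because assignment depends only on the ID, the split survives restarts and re-scoring without any stored state.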

Example A/B Test

{
  "test_name": "Job Title Weight Increase",
  "hypothesis": "VP/Director leads convert 2x better than others",
  "control_group": {
    "job_title_weight": 15,
    "performance_baseline": "65% precision"
  },
  "variant_group": {
    "job_title_weight": 25, 
    "expected_improvement": "75% precision"
  }
}
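To decide whether the variant actually beat the control, a standard two-proportion z-test is enough. A stdlib-only sketch (the sample counts below are illustrative, sized to match the 1,000-leads-per-group minimum):

```python
import math

# Sketch: two-proportion z-test comparing variant vs. control conversion
# rates. A z above ~1.96 is significant at the two-sided 95% level.
def z_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 23% control vs. 29% variant conversion over 1,000 leads each
z = z_test(conv_a=230, n_a=1000, conv_b=290, n_b=1000)  # ~3.06
```

Here z is well above 1.96, so a lift of this size over these sample sizes would justify promoting the variant.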

Optimization Techniques

1. Feature Importance Analysis

# Analyze which fields predict conversions best
feature_importance = {
    'email_domain': 0.28,
    'job_title': 0.22, 
    'company_size': 0.18,
    'industry': 0.15,
    'form_completion': 0.12,
    'geography': 0.05
}

# Focus optimization on top predictive features
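If you do not have a trained model handy to produce importances like the ones above, a rough first pass is per-value conversion-rate lift: how much better or worse each feature value converts than the overall rate. A minimal sketch with illustrative field names (a real model, e.g. gradient boosting, would also capture interactions):

```python
from collections import defaultdict

# Sketch: crude per-feature signal estimate. For each value of a feature,
# compare its conversion rate against the overall conversion rate.
def conversion_lift(leads, feature):
    overall = sum(l["converted"] for l in leads) / len(leads)
    by_value = defaultdict(list)
    for l in leads:
        by_value[l[feature]].append(l["converted"])
    return {v: (sum(c) / len(c)) - overall for v, c in by_value.items()}

leads = [
    {"job_title": "VP", "converted": True},
    {"job_title": "VP", "converted": True},
    {"job_title": "Analyst", "converted": False},
    {"job_title": "Analyst", "converted": False},
]
lift = conversion_lift(leads, "job_title")  # {"VP": 0.5, "Analyst": -0.5}
```

Large positive or negative lifts flag the feature values worth weighting more heavily (or penalizing) in the grader.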

2. Threshold Tuning

// Adjust score thresholds based on sales capacity
const currentMetrics = {
  hotLeads: 127,      // scored Hot per week
  salesCapacity: 100, // leads sales can handle per week
  conversionRate: 0.23
};

// Raise the Hot threshold if sales is overwhelmed
// (adjustThreshold and currentThreshold come from your grader configuration)
if (currentMetrics.hotLeads > currentMetrics.salesCapacity) {
  adjustThreshold('hot', currentThreshold + 5);
}

3. Seasonal Adjustments

{
  "seasonal_modifiers": {
    "Q4": {
      "budget_keywords": +10,
      "urgency_indicators": +15,
      "reason": "Year-end buying surge"
    },
    "summer": {
      "vacation_indicators": -5,
      "reason": "Slower decision making"
    }
  }
}
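Applying modifiers like these at scoring time can be as simple as a lookup keyed by season. A sketch mirroring the config above; the month-to-season mapping and signal names are assumptions for illustration:

```python
# Sketch: apply seasonal score modifiers. The modifier table mirrors the
# JSON config above; the month-to-season mapping is an assumption.
SEASONAL_MODIFIERS = {
    "Q4": {"budget_keywords": 10, "urgency_indicators": 15},
    "summer": {"vacation_indicators": -5},
}

def season_for(month: int):
    if month in (10, 11, 12):
        return "Q4"
    if month in (6, 7, 8):
        return "summer"
    return None

def apply_seasonal(base_score: int, signals: set, month: int) -> int:
    season = season_for(month)
    if season is None:
        return base_score
    bonus = sum(v for k, v in SEASONAL_MODIFIERS[season].items() if k in signals)
    return base_score + bonus

score = apply_seasonal(70, {"budget_keywords"}, month=11)  # 80
```

Keeping the modifiers in config rather than code makes the quarterly recalibration step a data change instead of a deploy.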

Continuous Improvement Process

Weekly Review

  • Performance Dashboard - Check key metrics trends
  • Top/Bottom Performers - Analyze best and worst scoring patterns
  • False Positives/Negatives - Review misclassified leads

Monthly Optimization

  • Criteria Adjustment - Update weights based on data
  • New Feature Testing - Try additional form fields or derived data
  • Threshold Recalibration - Adjust Hot/Warm/Cold boundaries

Quarterly Deep Dive

  • Model Retraining - For AI-powered graders
  • Feature Engineering - Add new data sources or calculations
  • Major Criteria Overhaul - Significant changes to scoring logic

Common Optimization Patterns

Pattern 1: Score Inflation

Problem: All leads scoring too high
Solution: Increase scoring thresholds or add negative criteria
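A simple guardrail for this pattern is to alert when the Hot share of recent scores exceeds a ceiling. A sketch (the 80-point threshold and 30% ceiling are illustrative, not product defaults):

```python
# Sketch: flag score inflation when too many recent leads land in the
# Hot band. Threshold and ceiling values are illustrative.
def inflation_check(scores, hot_threshold=80, max_hot_share=0.30):
    hot_share = sum(s >= hot_threshold for s in scores) / len(scores)
    return hot_share > max_hot_share, hot_share

inflated, share = inflation_check([85, 90, 88, 92, 60, 81])  # share = 5/6
```

Running this over each week's scores turns the pattern into an automatic alert rather than something discovered in a quarterly review.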

Pattern 2: Low Precision

Problem: High-scored leads not converting
Solution: Analyze false positives, tighten criteria for high scores

Pattern 3: Missing Good Leads

Problem: Converting leads scored low initially
Solution: Review false negatives, add positive criteria

Tools & Resources

Built-in Analytics

  • Conversion Tracking - Automatic outcome analysis
  • Score Distribution - Visual breakdown of lead grades
  • A/B Test Manager - Automated split testing platform
  • Performance Alerts - Notifications when metrics decline

External Integration

// Send performance data to external analytics
// (grader and analyticsAPI are assumed to be initialized elsewhere)
const performanceData = {
  period: 'last_30_days',
  metrics: await grader.getPerformanceMetrics(),
  conversions: await grader.getConversionData()
};

await analyticsAPI.send(performanceData);

Best Practices

1. Start with Baselines

Establish performance benchmarks before making changes.

2. Change One Thing at a Time

Isolate variables to understand impact of each change.

3. Statistical Significance

Wait for adequate sample sizes before drawing conclusions.
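The usual rule of thumb is the two-proportion sample-size formula at 95% confidence and 80% power, which is where minimums like the 1,000 leads per group above come from. A sketch:

```python
import math

# Sketch: minimum per-group sample size to detect a lift in conversion
# rate, via the standard two-proportion formula. Defaults correspond to
# 95% confidence (z=1.96) and 80% power (z=0.84).
def sample_size(p_base, p_target, z_alpha=1.96, z_beta=0.84):
    p_bar = (p_base + p_target) / 2
    effect = abs(p_target - p_base)
    n = 2 * ((z_alpha + z_beta) ** 2) * p_bar * (1 - p_bar) / effect ** 2
    return math.ceil(n)

# Detecting a lift from 23% to 28% conversion
n = sample_size(0.23, 0.28)  # 1192 leads per group
```

Smaller expected lifts require sharply larger samples, so size the test before launching it rather than stopping when the numbers look good.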

4. Document Changes

Keep detailed records of all optimization efforts and results.

5. Involve Sales Team

Get feedback from reps on lead quality and scoring accuracy.

Next Steps

After implementing optimization: