ATOM Documentation

Regression Prevention Guide

**Phase:** 298 - Frontend-Backend Unification

**Last Updated:** 2026-04-12

---

Overview

This guide explains how to prevent regressions using automated quality gates, monitoring, and best practices established in Phase 298.

What is a Regression?

A regression is a code change that breaks existing functionality. Examples:

  • A bug fix introduces a new bug
  • A refactor unintentionally changes behavior
  • A new feature breaks an existing feature
  • Performance degrades
  • A type safety guarantee is violated

Why Prevent Regressions?

  • **User Trust:** Broken features erode user confidence
  • **Cost:** Regressions are far cheaper to prevent than to fix in production
  • **Time:** Firefighting regressions delays feature work
  • **Quality:** Every regression lowers overall code quality

---

CI/CD Quality Gates

Overview

The .github/workflows/regression-prevention.yml workflow provides automated quality gates that run on:

  • Every pull request
  • Every push to main
  • Daily scheduled run (2 AM UTC)

Jobs

Job 1: Frontend Tests (Vitest)

Runs all frontend tests with Vitest.

- name: Run frontend tests
  run: npm test -- --run --reporter=verbose

**What it checks:**

  • All frontend tests pass
  • Test execution time < 5 minutes
  • No test failures or errors

**When it fails:**

  • Fix broken tests
  • Re-enable skipped tests (remove it.skip / describe.skip)
  • Fix flaky (non-deterministic) tests
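
Flaky tests usually stem from non-deterministic inputs such as random data, time, or ordering. Pinning the source of randomness is a common fix. A minimal Python sketch (the helper name is illustrative, not from this codebase):

```python
import random

def sample_ids(population: list[int], k: int, seed: int = 42) -> list[int]:
    # A seeded RNG makes the selection reproducible, so any test that
    # consumes it passes or fails the same way on every run.
    rng = random.Random(seed)
    return rng.sample(population, k)
```

With the seed fixed, every run produces the same sample, so assertions against the result are stable.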

Job 2: Backend Tests (pytest)

Runs all backend tests with pytest.

- name: Run backend tests
  run: |
    cd backend-saas
    python3 -m pytest -v --tb=short --cov=. --cov-report=xml

**What it checks:**

  • All backend tests pass
  • Test execution time < 5 minutes
  • Coverage report generated

**When it fails:**

  • Fix broken tests
  • Fix failing assertions
  • Remove @pytest.mark.skip from intentionally skipped tests

Job 3: Coverage Check (>= 90%)

Verifies that both frontend and backend have 90%+ test coverage.

- name: Check backend coverage
  run: |
    coverage_report=$(python3 -c "...")
    if (( $(echo "$coverage_report < 90" | bc -l) )); then
      echo "::error::Backend coverage $coverage_report% is below 90% threshold"
      exit 1
    fi
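
The elided python3 -c step has to emit a bare percentage for the comparison above. One way to compute it is to read the line-rate attribute from the Cobertura-style coverage.xml that --cov-report=xml produces. A sketch, not the workflow's actual one-liner:

```python
# Extract total line coverage (as a percentage) from a Cobertura-style
# coverage.xml, the format produced by pytest-cov's --cov-report=xml.
import xml.etree.ElementTree as ET

def coverage_percent(path: str = "coverage.xml") -> float:
    root = ET.parse(path).getroot()
    # Cobertura stores line-rate as a fraction between 0 and 1.
    return round(float(root.get("line-rate")) * 100, 2)
```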

**What it checks:**

  • Frontend coverage >= 90%
  • Backend coverage >= 90%

**When it fails:**

  • Add tests for uncovered code
  • Remove unreachable code
  • Fix coverage calculation

**How to view coverage:**

# Frontend coverage
open coverage/index.html

# Backend coverage
open backend-saas/htmlcov/index.html

# View coverage report
cat backend-saas/coverage.xml

Job 4: Code Duplication Check (< 5%)

Verifies that code duplication is below 5%.

- name: Check frontend duplication
  run: |
    jscpd src --format "typescript,tsx" --threshold 5

- name: Check backend duplication
  run: |
    cd backend-saas
    pylint --recursive=y --disable=all --enable=similarities \
      --min-similarity-lines=50 .

**What it checks:**

  • Frontend duplication < 5%
  • Backend duplication < 5%

**When it fails:**

  • Remove duplicate code
  • Extract common logic to shared functions
  • Use composition over copying
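
Extracting common logic is the usual remedy. A before/after sketch in Python (the validator names are hypothetical):

```python
# Before: two near-identical validators that jscpd/pylint would flag.
def validate_agent_name(name: str) -> str:
    name = name.strip()
    if not name:
        raise ValueError("agent name must not be empty")
    return name

# After: one shared helper that both call sites delegate to.
def validate_nonempty(value: str, field: str) -> str:
    value = value.strip()
    if not value:
        raise ValueError(f"{field} must not be empty")
    return value

def validate_agent_name_v2(name: str) -> str:
    return validate_nonempty(name, "agent name")
```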

**How to fix duplication:**

# Run redundancy detection
cd backend-saas
python3 scripts/detect_redundancy.py \
  --frontend ../src \
  --backend .

# Review report
cat .planning/phases/298-frontend-backend-unification/redundancy-report.md

# Refactor to remove duplicates

Job 5: Type Safety Check (no `any` in API layer)

Verifies that there are no `any` types in the API layer.

- name: Check for `any` types in API layer
  run: |
    any_count=$(grep -r ": any\|<any>" src/lib/api --include="*.ts" --include="*.tsx" | wc -l)
    if [ "$any_count" -gt 0 ]; then
      echo "::error::Found $any_count instances of 'any' type in API layer"
      exit 1
    fi
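
For local use, the same scan can be reproduced outside CI. A Python sketch that mirrors the grep above (it counts occurrences rather than matching lines, and omits the error: any exemption that the pre-commit hook applies):

```python
# Count `any` annotations under the API layer, mirroring the CI grep.
import pathlib
import re

ANY_PATTERN = re.compile(r":\s*any\b|<any>")

def count_any(root: str = "src/lib/api") -> int:
    total = 0
    for ext in ("*.ts", "*.tsx"):
        for path in pathlib.Path(root).rglob(ext):
            total += len(ANY_PATTERN.findall(path.read_text()))
    return total
```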

**What it checks:**

  • No any types in src/lib/api/**/*.ts
  • No any types in src/lib/api/**/*.tsx

**When it fails:**

  • Replace any with proper types
  • Define interfaces for API contracts
  • Use generic types where appropriate

**How to fix:**

// Before (bad)
async function createAgent(data: any): Promise<any> { ... }

// After (good)
interface CreateAgentRequest {
  name: string;
  maturity: MaturityLevel;
  capabilities: string[];
}

interface AgentResponse {
  id: string;
  name: string;
  maturity: MaturityLevel;
  capabilities: string[];
}

async function createAgent(
  data: CreateAgentRequest
): Promise<AgentResponse> { ... }

**Exception:** error: any is acceptable in catch blocks

try {
  await riskyOperation();
} catch (error: any) {
  // error: any is acceptable here
  console.error(error.message);
}

Job 6: Contract Tests

Runs all contract tests to verify frontend-backend API alignment.

- name: Run contract tests
  run: npm test -- src/lib/api/__tests__/contract-tests --run

**What it checks:**

  • All contract tests pass
  • Request/response structures match
  • Type mappings are correct
  • Auth/tenant requirements are consistent

**When it fails:**

  • Fix type mismatches
  • Update contract tests for API changes
  • Align frontend and backend

Job 7: Integration Tests

Runs all integration tests (property-based, fuzz, concurrency, E2E).

- name: Run property-based tests
  run: npm test -- src/lib/api/__tests__/property-based --run

- name: Run fuzz tests
  run: npm test -- src/lib/api/__tests__/fuzzy --run

- name: Run concurrency tests
  run: npm test -- src/lib/api/__tests__/concurrency --run

- name: Run E2E workflow tests
  run: npm test -- src/lib/api/__tests__/integration --run

**What it checks:**

  • Property-based tests find edge cases
  • Fuzz tests find validation bugs
  • Concurrency tests find race conditions
  • E2E tests find workflow bugs

**When it fails:**

  • Fix discovered bugs
  • Update tests for behavior changes
  • Add regression tests for new bugs
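
A property-based test can also be hand-rolled without a framework: generate many random inputs and assert one invariant over all of them. A minimal Python sketch of the round-trip property such suites typically check:

```python
# Property: serializing then parsing any value yields the original value.
import json
import random

def roundtrip_holds(trials: int = 200) -> bool:
    rng = random.Random(0)  # seeded, so the property test itself is not flaky
    for _ in range(trials):
        value = {"n": rng.randint(-10**9, 10**9), "s": str(rng.random())}
        if json.loads(json.dumps(value)) != value:
            return False
    return True
```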

Job 8: Performance Benchmarks

Runs performance benchmarks to ensure no performance degradation.

- name: Run performance benchmarks
  run: npm test -- src/lib/api/__tests__/performance --run

**What it checks:**

  • API response time < threshold
  • Test execution time < threshold
  • Memory usage < threshold

**When it fails:**

  • Optimize slow code
  • Fix memory leaks
  • Add caching where appropriate
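
The thresholds above amount to a simple gate: time the operation and fail the job when it exceeds its budget. A Python sketch of that pattern:

```python
# Fail a benchmark when the best observed run exceeds its time budget.
import time

def timed(fn) -> float:
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

def within_budget(fn, budget_seconds: float, repeats: int = 5) -> bool:
    # Best-of-N reduces noise from scheduling and warm-up.
    return min(timed(fn) for _ in range(repeats)) < budget_seconds
```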

Job 9: Phase 298 Verification

Runs comprehensive Phase 298 verification script.

- name: Run Phase 298 verification
  run: |
    python3 backend-saas/scripts/verify-phase-298.py \
      --output .planning/phases/298-frontend-backend-unification/phase-298-verification-report.md

**What it checks:**

  • All 10 verification criteria pass
  • Overall phase status is PASSED

**When it fails:**

  • Review verification report
  • Fix failed criteria
  • Re-run verification

Job 10: Report Summary

Generates summary report and posts to PR.

- name: Post status to PR
  uses: actions/github-script@v7
  with:
    script: |
      const status = /* check all jobs */;
      const emoji = status ? '✅' : '❌';
      const body = `## Regression Prevention Results\n\n${emoji} **Overall Status**: ${status ? 'PASSED' : 'FAILED'}`;
      await github.rest.issues.createComment({
        owner: context.repo.owner,
        repo: context.repo.repo,
        issue_number: context.issue.number,
        body,
      });

**What it does:**

  • Checks all job results
  • Posts summary to PR
  • Fails if any job failed

---

Pre-commit Hooks

Overview

Pre-commit hooks run automatically on every git commit to catch issues before they're pushed.

Hook: Type Safety Check

Checks for any types in API layer.

# .husky/pre-commit
any_count=$(find src/lib/api -name "*.ts" -o -name "*.tsx" | \
  xargs grep -h ": any\|<any>" 2>/dev/null | \
  grep -v "error: any" | \
  grep -v "// " | \
  grep -v "/\*" | \
  wc -l | tr -d ' ')

if [ "$any_count" -gt 0 ]; then
  echo "  ❌ Found $any_count instances of \`any\` type in API layer"
  exit 1
fi

**What it checks:**

  • No any types in API layer (except error: any)

**When it fails:**

  • Replace any with proper types
  • Commit again

**How to bypass (not recommended):**

git commit --no-verify -m "feat: add feature"

Hook: Python Coverage Enforcement

Checks coverage for billing/quota files (from Phase 297-02).

# .husky/pre-commit
if [ -f "backend-saas/hooks/pre-commit" ]; then
  cd backend-saas && bash hooks/pre-commit
fi

**What it checks:**

  • Billing/quota files have 90%+ coverage
  • No decrease in coverage

**When it fails:**

  • Add tests for uncovered code
  • Commit again
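
"No decrease in coverage" implies the hook compares current numbers against a recorded baseline. A sketch of that comparison (the baseline file name and per-file layout are assumptions, not the hook's real format):

```python
# Reject a commit when any tracked file's coverage falls below its baseline.
import json

def coverage_ok(current: dict[str, float], baseline_path: str) -> bool:
    with open(baseline_path) as f:
        baseline = json.load(f)
    # Files missing from the current report count as 0% covered.
    return all(current.get(name, 0.0) >= pct for name, pct in baseline.items())
```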

---

Quality Metrics Monitoring

Overview

Quality metrics are tracked over time to detect trends and regressions early.

Metrics Tracked

  1. **Test Coverage:** Frontend and backend coverage percentage
  2. **Code Duplication:** Frontend and backend duplication percentage
  3. **Type Safety:** Count of any types in API layer
  4. **Test Execution Time:** Frontend and backend test duration
  5. **Bug Count:** Open, closed, and regression bugs
  6. **Performance:** API response time, throughput

How to View Metrics

// Get current metrics
import { qualityMetricsAPI } from '@/lib/monitoring/quality-metrics';

const report = await qualityMetricsAPI.getCurrentMetrics();
console.log('Overall score:', report.overallScore);
console.log('Metrics:', report.metrics);
console.log('Recommendations:', report.recommendations);

// Check for regressions
const regressions = await qualityMetricsAPI.checkRegressions();
if (regressions.length > 0) {
  console.error('Regressions detected:', regressions);
}

// Get metric history
const history = await qualityMetricsAPI.getMetricHistory('frontend_coverage', 30);
console.log('30-day history:', history);

// Get metrics trend
const trends = await qualityMetricsAPI.getMetricsTrend(
  ['frontend_coverage', 'backend_coverage', 'code_duplication'],
  7
);
console.log('7-day trends:', trends);

Trend Analysis

Metrics are analyzed for trends:

  • **Improving:** Metric getting better over time
  • **Degrading:** Metric getting worse over time (alert!)
  • **Stable:** Metric not changing significantly
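
One simple classifier compares the newest sample to the oldest, with a tolerance band for "stable". A Python sketch (the 1-point tolerance is an assumption, not the monitor's actual setting):

```python
def classify_trend(samples: list[float], higher_is_better: bool = True,
                   tolerance: float = 1.0) -> str:
    # Compare the newest sample to the oldest; small deltas count as stable.
    delta = samples[-1] - samples[0]
    if abs(delta) <= tolerance:
        return "stable"
    improving = delta > 0 if higher_is_better else delta < 0
    return "improving" if improving else "degrading"
```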

Alerting

Alerts are triggered when:

  • Coverage drops > 5%
  • Duplication increases > 5%
  • Type safety violations detected
  • Performance degrades > 10%
  • New regressions detected
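
Each trigger reduces to a threshold comparison between a baseline snapshot and the current one. A Python sketch of the first two rules, reading "5%" as percentage points (field names are illustrative):

```python
def check_alerts(baseline: dict[str, float],
                 current: dict[str, float]) -> list[str]:
    alerts = []
    # Coverage drops > 5 points
    if baseline["coverage"] - current["coverage"] > 5:
        alerts.append("coverage dropped more than 5 points")
    # Duplication increases > 5 points
    if current["duplication"] - baseline["duplication"] > 5:
        alerts.append("duplication rose more than 5 points")
    return alerts
```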

---

How to Investigate Regressions

Step 1: Identify the Regression

# Run verification script
cd backend-saas
python3 scripts/verify-phase-298.py

# View report
cat .planning/phases/298-frontend-backend-unification/phase-298-verification-report.md

Step 2: Find the Breaking Change

# View recent commits
git log --oneline -20

# View diff for suspicious commit
git show <commit-hash>

# Bisect to find the breaking commit
git bisect start
git bisect bad HEAD
git bisect good <last-known-good-commit>
# bisect run treats exit code 0 as good and non-zero as bad,
# so the verification script must exit non-zero on failure
git bisect run python3 scripts/verify-phase-298.py

Step 3: Fix the Regression

# Create fix branch
git checkout -b fix/regression-<issue>

# Implement fix
# ...

# Test fix
python3 scripts/verify-phase-298.py

# Commit fix
git add .
git commit -m "fix(<plan>): resolve regression in <feature>"

Step 4: Add Regression Test

// Add test to prevent regression
describe('Regression Test for Issue #123', () => {
  it('should not lose precision when serializing ACU values', async () => {
    const acu = 0.1 + 0.2;  // 0.30000000000000004
    const serialized = serializeACU(acu);
    expect(serialized).toBe('0.30');  // Precision preserved
  });
});

Step 5: Verify Fix

# Run all tests
npm test && cd backend-saas && pytest

# Run verification script
python3 scripts/verify-phase-298.py

# Push fix
git push origin fix/regression-<issue>

---

Best Practices

1. Never Skip Tests

# ❌ Bad: Skip tests
git commit --no-verify -m "feat: add feature"

# ✅ Good: Fix tests
# Fix broken tests first
git commit -m "feat: add feature"

2. Fix Regressions Immediately

# ❌ Bad: Ignore regression
# "I'll fix it later"

# ✅ Good: Fix immediately
# Create fix branch, implement fix, add test

3. Add Regression Tests

// ✅ Good: Add test for every bug fix
describe('Regression Test for Bug #456', () => {
  it('should handle Unicode characters in agent names', async () => {
    const agent = await createAgent({ name: '🤖 Agent' });
    expect(agent.name).toBe('🤖 Agent');
  });
});

4. Monitor Metrics

// ✅ Good: Check metrics regularly
const report = await qualityMetricsAPI.getCurrentMetrics();
if (report.overallScore < 80) {
  console.warn('Quality score below 80%:', report.recommendations);
}

5. Use Feature Flags

// ✅ Good: Use feature flags for risky changes
const FEATURE_ENABLE_NEW_API = process.env.FEATURE_ENABLE_NEW_API === 'true';

async function createAgent(data: CreateAgentRequest) {
  if (FEATURE_ENABLE_NEW_API) {
    return await createAgentV2(data);
  } else {
    return await createAgentV1(data);
  }
}

---

Additional Resources

Scripts

  • backend-saas/scripts/verify-phase-298.py - Comprehensive verification
  • backend-saas/scripts/detect_redundancy.py - Redundancy detection

Workflows

  • .github/workflows/regression-prevention.yml - CI/CD quality gates
  • .husky/pre-commit - Pre-commit quality checks

Monitoring

  • src/lib/monitoring/quality-metrics.ts - Quality metrics API
  • backend-saas/core/monitoring/quality_metrics.py - Quality metrics collector

---

**Status:** Regression prevention established ✅