# Next Steps

## Current State (2026-01-28)

### What's Working
- FastAPI server (port 32406) with comprehensive OpenAPI documentation
- FastHTML map visualization (port 32405)
- DuckDB with spatial extension (441 congressional districts)
- Redis caching with ~200x speedup on cached requests
- Wikipedia endpoint with file-based 48hr cache
- Geographic layers system (counties, state legislative districts)
- Radio station coverage data with FCC contours (15,979 contours, 38,431 coverage relationships)
- Systemd services for production deployment
- Internal tile server (tiles.nominate.ai) for basemap tiles
### Recent Completions
- Internal Tile Server Migration (2026-01-28): Migrated from CartoDB to tiles.nominate.ai (closes #3)
- Radio Coverage Import (2026-01-26): Imported updated radio data from cbradio
  - 15,979 radio contours (15,049 FM + 930 AM)
  - 38,431 district coverage relationships
  - All 441 districts have radio coverage
  - Fixed zero-area contour handling in spatial joins
  - See: docs/COVERAGE-IDEAS.md for use cases
- API Documentation Overhaul (2026-01-03): Full OpenAPI/Swagger documentation with:
  - Rich markdown descriptions for all endpoints
  - Operation IDs for SDK generation
  - Response examples and error schemas
  - Code samples (Python, JavaScript, cURL)
  - Field-level documentation with examples
## Priority Queue

### P0 - Critical

#### 1. Wikipedia Cache Pre-Warm Endpoint

GitHub Issue: #1 - Assigned to: @me
Add bulk pre-warm endpoint for Wikipedia data to avoid cold-cache latency.
Requirements:

- Background execution (return 202 Accepted)
- Concurrency control (3-5 parallel requests)
- Progress tracking endpoint
- Rate limiting for Wikipedia API
- Admin authentication
Impact: Eliminates 5-7 sec latency on cache misses
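As a rough shape for this work, here is a minimal sketch of a 202-style pre-warm endpoint with semaphore-based concurrency and an in-memory progress store. `fetch_wiki`, `ALL_DISTRICT_IDS`, and the route paths are illustrative placeholders, not the project's actual names; admin auth and Wikipedia rate limiting are left out.

```python
# Hypothetical sketch: bulk pre-warm endpoint returning 202 with background execution.
import asyncio
import uuid

from fastapi import BackgroundTasks, FastAPI, status

app = FastAPI()
ALL_DISTRICT_IDS = ["1903", "1904"]  # placeholder; the real list comes from DuckDB
_jobs: dict[str, dict] = {}          # in-memory progress tracking


async def fetch_wiki(district_id: str) -> None:
    """Placeholder for the real Wikipedia fetch + cache write."""
    await asyncio.sleep(0.1)


async def _prewarm(job_id: str, concurrency: int = 4) -> None:
    sem = asyncio.Semaphore(concurrency)  # 3-5 parallel requests per the requirements

    async def one(district_id: str) -> None:
        async with sem:
            await fetch_wiki(district_id)
            _jobs[job_id]["done"] += 1

    await asyncio.gather(*(one(d) for d in ALL_DISTRICT_IDS))
    _jobs[job_id]["status"] = "complete"


@app.post("/api/v1/admin/wiki/prewarm", status_code=status.HTTP_202_ACCEPTED)
async def start_prewarm(background_tasks: BackgroundTasks) -> dict:
    job_id = uuid.uuid4().hex
    _jobs[job_id] = {"status": "running", "done": 0, "total": len(ALL_DISTRICT_IDS)}
    background_tasks.add_task(_prewarm, job_id)
    return {"job_id": job_id}


@app.get("/api/v1/admin/wiki/prewarm/{job_id}")
async def prewarm_progress(job_id: str) -> dict:
    return _jobs.get(job_id, {"status": "unknown"})
```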
### P1 - High Priority

#### 2. Pre-built Asset Bundles
Generate static gzipped bundles served via nginx for instant client loads.
Files to create:
- scripts/build_release.py
- data/releases/v1.0.0/manifest.json
Impact: Instant load after first client download
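A minimal sketch of what `scripts/build_release.py` might do, assuming bundles are plain JSON payloads; the bundle names, export source, and manifest fields are assumptions, not a settled design:

```python
# Hypothetical sketch of scripts/build_release.py: write gzipped JSON bundles plus
# a manifest with sizes and checksums, so nginx can serve them statically.
import gzip
import hashlib
import json
from pathlib import Path

VERSION = "v1.0.0"
OUT = Path("data/releases") / VERSION


def write_bundle(name: str, payload: dict) -> dict:
    """Gzip one bundle and return its manifest entry."""
    raw = json.dumps(payload, separators=(",", ":")).encode()
    path = OUT / f"{name}.json.gz"
    path.write_bytes(gzip.compress(raw, compresslevel=9))
    return {
        "file": path.name,
        "bytes": path.stat().st_size,
        "sha256": hashlib.sha256(raw).hexdigest(),
    }


def main() -> None:
    OUT.mkdir(parents=True, exist_ok=True)
    # Placeholder payload; the real script would export layers from DuckDB.
    bundles = {"districts": {"type": "FeatureCollection", "features": []}}
    manifest = {name: write_bundle(name, data) for name, data in bundles.items()}
    (OUT / "manifest.json").write_text(json.dumps(manifest, indent=2))


if __name__ == "__main__":
    main()
```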
#### 3. Background Cache Warmer
Scheduled job to keep Wikipedia cache warm automatically.
Files to create:
- api/jobs/cache_warmer.py
- systemd/cbdistricts-cache-warmer.timer
Approach: Run nightly to refresh expiring Wikipedia entries
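A minimal sketch of the nightly job, assuming it would be invoked by the `cbdistricts-cache-warmer.timer` unit; `list_cached_entries` and `refresh_entry` are placeholders for the project's real cache helpers:

```python
# Hypothetical sketch of api/jobs/cache_warmer.py: refresh Wikipedia cache
# entries that are close to expiring, so clients never hit a cold cache.
import asyncio
import time

TTL_SECONDS = 48 * 3600      # matches the 48hr Wikipedia file cache
REFRESH_MARGIN = 6 * 3600    # refresh entries expiring within 6 hours


def list_cached_entries() -> list[tuple[str, float]]:
    """Placeholder: return (district_id, cached_at_epoch) pairs."""
    return []


async def refresh_entry(district_id: str) -> None:
    """Placeholder: re-fetch Wikipedia data and rewrite the cache file."""
    await asyncio.sleep(0)


async def main() -> None:
    now = time.time()
    expiring = [
        district_id
        for district_id, cached_at in list_cached_entries()
        if now - cached_at > TTL_SECONDS - REFRESH_MARGIN
    ]
    for district_id in expiring:  # sequential on purpose: gentle on the Wikipedia API
        await refresh_entry(district_id)


if __name__ == "__main__":
    asyncio.run(main())
```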
### P2 - Medium Priority

#### 4. Migrate to Internal Tile Server ✅ COMPLETED
GitHub Issue: #3 - Closed
Migrated from external CartoDB tiles to internal https://tiles.nominate.ai.
Changes made:
- web/static/js/map.js - Updated tile URLs to use tiles.nominate.ai
- web/server.py - Updated docstring
- CLAUDE.md - Updated tile server documentation
#### 5. Expand Test Coverage

Currently only the Wikipedia service has tests.

Targets:

- District service unit tests
- Demographics service tests
- API integration tests
- Cache decorator tests
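As a starting point, a minimal sketch of an API integration test using FastAPI's `TestClient`. The endpoint paths come from the Quick Commands section below; the `api.main` module path and the response-body assertions are assumptions, since neither is shown in this document:

```python
# Hypothetical sketch of an API integration test.
from fastapi.testclient import TestClient

from api.main import app  # assumed module path for the FastAPI app

client = TestClient(app)


def test_health() -> None:
    resp = client.get("/api/v1/health")
    assert resp.status_code == 200


def test_district_lookup() -> None:
    resp = client.get("/api/v1/districts/1903")
    assert resp.status_code == 200
    assert resp.json()  # non-empty body; exact fields depend on the schema
```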
#### 6. CI/CD Pipeline
No automated testing on push.
Requirements:

- GitHub Actions workflow
- Run tests on PR
- Lint with ruff
- Type check with mypy
#### 7. Admin Authentication
Secure admin endpoints (cache management, data refresh).
Options:

- API key in header
- Integrate with PIN Gate auth
- IP whitelist for localhost only
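Of these, the API-key option is the easiest to sketch. A minimal version as a FastAPI dependency; the `X-Admin-Key` header name and `CBDISTRICTS_ADMIN_KEY` env var are illustrative, and PIN Gate integration would replace `check_api_key` with a call into that auth layer instead:

```python
# Hypothetical sketch of the API-key-in-header option.
import hmac
import os

from fastapi import Depends, FastAPI, HTTPException, Security
from fastapi.security import APIKeyHeader

app = FastAPI()
api_key_header = APIKeyHeader(name="X-Admin-Key", auto_error=False)


def check_api_key(key: str | None = Security(api_key_header)) -> None:
    expected = os.environ.get("CBDISTRICTS_ADMIN_KEY", "")
    # Constant-time comparison avoids leaking key prefixes via timing.
    if not key or not expected or not hmac.compare_digest(key, expected):
        raise HTTPException(status_code=401, detail="Admin key required")


@app.delete("/api/v1/cache", dependencies=[Depends(check_api_key)])
def clear_cache() -> dict:
    return {"cleared": True}  # placeholder for the real cache-clearing logic
```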
### P3 - Low Priority / Backlog

#### 8. Radio Data Enhancements
From docs/COVERAGE-IDEAS.md:
- Station format import (genre data from cbradio)
- Election data overlay (Cook PVI, historical results)
- Population-weighted metrics
- Owner/network analysis for bulk media buys
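As one example of the population-weighted idea, a hedged DuckDB query sketch; the `coverage_pct` and `population` column names are assumptions about the schema, which isn't shown here:

```python
# Hypothetical sketch: rank stations by population-weighted district reach.
import duckdb

conn = duckdb.connect("data/output/cbdistricts.duckdb", read_only=True)
rows = conn.execute(
    """
    SELECT c.station_id,
           SUM(d.population * c.coverage_pct) AS weighted_reach
    FROM radio_district_coverage AS c
    JOIN demographics AS d USING (district_id)
    GROUP BY c.station_id
    ORDER BY weighted_reach DESC
    LIMIT 10
    """
).fetchall()
for station_id, reach in rows:
    print(f"{station_id}: {reach:,.0f}")
```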
#### 9. Wikipedia Regex Improvements
Rep/party extraction could be more robust.
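The current pattern isn't shown in this document, so the following is only one possible hardening: tolerate middle names/initials and match several party spellings.

```python
# Hypothetical sketch of a more tolerant rep/party extraction.
import re

PARTY_RE = re.compile(
    r"(?P<name>[A-Z][\w.'\-]+(?:\s[A-Z][\w.'\-]+){1,3})\s*"
    r"\((?P<party>Democratic|Democrat|Republican|Independent)\)"
)

text = "The district is represented by Jane Q. Doe (Democratic)."
m = PARTY_RE.search(text)
if m:
    print(m.group("name"), "|", m.group("party"))  # Jane Q. Doe | Democratic
```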
#### 10. Isolated Virtualenv

Currently using the shared nominates env.
#### 11. Add mypy Type Checking
Config exists but not enforced in CI.
## Future Data Classes
The layer architecture supports additional polygon types:
| Data Class | Source | Status |
|---|---|---|
| Congressional Districts (119th) | Census TIGER | ✅ Complete |
| Michigan Counties | Census TIGER | ✅ Complete |
| Michigan State House | Census TIGER | ✅ Complete |
| Michigan State Senate | Census TIGER | ✅ Complete |
| Radio Coverage Contours | FCC | ✅ Complete (15,979) |
| State Legislative (Other States) | Census TIGER | 🔜 Planned |
| County Boundaries (All States) | Census TIGER | 🔜 Planned |
| Municipal Boundaries | Census TIGER | 📋 Backlog |
Each class follows the same pattern:

- Polygons (GeoJSON/GeoParquet in DuckDB)
- Associated metadata
- Redis cache + optional pre-built bundles
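A minimal sketch of the first step of that pattern for a new layer; the input path and table name are illustrative, and the real onboarding steps live in LAYER-ONBOARDING-GUIDE.md:

```python
# Hypothetical sketch: register a new polygon layer from GeoParquet in DuckDB.
import duckdb

conn = duckdb.connect("data/output/cbdistricts.duckdb")
conn.execute("INSTALL spatial")
conn.execute("LOAD spatial")

# 1. Polygons: load the layer's geometries from a GeoParquet export.
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS municipal_boundaries AS
    SELECT * FROM read_parquet('data/input/municipal_boundaries.parquet')
    """
)
# 2. Metadata travels in the same table (names, FIPS codes, etc.).
# 3. Caching: endpoint handlers wrap reads in the existing Redis decorator,
#    following the invalidate_cache('radio:*') pattern under Quick Commands.
```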
## Documentation Status
| Document | Status |
|---|---|
| API /docs | ✅ Complete |
| README.md | ✅ Complete |
| CLAUDE.md | ✅ Complete |
| MAP-DATA-INGEST.md | ✅ Complete |
| LAYER-ONBOARDING-GUIDE.md | ✅ Complete |
| PERFORMANCE-ARCHITECTURE.md | ✅ Complete |
| systemd-deployment.md | ✅ Complete |
| COVERAGE-IDEAS.md | ✅ New (2026-01-26) |
## Quick Commands
```bash
# Check service status
sudo systemctl status cbdistricts-api cbdistricts-web

# View logs
sudo journalctl -u cbdistricts-api -f

# Restart services
sudo systemctl restart cbdistricts-api cbdistricts-web

# Check cache status
curl http://localhost:32406/api/v1/cache
curl http://localhost:32406/api/v1/cache/stats

# Test API endpoints
curl http://localhost:32406/api/v1/health
curl http://localhost:32406/api/v1/districts/1903
curl http://localhost:32406/api/v1/districts/1903/wiki
curl http://localhost:32406/api/v1/radio/stats

# Clear radio cache after data refresh
python3 -c "
import sys; sys.path.insert(0, '.')
from api.cache.decorators import invalidate_cache
invalidate_cache('radio:*')
"

# Database stats
python3 -c "
import duckdb
conn = duckdb.connect('data/output/cbdistricts.duckdb', read_only=True)
for t in ['districts', 'demographics', 'radio_stations', 'radio_contours', 'radio_district_coverage']:
    try:
        c = conn.execute(f'SELECT COUNT(*) FROM {t}').fetchone()[0]
        print(f'{t}: {c:,}')
    except Exception:
        pass
"

# View API documentation
open https://districts.nominate.ai/docs
```
Last updated: 2026-01-28 (Internal tile server migration completed)