
Field Notes - Jan 15, '26
Executive Signals
- Email is the new portal: treat the inbox as an API; check portals only on SLA breaches
- One source of truth: CRM owns states, dashboards stay debug-only and temporary
- Deterministic beats guesswork: parse subject/body, then attachment, then external link, then portal login
- Spend the cheap tokens: escalate models only when mappings consistently fail
- Queues before databases: webhook to queue, binaries in object storage, metadata in the DB
CEO
CRM Owns Status, Period
Make the CRM the single system of record for status, codes, and human-readable reasons. Engineering dashboards are for ops and debugging, not parallel truth. Preserve the original mailbox as the audit trail; forward copies to ingestion so humans can double-check without mutating history.
- Publish the exact fields the CRM owns and enforce them
- Push statuses and reasons into the CRM and run workflows there
- Keep DevOps-owned forwarding; leave the source inbox untouched
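The field-ownership contract above can be enforced at the ingestion boundary. A minimal sketch, assuming a hypothetical set of CRM-owned field names (`compliance_status`, `status_code`, etc. are illustrative, not the real schema):

```python
# Hypothetical CRM-owned fields; ingestion rejects anything outside the contract
# so dashboards can never sneak in a parallel source of truth.
CRM_OWNED_FIELDS = {"compliance_status", "status_code", "reason_text", "updated_at"}

def validate_update(update: dict) -> dict:
    """Allow only fields the CRM contract owns; raise on anything else."""
    unknown = set(update) - CRM_OWNED_FIELDS
    if unknown:
        raise ValueError(f"Fields not owned by CRM contract: {sorted(unknown)}")
    return update
```

Rejecting unknown fields loudly, rather than silently dropping them, surfaces drift between ingestion and the published contract early.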
Thresholds Over Polling for OEM Portals
Default to email ingestion for compliance statuses. Trigger a portal check only when an OEM-specific SLA is breached, not as background polling. Build a thin adapter interface now so brand-specific portal work can be added just-in-time, not rushed.
- Define per-OEM SLAs and trigger portal checks only on breach
- Implement the adapter interface; add brand adapters only when needed
- Log every fallback to refine SLAs and flag systemic issues
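The breach-only trigger can be a single predicate over the expected-email window. A minimal sketch, assuming hypothetical OEM keys and SLA hours (the real values come from the per-OEM SLA table):

```python
from datetime import datetime, timedelta

# Hypothetical per-OEM SLA windows: hours within which a status email is expected.
OEM_SLA_HOURS = {"oem_a": 48, "oem_b": 72}

def sla_breached(oem: str, submitted_at: datetime, email_received: bool,
                 now: datetime) -> bool:
    """True only when the email never arrived AND the window has elapsed.

    A True result is what fires the portal check; there is no background polling.
    """
    window = timedelta(hours=OEM_SLA_HOURS.get(oem, 72))  # default window assumed
    return (not email_received) and (now - submitted_at > window)
```

Because the function is pure, every invocation can be logged with its inputs, which is exactly the data needed to refine the SLAs later.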
Product
Automate Status First, Translate Next
Ship Phase 1 by posting raw approvals, denials, and codes to the CRM directly from email. Phase 2 translates codes into plain language for the teams fixing issues. Start with the top five denial reasons per OEM and expand as patterns emerge.
- Post raw status + codes to CRM before adding translation
- Maintain a rules table mapping codes to plain language with fallbacks
- Collect covenant PDFs and annotated screenshots to ground translations
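The rules table with fallback can start as a plain lookup keyed by OEM and code. A minimal sketch with invented codes and wording (the real entries come from the covenant PDFs and screenshots):

```python
# Hypothetical denial-code translations; unknown codes fall back to the raw code
# rather than guessing, so humans always see something actionable.
DENIAL_RULES = {
    ("oem_a", "D-104"): "Insurance certificate expired; upload a current COI.",
    ("oem_a", "D-221"): "Floorplan covenant out of compliance; contact lender.",
    ("oem_b", "R17"):   "Signage audit photo missing; re-submit annotated photos.",
}

def translate(oem: str, code: str) -> str:
    """Plain-language translation with a deterministic fallback."""
    return DENIAL_RULES.get(
        (oem, code),
        f"Unrecognized code {code}; see original notice.",
    )
```

Starting with the top five codes per OEM keeps the table small enough to review by hand while coverage grows.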
Engineering
Ingestion Architecture That Ages Well
Prefer webhook to queue, not webhook to database. Store binaries in object storage and keep the database for metadata. Let queues drive workers with retries and dead-letter handling; apply retention rules to keep storage lean.
- Add idempotency keys and DLQ-backed retries
- Define retention windows and purge payloads on schedule
- Keep the source mailbox as audit; forward for ingestion
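The webhook-to-queue shape with idempotency can be sketched end to end. This uses in-memory stand-ins (a `queue.Queue` for the broker, dicts for the object store and metadata table); the split of responsibilities, not the storage backends, is the point:

```python
import hashlib
import json
import queue

work_queue: "queue.Queue[dict]" = queue.Queue()  # stand-in for SQS/RabbitMQ
object_store: dict = {}                          # stand-in for S3-style blob storage
metadata_db: dict = {}                           # stand-in for the metadata table

def handle_webhook(payload: dict, attachment: bytes) -> str:
    # Idempotency key derived from stable message fields, so broker redeliveries
    # and duplicate webhooks collapse into one unit of work.
    key = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    if key in metadata_db:
        return key  # duplicate delivery: no second enqueue, no second blob

    blob_ref = f"attachments/{key}"
    object_store[blob_ref] = attachment            # binary goes to object storage
    metadata_db[key] = {"oem": payload.get("oem"), "blob": blob_ref}
    work_queue.put({"id": key, "blob": blob_ref})  # queue drives the workers
    return key
```

Retries and dead-lettering live on the queue side; retention is a purge over `object_store` keyed by the metadata timestamps.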
Parser Matrix Beats Guesswork
Use a deterministic extraction order to cut scope and speed delivery: subject/body, then attachment, then external link, then portal login. Stop at the first bucket that yields an answer. OEMs are internally consistent, so maintain brand-specific parsers and a live matrix per OEM.
- Maintain a per-OEM grid of where approvals and denials appear
- Treat attachments separately from authenticated links; log in only if forced
- Grey out lower buckets once a higher one returns a clear result
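The bucket order above can be expressed as an ordered list of extractors where the first hit wins. A minimal sketch; the regex and the stub buckets are placeholders for the real per-OEM parsers:

```python
import re

def parse_body(text: str):
    """Bucket 1: status stated directly in subject/body (illustrative pattern)."""
    m = re.search(r"Status:\s*(APPROVED|DENIED)", text)
    return m.group(1) if m else None

def parse_attachment(text: str):
    """Bucket 2: would extract from the attached PDF; stubbed in this sketch."""
    return None

# Ordered matrix row for one OEM: higher buckets first, first result wins.
BUCKETS = [("body", parse_body), ("attachment", parse_attachment)]

def extract_status(email_text: str):
    for name, fn in BUCKETS:
        result = fn(email_text)
        if result:
            return name, result  # lower buckets greyed out: never evaluated
    return None, None            # nothing matched; fall through to SLA tracking
```

One such ordered list per OEM is the live matrix: the grid rows record which bucket each brand's approvals and denials actually land in.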
Threshold-Triggered Portal Adapters
Design a skeleton interface for portal adapters now and only implement brand adapters when the SLA policy demands it. This prevents constant polling from turning into accidental alerting and keeps scope contained while preserving a fast path for exceptions.
- Define adapter contracts and tests; defer brand implementations
- Record adapter invocations to improve thresholds and coverage
- Tie adapter execution to SLA breach events, not time-based cron
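The contract-now, implement-later shape maps naturally onto an abstract base class with a registry that stays empty until an SLA policy demands a brand. A minimal sketch with hypothetical names:

```python
from abc import ABC, abstractmethod

class PortalAdapter(ABC):
    """Contract only: brand adapters are written just-in-time, not up front."""

    @abstractmethod
    def fetch_status(self, dealer_id: str) -> dict:
        ...

class StubAdapter(PortalAdapter):
    """Safe default so breach handling works before any brand adapter exists."""

    def fetch_status(self, dealer_id: str) -> dict:
        return {"dealer": dealer_id, "status": "unknown", "source": "stub"}

ADAPTERS: dict = {}  # populated per brand only when SLA policy requires it

def on_sla_breach(oem: str, dealer_id: str) -> dict:
    """Invoked by the breach event, never by a time-based cron."""
    adapter = ADAPTERS.get(oem, StubAdapter())
    return adapter.fetch_status(dealer_id)  # each invocation is loggable
```

The abstract class is what the contract tests target, so deferred brand implementations inherit a ready-made test harness.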
Keep AI Spend Tiny
Do not use portal polling as an alerting system. Alert on SLA breaches from email expectations, then decide if a portal check and model run are warranted. Start with cheap models seeded by the top mappings and escalate only when unmatched.
- Prompt small models with top-five mappings; escalate on failure only
- Use cached-input, batch, and low-priority APIs with per-run budgets
- Tier alerts: “Delayed” after expected window; “Red” when multiple emails are missing
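The two-tier alert logic can be a small pure function. A minimal sketch, assuming a hypothetical threshold of three missing emails for "Red" (the real cutoffs belong in per-OEM config):

```python
def alert_tier(missing_emails: int, hours_overdue: float,
               red_threshold: int = 3) -> str:
    """"Delayed" once the expected window passes; "Red" when multiple emails
    are missing. Only these alerts, not polling, gate portal and model spend."""
    if missing_emails >= red_threshold:
        return "Red"
    if hours_overdue > 0:
        return "Delayed"
    return "OK"
```

Keeping the tiering separate from model invocation means a "Delayed" alert can resolve itself on the next inbound email without ever spending a token.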