
Field Notes - Dec 11, '25
Executive Signals
- Buttons beat boxes: low-friction taps lift qualified leads despite fewer forms
- Routers over brains: LLMs choose branches; deterministic flows keep QA, compliance, predictability
- Evidence on failure: conditional capture trims latency and storage without losing debuggability
- Kill shadow ops: co-owned infra-as-code prevents schedule drift and ghost debugging
- Adapter factory, not snowflakes: templated selectors and states harden integrations and accelerate scale
- Regex first, LLM second: deterministic extraction wins on cost, latency, and correctness
Marketing
Conversational Takeover Beats Static Forms
People convert when cognitive load is low. Let users advance with simple taps instead of typing into blank boxes. Expect legacy “form submissions” to fall even as total qualified leads rise. Re-baseline reporting so stakeholders focus on qualified volume and quality, not the legacy form line.
- Offer both paths: conversational assistant and legacy form
- Make “total qualified leads” the KPI; de-emphasize form-only trends
- Instrument drop-off per step; prune early free-text prompts (see the drop-off sketch after this list)
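A minimal sketch of per-step drop-off from funnel events; the event shape and step names are illustrative assumptions, not a prescribed schema.

```python
from collections import Counter

def step_dropoff(events: list[tuple[str, str]], steps: list[str]) -> dict[str, float]:
    """Per-step drop-off from (session_id, step_reached) events.
    Drop-off at step i = 1 - (sessions reaching step i+1) / (sessions reaching step i)."""
    reached = Counter(step for _, step in set(events))  # dedupe repeated events per session
    return {
        current: 1 - reached[nxt] / reached[current]
        for current, nxt in zip(steps, steps[1:])
        if reached[current]
    }

# Illustrative funnel: the steepest drop-off points to prompts worth pruning.
print(step_dropoff(
    [("s1", "greeting"), ("s1", "budget_tap"), ("s1", "qualified"),
     ("s2", "greeting"), ("s2", "budget_tap"),
     ("s3", "greeting")],
    ["greeting", "budget_tap", "qualified"],
))
```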
Engineering
Use AI as the Router, Keep Execution Deterministic
Treat the assistant as a branch selector, not the system of record: let an LLM pick the next predefined step while execution stays in explicit if/then logic. You get predictable outcomes, simpler QA, and compliance-friendly, replayable logs. A minimal routing sketch follows the bullets.
- Define a finite action set; log the chosen branch and rationale
- Gate on confidence; fall back to a safe path or form below threshold
- Version the flowchart; ship changes behind flags
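A minimal routing sketch, assuming a fixed action enum and an injected llm_choose_branch callable that returns a structured {action, confidence, rationale} dict; the callable is a placeholder, not a specific model client.

```python
from dataclasses import dataclass
from enum import Enum

FLOW_VERSION = "2025-12-11"        # version the flowchart; bump behind a flag
CONFIDENCE_THRESHOLD = 0.75

class Action(Enum):                 # finite, predefined action set
    QUALIFY = "qualify"
    SCHEDULE_DEMO = "schedule_demo"
    HANDOFF_TO_FORM = "handoff_to_form"   # safe fallback path

@dataclass
class RouteDecision:
    action: Action
    confidence: float
    rationale: str

def route(user_turn: str, llm_choose_branch) -> RouteDecision:
    """The LLM only selects a branch; execution stays deterministic elsewhere."""
    raw = llm_choose_branch(user_turn, options=[a.value for a in Action])
    decision = RouteDecision(Action(raw["action"]), raw["confidence"], raw["rationale"])
    # Gate on confidence; below threshold, take the safe path instead.
    if decision.confidence < CONFIDENCE_THRESHOLD:
        decision = RouteDecision(
            Action.HANDOFF_TO_FORM,
            decision.confidence,
            f"below threshold; model said: {decision.rationale}",
        )
    # Log the chosen branch and rationale so runs are replayable.
    print({"flow_version": FLOW_VERSION, "branch": decision.action.value,
           "confidence": decision.confidence, "rationale": decision.rationale})
    return decision
```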
Capture Evidence Conditionally to Cut Latency
Full artifact capture is great for debugging but expensive in production. Save what you need when you need it: pre-submit artifacts only on failure, post-submit artifacts on success. You preserve diagnosability while lowering per-run latency and storage; a conditional-capture sketch follows the bullets.
- Flip to conditional artifacts once pass rates stabilize
- Store DOM hashes for “success” state to avoid extra captures
- Sample full captures (e.g., 1 in N runs) for ongoing QA
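A conditional-capture sketch; page stands in for the automation handle, and its content()/screenshot() methods are assumed, not a particular driver's API.

```python
import hashlib, json, random, time
from pathlib import Path

ARTIFACT_DIR = Path("artifacts")
SAMPLE_RATE = 0.05   # keep full artifacts on ~1 in 20 successful runs for ongoing QA

def capture_evidence(run_id: str, page, succeeded: bool) -> dict:
    """Cheap hash record on success, full artifacts on failure (plus a sampled slice of successes)."""
    dom = page.content()                     # assumed method on the automation handle
    record = {
        "run_id": run_id,
        "ts": time.time(),
        "succeeded": succeeded,
        "dom_sha256": hashlib.sha256(dom.encode()).hexdigest(),
    }
    keep_full = (not succeeded) or (random.random() < SAMPLE_RATE)
    out = ARTIFACT_DIR / run_id
    out.mkdir(parents=True, exist_ok=True)
    if keep_full:
        (out / "dom.html").write_text(dom)
        (out / "screenshot.png").write_bytes(page.screenshot())   # assumed method
        record["full_artifacts"] = True
    (out / "record.json").write_text(json.dumps(record))
    return record
```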
Kill Shadow DevOps Repos Before They Kill Your Launch
Launches slip when deploy manifests live in a separate, locked repo. Builders need a write or PR path to infra-as-code or they’ll chase ghosts. Co-own deploy assets to keep code and runtime aligned.
- Co-locate app and deploy manifests or enforce synchronized PRs
- Add a preflight that confirms the live manifest matches the expected commit SHA (preflight sketch below)
- Maintain a demo-mode fallback so the launch isn’t gated on the scheduler
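One possible preflight, assuming the launch commit is the current git HEAD and the deployed service reports its build SHA at a /version endpoint; both are assumptions about the setup, not a given.

```python
import json, subprocess, sys, urllib.request

def preflight_check(service_url: str) -> None:
    """Fail fast if the live deployment was not built from the commit we expect."""
    expected = subprocess.run(
        ["git", "rev-parse", "HEAD"], capture_output=True, text=True, check=True
    ).stdout.strip()
    # Assumes the service exposes a /version endpoint reporting its build SHA.
    with urllib.request.urlopen(f"{service_url}/version", timeout=5) as resp:
        live = json.load(resp).get("commit_sha", "")
    if live != expected:
        sys.exit(f"Preflight failed: live SHA {live[:8]} != expected {expected[:8]}")
    print(f"Preflight OK: live deployment matches {expected[:8]}")

if __name__ == "__main__":
    preflight_check(sys.argv[1])   # e.g. python preflight.py https://scheduler.internal
```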
Build an Adapter Factory, Not One-Offs
Every integration shares the same lifecycle: manual run, selector mapping, reference/approval capture, popup handling, then submit/confirm states. Plan for multiple test submissions per target and encode these steps into a reusable skeleton (sketched after this list).
- Templatize one adapter skeleton with slots for selectors and state checks
- Normalize synonyms (“reference” vs “approval”); centralize regexes
- Track per-adapter pass rates; promote to prod after stability thresholds
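A sketch of the shared skeleton; the AdapterSpec fields, the browser handle, and methods like open/fill/click/wait_for are illustrative assumptions about the runner, not an existing API.

```python
import re
from dataclasses import dataclass, field

# Centralized patterns: "reference" and "approval" are synonyms across targets.
REFERENCE_PATTERNS = [
    re.compile(r"(?:reference|approval)\s*(?:number|#|no\.?)?\s*[:\-]?\s*([A-Z0-9\-]{6,})", re.I),
]

@dataclass
class AdapterSpec:
    """Slots an integration must fill; everything else is shared skeleton."""
    target: str
    selectors: dict          # e.g. {"first_name": "#fname", "submit": "button[type=submit]"}
    success_states: list     # selectors/texts that confirm submission
    popup_dismissors: list = field(default_factory=list)

def extract_reference(confirmation_text: str) -> str | None:
    for pattern in REFERENCE_PATTERNS:
        if m := pattern.search(confirmation_text):
            return m.group(1)
    return None

def run_adapter(spec: AdapterSpec, browser, lead: dict) -> dict:
    """Shared lifecycle: dismiss popups, fill mapped selectors, submit, confirm, capture reference."""
    page = browser.open(spec.target)              # assumed browser interface
    for dismissor in spec.popup_dismissors:
        page.click_if_present(dismissor)
    for field_name, selector in spec.selectors.items():
        if field_name in lead:
            page.fill(selector, lead[field_name])
    page.click(spec.selectors["submit"])
    confirmed = any(page.wait_for(state) for state in spec.success_states)
    return {"target": spec.target, "confirmed": confirmed,
            "reference": extract_reference(page.content()) if confirmed else None}
```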
Prefer Deterministic Parsing; Use LLM Extraction as Backstop
Reference numbers and confirmations are usually stable strings. Regex-based extraction beats LLMs on speed, cost, and correctness when patterns hold. Bring in LLM extraction only when markup or labels vary widely.
- If deterministic extraction exceeds the error budget, enable a scoped LLM fallback (sketch below)
- Cache successful patterns per target and alert on drift
- Measure end-to-end latency; revert if LLMs push you over SLOs
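A sketch of the regex-first gate; the target key, pattern, and llm_fallback callable are illustrative, and whatever model client you use would sit behind that callable.

```python
import re
from collections.abc import Callable

# Patterns cached per target from prior successful runs (illustrative values).
CONFIRMATION_PATTERNS: dict[str, re.Pattern] = {
    "vendor_portal": re.compile(r"confirmation\s*(?:number|#)?\s*[:\-]?\s*([A-Z0-9]{6,12})", re.I),
}

def extract_confirmation(
    target: str,
    text: str,
    llm_fallback: Callable[[str], str | None] | None = None,
) -> str | None:
    """Deterministic regex first; scoped LLM fallback only when the cached pattern misses."""
    pattern = CONFIRMATION_PATTERNS.get(target)
    if pattern and (m := pattern.search(text)):
        return m.group(1)            # fast, cheap, auditable
    if llm_fallback is not None:
        return llm_fallback(text)    # only when markup or labels drift
    return None

# Usage: deterministic hit, no model call needed.
print(extract_confirmation("vendor_portal", "Your confirmation number: ABC12345"))
```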