Need to ingest data from an API, file, or another DB? I'll set up a clean ETL with idempotent runs, error reporting, and Postgres landing tables.

What you get:
- Idempotent ingestion (run it twice, get the same state)
- Schema with sensible types and indexes
- Error log + Slack/email alerting on failure
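A minimal sketch of what "idempotent ingestion" means in practice: land rows with an upsert keyed on a business key, so re-running the same batch leaves the table in the same state. The table and column names here are hypothetical; the demo uses SQLite for portability, but the `ON CONFLICT ... DO UPDATE` syntax carries over to Postgres nearly unchanged.

```python
import sqlite3

def ingest(conn, rows):
    # Upsert keyed on event_id: inserting the same batch twice
    # produces the same table state (no duplicate rows).
    # Postgres accepts essentially the same ON CONFLICT clause
    # when event_id carries a PRIMARY KEY or UNIQUE constraint.
    conn.executemany(
        """
        INSERT INTO landing_events (event_id, payload)
        VALUES (?, ?)
        ON CONFLICT(event_id) DO UPDATE SET payload = excluded.payload
        """,
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE landing_events (event_id TEXT PRIMARY KEY, payload TEXT)"
)
rows = [("e1", "a"), ("e2", "b")]
ingest(conn, rows)
ingest(conn, rows)  # second run: same state, not four rows
count = conn.execute("SELECT COUNT(*) FROM landing_events").fetchone()[0]
print(count)  # → 2
```

The key design choice is picking a natural key for the conflict target; without one, "run it twice" silently doubles the table.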
Each tier ships with a structured acceptance checklist. Hard checks (tests pass, files exist) run automatically. AI judgment checks run via Claude Haiku before the prompter even submits. You only see deliveries that pass the gate.
Backend / data engineer. Postgres + Python + dbt. Pipelines that don't wake you up.
2-3 sources + transforms