Senior Data Engineer
Role Description
You’ll be the founding Data Engineering hire who lays the tracks that let our teams turn product, operations, and GTM data into decisions—fast and reliably. This role sets up our end-to-end data platform (warehouse, ingest, orchestration, BI) so every team has trustworthy, timely, self-serve insights, directly supporting our mission of resilient data everywhere.
As a Senior Data Engineer, you will…
Gather requirements from stakeholders and leverage those to architect and stand up Ditto’s modern data stack from 0→1: choose and implement the warehouse (Snowflake, Databricks, etc.), transformation, orchestration, and BI layers; establish SLAs, cost controls, and observability.
Create robust data ingest from both common and custom sources; design resilient models that serve analytics, experimentation, and operations.
Implement ELT with managed connectors and/or open-source ingestion; codify transformations with frameworks that bring testing, CI/CD, and data contracts to analytics code.
Orchestrate workflows, alerts, and retries; enforce data quality gates so broken data never reaches stakeholders.
Partner with Product, Ops, and GTM to define clear metrics and a semantic layer; enable self-serve exploration and trusted dashboards in Looker, Omni, Mode, or similar tools.
Provide pragmatic analyst support (SQL debugging, dashboard bootstraps, troubleshooting) to unblock teams while the function scales.
Help scope the next role(s) on Ditto’s Data Engineering team and support the hiring process.
What You’ll Need…
Mastery of SQL and strong data modeling instincts, with comfort modeling across diverse business domains. Proficiency in Python.
Proven ability to build (not just maintain) data platforms: selecting tools, setting standards, and delivering from first principles in a fast-changing environment.
Fluency with ELT and transformation frameworks (e.g., dbt or SQLMesh) and with orchestrators (e.g., Airflow, Dagster).
Pragmatic engineering habits: testing, version control, PR reviews, incident management, CI/CD, documentation, and observability for data.
Excellent cross-functional communication; you translate fuzzy questions into data products and measurable outcomes aligned to Ditto’s edge-first world.
Experience with ingest tools (Fivetran, Airbyte, Stitch) and reverse ETL; familiarity with common SaaS/DB sources.
Nice to haves…
Hands-on with Snowflake or Databricks, and performance/cost tuning at scale.
Built transformations with dbt or SQLMesh and managed semantic layers/metrics for BI (Looker, Omni, Mode).
Tooling we use and value: Snowflake or Databricks; Fivetran, Airbyte, or Stitch; dbt or SQLMesh; Airflow; Looker, Omni, Mode, Sigma, or similar BI tools. (We’re open to your informed recommendations as you build the function.)
If this sounds like the kind of 0→1 you love—owning the blueprint, shipping the foundations, and seeing your work power every decision at a high-growth company—let’s talk.