Senior Data Engineer
As a Senior Data Engineer, you’ll play a pivotal role in shaping the future of our data platform. Our goal is to deliver trusted, near-real-time data products that support analytics, AI applications, and customer-facing data features. You’ll collaborate across engineering, product, and analytics teams to build the architecture and tooling that powers insight and action.
Outcomes you will drive:
Build and maintain scalable, modular data pipelines using tools such as dbt and Azure Data Factory.
Design batch and streaming data workflows that support near-real-time reporting and operational intelligence.
Deliver high-quality, trusted datasets to enable analytics, dashboards, embedded apps, and AI use cases.
Influence and guide the evolution of our data platform tooling and architectural decisions.
Contribute to structured architectural patterns such as Medallion for layered, reusable data models.
Drive data quality through testing, observability, and proactive alerting (e.g. dbt tests, data contracts).
Partner across teams to improve velocity, reusability, and access to data with documentation, lineage, and governance.
Does this sound like you?
Specific experience is less important than a demonstrable desire to learn and the ability to complete complex projects. However, a good candidate should have the following:
5+ years of experience in data engineering or analytics engineering roles
Deep mastery of SQL and extensive, hands-on experience with Snowflake
Strong experience with dbt or similar data transformation frameworks
Proficiency in Python, Scala, or a similar language used for data pipeline logic and automation
Experience with orchestration tools like Azure Data Factory, Airflow, or similar
Comfortable working in a modern, git-based development environment with CI/CD
Experience with cloud-native data streaming technologies such as Azure Event Grid
Exposure to and understanding of data architecture patterns such as Medallion
Experience with Infrastructure as Code tooling (Terraform is a plus)
Bonus: Experience with Snowflake features such as Cortex, Data Shares, and Snowpark
A few ways to stand out...
Experience with data visualization tools such as Redash and GoodData
Experience with semantic modeling or enabling data for AI applications
You’ve productionized batch and streaming pipelines that scale
You’ve contributed to tooling decisions or platform evolution in a growing data team
You’ve supported external data products, analytics features, or ML/AI-powered applications
You understand how to balance speed and governance across data lifecycle and tooling