Working Nomads

Sr. Data Engineer - Kafka

hims & hers

Full-time
USA
$140k-$170k per year
Tags: kafka, devops, javascript, python, docker
The job listing has expired. Unfortunately, the hiring company is no longer accepting new applications.

To see similar active jobs, please follow this link: Remote Development jobs

We're looking for a savvy and experienced Senior Data Engineer to join the Data Platform Engineering team at Hims & Hers. As a Senior Data Engineer, you will work with analytics engineers, product managers, engineers, security, DevOps, analytics, and machine learning teams to build a data platform that backs the self-service analytics, machine learning models, and data products serving over a million Hims & Hers users.

You Will:

  • Architect and develop data pipelines to optimize performance, quality, and scalability

  • Build, maintain & operate scalable, performant, and containerized infrastructure required for optimal extraction, transformation, and loading of data from various data sources

  • Design, develop, and own robust, scalable data processing and data integration pipelines using Python, dbt, Kafka, Airflow, PySpark, SparkSQL, and REST API endpoints to ingest data from various external data sources into the data lake

  • Develop testing frameworks and monitoring to improve data quality, observability, pipeline reliability, and performance

  • Orchestrate sophisticated data flow patterns across a variety of disparate tooling

  • Support analytics engineers, data analysts, and business partners in building tools and data marts that enable self-service analytics

  • Partner with the rest of the Data Platform team to set best practices and ensure their execution

  • Partner with the analytics engineers to ensure the performance and reliability of our data sources

  • Partner with machine learning engineers to deploy predictive models

  • Partner with the legal and security teams to build frameworks and implement data compliance and security policies

  • Partner with DevOps to build IaC and CI/CD pipelines

  • Support code versioning and code deployments for data pipelines
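
As a rough illustration of the extract-transform-load responsibilities described above, here is a minimal, self-contained Python sketch. The event shape and field names are hypothetical; a real pipeline at this scale would consume from Kafka and write to the data lake rather than to in-memory lists:

```python
import json

def extract(raw_messages):
    """Parse raw JSON messages, skipping any that fail to decode."""
    for raw in raw_messages:
        try:
            yield json.loads(raw)
        except json.JSONDecodeError:
            continue  # in production, route bad payloads to a dead-letter queue

def transform(events):
    """Keep only well-formed events and normalize the fields we load."""
    for event in events:
        if "user_id" in event and "event_type" in event:
            yield {
                "user_id": str(event["user_id"]),
                "event_type": event["event_type"].lower(),
            }

def load(rows, sink):
    """Append transformed rows to a sink (a list stands in for the lake)."""
    for row in rows:
        sink.append(row)
    return len(sink)

# Simulated Kafka payloads: one valid, one malformed, one incomplete.
raw = ['{"user_id": 1, "event_type": "CLICK"}', "not json", '{"user_id": 2}']
sink = []
load(transform(extract(raw)), sink)
# sink now holds a single normalized row
```

The generator-based stages mirror how streaming pipelines compose: each stage consumes the previous one lazily, so records flow through one at a time instead of being materialized in full.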

You Have:

  • 8+ years of professional experience designing, creating, and maintaining scalable data pipelines using Python, API calls, SQL, and scripting languages

  • Demonstrated experience writing clean, efficient, and well-documented Python code, with a willingness to become effective in other languages as needed

  • Demonstrated experience writing complex, highly optimized SQL queries across large data sets

  • Experience with cloud technologies such as AWS and/or Google Cloud Platform

  • Experience building event streaming pipelines using Kafka/Confluent Kafka

  • Experience with IaC technologies like Terraform

  • Experience with data warehouses like BigQuery, Databricks, Snowflake, and Postgres

  • Experience with Databricks platform

  • Experience with a modern data stack such as Airflow/Astronomer, Databricks, dbt, Fivetran, Confluent, and Tableau/Looker

  • Experience with containers and container orchestration tools such as Docker or Kubernetes

  • Experience with Machine Learning & MLOps

  • Experience with CI/CD (Jenkins, GitHub Actions, CircleCI)

  • Thorough understanding of SDLC and Agile frameworks

  • Project management skills and a demonstrated ability to work autonomously
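
The data-quality and monitoring work this role calls for often starts as assertion-style checks run after each load, mirroring the built-in `not_null` and `unique` generic tests that dbt applies to models. A hedged sketch (the column names and rows are invented for illustration):

```python
def check_not_null(rows, column):
    """Pass only if every row has a non-null value for the given column."""
    return all(r.get(column) is not None for r in rows)

def check_unique(rows, column):
    """Pass only if the column contains no duplicate values."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

rows = [
    {"order_id": 101, "amount": 40.0},
    {"order_id": 102, "amount": 15.5},
]
assert check_not_null(rows, "amount")
assert check_unique(rows, "order_id")
```

In practice these checks would be scheduled alongside the pipeline (e.g. as a downstream Airflow task) so a failed assertion blocks bad data from reaching consumers.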

Nice to Have:

  • Experience building data models using dbt

  • Experience with JavaScript and event tracking tools such as Google Tag Manager (GTM)

  • Experience designing and developing systems with desired SLAs and data quality metrics

  • Experience with microservice architecture

  • Experience architecting an enterprise-grade data platform

About the job

Full-time
USA
$140k-$170k per year
Posted 1 year ago
Tags: kafka, devops, javascript, python, docker

Working Nomads curates remote digital jobs from around the web.

© 2026 Working Nomads.