Senior Data Engineer

Welltech

Full-time
Poland, Spain, Ukraine, Greece
engineer
python
big data
sql
aws
Apply for this position

🚀 Who Are We?

Welcome to Welltech—where health meets innovation! 🌍 As a global leader in the Health & Fitness industry, we’ve crossed over 200 million installs with three life-changing apps, all designed to boost well-being for millions. Our mission? To transform lives through intuitive nutrition trackers, powerful fitness solutions, and personalized wellness journeys—all powered by a diverse team of over 700 passionate professionals with a presence across five hubs.

Why Welltech? Imagine joining a team where your impact on global health and wellness is felt daily. At Welltech, we strive to be proactive wellness partners for our users, while continually evolving ourselves.

What We're Looking For

As a Senior Data Engineer, you will play a crucial role in building and maintaining the foundation of our data ecosystem. You’ll work alongside data engineers, analysts, and product teams to create robust, scalable, and high-performance data pipelines and models. Your work will directly impact how we deliver insights, power product features, and enable data-driven decision-making across the company.

This role is perfect for someone who combines deep technical skills with a proactive mindset and thrives on solving complex data challenges in a collaborative environment.

Challenges You’ll Meet:

  • Pipeline Development and Optimization: Build and maintain reliable, scalable ETL/ELT pipelines using modern tools and best practices, ensuring efficient data flow for analytics and insights.

  • Data Modeling and Transformation: Design and implement effective data models that support business needs, enabling high-quality reporting and downstream analytics.

  • Collaboration Across Teams: Work closely with data analysts, product managers, and other engineers to understand data requirements and deliver solutions that meet the needs of the business.

  • Ensuring Data Quality: Develop and apply data quality checks, validation frameworks, and monitoring to ensure the consistency, accuracy, and reliability of data.

  • Performance and Efficiency: Identify and address performance issues in pipelines, queries, and data storage. Suggest and implement optimizations that enhance speed and reliability.

  • Security and Compliance: Follow data security best practices and ensure pipelines are built to meet data privacy and compliance standards.

  • Innovation and Continuous Improvement: Test new tools and approaches by building Proof of Concepts (PoCs) and conducting performance benchmarks to find the best solutions.

  • Automation and CI/CD Practices: Contribute to the development of robust CI/CD pipelines (GitLab CI or similar) for data workflows, supporting automated testing and deployment.
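The data-quality responsibility above can be sketched in Python, the role's primary language. This is a minimal, hypothetical illustration only — the `REQUIRED_FIELDS` set and the event shape are assumptions for the example, not Welltech's actual schema:

```python
# Minimal data-quality check: partition incoming event records into clean
# rows and quarantined rows with violation reasons. Field names are
# illustrative assumptions, not a real production schema.
from datetime import datetime

REQUIRED_FIELDS = {"event_id", "user_id", "event_time"}


def validate_event(event: dict) -> list[str]:
    """Return a list of data-quality violations for a single event."""
    errors = []
    missing = REQUIRED_FIELDS - event.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    ts = event.get("event_time")
    if ts is not None:
        try:
            datetime.fromisoformat(ts)  # must be parseable ISO-8601
        except (TypeError, ValueError):
            errors.append("event_time is not ISO-8601")
    return errors


def split_valid(events: list[dict]):
    """Split events into (valid_rows, [(bad_row, reasons), ...])."""
    good, bad = [], []
    for event in events:
        errs = validate_event(event)
        if errs:
            bad.append((event, errs))
        else:
            good.append(event)
    return good, bad
```

In practice a check like this would run inside the pipeline (e.g. as a task step) with quarantined rows routed to a dead-letter location for inspection rather than silently dropped.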

You Should Have:

  • 4+ years of experience in data engineering or backend development, with a strong focus on building production-grade data pipelines.

  • Solid experience working with AWS services (Redshift, Spectrum, S3, RDS, Glue, Lambda, Kinesis, SQS).

  • Proficient in Python and SQL for data transformation and automation.

  • Experience with dbt for data modeling and transformation.

  • Good understanding of streaming architectures and micro-batching for real-time data needs.

  • Experience with CI/CD pipelines for data workflows (preferably GitLab CI).

  • Familiarity with event schema validation tools/solutions (Snowplow, Schema Registry).

  • Excellent communication and collaboration skills, and strong problem-solving ability—able to dig into data issues, propose solutions, and deliver clean, reliable outcomes.

  • A growth mindset—enthusiastic about learning new tools, sharing knowledge, and improving team practices.
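The streaming and micro-batching requirement above can be illustrated with a small, library-free sketch: events are buffered and flushed either when a batch-size threshold is reached or a time window elapses. The names (`MicroBatcher`, `sink`) are hypothetical and not part of any tool listed in this posting:

```python
# Micro-batching sketch: buffer events and emit them in batches, flushing
# when either max_size events accumulate or max_age_s seconds pass.
import time
from typing import Callable


class MicroBatcher:
    def __init__(self, sink: Callable[[list], None],
                 max_size: int = 100, max_age_s: float = 5.0):
        self.sink = sink              # callback receiving each batch
        self.max_size = max_size
        self.max_age_s = max_age_s
        self.buffer: list = []
        self.first_event_at: float | None = None

    def add(self, event) -> None:
        if not self.buffer:
            self.first_event_at = time.monotonic()
        self.buffer.append(event)
        if len(self.buffer) >= self.max_size or self._expired():
            self.flush()

    def _expired(self) -> bool:
        return (self.first_event_at is not None
                and time.monotonic() - self.first_event_at >= self.max_age_s)

    def flush(self) -> None:
        """Emit any buffered events and reset the window."""
        if self.buffer:
            self.sink(self.buffer)
            self.buffer = []
            self.first_event_at = None
```

Real consumers (e.g. reading from Kinesis or SQS) add checkpointing and retry on top of this shape, but the size-or-age flush trade-off between latency and per-batch overhead is the same.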

Tech Stack You’ll Work With:

  • Cloud: AWS (Redshift, Spectrum, S3, RDS, Lambda, Kinesis, SQS, Glue, MWAA)

  • Languages: Python, SQL

  • Orchestration: Airflow (MWAA)

  • Modeling: dbt

  • CI/CD: GitLab CI (including GitLab administration)

  • Monitoring: Datadog, Grafana, Graylog

  • Event validation process: Iglu schema registry

  • APIs & Integrations: REST, OAuth, webhook ingestion

  • Infra-as-code (optional): Terraform

Bonus Points / Nice to Have:

  • Experience with additional AWS services: EMR, EKS, Athena, EC2.

  • Hands-on knowledge of alternative data warehouses like Snowflake or others.

  • Experience with PySpark for big data processing.

  • Familiarity with event data collection tools (Snowplow, Rudderstack, etc.).

  • Interest in or exposure to customer data platforms (CDPs) and real-time data workflows.


About the job

Full-time
Poland, Spain, Ukraine, Greece
10 Applicants
Posted 6 days ago
engineer
python
big data
sql
aws


Working Nomads curates remote digital jobs from around the web.

© 2025 Working Nomads.