Working Nomads

Senior Data Engineer - Streaming Platform

Voodoo

Full-time · EMEA
Skills: engineer, java, python, aws, scala

Apply for this position

Founded in 2013, Voodoo is a tech company that creates mobile games and apps with a mission to entertain the world. With 800 employees, 7 billion downloads, and over 200 million active users, Voodoo is the #3 mobile publisher worldwide by downloads, after Google and Meta. Our portfolio includes chart-topping games like Mob Control and Block Jam, alongside popular apps such as BeReal and Wizz.

Team

The Engineering & Data team builds innovative tech products and platforms to support the impressive growth of Voodoo's gaming and consumer apps, keeping the company at the forefront of the mobile industry.

Within the Data team, you'll join the Ad-Network team, an autonomous squad of around 30 people composed of top-tier software engineers, infrastructure engineers, data engineers, mobile engineers, and data scientists (including three Kaggle Masters). The team enables Voodoo to monetize its inventory directly with advertising partners, relying on advanced technology to optimize advertising in a real-time bidding environment. It is a strategic topic with significant impact on the business.

This role can be performed fully remotely from any EMEA country.

Role

  • Design, implement, and optimize real-time data pipelines handling billions of events per day with strict SLAs.

  • Architect data flows for bidstream data, auction logs, impression tracking, and user-behavior data.

  • Build scalable and reliable event ingestion and processing systems using Kafka, Flink, Spark Structured Streaming, or similar technologies.

  • Operate data infrastructure on Kubernetes, managing deployments, autoscaling, resource limits, and high availability.

  • Collaborate with backend engineers to integrate OpenRTB signals into our data platform in near real time.

  • Ensure high-throughput, low-latency processing and system resilience in our streaming infrastructure.

  • Design and manage event schemas (Avro, Protobuf), schema evolution strategies, and metadata tracking.

  • Implement observability, alerting, and performance monitoring for critical data services.

  • Contribute to decisions on data modeling and data retention strategies for real-time use cases.

  • Mentor other engineers and advocate for best practices in streaming architecture, reliability, and performance.

  • Continuously evaluate new tools, trends, and techniques to evolve our modern streaming stack.
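
As a toy illustration of the kind of real-time processing the role describes (not Voodoo's actual stack), here is a minimal tumbling-window aggregator in plain Python; the event fields and the 60-second window are assumptions made for the sketch, standing in for what Flink or Spark Structured Streaming would do at scale.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Group (timestamp_ms, key) events into fixed, non-overlapping
    windows and count events per key within each window.

    A toy stand-in for a streaming engine's tumbling window;
    timestamps are assumed to arrive in order here, and there is no
    watermarking or late-data handling.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_ms) * window_ms  # align to window
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [
    (1_000, "impression"),
    (1_500, "click"),
    (2_400, "impression"),
    (61_000, "impression"),  # falls into the next 60 s window
]
print(tumbling_window_counts(events, window_ms=60_000))
# {0: {'impression': 2, 'click': 1}, 60000: {'impression': 1}}
```

A real pipeline would additionally need event-time watermarks and state checkpointing to meet the strict SLAs mentioned above; this sketch only shows the windowing idea.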

Profile (Must have)

  • Extensive experience in data or backend engineering, with at least two years building real-time data pipelines.

  • Proficiency with stream processing frameworks like Flink, Spark Structured Streaming, Beam, or similar.

  • Strong programming experience in Java, Scala, or Python, with a focus on distributed systems.

  • Deep understanding of event streaming and messaging platforms such as GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka — including performance tuning, delivery guarantees, and schema management.

  • Solid experience operating data services in Kubernetes, including Helm, resource tuning, and service discovery.

  • Experience with Protobuf/Avro, and best practices around schema evolution in streaming environments.

  • Familiarity with CI/CD workflows and infrastructure-as-code (e.g., Terraform, ArgoCD, CircleCI).

  • Strong debugging skills and a bias for building reliable, self-healing systems.

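
The schema-evolution requirement above can be sketched in plain Python: Avro-style backward compatibility means a new reader schema can decode old data as long as every field it adds has a default. The field dicts and the single rule shown are a simplified illustration, not the full Avro resolution algorithm.

```python
def backward_compatible(writer_fields, reader_fields):
    """Simplified Avro-style check: a new reader schema can decode
    records written with the old writer schema iff every reader field
    missing from the writer carries a default value.

    Fields are dicts like {"name": ..., "default": ...}; a real check
    would also compare types, aliases, and union branches.
    """
    writer_names = {f["name"] for f in writer_fields}
    return all(
        f["name"] in writer_names or "default" in f
        for f in reader_fields
    )

v1 = [{"name": "bid_id"}, {"name": "price"}]
v2_ok = v1 + [{"name": "currency", "default": "USD"}]  # safe evolution
v2_bad = v1 + [{"name": "currency"}]                   # breaks old data

print(backward_compatible(v1, v2_ok))   # True
print(backward_compatible(v1, v2_bad))  # False
```

In practice this kind of check runs inside a schema registry's compatibility gate before a new producer or consumer version is deployed.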
Nice to Have

  • Knowledge of stream-native analytics platforms (e.g., Druid, ClickHouse, Pinot).

  • Understanding of frequency capping, fraud detection, and pacing algorithms.

  • Exposure to service mesh, auto-scaling, and cost optimization in containerized environments.

  • Contributions to open-source streaming or infra projects.
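
Frequency capping, mentioned above, can be illustrated with a small sliding-window counter: allow at most N impressions per user within a rolling time window. This is an in-memory toy with invented parameters; production systems shard this state across a streaming job or a low-latency store.

```python
from collections import defaultdict, deque

class FrequencyCap:
    """Sliding-window frequency cap: allow at most `max_impressions`
    per user within the last `window_ms` milliseconds.

    An in-memory sketch only; no eviction of idle users, no sharding.
    """
    def __init__(self, max_impressions, window_ms):
        self.max_impressions = max_impressions
        self.window_ms = window_ms
        self.seen = defaultdict(deque)  # user_id -> impression timestamps

    def allow(self, user_id, now_ms):
        q = self.seen[user_id]
        while q and now_ms - q[0] >= self.window_ms:
            q.popleft()  # drop impressions that fell out of the window
        if len(q) < self.max_impressions:
            q.append(now_ms)
            return True
        return False

cap = FrequencyCap(max_impressions=2, window_ms=60_000)
print(cap.allow("u1", 0))       # True
print(cap.allow("u1", 1_000))   # True
print(cap.allow("u1", 2_000))   # False (cap reached within window)
print(cap.allow("u1", 61_000))  # True (old impressions expired)
```

Pacing algorithms follow a similar pattern but budget spend over time rather than counting impressions per user.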

Benefits

  • Best-in-class compensation

  • Additional benefits according to your country of residence


About the job

Full-time
EMEA
15 Applicants
Posted 4 weeks ago


Working Nomads curates remote digital jobs from around the web.

© 2025 Working Nomads.