Staff Data Engineer

Apollo.io

Full-time
USA
Posted 2 weeks ago

Tags: engineer, python, big data, hadoop, machine learning

As a Staff Data Engineer, you will be responsible for maintaining and operating the data platform that supports machine learning workflows and analytics and powers some of the products offered to Apollo customers.

Daily adventures/responsibilities

  • Develop and maintain scalable data pipelines and build new integrations to support continuing increases in data volume and complexity.

  • Develop and improve data APIs used in machine learning and AI product offerings.

  • Implement automated monitoring, alerting, and self-healing (restartable tasks, graceful failure handling) while building the consumption pipelines; a minimal sketch follows this list.

  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available.

  • Write unit/integration tests, contribute to the engineering wiki, and document work.

  • Define company data models and write jobs to populate them in our data warehouse.

  • Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
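
For illustration, here is a minimal, hypothetical sketch of the self-healing pattern described above, using Airflow (named under the preferred qualifications below). The DAG, task, and callback names are invented, not Apollo's actual pipeline, and the schedule argument assumes Airflow 2.4+:

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def notify_on_failure(context):
        # Stand-in alert hook; a real pipeline might page Slack or PagerDuty here.
        print(f"Task {context['task_instance'].task_id} failed; alerting on-call.")

    def ingest_batch():
        # Keep ingestion idempotent so an automatic retry after a partial
        # failure is safe (the "restartable" part of self-healing).
        print("Ingesting batch...")

    with DAG(
        dag_id="consumption_pipeline_example",  # hypothetical name
        start_date=datetime(2025, 1, 1),
        schedule="@hourly",
        catchup=False,
        default_args={
            "retries": 3,                       # restart failed tasks automatically
            "retry_delay": timedelta(minutes=5),
            "retry_exponential_backoff": True,  # back off gracefully between attempts
            "on_failure_callback": notify_on_failure,  # alert when a task ultimately fails
        },
    ) as dag:
        PythonOperator(task_id="ingest_batch", python_callable=ingest_batch)

Retries with backoff handle transient failures, while the failure callback covers alerting; data-quality checks would typically run as separate tasks in the same DAG.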

Competencies

  • Customer driven: Attentive to our internal customers’ needs, striving to deliver a seamless and delightful customer experience in data processing, analytics, and visualization.

  • High impact: Understand which customer metrics matter most, and make the data platform and datasets enablers that help other teams improve them.

  • Ownership: Take ownership of team-level projects/platforms from start to finish, ensure high-quality implementation, and move fast to find the most efficient ways to iterate.

  • Team mentorship and sharing: Share knowledge and best practices with the engineering team to help up-level the team.

  • Agility: Organized and able to effectively plan and break down large projects into smaller tasks that are easier to estimate and deliver. Can lead fast iterations.

  • Speak and act courageously: Not afraid to fail, challenge the status quo, or speak up for a contrarian view.

  • Focus and move with urgency: Prioritize for impact and move quickly to deliver experiments and features that create customer value.

  • Intelligence: Learns quickly and demonstrates the ability to understand and absorb new codebases, frameworks, and technologies efficiently.

Qualifications

Required:

  • 8+ years of experience as a data platform engineer, big data engineer, or software engineer working in data.

  • Experience in data modeling, data warehousing, APIs, and building data pipelines.

  • Deep knowledge of databases and data warehousing with an ability to collaborate cross-functionally.

  • Bachelor's degree in a quantitative field (Physical/Computer Science, Engineering, Mathematics, or Statistics).

Preferred:

  • Experience using the Python data stack.

  • Experience deploying and managing data pipelines in the cloud.

  • Experience working with technologies like Airflow, Hadoop, FastAPI, and Spark; a FastAPI sketch follows this list.

  • Understanding of streaming technologies like Kafka and Spark Streaming.
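
As a concrete illustration of the FastAPI item above (and of the data APIs mentioned under the responsibilities), here is a minimal, hypothetical sketch of a feature-serving endpoint; the route, model, and in-memory lookup are invented stand-ins for a real warehouse or feature store:

    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI()

    # Stand-in for a warehouse or feature-store lookup.
    _FEATURES = {"acme": {"employee_count": 1200, "industry": "software"}}

    class CompanyFeatures(BaseModel):
        company_id: str
        employee_count: int
        industry: str

    @app.get("/features/{company_id}", response_model=CompanyFeatures)
    def get_company_features(company_id: str) -> CompanyFeatures:
        # Serve precomputed features to downstream ML / AI product code.
        record = _FEATURES.get(company_id)
        if record is None:
            raise HTTPException(status_code=404, detail="unknown company_id")
        return CompanyFeatures(company_id=company_id, **record)

Run locally with uvicorn (uvicorn module_name:app) and query /features/acme to see the shape such an API might return.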

Apply for this position