Data Engineer I

Pax8

Full-time · USA · $93k-$115k per year
Tags: engineer, python, docker, sql, aws
The job listing has expired. Unfortunately, the hiring company is no longer accepting new applications.

To see similar active jobs, please follow this link: Remote Development jobs

Position Summary:

As a Data Engineer I, you'll be diving into the exciting world of building data systems that drive critical insights and decision-making across Pax8. This role is perfect for someone who enjoys working with big datasets, has a passion for problem-solving, and thrives on collaboration. You'll have the opportunity to interface with stakeholders like Product Managers and business teams to gather requirements, ensuring the data solutions we build are aligned with business needs. Your work will directly impact end users, creating data flows and analytic processes that make a difference.

You'll be at the heart of our data operations, helping to build and maintain data pipelines that fuel analytic workflows and processes. Your tasks will include developing systems that efficiently collect, transform, store, and manage data using well-documented techniques. You'll work alongside experienced technical leaders to plan and execute detailed tasks. You'll also conduct simple investigative analyses and tests to ensure the data is reliable and meets business requirements. Working with stakeholders, you'll gain a deep understanding of their needs, helping shape visual solutions for tools like Power BI. Your insights will contribute to improving data visualizations and stakeholder reporting. You'll also contribute to data modeling efforts, ensuring our data infrastructure is optimized for analytics and reporting.
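
The listing itself contains no code, but as a rough, purely illustrative sketch of the collect-transform-store work described above, the Python snippet below pulls one day of rows from a source database, cleans them with pandas, and appends the result to a warehouse table. Every connection URL, table name, and column name here is invented for the example and is not taken from Pax8.

    import pandas as pd
    from sqlalchemy import create_engine, text

    # Placeholder connection URLs; a real pipeline would read these from config or a secrets manager.
    source = create_engine("postgresql://user:password@source-host/sales")
    warehouse = create_engine("postgresql://user:password@warehouse-host/analytics")

    def extract(day: str) -> pd.DataFrame:
        # Collect: pull one day of raw order rows from the source system.
        query = text(
            "SELECT order_id, customer_id, amount, created_at "
            "FROM orders WHERE CAST(created_at AS date) = :day"
        )
        return pd.read_sql(query, source, params={"day": day})

    def transform(raw: pd.DataFrame) -> pd.DataFrame:
        # Transform: drop rows missing keys or with non-positive amounts, add a derived date.
        clean = raw.dropna(subset=["customer_id", "amount"])
        clean = clean[clean["amount"] > 0].copy()
        clean["order_date"] = pd.to_datetime(clean["created_at"]).dt.date
        return clean

    def load(clean: pd.DataFrame) -> None:
        # Store: append the cleaned rows to a warehouse table.
        clean.to_sql("fct_orders", warehouse, if_exists="append", index=False)

    if __name__ == "__main__":
        load(transform(extract("2024-11-01")))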

To succeed in this role, you'll need strong technical chops in SQL and Python, with experience using data frame tools like Pandas and NumPy. A solid understanding of object-oriented programming (OOP) principles and a proven ability to apply them to clean, efficient, and scalable solutions will be crucial. You should be familiar with analytical databases like Redshift, BigQuery, Presto, or Hive and have some experience with data orchestration tools such as Airflow or AWS Glue. A good grasp of dimensional modeling will set you up for success in designing effective data structures. Strong communication skills will let you interface effectively with stakeholders and gather requirements, and stakeholder management will be key as you translate business needs into technical solutions. Experience in B2B, Retail, or SaaS industries is a plus, giving you the context to better understand the business side of the role.
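
Dimensional modeling is named above but not elaborated on. Purely as a hedged illustration, the sketch below splits a flat pandas DataFrame into a small customer dimension and an orders fact table joined through a surrogate key; the column names and star-schema layout are assumptions for the example, not a description of Pax8's warehouse.

    import pandas as pd

    # A flat, denormalized extract (hypothetical columns).
    flat = pd.DataFrame({
        "order_id": [1, 2, 3],
        "customer_email": ["a@example.com", "b@example.com", "a@example.com"],
        "customer_region": ["US", "EU", "US"],
        "amount": [120.0, 75.5, 30.0],
    })

    # Dimension table: one row per distinct customer, with a surrogate key.
    dim_customer = (
        flat[["customer_email", "customer_region"]]
        .drop_duplicates()
        .reset_index(drop=True)
    )
    dim_customer["customer_key"] = dim_customer.index + 1

    # Fact table: one row per order, carrying only measures and foreign keys.
    fct_orders = flat.merge(dim_customer, on=["customer_email", "customer_region"])
    fct_orders = fct_orders[["order_id", "customer_key", "amount"]]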

Essential Responsibilities:

  • Learns coding techniques/standards and applies them to their work

  • Defines, builds, tests, and implements scalable data pipelines using Python and SQL

  • Transforms data to support varied use cases

  • Optimizes existing data pipelines and improves existing code quality

  • Writes unit and integration tests (a brief illustrative test sketch follows this list)

  • Works collaboratively with peers to solve pressing data issues

  • Participates in on-call rotation
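
As a minimal, hypothetical example of the unit tests mentioned above, a pytest-style check for the transform step sketched earlier might look like the following. The pipeline module name is invented for illustration.

    import pandas as pd
    from pipeline import transform  # hypothetical module holding the sketch shown earlier

    def test_transform_keeps_only_valid_rows():
        raw = pd.DataFrame({
            "order_id": [1, 2, 3],
            "customer_id": [10, None, 30],
            "amount": [50.0, 20.0, -5.0],
            "created_at": ["2024-11-01 10:00", "2024-11-01 11:00", "2024-11-01 12:00"],
        })
        clean = transform(raw)
        # Row 2 has no customer_id and row 3 has a negative amount, so only row 1 survives.
        assert list(clean["order_id"]) == [1]
        assert "order_date" in clean.columns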

Ideal Skills, Experience, and Competencies:

  • One (1) to three (3) years of relevant data engineering experience.

  • Intermediate experience with the Python programming language.

  • Intermediate experience with SQL.

  • Experience with Data Modeling.

  • Exposure to a JVM language.

  • Exposure to Apache Spark or other distributed processing engines.

  • Exposure to Apache Kafka or other stream processing frameworks.

  • Exposure to Terraform, Docker, Kubernetes, or other similar infrastructure tooling.

  • Exposure to job orchestration and/or ETL tools such as Airflow, Prefect, Glue, Talend, or Informatica (a brief Airflow sketch follows this list).

  • Exposure to cloud environments such as AWS, Azure, or Google Cloud.

  • Exposure to analytical databases such as Redshift, Athena, BigQuery, and Presto.

  • Ability to build partnerships and work collaboratively with others to meet shared objectives.

  • Ability to actively seek new ways to grow using both formal and informal development challenges.

  • Ability to effectively absorb and apply peer feedback.
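
Airflow is named above only as one example of an orchestration tool. As a hedged sketch of how the earlier pipeline functions might be scheduled, the snippet below wires them into a daily Airflow 2.x DAG; the DAG id, schedule, and the pipeline module are assumptions for illustration, not part of the posting.

    from datetime import date, datetime, timedelta
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_for_yesterday():
        # Hypothetical module from the earlier pipeline sketch, not a real package.
        from pipeline import extract, transform, load
        day = (date.today() - timedelta(days=1)).isoformat()
        load(transform(extract(day)))

    # One daily DAG that runs the whole extract -> transform -> load chain as a single task.
    with DAG(
        dag_id="orders_daily",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="run_pipeline", python_callable=run_for_yesterday)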

Required Education & Certifications:

  • B.A./B.S. in a related field or equivalent work experience.

Compensation:

  • Qualified candidates can expect a compensation range of $93,000 to $115,000 or more depending on experience.

Expected Closing Date: 11/8/24

#LI-Remote #LI-JF1 #BI-Remote #DICE-J

