Senior Data Engineer

Full-time
Brazil
Posted 1 year ago
The job listing has expired. Unfortunately, the hiring company is no longer accepting new applications.


About your Team

At Teachable, our Data Team is all about getting things done with minimal overhead. We work closely together, sharing ideas, designing efficient systems, and building a robust data platform that supports the entire organization. Reporting to the Data Engineering Manager, you’ll take the lead on key initiatives, from optimizing our data lake to shaping the future of our events pipeline and ETL processes.

Our data infrastructure is primarily hosted on AWS. We leverage managed Airflow for orchestration, Kafka for event collection, Redshift for analytics, and S3 for data lake storage. As a Senior Data Engineer, your expertise will be pivotal in influencing the tools and technologies we use, as well as setting best practices for the entire data ecosystem.

The Impact You'll Make

In this role, you'll be at the heart of our data strategy, taking ownership of mission-critical infrastructure that supports teams across the company. You’ll also have the opportunity to make strategic recommendations on how we can evolve our data platform to stay ahead of the curve.

This is a fully remote role based in Brazil, and you’ll collaborate closely with teams across the U.S. and Brazil (including Hotmart’s Data Team).

Your work will follow Brasília Standard Time, and you’ll be hired under a CLT contract with compensation in BRL.

What You’ll Do

  • Partner with Engineering and Product teams on high-impact initiatives.

  • Design and implement robust data pipelines using AWS services.

  • Enhance event collection, queueing, and processing mechanisms.

  • Build and refine meaningful alerts with automated corrective actions.

  • Troubleshoot and resolve issues, ensuring system reliability and stability.

  • Architect scalable data products with long-term vision and efficiency in mind.

  • Identify and resolve complex technical challenges involving system integration, infrastructure, and software bugs.

About You

  • 4+ years of professional experience in data engineering.

  • Proficiency in processes used in modern data engineering:

    • Solid understanding of cloud infrastructure

    • Hands-on experience with relevant data-related AWS services

    • Advanced understanding of the relational data model

    • Experience interacting with RESTful APIs and webhooks

    • Understanding of DevOps disciplines and Git version control

    • A commitment to writing clean, efficient, and maintainable code

    • Familiarity with monitoring tools like CloudWatch or New Relic

    • Familiarity with incident management tools like PagerDuty is desirable

    • Understanding of software engineering paradigms and the development lifecycle is desirable

  • Proficiency in languages used in data engineering:

    • Strong SQL skills

    • Expertise in Python for data wrangling

    • Experience using Airflow for data pipeline orchestration

    • Experience building big data workloads with Spark

    • Experience working with Delta Lake, Hudi, or Iceberg

    • Experience working with large-scale data warehouses like Redshift

    • Experience composing AWS infrastructure with Terraform is desirable

  • A team player who values diversity and collaboration

