Working Nomads

Senior Data Engineer

Truelogic Software

Full-time
Latin America
python
devops
docker
aws
architecture
The job listing has expired. Unfortunately, the hiring company is no longer accepting new applications.

To see similar active jobs, please follow this link: Remote Development jobs

Truelogic is a leading provider of nearshore staff-augmentation services, headquartered in New York. Our team of 500 tech professionals drives digital innovation from Latin America on top projects for U.S. companies. Truelogic has helped companies of all sizes achieve their digital transformation goals.

Would you like to make innovation happen? Have you ever dreamed of building products that impact millions of users? Nice! Then we have a seat for you on our team.

What are you going to do?

You will have the opportunity to work in a forward-thinking, growth-oriented environment in a massive industry that we all rely on but rarely think about, because it has operated the same way for decades with little change or innovation. The client is a well-funded Series A startup developing a brand-new concept to transform an industry plagued by inefficiency, volatility, and $23B of economic waste annually. The client team is passionate about collaborating with the world's leading ocean carriers, shippers, and forwarders to bring technology and innovation to this industry, and we hope you'll help build a new digital foundation with lasting effects for years to come.

Occupy a unique position in the market. As a Data Engineer, you will be responsible for capturing data and transforming it into high-performance datasets used by Data Analysts, Data Scientists, and Engineers. You will also support the development, architecture, enhancement, and ongoing maintenance of our data lakehouse in a cloud environment. You will develop a deep understanding of the client's data architecture and data-processing frameworks, and maintain the quality data that powers the applications on our technology platform.

  • Work closely with the Data Team Lead and Product Manager to deliver engineering solutions for our user requirements

  • Help support the architecture, development timeline, and maintenance of the cloud-based data lakehouse

  • Build data pipelines, architect new data storage as necessary, and maintain ETL processes in a cloud environment

  • Implement a robust system architecture that takes into consideration how changes will affect the testability, maintainability, and scalability of the system

  • Optimize our data processes, balancing performance with continuous improvement of existing pipelines

  • Consistently work to identify and mitigate security concerns, refactor old code judiciously, and follow development best practices

  • Automate testing and continuous integration to deliver bug-free, fully test-covered code in short time frames

  • Collaborate with teams across the business (Operations, Product, etc.) to drive solutions with significant impact on our systems and our business overall

What will help you succeed

  • 4+ years of experience designing, building, and maintaining data pipeline technologies (ETL/ELT)

  • 4+ years of experience with database models, design, DDL, and DML

  • 2+ years of experience in data warehouse/data lake architecture and development

  • 1+ years of experience with workflow orchestration services such as AWS Step Functions, AWS SQS, and AWS Lambda

  • Prior experience in a startup and/or comfort owning decisions in a rapidly changing environment

  • Experience with cloud solutions (preferably AWS)

  • Experience building and supporting end-to-end integrations across various methods (API/EDI/SFTP/flat files)

  • Proficiency with the respective tools in the following areas — languages: Python; ETL: Airflow, AWS Glue, AWS Data Pipeline; databases: PostgreSQL; infrastructure: Linux, AWS, Docker

About the job

Full-time
Latin America
18 Applicants
Posted 1 year ago
python
devops
docker
aws
architecture
Working Nomads curates remote digital jobs from around the web.

© 2025 Working Nomads.