Senior Data Engineer

Upwork

Freelance / Contract
Latin America
Tags: engineer · python · rust · cloud · security

Upwork ($UPWK) is the world’s work marketplace. We serve everyone from one-person startups to over 30% of the Fortune 100 with a powerful, trust-driven platform that enables companies and talent to work together in new ways that unlock their potential.  

Last year, more than $3.8 billion of work was done through Upwork by skilled professionals who are gaining more control by finding work they are passionate about and innovating their careers. 

This is an engagement through Upwork’s Hybrid Workforce Solutions (HWS) team, a global group of professionals who support Upwork’s business. HWS team members are located all over the world.

This hybrid engagement will help build and operate Data Platform as a Service capabilities for internal teams. The role focuses on enabling scalable, secure, reliable, and well-governed data products through platform engineering practices—CI/CD for data, data mesh enablement, automation, observability, and self-service workflows. This engineer will partner closely with data engineering, analytics, and AI teams to improve platform reliability, developer experience, and time-to-delivery.

Work/Project Scope:

  • Build and operate platform services that enable teams to deliver data products reliably (pipelines, transformations, orchestration, metadata, governance).

  • Design and implement CI/CD for data (tests, deployments, promotion workflows, rollback strategies, versioning).

  • Improve data platform reliability through observability, SLAs/SLOs, alerting, incident response, and runbooks.

  • Enable data mesh patterns: domain ownership, standardized interfaces, reusable templates, and paved paths.

  • Develop internal tooling and automation for onboarding datasets, creating standardized pipelines, and enforcing best practices (quality, security, lineage).

  • Implement or enhance data quality and validation frameworks (contract testing, reconciliation, anomaly detection).

  • Optimize platform performance and cost (warehouse optimization, job efficiency, resource scaling).

  • Collaborate with Security/Compliance to ensure encryption, access control, auditability, and least-privilege practices.

  • Partner with AI teams to ensure data products are fit for AI/ML workloads (feature readiness, dataset versioning, reproducibility, governance).

  • Improve and maintain Airflow orchestration, including DAG design, dependency management, and operational reliability for dbt and analytics workflows.
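To make the data quality scope items above concrete, here is a minimal, standard-library-only sketch of contract testing and count reconciliation. The `check_contract` and `reconcile_counts` helpers are hypothetical illustrations, not Upwork's actual tooling:

```python
"""Sketch of two checks a data quality framework might automate:
contract testing (schema/type validation) and reconciliation
(comparing row counts between a source and a target)."""


def check_contract(rows, contract):
    """Return a list of violations of a column -> type contract."""
    failures = []
    for i, row in enumerate(rows):
        for column, expected_type in contract.items():
            if column not in row:
                failures.append(f"row {i}: missing column '{column}'")
            elif not isinstance(row[column], expected_type):
                failures.append(
                    f"row {i}: '{column}' is not {expected_type.__name__}"
                )
    return failures


def reconcile_counts(source_count, target_count, tolerance=0.0):
    """True if target row count is within a relative tolerance of source."""
    if source_count == 0:
        return target_count == 0
    return abs(source_count - target_count) / source_count <= tolerance


# Example: a two-column contract over a toy dataset.
contract = {"user_id": int, "email": str}
rows = [
    {"user_id": 1, "email": "a@example.com"},
    {"user_id": "2", "email": "b@example.com"},  # wrong type, gets flagged
]
errors = check_contract(rows, contract)
counts_ok = reconcile_counts(source_count=1000, target_count=998, tolerance=0.01)
```

In a real platform these checks would run as pipeline tasks (for example, as Airflow operators gating a dbt run), failing the run and alerting when a contract is violated rather than letting bad data propagate downstream.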

Must Haves (Required Skills):

  • Strong software engineering foundation building production systems (Python and/or Rust preferred; strong APIs/services mindset).

  • Proven experience in data platform engineering (not just building pipelines—building platforms for others).

  • Hands-on experience with CI/CD, Infrastructure as Code, and automation.

  • Experience with observability and reliability engineering (metrics, logs, tracing, SLOs, on-call readiness).

  • Strong knowledge of modern data ecosystem patterns (data modeling, orchestration, warehousing/lakehouse concepts).

  • Practical experience enabling data mesh or self-service platform capabilities.

  • Ability to work through ambiguity, drive delivery, and influence standards.

Preferred Background/Experience:

  • Experience with Snowflake and modern orchestration/testing patterns (dbt/SQLMesh-style workflows, strong Airflow or Dagster experience, data quality tools).

  • Experience with Kubernetes and cloud-native deployments.

  • Experience integrating metadata/catalog/lineage tooling (Atlan, Collibra, Amundsen, OpenMetadata, etc.).

  • Familiarity with AI data requirements (dataset governance, experiment reproducibility, feature pipelines).

Upwork is proudly committed to fostering a diverse and inclusive workforce. We never discriminate based on race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical condition), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.

Additionally, to the extent permitted under applicable law, a criminal background check may be required as a condition of engagement.

We use BrightHire, an AI-enabled tool, to record interviews and summarize interview transcripts. The tool allows the interviewer to focus on the discussion and does not score or evaluate talent or make recommendations. The interview transcripts are reviewed, and decisions are only made by humans. Any individual who prefers not to have their interview recorded through BrightHire can opt out when the interview is scheduled.


About the job

Freelance / Contract · Latin America · Senior Level
Posted 3 days ago



Working Nomads curates remote digital jobs from around the web.

© 2026 Working Nomads.