Backend Engineer - Optimized Checkout & Link Data Engineering

Stripe

Full-time
USA, Canada
$163k-$245k per year
data engineering
java
hadoop
sql
aws
The job listing has expired. Unfortunately, the hiring company is no longer accepting new applications.

To see similar active jobs, please follow this link: Remote Development jobs

Who we are

About Stripe

Stripe is a financial infrastructure platform for businesses. Millions of companies—from the world’s largest enterprises to the most ambitious startups—use Stripe to accept payments, grow their revenue, and accelerate new business opportunities. Our mission is to increase the GDP of the internet, and we have a staggering amount of work ahead. That means you have an unprecedented opportunity to put the global economy within everyone’s reach while doing the most important work of your career.

About the team

The Optimized Checkout & Link team at Stripe builds best-in-class checkout experiences across web and mobile that delight consumers and streamline checkout flows for merchants. Based across North America, we're a diverse team that is deeply passionate about redefining the payment experience: creating outstanding value for merchants, increasing their revenue, lowering their costs, and growing their businesses. We work on Checkout, Payment Links, Elements, Payment Methods, and Link, each playing a crucial part in the economic landscape of the internet. Our days are filled with exciting challenges and collaborative problem-solving as we strive to simplify payment options, create unique business solutions, and make checkout easier. Join us in crafting the future of digital commerce.

What you’ll do

We’re looking for people with a strong background in data engineering and analytics to help us scale while maintaining correct and complete data.

Responsibilities

  • Conceptualize and own the data architecture for multiple large-scale projects, evaluating design and operational cost-benefit tradeoffs within systems.

  • Advocate for data quality and excellence across our platform.

  • Create and contribute to frameworks that improve the efficacy of logging data, working with data infrastructure to triage and resolve issues.

  • Gather requirements, understand the big picture, and create detailed proposals in technical specification documents.

  • Productize data ingestion from various sources and data delivery to various destinations, and create well-orchestrated data pipelines.

  • Optimize pipelines, dashboards, frameworks, and systems to facilitate easier development of data artifacts.

  • Conduct SQL data investigations, data quality analyses, and optimizations.

  • Contribute to peer code reviews and help the team produce high-quality code.

  • Mentor team members by giving and receiving actionable feedback.
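
The posting itself contains no code, but the "SQL data investigations and data quality analyses" responsibility can be illustrated with a minimal, self-contained sketch. The table and column names (`checkout_events`, `event_id`, `amount_cents`) are hypothetical, chosen only for this example; the checks shown (completeness and uniqueness) are two common data-quality assertions, implemented here with Python's standard-library `sqlite3`:

```python
import sqlite3

# Hypothetical events table, used purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE checkout_events (event_id TEXT, merchant_id TEXT, amount_cents INTEGER)"
)
conn.executemany(
    "INSERT INTO checkout_events VALUES (?, ?, ?)",
    [("e1", "m1", 1200), ("e2", "m1", None), ("e2", "m2", 500), ("e3", None, 900)],
)

# Completeness check: rows whose amount is missing.
null_amounts = conn.execute(
    "SELECT COUNT(*) FROM checkout_events WHERE amount_cents IS NULL"
).fetchone()[0]

# Uniqueness check: event_id values that appear more than once.
dup_ids = conn.execute(
    "SELECT COUNT(*) FROM (SELECT event_id FROM checkout_events "
    "GROUP BY event_id HAVING COUNT(*) > 1)"
).fetchone()[0]

print(null_amounts, dup_ids)
```

In a production pipeline, checks like these would typically run as automated assertions after each load step, failing the job (or raising an alert) when a threshold is violated.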

Who you are

We’re looking for someone who meets the minimum requirements to be considered for the role. If you meet these requirements, you are encouraged to apply. The preferred qualifications are a bonus, not a requirement.

Minimum requirements

  • Bachelor's degree in Computer Science or Engineering; a Master's degree is preferred.

  • A strong engineering background and an interest in data.

  • 5+ years of experience writing and debugging data pipelines using a distributed data framework (Hadoop, Spark, Pig, etc.).

  • Strong data modeling and database design skills, both relational and non-relational.

  • Very strong SQL proficiency, preferably including SQL query optimization experience.

  • Strong coding skills in Scala or Java, preferably for building performant data pipelines.

  • Strong understanding of and practical experience with systems such as Hadoop, Spark, Presto, Iceberg, and Airflow.

  • Well versed in software production engineering practices: version control, peer code reviews, automated testing, and CI/CD.

  • Excellent communication skills.

  • Experience with AWS cloud is preferred.
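
The "SQL query optimization" requirement can also be sketched concretely. This illustrative example (not Stripe's stack; the `payments` table and index name are invented for the demo) uses SQLite's `EXPLAIN QUERY PLAN` to show how adding a covering index changes a query from a full table scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE payments (payment_id INTEGER PRIMARY KEY, merchant_id TEXT, amount INTEGER)"
)
conn.executemany(
    "INSERT INTO payments (merchant_id, amount) VALUES (?, ?)",
    [(f"m{i % 100}", i) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM payments WHERE merchant_id = ?"

# Without an index, the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, ("m7",)).fetchall()

# A covering index on (merchant_id, amount) lets the same query
# be answered from the index alone.
conn.execute("CREATE INDEX idx_payments_merchant ON payments (merchant_id, amount)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, ("m7",)).fetchall()

# The last column of each plan row is a human-readable description;
# exact wording varies between SQLite versions.
print(plan_before[-1][-1])  # a SCAN of payments
print(plan_after[-1][-1])   # a SEARCH using the covering index
```

The same habit, inspecting the planner's output before and after adding an index or rewriting a query, carries over to warehouse engines such as Presto, where `EXPLAIN` serves the analogous role.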

 

About the job

Posted 1 year ago



Working Nomads curates remote digital jobs from around the web.

© 2026 Working Nomads.