Lead Data & Analytics Engineer

Brilliant

Full-time
North America, Latin America
$180k-$220k per year
Skills: python, sql, architecture, user experience, cloud
The job listing has expired. Unfortunately, the hiring company is no longer accepting new applications.

To see similar active jobs, please follow this link: Remote Development jobs

About Brilliant

Brilliant is making a world of great problem solvers. We focus on adults learning quantitative skills – especially in math, data, and CS/AI – and deliver a best-in-class interactive learning experience across web and apps. Our courses teach you what you need to know, while skipping the stuff you don’t – so expect more about solving equations, statistical analysis, logical deduction, neural networks, and generative AI, and less about abstract theorems and integrating complicated trig functions. 

We serve hundreds of thousands of paid subscribers, and we’re hoping you might be the right person to help accelerate our reach to millions of customers (and changed lives). In addition to what’s below, you can see all open roles and learn more about our team culture on our careers page.

We have always prioritized building a healthy business as the backbone of achieving our mission. We are default alive (we will be profitable before needing to raise), have never had layoffs, and are growing new customers at an exciting pace (high double-digit growth year-over-year). Our investors are top-tier and mission-aligned, and we’ve kept our valuations tethered to reality – we aren’t playing “catch up” like many others.

In our day-to-day, we value adventure, excellence, generosity, and candor. We are optimists in the face of uncertainty, we take pride in our work, we go the extra mile for each other, and we tell it like it is (the good and the bad). We’re all here to do the best work of our lives together, and have a lot of fun along the way. 

We believe that real-time collaboration and human connection are necessary ingredients in building a high-velocity, creatively-oriented consumer product. We maintain core hours (10am - 3pm Pacific) where everyone is online, regardless of time zone. Over half of us are located near our hubs in SF and NYC, and folks outside of those cities travel to attend team offsites once per quarter.

The Role

In this high-autonomy position, you'll direct the development and maintenance of our data infrastructure and data developer experiences for the benefit of the Data and Engineering teams. You’ll collaborate closely with a team of 4 data scientists and 5 engineering managers across an engineering team of 40. Your work will be among the most highly leveraged in the company.

You’ll build and extend a modern data stack centered on dbt (including dbt Cloud) and Snowflake, with supporting tools like Fivetran, Census, and Amplitude.

Responsibilities

  • Design efficient and scalable data pipeline architecture for collecting data across a variety of sources, enabling different functions to leverage transformed data for analytics and operations.

  • Improve existing data modeling and deployment practices, fostering best practices to make the team more efficient and improve data quality.

  • Collaborate with engineers, product managers, and data scientists to understand data needs, overseeing end-to-end event instrumentation for new features, including naming conventions and properties.

  • Drive data “operationalization” – ensuring that we’re sending the right data to the right tools and services, on time and within budget (for example, by managing tools like Census).

  • Ensure consistent pipeline performance in terms of latency and error handling.

  • Optimize the entire data stack — from data storage to transformation to analytical tooling — from a performance, cost, and scalability standpoint.

  • Lead us into a future of convenient data governance by selecting the ideal CDP (customer data platform) and supporting tools.
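To give one of the responsibilities above a concrete shape: end-to-end event instrumentation usually starts with enforcing a shared naming convention. The sketch below is purely illustrative – the snake_case "object_action" convention and the `validate_event` helper are hypothetical examples, not Brilliant's actual standard.

```python
import re

# Hypothetical convention: snake_case "object_action" event names,
# e.g. "lesson_completed", with snake_case property keys.
EVENT_NAME = re.compile(r"^[a-z]+(_[a-z]+)+$")
PROP_KEY = re.compile(r"^[a-z][a-z0-9_]*$")

def validate_event(name: str, properties: dict) -> list[str]:
    """Return a list of convention violations for one tracked event (empty = valid)."""
    errors = []
    if not EVENT_NAME.match(name):
        errors.append(f"event name {name!r} is not snake_case object_action")
    for key in properties:
        if not PROP_KEY.match(key):
            errors.append(f"property key {key!r} is not snake_case")
    return errors

# One conforming event and one that violates the convention on both counts.
print(validate_event("lesson_completed", {"course_slug": "logic", "duration_sec": 312}))  # []
print(validate_event("LessonCompleted", {"courseSlug": "logic"}))
```

Running a check like this in CI (or inside a tracking-plan tool such as Avo, which the stack already includes) is one common way to keep naming conventions from drifting as new features add events.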

Who are you?

  • Experienced: You bring at least 5 years of software engineering experience, including at least 2 years of working directly with some part of the “modern” data stack (dbt Core and Cloud, Fivetran, Snowflake, or equivalents).

  • Empathy for both worlds: You’ve worked closely enough with software engineering teams to understand their concerns and have also walked in the shoes of a data scientist.

  • Technically proficient: You possess advanced SQL skills and solid Python skills, and you’ve directly built or managed live systems that relied on third-party tools.

  • A builder: You’re enthusiastic about establishing the foundations of a data team and its tools from scratch.

What might you tackle in the first 90 days?

  • Audit our data infrastructure from top to bottom to proactively identify performance, scale, and complexity considerations.

  • Audit our data stack (e.g. Snowflake, Fivetran, dbt, Census, Amplitude, Avo) to ensure conformity to best practices.

  • Audit the existing ELT process for business-critical data models and recommend ways to improve data quality, integrity, and reliability.

  • Review and extend data observability, monitoring, and alerting — with a deep empathy for how data issues could adversely affect the end user experience.

  • Determine priority (and vendor/OSS selection) for data governance tooling.

  • Determine priority and general implementation approach for supporting managed business metrics throughout dbt and related tools.
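As a minimal sketch of the observability item above: a freshness check compares each model's last load time against a staleness SLA and flags breaches for alerting. The model names and thresholds below are hypothetical, not Brilliant's actual monitoring configuration.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-model freshness SLAs (model name -> max allowed staleness).
SLAS = {
    "fct_subscriptions": timedelta(hours=6),
    "dim_users": timedelta(hours=24),
}

def stale_models(last_loaded: dict, now: datetime) -> list[str]:
    """Return models whose last load exceeds their SLA (candidates for alerting)."""
    return [
        model
        for model, sla in SLAS.items()
        if now - last_loaded[model] > sla
    ]

now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
loads = {
    "fct_subscriptions": now - timedelta(hours=8),   # breaches the 6h SLA
    "dim_users": now - timedelta(hours=3),           # within the 24h SLA
}
print(stale_models(loads, now))  # ['fct_subscriptions']
```

In practice this kind of check is typically delegated to dbt's built-in source freshness feature or a dedicated observability tool rather than hand-rolled, but the underlying comparison is the same.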

$180,000 - $220,000 a year

Our Engineering Team

Our engineers are extraordinary programmers without big egos. We love to share knowledge and support each other. We work together as an interdependent team to accomplish a common goal, and we know how to get things done. We maintain high personal standards, and possess an ongoing, voluntary, and self-motivated pursuit of knowledge.

Compensation and Benefits

We use a systematic compensation framework: salary scales are set each year for each job vertical, managers level folks on their team, and those levels are mapped directly to our compensation scales. A location-based adjustment is applied outside of SF and NYC (typically 5-10%) - feel free to ask us about your location!
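Purely as an illustration of how a location-based adjustment works (the 7.5% rate below is a made-up value inside the stated 5-10% range, not an actual figure for any location):

```python
def adjusted_salary(base: float, in_hub: bool, adjustment: float = 0.075) -> float:
    """Apply a location-based reduction for folks outside the SF/NYC hubs.

    `adjustment` is a hypothetical 7.5% rate used only for this example.
    """
    if in_hub:
        return base
    return round(base * (1 - adjustment), 2)

print(adjusted_salary(200_000, in_hub=True))   # 200000
print(adjusted_salary(200_000, in_hub=False))  # 185000.0
```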

Given the systematic approach, we always make First and Best offers - there is no negotiation (for new hires or existing teammates). This ensures people are paid based on their expected contribution, not their negotiation skills.

We offer top-notch health care plans, with 100% of the premiums covered for medical, dental, and vision for employees. About 1/3 of our team are parents, and we provide generous parental leave plus up to $1,900/month in dependent healthcare coverage.

We offer flexible PTO, with a norm of taking off about 6 weeks per year (including federal holidays). We also provide home office equipment, a professional development stipend, and free food at our offices.

Our CCPA Privacy Notice can be found here.

Posted 1 year ago

Working Nomads curates remote digital jobs from around the web.

© 2026 Working Nomads.