Senior Data Engineer

HubSpot

Full-time
USA
$158k-$236k per year
engineer
python
sql
aws
security
The job listing has expired. Unfortunately, the hiring company is no longer accepting new applications.

To see similar active jobs, please follow this link: Remote Development jobs

POS-4028

We are seeking a Senior Data Engineer to join our team. In this role, you will work with various external data sources, design and maintain data pipelines, and ensure the efficiency and security of our data processes. Your contributions will help HubSpot leverage data-driven insights for effective decision-making.

This role resides within our Employee Technology group, which comprises Business Systems Analysts, Systems Engineers, Analytics Engineers, and administrators responsible for HubSpot's employee-facing application infrastructure.

In this role you will:

Data Acquisition and Ingestion

  • Acquire data from multiple sources using pre-built connectors and custom solutions.

  • Develop data ingestion pipelines to manage internal and external API interactions, addressing challenges such as rate limiting, pagination, and retry mechanisms (see the sketch after this list).

  • Process and securely ingest large datasets, ensuring encrypted data transfer and runtime credential safety.

  • Implement Snowpipe for data ingestion into Snowflake and utilize Snowflake’s external tables for accessing S3 data.
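
To illustrate the kind of ingestion work described above, here is a minimal Python sketch of a paginated API pull with rate-limit and retry handling. The endpoint URL, the page/per_page query parameters, the Bearer token header, and the "results" response key are all hypothetical; real connectors vary by vendor API.

```python
import time

import requests


def fetch_all_pages(base_url, api_token, page_size=100, max_retries=5):
    """Pull every page from a paginated REST endpoint, backing off on
    rate limits (HTTP 429) and transient server errors (5xx)."""
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {api_token}"  # hypothetical auth scheme
    records, page = [], 1
    while True:
        for attempt in range(max_retries):
            resp = session.get(base_url, params={"page": page, "per_page": page_size})
            if resp.status_code == 429:
                # Honor the server's Retry-After hint if present, else back off exponentially.
                time.sleep(float(resp.headers.get("Retry-After", 2 ** attempt)))
                continue
            if resp.status_code >= 500:
                time.sleep(2 ** attempt)  # transient server error: retry with backoff
                continue
            resp.raise_for_status()
            break
        else:
            raise RuntimeError(f"page {page}: retries exhausted")
        batch = resp.json().get("results", [])  # hypothetical response shape
        if not batch:
            return records  # an empty page signals the end of the data set
        records.extend(batch)
        page += 1
```

Backing off on 429 and 5xx responses while honoring Retry-After keeps a pipeline well-behaved under vendor rate limits.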

AWS Infrastructure and File Management

  • Manage and optimize data workflows within the AWS ecosystem, with a focus on S3 operations.

  • Manipulate S3 files upon arrival, archive older files to Glacier, and perform light cleansing or alteration of files as needed for various workflows (see the sketch after this list).

  • Automate AWS processes using tools like AWS Lambda, Step Functions, or custom scripts to improve efficiency and reliability.
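
As a concrete example of the S3 housekeeping above, the following sketch uses boto3 to transition objects older than a cutoff to the GLACIER storage class via an in-place copy. The bucket name, prefix, and 90-day threshold are illustrative assumptions; in practice an S3 Lifecycle rule can express the same policy declaratively, and the loop body could run inside a Lambda function.

```python
from datetime import datetime, timedelta, timezone

import boto3

# Bucket, prefix, and cutoff are illustrative assumptions, not values from the posting.
BUCKET = "example-data-landing"
PREFIX = "exports/"
CUTOFF = datetime.now(timezone.utc) - timedelta(days=90)

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        if obj["LastModified"] < CUTOFF and obj.get("StorageClass") != "GLACIER":
            # An in-place copy with a new StorageClass transitions the object.
            s3.copy_object(
                Bucket=BUCKET,
                Key=obj["Key"],
                CopySource={"Bucket": BUCKET, "Key": obj["Key"]},
                StorageClass="GLACIER",
                MetadataDirective="COPY",
            )
```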

Data Migrations and Utility Development

  • Develop scripts for data migrations between Snowflake instances and automate tasks using Looker’s APIs.

  • Create utility scripts for business needs, such as disabling inactive users, purging broken or unused Looker content, and simplifying otherwise burdensome manual processes (see the sketch after this list).
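
Below is a hedged sketch of one such Looker utility: disabling users who have not logged in recently, via the looker-sdk Python package. The 120-day cutoff is an arbitrary example, and reading the last login from credentials_email.logged_in_at is an assumption about how accounts authenticate in the instance.

```python
from datetime import datetime, timedelta, timezone

import looker_sdk  # pip install looker-sdk; reads LOOKERSDK_* settings from the environment
from looker_sdk import models40

CUTOFF = datetime.now(timezone.utc) - timedelta(days=120)  # illustrative threshold

sdk = looker_sdk.init40()

for user in sdk.all_users():
    creds = user.credentials_email  # assumes email/password logins; SSO setups differ
    if user.is_disabled or creds is None or not creds.logged_in_at:
        continue
    last_login = datetime.fromisoformat(creds.logged_in_at.replace("Z", "+00:00"))
    if last_login < CUTOFF:
        # Disable rather than delete, so the account can be restored if needed.
        sdk.update_user(user_id=user.id, body=models40.WriteUser(is_disabled=True))
```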

Data Modeling and Development

  • Build and maintain data models using dbt, following SQL best practices and data warehousing principles to ensure sound schema design and data integrity.

  • Develop data pipelines in a DAG structure, ensuring clear dependencies and performing data validation and testing (see the sketch after this list).

  • Implement CI/CD workflows with GitHub Actions and hooks for automated testing and deployment.
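
The DAG idea above can be prototyped with nothing more than the Python standard library, as in this sketch; the task names and validation step are invented for illustration, and in production a scheduler would own the graph.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# A toy pipeline DAG: each task maps to the set of tasks it depends on.
pipeline = {
    "extract_api": set(),
    "load_s3_to_snowflake": {"extract_api"},
    "staging_models": {"load_s3_to_snowflake"},
    "validate_row_counts": {"staging_models"},
    "mart_models": {"validate_row_counts"},
}

def run(task: str) -> None:
    print(f"running {task}")  # placeholder for the real work

# static_order() yields every task only after its dependencies, and raises
# graphlib.CycleError if the graph accidentally contains a cycle.
for task in TopologicalSorter(pipeline).static_order():
    run(task)
```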

We are looking for people who have:

  • Proven experience in developing custom data ingestion pipelines and integrating with APIs, effectively managing complexities such as rate limiting and pagination.

  • Proficiency in automating workflows using AWS Lambda, Step Functions, or comparable tools.

  • Extensive knowledge of databases, especially Snowflake, including experience with Snowflake technologies such as Snowpipe, Snowpark, Data Sharing, and Cloning.

  • Strong background in SQL for data warehouse transformations. Experience with dbt is a plus.

  • Solid understanding of data warehousing principles, DAG structures, and comprehensive data validation/testing frameworks.

  • Strong Python skills for creating utility scripts and automated workflows.

  • Experience with data manipulation libraries such as Pandas, PySpark, or similar technologies (see the sketch after this list).

  • Familiarity with streaming data solutions such as Kafka, AWS Kinesis, or RabbitMQ.
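
As a small example of the light file cleansing this kind of role involves, here is a pandas sketch; the file names, column names, and cleansing rules are hypothetical.

```python
import pandas as pd

# Hypothetical raw export; columns and rules illustrate "light cleansing" only.
df = pd.read_csv("raw_export.csv")

df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]  # normalize headers
df = df.drop_duplicates()
df["created_at"] = pd.to_datetime(df["created_at"], errors="coerce", utc=True)
df = df.dropna(subset=["id"])  # drop rows missing a primary key

# Columnar output for the warehouse; requires a parquet engine such as pyarrow.
df.to_parquet("clean_export.parquet", index=False)
```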

Cash compensation range: $157,600 to $236,400 USD annually. This resource will help guide how we recommend thinking about the range you see. Learn more about HubSpot's compensation philosophy.

The cash compensation above includes base salary, on-target commission for employees in eligible roles, and annual bonus targets under HubSpot's bonus plan for eligible roles. In addition to cash compensation, some roles are eligible to participate in HubSpot's equity plan and receive restricted stock units (RSUs). Some roles may also be eligible for overtime pay. Individual compensation packages are based on a few factors unique to each candidate, including their skills, experience, qualifications, and other job-related reasons.

We know that benefits are also an important piece of your total compensation package. To learn more about what's included in total compensation, check out some of the benefits and perks HubSpot offers to help employees grow better. At HubSpot, fair compensation isn't just about checking the box for legal compliance; it's about living out our value of transparency with our employees, candidates, and community.

About the job

Full-time
USA
$158k-$236k per year
Posted 5 months ago
engineer
python
sql
aws
security
