Senior Data Engineer
About the Role
We are seeking a highly skilled and experienced Senior Data Engineer to lead the development and management of our data platform. This pivotal role focuses on supporting critical data needs and building the foundational data models essential to advancing our cross-border remittances business. You will apply your expertise in data warehousing, transformation, and modeling to deliver robust, scalable data solutions. Your responsibilities will span the entire build and deployment lifecycle of data movement and transformation, including managing users, scaling compute resources, and ensuring the platform operates efficiently. A key focus will be maintaining strong data governance so that data is well-defined, searchable, and trusted across the entire organization.
Key Responsibilities:
Data Modeling & Management
Design, implement, and evolve data models for core datasets, including creating new models and enhancing existing ones to meet business requirements.
Collaborate with stakeholders across the organization to translate commercial objectives into trusted data outcomes and dependable solutions.
Establish and enforce data governance policies to guarantee data quality, security, and compliance.
Implement and maintain data cataloging and metadata management solutions that make data easy to discover and enable self-service access.
Platform Operations & Optimization
Manage all aspects of the build and deployment lifecycle for data movement and transformation processes.
Oversee user access and resource allocation for compute infrastructure, ensuring efficient scaling and utilization.
Ensure effective use of platform tools and resources for efficient ETL/ELT processing, providing customized tooling where appropriate.
Manage the costs of all data platform technologies through effective tooling and monitoring.
Technical Leadership & Innovation
Provide technical leadership and mentorship to other data engineers within the team.
Evaluate and integrate new data technologies and tools to enhance the capabilities and efficiency of the data platform.
What you’ll bring to the table
Experience: At least 7 years of experience in data engineering, with a strong emphasis on developing data platforms, data modeling, and managing data assets.
Data Platform Expertise: In-depth, hands-on experience with technologies for data ingestion, job orchestration, data warehousing, and reporting.
ETL/ELT: Proven ability to design, implement, and manage robust ETL/ELT pipelines using various tools (e.g., Spark, dbt).
Cloud Platforms: Strong experience with cloud data platforms (e.g., AWS, GCP, Azure), including compute, storage, and database services.
Programming: Proficient in SQL and advanced programming in at least one language (e.g., Python, Scala, Java).
Orchestration: Experience with workflow orchestration tools (e.g., Astronomer, Apache Airflow).
Data Governance: Solid understanding and practical experience with data governance principles, data cataloging, and metadata management.
Performance Optimization: Demonstrated ability to optimize data platform performance and manage compute resources efficiently.
Problem-Solving: Excellent analytical and problem-solving skills with meticulous attention to detail.
Communication: Strong communication and interpersonal skills, capable of articulating complex technical concepts to both technical and non-technical audiences.
Bonus Points
Experience in the financial services or FinTech industry, specifically with cross-border payments.
End-to-end experience building data platforms, connecting all components from ingestion to reporting using industry-standard tools such as Fivetran (ingestion), Databricks (lakehouse), Astronomer/Apache Airflow (orchestration), and Metaplane/Datadog (observability).
Proficiency in data modeling tools and techniques (e.g., dbt).
Familiarity with modern data lake formats such as Delta Lake and Iceberg.
Experience with real-time data processing and streaming technologies (e.g., Flink, Epsio).
Experience with infrastructure as code (e.g., Terraform) for managing data platform resources.
Familiarity with data observability and monitoring tools (e.g., Metaplane, Datadog).
Proficiency in BI and data visualization tools; experience using Mode is preferred.
Experience using BlackLine or AutoRek reconciliation software.