Sr. Manager, Data & Insights

Full-time
Brazil
Posted 1 year ago
The job listing has expired. Unfortunately, the hiring company is no longer accepting new applications.


Summary

Our analytics team creates and maintains the advanced analytics products on which both the Company and our clients depend for their operations, business metrics and decision-making. The team develops high-quality datasets, KPI metrics, billing information, BI dashboards, and Machine Learning models for internal and external customers, playing a central role in the Company's strategy. With a strong data foundation and a complete data stack already in place, including real-time data tools and reliable data platforms that let us go beyond our initial remit, the team is gaining more responsibilities as the company expands and prepares new AI-enabled products for clients worldwide.

What you'll do

  • Align stakeholders around the team’s vision, goals, metrics and roadmap;

  • Build, improve and maintain trusted, scalable Data and Machine Learning products: business-level datasets, BI dashboards, key performance indicators for the Company, and their ETL data pipelines (Spark), applying data visualisation and storytelling best practices alongside testing, monitoring, observability and security;

  • Conduct POCs and establish partnerships to launch new products and features based on AI, Machine Learning, and predictive analytics;

  • Moderate and contribute effectively to technical decisions on data architecture, data warehouse modelling, build vs. buy, the evolving data infrastructure stack, and team practices across Analytics and Machine Learning;

  • Conduct ad-hoc and exploratory data analysis, identifying trends and patterns in complex data and generating insights that support the company's strategic decisions;

  • Actively nudge the company toward a data-driven culture by improving data-driven decision-making, data literacy and lineage, establishing common standards, and supporting analytical tools for companywide use;

  • Work and communicate clearly within multidisciplinary teams composed of business analysts, analytics engineers, data engineers, and data scientists;

  • As a senior leader, mentor other engineers and organize complex cross-team initiatives related to your areas of expertise;

  • Contribute to the dissemination of agile development culture and code review;

  • Act as a mentor to accelerate the technical career growth of your team members and peers across the company;

  • Provide effective feedback to the team's technical staff;

  • Lead the implementation and maintenance of development processes, ensuring that everyone understands the benefits and trade-offs involved, and adjusting them as needed to improve the efficiency and effectiveness of the team.
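The first responsibility above, building trusted KPI datasets with testing baked in, can be sketched in pure Python. This is a stand-in for the Spark pipelines the listing mentions, and the metric and field names (`client_id`, `month`, `amount`) are hypothetical, not from the posting:

```python
from collections import defaultdict

def monthly_revenue_kpi(transactions):
    """Aggregate raw transactions into a monthly-revenue-per-client KPI.

    `transactions` is an iterable of dicts with hypothetical fields
    client_id, month ("YYYY-MM") and amount. A Spark job would express
    the same logic as groupBy("client_id", "month").sum("amount").
    """
    totals = defaultdict(float)
    for tx in transactions:
        totals[(tx["client_id"], tx["month"])] += tx["amount"]
    # Emit one row per (client, month), sorted for deterministic output.
    return [
        {"client_id": c, "month": m, "revenue": round(v, 2)}
        for (c, m), v in sorted(totals.items())
    ]

rows = [
    {"client_id": "acme", "month": "2024-01", "amount": 100.0},
    {"client_id": "acme", "month": "2024-01", "amount": 50.5},
    {"client_id": "beta", "month": "2024-01", "amount": 20.0},
]
kpi = monthly_revenue_kpi(rows)
```

Keeping the aggregation a plain function, separate from any orchestration or I/O, is what makes the "testing, monitoring, observability" part of the bullet tractable: the business logic can be asserted on in isolation.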

Minimum Qualifications

  • Big data processing with Spark, at least for batch workloads.

  • Experience with Python, SQL and BI tools.

  • At least 5 years managing teams, of which at least 4 years managing multidisciplinary teams of 5 to 12 data-focused professionals spanning at least 2 different profiles: at least 1 engineering profile (Software Engineering, Data Engineering, Data Infrastructure or Analytics Engineering) and at least 1 applied quantitative profile (Data Science, Data Analytics, Advanced Business Analytics or Machine Learning Engineering).

  • Negotiation and communication skills to deal with stakeholders at different levels.

  • Advanced English level.

  • (Desirable) Master's degree in a quantitative field (Mathematics, Software Engineering, Statistics, Physics, Economics or similar).

  • (Desirable) Concrete previous hands-on experience as a Data Engineer, Analytics Engineer, Data Scientist, Software Engineer or Machine Learning Engineer.

  • (Desirable) Previous experience with quantitative research, or proven experience with numerical Machine Learning algorithms (time series, tree-based models, logistic regression, support vector machines, clustering, panel data, Value at Risk, and Bayesian statistics), especially on transactional data (transactions, user personalisation, fraud detection, financial analytics, asset management).

  • (Desirable) Experience using Data Orchestration Systems (e.g. Airflow, Prefect, Luigi, DBT, or others) and Metadata management systems (Apache Atlas, Open Metadata or others);

  • (Desirable) Experience developing automated unit and integration tests for data pipelines;

  • (Desirable) Experience using notebooks for data exploration (e.g. Jupyter, Databricks, Hex);

  • (Desirable) Experience consuming from data-intensive distributed messaging systems (e.g. Kafka, SQS, Pub/Sub, Pulsar, Kinesis);

  • (Desirable) Experience developing and maintaining CI/CD for data pipelines (e.g. Codefresh, Jenkins);
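The desirable item on automated unit tests for data pipelines can be illustrated with a minimal, framework-free sketch: a typical deduplication step tested in isolation. The transformation and its field names (`id`, `updated_at`, `status`) are hypothetical examples, not taken from the listing:

```python
def deduplicate_latest(records, key="id", version="updated_at"):
    """Keep only the most recent record per key -- a common pipeline
    step before loading into a warehouse table. Assumes the version
    field sorts correctly as a string (e.g. ISO dates)."""
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[version] > latest[k][version]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])

def test_deduplicate_latest():
    rows = [
        {"id": 1, "updated_at": "2024-01-01", "status": "new"},
        {"id": 1, "updated_at": "2024-02-01", "status": "paid"},
        {"id": 2, "updated_at": "2024-01-15", "status": "new"},
    ]
    out = deduplicate_latest(rows)
    # One row per id, and the later version of id=1 wins.
    assert [r["id"] for r in out] == [1, 2]
    assert out[0]["status"] == "paid"

test_deduplicate_latest()
```

In practice the same pattern runs under pytest in CI, and the orchestration layer (Airflow and similar tools named above) only schedules the transformation, so tests never need a live scheduler.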

Core Benefits

  • Remote work

  • Flexible hours

  • Gympass

  • Meal & Food vouchers

  • Remote work financial support

  • Life Insurance

  • Medical and Dental Assistance

  • Employee child care benefit: daycare

  • Vidalink partnership

  • Day off (Birthday)

  • Support for studying languages

  • 50% off AWS and GCP certifications

Technologies we use day to day

Engineering 

  • Java, Groovy and Go

  • Automated Testing

  • K6 (Load Testing) and Gremlin (Chaos Testing)

  • SQL /  NoSQL

  • Git

  • Rest APIs and streaming data

  • Cloud (AWS and Google)

  • Docker and Kubernetes

  • Codefresh & ArgoCD

  • Grafana & Honeycomb

  • Jira / Confluence

Data

  • AWS services

  • Data Processing: Spark, Flink

  • Python

  • Airflow

  • Relational databases (PostgreSQL and MySQL)

Platform Engineering

  • AWS

  • Codefresh

  • ArgoCD

  • Grafana & Honeycomb

  • Kubernetes

  • Terraform

  • Go, Python, and Shell Script

  • Prometheus

  • Istio

Security

  • SAST

  • SCA

  • IaC Scans

 
