Data Architect
Architecture & Design
- Design and implement scalable BI and data architecture using Data Lake or Lakehouse paradigms, including the medallion architecture.
- Architect and optimize ELT/ETL pipelines using SQL Server, Dataflows, and Data Pipelines.
- Integrate on-premises data sources with cloud-based solutions using the On-premises Data Gateway (ODG).
- Develop a robust semantic layer and underlying data model to bridge technical data and business language, applying data virtualization principles.
- Design and optimize semantic models that represent data in a way that enhances understanding and usability for analytics.
- Build and manage data pipelines, Lakehouses, Data Lakes, and Data Warehouses in Azure or AWS.
- Manage Power BI dashboards with advanced DAX and real-time data integration.
- Implement data governance, security, and compliance best practices.
- Ensure data quality, lineage, and availability across all reporting layers.
- Collaborate with business stakeholders to translate requirements into technical solutions and intuitive business language.
- Lead and mentor a team of BI developers, data engineers, and analysts, and collaborate with data architects.
- Drive adoption of a unified data platform, modern BI practices, and semantic modelling methodologies across the organization.

- Experience: 10+ years in Data & Analytics, with expertise in semantic data modelling and data engineering. Strong experience with data platforms such as Snowflake, Databricks, Microsoft Fabric, Azure Data Gateway, or Azure Synapse.
- BI Tools: Working knowledge of Power BI; experience integrating with Microsoft Fabric is a plus.
- Data Integration: Experience with the On-premises Data Gateway (ODG) and hybrid data environments.
- Data Engineering: Proficient in designing ELT/ETL processes using SQL Server, columnar data formats such as Delta or Parquet, and Fabric Pipelines.
- Architecture: Strong understanding of the medallion architecture, data virtualization principles, cloud-based data management, and analytics technologies.
- Programming: Expertise in Python, T-SQL, Spark, or other scripting languages for data transformation.
- Methodologies: Agile/Scrum project delivery experience.
- Communication: Strong verbal and written communication skills, with the ability to convey complex technical data in simple business language.