Solution Architect (MS-DevOps)
Overview:
We are seeking qualified Solution Architects proficient in software/data engineering and DevOps to help deliver our Elastic Operations service. This position reports to our Managed Services team in Bangalore, India. This is a hands-on technical Developer/Architect position, so only experienced candidates with a deep passion for understanding and designing complex data solutions should apply.
As a Solution Architect, you will be responsible for designing, validating, optimizing, and maintaining complex data integration and data pipeline workloads at both small and large scale. You will work on large-scale, complex data platform projects running on Snowflake and other native cloud platform services in AWS and Azure. You will also participate in data integration, data modeling, data governance, and data security tasks. In addition, you will need the ability to learn and quickly upskill on data ecosystem technologies related to data ingestion, data transformation, data modeling, data migration, platform design, and architecture, with some exposure to data visualization tools like Power BI.
Required Experience:
9-12 years of hands-on experience as a software, DevOps, or data engineer modeling, designing, implementing, and supporting modern data solutions.
Experience with core cloud data platforms such as Snowflake, AWS, Azure, or Databricks.
Deep working knowledge of end-to-end pipelines for small and large-scale data sets from a variety of application sources, with the ability to diagnose and fix broken pipelines.
Understanding of common data integration and data transformation patterns for small and large-scale data sets.
Deep understanding of data validation, whether performed with utilities or through manual processes.
Hands-on experience troubleshooting, optimizing, and enhancing data pipelines, and delivering improvements in production environments.
Extensive experience in providing operational support across a large user base for a cloud-native data warehouse (Snowflake and/or Redshift).
Programming expertise in Java, Python, and/or Scala.
Strong SQL skills, including the ability to write, debug, and optimize queries.
Strong troubleshooting and performance-tuning skills for data warehouses.
Willingness to work in both developer and support roles across customers, with proficiency in incident management and troubleshooting.
Excellent client-facing written and verbal communication skills and experience.
Ability to create and deliver detailed technical presentations to an executive audience.
4-year Bachelor's degree in Computer Science or a related field.
Prefer one or more of the following:
Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks
Cloud and Distributed Data Storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
Data integration technologies: Spark, Kafka, event streaming, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, or similar
Multiple data sources (e.g., queues, relational databases, files, search, APIs)
Experience designing, implementing, and maintaining CI/CD pipelines to enable rapid and reliable software releases
Experience monitoring and optimizing CI/CD pipeline performance to reduce build and deployment times
Automated data transformation and data curation: dbt, Spark, Spark Streaming, automated pipelines
Workflow Management and Orchestration: Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, NiFi
CI/CD tools: Git, Flyway, and Liquibase
Why phData? We offer:
Medical Insurance for Self & Family
Medical Insurance for Parents
Term Life & Personal Accident
Wellness Allowance
Broadband Reimbursement
Professional Development Allowance
Reimbursement of Skill Upgrade Certifications
#LI-DNI