Senior DevOps Engineer (Cloud)

Full-time
India
Posted 1 year ago
The job listing has expired. Unfortunately, the hiring company is no longer accepting new applications.


Responsibilities:

  • Operate and manage modern data platforms — from streaming to data lakes to analytics and beyond — across a continually evolving technical stack.

  • Demonstrate clear ownership of multiple simultaneous customer accounts across a variety of technical stacks and act as a technical leader. 

  • Proactively identify and articulate best practices and improvements that help customers get the most out of their cloud data platform.

  • Skillfully navigate complex customer environments and build Epics, Stories, and Tasks to mature and improve our customers' data platforms. Delegate to and coach engineers, and ensure successful, timely delivery of these Epics.

  • Provide thought leadership by recommending the right technologies and approaches to mature the platform and solve problems, helping ensure performance, security, scalability, and user satisfaction.

  • Continually hunt for ways to automate and optimize existing processes.

Required Experience:

  • Strong working knowledge of SQL and the ability to write, debug, and optimize SQL queries.

  • Extensive experience in providing architectural guidance and operational support across a large user base for a cloud-native data warehouse (Snowflake and/or Redshift). 

  • Expertise with cloud-native data technologies in AWS or Azure. 

  • Proven experience learning new technology stacks and training team members.

  • Professional track record of creating, challenging, and improving processes and procedures.

  • Strong troubleshooting and performance tuning skills.

  • Client-facing written and verbal communication skills and experience.
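The SQL tuning skill listed above can be illustrated with a minimal, self-contained sketch: inspecting a query plan before and after adding an index. The table and column names (`events`, `user_id`) are invented for this example, and the exact plan wording varies by SQLite version.

```python
import sqlite3

# Hypothetical example of SQL performance tuning: compare the query plan
# for a filtered lookup before and after adding an index. The schema and
# data here are made up purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)"
)
conn.executemany(
    "INSERT INTO events (user_id, payload) VALUES (?, ?)",
    [(i % 100, "x") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = ?"

# Without an index, the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]

conn.execute("CREATE INDEX idx_events_user_id ON events (user_id)")

# With the index, the planner switches to an index search.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]

print(plan_before)  # e.g. a table scan of events
print(plan_after)   # e.g. a search using idx_events_user_id
```

The same workflow (run `EXPLAIN`, add or adjust an index, re-check the plan) carries over to Snowflake or Redshift, though each engine exposes its own plan-inspection syntax.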

Preferred Experience: 

  • Production experience and certifications in core data platforms such as Snowflake, AWS, Azure, GCP, Hadoop, or Databricks.

  • Production experience working with cloud and distributed data storage technologies such as S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems.

  • Production experience working with data integration technologies such as Spark, Kafka, event/streaming platforms, StreamSets, Matillion, Fivetran, HVR, NiFi, AWS Database Migration Service, Azure Data Factory, or others.

  • Production experience working with workflow management and orchestration tools such as Airflow, Amazon Managed Workflows for Apache Airflow, Luigi, or NiFi.

  • Working experience with infrastructure as code using Terraform or CloudFormation.

  • Strong expertise in a scripting language to automate repetitive tasks (Python preferred).

  • Well versed in continuous integration and deployment frameworks, with hands-on experience using CI/CD tools such as Bitbucket, GitHub, Flyway, and Liquibase.

  • Bachelor's degree in Computer Science or a related field.
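The scripting bullet above is the kind of requirement best shown by example. Below is a small, hedged sketch of a typical housekeeping automation — finding stale files for cleanup; the directory layout and age threshold are assumptions, not anything specified by this role.

```python
import time
from pathlib import Path
from typing import List, Optional

SECONDS_PER_DAY = 86400

def find_stale_files(root: str, max_age_days: float,
                     now: Optional[float] = None) -> List[str]:
    """Return paths under `root` whose modification time is older than
    `max_age_days` days. A sketch of a repetitive task worth scripting;
    the threshold and layout are illustrative assumptions."""
    now = time.time() if now is None else now
    cutoff = now - max_age_days * SECONDS_PER_DAY
    stale = [
        str(path)
        for path in Path(root).rglob("*")
        if path.is_file() and path.stat().st_mtime < cutoff
    ]
    return sorted(stale)
```

In practice a script like this would be scheduled (cron, Airflow, or a CI job) and extended to archive or delete the matches rather than just list them.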

Perks and Benefits

  • Medical Insurance for Self & Family

  • Medical Insurance for Parents

  • Term Life & Personal Accident

  • Wellness Allowance

  • Broadband Reimbursement

  • Professional Development Allowance

  • Reimbursement of Skill Upgrade Certifications

  • Certification Bonus

#LI-DNI
