Solutions Architect (Big Data)
FEQ225R141
As a Solutions Architect (Big Data) in our Professional Services team, you will work with customers on short- to medium-term engagements addressing their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that involve integrating with client systems, training, and other technical tasks that help customers get the most value out of their data. RSAs are billable and deliver projects to specification with excellent customer service. You will report to the regional Manager/Lead.
The impact you will have:
Position, size, scope, and deliver migrations from on-prem and cloud source systems to Databricks
Handle a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides, and productionizing customer use cases
Work with engagement managers to scope a variety of professional services with input from the customer
Guide strategic customers as they implement transformational big data projects and third-party migrations, including the end-to-end design, build, and deployment of industry-leading big data and AI applications
Consult on architecture and design; bootstrap or implement customer projects, leading to the customer's successful understanding, evaluation, and adoption of Databricks
Provide an escalated level of support for customer operational issues
Collaborate with the Databricks Technical, Project Manager, Architect, and Customer teams to ensure the technical components of the engagement are delivered to meet the customer's needs
Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues
What we look for:
10+ years of experience in data engineering, data platforms, and analytics
8+ years of experience in a customer-facing role
Experience specifically in planning and executing complex migrations to the cloud
Experience leading and driving migration workshops, working hand in hand with partners and end customers
Experience working across a global partner ecosystem to leverage external expertise and tooling to migrate legacy data solutions to the cloud
Comfortable writing code in either Python or Scala (a minimal illustrative sketch follows this list)
Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one
Deep experience with distributed computing using Apache Spark™ and knowledge of Spark runtime internals
Familiarity with CI/CD for production deployments
Working knowledge of MLOps
Experience designing and deploying performant end-to-end data architectures
Experience with technical project delivery, including managing scope and timelines
Documentation and whiteboarding skills
Experience working with clients and managing conflicts
Willingness to build skills in technical areas that support the deployment and integration of Databricks-based solutions for customer projects
Ability to travel up to 30% when needed
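For illustration, a minimal PySpark sketch of the kind of cloud-migration work described above might look like the following; the storage path and table name (legacy_exports, main.sales.orders) are hypothetical placeholders rather than details from this posting.

    # Illustrative sketch only: stage a legacy export and land it as a Delta table on Databricks.
    # All paths and table names below are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("legacy-orders-migration").getOrCreate()

    # Read an on-prem export that has been staged in cloud object storage.
    orders = spark.read.parquet("/mnt/legacy_exports/orders/")

    # Light cleanup typical of a migration: normalize a column name and its type.
    orders_clean = (
        orders
        .withColumnRenamed("ORDER_TS", "order_ts")
        .withColumn("order_ts", F.to_timestamp("order_ts"))
    )

    # Write the result as a Delta table so downstream teams can query it.
    orders_clean.write.format("delta").mode("overwrite").saveAsTable("main.sales.orders")

The three-level table name assumes a Unity Catalog-enabled workspace; the catalog, schema, and storage locations would be adjusted to the customer's environment.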