Principal Solutions Architect
Required Experience:
8+ years as a hands-on Solutions Architect designing and implementing data solutions
2+ years of consulting leadership experience working with external customers, with the ability to multitask, prioritize, shift focus frequently, and work across a variety of projects
Programming expertise in Java, Python and/or Scala
Experience with core cloud data platforms, including Snowflake, AWS, Azure, Databricks, and/or GCP
SQL and the ability to write, debug, and optimize SQL queries
Demonstrated expertise leading and managing a team of Solution Architects and Data Engineers, fostering internal growth through coaching, mentoring, and performance management
Proven track record of collaborating with client stakeholders, technology partners, and cross-functional sales and delivery team members across distributed global teams to ensure successful project delivery
Ability to build strong cross-practice relationships that drive customer success
Strong sense of ownership in resolving challenges and a commitment to exceptional outcomes across all aspects of project execution
Ability to take end-to-end technical solutions into production while helping to ensure performance, security, scalability, and robust data integration
Client-facing written and verbal communication skills and experience
Create and deliver detailed presentations
Detailed solution documentation (e.g., POCs, roadmaps, sequence diagrams, class hierarchies, logical system views)
4-year Bachelor's degree in Computer Science or a related field
Prefer any of the following:
Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks
Cloud and distributed data storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
Data integration technologies: Spark, Kafka, event/streaming systems, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or other data integration technologies
Experience with multiple data sources (e.g., queues, relational databases, files, search, APIs)
Complete software development life cycle experience: including design, documentation, implementation, testing, and deployment
Automated data transformation and data curation: dbt, Spark, Spark Streaming, automated pipelines
Workflow management and orchestration: Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, NiFi
Methodologies: Agile Project Management, Data Modeling (e.g. Kimball, Data Vault)
Why phData? We offer:
Remote-First Work Environment
Casual, award-winning small-business work environment
Collaborative culture that prizes autonomy, creativity, and transparency
Competitive comp, excellent benefits, generous PTO plan plus 10 Holidays (and other cool perks)
Accelerated learning and professional development through advanced training and certifications
#LI-DNI