Data Architect - Platform

Enterra Solutions

Full-time · USA - East
Skills: python, supply chain, big data, hadoop, aws

The job listing has expired. Unfortunately, the hiring company is no longer accepting new applications.

To see similar active jobs, please follow this link: Remote Development jobs

More about Enterra Solutions, LLC:

Enterra leverages its history across government, commercial, and academic domains to help the world’s leading brands and organizations unlock growth and profit by delivering unique insights at unprecedented speeds and with verifiable accuracy. Our breakthrough Autonomous Decision Science® (ADS®) platform closes critical market gaps. By combining human-like reasoning with transparent mathematics and real-world optimization capabilities, business solutions built on our platform uncover previously unrealizable value across our clients’ value chain and orchestrate enterprise-wide optimization and decision-making. Our current business solution areas are focused within and across consumer insights, revenue growth, and supply chain. By combining our proprietary technology with our clients’ knowledge and practices, Enterra anticipates market changes systematically and at market speed, transforming clients into Autonomous Intelligent Enterprises.

What you will do: 

The successful candidate will join a diverse team to:

  • Build unique, high-impact business solutions using advanced technologies for use by world-class clients.

  • Design and maintain the underlying data architecture for the end-to-end solution offerings.

  • Design and maintain data structures for machine learning and other analytics.

  • Guide the data technology stack used to build Enterra’s solution offerings.

  • Combine machine learning, artificial intelligence (ontologies, inference engines, and rules), and natural language processing under a holistic vision to scale and transform businesses across multiple functions and processes.

Responsibilities Include:

  • Work with other Enterra personnel to develop and enhance commercial-quality solution offerings in the consumer goods and retail industry within the Revenue Growth capability area.

  • Understand complex business requirements and take the lead in proposing elegant, simplified enterprise information architecture solutions.

  • Design and facilitate enterprise information/data architecture for structured and unstructured data for use across multiple Enterra solution offerings and multiple clients. Note: there is no standard form of client data, so every client has variations.

  • Design and facilitate the architecture to assemble large, complex data sets to meet analytical requirements (analytics tables, feature engineering, etc.).

  • Specify logical data integration (ETL) strategies for data flows between disparate source/target systems (i.e., client systems) and the enterprise information repositories.

  • Design and facilitate optimal data pipeline architecture, incorporating data wrangling and Extract-Transform-Load (ETL) flows.

  • Design and implement data solutions using Master Data Management principles and tools.

  • Design and implement data governance and quality initiatives, ensuring consistent translation and usage of data.

  • Partner with leaders and team members across the business to design and institute practices that drive the appropriate levels of rigor and quality in enterprise information architecture.

  • Design, develop, and maintain controls on data quality, interoperability, and sources to effectively manage corporate risk.

  • Design in-depth data analysis, data modeling, and data design approaches for complicated datasets with potentially complex data integration scenarios.

  • Define processes for the effective, integrated introduction of new data.

  • Establish and contribute to standards for ensuring consistent usage of our information platforms.

  • Ensure speed of data delivery without compromising data quality measures.

  • Work with a team of Data Engineers to implement the designs.

  • Think through multiple alternatives and select the best possible solutions to meet tactical and strategic business needs.

  • Evaluate new technology for use within Enterra.

Requirements:

  • Master’s degree in Computer Science or another STEM (Science, Technology, Engineering, or Math) field is required.

  • Minimum of 5 years of hands-on experience in data architecture.

  • Experience in the consumer goods and retail industry, especially with sales data, is not critical but is strongly desired.

  • Minimum of 3 years of experience in an analytics/data science environment.

  • Minimum of 3 years of experience within a big data environment.

  • Demonstrable knowledge of data warehousing, business intelligence, and application data integration solutions.

  • Demonstrable experience developing applications and services that run on cloud infrastructure (Azure preferred; AWS or GCP).

  • Demonstrable experience with dimensional modeling techniques and the creation of logical and physical data models (entity relationship modeling, Erwin diagrams, etc.).

  • Excellent problem-solving and communication skills in English.

  • Ability to thrive in a fast-paced, remote environment.

  • Comfortable with ambiguity, with the ability to build structure and take a proactive approach to drive results.

  • Attention to detail – quality and accuracy in deliverables.

  • Strong interpersonal skills, including the ability to advocate for data management best practices and standards.

The following additional skills would be beneficial:

  • Knowledge of one or more of the following areas: Data Science, Machine Learning, Natural Language Processing, Business Intelligence, and Data Visualization.

  • Knowledge of statistics and experience using statistical or BI packages for analyzing large datasets (Excel, R, Python, Power BI, Tableau, etc.).

  • Experience with at least one of the following: Databricks, Spark, Hadoop, or Kafka.

About the job

Full-time · USA - East
4 applicants · Posted 8 months ago


Working Nomads curates remote digital jobs from around the web.

© 2025 Working Nomads.