
Data Modeler / Data Architect with Strong Snowflake and Production Hub Deployment Experience
Location: US Remote
Resume and Photo ID required: For the safety and security of our clients and their systems, candidates must submit a resume and a photo ID.
DESCRIPTION OF SERVICES
The client is seeking a Data Modeler/Data Architect to support the DataOps team within the Division of Data Management and Research (DMR).
This role is critical in building and expanding the client's Snowflake-based data hub ecosystem to improve the efficiency of data analysis, provisioning, and reporting across the agency.
The contractor will work with a team of about 50 employees and contractors dedicated to transforming data practices across multiple siloed systems.
Key Responsibilities:
Design and develop data models and data pipelines for current and upcoming data hubs.
Construct and deploy ETL processes and data transformation logic across various environments (DEV, TEST, PROD).
Work primarily with data sources from CBM (Coordinating Board Management) reports and FADS (Financial Aid Database System).
Deploy data hub and reference tables in Snowflake (a minimal deployment sketch follows this list).
Collaborate with DataOps team members and analysts for user acceptance testing (UAT) support, including data refreshes and post-UAT adjustments.
Maintain clear and thorough documentation in Azure DevOps.
Contribute to the evolution and scaling of the data hub infrastructure to support the client's broader research and analysis initiatives.
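To make the environment-promotion responsibility above concrete, here is a minimal sketch of deploying a reference table to one of the DEV/TEST/PROD environments using the snowflake-connector-python package. Everything specific in it is an illustrative assumption rather than a detail of this engagement: the database names, the HUB_REF schema, the ACADEMIC_TERM table, the warehouse, and the credentials are all invented placeholders.

```python
# Minimal sketch: promoting a reference table into one Snowflake
# environment. Assumes the snowflake-connector-python package and a
# database-per-environment convention; every name below is an
# illustrative placeholder, not a detail from this posting.
import snowflake.connector

# Hypothetical environment-to-database mapping (DEV/TEST/PROD).
ENV_DATABASES = {
    "DEV": "DATAHUB_DEV",
    "TEST": "DATAHUB_TEST",
    "PROD": "DATAHUB_PROD",
}

def deploy_reference_table(env: str) -> None:
    """Create (or replace) a small reference table in the target environment."""
    conn = snowflake.connector.connect(
        account="my_account",        # placeholder credentials
        user="my_user",
        password="my_password",
        warehouse="DATAOPS_WH",      # hypothetical warehouse
        database=ENV_DATABASES[env],
        schema="HUB_REF",            # hypothetical reference schema
    )
    try:
        cur = conn.cursor()
        # DDL for an invented academic-term reference table.
        cur.execute("""
            CREATE OR REPLACE TABLE ACADEMIC_TERM (
                TERM_CODE  VARCHAR(6)  PRIMARY KEY,
                TERM_NAME  VARCHAR(50) NOT NULL,
                START_DATE DATE,
                END_DATE   DATE
            )
        """)
        # Seed it with a parameterized INSERT (pyformat binding).
        cur.executemany(
            "INSERT INTO ACADEMIC_TERM (TERM_CODE, TERM_NAME, START_DATE, END_DATE) "
            "VALUES (%s, %s, %s, %s)",
            [
                ("202501", "Spring 2025", "2025-01-13", "2025-05-09"),
                ("202509", "Fall 2025", "2025-08-25", "2025-12-12"),
            ],
        )
    finally:
        conn.close()

if __name__ == "__main__":
    deploy_reference_table("DEV")  # rerun with "TEST"/"PROD" to promote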
Planned Data Hubs (FY 2025):
The DataOps team has already deployed several production hubs (Enrollment, Completion, SCH, Persistence) and plans to develop new ones, including:
Transfer
8th Grade Cohort
Faculty
Applications and Admissions
College Readiness/TSI
Labor Market Data
Preferred Qualifications:
Proven experience as a Data Modeler or Data Architect in data-intensive environments.
Strong Snowflake experience is required.
Proficiency in ETL development, data integration, and pipeline automation.
Solid understanding of data warehousing concepts and dimensional modeling (see the star schema sketch after this list).
Familiarity with Azure DevOps for documentation and project tracking.
Experience working in agile, cross-functional data teams.
Strong communication skills and ability to work independently in a remote or hybrid environment.
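As a reference point for the dimensional-modeling qualification above, here is a sketch of a star schema for an enrollment-style hub. The posting names the hubs but not their schemas, so every table and column below (DIM_STUDENT, DIM_INSTITUTION, FACT_ENROLLMENT, and their fields) is hypothetical; the sketch only illustrates the fact/dimension split the qualification refers to.

```python
# Star schema sketch for an enrollment-style hub. All table and
# column names are hypothetical illustrations, not the team's design.
from snowflake.connector import SnowflakeConnection

STAR_SCHEMA_DDL = [
    # Dimension: one row per student, conformed across hubs.
    """CREATE OR REPLACE TABLE DIM_STUDENT (
           STUDENT_KEY NUMBER AUTOINCREMENT PRIMARY KEY,
           STUDENT_ID  VARCHAR(20) NOT NULL,
           BIRTH_YEAR  NUMBER(4)
       )""",
    # Dimension: one row per reporting institution.
    """CREATE OR REPLACE TABLE DIM_INSTITUTION (
           INSTITUTION_KEY  NUMBER AUTOINCREMENT PRIMARY KEY,
           INSTITUTION_CODE VARCHAR(6) NOT NULL,
           INSTITUTION_NAME VARCHAR(100)
       )""",
    # Fact: one row per student, institution, and term.
    """CREATE OR REPLACE TABLE FACT_ENROLLMENT (
           STUDENT_KEY     NUMBER REFERENCES DIM_STUDENT (STUDENT_KEY),
           INSTITUTION_KEY NUMBER REFERENCES DIM_INSTITUTION (INSTITUTION_KEY),
           TERM_CODE       VARCHAR(6),
           CREDIT_HOURS    NUMBER(5, 1)
       )""",
]

def build_star_schema(conn: SnowflakeConnection) -> None:
    """Run each DDL statement against an already-open connection."""
    cur = conn.cursor()
    for ddl in STAR_SCHEMA_DDL:
        cur.execute(ddl)
```

Called with an open connection (the same placeholder connect() pattern as the deployment sketch above), this creates the three tables; in practice the hub's real grain and conformed dimensions would be driven by analysis of the CBM and FADS sources.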