
ETL Developer and Data Migration Specialist

Salary undisclosed


Job Description

Assignment: RQ08078 - Software Developer - ETL - Senior
Job Title: ETL Developer and Data Migration Specialist
Requisition (SS): RQ08078
Start Date: 2024-12-02
End Date: 2025-03-31
Client: Justice Technology Services
Office Location: Onsite / 595 Bay Street, Toronto
Organization: Justice Technology Services
Ministry: Ministry of Solicitor General
# Business Days: 102.00

Note: This position is currently listed as "Onsite"; however, the assignment under this request will provisionally be "Hybrid", working 7.25 hours per day between 8:00 AM and 5:00 PM (excluding breaks), Monday to Friday inclusive, unless otherwise identified. These conditions are subject to change as the OPS reviews its current situation. For the duration of the assignment, work arrangements will be at the discretion of the Hiring Manager for the project you are assigned to.

Must haves:

  • 5+ years of proven working experience in an ETL role; strong understanding of ETL principles, including data extraction, transformation, and loading processes; knowledge of common ETL design patterns. Understanding of data pipeline architectures, Azure workflow orchestration tools, and concepts related to data ingestion, transformation, and movement.
  • Proficiency in Azure Data Factory, Azure Synapse workspaces, PolyBase, including knowledge of pipeline creation, data flows, integration runtimes, triggers, and monitoring.
  • Strong SQL skills and experience working with Azure SQL Databases and Dataverse; good understanding of Azure storage concepts and technologies.
  • Proficiency in scripting languages like Python, experience with Azure-specific scripting using PowerShell or Azure CLI.
  • Expert proficiency with data manipulation languages (T-SQL, PL/SQL), data definition languages, physical database design, data modeling, query performance analysis & tuning
  • Knowledge of integration technologies commonly used with Dynamics, such as DataVerse / Common Data Service (CDS), Data Entities, and APIs.
  • Experience with continuous integration/continuous deployment (CI/CD) processes around DevOps, data workflows, Synapse workspaces.
  • Familiarity with data warehousing concepts and experience working with Azure Synapse Analytics or similar platforms for building and managing large-scale data warehousing/lakehouse solutions.
  • Experience with SSIS, SSRS, PowerBI
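The ETL fundamentals named in the first bullet above (extraction, transformation, loading) can be illustrated with a minimal sketch. The record shapes, field names, and in-memory source/target here are hypothetical, chosen only to show the three stages; a real pipeline would use Azure Data Factory or similar tooling.

```python
# Minimal ETL sketch: extract raw records, transform them to the target
# schema, load the results. All data and field names are illustrative.

def extract(source):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from source

def transform(record):
    """Transform: normalize keys and cast values for the target schema."""
    return {
        "client_id": int(record["ClientID"]),
        "name": record["Name"].strip().title(),
    }

def load(records, target):
    """Load: append transformed records to the target store."""
    target.extend(records)

legacy_rows = [{"ClientID": "101", "Name": "  jane DOE "}]
warehouse = []
load((transform(r) for r in extract(legacy_rows)), warehouse)
# warehouse now holds [{"client_id": 101, "name": "Jane Doe"}]
```

Real pipelines differ mainly in scale and orchestration, but the extract/transform/load decomposition stays the same.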

Nice to have:

  • Azure cloud certifications (e.g., Azure Fundamentals, Azure Data Engineer Associate, Azure Database Administrator Associate)
  • Experience with PowerApps platform, Power Automate, Dynamics CE and F&O

Description

Scope

  • The Office of the Public Guardian and Trustee (OPGT) requires a Senior Software Developer ETL to perform data migration activities from the existing OPGT legacy applications to the new Dynamics 365 OPGT solution and data warehouse/lakehouse.

Assignment Deliverables

As a member of the data migration team, you will be responsible to migrate the data from the existing OPGT legacy applications to the new Dynamics 365 OPGT solution and the new data warehouse/lakehouse. A high-level list of deliverables for the data migration team follows:

  • Data Analysis: analyze the existing data in the legacy applications, understand its structure, quality, and relationships and help in designing an appropriate migration strategy;
  • Data Mapping and Transformation: map the data elements from the legacy application to the corresponding entities and fields in Dynamics 365 CE, F&O and Data Lakehouse. Handle necessary data transformations, ensuring compatibility and consistency between the legacy data and the target system;
  • Data Extraction: help extract the required data from the legacy application, develop and implement extraction processes to retrieve data from various sources, such as databases, files, APIs, or other relevant legacy systems;
  • Data Cleansing and Validation: cleanse and validate the extracted data to ensure its accuracy, completeness, and consistency. Help with identifying and resolving data quality issues, performing deduplication, and applying business rules to ensure the integrity of the migrated data;
  • Data Migration Strategy and Execution: review the present migration strategy that outlines the overall approach, sequence, and timeline for migrating the data from the legacy application to Dynamics 365 using a delta-load approach; execute the migration plan efficiently, managing data transfers and ensuring minimal disruption to ongoing operations;
  • Data Testing and Quality Assurance: conduct thorough testing to verify the accuracy and integrity of the migrated data; define test cases, perform data reconciliation, and address any issues or discrepancies that arise during the testing phase; develop KPIs to report on the progress, completeness and quality of the data migration effort;
  • Documentation: document the entire data migration process, including data mapping rules, transformation logic, migration scripts, and any specific configurations;
  • Ongoing Support: provide post-migration support, analyze and address data-related issues or questions; help optimize data management processes in the new environment;
  • Data Lakehouse: help expand the current data lakehouse implementation and help build a gold layer for reporting; conduct data analysis and profiling; develop data transformation and cleansing pipelines; implement data integration and harmonization;
  • Other duties as assigned;
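The deduplication and delta-load steps described in the deliverables above can be sketched roughly as follows. The key field, record shapes, and prior-run state are assumptions for illustration; the actual OPGT entities and keys are not specified in this posting.

```python
# Sketch of deduplication plus a delta load: only records that are new or
# changed since the previous run are migrated. Keys/shapes are illustrative.

def deduplicate(records, key):
    """Keep the last occurrence of each key (later extracts win)."""
    seen = {}
    for rec in records:
        seen[rec[key]] = rec
    return list(seen.values())

def delta(records, already_loaded, key):
    """Return only records that are new or differ from the loaded copy."""
    return [r for r in records if already_loaded.get(r[key]) != r]

extracted = [
    {"case_id": 1, "status": "open"},
    {"case_id": 1, "status": "closed"},  # duplicate key: later row wins
    {"case_id": 2, "status": "open"},
]
loaded = {2: {"case_id": 2, "status": "open"}}  # state from the previous run

clean = deduplicate(extracted, "case_id")
to_migrate = delta(clean, loaded, "case_id")
# Only case 1 needs migrating: case 2 is already loaded and unchanged.
```

A delta-load approach like this keeps repeated migration runs cheap and minimizes disruption to ongoing operations, which is why the strategy above calls for it.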

The Vendor's Personnel will also be required to:

  • Complete work and achieve milestones within the assigned deadlines;
  • Notify the Cluster/Ministry Project Manager in writing of any issues or other material concerns related to the Assignment Deliverables as soon as they become aware of them;
  • Submit Deliverables for the Cluster/Ministry approval as they are completed.
  • Comply with the Ontario Government and the Cluster/Ministry security procedures and practices
  • Comply with the Ontario Government and the Cluster/Ministry architecture/technology standards and best practices
  • Comply with the Ontario Government and the Cluster/Ministry Conflict of Interest and Confidentiality Guidelines
  • Provide knowledge and skill transfer to a designated Cluster/Ministry staff; and
  • Comply with the Ontario Government I&IT Directive, Operational Policy on the I&IT Project Gateway Process, and other applicable Guidelines, Standards and Procedures.

A Note on the VOR Master Service Agreement:

  • The VOR Master Service Agreement, which expires on April 5, 2025, leaves some contracts with funding unassigned for fiscal 2025-26. If the current statement of work expires on March 31, 2025, the remaining funds can be used to exercise an option to extend the SOW beyond March 31, 2025, subject to business case approvals. Such extensions will be allowable only if the Master Service Agreement is extended beyond April 5, 2025, and will be upon the same terms, conditions, and covenants contained in the SOW.
  • The start date is subject to change based on security clearances and contract signing timelines.

Experience and Skillset Requirements

Desired Skills and Experience

  • 5+ years of proven working experience in an ETL role; strong understanding of ETL principles, including data extraction, transformation, and loading processes; knowledge of common ETL design patterns. Understanding of data pipeline architectures, Azure workflow orchestration tools, and concepts related to data ingestion, transformation, and movement.
  • Experience in integrating various data sources and systems, both on-premises and in the cloud, using Azure ETL services or other ETL tools
  • Knowledge of integration technologies commonly used with Dynamics, such as DataVerse / Common Data Service (CDS), Data Entities, and APIs.
  • Expertise in data transformation techniques, such as data cleansing, aggregation, enrichment, and normalization using Azure cloud technologies
  • Understanding of data quality management practices, including data profiling, data validation, and error handling within ETL processes.
  • Understanding of data governance principles, data privacy regulations and experience working with high-sensitivity data, and knowledge of best practices for data security and compliance in Azure.
  • Ability to monitor and troubleshoot ETL processes, optimize query performance, and implement efficient data processing techniques in Azure.
  • Proficiency in Azure Data Factory, Azure Synapse workspaces, PolyBase, including knowledge of pipeline creation, data flows, integration runtimes, triggers, and monitoring.
  • Strong SQL skills and experience working with Azure SQL Databases and Dataverse; good understanding of Azure storage concepts and technologies.
  • Proficiency in scripting languages like Python, and experience with Azure-specific scripting using PowerShell or Azure CLI.
  • Expert in data manipulation languages (T-SQL, PL/SQL), data definition languages, physical database design, data modelling, query performance analysis & tuning
  • Familiarity with version control systems (e.g., Azure Repos) and collaboration tools (e.g., Azure DevOps) for managing code, tracking changes, and collaborating with team members.
  • Experience with continuous integration/continuous deployment (CI/CD) processes around DevOps, data workflows, Synapse workspaces.
  • Experience with SQL Server Management Studio, Azure data management tools, XRM toolbox, data modeling tools (preferably, ERWIN).
  • Familiarity with data warehousing concepts and experience working with Azure Synapse Analytics or similar platforms for building and managing large-scale data warehousing/lakehouse solutions.
  • Experience with SSIS, SSRS, PowerBI

Resume Evaluation Criteria:

Criteria 1: Data Migration, ETL - 40 Points

  • Demonstrated experience with ETL development, data pipelines, workflow orchestration and data ingestion, transformation, and movement
  • Demonstrated experience in integrating various data sources and systems, both on-premises and in the cloud, using Azure ETL services or other ETL tools
  • Demonstrated experience working with Azure Data Factory, Azure Synapse workspaces, PolyBase, including knowledge of pipeline creation, data flows, integration runtimes, triggers, and monitoring.
  • Demonstrated experience with data manipulation languages (T-SQL, PL/SQL), data definition languages, query performance analysis & tuning
  • Demonstrated experience with SQL Server, Oracle, Azure SQL Databases
  • Demonstrated experience with data modeling tools (preferably, ERWIN)
  • Demonstrated experience in scripting languages like Python and with Azure-specific scripting using PowerShell or Azure CLI.
  • Experience with software development lifecycle
  • Experience with data modeling, physical database design, data flow diagrams

Criteria 2: Data Warehouse and Reporting - 20 Points

  • Demonstrated experience working with Azure Synapse Analytics or similar platforms for building and managing large-scale data warehousing/lakehouse solutions
  • Experience with data warehousing modelling concepts such as star and snowflake schemas
  • Experience with SSIS, SSRS, PowerBI
  • Experience with supporting a data warehouse in a production environment
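The star schema named in the criteria above keeps numeric measures in a fact table keyed to small, descriptive dimension tables. This toy sketch joins a fact table to one dimension and aggregates, the kind of query a reporting ("gold") layer serves; all table contents are invented for illustration.

```python
# Tiny star-schema sketch: a fact table of measures joined to a dimension
# table via a surrogate key. Table contents are invented for illustration.

dim_office = {                      # dimension: descriptive attributes
    1: {"office": "Toronto"},
    2: {"office": "Ottawa"},
}
fact_cases = [                      # fact: measures plus foreign keys
    {"office_key": 1, "cases_opened": 12},
    {"office_key": 1, "cases_opened": 5},
    {"office_key": 2, "cases_opened": 7},
]

# Resolve each fact row's key against the dimension and aggregate per office.
totals = {}
for row in fact_cases:
    office = dim_office[row["office_key"]]["office"]
    totals[office] = totals.get(office, 0) + row["cases_opened"]
```

A snowflake schema is the same idea with dimensions further normalized into sub-dimensions; the fact-to-dimension join pattern is unchanged.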

Criteria 3: Azure Platform - 20 Points

  • Experience with Azure Data Factory (ADF) and Synapse Workspaces
  • Demonstrated experience with Azure data management tools, DevOps, Synapse Studio
  • Experience in Azure resource configuration and administration such as Azure Data Lake, Blob Storage, Key Vault, Application Insight resources, resource groups and subscriptions.
  • Familiar with Azure cloud platform
  • Azure cloud certifications

Criteria 4: Dynamics 365 - 10 Points

  • Demonstrated experience working with integration technologies commonly used with Dynamics, such as DataVerse / Common Data Service (CDS), Data Entities, and APIs.
  • Demonstrated experience with PowerApps platform, Power Automate, Dynamics CE and F&O

Criteria 5: DevOps and CI/CD - 10 Points

  • Demonstrated experience with continuous integration/continuous deployment (CI/CD) tools and processes around DevOps, data workflows, Synapse workspaces.