Senior Business Intelligence Specialist - ETL Developer
Salary undisclosed
We are looking for a Senior Business Intelligence Specialist for our client in Toronto, ON.
Job Title: Senior Business Intelligence Specialist
Job Type: Contract
Job Description:
Responsibilities:
- Design, develop, and implement a data ingestion pipeline from an Oracle source to Azure Data Lake and Databricks, covering both the initial load and incremental ETL.
- Use Oracle GoldenGate (knowledge and experience are an asset) for data ingestion and Change Data Capture (currently in the final stages of a proof of concept).
- Use Azure Data Factory (good knowledge required) to orchestrate pipeline execution.
- Use Azure Databricks/PySpark (expert Python/PySpark knowledge required) to build transformations of raw (bronze) data into the curated (silver) and data mart (gold) zones (see the illustrative sketch after this list).
- Use PowerDesigner (asset) to read and maintain data models.
- Review requirements, source data tables, and relationships to identify optimal data models and transformations.
- Review the existing on-premises design to produce the target design and migration steps.
- Design data ingestion mechanisms and transformations to update the Delta Lake zones (bronze, silver, and gold), using GoldenGate for CDC.
- Work with the IT partner on GoldenGate configuration; responsible for providing direction.
- Prepare design artifacts and process diagrams; understand and update dimensional data models and source-to-target mapping (STTM) documents.
- Analyze data and map the physical model from the data source to the data mart model.
- Understand data requirements and recommend changes to the data model.
- Develop scripts to build the physical model and create the schema structure.
- Access Oracle and SQL Server environments; use SSIS and other development tools to analyze the legacy solution to be migrated.
- Proactively communicate with leads on any changes required to conceptual, logical, and physical models; communicate and review dependencies and risks.
- Develop ETL strategy and solutions for different sets of data modules.
- Create physical-level design documents and unit test cases.
- Develop Databricks notebooks and deployment packages for incremental and full loads.
- Develop test plans and perform unit testing of pipelines and scripts.
- Assess data quality and conduct data profiling.
- Troubleshoot performance and ETL load issues; check log activity for each individual package and transformation.
- Participate in Go Live planning and production deployment, and create production deployment steps and packages.
- Create design and release documentation.
- Provide Go Live support and review after Go Live.
- Review existing ETL processes and tools, and provide recommendations for improving performance and reducing ETL timelines.
- Review infrastructure and any performance issues for overall process improvement.
- Conduct knowledge transfer to Ministry staff and develop documentation on the work completed.
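For illustration only, the following minimal PySpark/Delta sketch shows the kind of bronze-to-silver incremental load these responsibilities describe. All paths, table names, and column names (e.g. orders, order_id) are hypothetical placeholders and are not taken from this posting.

```python
# Minimal sketch of a bronze -> silver incremental load in Databricks/PySpark.
# Paths, table names, and columns below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Read the latest CDC batch landed in the bronze zone (e.g. by GoldenGate/ADF).
bronze_df = (
    spark.read.format("delta")
    .load("/mnt/datalake/bronze/orders")               # hypothetical bronze path
    .filter(F.col("_ingest_date") == F.current_date())  # hypothetical batch filter column
)

# Light cleansing and conforming before promotion to the curated (silver) zone.
curated_df = (
    bronze_df
    .dropDuplicates(["order_id", "op_timestamp"])        # hypothetical keys
    .withColumn("order_amount", F.col("order_amount").cast("decimal(18,2)"))
)

# Upsert the batch into the silver Delta table keyed on the business key.
silver = DeltaTable.forPath(spark, "/mnt/datalake/silver/orders")
(
    silver.alias("t")
    .merge(curated_df.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

An initial full load would typically overwrite the silver table rather than merge into it; the merge pattern above is what an ongoing incremental (CDC) refresh might look like.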
Requirements:
- 7+ years of experience in data warehousing and ETL development.
- 3+ years of experience with Databricks, Azure Data Factory, Python/PySpark, SQL Server, SSIS, and T-SQL development.
- Experience building data ingestion and change data capture using Oracle GoldenGate.
- Experience building databases, data warehouses, and data marts, and working with incremental and full loads.
- Experience with ETL tools such as Azure Data Factory and SQL Server Integration Services.
- Experience working with MS SQL Server and other RDBMSs (Oracle, PL/SQL).
- Experience in dimensional data modeling and modeling tools (e.g., PowerDesigner).
- Experience with snowflake and star schema models; experience designing data warehouse solutions using slowly changing dimensions.
- Experience with Delta Lake concepts and Medallion architecture (bronze/silver/gold).
- Understanding of data warehouse architecture, dimensional data, and fact models.
- Experience analyzing, designing, developing, testing, and documenting ETL from detailed and high-level specifications, and assisting in troubleshooting.
- Ability to use SQL for tasks beyond data transformation (DDL, complex queries).
- Good knowledge of database and Delta Lake performance optimization techniques.
- Experience working in an Agile environment, using DevOps tools for user stories, code repository, test plans and defect tracking.
- Ability to assist with requirements analysis and design specifications.
- Work closely with Designers, Business Analysts and other Developers.
- Liaise with Project Managers, Quality Assurance Analysts and Business Intelligence Consultants.
- Demonstrated experience in creating both Functional Design Documents (FDD) & Detailed Design Documents (DDD).
- Experience in Fit-Gap analysis, system use case reviews, requirements reviews, coding exercises and reviews.
- Experience developing and maintaining a plan to address contract deliverables by identifying significant milestones and expected results, with weekly status reporting.
- Work with the client and assigned developer(s) to refine and confirm business requirements.
- Participate in defect fixing, testing support, and development activities for the ETL tool; assist with defect fixing and testing support for Power BI reports.
- Analyze and document solution complexity and interdependencies by function, including providing support for data validation.
- Demonstrated experience in Microsoft-specific software development, with several years of practical experience (minimum 7+ years overall).
- Proven experience in developing in Azure DevOps.
- Experience in application mapping to populate Delta Lake and dimensional data mart schemas.
- Demonstrated experience in Extract, Transform & Load (ETL) and Extract, Load & Transform (ELT) software development, with several years of practical experience (minimum 7+ years).
- Experience providing ongoing support for Azure pipelines/configuration and SSIS development.
- Experience building data ingestion and change data capture using GoldenGate.
- Assist in the development of predefined and ad hoc reports, meeting coding and accessibility requirements.
- Demonstrated experience with Oracle and Microsoft interfaces.
- Proficient in SQL and Azure DevOps.
- Experience implementing logical and physical data models.
- The Developer must have previous experience conducting knowledge transfer and training sessions, ensuring that resources receive the knowledge required to support the system.
- The resource must develop learning activities using the review-watch-do methodology and demonstrate the ability to prepare and present them.
- Develop documentation and materials as part of review and knowledge transfer to other team members.
- Develop and facilitate classroom-based or virtual instructor-led demo sessions for developers.
- Monitor identified milestones and submit status reports to ensure knowledge transfer is fully completed.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.