
ADF Engineer

Salary undisclosed


ADF ETL Engineer
Contract to Hire

100% Remote (must work 9 am-5 pm Hawaii Standard Time)

Responsibilities:
5+ years of Data engineering experience with a focus on Data Warehousing
2+ years of experience creating pipelines in Azure Data Factory (ADF)
5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools.
5+ years of experience with Relational Databases, such as Oracle, Snowflake, SQL Server, etc.
3+ years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL
2+ years of experience with GitHub, SVN, or similar source control systems
2+ years of experience processing structured and unstructured data.
Experience with HL7 and FHIR standards, and processing files in these formats.
3+ years analyzing project requirements and developing detailed specifications for ETL requirements.
Excellent problem-solving and analytical skills, with the ability to troubleshoot and optimize data pipelines.
Ability to adapt to evolving technologies and changing business requirements.
Bachelor's or advanced degree in a related field, such as Information Technology/Computer Science, Mathematics/Statistics, Analytics, or Business
Requirements:
5+ years of Data engineering experience with a focus on Data Warehousing
2+ years of experience creating pipelines in Azure Data Factory (ADF)
5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools.
Preferred skills:
2+ years of batch or PowerShell scripting
2+ years of experience with Python scripting.
3+ years of data modeling experience in a data warehouse environment
Experience or familiarity with Informatica Intelligent Cloud Services (specifically Data Integration)
Experience designing and building APIs in Snowflake and ADF (e.g. REST, RPC)
Experience with State Medicaid / Medicare / Healthcare applications
Azure certifications related to data engineering or data analytics.

Project:
Optum is currently seeking a hands-on Senior Data Engineer to support our HI EDW within the Enterprise Data Warehouse and Analytics group. The Data Engineer will work with large healthcare datasets and translate the client's business requirements into enterprise systems, applications, or process designs for large, complex health data solutions. The role will drive and support initiatives for the HI EDW and participate in the wider EDW group's areas of data usage and governance, information management, privacy and security, SOA, data analytics and visualization, and information modeling.
Ideal background:
Strong technical experience in ADF and Snowflake
Top skills needed:
5+ years of Data engineering experience with a focus on Data Warehousing
2+ years of experience creating pipelines in Azure Data Factory (ADF)
5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools.
What experience will set candidates apart from one another?
Strong technical experience in ADF and Snowflake
Team:
1 Project Manager, 1 Data Architect, 1 Data Engineering Lead, 2 Data Engineers
