Ab Initio Developer (W2 Only)
Job Title: Ab Initio Developer (Hadoop, Hive, AWS)
Location: Remote
Must Have
- Current health insurance customer experience
- Licenses/Certifications
- AWS Certified Big Data - Specialty (Must Have)
- Cloudera Certified Developer for Apache Hadoop (CCDH) (Must Have)
- OCP Java SE 6 Programmer Certification (Good to have)
Ab Initio Developer (Hadoop, Hive, AWS)
Seeking a lead-level Ab Initio Developer in the Hadoop and AWS ecosystem! The selected candidate will be responsible for orchestrating, deploying, maintaining, and scaling cloud or on-premise infrastructure targeting big data and platform data management (relational and NoSQL, distributed and converged), with an emphasis on reliability, automation, and performance. This role will focus on leading the development of solutions and helping transform the company's platforms to deliver data-driven, meaningful insights and value to the company.
ESSENTIAL FUNCTIONS:
- 20% Leads the team to design, configure, implement, monitor, and manage all aspects of the Data Integration Framework. Defines and develops Data Integration best practices for the data management environment to ensure optimal performance and reliability.
- 20% Develops and maintains infrastructure systems (e.g., data warehouses, data lakes), including data access APIs. Prepares and manipulates data using Hadoop or an equivalent MapReduce platform.
- 15% Provides detailed guidance and performs work related to modeling data warehouse solutions in the cloud or on-premise. Understands dimensional modeling, de-normalized data structures, OLAP, and data warehousing concepts.
- 15% Oversees the delivery of data engineering initiatives and projects. Supports long-term data initiatives as well as ad-hoc analysis and ELT/ETL activities. Creates data collection frameworks for structured and unstructured data. Applies data extraction, transformation, and loading techniques to connect large data sets from a variety of sources.
- 15% Enforces the implementation of best practices for data auditing, scalability, reliability, and application performance. Develops and applies data extraction, transformation, and loading techniques to connect large data sets from a variety of sources.
- 10% Interprets data, analyzes results using statistical techniques, and provides ongoing reports. Executes quantitative analyses that translate data into actionable insights. Provides analytical and data-driven decision-making support for key projects. Designs, manages, and conducts quality control procedures for data sets using data from multiple systems.
- 5% Improves data delivery engineering job knowledge by attending educational workshops; reviewing professional publications; establishing personal networks; benchmarking state-of-the-art practices; participating in professional societies.
Required Skills
- BS degree required
- 10+ years of experience leading database design and ETL development, specifically with Ab Initio. Experience leading data engineering and cross-functional teams to implement scalable, fine-tuned ETL/ELT solutions (batch and streaming) for optimal performance. Experience developing and updating ETL/ELT scripts.
- Hands-on experience with Ab Initio ETL development in the Cloudera, Hadoop, Hive, and AWS ecosystem, as well as relational database layout, development, and data modeling.
- 7+ years of hands-on experience as a Big Data Engineer in the Hadoop and AWS ecosystem within the healthcare industry; BCBS experience preferred.
- Hands-on experience developing applications for batch data loads and data streaming using Cloudera/Hadoop and/or AWS technologies.
- Healthcare payor industry experience is a big plus
Licenses/Certifications
- AWS Certified Big Data - Specialty (Must Have)
- Cloudera Certified Developer for Apache Hadoop (CCDH) (Must Have)
- OCP Java SE 6 Programmer Certification (Good to have)