
ETL Data Engineer

Salary undisclosed



NOTES:

Strong ETL, dbt, Snowflake, and SQL skills.

Must have reinsurance/insurance or P&C experience.

The ETL Data Engineer is primarily responsible for transforming replicated source data into data products that are easily consumed by the Reporting team and other downstream users. The role requires expertise in data modeling and data pipeline development within an analytical, column-oriented data store, and reports to the Data Engineering & ETL Lead.

Responsibilities & Accountabilities:

  • Translate requirements and data mapping documents into a technical design.
  • Develop SQL using the Data Build Tool (dbt) framework.
  • Apply Python knowledge to dbt macros and data pipelines.
  • Debug and troubleshoot issues found during testing or in production.
  • Communicate project status, issues, and blockers to the team.
  • Contribute to continuous improvement by identifying and addressing improvement opportunities.
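As a rough illustration of the dbt-based SQL development described above, a staging model that reshapes replicated source data might look like the following sketch (all table and column names are hypothetical, not taken from this posting):

```sql
-- models/staging/stg_policies.sql
-- Hypothetical dbt staging model: trims and types replicated source
-- data into a shape downstream reporting can consume directly.
-- Source, table, and column names are illustrative only.

{{ config(materialized='view') }}

select
    policy_id,
    upper(trim(line_of_business))          as line_of_business,
    cast(effective_date as date)           as effective_date,
    cast(written_premium as numeric(18,2)) as written_premium
from {{ source('replication', 'raw_policies') }}
where policy_id is not null
```

`{{ config() }}`, `{{ source() }}`, and view materializations are standard dbt constructs; the model only runs inside a configured dbt project.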

Ideal Candidate Profile:

  • Bachelor’s degree in Computer Science or equivalent required.
  • Minimum of 5 years of experience in ETL development within a data warehouse.
  • Minimum of 5 years working with data modeling methodologies such as Kimball dimensional modeling, Data Vault, or the Medallion Architecture.
  • Strong familiarity with the Data Build Tool (dbt).
  • Expertise in column-oriented SQL modeling.
  • Experience in the P&C Insurance or Financial Services industry is a plus.
  • Strong understanding of test-driven development applied to enterprise data.
  • Highly comfortable with Git and change management best practices.
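The test-driven development point above can be sketched with a dbt singular test: a SQL query that passes only when it returns zero rows (the model and column names here are hypothetical):

```sql
-- tests/assert_no_negative_premium.sql
-- Hypothetical dbt singular test: any returned row is a failure.
-- Guards a business rule: written premium is never negative.
select
    policy_id,
    written_premium
from {{ ref('stg_policies') }}
where written_premium < 0
```

Singular tests live in a dbt project's `tests/` directory and run via `dbt test`, which lets data assertions be written before or alongside the models they check.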
