Job Description
Data Engineer
About the Role:
The ETL Data Engineer is primarily responsible for transforming replicated source data into data products that the Reporting team and other downstream users can easily consume. The role requires expertise in Data Modeling and Data Pipeline development within an analytical, column-oriented data store.
Responsibilities & Accountabilities:
Translate requirements and data mapping documents into a technical design.
SQL development using the dbt (data build tool) framework (an illustrative model sketch follows this list).
Python and Jinja knowledge for dbt macros and data pipelines.
Debug and troubleshoot issues found during testing or production.
Communicate project status, issues, and blockers with the team.
Contribute to continuous improvement by identifying and addressing opportunities.
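For context, day-to-day dbt work in this role typically involves SQL transformation models along the lines of the sketch below. The model and table names (fct_claims, stg_claims, stg_policies) are hypothetical and are shown only to illustrate the style of work, not an actual project artifact.

-- models/marts/fct_claims.sql  (hypothetical model name)
-- Joins staged claims to staged policies and materializes the result as a table.
{{ config(materialized='table') }}

select
    c.claim_id,
    c.policy_id,
    p.line_of_business,
    c.claim_amount,
    c.reported_date
from {{ ref('stg_claims') }} as c
join {{ ref('stg_policies') }} as p
    on c.policy_id = p.policy_id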
Ideal Candidate Profile:
Bachelor's degree in computer science or equivalent required.
Minimum of 5 years of experience in ETL development within a Data Warehouse.
Minimum of 5 years working with data modeling approaches such as Kimball dimensional modeling, Data Vault, or the Medallion Architecture.
Strong familiarity with dbt (data build tool).
Expertise in column-oriented SQL modeling.
Experience in P&C Insurance or Financial Services Industry is a plus.
Strong understanding of test-driven development applied to Enterprise Data (see the test sketch after this list).
Highly comfortable with Git and change management best practices.
Ability to adapt to change in a rapidly growing, early-stage company environment. Associates are expected to be curious, thrifty, and resourceful in managing through the unknowns of growing a specialty (re)insurance business from the ground up.
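As a rough illustration of test-driven development in dbt, a singular test is a SQL file that selects rows violating an assertion; the test passes when the query returns zero rows. The file and column names below are hypothetical and follow on from the earlier model sketch.

-- tests/assert_claim_amount_non_negative.sql  (hypothetical test name)
-- Returns any rows that violate the assertion; dbt marks the test
-- as failed if this query returns one or more rows.
select
    claim_id,
    claim_amount
from {{ ref('fct_claims') }}
where claim_amount < 0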