Job Description:
- Analyze and assess new databases/datasets from data share partners.
- Create business rules and functional requirements necessary to set up ETL processes to ingest new and recurring datasets.
- Ensure the quality and accuracy of DSP (data share partner) datasets. This includes:
  - Creating business logic and processes to QA new and recurring data loads.
  - Maintaining data standards and structure by producing validation and quality assurance reports to ensure data quality and consistency.
- Identify, analyze, and interpret trends or patterns in data sets.
- Interpret data and develop recommendations based on findings.
- Interface with the company's clients and third parties as needed to obtain and interpret data.
- Perform other tasks on projects as needed/assigned by management.
- Handle any ad-hoc requests related to DSP data.
- Create business rules and functional requirements necessary to modify existing DSP processes, as well as any new processes proposed by the data team.
- Support products that depend on Property and Casualty data.
- Address and resolve any related data issues.
- Work with developers to support the team.
Qualifications:
- BS/BA in Information Systems, Statistics, Mathematics, Business, Economics or other related discipline, or equivalent combination of education and experience.
- Ability to understand and write SQL scripts against any modern relational database (Oracle, SQL Server, PostgreSQL, MySQL).
- Able to identify patterns and trends in large data sets using descriptive and inferential statistics.
- Familiarity with property and casualty insurance data and general knowledge of insurance industry preferred.
- Experience with data mining techniques and procedures, and the ability to apply them appropriately.
- Well-established understanding of various graphical and textual data presentation formats and their appropriate uses to convey information to our users and clients.
- Ability to create functional specifications.
- Experience working with BI tools (Domo, MicroStrategy, etc.).
- Extensive experience using statistical tools (preferably R) to summarize and analyze large data sets and present findings.