Data Modeler
Pointwest is looking for a visionary Data Modeler to serve as the technical lead for our modern data initiatives. This is a role for a high-impact practitioner who has successfully navigated the transition from traditional Data Warehousing to modern Data Lakehouse architectures.
Driven by our core values of Leadership, Excellence, and Innovation, you will design the blueprints that transform raw data into a strategic corporate asset. You will embody the Pointwest culture of Agility, Accountability, and Customer Centricity by ensuring the structural health of our data ecosystem and optimizing cloud performance to deliver world-class insights.
Key Result Areas and Key Performance Indicators
A. Key Tasks
Delighting Our Customers & Stakeholders
- Architecting Excellence: Design and implement conceptual, logical, and physical data models that support both structured SQL analytics and advanced Data Science workloads.
- Medallion Strategy: Lead the implementation of Bronze, Silver, and Gold data layers to provide stakeholders with curated, high-quality data assets.
- Reliability First: Act as the first line of defense against Garbage In, Garbage Out by designing automated validation scripts and performing root cause analysis on discrepancies.
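The Medallion layering and validation responsibilities above can be sketched with plain in-memory Python (a real implementation would run on a Lakehouse engine; all table names and sample rows here are hypothetical):

```python
# Minimal sketch of a Bronze -> Silver -> Gold medallion flow with a
# validation step. Data and names are illustrative only.
from collections import defaultdict

# Bronze: raw records landed as-is, including duplicates and bad rows.
bronze = [
    {"order_id": "1001", "amount": "250.00", "region": "NCR"},
    {"order_id": "1001", "amount": "250.00", "region": "NCR"},        # duplicate
    {"order_id": "1002", "amount": "not-a-number", "region": "NCR"},  # bad row
    {"order_id": "1003", "amount": "80.50", "region": "Visayas"},
]

def to_silver(rows):
    """Silver: deduplicate and enforce types, quarantining invalid rows."""
    seen, silver, rejects = set(), [], []
    for row in rows:
        if row["order_id"] in seen:
            continue                      # drop exact-key duplicates
        try:
            amount = float(row["amount"])
        except ValueError:
            rejects.append(row)           # GIGO defense: bad rows never flow downstream
            continue
        seen.add(row["order_id"])
        silver.append({"order_id": row["order_id"], "amount": amount,
                       "region": row["region"]})
    return silver, rejects

def to_gold(rows):
    """Gold: curated aggregate ready for reporting (revenue per region)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

silver, rejects = to_silver(bronze)
gold = to_gold(silver)
```

The quarantined `rejects` list is what feeds the discrepancy tickets and root cause analysis mentioned above.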
Growing Our Business
- Schema Evolution: Manage the lifecycle of data schemas to ensure the environment remains scalable and flexible as the business integrates new data sources.
- Technical Leadership: Serve as the bridge between complex business logic and technical execution, ensuring all data movements are documented and transparent.
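One common schema-evolution pattern the role would govern is backward-compatible reads, sketched here in plain Python (the column name and defaults are hypothetical):

```python
# Toy sketch of backward-compatible schema evolution: old records that lack a
# newly added column are read through a versioned default, so every consumer
# sees the current schema. Names and data are illustrative only.

SCHEMA_DEFAULTS = {
    "channel": "unknown",  # column added in schema v2; v1 rows don't have it
}

old_rows = [{"order_id": 1, "amount": 99.0}]                    # written under v1
new_rows = [{"order_id": 2, "amount": 42.0, "channel": "web"}]  # written under v2

def read(rows):
    """Applies defaults so downstream code can assume the latest schema."""
    return [{**SCHEMA_DEFAULTS, **row} for row in rows]

unified = read(old_rows + new_rows)
```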
Improving the Way We Work
- Performance Optimization: Collaborate with engineers to optimize partitioning, clustering, and indexing (e.g., Z-Ordering) to reduce cloud compute costs and improve speed.
- Standardization: Enforce enterprise data standards and naming conventions to prevent the creation of Data Swamps and ensure a unified logical model.
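The intuition behind the partitioning work above is partition pruning: a query filtered on the partition key only touches the relevant bucket. A toy in-memory sketch (the column names and data are hypothetical, standing in for what an engine like Snowflake or Databricks does physically):

```python
# Toy sketch of partition pruning: records are stored per partition key, so a
# filtered query scans only one bucket instead of the whole dataset.
from collections import defaultdict

partitions = defaultdict(list)

def write(record):
    # Physical design choice: partition by event_date, the most common filter.
    partitions[record["event_date"]].append(record)

for rec in [
    {"event_date": "2024-06-01", "sku": "A", "qty": 3},
    {"event_date": "2024-06-01", "sku": "B", "qty": 1},
    {"event_date": "2024-06-02", "sku": "A", "qty": 5},
]:
    write(rec)

def query(event_date):
    """Reads only the matching partition; other dates are never touched."""
    scanned = partitions[event_date]
    return sum(r["qty"] for r in scanned), len(scanned)

total, rows_scanned = query("2024-06-01")
```

Fewer rows scanned per query is exactly what shows up as reduced compute credits on a cloud warehouse.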
Developing Myself and Others
- Multi-Model Expertise: Apply and share knowledge of various modeling techniques, including Dimensional/Star Schema, 3NF, and Data Vault, to solve diverse business challenges.
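As a flavor of the dimensional/star-schema technique named above, here is a minimal sketch: a fact table holding measures and surrogate keys, joined to dimensions at query time (all tables, keys, and values are hypothetical):

```python
# Minimal star-schema sketch: facts carry measures plus surrogate keys;
# dimensions carry descriptive attributes. Data is illustrative only.

dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gizmo",  "category": "Hardware"},
}
dim_customer = {
    10: {"name": "Acme Corp", "segment": "Enterprise"},
}

fact_sales = [
    {"product_key": 1, "customer_key": 10, "amount": 500.0},
    {"product_key": 2, "customer_key": 10, "amount": 120.0},
]

def sales_by_category():
    """Typical star-schema query: join facts to a dimension, aggregate a measure."""
    totals = {}
    for row in fact_sales:
        category = dim_product[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals
```

A 3NF or Data Vault model would split these structures further; the star shape trades some redundancy for simpler, faster analytical queries.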
B. Key Metrics
Delighting Our Customers & Stakeholders
- Data Trust Score: Maintain a high Data Trust Score (>95%) through successful automated validation and rapid resolution of discrepancy tickets.
- Model Adoption: Target 80% or more of production reports being sourced from curated Gold Layer models rather than ad-hoc tables.
Growing Our Business
- Architectural Integrity: Achieve a measurable reduction in Schema Debt by minimizing deprecated or redundant tables over time.
- Mapping Accuracy: Achieve a logic rework rate of less than 10% during the development phase.
Improving the Way We Work
- Compute Efficiency: Deliver a 15-20% reduction in cloud compute costs (Snowflake/Databricks credits) through smart physical design and refactoring.
- Documentation Coverage: Maintain 100% documentation coverage for Gold-layer assets, including full Source-to-Target Mappings (STTM).
Developing Myself and Others
- Deployment Velocity: Ensure Data Engineers can move from model design to production with minimal clarification sessions, reflecting the clarity of your technical guidance.
Qualifications
- Experience: 5-7 years of dedicated Data Modeling experience within Data Warehouse or Lakehouse environments.
- Proven Track Record: Successfully delivered at least 5 distinct data projects from initial requirements through to production.
- Technical Mastery: Expert-level proficiency in Kimball/Star Schema and 3NF; hands-on experience with Medallion Architecture.
- Cloud Platforms: Technical proficiency in Snowflake, Databricks, or Google BigQuery.
- Tooling: Experience with industry-standard tools such as erwin, SQLDBM, SAP PowerDesigner, or dbt.
- Education: Bachelor's degree in Computer Science, Information Systems, or a related quantitative field.