
Deltek

Principal Data Engineer

5-7 Years

Job Description

Responsibilities:

We're seeking a Principal Data Engineer to join our Enterprise Data & Intelligence team during an exciting transformation from a legacy SQL Server enterprise data warehouse to Snowflake, our enterprise-wide intelligent data platform. You'll play a critical role in building and maintaining the data pipelines that power today's analytics while laying the groundwork for a unified, trusted data foundation: self-service analytics, semantic layers, and AI/ML capabilities.

This position requires someone who combines strong technical skills with a deep understanding of data warehousing fundamentals and the ability to translate complex technical concepts for diverse stakeholders. You'll work on everything from pipeline development to transformation logic in dbt, ensuring our intelligent data platform delivers reliable, well-structured data that empowers both traditional reporting and advanced analytics use cases.

  • Design, develop, and maintain data pipelines that feed our Snowflake intelligent data platform using Python, Fivetran, and other modern ETL/ELT tools
  • Build and maintain transformation logic in dbt, converting SQL Server stored procedures to modular, testable dbt models on Snowflake
  • Develop dimensional data models following best practices, creating the foundation for semantic layers and self-service analytics
  • Translate complex SQL Server stored procedures and SSIS packages from our legacy EDW database to dbt models and cloud-native solutions
  • Implement proper grain definition, ensure additivity of facts, and maintain dimensional consistency across data marts to support both BI and future ML workloads
  • Create dbt tests, documentation, and lineage to ensure data quality and transparency
  • Partner with analytics teams, business stakeholders, and subject matter experts across the enterprise to understand requirements, validate business logic, and translate both into robust technical solutions that scale with the platform's intelligence capabilities
  • Develop and enforce data quality standards that ensure our platform remains trustworthy for both human and AI consumers
  • Contribute to our migration strategy from a traditional EDW to an intelligent data platform architecture through analytical problem-solving and strategic thinking
  • Containerize data applications using Docker and deploy to Kubernetes environments
  • Facilitate knowledge transfer sessions and documentation to ensure the entire team understands new patterns and approaches
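The stored-procedure-to-dbt conversion described above typically replaces a nightly upsert procedure with an incremental dbt model. A minimal sketch, with hypothetical model and column names:

```sql
-- models/marts/fct_orders.sql (hypothetical)
-- Incremental materialization: dbt merges only new or changed rows on
-- each run instead of rebuilding the table, mirroring a nightly upsert.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    o.order_id,
    o.customer_id,
    o.order_date,
    o.order_total,
    o.updated_at
from {{ ref('stg_orders') }} as o
{% if is_incremental() %}
  -- On incremental runs, process only rows newer than what is already loaded
  where o.updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

Unlike an opaque stored procedure, the model is declarative SQL that dbt can test, document, and track in its lineage graph.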

Qualifications:

  • Strong SQL expertise with the ability to write complex, optimized queries and understand execution plans
  • Snowflake experience, including understanding of virtual warehouses, micro-partitions, and cloud data warehouse best practices
  • dbt proficiency, including:
      • Building modular, reusable data models
      • Writing custom tests and macros
      • Understanding materializations (tables, views, incremental models)
      • Documentation and lineage best practices
      • Converting stored procedure logic to dbt models
  • Python proficiency for data engineering tasks (pandas, SQLAlchemy, pipeline orchestration)
  • Deep understanding of dimensional modeling and data warehousing fundamentals, including:
      • Fact and dimension table design patterns
      • Slowly changing dimensions (Types 1, 2, and 3)
      • Grain declaration and consistency
      • Additive and semi-additive measures
      • Conformed dimensions and bus matrix design
  • Experience building production data pipelines with proper error handling, logging, and monitoring
  • Understanding of modern data stack concepts (ELT vs ETL, data lakehouse, streaming vs batch)
  • Version control proficiency with Git
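For the Type 2 slowly changing dimensions mentioned above, dbt's built-in snapshot feature is one common implementation: when a tracked source row changes, the current version is closed out and a new version is inserted, with dbt-managed `dbt_valid_from` / `dbt_valid_to` columns recording effective dates. A minimal sketch, with hypothetical table and column names:

```sql
-- snapshots/customers.sql (hypothetical)
{% snapshot customer_snapshot %}
{{
    config(
      target_schema='snapshots',
      unique_key='customer_id',
      strategy='timestamp',
      updated_at='updated_at'
    )
}}
-- Each time updated_at advances for a customer_id, dbt ends the
-- current version row and inserts a new one (SCD Type 2).
select customer_id, customer_name, customer_tier, updated_at
from {{ ref('stg_customers') }}
{% endsnapshot %}
```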

Professional Skills

  • Strong analytical mindset with the ability to break down complex problems and evaluate multiple solution approaches
  • Clear communication skills to articulate technical decisions, trade-offs, and recommendations to both technical and non-technical audiences
  • Demonstrated ability to work effectively across teams and business units, building consensus around enterprise data standards and driving solutions forward
  • Experience mentoring others and sharing knowledge to elevate team capabilities

Preferred Qualifications

  • Experience with Fivetran or comparable data-movement tools
  • SQL Server and SSIS background to aid in the stored-procedure-to-dbt migration
  • Container orchestration with Docker and Kubernetes
  • Azure DevOps and CI/CD pipeline development
  • Building and maintaining deployment pipelines for dbt projects
  • Automated testing and validation of data transformations
  • DataOps practices and principles
  • Understanding of data governance and data quality frameworks
  • Experience with federated BI approaches and semantic layer design
  • Familiarity with Power BI
  • Knowledge of availability groups, failover clusters, or similar HA/DR concepts

Job ID: 138935649