Job Role: Senior Data Engineer
Job Location: Metro Manila
Experience: 8+ Years
Role Overview:
Seeking an experienced Senior Data Engineer responsible for designing, developing, and optimizing scalable data warehousing and data platform solutions. The role involves end-to-end data architecture, pipeline development, platform migration, and cross-functional collaboration to support analytics and business decision-making. Ideal for candidates with strong Snowflake, Python, and cloud engineering expertise.
Key Responsibilities:

Data Architecture & Engineering
- Design and implement scalable, high-performance data architectures using Snowflake, dbt, and Python.
- Build and maintain data warehouse layers, including foundation layers and data marts.
- Develop robust ETL/ELT pipelines for extracting, transforming, and loading data from diverse sources.
- Optimize data models and implement Medallion Architecture best practices.
- Manage data ingestion, storage, transformations, and delivery to analytics teams.
- Perform performance tuning, caching, and workload optimization across the data ecosystem.
Data Platform Management
- Lead the migration of on-prem or legacy data platforms to Snowflake with minimal disruption.
- Manage the full data lifecycle within Snowflake, from ingestion to analytics.
- Troubleshoot and resolve environment-level issues and ensure high system availability.
- Automate manual processes and re-architect pipelines for scalability and efficiency.
Cross-Functional Collaboration
- Work closely with product, analytics, design, and engineering teams to understand data needs.
- Support stakeholders by resolving data issues and ensuring data quality and integrity.
- Collaborate with technical and non-technical teams, translating business requirements into data solutions.
Innovation & Continuous Improvement
- Explore emerging technologies such as Big Data, Machine Learning, Generative AI, and Predictive Modeling.
- Identify opportunities to improve data quality, processes, and system performance.
- Maintain coding standards, documentation, and best practices.
Skills & Requirements:

Experience & Technical Expertise
- 8+ years of professional experience in Data Engineering.
- Strong programming skills, with hands-on Python expertise (must-have).
- Deep knowledge of Snowflake and experience designing cloud-based data solutions.
- Experience with dbt, SQL, and distributed data processing.
- Strong understanding of ETL/ELT concepts and pipeline development across databases, APIs, external data providers, and streaming sources.
- Proficiency with AWS, Azure, and/or GCP.
- Experience building and maintaining REST APIs for data workflows.
- Solid understanding of distributed systems, scalability concepts, and fault-tolerant architecture.
- Strong proficiency in both relational (SQL Server, Oracle, PostgreSQL, MySQL) and NoSQL databases.
- Strong analytical and root-cause analysis skills for structured and unstructured datasets.
Soft Skills & Core Competencies
- Excellent communication skills, especially with non-technical stakeholders.
- Ability to explain insights and business narratives derived from data.
- Strong adherence to coding standards, governance, and data quality best practices.
- Demonstrated ability to evaluate and work with AI/ML and emerging technologies.
- Detail-oriented, proactive, and capable of working independently in a remote environment.