**Note:** This position is open to applicants based in the Philippines only.
Job Scope:
The Senior Data Engineer is a key contributor to building and maintaining robust, secure data infrastructure for internal and external clients. The role combines deep technical expertise, architectural knowledge, and AI/ML engineering skills to design scalable data systems for client engagements and to optimize internal data platforms, with a strong emphasis on data governance, quality assurance, and privacy protection. The engineer collaborates closely with client teams and internal stakeholders, develops comprehensive data architectures, and serves as a subject matter expert in delivering secure, compliant data engineering solutions across client projects and internal systems.
Key Responsibilities:
Client-Facing Delivery
- Lead the design, development, and implementation of scalable data pipelines and ETL/ELT processes for client engagements, delivering business intelligence and advanced analytics solutions with robust data quality controls
- Collaborate directly with external clients to understand their data requirements and translate business needs into secure, compliant technical data engineering solutions
- Ensure all client data solutions adhere to strict data governance frameworks and privacy protection standards
Internal Infrastructure & Data Management
- Build internal data infrastructure from scratch or integrate with existing client systems, including secure data warehouses and data lakes that support organizational operations and analytics capabilities
- Design and implement comprehensive data models for both client projects and internal systems, ensuring optimal performance, scalability, and data integrity
- Develop and maintain data ingestion processes and APIs that seamlessly integrate data from multiple sources while preserving data quality and security
Data Governance & Security
- Establish and enforce comprehensive data privacy protocols and data governance frameworks to ensure compliance with regulatory requirements and protect sensitive client data
- Implement automated data quality monitoring systems to proactively identify and resolve data integrity issues across all data pipelines
- Design and maintain data lineage tracking and audit trails to ensure full transparency and accountability in data processing workflows
- Apply security best practices including data encryption, access controls, and secure data handling procedures throughout all data engineering processes
Technology Integration & Innovation
- Integrate AI/ML technologies (e.g., automated data processing, intelligent data validation, predictive data quality monitoring, and ML-powered analytics pipelines) into client deliverables and internal workflows
- Leverage cloud-based platforms including Snowflake and Databricks to build scalable, cost-effective, and secure data solutions
- Stay current with emerging data engineering technologies and data governance best practices to enhance both client outcomes and internal capabilities
Qualifications:
- Bachelor's degree in Physics, Engineering, Computer Science, Information Systems, Mathematics, or a related technical field
- At least 3 years of experience in data engineering, including proven success building data pipelines for business intelligence and advanced analytics
- Strong business acumen with experience working directly with external clients and internal cross-functional teams
- Demonstrated track record of delivering complex data engineering projects that drive measurable business impact while maintaining data security and quality standards
- Excellent communication skills, both verbal and written, with the ability to present technical concepts to non-technical stakeholders and clients
- Experience with data privacy regulations and governance frameworks is essential
- Experience in both client-facing and internal environments is a plus
- Accounting, finance, or manufacturing domain knowledge is a plus
Technical and Other Skills:
Core Technical Proficiency
- Expert-level proficiency in SQL for complex data extraction, transformation, and modeling
- Advanced programming skills in Python for data pipeline development and automation
- Experience with data orchestration and transformation tools (e.g., Apache Airflow, dbt) and workflow management systems
- Skilled in API development and integration, including RESTful APIs and data ingestion from various sources
Cloud & Data Platform Expertise
- Extensive experience with cloud data platforms, specifically Snowflake and Databricks, for building enterprise-scale data solutions
- Strong expertise in data modeling techniques and tools for designing efficient database schemas and data warehouse architectures
- Deep understanding of ETL/ELT processes, data warehousing concepts, and modern data architecture patterns
- Understanding of ERP and accounting systems (e.g., QuickBooks, Xero, NetSuite, SAP) and experience integrating them into data pipelines
Infrastructure & Development Operations
- Familiarity with containerization technologies (Docker, Kubernetes) and cloud services (AWS, Azure, GCP)
- Familiarity with version control systems (Git) and CI/CD practices for data engineering workflows
- Experience with Agile/Scrum methodologies and project collaboration tools
Data Governance & Security
- Understanding of data governance frameworks, data privacy regulations (GDPR, CCPA), and security best practices for handling sensitive financial data
- Experience implementing data quality frameworks, data lineage tracking, and automated data validation processes
- Knowledge of data encryption, access control mechanisms, and secure data handling procedures