The Data Engineering Center of Excellence has been established in the DPM team to help realize the vision of becoming a customer-centric organization, driven by a data and analytics capability that enhances customer interactions and revenue generation.
The Big Data Engineer is responsible for the development and automation of Data Lake ingestion, transformation, and consumption services; adopting new technology; and ensuring modern operations to deliver consumer-driven Data Lake solutions in both on-premises and Cloud platform implementations.
The role:
- Implement requests for the ingestion, creation, and preparation of data sources
- Develop and execute jobs to import data periodically or in (near) real time from external sources
- Set up streaming data sources to ingest data into the platform
- Deliver the data sourcing approach and data sets for analysis, including data staging, ETL, data quality, and archiving
- Design solution architectures on both on-premises and Cloud platforms to meet business, technical, and user requirements
- Profile source data and validate that it is fit for purpose
- Work with the Delivery Lead and Solution Architect to agree on pragmatic means of data provision to support use cases
- Understand and document end-user usage models and requirements
Minimum Qualifications:
Candidates must have at least 3 years of experience in the following areas:
- Data warehouses and data lakes
- Big data technologies
- Analytical and SQL skills
- Databricks/Azure Cloud tools (nice to have)
- Python/Scala programming
Candidates must also be willing to work fully onsite in Makati City.