Accenture in the Philippines

Databricks Unified Data Analytics Platform Engineer

Job Description

Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.

Responsibilities:

  • Design and build complex pipelines using Delta Lake, Auto Loader, and Delta Live Tables (DLT), and deploy them with Databricks Asset Bundles (see the DLT sketch after this list).
  • Proven experience as a Data Architect and Data Engineer leading enterprise-scale Lakehouse initiatives.
  • Expert-level understanding of modern Data & Analytics Architecture patterns including Data Mesh, Data Products, and Lakehouse Architecture.
  • Excellent programming and debugging skills in Python.
  • Strong experience with PySpark for building scalable and modular ETL/ELT pipelines.
  • Architect data ingestion and transformation using DLT Expectations, modular Databricks Functions, and reusable pipeline components.
  • Must have hands-on expertise in at least one major cloud platform: AWS, GCP, or Azure.
  • Lead implementation of Unity Catalog: create catalogs, schemas, role-based access policies, lineage visibility, and data classification tagging (PII, PHI, etc.), as sketched after this list.
  • Guide organization-wide governance via Unity Catalog setup: workspace linkage, SSO, audit logging, external locations, and Volume access.
  • Enable cross-platform data access using Lakehouse Federation, querying live from externally hosted databases (see the federation sketch after this list).
  • Leverage and integrate Databricks Marketplace to consume high-quality third-party data and publish internal data assets securely.
  • Experience with cloud-based services relevant to data engineering, data storage, data processing, data warehousing, real-time streaming, and serverless computing.
  • Govern and manage Delta Sharing for securely sharing datasets with external partners or across tenants (see the sharing sketch after this list).
  • Design and maintain PII anonymization, tokenization, and masking strategies using Databricks functions and Unity Catalog policies to meet GDPR/HIPAA compliance (see the masking sketch after this list).
  • Architect Power BI, Tableau, and Looker integration with Databricks for live reporting and visualization over governed datasets.
  • Build Databricks SQL Dashboards to enable stakeholders with real-time insights, KPI tracking, and alerts.
  • Hands-on experience applying performance optimization techniques.
  • Lead cross-functional initiatives across data science, analytics, and platform teams to deliver secure, scalable, and value-aligned data products.
  • Provide thought leadership on adopting advanced features like Mosaic AI, Vector Search, Model Serving, and Databricks Marketplace publishing.
  • Strong background in data modeling and data warehousing concepts is required.
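
To make the Auto Loader/DLT bullet concrete, here is a minimal Delta Live Tables sketch with quality expectations. It runs only inside a Databricks DLT pipeline; the landing path, table names, and expectation rules are hypothetical examples, not part of the posting.

```python
# Minimal Delta Live Tables sketch: Auto Loader ingestion plus a quality gate.
# Runs inside a Databricks DLT pipeline; paths and names are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested incrementally with Auto Loader.")
def orders_raw():
    return (
        spark.readStream.format("cloudFiles")     # Auto Loader source
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/landing/orders/")    # hypothetical landing path
    )

@dlt.table(comment="Validated orders; bad rows are dropped by expectations.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
@dlt.expect_or_drop("positive_amount", "amount > 0")
def orders_clean():
    return dlt.read_stream("orders_raw").withColumn(
        "ingested_at", F.current_timestamp()
    )
```

A pipeline like this would typically be versioned and deployed through a Databricks Asset Bundle rather than created by hand in the workspace.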
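
The Unity Catalog bullet bundles several setup steps; the sketch below shows the core of them as SQL issued from a notebook, where `spark` is predefined. The catalog, schema, table, group, and tag names are all hypothetical.

```python
# Unity Catalog governance sketch; all object and group names are hypothetical.
statements = [
    "CREATE CATALOG IF NOT EXISTS finance",
    "CREATE SCHEMA IF NOT EXISTS finance.payments",
    # Role-based access: read-only rights for an account-level group.
    "GRANT USE CATALOG ON CATALOG finance TO `data-analysts`",
    "GRANT USE SCHEMA, SELECT ON SCHEMA finance.payments TO `data-analysts`",
    # Data classification: tag a column as PII for discovery and policy.
    "ALTER TABLE finance.payments.cards "
    "ALTER COLUMN card_number SET TAGS ('classification' = 'PII')",
]
for stmt in statements:
    spark.sql(stmt)
```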
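
Lakehouse Federation registers an external database as a foreign catalog so it can be queried live under Unity Catalog governance. A sketch, assuming a PostgreSQL source and hypothetical connection details:

```python
# Lakehouse Federation sketch: query an external Postgres database live.
# Host, secret scope, and catalog names are hypothetical.
spark.sql("""
    CREATE CONNECTION IF NOT EXISTS pg_orders TYPE postgresql
    OPTIONS (
      host 'pg.example.internal', port '5432',
      user secret('jdbc', 'user'), password secret('jdbc', 'password')
    )
""")
spark.sql("""
    CREATE FOREIGN CATALOG IF NOT EXISTS orders_ext
    USING CONNECTION pg_orders OPTIONS (database 'orders')
""")
# External tables are now queryable like any governed table:
spark.sql("SELECT * FROM orders_ext.public.orders LIMIT 10").show()
```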
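
Delta Sharing, from the governance bullets above, is driven by a small set of SQL objects. A minimal sketch with hypothetical share, table, and recipient names:

```python
# Delta Sharing sketch: expose a governed table to an external recipient.
# Share, table, and recipient names are hypothetical.
spark.sql("CREATE SHARE IF NOT EXISTS partner_share")
spark.sql("ALTER SHARE partner_share ADD TABLE finance.payments.settlements")
# Token-based (open protocol) recipient; Databricks-to-Databricks sharing
# would instead reference the recipient's metastore sharing identifier.
spark.sql("CREATE RECIPIENT IF NOT EXISTS acme_corp")
spark.sql("GRANT SELECT ON SHARE partner_share TO RECIPIENT acme_corp")
```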
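
For the PII masking bullet, Unity Catalog column masks pair a SQL function with a column. A sketch, assuming a hypothetical cards table and a `pii-readers` account group:

```python
# Column-mask sketch: non-members of 'pii-readers' see only the last 4 digits.
# Table, column, and group names are hypothetical.
spark.sql("""
    CREATE OR REPLACE FUNCTION finance.payments.mask_card(card STRING)
    RETURN CASE
      WHEN is_account_group_member('pii-readers') THEN card
      ELSE concat('****-****-****-', right(card, 4))
    END
""")
spark.sql("""
    ALTER TABLE finance.payments.cards
    ALTER COLUMN card_number SET MASK finance.payments.mask_card
""")
```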

Good to Have:

  • Certifications: Databricks Certified Professional or similar.
  • Machine Learning: Knowledge of machine learning concepts and experience with popular ML libraries.
  • Knowledge of big data processing frameworks (e.g., Spark, Hadoop, Hive, Kafka).
  • Data Orchestration: Apache Airflow (see the Airflow sketch after this list).
  • Knowledge of CI/CD pipelines and DevOps practices in a cloud environment.
  • Experience with ETL tools like Informatica, Talend, Matillion, or Fivetran.
  • Familiarity with dbt (Data Build Tool).
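
As a sketch of the Airflow item above: the Databricks provider package ships operators that trigger Databricks jobs from a DAG. The job ID, connection name, and schedule below are hypothetical.

```python
# Airflow DAG sketch: trigger an existing Databricks job nightly.
# Requires apache-airflow-providers-databricks; names and IDs are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksRunNowOperator,
)

with DAG(
    dag_id="nightly_lakehouse_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # 02:00 daily (Airflow 2.4+ keyword)
    catchup=False,
) as dag:
    DatabricksRunNowOperator(
        task_id="run_databricks_job",
        databricks_conn_id="databricks_default",  # connection to the workspace
        job_id=123456,  # hypothetical Databricks job ID
    )
```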

More Info

Job ID: 143363103