

Network Data Engineer

  • Posted 22 hours ago
  • Be among the first 10 applicants

Job Description

At Globe, our goal is to create a wonderful world for our people, business, and nation. By uniting people of passion who believe they can make a difference, we are confident that we can achieve this goal.


The Network Data Engineer is responsible for the following:
1. Scalable Pipeline Architecture & Orchestration: Design and deploy end-to-end ETL/ELT frameworks that automate the ingestion and transformation of network telemetry into the NTG Data Mart, ensuring high-availability, low-latency data flow.
2. Strategic Data Modeling & Schema Design: Translate complex domain requirements into high-performance dimensional models (Star/Snowflake) and technical designs, optimizing the repository for both storage efficiency and rapid analytical querying by the Data Analyst.
3. Automated Data Quality & Observability: Implement Data Quality as Code by building automated validation and cleansing scripts, while proactively monitoring pipeline health to ensure ingested data is curated, accurate, and anomaly-free.
4. Data Governance, Security, & Lineage Deployment: Enforce rigorous security protocols and access controls while maintaining comprehensive metadata documentation, ensuring the data is not only secure and compliant but also fully transparent and discoverable for experts.
5. Operational Excellence & Stakeholder Enablement: Maintain and optimize the data infrastructure to meet Service Level Agreements (SLAs), providing the Data Analyst with reliable, analysis-ready datasets that bridge the gap between raw network logs and strategic business insights.


Duties and Responsibilities

  • End-to-End Pipeline Orchestration: Own the full lifecycle of data by architecting and operating automated ELT/ETL pipelines. This includes designing the ingestion logic and managing the daily orchestration to ensure the NTG Data Mart is populated with high-fidelity, low-latency network data.

  • Collaborative Schema & Dimensional Modeling: Partner with domain experts to translate business intuition into scalable technical data models. You are responsible for both the initial design and the iterative refactoring of tables to ensure the data remains optimized for the Data Analyst's evolving analytical needs.

  • Automated Validation & Observability: Build Data Quality as Code by integrating automated testing and monitoring into the pipeline. You are responsible for keeping pipelines idempotent and data accurate, proactively identifying and fixing broken data before it impacts organizational decision-making.

  • Secure Governance & Data Transparency: Enforce rigorous security, IAM, and compliance standards while maintaining an up-to-date Data Catalog. You ensure that while the data is locked down from a security perspective, it is wide open from a documentation perspective so experts can self-serve with confidence.

  • Standardization & Data Enablement: Act as the technical bridge for the Data Analyst by peer-reviewing expert-led analyses and converting high-value one-off queries into standardized, production-grade features. Your success is measured by the Analyst's ability to work autonomously on top of your infrastructure.

KPIs

  • Data Freshness SLA (Reliability)

  • Data Integrity Score (Quality)

  • Pipeline Success Rate (Stability)

  • Mean Time to Insight (MTTI) (Velocity)

  • Catalog Coverage & Discoverability (Enablement)

Top 3-5 Deliverables

  • Automated Data Pipelines: Production-grade ETL/ELT workflows that ensure the continuous, scheduled ingestion of network data into the NTG Data Mart.

  • Optimized Data Models: High-performance, analysis-ready schemas (Star/Snowflake) that simplify complex network telemetry for the Data Analyst.

  • Data Observability Suite: Integrated monitoring and Data Quality as Code frameworks to track pipeline health, freshness, and accuracy in real-time.

  • Live Data Catalog: A centralized, documented repository of all table definitions and lineage, enabling the Data Analyst to self-serve with confidence.

  • Standardized Metric Library: A curated set of pre-calculated, production-grade business features ensuring a Single Source of Truth across all analytical reports.

Equal Opportunity Employer
Globe's hiring process promotes equal opportunity for applicants. No form of discrimination is tolerated at any stage of the employee lifecycle, including hiring activities such as posting vacancies, selecting candidates, and interviewing applicants.

Globe's Diversity, Equity and Inclusion Policy Commitment can be accessed

Make Your Passion Part of Your Profession. Attracting the best and brightest Talents is pivotal to our success. If you are ready to share our purpose of Creating a Globe of Good, explore opportunities with us.

More Info


Job ID: 145789915