Nityo Infotech

Power BI Developer / Data Engineer (Mid–Senior)

  • Posted 20 hours ago

Job Description

Senior Power BI Developer

Location: Pasay City (MOA Area)

Work Setup: 4 Days Onsite

Salary: PHP 100,000 – PHP 120,000

Job Description

We are looking for a highly skilled Senior Power BI Developer who will be responsible for designing and developing enterprise-level dashboards, reports, and analytical datasets. The role requires close collaboration with business stakeholders and technical teams to deliver scalable and high-performing BI solutions.

Key Responsibilities

Collaborate with the project team to design and build data models within Databricks SQL

Create optimized external tables for serving data to BI tools

Develop and maintain analytical datasets for enterprise reporting and self-service BI

Design and develop Power BI reports and dashboards

Create compelling data visualizations using Power BI for enterprise analytics

Enable business users to extract data through self-service extracts

Ensure high performance of reports and dashboards through proper data modeling

Automate the scheduling and publishing of Power BI reports

Monitor and optimize Power BI datasets and dataflows for performance

Work closely with business stakeholders to gather reporting requirements

Document data models, metrics, and KPI definitions

Work with and maintain SSIS packages

Requirements

Strong experience with Power BI development and dashboard creation

Experience working with Databricks SQL

Hands-on experience with SSIS packages

Strong understanding of data modeling and BI reporting best practices

Experience delivering enterprise-level analytics and reporting solutions

Strong communication and stakeholder management skills

IT Senior Data Engineer

Location: Taguig City

Work Setup: Hybrid (3 Days Onsite, 2 Days Work From Home)

Salary: Up to PHP 160,000

Work Schedule: 9:00 AM – 6:00 PM (subject to change upon onboarding)

Job Description

We are seeking an experienced IT Senior Data Engineer to design, develop, and maintain scalable data pipelines and cloud-based data platforms. The ideal candidate should have strong experience in data engineering frameworks, cloud platforms, and modern data processing technologies.

Key Responsibilities

Design and develop scalable data pipelines and workflows

Build and maintain cloud-based data infrastructure

Develop data ingestion and transformation processes using Python and SQL

Work closely with analytics and business teams to support data requirements

Monitor and optimize data pipelines for performance and reliability

Implement best practices in data architecture and engineering

Requirements

Minimum of 8 years of Data Engineering or Data Science experience

Strong proficiency in Python (at least 4 out of 5 level proficiency)

Working knowledge of SQL (at least 3 out of 5 level proficiency)

At least 4 years of experience with Databricks (Snowflake experience may also be considered)

Hands-on experience working with Microsoft Azure cloud services

Experience working with large-scale data processing environments

Open to undergraduates or diploma holders, provided the qualifications above are met

Senior Data Engineer

Location: Quezon City

Work Setup: Hybrid (3 Days Onsite)

Salary: Up to PHP 150,000

Work Schedule: 9:00 AM – 6:00 PM (must be flexible for on-call support, with compensation)

Job Description

We are looking for a Senior Data Engineer to design and implement scalable data pipelines and manage modern data platforms supporting analytics and reporting.

Key Responsibilities

Design, develop, and maintain data pipelines and ETL workflows

Use Azure Data Factory (ADF) for data orchestration and integration

Develop data transformations using DBT (Data Build Tool)

Manage and optimize Snowflake data warehouse environments

Ensure performance, scalability, and reliability of data pipelines

Support analytics and reporting teams with high-quality data solutions

Requirements

5 to 7 years of relevant experience in Data Engineering

Strong hands-on experience with Azure Data Factory (ADF)

Experience working with Snowflake data warehouse

Experience with DBT (Data Build Tool)

Strong knowledge of cloud-based data platforms and data warehouse architecture

Good communication and collaboration skills

Data Engineer

Location: Mega Tower, Mandaluyong

Work Setup: Hybrid (3 Days Onsite, 2 Days Work From Home)

Salary: PHP 60,000 – PHP 70,000 (Basic)

Shift Schedule: Mid Shift (3:00 PM – 12:00 MN)

Job Description

We are looking for a Data Engineer to support the development and maintenance of data pipelines and data warehouse solutions. The role will collaborate closely with engineering and analytics teams to deliver reliable and scalable data systems.

Key Responsibilities

Develop and maintain ETL pipelines and data integration workflows

Support the development of data warehouse solutions

Integrate data from multiple sources such as RDBMS, data lakes, and cloud platforms

Ensure performance, reliability, and quality of data pipelines

Collaborate with analytics teams to support reporting and business insights

Requirements

Bachelor's degree in Computer Science, Information Technology, or related field

Minimum of 3 years of experience in Data Engineering, Data Warehousing, or Database Management

Strong understanding of data warehousing concepts

Experience working with data warehouse solutions such as Redshift, BigQuery, or Snowflake

Experience connecting to various data sources including SQL databases, data lakes, and cloud infrastructure

Strong problem-solving and analytical skills

Data Engineer – Business Intelligence

Location: Mega Tower, Mandaluyong

Work Setup: Hybrid (3 Days Onsite, 2 Days Work From Home)

Salary: PHP 70,000 – PHP 80,000 (Basic)

Shift Schedule: Mid Shift (3:00 PM – 12:00 MN)

Job Description

We are seeking a Data Engineer – Business Intelligence to develop and maintain data pipelines and support business intelligence solutions using modern cloud data platforms.

Key Responsibilities

Develop and maintain data pipelines and ETL processes

Design and manage data warehouse environments

Work with cloud data platforms such as Snowflake, Databricks, Microsoft Fabric, or Redshift

Write and optimize SQL queries and Python scripts

Support BI teams in delivering analytics and reporting solutions

Requirements

Bachelor's degree holder

At least 3 years of experience in Data Engineering

Experience with data warehousing and data pipeline development

Experience with cloud data platforms such as Snowflake, Databricks, Fabric, or Redshift

Strong knowledge of Python and SQL

Good analytical and problem-solving skills

Job ID: 144576649