
iSupport Worldwide

Data Engineer (dbt + Snowflake)

3-7 Years

Job Description

What is your mission

We are looking for a highly skilled Data Engineer to design, build, and optimize large-scale data pipelines, develop dbt models, and implement Snowflake- and GCP-based data architectures. You will manage orchestration using Airflow/Composer, ensure data quality, maintain documentation, and support production workloads. If you excel in modern data engineering, cloud systems, and scalable architectures, this role is for you.

You will provide the best service to our partner brands by performing these tasks:

  • Design, build, and optimize ETL/ELT data pipelines
  • Develop and maintain dbt models implementing business logic
  • Create and manage Airflow/Composer DAGs for orchestration
  • Architect and implement scalable Snowflake solutions
  • Build dimensional models and data marts
  • Implement FDW connections for cross-database querying
  • Integrate data from multiple source systems
  • Deploy and manage data infrastructure on GCP (BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, Cloud Functions)
  • Improve pipeline performance, reliability, and cost efficiency
  • Implement and monitordata quality checks, SLAs, and anomaly detection
  • Maintain documentation, data dictionaries, and pipeline guides
  • Conduct code reviews and collaborate with cross-functional teams
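One of the responsibilities above is implementing data quality checks, SLAs, and anomaly detection. As a rough illustration of the kind of logic involved, here is a minimal, library-agnostic sketch in plain Python; the function names, the batch shape, and the z-score threshold are all illustrative assumptions, not part of any specific stack named in this posting:

```python
import statistics

def check_not_null(rows, column):
    """Fail if any row has a NULL/None value in the given column."""
    nulls = sum(1 for r in rows if r.get(column) is None)
    return {"check": f"not_null:{column}", "passed": nulls == 0, "failures": nulls}

def check_row_count_anomaly(daily_counts, today_count, z_threshold=3.0):
    """Flag today's row count if it deviates more than z_threshold
    standard deviations from the historical daily counts."""
    mean = statistics.mean(daily_counts)
    stdev = statistics.stdev(daily_counts)
    z = (today_count - mean) / stdev if stdev else 0.0
    return {"check": "row_count_anomaly", "passed": abs(z) <= z_threshold, "z": round(z, 2)}

# Example: validate a small batch before loading it downstream.
batch = [{"order_id": 1, "amount": 9.99}, {"order_id": 2, "amount": None}]
print(check_not_null(batch, "amount"))
print(check_row_count_anomaly([1000, 1020, 980, 1010], today_count=1500))
```

In production these checks would typically run as dbt tests or as a validation task inside an Airflow DAG, with failures routed to alerting rather than printed.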

Who are we looking for
  • Bachelor's degree in Computer Science, IT, Engineering, or a related field
  • 3-7 years of experience as a Data Engineer
  • Strong experience designing and maintaining ETL/ELT pipelines
  • Hands-on experience with Snowflake, dbt, and GCP (BigQuery, Cloud Composer, Cloud Storage, Cloud Functions, Dataflow, Pub/Sub)
  • Advanced SQL skills (complex queries, optimization, window functions, CTEs)
  • Experience with PostgreSQL, postgres_fdw, MySQL, or SQL Server
  • Proficiency in Python (Pandas, NumPy, SQLAlchemy) and bash/shell scripting
  • Experience integrating data from APIs, databases, and varied file formats
  • Knowledge of dimensional modeling, data warehousing, and cloud data architecture
  • Familiarity with Git, REST APIs, CI/CD workflows, and orchestration tools
  • Nice to have: Kafka/streaming, Airbyte/Fivetran, Spark/Beam, Terraform, or data observability tools
  • Strong analytical thinking and problem-solving skills
  • Excellent communication, collaboration, and documentation ability
  • Detail-oriented, organized, and comfortable working in fast-paced environments
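To make the "advanced SQL (window functions, CTEs)" requirement above concrete, here is a small self-contained sketch; the table and column names are hypothetical, and SQLite (via Python's stdlib sqlite3) stands in for Snowflake or PostgreSQL so the snippet runs anywhere:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-01', 120.0),
        ('alice', '2024-01-05',  80.0),
        ('bob',   '2024-01-02', 200.0),
        ('bob',   '2024-01-09',  50.0);
""")

# CTE + window functions: per-customer running total of spend,
# plus a rank of each order's size within that customer.
query = """
WITH ranked AS (
    SELECT
        customer,
        order_date,
        amount,
        SUM(amount) OVER (PARTITION BY customer ORDER BY order_date) AS running_total,
        RANK()      OVER (PARTITION BY customer ORDER BY amount DESC) AS size_rank
    FROM orders
)
SELECT * FROM ranked ORDER BY customer, order_date;
"""
for row in conn.execute(query):
    print(row)
```

The same CTE-plus-window pattern translates directly to Snowflake and PostgreSQL; only the connection layer changes.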

Company Perks:
  • Above-industry salary package and incentives
  • Comprehensive HMO benefits and life insurance from day 1
  • Free learning and development courses for your personal and career growth
  • Dynamic company events
  • Opportunities for promotion
  • Free meals and snacks


Job ID: 136414609
