  • Posted 20 hours ago
  • Be among the first 10 applicants
Job Description

Join Our Team at Lean Solutions Group (LSG)!

Lean Solutions Group (LSG) is a next-generation solutions provider combining AI-driven automation, industry expertise, and tech-powered talent. Built in the demanding Supply Chain sector, our model now supports 600+ clients across multiple industries, powered by 10,000+ employees in five countries. We help businesses achieve immediate efficiency, long-term resilience, and scalable growth by integrating intelligent technology, optimized processes, and high-performance teams.

At LSG, we believe in your talent and your potential. Join a multicultural, people-first environment where you can grow, sharpen your skills, and unlock new career opportunities. Here, every day brings fresh challenges, collaboration, and purpose.

Our Mission: Transform business challenges into lasting success through purpose-built teams, technology, and expertise.

Our Vision: A world where people, empowered by technology, turn any challenge into a catalyst for growth.

What you will be doing:

  • Build & Maintain Pipelines: Develop and maintain batch and streaming data pipelines using technologies such as Spark and Delta Lake.
  • Data Platform Development: Support ingestion, transformation, and curation workflows across platforms such as Fabric, Snowflake, Databricks, or AWS.
  • Data Modeling: Contribute to logical and physical data models that support analytics and business reporting.
  • Workflow Orchestration: Work with orchestration tools such as Airflow, dbt, Lakeflow, or similar technologies to manage data workflows.
  • Data Quality & Governance: Help implement data governance practices, monitoring, and access controls to ensure reliable data delivery.
  • Collaboration: Partner with data analysts, data scientists, and engineering teams to deliver datasets and pipelines that support business insights.
  • Optimization: Assist in monitoring pipeline performance and improving reliability, scalability, and cost efficiency.
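
The pipeline work described above (ingest, transform, curate) can be pictured with a minimal batch ETL sketch. This toy example uses Python's built-in sqlite3 so it is self-contained; the table names and sample data are hypothetical, and production pipelines of the kind listed here would typically run on Spark with Delta Lake rather than sqlite3.

```python
import sqlite3

# Toy batch ETL sketch: ingest raw rows, transform them, and write a curated
# table. Table and column names are illustrative only.

def run_batch_pipeline(conn: sqlite3.Connection) -> list[tuple]:
    cur = conn.cursor()

    # Ingest: land raw source data as-is.
    cur.execute("CREATE TABLE raw_orders (order_id INTEGER, amount REAL, status TEXT)")
    cur.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, 120.0, "shipped"), (2, 35.5, "cancelled"), (3, 80.0, "shipped")],
    )

    # Transform + curate: filter out cancelled orders and persist an
    # analytics-ready aggregate table.
    cur.execute(
        """
        CREATE TABLE curated_revenue AS
        SELECT status, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM raw_orders
        WHERE status != 'cancelled'
        GROUP BY status
        """
    )
    return cur.execute("SELECT status, orders, revenue FROM curated_revenue").fetchall()

result = run_batch_pipeline(sqlite3.connect(":memory:"))
# result == [("shipped", 2, 200.0)]
```

The same ingest/transform/curate shape carries over to Spark jobs; only the engine and storage layer change.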

Requirements & Qualifications:

To excel in this role, you should possess:

  • 3+ years of experience working as a Data Engineer or in a related data engineering role.
  • Experience building data pipelines and ETL/ELT processes.
  • Strong programming skills in Python and SQL.
  • Hands-on experience with Apache Spark or similar distributed data processing frameworks.
  • Understanding of data modeling concepts and data warehousing practices.
  • Experience working with cloud data platforms such as Fabric, Snowflake, Databricks, or AWS.
  • Familiarity with data orchestration tools such as Airflow, dbt, or similar solutions.
  • Experience collaborating with cross-functional data teams in agile environments.

Nice to Have:

  • Familiarity with CI/CD pipelines for data workflows
  • Exposure to machine learning or AI-related data pipelines
  • Experience with Azure data services
  • Basic knowledge of streaming architectures

Soft Skills:

  • Strong problem-solving mindset with attention to data quality and performance.
  • Clear communication and collaboration with technical and non-technical stakeholders.
  • Ability to work effectively in distributed teams.
  • Continuous learning attitude and curiosity about new data technologies.


Job ID: 144533975