Data Engineer


Job Description

Role Title: Data Engineer

ECLARO: A Quick Summary
ECLARO is an award-winning professional services firm headquartered in New York City and operating in the U.S., Canada, UK, Ireland, Australia and the Philippines. We are dedicated to a singular purpose: providing the Right People to meet every client's needs and solve business challenges through strategic staffing, permanent placement, custom outsourcing & offshoring. Utilizing our proprietary TRINIT-E® Service Maturity Model, we help clients implement programs to promote innovation, automation and process improvement.

About the Role:
The Data Engineer will design, build, and optimize scalable data pipelines and services that power analytics, AI/ML, and internal applications across the enterprise. This role owns the ingestion, transformation, validation, and delivery of structured, semi-structured, and unstructured data in a cloud-native environment.

The ideal candidate brings strong experience building robust ETL/ELT workflows, designing scalable data models, and exposing trusted data through APIs and reusable data services. This role will partner closely with analytics, application, platform, and AI teams to ensure data is high-quality, governed, performant, and ready for downstream use.

What You'll Do
• Build robust ETL/ELT workflows that handle structured, semi-structured, and unstructured data sources at scale.
• Design data models and ontologies that maximize analytical value and support graph-oriented and downstream application use cases.
• Implement data validation, cleansing, reconciliation, and integrity checks across ingestion processes to ensure consistent, trustworthy data.
• Develop APIs and data services that expose graph and enterprise data to internal applications and platforms.
• Optimize queries and data workflows for performance, scalability, and resilience.
• Collaborate with subject matter experts and technical stakeholders to identify, evaluate, and integrate new data sources.
• Maintain detailed documentation for pipelines, data models, metadata, lineage, and integration processes.
• Monitor and tune ETL/ELT processes for efficiency, resilience, and scalability, including alerting for data quality issues.
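By way of illustration only, the validation, cleansing, and integrity checks described above might look like this minimal Pandas sketch (the table and column names `order_id` and `amount` are hypothetical, not part of the role):

```python
import pandas as pd

def validate_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Basic ingestion checks: deduplicate, enforce types, drop invalid rows."""
    # Deduplicate on the primary key, keeping the first occurrence
    df = df.drop_duplicates(subset="order_id")
    # Coerce amount to numeric; unparseable values become NaN
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    # Reject rows failing integrity checks (null key or negative amount)
    valid = df[df["order_id"].notna() & (df["amount"] >= 0)]
    return valid.reset_index(drop=True)

raw = pd.DataFrame({
    "order_id": [1, 1, 2, None],
    "amount": ["10.5", "10.5", "-3", "7"],
})
clean = validate_orders(raw)  # one row survives: order_id 1, amount 10.5
```

In a production pipeline, checks like these would typically run as a task inside an orchestrator such as Airflow, with rejected rows routed to a quarantine table and surfaced through data-quality alerting.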

Qualifications:
• Bachelor's degree or higher in Computer Science, Computer Engineering, Data Science, or a related field.
• 5 years of professional experience in data engineering, including end-to-end ownership of ETL/ELT pipeline design, development, deployment, and monitoring.
• Practical experience implementing data validation, cleansing, transformation, and reconciliation processes to ensure high-quality, trustworthy datasets.
• Strong programming skills in Python and experience with common data engineering libraries and frameworks such as Pandas, SQLAlchemy, PySpark, and/or Airflow.
• Deep experience with relational (SQL) and NoSQL databases, data warehouse platforms, and schema/model design.
• Experience with data governance, metadata management, and lineage tracking.

Job ID: 146765399