
Role Summary
Support the data architecture, infrastructure, and reporting ecosystem. Ensure reliable data pipelines, optimize cloud performance, and deliver high-quality data to business stakeholders and BI tools.
Infrastructure & Pipelines
- Maintain and optimize cloud-based data infrastructure
- Build, monitor, and troubleshoot ETL pipelines
- Ensure data pipeline reliability, quality, and timely delivery
- Implement data governance, security, and access controls
- Support infrastructure automation
Reporting & Analytics Support
- Design and maintain data models and reporting marts for BI tools
- Support business reporting needs and ad-hoc data requests
- Optimize query performance for dashboards and reports
- Maintain data documentation, dictionaries, and lineage
Required Skills
- SQL: Strong proficiency in Redshift/PostgreSQL, complex queries, performance tuning
- AWS: Hands-on experience with data-related infrastructure
- ETL/Orchestration: Experience with ETL and orchestration tools (e.g., Airflow, Kafka Connect)
- Python: Data processing, automation, scripting
- Data Modeling: Star schema, dimensional modeling, data warehouse concepts
- BI Tools: Experience supporting Power BI, Looker, Qlik Sense, or similar
- Nice to have: Docker, ClickHouse, MongoDB, Elasticsearch, data quality frameworks
Requirements
- Bachelor's degree in Computer Science, IT, Engineering, or related field
- 2-4 years of experience in data engineering, infrastructure, or cloud operations
- Financial services or fintech background preferred
- Strong problem-solving and troubleshooting abilities
Job ID: 144825949