Job Summary
The Senior Developer will be responsible for designing, building, and leading the development of applications that power Data Science, AI, and Machine Learning initiatives. The ideal candidate excels in cloud-native development (AWS and/or Databricks), application integration, and building scalable data-driven systems, and will collaborate closely with the Data Team to turn analytics and AI concepts into production-grade solutions.
Essential Duties and Responsibilities
Application Development & Architecture
- Architect, design, and build cloud-native applications in AWS (Lambda, API Gateway, ECS/EKS, Step Functions, etc.) or Databricks (Workflows, Delta Live Tables, MLflow, Apps, etc.).
- Build high-performance APIs, microservices, and automation workflows to support data science, ML model operations, and analytics applications.
- Lead end-to-end solutions architecture for data-driven products, from concept to production deployment.
- Ensure applications follow best practices for scalability, reliability, and security.
Integration & Cloud Engineering
- Integrate applications with internal and external systems using APIs, event-driven design, messaging systems, or serverless services.
- Implement CI/CD pipelines and DevOps practices for smooth deployment and monitoring.
- Work closely with the data team to build services that integrate efficiently with data lakes, data warehouses, and ML pipelines.
Data Science & ML Enablement
- Build interfaces, services, or tools that enable machine learning model training, deployment, and monitoring.
- Operationalize ML models (MLOps) using appropriate cloud or Databricks components.
- Collaborate with Data Scientists to transform prototypes into robust, scalable applications.
Leadership & Collaboration
- Mentor fellow developers and provide technical guidance across teams.
- Contribute to architectural decisions and technical strategy.
- Work cross-functionally with other IT teams to understand requirements and translate them into technical solutions.
Qualifications
- Degree in Computer Science, Engineering, or related field; equivalent experience welcomed.
- 5+ years of hands-on software development experience, with strong proficiency in languages such as Python, Java, or Scala.
- Experience leading, implementing, or supporting MLOps or DataOps frameworks.
- Knowledge of containerization and orchestration (Docker, Kubernetes).
- Experience working with distributed systems and big data technologies (Spark, Delta Lake, Kafka).
- Prior leadership experience (technical lead, project lead, or mentoring roles).
- Proven experience building production applications in AWS (Lambda, API Gateway, ECS/EKS, S3, RDS, DynamoDB, Step Functions, etc.) or Databricks (Delta Live Tables, MLflow, Workflows, Apps, etc.).
- Strong background in application and system integration (APIs, event streaming, webhooks, queues).
- Experience with CI/CD tools (GitHub Actions, etc.).
- Familiarity with ML lifecycle concepts (feature engineering, deployment, monitoring) or experience supporting data science teams.
- Strong understanding of modern software architecture: microservices, serverless, event-driven systems.
- Excellent problem-solving skills and ability to work on complex technical challenges.
- Strong communication skills and ability to collaborate with diverse teams.
- Ownership mindset; proactive in identifying improvements and driving solutions.