Benefits:
- 15% Night differential
- 20 Paid Time Off (PTO) days per year
- Annual Appraisal
- Annual Incentive
- Hybrid Work Arrangement
- Weekends off
- HMO with 2 FREE dependents
- Group life insurance
Position Summary:
As a Cloud Engineer specializing in Data & Analytics, you will be responsible for designing, implementing, and managing scalable and secure cloud solutions on Google Cloud Platform. You will collaborate closely with cross-functional teams to understand business requirements and translate them into robust cloud architectures that support data-driven decision-making and analytics initiatives. The ideal candidate will have extensive experience in architecting cloud-based data platforms, implementing data warehousing solutions, and optimizing data pipelines for performance and reliability.
Primary duties and responsibilities:
- Architect end-to-end cloud solutions on Google Cloud Platform to support data and analytics workloads
- Work closely with stakeholders to understand business requirements, data sources, and analytics goals
- Design scalable, cost-effective, and resilient cloud architectures that align with business objectives and industry best practices
- Design highly available, scalable, and secure data architectures
- Evaluate and recommend appropriate GCP services and technologies to meet specific data processing, storage, and analysis requirements
- Lead the implementation and configuration of GCP services
- Develop data ingestion, transformation, and processing pipelines
- Collaborate with data engineers and analysts
- Implement security controls and compliance standards
- Provide technical leadership and mentorship
- Collaborate closely with senior leadership, business stakeholders, and other departments to align data initiatives with organizational goals
- Foster strong relationships and effective communication across teams and functions
- Promote a culture of data-driven decision-making and knowledge-sharing
Qualifications:
- Bachelor's degree from an accredited college/university in a technology-related field, or an equivalent combination of education, training, and experience.
- Ability to build and optimize data sets and big data pipelines.
- Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer business questions
- Ability to build processes that support data transformation, workload management, data structures, dependency, and metadata
- Strong SQL and relational database design/development skills
- Development experience with modern cloud-based data warehouses such as Google BigQuery or equivalent.
- Applicants should have a demonstrated understanding of and experience with relational SQL databases, including BigQuery or equivalent, and functional/object-oriented programming languages, including Scala, Java, and Python.
- Applicants should also have an understanding of big data tools such as Kafka and Spark, and workflow management tools such as Airflow.