GENERAL RESPONSIBILITIES:
The Data Architect is responsible for the overall design of both the logical and physical data architecture of the data warehouse and data lake. The Data Architect should have a broad understanding of data warehousing, ETL, and Hadoop, and is responsible for providing technical leadership to the implementation teams.
DUTIES AND RESPONSIBILITIES:
- Design logical data models, ensuring alignment with business requirements, industry standards, and current data modeling best practices.
- Create robust physical data models, optimizing structures for efficient data storage and retrieval.
- Provide optimization strategies for existing data models, incorporating performance improvements, scalability, and maintainability.
- Review deliverables to ensure completeness and correctness against business requirements, promoting data quality and integrity.
- Ensure that the physical design meets the standards set by the Data Architecture team, adhering to industry best practices and compliance requirements.
- Guide and provide inputs to Solution Architects on the implementation of various Business Intelligence (BI) solutions, ensuring alignment with the overarching data strategy.
- Develop in-depth source system knowledge to determine the best sources for specific information, supporting data accuracy and reliability.
- Provide source-to-target mapping for reports, extractions, and other analytical requirements, ensuring a clear understanding of data flow and transformations.
- Other job-related activities that may be assigned from time to time.
JOB SPECIFICATIONS:
- Education – Bachelor's degree, preferably in Information Technology or Computer Science.
- Related Work Experience – 5+ years of experience in data architecture-related work. Must have experience in designing and/or developing a mid- to large-scale data warehouse or data lake.
- Experience in implementing various relational or NoSQL databases or Hadoop technologies. (e.g. Oracle, SQL Server, DB2, Teradata, Hive, Pig, Impala, MongoDB, Cassandra, HBase).
- Experience in either Reporting or ETL is required.
- Knowledge – Knowledgeable on the following:
- SDLC, with emphasis on the design phase, as well as Agile methodologies.
- Data warehousing implementation using MPP databases.
- Both 3NF and Star Schema model implementation.
- Various Hadoop implementation strategies.
- Skills:
- Excellent communication skills with the ability to articulate technical concepts to non-technical stakeholders.
- Demonstrates strong presentation skills and a high degree of comfort presenting to diverse audiences.
- Must have sound problem-resolution, judgment, analytical, and decision-making skills.
- Possesses solid stakeholder relationship management skills and the ability to collaborate with various teams to attain goals and pursue excellence.
- Demonstrates excellent organizational, time-management, and attention-to-detail skills.
- Able to work with minimal supervision and to guide new team members.
- Must be adept at working in a fast-paced environment with tight SLAs.