Company Description
TestingXperts (Tx) is a global leader in AI-powered Digital Assurance and Quality Engineering services, specializing in Digital Engineering. With headquarters in Pennsylvania, USA, and London, UK, the company operates through a network of offices and delivery centers across multiple countries, including the US, Canada, South Africa, and Singapore. Tx offers next-generation technology services such as Automation, DevOps, AI/ML, RPA, and Big Data with a specialized approach to meet client needs. The company has been recognized as a leader in the industry by renowned analysts like Gartner and Everest Group and has received prestigious global awards. TestingXperts is also a certified Great Place to Work for five consecutive years, emphasizing its commitment to excellence and employee well-being.
Key Responsibilities
- Design, develop, and optimize complex SQL queries for large-scale data processing
- Build and maintain scalable ETL/data transformation pipelines
- Develop configurable and metadata-driven data processing frameworks
- Implement robust data validation and quality checks
- Optimize data workflows to meet performance and SLA requirements
- Support scalable architecture capable of handling multiple client environments
- Ensure efficient batch data processing and pipeline orchestration
- Collaborate with cross-functional engineering teams to integrate data workflows with downstream systems
- Participate in design reviews, performance tuning, and production support
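To illustrate the kind of data validation and quality checks named above, here is a minimal sketch in Python. The function name, column names, and the 1% null-rate threshold are hypothetical assumptions for illustration, not part of the role's actual stack:

```python
# Hypothetical sketch of a batch-level data quality check: reject a batch
# when it is empty or when a required column's null rate exceeds a threshold.
# Column names and the default threshold are illustrative assumptions.

def check_quality(rows: list[dict], required: list[str],
                  max_null_rate: float = 0.01) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if not rows:
        failures.append("batch is empty")
        return failures
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            failures.append(
                f"{col}: null rate {rate:.1%} exceeds {max_null_rate:.1%}"
            )
    return failures

batch = [{"order_id": 1, "amount": 9.5}, {"order_id": None, "amount": 3.0}]
print(check_quality(batch, required=["order_id", "amount"]))
```

In practice such checks run as a gate before loading data downstream, so a failing batch is quarantined rather than silently propagated.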
Required Technical Skills
- Advanced SQL (complex joins, query optimization, indexing, performance tuning)
- Strong understanding of relational database concepts and data modeling
- Experience building scalable ETL or data transformation pipelines
- Strong programming skills in Python for data processing
- Hands-on experience with the Azure ecosystem, including Azure SQL, Azure Data Lake, Azure Data Factory (ADF), and Azure Synapse
- Good working knowledge of and hands-on experience with SSIS (SQL Server Integration Services)
- Experience designing configurable, metadata-driven data solutions
- Strong understanding of performance optimization for large datasets
- Experience working with SLA-driven batch pipelines
- Understanding of REST APIs at an integration-awareness level
- Exposure to multi-tenant or scalable architecture patterns
- Ability to translate business requirements into technical solutions
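The configurable, metadata-driven solutions mentioned above typically generate transformation logic from configuration rather than hard-coding it. A minimal Python sketch follows; the `config` structure, table names, and expressions are hypothetical assumptions used purely for illustration:

```python
# Minimal sketch of a metadata-driven transformation: the pipeline reads a
# config describing source, target, and column mappings, then renders the
# SQL from that metadata. All names and fields here are hypothetical.

def build_load_sql(config: dict) -> str:
    """Render an INSERT ... SELECT statement from pipeline metadata."""
    cols = ", ".join(m["target"] for m in config["mappings"])
    exprs = ", ".join(m["expr"] for m in config["mappings"])
    return (
        f"INSERT INTO {config['target_table']} ({cols}) "
        f"SELECT {exprs} FROM {config['source_table']} "
        f"WHERE {config.get('filter', '1=1')}"
    )

config = {
    "source_table": "staging.orders",
    "target_table": "dw.fact_orders",
    "mappings": [
        {"target": "order_id", "expr": "CAST(id AS BIGINT)"},
        {"target": "amount", "expr": "ROUND(total, 2)"},
    ],
    "filter": "load_date = CURRENT_DATE",
}

print(build_load_sql(config))
```

The benefit of this pattern is that onboarding a new source or client environment becomes a configuration change rather than a code change, which is what makes the architecture scale across tenants.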
Good to Have
- Experience working in financial services, banking, or similar regulated domains; familiarity with loan-servicing platforms such as LoanPro is a plus
- Experience handling high-volume data processing environments