Rouxel TP

Helpdesk Customer Support – Quality Analyst

  • Posted 17 hours ago
  • Be among the first 10 applicants
Job Description

About the Role

The Quality Analyst will play a crucial role in maintaining the high standards of our customer support interactions. This position involves conducting quality reviews, analyzing data, and ensuring that our support team adheres to established processes and guidelines.

Responsibilities

Quality Review & Interaction Auditing

  • Conduct systematic QA evaluations of support interactions — including tickets, live chats, email threads, and escalation cases — assessing accuracy, technical correctness, process adherence, and communication quality against defined QA rubrics.
  • Audit technical integration support cases involving API failures, tracking discrepancies, SDK issues, and payload errors to ensure agents are following correct diagnostic and resolution procedures.
  • Perform targeted calibration reviews of high-complexity cases involving billing disputes, conversion attribution discrepancies, and ad policy escalations to validate consistency across the support team.
  • Identify patterns and trends in QA defects — including recurring knowledge gaps, process deviations, and documentation errors — and escalate findings to the QA Lead and Training team.
  • Maintain a structured audit log of all reviewed interactions, ensuring accurate scoring, fair calibration, and a defensible record of QA outcomes.
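The pattern-spotting step above (recurring knowledge gaps, process deviations, documentation errors) can be sketched as a simple aggregation over audit-log records. This is a minimal illustration, not a real schema — the field names (`ticket`, `defects`) and the escalation threshold are assumptions:

```python
from collections import Counter

def recurring_defects(audit_log, threshold=3):
    """Count defect categories across reviewed interactions and
    flag any category seen at least `threshold` times for escalation."""
    counts = Counter(
        defect
        for record in audit_log
        for defect in record.get("defects", [])
    )
    return {cat: n for cat, n in counts.items() if n >= threshold}

# Illustrative audit-log records (hypothetical structure).
log = [
    {"ticket": "T-101", "defects": ["knowledge_gap", "doc_error"]},
    {"ticket": "T-102", "defects": ["knowledge_gap"]},
    {"ticket": "T-103", "defects": ["process_deviation", "knowledge_gap"]},
]
print(recurring_defects(log))  # {'knowledge_gap': 3}
```

In practice the threshold and defect taxonomy would come from the team's QA rubric rather than a hard-coded constant.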

SQL-Based Data Quality & Analytics QA

  • Write and execute advanced SQL queries to validate data pipeline outputs, verify conversion event integrity, cross-check attribution data, and audit reporting accuracy across platform databases.
  • Identify data anomalies, duplicate event records, missing conversion signals, and attribution calculation errors through systematic SQL-driven investigation and root-cause analysis.
  • Build and maintain SQL-based QA dashboards and reporting scripts that surface team-level quality metrics, defect frequency trends, and SLA compliance data for operational reporting.
  • Validate the accuracy of SQL queries written by support agents and analysts, reviewing query logic for correctness, efficiency, and potential data integrity risks.
  • Support QA-driven data investigations by querying event-level data from ad platforms to reconcile discrepancies between reported conversion figures and raw tracking data.
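A duplicate-event check of the kind described above typically reduces to a `GROUP BY ... HAVING` query. The sketch below runs that query against an in-memory SQLite table standing in for a conversion-events pipeline output; the table name and columns are illustrative assumptions:

```python
import sqlite3

# In-memory table standing in for a conversion-events pipeline output.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE conversion_events (
        event_id TEXT, event_name TEXT, ts TEXT
    )
""")
con.executemany(
    "INSERT INTO conversion_events VALUES (?, ?, ?)",
    [
        ("e1", "purchase", "2024-01-01T10:00"),
        ("e1", "purchase", "2024-01-01T10:00"),  # duplicate delivery
        ("e2", "add_to_cart", "2024-01-01T10:05"),
    ],
)

# Surface event_ids recorded more than once -- a typical duplicate check.
dupes = con.execute("""
    SELECT event_id, COUNT(*) AS n
    FROM conversion_events
    GROUP BY event_id
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [('e1', 2)]
```

The same shape of query, pointed at production tables, is the usual starting point for root-causing duplicate or missing conversion signals.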

API & Integration Quality Validation

  • Review and validate API integration implementations — including RESTful APIs, webhooks, and server-side tracking configurations — to ensure they conform to platform documentation, schema requirements, and expected behavioral standards.
  • Audit API request and response payloads in JSON and XML formats to identify malformed data, missing required parameters, schema mismatches, incorrect authentication headers, and error code handling failures.
  • Evaluate the quality of technical escalation handling for API-related issues, verifying that support agents correctly interpret API documentation, apply the right debugging methodology, and communicate resolutions accurately.
  • Test and validate webhook event delivery, retry logic, and payload structure during integration QA cycles to ensure real-time data flows are operating as expected.
  • Review SDK implementation QA reports and validate event instrumentation correctness across web and mobile platforms by cross-referencing raw event logs with expected implementation specs.
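A payload audit of the kind listed above can be sketched as a required-field and type check over a parsed JSON body. The required-parameter set below is a made-up example, not any platform's actual schema:

```python
import json

# Hypothetical schema: required parameters and their expected types.
REQUIRED = {"event_name": str, "event_id": str, "timestamp": int}

def audit_payload(raw: str):
    """Return a list of defects found in a webhook/API payload:
    malformed JSON, missing required parameters, type mismatches."""
    defects = []
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError:
        return ["malformed JSON"]
    for field, expected in REQUIRED.items():
        if field not in payload:
            defects.append(f"missing required parameter: {field}")
        elif not isinstance(payload[field], expected):
            defects.append(f"type mismatch: {field}")
    return defects

print(audit_payload('{"event_name": "purchase", "timestamp": "oops"}'))
# ['missing required parameter: event_id', 'type mismatch: timestamp']
```

Real audits would validate against the platform's published schema (and check authentication headers and error codes separately), but the structure of the check is the same.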

Conversion Tracking & Attribution QA

  • Perform end-to-end quality validation of conversion tracking setups — including pixel-based, server-side, and hybrid event architectures — to confirm data completeness, deduplication accuracy, and event schema compliance.
  • Review and validate attribution model outputs, cross-checking multi-touch attribution calculations, attribution window configurations, and conversion path logic against expected platform behavior and client specifications.
  • Audit conversion event data for signal quality issues including delayed event firing, mismatched event names, incorrect value parameters, and missing required fields that could distort attribution or ROAS reporting.
  • Conduct QA reviews of marketing analytics reports to verify that funnel metrics, ROAS figures, CPA calculations, and audience insights are derived from accurate, well-validated data sources.
  • Flag and document conversion tracking or attribution discrepancies, working with Engineering and Data teams to trace root causes and ensure corrective actions are validated before closing the QA finding.
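Deduplication accuracy between browser-pixel and server-side events is commonly reconciled on a shared event identifier. The sketch below assumes `event_id` is that key and reports overlap and one-sided gaps; it is a simplified illustration of the check, not a platform-specific implementation:

```python
def dedup_report(pixel_events, server_events):
    """Reconcile pixel and server-side events by shared event_id and
    report which events were deduplicated vs. seen on only one path."""
    pixel_ids, server_ids = set(pixel_events), set(server_events)
    return {
        "deduplicated": sorted(pixel_ids & server_ids),  # seen on both paths
        "pixel_only": sorted(pixel_ids - server_ids),    # server signal missing
        "server_only": sorted(server_ids - pixel_ids),   # pixel signal missing
    }

report = dedup_report(["e1", "e2", "e3"], ["e2", "e3", "e4"])
print(report)
# {'deduplicated': ['e2', 'e3'], 'pixel_only': ['e1'], 'server_only': ['e4']}
```

Large `pixel_only` or `server_only` buckets are the usual symptom behind the discrepancies this role would flag to Engineering and Data teams.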

Ads Manager & Platform QA

  • Evaluate the quality of support interactions and technical resolutions related to Ads Manager platforms — including Meta Business Suite, Google Ads, TikTok for Business, and similar — verifying that agents apply correct platform knowledge and follow documented workflows.
  • Perform QA on campaign setup reviews, pixel configuration audits, and ad policy compliance cases handled by the support team, validating the accuracy and thoroughness of agents' technical assessments.
  • Review integration QA cases involving Conversions API (CAPI) implementations, match rate analysis, and signal deduplication to confirm correct diagnostic approaches and resolution quality.
  • Monitor platform update communications and assess their impact on existing QA rubrics, test cases, and evaluation criteria — recommending rubric updates to the QA Lead when platform changes affect support standards.
  • Conduct periodic QA health checks on common Ads Manager support workflows to identify emerging quality gaps before they become systemic issues.

Test Case Design & QA Documentation

  • Design, write, and maintain comprehensive QA test cases and evaluation rubrics covering technical integration scenarios, API validation workflows, conversion tracking checks, and attribution verification procedures.
  • Develop structured QA checklists and scoring matrices for evaluating support interactions across dimensions including technical accuracy, process compliance, communication quality, and resolution completeness.
  • Maintain a centralized QA knowledge base — including defect taxonomies, known issue libraries, common error patterns, and API debugging guides — that supports consistent evaluation standards across the QA team.
  • Document QA findings with sufficient technical detail to enable the Training team to develop targeted remediation content and for Engineering to understand platform behavior gaps.
  • Contribute to the continuous improvement of QA frameworks by proposing new test scenarios, updating existing rubrics in response to product changes, and identifying coverage gaps in current evaluation processes.
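A scoring matrix of the kind described above usually reduces to a weighted sum over rubric dimensions. The weights and dimension names below are hypothetical placeholders for whatever the team's actual rubric defines:

```python
# Hypothetical rubric: per-dimension weights, which must sum to 1.0.
RUBRIC = {
    "technical_accuracy": 0.4,
    "process_compliance": 0.25,
    "communication_quality": 0.2,
    "resolution_completeness": 0.15,
}

def qa_score(scores: dict) -> float:
    """Weighted QA score on a 0-100 scale from per-dimension scores."""
    return round(sum(RUBRIC[d] * scores[d] for d in RUBRIC), 2)

print(qa_score({
    "technical_accuracy": 90,
    "process_compliance": 80,
    "communication_quality": 100,
    "resolution_completeness": 70,
}))  # 86.5
```

Keeping the weights in one shared structure is what makes calibration sessions meaningful: every evaluator scores against the same matrix.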

Calibration, Coaching & Feedback Delivery

  • Participate in regular QA calibration sessions with fellow analysts, QA Leads, and Team Leaders to ensure consistency, fairness, and alignment in scoring across all evaluators.
  • Deliver structured, actionable quality feedback to support agents and technical specialists — clearly communicating identified defects, their root causes, and concrete improvement actions.

Job ID: 146819035