Affinity Solutions is the leading consumer purchase insights company. We provide a complete view of U.S. and U.K. consumer spending, across and between brands, via exclusive access to fully permissioned transaction data from over 100 million consumers. Our proprietary AI technology, Comet™, transforms these purchase signals into actionable insights for business and marketing leaders to drive optimal outcomes and build lasting customer relationships. Visit www.affinitysolutions.com to discover how we're shaping the future of consumer purchase insights.
About Your Role:
Affinity is seeking an accomplished Sr. Data Quality Engineer I to ensure the quality, reliability, and accuracy of data pipelines, APIs, integrations, and data products built by our software engineering teams. In this critical role, you will design and implement comprehensive quality testing frameworks, validate data integrity across complex systems, and partner with data engineers and software engineers to deliver trusted, high-performance solutions.
Your Responsibilities:
- Quality Testing and Validation:
- Design, develop, and execute comprehensive test strategies for data pipelines, APIs, integrations, and data products built by software engineering teams
- Develop and maintain automated testing frameworks for API validation, data quality checks, integration testing, and end-to-end pipeline testing
- Perform thorough testing of RESTful APIs, including functional testing, performance testing, security testing, contract testing, and integration testing with third-party vendors
- Validate data accuracy, completeness, consistency, and timeliness across ETL/ELT pipelines, data warehouses, and data lake environments
- Test data clean room implementations, privacy controls, query constraints, and secure data-sharing mechanisms to ensure compliance with security standards
- Create and maintain comprehensive test cases, test data sets, and testing documentation for all quality assurance activities
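To give a flavor of the work described above, here is a minimal sketch of an automated batch-level data-quality check covering completeness, uniqueness, validity, and freshness. The schema, field names, and thresholds are illustrative assumptions, not Affinity's actual pipeline or tooling:

```python
# Illustrative data-quality gate for a batch of transaction records.
# Field names and thresholds are hypothetical, for demonstration only.
from datetime import date, timedelta

REQUIRED_FIELDS = {"txn_id", "consumer_id", "amount", "txn_date"}

def validate_batch(records, max_age_days=7, today=date(2024, 6, 1)):
    """Return a dict mapping each failed quality check to its offending record count."""
    failures = {"missing_fields": 0, "duplicate_ids": 0, "stale": 0, "bad_amount": 0}
    seen_ids = set()
    for rec in records:
        # Completeness: every required field must be present.
        if not REQUIRED_FIELDS <= rec.keys():
            failures["missing_fields"] += 1
            continue
        # Uniqueness: transaction IDs must not repeat within the batch.
        if rec["txn_id"] in seen_ids:
            failures["duplicate_ids"] += 1
        seen_ids.add(rec["txn_id"])
        # Validity: amounts must be non-negative.
        if rec["amount"] < 0:
            failures["bad_amount"] += 1
        # Timeliness: records older than the freshness window are flagged.
        if (today - rec["txn_date"]) > timedelta(days=max_age_days):
            failures["stale"] += 1
    return {check: n for check, n in failures.items() if n}

batch = [
    {"txn_id": 1, "consumer_id": "a", "amount": 9.99, "txn_date": date(2024, 5, 30)},
    {"txn_id": 1, "consumer_id": "b", "amount": -1.0, "txn_date": date(2024, 5, 30)},
    {"txn_id": 2, "consumer_id": "c", "amount": 5.00, "txn_date": date(2024, 1, 1)},
]
print(validate_batch(batch))  # flags the duplicate ID, negative amount, and stale date
```

In practice such checks would typically be expressed as pytest cases or Great Expectations/Soda Core expectation suites and run in CI, but the assertion logic follows this pattern.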
- Data Pipeline and Integration Testing:
- Validate data transformations, aggregations, and calculations across Snowflake, AWS, and other cloud data platforms
- Test integration pipelines, including LiveRamp XMI, Salesforce, AWS/AMC clean rooms, CAPI integrations, and MadConnect to ensure seamless data flow and accuracy
- Perform regression testing on data pipelines to ensure changes do not introduce data quality issues or break existing functionality
- Validate data lineage and metadata accuracy and ensure proper implementation of data governance controls
- Test database performance, query optimization, and data structure implementations to identify bottlenecks and ensure optimal performance at scale (200 billion+ records)
- Automation and Monitoring:
- Build and maintain CI/CD test automation pipelines using Jenkins and other DevOps tools to enable continuous quality validation
- Implement automated data quality monitoring, anomaly detection, and alerting systems to proactively identify issues
- Develop test harnesses and mock services for isolated component testing and integration validation
- Create performance benchmarks and load testing scenarios to validate system scalability and reliability
- Establish and track quality metrics, test coverage, defect rates, and SLAs to measure and improve testing effectiveness
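One common approach to the automated monitoring and anomaly detection mentioned above is a trailing-window z-score on a pipeline metric such as daily row counts. The window size and threshold below are illustrative assumptions, not Affinity's actual monitoring configuration:

```python
# Hedged sketch: flag days whose row count deviates sharply from recent history.
from statistics import mean, stdev

def detect_anomalies(daily_counts, window=7, z_threshold=3.0):
    """Return (day_index, count, z_score) for days exceeding z_threshold sigmas."""
    alerts = []
    for i in range(window, len(daily_counts)):
        trailing = daily_counts[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma == 0:
            continue  # flat history: no basis for a z-score
        z = (daily_counts[i] - mu) / sigma
        if abs(z) > z_threshold:
            alerts.append((i, daily_counts[i], round(z, 2)))
    return alerts

counts = [100, 102, 98, 101, 99, 100, 103, 12, 100]  # day 7 drops sharply
print(detect_anomalies(counts))
```

In production this logic would feed an alerting channel (PagerDuty, Slack, etc.) rather than a print statement, and more robust statistics (seasonality-aware baselines, median absolute deviation) are often preferred; the sketch shows the core idea only.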
- Compliance and Security Testing:
- Validate controls implementing data privacy regulations (GDPR, CCPA, HIPAA) and ensure compliance across all data products
- Test security measures, including data encryption, masking, tokenization, role-based access controls (RBAC), and authentication mechanisms (OAuth, JWT, SSO)
- Verify proper implementation of data access controls including aggregation constraints, projection policies, row access policies, column masking, and differential privacy
- Conduct security testing on APIs and integrations to identify vulnerabilities and ensure adherence to security best practices
- Collaboration and Documentation:
- Collaborate closely with senior data and software engineers (API and integrations) to understand requirements, identify test scenarios, and provide quality feedback early in the development cycle
- Participate in code reviews, design discussions, and sprint planning to ensure quality is built into solutions from the start
- Document test plans, test results, defects, and quality reports with clear, actionable insights for engineering teams
- Provide technical mentorship to junior QA engineers and promote testing best practices across the organization
- Partner with infrastructure teams to coordinate test environment setup and deployment validation
- Continuous Improvement:
- Stay current with emerging testing technologies, tools, and methodologies in data quality, API testing, and test automation
- Identify opportunities to improve testing efficiency, reduce testing cycles, and enhance overall quality processes
- Lead proof-of-concept initiatives to evaluate new testing tools and frameworks (Great Expectations, Soda Core, Postman, REST Assured, etc.)
- Drive strategic recommendations to enhance data quality validation, testing coverage, and organizational quality maturity
Your Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, Software Engineering, or related technical field; Master's degree preferred
- 5+ years of progressive experience in data quality engineering, QA engineering, or test automation with focus on data systems and APIs
- Demonstrated track record of implementing comprehensive testing frameworks for enterprise-scale data platforms and APIs
- Proven experience testing complex data pipelines, integrations, and RESTful APIs in production environments
- Core Testing Competencies:
- Expert-level experience with API testing tools and frameworks (Postman, REST Assured, SoapUI, JMeter, Swagger/OpenAPI)
- Strong proficiency in SQL for data validation, query testing, and database verification across large datasets
- Advanced Python programming skills for test automation, data validation scripts, and custom testing tools (pytest, unittest)
- Experience with data quality testing tools (Great Expectations, Soda Core, dbt tests, or similar)
- Strong understanding of ETL/ELT testing methodologies and data pipeline validation techniques
- Knowledge of test automation frameworks and CI/CD integration (Selenium, Jenkins, GitLab CI, GitHub Actions)
- Experience with performance testing and load testing tools for APIs and data systems (JMeter, Gatling, Locust)
- Platform and Technology Experience:
- Hands-on experience testing in cloud platforms (AWS, Google Cloud Platform, or Azure)
- 2+ years of experience with the Snowflake ecosystem, including testing Snowpipe, Streams, Views, stored procedures, and data models
- Experience testing AWS services (S3, Lambda, MWAA/Airflow, Redshift, Athena, Glue)
- Familiarity with data warehouses (Amazon Redshift, Google BigQuery, Snowflake) and testing data at scale
- Knowledge of data clean room technologies and testing secure data shares using RBAC
- Experience with version control systems (Git) and testing in CI/CD environments
- Understanding of workflow orchestration tools (Apache Airflow, Prefect, Dagster) for pipeline testing
- API and Integration Testing:
- Extensive experience with RESTful API testing, including functional, integration, contract, security, and performance testing
- Knowledge of API standards (OpenAPI/Swagger, OAuth 2.0, JWT, GraphQL); able to validate implementations
- Experience testing third-party API integrations (LiveRamp, Salesforce, AWS/AMC, CAPI, MadConnect)
- Understanding of API monitoring, logging, and observability solutions for quality validation
- Data Governance and Compliance Testing:
- Working knowledge of data privacy regulations (GDPR, CCPA, HIPAA); able to validate compliance
- Experience testing data security implementations including encryption, masking, tokenization, and access controls
- Understanding of data access controls testing (aggregation constraints, projection policies, row access policies, column masking, differential privacy)
- Experience validating metadata management, data lineage, and data cataloging implementations
Additional Preferred Qualifications:
- Experience with distributed computing frameworks (Apache Spark, Hadoop) and testing data at scale (200 billion+ records)
- Familiarity with BI tools (ThoughtSpot, Sigma, Looker, Tableau) for validating data visualizations and reports
- Knowledge of data modeling methodologies (dimensional modeling, data vault, 3NF) to inform testing strategies
- Understanding of JavaScript/Node.js for API testing and test automation
- Experience with data cataloging and governance platforms (DataHub, OpenMetadata, Alation)
Salary Range: $130,000 – $145,000
Office Hours: 9am – 5:30pm
Benefits for full-time employees of Affinity Solutions begin on the first of the month following your date of hire, with a generous employer contribution toward medical, dental, and vision coverage. In addition to company-paid holidays, wellness time off, other wellness benefits, and employee discounts, you will receive employer-paid life insurance and the option to enroll in an employer-matched 401(k) plan. We strongly encourage work/life balance by providing unlimited vacation days, available starting 90 days after your hire date.