About Accelerant
Accelerant is a data-driven risk exchange connecting underwriters of specialty insurance risk with risk capital providers. Accelerant was founded in 2018 by a group of longtime insurance industry executives and technology experts who shared a vision of rebuilding the way risk is exchanged – so that it works better, for everyone. The Accelerant risk exchange does business across more than 20 different countries and 250 specialty products, and we are proud that our insurers have been awarded an AM Best A- (Excellent) rating. For more information, please visit www.accelerant.ai.
The Role
Our objective is to reshape the value chain for our Members and Insurers using data-driven insights. Our success will be based on the value data creates for our Members, risk capital providers, other suppliers and ourselves.
This role is part of a newly created and fast-growing division, the Data Office, which combines three teams: Data Products, Data Quality and Data Management. Today, the Data Quality team consists of the Data Quality Lead and a Data Quality Support Analyst.
The Data Quality Rules Analyst is responsible for front-line ownership of the validation lifecycle, translating Data Quality intent and standards into clear, consistent, and scalable deterministic controls across ingestion, data products, and systems.
The role focuses on rule specification, catalogue governance, control behaviour, and tuning, ensuring that large volumes of data quality controls are well-structured, interpretable, and maintainable as the platform scales.
Key Responsibilities
Validation Intake & Front-Line Support
- Act as the first point of contact for:
  - New validation requests
  - Validation support requests (e.g. overrides, unexpected behaviour, suspected defects)
- Clarify intent, scope, expected behaviour and enforcement with Data Owners and Product.
- Determine whether requests can be handled within existing standards or require escalation.
- Escalate to the Data Quality Lead only where intent, appetite or priority is unclear or challenged.
Validation Rule Specification
- Translate validation requirements into clear, build-ready rule specifications, including:
  - rule intent and business description
  - severity and enforcement level
  - thresholds and tolerances
  - expected behaviour and override guidance
- Define deterministic data quality controls across:
  - ingestion validations
  - data product and cross-system controls
  - system- and application-level checks
- Ensure rules are unambiguous, testable, and monitorable, resolving specification-level ambiguity independently where possible.
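By way of illustration, a build-ready specification for one such control might pair the fields above with an executable check. The rule ID, the policy_ingest staging table, and the premium_amount column below are hypothetical, and the syntax assumes a PostgreSQL-style engine:

    -- Rule ID:    DQ-ING-0042 (hypothetical identifier)
    -- Intent:     written premium must be positive and within the bound
    --             agreed with the Data Owner
    -- Severity:   error; blocks ingestion
    -- Threshold:  0 < premium_amount <= 50,000,000
    -- Override:   permitted with Data Owner sign-off and a logged reason
    -- The query returns offending rows; an empty result means a pass.
    SELECT record_id,
           premium_amount
    FROM   policy_ingest
    WHERE  premium_amount <= 0
       OR  premium_amount > 50000000;

Returning the offending rows, so that an empty result means a pass, is one convention that keeps a rule unambiguous, testable, and monitorable.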
Validation Catalogue Ownership
- Own and maintain the validation catalogue content as a governed artefact, including:
  - rule metadata and descriptions
  - ownership mapping
  - lifecycle status (proposed, active, noisy, deprecated)
  - linkage to datasets and Atlan metadata
- Ensure catalogue hygiene and consistency as rule volumes scale.
- Rationalise validation inventories by:
  - identifying duplicate or overlapping rules
  - consolidating low-value checks
  - recommending retirement of obsolete or ineffective controls
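As a sketch of what that rationalisation can look like in practice, a query along the following lines would flag active rules that target the same dataset and column. The validation_catalogue table and its columns are hypothetical names, with PostgreSQL-style syntax:

    -- Flag candidate duplicates: more than one active rule
    -- checking the same dataset and column.
    SELECT target_dataset,
           target_column,
           COUNT(*)                  AS rule_count,
           STRING_AGG(rule_id, ', ') AS candidate_duplicates
    FROM   validation_catalogue
    WHERE  lifecycle_status = 'active'
    GROUP  BY target_dataset, target_column
    HAVING COUNT(*) > 1;

Flagged groups would still need review before consolidation or retirement, since two rules on the same column may check genuinely different conditions.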
Control Monitoring, Interpretation & Tuning
- Monitor rule behaviour, including:
  - failure rates
  - override frequency
  - noise patterns
- Interpret control behaviour and form hypotheses about root cause.
- Propose bounded refinements to thresholds or logic where evidence is clear.
- Escalate to the Data Quality Lead where changes affect agreed Data Quality appetite or enforcement posture.
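To make the monitoring loop concrete: assuming a hypothetical validation_results execution log (PostgreSQL-style syntax), a query such as the one below would surface rules whose failure and override rates suggest a threshold worth revisiting:

    -- Rules that fail often over the last 30 days, with their
    -- override rates; frequent failure plus frequent override
    -- is a classic noise pattern.
    SELECT rule_id,
           COUNT(*) AS executions,
           AVG(CASE WHEN outcome = 'fail' THEN 1.0 ELSE 0 END) AS failure_rate,
           AVG(CASE WHEN overridden       THEN 1.0 ELSE 0 END) AS override_rate
    FROM   validation_results
    WHERE  executed_at >= CURRENT_DATE - INTERVAL '30 days'
    GROUP  BY rule_id
    HAVING AVG(CASE WHEN outcome = 'fail' THEN 1.0 ELSE 0 END) > 0.05
    ORDER  BY failure_rate DESC;

Evidence of this kind is what underpins the bounded threshold refinements described above.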
Anomaly Detection & RCA Support
- Perform first-pass interpretation of anomalies and outliers.
- Analyse data and control behaviour to support Root Cause Analysis (RCA).
- Provide evidence-based recommendations to inform DQ Lead decisions.
- Support Data Owners and Product during RCA with technical and analytical insight.
Must-Have Experience
- 3–6 years’ experience in data quality, data governance, analytics engineering, or data operations roles.
- Experience working with large, complex datasets with high volumes of data quality validations.
- Strong analytical mindset, able to interpret data behaviour, trends, distributions, outliers and cross-dataset consistency.
- Proven experience interpreting data quality issues at scale and forming evidence-based recommendations on rule refinement, thresholds, and upstream process improvements.
- Experience defining and maintaining deterministic data quality controls, including clear rule intent, severity, thresholds and expected behaviour.
- Validation catalogue experience, including maintaining rule metadata, ownership and lifecycle status.
- Working knowledge of SQL sufficient to interrogate datasets independently, explore data distributions, validate assumptions behind proposed controls, and investigate unexpected validation behaviour (see the sketch after this list).
- Experience supporting Root Cause Analysis (RCA) through data analysis and evidence gathering.
- Ability to operate effectively in a maturing data quality environment, applying defined standards and guardrails while tooling and processes continue to evolve.
- Strong collaboration skills, working closely with Product, Technology and Data teams to ensure controls are implemented as specified.
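As an indication of the level of SQL involved (referenced from the SQL point above), profiling a distribution before proposing a threshold might look like the sketch below; policy_ingest and premium_amount are hypothetical names, and PERCENTILE_CONT assumes a PostgreSQL-style engine:

    -- Inspect the distribution before setting tolerances,
    -- rather than guessing a threshold.
    SELECT COUNT(*)            AS row_count,
           MIN(premium_amount) AS min_value,
           PERCENTILE_CONT(0.5)  WITHIN GROUP (ORDER BY premium_amount) AS median,
           PERCENTILE_CONT(0.99) WITHIN GROUP (ORDER BY premium_amount) AS p99,
           MAX(premium_amount) AS max_value
    FROM   policy_ingest;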