
COVU

Senior Data Engineer

Reposted 14 Hours Ago
Remote
Hiring Remotely in United States
Senior level

About COVU
COVU is a venture-backed technology startup transforming the insurance industry. We empower independent agencies with AI-driven insights and digitized operations, enabling them to manage risk more effectively. Our team is building an AI-first company set to redefine the future of insurance distribution.

Location:

This role can be hybrid or remote. If the candidate is based in the Los Angeles (LA) area, it will be a hybrid role working from our office in West Hollywood. For candidates based anywhere else in the US, this will be a fully remote role.

Role Overview

We are seeking an experienced and product-focused Senior Data Engineer to be a core member of our Platform product team. This is a high-impact role where you will play a pivotal part in evolving our core data infrastructure.

Your primary mission will be to develop key components of our "Policy Journal" - the foundational data asset that will serve as the single source of truth for all policy, commission, and client accounting information. You will work closely with the Lead Data Engineer and business stakeholders to translate requirements into robust data models and scalable pipelines that drive analytics and operational efficiency for our agents, managers, and leadership.

This role requires a blend of greenfield development, strategic refactoring of existing systems, and a deep understanding of how to create trusted, high-quality data products.

What You’ll Do:

  • Develop the Policy Journal: Be a primary builder of our master data solution that unifies policy, commission, and accounting data from sources like IVANS and Applied EPIC. You will implement the data models and pipelines that create the "gold record" powering our platform.
  • Ensure Data Quality and Reliability: Implement robust data quality checks, monitoring, and alerting to ensure the accuracy and timeliness of all data pipelines. You will champion and contribute to best practices in data governance and engineering.
  • Build the Foundational Analytics Platform: Implement and enhance our new analytics framework using modern tooling (e.g., Snowflake, dbt, Airflow). You will build and optimize critical data pipelines, transforming raw data into clean, reliable, and performant dimensional models for business intelligence.
  • Modernize Core ETL Processes: Systematically refactor our existing Java & SQL (PostgreSQL) based ETL system. You will identify and resolve core issues (e.g., data duplication, performance bottlenecks), strategically rewriting critical components in Python and migrating orchestration to Airflow.
  • Implement Data Quality Frameworks: Working within our company's QA strategy, you will build and execute automated data validation frameworks. You will be responsible for writing tests that ensure the accuracy, completeness, and integrity of our data pipelines and the Policy Journal.
  • Collaborate and Contribute to Design: Partner with product managers, the Lead Data Engineer, and business stakeholders to understand complex business requirements. You will be a key technical contributor, translating business needs into well-designed and maintainable solutions.
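As a rough illustration of the automated data-validation work described above (all names, fields, and checks here are hypothetical examples, not COVU's actual codebase or schema), a minimal batch-validation sketch in Python might look like:

```python
from dataclasses import dataclass

@dataclass
class PolicyRecord:
    # Illustrative fields only; a real policy record would carry far more.
    policy_id: str
    premium: float
    commission: float

def validate_policies(records):
    """Run basic completeness and integrity checks on a batch of
    policy records; return a list of human-readable issues."""
    issues = []
    seen = set()
    for r in records:
        if not r.policy_id:
            issues.append("missing policy_id")
        elif r.policy_id in seen:
            issues.append(f"duplicate policy_id: {r.policy_id}")
        seen.add(r.policy_id)
        if r.premium < 0:
            issues.append(f"negative premium on {r.policy_id}")
        if r.commission > r.premium:
            issues.append(f"commission exceeds premium on {r.policy_id}")
    return issues

batch = [
    PolicyRecord("P-1001", 1200.0, 180.0),
    PolicyRecord("P-1001", 1200.0, 180.0),  # duplicate row
    PolicyRecord("P-1002", -50.0, 10.0),    # negative premium (also fails commission check)
]
print(validate_policies(batch))
```

In practice checks like these would run inside the orchestrator (e.g., as an Airflow task or dbt test) with alerting on failure, rather than as a standalone script.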

What We're Looking For:

  • 5+ years of experience in data engineering, with a proven track record of building and maintaining scalable data pipelines in production.
  • Expert-level proficiency in Python and SQL.
  • Strong experience with modern data stack technologies, including a cloud data warehouse (Snowflake or Redshift), a workflow orchestrator (Airflow is highly preferred), and data transformation tools.
  • Hands-on experience with AWS data services (e.g., S3, Glue, Lambda, RDS).
  • Experience in the insurance technology (insurtech) industry and familiarity with insurance data concepts (e.g., policies, commissions, claims).
  • Demonstrated ability to contribute to the design and implementation of robust data models (e.g., dimensional modeling) for analytics and reporting.
  • A pragmatic problem-solver who can analyze and refactor complex legacy systems. While you won't be writing new Java code, the ability to read and understand existing Java/Hibernate logic is a strong plus.
  • Excellent communication skills and the ability to collaborate effectively with both technical and non-technical stakeholders.
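For context on the dimensional-modeling requirement above: it refers to structuring warehouse data as fact tables joined to dimension tables (a star schema). A toy sketch using Python's built-in sqlite3 (table and column names are illustrative only, not COVU's schema):

```python
import sqlite3

# Toy star schema: one dimension table and one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_policy (
        policy_key INTEGER PRIMARY KEY,
        policy_number TEXT,
        carrier TEXT
    );
    CREATE TABLE fact_commission (
        policy_key INTEGER REFERENCES dim_policy(policy_key),
        amount REAL
    );
    INSERT INTO dim_policy VALUES (1, 'P-1001', 'Acme Mutual'),
                                  (2, 'P-1002', 'Globex Insurance');
    INSERT INTO fact_commission VALUES (1, 180.0), (1, 20.0), (2, 95.0);
""")

# A typical analytics query: total commission per carrier.
rows = conn.execute("""
    SELECT d.carrier, SUM(f.amount)
    FROM fact_commission f
    JOIN dim_policy d USING (policy_key)
    GROUP BY d.carrier
    ORDER BY d.carrier
""").fetchall()
print(rows)  # [('Acme Mutual', 200.0), ('Globex Insurance', 95.0)]
```

In a warehouse like Snowflake the same shape would be built as dbt models, with facts holding measures (commission amounts) and dimensions holding the attributes analysts slice by.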

Bonus Points For:

  • Direct experience working with data from Agency Management Systems like Applied EPIC, NowCerts, EZLynx, etc.
  • Direct experience working with carrier data (ACORD XML, IVANS AL3)
  • Experience with business intelligence tools like Tableau, Looker, or Power BI.
  • Prior experience in a startup or fast-paced agile environment.

Application Process:

  1. Intro call with People team
  2. Technical interviews
  3. Final interview with leaders

Top Skills

Airflow
AWS
dbt
Glue
Lambda
Postgres
Python
RDS
S3
Snowflake
SQL
