
Lower

Senior Data Engineer

Posted One Month Ago
Hybrid
2 Locations
150K-180K Annually
Senior level
As a Senior Data Engineer, you will design ETL pipelines, optimize data infrastructure on Snowflake, and ensure data quality through robust frameworks and testing. Collaboration with stakeholders is key to enhance analytics and improve data-driven decision-making.

Here at Lower, we believe homeownership is the key to building wealth, and we’re making it easier and more accessible than ever. As a mission-driven fintech, we simplify the home-buying process through cutting-edge technology and a seamless customer experience.

With tens of billions in funded home loans and top ratings on Trustpilot (4.8), Google (4.9), and Zillow (4.9), we’re a leader in the industry. But what truly sets us apart? Our people. Join us and be part of something bigger.

Job Description:

We are seeking a Senior Data Engineer to play a key role in building and optimizing our data infrastructure to support business insights and decision-making. In this role, you will design and enhance denormalized analytics tables in Snowflake, build scalable ETL pipelines, and ensure data from diverse sources is transformed into accurate, reliable, and accessible formats. You will collaborate with business and sales stakeholders to gather requirements, partner with developers to ensure critical data is captured at the application level, and optimize existing frameworks for performance and integrity. This role also includes creating robust testing frameworks and documentation to ensure quality and consistency across data pipelines.

What you'll do:

  • Data Pipeline Engineering:
      • Design, develop, and optimize high-performance ETL/ELT pipelines using Python, dbt, and Snowflake.
      • Build and manage real-time ingestion pipelines leveraging AWS Lambda and CDC systems.
  • Cloud & Infrastructure:
      • Develop scalable serverless solutions with AWS, adopting event-driven architecture patterns.
      • Manage containerized applications using Docker and infrastructure as code via GitHub Actions.
  • Advanced Data Management:
      • Create sophisticated, multi-layered Snowflake data models optimized for scalability, flexibility, and performance.
      • Integrate and manage APIs for Salesforce, Braze, and various financial systems, emphasizing robust error handling and reliability.
  • Quality Assurance & Operations:
      • Implement robust testing frameworks, data lineage tracking, monitoring, and alerting.
      • Enhance and manage CI/CD pipelines, drive migration to modern orchestration tools (e.g., Dagster, Airflow), and manage multi-environment deployments.
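For a concrete flavor of the pipeline work above, here is a minimal, hypothetical sketch of a pandas transform step of the kind an ETL pipeline feeding Snowflake might include. The function and field names are invented for illustration, not taken from Lower's actual codebase.

```python
import pandas as pd

def transform_loan_events(records: list[dict]) -> pd.DataFrame:
    """Normalize raw loan events into an analytics-ready frame.

    Hypothetical example: drop malformed rows, dedupe replayed CDC
    events, and cast amounts to numeric before loading downstream.
    """
    df = pd.DataFrame.from_records(records)
    # Keep only rows carrying the fields an analytics table requires.
    df = df.dropna(subset=["event_id", "loan_amount"])
    # Change-data-capture feeds often replay events; keep the latest copy.
    df = df.drop_duplicates(subset="event_id", keep="last")
    # Coerce amounts; unparseable values become NaN and are dropped.
    df["loan_amount"] = pd.to_numeric(df["loan_amount"], errors="coerce")
    df = df.dropna(subset=["loan_amount"])
    return df.reset_index(drop=True)
```

A real pipeline would wrap a step like this in orchestration, tests, and lineage tracking, but the shape (validate, dedupe, cast, load) is the core pattern the role describes.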

Who you are:

  • 5+ years of data engineering experience, ideally with cloud-native architectures. 
  • Expert-level Python skills, particularly with pandas, SQLAlchemy, and asynchronous processing. 
  • Advanced SQL and Snowflake expertise, including stored procedures, external stages, performance tuning, and complex query optimization. 
  • Strong proficiency with dbt, including macro development, testing, and automated deployments. 
  • Production-grade pipeline experience, specifically with AWS Lambda, S3, API Gateway, and IAM. 
  • Proven experience with REST APIs, authentication patterns, and handling complex data integrations. 
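The error-handling expectation in the last bullet can be illustrated with a small retry-with-backoff wrapper, a common pattern around flaky REST calls. This is a generic sketch, not Lower's implementation; the function names are invented.

```python
import time

def call_with_retries(fn, max_attempts=3, base_delay=0.1,
                      retryable=(TimeoutError, ConnectionError)):
    """Retry a flaky callable with exponential backoff.

    Hypothetical sketch of the pattern an API integration
    (Salesforce, Braze, etc.) might wrap around each request.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts:
                raise  # exhausted retries; surface the error
            # Back off exponentially before the next attempt.
            time.sleep(base_delay * 2 ** (attempt - 1))
```

In production this would typically also distinguish retryable HTTP status codes (429, 5xx) from permanent failures (4xx) and log each attempt.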

Preferred Experience 

  • Background in financial services or fintech, particularly loan processing, customer onboarding, or compliance. 
  • Experience with real-time streaming platforms like Kafka or Kinesis. 
  • Familiarity with Infrastructure as Code tools (Terraform, CloudFormation). 
  • Knowledge of BI and data visualization tools (Tableau, Looker, Domo). 
  • Container orchestration experience (ECS, Kubernetes). 
  • Understanding of data lake architectures and Delta Lake. 

Technical Skills 

  • Programming: Python (expert), SQL (expert), Bash scripting. 
  • Cloud: AWS (Lambda, S3, API Gateway, CloudWatch, IAM). 
  • Data Warehouse: Snowflake, dimensional modeling, query optimization. 
  • ETL/ELT: dbt, pandas, custom Python workflows. 
  • DevOps: GitHub Actions, Docker, automated testing. 
  • APIs: REST integration, authentication, error handling. 
  • Data Formats: JSON, CSV, Parquet, Avro. 
  • Version Control: Git, GitHub workflows. 
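As a small example of the format handling listed above, here is a standard-library sketch converting newline-delimited JSON records to CSV; the helper name and fields are hypothetical.

```python
import csv
import io
import json

def jsonl_to_csv(jsonl_text: str, fields: list[str]) -> str:
    """Convert newline-delimited JSON records to CSV text.

    Hypothetical sketch of a format-conversion step; extra keys
    in a record are ignored, and only `fields` columns are emitted.
    """
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    for line in jsonl_text.splitlines():
        if line.strip():
            writer.writerow(json.loads(line))
    return out.getvalue()
```
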

What Sets You Apart 

  • Systems Thinking: You see the big picture, designing data flows that scale and adapt with the business. 
  • Problem Solver: You quickly diagnose and resolve complex data issues across diverse systems and APIs. 
  • Quality Advocate: You write comprehensive tests, enforce data quality standards, and proactively prevent data issues. 
  • Collaborative: You thrive working alongside analysts, developers, and product teams, ensuring seamless integration and teamwork. 
  • Continuous Learner: You actively seek emerging data technologies and best practices to drive innovation. 
  • Business Impact: You understand how your data engineering decisions directly influence and drive business outcomes. 

Benefits & Perks 

  • Competitive salary and comprehensive benefits (healthcare, dental, vision, 401k match) 
  • Hybrid work environment (primarily remote, with two days a week in downtown Columbus, Ohio) 
  • Professional growth opportunities and internal promotion pathways 
  • Collaborative, mission-driven culture recognized as a local and national "best place to work" 

Lower provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.

Top Skills

AWS
dbt
Docker
GitHub Actions
Kafka
Kinesis
Lambda
Python
Snowflake
SQL

What you need to know about the Austin Tech Scene

Austin has a diverse and thriving tech ecosystem thanks to home-grown companies like Dell and major campuses for IBM, AMD and Apple. The state’s flagship university, the University of Texas at Austin, is known for its engineering school, and the city is known for its annual South by Southwest tech and media conference. Austin’s tech scene spans many verticals, but it’s particularly known for hardware, including semiconductors, as well as AI, biotechnology and cloud computing. And its food and music scene, low taxes and favorable climate have made the city a destination for tech workers from across the country.

Key Facts About Austin Tech

  • Number of Tech Workers: 180,500; 13.7% of overall workforce (2024 CompTIA survey)
  • Major Tech Employers: Dell, IBM, AMD, Apple, Alphabet
  • Key Industries: Artificial intelligence, hardware, cloud computing, software, healthtech
  • Funding Landscape: $4.5 billion in VC funding in 2024 (Pitchbook)
  • Notable Investors: Live Oak Ventures, Austin Ventures, Hinge Capital, Gigafund, KdT Ventures, Next Coast Ventures, Silverton Partners
  • Research Centers and Universities: University of Texas, Southwestern University, Texas State University, Center for Complex Quantum Systems, Oden Institute for Computational Engineering and Sciences, Texas Advanced Computing Center
