Senior Data Engineer at Bestow
Who We Are
Bestow is a digital life insurance company built on full stack technology and AI. In a world in need of greater financial resilience and protection, Bestow democratizes access to smart financial products and powers some of the world’s leading consumer platforms. We are reimagining and rebuilding a 400-year-old, $7 trillion industry to create a brighter future for millions of families. And we’re just getting started.
The Bestow team is a diverse band of first principles thinkers on a mission to do good. We’re fortunate to be backed by leading investors and partners including Valar Ventures, NEA, 8VC and MunichRe.
Bestow is paving a new way to get life insurance. Do you want to build products that reinvent a centuries-old industry? If so, we would love to hear from you.
We are hiring a Senior Data Engineer to design and build data pipeline and warehouse solutions. You enjoy creating solutions with the latest data technologies, and you have a background in traditional ETL and data pipeline tools.
Bestow engineers are great teammates. You will be spending a significant portion of your time working with application and infrastructure engineers, as well as stakeholders across the company. You’ll be learning how the engineers export and ingest data in order to determine optimal solutions for the stakeholders. As such, you have exceptional written and verbal communication skills.
Your Day to Day:
Data Pipelines / ETL
Build robust solutions for transferring data between first- and third-party applications and our data warehouse using Apache Airflow; design a Kafka-based streaming ETL solution from the ground up to stream application data into the warehouse in a normalized structure;
Create data pipelines to and from internal and external sources while maintaining quality of the data;
Create a framework for adding additional data sources, transformations and destinations;
Work with Site Reliability Engineers to ensure continuous integration and automated deployment.
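To give a flavor of the kind of transformation step such a pipeline task might run, here is a minimal Python sketch that flattens a nested application event into a normalized row ready for warehouse loading. The event fields and function names below are hypothetical illustrations, not Bestow's actual schema:

```python
import json
from datetime import datetime, timezone

def normalize_event(raw: str) -> dict:
    """Flatten a nested application event into a single warehouse row.

    Field names here are illustrative only; a real pipeline would map
    each producing service's schema to the warehouse's column names.
    """
    event = json.loads(raw)
    applicant = event.get("applicant", {})
    return {
        "event_id": event["id"],
        "event_type": event["type"],
        "applicant_state": applicant.get("state"),
        "coverage_amount": event.get("policy", {}).get("coverage", 0),
        # Store timestamps in UTC ISO-8601 so downstream queries need no
        # per-row timezone handling.
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

raw = ('{"id": "e-1", "type": "application_submitted", '
       '"applicant": {"state": "TX"}, "policy": {"coverage": 500000}}')
row = normalize_event(raw)
print(row["event_type"], row["coverage_amount"])
```

In an Airflow deployment, a function like this would typically be wrapped in a task (e.g. a PythonOperator) so the scheduler handles retries and backfills rather than the transform code itself.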
Data Warehouse
We leverage Google BigQuery as our data warehouse. You will help administer BigQuery and design schemas for effective data use, and collaborate with team members to understand their data-access use cases and design solutions;
Design schemas for efficient storage and query execution;
Evaluate the company’s production of data and transform it into a single source of truth in the data warehouse.
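A common piece of "single source of truth" work is reconciling records about the same entity from multiple producers. As a rough sketch (the source names and precedence rule below are assumptions for illustration), a merge step might keep, for each field, the value from the highest-priority source that supplies it:

```python
# Merge records about the same entity from multiple sources into one
# canonical row. Lower number = higher priority (assumed precedence).
SOURCE_PRIORITY = {"policy_admin": 0, "crm": 1, "web_app": 2}

def merge_records(records: list[dict]) -> dict:
    """For each field, keep the value from the highest-priority source."""
    merged: dict = {}
    # Visit lowest-priority sources first so higher-priority sources
    # overwrite their values on later iterations.
    ordered = sorted(records, key=lambda r: SOURCE_PRIORITY[r["source"]],
                     reverse=True)
    for rec in ordered:
        for key, value in rec.items():
            if key != "source" and value is not None:
                merged[key] = value
    return merged

records = [
    {"source": "web_app", "email": "a@example.com", "phone": None},
    {"source": "crm", "email": "a@example.com", "phone": "555-0100"},
    {"source": "policy_admin", "email": "canonical@example.com", "phone": None},
]
print(merge_records(records))
```

The same precedence logic can also be expressed as a warehouse-side SQL transformation; doing it in the pipeline versus in BigQuery is a design choice that depends on where the team wants the business rules to live.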
A Little About You
- 4+ years of data engineering and/or DBA experience
- Experience with columnar databases such as Google BigQuery or Amazon Redshift
- You're a whiz with relational databases such as PostgreSQL or MySQL
- You have experience ingesting data from multiple sources into data warehouses
- You have worked with cloud hosting platforms like AWS or others (GCP preferred)
- Proficient with Python 3
- Writing scripts in languages such as Bash, Python, or Ruby is part of your daily job
- Experience with distributed systems and microservices
- Bonus if you have experience with real-time data pipelines like Kafka or Alooma
- Clear, concise written and verbal communication
- A desire and willingness to learn
- Initiative and motivation to make things happen
What We Can Offer to You
- Competitive salary
- Generous PTO
- Flexible schedule and work/life balance
- 100% company-paid health, dental, and vision insurance
- Choose your own computer setup (Mac or PC)
- Office snacks and weekly team lunches
- Team building events and activities
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Bestow does not currently sponsor applicants for work visas.