Senior Data Engineer
Bestow is a smarter approach to life insurance, using big data and technology to bring simple, affordable coverage to everyone.
Who We Are
First thing you need to know: Bestow is not your typical insurance company. We’ve completely reimagined industry assumptions and harnessed technology and big data to create life insurance products that aren’t just delightful to use but also deliver on our mission: to make financial products accessible to more people than ever before.
Check us out at hellobestow.com
Who We’re Looking For
“Innovative” and “disruptive” can be overused words. But at Bestow, they aren’t jargon; they’re an
everyday rallying cry. We are rewriting the rules of the 400-year-old, $7 trillion life insurance
industry, serving communities who have been underserved in the past, and we believe the
future is a bright one.
So, if you’re excited by the idea of building an industry-defining brand from scratch, working at a
mission-driven company, and fundamentally shaking up an age-old industry, read on, friend.
As a Senior Data Engineer, you will design and build data pipeline and warehouse solutions. This is a greenfield opportunity to lay the foundation of our data platform. You enjoy creating solutions that apply the latest data technologies, and you also have a background in traditional ETL and data pipeline tools.
Bestow engineers are great teammates. You will be spending a significant portion of your time working with application and infrastructure engineers. You’ll be learning how they export and ingest data in order to determine optimal solutions. As such, you have exceptional written and verbal communication skills.
Do you want to build products that reinvent a centuries-old industry? If so, we'd love to hear from you.
Challenges On Which You Can Expect to Work:
Data Pipelines / ETL
We’re aiming to build a unified solution for transferring data from first and third-party applications to our data warehouse.
Evaluate data pipeline tools, design a robust data pipeline solution and implement the design;
Create a framework for adding additional data sources, transformations and destinations;
Work with Site Reliability Engineers to ensure continuous integration and automated deployment.
Data Warehouse
We leverage Google BigQuery as our data warehouse. You will help administer BigQuery and design schemas for effective data use.
Collaborate with team members to understand their data access use cases and design solutions;
Design schemas for efficient storage and query execution;
Create logic to transform data from source systems to our data warehouse.
We're Seeking Someone Who Has
- 4+ years of data engineering and/or DBA experience
- Professional experience with columnar databases such as Google BigQuery or Amazon Redshift
- Professional experience with relational databases such as PostgreSQL or MySQL
- Professional experience with multi-source data ingestion into data warehouses
- Professional experience with a cloud hosting platform (GCP preferred)
- Bonus if you have experience with real-time data pipelines like Kafka or Alooma
- Experience with Python 3 and Golang
- Experience with scripting languages such as Bash, Python or Ruby
- Experience with distributed systems and microservices
- Clear, concise written and verbal communication
- A desire and willingness to learn
- Initiative and motivation to make things happen
What We Can Offer You:
- Competitive salary
- Generous PTO
- Flexible schedule and work/life balance
- 100% company-paid health, dental, and vision insurance
- Choose your own computer setup (Mac or PC)
- Office snacks and weekly team lunches
- Team building events and activities
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.