Data Pipeline Engineer

Sorry, this job was removed at 11:06 a.m. (CST) on Friday, July 9, 2021

What You'll Do

As a Data Pipeline Engineer, you will be responsible for building robust, high-quality, and scalable solutions to expand and improve b.well’s data ingestion pipeline. You’ll work directly with business and product teams to take specs from concept to reality, all while improving the existing pipeline’s functionality and performance. You’ll be expected to take ownership, juggle a wide array of roles and responsibilities, and help grow our Austin technical presence.

  • Design, implement, and maintain b.well’s data pipeline using Python, Spark, Prefect, and other modern technologies
  • Develop tooling and automation to ingest files in dozens of formats and standards through the data pipeline
  • Help mentor other developers to improve their career development and coding abilities
  • Work closely with other engineers, designers, and management teams to rapidly build, iterate, test, and deploy new features and products, and maintain a high-quality, robust code base
  • Improve and scale the existing products and tools
  • Launch new projects from ideation to completion
  • Monitor and reduce cybersecurity product risk
  • Safeguard sensitive data by following policies and training concerning your security and privacy responsibilities
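As a flavor of the ingestion work above, here is a minimal, hypothetical sketch of one pipeline step: normalizing CSV or JSON payloads into a common list-of-dicts record format before downstream processing. It uses only the standard library; in practice a tool like Prefect would orchestrate steps like this, and the function name and format handling here are illustrative assumptions, not b.well's actual code.

```python
import csv
import io
import json

def ingest(payload: str, fmt: str) -> list[dict]:
    """Hypothetical ingest step: parse a raw payload into a list of records."""
    if fmt == "csv":
        # DictReader maps each data row onto the header row's column names.
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        data = json.loads(payload)
        # Accept either a JSON array of records or a single record object.
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")

csv_rows = ingest("id,name\n1,Ann\n2,Bob", "csv")
json_rows = ingest('[{"id": "1", "name": "Ann"}]', "json")
print(csv_rows[0]["name"])  # Ann
```

Normalizing every source format into one record shape early is what lets the rest of the pipeline stay format-agnostic.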

Job Requirements:

  • 5+ years of professional programming experience
  • 3+ years of Python experience
  • Exceptional and demonstrable data engineering experience
  • Experience in loading, validating, cleaning, and manipulating data files
  • Strong experience with relational and/or NoSQL databases
  • Deep understanding of JSON, XML, CSV
  • Experience with streaming data
  • Strong experience with cloud-based infrastructure
  • Comfort with Linux/Unix command line
  • Experience working with offshore development teams
  • Experience working closely with design and PM individuals/teams
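To illustrate the "loading, validating, cleaning, and manipulating data files" requirement, here is a minimal sketch of a validate/clean pass over parsed rows. The required-field set, the whitespace stripping, and the type coercion are all illustrative assumptions about what such a pass might do.

```python
# Hypothetical required fields for a record to be considered valid.
REQUIRED = {"id", "name"}

def clean(rows: list[dict]) -> list[dict]:
    """Strip whitespace, drop rows missing required fields, coerce id to int."""
    out = []
    for row in rows:
        # Trim stray whitespace from every string value.
        row = {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        # Keep only rows where every required field is present and non-empty.
        if REQUIRED <= row.keys() and all(row[k] for k in REQUIRED):
            row["id"] = int(row["id"])
            out.append(row)
    return out

rows = [{"id": " 1 ", "name": " Ann "}, {"id": "", "name": "Bob"}]
print(clean(rows))  # [{'id': 1, 'name': 'Ann'}]
```

Keeping validation as its own step, separate from parsing, makes it easy to unit-test against malformed inputs in isolation.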

Great to have:

  • 5+ years of advanced Python experience
  • 3+ years of data pipeline engineering experience
  • 1+ years of experience with Spark
  • Experience with Snowflake, Redshift, or other columnar databases
  • Experience with Pandas, Anaconda, or similar libraries
  • Experience with Airflow or Prefect
  • Experience with Docker
  • Experience scaling technology solutions to hundreds of thousands of active users
  • Experience mentoring other developers
  • Deep understanding of common API methodologies
  • Strong experience with unit testing and test-driven development
  • Startup experience
  • Advanced degree in Computer Science

Blow Us Away:

  • Experience working with third-party healthcare APIs, data streams, and/or flat files
  • Experience in cybersecurity
  • Experience with HIPAA, HITECH, or HITRUST
  • An active GitHub profile or other public code portfolio
  • Active Stack Overflow profile
  • Documented work on open source projects

Location

We are conveniently located in North Austin, but currently we all work remotely (sometimes in our pajamas).
