Full Stack Engineer, Data Tooling
Come work with us!
Smarter Sorting is a VC-funded, Austin-based startup that uses data-driven technology to turn Household Hazardous Waste into reusable product. We’re on a mission to use machine learning to build the world's first smart chemical database (and save the world)! By helping North America's largest retailers and biggest cities identify items in their regulated waste streams, our tech literally turns waste into product, saving our customers money and increasing reuse.
The Smarter Sorting team is made up of top-notch talent who are driven circular economy practitioners. We're determined to build world-class tech that transforms the chemical waste management industry, speeding our advancement towards a zero waste future.
We are seeking experienced and self-driven Full Stack Engineers who live and breathe Ruby and are excited to dive into data. Our Full Stack Engineer will architect and develop a suite of data ingestion and curation tools for internal and external use. As a member of the data team, this individual will help build the future data platform for the regulated waste industry and also contribute to our cloud platform services. This individual will work with a talented, distributed engineering team that is focused on delivering products on a scalable and efficient platform.
What You Bring:
- 2+ years of experience as a Full Stack Ruby Engineer
- Bachelor’s degree in Computer Science, Statistics, Informatics, Information Systems or other quantitative field is preferred
- Ability to design and build successful full-stack applications in Ruby, Rails & Python
- Experience supporting and working with cross-functional teams in a dynamic environment
- Experience with relational (SQL) and NoSQL databases such as Postgres, MongoDB & Elasticsearch
- Data pipeline experience is a major plus
What You'll Do:
- Build and maintain our outward-facing data portals and internal toolsets, and provide the glue for our machine learning efforts
- Work closely with our Product Team to determine requirements for data projects
- Deliver tools for scalable data editing and sourcing
- Design and build a suite of data ingestion and curation tools for internal and external use
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability
- Deploy secure applications on our Docker/Kubernetes (K8s) architecture
- Own one or more home-grown data toolsets
- Contribute to the deeper platform stack, making contributions to our platform and ETL infrastructure
You Are:
- Mindful of security considerations
- Passionate about clean, functional code
- Interested in learning new techniques
- Able to define and complete a fault-tolerant web application with minimal guidance
- Able to work in a distributed team environment
- An extremely clear communicator
- Enthusiastic about data
Your Success Metrics:
- 1 month: Rapidly learn our current web-application portfolio. Take ownership of one of our home-grown tools. Grok our platform infrastructure.
- 3 months: Design and build out tooling for up to 3 mechanical and internal campaigns. Design and build one or more admin tools for operations admins to rapidly deploy campaigns and monitor results. Contribute to the platform migration to Architecture 2.0.
- 6 months: Fully own two or more data tooling applications. Become a full contributor to the larger platform.
If this sounds like you, apply now!