Data Engineer at Sysco LABS (Austin, TX)
Sysco LABS is a technology-focused division within Sysco, dedicated to reimagining foodservice through innovation. An extension of Sysco's commitment to deliver exceptional products and services to the foodservice industry, Sysco LABS uses customer and market intelligence, data-driven insights and agile technology development to rethink the entire foodservice ecosystem.
Our innovation will improve everything from the ordering process, inventory, pricing and automation to the in-restaurant customer experience. Operating with the mindset of a startup and backed by the authoritative expertise of an industry leader, Sysco LABS' mission is to improve the Sysco customer experience and consistently deliver cost savings and new innovations through technology.
About the Role
Build scalable data and analytics models and architecture from scratch, spanning event formation, ingestion, and reporting. Implement robust systems used by every team within Sysco LABS. Shape the vision and architecture of the end-to-end pipeline while following industry best practices.
- Manage all aspects of the data and analytics system from stream configuration to ETL to aggregate tables and cubes for reporting needs. Establish and maintain the data pipeline architecture.
- Partner with all facets of the business, including development, operations and consumers of data. Form and maintain relationships with stakeholders to understand their needs and translate those needs into priorities and executable plans.
- Write scripts to schedule data ingestion and syncing. Iteratively build back-end logic that turns requirements into data marts so stakeholders can self-serve.
- Other duties as assigned.
Requirements
- 3 years of hands-on experience in data platform design and development in large-scale, complex environments.
- Experience with SQL and Python.
- Experience with Redshift and Postgres.
- Experience with AWS platform and technologies including Kinesis, Firehose and Glue.
- Experience with cloud storage such as S3.
- Experience orchestrating data ingestion and syncing metadata with event data.
- Experience with ETL for reporting and business users (marketing, finance, product owners).
- Recognized degree in computer science or certification in a related technical field.
Preferred Qualifications
- 5 years of hands-on experience in data platform design and development in large-scale, complex environments.
- Experience with Tableau and Tableau Server.