
Truckstop

Data Engineer

Reposted Yesterday
Remote
Hiring Remotely in US
100K-105K Annually
Senior level
As a Data Engineer II, you will design and maintain scalable ELT pipelines and data models, develop data infrastructure, optimize storage, and enhance data integrations. You will collaborate with cross-functional teams to ensure data quality and leverage AI tools to improve delivery.
The summary above was generated by AI

At Truckstop, we have transformed the entire freight-moving lifecycle with our SaaS solutions. From freight matching to payments and everything in between, we are the trusted partner for carriers, brokers, and shippers alike. We lead this industry forward with our One Team mindset, committing to principles such as assume positive intent, have each other’s back, and be your authentic self. Our drive for greatness produces high expectations, yet our regard for humans is even higher. Join a team of brilliant minds and generous hearts who care deeply about others’ success.

Data Engineer II — Truckstop.com 

Truckstop.com is seeking a seasoned Data Engineer II to help strengthen and scale our modern data platform. In this role, you’ll build and optimize the pipelines, models, and infrastructure that power our analytics, product intelligence, and customer-facing solutions. We’re looking for someone who thrives in a fast-moving environment, collaborates well across teams, and embraces AI-driven development to move faster and smarter. 

What You’ll Do 

  • Design, build, and maintain scalable ELT pipelines and data models with Snowflake, dbt, and SQL (an illustrative sketch follows this list). 
  • Develop data infrastructure and platform components using Terraform, Python, and modern orchestration tools. 
  • Work closely with engineering, analytics, and product teams to ensure data quality, reliability, and availability. 
  • Optimize ingestion, transformation, and storage patterns across Postgres and other relational systems. 
  • Partner with BI and analytics teams to enable self-service reporting in Domo (or other BI tools such as Metabase, Tableau, Power BI). 
  • Manage and enhance data integration workflows using Airbyte and Matillion. 
  • Drive architectural improvements around data governance, observability, automation, and scaling. 
  • Leverage AI-powered coding and workflow tools (e.g., GitHub Copilot, Cursor, CodeWhisperer, OpenAI Assistants) to accelerate delivery and improve code quality. 
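
For illustration only: a minimal sketch of the kind of extract-and-load step named in the first bullet above. The library choices (psycopg2 and snowflake-connector-python) and every connection parameter, warehouse, and table name here are hypothetical assumptions, not a statement of Truckstop's actual stack.

    # Illustrative sketch: copy recently updated rows from a Postgres source
    # into a Snowflake staging table. All names are placeholders.
    import os

    import psycopg2
    import snowflake.connector


    def load_recent_loads() -> int:
        """Copy rows updated in the last day from Postgres into Snowflake staging."""
        src = psycopg2.connect(os.environ["POSTGRES_DSN"])
        dst = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="LOAD_WH",   # hypothetical warehouse
            database="RAW",        # hypothetical database
            schema="FREIGHT",      # hypothetical schema
        )
        try:
            # Extract: pull the last day's changes from the source system.
            with src.cursor() as cur:
                cur.execute(
                    "SELECT load_id, origin, destination, updated_at "
                    "FROM loads WHERE updated_at > now() - interval '1 day'"
                )
                rows = cur.fetchall()
            # Load: append into a staging table; downstream dbt models transform it.
            if rows:
                dst.cursor().executemany(
                    "INSERT INTO stg_loads (load_id, origin, destination, updated_at) "
                    "VALUES (%s, %s, %s, %s)",
                    rows,
                )
            return len(rows)
        finally:
            src.close()
            dst.close()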

What We’re Looking For 

  • Strong experience with Snowflake, SQL, and dbt in a production environment. 
  • Solid understanding of Terraform and infrastructure-as-code practices. 
  • Proficiency in Python for data processing, scripting, and automation (see the sketch after this list). 
  • Experience implementing and maintaining ELT pipelines and data integrations. 
  • Familiarity with Postgres or other relational databases. 
  • Hands-on experience with BI or analytics tools. 
  • Experience with Airbyte, Matillion, or similar ETL/ELT platforms is highly valued. 
  • Demonstrated use of AI tools (e.g., Copilot, Cursor, Codex, Claude Code) in day-to-day engineering work. 
  • Excellent communication skills and the ability to work cross-functionally. 
  • Bonus: Background in supply chain, freight, or logistics. 
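
Also for illustration only: a small Python data-quality check of the kind the Python bullet above refers to (row-count reconciliation after a load). The cursors are assumed to be standard DB-API cursors (for example from psycopg2 or the Snowflake connector), and the table names are placeholders.

    # Illustrative sketch: fail a pipeline run if the loaded table's row count
    # drifts from the source. Table names are hypothetical placeholders.
    def assert_row_counts_match(source_cur, target_cur,
                                source_table: str, target_table: str) -> None:
        """Raise if the target table's row count does not match the source's."""
        source_cur.execute(f"SELECT COUNT(*) FROM {source_table}")
        target_cur.execute(f"SELECT COUNT(*) FROM {target_table}")
        source_count = source_cur.fetchone()[0]
        target_count = target_cur.fetchone()[0]
        if source_count != target_count:
            raise ValueError(
                f"Row count mismatch: {source_table}={source_count}, "
                f"{target_table}={target_count}"
            )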

At Truckstop we are dedicated to creating a workplace that is equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for a yearly bonus. Final salary is based on a number of factors including market location, job-related knowledge, education/training, certifications, key skills, experience, internal peer equity as well as business considerations.

The anticipated base pay range for this position is:
$100,000-$105,000 USD

The above description covers the most significant duties performed but does not include other related occasional work that may be assigned or is completed by the employee. 

Truckstop provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. 
Truckstop participates in the E-Verify program. Learn more about the E-Verify program here: https://www.e-verify.gov/

Truckstop Privacy Policy

Top Skills

Airbyte
CodeWhisperer
Cursor
dbt
Domo
GitHub Copilot
Matillion
Metabase
OpenAI Assistants
Postgres
Power BI
Python
Snowflake
SQL
Tableau
Terraform

What you need to know about the Austin Tech Scene

Austin has a diverse and thriving tech ecosystem thanks to home-grown companies like Dell and major campuses for IBM, AMD and Apple. The state’s flagship university, the University of Texas at Austin, is known for its engineering school, and the city is known for its annual South by Southwest tech and media conference. Austin’s tech scene spans many verticals, but it’s particularly known for hardware, including semiconductors, as well as AI, biotechnology and cloud computing. And its food and music scene, low taxes and favorable climate have made the city a destination for tech workers from across the country.

Key Facts About Austin Tech

  • Number of Tech Workers: 180,500; 13.7% of overall workforce (2024 CompTIA survey)
  • Major Tech Employers: Dell, IBM, AMD, Apple, Alphabet
  • Key Industries: Artificial intelligence, hardware, cloud computing, software, healthtech
  • Funding Landscape: $4.5 billion in VC funding in 2024 (PitchBook)
  • Notable Investors: Live Oak Ventures, Austin Ventures, Hinge Capital, Gigafund, KdT Ventures, Next Coast Ventures, Silverton Partners
  • Research Centers and Universities: University of Texas, Southwestern University, Texas State University, Center for Complex Quantum Systems, Oden Institute for Computational Engineering and Sciences, Texas Advanced Computing Center
