
Advance Local

Senior Data Engineer

Posted 2 Days Ago
Remote
Hiring Remotely in USA
120K-140K Annually
Senior level
Job Summary & Responsibilities

Advance Local is looking for a Senior Data Engineer to design, build, and maintain the enterprise data infrastructure that powers our cloud data platform. This position combines deep technical expertise in data engineering with team leadership responsibilities, overseeing the ingestion, integration, and reliability of data systems across Snowflake, AWS, Google Cloud, and legacy platforms. You'll partner with the data product team and business units to translate requirements into technical solutions, integrate data from numerous third-party platforms (CDPs, DMPs, analytics platforms, marketing tech) into the central data platform, collaborate closely with the Data Architect on platform strategy, and ensure scalable, well-engineered solutions for modern data infrastructure using infrastructure as code and API-driven integrations.
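
As a rough illustration of the kind of API-driven integration described above, the Python sketch below pulls records from a hypothetical third-party platform API and lands them in a Snowflake staging table. The endpoint, table, and environment variable names are placeholders rather than Advance Local's actual systems, and a production pipeline would add pagination, retries, and incremental loading.

```python
import json
import os

import requests
import snowflake.connector


def fetch_events(api_base: str, api_key: str) -> list[dict]:
    """Pull one page of events from a hypothetical third-party platform API."""
    resp = requests.get(
        f"{api_base}/v1/events",
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["events"]


def load_to_snowflake(rows: list[dict]) -> None:
    """Land raw JSON payloads in a Snowflake staging table with a VARIANT column."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="INGEST_WH",
        database="RAW",
        schema="MARKETING",
    )
    try:
        with conn.cursor() as cur:
            # PARSE_JSON must appear in a SELECT, not a VALUES clause, when loading VARIANTs.
            cur.executemany(
                "INSERT INTO raw_events (payload) SELECT PARSE_JSON(%s)",
                [(json.dumps(row),) for row in rows],
            )
    finally:
        conn.close()


if __name__ == "__main__":
    events = fetch_events(os.environ["PLATFORM_API_BASE"], os.environ["PLATFORM_API_KEY"])
    load_to_snowflake(events)
```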

 

The base salary range is $120,000 - $140,000 per year.

 

What you’ll be doing:

  • Lead the design and implementation of scalable data ingestion pipelines from diverse sources into Snowflake.
  • Partner with platform owners across business units to establish and maintain data integrations from third-party systems into the central data platform.
  • Architect and maintain data infrastructure using infrastructure as code (IaC), ensuring reproducibility, version control, and disaster recovery capabilities.
  • Design and implement API integrations and event-driven data flows to support real-time and batch data requirements.
  • Evaluate technical capabilities and integration patterns of existing and potential third-party platforms, advising on platform consolidation and optimization opportunities.
  • Partner with the Data Architect and the data product team to define the overall data platform strategy, ensuring alignment between raw data ingestion and analytics-ready data products that serve business unit needs.
  • Develop and enforce data engineering best practices including testing frameworks, deployment automation, and observability.
  • Support rapid prototyping of new data products in collaboration with the data product team by building flexible, reusable data infrastructure components.
  • Design, develop, and maintain scalable data pipelines and ETL processes; optimize and improve existing data systems for performance, cost efficiency, and scalability.
  • Collaborate with the data product team, third-party platform owners, Data Architects, Analytics Engineers, Data Scientists, and business stakeholders to understand data requirements and deliver technical solutions that enable business outcomes across the organization.
  • Implement data quality validation, monitoring, and alerting systems to ensure the reliability of data pipelines from all sources (a minimal orchestration sketch follows this list).
  • Develop and maintain comprehensive documentation covering data engineering processes, system architecture, integration patterns, and runbooks.
  • Lead incident response and troubleshooting efforts for data pipeline issues, ensuring minimal business impact.
  • Stay current with emerging data engineering technologies, cloud services, SaaS platform capabilities, and industry best practices.
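
As one hedged example of how the orchestration, data quality, and alerting responsibilities above might hang together, here is a minimal Airflow DAG using the TaskFlow syntax (Airflow 2.4+). The DAG name, schedule, and alert address are invented for illustration, and the email alerting assumes SMTP is configured in the Airflow deployment.

```python
from datetime import datetime, timedelta

from airflow.decorators import dag, task

default_args = {
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
    "email_on_failure": True,              # assumes SMTP/alerting is configured
    "email": ["data-eng-alerts@example.com"],
}


@dag(
    schedule="@hourly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args=default_args,
)
def third_party_ingest():
    @task
    def extract_and_load() -> int:
        """Placeholder for the API extract-and-load step; returns rows loaded."""
        rows_loaded = 0  # swap in the real ingestion logic here
        return rows_loaded

    @task
    def validate(rows_loaded: int) -> None:
        """Minimal data quality gate: fail the run (and alert) if nothing landed."""
        if rows_loaded == 0:
            raise ValueError("No rows ingested; upstream platform may be down")

    validate(extract_and_load())


third_party_ingest()
```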

 

Our ideal candidate will have the following:

  • Bachelor's degree in computer science, engineering, or a related field
  • Minimum of seven years of experience in data engineering with at least two years in a lead or senior technical role
  • Expert proficiency in Snowflake data engineering patterns
  • Strong experience with AWS services (S3, Lambda, Glue, Step Functions) and Google Cloud Platform
  • Experience integrating data from SaaS platforms and marketing technology stacks (CDPs, DMPs, analytics platforms, CRMs)
  • Proven ability to work with third-party APIs, webhooks, and data exports
  • Experience with infrastructure as code (Terraform, CloudFormation) and CI/CD pipelines for data infrastructure
  • Proven ability to design and implement API integrations and event-driven architectures (a brief event-driven example follows this list)
  • Experience with data modeling, data warehousing, and ETL processes at scale
  • Advanced proficiency in Python and SQL for data pipeline development
  • Experience with data orchestration tools (Airflow, dbt, Snowflake tasks)
  • Strong understanding of data security, access controls, and compliance requirements
  • Ability to navigate vendor relationships and evaluate technical capabilities of third-party platforms
  • Excellent problem-solving skills and attention to detail
  • Strong communication and collaboration skills
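
To make the event-driven and AWS requirements above a bit more concrete, here is a small Python sketch of an AWS Lambda handler that reacts to an S3 ObjectCreated notification and copies the new file into Snowflake from an external stage. The stage, table, and credential names are assumptions made for illustration; a real deployment would typically pull credentials from a secrets manager rather than plain environment variables.

```python
import os

import snowflake.connector

# Assumption: an external stage named S3_STAGE in RAW.MARKETING already points at the
# bucket whose ObjectCreated notifications invoke this Lambda.


def handler(event, context):
    """AWS Lambda entry point for S3 ObjectCreated notifications."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="INGEST_WH",
        database="RAW",
        schema="MARKETING",
    )
    try:
        with conn.cursor() as cur:
            for record in event.get("Records", []):
                key = record["s3"]["object"]["key"]
                # Copy only the newly arrived object from the pre-configured stage.
                cur.execute(
                    f"COPY INTO raw_events FROM @S3_STAGE/{key} "
                    "FILE_FORMAT = (TYPE = 'JSON')"
                )
    finally:
        conn.close()
```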

Top Skills

Airflow
AWS
CloudFormation
dbt
GCP
Python
Snowflake
SQL
Terraform


What you need to know about the Austin Tech Scene

Austin has a diverse and thriving tech ecosystem thanks to home-grown companies like Dell and major campuses for IBM, AMD and Apple. The state’s flagship university, the University of Texas at Austin, is known for its engineering school, and the city hosts the annual South by Southwest tech and media conference. Austin’s tech scene spans many verticals, but it’s particularly known for hardware, including semiconductors, as well as AI, biotechnology and cloud computing. And its food and music scene, low taxes and favorable climate have made the city a destination for tech workers from across the country.

Key Facts About Austin Tech

  • Number of Tech Workers: 180,500; 13.7% of overall workforce (2024 CompTIA survey)
  • Major Tech Employers: Dell, IBM, AMD, Apple, Alphabet
  • Key Industries: Artificial intelligence, hardware, cloud computing, software, healthtech
  • Funding Landscape: $4.5 billion in VC funding in 2024 (PitchBook)
  • Notable Investors: Live Oak Ventures, Austin Ventures, Hinge Capital, Gigafund, KdT Ventures, Next Coast Ventures, Silverton Partners
  • Research Centers and Universities: University of Texas, Southwestern University, Texas State University, Center for Complex Quantum Systems, Oden Institute for Computational Engineering and Sciences, Texas Advanced Computing Center
