Build and maintain Databricks-based ETL/ELT pipelines, transform and model data, optimize workflows, monitor and troubleshoot pipelines, and collaborate with cross-functional teams.
Position Overview
V4C.ai is seeking a motivated Data Engineer to join our remote team in the United States. In this role, you will support the design, development, and maintenance of data solutions using Databricks, helping clients and internal teams process, transform, and analyze data effectively. You'll work on building reliable data pipelines and workflows in a collaborative environment, gaining hands-on experience with modern data engineering tools and cloud technologies.
Key Responsibilities
- Collaborate with team members and stakeholders to understand data requirements and contribute to building scalable data pipelines and workflows in Databricks.
- Develop and implement ETL/ELT processes using Databricks, Python, SQL, and related tools to ingest, transform, and prepare data.
- Assist in optimizing data workflows for better performance, reliability, and cost-efficiency within Databricks environments.
- Support the creation and maintenance of data models, tables, and integrations in cloud platforms (Azure, AWS, or similar).
- Work closely with cross-functional teams (data analysts, scientists, and engineers) to deliver clean, accessible data for analytics and reporting.
- Monitor data pipelines, troubleshoot basic issues, and contribute to documentation and best practices.
- Stay curious about new Databricks features and data engineering trends to support ongoing improvements.
Qualifications
- Bachelor's degree in Computer Science, Data Science, Engineering, Information Systems, or a related field (or equivalent practical experience).
- 1-2 years of professional experience in data engineering, data processing, analytics engineering, or a closely related role (internships, co-ops, or academic projects with relevant tools count toward this).
- Hands-on experience building basic data pipelines or transformations, and comfort doing so independently.
- Proficiency in Python and SQL; experience with Scala is a plus but not required.
- Basic understanding of cloud platforms such as Azure, AWS, or GCP (e.g., working with storage, compute, or data services).
- Solid analytical and problem-solving skills with attention to detail and a focus on writing clean, maintainable code.
- Strong communication skills and ability to work collaboratively in a remote team environment.
- Eagerness to learn, take ownership of tasks, and grow within data engineering.
Top Skills
AWS
Azure
Databricks
ELT
ETL
GCP
Python
Scala
SQL
Similar Jobs
Big Data • Healthtech • Software
Design, build, and maintain scalable ETL/ELT pipelines using Python, Spark, Databricks, Airflow and SSIS. Integrate and cleanse diverse healthcare datasets, implement Unity Catalog for metadata and governance, optimize Spark performance and JVM tuning, support Medallion architecture, and collaborate with cross-functional teams to automate CI/CD, observability, and data quality processes.
Top Skills:
Apache Airflow, Spark, AWS, CSV, Databricks, Delta, GitLab CI, Jenkins, JVM, Medallion Architecture, NoSQL, Parquet, Python, Scala, SQL, SSIS, Unity Catalog, XML
Professional Services • Software
Lead the buildout of a new enterprise data platform, designing infrastructure, pipelines, and storage while ensuring data governance and quality.
Top Skills:
AWS, Azure, Databricks, GCP, Java, Snowflake, SQL
Digital Media • Gaming • Information Technology • Software • Sports • Esports • Big Data Analytics
As a Senior Data Engineer, you'll design and implement scalable data systems, optimize performance, and lead projects while collaborating with teams to enhance data solutions.
Top Skills:
BigQuery, Git, Redshift, Snowflake, SQL
What you need to know about the Austin Tech Scene
Austin has a diverse and thriving tech ecosystem thanks to home-grown companies like Dell and major campuses for IBM, AMD and Apple. The state's flagship university, the University of Texas at Austin, is known for its engineering school, and the city is known for its annual South by Southwest tech and media conference. Austin's tech scene spans many verticals, but it's particularly known for hardware, including semiconductors, as well as AI, biotechnology and cloud computing. And its food and music scene, low taxes and favorable climate have made the city a destination for tech workers from across the country.
Key Facts About Austin Tech
- Number of Tech Workers: 180,500; 13.7% of overall workforce (2024 CompTIA survey)
- Major Tech Employers: Dell, IBM, AMD, Apple, Alphabet
- Key Industries: Artificial intelligence, hardware, cloud computing, software, healthtech
- Funding Landscape: $4.5 billion in VC funding in 2024 (Pitchbook)
- Notable Investors: Live Oak Ventures, Austin Ventures, Hinge Capital, Gigafund, KdT Ventures, Next Coast Ventures, Silverton Partners
- Research Centers and Universities: University of Texas, Southwestern University, Texas State University, Center for Complex Quantum Systems, Oden Institute for Computational Engineering and Sciences, Texas Advanced Computing Center
