
ComboCurve

Senior Data Engineer, Python

Posted 51 Minutes Ago
Remote
Hiring Remotely in United States
Senior level
The Senior Data Engineer will design and maintain data pipelines, develop data models, and collaborate with teams on optimization problems, using Python and cloud technologies.

ComboCurve is an industry-leading, cloud-based software solution for A&D, reservoir management, and forecasting in the energy sector. Our platform empowers professionals to evaluate assets, optimize workflows, and manage reserves efficiently, all in one integrated environment.
By streamlining data integration and enhancing collaboration, we help operators, engineers, and financial teams make informed decisions faster. Trusted by top energy companies, ComboCurve delivers real-time analytics and exceptional user support, with a world-class customer experience team that responds to inquiries in under 5 minutes.

We are seeking a highly analytical and experienced Senior Data Engineer to help optimize production forecasting and operations scheduling within the petroleum engineering domain. You’ll bridge the gap between complex mathematical models (reservoir dynamics, optimization, logistics) and robust, cloud-scale data systems.

This role requires a unique combination of deep Python expertise, mastery of modern data processing and API frameworks, and a strong foundational understanding of mathematics, reasoning, and petroleum engineering principles.


Responsibilities

Data Architecture & Engineering

  • Design, build, and maintain scalable data pipelines for ingesting, transforming, and validating time-series data related to well performance, sensor readings, and operational logs.
  • Develop robust, high-performance data models using PyArrow and Pandas for efficient analysis and transfer.
  • Implement data quality and schema validation using Pydantic to ensure data integrity across all stages of the pipeline (a minimal sketch follows this list).
  • Manage and optimize data storage and retrieval in MongoDB, and integrate with cloud-native platforms like GCP BigQuery or Snowflake where applicable.
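
For illustration, here is a minimal sketch of the kind of ingest-time validation step described above, assuming Pydantic v2 and pandas 2.x; the record fields (well_id, recorded_at, oil_rate_bbl_per_day, choke_pct) are hypothetical placeholders, not ComboCurve's actual schema.

```python
# Hedged sketch only: field names and thresholds are illustrative assumptions.
from datetime import datetime
from typing import Optional

import pandas as pd
import pyarrow as pa
from pydantic import BaseModel, Field, ValidationError


class WellReading(BaseModel):
    """One time-series record from a well sensor or operational log."""
    well_id: str
    recorded_at: datetime
    oil_rate_bbl_per_day: float = Field(ge=0)   # negative rates rejected at ingest
    choke_pct: Optional[float] = Field(default=None, ge=0, le=100)


def validate_batch(raw_rows: list[dict]) -> pd.DataFrame:
    """Validate raw rows with Pydantic, keep the good ones, and return an
    Arrow-backed DataFrame for cheap downstream analysis and transfer."""
    good, bad = [], []
    for row in raw_rows:
        try:
            good.append(WellReading(**row).model_dump())
        except ValidationError as exc:
            bad.append((row, exc))  # route to a quarantine / dead-letter step
    df = pd.DataFrame(good)
    # Round-tripping through Arrow yields a columnar, Arrow-typed frame.
    return pa.Table.from_pandas(df).to_pandas(types_mapper=pd.ArrowDtype)


if __name__ == "__main__":
    rows = [
        {"well_id": "W-001", "recorded_at": "2024-05-01T00:00:00", "oil_rate_bbl_per_day": 412.5},
        {"well_id": "W-001", "recorded_at": "2024-05-02T00:00:00", "oil_rate_bbl_per_day": -3.0},
    ]
    print(validate_batch(rows))
```

Invalid records are quarantined for review rather than silently dropped, and the Arrow-backed frame keeps hand-off to downstream forecasting and analysis inexpensive.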

API & Application Development

  • Build, deploy, and maintain high-performance asynchronous microservices and prototypes using FastAPI or Flask to serve complex optimization and scheduling model predictions (see the sketch after this list).
  • Use Postman for testing, documenting, and automating API workflows.
  • Containerize and orchestrate applications using Docker and manage deployment on Google Cloud Platform (GCP).
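
For illustration, a minimal FastAPI sketch of an asynchronous prediction endpoint of this kind; the route, request fields, and stand-in model are assumptions made for the example, not an actual ComboCurve service.

```python
# Hedged sketch only: endpoint name, fields, and the stand-in model are invented.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel, Field

app = FastAPI(title="forecast-service")


class ForecastRequest(BaseModel):
    well_id: str
    horizon_days: int = Field(gt=0, le=3650)


class ForecastResponse(BaseModel):
    well_id: str
    daily_rate_bbl: list[float]


async def run_model(well_id: str, horizon_days: int) -> list[float]:
    # Stand-in for a real decline-curve or scheduling model call
    # (e.g., loaded from object storage or invoked over the network).
    return [100.0 * 0.999 ** day for day in range(horizon_days)]


@app.post("/forecast", response_model=ForecastResponse)
async def forecast(req: ForecastRequest) -> ForecastResponse:
    rates = await run_model(req.well_id, req.horizon_days)
    if not rates:
        raise HTTPException(status_code=404, detail="no forecast available")
    return ForecastResponse(well_id=req.well_id, daily_rate_bbl=rates)

# Run locally with:  uvicorn forecast_service:app --reload
# then exercise and document the endpoint from Postman.
```

A service like this would typically be containerized with Docker, deployed on GCP (for example, Cloud Run or GKE), and tested and documented through Postman collections.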

Quantitative Analysis & Optimization

  • Collaborate with reservoir and operations teams to translate complex scheduling and logistics problems into mathematical models (e.g., linear programming, resource allocation); see the example after this list.
  • Implement efficient numerical routines and simulations in NumPy for use in production environments.
  • Apply strong logical and analytical reasoning to debug, validate, and interpret the outputs of operational scheduling algorithms.
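
For illustration, a toy resource-allocation linear program written with PuLP (one of the libraries listed under Preferred Qualifications); the wells, uplift figures, and crew-day limits are invented for the example.

```python
# Hedged sketch only: a toy workover-crew allocation LP with made-up numbers.
import pulp

wells = ["W-001", "W-002", "W-003"]
uplift_bbl_per_crew_day = {"W-001": 35.0, "W-002": 50.0, "W-003": 20.0}
max_crew_days_per_well = {"W-001": 10, "W-002": 6, "W-003": 12}
total_crew_days = 18  # shared resource: one crew, limited days

prob = pulp.LpProblem("workover_allocation", pulp.LpMaximize)
crew_days = {
    w: pulp.LpVariable(f"crew_days_{w}", lowBound=0, upBound=max_crew_days_per_well[w])
    for w in wells
}

# Objective: maximize total incremental production.
prob += pulp.lpSum(uplift_bbl_per_crew_day[w] * crew_days[w] for w in wells)

# Constraint: total crew-days allocated cannot exceed availability.
prob += pulp.lpSum(crew_days[w] for w in wells) <= total_crew_days

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for w in wells:
    print(w, crew_days[w].value())
print("objective:", pulp.value(prob.objective))
```

Real scheduling problems add integer variables, precedence constraints, and equipment availability windows, but the modeling pattern (decision variables, an objective, shared-resource constraints) is the same.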

Requirements

  • Education: Bachelor’s or Master’s degree in Petroleum Engineering, Computer Science, Mathematics, Operations Research, or a related quantitative field, or equivalent experience.
  • Quantitative Strength: Proven ability to work with mathematical modeling, optimization, and time-series analysis, including:
      • Linear and Mixed-Integer Programming
      • Probability and Statistics
      • Algorithmic Complexity and Performance Reasoning

  • Collaborative mindset — experience working closely with data scientists, product owners, and domain experts to deliver production-ready systems.

Preferred Qualifications

  • Domain Expertise: Solid understanding of well operations, drilling logistics, production data, and scheduling workflows.
  • Experience working with large-scale or streaming datasets.
  • Experience with mathematical modeling and optimization libraries (SciPy, PuLP, OR-Tools).
  • Experience setting up CI/CD pipelines and container deployments on GCP.

Top Skills

CI/CD
Docker
FastAPI
Flask
GCP BigQuery
MongoDB
NumPy
OR-Tools
Pandas
Postman
PuLP
PyArrow
Pydantic
Python
SciPy
Snowflake


What you need to know about the Austin Tech Scene

Austin has a diverse and thriving tech ecosystem thanks to home-grown companies like Dell and major campuses for IBM, AMD, and Apple. The state’s flagship university, the University of Texas at Austin, is known for its engineering school, and the city is known for its annual South by Southwest tech and media conference. Austin’s tech scene spans many verticals, but it’s particularly known for hardware, including semiconductors, as well as AI, biotechnology, and cloud computing. And its food and music scene, low taxes, and favorable climate have made the city a destination for tech workers from across the country.

Key Facts About Austin Tech

  • Number of Tech Workers: 180,500; 13.7% of overall workforce (2024 CompTIA survey)
  • Major Tech Employers: Dell, IBM, AMD, Apple, Alphabet
  • Key Industries: Artificial intelligence, hardware, cloud computing, software, healthtech
  • Funding Landscape: $4.5 billion in VC funding in 2024 (Pitchbook)
  • Notable Investors: Live Oak Ventures, Austin Ventures, Hinge Capital, Gigafund, KdT Ventures, Next Coast Ventures, Silverton Partners
  • Research Centers and Universities: University of Texas, Southwestern University, Texas State University, Center for Complex Quantum Systems, Oden Institute for Computational Engineering and Sciences, Texas Advanced Computing Center
