Position Description:
Develops and deploys pipelines using DevOps and Continuous Integration/Continuous Delivery (CI/CD) best practices within a Cloud-native infrastructure. Provides data analysis on complex systems analysis projects, often spanning subsystems and companies, within a matrix organization. Develops Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) pipelines to move data to and from the Snowflake data store, using Python and Snowflake SnowSQL. Establishes CD pipelines and associated deployment tooling and practices, using GitHub, Jenkins, Stash, and Artifactory. Supports the creation, maintenance, and compliance of Agile/SCRUM development standards and guidelines. Performs data manipulation using Amazon Web Services (AWS). Performs data mining and data analysis using Oracle, SQL Server, and NoSQL databases.
Primary Responsibilities:
- Designs, implements, and maintains data structures, batch jobs, and interfaces to external systems.
- Develops original and creative technical solutions to ongoing development efforts.
- Develops applications for multiple projects supporting several divisional initiatives.
- Supports and performs all phases of testing leading to implementation.
- Assists in the planning and conducting of user acceptance testing.
- Develops comprehensive documentation for multiple applications supporting several corporate initiatives.
- Performs post-installation validation and triages any issues.
- Establishes project plans for projects of moderate scope.
- Performs independent and complex technical and functional analysis for multiple projects supporting several initiatives.
- Manages data services hosted on the operational data stores and file-based interfaces.
- Confers with systems analysts and other software engineers/developers to design systems.
- Gathers information on project limitations and capabilities, performance requirements, and interfaces.
- Develops and oversees software system testing and validation procedures, programming, and documentation.
Education and Experience:
Bachelor’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and five (5) years of experience as a Principal Data Engineer (or closely related occupation) analyzing, designing, and building ETL processes, and using data integration solutions to ensure reliable and scalable data management across operational or analytical capability platforms, using Snowflake, Kafka, and AWS.
Alternatively, Master’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and three (3) years of experience as a Principal Data Engineer (or closely related occupation) analyzing, designing, and building ETL processes, and using data integration solutions to ensure reliable and scalable data management across operational or analytical capability platforms, using Snowflake, Kafka, and AWS.
Skills and Knowledge:
Candidate must also possess:
- Demonstrated Expertise (“DE”) architecting, designing, and developing microservices-based Application Programming Interfaces (APIs) and test automation frameworks, using Python, Java, Swagger, Amazon Elastic Kubernetes Service (EKS), and serverless technologies.
- DE developing CI/CD pipelines in a hybrid on-prem and Cloud environment (AWS) to deliver changes in production and non-production environments, using DevOps tools (GitHub, Jenkins, Maven, and Terraform).
- DE analyzing, designing, developing, and testing ETL batch processing applications for data warehouse and Online Transaction Processing (OLTP) based systems, using AWS, Snowflake, Oracle PL/SQL, and PostgreSQL PL/pgSQL.
- DE performing Logical and Physical Data Modeling for relational databases (SQL Server, Postgres, Oracle, and Snowflake); and optimizing database and query performance by implementing appropriate data types, indexing strategies, and partitioning techniques based on data access patterns.
Category: Information Technology

Most roles at Fidelity are Hybrid, requiring associates to work onsite every other week (all business days, M-F) in a Fidelity office. This does not apply to Remote or fully Onsite roles. Some roles may have unique onsite requirements. Please consult with your recruiter for the specific expectations for this position.
Please be advised that Fidelity’s business is governed by the provisions of the Securities Exchange Act of 1934, the Investment Advisers Act of 1940, the Investment Company Act of 1940, ERISA, numerous state laws governing securities, investment and retirement-related financial activities and the rules and regulations of numerous self-regulatory organizations, including FINRA, among others. Those laws and regulations may restrict Fidelity from hiring and/or associating with individuals with certain Criminal Histories.
