Data Integration Engineer

Hybrid

SourceDay is seeking a Data Integration Engineer to be a part of our Data Wielding Ninja Team. SourceDay is the leading cloud-based software solution that automates the purchase order management process. By seamlessly integrating with ERP systems, SourceDay extends purchasing capabilities by centralizing and managing the PO lifecycle for buyers and suppliers, eliminating manual processes while improving supplier performance. SourceDay specializes in bringing our customers' data to life through our proprietary transformations. These transformations are built, by folks like yourself, across an array of complex ecosystems. Through SourceDay's optimized subscribe-and-publish model, you will have the opportunity to keep enhancing our data pipeline capabilities and to onboard new customers at a quick pace. SourceDay needs someone with experience developing data integrations of third-party solutions for various ERPs, such as Epicor, Syteline, Microsoft NAV & GP, Visual, Intuitive, Acumatica, and NetSuite. Having worked with one or more of these ERPs is a significant bonus.

Do you like to:

  • Build APIs and Interfaces for others to consume data?
  • Bootstrap data pipelines to be self-sufficient and easily maintainable?
  • Take on big data challenges by applying your mastery of capturing, cleaning, storing, and securing the data you collect?
  • Take pride in your stewardship of the data you collect and the security, visualization, and queryability you are able to provide?

If you answered yes to all of these, then you will enjoy being a part of SourceDay's engineering team.

Responsibilities

  • Studies data sources by interviewing customers; defining, analyzing, and validating data objects; identifying the relationships among data objects.
  • Plans data integration process by developing common definitions of sourced data; designing common keys in physical data structure; establishing data integration specifications; examining data applications; examining data models and data warehouse schema; determining best-fit data interchange methods; assessing middleware tools for data integration, transformation, and routing; developing project scope and specifications; identifying factors that negatively impact integration; forecasting resource requirements; establishing delivery timetables.
  • Delivers data integration by implementing shared databases; integrating data shared across legacy, new development, and purchased package environments; developing system modification specifications; mapping data; establishing interfaces; developing and modifying functions, programs, routines, and stored procedures to export, transform, and load data; meeting performance parameters; resolving and escalating integration issues; coordinating actions among users, operations staff, and outside vendors; recommending adjustments as objectives change; documenting operational procedures and data connections (a minimal export-transform-load sketch follows this list).
  • Validates data integration by developing and executing test plans and scenarios including data design, tool design, and data extract/transform.
  • Maintains data warehouse performance by identifying and resolving data conflicts; upgrading data definitions.
  • Improves data integration by designing and evaluating new data interchange formats; improving physical design; rewriting data policy, standards, and procedures.
  • Maintains team accomplishments by communicating essential information; coordinating actions; obtaining expert input; reviewing open issues and action items; contributing information to team meetings and reports; transferring knowledge of data integration process, techniques, and issues to application and support teams.
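
As a purely illustrative reference for the export-transform-load responsibility above, the short Python sketch below shows the general pattern: extract purchase-order rows from a hypothetical CSV export, normalize a few fields, and load them into a database table. The file name, column names, and schema are assumptions made for the example, and sqlite3 stands in for whatever target store a given integration actually uses.

    import csv
    import sqlite3
    from datetime import datetime

    def extract(path):
        # Read raw purchase-order rows from a CSV export (hypothetical layout).
        with open(path, newline="") as f:
            yield from csv.DictReader(f)

    def transform(rows):
        # Normalize keys, trim whitespace, and parse dates into ISO format.
        for row in rows:
            yield {
                "po_number": row["PO Number"].strip(),
                "supplier": row["Supplier"].strip(),
                "due_date": datetime.strptime(row["Due Date"], "%m/%d/%Y").date().isoformat(),
                "quantity": int(row["Qty"]),
            }

    def load(records, conn):
        # Create the target table if needed and upsert the cleaned records.
        conn.execute(
            "CREATE TABLE IF NOT EXISTS purchase_orders ("
            "po_number TEXT PRIMARY KEY, supplier TEXT, due_date TEXT, quantity INTEGER)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO purchase_orders "
            "VALUES (:po_number, :supplier, :due_date, :quantity)",
            records,
        )
        conn.commit()

    if __name__ == "__main__":
        with sqlite3.connect("purchase_orders.db") as conn:
            load(transform(extract("po_export.csv")), conn)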

Requirements

  • Bachelor's degree in Computer Science or related field, or equivalent experience
  • 4-6 years of experience
  • Strong analytical and problem-solving skills
  • Strong SQL and database knowledge
  • Good understanding of data governance and other data quality practices
  • Experience leveraging and integrating data pipelines across the AWS services ecosystem
  • General understanding of networking
  • Installation and configuration experience with various mid-market to enterprise ERP systems such as Epicor, Syteline, Microsoft NAV & GP, Visual, Intuitive, Acumatica, and NetSuite
  • Hands-on experience building data processing pipelines (e.g., Storm, Beam, Spark, Flink, Lambda); see the sketch after this list
  • Strong experience with object-oriented and/or functional languages (e.g., C#, Java, Scala, Go, Python)
  • Proficiency with metaprogramming languages (e.g., Ruby)
  • Deep understanding of relational and NoSQL data stores (e.g., Snowflake, Redshift, BigTable, Spark) and related approaches
  • Strong experience developing services with IIS and SQL Server
  • Strong API development experience building scalable services
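
For the pipeline item above, here is a minimal, purely illustrative Apache Beam sketch in Python (Beam being one of the frameworks named in that bullet). It reads a hypothetical comma-separated file of PO lines, keeps only open orders, and writes a summary back out. The input path, column layout, and "open" status value are assumptions, and the apache-beam package must be installed for it to run.

    import apache_beam as beam

    def parse_line(line):
        # Hypothetical layout: po_number,supplier,status,quantity
        po_number, supplier, status, quantity = line.split(",")
        return {
            "po_number": po_number,
            "supplier": supplier,
            "status": status,
            "quantity": int(quantity),
        }

    with beam.Pipeline() as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("po_lines.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(parse_line)
            | "OpenOnly" >> beam.Filter(lambda po: po["status"] == "open")
            | "Format" >> beam.Map(lambda po: f'{po["po_number"]},{po["quantity"]}')
            | "Write" >> beam.io.WriteToText("open_po_lines")
        )

With no runner specified, Beam's local DirectRunner executes the pipeline, which is enough to prototype a transformation before moving it to a managed runner.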

 

Location

Our office is in the Arboretum, just a few minutes from MOPAC, 183, and 360. We're surrounded by dozens of restaurants, shops, & bars.
