Data Engineer / Integrations Specialist - Contract

Job description

About us:

Working at Tech Holding isn't just a job; it's an opportunity to be part of something bigger. We are a full-service consulting firm founded on the premise of delivering predictable outcomes and high-quality solutions to our clients. Our founders and team members bring industry experience from senior positions at a wide variety of companies – from emerging startups to large Fortune 50 firms – and we have combined those experiences into a unique approach built on the principles of deep expertise, integrity, transparency, and dependability.

The Role:

We are seeking a Data Engineer to own the data layer of a growing AI-powered platform focused on automating document processing and ERP integrations. This is a hands-on engineering role centered on building reliable, scalable data pipelines and integration systems that move data between AI workflows and customer ERP environments. You will design, build, and maintain ETL processes, data pipelines, and API integrations that keep data accurate, consistent, and available across multiple customer environments. The role requires strong ownership, from design through deployment and monitoring, and the ability to work independently in a fast-moving, async-first startup environment.

You will play a critical role in enabling real-time validation and submission of customer data by connecting document intelligence outputs to ERP systems through robust, production-grade integrations.

What You’ll Build:

Short-term (first 60 days):

  • Data sync pipeline for a live customer: crawl ERP products, customers, and pricing data into our
    validation layer
  • ERP connector with REST API integration — auth, retries, timeouts, error handling
  • Durable workflow: validate extracted order data against ERP reference data, submit on approval
  • Data quality checks and monitoring for the sync pipeline

Medium-term:

  • Generalized ERP adapter pattern (we’re building this across multiple ERPs, data models vary
    significantly)
  • Improved validation: confidence scoring, auto-submission rules, exception handling
  • Schema extensions as new customer requirements and ERP platforms surface
  • Data reconciliation tooling across multiple customer environments

Requirements:

  • 5+ years working with Python in a data engineering or backend integration context
  • Hands-on experience building data pipelines and ETL processes (extracting, transforming, and
    loading data between systems)
  • Proven experience integrating third-party REST APIs, including auth, rate limits, retries, and error handling
  • Strong understanding of data quality: validation, deduplication, schema management, error recovery
  • Comfortable owning a data track end-to-end: design → build → ship → monitor
  • Can read API documentation and figure things out independently
  • Strong async Python

Nice to have:

  • Experience with durable workflow orchestration (Temporal, Prefect, Celery, Airflow, etc.)
  • Data pipeline frameworks (Dagster, dbt, Airflow, etc.)
  • ERP integration experience, any platform (NetSuite, Epicor, Acumatica, WhereFour, or similar)
  • TypeScript/React at a working level — not the primary need, but useful
  • Worked at an early-stage startup before

Not Required:

  • Deep ML/AI experience - we use managed AI APIs, not model training
  • DevOps expertise - we have infra sorted, you just need to deploy confidently
  • Frontend specialization - AI tooling covers most of this

What We Offer:

  • Remote opportunity with collaborative team culture
  • Exposure to cloud-first environments and modern DevOps tooling
  • Opportunities for growth and cross-functional impact
  • Dynamic and fast-paced engineering environment

Tech Holding is proud to be an Equal Opportunity Employer and is committed to fostering a diverse and inclusive workplace. We welcome applicants from all backgrounds and experiences, and we consider qualified applicants without regard to race, color, religion, gender, sexual orientation, gender identity, national origin, disability, veteran status, or any other legally protected characteristic. If you require accommodation in the application process, please contact our HR team.
