
Data Engineer - Sparrow

Requirements

  • 2-4 years of experience as a Software Engineer or Data Engineer
  • Strong hands-on experience with Python for building and automating data pipelines
  • Strong SQL skills for querying, transforming, and validating data
  • Experience working with Databricks (Spark, Delta Lake)

Roles & Responsibilities:

  • Design, build, and maintain data pipelines and ETL workflows using Python
  • Write and optimize SQL queries for data transformation and validation
  • Develop data processing jobs in a Databricks-first environment using Spark and Delta Lake
  • Orchestrate data workflows using Airflow (DAGs, scheduling, dependencies, retries)

Job description

Satellite builds dedicated engineering teams for top U.S.-based startups. We help clients grow their products by sourcing world-class technical talent around the globe, and now we're looking for a mid-level Data Engineer to join a fintech project, Sparrow Card.
You’ll become part of a collaborative engineering team, working closely with the client and contributing to the architecture and development of scalable data systems from the ground up.


What You’ll Do:
  • Design, build, and maintain data pipelines and ETL workflows using Python.
  • Write and optimize SQL queries for data transformation and validation.
  • Develop data processing jobs in a Databricks-first environment using Spark and Delta Lake.
  • Orchestrate data workflows using Airflow (DAGs, scheduling, dependencies, retries).
  • Implement data quality checks to ensure reliability and consistency of data pipelines.
  • Support the Principal Data Engineer and collaborate closely with the Data Engineering team.
  • Work with cloud-based data platforms, primarily within AWS.
  • Use Git for version control and contribute to basic CI/CD workflows.
  • Participate in code reviews and continuous improvement of data solutions.

What We’re Looking For:
  • 2-4 years of experience as a Software Engineer or Data Engineer.
  • Strong hands-on experience with Python for building and automating data pipelines.
  • Strong SQL skills for querying, transforming, and validating data.
  • Experience working with Databricks (Spark, Delta Lake).
  • Hands-on experience with Airflow for workflow orchestration.
  • Familiarity with AWS fundamentals in a cloud-based data environment.
  • Experience building and maintaining ETL pipelines.
  • Experience with Git and basic CI/CD processes.
  • Nice to have: experience with streaming data pipelines.
  • Ability to work effectively in a collaborative, fast-paced environment.
  • Good English communication skills (verbal and written).