Senior Data Pipeline Developer

Remote: Full Remote
Contract:
Salary: $170K - $230K yearly
Experience: Mid-level (2-5 years)
Work from:

Offer summary

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field
  • 5+ years of experience in data engineering
  • Proficiency in data pipeline technologies such as Apache Kafka, NiFi, and Airflow
  • Strong programming skills in languages such as Python, Java, or Scala
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud

Key responsibilities:

  • Design, build and optimize data pipeline solutions
  • Ingest, transform and cleanse data for seamless flow
  • Orchestrate and automate workflow processes efficiently
  • Optimize performance for large-scale data processing needs
  • Implement monitoring, error handling, and documentation best practices
Unreal Gigs (Startup)
https://www.unrealstaffing.com/
2 - 10 Employees

Job description


Your missions

Company Overview: Welcome to the forefront of data-driven innovation! Our company is dedicated to leveraging the power of data to drive transformative change and solve complex problems across industries. We're committed to building cutting-edge data pipeline solutions that enable efficient data ingestion, processing, and delivery. Join us and be part of a dynamic team shaping the future of data pipeline development.

Position Overview: As a Senior Data Pipeline Developer, you'll play a critical role in designing, building, and optimizing our data pipeline solutions. You'll work on challenging projects, from data ingestion and transformation to orchestration and automation, to support the needs of our data-driven organization. If you're a seasoned developer with expertise in data pipeline technologies and a passion for building scalable and reliable data systems, we want you on our team.

Requirements

Key Responsibilities:

  1. Data Pipeline Design: Design and implement data pipeline solutions to ingest, process, and deliver data from various sources to target systems, ensuring data quality, integrity, and timeliness.
  2. Data Ingestion: Develop and maintain data ingestion processes to collect data from diverse sources, including databases, APIs, files, and streaming sources, ensuring seamless data flow and interoperability.
  3. Data Transformation: Transform and cleanse data as it moves through the pipeline, applying business rules, data enrichment, and validation to meet business requirements and enable downstream analytics and reporting.
  4. Orchestration and Automation: Implement workflow orchestration and automation solutions to schedule and manage data pipeline workflows, reducing manual intervention and improving operational efficiency (an illustrative orchestration sketch follows this list).
  5. Performance Optimization: Optimize data pipeline performance through parallel processing, partitioning, and other techniques, ensuring scalability and responsiveness for large-scale data processing needs.
  6. Monitoring and Alerting: Implement monitoring and alerting systems to track data pipeline performance and health, proactively identifying and resolving issues to minimize downtime and data loss.
  7. Error Handling and Retry Mechanisms: Implement error handling and retry mechanisms to handle data processing failures and ensure data reliability and consistency.
  8. Documentation and Best Practices: Document data pipeline designs, processes, and best practices, providing clear and comprehensive documentation to facilitate understanding and collaboration among team members.
  9. Collaboration: Collaborate with cross-functional teams, including data engineers, data scientists, and business analysts, to understand requirements and deliver data pipeline solutions that meet business needs.
  10. Mentorship and Development: Mentor junior developers, sharing expertise and best practices in data pipeline development, and facilitate knowledge sharing sessions within the team.
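
To make the orchestration, monitoring, and error-handling expectations above more concrete, here is a minimal, hypothetical Apache Airflow sketch of a daily pipeline with retries and a failure callback. The task names, schedule, owner, and alerting hook are illustrative assumptions only, not details taken from this posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull a batch from a source system (API, database, or files).
    print("extracting batch for", context["ds"])


def transform(**context):
    # Placeholder: apply business rules, enrichment, and validation.
    print("transforming batch for", context["ds"])


def notify_on_failure(context):
    # Placeholder alerting hook (e.g. Slack or PagerDuty in a real pipeline).
    print("task failed:", context["task_instance"].task_id)


default_args = {
    "owner": "data-platform",
    "retries": 3,                           # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,
}

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task   # run extraction before transformation
```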

Qualifications:

  • Bachelor's degree or higher in Computer Science, Engineering, Mathematics, or related field.
  • 5+ years of experience in data engineering, with a focus on designing, building, and optimizing data pipeline solutions.
  • Proficiency in data pipeline technologies such as Apache Kafka, Apache NiFi, Apache Airflow, or similar.
  • Strong programming skills in languages such as Python, Java, or Scala, with experience in data processing frameworks like Apache Spark or Apache Beam (an illustrative transformation sketch follows this list).
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform, and services like AWS Glue, Azure Data Factory, or Google Dataflow.
  • Strong understanding of data integration concepts and techniques, with experience integrating data from diverse sources and systems.
  • Strong problem-solving skills and analytical thinking, with the ability to troubleshoot complex data pipeline issues and optimize system performance.
  • Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams and communicate technical concepts to non-technical stakeholders.
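
As a rough illustration of the Spark-based transformation and cleansing work referenced above, the following PySpark sketch deduplicates, validates, and enriches a raw dataset before writing partitioned output. The column names, storage paths, and validation rule are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleanse_events").getOrCreate()

# Hypothetical raw source; swap in the real source path or connector.
raw = spark.read.json("s3://example-bucket/raw/events/")

cleansed = (
    raw
    .dropDuplicates(["event_id"])                      # remove duplicate events
    .filter(F.col("event_ts").isNotNull())             # drop rows failing validation
    .withColumn("event_date", F.to_date("event_ts"))   # enrich for partitioning
)

# Partitioned output lets downstream jobs prune by date for large-scale reads.
cleansed.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)
```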

Benefits

  • Competitive salary: The industry standard salary for Senior Data Pipeline Developers typically ranges from $170,000 to $230,000 per year, depending on experience and qualifications.
  • Comprehensive health, dental, and vision insurance plans.
  • Flexible work hours and remote work options.
  • Generous vacation and paid time off.
  • Professional development opportunities, including access to training programs, conferences, and workshops.
  • State-of-the-art technology environment with access to cutting-edge tools and resources.
  • Vibrant and inclusive company culture with opportunities for growth and advancement.
  • Exciting projects with real-world impact at the forefront of data-driven innovation.

Join Us: Ready to shape the future of data pipeline development? Apply now to join our team and be part of the data revolution!

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s): English

Soft Skills

  • Interpersonal Skills
  • Team Collaboration
  • Problem Solving
  • Analytical Thinking
  • Teamwork
  • Mentorship
