Offer summary
Qualifications:
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field
- 5+ years of experience in data engineering
- Proficiency in data pipeline technologies such as Apache Kafka, NiFi, and Airflow
- Strong programming skills in languages such as Python, Java, or Scala
- Experience with cloud platforms such as AWS, Azure, or Google Cloud
Key responsibilities:
- Design, build, and optimize data pipeline solutions
- Ingest, transform, and cleanse data to ensure reliable data flow
- Orchestrate and automate workflow processes
- Optimize performance for large-scale data processing
- Implement monitoring, error handling, and documentation best practices