Offer summary
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- 5+ years of experience in software development, data engineering, or a similar role
- Strong programming skills in Python; familiarity with AWS, GCP, Docker, and PostgreSQL
- Expertise in optimizing integration pipelines and tools (e.g., Airflow, Apache Kafka)
- Experience in data pipeline design, troubleshooting, and monitoring for data quality
Key responsibilities:
- Develop and maintain reliable data integrations for key systems
- Optimize integration pipelines for high performance and cost-efficiency
- Refine technology stack focusing on PostgreSQL, Druid, Pandas, Python, SQL
- Provide technical support as a subject-matter expert (SME) in data integration and systems engineering
- Communicate technical concepts to both technical and non-technical stakeholders