Role: Airflow Data Engineer
Job Location: Remote
Duration: 12+ Months
Interview: Video
Job Description: We are looking for self-motivated and data-driven engineers, architects, and designers who desire to find solutions and make an impact at scale while collaborating with a distributed team of like-minded and highly skilled professionals.
Must have experience with Airflow, Snowflake, Python, AWS, and API consumption
5+ years of software development or data engineering experience in Python, Spark, Scala or equivalent technologies
Experience designing and building highly scalable data pipelines (Airflow)
Knowledge and experience of working with large datasets (PB-scale)
Proven track record of working with cloud technologies (AWS)
Experience with developing or consuming web interfaces (REST API)
Experience with modern software development practices, leveraging CI/CD, and containerization such as Docker
Roles & Responsibilities
Design, build and support scalable data pipelines, systems, and APIs for the AdTech and MarTech Data Platform
Use distributed computing frameworks and other cutting-edge technologies to support data ingestion at scale
Produce high-quality code that is robust, efficient, testable and easy to maintain
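As a rough illustration of the pipeline work described above, here is a minimal extract-transform-load sketch in plain Python. The function names and record shape are hypothetical; in an Airflow DAG, each step would typically be wrapped as a task (e.g. via `PythonOperator` or the `@task` decorator) and chained as extract >> transform >> load.

```python
# Minimal ETL sketch (hypothetical names and record shape).
# In Airflow, each function below would typically become one task.

def extract(raw_records):
    """Stand-in for pulling records from a REST API; drops null payloads."""
    return [r for r in raw_records if r is not None]

def transform(rows):
    """Keep only well-formed rows and normalize the value field to float."""
    return [
        {"id": r["id"], "value": float(r["value"])}
        for r in rows
        if "id" in r and "value" in r
    ]

def load(rows):
    """Stand-in for a warehouse load (e.g. Snowflake); returns rows written."""
    return len(rows)

if __name__ == "__main__":
    raw = [{"id": 1, "value": "3.5"}, None, {"id": 2}]
    print(load(transform(extract(raw))))  # prints 1: one well-formed row survives
```

Separating the steps into pure functions keeps each stage independently testable before it is wired into an orchestrator.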
Required profile
Experience
Level of experience: Senior (5-10 years)
Spoken language(s):
English