Degreed is the upskilling platform that connects learning to opportunities. We integrate everything people use to learn and build their careers—skill insights, LMSs, courses, videos, articles, and projects—and match everyone to growth opportunities that fit their unique skills, roles, and goals.
The Senior Software Engineer - Integrations works with other Engineers to create, implement, and maintain the infrastructure used by the Integrations team. In this role, you'll take ownership of infrastructure, data pipelining, and data modeling to support the Integrations team's efforts. In addition to working on bi-directional integrations with external systems, you will upskill other members of the Integrations team on the integration framework's infrastructure and technologies.
This role will be based in Bengaluru, India. After an in-office onboarding period, this position will be remote-first but with requirements for recurring in-office days for team syncs and group gatherings. Candidates will also be required to travel internationally 1-2 times annually for full company gatherings.
Day in the Life
- Enhance and support the current data infrastructure that enables the Integrations team to create meaningful data integrations
- Build secure, stable, reliable, scalable infrastructure that supports the team's growing needs
- Develop and support data infrastructure, data pipelines, and data models
- Build and maintain infrastructure using Airflow, Kubernetes, and Python
- Work closely with stakeholders to drive future integrations initiatives
- Create utilities that surface what is happening within data integration pipelines
- Write documentation that provides insight into the use and maintenance of the integrations framework
- Mentor and upskill engineers on the team
- This description reflects management's assignment of essential functions; it does not prescribe or restrict other tasks as assigned and is subject to change at any time.
Who You Are
- Team player and self-starter who excels at working toward a common goal with minimal guidance
- 3+ years' experience with Linux systems, development and support of performant and scalable production Python services, and data pipeline platforms (e.g., Airflow, Stitch, Spark)
- 4+ years of hands-on experience with SQL (via MS SQL Server and Snowflake), data modeling, and ELT processing on Azure or similar cloud platforms
- 2+ years of experience writing automated tests using tools like pytest
- Practical experience with software development processes and tools such as Scrum, code reviews, GitHub, Jira, etc.
What Sets You Apart
- 2+ years' experience using and maintaining production CI/CD pipelines with GitHub Actions
- Experience using dbt to build data models
- Exposure to Jinja templating
- Practical hands-on experience creating Airflow DAGs and working with Kubernetes
- Lifelong learner who is always deepening existing knowledge and adding new skills
Educational/Certification Requirements
- Bachelor's or Master's degree in a field such as Computer Science, Data Analytics, or Software Engineering