Position Summary
The Senior Data Operations Engineer is a crucial member of the Clients team, providing essential data support within the modern data stack, along with coaching and hands-on assistance to engineers, analysts, business users, data scientists, and decision-makers across the company. This role demands deep knowledge of SQL, Python, and optimization, as well as familiarity with tools such as Snowflake, Databricks, Azure Cloud, dbt, and Git version control.
Senior Data Operations Engineers play a key role in managing and enhancing data workflows, combining technical skills with a thorough understanding of data management principles. They contribute code for new features, assist principal engineers with architectural plans, and conduct code reviews to ensure that new features and fixes efficiently meet stakeholder needs. Ideal candidates enjoy teamwork and are adept at sharing their ideas publicly.
The Senior Data Operations Engineer reports to the Manager of IT Data Operations Engineering.
Essential Responsibilities
- Collaborate with business partners to comprehend external system configurations and establish connectivity, facilitating downstream data engineering development
- Oversee the entire data pipeline, from data collection to deployment of data models
- Monitor data pipeline performance and support bug fixing and performance analysis along the data pipeline; resolve any issues or bottlenecks
- Optimize data workflows for cost savings, reducing overhead and improving operational efficiency across data operations
- Perform unit and end-to-end testing and code reviews to promote data integrity across the variety of products built by the development team
- Identify and implement process improvements, such as automating manual processes
- Provide technical support and training to end-users on data access and usage
- Present comfortably to large groups in high-visibility public settings
- Be a strong advocate for a culture of process and data quality across development teams
- Follow an agile development methodology
- Other duties as assigned
Minimum Experience and Qualifications
- Bachelor's degree in Computer Science or Engineering; OR demonstrated capability to perform job responsibilities with a combination of a High School Diploma/GED and at least four (4) years of previous relevant work experience
- Five (5) years of relevant experience in a data role working with data warehouses and data analytics tools
- Familiarity with cloud services (AWS, Azure, or Google Cloud) and understanding of data warehousing solutions like Snowflake
- Proficiency in SQL
- Experience with modern Extract/Load/Transform (ELT) orchestration tools like Azure Data Factory or Airflow
- Experience with git and git-based workflows
- Experience in optimization and enhancement of cloud environments
- Knowledge of data modeling, data warehousing, and data architecture principles
- Excellent problem-solving skills and the ability to work in a team environment
- Strong communication skills and the ability to convey complex data issues in clear terms to non-technical stakeholders
- Must be legally eligible to work in the country in which the position is located
Preferred Experience and Qualifications
- Experience implementing best practices for performance tuning, optimizing resource utilization, and deploying cost-effective solutions to enhance data operations
- Strong knowledge of Python programming
- Proven track record of contributing to a project that transitioned a large enterprise to a new cloud data warehouse, such as Snowflake
- Prior airline experience