Senior DataOps Engineer

Work set-up: Full Remote
Contract: Full-Time Contract (12 months+)
Experience: Senior (5-10 years)
Work from: Anywhere in India

Offer summary

Qualifications:

  • Bachelor's degree in Computer Science or Engineering, or equivalent experience.
  • At least five years of experience working with data warehouses and analytics tools.
  • Proficiency in SQL and familiarity with cloud services such as Azure, AWS, or Google Cloud.
  • Experience with ELT orchestration tools, git workflows, and data modeling principles.

Key responsibilities:

  • Manage and monitor the entire data pipeline from collection to deployment.
  • Collaborate with business partners to understand system configurations and establish connectivity.
  • Perform code reviews, optimize data workflows, and implement process improvements.
  • Provide technical support and training to end-users on data access and usage.

Sky Systems, Inc. (SkySys)
Information Technology & Services startup | 11-50 employees | https://myskysys.com/

Job description


Role: Senior DataOps Engineer
Position Type: Full-Time Contract (40hrs/week)
Contract Duration: 12 months+
Work Hours: 8am-4pm EST OR 4am-1pm EST
Work Schedule: 8 hours/day (Mon-Fri)
Location: 100% remote (Candidates can work from anywhere in India)

Position Summary

The Senior Data Operations Engineer is a crucial member of the Client's team, providing essential data support within the modern data stack along with coaching and hands-on assistance to engineers, analysts, business users, data scientists, and decision-makers across the company. The role demands deep knowledge of SQL, Python, and optimization, as well as familiarity with tools such as Snowflake, Databricks, Azure Cloud, dbt, and git version control. Senior Data Operations Engineers play a key role in managing and enhancing data workflows, combining technical skills with a thorough understanding of data management principles. They contribute to coding new features, assist principal engineers with architectural plans, and conduct code reviews to ensure that new features and fixes efficiently meet stakeholder needs. Ideal candidates enjoy teamwork and are comfortable sharing their ideas publicly. The Senior Data Operations Engineer reports to the Manager of IT Data Operations Engineering.

Essential Responsibilities

  • Collaborate with business partners to comprehend external system configurations and establish connectivity, facilitating downstream data engineering development
  • Oversee the entire data pipeline, from data collection to deployment of data models
  • Monitor data pipeline performance and support bug fixing and performance analysis along the data pipeline; resolve any issues or bottlenecks
  • Optimize data workflows for cost and performance, reducing overhead and improving operational efficiency across data operations
  • Perform end-to-end unit testing and code reviews to promote data integrity across a variety of products built by the development team
  • Identify and implement process improvements, such as automating manual processes
  • Provide technical support and training to end-users on data access and usage
  • Be comfortable presenting to large groups in public settings with high visibility
  • Be a strong advocate for a culture of process and data quality across development teams
  • Follow an agile development methodology
  • Other duties as assigned

Minimum Experience and Qualifications:

  • Bachelor's degree in Computer Science or Engineering; OR demonstrated capability to perform job responsibilities with a combination of a High School Diploma/GED and at least four (4) years of previous relevant work experience
  • Five (5) years of relevant experience in a data role working with data warehouses and data analytics tools
  • Familiarity with cloud services (AWS, Azure, or Google Cloud) and understanding of data warehousing solutions like Snowflake
  • Proficiency in SQL
  • Experience with modern Extract/Load/Transform (ELT) orchestration tools like Azure Data Factory or Airflow
  • Experience with git and git-based workflows
  • Experience in optimization and enhancement of cloud environments
  • Knowledge of data modeling, data warehousing, and data architecture principles
  • Excellent problem-solving skills and the ability to work in a team environment
  • Strong communication skills and the ability to convey complex data issues in clear terms to non-technical stakeholders
  • Available for overnight travel (10%)
  • Must pass a pre-employment drug test
  • Must be legally eligible to work in the country in which the position is located
  • Authorization to work in the US is required. This position is not eligible for visa sponsorship

Preferred Experience and Qualifications:

  • Experience implementing best practices for performance tuning of data processes, optimizing resource utilization, and implementing cost-effective solutions to enhance data operations
  • Strong knowledge of Python programming
  • Proven track record of successfully contributing to a project that transitioned a large enterprise to a new cloud data warehouse, like Snowflake
  • Prior airline experience

Required profile

Experience

Level of experience: Senior (5-10 years)
Industry: Information Technology & Services
Spoken language(s): English

Other Skills

  • Teamwork
  • Communication
  • Problem Solving
