
Sr. DevOps Engineer (Elastic Operations)

72% Flex
Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • 5+ years in software, DevOps, or data engineering
  • Experience with Snowflake, AWS, and Azure
  • Proficiency in SQL, Java, and Python/Scala
  • Bachelor's degree in Computer Science or a related field

Key responsibilities:

  • Design, optimize & maintain complex data workloads
  • Work on large-scale data platform projects
  • Participate in data integration, modeling, and security tasks
  • Learn and upskill on data ecosystem technologies
  • Troubleshoot, optimize, and enhance data pipelines
phData · SME · https://www.phdata.io/
201 - 500 Employees

Job description


Your missions

phData is revolutionizing how our clients use data and artificial intelligence. As the premier services provider specializing in data application and data platform services, we partner with the leading technology companies across the modern data stack to deliver cutting-edge solutions. We are technology evangelists around critical ecosystem tools like Snowflake, AWS, Azure, dbt, Sigma, Tableau, and Power BI. We are passionate about helping global enterprises overcome their toughest challenges by building AI solutions and data applications and then getting these solutions into production.

phData is a remote-first global company with employees based in the United States, Latin America and India. We celebrate the culture of each of our team members and foster a community of technological curiosity, ownership and trust. Even though we're growing extremely fast, we maintain a casual, exciting work environment. We hire top performers and allow you the autonomy to deliver results.

Overview:

We are seeking a qualified Sr. DevOps Engineer proficient in software/data engineering and DevOps to help deliver our Elastic Operations service. This position reports to our Managed Services team in Bangalore, India. This is a hands-on technical Developer/DevOps position, so only experienced candidates with a deep passion for understanding and designing complex data solutions should apply.

As a Senior DevOps Engineer, you will be responsible for designing, validating, optimizing, and maintaining complex data integration and data pipeline workloads at both small and large scale. You will work on large-scale, complex data platform projects running on Snowflake and other native cloud platform services in AWS and Azure. You will also participate in data integration, data modeling, data governance, and data security tasks. In addition, you will need the ability to learn and quickly upskill on data ecosystem technologies related to data ingestion, data transformation, data modeling, data migration, platform design, and architecture, with some exposure to data visualization tools like Power BI.

Required Experience:

  • 5+ years of hands-on experience as a software, DevOps, or data engineer in data modeling, designing, implementing, and supporting modern data solutions.
  • Experience with core cloud data platforms such as Snowflake, AWS, Azure, or Databricks.
  • Deep working knowledge of end-to-end pipelines for small and large-scale data sets from various application sources, with the ability to diagnose and fix broken pipelines.
  • Understanding of common data integration and data transformation patterns for small and large-scale data sets.
  • Deep understanding of data validation processes, whether automated with utilities or performed manually.
  • Hands-on experience troubleshooting, optimizing, and enhancing data pipelines and bringing improvements in the production environment.
  • Extensive experience in providing operational support across a large user base for a cloud-native data warehouse (Snowflake and/or Redshift).
  • Programming expertise in Java, Python, and/or Scala.
  • Proficiency in SQL, including the ability to write, debug, and optimize SQL queries.
  • Unmatched troubleshooting and performance tuning skills.
  • Willingness to work in both developer and support roles across customers; proficient in incident management and troubleshooting.
  • Excellent client-facing written and verbal communication skills and experience.
  • 4-year Bachelor's degree in Computer Science or a related field.

One or more of the following is preferred:

  • Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks
  • Cloud and Distributed Data Storage: S3, ADLS, HDFS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems
  • Data integration technologies: Spark, Kafka, event/streaming platforms, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, or other data integration technologies
  • Multiple data sources (e.g. queues, relational databases, files, search, API)
  • Experience designing, implementing, and maintaining CI/CD pipelines to enable rapid and reliable software releases
  • Experience monitoring and optimizing CI/CD pipeline performance to reduce build and deployment times
  • Automated data transformation and data curation: dbt, Spark, Spark streaming, automated pipelines
  • Workflow Management and Orchestration: Airflow, AWS Managed Airflow, Luigi, NiFi
  • CI/CD tools: Git, Flyway, and Liquibase

Why phData? We offer:

  • Medical Insurance for Self & Family
  • Medical Insurance for Parents
  • Term Life & Personal Accident
  • Wellness Allowance
  • Broadband Reimbursement
  • Professional Development Allowance
  • Reimbursement of Skill Upgrade Certifications
  • Certification Reimbursement


phData celebrates diversity and is committed to creating an inclusive environment for all employees. Our approach helps us to build a winning team that represents a variety of backgrounds, perspectives, and abilities. So, regardless of how your diversity expresses itself, you can find a home here at phData. We are proud to be an equal opportunity employer. We prohibit discrimination and harassment of any kind based on race, color, religion, national origin, sex (including pregnancy), sexual orientation, gender identity, gender expression, age, veteran status, genetic information, disability, or other applicable legally protected characteristics. If you would like to request an accommodation due to a disability, please contact us at People Operations.

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Soft Skills

  • Excellent Communication
  • Teamwork
  • Independence
