
Snowflake Data Engineer with Kafka

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • 4+ years of experience with Snowflake
  • Strong experience with Apache Kafka
  • Proficiency in SQL and ETL/ELT processes
  • Bachelor’s or Master’s degree in Computer Science

Key responsibilities:

  • Design and maintain Snowflake data warehouse solutions
  • Develop and optimize ETL/ELT pipelines from Kafka

Leute Passen India
51 - 200 Employees

Job description

We are looking for a talented Snowflake Engineer with experience in Apache Kafka to design, develop, and optimize our data pipelines. In this role, you will be responsible for building scalable, high-performance data systems that integrate with various data sources and contribute to the overall data architecture. You will collaborate with cross-functional teams to ensure seamless data flow and support advanced analytics initiatives.

Key Responsibilities:
  • Design, implement, and maintain Snowflake data warehouse solutions, ensuring high performance, scalability, and security.
  • Develop and optimize ETL/ELT pipelines to ingest, transform, and load data into Snowflake from Kafka and other data sources (see the sketch after this list).
  • Work closely with data architects, data analysts, and data scientists to understand business requirements and translate them into technical solutions.
  • Design and implement Kafka-based messaging systems to stream data in real-time to Snowflake.
  • Troubleshoot and resolve data-related issues, including performance bottlenecks and data quality issues.
  • Monitor and optimize data pipelines for efficiency, scalability, and cost-effectiveness.
  • Implement data governance and security practices to ensure compliance with organizational standards.
  • Provide technical guidance and mentorship to junior engineers on Snowflake and Kafka-related technologies.
  • Stay updated on emerging technologies and best practices in data engineering and cloud services.
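
To ground the Kafka-to-Snowflake pipeline work described above, the sketch below shows one minimal way to micro-batch events from a Kafka topic into a Snowflake table in Python, using the confluent-kafka and snowflake-connector-python packages. The broker address, topic, warehouse, database, and table names are hypothetical placeholders, not details from this posting; a production pipeline would more likely rely on the Snowflake Kafka sink connector or Snowpipe Streaming.

```python
# Minimal sketch: micro-batch JSON events from a Kafka topic into a
# Snowflake table. All names (topic, table, warehouse) are hypothetical.
import json
import os

from confluent_kafka import Consumer
import snowflake.connector

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "snowflake-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])  # hypothetical topic

conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="LOAD_WH",    # hypothetical warehouse
    database="ANALYTICS",   # hypothetical database
    schema="RAW",
)

batch = []
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Kafka error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        batch.append((event.get("id"), json.dumps(event)))
        if len(batch) >= 100:  # flush in micro-batches of 100 events
            conn.cursor().executemany(
                "INSERT INTO raw_events (id, payload) VALUES (%s, %s)",
                batch,
            )
            consumer.commit()  # commit offsets only after a successful load
            batch.clear()
finally:
    consumer.close()
    conn.close()
```

Committing Kafka offsets only after the Snowflake insert succeeds gives at-least-once delivery; downstream deduplication (for example on the event id) would be needed for exactly-once semantics.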

Required Skills and Experience:
  • 4+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
  • Strong experience with Apache Kafka for stream processing and real-time data integration.
  • Proficiency in SQL and ETL/ELT processes.
  • Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
  • Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
  • Familiarity with tools like dbt, Airflow, or similar orchestration platforms (a minimal Airflow DAG sketch follows this list).
  • Knowledge of data governance, security, and compliance best practices.
  • Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
  • Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
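
For the orchestration tooling mentioned in the list above, here is a minimal Airflow DAG sketch that orders a daily Kafka-to-Snowflake load ahead of a transformation step. It assumes Airflow 2.4+ (for the `schedule` argument); the DAG id and task callables are hypothetical placeholders.

```python
# Minimal sketch of an Airflow 2.x DAG: run the Kafka-to-Snowflake load,
# then the transformations. DAG id and callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_from_kafka():
    # Placeholder: would invoke the micro-batch loader shown earlier.
    print("loading Kafka events into Snowflake...")


def run_transformations():
    # Placeholder: would trigger dbt models or Snowflake SQL transforms.
    print("transforming raw events...")


with DAG(
    dag_id="kafka_to_snowflake",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load", python_callable=load_from_kafka)
    transform = PythonOperator(
        task_id="transform", python_callable=run_transformations
    )
    load >> transform  # load must finish before transform starts
```

In practice the transform step would typically shell out to dbt (for example a BashOperator running `dbt run`) rather than a bare PythonOperator; the sketch only illustrates the dependency ordering.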

Preferred Skills:
  • Experience with other cloud data services like AWS Redshift, Google BigQuery, or Azure Synapse.
  • Familiarity with containerization (Docker, Kubernetes) and orchestration tools.
  • Experience with machine learning models and integration with data pipelines.

Educational Qualification:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or related field.

Why Join Us:
  • Work with innovative technologies in a fast-paced, collaborative environment.
  • Opportunity to contribute to cutting-edge data engineering solutions.
  • Competitive salary and benefits.
  • Continuous learning and career development opportunities.

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English

Other Skills

  • Problem Solving
  • Collaboration
  • Communication
  • Analytical Skills
