
Kafka Developer

Remote: Full Remote

Offer summary

Qualifications:

  • Strong understanding of Kafka architecture and components such as brokers, topics, and consumer groups.
  • Proficiency in programming languages like Java or Scala, with experience in system design and data management.
  • Familiarity with distributed systems concepts and Kafka security mechanisms.
  • Experience with Kafka Connect, KSQL architecture, and cloud integration on platforms like AWS or Azure.

Key responsibilities:

  • Identify and rectify Kafka messaging issues in a timely manner.
  • Collaborate with business and IT teams to design and implement solutions using Agile methodology.
  • Administer and troubleshoot distributed Kafka clusters across various environments (DEV, QA, UAT, PROD).
  • Provide technical direction and guidance to other engineers on the project.

Diverse Lynx
http://www.diverselynx.com
1001 - 5000 Employees

Job description

Role: Kafka Developer

Location: Raleigh, NC / Phoenix, AZ / Remote

Duration: Full Time

Job Description

Kafka Developer

The role requires a strong understanding of Kafka architecture, including brokers, topics, partitions, and consumer groups, along with skills in reading and interpreting logs and monitoring metrics, familiarity with distributed systems concepts, proficiency in programming languages like Java or Scala, and knowledge of network connectivity and configuration, in order to identify and resolve potential problems within a Kafka cluster.
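
For orientation only, a minimal Java consumer along the following lines illustrates how brokers, topics, and consumer groups fit together; the bootstrap address, topic name, and group id are illustrative placeholders rather than details of this role.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class ExampleConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Placeholder broker address; a real cluster would list several brokers.
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            // The consumer group determines how the topic's partitions are shared
            // across consumer instances.
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("example-topic"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> r : records) {
                        // Partition and offset are the coordinates that log analysis
                        // and lag monitoring reason about.
                        System.out.printf("partition=%d offset=%d value=%s%n",
                                r.partition(), r.offset(), r.value());
                    }
                }
            }
        }
    }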

Key skills required for Kafka messaging troubleshooting:

• Deep understanding of Kafka architecture: Thorough knowledge of how Kafka components like brokers, topics, partitions, consumer groups, and replication factors work together.

• Log analysis: Ability to interpret Kafka logs from producers, consumers, and brokers to identify error messages, warnings, and potential issues.

• Monitoring and metrics: Familiarity with monitoring tools to track key Kafka metrics like consumer lag, message throughput, broker CPU usage, and network latency.

• Distributed systems knowledge: Understanding of concepts like fault tolerance, data replication, leader election, and distributed consensus to troubleshoot issues related to cluster failures.

• Programming language proficiency: Strong coding skills in Java or Scala, as many Kafka applications are written in these languages, allowing you to debug custom producers and consumers.

• Network troubleshooting: Ability to diagnose network connectivity issues between brokers and clients, including checking network configurations and firewall rules.

• Kafka configuration management: Knowledge of Kafka configuration parameters, including topic creation, partition settings, replication factors, and consumer group settings.

• Security understanding: Awareness of Kafka security mechanisms like authentication, authorization, and encryption to troubleshoot related issues.

• Troubleshooting tools and techniques: Familiarity with Kafka management tools, command-line utilities, and debugging techniques to investigate and resolve issues.

• Consumer lag: Identifying the cause of high consumer lag (e.g., slow processing, insufficient consumers) and adjusting consumer configurations or application logic; a lag-check sketch follows this list.

• Broker failures: Analyzing logs and metrics to determine the root cause of a broker failure and taking actions like rebalancing partitions or restarting the broker.

• Message delivery issues: Investigating missing messages, message duplication, or out-of-order delivery by examining producer and consumer configurations.

• Performance bottlenecks: Identifying performance issues related to high message throughput, network congestion, or slow disk I/O and optimizing Kafka settings.
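
To make the "Consumer lag" point above concrete, a sketch using Kafka's Java AdminClient can compare a group's committed offsets against the latest log-end offsets; the broker address and group id are hypothetical placeholders, and a production check would add error handling.

    import java.util.Map;
    import java.util.Properties;
    import java.util.stream.Collectors;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.ListOffsetsResult;
    import org.apache.kafka.clients.admin.OffsetSpec;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;

    public class ConsumerLagCheck {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
            try (AdminClient admin = AdminClient.create(props)) {
                // Offsets the group has committed, per partition ("example-group" is hypothetical).
                Map<TopicPartition, OffsetAndMetadata> committed =
                        admin.listConsumerGroupOffsets("example-group")
                             .partitionsToOffsetAndMetadata().get();
                // Latest log-end offsets for the same partitions.
                Map<TopicPartition, OffsetSpec> latestSpec = committed.keySet().stream()
                        .collect(Collectors.toMap(tp -> tp, tp -> OffsetSpec.latest()));
                Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> latest =
                        admin.listOffsets(latestSpec).all().get();
                // Lag per partition = log-end offset minus committed offset.
                committed.forEach((tp, om) ->
                        System.out.printf("%s lag=%d%n", tp, latest.get(tp).offset() - om.offset()));
            }
        }
    }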

As a Kafka Developer

The role requires strong proficiency in Confluent Kafka architecture and a programming language like Java or Scala, expertise in system design, data management skills, and the ability to understand and implement data streaming pipelines.

Key skills required for Kafka Developer:

• Deep understanding of Confluent Kafka: Thorough knowledge of Kafka concepts like producers, consumers, topics, partitions, brokers, and replication mechanisms.

• Programming language proficiency: Primarily Java or Scala, with potential for Python depending on the project.

• System design and architecture: Ability to design robust and scalable Kafka-based data pipelines, considering factors like data throughput, fault tolerance, and latency.

• Data management skills: Understanding of data serialization formats like JSON, Avro, and Protobuf, and how to manage data schema evolution.

• Kafka Streams API (optional): Knowledge of Kafka Streams for real-time data processing within the Kafka ecosystem; a brief topology sketch follows this list.

• Monitoring and troubleshooting: Familiarity with tools to monitor Kafka cluster health, identify performance bottlenecks, and troubleshoot issues.

• Cloud integration: Experience deploying and managing Kafka on cloud platforms like AWS, Azure, or GCP.

• Distributed systems concepts: Understanding of concepts like distributed consensus, leader election, and fault tolerance.

• Security best practices: Knowledge of Kafka security features to implement authentication and authorization mechanisms.

• Communication and collaboration: Ability to work effectively with other developers, data engineers, and stakeholders to design and implement Kafka solutions.
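
Relating to the Kafka Streams API item above, a minimal Java topology might look like the following sketch; the application id, broker address, and topic names are illustrative placeholders, not part of this role's actual pipeline.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Produced;

    public class ExampleStreamsApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "example-streams-app"); // placeholder id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder broker
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> source = builder.stream("input-topic");
            // A simple stateless transformation: uppercase every value and write it out.
            source.mapValues(v -> v.toUpperCase())
                  .to("output-topic", Produced.with(Serdes.String(), Serdes.String()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }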

Other Skills required:

• Strong experience with Kafka Connect / KSQL architecture and the associated clustering model.

• Hands-on experience with Kafka DB connectors for Oracle and MySQL.

• Strong fundamentals and experience in Kafka administration, configuration, and troubleshooting; a topic-configuration sketch follows this list.

• Understanding of and experience with Kafka clustering and its fault-tolerance model supporting HA and DR.

• Experience developing KStreams pipelines and deploying KStreams clusters.

• Strong problem-solving skills and a passion for debugging complex issues and mature code.

• Experience using agile methodologies for software development.

• Experience developing KSQL queries and knowledge of best practices for choosing between KSQL and Kafka Streams.

• Familiarity with Confluent Control Center, or experience with a Kafka monitoring tool (UI).

• Ability to work in a fast-paced and dynamically changing environment.

• Ability to lead the effort and work with minimal supervision.
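
As a small illustration of the administration and configuration work mentioned above, Kafka's Java AdminClient can create a topic with explicit partition and replication settings; the topic name, partition count, and replication factor below are placeholder values chosen for the sketch.

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class CreateTopicExample {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
            try (AdminClient admin = AdminClient.create(props)) {
                // 6 partitions and replication factor 3 are illustrative; real values
                // depend on throughput and fault-tolerance requirements.
                NewTopic topic = new NewTopic("example-topic", 6, (short) 3);
                admin.createTopics(Collections.singletonList(topic)).all().get();
            }
        }
    }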

Duties & Responsibilities:

• Independently identify and rectify Kafka messaging issues in a timely manner.

• Work with the business and IT teams to understand business problems, and to design, implement, and deliver an appropriate solution using Agile methodology across the larger program.

• Work independently to implement solutions on multiple platforms (DEV, QA, UAT, PROD).

• Provide technical direction, guidance, and reviews to other engineers working on the same project.

• Administer distributed Kafka clusters in DEV, QA, UAT, and PROD environments and troubleshoot performance issues.

• Implement and debug subsystems/microservices and components.

• Follow an automate-first/automate-everything philosophy.

• Hands-on in programming languages.

Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.

Required profile

Experience

Spoken language(s): English

Other Skills

  • Leadership
  • Problem Solving
