
Kafka Engineer


Job description

Overview


The Kafka Engineer / Administrator / Developer is a key member of the program technical team, supporting large-scale data streaming, system integration, and platform modernization initiatives. This role is responsible for designing, developing, administering, and optimizing Apache Kafka clusters and event-driven architectures that support high-volume, mission-critical data flows. The Kafka Engineer works closely with Federal Government stakeholders, architects, developers, DevOps teams, API Gateway (APIGW) teams, and backend system owners to ensure reliable, secure, and scalable event streaming pipelines. This role plays a critical part in enabling real-time data integration, microservices communication, and operational resilience across complex enterprise systems.

 

Key Functions

Kafka Engineering & Administration

  • Design, build, administer, and maintain Kafka clusters across development, test, and production environments.
  • Manage Kafka topics, partitions, brokers, replication, retention policies, and access controls.
  • Monitor Kafka performance, availability, throughput, and latency; proactively identify and resolve issues.
  • Perform capacity planning, tuning, upgrades, patching, and disaster recovery planning for Kafka environments.
  • Implement and maintain high availability and fault-tolerant Kafka configurations.
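For candidates gauging the depth expected here: much of the topic and partition management above turns on how keyed records map to partitions. A minimal, dependency-free sketch of that mapping (Python's `zlib.crc32` stands in for Kafka's actual murmur2 default partitioner, and all names are illustrative, not the Kafka API):

```python
import zlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition, as a keyed producer does.

    Kafka's default partitioner uses murmur2; crc32 stands in here so
    the sketch stays dependency-free. Keyless records are instead
    distributed round-robin (or sticky-batched) across partitions.
    """
    return zlib.crc32(key) % num_partitions

# The same key always lands on the same partition, which is what
# preserves per-key ordering guarantees.
p1 = assign_partition(b"order-1234", 12)
p2 = assign_partition(b"order-1234", 12)
assert p1 == p2
```

Note that changing the partition count remaps existing keys, which is one reason partition counts are settled during capacity planning rather than adjusted casually.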

Event Streaming & Integration

  • Develop and support event streaming pipelines using Kafka for real-time and near-real-time data processing.
  • Integrate Kafka with API Gateway (APIGW)–based microservices and downstream backend systems.
  • Design and implement Kafka producers, consumers, and connectors (e.g., Kafka Connect) to support system integrations and ETL/data movement needs.
  • Collaborate with application teams to define event schemas, topics, and data contracts.
  • Ensure reliable message delivery, data integrity, and error handling across streaming workflows.
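The reliable-delivery bullet above comes down to offset-commit discipline on the consumer side. A self-contained sketch of at-least-once consumption, using an in-memory stand-in for a topic partition (`MiniTopic` and the function names are illustrative, not the Kafka client API):

```python
class MiniTopic:
    """An in-memory stand-in for a single Kafka topic partition."""
    def __init__(self):
        self.log = []          # append-only record log
        self.committed = 0     # last committed consumer offset

    def produce(self, record):
        self.log.append(record)

def consume_at_least_once(topic, handler):
    """Process records, committing the offset only after successful handling.

    Committing after processing gives at-least-once delivery: a crash
    between handling and commit replays the record on restart. (A real
    Kafka consumer commits offsets to the broker instead.)
    """
    while topic.committed < len(topic.log):
        record = topic.log[topic.committed]
        handler(record)          # may raise; offset is not advanced on failure
        topic.committed += 1     # commit only after the handler succeeds
```

Because a failure between handling and commit replays the record, downstream handlers in this pattern should be idempotent; committing before processing would flip the trade-off to at-most-once.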

Security, Compliance & Operations

  • Implement Kafka security best practices, including authentication, authorization, encryption in transit, and auditing.
  • Ensure Kafka implementations comply with CMS security, data governance, and operational standards.
  • Support DevSecOps practices, CI/CD pipelines, and infrastructure-as-code approaches where applicable.
  • Participate in incident response, root cause analysis, and operational readiness activities.
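As an illustration of the security settings these duties involve, a client configuration sketch in librdkafka/confluent-kafka property style; the hostname, file path, and credential values are placeholders, not program-specific values:

```python
# Illustrative Kafka client security settings (librdkafka property names).
# All values below are placeholders.
secure_client_config = {
    "bootstrap.servers": "broker1.example.gov:9093",
    "security.protocol": "SASL_SSL",            # TLS encryption in transit
    "sasl.mechanism": "SCRAM-SHA-512",          # client authentication
    "sasl.username": "svc-stream-app",
    "sasl.password": "<from-secrets-manager>",  # never hard-code credentials
    "ssl.ca.location": "/etc/pki/ca-bundle.crt",
}
```

Authorization is then layered on the broker side via ACLs scoped to topics, consumer groups, and cluster operations, with credentials sourced from a secrets manager rather than configuration files.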

Collaboration & Documentation

  • Work closely with architects, developers, DevOps engineers, and system administrators to support solution design and delivery.
  • Document Kafka architectures, configurations, operational procedures, and integration patterns.
  • Provide technical guidance, troubleshooting support, and knowledge transfer to internal teams.

 

Minimum Qualifications

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • 3+ years of experience developing, administering, and supporting Apache Kafka in enterprise environments.
  • Hands-on experience managing Kafka clusters, topics, partitions, and event streaming pipelines.
  • Experience integrating Kafka with microservices, API Gateways (APIGW), and backend systems.
  • Strong understanding of event-driven architectures, messaging patterns, and data streaming concepts.
  • Experience with Linux-based environments and command-line administration.
  • Strong troubleshooting and performance tuning skills.
  • Ability to clearly communicate technical concepts to both technical and non-technical stakeholders.

 

Preferred Qualifications

  • Experience supporting federal healthcare programs.
  • Experience working in Agile, Scrum, and/or DevSecOps environments.
  • Familiarity with cloud-based Kafka deployments (AWS MSK or similar managed Kafka services).
  • Experience with CI/CD pipelines and automation tools.
  • Knowledge of cloud security concepts and secure data transmission.
  • Experience with monitoring tools and observability platforms for Kafka (e.g., Prometheus, Grafana, CloudWatch).
  • Familiarity with schema management tools (e.g., Schema Registry).
  • Knowledge of containerized environments and orchestration tools (Docker, Kubernetes) is a plus.

 

Position Details

  • Employment Type: Full-Time, W2
  • Location: 100% Remote (US-based only)
  • Hours: 40 hours/week, availability during core business hours
  • Start Date: ASAP
  • Eligibility: Must be eligible to obtain a Public Trust clearance
  • Salary: $100,000 – $130,000 (commensurate with experience)
