Lead Java PySpark Developer

Work set-up: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • Bachelor's or Master's degree in Computer Science or related field.
  • At least 6 years of experience in Java, preferably with Spring Boot.
  • Over 3 years of experience with PySpark in distributed data processing.
  • Strong understanding of SQL, data modeling, and data security standards.

Key responsibilities:

  • Design and develop scalable data pipelines using PySpark.
  • Build and enhance Java-based microservices and APIs for data ingestion.
  • Collaborate with data teams to deliver high-quality, secure solutions.
  • Optimize data workflows and troubleshoot performance issues.

qode.world

Job description

Job Title: Lead Java + PySpark Developer

Location: Dallas, TX; Pittsburgh, PA; Cleveland, OH

Employment Type: Full-time


Job Summary:

Incedo is seeking a skilled and experienced Java + PySpark Developer to join its data engineering team. The ideal candidate will work on scalable data pipeline development, data integration, and backend microservices, contributing to analytics and business insights platforms.

Key Responsibilities:

  • Design, develop, and maintain scalable data processing pipelines using PySpark.
  • Build and enhance Java-based microservices/APIs to support data ingestion and transformation.
  • Collaborate with data scientists, data analysts, and architects to deliver high-quality, secure, and performant solutions.
  • Optimize data workflows and troubleshoot performance bottlenecks.
  • Participate in code reviews, design discussions, and provide technical leadership when needed.
  • Implement data validation, quality checks, and error-handling mechanisms.
  • Ensure adherence to data security and compliance standards within financial environments.

Required Skills:

  • 6+ years of hands-on experience in Java (Spring Boot preferred).
  • 3+ years of experience with PySpark in distributed data processing.
  • Experience with Hadoop ecosystem, Hive, HDFS, and Spark SQL.
  • Solid understanding of RESTful APIs, JSON/XML, and integration patterns.
  • Strong knowledge of SQL and data modeling.
  • Familiarity with CI/CD tools, Git, and Agile methodology.

Preferred Skills:

  • Experience with AWS, Azure, or GCP cloud services.
  • Prior work in the Banking/Finance domain (especially risk or fraud analytics).
  • Knowledge of containerization using Docker/Kubernetes.

Educational Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.


Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English
