
PySpark developer_CBS

Roles & Responsibilities

  • Develop and maintain Spark jobs for data processing.
  • Handle data loading, transformation, and enrichment tasks.
  • Work with Big Data tools such as Hive and HBase.
  • Analyze and troubleshoot data processing issues.

Requirements:

  • Minimum 6+ years of experience in creating Spark jobs using Java or Scala.
  • Strong knowledge of data loading, transformation, and enrichment techniques.
  • Experience with Big Data tools such as Hive and HBase.
  • Proficiency in Spark Streaming, SQL, and data warehouse concepts.

Job description

Location: Chennai / Hyderabad / Bangalore

Detailed JD:
1. Minimum 6+ years of experience in creating Spark jobs using Java/Scala
2. Should have very good experience in developing data loading and transformation tasks using external sources, merging data, performing data enrichment, and loading into target data destinations
3. Must have good knowledge of Big Data tools: Hive and HBase tables
4. Should have experience with Spark Streaming
5. Must have good knowledge of SQL
6. Must have good knowledge of data warehouse concepts
7. Must have good analytical skills to analyse issues
8. Should have hands-on Unix/Linux knowledge
9. Knowledge of AWS and PySpark will be an advantage.
