Sr Developer Big Data

Work set-up: Hybrid
Work from: Chicago (US)

KYYBA Inc — Human Resources, Staffing & Recruiting SME — https://www.kyyba.com/
501–1,000 employees

Job description


Sr Developer - Big Data (NS-1026495)
Grade: 37
Open roles: 5
Location: Chicago, IL / Richardson, TX

Salary range: $115k-$120k + annual bonus (13.3%-20%) + 401(k) match (3.5%) + employer-contributed pension plan + medical plan
Candidates must be US citizens or green card holders (sponsorship is not available).
As the health care industry continues to rapidly transform, our IT team conceives, develops and delivers impactful technology solutions to support access to quality, affordable health care for our members. We are driven by our collective company purpose: To do everything in our power to stand with our members in sickness and in health®. Our IT team unleashes the power of this purpose through technology. We come to work every day to make a difference, and we deliver the highest quality and best solutions to our members.
Job Purpose: This position is responsible for developing, integrating, testing, and maintaining existing and new applications, and requires proficiency in one or more programming languages and one or more development methodologies/delivery models.
Required Job Qualifications:
·     Bachelor's degree and 4 years of Information Technology experience, OR technical certification and/or college courses and 6 years of Information Technology experience, OR 8 years of Information Technology experience.
·     Possess ability to sit and perform computer entry for entire work shift, as required.
·     Possess ability to manage workload, manage multiple priorities, and manage conflicts with customers/employees/managers, as applicable.
·     Must have extensive hands-on experience designing, developing, and maintaining software solutions on a Hadoop cluster.
·     Must have strong experience with UNIX shell scripting, Sqoop, Eclipse, and HCatalog.
·     Must have experience with NoSQL databases such as HBase, MongoDB, or Cassandra.
·     Must have experience developing Pig scripts, HiveQL, and UDFs for analyzing structured, semi-structured, and unstructured data flows.
·     Must have working experience developing MapReduce programs that run on a Hadoop cluster, using Java or Python.
·     Must have working experience with Spark and Scala.
·     Must have hands-on experience using Talend with Hadoop technologies.
·     Must have knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2) and considerations for scalable, distributed systems.
·     Must demonstrate Hadoop best practices.
·     Must have working experience with data warehousing and business intelligence systems.
·     SDLC methodology (Agile / Scrum / iterative development).
·     Systems change / configuration management.
·     Business requirements management.
·     Problem solving / analytical thinking.
Preferred Job Qualifications:
·     Bachelor's degree in Computer Science or Information Technology.
*CA
