
Staff Software Engineer

EXTRA HOLIDAYS - EXTRA PARENTAL LEAVE
Remote: Full Remote
Salary: $150K yearly
Experience: Senior (5-10 years)
Work from: Texas, United States

Offer summary

Qualifications:

  • Bachelor's degree in a relevant field
  • 7 years of Data Engineering experience
  • Proficiency in Big Data technologies
  • Hands-on ETL and cloud migration skills
  • Experience with BI reporting tools

Key responsibilities:

  • Design and develop software solutions
  • Enhance existing software capabilities
  • Develop automated shell scripts for data transfer
  • Collaborate on setting up software clusters
  • Create scalable algorithms using Spark
H-E-B | Retail (Super / Hypermarket) | 10,001+ employees
https://careers.heb.com/

Job description


Your missions

Responsibilities:

Company Name: H-E-B, LP
Job Location: 3890 W. Northwest Hwy., Suite 400, Dallas, TX 75220
Job title: Staff Software Engineer
Minimum Salary: $149,781
Education: Bachelor's degree in Electronics and Communication Engineering, Computer Science, or related.
SOC Code: 15-1252
SOC Occupation Title: Software Developers
Duration: Regular Hire
Work week: Full-time
Supervision Experience Required: No
Travel Required: No - Employer will allow remote/telecommuting throughout the US.

Experience: 7 years of experience with Data Engineering, or related. Requires the following skills: 7 years of experience developing Big Data pipelines using Spark, Scala, MapReduce, Pig, Sqoop, and Hive. Transferring and analyzing data to BigQuery from various sources. Working with the cloud technologies GCP Dataproc and Cloud Storage. Must have hands-on experience in code migration and data migration for extracting, transforming, and loading data using ETL tools (Syncsort, DMExpress) on UNIX and Windows. Creating BI reports for business users using Looker. Developing Teradata SQL scripts through various procedures, functions, and packages to implement the business logic. Scheduling ETL workflows using Oozie, Autosys, Automic, Crontab, and Apache Airflow. Data modeling and performance tuning using the versioning tools GitHub and TortoiseSVN.
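The workflow schedulers listed above (Oozie, Autosys, Crontab, Apache Airflow) all solve the same core problem: running ETL tasks in dependency order. Purely as an illustration (the task names below are invented, not taken from this posting), Python's standard-library `graphlib` can sketch how such an ordering is resolved:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL dependency graph: each task maps to the tasks it
# depends on. Schedulers like Oozie and Airflow resolve an execution
# order like this before dispatching any work.
etl_dag = {
    "extract_teradata": set(),
    "land_in_gcs": {"extract_teradata"},
    "spark_transform": {"land_in_gcs"},
    "load_bigquery": {"spark_transform"},
    "refresh_looker": {"load_bigquery"},
}

# static_order() yields tasks so every dependency runs before its dependents.
run_order = list(TopologicalSorter(etl_dag).static_order())
print(run_order)
```

In a real Airflow DAG the same dependencies would be declared between operator instances; the topological ordering shown here is what the scheduler derives from them.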

Job duties: Research, design, and develop computer and network software or specialized utility programs for a statewide supermarket chain. Analyze user needs and develop software solutions, applying principles and techniques of computer science, engineering, and mathematical analysis. Update software or enhance existing software capabilities. Design, develop, and modify software systems, using scientific analysis and mathematical models to predict and measure the outcomes and consequences of the design. Develop multiple automated shell scripts for data transfer from other sources to Hadoop. Develop Spark RDDs and DataFrames to exploit built-in in-memory processing and improve application performance. Work with the Hadoop admin to set up edge nodes and install the required software on the cluster. Design and implement a data pipeline enabling near-real-time systems, such as a microservices architecture, for the business decision-making system. Analyze data to design scalable algorithms using Spark. Develop generic applications, such as data pulls, and set up notebooks in the cluster.
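The Spark duties above center on the map/reduce-by-key pattern over distributed records. As a plain-Python illustration only (the store/sales data and names are invented, and this is not Spark code), the same per-key aggregation looks like:

```python
from itertools import groupby

# Invented (store, sales) records, standing in for a distributed RDD of
# key-value pairs.
records = [("dallas", 120.0), ("austin", 80.0), ("dallas", 45.5), ("austin", 19.5)]

# reduceByKey-style aggregation: group records by key, then sum the
# values within each group.
keyed = sorted(records, key=lambda kv: kv[0])
totals = {
    store: sum(v for _, v in group)
    for store, group in groupby(keyed, key=lambda kv: kv[0])
}
print(totals)  # {'austin': 99.5, 'dallas': 165.5}
```

In PySpark the equivalent would be `rdd.reduceByKey(operator.add)`, with the shuffle and in-memory partitioning handled by the cluster rather than a local sort.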

Required profile

Experience

Level of experience: Senior (5-10 years)
Industry:
Retail (Super / Hypermarket)
Spoken language(s):
Check out the description to know which languages are mandatory.

Soft Skills

  • Microsoft Windows
  • Analytical Thinking
  • Problem Solving
