Senior Data Engineer - Datalake (Remote)

Remote: Full Remote
Salary: $110K - $130K yearly
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • Bachelor's degree or equivalent experience
  • 5+ years in a Data Engineering or ETL role
  • Strong experience with PySpark and Python
  • Knowledge of Hadoop ecosystems, SQL, and Linux/Unix
  • Experience with AWS cloud technology preferred

Key responsibilities:

  • Create and support complex systems and processes
  • Develop applications, reports, and business rules
  • Estimate work efforts and respond to vendor RFPs
  • Fulfill end-user requests and provide guidance
  • Lead task delivery under deadlines in cross-functional teams
Vericast
5001 - 10000 Employees

Job description

Company Description

Vericast is the financial institution (FI) performance partner. We help banks and credit unions drive growth, improve efficiency, increase engagement and navigate change through the power of data, technology and people. Our advanced analytics, data-driven insights and integrated solution set enable better execution with agility, precision and scale. That’s why thousands of financial institutions look to Vericast and our 150 years of financial services expertise to help them achieve more. For more information, visit http://www.vericast.com or follow Vericast on LinkedIn.

Job Description

A Senior Data Engineer - Datalake is a problem solver who analyzes, models, designs, creates, modifies, and supports a set of complex systems, processes, or operations that enable optimal business capabilities.

KEY DUTIES/RESPONSIBILITIES

  • Creates application and system design documents and develops applications, reports, systems, and enterprise solutions
  • Estimates work efforts at the component, application/system, and enterprise-solution level
  • Creates RFI/RFP requests and responses for vendor product evaluations
  • Designs, develops, and implements complex business rules
  • Fulfills end-user requests and provides on-call support as required; periodically provides guidance to and assists in training less experienced colleagues; routinely operates under deadlines and may be subject to extreme workloads
  • Delivers personal tasks on time and leads the delivery of tasks for natural or cross-functional workgroups
  • Participates in initiatives with deliverables, meeting quality standards on time
  • Leads cross-functional initiatives with deliverables, meeting quality standards on time

Qualifications

EDUCATION

  • Bachelor’s degree or equivalent work experience

EXPERIENCE

  • 5+ years of experience in Data Engineering or ETL Development role
  • Strong experience with PySpark and Python for building solid data pipelines
  • Strong experience with Iceberg, Hive, S3, Trino
  • Hands-on experience with Hadoop ecosystems, relational databases and SQL queries
  • Hands-on experience with Apache Ranger, Rancher/Kubernetes preferred
  • Hands-on experience with Talend, RedPoint, or other ETL technologies is an advantage
  • Experience with Agile Software Development methodologies
  • Experience with GitLab, CI/CD processes, ServiceNow, etc.

KNOWLEDGE/SKILLS/ABILITIES

  • Solid programming skills in object-oriented/functional scripting languages such as Python and PySpark for building data pipelines, with experience in testing and logging to ensure code quality and data observability (Required)
  • Experience in distributed systems and parallel data processing using big data tools such as Spark, PySpark, Hadoop, Kafka, and Hive (Required)
  • Proficiency in querying relational databases (Required)
  • Strong knowledge of Linux/Unix-based computer systems (Required)
  • Strong experience with Iceberg, Hive, S3, Trino (Required)
  • Experience building data processing pipelines using ETL tools such as Talend, SSIS, etc. (Required)
  • Hands-on experience with Apache Ranger, Rancher/Kubernetes
  • Understanding of machine learning models and algorithms to interface with the Data Science team
  • Proficiency in data visualization tools such as Tableau and matplotlib to showcase insights
  • Nice to have: AWS cloud experience with Redshift, Lambda, SageMaker, Glue, etc.
  • Experience building REST APIs
  • Excellent data analytical, conceptual, and problem-solving skills
  • Excellent communication skills to promote cross-team collaboration

Additional Information

Base Salary: $110,000 - $130,000

The ultimate compensation offered for the position will depend upon several factors such as skill level, cost of living, experience, and responsibilities.

Vericast offers a generous total rewards benefits package that includes medical, dental and vision coverage, 401K with company match and generous PTO allowance. A wide variety of additional benefits like life insurance, employee assistance and pet insurance are also available, not to mention smart and friendly coworkers!

At Vericast, we don’t just accept differences - we celebrate them, we support them, and we thrive on them for the benefit of our employees, our clients, and our community. As an Equal Opportunity employer, Vericast considers applicants for all positions without regard to race, color, creed, religion, national origin or ancestry, sex, sexual orientation, gender identity, age, disability, genetic information, veteran status, or any other classifications protected by law. Applicants who have disabilities may request that accommodations be made in order to complete the selection process by contacting our Talent Acquisition team at [email protected]. EEO is the law. To review your rights under Equal Employment Opportunity please visit: www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf.

 #LI-KK1 #LI-REMOTE

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Other Skills

  • Collaboration
  • Verbal Communication Skills
  • Problem Solving
  • Analytical Skills
