
Data Engineer (AWS, Snowflake, dbt)

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Bachelor’s Degree in Computer Science or a related field
  • 2-4 years of experience in data engineering
  • Proficiency in AWS and Snowflake
  • Experience with dbt for data transformation

Key responsibilities:

  • Design and deploy scalable data pipelines
  • Collaborate with cross-functional teams on data requirements
Blue Cross of Idaho (Insurance, 1001-5000 employees) http://www.bcidaho.com/

Job description

Our Data Engineers play a pivotal role in designing, developing, and maintaining robust data pipelines and architectures. Working closely with cross-functional teams, they ensure the scalability, reliability, and efficiency of our data infrastructure. The ideal candidate will have extensive experience in cloud technologies, particularly AWS, along with proficiency in Snowflake and dbt. Experience building event-driven applications is also essential for this role.

This position can be based in a hybrid arrangement (onsite at the Meridian, Idaho campus plus work-from-home) or fully remote within a mutually acceptable location. #LI-Remote #LI-Hybrid

Education must meet one of the following requirements:

  • Bachelor’s Degree in Computer Science, Electrical Engineering, Information Systems, or a closely related field of study, or equivalent work experience (two years of relevant work experience is equivalent to one year of college); Master’s degree preferred.

  • International Degree equivalency

  • Applicable certification(s) as defined by the leader, plus 2 years of additional experience

  • Associate’s Degree in Computer Science, Electrical Engineering, Information Systems, or a closely related field of study, plus 2 years of additional experience

Experience: 2-4+ years of experience in data engineering roles, with a focus on building scalable data pipelines and architectures. Experience ideally includes:

  • Proficiency in cloud technologies, particularly AWS (Amazon Web Services), including services such as S3, EC2, Lambda, Glue, Kinesis, and MSK/Kafka.

  • Expert-level experience with the Snowflake data warehouse platform, including data modeling, performance tuning, and administration.

  • Minimum 3+ years of hands-on experience with dbt (Data Build Tool) for data transformation and orchestration, implementing a large enterprise application with very large data volumes.

  • Solid understanding of event-driven architecture principles and experience building event-driven applications. Experience with AWS MSK/Kafka is highly desirable.

  • Proficiency in programming languages such as Python, Java, or Scala for data processing and scripting.

  • Experience with containerization technologies such as Docker, ECS, and Fargate.

  • Excellent problem-solving skills and ability to work effectively in a fast-paced, collaborative environment.

  • Strong communication skills with the ability to effectively communicate technical concepts to non-technical partners.

Key Responsibilities:

  • Design, develop, and deploy scalable data pipelines and architectures on AWS cloud infrastructure.

  • Implement and optimize data models using Snowflake and dbt for efficient data transformation and analysis.

  • Collaborate with data scientists, analysts, and software engineers to understand data requirements and ensure alignment with business objectives.

  • Build event-driven data processing systems to enable real-time data ingestion, processing, and analytics.

  • Implement ABC (Audit/Balance/Control), monitoring, alerting, and logging solutions to ensure the reliability and performance of data pipelines.

  • Evaluate and implement best practices for data governance, security, and compliance.

  • Mentor team members and provide technical guidance and support as needed.

Reasonable accommodations

To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed above are representative of the knowledge, skill and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

We are an Equal Opportunity Employer and do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, status as a veteran, and basis of disability or any other federal, state or local protected class.

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Insurance
Spoken language(s): English

Other Skills

  • Communication
  • Problem Solving
