Data Engineer

Work set-up: Full Remote

Offer summary

Qualifications:

  • Bachelor's degree in Computer Science, Data Science, or a related field.
  • Experience in data engineering, including building data pipelines and analytics.
  • Proficiency in SQL, Python, and data transformation tools.
  • Knowledge of AWS cloud services such as EC2, S3, Redshift, and Lambda.

Key responsibilities:

  • Develop and maintain data pipelines and APIs using various tools and AWS services.
  • Collaborate with team members to evaluate business needs and translate them into technical solutions.
  • Participate in project planning, defining milestones and resource requirements.
  • Design detailed specifications, user interfaces, and process flows for data systems.

Protagona
51 - 200 Employees

Job description

As a Data Engineer, you will be part of a talented team of engineers responsible for the deployment and configuration of cloud resources to meet individual client business needs in AWS. Client engagements cover a wide variety of business requirements and require our engineers to adapt quickly and stay on top of recent cloud technology trends. Candidates should be able to identify and remediate issues within cloud-based systems, based on their knowledge of industry standards and best practices.

Responsibilities

  • Work with the team to evaluate business needs and priorities, liaise with key business partners, and address team needs related to data systems and management
  • Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components, and enhancements
  • Participate in project planning; identify milestones, deliverables, and resource requirements; track activities and task execution
  • Generate design, development, and test plans, detailed functional specification documents, user interface designs, and process flow charts to guide programming execution
  • Develop data pipelines and APIs using a variety of tools, including but not limited to Python, SQL, Spark, and various AWS services (a minimal illustrative sketch follows this list)
  • Use an analytical, data-driven approach to build a deep understanding of a fast-changing business
  • Build large-scale batch and real-time data pipelines with data processing frameworks in AWS
  • Move data from on-premises systems to the cloud and perform cloud data conversions
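
For illustration, a minimal batch pipeline of the kind described above might look like the following Python sketch. The bucket names, object keys, and the "order_id" column are hypothetical placeholders, and the dedup/filter step stands in for whatever business rules a real engagement would require.

    import io

    import boto3
    import pandas as pd

    # Hypothetical source and destination locations.
    SOURCE_BUCKET = "example-raw-data"
    SOURCE_KEY = "orders/2024/orders.csv"
    DEST_BUCKET = "example-curated-data"
    DEST_KEY = "orders/2024/orders_clean.csv"

    def run_pipeline() -> None:
        s3 = boto3.client("s3")

        # Extract: read the raw CSV from S3 into a DataFrame.
        raw = s3.get_object(Bucket=SOURCE_BUCKET, Key=SOURCE_KEY)
        df = pd.read_csv(io.BytesIO(raw["Body"].read()))

        # Transform: drop duplicates and rows missing the (hypothetical)
        # order_id column, standing in for real business rules.
        df = df.drop_duplicates().dropna(subset=["order_id"])

        # Load: write the cleaned data back to S3.
        out = io.StringIO()
        df.to_csv(out, index=False)
        s3.put_object(Bucket=DEST_BUCKET, Key=DEST_KEY, Body=out.getvalue())

    if __name__ == "__main__":
        run_pipeline()

In practice, a script like this would typically be scheduled and scaled with services such as Glue, EMR, or Lambda rather than run by hand.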

Desired Skills & Experience

  • Experience in data engineering with an emphasis on data pipelines, analytics and reporting
  • Exposure to the AWS Cloud Platform
  • Experience in SQL, data transformations, and troubleshooting across at least one database platform (Redshift, Amazon RDS, Cassandra, Snowflake, PostgreSQL, Databricks, etc.)
  • Experience designing and building data extraction, transformation, and loading processes by writing custom data pipelines
  • Experience in a scripting language such as Python
  • Experience designing and building solutions using AWS services such as EC2, S3, EMR, Kinesis, RDS, Redshift/Spectrum, Lambda, Glue, Athena, API Gateway, etc.

Nice to Haves

  • AWS Certification - Solutions Architect Pro or Data Specialty
  • Machine Learning experience
  • Advanced GenAI experience

Required profile

Spoken language(s):
English

Other Skills

  • Troubleshooting
  • Analytical Thinking
  • Problem Solving
