Data Warehouse Engineer (AWS Analytics)

Remote: Full Remote
Contract: Full Time Contractor
Work from: Latin America

Offer summary

Qualifications:

  • 4+ years of experience in cloud-based data warehousing, preferably with AWS.
  • Expert-level proficiency in SQL and Python.
  • Hands-on experience with Amazon Redshift, Apache Airflow, and Amazon QuickSight.
  • Strong knowledge of data modeling, warehousing principles, and performance tuning.

Key responsibilities:

  • Architect and maintain a cloud-native data warehouse infrastructure on AWS.
  • Design and implement efficient, scalable data models using Amazon Redshift.
  • Build and monitor Apache Airflow DAGs for automated ETL/ELT workflows.
  • Collaborate with stakeholders to define data requirements and ensure availability and quality.

AgilityFeat (http://AgilityFeat.com)
11 - 50 Employees

Job description

Location: Remote in Latam
Job Type: Full Time Contractor

Client location: Serbia


For companies around the world, attracting and retaining skilled data professionals remains a major challenge. Meanwhile, Latin America is home to a deep pool of talented engineers eager to work on meaningful, innovative projects.


At AgilityFeat, we connect top data and engineering talent across Latin America with remote teams at leading companies worldwide. We’ve helped countless organizations scale their data infrastructure while opening doors for Latin American professionals to take on impactful, career-growing opportunities.


If you’re a Latin American data expert with fluent English and the skill set below, we’ve got a great opportunity with a global company for you!


Please note: there is NO FLEXIBILITY in the fluency requirement. English at B2 or above is required; do not apply if you have only intermediate English.


Data Warehouse Engineer (AWS Analytics)


About You

  • You are an experienced Data Warehouse Engineer who thrives in AWS-based environments.
  • You are passionate about data architecture, performance optimization, and automation.
  • You collaborate effectively with cross-functional teams to enable high-impact business decisions.
  • You are fluent in English and have strong communication and problem-solving skills.


Key Responsibilities


  • Architect and maintain a cloud-native data warehouse infrastructure on AWS.
  • Design and implement efficient, scalable data models using Amazon Redshift.
  • Build and monitor Apache Airflow DAGs for automated ETL/ELT workflows (a brief sketch follows this list).
  • Develop dashboards and visualizations in Amazon QuickSight for key business metrics.
  • Integrate AWS services like DynamoDB, S3, and EventBridge for real-time and batch data processing.
  • Collaborate with stakeholders to define data requirements and ensure availability and quality.
  • Monitor system performance and optimize for cost and efficiency.
  • Ensure data security, governance, and operational reliability across all solutions.
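For context on the Airflow and Redshift items above, here is a minimal sketch of the kind of DAG this role involves: a daily ELT pipeline that stages data in S3, copies it into Redshift, and then runs in-warehouse transformations. It assumes Airflow 2.4+, and every bucket, schema, and table name (and the task bodies themselves) is an illustrative placeholder, not the client's actual pipeline.

  # Minimal ELT sketch; names and task bodies are placeholders.
  from datetime import datetime

  from airflow.decorators import dag, task


  @dag(
      schedule="@daily",               # one ELT run per day
      start_date=datetime(2024, 1, 1),
      catchup=False,
      tags=["elt", "redshift"],
  )
  def daily_sales_elt():
      @task
      def extract_to_s3() -> str:
          # Placeholder: export the day's source data and return its S3 key.
          return "s3://example-bucket/sales/latest.parquet"

      @task
      def copy_into_redshift(s3_key: str) -> None:
          # Placeholder: a real DAG would issue a Redshift COPY here,
          # e.g. via the Amazon provider's S3ToRedshiftOperator.
          print(f"COPY staging.sales FROM '{s3_key}' ...")

      @task
      def transform_in_warehouse() -> None:
          # Placeholder: run SQL transformations into analytics tables.
          print("INSERT INTO analytics.daily_sales SELECT ... FROM staging.sales")

      copy_into_redshift(extract_to_s3()) >> transform_in_warehouse()


  daily_sales_elt()

Monitoring such DAGs typically means watching the Airflow UI, alerting on task failures and SLA misses, and keeping an eye on Redshift load and query performance.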


Technical Requirements


  • 4+ years of experience in cloud-based data warehousing, preferably with AWS.
  • Expert-level proficiency in SQL and Python.
  • Hands-on experience with:
    • Amazon Redshift
    • Apache Airflow
    • Amazon QuickSight
    • S3, DynamoDB, EventBridge
  • Strong knowledge of data modeling, warehousing principles, and performance tuning.
  • Familiarity with CI/CD and Infrastructure as Code (IaC) best practices.
  • Proactive mindset focused on data integrity, cost control, and resilient design.


Soft Skills


  • Strong communication and collaboration skills.
  • Ability to work independently and as part of distributed teams.
  • Ownership-driven and detail-oriented approach.


Fluent English is mandatory.

All information must be submitted in English.

Required profile


Spoken language(s): English

Other Skills

  • Detail Oriented
  • Collaboration
  • Communication
  • Problem Solving
