Mid-Level Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Proficient in SQL and relational database management, with experience in PostgreSQL, Snowflake, and SQL Server.
  • Strong experience with ETL/ELT tools and processes, and expertise in Python for data manipulation.
  • Hands-on experience with workflow orchestration tools like Apache Airflow and cloud storage concepts such as AWS S3.
  • Business Intelligence skills with tools like Power BI for data visualization and reporting.

Key responsibilities:

  • Design, build, and maintain scalable data pipelines for structured and unstructured data.
  • Support data warehouse and data lake development by integrating diverse data sources.
  • Monitor and optimize ETL/ELT workflows, ensuring data integrity and performance.
  • Collaborate with cross-functional teams to gather requirements and deliver data solutions.

Lean Tech Information Technology & Services SME https://www.leangroup.com/
501 - 1000 Employees

Job description


Company Overview:

 

Lean Tech is a rapidly expanding organization situated in Medellín, Colombia. We pride ourselves on possessing one of the most influential networks within software development and IT services for the entertainment, financial, and logistics sectors. Our corporate projections offer a multitude of opportunities for professionals to elevate their careers and experience substantial growth. Joining our team means engaging with expansive engineering teams across Latin America and the United States, contributing to cutting-edge developments in multiple industries.

 

Currently, we are seeking a Mid-Level Data Engineer with a strong command of English to join our team. Here are the challenges that our next warrior will face and the requirements we look for:

 

Position Title: Mid-Level Data Engineer
Location: Colombia (Remote)

 

What you will be doing:
  • Design, build, and maintain scalable and robust data pipelines for ingesting, transforming, and delivering structured and unstructured data.
  • Support data warehouse and data lake development by integrating and transforming spreadsheets and diverse data sources.
  • Implement, monitor, and optimize ETL/ELT workflows ensuring data integrity, quality, and performance.
  • Conduct thorough data validation, testing, and quality assurance on migrated and transformed data.
  • Collaborate with cross-functional teams including senior engineers, architects, and business intelligence stakeholders to gather requirements and deliver solutions.
  • Write, optimize, and maintain complex SQL queries, stored procedures, and database objects.
  • Monitor system and pipeline performance; troubleshoot and resolve data-related issues to minimize downtime.
  • Manage data ingestion processes, including API-based data extraction and loading.
  • Participate in code reviews, version control management, and documentation of data engineering processes.
  • Support automation of workflows using Airflow and similar orchestration tools.
  • Assist in the maintenance and evolution of data infrastructure and metadata governance.
  • Create data visualizations and reports to support analysis and decision-making using Power BI or similar tools.
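To give candidates a feel for the day-to-day work above, the extract-transform-validate pattern behind these responsibilities can be sketched in plain Python. This is an illustration only: the field names, validation rules, and sample data are hypothetical, not taken from any Lean Tech system.

```python
import csv
import io

# Hypothetical mini-pipeline: extract rows from a CSV source, transform
# them, and validate data quality before loading downstream.
RAW_CSV = """order_id,amount,currency
1001,25.50,USD
1002,,USD
1003,99.00,EUR
"""

def extract(text):
    """Parse CSV text into a list of dicts (the ingestion step)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing amounts and normalize amounts to floats."""
    out = []
    for row in rows:
        if not row["amount"]:
            continue  # data-quality rule: amount is required
        out.append({**row, "amount": float(row["amount"])})
    return out

def validate(rows):
    """Basic data-integrity checks before the load step."""
    assert all(r["amount"] > 0 for r in rows), "amounts must be positive"
    assert len({r["order_id"] for r in rows}) == len(rows), "duplicate ids"
    return rows

loaded = validate(transform(extract(RAW_CSV)))
print(len(loaded))  # row 1002 is dropped for a missing amount, leaving 2
```

In production this same shape is typically expressed as Airflow tasks operating on data in S3 or a warehouse table rather than an in-memory list.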

 

Requirements & Qualifications
To excel in this role, you should possess:
  • Main Skills (essential for this position):
  • Proficient in SQL and relational database management, including writing and optimizing complex queries.
  • Experience with PostgreSQL, Snowflake, SQL Server, and Fabric.
  • Strong experience with ETL/ELT tools and processes; ability to design and manage data pipelines.
  • Expertise in Python for data manipulation, particularly using Pandas; knowledge of PySpark is a plus.
  • Hands-on experience with workflow orchestration tools, especially Apache Airflow.
  • Experience with cloud storage and data lake concepts, particularly AWS S3.
  • Business Intelligence skills with tools such as Power BI for data visualization and reporting.
  • Familiarity with data warehouse modeling and transformation frameworks such as dbt (nice to have).
  • Knowledge of data migration, validation, and performance tuning techniques.
  • Secondary Skills (additional skills that will help you succeed in this role):
  • Familiarity with Alembic or other database schema migration tools.
  • Experience with data integration from APIs, using tools like Postman for API testing.
  • Knowledge of software development best practices including version control (Git), CI/CD pipelines, and code reviews.
  • Understanding of metadata management and data governance principles.
  • Exposure to Agile methodologies and collaborative team environments.
  • Basic knowledge of backend technologies or interest in expanding to full-stack data engineering skills.
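As a rough illustration of the SQL proficiency listed above (writing, optimizing, and indexing queries), here is a runnable sketch. The posting asks for PostgreSQL, Snowflake, and SQL Server experience; Python's stdlib sqlite3 stands in here purely so the example is self-contained, and the table and data are invented.

```python
import sqlite3

# In-memory database as a stand-in for a real warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                         customer TEXT NOT NULL,
                         amount   REAL NOT NULL);
    -- An index on the grouping column supports the kind of query
    -- optimization work the role describes.
    CREATE INDEX idx_orders_customer ON orders(customer);
""")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "acme", 25.5), (2, "acme", 99.0), (3, "globex", 10.0)],
)

# A typical aggregation: total spend per customer, largest first.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('acme', 124.5), ('globex', 10.0)]
```

The same query pattern (aggregate, group, order) carries over directly to PostgreSQL or Snowflake; only the connection layer and dialect details change.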

 

Why you will love Lean Tech:

 

  • Join a powerful tech workforce and help us change the world through technology
  • Professional development opportunities with international customers
  • Collaborative work environment
  • Career path and mentorship programs that will take you to new levels



Join Lean Tech and contribute to shaping the data landscape within a dynamic and growing organization. Your skills will be honed, and your contributions will play a vital role in our continued success. Lean Tech is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Required profile

Experience

Industry: Information Technology & Services
Spoken language(s): English
