Job Description
Company Overview
At Lean Tech, we are dedicated to building sophisticated and scalable digital solutions. As a forward-thinking organization, we are undertaking a significant strategic initiative to modernize our data infrastructure, migrating from a custom legacy stack to a state-of-the-art, cloud-based platform. Our culture is rooted in collaboration, continuous learning, and adaptability, fostering a dynamic and global team environment where engineers are empowered to solve complex challenges and uphold high engineering standards.
Position: Mid+ Level Data Analyst
Location: Remote - Colombia
Position Overview
We are seeking a Mid+ Data Analyst with a strong Data Engineering focus to bridge the gap between complex data infrastructure and actionable business intelligence. This role is designed for an analytical professional who is comfortable working end-to-end: from building and optimizing the ETL pipelines that feed our ecosystem to designing advanced, high-impact dashboards for business stakeholders. You will work closely with cross-functional teams to ensure that our data is not only robust and highly available but also translated into intuitive visualizations and models that drive strategic decision-making. Your responsibilities will include:
Advanced Visualization: Design and build complex Power BI dashboards and reports, applying data visualization best practices to ensure they are intuitive, performant, and business-focused.
Analytics Modeling: Design and optimize analytics-ready data models in Snowflake, using dbt to ensure data is structured correctly for high-level reporting and self-service BI.
Pipeline Ownership: Design and develop scalable data pipelines and workflows using Airflow to ensure the timely delivery of data for analysis.
Data Ingestion & Transformation: Maintain and optimize ETL processes using Python and SQL to extract data from multiple sources and load it into our analytics environment.
Insight Generation: Monitor and analyze data quality, consistency, and reliability across systems to ensure business stakeholders are making decisions based on accurate information.
Collaborative Optimization: Work with application and product teams to optimize normalized transactional data structures in PostgreSQL for better reporting performance.
Technical Troubleshooting: Proactively monitor and optimize analytics workloads and data pipelines to minimize downtime and improve dashboard latency.
Cross-functional Integration: Act as the technical link between Data Science, Product, and Business teams to ensure seamless data integration and alignment with business requirements.
Required Skills & Experience
Bachelor’s degree in Systems Engineering, Computer Science, or a related field with 4+ years of experience in Data/Analytics Engineering or Data Analysis.
Power BI (Advanced): Expert-level proficiency in DAX, data modeling, performance optimization, and designing business-ready dashboards.
Snowflake & dbt: Proven experience building analytics-ready models, including testing, documentation, and SQL query optimization.
SQL & Python: Strong programming skills for data manipulation, analysis, and pipeline scripting.
AWS: Hands-on experience with S3 and RDS (PostgreSQL) for data storage and management.
Orchestration: Experience using Airflow to automate and monitor the data workflows that power your reports.
Databases: Solid understanding of PostgreSQL, including transactional data modeling and optimization for analytics.
ETL/ELT: Practical experience designing and maintaining data pipelines for large, complex datasets.
Nice to Have Skills
Experience using JavaScript, HTML, and CSS for:
○ Dashboard customization
○ Embedded analytics
○ Supporting low-code / no-code solutions when required
Familiarity with low-code / no-code platforms (e.g., Lovable or similar).
Soft Skills
Experience working closely with business stakeholders and translating requirements into data solutions.
Strong analytical mindset and attention to detail.
Proactive, go-getter mentality.
Trustworthy and dependable.
Strong communication and interpersonal skills.
Ability to adapt and thrive in dynamic, fast-paced environments.
Comfortable collaborating across technical and non-technical teams.
Strong sense of accountability.
Why You Will Love Working with Us:
Join a powerful tech workforce and help us change the world through technology. We offer professional development opportunities with international customers, a collaborative work environment, a clear career path, and mentorship programs that will help you reach the next level.
Join Lean Tech and contribute to shaping the data landscape within a dynamic and growing organization. Your skills will be honed, and your contributions will play a vital role in our continued success. Lean Tech is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.