
Data Engineer (Python) at In All Media Inc

Key Facts

Full-time
Senior (5–10 years)
English

Other Skills

  • Communication
  • Time Management
  • Teamwork
  • Critical Thinking
  • Problem Solving

Requirements

  • Dual Language Expertise: 5–7 years of professional experience with Python (data engineering) and Java/Spring Boot (service layers).
  • Data Engineering Mastery: 5+ years building production-grade pipelines with ETL tools (dbt) and Airflow.
  • Big Data Stack: 3+ years of hands-on experience with Snowflake, Redshift, Spark, or Kafka.
  • Backend Fundamentals: Strong grasp of REST API design, Gradle, and microservices architecture in a distributed environment.

Roles & Responsibilities

  • Design and build scalable backend microservices using Java, Spring Boot, and Gradle.
  • Architect and maintain ETL/ELT pipelines with Python and dbt for reliable data flow.
  • Design RESTful APIs to connect frontend experiences with complex backend data systems.
  • Manage data storage across SQL and NoSQL environments, including cloud warehouses such as Snowflake and Redshift, and orchestrate data workflows with Airflow.

Job description

📌 Position: Senior Data Engineer (AI Ecosystem)

Location: Remote from LATAM

Contract Type: Full-time vendor (Contracted via Inallmedia.com)

Time Zone Alignment: Central Time (CT)

🧭 About Inallmedia.com

Inallmedia.com is a global technology and design firm focused on building impactful digital solutions through remote, distributed teams across LATAM. We partner with international clients across industries, providing long-term technical expertise, product innovation, and team augmentation.

🚀 Project Overview

You will join a high-impact engineering squad dedicated to evolving user-facing experiences through AI-driven features and intelligent workflows. As a Senior Data Engineer, your mission is to bridge the gap between raw data collection and application logic. You will be instrumental in building a cohesive, intelligent ecosystem that powers modern user interactions, working within a fast-paced Agile environment that prioritizes innovation and architectural excellence.

🔍 Key Responsibilities

  • Service Development: Design and build scalable backend microservices using Java, Spring Boot, and Gradle.
  • Data Pipeline Engineering: Architect and maintain robust ETL/ELT pipelines using Python and dbt to ensure seamless data flow across the ecosystem.
  • API Architecture: Design high-quality RESTful APIs that connect sophisticated frontend experiences to complex backend data systems.
  • Big Data Management: Optimize and manage data storage across SQL and NoSQL environments, leveraging technologies like Snowflake or Redshift.
  • Workflow Orchestration: Utilize Airflow to manage and schedule complex data workflows and dependencies.
  • AI Integration: Evolve backend services to support AI-powered features, ensuring infrastructure is prepared for LLM-driven and intelligent user experiences.
  • System Reliability: Troubleshoot distributed systems, lead code reviews, and participate in architectural brainstorming to ensure peak performance and reliability.

💡 Must-Have Skills

  • Dual Language Expertise: 5–7 years of professional experience working with Python (for data engineering) and Java/Spring Boot (for service layers).
  • Data Engineering Mastery: 5+ years building production-grade pipelines with ETL tools (specifically dbt) and Airflow.
  • Big Data Stack: 3+ years of hands-on experience with Snowflake, Redshift, Spark, or Kafka.
  • Database Proficiency: Extensive experience navigating and optimizing both SQL and NoSQL environments.
  • Backend Fundamentals: Strong grasp of REST API design, Gradle, and microservices architecture in a distributed environment.
  • Education: Bachelor's or Master's degree in a technical field (Computer Science, Math, Statistics, or equivalent).
  • Remote Fluency: Proven experience working in Agile teams within 100% remote environments.
  • Fluent English: Excellent verbal and written communication skills for daily technical collaboration.

🌟 Nice-to-Have Skills

  • AI/LLM Interest: Previous exposure to integrating machine learning models or intelligent workflows into backend services.
  • Cloud Ecosystems: Familiarity with AWS or GCP cloud infrastructure and deployment patterns.
  • Soft Skills: A self-starter mindset with the critical thinking skills necessary to manage competing priorities in a dynamic project.

🌐 Location & Time Zone

This position is 100% remote for candidates based in LATAM. To ensure effective collaboration with our North American partners, the role requires full alignment with Central Time (CT) business hours.

💬 Language

All interviews, technical documentation, and daily stand-ups will be conducted exclusively in English.
