
AI Engineer + Data Engineer

Requirements

  • Experience with Databricks Lakehouse architecture and data engineering
  • Advanced SQL development and analytics-focused data modeling
  • Experience designing and maintaining scalable data pipelines with modular transformations (dbt/Jinja SQL)
  • Experience integrating LLM APIs (OpenAI, Anthropic) into data workflows and applying AI for data enrichment, anomaly detection, and automated classification

Roles & Responsibilities

  • Design and build AI-powered data pipelines within Databricks
  • Integrate LLMs into data workflows for automation and intelligence
  • Develop scalable data models to support analytics and AI use cases
  • Collaborate with data, analytics, and engineering teams to improve data reliability and scalability

Job description

We love technology, and we enjoy what we do. We are constantly driven by curiosity, innovation, and the desire to improve every day. We take ownership of our work, value collaboration, and foster a culture of trust and accountability. Our Enrouters embrace challenges, ask questions, learn quickly, and grow together.

We pride ourselves on offering competitive compensation, excellent benefits, a great work environment, flexible schedules, and policies that promote a healthy work-life balance. We care about who you are both inside and outside the workplace, and we are committed to building a strong community of driven, responsible, respectful—and above all—happy individuals. We want you to genuinely enjoy working with us.

We are seeking a data-driven AI Engineer to join our team at a high-growth advertising technology company. This role focuses on scaling our reporting infrastructure for advertising performance and billing reconciliation, ensuring that financial and operational data is accurate, automated, and actionable.

In this role, you will be responsible for developing robust data pipelines, ensuring data quality and reliability, and enabling efficient data consumption across the organization. You will collaborate closely with cross-functional teams including Product, Engineering, Analytics, and Business stakeholders to deliver high-impact data platforms.

The ideal candidate is a proactive problem-solver with strong technical expertise, capable of working with large datasets, modern data architectures, and cloud-based environments. You thrive in fast-paced settings, navigate ambiguity with confidence, and are passionate about turning data into actionable value.

Requirements

Databricks & Data Engineering (Must-Have)

  • Strong experience working with Databricks Lakehouse architecture
  • Experience designing and maintaining scalable data pipelines
  • Ability to process and model large-scale analytical datasets

SQL & Data Modeling (Must-Have)

  • Advanced SQL development
  • Experience building modular and reusable transformations
  • Experience with Jinja SQL (dbt or similar frameworks)
  • Strong data modeling skills for analytics and reporting

AI Engineering & Data Workflows (Must-Have)

  • Experience integrating LLM APIs (OpenAI, Anthropic, etc.) into data workflows
  • Hands-on experience using AI for:
    • Data enrichment
    • Anomaly detection
    • Automated classification
  • Experience with LangChain, LlamaIndex, or similar frameworks
  • Exposure to Model Context Protocol (MCP) or similar approaches to connect AI models with external tools and data sources
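To give a sense of the kind of work these bullets describe, here is a minimal, hedged sketch of LLM-driven automated classification inside a data workflow. The category names and prompt wording are illustrative assumptions, and the model call is injected as a callable so the pipeline shape is visible without depending on a specific vendor SDK or API key; in practice the callable would wrap an OpenAI or Anthropic client.

```python
# Illustrative sketch only: LLM-based record classification as a
# pipeline step. Categories, prompt text, and record shape are
# assumptions, not part of the job posting.

from typing import Callable

CATEGORIES = ["billing", "performance", "other"]  # assumed labels


def build_prompt(record: dict) -> str:
    """Render one record into a constrained classification prompt."""
    return (
        "Classify the advertising record below into exactly one of "
        f"{CATEGORIES}. Reply with the label only.\n\n{record}"
    )


def classify_record(record: dict, call_llm: Callable[[str], str]) -> str:
    """Classify a record; fall back to 'other' on unexpected output."""
    raw = call_llm(build_prompt(record)).strip().lower()
    return raw if raw in CATEGORIES else "other"
```

Constraining the model to a fixed label set and validating its reply keeps downstream tables clean even when the model responds unexpectedly.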

Platform & Engineering Practices (Nice-to-Have)

  • Experience with GitHub workflows
  • Familiarity with CI/CD pipelines (Jenkins or similar)
  • Experience working with YAML/YML configuration files

Key Responsibilities

  • Design and build AI-powered data pipelines within Databricks
  • Integrate LLMs into data workflows for automation and intelligence
  • Develop scalable data models to support analytics and AI use cases
  • Implement AI-driven enhancements such as anomaly detection and data enrichment
  • Collaborate with data, analytics, and engineering teams to improve data reliability
  • Optimize performance and scalability of data and AI workflows
  • Support automation through CI/CD practices
  • Ensure data quality, traceability, and maintainability across pipelines
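As one concrete example of the "AI-driven enhancements such as anomaly detection" responsibility above, a pipeline step might apply a simple statistical screen to a metric before escalating outliers for LLM-based review. This is a hedged sketch; the z-score approach, threshold, and metric are illustrative assumptions, not a prescribed method.

```python
# Illustrative sketch only: flag outliers in a numeric series
# (e.g. daily spend or impressions) by z-score. Threshold is an
# assumption; real pipelines may use model-based detectors.

from statistics import mean, stdev


def flag_anomalies(values: list[float], z_threshold: float = 3.0) -> list[int]:
    """Return indices of values whose z-score exceeds the threshold."""
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # constant series has no outliers
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > z_threshold]
```

A cheap screen like this keeps LLM calls (and their cost) reserved for the small fraction of records that actually look unusual.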

Benefits

Monetary compensation

Year-end Bonus

IMSS, AFORE, INFONAVIT

Major Medical Expenses Insurance

Minor Medical Expenses Insurance

Life Insurance

Funeral Expenses Insurance

Preferential rates for car insurance

TDU Membership

Holidays and Vacations

Life happens days

Bereavement days

Civil Marriage days

Maternity & Paternity leave

English and Spanish classes

Performance Management Framework

Certifications

TALISIS Agreement: Discounts at ADVENIO, Harmon Hall, U-ERRE, UNID

Taquitos Rewards

Amazon Gift Card on your Birthday

Work-from-home Bonus

Laptop Policy

Equal employment:

Enroute is committed to providing equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
