Intermediate Data Engineer OP01793

Work set-up: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • At least 3 years of experience in data engineering and data warehouse modeling
  • Proficiency in designing and building ETL processes for large data volumes and streaming solutions
  • Expert-level SQL skills with experience in Snowflake and Apache Iceberg tables
  • Degree in Computer Science, Data Engineering, or related fields

Key responsibilities:

  • Review and analyze existing ETL solutions for migration.
  • Design, optimize, and migrate batch and streaming data pipelines to GCP Landing Zone.
  • Build and manage data transformations with dbt for ELT pipelines in Snowflake.
  • Collaborate with team members and stakeholders to ensure data solutions meet business needs.

Dev.Pro SME https://dev.pro/
501 - 1000 Employees

Job description

🟢 Are you in Brazil, Argentina, or Colombia? Join us as we actively recruit in these locations, offering a comfortable remote environment. Submit your CV in English, and we'll get back to you!

We invite an Intermediate Data Engineer to contribute to a large-scale data modernization effort for a major enterprise client. You'll help migrate and transform complex legacy data pipelines to a modern, custom-built cloud environment for improved scalability, maintainability, and compliance. You'll work closely with architects, DevOps, QA, and product stakeholders to deliver scalable, reliable data solutions that meet unique business needs.

🟩 What's in it for you:

  • Join a fully integrated delivery team built on collaboration, transparency, and mutual respect
  • Contribute to a high-impact data platform transformation and gain experience with Google Landing Zones
  • Work hands-on with modern, in-demand technologies like GCP, Snowflake, Apache Iceberg, dbt, Airflow, Dataflow, and BigQuery
✅ Is that you?

  • 3+ years in data engineering and data warehouse modeling
  • Strong proficiency in designing and building ETL for large data volumes and streaming solutions
  • Expert-level SQL skills and experience with Snowflake and Apache Iceberg tables
  • Hands-on experience with GCP services (BigQuery, GCS, Airflow, Dataflow, Dataproc, Pub/Sub)
  • Proficiency in Python for ETL scripting and DAG development
  • Experience using dbt for data transformation and orchestration
  • Familiarity with CI/CD processes and tools (Git, Terraform, Serverless)
  • Degree in Computer Science, Data Engineering, Information Systems, or related fields
  • Strong communication and collaboration abilities
  • Upper-Intermediate+ English level
Desirable:

  • Experience building and managing streaming data pipelines and event-driven architectures
  • Experience writing Bash scripts
  • Experience with Java for Dataflow jobs
  • Familiarity with data lakehouse architectures using Iceberg tables
  • Proficiency with Docker for containerizing data pipelines and supporting orchestration
  • Familiarity with AI-assisted tools like GitHub Copilot
🧩 Key responsibilities and your contribution

In this role, you'll be actively involved in key data engineering activities, helping ensure the project's success and timely delivery.

  • Review and analyze existing ETL solutions for migration to the new architecture
  • Design, optimize, and migrate batch and streaming data pipelines to the GCP Landing Zone
  • Build and manage data transformations with dbt, supporting ELT pipelines in Snowflake
  • Ensure the new data infrastructure meets performance and quality SLAs/SLOs
  • Implement monitoring and alerting for pipelines to ensure system fault tolerance
  • Develop migration scripts to transfer historical data to Iceberg tables
  • Collaborate closely with the team and other stakeholders to align on data requirements and solutions
  • Participate in code reviews, design discussions, and technical planning
🎾 What's working at Dev.Pro like?

Dev.Pro is a global company that's been building great software since 2011. Our team values fairness, high standards, openness, and inclusivity for everyone, no matter your background.

🌐 We are 99.9% remote: you can work from anywhere in the world
🌴 Get 30 paid days off per year to use however you like: vacations, holidays, or personal time
✔️ 5 paid sick days, up to 60 days of medical leave, and up to 6 paid days off per year for major family events like weddings, funerals, or the birth of a child
⚡️ Partially covered health insurance after probation, plus a wellness bonus for gym memberships, sports nutrition, and similar needs after 6 months
💵 We pay in U.S. dollars and cover all approved overtime
📓 Join English lessons and Dev.Pro University programs, and take part in fun online activities and team-building events

Our next steps:

✅ Submit a CV in English → ✅ Intro call with a Recruiter → ✅ Internal interview → ✅ Client interview → ✅ Offer

Interested? Find out more:

📋 How we work

💻 LinkedIn Page

📈 Our website

💻 IG Page
