
Consulting Data Engineer at DyFlex Solutions


Job description

Consulting Data Engineer

DyFlex is an SAP Platinum Partner delivering high-quality SAP solutions across Australia. We’re now expanding our Data and AI practice to match the strength and reputation of our established SAP capability.


As a Consulting Data Engineer, you’ll design, build, and deploy scalable data pipelines and machine learning solutions that deliver real business value. You’ll use strong SQL and modern data stacks to create reliable, cost-effective pipelines and support ML workloads. Experience with at least one of Databricks, Snowflake, BigQuery, or MS Fabric is required; more than one is a bonus. Experience with tools like Spark, dbt, or Dataform is a plus but not required. You will work on meaningful technical challenges with autonomy and communicate technical outcomes clearly to both technical and non-technical stakeholders.

We value engineers who think creatively, communicate effectively, and engage confidently with stakeholders. We’re looking for engineers who do more than write code. You’ll listen to client challenges, dig into the core problem, help shape solutions, and explain them clearly. If you want to build something from the ground up with a team that’s already proven it can deliver meaningful outcomes, we’d like to hear from you.

Your tasks and responsibilities:

  • Build and maintain scalable data pipelines for ingesting, transforming, and delivering data
  • Manage and optimise databases, warehouses, and cloud storage solutions
  • Implement data quality frameworks and testing processes to ensure reliable systems
  • Design and deliver cloud-based solutions (AWS, Azure, or GCP)
  • Take technical ownership of project components and lead small development teams
  • Engage directly with clients, translating business requirements into technical solutions
  • Champion best practices including version control, CI/CD, and infrastructure as code
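As a purely illustrative sketch of the data-quality testing work described above (the column names, data, and checks are hypothetical examples, not taken from this posting), such checks often boil down to simple, automatable assertions over pipeline output:

```python
# Illustrative only: minimal data-quality checks of the kind a pipeline
# test suite might run. All names and sample data are hypothetical.

def check_not_null(rows, column):
    """Return True if no row has a missing value in `column`."""
    return all(row.get(column) is not None for row in rows)

def check_unique(rows, column):
    """Return True if every value in `column` is distinct."""
    values = [row[column] for row in rows]
    return len(values) == len(set(values))

# Sample output of a hypothetical ingestion step.
rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": "b@example.com"},
]

assert check_not_null(rows, "email")
assert check_unique(rows, "id")
```

Frameworks such as dbt generalise exactly these kinds of `not_null` and `unique` assertions into declarative tests that run on every pipeline execution.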

Your qualifications and experience:

  • Hands-on data engineering experience in production environments
  • Strong proficiency in Python and SQL; experience with at least one additional language (e.g., Java or TypeScript/JavaScript)
  • Experience with modern frameworks such as Apache Spark, Airflow, dbt, Kafka, or Flink
  • Background in building ML pipelines, MLOps practices, or feature stores is highly valued
  • Proven expertise in relational databases, data modelling, and query optimisation
  • Demonstrated ability to solve complex technical problems independently
  • Excellent communication skills with ability to engage clients and stakeholders
  • Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field

What we offer:

  • Work with SAP’s latest cloud technologies such as S/4HANA, BTP, and Joule, plus Databricks, ML/AI tools, and cloud platforms
  • A flexible and supportive work environment including work from home
  • Competitive remuneration and benefits including novated lease, birthday leave, salary packaging, wellbeing programme, additional purchased leave, and company-provided laptop
  • Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms, Snowflake, BigQuery)
  • Structured career advancement pathways with mentoring from senior engineers
  • Exposure to diverse industries and client environments

Join a renowned organisation delivering projects to some of Australia’s leading enterprises

DyFlex is committed to providing a safe, flexible and respectful environment for staff, free from all forms of discrimination, bullying and harassment. We are proud of our diverse and inclusive team; only together can we continually improve and achieve the best outcomes for our customers. We are the region's leading SAP Platinum Partner!

Please note that we can’t offer sponsorship for this role, and we expect a clear statement of your legal right to work in Australia, especially if you are applying from overseas.
