Databricks Engineer

Work set-up: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • Proven experience with Databricks and Apache Spark.
  • Strong understanding of data architecture and ETL/ELT processes.
  • Knowledge of Delta Lake features and data governance tools.
  • Bachelor's degree in Computer Science, Data Engineering, or a related field.

Key responsibilities:

  • Design and build scalable data solutions using Databricks.
  • Develop and maintain complex data pipelines for large datasets.
  • Implement data quality frameworks with automated testing and monitoring.
  • Optimize data processing jobs for performance and cost efficiency.

The Codest · Information Technology & Services · SME · https://thecodest.co/
51 - 200 Employees

Job description

🌍 Hello World!

We are The Codest, an international tech software company with tech hubs in Poland, delivering global IT solutions and projects. Our core values lie in a “Customers and People First” approach that prioritises the needs of our customers and a collaborative environment for our employees, enabling us to deliver exceptional products and services.

Our expertise centers on web development, cloud engineering, DevOps, and quality assurance. After many years of developing our own product, Yieldbird, which was honored as a laureate of the prestigious Top25 Deloitte awards, we arrived at our mission: to help tech companies build impactful products and scale their IT teams by boosting IT delivery performance. Through our extensive experience with product development challenges, we have become experts in building digital products and scaling IT teams.

But our journey does not end here: we want to continue our growth. If you’re goal-driven and looking for new opportunities, join our team! What awaits you is an enriching and collaborative environment that fosters your growth at every step.

We are currently looking for:

DATABRICKS ENGINEER

Here, you will have an opportunity to contribute to a banking app for one of the leading financial groups in Japan. The platform is equipped with banking modules and data management features, and it is customer-facing as well. We are seeking an experienced Databricks Engineer to design, build, and manage scalable data solutions and pipelines using Databricks. You’ll work closely with cross-functional teams to ensure data is reliable, accessible, and efficient, powering analytics and business intelligence initiatives.


📈 Your Responsibilities:

  • Design medallion-architecture (Bronze, Silver, Gold) lakehouses with optimized performance patterns

  • Build strong data quality frameworks with automated testing and monitoring

  • Implement advanced Delta Lake features such as time travel, vacuum operations, and Z-ordering

  • Develop and maintain complex ETL/ELT pipelines processing large-scale datasets daily

  • Design and implement CI/CD workflows for data pipelines using Databricks Asset Bundles or equivalent tools

  • Create real-time and batch data processing solutions with Structured Streaming and Delta Live Tables

  • Optimize Spark jobs for cost efficiency and performance, leveraging cluster autoscaling and resource management

  • Develop custom integrations with Databricks APIs and external systems

  • Design scalable data architectures using Unity Catalog, Delta Lake, and Apache Spark

  • Establish data mesh architectures with governance and lineage tracking
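To give a flavor of the "data quality frameworks with automated testing" responsibility above, here is a minimal plain-Python sketch of rule-based batch validation. In a real Databricks pipeline these checks would run as Delta Live Tables expectations or against Spark DataFrames; the rule names, thresholds, and sample records below are illustrative assumptions, not part of the actual project.

```python
# Minimal data-quality check sketch: each rule is a named predicate,
# and the report counts how many records in a batch fail each rule.
# Rule names and sample records are hypothetical.

def run_checks(records, rules):
    """Return {rule_name: failure_count} for a batch of records."""
    report = {name: 0 for name in rules}
    for row in records:
        for name, predicate in rules.items():
            if not predicate(row):
                report[name] += 1
    return report

rules = {
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
    "account_id_present": lambda r: bool(r.get("account_id")),
}

batch = [
    {"account_id": "A1", "amount": 120.0},
    {"account_id": "", "amount": -5.0},  # fails both rules
]

report = run_checks(batch, rules)
print(report)  # {'amount_non_negative': 1, 'account_id_present': 1}
```

A production version would typically quarantine failing rows into a separate table and emit metrics to a monitoring system rather than just counting failures.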

Required profile

Experience

Level of experience: Senior (5-10 years)
Industry: Information Technology & Services
Spoken language(s): English

Other Skills

• Collaboration
• Problem Solving
