We are The Codest, an international tech software company with tech hubs in Poland, delivering global IT solutions and projects. Our core values lie in a “Customers and People First” approach that prioritises the needs of our customers and fosters a collaborative environment for our employees, enabling us to deliver exceptional products and services.
Our expertise centers on web development, cloud engineering, DevOps, and quality assurance. After many years of developing our own product, Yieldbird, which was honored as a laureate of the prestigious Top25 Deloitte awards, we arrived at our mission: to help tech companies build impactful products and scale their IT teams by boosting IT delivery performance. Our extensive experience with product development challenges has made us experts in building digital products and scaling IT teams.
But our journey does not end here: we want to keep growing. If you’re goal-driven and looking for new opportunities, join our team! What awaits you is an enriching, collaborative environment that fosters your growth at every step.
We are currently looking for:
DATABRICKS ENGINEER
Here, you will have an opportunity to contribute to a banking app for one of the leading financial groups in Japan. The customer-facing platform includes banking modules and data-management features. We are seeking an experienced Databricks Engineer to design, build, and manage scalable data solutions and pipelines using Databricks. You’ll work closely with cross-functional teams to ensure data is reliable, accessible, and efficient, powering analytics and business intelligence initiatives.
📈 Your Responsibilities:
Architect medallion (Bronze, Silver, Gold) lakehouses with optimized performance patterns (see the first sketch after this list)
Build strong data quality frameworks with automated testing and monitoring
Implement advanced Delta Lake features such as time travel, vacuum operations, and Z-ordering (illustrated in the second sketch below)
Develop and maintain complex ETL/ELT pipelines processing large-scale datasets daily
Design and implement CI/CD workflows for data pipelines using Databricks Asset Bundles or equivalent tools
Create real-time and batch data processing solutions with Structured Streaming and Delta Live Tables (see the final sketch below)
Optimize Spark jobs for cost efficiency and performance, leveraging cluster autoscaling and resource management
Develop custom integrations with Databricks APIs and external systems
Design scalable data architectures using Unity Catalog, Delta Lake, and Apache Spark
Establish data mesh architectures with governance and lineage tracking
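To give a flavour of the day-to-day work, here is a minimal PySpark sketch of a medallion (Bronze, Silver, Gold) flow. It is illustrative only: the table names, paths, and columns are hypothetical placeholders, not the client’s actual schema.

```python
# Minimal medallion-architecture sketch; all names and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw JSON as-is, stamping each record for lineage.
bronze = (spark.read.json("/mnt/raw/transactions")
          .withColumn("_ingested_at", F.current_timestamp()))
bronze.write.format("delta").mode("append").saveAsTable("bronze.transactions")

# Silver: cleanse and deduplicate the raw records.
silver = (spark.table("bronze.transactions")
          .dropDuplicates(["transaction_id"])
          .filter(F.col("amount").isNotNull()))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.transactions")

# Gold: aggregate into a business-level table ready for BI consumption.
gold = (spark.table("silver.transactions")
        .groupBy("customer_id")
        .agg(F.sum("amount").alias("total_spend")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.customer_spend")
```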
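The Delta Lake maintenance features mentioned above (time travel, vacuum, Z-ordering) are standard Delta SQL commands; a quick sketch follows, again with a hypothetical table name and retention window.

```python
# Delta Lake maintenance sketch; table name and retention are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-maintenance-sketch").getOrCreate()

# Time travel: query the table as it looked at an earlier version.
spark.sql("SELECT * FROM silver.transactions VERSION AS OF 12").show()

# Z-ordering: co-locate frequently filtered columns to reduce files scanned.
spark.sql("OPTIMIZE silver.transactions ZORDER BY (customer_id)")

# Vacuum: remove data files no longer referenced within the retention window.
spark.sql("VACUUM silver.transactions RETAIN 168 HOURS")
```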
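Finally, a minimal Delta Live Tables sketch covering the streaming and data-quality points above. Dataset names and the source path are hypothetical, and this code runs inside a DLT pipeline (where the runtime supplies `spark`), not as a standalone script.

```python
# Delta Live Tables sketch: streaming ingestion plus a quality expectation.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events streamed in continuously with Auto Loader")
def events_bronze():
    return (spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/mnt/raw/events"))  # hypothetical landing path

@dlt.table(comment="Cleaned events with a declarative quality expectation")
@dlt.expect_or_drop("has_event_type", "event_type IS NOT NULL")
def events_silver():
    return (dlt.read_stream("events_bronze")
            .withColumn("event_ts", F.col("event_time").cast("timestamp")))
```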