Work format: long-term, full-time, 100% remote
Start: ASAP
Hi there!
We're looking for a Databricks Architect for our US-based client. The work involves data migration, data collection, and optimization of Databricks-based solutions. The client has a steady demand for specialists: most projects are short-term (with a high chance of extension), and thanks to that consistent demand, the client can offer new engagements after a project ends.
Currently, we're looking for specialists for an AI/ML project in the healthcare domain. The project involves processing text data and analyzing images generated by medical devices (X-ray, MRI, etc.). The collected data will be migrated to a cloud-based Databricks platform. The platform is designed to handle the full data lifecycle, with built-in features ensuring compliance, auditability, cohort creation, and model reuse. The goal is to solve existing data management challenges (fragmented sources, manual processes, insufficient security).
We're looking for individuals with strong Python skills; cloud experience and knowledge of Databricks and Apache Spark are essential. The projects are mainly US-based, and in most cases working hours require only a slight overlap (e.g., 10:00 to 18:00 CET), but we're flexible and happy to accommodate preferences.
Key responsibilities:
- Ensuring secure data storage
- Processing and indexing DICOM data
- Data validation, building processing pipelines, creating and sharing cohorts
- Planning and executing database migrations
- Close collaboration with the team (including data engineers, data scientists, clinical informatics professionals, and support staff)
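To give a flavor of the cohort-creation work mentioned above, here is a minimal, library-free Python sketch. All record fields and function names are hypothetical illustrations; in practice DICOM metadata would be parsed with a library such as pydicom and the cohort logic would run on Spark/Databricks tables.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical, simplified study record. Real DICOM headers carry far
# more metadata (study/series UIDs, body part, device info, etc.).
@dataclass
class StudyRecord:
    patient_id: str
    modality: str       # e.g. "MR" (MRI), "CR" (X-ray)
    study_date: date

def build_cohort(records, modality, since):
    """Return the set of patient IDs with at least one study of the
    given modality on or after `since` -- a toy stand-in for the
    cohort-creation step described in the responsibilities."""
    return {
        r.patient_id
        for r in records
        if r.modality == modality and r.study_date >= since
    }

records = [
    StudyRecord("P001", "MR", date(2024, 3, 1)),
    StudyRecord("P002", "CR", date(2023, 11, 5)),
    StudyRecord("P003", "MR", date(2022, 6, 20)),
]

# Patients with an MRI study since 2024-01-01
cohort = build_cohort(records, modality="MR", since=date(2024, 1, 1))
print(sorted(cohort))  # only P001 matches both filters
```

On the actual platform, the same filter would typically be a Spark SQL query over a Delta table, with the resulting cohort shared as its own governed table.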
Requirements:
- Solid experience in a data engineering or related role (8+ years)
- Strong knowledge of the Databricks platform and Apache Spark
- Proficiency in Python
- Experience with cloud data migrations
- Experience working in AWS environments (Amazon S3)
- Background in AI/ML project delivery
- Strong interpersonal and teamwork skills
- Initiative and the ability to work independently
- English fluency for effective communication within the team
Nice to have:
- Experience with other cloud platforms (e.g., Azure: Data Factory, Synapse, Logic Apps, Data Lake)
- Experience designing and optimizing data pipelines using DBT, SSIS, TimeXtender, or similar ETL/ELT solutions
- Exposure to big data or NoSQL platforms (Redshift, Hadoop, EMR, Google Data, etc.)
How we work and what we offer:
- We value open communication throughout the recruitment and employment process, striving for clarity and transparency at every step
- Our recruitment approach is human-centered: we keep the process as smooth and candidate-friendly as possible
- We operate on a "remote first" basis: remote work is the norm, and business travel is kept to a minimum
- We offer private medical care (Medicover) and a Multisport card for contractors
How to apply?
Send us your application via the form!