6+ years of experience; expertise in the Databricks or Snowflake platform, Python/Java/Scala, and related technologies such as Airflow.
Key responsibilities:
Develop reusable data processing components based on the Data Mesh design
Translate designs into high-quality code for cloud-based data integration and warehouses
Coders Brain is a global leader in IT services, digital and business solutions that partners with its clients to simplify, strengthen and transform their businesses. We ensure the highest levels of certainty and satisfaction through a deep-set commitment to our clients, comprehensive industry expertise and a global network of innovation and delivery centers.
Our success stems from how seamlessly we integrate with our clients.
Job Description:
Core skills – Databricks or Snowflake platform, Python/Java/Scala, Airflow, CI/CD (DevOps) processes, building ETL data pipelines, object storage in AWS/Azure, Delta Lake/data lake, Kafka.
Experience level – 6+ years
Job Responsibilities:
Developing reusable data processing components and libraries based on the Data Mesh design pattern
Designing and implementing a modern, highly responsive factory approach, and inner-sourcing the components for enterprise use
Translating designs and wireframes into high-quality code
Extensive experience building data integration and warehouse solutions in the cloud (Azure and AWS)
Strong hands-on experience in Python and Java
Extremely strong SQL skills on OLAP and OLTP technologies
Ability to build data models using Data Vault and dimensional modeling
Understanding and using CI/CD on AWS/Azure and Kubernetes
Learning and adopting new tools and techniques across Databricks, Snowflake, NoSQL databases, data governance, and data quality
Optimizing components for maximum performance across a wide array of data processing and data consumption patterns
Required profile
Experience
Industry :
Management Consulting
Spoken language(s):
English