Databricks Engineer

Remote: 
Full Remote
Experience: 
Senior (5-10 years)
Work from: 

Offer summary

Qualifications:

Bachelor's degree in computer science or related field, 5+ years of experience in a similar role, Extensive experience with Python or PySpark, Experience with Azure/AWS Data services.

Key responsibilities:

  • Develop solutions on Big Data
  • Migrate existing pipelines to Cloud

Sales Consulting http://salesconsulting.recruitmentagiler.com
51 - 200 Employees

Job description

The company is home to 2500+ creative technologists and is one of Eastern Europe's largest Software Product Engineering delivery networks. We serve global clients in several industries, including Banking & Financial Services, Insurance, Healthcare & Life Sciences, Communication Media & Technology, and Retail & MLEU (manufacturing, logistics, energy & utilities).

Our product thinking mindset defines, builds, and launches new, experience-centered software products that reinvent business.

The company was founded in the early 1990s to be the engineering partner for thriving Silicon Valley tech startups. The organization prides itself on its great culture and reputation for attracting and developing the best technical talent in Romania with end-to-end delivery expertise.

We are looking for an exceptional Senior Databricks Engineer to work with our cross-functional team and join our world-class community of talented experts in Romania.

For this position, you should meet the following requirements:

- Bachelor's degree in computer science or related field
- 5+ years of experience in a similar role, leading fellow data engineers within the data engineering area;
- Extensive experience working in agile project environments;
- Working experience with Python or PySpark is required;
- Working experience with and understanding of Azure/AWS Databricks, Azure/AWS Data Factory, Azure/AWS Data Lake, Azure/AWS SQL DW, and Azure/AWS SQL is required;
- The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration.
- Experience in driving new data engineering developments (e.g. applying cutting-edge data engineering methods or new tools to improve the performance of data integration).


A day in the life of a Databricks Engineer:

- Develop and provide solutions on Big Data;
- Develop high-traffic, flawless web applications using Python, Cloud platform, and PySpark;
- Code with performance, scalability, and usability in mind;
- Work on new tools in leading industry trends, with new and emerging technologies, prototypes, and engineering process improvements;
- Migrate existing pipelines developed in ADF and Python scripts from on-premise to the Cloud using ADF and Databricks;
- Develop new pipelines using ADF for data extraction from multiple sources (MySQL databases, Oracle databases, CSV files) and Databricks (PySpark & Spark SQL) for curation, validation, and applying the required business transformation logic;
- Work on client projects to deliver Microsoft Azure / AWS based data engineering & analytics solutions;
- Engineer and implement scalable analytics solutions on Azure.

Employee Benefits:

  • Flexible Work Schedule - Outside of main work hours, you can create a schedule that suits your needs
  • Whether you like to work from home or go to the office, the choice is yours
  • Annual Vacation Days - 26 days to relax, explore and spend time with loved ones
  • Trainings, workshops, and certifications, unlimited Udemy subscription and more
  • Private medical package
  • Meal tickets
  • Referral bonuses
  • Life insurance
  • Banking services
  • Bookster

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s):
English

Other Skills

  • Teamwork
  • Communication
  • Problem Solving
