
Databricks Architect

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • 12+ years of experience in architecture
  • Expertise in Databricks design and architecture
  • Strong knowledge of AWS
  • Experience with DevOps tools and practices
  • Familiarity with Hadoop environments

Key responsibilities:

  • Design and implement scalable Databricks solutions
  • Optimize performance, cost, and reliability of data systems
  • Build monitoring tools to improve performance
  • Implement DevOps processes on Databricks platform
  • Drive performance initiatives and recommendations
Tiger Analytics | https://www.tigeranalytics.com/ | 1001 - 5000 employees

Job description

Tiger Analytics is pioneering what AI and analytics can do to solve some of the toughest problems faced by organizations globally. We develop bespoke solutions powered by data and technology for several Fortune 100 companies. We have offices in multiple cities across the US, UK, India, and Singapore, and a substantial remote global workforce.

If you are passionate about working on business problems that can be solved using structured and unstructured data at large scale, Tiger Analytics would like to talk to you. We are seeking an experienced and dynamic Databricks Architect to play a key role in designing and implementing robust data solutions that help solve our clients' complex business problems.

Requirements

  • 12+ years of experience in the architecture, design, and data engineering development of large-scale data ecosystems.
  • Databricks design and architecture: design and architect scalable, secure Databricks solution platforms, optimizing for performance, cost, and reliability.
  • Hands-on Databricks/DataOps architect who can help reduce consumption, resolve performance issues, and build monitoring tools.
  • Good exposure to AWS; strong communication and leadership skills; self-driven and able to work independently.
  • Experience driving performance initiatives and recommendations (identifying tuning opportunities, creating performance environments, etc.).
  • Implement DevOps processes on the Databricks platform, with harmonized and curated pipelines across all programs.
  • Experience with Hadoop environments.
  • Experience with DevOps tools and practices, including CI/CD pipeline setup using Git, Terraform, GitHub Actions, or Jenkins.
  • Core skills: Databricks platform knowledge, performance optimization, DevOps, Unity Catalog migration implementation exposure, and Spark Streaming pipeline design exposure (see the sketch after this list).
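
For illustration only, below is a minimal PySpark Structured Streaming sketch of the kind of pipeline design this role calls for: ingesting raw events, applying light curation, and writing aggregates to a Delta table with checkpointing. The storage paths, schema, and window sizes are hypothetical placeholders, not details from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("events-stream-sketch").getOrCreate()

# Hypothetical event schema; a real pipeline would derive this from the source data contract.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a stream of JSON files from cloud storage (placeholder path).
raw = (
    spark.readStream
    .schema(event_schema)
    .json("s3://example-bucket/raw/events/")
)

# Light curation: drop incomplete rows, then aggregate per event type over 5-minute windows,
# using a watermark so late data is bounded and state can be cleaned up.
curated = (
    raw.dropna(subset=["event_id", "event_time"])
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), "event_type")
    .agg(F.count("*").alias("event_count"), F.sum("amount").alias("total_amount"))
)

# Write to a Delta table with checkpointing so the stream can recover after restarts.
query = (
    curated.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events_agg/")
    .trigger(processingTime="1 minute")
    .start("s3://example-bucket/curated/events_agg/")
)
```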

Benefits

Significant career development opportunities exist as the company grows. The position offers a unique opportunity to be part of a small, fast-growing, challenging, and entrepreneurial environment, with a high degree of individual responsibility.

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s): English

Other Skills

  • Communication
  • Problem Solving
  • Leadership
