
Data Manager

Remote: Full Remote
Contract: 
Work from: 

Offer summary

Qualifications:

  • Strong expertise in Databricks and Delta Live Tables.
  • Proficient in SQL and Python for data management.
  • Experience with modern data pipeline tools and cloud platforms.
  • Knowledge of cloud data lakes and DevOps principles.

Key responsibilities:

  • Design, build, and maintain data products.
  • Lead migration of data pipelines to modern frameworks.
Allata (Scaleup) - https://allata.com/
201 - 500 Employees

Job description

Allata is an IT company dedicated to strategy, architecture, and enterprise-level application development with offices in the US, India, and Argentina. We aim to be strategic advisors for our clients, focusing on helping them enhance or scale business opportunities, create efficiencies, automate processes through custom technologies, and find elegant solutions to complex problems.

We provide Data Analytics, Advanced Integrations, Product Launch, Experience Design, Support, Cloud, DevOps, and Software Development, among other services. Our agile centralized development teams, powered by our on-site senior leadership, allow us to work with you as a stand-alone group, whether integrating into your in-house dev teams or providing external architectural guidance.



We are actively seeking a Data Manager with strong expertise in Databricks and Delta Live Tables to guide and drive the evolution of our client's data ecosystem.

The ideal candidate will combine technical leadership with hands-on execution, leading the design, migration, and implementation of robust data solutions while mentoring team members and collaborating with stakeholders to achieve enterprise-wide analytics goals.

Role & Responsibilities:
  • Provide technical leadership in designing, building, and maintaining reusable data products using Databricks, Delta Live Tables (DLT), dbt, Python, and SQL.
  • Lead the migration of existing data pipelines to modern frameworks and ensure scalability and efficiency.
  • Establish and enforce data quality standards, leveraging automated testing and reporting to ensure data accuracy, consistency, and compliance.
  • Develop and oversee the data infrastructure, pipeline architecture, and integration solutions while actively contributing to hands-on implementation.
  • Collaborate with business and technical stakeholders to define requirements and ensure data solutions align with business goals.
  • Build and maintain scalable, efficient data processing pipelines and solutions for data-driven applications.
  • Develop comprehensive documentation of data engineering processes and provide mentorship to junior team members.
  • Monitor and ensure adherence to data security, privacy regulations, and compliance standards.
  • Troubleshoot and resolve complex data-related challenges and incidents in a timely manner.
  • Stay at the forefront of emerging trends and technologies in data engineering and advocate for their integration when relevant.

Hard Skills - Must have:
  • Proven expertise in Databricks, Delta Live Tables, SQL, and Python for processing and managing large data volumes.
  • Strong experience in designing and implementing dimensional models and medallion architecture.
  • Hands-on experience with modern data pipeline tools (e.g., dbt, AWS Glue, Azure Data Factory, Fivetran) and cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
  • Knowledge of cloud data lakes (e.g., Databricks Lakehouse, Azure Storage, AWS S3).
  • Demonstrated experience applying DevOps principles (e.g., Git, CI/CD) to data engineering projects.
  • Familiarity with batch and streaming data processing techniques.

Hard Skills - Nice to have/It's a plus:
  • Familiarity with architectural best practices for building data lakes.
  • Experience with BI tools (e.g., Power BI, Tableau) and deploying data models.
  • Knowledge of observability and monitoring frameworks for data solutions.
  • Hands-on experience with big data technologies (e.g., Spark, Delta Lake, Hive).
  • Experience with Databricks Unity Catalog, i.e., configuring and managing data governance and access controls in a Lakehouse environment.

Soft Skills / Business Specific Skills:
  • Demonstrated ability to lead technical discussions and guide teams in solving complex data challenges.
  • Strong collaboration and stakeholder management skills in distributed teams.
  • Proactive problem-solving attitude with excellent organizational and analytical capabilities.
  • Fluent English communication skills, both written and verbal.
  • Familiarity with Agile methodologies and SDLC practices.
Required profile

    Experience

    Spoken language(s):
    English

    Other Skills

    • Problem Solving
    • Collaboration
    • Communication
    • Analytical Skills
    • Leadership
    • Organizational Skills
