
Sr. Cloud Data Engineer

Remote: Full Remote
Contract:
Experience: Senior (5-10 years)
Work from:

Offer summary

Qualifications:

  • Bachelor's degree in computer science or related field
  • Proficiency in SQL and Python
  • Experience with Azure and cloud data services
  • Knowledge of data governance and security best practices
  • Experience with big data tools and pipeline management

Key responsibilities:

  • Design and implement data solutions using Fabric OneLake
  • Develop and maintain scalable data pipelines and architectures
  • Collaborate with teams to understand data needs
  • Create data tools for analytics teams
  • Ensure secure data management across multiple regions
Encora Inc. (http://www.encora.com/)
5001 - 10000 Employees

Job description

Important Information

Experience: 7+ years

Job Mode: Full-time

Work Mode: Work from home

Job Summary

As a Cloud Data Engineer specializing in Fabric OneLake, you will be responsible for designing, building, and managing our cloud-based data infrastructure. You will play a critical role in the development and optimization of our data lake, ensuring scalability, reliability, and security.

Responsibilities and Duties

  • Design and implement scalable and secure data solutions using Fabric OneLake technology.
  • Develop and maintain scalable data pipelines, architectures, and data sets that support data ingestion, processing, storage, and delivery across multiple sources and destinations (a minimal sketch follows this list).
  • Collaborate with IT and business teams to understand data needs and deliver high-quality, scalable solutions.
  • Ensure optimal data delivery architecture for end-to-end data flow from ingestion to analytics.
  • Work with stakeholders to assist with data-related technical issues and support data infrastructure needs.
  • Create data tools that help analytics and data science team members build and optimize our product.
  • Keep our data separated and secure across national boundaries through multiple data centers and Azure regions.
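
As a loose illustration of the pipeline work described above (a sketch under assumptions, not part of the role definition), the example below shows a minimal ingest-clean-deliver flow in PySpark, the kind of pattern typically run on Azure Databricks against lake storage such as OneLake or ADLS. The storage URIs and column names are hypothetical placeholders.

```python
# Minimal ingest -> clean -> deliver sketch in PySpark.
# Assumes a Spark runtime with Delta Lake available (e.g., Azure Databricks);
# every path and column name below is a hypothetical placeholder.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

# Ingest: read raw files landed in a lake container (placeholder URI).
raw = (
    spark.read
    .option("header", True)
    .csv("abfss://raw@example.dfs.core.windows.net/sales/")
)

# Process: de-duplicate, type, and filter before delivery.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount").isNotNull())
)

# Deliver: write a Delta table that analytics teams can query downstream.
(
    clean.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://curated@example.dfs.core.windows.net/sales_orders/")
)
```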

Qualifications and Skills

  • A bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
  • Proficiency in SQL and Python, and familiarity with other programming languages and frameworks such as Scala, R, or Spark.
  • Experience with cloud-based data services and platforms such as Azure, AWS, or GCP, and with data warehouse and ETL tools such as Snowflake, SSIS, or Informatica.
  • Knowledge of data modeling, data quality, data governance, and data security best practices and standards.
  • Proven experience with Azure Databricks, Azure Data Factory, and other Azure services.
  • Strong analytical skills for working with unstructured datasets.
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with data pipeline and workflow management tools.
  • Experience with Azure SQL DB, Cosmos DB, or other database technologies.
  • Experience with stream-processing systems.
  • Strong project management and organizational skills.
  • Strong communication, collaboration, and problem-solving skills, and a passion for learning new technologies and methodologies.

Preferred Tech Skills:

  • Experience with Fabric OneLake development and management.
  • Knowledge of networking within Azure Databricks, including VNet settings and firewall rules.
  • Ability to set up linked services within Azure Data Factory and execute Azure Databricks (ADB) notebooks.
  • Familiarity with on-premises to cloud data migration and with managing data across hybrid environments.
  • Cloud Platforms: Proficient in Azure, including Azure Databricks, Azure Data Factory, and Azure SQL Data Warehouse.
  • Programming Languages: Strong command of Python, Scala, and SQL.
  • Big Data Tools: Experience with Hadoop, Spark, Kafka, and other big data technologies.
  • Data Storage: Knowledge of various data storage solutions like Azure Blob Storage, Azure Data Lake Storage, and Cosmos DB.
  • Data Processing: Familiarity with data processing tools such as Azure Stream Analytics and Azure HDInsight.
  • DevOps Tools: Experience with CI/CD pipelines, using tools like Jenkins, Azure DevOps, or GitHub Actions.
  • Data Security: Understanding of data security practices, including encryption, data masking, and access control within cloud environments (a masking sketch follows this list).
  • Monitoring and Logging: Proficiency in using monitoring tools like Azure Monitor and log analytics solutions like Azure Log Analytics.
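
As a loose illustration of the data-masking item above (a sketch under assumptions, not a requirement of the role), the snippet below hashes a sensitive column with PySpark before a table is shared more broadly. The table paths and column names are hypothetical.

```python
# Illustrative column masking in PySpark: replace a raw identifier with a hash
# before sharing the table. Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("masking-sketch").getOrCreate()

customers = spark.read.format("delta").load(
    "abfss://curated@example.dfs.core.windows.net/customers/"
)

# SHA-256 the email so analysts can still join on it without seeing the value.
masked = (
    customers
    .withColumn("email_hash", F.sha2(F.col("email"), 256))
    .drop("email")
)

masked.write.format("delta").mode("overwrite").save(
    "abfss://masked@example.dfs.core.windows.net/customers/"
)
```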

Our IT and business objectives require a cloud data engineer who can build the NFM Data Platform and deliver reliable, adaptable solutions that allow us to make data-driven decisions in near real time. This role will help establish a single source of truth for the company, fostering data accessibility and distribution.

About Encora

Encora is the preferred digital engineering and modernization partner of some of the world's leading enterprises and digital native companies. With over 9,000 experts in 47+ offices and innovation labs worldwide, Encora's technology practices include Product Engineering & Development, Cloud Services, Quality Engineering, DevSecOps, Data & Analytics, Digital Experience, Cybersecurity, and AI & LLM Engineering.

At Encora, we hire professionals based solely on their skills and qualifications, and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.

Required profile

Experience

Level of experience: Senior (5-10 years)
Industry:
Spoken language(s): English
Check out the description to know which languages are mandatory.

Other Skills

  • Motivational Skills
  • Verbal Communication Skills
  • Organizational Skills
  • Collaboration
  • Analytical Skills
