Snowflake Architect with Azure

Work set-up: Full Remote

Offer summary

Qualifications:

  • Experience with end-to-end Snowflake cloud data warehouse implementations.
  • Strong knowledge of Snowflake features like data sharing, resource monitors, and performance tuning.
  • Proficiency in data modeling, ELT processes, and SQL, preferably on Azure.
  • Background in designing enterprise data lake architectures and working with the Azure cloud platform.

Key responsibilities:

  • Lead the design and development of enterprise data lake architectures on Azure and Snowflake.
  • Implement and optimize Snowflake data warehousing solutions, including data modeling and performance tuning.
  • Collaborate with cross-functional teams to ensure solutions meet business requirements and standards.
  • Maintain data security, governance, and best practices across data platforms.

Keylent Inc | Information Technology & Services SME | https://www.keylent.com/ | 201-500 employees

Job description


Visa status: U.S. Citizens and those authorized to work in the U.S. are encouraged to apply.
Tax Terms: W2, 1099
Corp-Corp or 3rd Parties: Yes



Snowflake Architect
Position is remote and the duration is one year.

 
Please share profiles that match the below JD.
  • Accountable across multiple functional and technical data and analytics areas with a wide range of complexity. Serves as technical architect and leader of medium-to-complex projects for a specific business capability, or across multiple capabilities and technologies, focusing on Microsoft Azure and the Snowflake Cloud DW. Knowledge of other technologies such as SAP HANA, SAP BODS, SAP BW, and SQL Server is preferable, as all of these exist in K-C's environment
  • Develop an effective, coherent, reliable, and phased enterprise data lake architecture approach for Revenue Growth Management to help business grow and change
  • Develop a roadmap for the enterprise data lake platforms (Microsoft Azure/Snowflake) for advanced analytics, and map business opportunities to appropriate data lake architecture patterns as business strategy and technology mature
  • Develop and maintain processes to acquire, analyze, store, cleanse, and transform large datasets using tools such as Azure Data Factory, Spark, and SnowSQL
  • Must have end-to-end implementation experience with the Snowflake cloud data warehouse, and with end-to-end data warehouse implementations on the cloud, preferably using Microsoft Azure
  • Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures with Java, and standard DWH and ETL concepts
  • Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, and an understanding of how to use these features
  • Expertise in deploying Snowflake features such as data sharing, events and lake-house patterns
  • Hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe. Knowledge of big data modelling techniques using Python or R is preferable
  • Expertise in implementing transformations (SCD2) using continuous pipelines with streams and tasks
  • Experience with data migration from RDBMS and Azure Blob Storage to the Snowflake cloud data warehouse
  • Deep understanding of relational and NoSQL data stores, caching methods, and dimensional modelling approaches (star and snowflake schemas)
  • Experience with data security and data access controls and design
  • Experience using materialized and secure tables/views, and a good understanding of loading and unloading data
  • Deploy Snowflake following best practices, and provide subject matter expertise in data warehousing, specifically with Snowflake
  • Experienced in designing, developing, implementing, optimizing, and troubleshooting complex cloud data warehouse platforms, specifically SaaS Snowflake
  • Comfortable using the Azure cloud platform and Databricks
  • The Data Architect is responsible for the design, development, and expansion of enterprise data models that support business requirements. They will develop, maintain, and support an enterprise-wide data model showing entity relationships across business functions and applications. They will also be heavily involved in analyzing, profiling, and reviewing the data elements from each source system, and in creating logical and physical data model designs
  • Help develop and maintain enterprise data standards, best practices, security policies, and governance processes for the enterprise data lake globally
  • Translates non-functional and functional requirements into end-to-end analytical solution designs, ensuring the solution aligns with business goals and processes, uses and provides enterprise information consistently, integrates effectively with other applications, and supports a common application environment and user interaction model
  • Collaborates with other Solution Engineers and Enterprise Architecture to ensure that solutions fit within the enterprise context, aiming for standardization of solutions across K-C's application landscape
  • Coordinates medium-to-complex solution architecture implementations, leading design variances based on business needs while ensuring artifacts are documented in the enterprise architecture repository
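
To give candidates a concrete sense of the resource-monitor and warehouse-sizing work mentioned above, here is a minimal Snowflake SQL sketch. All object names, quotas, and sizes are hypothetical, not requirements of this role:

```sql
-- Hypothetical monitor: cap monthly credit spend and suspend on overrun.
CREATE RESOURCE MONITOR analytics_monitor
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 75 PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;

-- Hypothetical virtual warehouse sized for the workload and tied to the monitor.
CREATE WAREHOUSE analytics_wh
  WITH WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 300        -- suspend after 5 minutes idle
  AUTO_RESUME = TRUE
  RESOURCE_MONITOR = analytics_monitor;
```

Right-sizing the warehouse and pairing it with a monitor is a common pattern for balancing query performance against credit consumption.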
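The SCD2 transformation with streams and tasks listed above can be sketched as follows. Table and column names are hypothetical, and the single MERGE shown here only closes the current row and inserts brand-new keys; a complete SCD2 load typically adds a second step to insert fresh versions of changed keys:

```sql
-- Hypothetical stream capturing changes on a staging table.
CREATE OR REPLACE STREAM customer_stg_stream ON TABLE customer_stg;

-- Hypothetical task that runs only when the stream has data.
CREATE OR REPLACE TASK customer_scd2_task
  WAREHOUSE = analytics_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('CUSTOMER_STG_STREAM')
AS
  MERGE INTO customer_dim d
  USING customer_stg_stream s
    ON d.customer_id = s.customer_id AND d.is_current = TRUE
  WHEN MATCHED AND d.attr_hash <> s.attr_hash THEN
    -- Close out the current version of a changed record.
    UPDATE SET d.is_current = FALSE, d.valid_to = CURRENT_TIMESTAMP()
  WHEN NOT MATCHED THEN
    -- Insert records seen for the first time.
    INSERT (customer_id, attr_hash, valid_from, valid_to, is_current)
    VALUES (s.customer_id, s.attr_hash, CURRENT_TIMESTAMP(), NULL, TRUE);

ALTER TASK customer_scd2_task RESUME;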
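The Azure Blob-to-Snowflake loading experience called for above usually centers on stages, Snowpipe, and COPY INTO. A minimal sketch, assuming hypothetical stage, pipe, and integration names (the SAS token and account are placeholders):

```sql
-- Hypothetical external stage pointing at an Azure Blob container.
CREATE OR REPLACE STAGE sales_stage
  URL = 'azure://myaccount.blob.core.windows.net/sales'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>');

-- Hypothetical pipe: auto-ingest via an Azure Event Grid notification integration.
CREATE OR REPLACE PIPE sales_pipe
  AUTO_INGEST = TRUE
  INTEGRATION = 'AZURE_EVENT_INT'
AS
  COPY INTO raw.sales
  FROM @sales_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```

With AUTO_INGEST enabled, new files landing in the container trigger loads continuously instead of on a batch schedule.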
 
 

Required profile

Experience

Industry :
Information Technology & Services
Spoken language(s):
English

Other Skills

  • Governance
  • Collaboration
  • Problem Solving
