Job description
Position: Data Engineer
Location: Remote in USA
Duration: 6 months plus
Rate: DOE
US citizens and green card holders are preferred.
We are looking for former Caterpillar employees only; we will only be able to consider candidates who previously worked at Caterpillar.

Responsibilities:
Build, deploy, and maintain robust and scalable Azure-based data pipelines using Azure Data Factory, Azure SQL Data Warehouse, Azure Analysis Services, and Databricks/Synapse/Fabric
Design and implement multi-layer data platform architectures with end-to-end ETL processes from various sources to staging areas and data marts
Develop complex data transformations using dbt and SQL (a minimal sketch follows this list); troubleshoot performance issues and perform root-cause analysis
Support data migration, validation, cleansing, and data quality assurance to ensure accurate and timely reporting
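
For illustration, a minimal sketch of the kind of staging-to-mart dbt transformation this role involves; the model and column names (stg_orders, stg_customers, fct_orders) are hypothetical, not taken from this posting:

    -- models/marts/fct_orders.sql (hypothetical model name)
    -- Joins two staging models and materializes the result as a mart table.
    {{ config(materialized='table') }}

    with orders as (
        select * from {{ ref('stg_orders') }}
    ),

    customers as (
        select * from {{ ref('stg_customers') }}
    )

    select
        o.order_id,
        o.order_date,
        o.order_amount,
        c.customer_id,
        c.customer_region
    from orders as o
    join customers as c
        on o.customer_id = c.customer_id

In dbt, ref() resolves dependencies between models, so the staging layer is built before the mart that reads from it.
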
Required Qualifications:
Candidates must be located within commuting distance of Hartford, CT; Raleigh, NC; Phoenix, AZ; or Richardson, TX, or be willing to relocate to one of these areas.
Bachelor's degree or foreign equivalent from an accredited institution is required; three years of progressive experience in the specialty will be considered in lieu of each year of education.
US citizens and those authorized to work in the US are encouraged to apply.
At least 2 years of experience in Information Technology.
Strong experience implementing and maintaining robust, scalable data pipelines on Azure using services such as Azure Data Factory, Azure SQL Data Warehouse, Azure Analysis Services, or Azure Databricks/Synapse/Fabric.
Strong experience designing and architecting multi-layered data platforms.
Strong Snowflake experience.
At least 1 year of experience developing complex transformations using dbt (Data Build Tool).
Strong SQL experience, including troubleshooting performance issues, identifying root causes, and applying fixes (see the sketch after this list).
Strong experience with Azure and with Python/shell scripting.
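
As an illustration of the SQL performance troubleshooting named above, one common first step in Snowflake is to surface the slowest recent queries from the ACCOUNT_USAGE.QUERY_HISTORY view; the seven-day window and 60-second threshold below are assumptions:

    -- Sketch: find candidate queries for root-cause analysis.
    select
        query_id,
        warehouse_name,
        total_elapsed_time / 1000 as elapsed_seconds,  -- column is in milliseconds
        bytes_spilled_to_local_storage,                -- spilling suggests memory pressure
        query_text
    from snowflake.account_usage.query_history
    where start_time >= dateadd('day', -7, current_timestamp())
      and total_elapsed_time > 60000
    order by total_elapsed_time desc
    limit 20;
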
Preferred Qualifications:
Experience in Data Analysis, Data Migration, Data Validation, Data Cleansing, Data Verification, identifying data mismatches, and Data Import/Export using ETL tools such as Informatica, DataStage, Teradata, and Talend (a reconciliation sketch follows this list).
Experience tuning database performance to ensure an optimal reporting user experience.
Experience designing and developing end-to-end ETL processes from various source systems to staging areas and from staging to data marts, including data loads.
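
For illustration, the data validation and mismatch identification described above often reduce to reconciliation queries such as this sketch; the table and column names (stg_orders, fct_orders, order_id) are hypothetical:

    -- Sketch: compare row counts between the staging table and the mart.
    select
        (select count(*) from stg_orders) as staged_rows,
        (select count(*) from fct_orders) as loaded_rows;

    -- Sketch: keys that were staged but never loaded (ideally returns zero rows).
    select s.order_id
    from stg_orders as s
    left join fct_orders as f
        on s.order_id = f.order_id
    where f.order_id is null;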