We are a staffing solutions and IT Solutions company that provides you with a platform to find the right fit for your desired job profile.
The employment situation is constantly shifting with the changing times, and we are here to ensure that you build a workforce that complements your long-term goals. We understand the struggle of searching for talent with the skills and qualifications for a specific profile, talent that also blends in with the culture of your organization.
We aim to be the best available source for young talent to find their dream jobs, helping them narrow down their options to the most suitable roles available in the market.
At RIG, we consistently work on the latest technology to not only simplify your work processes but also provide the most cost-effective solutions. We work hard to make sure that your business thrives in the market and stays at the top of your field.
Position: Azure Data Engineer
Duration: 12-month contract
Visa Constraints: None
Communication skills should be excellent
Confirmed -- two positions still open
Updated MUST HAVES:
Hands-on REST API development using Python frameworks
Azure expertise with hands-on AKS/Kubernetes development
MongoDB experience
Position Overview:
Develop data pipelines to ingest, load, and transform data from multiple sources.
Leverage the Data Platform, running on Google Cloud, to design, optimize, deploy, and deliver data solutions in support of scientific discovery
Use programming languages like Java, Scala, and Python, open-source RDBMS and NoSQL databases, and cloud-based data store services such as MongoDB, DynamoDB, ElastiCache, and Snowflake
Continuously deliver technology solutions from product roadmaps, adopting Agile and DevOps principles
Collaborate with digital product managers and deliver robust cloud-based solutions that drive powerful experiences
Design and develop data pipelines, including Extract, Transform, Load (ETL) programs to extract data from various sources and transform the data to fit the target model
Test and deploy data pipelines to ensure compliance with data governance and security policies
Move from implementing to owning real-time and batch processing, as well as data governance and policies
Maintain and enforce the business contracts on how data should be represented and stored
Ensure that technical delivery is fully compliant with Security, Quality, and Regulatory standards
Keep relevant technical documentation up to date in support of the lifecycle plan for audits/reviews
Proactively engage in experimentation and innovation to drive relentless improvement, e.g., adopting new data engineering tools/frameworks
Implement ETL processes, moving data between systems including S3, Snowflake, Kafka, and Spark
Work closely with our Data Scientists, SREs, and Product Managers to ensure software is high quality and meets user requirements
Required Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or related field.
5+ years of experience as a data engineer building ETL/ELT data pipelines.
Experience with data engineering best practices across the full software development life cycle, including coding standards, code reviews, source control management (Git), continuous integration, testing, and operations
Experience in Python and SQL; Java, C#, C++, Go, Ruby, and Rust are good to have