Senior Data Engineer

Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • Minimum 5 years of experience in data engineering
  • Expertise in AWS, SQL, Python, and Git
  • Experience with Snowflake and orchestration tools
  • Proficiency in CI/CD pipelines and data governance

Key responsibilities:

  • Design and maintain ETL/ELT pipelines
  • Manage and optimize Snowflake databases

Astek (https://astekgroup.fr/), 5001-10000 employees

Job description

How about joining ASTEK Polska’s community of Data Engineers?

Salary:

  • up to 1100 PLN net + VAT per man-day (B2B), depending on your professional experience

Work model:

  • Remote (NALA time zone)

About the project:

We are seeking a highly skilled and motivated Senior Data Engineer to join our growing data team. The ideal candidate will have a strong background in building data pipelines, implementing data models, and applying best practices for developing data products. This role requires expertise in AWS, GitLab CI/CD, dbt, Snowflake, SQL, Python, Git, and DevOps, as well as hands-on experience with orchestrators such as Airflow and AutoMateNow, and with data governance tools like Monte Carlo and Collibra.

Your day-to-day responsibilities include:

  • Develop Data Pipelines: Design, develop, and maintain robust, scalable, and efficient ETL/ELT pipelines to support diverse data sources and large-scale data processing.
  • Data Modeling: Create and maintain data models and data architecture to ensure data integrity and optimum performance.
  • Best Practices Implementation: Apply industry best practices in data warehousing, data governance, and data lifecycle management.
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to gather requirements and deliver high-quality data products.
  • Automation and CI/CD: Implement CI/CD pipelines using GitLab for automated testing, deployment, and integration of data workflows.
  • Database Management: Manage and optimize Snowflake databases for performance, scalability, and cost-effectiveness.
  • Coding and Scripting: Write efficient SQL queries and Python scripts to extract, load, and transform data.
  • Version Control: Utilize Git for version control, ensuring traceable and manageable code changes.
  • Orchestration: Utilize orchestration tools like Airflow and AutoMateNow to manage and automate data workflows.
  • Data Governance: Implement and manage data governance and data observability solutions using technologies like Monte Carlo and Collibra.
  • Monitoring and Optimization: Monitor data pipeline performance and troubleshoot issues to ensure reliability and efficiency.
  • Documentation: Maintain comprehensive documentation for data pipelines, models, and processes.

You’re ideal for this role if you have:

  • Minimum 5 years of experience in data engineering or a similar role.
  • Proven experience with AWS cloud services related to data processing, such as S3, Redshift, Lambda, Glue, and Data Pipeline.
  • Proficiency in designing and implementing CI/CD pipelines with GitLab.
  • Experience with dbt (data build tool) for transforming data within the warehouse.
  • Strong knowledge of Snowflake with hands-on experience in data warehousing solutions.
  • Expertise in SQL for data querying, manipulation, and optimization.
  • Proficiency in Python for scripting and automation tasks.
  • Familiarity with DevOps practices and tools.
  • Experience with orchestrators like Airflow and AutoMateNow.
  • Knowledge of data governance and observability tools such as Monte Carlo and Collibra.
  • Strong working knowledge of Git for version control.

It would be great if you also have:

  • AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect.
  • Experience with real-time data processing and streaming technologies (e.g., Kafka, Kinesis).

Your personality:

  • You are a team player
  • You are focused on long-term cooperation
  • You adapt easily to changes

Added value for you:

  • Long-term cooperation
  • Choice of cooperation model (regular employment contract with full benefits or a flexible B2B contract)
  • Technical training, certifications, and upskilling
  • Competence Center mentoring: you will be a member of the CC community from your first day of work. You’ll have the chance to develop your skills, participate in various conferences, and share your knowledge and experience with people who face the same challenges in their daily work
  • Clear career path
  • Employee benefits package
  • Friendly work atmosphere, social events and team-building meetings

Need more information? Contact me: katarzyna.jaroszewska@astek.net

Not the role for you? Recommend a friend and get a bonus of up to 7,000 PLN

Ref. no: AO143860

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Other Skills

  • Collaboration
