Data Engineer with Python and PySpark

Work set-up: Full Remote

Offer summary

Qualifications:

  • Strong programming skills in Python, PySpark, and advanced SQL.
  • Experience with cloud data platforms like Snowflake and Databricks.
  • Proficiency in reporting tools such as Power BI, MicroStrategy, Tableau, or Spotfire.
  • Ability to work with large-scale data systems and complex datasets.

Key responsibilities:

  • Design and develop scalable data pipelines using Python, PySpark, and SQL.
  • Build and maintain data integration workflows across cloud platforms like Snowflake and Databricks and tools like Informatica.
  • Collaborate with data scientists, analysts, and stakeholders to deliver insights.
  • Implement data quality, governance, and security best practices.

Elfonze Technologies (Scaleup, 201-500 employees): https://www.elfonze.com/

Job description

This is a remote position.

Key Responsibilities
  • Design and develop scalable data pipelines using Python, PySpark, and SQL (a minimal illustrative sketch follows this list).

  • Build and maintain data integration workflows across cloud platforms such as Snowflake and Databricks, and data integration tools such as Informatica.

  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver insights.

  • Implement data quality, governance, and security best practices across platforms.

  • Design and develop dashboards and reports using tools such as Power BI, MicroStrategy, Tableau, and Spotfire.

  • Optimize performance of data systems in cloud environments including Azure, AWS, and GCP.

  • Troubleshoot and resolve data and reporting issues efficiently.
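
The pipeline and data-quality responsibilities above translate, in practice, into code along these lines. This is only a minimal illustrative sketch, not part of the role description: every path, table, and column name below is a hypothetical placeholder, and a real pipeline would write to a configured Snowflake or Databricks target rather than the stand-in Parquet sink shown here.

```python
# Minimal sketch of a PySpark pipeline with a simple data-quality gate.
# All paths, tables, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest raw data from a hypothetical landing zone.
orders = spark.read.parquet("s3://raw-zone/orders/")

# Cleansing plus a basic quality check: drop rows missing keys and flag
# negative amounts before they reach downstream consumers.
clean = (
    orders
    .dropna(subset=["order_id", "customer_id"])
    .withColumn("is_valid_amount", F.col("amount") >= 0)
)
invalid_rows = clean.filter(~F.col("is_valid_amount")).count()
if invalid_rows > 0:
    print(f"Data-quality warning: {invalid_rows} rows with negative amount")

# Aggregate with SQL for the reporting layer.
clean.createOrReplaceTempView("orders_clean")
daily = spark.sql("""
    SELECT order_date, customer_id, SUM(amount) AS total_amount
    FROM orders_clean
    WHERE is_valid_amount
    GROUP BY order_date, customer_id
""")

# A production job would write to Snowflake, Databricks (Delta), or another
# governed target; Parquet here is only a stand-in sink.
daily.write.mode("overwrite").parquet("s3://curated-zone/daily_orders/")
```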


Mandatory Skills
  • Strong programming skills in Python, PySpark, and advanced SQL (see the example query after this list).

  • Experience with cloud data platforms like Snowflake and Databricks.

  • Proficiency in at least three reporting tools such as Power BI, MicroStrategy, Tableau, and Spotfire.

  • Demonstrated ability to work with large-scale data systems and complex datasets.
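
As a rough indication of what "advanced SQL" tends to mean in this kind of role (window functions, common table expressions, and the like), here is a small hedged example run through PySpark. It assumes the `spark` session and `orders_clean` view from the sketch above; the query, table, and column names are illustrative only.

```python
# Hypothetical "advanced SQL" pattern: a CTE plus a window function ranking
# customers by monthly spend. Assumes the spark session and orders_clean
# view registered in the pipeline sketch earlier; names are placeholders.
ranked = spark.sql("""
    WITH monthly_spend AS (
        SELECT customer_id,
               DATE_TRUNC('month', order_date) AS month,
               SUM(amount) AS total_amount
        FROM orders_clean
        GROUP BY customer_id, DATE_TRUNC('month', order_date)
    )
    SELECT customer_id,
           month,
           total_amount,
           RANK() OVER (PARTITION BY month ORDER BY total_amount DESC) AS spend_rank
    FROM monthly_spend
""")
ranked.show(truncate=False)
```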


Nice to Have Skills
  • Experience with Informatica for ETL/ELT workflows.

  • Familiarity with cloud services in Azure, AWS, or Google Cloud Platform (GCP).

  • Exposure to Alteryx or other self-service analytics tools.



Required profile

Spoken language(s): English

Other Skills

  • Collaboration
  • Problem Solving
