This is a remote position.
Design and develop scalable data pipelines using Python, PySpark, and SQL.
Build and maintain data integration workflows across cloud platforms such as Snowflake and Databricks, and data integration tools like Informatica.
Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver insights.
Implement data quality, governance, and security best practices across platforms.
Design and develop dashboards and reports using Power BI, MicroStrategy, Tableau, and Spotfire.
Optimize performance of data systems in cloud environments including Azure, AWS, and GCP.
Troubleshoot and resolve data and reporting issues efficiently.
Strong programming skills in Python, PySpark, and advanced SQL.
Experience with cloud data platforms like Snowflake and Databricks.
Proficiency in at least three of the following reporting tools: Power BI, MicroStrategy, Tableau, and Spotfire.
Demonstrated ability to work with large-scale data systems and complex datasets.
Experience with Informatica for ETL/ELT workflows.
Familiarity with cloud services in Azure, AWS, or Google Cloud Platform (GCP).
Exposure to Alteryx or other self-service analytics tools.