RevOps Data Engineer

EXTRA HOLIDAYS
Remote: Full Remote
Salary: Up to $133,500 yearly
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Extensive experience in data engineering
  • Proficiency in GCP services and BigQuery
  • Advanced SQL skills for data processing
  • Strong experience with API integration
  • Experience in Python for data tasks

Key responsibilities:

  • Develop and maintain data pipelines for Revenue Operations
  • Design and optimize data infrastructure on GCP BigQuery
  • Integrate diverse data sources for unified access
  • Build and schedule workflows ensuring data quality
  • Collaborate with teams to meet business data needs
Simpro Software (SME, 501-1000 employees): https://www.simprogroup.com/

Job description


First Things First - What We Can Offer You

  • Responsible Time Off
  • Opportunities for growth and development
  • Second-to-none product training provided
  • Comprehensive medical, dental, and vision package with 100% employer-paid options
  • 401k/Retirement Plan with 6% employer match
  • Flexible work environment
  • Dog-friendly office environment
  • Free parking
  • Happy hours and office games
  • Ground-breaking parental leave program
  • $5k Travel Grant for medical procedures

The Job

As a RevOps Data Engineer, you'll play a vital role in shaping our Enterprise Data Warehouse on BigQuery, ensuring our data infrastructure is efficient, scalable, and ready to meet the needs of a growing, data-driven organization. This is your opportunity to make a significant impact on how we leverage data to drive revenue and operational excellence.

Salary: Up to $133,500 per annum depending on experience
Location: East Coast, Remote

What You’ll Do

  • Support the Revenue Operations Team: Develop, maintain, and evolve data pipelines used by the Revenue Operations Team for reporting, analytics, and decision-making that will help grow the business.

  • Develop and Maintain Data Infrastructure: Design, implement, and maintain integrations, data transformations, and data models within the enterprise data warehouse on GCP BigQuery.

  • Data Integration: Integrate data from diverse sources to create a unified and accessible view of the organization’s data.

  • Custom Solutions: Develop custom connectors to extract data from external systems and integrate it into the BigQuery data warehouse (a minimal sketch of this kind of connector follows this list).

  • Workflow Management: Build, optimize and schedule data workflows to ensure data quality, integrity and security throughout the entire data lifecycle.

  • Data Architecture: Perform data modelling, schema design, and performance tuning to support scalable and efficient data storage and retrieval.

  • Access Control & Security: Manage Identity and Access Management (IAM) to enforce appropriate access controls, ensuring secure and compliant access for business users to the data warehouse.

  • Collaboration: Partner with cross-functional teams and business stakeholders to gather data requirements and deliver solutions that align with business objectives.
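
Purely as an illustration of the connector work described above, and not part of the role specification, here is a minimal Python sketch that pulls records from a hypothetical REST endpoint and appends them to a BigQuery table. The endpoint URL, table ID, and payload field names are all assumptions; a production connector would add authentication, retries, and explicit schema management.

    # Illustrative sketch only: endpoint, table, and payload shape are hypothetical.
    import requests
    from google.cloud import bigquery

    API_URL = "https://api.example.com/v1/opportunities"   # hypothetical source system
    TABLE_ID = "my-project.revops.opportunities"           # hypothetical BigQuery table

    def extract() -> list[dict]:
        """Pull all records from the external API, following page tokens."""
        rows, page_token = [], None
        while True:
            resp = requests.get(API_URL, params={"pageToken": page_token}, timeout=30)
            resp.raise_for_status()
            payload = resp.json()
            rows.extend(payload["items"])
            page_token = payload.get("nextPageToken")
            if not page_token:
                return rows

    def load(rows: list[dict]) -> None:
        """Append the extracted rows to BigQuery via a load job."""
        client = bigquery.Client()
        job = client.load_table_from_json(rows, TABLE_ID)  # schema autodetected
        job.result()  # block until the load job completes

    if __name__ == "__main__":
        load(extract())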

What You’ll Bring

  • Extensive Data Engineering Experience: Proven expertise in data engineering, including ETL/ELT processes, data modelling, data warehouse development and support in cloud environments, particularly GCP.

  • Proficiency in GCP Services: In-depth knowledge of GCP services, especially BigQuery, Cloud Functions, Cloud Storage, and IAM, with hands-on experience in managing access controls and optimizing data workflows.

  • Advanced SQL Skills: Advanced proficiency in SQL for data transformation, modelling, and querying, specifically within BigQuery, ensuring high-performance data retrieval and processing.

  • API Integration Expertise: Strong experience in building and maintaining custom API connections to extract data from external systems using Python or Node.js.

  • Data Model Design: Demonstrable experience in designing and implementing data models optimized for analytical queries and reporting, ensuring that data is intuitive, well-structured, and accessible for business analysts.

  • Python for Data Engineering: Experience using Python for data engineering tasks, including proficiency in libraries such as Pandas, with the ability to handle large datasets efficiently (desirable).

Core values required of all Simpro, AroFlo & ClockShark employees:

  • We Are One Team
  • We Are Customer Centric
  • We Are Growth Minded
  • We Are Accountable
  • We Celebrate Success

Simpro, AroFlo and ClockShark are equal opportunity employers with a best-in-class onboarding program and supportive team environments. This means that we want everyone to feel welcome with us and to provide equal opportunities for everyone, regardless of age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion or belief, sex or sexual orientation, or any other non-performance factor.

So, if you'd like to join a fun and progressive organisation where there are opportunities to develop your career, please apply now with your CV/resume.

*Please note: no agencies will be accepted in the recruitment of this role. We would like to take this opportunity to thank all candidates for their applications. Only candidates who meet the criteria above will be contacted for an interview.

Required profile

Experience

Level of experience: Mid-level (2-5 years)

Soft Skills

  • Collaboration
