
Data Engineer

extra holidays - extra parental leave - work from home - fully flexible
Remote: Full Remote
Experience: Senior (5-10 years)
Offer summary

Qualifications:

  • 5+ years of experience in Data Engineering.
  • Bachelor's or advanced degree in a technical field.
  • Experience with GCP and BigQuery required.
  • Expert knowledge of SQL and data architectures.

Key responsibilities:

  • Designing and launching reliable data pipelines.
  • Partnering with teams to develop internal data products.
Confluent (http://confluent.io/)
1001 - 5000 employees

Job description

Position at Confluent India Private Limited

With Confluent, organisations can harness the full power of continuously flowing data to innovate and win in the modern digital world. We have a purpose that drives us to do better every day: we're creating an entirely new category within data infrastructure - data streaming. This technology will allow every organisation to create experiences and use the power of data in ways that profoundly impact the way we all live. This impact is our purpose and drives us to do better every day.

One Confluent. One team. One Data Streaming Platform.

Data Connects Us.

About the Role:

The mission of the Data Engineering team at Confluent is to serve as the central nervous system of all things data for the company: we build analytics infrastructure, insights, models, and tools to empower data-driven thinking and optimize every part of the business. This position offers limitless opportunities for an ambitious data engineer to make an immediate and meaningful impact within a hyper-growth start-up, and to contribute to a highly engaged open source community.
This is a partnership-heavy role. As a member of the Data team, you will enable every function of the company to be data-driven. As a Data Engineer, you will take on big data challenges in an agile way. You will build data pipelines that make data accessible to data scientists, analytics and operations teams, executives, and the entire company. You will also build data models that deliver insightful analytics while maintaining the highest standards of data integrity. You are encouraged to think outside the box, experiment with the latest technologies, and explore their limits. Successful candidates have strong technical capabilities, a can-do attitude, and a highly collaborative working style.

What You Will Do:

  • Designing, building, and launching highly efficient and reliable data pipelines to move data across a number of platforms, including the data warehouse and real-time systems.
  • Developing strong subject matter expertise and managing the SLAs for those pipelines.
  • Assessing options and opportunities in order to provide recommendations to business partners and stakeholders.
  • Partnering with data scientists and business partners, such as Business Intelligence, analytics teams, and system administrators, to develop internal data products that improve operational efficiency across the organization.
  • Here are some examples of our work:
     Data Pipelines - Create new pipelines or enhance existing ones using SQL & Python.
     Data Quality and Anomaly Detection - Improve existing tools to detect anomalies in real time and through offline metrics.
     Data Modeling - Partner with analytics consumers to improve existing datasets and build new ones.
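As a purely illustrative sketch of the anomaly-detection work described above (this is a hypothetical example, not Confluent's actual tooling), a simple real-time check over a pipeline metric can be built with a rolling z-score: flag any new value that deviates too far from the recent history.

```python
from collections import deque
from statistics import mean, stdev

def zscore_anomalies(values, window=20, threshold=3.0):
    """Return indices of points deviating more than `threshold`
    standard deviations from the rolling stats of the last
    `window` points. A hypothetical sketch, not production code."""
    history = deque(maxlen=window)  # rolling window of recent values
    anomalies = []
    for i, v in enumerate(values):
        if len(history) >= 2:  # stdev needs at least two samples
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(v - mu) / sigma > threshold:
                anomalies.append(i)
        history.append(v)  # the flagged point still enters the window
    return anomalies

# A steady metric around 10-12, then a spike to 100:
vals = [10, 11, 10, 12, 11] * 6 + [100]
print(zscore_anomalies(vals))  # the spike at index 30 is flagged
```

In practice the same idea would run as a scheduled task against warehouse tables or as a streaming consumer; the threshold and window are tuning knobs that trade sensitivity for false alarms.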

What You Will Bring:

  • 5+ years of experience in a Data Engineering role, with a focus on data warehouse technologies, data pipelines, and BI tooling.
  • Bachelor's or advanced degree in Computer Science, Mathematics, Statistics, Engineering, or a related technical discipline.
  • Experience with GCP and BigQuery is mandatory.
  • Experience with ETL pipeline tools like Airflow, and with code version control systems like Git.
  • Expert knowledge of SQL and of relational database systems and concepts.
  • Strong knowledge of data architectures, data modeling, and data infrastructure ecosystems.
  • Experience working in an Agile team environment.
  • The ability to communicate cross-functionally, derive requirements and architect shared datasets; the ability to synthesize, simplify and explain complex problems to different types of audiences, including executives.
  • The ability to thrive in a dynamic environment. That means being flexible and willing to jump in and do whatever it takes to be successful.

What Gives You an Edge:

  • Experience with Apache Kafka
  • Experience with GTM & Sales systems (e.g., Salesforce, NetSuite)
  • Knowledge of batch and streaming data architectures
  • Product mindset to understand business needs, and come up with scalable solutions


Come As You Are

At Confluent, equality is a core tenet of our culture. We are committed to building an inclusive global team that represents a variety of backgrounds, perspectives, beliefs, and experiences. The more diverse we are, the richer our community and the broader our impact. Employment decisions are made on the basis of job-related criteria without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other classification protected by applicable law.

Review our Candidate Privacy Notice, which describes how and when Confluent, Inc. and its group companies collect, use, and share certain personal information of California job applicants and prospective employees.
#LI-Remote

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s):
English

Other Skills

  • Collaboration
  • Communication
  • Problem Solving
