
Engineering Intern, Streaming Observability


Job description

Company

At Actian we believe data should be a competitive advantage. Through the deployment of data technology, underpinned by a relentless and trusted service commitment, we help business-critical systems transact and integrate at their very best. As a trusted leader in data management, integration, and analytics, our mission is to help businesses unlock the full potential of their data to drive better decision-making and innovation wherever it resides: in the cloud, on-premises, or in hybrid environments.

With a global team of experts and a culture of innovation, we’re dedicated to helping our customers solve their most complex data challenges.

Internship Overview

We are looking for interns to join us for our 2026 Summer Internship Program! This 12-week program is set to begin June 8th, so if you are looking for an incredible opportunity to partner with the best and brightest minds in the industry, apply today. This program has been designed with our interns in mind and includes structured learning plans, a dedicated buddy, and a focused capstone project that you will have the opportunity to present in our Internship Showcase!

 

What It’s Like Interning with Us!

  • Intern Events: just because the internship is remote doesn't mean we don't have time for fun! Regular intern events will be hosted throughout your 12 weeks with us.
  • Time with Executives: interns get the chance to connect with our executive team through panel discussions, 1:1s, Q&A meetings, and events.
  • Workshops: interns participate in workshops geared toward new professionals.
  • Opportunity to Travel: we will fly you out for onsite orientation at our Austin, Texas office!

Position Overview

We are looking for a forward-thinking Engineering Intern, Streaming Observability to bridge the gap between "data in motion" and "data understood." Our Integration Manager requires a sophisticated observability layer to monitor streaming data flowing through Kafka I/O via the Apache Beam unified programming model.

In this role, you won't just be writing code; you will be pioneering an AI-First development workflow. You will be expected to leverage Generative AI tools like Claude and the GSD (Get Stuff Done) framework to accelerate every phase of your project. From researching architectural patterns to planning, implementing, and automatically testing your Proof of Concept (POC), you will use AI to handle the boilerplate and syntax, allowing you to focus on high-level system design and robust technical documentation.


Responsibilities:
  • Design & Build: Develop the logic to intercept and observe streaming data metrics as it passes through Kafka I/O transforms.
  • Work with Modern Tech: Use Apache Beam to create portable data processing pipelines that handle massive scale.
  • Integrate: Work closely with our core engineering team to hook these observations into our Integration Manager dashboard.
  • Test & Validate: Ensure that the monitoring doesn't "break the pipes"—maintaining high throughput while gathering insights.
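To make the "intercept and observe without breaking the pipes" idea concrete, here is a minimal pure-Python sketch of a pass-through observer. The names (`StreamMetrics`, `observe`) are illustrative, not part of Beam's API; in a real pipeline this logic would typically live in a Beam `DoFn` that reports via Beam's metrics counters rather than a local object.

```python
import time
from dataclasses import dataclass, field

@dataclass
class StreamMetrics:
    """Illustrative metrics accumulator for an observed stream."""
    records: int = 0
    bytes_seen: int = 0
    started: float = field(default_factory=time.monotonic)

    def throughput(self) -> float:
        """Records per second since observation began."""
        elapsed = max(time.monotonic() - self.started, 1e-9)
        return self.records / elapsed

def observe(records, metrics: StreamMetrics):
    """Yield every record unchanged, updating metrics as a side effect.

    The stream's contents are never modified or dropped, which is the
    property the monitoring layer must preserve.
    """
    for record in records:
        metrics.records += 1
        metrics.bytes_seen += len(record)
        yield record
```

Because `observe` is a generator that yields each element untouched, the downstream consumer sees exactly the bytes the producer sent; only the side-channel metrics change.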

Nice to Haves:
  • Java or Python: strong proficiency in at least one of these (as they are the primary languages for Beam).
  • Distributed Systems Concepts: you understand the basics of how data moves from point A to point B in a network.
  • The "Data Mindset": you know what a "Producer" and a "Consumer" are in the context of messaging (bonus points if you’ve touched Kafka before).
  • Previous experience with Apache Beam or Google Cloud Dataflow.
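If the producer/consumer vocabulary is new to you, here is a toy stdlib sketch of the pattern, with a `queue.Queue` standing in for a Kafka topic (Kafka itself is a distributed log, not an in-process queue, so this is only an analogy):

```python
import queue
import threading

topic = queue.Queue()  # stands in for a Kafka topic

def producer():
    """Publishes a few events, then a sentinel marking end of stream."""
    for i in range(3):
        topic.put(f"event-{i}")
    topic.put(None)

def consumer(received):
    """Reads events until it sees the end-of-stream sentinel."""
    while (msg := topic.get()) is not None:
        received.append(msg)

received = []
threads = [threading.Thread(target=producer),
           threading.Thread(target=consumer, args=(received,))]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The producer never knows who consumes its events, and the consumer never knows who produced them; the topic decouples the two, which is the core idea behind Kafka-style messaging.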

Requirements:
  • Must be actively enrolled in a college degree program.
  • Must be legally authorized to work in the United States.

We value diversity at our company. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or any other applicable legally protected characteristic in the location in which the candidate is applying.
