Offer summary
Qualifications:
- 3+ years of industry experience as a data engineer
- Proficient in Python and SQL

Key responsibilities:
- Build real-time data pipelines for financial documents
- Deploy and monitor mission-critical ETL pipelines
About Rogo:
Rogo will be the biggest financial-services AI company in the world. We're creating a category-defining AI company built on top of foundational AI models like GPT-4.
Exceptional early users: high-paying contracts with the world's largest investment banks, hedge funds, private equity firms, and consultants.
Massive demand: extensive waitlist of firms waiting for deployment.
World-class team: we take talent density very seriously. We like working with incredibly smart, driven people.
Cutting-edge technology: Work directly with the world's most advanced LLMs and retrieval-augmented generation (RAG) to build the future of generative AI and redefine finance.
Top-of-market cash and equity compensation.
Challenges:
We are building systems that can automate the most complex knowledge work in the world, e.g., financial analysis, research, due diligence, and more.
Creating financial research that's worth paying attention to: aggregating, analyzing, and producing insights from real-time information. Say goodbye to equity research.
Dealing with the most sensitive data in the world: client data from the largest financial services companies on earth.
Working past the edge of published AI research: tackling problems beyond the complexity of existing AI benchmarks.
Unsolved product, architectural, and business problems: natural language interfaces; prohibitively expensive model evaluation; massive marginal costs; versioning, training, and segregating models per task and per client; and so on.
As a Data Engineer at Rogo, you will help build out our real-time data pipelines for millions of unstructured financial documents to feed our financial LLM. It's cutting-edge data engineering at the AI frontier.
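For a concrete (purely illustrative) picture of this kind of work, here is a minimal sketch of a document-ingestion DAG. It assumes Airflow 2.4+ (for the schedule parameter), and every task name and body is a hypothetical placeholder, not Rogo's actual pipeline:

# Hypothetical sketch: an hourly extract -> transform -> load DAG using the
# Airflow TaskFlow API. All names and task bodies are placeholders.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def financial_document_etl():
    @task
    def extract() -> list:
        # Stand-in: pull newly published documents from an upstream feed.
        return [{"doc_id": "example-filing", "raw": "  raw text  "}]

    @task
    def transform(docs: list) -> list:
        # Stand-in: parse, clean, and chunk unstructured text for the LLM.
        return [{"doc_id": d["doc_id"], "text": d["raw"].strip()} for d in docs]

    @task
    def load(docs: list) -> None:
        # Stand-in: upsert into a warehouse or index into a search store.
        print(f"loaded {len(docs)} documents")

    load(transform(extract()))

financial_document_etl()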
Hard Requirements:
3+ years of industry experience as a data engineer
Highly proficient with Python and SQL, with an intuitive understanding of multi-threading, multi-processing, asyncio, and other concurrency primitives (see the sketch after this list)
Experience with at least one of: Postgres, Snowflake, or Elasticsearch
Experience deploying and monitoring mission-critical ETL pipelines with large and heterogeneous data sources
Experience working with Apache Airflow
Experience with AWS or another cloud environment
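To make the concurrency requirement above concrete, here is a minimal sketch of the bounded-concurrency pattern with asyncio; the document IDs and sleep-based latency are simulated stand-ins for real network or database I/O:

# Minimal sketch: fetch many documents concurrently while capping how many
# requests are in flight at once. Everything here is simulated.
import asyncio
import random

CONCURRENCY_LIMIT = 10  # cap in-flight fetches to avoid overwhelming a source

async def fetch_document(doc_id: str, sem: asyncio.Semaphore) -> str:
    async with sem:  # at most CONCURRENCY_LIMIT coroutines pass at a time
        await asyncio.sleep(random.uniform(0.01, 0.05))  # stand-in for I/O
        return f"contents of {doc_id}"

async def main() -> None:
    sem = asyncio.Semaphore(CONCURRENCY_LIMIT)
    doc_ids = [f"doc-{i}" for i in range(100)]
    docs = await asyncio.gather(*(fetch_document(d, sem) for d in doc_ids))
    print(f"fetched {len(docs)} documents")

if __name__ == "__main__":
    asyncio.run(main())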
Bonus Requirements:
Experience with a strongly typed language (e.g., Rust)
Experience at a hypergrowth startup
Financial Services work experience
Experience with stream processing
Knowledge of Datadog and other telemetry tooling
Culture:
You have fun solving hard problems: we're tackling tech/product/business problems that are unsolved. It's super exciting.
You like to work hard: we feel lucky to work on these problems, and we enjoy pouring our all into solving them.
You care deeply about talent density: we care deeply about working with people who are super smart and motivated.
You have eclectic interests: whether you're a sci-fi aficionado, history buff, strategy game guru, policy wonk, or movie trivia expert, you'll find kindred spirits here.