We’re looking for a Machine Learning Engineer to join our team!
If you're excited about applied AI and agentic systems, and you want to build intelligent tools that truly collaborate with people, we'd love to welcome you on board. Join us and help shape the future of AI-powered workplace assistants.
We are developing an AI-powered internal assistant framework that enables intelligent, context-aware interactions across organizational environments. The system is designed to enhance how users access, understand, and act on complex project information by leveraging semantic search, agentic AI, large language and multimodal models (LLMs/LMMs), and MCP (Model Context Protocol).
This assistant framework will serve as a foundation for building task-specific agents that integrate with various enterprise tools and knowledge sources. It aims to support day-to-day operations, reduce cognitive load, and improve information retrieval for both technical and management roles by unifying fragmented knowledge and workflows into a seamless, AI-driven interface. We're building a system that doesn't just respond: it understands, adapts, and collaborates.
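To give a concrete flavour of the semantic-search side of such an assistant, here is a minimal illustrative sketch. The embedding model, the in-memory document list, and the cosine-similarity ranking are assumptions chosen for illustration, not a description of our actual stack:

```python
# Illustrative only: a minimal semantic-search step of the kind a
# retrieval-augmented assistant builds on. Model name and documents are assumed.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "Project Alpha kickoff is scheduled for March 3.",
    "The staging environment is redeployed every night at 02:00.",
]
doc_vectors = encoder.encode(documents)  # embed the knowledge base once

def retrieve(question: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the question (cosine similarity)."""
    q = encoder.encode([question])[0]
    scores = doc_vectors @ q / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q)
    )
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

# The retrieved context would then be passed to an LLM as grounding for its answer.
print(retrieve("When does Project Alpha start?"))
```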
RESPONSIBILITIES
Develop and integrate ML and NLP models to power intelligent assistant features
Build agentic workflows using LangChain, LangGraph, or similar frameworks
Prototype user interfaces and internal tools using Streamlit or Gradio
Collaborate with the engineering and product teams to plan and deliver ML-driven features
Work with Docker to manage development and runtime environments
Use Git for version control and write clean, maintainable code
Query structured data using SQL
Contribute to model deployment and operations in a cloud environment (primarily Azure)
REQUIREMENTS
3+ years of experience in a similar position
Proficient in Python, with experience in building data pipelines and interacting with APIs
Experience with agentic AI frameworks such as LangChain or LangGraph
Solid understanding of machine learning and NLP, especially transformer-based architectures (e.g., BERT, GPT), embeddings, and their use in tasks such as retrieval-augmented generation (RAG), semantic search, and classification
Hands-on experience with ML frameworks (e.g., PyTorch, TensorFlow)
Familiarity with prototyping tools such as Streamlit or Gradio
Experience with engineering best practices: Git, Docker, cloud basics, and task estimation
Working knowledge of SQL
Upper-Intermediate level of English (both written and spoken)
WOULD BE A PLUS
Experience with other agentic or LLM orchestration tools
Experience with MLOps or model deployment
Comfortable working in Linux terminal environments