
MLOps Architect

Remote: Full Remote
Contract:
Experience: Expert & Leadership (>10 years)
Work from: US (EST area)

Offer summary

Qualifications:

  • 12-15 years in machine learning projects
  • Expertise in TensorFlow and PyTorch
  • Experience with the Azure cloud platform
  • Bachelor's degree in Computer Science or a related field
  • Experience with big data tools and databases

Key responsibilities:

  • Design production model deployment pipelines using Azure DevOps
  • Implement microservices frameworks for system efficiency
  • Troubleshoot and improve CI/CD processes
  • Write consensus-building design documents
  • Develop solutions from conception to production in cloud environments
Danta Technologies SME https://www.dantatechnologies.com
51 - 200 Employees

Job description

MLOps Architect
Remote, US (EST area)

- Overall experience of 12-15 years supporting machine learning projects.

- Expertise with Client platforms (e.g., TensorFlow, PyTorch).

- Experience with cloud platforms (e.g. Azure).

- Bachelor's degree in Computer Science, Engineering, or a related field.

- Proven experience as a Data Engineer or in a similar role.

- Experience with big data tools (e.g., Hadoop, Spark) and databases (e.g., SQL, NoSQL).

- Knowledge of machine learning concepts and workflows.

- Strong programming skills (e.g., Python, Java).

- Excellent problem-solving abilities and attention to detail.

- Strong communication skills to effectively collaborate with other teams.

  • Design and optimize pipelines for model deployments in production environments using containers (Docker or Azure Kubernetes Service), Azure DevOps and/or MLOps tooling, and Azure Data Factory.
  • Knowledge of Azure Databricks pipelines and Client model deployment is mandatory (a minimal sketch follows this list).
  • Implement efficient microservices frameworks, e.g., API Management, message brokers, load balancing, etc.
  • Troubleshoot, improve, and scale continuous integration, continuous delivery, and continuous deployment (CI/CD) pipelines.
  • Write design documents to build consensus for new system components and enhancements to existing components.
  • Extensive programming experience in Python and/or R, with knowledge of object-oriented programming.
  • Experience in Azure DevOps and Azure Cloud Services (e.g., Azure Blob Storage, Azure Key Vault, Azure Data Factory), or similar experience with AWS or Google Cloud.
  • Experience with CI/CD pipelines, automated testing, automated deployments, Agile methodologies, and unit and integration testing tools.
  • Demonstrated history of designing solution pipelines from conception to deployment in production environments, e.g., Docker containers on Kubernetes-based platforms with data orchestration in Azure Data Factory.
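For orientation, the following is a minimal, illustrative sketch of the kind of train-and-register step that sits inside such a deployment pipeline. It assumes MLflow (the tracking and model-registry layer commonly used with Azure Databricks) and scikit-learn; the model name, parameters, and data are hypothetical, and a real stage would be triggered from Azure DevOps or orchestrated by Azure Data Factory rather than run by hand.

```python
# Illustrative sketch: assumes MLflow and scikit-learn; the model name,
# parameters, and data are hypothetical stand-ins.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def train_and_register(model_name: str = "demo-classifier") -> None:
    # Stand-in data; a production pipeline would read from a feature store
    # or from a dataset produced by an Azure Data Factory activity.
    X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    with mlflow.start_run():
        model = RandomForestClassifier(n_estimators=100, random_state=42)
        model.fit(X_train, y_train)

        accuracy = accuracy_score(y_test, model.predict(X_test))
        mlflow.log_param("n_estimators", 100)
        mlflow.log_metric("accuracy", accuracy)

        # Registering the model makes it visible to downstream deployment
        # stages (e.g., packaging into a Docker image for AKS).
        mlflow.sklearn.log_model(model, "model", registered_model_name=model_name)


if __name__ == "__main__":
    train_and_register()
```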

Nice to Have:

  • Knowledge of JFrog Artifactory is a plus.
  • Experience with front-end user interface development using web frameworks such as Django or FastAPI (Python), or Java/JavaScript (see the sketch after this list).
  • Experience with in-memory and/or distributed computing frameworks (e.g., Spark, Hadoop).
  • Conduct performance testing on API endpoints and batch jobs to identify and correct CPU and memory bottlenecks.
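As a concrete reference for the FastAPI and API performance-testing items above, here is a minimal sketch of a prediction endpoint. It assumes FastAPI, Pydantic, and PyTorch; the route, payload shape, and toy model are hypothetical.

```python
# Illustrative sketch: assumes FastAPI, Pydantic, and PyTorch; the route,
# payload shape, and toy linear model are hypothetical.
from typing import List

import torch
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="demo-inference-service")

# Placeholder model; a real service would load registered weights at startup.
model = torch.nn.Linear(in_features=4, out_features=1)
model.eval()


class PredictRequest(BaseModel):
    features: List[float]  # this toy model expects exactly 4 values


@app.post("/predict")
def predict(request: PredictRequest) -> dict:
    with torch.no_grad():  # inference only, no gradients needed
        x = torch.tensor(request.features, dtype=torch.float32).unsqueeze(0)
        score = model(x).item()
    return {"score": score}

# Run locally with: uvicorn main:app --reload  (assuming this file is main.py)
```

Load-testing an endpoint like this (for example with Locust or k6) is one way to surface the CPU and memory bottlenecks the last bullet refers to.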

Skill | Mandatory | Proficiency Level (1-5)
Azure DevOps CI/CD | Y | 4-5
Azure Cloud Engineer | Y | 4-5
Databricks Pipelines | Y | 3-4
Familiarity with Client platforms (e.g., TensorFlow, PyTorch) & MLOps | Y | 3
Azure Client | N | 3-4
Knowledge of JFrog Artifactory | N | 2-3 (Good to have)

Required profile

Experience

Level of experience: Expert & Leadership (>10 years)
Spoken language(s): English

Other Skills

  • Communication
  • Detail Oriented
  • Problem Solving
