AI Developer

Remote: Full Remote

Offer summary

Qualifications:

  • 5+ years of experience in Python development with a focus on AI/ML.
  • Proficiency in AI/ML frameworks such as TensorFlow, PyTorch, and scikit-learn.
  • Experience with AWS services including AWS Bedrock, S3, and Lambda.
  • Strong knowledge of relational and NoSQL databases such as PostgreSQL and MongoDB.

Key responsibilities:

  • Develop, train, and deploy machine learning models using various frameworks.
  • Design and implement RESTful APIs for AI model integration.
  • Manage cloud infrastructure and deploy AI solutions on AWS.
  • Collaborate with teams to integrate AI capabilities and document processes.

Talpro - Leaders in Technology Hiring · Human Resources, Staffing & Recruiting · SME · https://www.talproindia.com/
51 - 200 Employees

Job description

This is a remote position.

Job Description: AI Developer (5+ Years Experience)


Mandatory Skills:


  • Python Development with AI/ML

  • TensorFlow, PyTorch, scikit-learn

  • NLP, LLMs, RAG Systems, Computer Vision

  • AWS Bedrock & AWS Services (S3, Lambda, SageMaker, ECS/EKS, IAM)

  • RESTful APIs (Flask/FastAPI)

  • Docker & Kubernetes

  • Data Engineering (ETL, Apache Airflow/AWS Glue/Spark)

  • Relational & NoSQL databases (PostgreSQL, MySQL, DynamoDB, MongoDB)



Secondary or Good to Have Skills:


  • CI/CD Pipelines and DevOps

  • Project management tools (Jira/Trello)

  • Cloud infrastructure optimization and security



Years of Experience: 5+ Years

Role Type: Permanent (Talpro)

CTC Offered: 12 LPA

Notice Period: Immediate

Work Mode: Permanent Remote




Job Summary:

We are seeking a proficient AI Developer to join our remote team, bringing extensive experience in Python-based AI/ML development, cloud computing (AWS Bedrock), and backend development. The role demands expertise in deploying robust, scalable AI solutions, optimizing machine learning models, and managing data pipelines efficiently. Strong analytical and collaborative skills, combined with an ability to clearly communicate technical concepts, are essential.




Job Responsibilities:


1. AI & Machine Learning Development:


  • Develop, train, fine-tune, and deploy sophisticated ML models using TensorFlow, PyTorch, scikit-learn, and NumPy.

  • Work extensively with NLP, Large Language Models (LLMs), Retrieval-Augmented Generation (RAG), and computer vision technologies.

  • Integrate and optimize pre-trained models provided by AWS Bedrock for scalability and efficiency.



2. Backend Development & API Integration:


  • Design and implement RESTful APIs with Flask or FastAPI for AI model deployment and integration.

  • Develop robust microservices architectures ensuring security, efficiency, and scalability of backend services.



3. Cloud & DevOps:


  • Deploy AI solutions on AWS Bedrock, integrating seamlessly with AWS ecosystem services (S3, Lambda, SageMaker, ECS/EKS, IAM).

  • Manage cloud infrastructure, ensuring optimized costs, performance, and security.

  • Implement containerized solutions with Docker and Kubernetes, along with CI/CD practices for rapid and reliable deployments.



4. Database & Data Engineering:


  • Design, query, and optimize databases (PostgreSQL, MySQL, DynamoDB, MongoDB).

  • Develop automated data pipelines using tools like Apache Airflow, AWS Glue, or Spark.

  • Manage robust ETL processes tailored for AI applications.



5. Collaboration & Documentation:


  • Closely collaborate with data scientists, engineers, and product teams to integrate AI capabilities.

  • Prepare comprehensive documentation on AI models, system architectures, and deployment processes.

  • Clearly communicate complex AI concepts to non-technical stakeholders.





Essential Requirements:


✅ Programming & AI Development:


  • Minimum 5 years’ experience in Python-centric AI/ML development.

  • Proven expertise in frameworks: TensorFlow, PyTorch, scikit-learn.

  • Solid experience with neural networks, NLP techniques, LLMs, RAG, and computer vision.



✅ Backend & API Development:


  • Demonstrated proficiency developing RESTful APIs using Flask or FastAPI.

  • Strong understanding of microservices architecture, OOP, and functional programming.



✅ Cloud & DevOps:


  • Significant hands-on experience with AWS Bedrock and related AWS services (S3, Lambda, SageMaker, ECS/EKS, IAM).

  • Practical skills in containerization (Docker) and orchestration (Kubernetes).

  • Familiarity with CI/CD workflows and DevOps methodologies.



✅ Data Engineering & Databases:


  • Expertise in relational (PostgreSQL, MySQL) and NoSQL (DynamoDB, MongoDB) database systems.

  • Extensive knowledge of ETL practices, Apache Airflow, AWS Glue, or Spark-based data pipelines.



✅ Version Control & Collaboration:


  • Proficient with Git, collaborative coding practices, and code versioning workflows.

  • Experience using project management and collaboration tools (e.g., Jira, Trello).





Soft Skills:


  • Strong analytical problem-solving capabilities.

  • Excellent verbal and written communication skills.

  • Team-oriented, with proven collaborative abilities.

  • Adaptable and proactive learner, keen on adopting emerging AI technologies.








Salary: 12 LPA (CTC)

Contact: bhaskar@talproindia.com

Required profile

Experience

Industry: Human Resources, Staffing & Recruiting
Spoken language(s): English

Other Skills

  • Adaptability
  • Teamwork
  • Communication
  • Problem Solving
