AI Intern

Remote: Full Remote

Offer summary

Qualifications:

  • 3rd-year student pursuing a degree in a relevant field such as Computer Science or Data Science.
  • Proficiency in Python programming and understanding of Machine Learning concepts.
  • Familiarity with frameworks like PyTorch and TensorFlow, and version control with Git.
  • Knowledge of cloud platforms and distributed data processing frameworks is a plus.

Key responsibilities:

  • Design and build AI solutions to detect anomalies in data and provide real-time alerts.
  • Collaborate with data engineers to implement scalable AI-driven solutions and optimize model performance.
  • Develop predictive models for resource allocation and failure prediction in data pipelines.
  • Create a Flask-based web application to monitor the status of key data products.

Globalization Partners (https://www.g-p.com)
1001 - 5000 employees

Job description

About Us

Our leading SaaS-based Global Growth Platform™ enables clients to expand into over 180 countries quickly and efficiently, without the complexities of establishing local entities. At G-P, we’re dedicated to breaking down barriers to global business and creating opportunities for everyone, everywhere.

Our diverse, remote-first teams are essential to our success. We empower our Dream Team members with flexibility and resources, fostering an environment where innovation thrives and every contribution is valued and celebrated.

The work you do here will positively impact lives around the world. We stand by our promise: Opportunity Made Possible. In addition to competitive compensation and benefits, we invite you to join us in expanding your skills and helping to reshape the future of work.

At G-P, we assist organizations in building exceptional global teams in days, not months—streamlining the hiring, onboarding, and management process to unlock growth potential for all.

We are looking for a 3rd-year student to join as an AI Intern and work on cutting-edge AI-driven solutions for monitoring, optimizing, and securing our data platform. This internship provides an opportunity to apply AI/ML techniques to real-world Big Data and cloud-based challenges. You'll work at the intersection of artificial intelligence, data engineering, and distributed systems.

Key Responsibilities:

  • Design and build solutions leveraging AI to detect anomalies and deviations in data and provide real-time alerts to enable quick responses and mitigate risks. For example, automatically identify and flag potential occurrences of sensitive information in plain text format within diverse datasets. Work closely with data engineers and analysts to implement scalable AI-driven solutions, optimize model performance, and enhance data quality monitoring.
  • Leverage machine learning and AI techniques to forecast query execution time, considering factors such as query complexity, data volume, and system load, leading to improved query scheduling and prioritization in large-scale data platforms. This opportunity allows the application of advanced AI methodologies to real-world data challenges, driving performance optimization in modern data ecosystems.
  • Use ML models to recommend optimal resource allocation for data pipelines based on past usage trends. By analyzing historical usage patterns and current system states, the system anticipates future resource needs and optimizes allocation decisions. Integrate these recommendations to improve pipeline efficiency, cost-effectiveness, and scalability.
  • Develop a failure prediction model using historical pipeline failure patterns to proactively mitigate issues. This proactive monitoring solution continuously analyzes performance metrics, log data, and system events to detect anomalies that signal potential failures. Integrate the resulting predictions to enable proactive issue resolution and improve pipeline reliability.
  • Build a Flask-based web application within Databricks to display the real-time status of key data products. This application will provide a centralized dashboard with visual indicators (e.g., green/red status, last refresh time) to monitor data pipeline health and freshness.
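The first responsibility above mentions automatically flagging potential sensitive information in plain-text datasets. A minimal sketch of that idea is shown below; the pattern names and regexes are illustrative assumptions, not G-P's actual detection rules, and a production system would likely combine regexes with ML/NER models:

```python
import re

# Hypothetical patterns for common sensitive-data formats (assumed for
# illustration; a real detector would cover many more cases).
SENSITIVE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def flag_sensitive(records):
    """Return (record_index, pattern_name) pairs for rows that look sensitive."""
    hits = []
    for i, text in enumerate(records):
        for name, pattern in SENSITIVE_PATTERNS.items():
            if pattern.search(text):
                hits.append((i, name))
    return hits

rows = ["order #42 shipped", "contact: jane.doe@example.com", "ssn 123-45-6789"]
print(flag_sensitive(rows))  # [(1, 'email'), (2, 'us_ssn')]
```

Each flagged pair identifies which record tripped which pattern, which is exactly the kind of structured signal a real-time alerting pipeline can consume.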
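The failure-prediction bullet depends on spotting anomalous behavior in historical pipeline metrics. One common baseline (sketched here under assumed metric names and thresholds, not the team's actual model) is a rolling z-score over run durations:

```python
from statistics import mean, stdev

def anomalous_runs(durations, window=5, threshold=3.0):
    """Flag run indices whose duration deviates sharply from the trailing window.

    A run more than `threshold` standard deviations above the mean of the
    previous `window` runs is treated as a potential failure precursor.
    """
    flagged = []
    for i in range(window, len(durations)):
        history = durations[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (durations[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Typical runs take ~100s; run 7 suddenly takes 400s.
runs = [101, 99, 102, 100, 98, 101, 100, 400, 102]
print(anomalous_runs(runs))  # [7]
```

A learned model (e.g., gradient boosting over log features and system events, as the bullet suggests) would replace this statistical rule, but the rolling baseline is a useful sanity check and fallback.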
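The last bullet describes a Flask dashboard with green/red indicators and last-refresh times. A minimal sketch follows; the product names and the in-memory store are assumptions for illustration, and inside Databricks this would read real pipeline metadata instead:

```python
from datetime import datetime, timezone
from flask import Flask, render_template_string

app = Flask(__name__)

# Assumed in-memory stand-in for real data-product metadata.
DATA_PRODUCTS = {
    "orders_daily": {"healthy": True,
                     "last_refresh": datetime(2024, 5, 1, tzinfo=timezone.utc)},
    "customer_360": {"healthy": False,
                     "last_refresh": datetime(2024, 4, 30, tzinfo=timezone.utc)},
}

PAGE = """
<h1>Data product status</h1>
<ul>
{% for name, info in products.items() %}
  <li style="color: {{ 'green' if info.healthy else 'red' }}">
    {{ name }} (last refresh: {{ info.last_refresh.isoformat() }})
  </li>
{% endfor %}
</ul>
"""

@app.route("/")
def dashboard():
    # Render one colored list item per data product.
    return render_template_string(PAGE, products=DATA_PRODUCTS)

if __name__ == "__main__":
    app.run(port=8080)
```

Serving the page from a single route keeps the dashboard easy to host in a notebook-attached environment; freshness checks (e.g., flipping `healthy` when `last_refresh` is stale) would be computed upstream.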

Required Skills:

  • Programming proficiency in Python
  • Understanding of Machine Learning concepts and frameworks (PyTorch, TensorFlow)
  • Familiarity with LLMs and generative AI is a plus
  • Version control with Git

Good to have:

  • Familiarity with cloud platforms (e.g., AWS S3, EC2)
  • Experience with distributed data processing frameworks (e.g., Spark)

We will consider for employment all qualified applicants, including those with arrest records, conviction records, or other criminal histories, in a manner consistent with the requirements of any applicable state and local laws, including the National Vetting Bureau (Children and Vulnerable Persons) Act 2012, the Private Security Services Act 2004, and the Criminal Justice (Spent Convictions and Certain Disclosures) Act 2016. 

G-P. Global Made Possible.

G-P is a proud Equal Opportunity Employer, and we are committed to building and maintaining a diverse, equitable and inclusive culture that celebrates authenticity. We prohibit discrimination and harassment against employees or applicants on the basis of race, color, creed, religion, national origin, ancestry, citizenship status, age, sex or gender (including pregnancy, childbirth, and pregnancy-related conditions), gender identity or expression (including transgender status), sexual orientation, marital status, military service and veteran status, physical or mental disability, genetic information, or any other legally protected status.

G-P also is committed to providing reasonable accommodations to individuals with disabilities. If you need an accommodation due to a disability during the interview process, please contact us at careers@g-p.com.

Required profile

Spoken language(s): English