1
2022 AI/ML Workloads in Containers 6 Key Facts
2
Introduction
Before IT leaders and their teams begin to dig
into the nitty-gritty technical aspects of
containerizing AI/ML workloads, some principles
are worth thinking about up front. Here are six
essentials to consider.
3
Table of Contents
  • AI/ML workloads represent workflows
  • The benefits are similar to other containerized workloads
  • Teams need to be aligned
  • The "pay attention" points don't really change
  • Containers won't fix all underlying issues
  • Be smart about build vs. buy

4
AI/ML Workloads Represent Workflows
1.
5
AI/ML Workloads Represent Workflows
  • "Data gets gathered, cleaned, and processed," Haff says. "Then, the work continues: Now it's time to train a model, tuning parameters based on a set of training data. After model training, the next step of the workflow is deploying to production. Finally, data scientists need to monitor the performance of models in production, tracking prediction and performance metrics."
  • "Traditionally, this workflow might have involved two or three handoffs to different individuals using different environments," Haff says. However, a container platform-based workflow enables the sort of self-service that increasingly allows data scientists to take responsibility for both developing models and integrating them into applications. (A minimal sketch of the train-and-export step follows this list.)
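
A minimal sketch of the train-and-export step described above, as it might run as a single container command. It assumes scikit-learn and joblib; the bundled iris dataset stands in for real, already-cleaned data, and MODEL_DIR is an illustrative environment variable, not anything from the slides.

# train.py - illustrative "train, evaluate, export" step of the workflow
import os
from pathlib import Path

import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def main() -> None:
    # Stand-in for data that has already been gathered, cleaned, and processed.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Train the model, tuning parameters against the training split.
    model = LogisticRegression(max_iter=200)
    model.fit(X_train, y_train)

    # A metric that later production monitoring can be compared against.
    print(f"holdout accuracy: {model.score(X_test, y_test):.3f}")

    # Export the artifact so a separate serving container can deploy it.
    out_dir = Path(os.environ.get("MODEL_DIR", "."))  # e.g. a mounted /models volume
    joblib.dump(model, out_dir / "model.joblib")

if __name__ == "__main__":
    main()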

6
The Benefits are Similar to Other Containerized Workloads
2.
7
The Benefits are Similar to Other Containerized Workloads
  • Nauman Mustafa, head of AI/ML at Autify, sees three overarching benefits of containerization in the context of AI/ML workflows:
  • Modularity: It makes important components of the workflow, such as model training and deployment, more modular. This is similar to how containerization can enable more modular architectures, namely microservices, in the broader world of software development. (A sketch of stage-per-entrypoint modularity follows this list.)
  • Speed: "Containerization accelerates the development/deployment and release cycle," Mustafa says. (We'll get back to speed in a moment.)
  • People management: Containerization also makes it easier to manage teams by reducing cross-team dependencies, Mustafa says. As in other IT arenas, containerization can help cut down on the "hand off and forget" mindset as work moves from one functional group to another.
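
Below is a small sketch, not from the presentation, of what that modularity can look like in practice: each workflow stage is a separate entrypoint in one script, so each stage can be packaged and released as its own container command. The stage names and bodies are placeholders.

# pipeline.py - one entrypoint per workflow stage, so stages stay modular
import argparse

def train() -> None:
    print("training the model ...")        # placeholder for real training code

def deploy() -> None:
    print("deploying the model ...")       # placeholder for pushing to serving

def monitor() -> None:
    print("tracking prediction and performance metrics ...")

STAGES = {"train": train, "deploy": deploy, "monitor": monitor}

def main() -> None:
    parser = argparse.ArgumentParser(description="Run one stage of the ML workflow")
    parser.add_argument("stage", choices=sorted(STAGES))
    args = parser.parse_args()
    STAGES[args.stage]()    # e.g. `python pipeline.py train` as the container command

if __name__ == "__main__":
    main()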

8
Teams Need to be Aligned
3.
9
Teams Need to be Aligned
  • Make sure everyone involved in building and operating machine learning workloads in a containerized environment is on the same page, says Frank from ISG. Operations engineers may be familiar with running Kubernetes but may not understand the specific needs of data science workloads. At the same time, data scientists are familiar with the process of building and deploying machine learning models but may require additional help when moving them to containers or operating them going forward.
  • "In a world where repeatability of results is critical, organizations can use containers to democratize access to AI/ML technology and allow data scientists to share and replicate experiments with ease, all while being compliant with the latest IT and InfoSec standards," says Sherard Griffin, director of global software engineering at Red Hat. (A sketch of recording run metadata for replication follows this list.)
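
One way to make experiments easier to share and replicate, sketched below under assumptions not in the presentation: write the container image reference, the parameters, and the resulting metrics to a small JSON record alongside each run. The IMAGE_REF variable and the field names are illustrative.

# run_record.py - record enough metadata that a teammate can replicate a run
import json
import os
from datetime import datetime, timezone

def record_run(params: dict, metrics: dict, path: str = "run.json") -> None:
    record = {
        # The image reference (ideally a digest) pins the full software environment.
        "image": os.environ.get("IMAGE_REF", "unknown"),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "params": params,
        "metrics": metrics,
    }
    with open(path, "w") as f:
        json.dump(record, f, indent=2)

if __name__ == "__main__":
    record_run({"max_iter": 200, "random_state": 0}, {"holdout_accuracy": 0.973})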

10
The "Pay Attention" Points Dont Really Change
4.
11
The "Pay Attention" Points Dont Really Change
  • Here are three examples of operational
    requirements that youll need to pay attention
    to, just like with other containerized
    applications
  • Resource allocation Mustafa notes that proper
    resource allocation remains critical to
    optimizing cost and performance over time.
    Provision too much and youre wasting resources
    (and money) over time too little and youre
    setting yourself up for performance problems.
  • Observability Just because you cant see a
    problem does not render it out of existence.
    Ensure that you have the necessary observability
    software in place to understand how your
    multi-container applications behave, Frank says.
  • Security From a security point of view,
    launching AI/ML solutions is no different from
    launching other solutions in containers,
    Alexandra Murzina, ML engineer at Positive
    Technologies. That means tactics such as applying
    the principle of least privilege (both to people
    and the containers themselves), using only
    trusted, verified container images, runtime
    vulnerability scanning, and other security layers
    should remain top of mind.
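
A minimal sketch of the resource-allocation and security points in Kubernetes terms, using the official kubernetes Python client. The image name, namespace, labels, and resource sizes are assumptions for illustration, and the GPU limit presumes the NVIDIA device plugin is available on the cluster.

# pod_spec.py - explicit resource allocation and a least-privilege security context
from kubernetes import client

def training_pod() -> client.V1Pod:
    container = client.V1Container(
        name="trainer",
        image="registry.example.com/ml/train:1.0",  # use only trusted, verified images
        command=["python", "train.py"],
        # Resource allocation: request what the job needs, cap what it may consume.
        resources=client.V1ResourceRequirements(
            requests={"cpu": "2", "memory": "8Gi"},
            limits={"cpu": "4", "memory": "16Gi", "nvidia.com/gpu": "1"},
        ),
        # Least privilege for the container itself.
        security_context=client.V1SecurityContext(
            run_as_non_root=True,
            allow_privilege_escalation=False,
            capabilities=client.V1Capabilities(drop=["ALL"]),
        ),
    )
    return client.V1Pod(
        metadata=client.V1ObjectMeta(name="train-job", labels={"app": "ml-training"}),
        spec=client.V1PodSpec(containers=[container], restart_policy="Never"),
    )

# Submitting it would look roughly like:
#   from kubernetes import config
#   config.load_kube_config()
#   client.CoreV1Api().create_namespaced_pod(namespace="ml-team", body=training_pod())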

12
Containers Won't Fix All Underlying Issues
5.
13
Containers Won't Fix All Underlying Issues
  • Just as automation won't improve a flawed process (it just helps that flawed process run faster and more frequently), containerization is not going to address fundamental problems with your AI/ML workloads.
  • If you're baking bias into your ML models, for example, running them in containers will do nothing to address that potentially serious issue.
  • "Containers are very beneficial for running AI/ML workloads," says Raghu Kishore Vempati, director of technology at Capgemini Engineering. "But containerizing AI/ML workloads alone doesn't make the model more efficient. It only provides a way to accelerate the productivity associated with training the models and inferring on them."

14
Be Smart About Build vs. Buy
6.
15
Be Smart About Build vs. Buy
  • As with most technical choices, there's a "should we or shouldn't we?" decision in terms of containerizing AI/ML workloads. Also like most important technical choices, nothing comes free.
  • "There is a cost associated with containerizing machine learning workflows, which may not be justified for tiny teams, but for large teams the benefits outweigh the cost," Mustafa from Autify says.
  • IT leaders and their teams should do it with clear goals or reasons in mind; "just because we can" shouldn't be the only reason on your list.
  • "Don't overcomplicate an already complex situation," Frank says. "Make sure that containerizing ML workloads will provide business value beyond the intellectual exercise."
  • Source: The Enterprisers Project

16
  • Next-Gen Tech Services
  • Mobile Application Development
  • Cloud Computing Services
  • Quality Assurance
  • Digital Marketing
  • Visit www.wecode-inc.com
  • Email: sales@wecode-inc.com