
Resilient cloud-native storage is critical for edge AI applications

August 9, 2024 · 2 minute read

For edge AI applications to run successfully, the right cloud-native storage infrastructure must be in place. AI workloads need large volumes of data for model training and inference, which makes low-latency access to that data critical. In this video, Ripul Patel, Technical Director at Rakuten Cloud, discusses the impact of storage on edge AI deployments and how to make software-defined storage resilient.

When applications are deployed, they require both storage and the policies that govern how that storage is protected and managed. For edge deployments, it’s important to streamline the process of assigning storage resources to an application.
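In Kubernetes-based edge stacks, this pairing of storage and policy is typically expressed declaratively as a claim the application makes against a storage class. The minimal sketch below uses the official Kubernetes Python client to illustrate the idea; the namespace, claim name, and storage class (`edge-ai`, `edge-ai-data`, `fast-nvme`) are hypothetical placeholders, not anything specific to Rakuten Cloud-Native Storage.

```python
from kubernetes import client, config

# Connect using the local kubeconfig (use load_incluster_config() inside a cluster).
config.load_kube_config()
core = client.CoreV1Api()

# Declare what the application needs: capacity, access mode, and a storage class
# that carries the policy (performance tier, protection level, and so on).
pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="edge-ai-data"),   # hypothetical claim name
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="fast-nvme",                   # hypothetical policy/class
        resources=client.V1ResourceRequirements(requests={"storage": "200Gi"}),
    ),
)

# Submitting the claim is all the application team has to do; the storage layer
# decides where and how the volume is actually carved out.
core.create_namespaced_persistent_volume_claim(namespace="edge-ai", body=pvc)
```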

Traditionally, this process involves a storage administrator who must carve out the right amount of storage from the available drives to meet the application’s needs. The admin must also assign storage policies and determine how the volume will be protected. All of this involves many manual steps that do not scale to the large number of remote servers typically used in edge compute deployments.

What is needed is a self-service portal for automated storage configuration: the administrator defines the application’s requirements, and the storage software handles configuration and management without further manual involvement. This is what Rakuten Cloud-Native Storage offers. It enables edge compute scalability by managing up to hundreds of thousands of storage volumes with zero manual intervention.
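To make the “zero manual intervention” idea concrete, the sketch below shows the general pattern behind automated provisioning: software watches for new storage claims and fulfills them, so no administrator has to touch each volume. This is an illustration only, not a description of Rakuten’s implementation; in production this role is played by a CSI driver and its external provisioner rather than a script.

```python
from kubernetes import client, config, watch

config.load_kube_config()
core = client.CoreV1Api()

# Watch claims across all namespaces and react to the ones not yet provisioned.
w = watch.Watch()
for event in w.stream(core.list_persistent_volume_claim_for_all_namespaces):
    pvc = event["object"]
    if pvc.status.phase != "Pending":
        continue
    size = pvc.spec.resources.requests["storage"]
    policy = pvc.spec.storage_class_name
    # A real provisioner would create the backing volume here; this sketch only
    # logs the decision to show that no manual step is involved.
    print(f"provision {size} ({policy}) for "
          f"{pvc.metadata.namespace}/{pvc.metadata.name}")
```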

In the video, Patel takes audience questions and discusses the flexibility, scalability, performance, and low resource footprint of Rakuten Cloud-Native Storage for edge AI applications.
