Nvidia is set to launch a new platform, the EGX Platform, designed to bring real-time artificial intelligence (AI) to edge networks.
The goal is to put AI computing closer to where sensors collect data, before that data is sent on to larger data centers.
Justin Boitano, Senior Director for Enterprise and Edge Computing at Nvidia, said, “AI is required in this data-driven world.” He added, “We analyze data near the source, capture anomalies and report anomalies back to the mothership for analysis.”
Boitano said that the servers will fit in any industry-standard rack, so they will also fit into edge containers from the likes of Vapor IO and Schneider Electric. He added, “We are hitting a crossover where there is more compute at the edge than in the cloud, because more work needs to be done there.”
EGX scales from Nvidia’s low-power Jetson Nano processor all the way up to racks of Nvidia T4 processors, which can deliver more than 10,000 trillion operations per second (TOPS) for real-time voice recognition and other real-time AI tasks.
Nvidia is also working on a software stack, called Nvidia Edge Stack, that can be updated continuously. Because the software runs in containers, no host reboots are required; an update only requires restarting the container.
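To illustrate what a reboot-free update looks like in practice, here is a sketch of rolling out a new container image on a Kubernetes-managed edge node. The deployment name `ai-inference` is hypothetical, and the image tag is only an example; standard `kubectl` commands do the work.

```shell
# Point the deployment at an updated image; Kubernetes replaces
# the running containers without rebooting the host.
kubectl set image deployment/ai-inference \
  inference=nvcr.io/nvidia/tensorrtserver:19.05-py3

# Or simply restart the containers of a deployment in place:
kubectl rollout restart deployment/ai-inference

# Watch the rollout until the new containers are up:
kubectl rollout status deployment/ai-inference
```

Because only the containers are replaced, the node keeps serving other workloads throughout the update.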
EGX runs enterprise-grade Kubernetes container platforms such as Red Hat OpenShift.
Edge Stack bundles Nvidia drivers, a CUDA Kubernetes plugin, a CUDA container runtime, CUDA-X libraries, and containerized AI frameworks and applications, including TensorRT, TensorRT Inference Server and DeepStream.
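The CUDA Kubernetes plugin exposes GPUs to the scheduler as a countable resource. A minimal pod spec sketch, assuming the standard `nvidia.com/gpu` resource name used by Nvidia's Kubernetes device plugin; the pod name and image tag here are examples, not part of the announcement:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: trt-inference            # hypothetical pod name
spec:
  containers:
  - name: inference
    image: nvcr.io/nvidia/tensorrtserver:19.05-py3   # example NGC image
    resources:
      limits:
        nvidia.com/gpu: 1        # request one GPU via the device plugin
```

With a spec like this, Kubernetes only schedules the pod onto a node with a free GPU, which is how a single control plane can manage a mixed fleet of edge boxes.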