Gcore, a provider of Edge AI, cloud, network, and security solutions, has launched "Gcore Inference at the Edge." This new service delivers low-latency experiences for AI applications by allowing the distributed deployment of pre-trained machine-learning models to Edge inference nodes.
Gcore Inference at the Edge enables real-time AI inference on high-performance nodes equipped with NVIDIA L40S GPUs. The service offers response times under 30 ms, model autoscaling, built-in DDoS protection, and compliance with data privacy and security standards such as GDPR, PCI DSS, and ISO/IEC 27001.
Founded in 2014, Gcore is a Luxembourg-based provider of cloud and Edge solutions. It offers network infrastructure that includes content delivery, hosting, and security services, serves diverse industries, and ensures optimal digital experiences for users worldwide.