Thursday, September 13, 2018

New NVIDIA Data Center Inference Platform to Fuel Next Wave of AI-Powered Services

Fueling the growth of AI services worldwide, NVIDIA today launched an AI data center platform that delivers the industry’s most advanced inference acceleration for voice, video, image and recommendation services. The NVIDIA TensorRT™ Hyperscale Inference Platform features NVIDIA® Tesla® T4 GPUs based on the company’s breakthrough NVIDIA Turing™ architecture and a comprehensive set of new inference software. Delivering high throughput at low latency for end-to-end applications, the platform enables hyperscale data centers to offer new services, such as enhanced natural language interactions and direct answers to search queries rather than a list of possible results.