Chip maker Intel has added a new feather to its AI hat in the form of two new cloud-based Nervana NNPs (neural network processors) for AI (artificial intelligence). The duo joins the Nervana family of NNPs introduced in August, which includes the NNP-T for training deep learning models of any size and the NNP-I for power-efficient deep learning inference. Naveen Rao, Intel corporate vice president and general manager of the Artificial Intelligence Products Group at Intel, elaborated on Intel's AI strategy and India's importance to it in a telephone conversation with Mint.
N.R.: Our premise is that AI will become one of the dominant computing workloads everywhere. How does our roadmap address that? We segment it a few different ways. We have the Intel Nervana neural network processors in the data center for training and inference, and then we have the Movidius VPU (vision processing unit) at the edge. Movidius is less than 5 watts, inference is between 10 and 40 watts, and training is between 100 and 300 watts. These are the three large segments we are targeting for AI acceleration today. We are also working to bring AI capabilities to our other products, such as the FPGA (field programmable gate array) and the CPU (central processing unit). For the CPU, we have just launched something new.
These are 8-bit vector instructions that are very important for inference capability in the data center or on the PC. This is an example of how we have expanded inference performance by 2-3x across multiple use cases, all the way from laptops and desktops to servers. Besides this, we are also developing a roadmap for graphics. For many years we have had integrated graphics, and we will also develop standalone graphics that will have AI capabilities as well.
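The 8-bit vector instructions Rao refers to speed up inference by doing multiply-accumulate arithmetic on quantized int8 values with a wider integer accumulator. As a rough illustration of the underlying idea (not Intel's actual instruction set), here is a minimal NumPy sketch of symmetric int8 quantization applied to a dot product; the function names and the per-tensor scaling scheme are illustrative assumptions:

```python
import numpy as np

def quantize_int8(x):
    # Symmetric per-tensor quantization: map [-max|x|, max|x|] onto [-127, 127].
    m = float(np.max(np.abs(x)))
    scale = m / 127.0 if m > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def int8_dot(a, b):
    # Multiply in 8-bit, accumulate in a 32-bit integer, then rescale to float.
    qa, sa = quantize_int8(a)
    qb, sb = quantize_int8(b)
    acc = np.dot(qa.astype(np.int32), qb.astype(np.int32))
    return acc * sa * sb

rng = np.random.default_rng(0)
a = rng.standard_normal(1024).astype(np.float32)
b = rng.standard_normal(1024).astype(np.float32)

exact = float(np.dot(a, b))
approx = int8_dot(a, b)
print("float32 dot:", exact)
print("int8 dot:   ", approx)
```

Trading float32 multiplies for int8 ones shrinks memory traffic and packs four times as many values into each vector register, which is broadly where the 2-3x inference gains described above come from, at the cost of a small quantization error visible in the output.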