Intel on Tuesday took the wraps off Ice Lake, its new 3rd Gen Intel Xeon Scalable processor for data centres. The CPU, built on a 10-nanometre process, offers up to 40 cores per processor and delivers, in Intel’s words, an average 46% performance gain on common data-centre workloads over the previous generation. Intel also claims it is the only data-centre processor with built-in AI acceleration.
Ice Lake is designed to power the industry’s broadest range of workloads, from the cloud to the network to the edge. More than 50 OEMs are launching more than 250 server designs based on the new platform, and, Intel said, Ice Lake will also be deployed by the world’s top-tier cloud service providers, highlighting its adoption across all market segments.
“Our 3rd Gen Intel Xeon Scalable platform is the most flexible and performant in our history, designed to handle the diversity of workloads from the cloud to the network to the edge,” Navin Shenoy, EVP and GM of Intel’s Data Platforms Group, said in a statement. “Intel is uniquely positioned with the architecture, design and manufacturing to deliver the breadth of intelligent silicon and solutions our customers demand.”
In addition to the near-50% increase in performance, the new chip includes Intel’s SGX (Software Guard Extensions) to “protect sensitive code and data with the smallest potential attack surface within the system.” The technology can isolate and process up to 1 terabyte of code and data in private memory regions that Intel calls enclaves. Ice Lake adds further security features, including Total Memory Encryption and Platform Firmware Resilience, which together “address today’s most pressing data protection concerns.” The chip also features cryptographic acceleration for organisations that run encryption-intensive workloads.
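Whether a given machine exposes SGX at all can be checked before any enclave software is deployed. The sketch below is a minimal, hypothetical helper (not part of any Intel SDK) that looks for the `sgx` CPU flag in Linux’s `/proc/cpuinfo`; the file path parameter exists only so the check can be exercised against sample data.

```python
def cpu_supports_sgx(cpuinfo_path="/proc/cpuinfo"):
    """Return True if the 'sgx' flag appears in the CPU flags list.

    Illustrative helper only: assumes a Linux host, where each CPU's
    feature flags are listed on a 'flags' line in /proc/cpuinfo.
    """
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    return "sgx" in line.split()
    except OSError:
        # File missing or unreadable (e.g. non-Linux host).
        pass
    return False
```

Note that the flag only indicates hardware capability; actually using enclaves additionally requires BIOS enablement and the SGX driver or kernel support.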
Finally, the new chip boasts 74% faster AI performance than the previous generation, making it possible “to infuse AI into every application from edge to network to cloud.”
Ice Lake supports up to 6 terabytes of system memory, up to eight channels of DDR4-3200 memory and up to 64 lanes of PCIe Gen 4 per socket.