Five Use Cases of Intel® Optane™ DC Persistent Memory at Work in the Data Center

May 13, 2019

INTEL, OPTANE, Memory

Intel® Optane™ DC persistent memory is poised to be one of the most transformative innovations to hit the market in years. This new, revolutionary memory technology delivers large capacity, DRAM-like performance, and data persistence.
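
This combination of memory-like access and durability is what application code sees directly. Below is a minimal, hypothetical sketch, not code from Intel or any partner: it assumes a Linux system where the persistent memory is exposed through a DAX-mounted filesystem (the path /mnt/pmem0/example.dat is made up), and it uses plain memory mapping. Real applications would usually build on Intel's Persistent Memory Development Kit (PMDK) for fine-grained, cache-line-level flushing.

```python
import mmap
import os

# Hypothetical file on a DAX-mounted filesystem backed by Intel Optane DC
# persistent memory (e.g. an ext4/xfs volume mounted with "-o dax").
PMEM_FILE = "/mnt/pmem0/example.dat"
SIZE = 4096

# Create the backing file and size it.
fd = os.open(PMEM_FILE, os.O_CREAT | os.O_RDWR, 0o600)
os.ftruncate(fd, SIZE)

# Map it into the address space; on a DAX mount, loads and stores reach
# persistent memory directly instead of going through the page cache.
mm = mmap.mmap(fd, SIZE, mmap.MAP_SHARED)

# Ordinary in-memory writes...
mm[0:13] = b"hello, pmem!\n"

# ...followed by an explicit flush so the data survives power loss.
# Production code would typically use PMDK for finer-grained flushing
# instead of flushing the whole mapping.
mm.flush()

mm.close()
os.close(fd)
```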

One of my favorite things is watching as developers internalize the opportunity of a whole new, large persistent data resource with memory-like performance. The first visit can be bewildering. About a month later, we get asked to come back because they have an idea. Soon after that, our software engineers are working side-by-side with them on a new application of Intel Optane DC persistent memory. It’s inspiring to watch innovation in real-time.

Intel Optane DC persistent memory has a wide range of uses, and more are being discovered by innovators in the global ecosystem every day. In this blog, I want to highlight a small sample of use cases that show how this product can make a significant impact on the data center.

1. In-Memory Analytics with SAP HANA

In-memory database deployments are growing rapidly in these data-intensive times. SAP is one of Intel’s first collaborators on Optane DC persistent memory, and SAP HANA is one of the original use cases. For an in-memory database, the potential benefits of big, fast, affordable persistent memory are substantial. Let’s look at two deployment scenarios.

In an existing SAP HANA deployment, system capacity can be added with “extension nodes,” whose role is primarily to expand the data capacity for the existing processing nodes. Intel’s analysis reveals that an extension node equipped with 7.5 terabytes of DRAM and Intel Optane DC persistent memory enables up to 25% more data to be available in the database main store and saves over 10% in system hardware costs compared to an all-DRAM configuration.1

In the next scenario, a growing SAP HANA deployment needs to consolidate multiple older, memory-limited nodes into a new, modern 4-socket "scale-up node." In this scenario, the configuration with Intel Optane DC persistent memory enables twice as much in-memory data capacity at 39% lower total system cost per database terabyte.2 Whether refreshing an existing SAP HANA deployment or starting a new one, the math favoring Intel Optane DC persistent memory is pretty clear – more data, lower cost. You can learn more about how Intel and SAP have worked together on these technologies here and here. And look for additional in-memory database support for Intel Optane DC persistent memory from Aerospike, Gigaspaces, Microsoft, Oracle, Redis Labs, and more. On top of these savings, Intel Optane DC persistent memory allows for much faster restarts of large analytics systems, decreasing restart times from minutes down to seconds.
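
To make the "more data, lower cost" arithmetic concrete, here is a small sketch of the cost-per-database-terabyte calculation. The dollar amounts are illustrative placeholders, not the tested configurations behind footnote 2; they are chosen only to show how doubling in-memory capacity at a modestly higher system price pulls the per-terabyte figure down.

```python
def cost_per_terabyte(system_cost_usd: float, in_memory_capacity_tb: float) -> float:
    """Total system cost divided by usable in-memory database capacity."""
    return system_cost_usd / in_memory_capacity_tb

# Illustrative placeholder figures, not Intel's published configurations:
# an all-DRAM scale-up node versus a node that mixes DRAM with
# Intel Optane DC persistent memory and holds twice as much data.
all_dram = cost_per_terabyte(system_cost_usd=100_000, in_memory_capacity_tb=3.0)
dram_plus_pmem = cost_per_terabyte(system_cost_usd=122_000, in_memory_capacity_tb=6.0)

print(f"all-DRAM node:         ${all_dram:,.0f} per database terabyte")
print(f"DRAM + Optane PM node: ${dram_plus_pmem:,.0f} per database terabyte")
print(f"cost reduction:        {1 - dram_plus_pmem / all_dram:.0%}")
```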

2. Multi-tenant Database Server Virtualization

Demand for database services keeps growing, and multi-tenant database-as-a-service virtualization is key to scaling up cost-effectively. More VM instances per server and larger databases can strain memory resources, and adding more and more DRAM becomes cost-prohibitive. The high capacity and affordability of Intel Optane DC persistent memory can be the breakthrough to greater VM density and economical scaling.

Intel testing compared two systems for virtualization using Microsoft Windows Server 2019 with Hyper-V, running an OLTP cloud benchmark. One platform was equipped with 768 gigabytes of DDR4 DRAM, the other with 192 gigabytes of DDR4 DRAM plus 1 terabyte of Intel Optane DC persistent memory. In this scenario, the system with Intel Optane DC persistent memory supported up to 36% more VMs, and did so with a 30% lower cost per VM.3 Greater VM density per node at a substantially lower cost.
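
As a quick sanity check, the two figures above can be combined (no pricing assumptions needed) to see what they imply about the system itself: the Optane-equipped platform comes out slightly cheaper overall while hosting substantially more VMs.

```python
# Derived only from the figures quoted above: 36% more VMs and
# 30% lower cost per VM on the Optane-equipped system.
vm_uplift = 1.36          # relative VM count
cost_per_vm_ratio = 0.70  # relative cost per VM

# System cost = (cost per VM) x (number of VMs), so the ratio of system
# costs is the product of the two ratios.
relative_system_cost = cost_per_vm_ratio * vm_uplift
print(f"Optane system cost vs. all-DRAM system: {relative_system_cost:.2f}x")
# -> roughly 0.95x: a slightly cheaper system running 36% more VMs
```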

3. Content Delivery Network Video Streaming

Content Delivery Networks encounter similar problems when it comes to scaling. From 2017 to 2022, total IP traffic is projected to grow 3x, and more than 80% of it will be video4, much of it streamed or live-cast from modernized edge servers close to consumers. Content providers will need to increase the number of streams per server, and this requires more memory. Again, the capacity and economics of Intel Optane DC persistent memory will help content providers scale affordably.

In this comparison, let's hold Quality of Service (QoS) constant at the standard 99th-percentile target for HTTP GETs. Platforms equipped with a 2nd generation Intel Xeon Gold 6252 processor and 1.5 terabytes of memory will both deliver high-quality video at this QoS. However, the platform with Intel Optane DC persistent memory will do so at over 40% lower memory cost.5 This can quickly add up to major savings as edge-based content delivery ramps.
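
A rough sketch of where that memory saving comes from: replace most of a 1.5-terabyte DRAM tier with lower-cost persistent memory. The per-gigabyte prices and the exact DRAM/persistent-memory split below are illustrative assumptions, not the configuration or pricing behind footnote 5.

```python
# Illustrative per-gigabyte prices (assumptions, not published pricing).
DRAM_USD_PER_GB = 10.0
OPTANE_PMEM_USD_PER_GB = 4.0

# Two ways to build a ~1.5 TB memory tier for video caching.
all_dram_cost = 1536 * DRAM_USD_PER_GB
mixed_cost = 192 * DRAM_USD_PER_GB + 1344 * OPTANE_PMEM_USD_PER_GB

print(f"1.5 TB all-DRAM:         ${all_dram_cost:,.0f}")
print(f"192 GB DRAM + 1.3 TB PM: ${mixed_cost:,.0f}")
print(f"memory cost reduction:   {1 - mixed_cost / all_dram_cost:.0%}")
# -> comfortably more than the 40% figure cited above, under these assumptions
```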

4. AI/ML Analytics with the SAS VIYA Platform

SAS VIYA is a multi-product family for advanced analytics that includes in-memory artificial intelligence (AI) and machine learning for complex data tasks, such as image, language, and sentiment analysis. Scaling SAS VIYA with large amounts of DRAM can dramatically increase the cost of the hardware platforms. In Intel's testing, running 3 analytic models simultaneously, you can achieve similar analytics throughput on a platform provisioned with 1.5 terabytes of DRAM or on one with 1.5 terabytes of Intel Optane DC persistent memory plus 192 gigabytes of DRAM. However, the latter configuration delivers that level of machine learning services at up to 43% lower memory cost.6 Intel continues to help ramp up AI performance, reduce its cost, and bring this powerful technology to the mainstream.

5. Hyper-Converged Infrastructure Scaling

The benefits of Intel Optane DC persistent memory aren't exclusive to massive, multi-terabyte usages. A moderate amount can make a big difference in the right scenario. Software-defined hyper-converged infrastructure is becoming very popular in modernized public and private clouds, and Windows Server Software Defined technologies, like Microsoft Azure Stack HCI, are among the leading solutions. The economics of hyper-converged infrastructure depend a great deal on the virtual machine density per node. In Intel testing, we compared memory configurations running Microsoft Windows Storage Spaces Direct across four nodes: one with 384 gigabytes of DDR4 DRAM, the other with 512 gigabytes of Intel Optane DC persistent memory and 192 gigabytes of DRAM. The additional memory capacity when using Intel Optane DC persistent memory, unsurprisingly, enabled 35% more VMs per node. What is surprising is that it did so at 27% lower hardware cost per VM, and it didn't require a huge, multi-terabyte memory footprint to deliver these results.7
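
For intuition on why the extra capacity maps to more VMs per node, here is a deliberately naive, memory-only estimate. The per-VM memory size and host reserve are assumptions (not figures from Intel's test), and because the estimate ignores CPU, storage, and Hyper-V overheads, it overstates the gain relative to the measured 35%.

```python
# Naive, memory-bound estimate of VM density per node. The VM size and
# host reserve are illustrative assumptions; the measured uplift (35%)
# is lower because memory is not the only constraint on density.
VM_MEMORY_GB = 8       # assumed memory footprint per VM
HOST_RESERVE_GB = 64   # assumed reserve for the host OS / Storage Spaces Direct

def vms_per_node(total_memory_gb: int) -> int:
    """How many VMs would fit if memory were the only limit."""
    return (total_memory_gb - HOST_RESERVE_GB) // VM_MEMORY_GB

dram_only = vms_per_node(384)              # 384 GB DDR4 DRAM
dram_plus_pmem = vms_per_node(192 + 512)   # 192 GB DRAM + 512 GB Optane DC PM

print(f"all-DRAM node:      {dram_only} VMs (memory-bound estimate)")
print(f"DRAM + Optane node: {dram_plus_pmem} VMs (memory-bound estimate)")
```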

Just the Beginning

Intel Optane DC persistent memory is a broadly applicable technology with many diverse uses. The five use cases described here are just a small sample of the ways the revolutionary combination of large capacity, affordability, and data persistence will transform the data center and digital services. This is just the beginning.

For a more in-depth look at Intel Optane DC persistent memory and how it can benefit you and your customers, visit intel.com/optane.

This product, and the technical expertise to build solutions based on it, are available via ASBIS. Please contact the ASBIS experts in your country for more information about Intel® Optane™ DC Persistent Memory.

*Other brands and names may be claimed as the property of others.

1. See www.intel.com/2019xeonconfigs, Footnote #16.

2. See www.intel.com/2019xeonconfigs, Footnote #17.

3. See www.intel.com/2019xeonconfigs, Footnote #10.

4. Source: Cisco Visual Networking Index: Forecast and Trends, 2017–2022, Nov. 28, 2018, https://www.cisco.com/c/en/us/solutions/collateral/service-provider/visual-networking-index-vni/white-paper-c11-741490.html

5. See www.intel.com/2019xeonconfigs, Footnote #18.

6. See www.intel.com/2019xeonconfigs, Footnote #19.

7. See www.intel.com/2019xeonconfigs, Footnote #20.

Disclaimer: The information contained in each press release posted on this site was factually accurate on the date it was issued. While these press releases and other materials remain on the Company's website, the Company assumes no duty to update the information to reflect subsequent developments. Consequently, readers of the press releases and other materials should not rely upon the information as current or accurate after their issuance dates.