Creating open standards that equip data center hardware to cope with the volumes of information it must process in real time is a worthy goal for many. Computer Weekly brought this news to our attention in their article, “Using cross-industry collaboration on open standards to address big data in the datacentre.”
Existing server architectures may struggle to cope with the growing volumes of data that applications need to process in real time, or as quickly as possible, to deliver maximum value.
It is important that these architectures support memory-semantic operations, so that data can be accessed like dynamic random-access memory (DRAM) instead of being treated like storage.
Data centers’ system management needs are complex. With semantic technology and standards in play, the results should be up to the challenge.
Melody K. Smith
Sponsored by Access Innovations, the world leader in thesaurus, ontology, and taxonomy creation and metadata application.