Emerson Network Power recently conducted a survey among data center professionals regarding the near future of the data center. Most participants said the problems facing data centers today will remain largely the same a decade from now, although, hopefully, we will be applying new and improved solutions to them.
Energy consumption should continue to be one of the major hurdles for data centers. When asked what percentage of data center power usage would come from particular sources (e.g. coal, oil, natural gas, solar, nuclear), participants' answers averaged out with solar power taking the top spot at over 20 percent.
However, this is unlikely to happen, because current solar technology cannot keep pace with data center power density growth rates. Other options include private power generation through the integration of power cells into server racks themselves, an approach currently being studied by Microsoft engineers.
"We have these 'compute bubbles' that we can move from server to server within a data center, and now we’re starting to see those bubbles move from one data center to another," says Mark Monroe, consulting engineer at DLB Associates. "I can see that becoming more common in 10 years, where we move the load to where it is most efficient or cheapest to run it at a given time."
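The load-shifting idea Monroe describes boils down to a placement decision: at any given hour, run the "compute bubble" wherever power is cheapest or most efficient. A minimal sketch of that decision, with entirely hypothetical site names and energy prices:

```python
# Hypothetical hourly energy prices per data center site (USD per kWh).
# The sites and figures are illustrative, not from the article.
hourly_price_usd_per_kwh = {
    "us-east": 0.11,
    "eu-west": 0.09,
    "ap-south": 0.13,
}

def cheapest_site(prices: dict) -> str:
    """Pick the data center with the lowest current energy price."""
    return min(prices, key=prices.get)

# Move the workload to the cheapest site for this hour.
print(cheapest_site(hourly_price_usd_per_kwh))  # eu-west
```

A real scheduler would also weigh migration cost, latency to users, and carbon intensity, but the core of "move the load to where it is cheapest" is just this kind of per-interval minimization.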
In terms of computing and storage within the data center, it is no surprise that survey participants predict continued growth in cloud solutions. One-third of the participants said 60-70 percent of all computing will be done in the cloud, with a quarter saying 80-99 percent will be moved off-premises.
While survey participants expect data centers to shrink in size over the next decade, compute density is sure to increase beyond what is held in-house today, thanks to technological advances alone.
Data center management is yet another important area of development; 43 percent of participants believe full data center automation, meaning completely self-healing and unmanned facilities, will be achieved by 2025. Surprisingly, though, only 29 percent of participants believe those advances will extend to full transparency across systems and layers.
Sandia National Laboratories recently announced its involvement in a project called Beyond Moore Computing, aimed at driving the creation of future data center technology. While its researchers do not expect much to change aesthetically within the data center over the next two decades, there will be significant transformation inside the servers. One particular technology Sandia sees taking off is multi-level computing.
"For example, more parallelism in cores, more cores, alternative architectures, accelerators like GPUs will grow in prominence and design, accelerators will emerge for specific functions (like encryption)," says Erik DeBenedictis of Sandia's Advanced Device Technologies.
This move to 3D layering is as much a solution to computing limitations as it is one for power consumption and cooling. Sandia is currently looking at the possibility of new 3D field effect transistors as a place to start.
"While the 3D FET helps preserve expectations and is essentially invisible, it will become progressively harder to preserve expectations over the next 10 years," says DeBenedictis. "This will lead to more severe forms of what we have observed since 2003 or so: More cores at nearly unchanging clock rate, with throughput per watt growing slowly unless code can be parallelized or adapted to GPUs."
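The constraint DeBenedictis points to, that more cores only help insofar as code can be parallelized, is the familiar Amdahl's law. A quick illustrative calculation (the workload fractions are hypothetical, not from the article):

```python
# Amdahl's law: the speedup from n cores is capped by the serial
# fraction of the workload, which is why flat clock rates bite.

def amdahl_speedup(p: float, n: int) -> float:
    """Maximum speedup on n cores when fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

# A workload that is only 50% parallel barely doubles even on 16 cores,
# while a 95%-parallel one gets most of the benefit.
print(round(amdahl_speedup(0.50, 16), 2))  # 1.88
print(round(amdahl_speedup(0.95, 16), 2))  # 9.14
```

This is the arithmetic behind "throughput per watt growing slowly unless code can be parallelized": adding cores at a fixed clock rate yields diminishing returns on any workload with a meaningful serial fraction.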
Innovations like this will present unique opportunities for complex, expensive infrastructures such as the Internet of Things (IoT). Massive IoT fleets are sure to become standard practice for marketing and commercial purposes, especially once they can incorporate cheap 3D computing technology. What remains to be seen is large-scale IoT's impact on the data center.
Whatever the case may be, the data center is not likely to go anywhere. According to one survey participant, "There will be significantly increased 'cloud' storage and computing, as most computing user interfaces will be distributed among mobile or wearable devices. On the other hand, some enterprise computing will always be necessary and may actually increase as more artificial intelligence functions are brought in-house for decision making and productivity."
The participant base consisted of professionals from the U.S., Latin America, Western Europe and the Asia Pacific region. The full survey is available from Emerson.