As many companies adopt big data analytics to better target consumers and streamline operations, data centers need to reconfigure their enterprise server setups to more effectively facilitate these efforts.
Thanks in part to trends such as the Internet of Things, social media, mobile devices and cloud computing, the amount of data generated in an average month has reached previously unheard-of volumes. According to IBM, 90 percent of all data in existence was created over the past two years alone, and these volumes are only expected to keep growing. IT services company CSC predicted that 35 zettabytes of data would be created by 2020, 44 times the amount created in 2009.
For smart enterprises, this data growth represents an enormous opportunity. With more information to feed into an analytics program run on a state-of-the-art enterprise server, companies can gain granular insights into their customers and operations. Perhaps the two most famous examples of the power of big data analytics come from Target and The New York Times blogger Nate Silver:
- In 2012, the Times reported that by combining information from past purchases with other identifying data such as age and gender, retailer Target could determine with great accuracy what a shopper would want to buy in the coming months. In one well-known case, by analyzing recent purchase data, Target determined that a high school girl in Minnesota was pregnant before her parents knew.
- By feeding large amounts of data into his analytics program, Silver accurately predicted the outcome of the 2012 presidential election before Election Day, AdAge reported. His algorithm was correct in 49 states, and it was the second consecutive presidential election he had called correctly ahead of time.
As Big Data Grows, So Too Do Enterprise Server and Storage Needs
While big data analytics can provide businesses with innumerable benefits, these organizations also need a place to put all the data. According to Microsoft, approximately 62 percent of companies with at least 1,000 employees in the manufacturing, healthcare and financial services industries currently store 100 terabytes or more of data. Furthermore, about 32 percent of these firms indicated that their data storage demands would double over the next two to three years.
Compounding these enterprise server and storage issues is the need to keep backups of the data for security and compliance reasons. Organizations that handle customer financial data or other sensitive information likely need to maintain one or more extra copies in case something happens to the originals.
Still, storage is far from the only big data challenge facing data centers. The key to big data analytics lies not just in the amount of information an organization holds, but in its ability to glean useful insights from it. As software platforms such as Apache Hadoop gain prominence, the tools needed for this kind of analysis are easier to obtain than ever. However, companies using these programs need to make sure the enterprise servers powering their analytics efforts are up to the job.
Big data may present data centers with a number of storage- and processing-related concerns, but luckily the answer is simple: Make sure that every server in the data center is an enterprise server built with these dilemmas in mind. As data streams grow even larger in the coming years, C-level executives will likely want to make big data analytics a bigger enterprise focus. Big data is here to stay, and data centers may need to upgrade their configurations to accommodate the trend and remain relevant.
“The world of ‘big data’ is changing dramatically right before our eyes – from the increase in big data growth to the way in which it’s structured and used,” according to CSC. “The trend of ‘big data growth’ presents enormous challenges, but it also presents incredible business opportunities.”