1) Storage breakthroughs nipping the “Digital Dark Age” in the bud
Since the early 1990s, an increasing proportion of the data we create and use has been digital. Today, the world produces more than 1.8 zettabytes of digital information a year. Yet digital storage can in many ways be more perishable than paper: disks corrode, bits “rot” and hardware becomes obsolete. This raises the real concern of a “Digital Dark Age,” in which storage techniques and formats created today become unreadable as the technology originally used to produce them grows antiquated. We’ve seen this happen before. Take the floppy disk: a storage tool so ubiquitous that people still click its enduring icon to “save” their documents, presentations and spreadsheets, yet one most Millennials have never seen in person. New research, however, shows that storage media can be made vastly denser than they are today. New form factors such as solid-state disks will help provide more stable, longer-term preservation of data, and the promise of “the cloud” allows access to data anywhere, anytime. Recently, IBM researchers combined the benefits of magnetic hard drives and solid-state memory to overcome the challenges of growing memory demand and shrinking devices. Called Racetrack memory, this breakthrough could lead to a new type of data-centric computing that allows massive amounts of stored information to be accessed in less than a billionth of a second. This storage research challenges previous theoretical limits to data storage, helping ensure our digital universe will always be preserved.
2) Data curation will provide structure in midst of the data deluge
Now that we have the capability to preserve our digital universe, we need to find a way to make it useful. We need to take the next step past data preservation to data curation: the active and ongoing management of data through its lifecycle. This smarter categorization adds value to data, helping organizations uncover new opportunities, improve the sharing of information and preserve data for later reuse. Social media is a great example of the power of curated data. Sites like Facebook, Google+ and Pinterest compile our digital lives and give their users a platform to organize their content. There is real work involved, however, in selecting, appraising and organizing data to make it accessible and interpretable. The key is bringing data sets together, organizing them and linking them to related documents and tools. If data can be stored in a way that provides context, organizations can find new and useful ways to use it.
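The curation idea above can be made concrete with a small sketch: each data set carries the context needed to reuse it (tags, provenance, links to related documents), and a catalog makes that context searchable. This is purely illustrative; the `DataSet` and `Catalog` names are assumptions, not part of any product described here.

```python
from dataclasses import dataclass, field

@dataclass
class DataSet:
    """A curated data set: raw content plus the context needed to reuse it."""
    name: str
    content: str
    tags: set = field(default_factory=set)            # subject categories
    provenance: str = ""                              # where the data came from
    linked_docs: list = field(default_factory=list)   # related documents and tools

class Catalog:
    """A minimal curation catalog: register data sets, find them by tag."""
    def __init__(self):
        self._items = []

    def register(self, dataset):
        self._items.append(dataset)

    def find_by_tag(self, tag):
        return [d for d in self._items if tag in d.tags]

# Hypothetical data sets, for illustration only
catalog = Catalog()
catalog.register(DataSet("sales_2011", "...", tags={"sales", "quarterly"},
                         provenance="CRM export", linked_docs=["sales_report.pdf"]))
catalog.register(DataSet("web_logs", "...", tags={"traffic"}))

hits = catalog.find_by_tag("sales")  # finds only the curated sales data set
```

The point of the sketch is that the tags, provenance and links are stored alongside the data itself, so later users can find and interpret it without hunting down its original owner.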
3) Storage analytics will open new business insights
With data curation giving organizations a platform to better utilize their data, analytics will help turn that data into intelligence and, ultimately, knowledge. With the information that historical trending and infrastructure analytics provide, you can index and search in a more intelligent way than ever before. By running analytics on stored data, including backups and archives, you can draw business insight from that data no matter where it lives. The application of IBM Watson technology to healthcare provides a good example. Watson collects data from many sources and analyzes its meaning and context. By processing vast amounts of information, it can suggest options targeted to a patient’s circumstances and assist decision makers, such as physicians and nurses, in identifying the most likely diagnosis and treatment options for their patients. Through intelligent storage and data retrieval systems, we can learn more from the information we already have, improving service to customers or opening new revenue streams by leveraging data in new ways.
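A minimal sketch of the “index and search” idea: build an inverted index over an archive of stored documents so that queries can find every document containing all the query words. The document names and text here are invented for illustration, and a real system (Watson included) does far more than keyword matching.

```python
import re
from collections import defaultdict

def build_index(archive):
    """Build an inverted index over archived documents: word -> set of doc ids."""
    index = defaultdict(set)
    for doc_id, text in archive.items():
        for word in re.findall(r"[a-z]+", text.lower()):
            index[word].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every word of the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())
    return results

# Hypothetical archived clinical notes, for illustration only
archive = {
    "note1": "Patient presented with fever and cough",
    "note2": "Routine checkup, no fever",
    "note3": "Cough persisting after treatment",
}
index = build_index(archive)
search(index, "fever cough")  # → {"note1"}
```

Once archived data is indexed this way, the archive stops being a write-only vault: the same stored bytes become queryable input for analytics.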
4) Storage becomes a celebrity – new business needs are pushing storage into the spotlight
As our digital and data-driven universe expands, certain industries are able to reach new levels of innovation by having the capacity to house, organize and instantaneously access information. For example, Hollywood is known for its big-budget blockbusters, but the big storage demands of new formats such as digital, CGI, 3D and high definition are affecting not just the bottom line but studios’ very ability to produce these movies. Data sets for films have grown to the petabyte level. Filmmakers are beginning to trade film reels for SSDs, as a single day’s filming can generate hundreds of terabytes of data. The popularity of these data-intensive formats means studios are looking for new storage technologies that can handle the demand. The healthcare industry may be facing an even bigger data dilemma than the entertainment business. Take the University of Leipzig in Germany, home to a major genetic study called LIFE that examines disease in populations. LIFE is cataloging the genetic profiles of several thousand patients to pinpoint gene mutations and specific proteins, a process that alone generates multiple terabytes of data. Even a single 300-bed hospital may generate 30 terabytes of data per year, and those figures will only grow with higher-resolution medical imaging and new tools and services such as online electronic healthcare records.
5) Intervention...The Data Hoarder
In this era of Big Data, more is always better, right? Not so, especially when every byte of data costs money to store and protect. Businesses are turning into data hoarders, spending too much time and money collecting useless or bad data that can lead to misguided business decisions. Simple policy decisions and the capabilities already built into smarter storage can change this practice, but companies are hesitant to delete any data (often including duplicate data) for fear of needing it down the line for business analytics or compliance purposes. Part of the solution starts with eliminating the copies: according to IDC, nearly 75% of the data that exists today is a copy. By deleting redundant information and preventing it from accumulating, organizations invest in data quality and availability for the content that matters to the business. Unneeded data costs money as it replicates throughout an organization’s information systems, and outdated data can also become a target for fraud.
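The “eliminate the copies” step can be sketched with content hashing, the idea behind most deduplication systems: hash each stored object, group objects with identical hashes, and keep only one physical copy per group. The file names and contents below are made up for illustration; production deduplication works on blocks or chunks rather than whole files.

```python
import hashlib
from collections import defaultdict

def dedupe(blobs):
    """Group stored objects by content hash. The first name in each group is
    kept; every later name is a byte-identical redundant copy that could be
    replaced with a reference to the kept original."""
    groups = defaultdict(list)
    for name, data in blobs.items():
        digest = hashlib.sha256(data).hexdigest()
        groups[digest].append(name)
    keep, redundant = {}, []
    for digest, names in groups.items():
        keep[digest] = names[0]          # one physical copy per unique content
        redundant.extend(names[1:])      # duplicates eligible for deletion
    return keep, redundant

# Hypothetical stored objects, for illustration only
blobs = {
    "report_final.doc": b"quarterly results...",
    "report_final_copy.doc": b"quarterly results...",  # byte-identical copy
    "notes.txt": b"meeting notes",
}
keep, redundant = dedupe(blobs)
# redundant names the duplicates; deleting them frees their storage
```

Because the hash is computed from content rather than file names, this catches the duplicates that policy alone misses: the same report saved under five different names still collapses to one stored copy.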
Raising the quality of data is not costly—not getting it right is.