Every year I try to publish a set of storage trends that I believe most IT shops are trying to address, along with the technologies that exist to help resolve them. Here are my thoughts for 2012...
1) Storage breakthroughs nipping the “Digital Dark Age” in the bud
Since the early 1990s, an increasing proportion of data
created and used has been in the form of digital data. Today, the world
produces more than 1.8 zettabytes of digital information a year. Yet, digital storage can in many ways be more perishable
than paper. Disks corrode, bits “rot” and hardware becomes obsolete. This
presents a real concern of a “Digital Dark Age” where digital storage
techniques and formats created today may not be viable in the future as the
technology originally used becomes antiquated. We have seen this happen already: the floppy disk was so ubiquitous that people still click its enduring icon to “save” word-processing, presentation and spreadsheet documents, yet most Millennials have never seen one in person. New research, however, shows that storage media can become vastly denser than they are today. New form factors such as solid-state disks will help provide more stable, longer-term preservation of data, and the promise of "the cloud" allows access to data anywhere, anytime. Recently, IBM researchers combined the benefits of magnetic hard
drives and solid-state memory to overcome challenges of growing memory demand
and shrinking devices. Called Racetrack memory, this breakthrough could lead to
a new type of data-centric computing that allows massive amounts of stored
information to be accessed in less than a billionth of a second. This storage research challenges previously assumed theoretical limits to data storage, helping ensure that our digital universe can be preserved.
2) Data curation will provide structure in the midst of the data deluge
Now that we have the capability to preserve our digital
universe, we need to find a way to make it useful. We need to take the next
step past data preservation to data curation. Data curation is the active and ongoing management of data
through its lifecycle. This smarter data categorization adds value to data that
will help glean new opportunities, improve the sharing of information and
preserve data for later re-use. Social media is a great example of the power of curated
data. Sites like Facebook, Google+ and Pinterest compile our digital lives and give their users a platform to organize their content. However, there is also a lot of work involved in selecting,
appraising and organizing data to make them accessible and interpretable. The
key is bringing data sets together, organizing them and linking them to related
documents and tools. If data can be stored in a way that provides context,
organizations can find new and useful ways to use that data.
3) Storage analytics will open new business insights
With data curation allowing organizations the platform to
better utilize their data, analytics will help turn that data into intelligence
and, ultimately, knowledge. With the information that historical trending analytics and infrastructure analytics provide, you can index and search in a more
intelligent way than ever before. By doing analytics on stored data, in backup
and archive, you can draw business insight from that data, no matter where it
exists. The application of IBM Watson technology for healthcare
provides a good example. Watson collects data from many sources and is able to
analyze the meaning and context. By processing vast amounts of information and
using analytics, it can suggest options targeted to a patient's circumstances and assist decision makers, such as physicians and nurses, in identifying the
most likely diagnosis and treatment options for their patients. Through intelligent storage and data retrieval systems, we
can learn more from the information we already have to improve service to customers or open new revenue streams by leveraging data in new ways.
4) Storage becomes a celebrity – new business needs are pushing storage into the spotlight
As our digital and data-driven universe expands, certain
industries are able to reach new levels of innovation by having the capacity to
house, organize and instantaneously access information. For example, Hollywood is known for its big-budget blockbusters, but it is the big storage demands of new formats such as digital, CGI, 3D and high definition that are impacting not just the bottom line but studios’ ability to produce these movies at all. Data sets for individual films have reached the petabyte level. Filmmakers are beginning to trade in film reels for SSDs
as just one day’s worth of filming can generate hundreds of terabytes of data.
The popularity of these high data-generating formats means studios are looking
for new storage technologies that can handle the demand. The healthcare industry may be facing an even bigger data dilemma than the entertainment business. Take the University of Leipzig in Germany, which runs a major genetic study called LIFE
to examine disease in populations. LIFE is cataloging genetic profiles of
several thousand patients to pinpoint gene mutations and specific proteins.
This process alone generates multiple terabytes of data. Even one 300-bed hospital may generate 30 terabytes of
data per year. Those figures will only grow with higher-resolution medical imaging and new tools and services such as electronic healthcare records.
5) Intervention...The Data
In this era of Big Data, more is always better, right? Not
so – especially when every byte of data costs money to store and protect. Businesses are turning into data hoarders, spending too much time and money collecting useless or bad data that can lead to misguided business decisions. Simple policy decisions and the capabilities already built into smarter storage could change this practice, but companies are hesitant to delete any data (often including duplicate data) for fear of needing it down the line for business analytics or compliance purposes. Part of the solution starts with eliminating the copies.
Nearly 75% of the data that exists today is a copy (IDC). By deleting and
disabling redundant information, organizations are investing in data quality
and availability for the content that matters to the business. Consider the effect of unneeded data as it replicates throughout an organization’s information systems, costing money at every copy. Outdated data can also become a target for fraud. Getting data quality right is not costly; not getting it right is.
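As a minimal illustration of "eliminating the copies", here is a sketch (hypothetical, not part of any IBM product) that finds byte-identical duplicate files by grouping them on a SHA-256 content hash:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Group files under `root` by SHA-256 content hash and
    return only the groups that contain more than one file,
    i.e. the redundant copies a cleanup policy could target."""
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

Real deduplication in storage products works at the block or chunk level rather than whole files, but the principle of identifying redundant content by hash is the same.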
TSM FastBack for Workstations is a centrally-managed solution that reduces the risks of losing important information stored on thousands of personal computers across an entire enterprise, as described here: http://www-01.ibm.com/software/tivoli/products/storage-mgr-fastback-workstation/
IBM will be running a beta program for the next release of this product, providing those taking part with early access to the latest planned enhancements. If you would like to participate, please contact the beta coordinator, Matthew Boult (firstname.lastname@example.org).
Are you interested in the next release of Tivoli Storage Manager (TSM)? IBM will be running a beta program featuring the upcoming version of TSM. If you would like to participate in this beta program, please contact the beta coordinator, Mary Anne Filosa.
Pulse was amazing! For those of you that attended my presentation on FlashCopy Manager and also stopped by to see me in the Expo... thank you!
I wanted to tell you that IBM will be running a joint beta program featuring upcoming versions of FlashCopy Manager and TSM for Virtual Environments. If you would like to participate in this beta program, please contact the beta project manager, Mary Anne Filosa (email@example.com).
ARE YOU SPEAKING AT PULSE?
IF SO, READ ON PLEASE...and book your room at the MGM Grand today to avoid a price increase!
1. Have you uploaded your presentation?
The deadline to upload presentations was January 20th, to enable appropriate reviews and posting to the Pulse 2012 SmartSite Agenda Builder. Your presentation will be converted to PDF and can be downloaded or printed in advance by attendees, pending your approval. For a full list of presentation guidelines and processes, please review the Presentation tab on the online Speaker Kit.
2. Do you know what audio visual equipment will be available in your session room?
Click the A/V tab in your online Speaker Kit to review this important information.
3. Are you connected?
Follow the conference news & highlights on Twitter or the Pulse blog. Click the Speaker Kit tab to find links and hashtags for use with social media. Find Pulse attendees using the Pulse SmartSite agenda builder.
4. Attendees are always interested in getting to know their speaker! Do you have a bio?
Review and update your brief bio by logging onto the Speaker Kit.
5. Have you started to build your Pulse conference agenda on SmartSite, the attendee conference portal?
You will need your conference registration confirmation number to login to this site. Click the Build My Agenda icon to view scheduled sessions.
6. Have you registered for the conference and booked your hotel?
Review the registration instructions listed in the registration tab on the speaker kit website.
Very important... Conference hotel accommodations are limited and available on a first-come, first-served basis. Conference rates are valid until January 27, 2012, or until the room block is sold out, whichever comes first.
Please take a few minutes to review the information in your online Speaker Kit, and follow up on all speaker actions as needed.
If you have any questions or need additional information, please contact speaker support at PulseSpeaker@experient-inc.com. We look forward to seeing you at the MGM Grand in Las Vegas March 4-7!
I wanted to share some information about an article that we just published regarding backing up Exchange Server 2010.
Along with all the other new features of Exchange Server 2010, Microsoft introduced Database Availability Groups (DAGs). DAGs are part of the large focus that Microsoft put on High Availability and Site Resilience within Exchange Server 2010. DAGs allow you to have passive database copies (aka "replicas") that can serve as hot standbys for protection against machine failures, database failures, network failures, viruses, or other issues that may cause an access problem to a database.
DAGs are similar in function to Exchange Server 2007 Cluster Continuous Replication (CCR) replicas, but they extend the capabilities even further. One of the key benefits customers get when they use DAGs in their enterprise is the ability to completely offload backups from their production Exchange servers: they can run all of their backups from a database copy instead of the production database, so the production servers can spend their resources on what they do best, handling email and facilitating collaboration. IBM Tivoli Storage Manager for Mail: Data Protection for Exchange
and IBM Tivoli Storage FlashCopy Manager
completely support backing up DAG passive database copies. Data Protection for Exchange and FlashCopy Manager also support using those backups to recover the production database, as well as for recovering individual mailboxes and items. You can find more details in the IBM Tivoli Storage Manager for Mail: Data Protection for Microsoft Exchange Server Installation and User's Guide V6.1.2.
We just published an article (which includes a sample script) to help you automate backing up your Exchange Server 2010 DAG databases. We hope you will find it helpful in setting up your backup strategy:
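The sample script in the article does the real work; as a rough sketch of the idea, here is a helper that builds one full-backup command line per DAG database. The `tdpexcc` command name and the `/fromreplica` option (which directs the backup at a passive copy) are recalled from the Data Protection for Exchange command-line interface and should be verified against the documentation for your version:

```python
def build_dag_backup_commands(databases, from_replica=True):
    """Build tdpexcc full-backup command lines, one per DAG database.

    The /fromreplica option is intended to make Data Protection for
    Exchange back up the passive copy instead of the active one
    (check your product version's documentation before relying on it).
    """
    commands = []
    for db in databases:
        cmd = ["tdpexcc", "backup", db, "full"]
        if from_replica:
            cmd.append("/fromreplica")
        commands.append(cmd)
    return commands
```

A scheduler or wrapper script could then hand each command list to `subprocess.run` on the backup server holding the passive copies.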
Back up 1000 virtual machines in less than 36 minutes
Is it possible to achieve higher levels of data protection, recovery and availability for virtualized systems than in your non-virtualized environment? Yes, according to what you will hear from IBM and VMware at VMworld® 2012.
We’re all aware of how data protection and recovery have only gotten more complicated with the explosive growth in server virtualization. Applications running in virtual environments are becoming even more critical to business success, and the data volumes residing in virtual environments are growing by leaps and bounds. Also, given the shared physical resources of ESX/ESXi systems, a smarter approach to I/O-intensive processes such as backup and recovery is needed.
As a virtual environment user, if you’re looking for faster backup and restore of VMware datastores, then IBM Tivoli Storage FlashCopy Manager Version 3.1 (referred to as FCM 3.1) is the answer. It is designed to handle high demands - but how fast can it be?
To dispel any doubts, and recognizing the demand for efficient and fast backup of virtual machine data residing in VMware datastores, the Tivoli Storage Manager performance team carried out a benchmark test.
To assess how fast FCM 3.1 can meet the data protection demands of VMware virtual environments, the team conducted tests on VMware environments with up to 1000 online virtual machines and a total capacity of 18 TB of disk space.
The results were astounding. Test results for up to 1000 virtual machines (the maximum tested) showed that FlashCopy backup elapsed time increases linearly with the number of virtual machines:
* 500 virtual machines can be backed up by FCM 3.1 in 15 minutes
* 1000 virtual machines can be backed up by FCM 3.1 in 36 minutes
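Simple arithmetic on the two published data points gives the implied average cost per virtual machine, which stays roughly constant and is consistent with the near-linear scaling described above:

```python
def seconds_per_vm(vms, minutes):
    """Average elapsed backup time per VM, in seconds,
    derived from a (VM count, total minutes) data point."""
    return minutes * 60 / vms

# From the benchmark: 500 VMs in 15 minutes, 1000 VMs in 36 minutes.
print(seconds_per_vm(500, 15))   # 1.8 seconds per VM
print(seconds_per_vm(1000, 36))  # 2.16 seconds per VM
```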
So how is this useful to you? FCM 3.1 provides:
* Simplified deployment and management of advanced, application-aware data backup
* Improved backup and recovery times from hours to minutes
* Improved administrative productivity by simplified management and automation of routine tasks
You can also write to our team for further information at:
Sanjay Patel @ firstname.lastname@example.org
Richard Vining @ email@example.com
Michael Barton @ firstname.lastname@example.org
Hamsa Srinivasan @ email@example.com
IBM Tivoli Storage Development is currently running a beta program for a new release of FlashCopy Manager.
We are looking for additional participants for this program, who could be new or existing FlashCopy Manager users, as well as Data Protection for Exchange or Data Protection for SQL users, since those products are incorporated into FlashCopy Manager. IBM is very interested in obtaining valuable customer and business partner input on this release prior to General Availability.
We want you to participate! Why not take advantage of this opportunity to help shape these products while ensuring that your environments are understood and your requirements are met? By participating, you'll have the ear of the development team and be able to join weekly discussions with them. This is a win-win for everyone.
If you are interested in participating in this beta program please contact Mary Anne Filosa (firstname.lastname@example.org).
IBM is looking for candidates to participate in a beta program for an upcoming release of Tivoli Storage Manager.
The beta program is planned to start in late January 2014, with worldwide participation to obtain customer and business partner feedback on the release.
If you are interested in enrolling in this beta program, please submit a sign-up form here:
For any questions, please contact Mary Anne Filosa, email@example.com.
The last time I blogged, I told you about IBM Tivoli Storage FlashCopy Manager on Windows and just how cool it is. Well, I am working on some more neat stuff, and I wanted to tell you about a beta program for the upcoming release of FlashCopy Manager. If you want to test some of the new functions and features of the upcoming release, please contact Mary Anne Filosa (firstname.lastname@example.org) or your IBM Sales representative for details.
The enrollment period is ending soon, so don't wait to be a part of the action!
Hi there! Are you going to IBM Pulse 2011 in Las Vegas next week? I'm going, and I hope you will come join me. I will be presenting Session 1494: Protecting your critical business applications with IBM Tivoli® Storage FlashCopy® Manager
on Wednesday, March 2nd at 11:00 am. I will also be in the Pulse Solutions Expo. You can come talk to me and see a demo of FlashCopy Manager on Windows in action. It should be a great week in Vegas. There are a lot of really good education sessions, customer presentations, hands-on labs, BOF sessions, and more. I hope you will stop by and say hello!
Simplify Data Protection and Reduce Costs With Unified Recovery Management
On September 22, we will be hosting an educational webcast that will address the challenges of providing data protection and recovery for rapidly growing amounts of diverse enterprise data. During this call, you will hear about our unified recovery management solution that can help reduce complexity, risk and costs. Included in this solution is a new simple, value-based option for procuring and managing software licenses.
Speaker: Rich Vining, Product Marketing Manager
Date: September 22, 2011
Time: 11:00 AM Eastern US
Please register for this event using this link.
After registering you will receive a confirmation note with call-in instructions.
In the two years since IBM acquired Butterfly, it has generated hundreds of Analysis Engine Reports (AERs), analyzing billions of gigabytes and establishing facts about Tivoli Storage Manager (TSM) that should make the competition sit up and take notice.
The Backup Analysis Engine report from Butterfly Software uses light-touch, agentless software technology to analyze an existing heterogeneous data backup environment. The analysis is non-intrusive, based on empirical production data collected in minutes.
Why is Butterfly important?
The Gartner Magic Quadrant for Backup and Recovery 2013 competitive analysis says that between 2012 and 2016, one-third of organizations will change backup vendors due to frustration over cost, complexity and/or capability. Being able to say conclusively that a TSM solution can cut backup infrastructure costs by as much as 38% compared with some competitive products opens the door for IBM to pursue that one-third of organizations looking for a change.
AER is the Key
More demand for AERs is expected with the launch of the automated “self-service” AER generation model. Scheduled to go live at the beginning of 2H 2014, it will scale out as a service to IBM and its Business Partners. All of this shows that Butterfly AERs have become a well-accepted, standard approach to storage infrastructure analytics.
Meet the Butterfly Storage and Backup Assessment Team at Pulse 2014
If the butterfly flutter has caught your interest, visit Pulse 2014 on Feb 23-26 in Las Vegas and meet the folks who deliver Butterfly Storage and Backup Assessments in the IT Optimization section of the IBM booth. Find out how your company can use business analytics to dramatically lower the cost of running your backup and recovery or primary storage infrastructure.
I don't know about you, but I have been virtualizing like crazy over the last few years: humongous servers have turned into medium-sized virtual machines, and test and lab environments have turned into small files running on my laptop from a flash drive.
My IT department has been virtualizing even more, consolidating servers, sharing storage resources among multiple machines and converting NICs (Network Interface Cards) into virtual switches (I still haven't figured out how they did that).
The move to a virtualized environment is very useful for reducing energy consumption, decreasing physical server and storage footprint, and driving up processor and storage utilization, but it also has some side effects when it comes to data protection.
The problem begins at the same place that drove us to virtualization in the first place: resource sharing. You may now have 10 virtualized servers running on the same physical host. If your backup process consumed only 5% of CPU and I/O on your physical server, imagine what happens when all 10 virtual machines kick off the backup process at the same time...
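One simple mitigation for that contention, regardless of which backup approach you choose, is to stagger the backup start times of co-located VMs across the backup window. A minimal sketch (function and parameter names are hypothetical):

```python
from datetime import datetime, timedelta

def stagger_backup_starts(vm_names, window_start, window_minutes):
    """Spread VM backup start times evenly across a window so that
    VMs sharing a physical host do not all hit its CPU and I/O at once."""
    step = timedelta(minutes=window_minutes / max(len(vm_names), 1))
    return {vm: window_start + i * step for i, vm in enumerate(vm_names)}

# Ten VMs in a 60-minute window start 6 minutes apart.
schedule = stagger_backup_starts(
    [f"vm{i}" for i in range(10)], datetime(2012, 1, 1, 22, 0), 60)
```

Production schedulers do much more (per-host grouping, concurrency caps, priorities), but even this naive spreading avoids the worst-case spike of every guest backing up simultaneously.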
There are multiple valid approaches for providing data protection to those virtual machines and I’ll try to address each and every one of them in upcoming blogs…
- File-based vs. block-based backups
- Keep your existing backup methodology (Agent-based backup)
- Perform the backup through the host (VMware console/Hyper-V host OS)
- Hardware based snapshots
- Utilize vendor-specific APIs that provide "agentless" or off-host backup (VMware's VCB and vStorage)
Other enhancements that might not necessarily be backup related, but have to be seriously considered when virtualizing, include:
- Deduplication (client side or target side)
Stay tuned; I’ll be going into more detail on file-based vs. block-based backups in my next blog.
I am often asked... "When can I use FlashCopy Manager with my EMC disk array?" (substitute "EMC" with your favorite vendor)
With FlashCopy Manager for Windows, you can leverage hardware snapshots on any disk array that has a VSS Hardware Provider. This is because Windows has a built-in architecture (referred to as "VSS") that enables pluggable snapshot support. A few years ago we wrote a developerWorks article that explains how this works and how it integrates with TSM. (Note: This article refers to "TSM for Copy Services" instead of "FlashCopy Manager" because it was written before the product name was changed.)
But with FlashCopy Manager for UNIX and Linux and FlashCopy Manager for VMware, you must wait until support is added for your desired disk array. Last year, IBM partnered with Rocket Software to develop a device adapter pack that plugs into FlashCopy Manager for UNIX and Linux and FlashCopy Manager for VMware to extend support to more storage devices. You install it on top of an existing FlashCopy Manager (version 4.1 or later) installation on the application server being protected by FlashCopy Manager for UNIX (or on the proxy backup server in the case of FlashCopy Manager for VMware) and configure it to talk to the storage device. After that, you can leverage the power of FlashCopy Manager snapshot protection for the hardware supported by that device adapter pack!
At the end of last year, Rocket Software released support for EMC Symmetrix (VMAX and DMX). They are planning to add more disk arrays in 2014. If you have devices that you want to see added, contact Rocket Software.
Have a great day!