I Have a New Job!
For the past 4+ years at IBM, I was the worldwide product marketing manager for the IBM Tivoli Storage Manager (TSM) family of data protection and recovery solutions. As of a month ago, I am in a brand-new role: Customer Experience Product Manager across the Tivoli Storage software group.
I’ve been asked to bring together various IBM business functions to help identify areas for customer experience improvement, measure success, and generally be the champion for all things that will help address a customer’s experience with our products.
In addition to reading and learning and reaching out to like-minded people, my initial efforts have been to assess where we are and the progress we’ve made to date. For example, we’ve made tremendous progress within the TSM family over the past 4 years starting with the release of TSM v6.1 – adding in valuable features such as deduplication, replication, monitoring and reporting, while also streamlining and unifying the user interface. We also introduced a new pricing model that makes it much easier to acquire, manage and forecast your backup software licenses.
The next release of the TSM family will be announced in early Q4-2012, and it will include some exciting enhancements to improve the user experience in TSM, TSM for Virtual Environments and Tivoli Storage FlashCopy Manager. And next year we’ll be rolling out a brand new user interface that promises to make the day-to-day administration of TSM much, much easier.
We’ll also be rolling out the IBM SmartCloud Virtual Storage Center, our storage hypervisor that was pre-announced at PULSE in March, and it promises to do for storage what VMware and others have done for servers: improve utilization, simplify management, and reduce costs. There are many upcoming events in October that will feature these Tivoli Storage product announcements, and we hope you can join us to hear more details. They are:
IBM Smarter Computing Executive Summit – Oct. 2–4, Pinehurst, NC
Interop New York 2012 / Enterprise Cloud Summit – Oct. 1–5, New York
IBM InterConnect 2012 – Oct. 9–11, Singapore
SAP TechEd – Oct. 15–19, Las Vegas, NV
IBM Information On Demand – Oct. 21–25, Las Vegas, NV
IBM Business Without Limits Storage Days – various dates & locations:
Oct. 25, Columbus
Oct. 30, Jacksonville – registration URL TBD
Nov. 13, NYC – registration URL TBD
Your experience with our products is only one pillar in the building that we’re trying to construct. How does it tie in with your interactions with our marketing and sales teams? Does our story match our solution capabilities? Are you getting what you were promised?
Does the product documentation provide the answers to your questions, or help to avoid needing to ask the questions at all?
And what about support? A recent survey by STORAGE Magazine of enterprise backup software users rated IBM #1 in the category of support. But what can we be doing better, or differently, to keep you delighted in your relationship with IBM?
At the end of the day, I’m looking at this role in 2 ways:
- What can we fix or improve?
- How can we ensure that everything new that we do is viewed through a customer experience lens?
To be successful, I’m going to need the help of a lot of people who have a vested interest in our success – across all facets of our business, but especially from our customers and partners. Please reach out to me at firstname.lastname@example.org with any ideas or comments.
"The postings on this site are my own and don't necessarily represent IBM's positions, strategies or opinions."
IBM’s Business Without Limits
Recently, I had the distinct pleasure of delivering a presentation on Data Storage and Compliance at the IBM Tivoli event 'Business Without Limits 2012' in Bangalore, India. More than 100 attendees from almost every industry attended the event.
My track for the day: Addressing Data Growth, Threats and Compliance; Unified Recovery Management.
The volume, velocity and importance of data have increased dramatically during the past few years, to the point where most backup and archiving solutions can't keep up with the scalability, functionality, performance, reliability and budget realities of today and tomorrow. Attendees learned how to reduce backup data capacity by as much as 95%, how to reduce the amount of new data at risk by 90% or more, and how to simplify global data recovery operations and achieve compliance by leveraging a unified management approach.
I was privileged to present in such an interactive session, where customers saw how our broad product portfolio can help address their business challenges.
IBM now brings 'Business Without Limits 2012' to several cities across the United States in October and November. This is an exclusive IBM Tivoli event designed to increase awareness and thought leadership among IT managers, infrastructure leaders, systems administrators, storage managers, and data center managers. IBM's Business Without Limits event is coming soon to the following cities:
Oct 18: Philadelphia
Oct 25: Columbus
The event will focus on how IBM's Integrated Service Management strategy brings together different capabilities to enable integrated delivery of business services across complex, interconnected physical and digital infrastructures.
IBM’s Business Without Limits event will have the following Storage Tracks:
The pivotal role of storage in the modern data center
Simplifying backup and unifying recovery
Moving your data protection headaches to the cloud
Storage analytics and reporting
This conference will explore how you can capitalize on the opportunities of a smarter planet and remove the barriers to innovation, helping you achieve "Business Without Limits." As today's leaders transition to smarter, flexible cloud infrastructures that speed the delivery of innovative products and services, effective storage management becomes a critical component of that success. Please join us at this event to learn more!
As an IBM marketing manager, my job includes writing about storage technology. This post is about more than technology, though. It’s about a new breakthrough capability for managing storage costs and service levels.
I recently met with IBM Distinguished Engineer, Mike Sylvia, who has been working on a Business Transformation project to enable automated right tiering for storage in IBM data centers. Right tiering is the notion that data should be hosted on the optimal storage tier to balance cost and performance requirements.
Mike explained that applications tend to be hosted on top tier storage. When he analyzed actual usage patterns, Mike found most data can be effectively hosted on lower cost storage. Mike’s project put numbers to a problem that is often hidden from view and, until now, nearly impossible to solve.
Hosting data on the wrong storage tier turns out to be a huge efficiency problem. Mike predicts IBM will save $13 million over 3 years in one data center, by periodically moving data to the right tier. During the pilot, users saw their cost for storage drop by 50% per TB on average. This is big.
Like many advancements, IBM’s automated right tiering capability is accomplished by integrating existing technology. Mike Sylvia’s project combines storage virtualization, storage management automation and analytics. Today, IBM offers the technology in a bundled solution called SmartCloud Virtual Storage Center.
How does it work?
Step 1: IBM's storage virtualization controller collects detailed usage metrics about storage it manages throughout the data center, without impacting application performance.
Step 2: IBM's Storage Analytics Engine studies usage patterns over time to understand performance requirements.
Step 3: Storage tier recommendations are generated in reports that can be shared with application owners and IT management.
Step 4: Storage virtualization enables online data migration, with no disruption to applications or users.
Repeat: Usage patterns change over time, of course, so right tiering becomes an ongoing process.
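To make the process concrete, here is a minimal sketch of the right-tiering idea in Python. It is illustrative only and not the actual SmartCloud Virtual Storage Center analytics; the field names, tier labels, and IOPS and cost thresholds are assumptions made up for the example.

```python
# Illustrative sketch only: a toy right-tiering recommender, not IBM's
# implementation. Tier names, thresholds, and costs are hypothetical.
from dataclasses import dataclass

@dataclass
class VolumeStats:
    name: str
    tier: str          # current tier: "ssd", "fc", or "nearline"
    capacity_gb: int
    avg_iops: float    # average I/O per second over the study window
    peak_iops: float   # observed peak I/O per second

# Hypothetical cost per GB per month for each tier, highest to lowest.
TIER_COST = {"ssd": 2.00, "fc": 0.80, "nearline": 0.25}

def recommend_tier(v: VolumeStats) -> str:
    """Pick the cheapest tier whose performance ceiling covers observed peaks."""
    if v.peak_iops > 2000:
        return "ssd"
    if v.peak_iops > 300:
        return "fc"
    return "nearline"

def tiering_report(volumes):
    """Turn observed usage into move recommendations with estimated savings."""
    for v in volumes:
        target = recommend_tier(v)
        if target != v.tier:
            monthly_saving = (TIER_COST[v.tier] - TIER_COST[target]) * v.capacity_gb
            yield v.name, v.tier, target, round(monthly_saving, 2)

if __name__ == "__main__":
    sample = [
        VolumeStats("erp_db", "ssd", 2000, avg_iops=150, peak_iops=260),
        VolumeStats("web_logs", "fc", 5000, avg_iops=20, peak_iops=90),
    ]
    for name, src, dst, saving in tiering_report(sample):
        print(f"{name}: move {src} -> {dst}, est. saving ${saving}/month")
```

In practice the analysis runs over weeks of measured usage rather than a single peak figure, but the basic trade-off (cheapest tier that still meets observed demand) is the same.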
Why does it work?
Automated right tiering delivers the efficiency benefits of Information Lifecycle Management without the headaches and hidden costs. Automated right tiering has significant benefits for both data owners and IT leaders, so everyone wins.
For example, application and database owners can gain the following benefits:
Applications can move to top tier storage when they need it, without waiting for a maintenance window.
Average storage costs drop significantly, without a drop in services.
IT leaders benefit, too. For example:
Storage tier decisions are based on analysis of actual usage patterns, not predictions. Storage performance management tasks are eliminated.
Data can quickly and easily be moved back to its original storage tier if requested, without incurring an outage.
IBM automated right tiering works with most storage systems, so deployment is nondisruptive.
The technology that enables automated right tiering has significant additional benefits, such as the ability to eliminate scheduled outages for storage system maintenance.
Problem solved. How has your organization addressed the storage right tiering challenge?
Watch a video of Mike Sylvia describing his automated right tiering project at the IBM Edge conference in June, 2012.
Listen to IBM storage virtualization expert and Master Inventor Barry Whyte's two-part webcast, "Storage Virtualization – IBM SVC – Benefits," from April 2012
Visit IBM’s Virtualized SAP Demo and other smarter solutions at VMworld August 26-30, 2012 in San Francisco
IBM has bundled automated right tiering technology into a new solution called SmartCloud Virtual Storage Center, available through IBM sellers and Business Partners.
IDC has recently released its Worldwide Storage Software QView for the first quarter of 2012. In it, IDC estimates that the total Storage Software market for 1Q12 grew about 3.3% over 1Q11. IBM had a solid quarter while Symantec faltered, allowing IBM to take the overall #2 share rank position for 1Q12.
- In the Overall Storage Software Market, IBM moved up to the #2 share rank position in 1Q12, gaining 2.0 share points over 1Q11.
- In Data Protection and Recovery, IBM held its #2 share rank position, gaining 1.8 points of share over 1Q11.
- IBM retained its #1 position in Archiving Software, growing faster than the market. HP holds the #2 spot with its 2011 acquisition of Autonomy.
IBM offers a comprehensive, flexible storage management software portfolio that helps organizations address storage management challenges across the enterprise, including data centers, remote/branch offices and desktop/laptop computers. Learn more about the specific components within the IBM storage software family that can help you create a more responsive and resilient storage infrastructure for your on demand business.
In a previous post, we talked about the recent reviews that the Tivoli Storage Productivity Center (TPC) received. In particular, we are very pleased with the 'Leader' designation received in the recent Gartner Magic Quadrant review for Storage Resource Management.
It's not just the analyst reactions that are positive. Based upon a customer-focused feature list, the product team undertook an overhaul of the Graphical User Interface (GUI) and introduced a dashboard that provides easy-to-use, comprehensive reporting. To ensure they had got it right, the proposed changes were demonstrated on the Expo floor at the Pulse 2012 conference in Las Vegas earlier this year. Responses from the user base were enthusiastic, to the extent that this next iteration is quickly becoming a sought-after item.
A beta test program was initiated at the conference as the true litmus test of whether the proposed new features would hold up in a real production environment. Early responses point to some interesting observations. When polled about their experiences with the next evolution of the product, one of the most talked-about aspects was the set of features provided to simplify complex reporting. Beta testers derived great time and productivity benefits from having a picture of the full storage environment, something they previously had to piece together from multiple places. A commonly cited benefit was time savings when it came to complex reporting.
What is compelling, however, is the business analytics that this next iteration yields. Tivoli Storage Productivity Center (TPC) provides detailed topology views of the entire storage infrastructure. In the overhauled GUI, administrators can observe the overall health of the environment instantly. A simple right-click provides detailed views of each of the storage network entities. The facilitation of these environment-wide views led a beta customer to observe that 'more than just the storage engineer can now get a simple view of their SAN environment'. What does this mean? It means that what started out as a time saver for the practitioners - the storage engineers - now becomes an entryway for the management team to get a quick look at the overall environment, allowing for higher-level strategic discussions about storage environments and needs.
Is this good or bad? A recent survey revealed that CMOs will outspend CIOs on IT by 2017. When I tweeted this, I was asked by @jamie_joyce why it would take this long. My answer is that it's likely due to the classic tension between a cost-saving position on infrastructure and a growth position on business analytics or feature offerings. When you think about Big Data within business analytics and the proliferation of mobile devices as two huge growth areas, the commonality is a mass proliferation of data in orders of magnitude never imagined. The conversation comes back to storage, and the associated resource management.
Which way does your company lean? Where is your head in that tension between cost savings and growth when it comes to your storage environment?
I chatted with Product Marketing Manager Amalore Jude about this and the kind of reaction the team got at Pulse in Vegas in March of this year when they demoed the new GUI. He was quite pleased with the response. 'Customers were very excited looking at the new, next-generation interface,' he told me. 'Many are awaiting June 4, when they can actually lay their hands on it.'
Well, June 4 is around the corner. If you are a regular reader of this blog, it's quite likely I will meet you at the Edge Conference in Florida next week. If you're there, please tweet me @brenny or find me somehow and say hi.
The conference is selling out, but there are still passes available for the Tech Edge portion of the four-part event. It's not too late to register. The Tech Edge portion is well laid out, with over 250 sessions being led by IBMers and customers. Sometimes it's better to hear the war stories of your peers when you're trying to figure out how to exploit what you have, or are considering getting.
One customer who is speaking is Gary Fry of Unum. His session on March 6, 10-11am in Rm 115 is on Unum's use of the SAN Volume Controller and his experiences beta testing the new evolution of TPC.
So, if you are going, then I hope to see you out there. If you haven't yet decided, then getting a first look at this next evolution of storage infrastructure management is hopefully good motivation to consider it.
Every year I try to publish a set of storage trends that I believe most IT shops are trying to address, and where technologies exist to help resolve them. Here are my thoughts for 2012...
1) Storage breakthroughs: nipping the "Digital Dark Age" in the bud
Since the early 1990s, an increasing proportion of the data created and used has been digital. Today, the world produces more than 1.8 zettabytes of digital information a year. Yet digital storage can in many ways be more perishable than paper. Disks corrode, bits "rot" and hardware becomes obsolete. This presents a real concern of a "Digital Dark Age," where digital storage techniques and formats created today may not be viable in the future as the technology originally used becomes antiquated. We've seen this happen; take the floppy disk, for example: a storage tool so ubiquitous that people still click on its enduring icon to "save" their word processing, presentation and spreadsheet documents, yet most Millennials have never seen one in person. But new research shows that storage media can be vastly denser than they are today, new form factors such as solid-state disks will help provide more stable, longer-term preservation of data, and the promise of "the cloud" allows access to data anywhere, anytime. Recently, IBM researchers combined the benefits of magnetic hard drives and solid-state memory to overcome the challenges of growing memory demand and shrinking devices. Called Racetrack memory, this breakthrough could lead to a new type of data-centric computing that allows massive amounts of stored information to be accessed in less than a billionth of a second. This storage research challenges previous theoretical limits to data storage, helping ensure our digital universe will always be preserved.
2) Data curation will provide structure in the midst of the data deluge
Now that we have the capability to preserve our digital universe, we need to find a way to make it useful. We need to take the next step past data preservation to data curation. Data curation is the active and ongoing management of data through its lifecycle. This smarter data categorization adds value to data, helping glean new opportunities, improve the sharing of information and preserve data for later re-use. Social media is a great example of the power of curated data. Sites like Facebook, Google+ and Pinterest compile our digital lives and give their users a platform to organize their content. However, there's also a lot of work involved in selecting, appraising and organizing data to make it accessible and interpretable. The key is bringing data sets together, organizing them and linking them to related documents and tools. If data can be stored in a way that provides context, organizations can find new and useful ways to use that data.
3) Storage analytics will open new business insights
With data curation giving organizations a platform to better utilize their data, analytics will help turn that data into intelligence and, ultimately, knowledge. With the information that historical trending analytics and infrastructure analytics provide, you can index and search in a more intelligent way than ever before. By doing analytics on stored data, in backup and archive, you can draw business insight from that data, no matter where it exists. The application of IBM Watson technology to healthcare provides a good example. Watson collects data from many sources and is able to analyze its meaning and context. By processing vast amounts of information and using analytics, it can suggest options targeted to a patient's circumstances and assist decision makers, such as physicians and nurses, in identifying the most likely diagnosis and treatment options for their patients. Through intelligent storage and data retrieval systems, we can learn more from the information we have today to improve service to customers or open new revenue streams by leveraging data in new ways.
4) Storage becomes a celebrity – new business needs are pushing storage into the spotlight
As our digital and data-driven universe expands, certain industries are reaching new levels of innovation by having the capacity to house, organize and instantaneously access information. For example, Hollywood is known for its big-budget blockbusters, but it's the big storage demands required by new formats such as digital, CGI, 3D and high definition that are impacting not just the bottom line, but studios' ability to produce these types of movies. Data sets for movies have grown to the petabyte level. Filmmakers are beginning to trade in film reels for SSDs, as just one day's worth of filming can generate hundreds of terabytes of data. The popularity of these high data-generating formats means studios are looking for new storage technologies that can handle the demand. The healthcare industry may be facing an even bigger data dilemma than the entertainment business. Take the University of Leipzig in Germany, which has a major genetic study called LIFE to examine disease in populations. LIFE is cataloging genetic profiles of several thousand patients to pinpoint gene mutations and specific proteins. This process alone generates multiple terabytes of data. Even one 300-bed hospital may generate 30 terabytes of data per year. Those figures will only grow with higher-resolution medical imaging and new tools or services such as electronic healthcare records.
5) Intervention... the data hoarders
In this era of Big Data, more is always better, right? Not so – especially when every byte of data costs money to store and protect. Businesses are turning into data hoarders, spending too much time and money collecting useless or bad data, potentially leading to misguided business decisions. This practice can be changed with simple policy decisions and by implementing capabilities that already exist in smarter storage, but companies are hesitant to delete any data (and, many times, duplicate data) for fear of needing specific data down the line for business analytics or compliance purposes. Part of the solution starts with eliminating the copies. Nearly 75% of the data that exists today is a copy (IDC). By deleting and disabling redundant information, organizations are investing in data quality and availability for the content that matters to the business. Consider the effect of unneeded data costing money as it replicates throughout an organization's information systems. This outdated data can also potentially be accessed for fraud. Getting the quality of data right is not costly; not getting it right is.
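To illustrate the "eliminate the copies" idea, here is a small sketch that finds byte-identical file copies by content hash. It is not an IBM deduplication implementation, just a minimal example; the scan directory is hypothetical.

```python
# Minimal sketch: find byte-identical file copies by content hash.
# Illustrative only; not an IBM deduplication implementation.
import hashlib
import os
from collections import defaultdict

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MB chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Return {digest: [paths]} for every content hash seen more than once."""
    by_hash = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                by_hash[sha256_of(path)].append(path)
            except OSError:
                continue  # skip unreadable files
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    # "/data/archive" is a hypothetical scan root.
    for digest, paths in find_duplicates("/data/archive").items():
        print(f"{len(paths)} copies of {digest[:12]}...: {paths}")
```

Real deduplication works at the block or chunk level inside the backup stream rather than on whole files, but the principle is the same: identical content should be stored once and referenced many times.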
ARE YOU SPEAKING AT PULSE?
IF SO, READ ON PLEASE...and book your room at the MGM Grand today to avoid a price increase!
1. Have you uploaded your presentation?
The deadline to upload presentations was January 20th, to enable appropriate reviews and posting to the Pulse 2012 SmartSite Agenda Builder. Your presentation will be converted to PDF and can be downloaded or printed in advance by attendees, pending your approval. For a full list of presentation guidelines and processes, please review the Presentation tab in the online Speaker Kit.
2. Do you know what audio visual equipment will be available in your session room?
Click the A/V tab in your online Speaker Kit to review this important information.
3. Are you connected?
Follow the conference news & highlights on Twitter or the Pulse blog. Click the Speaker Kit tab to find links and hashtags for use with social media. Find Pulse attendees using the Pulse SmartSite agenda builder.
4. Attendees are always interested in getting to know their speaker! Do you have a bio?
Review and update your brief bio by logging onto the Speaker Kit.
5. Have you started to build your Pulse conference agenda on SmartSite, the attendee conference portal?
You will need your conference registration confirmation number to login to this site. Click the Build My Agenda icon to view scheduled sessions.
6. Have you registered for the conference and booked your hotel?
Review the registration instructions listed in the registration tab on the speaker kit website.
Very important... Conference hotel accommodations are limited and available on a first-come, first-served basis. Conference rates are valid until January 27, 2012, or until the room block is sold out, whichever comes first.
Please take a few minutes to review the information in your online Speaker Kit, and follow-up on all speaker actions as needed.
If you have any questions or need additional information, please contact speaker support at PulseSpeaker@experient-inc.com. We look forward to seeing you at the MGM Grand in Las Vegas March 4-7!
IBM has detailed innovative projects and research that show new storage approaches to support Big Data growth and drive business innovation. Healthcare, financial services, media and entertainment, and scientific research, among many other industries, face the challenge of storing and managing the proliferation of data to extract critical business value. As storage needs rise dramatically, storage budgets lag, requiring new innovation and approaches around storing, managing and protecting Big Data, cloud data, virtualized data and more.
Watson-inspired Storage Takes on the Cosmos: IBM is working on a project with the Institute for Computational Cosmology (ICC) at Durham University in the U.K. and Business Partner OCF to build a storage system to better store and manipulate Big Data for its cosmology research on galaxies. ICC is adopting the same IBM General Parallel File System technology used in the IBM Watson system to store and manage more than one petabyte of data from two significant projects on galaxy formation and the fate of gas outside of galaxies. The enhanced storage system will enable up to 50 researchers working collaboratively to access and review data simultaneously. It will also help ICC learn to manage data better, storing only essential data and storing it in the most appropriate place.
New Storage Platform Delivers More Personalized, Visual Healthcare: A medical archiving solution from IBM Business Partners Avnet Technology Solutions and TeraMedica, Inc., powered by IBM systems, storage and software, gives patients and caregivers instant access to critical medical data at the point of care. Developed in collaboration with IBM, the medical information management offering can manage up to 10 million medical images, helping health care practitioners provide better patient care with greater efficiency and at reduced costs. The integrated platform allows users to manage and view clinical images originating from different treatments and providers to bring secure, consistent image management and distribution at the point of care.
Virtualization Consolidates Storage Footprint for Medical Center: Kaweah Delta Health Care District (KDHCD), a general medical and surgical hospital in Visalia, Calif., needed to reduce its operational costs while increasing storage space. To meet these demands, KDHCD tapped IBM's storage systems to create a new storage platform that reallocates resources and saves a significant amount of data space with thin-provisioning technology. Virtualization creates a smaller hardware footprint, so the hospital also saved on power and cooling costs. KDHCD now has a consolidated storage environment that provides the scalability, ease of management, and security to support critical healthcare data management for the hospital.
IBM is looking for customers and business partners who are interested in participating in an Early Access Program (EAP)/Beta Program for an upcoming release of FlashCopy Manager, Data Protection for SQL, and Data Protection for Exchange. If you would like to nominate your organization to participate in this EAP/Beta, please send an email to:
Mary Anne Filosa (email@example.com)
and be sure to include your organization's name. Once your email is received, you will be sent instructions for signing off on the EAP/Beta legal form online; when that signoff has been completed, you will be sent a link to the program's nomination site. We encourage you to respond quickly if you are interested, as the program begins in mid-December.
Live Webcast: Using Tivoli Storage Productivity Center to be the "eyes" into your SAN environment, and to see how that environment is changing. LIVE!
In the ever-changing SAN environment, Tivoli Storage Productivity Center has many components to help the storage administrator know when and where to focus their attention. We will walk through many of these in a live demo and see how they can be used.
Let TPC help you keep up with storage growth instead of working longer hours!
Speaker: Scott McPeek, IBM Program Director, Storage Sales Enablement. He has worked in the software industry for more than 30 years; the last ten have been with IBM as part of the TrelliSoft SRM acquisition. Scott now focuses on storage resource management, storage performance management and virtualization with products like TPC, SVC and the Storwize V7000.
Register for this Live Webcast here
How are you spending your time this weekend? Polishing up your Pulse 2012 storage session abstract, hopefully!
With only 4 days left to submit a 100-word abstract by Nov. 7, we thought it would be helpful to share some final pointers. Keep in mind that this year's theme is Business Without Limits, and we are seeking to understand how you gained visibility, control and automation to deliver better business results.
What are the key benefits to you as a Speaker?
One full Pulse conference pass ($1995 value) and the opportunity to gain visibility for your company and take advantage of an incredible networking opportunity with over 7,000 industry experts, press, and analysts. Here are some pointers on how to get your Storage Management session abstract accepted:
1. Focus it on topics such as how you used Tivoli Storage Manager to manage "big data"; success with recent upgrades; or cloud storage
2. Tell us about the key business challenges you were trying to solve, and how IBM Tivoli storage solutions helped you address these challenges
3. What was the ROI, or key results, from implementing a Tivoli storage solution, and what valuable lessons did you learn from the experience
If you do not plan to speak at Pulse and attend the conference on a complimentary pass, don't forget to register during early bird registration by December 16. Early Bird registration can save you up to $700 compared with registering onsite! See you at Pulse 2012!
Well, it's that time again, hard to believe, I know... the PULSE call for papers has opened, and we want to have another banner year in the Tivoli Storage sessions! Last year we were standing room only in many of our sessions, and this year we hope to fill each room once again.
As for topic suggestions, we'd like to hear from customers who:
- Recently upgraded
- Use TSM to manage 'big data'
- Have best practices, created with our Tivoli Storage portfolio that they want to share
It's simple: just go to this link and submit a 100-word abstract.
The deadline is November 7th, so there's no time like the present!
Speaker benefits include:
- One full conference pass ($1995 value). Only one speaker per company, per session qualifies.
- Use of our exclusive Client Speaker VIP Lounge
- Networking opportunities with over 7,000 industry experts, press, and analysts
- Your company’s name in the Pulse Pocket Agenda and a description of your presentation and speaker details on the Pulse SmartSite
Your abstract should include:
- Initial business challenges and objectives
- Statistics about your deployment layout and company
- The IBM solution/products applied by your organization
- How the IBM solution/products helped address the pain points
- Lessons learned from the experience
NEW!! Technical Services Webinar: Capacity Planning in a Tivoli Storage Manager Environment
As much as customers would like to "back up everything and keep it forever," storage is not unlimited. The reality of ever-increasing data growth, combined with regulatory compliance and the associated risks, makes the arduous task of capacity planning for backup ever more critical. A new Reporting and Monitoring tool is available with Tivoli Storage Manager (TSM). This new tool, based on IBM Tivoli Monitoring, can collect and report on historical data and is an integral part of a capacity planning regimen.
This session will demonstrate a capacity planning methodology that conforms to the ITIL Capacity Planning process description by showing how the TSM Reporting and Monitoring tool and other TSM components can be utilized to ease the pain of capacity planning. Additionally, this session will look at strategies, like data deduplication, to reduce the amount of backup data while maintaining regulatory compliance.
Presenters:
Mark Vanderboll, IBM Tivoli Global Response Team
Dave Daun, IBM Advanced Technical Skills
Access the webinar here: http://bit.ly/qdOuJU
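For a feel of the kind of arithmetic behind capacity planning, here is a minimal sketch that fits a linear trend to historical occupancy readings (for example, figures exported from TSM historical reports) and projects when a storage pool would fill. It is an illustration only, not a feature of the TSM Reporting and Monitoring tool, and the sample figures are invented.

```python
# Illustrative sketch: project when a storage pool fills up, given a
# history of occupancy readings. Not part of any TSM tooling; the
# sample data below is hypothetical.
from datetime import date, timedelta

def days_until_full(samples, capacity_tb):
    """samples: list of (day_index, used_tb) pairs, oldest first.
    Fits a least-squares line and extrapolates to the capacity limit."""
    n = len(samples)
    mean_x = sum(x for x, _ in samples) / n
    mean_y = sum(y for _, y in samples) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in samples) / \
            sum((x - mean_x) ** 2 for x, _ in samples)
    if slope <= 0:
        return None  # occupancy is flat or shrinking; no projected fill date
    current_used = samples[-1][1]
    return (capacity_tb - current_used) / slope  # days of headroom remaining

if __name__ == "__main__":
    # Hypothetical eight weekly occupancy readings, in TB.
    history = [(week * 7, 40 + week * 1.5) for week in range(8)]
    remaining = days_until_full(history, capacity_tb=60)
    if remaining is not None:
        print("Projected full on", date.today() + timedelta(days=int(remaining)))
```

A real regimen would also account for seasonality, retention policy changes and deduplication ratios, but even a simple trend line makes the "when do we run out?" conversation concrete.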
In response to: Enabling Private IT for Storage Cloud -- Part II (management controls)
To see a transcript of the live chat held on Friday, September 30th about this topic, visit this link:
And don't forget to listen to the 'open mic' conversation about Storage Hypervisors with IBM's Ron Riffe, the author of this blog series, and ESG analyst Mark Peters:
I recently read an excellent post by Ron Riffe, a fellow IBMer, discussing practical recommendations for introducing cloud techniques into a private storage environment – the end goal being to save your company a substantial amount of money while becoming more responsive to the needs of the business. The first of the four steps discussed in the post was to introduce a storage hypervisor – virtualization of your storage infrastructure. It’s a good idea, especially if you have already virtualized some or all of your production server environment with something like VMware.
But there’s more to it than just the efficiency and mobility you get from virtualizing. The customers we talk to are finding new value that rises out of the synergy when both the server and storage environments are virtualized. One example is in the area of data protection. In this post, I’m going to explain the 1+1=3 effect for data protection that comes from combining VMware with a good storage hypervisor.
Let’s start with a quick walk down memory lane. Do you remember what your data protection environment looked like before virtualization? There was a server with an operating system and an application… and that thing had a backup agent on it to capture backup copies and send them someplace (most likely over an IP network) for safe keeping. It worked, but it took a lot of time to deploy and maintain all the agents, a lot of bandwidth to transmit the data, and a lot of disk or tapes to store it all. The topic of data protection has modernized quite a bit since then.
Today, you’re using a server hypervisor (VMware) to efficiently pack several virtual machines onto one physical server – and to make it so you can deploy, move and decommission those VMs pretty much at will. If you are still using the old techniques for data protection on that physical server (deploying an agent on each individual VM, and then transferring all the backup data for those VMs through the one IP network pipe), you’re probably running into significant performance and application availability problems, and also missing out on some significant savings (if you listen carefully, you can hear your backup environment screaming "modernize me, MODERNIZE ME!").
Fast forward to today. Modernization has come from three different sources – the server hypervisor, the storage hypervisor and the unified recovery manager. The end result is a data protection environment that captures all the data it needs in one coordinated snapshot action, efficiently stores those snapshots, and provides for recovery of just about any slice of data you could want. It’s quite the beautiful thing.
Data capture: VMware provides a nice set of APIs that allow disk arrays and backup vendors to intelligently drive snapshots of a VMware datastore (for the techies, these are the vStorage APIs for Data Protection, or VADP). The problem is that integration from a disk array to these APIs is a tier-1 kind of service found on very few disk arrays today. That’s where a good storage hypervisor comes in. A storage hypervisor will include its own integration between VMware VADP and hardware-assisted snapshots, and it will plug its control GUI directly into the VMware vCenter management console. That means, regardless of what type of disk array capacity you have chosen to use for your VMware data, the storage hypervisor will be able to do a hardware-assisted snapshot of the VMware datastore (all your VMs at once – sweet!).
Here’s a scenario we see…
- Administrators want to snapshot the VMware datastore 4 times a day. 4 days' worth are maintained – 16 total snapshots "online"
- For longer-term recovery, they promote one snapshot each day to a unified recovery manager. One month of these is maintained – 31 total snapshots "nearline"
The snapshots can add up, so efficiency is important. For the “online” snapshots, a good storage hypervisor stores only incremental changes, compresses the result and stores it as a thin provisioned volume on lower-tier disk capacity (the new 3TB SAS drives make a nice choice). Notice in this scenario, the administrator is also promoting one of the snapshots each day (say, the midnight snapshot) to an enterprise recovery manager. If you are using IBM’s Tivoli Storage Manager Suite for Unified Recovery, then it will insert deduplication in the list of efficiency techniques being applied to the snapshot (incremental snapshots that are deduplicated, compressed, and stored on lower-tier disk… that’s about as efficient as it gets).
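As a back-of-the-envelope illustration of why those efficiency techniques matter in this scenario, here is a short sketch that tallies the snapshot counts and rough retained capacity. All of the figures (datastore size, daily change rate, compression and deduplication ratios) are assumptions made up for the example, not measured results.

```python
# Back-of-the-envelope sketch for the snapshot scenario above.
# Every figure here is an assumption for illustration only.
DATASTORE_TB = 10.0           # size of the VMware datastore
SNAPSHOTS_PER_DAY = 4
ONLINE_RETENTION_DAYS = 4     # 16 "online" snapshots kept on the storage hypervisor
NEARLINE_RETENTION_DAYS = 31  # 31 promoted "nearline" snapshots kept by the recovery manager

DAILY_CHANGE_RATE = 0.05      # assumed fraction of the datastore changed per day
COMPRESSION_RATIO = 0.5       # assumed compressed size / raw size
DEDUP_RATIO = 0.6             # assumed deduped size / compressed size (nearline only)

online_count = SNAPSHOTS_PER_DAY * ONLINE_RETENTION_DAYS
change_per_snapshot_tb = DATASTORE_TB * DAILY_CHANGE_RATE / SNAPSHOTS_PER_DAY

# Online copies store only incremental changes, compressed.
online_tb = online_count * change_per_snapshot_tb * COMPRESSION_RATIO
# Nearline copies get deduplication on top of compression.
nearline_tb = (NEARLINE_RETENTION_DAYS * DATASTORE_TB * DAILY_CHANGE_RATE
               * COMPRESSION_RATIO * DEDUP_RATIO)

print(f"online snapshots:   {online_count} copies, ~{online_tb:.1f} TB retained")
print(f"nearline snapshots: {NEARLINE_RETENTION_DAYS} copies, ~{nearline_tb:.1f} TB retained")
print(f"worst-case recovery point: {24 // SNAPSHOTS_PER_DAY} hours")
```

With these assumed numbers, 47 recovery points cost only a few terabytes of capacity, and the worst-case recovery point works out to the 6 hours mentioned below.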
Flexible recovery: Whether the snapshot is online or nearline, the only reason you have it is so that you can recover when something (anything) goes wrong. A good hypervisor / unified recovery manager combination will give VMware administrators the ability to peer inside the snapshot and recover individual files, virtual volumes, or entire VMs. Using the scenario above, your recovery point would be no more than 6 hours old for the last 4 days, and your recovery time would be measured in minutes.
IBM offers one of the world's best-known unified recovery managers and the world's most widely deployed storage hypervisor. With over 7,000 storage hypervisor deployments, we’ve had a lot of opportunity to build some depth. Deep integration with VMware for modernizing your data protection environment is one example. If you are running VMware and haven’t yet modernized data protection, IBM can help. You can learn more at the following links.
Storage hypervisor platform: IBM System Storage SAN Volume Controller (SVC)
Storage hypervisor management, storage service catalog, and self-service provisioning: Tivoli Storage Productivity Center Standard Edition
Join the conversation!
The virtual dialogue on this topic will continue in a live group chat on September 23, 2011, from 12 noon to 1 p.m. Eastern Time. Join some of the Top 20 storage bloggers, key industry analysts and IBM Storage subject matter experts to discuss storage hypervisors and get questions answered about improving your private storage environment.