What makes high performance storage high performing?
One of the most important ways to optimize performance is to tier your data and storage systems for the right combination of speed, capacity and economy. One of the most advanced storage solutions — hybrid storage — is able to migrate data between tiers as needed.
While cost remains the largest motivating factor for good storage management, business concerns are just as important. You cannot afford to max out storage capacity only to watch your applications fail for lack of space. The result may be downtime, lost revenue or lost data. To maximize storage efficiency and optimize storage usage, you want the right data on the right storage tier. You want to see the current distribution of data and the recommended distribution of data based on the tiered thresholds that you set.
After all, data is not created equal. Your business might have data that is highly transactional, where a few-millisecond hiccup can result in lost business or even a terrible user experience. For example, think high-velocity trading in an online brokerage house. On the other hand, some of your data is rarely accessed but can accumulate to a huge amount over time. For example, think email archives. This is where the tiered storage solution in IBM Spectrum Control Storage Insights enables you to see past and current trends in capacity and space usage, and, based on that usage, offers recommendations for your storage needs.
In Storage Insights, click Home > Dashboard to see the Tier Planning chart.
To get more information on tier planning, click See Details.
When you go to Insights > Tier Planning, you can see:
The charts that show the current and recommended distribution of the volumes across the tiers that you created
The list of the volumes that are identified as candidates for re-tiering
The Tier Planning page shows you more information about how your data is currently distributed and the recommended distribution of data across the storage tiers. In the table, the volumes that contain data that is not on the right tier are displayed. To maximize storage efficiency and optimize storage usage, you can decide which volumes you want to down-tier.
In the Allocated Space by Tier chart, you can hover the mouse pointer over the column for the tier to see the current and recommended allocation of space for each tier.
You want the data in your data center to be placed on the tiers that best match the performance requirements in your business organization. You place your storage into tiers by assigning tier levels to pools and you set performance thresholds to generate recommendations for your storage. Each time that data is collected, the tiering of your storage is analyzed.
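The analysis described above can be pictured with a small sketch. This is purely illustrative: the `Volume` fields, pool thresholds, and tier numbering are hypothetical assumptions, not the actual Storage Insights logic or API.

```python
# Hypothetical sketch of threshold-based tier analysis. Tier 1 is the
# fastest (e.g., flash) and tier 3 the capacity tier; thresholds map a
# tier level to the maximum observed IOPS that still fits that tier.
from dataclasses import dataclass

@dataclass
class Volume:
    name: str
    tier: int      # current tier level (1 = fastest)
    iops: float    # observed I/O rate from the last data collection

def recommend_tier(volume, thresholds):
    """Return the cheapest tier whose IOPS ceiling the volume fits under."""
    for tier in sorted(thresholds, reverse=True):   # cheapest tier first
        if volume.iops <= thresholds[tier]:
            return tier
    return min(thresholds)   # busiest volumes land on the fastest tier

thresholds = {1: float("inf"), 2: 500.0, 3: 50.0}
vols = [Volume("trading_db", 1, 4200.0), Volume("mail_archive", 1, 3.0)]

# Volumes whose recommended tier differs from their current tier are
# candidates for re-tiering, like the table on the Tier Planning page.
candidates = [(v.name, recommend_tier(v, thresholds))
              for v in vols if recommend_tier(v, thresholds) != v.tier]
```

Here the rarely accessed archive volume is flagged for down-tiering, while the busy trading volume stays on the fast tier.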
“IBM Spectrum Control Storage Insights from IBM not only delivers a performance dashboard to monitor overall health, but also enables you to drill into and address storage issues affecting service delivery.” – Johan van Arendonk, Systems Engineer, E-Storage B.V.
So what's next?
Give it a try!
Explore the live demo that contains the actual product with sample data; it’s the quickest way to try out the solution. You will need an IBM ID to log in. If you do not have an IBM ID, Storage Insights provides a link for you to create one. Follow the guided tutorials for not only tier planning optimization, but also troubleshooting performance, monitoring block capacity, monitoring file capacity and identifying reclaimable storage.
Another way to get fast insights into your storage environment is the free 30-day trial. Try the Storage Insights solution by using your IBM ID, or by creating one. After the 30-day trial you can easily convert to a subscription.
On Monday evening, IBM Tivoli Storage Manager Product Manager, Tom O’Brien, provided an update on the Data Protection roadmap. A roadmap is a high level development plan that helps customers, business partners and subject matter experts prepare for new capabilities. Check back with the IBM Systems Storage blog for a posting about the Data and Storage Management roadmap, and other storage sessions at InterConnect.
The TSM Family development team is continuing their mission to make IBM storage software more useful (intelligent, intuitive and transparent). The roadmap is focused on innovations that matter, such as better support for hybrid clouds, administration changes that can reduce human error and large scale environment support.
IBM’s transparent development process creates working partnerships with customers throughout the design and development process, with the goals of implementing new features right the first time, and in the right priority order.
I would like to thank the customers who have participated in Beta and Early Access Programs. You’ve done a fantastic job. I encourage readers to participate in these programs and contribute your ideas. If you’re interested, please send a note to Lorena Colston, email@example.com.
A Storage Pool Called ‘DEDUP’
IBM is building a new type of Storage Pool designed for disk-to-disk backups using TSM deduplication. The new design differs from deduplication in prior TSM versions, in that deduplication occurs in-line, at the time of the backup, rather than later, as a post-processing step. That means TSM deduplication works more like the Data Domain and ProtecTIER deduplication appliances, but without the cost and limitations.
‘Dedup’ storage pools have additional benefits. Because deduplication happens in-line, there is less TSM server processing, such as deduplication, reclamation and database reorganization, that can interfere with backup windows.
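To make the in-line idea concrete, here is a minimal sketch of how such a pool could deduplicate at backup time. The fixed-size chunking, SHA-256 fingerprints, and in-memory store are illustrative assumptions; the actual TSM design is not shown here.

```python
# Minimal sketch of in-line deduplication: chunks are fingerprinted as
# they arrive, and only unseen chunks are written to the pool, so no
# post-processing dedup pass is needed later.
import hashlib

CHUNK_SIZE = 4096
store = {}   # fingerprint -> chunk bytes (the 'dedup' pool)

def ingest(stream: bytes):
    """Return the list of fingerprints referencing the stored chunks."""
    refs = []
    for i in range(0, len(stream), CHUNK_SIZE):
        chunk = stream[i:i + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        if fp not in store:        # deduplicate at backup time,
            store[fp] = chunk      # not as a post-processing step
        refs.append(fp)
    return refs

data = b"A" * 8192 + b"B" * 4096   # two identical chunks, one unique
refs = ingest(data)
# Three chunk references, but only two unique chunks actually stored.
```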
Cloud Storage Pools
Cloud storage pools will give TSM Administrators fast access to nearly infinite amounts of storage pool space. No more waiting for data center storage provisioning. No more stress about running out of space in the backup environment.
What if backup data could overflow onto cloud storage, whenever you needed more space in your on-premises backup environment?
What if old backups migrated to cloud storage until they expired, using an automated disk-to-disk-to-cloud process?
What if you could implement a basic multi-site data recovery capability faster, without having to wait for equipment to be deployed in a recovery center?
What if your TSM storage pools could live on the cloud temporarily, to aid a migration or upgrade project?
Cloud storage pools will be easy to deploy. Policies and security are managed from your on-premises TSM environment. TSM servers are not required at the Cloud Service Provider facility.
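The overflow scenario above can be sketched as a simple policy: when the on-premises pool crosses a fill threshold, the oldest backups migrate to a cloud pool. The function name, thresholds, and data layout are hypothetical, offered only to illustrate the idea.

```python
# Hedged sketch of a disk-to-disk-to-cloud overflow policy.
def migrate_overflow(local_pool, cloud_pool, capacity, high_water=0.9):
    """Move the oldest backups to the cloud pool until local usage
    drops below the high-water mark.

    local_pool: list of (age_days, size_bytes), oldest first.
    Returns the bytes still used in the local pool."""
    used = sum(size for _, size in local_pool)
    while local_pool and used > high_water * capacity:
        entry = local_pool.pop(0)      # oldest backup overflows first
        cloud_pool.append(entry)
        used -= entry[1]
    return used

local = [(400, 30), (90, 50), (5, 40)]   # oldest first, sizes in GB
cloud = []
remaining = migrate_overflow(local, cloud, capacity=100)
# Only the oldest backup moves; recent backups stay on premises.
```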
Web-based self-service restore portal
There’s no service like self-service, especially for data recovery. When data is lost, you want it recovered fast – and the fewer people who know, the better. IBM is building a recovery portal that data owners can use without special training or software. Restores can be requested anytime, from anywhere.
Data Centers and Service Providers will be able to handle day-to-day restore requests without customer service requests or administrator time.
The restore portal is planned initially for VM workloads, but designed to expand to more workloads over time.
Off-site Replication for Everyone
It’s a challenge to move daily backups to recovery centers in such a way that applications can be recovered quickly should the primary site become unavailable. For many organizations, this is an expensive and risky form of insurance against something that will probably never happen. And, like most insurance, it’s no fun to pay the premiums or to experience a loss.
IBM is making off-site backup replication faster and easier. Setup and monitoring will be integrated into TSM Operations Center. Like other TSM Operations Center features, replication setup is done with a few mouse clicks, and can be monitored at a glance from any browser-enabled device.
Performance will be faster, particularly in bad network neighborhoods. Network optimization enables consistently fast throughput, even over long distances and in environments with high packet loss.
These capabilities, combined with Recovery Simulators already available, create a modern data protection environment that gives you confidence your data is protected.
Easier to Buy and Deploy
Two packaging changes are planned: TSM Suite for Unified Recovery will add FlashCopy Manager, and a new pre-packaged TSM Virtual Appliance is expected.
... And That's Not All
Several other planned capabilities were discussed, too. This blog only touches on the highlights. Contact your IBM seller or IBM Business Partner for additional details.
A discussion of unreleased products requires the following standard IBM disclaimer:
The information contained in this publication is provided for informational purposes only. While efforts were made to
verify the completeness and accuracy of the information contained in this publication, it is provided AS IS without warranty
of any kind, express or implied. In addition, this information is based on IBM’s current product plans and strategy,
which are subject to change by IBM without notice. IBM shall not be responsible for any damages arising out of the use
of, or otherwise related to, this publication or any other materials. Nothing contained in this publication is intended to,
nor shall have the effect of, creating any warranties or representations from IBM or its suppliers or licensors, or altering
the terms and conditions of the applicable license agreement governing the use of IBM software.
References in this publication to IBM products, programs, or services do not imply that they will be available in all countries
in which IBM operates. Product release dates and/or capabilities referenced in this presentation may change at any
time at IBM’s sole discretion based on market opportunities or other factors, and are not intended to be a commitment to
future product or feature availability in any way. Nothing contained in these materials is intended to, nor shall have the
effect of, stating or implying that any activities undertaken by you will result in any specific sales, revenue growth, savings
or other results.
About the Author
Mike Barton is a Worldwide Storage Marketing Manager at IBM. Prior to 2007, Mike was a Technical Manager and Principal IT Specialist for IBM, and a Sales Rep and Principal IT Specialist for Sybase. He holds ITIL Foundation and Gartner Group TCO Certifications. Mike has been with IBM over 15 years and has over 25 years of Information Technology experience. The opinions expressed herein are his own.
This morning at IBM InterConnect, IBM executives Steve ‘Woj’ Wojtowecz and Michelle Steen kicked off a week of storage sessions with a broad-ranging discussion that included Software Defined Storage, product announcements, and roadmaps.
Over 60 storage sessions this week! There will be presentations by customers, Business Partners, and IBM experts. Sessions will feature cloud solutions, storage for Big Data and Analytics, software-defined storage, data protection updates and more. My colleagues and I will blog about as many sessions as we can, so check the IBM Systems Storage blog for the latest news.
Michelle Steen, Ph. D.
Mgr., IBM Storage Product Management
Software Defined Storage
Last week, IBM announced Spectrum Storage, a Software Defined Storage family of solutions (For details, please see the IBM Press Release, Computerworld, Datamation, Woj’s blog or my blog). Software Defined Storage is the next logical iteration for the intelligence that manages storage systems. Originally, storage management software, such as RAID, ran on servers. At the time, there was no processing power available on disk systems. As disk systems became more powerful, the intelligence moved onto storage systems, where it could deliver better performance. Today, storage systems can be defined entirely in software, not tied to physical storage or servers, creating the opportunity to significantly reduce costs and improve business agility.
Software Defined Storage will be a key enabler for hybrid clouds, internet marketplaces and other applications that require data to be more mobile.
IBM’s approach to Software Defined Storage is more valuable to customers. Other approaches provide minimal capability and try to lock in a particular brand of hardware. IBM’s approach is open, comprehensive, and uses core technology that’s been proven in Data Centers, so it’s ready to be deployed today.
IBM Storage Insights (beta)
A new cloud-based solution running on SoftLayer, Storage Insights deploys in as little as 30 minutes and starts sharing insights immediately. For example, you can quickly identify unused and misallocated storage for reclamation, which can postpone the next storage capacity upgrade. Proprietary analytics from IBM Research provides storage tier optimization recommendations, which can reduce users’ per terabyte cost of storage by up to 50%. Application and department views of storage help data owners ‘see’ their storage, often for the first time. An optional Performance Management module helps troubleshoot application performance issues and optimize data placement.
Virtual Storage Center (VSC) expands support for IBM Spectrum Scale (GPFS) and Elastic Storage Systems (ESS) file-based storage management. Version 5.2.5 adds new reports, hard and soft quota views, and performance management. Now, performance can be analyzed in near real time, from the perspective of the file system, its underlying storage, and the fabric in between. This update is the third phased deployment of management capabilities for the Spectrum Scale platform, making VSC the most robust management software for Spectrum Scale.
FlashCopy Manager (FCM) expands support to include Spectrum Scale snapshots. This addition provides new options for large file system owners. Now, one vendor supports the data owners’ choice of remote mirroring, managed snapshots, incremental backups and policy-based hierarchical space management. You don’t have to pay for the high cost option for all your data, as competitors often recommend.
IBM Business Resiliency Services announced the availability of Cloud Managed Backup services on SoftLayer, using Tivoli Storage Manager. The solution enables fast deployment of business-class hybrid cloud and cloud-to-cloud managed backup services. IBM has emerged as one of the top managed backup services for business workloads. TSM is the #1 platform in both signings and terabytes managed for IBM Cloud Managed Backup service.
Michelle Steen also introduced product roadmaps for both Data and Storage Management (Spectrum Control, TPC/VSC) and Data Protection and Recovery (Spectrum Control, TSM Family). There will be separate breakout sessions this evening and Tuesday afternoon. Please check back for separate blog posts on roadmap sessions. The future is about hybrid clouds, software defined storage, new performance / scalability limits, and usability.
IBM InterConnect 2015 – the premier conference for cloud and mobile – kicks off February 22–26 in Las Vegas, Nevada, to deliver one of the most comprehensive technology events ever. While the excitement has already begun, let’s take a look at the top 5 reasons why you shouldn’t miss IBM InterConnect 2015:
General Sessions, Keynotes, and Break-out Sessions:
InterConnect 2015 offers you over 42 tracks, 8 streams, 3 general sessions to learn from the top industry experts about the latest and the greatest trends and technologies. From development to architecture to operations, InterConnect will provide 1500+ sessions worth of the best education, networking, and exhibits on topics like cloud, mobile, security, DevOps, and more. For instance, hear the latest IBM strategies from key executives like Tom Rosamilia, IBM SVP and gain insight from the General Session guest speakers, Barbara Corcoran, Daymond John, and Robert Herjavec of ABC's Shark Tank. InterConnect 2015 also offers you an opportunity to build your own agenda. Click here to learn more.
Business Partners Summit:
Save the date for the InterConnect 2015 Business Partner Summit, being held on February 22nd. This one-day summit offers a wide range of content and relationship building activities; networking with IBM executives, product and industry experts, and other Business Partners; and features bestselling author, venture capitalist, and entrepreneur Josh Linkner as the guest speaker. Download the Business Partner Summit Program Guide.
dev@InterConnect:
With great software comes great responsibility! This year, thousands of software developers, architects, designers, and programmers will leave their language, platform, and editor wars behind to come together for two days of sessions, training, and building with the people and companies who are creating the new technology landscape. Hack, make, break, and shake with people who are super smart at what they do. Just like you. Click here to see what to expect from dev @InterConnect.
Solution EXPO:
The Solution EXPO at InterConnect 2015 is the hub for networking, collaboration, and engagement. We’ve also incorporated new architectural elements which create a modern, functional space that facilitates the many demos and discussions taking place on the EXPO floor. Click here to check the key features included in the Solution EXPO at InterConnect 2015.
Get Social with InterConnectGO:
InterConnectGO is the digital interactive platform for InterConnect 2015. Hosted by gamer and video star Veronica Belmont, the event features three full days of live streaming video straight to your laptop or mobile device. If you have colleagues who can’t attend InterConnect 2015, encourage them to register for InterConnectGO for the free online digital experience. Also, if you like being on social, there’s a whole new opportunity waiting for you. Just share your InterConnect story or experience and get a chance to be featured as InterConnect Social Jockey on @IBMStorage. Don’t forget to mention @IBMStorage and include hashtag #IBMInterConnect in your posts.
Isn’t it exciting? So what are you waiting for? Register now and be a part of the InterConnect 2015 experience! For up-to-the-minute updates, follow #IBMInterConnect or #IBMStorage on Twitter. If you have any questions, please don't hesitate to contact me at firstname.lastname@example.org. I look forward to seeing you at InterConnect 2015!
Day 2 at IBM Edge 2014 focused on how clients, Business Partners and IBM are working together to build smarter infrastructures to meet the business challenges discussed on Day 1 (cloud, analytics, mobile and social).
Chris O’Connor, @chrisoc_IBM, Vice President of Strategy & Engineering, IBM Cloud & Smarter Infrastructure, spoke about the need to seamlessly extend infrastructures from what organizations own today to what they’ll need in the future. He recommended two approaches:
Cloud-enable existing workloads
Think about ‘cloud first’ for new workloads
The idea is to accelerate time to market and enable real-time, actionable insights. With 70% of enterprises planning to pursue hybrid clouds by 2015, according to a 2013 report by Gartner, a two-pronged approach makes sense.
Andrea Nelson, Director of Storage Marketing at Intel, corroborated Gartner’s estimate, saying an estimated 50% of organizations less than 10 years old are putting their IT infrastructures on the cloud today.
Chris spoke about the importance of standards, such as OpenStack, that help organizations quickly assemble Software Defined Systems from components, rather than building clouds a stick at a time. With new development platforms such as IBM’s Code name: BlueMix, organizations can construct enterprise-capable cloud applications faster, without having to deploy a cloud infrastructure.
Mike North, Sr. Director of Programming for the National Football League, spoke about the importance of speeding up the infrastructure to enable analytics. ‘Time to truth’ is critical for analytics. With faster processing, the NFL is able to look at hundreds of potential schedules and choose the one with the best potential outcomes for their constituents. IBM’s Arvind Krishna suggested that traditional analytics is like driving a car by looking at the rear-view mirror: you can only see where you’ve been. Predictive analytics helps you see into the future, react faster, and achieve better business results.
Maria Winans, @mariawinans, IBM VP of Social Business, spoke about how IBM and other organizations are driving people-centric engagement for new profit channels. She also spoke about the importance of analytics, saying you can’t personalize customer experiences if you can’t do the required analytics. Maria offered 3 suggestions for successful social business initiatives:
Build shared value
Protect your brand
New mobile applications offer the opportunity to improve customer satisfaction and customer loyalty, as well as generate new revenue. Rapid transformation is happening across industries and geographies. IBM estimates there will be over 1 trillion connected objects and devices by 2015. Mobile applications are enriched by cloud, analytics and social business initiatives.
Storage virtualization and Software Defined Storage
Storage virtualization is the foundation for Software Defined Storage. Virtualization provides an abstraction layer between physical storage and applications that use it. The result is a storage infrastructure that can grow and change without impacting users or applications. Software Defined Storage will be required to manage the vast amounts of data organizations expect to manage in the years ahead.
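The abstraction layer can be illustrated with a tiny sketch: applications address a virtual volume, and a mapping layer resolves each extent to whichever physical array currently holds it. All class and array names here are hypothetical, not any vendor's actual data structures.

```python
# Illustrative sketch of the virtualization abstraction layer: data can
# move between physical arrays while the application's virtual address
# for it never changes.
class VirtualVolume:
    def __init__(self, extent_map):
        # extent_map: virtual extent index -> (array_name, physical_extent)
        self.extent_map = extent_map

    def resolve(self, virtual_extent):
        """Translate a virtual extent to its current physical location."""
        return self.extent_map[virtual_extent]

    def migrate(self, virtual_extent, new_array, new_extent):
        """Data moved between arrays; only the mapping is updated."""
        self.extent_map[virtual_extent] = (new_array, new_extent)

vol = VirtualVolume({0: ("array_a", 17), 1: ("array_b", 3)})
before = vol.resolve(0)
vol.migrate(0, "array_c", 8)        # e.g., a hardware refresh
after = vol.resolve(0)
# The application still reads virtual extent 0; only the backend changed.
```

This is why the underlying hardware can grow and change without impacting users or applications: they only ever see the virtual address.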
Steve ‘Woj’ Wojtowecz, @steve_woj, IBM Vice President, Storage Software Development, shared new research from ITG that analyzed storage TCO using IBM, EMC, and VMware storage management solutions. ITG highlighted 4 issues that significantly impact storage TCO:
Storage software costs
Storage administration costs
IBM Virtual Storage Center users were far more successful than their peers using EMC or VMware storage management solutions:
In large enterprises, storage TCO was 72% lower with IBM than EMC
In midsized environments, storage TCO was 35% lower with IBM than VMware storage management.
Jose Garcia, Manager of Enterprise Storage and VMware at UCLA Health System, discussed his storage transformation project. Storage virtualization enabled rapid deployment of an Electronic Health Records system that improves patient care and improves organizational efficiency. Storage virtualization also reduced storage costs and enabled rapid data growth. Improved efficiency saved enough to fund a 3rd data center that will improve resilience and flexibility.
Collaborators wanted. No Eeyores. No squirrels.
Snehal Antani from GE Capital spoke about the importance of delivering IT at market speed, and with commercial intensity. He offered a strategy of dealing with important groups of people in the organization:
Kings and Queens
Collaborators can accelerate change. Identify your collaborators and put them on a pedestal
Cynics are like Eeyore in Winnie-the-Pooh. They’ll tell you why change is hard, and focus on what might go wrong. Ignore your cynics.
Kings and Queens are executives and managers who are eager to be offended. They resist change that may impact their empires. They’re a small, but vocal, group. Don’t give them a megaphone.
Snehal also pointed out that technologists can get distracted by new technology, even if it isn’t essential to simplify or accelerate IT delivery. It’s like yelling, ‘Squirrel!’ to distract dogs, as in the movie, Up. GE Capital has signs that say, ‘No Eeyores’ and ‘No squirrels’.
Bottom line: Infrastructure matters
Can the right infrastructure help you build competitive advantage? Yes, of course. Infrastructure matters.
About the author
Mike Barton is a worldwide storage marketing manager at IBM. Mike is a former IT specialist with Gartner TCO and ITIL certifications. The opinions expressed herein are his own.
ITG Management Report: Cost/Benefit Analysis of IBM Virtual Storage Center Compared to EMC Storage Virtualization Solutions
[Software Defined Storage (SDS)] is getting a lot of attention lately by press, analysts and technology providers such as IBM, causing organizations large and small to take notice. SDS describes a set of storage access and data management services that can deliver what IT administrators are most interested in these days:
Lower storage costs
Less reliance on specific storage systems
Simplified data and storage management
Improved utilization of existing resources
International Data Corporation (IDC) published a [taxonomy for Software Defined Storage] which defines software-based storage as a storage software stack running on commodity, off-the-shelf computing hardware. SDS should offer a full suite of storage services and federation of the underlying storage to enable data mobility, according to IDC.
The interesting thing is, while the name Software Defined Storage is relatively new, IBM has been delivering technology and client solutions that match the SDS definition for over a decade.
Matching IDC’s definition, [IBM SAN Volume Controller], introduced in 2003, is an x86-based appliance running Linux code, providing federated storage virtualization across heterogeneous storage platforms and enabling advanced storage services. SAN Volume Controller has been proven to scale to multiple petabytes. This core technology is also included in IBM’s midrange Storwize storage systems. To date, over 55,000 SAN Volume Controller and Storwize systems have been shipped worldwide, making it one of the most popular business-class storage virtualization solutions.
If you can’t attend Edge, look for video interviews with Brian Jeffery, Managing Director of International Technology Group, and Steve Wojtowecz, VP of Storage and Network Management Software Development, on [TheCUBE, by Wikibon], live on Monday, May 19 and afterwards on demand.
By now, everyone in the IT Industry is convinced about the benefits of virtualization. Server virtualization – Yes! But storage virtualization? That’s not an easy one!
In server virtualization, we simply divide one physical server into several virtual environments, which saves you lots of money and helps wring the best out of your infrastructure. But did you know that an inadequate storage system can actually cause the economic benefits of server virtualization to fall through, because virtual machines can consume large amounts of storage?
So, why is it that Storage Virtualization still isn’t as popular as Server Virtualization?
In storage virtualization, we group physical storage from multiple network storage devices so that they look like a single storage device that can be managed from a single console. This can raise several complexities when –
Data center has storage products from different vendors
Some vendors provide storage virtualization for their own hardware only
You need the features of virtualized storage to be available in vendors’ non-virtualized offerings as well
Throw in the further dilemma of choosing where to put your virtualization (hosts? network? arrays?), or perhaps being bound by a vendor lock-in clause. You have probably been buying additional devices and systems on an ad hoc basis to meet new storage demands, but think of the advantages you could gain with a storage architecture (read: storage virtualization) that allows you to upgrade when needed and in a cost-effective manner! Now that’s enough to convince you of the virtues of storage virtualization.
Let’s find out why IBM’s VSC is the best bet in the market.
A recent ITG report compared the IBM VSC solution to those of other large vendors (EMC, VMware) over a period of five years to determine savings.
And the verdict is – IBM Virtual Storage Center (VSC):
The only solution to address large-scale multivendor virtualization
Averaged 72 percent less than EMC for overall costs of ownership
Averaged 35 percent less than VMware over 5-year total cost of ownership
Supports more than 200 platforms, from all of the major vendors as well as many smaller ones
You can dig into the calculations and analysis in the two ITG reports to understand the how and why of it. As one user shared when asked to describe VSC in simple terms, “It works.”
IBM’s Data Protection has all the right pieces
Jason Buffington, Senior Analyst at ESG, said in his interview with Dave Vellante from Wikibon that IBM fills out the whole data protection spectrum and that its new UI is a great proof point of why it’s not your granddaddy’s solution. One of the top 5 problems people face in protecting a virtualized environment is lack of visibility. IBM’s new UI does a great job of adding that visibility. IBM has all the right pieces with its breadth of data protection solutions. With IBM starting to put cloud more aggressively into the mix, 2014 looks interesting.
Data Protection is a rainbow that must have all the colors
In Jason’s opinion, when defining data protection strategies, one should think of data protection as a rainbow, with backup, snapshots, replication, archive and availability making up the different colors. When have you ever seen a rainbow with no green? The mechanism of data protection should not only include the whole range of solutions but must also include a hybrid approach spanning tape, disk and the cloud. Organizations can pick the color from the spectrum according to what they want to recover and how.
Disk, Tape, Cloud – they are all going to stay
Disk is not going to be the be-all and end-all. Tape is going to stay, with economic advantages and new innovations like LTFS that make tape look like disk, adding flexibility and durability. Cloud as a backup service is not the silver bullet either, because it is only a deployment mechanism; it does not make your backup problems go away. One still has to run it, handle the administration and push the agents out, and one needs an on-premises intermediary appliance for fast recovery before going to the cloud. However, when ESG looked at the primary use cases for cloud over the next couple of years, they found data protection at number one and disaster recovery at number three. Jason suggests that every solution should consider cloud as part of it.
Data protection need not be so hard
His advice to IT pros who are worried about the cost and complexity of data protection is that it need not be so hard. There are great solutions available that allow you to back up, archive, snapshot, replicate and perform an entire range of functions from a single GUI, to a single data store, from a single administrator’s view. People only need to wake up to the solution and start using it.
In the two years since IBM acquired Butterfly, it has generated hundreds of Analysis Engine Reports (AERs), analyzing billions of gigabytes and establishing facts about Tivoli Storage Manager (TSM) that should make the competition sit up and take notice.
The Backup Analysis Engine report from IBM Butterfly Software uses light-touch, agentless software technology to analyze an existing heterogeneous data backup environment. It is a non-intrusive analysis based on empirical production data collected in minutes.
Why is Butterfly important? The Gartner Magic Quadrant: Backup and Recovery 2013 competitive analysis says that between 2012 and 2016, one-third of organizations will change backup vendors due to frustration over cost, complexity and/or capability. Being able to say conclusively that the TSM solution can cut backup infrastructure costs by as much as 38 percent compared to some competitive products opens the door for IBM to win over that one-third of organizations looking for a change.
AER is the key
More demand for AERs is expected with the launch of the automated “self-service” AER generation model. Scheduled to go live at the beginning of 2H 2014, it will scale out as a service to IBM and its Business Partners. These facts drive home the point that Butterfly AERs have matured into a well-accepted, standard approach to storage infrastructure analytics.
Meet the Butterfly Storage and Backup Assessment Team at Pulse 2014
If the butterfly flutter has caught your interest, visit Pulse 2014, February 23-26 in Las Vegas, and meet the folks who deliver Butterfly Storage and Backup Assessments in the IT Optimization section of the IBM booth. Find out how your company can use business analytics to dramatically lower the cost of running your backup and recovery or primary storage infrastructure.
Backup redesign continues to sit near the top of most analysts’ lists of 2013 IT priorities. I’ve talked a lot about some of the catalysts behind this trend, like data growth, big data, VMware and software-defined storage. With IT managers redesigning, the incumbent enterprise backup vendors have a lot of motivation to offer innovative solutions that are a bit ahead of the times. The leaders have all placed strategic bets on what the winning formula will be. I discussed these bets in my post “Forrester’s take on enterprise backup and recovery.”
For its part, IBM is moving quickly to help IT managers redesign. The help starts with a clear understanding of the economic benefit a redesign can bring. After all, in today’s environment few IT managers make technology moves simply for the sake of technology. Storage is about economics. I discuss this more fully in my post “Does trying to find a better economic approach to storage give you ‘Butterflies’?” But there is still efficient technology that enables these economic savings, and the person in IBM who is ultimately responsible for the technology in IBM Tivoli Storage Manager (TSM) is the product manager, Dr. Xin Wang.
Recently I spoke with Xin about the important shifts IT managers are facing and how she is helping IT managers reimagine backup.
The Line*: Xin, I’m going to start with the “Dr.” part of your title. Should folks call you the Backup Doctor?
Xin: (laughing) Well, I don’t know about that. I’m actually a doctor of Applied Physics. One thing that drove me to earn a PhD and has moved me ever since is that I love to learn. I started my career in IBM hard disk drive research, spent some time as a storage software developer and development manager, and have now been working with backup clients as a product manager for several years.
The Line: Wow, I could probably do an entire post just on your career. But let’s stay focused. What have you learned about the challenges IT managers are facing and this whole backup redesign movement?
Xin: It’s interesting. The challenges aren’t secret but they carry big implications for backup. Data is growing like crazy; that’s no secret. But it is now so big that the old method of loading an agent on a server to collect and copy backup data over a network to a tape isn’t keeping up. So IT managers are redesigning.
And what about servers? Servers aren’t servers anymore. Thanks to VMware, they are virtual machines that come, go and move around in a hurry. Traditional backup is too rigid. So IT managers are redesigning.
Administrators are changing too. The generation of backup admins who grew up tuning the environment is giving way to a new generation of backup, VMware and cloud admins who need much more intuitive and automated management tools. And so IT managers are redesigning. (Editorial comment: I discussed the change in administration in my post “Do IT managers really ‘manage’ storage anymore?”)
The Line: Okay, I think I’m seeing your trend. IT managers are redesigning. And it seems like you’ve got a clear idea of why. Can we take your list one at a time? I think my readers would be interested in what you are doing with TSM in each of these areas.
Xin: Sure, that makes sense.
Check back for part 2 of the interview in which Xin shares her near term plans for TSM. If you have questions for Xin, please join the conversation by leaving a comment below.
*The Line is my personal blog, and when it appears in the interview, it represents me as the interviewer.
With a new school year underway, vacation season come and gone for many, and the Labor Day long weekend upon us in North America, entering September marks the unofficial end of summer. For many, this is a somewhat depressing time of year as we realize that colder temperatures and the onset of winter aren’t far off.
However, it’s not all bad news. Some of us prefer outdoor activities in the fall and winter months, and when it comes to business, the fall brings a renewed interest in sharpening our skills and seeking networking opportunities by attending industry conferences and events.
For storage professionals in North America, an ideal opportunity comes in the form of Storage Decisions New York on September 16 and 17. Storage Decisions New York plans to bring together over 500 end users, independent experts, analysts, and top solution providers to engage in thought-provoking presentations, interactive networking opportunities, and sponsor showcases featuring the latest trends and technologies impacting the storage industry. The two-day conference is the only place you will find the industry's foremost independent experts – and the most qualified group of storage professionals – under one roof.
As a platinum sponsor of Storage Decisions New York, IBM will have a multi-faceted presence at the conference with ample opportunities to engage with the storage community. One of the highlights is our Tech-in-Action talk, where IBM’s Storage Software Business Strategist Ron Riffe will outline IBM's point of view on The Critical Decisions for Improving The Economics of Storage. Ron will touch on a range of considerations including the need for improved administration, the role of software-defined and the impact of flash – just to name a few.
Over the course of the two-day event, IBM storage experts will be available in booths 24 and 25 to meet attendees and discuss practical solutions to today’s storage challenges. The IBM booth will also be where attendees can pick up their complimentary conference USB key, which will be loaded with conference-wide materials and presentations.
Storage Decisions New York is worth taking a look at as a great way to kick off the fall conference cycle. If you're planning to attend, stop by and visit us. If you happen to be on the west coast and are concerned that New York is too far to travel, don't worry: Storage Decisions is stopping in San Francisco on October 30.
Last December while attending the 2012 Gartner Data Center Conference in Las Vegas, I listened to an insightful presentation by analysts Sheila Childs and Pushan Rinnen on the bring-your-own-device (BYOD) phenomenon. They were particularly focused on issues related to protecting an organization's data in a BYOD world (more on why in a moment). One scenario that captured my attention went something like this.
It’s my device. I had it before I brought it to work, and I was using Dropbox or iCloud to sync and share all my files. Now, my device has work data on it too. My security-conscious CIO doesn’t want work data shared on those public services. But I’m accustomed to, and almost dependent on, my sync and share capability, and my organization hasn’t yet given us a private alternative.
Now, in my role as a technology strategist I spend a good bit of time helping to plan our investments. With the speed at which mobile and social technologies are sweeping through organizations, I have to admit the case that Sheila, Pushan and other Gartner analysts made that week for the rapidly emerging data protection crisis in BYOD sync and share was compelling. It occurred to me that credible vendors who were able to solve the problem in short order would be in high demand. That was eight months ago.
Fast forward seven months
In July, Forrester analysts Ted Schadler and Rob Koplowitz published The Forrester Wave™: File Sync And Share Platforms, Q3 2013 in a quest to uncover those credible vendors. I liked the way they characterized the problem. “Employees’ need to synchronize files grew from a whisper to a scream over the past few years. . . .The scream will grow louder as the number of tablets will triple to 905 million by 2017 to join the billions of computers and smartphones used for work.” The report evaluated and scored 16 of the most significant solution providers against 26 criteria. Among the leaders was IBM SmartCloud Connections. You can see the complete list of leaders here.
Change is here
The interesting thing that most folks miss in the sync and share conversation is that it’s about more than just syncing and sharing. As BYOD smartphones and tablets proliferate in the workplace, document management will shift from email attachments and file servers into social collaboration. Forrester points to a further social shift from casual partner collaboration to compliant workflow in regulated industries.
That kind of data is important – and the reason that the Gartner analysts were focused on the data protection issues of this BYOD world. Organizations today have well matured processes for protecting data on file servers and email systems, usually with an enterprise backup product. I commented on this set of tools in my post on Forrester’s Take on Enterprise Backup and Recovery. But as corporate information is relocated from file servers and email systems to sync and share systems, Gartner had an unmistakable reminder for its customers, “Consumer File Sync/Share Is Not Backup”.
I agree! The good news is that IBM has taken the time to ensure its enterprise backup product, IBM Tivoli Storage Manager Suite for Unified Recovery, protects synched and shared files in IBM Connections with all the same efficiency it does file servers, email systems and most any other data important to an organization.
What is your organization doing with file sync and share? How are you protecting that information?
In the modern datacenter, there’s a lot of shifting going on when it comes to traditional storage management responsibilities. What used to be the domain of a central storage and backup administration team has been thrown up for grabs as server virtualization and software-defined everything have entered the scene. I hinted at this a bit in my post Do IT managers really “manage” storage anymore? But let’s consider a practical example that’s quite common with clients I speak to. If you are going to VMworld 2013, plan on attending the IBM TSM for VE hands-on lab to get more details.
Microsoft SQL Server is the foundation for a lot of applications that are critical to business operation – meaning CIOs and IT managers are interested in its recoverability. Those same CIOs and IT managers are also interested in the recoverability of their VMware estates, the software-defined compute (SDC) platform that houses those databases. For many clients, the problem is that these two domains are tightly guarded by two independent superheroes, and neither is specifically trained in storage.
Superhero #1: The database administrator (DBA)
Most DBAs that I’ve known have an almost personal connection with their databases. They care for them as they would their own children. The thought of leaving one unprotected (without a backup) equates to dereliction of duty. Ignoring the idea that it takes a village to raise a child (or in this case that there may be other members of the IT administration village like VMware admins and backup admins), SQL Server DBAs will often work alone with the backup tools Microsoft provides to ensure their databases are protected. Good for the SQL Server, but not so much for the surrounding infrastructure. For databases running on VMware, routine full backups even with periodic differential backups can consume a LOT of disk space and virtual compute resources, and also contribute to the I/O blender effect.
Superhero #2: The VMware administrator
TSM for VE in VMware vSphere web client
VMware administrators can be just as focused on their domain as DBAs are. Their attention is on being able to recover persistent or critical virtual machine (VM) images, regardless of what app happens to be riding along. VMware has done a nice job of creating and supporting an industry of tightly integrated backup providers. These tools can get at the VMware data through a set of vStorage APIs for Data Protection (VADP), and VMware administrators can manage them through vCenter plug-ins. But few VMware admins are completely aware of all the workloads that run on their VMs, and even fewer are aware of the unique recovery needs of all those workloads. It’s just hard to keep up.
Common ground exists
One tool that bridges the gap is IBM Tivoli Storage Manager for Virtual Environments (TSM for VE). Nicely integrated with both VADP and SQL Server, TSM for VE can bring together VMware administrators and the DBAs in ways that would make any IT manager smile. Here are two of the more common approaches.
We can each do our own thing – together
As noted above, SQL Server DBAs take full backups sprinkled with differentials. Even though this approach can tax server and storage resources and contribute to the I/O blender effect, it is in the DBA comfort zone. When the app is running on a VMware virtual machine, the DBA has the option of storing those backups on disk storage associated with the VM. It’s a nice thing to do because it allows the VMware admin to stay within his comfort zone too. Using vCenter to drive a VADP-integrated snapshot tool like TSM for VE, the VMware admin can capture a complete copy of the virtual machine, along with the SQL Server backups the DBA created. Since the likely use of such a snapshot would be to recover the VM and then recover the database from its backup, there’s really no reason to include the source SQL Server database or logs in the snapshot. With TSM for VE, the VMware admin can exclude the source SQL Server database from being redundantly backed up, adding to an already formidable set of built-in efficiency techniques (with TSM for VE, snapshots are taken incrementally – forever – and can be deduplicated and compressed). It’s a good compromise solution, letting each admin stay in his or her comfort zone. But it can be better.
We can join forces and do something really great
With TSM for VE, VMware admins and SQL Server DBAs can put their heads together and choose to do something really great. For the DBA, it’s an exercise in less-is-more. The DBA stops doing her own backups. No more full or differential copies of the database. No more taxing resource usage on the VM. No more I/O blender effect. Just, no more. How? Well, with a VMware VADP integrated backup tool like TSM for VE, the snapshot of the VM is accompanied by a freeze and thaw of the SQL Server database (techno-speak for putting the database in a consistent state), just like what happens when a backup is independently initiated by a DBA. And with TSM for VE, as soon as the TSM server confirms that it has successfully stored the consistent snapshot in a safe, physically separate place, it will connect back with the SQL Server to truncate the database logs.
In addition to the less-is-more benefits above, think about the differences in restore with these two scenarios. When the DBA and VMware admin simply coexist, each doing their own thing, restoring the SQL Server database includes steps for restoring:
The VM snapshot to get the database backups in place
The full database backup
The subsequent differential backups
By comparison, when the DBA and the VMware admin join forces with TSM for VE, the steps are dramatically simplified. Restoring the snapshot equates to restoring a consistent copy of the database. And remember, because these snapshots are highly efficient, they can be taken quite frequently.
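The practical difference shows up at restore time. As a rough sketch (the step names below are descriptive labels for this example, not TSM commands), the two approaches compare like this:

```python
# Illustrative comparison of the restore procedures described above.
# Step names are descriptive labels, not actual TSM for VE commands.

coexist_restore = [
    "restore the VM snapshot (puts the DBA's backup files back in place)",
    "restore the full database backup",
    "apply the subsequent differential backups",
]

joined_forces_restore = [
    "restore the application-consistent VM snapshot",
]

print(f"coexist: {len(coexist_restore)} steps; "
      f"joined forces: {len(joined_forces_restore)} step")
```

Fewer restore steps means fewer chances for error and a shorter recovery time, which is the real payoff of the joined-forces approach.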
Going to VMworld 2013? Come visit IBM on the Solutions Exchange floor at booth #1545.
VMworld 2013 is just around the corner and at IBM, we’re gearing up for a great set of conversations with our joint clients. As you’re planning your agenda, here are a couple of things worth looking in to.
For years Hollywood has been enamored with the idea of artificial intelligence. Beyond tabulating, beyond programmed responses, what would happen if a computer could learn, reason, analyze, predict? In short, what could computers do if they could think? Sadly, more often than not, Hollywood’s answer resulted in some kind of disaster. In 2001: A Space Odyssey, the HAL 9000 computer decided to kill the astronauts on Discovery One. In the 1983 film WarGames, the WOPR computer played games with global thermonuclear war, and in the Terminator franchise of movies, SkyNet attempted to exterminate the human race. Ugh!!
I’m proud that I work for a company who has a very different perspective on the potential of cognitive computing. Instead of blowing people up, IBMers around the world are developing cognitive systems to help us make better decisions.
Recently, Forrester published The Forrester Wave™: Enterprise Backup And Recovery Software, Q2 2013. I wasn’t surprised by their suggestion that “CommVault [Simpana 10.0], EMC [Avamar 7.0 and NetWorker 10.1], IBM [TSM 6.4], and Symantec [NetBackup 7.5] lead the pack. It’s a tight four-horse race for the top honors — [they] all scored high on strategy and current offerings.” These are the four vendors that are always pushing and shoving on each other in analyst comparisons. The thing that caught my attention in this report was the expert job analyst Rachel Dines did in peeling back a complex market space to uncover some important strategic observations about each vendor. Read more...
Last Monday, EMC announced ViPR as its new Software-defined Storage platform. Almost simultaneously, Chuck Hollis described it as ‘Breathtaking’ in his usually excellent blog. I must admit, one thing I routinely find breathtaking about EMC is their approach to marketing. They have a knack for being able to take unexceptional technology (or, as in this case, combinations of technology and theories about the future), and spin an extraordinarily compelling story. With all seriousness and without tongue in cheek… Nicely done EMC! Read more.
Royse Wells, Chief Storage Architect for International Paper, discusses Tivoli Storage Manager Operations Center, previewed at Pulse 2013. TSM Operations Center is a new graphical interface that helps administrators and management get quick summary views of the backup environment and simplify administration.
Jeff Jones, UNUM
UNUM Uses Tivoli Storage Manager for Virtual Environments
Jeff Jones is senior infrastructure manager at UNUM, a leading provider of financial protection benefits in the US and UK. UNUM has about 85% virtual servers today. UNUM uses Tivoli Storage Manager for Virtual Environments to deliver faster backups and restores, and reduce the risk of data loss for 650 Windows and Linux VMs.
Klavs Kabell, IT-WIT
Modernizing Backup for Today’s Virtual Environments
Klavs Kabell is a Senior System Consultant at IT-WIT, an IBM Business Partner in Denmark specializing in backup solutions. Klavs discusses how backup solutions have evolved as VM deployments have grown. Tivoli Storage Manager for Virtual Environments helps simplify VM backup administration and tracking, while incremental ‘forever’ technology improves storage efficiency.
Thomas Bak, Front-safe
Cloud backup and archive using TSM and Frontsafe Portal
Front-safe received the Best Cloud Solution award at the IBM Pulse 2013 conference, and the 2013 IBM Beacon Award for the Best Solution to Optimize the World’s Infrastructure. Learn about the value of enabling backup as a cloud service, using Front-safe Portal software.
Laura DuBois, Program VP of Storage for IDC, and Steve Wojtowecz, IBM VP of Storage and Networking Software, discuss client opportunities and requirements for storage clouds and compute clouds. Client cloud storage requirements include backup and archive clouds, file storage clouds, and storage that supports compute clouds.
Chris Dotson, IBM CIO Office
IBM’s storage transformation featuring SmartCloud Virtual Center
Chris Dotson works in IBM’s CIO Office as a Senior IT Architect for Services Transformation. He is guiding IBM’s own storage transformation. As a large enterprise, IBM manages over 100 petabytes of data, growing at 25% per year. Chris discusses block storage virtualization, automated block storage tiering, file cloud storage, and automated block storage management at IBM. He shows how SmartCloud Virtual Storage Center is reducing storage costs by 50% with no noticeable performance impact to users.
BJ Klingenberg, IBM Global Technology Services
IBM Global Technology Services Uses Tivoli Storage Productivity Center (TPC) to Manage Clients’ Storage Environments
BJ Klingenberg is a Distinguished Engineer and Enterprise Storage Management lead for IBM. BJ shares his experiences using IBM Tivoli Storage Productivity Center in IBM’s Service Provider environment. Service Provider environments are governed by Service Level Agreements, so managing capacity, performance and availability are essential capabilities. Storage efficiency is essential to remaining competitive. See how TPC helps IBM deliver outstanding customer service at competitive prices.
Jason Buffington, ESG Senior Analyst, and Tom Hughes, IBM Worldwide Storage Executive, discuss business and technical challenges for data protection. Tom and Jason discuss new solutions and best practices for protecting data more efficiently and effectively for today’s cloud, mobile and virtual environments.
Colin Dawson, TSM Server Architect introduces Tivoli Storage Manager Operations Center, the next generation of backup administration from IBM. He describes how TSM Operations Center was designed and built, using extensive user feedback.
Jonathan Bryce, OpenStack Foundation founder and Todd Moore, IBM
OpenStack Provides Compute, Storage and Network Interoperability for Clouds
The OpenStack Foundation has gained 170 corporate and over 8,200 individual members since its inception in 2012, making it one of the fastest growing cloud standards. Jonathan Bryce, Executive Director and founder of the OpenStack Foundation, and Todd Moore, IBM Software Group Director of Interoperability and Partnerships, discuss the capabilities and opportunities for building cloud solutions using OpenStack to manage compute, storage and network resources.
Deepak Advani, IBM
Optimizing IT Infrastructures for Today’s Workloads
Deepak Advani, General Manager of Tivoli Software discusses top issues and opportunities facing clients as they adopt new breeds of applications to engage with customers and improve operations using mobile devices, cloud and analytics.
Day 1 at Pulse 2013 was grand and exciting! I am not in Vegas, so how do I know? No points for guessing: all thanks to social media, #IBMPulse and my friends in Vegas who are tweeting away every moment of the event.
Today is for IBM’s business partners! Deepak Advani, Tivoli General Manager, reemphasized the role of Business Partners and how critical they are for IT innovations to achieve business results. Todd has captured the essence of this event very well in his Pulse blog.
Day 1 was also the day of awards!! Congratulations are in order! The IBM Tivoli Award for Best Data Management Center Solution was picked up by CobaltIron. CobaltIron will cover their solution during their Wednesday session, including how to transform data backup costs into a business opportunity – a must-attend session! (Watch out for session# 1914, Room# 114, 2:00pm on Mar 6.)
Front-Safe got the IBM Tivoli Award for the Best Cloud Solution. Watch out for their session on March 6 (session# 2436, room# 114) highlighting how to create a Cloud Service around IBM Tivoli Storage Manager.
Then came a real example of how we turn opportunities into positive outcomes: Chris Gardner! We talk about technologies all day, but the one that created ripples on twitter.com was “The Pursuit of Happyness.” Thanks, Todd, for blogging from Pulse.
Day 1 evening was reserved for birthday celebrations! Yes, IBM Tivoli Storage Manager turns 20!! The celebration marks two decades of backup and recovery management leadership. The solution expo hall was abuzz with storage enthusiasts! Thanks to Dave Russell of Gartner Inc. for joining TSM’s 20th birthday celebrations!! Needless to say, what a great way to network with storage experts from around the world!
And what a way to end Day 1: a musical extravaganza! Bella Electric Strings performed at the Expo opening.
Thanks, folks, for bringing Pulse to the people who are tracking and trending the world over. What I may not know is how much you all won at blackjack. Have a good one!
Data protection matters! Actually, it matters even more with the advent of big data. The unique challenges of managing and protecting big data have forced IT professionals to revisit their data backup and protection policies. Every year ESG conducts a forward-looking spending intention survey, and they shared a couple of interesting facts that do not surprise me but definitely reinforce my thoughts. When organizations were asked what they would consider the most important IT priorities over the next 16-18 months, 30 percent said “improved data backup and recovery”!
And when they were asked what they would characterize as challenges with their organizations’ current data protection processes and technologies, “cost” and the “need to reduce backup time” came out as the major concerns.
ESG analysts Mark Peters and Tony Palmer shared these insights as they took us through the results of their lab testing on Tivoli Storage Manager. If you are not familiar with IBM Tivoli Storage Manager (TSM), it is scalable client/server software designed primarily for centralized, automated data protection. The goal of the ESG report is to educate IT professionals and provide insight into advanced data backup technologies, such as forever-incremental backup and deduplication, and why they are so important in the current landscape. Click here for the ESG video.
The TSM Lab Validation was performed using a combination of hands-on testing, audits of IBM customers in live production environments and detailed discussions with IBM experts. The objective was to validate some of the valuable features and functions of the product, show how they can be used to solve real customer problems, and identify any areas for improvement. IBM has continuously invested in the TSM platform, bringing innovation to data protection and recovery, and ESG evaluates how the newer versions of TSM provide a turnkey solution to a range of data protection issues. They found that the two technologies (deduplication and progressive incremental backups) working in tandem were able to achieve 90 percent data reduction after just six incremental backups and 95 percent data reduction after ten backups. The replication function is also fully integrated with deduplication, optimizing recovery speed during disasters. TSM uses policy-based automation along with intelligent move-and-store techniques, helping to reduce data administration effort. Overall, ESG’s validation rightfully points to the key enhancements to the TSM platform that drive greater scalability, efficiency and data availability. Please register and download the detailed 23-page ESG Lab Validation Report here. Opinions are my own.
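To see why a forever-incremental design stores so much less data, here is a back-of-the-envelope model. It is purely illustrative; the change rate, deduplication ratio and backup schedule below are assumptions for the example, not ESG's measured figures.

```python
# Illustrative model of why progressive (forever-incremental) backup plus
# deduplication shrinks the backup store. All parameters are assumptions.

def traditional_stored(full_size_gb, change_rate, num_backups, full_every=7):
    """Periodic full backups with incrementals in between (assumed schedule)."""
    total = 0.0
    for i in range(num_backups):
        if i % full_every == 0:
            total += full_size_gb                # full copy of everything
        else:
            total += full_size_gb * change_rate  # only the changed data
    return total

def progressive_stored(full_size_gb, change_rate, num_backups, dedup_ratio=0.5):
    """One initial full, then incrementals forever; dedup shrinks the store."""
    total = full_size_gb + full_size_gb * change_rate * (num_backups - 1)
    return total * dedup_ratio                   # assumed 2:1 dedup

if __name__ == "__main__":
    trad = traditional_stored(1000, 0.05, 10)    # 1 TB source, 5% daily change
    prog = progressive_stored(1000, 0.05, 10)
    reduction = 100 * (1 - prog / trad)
    print(f"traditional: {trad:.0f} GB, progressive: {prog:.0f} GB, "
          f"reduction: {reduction:.0f}%")
```

Even with these modest assumed parameters, the progressive approach stores well under half the data after ten backup cycles, which is directionally consistent with the large reductions ESG reports.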
If you are following the developments related to Pulse 2013, you’re likely well aware that Peyton Manning has been announced as the keynote speaker and that the evening entertainment at Pulse Palooza will feature Carrie Underwood. If you’ve been to Pulse before, you also know you can expect compelling thought leadership in the General Sessions and deep content in over 300 breakout sessions.
Over and above all that exciting news there’s one thing that keeps attendees coming back year after year – the opportunity to network. Each year following Pulse, attendees tell us through the post-Pulse survey that networking with the over 8000 conference attendees rises to the top as the most valuable aspect of the event.
The opportunity to network with your Storage colleagues at Pulse 2013 will once again be front and center at the conference. Formal opportunities exist such as the Storage Birds of a Feather session, Meet the Experts in Storage, Client Connections along with access to Storage subject matter experts from development and product management in the Expo Hall. And of course in Las Vegas there’ll also be plenty of informal gatherings to connect with Storage professionals to share knowledge and expertise.
A great way to start the networking process is to take in the numerous client-led sessions in the Unified Recovery and Storage Management track within the Cloud and IT Optimization stream at Pulse 2013. Following the track kick-off, which features Dave Russell, Research Vice President at Gartner, you’ll have the opportunity to hear IBM clients sharing their experiences, some highlights of which include:
• Learning about the experiences of Chesapeake Energy with the new TSM Backup and Recovery Dashboard based on their participation in the Early Adopters Program;
• Understanding how The University of Sydney is using SmartCloud Virtual Storage Center to provide centralized management of its diverse storage environment;
• Hearing how Banco do Brasil improved its backup capabilities by taking advantage of the latest advancements in Tivoli Storage Manager;
• A panel of experts from Blue Cross Blue Shield of Louisiana, Kindred Healthcare and Centene Health discussing how they are protecting healthcare data with IBM storage solutions.
While this is just a tiny sampling of the types of organizations that will take to the podium in the Storage track at Pulse, there’s a wealth of experience to help you tackle your most pressing storage management challenges. Taking in the sessions is only the beginning – connecting with these storage professionals in the numerous networking opportunities at Pulse is how the conference truly comes to life.
If you’re already registered for Pulse, you can start networking now by connecting with the growing list of speakers and other conference attendees on the Pulse2013 Vivastream site; if not, visit the PULSE 2013 home page for all the conference details and to register today.
In many organizations today, storage replication is riddled with manual errors and poorly written in-house scripts that often provide no view of overall copy environment status. Additionally, the setup and ongoing management of large-scale copy services is increasingly cumbersome. Tivoli Storage Productivity Center (TPC) enables simplified yet comprehensive control over the replication process. With the release of TPC v5.1 in June 2012, replication management capabilities are now well integrated into the TPC core license.
TPC extends support for FlashCopy, Metro Mirror, Global Mirror, and Metro Global Mirror sessions. While providing central view of the replication environment, TPC provides end-to-end management and tracking of copy services, including both planned and unplanned disaster recovery procedures. In addition, TPC enables practice volume sessions that allow storage managers to test their DR environment without interfering with daily DR operations.
The following new capabilities were added to TPC v5.1:
Failover operations that are managed by other applications
Applications such as the IBM Series i Toolkit, VMware Site Recovery Manager, and Veritas Cluster Server manage failover operations for certain session types and storage systems. If an application completes a failover operation for a session, the ‘Severe’ status is displayed for the session. An error message is also generated for the role pairs for which the failover occurred.
Additional support for space-efficient volumes in remote copy sessions
You can use extent space-efficient volumes as copy set volumes for the following IBM System Storage® DS8000® session types:
• FlashCopy® (System Storage DS8000 6.2 or later)
• Metro Mirror (System Storage DS8000 6.3 or later)
• Global Mirror or Metro Global Mirror (System Storage DS8000 6.3 or later)
Reflash After Recover option for Global Mirror Failover/Failback with Practice sessions
You can use the Reflash After Recover option with System Storage DS8000 version 4.2 or later. Use this option to create a FlashCopy replication between the I2 and J2 volumes after the recovery of a Global Mirror Failover/Failback with Practice session. If you do not use this option, a FlashCopy replication is created only between the I2 and H2 volumes.
No Copy option for Global Mirror with Practice and Metro Global Mirror with Practice sessions
You can use the No Copy option with System Storage DS8000 version 4.2 or later. Use this option if you do not want the hardware to write the background copy until the source track is written to. Data is not copied to the I2 volume until the blocks or tracks of the H2 volume are modified.
Recovery Point Objective Alerts option for Global Mirror sessions
You can use the Recovery Point Objective Alerts option with IBM TotalStorage Enterprise Storage Server® Model 800, System Storage DS8000, and System Storage DS6000™. Use this option to specify the length of time that you want to set for the recovery point objective (RPO) thresholds. The values determine whether a Warning or Severe alert is generated when the RPO threshold is exceeded for a role pair. The RPO represents the length of time, in seconds, of data exposure that is acceptable in the event of a disaster.
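To make the RPO alerting idea concrete, here is a minimal sketch of how a role pair's exposure might be classified against Warning and Severe thresholds. This is an illustrative assumption, not TPC's actual implementation; the function name and threshold values are hypothetical.

```python
from datetime import datetime, timedelta

def rpo_alert(last_consistency_time, now, warning_secs, severe_secs):
    """Classify RPO exposure for a role pair (hypothetical sketch).

    The RPO here is the age, in seconds, of the last consistent copy,
    i.e. how much data would be lost if a disaster struck right now.
    """
    rpo = (now - last_consistency_time).total_seconds()
    if rpo > severe_secs:
        return "Severe"
    if rpo > warning_secs:
        return "Warning"
    return "Normal"

# Example: last consistent copy formed 90 seconds ago, with
# thresholds of 60 s (Warning) and 120 s (Severe).
now = datetime(2012, 6, 1, 12, 0, 0)
print(rpo_alert(now - timedelta(seconds=90), now, 60, 120))  # Warning
```

The real product tracks this per role pair and raises the alert asynchronously; the sketch only shows the threshold comparison itself.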
Recently, I had the distinct pleasure of delivering a presentation on Data Storage and Compliance at the IBM Tivoli event 'Business Without Limits 2012' in Bangalore, India. More than 100 attendees from almost every industry attended the event.
My track for the day: Addressing Data Growth, Threats and Compliance; Unified
The volume, velocity and importance of data have increased dramatically during the past few years, to the point where most backup and archiving solutions can't keep up with the scalability, functionality, performance, reliability and budget realities of today and tomorrow. Attendees learned how to reduce backup data capacity by as much as 95%, how to reduce the amount of new data at risk by 90% or more, and how to simplify global data recovery operations and achieve compliance by leveraging a unified management approach.
I was privileged to present in such an interactive session, where customers saw how our broad product portfolio can help address their business challenges.
IBM now brings 'Business Without Limits 2012' to several cities across the United States in October and November. This is an exclusive IBM Tivoli event designed to increase awareness and thought leadership among IT managers, infrastructure leaders, systems administrators, storage managers, and data center managers. IBM's Business Without Limits event is coming soon to the following cities:
The event will focus on how IBM's Integrated Service Management strategy brings together different capabilities to enable integrated delivery of business services across complex, interconnected physical and digital infrastructures.
IBM's Business Without Limits event will have the following storage tracks:
The pivotal role of storage in the modern data center
Backup and unifying recovery
Your data protection headaches to the cloud
Storage analytics and reporting
This conference will explore how you can capitalize on the opportunities of a smarter planet and remove the barriers to innovation, helping you achieve "Business without Limits." As today's leaders transition to smarter, flexible cloud infrastructures that speed the delivery of innovative products and services, effective storage management becomes a critical component of that success. Please join us at this event to learn more!
As an IBM marketing manager, my job includes writing about storage technology. This post is about more than technology, though. It's about a new breakthrough capability for managing storage costs and service levels.
I recently met with IBM Distinguished Engineer Mike Sylvia, who has been working on a Business Transformation project to enable automated right tiering for storage in IBM data centers. Right tiering is the notion that data should be hosted on the optimal storage tier to balance cost and performance requirements.
Mike explained that applications tend to be hosted on top-tier storage. When he analyzed actual usage patterns, Mike found that most data can be effectively hosted on lower cost storage. Mike's project put numbers to a problem that is often hidden from view and, until now, nearly impossible to solve.
Hosting data on the wrong storage tier turns out to be a huge efficiency problem. Mike predicts IBM will save $13 million over 3 years in one data center by periodically moving data to the right tier. During the pilot, users saw their cost for storage drop by 50% per TB on average. This is big.
Like many advancements, IBM's automated right tiering capability is accomplished by integrating existing technology. Mike Sylvia's project combines storage virtualization, storage management automation and analytics. Today, IBM offers the technology in a bundled solution called SmartCloud Virtual Storage Center.
How does it work?
Step 1: IBM's storage virtualization controller collects detailed usage metrics about the storage it manages throughout the data center, without impacting application performance.
Step 2: IBM's Storage Analytics Engine studies usage patterns over time to understand performance requirements.
Step 3: Storage tier recommendations are generated in reports that can be shared with application owners and IT management.
Step 4: Storage virtualization enables online data migration, with no disruption to applications or users.
Repeat: Usage patterns change over time, of course, so right tiering becomes an ongoing process.
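The analysis step above can be sketched as a toy classifier: given each volume's observed I/O density (IOPS per GB), recommend a tier. The thresholds, tier names, and function are illustrative assumptions for this post, not IBM's actual analytics engine.

```python
def recommend_tier(avg_iops_per_gb, hot=0.5, cold=0.05):
    """Toy tier recommendation from observed I/O density.

    Thresholds are made-up illustrations: volumes driving heavy
    I/O per GB stay on fast storage; quiet ones can be down-tiered.
    """
    if avg_iops_per_gb >= hot:
        return "tier1-ssd"
    if avg_iops_per_gb >= cold:
        return "tier2-sas"
    return "tier3-nearline"

# Hypothetical measured densities for three volumes:
volumes = {"trading_db": 1.2, "file_share": 0.2, "mail_archive": 0.01}
for name, density in volumes.items():
    print(name, "->", recommend_tier(density))
```

A real engine would weigh latency sensitivity, time-of-day patterns, and migration cost, but the shape of the decision is the same: measure, classify, move, repeat.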
Why does it work?
Automated right tiering delivers the efficiency benefits of Information Lifecycle Management without the headaches and hidden costs. Automated right tiering has significant benefits for both data owners and IT leaders, so everyone wins.
For example, application and database owners can gain the following benefits:
Applications can move to top tier storage when they need it, without waiting for a maintenance window.
Average storage costs drop significantly, without a drop in services.
IT leaders benefit, too. For example:
Storage tier decisions are based on analysis of actual usage patterns, not predictions.
Storage performance management tasks are eliminated.
Data can quickly and easily be moved back to its original storage tier if requested, without incurring an outage.
IBM automated right tiering works with most storage systems, so deployment is nondisruptive.
The technology that enables automated right tiering has significant additional benefits, such as the ability to eliminate scheduled outages for storage system maintenance.
Problem solved. How has your organization addressed the storage right-tiering challenge?
IDC has recently released its Worldwide Storage Software QView for the first quarter of 2012. In it, IDC estimates that the total Storage Software market for 1Q12 grew about 3.3% over 1Q11. IBM had a solid quarter while Symantec faltered, allowing IBM to take the overall #2 share rank position for 1Q12.
In the Overall Storage Software Market, IBM moved up to the #2 share rank position in 1Q12, gaining 2.0 share points over 1Q11.
IBM retained its #1 position for Archiving Software, growing faster than the market. HP holds the #2 spot with its 2011 acquisition of Autonomy.
IBM offers a comprehensive, flexible storage management software portfolio that helps organizations address storage management challenges across the enterprise, including data centers, remote/branch offices and desktop/laptop computers. Learn more about the specific components within the IBM storage software family that can help you create a more responsive and resilient storage infrastructure for your on demand business.
Live Webcast: Using Tivoli Storage Productivity Center to be the "eyes" into your SAN environment, and to see how that environment is changing. LIVE!
In the ever-changing SAN environment, Tivoli Storage Productivity Center has many components to help the storage administrator know when and where to focus their attention. We will walk through many of these in a live demo and see how they can be used.
Let TPC help you keep up with storage growth instead of working longer hours!
Scott McPeek, IBM Program Director, Storage Sales Enablement, has worked in the software industry for more than 30 years, the last ten with IBM as part of the TrelliSoft SRM acquisition. Scott now focuses on storage resource management, storage performance management and virtualization with products like TPC, SVC and the Storwize V7000.
How are you spending your time this weekend? Polishing up your Pulse 2012 storage session abstract, hopefully! With only 4 days left to submit a 100-word abstract by Nov. 7, we thought it would be helpful to share some final pointers. Keep in mind that this year's theme is Business Without Limits, and we are seeking to understand how you gained visibility, control and automation to deliver better business results.
What are the key benefits to you as a Speaker? One full Pulse conference pass ($1995 value) and the opportunity to gain visibility for your company, and take advantage of an incredible networking opportunity with over 7,000 industry experts, press, and analysts.
Here are some pointers on how to get your Storage Management session abstract accepted:
1. Focus it on topics such as how you used Tivoli Storage Manager to manage "big data", success with recent upgrades, or cloud storage.
2. Tell us about the key business challenges you were trying to solve, and how IBM Tivoli storage solutions helped you address them.
3. What was the ROI, or key results, from implementing a Tivoli storage solution, and what valuable lessons did you learn from the experience?
If you do not plan to speak at Pulse and attend the conference complimentary, don't forget to register during early bird registration by December 16. Early Bird registration can save you up to $700 off registering onsite! See you at Pulse 2012!
Well, it's that time again, hard to believe, I know... The PULSE call for papers has opened, and we want to have another banner year in the Tivoli Storage sessions! Last year we were standing room only in many of our sessions, and this year we hope to fill each room once again.
As for topic suggestions, we'd like to hear from customers who:
Use TSM to manage 'big data'
Have best practices, created with our Tivoli Storage portfolio that they want to share
NEW!! Technical Services Webinar: Capacity Planning in a Tivoli Storage Manager Environment
As much as customers would like to "back up everything and keep it forever", storage is not unlimited. The reality of ever-increasing data growth, combined with regulatory compliance and the associated risks, makes the arduous task of capacity planning for backup ever more critical. A new Reporting and Monitoring tool is available with Tivoli Storage Manager (TSM). This new tool, based on IBM Tivoli Monitoring, can collect and report on historical data and is an integral part of a capacity planning regimen.
This session will demonstrate a capacity planning methodology that conforms to the ITIL Capacity Planning process description, by showing how the TSM Reporting and Monitoring tool and other TSM components can be used to ease the pain of capacity planning. Additionally, this session will look at strategies, like data deduplication, to reduce the amount of backup data while maintaining regulatory compliance.
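As a sketch of the arithmetic behind such a regimen, the hypothetical helper below fits a least-squares line to monthly capacity samples and extrapolates forward. This is an assumption-laden simplification: real ITIL-style capacity planning would also account for seasonality, retention-policy changes, and deduplication savings.

```python
def project_capacity(history_tb, months_ahead):
    """Project storage use with a simple least-squares linear fit.

    history_tb: monthly capacity samples in TB, oldest first.
    Assumes roughly linear growth (a deliberate simplification).
    """
    n = len(history_tb)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history_tb) / n
    slope = sum((x - x_mean) * (y - y_mean)
                for x, y in zip(xs, history_tb)) \
        / sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    # Extrapolate from the last sample's position forward.
    return intercept + slope * (n - 1 + months_ahead)

# Six months of backup pool usage growing ~2 TB/month:
print(round(project_capacity([40, 42, 44, 46, 48, 50], 6), 1))  # 62.0
```

Even this naive projection makes the core point of the webinar tangible: with a trend line in hand, you can decide when to buy capacity or when data reduction must kick in.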
Presenters: Mark Vanderboll, IBM Tivoli Global Response Team; Dave Daun, IBM Advanced Technical Skills
The IBM Champion program is still accepting nominations for experts on IBM Tivoli software; nominations are open through August 19. The IBM Champion program recognizes exceptional contributors to the technical community -- clients and partners who work alongside IBM to build solutions for a smarter planet. An IBM Champion is an individual who leads and mentors his or her peers and motivates them toward IBM solutions and services. Champions can be found running user groups, managing websites, speaking at conferences, answering questions in online forums, writing blogs, submitting wiki articles, sharing how-to videos, and writing technical books.
The IBM Champion program recognizes and thanks these innovative thought leaders, amplifying their voice and increasing their sphere of influence in the technical community. IBMers, partners and clients are encouraged to submit nominations through August 13th. To learn more and to submit your nominations, go to: IBM Champion homepage.
The Central Depository Company of Pakistan Limited (CDC) is the only depository in Pakistan, handling the electronic settlement of transactions carried out at the country's three stock exchanges.
Business need: With numerous point management tools, time-consuming manual processes and no single help desk, IT administrators were constantly operating in a reactive mode and faced just 90 percent system availability.
Solution: IBM Business Partner Gulf Business Machines helped CDC implement an Integrated Service Management solution from IBM that increases IT efficiency while improving the effectiveness of business services.
Benefits: 90 percent reduction in average time for root cause analysis; estimated 50 percent reduction in time to support new lines of business; 98 percent improvement in service level agreement (SLA) levels.
"IBM Tivoli Storage Productivity Center gave us greater visibility into storage utilization, helping us optimize capacity planning and improve our storage ROI to save 30%" —Syed Asif Shah, Chief Information Officer, Central Depository Company of Pakistan Limited
Read the complete case study for more details on the solutions CDC used to implement an Integrated Service Management solution. More success stories of other customer implementations of IBM technologies can be found here.
A new supercomputer at the University of Kentucky has placed it in the top 10 of public universities for compute power.
According to UK President Lee T. Todd Jr, "This supercomputer will allow our world-class researchers to discover new solutions to the complex problems facing the Commonwealth, the nation, and the world."
This new high-performance compute cluster comes with 200 terabytes of usable disk storage. This important data is protected by Tivoli Storage Manager (TSM) and Hierarchical Storage Manager (HSM), connected to the UK central backup system.
I'd like to congratulate UK on making it into the top 10 public university supercomputers. Lastly, I'd like to thank them for selecting TSM to protect this critical infrastructure.
You can find the full report on the UK supercomputer at HPCwire.
I have been writing about IBM Tivoli Storage FlashCopy Manager on Windows and some of the new functions that we released earlier this year like Exchange Server 2010 support and SQL Server 2008 R2 support. We are working on some more exciting enhancements and I want to tell you about an early access program for the next release of FlashCopy Manager. If you are interested in looking at and testing some of the new functions and features of the next release of IBM Tivoli Storage FlashCopy Manager, please contact your IBM Tivoli Sales Representative to get more information.
This is a nice opportunity to see what is coming in the next release of FlashCopy Manager and test it in your own environment. Act now!
Juniper Networks recently published a solution brief regarding the performance boost you get from using TSM Fastback in concert with their WAN optimization (WXC). The value proposition is pretty straightforward: reduced backup times and reduced WAN bandwidth and cost. You can read the full details in the report, but here are a few snippets worth noting:
Conceptual view of the bandwidth savings ...
Savings of backing up 92GB over a 155Mbps link with 100 ms latency:
These savings are above and beyond those you already get with TSM Fastback (taken from the solution brief):
IBM Tivoli Storage Manager FastBack provides an extensive and cost-effective data protection and recovery solution specifically designed to help remote offices maintain operations, regardless of the type of data loss.
The FastBack Client uses next-generation, block-level technology to capture new and changed data on the application servers as frequently as needed—up to true CDP—with almost no performance impact on the applications. This provides for much more granular recovery, leaving much less data at risk of loss than traditional once-a-night backup solutions.
The FastBack Server provides the management, policy engine, and local repository for the protected data. The server includes near-instant restore capabilities, enabling critical applications to resume within seconds following almost any type of data loss. The server also initiates “selective replication” jobs to send copies of selected data over the WAN to another location for disaster recovery and backup consolidation capabilities.
The FastBack DR Server aggregates the backup data from multiple remote offices—enabling extremely fast recovery of remote office workloads should an office go offline for any reason. The FastBack DR Server also can be used to enhance protection of business-critical application servers in the data center, and it integrates easily with central data protection and retention solutions such as IBM’s Tivoli Storage Manager.
TSM Fastback is a solution that has seen strong adoption from customers with remote offices ... If backup times or bandwidth usage over a WAN are a concern, I suggest you look into the WXC offering from Juniper Networks in concert with TSM Fastback.
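For a rough sense of scale in the 92 GB scenario above, here is a back-of-envelope wire-time estimate. It deliberately ignores latency, protocol overhead, and the appliance's actual behavior, but it shows why shrinking the payload on a 155 Mbps link matters; the 90% reduction figure is an illustrative assumption, not a number from the brief.

```python
def transfer_hours(payload_gb, link_mbps, reduction=0.0):
    """Naive wire-time estimate: payload divided by link rate.

    reduction is the fraction of bytes removed before transfer
    (e.g. by WAN optimization or deduplication). Latency and
    protocol overhead are ignored in this sketch.
    """
    bits = payload_gb * (1 - reduction) * 8 * 1e9  # decimal GB -> bits
    return bits / (link_mbps * 1e6) / 3600

# 92 GB over a 155 Mbps link with no optimization:
print(round(transfer_hours(92, 155), 2))        # ~1.32 hours of pure wire time
# Same payload if optimization removed 90% of the bytes:
print(round(transfer_hours(92, 155, 0.9), 2))
```

Real-world times are longer than the no-optimization figure once 100 ms of latency and TCP behavior enter the picture, which is exactly the gap the WAN optimization appliance targets.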
At the recent Gartner IOM 2010 conference in Orlando, Florida, I had the good fortune of listening to a series of interesting topics and meeting some really smart people. As one might have guessed, the bulk of the sessions focused on virtualization and cloud topics. But the one topic that piqued my interest was unrelated to virtualization and cloud: it was deduplication, hosted by Dave Russell.
The intent of the session was to bring forward some examples of customers deploying deduplication technologies in their backup and recovery solutions. Most of you who read this blog know that deduplication and data reduction have been a hot topic in the industry. And as you likely know, almost every major vendor out there offers some form of deduplication with its associated benefits.
This session provided us two customers who were willing to talk about their experiences with deduplication and the benefits they've received. One customer is using CommVault and the other is using IBM Tivoli Storage Manager v6 (TSM). While both customers showcased the quantified benefits from deduplication, the presentation from the TSM customer went beyond just the benefits of deduplication. The TSM customer revealed their quantified benefits and also identified some of the best practices they developed regarding deduplication.
This particular TSM customer is a large producer of natural gas in the U.S. The customer's environment has TSM managing about 1.3 petabytes of data from more than 1,500 nodes. Overall, their approach to managing backup storage is to do it as efficiently as possible and to reduce the overall amount of backup data under management.
Prior to leveraging TSM deduplication, this customer began with "incremental forever" and compression. Once TSM v6 was released, they adopted deduplication at the server and client in concert with the other data reduction features provided by TSM.
As they began evaluating their use of deduplication, they had to deal with demands from their internal customers (DBAs and Exchange admins like full backups, for example). Furthermore, they had to consider their rate of data change, evaluate retention policies, and ensure that their restore requirements weren't negatively impacted by the use of deduplication.
After significant testing and planning, the customer decided that they would initially deploy deduplication for their Oracle databases and Windows OS and system state backups. The results of using TSM deduplication were impressive ...
Oracle deduplication results: a 75% reduction of Oracle backup data after deduplication. This was on 3.8 TB of physical space on disk and about 15 TB of data on tape.
And their results on Windows OS and system state were a whopping 94%, taking them from 172 GB of managed data down to 11.4 GB. In this scenario, the customer leveraged TSM 6.2 client-side (source-side) deduplication.
Overall, very impressive results. By leveraging the data reduction features within TSM, the customer was able to save by using fewer tape library cells, tape drives, and disks.
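The percentage figures quoted above follow from simple arithmetic; a quick check of the Windows OS and system-state numbers:

```python
def reduction_pct(before, after):
    """Percent of data removed, given sizes before and after reduction."""
    return 100 * (1 - after / before)

# The Windows figures quoted above, in GB:
print(round(reduction_pct(172, 11.4), 1))  # 93.4, in line with the reported ~94%
```

The same formula applied to larger pools (e.g. the Oracle disk and tape footprints) is how vendors and customers typically report deduplication effectiveness.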
In the end, the customer stated that TSM data reduction (with deduplication) helped them meet their objectives - efficiently reduce data under management. Furthermore, it allows them to reduce their overall HW costs and meet or improve restore requirements. The last comment the customer made before closing the session was that with all the various TSM data reduction capabilities in production, their job had ultimately gotten simpler now that their environment was running more efficiently ...
This is a fantastic story that I really enjoy sharing. If you are a TSM customer and have benefited from its data reduction technologies, then please give me a shout as I would like to hear your story as well.
Working with IBM, a hospital in Asia Pacific gained a data protection solution that meets users' data availability requirements, scales on demand to support a growing warehouse of patient data and medical images, and simplifies data migration and data recovery tasks.
The benefits of the solution include a 50% reduction in the backup window, restores of individual Microsoft Exchange objects in minutes, and system restores in under 10 minutes.
Read the complete case study to see how this Asia Pacific hospital gained peace of mind with virtualized data protection from IBM.
More success stories of other customer implementations of IBM technologies can be found here.
While I was at Pulse 2010 in Las Vegas, I had the opportunity to interview Scott Sterry from Cristie Software Limited. Cristie Bare Machine Recovery integrates with IBM Tivoli Storage Manager to provide a Bare Machine Recovery (BMR) solution for Windows®, Linux, SUN Solaris and HP-UX.
If you were unable to attend the live Pulse 2010 event in Las Vegas, you can still attend the Virtual event - register today. You can also check out the Pulse Comes To You Web site to see if there will be an event in a city near you.
The end of last year was pretty hectic for a lot of us, and you might not have attended IBM's "Information on Demand Gala." As a refresher, that is where we introduced our Smart Archive Strategy. Several of my customers have been asking for a refresher on the topic, and we've just posted a short video describing this comprehensive approach, which combines IBM software, systems and service capabilities designed to help you extract value and gain new intelligence from information by collecting, organizing, analyzing and leveraging it. For more information, watch the video, visit the IBM Smart Archive Strategy website, and meet me at Pulse 2010 by attending the Storage Track sessions to discuss your specific archiving needs.
The countdown is on... with only 2 weeks left until Pulse 2010, I wanted to give you an update on the additional perks you'll have access to if you register and attend. Meet the Experts! Talk one-on-one with product experts:
Booth 80: SAN Volume Controller and Tivoli Storage Productivity Center – storage virtualization, storage resource management, data discovery
Optimizing Infrastructure: Smarter Systems, Storage and Information Retention Zone
Booth 92: IBM Information Archive and IBM Smart Archiving Strategy
Booth 93: IBM XIV: Storage Reinvented
Booth 95: IBM System Storage DS8000 Series
Delivering Business Value with Smarter Services
Booth 79: IBM Storage Enterprise Resource Planner
Check out my previous blog, The Pulse Roadmap to Storage Expertise, for information on some of the sessions you can attend. Use the online agenda tool to build your agenda and view all the sessions available (it requires only an IBM.com password; you do NOT have to be a registered Pulse attendee to create a Pulse schedule online).
Share Your Story: This year at Pulse 2010 we are scheduling videotaped interviews with clients who are willing to share their thoughts on what they are doing to achieve visibility, control, and automation in their infrastructure. We will be filming client videos at Pulse from Sunday, February 21, through Wednesday, February 24. The content will be used to produce short videos that tell the stories of the needs clients are addressing in their organizations. Our customers have been sharing their stories throughout 2009, as you can see below. Interested in participating? Notify me at email@example.com
IBM Tivoli Storage Manager FastBack is a great continuous data protection, backup and recovery solution for both midmarket and large enterprise organizations, whether for branch offices or data centers.
For more storage sessions while at Pulse 2010, check out this blog.