What makes high performance storage high performing?
One of the most important ways to optimize performance is to tier your data and storage systems for the right combination of speed, capacity and economy. One of the most advanced storage solutions — hybrid storage — is able to migrate data between tiers as needed.
While cost remains the largest motivating factor for good storage management, business concerns are just as important. You cannot afford to max out storage capacity only to watch your applications fail because they are unable to run. The result may be downtime, lost revenues or lost data. To maximize storage efficiency and optimize storage usage, you want the right data on the right storage tier. You want to see the current distribution of data and the recommended distribution of data based on the tiered thresholds that you set.
This is where IBM’s new SaaS offering, IBM Spectrum Control™ Storage Insights, can be a valuable tool in your storage management operations toolkit.
After all, data is not created equal. Your organization might have data that is highly transactional, where a hiccup of a few milliseconds can result in lost business or even a terrible user experience; think high-velocity trading at an online brokerage house. On the other hand, some of your data is rarely accessed but can accumulate to a huge amount over time; think email archives. This is where the tiered storage solution in IBM Spectrum Control Storage Insights helps: it lets you see past and current trends in capacity and space usage, and based on that usage it can offer recommendations for your storage needs.
In Storage Insights, click Home > Dashboard to see the Tier Planning chart.
To get more information on tier planning, click See Details.
When you go to Insights > Tier Planning, you can see:
The charts that show the current and recommended distribution of the volumes across the tiers that you created
The list of the volumes that are identified as candidates for re-tiering
The Tier Planning page shows you more information about how your data is currently distributed and the recommended distribution of data across the storage tiers. The table displays the volumes that contain data that is not on the right tier. To maximize storage efficiency and optimize storage usage, you can decide which volumes you want to down-tier.
In the Allocated Space by Tier chart, you can hover the mouse pointer over the column for the tier to see the current and recommended allocation of space for each tier.
You want the data in your data center to be placed on the tiers that best match the performance requirements in your business organization. You place your storage into tiers by assigning tier levels to pools and you set performance thresholds to generate recommendations for your storage. Each time that data is collected, the tiering of your storage is analyzed.
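The analysis described above can be pictured as a simple rule: compare each volume's observed I/O activity against the thresholds you set per tier, and flag a volume as a re-tiering candidate when it sits on a tier whose threshold it no longer matches. The sketch below is purely illustrative and is not Storage Insights' actual algorithm; the tier names, the I/O-density metric and the threshold values are all hypothetical assumptions.

```python
# Hypothetical sketch of threshold-based tier planning.
# Not the actual Storage Insights algorithm; the tier names,
# metric, and thresholds are illustrative assumptions only.

# Tiers ordered fastest-first, each with a minimum I/O density
# (I/O per second per GiB) required to justify placement there.
TIERS = [
    ("Tier 1 (flash)", 0.5),
    ("Tier 2 (SAS)", 0.05),
    ("Tier 3 (nearline)", 0.0),
]

def recommend_tier(io_density):
    """Return the fastest tier whose threshold the volume meets."""
    for name, threshold in TIERS:
        if io_density >= threshold:
            return name
    return TIERS[-1][0]

def retier_candidates(volumes):
    """List volumes whose current tier differs from the recommendation."""
    moves = []
    for vol in volumes:
        target = recommend_tier(vol["io_density"])
        if target != vol["tier"]:
            moves.append((vol["name"], vol["tier"], target))
    return moves

volumes = [
    {"name": "trading_db", "tier": "Tier 2 (SAS)", "io_density": 1.2},
    {"name": "mail_archive", "tier": "Tier 1 (flash)", "io_density": 0.01},
    {"name": "web_logs", "tier": "Tier 3 (nearline)", "io_density": 0.0},
]

for name, src, dst in retier_candidates(volumes):
    print(f"{name}: {src} -> {dst}")
```

In this toy run the busy trading database is promoted and the cold mail archive is demoted, while the logs stay put, which is the same up-tier/down-tier shape the Tier Planning page presents.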
“IBM Spectrum Control Storage Insights from IBM not only delivers a performance dashboard to monitor overall health, but also enables you to drill into and address storage issues affecting service delivery.” – Johan van Arendonk, Systems Engineer, E-Storage B.V.
So what's next?
Give it a try!
Explore the live demo that contains the actual product with sample data; it’s the quickest way to try out the insightful solutions. You will need an IBM ID to log in. If you do not have an IBM ID, Storage Insights provides a link for you to create one. Follow the guided tutorials not only for tier planning optimization, but also for troubleshooting performance, monitoring block capacity, monitoring file capacity and identifying reclaimable storage.
Another way to get fast insights into your storage environment is the free 30-day trial. Try the Storage Insights solution by using your IBM ID, or by creating one. After the 30-day trial, you can easily convert to a subscription.
Go to IBM Service Engage to learn more about the IBM Spectrum Control Storage Insights cloud storage delivery model now!
See the IBM Spectrum Control Storage Insights product page for more details.
See this insight report: Evaluator Group: IBM Spectrum Control™ Storage Insights Enables Broader Use of Storage Resource Management
Check out the documentation for IBM Spectrum Control™ Storage Insights in IBM Knowledge Center.
For continuing information on Storage Insights or IBM Spectrum Control, follow me, Bob Graczyk, on Twitter (@bobby_gratz) or LinkedIn.
On Monday evening, IBM Tivoli Storage Manager Product Manager, Tom O’Brien, provided an update on the Data Protection roadmap. A roadmap is a high level development plan that helps customers, business partners and subject matter experts prepare for new capabilities. Check back with the IBM Systems Storage blog for a posting about the Data and Storage Management roadmap, and other storage sessions at InterConnect.
The TSM Family development team is continuing their mission to make IBM storage software more useful (intelligent, intuitive and transparent). The roadmap is focused on innovations that matter, such as better support for hybrid clouds, administration changes that can reduce human error and large scale environment support.
IBM’s transparent development process creates working partnerships with customers throughout the design and development process, with the goals of implementing new features right the first time, and in the right priority order.
I would like to thank the customers who have participated in Beta and Early Access Programs. You’ve done a fantastic job. I encourage readers to participate in these programs and contribute your ideas. If you’re interested, please send a note to Lorena Colston, firstname.lastname@example.org.
A Storage Pool Called ‘DEDUP’
IBM is building a new type of storage pool designed for disk-to-disk backups using TSM deduplication. The new design differs from deduplication in prior TSM versions in that deduplication occurs in-line, at the time of the backup, rather than later as a post-processing step. That means TSM deduplication works more like the Data Domain and ProtecTIER deduplication appliances, but without the cost and limitations.
‘Dedup’ storage pools have additional benefits: there is less of the TSM server processing that can interfere with backup windows, namely deduplication, reclamation and database reorganization.
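The difference between in-line and post-process deduplication can be shown with a toy sketch: an in-line design hashes each incoming chunk at backup time and stores only chunks the pool has never seen, so no separate post-processing pass over the pool is needed. This is a simplified illustration, not TSM's implementation; fixed-size chunking and SHA-256 fingerprints are assumptions for the example.

```python
import hashlib

# Toy in-line deduplication sketch: duplicate chunks are detected
# as data arrives, not in a later post-processing step. Not TSM's
# actual implementation; fixed 4-byte chunks are for demonstration
# only (real systems use KiB-scale chunks).

CHUNK_SIZE = 4

class DedupPool:
    def __init__(self):
        self.store = {}      # chunk fingerprint -> chunk bytes
        self.recipes = {}    # backup object name -> list of fingerprints

    def backup(self, name, data):
        fingerprints = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            # In-line dedup: store a chunk only if it is new to the pool.
            self.store.setdefault(digest, chunk)
            fingerprints.append(digest)
        self.recipes[name] = fingerprints

    def restore(self, name):
        """Reassemble an object from its chunk recipe."""
        return b"".join(self.store[d] for d in self.recipes[name])

pool = DedupPool()
pool.backup("mon", b"ABCDABCDXXXX")
pool.backup("tue", b"ABCDABCDYYYY")
print(len(pool.store))      # 24 bytes backed up, only 3 unique chunks kept
print(pool.restore("tue"))
```

Because the duplicate data never lands in the pool, there is no reclamation pass to schedule around the backup window, which is the operational benefit described above.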
Cloud Storage Pools
Cloud storage pools will give TSM Administrators fast access to nearly infinite amounts of storage pool space. No more waiting for data center storage provisioning. No more stress about running out of space in the backup environment.
What if backup data could overflow onto cloud storage, whenever you needed more space in your on-premises backup environment?
What if old backups migrated to cloud storage until they expired, using an automated disk-to-disk-to-cloud process?
What if you could implement a basic multi-site data recovery capability faster, without having to wait for equipment to be deployed in a recovery center?
What if your TSM storage pools could live on the cloud temporarily, to aid a migration or upgrade project?
Cloud storage pools will be easy to deploy. Policies and security are managed from your on-premises TSM environment. TSM servers are not required at the Cloud Service Provider facility.
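The overflow idea in the “what if” questions above can be sketched as a simple placement policy: fill the on-premises pool first, and spill to a cloud pool once on-premises capacity runs out. This is purely illustrative and is not how TSM cloud storage pools are implemented; the pool names and capacities are invented for the example.

```python
# Illustrative overflow placement policy: prefer on-premises
# capacity, spill new backups to a cloud pool when the local
# pool is full. Not TSM's actual design; names and numbers
# are assumptions.

class Pool:
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb
        self.used_gb = 0

    def has_room(self, size_gb):
        return self.used_gb + size_gb <= self.capacity_gb

def place_backup(size_gb, onprem, cloud):
    """Return the name of the pool a backup lands in."""
    target = onprem if onprem.has_room(size_gb) else cloud
    target.used_gb += size_gb
    return target.name

onprem = Pool("onprem-disk", capacity_gb=100)
cloud = Pool("cloud-overflow", capacity_gb=10**6)  # effectively unlimited

placements = [place_backup(40, onprem, cloud) for _ in range(4)]
print(placements)  # the first two backups fit locally, the rest overflow
```

The same shape covers the other “what if” scenarios: migration to cloud until expiration is just this policy applied to old data instead of new data.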
Web-based self-service restore portal
There’s no service like self-service, especially for data recovery. When data is lost, you want it recovered fast – and the fewer people who know, the better. IBM is building a recovery portal that data owners can use without special training or software. Restores can be requested anytime, from anywhere.
Data Centers and Service Providers will be able to handle day-to-day restore requests without customer service requests or administrator time.
The restore portal is planned initially for VM workloads, but designed to expand to more workloads over time.
Off-site Replication for Everyone
It’s a challenge to move daily backups to recovery centers in such a way that applications can be recovered quickly should the primary site become unavailable. For many organizations, this is an expensive and risky form of insurance against something that will probably never happen. And, like most insurance, it’s no fun to pay for, and no fun to experience a loss.
IBM is making off-site backup replication faster and easier. Setup and monitoring will be integrated into TSM Operations Center. Like other TSM Operations Center features, replication setup is done with a few mouse clicks and can be monitored at a glance from any browser-enabled device.
Performance will be faster, particularly in bad network neighborhoods. Network optimization enables consistently fast throughput, even over long distances and in environments with high packet loss.
These capabilities, combined with Recovery Simulators already available, create a modern data protection environment that gives you confidence your data is protected.
Easier to Buy and Deploy
Two packaging changes are planned: TSM Suite for Unified Recovery will add FlashCopy Manager, and a new pre-packaged TSM Virtual Appliance is expected.
... And That's Not All
Several other planned capabilities were discussed, too. This blog only touches on the highlights. Contact your IBM seller or IBM Business Partner for additional details.
A discussion of unreleased products requires the following standard IBM disclaimer:
The information contained in this publication is provided for informational purposes only. While efforts were made to
verify the completeness and accuracy of the information contained in this publication, it is provided AS IS without warranty
of any kind, express or implied. In addition, this information is based on IBM’s current product plans and strategy,
which are subject to change by IBM without notice. IBM shall not be responsible for any damages arising out of the use
of, or otherwise related to, this publication or any other materials. Nothing contained in this publication is intended to,
nor shall have the effect of, creating any warranties or representations from IBM or its suppliers or licensors, or altering
the terms and conditions of the applicable license agreement governing the use of IBM software.
References in this publication to IBM products, programs, or services do not imply that they will be available in all countries
in which IBM operates. Product release dates and/or capabilities referenced in this presentation may change at any
time at IBM’s sole discretion based on market opportunities or other factors, and are not intended to be a commitment to
future product or feature availability in any way. Nothing contained in these materials is intended to, nor shall have the
effect of, stating or implying that any activities undertaken by you will result in any specific sales, revenue growth, savings
or other results.
About the Author
Mike Barton is a Worldwide Storage Marketing Manager at IBM. Prior to 2007, Mike was a Technical Manager and Principal IT Specialist for IBM, and a Sales Rep and Principal IT Specialist for Sybase. He holds ITIL Foundation and Gartner Group TCO Certifications. Mike has been with IBM over 15 years and has over 25 years of Information Technology experience. The opinions expressed herein are his own.
This morning at IBM InterConnect, IBM executives Steve ‘Woj’ Wojtowecz and Michelle Steen kicked off a week of storage sessions with a broad ranging discussion that included Software Defined Storage, product announcements, and roadmaps.
Over 60 storage sessions this week! There will be presentations by customers, Business Partners, and IBM experts. Sessions will feature cloud solutions, storage for Big Data and Analytics, software-defined storage, data protection updates and more. My colleagues and I will blog about as many sessions as we can, so check the IBM Systems Storage blog for the latest news.
Michelle Steen, Ph.D.
Mgr., IBM Storage Product Management
Software Defined Storage
Last week, IBM announced Spectrum Storage, a Software Defined Storage family of solutions (For details, please see the IBM Press Release, Computerworld, Datamation, Woj’s blog or my blog). Software Defined Storage is the next logical iteration for the intelligence that manages storage systems. Originally, storage management software, such as RAID, ran on servers. At the time, there was no processing power available on disk systems. As disk systems became more powerful, the intelligence moved onto storage systems, where it could deliver better performance. Today, storage systems can be defined entirely in software, not tied to physical storage or servers, creating the opportunity to significantly reduce costs and improve business agility.
Software Defined Storage will be a key enabler for hybrid clouds, internet marketplaces and other applications that require data to be more mobile.
IBM’s approach to Software Defined Storage is more valuable to customers. Other approaches provide minimal capability and try to lock you in to a particular brand of hardware. IBM’s approach is open, comprehensive, and uses core technology that has been proven in data centers, so it’s ready to be deployed today.
IBM Storage Insights (beta)
A new cloud-based solution running on SoftLayer, Storage Insights deploys in as little as 30 minutes and starts sharing insights immediately. For example, you can quickly identify unused and misallocated storage for reclamation, which can postpone the next storage capacity upgrade. Proprietary analytics from IBM Research provides storage tier optimization recommendations, which can reduce users’ per terabyte cost of storage by up to 50%. Application and department views of storage help data owners ‘see’ their storage, often for the first time. An optional Performance Management module helps troubleshoot application performance issues and optimize data placement.
Learn more about Storage Insights from Jason Davison’s recent blog or the IBM Cloud Marketplace.
IBM Virtual Storage Center v5.2.5
Virtual Storage Center (VSC) expands support for IBM Spectrum Scale (GPFS) and Elastic Storage Systems (ESS) file-based storage management. Version 5.2.5 adds new reports, hard and soft quota views, and performance management. Now, performance can be analyzed in near real time, from the perspective of the file system, its underlying storage, and the fabric in between. This update is the third phased deployment of management capabilities for the Spectrum Scale platform, making VSC the most robust management software for Spectrum Scale.
Learn more about Virtual Storage Center on IBM.com.
IBM Tivoli Storage FlashCopy Manager v126.96.36.199
FlashCopy Manager (FCM) expands support to include Spectrum Scale snapshots. This addition provides new options for large file system owners. Now, one vendor supports the data owners’ choice of remote mirroring, managed snapshots, incremental backups and policy-based hierarchical space management. You don’t have to pay for the high cost option for all your data, as competitors often recommend.
Learn more about FlashCopy Manager on IBM.com.
Tivoli Storage Manager on the Cloud
IBM Business Resiliency Services announced the availability of Cloud Managed Backup services on SoftLayer, using Tivoli Storage Manager. The solution enables fast deployment of business-class hybrid cloud and cloud-to-cloud managed backup services. IBM has emerged as one of the top providers of managed backup services for business workloads. TSM is the #1 platform in both signings and terabytes managed for the IBM Cloud Managed Backup service.
Learn more about IBM Resiliency Services solutions on IBM.com.
Michelle Steen also introduced product roadmaps for both Data and Storage Management (Spectrum Control, TPC/VSC) and Data Protection and Recovery (Spectrum Control, TSM Family). There will be separate breakout sessions this evening and Tuesday afternoon. Please check back for separate blog posts on roadmap sessions. The future is about hybrid clouds, software defined storage, new performance / scalability limits, and usability.
Watch InterConnect General Session Keynotes via InterConnectGO. To stay connected, follow the conversation by using the hashtag #IBMInterConnect or #IBMStorage on Twitter.
About the Author
Mike Barton is a Worldwide Storage Marketing Manager at IBM. Prior to 2007, Mike was a Technical Manager and Principal IT Specialist for IBM, and a Sales Rep and Principal IT Specialist for Sybase. He holds ITIL Foundation and Gartner Group TCO Certifications. Mike has been with IBM over 15 years and has over 25 years of Information Technology experience. The opinions expressed herein are his own.
IBM InterConnect 2015, the premier conference for cloud and mobile, kicks off February 22-26 in Las Vegas, Nevada, delivering one of the most comprehensive technology events ever. While the excitement has already begun, let’s take a look at the top 5 reasons why you shouldn’t miss IBM InterConnect 2015:
General Sessions, Keynotes, and Break-out Sessions:
InterConnect 2015 offers you over 42 tracks, 8 streams, and 3 general sessions to learn from top industry experts about the latest and greatest trends and technologies. From development to architecture to operations, InterConnect will provide 1500+ sessions’ worth of the best education, networking, and exhibits on topics like cloud, mobile, security, DevOps, and more. For instance, hear the latest IBM strategies from key executives like Tom Rosamilia, IBM SVP, and gain insight from the General Session guest speakers, Barbara Corcoran, Daymond John, and Robert Herjavec of ABC's Shark Tank. InterConnect 2015 also offers you the opportunity to build your own agenda. Click here to learn more.
Business Partners Summit:
Save the date for the InterConnect 2015 Business Partner Summit, being held on February 22nd. This one-day summit offers a wide range of content and relationship building activities; networking with IBM executives, product and industry experts, and other Business Partners; and features bestselling author, venture capitalist, and entrepreneur Josh Linkner as the guest speaker. Download the Business Partner Summit Program Guide.
dev@InterConnect:
With great software, comes great responsibility! This year, thousands of software developers, architects, designers, and programmers will leave their language, platform, and editor wars behind to come together for two days of sessions, training, and building with the people and companies who are creating the new technology landscape. Hack, make, break, and shake with people who are super smart at what they do. Just like you. Click here to see what to expect from dev @InterConnect.
Solution EXPO:
The Solution EXPO at InterConnect 2015 is the hub for networking, collaboration, and engagement. We’ve also incorporated new architectural elements which create a modern, functional space that facilitates the many demos and discussions taking place on the EXPO floor. Click here to check the key features included in the Solution EXPO at InterConnect 2015.
Get Social with InterConnectGO:
InterConnectGO is the digital interactive platform for InterConnect 2015. Hosted by gamer and video star Veronica Belmont, the event features three full days of live streaming video straight to your laptop or mobile device. If you have colleagues who can’t attend InterConnect 2015, encourage them to register for InterConnectGO for the free online digital experience. Also, if you like being on social, there’s a whole new opportunity waiting for you. Just share your InterConnect story or experience and get a chance to be featured as InterConnect Social Jockey on @IBMStorage. Don’t forget to mention @IBMStorage and include hashtag #IBMInterConnect in your posts.
Isn’t it exciting? So what are you waiting for? Register now and be a part of the InterConnect 2015 experience! For up-to-the-minute updates, follow #IBMInterConnect or #IBMStorage on Twitter. If you have any questions, please don't hesitate to contact me at email@example.com. I look forward to seeing you at InterConnect 2015!
IBM Edge 2014 Tuesday Storage Recap
Day 2 at IBM Edge 2014 focused on how clients, Business Partners and IBM are working together to build smarter infrastructures to meet the business challenges discussed on Day 1 (cloud, analytics, mobile and social).
Chris O’Connor, @chrisoc_IBM, Vice President of Strategy & Engineering, IBM Cloud & Smarter Infrastructure, spoke about the need to seamlessly extend infrastructures from what organizations own today to what they’ll need in the future. He recommended two approaches:
Cloud-enable existing workloads
Think about ‘cloud first’ for new workloads
The idea is to accelerate time to market and enable real-time, actionable insights. With 70% of enterprises planning to pursue hybrid clouds by 2015, according to a 2013 report by Gartner, a two-pronged approach makes sense.
Andrea Nelson, Director of Storage Marketing at Intel, corroborated Gartner’s estimate, saying an estimated 50% of organizations less than 10 years old are putting their IT infrastructures in the cloud today.
Chris spoke about the importance of standards, such as OpenStack, that help organizations quickly assemble Software Defined Systems from components, rather than building clouds a stick at a time. With new development platforms such as IBM’s Code name: BlueMix, organizations can construct enterprise-capable cloud applications faster, without having to deploy a cloud infrastructure.
Mike North, Sr. Director of Programming for the National Football League, spoke about the importance of speeding up the infrastructure to enable analytics. ‘Time to truth’ is critical for analytics. With faster processing, the NFL is able to look at hundreds of potential schedules and choose the one with the best potential outcomes for their constituents. IBM’s Arvind Krishna suggested that traditional analytics is like driving a car by looking in the rear-view mirror: you can only see where you’ve been. Predictive analytics helps you see into the future, react faster, and achieve better business results.
Maria Winans, @mariawinans, IBM VP of Social Business, spoke about how IBM and other organizations are driving people-centric engagement for new profit channels. She also spoke about the importance of analytics, saying you can’t personalize customer experiences if you can’t do the required analytics. Maria offered 3 suggestions for successful social business initiatives:
Build shared value
Protect your brand
New mobile applications offer the opportunity to improve customer satisfaction and customer loyalty, as well as generate new revenue. Rapid transformation is happening across industries and geographies. IBM estimates there will be over 1 trillion connected objects and devices by 2015. Mobile applications are enriched by cloud, analytics and social business initiatives.
Storage virtualization and Software Defined Storage
Storage virtualization is the foundation for Software Defined Storage. Virtualization provides an abstraction layer between physical storage and the applications that use it. The result is a storage infrastructure that can grow and change without impacting users or applications. Software Defined Storage will be required to manage the vast amounts of data organizations expect to manage in the years ahead.
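The abstraction layer described above can be pictured as a mapping from virtual volumes to extents on physical devices: because applications address only the virtual volume, an extent can be moved between devices without anything the application sees changing. The sketch below is a minimal illustration of that idea with hypothetical device names; it is not how any particular virtualization product stores its mappings.

```python
# Minimal sketch of the storage-virtualization abstraction layer:
# a virtual volume maps virtual extents to locations on physical
# devices, and the mapping can change while the virtual address
# stays the same. Device names are hypothetical; this is not any
# vendor's actual design.

class VirtualVolume:
    def __init__(self, name):
        self.name = name
        self.extents = []  # list of (device, physical_extent) pairs

    def read_map(self, virtual_extent):
        """Resolve a virtual extent to its current physical location."""
        return self.extents[virtual_extent]

    def migrate_extent(self, virtual_extent, new_device, new_extent):
        """Move data transparently: only the mapping changes."""
        self.extents[virtual_extent] = (new_device, new_extent)

vol = VirtualVolume("app_data")
vol.extents = [("array_A", 17), ("array_A", 18), ("array_B", 3)]

print(vol.read_map(0))               # lives on array_A today
vol.migrate_extent(0, "array_C", 0)  # e.g. array_A is being retired
print(vol.read_map(0))               # now on array_C; the app still
                                     # addresses virtual extent 0
```

This is why the text says the infrastructure can grow and change without impacting users: growth and change happen in the mapping, below the addresses applications use.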
Steve ‘Woj’ Wojtowecz, @steve_woj, IBM Vice President, Storage Software Development, shared new research from ITG that analyzed storage TCO using IBM, EMC, and VMware storage management solutions. ITG highlighted 4 issues that significantly impact storage TCO:
Storage software costs
Storage administration costs
IBM Virtual Storage Center users were far more successful than their peers using EMC or VMware storage management solutions:
In large enterprises, storage TCO was 72% lower with IBM than EMC
In midsized environments, storage TCO was 35% lower with IBM than VMware storage management.
Jose Garcia, Manager of Enterprise Storage and VMware at UCLA Health System, discussed his storage transformation project. Storage virtualization enabled rapid deployment of an Electronic Health Records system that improves patient care and organizational efficiency. Storage virtualization also reduced storage costs and enabled rapid data growth. Improved efficiency saved enough to fund a third data center that will improve resilience and flexibility.
Collaborators wanted. No Eeyores. No squirrels.
Snehal Antani from GE Capital spoke about the importance of delivering IT at market speed, and with commercial intensity. He offered a strategy for dealing with three important groups of people in the organization:
Collaborators
Cynics
Kings and Queens
Collaborators can accelerate change. Identify your collaborators and put them on a pedestal.
Cynics are like Eeyore in Winnie-the-Pooh. They’ll tell you why change is hard, and focus on what might go wrong. Ignore your cynics.
Kings and Queens are executives and managers who are eager to be offended. They resist change that may impact their empires. They’re a small, but vocal, group. Don’t give them a megaphone.
Snehal also pointed out that technologists can get distracted by new technology, even if it isn’t essential to simplify or accelerate IT delivery. It’s like yelling, ‘Squirrel!’ to distract dogs, as in the movie, Up. GE Capital has signs that say, ‘No Eeyores’ and ‘No squirrels’.
Bottom line: Infrastructure matters
Can the right infrastructure help you build competitive advantage? Yes, of course. Infrastructure matters.
About the author
Mike Barton is a worldwide storage marketing manager at IBM. Mike is a former IT specialist with Gartner TCO and ITIL certifications. The opinions expressed herein are his own.
ITG Management Report: Cost/Benefit Analysis of IBM Virtual Storage Center Compared to EMC Storage Virtualization Solutions
ITG Management Report: Cost/Benefit Analysis of IBM Virtual Storage Center Compared to VMware Tools for Storage Virtualization and Management
IBM Software Defined Storage
TheCUBE by Wikibon
Software Defined Storage (SDS) is getting a lot of attention lately from the press, analysts and technology providers such as IBM, causing organizations large and small to take notice. SDS describes a set of storage access and data management services that can deliver what IT administrators are most interested in these days:
Lower storage costs
Less reliance on specific storage systems
Simplified data and storage management
Improved utilization of existing resources
International Data Corporation (IDC) published a taxonomy for Software Defined Storage that defines software-based storage as a storage software stack running on commodity, off-the-shelf computing hardware. SDS should offer a full suite of storage services and federation of the underlying storage to enable data mobility, according to IDC.
The interesting thing is, while the name Software Defined Storage is relatively new, IBM has been delivering technology and client solutions that match the SDS definition for over a decade.
Matching IDC’s definition, IBM SAN Volume Controller, introduced in 2003, is an x86-based appliance running Linux code, providing federated storage virtualization across heterogeneous storage platforms and enabling advanced storage services. SAN Volume Controller has been proven to scale to multiple petabytes. This core technology is also included in IBM’s midrange Storwize storage systems. To date, over 55,000 SAN Volume Controller and Storwize systems have been shipped worldwide, making it one of the most popular business-class storage virtualization solutions.
Sitting on top of the storage virtualization platform, IBM Virtual Storage Center offers industry-leading end-to-end storage management with analytics-driven data management and policy-based automation to enable self-tuning, self-optimizing storage. According to recent research by International Technology Group, IBM’s approach can reduce storage Total Cost of Ownership by up to 72% compared to EMC solutions in large enterprises, and up to 35% compared to VMware storage management solutions in mid-size environments.
At the top of IBM’s storage software stack are interfaces that simplify storage, including:
OpenStack integration, for automated storage provisioning by cloud applications
VMware vSphere integration, which provides VMware administrators with a familiar interface for simplified storage provisioning and management.
IBM’s advanced graphical interface, which dramatically simplifies end-to-end troubleshooting and performance management, provisioning, and other time-consuming storage administration tasks
IBM Cloud Storage Access user self-service portal, sold separately
While other vendors scramble to build new offerings for SDS, IBM is extending proven technology that can address your needs today and help you migrate to new era workloads whenever you’re ready.
See IBM Software Defined Storage at the IBM Edge conference next week in Las Vegas. Software Defined Storage sessions will be presented at Exec Edge and Tech Edge, and we’ll have live demos in the Solution Center.
If you can’t attend Edge, look for video interviews with Brian Jeffery, Managing Director of International Technology Group, and Steve Wojtowecz, VP of Storage and Network Management Software Development, on TheCUBE, by Wikibon, live on Monday, May 19 and afterwards on demand.
Learn more about VSC and the rest of IBM’s storage software portfolio at: http://www.ibm.com/software/products/category/storage-software.
About the author:
Jason Davison is the Segment Manager for Storage Virtualization and Cloud Solutions in IBM’s Cloud and Smarter Infrastructure product management group. Views expressed are my own.
By now, everyone in the IT Industry is convinced about the benefits of virtualization. Server virtualization – Yes! But storage virtualization? That’s not an easy one!
In server virtualization, we divide one physical server into several virtual environments, which saves lots of money and helps wring the best out of your infrastructure. But did you know that an inadequate storage system can actually cause the economic benefits of server virtualization to fall through, because virtual machines can consume large amounts of storage?
So, why is it that Storage Virtualization still isn’t as popular as Server Virtualization?
In storage virtualization, we group physical storage from multiple network storage devices so that it looks like a single storage device that can be managed from a single console. This can raise several complexities when:
Data center has storage products from different vendors
Some vendors provide storage virtualization for their own hardware only
You need virtualized storage features to be available in vendors’ non-virtualized offerings
Throw in the further dilemma of choosing where to put your virtualization (hosts? network? arrays?), or perhaps being bound by a vendor lock-in clause. You have probably been buying additional devices and systems on an ad hoc basis to meet new storage demands, but think of the advantages you could gain with a storage architecture (read: storage virtualization) that allows you to upgrade when needed and in a cost-effective manner! That should be enough to convince you of the virtues of storage virtualization.
Let’s find out why IBM’s VSC is the best bet in the market.
A recent ITG report compared the IBM VSC solution to those of other large vendors (EMC, VMware) over a period of 5 years to determine savings.
And the verdict is in. IBM Virtual Storage Center (VSC):
The only solution to address large-scale multivendor virtualization
Averaged 72 percent less than EMC for overall costs of ownership
Averaged 35 percent less than VMware over 5-year total cost of ownership
Supports more than 200 platforms from all of the major vendors, as well as many smaller vendors
You can dig into the calculations and analysis in the two ITG reports to understand the how and why of it. As one user shared when asked to describe VSC in simple terms, “It works.”
IBM’s Data Protection has all the right pieces
Jason Buffington, Senior Analyst at ESG, said in his interview with Dave Vellante from Wikibon that IBM fills out the whole data protection spectrum and that its new UI is a great proof point for why this is not your granddaddy's solution. One of the top five problems people face in protecting a virtualized environment is lack of visibility, and IBM's new UI does a great job of adding that visibility. IBM has all the right pieces with its breadth of data protection solutions, and with IBM starting to put cloud more aggressively into the mix, 2014 looks interesting.
Data Protection is a rainbow that must have all the colors
In Jason's opinion, when defining data protection strategies one should think of data protection as a rainbow, with backup, snapshots, replication, archive and availability making up the different colors. When have you ever seen a rainbow with no green? A data protection mechanism should not only include the whole range of solutions but also take a hybrid approach spanning tape, disk and the cloud. Organizations can pick the color from the spectrum according to what they want to recover and how.
Disk, Tape, Cloud – they are all going to stay
Disk is not going to be the be-all and end-all. Tape is going to stay, with economic advantages and new innovations like LTFS that make tape look like disk, adding flexibility and durability. Cloud as a backup service is not the silver bullet either, because it is only a deployment mechanism: it does not make your backup problems go away. One still has to run it, handle the administration and push the agents out, and one needs an on-premises intermediary appliance for fast recovery before going to the cloud. However, when ESG looked at the primary use cases for cloud over the next couple of years, they found data protection at number one and disaster recovery at number three. Jason suggests that every solution should be considering cloud as part of it.
Data protection need not be so hard
His advice to IT pros who are worried about the cost and complexity of data protection is that it need not be so hard. There are great solutions available that allow you to back up, archive, snapshot, replicate and perform an entire range of functions from a single GUI, to a single data store, from a single administrator's view. People only need to wake up to the solution and start using it.
Check out other Wikibon Interviews at Pulse 2014
In the two years since IBM acquired Butterfly, it has generated hundreds of Analysis Engine Reports (AERs) analyzing billions of gigabytes and established facts about Tivoli Storage Manager (TSM) that should make the competition sit up and take notice.
The Backup Analysis Engine report from IBM Butterfly Software uses light-touch, agentless software technology to analyze an existing heterogeneous data backup environment. It is a non-intrusive analysis based on empirical production data that is collected in minutes.
Why is Butterfly important?
The Gartner Magic Quadrant for Backup and Recovery 2013 competitive analysis says that between 2012 and 2016, one-third of organizations will change backup vendors out of frustration over cost, complexity and/or capability. Being able to say conclusively that the TSM solution can cut backup infrastructure costs by as much as 38 percent compared to some competitive products opens the door for IBM to go after that one-third of organizations looking for a change.
AER is the Key
More demand for AERs is expected with the launch of the automated "self-service" AER generation model. Scheduled to go live at the beginning of 2H 2014, it will scale out as a service to IBM and its Business Partners. These developments drive home the point that Butterfly AERs have metamorphosed into a well-accepted, standard approach to storage infrastructure analytics.
Meet the Butterfly Storage and Backup Assessment Team at Pulse 2014
If the butterfly flutter has caught your interest, visit Pulse 2014, February 23-26 in Las Vegas, and meet the folks who deliver Butterfly Storage and Backup Assessments in the IT Optimization section of the IBM booth. Find out how your company can use business analytics to dramatically lower the cost of running your backup and recovery or primary storage infrastructure.
Backup redesign continues to be near the top of most analysts' lists of 2013 IT priorities. I've talked a lot about some of the catalysts behind this trend, like data growth, big data, VMware and software-defined storage. With IT managers redesigning, the incumbent enterprise backup vendors have a lot of motivation to offer innovative solutions that are a bit ahead of the times. The leaders have all placed strategic bets on what the winning formula will be. I discussed these bets in my post "Forrester's take on enterprise backup and recovery."
For its part, IBM is being quick about helping IT managers redesign. The help starts with a clear understanding of the economic benefit a redesign can bring. After all, in today’s environment few IT managers make technology moves simply for the sake of technology. Storage is about economics. I discuss this more fully in my post “Does trying to find a better economic approach to storage give you ‘Butterflies’?” But there is still efficient technology that enables these economic savings, and the person in IBM who is ultimately responsible for the technology in IBM Tivoli Storage Manager (TSM) is the product manager, Dr. Xin Wang.
Recently I spoke with Xin about the important shifts IT managers are facing and how she is helping IT managers reimagine backup.
The Line*: Xin, I’m going to start with the “Dr.” part of your title. Should folks call you the Backup Doctor?
Xin: (laughing) Well, I don’t know about that. I’m actually a doctor of Applied Physics. One thing that drove me to earn a PhD and has moved me ever since is that I love to learn. I started my career in IBM hard disk drive research, spent some time as a storage software developer and development manager, and have now been working with backup clients as a product manager for several years.
The Line: Wow, I could probably do an entire post just on your career. But let’s stay focused. What have you learned about the challenges IT managers are facing and this whole backup redesign movement?
Xin: It’s interesting. The challenges aren’t secret but they carry big implications for backup. Data is growing like crazy; that’s no secret. But it is now so big that the old method of loading an agent on a server to collect and copy backup data over a network to a tape isn’t keeping up. So IT managers are redesigning.
And what about servers? Servers aren’t servers anymore. Thanks to VMware, they are virtual machines that come, go and move around in a hurry. Traditional backup is too rigid. So IT managers are redesigning.
Administrators are changing too. The generation of backup admins who grew up tuning the environment is giving way to a new generation of backup, VMware and cloud admins who need much more intuitive and automated management tools. And so IT managers are redesigning. (Editorial comment: I discussed the change in administration in my post “Do IT managers really ‘manage’ storage anymore?”)
The Line: Okay, I think I’m seeing your trend. IT managers are redesigning. And it seems like you’ve got a clear idea of why. Can we take your list one at a time? I think my readers would be interested in what you are doing with TSM in each of these areas.
Xin: Sure, that makes sense.
Check back for part 2 of the interview in which Xin shares her near term plans for TSM. If you have questions for Xin, please join the conversation by leaving a comment below.
*The Line is my personal blog, and when it appears in the interview, it represents me as the interviewer.
With a new school year underway, vacation season for many come and gone, and the Labor Day long weekend upon us in North America, entering September marks the unofficial end of summer. For many this is a somewhat depressing time of year, as we realize that colder temperatures and the onset of winter aren't far off.
However, it's not all bad news. Some of us prefer outdoor activities in the fall and winter months, and when it comes to business, the fall brings a renewed interest in sharpening our skills and seeking networking opportunities by attending industry conferences and events.
For storage professionals in North America, an ideal opportunity comes in the form of Storage Decisions New York on September 16 and 17. Storage Decisions New York plans to bring together over 500 end users, independent experts, analysts and top solution providers to engage in thought-provoking presentations, interactive networking opportunities and sponsor showcases featuring the latest trends and technologies impacting the storage industry. The two-day conference is the only place you will find the industry's foremost independent experts, and the most qualified group of storage professionals, under one roof.
As a platinum sponsor of Storage Decisions New York, IBM will have a multi-faceted presence at the conference with ample opportunities to engage with the storage community. One of the highlights is our Tech-in-Action talk, where IBM’s Storage Software Business Strategist Ron Riffe will outline IBM's point of view on The Critical Decisions for Improving The Economics of Storage. Ron will touch on a range of considerations including the need for improved administration, the role of software-defined and the impact of flash – just to name a few.
Over the course of the two-day event, IBM storage experts will be available in booths 24 and 25 to meet attendees and discuss practical solutions to today's storage challenges. The IBM booth will also be where attendees can pick up their complimentary conference USB key, which will be loaded with conference-wide materials and presentations.
Storage Decisions New York is worth a look as a great way to kick off the fall conference cycle. If you're planning to attend, stop by and visit us. If you happen to be on the West Coast and are concerned that New York is too far to travel, don't worry: Storage Decisions is stopping in San Francisco on October 30.
Last December, while attending the 2012 Gartner Data Center Conference in Las Vegas, I listened to an insightful presentation by analysts Sheila Childs and Pushan Rinnen on the bring-your-own-device (BYOD) phenomenon. They were particularly focused on issues related to protecting an organization's data in a BYOD world (more on why in a moment). One scenario that captured my attention went something like this.
It’s my device. I had it before I brought it to work and I was using Dropbox or iCloud to sync and share all my files. Now, my device has work data on it too. My security-conscious CIO doesn’t want work data shared on those public services. But I’m accustomed to, and almost dependent on my sync and share capability and my organization hasn’t yet given us a private alternative.
Now, in my role as a technology strategist, I spend a good bit of time helping to plan our investments. With the speed at which mobile and social technologies are sweeping through organizations, I have to admit that the case Sheila, Pushan and other Gartner analysts made that week for the rapidly emerging data protection crisis in BYOD sync and share was compelling. It occurred to me that credible vendors able to solve the problem in short order would be in high demand. That was eight months ago.
Fast forward seven months
In July, Forrester analysts Ted Schadler and Rob Koplowitz published The Forrester Wave™: File Sync And Share Platforms, Q3 2013 in a quest to uncover those credible vendors. I liked the way they characterized the problem. “Employees’ need to synchronize files grew from a whisper to a scream over the past few years. . . .The scream will grow louder as the number of tablets will triple to 905 million by 2017 to join the billions of computers and smartphones used for work.” The report evaluated and scored 16 of the most significant solution providers against 26 criteria. Among the leaders was IBM SmartCloud Connections. You can see the complete list of leaders here.
Change is here
The interesting thing most folks miss in the sync and share conversation is that it's about more than just syncing and sharing. As BYOD smartphones and tablets proliferate in the workplace, document management will shift from email attachments and file servers into social collaboration. Forrester points to a further social shift from casual partner collaboration to compliant workflow in regulated industries.
That kind of data is important – and the reason that the Gartner analysts were focused on the data protection issues of this BYOD world. Organizations today have well matured processes for protecting data on file servers and email systems, usually with an enterprise backup product. I commented on this set of tools in my post on Forrester’s Take on Enterprise Backup and Recovery. But as corporate information is relocated from file servers and email systems to sync and share systems, Gartner had an unmistakable reminder for its customers, “Consumer File Sync/Share Is Not Backup”.
I agree! The good news is that IBM has taken the time to ensure its enterprise backup product, IBM Tivoli Storage Manager Suite for Unified Recovery, protects synched and shared files in IBM Connections with all the same efficiency it does file servers, email systems and most any other data important to an organization.
What is your organization doing with file sync and share? How are you protecting that information?
In the modern data center, there's a lot of shifting going on when it comes to traditional storage management responsibilities. What used to be the domain of a central storage and backup administration team has been thrown up for grabs as server virtualization and software-defined everything have entered the scene. I hinted at this a bit in my post Do IT managers really "manage" storage anymore? But let's consider a practical example that's quite common among clients I speak to. If you are going to VMworld 2013, plan on attending the IBM TSM for VE hands-on lab to get more details.
Microsoft SQL Server is the foundation for a lot of applications that are critical to business operations, meaning CIOs and IT managers are interested in its recoverability. Those same CIOs and IT managers are also interested in the recoverability of their VMware estates, the software-defined compute (SDC) platform that houses those databases. For many clients, the problem is that these two domains are tightly guarded by two independent superheroes, and neither is specifically trained in storage.
Superhero #1: The database administrator (DBA)
Most DBAs that I’ve known have an almost personal connection with their databases. They care for them as they would their own children. The thought of leaving one unprotected (without a backup) equates to dereliction of duty. Ignoring the idea that it takes a village to raise a child (or in this case that there may be other members of the IT administration village like VMware admins and backup admins), SQL Server DBAs will often work alone with the backup tools Microsoft provides to ensure their databases are protected. Good for the SQL Server, but not so much for the surrounding infrastructure. For databases running on VMware, routine full backups even with periodic differential backups can consume a LOT of disk space and virtual compute resources, and also contribute to the I/O blender effect.
Superhero #2: The VMware administrator
TSM for VE in VMware vSphere web client
VMware administrators can be just as focused on their domain as DBAs are. Their attention is on being able to recover persistent or critical virtual machine (VM) images, regardless of what app happens to be riding along. VMware has done a nice job of creating and supporting an industry of tightly integrated backup providers. These tools can get at the VMware data through the vStorage APIs for Data Protection (VADP), and VMware administrators can manage them through vCenter plug-ins. But few VMware admins are completely aware of all the workloads that run on their VMs, and even fewer are aware of the unique recovery needs of all those workloads. It's just hard to keep up.
Common ground exists
One tool that bridges the gap is IBM Tivoli Storage Manager for Virtual Environments (TSM for VE). Nicely integrated with both VADP and SQL Server, TSM for VE can bring together VMware administrators and the DBAs in ways that would make any IT manager smile. Here are two of the more common approaches.
We can each do our own thing – together
As noted above, SQL Server DBAs take full backups sprinkled with differentials. Even though this approach can tax server and storage resources, and contribute to the I/O blender effect, it is in the DBA comfort zone. When the app is running on a VMware virtual machine, the DBA has the option of storing those backups on disk storage associated with the VM. It's a nice thing to do because it allows the VMware admin to stay within his comfort zone too. Using vCenter to drive a VADP-integrated snapshot tool like TSM for VE, the VMware admin can capture a complete copy of the virtual machine, along with the SQL Server backups the DBA created. Since the likely use of such a snapshot would be to recover the VM and then recover the database from its backup, there's really no reason to include the source SQL Server database or logs in the snapshot. With TSM for VE, the VMware admin can exclude the source SQL Server database from being redundantly backed up, adding to an already formidable set of built-in efficiency techniques (with TSM for VE, snapshots are taken incrementally, forever, and can be deduplicated and compressed). It's a good compromise solution letting each admin stay in his or her comfort zone. But it can be better.
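To see why full-plus-differential backups tax storage the way the paragraph above describes, a back-of-envelope calculation helps. The numbers here (database size, daily change rate, retention window) are made up for illustration only, and no deduplication or compression is applied to either scheme.

```python
# Illustrative space comparison over 4 weeks: weekly full + daily
# differential backups versus incremental-forever snapshots.
# All figures are hypothetical example values.

db_size_gb = 100      # size of the database
daily_change = 0.05   # fraction of the database that changes each day

# Weekly full + daily differentials: the differential grows each day
# because it always captures everything changed since the last full.
fulls = 4 * db_size_gb
diffs = 4 * sum(db_size_gb * daily_change * day for day in range(1, 7))
traditional = fulls + diffs

# Incremental-forever: one initial full copy, then only the daily
# changed data for each of the remaining 27 days.
incremental = db_size_gb + 27 * db_size_gb * daily_change

print(f"traditional: {traditional:.0f} GB, "
      f"incremental-forever: {incremental:.0f} GB")
```

Even before deduplication, the incremental-forever scheme in this toy scenario consumes a fraction of the space, which is the intuition behind the efficiency claims above.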
We can join forces and do something really great
With TSM for VE, VMware admins and SQL Server DBAs can put their heads together and choose to do something really great. For the DBA, it’s an exercise in less-is-more. The DBA stops doing her own backups. No more full or differential copies of the database. No more taxing resource usage on the VM. No more I/O blender effect. Just, no more. How? Well, with a VMware VADP integrated backup tool like TSM for VE, the snapshot of the VM is accompanied by a freeze and thaw of the SQL Server database (techno-speak for putting the database in a consistent state), just like what happens when a backup is independently initiated by a DBA. And with TSM for VE, as soon as the TSM server confirms that it has successfully stored the consistent snapshot in a safe, physically separate place, it will connect back with the SQL Server to truncate the database logs.
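The freeze, snapshot, thaw and log-truncation sequence described above can be sketched generically. This is a conceptual illustration only: the classes and method names below are hypothetical stand-ins, not real TSM for VE or VADP API calls.

```python
# Generic sketch of an application-consistent VM backup sequence:
# quiesce the database, snapshot the VM, resume I/O, and truncate
# logs only after the backup store confirms a safe copy exists.

class Database:
    def __init__(self):
        self.frozen = False
        self.log_entries = 42  # pretend transaction log

    def freeze(self):          # quiesce: flush writes, hold new I/O
        self.frozen = True

    def thaw(self):            # resume normal I/O
        self.frozen = False

    def truncate_logs(self):
        self.log_entries = 0

class VM:
    def __init__(self, db):
        self.db = db

    def take_snapshot(self):
        # A consistent snapshot only makes sense while the DB is quiesced.
        assert self.db.frozen, "snapshot requires a quiesced database"
        return {"consistent": True}

class BackupServer:
    def __init__(self):
        self.stored = []

    def store(self, snap):
        self.stored.append(snap)

    def confirm_stored(self, snap):
        return snap in self.stored

def consistent_vm_backup(vm, db, server):
    db.freeze()
    try:
        snap = vm.take_snapshot()   # capture while consistent
    finally:
        db.thaw()                   # resume I/O immediately, even on error
    server.store(snap)              # physically separate copy
    if server.confirm_stored(snap):
        db.truncate_logs()          # safe only after confirmed storage

db = Database()
consistent_vm_backup(VM(db), db, BackupServer())
print(db.frozen, db.log_entries)  # False 0
```

The key design point mirrored from the text: log truncation happens only after the backup server confirms the snapshot is safely stored, so a failed backup never discards recovery data.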
In addition to the less-is-more benefits above, think about the differences in restore with these two scenarios. When the DBA and VMware admin simply coexist, each doing their own thing, restoring the SQL Server database includes steps for restoring:
The VM snapshot to get the database backups in place
The full database backup
The subsequent differential backups
By comparison, when the DBA and the VMware admin join forces with TSM for VE, the steps are dramatically simplified. Restoring the snapshot equates to restoring a consistent copy of the database. And remember, because these snapshots are highly efficient, they can be taken quite frequently.
Going to VMworld 2013? Come visit IBM on the Solutions Exchange floor at booth #1545.
VMworld 2013 is just around the corner and at IBM, we’re gearing up for a great set of conversations with our joint clients. As you’re planning your agenda, here are a couple of things worth looking in to.
IBM has a lot of expertise to share when it comes to optimizing virtual environments. A few weeks ago, in my Outside the Line – an interview on Virtualization optimization post, I was able to catch up with several of the experts who are leading this work. At VMworld, IBM will be showcasing these solutions on the Solutions Exchange floor at booth #1545.
IBM Tivoli Storage Manager for Virtual Environments (TSM for VE) features one of the most efficient backup integrations built with the VMware vStorage APIs for Data Protection (VADP). I offered some quick insights in my post VMware backup for the iPOD generation. At VMworld 2013, you'll have an opportunity to take a test drive in the TSM for VE hands-on lab.
Are you going to VMworld? What are you most looking forward to?
For years Hollywood has been enamored with the idea of artificial intelligence. Beyond tabulating, beyond programmed responses, what would happen if a computer could learn, reason, analyze, predict? In short, what could computers do if they could think? Sadly, more often than not, Hollywood's answer resulted in some kind of disaster. In 2001: A Space Odyssey, the HAL 9000 computer decided to kill the astronauts on Discovery One. In the 1983 film WarGames, the WOPR computer played games with global thermonuclear war, and in the Terminator franchise, SkyNet attempted to exterminate the human race. Ugh!
I'm proud that I work for a company that has a very different perspective on the potential of cognitive computing. Instead of blowing people up, IBMers around the world are developing cognitive systems to help us make better decisions.
Leading the Way to a New Computing Era
Number one on my list of Top 5 Observations from IBM Edge 2013 had to do with the cultural changes driving Big Data. The thing about big data is that, in large part, we don’t yet know what we don’t know. Read more...