Backup redesign continues to be toward the top of most analysts’ lists for 2013 IT priorities. I’ve talked a lot about some of the catalysts behind this trend like data growth, big data, VMware and software defined storage. With IT managers redesigning, the incumbent enterprise backup vendors have a lot of motivation to offer innovative solutions that are a bit ahead of the times. The leaders have all placed strategic bets on what the winning formula will be. I discussed these bets in my post “Forrester’s take on enterprise backup and recovery.”
For its part, IBM is moving quickly to help IT managers redesign. The help starts with a clear understanding of the economic benefit a redesign can bring. After all, in today’s environment few IT managers make technology moves simply for the sake of technology. Storage is about economics. I discuss this more fully in my post “Does trying to find a better economic approach to storage give you ‘Butterflies’?” But it still takes efficient technology to deliver those economic savings, and the person at IBM who is ultimately responsible for the technology in IBM Tivoli Storage Manager (TSM) is the product manager, Dr. Xin Wang.
Recently I spoke with Xin about the important shifts IT managers are facing and how she is helping IT managers reimagine backup.
The Line*: Xin, I’m going to start with the “Dr.” part of your title. Should folks call you the Backup Doctor?
Xin: (laughing) Well, I don’t know about that. I’m actually a doctor of Applied Physics. One thing that drove me to earn a PhD and has moved me ever since is that I love to learn. I started my career in IBM hard disk drive research, spent some time as a storage software developer and development manager, and have now been working with backup clients as a product manager for several years.
The Line: Wow, I could probably do an entire post just on your career. But let’s stay focused. What have you learned about the challenges IT managers are facing and this whole backup redesign movement?
Xin: It’s interesting. The challenges aren’t secret but they carry big implications for backup. Data is growing like crazy; that’s no secret. But it is now so big that the old method of loading an agent on a server to collect and copy backup data over a network to a tape isn’t keeping up. So IT managers are redesigning.
And what about servers? Servers aren’t servers anymore. Thanks to VMware, they are virtual machines that come, go and move around in a hurry. Traditional backup is too rigid. So IT managers are redesigning.
Administrators are changing too. The generation of backup admins who grew up tuning the environment is giving way to a new generation of backup, VMware and cloud admins who need much more intuitive and automated management tools. And so IT managers are redesigning. (Editorial comment: I discussed the change in administration in my post “Do IT managers really ‘manage’ storage anymore?”)
The Line: Okay, I think I’m seeing your trend. IT managers are redesigning. And it seems like you’ve got a clear idea of why. Can we take your list one at a time? I think my readers would be interested in what you are doing with TSM in each of these areas.
Xin: Sure, that makes sense.
Check back for part 2 of the interview in which Xin shares her near term plans for TSM. If you have questions for Xin, please join the conversation by leaving a comment below.
*The Line is my personal blog, and when it appears in the interview, it represents me as the interviewer.
I recently participated in a Vendor Day Summit for a large prospective client, where I had the opportunity to talk with them about their storage management needs. Over the past two years they have seen more than 200 percent growth in the total capacity of their storage infrastructure, and near-term projections suggest that trend will continue. Storage growth has also led to performance concerns with tier 1 applications due to the increased demand on the infrastructure. So we came to the summit to talk to the client about IBM SmartCloud Virtual Storage Center (VSC), IBM’s flagship storage management solution. VSC combines our market-leading technologies from IBM SAN Volume Controller, IBM Tivoli Storage Productivity Center, and IBM FlashCopy Manager with intelligent analytics and automated management to reduce administrative complexity. It helps improve capacity utilization and slow storage growth, offers flexibility in the placement of workloads to balance performance demand, extends the life of existing storage assets, and reduces the overall cost of managing the infrastructure.
During the course of the conversation, it became apparent that the storage administration staff is significantly overburdened by daily tasks like managing provisioning requests, handling fire drills, and troubleshooting end user support calls. VSC is uniquely capable of improving storage administrator efficiency through a variety of key capabilities and we are continuing to enhance those capabilities to address our clients’ most challenging problems.
IBM’s Smarter Storage vision aims to provide global data availability across physical, virtual, and cloud environments; to deliver unprecedented levels of cost efficiency and simplicity through innovation and applied analytics; and to ensure unrivalled client experience and success. VSC has been built from the ground up to embody that vision and provide our clients with a unique storage management experience.
When we first introduced VSC in October 2012, it included a new user experience that provides intuitive access to various tasks and views within the console, improving the administration of complex infrastructures. Many of the legacy views from our market-leading storage resource management product, Tivoli Storage Productivity Center, were updated or completely redesigned based primarily on direct customer feedback. During the development phase we ran an extensive Early Access and Beta Program that let clients test our new features and provide direct feedback to our developers, helping to deliver that unrivalled client experience and success. However, we knew when we released that first version that there was still work to do. I am pleased to say that in our plans for an upcoming release of VSC, we have made significant strides to further improve the user experience, with more intuitive troubleshooting tools (especially in the area of performance management) as well as a completely new interface for managing service classes and storage pools. This capability will be critical for clients interested in transforming traditional storage environments into a ‘cloud optimized’ infrastructure that standardizes provisioning services and reduces provisioning time from days and weeks to minutes and hours.
Going back to the Vendor Day Summit, this client was also in need of help managing the growth of their environment, a multi-petabyte heterogeneous infrastructure with multiple classes of storage to service their users. One of VSC’s key features at our original launch was the ability to provide intelligent analytics on top of a virtualized storage environment. Leveraging the market leading capabilities of IBM SAN Volume Controller, administrators can create storage pools that include all of their existing storage assets, organized by performance, drive type, RAID level, etc. to consolidate the management of a heterogeneous environment. Workloads can then be quickly provisioned out of these storage pools based on service level requirements to satisfy end user requests. VSC can help the administrator select the appropriate pool of storage for workload placement based on the user defined characteristics of the workload using built-in intelligence about not only the new workload, but also its impact on existing workloads within the storage pools.
Once the workload has been provisioned, VSC collects performance and utilization data, which can be used to build a comprehensive view across the entire infrastructure. The Analytics Engine within Virtual Storage Center can then identify workloads that are not placed on the appropriate tier or class of storage based on their usage patterns. The Analytics Engine will also make recommendations to migrate volumes to a more appropriate tier of storage. This intelligent analytics capability often results in significant levels of cost efficiency for clients, as they are able to migrate many of their workloads off of the most expensive storage.
This capability was especially interesting to our prospective clients at the Vendor Day Summit, and our planned upcoming enhancements to automate the migration of those volumes based on the recommendations from the Analytics Engine were viewed as a major advantage. Storage administrators will still have the ability to approve migrations before they are executed, but tying the migration recommendations to the actual execution tasks is another way IBM is helping clients improve the efficiency of their data center operations.
Stay tuned to learn more about how IBM SmartCloud Virtual Storage Center is helping clients improve storage utilization, availability, performance, and service levels of heterogeneous storage environments.
With a new school year underway, vacation season for many come and gone, and the Labor Day long weekend upon us in North America, the start of September marks the unofficial end of summer. For many this is a somewhat depressing time of year, as we realize that colder temperatures and the onset of winter aren’t far off.
However, it’s not all bad news. Some of us prefer outdoor activities in the fall and winter months, and when it comes to business, the fall brings a renewed interest in sharpening our skills and seeking networking opportunities by attending industry conferences and events.
For storage professionals in North America, an ideal opportunity comes in the form of Storage Decisions New York on September 16 and 17. Storage Decisions New York plans to bring together over 500 end users, independent experts, analysts, and top solution providers to engage in thought-provoking presentations, interactive networking opportunities, and sponsor showcases featuring the latest trends and technologies impacting the storage industry. The two-day conference is the only place you will find the industry's foremost independent experts, and the most qualified group of storage professionals, under one roof.
As a platinum sponsor of Storage Decisions New York, IBM will have a multi-faceted presence at the conference with ample opportunities to engage with the storage community. One of the highlights is our Tech-in-Action talk, where IBM’s Storage Software Business Strategist Ron Riffe will outline IBM's point of view on The Critical Decisions for Improving The Economics of Storage. Ron will touch on a range of considerations including the need for improved administration, the role of software-defined and the impact of flash – just to name a few.
Over the course of the two-day event, IBM storage experts will be available in booths 24 and 25 to meet attendees and discuss practical solutions to today’s storage challenges. The IBM booth will also be where attendees can pick up their complimentary conference USB key, which will be loaded with conference-wide materials and presentations.
Storage Decisions New York is worth a look as a great way to kick off the fall conference cycle. If you're planning to attend, stop by and visit us. And if you happen to be on the west coast and think New York is too far to travel, don't worry: Storage Decisions is stopping in San Francisco on October 30.
Last December, while attending the 2012 Gartner Data Center Conference in Las Vegas, I listened to an insightful presentation by analysts Sheila Childs and Pushan Rinnen on the bring-your-own-device (BYOD) phenomenon. They were particularly focused on issues related to protecting an organization’s data in a BYOD world (more on why in a moment). One scenario that captured my attention went something like this:
It’s my device. I had it before I brought it to work and I was using Dropbox or iCloud to sync and share all my files. Now, my device has work data on it too. My security-conscious CIO doesn’t want work data shared on those public services. But I’m accustomed to, and almost dependent on my sync and share capability and my organization hasn’t yet given us a private alternative.
Now, in my role as a technology strategist, I spend a good bit of time helping to plan our investments. With the speed at which mobile and social technologies are sweeping through organizations, I have to admit that the case Sheila, Pushan and other Gartner analysts made that week for the rapidly emerging data protection crisis in BYOD sync and share was compelling. It occurred to me that credible vendors who could solve the problem in short order would be in high demand. That was eight months ago.
Fast forward seven months
In July, Forrester analysts Ted Schadler and Rob Koplowitz published The Forrester Wave™: File Sync And Share Platforms, Q3 2013 in a quest to uncover those credible vendors. I liked the way they characterized the problem. “Employees’ need to synchronize files grew from a whisper to a scream over the past few years. . . .The scream will grow louder as the number of tablets will triple to 905 million by 2017 to join the billions of computers and smartphones used for work.” The report evaluated and scored 16 of the most significant solution providers against 26 criteria. Among the leaders was IBM SmartCloud Connections. You can see the complete list of leaders here.
Change is here
The interesting thing that most folks miss in the sync and share conversation is that it’s about more than just syncing and sharing. As BYOD smartphones and tablets proliferate in the workplace, document management will shift from email attachments and file servers into social collaboration. Forrester points to a further social shift from casual partner collaboration to compliant workflow in regulated industries.
That kind of data is important – and the reason that the Gartner analysts were focused on the data protection issues of this BYOD world. Organizations today have well matured processes for protecting data on file servers and email systems, usually with an enterprise backup product. I commented on this set of tools in my post on Forrester’s Take on Enterprise Backup and Recovery. But as corporate information is relocated from file servers and email systems to sync and share systems, Gartner had an unmistakable reminder for its customers, “Consumer File Sync/Share Is Not Backup”.
I agree! The good news is that IBM has taken the time to ensure its enterprise backup product, IBM Tivoli Storage Manager Suite for Unified Recovery, protects synched and shared files in IBM Connections with all the same efficiency it does file servers, email systems and most any other data important to an organization.
What is your organization doing with file sync and share? How are you protecting that information?
In the modern datacenter, there’s a lot of shifting going on when it comes to traditional storage management responsibilities. What used to be the domain of a central storage and backup administration team has been thrown up for grabs as server virtualization and software defined everything have entered the scene. I hinted at this a bit in my post Do IT managers really “manage” storage anymore? But let’s consider a practical example that’s quite common with clients I speak to. If you are going to VMworld 2013, plan on attending the IBM TSM for VE hands-on lab to get more details.
Microsoft SQL Server is the foundation for a lot of applications that are critical to business operations, meaning CIOs and IT managers are interested in its recoverability. Those same CIOs and IT managers are also interested in the recoverability of their VMware estates, the software defined compute (SDC) platform that houses those databases. For many clients, the problem is that these two domains are tightly guarded by two independent superheroes, and neither is specifically trained in storage.
Superhero #1: The database administrator (DBA)
Most DBAs that I’ve known have an almost personal connection with their databases. They care for them as they would their own children. The thought of leaving one unprotected (without a backup) equates to dereliction of duty. Ignoring the idea that it takes a village to raise a child (or in this case that there may be other members of the IT administration village like VMware admins and backup admins), SQL Server DBAs will often work alone with the backup tools Microsoft provides to ensure their databases are protected. Good for the SQL Server, but not so much for the surrounding infrastructure. For databases running on VMware, routine full backups even with periodic differential backups can consume a LOT of disk space and virtual compute resources, and also contribute to the I/O blender effect.
Superhero #2: The VMware administrator
TSM for VE in VMware vSphere web client
VMware administrators can be just as focused on their domain as DBAs are. Their attention is on being able to recover persistent or critical virtual machine (VM) images, regardless of what app happens to be riding along. VMware has done a nice job of creating and supporting an industry of tightly integrated backup providers. These tools can get at the VMware data through a set of vStorage APIs for Data Protection (VADP), and VMware administrators can manage them through vCenter plug-ins. But few VMware admins are completely aware of all the workloads that run on their VMs, and fewer still are aware of the unique recovery needs of all those workloads. It’s just hard to keep up.
Common ground exists
One tool that bridges the gap is IBM Tivoli Storage Manager for Virtual Environments (TSM for VE). Nicely integrated with both VADP and SQL Server, TSM for VE can bring together VMware administrators and the DBAs in ways that would make any IT manager smile. Here are two of the more common approaches.
We can each do our own thing – together
As noted above, SQL Server DBAs take full backups sprinkled with differentials. Even though this approach can tax server and storage resources and contribute to the I/O blender effect, it is in the DBA comfort zone. When the app is running on a VMware virtual machine, the DBA has the option of storing those backups on disk storage associated with the VM. It’s a nice thing to do because it allows the VMware admin to stay within his comfort zone too. Using vCenter to drive a VADP-integrated snapshot tool like TSM for VE, the VMware admin can capture a complete copy of the virtual machine, along with the SQL Server backups the DBA created. Since the likely use of such a snapshot would be to recover the VM and then recover the database from its backup, there’s really no reason to include the source SQL Server database or logs in the snapshot. With TSM for VE, the VMware admin can exclude the source SQL Server database from being redundantly backed up, adding to an already formidable set of built-in efficiency techniques (with TSM for VE, snapshots are taken incrementally – forever, and can be deduplicated and compressed). It’s a good compromise solution letting each admin stay in his or her comfort zone. But it can be better.
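As a rough illustration, this kind of exclusion is typically expressed in the data mover’s options file. The VM name and disk labels below are hypothetical, and the exact option names and syntax should be verified against the TSM for VE documentation for your release:

```
* Illustrative data mover options (dsm.opt) - names are hypothetical.
* Back up the VM "sqlprod01", but skip the virtual disks that hold
* the source SQL Server database and logs. The disk holding the
* DBA's own backup files is still included in the VM snapshot.
DOMAIN.VMFULL "sqlprod01"
EXCLUDE.VMDISK "sqlprod01" "Hard Disk 2"
EXCLUDE.VMDISK "sqlprod01" "Hard Disk 3"
```

With an exclusion like this in place, each incremental-forever VM snapshot carries the operating system, the application binaries and the DBA’s backup files, but not a redundant copy of the live database.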
We can join forces and do something really great
With TSM for VE, VMware admins and SQL Server DBAs can put their heads together and choose to do something really great. For the DBA, it’s an exercise in less-is-more. The DBA stops doing her own backups. No more full or differential copies of the database. No more taxing resource usage on the VM. No more I/O blender effect. Just, no more. How? Well, with a VMware VADP integrated backup tool like TSM for VE, the snapshot of the VM is accompanied by a freeze and thaw of the SQL Server database (techno-speak for putting the database in a consistent state), just like what happens when a backup is independently initiated by a DBA. And with TSM for VE, as soon as the TSM server confirms that it has successfully stored the consistent snapshot in a safe, physically separate place, it will connect back with the SQL Server to truncate the database logs.
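A minimal sketch of what the joined-forces configuration might look like, again with hypothetical names and with the caveat that the option for enabling application-consistent (VSS) protection varies by TSM for VE release, so check the product documentation before relying on this form:

```
* Illustrative data mover options (dsm.opt) - names are hypothetical.
* Ask TSM for VE to quiesce applications via VSS during the VM
* snapshot, so SQL Server is captured in a consistent state and its
* logs are truncated after the backup is confirmed on the TSM server.
DOMAIN.VMFULL "sqlprod01"
INCLUDE.VMTSMVSS "sqlprod01"
```

The backup itself is then driven as an ordinary VM backup, for example `dsmc backup vm "sqlprod01" -vmbackuptype=fullvm`, run on a schedule or from the vCenter plug-in.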
In addition to the less-is-more benefits above, think about the differences in restore with these two scenarios. When the DBA and VMware admin simply coexist, each doing their own thing, restoring the SQL Server database includes steps for restoring:
The VM snapshot to get the database backups in place
The full database backup
The subsequent differential backups
By comparison, when the DBA and the VMware admin join forces with TSM for VE, the steps are dramatically simplified. Restoring the snapshot equates to restoring a consistent copy of the database. And remember, because these snapshots are highly efficient, they can be taken quite frequently.
Going to VMworld 2013? Come visit IBM on the Solutions Exchange floor at booth #1545.
VMworld 2013 is just around the corner and at IBM, we’re gearing up for a great set of conversations with our joint clients. As you’re planning your agenda, here are a couple of things worth looking in to.
IBM has a lot of expertise to share when it comes to optimizing virtual environments. A few weeks ago, in my Outside the Line – an interview on Virtualization optimization post, I was able to catch up with several of the experts who are leading this work. At VMworld, IBM will be showcasing these solutions on the Solutions Exchange floor at booth #1545.
IBM Tivoli Storage Manager for Virtual Environments (TSM for VE) is one of the most efficient backup integrations built on the VMware vStorage APIs for Data Protection (VADP). I offered some quick insights in my post VMware backup for the iPOD generation. At VMworld 2013, you’ll have an opportunity to take a test drive in the TSM for VE hands-on lab.
Are you going to VMworld? What are you most looking forward to?
For years Hollywood has been enamored with the idea of artificial intelligence. Beyond tabulating, beyond programmed responses, what would happen if a computer could learn, reason, analyze, predict? In short, what could computers do if they could think? Sadly, more often than not, Hollywood’s answer resulted in some kind of disaster. In 2001: A Space Odyssey, the HAL 9000 computer decided to kill the astronauts on Discovery One. In the 1983 film WarGames, the WOPR computer played games with global thermonuclear war, and in the Terminator franchise of movies, SkyNet attempted to exterminate the human race. Ugh!!
I’m proud that I work for a company that has a very different perspective on the potential of cognitive computing. Instead of blowing people up, IBMers around the world are developing cognitive systems to help us make better decisions.
Leading the Way to a New Computing Era
Number one on my list of Top 5 Observations from IBM Edge 2013 had to do with the cultural changes driving Big Data. The thing about big data is that, in large part, we don’t yet know what we don’t know. Read more...
Several months ago IDC published an interesting Market Analysis Perspective: Worldwide Enterprise Servers, 2012 — Technology Market that uncovered something quite contrary to conventional wisdom when it comes to virtualized server environments. Server virtualization, or software defined compute (SDC) as it is coming to be known, promised to control server sprawl and return balance to the portion of the IT budget allocated to servers. The IDC research confirms that the controlled-server-sprawl part of the promise has largely been realized. Since 2000, worldwide spend on x86 server hardware has actually declined from $70B to about $56B. Equally important, environmental spending on power and cooling has leveled off too. The revelation that surprised most folks was the dramatic expansion in spend on management. Since 2000, spend on management has more than tripled, reaching $171B, and now accounts for 68% of IT spend on x86 infrastructure.
These days the measuring stick for server virtualization seems to be around VM density, or the number of virtual machines that can be supported on a single physical server. The theory has been that as VM density increases, management costs decline. Most IT managers I talk to can point to fairly good and increasing VM density in their environments. So what’s causing associated management costs to increase so much and what can IT managers do to improve the situation? Read more...
Recently, Forrester published The Forrester Wave™: Enterprise Backup And Recovery Software, Q2 2013. I wasn’t surprised by their suggestion that “CommVault [Simpana 10.0], EMC [Avamar 7.0 and NetWorker 10.1], IBM [TSM 6.4], and Symantec [NetBackup 7.5] lead the pack. It’s a tight four-horse race for the top honors — [they] all scored high on strategy and current offerings.” These are the four vendors that are always pushing and shoving on each other in analyst comparisons. The thing that caught my attention in this report was the expert job analyst Rachel Dines did in peeling back a complex market space to uncover some important strategic observations about each vendor. Read more...
I routinely follow a number of blogs by storage industry thought leaders. Among them is a usually insightful blog by EMC’s Chuck Hollis. Last Friday I read his post titled Software-Defined Storage – Where Are We? As Chuck described, the post was intended to explore “Where are the flags being planted? Is there any consistency in the perspectives? How do various vendor views stack up? And what might we see in the future?” The questions themselves captured my attention. First, they are great questions that everyone who is watching this space should want answered. Second, I wanted to see which vendors EMC was interested in comparing with. Notably missing from Chuck’s list was IBM, a vendor who both has a lot to say and a lot to offer on the subject of software defined. (read more)
I started out my day by co-presenting with Brian Sherman, doing a repeat of the DS8870 R7.1 Easy Tier update. Good attendance, even for a repeat (about 50)! A very interactive session; the participants got all their answers! During the day, I listened to excellent sessions on Cloud Storage, Encryption and the new TSM Ops Center GUI! I did try to get into the TSM Q&A session with Dave Cannon and the TSM for VE session with Dan Wolfe, but they were standing room only!
I wrapped up the day by hosting the "Meet the Experts" storage session. I was fortunate enough to have a good turnout of my fellow speakers and a good turn out of customers with questions! A very good session! A lot of discussion around the topic of Software Defined Storage!
That wraps up Tech Edge 2013 for me! I am looking forward to Tech Edge 2014 in Las Vegas next May at the Venetian Hotel!
Day 2 of the MSP Summit continued to build upon the enthusiasm established on Tuesday.
The day's activities began with UBM Channel's Robert Faletra running a very interesting discussion on the MSP and Channel Landscape. More and more partners are moving away from the 'vintage' model of selling hardware and software and toward 'progressive' and 'transformative' modes that embrace the cloud model...and this transition is occurring faster than expected. He noted that 52% of BPs offer cloud services today, accounting for 20% of their revenues; those figures will grow to 68% and 28% respectively by 2015. Robert also noted that recurring revenue streams derived from an MSP business model create value for the partner company in the eyes of the investment community.
IBM's Judy Smolski then facilitated an insightful panel discussion on Transformational Approaches to IT Delivery and Business Value Innovation. Much of the discussion focused on how BPs can and should work with each other to address customer needs. IBM's sponsorship of 'Connect to Win' sessions in NA is one way BPs can get together to discuss how their services may complement each other.
From a Storage Software perspective, a session led by IBM's Steve 'Woj' Wojtowecz on Enabling Your Managed Services with the Cloud was the highlight of the day. Woj provided insights into the transition from the traditional VAR to the MSP model, the growing storage cloud services sales opportunity, and why customers need services like 'Backup as a Service' (BaaS) to run their businesses more effectively and efficiently. Thomas Bak of Frontsafe followed with a discussion of the cloud portal his company developed on TSM, which helps MSPs offer BaaS to their customers. The Frontsafe Cloud Portal provides chargeback capabilities, OEM branding and reporting of the customer's backup environment...and a tiered distribution model that allows MSPs to quickly grow their businesses. Heinrich Venter of iSanity then came on stage to talk about how his company leveraged TSM and the Frontsafe Cloud Portal to develop his BaaS service in South Africa. A very well received session that elicited a good many questions and interest from the audience!
The day continued with another eight sessions that provided valuable information to MSPs and to those partners considering this path. A very productive day!
Hi all, Had another great day yesterday, Wednesday - Day 3 - I saw sessions on how to use TPC/SVC to migrate data centers, Hardware updates, How to use Cognos to create custom reports and more! I heard that many Tivoli Storage Manager sessions were so full that people had to be turned away at the door! We closed down a successful Expo after lunch Wednesday - Lots of good business happened at the Expo this past Monday through Wednesday!
Some of today's highlights for me - Listened to an excellent IBM SmartCloud Storage Access session by Manuel Avalos Vega - easy-to-provision cloud storage! Next up was Brian Sherman, IBM Distinguished Engineer in the ATS organization, presenting DS8000 FlashCopy Client Scenarios to a full house - very nice. Brian and I teamed up the next session on DS8870 R7.1 Easy Tier and Beyond. A very interactive session with the packed room! Cathy Nunez, Director of DS8000 Development, was a helpful participant! In the last session of the day, I presented the z/OS Storage Management Ecosystem Update with able assistance from Kevin Hosozawa, OMEGAMON XE for Storage Product Manager, and Louis Hanna, a senior Systems Migration Project Office practitioner!
The two day MSP Summit got off to an exciting start today with over 300 attendees at the kickoff session hosted by Andy Monshaw and Deepak Advani.
Andy noted that over 4,000 MSPs were now recognized in the IBM PartnerWorld system and that roughly half had transitioned from being a traditional VAR. He also noted that 'MSPs are the new IT department' in the eyes of their clients, and that NextGen MSPs are those that 1) provide high value with innovation, 2) focus deeply on differentiation, and 3) move beyond virtualization to the cloud. Deepak noted that cloud is an undeniable trend and, in Vegas nomenclature, suggested that prospective MSPs 'place their bets' on it and start building their business model. He noted that 48% of CIOs now evaluate cloud first over traditional IT offerings. Deepak also emphasized the importance of building around and supporting open architectures such as OpenStack. An MSP Roundtable facilitated by Joe Panettieri of MSP Mentor reinforced messages such as 'moving up the stack' in providing service offerings to drive more revenue and profit.
Cobalt Iron's Richard Spurlock followed with a very insightful session on Innovating Next Generation Data Protection, explaining how his adaptive data protection solutions based on TSM address the new combination of Enterprise Backup and the Cloud Experience. He noted that flexible cloud deployments (public, private, hybrid) most effectively help customers cope with the challenges of cost and complexity.
IBM's Rodney Foreman led a session in which he emphasized that successful MSPs must be able to articulate the value of the cloud and have a roadmap for how they plan to enhance their services over time. He also discussed seven key attributes for cloud providers, and called out 'visibility' - providing a dashboard view of the customer's environment - as a key success factor. Rodney also noted that Cloud Storage was one of the easiest services to explain to customers, which was fueling its growth.
A number of other sessions, focused on accelerating time-to-revenue, customer satisfaction, cash flow and the importance of marketing, rounded out the first day.
Big crowds, a lot of great presentations and an enthusiastic audience all contributed to an outstanding day!