In the last post (Part 1) I indicated that we were going to talk about the first
steps of establishing an ECM Center of Excellence organization. But
before we go there, a few other things need to fall into place.
Establish an ECM Program and Program Manager
For ECM to be successful at an enterprise level, it must be treated as a program: a
program with a Steering Committee that is representative of the whole
enterprise and that provides guidance, impetus, and high-level
sign-off for company-wide issues such as the corporate taxonomy, key metadata,
and security models, as well as critical SLA and Disaster Recovery/Business
Continuity requirements. The ECM Steering Committee is responsible for
establishing the ECM Center of Excellence, first by assigning an ECM Program
Manager who is responsible for:
- ensuring program services are visible, planned, and managed to the client's goals
- developing the ECM vision and strategy
- developing, promoting, and managing ECM services
- creating and enforcing best practices, proven methodologies, and processes
- continuously refining COE metrics and management reporting
Develop an ECM Program Roadmap
One of the first tasks of the ECM Steering Committee is to assign resources to work
with the ECM Program Manager to develop an ECM
Program Roadmap. The first step of the ECM Program Roadmap is to evaluate the
current state, followed by gathering current and future business and IT needs.
Once the current state and needs are determined, the high-level future
architecture is defined. A gap analysis from current to future is then performed,
which drives the rest of the ECM roadmap. The roadmap needs to address the
ECM COE development/implementation tasks, the ECM technology planning/design/deployment
tasks, and the establishment of measurement/system-validation tasks.
The ECM COE Foundation Development group of tasks in the ECM Program Roadmap
includes identifying the needed services and creating a development plan for the
COE resources. It also covers developing the key ECM COE processes needed both to run the COE
and to develop and deliver its services.
The ECM Technology Planning & Design group of tasks focuses on developing
the ECM technical architecture and producing the architecture artifacts to
integrate into your Enterprise Architecture. One of the important artifacts that
I want to point out is the concept of Solution Patterns. Later I will discuss
packaged (tiered) services; these solution patterns provide governance on when
to use particular packages and provide guidelines around extending
(customizing) those packages. The other activity in this grouping is the
development of an ECM Technology Deployment Plan, a detailed plan
and timeline for deploying all the ECM underpinnings as well as the
services provided to the organization.
Identify ECM COE Offered Services
In the Evaluate Current State step (the first swim lane) of the ECM Program Roadmap,
information was gathered to define a high-level architecture that meets the
business/IT needs. Now the ECM COE resources look at the business
challenges/objectives and the future architecture, and define appropriate levels
of functionality for various segments of the potential user base.
The COE and business units work together to define various packages or tiers of ECM
functionality (for example, ranging from packages with basic store-and-retrieve
capabilities to more advanced packages offering revision control and automated
workflow capabilities). Other tiers may take into account a business unit's
acceptance of change. Some business units may desire mature services
and avoid change, while others may require more leading-edge technology, which
typically involves introducing technology on a more frequent basis.
While leveraging packaged solutions is desirable for fast deployment and cost
containment of ECM for the business, a package may not meet all the requirements of the
business. In that case the package can be used as the base, and custom
components can augment it to meet the business needs. All custom
components should be designed with reuse in mind and can be considered ECM COE
services as well.
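The package-plus-custom-components approach can be sketched as a small selection routine. This is a minimal illustration, not IBM's implementation; the tier names and capability strings are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class ServicePackage:
    """A tier of ECM functionality offered by the COE (names are illustrative)."""
    name: str
    capabilities: set[str]
    extensions: set[str] = field(default_factory=set)  # reusable custom components

    def covers(self, requirements: set[str]) -> bool:
        return requirements <= (self.capabilities | self.extensions)

# Hypothetical tier catalog, from basic store-and-retrieve up to automated workflow
TIERS = [
    ServicePackage("basic", {"store", "retrieve"}),
    ServicePackage("standard", {"store", "retrieve", "revision-control"}),
    ServicePackage("advanced", {"store", "retrieve", "revision-control", "workflow"}),
]

def select_package(requirements: set[str]) -> ServicePackage:
    """Pick the smallest tier that covers the requirements; otherwise use the
    top tier as a base and augment it with reusable custom components."""
    for tier in TIERS:
        if tier.covers(requirements):
            return tier
    base = TIERS[-1]
    base.extensions |= requirements - base.capabilities  # custom add-ons, reused later
    return base
```

The point of the sketch is governance: a solution pattern says which tier fits a given set of requirements, and anything beyond the tier becomes a named, reusable extension rather than an ad hoc customization.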
Given the current/planned ECM technology and the project pipeline, a prioritized
delivery plan of packages and custom components can be developed. The industry
ECM direction can be used to identify other services/packages that might need
to be developed but are not currently needed by projects in the pipeline.
Services with no immediate need, but where need is expected in the near term,
should be noted; once the ECM COE has the needed services implemented, these
become areas of research focus. (Research is a key recommended process that
will not be discussed in this presentation.)
So far we have been talking about technology services, but the ECM COE also
provides advise-and-consult, role-based services to the enterprise. These services include:
Solution Design Services – the ECM COE Architect role provides the knowledge to
develop business solutions leveraging the ECM technologies. These solutions can
be either: Content Storage and Retrieval Solutions: these solutions are
primarily focused on centrally managing content (paper and electronic) as well
as content retrieval. Typically a generic client is provided for retrieval.
Document Centric Workflow Solutions: these solutions focus not only on
centrally managing the content but also on managing the business process that leverages
the content. Typically this will be a BPM application with a customized user interface.
Solution Requirement Gathering Services – the ECM COE Process Designer/Business
Analyst provides the skills to help the business gather and understand their
requirements and translate them into ECM solution requirements. This information
is then used by the ECM COE Architect to more efficiently and accurately design a solution.
Process Modeling Services – the ECM COE Process Designer/Business Analyst is
knowledgeable in the principles of process modeling as well as the tools to assist
the business with documenting their current and future processes with ECM
technology enablement. The ECM-enabled process models can then be used by the
technical designer to automate the model with the BPM tool.
Technology Research Services – the ECM COE provides the skills to evaluate and
research new ECM technologies and to help the business select
technologies that meet their requirements.
ECM COE role services should be identified during this exercise so that they can
be communicated to the customers, resulting in stronger ECM solutions provided
by the ECM COE.
Time flies when you're having fun. I am out of time and my daughter is waiting for me
to tuck her in, so in the next post we'll pick up with a discussion of the next
step of ECM COE Foundation Development.
Please let me know what you think. All feedback is greatly appreciated.
Still finalizing your plans for Connect 2013? We've got a
few sessions that I think will interest you – be sure to add these to your
calendar! These are excellent opportunities for you to pose your questions to
our subject matter and industry experts, along with some IBM ECM customers.
Interested in meeting with ECM executives? We've got you
covered: request a meeting with either Doug Hunt, ECM Business Leader, Ken
Bisconti, Vice President ECM Products and Strategy, or Carol Taylor, WW Sales
Leader for Social Content Management.
We also invite you to find us in the exhibit hall at IBM
booth 23 – stop by for a demo of what Social Content Management can do for your organization.
Monday, January 28
11am (Swan Hotel, room 1,2): Genworth Financial, Work Smarter,
Not Harder, presented by Tim Perry, CTO of Genworth Financial
Tuesday, January 29
10am (Swan Hotel, room 9,10): Slumberland Furniture:
Using IBM Software to Deliver Consistently Superior Customer Experiences,
presented by Jamie Page, Director, Slumberland Furniture
11:15am (Swan Hotel, Pelican 1,2): Living Social, It's Not
Just About the Conversations and Topics, a panel discussion of experts,
including Joe Shepley, Doculabs, Larry Hawes, Dow Brook Advisory Services,
Cengiz Satir, IBM, and Steve Studer, IBM
1:30pm (Dolphin Hotel, S. Hemisphere
IV, V): Content & Social Ignites Context: IBM’s Content Platform of
Engagement, presented by Tim Perry, CTO of Genworth Financial, Doug Hunt,
IBM ECM Business Leader, and Ken Bisconti, Vice President of IBM ECM Products
5:30pm (Dolphin Hotel, S.
Hemisphere I): Ignite business performance in real-time with social
collaboration, mobile and content, presented by Ian Story, IBM and Steve
Wednesday, January 30
10am (Swan Hotel, room 4): Reduce, Reuse, and Recycle
Corporate Content, presented by Maig Worel, IBM
1:30pm (Swan Hotel, Mockingbird 1,2): Improving your
Information Economics with Complete Lifecycle Governance, presented by Mark
Thursday, January 31
7am (Swan Hotel, Toucan 1): Archiving and de-duplicating Email,
Files, and Social Content, presented by Cengiz Satir, IBM
Stay Social with us during the show #IBMConnect – @IBM_ECM @csatir
Although I have been involved in document capture for over 20 years, it was not until Datacap joined forces with IBM in 2010 that we started to meet regularly with large banks to help them address their massive mortgage processing challenges. Even given all the things that I had learned over the years about high-volume document capture, I have been surprised just how many nuances and special considerations that there are when it comes time to scan a mortgage.
Are you considering scanning and advanced document capture in your mortgage business (or are you just interested in learning more capture tricks-of-the-trade)? If so, then here's my list of the two most important ways that mortgage document capture is special:
1) Document = Batch
Most document capture applications are batch oriented. Why? Because it is almost always more efficient to scan a number of documents all at once (a "batch") versus one at a time. It is also a very useful simplification technique to reduce the number of "things" to track by grouping them into a batch, for example, if a batch consists of 50 documents, then there is a 50-to-1 reduction in 'things' to track.
There are some situations, however, where each document is its own batch. For example, this is often the case when the capture system reads from faxes. Typically each transmission is read into its own batch, and the sender is typically sending one document. Bank branch batch capture (described here) is another good example, where a customer hands over a document to a branch officer and that officer scans that document as a “batch.”
But mortgages are different. Depending on how you count documents, a mortgage packet of 200 or 250 pages may consist of 15 or 20 fairly generic document types up to 50 to 75 very specific doc types. In other words, the one meta-document, the "mortgage," is made up of many different individual documents, e.g. the loan agreement, proof of employment, liens, etc.
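The "one mortgage = one batch of many sub-documents" idea can be pictured as a tiny data model. The document types and page counts below are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    doc_type: str   # e.g. "loan agreement", "proof of employment", "lien"
    pages: int

@dataclass
class MortgagePacket:
    """One mortgage is a meta-document: a single batch composed of many
    individual sub-documents of different types."""
    loan_id: str
    documents: list[Document] = field(default_factory=list)

    @property
    def page_count(self) -> int:
        return sum(d.pages for d in self.documents)

    def types(self) -> set[str]:
        return {d.doc_type for d in self.documents}

packet = MortgagePacket("LN-0001", [
    Document("loan agreement", 30),
    Document("proof of employment", 4),
    Document("lien", 6),
])
print(packet.page_count)   # 40
```

In batch-oriented capture, tracking happens at the batch level; here the packet itself is the batch, and the real work is discovering where each sub-document inside it begins and ends.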
2) The primacy of document classification
For many years, advanced document capture was called "forms processing" because the task was to read data off of fixed forms. The archetypal application of forms-processing technology was reading tax returns for government revenue departments. There may be different tax forms and schedules, but typically they have bar codes or other easy-to-identify distinguishing marks. (Read the Virginia Department of Taxation case study.)
A mortgage "document" with all its sub-documents is a completely different beast. In the packet there may be some forms with bar codes, but there are many pages that have to be "read" to figure out what they are. The biggest task - by far - when processing a mortgage is to figure out what each of the sub-documents is, and where they end and the next begins. There's no easy one-size-fits-all solution. Doing a good job requires an armory of techniques, some simple and fast like bar code recognition, and some much more sophisticated such as fingerprint matching and textual classification via content analytics.
Of course, mortgage processing shares many challenges and processing characteristics with other large-scale document capture environments. For example, demands for timeliness are high – getting the documents into the repository at the first possible moment in order to make them available for loan servicing or other parts of the organization. And there is a role – in some organizations – for remote capture in a browser or through MFPs of mortgages and/or related follow-on documents.
Mortgage processing is a bit different from many, perhaps most, document capture applications. But if you have any experience in document capture, you know that one of the enduring characteristics of capture is that it is "hard" exactly because each application is different. Even within the category of mortgage processors – e.g. originators, wholesale, correspondent – each has different needs regarding which document sub-types they want to identify. The knowledge and experience of one implementation can help with the next, but it is never just a matter of plugging in the same application for two different banks and expecting them both to work the same way!
I’ve been on my high horse – or more accurately, my bicycle – for a while about looking at the benefits of document capture from a different point of view: the customer’s customer, and how it affects them. I summarized these ideas in a video tribute to the Tour de France which ended in Paris yesterday… Tour de Office...
But let’s look at the details. How does document capture fit into a strategy of improving customer satisfaction, better customer retention, and acquisition? Let's think of it from the point of view of a bank and their retail customers. Banking globally is still one of those businesses where paper plays a crucial role in many customer interactions. Capturing data off of a new account application or a loan origination form is crucial to the bank's business - and doing it accurately and in a timely manner will have a big impact on customer satisfaction.
But these metrics, no matter how significant, do not tell the whole story of the way that document capture and imaging can improve a bank customer's experience. It goes beyond the simple ability to turn a document around, i.e. to process the information on it.
Take a look at this quick video from PT Bank Internasional Indonesia (also known as Maybank). As a rapidly growing bank, they faced daunting obstacles in scaling up their paper processes, particularly as they surveyed the market for international expansion. Account opening, for example, was a manual, paper-based process, delaying the sharing of information between branches. The goal for Maybank was a central repository to store documents, so that branches and departments that previously faced massive hurdles in sharing information could start to do so effortlessly.
Digitizing and electronically capturing customer documents helps Maybank simplify and iterate account-opening and remittance processes, which reduces turnaround time and improves customer satisfaction. The solution also allows them to update customer records and transactions in real time and then share this information across multiple systems for accurate decision making.
In other words, a significant benefit, and one that has a direct impact on customer satisfaction, was the ability to break down some of the barriers within the bank itself. (I’ve explored this theme in an earlier blog on Smarter Commerce.)
And let's not forget that by "truncating" (scanning and destroying) the paper early in the process - at the branch - there is a huge benefit to the customer of increased security for their personal information. The loan application they fill in, instead of floating around the branch, and being sent as unencrypted paper around various bank departments, with sensitive personal information, including income and dates, now becomes an electronic document, all access to which is strictly controlled and tracked.
This is just one, real-world example illustrating some of the indirect benefits of document capture. Of course, capturing a document is really just the beginning of a process, most commonly associated with document imaging. It’s an academic question whether the true value is in the capturing or the storing (and retrieval!) of documents. The point is that when combined, capturing and imaging documents can play a central role in a customer-centric document strategy, which both improves internal processes and customer satisfaction.
Learn more about IBM’s Production Imaging Edition which features a unique bundle of document imaging and capture technologies here.
And don’t forget to follow me @CaptureGuru on Twitter!
Hello, and welcome to this inaugural post, with the aim of discussing Enterprise Report Management (ERM) related topics. Please look elsewhere for discussions about enterprise risk management or enhanced remote mirroring.
ERM is not new; the technology has been around for over 20 years with products like IBM Content Manager OnDemand, IBM FileNet COLD, IBM FileNet Report Manager, and IBM Report Management and Distribution System.
During this time many organizations have woven ERM applications into the backbone of their businesses to manage the storage and access of formatted high volume computer output and reports in support of customer service and, more recently, customer self service.
Other applications include online check storage and retrieval. If your internet banking application allows you to view your checks online, chances are they are being stored in an IBM Content Manager OnDemand system.
Historically ERM has been viewed as a standalone application. But within the past 3-4 years, ERM products have been increasingly integrated with other ECM products to support content, records and business process management applications. Not surprisingly, leading analysts now track ERM as a subcomponent of the Enterprise Content Management (ECM) market.
I look forward to discussing the use of ERM within the broader ECM community and beyond. Here’s looking forward to the next 20 years.
If the word "social" brings to mind the teenagers
in your life endlessly Twittering and updating Facebook, you might be hesitant
to tell your employees to go forth and Be Social. But before you imagine a
doom-and-gloom scenario of all of your employees blogging cat pictures,
consider the benefits of social technologies.
Social spaces include – but are not limited to – external
sites like Facebook or LinkedIn. Your organization might be more social than
you think! Consider spaces like Team Rooms, collaboration spaces, or wikis –
those social spaces encourage interaction and generate valuable content and
knowledge. Your organization can collaborate more efficiently – what took
minutes or hours can now take seconds to find using social collaboration tools,
which can be established to share information internally, externally, or both.
But how do I ensure that my organization doesn't end up
mired in too many wikis, blogs, and the like, losing track of my
content? It's important to formalize a social content strategy, much like
your existing content management strategies. You'll want to be sure that you're
saving the right things at the right time and ensuring that data is properly
stored and managed.
The Association for Information and Image Management (AIIM)
created a white paper called "Managing Social Content – to maximize value and minimize risk" (click the link to download) that is an excellent starting point for any
organization trying to determine "how social should I be?" The paper explores
topics such as the benefits of social business, the growing importance of
recording interactions from social business applications, current models for
social content management, functional requirements for a social content
management system, and more.
If you will be attending the AIIM Conference in San Francisco, we would
love to meet with you to discuss your organization's specific needs. Join us
for a roundtable in the SolutionCenter on March 21 at 3pm
to discuss how social your business should be. See our session on March 22 at
2pm, "The Future is Here: Content-in-Context is IBM Social Content Management," about how to keep your content social and "moving"
to increase value.
Join IBM at the SAPinsider
BI 2013 event, co-located with the Financials, GRC and Admin & Infrastructure
events, held March 19-22 at the MGM Grand hotel in Las Vegas, NV.
IBM offers attendees an exclusive opportunity to spend one-on-one time with
subject matter experts from across North America and our IBM Client Center: Lab for SAP Solutions. Grab a
refreshment, join a conversation, meet IBM Watson, the grand challenge quiz
show champion, and get your most pressing questions answered in a relaxed
atmosphere. From BI and SAP HANA to cloud, mobility, and infrastructure, our
in-house experts can help you overcome your toughest challenges and provide
insights you can apply at your own organization. Coupled with these lively
discussions are customer stories, microforums, and live product demos. The IBM
Lounge is located at #1012 on the show floor and will be open during exhibit hours.
IBM ECM will be
leading three microforums inside the IBM Lounge (booth #1012 in the exhibit
hall) that you won't want to miss:
Tuesday, March 19
Overview of all IBM ECM solutions for
use with SAP
Wednesday, March 20
Don't touch that invoice! End-to-end
Thursday, March 21
Save BIG with Value-based
Archiving & Governance for SAP
In Part 2, Randy spoke of the process to identify ECM CC service packages and tiers. During that exercise the services were prioritized based on the project pipeline, that is, when the services will be needed. In this part, I will focus on developing a staffing plan to deliver and support the planned services and projects. There are a number of activities that must be understood to develop a well-thought-out staffing plan. First, the organization needs to establish a project concurrency capacity objective. The number of concurrent projects will determine the number of resources and the skills mix needed to deploy solutions leveraging the ECM CC services.
Once the number of concurrent projects is determined, the information gathered in the Evaluate Current State step of the roadmap can be used to define an organization that leverages many of the ECM skills that already exist in the organization. A plan is then defined to address the skill gaps, taking into account the prioritization of the projects in the pipeline as well as the concurrency objectives when determining a timeline and the immediacy of filling the gaps. Once the timeline and immediacy are understood, the plan should outline the approach to fill both short-term and long-term gaps. For some skills this may involve contracting resources to address immediate needs while also leveraging those resources in a mentoring capacity.
Here is an example of the impact of the number of concurrent ECM enabled projects on the staffing of an ECM CC.
In this example, a customer modeled on the IBM ECM CC looked back at the projects they deployed over a year and determined the average number of hours by role. They also determined the average elapsed time of the projects, which was 3½ months per ECM engagement for the given year. With that information they determined that if they started one new project per month, they would average 3½ projects being managed in any given month. In the above chart you can see the impact this organization saw when managing 3½ projects concurrently, and the impact if more projects are introduced on a monthly basis. This analysis helped them determine staffing needs given the amount of work that the steering committee had established as guidelines.
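The arithmetic in this example follows Little's Law: average projects in flight = arrival rate × average duration. A small sketch, with the role names and per-project hour figures purely illustrative:

```python
def concurrent_projects(starts_per_month: float, avg_duration_months: float) -> float:
    """Little's Law: average projects in flight = arrival rate x average duration."""
    return starts_per_month * avg_duration_months

def monthly_role_hours(starts_per_month: float, avg_duration_months: float,
                       hours_per_project: dict[str, float]) -> dict[str, float]:
    """Average hours each role must cover per month, assuming a project's
    effort is spread evenly across its elapsed time."""
    in_flight = concurrent_projects(starts_per_month, avg_duration_months)
    return {role: in_flight * hours / avg_duration_months
            for role, hours in hours_per_project.items()}

# One new project per month at a 3.5-month average duration
print(concurrent_projects(1, 3.5))                      # 3.5
# Doubling the intake doubles every role's monthly load
print(monthly_role_hours(2, 3.5, {"architect": 140}))   # {'architect': 280.0}
```

The useful property for a staffing plan is the linearity: each additional project started per month adds a fixed increment of concurrent work, so the steering committee's concurrency objective translates directly into headcount by role.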
The next part in this series will discuss developing an ECM CC Engagement Process. This is one of the Best Practices and Standards that needs to be developed to guide ECM Solution delivery and helps the organization to take full advantage of the benefits of having an ECM CC.
In the meantime, please feel free to leave feedback or suggest topics that you'd like me to explore. Love to hear from you!
Document capture improves customers' experience at the bank
These days customers expect their branch banking experience to be like their other retail activities - quick, efficient, and seamlessly interwoven into the smart phone-based tempo of their lives. To stay competitive, a bank must deliver a snappy, iPhone-like experience, even in the context of a walk-in customer. While walking into a bank branch is not the same thing as using an on-demand service from Google and Facebook, many bank customers, particularly the younger ones that banks are always seeking to attract, expect immediate and accurate responses to their requests.
Since documents - like loan and new account applications, signature cards, and changes of address - are central to bank operations, the ability to process those documents quickly and accurately is key to meeting today's consumer expectations. And meeting or exceeding those expectations is the route to a happy customer experience.
Here's how a European bank made their customers much happier
I've recently worked with one of our banking customers in Europe. When I first met with them a year and a half ago, they were determined to improve their customers' in-branch experience, particularly when the customer would come in for a paper document-related transaction (as opposed to simply depositing or removing money).
Before Datacap, the customer would sit with a bank officer who would review the documents for completeness, thank them, and instruct them to come back the next day. Then the officer would type content from the documents into the relevant bank system, as well as faxing or sending the physical documents to a central location.
Now the process has been revolutionized. The paper documents are still there, and the customer is still sitting with the bank officer, but once the officer takes a quick look at the documents, they drop them into a document scanner at their desk. Then they wait (see my blog about that wait and why it is important and a challenge!). The goal is a turnaround time of around 15 seconds, but at peak times it might take a minute.
While the bank officer and the customer are having a quick chat, a lot is happening in the background:
- document images are sent from the branch to central servers
- OCR and ICR, document classification, image enhancement, data validations all take place
- the results are sent back to the branch
Then the officer can review the electronic document, sign off and complete the transaction.
Speed doesn't help if the data being captured is riddled with errors. A mistake on a loan application can be devastating – a denied loan and a potentially desperate search for alternative financing. The manual process of entering data has always been a weak link in banking transactions. Automated data extraction using optical character recognition and related techniques and technologies can reduce errors, particularly when those technologies are paired with skilled workers, such as a bank officer who reviews the results of automated recognition.
Accuracy is improved because the software flags problem fields for review, so the bank officer looks only at fields with suspected problems. Worst-case scenario: they review every field, and it takes as long as it would to do manually. Either way, accuracy is improved because you have one pass of automated data entry and one pass of review by an experienced human.
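The flag-for-review logic can be sketched as a confidence threshold over recognized fields. The threshold value and the field data below are invented for illustration:

```python
CONFIDENCE_THRESHOLD = 0.90  # illustrative cutoff; real engines are tunable

def fields_to_review(recognized: dict[str, tuple[str, float]]) -> list[str]:
    """Return the fields whose recognition confidence falls below the
    threshold, so the officer reviews only the suspected problems."""
    return [name for name, (_, confidence) in recognized.items()
            if confidence < CONFIDENCE_THRESHOLD]

result = fields_to_review({
    "account_number": ("4417-1234", 0.99),
    "income":         ("52,O00", 0.62),   # 'O' vs '0' confusion -> low confidence
    "name":           ("A. Customer", 0.97),
})
print(result)  # ['income']
```

Tuning the threshold is the trade-off the post describes: set it very high and the officer reviews nearly every field (the worst case, no slower than manual entry); set it sensibly and most fields pass straight through.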
Fast and accurate transactions make for happier customers
Document capture helps make sure the customer's high expectations are speedily met, so they can leave feeling good – maybe even with some additional money in their pocket! Happier customers make for happier banks, which can more easily acquire new customers and retain their existing ones.
For ongoing insights follow me on Twitter @CaptureGuru.
The release of the FileNet P8 Version 5.1.0 products brings with it new features in the information center. The IBM FileNet P8 Version 5.1.0 information center features collaboration tools, including commenting on help topics and sharing examples. The information center now includes information that was previously available only in PDF documents, and the upgrade and configuration information has been expanded.
The ECM Regional UserNet events, hosted by local ECM user groups in partnership
with IBM, are designed to bring together an extended local ecosystem of talent
to draw on and learn from, including clients, IBM ECM experts, and ValueNet partners.
Attend one of this year’s Regional UserNet events at no cost where you can:
Participate in an exceptional agenda including keynote presentations
from industry leaders and over 36 breakout sessions delivered by our product
experts and business partners – and don't forget the product demos and hands-on labs.
Leverage the many networking opportunities to share ideas for using
information to drive ROI, reduce costs, manage risk and improve case management.
Experience real-world solutions and the latest industry innovations that
can help your organization achieve an information-led transformation.
Whether you are an IT professional, a senior executive, part of the legal department, a
line-of-business manager, or a compliance officer, this is your opportunity to
participate in the exchange of best practices and experience the leading
industry solutions. Best of all, these events are FREE!
The CM8 User Group is pleased to invite you to a meeting for CM8 users on November 5 during the IOD conference. Meet us in Tropics B to hear from CM8 users and share your own experiences. For the first time ever, this session will be simultaneously broadcast online! Even if you can't join us in person, we'd love to have you attend our web session!
The goals of the CM8 User Group are: To allow CM8 customers to "share & network" their install/success stories and to learn how to "achieve greater business value" from their CM8 investments.
Hosted by Bridget Klare - Kroger Company, we'll cover a comprehensive agenda, including:
Royal Bank of Canada - Joseph Likuski, ECM Leader at RBC, will discuss RBC's CM8 implementation and use of IBM Content Collector for email
Erie Insurance - Mary Jo Ingalls, ECM Leader at Erie, will cover Erie's journey with IBM over the years, including IS to CM8 migration, CM for claims, addition of CMOD and how they bridge them all together.
Demo: IBM Content Navigator for CM8 - Ian Story, IBM Senior Product Manager, will demo the new ICN V. 3 release which includes CM8 & CMOD federation, e-Client/pClient replacement functions and much more
CM8 Product Strategy Panel Discussion - Featuring IBM's Jim Reimer, IBM CM8 Architect, Cristiane Hilkner, Vice President, ECM Development, Ian Story, Senior Product Manager, Shailesh Gupta, CM8 Product Development Manager
CM8 Next Product Update - Shailesh Gupta - IBM ECM Product Manager
Whether in person at IOD or via the webcast, you won't want to miss this event! If you're attending IOD, there's no need to RSVP – join us in conference room Tropics B at 10am.
When: Tuesday, November 5, 2013 10:00am – 12:15pm PT
Guest post by Steve Studer Offering Manager - IBM ECM Marketing
Products and Strategy
Observing the success of social networking and content
search tools reminds me of one of my favorite books, "Connections" by James
Burke. What I find they have in common
is that a business's success is very much dependent on content being utilized in
meaningful ways, e.g. connecting that content to the right people who can then
leverage the information for multiple purposes. Mr. Burke cites as one of the
biggest catalysts that changed the world the establishment of the library
in Venice, where every ship arriving to do trade
was asked to provide books that were then reproduced by scribes and made
available to anyone who came to the library of Venice. What I find so fascinating is how the
ideas presented in this book were also adapted into a popular television series
on PBS and are now available electronically. To me, the essence of Social
Collaboration is to connect content to people and processes and to provide
context where ideas can be freely exchanged. This typifies why IBM
Connections Enterprise Content Edition is so important to business today.
By the way, anyone interested in reading the book or watching the TV series can
search for "Connections by James Burke," view the multi-part
documentary on YouTube, or buy the e-book.
As James Burke often points out, the library concept expands
knowledge transfer and, at the same time, can be a great catalyst for change.
Connecting content with people can be the greatest incubator for expanding
abstract ideas. A case in point from just last week: one of
my colleagues found a wiki link that I had authored and tagged on IBM's
internal Connections site. What I found
so interesting was how this person discovered me. I had tagged a document inside the wiki with
metadata tags for content analytics and social collaboration. It turns out this
person was working with several customers and was looking for an expert on the
topic of Social Content, along with any presentation materials that could serve as an
educational tool for the customer.
Reaching the assets was only a small part of solving her
challenge. The fundamental piece she
needed was connecting with the right subject matter expert who could help
present these concepts. I'm happy to say
I made her day: she was able to intuitively navigate our internal Social
Content community, locating both the content and the expert, literally giving life
to the meaning of "context to content and people".
In closing, as James Burke so brilliantly points out, the
transfer of knowledge is the greatest catalyst for change. Consider the
wonders mankind managed to build upon when the transfer of information
happened at the pace of wind and sails and the medium was primarily paper in the
form of books or correspondence. Back
then, the sharing of content and knowledge was rarely done face-to-face because
of the time it took to travel long distances: weeks, months, and in many
instances even years. It is unfathomable to me the impact that these new social
tools will have when you consider that content and people on
opposite ends of the world can now be linked at the speed of Ethernet, not to
mention the ability to share that information through multiple social and
mobile content mediums. These new
technologies will definitely contribute to "The Day the Universe Changed" or, at
minimum, a Smarter Planet.
When I was the CEO of an independent document capture company (Datacap), I found it painful to listen to customers articulate why they were reluctant to bring in a specialty vendor for technology like document capture. "There is a lot of overhead in managing vendor relationships," is the way the discussion would often start. Running Datacap, itself a relatively small company working with only a modest number of vendors, did not prepare me to be sympathetic to these concerns.
I would argue that it was far better for them to select the "best of breed" solution than to hang their hats on keeping down the number of vendors they worked with. It seemed obvious to me that the better technology would also be the better choice, no matter how many vendors that involved.
Sometimes that argument worked... but many times it did not help. Customers from large organizations have significant challenges, even within the confines of that organization. Sometimes Herculean efforts are required just to align the interests of different departments, notably those with operational responsibilities versus those with technical/IT responsibilities. Throw into the mix hundreds of vendors with different licensing terms, different support structures and availability, and you get the recipe for ongoing chaos.
Large financial institutions are the very model of these challenges. As banking has been transformed globally from a large set of small organizations with primarily local customers into far-reaching regional, national, and global institutions, the challenges they face in managing technology and, of course, the vendors providing that technology, are vast. What's one more vendor in the mix? It might be the straw that breaks the camel's back!
Now I understand the logic behind vendor consolidation far better - and it helps explain the success we have had in banking and finance as IBM Datacap... success that we were never able to achieve when we were an independent vendor. Particularly with the Production Imaging Edition (PIE), we are offering a complete imaging package from capture to repository. Of course, there are other explanations as well, not the least of which is the increasingly tight integration we have established with the repository where captured documents are stored.
There's no magic number of vendors, but there is a hidden dynamic that I have learned: on the buying side of the technology equation there are constraints and considerations that legitimately have an impact on vendor selection. Have you run into those constraints? I'd like to hear your experience.