
Michael Mainelli and Ian Harris, The Z/Yen Group

Take Stock in Your Data Warehouse or Knowledge Management is Asset Management

[A version of this article originally appeared as "Value Propositions – Data Warehousing & Business Intelligence Software", Conspectus, Prime Marketing Publications Ltd (March 1999) pages 22-24.]

Fads and Fashion

There is a fashion for putting business-oriented jargon on top of incrementally new technology - witness data warehousing or knowledge management.  Jargon serves no purpose for creative people who have the imagination and the drive to get technology to work for them.  Creative people have sensed for some time the unity and longer-term possibilities of information technology.  One more buzz-phrase from the boring `suits' annoys the creative sorts.  Meanwhile, people whose critical faculties have supplanted their creative faculties chant their mantra, "someone (else) must tell me the business benefit in simple language I can understand before I will change my views".  "Knowledge management" is fashionable jargon, yet knowledge management as a concept adds value by getting the creative and the critical people to think about information in a new, structured way - as an asset.

Some phrases ring loud bells with an auditor - "X is an asset.  X is crucial to our organisation.  People in our organisation manage X.  X has value."  Assets - useful or valuable qualities or things - require care and attention.  Asset value needs to be preserved and, where possible, enhanced.  If knowledge is an asset, let's treat it like one.  There are some challenging implications from this simple association.  For any significant asset, an auditor expects seven typical pieces of evidence: 

  • accurate understanding of the Cost of the asset; 
  • confirmation of Ownership of the asset; 
  • some Disclosure of the importance of the asset;
  • ability to confirm the Value of the asset; 
  • evidence of the Existence of the asset; 
  • clear lines of Responsibility for the asset;
  • measurable Benefit from the asset.

Without all of the above, phrases such as "knowledge as an asset" or "knowledge management" are just fluff sitting on top of some new technology.  (Perhaps the somewhat surreal, but snappy, "COD-VERB" should catch on as a mnemonic device).  If knowledge is not an asset, let's stop bandying about phrases with little meaning and talk about a few databases we happen to have lying about.  Meeting the above tests should be a goal for most information systems professionals.  Examining each of the seven factors in turn, we can see how leading organisations are starting to put weight behind the concept of knowledge value.  The following table proposes some tests for each factor:

Cost

  Tests:
  • how much has it cost to build our knowledge base?
  • how much does it cost to maintain?
  • what are the indirect costs of knowledge, i.e.  how much are our people creating in the course of doing work and how much activity is specifically for knowledge enhancement?

  Evidence:
  • specific numbers in the management accounts
  • activity-based costing analysis of knowledge acquisition, interpretation, storage and renewal
  • client profitability statements which analyse the knowledge created during assignments

Ownership

  Tests:
  • do we have clear title to our knowledge?
  • do we know what bought knowledge is crucial?
  • is our data more or less valuable than externally benchmarked knowledge bases?

  Evidence:
  • copyrights, trademarks, documented methodologies
  • maps to data and knowledge sources
  • regular challenges of external data costs; frequent comparisons of knowledge sources and usefulness

Disclosure

  Tests:
  • do we publish a value?
  • could we defend a minimum value?

  Evidence:
  • appraisals of the knowledge base
  • annual reviews of knowledge base achievements

Value

  Tests:
  • do we have a valuation methodology for our knowledge?
  • do we measure the increase in value?
  • is the value of knowledge part of our economic value added measures?
  • do other people value our knowledge?

  Evidence:
  • clear descriptions of how knowledge adds value to people and processes
  • agonising over the value of knowledge to be published
  • knowledge as part of the return on capital equation, depreciation or amortisation charges
  • licensing fees, usage charges, database swaps

Existence

  Tests:
  • do we audit our data for completeness, accuracy and location?
  • do we have change control procedures which control major changes in our data?
  • can we point to knowledge which is `retired' (not just archived)?

  Evidence:
  • frequent knowledge `raids'
  • use of external knowledge auditors
  • statistics on average use, lifecycle times, data demographics, aging statistics
  • data mortality analysis

Responsibility

  Tests:
  • can senior executives find this knowledge that they talk about?
  • is someone in charge of the value of all organisational knowledge?
  • is someone responsible for each major area of knowledge, its maintenance and development?

  Evidence:
  • executive usage time
  • people being held accountable for falls in knowledge value
  • contention over who gets to update knowledge areas

Benefit

  Tests:
  • are our people (managers) willing to pay from their budgets for access and usage - are they prepared to pay back when they strike paydirt?
  • can we demonstrate competitive advantage?

  Evidence:
  • disputes over internal costings for data warehouse use - any charges for data warehouse access
  • return on investment vis-à-vis competitors
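As a thought experiment, the seven COD-VERB factors can be treated as a simple audit checklist.  The sketch below (in Python, with invented answers - our own illustration, not an audit standard) counts how many factors an organisation can evidence:

```python
# COD-VERB audit checklist sketch.  True means the organisation can
# produce evidence for that factor; the answers here are hypothetical.
cod_verb = {
    "Cost":           True,   # e.g. specific numbers in the management accounts
    "Ownership":      True,   # e.g. copyrights, documented methodologies
    "Disclosure":     False,  # no published value yet
    "Value":          False,  # no valuation methodology agreed
    "Existence":      True,   # e.g. data audits, mortality analysis
    "Responsibility": True,   # named owners for each knowledge area
    "Benefit":        False,  # no internal charging for access yet
}

satisfied = [factor for factor, evidenced in cod_verb.items() if evidenced]
missing = sorted(set(cod_verb) - set(satisfied))
print(f"{len(satisfied)}/7 factors evidenced; missing: {missing}")
```

An organisation failing several of the seven tests is, by the argument above, talking about a few databases it happens to have lying about, not a knowledge asset.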


The single most important factor is valuation.  Valuation binds all seven factors of asset management.  Without value, assets are meaningless.  The value of information is a much-bandied-about concept.  However, theory is rapidly catching up with some of the problems in valuing information.  Two theoretical areas used in risk/reward management, real options and information theory, are being combined by information analysts to arrive at values of organisational knowledge.  As with other intangible assets, such as brands, knowledge values are still somewhat uncertain but, as techniques spread and more comparisons of value become available, confidence and usage will rise rapidly.  The demand for better measures of knowledge also rises as investors commit more and more capital to knowledge-intensive organisations.

The clearest way to test the value of knowledge is to remove it, e.g.  separate a business unit from the organisational knowledge base.  If the business unit's costs increase markedly, clients leave, new pitches fail and staff lose morale, the organisational knowledge base was probably important.  Unfortunately, such a do-or-die test is very close to the analogous test of the value of clean air, e.g.  isolate people from all air and see how much it is worth.  Not only is the test crude and fatal, it misses the subtlety of value.  Value is best tested by seeing what some people are prepared to pay and what other people expect to charge, yet there are even more discriminating ways of assessing value than relying solely on data charges.

Some basic economics can help us to determine the optimum size of a data warehousing or knowledge management initiative.  If we can estimate the marginal benefits and the marginal costs of additional information, which might or might not form part of the initiative, we ought to be able to decide whether the additional information will provide net value or not.  This might sound theoretical, but in our experience successful data warehousing and knowledge management initiatives have to grapple with practical decisions around extending the scope of the initiative almost from day one of implementation.  Each rescoping (or "feature creep") decision should involve some thinking on the net value (marginal benefits and costs) likely to arise from that decision.  In order to measure the net value of knowledge, canny businesses agree some "meta indicators" of knowledge value and gather "data on the data" accordingly.  Whilst we accept that it is not possible to compute the value of knowledge "to the penny" using hard measures, we do advocate using hard measures as much as possible as evidence of value.  For example, the amount of time staff spend using information arising from the initiative can be measured and indicates the value those staff place on that information.
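The marginal reasoning above can be sketched in a few lines of code.  The figures and the proposed extensions below are invented for illustration; only the decision rule - extend scope while the estimated marginal benefit exceeds the marginal cost - comes from the argument:

```python
# Sketch of the marginal-value test for rescoping ("feature creep") decisions.
# All figures are hypothetical; the rule is to accept an extension only
# while its estimated marginal benefit exceeds its marginal cost.

extensions = [
    # (proposed extension, marginal benefit, marginal cost)
    ("add supplier data feed",      120_000, 40_000),
    ("archive five years' history",  30_000, 25_000),
    ("real-time competitor prices",  50_000, 90_000),
]

def net_value(benefit, cost):
    """Net value of one proposed extension of the initiative."""
    return benefit - cost

accepted = [(name, net_value(b, c)) for name, b, c in extensions
            if net_value(b, c) > 0]

for name, value in accepted:
    print(f"extend scope: {name} (net value {value})")
```

The hard part in practice is, of course, estimating the benefit and cost figures - which is where the "meta indicators" and "data on the data" described above come in.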

One risk/reward categorisation of the value of knowledge distinguishes four ways in which knowledge is enhanced:

  • structural work where obvious information is enhanced: management information systems, knowledge management systems, sifting and horizon-peering systems; 

  • repairing dangerous areas of poor knowledge or gaps: competitor information and analysis, market research of client perceptions, regulatory futures; 

  • filling in uncertain areas where established patterns need to be questioned and anomalies become paramount, almost playing the corporate `fool' to challenge accepted thinking; 

  • strategic information, where thought and consideration are foremost in deriving value from existing information which has not yet brought value.

In the stable state, organisations find patterns of success and reinforce them.  In the change state, anomalies indicate that established patterns are undergoing discontinuous change; the underlying paradigm shifts; new patterns of success emerge.  The tension between these two states is key - timing is everything.  If nothing is expended in enhancing knowledge in any of the above categories, the knowledge base is probably of little value - an asset without anyone prepared to pay for maintenance is either eternal or valueless.

One of the more intriguing notions prompted by knowledge management is the tension between pure information theory, where the rate of change in information is a key indicator of potential value (an important concept in data compression and data summarisation), and organisational anomaly detection, where detecting invalid assumptions is a key indicator of information value.  In pure information theory, volatile information, e.g.  who is asking for price quotes each minute, is more important than static information, e.g.  who normally seems to bid against us each year.  In pure information theory, the former is not as compressible as the latter.  However, in strategic thinking, low volatility information can be far more important because it challenges assumptions.  In normal circumstances, competitor tracking information only tells us "we're competing against the usual suspects".  Yet from time to time an unexpected new competitor - possibly a new entrant, possibly a substitute - will represent more valuable information for strategic management decisions than tracking the minute-to-minute to-ing and fro-ing of client relationships.
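The information-theory point - that a volatile stream compresses less, and so carries more bits per observation, than a static one - can be illustrated with Shannon entropy.  The two streams below are invented examples, not real data:

```python
import math
from collections import Counter

def entropy(stream):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(stream)
    n = len(stream)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Volatile information: minute-by-minute quote requests, many distinct askers.
quotes = ["A", "C", "B", "D", "A", "E", "B", "F"]
# Static information: the usual suspects bidding against us each year.
bidders = ["X", "X", "X", "Y", "X", "X", "Y", "X"]

print(entropy(quotes))   # high: hard to compress, many bits per observation
print(entropy(bidders))  # low: highly compressible, few bits per observation
```

On the pure information theory measure the quote stream is the more valuable; the article's point is that for strategic purposes the reverse can be true.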

Detecting changes in normal patterns - anomalies - has a value out of proportion to the pure information theory value.  This difference in value is readily apparent.  If your business has delivered software to 250 banks over 10 years and now learns lessons from its first software delivery to a government agency, the knowledge gained on this one assignment is likely to be much greater than that gained from the 251st bank.  A fundamental way of discriminating between information where volatility represents value (e.g.  who matters in an organisation) and information where anomaly represents value (e.g.  why is this competitor not a usual suspect) is to relate the information to whether it helps to confirm or refute a key assumption or a key decision.  Decisions, e.g.  whom should we approach in X organisation, have discrete answers where volatility is important, and differ from assumptions, e.g.  these are our competitors, where anomaly detection and pattern breaking are important.  Henderson, Rockart and Sifonis (1984) proposed one such categorisation of information for decision making along these lines.
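That distinction - volatility matters for decisions, anomalies matter for assumptions - can be sketched as a check against an assumed set of usual suspects.  The competitor names below are hypothetical:

```python
# Sketch: anomaly detection against a key assumption.
# The assumption "these are our competitors" is refuted whenever a
# bidder appears outside the usual set; that anomaly is the valuable signal.

usual_suspects = {"MegaBank Systems", "FinSoft", "LedgerWorks"}  # hypothetical names

def anomalies(observed_bidders):
    """Return bidders that break the pattern the assumption predicts."""
    return sorted(set(observed_bidders) - usual_suspects)

this_quarter = ["FinSoft", "MegaBank Systems", "AgriTel"]  # a new entrant appears
print(anomalies(this_quarter))  # the new entrant, not the usual traffic, is news
```

The usual traffic confirms the assumption and carries little strategic value; the single anomaly is what should reach the strategic management agenda.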

Among the most interesting developments in demonstrating the value of knowledge are "knowledge measurement systems".  Knowledge measurement systems are to the knowledge base as management information systems are to the organisation.  Knowledge measurement systems measure key factors in the knowledge base, particularly the increasing value of the knowledge base.  Some early, but still rudimentary, examples of attempts at knowledge measurement include value assessment systems such as Tertio's, pattern and anomaly detection systems such as DELOS or Bayesian inference systems such as Autonomy's.  All of these are early indicators of people beginning to estimate knowledge value and paying for systems which assist them.

Where property is a significant asset, managers expect systems to demonstrate the increase in portfolio value, the volatility in valuation, the uncertainties in value, comparisons against property indices, the rate of increase (return) in value and many other fairly obvious measures.  If knowledge is an asset, then knowledge measurement systems are a natural consequence, yet asset valuation skills seem to fly out the window when important, but more abstract, knowledge assets are discussed.  Demand from senior managers for a scorecard of information value in the period (e.g.  additions, subtractions, renewals, retirements, depreciation, appreciation, the current internal market price of access, etc.) is probably a key indicator that an organisation is beginning to understand the value of information.  We may soon add KVA measures (knowledge value added) or KVA indices to the expected EVA measures (economic value added).  COD-VERB may sound like fishy grammar, but when it is fully satisfied we may begin to realise that knowledge management has begun to swim fast and free with both creative and critical people in all organisations.
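A period scorecard of the kind described - additions, retirements, depreciation and a closing value - is easy to sketch.  The opening value and movements below are invented figures; the point is only that knowledge movements can be tracked like any other asset register:

```python
# Sketch of a period scorecard for a knowledge base, treated like an
# asset register.  All figures are hypothetical.

scorecard = {
    "opening_value": 1_000_000,
    "additions":       150_000,  # new knowledge acquired or created
    "retirements":     -40_000,  # knowledge formally retired, not just archived
    "depreciation":    -60_000,  # ageing of time-sensitive knowledge
    "appreciation":     25_000,  # revaluation where external demand has risen
}

closing_value = sum(scorecard.values())
print(f"closing knowledge value: {closing_value}")
```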

Michael Mainelli and Ian Harris are directors of Z/Yen Limited.  Z/Yen specialises in risk/reward management, an innovative approach to improving performance in strategy, systems, people and organisation.  Z/Yen clients to date include blue-chip companies in banking and technology, as well as charities and sales/service companies.