
Accuracy Talks Straight #6 – The Academic Insight

Isabelle Comyn-Wattiau
Professor at ESSEC Business School, Chair of Information Strategy and Governance

Valuing our wealth of data, a challenge no company can escape

Evoking the value of data in 2022, when the media is overflowing with examples of companies suffering damage linked to their data, may seem counter-intuitive. Yet it is well known that data has value, and this value is precisely why attacks targeting data are no longer simple cyberattacks: increasingly, they aim to seize the informational wealth of the target organisations.

Data security can be broken down into three areas: availability, confidentiality and integrity. Attacking an information system compromises its availability, thereby endangering the processes that the system underpins. This is what we observed at the Corbeil-Essonnes hospital a few months ago. Without access to patient data, the diagnosis and care process becomes longer and more expensive; it can even harm patient health by delaying a course of treatment. During such attacks, we also fear a breach of confidentiality of highly sensitive data.

And if the hackers happen to modify these data, they compromise their integrity. All three pillars of data security are thus affected, with extensive damage: first to the health of patients, but also to the reputation of the hospital, not to mention the cost of restoring the information systems and all the affected processes. Limiting ourselves to securing data is a reductive, defensive approach, even if we cannot do without it.

Determining the value of data is a significant issue for most companies. The press publishes daily success stories of start-ups where a good idea for sharing or pooling highly operational information creates new, unsuspected value. Thus, in 2021, the market capitalisation of Facebook reached around $1 trillion, while the net value of the company based on its assets and liabilities was only $138 billion.1

This difference of more than $850 billion can largely be explained by the data that Facebook collects from its users and uses, in turn, to feed its advertising algorithms. For economists, data are non-rival assets (they can be consumed by various users without diminishing), which do not necessarily depreciate with use; on the contrary, they can generate new information, for example when combined with other data. For some data, however, the value does depreciate, and very quickly at that. All these characteristics mean that data fall into a highly specific class of asset that resembles no other intangible asset (brand, software, patent, etc.).

Tackling the value of data also requires us to agree on the vocabulary to be used: data versus information. Without reopening the debate on the difference between data and information, we can consider them identical in an initial approach to the topic. Some, however, will wish to distinguish data – the input of the system, unmodifiable, a result of measuring a phenomenon – from information – the output of the system after cleaning, restatement, refinement, aggregation, transformation, etc.

The value of information has been studied from an accounting perspective, notably by Moody and Walsh.2 They first endeavoured to demonstrate that information can be considered an asset: it provides a service and an economic advantage, it is controlled by the organisation, and it is the result of past transactions. They then proposed three appraisal methods to value information.

The first is based on costs: acquisition, processing, storage, etc. It is the easiest to implement because these elements are more or less already present in the financial controller's dashboard. However, these costs do not reflect all aspects of data, for example the development of their value over time.

The second method is based on the market and consists of determining the value that could be obtained by selling the data; here, we speak of an exchange value. This approach requires considerable effort, and it is not always possible to obtain a reliable measure of the value of the data.

The third method is based on utility. It appraises the use value of the data by estimating the economic value that they can generate as a product or as a catalyst. But this value is difficult to anticipate, and estimating the share attributable to the catalytic effect is also highly complex.
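To make the contrast between the three appraisal methods concrete, they can be reduced to toy formulas. The sketch below is purely illustrative: the `DataAsset` fields, function names and figures are hypothetical simplifications, not drawn from Moody and Walsh.

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    acquisition_cost: float   # cost of collecting the data
    processing_cost: float    # cleaning, restatement, aggregation
    storage_cost: float       # conservation over the life cycle
    market_price: float       # estimated resale price (exchange value)
    revenue_generated: float  # revenue the data produces as a product
    catalytic_share: float    # fraction of other revenue it enables

def cost_based_value(a: DataAsset) -> float:
    """Method 1: sum the costs of acquiring and keeping the data."""
    return a.acquisition_cost + a.processing_cost + a.storage_cost

def market_based_value(a: DataAsset) -> float:
    """Method 2: the exchange value obtainable by selling the data."""
    return a.market_price

def utility_based_value(a: DataAsset, enabled_revenue: float) -> float:
    """Method 3: use value as a product plus a catalytic effect."""
    return a.revenue_generated + a.catalytic_share * enabled_revenue
```

Even on invented figures, the three methods yield different numbers for the same asset, which is precisely why the article treats them as partial but complementary.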

The various approaches to determining the value of data thus appear partial but complementary: some are based on the use value or exchange value of data; others assume rational corporate behaviour and assess data at the level of the investment made to acquire it and manage it throughout its life cycle; still others are based on risk. The risk-based approaches see data as the target of threats to the company or organisation. Such risks may be operational: missing or damaged data may cause certain processes to malfunction.

But there are also legal and regulatory risks, as a growing number of texts stipulate obligations concerning data, the General Data Protection Regulation being only the best-known example. The risks can also be strategic when they affect the reputation of the company or lead it to make bad decisions.

Finally, some authors have taken an approach based on externalities for open data, which are available to all and which, when exploited, can bring a benefit to society at large.

The concept of data value is linked to the objective of proper data governance: maximising data value while minimising the associated risks and costs.3 By adopting this three-pronged approach (value, risk and cost), we obtain a more holistic view of data value and can improve its valuation.

These three aspects are complementary, but we must not exclude context.

Indeed, the same information does not have the same value depending on the temporal, geographic, economic or political context in which the valuation process is conducted. The question of why the valuation is needed must be answered in order to characterise the relevant contextual elements: political, economic, social, technological, ecological and legal (PESTEL) in particular.

The object of the valuation must itself be identified. One of the difficulties in estimating the value of data is choosing the appropriate level of detail: are we talking about an entire information system (e.g. the client information system), a set of data (e.g. the client database) or a single key piece of information (e.g. the launch price of a competing product)? It is clear that the value of an information system is not the simple sum of the values of its components.

Few approaches to the valuation of data are sufficiently holistic and general to apply to any type of data in any context. Recommendations can be made, for example the recommendation to choose between a top-down approach and a bottom-up approach. But a truly holistic approach can only be achieved by combining these two paths.

It is because companies are still unable to measure the real and potential value of their data that they do not invest sufficiently in data governance and information sharing. This is a vicious circle that ultimately prevents the company from realising the full value of its data.

A virtuous circle can be built by starting with the most critical data, for example (but not necessarily) client data, and gradually bringing on board all data actors: producers, transformers, sellers, distributors and consumers of these data. Together, they provide the different points of view necessary for a holistic approach.


1 J. Akoka, I. Comyn-Wattiau, Evaluation de la valeur des données – Modèle et méthode, Proceedings of the 40th INFORSID Congress (INFormatique des ORganisations et Systèmes d’Information et de Décision), Dijon, 2022
2 D. Moody, P. Walsh, Measuring the Value of Information – an Asset Valuation Approach, Proceedings of the European Conference on Information Systems (ECIS), 1999
3 World Economic Forum, Articulating Value from Data, White Paper, 2021
