Accuracy has been assigned by SPAC Pegasus Entrepreneurs to perform the financial due diligence in the context of the following operation:
• FL Entertainment (composed of Banijay Group and Betclic Everest Group) is merging with SPAC Pegasus Entrepreneurs, which is backed by European investment firm Tikehau Capital.
• The transaction gives an implied pro forma equity value of €4.1 billion and a pro forma enterprise value of €7.2 billion – the largest business combination by a European-listed SPAC.
• The Banijay Group is the world’s largest independent content production company; the Betclic Everest Group operates in online sports betting and gaming.
Over the past few years, the banking industry has witnessed a new wave of digital transformation. Virtual banking, for example, has become more popular in many regions. Other digitalisation trends such as open banking, RegTech, AI and data-driven decision-making, to name a few, are in the headlines.
In addition, the Covid-19 pandemic is changing the way that banks and customers interact. Today, retail banking products are very much commoditised, with interest rates and other features of bank offers proving broadly similar from one provider to the next. FinTechs and TechFins have therefore emerged, bringing new and better customer experiences than their incumbent counterparts. Customer expectations have also been changing: customers increasingly seek digital, seamless, fast and integrated services. Banks are thus left with little choice but to undertake the digital transformations necessary to meet these expectations.
Many retail banking transformations are taking place in the market. In the broad sense, we can categorise them into three types: (1) moving from product-centric to customer-centric (i.e. to have more and faster customer interactions, to offer more personalised services and advice, etc.), (2) automating end-to-end services (i.e. adoption of technology for on-boarding, e-KYC, risk management, internal controls, etc.), and (3) enabling big data analytics and data-driven business decision-making.
In this article, we focus on the second and the third categories. In particular, we dedicate this article to the discussion of credit scorecards, as one of the major tools for data-driven risk management and business decision-making. We will review traditional scorecard development methodologies and then discuss the latest trends.
2. Credit scorecards: a key to unlocking value
Retail banks typically use credit scorecards, which are mathematical models, to predict the behaviours of their customers. The most important behaviour to predict is whether the customers will default on or repay their borrowings. When it comes to such predictions, two types of scorecard are widely used: the application scorecard and the behavioural scorecard.
Table 1 – Comparison between application and behavioural scorecards
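To make the idea concrete, a deployed scorecard is ultimately a simple additive model: each characteristic of the applicant maps to a number of points, and the total is compared with a cutoff. The sketch below is illustrative only – the characteristics, bins and point values are entirely hypothetical.

```python
# Hypothetical characteristics and point values, for illustration only.
SCORECARD = {
    "age_band":      {"18-25": 10, "26-40": 25, "41+": 35},
    "employment":    {"salaried": 30, "self-employed": 15, "unemployed": 0},
    "delinquencies": {"none": 40, "1-2": 15, "3+": 0},
}

def score(applicant, base_points=300):
    """Sum the points earned on each characteristic on top of a base score."""
    return base_points + sum(
        SCORECARD[char][applicant[char]] for char in SCORECARD
    )

applicant = {"age_band": "26-40", "employment": "salaried", "delinquencies": "none"}
# score(applicant) == 300 + 25 + 30 + 40 == 395
```

An application scorecard would draw such characteristics from data available at the time of application, while a behavioural scorecard would draw them from the existing account history.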
3. Traditional scorecard development framework
There are a number of tools that can be used for the development of retail credit scorecards. Historically, SAS was arguably the dominant programming language for retail credit risk management, including the development of credit scorecards. Over the past decade, open source programming languages, such as Python and R, have become more and more popular. While most banks are still using SAS now, many have started using open source languages in parallel.
Traditionally, a six-phase framework is adopted for credit scorecard development. As demonstrated in figure 1 below, the six phases are (1) data processing, (2) variable transformation and selection, (3) logistic regression, (4) performance inference, (5) scorecard scaling and (6) scorecard validation. Refer to appendix 1 for a more detailed discussion regarding the development procedures.
Figure 1 – Six-phase credit scorecard development processes
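As an illustration of phase 2 (variable transformation and selection), weight of evidence (WOE) and information value (IV) are the classic statistics for binning a characteristic and judging its predictive strength. A minimal sketch, with invented bin counts:

```python
import math

def woe_iv(bins):
    """bins: {bin_label: (n_good, n_bad)} for one candidate characteristic.
    Returns the per-bin weight of evidence and the overall information value."""
    total_good = sum(g for g, b in bins.values())
    total_bad = sum(b for g, b in bins.values())
    woe, iv = {}, 0.0
    for label, (g, b) in bins.items():
        dist_g = g / total_good          # share of goods falling in this bin
        dist_b = b / total_bad           # share of bads falling in this bin
        woe[label] = math.log(dist_g / dist_b)
        iv += (dist_g - dist_b) * woe[label]
    return woe, iv

# Invented counts: bin "A" is good-heavy, bin "B" is bad-heavy.
woe, iv = woe_iv({"A": (80, 20), "B": (120, 80)})
```

By common rules of thumb, a characteristic with an IV below roughly 0.02 is considered unpredictive, while very high values may signal leakage; the exact cut-offs vary by practitioner.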
4. Challenges in traditional credit scoring
Traditional credit scorecards have been used by market practitioners for a few decades. However, they are not perfect when considered through the lens of big data. Below we highlight the major challenges faced by traditional credit scorecards today.
Figure 2 – Key challenges faced by traditional credit scoring
5. Key enablers to unlock value
We have already briefly mentioned the trends to tackle the challenges encountered by traditional credit scoring. These are certainly at the heart of retail credit decisioning in the era of big data analytics.
Figure 3 – Key trends in retail credit scoring
Trend one: Big data analytics and the use of alternative data
Looking at retail banking globally, we are seeing a strong focus on improving data and deeply understanding customer needs to create personalised experiences. Big data analytics and the use of alternative data have become one of the most prevalent trends in the industry’s transformation. With rising computing power and increasing access to advanced analytics tools, market practitioners are starting to realise the hidden value of data as well as to search for new data sources.
Retail banking has long been a data-driven business, where data is generated at every stage of the customer journey. However, historically, most banks did not have an efficient way to realise the potential of the data nor the IT infrastructure necessary to do so. Furthermore, traditional data as used in the past is just the tip of the iceberg; huge amounts of alternative data, in either structured or unstructured forms, are generated every second from various data sources, both internally and externally, in this digital era.
Over the past decade, thanks to advances in big data analytics, retail banks now have increasing capabilities to process traditional and alternative data efficiently; thus, they are able to build up the customer’s 360-degree profile digitally. With that in mind, banks are starting to provide a more tailor-made customer experience via their banking apps and digital platforms. In addition, upselling and cross-selling campaigns can now target specific customer segments based on insights from big data analytics. Developments in AI and machine learning also help banks and data providers to gain insights from unstructured data (e.g. using natural language processing (NLP) to gauge a customer’s sentiments).
Figure 4 – Traditional data and alternative data comparison
The use of alternative data not only improves the robustness of the scorecard model but also enables banks to assess the creditworthiness of untapped customer segments. This helps to extend financial services to the two billion unbanked adults globally.
With alternative data, retail banks are also able to develop more scorecards, such as income scores, propensity scores and marketing scores. These further help banks decide to whom to lend their money, how much to lend, in what time frame and through what channels.
Some FinTech firms and digital financial service providers have taken the initiative to make use of alternative data sources for credit scoring. Credit bureaus, such as Experian, can now add rent payment history to their credit scoring algorithms thanks to a tool developed by the UK PropTech firm CreditLadder. Lenddo, a software business in Singapore, has incorporated social media and mobile phone data to assess clients’ credit levels. By aggregating data from SMS footprints, electronic devices, emails and credit bureau reports, among others, Algo360, an alternative credit score solution provider, helps new-to-credit customers get loans. Small FinTech companies have used smartphone activity, including calls, GPS data and contact information, to execute credit scoring in microfinance. As alternative data accumulates, the output from predictive models is likely to become more reliable and accurate over time.
Alternative data is not only beneficial when credit scoring individuals, but also in the case of SMEs. Banks commonly consider SMEs to be high-risk clients since information about them is limited, causing difficulty in evaluating their creditworthiness. Because of the intrinsic qualities mentioned previously, alternative data, in conjunction with traditional data sources, will help to build a more comprehensive profile of SMEs, allowing lenders to make better decisions. Digital SME lenders (e.g. Kabbage, an Atlanta-based FinTech company) are making wide use of alternative data such as bank account money flows and balances, business accounting, social media, real-time sales, payments, trading, logistics, and credit reporting service provider data, as well as various other private and public sources of data, to improve risk assessment and to tap into a large market of underserved SMEs.
Moreover, the value of data can be further ‘mined’ if combined with AI and machine learning techniques, which brings us to the second major trend in the digital transformation of retail banking: AI and machine learning in modelling.
Trend two: AI and machine learning in modelling
Retail banking is one of the industries where the use of artificial intelligence (AI) and machine learning (ML) has become widespread. We have talked about the tremendous potential of data, and we believe that these new techniques are well placed to assist in unleashing this potential, particularly when it comes to credit decisioning.
Machine learning can be applied to strengthen traditional logistic regression credit scoring or a solely ML-based model can be developed for credit scoring. Below we highlight some ML techniques that can be applied by banks when developing their credit scoring systems.
Figure 5 – Common machine learning analytics applied for credit scoring system development
An ML-based model would have several advantages over a logistic regression model. First, it can capture the non-linear nature of risk factors, and thus if trained appropriately, can possess higher predictive power. Second, it is agile and dynamic enough to perform the timely assessment of customer credit quality based on greater amounts of relevant and recent data. Third, the model can be highly automated and self-improving, thereby lowering ongoing operational costs.
Higher predictive power
ML-based models are trained with much more data than their traditional counterparts. These include both traditional data and alternative data as discussed above. While traditional models are not designed to discover complicated relationships between large amounts of data, ML-based models are much stronger in this area. As such, it would not be surprising to see that ML-based models are more predictive than traditional models.
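To illustrate why, consider a toy, single-feature example in which default risk is high at both extremes of a characteristic – a U-shaped relationship that a single linear term cannot capture. The sketch below hand-rolls gradient boosting on decision stumps; the data is synthetic and the implementation purely illustrative, not a production model.

```python
def fit_stump(xs, residuals):
    """Best single threshold split minimising squared error on the residuals."""
    best = None
    order = sorted(set(xs))
    for i in range(len(order) - 1):
        thr = (order[i] + order[i + 1]) / 2
        left = [r for x, r in zip(xs, residuals) if x <= thr]
        right = [r for x, r in zip(xs, residuals) if x > thr]
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, thr, lmean, rmean)
    return best[1], best[2], best[3]

def fit_boosted(xs, ys, rounds=50, lr=0.3):
    """Gradient boosting: each stump is fitted to the current residuals."""
    base = sum(ys) / len(ys)
    pred = [base] * len(xs)
    stumps = []
    for _ in range(rounds):
        resid = [y - p for y, p in zip(ys, pred)]
        thr, lmean, rmean = fit_stump(xs, resid)
        stumps.append((thr, lmean, rmean))
        pred = [p + lr * (lmean if x <= thr else rmean)
                for p, x in zip(pred, xs)]
    return base, lr, stumps

def predict(model, x):
    base, lr, stumps = model
    return base + sum(lr * (l if x <= t else r) for t, l, r in stumps)

# Synthetic U-shaped risk: defaults (1) at both extremes of the feature.
xs = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
ys = [1, 1, 0, 0, 0, 0, 0, 0, 1, 1]
model = fit_boosted(xs, ys)
```

A logistic regression on the raw feature would predict roughly the same risk everywhere, while the boosted stumps recover the U-shape. In practice, banks would of course rely on an established library rather than a hand-rolled learner.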
More agile and dynamic
ML-based models are continuously trained with the most up-to-date data, so that they are able to perform real-time assessments of customer creditworthiness. This allows the models to provide rapid feedback to model users for credit approval and other decision-making processes. Due to their agility, ML-based models are also more customer-centric and offer smoother assessments of customer creditworthiness. As a result, greater financial inclusion is possible.
Figure 6 – Risk assessment over time – ML model vs traditional model
Highly automated and self-improving

ML-based models are designed to be self-improving over time and thus highly automated. Traditional models require users to recalibrate them (e.g. on a yearly basis) and redevelop them (e.g. every few years), whereas ML-based models are able to update themselves based on updated data feeds. As such, operational costs for ML-based models are lower, especially in the long term.
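This self-updating behaviour can be sketched with a minimal online learner: a logistic model whose coefficients are adjusted by stochastic gradient descent each time a new repayment outcome is observed. Everything below (the feature encoding, the learning rate, the simulated data feed) is illustrative only.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class OnlineScorer:
    """Logistic model updated one observation at a time (SGD on log-loss)."""
    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict_proba(self, x):
        return sigmoid(sum(wi * xi for wi, xi in zip(self.w, x)) + self.b)

    def update(self, x, y):
        # One gradient step on the log-loss of a single (features, outcome) pair.
        err = self.predict_proba(x) - y
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

# Simulated data feed: one standardised risk feature; 1 = default, 0 = repaid.
scorer = OnlineScorer(n_features=1)
for _ in range(200):
    scorer.update([1.0], 1)    # high-risk profiles tend to default...
    scorer.update([-1.0], 0)   # ...low-risk profiles tend to repay
```

The model needs no periodic redevelopment in this setup: each new observed outcome nudges the coefficients, which is the essence of the "self-improving" property (real systems add governance, monitoring and drift controls on top).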
With these benefits, it is no wonder that credit bureaus are aggressively using ML to evaluate large amounts of data and generate improved insights. Equifax, for example, provides its clients with tailor-made services by applying neural networks to an artificial intelligence credit scoring approach. Equifax is not alone in experimenting with ML; Experian boosts its analytics products with ML capabilities to provide richer, more insightful information. Even for ‘credit invisible’ clients with infrequently updated credit files, VantageScore incorporates ML to analyse risks and provide ratings. TransUnion and FICO have also found ML effective in detecting high-risk behaviours and producing more accurate credit scorecards. A blend of Tree Ensemble Modelling (a machine learning technique employed by FICO) and scorecards significantly improves predictive performance in credit assessment compared with traditional scorecards.
In addition to traditional credit bureaus, FinTech companies are also actively exploring possibilities in ML to run their businesses. For example, LendingClub, the world’s largest online platform connecting investors and borrowers, has created its credit-scoring algorithm based on ten years of LendingClub data, AI and ML technologies; Kabbage is developing next-generation ML and analytics stacks for credit risk modelling and portfolio analysis; and LendUp, an American online direct lender, employs ML algorithms to identify the top 15% of borrowers who are most likely to pay back their debts.
Notwithstanding the advantages of ML-based models, they have limitations that remain to be resolved. First, ML-based models are far more complex than traditional models, and their results can be difficult to interpret; it can likewise be more challenging to explain the models to regulators and auditors. Second, the performance of ML-based models depends heavily on the quality of the data used, and when feeding huge amounts of data into the models, ensuring that quality can be challenging.
Trend three: Process automation
The third trend is the increasing automation in almost every part of the business. In order to provide fast interactions and personalised customer experiences, automation in know your customer, credit approval, risk management and reporting has become highly important. For example, OppFi, a leading financial technology platform, effectively automates the credit scoring process by using AI models, real-time data analysis and proprietary scoring algorithms. Zest AI, an AI-empowered credit life cycle management organisation, provides banks with its automated services in data processing and documentation as well as compliance validation, deployment and integration. With the help of process automation, banks and FinTech companies are largely improving customer experience and greatly reducing operating costs by cutting loan application processes to a few minutes. Credit scoring is at the heart of credit approval and risk management, and its automation largely relates to the automation of data processing, modelling and validation.
With big data analytics, banks use both internal and external data to a great extent. Data collection and data cleansing are the major tasks to be automated. Data collection involves the collection of data from different sources, whether traditional or alternative, as well as its digitisation and standardisation. Data cleansing involves data validity checking, data backfilling, treatments for outliers and doubtful data, etc.
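A fragment of what such automated cleansing might look like, with the validity range and the backfilling policy (median imputation of missing or out-of-range values) chosen purely for illustration:

```python
def clean_numeric(values, lower, upper):
    """Replace missing or out-of-range readings with the median of the valid
    observations, and flag every treated value for the audit trail.
    (Median backfilling is one simple policy among many.)"""
    valid = sorted(v for v in values if v is not None and lower <= v <= upper)
    n = len(valid)
    median = valid[n // 2] if n % 2 else (valid[n // 2 - 1] + valid[n // 2]) / 2
    cleaned, flags = [], []
    for v in values:
        treated = v is None or not (lower <= v <= upper)
        cleaned.append(median if treated else v)
        flags.append(treated)  # True where the value was backfilled
    return cleaned, flags

# Example feed with a missing value and an impossible reading (range 0–1).
cleaned, flags = clean_numeric([0.2, None, 0.5, 9.9, 0.3], lower=0.0, upper=1.0)
```

Every automated treatment should be flagged and logged in this way, so that the audit trail shows which values reached the model untouched.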
A large part of model development can be automated with proper governance and approval processes. For ML-based models, this is relatively straightforward, as the models are designed to improve themselves on an ongoing basis using the latest data. For traditional models, automation can be useful for recalibration and the generation of challenger models.
The validation of models can be entirely automated, whether for traditional or ML-based models. Model validation consists of calculating predefined performance metrics and comparing them with predefined thresholds. It is relatively straightforward to automate such processes and generate validation reports.
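For instance, the Kolmogorov–Smirnov (KS) statistic, one of the standard scorecard performance metrics, measures how well the score separates good and bad accounts, and comparing it with a predefined threshold is trivially scriptable. A minimal sketch, with invented scores and an illustrative threshold:

```python
def ks_statistic(scores_good, scores_bad):
    """Maximum distance between the empirical score CDFs of goods and bads."""
    thresholds = sorted(set(scores_good) | set(scores_bad))
    ks = 0.0
    for t in thresholds:
        cdf_good = sum(s <= t for s in scores_good) / len(scores_good)
        cdf_bad = sum(s <= t for s in scores_bad) / len(scores_bad)
        ks = max(ks, abs(cdf_good - cdf_bad))
    return ks

def validate(scores_good, scores_bad, min_ks=0.3):
    """A predefined threshold turns the metric into an automated pass/fail gate."""
    return ks_statistic(scores_good, scores_bad) >= min_ks
```

A validation pipeline would compute a battery of such metrics (KS, Gini, population stability, etc.) on each run and assemble the pass/fail results into a report.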
Figure 8 – Process automation with the help of ML and AI
What Accuracy does
For clients who need to navigate the digital transformation in the retail banking industry, especially in credit scoring, Accuracy is well placed to work with you on the following tasks:
• Perform an independent review and validation of your existing credit scorecards
• Develop credit scorecards using programming languages including SAS, Python, R, VBA, etc. The development process is semi-automated for easier repetition and maintenance
• Advise you on the adoption of alternative data for credit scorecard development, whether for traditional or machine learning models
• Develop machine-learning-based credit scorecards using open-source languages such as Python
• Perform automation on data processing, modelling and validation
• Perform the overall strategic shaping of retail banking digital transformation and adoption of big data analytics
At Accuracy, our financial services industry experts work with banks and non-bank financial institutions on mergers and acquisitions, strategic transformations, quantitative modelling and adoption of technology solutions. We have been working closely with global financial institutions as well as small and medium-sized institutions over the past two decades.
Appendix 1 – detailed procedures for retail credit scorecard development
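As an illustration of the scaling phase, most scorecards use the conventional ‘points to double the odds’ (PDO) parameterisation: a chosen base score corresponds to chosen good:bad odds, and a fixed number of points doubles those odds. The parameter values below are illustrative only; each lender chooses its own.

```python
import math

def scaling_params(base_score=600, base_odds=50, pdo=20):
    """Conventional scorecard scaling: `base_score` points correspond to
    good:bad odds of `base_odds`:1, and every `pdo` points doubles the odds."""
    factor = pdo / math.log(2)
    offset = base_score - factor * math.log(base_odds)
    return factor, offset

def score_from_odds(odds, factor, offset):
    """Map good:bad odds to a score on the chosen scale."""
    return offset + factor * math.log(odds)
```

With these example parameters, 600 points corresponds to odds of 50:1 and 620 points to 100:1; the per-characteristic points of the final scorecard are obtained by distributing this scaled score across the fitted model's coefficients.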
For our fourth edition of Accuracy Talks Straight, René Pigot discusses the nuclear industry, before letting Romain Proglio introduce us to H2X-Ecosystems, a start-up that enables both the production and the consumption of hydrogen on-site. We then analyse the development of retail banking with David Chollet, Nicolas Darbo and Amaury Pouradier Duteil. Sophie Chassat, philosopher and partner at Wemean, explores the value of work. And finally, we look closer at the long-term discount rate with Philippe Raimbourg, Director of the Ecole de Management de la Sorbonne and affiliate professor at ESCP Business School, as well as the improvement of the economic panorama with Hervé Goulletquer, our senior economic adviser.
After being weakened by various events and decisions that shed an unfavourable light on it (Fukushima, Flamanville, Fessenheim), the nuclear industry is now enjoying something of a resurgence.
The French president’s recent announcement of a programme to build six EPR2 reactors shows his choice to maintain a base of decarbonised electricity production using nuclear energy.
Though it is a subject of much debate, this decision is born of cold pragmatism: despite their demonstrated large-scale deployment, renewable energies remain subject to the whims of the weather. Alone, they will not be able to replace dispatchable power generation facilities, given the ambition behind commitments to reduce greenhouse gas emissions by 2050.
Faced with the electrification of the economy, the decision to maintain nuclear power in the French energy mix alongside renewable energies is not so much an option as a necessity. The guarantor of balance in the French network, RTE, also recognises this: prospective scenarios with no renewal of the nuclear base depend, in terms of supply security, on significant technological and societal advances – a high-stakes gamble to say the least. Beyond these aspects, nuclear power also constitutes an obvious vector of energy independence for Europeans. Current affairs cruelly remind us of this, and the situation could almost have led to a change in the German position, if we look at the latest declarations of their government.
In France, initial estimations put construction costs at €52bn, but financing mechanisms are yet to be defined. The only certainty is that state backing will be essential to guarantee a competitive final price of electricity, given the scale of the investments and the risks weighing on the project. Ultimately, the financial engineering for the project will need to be imaginative in order to align the interests of the state, EDF and the consumers.
Founded in 2018 in Saint-Malo, H2X-Ecosystems provides companies and regional authorities with the opportunity to create complete virtuous ecosystems marrying energy production and decarbonised mobility. These ecosystems enable both the production and the consumption of hydrogen on-site. They are co-built with and for local actors to make the most of their regional resources in order to create added value, whilst maintaining it at the local level. In this way, the ecosystems participate in the development of these rural, periurban or urban areas.
Renewable and low-carbon hydrogen is produced from water electrolysis using renewable energy, which has recently become one of the major levers for decarbonisation. H2X-Ecosystems links this production with typical consumers (buses, refuse collectors, etc.) but also and especially with light mobility and delivery services: self-service cars and last mile delivery. Indeed, the company has made available a hybrid car operating on both solar power and hydrogen, thanks to which on-grid recharge stations are no longer necessary. All this comes without noise pollution or greenhouse gases (CO2, NOx, etc.).
More generally, H2X-Ecosystems is present throughout the hydrogen value chain, from production to storage to consumption: electrolyser, high power electro-hydrogen unit, power pack (fuel cells and removable tanks) able to be incorporated in light mobility solutions.
H2X-Ecosystems has signed a partnership agreement with Enedis Bretagne for the deployment of its high-power electro-hydrogen unit designed to provide a temporary power source to the grid during construction work or in the event of an incident. This unit makes it possible to reduce Enedis’s CO2 emissions and noise pollution by replacing its fossil fuel units with this technology.
In addition, during a period of high pressure on energy prices, the value offer put forward by H2X-Ecosystems enables a move towards control of energy expenditure and energy autonomy for industrial sites by relying in particular on this electro-hydrogen generator combined with other complementary systems (renewable energies, on-site hydrogen production, etc.).
In his presentation of the France 2030 plan, French President Emmanuel Macron confirmed the importance of this sector in the future: ‘We are going to invest almost 2 billion euros to develop green hydrogen. This is a battle that we will lead for ecology, for jobs, and for the sovereignty of our country.’
Relying in particular on nuclear power to perform highly decarbonised electrolysis, France has a leading role to play. H2X-Ecosystems is participating fully by establishing its first production tools in France, whilst reconciling its development with a virtuous ecological approach that will generate added value, energy independence and profitability for companies and regions.
Retail banking, the old guard versus the new
David Chollet Partner, Accuracy
Nicolas Darbo Partner, Accuracy
Amaury Pouradier Duteil Partner, Accuracy
Retail banking is a sector that is set to see its rate of transformation accelerate in the next few years. The past 10 years have seen distribution methods in particular evolve towards greater digitalisation, without, however, calling the physical model into question. In the 10 years to come, in a world where technology will gradually make it possible to serve major needs via platforms, supply, distribution and technological solutions must all evolve.
1. THE TRANSFORMATIONS AT WORK
It is not worth spending too much time explaining the context in which retail banking has been developing for several years now; suffice it to say that there are three principal challenges: ultra-low rates, regulation that has toughened considerably since 2008 and the arrival of new players.
Beyond this context, the sector is experiencing major technological changes. The first such change regards data. Open banking designates an underlying trend that is pushing banking IT systems to open up and share client data (identity, transaction history, etc.). A new open banking ecosystem is gradually taking shape, in which multiple actors (banks, payment bodies, technology publishers, etc.) share data and incorporate each other’s services in their own interfaces, making it possible to provide new services and to create new tools.
Another major development is banking as a service (BaaS). Historically, retail banking was a fixed-cost industry. The opening up of data, the swing to the cloud and the API-sation of banking systems have made closed and vertically integrated production models redundant. Each of the production building blocks of financial services can now be proposed ‘as a service’. This transformation leads to a swing from a fixed-cost economic model to a variable-cost basis. By outsourcing their banking system, digital challengers can launch their businesses with lower costs and shorter time frames.
Finally, the sector cannot entirely avoid the phenomenon of super-apps, which are gradually changing uses by aggregating services for highly diverse needs. This change may slowly make the way clients are served obsolete and probably requires the development of what we might call ‘embedded finance’.
2. THE FUTURE OF TRADITIONAL PLAYERS
Traditional banks have generally resisted the prevailing winds mentioned above. Over the past 10 years, their revenues have not collapsed, though their growth has proved to be somewhat moderate.
Traditional players still have a certain number of strengths. First, historical banks have complete product ranges, which of course cover daily banking (account, card, packages, etc.), but also the balance sheet side of things, with credit and savings products. Classifying the IT systems of major banks among their strengths may seem rather unconventional. Nevertheless, these large systems, though not agile, are often highly robust, and they have made it possible to shrink the technological gap with neobanks. Finally, traditional players are financially powerful and capable of investing to accelerate a technological plan when necessary.
Naturally, these players have some weaknesses, the main one being the customer experience. However, this point does not relate to the gap with neobanks, which has most often been filled; it relates to the gap with purely technological players for example. When considering the trend of convergence of needs, this weakness may represent something of a handicap for the financial sector as a whole. Another weakness relates to these players’ low margin for manoeuvre in terms of the reduction of headcount or number of agencies, if the implementation of a massive cost-reduction programme proved necessary.
These players are deploying or will have to deploy different types of strategy. First, there are the financial actions, be they concentrating or restructuring. Concentration aims to dispose of all activities away from the bank’s main markets in order to be as large as possible in domestic markets. Restructuring, in Spain in particular but also in France with the business combination between SG and CDN, aims to reduce the break-even point.
Banks should also take other actions. In terms of IT, there will come a time, in the not too distant future, where the lack of agility of historical systems will no longer be compensated by their robustness. Developments will accelerate and the speed of developments will become key.
Finally, traditional players will have to rethink their distribution models in the light of digital technology and the convergence of the service of major types of need, which will enable embedded finance. The idea of embedded finance is to incorporate the subscription of financial products directly into the customer’s consumption or purchase path. The financial service therefore becomes available contextually and digitally.
3. THE FUTURE OF NEOBANKS
Neobanks have developed in successive waves for more than 20 years, and the last wave saw the creation of players developing rapidly and acquiring millions of clients. They are capable of raising colossal funds on the promise of a huge movement of clients towards their model.
The primary strength of neobanks is their technology. Having started from scratch in terms of IT, they have been able to rely on BaaS to develop exactly what they need, all with a good level of customer service.
Moreover, these players generally target precise segments; as a result, they have a perfectly adapted offer and customer path, something that is more difficult for generalist banks.
Their weaknesses are often the corollary of their strengths.
Yes, their limited offer makes it possible to better fulfil certain specific needs, but in a world where technology is enabling the emergence of multi-service platforms, addressing only some of a customer’s financial services needs is not necessarily a good idea. It places neobanks on the periphery of a business line that itself is not best placed in the trend of convergence of needs. But if neobank offers are limited, it is not necessarily by choice.
Developing credit and savings products, the areas most often lacking in neobanks, would require them to change scale, in terms of controls and capital consumption in particular. Finally, the consequence of this limited offer is their inability to capture the most profitable retail banking customers en masse: customers with multiple accounts. This explains their low revenues, which plateau at €20 per client.
This does not necessarily condemn the future of the neobank. For a start, it is necessary to distinguish between countries based on the availability of banking services. In countries with a low level of banking accessibility, neobanks have an open road before them, like Nubank in Brazil (40 million customers). In countries with a high level of banking accessibility, it is a different story. The low level of revenues and the trend of convergence of major needs will force neobanks to make choices: they can urgently extend their offer to balance sheet products, like Revolut appears to be doing; they can decide to skip the balance sheet step and widen their offer directly to other areas, like Tinkoff is doing in Russia; or they can let themselves be acquired by a traditional player that has an interest in them from a technological perspective – but they should not wait too long to do so.
The retail-banking sector is more than ever under the influence of major transformations. These may be internally generated, like those that touch on data and BaaS, or externally generated, like the development of platforms serving major needs, initially driven by consumer desire for simplification. In this context, traditional players must address two major topics: embedded finance, on the one hand, and potentially the swing towards decidedly more agile systems to stay competitive, on the other. As for neobanks, their offer must be extended to cover balance sheet products urgently, at the risk of losing some agility, or to cover other needs.
But the finance sector as a whole should probably seek to simplify the consumption of its services considerably, faced as it is with non-financial players that have already undertaken this transformation.
Does the value of work still mean something?
Sophie Chassat Philosopher, Partner at Wemean
‘When “the practice of one’s profession” cannot be directly linked with the supreme spiritual values of civilisation – and when, conversely, it cannot be experienced subjectively as a simple economic constraint – the individual generally abandons giving it any meaning’, wrote Max Weber in 1905 at the end of The Protestant Ethic and the Spirit of Capitalism.1 But is this not what we can observe a century later? A world where the value of work no longer seems evident, as if it were ‘endangered’2…
Big Quit in the USA, the hashtags #quitmyjob, #nodreamjob or #no_labor, communities with millions of followers like the group Antiwork on the social network Reddit: the signals of a form of revolt, or even disgust with work, are multiplying. This is not just a change to work (as might be suggested by remote working or the end of salaried employment as the only employment model), but a much more profound questioning movement – like a refusal to work. This is a far cry from Chaplin’s claim that the model of work is the model of life itself: ‘To work is to live – and I love living!’3
In Max Weber’s view, work established itself as a structuring value of society when the Reformation was definitively established in Europe and triumphantly exported to the United States. But the sociologist insisted on one thing: the success of this passion for work can only be explained by the spiritual interest that was linked to it. It is because a life dedicated to labour was the most certain sign of being one of God’s chosen that men gave themselves to it with such zeal. When the ethical value of work was no longer religious, it became social, serving as the index of integration in the community and the recognition of individual accomplishment.
And today? What is the spiritual value of work tied to, when the paradigm of (over)production and limitless growth is wobbling, and when ‘helicopter money’ has been raining down for long months? Younger generations, who are challenging this work value most vehemently, must lead us to elucidate the meaning of work for the 21st century; indeed, studies showing that young people are no longer willing to work at any price are multiplying.
The philosopher Simone Weil, who had worked in a factory, believed in a ‘civilisation of work’, in which work would become ‘the highest value, through its relationship with man who does it [and not] through its relationship with what is produced.’4 Making man the measure of work: that is perhaps where we must start so that tomorrow we can again link an ethical aspect to work – the only thing that justifies its value. ‘The contemporary form of true greatness lies in a civilization founded on the spirituality of work,’5 wrote Weil.
1Max Weber, L’Éthique protestante et l’esprit du capitalisme [The Protestant Ethic and the Spirit of Capitalism], Flammarion “Champs Classiques”, 2017. Quote translated from the French: « Dès lors que « l’exercice du métier » ne peut pas être directement mis en relation avec les valeurs spirituelles suprêmes de la civilisation – et que, à l’inverse, il ne peut pas être éprouvé subjectivement comme une simple contrainte économique –, l’individu renonce généralement à lui donner un sens. »
2Dominique Méda, Le Travail ; Une Valeur en voie de disparition ? [Work; an endangered value?], Flammarion “Champs-Essais”, 2010.
3David Robinson, Chaplin: His Life and Art, Penguin Biography, 2013.
4Translated from the French: « la valeur la plus haute, par son rapport avec l’homme qui l’exécute [et non] par son rapport avec ce qu’il produit. »
5Simone Weil, L’Enracinement [The need for Roots], Gallimard, 1949.
The long-term discount rate
Philippe Raimbourg Director of the Ecole de Management de la Sorbonne (Université Panthéon-Sorbonne) Affiliate professor at ESCP Business School
Since Irving Fisher, we have known that the value of an asset equals the discounted value of the cash flows it can generate; we also know that the discounting process significantly erodes the value of long-term cash flows and reduces the attractiveness of long-term projects.
THIS RESULT IS THE CONSEQUENCE OF A DUAL PHENOMENON:
• the passage of time, which automatically whittles down the present value of all remote cash flows;
• the shape of the yield-to-maturity curve, which generally leads to the use of higher discount rates the further in the future the cash flows are due; indeed, we usually observe that the yield curve increases with the maturity of the cash flow considered.
THE DISCOUNTING PROCESS SIGNIFICANTLY ERODES THE VALUE OF LONG-TERM CASH FLOWS
For this reason, the majority of companies generally invest in short-term and medium-term projects and leave long-term projects to state bodies or bodies close to public authorities.
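The erosion described above can be sketched numerically. The following minimal Python illustration uses assumed figures (a flat 4% rate versus a 5% long-maturity rate); it is not drawn from the article itself:

```python
# Illustrative sketch: how discounting erodes remote cash flows.
# The rates and horizons below are assumptions chosen for illustration.

def present_value(cash_flow: float, rate: float, years: int) -> float:
    """Discount a single cash flow received `years` from now."""
    return cash_flow / (1 + rate) ** years

# A cash flow of 100 discounted at a flat 4% rate:
pv_10y = present_value(100, 0.04, 10)   # ~67.6
pv_30y = present_value(100, 0.04, 30)   # ~30.8

# With an upward-sloping curve (5% for long maturities), erosion is stronger:
pv_30y_steep = present_value(100, 0.05, 30)  # ~23.1

print(round(pv_10y, 1), round(pv_30y, 1), round(pv_30y_steep, 1))
```

At a flat 4% rate, a cash flow of 100 due in 30 years is worth less than a third of its face value today; a slightly higher long-term rate erodes it further still.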
We will try to explain here the potentially inevitable nature of this observation and under what conditions long-term rates can be less penalising than short-term ones. This will require us to explain the concept of the ‘equilibrium interest rate’ as a first step.
THE EQUILIBRIUM INTEREST RATE
We are only discussing the risk-free rate here, before taking into account any risk premium. In a context of maximising the inter-temporal well-being of economic agents, the equilibrium interest rate is the rate that enables an agent to choose between an investment (i.e. a diminution of his or her immediate well-being resulting from the reduction of his or her consumption at moment 0 in favour of savings authorising the investment) and the future consumption that is the fruit of the investment made.
WE CAN EASILY SHOW THAT TWO COMPONENTS DETERMINE THE EQUILIBRIUM INTEREST RATE:
• economic agents’ rate of preference for the present;
• a potential wealth effect that is positive when consumption growth is expected.
The rate of preference for the present (or the impatience rate) is an individual parameter whose value can vary considerably from one individual to another. However, from a macroeconomic point of view, this rate is situated in an intergenerational perspective, which leads us to believe that the value of this parameter should be close to zero. Indeed, no argument can justify prioritising one generation over another.
The wealth effect results from economic growth, enabling economic agents to increase their consumption over time. The prospect of increased consumption encourages economic agents to favour the present and to use a discounting factor that is ever higher the further into the future they look.
In parallel to this potential wealth effect, we also understand that the equilibrium interest rate depends on the characteristics and choices of the agents. They may have a strong preference for spreading their consumption over time, or on the contrary, they may not be averse to possible inequality in the inter-temporal distribution of their consumption.
Technically, once the utility function of the consumers is known (or assumed), it is the degree of curvature of this function that will provide us with the consumers’ R coefficient of aversion to the risk of inter-temporal imbalance in their consumption.
If this coefficient equals 1, this means that the consumer is ready to reduce his or her consumption by one unit at time 0 in view of benefitting from one additional unit of consumption at time 1. A coefficient of 2 would mean that the consumer is ready to reduce his or her consumption by two units at time 0 for that same additional unit at time 1. It is reasonable to think that R lies somewhere between 1 and 2.
From this perspective, in 1928 Ramsey proposed a simple and illuminating formula for the equilibrium interest rate. Using a power function to measure the consumer’s perceived utility, he showed that the wealth effect in the formation of the equilibrium interest rate was equal to the product of the nominal period growth rate of the economy and the consumer coefficient of aversion R. This leads to the following relationship:
r = δ + gR
where r is the equilibrium interest rate, δ the impatience rate, g the nominal period growth rate of the economy and R the consumer’s coefficient of aversion to the risk of inter-temporal imbalance in his or her consumption.
Assuming a very low value for δ and a value close to one for R, we see that the nominal growth rate of the economy constitutes a reference value for the equilibrium interest rate. This equilibrium interest rate, as explained, is the risk-free rate that must be used to value risk-free assets; if we consider risky assets, we must of course add a risk premium.
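Ramsey's relationship can be checked with a small numerical sketch; the parameter values below (δ near zero, R near one, 3% nominal growth) are assumptions for illustration only:

```python
# Ramsey's rule: r = δ + g·R. Parameter values are illustrative assumptions.

def equilibrium_rate(impatience: float, growth: float, aversion: float) -> float:
    """Equilibrium interest rate r = δ + g·R."""
    return impatience + growth * aversion

# With δ ≈ 0 and R ≈ 1, r tracks the nominal growth rate of the economy:
r = equilibrium_rate(impatience=0.0, growth=0.03, aversion=1.0)
print(r)  # 0.03
```

With, say, δ = 0.5% and R = 1.5, the same formula gives r = 0.005 + 0.03 × 1.5 = 5%.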
In the current context, Ramsey’s relationship makes it possible to appreciate the extent of the effects of unconventional policies put in place by central banks, which have given rise to a risk-free rate close to 0% in the financial markets.
THE LONG-TERM DISCOUNT RATE
Now that we have established the notion of the equilibrium interest rate, we can move on to the question of the structure of discount rates based on their term.
We have just seen that the discount rate is determined by the impatience rate of consumers, their coefficient of aversion R and expectations for the growth rate of the economy. If we consider the impatience rate to be negligible and assume that the coefficient of aversion remains unchanged over time, this gives a very important role to the economic outlook: the discount rate based on maturity will mainly reflect the expectations of economic agents in terms of the future growth rate.
Therefore, if we expect economic growth at a constant rate g, the yield-to-maturity curve will be flat. If we expect growth to accelerate (growth of the growth rate), the rate structure will increase with maturity. However, if we expect growth to slow down, the structure of rates will decrease.
We thus perceive the informative function of the yield-to-maturity curve, which reveals to the observer the expectations of financial market operators with regard to the growth rate of the economy.
WE ALSO SEE THAT THE PENALISATION OF THE LONG-TERM CASH FLOWS BY THE DISCOUNTING PROCESS IS NOT INEVITABLE.
When the economic outlook is trending downwards, the rate structure should be decreasing. But we must not necessarily deduce that this form of the yield curve is synonymous with disaster. It can very easily correspond to a return to normal after a period of over-excitation. For example, coming back to the present, if the growth rate of the economy is particularly high because of catch-up effects, marking a significant gap compared with the sustainable growth rate in the long term, the rate structure should be decreasing and the short-term discount rate higher than the discount rate applicable for a longer time frame.
It is only the action of the central banks, which is particularly noticeable on short maturities, that is preventing such a statistical observation today.
When improvement does not necessarily rhyme with simplification
Today, though this statement may apply more to developed countries than to developing countries, the economic landscape appears on the surface more promising. COVID-19 is on the verge of transforming from epidemic to endemic. The economic recovery is considered likely to last, and the growth lost during the COVID-19 crisis has mostly been recovered. Last but not least, prices are accelerating.
This last phenomenon is quite spectacular, with the year-on-year change in consumer prices passing in the space of two years (from early 2020 to early 2022) from 1.9% to 7.5% in the United States and from 1.4% to 5.1% in the eurozone. What’s more, this acceleration is proving stronger and longer-lasting than we had expected of the consequences for the price profile of reopening an economy previously hindered by public health measures.
Faced with these dynamics on the dual front of health and the real economy, opinions on the initiatives to be taken by central banks have changed. The capital markets are calling for the rapid normalisation of monetary policies: stopping the increase in the size of balance sheets and then reducing them, as well as returning the reference rates to levels deemed more normal. This, of course, comes with the creation of both upward pressure and distortions in the rate curves, as well as a loss of direction in the equity markets.
At this stage, let’s have a quick look back to see how far we may have to go. During the epidemic crisis, the main Western central banks (the Fed in the US, the ECB in the eurozone, the Bank of Japan and the Bank of England) accepted a remarkable increase in the size of their balance sheets. For these four banks alone, the balance sheet/GDP ratio went from 36% at the beginning of 2020 to 60% at the end of 2021. This is the counterpart to the bonds bought and the liquidity injected in their respective banking systems. At the same time, the reference rates were positioned or maintained as low as possible (based on the economic and financial characteristics of each country or zone): at +0.25% in the US, at -0.50% in the eurozone, at -0.10% in Japan and at +0.10% in the UK. This pair of initiatives served to ensure the most favourable monetary and financial conditions. They ‘supplemented’ the actions taken by the public authorities: often state-backed loans granted to businesses and furlough measures in parallel to significant support to the economy (around 4.5 points of GDP on average for the OECD zone; note, the two types of measure may partly overlap).
Now, let’s try to set out the monetary policy debate. The net rebound of economic growth in 2021, the widely shared feeling that economic activity will continue following an upward trend, and price developments that are struggling to get back into line all contribute to a situation that justifies the beginning of monetary policy normalisation. It goes without saying that the timing and the rhythm of this normalisation depend on conditions specific to each geography.
HOWEVER, WE MUST BE AWARE OF THE SINGULAR NATURE OF THE CURRENT SITUATION.
The current inflationary dynamics are not primarily the reflection of excessively strong demand stumbling over a supply side already at full capacity.
Rather, they reflect – and quite considerably – production and distribution apparatuses that cannot operate at an optimal rhythm because of the disorganisation caused by the epidemic and sometimes by the effects brought about by public policies. A return to normal – and if possible quickly – is a necessity, unless we are willing to accept lasting losses of supply capacity. With this in mind, we must be careful not to speed down the road to monetary neutrality; otherwise, we risk a loss of momentum in economic growth and a sharp decline in financial markets, both of which would lead us away from the desired goal.
Another point must be mentioned, even if it is more classic in nature: the acceleration of consumer prices is not without consequence for households. It gnaws away at their purchasing power and weighs on their confidence, both of which serve to slow down private consumption and therefore economic activity.
THIS IS ANOTHER ELEMENT SUPPORTING THE GRADUAL NORMALISATION OF MONETARY POLICY.
How do the two ‘major’ central banks (the Fed in the US and the ECB in the eurozone) go about charting their course on this path, marked out on the one hand by the impatience of the capital markets and on the other by the need to take account of the singularity of the moment and the dexterity that this singularity requires when conducting monetary policy?
All we can do is observe a certain ‘crab walk’ by the Fed and the ECB. Let’s explain and start with the US central bank.
The key phrase of the communiqué at the end of the recent monetary policy committee of 26 January is without doubt the following: ‘With inflation well above 2 percent and a strong labor market, the Committee expects it will soon be appropriate to raise the target range for the federal funds rate.’ Not surprisingly, the reference rate was raised by 25 basis points on 16 March, and as there is no forward guidance, the rhythm of the monetary normalisation will be data dependent (based on the image of the economy drawn by the most recently published economic indicators). At first, the focus will be on the price profile; then, the importance of the activity profile will grow.
The market, with its perception of growth and inflation, will be quick to anticipate a rapid pace of policy rate increases. The Fed, having approved the start of the movement, is trying to control its tempo. Not the easiest of tasks!
Let’s move on to the ECB. The market retained two things from the meeting of the Governing Council on 3 February: risks regarding future inflation developments are on the rise, and the possibility of a policy rate increase as early as this year cannot be ruled out.
Of course, the analysis put forward at the time was more balanced, and since then, Christine Lagarde and certain other members of the Council, such as François Villeroy de Galhau, have been working to moderate market expectations that are doubtlessly considered excessive.
We can see it clearly: it will all be a question of timing and good pacing in this incipient period of normalisation. In medio stat virtus1, as Aristotle reminds us. But how difficult it can be to establish!
1Virtue lies in a just middle.
IMPACT OF THE RUSSIAN INVASION OF UKRAINE: NECESSARY DOWNWARD REVISION OF ECONOMIC ASSESSMENT
• The world outside Russia, especially Europe, will not get through the crisis unscathed. The continued acceleration of prices and the fall in confidence are the principal reasons for this. Indeed, the price of crude oil has increased by over 30% (+35 dollars per barrel) since the beginning of military operations, and the price of ‘European’ gas has almost doubled. In the same way, it is impossible to extrapolate the rebound in the PMI indices of many countries in February; they are practically ancient history. Growth will slow down and inflation will become more intense, with the United States suffering less than the eurozone.
• Vigilance (caution) may need to be even greater. This new shock (the scale of which remains unknown) is rattling an economic system that is still in recovery: the epidemic is being followed by a difficult rebalancing of supply and demand, creating an unusual upward trend in prices compared with the past few decades. Is the economic system’s resistance weaker as a result?
• In these conditions, monetary normalisation will be more gradual than anticipated. Central banks should monitor the increase in energy (and also food) prices and focus more on price dynamics excluding these two components – what we call the ‘core’. The most likely assumption is that this core will experience a slower tempo, above all because of less well-orientated demand.
In light of the current context, this month’s edition of the Economic Brief will focus on the relationship between war and the economy. In particular, we will look into links between the two; we will delve into economic theory in relation to war; and we will examine some of the impacts of the ongoing crisis in Ukraine on the world economy.
In this first edition of the Economic Brief in 2022, we look into some of the significant factors currently affecting the global economy. We start with COVID-19, its development, and a new mentality taking hold. We then move on to the Purchasing Managers Index to see what it tells us of the level of confidence in economic activity across three major zones. Finally, we take a closer look at inflation and the structural developments that are set to affect prices in the future.
Accuracy supported House of HR with the acquisition of the Dutch company TMI, a company specialised in secondment and recruitment in health care. With the TMI acquisition, House of HR aims to increase its presence in the healthcare sector, a market in which the group has long wanted to position itself on a larger scale. The acquisition of TMI is a significant step in realising the objective of setting up a specialised branch of HR services for health care within the group.
For lawyers involved in the world of litigation and arbitration, claims for damages of all kinds are a common (if not daily) occurrence. However, as the quantification of such damages falls to experts in the fields of accounting, valuation and economics, many legal practitioners can find it difficult to sense-check points put forward by experts on both sides of the debate.
This article was drafted with exactly this intent in mind: to provide legal practitioners with a straightforward introduction to the main concepts and methods adopted in the assessment of damages.
1. DAMAGES FRAMEWORKS
In common law jurisdictions, the theory and principles underlying damages claims are well established. Although whether certain types of claim are allowed may differ in civil law jurisdictions such as China, the underlying principles of assessment set out in this article should still apply.
One of the key documents for this discussion is Fuller and Perdue’s classic article, The Reliance Interest in Contract Damages (1937)1, which elucidates 3 key principles of contract damages, namely:
• The Expectation principle;
• The Reliance principle; and
• The Restitution principle.
The Expectation principle holds that damages following breach of contract should put the claimant in the economic position in which they would have been, if the respondent had fulfilled its promise.
The Reliance principle holds that damages for the breach should make the claimant as well off as it would have been had the promisor never made its promise at all.
The Restitution principle holds that damages for breach require the respondent to return any benefit conferred on them by the claimant as a result of the promise.
The first of the 3 principles (Expectation) is doctrinally dominant in discourse about damages and is regularly cited in expert witness reports; hence, it will form the focus of this article.
One important elaboration of this principle was clarified in the Chorzow Factory case (1928), wherein the Permanent Court of International Justice (PCIJ) established the reparation standard for intentionally wrongful acts under customary international law as follows:
“The essential principle contained in the actual notion of an illegal act… is that the reparation must, as far as possible, wipe out all the consequences of the illegal act and re-establish the situation which would, in all probability, have existed if that act had not been committed.” 2
Practically for the expert, this typically means assessing the economic position of the claimant under two situations: the actual situation, in which the “illegal act” occurred, and a counterfactual (the “But For”) situation in which it did not. Reparation, or damages, would then equal the difference between the two, thereby re-establishing the economic position in the counterfactual situation.
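In its simplest form, this comparison reduces to a difference between two profit streams. A minimal sketch with purely hypothetical figures:

```python
# Damages as the difference between the But For (counterfactual) scenario
# and the actual scenario. All figures are hypothetical.

actual_profits = [120, 80, 0, 0]        # profits earned after the illegal act
but_for_profits = [120, 130, 140, 150]  # profits had the act not been committed

# Reparation "wipes out" the consequences of the act: the shortfall in each
# period, summed over the loss period.
damages = sum(b - a for b, a in zip(but_for_profits, actual_profits))
print(damages)  # 340
```

In practice, each period's shortfall would also be discounted to a valuation date, and interest may be added; this sketch omits those steps.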
2. TYPES OF DAMAGE SUFFERED
There are several types of damage which can be suffered by the claimant, depending on the exact illegal act involved:
Additional costs
This type of damage is commonly encountered in construction disputes, where a contractor has incurred additional costs beyond its tender price due to action (or inaction) by the owner, or conversely, rectification/remedial costs where the subcontractor’s work has been delayed or substandard.
Loss of profits
This is one of the most common types of damage claim encountered in commercial disputes. An example would be where a supplier has failed to supply a key, hard-to-source component of a product, leading to a loss of sales and therefore profits.
Loss of use of an asset/investment
One potential example would be where the claimant (a manufacturer) loses the ability to produce and sell its products due to damaging acts by the respondent, such as expropriation of or intentional damage to a manufacturing plant.
Loss of opportunity
This type of damage can often be encountered in claims against professional advisors, where for example incorrect advice or inaction by an advisor leads to the claimant losing out on an opportunity to invest in a redevelopment project.
Reputational harm
An example would be cases where an action or allegation by the respondent has damaged the image or reputation of the claimant’s business, leading to concrete economic loss.
3. COMMON BASES FOR CLAIMING DAMAGES
Two common types of legal dispute where expert assistance is often required to assess the quantum of damages are (i) commercial contract disputes and (ii) investment-treaty disputes under Bilateral Investment Treaties (BIT).
Commercial contract disputes can arise wherever a contracting party has not fulfilled its part of a contract, whether that be in terms of providing certain products or services, or completing agreed work to a certain standard within an agreed duration, etc. Investment treaty disputes, by definition, arise between a state (or State-Owned Entity) and an investor, which can be an individual or a company, often when assets or enterprises operated by said investor are expropriated by the state. The Chorzow Factory case referred to in Section 1 could be considered a typical case: it involved a nitrate factory that was expropriated by the Polish government from its German owner(s).
Although from a legal standpoint there are many differences between the two types of claim (not least procedural), from an economic valuation perspective the applicable principles are very similar. Hence examples of both types of case will be covered in the sections below.
4. OVERVIEW OF VALUATION APPROACHES
Except in extremely straightforward cases – possibly where a claimant has been deprived of an asset whose value is standardised – most damages claims will involve some kind of valuation process. These claims can be classified as either direct or indirect losses. Direct losses refer to instances where the claimant has suffered the loss of access to or use of an income-generating enterprise or asset, and therefore would include shareholder disputes, divorce cases and certain types of expropriation claims. Indirect losses, as usually defined in contract law, arise from a special circumstance of the case and are only recoverable if the party knew or should have known of the circumstance of the loss when entering into the contract.
Approaches to valuing losses can broadly be divided into three main categories, namely (i) Income, (ii) Market Multiples, and (iii) Cost-based approaches. In very simplistic terms, the income approach values an asset based on the income it will generate; the market multiples approach values an asset by comparing it to other comparable assets or businesses in the market; and the cost-based approach values an asset based on the current replacement or reproduction cost of an asset, whilst taking into account any deductions required for deterioration or obsolescence of the asset.
The income approach is strongly preferred in valuing losses due to its flexibility and wide applicability. In theory, some form of the income approach can be used for valuing any income-generating asset3. This approach converts the expected future economic benefits from the asset – generally, cash flows – into a single, present value. Because this approach bases value on the ability to generate revenue and profits, it would be well-suited to valuing established, profitable businesses, as well as, say, new mining assets where the income streams are well defined. In comparison, it would be more challenging – but not impossible – to reliably apply this method to an early-phase high-tech start-up company as the range of valuations produced would be extremely wide due to uncertainty as to the size and timing of cash flows.
Types of claim in which income approaches could be adopted include:
I. Breach of contract disputes where a management or distribution contract has been terminated. Common situations where these are seen involve long-term hotel management and pharmaceutical distribution contracts;
II. Unfair competition disputes where, for example, a competitor may have diverted business or orders away from the claimant company;
III. Advance Loss of Profit claims where an incident has led to either delayed start-up or interruption of production, such as at a power plant or cement factory.
But For v. Actual
In our experience of disputes, the historical actual situation is generally a matter of factual evidence and agreed upon between the parties (although they may not agree on the ‘forecast’ actual situation). One of the key tasks of the expert is rather to determine what the counterfactual situation would have been (the But For scenario). The loss is effectively the difference between the Actual and But For situations, as shown in the diagram above.
In all the examples listed above, an expert would need to examine the historical books and records of the company, as well as its internal budgets and business plans, and also take into account any relevant industry research as to future trends, in order to form an opinion as to what a reasonable But For scenario would have been. We discuss establishing a reliable But For scenario further below.
The income approach can also take into account common litigation issues such as offsetting mitigation of losses by the claimant, and/or discounts for minority holdings, etc. Once an income approach is decided upon there are two main methods of arriving at a present value, namely Discounted Cash Flows (“DCF”) and capitalised earnings, with DCF being much more common.
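As a minimal sketch of the DCF method applied to lost cash flows (the cash flows and the 10% discount rate are assumptions chosen for illustration):

```python
# Discounted Cash Flows: each lost cash flow is discounted back to the
# valuation date at an assumed constant rate. Figures are hypothetical.

def dcf_value(cash_flows: list[float], rate: float) -> float:
    """Present value of year-end cash flows at a constant discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

lost_cash_flows = [100, 110, 121]   # lost profits expected in years 1 to 3
value = dcf_value(lost_cash_flows, rate=0.10)
print(round(value, 1))  # 272.7
```

In this deliberately simple example, each cash flow discounts to the same present value (about 90.9), since the cash flows grow at exactly the discount rate.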
One well-known published example of a case where the income approach was adopted is the matter of Suez v. Argentina4, being an investor-state ICSID dispute arising from Argentina’s termination of the concession granted to a consortium of claimants led by Suez S.A. (“Suez”) to provide water distribution and waste water treatment services to the city of Buenos Aires. The Suez consortium had been granted a 30-year concession as part of the privatisation of said services in 1993 and had run the concession relatively smoothly for the first 7 years of the concession. Tensions arose between Suez and the Argentine government during the Argentine financial crisis of 2001 – 2003, leading to the eventual termination of the concession by Argentina in 2006. Suez claimed US$1.09 billion in lost management fees, unpaid dividends and losses on equity investments, all of which were assessed using the income approach.
Although the final award by the ICSID tribunal was substantially less than that originally claimed (US$404.5 million), the tribunal agreed with the use of the income approach by the claimants’ expert.
Market Multiples Approach
The underlying logic of the market multiples approach is that the value of an asset or business should be similar to the value of another comparable asset or business, for example a business of a similar size in the same industry. An easily understood analogy would be that the price of a 10-year-old Ford sedan should be similar to the price of another Ford sedan of the same model and age.
The market multiples approach is often used in cases involving loss of use, loss of opportunity and reputational harm, as it relies on valuing an asset as a whole, rather than the additional costs or loss of profits arising from a specific action.
This approach is typically used (both in damages contexts and in the investment world) in the valuation of the equity of non-listed companies, for which a share price is not directly attainable but prices for similar, listed companies are. For the market approach to be appropriate, it is crucial that there be a sizeable pool of companies which are similar in terms of product and scale to the subject company and for which there are sufficient observable data points regarding their value. However, the market approach can be used for any assets for which there are sufficient comparators with observable prices.
Applied to the valuation of non-listed companies, the expert (i) identifies recent, arm’s length transactions involving comparable public or private businesses, and then (ii) develops pricing multiples which can be applied to the subject company’s normalised earnings or other relevant metrics of value. These pricing multiples can be based either on the market price of comparable listed companies on a stock exchange, or alternatively on real-world transactions involving entire comparable companies or operating units which have been sold.
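This two-step process can be sketched numerically; the comparables and the subject company's figures below are entirely hypothetical:

```python
# Market multiples approach: (i) observe pricing multiples for comparable
# companies, (ii) apply a representative multiple to the subject company's
# normalised earnings. All multiples and figures are hypothetical.
from statistics import median

comparable_ev_ebitda = [8.2, 7.5, 9.1, 7.8]  # observed EV/EBITDA multiples
subject_ebitda = 50.0                        # subject company's normalised EBITDA

multiple = median(comparable_ev_ebitda)      # median is less sensitive to outliers
enterprise_value = multiple * subject_ebitda
print(enterprise_value)  # ≈ 400
```

The median (rather than the mean) is a common choice precisely because one unusual comparable should not drive the result.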
The advantages of the market multiples approach are that it is simple to understand, widely used in the investment industry and based on objective, observable third-party data, and that it does not rely on internal business plans, which can often be over-optimistic or biased.
The most common limitation to the use of this method is the lack of sufficient comparable companies, particularly in the case of high-tech start-up companies where there may be few, if any, other listed companies offering the same product or service.
An example where this approach was applied was in the ICSID arbitration case of Crystallex International Corp. (“Crystallex”) v. the Republic of Venezuela, in which Crystallex – a Canadian mining company – launched a claim arising from Venezuela’s expropriation of a gold mine being developed by Crystallex in that country. In that case, the tribunal rejected the respondent’s cost-based valuation approach, and strongly preferred the approach of the claimant’s expert, which was a market multiples approach using the gold reserves of the mine as a metric of value, leading to a damages award of US$1.2 billion in Crystallex’s favour.
The cost approach is based on the assumption that most or all of the value of a company is in its assets. In this method, the expert determines the overall enterprise value by calculating the value (whether that be Book value or Fair Market value) of the company’s assets net of its liabilities.
The advantages of this method are that it is simple to understand, is based on the current situation of the company and is arguably less subjective in that it does not involve projecting the future performance of the company. It can be an appropriate method when valuing holding companies, companies in liquidation or asset-intensive businesses where cash-generating operations tend to contribute less of the overall value. An assessment of sunk/additional costs, which are typically based on the claimant’s historical financial records, would also fall under the cost approach umbrella.
However, for the majority of companies, where cash-generating operations do contribute most of the value, it would not be appropriate. Also, it does not directly value intangible assets such as brands or Intellectual Property (IP), and so the expert would have to assess that value separately.
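Under the cost approach, the calculation reduces to assets net of liabilities. A minimal sketch, with hypothetical balance-sheet figures:

```python
# Cost (net asset) approach: enterprise value = assets minus liabilities.
# All figures hypothetical, in millions.
assets = {
    "property_plant_equipment": 12.0,
    "inventory": 3.5,
    "cash": 1.5,
}
liabilities = {
    "bank_loans": 6.0,
    "trade_payables": 2.0,
}

net_asset_value = sum(assets.values()) - sum(liabilities.values())
print(f"Net asset value: {net_asset_value:.1f}m")
```

Note that, as observed above, intangibles such as brands or IP do not appear in this calculation and would have to be valued separately and added.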
An example of an investment dispute where the cost approach was applied is Asian Agricultural Products (“AAP”) v. Sri Lanka. AAP was a Hong Kong-based company which had a minority shareholding in Serendib Sea Foods (“Serendib”), which engaged in prawn farming in the eastern region of Sri Lanka. Serendib’s prawn farm sustained severe damage during a major domestic insurrection between a separatist guerrilla group and government forces. Subsequently, AAP alleged that it suffered a total loss of its investment and claimed compensation of US$9 million. In the final award, the tribunal awarded US$460,000, being compensation purely for the tangible assets of the business, as Serendib was loss-making and had only made two shipments of its single product (prawns) to its target export market (Japan) at the time of the incident.
5. ESTABLISHING A RELIABLE BUT FOR
One of the most important aspects in arriving at a robust calculation of loss under the income method is establishing a reliable But For scenario.
First, the expert must show that he or she has correctly identified the impact of the breach. Many contemporaneous factors can affect a company’s performance in a given year, and failing to correctly identify these other factors can lead to the expert overestimating the loss. For example, imagine a hypothetical dispute where an auto component manufacturer had failed to supply the required components in time to a well-known car manufacturer. However, the breach coincided with depressed demand for cars in multiple countries due to the impact of the Covid-19 virus. In such a case it would clearly be wrong to forecast loss purely based on pre-Covid-19 historical performance as the overall market conditions at the time of the breach were materially different.
Second, the expert needs to demonstrate that rather than unquestioningly accepting the company’s projections at face value, he or she has critically examined historical performance and any forecasts of the company to arrive at an opinion.
This process involves not just looking at the company’s financial statements, management accounts and internal forecasts, but also external evidence such as broker forecasts and market research reports. These external information sources can be very helpful in highlighting areas where the company has been overly optimistic, and pointing out market-wide trends which may have impacted on the company’s performance during the loss period.
It is at this juncture that benchmarking has a large role to play: a company’s forecast sales prices for a commodity can be benchmarked against broker projections, or the entire valuation using the DCF approach can be benchmarked against alternative methodologies such as market multiples or a cost-based approach.
The above is a general layman’s introduction to the main valuation approaches used in assessing damages in litigation cases. The particular valuation method(s) adopted in any given case must be rooted in the type of loss claimed and the facts of the case.
Finally, it should be noted that the methods listed above are not mutually exclusive, and it is common for experts to use a secondary method (e.g. market multiples) as a sense-check for their primary valuation method (often an income approach).
1 Fuller & Perdue, The Reliance Interest in Contract Damages (Pts. 1 & 2), Yale Law Journal Vol 46 No.3 (Jan 1937)
2 Factory at Chorzow (Germany v. Poland), Merits, 1928 Permanent Court of International Justice
3 As opposed to non-income generating assets (often collectibles) such as fine wine, art, jewellery and gold
4 ICSID award in the matter of Suez et al. v. The Argentine Republic dated 9 April 2015, https://www.italaw.com
Our partners Morgan Heavener, Frédéric Loeper, and Darren Mullins authored an Expert Analysis Chapter for the International Comparative Legal Guide – Corporate Investigations 2022. The chapter, New Frontiers in Compliance Due Diligence: Data Analytics and AI-Based Approaches to Reviewing Acquisition Targets, shares their insights regarding the increasing regulatory and practical requirements for conducting compliance-related due diligence and more sophisticated ways to approach such due diligence.
In the critical and unprecedented circumstances that we are all experiencing, our priority is to preserve the health of our teams, whilst continuing our activity with the same exacting standards and high quality that you have come to know.

Some of our offices in Asia have been affected for several months already and have demonstrated the resilience of our firm. We put in place the organisation and IT and communications systems necessary for them to remain perfectly operational, and these have now been extended to all of our locations. Our work continues on all our engagements without exception, as we continue to meet your needs globally. Of course, we remain ready to assist you in making the decisions relevant and necessary for the current situation, as well as when normal activity resumes.

We hope that you and your loved ones stay safe in this exceptional situation, and that your teams and companies are able to face these challenges in the best of conditions. We assure you of our unfailing support.
For this last edition of the Economic Brief in 2021, we will take a look back at the year and see how it developed across three different zones that drive the global economy: the United States, China and the eurozone. For each of these zones, we will observe the forecast development, as predicted by the Bloomberg Consensus, of three key elements of their economies: GDP growth, inflation and budget deficit.
Who’s Who Legal identifies the foremost legal practitioners and consulting experts in business law based upon comprehensive, independent research. Entry into their guides is based solely on merit.
Accuracy’s forensic, litigation and arbitration experts combine
technical skills in corporate finance, accounting, financial modelling,
economics and market analysis with many years of forensic and
transaction experience. We participate in different forms of dispute
resolution, including arbitration, litigation and mediation. We also
frequently assist in cases of actual or suspected fraud. Our expert
teams operate on the following basis:
• An in-depth assessment of the situation;
• An approach which values a transparent, detailed and well-argued presentation of the economic, financial or accounting issues at the heart of the case;
• The work is carried out objectively with the intention to make it easier for the arbitrators to reach a decision;
• Clear, robust written expert reports, including concise summaries and detailed backup;
• A proven ability to present and defend our conclusions orally.
This approach provides for a more comprehensive and richer response to the numerous challenges of a dispute. Additionally, our team includes delay and quantum experts, able to assess time-related costs and quantify financial damages related to dispute cases on major construction projects.
For our third edition of Accuracy Talks Straight, Frédéric Recordon discusses business and economic developments in China, before letting Romain Proglio introduce us to Amiral Technologies, a start-up specialised in disruptive technology. We then analyse the development of the hydrogen industry with Jean-François Partiot and Hervé de Trogoff. Sophie Chassat, philosopher and partner at Wemean, explores Chinese society “as one”. And finally, we look closer at the numbers with Bruno Martinaud, Entrepreneurship Academic Director at Ecole Polytechnique, as well as at the macroeconomic and microeconomic risk in China with Hervé Goulletquer, our senior economic adviser.
GOVERNING A GREAT COUNTRY IS LIKE COOKING A SMALL FISH1
At first glance,
everything seems to be going well in China. The
country has overcome the COVID-19 pandemic, its economy has regained momentum,
and it seems to be entering a new era of prosperity, one that European
businesses present in the country should be able to use to good advantage.
However, upon reading the 14th Five-Year Plan (2021–2025), troubling signs of the country starting to
turn in on itself are becoming evident, allowing considerable doubt to
linger over the country’s future growth trajectory.
After 40 years of
modernisation, economic reform and opening up to the world, the Chinese economy
has reached c. USD 10k in GDP per capita, a level similar to that of Japan and
South Korea after equivalent 40-year periods of economic growth in these
countries in the past. However, for the last five
years, Chinese growth has been significantly
running out of steam, and this trend may well continue if the country
chooses isolation over the openness practised since Deng Xiaoping.
The Dual Circulation policy, the core of the 14th Plan, seems to
prioritise the autonomy of the domestic market (internal circulation) over
openness to foreign trade and investment (external circulation), despite the
reassuring words of President Xi during the opening of the China International
Import Expo in Shanghai on 4 November 2020.
The fact that several sectors key to Chinese development, such as the internet, energy and education sectors, have recently been taken in hand and that a growing role is being attributed to public companies – despite their low efficiency and productivity – to the detriment of a highly dynamic private sector are testament to the thinking behind such strict economic control. They also demonstrate a major turning point in the recent economic history of the country. President Xi clearly owned this turning point when he stated: “the invisible hand [market forces] and the visible hand [government intervention] must be used correctly. (…) In China, the firm direction of the Party constitutes the fundamental guarantee”.2
The European Chamber of Commerce in China in its 2021/2022 Position
Paper dated 23 September 2021 expressed its concern about an insular withdrawal
and urged the Chinese government to continue its work of reform and opening up
to foreign companies.
The months to come will give an indication of China’s future trajectory. We can hazard that the country will be governed like a small fish is cooked, something President Xi likened to walking carefully across a thin sheet of ice.3
1 President Xi Jinping quoting the Book of the Way and its Virtue (Dao De Jing, 道德经), The Governance of China, p.493
2 President Xi Jinping, speech during the 15th session of the Political Bureau of the XVIII Central Committee of the Communist Party, The Governance of China, p.137
3 President Xi Jinping, interview with BRICS correspondents, 19 March 2013
On 21 October 2021, Amiral Technologies announced its first round of fundraising totalling €2.8m. This represents an initial success for the start-up, founded in 2018 in Grenoble on the basis of an observation shared by numerous industrial players: how can we reliably predict breakdowns?
A spin-off of the CNRS, Amiral Technologies
is based on almost 10 years of university research in artificial intelligence
and automation & control theory. The company has successfully developed
disruptive technology: from sensors installed on machines, detecting physical
signals such as electric current, vibrations or humidity, algorithms make it
possible to generate general health indicators for the equipment. These health
indicators are then interpreted by unsupervised machine learning algorithms.
They make it possible to identify the causes of breakdowns most likely to take place.
Unlike the majority of other solutions on
the market, this solution (named DiagFit), which makes use of machine learning,
does not require the history of breakdowns identified on a piece of equipment
to be able to use artificial intelligence. Indeed, the algorithm is adapted to
a specific use case in order to define a normalised functioning environment for the equipment.
More precise, quicker, and independent of
the sensors themselves, the technology is already in use with SMEs and
mid-sized businesses, as well as with large industrial groups such as Valéo,
Airbus, Daher, Vinci and Thales.
The predictive maintenance market benefits
from sustained growth dynamics, driven by an industrial base equipped with more
and more sensors, a need to optimise inventories of spare parts and, of course,
a greater need to avoid any costly shutdown in the production chain.
Amiral Technologies now aims to become the top supplier for the European market. The fundraising will enable it to strengthen its technical and commercial team, as well as to accelerate the development of DiagFit and its scientific and technological research.
For some years
now, hydrogen has been presented as the miraculous solution to develop clean
transport and energy storage on a large scale. The combustion of hydrogen,
which produces only energy and water, is indeed 100% clean, and we can
certainly glimpse its promising potential. However, the carbon footprint of its
production varies considerably depending on its origin. The hydrogen sector is
not necessarily clean, and it is only decarbonised hydrogen that is stirring up
so much desire.
• Historically, industrial hydrogen – also known as grey hydrogen – has been produced from fossil fuels, and its environmental record is unsatisfactory, or even poor, depending on whether the CO2 emitted during its production is captured and stored. Grey hydrogen is an inevitable by-product of oil refining (desulphurisation of oil) and ammonia production. Today, more than 90% of the hydrogen produced in the world is grey, but this proportion is destined to fall significantly to the benefit of green and blue hydrogen.
• All eyes are now on the production of green hydrogen, that is, the hydrogen produced from decarbonised electricity (solar, wind, nuclear, and hydro power).
• Some researchers are also looking into the exploitation of white hydrogen, that is, hydrogen sourced naturally. As surprising as it may seem, knowledge of the existence and extraction possibilities of this native hydrogen is still rudimentary. For the time being, white hydrogen remains the dream of a few pioneers. Related knowledge is inchoate and accessible volumes unknown. Its research cycle and potential development will be long. If this path were to prove economically viable, it would most likely be explored by large oil producers thanks to their in-situ extraction expertise.
• Finally, big oil and petrochemical groups are calling for a transitory phase using blue hydrogen. Produced using natural gas, it can be considered clean as long as all related CO2 and methane emissions are captured.
For the decades to
come, green and blue hydrogen will be the major areas of development in the
energy industry. But this ambition is confronted with three constraints.
Constraint 1: Demand versus capacity
‘Nothing is more imminent than the impossible’ – Victor Hugo, Les Misérables
Expectations for the industry seem excessive today, as the requirements for
energy production capacity are of titanic proportions if we are to consider
decarbonising a significant share of the market. As a reminder, global energy
consumption mostly serves industry (29%), ground and air transport (29%) and
residential consumption (21%).
Today, the hydrogen sector meets less than 2% of energy needs.
To cover global
energy consumption in 2030, an area the size of France would need to be covered
in photovoltaic solar panels, according to Land Art Generator (US). And that is
assuming that these panels benefit from optimal and constant sunlight and that
they give maximum yield. As observed yields from solar power today stand at
25%, the logical conclusion would mean using an area four times the size of
France to achieve the same goal.
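The scaling in the paragraph above can be checked with one line of arithmetic; as a sketch, the baseline "area the size of France" is treated as one unit rather than a precise surface figure.

```python
# Area needed scales inversely with panel yield (order-of-magnitude sketch).
baseline_area = 1.0    # "one France" of panels, assuming 100% constant yield
observed_yield = 0.25  # observed solar yield today, per the text

required_area = baseline_area / observed_yield
print(f"Required area: {required_area:.0f}x the size of France")
```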
These figures help us to understand why the great powers are now considering reinvesting massively in the nuclear industry and securing their access to uranium deposits across the world. A huge redeployment of nuclear power for electricity generation might make it possible to solve the environmental equation in 50 years (climate change – IPCC objectives). Various significant issues remain to be resolved, of course, including questions of nuclear safety and the treatment and storage of nuclear waste. But given that the time scale to resolve these issues is measured more in the hundreds, if not the thousands of years, rather than in 50, some will quickly weigh up the consequences and decide.
Constraint 2: A development cycle for major projects that cannot be shortened
‘The difference between the possible and the impossible can be found in determination’ – Gandhi
Current electrolysis processes offer a low energy yield, and the green hydrogen sector
will require the construction of gigafactories, the technology, design and scale-up
of which are not yet fully appreciated.
Despite attempts to accelerate the process, we are talking about major projects, and
their development cycles are standardised. There needs to be a 10 to 20 MW
prototype / test site, before any 100 MW sites – currently the target entry
capacity to play in the big league – can be launched.
These projects follow the classic cycle in major project engineering as presented
below. If we take as an example the liquefaction process for natural gas, which
is the most similar in terms of engineering and construction complexity to that
of large-scale electrolysis, between five and seven years would be necessary to
go from the feasibility study to the commissioning of the test site.
Engineering cycle for major projects
Then, if we say
that feedback from the test site will be provided in parallel to the conception
and feasibility studies of a gigafactory, we would need to consider five to
seven additional years before the gigafactory could begin its operations. It
would be reasonable to imagine that a 100 MW factory would be composed of
independent units, whose installation would be sequential over an additional
period of 12 to 24 months. Based on this plan, we would need to count around 15
years in total to create a gigafactory with an effective production of 100 MW.
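The overall estimate can be reconstructed from the phase durations given above; this is a sketch using the ranges stated in the text, not a project schedule.

```python
# Phase durations in years, per the development plan described above.
phases = {
    "test site: feasibility study to commissioning": (5, 7),
    "gigafactory: conception to start of operations": (5, 7),
    "sequential installation of independent units": (1, 2),  # 12 to 24 months
}

low = sum(years[0] for years in phases.values())
high = sum(years[1] for years in phases.values())
print(f"Total: {low} to {high} years, consistent with the ~15 years cited")
```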
To accelerate the
development cycle of these types of project, the following levers could be activated:
• Directly selecting qualified service providers, minimising the tender phase. Based on our experience in similar major projects, the ‘open book’ selection solution makes it possible to reduce the tender offer time, all whilst maintaining effective control over capital expenditure. This lever could facilitate the truncation of the tender phase, potentially saving around 12 months.
• Beginning construction of the prototype and obtaining in parallel the administrative and environmental authorisations for the gigafactory site. This lever would make it possible to reduce the development cycle by a few months.
• Starting up the factory capacity sequentially, segmented in discrete units, and by doing so, advancing the beginning of production by up to a year.
• Launching engineering and construction of the gigafactory in parallel to the prototype and managing the feedback on process optimisation through retrofitting (a rare but efficient disruptive approach).
• Accelerating the engineering and construction cycles by financing a more expensive project and mobilising more resources at a given moment.
In cases of even greater urgency, more disruptive levers could be applied:
• Removing certain administrative and environmental constraints and the related delays.
• Developing tools (IT and AI), making it possible to accelerate the engineering stage significantly.
• Working on smaller interlinked units able to be serially produced.
In all these cases, we must accept that the costs and risks resulting from the use of an acceleration lever will be higher than those of a traditional development cycle.
Constraint 3: Financial constraint
‘If you have to ask how much it costs, you can’t afford it’ – John Pierpont Morgan
Developing the green
hydrogen sector requires massive and sustained investment. The major powers
have finally understood that fact and acted: more than 30 countries have
announced investments totalling almost 300 billion euros to develop the sector.
However, these substantial investments still seem insufficient when confronting the carbon
behemoth menacing the planet. Based on the calculations of the Energy
Transition Commission shared in April 2021, 15 trillion dollars must be
invested between 2021 and 2050 to decarbonise the global energy market. That
comes to 50 times more than what has been announced to date.
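The "50 times" figure is a simple order-of-magnitude ratio, treating euros and dollars as roughly comparable for this purpose:

```python
# Order-of-magnitude check of the gap described above.
announced_investment = 300e9  # ~EUR 300 billion announced by 30+ countries
required_investment = 15e12   # USD 15 trillion (Energy Transition Commission, April 2021)

ratio = required_investment / announced_investment
print(f"Required investment is ~{ratio:.0f}x what has been announced to date")
```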
As Bill Gates said
via his Catalyst initiative from Breakthrough Energy, the scientific, political
and economic worlds have already proved their ability to support innovation in
energy and to give it a favourable development framework. That is what happened
in the past few decades with solar and wind energy and lithium-ion batteries.
But in 2021, we no
longer have the luxury to wait decades. We must collectively make a quantum
leap to accelerate decarbonisation innovation and its implementation. We are
talking about not only investing in proportions that far exceed investments
made in the past but also freeing ourselves of historical financial IRR models.
Here is a short list of some of the actions that may be put in place:
• Sourcing a colossal amount of capital from central banks, countries, financial institutions, great fortunes and philanthropists.
• Also targeting a significant proportion of personal savings (pension funds, mutual investment funds, etc.).
• Enhancing incentives for decarbonisation technologies by implementing systems more powerful than carbon taxes and credits (the effect of which is only one-off), for example, using specific interest rates based on a project’s future environmental impact.
• Not providing a financial return (IRR) for some of the capital invested. The expected return would become mostly environmental…
– Breakthrough Energy, a non-profit organisation, raised over a billion euros for its Catalyst initiative at the end of September 2021.
• Putting in place environmental reporting that is as reliable as financial reporting.
Investments in decarbonisation are revolutionising finance through their magnitude and the nature of their expected return; this will be environmental, not financial.
Sophie Chassat Philosopher, Partner at Wemean
Culturally, China functions as our opposite: its customs, mental models and rituals are highly compelling to us. To our great benefit, the philosopher François Jullien insists, seeing in Chinese thought a valuable means of decentring ourselves and leaving behind the certainties of Western culture – particularly binarism, a lack of nuance and the constant use of force in the name of logic.1 That in no way means that we must consider this other perspective to be right, but experiencing absolute difference, as the other perspective invites us to do, often allows us to choose new paths for ourselves.
Amongst the most
fascinating elements, there is the way in which Chinese society always seems to
react ‘as one’: collective expression there is unanimous. Of course, the nature
of the political regime and its current toughening stance with regard to the
expression of any form of singularity or standing out from the crowd have much
to do with it. Nevertheless, China has always represented the polar opposite of
individualism and communitarianism, which, in the West, have led to the loss of
a sense of public interest.
To picture this
collective movement in its entirety, we might think of Hobbes’ Leviathan with the famous image on the
frontispiece of the work presenting the body of the king composed of the masses
of individuals from the kingdom, who, if we look more closely, have no faces,
as they are fully turned towards the face of the sovereign. This detail reminds
us of the danger of using organic metaphors to talk about societies: they may
well claim to mean that if the parts are there for the whole, the whole is also
there for the parts; however, often the parts end up cowering before the whole.
A hive or even a murmuration (the natural phenomenon seen with large flocks of birds or schools of fish moving in concert, with each animal seeming to follow some form of choreography laid out in advance, without any individual leading the movement) might also provide, at first glance, images suggestive of the collective movements of which the Chinese are capable. But, of course, we must not linger over such animal analogies; the ethnologist Claude Lévi-Strauss rightly considered them to be the beginning of barbarousness, tantamount as they are to denying the human quality of the other culture.2
Though none of these metaphors depicts a desirable model, the fact remains that this way of functioning ‘as one’ holds up a negative mirror to us: how can we overcome the impasse of the ‘society of individuals’ (Norbert Elias), which characterises a model of Western society where any higher interest seems to have been lost? How can we find something like a collective impulse? What if our individual impulses made us want a collective impulse in the first place?3 Leaving behind individualism does not mean annihilating the individual; it is an invitation to stop looking only at oneself and to move towards shared achievements. Between the West and China, between atomism and holism, a third path is possible.
1 François Jullien, A Treatise on Efficacy (1996).
2 Claude Lévi-Strauss, Race and History (1952).
3 Sophie Chassat has recently released Élan Vital: Antidote philosophique au vague à l’âme contemporain, Calmann-Lévy editions (October 2021).
Bruno Martinaud Entrepreneurship Academic Director, Ecole Polytechnique
It’s 2009. Kevin Systrom (soon joined by his co-founder, Mike Krieger) is working on a geolocation social media project, similar to Foursquare. Together, they manage to convince Baseline Ventures and Andreessen Horowitz to invest $500,000 in the project. This enables them to dedicate themselves full time to the adventure. A year later, Burbn is launched in the form of an iPhone application that makes it possible to save locations, plan outings, post photos, etc. The application is downloaded massively, but the verdict is not quite what they hope for: the users, beta-testers, don’t like it at all. Too cluttered, too messy, it’s confusing and most of them have stopped using it. A patent failure. All this being very normal, the entrepreneur digests the feedback, learns from the experience and moves on to a new adventure. The metrics are bad – duly noted. And yet Kevin Systrom doesn’t stop there because he notices something that at first glance seems trivial: the photo sharing function (one amongst so many others) seems to be used by a small number of regular users… He investigates, questions these users and realises that the small group loves this function (and only this one). Instagram is born, all from the happy realisation that a small number of people, hidden in the multitudes that didn’t like Burbn, use the app for one reason.
This story highlights a counter-intuitive principle for the educated manager: numbers lie in the beginning. Burbn’s metrics were catastrophic. The rational response would have been to acknowledge that fact and move on to the next project. But a weak signal was hiding there, showing potential.
The story of Viagra follows a similar pattern. Pfizer laboratories were developing
a blood pressure regulator, which was in phase III of testing before gaining
market authorisation. If we remember that the development of a new molecule
represents an investment of approximately $1bn, that would mean around $700m to
$800m had already been invested in the project. Pressure was therefore high to
achieve this authorisation as soon as possible. It just so happened that
someone in Pfizer’s teams noticed that some people in the test sample hadn’t
returned the pills that should have been left over as part of the procedure
given to them. Who pays attention to that? Some incoherent data, with no direct
link to the topic (efficiency of the molecule)… A few abnormal results in a
table of 300 columns and 100,000 rows… And yet, by investigating, this person
realised that those who weren’t giving back the extra pills all shared the same
characteristics of age and sex. Pfizer then realised that this blood pressure
regulator had an unexpected side effect so interesting that the project changed course entirely.
A simple observation lies behind these examples, which we could multiply endlessly: an innovative project, a start-up that is just starting out, is an adventure to be explored.
Exploring first means remembering that you don’t know what works and what doesn’t work in your idea. It’s recognising that you’re facing complex issues, that you don’t quite grasp all the variables of these issues and don’t understand how the variables interact, or their effects.
From that starting point comes the following consequence, the subject of this
article: you don’t know what to measure and you don’t know the meaning of what you’re
measuring. This goes both ways: what might initially seem like poor metrics, as
in the case of Burbn, can hide a gem. But the opposite is also true. We have
recently worked with a start-up developing a smart object for well-being, aimed
at the public at large. The company quickly sold some tens of thousands of the
product, and based on this success, raised funds to scale up quickly and
control the market, only to find that its sales, far from growing, plateaued
and then fell. It turns out that 30,000 products being sold wasn’t the sign of
massive and rapid market adoption, but rather that the majority of the addressable market had already been reached.
After a period of trying different things, questioning themselves, doubting and
researching, the start-up’s founders finally found a B2B market, centred on a
service offer based on the smart object. The irony is that its strong early
figures didn’t mean that it had found its market.
These observations lead to two simple and practical recommendations, which seem almost trivial when written down but can be slippery in their application.
1. Remember that the only way to progress in a complex environment is through experimentation. Trial and error. Keep what works. Eliminate what doesn’t.
Understanding will come later. Pixar has always applied this empirical approach
to the extreme. From a starting concept, Pixar tests everything. There have
been, throughout the production process, 43,536 variations of Nemo, 69,562 of
Ratatouille and 98,173 of Wall-E… That’s the path between the initial idea and the finished film.
2. Give yourself the tools to ‘capture’ weak signals, that is, put strategies in place to save what seems irrelevant in one instant but which could be useful later. Remember that at a given moment, in the first life of an innovative project, no one is able to determine what is relevant and what is not.
Unfortunately, the human mind is wired in such a way as to try to give early meaning to the information that comes to it, which leads to neglecting the need to test everything (because we’ve already understood) and to filtering out noise (because we’ve already identified the signal)… These are probably the two deadly sins of the innovator or the start-up entrepreneur.
When we look at the Chinese economy this early autumn, two dynamics emerge. First, from a macroeconomic perspective, we can note very disappointing GDP growth during the third quarter of the year. According to Bloomberg’s economic consensus, one of the most closely followed forecast aggregates, performance barely got off the mark (+0.2%, quarter on quarter). This phenomenon may not last long, however, and from the fourth quarter, the country may return to its previous performance level (around 1.5%, quarter on quarter). But even if we accept the forecasts, is there not a risk of being taken by surprise again in the near future?
China: the slump may not last
Second, and this time from a microeconomic
perspective, we have the Evergrande issue. It is the country’s largest property
developer, which, over time, has transformed into a type of conglomerate. It is
unable to pay its debts and coupons that are falling due. And it is fair to say
that its debts are high: over 300 billion dollars in total or almost 2% of the
country’s GDP, including 90 billion in financial debt (bank loans and bonds),
150 billion in commercial debt (including deposits from off-plan buyers), and
80 billion off balance sheet (essentially investment products issued by the
company). Available cash would only cover 40% of its short-term debt (maturing
within 12 months). Before starting a carve-out process for some of the assets,
it was estimated that a fire sale would involve a debt haircut of some 50%.
We should note that the Evergrande case, however iconic and high profile the company may be, is not unique. Other developers are putting themselves in defaulting positions, even when they are able to pay what they owe. They invoke the toughening regulation, which considerably hinders their business development, and try to ensure that their creditors are the ones left with the losing hand. Or, to put it another way, they try to create enough scandal or public difficulty to force the public authorities to revise their attitude.
Evergrande: asset prices clearly falling
A major credit event in a suddenly
deteriorated economic environment gives us a more worrying outlook: what if China
was no longer a centre of stability in a world that very much needs one?
The Xi Administration (named after
President Xi Jinping) has started a restructuring/consolidation phase for the
country’s economy, with the aim of reinforcing its fundamentals. It no doubt
considered the international environment to be favourable to such an action.
The decline of the COVID-19 pandemic, the return of global growth and a
theoretically more cooperative US president should create sufficiently
promising external demand conditions to compensate for any ‘blunders’ in domestic
spending that the reforms (even if conceived and implemented well) would doubtless entail.
However, as is often the case in life,
things have not gone exactly according to plan.
The Beijing government started with three
areas: real estate, debt and inequality. Excesses in all three must be reduced.
Let us start with real estate. Its total
weight in the Chinese economy, taking into account upstream and downstream
effects, is estimated at between 25% and 30%. The scale is reminiscent of what
we saw in Spain or Ireland before the Great Recession in 2008. Might we do well
to take this similarity as an invitation to prevent rather than to cure after
the real estate bubble bursts? Moreover, real estate needs have become less
significant (apart from the considerable wave of migration from the countryside
to cities), whilst prices have skyrocketed. An average of 42 m2 per
person in a dwelling is perfectly comparable to what we can see in major
Western European countries. However, the ratio of property prices to average
household income is over 40 in Beijing or Shanghai (2018 figures). Though comparable
to the ratio for Hong Kong, it is significantly higher than its equivalent for
London or Paris (around 20), not to mention New York (12). This level observed
in large Chinese cities is only understandable if economic growth and
demographics remain sufficiently strong to justify a highly dynamic demand for
property and therefore to maintain expectations of property price increases. We
know that the demographics are not heading in this direction, and we sense that
the potential GDP growth is slowing…
Preventing the formation of a real estate bubble could be seen as a pressing obligation. First, is it not necessary to preserve the financial system’s ability to take the initiative at a time of structural change in the economy? The system’s exposure to the real estate sector is significant, between 50% and 60% of total bank loans granted. Second, less investment in real estate would facilitate, all else being equal, increased investment in capital goods or intellectual property products. Measures of both productivity and economic growth could find themselves improved.
Credit exposure in real estate sector
China: heading towards a new breakdown in fixed investment?
Now let us talk about debt. Debt in
non-financial corporates is high; in fact, it is among the highest in major countries
around the globe. It represents 160% of the country’s GDP. Of course, we can
highlight the much more reasonable levels noted for households and public
authorities and therefore talk about a very ‘presentable’ average. But
embarking on economic reforms, which will most certainly create losers as well
as the expected winners, starting from a situation with a high level of debt in
the corporate sector is uncomfortable. This is even more so the case when we
consider the ricochet effect on the financial system of the difficulties facing
a certain number of companies.
We must therefore understand that the
importance given to greater stability in the financial system risks weighing on
economic growth. As we highlighted previously, this is another reason to ensure
more efficient investment signposting – towards where there is the greatest
potential for long-lasting and inclusive growth.
Debt of non-financial Chinese companies among the highest
The thread that runs from real estate to debt leads to inequalities. These inequalities are too great, and Beijing is aiming to reduce them. The Chinese real estate ‘adventure’ described above, in addition to the development of the technology sector and its consequent outperformance of the market, has contributed to an increase in inequalities, now putting them at the same level as in the United States. The richest 1% holds 30% of the wealth of all households in China, a proportion that doubled in the 20 years from 1995 to 2015. For Beijing, this development seems to carry the risk of challenging political stability. Is it not understandable then that the middle class should call for a reduction in these inequalities?
China: inequality becoming a political matter
No sooner said than done, we might wish to say; after all, President Xi is not one to dawdle. A large number of measures have been implemented to effect this triple ambition. Many relate to the technology and real estate sectors and encourage greater moral standards from the country’s citizens. The table below provides a summary of the changes.
China: a significant catalogue of party/government initiatives
But all this has a destabilising effect!
Ensuring parallelism between the impact of decisions that will suppress growth
(real estate, finance and technology) and the impact of those to come that will
boost it (aim to increase added-value content of the Chinese economy, less
dependence on foreign countries, and ‘healthy’ stimulation of domestic demand,
to mention what we currently understand) will require significant skill in
economic policy. Even in what remains a relatively nationalised system, it will
be quite a challenge. Benefitting from a favourable external environment is
certainly a ‘pressing obligation’ for Beijing today – never mind if, at least
at first, it flies in the face of the ambition to become more autonomous from the
rest of the world. Are we there yet?
Not really – with such a complicated
international environment (from the COVID-19 pandemic, which has not yet disappeared,
to persistent Sino-American tensions, not to mention a global economy that is
still recovering), it will be necessary to arbitrate between the desirable
(domestic reforms) and the possible (degrees of freedom offered by the economic
context and external policies). That will mean accelerating when possible and
slowing down when necessary. It will be an arduous task for the person in
charge of economic policy, not to mention ensuring that the business community
falls in line. It will not always be easy!
A topic much in the news of late is inflation. Indeed, its recent rise is dominating market news, and its effects are being felt globally. This edition of the Economic Brief will see us look into this striking rise in inflation and what patterns might be taking shape. We will also look into how inflation and pay rises interact, as well as how they might affect future employee compensation negotiations.
Accuracy supported the shareholder and management of Baas B.V. – a Dutch player in the construction of energy infrastructure with additional services in the field of fiber optic networks and in-building installations – with the sale of part of its shares to GIMV, a private equity investor with offices in Belgium, the Netherlands, Germany and France. Together with its investment in Verkleij B.V. (made in April this year), GIMV will set up a strategic national combination active in the design, construction and maintenance of essential infrastructure for energy, water and telecom. Thanks to its complementary areas of expertise, the combination will form a strong, multidisciplinary and stable party for the future.
Accuracy, the international independent advisory firm, has promoted four of its directors to partners in its Paris, Montreal and Singapore offices. This brings the total number of Accuracy partners to 56, spread across 13 countries.
Accuracy conducted financial buy-side due diligence for 21 Invest in the context of the acquisition of Edukea Group, a European platform specialised in training for the natural health and well-being professions.
During the summer, economic figures were updated to reflect the latest activity. Of particular note were the figures for July and August, which appear to show the incipient normalisation of the global economy, a trend that is set to continue. In this edition of the Economic Brief, we will look into the reasons behind this normalisation effect. We will also touch on a new development being seen in the labour market.
Accuracy, the global independent advisory firm, has promoted two of its
directors to partners in its Singapore office. This brings Accuracy’s total
number of partners to 56, across 13 countries.
Samuel Widdowson specialises in forensic construction planning and programming in a variety of contexts, whilst Zaheer Minhas specialises in major projects infrastructure advisory across multiple sectors. Their promotions reflect Accuracy’s continued expansion in the Southeast Asia market.
For the second edition of Accuracy Talks Straight, Nicolas Barsalou gives us his point of view on the way out of the crisis, before letting Romain Proglio introduce us to Delfox, a start-up specialising in artificial intelligence. We will then analyse the impact of the crisis on the aeronautics sector with Philippe Delmas, Senior Aerospace & Defence advisor, Christophe Leclerc and Jean-François Partiot. Sophie Chassat, philosopher and partner at Wemean, will invite us to explore the way out of the crisis from a cultural angle. Finally, we will focus on public debt with Jean-Marc Daniel, French economist and Professor at ESCP Business School, as well as on inflationary risk with Hervé Goulletquer, Senior Economic Advisor.
The crisis that we have been experiencing for almost one and half years now has no equivalent in modern history. It is neither a classic cyclical crisis, nor a replica of the great financial crisis of 2008. It would be dangerous to think, therefore, that we are coming out of it in the same way as previous crises.
What are we seeing? Two words enable us to deepen the analysis.
The first is “contrast”. This is, of course, not the first time that an economic crisis has affected some geographies more severely than others, particularly, in this case, Europe more than the Far East. However, it is the first time that we observe such diversity in the impact on different economic sectors. As a result, some affected sectors will take several years to return to their situation in 2019, like air transport or tourism for example. Conversely, other sectors have taken advantage of the crisis, like online activities (e-commerce, streaming services, video games), or have served as “safe investments”, like luxury goods.
The second word is without a doubt “uncertainty”. Given the tense geopolitical context and unprecedented capital injections in the economy, the current bright spell may lead in the relatively short term to another more classic crisis, made all the more dangerous as recent wounds will not have healed.
As advisers to innumerable economic players across the world, we observe an unprecedented de-correlation between certain market situations and the general state of the economy. On the one hand, the mergers and acquisitions market, boosted by an unparalleled level of liquidity, has rarely – if ever – experienced such exuberance both in volumes and in prices, and this was the case well before the crisis emerged. On the other hand, the corporate restructuring market is also very active, carried in particular by bank renegotiations for certain sectors in difficulty.
This paradox exists in appearance only: given the elements mentioned above, it is possible and quite natural to observe these two trends at the same time.
In this context, we think that, now more than ever, financial and economic players should avoid sheeplike behaviour and analyse each situation in an individualised and tailor-made way.
The most interesting cases to consider are certainly those sectors that are experiencing both positive and negative trends. The real estate sector is particularly relevant because it is undergoing profound and long-lasting change, combined with the effects of the last crisis. Let’s look at two representative sub-sectors: retail and office property.
The first has long been affected by the strong and continued development of e-commerce, a phenomenon that accelerated in 2020 under the effects of the lockdown and the closure of numerous shopping centres, to the extent that the value of retail property at the end of last year was at a historic low. Our long-held belief is that this fall in values was excessive, characteristic of the sheeplike behaviour mentioned above and not adapted to the modern economy. Centres that are well located, well managed and well equipped will continue to be major players in retail. It is fortunate that, for a few weeks now, others are beginning to realise this and that these property values are rising again.
The second sub-sector benefitted up to the 2020 crisis from a favourable situation, thanks to a structural mismatch between supply and demand and real interest rates at zero that pushed up the so-called “safe investment” values like property. Moreover, the crisis has until now had little impact: for the most part, rent has continued to be paid and, given an extremely accommodating monetary policy, capitalisation rates and therefore values have changed little. But these two parameters are now threatened. The rise of remote working, if it proves to be long-lasting and significant (more than just one or two days a week), will inevitably have considerable consequences on the number of square metres necessary for office space as well as its location. Not all of these impacts will necessarily be negative: though it is certain that large business centres like La Défense and Canary Wharf are suffering and will continue to suffer, central business districts may see their values and occupation rates continue to rise.
As for macroeconomic parameters, and notably inflation, only an oracle could predict how they will develop: the only thing to do is to remain vigilant and to provide the means to minimise fragility through strategies that favour flexibility and agility. In this respect, it will be essential to monitor the development of the banking sector, but that would be a topic for another discussion…
* “THE MYRTLES HAVE FLOWERS THAT SPEAK OF THE STARS AND IT IS FROM MY PAIN THAT THE DAY IS MADE THE DEEPER THE SEA AND THE WHITER THE SAIL AND THE MORE BITTER THE EVIL THE MORE WONDERFUL THE GOOD”
LOUIS ARAGON “THE WAR AND WHAT FOLLOWED” (FROM THE UNFINISHED NOVEL)
Founded in 2018 in Bordeaux, Delfox is an artificial intelligence platform that uses reinforcement learning to model systems able to evolve intelligently, autonomously and intuitively in a constantly changing environment, without human intervention or programming in advance.
The technology developed by Delfox consists in giving objectives to the AI, which must then find a way to achieve them. When it comes to AI, it is essential to understand that this intelligence is based above all on learning.
It is therefore learning mechanisms that lie at the heart of Delfox’s development, which for over two years has progressed significantly in cutting-edge skills like deep learning and reinforcement learning, as well as the related advanced algorithms.
The goal is to teach a machine to react autonomously, without indicating how to resolve an issue. The machine itself proposes solutions, which will lead to rewards or penalties; it will therefore learn from its mistakes.
For example, teaching a drone to go from point A to point B does not mean telling it to avoid collisions or to accelerate at certain points of the journey; it is about letting it react by itself and rewarding or penalising it based on the solutions it proposes. Potential applications are vast.
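The reward/penalty loop described above can be illustrated with a toy example. The sketch below is generic tabular Q-learning on a one-dimensional "corridor" from point A to point B; it is not Delfox's code, and the states, rewards and hyperparameters are invented purely for illustration. The agent is told only that reaching the goal is rewarded, never how to get there.

```python
# Toy Q-learning sketch (illustrative only): an agent on a 6-cell line
# learns to reach the goal from rewards alone, with no route programmed.
import random

N_STATES = 6          # positions 0..5; the goal sits at state 5
ACTIONS = [-1, +1]    # step left or step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(state, action):
    nxt = max(0, min(N_STATES - 1, state + ACTIONS[action]))
    reward = 10.0 if nxt == N_STATES - 1 else -1.0   # reward at goal, penalty per step
    return nxt, reward, nxt == N_STATES - 1

random.seed(0)
for _ in range(500):                                 # training episodes
    s, done = 0, False
    while not done:
        # epsilon-greedy: mostly exploit, sometimes explore
        a = random.randrange(2) if random.random() < EPS else max((0, 1), key=lambda x: Q[s][x])
        s2, r, done = step(s, a)
        # Q-learning update: learn from the reward signal, not from instructions
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
        s = s2

# the learned greedy policy should move right (action index 1) everywhere
policy = [max((0, 1), key=lambda x: Q[s][x]) for s in range(N_STATES - 1)]
print(policy)
```

After a few hundred episodes, the greedy policy moves towards the goal from every state: the behaviour emerges from rewards and penalties alone, which is the principle the paragraph describes.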
There is, of course, the area of satellites, in which Delfox is already working with Ariane Group for space surveillance purposes. Delfox participates in detecting satellite trajectories based on data provided by the GEOTracker space surveillance network to avoid collisions and interference.
But the fields of application are a lot more extensive than just satellite uses: autonomous military and urban drones, cars, logistics, defence, the navy, and more are all potential areas of interest.
Autonomy will no doubt be a key segment of activity in the next decade, and Delfox is already one of the most successful players in the field. With a team of 15 people, Delfox aims to reach €1m in revenues in 2021 and is already working with Ariane Group, Dassault Aviation, Thales and the DGA (French government defence procurement and technology agency).
The aeronautics industry is feeling the heat
Philippe Delmas Senior Advisor – Aerospace & Defence, Accuracy
Air transport is at the top of the list when it comes to sectors most heavily affected by the COVID-19 crisis. Behind it, the entire aeronautics industry is suffering, from manufacturers to equipment suppliers of all sizes. The shock is all the more brutal as annual growth stood on average at 5% over the past 40 years and was forecast to continue at over 4% a year for the decades to come.
In 2020, air traffic fell by 66% compared with 2019, and both the timing and the extent of its recovery remain uncertain. For domestic flights in large countries, recovery will depend on the speed and efficiency of vaccination efforts. It is already strong in the United States (traffic was only 31% lower in March 2021 than in March 2019) and China (11% higher), but it remains weak in the European Union (63% lower). For international flights, recovery will depend on lockdowns linked to the emergence of new variants and the rate of vaccination in each country, not to mention the confidence that countries will have in each other’s efforts to contain the coronavirus. This recovery is currently very weak. In total, the level of traffic in 2021 will remain much lower than historical levels. At the end of April 2021, the IATA forecast world air traffic at 43% of the level in 2019 (compared with a forecast of 51% in December). Globally, a return to the 2019 level of activity will no doubt have to wait until mid-2022 for domestic flights and 2023, or even 2024, for long-haul flights. Only air freight has experienced continued growth, but it represents less than 10% of all air traffic.
Several factors lead us to consider that air traffic is not yet ready for a return to the long-lasting growth experienced in the decades before the crisis (5% a year from 1980 to 2019), and various arguments reinforce this vision:
– Passengers’ ecological concerns are becoming of prime importance – some will be more reluctant to travel and especially to travel far.
– Large groups have got through the COVID-19 crisis by completely stopping all business travel: short, medium and long haul.
It was an abrupt lesson, with radical conclusions favouring the strict limitation of such travel. As a result, these groups generated significant savings, as well as an improved ecological balance sheet, something monitored by the markets more and more closely. According to the leaders of major European groups surveyed at the end of 2020, business travel may permanently fall by 25% to 40% compared with 2019.
– These two factors are already enough to bring about a significant drop in traffic, but this drop will be compounded by a third factor, an immediate consequence of an airline’s economic model: first class and business class passengers are the major levers of profitability for a long-haul flight. If their traffic is reduced by 25% to 40%, airlines will have no other choice but to increase average prices significantly for all passenger classes.
The impact on prices of the change in behaviour should lead to a new economic balance: a reduction in business class volumes of 30% may lead to an average increase in ticket prices (business and economy) of 15%. With a price/volume elasticity of 0.9, an average fall in economy travel of 13.5% can be expected.
To sum this up, the forecast impact on passenger traffic could be as follows:
– A fall in business class and first class passenger numbers of 30%
– A fall in economy class passenger numbers of 13.5%
– An increase in average sales prices of 15%.
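As a sketch, the forecast above reduces to a single multiplication; the 15% price rise and the 0.9 elasticity are the article's own assumptions, not ours:

```python
# Price/volume elasticity arithmetic from the article's scenario.
avg_price_increase = 0.15   # average ticket price rise, all classes
elasticity = 0.9            # assumed price/volume elasticity of economy demand

economy_volume_drop = elasticity * avg_price_increase
print(f"expected fall in economy traffic: {economy_volume_drop:.1%}")  # 13.5%
```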
In our opinion, the sudden turbulence in the industry presents a unique opportunity for it to restructure; its untenable financial situation obliges it to do so. The air transport sector has taken out debt of over $250 billion since the beginning of the pandemic, and its total net debt should exceed its revenues during the course of 2021 or in early 2022. Today, the sector continues to lose tens of billions of dollars in cash each quarter, contributing to the rise in its debt levels.
The industry will be forced to overhaul its model significantly, especially given that this economic constraint doubles up as an ecological constraint that is just as fierce. Indeed, air travel is a substantial emitter of CO2, representing up to 2.5% of emissions globally and around 4% in the European Union. In addition, air travel suffers another constraint that is specific to the sector, namely that CO2 represents only a fraction of its overall climatic impact. The most recent studies (July 2020) confirm that its emissions of nitric oxide (NO) at high altitudes contribute more to global warming than its emissions of CO2.
In total, air travel alone represents 5–6% of humanity’s impact on the climate. But it is not for lack of trying – the industry has been making substantial efforts. CO2 emissions per passenger kilometre have shrunk by 56% since 1990, one of the best performances of all industries. The total emitted tonnage of CO2 has nevertheless doubled over the same period because of the increase in traffic. Ryanair, the European low-cost leader, summarises the climatic impasse of air transport quite nicely: its aeroplanes are very recent, their occupancy at a maximum (average rate of 95%), but it is the company with the highest CO2 emissions in Europe after nine operators of coal power plants.
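The two figures just quoted also pin down how much traffic itself must have grown; this back-of-the-envelope check is derived from the article's numbers, not quoted in it:

```python
# If emissions per passenger-km fell 56% while total CO2 doubled,
# traffic (passenger-km) must have multiplied by total / per-passenger factors.
per_pax_factor = 1.0 - 0.56   # per-passenger-km emissions vs. 1990
total_factor = 2.0            # total CO2 tonnage vs. 1990

traffic_multiple = total_factor / per_pax_factor
print(f"implied traffic growth since 1990: {traffic_multiple:.1f}x")  # ~4.5x
```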
Technological progress will continue but, for aeroplanes as we know them, it will not be accelerating. As for truly new technologies (hydrogen, electricity), their time will undoubtedly come, but too late to play a significant role in meeting the objectives of the Intergovernmental Panel on Climate Change (IPCC) in 2050, that is, limiting global warming to 1.5°C and net carbon emissions to zero.
In this context, the industry must reinvent itself, taking into account the following points:
– Growth in traffic will for a long time remain lower than the growth seen in previous decades.
– Progress in energy efficiency will continue but will not accelerate.
– This progress should be completed by credible and rapid climatic solutions (i.e. not offsetting), like clean fuel. Boeing and Airbus recently announced, in spring 2021, their desire to accelerate their use of green kerosene quickly and significantly. But the volumes will be insufficient to meet the objectives of the IPCC.
– The serious issue of high-altitude emissions – currently left out of the equation – will have to be dealt with.
– Considering the cost of decarbonisation solutions, the cost of air travel will inevitably increase by a significant margin.
– This increase will weigh heavily on the most price-sensitive traffic, tourism, whilst technology will clearly and permanently reduce “high contribution” traffic.
– Combined with a concerning debt situation, these factors will force a complete overhaul of the economic model of air transport.
Despite this severe assessment, we think that there are ways for the industry to react radically and constructively. We will present some of them soon.
1 Boeing and Airbus
2 International Air Transport Association (IATA)
3 Accuracy interviews with management of large groups
4 OECD, INSEE
5 IATA
Coming out of a crisis, but what are we heading into?
Sophie Chassat Philosopher, partner at Wemean
The metaphor is a medical one: a crisis is the “critical” moment where everything can change one way or the other – the moment of vitality or the moment of mortality. It would seem, however, that things might not be so clear-cut and that, as Gramsci put it, a crisis instead takes the form of an “interregnum”, “consist[ing] precisely in the fact that the old is dying and the new cannot be born”. What will come out of all this? The suspense… Whatever the answer, it may well come out of left field.
This is what we’re currently feeling: a not very comfortable in-between, and we don’t know where it will lead us. The new world is not coming, and the old world is not coming back, even if, like the characters in Camus’s The Plague, we blithely or even unconsciously take up our old habits again as soon as the storm passes. Yet, at the same time, we know that something has changed, that this crisis has been, in the truest sense, an “experience”, a word whose etymology means “out of peril” (from the Latin ex-periri). Indeed, coming out of a crisis means always coming through and learning a lesson from it. The ordeal inevitably sees us transformed.
But what would be a “good” way to come out of a crisis? A way that would mean coming out on top and not crashing out? For the philosopher Georges Canguilhem, “The measure of health is a certain capacity to overcome organic crises and to establish a new physiological order, different from the old. Health is the luxury of being able to fall ill and recover.”
Overcoming a crisis is inventing a new way of life to adapt to an unprecedented situation. Indeed, health is the ability to create new ways of life, whilst illness can be seen as an inability to innovate. We must also be wary of all the semantics that suggest a return to the same or the simple conclusion of a certain state: “restarting”, “resuming”, “returning to normal”, “lifting lockdown”.
Inventing, creating… that’s what will truly and vitally take us out of the crisis. As another philosopher, Bruno Latour, put it from the very first lockdown, “if we don’t take advantage of this unbelievable situation to change, it’s a waste of a crisis”. That’s why we must also see this period of coming out of a crisis as an occasion to come out of our mental bubbles and leave our prejudices behind. And let’s not forget to question the meaning of our decisions: why do we want to change? What new era do we want to head into, knowing that other crises are waiting for us? The thicker the fog, the stronger and further our headlights must shine.
1 “The crisis consists precisely in the fact that the old is dying and the new cannot be born; in this interregnum a great variety of morbid symptoms appear.” Antonio Gramsci, Prison Notebooks (written between 1929 and 1935).
2 “For the moment he wished to behave like all those others around him, who believed, or made believe, that plague can come and go without changing anything in men’s hearts.” Albert Camus, The Plague (1947).
3 Georges Canguilhem, “On the Normal and the Pathological”, in Knowledge of Life (2008).
4 Le Grand Entretien, France Inter, 3 April 2020.
Considerations on public debt
Jean-Marc Daniel French economist, Professor at ESCP Business School
By replacing corporate debt, the economic support policies linked to COVID-19 have sent public debt levels through the roof globally. According to the IMF, global public debt should increase from 83% of GDP at the end of 2019 to 100% at the end of 2021. At that time, this ratio is expected to reach 119% in France, 158% in Italy and… 264% in Japan. Yet, many of the comments brought about by this explosion are absurd.
FOUR MISCONCEPTIONS ARE OFTEN SPREAD ABOUT PUBLIC DEBT.
The first is that it constitutes a burden that one generation transfers to the next. However, as early as the 18th century, Jean-François Melon demonstrated the approximative nature of such a claim. Melon, the secretary of the famous John Law at the time when the latter was propounding his public debt monetisation policy, sought to justify himself after the policy’s failure. He gave his view on what happened in his Essai politique sur le commerce (Political essay on trade) where he declared:
“THROUGH PUBLIC DEBT, THE COUNTRY IS LENDING TO ITSELF.”
He insists on the fact that public debt does not effect a transfer from one generation to another but rather from one social group, taxpayers, to another, the holders of public securities, who receive the interest.
The second misconception is that the repayment of debt presents a threat to public finances. Some therefore suggest issuing perpetual debt, so that it will never have to be repaid. However, it just so happens that, in practice, public debt is already perpetual. Indeed, governments do little more than pay interest. Since the beginning of the 19th century, no entry has been made in a government’s budget for the repayment of its debt. Each time a loan comes to maturity, it is immediately replaced.
The third misconception about public debt is that a precipitous rise in interest rates would constitute a threat; after all, the government’s concrete and formal commitment is to pay interest. The increasing scarcity of potential lenders would generate this rise in rates and would restrict the opportunities for governments to borrow. However, every modern economy has a central bank acting as lender of last resort. As a result, banks have no problem buying debt that they can subsequently dispose of by selling it back to the central bank – and they do so without limit. The effective interest rate and the amount of debt held by private players ultimately depend on the action of the central bank. Incidentally, this role is explicit in the mission of the US central bank, the Federal Reserve:
“Maintain long run growth of the monetary and credit aggregates commensurate with the economy’s long run potential to increase production, so as to promote effectively the goals of maximum employment, stable prices, and moderate long-term interest rates.”
Though independent, central banks now maintain very low rates with the clear aim of alleviating the cost of interest for governments. In addition, as the central bank transfers back to the government the debt interest that the latter pays to the former, the portion of public debt owned by the central bank is free, which systematically reduces the average interest rate paid by the government. The situation in Japan presents an illustrative example of this. According to the OECD, its public debt/GDP ratio stood at 226% in 2019. The Japanese government quite calmly considers that this ratio will reach 600% in 2060. Its insouciance can be attributed to the fact that its net interest costs amounted to almost zero in 2019, thanks to an ultra-accommodating monetary policy and half of public debt being owned by the country’s central bank.
Finally, the fourth misconception is that there would be a division between good debt and bad debt.
Good public debt would finance investment; bad public debt would finance operations. This division makes little sense: it is based on taking the thinking behind private debt and applying it to public debt. It assumes that public investment spending prepares for the future, whilst public operational spending sacrifices the future for the present. However, it is easy to see that the salary of a researcher, whose work will lead to technical progress and therefore more growth, is operational spending, whilst the construction of a road leading nowhere corresponds to investment spending…
Nevertheless, the idea of good and bad debt should be detailed further because, in certain conditions, it should guide fiscal policy. Incidentally, our ancestors had identified the problem.
For a long time, religious authorities considered that remunerating a loan was tantamount to usury.
Their reasoning became more refined over time, to the extent that in the 13th century, Saint Thomas Aquinas could write:
“He who lends money transfers the ownership of the money to the borrower. Hence the borrower holds the money at his own risk and is bound to pay it all back: wherefore the lender must not exact more. On the other hand he that entrusts his money to a merchant or craftsman so as to form a kind of society, does not transfer the ownership of his money to them, for it remains his, so that at his risk the merchant speculates with it, or the craftsman uses it for his craft, and consequently he may lawfully demand as something belonging to him, part of the profits derived from his money.”
The nascent political economy then distinguished between two types of loan: on the one hand, there were “commercial” loans, also known as “production loans”, which financed investments and the emergence of future wealth, creating something on which to pay interest; on the other hand, there were loans aimed at helping those in difficulty, called “consumer loans”, which follow the same line of thinking as donations and should therefore be free.
The modern materialisation of Saint Thomas Aquinas’s reflections leads to the following affirmation: private debt is justified when it finances investment that brings a structural improvement to growth, whilst public debt is justified in response to cyclical hazards, ensuring collective solidarity with economic sectors in difficulty due to cyclical fluctuations.
European treaties are based on these principles, the “Treaty on Stability, Coordination and Governance” in particular.
THIS TREATY STIPULATES:
The budgetary position of the general government of a Contracting Party shall be balanced or in surplus; [this] rule shall be deemed to be respected if the annual structural balance of the general government [falls within] a lower limit of a structural deficit of 0,5 % of the gross domestic product at market prices.
It confirms the distinction between a “good deficit” – the circumstantial deficit, which appears when growth is struggling and disappears when growth is sustained – and a “bad deficit” – the structural deficit, which is independent of the cycle and remains no matter the circumstances.
What is worrying today is that we are moving away from this scheme, which is not without negative consequences. The first of these consequences relates to equality between supply and demand. Any public expenditure that is not financed by a tax on private spending increases demand. If this increase lasts, it will lead to one of two situations: an external contribution, that is, a deepening trade deficit, or the opportunity for the production system to increase its prices, that is, a boost to inflation.
The second negative consequence relates to an increase in public debt generating negative expectations for private players.
First, the instinct to save in order to prepare for the future tax burden implied by the accumulation of debt – what economists call “Ricardian equivalence” – leads to an increase in asset prices; property bubbles might be the most obvious materialisation of this phenomenon.
Second, these negative expectations erode the credibility of the currency.
Countries (like Lebanon) that see their currencies disappear in favour of the dollar because of a surge in public debt are rare. Nevertheless, we are witnessing a resurgence of gold, which remains the ultimate monetary recourse in the collective unconscious, a resurgence underlined by the soaring price of this precious metal.
All this to say that it is time to put an end to the “no matter the cost” approach, even if a government default (cessation of payments) is not on the agenda.
Let’s remember the time before the pandemic. Prices are reasonable. From the beginning of 2010 to the beginning of 2020, the average annual increase in consumer price indices, when we exclude particularly volatile items like energy and food products, reaches 1.8% in the United States and 1.1% in the eurozone. The 2% objective set by central banks is not met and even the very low rate of unemployment (at the beginning of last year, it was 3.5% in the US and 5% in Germany) seems unable to generate an acceleration, via more dynamic labour costs.
Labour market developments – deregulation and a decrease in the bargaining power of employees – may explain the majority of this result. A collective preference for saving over investment and the credibility of monetary policies are other explanations that can be put forward.
But now, after a COVID-19 crisis that has lasted almost a year and a half, and with a way out finally taking shape, at least in the US and Europe, the price landscape seems to have been thrown upside down! In two months (April and May), this very same core of prices increases by 1.6% in the United States (a 10% annual rate!) and 0.7% in the eurozone (an annual rate of over 4%). Just what is going on? This price acceleration comes as somewhat of a (bad) surprise, particularly because the objective of economic policy, throughout the pandemic, has been to maintain productive capacities (companies and employees), so that activity can restart ‘like before’ when the public health conditions allow it.
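For reference, the annual rates quoted for these two-month price changes follow from simple compounding over the six two-month periods that make up a year; a quick sketch of the arithmetic:

```python
# Annualise a two-month price change by compounding it over the
# six two-month periods in a year.
def annualise_two_month_change(pct_change):
    return ((1 + pct_change / 100) ** 6 - 1) * 100

# US core prices: +1.6% over April-May
print(round(annualise_two_month_change(1.6), 1))  # roughly a 10% annual rate
# Eurozone core prices: +0.7% over the same period
print(round(annualise_two_month_change(0.7), 1))  # just over 4% annually
```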
So, in terms of prices, things may not be happening exactly as expected. What explanations can we give? Let’s start with three.
First, the reopening of an economy more or less “preserved” over a fairly long period requires rebalancing. Starting production again is not instantaneous, and demand during lockdown is not the same as demand during unlockdown. For supply, a raw materials index, like the S&P GSCI, increases by 65% over one year (and even 130% compared with the low point in April 2020). Similarly, the cost of sea freight increases over one year by more than 150%. As for demand, during this interim period between one economic state and another, two mechanisms of upward price distortion coexist. The goods or services that turned out to be the winners of the lockdown have still not relinquished their crowns; their prices remain dynamic. Those that were the losers can now “pick themselves back up”, or rather pick their prices back up! The two graphs below illustrate what is happening in the US.
Based on this twofold observation, and at this stage of the analysis, an initial conclusion emerges: the price acceleration phenomenon may very well prove temporary, as the central bankers keep telling us. The production circuit will get back up to “cruising speed”, and the concomitance of these two upward movements in certain retail prices is not expected to last.
US: price winners from unlockdown (4% of index)
US: price winners from lockdown (12% of index)
We must remember the mechanisms that are at the heart of forming consumer prices. There are three key points in the matter.
1. Transmission losses between the raw product prices and consumer prices are very significant, so much so that in the American case the correlation between the two series is only 10%.
2. The profile of labour costs, and especially unit labour costs (labour costs less the growth in labour productivity), shapes, with a delay of a few quarters, the profile of consumer prices. The messages sent by the front end of this relationship are not worrying. Unemployment is still far from its pre-COVID-19 level, and businesses are putting a lot of emphasis on the need to improve their efficiency.
3. Inflation expectations play a significant role in the formation of prices. Indeed, the stability of expectations is the guarantor of the stability of prices. The reasoning behind this is as follows: if all consumers start to believe that prices will accelerate, they will together precipitate purchasing decisions. The imbalance, which is most often inevitable, between a sudden increase in demand and an offer that struggles to adapt quickly leads to the phenomenon of price acceleration. This phenomenon will escalate and become permanent if labour costs follow prices. It would then be justified to talk about inflation. Let’s say that, for the time being at least, expectations have done quite well in resisting the “fuss” generated by these somewhat sharp increases in consumer prices.
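The unit labour cost relationship in point 2 can be written out simply; the figures below are illustrative assumptions, not taken from the text:

```python
# To a first-order approximation, unit labour cost growth is
# wage growth minus productivity growth:
#   ULC growth ~ wage growth - productivity growth
def ulc_growth(wage_growth_pct, productivity_growth_pct):
    return wage_growth_pct - productivity_growth_pct

# Illustrative: 3% wage growth offset by 2% productivity gains leaves
# only about 1% of cost pressure to pass through into consumer prices.
print(ulc_growth(3.0, 2.0))
```

This is why efficiency drives by businesses matter for the price outlook: productivity gains absorb wage increases before they reach consumer prices.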
China: transmission losses between production price index (PPI) and core consumer price index (CPI)
US: key role of unit labour costs in the formation of retail prices
To conclude on this second analytical point, the risk of “cyclical” inflation seems rather limited at the moment.
Finally, despite the explicit wish and will to return to normal once the pandemic is behind us, shouldn’t we question the changes that it has brought about? Let’s ask three questions:
1. How can we eliminate the divergences generated by the health crisis (countries, sectors, companies and households, employment and savings)?
2. What will be the effect of the rise in debt (public and private)?
3. How can we normalise an economic policy that is so highly accommodating?
It is precisely because these questions exist that the resolve behind current economic policy is both remaining and transforming. The best illustration of the approach can be found in the United States in the High Pressure Economy. Its ambition is threefold: to prevent a decline in potential growth, to reorientate the economy towards the future (digital, environment and education/training) and to galvanise both supply and demand. This requires an increase in public demand and an increase in transfers, with the idea that private spending will follow. At the same time, it is also necessary to ensure that sectoral and structural policies contribute to the corresponding supply side changes, higher productivity gains and more jobs, all while avoiding excessive timing differences between the respective upward shifts in demand and supply. Otherwise, there would be a risk of creating less reasonable price conditions. Further, there is no point trying to hide it: there is an element of “creative destruction” in the approach taken.
THREE DEVELOPMENTS ARE STARTING TO APPEAR.
1. The questioning of the triptych – movements (goods and people) / concentration (locations of production and possibly companies) / hyperconsumption – because of the constraints of sustainable development
2. The rebuilding of productive supply (air transport, tourism, automotive, etc.)
3. The matching of labour supply and demand with both labour shortages and excesses.
We have to admit that we are not facing a classic, cyclical sequence. Adjusting economic policy may not be appropriate (stimulus either poorly calibrated or ill-suited), and structural and sectoral changes may generate imbalances at the macroeconomic level; price acceleration would be an indicator of this. Of course, so far, this is all conjecture, but we have a duty to remain vigilant.
LET’S LOOK AT THE THREE CONCLUSIONS THAT WE HAVE REACHED:
The temporary is not made to last; the cyclical sequences are not sending any particularly worrying messages in terms of prices, today or in the near future; and the mix formed by economic policy initiatives and the structural changes currently being set in motion should be closely monitored, because it could be a source of imbalances, including greater inflation. A certain historical reference may be worth considering: the years following the end of World War II. That period needed both to support the economy and to reabsorb the imbalance between an awakening civilian demand and a then very military supply. All of this forced structural and sectoral developments. But beware: even if there is a certain resonance in terms of the sequences, the issue of time is perceived differently. It was necessary to move very quickly 75 years ago, but many believe, rightly or wrongly, that time pressure is less intense today. As such, neither policy initiatives nor structural changes would be of such a magnitude and speed as to generate serious imbalances, including the likes of more inflation.
This edition of the Economic Brief will see us focus on economic growth. More specifically, we will examine the economic growth lost during the COVID-19 crisis and the time lag in catching up to where we should be, contrasting the situation in China with that in the United States, Europe and elsewhere.
Accuracy conducted financial buy-side due diligence and provided assistance with the SPA and completion accounts for Schneider Electric in the context of the acquisition of a controlling stake in ETAP Automation.
Accuracy provided financial due diligence support to House of HR on the acquisition of Cohedron, a leading group of full-service companies in the public sector. The takeover enables House of HR to strengthen their position on the Dutch market and in the public service sector.
In this edition of the Economic Brief, we will examine some of the factors behind the increase in corporate leverage. We look at how corporate debt can amplify the effects of a financial shock and go on to investigate what the price may be for the current high levels of debt.
In this month’s Economic Brief, we delve into changes in mobility and economic confidence. We look at how different countries are opening up at different rates and analyse confidence in both the services sector and the manufacturing sector. Finally, we look into recent comments comparing the economic situation in the 2020s with the 1920s to discover if there is some truth to them.
Accuracy assisted NewPort Capital with its investment in Amslod, a fast-growing and leading direct-to-consumer Dutch e-bike brand. The investment by NewPort Capital will enable Amslod to accelerate its growth strategy and strengthen its position in the Dutch and European e-bike sector.
This month in the Economic Brief, we look a little more closely at the impact of the pandemic in the developed world. We examine the different rates of vaccination in the West and consider how economic confidence has been affected. We go on to consider the impact of the crisis on debt servicing for companies before touching briefly on inflation.
Open Banking is changing the face of the Canadian financial services industry
Consumers increasingly expect online banking services to rival the services they can access in branch. Open Banking or Consumer-Directed Finance – which officially recognizes consumers’ ownership over their financial data – has the potential to drive innovations in financial technology that will vastly improve the customer experience. Open Banking policy is driven by, and beneficial to, consumers, but it has the potential to seriously disrupt the financial services industry. As fintechs and tech giants begin entering the market and developing products and services based on Open Banking APIs, the Canadian banking oligopoly will begin to erode.
Absent immediate and decisive action by incumbents, there is a significant risk that traditional financial institutions may eventually be reduced to commoditized providers of financial services, competing with other incumbents solely on price in a race to the bottom. For example, Open Banking enabled applications could drastically reduce the need to shop around for mortgages and loans; consumers could simply share their credit information with a third-party app that provides them with a list of quotes from major financial institutions from which they could simply select the lowest rate. Despite posing a threat to incumbents, fintechs can also serve as allies, helping them better serve their customers while defending against the more serious competitive threat posed by the tech giants.
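The rate-shopping scenario described above is easy to picture in code. A minimal sketch, assuming a hypothetical aggregator has already retrieved quotes through the lenders’ Open Banking APIs (all lender names and rates here are invented for illustration):

```python
# Hypothetical quotes for the same borrower profile, as a third-party
# app might assemble them from several lenders' Open Banking APIs.
quotes = [
    {"lender": "Bank A", "rate_pct": 4.39},
    {"lender": "Bank B", "rate_pct": 4.19},
    {"lender": "Bank C", "rate_pct": 4.54},
]

def best_quote(quotes):
    """Return the quote with the lowest rate - the price-only
    'race to the bottom' the article warns commoditised lenders face."""
    return min(quotes, key=lambda q: q["rate_pct"])

print(best_quote(quotes)["lender"])  # the lowest-rate lender
```

Once comparison is this frictionless, the only lever left to an undifferentiated lender is price, which is precisely the commoditisation risk described above.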
Tech giants are capitalizing on the shift to consumer-directed finance to penetrate the financial services market
Worryingly, most Canadian financial institutions lack the infrastructure and/or the expertise needed to compete effectively with tech giants like Amazon and Facebook, which have both the technological and financial resources to displace incumbents. Google has recently entered the consumer banking space by allowing consumers to open checking accounts and transfer money digitally through the Google Pay app, and Apple partnered with Goldman Sachs in 2019 to launch the Apple credit card. In China, an early adopter of Open Banking, tech giants like Tencent and Alibaba have already begun to dominate the financial services market with integrated platforms such as WeChat Pay and AliPay.
Chinese Mobile Payments Market Share by Transaction Value
Source: Bloomberg, iResearch data as of June 30, 2020
In addition to looming competition from Western tech giants, fintechs also threaten to disrupt the industry by chipping away at the consumer-facing link in the financial services value chain by providing innovative services built on Open Banking APIs (including account aggregation, robo-advisory, automated accounting etc.). This shift is particularly problematic in an era of near-zero interest rates putting pressure on lenders’ bottom lines. While fintechs may appear to threaten incumbents, they can also serve as perfect partners for financial institutions looking to remain competitive in the face of potential competition from tech giants. By providing fintechs with access to an API ecosystem and creating mutually beneficial partnerships, financial institutions can leverage their expertise and agility to bring innovative products and services to market faster. Speed to market is key because Open Banking adoption, as well as the competitive pressures that come with it, is intensifying and incumbents should act quickly.
Growth of Financial Services APIs and Fintech Deal Volume
The Canadian Open Banking ecosystem is still in its infancy; incumbents should learn from experiences abroad to help prioritize the most promising use cases
The secular digitization trend in the financial services industry was accelerated by the COVID-19 pandemic, as well as the ensuing lockdowns, which forced consumers to increasingly rely on digital banking as bank branches around the world were shuttered. According to a recent report by TrueLayer, a UK-based Open Banking data aggregator, use of their Payments API grew 832% between March and July 2020. Further, the average transaction value of those payments has more than doubled since last year, while usage rates have yet to fall off. The same report analyzed millions of API calls and found that, as of 2020, PFM (personal finance management) was by far the most popular Open Banking use case, representing nearly a quarter of all API calls. PFM includes applications such as account aggregation, smart budgeting and auto-saving. PFM applications are likely the most popular use case because they’re the easiest to implement, but there are a number of emerging use cases in proptech, insurtech and regtech that promise to further revolutionize consumer banking.
Open Banking API Calls by Use Case – Europe (% of API Calls)
Open Banking not only makes retail banking more convenient for consumers; it also provides SMEs with a previously inaccessible suite of time-saving digital tools that help them stay competitive. The TrueLayer report shows that nearly 10% of all API calls related to automated accounting applications. By automating accounting and other back-office tasks, entrepreneurs can spend more time on their businesses and less money on professional services.
Canadian incumbents have begun preparing for the shift towards consumer-directed finance by forging partnerships with leading fintechs
A small group of large financial institutions, commonly referred to as the Big 6, has long dominated the Canadian financial services value chain. Other industry players including insurance companies, wealth management firms, credit unions and payment processors have successfully competed with the Big 6 in certain verticals, but none have materially disrupted the industry across the value chain. While the oligopolistic nature of the industry has helped Canadian banks outperform their European and American peers, consumers have gotten the short end of the stick. Open Banking promises to change this by opening up bank data and processes to third parties who can leverage it to create innovative products and services. Most incumbents lack the expertise, technological infrastructure and operational agility to build competitive products and services internally. As such, most incumbents are forced to choose between acquisitions and partnerships when implementing Open Banking use cases, with the latter being the preferred choice for most. Canadian industry leaders have already established partnerships with a number of promising fintechs across the value chain in anticipation of the looming paradigm shift.
Canadian Fintech Partnerships
Source: Luge Capital, Company Websites
In 2019, Desjardins announced a partnership with Hardbacon, a personal financial management and account aggregation application, in order to drive traffic to Desjardins’ online brokerage service. In 2017, RBC collaborated with Wave to provide their SME clients with a suite of accounting, invoicing and financial management tools. These mutually beneficial partnerships help incumbents by allowing them to improve their service offerings and stay competitive without substantial investment. While the existing incumbent-fintech partnerships are effective, they are far from sufficient. Customers’ needs and expectations are constantly evolving and incumbents must continuously innovate to ensure they can keep up with the rapid pace of change. If they don’t, the tech giants will.
Continuing the theme from the last edition of the Economic Brief leads us to consider how countries are dealing with the economic impact of COVID-19 and what actions they are taking to overcome the crisis. As can be predicted, the traditional rich–poor dichotomy is alive and well.
That is the conclusion of a French study led by the Ministry of Labour during the first lockdown, an unprecedented period during which the use of remote working became the norm overnight for many sectors of activity. One year on from the birth of this revolution, and thanks to the feedback of over 450 colleagues, I would like to share with you some invaluable lessons learned.
A setting conducive to concentration
Despite a certain feeling of distrust, and thanks to our extraordinary capacity to adapt, remote working has proved itself. When the home allows it, it provides a setting that favours the concentration necessary to perform certain tasks, such as drafting reports. It also proves efficient in the following concrete cases: short interactions with colleagues; presentations of simple documents; and meetings with well-prepared content and a limited number of participants.
An obstacle to learning and creativity
Remote working imposes a certain distance, however, no matter what technological tools are chosen or how frequently they are used. This distance slows down a quality learning process, which can only take place in direct contact with the realities of the job. The apprentice has to be able to observe, question and understand best practices in order to get to grips with them. Remote working also diminishes creativity by depriving us of the precious interactions that take place outside of the nitty-gritty of the job. It is these interactions that make up the life of an office, a team, a company. An unexpected comment here, a nod or shake of the head there, an encouraging look… so many exchanges that make it possible to call something into question or to be audacious, and which allow us to innovate together.
Ultimately, if a company is understood as simply the sum of its parts – or rather the sum of isolated individuals – it is meaningless. Remote working deprives us of this key aspect of the group, of this shared project nourished every day by our interactions, our agreements and disagreements: conviviality, which gives us a sense of belonging and of being useful.
The food for thought on this topic is vast and, as the health crisis continues to affect us and obliges us to adapt once more, I invite you to share with us your feelings on this new way of working.
On 21 January 2021, whilst visiting the Centre for Nanoscience and Nanotechnologies (C2N-CNRS) on the Plateau de Saclay (the European Silicon Valley), President Emmanuel Macron unveiled his ambitious Quantum Plan. The aim of this plan, which relies on France’s excellent research credentials, is to close the country’s gap in terms of investment.
It must therefore promote work and research on computers, sensors, calculators and even cryptography. In total, almost 1.8 billion euros will be dedicated to this plan.
The plan ‘is a plan for the whole ecosystem’, the French president also announced, proof that technologies will emerge on the market in particular through certain start-ups in the quantum technology sphere.
One of the most promising, Quandela, is one of the first companies in the world to commercialise photonic qubit emitters in the form of single photons. This first technological building block is essential for the creation of future quantum computers.
Created in 2017 by Pascale Senellart (CNRS research director), Valérian Giesz and Niccolo Somaschi, Quandela is a spin-off of the C2N-CNRS. The team’s objective, based on this light pulse technology, is to improve the calculating speed of research computers and ultimately to build the first quantum computers.
The opportunities offered by such a development are immeasurable, from the potential discovery of new medication thanks to simulations of molecular interactions to applications in aeronautics or banking, enabled by virtually limitless data and risk calculations.
Quandela is at the heart of the quantum revolution and is approaching the next step in its growth thanks to fundraising realised in July 2020 with Quantonation (the first venture capital fund dedicated to quantum technologies and innovative physics) and Bpifrance (via the French Tech Seed fund). This fundraising will in particular make it possible to accelerate the commercial deployment of the next generation of products.
Quandela has been supported for some months by La Place Stratégique – an organisation sponsored by the French state (Ministère des Armées, Direction générale de l’armement, Agence de l’innovation de défense, Gendarmerie Nationale), large corporates (Thales, Arquus) and the firms Accuracy and Jeantet – avocats – whose role is to assist the young companies that will count in tomorrow’s world.
Customisation and personalisation in the beauty sector
The personalisation of beauty is much more than just simple marketing innovation
Marketing and innovation have always been key success factors for beauty and personal care companies. This is even more so the case today, in an environment where the consumer has access to a much broader offer and greater information thanks to the internet.
Historically, marketing and innovation cycles were mostly product-centric, focusing on the continuous improvement and upgrading of product ranges and brands. However, this marketing routine has been brutally disrupted by new and growing consumer expectations. Indeed, marketing and innovation have now become customer-centric, to feed the need for natural products on the one hand and more personalised products on the other.
We know that growing concerns for the environment and for organic products are already reshaping the market.
But when it comes to the customisation and personalisation (hereafter C&P1) of beauty products, to what extent should we consider this as a major structural trend or just a marketing gimmick to please millennial consumers?
We firmly believe that the customisation and personalisation trend will significantly reshape the beauty industry, as it directly drives brand differentiation. Below, we will detail how and why.
The C&P trend is driven by customer expectations and enabled by technological innovations
Graph 1. Drivers and enablers of the C&P trend
Three drivers generated by customer expectations
1. Customer-centric products. The growing appeal of customised and personalised beauty products reflects a change in the expectations of consumers, notably in mature markets saturated by a standardised offer and overconsumption.
2. Ethical considerations. C&P enables consumers to select the ingredients used in their products (the trend to offer sustainable, vegan or cruelty-free formulas).
3. Inclusion and diversification. Customised beauty makes it possible to fulfil customer needs that are not addressed by mass-market products (e.g. Afro-Caribbean haircare, women with darker skin tones).
Two technological enablers
1. Digitalisation. The growing convergence of the online and offline worlds and the increase in BtoC are paving the way for the development of C&P solutions.
2. Scientific advancements and the rise of new industrial technologies. The combination of scientific and technological advancements offers a unique opportunity to obtain consumer data, analyse it and understand consumer needs in order to create fully tailored beauty solutions. The significant strategic value of consumer data is greater than ever for beauty and personal care companies.
The combination of these two enablers materialises through five main solutions or ways of operating that companies have implemented in their C&P strategies.
1. High-tech beauty
In the wake of personalisation through
algorithms, several major players are developing high-tech beauty products
providing customers with a complete personalisation experience. These companies
use artificial intelligence, augmented reality or even 3D printing to be at the
forefront of beauty technology.
To illustrate, L’Oréal presented a new device at the 2020 Consumer Electronics Show called ‘Perso’, which is expected to be launched in 2021. This device creates high-end personalised skincare, lipstick and foundation products. It operates in four steps: (i) a personal skin assessment is conducted using ModiFace technology (artificial intelligence); (ii) the user’s local environmental conditions are then assessed by the device using geolocation data; (iii) the user can customise the product formula for specific wants or needs; and (iv) finally, the device produces the cosmetic product taking into account all of the required parameters.
2. Personalisation through algorithms
An increasing number of beauty and personal care
players offer personalised cosmetics created by algorithms. Customers usually
answer a questionnaire or undertake an assessment to ascertain their needs,
whether it be online or in store. Answers and/or results are then analysed by
algorithms to determine the product formula that best matches their individual needs.
For example, the French brand IOMA offers
personalised skincare cosmetics based on an online questionnaire or an in-store
skin assessment. An algorithm will automatically recommend the ideal formula
from more than 33,000 possible combinations. Information on consumers such as
skin assessments enrich IOMA’s skin ‘Atlas’, a database which summarises,
compares and samples skin data to develop new skincare solutions.
3. Face-to-face consultations
In order to find the most appropriate cosmetics
for each individual, some brands have put in place face-to-face meetings with
experts to help customers create personalised products tailored to their needs.
As part of its Technology Incubator, L’Oréal
launched Color&Co, a direct-to-consumer brand specialised in personalised
hair-colouring kits, in 2019. Its value proposition lies in a ten-minute free
video chat with a specialised colourist, who creates a personalised kit adapted
to the customer’s wants and hair specificities, which have been described
previously in a short questionnaire. The product is then directly shipped to
the customer’s door and contains everything needed for the customer to dye his
or her hair at home. Face-to-face consultations therefore provide consumers
with personalised cosmetics that aim to answer the growing demand for inclusion.
4. Mix & Match products
Several brands currently offer ‘Mix & Match’ products, which allow customers to choose from all available components and to build customised products matching their own expectations.
For example, Guerlain launched ‘Rouge G’ in 2018. It
is a customisable lipstick offering customers the possibility to choose their
lipstick colour from 30 available shades and to select their favourite lipstick
case from 15 different proposals. Therefore, Mix & Match solutions enable
consumers to express their own individuality and can be used as a means to
better retain customers through a co-creation process.
5. Chatbots
Chatbots have been increasingly used on company
websites and on social media in order to provide customers with a more tailored
approach to service. Indeed, chatbots usually direct a consumer towards an item
that he or she might enjoy. They occasionally work together with augmented
reality technology, which enables customers to try beauty products virtually before buying.
By way of illustration, French makeup retailer Sephora launched a smart beauty bot, Sephora Virtual Artist, allowing customers to try on a wide range of makeup products instantly (lipsticks, eyeshadows, eyeliners, etc.) by uploading a selfie into the corresponding app. Having benefitted from a customised user experience, customers can then purchase their favourite products directly on Sephora’s mobile website.
These five solutions differ in terms of the initial investment required, the complexity of their implementation and the degree of personalisation (see graph below).
Graph 2. C&P solutions in the beauty and personal care market
Successful C&P operations should lead to more profitable business economics
Beauty companies expect C&P to generate a large positive economic contribution, which should improve their profitability significantly.
Higher margins via disintermediation
The personalisation business model is based on building a direct
relationship with the consumer. This is revolutionary for beauty companies as
personalisation tools and platforms enable them to circumvent traditional
retailers and capture their distribution margins. Indeed, the trade-off between
incurring additional distribution costs and saving the margin previously ceded to retailers proves beneficial overall.
Invoice a price premium
C&P mechanisms also provide significant potential for price premiums: consumers perceive the value of customised and personalised products to be higher. The analysis of several product samples representing various C&P solutions reveals that the applicable price premium increases with the degree of personalisation offered. On average, the price premium charged for these products is close to +50% of the price of the reference product (see graph 3).
Graph 3. Analysis of the premium charged by degree of personalisation
These premiums further take into account business model and cost structure
adaptations required to shift from a mass-market to an individual on-demand
business model. To fully capture the underlying value of the C&P trend,
beauty companies would have to invest in solutions up front and may also incur
higher production and distribution costs.
Enlarge the consumer base, enhance loyalty and increase purchase order frequency
The shift from product centricity to customer centricity and thus
tailored solutions for customers is based on the increased quantity and
spectrum of data provided by final customers. The data collected goes beyond
the traditional direct contact details (email, phone number, home address,
birthday, etc.) as customers are required to input their individual
specifications, such as skin tone, product preferences (colours, shades, etc.),
product expectations, appetite for natural products and more. Providing truly
individualised solutions to customers has a positive impact on customer
acquisition and loyalty and further raises the barriers for customers to switch to other brands.
Further, the availability, subsequent analysis and use of this precise
consumer data provides an opportunity for beauty companies to develop and
implement their own BtoC business models. This not only makes it possible to
bypass traditional retailers, but also makes it possible to implement personalisation-based
subscription models. Such models are already being implemented in the beauty
space with, for example, ‘The Dollar Shave Club’ and even in other FMCG sectors
with, for example, Nestlé’s ‘Tails.com’, a personalised pet food subscription
concept. These models enable companies to increase consumer purchase order
frequency by automating the ordering process.
A successful C&P strategy can double the lifetime value (LTV2) of a client
There is a lot of value to be created by
addressing the C&P trend driven by the points mentioned above, that is, capturing
retailer margins via disintermediation, benefitting from price premiums (see graph
3) and enhancing consumer loyalty and therefore increasing purchase order
frequency (see graph 4).
Whilst price premiums may seem to be the most evident source of value, we found that consumer
acquisition & loyalty, as well as disintermediation are the key drivers of lifetime
value linked to the business models focused on C&P.
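To illustrate the orders of magnitude, here is a minimal sketch of this lifetime-value logic, computed as the present value of the projected cash flows from one customer. All inputs (margins, order frequency, retention, discount rate and the size of the C&P uplift) are hypothetical assumptions chosen for illustration, not Accuracy's figures:

```python
# Hypothetical sketch of customer lifetime value (LTV) as the present value of
# projected cash flows from one customer. All inputs are illustrative assumptions.

def ltv(margin_per_order, orders_per_year, retention, discount_rate, years=10):
    """Discounted sum of the expected annual contribution of one customer."""
    return sum(
        margin_per_order * orders_per_year * retention**t / (1 + discount_rate)**t
        for t in range(years)
    )

# Classic retail model: standard product sold through retailers.
baseline = ltv(margin_per_order=10.0, orders_per_year=2.0,
               retention=0.60, discount_rate=0.08)

# C&P model: the price premium and the captured retailer margin lift the
# per-order contribution, while loyalty effects lift retention and frequency.
cp = ltv(margin_per_order=16.0, orders_per_year=2.2,
         retention=0.70, discount_rate=0.08)

print(f"baseline LTV: {baseline:.0f}, C&P LTV: {cp:.0f}, ratio: {cp / baseline:.1f}x")
```

Under these assumptions the ratio comes out at roughly 2x, consistent with the doubling suggested above; the sketch also shows that the uplift comes from the combination of margin, frequency and retention effects rather than from the price premium alone.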
Additionally, beauty companies will be required to make initial investments and organisational efforts to foster innovation, build industrial capacity, and develop and maintain digital BtoC platforms. These investments may seem expensive from a business perspective at one point in time, but the opportunity cost of doing nothing may prove to be more expensive: beauty companies may lose relevance in the eyes of the customer and subsequently lose sales and market share.
Graph 4. Impact of the personalisation and e-commerce on LTV
Ultimately, the C&P trend is not a marketing gimmick but a major economic repositioning of the industry. By transforming their business models, beauty companies can leverage this trend and create significant lifetime value.
1 Whilst customisation refers to specific changes performed by an end-user to adapt a product to his or her specific needs, personalisation is done by the system itself, which identifies customers and provides them with content matching their own characteristics.
2 Lifetime value (LTV) corresponds to the monetary value of a customer relationship, based on the present value of the projected future cash flows from that relationship.
Sophie Chassat Philosopher, partner at Wemean
The crisis has forced us to stop looking at things and finally see them. Let’s share a few words on this distinction, which comes from the philosopher Bergson. Most of the time, we put labels on situations, enabling us to quickly identify them and move on to action. To paraphrase Bergson: when we look at an object, usually, we don’t see it; what we see are the conventional signs that enable us to recognise the object and distinguish it practically from another, for convenience.1 However, as Bergson would go on to say, it is only when we pay attention to the uniqueness of things that we can really see them – and therefore measure their singularity in order to provide an adequate response, to adapt and to truly innovate.
By plunging us into an unprecedented situation, the crisis has shattered our preconceived filters. At first blinded, our eyes have gradually been opened. We have seen the dysfunctions that we previously considered normal. Remote working has become some sort of optical apparatus, a veritable telescope helping us to put many things into perspective: by seeing ‘at a distance’ (the literal meaning of the prefix tele) the way we work, we can measure, for example, the importance of direct human contact, as suggested by Frédéric Duponchel in his editorial.
Above all, we have started to explore our blind spots and hidden regions – these zones that can be identified by the ‘Johari window’2 , a matrix that reminds us of our individual perspectives and biases. Each individual, just like each organisation, has his or her ‘arena’ (known to self and known to others), ‘façade’ (known to self but unknown to others), ‘blind spot’ (unknown to self but known to others) and ‘unknown’ (unknown to self and unknown to others) – it is the exploration of this last zone that the crisis has made possible, or rather necessary. We should note that to realise this exploration, numerous organisations lean towards the clarification of their ‘vision’: the fact that topics like the ‘raison d’être’ and the ‘mission’ remain high on the company agenda shows the fundamental need to adopt new ways of seeing one’s business.
To train for this new way of seeing, reading a recently published work of art history alone qualifies as an ocular workout: in Le Strabisme du tableau. Essai sur les regards divergents du tableau3 , Nathalie Delbard invites us to take a fresh look at classical portraits and discover that numerous subjects in the pieces have a slight squint, not because of problems of sight, the author explains from the outset, but because the painters thus encourage us, the viewers, to shift our gaze off-centre. Our points of reference are wavering, but new perspectives are opening up. As Apollinaire put it, ‘Victory above all will be / To see well in the distance / To see everything / From close / And let everything have a new name’. 4
Sophie Chassat is a philosopher, a partner at the advisory firm WEMEAN and a corporate director. She works on strategic issues linked to the contribution of business projects: defining them, activating them operationally and determining their impact on governance.
1 Bergson, Madrid conferences on the human soul (1916), in Mélanges.
2 The Johari window was conceptualised by Joseph Luft and Harrington Ingham in 1955 to represent (and improve!) communication between two entities.
3 L’incidence Editeur, 2020. The title can be roughly translated as ‘The squint in works of art. An essay on divergent gazes in works of art’.
4 ‘La Victoire’, in Calligrammes (1918). The original French: ‘La Victoire avant tout sera / De bien voir au loin / De tout voir / De près / Et que tout ait un nom nouveau’.
Consequences of the development of green finance for companies
Franck Bancel Academic adviser, Accuracy
Since the Paris Agreement was signed in 2015, the fight against global warming has established itself at the top of the agenda for many companies. The reduction of greenhouse gas emissions has become a priority, requiring the implementation of new management systems. In this context, so-called green finance, which enables environmentally friendly project financing, is gaining traction. The development of green finance has major consequences for companies and raises multiple questions: how can we define the concept of green finance? What does it mean for companies? What is the role of the financial sector? What financial instruments have been specifically developed to meet company needs?
What is green finance?
Green finance groups all financial activities that contribute to the fight against global warming. For this reason, it is also called ‘climate finance’ or ‘carbon finance’. It is not ‘sustainable finance’. Sustainable finance, which has a broader definition, prioritises responsible investment (RI) and adds environmental, social and governance (ESG) criteria to purely financial criteria.
Green finance calls into question one of the major principles followed by financial
analysts. Traditionally, finance has no other objective than to facilitate the
allocation of resources to the most profitable projects, without consideration
of their impacts on the environment. By contrast, for green finance, only the
projects that favour the transition away from fossil fuels should be considered.
This does not mean that the notion of profitability ceases to exist; nothing
prevents companies from choosing the most profitable projects from amongst the
green projects available; what changes is the order of priority. The search for
profitability is now subordinate to the green nature of the investment.
What is climate risk for companies?
As Mark Carney explained in his famous speech
from 2015 on ‘Breaking the Tragedy of the Horizon’, climate risk can be broken
down into three distinct risks. First, the occurrence of extreme climate- and
weather-related events (hurricanes, droughts, etc.) may generate a physical
risk that materialises through the destruction of certain assets and losses of
activities for companies. Transition risk is linked to regulatory changes
decided upon by public authorities, which may lead certain companies to call
into question their economic model or even to disappear altogether. Let us take
the example of the automotive sector: because of regulatory changes, the
manufacture of combustion engines (petrol or diesel) will diminish drastically
in the decade to come, even though these engines utterly dominated just a few
years ago. Finally, liability risk related to non-compliance with environmental
legislation may generate significant damages for the companies concerned. We can
imagine that in the more or less distant future, companies may be pursued
legally for endangerment of others, as were, for example, tobacco companies.
At first glance, one might consider that the
majority of these risks will not materialise in the short term and that
companies have the time to adapt. We think, on the contrary, that companies
must anticipate these risks and quickly implement the appropriate management
processes to deal with them. Certain sectors must adapt now: their longevity is
threatened. Thus, in the oil and gas sector, certain major players have
started to invest massively in new sectors (batteries, electricity, etc.) and
to diversify their operations significantly. As for the less polluting sectors,
the need to reform may be less urgent, but the trend remains the same. Large
groups will gradually force their subcontractors to reduce their carbon
footprint and pressure will be high on SMEs. Access to financing in good
conditions will also require compliance with emission criteria (and more
generally with ESG criteria). Banks have made this clear, and their credit distribution models are evolving in this direction. The image and value of a company’s
brand is now inherently linked to its ability to contribute to the fight
against global warming.
How can companies manage climate risk?
Climate risk is not the subject of a
centralised management process in the majority of companies. Today, two main departments are involved in the management of climate issues. The sustainable development department ensures the operational management of projects
compatible with the fight against global warming. This means enabling the
company to comply with its climate commitments by proposing operational
solutions to reduce its carbon footprint throughout the value chain. For
example, can the company replace one material with another whose production
emits fewer greenhouse gases, without altering the quality of the final
products? How can the more virtuous suppliers in terms of emissions, etc. be
selected? The finance department centralises the information and produces
financial and extra-financial reporting in relation to environmental
performance. Climate reporting will become a central part of a company’s
financial communications in a context where financial information will become
standardised under pressure from the financial community and public
authorities. Investors request more and more information to evaluate not only
emissions but also all negative externalities. In the years to come, the
sustainable development and finance departments will have to cooperate further
and produce together new indicators that incorporate both financial and extra-financial performance.
Moreover, companies in certain sectors (power plants, manufacturing plants, etc.) are subject to emission ceilings. They are granted a certain
quantity of emission rights (quotas) but can purchase further rights on the
market if they find that they do not have enough (or indeed they can sell their
rights if they find themselves with an excess). The European Union has
committed to a policy to reduce the number of quotas available, which should
automatically generate an increase in their value over time and result in new
constraints for companies.
What is the role of the financial sector?
For the financial sector, the aim is to redirect activity as a priority towards projects compatible with the fight against climate change. Most of the large players in finance, whether banks or investment funds, have made commitments to reduce the carbon footprint of their portfolios. As a result, some banks have stopped financing companies that operate in the coal sector. More generally, one may question the financing of ‘fossil fuel’ companies; the continuation of their activities would challenge the objectives set to limit global warming (certain writers talk about ‘stranded assets’ to discuss these fossil fuel assets). Banks are now obliged to undertake climate stress tests and measure the impact of climate risk on their solvency. Article 173 (paragraph VI) of the Loi sur la Transition Energétique pour la Croissance Verte (Law on Energy Transition for Green Growth) requires portfolio management companies to publish information on the consideration of their ESG policy and therefore on the consequences of their investments on the climate.
To help investors better grasp this new
environment, public authorities have implemented ecolabels in various countries
that require labelled funds to invest significantly in green assets. This is
the case in France with Greenfin, in Luxembourg with LuxFLAG Environment and
LuxFLAG Climate Finance, and in Nordic countries with Nordic Swan Ecolabel.
These labels are backed by a taxonomy that defines what a green economic
activity is. The taxonomies play a major role in this respect because they
guide investors in their investment decisions. The European Union has created a
draft taxonomy that distinguishes between carbon-neutral activities (low-carbon
transport, etc.), transitioning activities (building renovation, etc.) and transition-facilitating
activities (production of wind turbines, etc.).
What are green financial instruments?
In this context, new financial instruments have
been developed by markets and banks with the aim of promoting the transition to
greener energy sources. For example, green bonds have experienced spectacular
growth in the past few years. They are bonds from which the funds collected
must exclusively be used to finance or refinance, in whole or in part, green
projects. For a company, issuing green bonds generates significant additional
costs (administrative costs linked to the issue process, legal costs, audit
costs, reporting costs, higher mobilisation of staff, etc.) for a very limited
reduction in the cost of financing. According to the financial literature, the additional costs equate to roughly seven basis points, whilst the premium is only around two basis points. However, issuing green bonds enables companies to increase their
investor base, secure the issue even in difficult market conditions and
generate organisational gains (better cooperation between the finance and
operational teams, increase in competence of the finance teams on subjects
linked to ecological impact, etc.). Green bonds are not the only green
financial instruments that have been developed. Banks, for example, have
started to securitise green assets (that is, issue securities on the market
whose value is based on the repayment of green loans). The development of this
market will depend, however, on regulators, which may reduce the capital costs
of the banks that finance this type of loan or even make the financing costs of
‘brown’ assets more expensive.
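A back-of-the-envelope sketch of the green bond cost trade-off described above, taking the two basis-point figures from the literature cited in the text and assuming, purely for illustration, a €500m issue:

```python
# Net cost of 'going green' on a bond issue, per the figures quoted above.
# The issue size is a hypothetical assumption for illustration.
issue_size = 500_000_000      # EUR, hypothetical green bond issue
extra_costs_bp = 7            # additional issuance costs (basis points)
premium_bp = 2                # reduction in financing cost (basis points)

net_bp = extra_costs_bp - premium_bp       # net extra cost in basis points
net_cost = issue_size * net_bp / 10_000    # 1 basis point = 0.01% of notional
print(f"net extra cost: {net_bp}bp, i.e. EUR {net_cost:,.0f} on a EUR 500m issue")
```

On these numbers, issuing green costs more than it saves, which supports the article's point: the motive for issuing green bonds is rarely the financing cost itself but the investor-base and organisational benefits.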
In conclusion, the fight against climate change has created a new paradigm in only a few years. For a company, considering that it need not pay any attention and can just continue business as usual seems like a risky choice. However, although the route has been mapped out, a great many questions essential to the deployment of green finance projects and tools remain unanswered. The transition to greener energies is particularly technical; the physical measures, just like the financial ones, are either subject to disagreement or considered insufficient. Convergence is expected to take place in the coming decade and will further accelerate the changes under way.
Gillian Tett, one of the chief editors at the Financial Times, commented earlier this year that people in New York found it harder to part with their Christmas trees after the holiday period. Has the COVID crisis really changed our relationship with time and space? Private and professional life is intertwining, just as the line between home and office is blurring. Are our points of reference changing? Will we find them again when the pandemic has finally been put behind us?
We should keep in mind this warning about possible behavioural changes taking place when we wonder what 2021 has in store for us. Of course, we should start this forward-looking exercise by taking a look at the macroeconomic forecasts. They bring hope. The IMF has revised its figures for global growth upwards: +0.3 points to 5.5%, after -3.5% in 2020. At the IMF, they seem to think that the loss of economic activity generated by the health crisis is going to be more than compensated! Can we say then that everything is going back to normal, back to business as usual?
No. And we must consider other approaches to better understand the upcoming period.
Let’s stay on macroeconomic territory for a while and note a few points:
1. Recovery remains highly conditional upon developments on the health front. If the decline in the pandemic is delayed even by only a few months, the first half of the year will be lost to the recovery; performance for the whole year will clearly be affected. Taking the example of the eurozone, we can see that GDP fell by over 7% in 2020. Under the commonly accepted assumption that the pandemic will clearly recede from spring, the economic rebound could reach between 4% and 4.5% this year. Delay that decline by just three months and a third of this growth would be cut!
2. The big growth figures that we’re talking about shouldn’t mask the point that it will take time to get back on the track that was expected before the pandemic. According to the World Bank, by 2022 there will be a shortfall of four trillion dollars in wealth creation. This is more or less the size of the German economy and is not something to be sniffed at. Should we fear being ‘condemned’ to another episode of slowdown in potential growth after a major crisis, even if its origin is neither economic nor financial? To make sure that we don’t have to respond in the affirmative to these questions, committing to a recovery policy that prioritises supply over demand seems essential. Will this be the case?
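The growth arithmetic behind these two points can be checked quickly. A small sketch using only the percentages quoted in the text, with pre-crisis activity indexed at 100:

```python
# Index pre-crisis (2019) activity at 100 and compound the quoted growth rates.
# Global (IMF): -3.5% in 2020, +5.5% forecast for 2021.
world_2021 = 100 * (1 - 0.035) * (1 + 0.055)   # slightly above 100

# Eurozone: roughly -7% in 2020, then a rebound of 4% to 4.5% in 2021.
euro_low = 100 * (1 - 0.07) * (1 + 0.040)
euro_high = 100 * (1 - 0.07) * (1 + 0.045)

print(f"world: {world_2021:.1f}; eurozone: {euro_low:.1f} to {euro_high:.1f}")
```

Globally, the 2020 loss is indeed more than made up in level terms, whilst the eurozone would remain roughly three points below its pre-crisis level, illustrating the shortfall described in point 2.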
We must also ask ourselves what lies behind
these figures, which retrace the developments of very broad economic
aggregates. In difficult times such as these, we often see, behind the
averages, an increase in standard deviations. This means that certain
households, certain businesses and certain countries are suffering more. The
least qualified have been the most affected by the downturn in the labour market.
How much time will it take for any improvement in employment to reach them?
It’s also clear that prospects are not the same for a small business in the
tourism sector as for another in the digital sector with global activities.
Finally, a country heavily involved in manufacturing industries and with
significant room for manoeuvre in terms of supportive policies (Germany, for
example) is in a better position than another specialised in labour-intensive
services and constrained by long-deteriorating public accounts. We must wonder
about the economic, social and political implications of this divergence. Are
we heading towards less growth (convoy theory?), more inequality and ultimately
less harmonious societies – both internally and with others – which are therefore
more difficult to manage? If this is the case, what measures should be taken to
counter these risks?
We should also consider the changes in behaviour brought about by the crisis:
1. A whole series of innovations already in progress are accelerating, whether it be digitalisation, distance selling, remote working, telemedicine, artificial intelligence or biotech. Certain sectors (transport services and upstream industrial branches, for example) will have to reinvent themselves.
2. Households and companies may change their trade-off between spending and saving: more caution, just in case, and so more savings? The economic and financial implications of such a change would be significant, namely a declining investment trend and interest rates coming to rest once again one notch lower than before.
3. Those responsible for public policy are therefore facing a complicated environment to grasp in all respects: they must manage the past (a heavy and high public debt) and prepare for the future (facilitate the structural changes towards the energy and environmental transition and also towards digitalisation). But what will the consequences ultimately be for productivity and growth profiles or the financial performance of companies? How much time will it take for all this to be visible, if it happens?
When we can’t see tomorrow very well, it’s
only human to hang on to what we know – yesterday. But this ‘back to basics’ only makes sense as a
springboard to dive into the new opportunities provided by a changing world: after
a crisis, it’s often out with the old and in with the new. Let’s keep our
Christmas trees longer than usual if it makes us feel better, but let’s make
sure to keep an eye open for the weak signals of a changing world. That is how we will make the best of the year ahead.
Who’s Who Legal identifies the foremost legal practitioners and consulting experts in business law based upon comprehensive, independent research. Entry into their guides is based solely on merit.
Accuracy’s forensic, litigation and arbitration experts combine technical skills in corporate finance, accounting, financial modelling, economics and market analysis with many years of forensic and transaction experience. We participate in different forms of dispute resolution, including arbitration, litigation and mediation. We also frequently assist in cases of actual or suspected fraud. Our expert teams operate on the following basis:
• An in-depth assessment of the situation; • An approach which values a transparent, detailed and well-argued presentation of the economic, financial or accounting issues at the heart of the case; • The work is carried out objectively with the intention to make it easier for the arbitrators to reach a decision; • Clear, robust written expert reports, including concise summaries and detailed backup; • A proven ability to present and defend our conclusions orally.
Our approach provides for a more comprehensive and richer response to the numerous challenges of a dispute. Additionally, our team includes delay and quantum experts, able to assess time related costs and quantify financial damages related to dispute cases on major construction projects.
A review of recent economic figures leads us to the conclusion that this crisis may well be the most brutal and precipitous in recent history. In fact, studies show that when compared with the financial crisis of 2007–2009, the current crisis generated the same level of stress for economies in only six months instead of two years.
One of the most striking characteristics of this crisis is its diversity of impact, whether at the citizen, business or country level.
Accuracy, the global independent advisory firm, has promoted two new partners as part of its continued growth: Charlene Burridge in London and Florence Westermann in Accuracy’s Paris office. These promotions bring Accuracy’s total number of partners to 52, across 13 countries.
Accuracy assisted Korian in the structuring, modelling and conclusion of a major real estate partnership with BNP Paribas Cardif and EDF Invest. This long-term partnership relates to a pan-European vehicle of 81 health assets that will be controlled and managed by the Korian group.
Accuracy conducted buy-side financial due diligence on behalf of Bpifrance, in advance
of its investment of €8m in Coretec Group, a French family-owned group and
expert in the engineering and manufacturing of tailor-made equipment for the
automotive industry. It is the first investment of the Fonds Avenir Automobile
2 (FAA2), created to support automotive suppliers and managed by Bpifrance.
Coretec Group employs 360 people across four sites located in France, Poland and the
Czech Republic and has achieved revenues of €33m as of 31 March 2020.
Report highlights Accuracy’s global reach, cross-disciplinary skills, innovative technology
Accuracy has been named in Global Investigations Review’s GIR 100 2020, an independent
guide to the world’s top 100 investigations firms. The GIR 100 is based on
extensive research with practitioners in the field and identifies those firms
able to handle sophisticated cross-border government-led and internal investigations. Accuracy is also the only France-based firm named by the guide and one of just 12
consultancy firms. The guide highlights Accuracy’s global reach of offices in
over a dozen countries.
The publication states that “rapid expansion and innovation have boosted Accuracy’s profile in recent years,” and notes the firm’s addition of “cross-disciplinary professionals with forensics, economics, technology and law enforcement experience” as well as its innovative use of technology. It describes Accuracy’s engagements for multinational companies “where multiple government agencies are investigating allegations of fraud, corruption and embezzlement” and its frequent partnership with GIR 100-ranked law firms.
“Accuracy is proud to be recognised among this elite group of investigations firms,” said Frédéric Duponchel, Accuracy’s Managing Partner. “Accuracy has assembled a team of highly experienced investigations professionals to help our clients with government and internal investigations, and this recognition demonstrates the dedication our team has brought to this work.”
Who’s Who Legal identifies the foremost legal practitioners and consulting experts in business law based upon comprehensive, independent research. Entry into their guides is based solely on merit.
Accuracy’s forensic, litigation and arbitration experts combine technical skills in corporate finance, accounting, financial modelling, economics and market analysis with many years of forensic and transaction experience. We participate in different forms of dispute resolution, including arbitration, litigation and mediation. We also frequently assist in cases of actual or suspected fraud. Our expert teams operate on the following basis:
• An in-depth assessment of the situation;
• An approach which values a transparent, detailed and well-argued presentation of the economic, financial or accounting issues at the heart of the case;
• Work carried out objectively, with the intention of making it easier for the arbitrators to reach a decision;
• Clear, robust written expert reports, including concise summaries and detailed backup;
• A proven ability to present and defend our conclusions orally.
For some twenty years, ecological considerations in political decisions on both a national and local scale have led numerous cities across the world to put ‘clean’ mobility at the top of their agendas. This means developing vehicles that emit low amounts of local pollutants (NOx, fine particles, etc.) and atmospheric pollutants (greenhouse gases).
We talk about ‘clean’ vehicles when they produce little or no polluting emissions, but in practice no vehicle is truly clean: all vehicles emit local pollutants and greenhouse gases during their production, during their use and at the end of their useful lives.
This article deals principally with ‘zero direct emission’ transport (called hereafter ‘zero emission’ or ‘ZE’ for the sake of simplicity) which emits no direct pollution (exhaust emissions), in contrast to decarbonised transport which emits little or no CO2 and depends on the energy mix of each country.
European regulations requiring low- or zero-emission public transport have caused the number of calls to tender issued by cities for these types of transport to grow. In France, the LTECV law (for Loi de Transition Energétique pour la Croissance Verte – Energy transition for green growth law) has scheduled investments in transport infrastructure.
At present, electric battery buses are the most advanced solutions from a technical and industrial perspective for zero-emission transport. Demand for electric battery buses has therefore exploded in Europe, and the capacity of operators to roll out these vehicles in cities, whilst finding the right economic balance, has become a significant strategic challenge.
Upstream, an electric battery manufacturing sector is being established in Europe (i) to meet this demand, (ii) to secure supply (currently mostly sourced from China), (iii) to create jobs and (iv) to answer an environmental necessity, among other objectives. Indeed, when analysing the entire life cycle, taking into account manufacturing and transportation, if the battery is made in China, the environmental impact assessment of the electric vehicle can be disappointing.
However, rolling out an electric fleet of vehicles is complex: it requires a larger initial investment than a classic fleet, both for the acquisition of the fleet itself and for the creation of the necessary infrastructure (adaptation and modernisation of bus stations and depots, recharging power, etc.). It also implies greater operating constraints (recharging time, management of battery performance, etc.). The implementation of these electric public transport fleets therefore requires complex financial and strategic choices from manufacturers, investors and operators.
The in-depth work that we have undertaken and summarised in this article makes it possible to understand the developments taking place in the electric battery sector, but also to identify the main value creation levers based on various scenarios at the level of the battery, the bus or the fleet. It also highlights other trends in the future of mobility, whether from a strategic (new business models) or technological (hydrogen battery) standpoint.
A. Electric batteries are currently 90% produced in Asia (60% in China alone). In light of the significant market growth, the wish to create a certain level of independence and the will to reduce the impact on the environment, a European electric battery sector is emerging, based on several consortia.
B. Production costs per kWh will also fall thanks to, on the one hand, technological innovations in progress and, on the other hand, improved recycling techniques and increased battery capacities.
C. Our analysis of the value chain and cost structure of a battery has enabled us to identify the production steps that provide the greatest added value. A quantitative analysis has made it possible to assess value creation levers: smart charging and recycling have proved to be two key points to maximise the economic value of a battery over its entire life cycle.
D. Making strategic choices at certain key steps in the life cycle of the battery is critical to exploiting its full potential for value creation. In particular, considering how to reuse the battery at the end of its first life makes it possible to optimise its economic potential.
E. An intermediary financial model serving as the link between the producer model and the operator model is under development: Battery as a Service (BaaS). This model gives the historical operator the opportunity to use a battery that is neither sold nor purely rented to him, but made available via a flexible, bespoke contract adapted to his needs at any moment.
F. Moreover, other forms of low- or zero-emission public transport are emerging alongside electric battery vehicles, such as hydrogen fuel cell electric buses (zero emission) or biomethane buses (low emission). There are so many decisions to make for investors, operators and other actors in the sector – decisions that need bespoke strategic support.
New regulations and more accessible prices have given rise to the ambitions of numerous cities to reduce CO2 emissions by putting in place low- or zero-emission public transport fleets. Moreover, the Paris Agreement and certain laws related to energy transition in Europe have established precise objectives for 2025 and 2030, in particular the LTECV (Loi de Transition Energétique pour la Croissance Verte – Energy transition for green growth law) from August 2015 in France. In addition, for some ten years, the improvement in electric battery performance, the diversification of the offer (autonomy, capacity, charging time, etc.), the significant growth of demand and the reduction of prices have all facilitated the rise of electric mobility.
The zero-emission (electric or hydrogen fuel cell battery) or low-emission (biomethane or natural gas) sector has turned out to be even more strategic in this post-quarantine period linked to COVID-19, which has further highlighted the stakes related to energy transition. As stated by the UN, this crisis ‘provides a global impetus to reach sustainable development objectives by 2030’. However, the path between ambition and implementation is riddled with pitfalls. For example, Paris – via the RATP and Ile-de-France Mobilités – was aiming for a 100% clean bus fleet by 2025, with 80% electric buses (i.e. zero emission) and 20% biomethane buses (i.e. low emission), in the city’s ‘2025 bus plan’.
However, economic constraints are such that today the objective is to replace only two thirds of the fleet with electric buses, the last third being comprised of biomethane (‘biogas’) buses1. These economic constraints concern both the financial investment and the economic and operating models. But let’s start by looking into the current stakes of the electric battery market.
1. THE CURRENT ELECTRIC BATTERY MARKET
A. The rise of a sustainable and competitive electric battery sector in Europe
Over the past ten years or so, the lithium-ion battery market has exploded. Today, two major trends are at play (Figure 1):
• the decrease in the price of lithium-ion batteries, which amounted to $209 per kWh in 2017 and should fall below $100 per kWh by 2025;
• the increase in global production capacity, estimated at 13% per annum on average between 2018 and 2030.
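These two trends can be quantified with a quick back-of-the-envelope calculation. The sketch below uses only the figures quoted above; the assumption of smooth compound growth over the period is ours, not a market forecast.

```python
# Illustrative check of the two Li-ion battery market trends cited above
# (price figures and growth rate from the text; smooth compounding assumed).

def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# Price: $209/kWh in 2017, falling below $100/kWh by 2025
price_decline = cagr(209, 100, 2025 - 2017)
print(f"Implied annual price decline: {price_decline:.1%}")  # ≈ -8.8% per year

# Capacity: 13% p.a. average growth between 2018 and 2030
capacity_multiple = (1 + 0.13) ** (2030 - 2018)
print(f"Capacity multiple over 2018-2030: {capacity_multiple:.1f}x")  # ≈ 4.3x
```

In other words, the figures in the two bullet points imply a price that falls by almost 9% a year while global capacity more than quadruples over the decade.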
Today, global production of Li-ion batteries, all uses combined, amounts to a capacity of around 500 GWh. Asia, and China in particular, is by far the leader in this sector: Chinese production alone represents approximately ten times European production. Indeed, seven of the top ten Li-ion battery manufacturers are Chinese – the leader being the giant CATL – representing a capacity of approximately 300 GWh2.
The sub-sector relating to Li-ion battery electric vehicles represents 70% of this market, that is, approximately 350 GWh.
Figure 1: Development of production capacity and prices of all-purpose Li-ion batteries between 2005 and 20303 4 5
And 40% of this sub-sector relates in particular to buses and other commercial vehicles, that is, 140 GWh. This production is also dominated by China, particularly the Chinese company CATL (70%6 of the bus battery market), as the electrification of bus fleets in China was pushed by the government much earlier than in Europe: for example, since 2009 the city of Shenzhen has benefitted from government subsidies for the development of its electric fleet.
Though production remains mostly Chinese, the USA and Europe should gain market shares, growing from only 10% of global electric battery production in 2020 to 40% in 2030. This rise in production capacity outside Asia will lead to a better balance between supply and demand. It is therefore a contributing factor to the reduction of prices, gains in factory productivity thanks to economies of scale, and the increase in the capacity of production chains. Tesla’s gigafactory in Nevada, for example, will produce 35 GWh annually in 2020 against 20 GWh in 2018. Similarly, the Swedish company Northvolt, starting with a capacity of 16 GWh, plans to double its factory’s production capacity by 2030 and end up reaching 150 GWh in 2050.
With regard to Europe in particular, the local sector is being built where political risk is low, financial incentives are high and administrative processes are easy. Easy access to qualified labour, reliable energy resources and a secure supply of raw materials are all essential. All of these conditions come together in Europe, where the commitment to transitioning to a low-emission system is strong. The presence of highly qualified engineers is also an advantage for the years to come, in the context of rapid technological developments. All of these elements make Europe a high-potential zone for the production of electric batteries. Indeed, significant political and financial means have been mobilised to give rise to European or transnational projects.
Therefore, as shown in Figure 2, even if Asia remains dominant in the electric battery market, an international rebalancing will take place by 2030, particularly at the European level.
Figure 2: Development of production capacity of Li-ion batteries by region
(location based on company HQ)4
Figure 3 presents the current landscape of cell and battery production in Europe. The significant presence of Asian actors is evident, as well as the European large-scale factory construction projects, aiming to structure a sustainable and economically viable industrial sector.
The EU programme European Battery Alliance (EBA250), launched in October 2017, is made up of 17 private companies directly involved throughout the value chain, including BASF, BMW, Eneris, and especially the joint venture ACC (Automotive Cells Company) between PSA (and its German subsidiary Opel) and SAFT (a subsidiary of Total). They are supported by over 120 other companies and partner research organisations, as well as public bodies such as the European Investment Bank. The aim is to develop highly innovative and sustainable technologies for Li-ion batteries (whether liquid electrolyte or solid state) that are safer and greener, exhibiting a longer lifespan and a shorter charging time than those currently on the market. The EBA250 benefits from €5 billion in private financing and €3.2 billion in public financing, including €1 billion from France and €1.2 billion from Germany.
Figure 3: Cell and battery production plant projects under way in Europe7 8 9 10 11 12
More precisely, ACC, often nicknamed the ‘Airbus of batteries’, will build a pilot plant in the south-west of France, followed by two cell production factories for electric batteries in the Hauts-de-France region and in Germany. Another major project – the construction of a gigafactory – is being undertaken by the French start-up Verkor13 (notably supported by Schneider Electric) and aims to produce Li-ion cells for southern Europe (France, Spain and Italy) from the end of 2023. This project takes its inspiration directly from the Swedish start-up Northvolt, which raised €1 billion from private investors (including Volkswagen, BMW and Goldman Sachs) to finance the creation of a lithium-ion battery production factory in Sweden. Verkor’s project represents an investment of €1.6 billion and the 200-hectare factory will likely be based in France. Similarly, the Norwegian company Freyr launched the construction of a battery cell manufacturing plant in Norway (€4.5 billion), which will have a capacity of 32 GWh from 2023 and will be one of the largest in Europe.
It is worth mentioning that other projects are under development to build a European battery recycling sector, a key step in the electric battery value chain. Supported by Eramet, BASF and Suez, the ReLieVe (Recycling for Li-ion batteries for Electric Vehicles) project – with a smaller budget of €4.7 million – aims to develop an innovative and competitive ‘closed-loop’ recycling process, enabling the recovery of nickel, cobalt, manganese and lithium for new batteries.
B. Better performance thanks to new conception and recycling technologies, which lead to a reduction in production costs
The technical performance criteria of electric batteries, such as autonomy or specific capacity (stored energy per unit of mass), should triple by 2030 thanks to new battery technologies, as shown in Figure 4. Incremental innovations in Li-ion batteries will make it possible in the short term to replace the rare metals used in the manufacture of the electrodes, such as cobalt and manganese, which are too expensive and polluting. The 33% reduction in the use of cobalt, partially replaced by nickel, which is much less expensive, will make it possible to offset the 40% increase in the price of cobalt forecast between 2020 and 2030. With 60% nickel, 20% manganese and only 20% cobalt, NMC 622 technology will replace NMC 111 batteries (which contain 33% cobalt) and will represent 30% of the market in 2030.
By 2030, new disruptive technologies are expected, with new cathodes and solid electrolytes in particular, greatly increasing the reliability of the battery. Current batteries that use a liquid electrolyte work efficiently at room temperature and over a range between 0°C and 45°C14; a solid electrolyte, however, enables a wider range of use, between -20°C and 100°C15. In addition, Samsung has recently patented a battery in which the cathode and the anode are covered with graphene balls; its recharge time is five times quicker. As for batteries with silicon anodes, they have greater capacity thanks to the replacement of the usual graphite anode with a silicon anode derived from the purification of sand.
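The cobalt substitution described above can be sanity-checked with simple arithmetic. The metal shares below are taken from the text (33% cobalt in NMC 111, 20% in NMC 622); indexing today’s cobalt price at 1.0 is our illustrative assumption.

```python
# Back-of-the-envelope check: does moving from NMC 111 to NMC 622 offset
# the forecast 40% rise in the cobalt price? Shares from the text; the
# price index baseline of 1.0 is an assumption.

cobalt_share_111 = 0.33   # cobalt share of cathode metals, NMC 111 (text)
cobalt_share_622 = 0.20   # cobalt share of cathode metals, NMC 622 (text)
price_index_2030 = 1.40   # 40% forecast rise in the cobalt price by 2030

relative_cost_111 = cobalt_share_111 * 1.0               # today's cost index
relative_cost_622 = cobalt_share_622 * price_index_2030  # 2030 cost index

# Even after the price rise, the NMC 622 cathode carries less cobalt cost
print(round(relative_cost_622 / relative_cost_111, 2))  # → 0.85
```

So, despite a 40% dearer input, the lower cobalt loading leaves the cobalt bill roughly 15% lighter per battery, consistent with the offsetting effect described in the text.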
Figure 4a: Development of battery technologies up to 203016 17
Figure 4b: Development of market share of the different Li-ion battery technologies up to 2030
Ultimately, recycling costs should fall as understanding of current techniques (hydrometallurgy and pyrometallurgy) advances. A new, much less expensive technique is currently under development: the ‘direct recycling’ process. In this process, the electrolyte and the materials making up the cathodes are recovered to be reused directly with no metallurgical treatment necessary. Figure 5 below shows the advantages and disadvantages of each of these recycling methods.
Figure 5: New recycling methods: less expensive and more environmentally friendly solutions18 19 20
The combination of these elements (improved performance, reduction in proportions of rare materials, new recycling processes) will enable a drastic reduction in production costs by 2030, making the electric battery market a promising sector for investors. Our cost structure model (cf. Figure 6 below) indicates that by 2030 the production cost of an NMC 111 battery will decrease by at least 25% compared with its current level. For future battery technologies, this reduction will be greater. For example, Tesla has announced a 56% reduction by 2022 in the production cost per kilowatt-hour of its new batteries thanks to a series of technical innovations.
However, though costs are expected to fall significantly, the financial equation for electric vehicle fleets remains complex. Our analysis of the life cycle of a battery, its cost structure and its performance factors makes it possible to identify certain value creation levers that could make all the difference for transport operators.
Figure 6: The cost structure of a battery (NMC 111) makes it possible to anticipate its production costs by 2030
2. MAXIMISING THE VALUE OF A BATTERY THANKS TO THE DETAIL OF ITS COSTS THROUGHOUT ITS LIFE CYCLE
A. A cost structure that reveals the stages with the highest added value in the manufacturing cycle of a battery
The electric battery value chain can be broken down into several stages (Figure 7): supply of raw materials, manufacture of basic chemical components, conception and production of cells generating electrical energy, conception and production of modules, manufacture of packs (protection against shocks, vibrations), integration of the battery into smart control and performance management systems (battery management system), and, finally, recycling of components and metals at the end of their useful lives. This last stage shows that batteries still have value, even at the end of their useful lives.
Figure 7: Value chain of an electric battery: stakes and challenges21 22
To determine the cost structure of a battery, we have analysed each stage to determine its impact on the value of a new battery. Four types of expenditure appear at each stage: purchase costs (raw materials or components), labour costs, R&D costs and fixed costs (expenditure linked to electricity or additional material necessary for the conception of the cells).
The stage related to the manufacture of basic components is the most expensive (26% of the total cost) because it concerns the various elements making up the electrodes and the solvent contained in the electrolyte. The integration of the battery into a smart system is also a crucial step (22%) due to the importance of the software in monitoring the performance of the battery, which requires a significant investment in R&D. This stage also provides the most added value insofar as the increase in the level of production will not lead to an explosion in R&D costs – these will already have been incurred. Finally, the cell conception and production stage is the third most expensive. It is characterised by high R&D and labour costs.
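A minimal sketch of this cost structure, using the two stage shares quoted above (26% and 22%); the 2020 baseline of $137/kWh and the lumping of all remaining stages into one line are illustrative assumptions, while the reduction of at least 25% by 2030 is the figure given in section 1.

```python
# Illustrative cost-structure model for an NMC 111 battery. Stage shares
# for the two largest items come from the text; the $137/kWh baseline and
# the residual "other stages" bucket are assumptions for the example.

baseline_2020 = 137.0  # assumed pack-level production cost, $/kWh

shares = {
    "basic chemical components": 0.26,  # most expensive stage (text)
    "smart-system integration": 0.22,   # BMS / software stage (text)
    "other stages": 0.52,               # cells, modules, packs, raw materials...
}

for stage, share in shares.items():
    print(f"{stage:28s} ${baseline_2020 * share:6.2f}/kWh")

cost_2030 = baseline_2020 * (1 - 0.25)  # at least a 25% reduction by 2030
print(f"Implied 2030 cost: <= ${cost_2030:.2f}/kWh")
```

The point of the exercise: because the shares are known, any forecast of the total cost can be decomposed stage by stage, which is how the model behind Figure 6 anticipates where the 2030 savings will come from.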
Figure 8: Value chain of an NMC 111 battery in 202023 24
B. Identification of the key stages in a battery’s life cycle to maximise its value
The state of health (SoH) of a battery is an indicator that helps to optimise its use. Mobility contracts with electric bus operators generally stipulate an SoH of between 100% and 80%. Beyond this limit, the battery cannot be used with the same level of security and efficiency – it is the end of its first life. The battery is therefore at a critical moment in its life cycle, where choices must be made: if its performance allows it, the battery can be used again in another contract; it can then be allocated to a stationary energy storage unit in its second life (to balance the grid, for example); and it can finally be sold at the end of its useful life to be recycled, where certain components will be refined to be reused.
Figure 9: Life cycle of an electric battery (based on SoH)
The state of health of a battery makes it possible to evaluate its condition. Four factors can lead to the deterioration (decrease in capacity and increase in internal resistance) of a battery:
• Temperature (T): extreme temperatures negatively affect the state of health of a battery. At high temperatures, the internal activity of a battery increases, thereby reducing its capacity; below 0°C, internal resistance increases considerably, thereby accelerating its ageing25.
• The charge and discharge rate (C-rate): this corresponds to the intensity of the electric current going through the battery. The higher it is, the quicker the battery will age.
• The state of charge (SoC): this relates to the proportion of energy stored by the battery compared with its total capacity. The capacity of a battery deteriorates not only during charge/discharge but also, to a lesser extent, when it is left unused or stored with a non-zero charge. Storing batteries with a relatively low SoC is therefore recommended to limit their deterioration. To optimise their lifespan, batteries should be recharged to 100% occasionally to balance the cells.
• Depth of discharge (DoD): this represents the percentage of energy that has been lost by the battery since its last recharge and therefore characterises its charging profile. The greater the DoD, the quicker the battery will deteriorate. According to the type of battery used, the optimal DoD (hardly possible operationally!) varies between 50% and 70%.
Knowing the deterioration factors of a battery makes it possible to anticipate this deterioration based on its use, its technology, the monitoring of its performance and its conservation. For example, charge and discharge modes vary greatly depending on whether the battery is used in urban or semirural environments. A semirural use would lead to greater deterioration due to the distances travelled, requiring more frequent and rapid recharges.
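The four deterioration factors can be combined into a toy ageing model. Every coefficient below is an illustrative assumption (real degradation models are calibrated per cell chemistry and use profile); the sketch simply reproduces the urban versus semirural contrast described above, using the 80% SoH contractual floor mentioned earlier.

```python
# Toy battery-ageing model: capacity loss per cycle as a product of four
# stress factors (temperature, C-rate, average SoC, DoD). All coefficients
# are illustrative assumptions, not measured values.

def cycle_degradation(temp_c, c_rate, avg_soc, dod):
    """Capacity lost per cycle (fraction of nominal)."""
    base_loss = 2e-4                                   # loss per gentle cycle
    temp_stress = 1.0 + 0.03 * abs(temp_c - 25)       # worse away from ~25°C
    rate_stress = 1.0 + 0.5 * max(0.0, c_rate - 0.5)  # fast charging ages faster
    soc_stress = 1.0 + 0.4 * max(0.0, avg_soc - 0.5)  # sitting near full hurts
    dod_stress = 0.5 + dod                             # deep discharge hurts
    return base_loss * temp_stress * rate_stress * soc_stress * dod_stress

def cycles_to_end_of_first_life(**conditions):
    """Cycles until SoH falls from 100% to the 80% contractual floor."""
    soh, cycles = 1.0, 0
    while soh > 0.80:
        soh -= cycle_degradation(**conditions)
        cycles += 1
    return cycles

# Urban profile (gentle cycling) vs semirural (long runs, fast recharges)
urban = cycles_to_end_of_first_life(temp_c=25, c_rate=0.5, avg_soc=0.6, dod=0.6)
semirural = cycles_to_end_of_first_life(temp_c=25, c_rate=1.5, avg_soc=0.8, dod=0.9)
print(urban, semirural)  # the semirural battery reaches 80% SoH in fewer cycles
```

Even with made-up coefficients, the structure of the model shows why a semirural profile, with its higher C-rates and deeper discharges, ends its first life markedly earlier than an urban one.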
Based on these factors, we have highlighted value creation levers able to be used to control and maximise the value of the battery throughout its life cycle. These levers concern the optimisation of a battery’s use, the management of its performance and the management of used batteries.
Figure 10: The ten value creation levers of an electric battery
One of these levers is smart charging, that is, smart and innovative technology that makes it possible to recharge electric buses at the optimal time: avoiding saturating the grid with demand for electricity during simultaneous peaks in demand from households and electric vehicles, for example.
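As a toy illustration of this idea, the scheduler below fills the cheapest overnight hours first. The hourly prices, charger power and energy requirement are all assumed figures; real smart-charging systems also handle grid constraints and departure times.

```python
# Minimal illustration of smart charging: schedule the night's charging
# into the cheapest grid hours instead of plugging in at the evening peak.
# Prices (EUR/kWh), charger power and energy need are assumptions.

def smart_charge_plan(prices_by_hour, energy_needed_kwh, charger_kw):
    """Greedy plan: fill the cheapest 1-hour slots first until the need is met."""
    plan = {}
    remaining = energy_needed_kwh
    for hour, price in sorted(prices_by_hour.items(), key=lambda kv: kv[1]):
        if remaining <= 0:
            break
        energy = min(charger_kw, remaining)  # energy deliverable in one hour
        plan[hour] = energy
        remaining -= energy
    return plan

# Evening peak is expensive; the small hours are cheap (assumed tariff)
prices = {18: 0.25, 19: 0.28, 20: 0.22, 23: 0.12, 0: 0.08, 1: 0.07,
          2: 0.07, 3: 0.08, 4: 0.10}

plan = smart_charge_plan(prices, energy_needed_kwh=300, charger_kw=100)
print(plan)  # charging lands on hours 1, 2 and 0 - the three cheapest slots
```

The greedy choice of cheapest hours is the simplest possible policy; the value of the lever comes precisely from shifting demand out of the expensive peak slots.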
A second interesting lever concerns improving recycling techniques, leading to a reduction in recycling costs. Indeed, the continued improvement of current techniques (hydrometallurgy and pyrometallurgy) and the emergence of new efficient techniques (the ‘direct recycling’ process) contribute to the prolonged use of the battery into a second life, followed by its recycling, instead of a shorter use that would be limited to the first life of the battery followed by its sale.
Finally, a third lever consists of managing battery performance, and therefore the know-how related to performance monitoring. ‘Maintenance’ contracts are proposed by battery suppliers. As part of these contracts, certain parameters (SoC, DoD, C-rate, charge intensity, temperature during charge/discharge, etc.) are measured via a battery management system (BMS) to monitor performance: the battery undergoes several charge and discharge cycles under varying conditions and the analysis of the data collected by the BMS can lead to the battery’s replacement if it has deteriorated too much or if the conditions of use no longer comply with the conditions of the contract, particularly those related to safety. But this performance monitoring is currently proving to be more a matter of insurance than of maintenance in the strict sense of the word. That is why a value creation lever would be to renegotiate the contract to bring it closer to the real costs of monitoring performance or even internalising this know-how, more for strategic reasons than for financial ones. Indeed, controlling operating data and battery performance data in real time is crucial because it makes it possible to adapt battery technologies as closely as possible to the use made of them. It should be noted, however, that this last lever is only applicable with great difficulty at present, as numerous battery manufacturers do not allow their clients to internalise this service.
To illustrate all of this, we have modelled in the example below the effects of different levers on a fleet of 25 buses in both an urban and a semirural context. The options analysed are as follows: smart charging or not during the first life; resale of the battery or reuse in a new contract at the end of the first life; reuse in stationary energy storage infrastructure in the second life (as reserve capacity in this particular case). We note that:
• smart charging creates value systematically and, moreover, has the benefit of being simple to implement;
• frequency regulation is not worthwhile, due to high investment costs, a second life that is too short, and an energy resale price that is too low in France;
• the use of a new contract at the end of a battery’s first life, rather than reselling it, is appealing in an urban scenario because the battery deteriorates more slowly than in a semirural scenario.
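A stylised NPV comparison of two of these end-of-first-life options for a single battery can make the trade-off concrete. The discount rate and all cash flows below are assumed figures for illustration, not outputs of the model behind Figure 11.

```python
# Stylised end-of-first-life comparison for one battery: sell it now, or
# sign a second operating contract and recycle afterwards. All amounts
# (kEUR) and the discount rate are illustrative assumptions.

def npv(cashflows, rate):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

rate = 0.06           # assumed discount rate
resale_value = 20.0   # assumed resale price at the end of the first life

# Option 1: sell the battery immediately (year 0)
option_resale = npv([resale_value], rate)

# Option 2: a 4-year second contract at 8 kEUR/year, then recycling revenue
option_second_contract = npv([0, 8, 8, 8, 8 + 3], rate)

print(round(option_resale, 1), round(option_second_contract, 1))
```

Under these assumed figures the second contract outperforms an immediate resale, mirroring the urban scenario above, where slower deterioration makes reuse in a new contract appealing.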
These are just some of the operational decision-making factors that have a real impact on the economic model of electric fleets. That said, beyond these levers enabling operators to optimise the performance of their batteries, there are still other avenues to explore in the face of the complexities of the classic electric bus model: the first consists of a new financial and operational management model for these buses; the second consists of alternative modes of low- or zero-emission transport.
Figure 11: NPV calculation for an NMC battery based on its use and the use of certain levers26 27
3. NEW PERSPECTIVES IN THE MANAGEMENT OF ZERO-EMISSION BUSES
A. The emergence of new economic models: the BaaS model
Despite significant technological advances and the expected reduction in the production costs of electric batteries, technical constraints remain substantial for electric transport operators. First, capital expenditure is higher than for classic vehicles (50% higher than for a diesel fleet28). In addition, performance control, battery maintenance and decisions to be made when battery efficiency is reduced are complex parameters to implement for historical bus operators. In this context, the emergence of the battery as a service (BaaS) model almost seems obvious.
Battery as a service basically frees transport operators from the constraints and risks associated with the management of a battery. The service provider takes care of all aspects linked to the battery’s use, from its certification (in compliance with safety and environmental standards) to performance monitoring to recycling; the service provider also ensures that the service provided complies with the expectations of its client, the transport operator, at all times, with a view towards value optimisation. The service provider therefore has to find the optimal contract and use profile for the battery, depending on the stage of the battery’s life cycle – and therefore its performance – at any given moment. It is its understanding of the different value creation levers, as well as its in-depth knowledge of battery performance, that enables the service provider to determine the ideal client or contract profile adapted to its battery. Some of the most well known BaaS companies include Global Technology Systems, Yuso, Swobbee and Epiroc.
Figure 12: Three different business models
B. The development of new low- or zero-emission means of transport
Figure 13: Forecast number of electric and hydrogen buses up to 2025
In parallel with the rise of electric battery buses, other clean means of transport are under development, such as low-emission buses running on biomethane (biogas) or zero-emission buses running on hydrogen. These technologies are growing substantially across the world, despite differences in their level of maturity, depending on the country.
Based on local energy sources, biogas buses constitute a low-emission technology (a 25% reduction in toxic fume emissions compared with petrol vehicles), which has the advantage of an excellent level of autonomy and a short recharge time. However, the infrastructure to be put in place is substantial and expensive.
ZE electric buses (battery- or hydrogen-based) are two complementary technologies. Indeed, hydrogen technology (which is more expensive) becomes more relevant where battery technology reaches its limits or in future use cases (grid saturation, for example). This zero-emission technology provides a high level of autonomy and relatively short recharge cycles (Air Liquide estimates that a hydrogen bus can be recharged in less than 20 minutes29). Nevertheless, the required infrastructure is considerable (hydrogen recharge stations), and the network is virtually non-existent or only at its inception in the majority of large cities today. However, as numerous French cities have shown an interest in this technology by launching pilot projects, the government’s recent recovery plan following the COVID-19 health crisis will dedicate more than €7 billion over ten years to this energy of the future, aiming in particular to build factories able to produce electrolysers (an electrolyser makes it possible to produce hydrogen via the electrolysis of water). The hydrogen plan forecasts financing of €1.5 billion to develop a hydrogen sector similar to that being undertaken for electric batteries – in cooperation with Germany.
Figure 14: New types of low- or zero-emission mobility30 31
The main challenge facing the development of the electric battery sector is multiplying supply considerably to be able to match the significant increase in demand. This project is currently materialising through the creation of a sustainable and competitive battery manufacturing and recycling industry in Europe.
In parallel, battery technologies are improving, with batteries gaining in autonomy and specific capacity. Recycling methods are also the subject of critical technical innovation, which should lead to a significant reduction in total production costs by 2030.
However, constraints remain significant for electric mobility players: the amount of capital expenditure, the control of battery performance and the complexity of decisions to be made when their efficiency starts to deteriorate are all parameters that have favoured the emergence of new economic models of battery use, such as the BaaS model, as well as other modes of clean mobility that should be closely monitored, such as the hydrogen bus.
These developments, in economic model and technology, should lead historical players and new entrants in the zero-emission transport sector to change their strategy and investment policies.
In this phase of significant transformations for the whole sector, Accuracy has developed a strategic support framework in order to help these players to identify and seize the truly sustainable and profitable opportunities in the value chain.
1 De moins en moins de bus électriques dans la future flotte de la RATP, Ville Rail & Transports, Marie-Hélène Poingt, 04.03.2020
2 https://www.energytrend.cn/news/20191014-76629.html, Starting Point Research Institute (SPIR)
3 Lithium-ion Battery Costs and Market, Bloomberg New Energy Finance, 05.07.2017
4 Developing a promising sector for Quebec’s economy, Propulsion Québec, April 2019
5 Roadmap Battery Production Equipment 2030, VDMA, 2018
6 http://escn.com.cn/news/show-711124.html, China Energy Storage Network
7 Comment la filière des batteries pour véhicules électriques tente de se structurer en Europe, L’Usine Nouvelle, 06.09.2019
8 CATL starts building battery plant in Germany, electrive.com, 19.10.2019
9 LG Chem battery gigafactory in Poland to be powered by EBRD, European Bank, 07.11.2019
12 Samsung SDI expands its battery plant in Hungary, INSIDEEVs, 24.02.2020
13 Avec Verkor, la France compte un autre projet de giga-usine de batteries, Les Échos, Lionel Steinman, 30.07.2020
14 La batterie Lithium-Ion, mieux comprendre pour mieux s’en servir, Amperes.be, 10.05.2017
15 La batterie à électrolyte solide : une révolution pour l’automobile, Les numériques, Erick Fontaine, 23.11.2017
16 Study on the Characteristics of a High Capacity Nickel Manganese Cobalt Oxide (NMC) Lithium-Ion Battery—An Experimental Investigation, www.mdpi.com/journal/energies, 29.08.2018
17 Oxygen Release and Its Effect on the Cycling Stability of LiNixMnyCozO2 (NMC) Cathode Materials for Li-Ion Batteries, Journal of The Electrochemical Society, 02.05.2017
18 A Mini-Review on Metal Recycling from Spent Lithium Ion Batteries, www.elsevier.com/locate/eng
19 The recycling of Lithium-ion batteries, Ifri, 2020
The COVID crisis is the largest global economic shock since the Second World War. As a result of the health crisis, billions of people were confined to their homes, bringing the economy to a sudden halt. Consequently, global trade came to a standstill, hundreds of millions lost their jobs and indebtedness has greatly increased. As economies begin to restart, it is becoming clear that the impacts of the crisis are not merely transitory.
In collaboration with:
Charles-Antoine Condomine (Manager), Marius Henault (Analyst),
Justine Schmit (Manager) and Vincent Thebault (Manager).
It is widely recognised that a safe haven investment is one whose value does not fall during an economic or financial crisis. A safe haven investment is therefore a counter-cyclical investment in the sense that it is highly resistant to economic cycles and exhibits lower correlation to more risky asset classes.
Gold is often presented as the safe haven investment of choice. It has served as a store of value since ancient times. Its market price is not directly linked to changes in financial markets or the economic context, but this does not necessarily mean that its value will not fall, as was the case from 2012 to 2016 (figure 7).
Even if real estate is also often presented as a safe haven investment, it is worth investigating the reality of this claim. Indeed, real estate is often presented and discussed as a whole in the mainstream media. However, the term covers various asset classes that each follow their own logic and rationale. The current crisis is revealing the risks associated with these various assets, and now is therefore an ideal time to discuss this notion.
In this article, we will examine the real estate market from two different perspectives:
A. First, we will analyse direct ownership of a real estate asset. We will note that the French prefer investing in real estate directly, as it is a reassuring asset class with an unrivalled (and even improving) balance between risk and reward over the long term.
B. We will then focus on mechanisms for indirect investment in real estate, such as shares in different types of real estate companies, developers and REITs1, with a special focus on this last type. These companies (whether listed or not) tend to outperform the market in the long term, but they are more risky than direct ownership of the underlying asset. This usually results from the high levels of debt that these companies often have, as well as the highly specific nature of their assets (housing, shopping centres, offices, warehouses, etc.).
As a conclusion, we will touch upon the impact of the current health crisis on the French real estate market. The market has slowed down considerably, with multiple risks weighing down on both real estate assets and stocks.
1. INVESTING IN BRICKS AND MORTAR, AN UNBEATABLE RISK–REWARD RATIO OVER THE PAST 30 YEARS
A. Real estate, an asset class appreciated by investors (and individuals)…
The French are fond of investing in real estate: for example, 65% of them owned their main residence in 2018, compared with 52% in Germany2. This proportion grew continuously between 1980 and 2010 and has since remained flat at this level. Further, real estate represents approximately 61% of the wealth of an average French household3.
Although the price of real estate can vary depending on different parameters (location, interest rate, economic context, demographics, etc.), several reasons can explain the preference of individuals and investors for this class of asset:
• Real estate assets are tangible, physical, material.
• They benefit from (i) a primary use value in the sense that they fulfil certain fundamental needs (for individuals as much as for businesses) and (ii) a high exchange value. These assets can easily be rented or sold, which makes them liquid in a market and therefore easily transferable. This represents a level of security for the owner, who is able to get rid of the asset quickly and relatively easily.
• Real estate lasts over time, with a longer operating cycle and investment horizon than the majority of other assets: long construction periods (a minimum of 18 months) and long occupation periods (French commercial leases run for 3, 6, 9 or 12 years), etc.
• Real estate assets are able to generate stable revenues over time.
For these reasons, the price of real estate does not correlate particularly highly with other asset classes such as shares or bonds. These differences make investing directly in real estate an attractive option when pursuing a strategy to diversify an investment portfolio. But beyond pure diversification purposes, investing directly in real estate in France can help investors to optimise their risk–reward ratio in their portfolios.
B. …and an unbeatable risk–reward ratio over the past 30 years
The return on a real estate asset can be received in two ways:
• Rent received (or saved for an individual in his or her main residence) from the letting of the asset
• The variation of the value of the real estate over time.
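Combining these two components gives the simple holding-period return underlying the analysis that follows. A minimal sketch in Python (the function name and all figures are illustrative, not taken from the IEIF data):

```python
def total_return(entry_price, exit_price, rents):
    """Holding-period return on a property: capital gain plus rent
    received (or saved), relative to the purchase price."""
    return (exit_price - entry_price + sum(rents)) / entry_price

# Hypothetical flat bought for 200,000, sold for 230,000 after five
# years, with 8,000 of rent received (or saved) each year.
r = total_return(200_000, 230_000, [8_000] * 5)
# (30,000 capital gain + 40,000 of rent) / 200,000 = 0.35 over the period
```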
Based on data published by the Institut de l’Epargne Immobilière et Foncière (IEIF), over a period of 30 years, the return on a real estate asset has equalled that of the stock market but with a lower level of volatility (and therefore lower risk)4.
Figure 1: Average risk–reward ratio over 30 years (1988–2018) in France for different classes of asset
Source: IEIF (Institut de l’Epargne Immobilière et Foncière)
Figure 1 notes:
The return calculation is based on an entry price, an exit price and intermediate flows for unlisted assets and on annual performance with revenues reinvested for listed assets.
*Livret A corresponds to a type of instant savings account.
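The risk–reward comparison in Figure 1 boils down to plotting average annual return against volatility (the standard deviation of annual returns). A sketch of that calculation, using made-up return series rather than the IEIF data:

```python
import statistics

def risk_reward(annual_returns):
    """Return (average annual return, volatility) for a series of
    yearly total returns expressed as decimals (0.05 = 5%)."""
    return statistics.mean(annual_returns), statistics.pstdev(annual_returns)

# Hypothetical yearly total returns over the same period.
real_estate = [0.06, 0.04, 0.07, 0.05, 0.08]
stocks = [0.20, -0.10, 0.15, -0.05, 0.10]

# Both series average 6% per year, but the stock series is roughly
# eight times more volatile: the same reward for a much higher risk.
print(risk_reward(real_estate))
print(risk_reward(stocks))
```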
If we look at the development of the risk–reward ratio in the shorter term (five years in the figure below), we can see that real estate globally outperforms other classes of asset.
Figure 2: Average risk–reward ratio over five years (2013–2018) in France for different classes of asset
Over the period 2013–2018, a period of more limited volatility in the stock market, the return on real estate assets remains high for a limited risk. The stability in this performance can be explained by several factors, including:
• the scarcity effect of real estate in certain areas (land availability, lack of new housing construction, etc.);
• the appetite of the French for this asset class;
• strict financing conditions required by lending banks in France, based on borrower solvency criteria rather than on real estate value alone. Such a mechanism limits downward trends in case of economic downturns;
• recent easier access to credit and a downward trend in interest rates.
The risk–reward ratio of real estate assets is therefore more attractive than that of other classes of asset, whatever their level of risk. It should be noted that unlisted real estate funds (SCPI, OPCI) perform in line with direct ownership. Only gold (which generates neither rent nor dividends!) comes close to the return profile of a real estate asset, but it suffers from a less favourable volatility profile due to its significant price sensitivity to economic tensions, with movements further amplified by leveraged financial products.
C. Transaction volumes historically affected by crises but proving resilient
The characteristics of the real estate market have pushed French investors to invest heavily in this asset class.
Figure 3: Number of transactions and year-on-year change in prices of second-hand housing in France from Q1 2000 to Q3 2019
Source: CGEDD according to DGFiP (MEDOC) and notary databases
The number of conveyancing transactions for homes in France remained globally constant until the subprime crisis; it even grew between 2001 and 2003 despite the stock market crash following the bursting of the dot-com bubble and the 9/11 attacks in 2001.
The volume of transactions fell by approximately 30% between 2007 and 2008, as access to credit was restricted in this period. This fall in the number of transactions led to a decrease in the average price of real estate in France of 9% between Q1 2008 and Q1 2009.
The real estate market started to recover from 2009, with the massive quantitative easing programmes instigated by central banks and updated conditions for access to credit. Between spring 2009 and summer 2011, real estate prices in France grew by 12%5, and the cumulative volume of transactions over 12 months returned to its level from before the crisis, all this despite the ongoing eurozone crisis.
This level of transaction volumes continued growing significantly over the period, after a low point in 2012. This low point mostly derived from a slowdown in investment decisions made by individuals because of the economic uncertainty partially generated by the sovereign debt crisis in the eurozone.
Thus, we can see that over a long period the volume of transactions for residences can sometimes be affected by economic crises, but it tends to recover quickly. In parallel, the value of the underlying assets is resilient in times of crisis, notably because of the reluctance of real estate owners to reduce selling prices.
Direct ownership of real estate can therefore be considered a safe haven investment due to its lower volatility compared with the stock market. Its past performance can even be considered a paradox, a golden era of sorts: its return on investment has equalled that of stocks over a 30-year period and even exceeded it over a 5-year period. Such surprising performance can be explained by several factors, such as falling interest rates or the scarcity effect. In the next part, we will review indirect real estate investment through ownership of REIT shares.
2. LISTED REAL ESTATE PLAYERS: FROM DEFENSIVE TO VOLATILE VALUES
A. Two large categories of shares for two different roles: developers and REITs
As a reminder, the real estate industry can be split into two main types of players:
• Developers, who build and then sell properties
• REITs, who invest in and then manage (or outsource management of) properties.
These two categories for the most part comprise listed international players operating in several countries – real estate provides investment opportunities all around the world. The largest market is in the USA, with over 220 listed real estate funds and numerous significant developers. By comparison, Europe has around 30 listed funds.
The IEIF Real Estate France index provides a view of the market performance of real estate players in France, composed of 13 REITs and five real estate developers listed in France (cf. appendices for breakdown). Over the period from 1991 to 2020, companies making up the index developed significantly, with for example several major mergers and acquisitions (Unibail and Rodamco in 2007, then Westfield in 2017, Klépierre and Corio in 2015). Furthermore, several French REITs are amongst the leaders in Europe.
B. Real estate stocks: from defensive to volatile values
As shown in the figure below, the IEIF index has outperformed the CAC 40 since the dot-com bubble burst in 2001–2003, with a rapid recovery following the subprime crisis of 2008–2009.
Figure 4: IEIF Real Estate France index vs CAC 40 until 31 December 2019 (base 100 as at 31/12/1990)
Source: IEIF, CapitalIQ
Figure 4 notes:
As REITs distribute a significant level of dividends, the indices have been given with the reinvestment of gross coupons.
In September 2000, the CAC 40 reached a significant peak of 6,944 points (excluding reinvested dividends), the culmination of a period of strong growth since the middle of the 1990s. The bursting of the dot-com bubble followed, then the 9/11 attacks in 2001, leading to an almost 65% decrease in the value of the CAC 40 in two and a half years. During this crisis – and even though the fall of the CAC 40 was accentuated by the weight of banks in it – REIT share values, in contrast, continued to climb (even faster than gold). Over this period, investments in listed real estate companies acted like safe haven investments, with prices uncorrelated to market developments.
Conversely, after the dot-com crisis and the 9/11 attacks, namely in around 2003, REIT market values correlated much more closely with the development of the rest of the market. This change in the behaviour of REIT share values results from a certain number of structural factors detailed below.
Risk–reward ratio of real estate stocks vs direct ownership of real estate assets
The figures below consider the REITs included in the IEIF Real Estate France index, in the analysis of risk–reward presented previously6.
Figure 5: Risk–reward ratio over 15 years in France – REITs vs other classes of asset, 2003–2018
Source: IEIF (Institut de l’Epargne Immobilière et Foncière), Accuracy analysis
Figure 5 notes:
The periods 1991–2003 and 2003–2018 are presented for the stock exchange index and REITs.
As previously observed, between 1991 and 2003, REIT shares constituted low-risk assets, indeed exhibiting a reward profile below that of the market. Conversely, the period 2003–2018 shows a change in behaviour for these assets, which became more volatile with a greater reward profile.
Further, we can observe the difference in volatility between the real estate assets (direct ownership of real estate assets) and the shares of the real estate companies. This can mainly be explained by the level of financial leverage of these companies (figure 10).
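A stylised illustration of this leverage effect, assuming debt stays fixed while asset values move (all figures are hypothetical):

```python
def equity_change(asset_value, debt, asset_shock):
    """Percentage change in equity when the portfolio's asset values
    move by asset_shock (e.g. -0.10 for a 10% fall), debt being fixed."""
    equity_before = asset_value - debt
    equity_after = asset_value * (1 + asset_shock) - debt
    return (equity_after - equity_before) / equity_before

# Unlevered: a 10% fall in asset values means a 10% fall in equity.
unlevered = equity_change(100, 0, -0.10)   # -0.10
# With a debt/equity ratio of 1 (close to REIT levels from 2018), the
# same shock is doubled at the equity level, hence more volatile shares.
levered = equity_change(200, 100, -0.10)   # -0.20
```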
Several elements can be put forward to explain the shift in the profile of real estate stocks towards greater volatility:
• Growth of financial leverage: REITs saw their debt ratio increase from 0.6 in 2005 to close to 1 from 20187.
• Underlying real estate prices grew significantly in the 2000s, thereby increasing the value of REIT portfolios and leading to significant growth of their corresponding share prices.
• The dividend distribution rate grew from 0.6% in 2005 to 6.4% in 2018 (see appendix, page 12)8.
• A tax incentive scheme (the SIIC regime, introduced in France in 2003) created structural surplus value in this sector, even if, for several years now, the share market values of REITs have been significantly below their net asset values (mainly based on the gross market values of the underlying assets), notably for retail REITs (e-commerce impact).
• New high-performing niches have developed (e.g. logistics with the growth of e-commerce).
• There was a consolidation trend in the market, with several significant mergers as previously mentioned.
The strong performance of the real estate market also pushed numerous institutional players to increase the percentage of their investment in the sector, or even to create their own real estate funds, such as Axa, Amundi and BNP Paribas. However, when it comes to retail, performance has recently been affected by the expansion of e-commerce, further accelerated by the COVID-19 crisis (see below).
Due to several factors (including higher leverage and dividend yields), the profile of REIT stocks has changed: formerly defensive and resilient to economic cycles, they have now become volatile yield assets.
3. THE COVID-19 CRISIS: A LEAP INTO THE UNKNOWN
We have not seen the like of the current crisis since 1929, either in terms of its nature or in terms of its current and future impact on the economy and the markets. Stock markets globally lost almost a third of their value in less than a month and numerous economies shut down as a result of the lockdown of several billion people across the world.
Listed real estate companies have not been spared: the IEIF Real Estate France index has been affected even more than the CAC 40 since the start of the year, as shown in the figure below.
This is notably the consequence of more volatile stocks (as explained previously), in particular the impact of the previously unforeseen closure of shopping centres. It is, however, too soon to conclude on any kind of development in the valuations of REITs and real estate developers.
Figure 6: IEIF Real Estate France index vs CAC 40, base 100 as at 31/12/2019
Source: IEIF, CapitalIQ
Two major risks are weighing down on the real estate market (both direct and indirect ownership):
• A restricted availability of credit: as was the case during the subprime crisis, banks may restrict access to credit for a certain period, reducing investors’ borrowing capacities or removing them from the credit market altogether.
• Business failures: business failures would lead to an imbalance between supply and demand in the professional real estate market.
As observed in past crises, and provided that conditions for obtaining mortgages remain favourable, the safe haven status of directly owned real estate should make it possible for transactions to pick up again and for asset values to remain stable or suffer only a very limited decrease.
However, market players – both REITs and developers – are exposed to far greater risks.
The lingering risk for REITs in the short and medium term relates mostly to rental risk. As their portfolios are largely composed of offices and retail properties, the health of the economy as a whole will have a direct impact on their level of risk.
Almost all retailers had to close for a certain period of time during the lockdown. The recoverability risk and the level of rent during and after the lockdown period is therefore very significant. This is all the more true as rent (and service charges) represent a very high fixed cost for retailers.
We are thus seeing an increase in the number of requests from retailers for the outright cancellation of rent during shutdown periods and the full variability of rent on the basis of revenue for a period to be defined (generally the time it takes to regain the pre-crisis level of activity).
In the longer term, this trend may end up calling into question the standard long-term commercial lease in France, with the aim (for retailers) of making lessors bear more of any potential operating risk.
As for real estate developers, the risks are more limited at this stage and mostly concern delays in construction, the slowdown in the commercialisation of certain assets, and changes to buyers’ needs.
Indeed, the massive use of remote working during the lockdown and the proof of its success may well lead the main users of office spaces to rethink their use of – and even their need for – their headquarters and other buildings.
Direct ownership of real estate clearly presents the characteristics of a safe haven investment (limited risk, certain return, uncorrelated with economic cycles, etc.); indeed, it has demonstrated this fact historically. Paradoxically, it has also demonstrated historically that it is at least as profitable as stocks for a more advantageous risk profile. We note, however, that these characteristics depend heavily on the development of interest rates and access to credit policies put in place by financial institutions.
Conversely, since approximately 2003, listed developers and REITs have become aggressive stocks that outperform the market but are also more volatile than it. The new risk–reward ratio associated with them can be justified by various parameters: notably, the sometimes high level of leverage of these players, an accommodating dividend distribution policy, the economic performance of the underlying assets and the implementation of tax incentives.
The economic crisis resulting from the COVID-19 health crisis has had a direct impact on the real estate market. In the short term, the hardening of conditions to obtain credit should, in theory, automatically lead to a reduction in transaction volumes, leading to deflationary pressure on prices. In the long term, we are witnessing the calling into question of the use value of certain assets (commercial, office, urban residential real estate, in particular). If this paradigm shift were to become structural, it would be a departure from the long development cycle analysed above and would necessarily result in a decline in financial value.
Figure 7: Price of gold vs CAC 40 from 01/01/1980 to 21/07/2020
Figure 8: IEIF Real Estate France index – Weighting as at 21/07/2020
Figure 9: REITs and real estate companies vs CAC 40, base 100 as at 31/12/2019
The social and environmental issues that we currently face constitute major challenges that raise questions for all of us. When it comes to these issues, we no longer have a choice: we need to find innovative solutions to tackle them.
Today, philanthropists are no longer the only ones addressing social and environmental issues; financial investors are now fully fledged players in this challenge.
Why? Because it has now been proved possible, and not at all contradictory, to combine social and economic performance in an investment. Indeed, key players in the public and private sectors have seized the opportunity offered by impact investments. They see impact investing as a major tool to catalyse the change of scale needed for innovative responses to fundamental issues.
This paper is aimed at providing a first insight as to how stakeholders (regulators, investors and social entrepreneurs) are working together to create a professional ecosystem looking to improve society while generating financial returns.
The next episodes of this series will take a deep dive into the main sectors, the challenges of social impact measurement and the financial innovations related to this market.
A. Impact investing refers to investments with an intentional positive impact on people and/or the environment, taking place via sustainable and profitable initiatives and subject to evolving impact measurement.
B. In 2018, this market was estimated at $502 billion, a market eight times larger in just five years.
C. This growth has been mainly driven by (i) growing social consciousness in our society regarding environmental issues, (ii) new and traditional investors searching for ways to contribute to society in order to complement limited public spending to address these problems alone and (iii) growing interest for responsible and economically sustainable business models.
D. However, this sector is facing lots of new challenges such as how to appropriately measure social impact, the necessity for a legislative framework and the constant need for innovative solutions.
1. THE CURRENT IMPACT INVESTING MARKET
A. Impact investing: much more than just a trend
Impact investing is undoubtedly a concept with various and evolving definitions. The Rockefeller Foundation used the term for the first time in 2007 during a conference organised to evaluate the possibility of developing a type of investment with a social and environmental impact.
The Global Impact Investing Network (GIIN) defines impact investing as an investment that explicitly combines social return and financial return. More concretely, it refers to investments with an intentional positive impact on people and/or the environment, taking place in sustainable and profitable businesses and subject to proactive impact measurement before, during and after the investment (greenhouse gas emissions avoided, number of jobs created, etc.). This type of investment aims to break the usual practices of financial players who used to either give money away without any expectation of financial return (“doing good”) or invest money as well as possible to maximise financial returns (“doing well”). Impact investing is therefore doing good and doing well at the same time.
It is essential to disassociate impact investing from socially responsible investments. The purpose of the latter is to exclude from investment portfolios certain companies that may be less virtuous than others or to choose, in each sector, the “best in class” according to extra-financial criteria. Unlike socially responsible investments, impact investing concerns companies where investors prioritise not so much “return” as “meaning”. It is necessary to understand the differences between those companies that claim to be “green” because they apply environmental, social and governance filters when they screen financial deals and those companies whose true ambition is to create positive social and environmental impacts.
Spectrum of business based on social impact and financial return
Source: “Financial performance of impact investment” market study 2017-2018 and Accuracy analysis
B. A growing $502 billion financial market
The impact investment market has grown significantly in the last decade. In terms of assets under management, GIIN estimated a market of $60 billion in this sector in 20141. This figure increased to $228 billion in 20172 and $502 billion3 in 2018, a market eight times larger in just five years.
The graph below highlights the diversification of the investments in this sector. Impact investors have different asset allocation strategies, investing across geographies, sectors and types of instruments.
Assets under management (“AUM”) by geography, sector and instrument
Source: GIIN 2019 Annual Impact Investor Survey and Accuracy analysis
Regarding return on investment, contrary to the general belief that impact funds underperform traditional investment funds and do not necessarily seek market returns, the GIIN 2019 Annual Impact Investor Survey shows that 66% of impact funds seek market-rate returns. Moreover, 77% of the funds performed in line with their expected financial return and 14% outperformed expectations4.
In Europe, the UK has pioneered impact investing, benefitting from an innovative and growing market since the 2000s thanks to numerous political and regulatory initiatives. It has been followed by Northern European countries, like Sweden, and Western European countries, like France, the Netherlands and Switzerland.
Expectations vs. performance
Source: GIIN 2019 Annual Impact Investor Survey and Accuracy analysis
C. Consciousness + opportunity + search for meaning: three key growth drivers
The question that now comes to mind is the following: why is this market growing today? Our analysis shows that two worlds are converging. First, the world of finance, which has experienced many setbacks since 2007, is seeking to include ethical considerations in its investments; it is discovering impact investing as a business model capable of generating social wealth with more secure long-term profitability. Second, the world of social entrepreneurs, who are now able to combine profitability and a strong social impact, wishes to free itself of public finance (which is increasingly limited). This convergence has been driven by several factors.
• First, society is now more conscious that economic growth aspirations are outpacing global resource supplies. Currently, we consume 1.7 planets’ worth of resources5 and, consequently, we are facing challenges such as climate change, resource depletion, biodiversity loss, consumption patterns, pollution and waste management, supply chain fairness and widening income inequality. The United Nations explained that significant resources are required to address these challenges. The Sustainable Development Goals (SDG) gap provides opportunities for private capital to supplement public funds.
• Second, as the United Nations has shown, social public spending is no longer growing as fast as necessary. Charitable donations and public aid are no longer sufficient to tackle global social issues.
• Third, world leading investment firms are increasingly attracted to responsible business models and socially responsible returns. Not only does this add another layer of motivation to their staff but it also provides new business opportunities. These players are able to bring an outstanding talent pool to assist in the development of new opportunities with a long-term view on sustainability and social impact. The development of a more socially responsible form of finance is also a response to the excesses of the financial sector highlighted by the 2008 crisis.
Estimated capital requirement – Potential private sector contribution (USD trillion)
Source: World Investment Report United Nations Conference on trade and development 2014 and Accuracy analysis
2. THE FUTURE OF THIS GROWING MARKET
A. Innovate, finance and support – the new players in the game
The growth of this sector has created a new innovative and large ecosystem that includes more and more players and solutions. These new players are helping to put in place the tools that promote a more efficient functioning of this market. This new ecosystem is mainly composed of entrepreneurs, investors and support structures.
Previously considered as a “niche” for a few specific social investors, impact investing is now a new market for “traditional” investment funds. For example, a leading global investment firm announced in February 2020 the closing of its Global Impact Fund at $1.3 billion. This fund is dedicated to investments in companies with business models aimed at providing solutions to environmental or social challenges. In the same month, another leading investment fund announced its intentions to direct its investments towards companies committed to impact investing.
In addition, the EU has reached an agreement to establish European rules defining sustainable investments. This new framework (due to be in place by 2021) would give a “green” label for investments that cover renewable activities; it would grant lower labels for investments that are not fully renewable but help to reduce CO2 emissions.
Impact Investing ecosystem
Source: Accuracy analysis
B. The main challenges for a better future
What challenges does impact investing face? The first is the impact measure itself. Due to its specific characteristics, financial players must adopt a new logic: the objective is to generate returns on financial investment while obtaining the greatest possible social impact. One of the key challenges for this is to measure social impacts in the same way financial impacts are measured, that is with some form of tangible metric. Several options are currently being explored in order to measure the “Social Return On Investment” (SROI).
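One commonly cited formulation of the SROI is a simple ratio of monetised social outcomes to the sums invested. The sketch below is purely hypothetical: the programme, the monetised outcome value and the invested amount are all invented for illustration, not drawn from any actual SROI methodology cited in this article.

```python
# Minimal, hypothetical illustration of a "Social Return On Investment" ratio.
# All figures are invented for the example.
def sroi(social_value_created: float, investment: float) -> float:
    """SROI = monetised social value created per unit of currency invested."""
    return social_value_created / investment

# e.g. a job-reintegration programme: €1.0m invested,
# with social outcomes monetised at €2.8m
ratio = sroi(2_800_000, 1_000_000)
print(f"SROI: {ratio:.1f}x")  # each euro invested yields €2.80 of social value
```

The difficulty the text points to lies not in this division but in the numerator: agreeing on how to monetise a social outcome in the first place.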
The second challenge relates to the legislative framework and public incentives. The legislative framework should be adapted to allow and enhance the development of impact investing worldwide. One good example of this was introduced in the UK in April 2014 with the Social Investment Tax Relief (SITR), a tax reduction of up to 30% of the investment for individuals who invest in small social enterprises.
The third challenge relates to completing the funding chain of social companies. Like traditional investment funds, the majority of impact funds choose to finance companies either in the growth stage or in the mature stage. Indeed, mature companies represent approximately 55% of assets under management and companies in the growth stage represent 34%, but companies in the venture and seed stages represent only 9% and 3% respectively7. Therefore, the ecosystem needs to complete the social entrepreneurship funding chain so that appropriate funds are available at all the key development stages of a social enterprise.
The fourth challenge concerns the financial world’s need to keep innovating in order to generate liquidity for impact investment players. One example is Social Impact Bonds (or “Pay for Success” bonds), which aim to raise private capital to finance public social actions. Social Impact Bonds fund activities which, if not carried out, would result in future costs for the public authorities. The first Social Impact Bonds were launched in the UK, with the goal of providing re-entry services to prisoners leaving prison. The bond is remunerated on the basis of a fixed and clear objective: to reduce reoffending by 10%. If the objective is achieved, the public authorities repay investors their capital, at a rate computed based on the savings that the bonds generate for the public authorities. If the objective is not achieved, the private investors lose their investment.
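The all-or-nothing payoff logic of such a bond can be sketched in a few lines. The principal, the rate and the outcome figures below are hypothetical illustrations, not the actual terms of the UK pilot:

```python
# Hedged sketch of a Social Impact Bond payoff: investors are repaid out of
# public savings only if the social target is met. All figures are hypothetical.
def sib_payout(principal: float, reoffending_drop: float,
               target: float = 0.10, rate: float = 0.05) -> float:
    """Return capital plus remuneration if the target is reached,
    otherwise investors lose their capital entirely."""
    if reoffending_drop >= target:
        return principal * (1 + rate)
    return 0.0

print(sib_payout(1_000_000, 0.12))  # target met: capital plus return
print(sib_payout(1_000_000, 0.06))  # target missed: investors lose everything
```

The binary structure is what transfers the delivery risk of the social programme from the public authority to the private investor.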
All stakeholders in this innovative ecosystem have a role to play in promoting this new dynamic investment market. Financial intermediaries such as banks, financial players, financial and strategic advisors and accounting firms must also better understand how impact investing can become a vector of progress.
Companies focused on impact investing are clearly here to stay. They aim to have an impact on our society and at the same time earn a return for their investors. However, the road ahead is long and there are many challenges to establish a common framework to enhance this blooming ecosystem. The current pandemic will increase social issues. It will also force countries to rethink and redesign how we define the economy and how we assess value. Now is the time to capitalise on this sector in order to build markets that ensure sustainable growth for all. Now is the time to do “well” by doing “good”!
1 The Impact Investor Survey – Global Impact Investing Network and J.P. Morgan – May 2015
2 Annual Impact Investor Survey – Global Impact Investing Network – 2018
3 Sizing the Impact Investing Market – Global Impact Investing Network – April 2019
4 Annual Impact Investor Survey 2019 – Global Impact Investing Network – 2019
5 Ecological Footprint – The Global Footprint Network
6 Financed by public and private investment.
Accuracy advised Protect Medical, the holding company set up by Borromin Capital Fund IV in June 2019 to acquire Söhngen Group (Söhngen), on the acquisition of Spencer Italia S.r.l. (Spencer) on 17th April 2020. With this, Protect Medical is following its strategy to build a leading European first aid and EMS (Emergency Medical Services) full-service provider through organic growth and acquisition.
Great Place To Work® has announced the results for the 2020 Best Workplaces in France awards. For the 12th year in a row, Accuracy is among the top-ranked participants, taking fourth place in the 50–500 employees category.
Accuracy conducted buy-side due diligence for La Caisse de Dépôts et Consignations in the context of the acquisition of investment stakes held by l’Agence des Participations de l’Etat and La Banque Postale in the Société de Financement Local (SFIL), ex-Dexia Crédit Local.
How do business support structures enable value creation in France?
Business support structures looking for a new role and new models
At a time when innovation is increasingly becoming the driving force behind all economies, start-ups find themselves on the front line thanks to their simple and agile structures that enable them to venture into the most promising sectors.
However, innovation cannot develop successfully without a complete and adapted ecosystem, bringing together all the players that must interact and join forces to bring innovative projects to life (organisations, companies, start-ups, universities, investors).
This article details the dynamics that are forging the innovation ecosystem in Morocco, whether related to national strategies or private initiatives. Our mapping of support structures makes it possible to assess strategic issues in particular, for any country looking to take full advantage of its potential for talent and entrepreneurship.
This is key as innovation constitutes a critical lever for the economic growth and development of a country.
A. Support structures for innovative start-ups have multiplied in Morocco thanks to political and private initiatives.
B. Nonetheless, their presence in Morocco remains uneven and concentrated in the Casablanca region.
C. Large Moroccan companies are progressively contributing to the innovation ecosystem and are starting to use open innovation as a value creation lever.
D. Moroccan innovation, measured in terms of patent applications and start-up fundraising, is still not achieving its full potential.
1. INNOVATION SUPPORT STRUCTURES IN MOROCCO
WHAT DOES THE CURRENT INNOVATION LANDSCAPE IN MOROCCO LOOK LIKE?
A. A growing number of structures
Support structures are composed of both physical and non-physical structures. They include (i) incubators and accelerators, (ii) co-working spaces, (iii) support programmes and (iv) financing programmes. Based on our research, there are 74 active and planned support structures in the country.
Among them, Technopark was the pioneer and now constitutes a textbook case. Created in 2001 as the fruit of a public–private partnership, Technopark is managed by the MITC (Moroccan Information Technopark Company), whose founding shareholders are the Moroccan state (35%), the Caisse de Dépôt et de Gestion (17.5%) and Moroccan banks (47.5%). The MITC offers work spaces and supports start-ups by allowing them to benefit from its privileged ecosystem. The model was duplicated in Rabat in 2012 and Tangier in 2015 and will soon be duplicated in Agadir (opening planned in 2021). Technopark has supported over 1,100 companies since its creation, particularly in the information and communication technology (ICT), green technology and cultural industry sectors.
It is well known that start-ups require financial support and specialised assistance. But the need to integrate them into a community to interact and exchange ideas is just as vital. Indeed, a start-up community represents a rich and diverse source of collective intelligence, which enables start-ups to discuss ideas in co-working spaces, exchange best practices and build a network to develop.
New structures have developed in Morocco with this very thinking in mind, offering support, training and mentoring services. These support structures organise different events like hackathons, where various teams (composed of developers and project leaders) must find the solution to a strategic issue by producing a proof of concept (in general, software or an application) in a very short space of time. In December 2019, Emerging Business Factory organised the first ‘water hackathon’ in Marrakech with the aim of making water use in the area sustainable and eco-responsible.
Other support structures have implemented co-working spaces for all those who wish to launch their entrepreneurial projects and are looking for a community of partners. One such example is New Work Lab, created in Casablanca in 2013. It is a space dedicated to the development of Moroccan start-ups through the organisation of meetings, training and the provision of a co-working space.
Mapping of start-up support structures
B. An uneven geographic split
Although support structures are concentrated primarily in and around Casablanca, regional dynamics resulting from a strong political will take shape through:
• the duplication of Technopark in other cities in the country;
• regional development projects, such as the innovation city of the Souss-Massa region, which plans to make R&D laboratories available to start-ups, or Mazagan’s urban hub, developed by the OCP and the government;
• support mechanisms on a national scale, with for example the Réseau Entreprendre Maroc and Injaz Al-Maghrib, which support start-ups, or even the financing programme Fonds Innov Invest.
However, the support offered in some large Moroccan cities, like Fes and Meknes for example, is far below the needs of their large student populations.
At the start of the school year in 2017, the Euromed University of Fes (UEMF) had over 1,300 students and researchers2, suggesting a potential talent and entrepreneur pool that should not be neglected.
C. Mostly generalist structures supported by a wide range of sponsors
Though the vast majority (75.7%) of support structures are generalist, three specialisations stand out:
• ICT, in particular thanks to the rise of fintechs working with corporates (e.g. StartOn, Fintech Challenge);
• green technology, with Morocco having set itself the target of reducing its energy dependence and investing in renewable energies (e.g. Social Green Tech Bidaya);
• the social and solidarity economy, relying on, for example, sport to create a link between youth employment and the entrepreneurial spirit (e.g. TIBU Maroc).
It is interesting to note that the sponsors of support structures are diverse: 57% of support structures are backed by at least two organisations (assistance, financial support, etc.). Further, 32% of these structures come from public-private partnerships. Entrepreneurial support initiatives thus form part of a collective intelligence approach, a pooling of resources between complementary players – in short, open innovation.
2. THE GROWING INVOLVEMENT OF LARGE COMPANIES
HOW ARE LARGE MOROCCAN COMPANIES TAKING HOLD OF INNOVATION?
A. OCP: a heavyweight in the national economy and a global innovation model
Moroccan companies are gradually incorporating open innovation and digitalisation into their organisations in addition to increasing their employees’ awareness of innovation culture. A good example of this can be found in the OCP group.
OCP is the world leader in phosphates and the leading industrial company in Morocco. It has put in place an ambitious investment programme (2008-2027), where it aims to double its mining capacity and triple its transformation capacity.
However, it is of particular interest for its efforts to boost innovation, having initiated several projects to that end both in Morocco and within the group. In addition to physical support structures, numerous programmes have been implemented, such as the Seedstars Startup Competition or the Impulse acceleration programme in partnership with MassChallenge, as detailed below.
The university environment gives us access to innumerable research centres across the world and open innovation […]
When we are at the university, we are able to have a different type of dialogue, one that is much more productive
Mohamed Soual, chief economist at OCP.
B. The growing involvement of Moroccan banks
Moroccan banks are not to be outdone in this matter. Attijariwafa Bank and BMCE Bank of Africa were pioneers in 2001 by financing Technopark Casablanca. They have been highly active in the promotion of innovation over the past five years.
Moroccan banks’ initiatives in favour of entrepreneurs have naturally led them to turn their innovation approach inwards to improve their own processes and offers in the context of increasing digitalisation. But the delegation of support management to a pure player is often essential in order to facilitate cooperation and maximise value creation between stakeholders. This is particularly the case when the stakeholders have very different cultures, notably when it comes to public-private partnerships.
Though Morocco’s start-up ecosystem has been strengthened by the launch of various support and financing mechanisms (as detailed below), the number of innovative technology companies in the country measured in terms of patents and fundraising is not meeting its potential, as detailed in the following pages.
Management of support structures created by Moroccan companies on the Moroccan All Shares Index (MASI)
3. INNOVATION AND ITS FINANCING IN MOROCCO
HOW HAS INNOVATION DEVELOPED IN MOROCCO OVER THE PAST FIFTEEN YEARS?
A. Successive industrialisation strategies have contributed to raising the general level of innovation
Comprehensively evaluating the innovative nature of a country requires consideration of its institutional environment, infrastructure, training, R&D, and market structure and creation. The Global Innovation Index 2019 ranks Morocco 74th out of 129 countries based on 80 variables ranging from ease of obtaining credit to the protection of minority interests in a company. This index also distinguishes between input variables that define a country’s potential for innovation and output variables that measure effective innovation.
Our analyses here focus on the two output criteria that seemed the most tangible: research dynamism, measured through the number of patent filing requests (industry driver), and fundraising for technology and digital start-ups, which testifies to the potential for economic development. Reviewing the development of these variables against the backdrop of successive industrialisation plans implemented by the Ministry of Industry, Trade, and the Green and Digital Economy since the mid-2000s highlights a correlation. As shown in the figure below, thanks to industrial policies, as well as the country’s stability and closeness to the European Union, Morocco has become a top destination for foreign investors.
Since 2005, three major industrial strategies have succeeded each other, with a substantial effect on the development of the number of patents filed. Nevertheless, these effects seem to differ based on the nature of the players considered. Indeed, patents filed by non-residents tripled between 2014 and 2018, whilst those filed by Moroccan residents almost halved over the same period.
The dynamism of national research seems to be losing momentum and remains mostly the domain of universities (58% in 2018), with Moroccan companies filing only 9% of patent applications.
At the same time, the significant growth in filings from abroad testifies to the increased appeal of the country. This can be explained by two factors. First, the presence of foreign actors in Morocco has intensified across various sectors, like the automotive or aeronautics sectors, following the Industrial Acceleration Plan. Second, the implementation of a new way of filing patents by the European Patent Office, thanks to a partnership with the Ministry of Industry, Trade, and the Green and Digital Economy in 2015, enables those filing patents in the European Union to also request patent protection in Morocco. Thus, the USA (20%) and European countries, with France and Germany in the lead (8% each), are the most represented among the countries of origin of those filing patents.
Patent filing requests in Morocco (2005–2018) and industrialisation strategies
Note: We have analysed the development of the number of patent applications because of the ease of access to this data. Though this makes it possible to have a vision of the effects of the successive industrial policies, it does not make it possible to assess the entirety of their effect
B. But the funds raised by start-ups remain modest compared with other countries in the region
Fundraising constitutes another indicator of the dynamism of the innovation sector. Though it is difficult to establish a causal link between fundraising and the practice of filing patents, these two phenomena constitute complementary indicators of the dynamism of innovation in the countries concerned.
The spread of fundraising practices reflects, in particular, the growing participation of national and international private players in innovation financing. However, the small amounts concerned tend to show the predominance of public capital in innovation financing or indeed the internalisation of innovation by existing companies.
In terms of fundraising, Morocco places 12th in Africa in 2019 with USD 7 million raised by technology and digital start-ups (vs USD 3 million in 2018, corresponding to 15th place)3. We have gathered data to compare the situations in Algeria, Tunisia, Nigeria, Kenya and Egypt with that of Morocco. The differences can be explained by various factors such as access to financing, financing raised in other countries or the use of alternative means of financing. In particular, we have put the amounts raised in perspective by setting them against the respective GDPs of each country. Finally, for all these countries except Algeria (for which we do not have enough data), we have studied the development of the amounts raised between 2018 and 2019.
Generally speaking, we note an increase in fundraising over these two years: Morocco, Kenya, Nigeria, Egypt and Tunisia all experienced a significant increase in the amounts collected. As for the amounts themselves, Kenya and Nigeria stand out clearly from the other countries. The case of Nigeria may be explained simply by the size of the country’s economy (USD 368 billion in 2018); the case of Kenya, however, is different (USD 87 billion in 2018). Indeed, Kenya proves to be fertile ground for the development of start-ups. Widely distributed low-cost internet and the digitalisation of payments in 2007 with the launch of M-Pesa, a mobile phone money-transfer system, have greatly facilitated transactions and have been a boon to entrepreneurs in the country.
By way of comparison, the amounts raised in Morocco seem low in relation to the country’s GDP (USD 120 billion in 2018). Beyond the difference in the size of the economies considered, there may be various reasons for this result: fewer private initiatives due to an economy structured around rent-based activities or in low-risk sectors, insufficient tax incentives for both entrepreneurs and investors or even the less prevalent practice of fundraising.
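The normalisation described above can be sketched in a few lines. Only the Moroccan amount (USD 7 million) and the GDP figures come from the text; the Nigerian and Kenyan fundraising amounts below are assumptions inserted purely for illustration:

```python
# Fundraising normalised by GDP. Morocco's USD 7m and all GDP figures are
# taken from the text; the Nigerian and Kenyan amounts are assumed.
data = {                       # country: (funds raised, USD m; GDP, USD bn)
    "Morocco": (7, 120),
    "Nigeria": (747, 368),     # assumed for illustration
    "Kenya": (564, 87),        # assumed for illustration
}
for country, (raised_m, gdp_bn) in data.items():
    share = raised_m / (gdp_bn * 1_000)   # both sides expressed in USD m
    print(f"{country}: {share:.4%} of GDP")
```

Even under these illustrative assumptions, the ordering matches the text’s point: relative to the size of its economy, Morocco captures far less start-up funding than Kenya.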
The lower access to fundraising in Morocco compared with other African countries (Egypt, Nigeria, Kenya, Rwanda, etc.) can also be explained by language. In contrast to francophone Morocco, the other countries mentioned are anglophone, meaning they are more easily able to capture foreign capital (particularly from the US and the UK).
These figures can be used on a macroeconomic scale to measure trends, such as the opening of certain economies to foreign capital, but they also reveal the level of appropriation of certain best practices by local players. In this context, public policies can play a facilitating role in a ‘top–down’ approach. Nevertheless, local realities should not be ignored. Indeed, beyond public policies, it is the players’ actions and the quality of their interactions that enable them to jointly create innovative programmes and determine the dynamism of a sector. OCP, as mentioned in part 2, is a prime example of this and highlights the importance of involving all players in the implementation of an innovation ecosystem.
Thus, using best practices inspired by foreign countries could strengthen local ecosystems. These measures would be of such a nature as to enable the realisation of the innovation potential of a country like Morocco by promoting the rise of start-ups.
Development of fundraising between 2018 and 2019
Source: Partech (fundraising) and World Bank (GDP)
1 Ratio of the number of support structures in the region to the total number of support structures in Morocco
2 L’économiste.com, Edition n°5032, 2017, ‘Fès-UEMF : Une université à la fine pointe de la technologie’
3 Partech, 2019 Africa Tech Venture Capital report, page 13 (released in January 2020)
Note on methodology: Partech, a venture capital fund, publishes the ranking of fundraising rounds in Africa annually. The start-ups taken into consideration must fulfil the following criteria: (i) the start-ups are tech and/or digital, (ii) their market is Africa (in both operations and revenues) and (iii) the funds raised exceed USD 200,000.
Congratulations to our client Gimv for the acquisition of Köberl Group! The group is one of the leading full-service providers of facility management and technical building services in the southern German market. Accuracy provided financial due diligence services and SPA advice.
The 2020 results of banks in France largely support the long-term trends in retail banking in the country.
With this in mind, it should prove interesting to analyse the figures over the past five years to evaluate the impact of the disruptions at work. This will help to understand the marked decrease in the relative weight of retail banking in the results of the six largest French banks, whether it makes up a significant proportion of their income (Mutualistes, Banque Postale) or a much smaller one (BNP Paribas, Société Générale).
The fall in revenues of around 1 % per annum for all banks combined is the main driver of these developments. Within net banking income (NBI), it is of course the interest margin that is declining, partly because of low interest rates and partly because of commercial pressures.
This decrease in margin translates into a much steeper fall in gross operating profit in retail banking, down 5 % per annum since 2014. As the cost of risk has decreased significantly, however, the fall in net profit has been limited.
In addition to this macroeconomic context, French retail banks are suffering from the specificities and practices of the French market.
Mortgages, for example, as the “harpoon” product of the customer relationship, have always generated particularly low margins for banks in France. This proved to be highly detrimental during the waves of early repayments from 2015 to 2018 and ended up costing the system several billion euros in NBI for individual gains in market share that were practically non-existent.
To compensate, the banks have all sought to expand their outstanding amounts: those linked to mortgages have increased by 28 % since 2014, from €833 billion to €1.071 trillion. Questioning the profitability of this choice is all the more pertinent given that French banks often use brokers for mortgages (40 % of volumes), despite having some of the densest branch networks in Europe.
The other major supply of credit, consumer credit, generates structurally higher margins. The outstanding amounts are approximately five times smaller than for mortgages (€188 billion in September 2019), and the market is dominated by the specialised entities of BNP Paribas, Crédit Agricole and Crédit Mutuel (80 % market share between them).
But consumer credit has seen growth rates of 3 % per annum since 2014, and it forms a major part of the strategic plans of all banks. It regularly sees innovative new products, such as the recent split payments innovation, and competition is expected to grow in this business area in the years to come.
In terms of savings, the two regulated products, Livret A and PEL, which are specific to the French market (but correspond approximately to instant savings and home ownership savings accounts), represent over €540 billion in savings at the end of September 2019. The rates that they offer, however, are more of a hindrance than a help to French banks in the current economic context.
Indeed, home ownership savings (PEL) represent a specific difficulty for retail banks, given their even higher interest rate. The rate has fallen from 2.5 % to 1 % since 2014, but this has not stopped the outstanding amounts from climbing by €60 billion over the same period, putting a further strain on the NBI of the banks that collect them, which pay an average rate of 2.65 %. Here again, the banks do not all follow the same policy.
Therefore, whilst the six main French banks suffer in varying degrees when it comes to retail banking, they do not all have the same strategies. Those banks with the strongest networks may be able to choose between products and volumes, but those with less extensive networks must further diversify and more closely monitor the profitability of each activity.
The symptoms may vary, but for banks to get back their profitability, the remedies that they must employ are probably the same: continue to better segment and personalise offers, and invest so as not to be left behind by neobanks in terms of customer experience. They might also choose a different way of functioning for mortgages. This already seems to be the case since the beginning of the year.
If innovation is a strategic issue both on a ‘macro’ scale in terms of national economies and on a ‘micro’ scale in terms of the businesses involved, then so is its financing. This financing relies heavily on business support structures.
In France, the first support structures aimed to provide an outlet for public research, but the rise of private structures has come hand in hand with a growing awareness of the need for profitability. As a result, though the number of business support structures continues growing, it is no longer uncommon to see some placed under compulsory liquidation (Ekito, 33 Entrepreneurs) or required to change course (Numa, Usine JO).
A viable business model is difficult to achieve when providing services only aimed at start-ups. Fundraising has therefore become an entirely separate source of revenue for business support structures, in a context of heightened competition between them. Moreover, a structure’s ability to support fundraising – measured by the number of rounds undertaken as well as by the amounts raised – has become an indicator of performance, sometimes somewhat reductively.
At the same time, the way in which traditional fundraising works is being called into question by the rise of new mechanisms, such as crowdfunding or the use of blockchain and, more generally, by significant societal developments. It is important to assess both the potential and the limits of these new tools and means of financing, as well as the perspectives that they open up to financing players, first among which are the business support structures.
A. Business support structures play an important part in the financing of innovation in France.
B. Different support structure models offer different approaches when supporting fundraising.
C. Although continuously growing, traditional fundraising measures come with a number of significant limitations for start-ups.
D. On a global scale, the Initial Coin Offering (ICO) phenomenon represents an attempt to reinvent fundraising, calling into question in particular the role of traditional players such as investment funds and business support structures.
E. However, taking advantage of the limitations of ICOs, new practices are already coming to the fore; they present new opportunities for innovation players in France.
1. INNOVATION FINANCING IN FRANCE: THE ROLE OF BUSINESS SUPPORT STRUCTURES
A. The substantial growth of venture capital activities in recent years has considerably strengthened the role of business support structures
Venture capital activities – the financing of risky companies with strong potential – have grown significantly in France since 2015 (almost 30% per annum on average for fundraising over the period 2015–2019). In 2019, over €5 billion was invested in these types of operation.
This phenomenon should be viewed against the backdrop of the growing number of start-ups over the past decade, which has brought with it the development of the business support structure market. These business support structures have also seen their number skyrocket, and over 700 communes in France have at least one.
Business support structures are generally the first to assist companies in their fundraising. Indeed, 74% of total venture capital investments in France are used to finance incubated start-ups1. Further, the incubation of a start-up by a business support structure maximises its chances of succeeding in its fundraising. This occurs because (i) the fact of being incubated sends a sign to potential investors and (ii) the business support structure makes it easier to connect young businesses with investors, in addition to providing them with the knowledge necessary to undertake such operations.
Start-up fundraising in France by year [€bn]
Sources: EY 2019 barometer of venture capital, Accuracy analyses
B. Essential support in advance of and throughout the fundraising process
Beyond purely belonging to an ecosystem – a source of value in itself for all stakeholders – there are three major benefits when an innovative project works with a business support structure: (i) administrative and financial support, (ii) human support necessary for the operational and strategic structuring of the company and (iii) the means to measure market response. This last benefit can help when reflecting on the business model, defining customer segments, undertaking targeted surveys or even developing a user-centred approach along the lines of design thinking. All these actions make it possible to gain traction commercially and to make progress towards a proof of concept. This is essential for the initial fundraising round to succeed and to open access to further private financing at a later stage. The start-ups that succeed in their fundraising are, in a way, privileged, and the role of business support structures is fundamental well in advance of this stage. The diagram below, which displays the resources available for innovative businesses depending on their level of maturity, clearly shows the stakes surrounding support provided in advance. It shows in particular ‘Death Valley’, a tricky step where many start-ups fall and unfortunately do not get back up.
Whilst financing mechanisms for innovation are available in the technological and economic maturation phases, difficulties arise during the proof of concept phase and at the commercial launch. This is the moment when the start-up generally needs additional financial resources to boost its commercial traction but it also comes precisely when the start-up has used up all its equity funding. Not yet sufficiently desirable for private investors looking for commercial growth (and therefore waiting for proof that customers have validated the offer!), the start-up finds itself in danger of failing.
This state of affairs is all the more significant in non-metropolitan areas. Indeed, in such areas, private financial resources are less accessible, whilst the need is greater due to a lack of available technical skills (concentration of key profiles such as developers in large cities) and a smaller ecosystem (difficulty obtaining access to large customer accounts, industrial partners, financing specialists, etc.).
Detail of key steps in the creation and financing of a start-up
Therefore, the role of business support structures before fundraising is vital, particularly in non-metropolitan areas, to reduce the length of time spent in ‘Death Valley’ as much as possible. More precisely, this means delaying entry into this phase, all whilst anticipating coming out on the other side.
The support provided will therefore follow two complementary and interdependent axes (illustrated in the diagram above):
• The acceleration of technological and commercial maturation: the aim here is to guide the start-ups and give them the technical means necessary to realise their idea and bring it to market. The technical improvement and economic development of the project feed into each other in an iterative process centred on the customer. Ultimately, we get to the minimum viable product and an initial commercial proof of concept.
• The administrative and financial engineering aiming to maximise leverage effects: to obtain the proof of concept, financial resources need to be anticipated, mobilised and optimised to meet technical, human and commercial needs. However, sometimes start-ups do not sufficiently understand the innovation financing chain, particularly the mechanisms offered by structures like Bpifrance or regional authorities, which are increasingly comfortable with their economic remit. Beyond pure knowledge of these mechanisms, business support structures help start-ups to use them at the right time and to benefit from substantial experience in administrative engineering.
These two axes help to secure the start-ups’ development path and make them desirable to investors, whilst giving them more time and therefore greater negotiating power.
“Our incubator’s knowledge of innovation financing is a key factor in our current growth. It has enabled us to obtain proof of the technical and commercial relevance of our solution and, therefore, prepare our recent fundraising round of €1.5 million with greater peace of mind.”
Pierre Naccache, co-founder of Asystom
C. Differentiated approaches based on the business support structures
Business support structures offer a wide variety of sizes and models, with varying profiles and therefore approaches.
Private structures, public structures, university structures
Three main families of support structure can be identified according to their dominant source of financing: public subsidies, university funding and private capital. The publicly financed structures can concentrate on a wider range of topics and support a larger number of start-ups; for structures backed by universities or private capital, their economic equation requires them to take a narrower and more selective position in order to generate profits.
The public or private nature of the support structures also has an influence on how investors perceive the support.
We have analysed the top 20 support structures in terms of the average amount raised and the number of fundraising rounds supported. They represent 39% (i.e. a cumulative amount of approximately €1,750m) of the total amount of funds raised by incubated start-ups in France.
It is interesting to note that public support structures (excluding universities) represent 35% of this fundraising in number of rounds. Further, two of them hold the top two spots (Agoranov and Bpifrance Le Hub – see chart below). This success can be explained by the fact that their presence in a particular operation sends a very positive signal to investors: their engagement is a sign of stability, in that they do not necessarily favour short-term profitability, but take into account economic development, the strengthening of local ecosystems or support for strategic sectors.
Structures that have supported the most start-ups towards fundraising
Source: Study on the fundraising activities referenced by Capital Finance over the period 31 March 2017–8 April 2019
The top three support structures in France in terms of fundraising
Note: Four grandes écoles incubators place in the top 20 (Drahi-X Novation Center, ParisTech Entrepreneurs, Incubateur HEC and ESSEC Ventures). This can be explained by the fact that these structures aim to develop and apply scientific innovations, but also by their extensive networks of alumni, particularly among the main financing players (investment funds, banks, business angels, administration).
Venture capital structures, large group structures
For private structures, two main types of business plan stand out: one based on the integration of a venture capital activity, the other organised around strong links with a large group.
In the first case, the structures support a small group of high-potential start-ups, in which they also acquire shares.
Undertaking subsequent fundraising rounds is therefore a necessary condition of profitability for these players.
In the second case, the support structures have more of a technology-monitoring role for the group to which they are attached. Through them, the group takes a stake (often a minority interest), aiming to create new product and service lines that fit the core business of the group or to counter the risk of potential disruptions.
Generalist structures, specialised structures
Generalist structures generate 44% of fundraising; the remainder is generated by support structures with specific sectoral positioning. The information and communication technology, health and energy sectors are particularly well represented, covering 21%, 8% and 7% of the number of specialised structures respectively. As for the average amounts raised by sector, food, energy, telecoms and chemicals cover the most significant amounts. Note that the average of the foodtech sector is inflated by the record level of funds raised by Wynd (€72 million), a start-up supported by ShakeUpFactory, an accelerator specialised in foodtech2.
Number of fundraising rounds by incubation sector
Source: Study on the fundraising activities referenced by Capital Finance over the period 31 March 2017–8 April 2019
More or less selective structures
A support structure’s ability to generate funds is not directly linked to the number of start-ups it supports – though one might well expect the contrary.
If we consider the 17 largest support structures in terms of number of fundraising rounds undertaken, they support on average 64 start-ups (49 excluding Wilco, which has supported over 300).
However, the three most ‘successful’ structures (Agoranov, Bpifrance le Hub and The Family) all support a smaller number (41, 55 and 17 respectively).
This reflects the varying degree of selectivity between support structures. In particular, an investment-fund-type structure like The Family will choose projects primarily based on their future ability to raise funds. Public structures like Agoranov and Bpi have selection criteria that include short-term fundraising prospects but also wider objectives, such as offering a commercial outlet for technologies developed in public research laboratories.
Number of start-ups supported and fundraising rounds undertaken by structure
Source: Study on the fundraising activities referenced by Capital Finance over the period 31 March 2017–8 April 2019
D. The limits of business support structures
The growth of the amounts invested in venture capital and the multiplication of business support structures hide an uneven situation in France, as well as the numerous inefficiencies that start-ups face.
First of all, though business support structures have certainly spread out across the country in recent years (particularly via the FrenchTech label), fundraising remains concentrated in Île-de-France. Start-ups in this area represented 75%3 of the amounts collected in France in 2019 and 9 of the 10 most significant amounts raised. This can be explained by the fact that the majority of investment funds and business angels are based in Paris.
Further, as well-intentioned as the support provided may be, fundraising can be seen as a risk by entrepreneurs looking to retain control of the management of their company.
Finally, undertaking fundraising rounds remains a difficult task for start-ups: they have to invest significant human and financial resources in processes where the outcomes are unknown; they often have to repeat the processes for each potential investor with no possible economies of scale; and they are often limited to a national target so as to reduce the number of physical meetings.
In this context, alternative modes of financing have started to emerge little by little, namely crowdfunding mechanisms (crowdfunding, crowdlending, crowdequity) and blockchain mechanisms (Initial Coin Offering, Security Token Offering).
Number and value of fundraising rounds by region in France in 2019
Source: Accuracy and Eldorado analyses
2. THE INITIAL COIN OFFERING (ICO) PHENOMENON – THE WRONG ANSWER TO REAL PROBLEMS
A. The beginning of ICOs
The years 2017–2018 saw the rise of a new way of fundraising for start-ups: Initial Coin Offerings or ICOs, a play on the term IPO for Initial Public Offering. An ICO corresponds to the issue on the primary market of an asset (a token), of which the ownership and transactions are recorded on a blockchain4. ICOs quickly became an innovative means of raising funds for tech entrepreneurs.
In this process, tokens represent the future right of use of a service, with the token issue guaranteeing the financing. They almost equate to a voucher that can be resold on the secondary market. ICOs can therefore resemble crowdfunding, as the issue aims to finance a service that usually only exists at the fundraising stage of the project.
As soon as the tokens are issued, they can be exchanged directly on the secondary market. Their value is determined by the demand for the service that requires or results from their use. Investors therefore bet on the growing adoption of the service to maximise their return on investment. In addition, it is not rare for project owners to reserve a portion of the tokens issued for themselves in order to benefit from the success of their service.
How an ICO works
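The token mechanics described above can be sketched as a minimal ledger, purely for illustration – the class and method names below are hypothetical, and a real ICO runs on a blockchain rather than an in-memory structure. Primary issuance mints the tokens, reserves a share for the project owners (as noted above) and allocates the rest to initial investors; secondary transfers then move tokens between holders like a resalable voucher.

```python
# Minimal, illustrative sketch of ICO token mechanics: primary issuance,
# a treasury share reserved by the project owners, and secondary transfers.
# All names are hypothetical; a real ICO is recorded on a blockchain.

class TokenLedger:
    def __init__(self):
        self.balances = {}  # holder -> token count

    def issue(self, total_supply, treasury_share, treasury, investors):
        """Primary market: mint tokens, reserve a share for the project,
        split the remainder equally among the initial investors."""
        reserved = int(total_supply * treasury_share)
        self.balances[treasury] = reserved
        per_investor = (total_supply - reserved) // len(investors)
        for inv in investors:
            self.balances[inv] = per_investor

    def transfer(self, sender, receiver, amount):
        """Secondary market: tokens change hands like a resalable voucher."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount


ledger = TokenLedger()
# 1,000,000 tokens issued, 20% reserved by the project owners
ledger.issue(1_000_000, 0.20, "project", ["alice", "bob"])
ledger.transfer("alice", "carol", 100_000)  # resale on the secondary market
print(ledger.balances)
```

The investor's bet described above then plays out entirely on the secondary market: the ledger records who holds what, while the token's price is set by demand for the underlying service.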
B. After seemingly supplanting traditional financing means, ICOs are now in steep decline
After the very first ICO in July 2013 by the Omni project, these operations multiplied in parallel with a significant increase in the market capitalisation of cryptocurrencies, the primary investment vehicle for these operations.
Between September 2017 and November 2019, more than $29 billion was raised in the world by this type of operation, primarily at the end of 2017 and in the first half of 2018 (between September 2017 and December 2017, the capitalisation of this market quadrupled). It should be noted that in France, the number of these operations over the same period was more limited, with only 48 ICOs and $153.6 million raised5. However, this is not representative of the situation in the country, insofar as numerous entrepreneurs chose to organise their ICOs elsewhere6.
Whilst the success of this method of financing can be explained initially by a certain number of specific circumstances, such as the strong increase in the capitalisation of cryptocurrencies, other structural factors need to be considered in order to take the full measure of this phenomenon.
First, the relative efficiency of this fundraising, that is, the ratio between resources (both human and financial) mobilised by a start-up and the amounts collected is significant. For a small company, issuing tokens on the primary market is in theory less expensive than classic fundraising and makes it possible to access a greater number of investors. As for larger companies, an ICO is also less expensive than issuing regulated financial securities.
Breakdown of funds raised in France and across the world
in 2018 and 2019 by fundraising process
This mechanism also has the advantage of security, with the blockchain, by its very nature, highly resistant to falsification. Finally, it offers better liquidity because the securities can be easily exchanged, in contrast to a direct investment in a start-up, something that is far from liquid.
The strong growth of the amounts collected during ICOs could give the impression that they would become the indispensable fundraising tool for innovation. However, since the second half of 2018, the amounts collected, as well as the number of ICOs, have been in sharp decline: $2bn per month on average between September 2017 and August 2018, against $0.36bn (i.e. more than five times less) between September 2018 and November 2019.
Such a slowdown can be explained in particular by the fall in the market capitalisation of cryptocurrencies. Indeed, the loss of value of cryptocurrencies has reduced investors’ available funds and has weakened the cash positions of companies that retained amounts raised in the form of cryptocurrencies. The inordinate exposure of some projects to price fluctuations has led to numerous bankruptcies, which has also highlighted the complete lack of protection for token-holders. The fall in the number of ICOs can also be explained by the numerous fraud schemes and scams that plague this type of operation, of which some have featured heavily in the media.
Despite this mixed record linked to a passing trend for cryptocurrencies, ICOs remain important for three reasons: (i) they represent an expression of mistrust towards historical players in fundraising (advisory services, traditional investors, and business support structures), (ii) they serve as a means of obtaining financing for those that could not obtain it elsewhere and (iii) they are attractive to those with a pronounced desire for innovation. It is based on these elements that a new generation of financing models is now being built.
Amounts collected by ICO, number of ICOs and market capitalisation of cryptocurrencies
A. Initial Exchange Offering: a return to trusted third parties?
An Initial Exchange Offering (IEO) is an ICO undertaken directly on a cryptocurrency exchange platform (an exchange). In this process, the exchange plays the role of the trusted third party. By sorting through the projects and undertaking sufficient due diligence on them, the exchange guarantees investors that the project is serious and, in particular, that it has potential. Indeed, the exchanges7 have the skills necessary to value the projects and put their reputation on the line by agreeing to list them – thus providing the projects with the liquidity that investors seek.
The IEO practice has been growing since the end of 2018: whilst the number of ICOs has been consistently falling since March 2018, the number of IEOs has been consistently rising since January of the same year.
This recentralisation is somewhat paradoxical in light of the decentralisation aims of blockchain, but it makes it possible to assuage the concerns of investors. By repurposing the principle of the ICOs, IEOs show that this innovative process can still be relevant for certain projects, whilst conserving the far from negligible advantages it has over traditional fundraising, provided that asymmetry of information between start-ups and investors is remedied. It highlights that only a thorough analysis of the business model, the addressable market and the benefit of blockchain technology can guarantee a project’s reasonable chances of success.
In this sense, the ICO experiment makes the case for the increasing involvement of trusted third parties such as advisory firms, support structures and, in the case of IEOs, exchanges.
Amounts collected by ICO / IEO and market capitalisation of cryptocurrencies
A Security Token Offering (STO) corresponds to an issue on the financial securities primary market, represented by a token on a blockchain. Far from a simple voucher like in the context of ICOs or IEOs, a token here gives rise to the right to future revenues, either fixed or variable depending on their configuration. As the financial securities are subject to strict regulations, investors have potential recourse against ill-intentioned start-up entrepreneurs.
Blockchain here represents only the technological infrastructure on which the transactions are registered. However, it opens up wider perspectives in terms of the digitalisation of financial securities and beyond. The acquisition of a €5 million stake in Tokeny Solutions8, a Luxembourg start-up specialised in the “tokenisation”9 of financial assets, by stock market operator Euronext, follows this dynamic.
Indeed, blockchain characteristics make it possible to consider the low-cost securitisation of any type of asset, whether works of art, financial securities or real estate. The liquidity of these securities on the secondary market is not yet guaranteed due to the lack of exchange platforms with the necessary licences and sufficient volumes, but dozens of projects across the world are under development in this area.
Examples of STO projects in progress around the world
Blockchain also represents an opportunity to develop innovative financial securities, which for example allocate to investors a percentage of certain elements determined in advance, such as operating profit or profit before tax. Whilst this type of instrument was already possible before blockchain, the automation made possible by smart contracts10 changes the cost/benefit equation. This could generate wider interest among investors, in exchange for less involvement in governance.
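The automated allocation just described can be sketched as follows – a deliberately simplified Python stand-in for a smart contract (real smart contracts execute on-chain; every name and figure here is a hypothetical illustration): each period, a fixed percentage of a reported operating profit is distributed to token holders pro rata to their holdings.

```python
# Simplified stand-in for a revenue-sharing security token: a fixed
# percentage of operating profit is paid out to token holders pro rata.
# Hypothetical names and figures; a real smart contract runs on-chain.

def distribute_profit_share(operating_profit, share_rate, holdings):
    """Return the payout owed to each holder.

    operating_profit: profit reported for the period.
    share_rate: fraction of that profit allocated to token holders.
    holdings: dict mapping holder -> number of tokens held.
    """
    pool = operating_profit * share_rate
    total_tokens = sum(holdings.values())
    return {holder: pool * tokens / total_tokens
            for holder, tokens in holdings.items()}


# 10% of a €2m operating profit shared across 1,000 tokens
payouts = distribute_profit_share(
    operating_profit=2_000_000,
    share_rate=0.10,
    holdings={"alice": 600, "bob": 300, "carol": 100},
)
print(payouts)  # alice receives 60% of the €200k pool, and so on
```

The point of the cost/benefit shift mentioned above is that this distribution rule, once encoded, runs automatically at each period with no registrar or paying agent in the loop.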
Though the ICO phenomenon is losing momentum, business support structures can learn a great deal from it, despite first perceiving it as a threat. It has revealed the complexities and the excesses of ‘traditional’ fundraising and is an opportunity for the innovation financing ecosystem to reinvent itself with new hybrid mechanisms.
And it is not the only driving force. Indeed, the digitalisation of operations is growing with more and more services being performed remotely: support, events, conferences, fundraising, etc. This revolution in our way of working is leading to a greater number of structures with no physical locations and off-site support programmes.
In addition to adapting to this innovative way of working, business support structures will have to adapt to new market requirements: improve their ability to operate in non-metropolitan areas by mobilising relevant investors outside major cities or develop their mastery of cutting-edge technologies such as blockchain, which makes it possible to digitalise complex operations without jeopardising data security.
Between the search for profitability, new start-up needs (mobility, flexibility, transparency), developments in utility (crowdfunding) and technology (blockchain), and changes in their ecosystem (digitalisation, globalisation, decentralisation), business support structures need to be as innovative as their innovative clients!
1 Accuracy study conducted between 31 March 2017 and 8 April 2019 on over 650 fundraising rounds undertaken by 380 incubated start-ups
2 Wynd operates mostly but not exclusively in foodtech
3 Accuracy analyses
4 A blockchain is a distributed database that cannot be falsified. It can only be modified by increments. For further details, see the book: Blockchain – The key to unlocking the value chain, M. Della Chiesa, F. Hiault, C. Téqui (Eyrolles, 2019)
5 According to Accuracy analysis of CoinSchedule data
6 Les Echos, « ICO : les start-up tricolores boudent la France » (ICOs: French start-ups shun France), 06/06/2018
7 The exchanges are the largest players in this new ecosystem; by their size alone, they give investors confidence
8 Les Echos Investir, “Euronext prend une participation de 23,5% dans la fintech Tokeny Solutions” (Euronext takes 23.5% stake in Tokeny Solutions fintech), 01/07/2019
9 Tokenisation describes the act of creating a token on a blockchain, thereby materialising the ownership of an element external to the blockchain – for example shares in financial assets
10 A smart contract is a transaction that is conditioned and programmed on a blockchain
Accuracy received the Great Place to Work certification for all of its participating offices.
What is the Great Place to Work® Certification?
Great Place to Work® Certification is the most definitive ‘Employer-of-Choice’ recognition that organisations aspire to achieve. The Certification is recognised all around the world by employees and employers alike and is considered the ‘Gold Standard’ in identifying and recognising Great Workplace Cultures.
Accuracy is announcing the promotion of three new partners. These promotions take place in the context of Accuracy’s continued growth since its creation 15 years ago. Today, Accuracy has locations in 13 countries and counts some 450 consultants and 50 partners.
Accuracy advised L. Possehl & Co. mbh in the context of the acquisition of the European Foundation Group B.V. (EFG). The Group specialises in providing foundation solutions based on screw piles for buildings, homes, industrial and infrastructural works, as well as soil drilling solutions in Germany.
The unbridled rhythm of innovation, the risk of disruption, the volatility of clients and the dearth of talents. These are all factors pushing large groups to innovate not only quickly, but also efficiently. This innovation imperative requires, in particular, large companies and start-ups to come together.
However, large companies wanting to support start-ups is not enough to make the collaboration work. If innovation clearly constitutes a bridge linking the worlds of large groups and young businesses, its foundations can be weakened due to strategic objectives and ways of working that are structurally different.
Our mapping of business support structures in France makes it possible to understand the primary trends in the French innovation ecosystem, as well as the performance levers able to be used to support start-ups:
A. Large French groups are actively engaged in supporting start-ups. This is the case for 90% of CAC 40 companies, which have created their own structure, participated in multi-company schemes or joined existing structures.
B. Initially centred on the Parisian region, French innovation is developing rapidly outside the capital, with larger cities welcoming more and more support structures.
C. The trend is for structures specialised in the supporting entity’s sector(s) of activity: this is both a means of differentiation and a performance factor. It also enables the large group to integrate created value more easily into its own activity.
D. Five axes of reflection make it possible to define the most appropriate format of support for the large group’s strategic objectives. Four relate to the solutions provided: hosting, human, technical and financial resources; the fifth relates to the level of maturity of the start-up.
E. A third-party expert is essential to implementing the support strategy, but also its governance. It is a question of facilitating cooperation and maximising value creation between the parties involved, which may have extremely different cultures!
Whether it is a matter of the heart or the head, the union between large groups and young start-ups today is vital to securing growth levers in a world under constant transformation.
But how can large groups get their bearings among the myriad possible support formats? From incubators to accelerators, via co-working spaces, company nurseries, fablabs and corporate venture capital, what criteria should a group choose to find the right support structure to meet its strategic objectives? Should it specialise in its own area of activity or stay generalist, ready to capture value wherever it can be found?
Accuracy has undertaken a mapping of French support structures to provide the necessary keys to fully understanding the ecosystems in place. This vision will make it possible to judge which third-party experts can provide assistance in finding a truly productive and profitable approach.
1. INNOVATION IS UNDERGOING A REVOLUTION!
HOW TO TAKE ADVANTAGE OF IT TO CREATE VALUE?
A. From absorption to support
In an ever more uncertain environment, innovation is no longer restricted to internal R&D investments, patent portfolio management and the integration of outsourced technologies. It is now closely linked to risk-taking, through investments in audacious projects: to stay in the race, companies have to bet on (more or less young) disruptive entrepreneurs.
The majority of large companies initially adopted a strategy of absorption. This was sometimes aggressive and destabilising for the entrepreneurs, and often inefficient in terms of innovation. However, previous failures and the appearance of new open innovation tools have boosted new practices. Today, 90% of large groups favour start-up support structures, either by creating their own or by sharing or delegating the management of it.
Management of support structures in which CAC 40 companies invest
For example, the Vinci group created its own structure, “Léonard”, which among other things, stimulates intrapreneurship. It is all at once a start-up incubator, a co-working space and a meeting place for actors in municipal/regional transformation. As for Airbus, it signed a partnership with the incubator Centrale Audencia ENSA Nantes. In addition to the services provided by the incubator itself, those working there have access to a dedicated space (technical showroom and co-working space) able to host their intrapreneurial projects.
Other companies prefer to ask a third-party expert to set up their support system. For instance, AstraZeneca asked a pure player, Interfaces, to create and then manage its “Realize” programme, which aims to innovate in terms of a patient’s journey, data management and scientific innovation in the field of oncology.
B. A more and more balanced regional network
Paris and the French desert? Not so fast… It is not surprising that the capital is the nerve centre of French innovation: it boasts 26% of existing structures, including the top performing ones and those receiving the most media attention. However, the other regions of France are not to be outdone: major regional cities are also giving themselves the means to play a role in the race for innovation.
Indeed, the French ecosystem has a network of support structures that is becoming more and more complete. More than 700 municipalities have at least one support structure, and all regions are seeing their number of structures increase.
Breakdown of support structures in France
Our quantified analysis makes it possible to take an inventory of the situation and to predict the future dynamics of each region. Île-de-France shows an innovation support ecosystem that is already relatively mature, whilst the other regions, even those already well developed such as around Bordeaux and Toulouse, continue to show strong growth prospects.
In short, France’s innovation ecosystem is rather logically based on the economic dynamism of the different regions and seems to form a Sun Belt à la française, which starts in Rennes and descends all the way down to the Nice region, passing by Bordeaux, Toulouse and Montpellier.
C. Innovation ecosystems more and more specialised by sector
In this regionalisation of innovation, certain areas have chosen to rely on their economic history to create specialised channels by sector. But is it better to go generalist or specialist? The majority of large groups have had to make this decision, with each adopting the strategy that seems most relevant to its strategic and economic imperatives.
However, the fact is that specialisation is gaining ground. Themed platforms now make up a significant proportion of the support structures in France. This may be because, on the one hand, the added value of the support may be significantly larger, and on the other, companies generally seek benefits in their areas of activity. Moreover, specialisation is a differentiation factor in the face of increasing competition following the rise in the number of support structures in recent years.
The development of the banking sector illustrates this transformation perfectly. Since 2014, Crédit Agricole’s “Village by CA” has spread throughout France, in line with the presence of its regional head offices, and regardless of the sector of application. Its aim is to assist entrepreneurs by providing coaching, a potential network of business partners and mentoring by bank employees, in the hope that they then become suppliers or clients of the bank. All other large banks have followed suit, creating their own support structures, but sometimes limiting them to their core business lines. For example, “Plateforme 58” by La Banque Postale is active in banking and insurance, as well as in financial technologies, health, education and services. As for BNP Paribas with its acceleration programme “Bivwak!” (in addition to “WAI”), HSBC with “Lab innovation” and Société Générale with “Swave”, they concentrate on innovations that are applicable to the bank’s business lines, supporting fintechs and insurtechs.
In this specialisation trend, certain sectors seem more attractive than others. The graph below clearly shows the areas that are over-represented in the innovation ecosystem when looking at their market size. In all probability, the greentech, fintech, biotech and agritech sectors, but also media and communication, will drive innovation for the next few years. Hence the importance for companies of positioning themselves now to secure the creation of value tomorrow!
Investment by specialisation
2. HOW TO CHOOSE THE RIGHT FORMAT FOR START-UP SUPPORT AND MAXIMISE RETURN ON INVESTMENT?
A. What type of structure for what strategic objectives?
Even if 90% of CAC 40 companies have chosen to invest in at least one start-up support structure, the format used is not always appropriate to achieve their strategic objectives.
There are multiple types of support structures. The services offered vary, ranging from simple hosting services to the provision of machine tools for prototypes, access to mentoring or bespoke acceleration programmes, the organisation of networking events and also assistance with financing. So how should a large group choose the most appropriate format for its strategic objectives?
Of course, it should start by clarifying these objectives, which underpin its investment logic. Is its ambition to obtain a quick return on investment? To participate in the development of a region to make it more dynamic? To monitor technology closely in order to integrate any developments by the start-up as quickly as possible? To face human resources challenges through intrapreneurship, the recruitment of new talents, the employer brand or the sharing of new ways of working?
Defining these objectives makes it possible in turn to define the type of start-up to target (in particular, in terms of maturity) as well as its associated needs (hosting, technical means, human resources, and financial means). The relative weighting of these five elements therefore determines the most relevant support structure, in light of both the strategic priorities of the large group and the actual needs of the start-up.
Indeed, the mapping below presents the different support ecosystems that exist in France, based on the relative weight of each of these five criteria.
Mapping of main support structures
By way of example, incubators are essentially aimed at communication objectives, HR issues and technology capture; their offer mostly includes hosting, coaching or mentoring, and is aimed more at early-stage start-ups. As for fablabs, the objective is less geared towards communication and more towards the development of talents and regions. For that reason, they tend to deal with more mature projects (often in their prototyping stage), for which a large group may supply significant technical means.
B. The thorny question of governance: a trusted third party to make alliances last
Once the structure has been identified and fully considered, the difference – as usual – resides in execution.
First, to attract the most promising start-ups, groups must ensure that they bring a differentiating factor to the table. It is for this reason that Unibail-Rodamco-Westfield offers the opportunity to test innovations and business models in its shopping centres, whilst the highly active communication surrounding EDF Pulse provides a strong level of exposure.
Second, supporting start-ups is an investment project just like any other, and in this respect, it requires the rigorous monitoring of KPIs defined in advance. This performance steering, whether it be through strategic partnerships, equity investments or support programmes, raises the tricky question of how much independence is necessary to innovate. How can a large group implement a governance structure making it possible to provide support to the start-up but without suffocating it? Adapting internal processes so as not to stifle the start-up’s development with too much rigidity, involving top management to strengthen the legitimacy of the programme internally, communicating regularly but not intrusively… There are many different success factors, the implementation of which may require the presence of a trusted third party.
This third party can contribute to building a bespoke support programme and supervising it once in place, particularly in the case of multi-company structures such as “Plant 4.0”, which groups together Total, Vinci Energies, Solvay, Eiffage, Orano and Air Liquide.
The trusted third party must understand the advantages and disadvantages of each type of structure to create a bespoke programme that responds effectively to the large group’s strategic innovation challenges. But above all, it must be a bridge between the large group and the start-up: indeed, these actors each have differing strategic objectives, which only converge when it comes to innovation. Accuracy can be the trusted third party, acting to orchestrate, coordinate and optimise cooperation in this “shared space”.
1. Relationship between the number of existing support structures in the region and the entirety of existing structures in France.
2. Relationship between the number of opening support structure projects and the already existing support structures (within the region).
3. Share of INSEE income by branch over total 2016 income in France.
4. Number of support structures by specialisation over the total number of support structures.
– Accuracy database – December 2019
– David with Goliath study, The alliance of young and large businesses – 2018
On 20th February we will be hosting an event, with a series of panels, on how technology can be used to address legal and compliance challenges. The event will be in Frankfurt am Main and hosted with Deutscher AnwaltSpiegel and F.A.Z.-Fachverlag.
“…the evidence that we collect for document reviews, there are often glaring red flags indicating guilty behaviour. They can be used at the outset of an investigation to assess whether there really is an issue, including by highlighting who has something to hide and what they are hiding.”
Global Investigations Review publishes article on corporate investigations by Accuracy partner Rick Barker.
A new ‘Brexit Index’ that measures the strength of the economic relationship between the EU and UK – based on the movement of people, goods, services and investment – has dropped 12% before Brexit has even happened, according to global independent advisory firm, Accuracy.
Accuracy is proud to be expanding its forensics capabilities, adding highly experienced new advisors to its investigations practice and developing new document analytics tools to quickly identify evidence or areas of concern. Accuracy’s forensic investigations team uses its experience, financial acumen, and technology-based solutions to help its clients identify and investigate risks associated with fraud, corruption, sanctions violations and competition issues, among many more.
New additions to the team
Morgan Heavener has recently joined Accuracy as a partner based in the Paris office. Prior to joining Accuracy, he worked as a US investigations attorney, leading numerous complex cross‐border investigations, with a particular focus on the US Foreign Corrupt Practices Act (FCPA). Morgan has represented large international corporations in enforcement actions brought by the US Department of Justice, the Securities and Exchange Commission and the UK’s Serious Fraud Office, among others. He also has experience developing anti-corruption compliance programmes for multinationals across a variety of industries, including energy, technology and finance.
Roberto Maluf is another new addition to Accuracy’s Paris office, enhancing the firm’s corporate intelligence capabilities. Roberto and his team provide background information on entities and individuals for pre-transactional or third-party due diligence purposes and litigation support, as well as for support in forensic investigations into corporate impropriety or financial crime.
Accuracy’s growing investigations practice anticipates further senior hires in Dubai, Hong Kong and Germany in the coming months.
In addition to expanding its investigations team, Accuracy is developing innovative tools to assist in all types of investigations.
Our strong cyber and forensic investigations teams are able to make use of our customised review platform to complement their expertise. For example, the Accuracy Forensic View (AFV) review platform can be used to greatly accelerate document reviews, including by examining what has been recently deleted or encrypted.
Does an allegation warrant an investigation? Where should you start? Are bad actors destroying evidence? What document review search will ever find evidence that has been hidden deliberately?
The AFV turns these challenges into advantages. The review tool is based on digital detective investigation techniques developed over many years, focusing on what bad actors are hiding. Electronic discovery collection images identify those actors’ efforts to hide documents, allowing investigators to focus on who has something to hide and what it is. The AFV can jump-start the investigation and discover the hidden truth. When combined with Relativity’s power, the AFV gives investigators and reviewers the best of both worlds.
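The AFV platform itself is proprietary and its internals are not disclosed here. As a purely illustrative sketch of one building block such forensic tooling commonly relies on — flagging files whose byte distributions look encrypted or compressed — a Shannon entropy test can be used. The function names and the 7.5 bits-per-byte threshold below are assumptions for illustration, not Accuracy’s actual implementation:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (8.0 for uniformly random bytes)."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def flag_possibly_encrypted(files: dict[str, bytes], threshold: float = 7.5) -> list[str]:
    """Return the names of files whose byte entropy suggests encryption or
    compression. The 7.5 bits/byte threshold is a common rule of thumb,
    not a value taken from the AFV platform."""
    return [name for name, data in files.items() if shannon_entropy(data) >= threshold]
```

High-entropy files (close to 8 bits per byte) are candidates for encryption or compression and can be prioritised for review; identifying deleted material, as the AFV description mentions, would additionally require analysis of file-system artefacts.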
Rick Barker, a partner in our Milan office, will be presenting on the benefits of this approach at the RelativityFest event in Chicago.
For more information regarding Accuracy’s document review solutions, including the AFV, please contact firstname.lastname@example.org.
More than three years ago, in June 2016, the British public voted in favour of the UK’s withdrawal from the EU. Since then, European and British political leaders have been struggling to find a way to respect the popular vote, whilst preserving the economies and societies of the two partners. Their difficulties can be explained by the high level of integration between the two economies, after several decades of development in the European domestic market.
Accuracy analysed the economics that unite the two partners, corresponding to the four freedoms guaranteed by the European single market: the free movement of goods, services, capital and people. The Accuracy Brexit Index combines these four elements in one aggregated index, the development of which makes it possible to monitor over time the process of (dis)integration of the two economies.
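The article does not disclose the index methodology. As a minimal sketch, assuming each of the four freedoms (goods, services, capital, people) is tracked as a flow series normalised to 100 at a base period, and that the aggregate is an equal-weighted average (both assumptions), the construction could look like this:

```python
def sub_index(values: list[float], base: int = 0) -> list[float]:
    """Normalise a flow series to 100 at the chosen base period."""
    return [100.0 * v / values[base] for v in values]

def aggregate_index(goods, services, capital, people, weights=None):
    """Combine the four normalised sub-indices into one aggregate series.
    Equal weighting is an assumption made for illustration; the published
    index may weight the four freedoms differently."""
    weights = weights or [0.25, 0.25, 0.25, 0.25]
    series = [goods, services, capital, people]
    return [sum(w * s[t] for w, s in zip(weights, series)) for t in range(len(goods))]
```

Under this equal-weighting assumption, a 13% fall in the capital sub-index and a 25% fall in the people sub-index, with goods and services broadly flat, would mechanically produce an aggregate drop of roughly 10% — of the same order as the 12% deterioration reported.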
On this basis, five macro-trends stand out:
– Whilst Brexit is still only a future destination towards which the UK is heading, relations with the EU – as measured by the Accuracy Brexit Index – have already deteriorated by 12%.
– This effective drop in the level of integration between the two economies can be explained by the uncertainty generated by the 2016 vote. It primarily affects investment flows (13% decrease of the investments index) and migration flows (25% decrease of the migrations index).
– Analysing the development of the exchanges between the two economies reveals two different dynamics. On one hand, the flows of capital and people are largely determined by economic agents’ expectations and the level of predictability of economic policy. The psychological impact of the 2016 referendum in addition to the context of uncertainty immediately translated into a decrease in exchanges. On the other hand, the flows of goods and services only show a small decline since June 2016, driven by the fact that these flows depend on shorter-term cost trade-offs. The lack of change in trading conditions before Brexit takes place has made it possible to maintain a certain level of commercial relations.
– Despite experiencing a drop in EU migration, we can already see migration flows into the UK rebalancing, with Asian populations replacing the flows that previously came from the EU. This trend is expected to continue in the long term and can be explained by the considerable need for labour in certain sectors of the British economy. European immigrants have been replaced by Asian immigrants to such an extent that the total number of immigrants entering the UK has remained relatively stable.
– It is probable in the long term, after the implementation of Brexit, that the short-term trends observed in the wake of the 2016 referendum will reverse: trade in goods and services is expected to decline structurally, whilst the uncertainty surrounding the terms of the UK’s withdrawal will diminish and could ease constraints on investment. However, the level of investment is unlikely to reach its pre-Brexit levels, and migration flows from the EU will depend on new entry conditions. They, too, will be unlikely to reach their historical levels.
Accuracy, the global independent advisory firm, has expanded its Hong Kong office with the addition of So Kim Lau, who joins the team as director. Kim joins Accuracy from a “Big Four” accountancy firm, where she was a director in the corporate restructuring and insolvency practice.
After opening an office in Hong Kong in January 2019, following those in Singapore, Dubai and Casablanca in 2016, 2017 and 2018 respectively, Accuracy has entrusted Frédéric Recordon with the management of its base in Beijing.
Accuracy is proud to have supported the development and publication of the Chartered Institute of Arbitrators (the “CIArb”) Guidelines for Witness Conferencing in International Arbitration (the “Guidelines”). The Guidelines were launched in Singapore on 23 April 2019, at the inaugural CIArb Asia-Pacific Regional Conference. A director in Accuracy’s Singapore office is a member of the Drafting Sub-Committee and has been actively involved since the inception of the Guidelines.
The Guidelines seek to provide non-prescriptive guidance on the use of witness conferencing in international arbitration. They allow arbitrators, counsel for the parties and experts to be on the same page and know what to consider and what to expect of each other as part of witness conferencing.
The Guidelines can be downloaded here. We hope that they will be of assistance to all practitioners, no matter the level of experience, and to all types of users. It will be an “aide-mémoire” to experienced practitioners and the explanatory notes will be helpful for less experienced practitioners, providing further detail of the considerations for witness conferencing in arbitration proceedings.
To see our global reach in more detail, please visit our disputes page, or contact your normal Accuracy contact, who would be happy to discuss further and respond to any queries.
The CIArb lies at the forefront of thought leadership in dispute resolution. As the foremost professional body for dispute resolution, the institute seeks, as a Learned Society, to advance and promote research, academic thought and new professional policy and practice concerning dispute resolution. It works closely with academic institutions and other professional bodies across the world. Among many other activities, it seeks to promote greater understanding and use of alternative dispute resolution methods and involves its local members heavily in its projects and activities.
After setting up in Singapore in 2016, Dubai in 2017, and Beijing and Casablanca in 2018, Accuracy continues to grow its global presence by opening an office in Hong Kong.
The Hong Kong office is led by Xavier Gallais (HEC 1996), an Accuracy partner since 2008, who joined the firm after several years at Arthur Andersen followed by Suez Environnement (financial management). Xavier Gallais has almost 20 years of experience; he has taken part in numerous international operations and has led over 200 engagements as an Accuracy partner.
Accuracy continues to strengthen its international network and its activities in Asia through the promotion of David Thornes, of the Singapore office, to partner. The independent advisory firm created in 2004 has 17 offices and 400 consultants, including 48 partners.
Matthew Hanson, senior manager, and Khalid Lachheb, senior director, both at Accuracy, have been interviewed in the Corporate Disputes magazine, Jan–Mar 2019 issue, on the following topic: “Construction disputes – selection phase”.
Accuracy advised Bpifrance and Omnes Capital in the context of their increased stake in UNITe. With 64 production sites, UNITe is one of the leading independent producers of 100% renewable electricity in France.
Accuracy is proud to announce that 12 of its partners and directors have been included in the Who’s Who Legal: Arbitration 2019 list of Arbitration Expert Witnesses, in recognition of the outstanding quality of their services, and a further three partners are included in the Arbitration Future Leaders – Expert Witnesses list:
Accuracy assisted Federation Entertainment in the context of its €16 million fundraising. Federation Entertainment is a television production company, producing series such as Le Bureau des Légendes broadcast on Canal+ and Marseille, broadcast on Netflix and TF1.
Guido Althaus, a partner in Accuracy’s Frankfurt office, has won the 2019 Corporate Intl Magazine Global Award: ‘Insolvency Litigation Expert Witness of the Year in Germany’.
The award marks excellence among the world’s leading advisers and financiers in an array of countries and continents. The 12th annual Corporate INTL awards in 2019 are the largest to date, with more nominations received than ever before, and remain among the most respected in the industry.
“For several years, Guido has concentrated on litigation and arbitration support for clients and courts in both German and international dispute cases. We are delighted that this award highlights his market activities”, affirmed Kay Wüste, also an Accuracy partner in the Frankfurt office.
Jean-Michel Blanquer, Moussa Camara and Annette Roux winners of the Grand Prix de l’Economie 2018
The three winners of the Grand Prix de l’Economie 2018 awarded by “Les Echos” (in partnership with Accuracy and Darrois Villey Maillot Brochier) were handed their prizes on Tuesday 16th October, in Paris, by the president of the jury, Henri de Castries. The Minister of National Education was honoured in the Politics and Economics category. Annette Roux, president of Bénéteau, was chosen in the Business category while Moussa Camara, the founder of the association Les Determinés, was a winner in the Hope category for his support in business creation.
Two Accuracy partners, Stuart Appelbe and Anthony Theau-Laurent, have been interviewed in the Corporate Disputes magazine, July – September 2018 issue, on the following topics: “Disputes in the construction sector” and “Disputes arising from M&A transactions”.
Read the two interviews by clicking on the links below:
After Singapore in 2016 and Dubai in 2017, Accuracy continues its international growth by opening an office in Casablanca*.
This office will be run by Taoufik Lachheb, partner (43 years old, graduate of the Ecole Polytechnique, at Accuracy since 2013, a specialist in large projects) and Aomar Elalamy, manager (graduate of the Ecole Centrale Paris, at Accuracy since 2014, specialist in financial institutions and complex modelling).
Accuracy has announced the signing of an affiliation agreement with The Claro Group, a financial advisory and management consulting firm with offices throughout the United States.
While each organisation’s respective ownership and governance structure will remain the same, the affiliation will enhance the economies of scale for both organisations and allow for sharing of resources, knowledge, and best practices to better serve their increasingly global client base.
Accuracy advised Axa Investment Managers – Real Assets and Crédit Agricole Assurances, as consortium partners with Fluxys, in their joint transaction to acquire from EDF and Total a 37.76% stake in Dunkerque LNG, the owner of the liquefied natural gas (LNG) terminal in Dunkirk.
Leontine Koens-Betz, Managing Partner of Accuracy in the Netherlands, wins Best Advisor Award at the ACG Growth Awards. Last Thursday ACG Holland, the Dutch arm of the global M&A organisation focused on driving middle-market growth, held its third annual Growth Awards Ceremony at Strand Zuid in Amsterdam.
The Growth Awards recognise the achievements and contribution of member companies and individuals who have demonstrated leadership in their field.
Nominations for the awards were submitted by the ACG Holland members and the winners were selected by a panel represented by the event sponsors: Valery Capital, Orange Clover, RSM Due Diligence and the event organisers.
Accuracy advised Kersia (formerly Hypred), a portfolio company of Ardian, in the context of the acquisition of Kilco, a company based in Scotland (United Kingdom) specialising in the combination of animal health and food safety.
After the recruitment of four new partners in 2017, Accuracy, the independent advisory firm created in 2004, has promoted another four consultants to partner level in three of its offices. This brings the total number of partners in the firm to 47 for a total headcount of 370 across 15 offices worldwide.
Accuracy advised Faurecia S.A., a global leader in automotive equipment, in the context of the acquisition of Hug Engineering AG, a key player in the area of diesel particulate filters and catalytic exhaust gas purification. The target was acquired from ElringKlinger, one of the leading global companies in solutions for all types of drive systems.
Accuracy announces the opening of an office in Dubai. Led by Zane Hedge, 47 years old, a specialist in construction and infrastructure projects, the office already has five consultants and forecasts rapid expansion.
Accuracy assisted the Schneider Electric Group in the context of the merger of its industrial software activities with AVEVA, notably including financial due diligence work, historical financial reporting and assistance in the discussions.
Accuracy mobilised its international teams in 17 countries in this large-scale operation.
Reinier Huisman (39) has joined the international financial consultancy firm Accuracy from his previous position as a member of the Global Strategy/M&A team at eBay Inc, effective 1st September. Huisman has many years of experience in strategy consultancy and will be responsible for further developing Accuracy’s proposition for clients. He is the third partner in the Dutch office, joining Leontine Koens-Betz and Bas van Helden.
The international arbitration landscape has been a hive of activity in recent times.
Among the ongoing trends are the growing complexity and variety of cases, the increasing use of third-party funding and the continued “litigationising” of the process. As a popular dispute resolution mechanism that gives parties the flexibility to tailor the procedure to their particular needs, international arbitration is expected to see rising demand. For now, it provides a solid forum to resolve international business disputes, and predictability when enforcing awards.
To read the rest of the article, please click on the link below.
Despite (i) the increasing awareness of fraud and corruption among leaders and boards of directors, (ii) the implementation of stronger compliance models aimed at preventing fraud, bribery and corruption, and (iii) stricter national regulations, the number of Italian cases concerning violations of anti-bribery and anti-corruption legislation is growing. Additional investment is needed. The recently introduced ISO 37001, an international standard that provides organisations with guidance on fighting corruption through the implementation of effective preventive measures, offers an efficient fraud and corruption prevention system, giving companies best practice for improving an anti-bribery management model…
To read the rest of the article, please click on the link below.
Accuracy has added former KPMG partner Jan Gijsbert Bakker as an external consultant to its litigation support team in anticipation of a sharp increase in the practice’s activities in the Netherlands.
The firm expects increased demand for its services following the establishment of an English-language court for international business disputes in the Netherlands and as Dutch lawmakers examine draft legislation that would make it easier to initiate class action lawsuits.
Accuracy advised SUEZ on its acquisition of GE Water & Process Technologies from GE, jointly with Caisse de dépôt et placement du Québec, for a value of €3.2 billion.
For this mainly American acquisition, Accuracy mobilised its global teams in the USA and Canada to assist SUEZ on this large-scale operation, which will allow the world’s n°2 in water services to increase its presence worldwide and in the industrial sector.
Accuracy is pleased to announce that nine of its experts have been named in the Who’s Who Legal list of top Expert Witnesses in Arbitration, including one Accuracy expert who is listed in the top 10 “Most Highly Regarded Individuals” in Europe.
The Accuracy experts named: Jean-Baptiste de Courcel, Erik van Duijvenvoorde, Jonathan Ellis, Damien Gros, Roula Harfouche, Ekaterina Lohwasser, Christophe Schmit, Anthony Theau-Laurent and Hervé de Trogoff.
More information can be found on the Who’s Who Legal website here.
Accuracy performed the financial due diligence for HIG Capital in the context of the acquisition of 49% stake of the Dutch company Ecore, one of the worldwide leaders in recycling, through its French subsidiary, Guy Dauphin Environnement.
Frankfurt am Main, 17 October 2016 – Effective 1 October 2016, Kay Wüste has become partner at Accuracy in Frankfurt. Together with Dr. Ekaterina Lohwasser he forms the leadership team of Accuracy Deutschland GmbH.
Following a strategic realignment last year, the aim is now to accelerate the growth of Accuracy in Germany. “Together, we want to further enhance Accuracy’s market position,” Kay Wüste stated. “I am looking forward to shaping the future of Accuracy together with a dynamic and international team.”
Prior to joining Accuracy, Kay Wüste worked at Ernst & Young/Arthur Andersen and started his career as a financial controller at a German corporation. With more than 15 years of experience in corporate finance, he specialises in complex financial projects conducted in an international environment. In this role, he gained extensive experience in large carve-out transaction advisory engagements, both buy side and sell side, advising clients on SPAs and valuation assignments.
Accuracy performed the financial vendor due diligence of SVP, a company focused on offering customised solutions for small, medium and large enterprises, in the context of its acquisition by MML Capital Partners.
Accuracy performed the financial due diligence for the family group Legris Industries in the context of the acquisition of Schiederwerk, a German company specialising in the development of power supplies.
Just three months ago Accuracy, for the second year running, was ranked number one in the category “Best Workplaces in France: fewer than 500 employees” by the Great Place to Work Institute. Today, Accuracy is again in the spotlight after being awarded 13th place in the Institute’s European ranking.
Accuracy performed the financial due diligence for Docapost, the Digital Branch subsidiary of La Poste, in the context of the acquisition of Applicam, electronic payment solutions provider for businesses and communities.
Heiko Ziehms, Erik van Duijvenvoorde and Edmond Richards of financial advisory firm Accuracy introduce the findings of a report examining the major causes of post-M&A disputes.
Few corporate activities waste as much management time, and cost as much money, as post-M&A disputes. They can take years to resolve and often hinder a company’s operating activities.
Data suggest that up to one-third of M&A transactions end up in legal disputes. A post-deal dispute therefore represents a very real risk when buying or selling a company, and avoiding disputes should be among the key objectives for any deal. However, successfully avoiding them requires an understanding of their causes.
A major new report into the causes of post-M&A disputes reveals reasons why deals end in legal acrimony. Financial advisory firm Accuracy reviewed over 900 claims over a ten-year period and across a breadth of sectors to produce the most in-depth, data-driven research of its kind. The report, entitled ‘An Autopsy of Cross-Border M&A Disputes’, examines the leading causes of conflict and provides recommendations on how companies can avoid making similar mistakes in the future.
All of the disputes covered by the Accuracy report – with values ranging from €5m to €10bn – contained at least one of the following four factors:
Volatility in the target company’s markets finding its way into the transaction;
Ambiguity in the wording of the sale and purchase agreement (SPA);
Pressure to acquire and the ‘thrill of the deal’; and
Cases of fraud (around 10 per cent of the post-deal M&A disputes involved allegations of fraud that were central to the claim).
Volatility, which often manifests itself in sharp movements in an acquisition target’s customer or supplier markets, can cause a surprise when the final purchase price is determined. If volatility has not been managed effectively, the purchaser may end up paying more, or less, than they anticipated, with a dispute likely to result.
Another common factor in M&A disputes is ambiguity or imprecise wording in SPAs where they make reference to accounting or financial matters, for example in representations and warranties or in the accounting policies to be used in determining purchase price adjustments at closing. Careful and precise drafting, and input from accountants and financial experts where appropriate, reduces this risk. In instances where this does not provide sufficient protection, the parties may agree to the use of fixed values (as opposed to values that are measured at or after closing). The ‘locked box’ mechanism, where the final purchase price is fixed, is a case in point. The research shows that a staggering 80 per cent of disputes reviewed did not use this completion method.
External pressures or the ‘thrill of the deal’ effect also lead to post-M&A disputes. This is where acquirers come under significant pressure to do deals, resulting in ‘red flags’ being ignored because addressing them would interfere with the deal being done.
Additionally, around 10 per cent of the disputes reviewed by Accuracy were subject to fraudulent behaviour resulting from the manipulation, falsification or withholding of important data. Examples include overstated revenues and earnings, understated liabilities and deliberately incomplete disclosure.
Not all disputes are avoidable. However, many are foreseeable, and avoiding them can come down to getting a few words in the SPA right or ensuring that a single conversation between the M&A lawyers and the financial due diligence team takes place. This can save years of litigation and a great deal of money, time and unnecessary stress.
Heiko Ziehms and Erik van Duijvenvoorde are partners and Edmond Richards a manager at Accuracy.
The “high standards and consideration” model – the managerial concept at the heart of Accuracy’s success – has once again been recognised in the Great Place to Work® rankings, which again places the firm in first position of “companies (less than 500 employees) where it is good to work in France”. Ranking in the top three every year since its first entrance in 2007, Accuracy achieved first place in 2015 and has retained this position in 2016.
The consistency of Accuracy’s top three placement in the Great Place to Work® rankings illustrates the reality and depth of the principles that the firm practices with its teams, and constitutes one of the pillars of its strategy. Trust, respect and honesty are indeed at the heart of Accuracy’s managerial strategy; they are key to the quality of work and services that Accuracy delivers to its clients, to whom the firm owes its continued success and growth since its foundation. In a business where personal investment, reliability and quality requirements are particularly high, being able to work in an environment built on solidarity, teamwork, transparency, fulfilment and involvement of all in projects and the life of the firm, is the main way of nurturing the “high standards and consideration” concept.
Accuracy is pleased to announce the promotions of six new partners. These promotions are made in a context of Accuracy’s continued growth since its foundation eleven years ago. Today, Accuracy has offices in ten countries, 280 professionals, 36 partners and has seen its sales grow by 11% in 2015 compared to 2014.
Accuracy confirmed its advisory role in the £12.5bn sale of EE, the UK’s largest mobile telecommunications provider, to BT, the UK’s largest provider of landlines and home internet, which closed on 29 January to create the UK’s leading communications company. Accuracy advised Orange, who owned 50% of EE in a joint venture with Deutsche Telekom (via its mobile operator T-Mobile), on the sale. The scope of work included preparation of a vendor assistance report, compilation of financial information for the negotiation of the SPA and the calculation of the Estimated Completion Accounts.
Accuracy is pleased to announce that five of its partners – Jean-Baptiste de Courcel, Erik van Duijvenvoorde, Roula Harfouche, Christophe Schmit and Hervé de Trogoff – have been named among the leading arbitration expert witnesses worldwide in The International Who’s Who of Commercial Arbitration 2016, in which only 165 international experts were selected.
Roula Harfouche is a Partner and Anthony Theau-Laurent is a Director in the London office of Accuracy. They specialise in business and IP valuations, particularly in dispute contexts, and in the assessment of complex damages in litigation and arbitration. Here, they discuss the main methods used for valuing a business and their relative advantages, as well as the requirements of acting as an independent expert in dispute cases.
Accuracy has been awarded first place in the 2015 edition of the Great Place to Work® rankings of “companies (less than 500 employees) where it is good to work in France”. Patrick Dumoulin, Director of the Great Place to Work® Institute France, presented the award last night to Frédéric Duponchel, CEO of Accuracy, at the Trianon Theatre. This award represents Accuracy’s conviction, strategic focus and active policy since its foundation in late 2004: respect and well-being of its employees is an essential condition of service quality, client satisfaction and success of the firm.
Accuracy confirmed its advisory role in the £12.5bn sale of EE, the UK’s largest mobile telecommunications provider, to BT, the UK’s largest provider of landlines and home internet, which is set to create the UK’s leading communications company. Accuracy advised Orange, who owned 50% of EE in a joint venture with Deutsche Telekom (known in the UK as T-Mobile), on the sale. The scope of work included assistance in and compilation of information for the SPA negotiations as well as the preparation of a vendor assistance report containing selected financial information on the target.
Accuracy performed financial due diligence for Boursorama, a subsidiary of Société Générale, in the context of the acquisition of the fintech Fiduceo, a specialist in personal finance management solutions.
Given the choice, experts would generally prefer to get involved at the earliest possible stage of a dispute. Why? Because in our experience our early involvement results in both greater personal satisfaction and the best value for the client for the use of our time; it rapidly provides the client with an independent view on the technical merits of its case and its effects on the resulting quantum of loss. This allows the parties to structure and prioritise their dispute resolution strategies before incurring any significant costs, and is likely to result in the most efficient dispute resolution outcome.
To read the rest of the article, please click on the link below.
The following article, ‘Early Resolution of Disputes – an Expert’s Perspective’, written by Accuracy partners, Hervé de Trogoff and Roula Harfouche, has been published in Transnational Dispute Management.
Accuracy is pleased to announce that Hervé de Trogoff has been named one of the leading arbitration expert witnesses worldwide in The International Who’s Who of Commercial Arbitration 2015, in which only 120 international experts were selected.
Accuracy assisted the Carlyle Group in the context of its entry into exclusive negotiations for the acquisition, alongside Montefiore Investment, of a majority stake in Homair, the outdoor hotel group listed on Alternext.
For the sixth consecutive year, Accuracy competed in the classification of “companies where it is good to work” in France, organised by the Great Place to Work® Institute. This year, once again, Accuracy achieved third position in the overall rankings.
For its twelfth edition, Great Place to Work® has published the list of companies where it is good to work in France, rewarding 49 companies for their innovative policies towards their employees.
Accuracy achieved third position in the rankings (firms with fewer than 500 employees), which, according to the Great Place to Work® model, aims to identify, through an independent survey, “the companies where you trust your leadership, you are proud of your work and you are happy to work with your colleagues.”
The Accuracy partners are pleased to announce the promotion of Nicolas Paillot de Montabert as a partner. Nicolas becomes the 19th partner in the Paris office and the 39th partner worldwide.
Gilde Healthcare Services acquires majority share in Viroclinics Biosciences from Erasmus MC.
Viroclinics Biosciences is a diagnostic and clinical trial operation service company providing diagnostic and preclinical studies along with drug development for prevention and treatment of virus infections.
Accuracy, for example, has successfully carried out large engagements: Saur, Club Med, Fnac, real estate clients… As a result, the first semester for the independent financial consulting firm, which comprises 220 consultants (including 37 partners), saw total revenue up 25% compared with the same period in 2012. Its growing reputation, as it celebrates nine years of existence this autumn, is one of the factors behind its persistent financial health.
And there is more. “Our recovery and distressed-company activities have been very strong,” said Frédéric Duponchel, CEO and co-founder of the firm. In addition, companies are requiring more and more assistance with the construction of their business plans and the analysis of their markets. “All our indicators are green, including those in our integrated network abroad. We expect a consolidated turnover of 50 million euros in 2013, compared to 44 million last year,” says Frédéric Duponchel.
Since its creation, Accuracy’s goal has remained the same. The objective? To become “the McKinsey of figures” through tailor-made, high added-value services developed by an active R&D cell. Another characteristic: to ensure stability, any capital gain on shares is forbidden. “Only cash flows generate wages, and this changes the state of mind,” said Frédéric Duponchel.
Accuracy performed financial due diligence and valuation on behalf of Gaumont in the context of the acquisition of Fideline Films, owned by Pierre Richard. Fideline Films holds a catalogue of fifteen films, including the French films “Les Compères”, “Les Fugitifs” and “La Chèvre”.
Accuracy performed financial due diligence in connection with the takeover of Technor Italsmea in Italy, Technor Middle East in Dubai and Technor Asia Pacific in Singapore from the Norwegian fund, Hitecvision.
Bas van Helden has been appointed partner at Accuracy Netherlands. At Accuracy Netherlands, Bas will be responsible for the further development of the Valuation practice. Prior to Accuracy, Bas was a partner with KPMG, where he advised quoted and non-quoted corporates and private equity investors on a broad spectrum of valuation engagements.
The Dutch market leader in Cloud Computing and IT Services, IS Group, announced the acquisition of Amsterdam-based hosting provider, Nxs Internet. The acquisition of Nxs Internet directly contributes to the financial result of IS Group. The addition of Nxs to IS Group further strengthens the company’s position in the public sector, media and e-commerce industries.
Breeding and young plant production companies Dümmen and Agribio Group announced their intention to merge. The combination will benefit from the advanced breeding technologies of Agribio Group and the professional supply chain skills of Dümmen. The combined company will have around 6,000 employees worldwide and realise €175 million ($235 million) in annual sales.
For the fifth consecutive year, Accuracy competed in the classification of “companies where it is good to work” in France, organised by the Great Place to Work® Institute. This year, once again, Accuracy achieved third position in the overall rankings.
For its eleventh edition, Great Place to Work® has published the list of companies where it is good to work in France, rewarding 49 companies for their innovative policies towards their employees.
Accuracy achieved third position in the rankings (firms with fewer than 500 employees), which, according to the Great Place to Work® model, aims to identify, through an independent survey, “the companies where you trust your leadership, you are proud of your work and you are happy to work with your colleagues.”
USG Energy, the USG People subsidiary specialised in HR services for the onshore and offshore industry, has been acquired by Rabo Capital. Supported by Rabo Capital, USG Energy will be able to fully benefit from further growth within its market segment in the future.
Vendis Capital, the private equity fund specialised in the European consumer industry, has taken an interest in Hypo Group. The Dutch Hypo Group is a fast-growing supplier of branded equestrian fashion, for riders as well as horses.
Accuracy assisted in the project by combining its expertise in project management, economic and strategic analysis and financial modelling to produce the business plan for the new company and to manage its implementation.
Groupe Gorgé has acquired Van Dam from the Hollandia Group. Van Dam is a highly specialised Dutch supplier of blast and fire protection solutions, serving blue chip customers in the Oil & Gas, Defence and Wind Energy industries.
Accuracy provided vendor assistance to HVL for the disposal of Armada Outdoor to Andus Group. Armada specialises in total solutions for corporate identity, such as illuminated advertising and signposting.
Despite the current economic uncertainties, whether in mature economies or in emerging markets, countries and businesses are looking for creative solutions to achieve their long-term visions and fulfil their needs. Government and business leaders have to find ways to be both strategic and more efficient in developing and directing funding and in managing major capital infrastructure projects and programmes so that they deliver their intended benefits. […]
In today’s environment more than ever, governments and businesses must respond to ever-increasing pressure and scrutiny from stakeholders to provide tangible evidence that they are rapidly and effectively delivering on their plans and strategies while managing risks effectively. As they invest at an increasing rate in projects and programmes to improve performance and the built environment, introduce enabling technology, redesign processes and manage change, they must do so in an environment that grows ever more complex, crossing enterprise and country borders, involving a variety of internal and external stakeholders, and placing emphasis on speed of deployment and cost reduction. Organisations need to be confident that their programmes and projects will be successful, benefits will be realised and changes will be sustainable. […]
In a context of multiple crises (economic crisis, liquidity crisis, crisis of confidence) and strengthening of banking regulations (Basel III, Vickers report, Volcker Rule), financial institutions are forced to seriously reconsider their model and management. As such, Accuracy has broadened its range of expertise with the arrival of Nicolas Darbo as a partner. Nicolas Darbo will manage the Accuracy service offer for actors in the financial sector (banking, insurance).
Arnaud Lambert and Heiko Ziehms, partners at Accuracy, spoke at this conference, held on 27 March 2012 at the Cercle de l’Union Interalliée, which brought together a number of German professionals.
The following topics were discussed:
Recent M&A activity in Germany
Key structural features of German businesses: consequences for French acquirers
Legal forms of companies in the German Mittelstand and differences in governance (Council / Board of Directors vs. Executive Board / Supervisory Board and Vorstand / Aufsichtsrat). What should a French investor expect: German practices in VDD, financial aspects of the SPA, the particularities of financial DD
Accuracy advised Al Babtain on its takeover of the insolvent company Petitjean (financial analysis and assistance with the preparation of the financial aspects of the takeover offer submitted to the court).
Accuracy announced this January that it was opening an office in New Delhi to support its European clients in India and also to offer its services to Indian companies. The management of this office has been entrusted to Sumit Khosla, who joined Accuracy as a Managing Partner.
Accuracy has reinforced its European presence by opening two new offices in Munich and Rome. At the same time, Accuracy has also set up its first office in North America: Accuracy Canada Inc., based in Montreal and Quebec City.
Zeil: “Bavaria is a compelling location for international players”
MUNICH Accuracy, the leading corporate finance advisory group, has opened an office in Munich, its second in Germany. Martin Zeil, Bavarian State Minister for Economic Affairs, welcomed the move saying: “The opening of Accuracy’s Munich office shows the importance and drawing power of Bavaria’s economy. Bavaria’s leading rank in Europe with its high concentration of potent and successful enterprises makes it an indispensable node in the global network of renowned international consultancies like Accuracy.”
David Cayet, Managing Partner at Accuracy Germany, seconds: “Munich, as one of Germany’s top business locations, plays a key role in our growth strategy.” Proximity to clients is of importance to Accuracy. The agglomeration of major listed companies as well as successful medium-sized companies in the Munich region enables Accuracy to develop tailored solutions at close hand to clients that contribute to their success. Cayet also remarks that the attractiveness of Munich as a workplace will help Accuracy to build a team of high-calibre advisors.
Accuracy is a truly independent corporate financial advisory firm that can offer companies and their shareholders a global reach in the field of financial advisory services. Accuracy now has offices in seven European countries and accompanies its international clients all around the globe. Accuracy combines extensive know-how in areas such as auditing techniques, valuations, financial modelling, financial forecasting, and market analyses in order to serve its clients and help them deal with diverse situations including: acquisitions, disposals, companies in difficulty, restructuring, litigation and dispute resolution.
Accuracy is continuing to grow and is now present in North America, with the opening of two offices in Canada, in Quebec City and Montreal.
Guylaine Leclerc, the Managing Partner, will be in charge of Accuracy’s development activities in Canada. She will be assisted in this task by Manon Roy and François Filion, who are both specialists in financial fraud, litigation and arbitration.
The Amsterdam Foodbank is a non-profit organisation, which is committed to distributing food to needy households. The Foodbank serves as the link between the excess food within the “for-profit” food industry (retailers, manufacturers, growers, etc.) and the “non-profit” food pantries and community. The Amsterdam Foodbank is currently responsible for the distribution of food and other necessary items to households via 13 distribution points throughout Amsterdam. Three hundred volunteers assist the Foodbank in the distribution.
On Monday, 31 January 2011, Accuracy Netherlands, represented by Surita Reynolds (Director of Accuracy Netherlands), donated 20 footballs, each accompanied by a donation of €25, to the Amsterdam Foodbank to support it in its cause. Ellie Goemans, coordinator of the Amsterdam Foodbank, commented that it is especially the children from the families who rely on the Foodbank who will gain the greatest happiness from a “luxury” item such as a football, whilst the cash donation will help to cover the running costs of the Foodbank.
Erik van Duijvenvoorde has over 20 years’ experience in financial advisory services and will strengthen Accuracy’s European partnership thanks to his considerable knowledge of private equity and his British culture. He will share his time between Accuracy’s offices in Paris and London.
On 1 October 2010, Accuracy Netherlands moved to its new office at the World Trade Centre Schiphol (WTC Schiphol). The official opening was held on 25 November 2010 at our new office. During the opening, Kees Jansma (presenter, producer and press officer of the Dutch National Football team) was our guest speaker. We discovered how a team of professionals achieves top performance. Please send an email to email@example.com for more information about the official opening of our new office.
The year 2010 has proven to be a very successful year for Accuracy Netherlands so far. It has been able to support its clients on interesting assignments in all its service lines. For example Accuracy Netherlands has been able to assist various private equity firms with a large number of successful acquisitions. We have provided assistance to:
Mentha Capital on its acquisition of Axon Digital Design
Synergia and VLD on the acquisition of Ploeger Agro B.V.
Van Lanschot Participaties on its acquisition of 40% of the shares of Gerco Brandpreventie
DG Infra+ on its first investment (a joint-venture with ParkKing) in the Netherlands
Nimbus on its acquisition of Stork Food & Dairy Systems
Nearly all our due diligence reports were used by the financing banks.
Accuracy Netherlands has performed various corporate recovery assignments, including:
Assistance to a real estate company with the divestiture of part of its assets (this included assistance with the sale process as well as the design and preparation of a data room)
An IBR assignment for a midsize advisory firm; this IBR entailed the financial review of a business plan that was prepared for the restructuring of this firm. Our report has been used by the financing banks
In the area of Valuations and Fairness Opinions, Accuracy Netherlands has supported the following clients:
Friesland Bank Investments with the PPA for its acquisition of DORC International
A large foreign insurer with the contemplated acquisition of a Dutch industry rival
As of 1 October 2010, Accuracy Netherlands has officially opened its new office at Schiphol WTC. The office is at the centre of Schiphol Airport and thus located in the heart of domestic and international trade.
September 2010: Accuracy opens its seventh European office in London.
Accuracy pursues its growth and maintains the rhythm it has kept since its creation in 2004 of opening one foreign office each year. Thus, after Madrid, Amsterdam, Milan, Frankfurt and Brussels, Accuracy has decided to open in London.
Accuracy will offer its full service line in London and will add new expertise in construction and engineering projects.
To this end, two new partners have joined Accuracy: Wendy MacLaughlin and Hervé de Trogoff. Aged 39 and 35 respectively, they are well-known specialists in large construction and engineering projects worldwide. Experts in the analysis of delays, they have acted in numerous international litigation and arbitration cases.
Wendy MacLaughlin has an MSc in Project Management from Reading University and an MSc in Construction Law and Arbitration from King’s College London, and Hervé de Trogoff has an MEng in Civil Engineering from HEI and an MBA from the London Business School.
French corporate advisory firm Accuracy advised the Altarea-ABP-Predica consortium in their recent acquisition of the CAP 3000 shopping centre from Galeries Lafayette at a transaction price of €450 million, reflecting a yield of 4.65%. The success of the acquisition, says Accuracy partner Nicolas Barsalou, who worked on this transaction alongside senior manager Nicolas Paillot de Montabert, reflects the difference the firm brings to its clients.
Neuilly-sur-Seine, March 15, 2010 – For the second year running, Accuracy has earned its place on the list of “Best Workplaces” in France organised by the Great Place to Work® institute. This year once again, Accuracy is a Gold Prize-winner, taking third place in the overall ranking.
Neuilly-sur-Seine, February 10, 2010 – Accuracy has recently appointed three new partners at its Paris office, which now has a total of 80 consultants. The promotion of Jean-Baptiste de Courcel, Frédéric Loeper and Henri Philippe brings the number of partners at the Paris office to 17.
Neuilly-sur-Seine, September 2, 2009 – Accuracy has had a solid start for the autumn season as it further reinforced its teams with the recruitment of one new partner: Xavier Chevreux (ex Carlyle Group and the Legris Industries Group).
To mark the first anniversary of CFnews (the Corporate Finance news source) on April 29 2009, Accuracy, accompanied by Agathe Zilber, presented the results of a survey examining the theme of: “When will the M&A market pick up again?” at the Cercle de l’Union Interalliée.
For the second consecutive year, Accuracy and CFnews have jointly organised a study, within the framework of their partnership, involving the senior management, financial, and M&A departments of major groups.
In 2008, the purpose of this study was to examine the relationship between major companies and investment funds. The goal of the 2009 survey was to shed light on the impact of the financial crisis upon M&A operations.
During this special anniversary evening, Frédéric Duponchel, (Accuracy’s Managing Director), and Arnaud Lambert, (Associate) gave speeches to exclusively present the survey results.
The speeches were followed by the awarding of prizes to four company managers.
You can view the results of the study online.
For further information about the organisation of this event, please visit: www.cfnews.net.
At the start of 2012, the outlook for the world economy remains uncertain and highly divergent.
Most emerging economies have shown an upturn and while there has been moderate recovery in North America, prospects for growth in Western Europe remain weak. Furthermore, countries in the West continue to struggle to restore their public finances. Their central banks have been injecting massive amounts of cash into the economy, maintaining structural imbalances and inflating asset values.
For each country where Accuracy is present, an expert provides clarity and analysis in their field.
We have presented these interviews in their original language so as to not distort the subject.
Neuilly-sur-Seine, March 26, 2009 – For the first time since its foundation, Accuracy has competed for a place on the list of “the best workplaces” in France organised by the Great Place to Work® Institute. The company earned a place among the Gold Prize-winners, and took second place in the overall ranking.
For the first time since its foundation, Accuracy competed for a place on the list of “the best workplaces” in France organised by the Great Place to Work® institute
Accuracy earned a place among the Gold Prize winners, and took second place in the overall rankings.
For its seventh edition, the Great Place to Work® has published its list of prize-winning companies offering the “Best Workplaces” in France. Out of the 109 companies competing, 30 companies were rewarded for the innovative policies they have adopted on behalf of their employees.
Participating in this competition for the first time, Accuracy immediately jumped to second place in the overall rankings.
After joining Accuracy in 2005 as an analyst, Sébastien Caron (X99, HEC 2003), who has been a manager since June 2008, describes his three years’ experience with the company in an interview with X-Passion magazine (The journal for students at the École Polytechnique).
Please click on the following link for full details of the interview:
For its seventh round table meeting, Accuracy chose to examine the all-important subject of people and skills, an issue that is always of primary importance to any company.
In order to explore this theme, Accuracy was joined by a leading company manager Mr Henri Proglio, CEO of Veolia Environnement and a leading academic, Mr Laurent Batsch, President of Paris-Dauphine University.
“Where can we find talent for our businesses?”
This theme was debated for nearly an hour and a half.
In response to questions from Frédéric Duponchel, (Accuracy’s Managing Director), the two speakers contributed their respective views from a recruiter’s and a trainer’s viewpoint, including the following questions:
Do our graduate schools and universities provide a sufficient source of talent?
Which characteristics are most desirable today?
Does the recruitment of atypical candidate profiles offer an alternative?
As part of a partnership with Premier Cercle, an independent research company specialising in the organisation of economic, financial and technological conferences, Accuracy gave a speech on October 8, 2008 at the British Embassy during a breakfast debate on the theme of: “Winston Churchill, decision maker – the lessons of the War Rooms.”
As strategic and tactical centres used to organise resistance to German attacks during the Second World War, Winston Churchill’s War Rooms were a strong factor behind the Allied victory.
Since then, this concept has been adopted and widely used in the corporate world.
Companies such as Accuracy or Arsene make day-to-day use of a modern version of the “War Room” in the business world by pooling key information during corporate acquisitions and disposals, dispute resolution, economic difficulties or structural problems, sharing this information in a single location and for a limited period of time.
Frédéric Duponchel, (Accuracy’s Managing Director), spoke on the subject along with a number of other speakers including:
Sir Peter Westmacott, British Ambassador to France,
Admiral Pierre-François Forissier, Chief of Staff of the French Navy,
Jacques Gounon, CEO of Eurotunnel,
Frédéric Donnedieu de Vabres, Lawyer & Partner in Arsene (a law firm specialising in corporate taxation cases),
A little over a year ago, Accuracy organised roundtable sessions to discuss the relationship between the private equity world and that of the major companies. On Tuesday, 23 September, 2008, the company organised a new roundtable meeting to discuss the following theme:
“What can companies learn from the practices of the private equity sector in the field of managers’ pay?”
Jean-François Palus, Executive Vice President (Finance) for the PPR group, Claire Pedini, Human Resources and Communication Manager for the Alcatel-Lucent group, Alexis Dargent, Lawyer & Partner at Mayer Brown, and Christophe Leclerc, an Accuracy Associate, met to debate this issue.
The discussions, which were moderated by Frédéric Duponchel (Accuracy’s Managing Director), successively focused on the ethical, technical, practical and legal issues related to the construction of management compensation packages in major companies.
The discussions continued in a more informal manner during a lunch featuring the wine-related theme of “The Chablis 1ers and Grands Crus from Domaine Pinson.”
Levallois-Perret, May 14, 2008 – Accuracy has announced that it will open its fourth European office in Milan. This office will join Accuracy’s other offices in Paris, Madrid and Amsterdam and marks a new stage in Accuracy’s expansion throughout continental Europe. The company’s positioning policy and independence continue to be extremely popular with both companies and financial investors.
During the CF News launch evening held on 13 February, 2008 at the Senate President’s department’s Boffrand Reception Rooms, Accuracy along with Agathe Zilber, presented the results of the survey exploring the idea: “Investment funds and major companies: dangerous or fruitful links?”
Specifically organised for senior management, financial and M&A teams from major groups, this survey was jointly completed by Accuracy and CF News as part of their ongoing partnership and was intended to clarify the relationship between major companies and investment funds.
Accuracy presented the exclusive results of the survey during this inauguration evening for CF News, a new information website and online database devoted to Corporate Finance.
The speech was followed by an analysis by Agathe Zilber, Founder of CF News, and useful commentaries from Eddie Misrahi, President of the AFIC, and Barnaby Noble, M&A Manager at Alstom.
In partnership with Capital Finance, Accuracy gave a speech during a breakfast debate on the problems related to management compensation packages, along with the law firm Sarrau Thomas Couderc on 27 November.
This debate was moderated by Franck Moulins (the Chief Editor of Capital Finance) and was attended by more than a hundred people.
– Why and how do changes in the relationship between LBO funds and management make it necessary to adopt a new approach to management packages?
– How are management packages being adapted to take account of these changes?
– Which legal, fiscal and financial aspects now need to be taken into consideration?
These questions were covered at length by the four speakers: Jean-Bernard Thomas, Hervé-Antoine Couderc (STC), Arnaud Lambert and Christophe Leclerc (Accuracy).
To learn more about this event, please click here to view the meeting report.
A new Accuracy roundtable meeting was held on Monday, November 19, 2007
Before an audience of more than 100 people, Gonzague de Blignières and Henri Lachmann responded to questions from Frédéric Duponchel on the following theme: “Private Equity: what sort of capitalism can we expect for the 21st century?”
The creation and sharing of value, transparency, and the long-term future of the model were all themes discussed during this productive, friendly and very frank discussion. The guests then answered questions from the audience. The debate continued later in a more informal manner based on the wine-related theme chosen for the evening: “The great wines of the Côte de Nuits from Bouchard Père et Fils”.
As part of a partnership with Premier Cercle (an independent research company specialising in the organisation of economic, financial and technological conferences) Accuracy gave a speech during a breakfast meeting on October 18 on the theme of “Rugby: A management school?”
This event, held a few days before the Rugby World Cup Final, provided a great opportunity to explore the many similarities between the values of this sport and the corporate world, such as the best way to lead a team to victory and the ideal manner in which to encourage staff performance.
After an extensive introduction from Daniel Bouton (CEO of Société Générale), Henri Lachmann (President of Schneider Electric) and Pierre Villepreux (Regional Development Manager for Europe for the International Rugby Board) led the discussion on numerous subjects, including: “Training, or the role of the leader,” “In the scrum with the Lions,” and “Combat: all part of a manager’s job.”
A number of leading speakers took part in the discussion:
– Bruno Bousquié, Vice President of Bearingpoint,
– Benjamin Cohen, CFO of the Accor group,
– Aubin Hueber, Trainer of the French amateur rugby squad,
– François Leccia, President of Noventeam and Director of the Sport & Management Institute of Grenoble EM
– Ross Melzer, Business Development Manager for Wall Street Journal Europe
– Alain de Pouzilhac, CEO of France 24,
– Michel Rouger, President of the Préaje Institute,
– Henri Tcheng, Managing Director of Bearingpoint.
The morning ended with an exclusive interview with Bernard Laporte, coach of the French rugby team, conducted by Frédéric Duponchel, (Accuracy’s Managing Director).
What example can private equity firms give to the listed companies?
Last Wednesday, June 20, Accuracy organised a new roundtable on the following theme: “What example can private equity firms give to listed companies?” Accuracy had the privilege of welcoming:
– Jérôme Lefébure, CFO of the M6 Group
– Vincent Marcel, Director of financial business and Strategic Operations of the Valeo Group
– Jean-François Palus, CFO of the PPR Group
– Eric Reiss, CFO of the Carrefour Group
to debate this question.
The debate was moderated by Frédéric Duponchel, Managing Director of Accuracy.
Providing a 10-day cruise to South Corsica for 30 youngsters undergoing cancer and leukaemia treatment was the mission of a major project organised by Rêve d’enfance (Childhood Dreams), a humanitarian association supported by a group of students from the HEC business school. Profoundly moved by this initiative, Accuracy eagerly lent a hand with this great cause by providing the association with the financial support needed to get this fifth cruise underway in 2007.
In 2007, Accuracy supported the humanitarian association Rêve d’enfance
On June 18 of last year, six sailing boats set off from Bastia for an incredible Mediterranean cruise around Corsica. From the small port of Macinaggio to Propriano, passing Porto-Vecchio and Bonifacio, children, students, skippers and medical interns were able to leave the difficulties of serious illness behind during a short holiday. In the words of the Rêve d’enfance team: “A child’s smile is the best reward of all”.
Relay races in inflatable boats, “balloon tag” games, visits to monuments, water fights and ice cream tasting were some of the activities enjoyed during this maritime saga filled with new discoveries and good humour. “This boat trip was an amazing personal adventure” recalls Mathilde, a medical intern in Paris.
Following the success of this trip, the members of Rêve d’enfance are preparing the next convalescence trip, which Accuracy is delighted to support.
Sarah Rozenberg, President of the Rêve d’enfance association and an HEC student, would like to take this opportunity to thank Accuracy:
“After having experienced the magic of the Rêve d’enfance Cruise, you really get a chance to appreciate the importance of these 10 days to these children recovering from cancer or leukaemia. Seeing them laughing, running, singing and dancing each day, having fun just like any other children their age despite their experience of the disease, makes us more determined than ever to fight to bring these children a little happiness. Above all, we would like to warmly thank our partners, without whom the 2007 Cruise would simply never have happened. A huge thank you to Accuracy for its support. Thanks to you, this year once again Rêve d’enfance has been able to repeat this initiative, and 30 children from all over France were able to forget their illness for a while during an incredible trip to Corsica”.
Levallois-Perret, June 5, 2007 – Following the Paris and Madrid offices, Accuracy has announced the opening of its third European office in Amsterdam. The positioning policy and independence of this young financial consulting firm are attracting increasing numbers of companies and financial investors.
On November 14, 2006 the Capital Finance newsletter, the Accuracy consultancy company and the Sarrau Thomas Couderc law firm organised a breakfast debate on the theme of: “Management packages as part of LBO operations: new aspects and challenges”.
The Accur’option® evaluation tool developed by Accuracy offers a useful response to these problems.
Franck Moulins, Journalist and Departmental Manager for Capital Finance, ran the debate, which included speeches by:
Frédéric Duponchel, Managing Director, Accuracy,
Christophe Leclerc, Associate, Accuracy,
Hervé-Antoine Couderc, Lawyer & partner, Sarrau Thomas Couderc,
Stéphane de Lassus, Lawyer & partner, Sarrau Thomas Couderc.
…on the following points:
The modelling and comparability of management compensation packages
The challenges involved in valuing management compensation packages
Increasing the number of people who benefit from a management compensation package
Extension of these remuneration systems beyond LBO operations
A company may find itself faced with various forms of financial crime, and in particular fraud, money laundering and corruption.
What sort of risks does the company incur (internal and legal risks, unfavourable media coverage, etc.)?
Which factors may lead to a risk?
How can the company measure its level of exposure?
How can risks be avoided, detected and reduced?
On June 1, 2006, Accuracy’s Managing Director, Frédéric Duponchel, brought together the following experts to answer the many questions commonly asked about this highly topical theme:
After having been the first woman to manage the financial section of the Paris Public Prosecutor’s Office from 1995 to 2000, Anne-Josée Fulgéras worked as a consultant. Today, she is Compliance and Legal Affairs Director of Natexis Banques Populaires.
As Head of the Financial Security and Anti-Laundering department of the Banque Fédérale des Banques Populaires, David Hotte is regularly involved in assessment and training assignments for the International Monetary Fund and the World Bank.
Michel Léger is one of the twelve members appointed by decree in 2003 to the High Counsel of Statutory Auditors (Haut Conseil du Commissariat aux Comptes). After a 30-year career with Arthur Andersen, he today manages the company Léger & Associés.
Accuracy, a subsidiary of the Aon group specialising in the provision of corporate financial consultancy services, has created Accur’option®, an innovative valuation model for all types of circumstances encountered within unlisted companies.
Company leaders do not hesitate to enlist consulting services to carry out acquisition processes. During the following integration phase, management’s attention naturally focuses on the operational aspects. Yet, the mastery of financial and accounting issues is essential for legal and fiscal security as well as for the sustainability of the new organisation.
With the new IAS-IFRS accounting standards, financial valuation – and the DCF (discounted cash flow) method more particularly – has taken on key importance in the preparation of consolidated accounts. This is somewhat worrying given the lack of consensus on how to determine the discount rate, a key parameter of the DCF method and one familiar to financiers under its English acronym, WACC.
In recent years, business valuation has been central to many discussions that go beyond the small circle of financiers. The “fair value” of the IFRS accounting standards has led business leaders and auditors to look closely at valuation methods; given the high stakes in financial communications, this interest is particularly noticeable in the valuation of intangible assets on company balance sheets…
The strength and relevance of economic arguments are becoming increasingly important in the valuation of damages, whether the case involves an intellectual property dispute, a seller’s failure to disclose, a violation of competition rules or the wrongful termination of a contract.
Among business valuation methods, the discounted cash flow (DCF) method has gotten a bad rap. At best, it is criticised for being built on sand, i.e. based on forecast data that are inherently uncertain. At worst, it is suspected of serving a desired result through the adjustment of one or other of its components, namely future cash flows and the discount rate. In short, the DCF method is said to be inherently subjective and easily manipulated.
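The sensitivity at the heart of this criticism is easy to demonstrate. The sketch below (with purely illustrative cash flows and rates of our own choosing, not figures from any real case) values a five-year forecast with a Gordon-growth terminal value; shaving one point off the WACC and adding half a point of terminal growth lifts the result by roughly a quarter.

```python
# Minimal DCF sketch (illustrative figures only): enterprise value as the
# sum of discounted forecast cash flows plus a Gordon-growth terminal value.
def dcf_value(cash_flows, wacc, terminal_growth):
    # Discount each cash flow of the explicit forecast period.
    pv_flows = sum(cf / (1 + wacc) ** t for t, cf in enumerate(cash_flows, start=1))
    # Terminal value: last flow grown once, capitalised at (wacc - g),
    # then discounted back from the end of the explicit period.
    n = len(cash_flows)
    terminal = cash_flows[-1] * (1 + terminal_growth) / (wacc - terminal_growth)
    return pv_flows + terminal / (1 + wacc) ** n

flows = [100, 105, 110, 115, 120]  # hypothetical five-year forecast, in EUR m
base = dcf_value(flows, wacc=0.09, terminal_growth=0.02)
tweaked = dcf_value(flows, wacc=0.08, terminal_growth=0.025)
# Two modest parameter tweaks move the value by about 25%: the
# sensitivity that fuels the "easily manipulated" criticism.
print(round(base), round(tweaked))
```

Most of the swing comes from the terminal value, which is why the choice of WACC and perpetual growth rate dominates any DCF debate.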
by Frédéric Duponchel, Chairman and CEO of Accuracy
In high-stakes litigation, the litigants may call upon a financial expert who works with their attorneys. Beyond mastering the economic and financial aspects of the dispute, such an expert can back up a position technically and provide crucial support during critical phases, such as testimony.
Today, the party-appointed financial expert is more often a firm than an individual; it must be able to field experienced teams with thorough technical knowledge, while demonstrating its ability to explain its work and ensuring the independence of its findings.
During an acquisition, the analysis of pension liabilities and costs can turn out to be crucial from a financial point of view. The methods for handling these aspects highlight the economic issues related to these financial components.
Assessment of damages involves appraisal of the incurred loss (damnum emergens) and the lost profit (lucrum cessans). Behind financial and accounting calculations, which can seem simple, hide the practical difficulties of interpreting numerical concepts. These difficulties often explain a large part of the difference between the assessments respectively made by each of the parties involved.
Accuracy is a corporate finance consulting firm specialising in litigation support.
To appraise brands, there are two approaches – intrinsic and analogical – which cover the two categories of methods implemented to appraise companies. Alongside these valuation approaches, there are also references that can provide an initial estimate.
This article aims to present a non-exhaustive but sufficiently complete critical overview to provide a perspective on the various references and methods available.
Do the accounting losses reported by the buyer of a company after the acquisition, representing a deterioration of the net assets, necessarily give rise to damages? How should they be assessed? We offer a number of lines of thought (Part 1) and an analytical framework (Part 2), drawn from our experience as a party-appointed expert in post-acquisition litigation.
Appraising a financially troubled business in an LBO is a virtuous exercise that provides lessons. On the one hand, it provides an understanding of where the volatility of the value of the shares comes from. On the other hand, it demonstrates that holding the shares of such a company means holding an option! This change of perspective helps to explain many of the decisions made.
Shareholders, mezzanine investors, senior bankers, management, and all of the parties involved in the financial restructuring of a business need objective information to fuel their thinking. An independent financial and economic analysis gives them a common basis for negotiation.
Very sluggish since 2008, the acquisition market could show the first signs of recovery in autumn 2009. Difficulty in raising debt, uncertainty about cash flow forecasts, stakeholder apprehension… the situation has not been conducive to acquisitions and sales since late summer 2008.
The advent of IFRS (International Financial Reporting Standards) has shaken up the practices of businesses, auditors and financial analysts. Among the many questions raised by the application of these standards, one in particular concerns valuers.
Accuracy is regularly appointed as a financial and economic expert in litigation. Can you explain this mission?
Frédéric Duponchel and Christophe Schmit: In most of our cases, we act as a party-appointed expert. We are called upon by corporate attorneys or directly by companies. We also work alongside other experts. In construction litigation, for example, technical experts are also appointed, but they do not address the financial aspects of the dispute.
Amid great caution in M&A, transaction support professionals should move towards detailed, targeted, critical analyses.
Deviations from certain VDD standards, observed between 2001 and 2007, must leave room for alternative analysis tools.
Financial analysts are – or should be – amongst the most assiduous readers of the consolidated accounts published by listed companies, quite simply because this reading is one of the key components of their profession. By neglecting to do so at the height of the stock-market frenzy in 1999-2000, several star analysts had their fingers burnt and tarnished the credibility of an entire profession.
Retail banking has been experiencing a decline in its profitability for the past ten years
Retail banking has been under pressure for several years now. Its profitability has significantly declined: standing at 17% on average for French banks in 2000, it virtually halved after 2008 to stabilise at around 8–9% in recent years. Its price-to-book ratio is in line with this trend, with values around 0.5 on average, and a range from 0.3 to 0.7 in Europe.
This decline has made the banking sector much less attractive than others. Its ROE is currently below that of sectors like telecoms or consumer goods, not to mention health, the new El Dorado for investors. As for price-to-book ratios, the banking sector now has one of the lowest of all, after having led the field for some time.
One of the causes of the sector’s deterioration lies in its lacklustre net interest margin. For all French banks combined, the net interest margin has fallen by 10% since 2009, whilst outstanding amounts have increased by 10% over the same period. However, the race for volumes and outstanding amounts is now more difficult to run due to the cost in RWA and, therefore, in equity.
In this context, French banks’ results remained relatively stable in 2018. When looking at the market’s six major banks, the NBI of their retail banking divisions was stable compared with 2017 at around €60 billion. The net income of the same divisions also remained stable at around €18 billion, despite slight variations from one bank to another.
Nevertheless, this stability cannot hide the scale of the challenges facing the sector in the coming years, after ten lean years already. At least five such challenges can be identified: four in the financial and regulatory sphere and one in the commercial sphere, the latter likely to be the most dangerous in the long term. Combined, these challenges could reduce ROE by an additional 2.5 points by 2025.
Four challenges affect the financial and regulatory sphere
The first challenge relates to a fundamental aspect of the business: the level of interest rates. Retail banking plays a major role in the economy by ensuring the essential function of transforming short-term resources – deposits – into long-term uses – mortgages and business loans, in particular. ALM is at the heart of this transformation, which requires an upward-sloping yield curve, ideally with a slope of 2%.
However, the monetary policy put in place after the Lehman Brothers crisis led to a double decrease in rates: first, in the short term, through the decrease of the reference rates, and second, in the long term, through the quantitative easing programmes. The yield curve slope in Europe gradually fell to almost nil. Yet earning money with no slope flies in the face of the fundamental laws of retail banking, which finds itself in an unprecedented and dangerous situation.
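The economics of maturity transformation described above reduce to a very simple relation: the gross margin is the slope of the yield curve applied to the transformed volume. A minimal illustration, with entirely hypothetical rates and volumes chosen only to show the mechanism:

```python
# Maturity transformation in one line (hypothetical figures): deposits
# remunerated at the short rate fund loans granted at the long rate,
# so the gross transformation margin is slope x volume.
def transformation_margin(short_rate, long_rate, volume_bn):
    return (long_rate - short_rate) * volume_bn  # gross margin, in EUR bn

# With a 2-point slope on EUR 1,000bn the margin is ~EUR 20bn;
# with a near-flat curve it collapses to ~EUR 2bn.
print(transformation_margin(0.005, 0.025, 1000))
print(transformation_margin(0.000, 0.002, 1000))
```

The collapse from the first case to the second is, in miniature, what a flattened yield curve does to retail banking revenues.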
The second challenge relates to equity. Between 2005 and 2018, French banks more than doubled their equity under Basel 3. At the end of 2018, this equity amounted to almost €300 billion against €130 billion in 2005. But after Basel 3, Basel 4 is now coming into play. The impact will certainly be much less significant than previously indicated, but this increase could further reduce ROE by a few precious tenths of a point.
The third challenge is cyclical: the cost of risk. Overall, the cost of risk for all business lines grew from 0.1% of outstanding amounts in 2005 to almost 1.3% in 2009, before gradually returning to pre-crisis levels, at around 0.3% in 2018. However, if the cycle were to deteriorate, the trend could reverse. Each additional tenth of a point in the cost of risk represents €2 billion for French banks; it is not negligible.
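The rule of thumb above can be checked with back-of-the-envelope arithmetic using only the figures in the text: if one tenth of a point of cost of risk equals €2 billion, the implied loan book is around €2,000 billion, which puts the 2009 peak at roughly €26 billion a year against about €6 billion at 2018 levels.

```python
# Back-of-the-envelope reading of the article's figures (illustrative only).
# EUR 2bn per 0.1 point (1/1000) of cost of risk implies outstandings
# of about EUR 2,000bn for French banks.
outstandings_bn = 2 * 1000

# Annual cost of risk at the 2018 level (~0.3%) vs the 2009 peak (~1.3%).
cost_2018_bn = outstandings_bn * 0.003
cost_2009_bn = outstandings_bn * 0.013
print(outstandings_bn, cost_2018_bn, cost_2009_bn)
```

The EUR 20bn gap between the two regimes shows why a turn in the cycle is a material threat to sector profitability.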
The last challenge in the financial and regulatory sphere – in theory random, but ultimately more structural in the past ten years – relates to fines. Since 2007, European and American regulators have imposed USD 230 billion in fines on international banks, of which subprime cases represented almost 70%. Many may well think that the worst has passed, but fraud and money laundering could take over from the subprime crisis, leading to continued damaging penalties in case of breaches.
Ultimately, these four challenges could deduct almost two additional points of ROE from French banks, taking the sector down to extremely low levels within five years. Beyond pure performance effects, these challenges may also handicap banks at a time when they need to invest massively to face the following challenge, that of business and development.
The commercial challenge is the primary threat in the long term
The commercial sphere is not exempt from difficulties. Indeed, the commercial challenge represents the primary threat to retail banking in the long term, with the growth in intensity of the competitive landscape relating to three factors: a reduction in barriers to entry, the transformation of the use of retail banking and the technological revolution in progress. New players – neobanks – are making the most of it, and threatening traditional banks in the process.
If we summarise the history of neobanks over the past 30 years, we can consider that there have been two phases. The first saw players in the supermarket and hypermarket sector, like Carrefour and Casino, and players in the insurance sector, like Axa and Allianz, extend their offers to include banking services, with the aim of making clients more loyal. In response, traditional banks most often created online banks to retain clients.
The second phase saw the context change. Since 2010, new categories of players have attempted to penetrate the market with relatively different models. One such category includes players like Revolut or N26, which are trying to create low-cost banks with full service offers from scratch, particularly by creating partnerships. These players are in a race to reach critical mass. Another category relates to new-technology players, which enter the market through specific services – payments, in particular – but are not looking to provide a full range of services from the outset. By way of example, we could cite Apple Card or Apple Pay, or Amazon providing insurance services. These players tend to eat away at traditional banks’ revenues.
This effect is growing, as neobanks are gaining 300,000 clients each year. Initially, their NBI per client is much lower than that of traditional banks, with the client often only opening a secondary account. But, with time, and the gradual development of the range of products, the service offer grows and these secondary accounts can turn into main accounts.
Neobanks are also imposing new quality standards in two ways: first, through high levels of client satisfaction on digital applications (some traditional banks have already reacted by significantly upgrading their applications) and second, through immediate account opening. A client may set up an account with a neobank in just a few minutes, far from traditional standards.
Nevertheless, despite their sometimes undeniable commercial success, no neobank has managed to find its way to profitability – their aggressive welcome offers are often an obstacle to it.
Traditional bank models appear to be partially protected in the short term
Retail banking is usually presented in terms of two complementary environments: the first covers items that have no impact on a bank’s balance sheet (“non-BS environment”) and includes services related to payments and everyday banking; the second covers items that do have an impact on the balance sheet (“BS environment”) and includes savings, credit and insurance products. For some years now, a third environment has emerged for certain players: complementary proximity services, such as telephony or smart-home products.
Barriers to entry for the two main environments have decreased in recent years, but not homogeneously. For the non-BS environment, they have been subject to a twofold reduction, both regulatory (PSD2, open banking, GDPR, etc.) and technological (digital progress, artificial intelligence). The BS environment is better protected, via numerous regulations on equity and liquidity, but unfortunately suffers from the lack of slope caused by the flattening of the yield curve.
In this context, the risk for traditional banks is to become simply the balance sheet for other players that capture the client relationship. Traditional banks have the advantages of expertise and existing distribution networks, but they generally lag behind online banks and especially fintechs in client experience and technology.
Nevertheless, their physical networks create a real advantage and a strong differentiating factor for at least three reasons: they provide, first and foremost, a degree of proximity at a time when the city centre is being rediscovered; they facilitate human contact, enabling real dialogue with the client and making it possible to provide bespoke advice; finally, they offer a physical complement to digital channels, which makes real sense in the digital age. Moreover, digital services do not currently provide 100% of the financial services needed.
The most threatened environment is, of course, that of everyday banking and payments, which is very sensitive to speed and price, two factors well served by digital technology. By contrast, the BS environment is less vulnerable, and the opportunity for a client to meet an adviser remains an advantage when seeking a mortgage or advice on savings.
Traditional banks must continue to accelerate their commercial and organisational transformation
To resist the rise of new players, traditional banks must upgrade their non-BS environments to the level of best practices in the digital sphere, and they must differentiate their tariffs based on their clients. As for the BS environment, they must capitalise on the existence of their physical networks, often in excellent locations; they must improve the quality of advice offered; and they must put in place better-segmented loyalty programmes.
These observations could constitute a three-part action plan, based on diversification, ubiquity and respectful use of data.
Diversification is vital to allow traditional banks to escape the impossible equation of a non-BS environment open to new, more agile players and a BS environment suffering from suffocating yields. Creating new proximity services would make it possible to develop revenues, whilst strengthening the link between banks and their clients.
These services must first be defined in relation to the current core business, around major events in the relationship. Housing – the acquisition of property, in particular – lends itself naturally to this very well. But it is possible to go further, towards services related to travel or even certain hobbies. These successive expansions will allow banks to become key players in the everyday lives of their clients.
Ubiquity is the second axis, which is very complementary to the first. Physical networks, whilst expensive, constitute a major advantage for traditional banks. All that remains is to transform them into living spaces capable of attracting a large number of clients back. But this physical network must also be combined with a flawless digital service, developed internally or via partnerships with fintechs.
However, banks will only be able to capitalise on the advantage of their physical networks if they are able to deliver the quality of advice expected by their clients. To do so, they will have to meet three major needs: maintain stability in the client–adviser relationship, pay special attention at key life moments (unemployment, retirement, divorce, etc.) and provide personalised advice for certain products (life assurance and mortgages, in particular).
Finally, traditional banks have a major role to play in the use of data, by becoming a trusted partner of their clients and, in so doing, taking the opposing position to GAFA. If they succeed, they can win back points in the corporate image battle. This is especially the case given that clients seem ready to share their data if it means that they will receive better quality advice.
These three axes could constitute the core of a five-year plan aiming to win back all or some of the ROE points that will be lost. Nevertheless, these actions will not be enough on their own and will only serve to complement the savings and process-simplification plans already implemented in all banks, and which will have to be accelerated further.
In such a tense context on all sides, retail banking is almost at a crossroads, required to reinvent itself to take into account changes in use and the arrival of new players, all whilst being deprived of some of its financial means to invest through the absence of slope in the yield curve. The players facing these challenges are not all equal, and banks are not likely to follow the same trajectory.
In this world of constantly developing technology, it is clear that change is coming for the legal profession. Artificial intelligence (“AI”) is only one of a number of new technologies said to be threatening or revolutionising the legal industry. Lawyers attend conferences on new technology and leave more aware than ever of the challenges of AI, Online Dispute Resolution (“ODR”), GDPR, cyberattacks on law firms and even blockchain.
However, from the perspective of a forensic technology expert, the legal profession is not quite under threat. There are some interesting legal technologies arriving and some old technologies re-emerging that are worth mentioning, but, overall, jobs are safe. This is even more so the case for arbitration.
Arbitration is not the most innovative of domains in terms of technology. Investigations, litigation and regulatory breaches, by contrast, have seen significant change, but the status quo of the arbitration world seems likely to remain intact, at least in the near future.
There appears to be a certain degree of inertia in arbitration when it comes to technology. We can see this inertia with ODR, for example. ODR systems have recently come to be regarded as game changers in arbitration for SMEs, with India, the UAE, Hong Kong and Singapore leading the charge. ODRs promise democratised, accessible, affordable, and rapid resolutions. However, such a revolution was forecast decades ago when initiatives like the Virtual Magistrate Project in 1996 and Net Case in 2005 were set to act as the catalysts for it. They had little success. That said, many disputes never make it into the formalised (and expensive) arbitration world; as such, there may yet be room for ODR to grow. For example, ODR could be useful today in small-but-frequent online transactions such as Amazon purchases. These sorts of disputes, however, are not likely to need lawyers, arbitrators or experts.
In today’s world, most of the population can easily access online tools via their mobile devices, but the arbitration sphere has very little forcing it into the cyber age. Just because there are new technological tools available, it does not mean that they should be applied to everything. For example, the powerful document review platforms that now incorporate AI, most notably used to support huge US litigation cases, are far too sophisticated for smaller disputes. Where there is a small number of documents, or where the issues are legal points or financial calculations, there is simply no need. Technological revolutions are more likely to affect industries with a high degree of simple, repetitive tasks with high volume. Google Maps, AirBnB and Uber are good examples. These services are almost the complete opposite of what is required in arbitration. There are undoubtedly elements of arbitral work that can benefit from technology, but, globally, little will change.
REVIEW OF TECHNOLOGIES
Away from the latest technological innovations, there is still plenty of progress being made. The continuing reduction in the use of paper is good, if humble, progress. This may not seem to be much of an advance, but many lawyers and experts today still work with hard copy far more than is necessary. Better use of office management packages, increased use of document review systems, live tribunal visual aids (including live transcripts) and even the use of “new” ODR platforms can all be expected to some degree going forward.
The use of eDiscovery-type document review platforms is slowly being adopted and becoming more prevalent. These platforms (also referred to as eRooms, virtual rooms or document databases) are now standard in litigation and investigations. As a result, legal professionals have become accustomed to using them when the volume of documents and emails warrants it. Reductions in the cost of these systems should also contribute to their increased use. The more advanced components of these tools like concept searching, email threading, and AI are crucial when managing large volumes of data. However, in arbitration, absent disclosure requirements, these platforms would at most only be used for their keyword search functionality, collaborative review and formal legal production of exhibits.
An interesting subsidiary use of document review systems may arise from the EU General Data Protection Regulation (GDPR). The two main impact points of GDPR on the arbitration world are the treatment of personal data and the protection of that data. The document review tools that are used to search for documents for review can easily be used to find personal documents and emails for exclusion. This offers a potentially reasonable and defensible approach to managing the GDPR personal data risk. Online review platforms usually have tight security, including controls like two-factor authentication to gain access to the documents. Such controls are similar to the security steps required to access online banking, or the additional steps users need to take when their Amazon, Facebook or Google accounts detect suspicious activity. As an aside, post GDPR, it makes sense for all of us to start treating data in the same way that banks treat credit and debt: money needs to be kept safe, and liabilities need to be limited or disposed of. Your and your clients’ data need to be locked away, and old data need to be deleted as soon as they are no longer required. Otherwise, you risk a penalty of up to 4% of your global revenues – the maximum fine under GDPR.
OPPORTUNITIES AND THREATS
Unfortunately, an area of real change is in hacking. The legal world, as well as the rest of the global professional services industry, is increasingly targeted by cyberattacks. These large-scale attacks are performed by nation-state actors, organised crime, or even hacktivists. There are now numerous examples of law firms being breached, with secrets being leaked or financial insights being exploited by governments or racketeers. This is not new, but breaches have become more prevalent and more newsworthy. In addition, hackers are not only interested in the big players – cyber breach tools are automated and prolific. They scan and probe everywhere, searching for ways to steal and blackmail. Legal professionals must be vigilant – both for themselves and for their clients. The financial and reputational risks are high and it is reassuring to see that the ICC has recognised the threat and released a guidance paper on this topic, which provides an overview and some useful tips to protect data.
Returning to AI, there are a few other areas where it is entering the legal sphere. One example is offered by LegalMation, which uses IBM's Watson AI engine to draft litigation response documents and claims to save 80% of the drafting time. AI may also be used to predict the results of litigation cases. One such tool claims that it is able to predict the result 70% of the time. It is important to note that AI systems need large volumes of historical information in order to predict outcomes. Most arbitrations are, by their very nature, confidential, thus limiting AI's ability to replicate an arbitrator unless either (i) litigation records were used to predict arbitration outcomes or (ii) law firms or arbitration groups started to pool their award data repositories. Only once such a foundation of information had been pooled would it be possible to see AI being used to help inform the arbitration process. Eventually, this AI could be treated as an independent and non-binding advisor, expert or (additional) arbitrator.
If and when AI can be used in arbitration, potential alternative use cases may also be found. For parties seeking legal funding, AI could be used to predict the likelihood of success and whether the action is worth funding. A less likely scenario is the use of AI to identify outliers or patterns to reveal unfair or unprofessional conduct, for example, detecting when arbitrators have been bribed to rig a decision.
In any case, adoption of AI in the world of arbitration will be a slow process. A technological expert from IBM’s Watson team claimed that they were originally throwing every scenario at AI and expecting significant results, but achieved little success. Now, the team is being more measured and pragmatic about what is possible and what to expect. The more information that goes into AI, the more effective it becomes; eventually, AI will integrate into arbitration, but not in the immediate future.
Another topic of much excitement, when talking about technology, is blockchain. However, when it comes to arbitration, its application seems somewhat limited. One potential application could be to verify arbitral awards and determine the uptake of smart contracts in the business world. In theory, blockchain could be used to add a layer of anonymisation on top of the confidentiality already built into most arbitrations; in practice, however, it would be very difficult to execute. Other areas of focus tend to be on work arising from disputes in the fledgling cryptocurrency/blockchain industry or blockchain’s use in smart contracts.
What about forensic technology? In highly simplified terms, forensic technology is specialist IT help desk support to forensic accountants, experts, investigators and lawyers. When technical issues arise in legal matters, forensic technology experts find solutions to overcome them or to save time. For example, tools can be created that can automatically process and consolidate hundreds of spreadsheets in seconds, saving legal and financial teams countless days of manual data manipulation, which would have been both painful and prone to error. In the future, innumerable bespoke tools will be developed, either in-house or commercially. It is already possible to buy contract analysis tools or add-ins, which simplify large and/or repetitive contract consistency or anomaly checks. It is reasonable to expect an increase of backroom specialised technology support within arbitration teams.
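The spreadsheet consolidation described above can be sketched with standard tooling. The sketch below merges every CSV file in a folder into a single output, assuming identical headers; file names and columns are invented for the demonstration.

```python
import csv
import glob
import os
import tempfile

def consolidate(folder, out_path):
    """Merge every CSV in `folder` (identical headers assumed) into one file."""
    paths = sorted(glob.glob(os.path.join(folder, "*.csv")))
    rows, header = [], None
    for path in paths:
        with open(path, newline="") as f:
            reader = csv.reader(f)
            file_header = next(reader)
            if header is None:
                header = file_header
            rows.extend(reader)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
    return len(rows)

# Demonstration on two small temporary files standing in for the
# hundreds of spreadsheets a real engagement might involve.
tmp = tempfile.mkdtemp()
for name, row in [("a.csv", ["1", "10"]), ("b.csv", ["2", "20"])]:
    with open(os.path.join(tmp, name), "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["id", "amount"])
        w.writerow(row)
print(consolidate(tmp, os.path.join(tmp, "merged.csv")))  # 2
```

Even a simple script like this replaces hours of error-prone copy-and-paste, which is precisely the value forensic technology teams deliver at much larger scale.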
It is likely that a new type of lawyer, or a new lawyer skillset, will emerge in the next generation. Laws and contracts are written in a fashion very similar to programming code: both are a logical sequence of IFs, THENs and EXCEPTIONs. There are already many different programming languages; the creation of a new language that allows lawyers to create smart contracts easily does not seem beyond the realms of possibility. These smart contract creation packages will have connectors to internet services such as email, online databases, banks and ODR platforms, allowing them to automatically interact with real-world business triggers and actions. This introduction of technological experts and AI into the legal community may lead to friction for the older generation, but it may also lead to innovative ways to serve clients.
When looking at the history of new technology in arbitration, it is possible to see the slow introduction of “amazing” and “revolutionary” technologies such as email, Dropbox, and video conferencing. Now, it is possible to imagine a future where construction dispute experts give evidence using virtual reality: where avatars use virtual dynamic dashboards that clearly show the critical path and the underlying drivers of a project; where a virtual rendering of the infrastructure project is shown, which grows and shrinks when the virtual timeline slides to points of interest; where arbitrators and other parties can interact with the simulations including pinching in to zoom in to see details, highlights or additional notes; and where, as one of the arbitrators, an AI engine interacts, questions and contributes to the fair, informed and wise assessment of the matters in dispute. This possibility, however, remains limited to science fiction for the time being…
When football clubs are sold, the club values divulged can be extremely variable, depending on the championships and countries concerned (from a few million for a French Ligue 2 club to several hundred million for an English Premier League club). But why are there such differences? Do these values reflect a value bubble or valuation errors? Or, on the contrary, are they the result of the valuation methods used in other sectors?
To answer these questions, we have to consider what the "value" of a football club actually is. There are, in our opinion, three values for a club: the financial value, the transaction value and the societal value. These three values can be presented as three concentric circles of increasing size: in the centre, the financial value, and on the outside, the societal value.
The “financial” value of a football club is the value returned purely to its shareholders. It can be determined using typical financial valuation methods, with the DCF (or discounted cash flows) method being the most relevant. This method discounts the future cash flows based on scenarios consistent with the economic environment and the sporting projects of the club. In this context, clubs from the largest championships that benefit from the highest audiovisual rights or those that own their stadium have the greatest financial value.
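The DCF mechanics described above can be sketched in a few lines. The function below discounts a series of projected cash flows and a Gordon-growth terminal value; the club figures, discount rate and growth rate are purely illustrative assumptions.

```python
def dcf_value(cash_flows, discount_rate, terminal_growth=0.0):
    """Discount a series of projected annual cash flows plus a terminal value.

    Uses a Gordon-growth terminal value, discounted back from the final year.
    All inputs are illustrative; a real valuation would build the cash flow
    scenarios from the club's economic environment and sporting project.
    """
    value = sum(cf / (1 + discount_rate) ** t
                for t, cf in enumerate(cash_flows, start=1))
    terminal = (cash_flows[-1] * (1 + terminal_growth)
                / (discount_rate - terminal_growth))
    value += terminal / (1 + discount_rate) ** len(cash_flows)
    return value

# Hypothetical club: cash flows growing from EUR 30m to EUR 40m,
# 9% discount rate, 2% long-term growth.
flows = [30.0, 33.0, 36.0, 40.0]
print(round(dcf_value(flows, 0.09, terminal_growth=0.02), 1))
```

In this framing, the drivers cited above (higher audiovisual rights, stadium ownership) enter through larger and more stable projected cash flows, mechanically raising the financial value.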
The “transaction” value is higher than the financial value as it includes some of the synergies that an acquirer is willing to pay for. These synergies can be commercial in nature, for example, when a club acts as a media showcase for an entrepreneur who plans to use the club to promote his or her other activities. They can also represent the price to pay to generate media interest or reach other objectives (philanthropic, political, etc.). Finally, acquirers can generate other synergies linked to owning several clubs, for example.
Finally, the “societal” value takes into account the value created for all stakeholders (in economics, we talk about “externalities”). This relates to measuring the impact of a club on the wealth creation of a town or a region. This value takes into account all direct economic benefits (revenues of shopkeepers that increase during match days, etc.) and indirect economic benefits (greater attractiveness of a city or region in which a leading club develops, etc.). In certain cases, when a club is effectively the flag bearer of a region and becomes a veritable institution, the societal value can be considerable (e.g. FC Barcelona). By definition, these stakeholder flows are very difficult to determine.
Overall, a football club is most often sold at a transaction value that is higher than its financial value but significantly lower than its societal value. Understanding these different concepts enables us to understand the different dimensions of the value of a football club. In practice, if financial models make it possible to define the financial value of a club and, to a lesser extent, its transaction value, the societal value remains largely beyond the scope of a typical financial framework.
Take care of the pennies and the pounds will take care of themselves
Popular British saying
You insist that there is something a machine cannot do. If you will tell me precisely what it is that a machine cannot do, then I can always make a machine which will do just that!
John Von Neumann (1948)
Companies’ operations are complex and costly. They have entire teams dedicated to making sure that penny leaks (which can be substantial) are contained and dealt with. These teams receive assistance from elaborate systems that gather, store and analyse data.
These elaborate systems are composed of multiple Artificial Intelligence (AI) bricks1, which significantly increase the accuracy of performance assessment and decision-making, whether it be on compliance issues, investment strategies or operations management.
In this article, we will ask what artificial intelligence is and examine how it can be applied to business.
1. THE IMITATION GAME: A RULE-BASED GAME
We cannot talk about AI without mentioning this philosophical cornerstone. The Imitation Game first appeared in an article by the English mathematician Alan Turing2, the instigator of modern AI. He argued very early on that machines would one day be able to perform any task better than humans could. In his 1950 discourse in the scientific journal ‘Mind’, he felt uneasy with the question:
‘Can machines think?’
He considered that defining the terms would be tricky and, instead, chose to introduce a question and answer (Q&A) game played by a human, a machine and a judge. In it, the judge could ask any questions that he or she wanted. At the end of the game, the judge had to decide whether one of the players was a machine, and if so, identify which one it was. According to Turing, if the judge was unable to distinguish between the human player and the robotic one, then the machine would pass the Turing Test.
Turing considered that any machine able to fool the human judge would be an Artificial Intelligence.
He based the Imitation Game on questions and answers, feeling that it would be the best way to focus on all the particularities of the human mind without favouring either the machine or the human player.
However, the essence of this test is not in the form of the game itself (here a very broad Q&A) but rather on the idea that for a given game, a machine can achieve human-like skills so that an external observer cannot distinguish it from a human player. We may therefore say that, within the context of a game of some sort, any form of AI automatically imitates a human player. Such an approach enlarges the realm of AI, with the rules of the game becoming essential to determine what qualifies as imitation3.
This philosophical twist might seem like a convenient intellectual argument to label certain algorithms – which would otherwise not be – as AI, but this is not the intent.
Instead, the intent is to circumvent disputes around the scope of AI4. One of the most evident sources of dispute comes from the AI effect: as soon as AI solves a problem, the problem is no longer part of AI5. The AI effect constantly requires the redefinition of AI unnecessarily. To settle matters, we have turned to historical figures of philosophy, mathematics and computer science, like George Boole and Alan Turing (and many others6), to guide our definition process.
George Boole (1815–1864) and Alan Turing (1912–1954)
Concisely, intelligence is a combinatorial and Boolean matter. Any situation requiring intelligent decision-making can be modelled with a decision tree. To take into account what would be a better decision, this model attributes utility weights to each of the leaf nodes. The intelligent decision is the one given by the leaf node with the highest utility value. The tree exists within a set of rules defined by what we can call a ‘game’. If a machine plays the game and imitates humans to the point of fooling them, it is an AI.
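The decision-tree framing above can be made concrete in a few lines: branches are choices, leaves carry utility weights, and the "intelligent" decision is the one whose best reachable leaf has the highest utility. The tree structure and weights below are invented purely for illustration.

```python
# A decision node maps choices to sub-trees; a leaf holds a utility weight.
# Structure and utilities are invented for illustration only.
tree = {
    "hold": {"market up": 3, "market down": -1},
    "sell": {"market up": 1, "market down": 2},
}

def best_decision(tree):
    """Pick the branch whose best reachable leaf has the highest utility."""
    def max_utility(node):
        if isinstance(node, dict):
            return max(max_utility(child) for child in node.values())
        return node
    return max(tree, key=lambda choice: max_utility(tree[choice]))

print(best_decision(tree))  # 'hold' (best reachable utility: 3)
```

Real decision-making would weight leaves by their probabilities rather than take the optimistic maximum, but the combinatorial and Boolean skeleton is the same.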
Of course, this assertion raises the question of whether humans constantly evolve in a finite or infinite decision tree. For the sake of this argumentation, we will put this question aside and assume that it is, in any case, so vast that tree size no longer matters.
Ultimately, the race for AI will be won when a machine is developed that can browse the immensity of the human decision tree. With appropriate reward–penalty reinforcement learning, it will be possible to teach such a machine human ways and customs. The challenges to achieve this ultimate AI are numerous, yet history shows that this should not prevent us from expecting the existence of such a machine in a humanly conceivable time span.
2. THE IMITATION GAME: IT’S ALL RELATIVE
Let’s look at the example of music. Automation occurred long ago in Cizre, Turkey. The ingenious inventor Ismail Al-Jazari created, circa 1206, a programmable orchestra that mimicked humans playing music.
More recently, Sony has claimed on multiple occasions that it has invented AIs capable of learning and reproducing music. These AIs are called Flow Machines and reproduce any given type of music. One such Flow Machine, DeepBach, is able to generate Bach-type harmonies, which cannot be distinguished from the original. Another, the Continuator8, was able to copy a Jazz pianist, and Jazz critics could not differentiate between the two.
From Al-Jazari’s automaton to the DeepBach Flow Machine, the complexity of the decision tree has grown exponentially. We present below the decision tree of Al-Jazari’s automaton at each note of the melody.
Decision tree of Al-Jazari’s automaton at each note of the melody
The automaton had three different instruments and most probably only one note for the trumpets. The number of leaf nodes in its decision tree is quite straightforward and amounts to
2^(Number of instruments) = 2^3 = 8
Now considering the full score played by the automaton, we would have to reproduce the tree at each leaf node as many times as there are notes. This would lead to a much greater tree whose size would equal
2^(Number of instruments × Score length) = 8^(Score length)
DeepBach goes further as it can play as many instruments as desired and any note. Its tree is much more complex, equalling
2^(Number of instruments × Number of notes × Score length)
To learn the ways of Bach, it allocates degrees of importance to each leaf node.
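The leaf counts above follow from simple combinatorics and can be checked directly. The score length and note count below are placeholder values, not properties of either system.

```python
instruments = 3     # Al-Jazari's automaton: three instruments, one note each
score_length = 10   # placeholder length, in notes

# One on/off choice per instrument at each step of the melody.
per_note = 2 ** instruments
full_score = per_note ** score_length

print(per_note)                         # 8
print(full_score == 8 ** score_length)  # True

# DeepBach-style tree: each instrument can also play any of `notes` pitches,
# so the exponent grows multiplicatively and the tree size explodes.
notes = 4           # placeholder pitch count
deepbach = 2 ** (instruments * notes * score_length)
print(deepbach > full_score)            # True
```

The point of the exercise is the exponential growth: adding notes and instruments multiplies the exponent, which is why DeepBach needs learned utility weights rather than an exhaustive enumeration of its tree.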
The question is this: are both systems AI? None of these technologies would pass the Imitation Game as defined by Turing, but they do pass the test within their own set of rules.
To illustrate the issue at hand, let’s put AI aside for a moment and consider life instead. Prions, for example, are not considered as living beings, whilst bacteria are. This may be elementary for those with a Biology background, but even experienced biologists would find it difficult to correctly classify what happens in between these two extremes. There, we find viruses, which still fuel debate on whether or not they constitute living beings.
It’s the same story here!
Consider that you are a passer-by in Al-Jazari’s time, and you hear music coming from his automated fountain. You think to yourself, ‘What a marvellous machine, Al-Jazari is a genius’. To your surprise, you realise that several human players have taken the place of the mechanical players. Had you not looked closely, you would not have been able to tell the difference.
Could we not therefore say that the automaton passes a form of Turing Test based on a simpler Imitation Game?
Answering this question sparked lively debate among Accuracy’s expert community. With such a definition of AI, we could say that Al-Jazari’s automaton is, indeed, an AI. Compared with DeepBach, however, Al-Jazari’s automaton seems rather simple. And yet even DeepBach is not technically an AI, since it cannot play Turing’s original Imitation Game.
If we say that a machine that passes the Turing Test is to AI as bacteria are to life, and automatons (such as Al-Jazari’s machine) are to AI as prions are to life, then algorithms like DeepBach, Watson or even Google Duplex occupy that same in-between space as viruses. A line has to be drawn to establish a normative limit.
At Accuracy, we believe that the Imitation Game is valuable depending on the rules of the game. For the purpose of recreating a human-like AI, these rules should incorporate a certain level of interaction with humans, as well as certain learning capabilities.
The recent exponential growth of AI applications is the result of major technological advances in computer power, data management (from generation to storage via communication) and algorithms (neural networks in particular). These advances now make it possible to create an AI that could play ‘corporate games’, at least with regard to corporate operations.
3. AI FOR BUSINESS
Taking Von Neumann at his word, no intelligent task is beyond the reach of machine intelligence.
Business is no exception to this rule.
All businesses operate with a single purpose: offering their services or products to customers in exchange for money. With this money, they maintain their operations, paying creditors, shareholders and taxes, and investing to broaden their activities. Year after year, the accounts keep a thorough record of all these cash flows. When the time comes to make important decisions, C-level management rely entirely on these records to correctly assess the state of the business. From the numbers in the books, management decide on the best future course of action (often by valuing the different strategical options). They then implement that course and monitor the implemented activities.
The accounts are therefore the quantitative bedrock of corporate management.
As mentioned in the introduction, a fully functional business AI would be made of multiple bricks, each controlling a specific operational branch (e.g. a marketing AI brick, or set of bricks, would have control over marketing operations). The AI bricks are analogous to organs in the human body (see figure below). Just as the skeleton provides the frame for the human body and its organs, finance and accounting provide the framework for the business AI, ensuring its operational coherence and avoiding dis-synergies between branches.
Comparison of business AI with human life
Typical AI bricks are usually hardware or software pieces, and sometimes both. In terms of hardware, the most essential task of all is to generate, transmit and maintain data. Obvious hardware components include sensors, cables and relay servers. They also include database servers, as well as robotic installations, which can handle packages in warehouses, production sites, etc. In terms of software, the principal task is to choose the appropriate path within a decision tree, either deterministically (rule-based methods) or statistically (machine or deep learning). Software AI bricks can thus be optimisers, simulators (e.g. Monte Carlo simulation software), machine or deep learning algorithms and so on.
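A minimal sketch of the Monte Carlo kind of software brick mentioned above might estimate the distribution of a project's total cost under uncertain line items. The cost items, ranges and probabilities are illustrative assumptions, not data from any real engagement.

```python
import random

def simulate_project_cost(n_runs=10_000, seed=42):
    """Monte Carlo estimate of total project cost under uncertain line items.

    The three cost items and their ranges (in EUR thousands) are
    illustrative assumptions.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        labour = rng.uniform(80, 120)
        materials = rng.uniform(40, 70)
        # A 30% chance of an overrun of up to 30k.
        overrun = rng.uniform(0, 30) if rng.random() < 0.3 else 0.0
        totals.append(labour + materials + overrun)
    mean = sum(totals) / len(totals)
    p95 = sorted(totals)[int(0.95 * len(totals))]  # 95th percentile
    return mean, p95

mean, p95 = simulate_project_cost()
print(f"expected cost ~ {mean:.0f}k, 95th percentile ~ {p95:.0f}k")
```

Plugged into the accounting framework described above, such a brick turns point estimates into distributions, which is what allows management to reason about downside scenarios rather than averages alone.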
Examples of AI bricks and their applications
To this day, developments within companies are very heterogeneous. When companies implement AI, it is usually for a specific branch of operations, using several AI bricks. Often, these different AI systems handle their branch of operations independently.
To obtain a fully functioning business AI, there are two main axes of improvements (see figure below):
– Multiplying AI bricks in the different operating branches and connecting them to a single human–machine interface
– Slowly replacing the human–machine interface with an artificial brain that would assess the different strategies available to C-level management and choose the best course of action.
These developments should occur whilst respecting the value chain in its entirety, as the implemented AI bricks would be coordinated around the accounts.
Moreover, innovation changes towards AI must occur at the right pace and with the right entropy shift. The right entropy shift would give enough time to developers and data scientists to fully seize and implement the company’s ethos and processes in its systems. It would also give flexibility to executive management to reallocate resources, re-establish updated processes and retrain employees for other tasks when necessary.
From AI bricks to a fully working AI for business
Business exists, in its very essence, only when two human parties can help each other. In this view, one fulfils the needs of the other by means of a transaction, based on a swap or monetary exchange. As long as humans exist, business transactions will be undertaken. Embedding a business AI in a company ensures the safe, thorough and precise execution of all the laborious aspects of its operations.
That said, when it comes to the creation of value, maintaining strong client relationships, providing a healthy working environment to employees, etc., the only trustworthy ‘machines’ to execute these tasks will remain human.
4. AI AT ACCURACY
As with its other areas of expertise, Accuracy brings together various skillsets to consider the full implications of AI. With its wide range of financial expertise on any aspect of corporate mechanics, technical expertise on the latest data science modelling paradigms, and unique knowledge and access to innovation hubs, Accuracy is well placed to face the challenges surrounding AI for business.
– Expert reports to choose the correct algorithms and their appropriate implementations for any sort of operational activity with full respect of the value chain integrity
– Implementation of AI bricks, either in-house or in the form of Software-as-a-Service using enhanced security
– Access and interface with innovative start-ups from the AI world whenever additional knowhow is required (e.g. installation of robotics, specific hardware, etc.).
For further information
1 As discussed in the third section, an AI system is made of multiple bricks. These AI bricks, within their decision perimeter, are AIs themselves. Once incorporated in bigger systems, the term AI brick becomes more appropriate as they become a part of some higher intelligence system.
3 Naturally, an AI that imitates human behaviour well in a given game may not do as well in another type of game.
4 See Wikipedia – second paragraph of the introduction
5 See Wikipedia – first paragraph of first section
The construction sector is not the only one to experience this phenomenon, but it is a striking example of it: its market has a false bottom. For several decades, large international construction companies have been playing a singular game, one that often rewards not the best builder but the most audacious.
1. DEALING THE CARDS
Let’s take an example: an international call for tenders that would pit five major players in the construction industry against each other to build a new airport. Because of fierce global competition, it’s highly likely that the winning company would be chosen as much for its creativity and technical expertise as for its audacity to take risks and offer a price that would give it a low margin, if not nil. It then falls on the company to make a profit thanks to another type of project: claims, be they through amicable negotiations or before a court. The company’s result will be built in two stages and according to two distinct processes.
It’s an odd situation where construction companies hope to run into difficulties or see certain risks materialise outside of their control in order to have as many opportunities as possible to revise their contracts and prices. These can include delays in obtaining technical instructions, soil quality issues, the bankruptcy of business partners, social upheaval, strikes, archaeological discoveries, land releases, technical changes, and so on. They represent just as many opportunities to call into question the contract price and open negotiations with the aim of recovering the original stake. But during levelling work, not everybody stumbles upon Roman ruins. First round in the game of chance.
2. SECOND ROUND
It’s unlikely to see the market curb the chaotic nature of its workings by itself, even though construction prices have reached levels that considerably reduce any hopes of making a profit. On the contrary, all signs seem to show that this activity model is here to stay, pushing value creation towards areas other than those of the operational and engineering departments.
The different players assisting these businesses – in analysing the facts, establishing the causes, explaining the law and quantifying the damages – must be delighted with the multiplication of claims. They are creating a veritable construction dispute market, taking many forms from classic negotiation to international arbitration. However, it’s reasonable to believe that the construction market deserves better than the frequent and very costly use of dispute settlement procedures, which seem to succeed one another without benefitting from any learning curve.
The observations often remain the same. At the end of a large project, construction companies struggle to revisit its execution with the benefit of hindsight to demonstrate the logical sequence of events that can explain delays and additional costs. It is, in principle, extremely difficult to take the necessary step back during the execution of a project to measure all the consequences of a given event on three key variables: quality, time and cost. As a result, it’s necessary to make the best of the various types of information collected in order to later reconstitute the succession of events that tells the story of the project: accounting data, planning data, active personnel, weather data and materials used are all essential to reconstruct, sometimes day by day and work item by work item, the progress of major construction sites.
Construction companies are rarely experts at this game, and it’s striking to note how wide the gap is between their technical expertise deployed to serve their clients and their reticence to apply the same demanding nature to their management practices. Experts and their clients therefore strive to conduct a real investigation, albeit late in the day, where the clues left by the parties are often incomplete, disparate or even non-existent. The scavenger hunt can only be longer, is sometimes hazardous, and gives rise to estimations, or even bluffs, which can sully the credibility of negotiations or hearings.
All these inaccuracies, for want of a better alternative, lead to unbalanced transactional protocols or push the courts to disappointing Solomonic judgements. After gambling on reconstituting its margin, a construction company must add the uncertainty of a poorly argued negotiation. Second round in the game of chance.
3. FROM CLUEDO TO MASTERMIND
It would be a dream to easily read a project summary that had been written as the construction work progressed thanks to efficient procedures for collecting and processing the information. This type of exercise, whose fundamental objective is to measure and explain the difference between what was forecast and what actually happened, always requires a cold review and in-depth analyses. However, between the advent of an artificial intelligence able to perform this task in real time and the current state of the art, the gap is wide. When it comes to using new management tools, the construction sector is at the bottom of the class. This delay compared with other sectors is all the more significant given the veritable digital revolution on the horizon.
The construction sphere became aware of the benefits of modelling tools several years ago. What we call BIM, for Building Information Modelling, corresponds to assistance in works management applied to construction according to several axes: 3D view of works, ability to navigate the entirety of past plans and budgets, tracking of responsibilities in obtaining authorisations, possibilities to see time spent per work zone, etc. In its most advanced versions, the BIM would become a complete library of data, able to be read, be it from a technical, financial or legal perspective. We might imagine in the long term a real-time monitoring tool for equipment and teams that would be able to immediately reconstitute the resources mobilised, quickly identify productivity losses and quantify their consequences. This would serve to determine the optimal construction and team deployment solutions that best respond to the most important constraint (respecting the planning timeline, minimising additional costs, etc.).
These systems exist and are used, but still in preliminary versions. Their development and their actual application in projects would revolutionise construction methods and project monitoring, because their implications are innumerable. There is no reason why this should not be the case, since a construction site is already a place where a plethora of information can be collected. It’s a strange paradox that sees players in the construction sector hesitate to enter into a world that seems designed for them! Between the diversity of data to analyse (personnel, equipment and material) and the ability to monitor it on technical, temporal and geographical axes, digital tools are ideal for extracting extremely varied analyses. What would they be?
It’s possible to anticipate two successive phases for using the data.
The first phase would be turned towards the business. It would use the mass of data available with the aim of improving how projects are conducted or better demonstrating the veracity of its difficulties. Knowing the situation in real time would improve the reactivity of construction companies, who would manage an integrated model of resources and costs with the objective of finding the optimum at a constant quality. Their interactions with their clients would be more rational, more responsive and more transparent to the great benefit of their legal departments, which would be better able to preserve their rights.
Moreover, collecting data would make it possible to better understand and demonstrate the mechanisms of general disruption of a project, which would result from observing the deformation of the model in extreme cases of numerous or repeated difficulties. There is a wealth of literature explaining the principle of cumulative impacts on the productivity of a construction site. It can be understood intuitively: after a certain amount of accumulated difficulties, the disorder reaches such a level that productivity collapses, even though each one of the difficulties taken in isolation would only have a small effect. In spite of this, the construction sector struggles to demonstrate the phenomenon and quantify it. The majority of claims for general disruption must be limited to illustrating it qualitatively and relying on the demonstrative nature of the events supposedly at its source. By systematically analysing the productivity data of a large number of projects whose execution environment would be saved and codified, the causal link between the coexistence of numerous disruptions and the exponential slowing down of work productivity could (finally) be established. The courts would then have real demonstrations to make an informed decision on the issues that currently give way to far too much intuition or, at best, inference.
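The non-linear collapse of productivity described above can be sketched with a toy model in which each disruption carries a small isolated loss, while every pair of coexisting disruptions adds a compounding interaction loss. The functional form and coefficients are pure assumptions for illustration, not an empirical result of the kind the pooled data described above would be needed to establish.

```python
import math

def productivity(n_disruptions, isolated_loss=0.03, interaction=0.02):
    """Toy model of cumulative impact: isolated losses are small, but pairs
    of coexisting disruptions add compounding interaction losses.

    Functional form and coefficients are illustrative assumptions.
    """
    pairs = n_disruptions * (n_disruptions - 1) / 2
    return math.exp(-isolated_loss * n_disruptions - interaction * pairs)

for n in (0, 2, 5, 10):
    print(n, round(productivity(n), 2))
```

Because the interaction term grows with the number of disruption pairs, the model reproduces the intuition in the text: a handful of difficulties barely dents productivity, while many simultaneous ones make it collapse.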
4. TOWARDS A TEAM SPORT
The second phase for using data, which is even more ambitious, would see greater transparency of results between construction players, clients and suppliers, to the advantage of the market. The aeronautical sphere has long since incorporated this idea by sharing a large amount of data, particularly from aviation accidents or incidents, with the aim of improving security. Yet, technological advances and audacious contemporary structures that generate new risks should encourage construction players to establish greater transparency in the characteristics of projects: technical characteristics first and foremost, but also the reality of resources implemented. The data revolution will generate a major step forward in terms of managing large projects on a global scale. Similarly to other sectors, a global coordinating body could facilitate the systematic and anonymous collection of a project’s characteristics according to its various components (technical, costs, and delays), all with a great deal of detail. The project would be registered and characterised by taking into account its entire execution up to its dispute phase. It would be incumbent on this coordinating body to share the data freely, analyse it, encourage studies and research based on its databases, or even issue independent opinions on the coherence of proposals during tender phases.
Making this anonymous data available on an open market with diverse free players would accelerate innovation, promote the transfer of best practices from sector to sector, and strengthen the links between large companies, start-ups and researchers. This flow of free information would make it possible to decompartmentalise the construction world, benefit from new expertise, and encourage breakthrough innovations by pooling the sometimes significant investment costs that the sector’s profitability does not allow.
We wager that this international ‘control tower’ would contribute to re-establishing more rational rules for the game to the advantage of all stakeholders, clients, project managers and builders. It would help to correct the race to the bottom for prices by showing and quantifying the systematic nature of financial claims issued by construction companies during their final negotiations, as well as the resulting costs. It would also be easier, by using the available references, to banish ‘suicidal behaviours’ and supervise negotiations during a call for tenders or at the end of a project. Interactions would be clarified thanks to an arsenal of comparable situations, which would reduce the field of uncertainties and would make it possible to set off a virtuous circle to establish fair prices. In the long term, the circulation of information and the sharing of rational data would lead to a convergence of project economics. These exchanges would standardise margins for comparable work and would make resorting to cumbersome dispute practices more episodic. This would further enhance technological breakthroughs.
So what now? Companies are struggling to take the first, and undoubtedly the most difficult, step: training a new generation of builders on the ground to use these tools. Of course, they have to start with a reasonable ambition, creating step by step their own knowledge bases and grouping tools and best practices that are fully accepted by their users. These users will be the key players in the daily collection of data; they will enrich it through analyses; and they will be the primary beneficiaries of their work. The power of the tools suggests a major change in the tasks entrusted to construction managers. Early adopters will have a major competitive advantage when optimising their costs and defending their profitability, as well as when disrupting the market by drastically modernising their relationship with their clients through a transparent and reasoned dialogue.
The company that succeeds in doing so will have broken the mould of the market.
Quite apart from the exponentially increasing numbers of Teslas and other fully electric cars around nowadays, another surge in electric fleets is noteworthy. With 23,000 electric buses forecast to be in operation in 2025 (compared with around 1,200 buses currently), public transport is being electrified at a rapid pace. More specifically, Accuracy forecasts that France and the UK will lead this market development in Europe, supported by the Paris agreement and country-level zero-emission (ZE) legislation aiming for a fully ZE public transport bus fleet by 2030. Whilst incumbent operators in these countries are gaining experience with the first electric bus fleets, significant challenges remain – even for early adopters.
Market dynamics in public transport are characterised by competitive tender processes to earn the right to exclusively operate in a specific geographic area – a concession area – for (usually) a period of 10 years. Given the low margin and often subsidised operations in public transport business cases, it is crucial for operators to have a competitive edge in order to increase their win rate in tenders. One such competitive edge could be a more profound understanding of the life cycle of bus batteries and its implications for the investment case of electric bus fleets.
Our view is that many operators still lack a full understanding of the electric bus fleet business case and its operational complexities. They are therefore unable to compete effectively. Moreover, public transport operators will be forced to develop new skills (such as battery maintenance) and will need to reinvent themselves to accommodate the new dynamics in the value chain that coincide with becoming a large-scale energy consumer.
Operators who are on top of fast-track innovations and have an in-depth understanding of the financial sensitivities related to electric bus fleet management are able to set the pace in a potential winner-take-all market.
This article highlights three turbulent effects that we expect to dominate in the rapid electrification of our public transport in Europe and concludes with some recommendations to overcome short-term barriers to market adoption.
1. THE BUS MARKET WILL ELECTRIFY FASTER THAN YOU CAN IMAGINE
Forecast urban electric buses in operation in Europe, by country 2018-2025
France, the United Kingdom, Poland, the Nordics, the Netherlands and Germany together account for more than half the total number of electric buses in Europe today. For these countries combined, we forecast a total of 3,900 electric buses in operation by 2021 and 16,710 buses by 2025. This implies a CAGR of 68% between 2018 and 2021 and a CAGR of 34% between 2021 and 2025.
In 2015, 195 states and the European Union unanimously approved the COP21 agreement. Since 2016, 174 countries have begun adopting the agreement into their own legal systems. Following this, municipalities have adjusted – or are in the process of adjusting – tender criteria to induce a shift to ZE vehicles in new concessions.
Legislation context: France and the Netherlands
Prior to the COP21 agreement, France adopted the Energy Transition Law. This law ensures that when public bus fleets are renewed, a minimum level of low-emission vehicles is respected. More precisely, the law states that “the State, public institutions, territorial communities and their groupings, when directly or indirectly managing a fleet of more than twenty buses for scheduled or on-demand passenger transportation services, must purchase low-emission vehicles when renewing the fleet, with a minimum proportion of 50% as of 2020, then in full as of 2025. In the case of services provided by the Régie Autonome des Transports Parisiens (RATP), the minimum proportion of 50% applies starting from 2018”1.
In the Netherlands, in 2016, the transport authorities signed the Administrative Agreement on zero-emission public transport, which stipulates that by 2025, all new buses will be ZE ‘at the exhaust’ and by 2030, the entire fleet of more than 5,000 buses will be ZE.
Bus fleet replacement schemes are quickly coming of age
After an experimentation phase, an increasing number of tender processes are being launched for electric buses (“e-buses” hereafter). In major cities committed to improving air quality, such as Paris and London, large-scale replacement schemes have already been set in motion:
– In France, in 2018, Ile-de-France Mobilités and RATP launched the largest call for tender in Europe to buy 800 e-buses (worth €400 million)2. This will help to achieve the target set by RATP in 2015 to make its fleet of 4,700 buses entirely clean by 2025.
– In March 2018, the Mayor of London published a strategic plan3 for transportation, which stated the objectives for the city’s bus fleet:
– Starting from 2018, all new buses should be hybrid, electric or hydrogen.
– By 2020, all single decker buses in central London (i.e. around 200 buses) should be purely electric or hydrogen.
– By 2035, all single decker buses in inner and outer London will be purely electric or hydrogen (i.e. around 2,600 buses).
– By 2037, all buses will be ZE.
2. EUROPE IS LOSING CONTROL OVER ACCESS TO BATTERIES AND TECHNOLOGICAL INNOVATIONS – CHINA IS RUNNING THE SHOW BEHIND THE SCENES
China is currently the market leader for both the production and operation of e-buses. In 2017, there were 385,000 electric buses in the world, 99% of which were located in China. In this market, 9,500 electric buses are deployed every five weeks. By comparison, the cumulative number of e-buses in operation in Europe reached 1,202 units in 2018, less than 13% of what is rolled out in China on a five-weekly basis.
European manufacturers like VDL, MAN, Mercedes, Volvo and Solaris are still in their scale-up or start-up phase in relation to e-buses. Although the quality of e-buses built in Europe is perceived as premium, European manufacturers are possibly too late to meet a large share of the steeply increasing demand for e-buses in the coming years. As a result, they have a slower innovation curve compared with Chinese players, which currently cover 90% of the market in Europe4. China has invested heavily in the development of this alternative mode of transport: the share of e-bus sales in the country jumped from 0.6% in 2011 to 22% in 20175.
Four major factors explain the rapid development of Chinese e-bus fleets. First, national and regional subsidies helped cushion the high upfront costs of e-buses. Second, significant urban pollution generated a growing concern, which acted as a catalyst. Third, Chinese transportation infrastructure started from scratch, whilst in Europe, for instance, the infrastructure is already established and new technologies must be incorporated into it. Finally, developing new technologies that are competitive in the world market is a long-term goal for the Chinese government.
China: on the ground in Europe
BYD, a Chinese company specialised in the production of e-buses and coaches, has invested €10 million in France to build an assembly factory (Beauvais, Oise). The factory’s production capacity stands at 200 e-buses per year, but it has yet to receive sufficient orders to run at full scale6.
Operators still largely in the dark with regard to battery performance data
Typically, OEMs that produce e-buses for transport operators do so without providing the owner-operator with access to the inner workings of the bus’s battery (the battery management system or ‘BMS’) to understand its performance over time. This is crucial information because the battery’s performance can quickly deteriorate if usage conditions are suboptimal – a painful issue if your battery becomes useless and you have an ongoing contractual obligation to service a concession area. In addition, access to the data can offer unique insights, which can make it possible to improve certain financial aspects or the battery lifetime by making operational adjustments (such as stabilising battery temperature, or adjusting for frequency regulation effects and driver behaviour).
Importance of knowing the battery’s state of health
Although they age less rapidly than other battery technologies, lithium-ion batteries see their performance deteriorate over time – a phenomenon known as battery ageing and measured by the battery’s State of Health (SoH) – comprising a loss of available energy and power. This results from a loss of capacity or a rise in internal resistance (impedance).
A battery’s state of health (SoH) is critical in assessing its long-term economic value. Currently, SoH is estimated by the battery management system software, which is protected by the OEM and therefore not accessible to the battery’s owner-operator. Additionally, battery performance degradation is difficult to predict precisely because it depends on four main factors:
– Charging/discharging speed, measured in Coulomb-rate (C-rate), expresses the intensity of the electric current circulating through the battery. The higher the C-rate, the faster the battery ages7.
– Extreme temperatures adversely affect battery life. Higher temperatures increase internal activity and thus lead to capacity losses8. Below 0°C, the cell’s internal resistance increases dramatically, so the usable capacity drops noticeably and battery ageing accelerates9.
– Battery ageing occurs not only during energy exchanges but also when the battery is stored or simply left unused. Battery capacity is reduced and impedance increases over the storage period. At constant temperature, the higher the state of charge (SoC, the percentage to which the battery is filled), the more the capacity is reduced over the storage period, because the SoC corresponds to the level of activity inside the cell7.
– Depth of discharge (DoD) corresponds to the level of energy delivered by the battery before it is recharged; an average DoD therefore characterises a battery’s usage profile. A higher DoD (deep-cycle discharging) results in an acceleration of the ageing mechanisms10. Consequently, cell performance degrades more slowly when the battery is charged as frequently as possible, keeping the DoD centred within the capacity range.
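The four factors above can be combined into a rough capacity-fade sketch. The functional form and every coefficient below are our own illustrative assumptions – real ageing models are chemistry-specific and sit inside the OEM’s BMS – but the sketch shows how an operator could compare usage profiles:

```python
# Illustrative capacity-fade model (assumed coefficients, not OEM data).
def cycle_fade(c_rate, temp_c, dod, avg_soc):
    """Estimated fraction of capacity lost in one charge/discharge cycle."""
    base = 5e-5                                  # fade of a gentle reference cycle
    stress = 1.0
    stress *= 1 + 0.5 * max(c_rate - 1.0, 0)     # fast charging ages the cell faster
    if temp_c > 30:
        stress *= 1 + 0.05 * (temp_c - 30)       # heat accelerates internal activity
    if temp_c < 0:
        stress *= 1 + 0.10 * (-temp_c)           # cold raises internal resistance
    stress *= 1 + 1.5 * max(dod - 0.5, 0)        # deep discharges are punishing
    stress *= 1 + 0.3 * max(avg_soc - 0.6, 0)    # resting at high SoC also ages
    return base * stress

def soh(cycles, **profile):
    """State of health after a number of identical cycles."""
    return max(0.0, 1.0 - cycles * cycle_fade(**profile))

gentle = soh(3000, c_rate=0.5, temp_c=20, dod=0.4, avg_soc=0.5)
harsh = soh(3000, c_rate=3.0, temp_c=40, dod=0.9, avg_soc=0.8)
```

Varying a single parameter then quantifies the financial sensitivity of a tender assumption – for example, the SoH penalty of moving from gentle depot charging to high-power opportunity charging.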
We believe that an ecosystem of local manufacturers, maintenance providers and operators in Europe would make it easier to standardise technologies and open up the locked box of battery data. Having a local ecosystem would be preferable to seeking to establish one with distant Chinese manufacturers, as it would incentivise stakeholders to seek collaboration and joint investments and innovations.
3. THE SURGE IN ELECTRIFICATION LEADS TO THREE ADJACENT ECOSYSTEM CHALLENGES FOR PUBLIC AND PRIVATE INVESTORS
Our electricity networks are not capable of managing the vast increase in electricity demand forecast to come from huge fleets of e-buses.
For bus operators, it is essential to run a tight schedule. Limiting idle charging time makes it possible to meet passenger demand with an optimal fleet size. As a result, high capacity chargers are used throughout the day for fast charging (currently this is 450 kW but is expected to increase to 900 kW in the near future)11. As the capacity of these chargers is very high compared with fast chargers for cars (up to 175 kW) and many (sometimes dozens) charge simultaneously on a single grid connection, this has an enormous effect on the fluctuation of electricity usage.
If one fast charger is installed for every ten buses on average, the peak in national energy consumption will be noticeably affected (for example, the peak energy demand of approx. 20 GW in the Netherlands would increase by around 1%). Although this might not sound very significant, it creates challenges for the system. The demand for energy to charge buses is not constant, like industrial usage, but binary and volatile. A sudden increase or decrease in demand for energy puts tension on the grid, which, in rural areas, where power consumption is relatively low and mainly residential, can lead to grid blackouts. For some of these areas, again taking the Dutch market as an example, it is not even possible to operate an electric bus fleet at present, as the power capacity of the local grid is insufficient, and high-power grid connections cannot be established within 3–5 years.
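The 1% figure follows from simple arithmetic, assuming (as in the Dutch example) a fleet of roughly 5,000 buses, one 450 kW fast charger per ten buses and a national peak demand of about 20 GW:

```python
fleet_size = 5_000                 # Dutch public transport bus fleet
chargers = fleet_size // 10        # one fast charger per ten buses on average
charger_kw = 450                   # current fast-charger capacity
added_mw = chargers * charger_kw / 1_000

national_peak_mw = 20_000          # approx. Dutch peak demand (~20 GW), our assumption
share = added_mw / national_peak_mw
print(f"{added_mw:.0f} MW of charging capacity, {share:.1%} of the national peak")
```

The binary nature of the load matters more than its average: it is hundreds of megawatts switching on and off together, not the extra percentage point itself, that stresses rural grid sections.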
To ensure the further electrification of our public transport, electricity grid operators will need to take these energy demand requirements into account and make appropriate investments in the grid. In the meantime, short-term fixes can be found through out-of-the-box solutions (e.g. mini-grid ecosystems using solar PV projects at bus depots), but these present further complexities for public transport operators, among others.
A flood of used batteries will come to market in 5 to 10 years, raising the question of a battery’s “second life” and opening numerous use case possibilities.
Although e-buses are already operational in many countries, the issue of the “second life” of e-bus batteries has yet to be solved. Indeed, e-bus batteries function optimally for a period of 5 to 10 years, depending on their use and operational set-up. After that time, partly due to intensive usage profiles, they are no longer fit for public transport, but they still retain around 80% of their initial capacity12. Combined with the rapidly growing size of e-bus fleets, the number of batteries reaching the end of their “mobility” life will be a challenge that will inevitably have to be tackled. In addition to its environmental aspect, this challenge can be a source of many value-creating opportunities.
Battery lifecycle and three-stage circular-economy approach
Currently, e-buses are still being introduced gradually into the market. As a result, the track record of long-term battery performance and experience in determining actual “second life” use cases remain very limited. However, Accuracy has filtered these value-creating opportunities down to 13 distinct use cases that are adjacent to or extend the mobility use of the battery. Each case is linked to at least one of four key conceptual value drivers: (1) load management, (2) grid balancing, (3) lifetime expansion and (4) residual value. Some of the most attractive use cases include (one or more of) the following value drivers:
Attractive use cases including key value drivers
Bus batteries are particularly useful for “second life” (non-mobility or extended life) use cases for the following reasons:
– There will be a multitude coming to market. In 2017, in China alone, 87,000 additional e-buses were operational13. At this rate, close to 100,000 bus batteries would reach the end of their core mobility usage in 5 to 10 years and would represent significant potential for other applications.
– They have a much higher unit capacity compared with passenger EV batteries. This makes it easier to set up secondary use cases such as stationary energy storage facilities. Bus batteries have a high capacity of 150 to 300 kWh and are therefore suitable for heavy-duty charging infrastructure dedicated, for example, to short-haul heavy weight trucks or utility trucks.
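An order-of-magnitude estimate, using the figures above (close to 100,000 bus batteries, 150–300 kWh each, around 80% residual capacity), illustrates the scale of the coming second-life storage pool:

```python
batteries = 100_000               # bus batteries ending mobility use in 5-10 years
unit_kwh = (150 + 300) / 2        # midpoint of the 150-300 kWh capacity range
residual = 0.80                   # capacity typically left after mobility use
total_gwh = batteries * unit_kwh * residual / 1_000_000
print(f"~{total_gwh:.0f} GWh of potential second-life storage")
```

Even at this conservative midpoint, the pool is large enough to matter for grid balancing and depot load management – the key value drivers identified above.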
There are already early examples of “second life” uses being implemented, although in most cases they relate to batteries from electric cars:
– In March 2018, MAN Truck & Bus and Hamburg’s public service provider VHH announced that they will construct a prototype stationary storage facility with “second life” e-bus batteries14.
– In June 2017, Renault and Powervault announced a home stationary energy battery storage system based on “second life” electric vehicle batteries15. This partnership is said to reduce the cost of a Powervault smart battery by 30% and extend the useful life of a Renault battery by up to 10 years. Although this example refers to electric car batteries, it could be extended to e-bus batteries for larger scale applications.
– On 29 June 2018, the Johan Cruijff Arena in Amsterdam put into operation a 3 MW mega battery, which will ensure that football matches and concerts can continue in the event of power failures16.
Again, electric car batteries (148 units) have been used for this “second life”, but similar applications could be designed for e-bus batteries.
Current tender criteria create adverse effects that hinder successful e-bus roll-outs.
Policymakers have introduced tender criteria to promote the shift to electric fleets for recent bus concessions coming to market; however, they need to be considered carefully. In our opinion, these criteria do not always align with an optimal operational and financial investment case for e-bus fleets as they create specific complexities in the tender process and – unintentionally – force operators to make suboptimal financial decisions.
Currently, electric buses require a significantly higher investment than traditional diesel buses (up to 1.5x the CAPEX, not even including the CAPEX needed for charging infrastructure), despite being cheaper in maintenance and propulsion. Current tender criteria contain hybrid or “soft” ZE criteria (allowing for a combination of traditional diesel and ZE bus fleets), whereas upcoming tenders are likely to have “hard” ZE requirements. In truth, some incentives and arrangements are made to help ZE bus set-ups win the call for tender, such as financial incentives (e.g. off-balance-sheet financing, carbon credits). However, in order to maintain the quality of public transport until ZE bus prices have dropped significantly, more needs to be done to ascertain what would constitute an attractive financial investment case for operators, given the higher risks and uncertainties involved, which have yet to be resolved, as the examples below demonstrate.
Some examples of complexities in calls for tender for electric bus fleets (not exhaustive):
– As high-power grid connections and charging infrastructure are not present in new concession areas (i.e. those that have not yet had ZE bus operations, which is the case for most), uncertainties and investment requirements are much higher for operators, creating a huge disadvantage when compared with diesel fleet operations. Sharing infrastructure with neighbouring corporates operating ZE fleets for their logistics is economically feasible but difficult to achieve if tender processes do not adopt a more holistic economic perspective.
– Day-one readiness timelines are challenging as the procurement of buses, infrastructure and grid connections can only be arranged once the concession is awarded. The handover from preceding concession operators has to happen in a short window, which – again – is much more challenging when compared with diesel fleet operations.
– Awarding the concession naturally coincides with awarding the subsidies for the concession period; however, the subsidy does not usually track passenger traffic growth over the period. As a result, operators must initially take on undesirable fleet mixes and sizes to be able to handle long-term passenger demand. This can make planning ZE bus routes and setting up charging infrastructure difficult, creating more challenges than in traditional diesel bus operations.
– The transfer of charging infrastructure to the new concession holder after the 10-year operating period is common in current tender criteria, but it may not be tied to the same location. However, transferring the charging infrastructure from one location to another cannot be done overnight.
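The CAPEX trade-off mentioned above (e-buses at up to 1.5x the diesel CAPEX, partly offset by cheaper maintenance and propulsion) can be sketched as a simple total-cost-of-ownership comparison over a 10-year concession. All figures below are illustrative assumptions, not tender data:

```python
# Illustrative 10-year total cost of ownership per bus (assumed figures).
def tco(bus_capex, infra_capex, opex_per_km, km_per_year=60_000, years=10):
    return bus_capex + infra_capex + opex_per_km * km_per_year * years

diesel = tco(bus_capex=250_000, infra_capex=0, opex_per_km=0.90)
e_bus = tco(bus_capex=375_000,      # ~1.5x the diesel CAPEX
            infra_capex=75_000,     # share of charger and grid connection
            opex_per_km=0.55)       # cheaper propulsion and maintenance

print(f"diesel: EUR {diesel:,.0f}  e-bus: EUR {e_bus:,.0f}")
```

Under these assumptions the two cases sit close to parity, which is precisely why the unresolved risks – grid connection timing, battery SoH, infrastructure transfer – dominate the operator’s bid decision.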
Our recommendation to policymakers is to address the above areas if they want to maintain the quality of public transport and meet the electrification goals that have been set.
We consider that, for Europe to be a viable global player in electric buses as well as in the next wave of electrification of vehicles (e.g. the heavy-weight segments of cargo transport & logistics), it is crucial for European policymakers, energy grid operators, OEMs and public transport operators to work together and use their experience of scaled production to close the gap on China in today’s market.
This can only come about if the emerging barriers we identified in this article are quickly resolved. This means giving bus fleet operators the access they need to the energy grid; having policymakers align public tender criteria with real ‘cost of ownership’ investment cases (including “second life” battery usage and joint infrastructure requirements); and having manufacturers unlock battery performance data for operators through, for example, open innovation.
1 Art. L. 224-8 of the Environmental Code.
2 RATP, Bus2025 : L’ambitieux Plan de la RATP pour un parc 100% propre [Bus2025: RATP’s ambitious plan for a 100% clean fleet], April 2018.
3 Caroline Pidgeon, Going electric – The future of London’s buses, January 2019.
4 Bernd Heid, Matthias Kässer, Thibaut Müller and Simon Pautmeier, Fast transit: why urban e-buses lead electric-vehicle growth, McKinsey & Company, October 2018.
5 Electric buses in cities – Driving towards cleaner air and lower CO2, Bloomberg New Energy Finance, August 2018.
6 Le Figaro, BYD investit 10 millions d’euros en France [BYD invests €10 million in France], 23 March 2017.
7 Battery University, How does internal resistance affect performance?, 2010
8 Noshi Omar et al., Rechargeable Lithium Batteries, Chapter 9 – Ageing and degradation of lithium-ion batteries, 2015.
9 Discover Energy Corp., Temperature effects on battery performance & life, January 2015.
10 Jean-Marc Timmermans et al., Batteries 2020 – Lithium-ion battery first and second life ageing, validated battery models, lifetime modelling and ageing assessment of thermal parameters, 18th European Conference on Power Electronics and Applications, October 2016, page 6.
11 State Grid Corporation of China, China aims for 900 kW hyper power charging standard
12 In any case, usually, OEM warranties end when batteries reach around 80% of SoH. According to industry experts, continuing the highly intensive mobility use after the 80%-norm would result in accelerated depreciation of the battery performance.
13 Electric buses in cities – Driving towards cleaner air and lower CO2, Bloomberg New Energy Finance, August 2018.
14 Nora Manthey, MAN & Hamburg to give 2nd life to electric bus batteries, Electrive.com, 19 March 2018.
15 Renault Press Release, Renault and Powervault give EV batteries a “second life” in Smart Energy Deal, 5 June 2017.
16 Johan Cruijff Arena wordt superbatterij voor elektriciteitsnet [Johan Cruijff Arena becomes a super battery for the electricity grid], Johan Cruijff ArenA website, 29 June 2018.
Monitoring phone calls for traders is the most technically demanding of surveillance activities – for a myriad of reasons: the calls are complex, the quality of the recordings is poor, and, most frustratingly, the technology involved rarely works.
The net result is that regulated industries have, for a long time, avoided deploying voice monitoring solutions, despite the fact that voice communications provide an excellent insight into misconduct. Phone calls are more free-flowing and consequently more incriminating than emails1.
From a risk perspective, it makes little sense to monitor emails but not phone calls; from a budget perspective, this used to make sense due to the costs involved in monitoring audio. However, technology is now changing and costs are dropping, fixing one issue and creating another!
2. WHY IS VOICE TECHNOLOGY SO BAD? ALEXA WORKS!
Many people are, quite rightly, frustrated that technology used to convert voice to text within financial services is so poor. Amazon’s Alexa can be bought for £40 or less and is seen as more accurate than a £1m implementation – this seems to be unfair, at best.
The reasons for this are largely historical.
Much of the technology deployed within banks to monitor traders has been repurposed from technology that was designed for the retail industry. The monitored calls in retail banking are generally scripted and predictable conversations, recorded on high-quality headsets in a quiet call centre.
Trader calls are the polar opposite: the calls are unscripted, unpredictable and recorded on low-quality microphones with a very low “bit rate”. Most systems are recording at around 8-bit. This is four times lower than a podcast and sixteen times lower than the BBC broadcast standards.
The poor recording quality, combined with the poor microphones and a noisy background, means that it is hard for the technology to discern the different sounds and words, thereby creating a low-quality transcription. In addition, many of the technologies being deployed are using “on-premises” solutions rather than cloud-based platforms. Amazon’s Alexa is leveraging not just the massive computing power of AWS but also all of the data that it can access. Alexa constantly updates and improves the platform, supported by millions of users around the world that help drive its deep learning technology. This computing and data power, combined with the fact that Alexa devices have seven microphones in them to ensure high-quality recording and separation of conversations, means that Alexa’s capability far exceeds that of the standard, static, on-premises solution.
3. IS IT GETTING BETTER?
The accuracy levels for speech-to-text technology have increased dramatically, after being almost static for nearly a decade. From 2000 to 2010, most technology was around 70% accurate at best (see chart below); from 2013 to 2016, this improved radically and hit the 94.1% mark. This is a key milestone as it matches the benchmark of human accuracy.
Since 2016, this leading-edge technology has started to become mainstream, with stable and readily available access to the public.
In short, the real world technology has dramatically improved and can be deployed by banks and insurance firms alike.
Speech-recognition word-error rate for selected benchmarks
4. SO IS IT ALL FIXED THEN?
The accuracy of voice-to-text technology is now the same as – or even exceeds – that of humans. However, this does not mean that the problem has been resolved. Far from it, and for two reasons: quality and volume.
The tests and benchmarks are based on a standard “switchboard” data set, not trader calls. The test data is therefore far higher quality than real world trader calls. This difference in call quality means that simply connecting the data to the new voice platforms will not resolve the issue. There will need to be a process to select the most effective platform and then tune it for the specific data set.
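Selecting and tuning a platform for a specific data set starts with measuring it. The standard metric is word error rate (WER) – the word-level edit distance against a human reference transcript; accuracy figures such as the 94.1% above are simply 1 − WER. A minimal scorer:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits to turn the first i reference words into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i                        # deletions
    for j in range(len(hyp) + 1):
        dp[0][j] = j                        # insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + sub)  # substitution / match
    return dp[len(ref)][len(hyp)] / len(ref)
```

Scoring each candidate engine’s transcripts of the same sample of real trader calls against human transcripts gives a like-for-like comparison before committing to a platform.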
The second issue is the volume of data. Even if there were 100% accuracy in speech-to-text transcription, there would also be the same problems of false positives that exist within eComms; all the speech-to-text process will do is provide large amounts of text for analysis. This problem can be addressed through other methods such as cognitive computing to drive the understanding of the subject matter and machine learning to reduce false positives and find more relevant risks. These areas are beyond the scope of this article and are covered in other publications by Accuracy.
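The volume problem is, at heart, a base-rate problem. With illustrative (assumed) figures – genuine misconduct in 1 call per 1,000, an alerting layer that catches 95% of true issues but also flags 5% of clean calls – the review queue is dominated by false positives even with perfect transcription:

```python
calls_per_day = 10_000
prevalence = 0.001        # 1 in 1,000 calls contains genuine misconduct (assumed)
hit_rate = 0.95           # share of true issues the alerting layer flags (assumed)
false_alarm_rate = 0.05   # share of clean calls flagged anyway (assumed)

true_alerts = calls_per_day * prevalence * hit_rate
false_alerts = calls_per_day * (1 - prevalence) * false_alarm_rate
precision = true_alerts / (true_alerts + false_alerts)
print(f"{false_alerts:.0f} false alerts vs {true_alerts:.1f} genuine ones "
      f"(precision {precision:.1%})")
```

This is why the answer lies beyond transcription, in cognitive computing and machine learning: the leverage is in raising precision, not in transcribing more text.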
Voice analytics has historically been very poor with low-quality transcription rates and high numbers of false positives. There has been a rapid increase in the ability to transcribe voice to text in recent years with ever improving technology, but this will not solve the problem of voice surveillance.
Voice surveillance requires an understanding of what is being said and why. It’s not just a transcription or search for keywords: it’s about understanding behaviour.
1 Experience in working across numerous investigations has shown that more issues are found on calls than in emails.
2 National Institute of Standards and Technology (2009). “Rich Transcription Evaluation”
3 IBM, George Saon, Hong-Kwang J. Kuo, Steven Rennie and Michael Picheny (2015). “The IBM 2015 English Conversational Telephone Speech Recognition System”
4 IBM, George Saon, Tom Sercu, Steven Rennie and Hong-Kwang J. Kuo (2016). “The IBM 2016 English Conversational Telephone Speech Recognition System”
5 IBM (2017). “English Conversational Telephone Speech Recognition by Humans and Machines”
6 Microsoft, W. Xiong, J. Droppo, X. Huang, F. Seide, M. Seltzer, A. Stolcke, D. Yu and G. Zweig (2016). “The Microsoft 2016 Conversational Speech Recognition System”
7 Microsoft, W. Xiong, J. Droppo, X. Huang, F. Seide, M. Seltzer, A. Stolcke, D. Yu and G. Zweig (2016). “Achieving Human Parity in Conversational Speech Recognition”
8 Microsoft, W. Xiong, L. Wu, F. Alleva, J. Droppo, X. Huang, A. Stolcke (2017). “The Microsoft 2017 Conversational Speech Recognition System”
9 Google I/O Keynote (2017)
10 Google (2017). “State-of-the-art Speech Recognition With Sequence-to-Sequence Models”
Over the last twenty years, artificial intelligence (“AI”), defined as the development of advanced computer systems able to perform tasks that normally require human intelligence, such as visual perception, communication and decision-making, has been developing at an increasing pace around the globe1. AI is not necessarily a technology in and of itself. It is an ambition, a quest for the intelligent machine, where AI-labelled innovations considered cutting edge in the past are now general applications common enough to question whether they should still be labelled AI.
As is often the case with technology, some locations have taken up positions at the forefront of AI research and development (“R&D”). As of 2018, the United States (“US”) and China are leading AI development, but other countries around the globe have emerged to produce high-quality AI applications and R&D, notably in cities such as Montreal, London, Paris and Tel Aviv. Former Alphabet executive chairman Eric Schmidt has compared the momentum of AI development to the moon race2. When analysing global AI development and the initiatives taken to ensure competitiveness, the AI race appears even more heated, as it involves a broad spectrum of industrial applications and is reinforced by national interests as well as growing capital investment.
AI has quickly progressed with the development of IT infrastructure, computing power and data availability. In addition, AI powerhouses have emerged in clusters because of an intertwinement of the availability and proximity to qualified talent and academic institutions, financial capital, a culture of innovation sharing, and a growing involvement of public and private entities3. Clustering is common in science as innovation rarely happens in a single, hidden place. As explained during an MIT interview with Yoshua Bengio, a deep learning specialist based in Montreal, science moves in small steps thanks to the collaboration of diverse communities where actors interact and share information in a spirit of collaboration leading to orthogonal research directions and exploration paths4. This article provides an overview of leading AI hubs and their particular dynamics around the globe.
AI Ecosystem Drivers
1. KEY HUBS OVERVIEW
The US, as the birthplace of some of the largest digital players such as Alphabet, Apple, Amazon, Facebook, IBM, and Microsoft, has been driving a large share of innovation. These companies have had the necessary ingredients to move AI forward through access to large volumes of proprietary data, technology and capital, as well as the ability to attract highly skilled labour. With over 850,000 people working in AI, the US has one of the biggest pools of qualified professionals and has the capacity to train students in some of the best science, technology, engineering and mathematics (“STEM”) academic institutions5. Between 2013 and 2016, the US had the lion’s share of global private AI investments with over 60% of investments (valued at US$30–40bn). However, with the race heating up, US companies received 38% of global funds amounting to US$15.2bn in 20176. The US has an active merger and acquisition (“M&A”) market led by large strategic players, with over 40 acquisitions by Facebook, Amazon, Apple and Google between 2012 and 20177. These conditions, combined with the American entrepreneurial spirit, have led to the growing emergence of over 2,040 AI start-ups, representing 40% of the AI start-up ecosystem worldwide8. The government has been supportive with a strategic plan, flexible regulation and financial support amounting to over US$1.1bn annually in 2015–2017 and over US$2bn in 2018 from the US Defense Advanced Research Projects Agency (“DARPA”)9.
Silicon Valley has been at the forefront of technological innovation with a symbiotic ecosystem comprising universities, start-ups, tech companies and venture capitalists (“VC”). With over two million tech workers employed at the headquarters of some of the biggest tech players in the world as well as in 15,000 start-ups, Silicon Valley boasts the highest number of entrepreneurs globally10. The pipeline of talent comes from all over the world and from the best local universities such as Stanford, UC Berkeley and UC San Diego. These institutions have been in the vanguard of AI development, hosting some of the best academic and corporate AI labs. In terms of financing, the Valley has historically received up to 40% of global capital investments in AI11.
The East Coast Boston-New York area has been driving innovation due to the presence of top-tier academic institutions such as NYU, Cornell and Boston-based MIT, which developed early natural language processing programs in the 1960s and has been an innovation driver ever since. It is also where the term artificial intelligence was first coined, in 1956 at Dartmouth College. In 2018, MIT announced an investment of US$1bn to create a new college combining AI, machine learning, and data science with other academic disciplines12. This is the largest financial investment in AI by any US academic institution to date. New York is also a major hub where 11% of US AI job postings are located and where labs such as NYU Tandon Future Labs bridge academia, start-ups and industry collaborations13. The area is also a leading financial hub and the second-strongest funding ecosystem after Silicon Valley in the number of early-stage VC investments. Many global banks including Goldman Sachs, JP Morgan and Credit Suisse have set up machine learning teams to apply AI to investment and retail banking14.
Global AI start-up equity funding
In less than a decade, China has demonstrated its ambition to become the global leader in AI with an expected GDP impact going as far as 0.8–1.4 per cent per year by 203015. In 2017, the Ministry of Industry and Information Technology communicated its vision via the Next Generation AI Development Plan where it set forth the following goals:
– reach globally advanced level in AI technology, models and methods by 2020;
– make AI a major economic driving force and be the world’s premier AI innovation centre by 2025;
– build an AI core industry exceeding RMB 150bn (US$21.5bn), and exceed RMB 1tr (US$143bn) in related industries by 203016.
China is walking the walk and has given itself the means to achieve its goal with major research centres in Beijing (US$2.3bn pledged), Tianjin (US$5bn pledged and US$16bn planned until 2025) and Shenzhen (US$5bn)17. Chinese technology giants, who have the resources to move AI forward, have agreed to organise innovation streams, with Alibaba leading smart cities, Baidu covering autonomous vehicles, Tencent responsible for medical imaging and IFlyTek managing smart voice. Thanks to the large-scale collection of data and internet oversight, firms like IFlyTek can access data, such as biometric information, from the government more easily. Beyond the Chinese technological giants, China ranks second for the number of AI enterprises, with over 1,000 firms in mainland China alone18.
As regards talent, even though mainland China produces more science and engineering graduates every year than the United States, Japan, South Korea and Taiwan combined, and trains professionals in strong STEM universities such as Peking and Tsinghua, it still lacks sufficient AI talent to fully achieve its aspiration. A growing number of job postings, rising salaries and the desire to hire the best talent worldwide demonstrate the race to hire more AI experts19. Nonetheless, China’s R&D efforts are reflected in the rise of Chinese unicorns and in AI patent applications, which, for deep-learning-related keywords, were six times more numerous than in the US in 201720.
WIPO Patents in Artificial Intelligence Technology Related Sectors as at 2018
If considered as a unified entity, Europe has significant innovation mass through its AI start-ups (circa 22% of global share), highly reputable academic institutions and a willingness to develop AI as illustrated by the EU Commission’s innovation policy21. The UK (London), France (Paris) and Germany (Cyber Valley) lead the competition in the European AI space. However, if EU members are to compete efficiently on a global scale, Europe needs an ambitious and rapid deployment strategy, covering both business and public administration, to create an ecosystem where ideas and research translate into pragmatic socioeconomic opportunities.
The UK, and more specifically London, is perceived as the leading European AI hub with over 760 enterprises of which around 650 are in London22. The ecosystem benefits from expertise, collaboration and a talent pipeline from universities such as Cambridge, Imperial College London and Oxford. As a global financial centre, London has a strong AI position with applications in finance, insurance and law23. London has government support with a strategy created to identify actions supporting AI growth across industries to drive innovation and productivity. An example of governmental initiative is the Tech Nation Visa, where applicants with exceptional talent can work without sponsorship requirements24. The programme proves beneficial as 43% of London AI start-ups have been founded by non-UK nationals25. Start-ups in London tend to raise lower funding compared to American and Chinese hubs, but the city is leading European VC investment with over £200m in private funding and £500m in public funding invested in 2017. London also benefits from the presence of top AI players such as Google Deepmind, which focusses on deep-learning applications for positive impacts, and OakNorth, which concentrates on fintech26.
France became a place of AI renaissance when President Macron made AI development a priority with his national strategy stating that AI is not only a technological, but also an economic, social, ethical and political revolution27. President Macron has pledged investments to stimulate innovation and turn France into a country of unicorns. As part of this plan, over €1.5bn are devoted towards AI development28. Through the AI for Humanity platform, France is among the countries giving most thought to regulation, diversity and ethics to ensure AI development is in line with the best standards of acceptability for citizens29. Amongst these initiatives are tech visas and the ambition to share governmental data to allow anyone to build AI services. The ecosystem is mainly clustered in Paris where there are strong STEM universities including CNRS, Paris Saclay and the National Institute for IT research and automation (INRIA). Over the last few years, Paris has benefitted from material foreign direct investment with notably IBM, Samsung and Facebook establishing labs. More recently, Google has indicated that it will add 1,000 individuals to its fundamental research lab. The French capital is home to a vibrant start-up community with over 20 incubators including the biggest in the world (Station F, a 34,000 m2 campus). The ecosystem boasts over 120 companies and includes notable start-ups such as Dataiku, which develops machine learning on “dirty” data, and Prophesee, which develops computer vision sensors and systems in all fields of artificial vision30.
Germany is making increasing efforts to be at the forefront of AI development. The 2018 government coalition plan looks to attract talent, respond to the changing nature of work, integrate AI into government services, make public data more accessible, and further the development of ethical AI31. As part of the plan, the government has pledged €3bn until 2025 and expects funding to be matched by the private sector. It has also laid out a vision for German-built AI solutions to have a “Made in Germany” seal of quality32. Germany prefers to focus on increasing productivity in factories and supply chains around the world by leveraging industrial rather than consumer data, which is harder to access given public concerns about data privacy33. The German private sector has been showing increased activity since 2015 with the creation of over 100 AI related start-ups in a variety of sectors including ADA Health in healthcare and Arago in process automation. Germany is also looking to establish 12 research centres to train talent and conduct R&D in collaboration with industrial players34. R&D efforts are diffuse, but players such as Porsche, Daimler and Bosch have concentrated their efforts in the Stuttgart Cyber Valley to provide an industry-backed push to create a stimulating AI ecosystem conducive to technology transfers between academic laboratories (including University of Tubingen as well as University of Stuttgart) and industry35. With its Industry 4.0 efforts, Germany is perceived as taking a leading position in industrial applications including autonomous vehicles and robotics36.
With a significant AI talent base, Canada has pioneered several advances in AI, robotics and deep learning since the 1980s. In 2017, it was the first country to release a national AI strategy distinctly focussed on R&D and talent across the country. The strategic plan reaffirms support to the cities of Toronto and Montreal, which have become some of the most advanced global hubs, by pledging CA$1.3bn (US$1bn) in financing and creating three new AI institutes in Edmonton (AMII), Toronto (Vector Institute), and Montreal (MILA)37. The ecosystem has been rapidly expanding and now has over 280 enterprises mainly located in Toronto and in Montreal.
Toronto is North America’s second-largest financial services hub and is a major manufacturing centre, making it fertile ground for industry and start-ups. It is one of the 20 strongest start-up ecosystems in the world and benefits from leading research conducted at its 16 academic institutions, notably at the University of Toronto and the University of Waterloo38. The city is a fintech leader with over 500 ventures, which raised US$400m in 201739. The city has one of the highest concentrations of AI start-ups in the world with, amongst others, the Vector Institute, NextAI, and the Creative Destruction Lab. Toronto is also home to the R&D labs of Uber, Thomson Reuters, the Royal Bank of Canada, Shopify, Amazon and Google. Google is also looking to apply AI in a new kind of project with the Google Sidewalk Toronto initiative combining forward-thinking urban design and new digital technology to create people-centred neighbourhoods40.
In Montreal, firms can find a hospitable environment offering the cheapest operating costs in North America and a pool of over 90,000 skilled information and communication technology (ICT) workers41. The city has six universities including McGill University and Université de Montréal, which have established labs collaborating with the ecosystem by providing talent and knowledge-sharing. The city boasts one of the largest concentrations of AI scientists in the world and one of the most respected deep learning research groups, led by Yoshua Bengio. Since 2010, the ecosystem has become a global AI hub by attracting the likes of Facebook, Google, Samsung as well as Microsoft and having them invest to take advantage of Montreal’s expertise. Notable start-ups include Element AI, which raised US$102m in Series A funding in 2017, one of the largest early-stage funding rounds in the world at the time.
Israel, and particularly Tel Aviv, is an active innovation hub led by four universities and a start-up ecosystem with over 950 active start-ups, 50% of which have raised one or more funding rounds42. Over the last four years, over 140 new AI-centred start-ups were created per year across a variety of industries. Israeli start-ups have raised over US$7.5bn cumulatively from private and public sources43. AI is part of the national innovation strategy, and the government is supportive, encouraging collaboration between industry and academia. This effort is yielding results, with applications in some of the most advanced defence technology in the world; over 30% of border protection was ensured by AI-powered systems in 201844. Furthermore, with a US$275m public investment, there is keen interest in developing AI-powered digital health services.
Other Asian Hubs
Other Asian hubs such as Singapore, Seoul and Tokyo are active players in the AI race in machine learning, deep learning and robotics. Singapore’s AI.SG initiative is a pioneering model, backed by S$150m (US$109m) in investment over five years to attract more resources, talent, and institutional support45. The initiative focusses on applying AI to finance, smart cities, and healthcare. These are all priorities for Singapore, a financial centre with an ageing population and constrained by space. In South Korea, the government has announced investments totalling South Korean won (KRW) 2.2tn (US$2bn) to strengthen AI R&D and build a public–private AI research centre jointly with leading Korean conglomerates46. Meanwhile in Japan, AI development is integrated in an industrialisation road map aiming to increase the use of data-driven AI applications and build ecosystems connecting multiple domains. The strategy focusses in particular on three priority areas for Japan: productivity, health and mobility. Like in many other countries, the policy includes investments across the ecosystem including R&D, talent, start-ups and data47.
Estimated Number of Artificial Intelligence Start-ups as at 2018
2. COMMON CHALLENGES
Whilst AI hubs are scattered across the globe, they face similar challenges. There is a growing global war for talent where qualified professionals are scarce, as demonstrated by the demand for AI-related roles, which has more than doubled over the last three years, and by the 40% of companies reporting difficulties in filling ICT vacancies48, 49. Another challenge AI firms face is the availability of quality data to train AI algorithms. The human mind can learn to make a logical decision based on a few examples, but AI needs millions of data points to learn a similar decision pattern50. The challenge is exacerbated by the difficulty in transferring learning across domains because most algorithms trained in one domain cannot be transferred to perform in another area, creating a need for large data sets51. In addition to data, current computational power can prove insufficient in complex applications such as deep learning where hundreds of overlaid iterations must be processed simultaneously52. Another challenge is that in many circumstances, AI is developed for medical, defence, legal, and financial applications where health, life and human well-being are at stake. Thus, the ethics, regulation and, more broadly, the acceptability and implementation of pragmatic AI-driven solutions might be the ultimate challenge for an AI-driven society53. The rapid pace of AI development and R&D investments creates opportunities to gain efficiency and increase productivity, but it also creates challenges for industries to understand and implement these technologies as well as for governments to regulate AI.
3. ASSESSING NEW OPPORTUNITIES
Accuracy has a global reach aligned with some of the most vibrant AI hubs, such as London, Paris, Montreal and Beijing. The firm benefits from multidisciplinary professionals across a broad spectrum including advanced analytics, strategy, valuation and data science. Accuracy is always exploring forward-looking avenues to apply open innovation technologies and continue providing bespoke advice to fulfil the strategic and financial needs of our clients. Positioned as a partner of choice to guide change, we can help build bridges to innovation ecosystems, leverage data science and assist in creating change with smart investments. In the next Perspectives articles, AI-driven technology and its applications will be explored further.
In the first Accuracy Perspectives edition of October 2018, we discussed how financial planning tools are evolving, at the request of the regulator, towards integrated platforms that enable the production of budget forecasts and stress tests from quantitative models1. The aim of this article is to complete the ‘project, tool and governance’ visions by reviewing the statistical approaches at the heart of planning systems.
The introduction of quantitative methods in financial planning exercises is gradually taking hold. Motivated by increased stress testing and the will to complement their operational staff’s expert views, banks are developing revenue forecast models for each of their various business lines.
These forecasts can be based on different types of models (analytical mechanics, ALM behavioural models, etc.). To project activity volumes, econometric models2 are vital. This article details the (particularly statistical) issues faced by modelling teams and provides some ideas about how to overcome them.
1. THE STATISTICAL APPROACH IS AT THE HEART OF CREATING FORECAST MODELS
Statistical models enable the use of an activity’s historical performance measures to define the mathematical relationship between such measures and external variables (macroeconomic variables, banking market data, etc.) or internal variables (seasonality). The forecasts therefore consist of multivariate regression models, be they linear or non-linear3.
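As a minimal sketch of such a multivariate regression (on synthetic data; the driver names, coefficients and sample size are purely illustrative), an activity volume can be regressed on two hypothetical external variables:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # e.g. ten years of monthly history

# Hypothetical external variables: GDP growth and a market rate.
gdp_growth = rng.normal(0.3, 0.5, n)
market_rate = rng.normal(2.0, 0.4, n)

# Synthetic activity volume driven by both variables, plus noise.
volume = 10.0 + 1.5 * gdp_growth - 0.8 * market_rate + rng.normal(0, 0.1, n)

# Multivariate linear regression by ordinary least squares.
X = np.column_stack([np.ones(n), gdp_growth, market_rate])
beta, *_ = np.linalg.lstsq(X, volume, rcond=None)
# beta recovers the intercept and the two sensitivities (≈ 10.0, 1.5, -0.8).
```

Non-linear relationships can be handled with the same machinery once the variables have been transformed beforehand.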
In this context, a compromise must often be found between the ease of implementation and adoption by the business lines, on the one hand, and predictive power and statistical robustness, on the other. In order to arrive at this compromise, it is necessary to implement an iterative process in three steps (as detailed in figure 1):
1. Collecting and transforming the data
2. Building the model
3. Evaluating the performance of the model.
Iterative process leading to the creation of a model (figure 1)
– Organisational implementation facilitating the collection of quality data.
– Transformation of data to make it usable via the deletion of exceptional events, the completion of missing data, a seasonality adjustment and other potential mathematical transformations.
– The choice of model in step 2 depends on the quality and historical depth of the data.
– Choice of model applied – linear initially then more advanced methods are used depending on the data and the type of business studied.
– Machine learning methods enable the orientation of the choice of variables; however, more standard methods are preferred because they are:
a) easier to put in place;
b) simpler to audit (avoiding the “black box” issue of machine learning forecasts).
– Use of statistical tests to check the mathematical robustness of the model.
– Back-testing of model over historical periods.
– Sensitivity test of model based on a shock to explanatory variables.
2. MANAGEMENT OF THE DATA: A PREREQUISITE FOR THE CONSTRUCTION OF STABLE MODELS
The quality of a statistical approach depends largely on the data on which the modelling works are based. Indeed, if the historical data from which the relationships with macroeconomic indicators (or other indicators) are defined include “polluting” effects, the models are less precise and can even lead to erroneous conclusions. In this context, the validation of the quality of the data by the business lines is a prerequisite for any statistical analysis. It is important to involve members of the business line teams in these considerations to capture all ‘non-standard’ items.
Once these verifications are complete, the data series can be transformed to improve the quality of the models to test in step (2). These transformations should take into account different types of effects, including:
– the representativeness of exceptional business items (one-offs), such as large mergers and acquisitions (jumbo deals) or fiscal shocks, which can recur in reality;
– the completion of missing and/or asynchronous data. For example, we might want to forecast an indicator on a monthly basis from explanatory variables that are only available quarterly. In this case, it is possible (i) to rely on the quarterly series (implying the loss of points of analysis), (ii) to interpolate on a monthly basis (linearly or not), or (iii) to perform filtering4 by completing the data with statistically coherent estimations from indicators with a greater frequency;
– the seasonality5 to be adjusted, for example in the case of non-seasonal explanatory variables. Seasonality transformations can be based on the ARIMA-X126 algorithm or the STL process based on local regressions7;
– the smoothing of the series to be explained, for example by calculating a moving average8;
– the introduction of a delay effect, or lag, to the series to be explained.
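Several of these transformations can be illustrated on a synthetic monthly series (the December seasonality is invented for the example, and pandas is assumed to be available):

```python
import numpy as np
import pandas as pd

idx = pd.date_range("2015-01-01", periods=60, freq="MS")  # 5 years, monthly
rng = np.random.default_rng(1)

# Hypothetical activity series with a December peak (seasonality).
seasonal = np.where(idx.month == 12, 5.0, 0.0)
series = pd.Series(100 + seasonal + rng.normal(0, 1, 60), index=idx)

# Seasonality adjustment: remove each calendar month's average effect.
adjusted = series - series.groupby(series.index.month).transform("mean") + series.mean()

# Smoothing of the series to be explained: 3-month moving average.
smoothed = series.rolling(3).mean()

# Delay effect (lag): shift the series by one quarter.
lagged = series.shift(3)
```

In practice, the ARIMA-X12 or STL procedures cited above would replace this naive month-mean adjustment, but the mechanics are the same.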
Alternatively, other mathematical transformations can be applied to the series to iteratively improve the results of the models evaluated in step (3):
– data differencing – of order 1 or 2 – with a short frequency (one quarter, for example) or a long frequency (one year, for example). This generally corrects the statistical bias linked to non-stationarity9 but can sometimes be unstable in projections;
– the application of functions (growth rate, square, logarithm) aiming to capture non-linear effects;
– the use of models in co-integration (see below) in case of the non-stationarity of the variables to be explained.
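For instance (on a synthetic quarterly random walk, purely for illustration), the differencing and logarithm transformations can be written as:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical non-stationary quarterly series: a random walk with drift.
series = np.cumsum(rng.normal(0.2, 1.0, 200)) + 100

# Order-1 differencing at a short (one-quarter) frequency.
diff1 = np.diff(series, n=1)

# Long-frequency (year-on-year, i.e. four-quarter) differencing.
diff_yoy = series[4:] - series[:-4]

# Growth-rate / logarithm transformation to capture non-linear effects.
log_growth = np.diff(np.log(series))
```

The differenced series fluctuates around the drift (about 0.2 here) instead of trending, which is what makes standard regression techniques applicable to it.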
The assessment of the quality of the data collected, as well as the characteristics of the transformations applied during the data collection in step (1), must be taken into account when choosing the model to develop during the modelling in step (2). Indeed, the historical depth of the data must be sufficient to capture distinct scenarios (crises, differentiated interest rate scenarios, etc.). Moreover, the data must reflect comparable business realities (for example, a scale effect linked to the number of traders on a desk, or a reorganisation of the activity, must be taken into account to harmonise the series from a statistical perspective).
3. ADAPTING THE CHOICE OF THE EXPLANATORY VARIABLES AND THE EXPRESSION OF THE MODEL TO THE ENVIRONMENT
Choice of model
The expression of the model must adapt to the needs of the users of the tool.
– Linear approaches, for example, are simpler to implement but do not allow the model to capture more complex relationships than affine relationships between the explained variable (or its growth) and the explanatory variables (or their growth). Coupled with the use of the non-linear transformations on the explanatory variables of the model, however, simple linear approaches enable the model to capture non-linearities. For example, the logarithm of mortgages can be correlated to the logarithm of the growth of household income. The use of logarithmic transformation enables the model to link the variables whose orders of magnitude are different.
– Machine learning methods10, such as the random forest11, are very good tools to orientate the choice of variables. They are, however, rarely retained because they are often complex to implement and difficult to audit for a regulator. Furthermore, they do not highlight exogenous drivers of activity and can remain too centred on autoregressive forms12.
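A sketch of this two-stage use of machine learning (synthetic data; scikit-learn is assumed to be available and the variable set is invented): the forest ranks the candidate variables, and a classic, auditable linear regression is then fitted on the retained ones.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
n = 300
# Three candidate (already transformed) explanatory variables; only two matter.
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(0, 0.1, n)

# 1) The random forest orientates the choice of variables...
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
keep = np.argsort(forest.feature_importances_)[::-1][:2]

# 2) ...but a standard linear model is retained for the forecast,
# avoiding the "black box" audit issue.
X_kept = np.column_stack([np.ones(n), X[:, keep]])
beta, *_ = np.linalg.lstsq(X_kept, y, rcond=None)
```

The forest is thus used only as a screening device; the coefficients that reach the regulator come from the linear model.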
Models in co-integration
In case of non-stationarity13, classic statistical models are unstable and specific techniques must be used. One central notion today is that of the co-integration model for macroeconomic variables. A set of variables is co-integrated with the observed series if there exists a combination of these variables that cancels out the stochastic trend of the observed series, leaving a stationary series. For example, it has been demonstrated that in the United States, real consumption per inhabitant and real disposable income per inhabitant are co-integrated, highlighting a stable relationship between these two non-stationary series. These co-integrated variables are therefore linked to the observed series by a ‘long-term’ linear equation, which can be interpreted as a macroeconomic equilibrium around which the differences constitute temporary fluctuations. Returning to the previous example, a temporary fluctuation in consumption relative to disposable income can occur in a given quarter, but it will have an offsetting effect on consumption in the following quarter, which tends to bring the two series back towards the point of equilibrium represented by the long-term relationship.
The historical approaches to capturing this type of relationship are those of Engle and Granger14 or Johansen15, as well as the models called Autoregressive Distributed Lag (ARDL). All these models capture both long-term relationships and deviations from these equilibria via mean-reverting and error-correction models.
Choice of variables
Initially, the explanatory variables will be chosen from among all the transformed variables, thanks to simple correlation studies. The choice of explanatory variables can also be informed by the business line expertise, systematic classification approaches (such as principal component analysis16) or even their significance in machine learning methods (we then preserve the variables highlighted by the method but by applying classic statistical models to them).
Conversely, certain variables will be excluded a posteriori by the statistical tests in step (3). In particular, too many variables can result in over-fitting, and collinear variables in unstable regression coefficients17.
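Collinearity can be flagged with variance inflation factors (VIF); a minimal hand-rolled version on synthetic data (the threshold of 10 used in the comment is a common rule of thumb, not a universal one):

```python
import numpy as np

def vif(X):
    """Variance inflation factor of each column: 1 / (1 - R^2) of the
    regression of that column on all the others (plus an intercept)."""
    n, k = X.shape
    out = []
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        resid = X[:, j] - others @ beta
        out.append(X[:, j].var() / resid.var())
    return np.array(out)

rng = np.random.default_rng(6)
a, b = rng.normal(size=200), rng.normal(size=200)
c = a + 0.05 * rng.normal(size=200)  # nearly collinear with a
v = vif(np.column_stack([a, b, c]))
# VIFs above ~10 (here a and c) flag the pair whose regression
# coefficients would be unstable.
```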
Calibration of parameters
The method used to estimate the parameters of the regression depends on the tests undertaken in step (3). The parameters will be estimated either by the least squares estimator or, for example, by Yule-Walker18 estimators to avoid the bias inherent to the autocorrelation of residuals in the series used.
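A brief sketch on a simulated AR(1) series (the `yule_walker` helper from statsmodels is assumed to be available):

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker

rng = np.random.default_rng(7)
n, phi = 1000, 0.6
# Simulate an autocorrelated series: x_t = 0.6 * x_{t-1} + e_t.
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

# Yule-Walker estimation of the autoregressive coefficient, based on
# the sample autocorrelations rather than a naive least-squares fit.
rho, sigma = yule_walker(x, order=1)
# rho[0] should lie close to the true coefficient 0.6.
```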
The problems raised by non-stationarity also concern the inference of the parameters of the estimated model, for which the usual asymptotic laws derived in the context of stationary series can lead to inconsistencies if used as such.
Notably, p-values (see below) and confidence intervals are no longer reliable in the context of non-stationary series or co-integration.
4. EVALUATING THE MODELS GIVES CREDIBILITY TO THE STATISTICAL PROJECTION WORKS
The predictive power of the model must be verified by a set of tests. Statistical tests or backtesting can support the choice of the model, even though neither is decisive on its own. We note that the verification requirements of these tests must be weighted by the quality of the available data. The sensitivity of the model to a shock to the explanatory variables must be assessed in all cases.
The calculation of the significance of the variables (p-value) is important, but the estimation of the parameters and the calculation of the p-values must be corrected in case of the non-compliance of the basic assumptions of the linear regression19:
– Stationarity of the time series20 (homogeneity of their distribution over time): the results of the linear regressions can be unstable over time if the series are not stationary, even in the case of a good R2. In this case, it is preferable to transform the variables (step (1)) or to choose a co-integration model (step (2)).
– Homoscedastic residuals21 (constant variance over time) and/or, more generally, non-autocorrelated residuals22: non-compliance may indicate a missing explanatory variable and can significantly bias the variances and the confidence intervals of the coefficients. It is then necessary to correct the coefficients23 or to modify the estimators used24.
– Normality of residuals25: this assumption of linear regression is, however, rarely verifiable on small samples (asymptotic property) and is not necessary for the convergence of the parameter estimators.
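Two of these diagnostics can be computed directly from the residuals; a minimal sketch on simulated well-behaved residuals, using the closed-form Durbin-Watson and Jarque-Bera statistics rather than any library wrappers:

```python
import numpy as np

rng = np.random.default_rng(8)
resid = rng.normal(0, 1, 500)  # residuals of a hypothetical regression

# Durbin-Watson statistic: close to 2 when residuals are not autocorrelated.
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# Jarque-Bera statistic: small when skewness and excess kurtosis vanish,
# i.e. when the residuals look normally distributed.
s = resid - resid.mean()
skew = np.mean(s ** 3) / np.mean(s ** 2) ** 1.5
kurt = np.mean(s ** 4) / np.mean(s ** 2) ** 2
jb = len(resid) / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)
```

A dw far from 2, or a jb above the relevant chi-squared critical value, would point to autocorrelated or non-normal residuals respectively.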
Backtest (or cross-validation or out-of-sample performance)
If the historical depth permits, it is possible to measure the difference between the actual historical series and the model calibrated on a different time period. By repeating the exercise over several sub-periods, it is possible to verify the stability of the coefficients of the regression. An equivalent average error between the period tested and the calibration period is a good indicator that the model is not over-calibrated (over-fitted).
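Such a rolling out-of-sample backtest can be sketched as follows (synthetic data; the window size and the one-factor model are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 160
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)

# Calibrate on a rolling window, forecast the next point out of sample,
# and collect the absolute forecast errors.
window, oos_err = 80, []
for t in range(window, n):
    X = np.column_stack([np.ones(window), x[t - window:t]])
    beta, *_ = np.linalg.lstsq(X, y[t - window:t], rcond=None)
    oos_err.append(abs(y[t] - (beta[0] + beta[1] * x[t])))

mean_oos = float(np.mean(oos_err))
# An out-of-sample error comparable to the calibration error suggests
# the model is not over-fitted; here both sit near the noise level.
```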
Modelling all the financial aggregates of a bank requires modelling activities of diverse natures relying on heterogeneous statistical models. In this context, the team responsible for the development of the models must adapt to the reality of each of the segments of activity. This results in a plethora of statistical models to be articulated on a flexible platform enabling them to be linked to each other and to the data sources (business line databases notably) to propose results that are directly usable by the platform’s teams.
Four types of difficulty must be overcome to constitute a sufficiently solid base of models to integrate to the platform:
– Difficulty in finding predictive statistical models on certain perimeters: not all activities can be modelled using a statistical approach, and some are more complex to understand (specific commissions, general fees, etc.). Moreover, the majority of classic statistical models struggle to capture non-linearities in past behaviours.
– Layering distinct, overlapping effects on top of the core modelling of the activities: foreign exchange effects, portfolio concentration, etc. Contagion effects, reputation effects and all feedback effects are particularly complex to capture.
– Difficulty in collecting quality data that is easy to update after the first modelling exercise: lack of depth of the data, homogeneity issues, etc.
– Organisational difficulties and issues with the tool.
Whilst banks now have recognised quantitative modelling teams, these skills are mainly concentrated in the Risk teams on credit risk and market risk issues. For most banks, prospective modelling based on statistical methods implies the constitution of specialist teams.
The methods discussed above provide a global vision of the statistical measures available to planning teams to build their projection models. The human dimension and the ability to recruit talent capable of building complex models is at the heart of the issue.
The development of a tactical approach, via agile tools, enables banks to create initial support for the platform and to distinguish the construction of the models from their industrialisation in the bank’s systems.
1 ‘L’émergence des plateformes intégrées de planification financière et de stress tests’ Revue Banque 824, pp. xx-xx
2 Econometric models enable the modelling of economic variables based on the statistical observation of relevant quantities.
3 Regression models are used to explain the evolution of a variable according to one (univariate model) or several variables (multivariate model). These regression models can be linear if there is a relationship of direct proportionality between the explained variable and the explanatory variables.
4 This can be implemented using a Kalman filter, for example.
5 Or more generally the autocorrelation of the series, which amounts to introducing an endogenous variable into the model.
6 The ARIMA-X12 algorithm is a popular method of seasonality adjustment developed by the US Census Bureau. This method applies to series with monthly or quarterly seasonality. It is implemented in most statistical software and is one of the methods advocated by the European Statistical System (ESS).
7 The STL (“Seasonal and Trend Decomposition Using Loess”) procedure is a method of breaking down a time series into a seasonal component, a trend and residuals. As such, it is also a method of adjusting the seasonality that may be preferred in some cases to ARIMA-X12-type methods (especially in case of fluctuating seasonal components or in the presence of outliers).
8 More generally, this can be incorporated, with the seasonality, into an ARMA-type (Auto Regressive Moving Average), ARIMA-type (AutoRegressive Integrated Moving Average) or SARIMA-type (Seasonal ARIMA) modelling process.
9 The stationary (or not) character of a time series refers to the homogeneity of its statistical distribution over time. A weaker property used in practice (weak stationarity) is the fact of having its first two moments (mean and variance) constant, as well as an invariant autocorrelation function by translation over time.
10 These methods are part of what is now called ‘Machine Learning’, which aims to leverage data to determine the form of the model to be adopted, rather than specifying it upstream. These methods are based on the statistical analysis of a large number of data of various natures.
11 Random forests are a family of machine learning algorithms that rely on sets of decision trees. The interest of this method is to train a set of decision trees on subsets of the initial dataset and thus to limit the problem of over-learning. This type of algorithm makes it possible to perform classification (estimation of discrete variables) and regression (estimation of continuous variables).
12 An autoregressive model is one in which a variable is explained by its past values rather than by other variables.
13 A random process is considered stationary if it is stable over time. Mathematically, this results in particular in a constant expectation (there is no trend) and a constant variance.
14 Co-Integration and Error Correction: Representation, Estimation, and Testing, Robert F. Engle and C. W. J. Granger, 1987.
15 Estimation and Hypothesis Testing of Cointegration Vectors in Gaussian Vector Autoregressive Models, Johansen, Søren, 1991.
16 Principal Component Analysis (PCA) is a method of data analysis which consists in transforming variables that are correlated amongst themselves into new variables that are de-correlated from each other on the basis of their mathematical characteristics (eigenvalue decomposition).
17 The Lasso or Ridge regressions allow the regularisation of the problem and the selection of variables of greater interest by introducing penalty terms.
18 The Yule-Walker equations establish a direct correspondence between the parameters of the model and its autocovariances. They are useful for determining the autocorrelation function or estimating the parameters of a model.
19 When the assumptions that provide the asymptotic distributions or the confidence intervals of the estimators are no longer satisfied, the confidence intervals can still be calculated by simulation (bootstrapping or resampling).
20 The classic tests to run are the augmented Dickey-Fuller, Phillips-Perron and Kwiatkowski-Phillips-Schmidt-Shin tests.
21 The Breusch-Pagan and Goldfeld-Quandt tests.
Creating value for shareholders is a never-ending exercise for executives and finding the right opportunities in today’s economic environment leaves little room for mistakes. From the wide range of potential strategic initiatives to increase an investor’s wealth, M&A activities remain popular strategic options for executives, who see them as an effective way to build bigger, more profitable, more competitive and more diversified businesses.
A management team may be looking into an acquisition for many reasons, but it is widely accepted that strategic buyers expect a certain amount of synergies from integrating a target’s activities into their own. However, knowing that 70% to 90% of M&A projects fail to deliver the expected benefits initially identified1, are management teams measuring potential value creation the right way?
It is tempting to point the finger at bad post-merger integration processes or other external factors, but our experience shows that many deals would have had a higher success rate by adopting a holistic approach at the synergy valuation stage, combining both economic and financial analysis.
1. A QUICK THEORY REMINDER
We can establish the theoretical added value of an M&A project for shareholders as the difference between the estimated maximum deal value and the actual price paid for the acquisition, as shown below.
Putting aside the benefit of controlling the new integrated entity, the estimation of the net present value (NPV) of synergies can significantly influence the perceived added value created in a deal.
When using either a DCF or an EBITDA multiple to determine the NPV of synergies, the following four factors will have an impact on the valuation:
– Additional cash flow generated from potential synergies;
– Future growth rate from potential change in economic landscape;
– Change in the capital cost structure;
– Time horizon for synergies to materialise or fade.
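The four factors above can be made concrete with a minimal DCF sketch; all figures (cash flow, growth rate, WACC, horizon) are illustrative assumptions, not values from any actual transaction:

```python
# Minimal DCF sketch of the NPV of synergies, reflecting the four factors:
# incremental cash flow, growth, cost of capital, and a finite horizon
# after which the synergies are assumed to fade away entirely.
def npv_synergies(initial_cash_flow, growth, wacc, horizon_years):
    """Discount a growing stream of synergy cash flows over a finite horizon."""
    npv = 0.0
    cf = initial_cash_flow
    for t in range(1, horizon_years + 1):
        npv += cf / (1 + wacc) ** t
        cf *= 1 + growth
    return npv

# e.g. EUR 50m of first-year synergies growing 2% p.a., 8% WACC, 10-year horizon
value = npv_synergies(50.0, 0.02, 0.08, 10)
print(f"NPV of synergies: EUR {value:.0f}m")
```

Changing any one input materially moves the result, which is precisely why the four factors deserve separate scrutiny in the valuation.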
Every business is unique so it is nearly impossible to develop a standardised approach to this exercise, but having a structured methodology could help avoid costly mistakes. What follows is a brief introduction to synergy valuation topics. It should not be thought of as an exhaustive approach, but as a focus on certain concepts and issues that management teams may face when valuing the financial considerations of an M&A deal.
2. WHAT IT SHOULD LOOK LIKE
Synergies are expected to arise from different sources depending on the sector. Globally, our experience working on M&A projects shows that the publicly communicated run-rate EBIT synergies stand at 1% to 5% of combined sales.
Revenue synergies are more complicated to benchmark against comparable transactions since the benchmarking exercise requires a detailed analysis of the target’s product specificities. It could also be substantially affected by the chosen product integration strategy.
Benchmarking cost synergies against similar deals is more reliable. By way of example, we regularly observe supply chain and procurement savings of 2% to 4% of the combined procurement cost base in the merger of similarly sized industrial or manufacturing players. We also regularly see 10% to 20% savings on real estate costs, depending on the overlap of the geographical footprint. More generally, players in these industries can expect a split of run-rate cost synergies as shown in the following chart.
3. NOW WHERE TO START?
The following two assessments will have a significant impact on the entire synergy quantification process:
– Product & service offerings matrix: Are the offering propositions of the two companies (i) complementary, (ii) directly competing or (iii) overlapping on the value chain?
– Integration strategy for the target: Is the target’s product portfolio going to be (i) integrated into the existing brands or (ii) kept as a distinct offering?
Two overlapping offerings will yield two different synergy assessments depending on the product integration strategy. For example, technology & software companies frequently face this dilemma when dealing with two successful stand-alone solutions after an acquisition. Killing one product might lead to significant commercial and R&D cost savings, but it could potentially lead to negative top line synergies from adverse customer conversion to the surviving product.
Good practice would be to perform distinct analyses for each integration scenario to have clear visibility on which one will maximise value creation.
4. BEYOND THE FINANCIAL STATEMENTS
M&A activities can have a significant impact on the competitive landscape of the industry, and these effects should be integrated into the financial and valuation models. The current economic environment and the stage of the economic cycle will also affect growth expectations and strategic developments and should therefore feed into synergy assessment and valuation.
For example, in a consolidating sector where there is significant value to scale, it could make sense for a player to acquire what initially appears to be an overpriced target to keep its current competitive advantages. Not pursuing this deal may lead to a loss in pricing power against its main competitors or the ability to scale cost reductions, for example. In this situation, it is important to consider this potential loss of competitive advantage in the synergy analysis. A deal might not generate any additional value at the current price, but not making it might ultimately lead to value deterioration.
Another example would be an equipment manufacturer in the oil & gas industry that considers a very promising and synergistic acquisition. By financing the acquisition with debt to capitalise on low borrowing costs and opportunities to optimise the capital structure, the company could face serious short-term liquidity issues if current economic indicators exhibit a downward trend and point to a potential slowdown in oil & gas equipment in the near term.
Good practice looks beyond the base case scenario or the current business plan assumptions and does not assess synergies purely based on the current economic environment.
5. SOME ATTENTION POINTS
The following is a non-exhaustive list of attention points that we usually focus on when dealing with synergies.
Revenue synergies identification
Discarding revenue synergies
Revenue synergies can be difficult to estimate for many reasons such as specific product or technological complexities. This uncertainty can lead executives and investors to discard synergies from the final assessment, which can lead to huge opportunity cost mistakes.
Involving operational and technical experts is a good way to increase the credibility of the revenue synergy exercise. Assigning proper contingency reserves is also a good way to avoid missing any potential upside value, especially when comparing multiple targets and projects.
Revenue synergies and associated operating profit margin
A common practice is to assume that the current fixed cost structure will absorb the additional revenue generated from top line synergies by using a higher operating profit margin.
However, eliminating some costs (e.g. selling & marketing) can undermine the revenue synergy drivers themselves (e.g. developing new markets or exploring new distribution channels, which require significant commercial efforts).
Cost synergies identification
Synergy vs restructuring plan
Restructuring initiatives can be negatively perceived internally and under certain circumstances are difficult to realise. An acquisition can be a good opportunity to create synergies and optimise the cost structure.
If the restructuring plan can be implemented outside of the acquisition process, it should not be considered in the value creation analysis.
Financial statements figures
Many synergy assessments depend on financial and accounting reporting, especially when considering the cost structure.
However, businesses usually report costs differently, even when they are in the same sector. For example, R&D costs for one company can fall under SG&A costs or COGS, or they can be capitalised on the balance sheet, whereas another company might allocate them entirely to SG&A.
Some synergies do not last
A common practice is to calculate a run-rate synergy level and assume that it will continue unchanged in the long run.
Whilst this is mostly true for cost synergies, revenue synergies will most likely face pressure at some point in the future. For example, an increase in pricing power due to an increase in market share will eventually fade away as the sector further consolidates. Care should be taken when using EBITDA multiples to value synergies.
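The valuation consequence of this asymmetry can be illustrated with a simple contrast: a persistent cost synergy valued as a perpetuity versus a revenue synergy assumed to decay as the sector consolidates. The decay rate, WACC and amounts are hypothetical assumptions:

```python
# Persistent cost synergy (perpetuity) vs a revenue synergy that shrinks
# by a fixed rate each year before vanishing. Figures are illustrative.
def perpetuity(cash_flow, wacc):
    return cash_flow / wacc

def decaying_stream(cash_flow, wacc, decay, years):
    # Each year the synergy cash flow shrinks by `decay`, then stops.
    return sum(cash_flow * (1 - decay) ** (t - 1) / (1 + wacc) ** t
               for t in range(1, years + 1))

wacc = 0.08
cost_synergy = perpetuity(30.0, wacc)                # EUR 30m p.a., lasting
rev_synergy = decaying_stream(20.0, wacc, 0.15, 15)  # EUR 20m p.a., fading

print(f"Cost synergy NPV:    EUR {cost_synergy:.0f}m")
print(f"Revenue synergy NPV: EUR {rev_synergy:.0f}m")
```

Treating the fading revenue stream as a perpetuity (or applying a flat EBITDA multiple to it) would overstate its value several times over.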
Change in future growth perspectives
Acquiring another company will likely change the market outlook and future expected growth of the new company by obtaining a new technology or gaining access to new markets, for example.
The change in the future growth prospects of the combined entity should be reflected in the DCF or EBITDA multiple analyses to value synergies.
Potential change in the cost of capital
An acquisition triggering a change in economic activities and global industry exposure will most likely distort the acquiring company’s cost of capital.
Synergy valuation should be based on the new combined entity's cost of capital, and not solely on that of the acquirer preceding the transaction. Financial expenses should also be considered if additional debt is raised to finance the acquisition, given its impact on the capital structure.
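A hedged sketch of re-estimating the combined entity's cost of capital: a value-weighted blend of the two businesses' WACCs, and a standard after-tax WACC for the post-deal capital structure. All inputs are illustrative, not taken from any actual transaction:

```python
# Cost of capital for the combined entity: (1) value-weighted blend of the
# two standalone WACCs; (2) after-tax WACC under the post-deal debt mix.
def blended_wacc(values, waccs):
    """Value-weighted average cost of capital of the combined entity."""
    total = sum(values)
    return sum(v / total * w for v, w in zip(values, waccs))

def wacc_with_debt(cost_equity, cost_debt, tax_rate, debt_ratio):
    """Standard after-tax WACC for a given debt / (debt + equity) ratio."""
    return (1 - debt_ratio) * cost_equity + debt_ratio * cost_debt * (1 - tax_rate)

# e.g. acquirer valued at EUR 4bn (8% WACC), target at EUR 1bn (11% WACC)
combined = blended_wacc([4.0, 1.0], [0.08, 0.11])
# post-deal structure: 30% debt at 4% pre-tax, 25% tax, 10% cost of equity
post_deal = wacc_with_debt(0.10, 0.04, 0.25, 0.30)

print(f"Blended WACC: {combined:.2%}")
print(f"Post-deal WACC: {post_deal:.2%}")
```

Discounting the synergy cash flows at the acquirer's pre-deal WACC instead of the combined figure can materially misstate their NPV.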
6. ACCESS TO DATA
An important step is to understand the financial cost structure and geographical footprint for both the acquiring company and the target. This involves gathering detailed financial and operational data such as distribution channels, plant and office locations with their respective characteristics (headcount, fixed costs, capacity), recurring revenue patterns, client concentration, etc.
It certainly is complicated to gather reliable data for the target, but it is surprising how complicated it can be for the acquiring company to access its own data. Only a limit