Can Big Data save the GDP from becoming outdated?

Prof. Dr Marc Bertonèche

The digital revolution is accelerating automated processes and will most likely dampen the majority of the driving forces raising the GDP, namely by reducing the number of working employees, decreasing the demand for skills and shrinking the availability of capital stock. Is the GDP still the right parameter to assess the performance or the economic and social well-being of a country?

In her latest Duet Interview, Dr Caldarola, author of Big Data and Law, asked Prof. Marc Bertonèche, Economics Professor at universities such as INSEAD, Harvard, Oxford and Bordeaux, to discuss the impact of Big Data on economies.

Gross domestic product (GDP) is the standard measure of the value added or created through the production of goods and services in a country during a certain period. As such, it also measures the income earned from that production, as well as the total amount spent on final goods and services minus imports. Is innovation a performance indicator, and is Big Data an innovation, perhaps even a disruptive one? Given the nature of Big Data, can it, therefore, bolster the GDP?

Prof. Dr Marc Bertonèche: Big Data, as such, does not raise the Gross Domestic Product. It does, however, create conditions which can lead to a significant increase in the GDP. We all know that the GDP depends upon personal consumption, business investment, government spending and net exports. In the context of digitalisation, the driving forces, fed by the huge development of Big Data, that will raise the GDP and significantly impact economies include:

  • New goods and services are being produced as a result of more accurate marketing. Companies are anticipating what customers really want and will do so even more adeptly in the future. They will, therefore, be able to generate more tailor-made and sophisticated products and services, which will lead to new and increased revenue streams.
  • Business processes can now be optimised and organisations better managed. More precisely, Big Data will allow demand to be forecast more accurately, and thus boost productivity, optimise pricing, improve and strengthen supply chains, and reduce employee turnover, among other processes.
  • Innovation will be accelerated through shorter research and development cycles (the case of the vaccines against COVID-19 is a perfect example). Innovation, in particular disruptive innovation, will be the driving force behind raising the GDP. Our traditional managerial tools will have to be redefined and modernised to avoid the widely used ROI (« Return On Investment ») becoming a « Restraint On Innovation »!
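The expenditure components mentioned above add up in the standard national-accounts identity GDP = C + I + G + (X − M). A minimal sketch, with purely illustrative figures rather than real national accounts data:

```python
# Expenditure approach to GDP: GDP = C + I + G + (X - M).
# All figures are illustrative (billions of a notional currency).

def gdp_expenditure(consumption: float, investment: float,
                    government: float, exports: float, imports: float) -> float:
    """Sum the four expenditure components; imports are subtracted
    because they were produced abroad."""
    net_exports = exports - imports
    return consumption + investment + government + net_exports

gdp = gdp_expenditure(consumption=1400.0, investment=400.0,
                      government=500.0, exports=300.0, imports=350.0)
print(gdp)  # 2250.0
```

Any of the channels listed above (new revenue streams, higher productivity, faster innovation) ultimately shows up through one of these four components.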

Because data is easier to collect, transmit, store and analyse, it is, and will increasingly become, the crucial engine of GDP growth.

We have just invoked a key word by mentioning analyse. It is important to emphasise that data is a quite useless commodity if there are no efficient ways to extract meaningful information from it.

Key strategic skills will obviously be critical to turn data into action (as in Artificial Intelligence, Machine Learning, the Internet of Things…). More and more sophisticated analytic approaches are needed to make the most of Big Data. I cannot stress enough that data on its own is simply a raw material which is of little value if not correctly analysed and processed.

A recent report by the McKinsey Global Institute estimates that Big Data could generate an additional $3 trillion in value every year, of which $1.3 trillion would benefit the US. Even if not all of these benefits directly affect the GDP as measured today, their impact on economic growth will remain more than significant.

An MIT study by Erik Brynjolfsson found that companies adopting data-driven decision-making processes achieved 5% to 6% higher productivity than their peers.

Omidyar Network has just released a study concluding that the use of Big Data for government policies could boost annual income in the G20 countries by between $700 billion and $950 billion.

And I could easily add to the list of numerous studies illustrating the strong, positive impact of Big Data on the growth of economies and the level of their GDP, and this is only the beginning. The world is generating data in ever larger quantities. The data avalanche is dramatically increasing the huge variety of data collected and the velocity at which data is recorded. IFL Science, in an article entitled « How much data does the world generate every minute? », estimates that 90% of the data available in the world today was generated in the last two years, and that the amount of data doubles every two years. That is just one more reason why I cannot overemphasise the importance of transforming these data into really useful and efficient information through careful and appropriate steps of analysis. Graduate academic programs should include these methods and techniques in their curricula, and organisations should develop training courses for their employees.

How can Big Data improve the way we measure the overall performance (success or failure) of the economy? Will it give us the opportunity to develop new indicators?

Criticism is growing among economists regarding the usefulness and relevance of Gross Domestic Product (GDP) as a measure of economic and social well-being. They emphasise the need to develop new ways of evaluating progress using a broader and much more relevant measurement of performance. Joseph Stiglitz, the 2001 Nobel laureate in Economics, has been one of the strongest voices arguing in that direction and has led a commission tasked with defining new approaches to measuring economic performance and social progress. « It is clear », he wrote, « that GDP is not an adequate summary statistic of how well we are doing. »

The current pandemic has clearly demonstrated that greater wealth and higher income per capita do not necessarily translate into better individual or societal well-being. The United States, despite having the highest GDP and one of the top incomes per capita in the world, has been greatly affected by the pandemic and has demonstrated dramatic inequalities in terms of healthcare access. Life expectancy in the US is lower than that of many other countries with comparable GDP and income per capita.

Why is that issue so crucial today? Because, as Stiglitz argues, « what we measure affects what we do. If we measure the wrong thing, we will do the wrong thing. » How can we develop a better performance measurement system for any economy? We need to track economic growth, of course, but also social, political, environmental and societal progress. A single summary statistic, whatever it is, will never be able to integrate and reflect the complexity of a society. Emphasising a single indicator is one of the reasons why the GDP has been a very poor metric of performance and has been facing growing criticism. What we need is a « dashboard » approach, and the data available today and tomorrow will make it possible to build one. This set of holistic parameters should obviously include economic data, but also social indicators such as health, education, equality and security, together with governance issues, such as human rights and progress made toward a democratic process, and environmental targets achieved, such as emission reduction and biodiversity.
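A minimal sketch of what such a multi-indicator « dashboard » could look like in code. The indicator names and figures are entirely hypothetical, chosen only to illustrate reporting each pillar separately instead of collapsing everything into one number:

```python
# A "dashboard" of indicators instead of a single summary statistic.
# Indicator names and values are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class Dashboard:
    gdp_growth_pct: float         # economic pillar
    life_expectancy_years: float  # social pillar: health
    mean_schooling_years: float   # social pillar: education
    gini_index: float             # social pillar: equality (0 = perfect equality)
    co2_change_pct: float         # environmental pillar (negative = reduction)

    def summary(self) -> dict:
        """Report each dimension separately rather than collapsing
        everything into one number such as GDP."""
        return {
            "economy": self.gdp_growth_pct,
            "health": self.life_expectancy_years,
            "education": self.mean_schooling_years,
            "equality": self.gini_index,
            "environment": self.co2_change_pct,
        }

d = Dashboard(2.1, 81.3, 12.4, 0.31, -3.5)
print(d.summary())
```

The design point is that no weighted average is computed: each pillar is reported side by side, which is precisely what distinguishes a dashboard from a single summary statistic.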

My opinion is:

Big Data will give us a chance to ensure that the widely used parameter Gross Domestic Product (GDP) does not degenerate into a « Genuinely Dated Parameter » and that the traditional Return On Investment (ROI) does not turn into a « Restraint On Innovation »!

Prof. Dr. Marc Bertonèche

What holds true at the macroeconomic level is of course equally valid at the company level. Organisations need to develop metrics integrating the three pillars of performance: the economic pillar, the social one, and the environmental and governance one. The ESG approach has been overwhelmingly accepted, and even required, in more and more countries. Investments integrating this triple assessment of performance have experienced incredible growth and, according to a study published by the US Business Roundtable, reached about $30 trillion in 2019, a growth of nearly 70% since 2014 and a tenfold increase since 2004!

What do you think are and will be the economic and social consequences of technological progress, automation and digitalisation on jobs?

This is a very difficult question, and nobody really knows the answer to it. There are two very divergent scenarios to consider: a very pessimistic scenario and an optimistic one.

Looking first at the former, the pessimistic variation has been described and analysed in various studies. A 2013 study by Carl Benedikt Frey and Michael Osborne from the University of Oxford estimated at 47% the proportion of jobs heavily threatened by technological progress and automation in the next two decades, including accountants, auditors, salespersons, real estate agents, lawyers and… economists. For non-qualified workers, a study by MIT's Erik Brynjolfsson and Andrew McAfee forecasts critical unemployment in the coming decades because of the huge development of robots, « the immigrants of the future », and the exponential increase of what David Graeber, from the London School of Economics, calls « bullshit jobs »: ways to keep people artificially busy for no economic reason other than fighting the inability of the economic system to generate enough jobs to guarantee some level of social stability.

In the Finance sector, the impact would, in this pessimistic scenario, be disastrous. Lee Coulter, CEO of Transform AI and a recognized expert in automation and artificial intelligence, said at a conference sponsored by CFO in New York City in November 2019: « 70% of what is done today in a Finance department can be automated, and a huge number of jobs will disappear ».

What is fairly certain is that retraining will become an absolute necessity should this state of affairs come to pass. IBM predicts that more than 120 million workers worldwide will need to be retrained in the next three years. What is also obvious is that this scenario would require the implementation of a Minimum Social Income for everybody in the economy.

Turning now to the optimistic scenario, this view relies on the basic assumption that, although automation and technology will destroy many current jobs, new jobs will arise to replace traditional activities. The argument draws on past experience, which has shown that the development of computers and the internet destroyed jobs but created many more new activities. By taking over the traditionally simple, repetitive and boring tasks, automation will allow people to access more attractive and challenging jobs, a shift which will allow us to develop new skills in the workplace in terms of creativity, critical thinking and interaction with others, to name a few. Big Data will undoubtedly create massive requirements for new skills. The McKinsey Global Institute predicts that, by 2024, there will be a shortage of approximately 250,000 data scientists in the US alone.

I do not want to go into further detail concerning each scenario, but the debate is on, and nobody is able to fully assess the impact of new technologies and Big Data on the job markets. Probably, as is often the case with these kinds of issues, « in medio stat virtus », with reality being a combination of the extreme theories. What is obvious, however, as you mentioned earlier, is that there will be a crucial need for retraining, and we would all do well to prepare our economies for this huge challenge.

In your opinion, what are the main challenges generated by Big Data?

There are several challenges and concerns which should be addressed with great attention. The first one, which is on everybody's mind, is privacy. Because data is collected everywhere, from personal computers and smartphones to sensors in homes, it might include very sensitive information. Even if data sets are carefully constructed to anonymise individuals, as Matthew Harding and Jonathan Hersh remind us in their note « Big Data in Economics », « information that is de-identified may be easily ex-post identified using machine learning tools ». Several studies, for example, have shown how easy it is to identify almost 100% of individuals in a database containing supposedly pseudonymised credit card transactions. It is interesting to remember that Big Data relies on free data and on a total lack of consent from the sources of these personal data for its collection and analysis, a fact which contradicts the founding principles of our capitalist systems.
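The re-identification risk described above can be sketched in a few lines: a handful of quasi-identifiers, here transaction dates and rounded amounts, observed elsewhere are often enough to single out one pseudonym. All records and pseudonyms below are fabricated for illustration:

```python
# Why "pseudonymised" data can be re-identified: a few quasi-identifiers
# observed elsewhere often match exactly one pseudonym in the database.
# All records below are fabricated.

records = [
    ("user_a1", [("2023-01-02", 12), ("2023-01-05", 40)]),
    ("user_b7", [("2023-01-02", 12), ("2023-01-06", 40)]),
    ("user_c9", [("2023-01-03", 99), ("2023-01-05", 40)]),
]

def reidentify(known_transactions):
    """Return pseudonyms whose records contain every known transaction,
    e.g. two purchases seen on a public receipt or a geotagged post."""
    return [pseudonym for pseudonym, txns in records
            if all(t in txns for t in known_transactions)]

# Knowing just two (date, amount) pairs pins down a single individual:
print(reidentify([("2023-01-02", 12), ("2023-01-05", 40)]))  # ['user_a1']
```

One external observation narrows the candidates; two are already enough here to identify the person behind the pseudonym, which is the mechanism the credit card studies cited above exploit at scale.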

The second challenge is security. Protecting data, which has become a « critical business asset » according to Bernard Marr in his excellent book Tech Trends in Practice (Wiley, 2020), is vital for any organisation and will become even more so in the future. With the development of the IoT (Internet of Things), the threat of attacks by hackers will increase tremendously, as many connected devices are totally unsecured. A solid and efficient data security policy is essential.

Big Data may suffer, and this is the third challenge, from a heavy selection bias, depending on how and by whom data is being generated. As Matthew Harding and Jonathan Hersh note in the above-mentioned study, « Not everyone has the same propensity to use digital devices, or any app or website, resulting in possible biases, particularly when making generalisations about subgroups which happen to be represented in the data ». Appropriate methods should be used to minimise, or even, if possible, eliminate these biases (social bias, generation bias, racial bias, etc.). Moreover, many valuable sources of Big Data may be controlled by private organisations which could impose limits on user freedom, resulting in publication biases.
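One standard way to mitigate such selection bias is post-stratification reweighting: subgroup results from the biased digital sample are weighted by each subgroup's known share of the population (e.g. from a census) rather than its share of the collected data. A minimal sketch with invented shares and values:

```python
# Post-stratification reweighting: correct a biased digital sample by
# weighting each subgroup according to its known population share.
# All shares and values below are invented for illustration.

def reweighted_mean(sample_means, population_shares):
    """Weight each subgroup's sample mean by its true population share
    instead of its (over- or under-represented) share in the data."""
    return sum(population_shares[g] * sample_means[g] for g in sample_means)

sample_means      = {"under_40": 0.60, "over_40": 0.30}  # e.g. app-usage rate
sample_shares     = {"under_40": 0.80, "over_40": 0.20}  # younger users over-represented
population_shares = {"under_40": 0.50, "over_40": 0.50}  # known census shares

# Naive estimate inherits the sample's bias toward younger users:
naive = sum(sample_shares[g] * sample_means[g] for g in sample_means)
adjusted = reweighted_mean(sample_means, population_shares)
print(naive, adjusted)  # 0.54 vs 0.45
```

The gap between the naive and adjusted figures is exactly the distortion Harding and Hersh warn about: the raw digital sample over-counts the groups most inclined to use the device or app.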

Finally, Big Data is costly to gather and collect, to store, to prepare and clean, and to analyse through analytics models, and it requires heavy investments in technology and human skills. Are all companies, and especially small businesses, able to access Big Data? Can all companies afford the high budget, the complex infrastructure and the extensive know-how needed to benefit from Big Data?

These are valid questions which must be considered, even if various experts, including Bernard Marr in his book mentioned earlier, argue that « thanks to augmented analytics and Big-Data-as-a-Service (BDaaS)… Big Data will be accessible to anybody, even small businesses, without the need for expensive infrastructure investments », overcoming the massive skills gap in Big Data.

These challenges are enormous and call for improving data literacy across the board through knowledge-sharing and continuous training and education. They also stress the urgent need for all organisations to create a data strategy, the objective of which is « to focus on finding the exact, specific pieces of data that will best benefit the organization » (B. Marr).

A very ambitious task indeed, but absolutely necessary if we are to enjoy the huge benefits of Big Data.

Prof. Dr Bertonèche, thank you for sharing your thoughts on the impact of Big Data on economies.

Thank you, Dr Caldarola, and I look forward to reading your upcoming interviews with recognized experts, delving even deeper into this fascinating topic.

About me and my guest

Dr Maria Cristina Caldarola

Dr Maria Cristina Caldarola, LL.M., MBA is the host of “Duet Interviews”, co-founder and CEO of CU³IC UG, a consultancy specialising in systematic approaches to innovation, such as algorithmic IP data analysis and cross-industry search for innovation solutions.

Cristina is a well-regarded legal expert in licensing, patents, trademarks, domains, software, data protection, cloud, big data, digital eco-systems and industry 4.0.

A TRIUM MBA, Cristina is also a frequent keynote speaker, a lecturer at St. Gallen, and the co-author of the recently published Big Data and Law now available in English, German and Mandarin editions.

Prof. Dr Marc Bertonèche

Marc L. Bertonèche is an Emeritus Chair Professor of Corporate Finance at the University of Bordeaux and was on the Faculty of INSEAD, the European Institute of Business Administration in Fontainebleau, France, for more than twenty years. From 1986 to 2014, Prof. Bertonèche was a Visiting Professor at Harvard Business School, where he taught finance in the MBA program, the Advanced Management Program (AMP) and the General Manager Program (GMP). He has been voted best teacher by the student body on numerous occasions and has won several awards for excellence and innovation in the classroom. Dr Bertonèche was an Associate Fellow at the Saïd Business School and is currently an Associate Fellow at Green Templeton College at Oxford University and a Distinguished Visiting Professor at HEC in Paris. He is a very popular guest speaker and a consultant for companies all over the world, as well as a member of the executive boards of firms in Europe, the USA and Asia.
Professor Bertonèche holds Master's degrees with honours in Economics and Political Science from the University of Paris and a Doctorate in Business Administration with Distinction from the University of Bordeaux. He received his MBA and his Ph.D. in Finance from Northwestern University's Graduate School of Management, where he was a Distinguished Scholar and a member of Beta Gamma Sigma.
