Fionn Murtagh’s Blog

Themes: information economy, intellectual property, research

Archive for the ‘Grand Challenges’ Category

The University in the Age of Its Technical Reproducibility

with one comment

To consider our epoch of the university in the age of its technical reproducibility, an insightful starting point is Walter Benjamin's article "The Work of Art in the Age of Its Technical Reproducibility", which ultimately treats all of art once reproduction, and hence commodification, has become the order of the day. Versions of this article appear in Walter Benjamin, Selected Writings, Volume 4, 1938-1940 (pages 251-283), and Walter Benjamin, Selected Writings, Volume 3, 1935-1938 (pages 101-133), published by Belknap Press of Harvard University Press, 2002.

The essence of art can survive but its nature is quite different from what art was before it became part and parcel of our industrial environment. The context is: “It has always been one of the primary tasks of art to create a demand whose hour of satisfaction has not yet come.” And he quotes André Breton with this: “The artwork has value only insofar as it is alive to reverberations of the future.” This too is very insightful: “The technological reproducibility of the artwork changes the relation of the masses to art.” (The point made then is the shift of interest and engagement, for instance from a Picasso painting to, instead, a Chaplin film.)

Here are copies of this article, among others, that are online: Das Kunstwerk im Zeitalter seiner technischen Reproduzierbarkeit (this article, in German).
Here is an article by Yannick Maignien entitled "L'oeuvre d'art à l'époque de sa reproduction numérisée" ("The work of art in the age of its digitized reproduction"; the article).
In Irish: An saothar ealaíne i ré a athchruthaithe ar bhonn meicniúil (the artwork in the age of its mechanical reproduction).
And here now: Taighde Eolaíochta agus Léann i Ré a n-Athchruthaithe ar Bhonn Meicniúil (scientific research and learning in the age of their mechanical reproduction).
In German: "Wissenschaftliche Forschung und Formation im Zeitalter ihrer technischen Reproduzierbarkeit".
In French: La recherche scientifique et l'enseignement à l'époque de leur reproduction mécanisée (scientific research and teaching in the age of their mechanized reproduction).
And now in English: Research and Teaching and Learning in the Age of Their Technical Reproducibility.

Here is a photo of Walter Benjamin's grave in Portbou, Catalonia, Spain. The grave is to the fore in this photo, with the Mediterranean just visible down in the centre.


The following are notes relating to how the university as an institution has now become commoditized, such that a university or similar institute can be reproduced at will, of course assuming the resources are made available.

It is not just teaching and learning that have become commoditized, nor research alone.

Teaching and Learning

Employability has become the key issue in teaching and learning, having replaced the concept of career, which now belongs to the distant past. Job advertisements appear in vast numbers, in London and elsewhere: an advertisement from early 2016 indicated 130,000 jobs; subsequently there were 150,000 advertised; by 2019, there were 195,000.


In an article by Chris Havergal in Times Higher Education, "Death of the university greatly exaggerated, says Michael Crow", academics and entrepreneurs at the THE World Academic Summit debated the impact of technology on teaching. Entrepreneurs who predict the death of the university have "no idea what they are talking about", said Michael Crow, President of Arizona State University.

The past decade and more has seen fantastic growth in research output from India, China and other nations, including Iran and Korea. China spent more than GBP 100 billion on R&D in 2012 alone.

However, the following is all too clear for all of us.

Thomas Estermann, director for governance, funding and public policy development at European University Association, spoke to Times Higher Education about its findings. “Low success rates are a huge issue for Europe,” he says. Not only are they inefficient, as the time spent developing proposals is wasted, but they can demotivate academics and top research ideas and potential groundbreaking discoveries may be rejected, he adds.

From: H. Else, “Billions lost in bids to secure EU research funding. Study highlights the true cost of low success rates in Horizon 2020”, in Times Higher Education, 6 Oct. 2016.

One useful, and perhaps even ethical, thing to do is the following: to analyse all that is rejected. A framework that can also encompass this is at issue in a paper in press in the Journal of Classification, by Fionn Murtagh, Michael Orlov and Boris Mirkin, entitled "Qualitative Judgement of Research Impact: Domain Taxonomy as a Fundamental Framework for Judgement of the Quality of Research"; here is a preprint.


Maurice Coakley’s book, “Ireland in the World Order: A History of Uneven Development”, 2012


With the long sweep of history from late mediaeval times through to the North Atlantic financial crisis [p. 167] post-2008, this book was and is compulsive reading, instilling throughout a need to know what would happen in the end!

Maurice Coakley’s narrative starts in the late mediaeval world. The Protestant Reformation both boosted literacy and was based on a tradition of a literate laity. The Catholic Counter-Reformation also greatly motivated literacy. A major difference between the Irish Pale (the region of English overlordship around Dublin) and the Scottish Lowlands was that written contracts had become common in the latter [p. 62]. Literacy became a necessity for social position and advancement [p. 73]. That is in keeping with an interesting book dealing with Scottish history in terms of the polarity between dúchas and oidhreacht (the latter is heritage; the former has no straightforward translation in English; perhaps what comes closest in meaning is the German Heimatliches):

Allan MacInnes, “Clanship, Commerce and the House of Stuart, 1603-1788”, Tuckwell Press, 1996. (See this online site.)

The roots of the present often enough go back to specific historical times, and Coakley focuses on the 14th-century economic and social crises of feudalism. He sees a resurgence of agrarian pastoralism in Ireland and expands greatly on this, in the interrelationships between the Gaelic social order, Anglo-French society in Ireland, English monarchical overlordship, and the property rights of social collectivities or of individuals/families.

Coakley draws out from this a great deal that is hugely insightful. I liked this depiction relating to a much later Ireland in the 1930s: “… small-farming communities … a world in which agricultural labour is still largely collective and cooperative. If these smallholders were culturally conservative – religious belief and seasonal patterns rhymed – they were not politically conservative, and their social outlook included a strong egalitarian streak.” [P. 154] This characterizes well people of my parents’ generation.

Coakley has some comments on Irish intellectual life on p. 155 (and for political expression, p. 162), which I would highly commend, to be read in the context of his narrative.

So what then is the end of the narrative, that I could not wait to arrive at?

  • Productive investment of accumulated wealth [p. 211].
  • Employment policy an explicit policy objective rather than an indirect outcome of growth [p. 206] re European Union – CAP, Common Agriculture Policy, the only area where EU sought to establish a common social model [p. 176].

Coakley’s tour de force has most interesting views on language, verbal and written, that may carry over – my view is that implications ensuing will certainly carry over – into our online world, and changing practices and opportunities in publishing and dissemination of information, with social and economic drivers, and meaning and relevance of a high level in the social and economic context.

Coakley’s encyclopaedic and far-sighted understanding of language, spoken and written, and of its trajectory into contemporary processes, practices and innovative possibilities, is much to be commended.

Written by Fionn Murtagh

2013/07/30 at 22:03

Moore’s Law, the Rise of the Data Economy, and the Problem of “Dirty Data”


In the piece “Are Greenpeace attacking the younger generation and their ‘dirty data’?”, I found the discussion interesting as to whether data centres were “dirty” or not, in the sense of using energy that is high in CO2 or equivalent greenhouse gases. That British Computer Society blog post looks at rates of growth in the data economy, the ability of computational technologies to respond, and just how green this computational response is in practice. Let’s have a closer look at this.

First I will touch on compute power and space. Moore’s Law refers to the doubling of processor, storage, and related computing capability every two years. Hence the compound annual growth rate, or CAGR, for doubling every two years is 2^(1/2) - 1, i.e. 41.42% per annum.
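This doubling-time arithmetic can be checked with a short sketch (the `cagr_from_doubling` helper is my own, purely for illustration):

```python
# CAGR implied by a given doubling time, in years.
def cagr_from_doubling(years_to_double: float) -> float:
    """Annual growth rate r such that (1 + r) ** years_to_double == 2."""
    return 2 ** (1.0 / years_to_double) - 1.0

# Moore's law: capability doubles every two years.
print(f"{cagr_from_doubling(2.0):.2%}")  # 41.42%
```

The same helper gives the annual rate for any doubling period, e.g. an eighteen-month doubling would compound faster still.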

Hydropower is “the most widely used renewable resource for clean power generation across the globe”, and is stable, flexible and inexpensive, as noted in “Hydropower continues to account for major share of renewable power through 2020”. The CAGR for hydropower over the period 2011-2020 is estimated at 3.6%.

Moore’s law and hydropower indicate just how well we can do in handling data growth.

Citing from “IDC Releases First Worldwide Big Data Technology and Services Market Forecast, Shows Big Data as the Next Essential Capability and a Foundation for the Intelligent Economy”, which discusses an IDC (International Data Corp.) forecast of the Big Data economy to 2015:

“IDC defines Big Data technologies as a new generation of technologies and architectures designed to extract value economically from very large volumes of a wide variety of data by enabling high-velocity capture, discovery, and/or analysis. Further, the study segments the Big Data market into server, storage, networking, software, and services segments.”

Compound annual growth rate (CAGR) is examined in this IDC report: to 2015, it is estimated at 44% for Big Data deployments overall, and up to 61.4% for storage. Servers and software are lower, estimated at 27.3% and 34.2% respectively.

So, my conclusion, echoing the Greenpeace and “dirty data” discussion above: the soaring rise of the data economy, and in particular of storage, points to high growth of up to around 60% CAGR. Technology, as expressed by Moore’s law, can only partially keep up with this pace, at 41%. Renewable power generation, as represented by the leading category of hydropower, is incapable of doing the job alone in the green, global picture. The growth of data greatly outstrips both computing technologies and green technologies.
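To see the divergence concretely, the quoted rates can be compounded over a few years (the five-year horizon is my own illustrative choice, not from the IDC report):

```python
# Compound the annual growth rates quoted above over an illustrative horizon.
rates = {
    "storage demand (IDC)": 0.614,   # up to 61.4% CAGR
    "Moore's law": 0.4142,           # doubling every two years
    "hydropower capacity": 0.036,    # 3.6% CAGR, 2011-2020 estimate
}
years = 5

for name, r in rates.items():
    print(f"{name}: x{(1 + r) ** years:.1f} over {years} years")
```

Over five years, storage demand grows roughly elevenfold while Moore’s-law capability grows under sixfold: the gap compounds rather than closes.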

So we have something of a conundrum here: just how do we handle our data economy, while avoiding or mitigating the data economy’s “dirty data” side?   Potential answers: nuclear, hydrogen, a battery miracle.   Or the Spirit of Ireland solution. 

Towards the Renaissance of the Irish Construction – Planning; Prices


The report A Haunted Landscape: Housing and Ghost Estates in Post-Celtic Tiger Ireland, by Rob Kitchin, Justin Gleeson, Karen Keaveney, Cian O’Callaghan, National Institute for Regional and Spatial Analysis (NIRSA) Working Paper 59, July 2010, pp. 66, provides interesting data. From the summary:

“Government has two principle levers through which it can seek to regulate property development. The first is through fiscal policy with respect to regulating access to credit and determining taxation rates. The second is through planning policy and the zoning of land and the granting of planning permissions. Explanations of the Irish property bubble have focused almost exclusively on the former, and the role of the banks, tax incentive schemes, and the failures of financial regulators. To date, the role of the planning system in creating the property bubble has been little considered.”

In regard to transaction price information, an Irish Times editorial had this to say, on Saturday 14 August 2010:

“A property market that is undergoing a huge price adjustment … was never in greater need of accurate price information. … the public awaits right of access to national property sales data.”

Open, linked data and information relating to all aspects of planning and prices are desperately needed.

Written by Fionn Murtagh

2010/08/15 at 16:46

Towards the Renaissance of the Irish Construction Industry

with one comment

The Irish construction and property sector powered the Celtic Tiger period in Ireland, and had many bubble characteristics. But while dysfunctional in various ways (building in flood-prone regions, giving rise to long work/home commutes, lack of facilities such as high speed broadband connectivity), there is nonetheless a “real economy” rationale underpinning the property sector. Yes, there was a bubble economy there, but there was also an underlying “real economy”. The latter is what I want to focus on.

By 2008, construction accounted for 13% of Irish employment, or 280,000 people, compared to 10% in 2000. Employment in the sector has taken a huge hit. For the sector, and for some if not most employment in the sector, to recover in Ireland, there is a need for a renaissance of construction – a new, innovative industry that breaks with the past.

The Irish construction sector collapse started in 2007 and preceded the global financial crisis of 2008. This is a reason to probe future technology options for Irish construction in its own right, to some extent separately from banking and lending.

The Irish construction sector is (admittedly in bubblish manner) largely privately financed. So much so, in fact, that very considerable volumes of Irish investment took place in the sector across the globe. Irish property investment abroad, according to reports, amounted to over €5 billion in 2004, €11 billion in 2006, and €10 billion in 2007. A cash-rich sector, therefore, with lavish spending even if not for the right reasons. Can we spur investment that comes anywhere close to that again in the future?

The first part of the response to this is to see where a major job of work is needed now. The report “Greenprint for a National Energy Efficiency Retrofit Programme” (end 2009) points to how “there are 1.2 million dwellings in Ireland in need of an energy efficiency retrofit, creating at least 30,000 direct construction sector jobs with additional indirect and induced jobs” and that is only the start of it. Buildings of all sorts need mutualized telecoms, energy and waste infrastructure too. In the natural order of things there are big needs to innovate in areas such as those. As families grow up and as mobility becomes less sprightly with age, homes have to adapt in significant ways. A building, any building, is continually changing and, we might even say, a living entity. A real job of work is needed, that extends into the future as far as the mind’s eye can see.

Recommendations arising from the Greenprint report include this: “Create regulatory certainty for businesses and service providers” – and consumers, customers, and you and me. That is part of the core issue. But there is a way to go in establishing anew anything like the confidence that the construction sector enjoyed before its internal (and admittedly quite rotten) collapse.

To create confidence and trust what is needed is linked, open data including all aspects of planning processes and investments and contracts. Pointing the way here is Obama’s Open Government Initiative, for transparency, participation, and public/individual collaboration. The UK’s open data initiative too is hugely active in giving access in a meaningful way to data. Highlighted just from July 2010 alone, there are data and resources for housing and planning, landfill, weather and flood warning, schools, building energy usage, …

Ireland needs now an open, linked data initiative for the construction sector, including data and tools to interpret and exploit the information in new ways, from central and local government, environment, regulatory authorities, finance and banking, transport, schools, hospitals, and all other areas of our built environment. This transparency is necessary in order to start to restore confidence and trust, and to focus where, when and how regulatory, financial and other policy instruments can be brought into play.

Semantic web technology is capable of elucidating open information and data. That is what we need to start to remedy the huge errors of the past.

Why Not Zero Tolerance of Road Fatalities and Injuries?


An anniversary passed recently: the 140th, of a portentous event. According to [1], the first ever automobile fatality occurred in Ireland, when Mary Ward, a respected microscopist, artist, astronomer and naturalist, fell from a steam carriage and went under its heavy iron wheels in Birr, Co. Offaly, on 31 August 1869. In the past year, some 39,000 people have died on Europe’s streets, roads and highways as a result of traffic accidents [2]. While this figure is down on the previous year, the downward trend is not on track to meet the European target of 27,000 by 2010 (that is, 50% of the number of deaths in 2001). This target will not be realized. Globally, about 1.2 million people die each year from traffic crashes and 25 million suffer permanent disability. On the current trajectory, road traffic fatalities are expected to be the third most common cause of death by 2020. The tragedy of Mary Ward back in August 1869 goes on and on.

It is interesting to speculate on what modern technologies can offer to end the deaths and injuries in this most man-made of problems. Let me offer just a few such thoughts. Mobile phones are super abundant and location-based services are on the increase, rapidly in fact. It doesn’t have to be a matter of such mobile comms – fixed context-aware comms would be fine too.

Data transfers open up the potential of very powerful peer-to-peer mechanisms for the exchange of data, and of ambient machine and environment data uploads. Traffic ahead, whether oncoming or receding, could provide valuable information, all the more valuable as data transfers approach real-time rates. Such mechanisms could help not only with safety but also with re-routing around bottlenecks and jams. On isolated country highways and byways, fixed beacons by or near the roadside could be pinged for information on ambient conditions.

The ambient machine and environment data uploads – a sort of black-box recorder – would have two aims: allowing everything to be known about an accident, if one were to happen, with comprehensive learning from that; and linking the onboard (or in the driver’s pocket) data recorder to an insurance company, such that (let us say) cultured driving earns an insurance premium rebate.

I have only begun here to envision a world where telecoms, sensors, and interaction algorithms, would meet up with road and highway engineering, transport system planning and design, and human-machine interfaces, to start with (and later financial engineering and regulatory frameworks, among other domains), in order to address this problem that just won’t go away.

This is a Grand Challenge of our time, that is addressable with modern technologies.

[1] I. Fallon and D. O’Neill, “The world’s first automobile fatality”, Accident Analysis and Prevention, 37, 601-603, 2005.

Written by Fionn Murtagh

2009/09/10 at 23:59

Posted in Grand Challenges

Computing, Intellectual Property, and the Engineering of Our Future Health Systems


The Grand Challenges in Computing Research Conference 2008 Report, edited by John Kavanagh and Wendy Hall, has just been published by the UK Computing Research Committee. In my contribution at this conference, on “The impact of biosciences”, I wanted to look a little beyond the immediate horizon. Apart from the big question of where computing is going, I wanted to draw attention to the very major influence that biotechnology, pharmaceuticals and life sciences have brought to scientific research.

Each discipline has a unique contribution – the diversity of models in the case of computing, for example. But the bio and life sciences have come to tower above all others in the way that citation rates have become so important, as has team-based research, and the role of intellectual property.

So what I look forward to is how our information infrastructure can and should fuse with the health and life sciences.

From my Computing Research Grand Challenges intervention: “As generic drugs gain ground, could they become like software? There are many formal similarities between them. Maybe, too, drug development will need a new Google-Pharma information search and fusion infrastructure at its core, making use of information and data which will be increasingly in the public domain. Beyond that perhaps our health system will be based on a Google-Health information infrastructure, with the door opened at last to a much tighter merger of health and computerisation in terms of personalised health care.”

A most interesting overview of the way that intellectual property is evolving in the pharma sector, furnishing on the way plenty of food for thought about how our health systems could and should be run, is provided by a European Commission Preliminary Report on the Pharma Sector Inquiry (a 426 pp. document).

That our future health system has considerable relevance to current science and engineering is clear enough. I have sketched out in a recently published article some implications of this, related to how research is carried out and how it is funded, by comparing and contrasting the current situation with the past. This article is “Origins of modern data analysis linked to the beginnings and early development of computer science and information engineering”, published in the Electronic Journal for History of Probability and Statistics (vol. 4, no. 2, Dec. 2008). In this article I cover some of the evolving context of research and applications, including research publishing, technology transfer, and the economic relationship of the university and society.

Written by Fionn Murtagh

2009/02/15 at 13:35