Tough choices around the costs and benefits of nanotechnology

A new report on a public dialogue on nanotechnologies has been published today, 26 May.

Technological innovation depends on science, both to provide the innovation itself and to give assurance that its benefits outweigh its costs. But when does an innovation become a risk? For most of the long pathway from an innovation emerging to its mainstream adoption in our lives, we tend to focus on the benefits. Only at the eleventh hour do some of the costs become apparent. But does it have to be that way? In my view, greater investment in understanding the basic science of risk, and how to communicate it, is much needed in advance to head off this problem.

Nanotechnology is grounded in an understanding of how materials behave at very small sizes, and has had a long lead time. In 1857, Michael Faraday investigated the action of light on very thin films of gold and noticed that the fluid used to wash these films became ruby red, deducing that this was suspended gold. The particles were about 50 nanometres in diameter – about 1/2000th the width of a human hair. The fact that they were red, rather than gold-coloured, shows how nanomaterials can behave differently to larger pieces of the same material.
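The size comparison is worth a quick back-of-the-envelope check. Assuming a typical human hair width of 100 micrometres (real hairs vary, roughly 20–180 micrometres), a 50-nanometre particle is indeed about 1/2000th of a hair's width:

```python
# Sanity check of the sizes quoted above. The hair width is an assumption:
# a typical figure of 100 micrometres is used here.
particle_nm = 50
hair_width_nm = 100 * 1000  # 100 micrometres expressed in nanometres

print(hair_width_nm / particle_nm)  # -> 2000.0
```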

Compared with larger particles, nanoparticles can interact differently with light, have different electrical properties, or different chemical reactivities. Their surface area is huge compared to their volume, and most of their mass interacts directly with the outside world. This is what makes them so reactive. The small size of these particles also offers the opportunity for them to get to places where other particles simply could not reach, such as inside individual cells of organisms.
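The surface-area-to-volume point follows from simple geometry: for a sphere the ratio is 3/r, so it grows rapidly as the particle shrinks. A minimal sketch (the sizes are chosen only for illustration):

```python
import math

def surface_to_volume(radius_nm: float) -> float:
    """Surface-area-to-volume ratio of a sphere, per nanometre (equals 3/r)."""
    area = 4.0 * math.pi * radius_nm ** 2
    volume = (4.0 / 3.0) * math.pi * radius_nm ** 3
    return area / volume

# A 25 nm-radius nanoparticle versus a 25 micrometre-radius grain
# of the same material:
for radius in (25.0, 25_000.0):
    print(f"r = {radius:>9.1f} nm  SA:V = {surface_to_volume(radius):.6f} per nm")
```

The particle a thousand times smaller has a thousand-fold larger ratio, which is why so much more of its mass sits at the surface, in contact with the outside world.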

Nanoparticles derive from a range of metals, alloys and compounds. They have applications in everything from medicine to helping integrated circuit designers increase memory storage capacity on computer chips. Nanotechnology is becoming an integral part of our lives, and we hardly know it.

The potential of nanotechnology is enormous, but what are the risks? If nanoparticles are capable of entering cells, or of disappearing into the environment never to be recovered, how can we be sure that the benefits of using them will not rebound on us with some negative impact? It’s one thing to produce nanoparticles intentionally and to control their release, but quite another to produce them unintentionally, as a by-product of some other process.

There is a clear need to understand what people think about these issues and where challenges exist. It is the combined role of government, industry, researchers, and NGOs to not only communicate science to a broad audience, but to engage citizens in a dialogue and capture what we understand to be the potential benefits and the costs of these technologies. People are often content to pay for initial research into technologies like ‘nano’ because they understand where the benefits might lie. It is much harder to persuade people to fund research to understand what the downsides of the technology might be even when the uncertainties can be truly daunting.

A new, qualitative public dialogue commissioned by Defra and carried out in conjunction with (and co-funded by) the organisation Sciencewise, as well as industry, seeks to find out how comfortable people are with specific applications of nanotechnology. By focusing on nano-based products, such as sunscreens and paints, the deliberation process sought to explore the motivation behind people’s views and perceptions.

The report, released today (26 May), highlights the importance of communicating to the consumer what is in a product. People like to know what they’re buying, and don’t like to be forced to consume ‘by stealth’. Nanoparticles have been used in sunscreens for many years, yet this is one of the applications consumers are most wary of. Citing a lack of clarity over what the product contains, participants were concerned that something applied to the skin – especially that of young children by their parents – could be taken up by the body. It was also thought that nanoparticles from sunscreens could enter watercourses and behave in unknown ways.

This negative opinion of nanoparticles in sunscreens stemmed largely from the perceived negatives not being sufficiently balanced by the positives (prevention of skin cancers). Consumers could not see why nanoparticles were more effective at blocking UV rays, revealing a deficit of understanding about why nanoparticles work in such a product.

Nanoparticles can also be used to remediate contaminated land, and this application too raised perceptions of risk. While participants agreed that removing contamination was worthwhile, there was concern that nanoparticles might remove one deleterious substance only to replace it with another, even though there was nothing to validate this concern in the case discussed. The future impact was felt to be difficult to predict. Lessons learned from the use of CFCs figured prominently in people’s views: CFCs were once ubiquitous in refrigeration and as aerosol propellants, but were subsequently discovered to be the main cause of stratospheric ozone breakdown.

Participants were much more positive and accepting about the use of nanoparticles in paints and coatings, especially if new properties, such as being antimicrobial or more durable, could be introduced. Their concerns over disposal were no greater than for other, non-nanoparticle paints, which often require careful disposal anyway. The onus was seen as being on the consumer to read product labels and advice and dispose of waste paint properly. Likewise, nanoparticles used as a fuel additive to reduce emissions were welcomed: pollution from cars was perceived as such a large problem that any risks of reducing it using nanotechnology were, in the view of the participants, outweighed by the benefits.

The judgement of participants identified the responsibility for dealing safely with nanotechnologies, like any technology, as being shared between government, industry and the individual. Outside this triangle, NGOs provide scrutiny. Crucial to any dialogue, however, are robust and clear channels of communication that serve not only to educate audiences, but also seek their voice when formulating matters of policy and regulation.

One issue that does concern me, however, is the extent to which we have the capacity to control the uptake of new technologies such as nano-based paints and sunscreens. The Montreal Protocol showed for CFCs that it is possible for global concerted action to be taken when presented with overwhelming evidence of negative impact. But in cases where evidence of potential damage is lacking, or where there are significant asymmetries between the winners and losers concerned with a new technology, the power of profit motivations could overwhelm any wish to be precautionary. If only we invested as much in environmental science as we do in developing new technologies we might be in a better position to judge where the costs and benefits of those technologies lie, and to design the use of new technologies in ways that maximise their pay-off.

These kinds of open dialogues provide rich and nuanced insights for scientists, industrialists and regulators around how much more work they need to do to communicate what is known and what is not about the risks and benefits of emerging technologies. Honesty in this communication is vital. Ideally, we need to be able to communicate information to people in ways that can allow them to make informed decisions and choices. When the costs and benefits are too difficult to express in these ways, government needs to adopt precaution and regulate based upon information derived from similar dialogues.


What is the role of a Chief Scientific Adviser?

Some people think the role of a Chief Scientific Adviser (CSA) in government is ‘to kick the door down’. No it isn’t; it’s to keep the door open to science. If a CSA finds themselves locked out, then they’ve failed. Muscular public shows of independence from big-hitting ‘advisers’ are singularly ineffective.

I want to see science given the consideration it deserves in the formation and delivery of government policy. As a CSA for a major area of government policy and function, I have an important role to play in ensuring that this happens. The key to being successful in securing this outcome is to build trust.

The sort of trust I’m talking about is entirely conditional on the existence of mutual respect. Policy makers have some fiendishly difficult problems to grapple with, and in dealing with these they need the help and respect of scientists. This includes the appreciation that scientific evidence sits alongside other social, economic and political considerations. Politics is the process by which contested decisions are made about policies, and I have to be careful to play the role of the scientist as an honest broker, and the provider of information within the wider social game. My role and the role of other CSAs in government is to be a trustworthy and intelligible proponent of the ‘scientific lens’; to input into the policy making process, but also to avoid the automatic politicisation that comes with advocacy. Similarly, I will not be the mouthpiece for government policy unless it is to explain why a decision has been made, or to increase wider understanding of a particular problem.

Creating a lot of noise and publicity is not the best option in the vast majority of instances where one wants to have impact. This may be a difficult message for some who seek a story that promotes conflict (often disingenuously cloaked as debate) and who want to recruit CSAs to their cause. When one CSA famously said to the House of Lords Science and Technology Committee that ‘part of the job of a CSA is to make sure they kick the door down,’ he had clearly lost the confidence of his ministers and department – such an attitude was in no way conducive to maximising the consideration of science in government. The realisation that this approach does not work may mean that CSAs do not always have the visibility in the press that some may call for. But it’s essential to recognise that our ultimate aim should always be to make sure conflicts are resolved, not created.

The argument has been levelled against me, from time to time, that because I don’t regularly engage with the press I am somehow being gagged by the Department I mainly work within, or that somehow the government is seeking to spin what I have to say. Nothing could be further from the truth. As a CSA I am free to say what I want, when I want. I wouldn’t do this job if that were not the case; anybody who hears me talk in public will know that I do not speak from a script. I am passionate about what science can do to support and improve government, and I am happy to talk openly about what goes on in Defra, as in this blog. Furthermore, I am accountable through the Defra Science Advisory Council, an independent non-departmental public body, to the wider scientific community for translating scientific evidence into a policy environment. Through them and other routes, I speak to scientists, read their papers, listen to them and help them get their messages heard too.

Any scientist who works in government has the same freedoms, with the exception of a very few who work on national security, and if there are any who think otherwise then they are mistaken. We live in a world of free speech and this applies just as much to government scientists as anybody else. However, most scientists in government understand the importance of working with the system and, if necessary, changing and influencing it from within rather than trying to manipulate it through the media.

If my approach to being a CSA doesn’t raise my personal profile, or suggests to some in the media or elsewhere that I am not a ‘heavy-hitting adviser’, then so be it. My job is to represent science in government as best I can, not to be a public personality.

UK’s cutting-edge science informs government response to ash dieback

The government often has to deal with difficult problems, and ash dieback disease has been no exception. Ash dieback is a fungal disease likely to have arrived in the UK from a mixture of infected planting material and spores blown over from infected trees in continental Europe.

The pathogen causing this disease, Hymenoscyphus fraxineus, was not formally identified until after it began seriously affecting trees in Eastern Europe in the early 1990s. Even then, little was known about the pathogen that might help develop management strategies. Meanwhile, the disease continued to spread across Europe before being identified in East Anglia in 2012. Its arrival, and the subsequent public interest, demonstrate that trees, woodlands and forests hold a special place in our nation’s hearts.

There are an estimated 126 million ash trees in British woodlands over half a hectare in size, and many more in our parklands, hedgerows and cities. Ash is the third most prevalent broadleaved species in GB woodlands, at 9%, and the fifth most prevalent of all trees, at 4%. The economic benefit of forests is estimated to be £1bn to the UK economy, with even greater environmental and social benefits. As one of our native trees, ash is an important part of the forest ecosystem, supporting a huge range of biodiversity from lichens and mosses to invertebrates and birds. Forty-six species are found only on ash trees. So protecting ash trees is about more than just protecting a single species.

After the disease was discovered, Defra worked with the Biotechnology and Biological Sciences Research Council to establish two research projects to improve our understanding of it. The experience in Europe showed that some trees were more susceptible to the disease, developing symptoms and dying more quickly, while others were less affected. This gave hope that some of the trees in the UK might be tolerant to the disease, and their identification became one of Defra’s commitments in response to the disease.

The Nornex project, which published its final report last Friday (22 April), used molecular approaches to improve not only our understanding of the disease, but of the ash tree itself. The research means we have been able to develop genetic markers that signal tolerance to the disease, just as quality of plumage can signal biological fitness in birds. Tolerance was assessed using a selection of 182 Danish ash trees, scored for visual signs of disease; these scores were then assessed against the extent to which specific genes were active. Three genetic characteristics appear to be important signals of resistance, and variability in susceptibility may be caused by how two genes interact.
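The logic of the marker-hunting step can be pictured with a toy calculation: score each tree for visual damage, measure how active a candidate gene is in the same trees, and look for genes whose activity tracks the damage scores. This sketch is purely illustrative – the gene names and numbers below are invented, and the Nornex analysis itself was far more sophisticated:

```python
# Illustrative only: screening candidate genes by correlating their
# expression levels with visual disease scores across a set of trees.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Visual damage scores for five hypothetical trees (higher = worse symptoms)
damage = [4.0, 3.5, 2.0, 1.0, 0.5]
# Invented expression levels for two candidate genes in the same five trees
expression = {"geneA": [1.0, 1.2, 2.5, 3.9, 4.1],
              "geneB": [2.0, 2.1, 1.9, 2.0, 2.2]}

for gene, levels in expression.items():
    r = pearson(damage, levels)
    print(f"{gene}: r = {r:+.2f}")
```

Here a strongly negative correlation (high expression in the least-damaged trees, as for the invented "geneA") is what would flag a gene as a candidate tolerance marker, while a near-zero correlation ("geneB") would not.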

This new knowledge is a great step forward and illustrates the benefits that cutting-edge science can bring to real-life problems, made all the more impressive considering the project ran for only about two years. Two advances made this possible: the reduced time needed to sequence a genome – from years to hours, and at a fraction of the cost – and the open, collaborative approach taken by the research team.

The project also made use of a Facebook game, Fraxinus, designed to use human pattern-recognition skills to identify DNA sequence variations. The game was played more than 63,000 times and yielded many reliable new sequence variants.
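The underlying task such a game gives its players can be sketched as a matching puzzle: slide a short pattern along a reference sequence and find the offset where the bases line up best. The sequences and scoring below are invented for illustration and are far simpler than the game's real alignment task:

```python
# Toy version of the sequence-matching task: score every ungapped alignment
# of a short pattern against a reference and keep the best-scoring offset.

def best_match(reference: str, pattern: str):
    """Return (offset, score) of the best ungapped alignment of pattern."""
    best = (0, -1)
    for i in range(len(reference) - len(pattern) + 1):
        score = sum(a == b for a, b in zip(reference[i:], pattern))
        if score > best[1]:
            best = (i, score)
    return best

reference = "ACGTTGACCTGAACGT"   # invented reference fragment
pattern = "ACCTGA"               # invented short read / variant motif

offset, score = best_match(reference, pattern)
print(offset, score)  # -> 6 6 (perfect match of all six bases at offset 6)
```

Humans, it turns out, are rather good at spotting near-matches like this by eye, which is what the game exploited.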

The research, led by Professor Allan Downie from the John Innes Centre in Norwich, was delivered by a consortium including: the University of York; the Genome Analysis Centre; the University of Exeter; Fera Science Ltd; the University of Copenhagen; Forest Research; the Sainsbury Laboratory; East Malling Research; the Forest and Landscape Institute Norway; and the University of Edinburgh.

The Nornex project’s research report can be found here:

Form or function?

It is a question for all ages; one that today continues to pervade much decision-making within wider government in general and Defra in particular. It may hold the key to the policies we make and implement to ensure ecosystem survival.

As the environmental historian, Chris Smout, has pointed out in his book Nature Contested, it is an ancient question that can be traced back to classical poets such as Horace. Form or function – which is more important?

Signalling is a useful barometer of fitness – so long as that signal is honest

We all struggle with this question to greater or lesser extents. When we buy a car, what is most important to us? Racy lines; a striking colour; the promise of adventure via 4-wheel drive (even if such a function is never actually used)? When we reduce a car down to its basics, it is a metal box with a wheel at each corner, and an engine to drive it along. Its function, at least to me, is to get people from one place to another in reasonable comfort, safety and speed. I’ve never bought into the idea of cars being objects of aesthetic desire. To me, when it comes to cars, function is much more important than form.

There are so many other kinds of consumer products that favour form over function. It could be argued that one can have both: product functionality embodied within an aesthetically attractive form must be the ideal combination for the marketing executive. My concern, and the reason I doubt the motives of those who focus on form alone, is that such a focus can blind people to deficiencies of function. A car can look wonderful, yet under the surface it can be old technology that is environmentally damaging. One can become beguiled by form and end up compromising on function. I suspect this happens a lot.

My father also struggled with this function-form problem. When I was quite young he took me to Bettyhill in Strathnaver, close to the north tip of mainland Scotland, to seek out the Scottish primrose, a species endemic to the area. He was a conservationist who was embroiled in the process of embedding the principles of conservation into the public service during the decades following the Second World War. The pilgrimage to Bettyhill was mainly to remind him why he was making himself unpopular in an era when the “white heat of technology” was driving decision-making and, as it happened, was responsible for constructing a fast-breeder nuclear reactor not far along the coast at Dounreay. These were the days before powerful NGOs held decision-makers to account, making my father part of the thin line of defence against the overpowering march of function over form. When constructing the principles under which new developments might be approved and governed, he would privately ask himself whether those principles would help to save Primula scotica.

The Scottish primrose is still very much to the fore today. Human expansionism has not yet wiped out this delicate little flower although many other parts of the biodiversity of our planet are at risk, or have disappeared. My father was, of course, using the Scottish primrose as an allegory for biodiversity as a whole. It was his way of staying focused on the functional outcome of his efforts rather than the form of the primrose in particular. It was how he brought his own humanity to bear on a very utilitarian problem, although he lacked the scientific evidence about the environmental damage that could be done by unregulated industrial development. In this case the depiction of function through the medium of form was a way of exploring the trade-offs between very different functional issues – the use of natural resources versus the degradation this caused to the environment that sustains us.

The signalling of function through form is all around us in the natural world. A peacock’s tail is a signal of his fitness. The brightness of plumage colour of a blue tit signals fitness to resist parasites. Throughout the natural world we see adornments like this as forms that have been adopted because they signal a particular fitness for function. Dishonesty in this signalling of fitness for function through form is generally rooted out by natural selection.

This function versus form allegory came to the fore again at the end of February this year because of the work of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), a UN-led effort to put the scientific assessment of global biodiversity on the same footing as the IPCC has achieved for climate.

The IPBES chose to focus its first global biodiversity assessment on the state of pollinators, a group of organisms – mainly composed of insects including our much-loved bees – that is essential for the fertilization of plants including many types that are important foods. The messages from the assessment are loud and clear: that there is a problem. In this case, pollinators are the allegory for all biodiversity, but because they are something that we see having direct value, or function, it is more likely that we will do something to correct their decline. That action alone could do much to protect biodiversity as a whole, by protecting species that could be functionally important in ways we are not yet aware of.

IPBES itself embodies this function-form debate even within its own name. They focus on trying to describe the relationship between biodiversity and the functioning of ecological systems in terms of the goods they produce. Like the cars we drive, we need to know how much loss of form can be sustained before the function declines. For cars, I suspect that colour has very little impact on function, whereas the streamlining of the body work might have a greater impact. Like the peacock’s tail there is a signalling component in the cars we choose to drive, but how much is this really functional and how much of the signalling is dishonest?

Research is making rapid progress towards helping us understand which species in ecological systems are more or less critical to healthy function. While our knowledge is improving continuously, we still need icons of form to fall back on to help guide us towards the functional outcome. That is why I spent most of my research career studying marine mammals. Like the Scottish primrose and other visible manifestations of ecosystem health, such as birds, elephants, fish stocks or bees, they provide indicators of ecosystem health that help us stay on track. Research is allowing us to better understand whether those signals are honest depictions of their critical function in ecosystems.

I remain convinced that function trumps form any day, but as a pragmatist I am happy to see the use of these indicators of biodiversity to guide our understanding on whether ecosystems are functioning as they should be.

A forensic approach to the environment

Nearly one quarter of the UK’s net worth is accounted for by the environment, so understanding how we assess it – its benefits as well as its risks – is vital to preserving it. This process – which we call ‘environmental forensics’ – was the subject of my recent contribution to the Government Chief Scientific Adviser Sir Mark Walport’s annual report on forensics.

In my chapter I summarised the recent work under the National Ecosystem Assessment and the Natural Capital Committee in improving how we evaluate the environment. We all carry the costs of the environmental decisions of those around us – for example, we eat, drink and breathe other people’s pollution on a daily basis. At the same time we rarely think, when driving our cars or firing up the wood-burning stove, that our actions could lead to premature deaths. We take the benefits without thinking of the costs. That is why regulation is important. Without it there would be a large asymmetry between the private benefits gained from the environment and the public costs. It has become the responsibility of governments to sustain an appropriate balance between these public and private costs and benefits. But as governments are often reluctant to place cost burdens on those who cast votes, we need a mechanism that transfers responsibility for paying the costs to the individuals who benefit.

The rationale for setting environmental standards and measuring compliance is strongly driven by the concept of equity. Around half of air pollutants in the UK come from other parts of Europe – and, of course, the UK contributes to the air quality problems of other European countries. Water contaminated by sewage washed out to sea has the potential to contaminate seafood which could be distributed widely through the food chain. The choices people make about how to dispose of waste can have widespread effects, sometimes with long time lags between the release of pollutants and the ultimate effect, and this has become an issue driving global politics when it comes to different national responses to the need to reduce carbon emissions.

Government regulation to prevent the misallocation of environmental resources is therefore a very blunt instrument. Regulation has spawned an industry in environmental data measurement. The UK is mandated to measure an immense amount of information about everything from the chemistry of rivers to the number of birds on farmland and the noise emitted by human activity in the ocean. Efforts to focus attention on measuring only those features of the environment which matter have been hampered by a lack of underlying knowledge of how these relate to the benefits gained from the environment. The rationale for actions like this hinges on the risk-avoidance approach commonly used today. This approach suggests that changes caused by human presence need to be avoided even if the changes lie within the normal range of natural variability.

Seen in this context, the direction of travel in environmental forensics towards measuring and controlling more and more – at finer and finer levels of detail just in case this might be important in future – is clearly untenable.

The need for the measurement or monitoring of environmental indicators was initially driven by a sincere search for those surrogate indicators within the environment which most effectively represented societal valuation. But this has gradually mutated into a process of measurement and reporting of data as an end in itself.

In future, the balance needs to shift towards risk and market based methods. New technology has the capacity to drive this change because it puts the power of information in the hands of individuals so they can make informed decisions.

There will always be a need for regulation and statute in this field and a strong role for government, but the nature of environmental forensics needs to change. The current system is arguably unaffordable in the future.  Technological innovation will come to the rescue to some extent by delivering more precise data at the point it affects behavioural choices.

The downside associated with the interpretative nature of decision-making needs to be addressed through sophisticated information-delivery processes. Micro-innovation at the source of environmental variables needs to be matched by macro-economic innovation to build market-based solutions. Internalising the economic costs of alternative actions for the environment and accounting for these, including the provision of the forensic evidence to support this method, is most likely to be the way forward.

The Nurse Review: A response from Defra’s scientists

As Defra’s Chief Scientific Adviser I convene a group of Chief Scientists and equivalents from Defra-sponsored public bodies[1].  Members are responsible for ensuring the credibility and impact of government investments in science, evidence and specialist capabilities in the areas of policy and delivery related to environment, food and farming, and plant and animal health.  As a group, we wanted to provide our response to Ensuring a successful UK research endeavour.  A Review of the UK Research Councils by Paul Nurse, (the Nurse Review[2]) published on 19 November 2015.  Defra’s scientists wish to convey the following views:

We welcome the Nurse Review and the continued commitment it makes to the Research Councils as a critical part of the UK national research capability.  The ambition to improve the effectiveness of cross-Council working is particularly welcome.  There needs to be more support for multi- and inter-disciplinary, challenge-led research that stimulates innovation within a context of wider strategy-setting.

The review opens up opportunities to improve the delivery and outcomes of the wider ‘research endeavour’ by ensuring that science and research can (as rightly pointed out in the Nurse Review) “benefit society and the public good, e.g. driving a sustainable economy, improving health and the quality of life, and protecting the environment”.

We also agree that the ‘research endeavour’ should be “permeable and fluid, allowing the ready transfer of ideas, skills and people in all directions between sectors, across research disciplines, across institutions, and between providers and potential beneficiaries”.  This is a priority in the July 2014 Defra-wide Evidence Strategy.  Implementation of the recommendations from the review will increase the opportunities there are for alignment between the objectives of the Research Councils and Government departments.  This alignment needs to be balanced alongside the requirement to maintain specialist facilities, capability and career paths to encourage scientists to develop their careers.

We are especially pleased to note that “the best research should be funded wherever it is found” and, in particular, would be happy to support the proposal to pilot arrangements for Public Sector Research Establishments (PSREs) to access Research Council funding in collaboration with universities.  We represent a number of public bodies that qualify as PSREs and among them there are both world-class scientists and excellent facilities.

In addition, we welcome the suggestion that Research Councils “should be focussed on establishing strategic priorities and relationships with key players in their wider research communities, including universities, government, business and the charities sector”.  The agencies and bodies we represent are committed to supporting effective, efficient and equitable strategic partnerships that enable Research Councils to set research strategies reflecting these priorities.  We want our capabilities including infrastructure, data and skills to be a part of this wider endeavour.

In particular, we will contribute to research mapping activities and the analysis of relative strengths and weaknesses of research across the UK.  We commit to production of a single Evidence Plan for Defra so that we can engage with the Research Councils about where research priorities lie that are in the national interest.

We support in principle the proposal to establish Research UK as a “formal organisation with a single Accounting Officer, which can support the whole system to collectively become more than the sum of its parts”.  We agree that this would enable more effective delivery of cross-Council strategy, and enable each Council to focus on its core strengths.  We stress the importance of ensuring that RUK can deal effectively with cross-disciplinary and challenge-led research analysis and horizon-scanning.  We stand ready to assist in enabling the transition to RUK.


[1] Animal and Plant Health Agency, Centre for Environment, Fisheries and Aquaculture Science, Environment Agency, Forestry Commission, Joint Nature Conservation Committee, Marine Management Organisation, Natural England, Veterinary Medicines Directorate, Board of Trustees of the Royal Botanic Gardens Kew, Rural Payments Agency

[2] The Nurse review was announced as part of the government’s science and innovation strategy in December 2014. The Government will respond in detail to Sir Paul’s report in due course.



Earth Observation: on the cusp of a revolution

New technology tends to trickle into our lives. It arrives with an explosion of excitement and promise, but a steady journey then ensues as the much-vaunted tech becomes developed and ubiquitous enough to transform our expectations and truly revolutionise our world. When it comes to satellites and the data we get from them, we have made stunning progress on many such journeys, with pause-able high-definition TV and navigation systems on phones now very much the norm. However, after its beginnings in the 1970s, the Earth Observation journey – the journey to use data gathered from above the clouds to revolutionise our understanding of our planet – is so far less travelled. But this may be about to change…

A few TV sets ago I took the plunge and installed a satellite dish on my roof (mine is discreetly hidden behind a vigorous Clematis montana). Satellite TV was new and exciting but in truth, when I plugged the dish into my TV and turned it on, the fundamentals hadn’t changed – it was still more or less the same experience, just with more channels and marginally better picture quality. But now, in 2015, the massive increase in the data we can get from satellites, coupled with vastly increased data flow on the internet, has meant our TV-watching experience has been transformed – it’s now the norm to have hundreds of channels of high-definition pictures beamed to our TVs; we can pause and rewind live TV and catch up on programmes ‘on demand’ whenever we want. While the fanfare came as satellite dishes were first installed on our roofs, it is far more recently that satellites have ‘revolutionised’ our TV watching.

It’s just the same with satellite navigation. In the early days it was just a privileged few who could (just about) rely on sat nav systems built into their high-end cars to get them from A to B. But now, in 2015, the sat navs most of us have built into our smartphones have capabilities far exceeding the original cumbersome in-car systems, from telling us when the next bus is coming to integrating live traffic information to tell us at each turn the current quickest route to our destination. To me, the revolution really came when sat navs became ubiquitous, reliable and highly featured, not when they first arrived on the scene.

So satellites have steadily transformed how we access information and how we get around, but communications and positioning are just two of the three major functions supplied by satellite space technologies. The third is observation, and this is an area where we haven’t yet seen the same sort of seismic shift in capability, the same revolution.

Generally known by the jargon term Earth Observation, or just EO, this revolution is about using data from satellites and even from unmanned aerial vehicles (drones) to help us understand more about our world. The journey began with the launch of the US ‘Landsat’ system of satellites in 1972. Once positioned, it began collecting pictures of the surface of the planet that gave an eagle-eye view of what covered the surface of the Earth – crops, grasslands, forests, lakes, rivers, mountains and ice – as we’d never seen before. The possibilities and opportunities opened up by this data seemed limitless, providing invaluable information about natural resources, land, roads and infrastructure to help us build capabilities in the most efficient ways possible and help us to protect our environment.

But while the journey started in 1972, we’ve been struggling ever since to know how to deal with this avalanche of data and to turn it into useful information. We have launched more and more EO satellites in the belief that, one day, our ability to assimilate and process all the data that they chuck at us will catch up. Now, finally, I think we have. The reason? A willingness to share.

A willingness to share

In the past, the only way to access the information within the data transmitted from EO satellites was to obtain a digital image, often by paying a lot of money for it, and then give it to another kind of techno-geek to process the information it contained. This was expensive, and the end result did not always answer the need. However, the world of EO has changed. Thanks partly to enlightened attitudes on the part of those now responsible for operating these EO satellites, most of the data from them is now being made free at the point of use. For example, all the data from the new Copernicus satellite system funded by the EU and from the updated Landsat system funded by the US is now freely available to anybody who wants it, and a similar approach is being taken in China. While previously this data would have just led to the problem of data overload (or ‘data poisoning’ as I sometimes call it), the simultaneous revolution of cloud computing enables the multiple petabytes of data that emerge from these systems (Copernicus chucks around 8 terabytes of data at us each day) to be stored online and be available, anywhere in the world, at the press of a button.
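To give a sense of scale, here is a back-of-envelope calculation – a minimal sketch using only the ~8 terabytes-per-day figure quoted above; everything else is simple arithmetic – showing how quickly that daily stream accumulates into petabytes:

```python
# Back-of-envelope: how quickly Copernicus-scale EO data accumulates.
# Uses the ~8 TB/day figure quoted in the text; the rest is arithmetic.
TB_PER_DAY = 8
DAYS_PER_YEAR = 365

tb_per_year = TB_PER_DAY * DAYS_PER_YEAR   # 2,920 TB in a single year
pb_per_year = tb_per_year / 1024           # ~2.85 PB (binary prefix)

print(f"{tb_per_year} TB/year, or roughly {pb_per_year:.2f} PB/year")
```

In other words, a single system at that rate adds multiple petabytes every year – far beyond what any one organisation could usefully distribute on physical media, which is why cloud storage matters here.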

The new culture of ‘sharing’ has of course not emerged solely as a result of ‘caring’. The market, including many small companies but also some of the big international aerospace and data companies, is latching on to new business models for delivering the data. In the past, when you accessed an image of the surface of the Earth, most of the data you bought was irrelevant and would be thrown away. In the near future the user will only need to pay for the data they actually use. This could reduce the cost of the same piece of information by many thousands of times. The development of new apps will mean that there will be many more users so, rather than charging a very small number of specialist users a lot of money for access to the information, the business model is for those supplying the services to recover their costs by spreading micro-payments across many millions of users – payments so small that each individual user will hardly notice them. Information that probably cost many tens of thousands of pounds to produce in the past, and was in the hands of just a few people, will cost fractions of a penny in the future and be in the hands of millions of people.
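The arithmetic behind that shift can be sketched in a few lines. The figures below are hypothetical – invented to match the scale described in the text (“many tens of thousands of pounds”, “many millions of users”), not taken from any real EO service:

```python
# Hypothetical illustration of spreading a fixed information cost
# across users. All figures are invented for illustration only.
production_cost_gbp = 50_000       # "many tens of thousands of pounds"

specialist_users = 10              # old model: a handful of buyers
mass_market_users = 10_000_000     # new model: millions of app users

old_price = production_cost_gbp / specialist_users    # thousands each
new_price = production_cost_gbp / mass_market_users   # a fraction of a penny

print(f"old model: £{old_price:,.2f} per user")
print(f"new model: £{new_price:.4f} per user")
```

Under these assumed numbers, the per-user price drops by a factor of a million – which is why a micro-payment model can recover the same costs while each individual user barely notices the charge.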

A simple change of attitude and approach has turned EO on its head. While the Earth Observation revolution may have officially started in the 1970s, I think it is now, thanks to the new spirit of openness, that Earth Observation data can truly start to revolutionise our understanding of our world.