Peering into the Arctic from a Defra Perspective



A photo of me in Ny Ålesund.


I spent a few days last month in Svalbard, bringing back memories of past Arctic experiences doing research in Iceland and Alaska. Svalbard had been a place where many of my colleagues had worked. The outpost of Ny Ålesund, where I was headed, lies at nearly 80°N, tucked into a fjord on the north-western coast of the archipelago. It is the most northerly permanently inhabited settlement in the world and was the starting point for many early 20th Century expeditions to reach the North Pole.


An aerial photo of Ny Ålesund.





Many of the buildings in Ny Ålesund are of historical importance.



Ny Ålesund isn’t somewhere just anybody can stay. The settlement is an international research station built within an early 20th Century coal mining settlement. Although cruise ships visit and disembark passengers for an hour or two, it takes an invitation from one of the 11 countries which have a presence at Ny Ålesund if you want to stay there. Surrounded by mountains, glaciers and the Arctic Ocean, Ny Ålesund is a microcosm of both the Arctic environment and Arctic politics.

Norway has sovereignty over the islands but, through the Svalbard Treaty of 1920, other countries, including the UK, have a right of access. The strategic importance of the Arctic is such that many countries want a foothold there even if they have no land or coastline of their own within the Arctic.

The politics of the Arctic stand out as one of the major drivers for the Ny Ålesund community but the scientific research being done there is at least as important and it is a common objective which brings together all the national interests. The natural gregariousness and openness of scientists makes for a relaxed and welcoming atmosphere.

Being close to the North Pole, Ny Ålesund is an ideal location for downloading data from the polar-orbiting satellites monitoring the Earth’s surface. These tell us about atmospheric pollution, forest cover, ocean currents and much more. On a mountain above the station, perched precariously on a sharp ridge and reachable only by a small cable car, is a building used to sample truly clean air – one of very few places on Earth where this is possible. And further around the bay are a couple of radio dishes pointing to the heavens to measure the regular beat of pulsars in the distant universe. These act like beacons – reference points used to measure the drift of the continents to an accuracy of millimetres.

Then there are the nearby glaciers, which are retreating fast, both under the influence of climate change and because the glaciers in Svalbard go through regular multi-decadal cycles of surge and retreat. Their walls of blue ice are a dominant presence at Ny Ålesund – as are the icebergs calving off them, which then float down the fjord past the settlement.

The fjord, which once froze over most winters, is now influenced by warm water from the Atlantic pushing its way north into the Arctic, so it rarely freezes over these days. Harbour, bearded and ringed seals, and beluga whales eat the polar cod which enter the fjord to feed on the rich food sources resulting from the fertilization of the ocean by glacial dust. These are all subjects of study to unravel how the Arctic is changing as the climate gets warmer.

Onshore, the permafrost, which stretches to depths of many tens of metres, is also warming and showing signs of melting. As the soils are churned by freeze-thaw for the first time in millions of years they release more greenhouse gases, thus exacerbating the process of global warming. The immediate practical consequence at Ny Ålesund is that the foundations of buildings are beginning to move. It is perhaps ironic that the reason which first brought people to this remote spot – coal mining – is also the reason they keep coming back in the present day: to observe the effects of burning those fossil fuels on the planet.

By the time I visited towards the end of August, the bright summer flowers of the tundra had largely passed over and had been cropped flat by the unusual, short-legged variety of wild reindeer which inhabit Svalbard (see photo). The many waterbirds – geese, ducks and waders like purple sandpipers – which breed in these parts had largely departed for their wintering grounds, many around Britain, and were a reminder of the close connection which exists between this cold, treeless landscape and the biodiversity of Britain. The barnacle geese which breed on the offshore islands in the fjord are the very same ones we value so much at the Caerlaverock nature reserve on the Solway Firth.


One of the wild reindeer in Svalbard looking very healthy at the end of the Arctic summer. Svalbard reindeer are very much smaller in stature than those elsewhere.


The strength of the links between Svalbard and Britain goes even deeper. How we respond to the challenge of climate change is going to depend on what happens here in the Arctic. British scientists are investigating the effects of warming on this delicate ecosystem. This includes experiments to understand how the tundra will change as the Arctic warms up, potentially releasing more greenhouse gas into the atmosphere. The west coast of Svalbard is especially good as a sentinel because it is warming unusually rapidly. This is because warm Atlantic water coming through the Iceland–Faroes Channel far to the south is making its way further north and starting to bathe the west coast of Svalbard around Ny Ålesund. When the manager of the UK’s research station first started coming to Ny Ålesund, rain was unknown – all precipitation fell as snow. Nowadays rain is common.


The UK’s Arctic Research Station in Ny Ålesund.





The UK’s Arctic Research Station is operated by the Natural Environment Research Council.




There could be nowhere better to understand how climate change will change the Arctic. Sediment and ice cores taken locally place the current climate trends in the long-term historical context and show how unusual they are.

All this is affecting the economy of the Arctic as well. Tourism is beginning to turn Longyearbyen, the frontier town which is the capital of Svalbard, into a thriving centre. Only a few weeks before I visited, a Danish shipping company announced that it would be testing the idea of sending cargo vessels through the north-east passage, something which can only happen because of the retreat of the Arctic sea ice.

The icon of the Arctic, the polar bear, is also likely to be affected, and this is something people really care about. At Ny Ålesund, polar bears are respected if not feared. I had the mixed fortune to run into a mother and cub (see photo). This is the ultimate predator. It is truly at the top of the Arctic food chain, where humans are relegated to second place – a powerful message for me as a representative of a species so dominant elsewhere.


Polar bears in Ny Ålesund.


By now, I hope it is clear why Svalbard is important to Defra. The credibility of the UK as a leader in tackling climate change needs to be underpinned both by research examining its effects and by the presence of the UK in the international fora where decisions are made. Moreover, important components of the biodiversity of Svalbard are shared with the UK. And I have yet to meet anybody in the UK who does not care about the fate of polar bears. At present, they seem to be doing well in Svalbard, partly because of reduced hunting, but they are being confined to land more than in the past, which means they hunt birds and their eggs. This will probably eventually affect those populations.

This is why we should care about Svalbard and the Arctic and why the UK needs to remain interested in its future. The Foreign and Commonwealth Office, together with the Natural Environment Research Council, maintains the UK’s research station at Ny Ålesund, but I think Defra should also be fully engaged as it develops a more expansive leadership role in international environmental stewardship in future.



Seeking Hi-Brazil

In some senses we all seek a promised land, like the medieval fable of Hi-Brazil, a phantom paradise island. For some, this might look something like “Love Island”, a fly-on-the-wall reality show currently screening on UK television (I’m pleased to confess I’ve never watched it). For others, like me, it represents something a little grittier.

In my view the operation of government, in terms of how it comes up with ideas about how to fix problems, experiments with solutions, evaluates the outcome and then modifies the solution based on experience, is much the same as the scientific process. Building the scientific process into government and making it the backbone of how government functions is my Hi-Brazil.

Like all promised lands it will be fictional, and, like the medieval mariners who hunted for Hi-Brazil, CSAs like me also hunt, mostly in vain, for their nirvana. But even CSAs can sometimes spy a distant land through a thick fog and can start to believe that it might actually exist. What does this land look like? On this land, there is a completely harmonious, seamless relationship between academia and government.

By way of confirmation of the sighting of this land, The Institute for Government (IFG), with part sponsorship from the Arts and Humanities Research Council, recently published a report called “How government can work with academia”. It looked at how government can improve the way it uses academic evidence and expertise in informing policy.

It is always helpful to have an external view such as that given by IFG on how the policy-science interface is working. Those looking in from the outside are often well placed to offer challenge to government. The report was based on interviews in 10 government departments; it shed light on what works and what doesn’t, and made a series of recommendations. Overall, Defra fares very well from the report and is seen as an exemplar in a number of areas including: its use of structured and responsive expert networks and committees, its systems for managing university relationships, and its approach to bringing in secondments to deliver valuable work (incl. evidence statements) including ensuring they develop insights into policy.

This is all very encouraging and is a tribute to the hard work and a gradually shifting sense of shared responsibility between the people who work in Defra – including those from my office and other scientists, analysts, social scientists and economists who are working in the same teams as policy professionals. This embedded model, which places specialists at the heart of policy-making and empowers them to have an equal share of the responsibility for policy development and delivery, is beginning to shine through in terms of better relationships with academia.

It takes years, perhaps decades, to turn around the massive ship of government, by changing cultures and ensuring that diversity of expertise is valued and built into decision making. This shift isn’t the same in all parts of government but I think it’s particularly strong in Defra.

For example, Defra’s engagement with the new organisation responsible for overseeing the health of research in Britain, UK Research and Innovation, together with its component Research Councils, has influenced the way scientists are thinking about how they might address questions which are both scientifically interesting and vital to making better policy. Individuals from Defra, who have a strong sense of what great science looks like, participate in Research Council programme expert/advisory groups, knowledge exchange network events and other fora. They also work closely with senior researchers to shape their own ideas and those of the research community.

Gone are the days (I hope) when Defra’s representatives came to these discussions with an agenda. Those sitting on the policy–academia interface now include both academics and the embedded specialists who work in Defra. They are exploring cognitively complex issues, which eventually leads to the co-design of research and policy: one alongside the other rather than one subservient to the other. We have moved in this direction recently on pollinators and on valuing natural assets, and there will be more in landscape decisions and air quality in the near future.

This is all about having interactive, engaging and influential conversations. The short-term rewards for academics can be pathways to impact which they can exploit within the Research Excellence Framework, but the end point is much more significant. This involves better outcomes supported by a very influential set of thinkers and networkers who feel that they are part of delivering those outcomes. The outcomes are valued.

I still come across some crusty academics who see their role as a battle to keep government in check and policy specialists who just see academics as mendacious meddlers. But they are fewer and further between now and getting rarer all the time. While there is still much to do to make government think and function much more like a scientific process, I see the IFG report as a partial endorsement of progress. My Hi-Brazil is still partially in the fog but not as much as in the past.

Synthesis is the next evolution of the scientific method

This week has seen the publication in Nature of “Four Principles for Synthesizing Evidence” – what I see as a key perspective piece. I and a number of others want to put evidence synthesis centre stage in science, alongside the key primary research breakthroughs which push the boundaries of our understanding.

I recently read the book “Theory and Reality” by Peter Godfrey-Smith about the philosophy of science and how it changed through the 20th Century. One of the most powerful conclusions from this exploration is just how much our view of reality is moulded by the methods we use to investigate and understand the world around us. This has shifted us in small steps from uncomfortable fundamentalist positions which questioned the nature of reality itself to much more pragmatic views of the world.

Although the former fundamentalist positions still have not gone away, science has progressively developed increasingly convincing methods for describing what is real. Part of this involves the recognition of science as a creative and social process. We still reward individual scientists for their ‘discoveries’ with prizes, but in reality progress in science is a mighty aggregation of the efforts of large numbers of people. I believe that some of the most significant progress in science in the early part of the 21st Century will come not from individual breakthrough publications but from the synthesis of evidence across many different lines of enquiry.

The advent of online publication and the presence of powerful web-based portals like the Web of Science and search engines are not only the result of this aggregated process of advancing science, but are also the things that will enable more knowledge to be aggregated in future. This is a form of systemic evolution which, if carried out well, could push the benefits of science in society to much greater heights.

The reward structure for scientists is also recognised by philosophers of science as an important component driving this machine for invention and innovation. The scientific establishment has been slow to reward scientists for looking across their disciplines and coming up with new ideas or insights about the world based on gluing together information already in the public domain. Individual scientists are like component manufacturers who have been told to make pieces of a structure without anybody being tasked with joining all of the pieces together.

Indeed, synthesis has been frowned on as secondary or derived information. Synthesis has been disparaged by being confused with ‘reviewing’ or the restatement of old ideas in a new context. We need to transition from talking about scientific review to talking much more clearly about the prospective and inventive process of scientific synthesis. It has been devalued within the reward structure but its potential is vast. In the article in Nature, we want to purge the old idea that merging and analysing the outcome from multiple strands of scientific output somehow lacks importance, and we want to put synthesis in its rightful place as an exciting, intellectually challenging, high-status and respected activity that provides a global public good.

The article has been built on workshops led by the Royal Society and the Academy of Medical Sciences. It recognises that synthesis in science is as valid a pursuit of original knowledge and ideas as any other. The complexity of scientific information these days is such that it takes special skills and procedures to pull out the essential, reliable knowledge from the midst of a huge mass of disparate information. Evaluating the quality of the underlying evidence base and building up new and otherwise unseen pictures of the world are just two of the reasons why scientific synthesis is especially important for areas like policy-making.

The article defines the characteristics of good evidence synthesis to inform policy: it needs to be inclusive, rigorous, transparent and accessible. It also recognises that synthesis can take many different forms and that the utility of these forms depends on the audience. Synthesis to support the progress of science itself might be very different from synthesis to support a decision being made by regulators, for example, about whether a particular drug is effective and safe, or by governments about whether a new policy is likely to be effective. Despite this range, the principles should apply to synthesis for all policy purposes and timescales.

We recognise that producing syntheses can be a substantial task, often involving multiple collaborators working together over months or even years. However, within the policy environment, those who need synthesis can sometimes require this information in periods of days, and perhaps even hours or minutes when involved in fast-moving emergencies. Evidence synthesis for policy represents everything from flying by the seat of the pants when one is driven by events outside one’s control through to measured and deeply intellectualised representations of the current state of knowledge. The processes involved in manufacturing the syntheses in each case need to be tailored to the circumstances – but they all have the same key features. In essence, they make sense of the vast amount of published data and information and turn it into accessible, usable knowledge. This can change our view of the world around us in ways which were not predicted, and we emerge feeling better off, more educated.

In Defra, we have been grappling with the idea of synthesis for some time. I have wanted all our main policy areas to be informed by “evidence statements”. These are short, authoritative, readable syntheses of the scientific knowledge – including its strengths and weaknesses – in a particular field. For example, we produced one recently on the effects of air quality on semi-natural terrestrial ecosystems. After a lot of careful consideration, we are starting to build what I hope will be a large portfolio of evidence statements produced by applying the principles described in our article. If implemented across the whole of Defra this would amount to a potentially very large number and, perhaps eventually, we will reach a point where we can produce syntheses of the syntheses.

At present, these mini syntheses are scaled so that they can be produced by a PhD student on secondment to Defra for a period of three months. Not only does this time-limit instil discipline in terms of scope but it also means we are exposing our researchers of the future to very practical experience at the coal face where science meets policy. These individuals also walk away with a very tangible output to their names (hence they receive credit which is so important in the social process of science).

Making the scientific effort of the past count more in the present and future has to be a good thing. The art of scientific synthesis will, I predict, be a recognised and increasingly valued part of the scientific effort in future.


Kicking the Photon Along the Road

We eat the energy which comes from photons, which are the elementary particles making up light. This happens because photons of light with sufficient energy knock electrons into a higher energy state within the green plant pigment known as chlorophyll. These are then used as the energy needed to manufacture complex molecules like starch, proteins and fats. This is an old story but it has a new and exciting twist.

In my job I sometimes feel overwhelmed by the challenges connected with how to build a sustainable future. Most of the time the solutions which many well-meaning people bring to my door just don’t add up or they are insufficiently ambitious. More often than not they create a new way of doing things which just moves the problem of a sustainable future elsewhere. They are not solutions. They kick the can along the road.

The challenges we have will not be solved by doing a little here and a bit more there. Tinkering with our current systems of old production won’t be enough.

For example, one of the biggest challenges we face is how to change the energy balance of agriculture. Currently, we expend about 10 calories of fossil fuel to generate one calorie of food. This is unsustainable not just in a small way but to quite an alarming extent. If we are to tackle this problem farming will need to be completely re-imagined within the next 30 years or so.

Such a huge challenge calls for us to take this back to its basics. How can we transfer photons of light from the sun into edible energy at maximum efficiency? This is a really interesting scientific and engineering question but it is also important and answering it might just be the saviour of humanity.

Until a few days ago I thought that “vertical farming” – an answer which some people had brought to my door – was just another of those ideas designed to kick the can along the road. This was because, after doing some research, I had concluded that it would always be more efficient to use a photon coming directly from the sun to energize electrons in chlorophyll than to use a photon generated by a light powered by electricity. The loss of efficiency involved in generating the electricity, transmitting it to the point of use and then translating the power of electrons into photons would always make this a pretty unattractive option for future food production. But I’m beginning to think I might be wrong.
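To make that reasoning concrete, here is a back-of-envelope sketch in Python. Every efficiency figure is an assumption chosen purely for illustration, not a measured value for any real system.

```python
# Compare the fraction of the sun's energy that reaches a leaf as
# usable photons via two routes. All stage efficiencies below are
# illustrative assumptions only.

def chain_efficiency(*stages):
    """End-to-end efficiency of a chain of energy-conversion stages."""
    eff = 1.0
    for stage in stages:
        eff *= stage
    return eff

# Route 1: a photon travels straight from the sun to the leaf.
direct = chain_efficiency(1.0)

# Route 2: sunlight -> photovoltaic panel -> grid -> LED -> leaf.
pv = 0.20            # assumed solar-panel conversion efficiency
transmission = 0.93  # assumed grid transmission/distribution efficiency
led = 0.50           # assumed LED wall-plug efficiency

indirect = chain_efficiency(pv, transmission, led)

print(f"Direct route:   {direct:.3f}")
print(f"Indirect route: {indirect:.3f}")
print(f"Energy penalty: ~{direct / indirect:.0f}x")
```

On these assumed numbers the indirect route delivers roughly a tenth of the energy, which is why I was sceptical. The counter-argument, developed below, is that LEDs can emit only the wavelengths plants actually use, power can be bought cheaply off-peak, and growth is faster, so the raw photon penalty is not the whole story.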

“Vertical Farming” is the idea that we can grow plants within enclosed spaces, often including high-rise warehouses, using artificial light instead of the sun. Is it really the future way that our fruit and veg will be produced? I think it could be and here is why.

On a recent visit to a pilot vertical farming unit run by a company called Intelligent Growth Solutions, housed at the James Hutton Institute near Dundee, I saw what can be achieved by an innovative, never-say-die approach. They have worked through the problem of how to translate a photon into edible food. Using the best expertise, much of it in Scottish universities, they have solved probably the biggest problem in vertical farming: how to improve the efficiency of the Light Emitting Diodes (LEDs) used to generate photons of light from electricity.

They have re-imagined the problem and, on the back of their key innovation, they are systematically re-engineering the processes of plant production. This draws on the world’s leading technologies in robotics, storage and transport solutions, and plant biology. By intelligently integrating these growing systems with grid electrical generation, including placing them in the right places like the centre of cities, they can reduce their power costs to a fraction of normal prices. By containing the farms inside buildings they can optimise the atmosphere for plant growth by enhancing CO2 levels and ensuring it is always at the best temperature. Sealed containment eliminates disease (hence no need for pesticides!) and sensors inspect the plants to detect stress so that the growing conditions can be adjusted to be just right for the plants. Nitrate and phosphate fertilizers are provided in exactly the quantities needed by the plants, so there is no waste and, therefore, no environmental impact. Robots will busily move pollen around to fertilise plants like beans or soft fruits.

All this will happen in stacked plant trays each of which is tended carefully, and untiringly, by robots around the clock. The plants will grow perhaps at least twice as fast as they do in the field. There is no need for heavy machinery to plough, sow and harvest. Where once many human hands were needed none are now needed. Where once the plants had to put up with all that the weather and climate would throw at them now they are to be cosseted and their every need catered for. Where once the produce was packed into trucks to be carted around Europe it can be produced next to the point of sale. Where once vast tracts of land were needed, the land footprint is very small. Where once only a single crop was possible each year on that vast area of land now perhaps four or five crops of equivalent size will be produced on the much smaller area. Where once we were oxidising valued soils such as the fenland peatlands of East Anglia to produce vegetables (and a lot of CO2 to boot) these can be returned to the function of storing carbon and hosting wildlife. Where once we produced a lot of food which was wasted we can almost completely eliminate this waste.

All this is made possible by one, quite simple, but very clever innovation with how a 3-phase electrical supply can be adapted to feed power to LEDs. This innovation alone is very likely to pop up in other places too. For example, it could completely revolutionise how electricity is routed around factories and even our homes. There is something quite poetic about a bunch of people who are brought together by an entrepreneur with an imaginative idea about farming which could turn out to change all our lives in positive and unimagined ways. This is how innovation works and it’s quite magical.

This doesn’t seem to me to be like kicking the can along the road. But one thing worries me. This is happening now in Scotland, driven mainly by the brilliant minds of people mostly in Scottish universities and institutions, as well as a Scottish entrepreneur. However, will Scotland be the country to scale this up and make it real? Will Scotland be the place that the world turns to as the great innovator in vertical farming, like Finland has been for mobile phone technology? This will only happen if the major capital investments needed to turn this into an industrial reality are made in Scotland. Otherwise perhaps we’ve just kicked the photon along the road, but at least we’ll have the great satisfaction of knowing we’ve just done something really great towards saving the planet.


Ian Boyd (second from right) pictured inside the newly commissioned vertical farming facility at the James Hutton Institute with (left to right) Douglas Elder (Project Manager at Intelligent Growth Solutions), Colin Campbell (CEO, James Hutton Institute) and Henry Aykroyd, founder and director of Intelligent Growth Solutions.


Insect declines in Germany – is seeing really believing?

I once re-homed a dog which had a pathological hatred of brushes. She had clearly been mentally scarred early in her life by being beaten with a brush. We are all scarred to some extent by our past experiences. One of my mental scars concerns the interpretations placed on historical data suggesting trends through time in natural processes, also known as “time series”. This may seem a somewhat odd aversion to have but let me explain why it has been important to me as a scientist by using a recent example concerning the declines of insects in Germany.

When I began my PhD in the early 1980s the research field I was working in was dominated by a simple idea. This was that as populations of long-lived mammals declined they compensated by beginning to reproduce earlier in life and by producing more offspring. Much of the evidence to support this came from the observation of trends in these reproductive features through time.

But there was a problem. Almost all the trends were going down. I cannot ever recall seeing an upward trend in these data. Everybody thought this was indicative of big problems. As a young researcher, I was worried by this and found myself out of step with the received wisdom. I eschewed including these time series in my studies, although at the time I wasn’t sure why.

The observation of these trends was important because it was a dominant force in the arguments being used for the population management of some of our most iconic species such as whales, seals, wolves, bears and elephants. Fortunately, I wasn’t the only person who was worried by this and eventually much of the empirical picture was exposed as an artefact.

Ever since then I’ve been very sceptical whenever anybody presents a time series of data. This includes everything from tree ring data purporting to show trends in climate to trends in the abundance of birds, bees or butterflies and moths across the British countryside. Sometimes I feel I could write a whole book about just how misleading the data about trends can be. And yet, the simple messages they carry mean we lap them up. A wiggly ascending or descending line on a graph carries a lot of beguiling messages. But are they true?

So when I read the recent paper published in PLoS One showing declines in insect populations in Germany, I started from a sceptical viewpoint. I started from a position of knowing what I needed to see in the paper to convince me that the trend was real. The Guardian had already reported the simple, beguiling message as ‘Warning of “ecological Armageddon” after dramatic plunge in insect numbers’. Did I see all that I needed in that paper to convince me that the simple message was correct? Not quite.

The authors had a sample of 96 data points from 63 sites across Germany. They had trapped insects using a standard method between 1990 and 2016 and then plotted the total biomass of insects trapped through time. As I read the paper I began to like it. My prejudices were being challenged and weakened.

A strength was that not many sites were sampled on multiple occasions. Some were sampled two or three times. This helps to get rid of a nasty feature of time series data known as autocorrelation. So far, so good.
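To show why autocorrelation is such a nasty feature, here is a minimal simulation (a sketch with illustrative parameters, not a reanalysis of the German data): series with no underlying trend but strong serial correlation show apparent trends far more often than independent data do.

```python
# Fit a straight line to simulated 27-year series with NO true trend,
# and count how often the fitted slope looks "large". AR(1) noise
# (each year echoing the last) produces spurious trends much more
# often than independent year-to-year noise.
import random
import statistics

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def spurious_trend_rate(phi, n_years=27, trials=2000, threshold=0.05):
    """Fraction of trend-free AR(1) series whose |slope| exceeds the
    threshold anyway. phi is the year-to-year autocorrelation."""
    xs = list(range(n_years))
    hits = 0
    for _ in range(trials):
        y, series = 0.0, []
        for _ in range(n_years):
            y = phi * y + random.gauss(0, 1)  # AR(1): today echoes yesterday
            series.append(y)
        if abs(ols_slope(xs, series)) > threshold:
            hits += 1
    return hits / trials

random.seed(1)
white = spurious_trend_rate(phi=0.0)  # independent years
auto = spurious_trend_rate(phi=0.8)   # strongly autocorrelated years
print(f"Apparent trends, independent data:    {white:.1%}")
print(f"Apparent trends, autocorrelated data: {auto:.1%}")
```

The exact rates depend on the assumed parameters, but the ordering does not: repeated sampling of the same sites inflates the chance of seeing a trend that isn’t there, which is why the paper’s largely independent sites are a strength.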

The authors had also gone to great lengths to describe the data using a robust statistical method. This had helped them to look for relationships with weather and changes in land use, all important for building a picture which might convince a sceptic like me.

The results which emerged were quite startling. In 27 years, on average, there had been a 75% decline in insect biomass. Even if I thought the heavy-weight statistics might have built some form of artefact into the result, the magnitude of the change was so large that it would be difficult to see it as a statistical artefact. I was convinced that the authors’ claim of declining insect biomass in the sites they had observed was real.
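As a sanity check on that headline figure, it is worth asking what constant annual rate of decline would compound to a 75% loss over 27 years (a quick calculation, assuming a steady exponential decline):

```python
# A 75% decline leaves 25% of the biomass. Find the constant yearly
# multiplier that, applied 27 times, leaves 25%.
remaining = 0.25
years = 27

annual_factor = remaining ** (1 / years)       # yearly survival multiplier
annual_decline = (1 - annual_factor) * 100     # yearly percentage loss

print(f"Equivalent constant decline: {annual_decline:.1f}% per year")
# prints: Equivalent constant decline: 5.0% per year
```

Roughly 5% of the biomass lost every single year for 27 years: a rate of change large enough that subtle statistical artefacts struggle to explain it away.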

But when I turned to the Guardian article and saw how this result was being interpreted I started to get worried. Technically, the paper showed declines across 63 sites which had been chosen specifically for their conservation value. These were probably mostly relatively pristine habitats. But people were now saying this was telling us about how insects were declining across the whole countryside. Let me explain why this is misleading.

There are basically two reasons. The first concerns what might be called a founder effect and the second concerns how representative the habitats sampled in the paper were of the German countryside as a whole.

In Europe the Habitats Directive has encouraged us to scout the country for sites where wildlife seems to be in a relatively natural and abundant state and then to put a protective ring around them. This is a good thing to do, but we need to be aware that these sites are not going to stay the same through time. In an environment where many dynamic processes are at work – weather, land use, natural succession and more – change will be the norm. The authors of the paper did a valiant job of trying to recognise this but were very constrained in what they could do to compensate by the lack of control sites chosen at random. This effect is brought about by the state of these sites when they were founded.

In these circumstances there are only really two likely directions for future change in a set of protected sites which already have high wildlife abundance: they can remain much as they were when they became protected, or the abundance of wildlife can decline. It is much less likely that abundance will be seen to increase in such a set of sites. It’s hard to make pristine sites more pristine, although I acknowledge that management and restoration are an important part of the current philosophy of conservation and could be expected to lead to some increases in measurements like insect biomass.
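This selection effect can be demonstrated with a toy simulation (all numbers are hypothetical): give every site a stable underlying biomass, add year-to-year noise, “protect” only the sites that looked best at the outset, and then remeasure later. The protected set shows an apparent decline even though nothing has truly changed – regression to the mean dressed up as a trend.

```python
import numpy as np

rng = np.random.default_rng(1)

n_sites = 100_000
true_quality = rng.normal(100, 10, n_sites)      # stable underlying biomass, no trend
noise = lambda: rng.normal(0, 10, n_sites)       # year-to-year natural variation

baseline = true_quality + noise()
later = true_quality + noise()                   # nothing has truly changed

# "Protect" only the sites that looked best at baseline
protected = baseline > np.percentile(baseline, 90)

print(round(baseline[protected].mean(), 1))  # high, by construction
print(round(later[protected].mean(), 1))     # lower: an apparent decline with no real trend
print(round(later.mean() - baseline.mean(), 2))  # across all sites: essentially no change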

The second reason concerns representativeness. The sites in Germany were almost certainly a highly biased representation of the German countryside as a whole. A fairer sample would have compensated for these high quality sites by also choosing sites on land which had been cleared of its wildlife. Tracking them in parallel through time using the same methods would almost certainly have produced a very different result.

Can the results in this paper then be used to extrapolate across the whole countryside? I don’t think so. The sites reported in the study are likely to be extremely unrepresentative of German countryside.

Indeed, taking both these problems together, if one had thought deeply about this study in advance I suspect that the eventual result would not have been a surprise at all (at least qualitatively, if not quantitatively). This is purely because of how the sites were selected.

In my mind, therefore, the idea that there are large changes in insect populations across Germany remains unproven. Of course, it could be correctly reflecting wider trends, but this study does not provide that result. Other studies have shown declines, but remember what I have said about the multitude of problems with time series. Were all those other studies fair tests? Almost certainly not; few that I have seen are. An accumulation of many unfair tests does not amount to a fair test. Indeed, it probably amounts to the creation of an illusion.

We are all victims of our own prejudices. The scars I carry remind me constantly of the dangers of prejudiced interpretations of data. Like everybody else I want to really know what is going on across the countryside but unlike those who uncritically lap up information like that in the German study I also worry terribly about just how blind we are. I cannot bring myself to believe a lot of the data used to track change. The best data we have comes from the BTO, and that shows a mixed picture, but we need to become a lot better at producing synoptic measures of changes which truly capture the total picture. Our work in Defra (including collaborations across the Environment Agency, Natural England, CEFAS and JNCC) on earth observation, when mixed in with the kind of data produced by the BTO, promises much when it comes to putting us on the right track. I want to see us genuinely moving to a new way of systematically measuring and monitoring the environment in ways which can meaningfully track progress.

The lesson from the paper about insect declines in Germany is not about insects at all. Instead, it is about ourselves and whether we want to perceive the real world defined by systematically-gathered, reliable data or whether we prefer to believe our own prejudices and design the data to fit them.

Opportunities too good to miss

As the calendar rotates through to October, the tachometer shows that I have completed five years as the Chief Scientific Adviser at Defra. Some would say that is enough. Any academic who becomes immersed in government for too long runs the risk of becoming a part of the system, rather than a challenger to the system.

In February, I said that I would leave Defra at the end of August. Clearly, this has not happened, and I would like to explain why.

There are many reasons. Some are personal, but most concern what is happening in and around the Defra group and how much that excites me. Much is changing across the scientific landscape at present. UK Research and Innovation (UKRI) will be established, and there are new opportunities from the Global Challenges Research Fund and the Industrial Strategy Challenge Fund. In the whole of my career, the opportunities have never been greater for research to deliver meaningful progress.

The Defra group is not in a position to benefit directly from these initiatives, but it can benefit indirectly, because of much that has been happening behind the scenes here over the past few years. From once being a significant sponsor of research, the Defra group has had to change to become a better user of research. It is becoming a customer, rather than a supplier or sponsor, of research. As a customer, the Defra group needs to lead the intellectual agenda with respect to what questions should be tackled by research. For the first time in my experience at Defra, the research community is really in a mood to listen to the challenges facing government departments such as this one.

The Defra group is responsible for delivering the basics of life – food, water and air – in sufficient quantities and to a demanding quality standard. As a consequence, we have to deal with some of the most difficult questions facing people and the planet. These include how to mitigate the effects of climate change, sustain food and water supplies, cope with the spiralling demand for natural resources and minimise the poisoning of the environment, and ourselves, by pollution. It’s a massive and critically important agenda. In future, our way of life is going to depend on decisions made within the corridors of Defra group organisations. Balancing the delivery of goods from the environment in the long term with the demands for economic growth in the short term will always be difficult and we need the help of the best intellectual minds Britain can muster.

Like many others, I cannot easily walk away from these challenges and especially when opportunities are opening up which could ratchet us along the track to improvement. I occupy a position in the clockwork which makes this process work. The baton I carry needs to be passed on eventually, but it needs to happen at the right time. When I arrived in government, it took me some time to fully understand where I sat in the clockwork of the government system and how to influence it. With all the changes going on in the research sector and also with the UK exiting the EU, and the challenges and opportunities that throws up, this doesn’t feel like the best time for me to pass that baton on.

Consequently, I have agreed to stay on for at least another year and, with the support of the department, to pursue an ambitious agenda.

The value of scientific opinion?

The European Food Safety Authority (EFSA) has published guidance on how to incorporate uncertainty into scientific assessment[i]. On the plus side, this is a thorough attempt to bring objectivity to the description of uncertainty and to minimise subjective opinion. On the negative side, it could eliminate the opinion of scientists from the policy debate. Where uncertainty exists, this could result in risk-aversion in policy-making.

As a scientist, I believe it is vital that public policy is underpinned by a foundation of evidence. However, scientists must also acknowledge that policy makers look through many lenses when making their decisions and science needs to play its part as one of these lenses. It is therefore important that the relationship between uncertainty in the evidence and risk to policy is understood.

While scientists are used to dealing with the uncertainties inherent within their evidence, these uncertainties present a real tension when being used to underpin the more black and white, yes or no, world of policy. Government departments, like Defra, use evidence to guide rather than to determine policy in areas of uncertainty.

Scientific uncertainty comes in two basic forms – aleatory and epistemic. Aleatory uncertainty is the natural variability in a system and is often irreducible, even through research. For example, the yield of wheat per hectare from British farms varies from year to year. In contrast, epistemic uncertainty reflects gaps in our knowledge and is amenable to being reduced through research. For example, wheat yields from British farms have been, on average, static for about the last decade and we don’t know why. It is important to understand the difference between these forms of uncertainty in the context of evidence assessments for policy making.
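A toy calculation (with invented wheat-yield numbers) illustrates the distinction: collecting more years of data shrinks our epistemic uncertainty about the long-run average yield, but the aleatory year-to-year variability remains however much we measure.

```python
import numpy as np

rng = np.random.default_rng(0)

true_mean_yield = 8.0   # t/ha, hypothetical long-run average
year_to_year_sd = 1.0   # aleatory: genuine variability between seasons

def estimate(n_years):
    """Return (estimated year-to-year sd, standard error of the mean)."""
    yields = rng.normal(true_mean_yield, year_to_year_sd, n_years)
    sd = yields.std(ddof=1)          # aleatory spread: does not shrink with data
    se = sd / np.sqrt(n_years)       # epistemic uncertainty about the mean: does shrink
    return sd, se

for n in (5, 50, 500):
    sd, se = estimate(n)
    print(n, round(sd, 2), round(se, 2))
```

However many years we observe, the estimated spread hovers around the true year-to-year variability, while the standard error of the mean steadily falls – research can buy down the second number but not the first.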

This is well illustrated by the recent EFSA document which is aimed mainly at documenting epistemic uncertainty. Evidence assessments are now used widely to produce ‘scientific opinion’ in an attempt to advise policy-makers about the scientific consensus view on a subject. EFSA uses them a lot – e.g. for assessing the safety of pesticides or GM organisms. The Intergovernmental Panel on Climate Change (IPCC) is another body that has done this on a massive scale to provide an assessment of the evidential basis for anthropogenic climate change.

These assessments needed to include opinion because we know that the way evidence is generated through the scientific process is itself subject to aleatory uncertainty. For example, the results from many experimental studies carried out in the fields of psychology and biomedicine are known to be unreliable[ii]. Including just the epistemic component of uncertainty using this literature could produce a biased assessment. Among all the studies done in a particular field, it can be impossible to discriminate the reliable from the unreliable studies using systematic, rule-based assessment. In the environmental sciences, where studies are often impossible to replicate and where less reliable inferential methods are often used, this problem is probably even more profound.

Within this context, the EFSA attempt to corral and upgrade the assessment process by being clearer about how uncertainty is being dealt with is commendable. However, nobody should imagine that this will solve the problem about how scientific evidence is used to define the risk associated with food in Europe. Beliefs and values are as prevalent within scientists carrying out assessments as they are in non-scientists. The kind of processes being suggested by EFSA, while necessary, still should not ignore scientific opinion. The EFSA guidance carries the risk of systematising the expression of uncertainty by focussing purely on the state of knowledge, the epistemic component of uncertainty. Recognising the existence also of the aleatory components of uncertainty in scientific assessments is essential. It brings humanity to the discourse between science and society, and science and policy.

[i] Guidance on Uncertainty in EFSA Scientific Assessment, EFSA Scientific Committee. doi:10.2903/j.efsa.20YY.NNNN

[ii] Nosek, B.A. et al. Estimating the reproducibility of psychological science. Science 349. DOI: 10.1126/science.aac4716