Understanding ecological systems takes time. While some experimental ecological work, performed under controlled lab conditions, can be conveniently fitted into the short-term periods beloved of funding bodies, much of ecology requires a longer-term perspective. Why is that? First, life-histories frequently operate at generational scales approaching decades. To have any hope of making sense of patterns of inheritance, selection or demography we need data spanning multiple generations, and that may mean multiple decades. Second, almost all ecological studies reveal heterogeneity among individuals – frequently in terms of vital rates, or detection probability, or other aspects of life-histories. Such heterogeneity makes it very hard to extrapolate from cross-sectional observations to understand the true sources of variation driving a population.
Third, as anyone who has carried out a field study knows, a sample of just a few years can be hopelessly unrepresentative of the overall pattern of variation. To illustrate with a personally relevant example, the last two years of the long-running Wytham great tit population study (which I now oversee) saw a year with one of the latest ever spring breeding seasons (2013), immediately followed by one of the earliest (2014). This pair of years – in an unbroken sequence of 68 years – provides a fascinating contrast, because it enables us to compare how the same individuals respond to markedly differing conditions, but we can only make that comparison informed by the long history of this study.
Finally, and the flip-side of the argument above, is that ecological processes may sometimes be influenced by rare events, or themselves occur rarely. For example, it will usually be hard to perform a study of inbreeding in a wild animal population with only a few years’ data because close inbreeding is rare. But understanding the many demographic causes and selective effects of inbreeding, or how the evolution of inbreeding avoidance drives patterns of dispersal in populations, requires enough of these rare events to draw robust conclusions. Equally, if we want to understand the influence of extreme climatic events, or of sudden human-caused events, we need a long sequence of data for comparison.
No doubt all of these points are familiar to most readers; and indeed they have been made many times, as has the point that long-term studies are disproportionately influential in terms of their contribution to science. So, why make the point again?
I was sparked into action by a colleague’s solution to a threat to the viability of a long-term population study. Tim Birkhead – my former PhD supervisor – has run a long-term study of a seabird, the guillemot Uria aalge, on Skomer, off the west coast of Wales, ever since he began studying the population as a PhD student in the 1970s. Over the 40 years of the study, Tim and colleagues have documented changes in demography as the population has steadily climbed from a population trough1,2, analysed how the population recovers from sudden events3, and used the population as the basis of several PhDs on the behaviour and ecology of monogamy in birds (the results from which have featured in several successful books). In global biodiversity terms, one of the few things the UK is really important for is its coastal seabird populations, and there have been worrying signs that, in some parts of the UK, these populations are in a parlous state. Understanding why some populations – like that of the guillemot on Skomer – are doing well is therefore of considerable conservation relevance. Continued monitoring of the guillemot population on Skomer also provides the raw data to track the way the population responds to future hazards and threats, and to interpret these in the light of past responses.
All of this scientific utility from this one study is generated by simple, good-old-fashioned field ecology. A few days’ clambering around on the cliffs at mid-summer each year yields a cohort of colour-ringed young guillemots of known age and origin. Several months of patient observation (and reading leg-rings in colonies of tightly packed guillemots surely takes patience!) each spring and summer yields resightings of these birds, from which recruitment and survival rates can be estimated. That description of fieldwork is – in essence – true for the core data that underpins many long-term studies: it is, on the face of it, simple to collect, but generates tremendous value. We shouldn’t confuse the complexity with which data are collected with their utility. What we should appreciate about studies like this is that continuity is vital. A gap of a single year means not just potentially missing data on annual variation in recruitment and survival rates, but also an entire missing cohort of marked birds, about which nothing will ever be known. For a long-lived species like a guillemot, this cohort might make up a relatively small proportion of the population; for shorter-lived species a missing year would be catastrophic in terms of population coverage.
Tim’s guillemot project costs about £12K a year to fund. Small change, really, when the typical research council grant in the UK – running over three years – comes in at something like £500K. For a long time, the project was funded by the Countryside Council for Wales, but, in its new guise as Natural Resources Wales (NRW; what is it with rebranding? I wonder how much that cost anyhow…), it took the decision to axe the study.
And now to the title of this piece: frustrated by NRW’s refusal to revisit the funding decision, Tim has decided to make a foray into crowd-sourcing to generate the funds to keep the study going. This has gone well, with the target reached in only a few weeks, and an average donation of just under £20. In some ways this is very encouraging – if you wish to donate, please visit https://www.justgiving.com/timbirkheadguillemots/ – but it also raises many worries about the long-term vitality of such studies. While Tim is to be congratulated for exploring this alternative source of funds, I doubt it is viable for many such cases. Few of us have quite as much drive and charisma as Tim has with which to engage the public, and guillemots on rugged cliffs may also engage the public more than, say, a long-term study of plant ecology. Inevitably, one suspects that the public will only have a limited amount of interest in long-term studies, before compassion fatigue sets in.
“Wait a minute”, you may be saying, “aren’t many long-term studies funded by research council grants? Isn’t that a sign that they are attracting the resources they need? If they produce such good science shouldn’t they be competitive for that sort of funding?” Well, yes, this is true for some of these studies. Famously, one particular long-term study (red deer on Rum) is said to have had 36 years’ consecutive funding from UK research councils, all in the form of three-year grants. However, I think that reliance on standard grant funding as a way to fund the core parts of longitudinal studies creates three different types of perverse incentive, none of which is good for science in the long run. Arguably, this may actually reduce the funding available for other types of ecology.
First, because grants are short-term (relative to long-term studies), the focus of grant assessment is on testing new hypotheses, and these frequently have to be tested experimentally. Experiments may have all sorts of consequences, and could easily undermine the future utility of the long-term data that led to them. Second, I think that part of the reluctance of those running long-term studies to make their data freely available stems from their need to parcel out their ideas about analyses possible in their data sets to different small and short-term grants. The worry about making data open is that others might carry out and publish such analyses, and that this would then compromise the ability to secure funding in the future. Finally, the emphasis on novelty in standard grant applications (as opposed to seeing the value of continuity) leads to large and complex grants that emphasise the testing of new ideas, even if those applying for funding might be prepared to accept funding at a lower level to maintain continuity. Indeed, the ‘core’ data collection of long-term studies sometimes represents a byproduct of these larger grants, even if it is this byproduct that may, in the long run, be more valuable. After all, novel ideas frequently fail to generate more than a lot of hot air, but core data on individual-level life histories has a proven utility.
What is the solution to this situation? For some time I have felt that a relatively small investment by UK research councils, or (dare I say it) by some of the larger professional organisations such as the British Ecological Society, might actually be a cost-effective solution. Most long-term studies are – at their hearts – cheap to run. £12K for annual monitoring of guillemots might be towards the low end, but in many cases, I suspect that the annual costs for the core data are of the order of £25-£50K. Most of this goes on paying salaries of field and research assistants. So, allowing for inflation, for an investment of £2M, I suspect that we could fund a dozen or more of the most extensive long-term studies for at least a decade each to collect their core data. At current rates, that’s about four standard grants’ worth, which would on average generate only about a tenth of the same type of data which, if not continuous, might have much less than one tenth the value. Remove the extra costs associated with writing (and refereeing and assessing) multiple grant applications along the way, and the equation looks even better. Such an arrangement could be linked to making the data sets openly accessible, and generate even more return by enabling others to analyse these data. My personal view is that this would be desirable, although this issue is controversial among those who run such studies. Roche and colleagues4 have some interesting suggestions about how to increase buy-in to this activity, and it is an issue to which we’ll return soon in this blog.
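To make the back-of-envelope comparison above concrete, here is a minimal sketch of the arithmetic. The per-study core cost, number of studies and durations are illustrative assumptions drawn from the ranges mentioned in this post, not actual budgets:

```python
# Compare two ways of spending a hypothetical £2M pot:
# (a) core funding for a dozen long-term studies for a decade each, vs.
# (b) the same money as standard three-year research council grants.

POT = 2_000_000        # hypothetical one-off investment (GBP)
GRANT = 500_000        # typical three-year research council grant (GBP)
GRANT_YEARS = 3

CORE_COST = 16_000     # assumed annual core cost per study (GBP),
                       # within the £12-50K range discussed above
STUDIES = 12
YEARS = 10

# Option (a): a dozen studies, a decade each
long_term_study_years = STUDIES * YEARS          # 120 study-years
long_term_cost = long_term_study_years * CORE_COST  # stays under the pot

# Option (b): standard grants from the same pot
grants_funded = POT // GRANT                     # 4 standard grants
grant_study_years = grants_funded * GRANT_YEARS  # 12 study-years

# "about a tenth of the same type of data"
ratio = grant_study_years / long_term_study_years

print(f"long-term: {long_term_study_years} study-years for £{long_term_cost:,}")
print(f"standard grants: {grant_study_years} study-years; ratio = {ratio:.2f}")
```

At these assumed figures the twelve studies cost £1.92M for 120 study-years of core data, while the same pot as standard grants buys roughly 12 study-years – about a tenth of the coverage, before counting the extra value of unbroken time series.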
I should declare an interest here: I’ve inherited running the long-term studies of tits based at Wytham Woods near Oxford, from which we continuously extract data collected over decades, some of it before I was born. I can’t take any credit for the founding, or the basic design, of that study, but I am keenly aware of both the value of continuity, as well as the pressure to keep the funding flowing.
Of course, funders may ask what the expected outcome of such long-term funding might be, but one of the many lessons learnt from long-term studies is that the uses that we can think of for data now only capture a small part of their subsequent utility. In the 1980s, few would have guessed how important such studies would prove to be for understanding biotic responses to climate change; in the 1990s, few would have guessed how large a role genetics would play in modern animal ecology; and writing now, it is clear that only a few years ago we underestimated how much we’d be able to make use of very high density phenotypic data from continuous tracking and observations of animals. And this simply illustrates the often-made point that predicting the future utility of science is difficult, if not impossible. In other fields in environmental biology, long-term monitoring is well-established as a recipient of Research Council funding, and indeed played a vital role in the detection of the ozone hole over Antarctica5; it just seems that not all fields have historically had access to such funding.
Funding decisions are made in substantial part on the basis of the track-record of investigators, and it makes sense to allow the track-record of study systems to inform them too. After all, if a particular study continues to generate ground-breaking work over decades, even though the PIs running it may have turned over several times, it’s not an especially brave prediction to suggest that it will continue to do so.
Ben Sheldon
Senior Editor, Journal of Animal Ecology
(twitter: @Ben_Sheldon_EGI)
References
1. Meade, J.C. et al. (2013) The population increase of common guillemots Uria aalge on Skomer Island is explained by intrinsic demographic properties. J. Avian Biol. 44, 55-61.
2. Votier, S.C. et al. (2008) Recruitment and survival of immature seabirds in relation to oil spills and climate variability. J. Anim. Ecol. 77, 974-983.
3. Votier, S.C. et al. (2005) Oil pollution and climate have wide-scale impacts on seabird demographics. Ecology Letters 8, 1157-1164.
4. Roche, D.G. et al. (2014) Trouble-shooting public data archiving: suggestions to increase participation. PLOS Biology 12, e1001779. http://www.plosbiology.org/article/info:doi/10.1371/journal.pbio.1001779
5. NERC Planet Earth feature: http://planetearth.nerc.ac.uk/features/story.aspx?id=509&cookieConsent=A
I’m afraid that this won’t be the last example of the Welsh government not seeing the value of ecological monitoring, or indeed of nature conservation in action. Natural Resources Wales is not simply a rebranding of the old CCW; it is a completely new organisation, and seems intent upon dismantling a lot of the incredibly hard work that so many people like Tim have done over the past 40 years. I really hope that the level of support Tim’s work has received, in reaction to their terribly short-sighted decision, will embarrass them enough to re-educate themselves about the crucial importance of all these things.
Thank you, Ben, for a very useful blog, and it’s great news that the guillemot study will continue for another year. I just want to mention three points:
In the US, NSF runs a specific funding scheme LTREB (Long Term Research in Environmental Biology) to support ‘extended time series of data’, for up to 10 years at a time – based on competitive grant applications. This seems like an excellent idea. NERC have been resistant in the past to suggestions of something similar, but maybe it’s time to try again.
Ben mentions the Rum red deer project, now in its 41st year. Speaking from experience of the work involved with maintaining continuous funding for it, a fourth problem with reliance on short-term grants is that – with grant success rates so low – grant application failures are inevitable at some point, threatening the essential continuity of the data collection. We recently had a NERC grant application rejected because, whilst the science proposed was rated highly, seen as valuable and worthwhile, the continued fieldwork was deemed unnecessary: the panel’s view was that ‘three further years of data collection was unlikely to make a significant contribution to the data already collected’. Ironically, the proposal was to look at the ongoing effects of climate change, which sadly would not have stopped when the funding stopped. (In the end we were lucky, and a submission of a similar proposal a year later was successful.)
This constant headache (for want of a better word!) of maintaining funding via short-term research grants is indeed a primary reason for concern about open access to long-term datasets: the fear that whatever is lined up for the next grant proposal, which possibly might be addressed to some extent with data already collected, will be ‘scooped’, undermining the novelty value that grant-awarding committees value so highly. It would be great if formal recognition of the open-access value of long-term datasets (see suggestions in Roche et al., as cited above) could be used as a way to justify long-term funding for such projects.
As a data user, I appreciate these studies too, and I also wondered whether a body such as the BES would be able to work out a way of funding them. Ideally I think one would want an investment fund to generate the money to support these projects: leaving it to government is too risky.
How many long term projects like this are there in the UK anyway?
Thanks Loeske – I should perhaps have made the UK-centric focus a little clearer. Excellent point about LTREB in the US; are there equivalent schemes elsewhere? I also fully endorse the point about short term funding elevating the risk of failure of continuity. I’m also pleased to see that you agree that open access and long-term funding might be linked.
Very nice blog, thanks!
At some point, you mention that most costs of long-term field studies derive from the salaries of field and research assistants. Even though I agree that this is true, I also know of many examples of field stations (perhaps not in the UK, a case I am less familiar with) where these “salaries” are exceedingly low compared to (i) the work they require and (ii) the responsibility they carry for generating trustworthy data. Grants written to fund such long-term projects usually underestimate the costs associated with having an outstanding field manager, while focusing on costs linked to, say, post-docs – a potential consequence of the funding scheme’s short-term view.
Ideally, finding alternatives to the current funding situation should also take this aspect into consideration.
I cannot support this idea enough. The BES small grants scheme is one way to look at this – not with the present £5K limit and the requirement that a project run for 12 months, but perhaps as a means to fund low-cost long-term projects for 3-5 years. There is nothing like this in Europe except for larger infrastructure proposals, which are then too big. Certainly the BES can’t fund everything, but if they look into it, perhaps with other learned societies getting on board, we can secure the baseline data for important long-term study sites.