Compilation by Robyn Darbyshire
Ricotta, C., F. de Bello, et al. (2016). “Measuring the functional redundancy of biological communities: a quantitative guide.” Methods in Ecology and Evolution 7(11): 1386-1395.
* The preservation of ecosystem processes under ongoing biotic erosion requires that some species within affected communities perform similar functions, a property that is usually defined as functional redundancy. Although functional redundancy has recently become a relevant part of ecological research, so far there is no agreement on its measurement.
* The scope of this work is thus to propose a consistent framework based on functional dissimilarities among species for summarizing different facets of functional redundancy. The behaviour of the proposed measures is illustrated with one small artificial data set, together with actual examples on the species functional turnover along successional gradients.
* We believe this new framework provides an important contribution to the clarification and quantification of key metrics of community redundancy and vulnerability. The method, for which we provide a simple R function called ‘uniqueness’, further allows summarizing the functional contribution of single species to the overall redundancy of any type of biological community.
FULL TEXT LINK: http://dx.doi.org/10.1111/2041-210X.12604
Healey, S. P., C. L. Raymond, et al. (2016). “Root disease can rival fire and harvest in reducing forest carbon storage.” Ecosphere 7(11).
Root diseases are known to suppress forest regeneration and reduce growth rates, and they may become more common as susceptible tree species become maladapted in parts of their historic ranges due to climate change. However, current ecosystem models do not track the effects of root disease on net productivity, and there has been little research on how the dynamics of root disease affect carbon (C) storage and productivity across infected landscapes. We compared the effects of root disease against the effects of other types of forest disturbance across six national forest landscapes, 1990–2011. This was enabled by a monitoring tool called the Forest Carbon Management Framework (ForCaMF), which makes use of ground inventory data, an empirical growth model, and time series of Landsat satellite imagery. Despite several large fires that burned across these landscapes during the study period, retrospective ForCaMF analysis showed that fire and root disease had approximately equal impacts on C storage. Relative to C accumulation that would have occurred in their absence, fires from 1990 to 2011 were estimated to reduce regionwide C storage by 215.3 ± 19.1 g/m2 C, while disease in the same period was estimated to reduce storage by 211.4 ± 59.9 g/m2 C. Harvest (75.5 ± 13.5 g/m2 C) and bark beetle activity (14.8 ± 12.5 g/m2 C) were less important. While long-term disturbance processes such as root disease have generally been ignored by tools informing management of forest C storage, the recent history of several national forests suggests that such disturbances can be just as important to the C cycle as more conspicuous events like wildfires.
FULL TEXT LINK: http://dx.doi.org/10.1002/ecs2.1569
Tsen, E. W. J., T. Sitzia, et al. (2016). “To core, or not to core: the impact of coring on tree health and a best-practice framework for collecting dendrochronological information from living trees.” Biological Reviews 91(4): 899-924.
Trees are natural repositories of valuable environmental information that is preserved in the growth and structure of their stems, branches and roots. Dendrochronological analyses, based on the counting, crossdating and characterisation of incrementally formed wood rings, offer powerful insights for diverse fields including ecology, climatology and archaeology. The application of this toolset is likely to increase in popularity over coming decades due to advances in the field and a reduction in the cost of analyses. In research settings where the continued value of living trees subject to dendrochronological investigation is important, the use of an increment borer to extract trunk tissue is considered the best option to minimise negative impacts on tree health (e.g. stress and fitness). A small and fragmented body of literature, however, reports significant after-effects, and in some cases fatal outcomes, from this sampling technique. As it stands, the literature documenting increment bore coring (IBC) impacts lacks experimental consistency and is poorly replicated, making it difficult for prospective users of the method to assess likely tree responses to coring. This paucity of information has the potential to lead to destructive misuse of the method and also limits its safe implementation in circumstances where the risk of impacts may be acceptable. If IBC is to fulfil its potential as a method of choice across research fields, then we must first address our limited understanding of IBC impacts and provide a framework for its appropriate future use. Firstly, we review the historical context of studies examining the impacts of IBC on trees to identify known patterns, focal issues and biases in existing knowledge. IBC wound responses, particularly those that impact on lumber quality, have been the primary focus of prior studies.
No universal treatment was identified that conclusively improved wound healing, and few studies have linked wound responses to tree health impacts. Secondly, we build on literature insights using a theoretical approach to identify the most important factors to guide future research involving implementation of IBC, including innate tree characteristics and environmental factors. Thirdly, we synthesise and interrogate the quantitative data available through meta-analysis to identify risk factors for wound reactions. Although poor reporting standards, restricted scopes and a bias towards temperate ecosystems limited quantitative insight, we found that complete cambial wound closure could still harbour high rates of internal trunk decay, and that conditions favouring faster growth generally correlated with reduced indices of internal and external damage in broadleaved taxa. Finally, we propose a framework for guiding best-practice application of IBC to address knowledge gaps and maximise the utility of this method, including standardised reporting indices for identifying and minimising negative impacts on tree health. While IBC is an underutilised tool of ecological enquiry with broad applicability, the method will always incur some risk of negative impacts on the cored tree. We caution that the decision to core, or not to core, must be given careful consideration on a case-by-case basis. In time, we are confident that this choice will be better informed by evidence-based insight.
FULL TEXT LINK: http://dx.doi.org/10.1111/brv.12200
Magnússon, R. Í., A. Tietema, et al. (2016). “Sequestration of carbon from coarse woody debris in forest soils.” Forest Ecology and Management 377: 1-15.
Worldwide, forests have absorbed around 30% of global anthropogenic emissions of carbon dioxide (CO2) annually, thereby acting as important carbon (C) sinks. It is proposed that leaving large fragments of dead wood, coarse woody debris (CWD), in forest ecosystems may contribute to the forest C sink strength. CWD may take years to centuries to degrade completely, and non-respired C from CWD may enter the forest soil directly or in the form of dissolved organic C. Although aboveground decomposition of CWD has been studied frequently, little is known about the relative size, composition and fate of different C fluxes from CWD to soils under various substrate-specific and environmental conditions. Thus, the exact contribution of C from CWD to C sequestration within forest soils is poorly understood and quantified, although understanding CWD degradation and stabilization processes is essential for effective forest C sink management. This review aims at providing insight into these processes on the interface of forest ecology and soil science, and identifies knowledge gaps that are critical to our understanding of the effects of CWD on the forest soil C sink. It may be seen as a “call-to-action” crossing disciplinary boundaries, which proposes the use of compound-specific analytical studies and manipulation studies to elucidate C fluxes from CWD. Carbon fluxes from decaying CWD can vary considerably due to interspecific and intraspecific differences in composition and different environmental conditions. These variations in C fluxes need to be studied in detail and related to recent advances in soil C sequestration research. Outcomes of this review show that the presence of CWD may enhance the abundance and diversity of the microbial community and constitute additional fluxes of C into the mineral soil by augmented leaching of dissolved organic carbon (DOC). 
Leached DOC and residues from organic matter (OM) from later decay stages have been shown to be relatively enriched in complex and microbial-derived compounds, which may also be true for CWD-derived OM. Emerging knowledge on soil C stabilization indicates that such complex compounds may be sorbed preferentially to the mineral soil. Moreover, increased abundance and diversity of decomposer organisms may increase the amount of substrate C being diverted into microbial biomass, which may contribute to stable C pools in the forest soil.
Marion, J. L. (2016). “A Review and Synthesis of Recreation Ecology Research Supporting Carrying Capacity and Visitor Use Management Decisionmaking.” Journal of Forestry 114(3): 339-351.
Resource and experiential impacts associated with visitation to wilderness and other similar backcountry settings have long been addressed by land managers under the context of ‘carrying capacity’ decisionmaking. Determining a maximum level of allowable use, below which high-quality resource and experiential conditions would be sustained, was an early focus in the 1960s and 1970s. However, decades of recreation ecology research have shown that the severity and areal extent of visitor impact problems are influenced by an interrelated array of use-related, environmental, and managerial factors. This complexity, together with similar findings from social science research, prompted scientists and managers to develop more comprehensive carrying capacity frameworks, including a new Visitor Use Management framework. These frameworks rely on a diverse array of management strategies and actions, often termed a ‘management toolbox’, for resolving visitor impact problems. This article reviews the most recent and relevant recreation ecology studies that have been applied in wildland settings to avoid or minimize resource impacts. The key findings and their management implications are highlighted to support the professional management of common trail, recreation site, and wildlife impact problems. These studies illustrate the need to select from a more diverse array of impact management strategies and actions based on an evaluation of problems to identify the most influential factors that can be manipulated.
Management and Policy Implications: Wildland managers struggle to balance their resource protection and recreation provision objectives. Over the course of six decades, the recreation carrying capacity concept has been repeatedly applied and revised as a management tool, evolving from a simplistic focus on fixed visitation limits to comprehensive decisionmaking frameworks focused on sustaining high-quality recreational opportunities. Recreation ecology studies investigating relationships between amount of visitor use and the magnitude of resource impacts consistently find that use and impact are strongly related only at initial and low levels of visitation, with weak correlations at higher use levels. However, unacceptable resource impacts often occur on well-established and heavily used trails and recreation sites: reducing use to improve their condition is generally an ineffective practice. An increasing number of recreation ecology studies describe the efficacy of alternative management interventions, including the siting, design, construction, and maintenance of more sustainable trails and recreation sites, the spatial and temporal redistribution of visitor use, and persuasive communication or regulations that encourage visitors to apply low-impact practices.