PubTransformer

A site that transforms PubMed publications into the following bibliographic reference formats: ADS, BibTeX, EndNote, ISI (used by the Web of Knowledge), RIS, MEDLINE, and Microsoft Word 2007 XML.

Models, Statistical - Top 30 Publications

Can Economic Model Transparency Improve Provider Interpretation of Cost-effectiveness Analysis? Evaluating Tradeoffs Presented by the Second Panel on Cost-effectiveness in Health and Medicine.

The Second Panel on Cost-Effectiveness in Health and Medicine convened on December 7, 2016 at the National Academy of Medicine to disseminate its recommendations for the conduct, methodological practices, and reporting of cost-effectiveness analyses (CEAs). Following the summary, the panel proceedings included lengthy discussions, among them the field's struggle to disseminate findings efficiently through the peer-reviewed literature to target audiences. With editors of several medical and outcomes research journals in attendance, there was consensus that the findings of cost-effectiveness analyses do not effectively reach other researchers or health care providers. Audience members suggested several solutions, including providing additional training in cost-effectiveness research to clinicians and requiring that cost-effectiveness models be made publicly available. However, there remain the questions of whether making economic modelers' work open access through journals is fair, given that these models are the modelers' own intellectual property, and of whether journals can properly manage the peer-review process specifically for cost-effectiveness analyses. In this article, we elaborate on these issues and suggest solutions that may increase the dissemination and application of the cost-effectiveness literature so that it reaches its intended audiences and ultimately benefits the patient. Ultimately, it is our combined view as economic modelers and clinicians that cost-effectiveness results need to reach the clinician to improve the efficiency of medical practice, but that open-access models do not improve clinician access to, or interpretation of, the economics of medicine.

State-dependent metabolic partitioning and energy conservation: A theoretical framework for understanding the function of sleep.

Metabolic rate reduction has been considered the mechanism by which sleep conserves energy, similar to torpor or hibernation. This mechanism of energy savings is in conflict with the known upregulation (compared to wake) of diverse functions during sleep and neglects a potential role in energy conservation for partitioning of biological operations by behavioral state. Indeed, energy savings derived from state-dependent resource allocation have yet to be examined. A mathematical model is presented based on relative rates of energy deployment for biological processes upregulated during either wake or sleep. Using this model, the energy saved by sleep-wake cycling over constant wakefulness is computed by comparing stable limit cycles for systems of differential equations. A primary objective is to compare the potential energy savings derived from state-dependent metabolic partitioning versus metabolic rate reduction. Energy conservation from sleep quota and the circadian system is also quantified in relation to a continuous wake condition. As a function of metabolic partitioning, our calculations show that coupling of metabolic operations with behavioral state may provide comparatively greater energy savings than the measured decrease in metabolic rate, suggesting that the actual energy savings derived from sleep may be more than 4-fold greater than previous estimates. A combination of state-dependent metabolic partitioning and modest metabolic rate reduction during sleep may enhance energy savings beyond what is achievable through metabolic partitioning alone; however, the relative contribution from metabolic partitioning diminishes as metabolic rate is decreased during the rest phase. Sleep quota and the circadian system further augment energy savings in the model. Finally, we propose that state-dependent resource allocation underpins both sleep homeostasis and the optimization of daily energy conservation across species. This new paradigm identifies an evolutionary selective advantage for the upregulation of central and peripheral biological processes during sleep, presenting a unifying construct to understand sleep function.
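
As a back-of-the-envelope illustration of the partitioning argument (not the paper's limit-cycle computation), the sketch below compares 24-hour energy expenditure under sleep-wake cycling against continuous wakefulness; all rates are hypothetical placeholders.

```python
# Toy comparison of 24 h energy budgets: state-dependent metabolic
# partitioning plus a modest sleep metabolic-rate drop vs. continuous
# wakefulness. All rates are illustrative, not the paper's values.

HOURS_WAKE, HOURS_SLEEP = 16.0, 8.0

BASAL = 1.00          # basal metabolic rate (arbitrary units/hour)
WAKE_OPS = 0.40       # cost rate of wake-upregulated functions
SLEEP_OPS = 0.30      # cost rate of sleep-upregulated functions
SLEEP_MR_DROP = 0.15  # measured metabolic-rate reduction during sleep

# Cycling: each state runs mainly its own operations (partitioning),
# and sleep additionally lowers the basal rate.
e_cycling = (HOURS_WAKE * (BASAL + WAKE_OPS)
             + HOURS_SLEEP * ((1 - SLEEP_MR_DROP) * BASAL + SLEEP_OPS))

# Continuous wake: both sets of operations must run concurrently,
# so neither partitioning nor the rate reduction is available.
e_constant = 24.0 * (BASAL + WAKE_OPS + SLEEP_OPS)

# For contrast: metabolic-rate reduction alone, without partitioning.
e_mr_only = e_constant - HOURS_SLEEP * SLEEP_MR_DROP * BASAL

print(f"constant wake:        {e_constant:.2f}")
print(f"rate reduction alone: {e_mr_only:.2f} "
      f"({100 * (e_constant - e_mr_only) / e_constant:.1f}% saved)")
print(f"cycling+partitioning: {e_cycling:.2f} "
      f"({100 * (e_constant - e_cycling) / e_constant:.1f}% saved)")
```

In the paper the analogous comparison is made on stable limit cycles of coupled differential equations rather than fixed hourly rates.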

The EASI model: A first integrative computational approximation to the natural history of COPD.

The natural history of chronic obstructive pulmonary disease (COPD) is still not well understood. Traditionally regarded as a self-inflicted disease caused by smoking, COPD is now known to be more complex: not all smokers develop COPD, inhaled pollutants other than cigarette smoke can also cause it, and abnormal lung development can likewise lead to COPD in adulthood. The inflammatory response that characterizes COPD also varies significantly between patients, and not all of them perceive symptoms (mostly breathlessness) similarly. To investigate the variability and determinants of different "individual natural histories" of COPD, we developed a theoretical, multi-stage, computational model of COPD (EASI) that dynamically integrates, and graphically represents, the relationships between exposure (E) to inhaled particles and gases (smoking), the biological activity (inflammatory response) of the disease (A), the severity (S) of airflow limitation (FEV1), and the impact (I) of the disease (breathlessness) in different clinical scenarios. EASI shows that the relationships between E, A, S and I vary markedly within individuals (through life) and between individuals (at the same age). It also helps to delineate some potentially relevant but often overlooked concepts, such as disease progression, susceptibility to COPD, and issues related to symptom perception. In conclusion, EASI is an initial conceptual model for interpreting the longitudinal and cross-sectional relationships between E, A, S and I in different clinical scenarios. It currently has no direct clinical application, so it requires experimental validation and further mathematical development; however, it has the potential to open novel research and teaching alternatives.

A Proposal for Progression-Free Survival Assessment in Patients with Early Progressive Cancer.

Progression-free survival (PFS), an endpoint evaluated in oncology clinical trials, is determined from tumor progression assessed according to a fixed schedule. Median PFS and the hazard ratio (HR) for PFS may therefore be biased by the assessment schedule; we examined this possibility with reference to randomized controlled trials (RCTs) in patients with metastatic colorectal cancer.
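
A small simulation makes the claimed mechanism concrete: progression is only recorded at scheduled scans, so recorded PFS is rounded up to the next assessment. The exponential event times and scan intervals below are assumptions for illustration, not trial data.

```python
# Simulation sketch of assessment-schedule bias in median PFS.
import numpy as np

rng = np.random.default_rng(0)
true_pfs = rng.exponential(scale=6.0, size=100_000)  # months

def observed_median(interval):
    # progression is recorded at the first assessment after the event
    observed = np.ceil(true_pfs / interval) * interval
    return np.median(observed)

for interval in (1.0, 2.0, 3.0):  # scans every 1 / 2 / 3 months
    print(f"{interval:.0f}-monthly scans: median PFS "
          f"{observed_median(interval):.2f} vs true {np.median(true_pfs):.2f}")
```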

Identifying parameter regions for multistationarity.

Mathematical modelling has become an established tool for studying the dynamics of biological systems. Current applications range from building models that reproduce quantitative data to identifying systems with predefined qualitative features, such as switching behaviour, bistability or oscillations. Mathematically, the latter question amounts to identifying parameter values associated with a given qualitative feature. We introduce a procedure to partition the parameter space of a parameterized system of ordinary differential equations into regions for which the system has a unique equilibrium or multiple equilibria. The procedure is based on the computation of the Brouwer degree, and it creates a multivariate polynomial with parameter-dependent coefficients. The signs of the coefficients determine parameter regions with and without multistationarity. A particular strength of the procedure is the avoidance of numerical analysis and parameter sampling. The procedure consists of a number of steps. Each of these steps might be addressed algorithmically using various computer programs and available software, or manually. We demonstrate our procedure on several models of gene transcription and cell signalling, and show that in many cases we obtain a complete partitioning of the parameter space with respect to multistationarity.
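
The sketch below is not the Brouwer-degree construction itself; it merely visualises, by brute force, the question the procedure answers symbolically: how a parameter plane splits into regions with one versus multiple positive equilibria, here for a hypothetical positive-feedback motif dx/dt = a + b*x^2/(1 + x^2) - x.

```python
# Brute-force illustration of partitioning a parameter space by the
# number of positive equilibria. Steady states of the toy motif are
# the positive roots of  -x**3 + (a + b)*x**2 - x + a = 0.
import numpy as np

def n_positive_equilibria(a, b, tol=1e-9):
    roots = np.roots([-1.0, a + b, -1.0, a])
    return sum(1 for r in roots if abs(r.imag) < tol and r.real > tol)

for a in (0.05, 0.15, 0.4):
    for b in (1.0, 2.0, 3.0):
        print(f"a={a:<4} b={b}: {n_positive_equilibria(a, b)} equilibria")
```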

Cost Effectiveness of Pembrolizumab for Advanced Melanoma Treatment in Portugal.

The aim of this study was to assess the cost-effectiveness of pembrolizumab in treating patients with ipilimumab-naïve advanced melanoma in Portugal.

A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data.

The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective judgment of graphs and point estimates.

Cost-Effectiveness Analysis of Natriuretic Peptide Testing and Specialist Management in Patients with Suspected Acute Heart Failure.

To determine the cost-effectiveness of natriuretic peptide (NP) testing and specialist outreach in patients with acute heart failure (AHF) residing off the cardiology ward.

Effect of the Salary Model on Sustainability of a Professional Practice Environment.

This replication study examined differences in RN perception of the professional practice environment (PPE) between salary- and hourly-wage compensation models over time.

How much will it cost to eradicate lymphatic filariasis? An analysis of the financial and economic costs of intensified efforts against lymphatic filariasis.

Lymphatic filariasis (LF), a neglected tropical disease (NTD) preventable through mass drug administration (MDA), is one of six diseases deemed possibly eradicable. Previously we developed one LF elimination scenario, which assumes MDA scale-up to continue in all countries that have previously undertaken MDA. In contrast, our three previously developed eradication scenarios assume all LF endemic countries will undertake MDA at an average (eradication I), fast (eradication II), or instantaneous (eradication III) rate of scale-up. In this analysis we use a micro-costing model to project the financial and economic costs of each of these scenarios in order to provide evidence to decision makers about the investment required to eliminate and eradicate LF.

Research Techniques Made Simple: Choosing Appropriate Statistical Methods for Clinical Research.

The statistical significance of results is an important component of drawing appropriate conclusions in a study. Choosing the correct statistical test to analyze results is essential to interpreting the validity of the study and centers on defining the study variables and the purpose of the analysis. The complexity of statistical modeling makes this a daunting task, so we propose a basic algorithmic approach as an initial step in determining what statistical method will be appropriate for a particular clinical study.
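
A minimal sketch of what such an algorithmic approach can look like; the mapping below is a common textbook heuristic, not the authors' published algorithm.

```python
# Toy test-selection heuristic keyed on outcome type, number of
# groups, and pairing; a real algorithm would also check normality,
# sample size, and covariate adjustment needs.
def suggest_test(outcome, n_groups, paired=False):
    """outcome: 'continuous' or 'categorical'; n_groups: int."""
    if outcome == "continuous":
        if n_groups == 2:
            return "paired t-test" if paired else "two-sample t-test"
        return "repeated-measures ANOVA" if paired else "one-way ANOVA"
    if outcome == "categorical":
        if paired:
            return "McNemar's test"
        return "chi-squared test (Fisher's exact if counts are small)"
    raise ValueError("unknown outcome type")

print(suggest_test("continuous", 2))   # two-sample t-test
print(suggest_test("categorical", 3))  # chi-squared test ...
```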

Models to predict relapse in psychosis: A systematic review.

There is little evidence on the accuracy of psychosis relapse prediction models. Our objective was to undertake a systematic review of relapse prediction models in psychosis.

Probabilistic models for neural populations that naturally capture global coupling and criticality.

Advances in multi-unit recordings pave the way for statistical modeling of activity patterns in large neural populations. Recent studies have shown that the summed activity of all neurons strongly shapes the population response. A separate recent finding has been that neural populations also exhibit criticality, an anomalously large dynamic range for the probabilities of different population activity patterns. Motivated by these two observations, we introduce a class of probabilistic models which takes into account the prior knowledge that the neural population could be globally coupled and close to critical. These models consist of an energy function which parametrizes interactions between small groups of neurons, and an arbitrary positive, strictly increasing, and twice differentiable function which maps the energy of a population pattern to its probability. We show that: 1) augmenting a pairwise Ising model with a nonlinearity yields an accurate description of the activity of retinal ganglion cells which outperforms previous models based on the summed activity of neurons; 2) prior knowledge that the population is critical translates to prior expectations about the shape of the nonlinearity; 3) the nonlinearity admits an interpretation in terms of a continuous latent variable globally coupling the system whose distribution we can infer from data. Our method is independent of the underlying system's state space; hence, it can be applied to other systems such as natural scenes or amino acid sequences of proteins which are also known to exhibit criticality.
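
A minimal sketch of the model class for a toy population: a pairwise energy function passed through a positive, strictly increasing, twice-differentiable nonlinearity before normalisation. The fields, couplings, and softplus nonlinearity are placeholders, not parameters fit to retinal data.

```python
# Pairwise "energy" over binary population patterns, mapped through a
# nonlinearity f and normalised into a distribution.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 8                                          # tiny population
h = rng.normal(0.0, 0.5, n)                    # single-cell fields
J = np.triu(rng.normal(0.0, 0.2, (n, n)), 1)   # pairwise couplings

def energy(x):
    return h @ x + x @ J @ x

def f(E):
    # softplus: positive, strictly increasing, twice differentiable;
    # replacing it with exp(E) recovers a plain pairwise Ising model
    return np.log1p(np.exp(E))

patterns = np.array(list(itertools.product([0, 1], repeat=n)))
weights = np.array([f(energy(x)) for x in patterns])
probs = weights / weights.sum()

# the summed population activity that global coupling helps capture
k = patterns.sum(axis=1)
for kk in range(n + 1):
    print(f"P(K={kk}) = {probs[k == kk].sum():.4f}")
```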

Stochastic principles governing alternative splicing of RNA.

The dominance of the major transcript isoform relative to other isoforms from the same gene generated by alternative splicing (AS) is essential to the maintenance of normal cellular physiology. However, the underlying principles that determine such dominance remain unknown. Here, we analyzed the physical AS process and found that it can be modeled by a stochastic minimization process, which causes the scaled expression levels of all transcript isoforms to follow the same Weibull extreme value distribution. Surprisingly, we also found a simple equation to describe the median frequency of transcript isoforms of different dominance. This two-parameter Weibull model provides the statistical distribution of all isoforms of all transcribed genes, and reveals that previously unexplained observations concerning relative isoform expression derive from these principles.
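
The connection between minimisation and the Weibull law can be demonstrated in a few lines: minima of many independent positive variables are approximately Weibull-distributed, which is the extreme-value fact the model rests on. The simulation settings are illustrative, not the paper's fitted parameters.

```python
# Minima of many iid positive random variables follow a Weibull
# extreme-value law; a stochastic minimisation process for splicing
# therefore predicts Weibull-distributed scaled expression levels.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
samples = rng.uniform(0, 1, size=(50_000, 25)).min(axis=1)

shape, loc, scale = stats.weibull_min.fit(samples, floc=0)
D, _ = stats.kstest(samples, "weibull_min", args=(shape, loc, scale))
print(f"fitted shape={shape:.2f}, scale={scale:.4f}, KS distance={D:.4f}")
```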

Projecting social contact matrices in 152 countries using contact surveys and demographic data.

Heterogeneities in contact networks have a major effect in determining whether a pathogen can become epidemic or persist at endemic levels. Epidemic models that determine which interventions can successfully prevent an outbreak need to account for social structure and mixing patterns. Contact patterns vary across age and locations (e.g. home, work, and school), and including them as predictors in transmission dynamic models of pathogens that spread socially will improve the models' realism. Data from population-based contact diaries in eight European countries from the POLYMOD study were projected to 144 other countries using a Bayesian hierarchical model that estimated the proclivity of age-and-location-specific contact patterns for the countries, using Markov chain Monte Carlo simulation. Household level data from the Demographic and Health Surveys for nine lower-income countries and socio-demographic factors from several on-line databases for 152 countries were used to quantify similarity of countries to estimate contact patterns in the home, work, school and other locations for countries for which no contact data are available, accounting for demographic structure, household structure where known, and a variety of metrics including workforce participation and school enrolment. Contacts are highly assortative with age across all countries considered, but pronounced regional differences in the age-specific contacts at home were noticeable, with more inter-generational contacts in Asian countries than in other settings. Moreover, there were variations in contact patterns by location, with work-place contacts being least assortative. These variations led to differences in the effect of social distancing measures in an age structured epidemic model. Contacts have an important role in transmission dynamic models that use contact rates to characterize the spread of contact-transmissible diseases. This study provides estimates of mixing patterns for societies for which contact data such as POLYMOD are not yet available.
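
The practical payoff of setting-specific matrices is that interventions can be applied per location. The toy sketch below, with made-up three-age-class matrices rather than the paper's estimates, shows how closing schools and halving work and other contacts changes the spectral radius of a next-generation-style matrix.

```python
# R0 scales with the spectral radius of the next-generation matrix
# built from contact rates. Matrices and q are illustrative only.
import numpy as np

# contacts per day between 3 age classes (children, adults, elderly)
home   = np.array([[2.0, 1.5, 0.3], [1.5, 1.0, 0.4], [0.3, 0.4, 0.8]])
school = np.array([[6.0, 0.5, 0.0], [0.5, 0.2, 0.0], [0.0, 0.0, 0.0]])
work   = np.array([[0.0, 0.0, 0.0], [0.0, 4.0, 0.1], [0.0, 0.1, 0.0]])
other  = np.array([[1.0, 1.0, 0.3], [1.0, 2.0, 0.5], [0.3, 0.5, 1.0]])

q = 0.05  # assumed transmission probability per contact

def r0(contacts, infectious_days=5.0):
    return np.linalg.eigvals(q * contacts * infectious_days).real.max()

baseline = home + school + work + other
distanced = home + 0.0 * school + 0.5 * work + 0.5 * other
print(f"R0 baseline  = {r0(baseline):.2f}")
print(f"R0 distanced = {r0(distanced):.2f}")
```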

Correlation among baseline variables yields non-uniformity of p-values.

A recent paper in Neurology used statistical techniques to investigate the integrity of the randomization in 33 clinical trials conducted by a group of investigators. Without justification, the approach assumed that there would be no impact of correlation among baseline variables. We investigated the impact of correlation on the conclusions of the approach in several large-scale simulation studies that replicated the sample sizes and baseline variables of the clinical trials in question and utilized proper randomization. Additionally, we considered scenarios with larger numbers of baseline variables. We found that, with even moderate correlation, there can be substantial inflation of the type I error of statistical tests of randomization integrity. This is also the case under no correlation, in the presence of some discrete baseline variables, with a large number of variables. Thus, statistical techniques for assessing randomization integrity should be applied with extreme caution given that very low p-values, which are taken as evidence against valid randomization, can arise even in the case of valid randomization, in the presence of correlation. More generally, the use of tests of goodness of fit to uniformity for the purpose of testing a global null hypothesis is not advisable in the presence of correlation.
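
The core finding is easy to reproduce in a simulation sketch: under valid randomization each baseline p-value is uniform marginally, but correlation between variables breaks the independence assumed by a goodness-of-fit test of joint uniformity. Sample sizes, the number of variables, and the correlation below are illustrative.

```python
# Valid randomization + correlated baseline variables => inflated
# type I error for a uniformity test applied across the p-values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_per_arm, n_vars, rho, n_trials = 100, 20, 0.5, 2000

cov = np.full((n_vars, n_vars), rho) + (1 - rho) * np.eye(n_vars)
L = np.linalg.cholesky(cov)

rejections = 0
for _ in range(n_trials):
    a = rng.standard_normal((n_per_arm, n_vars)) @ L.T
    b = rng.standard_normal((n_per_arm, n_vars)) @ L.T
    pvals = stats.ttest_ind(a, b).pvalue      # one p-value per variable
    if stats.kstest(pvals, "uniform").pvalue < 0.05:
        rejections += 1

print(f"type I error of the uniformity test: {rejections / n_trials:.3f}")
# with rho = 0 this sits near 0.05; with rho = 0.5 it is far larger
```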

An injury mortality prediction based on the anatomic injury scale.

To determine whether the injury mortality prediction (IMP) statistically outperforms the trauma mortality prediction model (TMPM) as a predictor of mortality. The TMPM, which is based on anatomic injury, is currently the best trauma score method; its mortality prediction is superior to the injury severity score (ISS) and the new injury severity score (NISS). However, despite its statistical significance, the predictive power of the TMPM needs further improvement. This retrospective cohort study is based on data from 1,148,359 injured patients in the National Trauma Data Bank hospitalized from 2010 to 2011. Sixty percent of the data was used to derive an empirical measure of severity for the different Abbreviated Injury Scale predot codes by taking the weighted average death probabilities of trauma patients. Twenty percent of the data was used to build the computing method of the IMP model. The remaining 20% was used to evaluate the statistical performance of the IMP, which was then compared with the TMPM and the single worst injury by examining the area under the receiver operating characteristic curve (ROC), the Hosmer-Lemeshow (HL) statistic, and the Akaike information criterion. IMP exhibited significantly better discrimination (ROC-IMP, 0.903 [0.899-0.907] vs ROC-TMPM, 0.890 [0.886-0.895]) and calibration (HL-IMP, 9.9 [4.4-14.7] vs HL-TMPM, 197 [143-248]) than the TMPM. All models showed slight changes after the extension with age, gender, and mechanism of injury, but the extended IMP still dominated the TMPM in every performance measure. The IMP improves on the discrimination and calibration of the TMPM and can accurately predict mortality; we therefore consider it a feasible new scoring method for trauma research.
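
For readers unfamiliar with the two evaluation axes used here, the sketch below computes an AUC (discrimination) and a Hosmer-Lemeshow-style chi-squared statistic (calibration) for two simulated risk scores; the data are synthetic, not the National Trauma Data Bank.

```python
# AUC measures ranking of deaths vs survivors; the HL-style statistic
# measures agreement between predicted and observed risk by decile.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 20_000
true_p = rng.beta(0.5, 8.0, n)            # simulated death risk
death = rng.random(n) < true_p
model_a = true_p                           # well-calibrated score
model_b = np.clip(true_p * 1.5, 0, 1)      # miscalibrated variant

def hosmer_lemeshow(p, y, groups=10):
    edges = np.quantile(p, np.linspace(0, 1, groups + 1))
    idx = np.clip(np.searchsorted(edges, p) - 1, 0, groups - 1)
    chi2 = 0.0
    for g in range(groups):
        m = idx == g
        obs, exp = y[m].sum(), p[m].sum()
        chi2 += (obs - exp) ** 2 / (exp * (1 - p[m].mean()) + 1e-12)
    return chi2

for name, p in [("A", model_a), ("B", model_b)]:
    print(f"model {name}: AUC={roc_auc_score(death, p):.3f}, "
          f"HL chi2={hosmer_lemeshow(p, death):.1f}")
```

The two scores discriminate almost identically while calibrating very differently, which is exactly the distinction the IMP/TMPM comparison turns on.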

Link prediction based on non-negative matrix factorization.

With the rapid expansion of the internet, complex networks have become high-dimensional, sparse, and redundant. The problem of link prediction in such networks has attracted increasing attention from domains such as information science, anthropology, sociology, and computer science, creating demand for effective link prediction techniques that extract the most essential and relevant information for online users. This paper therefore puts forward a link prediction algorithm based on non-negative matrix factorization. In the algorithm, we reconstruct the correlations between different matrices through the projection of a high-dimensional vector space onto a low-dimensional one, and then use the similarity between the column vectors of the weight matrix as the scoring matrix. Experimental results demonstrate that the algorithm not only reduces data storage space but also effectively improves prediction performance while sustaining low time complexity.
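
A minimal sketch of the factorisation step: factor the adjacency matrix with non-negative matrix factorization and score currently absent pairs from the low-rank reconstruction. Using W @ H directly is the simplest scoring variant and stands in for the column-similarity scheme described; the random graph is illustrative.

```python
# NMF-based link prediction on a toy undirected graph.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)
n = 30
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T               # undirected, no self-loops

model = NMF(n_components=5, init="nndsvda", max_iter=500)
W = model.fit_transform(A)
scores = W @ model.components_               # scoring matrix

# rank currently-absent pairs by score as candidate links
cand = [(scores[i, j], i, j) for i in range(n) for j in range(i + 1, n)
        if A[i, j] == 0]
for s, i, j in sorted(cand, reverse=True)[:5]:
    print(f"predicted link ({i},{j}) score={s:.3f}")
```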

Robust Adaptive Lasso method for parameter's estimation and variable selection in high-dimensional sparse models.

High-dimensional data are commonly encountered in various scientific fields and pose great challenges to modern statistical analysis. To address this issue, different penalized regression procedures have been introduced in the literature, but these methods cannot cope with outliers and leverage points in heavy-tailed high-dimensional data. For this purpose, a new Robust Adaptive Lasso (RAL) method is proposed, based on a Pearson-residual weighting scheme. The weight function determines the compatibility of each observation and downweights those that are inconsistent with the assumed model. The RAL estimator can correctly select the covariates with non-zero coefficients and simultaneously estimate the parameters, not only in the presence of influential observations but also under high multicollinearity. We also discuss the model selection oracle property and the asymptotic normality of the RAL. Simulation findings and real data examples demonstrate the better performance of the proposed penalized regression approach.
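
A hedged reconstruction of the two ingredients described, not the authors' exact estimator: observation weights derived from standardised residuals of an initial fit, combined with adaptive-lasso penalty weights 1/|initial coefficient|, both absorbed into a transformed design so an ordinary lasso solver applies.

```python
# Robust + adaptive lasso sketch on simulated data with gross outliers.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(6)
n, p = 100, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p); beta[:3] = [3.0, -2.0, 1.5]
y = X @ beta + rng.standard_normal(n)
y[:5] += 15.0                                  # gross outliers

init = Ridge(alpha=1.0).fit(X, y)
r = y - init.predict(X)
r_std = r / np.median(np.abs(r - np.median(r))) / 1.4826   # robust scale
w_obs = np.where(np.abs(r_std) <= 2.0, 1.0, 2.0 / np.abs(r_std))
w_pen = 1.0 / (np.abs(init.coef_) + 1e-3)      # adaptive penalty weights

# absorb both weight sets into a transformed design, then run lasso
sw = np.sqrt(w_obs)
Xt = (X * sw[:, None]) / w_pen[None, :]
yt = y * sw
fit = Lasso(alpha=0.1, fit_intercept=False, max_iter=50_000).fit(Xt, yt)
beta_hat = fit.coef_ / w_pen
print("selected covariates:", np.flatnonzero(np.abs(beta_hat) > 1e-6))
```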

A method of determining where to target surveillance efforts in heterogeneous epidemiological systems.

The spread of pathogens into new environments poses a considerable threat to human, animal, and plant health, and by extension, human and animal wellbeing, ecosystem function, and agricultural productivity, worldwide. Early detection through effective surveillance is a key strategy to reduce the risk of their establishment. Whilst it is well established that statistical and economic considerations are of vital importance when planning surveillance efforts, it is also important to consider epidemiological characteristics of the pathogen in question, including heterogeneities within the epidemiological system itself. One of the most pronounced realisations of this heterogeneity is seen in the case of vector-borne pathogens, which spread between 'hosts' and 'vectors', each group possessing distinct epidemiological characteristics. As a result, an important question when planning surveillance for emerging vector-borne pathogens is where to place sampling resources in order to detect the pathogen as early as possible. We answer this question by developing a statistical function which describes the probability distributions of the prevalences of infection at first detection in both hosts and vectors. We also show how this method can be adapted in order to maximise the probability of early detection of an emerging pathogen within imposed sample size and/or cost constraints, and demonstrate its application using two simple models of vector-borne citrus pathogens. Under the assumption of a linear cost function, we find that sampling costs are generally minimised when either hosts or vectors, but not both, are sampled.
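
The constrained-allocation question can be sketched directly: choose host and vector sample sizes under a linear cost budget to maximise the probability of at least one detection. Prevalences and unit costs below are assumptions for illustration.

```python
# Budget-constrained sampling allocation between hosts and vectors.
p_host, p_vector = 0.02, 0.005      # prevalence at time of survey
c_host, c_vector = 5.0, 1.0         # cost per sample
budget = 500.0

best = (0.0, 0, 0)
for n_h in range(int(budget / c_host) + 1):
    n_v = int((budget - n_h * c_host) / c_vector)
    p_detect = 1 - (1 - p_host) ** n_h * (1 - p_vector) ** n_v
    best = max(best, (p_detect, n_h, n_v))

p, n_h, n_v = best
print(f"sample {n_h} hosts and {n_v} vectors: P(detect) = {p:.3f}")
# with a linear cost function the optimum typically sits at a corner,
# all budget on hosts or all on vectors, echoing the paper's finding
```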

Can quartet analyses combining maximum likelihood estimation and Hennigian logic overcome long branch attraction in phylogenomic sequence data?

Systematic biases such as long branch attraction can mislead commonly relied-upon model-based (i.e. maximum likelihood and Bayesian) phylogenetic methods when, as is usually the case with empirical data, there is model misspecification. We present PhyQuart, a new method for evaluating the three possible binary trees for any quartet of taxa. PhyQuart was developed through a process of reciprocal illumination between a priori considerations and the results of extensive simulations. It is based on the identification of site-patterns that can be considered to support a particular quartet tree, taking into account the Hennigian distinction between apomorphic and plesiomorphic similarity, and employing corrections to the raw observed frequencies of site-patterns that exploit expectations from maximum likelihood estimation. We demonstrate through extensive simulation experiments that, whereas maximum likelihood estimation performs well in many cases, it can be outperformed by PhyQuart in cases where it fails due to extreme branch-length asymmetries producing long-branch attraction artefacts, even where there is only very minor model misspecification.
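
For orientation, the sketch below counts the classic binary site-patterns (xxyy, xyxy, xyyx) that support each of the three quartet topologies; PhyQuart's contribution, the apomorphy-aware corrections to these raw frequencies, is not reproduced here, and the four-taxon alignment is hypothetical.

```python
# Raw quartet site-pattern counting for taxa (A, B, C, D): the
# patterns xxyy / xyxy / xyyx support AB|CD, AC|BD, AD|BC.
from collections import Counter

aln = {  # hypothetical four-taxon alignment
    "A": "ACGTTACGGA",
    "B": "ACGTTACGAA",
    "C": "ACGATTCGGT",
    "D": "ACGATTCGAT",
}

counts = Counter()
for a, b, c, d in zip(*aln.values()):
    if len({a, b, c, d}) != 2:           # keep only binary patterns
        continue
    if a == b and c == d: counts["AB|CD"] += 1
    elif a == c and b == d: counts["AC|BD"] += 1
    elif a == d and b == c: counts["AD|BC"] += 1

print(counts)
```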

A Comparison of Agent-Based Models and the Parametric G-Formula for Causal Inference.

Decision-making requires choosing from treatments on the basis of correctly estimated outcome distributions under each treatment. In the absence of randomized trials, 2 possible approaches are the parametric g-formula and agent-based models (ABMs). The g-formula has been used exclusively to estimate effects in the population from which data were collected, whereas ABMs are commonly used to estimate effects in multiple populations, necessitating stronger assumptions. Here, we describe potential biases that arise when ABM assumptions do not hold. To do so, we estimated 12-month mortality risk in simulated populations differing in prevalence of an unknown common cause of mortality and a time-varying confounder. The ABM and g-formula correctly estimated mortality and causal effects when all inputs were from the target population. However, whenever any inputs came from another population, the ABM gave biased estimates of mortality-and often of causal effects even when the true effect was null. In the absence of unmeasured confounding and model misspecification, both methods produce valid causal inferences for a given population when all inputs are from that population. However, ABMs may result in bias when extrapolated to populations that differ on the distribution of unmeasured outcome determinants, even when the causal network linking variables is identical.
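
A minimal parametric g-formula sketch on simulated data (not the paper's ABM comparison): fit an outcome model, then standardise its predictions over a chosen confounder distribution. Varying that distribution shows how transported estimates shift when the target population differs from the source.

```python
# Parametric g-formula: model the outcome, then average predictions
# over the target population's confounder distribution.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 50_000
L = rng.binomial(1, 0.3, n)                         # confounder
A = rng.binomial(1, 0.2 + 0.4 * L, n)               # treatment
Y = rng.binomial(1, 0.05 + 0.10 * A + 0.15 * L, n)  # 12-month death

X = sm.add_constant(np.column_stack([A, L]))
fit = sm.GLM(Y, X, family=sm.families.Binomial()).fit()

def mean_risk(a, p_L):
    # standardise model predictions over confounder prevalence p_L
    risks = [fit.predict([1, a, l])[0] for l in (0, 1)]
    return (1 - p_L) * risks[0] + p_L * risks[1]

for p_L in (0.3, 0.7):   # source vs hypothetical target population
    print(f"p(L)={p_L}: risk difference "
          f"{mean_risk(1, p_L) - mean_risk(0, p_L):.4f}")
```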

Analyzing and Predicting Visual Acuity Outcomes of Anti-VEGF Therapy by a Longitudinal Mixed Effects Model of Imaging and Clinical Data.

We develop a longitudinal statistical model describing best-corrected visual acuity (BCVA) changes in anti-VEGF therapy in relation to imaging data, and predict the future BCVA outcome for individual patients by combining population-wide trends and initial subject-specific time points.
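
A sketch of the general model family on simulated data: a population-level time trend plus subject-specific random intercepts and slopes, the structure that lets early individual visits be combined with cohort-wide trends. Variable names and effect sizes are hypothetical, not the study's fitted model.

```python
# Linear mixed-effects model: fixed time trend + random intercept/slope.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n_pat, n_vis = 60, 6
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n_pat), n_vis),
    "month": np.tile(np.arange(n_vis), n_pat),
})
u0 = rng.normal(0, 5, n_pat)           # subject-specific intercepts
u1 = rng.normal(0, 1, n_pat)           # subject-specific slopes
df["bcva"] = (55 + 2.0 * df["month"]   # population-wide trend
              + u0[df["patient"]] + u1[df["patient"]] * df["month"]
              + rng.normal(0, 2, len(df)))

m = smf.mixedlm("bcva ~ month", df, groups=df["patient"],
                re_formula="~month").fit()
print(m.summary())
```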

A deterministic mathematical model for bidirectional excluded flow with Langmuir kinetics.

In many important cellular processes, including mRNA translation, gene transcription, phosphotransfer, and intracellular transport, biological "particles" move along some kind of "tracks". The motion of these particles can be modeled as a one-dimensional movement along an ordered sequence of sites. The biological particles (e.g., ribosomes or RNAPs) have volume and cannot surpass one another. In some cases, there is a preferred direction of movement along the track, but in general the movement may be bidirectional, and furthermore the particles may attach or detach from various regions along the tracks. We derive a new deterministic mathematical model for such transport phenomena that may be interpreted as a dynamic mean-field approximation of an important model from statistical mechanics called the asymmetric simple exclusion process (ASEP) with Langmuir kinetics. Using tools from the theory of monotone dynamical systems and contraction theory we show that the model admits a unique steady state, and that every solution converges to this steady state. Furthermore, we show that the model entrains (or phase locks) to periodic excitations in any of its forward, backward, attachment, or detachment rates. We demonstrate an application of this phenomenological transport model for analyzing ribosome drop off in mRNA translation.
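
A minimal numerical sketch of such a dynamic mean-field, with illustrative rates rather than the paper's: each site's occupancy gains from hops in from both neighbours (blocked by its own occupancy), loses by hops out (blocked by the neighbours'), and exchanges particles with the environment via Langmuir attachment/detachment.

```python
# Mean-field ODEs for bidirectional exclusion with Langmuir kinetics:
# x_i in [0, 1] is the occupancy of site i on a lattice of N sites.
import numpy as np
from scipy.integrate import solve_ivp

N = 50
fwd, bwd = 1.0, 0.3        # forward / backward hop rates
att, det = 0.05, 0.05      # Langmuir attachment / detachment
alpha, beta = 0.8, 0.8     # entry / exit rates at the boundaries

def rhs(t, x):
    dx = np.empty_like(x)
    for i in range(1, N - 1):
        dx[i] = (fwd * x[i-1] * (1 - x[i]) - fwd * x[i] * (1 - x[i+1])
                 + bwd * x[i+1] * (1 - x[i]) - bwd * x[i] * (1 - x[i-1])
                 + att * (1 - x[i]) - det * x[i])
    dx[0] = (alpha * (1 - x[0]) - fwd * x[0] * (1 - x[1])
             + bwd * x[1] * (1 - x[0]) - det * x[0])
    dx[-1] = (fwd * x[-2] * (1 - x[-1]) - beta * x[-1]
              - bwd * x[-1] * (1 - x[-2]) + att * (1 - x[-1]))
    return dx

sol = solve_ivp(rhs, (0, 500), np.full(N, 0.5))
print("steady-state density (first 10 sites):",
      np.round(sol.y[:, -1], 3)[:10])
```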

A New Hybrid Model Using an Autoregressive Integrated Moving Average and a Generalized Regression Neural Network for the Incidence of Tuberculosis in Heng County, China.

It is a daunting task to eradicate tuberculosis completely in Heng County due to a large transient population, human immunodeficiency virus/tuberculosis coinfection, and latent infection. Thus, a high-precision forecasting model can be used for the prevention and control of tuberculosis. In this study, four models, a basic autoregressive integrated moving average (ARIMA) model, a traditional ARIMA-generalized regression neural network (GRNN) model, a basic GRNN model, and a new ARIMA-GRNN hybrid model, were used to fit and predict the incidence of tuberculosis. Performance measures including mean absolute error (MAE), mean absolute percentage error (MAPE), and mean square error (MSE) were used to evaluate and compare these models on historical and prospective data. The new ARIMA-GRNN model had superior fit relative to both the traditional ARIMA-GRNN model and the basic ARIMA model when applied to historical data and when used to forecast incidence during the subsequent 6 months. Our results suggest that the new ARIMA-GRNN model may be more suitable for forecasting tuberculosis incidence in Heng County than traditional models.
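
A generic sketch of the hybrid idea on a simulated seasonal series: fit an ARIMA model, then let a GRNN, which is essentially Nadaraya-Watson kernel regression, learn the mapping from ARIMA fitted values to observations. This reconstructs the traditional hybrid; the paper's new variant modifies the GRNN inputs.

```python
# ARIMA-GRNN hybrid sketch on a synthetic seasonal incidence series.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(9)
t = np.arange(96)
y = 20 + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1.5, 96)

arima = ARIMA(y, order=(1, 0, 1)).fit()
fitted = arima.fittedvalues

def grnn_predict(x_train, y_train, x_new, sigma=0.5):
    # GRNN = Gaussian-kernel weighted average of training targets
    w = np.exp(-((x_new[:, None] - x_train[None, :]) ** 2)
               / (2 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

hybrid_fit = grnn_predict(fitted, y, fitted)
print(f"MAE ARIMA:  {np.mean(np.abs(y - fitted)):.3f}")
print(f"MAE hybrid: {np.mean(np.abs(y - hybrid_fit)):.3f}")
```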

Study of the cross-market effects of Brexit based on the improved symbolic transfer entropy GARCH model-An empirical analysis of stock-bond correlations.

In this paper, we study the cross-market effects of Brexit on the stock and bond markets of nine major countries in the world. By incorporating information theory, we introduce time-varying impact weights based on symbolic transfer entropy to improve the traditional GARCH model. The empirical results show that under the influence of Brexit, flight-to-quality not only commonly occurs between the stocks and bonds of each country but also simultaneously occurs among different countries. We also find that the accuracy of the time-varying symbolic transfer entropy GARCH model proposed in this paper is improved relative to the traditional GARCH model, indicating that it has practical application value.
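
A sketch of the symbolic-transfer-entropy ingredient on simulated returns: symbolise each series by ordinal (permutation) patterns and estimate TE(X -> Y) from symbol co-occurrence counts. The GARCH re-weighting built on top of these entropies is not reproduced here.

```python
# Symbolic transfer entropy between two series via ordinal patterns.
import numpy as np
from collections import Counter

def symbolize(x, m=3):
    # ordinal pattern of each length-m window, encoded as a hashable key
    windows = np.lib.stride_tricks.sliding_window_view(x, m)
    return [tuple(np.argsort(w)) for w in windows]

def transfer_entropy(x, y, m=3):
    sx, sy = symbolize(x, m), symbolize(y, m)
    trip = Counter(zip(sy[1:], sy[:-1], sx[:-1]))   # (y_next, y, x)
    pair_yx = Counter(zip(sy[:-1], sx[:-1]))
    pair_yy = Counter(zip(sy[1:], sy[:-1]))
    marg_y = Counter(sy[:-1])
    n = len(sy) - 1
    te = 0.0
    for (y1, y0, x0), c in trip.items():
        te += (c / n) * np.log2((c / pair_yx[(y0, x0)])
                                / (pair_yy[(y1, y0)] / marg_y[y0]))
    return te

rng = np.random.default_rng(10)
x = rng.standard_normal(3000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(3000)   # y lags x
print(f"TE(x->y) = {transfer_entropy(x, y):.4f}")
print(f"TE(y->x) = {transfer_entropy(y, x):.4f}")
```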

The influence of speed abilities and technical skills in early adolescence on adult success in soccer: A long-term prospective analysis using ANOVA and SEM approaches.

Several talent development programs in youth soccer have implemented motor diagnostics measuring performance factors. However, the predictive value of such tests for adult success is a controversial topic in talent research. This prospective cohort study evaluated the long-term predictive value of 1) motor tests and 2) players' speed abilities (SA) and technical skills (TS) in early adolescence. The sample consisted of 14,178 U12 players from the German talent development program. Five tests (sprint, agility, dribbling, ball control, shooting) were conducted and players' height, weight as well as relative age were assessed at nationwide diagnostics between 2004 and 2006. In the 2014/15 season, the players were then categorized as professional (n = 89), semi-professional (n = 913), or non-professional players (n = 13,176), indicating their adult performance level (APL). The motor tests' prognostic relevance was determined using ANOVAs. Players' future success was predicted by a logistic regression threshold model. This structural equation model comprised a measurement model with the motor tests and two correlated latent factors, SA and TS, with simultaneous consideration for the manifest covariates height, weight and relative age. Each motor predictor and anthropometric characteristic discriminated significantly between the APL (p < .001; η2 ≤ .02). The threshold model significantly predicted the APL (R2 = 24.8%), and in early adolescence the factor TS (p < .001) seems to have a stronger effect on adult performance than SA (p < .05). Both approaches (ANOVA, SEM) verified the diagnostics' predictive validity over a long-term period (≈ 9 years). However, because of the limited effect sizes, the motor tests' prognostic relevance remains ambiguous. A challenge for future research lies in the integration of different (e.g., person-oriented or multilevel) multivariate approaches that expand beyond the "traditional" topic of single tests' predictive validity and toward more theoretically founded issues.

An in silico evaluation of treatment regimens for recurrent Clostridium difficile infection.

Clostridium difficile infection (CDI) is a significant nosocomial infection worldwide, recurring in as many as 35% of infections. Risk of CDI recurrence varies by ribotype, and ribotypes also vary in sporulation and germination rates. Whether sporulation/germination mediate risk of recurrence and the effectiveness of treatment of recurring CDI remains unclear. We aim to assess the role of sporulation/germination patterns in risk of recurrence, and the relative effectiveness of the recommended tapered/pulsing regimens, using an in silico model.

Elderly road collision injury outcomes associated with seat positions and seatbelt use in a rapidly aging society-A case study in South Korea.

Aging has long been regarded as one of the most critical factors affecting crash injury outcomes. In South Korea, where the elderly population is projected to reach 35.9% by 2050, the implications of an increasing number of elderly vehicle users for road safety are evident. In this research, we investigated the confounding effect of occupant age on the associations of seat position and seatbelt use with injury outcomes. In addition, elderly occupants were divided into a younger-old group aged between 65 and 74 years and an older-old group aged 75 years and older, in an effort to assess whether the conventional elderly age standard of 65 years should be reconsidered.

Frequencies of decision making and monitoring in adaptive resource management.

Adaptive management involves learning-oriented decision making in the presence of uncertainty about the responses of a resource system to management. It is implemented through an iterative sequence of decision making, monitoring and assessment of system responses, and incorporation of what is learned into future decision making. Decision making at each point is informed by a value or objective function, for example the total harvest anticipated over some time frame. The value function expresses the value associated with decisions, and it is influenced by system status as updated through monitoring. Often, decision making follows shortly after a monitoring event; however, it is certainly possible for the cadence of decision making to differ from that of monitoring. In this paper we consider different combinations of annual and biennial decision making, along with annual and biennial monitoring. With biennial decision making, decisions are changed only every other year; with biennial monitoring, field data are collected only every other year. Different cadences of decision making combine with annual and biennial monitoring to define 4 scenarios. Under each scenario we describe optimal valuations for active and passive adaptive decision making. We highlight patterns in valuation among scenarios, depending on the occurrence of monitoring and decision making events. Differences between years are tied to the fact that every other year a new decision can be made no matter what the scenario, and state information is available to inform that decision. In the subsequent year, however, in 3 of the 4 scenarios either a decision is repeated or monitoring does not occur (or both). There are substantive differences in optimal values among the scenarios, as well as in the optimal policies producing those values. Especially noteworthy is the influence of monitoring cadence on valuation in some years. We highlight these patterns in policy and valuation among the scenarios, and discuss management implications and extensions.
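
The decision-cadence half of the setup can be sketched as a small dynamic program; the monitoring cadence would additionally require a belief-state model, which is omitted. States, actions, rewards, and transitions below are invented for illustration.

```python
# Toy finite-horizon dynamic program: a 2-state, 2-action managed
# resource where decisions may be revised every year (cadence 1) or
# only every other year (cadence 2). All numbers are illustrative.
import numpy as np

P = {0: np.array([[0.8, 0.2], [0.3, 0.7]]),   # transitions per action
     1: np.array([[0.5, 0.5], [0.1, 0.9]])}
R = {0: np.array([1.0, 0.2]),                 # reward per (action, state)
     1: np.array([1.5, 0.0])}

def optimal_value(horizon, cadence):
    # V[s, a_prev]: value given state s and committed action a_prev
    V = np.zeros((2, 2))
    for t in reversed(range(horizon)):
        newV = np.empty_like(V)
        for s in range(2):
            for a_prev in range(2):
                choices = range(2) if t % cadence == 0 else [a_prev]
                newV[s, a_prev] = max(R[a][s] + P[a][s] @ V[:, a]
                                      for a in choices)
        V = newV
    return V[:, 0]          # committed action is irrelevant at t = 0

for cadence in (1, 2):
    print(f"decision cadence {cadence}: values by state "
          f"{np.round(optimal_value(10, cadence), 3)}")
```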