PubTransformer

A site to transform PubMed publications into the following bibliographic reference formats: ADS, BibTeX, EndNote, ISI (used by the Web of Knowledge), RIS, MEDLINE, and Microsoft Word 2007 XML.

Models, Statistical - Top 30 Publications

Sugary drinks taxation, projected consumption and fiscal revenues in Colombia: Evidence from a QUAIDS model.

The global shift towards diets high in sugar-sweetened beverages (SSBs) is linked to a higher prevalence of obesity, diabetes and other non-communicable diseases. In Colombia, one out of every two people was overweight or obese by 2010. This study estimates price elasticities from a Quadratic Almost Ideal Demand System model, using the 2006-2007 Colombian Income and Expenditure survey. The food groups that were jointly considered were: unsweetened unflavored milks; coffee and tea; sugar-sweetened beverages (SSBs); sweets and candies (including sugar); dairy products; meats and animal-based products; grain-based staples; fruits and vegetables; and condiments and snacks. We take into account the high proportion of households not purchasing specific food and beverage groups (censored data) and endogeneity in both prices (as unit values) and total expenditure. Unhealthy beverages are price-elastic (-1.61 for SSBs), meaning that the change in consumption is proportionally larger than the change in price. There is also high complementarity between SSBs and major food groups (grains, meats, and fruits and vegetables). In Colombia, the design of a meaningful tax to encourage healthier diets is the next critical step. This study also shows that a 20% tax on SSBs should prove effective and could yield revenues of about 1% of the Colombian government's total annual fiscal revenue, which could potentially be directed towards public health promotion and investments.
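
As a back-of-the-envelope reading of the reported elasticity (our illustration, not a figure from the paper): with an own-price elasticity of about -1.61, a 20% price increase implies a consumption change of roughly

    \Delta Q / Q \approx \varepsilon \cdot \Delta P / P = -1.61 \times 0.20 \approx -0.32,

i.e. a fall in SSB purchases of about 32%, which is why demand for these beverages is described as price-elastic.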

Individual preferences for physical exercise as secondary prevention for non-specific low back pain: A discrete choice experiment.

Exercise is effective in improving non-specific low back pain (LBP). Certain components of physical exercise, such as the type, intensity and frequency of exercise, are likely to influence participation among working adults with non-specific LBP, but the value and relative importance of these components remain unknown. The study's aim was to examine such specific components and their influence on individual preferences for exercise for secondary prevention of non-specific LBP among working adults.

Direct and indirect climate controls predict heterogeneous early-mid 21st century wildfire burned area across western and boreal North America.

Predicting wildfire under future conditions is complicated by complex, interrelated drivers operating across large spatial scales. Annual area burned (AAB) is a useful index of global wildfire activity. Current and antecedent seasonal climatic conditions, and the timing of snowpack melt, have been suggested as important drivers of AAB. As climate warms, seasonal climate and snowpack co-vary in intricate ways, influencing fire at continental and sub-continental scales. We used independent records of seasonal climate and snow cover duration (last date of permanent snowpack, LDPS) and cell-based Structural Equation Models (SEM) to separate direct (climatic) and indirect (snow cover) effects on relative changes in AAB under future climatic scenarios across western and boreal North America. To isolate the seasonal climate variables with the greatest effect on AAB, we ran multiple regression models of log-transformed AAB on seasonal climate variables and LDPS. We used the results of these regressions to project future AAB from GCM ensemble climate variables and LDPS, and validated model predictions against recent AAB trends. Direct influences of spring and winter temperatures on AAB are larger and more widespread than the indirect effect mediated by changes in LDPS in most areas. Despite significant warming trends and reductions in snow cover duration, projected responses of AAB over the early-mid 21st century are heterogeneous across the continent, ranging from strongly increasing (order-of-magnitude increases in AAB) to moderately decreasing (more than halving of baseline AAB). Annual wildfire area burned in coming decades is likely to be highly geographically heterogeneous, reflecting interacting regional and seasonal climate drivers of fire occurrence and spread.
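
A minimal sketch of the regression step described above (illustrative Python, not the authors' code; the file and column names are hypothetical placeholders):

    # Regress log-transformed annual area burned (AAB) on seasonal
    # climate variables and last date of permanent snowpack (LDPS).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    cells = pd.read_csv("cells.csv")             # hypothetical per-cell records
    cells["log_aab"] = np.log(cells["aab"] + 1)  # log-transform, guarding zeros

    fit = smf.ols("log_aab ~ winter_temp + spring_temp + ldps", data=cells).fit()
    print(fit.summary())  # coefficients then drive the future-AAB projection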

Comparison of Timed Automata with Discrete Event Simulation for Modeling of Biomarker-Based Treatment Decisions: An Illustration for Metastatic Castration-Resistant Prostate Cancer.

With the advent of personalized medicine, the field of health economic modeling is being challenged and the use of patient-level dynamic modeling techniques might be required.

Cost Effectiveness of the Angiotensin Receptor Neprilysin Inhibitor Sacubitril/Valsartan for Patients with Chronic Heart Failure and Reduced Ejection Fraction in the Netherlands: A Country Adaptation Analysis Under the Former and Current Dutch Pharmacoeconomic Guidelines.

To describe the adaptation of a global health economic model to determine whether treatment with the angiotensin receptor neprilysin inhibitor LCZ696 is cost effective compared with the angiotensin-converting enzyme inhibitor enalapril in adult patients with chronic heart failure with reduced left ventricular ejection fraction in the Netherlands; and to explore the effect of performing the cost-effectiveness analyses according to the new pharmacoeconomic Dutch guidelines (updated during the submission process of LCZ696), which require a value-of-information analysis and the inclusion of indirect medical costs of life-years gained.

Estimated clinical impact of the Xpert MTB/RIF Ultra cartridge for diagnosis of pulmonary tuberculosis: A modeling study.

The Xpert MTB/RIF (Xpert) assay offers rapid and accurate diagnosis of tuberculosis (TB) but still suffers from imperfect sensitivity. The newer Xpert MTB/RIF Ultra cartridge has shown improved sensitivity in recent field trials, but at the expense of reduced specificity. The clinical implications of switching from the existing Xpert cartridge to the Xpert Ultra cartridge in different populations remain uncertain.

Translating statistical species-habitat models to interactive decision support tools.

Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.

Wrapped: An R package for circular data.

The Wrapped package computes the probability density function, cumulative distribution function, quantile function and also generates random samples for many univariate wrapped distributions. It also computes maximum likelihood estimates, standard errors, confidence intervals and measures of goodness of fit for nearly fifty univariate wrapped distributions. Numerical illustrations of the package are given.
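
The Wrapped package itself is written in R; as an illustration of the underlying idea in Python (a sketch using the standard definition of wrapping, not the package's own code), a wrapped normal density on [0, 2*pi) is the ordinary normal density summed over shifts of 2*pi:

    import numpy as np
    from scipy.stats import norm

    def wrapped_normal_pdf(theta, mu=0.0, sigma=1.0, k_max=10):
        """Truncated series for the wrapped normal density on [0, 2*pi)."""
        k = np.arange(-k_max, k_max + 1)
        return norm.pdf(np.asarray(theta)[..., None] + 2 * np.pi * k, mu, sigma).sum(-1)

    theta = np.linspace(0, 2 * np.pi, 5)
    print(wrapped_normal_pdf(theta))  # density values on the circle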

The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research.

This paper introduces statistical shape modelling (SSM) for use in osteoarchaeological research. SSM is a full-field, multi-material analytical technique, and is presented here as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique's application to inter-sample comparison was demonstrated through analysis of the principal component (PC) weights. SSM was found to provide highly detailed qualitative and quantitative insight into archaeological inter- and intra-sample variability. The technique has value for archaeological, biomechanical and forensic applications, including identification, finite element analysis (FEA) and reconstruction from partial datasets.

Generative mathematical modelling to demonstrate virtual simulations of neovascular age-related macular degeneration.

To develop a generative mathematical model of wet age-related macular degeneration (AMD) and to model the impact of anti-vascular endothelial growth factor injections on virtual patients with the condition.

A model for foreign exchange markets based on glassy Brownian systems.

In this work we extend a well-known model of arrested physical systems and employ it to describe the price-fluctuation distributions of different currency pairs in the foreign exchange market. We consider exchange rates between 2010 and 2016, in yearly intervals, resolved at one-minute frequency. We then fit the empirical datasets with this model and find significant qualitative similarity between the price-fluctuation distributions of the currency market and those of colloidal particle positions in arrested states. The main contribution of this paper is the application of a well-known physical model that does not require the restrictive assumption that fluctuations are independent and identically distributed (i.i.d.).

Unadjusted Bivariate Two-Group Comparisons: When Simpler is Better.

Hypothesis testing involves posing both a null hypothesis and an alternative hypothesis. This basic statistical tutorial discusses the appropriate use, including the so-called assumptions, of the common unadjusted bivariate tests for hypothesis testing and thus for comparing study sample data for a difference or association. The appropriate choice of a statistical test is predicated on the type of data being analyzed and compared. The unpaired or independent samples t test is used to test the null hypothesis that the 2 population means are equal against the alternative hypothesis that the 2 population means are not equal. The unpaired t test is intended for comparing independent continuous (interval or ratio) data from 2 study groups. A common mistake is to apply several unpaired t tests when comparing data from 3 or more study groups. In this situation, an analysis of variance with post hoc (posttest) intragroup comparisons should instead be applied. Another common mistake is to apply a series of unpaired t tests when comparing sequentially collected data from 2 study groups. In this situation, a repeated-measures analysis of variance, with tests for group-by-time interaction and post hoc comparisons as appropriate, should instead be applied in analyzing data from sequential collection points. The paired t test is used to assess the difference in the means of 2 study groups when the sample observations have been obtained in pairs, often before and after an intervention in each study subject. The Pearson chi-square test is widely used to test the null hypothesis that 2 unpaired categorical variables, each with 2 or more nominal levels (values), are independent of each other. When the null hypothesis is rejected, one concludes that there is a probable association between the 2 unpaired categorical variables. When comparing 2 groups on an ordinal or nonnormally distributed continuous outcome variable, the 2-sample t test is usually not appropriate; the Wilcoxon-Mann-Whitney test is instead preferred. When making paired comparisons on data that are ordinal, or continuous but nonnormally distributed, the Wilcoxon signed-rank test can be used. In analyzing their data, researchers should consider the continued merits of these simple yet equally valid unadjusted bivariate statistical tests. However, the appropriate use of an unadjusted bivariate test still requires a solid understanding of its utility, assumptions (requirements), and limitations. This understanding will mitigate the risk of misleading findings, interpretations, and conclusions.
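
The test choices discussed above map directly onto standard library calls; a minimal sketch in Python with scipy.stats (illustrative synthetic data, not drawn from the tutorial):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    a = rng.normal(10, 2, 30)             # group 1: continuous, roughly normal
    b = rng.normal(11, 2, 30)             # group 2, same size for pairing below

    print(stats.ttest_ind(a, b))          # unpaired t test: 2 independent means
    print(stats.ttest_rel(a, b))          # paired t test: e.g. before/after pairs
    print(stats.mannwhitneyu(a, b))       # Wilcoxon-Mann-Whitney: ordinal/non-normal
    print(stats.wilcoxon(a - b))          # Wilcoxon signed-rank: paired non-normal
    table = np.array([[20, 10], [15, 15]])
    print(stats.chi2_contingency(table))  # Pearson chi-square: 2 categorical variables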

Economic evaluation of medical tests at the early phases of development: a systematic review of empirical studies.

There is little specific guidance on the implementation of cost-effectiveness modelling at the early stage of test development. The aim of this study was to review the literature in this field to examine the methodologies and tools that have been employed to date. Areas Covered: A systematic review to identify relevant studies in established literature databases. Five studies were identified and included for narrative synthesis. These studies revealed that there is no consistent approach in this growing field. The perspective of patients and the potential for value of information (VOI) analysis to inform the value of future research are often overlooked. Test accuracy is an essential consideration, with most studies having described and included all possible test results in their analysis and conducted extensive sensitivity analyses on important parameters. Headroom analysis was considered in some instances, but at the early development stage rather than the concept stage. Expert commentary: The techniques available to modellers that can demonstrate the value of conducting further research and product development (i.e. VOI analysis, headroom analysis) should be better utilized. There is a need for concerted efforts to develop rigorous methodology in this growing field to maximize the value and quality of such analyses.

Impact of triage guidelines on prehospital triage: comparison of guidelines with a statistical model.

The American College of Surgeons developed the National Field Triage Decision Scheme (NFTDS), which has been adopted by many trauma centers nationwide, but quantitative evidence of its efficacy is unclear. We compare the NFTDS and State of Ohio guidelines with "observed" rates and with rates derived from a statistical model.

Evaluating Public Health Interventions: 7. Let the Subject Matter Choose the Effect Measure: Ratio, Difference, or Something Else Entirely.

We define measures of effect used in public health evaluations, which include the risk difference and the risk ratio, the population-attributable risk, years of life lost or gained, disability-adjusted life years, quality-adjusted life years, and the incremental cost-effectiveness ratio. Except for the risk ratio, all of these are absolute effect measures. For constructing externally generalizable absolute measures of effect when there is superior fit of the multiplicative model, we suggest using the multiplicative model to estimate relative risks, which will often be obtained in simple linear form with no interactions, and then converting these to the desired absolute measure. The externally generalizable absolute measure of effect can be obtained by suitably standardizing to the risk factor distribution of the population to which the results are to be generalized. External generalizability will often be compromised when absolute measures are computed from study populations with risk factor distributions different from those of the population to whom the results are to be generalized, even when these risk factors are not confounders of the intervention effect.
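
As a schematic of the suggested conversion step (our sketch of standardization, not a formula quoted from the paper): if the multiplicative model yields a relative risk RR and stratum-specific baseline risks p_{0s}, with weights w_s taken from the risk factor distribution of the target population, a standardized risk difference is

    RD = \sum_s w_s \left( p_{0s} \cdot RR - p_{0s} \right) = (RR - 1) \sum_s w_s\, p_{0s},

so the same RR yields different absolute effects in populations with different baseline risk distributions, which is exactly the generalizability concern raised above.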

A multi-paradigm framework to assess the impacts of climate change on end-use energy demand.

Projecting the long-term trends in energy demand is an increasingly complex endeavor due to the uncertain emerging changes in factors such as climate and policy. The existing energy-economy paradigms used to characterize the long-term trends in the energy sector do not adequately account for climate variability and change. In this paper, we propose a multi-paradigm framework for estimating the climate sensitivity of end-use energy demand that can easily be integrated with the existing energy-economy models. To illustrate the applicability of our proposed framework, we used the energy demand and climate data in the state of Indiana to train a Bayesian predictive model. We then leveraged the end-use demand trends as well as downscaled future climate scenarios to generate probabilistic estimates of the future end-use demand for space cooling, space heating and water heating, at the individual household and building level, in the residential and commercial sectors. Our results indicated that the residential load is much more sensitive to climate variability and change than the commercial load. Moreover, since the largest fraction of the residential energy demand in Indiana is attributed to heating, future warming scenarios could lead to reduced end-use demand due to lower space heating and water heating needs. In the commercial sector, the overall energy demand is expected to increase under the future warming scenarios. This is because the increased cooling load during hotter summer months will likely outpace the reduced heating load during the more temperate winter months.
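
A minimal sketch of the modelling idea (illustrative Python with synthetic data and a conjugate prior; not the authors' model, and the numbers are hypothetical): a Bayesian linear regression of daily demand on cooling degree days, used to produce probabilistic demand estimates under a warmer scenario.

    import numpy as np

    rng = np.random.default_rng(0)
    cdd = rng.gamma(2.0, 2.0, 365)                  # cooling degree days (synthetic)
    demand = 5 + 0.8 * cdd + rng.normal(0, 1, 365)  # daily kWh (synthetic)

    X = np.column_stack([np.ones_like(cdd), cdd])
    tau2, sigma2 = 10.0, 1.0                        # prior and noise variances
    cov = np.linalg.inv(X.T @ X / sigma2 + np.eye(2) / tau2)  # posterior covariance
    mean = cov @ X.T @ demand / sigma2                        # posterior mean

    warmer = np.column_stack([np.ones_like(cdd), 1.2 * cdd])  # +20% cooling need
    draws = rng.multivariate_normal(mean, cov, 1000)          # coefficient draws
    pred = warmer @ draws.T                                   # predictive demand
    print(pred.mean(), np.percentile(pred, [5, 95]))          # point estimate + interval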

Pre-pubertal exposure with phthalates and bisphenol A and pubertal development.

Epidemiological studies indicate associations between childhood exposure to phthalates and bisphenol A (BPA) and pubertal development. We examined associations between the pre-pubertal phthalate and BPA body burden and the longitudinally assessed sexual maturation of eight- to thirteen-year-old children.

Synergistic effects of the components of global change: Increased vegetation dynamics in open, forest-steppe grasslands driven by wildfires and year-to-year precipitation differences.

Climate change and land use change are two major elements of human-induced global environmental change. In temperate grasslands and woodlands, the increasing frequency of extreme weather events such as droughts, and the increasing severity of wildfires, have altered the structure and dynamics of vegetation. In this paper, we studied the impact of wildfires and of year-to-year differences in precipitation on species composition changes in semi-arid grasslands of a forest-steppe complex ecosystem partially disturbed by wildfires. In particular, we investigated how long-term compositional dissimilarity and species richness are affected by year-to-year precipitation differences in burnt and unburnt areas. Study sites were located in central Hungary, in protected areas characterized by partially burnt, juniper-poplar forest-steppe complexes of high biodiversity. Data were used from two long-term monitoring sites in the Kiskunság National Park, both characterized by the same habitat complex. We investigated the variation in species composition as a function of time using distance decay methodology. In each sampling area, compositional dissimilarity increased with the time elapsed between sampling events, and species richness differences increased with increasing precipitation differences between consecutive years. We found that both the long-term compositional dissimilarity and the year-to-year changes in species richness were higher in the burnt areas than in the unburnt ones. The long-term compositional dissimilarities were mostly caused by perennial species, while the year-to-year changes in species richness were driven by annual and biennial species. As the effect of the year-to-year variation in precipitation was more pronounced in the burnt areas, we conclude that canopy removal by wildfires and extreme inter-annual variability of precipitation, two components of global environmental change, act synergistically: they enhance the effect of one another, resulting in greater long-term and year-to-year changes in the composition of grasslands.

Examining the relationships between body image, eating attitudes, BMI, and physical activity in rural and urban South African young adult females using structural equation modeling.

The persistence of food insecurity, malnutrition, increasing adiposity, and decreasing physical activity, heightens the need to understand relationships between body image satisfaction, eating attitudes, BMI and physical activity levels in South Africa. Females aged 18-23 years were recruited from rural (n = 509) and urban (n = 510) settings. Body image satisfaction was measured using Stunkard's silhouettes, and the 26-item Eating Attitudes questionnaire (EAT-26) was used to evaluate participants' risk of disordered eating. Minutes per week of moderate to vigorous physical activity (MVPA) was assessed using the Global Physical Activity Questionnaire (GPAQ). Significant linear correlates were included in a series of regressions run separately for urban and rural participants. Structural equation modeling (SEM) was used to test the relationships between variables. Urban females were more likely to be overweight and obese than rural females (p = 0.02), and had a greater desire to be thinner (p = 0.02). In both groups, being overweight or obese was positively associated with a desire to be thinner (p<0.01), and negatively associated with a desire to be fatter (p<0.01). Having a disordered eating attitude was associated with body image dissatisfaction in the urban group (β = 1.27, p<0.01, CI: 0.38; 2.16), but only with a desire to be fatter in the rural group (β = 0.63, p = 0.04, CI: 0.03; 1.23). In the SEM model, body image dissatisfaction was associated with disordered eating (β = 0.63), as well as higher MVPA participation (p<0.01). These factors were directly associated with a decreased risk of disordered eating attitude, and with a decreased desire to be thinner. Findings indicate a shift in both settings towards more Westernised ideals. Physical activity may provide a means to promote a healthy body image, while reducing the risk of disordered eating. Given the high prevalence of overweight and obesity in both rural and urban women, this study provides insights for future interventions aimed at decreasing adiposity in a healthy way.

Predictability of machine learning techniques to forecast the trends of market index prices: Hypothesis testing for the Korean stock markets.

The prediction of trends in stock and index prices is an important issue for market participants. Investors set trading and fiscal strategies based on these trends, and considerable research in various academic fields has sought to forecast financial markets. This study predicts the trends of the Korea Composite Stock Price Index 200 (KOSPI 200) prices using nonparametric machine learning models: artificial neural networks and support vector machines with polynomial and radial basis function kernels. In addition, this study identifies controversial issues in this literature and tests hypotheses about them. Our results are inconsistent with those of preceding research, which is generally considered to show high prediction performance. Moreover, Google Trends data proved not to be an effective factor for predicting the KOSPI 200 index prices in our framework. Furthermore, ensemble methods did not improve the accuracy of the prediction.
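
A sketch of the model family described above (illustrative Python with synthetic features, not the study's data or code): classify up/down index moves with an artificial neural network and SVMs with polynomial and RBF kernels.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 10))            # e.g. lagged returns, indicators
    y = (X[:, 0] + rng.normal(size=500) > 0)  # synthetic up/down trend label

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False)
    for model in (MLPClassifier(max_iter=1000),
                  SVC(kernel="poly", degree=3),
                  SVC(kernel="rbf")):
        print(type(model).__name__, model.fit(X_tr, y_tr).score(X_te, y_te))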

Is there a problem with quantum models of psychological measurements?

This article presents the results of an experiment, called the ABA experiment, designed to test a fundamental prediction of quantum probability theory when applied to human judgments and decisions. The prediction concerns the effect of one measurement on another when the measurements are incompatible (i.e., the answers to the measurements depend on their order). After an initial measurement of an opinion on an issue A, the answer to a second measurement on the same issue A immediately afterwards will certainly be the same as the first. However, according to the uncertainty principle, if a measurement of opinion on issue A is followed by an incompatible measurement on another issue, B, then the answer to a second measurement on issue A will become uncertain. This prediction was tested with 325 participants on a wide range of 12 different sets of issues previously shown to be incompatible. Contrary to previous claims published in this journal, the empirical findings support the prediction of quantum probability theory applied to human judgments.
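
A toy numerical version of the ABA prediction (our illustration in Python, not the study's analysis): represent opinions as projections in a 2-D quantum model. Measuring A twice in a row gives the same answer; inserting an incompatible measurement B makes the second A answer uncertain.

    import numpy as np

    def projector(angle):
        v = np.array([np.cos(angle), np.sin(angle)])
        return np.outer(v, v)

    A = projector(0.0)              # "yes" subspace for issue A
    B = projector(np.pi / 4)        # incompatible issue B (rotated basis)

    psi = np.array([1.0, 0.0])      # state after answering "yes" to A
    print(psi @ A @ psi)            # P(A yes again) = 1.0 (certain)
    psi_b = B @ psi
    psi_b /= np.linalg.norm(psi_b)  # state after answering "yes" to B
    print(psi_b @ A @ psi_b)        # P(A yes after B) = 0.5 (uncertain)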

Comparison of two cash transfer strategies to prevent catastrophic costs for poor tuberculosis-affected households in low- and middle-income countries: An economic modelling study.

Illness-related costs for patients with tuberculosis (TB) amounting to ≥20% of pre-illness annual household income predict adverse treatment outcomes and have been termed "catastrophic." Social protection initiatives, including cash transfers, are endorsed to help prevent catastrophic costs. With this aim, cash transfers may either be provided to defray the TB-related costs of households with a confirmed TB diagnosis (termed a "TB-specific" approach), or to increase the income of households with high TB risk to strengthen their economic resilience (termed a "TB-sensitive" approach). The impact of cash transfers provided under each of these approaches might vary. We undertook an economic modelling study from the patient perspective to compare the potential of these 2 cash transfer approaches to prevent catastrophic costs.
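
As a toy illustration of the threshold (hypothetical numbers, not the study's results): for a household with a pre-illness annual income of $2,000, TB-related costs become catastrophic at 0.20 x $2,000 = $400. A TB-specific transfer of $150 at diagnosis would reduce net costs of $500 to $350, below the threshold, whereas a TB-sensitive transfer instead raises pre-illness income, shifting the threshold itself and strengthening resilience before illness strikes.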

Inference With Difference-in-Differences With a Small Number of Groups: A Review, Simulation Study, and Empirical Application Using SHARE Data.

Difference-in-differences (DID) estimation has become increasingly popular as an approach to evaluate the effect of a group-level policy on individual-level outcomes. Several statistical methodologies have been proposed to correct for the within-group correlation of model errors resulting from the clustering of data. Little is known about how well these corrections perform with the often small number of groups observed in health research using longitudinal data.
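
For reference, the canonical DID specification (a standard textbook form, not a formula quoted from the paper) is

    Y_{igt} = \alpha + \beta\,\mathrm{Treated}_g + \gamma\,\mathrm{Post}_t + \delta\,(\mathrm{Treated}_g \times \mathrm{Post}_t) + \varepsilon_{igt},

where \delta is the policy effect; the inference problem reviewed here arises because \varepsilon_{igt} is correlated within groups g, and cluster-robust corrections can behave poorly when the number of groups is small.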

Network analysis for count data with excess zeros.

Undirected graphical models, or Markov random fields, have been a popular class of models for representing conditional dependence relationships between nodes. In particular, Markov networks help us to understand complex interactions between genes in the biological processes of a cell. Local Poisson models seem promising for modeling positive as well as negative dependencies in count data. Furthermore, when zero counts are more frequent than expected, excess zeros should be accounted for in the model.
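
For reference, the standard zero-inflated Poisson formulation alluded to above (a textbook form, not necessarily the paper's exact parameterization) mixes a point mass at zero with a Poisson component:

    P(Y = 0) = \pi + (1 - \pi)\, e^{-\lambda}, \qquad P(Y = y) = (1 - \pi)\, \frac{e^{-\lambda} \lambda^{y}}{y!} \quad (y \ge 1),

where \pi is the excess-zero probability and \lambda the Poisson rate.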

Maximum entropy methods for extracting the learned features of deep neural networks.

New architectures of multilayer artificial neural networks and new methods for training them are rapidly revolutionizing the application of machine learning in diverse fields, including business, social science, physical sciences, and biology. Interpreting deep neural networks, however, currently remains elusive, and a critical challenge lies in understanding which meaningful features a network is actually learning. We present a general method for interpreting deep neural networks and extracting network-learned features from input data. We describe our algorithm in the context of biological sequence analysis. Our approach, based on ideas from statistical physics, samples from the maximum entropy distribution over possible sequences, anchored at an input sequence and subject to constraints implied by the empirical function learned by a network. Using our framework, we demonstrate that local transcription factor binding motifs can be identified from a network trained on ChIP-seq data and that nucleosome positioning signals are indeed learned by a network trained on chemical cleavage nucleosome maps. Imposing a further constraint on the maximum entropy distribution also allows us to probe whether a network is learning global sequence features, such as the high GC content in nucleosome-rich regions. This work thus provides valuable mathematical tools for interpreting and extracting learned features from feed-forward neural networks.
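
A heavily simplified sketch of the sampling idea (our reading, in Python; net is a hypothetical stand-in for the trained network, and the GC-content score merely echoes the nucleosome example above): Metropolis moves over sequences, starting from the anchoring input sequence and softly constrained to preserve the network's output.

    import numpy as np

    rng = np.random.default_rng(0)
    ALPHABET = np.array(list("ACGT"))

    def net(seq):  # hypothetical stand-in for the trained network's output
        return (seq == "G").mean() + (seq == "C").mean()  # e.g. GC content

    seq = rng.choice(ALPHABET, size=50)   # anchor/input sequence
    target, beta = net(seq), 50.0         # constraint value and its stiffness
    for _ in range(10000):
        prop = seq.copy()
        prop[rng.integers(len(seq))] = rng.choice(ALPHABET)  # single mutation
        # accept moves that keep the network output near the anchored value
        d_old = (net(seq) - target) ** 2
        d_new = (net(prop) - target) ** 2
        if rng.random() < np.exp(-beta * (d_new - d_old)):
            seq = prop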

A mechanistic spatio-temporal framework for modelling individual-to-individual transmission, with an application to the 2014-2015 West Africa Ebola outbreak.

In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communication devices with GPS tracking functionality. These detailed data have proven useful for inferring disease transmission at a more refined level than previously possible. However, there remains a lack of statistically sound frameworks for modelling the underlying transmission dynamics in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested on simulated datasets, validating our inferential machinery. It is then applied to data describing a regional Ebola outbreak in West Africa (2014-2015). Our results show that the methods obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, under reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models, which are often computationally challenging.
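
A common mechanistic ingredient in such individual-level models (a generic form, not necessarily the paper's exact specification) is a spatial kernel: an infectious individual j exerts an infection hazard on a susceptible individual i of

    \lambda_{ij}(t) = \beta\, K(d_{ij}), \qquad \text{e.g. } K(d) = e^{-d/\alpha},

so that the transmission-distance parameter \alpha can be estimated from the spatio-temporal data, which is how an inferred "distance of transmission" can be compared with the literature.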

Receptor partial agonism and method to express receptor partial activation with respect to novel Full Logistic Model of mixture toxicology.

Living organisms interact with various chemical compounds via receptors, as described by receptor theory. The affinity of biologically active compounds for receptors and their ability to trigger a biological or toxic signal vary substantially. In this work, we offer a new insight into the mode of action of receptor partial agonists and into receptor theory, using a Full Logistic Model (FLM) of mixture toxicology. We describe the hypothesis that the effect of a partial agonist can be mathematically described by separating its agonistic and antagonistic behavior, where the antagonistic component is treated as the action of a compound producing zero effect. In this way, a competitive antagonist can be considered an agonist with zero effect. This idea is also placed in context with classical concepts, e.g., Gaddum's equation. Using the assumption that competitive antagonists are agonists with no effect, we derive equations describing the microscopic and macroscopic equilibrium constants, and we show that these constants can be calculated from the measured partial-agonist dose-response curve. As a consequence, we provide a simple mathematical tool for comparing the dose-response curves of drugs according to their affinities and efficacies.
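
For context, the classical formulas the paper builds on (standard receptor theory, not the FLM itself): a Hill/logistic dose-response for an agonist A,

    E = \frac{E_{\max}\,[A]^{n}}{EC_{50}^{n} + [A]^{n}},

and Gaddum's equation for a competitive antagonist B, which shifts the agonist curve rightwards:

    E = \frac{E_{\max}\,[A]}{[A] + K_A \left(1 + [B]/K_B\right)}.

The hypothesis above treats a partial agonist as a mixture of these two limiting behaviours: an agonist component plus a receptor-occupying component with zero effect.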

Point process analysis of noise in early invertebrate vision.

Noise is a prevalent and sometimes even dominant aspect of many biological processes. While many natural systems have adapted to attenuate or even usefully integrate noise, the variability it introduces often still limits the achievable precision of biological functions. This is particularly so for visual phototransduction, the process responsible for converting photons of light into usable electrical signals (quantum bumps). Here, randomness in both the photon inputs (regarded as extrinsic noise) and the conversion process (intrinsic noise) are seen as two distinct, independent and significant limitations on visual reliability. Past research has attempted to quantify the relative effects of these noise sources using approximate methods that do not fully account for the discrete, point-process and time-ordered nature of the problem. As a result, the conclusions drawn from these different approaches have led to inconsistent expositions of phototransduction noise performance. This paper provides a fresh and complete analysis of the relative impact of intrinsic and extrinsic noise in invertebrate phototransduction, using minimum mean squared error reconstruction techniques based on Bayesian point process (Snyder) filters. An integrate-and-fire-based algorithm is developed to reliably estimate photon times from quantum bumps, and Snyder filters are then used to causally estimate random light intensities at both the front and back end of the phototransduction cascade. Comparison of these estimates reveals that the dominant noise source transitions from extrinsic to intrinsic as light intensity increases. By extending the filtering techniques to account for delays, it is further found that among the intrinsic noise components, which include bump latency (mean delay and jitter) and shape (amplitude and width) variance, it is the mean delay that is critical to noise performance. As the timeliness of visual information is important for real-time action, this delay could potentially limit the speed at which invertebrates can respond to stimuli. Consequently, to increase visual fidelity, reducing the photoconversion lag is much more important than improving the regularity of the electrical signal.
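
In point process terms (a standard formulation consistent with the description above, not an equation quoted from the paper): the photon stream is a doubly stochastic (Cox) Poisson process with random intensity \lambda(t), and the Snyder filter propagates the posterior of \lambda(t) given the arrival times observed so far, so the causal minimum mean squared error estimate is \hat{\lambda}(t) = E[\lambda(t) \mid \text{arrivals in } [0, t]].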

Bayesian Analysis: Using Prior Information to Interpret the Results of Clinical Trials.

Estimation of the Most Influential Factors for Survival Probability Prediction of Prostate Cancer Patients.

The objective of this study was to address some important questions related to prostate cancer treatments and survivorship. One possibility for improving the survival probability of prostate cancer patients is to improve predictive strategies. In this article, a short-term, multistep-ahead predictive model was therefore created for predicting the survival probability of prostate cancer patients. A neuro-fuzzy model was used to select the most important inputs for the predictive model; current and time-lagged variables were used as inputs. The results could be useful for simplifying predictive models by reducing the number of inputs.