PubTransformer

A site that transforms PubMed publications into the following bibliographic reference formats: ADS, BibTeX, EndNote, ISI (used by the Web of Knowledge), RIS, MEDLINE, and Microsoft Word 2007 XML.

Models, Statistical - Top 30 Publications

High resolution microscopy reveals significant impacts of ocean acidification and warming on larval shell development in Laternula elliptica.

Environmental stressors affect marine larval growth rate, quality, and size. Larvae of the Antarctic bivalve Laternula elliptica were raised to the D-larvae stage under temperature and pH conditions representing ambient and end-of-century projections (-1.6°C to +0.4°C and pH 7.98 to 7.65). Previous observations using light microscopy suggested that pH had no influence on larval abnormalities in this species. Detailed analysis of the shell using SEM showed that reduced pH is in fact a major stressor during development for this species, producing D-larvae with abnormal shapes, deformed shell edges, irregular hinges, cracked shell surfaces, and even uncalcified larvae. Additionally, reduced pH increased pitting and cracking on shell surfaces. Thus, apparently normal larvae may be compromised at the ultrastructural level, and these larvae would be in poor condition at settlement, reducing juvenile recruitment and overall survival. Elevated temperatures increased prodissoconch II sizes. However, the overall impacts on larval shell quality and integrity under concurrent ocean acidification would likely overshadow any benefit from warmer temperatures, limiting populations of this prevalent Antarctic species.

InMAP: A model for air pollution interventions.

Mechanistic air pollution modeling is essential in air quality management, yet the extensive expertise and computational resources required to run most models prevent their use in many situations where their results would be useful. Here, we present InMAP (Intervention Model for Air Pollution), which offers an alternative to comprehensive air quality models for estimating the air pollution health impacts of emission reductions and other potential interventions. InMAP estimates annual-average changes in primary and secondary fine particle (PM2.5) concentrations (the air pollution outcome generally causing the largest monetized health damages) attributable to annual changes in precursor emissions. InMAP leverages pre-processed physical and chemical information from the output of a state-of-the-science chemical transport model and a variable spatial resolution computational grid to perform simulations that are several orders of magnitude less computationally intensive than comprehensive model simulations. In comparisons run here, InMAP recreates comprehensive model predictions of changes in total PM2.5 concentrations with population-weighted mean fractional bias (MFB) of -17% and population-weighted R2 = 0.90. Although InMAP is not specifically designed to reproduce total observed concentrations, it is able to do so within published air quality model performance criteria for total PM2.5. Potential uses of InMAP include studying exposure, health, and environmental justice impacts of potential shifts in emissions for annual-average PM2.5. InMAP can be trained to run for any spatial and temporal domain given the availability of appropriate simulation output from a comprehensive model. The InMAP model source code and input data are freely available online under an open-source license.
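
To make the two evaluation metrics quoted above concrete, here is a minimal sketch of population-weighted mean fractional bias and population-weighted R-squared computed on hypothetical gridded concentrations; the data, weights, and variable names are illustrative, not InMAP output.

import numpy as np

rng = np.random.default_rng(0)
n_cells = 500  # hypothetical grid cells

pop = rng.lognormal(8, 1.5, n_cells)        # population per cell
obs = rng.lognormal(2, 0.4, n_cells)        # comprehensive-model PM2.5 change
mod = obs * rng.normal(1.0, 0.15, n_cells)  # synthetic reduced-form prediction

w = pop / pop.sum()

# Population-weighted mean fractional bias, in percent
mfb = 100 * np.sum(w * 2 * (mod - obs) / (mod + obs))

# Population-weighted R^2 as the squared weighted correlation
mo, mm = np.sum(w * obs), np.sum(w * mod)
cov = np.sum(w * (obs - mo) * (mod - mm))
r2 = cov**2 / (np.sum(w * (obs - mo) ** 2) * np.sum(w * (mod - mm) ** 2))

print(f"Population-weighted MFB: {mfb:.1f}%, R^2: {r2:.2f}")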

Electricity forecasting on the individual household level enhanced based on activity patterns.

Leveraging smart metering solutions to support energy efficiency at the individual household level poses novel research challenges in monitoring usage and providing accurate load forecasts. Forecasting electricity usage is an especially important component that can provide intelligence to smart meters. In this paper, we propose an enhanced approach to load forecasting at the household level. The impacts of residents' daily activities and appliance usage on the power consumption of the entire household are incorporated to improve the accuracy of the forecasting model. The contributions of this paper are threefold: (1) we address short-term electricity load forecasting 24 hours ahead, not at the aggregate but at the individual household level, which fits into the Residential Power Load Forecasting (RPLF) methods; (2) for the forecasting, we utilize a household-specific dataset of behaviors that influence power consumption, derived using segmentation and sequence-mining algorithms; and (3) we undertake an extensive load forecasting study using different forecasting algorithms enhanced by the household activity patterns.
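
As a hedged illustration of the kind of activity-augmented, 24-hour-ahead household forecast described above (the synthetic data, feature set, and gradient-boosting model are stand-ins, not the paper's algorithms):

import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
hours = pd.date_range("2024-01-01", periods=24 * 60, freq="h")

# Hypothetical household load with a daily pattern plus an "activity" signal
activity = hours.hour.isin([7, 8, 18, 19, 20]).astype(int)  # cooking, etc.
load = (0.3 + 0.8 * activity + 0.1 * np.sin(2 * np.pi * hours.hour / 24)
        + rng.normal(0, 0.05, len(hours)))

df = pd.DataFrame({"load": load, "activity": activity,
                   "hour": hours.hour, "dow": hours.dayofweek})
df["load_24h_ago"] = df["load"].shift(24)   # same hour, previous day
df = df.dropna()

# Train on all but the last day; forecast the final 24 hours
train, test = df.iloc[:-24], df.iloc[-24:]
feats = ["hour", "dow", "activity", "load_24h_ago"]
model = GradientBoostingRegressor().fit(train[feats], train["load"])
mae = np.abs(model.predict(test[feats]) - test["load"]).mean()
print(f"24-hour-ahead MAE: {mae:.3f}")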

Zooming in: From spatially extended traveling waves to localized structures: The case of the Sine-Gordon equation in (1+3) dimensions.

The Sine-Gordon equation in (1+3) dimensions has N traveling-front ("kink," "domain wall") solutions for all N ≥ 1. A nonlinear functional of the solution, which vanishes on a single front, maps multi-front solutions onto sets of infinitely long, but laterally bounded, rods that move in space. Each rod is localized in the vicinity of the intersection of two Sine-Gordon fronts. The rod systems are solutions of the linear wave equation, driven by a term constructed out of Sine-Gordon fronts. An additional linear operation maps multi-rod systems onto sets of blobs. Each blob is localized in the vicinity of a rod intersection and moves in space. The blob systems are solutions of the linear wave equation, driven by a term that is also constructed out of Sine-Gordon fronts. The temporal evolution of multi-blob solutions mimics elastic collisions of systems of spatially extended particles.
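
For reference, a minimal statement of the equation discussed above, in the standard form for (1+3) dimensions (subscripts denote partial derivatives; this form is assumed, not quoted from the paper):

u_tt - u_xx - u_yy - u_zz + sin u = 0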

Rainfall changes affect the algae dominance in tank bromeliad ecosystems.

Climate change and biodiversity loss have been reported as major disturbances in the biosphere that can trigger changes in the structure and functioning of natural ecosystems. Nonetheless, empirical studies demonstrating how the two factors interact to drive shifts in aquatic ecosystems are still lacking. Here, we experimentally test how changes in rainfall distribution and litter diversity affect the occurrence of the algae-dominated condition in tank bromeliad ecosystems. Tank bromeliads are miniature aquatic ecosystems shaped by the rainwater and allochthonous detritus accumulated in the bases of their leaves. We demonstrate that changes in rainfall distribution reduced the chlorophyll-a concentration in the water of bromeliad tanks, significantly affecting the occurrence of algae-dominated conditions. On the other hand, litter diversity did not affect algae dominance irrespective of the rainfall scenario. We suggest that rainfall changes may compromise important self-reinforcing mechanisms responsible for maintaining high levels of algae in tank bromeliad ecosystems. We summarize these results in a theoretical model which suggests that tank bromeliads may show two different regimes, determined by the bromeliad's ability to take up nutrients from the water and by the total amount of light entering the tank. We conclude that predicted climate changes might promote regime shifts in tropical aquatic ecosystems by shaping their structure and the relative importance of other regulating factors.

Variation in benthic long-term data of transitional waters: Is interpretation more than speculation?

Biological long-term data series in marine habitats are often used to identify anthropogenic impacts on the environment or climate-induced regime shifts. However, particularly in transitional waters, environmental properties like water mass dynamics, salinity variability, and the occurrence of oxygen minima not necessarily caused by either human activities or climate change can attenuate or mask apparent signals. At first glance it very often seems impossible to interpret the strong fluctuations of, e.g., abundances or species richness, since abiotic variables like salinity and oxygen content vary simultaneously as well as in apparently erratic ways. The long-term development of major macrozoobenthic parameters (abundance, biomass, species numbers) and derived macrozoobenthic indices (Shannon diversity, Margalef, Pielou's evenness, and Hurlbert) has been successfully interpreted and related to the long-term fluctuations of salinity and oxygen, incorporating the North Atlantic Oscillation index (NAO index) and relying on the statistical analysis of modelled and measured data during 35 years of observation at three stations in the south-western Baltic Sea. Our results suggest that even at a restricted spatial scale the benthic system does not appear to be tightly controlled by any single environmental driver, and they highlight the complexity of spatially varying temporal response.
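
As a small, hedged sketch of two of the derived indices named above (Shannon diversity and Pielou's evenness), computed from hypothetical abundance counts; this is illustrative, not the study's analysis:

import numpy as np

def shannon_diversity(counts):
    # Shannon diversity H' = -sum(p_i * ln(p_i)) over observed species
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def pielou_evenness(counts):
    # Pielou's evenness J' = H' / ln(S), where S is species richness
    counts = np.asarray(counts, dtype=float)
    s = np.count_nonzero(counts)
    return shannon_diversity(counts) / np.log(s)

# Hypothetical abundance counts for five taxa at one station
counts = [120, 45, 30, 3, 2]
print(shannon_diversity(counts), pielou_evenness(counts))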

Financial Forecasting and Stochastic Modeling: Predicting the Impact of Business Decisions.

In health care organizations, effective investment of precious resources is critical to assure that the organization delivers high-quality and sustainable patient care within a supportive environment for patients, their families, and the health care providers. This holds true for organizations independent of size, from small practices to large health systems. For radiologists whose role is to oversee the delivery of imaging services and the interpretation, communication, and curation of imaging-informed information, business decisions influence where and how they practice, the tools available for image acquisition and interpretation, and ultimately their professional satisfaction. With so much at stake, physicians must understand and embrace the methods necessary to develop and interpret robust financial analyses so they effectively participate in and better understand decision making. This review discusses the financial drivers upon which health care organizations base investment decisions and the central role that stochastic financial modeling should play in support of strategically aligned capital investments. Given a health care industry that has been slow to embrace advanced financial analytics, a fundamental message of this review is that the skills and analytical tools are readily attainable and well worth the effort to implement in the interest of informed decision making.
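
As a hedged illustration of the stochastic financial modeling the review advocates (not the authors' model), a minimal Monte Carlo sketch of net present value under uncertain cash flows; all figures are hypothetical:

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical capital investment, e.g., a new imaging scanner
initial_cost = 2_000_000.0   # upfront outlay (USD)
years = 10
discount_rate = 0.05

n_sims = 100_000
# Uncertain annual net cash flow, normally distributed around 300k
annual_cash = rng.normal(300_000, 75_000, size=(n_sims, years))

# Discount each year's cash flow and sum to net present value
discounts = 1.0 / (1.0 + discount_rate) ** np.arange(1, years + 1)
npv = annual_cash @ discounts - initial_cost

print(f"Mean NPV: {npv.mean():,.0f}")
print(f"P(NPV < 0): {(npv < 0).mean():.1%}")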

The roles of prostate-specific antigen (PSA) density, prostate volume, and their zone-adjusted derivatives in predicting prostate cancer in patients with PSA less than 20.0 ng/mL.

The aim of this study was to develop nomograms for predicting prostate cancer and its zonal location using prostate-specific antigen density, prostate volume, and their zone-adjusted derivatives. A total of 928 consecutive patients with prostate-specific antigen (PSA) less than 20.0 ng/mL, who underwent transrectal ultrasound-guided transperineal 12-core prostate biopsy at West China Hospital between 2011 and 2014, were retrospectively enrolled. The patients were randomly split into a training cohort (70%, n = 650) and a validation cohort (30%, n = 278). The predictive models and associated nomograms were built using the training cohort, and the models were validated using the validation cohort. Univariate and multivariate logistic regression was performed, and new nomograms were generated based on the multivariate regression coefficients. The discriminative power and calibration of these nomograms were validated using the area under the ROC curve (AUC) and calibration curves. The potential clinical utility of these models was also tested using decision curve analysis. In total, 285 (30.7%) patients were diagnosed with prostate cancer. Among them, 131 (14.1%) had transition zone prostate cancer and 269 (29.0%) had peripheral zone prostate cancer. Each zone-adjusted derivative-based nomogram had an AUC greater than 0.75. All nomograms showed good calibration and much better net benefit than the default treat-all or treat-none scenarios in predicting prostate cancer in the different zones. Prostate-specific antigen density, prostate volume, and their zone-adjusted derivatives have important roles in detecting prostate cancer and its zonal location in patients with PSA of 2.5-20.0 ng/mL. To the best of our knowledge, this is the first nomogram using these parameters to predict outcomes of 12-core prostate biopsy. These instruments can help clinicians increase the accuracy of prostate cancer screening and avoid unnecessary prostate biopsies.
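
A minimal sketch of the workflow the abstract describes (70/30 split, logistic regression, AUC on the validation cohort); the column names, synthetic data, and outcome are placeholders, not the study's variables or code:

import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 928
df = pd.DataFrame({
    "psad": rng.lognormal(-2.0, 0.6, n),     # PSA density (hypothetical)
    "volume": rng.normal(45, 15, n),         # prostate volume, mL
    "tz_psad": rng.lognormal(-1.5, 0.6, n),  # zone-adjusted derivative
})
df["cancer"] = rng.binomial(1, 0.31, n)      # placeholder biopsy outcome

# 70/30 split into training and validation cohorts, as in the study
train, valid = train_test_split(df, test_size=0.3, random_state=1)

feats = ["psad", "volume", "tz_psad"]
model = LogisticRegression().fit(train[feats], train["cancer"])
auc = roc_auc_score(valid["cancer"], model.predict_proba(valid[feats])[:, 1])
print(f"Validation AUC: {auc:.3f}")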

Systematic Review of Health Economic Impact Evaluations of Risk Prediction Models: Stop Developing, Start Evaluating.

Although health economic evaluations (HEEs) are increasingly common for therapeutic interventions, they appear to be rare for the use of risk prediction models (PMs).

A Practical ANOVA Approach for Uncertainty Analysis in Population-Based Disease Microsimulation Models.

To provide a practical approach for calculating uncertainty intervals and variance components associated with initial-condition and dynamic-equation parameters in computationally expensive population-based disease microsimulation models.

Estimating the standardized incidence ratio (SIR) with incomplete follow-up data.

A standard parameter to compare the disease incidence of a cohort relative to the population is the standardized incidence ratio (SIR). For statistical inference, it is commonly assumed that the denominator, the expected number of cases, is fixed. If a disease registry is available, incident cases can sometimes be identified by linkage with the registry; however, registries may not contain information on migration or death from other causes, and a complete follow-up through a population registry may not be possible. In that case, the end-of-follow-up date, and therefore the exact person-years of observation, is unknown.
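
Under complete follow-up, the SIR and its exact Poisson confidence interval (the quantities the paper extends to incomplete data) can be computed as follows; the counts are hypothetical:

from scipy.stats import chi2

# Hypothetical cohort: observed cases, and expected cases obtained by
# applying population reference rates to the cohort's person-years
observed = 24
expected = 15.2

sir = observed / expected

# Exact Poisson 95% CI for the observed count, divided by the
# (assumed fixed) expected count
alpha = 0.05
lower = chi2.ppf(alpha / 2, 2 * observed) / 2 / expected
upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / 2 / expected

print(f"SIR = {sir:.2f} (95% CI {lower:.2f}-{upper:.2f})")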

Gleason Grading, Biochemical Failure, and Prostate Cancer-Specific Death.

To examine the relationship between the recently defined Gleason grade groups and prostate cancer-specific mortality.

Modeling Variables With a Spike at Zero: Examples and Practical Recommendations.

In most epidemiologic studies, and in clinical research generally, there are variables with a spike at zero: variables for which a proportion of individuals have zero exposure (e.g., never smokers) while, among those exposed, the variable has a continuous distribution. Different options exist for modeling such variables, such as categorization with the nonexposed as the reference group, or ignoring the spike and including the variable in the regression model with or without a transformation. It has been shown that such situations can be analyzed by adding a binary indicator (exposed/nonexposed) to the regression model, and a method based on fractional polynomials has been developed to estimate a suitable functional form for the positive portion of the spike-at-zero variable's distribution. In this paper, we compare different approaches using data from 3 case-control studies carried out in Germany: the Mammary Carcinoma Risk Factor Investigation (MARIE), a breast cancer study conducted in 2002-2005 (Flesch-Janys et al., Int J Cancer. 2008;123(4):933-941); the Rhein-Neckar Larynx Study, a study of laryngeal cancer conducted in 1998-2000 (Dietz et al., Int J Cancer. 2004;108(6):907-911); and a lung cancer study conducted in 1988-1993 (Jöckel et al., Int J Epidemiol. 1998;27(4):549-560). Strengths and limitations of the different procedures are demonstrated, and some recommendations for practical use are given.
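
A minimal sketch of the binary-indicator approach described above, on hypothetical exposure data; for simplicity, a log transform of the positive values stands in for the fractional-polynomial step:

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000

# Hypothetical exposure with a spike at zero: ~40% never-smokers
exposure = np.where(rng.random(n) < 0.4, 0.0, rng.lognormal(2.5, 0.7, n))
outcome = rng.binomial(1, 0.3, n)   # placeholder case/control status

df = pd.DataFrame({
    "exposed": (exposure > 0).astype(int),   # binary spike indicator
    # Transformed positive part; zero for the non-exposed, so the
    # indicator carries the spike and this term the dose-response
    "log_dose": np.where(exposure > 0, np.log(exposure), 0.0),
    "y": outcome,
})

res = sm.Logit(df["y"], sm.add_constant(df[["exposed", "log_dose"]])).fit()
print(res.summary())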

Utility values associated with advanced or metastatic non-small cell lung cancer: data needs for economic modeling.

Cost-effectiveness analyses often inform healthcare reimbursement decisions. The preferred measure of effectiveness is the quality-adjusted life year (QALY) gained, where the quality-of-life adjustment is measured in terms of utility. Areas covered: We assessed the availability and variation of utility values for health states associated with advanced or metastatic non-small cell lung cancer (NSCLC) to identify values appropriate for cost-effectiveness models assessing alternative treatments. Our systematic search of six electronic databases (January 2000 to August 2015) found the current literature to be sparse in terms of utility values associated with NSCLC, identifying 27 studies. Utility values were most frequently reported over time and by treatment type, and less frequently by disease response, stage of disease, adverse events, or disease comorbidities. Expert commentary: In response to rising healthcare costs, payers increasingly consider the cost-effectiveness of novel treatments in reimbursement decisions, especially in oncology. As the number of therapies available to treat NSCLC increases, cost-effectiveness analyses will play a key role in reimbursement decisions in this area. Quantifying the relationship between health and quality of life for NSCLC patients via utility values is an important component of assessing the cost-effectiveness of novel treatments.
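
As a worked illustration of the QALY arithmetic that such utility values feed into (utilities and durations here are hypothetical, not values from the review):

# Hypothetical NSCLC health-state paths: (utility, years in state)
path_new_treatment = [(0.75, 1.0), (0.65, 0.8), (0.45, 0.5)]
path_comparator = [(0.75, 0.7), (0.60, 0.6), (0.45, 0.4)]

def qalys(path):
    # QALYs = sum of utility weights multiplied by time spent in each state
    return sum(utility * years for utility, years in path)

print(f"QALYs gained: {qalys(path_new_treatment) - qalys(path_comparator):.2f}")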

Development and validation of a risk-prediction nomogram for in-hospital mortality in adults poisoned with drugs and nonpharmaceutical agents: An observational study.

Acute poisoning with drugs and nonpharmaceutical agents represents an important challenge in the emergency department (ED). The objective is to create and validate a risk-prediction nomogram for use in the ED to predict the risk of in-hospital mortality in adults from acute poisoning with drugs and nonpharmaceutical agents. This was a prospective cohort study involving adults with acute poisoning from drugs and nonpharmaceutical agents admitted to a tertiary referral center for toxicology between January and December 2015 (derivation cohort) and between January and June 2016 (validation cohort). We used a program to generate nomograms based on binary logistic regression predictive models. We included variables that had significant associations with death. Using regression coefficients, we calculated scores for each variable, and estimated the event probability. Model validation was performed using bootstrap to quantify our modeling strategy and using receiver operator characteristic (ROC) analysis. The nomogram was tested on a separate validation cohort using ROC analysis and goodness-of-fit tests. Data from 315 patients aged 18 to 91 years were analyzed (n = 180 in the derivation cohort; n = 135 in the validation cohort). In the final model, the following variables were significantly associated with mortality: age, laboratory test results (lactate, potassium, MB isoenzyme of creatine kinase), electrocardiogram parameters (QTc interval), and echocardiography findings (E wave velocity deceleration time). Sex was also included to use the same model for men and women. The resulting nomogram showed excellent survival/mortality discrimination (area under the curve [AUC] 0.976, 95% confidence interval [CI] 0.954-0.998, P < 0.0001 for the derivation cohort; AUC 0.957, 95% CI 0.892-1, P < 0.0001 for the validation cohort). This nomogram provides more precise, rapid, and simple risk-analysis information for individual patients acutely exposed to drugs and nonpharmaceutical agents, and accurately estimates the probability of in-hospital death, exclusively using the results of objective tests available in the ED.
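
A minimal sketch of the bootstrap-plus-ROC validation step named above (synthetic predictors and outcome; sklearn's LogisticRegression stands in for the study's model):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical derivation cohort: age, lactate, potassium, CK-MB, QTc
n = 180
X = rng.normal(size=(n, 5))
y = rng.binomial(1, 0.15, n)   # placeholder in-hospital mortality

model = LogisticRegression().fit(X, y)
print(f"Apparent AUC: {roc_auc_score(y, model.predict_proba(X)[:, 1]):.3f}")

# Bootstrap: refit on resamples and score on the original data
aucs = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    if len(np.unique(y[idx])) < 2:
        continue   # a resample must contain both outcomes
    m = LogisticRegression().fit(X[idx], y[idx])
    aucs.append(roc_auc_score(y, m.predict_proba(X)[:, 1]))
print(f"Bootstrap AUC: {np.mean(aucs):.3f}")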

Prognostic nomogram for patients with hepatocellular carcinoma who underwent adjuvant transarterial chemoembolization following curative resection.

Adjuvant transarterial chemoembolization (TACE) is a major option for postoperative hepatocellular carcinoma (HCC) patients with recurrence risk factors. However, individualized predictive models for this subgroup of patients are limited. This study aimed to develop a prognostic nomogram for patients with HCC who underwent adjuvant TACE following curative resection. A cohort comprising 144 HCC patients who received adjuvant TACE following curative resection at Zhongshan Hospital was analyzed. The nomogram was formulated based on independent prognostic indicators for overall survival (OS). The performance of the nomogram was evaluated by the concordance index (C-index), calibration curve, and decision curve analysis (DCA) and compared with the conventional staging systems. The results were validated in an independent cohort of 86 patients meeting the same inclusion criteria. Serum alpha-fetoprotein (AFP), hyper-sensitive C-reactive protein (hs-CRP), incomplete tumor encapsulation, and double-positive staining of Cytokeratin 7 and Cytokeratin 19 on tumor cells were identified as independent predictors of OS. The C-indices of the nomogram for OS prediction in the training and validation cohorts were 0.787 (95% CI 0.775-0.799) and 0.714 (95% CI 0.695-0.733), respectively. In both cohorts, the calibration plot showed good consistency between nomogram-predicted and observed survival. Furthermore, the established nomogram was superior to the conventional staging systems in terms of C-index and clinical net benefit on DCA. The proposed nomogram provides an accurate risk stratification for HCC patients who underwent adjuvant TACE following curative resection.
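
A minimal sketch of the C-index evaluation named above, using lifelines' concordance_index on synthetic survival data (not the study's cohort or nomogram):

import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
n = 144

# Hypothetical nomogram risk scores and observed survival
risk_score = rng.normal(size=n)
# Higher risk -> shorter survival, plus noise and censoring
time = rng.exponential(np.exp(-risk_score))
observed = rng.binomial(1, 0.7, n)   # 1 = death observed, 0 = censored

# concordance_index expects higher predictions for longer survival,
# so pass the negated risk score
c_index = concordance_index(time, -risk_score, observed)
print(f"C-index: {c_index:.3f}")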

Logistic Regression Diagnostics: Understanding How Well a Model Predicts Outcomes.

Factor structure of parent and teacher ratings of the ODD symptoms for Malaysian primary school children.

The present study used confirmatory factor analysis (CFA) to examine the applicability of one-factor, two-factor, three-factor, and second-order Oppositional Defiant Disorder (ODD) factor models, proposed in previous studies, in a group of Malaysian primary school children. These models were primarily based on parent reports. In the current study, parent and teacher ratings of the ODD symptoms were obtained for 934 children. For both groups of respondents, the findings showed some support for all models examined, with most support for a second-order model with Burke et al.'s (2010) three factors (oppositional, antagonistic, and negative affect) as the primary factors. The diagnostic implications of the findings are discussed.

Examining the dimensional structure models of secondary traumatic stress based on DSM-5 symptoms.

The latent factor structure of Secondary Traumatic Stress (STS) has previously been examined using the Posttraumatic Stress Disorder (PTSD) nomenclature of the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV). With the advent of DSM-5, there is a pressing need to reexamine STS using DSM-5 symptoms in light of the most up-to-date PTSD models in the literature. This study investigated and determined the best-fitting PTSD models using DSM-5 PTSD criterion symptoms. Confirmatory factor analysis (CFA) was conducted to examine model fit using the Secondary Traumatic Stress Scale in 241 registered and practicing Filipino nurses (166 females and 75 males) who worked in the Philippines and gave direct nursing services to patients. Based on multiple fit indices, the results showed that the 7-factor hybrid model, comprising intrusion, avoidance, negative affect, anhedonia, externalizing behavior, anxious arousal, and dysphoric arousal factors, has an excellent fit to STS. This model asserts that: (1) the hyperarousal criterion needs to be divided into anxious and dysphoric arousal factors; (2) symptoms characterizing negative and positive affect need to be separated into two factors; and (3) a new factor should categorize externalized, self-initiated, impulse- and control-deficit behaviors. Comparison of nested and non-nested models showed the hybrid model to have superior fit over the other models. The specificity of the symptom structure of STS based on DSM-5 PTSD criteria suggests designing more specific interventions that address the more elaborate symptom groupings to alleviate the condition of nurses exposed to STS on a daily basis.
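
A minimal CFA sketch in the spirit of the model comparison above, using the semopy package with random stand-in data and a drastically reduced three-factor specification (the full 7-factor hybrid model and the real STSS items would replace both):

import numpy as np
import pandas as pd
from semopy import Model, calc_stats

# Hypothetical item-level data standing in for the STSS items;
# random values only demonstrate the mechanics, not a real fit
rng = np.random.default_rng(0)
items = [f"sts{i}" for i in range(1, 8)]
df = pd.DataFrame(rng.normal(size=(241, 7)), columns=items)

# Reduced stand-in for the 7-factor hybrid model named in the abstract
desc = """
intrusion =~ sts1 + sts2
avoidance =~ sts3 + sts4
negative_affect =~ sts5 + sts6 + sts7
"""
model = Model(desc)
model.fit(df)
print(calc_stats(model).T)  # CFI, RMSEA, and other fit indices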

Measurement properties of painDETECT: Rasch analysis of responses from community-dwelling adults with neuropathic pain.

painDETECT (PD-Q) is a self-reported assessment of pain qualities developed as a screening tool for pain of neuropathic origin. Rasch analysis is a strategy for examining the measurement characteristics of a scale using a form of item response theory. We conducted a Rasch analysis to consider if the scoring and measurement properties of PD-Q would support its use as an outcome measure.
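
For orientation, the dichotomous Rasch model that underlies such analyses, as a tiny sketch with illustrative parameter values (PD-Q items are polytomous, so a polytomous extension such as the partial credit model would be used in practice):

import numpy as np

def rasch_prob(theta, b):
    # P(endorse) under the dichotomous Rasch model:
    # exp(theta - b) / (1 + exp(theta - b))
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# A respondent at theta = 0.5 facing items of increasing difficulty
for b in (-1.0, 0.0, 1.0):
    print(f"item difficulty {b:+.1f}: P = {rasch_prob(0.5, b):.2f}")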

Statistical Modeling and Prediction for Tourism Economy Using Dendritic Neural Network.

With increasing global internationalization, the tourism economy has also developed rapidly. Growing interest in more advanced forecasting methods motivates us to develop new ones. In this paper, the seasonal-trend autoregressive integrated moving average with dendritic neural network model (SA-D model) is proposed for tourism demand forecasting. First, we use the seasonal-trend autoregressive integrated moving average model (SARIMA model) to remove the long-term linear trend, and then train the dendritic neural network model on the residual data to make short-term predictions. As the results in this paper show, the SA-D model can achieve considerably better predictive performance. To demonstrate the effectiveness of the SA-D model, we also apply it to the data that other authors used with other models and compare the results. This likewise shows that the SA-D model achieves good predictive performance in terms of normalized mean square error, absolute percentage error, and correlation coefficient.
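
A minimal sketch of the hybrid strategy described above, with an ordinary MLP deliberately standing in for the dendritic neural network; the monthly series is synthetic:

import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical monthly tourist arrivals: trend + seasonality + noise
t = np.arange(120)
y = 100 + 0.5 * t + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, 120)

# Step 1: SARIMA captures the linear trend and seasonal structure
sarima = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)
residuals = y - sarima.fittedvalues

# Step 2: train a neural network on lagged residuals
lags = 12
X = np.column_stack([residuals[i:len(residuals) - lags + i] for i in range(lags)])
target = residuals[lags:]
nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
nn.fit(X, target)

# One-step-ahead forecast = SARIMA forecast + NN residual correction
forecast = sarima.forecast(1)[0] + nn.predict(residuals[-lags:].reshape(1, -1))[0]
print(f"Hybrid one-step forecast: {forecast:.1f}")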

Chemometric analysis of comprehensive two dimensional gas chromatography-mass spectrometry metabolomics data.

Comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC-TOFMS) is a powerful tool for the separation and identification of analytes in complex natural samples. In this paper, different chemometric methods were compared for fast non-targeted GC×GC-TOFMS metabolomic profiling of the crustacean species Daphnia magna, and a general chemometric strategy and workflow is proposed. The proposed strategy combines compression of GC×GC-TOFMS data matrices in the retention-time direction using wavelets with an appropriate column-wise data matrix augmentation arrangement, followed by modeling with Multivariate Curve Resolution-Alternating Least Squares (MCR-ALS). Using this strategy, eighty different D. magna metabolites were resolved and identified. After calculating the peak capacities of the different columns and the peak-area changes of these metabolites, the best instrumental configuration and column combination for the GC×GC-TOFMS metabolomic study of D. magna are proposed and discussed. The procedure described in this work can be applied as a general method for untargeted GC×GC-TOFMS data processing and metabolomic profiling.
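
A toy version of the core MCR-ALS step (alternating least squares with a crude nonnegativity constraint by clipping) on a synthetic augmented matrix; real workflows add the wavelet compression, proper constraints, and informed initialization described above:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical augmented data matrix D (rows: elution times across
# stacked samples; columns: m/z channels), with D ~ C @ S.T
n_times, n_channels, n_components = 300, 80, 3
C_true = np.abs(rng.normal(size=(n_times, n_components)))
S_true = np.abs(rng.normal(size=(n_channels, n_components)))
D = C_true @ S_true.T + rng.normal(0, 0.01, (n_times, n_channels))

# Crude MCR-ALS: alternate least squares, clipping negatives to zero
S = np.abs(rng.normal(size=(n_channels, n_components)))  # initial spectra
for _ in range(100):
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)

lack_of_fit = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"Relative residual: {lack_of_fit:.4f}")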

Review and modelling of malaria crude incidence rate in a low incidence population, Illinois 1990 to 2013.

The highest risk of imported malaria in Illinois is associated with travel to countries of origin by immigrants visiting family and friends. We used joinpoint regression to analyze the malaria crude incidence rate (mCIR) trend from 1990 through 2013. We found joinpoint regression a useful way to summarize mCIR trends because it connects linear segments over fixed (annual) time intervals and allows the trends to be characterized by the annual percent change (APC).
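
A minimal sketch of the annual-percent-change calculation that joinpoint regression applies on each segment (hypothetical incidence rates; a full joinpoint fit would also search for the change points):

import numpy as np

# Hypothetical crude incidence rates per 100,000 for one trend segment
years = np.arange(1990, 2000)
rates = np.array([1.8, 1.7, 1.9, 1.6, 1.5, 1.6, 1.4, 1.3, 1.4, 1.2])

# Joinpoint models log(rate) as linear in year within each segment;
# the slope b converts to annual percent change: APC = 100*(exp(b)-1)
b, _ = np.polyfit(years, np.log(rates), 1)
apc = 100 * (np.exp(b) - 1)
print(f"APC: {apc:.1f}% per year")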

An Overview of the Models in Reporting School Data on Dental Credentialing Examinations.

The development and dissemination of meaningful and useful performance reports for examinations involved in the licensure process are important to the communities of interest, including state boards, candidates, and professional schools. However, discussions of performance reporting have been largely neglected. The authors recognize and reinforce the need for such discussions by providing prototypes of performance reporting in dentistry, with examples and recommendations to guide practice. For illustrative purposes, this article reviews and discusses the different reporting models used over the past ten years with Part I and Part II of the National Board Dental Examination (NBDE). These reporting models are distinguished by features such as the following: 1) scores in each discipline covered on the exam (four for Part I and nine for Part II) and an overall average are reported in a standard-score metric; 2) a single overall score in a standard-score metric is reported; and 3) performance on the exam is reported as pass/fail. Standard scores on the NBDE range from 49 to 99, with 75 being a passing score. Sample data, without identifying information, are used to illustrate the reporting models.

Risk prediction models for mortality in patients with cardiovascular disease: The BioBank Japan project.

Cardiovascular disease (CVD) is a leading cause of death in Japan. The present study aimed to develop new risk prediction models for long-term risks of all-cause and cardiovascular death in patients with chronic phase CVD.

Direct diffusion tensor estimation using a model-based method with spatial and parametric constraints.

To develop a new model-based method with spatial and parametric constraints (MB-SPC) aimed at accelerating diffusion tensor imaging (DTI) by directly estimating the diffusion tensor from highly undersampled k-space data.

Meta-CART: A tool to identify interactions between moderators in meta-analysis.

In the framework of meta-analysis, moderator analysis is usually performed only univariately. When several study characteristics are available that may account for treatment effect, standard meta-regression has difficulties in identifying interactions between them. To overcome this problem, meta-CART has been proposed: an approach that applies classification and regression trees (CART) to identify interactions, and then subgroup meta-analysis to test the significance of moderator effects. The previous version of meta-CART has its shortcomings: when applying CART, the sample sizes of studies are not taken into account, and the effect sizes are dichotomized around the median value. Therefore, this article proposes new meta-CART extensions, weighting study effect sizes by their accuracy, and using a regression tree to avoid dichotomization. In addition, new pruning rules are proposed. The performance of all versions of meta-CART was evaluated via a Monte Carlo simulation study. The simulation results revealed that meta-regression trees with random-effects weights and a 0.5-standard-error pruning rule perform best. The required sample size for meta-CART to achieve satisfactory performance depends on the number of study characteristics, the magnitude of the interactions, and the residual heterogeneity.
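
A minimal sketch of the weighted regression-tree step the new meta-CART versions use, with sklearn's DecisionTreeRegressor standing in for the article's implementation and simple inverse-variance weights standing in for the random-effects weights the simulations favored; moderators and effect sizes are hypothetical:

import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
k = 120  # number of studies

# Hypothetical study-level moderators and effect sizes with an interaction
mods = pd.DataFrame({
    "dose_high": rng.integers(0, 2, k),
    "adult_sample": rng.integers(0, 2, k),
})
var_i = rng.uniform(0.01, 0.1, k)  # sampling variances per study
true_effect = 0.2 + 0.4 * mods["dose_high"] * mods["adult_sample"]
effect = rng.normal(true_effect, np.sqrt(var_i))

# Weight each study's effect size by its accuracy (inverse variance),
# and fit a regression tree instead of dichotomizing effect sizes
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=20)
tree.fit(mods, effect, sample_weight=1.0 / var_i)
print(export_text(tree, feature_names=list(mods.columns)))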

Gaussian model-based partitioning using iterated local search.

The emergence of Gaussian model-based partitioning as a viable alternative to K-means clustering fosters a need for discrete optimization methods that can be efficiently implemented using model-based criteria. A variety of alternative partitioning criteria have been proposed for more general data conditions that permit elliptical clusters, different spatial orientations for the clusters, and unequal cluster sizes. Unfortunately, many of these partitioning criteria are computationally demanding, which makes the multiple-restart (multistart) approach commonly used for K-means partitioning less effective as a heuristic solution strategy. As an alternative, we propose an approach based on iterated local search (ILS), which has proved effective in previous combinatorial data analysis contexts. We compared multistart, ILS and hybrid multistart-ILS procedures for minimizing a very general model-based criterion that assumes no restrictions on cluster size or within-group covariance structure. This comparison, which used 23 data sets from the classification literature, revealed that the ILS and hybrid heuristics generally provided better criterion function values than the multistart approach when all three methods were constrained to the same 10-min time limit. In many instances, these differences in criterion function values reflected profound differences in the partitions obtained.
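
A compact sketch of the iterated-local-search idea for partitioning, in which a plain within-cluster sum of squares deliberately stands in for the paper's more general Gaussian model-based criterion; the data are synthetic:

import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 1.0, (50, 2)) for m in (0, 4, 8)])
k, n = 3, len(X)

def criterion(labels):
    # Simplified stand-in criterion: total within-cluster sum of squares
    return sum(((X[labels == j] - X[labels == j].mean(axis=0)) ** 2).sum()
               for j in range(k) if np.any(labels == j))

def local_search(labels):
    # Greedy single-point relocation until no move improves the criterion
    improved = True
    while improved:
        improved = False
        for i in range(n):
            best_j, best_val = labels[i], criterion(labels)
            for j in range(k):
                if j == labels[i]:
                    continue
                labels[i] = j
                val = criterion(labels)
                if val < best_val:
                    best_j, best_val, improved = j, val, True
            labels[i] = best_j
    return labels

# Iterated local search: perturb the incumbent, re-optimize, keep the best
best = local_search(rng.integers(0, k, n))
best_val = criterion(best)
for _ in range(10):
    trial = best.copy()
    kick = rng.choice(n, size=n // 10, replace=False)  # perturbation
    trial[kick] = rng.integers(0, k, len(kick))
    trial = local_search(trial)
    if criterion(trial) < best_val:
        best, best_val = trial.copy(), criterion(trial)
print(f"Best criterion value: {best_val:.1f}")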

Response style analysis with threshold and multi-process IRT models: A review and tutorial.

Two different item response theory model frameworks have been proposed for the assessment and control of response styles in rating data. According to one framework, response styles can be assessed by analysing threshold parameters in Rasch models for ordinal data and in mixture-distribution extensions of such models. A different framework is provided by multi-process item response tree models, which can be used to disentangle response processes that are related to the substantive traits and response tendencies elicited by the response scale. In this tutorial, the two approaches are reviewed, illustrated with an empirical data set of the two-dimensional 'Personal Need for Structure' construct, and compared in terms of multiple criteria. Mplus is used as a software framework for (mixed) polytomous Rasch models and item response tree models as well as for demonstrating how parsimonious model variants can be specified to test assumptions on the structure of response styles and attitude strength. Although both frameworks are shown to account for response styles, they differ on the quantitative criteria of model selection, practical aspects of model estimation, and conceptual issues of representing response styles as continuous and multidimensional sources of individual differences in psychological assessment.