PubTransformer

A site that transforms PubMed publications into the following bibliographic reference formats: ADS, BibTeX, EndNote, ISI (used by the Web of Knowledge), RIS, MEDLINE, and Microsoft Word 2007 XML.

Algorithms - Top 30 Publications

DOMINO: Using Machine Learning to Predict Genes Associated with Dominant Disorders.

In contrast to recessive conditions with biallelic inheritance, identification of dominant (monoallelic) mutations for Mendelian disorders is more difficult, because of the abundance of benign heterozygous variants that act as massive background noise (typically, in a 400:1 excess ratio). To reduce this overflow of false positives in next-generation sequencing (NGS) screens, we developed DOMINO, a tool assessing the likelihood for a gene to harbor dominant changes. Unlike commonly used predictors of pathogenicity, DOMINO takes into consideration features that are properties of genes, rather than of variants. It uses a machine-learning approach to extract discriminant information from a broad array of features (N = 432), including genomic data, intra- and interspecies conservation, gene expression, protein-protein interactions, and protein structure, among others. DOMINO's iterative architecture includes a training process on 985 genes with well-established inheritance patterns for Mendelian conditions, and repeated cross-validation that optimizes its discriminant power. When validated on 99 newly discovered genes with pathogenic mutations, the algorithm displays excellent final performance, with an area under the curve (AUC) of 0.92. Furthermore, unsupervised analysis by DOMINO of real sets of NGS data from individuals with intellectual disability or epilepsy correctly recognizes known genes and predicts 9 new candidates with very high confidence. In summary, DOMINO is a robust and reliable tool that can infer dominance of candidate genes with high sensitivity and specificity, making it a useful complement to any NGS pipeline dealing with the analysis of the morbid human genome.
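To make the setup concrete, here is a minimal sketch of a gene-level classifier of this kind, assuming a precomputed gene-by-feature matrix; the random data, labels, and choice of a random forest are illustrative placeholders, not DOMINO's actual learner or training set.

```python
# Toy gene-level dominance classifier: 985 training genes x 432 features,
# scored by cross-validated AUC, mirroring the evaluation reported above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(985, 432))       # placeholder feature matrix (one row per gene)
y = rng.integers(0, 2, size=985)      # 1 = dominant, 0 = recessive (toy labels)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
print(f"mean cross-validated AUC: {auc.mean():.2f}")
```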

Correlation Between Macular Integrity Assessment and Optical Coherence Tomography Imaging of Ellipsoid Zone in Macular Telangiectasia Type 2.

To correlate ellipsoid zone (EZ) defects on spectral-domain optical coherence tomography (SD-OCT) with retinal sensitivity loss on macular integrity assessment (MAIA) microperimetry in macular telangiectasia type 2 (MacTel).

Identifying parameter regions for multistationarity.

Mathematical modelling has become an established tool for studying the dynamics of biological systems. Current applications range from building models that reproduce quantitative data to identifying systems with predefined qualitative features, such as switching behaviour, bistability or oscillations. Mathematically, the latter question amounts to identifying parameter values associated with a given qualitative feature. We introduce a procedure to partition the parameter space of a parameterized system of ordinary differential equations into regions for which the system has a unique equilibrium or multiple equilibria. The procedure is based on the computation of the Brouwer degree, and it creates a multivariate polynomial with parameter-dependent coefficients. The signs of the coefficients determine parameter regions with and without multistationarity. A particular strength of the procedure is the avoidance of numerical analysis and parameter sampling. The procedure consists of a number of steps, each of which can be addressed algorithmically using available software or carried out manually. We demonstrate our procedure on several models of gene transcription and cell signalling, and show that in many cases we obtain a complete partitioning of the parameter space with respect to multistationarity.
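To illustrate the sign criterion on which the partitioning rests, consider the toy polynomial below; it is a made-up example, not one of the paper's models, and only shows how parameter-dependent coefficient signs delimit candidate regions.

```python
# Sketch of the coefficient-sign test, using sympy. The polynomial stands in
# for the one produced by the Brouwer-degree computation.
import sympy as sp

x, k1, k2, k3 = sp.symbols("x k1 k2 k3", positive=True)
p = k1 * x**2 - (k2 - k3) * x + k1        # toy parameter-dependent polynomial

coeffs = sp.Poly(p, x).all_coeffs()
print(coeffs)                             # [k1, k3 - k2, k1]
# By Descartes' rule of signs, positive roots require sign changes in the
# coefficient sequence: for k2 <= k3 all coefficients are non-negative and
# multiple positive equilibria are ruled out, whereas k2 > k3 delimits a
# region where multistationarity becomes possible.
```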

Optimization to the Phellinus experimental environment based on classification forecasting method.

Phellinus is a fungus known as an elemental component of anticancer drugs. To find optimized culture conditions for Phellinus production in the laboratory, many single-factor experiments were conducted, generating a large volume of experimental data. In previous work, we used regression analysis and a gene-set-based genetic algorithm (GA) to predict production, but the data we used depended on experimental experience, and only a small part of it was exploited. In this work, we use the values of the parameters involved in the culture conditions, including inoculum size, pH, initial liquid volume, temperature, seed age, fermentation time, and rotation speed, to build a classification model that separates high-yield from low-yield conditions. A back-propagation (BP) neural network prediction model is then established for the high-yield data set, and a GA is used to find the best culture conditions. The forecast accuracy exceeds 90%, and the resulting yield is slightly higher than the real yield.
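A minimal sketch of this two-stage approach (a BP-style network as yield predictor, then a genetic algorithm searching the culture-condition space) is given below; the data, network size, and GA settings are illustrative placeholders, not the experimental records.

```python
# Toy yield model over 7 normalized culture parameters (inoculum size, pH,
# initial liquid volume, temperature, seed age, fermentation time, rotation
# speed), followed by a simple GA that searches for high-yield conditions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 7))
y = X @ rng.uniform(0.5, 1.5, size=7) + rng.normal(0, 0.05, 200)  # toy yields

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=1)
model.fit(X, y)                              # back-propagation network fit

pop = rng.uniform(0, 1, size=(50, 7))        # GA population of candidate conditions
for _ in range(100):
    fitness = model.predict(pop)
    parents = pop[np.argsort(fitness)[-25:]]                 # keep the fitter half
    children = parents[rng.integers(0, 25, 25)] + rng.normal(0, 0.05, (25, 7))
    pop = np.vstack([parents, np.clip(children, 0, 1)])      # mutate and refill
best = pop[np.argmax(model.predict(pop))]
print("predicted-optimal culture conditions:", best.round(2))
```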

Accounting for cell lineage and sex effects in the identification of cell-specific DNA methylation using a Bayesian model selection algorithm.

Cell- and sex-specific differences in DNA methylation are major sources of epigenetic variation in whole blood. Heterogeneity attributable to cell type has motivated the identification of cell-specific methylation at the CpG level; however, statistical methods for this purpose have been limited to pairwise comparisons between cell types or between the cell type of interest and whole blood. We developed a Bayesian model selection algorithm for the identification of cell-specific methylation profiles that incorporates knowledge of shared cell lineage and allows for the identification of differential methylation profiles in one or more cell types simultaneously. Under the proposed methodology, sex-specific differences in methylation by cell type are also assessed. Using publicly available, cell-sorted methylation data, we show that 51.3% of female CpG markers and 61.4% of male CpG markers identified were associated with differential methylation in more than one cell type. The impact of cell lineage on differential methylation was also highlighted. An evaluation of sex-specific differences revealed differences in CD56+NK methylation, within both single- and multi-cell-dependent methylation patterns. Our findings demonstrate the need to account for cell lineage in studies of differential methylation and associated sex effects.
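The model-selection idea can be sketched compactly. The stand-in below scores candidate groupings of cell types per CpG with BIC rather than the paper's Bayesian machinery, and the data, cell-type panel, and lineage-free candidate set are illustrative assumptions.

```python
# For one CpG (30 replicates x 6 sorted cell types), score every candidate
# model that flags a subset of cell types as differentially methylated, and
# pick the best-scoring one.
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
cell_types = ["CD4T", "CD8T", "Bcell", "CD56NK", "Mono", "Gran"]
beta = rng.normal([0.2, 0.2, 0.2, 0.7, 0.2, 0.2], 0.05, size=(30, 6))  # toy data

def bic(data, flags):
    """Gaussian BIC: flagged cell types get their own mean, the rest share one."""
    n, k = data.size, sum(flags) + 1
    shared = data[:, [i for i, f in enumerate(flags) if not f]]
    rss = ((shared - shared.mean()) ** 2).sum()
    for i, f in enumerate(flags):
        if f:
            rss += ((data[:, i] - data[:, i].mean()) ** 2).sum()
    return n * np.log(rss / n) + k * np.log(n)

models = [m for m in product([0, 1], repeat=6) if sum(m) < 6]
best = min(models, key=lambda m: bic(beta, m))
print("flagged as differentially methylated:",
      [c for c, f in zip(cell_types, best) if f])
```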

Machine Learning Approaches in Cardiovascular Imaging.

Cardiovascular imaging technologies continue to increase in their capacity to capture and store large quantities of data. Modern computational methods, developed in the field of machine learning, offer new approaches to leveraging the growing volume of imaging data available for analyses. Machine learning methods can now address data-related problems ranging from simple analytic queries of existing measurement data to the more complex challenges involved in analyzing raw images. To date, machine learning has been used in 2 broad and highly interconnected areas: automation of tasks that might otherwise be performed by a human and generation of clinically important new knowledge. Most cardiovascular imaging studies have focused on task-oriented problems, but more studies involving algorithms aimed at generating new clinical insights are emerging. Continued expansion in the size and dimensionality of cardiovascular imaging databases is driving strong interest in applying powerful deep learning methods, in particular, to analyze these data. Overall, the most effective approaches will require an investment in the resources needed to appropriately prepare such large data sets for analyses. Notwithstanding current technical and logistical challenges, machine learning and especially deep learning methods have much to offer and will substantially impact the future practice and science of cardiovascular imaging.

Lost in Thought - The Limits of the Human Mind and the Future of Medicine.

Relating dynamic brain states to dynamic machine states: Human and machine solutions to the speech recognition problem.

There is widespread interest in the relationship between the neurobiological systems supporting human cognition and emerging computational systems capable of emulating these capacities. Human speech comprehension, poorly understood as a neurobiological process, is an important case in point. Automatic Speech Recognition (ASR) systems with near-human levels of performance are now available, which provide a computationally explicit solution for the recognition of words in continuous speech. This research aims to bridge the gap between speech recognition processes in humans and machines, using novel multivariate techniques to compare incremental 'machine states', generated as the ASR analysis progresses over time, to the incremental 'brain states', measured using combined electro- and magneto-encephalography (EMEG), generated as the same inputs are heard by human listeners. This direct comparison of dynamic human and machine internal states, as they respond to the same incrementally delivered sensory input, revealed a significant correspondence between neural response patterns in human superior temporal cortex and the structural properties of ASR-derived phonetic models. Spatially coherent patches in human temporal cortex responded selectively to individual phonetic features defined on the basis of machine-extracted regularities in the speech to lexicon mapping process. These results demonstrate the feasibility of relating human and ASR solutions to the problem of speech recognition, and suggest the potential for further studies relating complex neural computations in human speech comprehension to the rapidly evolving ASR systems that address the same problem domain.
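One common way to carry out such a comparison is representational similarity analysis, sketched below: correlate the pairwise dissimilarity structure of machine states with that of brain states over the same inputs. This is a generic illustration with simulated data, not the study's actual multivariate method or EMEG recordings.

```python
# Correlate the representational geometry of toy ASR-state and brain-state
# matrices (rows = speech segments) via their pairwise distance structures.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(8)
latent = rng.normal(size=(40, 10))                  # shared structure across systems
asr_states = latent @ rng.normal(size=(10, 20))     # toy machine states
brain_states = latent @ rng.normal(size=(10, 50)) + rng.normal(0, 2, (40, 50))

rho, p = spearmanr(pdist(asr_states), pdist(brain_states))
print(f"model-brain representational correlation: rho={rho:.2f}, p={p:.1g}")
```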

Development and validation of QMortality risk prediction algorithm to estimate short term risk of death and assess frailty: cohort study.

Objectives: To derive and validate a risk prediction equation to estimate the short term risk of death, and to develop a classification method for frailty based on risk of death and risk of unplanned hospital admission.

Design: Prospective open cohort study.

Participants: Routinely collected data from 1436 general practices contributing data to QResearch in England between 2012 and 2016. 1079 practices were used to develop the scores and a separate set of 357 practices to validate the scores. 1.47 million patients aged 65-100 years were in the derivation cohort and 0.50 million patients in the validation cohort.

Methods: Cox proportional hazards models in the derivation cohort were used to derive separate risk equations in men and women for evaluation of the risk of death at one year. Risk factors considered were age, sex, ethnicity, deprivation, smoking status, alcohol intake, body mass index, medical conditions, specific drugs, social factors, and results of recent investigations. Measures of calibration and discrimination were determined in the validation cohort for men and women separately and for each age and ethnic group. The new mortality equation was used in conjunction with the existing QAdmissions equation (which predicts risk of unplanned hospital admission) to classify patients into frailty groups.

Main outcome measure: The primary outcome was all cause mortality.

Results: During follow-up 180 132 deaths were identified in the derivation cohort, arising from 4.39 million person years of observation. The final model included terms for age, body mass index, Townsend score, ethnic group, smoking status, alcohol intake, unplanned hospital admissions in the past 12 months, atrial fibrillation, antipsychotics, cancer, asthma or chronic obstructive pulmonary disease, living in a care home, congestive heart failure, corticosteroids, cardiovascular disease, dementia, epilepsy, learning disability, leg ulcer, chronic liver disease or pancreatitis, Parkinson's disease, poor mobility, rheumatoid arthritis, chronic kidney disease, type 1 diabetes, type 2 diabetes, venous thromboembolism, anaemia, abnormal liver function test result, high platelet count, and having visited a doctor in the past year with either appetite loss, unexpected weight loss, or breathlessness. The model had good calibration and high levels of explained variation and discrimination. In women, the equation explained 55.6% of the variation in time to death (R²), and had very good discrimination: the D statistic was 2.29 and Harrell's C statistic was 0.85. The corresponding values for men were 53.1%, 2.18, and 0.84. By combining predicted risks of mortality and unplanned hospital admissions, 2.7% of patients (n=13 665) were classified as severely frail, 9.4% (n=46 770) as moderately frail, 43.1% (n=215 253) as mildly frail, and 44.8% (n=223 790) as fit.

Conclusions: We have developed new equations to predict the short term risk of death in men and women aged 65 or more, taking account of demographic, social, and clinical variables. The equations had good performance on a separate validation cohort. The QMortality equations can be used in conjunction with the QAdmissions equations to classify patients into four frailty groups (known as QFrailty categories), enabling patients to be identified for further assessment or interventions.
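The central modelling step, a Cox proportional hazards fit evaluated by Harrell's C, can be sketched as follows; the toy cohort and three covariates stand in for QResearch's millions of records and the full predictor list above.

```python
# Minimal Cox model sketch using the lifelines library on simulated data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "age": rng.uniform(65, 100, n),
    "bmi": rng.normal(27, 4, n),
    "prior_admission": rng.integers(0, 2, n),
    "years_followed": rng.exponential(5, n),     # toy follow-up times
    "died": rng.integers(0, 2, n),               # toy event indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="died")
print("Harrell's C:", round(cph.concordance_index_, 2))
```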

Regional Evaluation of the Severity-Based Stroke Triage Algorithm for Emergency Medical Services Using Discrete Event Simulation.

The Severity-Based Stroke Triage Algorithm for Emergency Medical Services endorses routing patients with suspected large vessel occlusion acute ischemic strokes directly to endovascular stroke centers (ESCs). We sought to evaluate different specifications of this algorithm within a region.

Optical coherence tomography angiography of the peripapillary capillaries in primary open-angle and normal-tension glaucoma.

To evaluate the vascular architecture of the radial peripapillary capillaries (RPCs) and its relation with visual function in patients with open-angle glaucoma (OAG) and normal-tension glaucoma using spectral-domain optical coherence tomography (SD-OCT) angiography.

Randomized Trial Comparing Two Algorithms for Levothyroxine Dose Adjustment in Pregnant Women With Primary Hypothyroidism.

Regulation of maternal thyroid hormones during pregnancy is crucial for optimal maternal and fetal outcomes. There are no specific guidelines addressing maternal levothyroxine (LT4) dose adjustments throughout pregnancy.

A Fully Automated Approach to Spike Sorting.

Understanding the detailed dynamics of neuronal networks will require the simultaneous measurement of spike trains from hundreds of neurons (or more). Currently, approaches to extracting spike times and labels from raw data are time consuming, lack standardization, and involve manual intervention, making it difficult to maintain data provenance and assess the quality of scientific results. Here, we describe an automated clustering approach and associated software package that addresses these problems and provides novel cluster quality metrics. We show that our approach has accuracy comparable to or exceeding that achieved using manual or semi-manual techniques with desktop central processing unit (CPU) runtimes faster than acquisition time for up to hundreds of electrodes. Moreover, a single choice of parameters in the algorithm is effective for a variety of electrode geometries and across multiple brain regions. This algorithm has the potential to enable reproducible and automated spike sorting of larger scale recordings than is currently possible.
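A generic sketch of the clustering stage of spike sorting is shown below (PCA on detected waveforms followed by density-based clustering); the paper's actual algorithm, parameters, and quality metrics differ, so treat this purely as an illustration of the task.

```python
# Cluster simulated spike waveforms (3 toy units, 200 spikes each) into
# putative single units.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(4)
templates = rng.normal(size=(3, 40))          # 3 unit templates, 40 samples each
waveforms = np.vstack([t + rng.normal(0, 0.3, size=(200, 40)) for t in templates])

features = PCA(n_components=5).fit_transform(waveforms)
labels = DBSCAN(eps=1.2, min_samples=20).fit_predict(features)
print("putative units found:", len(set(labels) - {-1}))
```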

Deep learning approach to bacterial colony classification.

In microbiology it is diagnostically useful to recognize various genera and species of bacteria. This can be achieved using computer-aided methods, which make the recognition process more automatic and thus significantly reduce the time necessary for classification. Moreover, in cases of diagnostic uncertainty (misleading similarity in the shape or structure of bacterial cells), such methods can minimize the risk of incorrect recognition. In this article, we apply a state-of-the-art method for texture analysis to classify genera and species of bacteria. This method uses deep Convolutional Neural Networks to obtain image descriptors, which are then encoded and classified with a Support Vector Machine or Random Forest. To evaluate this approach and to make it comparable with others, we provide a new dataset of images: the DIBaS dataset (Digital Image of Bacterial Species), which contains 660 images covering 33 different genera and species of bacteria.
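A minimal sketch of the descriptor-plus-classifier pipeline is given below, using a pretrained CNN as a fixed feature extractor with an SVM on top. The encoding step applied in the paper between the deep features and the classifier is omitted, and the images and labels are random placeholders rather than DIBaS data.

```python
# Extract CNN image descriptors and classify them with an SVM.
import torch
import torchvision.models as models
from sklearn.svm import SVC

cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # pretrained CNN
cnn.fc = torch.nn.Identity()          # drop the classification head
cnn.eval()

def describe(batch):                  # batch: (N, 3, 224, 224) image tensor
    with torch.no_grad():
        return cnn(batch).numpy()     # one 512-dim descriptor per image

images = torch.rand(20, 3, 224, 224)  # placeholder colony images
labels = [i % 4 for i in range(20)]   # placeholder genus labels

clf = SVC(kernel="rbf").fit(describe(images), labels)
print(clf.predict(describe(images[:5])))
```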

A neural network based computational model to predict the output power of different types of photovoltaic cells.

In this article, we introduce an artificial neural network (ANN)-based computational model to predict the output power of three types of photovoltaic cells: mono-crystalline (mono-), multi-crystalline (multi-), and amorphous (amor-) crystalline silicon. The prediction results are very close to the experimental data and are also influenced by the number of hidden neurons. Ranked from least to most influenced by external conditions, the generated power outputs are those of the multi-, mono-, and amor-crystalline silicon cells. In addition, the dependence of the power prediction on the number of hidden neurons was studied. For the multi- and amorphous crystalline cells, three or four hidden-layer units resulted in high correlation coefficients and low MSEs; for the mono-crystalline cell, the best results were achieved with eight hidden-layer units.
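The hidden-neuron sweep can be sketched as below, assuming toy inputs (e.g., irradiance and temperature) and measured output power; the real model, inputs, and per-cell-type training data are the paper's, not reproduced here.

```python
# Compare prediction quality for several hidden-layer sizes.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.uniform(size=(300, 2))                                  # irradiance, temperature
y = 0.8 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(0, 0.02, 300)    # toy output power

for n_hidden in (2, 3, 4, 8):
    mlp = MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=5000, random_state=0)
    r2 = cross_val_score(mlp, X, y, cv=5, scoring="r2").mean()
    print(f"{n_hidden} hidden neurons: mean cross-validated R^2 = {r2:.3f}")
```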

Initial Clinical Experience With a New Automated Antitachycardia Pacing Algorithm: Feasibility and Safety in an Ambulatory Patient Cohort.

Antitachycardia pacing (ATP) in implantable cardioverter-defibrillators (ICDs) decreases patient shock burden but has recognized limitations. A new automated ATP (AATP) based on electrophysiological first principles was designed. The study objective was to assess the feasibility and safety of AATP in ambulatory ICD patients.

Temporal causal inference with stochastic audiovisual sequences.

Integration of sensory information across multiple senses is most likely to occur when signals are spatiotemporally coupled. Yet, recent research on audiovisual rate discrimination indicates that random sequences of light flashes and auditory clicks are integrated optimally regardless of temporal correlation. This may be due to 1) temporal averaging rendering temporal cues less effective; 2) difficulty extracting causal-inference cues from rapidly presented stimuli; or 3) task demands prompting integration without concern for the spatiotemporal relationship between the signals. We conducted a rate-discrimination task (Exp 1), using slower, more random sequences than previous studies, and a separate causal-judgement task (Exp 2). Unisensory and multisensory rate-discrimination thresholds were measured in Exp 1 to assess the effects of temporal correlation and spatial congruence on integration. The performance of most subjects was indistinguishable from optimal for spatiotemporally coupled stimuli, and generally sub-optimal in other conditions, suggesting observers used a multisensory mechanism that is sensitive to both temporal and spatial causal-inference cues. In Exp 2, subjects reported whether temporally uncorrelated (but spatially co-located) sequences were perceived as sharing a common source. A unified percept was affected by click-flash pattern similarity and by the maximum temporal offset between individual clicks and flashes, but not by the proportion of synchronous click-flash pairs. A simulation analysis revealed that the stimulus-generation algorithms of previous studies are likely responsible for the observed integration of temporally independent sequences. By combining results from Exps 1 and 2, we found better rate-discrimination performance for sequences that are more likely to be integrated than for those that are not. Our results support the principle that multisensory stimuli are optimally integrated when spatiotemporally coupled, and provide insight into the temporal features used for coupling in causal inference.
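For reference, the "optimal" benchmark in such rate-discrimination studies is usually maximum-likelihood cue combination: reliability-weighted fusion of the unisensory estimates. Below is a two-line check of the predicted multisensory threshold, with made-up unisensory thresholds.

```python
# If unisensory estimates have variances sigma_a^2 and sigma_v^2, optimal
# fusion predicts sigma_av^2 = sigma_a^2 * sigma_v^2 / (sigma_a^2 + sigma_v^2).
sigma_a, sigma_v = 2.0, 3.0        # toy auditory and visual thresholds
sigma_av = (sigma_a**2 * sigma_v**2 / (sigma_a**2 + sigma_v**2)) ** 0.5
print(f"predicted optimal audiovisual threshold: {sigma_av:.2f}")
```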

Restrictive intraoperative fluid optimisation algorithm improves outcomes in patients undergoing pancreaticoduodenectomy: A prospective multicentre randomized controlled trial.

We aimed to evaluate perioperative outcomes in patients undergoing pancreaticoduodenectomy with or without a cardiac output goal directed therapy (GDT) algorithm. We conducted a multicentre randomised controlled trial in four high volume hepatobiliary-pancreatic surgery centres. We evaluated whether the additional impact of an intraoperative fluid optimisation algorithm would influence the amount of fluid delivered, reduce fluid-related complications, and improve length of hospital stay. Fifty-two consecutive adult patients were recruited. The median (IQR) duration of surgery was 8.6 hours (7.1:9.6) in the GDT group vs. 7.8 hours (6.8:9.0) in the usual care group (p = 0.2). Intraoperative fluid balance was 1005 mL (475:1873) in the GDT group vs. 3300 mL (2474:3874) in the usual care group (p<0.0001). Total volume of fluid administered intraoperatively was also lower in the GDT group: 2050 mL (1313:2700) vs. 4088 mL (3400:4525), p<0.0001, and vasoactive medications were used more frequently. There were no significant differences in the proportions of patients experiencing overall complications (p = 0.179); however, fewer complications occurred in the GDT group: 44 vs. 92 (Incidence Rate Ratio: 0.41; 95%CI 0.24 to 0.69, p = 0.001). Median (IQR) length of hospital stay was 9.5 days (IQR: 7.0, 14.3) in the GDT group vs. 12.5 days (IQR: 9.0, 22.3) in the usual care group, for an Incidence Rate Ratio of 0.64 (95% CI 0.48 to 0.85, p = 0.002). In conclusion, in this cohort of patients, a surgery-specific, patient-specific goal directed restrictive fluid therapy algorithm made it possible to give enough fluid to avoid oedema, yet as little fluid as possible to avoid hypovolaemia, i.e. "precision" fluid therapy. Our findings support the use of a perioperative haemodynamic optimization plan that prioritizes preservation of cardiac output and organ perfusion pressure by judicious use of fluid therapy, rational use of vasoactive drugs, and timely application of inotropic drugs. They also suggest the need for larger studies to confirm these findings.

Automatic machine-learning based identification of jogging periods from accelerometer measurements of adolescents under field conditions.

Assessment of the health benefits associated with physical activity depends on the activity's duration, intensity, and frequency; their correct identification is therefore very valuable and important in epidemiological and clinical studies. The aims of this study are: to develop an algorithm for automatic identification of intended jogging periods; and to assess whether identification performance is improved when using two accelerometers, at the hip and ankle, compared to using only one at either position.
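A typical pipeline for this task, sketched below with simulated data, slices the accelerometer signal into fixed windows, computes simple per-window features, and trains a classifier; the study's actual features, window length, and model are not specified here.

```python
# Window a toy single-accelerometer signal and classify jogging vs. not.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(9)
signal = rng.normal(size=120_000)                 # toy acceleration magnitude
is_jog = np.zeros(120_000, dtype=bool)
is_jog[40_000:60_000] = True
signal[is_jog] += 2 * np.sin(0.5 * np.arange(20_000))  # jogging adds periodicity

win = 500                                         # e.g., 5 s windows at 100 Hz
windows = signal.reshape(-1, win)
X = np.c_[windows.mean(1), windows.std(1),
          np.abs(np.diff(windows, axis=1)).mean(1)]
y = is_jog.reshape(-1, win).mean(1) > 0.5

clf = RandomForestClassifier(random_state=0).fit(X[::2], y[::2])  # even windows train
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))          # odd windows test
```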

A segmentation method for lung nodule image sequences based on superpixels and density-based spatial clustering of applications with noise.

The fast and accurate segmentation of lung nodule image sequences is the basis of subsequent processing and diagnostic analyses. However, previously reported nodule segmentation algorithms cannot entirely segment cavitary nodules, and their segmentation of juxta-vascular nodules is inaccurate and inefficient. To solve these problems, we propose a new method for the segmentation of lung nodule image sequences based on superpixels and density-based spatial clustering of applications with noise (DBSCAN). First, our method uses three-dimensional computed tomography image features of the average intensity projection combined with multi-scale dot enhancement for preprocessing. Hexagonal clustering and morphologically optimized sequential linear iterative clustering (HMSLIC) for sequence image oversegmentation is then proposed to obtain superpixel blocks. An adaptive weight coefficient is then constructed to calculate the distance between superpixels, achieving precise lung nodule positioning and providing the starting block for subsequent clustering. Moreover, by fitting the distance and detecting the change in slope, an accurate clustering threshold is obtained. Thereafter, a fast DBSCAN superpixel sequence clustering algorithm, optimized by the strategy of clustering only the lung nodules and by the adaptive threshold, is used to obtain lung nodule mask sequences. Finally, the lung nodule image sequences are obtained. The experimental results show that our method rapidly, completely and accurately segments various types of lung nodule image sequences.
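The superpixel-then-cluster idea can be sketched with off-the-shelf pieces, substituting standard SLIC for the paper's HMSLIC and clustering superpixel centroids and intensities with DBSCAN; the CT preprocessing, adaptive weights, and threshold fitting are omitted, and the input is a random stand-in for a CT slice.

```python
# Oversegment a toy slice into superpixels, then cluster superpixel features.
import numpy as np
from skimage.segmentation import slic
from sklearn.cluster import DBSCAN

slice_img = np.random.rand(128, 128)                 # stand-in for a CT slice
segments = slic(slice_img, n_segments=200, channel_axis=None)

feats = []
for label in np.unique(segments):
    ys, xs = np.nonzero(segments == label)
    feats.append([xs.mean(), ys.mean(), 100 * slice_img[ys, xs].mean()])

labels = DBSCAN(eps=15, min_samples=3).fit_predict(np.array(feats))
print("candidate clusters found:", len(set(labels) - {-1}))
```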

Indirect adaptive soft computing based wavelet-embedded control paradigms for WT/PV/SOFC in a grid/charging station connected hybrid power system.

This paper focuses on the indirect adaptive tracking control of renewable energy sources in a grid-connected hybrid power system. Renewable energy systems have low efficiency and an intermittent nature owing to unpredictable meteorological conditions, and the domestic load and conventional charging stations behave in an uncertain manner. To operate the renewable energy sources efficiently for harvesting maximum power, the instantaneous nonlinear dynamics should be captured online. A Chebyshev-wavelet-embedded NeuroFuzzy indirect adaptive MPPT (maximum power point tracking) control paradigm is proposed for a variable speed wind turbine with permanent magnet synchronous generator (VSWT-PMSG). A Hermite-wavelet-incorporated NeuroFuzzy indirect adaptive MPPT control strategy for a photovoltaic (PV) system to extract maximum power, and an indirect adaptive tracking control scheme for a solid oxide fuel cell (SOFC), are also developed. A comprehensive simulation test-bed for a grid-connected hybrid power system is developed in Matlab/Simulink. The robustness of the suggested indirect adaptive control paradigms is evaluated through simulation results in this test-bed by comparison with conventional and intelligent control techniques. The simulation results validate the effectiveness of the proposed control paradigms.
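As a point of reference for what MPPT does, here is the classic perturb-and-observe rule in a few lines; it is a deliberately simple stand-in, not the paper's NeuroFuzzy indirect adaptive schemes, which instead adapt an online model of the plant.

```python
# Hill-climb a toy power-voltage curve by perturbing the operating voltage.
def p_and_o(measure_power, v, step=0.5, iters=50):
    p_prev, direction = measure_power(v), 1
    for _ in range(iters):
        v += direction * step
        p = measure_power(v)
        if p < p_prev:              # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

# Toy PV curve with its maximum power point at v = 20 (illustrative only).
mpp_v = p_and_o(lambda v: max(0.0, v * (40 - v)), v=10.0)
print(f"operating voltage after tracking: {mpp_v:.1f}")
```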

Prediction and analysis of essential genes using the enrichments of gene ontology and KEGG pathways.

Identifying essential genes in a given organism is important for research on their fundamental roles in organism survival. Furthermore, if possible, uncovering the links between core functions or pathways and these essential genes will further help us gain deep insight into their key roles. In this study, we investigated the essential and non-essential genes reported in a previous study and extracted gene ontology (GO) terms and biological pathways that are important for the determination of essential genes. Through the enrichment theory of GO and KEGG pathways, we encoded each essential/non-essential gene into a vector in which each component represented the relationship between the gene and one GO term or KEGG pathway. To analyze these relationships, the maximum relevance minimum redundancy (mRMR) method was adopted. Then, incremental feature selection (IFS) and a support vector machine (SVM) were employed to extract important GO terms and KEGG pathways. A prediction model was built simultaneously using the extracted GO terms and KEGG pathways, which yielded nearly perfect performance, with a Matthews correlation coefficient of 0.951, in distinguishing essential from non-essential genes. To fully investigate the key factors influencing the fundamental roles of essential genes, the 21 most important GO terms and three KEGG pathways were analyzed in detail. In addition, several genes were predicted to be essential by our model and are reported in this study. We suggest that this study provides more functional and pathway information on the essential genes and provides a new way to investigate related problems.
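The IFS loop can be sketched as follows, with a univariate mutual-information ranking standing in for mRMR and synthetic data standing in for the GO/KEGG enrichment vectors.

```python
# Rank features, then grow the feature set one feature at a time, scoring
# each subset with an SVM and the Matthews correlation coefficient (MCC).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC
from sklearn.metrics import matthews_corrcoef

X, y = make_classification(n_samples=400, n_features=50, n_informative=8,
                           random_state=0)        # toy enrichment feature matrix
ranking = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1]

best_mcc, best_k = -1.0, 0
for k in range(1, 31):
    pred = cross_val_predict(SVC(), X[:, ranking[:k]], y, cv=5)
    mcc = matthews_corrcoef(y, pred)
    if mcc > best_mcc:
        best_mcc, best_k = mcc, k
print(f"best MCC {best_mcc:.3f} with top {best_k} ranked features")
```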

Robot-assisted Extracranial Stereotactic Radiotherapy of Adrenal Metastases in Oligometastatic Non-small Cell Lung Cancer.

The aim of this study was to evaluate the efficacy and toxicity of stereotactic body radiation therapy (SBRT) in the treatment of patients with adrenal metastases in oligometastatic non-small-cell lung cancer (NSCLC).

Feature selection before EEG classification supports the diagnosis of Alzheimer's disease.

In many decision support systems, some input features can be marginal or irrelevant to the diagnosis, while others can be redundant with each other. Thus, feature selection (FS) algorithms are often used to find relevant, non-redundant features. This study aimed to evaluate the relevance of FS approaches applied to Alzheimer's Disease (AD) EEG-based diagnosis and to compare the selected features with previous clinical findings.

A spatio-temporal prediction model based on support vector machine regression: Ambient Black Carbon in three New England States.

Fine ambient particulate matter has been widely associated with multiple health effects, and mitigation hinges on understanding which sources contribute to its toxicity. Black Carbon (BC), an indicator of particles generated from traffic sources, has been associated with a number of health effects; however, owing to its high spatial variability, its concentration is difficult to estimate. We previously fit a model estimating BC concentrations in the greater Boston area; however, this model was built using limited monitoring data and could not capture the complex spatio-temporal patterns of ambient BC. To improve our predictive ability, we obtained more data, for a total of 24,301 measurements from 368 monitors over a 12-year period in Massachusetts, Rhode Island and New Hampshire. We also used nu-support vector regression (nu-SVR), a machine learning technique that incorporates nonlinear terms and higher-order interactions, with appropriate regularization of parameter estimates. We then used a generalized additive model to refit the residuals from the nu-SVR and added the residual predictions to our earlier estimates. Both spatial and temporal predictors were included in the model, which allowed us to capture the change in spatial patterns of BC over time. The 10-fold cross-validated (CV) R² of the model was good in both the cold season (CV R² = 0.87) and the warm season (CV R² = 0.79). We have successfully built a model that can be used to estimate short- and long-term exposures to BC and will be useful for studies examining various health outcomes in MA, RI and southern NH.
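The two-stage fit can be sketched as follows: a nu-SVR on spatial and temporal predictors, then a second model fitted to its residuals. A gradient-boosted regressor stands in for the paper's generalized additive model, and the data are simulated.

```python
# Stage 1: nu-SVR on toy spatio-temporal predictors. Stage 2: model the
# residuals and add the two predictions.
import numpy as np
from sklearn.svm import NuSVR
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
X = rng.uniform(size=(2000, 4))            # e.g., lon, lat, day-of-year, traffic
y = np.sin(6 * X[:, 0]) * X[:, 3] + 0.2 * X[:, 2] + rng.normal(0, 0.1, 2000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

svr = NuSVR(nu=0.5, C=10.0, gamma="scale").fit(X_tr, y_tr)
resid_model = GradientBoostingRegressor().fit(X_tr, y_tr - svr.predict(X_tr))

pred = svr.predict(X_te) + resid_model.predict(X_te)
ss_res = ((y_te - pred) ** 2).sum()
ss_tot = ((y_te - y_te.mean()) ** 2).sum()
print("held-out R^2:", round(1 - ss_res / ss_tot, 2))
```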

Robotic Automation of In Vivo Two-Photon Targeted Whole-Cell Patch-Clamp Electrophysiology.

Whole-cell patch-clamp electrophysiological recording is a powerful technique for studying cellular function. While in vivo patch-clamp recording has recently benefited from automation, it is normally performed "blind," meaning that throughput for sampling some genetically or morphologically defined cell types is unacceptably low. One solution to this problem is to use two-photon microscopy to target fluorescently labeled neurons. Combining this with robotic automation is difficult, however, as micropipette penetration induces tissue deformation, moving target cells from their initial location. Here we describe a platform for automated two-photon targeted patch-clamp recording, which solves this problem by making use of a closed-loop visual servo algorithm. Our system keeps the target cell in focus while iteratively adjusting the pipette approach trajectory to compensate for tissue motion. We demonstrate platform validation with patch-clamp recordings from a variety of cells in the mouse neocortex and cerebellum.
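The closed-loop idea can be sketched as a proportional visual servo: re-detect the target each cycle and move the pipette to cancel the remaining error. The detection and motor interfaces below are hypothetical placeholders, not the authors' API.

```python
# One servo iteration: sense the (possibly displaced) cell, correct toward it.
import numpy as np

def servo_step(pipette_xyz, detect_cell, gain=0.5):
    target = detect_cell()                     # two-photon detection (placeholder)
    error = target - pipette_xyz
    pipette_xyz = pipette_xyz + gain * error   # proportional correction
    return pipette_xyz, np.linalg.norm(error)

rng = np.random.default_rng(10)
pipette = np.array([0.0, 0.0, 50.0])
cell = np.array([5.0, -3.0, 20.0])
for _ in range(10):
    cell += rng.normal(0, 0.2, 3)              # tissue deformation drifts the cell
    pipette, err = servo_step(pipette, lambda: cell.copy())
    if err < 1.0:                              # close enough to begin the approach
        break
print(f"remaining pipette-cell distance: {err:.2f} um")
```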

Robot-Assisted Training of Arm and Hand Movement Shows Functional Improvements for Incomplete Cervical Spinal Cord Injury.

The aim of the study was to demonstrate the feasibility, tolerability, and effectiveness of robotic-assisted arm training in incomplete chronic tetraplegia.

Elastic network model of learned maintained contacts to predict protein motion.

We present a novel elastic network model, lmcENM, to determine protein motion even for localized functional motions that involve substantial changes in the protein's contact topology. Existing elastic network models assume that the contact topology remains unchanged throughout the motion and are thus most appropriate for simulating highly collective function-related movements. lmcENM uses machine learning to differentiate breaking from maintained contacts. We show that lmcENM accurately captures functional transitions unexplained by the classical ENM and three reference ENM variants, while preserving the simplicity of the classical ENM. We demonstrate the effectiveness of our approach on a large set of proteins covering different motion types. Our results suggest that accurately predicting a "deformation-invariant" contact topology offers a promising route to increasing the general applicability of ENMs. We also find that correctly predicting this contact topology requires a combination of several features, which may vary slightly depending on the protein. Additionally, we present case studies of two biologically interesting systems, the ferric citrate membrane transporter FecA and arachidonate 15-lipoxygenase.
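To ground the terminology, the sketch below builds the classical elastic-network step that lmcENM modifies: a Gaussian network model Kirchhoff matrix whose low-frequency eigenvectors approximate collective motions. lmcENM's learned classifier of breaking versus maintained contacts is represented by an arbitrary random mask, purely as a placeholder.

```python
# Classical GNM on toy C-alpha coordinates, with a placeholder contact mask.
import numpy as np

rng = np.random.default_rng(7)
coords = rng.uniform(0, 30, size=(60, 3))            # toy C-alpha coordinates
cutoff = 10.0

dists = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
contacts = (dists < cutoff) & ~np.eye(len(coords), dtype=bool)
mask = rng.uniform(size=contacts.shape) > 0.1        # placeholder for learned labels
maintained = contacts & mask & mask.T                # keep the matrix symmetric

kirchhoff = np.diag(maintained.sum(axis=1)) - maintained.astype(float)
eigvals, eigvecs = np.linalg.eigh(kirchhoff)
print("slowest non-trivial mode:", eigvecs[:, 1].round(2))
```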

Automated diagnosis of myositis from muscle ultrasound: Exploring the use of machine learning and deep learning methods.

To evaluate the use of ultrasound coupled with machine learning (ML) and deep learning (DL) techniques for automated or semi-automated classification of myositis.

Validation of an Algorithm for Semi-automated Estimation of Voice Relative Fundamental Frequency.

Relative fundamental frequency (RFF) has shown promise as an acoustic measure of voice, but the subjective and time-consuming nature of its manual estimation has made clinical translation infeasible. Here, a faster, more objective algorithm for RFF estimation is evaluated in a large and diverse sample of individuals with and without voice disorders.
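For orientation, the RFF quantity itself is simple to compute once cycle-level fundamental frequencies have been extracted: each of the ten voicing cycles adjacent to the voiceless consonant is expressed in semitones relative to a steady-state reference cycle. The cycle values below are made-up numbers; the paper's contribution is automating their extraction reliably.

```python
# Semitone offsets of ten voicing-offset cycles relative to the first
# (steady-state) cycle.
import numpy as np

cycle_f0 = np.array([200, 199, 198, 196, 193, 189, 184, 178, 171, 163], float)
rff = 12 * np.log2(cycle_f0 / cycle_f0[0])   # 12 semitones per octave
print(rff.round(2))
```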