Posts Tagged ‘science’
Comparison of Electronic Data Capture (EDC) with the Standard Data Capture Method for Clinical Trial Data
Posted by Scott Christley et al. on September 23, 2011 – 9:00 pm · Paper by Brigitte Walther, Safayet Hossin, John Townend, Neil Abernethy, David Parker, David Jeffries
Background: Traditionally, clinical research studies rely on collecting data with case report forms, which are subsequently entered into a database to create electronic records. Although well established, this method is time-consuming and error-prone. This study compares four electronic data capture (EDC) methods with the conventional approach with respect to the duration of data capture and accuracy. It was performed in a West African setting, where clinical trials involve data collection from urban, rural and often remote locations.
Methodology/Principal Findings: Three types of commonly available EDC tools were assessed in face-to-face interviews: netbook, PDA, and tablet PC. EDC performance during telephone interviews via mobile phone was evaluated as a fourth method. A Graeco-Latin square study design allowed comparison of all four methods with standard paper-based recording followed by double data entry, while simultaneously controlling for possible confounding factors such as interview order, interviewer and interviewee. Over the three-week study period the error rates decreased considerably for all EDC methods. In the last week of the study, data accuracy for the netbook (5.1%, CI95%: 3.5–7.2%) and the tablet PC (5.2%, CI95%: 3.7–7.4%) was not significantly different from that of the conventional paper-based method (3.6%, CI95%: 2.2–5.5%), but error rates for the PDA (7.9%, CI95%: 6.0–10.5%) and telephone (6.3%, CI95%: 4.6–8.6%) remained significantly higher. While EDC interviews take slightly longer, the data become available immediately after download, making EDC more time-effective. Free-text and date fields were associated with higher error rates than numerical, single-select and skip fields.
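The figures above pair an error proportion with a 95% confidence interval. The abstract does not say which interval method the study used; as a hedged illustration, the Python sketch below computes a Wilson score interval, and the error and field counts in the example are purely hypothetical:

```python
# A minimal sketch (not the study's analysis code) of a 95% confidence
# interval for an error rate, using the Wilson score interval.
from math import sqrt

def wilson_ci(errors, n, z=1.96):
    """95% Wilson score interval for an error proportion errors/n."""
    p = errors / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# Hypothetical counts: 41 erroneous fields out of 800 captured fields.
low, high = wilson_ci(errors=41, n=800)
print(f"5.1% (CI95%: {low:.1%}-{high:.1%})")  # roughly 3.8%-6.9%
```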
Conclusions: EDC solutions can achieve data accuracy similar to that of paper-based methods. Given the considerable reduction in time from data collection to database lock, EDC holds promise for reducing research-associated costs. However, successful implementation of EDC requires adjustment of work processes and reallocation of resources.
Tags: computer, news, science
Posted in Computer Science | Comments Off
Characterizing and Modeling Citation Dynamics
Posted by Scott Christley et al. on September 22, 2011 – 9:00 pm · Paper by Young-Ho Eom, Santo Fortunato
Citation distributions are crucial for the analysis and modeling of the activity of scientists. We investigated bibliometric data of papers published in journals of the American Physical Society, searching for the type of function that best describes the observed citation distributions. We assessed goodness of fit with the Kolmogorov-Smirnov statistic for three classes of functions: log-normal, simple power law and shifted power law. The shifted power law turns out to be the most reliable hypothesis for all the citation networks we derived, which correspond to different time spans. We find that citation dynamics are characterized by bursts, usually occurring within a few years of a paper's publication, with burst sizes spanning several orders of magnitude. We also investigated the microscopic mechanisms behind the evolution of citation networks by proposing a linear preferential attachment with time-dependent initial attractiveness. The model successfully reproduces the empirical citation distributions and accounts for the presence of citation bursts as well.
Tags: computer, news, science
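As a sketch of the model-selection step described above (not the authors' code; the grid ranges for the shift c0 and exponent gamma, and the minimum citation count, are assumptions), a shifted power law can be fit to citation counts by minimizing the Kolmogorov-Smirnov distance between the empirical and model CDFs:

```python
# Hedged sketch: fit a shifted power law P(c) ~ (c + c0)^(-gamma) to
# citation counts (assumed >= 1) by grid search over the KS distance.
import numpy as np

def ks_distance(citations, cdf):
    """Largest gap between the empirical CDF and a model CDF."""
    x = np.sort(citations)
    ecdf = np.arange(1, len(x) + 1) / len(x)
    return np.max(np.abs(ecdf - cdf(x)))

def shifted_power_law_cdf(c0, gamma, cmin=1):
    # For c >= cmin and gamma > 1, the complementary CDF is
    # ((c + c0) / (cmin + c0))**(1 - gamma); the CDF is one minus that.
    return lambda c: 1.0 - ((c + c0) / (cmin + c0)) ** (1.0 - gamma)

def fit_shifted_power_law(citations):
    """Return (KS distance, (c0, gamma)) of the best grid point."""
    best = (np.inf, (None, None))
    for c0 in np.arange(0.0, 20.0, 0.5):         # assumed search range
        for gamma in np.arange(1.5, 4.0, 0.05):  # assumed search range
            d = ks_distance(citations, shifted_power_law_cdf(c0, gamma))
            if d < best[0]:
                best = (d, (c0, gamma))
    return best
```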
Posted in Computer Science | Comments Off
Judgment of the Humanness of an Interlocutor Is in the Eye of the Beholder
Posted by Scott Christley et al. on September 22, 2011 – 9:00 pm · Paper by Catherine L. Lortie, Matthieu J. Guitton
Despite tremendous advances in artificial language synthesis, no machine has so far succeeded in deceiving a human. Most research has focused on analyzing the behavior of "good" machines. We chose the opposite strategy, analyzing the behavior of "bad" humans, i.e., humans perceived as machines. The Loebner Prize in Artificial Intelligence features humans and artificial agents trying to convince judges of their humanness via computer-mediated communication. Using this setting as a model, we investigated whether the linguistic behavior of human subjects perceived as non-human would enable us to identify some of the core parameters involved in the judgment of an agent's humanness. We analyzed descriptive and semantic aspects of dialogues in which subjects succeeded or failed to convince judges of their humanness. Using cognitive and emotional dimensions in a global behavioral characterization, we demonstrate important differences in the judges' patterns of behavioral expressiveness depending on whether they perceived their interlocutor as human or machine. Furthermore, the indicators of interest displayed by the judges were predictive of the final judgment of humanness. Thus, we show that the judgment of an interlocutor's humanness during a social interaction depends not only on the interlocutor's behavior, but also on the judge. Our results thus demonstrate that the judgment of humanness is in the eye of the beholder.
Tags: computer, news, science
Posted in Computer Science | Comments Off
Comparing Effectiveness of Top-Down and Bottom-Up Strategies in Containing Influenza
Posted by Scott Christley et al. on September 22, 2011 – 9:00 pm · Paper by Achla Marathe, Bryan Lewis, Christopher Barrett, Jiangzhuo Chen, Madhav Marathe, Stephen Eubank, Yifei Ma
This research compares the performance of bottom-up, self-motivated behavioral interventions with top-down interventions targeted at controlling an "influenza-like illness". Both types of interventions use a variant of the ring strategy. In the first case, when the fraction of a person's direct contacts who are diagnosed exceeds a threshold, that person decides to seek prophylaxis, e.g. vaccine or antivirals; in the second case, we consider two intervention protocols, denoted Block and School: when the fraction of people diagnosed in a Census Block (resp., School) exceeds the threshold, the entire Block (resp., School) is prophylaxed. Results show that the bottom-up strategy outperforms the top-down strategies under our parameter settings. Even in situations where the Block strategy reduces the overall attack rate well, it incurs a much higher cost. These findings lend credence to the notion that if people used antivirals effectively, making them available quickly on demand to private citizens could be a very effective way to control an outbreak.
Tags: computer, news, science
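The bottom-up rule lends itself to a compact statement in code. Below is a minimal sketch (my own toy version on a networkx contact graph, not the authors' simulation framework; the threshold value is illustrative) of the self-motivated trigger, where a person seeks prophylaxis once the diagnosed fraction of their direct contacts crosses the threshold:

```python
# Hedged sketch of the bottom-up, ring-style trigger described above.
import networkx as nx

def bottom_up_prophylaxis(G, diagnosed, threshold=0.2):
    """Return the set of people who would self-trigger prophylaxis.

    G: networkx graph of direct contacts; diagnosed: set of diagnosed nodes.
    """
    triggered = set()
    for person in G.nodes:
        contacts = list(G.neighbors(person))
        if not contacts:
            continue
        frac_diagnosed = sum(c in diagnosed for c in contacts) / len(contacts)
        if frac_diagnosed >= threshold:
            triggered.add(person)
    return triggered
```

The top-down Block and School protocols differ only in the grouping: the same fraction test is applied to a whole Census Block or school, and the entire group is prophylaxed when it trips.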
Posted in Computer Science | Comments Off
Modeling Fractal Structure of City-Size Distributions Using Correlation Functions
Posted by Scott Christley et al. on September 20, 2011 – 9:00 pm · Paper by Yanguang Chen
Zipf's law is one of the most conspicuous empirical facts for cities; however, there is no convincing explanation for the scaling relation between rank and size or for its scaling exponent. Using ideas from general fractals and scaling, I propose a dual competition hypothesis of city development to explain the value intervals and the special value, 1, of the power exponent. Zipf's law and Pareto's law can be mathematically transformed into one another, but they represent different processes of urban evolution. Based on the Pareto distribution, a frequency correlation function can be constructed; by scaling analysis and the multifractal spectrum, the parameter interval of the Pareto exponent is derived as (0.5, 1]. Based on the Zipf distribution, an opposite size correlation function can be built; by this second correlation function and the notion of multifractals, the Pareto exponent interval is derived as [1, 2). Thus the process of urban evolution involves two effects: the Pareto effect, indicating growth in city numbers (external complexity), and the Zipf effect, indicating growth in city sizes (internal complexity). Because of the struggle between the two effects, the scaling exponent varies from 0.5 to 2; if the two effects reach equilibrium, the scaling exponent approaches 1. A series of mathematical experiments on hierarchical correlation is employed to verify the models, and the conclusion is that if cities in a given region follow Zipf's law, the frequency and size correlations will follow the scaling law. This theory can be generalized to interpret inverse power-law distributions in various fields of the physical and social sciences.
Tags: computer, news, science
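As a brief aside on the Zipf-Pareto transformation invoked above (a standard result; the notation is mine, not the paper's): writing the rank-size rule for the r-th largest city and inverting it shows that the number of cities at or above size x follows a Pareto law whose exponent is the reciprocal of the Zipf exponent.

```latex
P(r) = P_1 \, r^{-q}
\quad\Longrightarrow\quad
r(x) = \left( \frac{x}{P_1} \right)^{-1/q}
\quad\Longrightarrow\quad
N(X \ge x) \propto x^{-\alpha}, \qquad \alpha = \frac{1}{q}.
```

In particular, q = 1 corresponds to alpha = 1, the special value whose origin the dual competition hypothesis seeks to explain.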
Posted in Computer Science | Comments Off
REST: A Toolkit for Resting-State Functional Magnetic Resonance Imaging Data Processing
Posted by Scott Christley et al. on September 20, 2011 – 9:00 pm · Paper by Xiao-Wei Song, Zhang-Ye Dong, Xiang-Yu Long, Su-Fang Li, Xi-Nian Zuo, Chao-Zhe Zhu, Yong He, Chao-Gan Yan, Yu-Feng Zang
Resting-state fMRI (RS-fMRI) has attracted increasing attention in recent years. However, a publicly available, systematically integrated and easy-to-use tool for RS-fMRI data processing has been lacking. We developed a toolkit for the analysis of RS-fMRI data, namely the RESting-state fMRI data analysis Toolkit (REST). REST was developed in MATLAB with a graphical user interface (GUI). After data preprocessing with SPM or AFNI, several analytic methods can be performed in REST, including functional connectivity analysis based on linear correlation, regional homogeneity, amplitude of low-frequency fluctuation (ALFF), and fractional ALFF. Several additional functions are implemented in REST, including a DICOM sorter, linear trend removal, bandpass filtering, time course extraction, regression of covariates, an image calculator, statistical analysis, and a slice viewer (for result visualization, multiple-comparison correction, etc.). REST is an open-source package and is freely available at http://www.restfmri.net.
Tags: computer, news, science
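REST itself is a MATLAB GUI package, but the ALFF and fALFF measures it computes are easy to state. The Python sketch below is a hedged, minimal re-expression (not REST's code; the 0.01-0.08 Hz band is a typical default, assumed here) for a single voxel time course:

```python
# Hedged sketch of ALFF/fALFF for one voxel's BOLD time course.
import numpy as np

def alff(timeseries, tr, low=0.01, high=0.08):
    """Return (ALFF, fALFF) for a 1D BOLD series sampled every `tr` seconds.

    ALFF sums spectral amplitude in the low-frequency band; fALFF divides
    that by the amplitude summed over the whole (non-DC) spectrum.
    """
    x = np.asarray(timeseries, dtype=float)
    freqs = np.fft.rfftfreq(len(x), d=tr)
    amp = np.abs(np.fft.rfft(x - x.mean())) / len(x)
    band = (freqs >= low) & (freqs <= high)
    alff_val = amp[band].sum()
    falff_val = alff_val / amp[1:].sum()  # skip the DC bin
    return alff_val, falff_val
```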
Posted in Computer Science | Comments Off
Human NK Cells Differ More in Their KIR2DL1-Dependent Thresholds for HLA-Cw6-Mediated Inhibition than in Their Maximal Killing Capacity
Posted by Scott Christley et al. on September 19, 2011 – 9:00 pm · Paper by Catarina R. Almeida, Amit Ashkenazi, Gitit Shahaf, Deborah Kaplan, Daniel M. Davis, Ramit Mehr
In this study we address the question of how activation and inhibition of human NK cells are regulated by the expression level of MHC class I protein on target cells. Using target cell transfectants sorted to stably express different levels of the MHC class I protein HLA-Cw6, we show that induction of degranulation and induction of IFN-γ secretion are not correlated. In contrast, the inhibition of these two processes by MHC class I occurs at the same level of MHC class I protein. Primary human NK cell clones were found to differ in the amount of target MHC class I protein required for their inhibition, rather than in their maximal killing capacity. Importantly, we show that KIR2DL1 expression determines the thresholds (in terms of MHC I protein levels) required for NK cell inhibition, while the expression of other receptors such as LIR1 is less important. Furthermore, using mathematical models to explore the dynamics of target cell killing, we found that the observed delay in target cell killing is reproduced by a model in which NK cells require activation or priming, such that each cell can lyse a target cell only after being activated by a first encounter with the same or a different target cell, but not by models that lack this feature.
Tags: computer, news, science
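The priming feature the models point to can be illustrated with a toy two-compartment simulation (my own construction, not the authors' model; the rate constants are arbitrary): naive NK cells are converted to activated cells by a first target encounter, and only activated cells lyse targets, which produces the observed delay before target numbers fall.

```python
# Hedged toy model: killing starts only after NK cells are primed by a
# first encounter, so the target count stays nearly flat at early times.
def simulate_priming(nk_naive=100.0, targets=100.0, prime_rate=1e-3,
                     kill_rate=1e-3, dt=0.01, t_end=50.0):
    nk_active = 0.0
    t, history = 0.0, []
    while t < t_end:
        primed = prime_rate * nk_naive * targets * dt   # first encounters
        killed = kill_rate * nk_active * targets * dt   # only primed cells lyse
        nk_naive -= primed
        nk_active += primed
        targets = max(targets - killed, 0.0)
        history.append((t, targets))
        t += dt
    return history  # targets(t) shows a lag, then an accelerating decline
```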
Posted in Computer Science | Comments Off
Maintaining Vaccine Delivery Following the Introduction of the Rotavirus and Pneumococcal Vaccines in Thailand
Posted by Scott Christley et al. on September 13, 2011 – 9:00 pm · Paper by Bruce Y. Lee, Tina-Marie Assi, Korngamon Rookkapan, Angela R. Wateska, Jayant Rajgopal, Vorasith Sornsrivichai, Sheng-I Chen, Shawn T. Brown, Joel Welling, Bryan A. Norman, Diana L. Connor, Rachel R. Bailey, Anirban Jana, Willem G. Van Panhuis, Donald S. Burke
Although the substantial burdens of rotavirus and pneumococcal disease have motivated many countries to consider introducing the rotavirus vaccine (RV) and the heptavalent pneumococcal conjugate vaccine (PCV-7) into their National Immunization Programs (EPIs), these new vaccines could affect the countries' vaccine supply chains (i.e., the series of steps required to get a vaccine from its manufacturer to patients). We developed detailed computational models of the vaccine supply chain of Trang Province, Thailand, to simulate introducing various RV and PCV-7 vaccine presentations and their combinations. Our results showed that the volumes of these new vaccines, in addition to current routine vaccines, could meet or even exceed (1) the refrigerator space at the provincial, district and sub-district levels and (2) the transport cold space at the district and sub-district levels, preventing other vaccines from being available to patients who arrive to be immunized. Except for the smallest RV presentation (17.1 cm³/dose), all other vaccine introduction scenarios required added storage capacity at the provincial level (range: 20 L–1,151 L per month) for the three largest formulations, and at the district level (range: 1 L–124 L per month) across all introduction scenarios. Similarly, with the exception of the two smallest RV presentations, added transport capacity was required at both the district and sub-district levels. Across introduction scenarios, the added transport capacity required ranged from 1 L–187 L per shipment from the provincial to district levels, and from 1 L–13 L per shipment from the district to sub-district levels. Finally, only the smallest RV vaccine presentation (17.1 cm³/dose) had no appreciable effect on vaccine availability at sub-districts. All other RV and PCV-7 vaccines were too large for the current supply chain to handle without modifications such as increasing storage or transport capacity. Introducing these new vaccines in Thailand could have dynamic effects on the availability of all vaccines that may not be initially apparent to decision-makers.
Tags: computer, news, science
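The storage findings boil down to simple cold-chain arithmetic: monthly dose volume versus available refrigerator litres. The sketch below (all numbers except the 17.1 cm³/dose figure from the abstract are made up; the real models track many more details, such as shipment schedules and per-level transport) shows the shape of the calculation:

```python
# Hedged back-of-the-envelope version of the cold-storage calculation.
def added_storage_litres(doses_per_month, cm3_per_dose, free_capacity_litres):
    """Extra cold storage (litres) a new vaccine needs beyond free capacity."""
    required = doses_per_month * cm3_per_dose / 1000.0  # cm^3 -> litres
    return max(required - free_capacity_litres, 0.0)

# Hypothetical: 10,000 monthly doses of the smallest RV presentation
# against 150 L of free provincial refrigerator space.
print(added_storage_litres(10_000, 17.1, free_capacity_litres=150.0))  # 21.0
```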
Posted in Computer Science | Comments Off
Higuchi Dimension of Digital Images
Posted by Scott Christley et al. on September 13, 2011 – 9:00 pm · Paper by Helmut Ahammer
Several methods exist for calculating the fractal dimension of objects represented as 2D digital images; for example, box counting, Minkowski dilation or Fourier analysis can be employed. However, these have some limitations: it is not possible to calculate the fractal dimension of only an irregular region of interest in an image, or to perform the calculations in a particular direction along a line at an arbitrary angle through the image. The calculations must be made for the whole image. In this paper, a new method to overcome these limitations is proposed. 2D images are appropriately prepared in order to apply 1D signal analyses originally developed to investigate nonlinear time series. The Higuchi dimension of these 1D signals is calculated using Higuchi's algorithm, and it is shown that both regions of interest and directional dependencies can be evaluated independently of the whole picture. A thorough validation of the proposed technique and a comparison of the new method to the Fourier dimension, a common two-dimensional method for digital images, are given. The main result is that Higuchi's algorithm allows both direction-dependent and direction-independent analysis. The computed fractal dimensions are reliable, and an effective treatment of regions of interest is possible. Moreover, the proposed method is not restricted to Higuchi's algorithm, as any 1D method of analysis can be applied.
Tags: computer, news, science
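Higuchi's algorithm, the 1D core of the method, is short enough to state directly. The sketch below is a standard formulation (it covers only the 1D step, not the paper's image-to-signal preparation; kmax is a tuning choice):

```python
# Hedged sketch of Higuchi's fractal dimension for a 1D signal.
import numpy as np

def higuchi_fd(signal, kmax=10):
    """Estimate the Higuchi fractal dimension of a 1D signal."""
    x = np.asarray(signal, dtype=float)
    n = len(x)
    mean_lengths = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):  # k subsampled curves with offsets 0..k-1
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # curve length of the subsampled series, with Higuchi's
            # normalization factor
            diff = np.abs(np.diff(x[idx])).sum()
            lengths.append(diff * (n - 1) / ((len(idx) - 1) * k * k))
        mean_lengths.append(np.mean(lengths))
    k_vals = np.arange(1, kmax + 1)
    # L(k) ~ k^(-D), so D is the slope of log L(k) against log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(mean_lengths), 1)
    return slope
```

For the region-of-interest and directional analyses described above, one would extract the pixel values along the chosen line or within the masked region as the 1D signal before calling such a function.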
Posted in Computer Science | Comments Off
