Robot to find and connect medical scientists working on the same research via Open Data

Sharing research data, a core practice of Open Science, aims to accelerate scientific discovery, which is of particular importance in the case of new medicines and treatments. A grant proposal by an international research team, led by Dr Chase C. Smith of MCPHS University and submitted to the Open Science Prize, proposes the development of what the authors call the SCience INtroDuction Robot (SCINDR). The project's proposal is available in the open access journal Research Ideas and Outcomes (RIO).

Building on an open source electronic lab notebook (ELN) developed by the same team, the robot would identify, in real time, scientists from around the world who are working on similar molecules and alert them to one another. Finding each other and engaging in open, collaborative research could accelerate and enhance medical discoveries.

Already running and constantly updated, the electronic lab notebook stores researchers' open data in a machine-readable and openly accessible format. The scientists' next step is to adapt the open source notebook to run SCINDR, as demonstrated in their prototype.

“The above mentioned ELN is the perfect platform for the addition of SCINDR since it is already acting as a repository of open drug discovery information that can be mined by the robot,” explain the authors.

Once a researcher has their data stored in the ELN, or in any similar open database for that matter, SCINDR would be able to detect whether similar molecules, chemical reactions, biological assays or other features of importance in health research have been entered by someone else. If the robot identifies another scientist looking into similar features, it will suggest introducing the two to each other, so that they can start working together and combine their efforts and knowledge for the good of both science and the public.
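To make the idea concrete, here is a minimal sketch, in Python, of the kind of similarity check a robot like SCINDR could run over open ELN records: comparing two molecules with Morgan fingerprints and Tanimoto similarity using the RDKit toolkit. The library choice, the threshold and the hypothetical entries are illustrative assumptions, not the project's actual implementation.

# Minimal sketch: flag two ELN entries whose molecules are structurally
# similar. RDKit, the 0.3 threshold and the example records are assumptions
# for illustration only.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def tanimoto(smiles_a, smiles_b):
    """Tanimoto similarity between two molecules given as SMILES strings."""
    fp_a = AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles_a), 2, nBits=2048)
    fp_b = AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles_b), 2, nBits=2048)
    return DataStructs.TanimotoSimilarity(fp_a, fp_b)

# Hypothetical open ELN entries: (researcher, SMILES of a logged molecule)
entry_a = ("researcher.a@example.org", "CC(=O)Oc1ccccc1C(=O)O")  # aspirin
entry_b = ("researcher.b@example.org", "OC(=O)c1ccccc1O")        # salicylic acid

THRESHOLD = 0.3  # illustrative cut-off for "similar enough to introduce"
score = tanimoto(entry_a[1], entry_b[1])
print(f"Tanimoto similarity: {score:.2f}")
if score >= THRESHOLD:
    print(f"Suggest introducing {entry_a[0]} and {entry_b[0]}")

In a full system, each new entry would be fingerprinted once and compared against an index of existing open records, with introductions suggested only above a chosen similarity threshold.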

Because of its ability to parse information and interests from around the globe, the authors liken SCINDR to online advertising and music streaming services, which have long targeted content at users based on their writing, reading and listening habits or search history.

“The potential for automatically connecting relevant people and/or matching people with commercial content currently dominates much of software development, yet the analogous idea of automatically connecting people who are working on similar science in real time does not exist,” stress the authors.

“This extraordinary fact arises in part because so few people work openly, meaning almost all the research taking place in laboratories around the world remains behind closed doors until publication (or in a minority of cases deposition to a preprint server), by which time the project may have ended and researchers have moved on or shelved a project.”

“As open science gathers pace, and as thousands of researchers start to use open records of their research, we will need a way to discover the most relevant collaborators, and encourage them to connect. SCINDR will solve this problem,” they conclude.

The system is intended to be tested initially by a community of researchers known as Open Source Malaria (OSM), a consortium funded to carry out drug discovery and development for new medicines for the treatment of malaria.

###

Original source:

Smith C, Todd M, Patiny L, Swain C, Southan C, Williamson A, Clark A (2016) SCINDR – The SCience INtroDuction Robot that will Connect Open Scientists. Research Ideas and Outcomes 2: e9995. doi: 10.3897/rio.2.e9995

New proposal published in RIO tackles problematic trial detection in ClinicalTrials.gov

Clinical trials are crucial in determining the effectiveness of treatments and directly influence practical and policy decisions. However, their results can even be detrimental to real-life patients if the data are fabricated or subject to errors. Although only about 2% of all researchers admit to having manipulated their data, a new Dutch Fulbright project proposal, published in the innovative Research Ideas and Outcomes (RIO) Journal, suggests new methods to tackle these issues and to apply them to results reported in the ClinicalTrials.gov database.

Decisions based on bad data, whether clinical decisions made by medical doctors or policy decisions made by governmental institutions, can pose direct risks to treated patients and to the population in general. Such was the case with beta-blockers, for instance, which used to be prescribed to cardiac patients in order to decrease perioperative mortality. A subsequent meta-analysis, however, detected erroneous data in the related clinical trials and found that beta-blockers actually increase the risk of mortality.

The new Dutch Fulbright proposal, led by Chris HJ Hartgerink, Tilburg University, Netherlands, and Dr. Stephen L George, Duke University, United States, puts forward additional statistical methods for detecting erroneous data, providing an extra quality control filter for clinical trial results reported in the ClinicalTrials.gov database.

Unfortunately, misleading data is not only the product of bad practice; it can also result from human error or inadequate data handling. It is not even clear how often such mistakes or manipulations occur, let alone how prevalent they are in any particular field of science. What is beyond doubt, however, is that additional methods and procedures for detecting bad data are needed in order to minimize the risk of bad decisions being taken when health and wellbeing are at stake.
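As one illustrative example of what a statistical screen for problematic data can look like (a generic, well-known check, not necessarily among the methods proposed by Hartgerink and George), the Python sketch below applies terminal-digit analysis: the final digits of many measured quantities are expected to be roughly uniformly distributed, so a strong deviation can flag a set of reported values for closer inspection.

# Terminal-digit analysis sketch: test whether the last digits of reported
# values deviate from a uniform distribution. Generic illustration only;
# the reported values below are hypothetical, and real screens need far
# more data points for the chi-square approximation to be reliable.
from collections import Counter
from scipy.stats import chisquare

def terminal_digit_check(values, alpha=0.01):
    """Return (p-value, flag) for a chi-square test on last digits."""
    last_digits = [int(str(v).replace(".", "").lstrip("-")[-1]) for v in values]
    counts = Counter(last_digits)
    observed = [counts.get(d, 0) for d in range(10)]
    _, p_value = chisquare(observed)  # expected frequencies default to uniform
    return p_value, p_value < alpha

reported = [12.3, 11.7, 13.1, 12.9, 11.3, 12.7, 13.3, 12.1, 11.9, 12.3]
p, flagged = terminal_digit_check(reported)
print(f"p = {p:.3f}; flag for closer review: {flagged}")

A low p-value here is not proof of misconduct; it simply marks a record as worth a second look, which is the spirit of an additional quality control filter.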

“Detecting problematic data is a niche field with few experts around the world, despite its importance,” further explains Chris HJ Hartgerink. “Systematic application remains absent and this project hopes to push this field into this area. New estimates of how prevalent problematic data are welcome, because we currently rely on self-report measures, which suffer from human bias.”

Recently submitted to Fulbright, the 6-month project proposal has now been made openly accessible by the authors in RIO Journal, an innovative platform publishing all outputs of the research cycle, including project proposals, data, methods, workflows, software, project reports and research articles.

“A grant proposal is a research output like any other but is only rewarded when it results in funding,” says Chris HJ Hartgerink. “We know that many good proposals are rejected and consequently not rewarded. Publishing the grant proposal shows the output, makes it rewardable and can help improve it by post-publication peer review.”

###

 

Original Source:

Hartgerink CHJ, George SL (2015) Problematic trial detection in ClinicalTrials.gov. Research Ideas and Outcomes 1: e7462. doi: 10.3897/rio.1.e7462

 

Additional Information:

The Research Ideas and Outcomes (RIO) Journal publishes all outputs of the research cycle, including project proposals, data, methods, workflows, software, project reports and research articles, together on a single collaborative platform offering one of the most transparent, open and public peer-review processes. Its scope encompasses all areas of academic research, including science, technology, the humanities and the social sciences.