The amazing people at Sage Bionetworks have kindly asked me to contribute to their series of posts about open science. The first thing that occurred to me was that it’s nice to be recognized as part of the open-science movement. The second thing that occurred was that more people should be part of it.
When I was a grad student and postdoc many years ago, open science wasn’t really something we talked about. We collected our data, wrote our papers, rewrote our papers, celebrated when they were published, and that was pretty much it. In 2010, I started working at INCF (International Neuroinformatics Coordinating Facility), and suddenly, data sharing, interoperability, and standards were all we were talking about. I remember that first discussion with then executive director Sten Grillner, where he outlined all the reasons why people should share their data and all the hurdles we needed to jump in order to optimize the process. All I could think was why on earth are there not central facilities doing this for all the other scientific domains, as well?
In the last decade, several major brain initiatives around the world have been launched and are starting to produce vast amounts of data. To integrate these diverse data and address issues of transparency and reproducibility, widely adopted standards and best practices for data sharing will be key to achieving an infrastructure that supports open and reproducible neuroscience. At INCF, we're working at a fundamental level of open science: how to make data shareable, tools interoperable, and researchers at all career levels trained in data management. One of our main activities is to vet and endorse FAIR (Findable, Accessible, Interoperable, Reusable) standards and best practices for neuroscience data ("neuroinformatics" is in our name, after all) [1]. We also support the development of new standards, as well as the extension of existing standards to support additional data types.
Standards should serve as aspirations and be accessible to this and future generations of scientists as tools for thriving in an open-science environment. Support for an open-science environment in turn facilitates collaborations and idea exchange, which enable mutual growth in striving toward scientific goals. Different expertise can be brought to bear on difficult problems, leading to new solutions and building a more robust scientific enterprise.
I think we can all agree that there's zero success in announcing a "standard" and assuming it will be widely adopted (cue the herding-cats analogy). Open science is about choice and providing the mechanisms to facilitate open collaborations – as Irene Pasquetto pointed out in another post in this series: "collaborations are the holy grail of reuse." One of Pasquetto's main observations is that reputation, trust, and pre-existing networks have as much impact on reuse as how well the data is curated. In our experience, this perspective extends to standards and best practices as well. With this in mind, we spent some time in 2017 working out a process for vetting and endorsing standards and best practices in which the community itself did the vetting and endorsing [2]. The process was opened for submissions in early 2018, and we currently have three endorsed standards with eight more in the pipeline.
To promote uptake of standards and best practices, and implementation of other neuroinformatics methods, INCF has built TrainingSpace, an online hub which provides informatics educational resources for the global neuroscience community. TrainingSpace offers multimedia content from courses, conference lectures, and laboratory exercises from some of the world’s leading neuroscience institutes and societies. As complements to TrainingSpace, INCF also manages NeuroStars, an online Q&A forum, and KnowledgeSpace, a neuroscience encyclopedia that provides users with access to over a million publicly available datasets and links to literature references and scientific abstracts.
We at INCF hope that our efforts to promote open neuroscience will spark some interest in those of you who are just now hearing about us, and that you'll join in the fun of developing, vetting, and endorsing FAIR standards and best practices. After all, herding cats is easier if there's more food to choose from!
This post was written with helpful comments from Randy McIntosh, neuroinformatics expert and brainstormer extraordinaire.
[1] Abrams, Mathew, et al. "A Standards Organization for Open and FAIR Neuroscience: The International Neuroinformatics Coordinating Facility." OSF Preprints, 17 Jan. 2019. https://doi.org/10.31219/osf.io/3rt9b
[2] Martone, M. "The importance of community feedback in the INCF standards and best practices endorsement process [version 1; not peer reviewed]." F1000Research 2018, 7:1452 (document). https://doi.org/10.7490/f1000research.1116069.1
About this series: In February 2019, Sage Bionetworks hosted a workshop called Critical Assessment of Open Science (CAOS). This series of blog posts by some of the participants delves into a few of the themes that drove discussions – and debates – during the workshop. Read the series.