Recorded webinar
Open FAIR data: the role of public data archives
Scientific progress relies on open FAIR data. Describing datasets with rich metadata and appropriate standards, and submitting them to public archives, is crucial for linking different data types in an open, accessible, and sustainable manner.
This webinar will introduce two public archives: BioSamples and BioImage Archive. After a brief overview, we will provide information on how you can use these repositories to make your data open and FAIR, with a particular focus on complex multi-modal datasets.
Please note that clicking "Access Materials" will automatically download a .pptx file containing the presentation slides.
Who is this course for?
This webinar is suitable for wet-lab and data scientists at all career stages who are interested in learning how to make their samples and imaging data open and available to the community. No prior knowledge of FAIR principles, BioSamples or the BioImage Archive is required.
This event is part of a webinar series organised by the STANDFLOW project, an initiative supported by EMBL's Planetary Biology Transversal Theme. STANDFLOW is a collaborative effort to create a standardised data management workflow. The project primarily uses imaging data derived from samples collected through TREC (Traversing European Coastlines) and the Roscoff Culture Collection. For details on all topics covered in this series and registration information, please visit the following link: How to organise and share my imaging data?: Multimodal data management for marine biologists, environmental scientists, and imaging specialists.
Outcomes
By the end of this webinar, you will be able to:
- Explain the role of public archives in making data open
- Discuss the importance of rich metadata
- Identify how to submit to the relevant archives and link the data
DOI:
10.6019/TOL.FAIRPublicArchive-w.2025.00001.1
This webinar took place on 25 June 2025. Please click the 'Watch video' button to view the recording.