
4M.2.DEMO: FAIRness Assessment: Demonstration




This is a demonstration of an updated version of 3M.2.DEMO: FAIRness Assessment; it shows how to use these tools to perform a FAIR (findable, accessible, interoperable, and reusable) assessment of digital objects.

What they achieved

Nitrogen submitted a demo that included a series of walkthrough videos, a website, a bookmarklet, and a Chrome extension. The videos demonstrate using FAIRshake to evaluate the FAIRness of digital objects (tools, datasets, repositories), including how to install and use the Chrome extension and bookmarklet, how to create FAIRshake evaluator and admin accounts, how to evaluate digital objects in existing projects, and how to create a new project.

Helium supplied 18 criteria (metrics) that synthesize several rubrics related to the FAIR Data Principles. These criteria were used in a demonstration assessment of an implementation of the Data Commons Assessment for FAIRness (DCAF) Digital Object Repository (DOR) API, called the DCAF-DOR API. This demonstrated how to perform an automated or semi-automated assessment of a DOR using CommonsShare.
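The actual DCAF-DOR API and Helium's 18 metrics are not reproduced here, but the shape of such an automated check can be sketched. In this minimal sketch, the criterion names and metadata fields (`doi`, `landing_page`, `format`, `license`) are illustrative assumptions, not the real DCAF criteria or CommonsShare metadata schema:

```python
# Illustrative sketch of an automated FAIR-style check on a digital
# object's metadata. Criteria and field names are hypothetical, not
# the actual 18 DCAF metrics.

def assess_fairness(metadata):
    """Score one digital object's metadata record.

    Returns a dict mapping criterion name -> 1.0 (satisfied) or 0.0.
    """
    criteria = {
        # Findable: a globally unique, persistent identifier is present
        "persistent_identifier": bool(metadata.get("doi") or metadata.get("identifier")),
        # Accessible: a resolvable landing page is listed
        "landing_page": str(metadata.get("landing_page", "")).startswith("http"),
        # Interoperable: metadata is expressed in a standard format
        "standard_format": metadata.get("format") in {"json-ld", "rdf", "xml"},
        # Reusable: an explicit license is declared
        "license": bool(metadata.get("license")),
    }
    return {name: (1.0 if ok else 0.0) for name, ok in criteria.items()}

example = {
    "doi": "10.1234/example",
    "landing_page": "https://example.org/dataset/42",
    "format": "json-ld",
    "license": "CC-BY-4.0",
}
scores = assess_fairness(example)
print(scores)
```

In a real pipeline the metadata dict would be fetched from the repository's API rather than supplied inline; the automated/semi-automated distinction comes down to how many criteria can be decided from metadata alone versus needing a human evaluator.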

Why is this valuable?

Making data Findable, Accessible, Interoperable, and Reusable is one of the core goals of the Data Commons, and these tools give us a way to assess how close we are to those ideals. Although manual FAIRness assessments are useful, they are time-consuming and error-prone. Developing APIs that can complete these assessments automatically will be necessary as the Commons expands and begins to take on thousands of datasets.
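The scaling argument above can be made concrete: once per-object scores are produced automatically, summarizing thousands of datasets is a simple aggregation. The per-criterion score dicts below are hypothetical examples, not real assessment results:

```python
# Illustrative only: averaging per-criterion scores across many digital
# objects, the kind of roll-up an automated assessment API makes
# feasible at Commons scale.
from statistics import mean

def aggregate(assessments):
    """Average each criterion's score across a list of per-object score dicts."""
    criteria = assessments[0].keys()
    return {c: mean(a[c] for a in assessments) for c in criteria}

# Hypothetical scores for four digital objects
objects = [
    {"persistent_identifier": 1.0, "license": 1.0},
    {"persistent_identifier": 1.0, "license": 0.0},
    {"persistent_identifier": 0.0, "license": 1.0},
    {"persistent_identifier": 1.0, "license": 1.0},
]
print(aggregate(objects))  # {'persistent_identifier': 0.75, 'license': 0.75}
```

A roll-up like this is what turns individual assessments into an answer to "how FAIR is the Commons overall" and into a way to track progress over time.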