Table 5 Summary of STARDIT Beta Version data fields

From: Standardised data on initiatives—STARDIT: Beta version


Data category

Data field

Core: Initiative context—This information locates the initiative within a clear context

Identifying information


Initiative name*


Geographic location(s)*


Purpose of the initiative (aims, objectives, goals)*


Organisations or other initiatives involved (list all if multi-centre)*


Relevant publicly accessible URLs/URIs


Other identifiers (e.g. RAiD [166], clinical trial ID [167, 168])


Keywords or metatags—including relevant search headings (e.g. MeSH [169])


Other relevant information (free text)


Status of initiative


What is the current state of the initiative?*

Select from:

1. Prospective—this report is prospective or describes planned activity

2. Ongoing—the initiative is still taking place

3. Completed—the initiative has finished (evaluation and impact assessment may be ongoing)


Date range (start and end dates of initiative)


Methods and paradigms


Methods of the initiative (what is planned to be done, or is being reported as done). Include information about any populations or ecosystems being studied, any ‘interventions’, comparators and outcome measures (qualitative or quantitative)*

If appropriate, include a link to a publicly accessible document (such as a research protocol or project plan)


Include any information about theoretical or conceptual models or relevant ‘values’ of people involved with this initiative, including any rationale for why certain methods were chosen
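
STARDIT Beta does not define a machine-readable schema for these core fields. As one illustrative sketch, the required fields (marked *) and the three-value status list above could be modelled as a typed record; every class, field and value name below is an assumption for illustration, not part of the STARDIT specification.

```python
from dataclasses import dataclass, field
from enum import Enum


class InitiativeStatus(Enum):
    """The three status values from the 'Status of initiative' field."""
    PROSPECTIVE = "prospective"  # report describes planned activity
    ONGOING = "ongoing"          # initiative is still taking place
    COMPLETED = "completed"      # finished; evaluation may continue


@dataclass
class CoreContext:
    """Hypothetical model of the core context fields; * fields are required."""
    name: str                        # Initiative name*
    locations: list[str]             # Geographic location(s)*
    purpose: str                     # Aims, objectives, goals*
    organisations: list[str]         # Organisations or initiatives involved*
    status: InitiativeStatus         # Current state of the initiative*
    urls: list[str] = field(default_factory=list)              # public URLs/URIs
    other_identifiers: dict[str, str] = field(default_factory=dict)  # e.g. RAiD, trial ID
    keywords: list[str] = field(default_factory=list)          # e.g. MeSH headings


report = CoreContext(
    name="Example Initiative",
    locations=["Australia"],
    purpose="Illustrate the core STARDIT fields",
    organisations=["Example University"],
    status=InitiativeStatus.ONGOING,
)
```

Modelling the status list as an enumeration (rather than free text) keeps reports comparable across initiatives, which is the stated aim of standardising these fields.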

Report authorship—Information about who completed the report and how

Please note this section can be completed multiple times if there are multiple authors

Identifying information for each author (authors can be anonymised in the public report but at least one verified identity will need to be sent to STARDIT Editors to attempt to prevent falsified reports)


Publicly accessible profiles, institutional pages*


Open Researcher and Contributor ID (ORCID)*


Tasks in report completion


Other information


Key contact at initiative for confirming report content (include institutional email address)*


Date of report submission (automatically generated)

Input: Ethics assessment

Ethics approval information (if applicable)


Assessing organisation or group*


Approval date and approval ID—include any relevant URL

Input: Human involvement in initiative

Who is involved in this initiative and how?

Editors assessing involvement may need to use the STARDIT ‘Indicators of involvement’ tool

Details about how each group or individual was involved in the initiative


Who was involved or how would you label those involved (select from group labels or submit a new group label name in free text)*

You can name individuals or use ‘labels’ to describe groups of people such as ‘professional researchers’, ‘service users’ or ‘research participants’. Additional ‘labels’ or ‘meta-tags’ to describe people may be added if appropriate


How many people were in each grouping label?


Tasks of this person or group (list as many as possible)*—including any information about why certain people were included in or excluded from certain tasks (such as data analysis)


Methods of doing tasks? How did these people complete these tasks (what methods were used)—for example, ‘group discussion’ or ‘reviewing documents’


Communication modes? What modes of communication were used—for example, ‘group video calls’, ‘telephone interviews’ or ‘postal survey’


How were people recruited, contacted or informed about these tasks?


Involvement appraisal


Methods of appraising and analysing involvement (assessing rigour, deciding outcome measures, data collection and analysis)


Enablers of involvement (what do you expect will help these people get involved—or what helped them get involved)

Examples of enablers


Barriers to involvement (what do you expect will inhibit these people from getting involved—or what inhibited them from getting involved). Are there any known equity issues which may contribute?

Examples of barriers, and any attempts to overcome them


How did the initiative change as a result of involving people? For example, did the initiative design or evaluation plan change?

Note: this can be answered separately for different individuals or groupings of people


Involvement outcomes, impacts or outputs


Were there any outcomes, impacts or outputs from people being involved?* When describing these, attempt to label which groupings were affected and how. These can include impacts on people, organisations, processes or other kinds of impacts


Learning points from involving people


What worked well, what could have been improved? Was anything learned from the process of involving these people?


At which stage of the initiative were these people involved? (Please provide information about any distinct stages of this initiative, noting some may overlap)


Financial or other interests (including personal or professional interests)


Describe any interests (financial or otherwise), conflicting or competing interests, or how anyone involved may be personally, financially or professionally affected by the outcome of the initiative*, including any relevant information about authors of this report

Input: Material involvement in initiative

Mapping financial or other ‘interests’


What was the estimated financial cost of the initiative?


Funding information (link to a publicly accessible URL if possible)—this may include the project funder, funding agreements, grants, donations, public ledgers, transaction data or relevant block(s) in a blockchain


How much time was spent on this project?

Note: this can be answered separately for different individuals or groupings of people


Describe any costs or resources that cannot be measured financially or quantitatively—this may include expertise, traditional or Indigenous knowledge, volunteer time or donated resources

Outputs: Data—including code, hardware designs or other relevant information

Sensitive data

Secure criteria

Data adheres to relevant industry/discipline data security requirements


How is data entered, changed or removed within a repository?


Who is the data from this initiative shared with?


Who has access to sensitive data and how is this decided?


Is data encrypted? Is it anonymised or de-identified? What methods are used for re-identification? What is the risk of unauthorised re-identification?


Open data

FAIR criteria

Data adheres to FAIR criteria [170]


Describe relevant metadata, how the data is machine readable and other relevant information


How can data be accessed—include any information about authentication and authorisation


How is data interoperable or integrated with other data? Include information about applications or workflows for analysis, storage, and processing, and resulting file formats or other outputs


How can data be replicated and/or combined?


Indigenous data

CARE principles

Data adheres to CARE principles [171, 172]


Collective Benefit

How will Indigenous Peoples derive benefit from the data?


Authority to Control

How will Indigenous Peoples and their governing bodies determine how relevant data are represented and identified?


Responsibility

How will those using the data provide evidence of these efforts and the benefits accruing to Indigenous Peoples?


Ethics

How have Indigenous Peoples’ rights and wellbeing been centred during the data life cycle?


All data


Where is the data stored and hosted—share any location data if appropriate


Who ‘owns’ the data or claims any kind of copyright, patent(s), or other specific types of intellectual property—include relevant open licensing information


Analysis methods

Describe methods used to analyse the data (including a link to any relevant code and information about validity)


How can data be used? Include information about licensing and attribution


How is information about this data disseminated? For example, how are results from analysis shared?


Impact or effect of the output


Data control

Who controls access to the data? How are decisions about data access made? Is data anonymised or de-identified? What methods are used for re-identification? What is the risk of unauthorised re-identification? How is this risk managed?


Management and quality

Which person (or organisation) is responsible for managing (or ‘curating’) the data?


Who is accountable for ensuring the quality and integrity of the data? (this may be an individual or organisation)

Impacts and outputs: Publications, events, changes, learning items etc

What was learned


What new knowledge has been generated? (if appropriate, include effect size, relevant statistics and level of evidence)*


Knowledge translation


Describe how the learning or knowledge generated from this initiative has been or will be used


Have there been any outcomes, or has anything changed or happened as a result of this initiative that isn’t captured in previous answers?*


Measurement and evaluation


How has this been, or how will it be, measured or evaluated?


Who is involved in measuring or evaluating this?


Who was or is involved in deciding on the outcomes used to evaluate any impacts or outcomes? How were they involved?

Information completed by Editors


STARDIT report version number (assigned)


Report number assigned to distinguish it from any future updated reports

Indicators completed by Editors and/or peer reviewers

Editors and peer reviewers assessing the report will need to look for indicators in the following categories on publicly accessible URLs*

Indicators of involvement


Use the STARDIT ‘Indicators of involvement’ tool


Indicators of data practice compliance


Use the relevant criteria


Indicators of translation and impact


Other indicators
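
As a worked example of how a completed report covering these categories might be exchanged between systems, the sketch below serialises a minimal report to JSON, grouped by the table's top-level categories. STARDIT Beta does not prescribe a serialisation format; every key name and value here is a hypothetical illustration.

```python
import json

# Hypothetical minimal STARDIT-style report, keyed by the table's
# top-level categories. STARDIT Beta does not define this JSON layout;
# all key names and values below are illustrative assumptions.
report = {
    "core_initiative_context": {
        "initiative_name": "Example Initiative",
        "geographic_locations": ["Australia"],
        "purpose": "Illustrate the STARDIT field categories",
        "organisations_involved": ["Example University"],
        "status": "ongoing",
    },
    "input_ethics_assessment": {
        "assessing_organisation": "Example Ethics Committee",
        "approval_id": "HREC-0000",
    },
    "input_human_involvement": [
        {
            "group_label": "research participants",
            "group_size": 12,
            "tasks": ["reviewing documents"],
            "communication_modes": ["group video calls"],
        }
    ],
    "outputs_data": {
        "adheres_to_fair_criteria": True,
        "adheres_to_care_principles": False,
    },
}

# Serialise for exchange or archiving; dict insertion order is preserved.
serialised = json.dumps(report, indent=2)
```

A machine-readable layout like this would let Editors and peer reviewers check the starred (required) fields programmatically rather than by hand, which is one motivation for standardising the fields in the first place.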