During October–December 2010, the University ran a “MiniREF Exercise” to gather outputs, impact and esteem data for over 1,200 research staff. The review of that work is now under way.
Enlighten, our institutional repository service, was used as the platform for selecting and reporting this information. The Library’s Enlighten team worked closely with the Office of the Vice-Principal (Research and Enterprise), Research Offices in Colleges and many academic colleagues as we updated and added publications data.
Impact and Enquire
Our JISC-funded Enquire project had impact as a core element and gave us scope to explore different options for gathering supporting information about publications as well as impact data. Ultimately, we decided to implement the functionality which Southampton had developed back in 2007 for RAE2008.
IRRA – Institutional Repositories for Research Assessment
The work of the Institutional Repositories for Research Assessment (IRRA) project, funded by JISC for RAE2008, seemed an ideal fit for the collection of outputs, impact and esteem data.
The IRRA project had developed an EPrints add-on for research assessment which created separate MySQL tables for recording measures of esteem, selecting publications and producing reports. This was designed to:
“Facilitate the gathering of evidence for RAE returns by allowing users to:
- Record measures of esteem
- Select items from the repository
- Qualify each selected item for RAE return
And allowing unit managers/administrators to:
- Carry out each of the above on behalf of a user (e.g. to adjudicate between two users selecting the same item)
- Identify and resolve problems with selections
- Produce reports in Word and Excel (RA2) format”
– From the “EPrints RAE Module Silver Release README”
The EPrints add-on has a rich set of features for capturing and reporting supporting information. We configured it [with assistance from Tim Miles-Board at EPrints] to provide an additional focus on impact and esteem data for our MiniREF pilot exercise. We created two separate means of collecting impact information. The first was a separate section to capture the impact of an individual’s research. The second was an impact option for each specific output, where multiple authors could add their own impact to the same output. In the event, however, we didn’t use the latter approach.
We trialled this with our internal REF Working Group, who were very impressed with initial demonstrations of the add-on and provided valuable feedback which enabled us to refine the language and make changes to the functionality. One example was the removal of the “Selected by” feature, which showed staff who else had selected a publication, at least in the user view. We have kept this as a feature in our REF Reporting section, which only a handful of Administrators have access to.
New MiniREF user options
Using the IRRA add-on we created a new set of MiniREF options which are displayed when staff log in.
These options included “REF Selections” – for publications and outputs associated with an individual, and “REF Impact” for impacts authored/created by individuals. A third option “REF Reporting” was only available to designated REF Administrators in the College Research offices. This was a new user role (with thanks to Patrick McSweeney at EPrints).
When users chose “REF Selections”, a list of outputs from 2008 onwards was displayed for selection. In the original release this was based on surname and publication year from 2001 onwards. We updated this to 2008 onwards and took advantage of our author GUID work to show available records more precisely – or, in some cases, to highlight records which didn’t yet have a GUID.
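In outline, the revised selection logic works like this. The sketch below is a simplified illustration, not the EPrints implementation; the record fields (`author_guids`, `author_surnames`, `year`) are assumed names.

```python
# Simplified sketch: show a user's records from 2008 onwards matched by
# author GUID, and separately flag records matched only by surname that
# don't yet have a GUID. Field names are illustrative, not the actual
# EPrints schema.
def selection_list(records, user_guid, surname, from_year=2008):
    matched, needs_guid = [], []
    for rec in records:
        if rec["year"] < from_year:
            continue  # outside the exercise's publication window
        if user_guid in rec.get("author_guids", []):
            matched.append(rec)
        elif surname in rec.get("author_surnames", []):
            needs_guid.append(rec)  # candidate record without a GUID yet
    return matched, needs_guid

records = [
    {"title": "A", "year": 2009, "author_guids": ["g1"], "author_surnames": ["Smith"]},
    {"title": "B", "year": 2007, "author_guids": ["g1"], "author_surnames": ["Smith"]},
    {"title": "C", "year": 2010, "author_guids": [], "author_surnames": ["Smith"]},
]
matched, needs_guid = selection_list(records, "g1", "Smith")
print([r["title"] for r in matched])     # ['A']
print([r["title"] for r in needs_guid])  # ['C']
```

Matching on GUID rather than surname alone avoids offering one author’s outputs to a colleague with the same name, while the surname fallback surfaces records still awaiting a GUID.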
Staff could select their publications, rank them in order of preference, provide additional information and rate them. The text and guidance for this exercise was written by colleagues in the Office of the Vice-Principal (Research and Enterprise), in consultation with our REF Working Group.
Selected Items screen
Clicking on Edit Info provided users with a Selection Details screen where further details, a self-rating and a preference could be added.
REF Selection Entry Screen
Impact and esteem
In addition to the selection of four outputs we also wanted to capture impact and esteem data. The IRRA add-on provided a range of granular esteem options (which included impact), but for our MiniREF exercise we used only three fields:
- Other Information
The Other Information field enabled staff to supply any supporting information they wished to include.
REF Reporting and Administrator Options
The IRRA add-on also included a reporting section, which provides Word (HTML) and CSV (for Excel) outputs. This could potentially be extended to an appropriate XML format, such as CERIF, for interoperability. For our exercise we focussed on the existing reports. The Excel report was used the most, since the data could be re-used in Access or readily reviewed in Excel itself.
The reports were designed with the RA2 return in mind and included RA2 labels. We removed many of these and replaced output codes such as D [a journal article] with the text “journal article”.
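The code-to-label substitution amounts to a simple lookup over the CSV report. A minimal sketch, with two caveats: only the D → “journal article” mapping is stated above (any fuller mapping would follow the RA2 code list), and the column name is assumed.

```python
import csv
import io

# Partial, illustrative mapping from RA2 output-type codes to readable
# labels. Only code D is confirmed by the post; others would be added
# from the RA2 code list.
OUTPUT_TYPE_LABELS = {"D": "journal article"}

def relabel_report(csv_text, type_column="Output type"):
    """Replace output-type codes with readable labels in a CSV report.

    Unknown codes are passed through unchanged rather than dropped.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = []
    for row in reader:
        code = row[type_column]
        row[type_column] = OUTPUT_TYPE_LABELS.get(code, code)
        rows.append(row)
    return rows

report = "Title,Output type\nSome paper,D\n"
rows = relabel_report(report)
print(rows[0]["Output type"])  # journal article
```

A post-processing step like this keeps the underlying export untouched while making the spreadsheet readable for colleagues reviewing returns.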
It was possible to extract various lists either electronically or in print, for example, to help with preparation and checking of REF submissions.
We added a REF Administrator role so that authorised staff, including College Research Office staff and Research and Enterprise administrators, could complete information on behalf of academic colleagues. This was an invaluable feature.
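The proxy-editing rule this role enables can be sketched as a simple permission check: a user may edit their own selections, and anyone holding the administrator role may edit on behalf of others. The role name and helper below are hypothetical, not the EPrints user-role implementation.

```python
# Hypothetical sketch of the proxy-editing rule behind the REF
# Administrator role. Role names are illustrative only.
def can_edit_selection(actor, owner_id):
    """Allow edits to one's own selections, or any selections for admins."""
    return actor["id"] == owner_id or "ref_admin" in actor.get("roles", [])

academic = {"id": 7, "roles": []}
admin = {"id": 99, "roles": ["ref_admin"]}

print(can_edit_selection(academic, 7))  # True  (own record)
print(can_edit_selection(academic, 8))  # False
print(can_edit_selection(admin, 8))     # True  (proxy edit)
```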
REF Administrator Options
Concluding comments and key lessons
Over 1,200 academic colleagues returned data to the MiniREF through a mix of self and proxy selection. The exercise ran for just over six weeks, and during that time more than 4,000 additional records were added to Enlighten. The Library’s Enlighten team dealt with more than 700 e-mail and telephone enquiries [and this doesn’t include those which went to our College Research Offices].
We learned a number of key process and development lessons, including:
- Your publications database can never be comprehensive enough in advance of an exercise like this
- Ensure you are ready to deal with the volume of queries, updates and additional publications which the exercise will elicit
- Administration features, including the ability to make changes on behalf of users and to run various reports, are absolutely vital for managing returns and gauging progress
- Learn lessons, take feedback on board and be flexible/nimble enough to make changes to the system and workflows
The Library’s role and the work of the Enlighten team have been received very positively as a result of this exercise. It has enabled us to work much more closely with academic colleagues and College Research staff. We will now build on this work to ensure Enlighten is as comprehensive as possible.