What is assessment item banking

A JISC CETIS Briefing

By Phil Barker and Rowin Young, Domain Coordinators, JISC CETIS

Final Version. March 2007.

In brief
An item bank is a collection of assessment items, together with the associated software for storing and managing them, used to support the assessment of student learning.

An assessment item is a complex object, consisting of a question together with its associated data such as score, feedback and any media files required (or links to those files). It may be in a standard format such as the IMS Question and Test Interoperability specification or Macromedia Flash. Items are aggregated into assessments, either in advance of the assessment or during the assessment process itself. Items for aggregation may be selected manually, automatically, or by a combination of the two.
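As an illustration only (the class and field names below are invented for this sketch, not taken from any specification), an item as a complex object, and a simple mixed manual/automatic aggregation step, might look like:

```python
from dataclasses import dataclass, field

@dataclass
class AssessmentItem:
    """A complex object: a question plus its associated data."""
    question: str                  # question text, or e.g. a QTI XML body
    max_score: float               # marks available for a correct response
    feedback: dict                 # e.g. {"correct": "...", "incorrect": "..."}
    media: list = field(default_factory=list)  # paths or links to required media

def build_assessment(bank, wanted_topics, size):
    """Automatic selection by topic; the manual step is represented by
    the caller choosing `wanted_topics` and `size`."""
    matches = [item for topic, item in bank if topic in wanted_topics]
    return matches[:size]

bank = [
    ("algebra", AssessmentItem("Solve x + 2 = 5.", 1.0, {"correct": "Well done."})),
    ("geometry", AssessmentItem("Area of a 3 by 4 rectangle?", 1.0, {"correct": "Yes."})),
]
assessment = build_assessment(bank, {"algebra"}, size=1)
```

In practice the selection criteria would come from the item's metadata rather than a bare topic label, but the shape of the operation is the same.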

Item banking software enables items to be deposited, discovered and retrieved. In order to support the discovery and delivery of items, each item needs to be described with metadata, and may also be related to usage data. Metadata is information about the assessment item, and can be highly detailed or limited to just a few key fields. Although organisations may develop their own metadata schemes, efficient exchange of items and discovery of items in distributed item banks is easier when using an agreed metadata standard or application profile such as IEEE LOM or Dublin Core.

Usage data is context-specific information about cohort performance on an item. It can be used to support the quality assurance of items: for example, a multiple choice question where one distracter is never chosen suggests that the distracter may need to be changed. Usage data can also reveal items that may be culturally biased or require information from outside the scope of the immediate course of study and are therefore invalid. Item statistics can also be used to help construct assessments, particularly adaptive assessments, where values such as difficulty and discrimination can be searched on to identify appropriate items.
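The difficulty and discrimination values mentioned above come from classical test theory; a minimal sketch of the two most common statistics, assuming dichotomous (0/1) item marks:

```python
from math import sqrt

def difficulty(item_marks):
    """Facility (difficulty) index: the proportion of the cohort who
    answered this item correctly. `item_marks` is a list of 0/1 marks."""
    return sum(item_marks) / len(item_marks)

def discrimination(item_marks, total_scores):
    """Point-biserial discrimination: the correlation between each
    candidate's mark on this item and their total test score. Values
    near zero (or negative) flag items that fail to separate stronger
    from weaker candidates."""
    n = len(item_marks)
    mx = sum(item_marks) / n
    my = sum(total_scores) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(item_marks, total_scores)) / n
    sx = sqrt(sum((x - mx) ** 2 for x in item_marks) / n)
    sy = sqrt(sum((y - my) ** 2 for y in total_scores) / n)
    return cov / (sx * sy)

item_marks = [1, 1, 0, 1, 0]   # one mark per candidate for this item
totals     = [9, 8, 3, 7, 4]   # each candidate's total test score

p = difficulty(item_marks)              # 0.6: a fairly easy item
r = discrimination(item_marks, totals)  # close to 1: strongly discriminating
```

An item bank that stores such values alongside each item can then answer queries like "items on this topic with difficulty between 0.4 and 0.7".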

An item bank may be part of an assessment management system (AMS) incorporating tools to enable the authoring and delivery of assessments, results reporting and the storage of usage data resulting from delivery. Alternatively, it may simply act as a standalone file store into which items are deposited and from which they are withdrawn, without being tied to any particular authoring or delivery system.

What are item banks for?
The main things that item banks allow you to do are:
 * manage the development of new assessment items, for example by providing version control and supporting quality assurance
 * share content with colleagues and others
 * control access to items
 * make resources available to learners for self-assessment
 * provide independence from a VLE or delivery system
 * manage valuable assessment resources at organisational level
 * facilitate reuse of assessment items, for example release of items from summative assessment for subsequent reuse in formative assessment
 * enable the personalisation of assessment, for example by supporting the use of adaptive assessment
 * reduce potential for cheating in summative assessment by delivering different assessments to each candidate
 * provide a single point of access to all the information and resources needed to construct assessments (this could include results data or analysis from individual items or assessments).

What needs doing?
Simply put, the core requirements for an item bank are that someone should be able to put a complex object (specifically an assessment item) into it, and someone (often someone else) should be able to discover it and get it out. It is probably sufficient for the assessment item to be the smallest level of granularity that the item bank can work with, i.e. for the complex objects to be treated as atomic, with no access to the components of an assessment item being provided by the repository. However, there are cases where disaggregation into these components may be beneficial. For example, when an item bank is used in the context of developing new assessment items, components of an item may be updated without the nature of the item as a whole changing substantively. Disaggregation may also be desirable when a large media file, such as a video clip, is included in several items, since it may be more efficient to store only one copy of that file.
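One way to obtain the "single stored copy" efficiency for shared media is content-addressed storage, where a file is keyed by a hash of its bytes; a minimal sketch (not modelled on any particular repository product):

```python
import hashlib

class MediaStore:
    """Store one copy of a media file shared by several items: files are
    keyed by the SHA-256 hash of their content, so depositing the same
    video clip twice stores it once, and both items share the same key."""
    def __init__(self):
        self._blobs = {}

    def deposit(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self._blobs.setdefault(key, data)   # no-op if already stored
        return key

    def retrieve(self, key: str) -> bytes:
        return self._blobs[key]

store = MediaStore()
key_a = store.deposit(b"...video bytes...")   # deposited with item A
key_b = store.deposit(b"...video bytes...")   # item B reuses the same clip
```

Each item then records the key rather than embedding the file, which also means updating an item's question text need not touch the media at all.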

Other possible additional requirements or desirable features are listed below. There is also a need for a workflow model that supports these requirements, for example by ensuring quality control of the items, of their metadata and of their access requirements.
 * access management, to allow different users to have different privileges for different items. This is necessary when items are to be used for high stakes assessment.
 * embargoes, that is access permissions that vary with time. Again, this is necessary for items which are to be used for high stakes assessment in order to ensure that they are not exposed to general users before the exam.
 * version control, desirable when items are under development.
 * a preview facility, which is desirable when items are stored in a platform independent format such as IMS QTI which separates the content of an item from its presentation.
 * remote or federated search, so that items can be discovered without the user necessarily visiting what might be one of many item banks available to them. For open item banks, it may be desirable to facilitate discovery of items through a generic web search engine such as Google.
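Taken together, access management and embargoes amount to a permission check that varies with time. A hypothetical sketch (the role names and policy are invented for illustration):

```python
from datetime import datetime

def can_access(user_roles, item_roles, embargo_until, now):
    """Before the embargo lifts, only privileged roles may retrieve the
    item; afterwards, any role the item grants access to may do so."""
    if now < embargo_until:
        return bool(user_roles & {"author", "examiner"})
    return bool(user_roles & item_roles)

exam_date = datetime(2007, 6, 1)   # embargo lifts once the exam has run
before = datetime(2007, 5, 1)
after = datetime(2007, 6, 2)
```

Under this policy a student cannot see an embargoed exam item beforehand, while its author can; after the exam date the same item could be released for formative reuse.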

An item bank as a specialised repository
The core requirements of an item bank match the core capabilities of a repository. Furthermore, many of the possible additional requirements are also supported by repository systems as they are deployed in service. In many cases there are existing repository standards and agreed protocols that support these capabilities. For example the JISC Information Environment (a web-based architecture for distributed information provision in UK Further and Higher Education) specifies the following technical standards to support the publishing, discovery, access and use of resources for further and higher education:
 * In order to support discovery, resources must be described with appropriate metadata; for educational resources the JISC IE specifies the IEEE LOM, which is also used by the QTI specification.
 * In order to support federated search metadata should be published using OAI-PMH, or a search interface (such as SRU/W) should be provided to allow machine-to-machine remote search.
 * In order to support access and use, objects that can be retrieved from repositories should be assigned a persistent URI.
 * Authentication for access management is often provided with Athens, though SAML technologies are preferred.
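For instance, an OAI-PMH request is an ordinary HTTP GET whose arguments name a verb and a metadata format; a sketch of building one (the item-bank address is a made-up placeholder, while `verb`, `metadataPrefix` and `set` are standard OAI-PMH arguments):

```python
from urllib.parse import urlencode

def oai_list_records_url(base_url, metadata_prefix="oai_dc", set_spec=None):
    """Build a ListRecords request, the harvesting call an aggregator
    supporting federated search would issue against each item bank's
    OAI-PMH interface."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec          # optional selective harvesting
    return base_url + "?" + urlencode(params)

url = oai_list_records_url("http://itembank.example.org/oai")
```

A harvester would issue this request periodically, then index the returned metadata records so that users search one aggregated index rather than each item bank in turn.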

Some alternative approaches are suggested by the Web 2.0 paradigm. These include resource discovery through Google (with the requirement that Google is allowed to index full text representations of the items). Lightweight specifications such as Atom or RSS feeds and the A9 OpenSearch interface can be used to enable distributed resource discovery.

There are some requirements for which there is as yet no agreed and commonly implemented approach. For example, there are a number of competing specifications for how to deposit a resource in a repository (a plethora of bespoke APIs, WebDAV, the Atom Publishing Protocol), and there is no common approach that allows repositories to share information about, and provide access to, the components of the objects they hold. Both of these issues are the subject of current work. Similarly, workflow, version control and annotation facilities are implemented in various ways by different repository systems.

Differences between item banks and repositories
One perceived difference between item banks and repositories arises from the use of items for high stakes summative assessments, which leads to an emphasis on the requirement for secure access management and embargoes. This contrasts with repositories in general, since the motivation behind building a repository is often the sharing of resources, frequently with the idea that resources should be made as widely available as possible. However, it should be remembered that a great deal of use can be made of assessment items for formative or self-assessment, where access can be available to all; and access management and embargoes are occasional requirements for other types of repository too.

Another difference relates to the relationship between item banks and wider assessment systems, where it is normally the case that the item is delivered to the user via a server which is responsible for rendering the item for display in a web format. This contrasts with many repositories, which hold resources that are already in a format that can be displayed directly by browsers, plug-ins or common desktop applications. Thus item banks need the capability to render items in a way which allows users to preview an assessment item in a web browser before selecting it.

Item Banks in a Service Oriented Architecture
In a Service Oriented Architecture such as that envisaged by the JISC DEST e-Framework for Education and Research, the functionality for an item bank would be built from independent but interoperable services, each relating to a business requirement and possibly integrated with other services required for an assessment system. For example, three separate services might be brought together to allow users to add a resource to a collection, manage its metadata, and search the collection for a resource. If such an approach were taken, the services used to create, manage and use collections of assessment items would be largely the same as those used for collections of other similar resources (i.e. generic repository-related services). The service oriented approach has the advantage of flexibility: services providing functionality such as preview of QTI items, or sophisticated access management for high stakes summative assessment, may be developed to supplement the services required for other resource collections and deployed only as required.

Related Specifications
 * IMS Question and Test Interoperability Specification (QTI) describes a platform independent data model and XML binding for the representation of assessment items and test data and their corresponding results reports. See: http://www.imsglobal.org/question/
 * IMS Content Packaging Specification (CP) provides the functionality to describe and package learning materials for storage in a repository and transport between systems. IMS QTI v2.x specifies that items and tests be packaged using this specification. See: http://www.imsglobal.org/content/packaging/
 * IEEE LOM, that is IEEE Std 1484.12.1-2002 Standard for Learning Object Metadata and IEEE Std 1484.12.3-2005 Extensible Markup Language (XML) Schema Definition Language Binding for Learning Object Metadata (LOM), which together define the data model and XML binding for metadata to describe resources that may be used for learning, education or training. IMS QTI v2.x uses the IEEE LOM to describe items and tests. These standards may be bought from the IEEE, see: http://shop.ieee.org/ieeestore/
 * JISC Information Environment Technical Standards provides a list of the key standards required and suggested for interoperability in the JISC IE technical architecture. See: http://www.ukoln.ac.uk/distributed-systems/jisc-ie/arch/standards/
 * JISC DEST e-Framework for Education and Research is an initiative to produce an evolving and sustainable, open standards based, service oriented technical framework to support the education and research communities. See: http://www.e-framework.org/

JISC Projects

 * CATS Toolkit to support the automated construction of assessments from distributed item banks.
 * Minibix QTI v2 item banking management system.
 * SPAID Web services for item packaging, item bank searching and item retrieval.
 * UKCDR Collaborative infrastructure for developing and sharing high stakes assessment materials.

Other Examples

 * COLA 4,500 questions in the form of over 250 assessments developed by Scotland's Colleges Open Learning Exchange Group for FE.
 * e3an 1,400 peer-reviewed questions developed by the Electrical and Electronic Engineering Assessment Network.
 * HEA Economics Network A large archive of online and offline economics questions submitted by staff within the discipline.

Resources

 * Assessment item banks and repositories A JISC CETIS paper by Sarah Currier, Product Manager, Intrallect Ltd.
 * Assessment item banks: an academic perspective A JISC CETIS paper by Dick Bacon, Senior Lecturer (HEA Consultant), University of Surrey.
 * Other JISC CETIS briefing papers
 * JISC CETIS Assessment SIG
 * JISC CETIS Metadata and Digital Repository SIG