If you want to run an online multiple-choice exam under controlled conditions in an exam room, here are step-by-step instructions.
In good time before the exam
- Ensure the FET.IT@uwe.ac.uk team is well prepared for your exam day, as you will need their support – e.g. to prepare the computer room(s), to log in with generic student accounts and to be available for any IT problems.
- Ensure the Learning.Systems@uwe.ac.uk team (formerly Blackboard Support) is aware of the exam and is ready to help in case of problems.
- Consider having the assessment timetabled as an exam.
- Investigate how to transfer grades to ISIS, i.e. how to use the Grade Column Mapping tool in Blackboard.
- Run a mock exam with students under exam conditions in the same room(s) to be used for the real exam.
Build a new Test in Blackboard
- Control Panel > Course Tools > Tests, Surveys & Pools > Tests > Build Test.
- Fill in test Name, Description and Instructions then click Submit.
- Check the Question Settings page to change the default points and other settings.
- Now start adding questions. Click Create Question and select Multiple Choice (many other question-types are available).
- Enter the first question and possible answers (there is space for 4 possible answers by default) plus any feedback.
- Repeat the previous two steps until you have created all your questions.
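Conceptually, a multiple-choice Test is just a list of questions, each with possible answers, one correct answer, a points value and optional feedback. The following Python sketch is purely illustrative (it is not Blackboard's internal format or API) of how such questions could be represented and auto-marked:

```python
# Illustrative sketch only: Blackboard stores and marks Tests itself.
from dataclasses import dataclass, field

@dataclass
class MCQuestion:
    text: str
    answers: list           # possible answers (space for 4 by default)
    correct_index: int      # index of the correct answer
    points: float = 1.0     # default points, as set in Question Settings
    feedback: dict = field(default_factory=dict)  # e.g. {"correct": "...", "incorrect": "..."}

def mark_test(questions, responses):
    """First marking: sum the points for each correctly answered question."""
    return sum(q.points for q, r in zip(questions, responses)
               if r == q.correct_index)

questions = [
    MCQuestion("2 + 2 = ?", ["3", "4", "5", "22"], correct_index=1),
    MCQuestion("Capital of France?", ["Paris", "Lyon", "Nice", "Lille"], correct_index=0),
]
print(mark_test(questions, [1, 2]))  # one of two answers correct -> 1.0
```

This also shows why the Question Settings step matters: the default points value is applied to every question you create afterwards.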
Add the Test to a content area
- Navigate to the content area you want the exam to appear in (e.g. Assignments), then choose Assessments > Test.
- Select the Test you have made from the list and click Submit.
- Click Modify Options (in the contextual menu) to open the Test Options page.
- In the Test Availability section, do the following:
- Make the link available
- Set the number of attempts – leave blank for a single attempt
- Set the Timer to 120 minutes and turn on Auto-Submit
- Set Display After/Until times, leaving extra time before and after the 120-minute window
- Set a Password that students must enter to start the exam (for invigilators to give out at the start of the exam).
- In the Test Availability Exceptions section, add exceptions to the test availability settings for an individual student or a group of students – to allow extra time to take the test, for example.
- In the Show Test Results and Feedback to Students section, choose when to show results to students (e.g. after availability of the test ends, or on a specific future date).
- In the Test Presentation section, select Randomise Questions, so that each student in the exam room sees the questions in a different order.
(NB Test Options can be changed again afterwards by clicking Modify Options in the test’s contextual menu.)
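To make the timing settings concrete, the sketch below works through the logic the options above describe: a 120-minute timer, a Display After/Until window padded either side, a per-student extra-time exception, and a per-student question order. The 30-minute buffer and the student IDs are invented for illustration; Blackboard handles all of this itself once the Test Options are saved.

```python
# Illustrative sketch of the Test Options timing and randomisation logic.
import random
from datetime import datetime, timedelta

EXAM_LENGTH = timedelta(minutes=120)   # the Timer setting
BUFFER = timedelta(minutes=30)         # assumed padding before/after the window

exam_start = datetime(2024, 5, 10, 9, 30)          # hypothetical exam date
display_after = exam_start - BUFFER
display_until = exam_start + EXAM_LENGTH + BUFFER  # window = 180 minutes total

# Test Availability Exception: e.g. 25% extra time for one (hypothetical) student
extra_time = {"st99999": timedelta(minutes=30)}

def timer_for(student_id):
    """Timer for a given student, including any availability exception."""
    return EXAM_LENGTH + extra_time.get(student_id, timedelta())

# Randomise Questions: every student gets the same questions in a
# different order (seeded per student here so the sketch is reproducible).
def question_order(question_ids, student_id):
    rng = random.Random(student_id)
    order = list(question_ids)
    rng.shuffle(order)
    return order

print(display_after, display_until)
print(timer_for("st99999"))
print(question_order(["Q1", "Q2", "Q3", "Q4"], "st12345"))
```

Padding the Display After/Until window beyond the timed 120 minutes matters: a student who starts a few minutes late still gets their full timed attempt, and Auto-Submit (not the availability window) is what ends the attempt.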
On the day of the exam
- In the exam room(s), ask IT staff to log in to each of the PCs with generic student accounts.
- Students log in to Blackboard with their usual UWE login details.
- Invigilators give students the password (e.g. write password on whiteboard) to enter to start the exam.
- Students take and submit the test.
Grading the exam
Marking of the Test is done automatically by Blackboard (first marking).
The tutor should check that the marks are appropriate (second marking) and can change marks if, for example, a question is deemed unsuitable.
For summative tests, grades can be mapped to ISIS using the Grade Column Mapping tool (Control Panel > Course Tools > Grade Column Mapping). This may need to be pre-arranged via the Learning.Systems@uwe.ac.uk team.
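One common second-marking adjustment is withdrawing an unsuitable question and giving every student full credit for it, without double-counting students who already answered it correctly. The sketch below is one illustrative policy, not Blackboard's only option (you could instead drop the question and rescale totals):

```python
# Illustrative second-marking policy: full credit for a withdrawn question.
def give_full_credit(total, earned_on_q, q_points):
    """Remove what the student earned on the bad question, then award full points."""
    return total - earned_on_q + q_points

# Hypothetical first-marking results: (total score, points earned on the bad question)
first_marking = {"st001": (17.0, 0.0), "st002": (14.0, 1.0)}

second_marking = {student: give_full_credit(total, earned, q_points=1.0)
                  for student, (total, earned) in first_marking.items()}
print(second_marking)  # st001 gains the point; st002 already had it
```

Whichever policy is used, it should be applied consistently across the cohort and noted before grades are mapped to ISIS.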
Provide scores and feedback to students
Test scores and any other feedback can be released at different times, if desired. Students could, for example, be allowed to see their test score (provisional, before second marking) shortly after the test availability ends (e.g. 15 mins after completion).
Further feedback (reasons why an answer is right or wrong, praise etc.) can be provided after second marking is completed, for example.
Using question Pools – Pools are sets of questions that can be added to any Test. They are useful for storing questions and reusing them in more than one Test.
Categorising questions – all questions can be categorised in different ways (category, topics, difficulty level etc.) by creating/selecting categories when editing the questions. Doing so makes it far simpler to construct future tests.
Improving the Test for future runs – “The Item Analysis tool (opened via the contextual menu for each test) provides statistics on overall test performance and individual test questions to help you recognise questions that might be poor discriminators of student performance. You can use this information to improve questions for future test administrations or to adjust credit on current attempts.”
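The statistics behind such a report can be sketched in a few lines. The two classic measures are the difficulty index (the proportion of students answering a question correctly) and a discrimination index (how much better top-scoring students do on the question than bottom-scoring students); this is an illustration of the general idea, not the exact formulas the Item Analysis tool uses:

```python
# Illustrative item-analysis statistics. Each student record maps
# question id -> True/False (answered correctly or not).
def difficulty(students, q):
    """Proportion of students answering question q correctly."""
    return sum(1 for r in students if r[q]) / len(students)

def discrimination(students, q):
    """Difficulty among top-half scorers minus difficulty among bottom-half
    scorers. Low or negative values flag poorly discriminating questions."""
    ranked = sorted(students, key=lambda r: sum(r.values()))
    half = len(ranked) // 2
    bottom, top = ranked[:half], ranked[-half:]
    return difficulty(top, q) - difficulty(bottom, q)

students = [
    {"Q1": True,  "Q2": True},   # strong student
    {"Q1": True,  "Q2": True},   # strong student
    {"Q1": False, "Q2": True},   # middling student
    {"Q1": False, "Q2": False},  # weak student
]
print(difficulty(students, "Q1"))       # 0.5
print(discrimination(students, "Q1"))   # 1.0 - only strong students got Q1
```

A question that every student gets right (or wrong), or that weak students get right more often than strong students, is a candidate for rewriting before the next run of the exam.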