This chapter includes a restatement of the research questions in the form of hypotheses, a description of the subjects selected to participate in the study, the text used in the test of reading comprehension, the test measures, and the administration and design of the tests. The test procedures and statistical analysis are also detailed.
In order to answer the research questions posed in Chapter 1, these must be stated in a form that can be tested. Restating the questions in the null form yields the following hypotheses:
Hypothesis 1. There is no relationship between scores on paper based cloze tests and online cloze tests.
Hypothesis 2. There is no relationship between scores on paper based maze tests and online maze tests.
Hypothesis 3. There is no relationship between computer anxiety and online test scores.
Hypothesis 4. There is no relationship between computer experience and online test scores.
To test these hypotheses, a series of paper based and online reading comprehension tests were developed and a scale to measure attitudes towards computers and experience with computers was devised.
The subjects were students from two Year 5 classes at a private single sex school in the Adelaide metropolitan area. Fourteen girls participated from one class (designated Class A) and 17 girls participated from the other (designated Class B). The age range of the students at the time of the test was 10.1 to 11.4 years with a mean of 10.7 years. All students at this year level were assessed by the school in February 1997 using the GAP test (McLeod, 1980), a standardised measure of reading comprehension. These GAP test scores indicate reading ages for this group range from 7.8 to 12.5 years with a mean of 10.8 years.
A single passage was used for all the versions of reading comprehension tests developed in this study. The text chosen is a modified form of a cloze exercise contained in Cloze Procedure Passages (n.d.) produced by Reading University. The passage, "The Selfish Giant" by Oscar Wilde, has a suggested readability level of 10 to 11 years, as assessed by using the passage with students in two primary schools. MacTexan (Anderson, 1991) was applied to gauge independently the readability of the passage. The resulting Rix score suggests a reading level of Year 5, which corresponds with the Reading University assessment.
Four reading comprehension tests and a scale to measure student attitudes towards computers and experience with computers were developed for this study. Anecdotal reports concerning the tests also formed part of the data collection. Besides these tests, which are described in detail below, a standardised reading comprehension score (the GAP Test) was available from the school. A description of the test measures follows, while the stages in their development are detailed in a later section of this chapter.
The original cloze test as used by Reading University had a random deletion rate of 1:8. In order to improve the reliability of the test, the deletions were changed to regular deletion of 1:7, thus increasing the number of items. The first paragraph was left intact, then every seventh word was deleted and replaced with a blank. The last paragraph was also left intact. The resulting passage contains 30 blanks and 286 words, which is consistent with the length and deletion rate of other maze and cloze studies (Jenkins and Jewel, 1993, p.424; Anderson, 1976, p. 24).
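The regular deletion rule described above may be sketched in Python. This is an illustrative reconstruction only, not part of the original study (the tests were constructed by hand); the function name `make_cloze` is invented:

```python
def make_cloze(paragraphs, rate=7, blank="___________"):
    """Delete every `rate`-th word from the body of a passage,
    leaving the first and last paragraphs intact, as in the test
    construction described above.

    Returns the cloze text and the list of deleted target words.
    """
    targets = []
    out = [paragraphs[0]]                  # first paragraph left intact
    counter = 0
    for para in paragraphs[1:-1]:
        words = para.split()
        for i, word in enumerate(words):
            counter += 1
            if counter % rate == 0:        # every seventh word becomes a blank
                targets.append(word)
                words[i] = blank
        out.append(" ".join(words))
    out.append(paragraphs[-1])             # last paragraph left intact
    return "\n\n".join(out), targets
```

The deleted words collected in `targets` double as the scoring key for the completed tests.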
The cloze test includes two A4 pages, the first containing a practice exercise and instructions. The practice exercise, containing three deletions, was drawn from the intact opening sentence of the actual test. The instructions are a condensed form of the verbal instructions given. The second page consists of a section for name and age of subject, a picture relating to the passage, and the actual cloze test. Figure 4.1 depicts the second paragraph, demonstrating the text format and deletions. See Appendix A for the complete paper based cloze test (referred to as a cloze game in order to motivate students).
saw the noticeboard it was sorry ___________ the children and it slipped back
___________ the ground again, then went off ___________ sleep. The only people who
were ___________ were the Snow and the Frost.
Figure 4.1 Sample from paper based cloze test
The maze construction followed precise rules. A multiple choice item consisting of the correct word and two distractors was placed below each blank determined in the cloze exercise. One of the distractors is semantically incorrect, the other syntactically incorrect. These distractors are drawn from elsewhere within the passage.
Figure 4.2 shows how the multiple choice items are displayed. A subject reads the first sentence, which is intact, then reads the second sentence, which contains a choice item. The first choice, head, is the target, and most and fruit are distractors. In this sentence, most is syntactically incorrect, and fruit is semantically incorrect. The format used for the maze is a modified form of Guthrie's original format (1973, in Parker, Hasbrouk and Tindal, 1992, p. 203).
Figure 4.2 Sample from paper based maze test
The paper based maze test consists of three A4 pages, the first page containing the same practice exercise as the previous test, except for the inclusion of multiple choice items replacing the blanks, and instructions. The second page consists of a section for name and age of subject, a picture relating to the passage, and the first part of the passage containing 10 maze items. The third page has the remainder of the passage containing 20 maze items. See Appendix A for the complete paper based maze test (similarly referred to as a maze game to increase student motivation).
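The maze construction rules may be illustrated as follows. This is an illustrative sketch, not the procedure actually used (distractor selection was done by hand from elsewhere in the passage); `make_maze_item` is an invented name, and the example words are those of Figure 4.2:

```python
import random

def make_maze_item(target, syntactic_distractor, semantic_distractor, rng=None):
    """Build one maze item: the target word plus one syntactically
    incorrect and one semantically incorrect distractor, drawn from
    elsewhere in the passage, presented in shuffled order."""
    choices = [target, syntactic_distractor, semantic_distractor]
    (rng or random).shuffle(choices)
    return choices

# For "...put its head out from the grass..." (cf. Figure 4.2):
item = make_maze_item("head", "most", "fruit")
```

Each item therefore always contains exactly three choices, one of which is the target.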
The structure of the online cloze test is visually similar to the paper based test. It was written in HyperText Markup Language (HTML) and presented to subjects as a World Wide Web page, using Netscape Navigator (version 2) as the browser. In order for the screen view to match the paper page, several factors had to be changed. The font sizes were changed through selecting Options on the menu bar, then General Preferences. The proportional font was changed to Times New Roman size 12, and the fixed font was changed to Courier New size 12. The proportional font is used for the cloze passage, but the text boxes into which subjects type their responses use the fixed font. The window was also maximised so that the browser containing the Web page filled the screen. A comparison of Figure 4.1 and Figure 4.3 shows that the online cloze screen has the same number of words on each line as the paper based cloze test.
The characteristics of the online medium enable coloured, patterned backgrounds, choice of text colour and the addition of pictures. However, a plain white background was adopted and the font colour black chosen in order that the online version of the cloze test resemble as closely as possible the paper based cloze test.
Because this version of the cloze test was presented online via computer, subjects only view that part of the test that is contained on the screen at the time. Subjects need to move from one section to another by scrolling the screen up and down, whereas the whole paper based cloze test can be seen in one glance. Instead of a line indicating a blank space, the Web page utilises a text box. This appears as a black outlined rectangle. The subjects type the missing words into these text boxes. When subjects complete the test, they click on a button labelled "The End" and all responses are uploaded to a Server within the School of Education at Flinders University. A screen with a "Thank You" message indicates successful transfer of data.
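The substitution of text boxes for blanks may be illustrated by generating the corresponding HTML. The thesis does not reproduce the actual HTML of the test pages, so the field names and sizes below are assumptions for illustration only:

```python
def cloze_to_html(words, blank_token="___________", name_prefix="blank"):
    """Render cloze text as an HTML fragment, replacing each blank
    with an <input> text box, as the online cloze test replaced each
    blank line with a text box."""
    parts, n = [], 0
    for w in words:
        if w == blank_token:
            n += 1  # number the fields so responses can be matched to items
            parts.append(f'<input type="text" name="{name_prefix}{n}" size="12">')
        else:
            parts.append(w)
    return " ".join(parts)
```

Numbering the fields in passage order means the uploaded responses arrive in the same order as the scoring key.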
The online cloze test includes two Web pages, the first containing a practice exercise with instructions relevant to the procedure. A hypertext link is clicked to go to the second page, which consists of a section for name and age of subject, the picture relating to the passage, and the actual online cloze test. The online test may be seen by pointing a browser to http://wwwed.sturt.flinders.edu.au/instruc1.htm for the practice test and then clicking the "Go to Cloze Game" link to see the cloze test.

Online Maze Test

The screen set-up and format of the online maze test is virtually identical to that of the cloze test, except for one factor. Drop down list boxes replace the text boxes used in the online cloze test. When the subject clicks the arrow to the right of the box, a list of three multiple choice words drops down. Figure 4.4 shows how this appears on the screen. The subject selects the appropriate word by clicking on it. The word selected then replaces the former blank. The same multiple choice items as used in constructing the paper based maze test are used for the online maze test.
a beautiful flower put its out from the grass, but when
Figure 4.4 Sample from online maze test
Unlike the paper based maze test, only one maze item at a time can be seen, and this occurs only when the item box arrow is clicked. The major benefit of this feature is the compactness of this test compared to the paper based maze test.
The online maze test also includes two Web pages, the first containing a practice exercise and instructions relevant to the procedure. A hypertext link is clicked to go to the second page which consists of a section for name and age of subject, the picture relating to the passage, and the actual online maze test. The online test may be seen by pointing a browser to http://wwwed.sturt.flinders.edu.au/instruc2.htm for the practice test and then clicking the "Go to Maze Game" link to see the maze test.
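The drop down list boxes described above correspond to HTML select elements. As with the cloze fragment, the actual markup of the test pages is not reproduced in the thesis, so the names below are assumptions for illustration:

```python
def maze_item_to_html(choices, name):
    """Render one maze item as the kind of drop down list box the
    online maze test used: an HTML <select> whose three options are
    the target word and its two distractors."""
    options = "".join(f'<option value="{c}">{c}</option>' for c in choices)
    return f'<select name="{name}">{options}</select>'
```

Because a subject can only choose from the listed options, the uploaded maze responses need no spelling correction, unlike the open ended cloze responses.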
A scale designed to measure computer anxiety and experience was developed for this study and administered to each of the participants of the cloze and maze tests. The scale consists of 10 questions concerning attitudes towards computers, and 4 questions concerning previous computer experience. The attitudes questions are answered on a 5-point scale determined by degree of agreement, the lower the score, the greater the degree of computer anxiety. The experience questions are rated on a 7-point scale, the higher the score, the greater the degree of experience. The answers are recorded by colouring the appropriate dots. See Appendix A for the attitude and experience scale.
Informal discussions after the completion of the reading comprehension tests, with small groups of 3 to 5 students, were used to gather anecdotal reports about the tests. The following questions were asked:
The test measures used in this study are described above. This section details the stages of development. The development of the reading comprehension tests began with prototype paper based cloze and maze tests. Next the online tests were created. The paper based tests were then redesigned to resemble the online tests as closely as possible. The HTML files for the online tests were placed on the Server at the School of Education and tested. The tests were tried with a few primary school subjects and instructions and practice tests were written.
Once the passage was selected, prototype paper based cloze and maze tests were designed. The texts were formatted with a size 14 Arial font. Figure 4.5 demonstrates the format employed in the maze prototype.
Figure 4.5 Construction of prototype maze test
Marilyn Hyde, the specialist computer teacher at the school where the study was conducted, advised the use of the serif font Times New Roman. A serif font is believed to improve legibility as the "serifs help guide the eye from one word to the next" (Anderson and Poole, 1994). Times New Roman is also the default font used in Netscape Navigator, which determines the base font for the Web pages. Hyde, presently researching the readability of texts, suggested that the readability of the text is also influenced by the movement of the eye across the line. With Guthrie's maze format, the eye does not necessarily move straight across a line but up and down as well, as the position of the selected words changes. This reasoning contributed to a modification of the maze to its final form, where the chosen word is written on a blank line, and the maze items are placed below the line. Rereading the completed sentences then follows a straight line. The final form also had the advantage of being visually more similar to the online form.
Basic online cloze and maze games were developed by Anderson (1997a,b) at Flinders University. Following his model, factors such as input fields and submit button were copied using the Netscape Navigator editor. Text colour and background were chosen (black and white), the text for the cloze and maze passages was typed in, field values were changed, and a picture was inserted. The picture originated from the Cloze Procedure Passages (n.d.) from Reading University. This was scanned and saved in jpg format to enable insertion into a Web page. The completed pages were saved in HTML format.
The font and font size of the Web pages are not set within the HTML file, but rather by the browser used to view the pages. Determining the font size to use was difficult. In consulting the specialist computer teacher, it was established that the larger size 14 font was more readable online for Year 5 students than a size 10 or 12 font. However, a compromise of size 12 font was eventually made in order to simplify the redevelopment of the paper based tests.
Utilising a function of Microsoft Windows, it was possible to split the computer screen to see both the Web page and the Word 6 page on which the cloze and maze documents were created. This enabled the formatting of the Web pages to be duplicated in the Word documents.
Formatting the text in size 14 for the paper based tests produced a two page document for the cloze test and a three page document for the maze test. A size 12 font produced one less page for each. Thus, while the larger font was deemed more readable, the size 12 font produced a more compact, neater result. Margin sizes were also varied until the paper based tests and the online tests had exactly the same words in each line of the tests.
The end result was the development of four reading comprehension tests as visually similar as possible, allowing for differences of the cloze and maze structures.
Before the study could proceed, access to the online tests had to be confirmed. The files giant_cl.htm, giant_mz.htm, and castle4.jpg were placed on the Server at the School of Education. A visit to the study site determined that of 16 computers situated in the school library, only 8 or 9 could maintain access to the Internet at any one time. The online cloze and maze tests were accessed, and a trial of the tests produced data that were successfully transmitted back to the Server at Flinders University.
A Year 5 student, who was not one of the subjects in the study, completed the tests. Duration of the trial tests was noted in order to allow subjects sufficient time to complete the tests.
Uncertainty about subjects' familiarity with cloze and maze measures, and about their expertise with the Internet, meant that detailed practice tests were considered necessary. The first intact sentence from the passage used in the tests was used to construct the practice exercises. Three words were omitted, replaced by blanks in the cloze samples and maze items in the maze samples. The formatting was identical to the tests they represented.
Detailed instructions for each test were written (see Appendix B). A condensed form of these instructions was included below the practice tests for subjects to follow as the instructions were read.
The paper based practice tests and instructions became the first page of their corresponding tests. The online practice tests were on separate Web pages. However, to make access from the practice tests to the actual tests easy, a hypertext link was placed at the bottom of the practice test Web pages.
Subjects were divided into two groups according to their classes. Testing was conducted on two days one week apart. The survey was administered and anecdotal reports were collected the day following the second round of tests.
Class A, consisting of 14 subjects, completed the online tests during the first week, then completed the paper based tests the following week. Class B, consisting of 17 subjects, completed the paper based tests during the first week, followed by the online tests in the second week.
The paper based tests were administered in the subjects' classrooms. The subjects were given the paper based cloze test and asked not to turn the page. Instructions were read, during which time subjects completed the practice test on the first page and were given a chance to ask any questions about the procedure. They then turned the page and completed the cloze test. Subjects were able to change their responses at any point during the test.
Before the tests were collected, they were checked to ensure names were included, and subjects were encouraged to fill in any blanks remaining. When the entire group had finished, this procedure was repeated for the maze test, with instructions specific to the maze format. See Appendix B for detailed instructions. Starting time for the tests was noted, and individual completion time was written on the test pages during the second week. Unfortunately, time estimates are not available for the paper based tests administered in the first week.
Prior to the online testing, the library computers were set up as described previously. The two practice test pages were bookmarked and the online cloze practice page was downloaded. Because only nine computers had Internet access, each class was divided into two groups. Class A in the first week had a group of 6 followed by a group of 8. Class B in the second week had a group of 8 followed by a group of 9 subjects. The first group was asked to sit at the computers, while the second group remained in their classroom.
The online cloze test instructions were first read to the subjects. Then they completed the practice test, and familiarised themselves with the use of the mouse or tab key to move from one blank to the next. Subjects were asked if they had any questions about the procedure, then they accessed the actual cloze test by clicking on the hypertext link at the bottom of the practice test page. Subjects were able to change their responses at any point during the test. As subjects completed typing their chosen words in the text boxes, their tests were checked to ensure names were included. Subjects were also encouraged to fill in any remaining blanks. After checking, subjects were asked to click the button at the end of the page to send their responses back to the Flinders University Server electronically.
When all the group was finished, subjects clicked the bookmark for the maze practice test. The previous procedure was repeated for the online maze test, with instructions specific to the maze format. See Appendix B for detailed instructions. When the first group had completed the two online tests, the second group was sent for and the procedure was immediately repeated. Times at which the tests began were noted. Completion time was automatically registered with the recorded data on computer.
The day following the completion of the four reading comprehension tests, the survey of computer attitudes and experience was administered in the classrooms. Groups of four to five subjects at a time, in a quiet area, were then asked questions about computer and paper based tests. Answers were noted, but names were not recorded. Appendix A contains a copy of the survey form.
One of the benefits of computer based tests is that the format of response data readily enables computer analysis. Sandals (1992, p. 71) and Bugbee (1996, p. 282) refer to advantages of administering tests by computer that include providing instant scoring or reducing scoring time, eliminating error in scoring and speeding the evaluation of testing. While scoring was not instant in this study (partly because of the open ended nature of responses in the cloze test), the response data were easily inserted into a Microsoft Excel 5 spreadsheet. The paper based test responses and the survey scores were input manually into a spreadsheet. The use of spreadsheet functions and formulae enabled rapid scoring and analysis.
The online response data was downloaded from the Server onto disk. Four files contained the cloze data and maze data for both weeks. The files were in a text format that was opened directly into an Excel 5 spreadsheet. Figure 4.6 is a sample of the initial text format.
9/11/1997 9:43 ppp257.adelaide.on.net. Mozilla/2.02 (Macintosh; I; 68K) XXXXX 10 head it for into to pleased Spring We round by Frost they with wrapped about chimney garden for Every on he later fast dressed like is selfish window white a
9/11/1997 9:43 ppp257.adelaide.on.net. Mozilla/2.02 (Macintosh; I; 68K) XXXXX 10 head it for into to pleased Spring They round by Frost they with wrapped under chimney garden for Every on he then fast dressed like is selfish window white a
Figure 4.6 Initial text format of response data
Unwanted fields at the beginning of each record were deleted from the resulting table, and the data were sorted by subject name. Responses from the paper based tests were input, time being saved by taking advantage of Excel features. Subjects' names were copied and pasted to the paper based test tables. The correct responses were typed across a row. The fill down function then was used to copy the correct responses alongside the student names. As individual test papers were checked, only incorrect paper based test results were manually input (replacing the correct responses in the table).
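The deletion of the unwanted leading fields may be sketched as follows. This is an illustrative reconstruction of the Excel clean-up described above, assuming (as Figure 4.6 suggests) that each record ends with the subject's name, age, and a fixed number of responses, preceded by a variable number of timestamp, host, and browser fields:

```python
def parse_record(line, n_items=30):
    """Split one raw response record (as in Figure 4.6) into name,
    age, and responses, discarding the leading timestamp, host, and
    browser fields, whose number may vary."""
    tokens = line.split()
    responses = tokens[-n_items:]
    name = tokens[-(n_items + 2)]
    age = int(tokens[-(n_items + 1)])
    return name, age, responses
```

Anchoring the parse at the end of the record avoids having to count the browser identification fields, which contain spaces.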
Since the cloze and maze tests are measures of reading comprehension, incorrect spellings were not counted as errors. Accordingly, the spell check function was used to correct the response item spellings. Only one incorrect word was not changed, as it could not be determined what the entered word was meant to be. A later analysis revealed a problem. If a subject had included a space before or after a word, Excel determined that the correct response word plus spaces was an incorrect response. This problem was solved by copying the suspect data table into a Word 6 document, using the Find function to delete the spaces, and copying the corrected data back into the Excel table.
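The stray-space problem described above amounts to a normalisation step before scoring. A minimal sketch of the fix that was applied via Word 6's Find function (the helper name is invented):

```python
def clean_responses(rows):
    """Remove stray leading and trailing spaces from every response,
    so that a response such as 'head ' is no longer marked incorrect
    merely because of an included space."""
    return [[r.strip() for r in row] for row in rows]
```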
Values representing computer survey responses were input into a table. The attitude responses were scaled from 1 to 5 (strongly agree to strongly disagree), and the experience responses scaled from 7 to 1 (a lot of experience to none). Four of the attitude responses were reverse scaled. Rather than determine these manually for each subject, a formula of the form (6-x), where x is the value requiring reverse scaling, was used. Several individual tests were manually scored and results compared to the computer generated scores to check all procedures.
Excel logic functions were used in the cloze and maze data tables to generate a score of 1 for a correct response and 0 for an incorrect response. Subjects' total scores out of 29 were then calculated for the four reading comprehension tests, scores out of 50 for the computer attitude responses, and scores out of 28 for the computer experience responses.
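The Excel logic used for scoring may be sketched in Python (an illustrative equivalent of the IF-style formulae, not the actual spreadsheet):

```python
def score_test(responses, targets):
    """Score 1 for each correct response and 0 otherwise, as the
    Excel logic functions did, and return the item scores together
    with the subject's total."""
    items = [1 if r == t else 0 for r, t in zip(responses, targets)]
    return items, sum(items)
```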
Although there were 30 responses for each of the cloze and maze tests, the target word was inadvertently omitted from one of the maze items. The corresponding responses in the four tests were deleted to maintain integrity in comparing the results, and hence a score out of 29 rather than 30.
To perform statistical procedures more readily, subject names and the total scores for all test measures were copied into a new table. Chronological age and GAP test results obtained from the class teachers were added. At this point all names were replaced by a class and number designation (e.g. A1) to maintain student confidentiality.
Excel functions were used to calculate the descriptive statistics. Means and standard deviations for each of the test measures and the comparison criteria were determined. To compare the paper based test results to the online test results, and to determine whether computer experience or attitudes had any effect on the test scores, Pearson's product-moment correlation was used to calculate correlations between the relevant pairs of variables. Another Excel function was used for these calculations. A more detailed analysis of the paper based and online cloze scores was determined by calculating correlation values for Class A and Class B separately.
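The product-moment correlation computed by the Excel function may be written out explicitly (an illustrative implementation of the standard formula, not the spreadsheet actually used):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two lists of
    scores: the covariance of the two lists divided by the product
    of their standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sqrt(sxx * syy)
```

A value near +1 would indicate that subjects ranked similarly on the paper based and online versions of a test, which is the relationship the null hypotheses above deny.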