International Education Journal


Distance no longer a barrier: Using the internet as a survey tool in educational research

Katherine L. Dix and Jonathan Anderson
School of Education, Flinders University
katherine.dix@flinders.edu.au

Abstract

The World Wide Web clearly opens new horizons for educational research. In particular, one aspect still in its infancy, the use of the Web to reach individuals as research subjects, is emerging as a major new research tool. This paper discusses the processes involved in developing online surveys and how these may be administered to participants in a research study, however dispersed they may be, provided they have access to the Internet. Although there are distinct advantages in using the Internet as the interface between researcher and researched, there are difficulties too, and these are considered here as a practical guide to undertaking data collection via the Web.

 


 
Introduction


At one time or another, we have all been the survey-taker, filling out long and often complex paper surveys to provide information to the surveyor. As educational researchers, we also understand the time and resources needed to process survey data. Developments in Web technology now offer an opportunity to reduce many of the complications normally associated with administering surveys. Such opportunities, however, are often counterbalanced by concerns arising, in part, from the researcher's lack of understanding of this new research tool and its impact on the task of data collection in educational research.

Literature in the field of online research predominantly addresses the commercial market in terms of customer surveys, and the Internet market in terms of user questionnaires. Each targeted audience requires the development of its own unique online environment to maximise survey effectiveness. Developing online surveys for use in educational research is no exception. Although the art of designing an effective instrument is critical, it is not considered here. Instead, this paper addresses the promises, together with the pitfalls, of developing and conducting online surveys for educational research, based upon the authors' experience in conducting surveys where participants are often situated in remote locations. In particular, this paper details the form-based survey, which uses hypertext markup language (HTML) and common gateway interface (CGI) programming to construct, format, and administer surveys to participants, wherever they might be located online. Within this domain, general issues of methodology are offered as a practical guide to undertaking data collection via the Web.

 

 
Potential versus current reality


Current literature suggests that web-based surveys are not the solution for every research project, even though the technological capabilities exist (Farmer 1998). There will always be a need for interviews, for telephone surveys, and for traditional pencil and paper surveys in situations where Web technology is not appropriate or available. If a self-administered survey is appropriate, then its adoption in cyber form in preference to the paper variety will depend largely upon the respondents. Clearly, respondents need easy access to the Internet, both in terms of hardware and knowledge. The implementation of government initiatives like the Learning Technologies Project (DETE 1999) in South Australia and similar projects in other States (Education Queensland 1999; Education Victoria 1998; Education Western Australia 1998; NSW DET 1997) provides equipment and training support, narrowing the access gap and increasing the viability of online research in the education sector.

Beyond the basic issue of access, the decision to adopt an online format requires additional criteria to be considered. These are most effectively discussed by addressing four questions.

  1. What can be done that couldn't be done before?
  2. What can be done better than before?
  3. What no longer needs to be done?
  4. What are the hazards?

 

What can be done that couldn't be done before?

One of the key benefits of an online environment is time. Instantaneous electronic distribution of survey materials and subsequent electronic return of completed surveys give the fastest possible opportunity for responses to be collected and automatically compiled into a database. The delays incurred by distributing surveys by hand or mail, and by entering responses manually to form a database, are eliminated.

A further benefit of an online survey is that responses to questionnaires can be pre-coded as words, symbols or numbers. For example, if a survey employs a three-point Likert scale of disagree, uncertain, and agree, the transmitted data may be coded as 1, 2, or 3 respectively. Further, if certain items are negatively worded to avoid response bias, then responses to the same three-point Likert scale can be reverse scored (3, 2, or 1). Any inaccuracies resulting from manual data entry are thus removed.
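As a minimal HTML sketch of such pre-coding (the item wording and field names are invented for illustration), the value attribute carries the numeric code, so a reverse-scored item simply assigns the codes in the opposite order:

    <!-- Positively worded item, coded 1-3 -->
    <p>5. I enjoy using computers.</p>
    <input type="radio" name="q5" value="1"> Disagree
    <input type="radio" name="q5" value="2"> Uncertain
    <input type="radio" name="q5" value="3"> Agree

    <!-- Negatively worded item, reverse scored 3-1 -->
    <p>6. Computers make me anxious.</p>
    <input type="radio" name="q6" value="3"> Disagree
    <input type="radio" name="q6" value="2"> Uncertain
    <input type="radio" name="q6" value="1"> Agree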

Yet another benefit is the increased degree of flexibility afforded in design and presentation. Respondent-friendly online surveys can be developed for targeted audiences by using varying backgrounds, colourful graphics, and effective layouts, thereby heightening interest and increasing subjects' motivation to complete a survey.

 

What can be done better than before?

Cost is a significant factor in any form of research. For a given sample size, an online survey is the least expensive research methodology, and this constitutes one of the most attractive aspects of going online. Costs typically incurred in traditional paper surveys include outgoing and return postage, stationery such as paper, envelopes, and printing, and the expense of manual data entry, all of which increase with larger sample sizes. By contrast, the minimal one-off cost of employing a programmer (if you choose not to do it yourself) is independent of sample size and can be insignificant in large studies.

An additional improvement, particularly evident in complex surveys, is the capacity of the Internet to be adaptive. Web technologies such as Java, VBScript and ActiveX make adaptive surveys possible, where subjects are directed to particular items according to how they have responded to previous items. In a paper-based environment, instructions directing participants to skip to particular items on the basis of certain prior responses complicate the questionnaire and can cause confusion and frustration. Very complex surveys often require interviewers to guide the participant. In cyberspace, however, the transition from one item to another non-adjacent item is seamless, with the result that participants are not even aware that particular items have been skipped.

 

What no longer needs to be done?

Removing the apparent complexity of some surveys by using interactive and adaptive techniques may remove the need and cost of an interviewer. The flexibility offered by advances in the language of the Web means that complex surveys, requiring one-to-one interviewing, can be simplified to a point suitable for self-administration.

One of the more onerous and time-consuming tasks associated with survey-based research is the manual entering of questionnaire responses into data files for statistical analysis by computer. An online environment virtually removes the need for such data entry and, more importantly, removes any possibility of typographical errors occurring due to lapses in concentration or fatigue.

 

What are the hazards?

In addition to the positive aspects associated with online surveys, there are a number of weaknesses to consider. First, who is answering the survey? As with all unsupervised mail-out studies, it is difficult to ensure that the intended person actually answers the survey. The procedure of notifying potential participants by email is as reliable a process as mail. Provided the survey site is not promoted beyond the targeted audience, the chance of random Internet surfers finding an online survey site is minimal. Moreover, site access can be restricted by employing identification numbers and password protection.

In contrast to unwanted access, a second disadvantage is limited access: quite simply, not everyone has access to Internet technology. Of further concern is the possibility of inadvertently biasing a study by selecting participants (even if they are a random sample) on the basis of Internet access.

A third and frustrating concern is the fickle nature of the technology. The Internet is far from perfect as a technology and remains under intensive development. If systems are overloaded or resource conflicts arise, computers and servers can sometimes 'crash', resulting in a loss of data, perhaps without detection. Consulting server log files provides one measure of site traffic that is independent of the responses received. The difference between the number of visits and the number of responses gained gives some indication of data loss. Small discrepancies can be expected, since some participants view the survey but complete it at a later date. Large differences, however, should not be ignored and may require an additional email to respondents asking about any access difficulties.

Finally, an issue common to all new research methodologies is their divergence from the mainstream. Since Internet surveys have developed over quite a short period and are still in a phase of intensive development, comprehensive research in this field is relatively scarce (Pilypas 1997). With more experience and comparative analysis of online surveys against paper-based versions, online surveys should reach the same level of acceptance as other methodologies (DSS Research 1998).

Reflection upon these four questions suggests that, as the Internet improves in both access and reliability, and as the methodology of online research becomes better known, conducting surveys online becomes an increasingly viable choice. Related to the adoption of Internet technology is the further decision regarding the type of online format.

Online alternatives


Although this paper is primarily concerned with standard HTML form surveys, a brief introduction is given to a range of alternative methods. Discussed in order of increasing complexity, these include email surveys, standard HTML form surveys, adaptive HTML surveys, and downloadable interactive survey applications.

 

Email surveys

Of the various online alternatives, email surveys are often the fastest and simplest, offering wide audience-access (anyone with an email account) and requiring relatively little set-up time (Bowers 1998). The availability of off-the-shelf email survey software packages supports even novice users in constructing and conducting email surveys. As with any email message, however, the survey form is limited to simple text format with minimal control over page layout and interactivity (Sheehan and Hoy 1999).
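As an illustration of this text-only format, the body of an email survey might be as plain as the following hypothetical fragment, with respondents replying to the message and typing an answer after each question:

    Please reply to this message with your answers typed after each question.

    Q1. How many years have you been teaching? ___
    Q2. Do you have Internet access at home? (yes/no) ___
    Q3. Which of the following do you use weekly? (mark all that apply)
        [ ] email   [ ] word processing   [ ] Web browsing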

 

Standard HTML form surveys

HTML form surveys offer more flexibility than do email surveys, but require substantially more time to design and construct. In essence, such surveys are just web pages with a form area containing questions, response fields, and a submit button. As with many educational surveys, the standard HTML survey requires participants to complete the same survey, answering all questions in a linear style. Since no complex programming is necessary, adapting a paper-based survey is only marginally more difficult than creating a document in a word processor, making the technique accessible to most educational researchers.
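A complete standard form survey can be as brief as the sketch below. The script name collect.cgi and the field names are hypothetical; the action attribute simply points to whatever form-handling script the server provides.

    <html>
    <head><title>Teacher ICT Survey</title></head>
    <body>
    <h2>Teacher ICT Survey</h2>
    <!-- The form area specifies where and how responses are sent -->
    <form action="/cgi-bin/collect.cgi" method="post">
      <p>1. Your name: <input type="text" name="fullname" size="30"></p>
      <p>2. I feel confident using the Internet.</p>
      <input type="radio" name="q2" value="1"> Disagree
      <input type="radio" name="q2" value="2"> Uncertain
      <input type="radio" name="q2" value="3"> Agree
      <p><input type="submit" value="Send responses"></p>
    </form>
    </body>
    </html>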

 

Adaptive HTML surveys

The adaptive questionnaire extends the standard HTML form by using skip routines and branching. Adaptive or inferential questions are based upon responses to previously asked questions (Pitkow and Recker 1994). Accordingly, participants are not asked to respond to all possible questions but proceed through the survey, skipping sections according to the responses given to previous items. In a paper-based environment this routing is highly visible, but in cyberspace the transition is seamless. Understandably, the complexity of design and programming places adaptive HTML surveys in the realm of professional programmers, with greater associated expense.
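One way to realise such branching, sketched below with hypothetical page and script names, is to let the server-side script choose the next page from the response just submitted; client-side scripting can achieve the same effect without a round trip to the server.

    <!-- Section A ends with a filter question -->
    <form action="/cgi-bin/nextpage.cgi" method="post">
      <p>10. Do you use computers in your teaching?</p>
      <input type="radio" name="q10" value="yes"> Yes
      <input type="radio" name="q10" value="no"> No
      <p><input type="submit" value="Continue"></p>
    </form>
    <!-- nextpage.cgi returns Section B (computer-use items) if q10 is yes,
         otherwise it skips seamlessly to Section C -->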

 

Interactive survey applications

Offering a similar level of flexibility and complexity to adaptive HTML surveys, the downloadable survey 'application' incorporates the questionnaire into an executable file that respondents download to their own computer (Bowers 1998). The construction, flow, and content of the program are limited only by the creativity of the survey researcher, and the resulting survey can be highly engaging. The main disadvantages, however, include the cost and time required for sophisticated programming, the distribution of and access to the downloadable file, and respondents' doubts about their ability to download files, together with the fear that such files may contain viruses.

For educational research purposes, of these four alternatives, the standard Web page survey best supports current educational survey design, offering the greatest flexibility for minimal programming and time requirements.

Conducting a standard online survey


Many function-specific survey-building packages are available for purchase over the Internet or from commercial software suppliers. Alternatively, there are numerous online survey providers, some of which offer limited but free survey hosting and data analysis, such as workwork.com, interscore.com, or formsite.com (Ullman 1999). In educational research, however, these options are usually either quite expensive or too narrow and restrictive for particular purposes. Creating a standard online survey using a general web-page maker minimises costs, yet allows the researcher complete design and administrative control. Although some level of technical competence is necessary, the process of conducting a standard online survey is not difficult.

There are five basic stages to conducting a Web survey, each of which is addressed in detail below.

  1. Planning and preparing
  2. Going online
  3. Testing
  4. Promoting
  5. Collecting the data.

 

Planning and preparing

Plan the survey by first creating a paper-based prototype, complete with questions, possible responses, scoring of the responses, and general layout. A word processor such as Microsoft Word serves this purpose well, with the added capability of saving the file in HTML format (available in Word97 or later). The resulting file (*.html) can then be edited in Word or with a general web-page maker such as Claris HomePage by placing the survey questions within a form. Put simply, a form area is the cyber equivalent of an envelope, detailing where and how the information is to be sent. The basic range of form components is presented in Figure 1.

Although many design and methodological issues concerning paper-based surveys apply to the online equivalent, there are additional elements that ensure an effective and respondent-friendly online survey. As stated above, respondent-friendly design refers to the construction of online questionnaires that:

increases the likelihood that sampled individuals will respond to the survey request, and that they will do so accurately, i.e., by answering each question in the manner intended by the surveyor.
(Dillman, Tortora and Bowker 1998b:3)

Accordingly, the survey should have an equal chance of being accessed by each participant (regardless of their level of computer skill or sophistication of hardware and software) with all items presented in a way that can be easily understood and answered.

 

Radio button. Radio buttons allow your audience to select exactly one choice from a list of options; only one radio button in a list can be selected at a time. Example: Male / Female.

Check box. A single check box can be added to a form, or multiple check boxes to a list of choices. Check boxes allow viewers to select as many choices as they like from the list. Example: TV / Radio / Video.

Drop-down menu. Drop-down (or pop-up) menus and scrolling lists allow your audience to choose from a list of items. A drop-down menu shows only one item in the list until someone clicks on it; a scrolling list shows more than one item, and your audience can scroll to see the remaining items. Example: When were you born?

Text field. Text fields can contain only one line of text, can be 1 to 500 characters wide, and can only be resized horizontally. These fields are generally used for shorter information such as a name or email address. Example: Student ID.

Text area. Text areas allow your audience to enter multiple lines of text, which they can scroll through using the scrollbars. Example: Describe your observations.

Submit button. Every form needs a submit button for your audience to send their information. Each time the submit button is pressed, the information entered in the form area is sent to the CGI script located on the server; until the information is sent, participants can alter any of their responses.

Figure 1. Form components used in an online survey, together with examples
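In HTML, the components of Figure 1 correspond to the following form elements (field names and values are illustrative only):

    <input type="radio" name="gender" value="M"> Male
    <input type="radio" name="gender" value="F"> Female

    <input type="checkbox" name="media" value="TV"> TV
    <input type="checkbox" name="media" value="Radio"> Radio

    When were you born:
    <select name="birthyear">
      <option>1984</option> <option>1985</option> <option>1986</option>
    </select>

    Student ID: <input type="text" name="studentid" size="10">

    Describe your observations:<br>
    <textarea name="observations" rows="4" cols="40"></textarea>

    <input type="submit" value="Submit">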

 

To ensure the construction of a respondent-friendly survey, some useful guidelines should be followed.

An introduction page. Introduce the online survey with a welcome screen that is motivational, emphasizes the ease of responding, and instructs the participant on the general actions necessary to complete the survey. The page may also request an identification field such as a name or ID number.

Ensuring equivalent access. Dramatic advances in software and hardware allow increasingly complex web surveys to be designed. Although it may be tempting to add fancy features for viewers to enjoy, research comparing plain and fancy survey design comes down in favour of plain design (Dillman, Tortora, Conradt and Bowker 1998a). Most importantly, graphic-rich and audio-rich designs require greater transmission time and greater browser power, and increase the chances of overloading and crashing participants' computers. Such features should be kept to a minimum and used only to assist understanding. Secondly, surveys using an advanced language such as Java will not function on all browsers, limiting accessibility by possibly as much as 50 per cent of the intended audience (Spain 1998). Designing within the limits of what respondents' computers, browsers, and transmission lines can manage results in a simple, accessible survey that downloads quickly and uses less memory.

Page format. Currently, the two common formats for online surveys are 'single page' and 'one question per page'. Ultimately, the choice of layout depends on the structure of the survey, whether linear or branching, and the maturity of the target audience. For example, the one-question-per-page format may be highly suitable for younger children, as it offers fewer distractions but, by the same token, may be inappropriate for older respondents since it requires more page-turning.

Traditional principles of layout. The recommended approach is to present each item in a conventional format similar to that normally used in traditional paper-based surveys. Clearly, surveys that list questions without numbers, centre questions on the screen, have insufficient spacing around questions, or place yes and no response buttons on opposing sides of the question, are unconventional and challenging for participants to complete. By contrast, a survey that provides questions that are numbered and justified in a well-spaced table, with the placement of the response buttons to the right hand side of the screen closest to the scroll bar (thus minimising mouse movement and control), will appear logical and be easier to complete.

The first question. Begin the main survey with a question that is fully visible on the first screen, and is likely to be easily comprehended and answered by all or most participants. The first question and the process of responding tend to define for the participant whether the questionnaire is easy or difficult to complete.

Question-specific user instructions. Provide item-specific instructions as part of each item, at the point where the action is to be taken, rather than in a separate section at the beginning of the survey. From a cognitive perspective, detailed instructions placed at the beginning of a survey are quite likely to be forgotten by the time each action is taken.

Avoid required responses. Unlike paper-based surveys, where questions can generally be answered in any order or not at all, online surveys can require participants to answer each question before being able to proceed to the next. Although this has been promoted as an advantage, its use should generally be avoided for survey questions. Circumstances may arise where a respondent has legitimate reasons for objecting to answering a question, or may be unable to provide an answer. Far better that one question is missed than that a whole survey is not completed. An exception to this general principle concerns certain identification fields. Items such as name, age and gender may be crucial to subsequent statistical analysis of the data, and accordingly should require a response (paper-based surveys are disadvantaged in this respect). Such requirements are enforced by reminding participants that particular items (e.g. name or gender) have been omitted, as sketched below.
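The paper does not prescribe a mechanism for these reminders; one possibility, sketched here with a small client-side script (the field and function names are invented), is to intercept the submission and prompt for any omitted identification field. The same check can equally be performed by the form-handling script on the server.

    <script type="text/javascript">
    function checkIds(form) {
      // Only crucial identification fields are required;
      // all other items may be left blank.
      if (form.fullname.value == "") {
        alert("Please enter your name before sending the survey.");
        return false;   // cancel the submission
      }
      return true;      // allow the submission
    }
    </script>
    <form action="/cgi-bin/collect.cgi" method="post"
          onsubmit="return checkIds(this)">
      Name: <input type="text" name="fullname" size="30">
      <!-- remaining survey items -->
      <input type="submit" value="Send responses">
    </form>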

More choices than the screen permits. When the number of response choices exceeds the size limits of the screen, consider using double or triple columns to display the choices. Avoid forcing the participant to scroll down to see the remaining choices, which at the same time removes the question itself from easy view.

Bigger is not better. There is nothing worse than having to scroll across a screen to read a question. Placing questions in a table that is formatted to span 90 per cent (for example) of the viewer's screen provides a simple yet effective solution to the wide screen viewing problem.

Progress monitor. Use simple graphic symbols or words that indicate the progress of the participant through a survey. Estimating progress through a paper-based survey is far easier than judging completion progress in an equivalent online survey. Therefore, it is necessary to include on each new screen a progress monitor that may be as simple as a 'percentage completed' message or star rating system.
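The last three guidelines (multiple columns, table width, and a progress monitor) might be realised in HTML along these lines, with field names and layout purely illustrative:

    <!-- A table spanning 90 per cent of the screen avoids sideways
         scrolling, and a long choice list is split into columns -->
    <table width="90%">
      <tr>
        <td><input type="checkbox" name="subj" value="maths"> Mathematics<br>
            <input type="checkbox" name="subj" value="science"> Science</td>
        <td><input type="checkbox" name="subj" value="english"> English<br>
            <input type="checkbox" name="subj" value="art"> Art</td>
      </tr>
    </table>
    <!-- A simple progress monitor at the foot of each screen -->
    <p>Section 2 of 4 (about 50 per cent completed)</p>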

 

Going online

Once the design of the survey appears to be complete, it can be uploaded to the server where it will have its own unique Web or URL address allowing the survey to be accessed by the research participants. How this occurs will vary depending on server security and software used to deliver the survey form.

Typically, the server holds server-specific software and server-side (CGI) form-handling software, along with a folder containing the entire survey (see Figure 2). The location of the folder on the server determines the Web address of the survey. Accessing the address from another computer using a browser calls up the specified survey on the server. When the questions are completed and the submit button is pressed, the survey data are uploaded to the server. The form-handling software collects the data and saves them to a file, usually including details about the submitting computer and the date and time at which the form was sent, if all goes according to plan.
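By way of illustration, the arrangement on the server might look like the following (all names hypothetical): a folder holding the survey pages, a CGI script that handles submitted forms, and a growing data file.

    cgi-bin/collect.cgi     form-handling script that receives submissions
    survey/index.html       welcome and instruction page (the survey's URL)
    survey/items.html       the survey form itself
    survey/data.txt         tab-delimited responses appended by collect.cgi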

 

Figure 2. Representation of the online survey process

 

Testing the survey

Prior to conducting a survey, issues of usability, functionality, access and stability need to be addressed (Gould, Gurevich and Pagerey 1998). The survey should look appropriate for its anticipated audience and should function properly on the computer on which it was created. When accessed on a different platform or with a different browser, however, the layout frequently alters, due to different screen widths among other things, and the resulting survey may appear quite different, making it difficult to read and complete. Overcoming variable layout usually requires a tightening of specifications; for example, a table width set to automatic may need to be set to a fixed percentage or pixel width.

Since familiarity breeds acceptance, the survey requires scrutiny and testing by others. Trials should be conducted on a test group composed of participants who are representative of the sample group. Little is gained, for example, if only colleagues trial a survey designed for young school students.

 

Promoting the survey

Promoting the survey is generally achieved by informing the participants in a study by email. The content of the email should include a brief introductory statement about the survey and its purpose, followed by the location of the survey, given as the site's URL. In many email programs the address becomes a hypertext link from which the participant can directly access the site.

In addition to the survey address, the email sent to participants should include the researcher's name and contact details, along with an invitation for comment or requests for support. Although surveys are not new, conducting them online is. It is therefore vital that participants feel confident and able to complete the survey. For most, the novelty factor alone is sufficient motivation (Zukerberg, Nichols and Tedesco 1999). For some technophobes, however, merely being given a survey address does not ensure participation; like an English-speaking person asked to complete a survey written in Chinese, they will need additional support.
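Drawing these elements together, an invitation email might read as follows. The survey address is hypothetical; the contact details are those given on the title page of this paper.

    Subject: Invitation to an online survey of computers in teaching

    Dear colleague

    As part of a study of computer use in schools, we invite you to
    complete a short online survey (about ten minutes). The survey is
    located at:

        http://www.example.edu.au/survey/index.html

    If you have any difficulty reaching or completing the survey, please
    contact me and I will gladly help.

    Katherine Dix
    School of Education, Flinders University
    katherine.dix@flinders.edu.au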

 

Collecting and analysing the data

The survey is online, participants have been notified, they complete the form and submit their responses, but where do the data go? The data are sent to a file located with the survey on the server (the computer that makes the survey available to participants over the Web). In addition to the responses, the data file contains records of the time, date and each participant's computer location. Depending on the platform (Mac or PC) and the software used, the data may be saved in a variety of formats, the most common and portable of which is a tab-delimited text file. The resulting text file can be viewed in a word processor or imported into an analysis package such as SPSS or Excel, ready for immediate analysis.
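For example, the first records of such a file might look like this (all values invented for illustration; the columns are separated by tab characters):

    date        time    host                   id      q1  q2  q3
    24/05/2000  09:12   lab3.example.edu.au    ST042   2   3   1
    24/05/2000  09:15   lab7.example.edu.au    ST051   3   3   2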

Some sample surveys


To clarify some of the features of online survey forms described above, this section contains samples from current online surveys with which the authors are involved. Figure 3 illustrates the use of radio buttons in a language survey and an information technology survey respectively, where respondents are asked to select one choice from a list of options.

 

Figure 3. The use of radio buttons in items from an online survey of language and computer knowledge respectively

 

Check boxes are commonly used in surveys where respondents are invited to make one or more responses to particular items. Figure 4 illustrates the use of check boxes in an online mathematics survey.

Drop-down menus and text fields are illustrated in Figure 5. The first item is a passage of text from an online cloze reading comprehension test where students are asked to replace each blank by entering at the keyboard the one word they think best fits. The second text passage shows an online maze reading comprehension test where students, by contrast to what was required in the cloze test, click on each blank and choose the word they think best fits from a number of alternative responses.

 

Figure 4. The use of check boxes in an item from an online mathematics survey

 

Figure 5. The use of text fields and drop-down menus in cloze and maze procedure items from an online survey of reading comprehension

 

Conclusion


This paper provides an overview of the possibilities offered by online surveys in educational research, with a particular focus on the standard Web page survey. Although online research methodology of the kind described is still very much in a developmental phase, it clearly has much to offer the education research community. In an environment of increasing funding cuts, educational researchers need to explore avenues that achieve equivalent outcomes at reduced cost. Using the Internet to conduct online surveys provides one such avenue, and it can only become more effective with time. As with surveys of the pencil and paper variety, certain problems can arise in the administration of online surveys, and the researcher needs to be aware of these and to take appropriate precautions. However, the overwhelming advantage of online surveys over their pencil and paper counterparts is that when subjects complete a questionnaire online, their responses are automatically entered into a database for subsequent analysis. Not only are online surveys more efficient, they are less prone to error.

Online surveys are not going to be appropriate for every situation where information is sought by eliciting subjects' views or knowledge about some issue or topic. However, where a sample of subjects is dispersed throughout a country, or even across countries, and if they have access to the Internet, online surveys provide a practical solution to efficient data collection.

   
References

Batagelj, Z. and Vehovar, V. (1998) Technical and methodological issues in WWW surveys. Paper presented at AAPOR '98, Software and Methods for Conducting Internet Surveys, St. Louis. [Online] http://www.ris.org/ris98/stlouis/index.html [24 May 2000]

Bowers, D.K. (1998) FAQ's on online research. Marketing Research, 10 (4), 45-48.

DETE (1999) Learning futures: DECStech 1997-2001 making connections. Department of Education, Training and Employment. [Online] http://www.decstech.nexus.edu.au/learn.htm [24 May 2000]

Dillman, D.A., Tortora, R.D. and Bowker, D. (1998a) Influence of plain versus fancy design on response rates for web surveys. Proceedings of the Survey Methods Section, Annual Meeting of the American Statistical Association, Texas. [Online] http://survey.sesrc.wsu.edu/dillman/papers.htm [24 May 2000]

Dillman, D.A., Tortora, R.D. and Bowker, D. (1998b) Principles for constructing Web surveys. Social and Economic Sciences Research Centre Technical Report 98-50, Pullman, Washington. [Online] http://surveys.over.net/method/litercro.html [24 May 2000]

DSS Research (1998) Online research: Good, bad and the ugly. [Online] http://www.dssresearch.com/library/general/online.htm [24 May 2000]

Education Queensland (1999) Schooling 2001: Summary of Schooling 2001 project. [Online] http://education.qld.gov.au/tal/2001/abo_sum.htm [24 May 2000]

Education Victoria (1998) Learning Technologies in Victorian Schools, 1998-2001. [Online] http://www.sofweb.vic.edu.au/It/pdfs/Itis.pdf [24 May 2000]

Education Western Australia (1998) Technology 2000: A framework for implementation of learning technologies in WA Government schools. [Online] http://www.eddept.wa.edu.au/t2000/l_t.htm [24 May 2000]

Farmer, T. (1998) Using the Internet for primary research data collection. Market Research Library. [Online] http://www.researchinfo.com/library/infotek/index.shtml [24 May 2000]

Gould, E.W., Gurevich, M. and Pagerey, P.D. (1998) Conducting surveys over the World Wide Web. STC's 45th Annual Conference Proceedings, 294-297. [Online] http://surveys.over.net/method/litercro.html [24 May 2000]

NSW DET (1997) Making the Net returns worthwhile. [Online] http://www.dse.nsw.edu.au/staff/F2.0/tilt/about/projects.htm [24 May 2000]

Pilypas, H. (1997) The Use of the Computer as a Tool for Testing Reading Comprehension. Unpublished BEd Honours thesis, School of Education, Flinders University, Adelaide. [Online] http://wwwed.sturt.flinders.edu.au/edweb/onpub/theses/index.htm [21 May 2000]

Pitkow, J.E. and Recker, M.M. (1994) Using the Web as a survey tool: Results from the second WWW user survey. [Online] http://www.cc.gatech.edu/gvu/user_surveys/survey-09-1994/html-paper/ [24 May 2000]

Sheehan, K.B. and Hoy, M.G. (1999) Using email to survey Internet users in the United States: Methodology and assessment. Journal of Computer-Mediated Communication, 4 (3). [Online] http://209.130.1.169/jcmc/vol4/issue3/sheehan.html [24 May 2000]

Spain, S.W. (1998) Top-10 Web survey issues and how to address them. [Online] http://www.researchinfo.com/library/top_10_web.shtml [24 May 2000]

Ullman, E. (1999) Survey says! Small Business Computing and Communications. [Online] http://www.smalloffice.com/sbc/071999/fea04.htm [16 Nov 1999]

Zukerberg, A., Nichols, E. and Tedesco, H. (1999) Designing surveys for the next millennium: Internet questionnaire design issues. Paper presented at the 1999 AAPOR Conference, Florida. [Online] http://surveys.over.net/method/litercro.html [24 May 2000]

 

International Education Journal, 1 (2) 2000
http://iej.cjb.net


All text and graphics © 1999-2000 Shannon Research Press