GVU's WWW User Survey

GVU's 9th WWW User Survey


This is the main document for the Graphics, Visualization, & Usability Center's (GVU) 9th WWW User Survey. GVU runs the Surveys as a public service and, as such, all results are available online (subject to certain terms and conditions). The 9th Survey was run from April 10, 1998 through May 15, 1998 and was endorsed by the World Wide Web Consortium (W3C), which exists to develop common standards for the evolution of the Web, and by INRIA, the acting European host for the W3C in collaboration with CERN, where the Web originated. The GVU Survey is now also sponsored by a Corporate Council that provides financial support to the survey effort as well as new directions for the surveys to explore. Special pointers to the survey were provided by Yahoo, Netscape, TRUSTe, CNN, and DoubleClick.

The winners of the $100 cash prizes for the Ninth Survey are: Scott Y. (New Jersey), Keith S. (North Carolina), Meri R. (Florida), Traci A. (Michigan), Laurie M. (Maryland), Ken A. (Colorado), Lari L. (Finland), and Richard G. (Maryland). Congratulations and thanks to all who participated!

Over 10,000 web users participated in the survey. Questions were asked on the following topics:
Basic Sections:
  • General Demographics 
  • Technology Demographics 
  • Data Privacy (TRUSTe)
  • Web and Internet Usage 
  • Software Filters and Content Rating (Vanderbilt) 
  • Beliefs About Society (Vanderbilt) 
Electronic Commerce:
  • Internet Shopping (Part 1) (Riggins & Rhee) 
  • Internet Shopping (Part 2) 
  • Internet Banking (Riggins & Rhee) 
Special Sections:
  • Cultural Issues 
  • Webmasters 

 
Get an overview of the findings by reading:
Executive Summary
Selected Results & Trend Analysis, and
Ninth Survey Reports
Read the results on paper: GVU's Ninth WWW User Survey Report (in color) contains analysis and graphs of key findings from the 9th Survey and longitudinal analysis of emerging trends. Authored by Colleen Kehoe, Dr. James Pitkow, and Dr. Juan Rogers, July 1998, 39 pages; price: $80.
To order please contact: 
Office of Technology Licensing 
Georgia Tech Research Corporation 
400 Tenth Street
Atlanta, GA 30332-0415 
404-894-9727 (phone) 
404-894-9728 (facsimile) 
Read about previous surveys in: 
General Survey Information (Past & Future Surveys) 
Special Presentation of Selected Results for the WWW History Day (April 1997) 
Published Papers & Presentations on the Surveys 
Media Responses, Press Releases, & Appearances
Understand how the results are collected by reading: 
Survey Methodology and Limitations of the results, and Technical Information 
Dig into the details by looking at the: 
Tables and Graphs (GIF) for each question
Conduct your own analysis by using our: 
Collected Datasets, and 
Original Questionnaires
Look at other web and internet surveys at: 
Cyber Dialogue 
CyberAtlas - a good starting point 
Nua Internet Surveys - monthly coverage of major surveys 
More sources... 
Read the fine print: 
Special Thanks 
Copyright Information 
The WWW-Surveying Mailing List 
WWW Corporate Council

Executive Summary

The ninth edition of GVU's WWW User Survey is complete and its results offer a fresh picture of the continuously evolving world of the Net. If the attention given to previous editions of the survey is any indication, it should once again be of interest to those following the development of this new medium for academic or business reasons, policy implementation or news reporting. There is significant data in a variety of areas and a few highlights follow.

The Ninth Survey continued to signal the mainstreaming of the WWW, especially in the USA. The general demographics of the user population moved closer to the characteristics of the general population with a continued increase in the proportion of female users (38.7%), a decrease in the average income ($52,000 in the US), slightly lower levels of educational attainment (50.1% college or more), and a diversification of occupations away from the domination of computer and education-related fields. This new diversity among WWW users is brought about by a group of new users (less than a year on the Net) that is mostly female (51.7%) and more likely to be under 20 or over 50 years old than in their middle years. This process is not yet as noticeable in Europe, where the shape of the age profile curve for this year is almost identical to the general profile in 1994.

The results of the Ninth Survey continue to show what is happening in the browser wars. Many more users (9 percentage points) switched browsers and browser vendors during the previous six months, which clearly shows the impact of the release of the 4.0 browser versions. The overall process shows Microsoft gaining market share while Netscape lost some. However, Microsoft's gain seems not to have come from users switching away from Netscape but rather from newer users (under 3 years on the Net) switching from other browsers. The impact of bundling at the level of the Internet Service Provider (ISP) is clear once again given that 31% of new users report it was the source of their browser.

The GVU WWW User Survey began a process of institutionalization about a year ago with the creation of the WWW Survey Council, composed of a select group of companies, and the WWW Survey Academic Advisory Board, composed of Georgia Tech faculty members. More recently, I joined the WWW User Survey Team as a GVU Associate Director for the Survey and will assist Jarek Rossignac, director of GVU, in managing the surveys.

We would like to express special thanks to Jim Pitkow and Colleen Kehoe, who continue to be the fundamental forces behind the Survey's success, to Kimberley Morton for her part in setting up the survey and analyzing the results, to Molly Croft, GVU's Director of External Affairs, for lending her expertise in our relations with the corporate world, to Bill Read for his continued guidance and support, and to the members of the WWW Survey Council who support its implementation and provide links to the surveys. We would also like to welcome Dr. Naresh Malhotra and Dr. Terry Harpold as members of the WWW Survey Academic Advisory Board.

Work has already started on the implementation of the Tenth Survey, which will run in the Fall of 1998. With the support of members of the WWW Survey Council, we will try to get more exposure for the Survey and increase the number of responses. We hope to recruit a few more members to the Council in order to get perspectives from industries not yet represented. We will also work with our WWW Survey Academic Advisory Board on the methodology to improve the basis for making inferences based on survey data. In this way, I hope we can continue to build on the success of the WWW User Surveys and provide a unique source of knowledge about the Internet and its users.

In the name of GVU's WWW User Survey Team, many thanks to all the participants in the surveys and we hope you will get a chance to log your response again in the future.

Juan D. Rogers

GVU Associate Director for the WWW User Survey



Selected Results and Trend Analysis

General Demographics

Gender
Females represent 38.7% of the respondents to the 9th Survey, which is virtually unchanged since the last survey (38.5% Eighth, 33.4% Seventh). Europe is considerably less gender-balanced, with females accounting for only 16.3% of respondents. For the rest of the world (mostly Canada & Australia for this survey), females account for 30.5% of respondents. Younger respondents are more likely to be female: 43.8% of those aged 11-20 compared to 33.9% of those aged 50 and over. For the first time, we see a category of users which has more females than males -- users who have been online for less than a year (51.7% female, 48.3% male).

Educational Attainment
Although the average education level of web users has been declining to be more representative of the general population, respondents are still quite highly educated, with 80.9% having at least some college experience and 50.1% having obtained at least one degree. Respondents who have been on the internet for 4 years or more are much more likely to have advanced degrees (Masters & Ph.D.) than newer users.

Age
Using a different format for this question (a set of radio buttons instead of a scrolling list) seems to have eliminated a suspiciously high number of "Under 5"-year-olds answering the survey. The average age for all respondents was 35.1 years. 36.4% of respondents are over 40 years old compared to 34.0% just six months ago. European respondents continue to have a dramatically different age profile, with many more in the 21-30 age range. This profile is very similar to the US profile before the original major access providers (AOL, CompuServe, Prodigy) became available. The respondents with the most online experience tend to be in the 21-30 age range.

Technology Demographics

Connection Speed
Given that speed remains the most cited problem users experience with the Web, it is not surprising that people's connection speed to the Internet has steadily increased. In April 1995, close to 50% of users were operating 14.4 Kb/sec modems or slower. Three years later, only 4% of users connect at 14.4 Kb/sec or slower - a reduction of 92%. The share of 1 Mb/sec or faster connections initially dropped from 23% to 15% between April 1995 and April 1997, due to the decrease of academic and computer-professional users. In the past year, we note a slight resurgence (2%) towards this class of connections, possibly fueled by increased bandwidth demands from corporate users.

Browser Selection
As initially reported in the last survey, the impact of browser bundling, especially at the Internet Service Provider (ISP) level, has significantly changed browser market share. Spring marked the official releases of 4.0 browsers by both Microsoft and Netscape, both with significant improvements and with new features. The impact of the 4.0 releases is clear: more users (9 percentage points) switched browsers and browser vendors as a result, with Microsoft gaining market share and Netscape losing market share. An interesting point is that most of the share was gained from users switching from browsers other than Netscape and occurred primarily with users who have been on the Internet under three years. Old-time Internet users tend to be quite loyal to the Netscape browser.

Web & Internet Use

Hours of Web Use
Because of our methodology of attracting survey respondents by advertising on other web sites, we tend to get respondents who spend a substantial amount of time using the web. In other words, the more time you spend online, the more likely you are to find out about our survey. For this reason, our results show a bias toward more active web users, with the largest category spending 10-20 hours using the web each week (32.7%). 26.4% of respondents use it for more hours while 40.9% use it for fewer hours. Respondents who have been online for more years tend to spend more hours using the web, but statistically the relationship is still fairly weak.

Indispensable Technologies
The question of which technologies people find "indispensable" was first raised in the Eighth Survey. At the time, we were surprised to learn that such a high percentage of respondents found the web to be indispensable, and nearly as many found email to be indispensable as well. For the Ninth Survey, those percentages have risen even higher, and email rose 9 percentage points to become the most indispensable technology among those we listed. Having nearly as big an increase was Java/JavaScript, with an increase of 8.1 percentage points. (Although Java and JavaScript are quite different from a technical perspective, we believe most respondents would not distinguish between the two.) Java/JavaScript also showed an increase in the number of people who have used this technology (see the Technology Summary). Audio is another technology that showed a higher than average increase in indispensability (+3.8%).

Falsification of Information
As in previous surveys, the largest category of respondents indicated that they never provide false information when registering with a web site (48.6%). This means, however, that more than half of respondents do report false information at least occasionally (25.4% do about a quarter of the time). Only 5.4% routinely provide false information (i.e. 75% of the time or more). 54% of females reported that they never falsify information, compared with 45% of males. Other survey questions indicate that females guard their privacy more closely than males, so females may register with fewer sites instead of supplying false information. The likelihood of supplying false information to web sites decreases with age -- again, this might be explained by older respondents being less likely to register with a site at all, but we cannot confirm this with our survey.

Problems Using the Web
Across all groups of users, taking too long to download pages (i.e. "speed") is the most commonly experienced problem with the web (64.8% of respondents). The percentage reporting this problem has been consistent over the last two surveys (63% in the Eighth and 66% in the Seventh). Even though modem speeds continue to rise and respondents upgrade their modems fairly frequently (48% have upgraded in the past year), web pages are becoming ever more heavily laden with images, animations, scripts, programs, etc., all of which take extra time to download. Online retailers especially should take note: 53% of respondents reported that they had left a web site while searching for product information simply because the site was too slow. The next most frequently cited problem on the web is a growing one -- broken links. Although solutions for dealing with broken links are well known to web designers (i.e. automated auditing of external links for validity, redirecting bad URLs to a search page, etc.), most sites don't seem to employ these techniques. As anyone who has spent any time on the web can tell you, the problem certainly seems to be getting worse.
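
To make the auditing idea concrete, the following is a minimal sketch (in Python) of the kind of automated external-link check mentioned above. The URL list and the choice of a HEAD request are illustrative assumptions only; a real audit would harvest links from the site's own pages and handle redirects, timeouts, and retries.

  from urllib.request import Request, urlopen
  from urllib.error import HTTPError, URLError

  # Hypothetical list of external links; a real audit would extract these
  # from the site's own pages.
  EXTERNAL_LINKS = [
      "http://www.cc.gatech.edu/gvu/user_surveys/",
      "http://www.example.com/some-page-that-may-have-moved.html",
  ]

  def check_link(url, timeout=10):
      """Return (url, status) where status is an HTTP code or an error string."""
      try:
          request = Request(url, method="HEAD")      # HEAD avoids fetching the body
          with urlopen(request, timeout=timeout) as response:
              return url, response.status
      except HTTPError as err:                       # e.g. 404 Not Found
          return url, err.code
      except URLError as err:                        # DNS failure, refused connection, ...
          return url, str(err.reason)

  if __name__ == "__main__":
      for link, status in map(check_link, EXTERNAL_LINKS):
          print(f"{status}\t{link}")

Any link that does not come back with a 2xx or 3xx status is a candidate for repair or for redirection to a search page.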

Internet Shopping

Total Spending on WWW Purchases
The largest percentage of respondents who made purchases over the Web in the last six months spent between $100-$500 (32.5%) and a group almost as large spent more than $500 (29.5%). When taking into account the location of respondents, the patterns are similar, with the preponderance of the same amounts of spending. However, Europeans and those outside Europe and the USA are more likely to make smaller purchases (less than $50) over the web (26.3% and 28.4% versus 20.9%, respectively). When considering years on the Internet, an interesting pattern of spending emerges. The lowest category of spending (less than $50) decreases steadily with increasing experience (from 55% with under 6 months to 15% with more than 7 years). The highest category of spending (more than $500) is a mirror image of the lowest: it increases steadily with experience (from 2% under 6 months to 46% with more than 7 years). The two spending categories in between ($50-$100 and $100-$500) remain almost constant with time on the Internet (hovering around 10% and 30% respectively). There are slight differences in the spending patterns of women and men. They are almost equally likely to have spent at the $100-$500 level (30.9% and 33.5%, respectively), but women are more likely to purchase in the under $50 level (28.6% versus 17.5%) and less likely to purchase at the above $500 level (21.9% versus 34.5%).

Personal versus Professional Purchasing over the Web
Respondents report that they use the Web for personal purchases mostly with a frequency of 1-2 times a month (28.2%) or less than once a month (27.5%). In contrast, most of them never use the Web for professional purchases (35.5%) or do so with low frequency (24.8% less than once a month and 15.8% 1-2 times a month). Women and men have similar patterns of frequency of personal purchasing, except that a larger percentage of women responded that they never made such purchases (10.3% versus 4.7%). The percentage of women that never made a professional purchase is also significantly larger (44.1% versus 29.7%), but women are only slightly less likely than men to make professional purchases in the other frequency categories.

Type of Information - Personal
Respondents could select more than one type of information and three categories were most often selected. The first was detailed information about products and services (86.4%), followed by price comparison (79.9%) and availability of products and services (77.6%). The same three categories of information are chosen most often by Europeans, with a significantly smaller percentage than USA respondents seeking price comparison information (62% versus 80.6%).

Type of Information - Professional
The same question on information seeking behavior related to professional purchases resulted in the same three categories being selected in the same order but at higher rates. Respondents sought detailed information about products and services more than any other category (92%), followed by price comparisons (82.7%) and availability (78.8%). European respondents selected the same categories most often at slightly higher rates than their American counterparts.

Time Spent Searching - Personal
The largest category of users spends between 5 and 15 minutes searching before they find the first piece of useful information (35.2%). The next largest group spends less than 5 minutes (29.3%). This represents a shift of about 5% toward shorter search times from the previous survey. Experts tend to find things faster (less than 5 minutes: 33.7% experts, 15.5% novices), which represents a much larger difference than the previous survey (18.2% versus 8%).

Time Spent Searching - Professional
When searching for professional reasons, an almost equal number of respondents finds useful information in 5-15 minutes (31%) as in less than 5 minutes (30.5%). This represents a significant shift toward shorter search times from the previous survey. The contrast between experts and novices is significant (less than 5 minutes: 38.6% experts, 16.7% novices).



Survey Methodology

The Internet presents a unique problem for surveying. At the heart of the issue is the methodology used to collect responses from individual users. Since there is no central registry of all Internet users, completing a census, where an attempt is made to contact every user of the Internet, is neither practical nor feasible financially. As such, Internet surveys attempt to answer questions about all users by selecting a subset of users to participate in the survey. This process of determining a set of users is called sampling, since only a sample of all possible users is selected.

Sampling

There are two types of sampling, random and non-probabilistic. Random sampling creates a sample using a random process for selection of elements from the entire population. Thus, each element has an equal chance of being chosen to become part of the sample. To illustrate, suppose that the universe of entities consists of a hat that contains five slips of paper. A method to select elements from the hat using a random process would be to 1) shake the contents of the hat, 2) reach into the hat, and 3) pick a slip of paper with one's eyes closed. This process would ensure that each slip of paper had an equal chance of being selected. As a result, one could not claim that some slips of paper were favored over the others, causing a bias in the sample.

Given that the sample was selected using a random process, and each element had an equal chance of being selected for the sample, results obtained from measuring the sample can generalize to the entire population. This statistical affordance is why random sampling is widely used in surveys. After all, the whole purpose of a survey is to collect data on a group and have confidence that the results are representative of the entire population. Random digit dialing, also called RDD, is a form of random sampling where phone numbers are selected randomly and interviews of people are conducted over the phone.
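
As a small illustration of the hat example above, the Python sketch below draws slips at random many times; because every slip has an equal chance of selection, the observed proportions all settle near one fifth. The slip labels are arbitrary.

  import random
  from collections import Counter

  # Five slips of paper in a hat, as in the example above (labels are arbitrary).
  hat = ["slip A", "slip B", "slip C", "slip D", "slip E"]

  # "Shake the hat and pick with eyes closed": random.choice gives each slip
  # an equal chance on every draw.
  draws = Counter(random.choice(hat) for _ in range(100_000))

  for slip, count in sorted(draws.items()):
      print(f"{slip}: {count / 100_000:.3f}")   # each proportion is close to 0.200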

Non-probabilistic sampling does not ensure that elements are selected in a random manner. It is then difficult to guarantee that certain portions of the population were not excluded from the sample, since elements do not have an equal chance of being selected. To continue with the above example, suppose that the slips of paper are colored. A non-probabilistic methodology might select only certain colors for the sample. It becomes possible that the slips of paper that were not selected differ in some way from those that were selected. This would indicate a systematic bias in the sampling methodology. Note that it is entirely possible that the colored slips that were not selected did not differ from the selected slips, but this could only be determined by examining both sets of slips.
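
The contrast with a non-probabilistic rule can be sketched the same way. In the hypothetical selection below, a convenience rule only ever picks slips of one color, so the other color is systematically excluded from the sample; the 50/50 color split is an illustrative assumption.

  import random
  from collections import Counter

  # A hat of colored slips, as in the example above.
  hat = ["red"] * 50 + ["blue"] * 50

  # Random sampling: every slip has an equal chance of being chosen.
  random_sample = random.sample(hat, k=20)

  # Non-probabilistic sampling: a convenience rule that only ever picks red slips.
  convenience_sample = [slip for slip in hat if slip == "red"][:20]

  print("random sample:     ", Counter(random_sample))       # roughly half red, half blue
  print("convenience sample:", Counter(convenience_sample))  # all red; blue is excluded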

Self-selection

Since there is no centralized registry of all users of the Internet and users are spread out all over the world, it becomes quite difficult to select users from the entire population at random. To simplify the problem, most surveys of the Internet focus on a particular region of users, which is typically the United States, though surveys of European, Asian, and Oceanic users have also been conducted. Still, the question becomes how to contact users and get them to participate. The traditional methodology is to use RDD. While this ensures that the phone numbers, and thus users, are selected at random, it potentially suffers from other problems as well, namely self-selection.

Self-selection occurs when the entities in the sample are given a choice to participate. If a set of members in the sample decides not to participate, it reduces the ability of the results to generalize to the entire population. This decrease in the confidence of the survey occurs because the group that decided not to participate may differ in some manner from the group that participated. It is important to note that self-selection occurs in nearly all surveys of people. In the case of RDD, if a call is placed to a number in the sample and the user hangs up the phone, self-selection has occurred. Likewise, if in a mail-based survey certain users do not respond, self-selection has occurred. While there are techniques like double sampling to deal with those members who chose not to participate or respond, most surveys do not employ these techniques due to their high cost.
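
The effect of self-selection can be illustrated with a toy simulation. The response model below (heavier web users are more likely to answer) is an assumption made purely to show the mechanism; it is not drawn from the survey data.

  import random

  random.seed(0)

  # Hypothetical population: weekly hours of web use, uniform between 0 and 30.
  population = [random.uniform(0, 30) for _ in range(100_000)]

  # Self-selection: assume the chance of responding grows with hours of use.
  respondents = [hours for hours in population if random.random() < hours / 30]

  def mean(values):
      return sum(values) / len(values)

  print(f"population mean hours: {mean(population):.1f}")    # about 15
  print(f"respondent mean hours: {mean(respondents):.1f}")   # noticeably higher

Even though every member of the population had a chance to respond, the respondents' average overstates the population average because willingness to respond is correlated with the quantity being measured.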

 

GVU's WWW User Survey Methodology

Unlike most other surveys, GVU's WWW User Surveys are conducted over the Web, i.e., participants respond to questionnaires posted on the Web. In fact, GVU pioneered the field of Web-based surveying in January 1994 with the first publicly accessible Web-based survey. The GVU Center conducts the surveys every six months as a public service to the WWW community.

The GVU Surveys employ non-probabilistic sampling. Participants are solicited in the following manner:

There are several points to be made here. First, the above methodology has evolved due to the fact that there is no broadcast mechanism on the Web that would enable participants to be selected or notified at random. As such, the methodology attempts to propagate the presence of the surveys through diverse media. Second, high exposure sites are sites that capture a significant portion of all WWW user activity as measured by PC-Meter. These sites are specifically targeted to increase the likelihood that the majority of WWW users will have been given an equal opportunity to participate in the surveys. Additionally, content-neutral sites are chosen from the list of most popular sites to reduce the chance of imposing a systematic bias in the results. Finally, the Seventh Survey was the first survey to experiment with the random rotation of banners through advertising networks. The ability of advertising networks to randomly rotate banners is a relatively new one that did not exist during the first three years of GVU's Surveys. This ability goes a long way towards ensuring that members of the WWW community have been selected at random. Since this technique is still quite experimental, its effect on the reliability of the results cannot yet be determined, though we will be examining this effect in future research.

New to the Sixth Survey was the introduction of cash prize incentives. Respondents who completed at least four questionnaires became eligible for one of the several $250 US awards. Our initial investigation into the effect of including incentives in the design of the surveys reveals that while the overall number of respondents did not increase tremendously, the total number of completed questionnaires did increase significantly. Compared to the Third Survey, which had over 23,000 respondents to the General Questionnaire and 60,000 completed questionnaires (an average of 2.6 completed questionnaires/user), the Seventh Survey received over 19,000 responses to the General Questionnaire and close to 88,000 completed questionnaires (an average of 4.6 completed questionnaires/user). The effect of offering incentives on self-selection is an open research issue, though it is a technique that has been employed widely throughout traditional survey methodologies, e.g., Nielsen's set-top box sample. For the Ninth Survey, ten respondents were chosen to receive a $100 cash prize.

Since random sampling techniques are not employed consistently throughout the methodology, the ability of the collected data to generalize to the entire population is reduced, because certain members of the Web user community may not have had an equal chance to participate. The characteristics of these users may differ significantly from those users who did participate in the surveys. As it turns out, comparison of GVU's WWW User Survey results to other published WWW user data that utilize random sampling techniques reveals that the main area where GVU's Surveys show a bias is in the experience, intensity of usage, and skill sets of the users, but not in the core demographics of users. Intuitively this makes sense, as only those users that are able to use the WWW are able to participate in the Surveys, whereas a set of RDD users may claim to be able to use the Internet or to have used the Web at some time in the past. These users are not likely to be included in the GVU results. However, for many marketing needs, this bias is exactly what is desired of the data: real data from real users online today.

Given the limitations that exist in the data as a result of the methodology, we make the following recommendation to those using the data presented within this report:

Despite the evidence to support the Survey results, we remain unconvinced that the Survey's sampling methodology is optimal and welcome suggestions and further comments on this subject.



Technical Information

Descriptive Statistics

Most analyses were conducted using SPSS 8.0 for Windows NT. Additional analyses were conducted with Excel 98 on Windows NT.

Execution

The Surveys were executed on a dedicated quad-processor Sun SPARC 20. All HTML pages were generated on the fly via our Survey Engine (written in Perl). For more information about how the Survey Engine actually works, see the write-up in the paper on the Second Survey Results. For those interested in more information about the Adaptive Java Surveying Applet, please see the write-up in Surveying the Territory: GVU's Five WWW User Surveys, Colleen M. Kehoe & James E. Pitkow, The World Wide Web Journal, Vol. 1, No. 3. Please direct inquiries about the availability of the survey code to: www-survey@cc.gatech.edu.
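
The Survey Engine itself was written in Perl and is not reproduced here. As a rough illustration of what generating questionnaire HTML "on the fly" involves, the Python sketch below renders a form from a list of question definitions; the question text, field names, and form action are hypothetical.

  from html import escape

  # Hypothetical question definitions: (field name, question text, choices).
  QUESTIONS = [
      ("age", "What is your age?", ["Under 21", "21-30", "31-40", "41-50", "Over 50"]),
      ("hours", "Hours of Web use per week?", ["Under 10", "10-20", "Over 20"]),
  ]

  def render_form(questions, action="/cgi-bin/survey"):
      """Return an HTML form with one radio-button group per question."""
      parts = [f'<form method="POST" action="{escape(action)}">']
      for name, text, choices in questions:
          parts.append(f"<p>{escape(text)}</p>")
          for choice in choices:
              parts.append(
                  f'<label><input type="radio" name="{escape(name)}" '
                  f'value="{escape(choice)}"> {escape(choice)}</label><br>'
              )
      parts.append('<input type="submit" value="Submit"></form>')
      return "\n".join(parts)

  if __name__ == "__main__":
      print(render_form(QUESTIONS))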



Special Thanks

Special thanks go to Georgia Tech's College of Computing's Computer Network Services for their excellent expert support, especially: Dan Forsyth, Peter Wan, Karen Barrett, YingQing Wang and David Leonard.

Questionnaires and advice were contributed by:

Additional thanks are extended to Allyana Ziolko, the artist/graphic designer who created the fabulous artwork used as the logo for these pages and generously loaned it to the surveys.



Copyright 1994-1997
Georgia Tech Research Corporation
Atlanta, Georgia 30332-0415
ALL RIGHTS RESERVED
Usage Restrictions 
For more information or to submit comments: 
send e-mail to www-survey@cc.gatech.edu.
 GVU's WWW Surveying Team
GVU Center, College of Computing
Georgia Institute of Technology
Atlanta, GA 30332-0280
 
Sun Microsystems, Andersen Consulting, NCR, CyberDialogue, Yahoo, and Scientific Atlanta

With special thanks to DoubleClick and CNN Interactive for their support in advertising the Ninth Survey.