Applying Quantitative Marketing Techniques to the Internet
The Quantitative Challenges from Click Stream Data
A common thread through all the techniques discussed is the need for data. Fortunately, a natural byproduct of users accessing WWW pages is a dataset that contains the sequence of URLs they visited, how long they viewed each page, and at what time. This dataset is called the click stream. To maximize its potential, managers can merge the click stream with demographic and purchase information.
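To make the structure of such a dataset concrete, the sketch below builds a small click stream table and merges it with demographic and purchase information, as the text suggests. This is a minimal illustration in Python using pandas; all field names and values are hypothetical.

```python
import pandas as pd

# Hypothetical click stream: one row per page view, with the URL,
# the time of the request, and how long the page was viewed.
clicks = pd.DataFrame({
    "user_id": [101, 101, 102],
    "url": ["/home", "/finance/quotes", "/home"],
    "time": pd.to_datetime(["2021-06-07 09:00:00",
                            "2021-06-07 09:00:42",
                            "2021-06-07 09:05:10"]),
    "duration_sec": [42, 180, 15],
})

# Hypothetical demographic and purchase information keyed by user.
demographics = pd.DataFrame({
    "user_id": [101, 102],
    "gender": ["F", "M"],
    "past_purchases": [3, 0],
})

# Merging the click stream with demographic and purchase data.
enriched = clicks.merge(demographics, on="user_id", how="left")
print(enriched)
```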

Three potential sources exist for collecting click stream data: (1) The host server (the computer of the site being visited) keeps a record of visits, usually called a server log. As a user requests a page, the server records identifying information (IP address, previous URL visited, and browser type) in the server log; a sketch of parsing such a log follows this discussion of the three sources. (2) A third party can capture information about web requests.

For example, if a user contacts an Internet Service Provider (ISP) or Commercial On-line Service (COS), such as AOL, it can record any requests the user makes as it relays them to the requested server. Because many ISPs and COSs cache their users' requested pages, they do not pass all requests on to the server; instead, they serve many pages from local cache archives to speed up responses to user requests. Unfortunately, this means that server logs contain only a subset of the viewings that occur. Dreze and Zufryden [1998] discuss some of the challenges of using server log data to measure advertising effectiveness. (3) A final, and perhaps the most reliable, source of click stream data is a program installed on the computer where the browser runs that can "watch" the user and record the URL of each page viewed in the browser window, as well as the other application programs the user is running. Because it records the actual pages viewed, it avoids the problem of cached requests. Such a program can also record how long windows are active. The drawback is that the analyst must choose the individuals and obtain their consent to participate in such a panel. Generally, web users are randomly sampled to construct a representative panel, and the information from this sample can be projected to the national population using statistical inference.
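To make the first source concrete, here is a minimal sketch of parsing one line of a server log. It assumes the Apache "combined" log format, which happens to carry exactly the identifying fields mentioned above (IP address, previous URL visited, and browser type); real log formats vary by server configuration, and the sample line is fabricated for illustration.

```python
import re

# Regular expression for the Apache "combined" log format (an assumption;
# actual server log layouts depend on how the server is configured).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" (?P<status>\d+) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

# A fabricated log line, for illustration only.
line = ('192.0.2.7 - - [07/Jun/2021:09:00:42 -0500] '
        '"GET /finance/quotes HTTP/1.1" 200 5120 '
        '"http://www.example.com/home" "Mozilla/5.0"')

match = LOG_PATTERN.match(line)
if match:
    # The three identifying fields the text mentions:
    print(match.group("ip"))          # IP address of the requester
    print(match.group("referrer"))    # previous URL visited
    print(match.group("user_agent"))  # browser type
```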

The largest provider of such information is Media Metrix [Coffey 1999].
The click stream of an actual user session as collected by Media Metrix shows that the user frequently views the same page repeatedly and sometimes pauses to do other tasks between page views (for example, running other applications or watching television). Only five of the 12 viewings the user requested could generate a "hit" to the server. This illustrates the advantage of collecting data at the user's machine rather than at the host site: it includes all requests, eliminating a potential source of bias.
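A small sketch of why server logs undercount viewings: under the simplifying assumption that a cache serves every repeat request for a page, only first-time requests reach the server. The session below is constructed so that five of 12 viewings generate server hits, mirroring the figure cited above; it is illustrative, not real data.

```python
# Hypothetical session: 12 page viewings recorded at the user's machine.
viewings = ["/home", "/news", "/home", "/sports", "/home", "/news",
            "/home", "/weather", "/home", "/sports", "/finance", "/home"]

# Simplifying assumption: the ISP cache serves every repeat request,
# so only the first request for each URL is passed on to the server.
seen = set()
server_hits = []
for url in viewings:
    if url not in seen:          # first request: cache miss, reaches server
        seen.add(url)
        server_hits.append(url)

print(len(viewings))     # 12 viewings captured by client-side metering
print(len(server_hits))  # 5 "hits" visible in the server log
```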

Information about where and how frequently users access web sites is used for various tasks. Marketers use such information to target banner ads; for example, users who often visit business sites may receive targeted banner ads for financial services even while browsing non-business sites. Web managers may use this information to understand consumer behavior at their sites and to compare competing web sites. Analysts use click stream information to track trends at a particular site or within a general community, and members of the financial community use it to value dot-com companies, because many traditional accounting and finance measures can be poor predictors of firms' values.
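As a sketch of the targeting idea described above: if a large share of a user's recent page views fall at business sites, serve a financial-services banner, even on a non-business site. The site-to-category mapping and the threshold are hypothetical assumptions made only for illustration.

```python
# Hypothetical mapping from sites to content categories.
SITE_CATEGORY = {
    "wsj.com": "business", "bloomberg.com": "business",
    "espn.com": "sports",  "ivillage.com": "lifestyle",
}

def pick_banner(visited_sites, threshold=0.5):
    """Serve a financial-services ad if business sites dominate the user's
    click stream; the 0.5 threshold is an arbitrary assumption."""
    categories = [SITE_CATEGORY.get(s, "other") for s in visited_sites]
    business_share = categories.count("business") / max(len(categories), 1)
    if business_share >= threshold:
        return "financial-services banner"
    return "generic banner"

# A frequent business-site visitor gets the financial ad
# even while browsing a sports site.
print(pick_banner(["wsj.com", "bloomberg.com", "espn.com"]))
```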

Another use of click stream data is to profile visitors to a web site. Identifying the characteristics of a site's visitors is an important precept of personalization. One way to learn about visitors is to ask them to fill out a survey. However, not everyone is willing to do so, creating what is known in marketing research as self-selection bias. The information may be inaccurate as well; for example, visitors may give invalid mailing addresses to protect their privacy or overstate their incomes to inflate their egos. Also, completing a survey takes time, and the effort required may severely skew the type of individuals who complete it, and hence the results.

An alternative way to profile users is with click stream data. The demographic profiles of sites reported by companies like Media Metrix can be used to determine what type of individuals visit a site. For example, Media Metrix reports that 66 percent of visitors to ivillage.com are female. Even without knowing anything about a user except that they visit ivillage.com, the odds are two to one that the visitor is female. This is quite reasonable because ivillage.com offers content geared toward issues of primary concern to women. Some gaming sites appeal primarily to teenage boys, and sports sites may draw predominantly adult men. On the other hand, portals such as Yahoo! and Excite draw audiences that are fairly representative of the web as a whole.

Media Metrix can identify the demographic characteristics of visitors using information provided by its panelists. However, knowledge of the web sites a user visits, combined with profiles of those sites (that is, the demographic characteristics of a sample of their visitors), is enough to make a good prediction about a visitor's demographics. For example, suppose we wish to predict whether a user is a woman. In general, about 45 percent of web users are female. Therefore, without knowing what sites a person visited, one would guess that there is a 45 percent probability of the user being female and a 55 percent probability of being male. If forced to choose, one would guess male; observing that the user visits ivillage.com, however, reverses that prediction.
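The prediction described here is a Bayesian update, and the sketch below implements it as a naive Bayes calculation on the odds scale. The 45 percent prior and the 66 percent ivillage.com figure come from the text; profiles for any other sites would be assumptions, so the example uses only the one site. With a single site, the posterior simply recovers that site's visitor composition; with several profiled sites, the likelihood ratios multiply under the usual naive independence assumption.

```python
# Prior from the text: about 45% of web users are female.
P_FEMALE = 0.45

# Site profiles: fraction of a site's visitors who are female.
# Only ivillage.com's 66% comes from the text.
SITE_FEMALE_SHARE = {"ivillage.com": 0.66}

def prob_female(visited_sites, prior=P_FEMALE):
    """Naive Bayes update on the odds scale: each visited site with a known
    profile contributes a likelihood ratio relative to the population prior."""
    prior_odds = prior / (1 - prior)        # prior odds of female (0.45/0.55)
    odds = prior_odds
    for site in visited_sites:
        share = SITE_FEMALE_SHARE.get(site)
        if share is None:
            continue                        # no profile for this site: no update
        site_odds = share / (1 - share)     # odds of female given a visit
        odds *= site_odds / prior_odds      # likelihood ratio for this site
    return odds / (1 + odds)

print(prob_female([]))                # 0.45 -- no sites observed, prior stands
print(prob_female(["ivillage.com"]))  # 0.66 -- roughly two-to-one odds of female
```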
