Dissertation Information for Michael D. Cooper

NAME:
- Michael D. Cooper
- (Alias) Michael David Cooper

DEGREE:
- Ph.D.

DISCIPLINE:
- Library and Information Science

SCHOOL:
- University of California, Berkeley (USA) (1971)

ADVISORS:
- M. E. Maron

COMMITTEE MEMBERS:
- Patrick Wilson
- Robert M. Hayes
- Raynard C. Swank
- Charles West Churchman

MPACT Status: Fully Complete

Title: Evaluation of information retrieval systems: a simulation and cost approach.

Abstract: This dissertation examines the problem of how to evaluate an information retrieval system. Two specific approaches are explored. The first is a mathematical model for use in studying how to minimize the cost of operating a mechanized retrieval system. Through the use of cost analysis, the model provides a method for comparative evaluation between systems. The cost model divides the costs of a retrieval system into two components: system costs and user costs. In addition, it suggests that a trade-off exists between the performance level of the system and the combination of user and system time expended in working with the system. With this approach it is possible to determine the allocation of user and system time that minimizes the total cost of operating the system. This allocation is done for a given performance level and for a given cost per unit of user and system time.
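
The cost-minimization idea can be illustrated with a small sketch; the trade-off curve and the unit costs below are assumptions chosen for illustration, not values from the dissertation.

    # Hypothetical sketch of the cost model: total cost is the sum of
    # system-time and user-time costs, and an assumed trade-off curve links
    # the two at a fixed performance level.

    def total_cost(system_time, user_time, system_unit_cost=5.0, user_unit_cost=2.0):
        """Total operating cost = system-time cost + user-time cost."""
        return system_time * system_unit_cost + user_time * user_unit_cost

    def user_time_needed(system_time):
        """Assumed trade-off: the less time the system spends, the more time
        the user must spend to reach the same performance level."""
        return 10.0 / (system_time + 0.1)

    # Grid search for the allocation of system and user time that minimizes
    # total cost at the performance level implied by user_time_needed().
    allocations = [(s / 10.0, user_time_needed(s / 10.0)) for s in range(1, 200)]
    best = min(allocations, key=lambda a: total_cost(a[0], a[1]))
    print("cheapest allocation (system time, user time):", best)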

The second approach to the evaluation of literature searching systems is the development of a simulation model as a preliminary step toward the creation of a tool for system design and evaluation. The simulation program studies the effect of query file characteristics on system performance. First, a thesaurus of term relations is generated. Then, employing the thesaurus, routines generate pseudo-documents and pseudo-queries. These pseudo-documents and pseudo-queries are then compared to see the effect of various query file parameter changes on the quantity of material retrieved.
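
A rough sketch of that pipeline follows; the term distributions, record lengths, and matching rule are assumptions for illustration, not the dissertation's actual generation routines.

    import random

    random.seed(0)
    TERMS = [f"t{i}" for i in range(50)]

    # Thesaurus of term relations: each term is linked to a few other terms.
    thesaurus = {t: random.sample([u for u in TERMS if u != t], 3) for t in TERMS}

    def pseudo_record(length):
        """Generate a pseudo-document or pseudo-query as a set of terms,
        expanding from a random seed term via the thesaurus relations."""
        terms = {random.choice(TERMS)}
        while len(terms) < length:
            base = random.choice(list(terms))
            # mostly follow thesaurus relations, occasionally add a fresh term
            pool = thesaurus[base] if random.random() < 0.8 else TERMS
            terms.add(random.choice(pool))
        return terms

    def retrieved(query, documents, overlap=1):
        """Retrieve every pseudo-document sharing at least `overlap` terms with the query."""
        return [d for d in documents if len(query & d) >= overlap]

    docs = [pseudo_record(length=8) for _ in range(200)]
    for query_length in (2, 3, 5):  # vary one query file parameter
        counts = [len(retrieved(pseudo_record(query_length), docs)) for _ in range(100)]
        print(f"query length {query_length}: mean documents retrieved = {sum(counts) / len(counts):.1f}")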

Evaluation of the simulation output indicates that there are small differences between the results of the experimental runs. It is concluded that no one method for generating pseudo-queries is clearly better than another. It is believed, however, that the simulation model provides a limited but useful framework for the evaluation of information retrieval systems.

MPACT Scores for Michael D. Cooper

A = 13
C = 4
A+C = 17
T = 19
G = 2
W = 13
TD = 16
TA = 3
calculated 2008-05-30 12:34:57