Teaching Archaeological Science in the 21st Century
Robert H. Tykot, University of South Florida
This paper takes a retrospective look at how archaeological science has been taught at a variety of American universities in the late 20th century, and evaluates the success of this training as we enter a new millennium. In this discussion, I make no distinction between archaeological science and archaeometry, and broadly define them as the application of natural science techniques to archaeology. For the practitioner of archaeological science, however, we must distinguish between the routine application of existing methods and the development of new ones. In teaching archaeological science, then, we must consider the training necessary for each of four groups: (1) 'traditional' archaeologists, who often need to obtain scientific data and integrate the results in their interpretation of human behavior; (2) archaeologists who also perform scientific analyses themselves, using already established methods; (3) archaeological scientists who adapt and refine existing methods and experimentally develop new techniques and applications; and (4) natural scientists with interests in applying their techniques to archaeological materials.
What coursework and other instructional experiences are, or should be, required at the undergraduate and graduate levels for each of these groups? What faculty are interested, capable, and available to teach archaeological science courses at various levels? A comparison of undergraduate and graduate curricula at several major American universities reveals considerable variation in their approaches to archaeological education. This paper will discuss which, if any, of the above four groups are well served by each of these approaches. The prophecy that more new and important discoveries will be made in the laboratory than in the field will become a reality only if there are sufficient numbers of scientifically literate archaeologists!