As the largest university in its state, Virginia Polytechnic Institute and
State University (Virginia Tech) has developed a reputation as a
computer-intensive, technology-rich environment. This chapter
describes the growth of this environment, the evolving relationship
between technology and teaching and learning, and how assessment
has recently assumed an integral role in that relationship’s
development.
Assessing the Changing Impact of
Technology on Teaching and Learning
at Virginia Tech: A Case Study
C. David Taylor, Joanne D. Eustis
In recent years the U.S. system of higher education has suffered from an ero-
sion of confidence among the public, who seem to believe that the values and
preferences of research universities in particular have become disconnected
from the ethics and needs of society at large (Sykes, 1990). This disconnection
has manifested itself in questions about the quality and relevance of under-
graduate education and in demands from governing boards and funding agen-
cies for more accountability and productivity. Universities have been charged
with failing to prepare students to live and work in a globalized world, engaging
in studies that are out of harmony with contemporary society, and failing to
involve students in a dynamic, relevant learning process.
In response defenders of higher education argue that new information
technologies offer means to remedy the situation and accommodate the learn-
ing needs of an expanding and increasingly diverse student population. To
this end higher education has indeed been aggressive in adopting state-of-the-
art technologies but slow in adjusting its organizational structure and
processes to leverage these technologies’ potential. In the words of Vartan Gre-
gorian, former president of Brown University, “the new technology per se is
not a revolution—the revolution is the difference that technology makes in
how we organize, structure, and empower our lives” (Gregorian, Hawkins,
and Taylor, 1992, p. 7). In recognition of the power and potential of tech-
nology, the State Council of Higher Education for Virginia (SCHEV) has man-
dated that the use of technology become an integral part of higher education’s
restructuring efforts. The message from SCHEV is clearly stated in a list of
suggested restructuring objectives and actions (State Council of Higher Edu-
cation for Virginia, 1995), among which are the following:
• Improve quality and reduce costs of instruction
• Incorporate technology
• Increase faculty productivity
• Serve students at new sites
• Offer credit for competency and self-paced learning
However, the importance of technology in the teaching-learning process
is not a new concept in the state. During the 1988 session of Virginia’s General
Assembly, then-governor Gerald Baliles charged the Virginia Commission on
the University of the Twenty-First Century (1990) “to develop a vision of
higher education to meet the demands of the next century.” A report written
at Virginia Tech by the University Task Force on the Impact of Digital Tech-
nology on the Classroom Environment (1989) played a prominent role in the
commission’s discussion of technology. The commission report stated that in
the university of the twenty-first century, “the constraints of space and time
will be reduced by thoughtful introduction of telecommunications and com-
puters into the instructional mission of colleges and universities.” The com-
mission further stated:
New digital technology offers the promise of three significant changes in faculty-
student contact:
1. The nature of formally structured contact will shift.
2. A larger part of faculty/student contact will be ad hoc and relatively unstruc-
tured.
3. The provision of an electronic message system will allow extensive contact
without requiring student and teacher to be in the same place [p. 7].
Viewed with hindsight this list of predictions appears remarkably pre-
scient. Virginia’s legislature has included the use of technology as “an area of
emphasis” in the restructuring of the commonwealth’s institutions of higher
education.
Uses of Technology in Higher Education
For almost a century institutions of higher education made few changes in the
way they delivered instructional services. For the most part teaching has been
synchronous, meaning that students and faculty members meet physically in the
same place at the same time. The predominant instructional model has been
lecture and test. Calls for reform insist that the focus must shift from this pas-
sive receptor model to an independent learning model, with an emphasis on
students’ more active engagement in the learning process. To this end tech-
nologies have been combined to create gateways for scholars and students to
vast sources of information in various forms: voice, video and images, text,
interactive databases. These gateways—the most prevalent being the World
Wide Web—provide unparalleled opportunities for asynchronous,
on-demand learning through access to a remote resource at the student’s con-
venience. Advocates claim tremendous benefits for asynchronous learning,
including greater opportunities for collaborative learning among students,
greater interaction, and increases in individual productivity (Taylor, Roy, and
Moore, 1997).
This new learning paradigm compellingly advocates a departure from
what Carol Twigg (1994) calls “faculty-centered curricula,” in which faculty
habits and interests take precedence over student-centered learning. Twigg
asserts that in college and university courses, “design all too frequently begins
with the question, ‘What do I want to teach?’ rather than ‘What do the students
need to learn?’” She writes: “The concept of course design itself is indicative of
a faculty-centered approach: faculty design, faculty select, faculty present. In
the process, the student is often little more than a passive recipient of the out-
comes of the faculty member’s decision-making process” (p. 18).
Herein lies a challenge. When institutions of higher education use tech-
nology, the decision regarding what and how students learn should be the
result of collaborative decision making by faculty members, instructional
design specialists, learning assessment experts, students, and college and
university administrators working within the context of the demands for insti-
tutional change. Yet such a radical cultural shift cannot come easily or with-
out questions about the nature and effects of the changes. How does an
academic institution make decisions about new directions in course content
and modes of learning given its shared governance system and a community
with diverse values and agendas? Virginia Tech serves well as a case in point,
because it is both typical and atypical in its history of using technology in
advancing learning.
Most institutions in the corporate and government sectors as well as in
higher education relied heavily on centralized mainframes and minicomput-
ers throughout the 1970s and 1980s. Personal computers were not taken seri-
ously until their computing power began to rival that of older mainframes, and
they did not begin to supersede the centralized mainframe systems until net-
working and communication technology became an affordable reality in the
early 1990s. Although computers have been used for instructional purposes
since the late 1960s and on an increasing scale at Virginia Tech since the late
1980s and early 1990s, these early instructional applications were mostly iso-
lated ventures, restricted in their impact to a single course or part of a course.
However, two initiatives at Virginia Tech are worth mentioning because of their
departure from this pattern.
First, in 1984, the College of Engineering, under the leadership of Dean
Paul Torgersen (now president of Virginia Tech), instituted a requirement that
each student have his or her own personal computer and a set of associated
engineering software tools. This was the first college of engineering in the
nation to institute such a requirement. Each year, as the technology has
become more advanced, specifications for the student-purchased PCs are
upgraded in order to ensure that engineering students experience and exper-
iment with the leading edge of technology. This is one of the few examples of
a technological initiative so widely adopted that it affected every member of a
large population.
Second, Virginia Tech was a key participant in the SUCCEED project,
again through the College of Engineering. Initially funded by the National Sci-
ence Foundation in 1992, the Southeastern University and College Coalition
for Engineering Education (SUCCEED) is a group of southeastern universities’
engineering colleges that make use of new technologies to improve teaching,
especially the teaching of highly theoretical courses that traditionally have been
difficult for students. The member institutions each contribute a variety of
developments to the project. For example, Virginia Tech created and hosted a
database of engineering-related images that any faculty member could down-
load for educational use across the then-fledgling Internet. This project was
important because it not only had an impact within Virginia Tech but also
enabled a cooperative sharing of curricula and ideas among several institutions.
One of SUCCEED’s major objectives is to improve the engineering curriculum
using outcomes assessment results, and one of the “products” produced as a
result of Virginia Tech’s first five-year SUCCEED grant was a learning objec-
tives and outcomes assessment planning guide for engineering academic
administrators [see www.succeed.vt.edu].
Investment in Teaching and Learning
The decade of the 1990s has seen a continuation of Virginia Tech’s innovation
and steady progress toward the incorporation of technology into teaching and
learning methodologies and communications. A number of programs—the
Instructional Development Initiative, Cyberschool and the ACCESS project,
and the Center for Innovation in Learning—have been initiated, accompanied
by an increasing interest in a role for assessment.
Instructional Development Initiative. Virginia Tech Instructional Ser-
vices (a unit formerly called the Learning Resources Center and then Media
Services) has existed since the early 1970s. The Center for Excellence in
Undergraduate Teaching (CEUT) was established in 1993 to foster instruc-
tional excellence and innovation. Despite these resources and innumerable
directives from SCHEV, there was little evidence until recently of interest
among faculty members in teaching methodologies that used technologies. For
example, SCHEV provided funds in 1992 for a Virginia Tech faculty grants
program that allowed Educational Technologies (a department within Instruc-
tional Services) to gain valuable experience in managing three technology-
based course transformation projects. However, these projects affected only
individual courses and had minimal impact outside their home departments.
There was of course a pragmatic reason contributing to the indifference
to technological innovation. At Virginia Tech operating budgets had been flat
or declining for a decade, and such constrictions naturally limit the discre-
tionary spending of academic administrators. In departments without signifi-
cant sources of outside income, there had been a particularly acute lack
of funds for traditional support, let alone for new equipment. As a result,
managerial initiative was thwarted, and only limited discretionary choices
remained, except at the highest level of the university administration. Conse-
quently, local experimentation with pedagogical innovation and the develop-
ment of alternative teaching strategies, especially those using technology, did
not enjoy wide appeal or encouragement.
During the early 1990s, Virginia Tech was typical of many universities in
undergoing severe budget cuts amid an atmosphere of public hostility toward
higher education in general and faculty in particular (see, for example, Walzer,
1993). Not surprisingly, faculty morale had sunk to an all-time low. At this
point the vice president for information systems and the university provost
made a courageous and far-reaching decision: rather than continue to retrench
and cut back, they decided to redirect resources and invest in the faculty, stu-
dents, and the teaching infrastructure, using technology as a vehicle. There-
fore, in 1993, the Instructional Development Initiative (IDI), developed jointly
by Information Systems and the Office of the Provost, was proposed and sub-
sequently funded centrally. The goals of the program were outlined in the
document Phase II: Instructional Development Initiative, for the 1994–98 fiscal
period, and were organized into the following three components:
Faculty Development Initiative (FDI)
• Provide the opportunity for all faculty in the University to participate in this
faculty development program. The overarching goal is to motivate them to
investigate, create, and utilize alternative instructional strategies.
• Provide participants who complete the program with access to state-of-the-
art instructional technology, the knowledge to use it, and the motivation to
collaborate with their colleagues in leveraging instructional technology in their
courses.
Student Access
• Provide advice to all students on their investment in computer technology in
order to maximize its usefulness during their college careers.
• Provide better access to computing resources for all students who do not have
their own personal computers and provide computer labs for accessing spe-
cialized software which is unique to disciplinary areas (such as Perseus, Math-
ematica, and Daedalus).
• Provide network-based training materials for students in order to ensure that
they have a basic foundation in the use of computing and instructional tech-
nology resources.
Course Development
• Support faculty in the development of network accessible courseware and
instruction.
• Facilitate the development of electronic libraries of scholarly materials sup-
porting designated courses.
• Provide improved classroom and presentation facilities to support faculty
efforts in introducing new technologies into core curriculum courses [Virginia
Polytechnic Institute and State University, 1995, p. 15].
An additional, unstated goal was to enable the transformation of the uni-
versity’s computing infrastructure from a mainframe to a client-server archi-
tecture.
In fiscal year 1993–94, as the FDI began its first series of workshops but
even before it began to have a campuswide influence, a number of concur-
rent technology-related developments were occurring as a result of the robust
technology climate at Virginia Tech. One unique project was the Blacksburg
Electronic Village (BEV) (see www.bev.net). BEV was developed as the result
of a desire to extend Virginia Tech’s network access beyond campus bound-
aries. For any computer network to be used to its potential for instructional
purposes, students living off campus required access to network-based infor-
mation services. Therefore in 1991, a decision was made to offer, in collabo-
ration with the town of Blacksburg and Bell Atlantic, Internet access to the
local community. After two years of infrastructure development the first dis-
tribution of BEV software was tested in 1993. The subsequent rapid growth
of BEV was a harbinger of the future national development of the Internet
and the incredible proliferation of Internet service providers (ISPs) and Web
sites around the world, and led to a spate of newspaper and magazine arti-
cles, such as one in USA Today that called Blacksburg “the most-wired town
in America.”
Cyberschool and the ACCESS Project. In November of 1994, a group
of arts and sciences faculty—all of whom had participated in the first FDI—
proposed that more valuable than getting “computers into the classroom”
would be “getting classrooms out of computers.” This loosely organized,
multidisciplinary group of faculty members has proceeded to test the efficacy
and push the limits of computer-mediated communication technologies.
What these faculty have discovered is described in a number of position
papers that may be found on the Cyberschool Web site (www.cyber.
vt.edu/docs/papers.html; see also, more generally, www.cyber.vt.edu/
default.html). They have been a constant source of innovation, inspiration,
and discovery regarding the strengths and limitations of learning networks.
Cyberschool today functions as a forum for information interchange, as a
support group for faculty pioneers, and as a creative voice for advocating pol-
icy changes to the administration. It remains a valuable channel of commu-
nication between early adopters of technology and the majority of faculty
and administrators.
As an ad hoc group of faculty, Cyberschool was largely an unfunded entity
and remains so today. In order to develop many of the ideas originating in
Cyberschool in a substantive and programmatic way, Associate Dean Lucinda
Roy, one of Cyberschool’s cofounders, initiated a request to the Alfred P. Sloan
Foundation for funding to explore the concept of asynchronous learning net-
works (ALN). The College of Arts and Sciences, Educational Technologies (as
part of Instructional Services), and three faculty from the department of biol-
ogy were subsequently funded to transform four lower-division, introductory
biology courses in a project called ACCESS (Asynchronous Communications
Courses to Ensure Student Success).
Biology courses were chosen for several reasons. First, the three faculty
were not only willing to participate but also had the drive and experience, having already introduced creative and far-reaching technological and instructional innovation into their own courses. Second, there was a deep need for change in the
department, because it was faced with an unprecedented influx of new stu-
dents. Finally, biology had, more than most departments, settled on the large
lecture model for its lower-division courses, and faculty and administration
alike had recently realized that simply continuing to increase the size of classes
was no longer a viable option. With growing class sizes, few resources for
teaching support, and a student population that was ever more technologically
sophisticated and demanding, the large lower-division biology lecture courses
were prime candidates for transformation.
ACCESS was funded in December of 1995. Development, implementa-
tion, and assessment proceeded for eighteen months and concluded in the
summer of 1997. Some ACCESS developments included a widely emulated
course Web site that provided round-the-clock access to class materials (lec-
ture slides, notes, the syllabus), class news and announcements, links to out-
side resources, and a communication forum. The more important innovations
included experimentation with a variety of new communication channels
among students and between students and faculty, which extended class dis-
cussions far beyond the confines of the classroom. The complete results are
available in the final report, which was published on-line in the summer of
1997 (see www.edtech.vt.edu/access).
From the point of view of this chapter, one particularly important project
innovation was the emphasis placed on evaluation and assessment. The Sloan
Foundation requested that assessment be a major part of the project and
increased the size of the grant in order to fund that part of the initiative. The
assessment effort attempted to answer questions related to asynchronous com-
munication and new teaching and learning models; the use of technology to
relieve faculty of repetitive tasks; self-paced learning in large, mixed ability
classes; the efficient development of course materials; the impact of technol-
ogy on learning, motivation, and student success; and the identification of
essential faculty and student skills for the new age of learning. These were
ambitious goals for any assessment effort, but the lessons learned during
ACCESS informed an approach to assessment that was used in the much larger
technology projects that were to come.
Center for Innovation in Learning. Recently (1996–98), the grassroots
technology activity of faculty has increased across campus, spurred on by a
heightened level of awareness of the possibilities of the new technologies. Par-
ticipants in the Faculty Development Initiative workshops have begun to
request more help in using technologies in their courses. In addition, the
administration has encouraged development of distance learning courses, and
deans and department heads have begun to identify those courses that could
benefit from the use of technology. In order to aid development, the Center
for Innovation in Learning (CIL) was founded in 1996 to integrate research
on teaching and learning into the curriculum. CIL’s work is based on the
assumption that instructional innovation is accomplished through the inter-
action of design, development, delivery, and marketing. Through faculty proj-
ects, the center investigates, develops, researches, tests, and evaluates new
technology-based teaching and learning approaches, particularly those related
to distance learning, faculty development, instructional technologies, and
telecommunications. A third round of annual grant awards was completed in
the spring of 1998, and to date sixty-six projects have been funded for over
$1.5 million.
The technological and pedagogical changes required to develop distance
learning courses and campus Web-based courses overlap to a great extent, and
spurred on by CIL funding, the pace of course transformation in both these
areas has rapidly increased. Simultaneously, a massive upgrade in the state net-
work infrastructure has created an opportunity for a quantum increase in
distance learning course distribution. Information Systems has laid the ground-
work for the development of NET.WORK.VIRGINIA, a state-of-the-art, wide
area educational network. This advanced broadband network will deliver ATM
(asynchronous transfer mode) service statewide and already connects 120 par-
ticipating sites, including all four-year colleges and universities, the Virginia
Community College System, several K–12 school systems, and many state
agencies.
Marketing is essential to the promulgation and acceptance of any new ini-
tiative, and Virginia Tech Online (VTOnline) was established by the CIL in
1997 as a mechanism for communication among and coordination of net-
centered activities. As a central Web site (www.vto.vt.edu), VTOnline serves as
a single point of contact for all computer-mediated instructional and informa-
tion services across the university. It provides easy-to-use information about
all instructional, administrative, and public service activity that is available at
Virginia Tech over the Internet. It acts as an organizing point within the uni-
versity that coordinates information concerning on-line degree programs, short
courses, extension activities, and support for public service initiatives for the
continual development of all existing and new network-centered teaching ini-
tiatives, ranging from course and program innovation to assessment and eval-
uation practices.
Evolution of a Strategy for Assessing Instructional
Technologies
As a result of the overwhelming increase in activity requiring technological
support, there has been a groundswell of interest in and demand for assess-
ment of technology projects and their impact on the university’s most funda-
mental mission—teaching and learning. Furthermore, as resources have been
diverted from other activities into instructional technology projects, new com-
puting and network infrastructure, and course redesign, questions of impact,
of costs versus benefits, and of efficiency have naturally arisen. In addition, an
awareness of new roles for faculty has begun to permeate the university cul-
ture. For example, the idea that faculty need both released time and consider-
able support (expert assistance, training, consulting, additional hardware and
software) in order to revise and transform their courses has evolved from a
novel luxury to an accepted expectation.
Yet as time goes by it seems that more questions are raised than answered.
Faculty developers ask: How can my courses be improved? Which technology
is most appropriate, or more fundamentally, is technology necessary for this
improvement? What kind of and how much assistance will I need—how many
people or machines or resources will be required? When I start implementing
changes to the course, how will the students react, what will their expectations
be? These are not questions that faculty are typically experienced in answer-
ing or that they expected to confront when they began their academic careers
by lecturing, doing research, and serving on committees. Administrators and
department heads have a different set of questions. Once the decision to sup-
port course transformation is made, they ask: How will we allocate resources,
and whom will we support? How will we manage the changes being proposed?
How will development activities affect the traditional areas of faculty produc-
tivity? How will we monitor the student reaction to these changes? Eventually
everyone involved must at some time present a coherent picture of the project
and its outcomes to other faculty, upper administration, the board of regents,
professional conferences, governmental agencies, legislators, and parents. A
good assessment plan can help address most of these questions, and it can cer-
tainly aid in communicating results to interested outsiders.
CNI Project: A Unified Approach to Assessment. In a call for state-
ments of interest and experience, the Coalition for Networked Information
(CNI) invited institutions of higher education, in December 1996, “to use ongo-
ing assessment techniques to study the uses, impacts, costs, and effectiveness
of networks and networked resources and services.” One of nine institutions
chosen to participate in CNI’s Assessing the Academic Networked Environment
project, Virginia Tech formed an interdisciplinary team that included represen-
tation from the library, computing center, instructional services, and network
services. Work proceeded on assessment of various aspects of network activi-
ties and services throughout 1997.
A great deal of uncoordinated and overlapping assessment activity was
discovered within Virginia Tech Information Systems. Although committed—
and indeed, required—to measure the effectiveness of network use and
instructional technology, the people responsible for network infrastructure
development, maintenance of communications systems, and the delivery of
network-based information services, support, and instruction were not neces-
sarily communicating among themselves. A great benefit of participation in the
project was the opportunity to network, share, and coordinate the results of
individual assessment efforts. Equally important, it increased understanding
of the interdependence in a networked environment.
Purpose of Assessment. Our experience with assessment as developers and administrators at Virginia Tech grew largely out of the ACCESS project,
was modified by the group experience with the CNI affiliation, and has con-
tinued to evolve with the ongoing assessment of the much larger CIL initiative.
The goals for assessment are therefore practical and needs driven. First and
foremost, assessment must be useful. It should not involve collecting data for
their own sake. Although it uses many of the methods of basic research, assess-
ment is research with an end in mind; in traditional terms, it is applied
research.
The two purposes of assessment can best be summarized as feedback and
communication. The feedback function (also known as formative evaluation)
serves the purpose of informing persons directly involved with the project or
intervention about its progress. The central idea is that given timely and use-
ful information, a midcourse correction might be made that could increase the
effectiveness and chances of success of the project. To be useful in a rapidly
evolving project, feedback data must be collected and analyzed quickly, but by necessity this analysis will be somewhat less rigorous than the summative evaluation upon which the final judgment about project outcomes is made.
Communication is the ultimate rationale for the assessment of projects or
interventions. Practically speaking, if results are not communicated to the out-
side world, then except for the benefits to the immediate participants, the proj-
ect might as well not have been undertaken in the first place. Above all, did
the project meet its goals? For example, increased learning may or may not be
a goal, but if it is, how did learning occur, and how was it measured? In any
case, what were the outcomes? What are the different parts or aspects of the
project, and which were successful? What was the impact? Who was affected
by the project and to what extent? Can the project’s innovations be transferred
or scaled up to more students, to other programs? Can these innovations
reduce costs? Are these findings useful in determining whether the project
should be continued or expanded? If so, how could it be improved? Moreover,
different audiences need different kinds of information. The participants and
immediate stakeholders—at Virginia Tech the faculty developers, staff, stu-
dents, and administrators who form the development and implementation
team—will first want the kind of practical, formative evaluation and feedback
described in the previous paragraph. Outsiders, however, will be interested in
the long-range, summative outcomes of the project. They either want to apply
the findings to their own situations, or they are decision makers who want to
know where and how to allocate future resources.
Methodologies. Assessment methods at Virginia Tech are in a constant
state of evolution, but in general the emphasis is on a holistic description of
the intervention—a snapshot of the status of the project. To achieve this holis-
tic picture, quantitative and qualitative data are combined in a hybrid design
and findings are reported in a narrative, descriptive format within which data
are embedded as evidence. Most important is the framework for the assess-
ment activities, that is, the questions that need to be answered or the criteria
for success of the project or intervention. It is crucial that this framework be
reviewed, negotiated, and agreed upon well in advance by all stakeholders.
Assessment data include traditional numerical or quantitative measurements
(census data, survey results, grades), but they also include a large portion of
qualitative data such as material from interviews and observations and also
summaries and quotes from various textual sources. The two types of data
complement each other but are considerably different and are therefore dis-
cussed separately in the following sections.
Qualitative Methods and Data. In Assessing the Academic Networked
Environment: Strategies and Options, Charles McClure and Cynthia Lopata
(1996) give a succinct definition of qualitative data: “Qualitative data are data
that describe, explain, and characterize the subject of investigation using words
rather than numbers.” They go on to note that qualitative methods are ap-
propriate “where the research problem and the research setting are not well
understood” (p. 11). Most course transformations that involve instructional
technologies fit this situation. Although faculty and other developers will have
certain objectives for their projects, they cannot pretend to know everything
that will happen when changes are implemented. Furthermore, they fully
expect to make changes along the way. Given this state of change and uncer-
tainty, the usual data-gathering tool, the survey, with its specific questions and
limited response criteria, could easily fail to capture major effects. The solu-
tion is to use qualitative, ethnographic research methods; that is, talk to the
participants and observe them in a natural setting, and above all be good lis-
teners and good observers and be open to the unexpected and the obvious. In
this way the participants themselves can raise issues that the planners and eval-
uators did not anticipate.
Qualitative methods are ideal for finding out, from a holistic point of view,
the impact of new technologies or instructional interventions. In addition to
observing classes, labs, and other class activities such as field trips, our pri-
mary instrument for gathering data was the guided interview (Patton, 1990),
in which the interviewer follows a list of questions but also feels free to depart
from that script and explore topics that the subject initiates or in which the
subject seems to have interest or knowledge. Other methods we have employed
include focus groups, benchmarking, and examination of user activities,
through transcripts of on-line discussions, for example. A complete descrip-
tion of qualitative research methods along with a rationale for their use is obvi-
ously beyond the scope of this chapter, and the reader is referred to
comprehensive works on the subject (Patton, 1990; Denzin and Lincoln, 1994;
Miles and Huberman, 1994). Finally, the findings of open-ended interviews
help inform the writing of subsequent surveys and point the direction in which
to look with more precise and widely accepted quantitative methods.
The primary practical problem in qualitative methods is accumulating
more data than can comfortably be organized, analyzed, and interpreted.
Open-ended interviews can be lengthy and must be transcribed, checked for
accuracy, then coded and studied. The results must be organized, analyzed,
summarized, and displayed, or presented. In addition, field observation is time
consuming and requires a disciplined researcher to observe, take useful notes
or record observations, and then compile the notes into a useful format.
Finally, almost any textual material is grist for the qualitative mill, and tech-
nology interventions leave an extensive textual trail suitable for analysis: on-
line discussions, e-mail interchanges, listservs, threaded discussion groups—all
are raw material that can be archived and incorporated into the data set.
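To give a concrete, purely illustrative sense of how such an archived textual trail might be organized for later analysis, the short Python sketch below tags a few invented discussion posts with invented theme codes based on keywords and tallies the results. It is not drawn from the ACCESS project; real qualitative coding is done by trained analysts, and the post texts, code names, and keywords here are assumptions made up for the example.

```python
# Rough illustration only: tagging archived discussion posts with simple
# keyword-based codes and counting how often each code appears. Posts,
# code names, and keywords are invented; human analysts do the real coding.
from collections import Counter

posts = [
    "The web site notes helped me review before the quiz.",
    "I emailed my group at midnight and we finished the lab write-up together.",
    "Office hours online were easier to attend than walking across campus.",
]

# Invented coding scheme: code name -> keywords that suggest it.
codes = {
    "access_to_materials": ["web site", "notes", "slides"],
    "peer_collaboration":  ["group", "together", "classmate"],
    "faculty_contact":     ["office hours", "professor", "email"],
}

tally = Counter()
for post in posts:
    text = post.lower()
    for code, keywords in codes.items():
        if any(kw in text for kw in keywords):
            tally[code] += 1

for code, count in tally.most_common():
    print(f"{code}: {count}")
```

A tally of this kind can point an analyst toward the themes worth reading in depth, but it is no substitute for the interviews and observations described above.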
For all their difficulties, we have found that qualitative data have a great
deal of value and impact in communicating the nature of a project, which was
the bottom line of our assessment effort. When embedded in a readable nar-
rative, qualitative data such as quotes from student or faculty interviews make
immediate sense and are accessible to everyone, which is often not the case for
a complex statistical analysis. We have also found that technology allows the
impact and credibility of qualitative data to be extended beyond the normal
paper-based report. In the ACCESS project, we videotaped as many of the
interviews with students and faculty as possible, which was quite acceptable
to almost all of the interview subjects. Because the videotaping and sound
recording were done at a high level of quality, clips from these interviews could
then be used in a wide variety of applications: as part of traditional videotapes,
on CD-ROMs, or as part of multimedia presentations. For example, we
included approximately forty of these clips in the on-line version of the
ACCESS report (see www.edtech.vt.edu/access). Although the right quote has
tremendous impact in a text narrative, it is even more powerful to see and hear
the student or professor actually saying those same words.
Quantitative Methods and Data. Our standard method of data collec-
tion was the survey, administered to an intact class. Typically, we tried to find
out as much as possible about the students’ background and their use of com-
puters and networks, including demographics, computer ownership, knowl-
edge of and experience with computer networks, and attitudes toward different
ways of learning (cooperative groups, independent study, relative importance
of lecture versus lab, and so forth). We also administered a computer attitude
scale (Ray and Minch, 1990), usually at the beginning and end of the semes-
ter, which gave us an indication of students’ anxieties about and alienation
from computer technology and how these feelings changed during the course
of the semester. Finally, we asked many specific questions about technology in
the class: their usage of the class Web site, the impact of the class and tech-
nology on their attitudes toward the course subject matter, their opinion on
the usefulness of the technology, and so forth.
Most of our analyses used descriptive statistics to summarize
responses. We were also able to use the data, however, for some interesting
correlation studies and tests of significance. For example, many faculty were
concerned that students who owned computers or were more favorably dis-
posed toward or adept with computer and network technology might have
an unfair advantage in a technology-enhanced class. We were able to report
that, to date, there has been no significant correlation between computer
ownership, experience, or attitude (that is, computer anxiety and alienation)
and course grades. We also found that computer attitude scores in many
technology-enhanced courses improved significantly between the beginning
and end of the semester.
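The chapter does not reproduce the actual analysis scripts; as a hedged illustration of the two kinds of tests just described, the Python sketch below computes a Pearson correlation between computer-attitude scores and course grades and a paired t-test on attitude scores from the beginning and end of the semester. The data, column names, and sample size are invented for the example.

```python
# Hypothetical sketch of the analyses described above: a correlation between
# students' computer-attitude scores and course grades, and a paired t-test
# on pre-/post-semester attitude scores. Data and column names are invented.
import pandas as pd
from scipy import stats

# Imagined survey export: one row per student.
df = pd.DataFrame({
    "grade":         [92, 78, 85, 64, 88, 73, 95, 81, 70, 90],                 # final grade
    "attitude_pre":  [3.1, 2.4, 3.8, 2.0, 3.5, 2.7, 4.2, 3.0, 2.2, 3.9],       # start of term
    "attitude_post": [3.6, 2.9, 4.0, 2.4, 3.7, 3.1, 4.4, 3.3, 2.6, 4.1],       # end of term
})

# Is attitude toward computers related to the grade earned?
r, p_corr = stats.pearsonr(df["attitude_pre"], df["grade"])
print(f"attitude vs. grade: r = {r:.2f}, p = {p_corr:.3f}")

# Did attitudes change significantly over the semester?
t, p_change = stats.ttest_rel(df["attitude_post"], df["attitude_pre"])
print(f"pre/post attitude change: t = {t:.2f}, p = {p_change:.3f}")
```

In practice the same two calls would be run against the full survey export, with the usual checks on sample size and missing responses.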
Flashlight Project: A National Link. As useful as these survey data
were, we continued to doubt that our surveys addressed the important questions that were repeatedly asked in conversations,
in meetings, and at conferences: Did students learn more as a result of using
the technology? Are technology-enhanced courses cost-efficient and better than courses
without technology? Fortunately, the Flashlight Project (see www.tltgroup.org)
has provided a useful approach and a tool for answering these questions. The
developers of this project accept the fact that traditional measures of student
outcomes (for example, grades) are not useful or valid indicators for compar-
ing the efficacy of courses with and without technology (Ehrmann and Zuniga,
1997). The reasons are well known: faculty are typically not trained assessors
or test writers, they may grade on the curve and change tests and standards from
semester to semester, and when they reengineer or transform their courses,
they tend to change their tests and assessment methods, thus making com-
parisons with earlier classes impossible. The Flashlight Project notes that the
transformation of courses through technology changes the nature of these
courses, the pedagogical approaches used, and eventually the educational goals
of the course. A class with technology and a class without it cannot be equiv-
alent, and any comparison between them that tries to assess the impact of tech-
nology will be invalid.
Instead, Flashlight Project developers use the findings of educational
research as evidence and indirect indicators of increased learning. If certain
educational conditions are met or increased as a result of technology—for
example, active learning, faculty-student interaction, and rich and prompt feed-
back—then it is probable that learning outcomes have also improved and tech-
nology has indeed enhanced the learning environment. The Flashlight Current
Student Inventory uses as its basis a set of acknowledged principles of learning,
which are closely allied and overlap with the “seven principles of good practice
in undergraduate education,” developed by Chickering and Gamson (1987,
1991; for the technology applications of these principles, see Chickering and
Ehrmann, 1996). The most important point about this approach is that it is not
technology per se but the underlying pedagogy—the instructional design of the
course—that makes the crucial difference in the learning environment. If this
message alone were to come through in any of our reports and other commu-
nications, then we would consider our work to be a success.
Virginia Tech has embraced the Flashlight Project for another reason: as
important as the work at Virginia Tech is, it is still an isolated venture. It is
always more useful to enlarge one’s efforts and be able to compare results to
other courses, programs, and institutions. Doing so requires either a controlled
experiment (which is both time consuming and logistically and politically dif-
ficult) or a standardized method or instrument that is used by many institu-
tions, thus enabling comparisons to be made.
Doing Assessment: Getting Help
If Virginia Tech’s case is in any way typical, many other institutions are also fac-
ing a crisis of assessment. New technologies, interventions, and programs are
proliferating much faster than they can be assessed, given the paucity of avail-
able experts. Developers, already pressed for time and unskilled in evaluation,
are nevertheless being asked to provide answers to difficult and problematic
questions: for example, do students learn more, or more efficiently, as a result of the developers’ program, and do any added benefits justify the costs? In order
for institutions to make sense of these many and varied projects, they must be
assessed—but how? One answer is to make use of available instruments, such
as the Flashlight Project instrument described previously, and follow the rec-
ommendations described in other chapters in this volume. A concurrent
approach is to use the expertise that already exists on one’s own campus. In
most cases this expertise resides in an office of institutional research (IR). IR
personnel typically have a great deal of experience in working with the very
entities with which technology developers are concerned: students, faculty, and
departments. Similarly, they are experienced in the practical differences
between assessment or evaluation and the kind of rigorous basic research in
which social scientists engage. As described previously, assessment uses many
of the methods of basic research but differs on practical and philosophical
issues. However, for the relationship between IR staff and developers to prove
beneficial, those involved must recognize several points.
First, the developer must realize that most IR offices are probably already
working at or near full capacity. Typically, they must provide a constant stream
of data for upper administration, the higher education supervisory board, and
legislative committees. However, they are highly skilled in the theory and prac-
tice of designing and implementing quantitative studies within the settings of
higher education. At a minimum they should be able to provide some con-
sulting services to faculty developers who are embarking on a project.
Second, IR personnel must be approached early in the development
process. A typical view of an assessment or evaluation is that it is a one-shot
study performed when a project is nearly complete because it is thought that
the effects or outcomes are the project’s most important aspect. Unfortunately,
by that time a great deal of valuable data has probably been lost, forever. Fre-
quently, it is changes in conditions over time—trend analysis—that provide the
most insight about the impact of an intervention. Therefore, the best time to
consult with IR is during the planning stages, so that mechanisms can be
installed and the right data can be captured when they are available and in a
form that will be useful. In addition, the data must be captured consistently and
at the proper intervals and then maintained in a healthy state. Furthermore, if
conditions within the project change, knowledgeable decisions must be made
about which new data to capture, and IR can provide help with making these
decisions. IR can also provide help with planning for the delivery stage, that is,
for analysis and reporting of the data that are gathered. Institutional researchers
can help developers focus and clarify the ultimate purpose, use, and display of
the data and the results of the study. Finally, at the strategic level, institutional
researchers can give advice on the ultimate purpose and impact of the study as
a whole—who might be interested in the results, the possible impact of the
findings, and so forth. IR personnel typically have a great deal of experience in
creating useful reports and making them available to the appropriate audience.
However, developers should not expect IR offices to help with all the types
of studies described here. Typically, most IR personnel have strong quantitative
backgrounds. Although they can offer a perspective on the usefulness of qualitative research, it may lie outside their area of expertise. For example, Virginia’s State
Council of Higher Education decided only in 1998 to accept some limited qual-
itative data (student satisfaction responses) from state institutions as part of their
periodic reports. In addition, faculty developers should not expect IR personnel
to be experts in new technologies or new teaching approaches. A process of edu-
cation must take place, but with good communication they can usually learn
enough, and quickly enough, to help design a useful assessment. Therefore, the
important points to be communicated from faculty developers to institutional
researchers are not merely the technology and its application but the nature of
the pedagogical transformation itself. Most people in or out of higher education
implicitly assume that the old models still apply and that knowledge is delivered
to the student by lecture and test. It is more likely, however, that new teaching
and learning models, new modes of discourse, and asynchronous interaction will
form the basis of course or program transformations, and these new models must
be the starting point for an assessment design.
References
Chickering, A., and Ehrmann, S. C. “Implementing the Seven Principles: Technology as
Lever.” AAHE Bulletin, Oct. 1996, pp. 3–6. [www.aahe.org/technology/ehrmann.htm].
Chickering, A., and Gamson, Z. “Seven Principles of Good Practice in Undergraduate Edu-
cation.” AAHE Bulletin, 1987, 39 (7), 3–7.
Chickering, A., and Gamson, Z. (eds.). Applying the Seven Principles for Good Practice in
Undergraduate Education. New Directions for Teaching and Learning, no. 47. San Fran-
cisco: Jossey-Bass, 1991.
Denzin, N. K., and Lincoln, Y. S. (eds.). Handbook of Qualitative Research. Thousand Oaks,
Calif.: Sage, 1994.
Ehrmann, S. C., and Zuniga, R. E. The Flashlight Evaluation Handbook. Washington, D.C.:
American Association for Higher Education, 1997.
Gregorian, V., Hawkins, B. L., and Taylor, M. “Integrating Information Technologies: A
Research University Perspective.” CAUSE/EFFECT, Winter 1992, pp. 2–12.
McClure, C. R., and Lopata, C. L. Assessing the Academic Networked Environment: Strategies
and Options. Washington, D.C.: Coalition for Networked Information, 1996. [www.
cni.org/projects/assessing].
Miles, M. B., and Huberman, A. M. Qualitative Data Analysis: An Expanded Sourcebook. (2nd
ed.) Thousand Oaks, Calif.: Sage, 1994.
Patton, M. Q. Qualitative Evaluation and Research Methods. (2nd ed.) Thousand Oaks, Calif.:
Sage, 1990.
Ray, N. M., and Minch, R. P. “Computer Anxiety and Alienation: Toward a Definitive and
Parsimonious Measure.” Human Factors, 1990, 32 (4), 477–491.
State Council of Higher Education for Virginia. Presentation on Preliminary Restructuring
Progress. [www.schev.edu/wumedia/cn95.html]. June 1995.
Sykes, C. J. The Hollow Men: Politics and Corruption in Higher Education. Washington, D.C.:
Regnery, 1990.
Taylor, C. D., Roy, L., and Moore, J. F. ACCESS: Asynchronous Communication Courses to
Enable Student Success. A report submitted to the Alfred P. Sloan Foundation by Educa-
tional Technologies/Instructional Services, Virginia Polytechnic Institute and State Uni-
versity, July 1997. [www.edtech.vt.edu/access].
Twigg, C. “The Need for a National Learning Infrastructure.” Educom Review, Sept.–Oct.
1994, pp. 16–20. [www.educause.edu/pub/er/review/reviewarticles/29516.html].
University Task Force on the Impact of Digital Technology on the Classroom Environment.
Report of the University Task Force on the Impact of Digital Technology on the Classroom Envi-
ronment. Roanoke: Virginia Polytechnic Institute and State University, 1989.
Virginia Commission on the University of the Twenty-First Century. The Case for Change.
Richmond: Commonwealth of Virginia, 1990.
Virginia Polytechnic Institute and State University. Phase II: Instructional Development Ini-
tiative. Blacksburg: Virginia Polytechnic Institute and State University, May 1995.
Walzer, P. “Professors Not Often in Class.” Roanoke Times and World News, Sept. 12–14,
1993, p. 1.
C. DAVID TAYLOR is director of educational assessment and technology at the Uni-
versity of Texas Health Science Center at Houston, Dental Branch, and adjunct asso-
ciate professor in the Department of Dental Public Health and Dental Hygiene.
Previously he was assessment project manager for instructional innovations at Vir-
ginia Polytechnic Institute and State University.
JOANNE D. EUSTIS is director of the university library at Case Western Reserve Uni-
versity. She formerly held a variety of positions spanning sixteen years at Virginia
Polytechnic Institute and State University.

More Related Content

PDF
Convergence In Education-Education ERP Solution
PDF
Models and instruments for assessing Technology Enhanced Learning Environment...
PPTX
Workshop european 25_26_january
DOCX
201013150 karabeyeser f. prof. 3 a education and training innovation
PDF
0262033712forw1
PDF
Thoughts on Learning Technologies in 2020
PDF
Understanding Virtual Universities Roy Rada
PPTX
Icthe policy01 cemca_kandy_palitha_edirisingha_6june2014
Convergence In Education-Education ERP Solution
Models and instruments for assessing Technology Enhanced Learning Environment...
Workshop european 25_26_january
201013150 karabeyeser f. prof. 3 a education and training innovation
0262033712forw1
Thoughts on Learning Technologies in 2020
Understanding Virtual Universities Roy Rada
Icthe policy01 cemca_kandy_palitha_edirisingha_6june2014

Similar to Assessing the Changing Impact of Technology on Teaching and Learning at Virginia Tech A Case Study.pdf (20)

PDF
Goldsmith upcraft
PDF
Edu comm asia july 2014 low res
PDF
July, 2014 Vol. 18 No.3
PDF
July, 2014 Vol. 18 No.3
PDF
July, 2014 Vol. 18 No.3
PDF
July, 2014 Vol. 18 No.3
DOC
Assignment 1 Technology And Education
PPT
The Future Looks Digital - TCCTA
PDF
Thesis Final Version
PDF
Siemens handbook of emerging technologies for learning
DOCX
Technology.docx
DOC
Black Clute 2008
PPT
Learning Centers 2.0: Enhancing Student Learning With Technology
PDF
Expanding Educational Opportunity - Online Learning
PPT
21st century trends_&_statistics_that_will_shape
PPT
Technology in the service of our educational ambitions
PDF
Establishing a PLACE For Teaching Technologies
PPSX
ICT in Higher Education: Policy Perspectives
PPTX
Session 2 -- technology driven educational disruption
PPTX
Emerging ict applications in higher education in 21 st century
Goldsmith upcraft
Edu comm asia july 2014 low res
July, 2014 Vol. 18 No.3
July, 2014 Vol. 18 No.3
July, 2014 Vol. 18 No.3
July, 2014 Vol. 18 No.3
Assignment 1 Technology And Education
The Future Looks Digital - TCCTA
Thesis Final Version
Siemens handbook of emerging technologies for learning
Technology.docx
Black Clute 2008
Learning Centers 2.0: Enhancing Student Learning With Technology
Expanding Educational Opportunity - Online Learning
21st century trends_&_statistics_that_will_shape
Technology in the service of our educational ambitions
Establishing a PLACE For Teaching Technologies
ICT in Higher Education: Policy Perspectives
Session 2 -- technology driven educational disruption
Emerging ict applications in higher education in 21 st century
Ad

More from Ann Wera (20)

PDF
Writing Paper Set Cheerfully Given. Online assignment writing service.
PDF
How To Make Yourself Write A Paper - Amos Writing
PDF
How To Overcome Bad Online Reviews. Online assignment writing service.
PDF
How To Write A Literature Review In Research Paper
PDF
Sample Informative Speech Outline On Caffeine A Go
PDF
How To Essay Types AvidBards Essay. Online assignment writing service.
PDF
Writing Numbers In Words Worksheets Grade 3 Ask
PDF
How To Write An Opinion Essay 10Th Grade
PDF
Analytical Response Paper Example. How To Write An
PDF
mission further stated:

New digital technology offers the promise of three significant changes in faculty-student contact:
1. The nature of formally structured contact will shift.
2. A larger part of faculty/student contact will be ad hoc and relatively unstructured.
3. The provision of an electronic message system will allow extensive contact without requiring student and teacher to be in the same place [p. 7].

Viewed with hindsight this list of predictions appears remarkably prescient. Virginia's legislature has included the use of technology as "an area of emphasis" in the restructuring of the commonwealth's institutions of higher education.

Uses of Technology in Higher Education

For almost a century institutions of higher education made few changes in the way they delivered instructional services. For the most part teaching has been synchronous, meaning that students and faculty members meet physically in the same place at the same time. The predominant instructional model has been lecture and test. Calls for reform insist that the focus must shift from this passive receptor model to an independent learning model, with an emphasis on students' more active engagement in the learning process.
To this end technologies have been combined to create gateways for scholars and students to vast sources of information in various forms: voice, video and images, text, interactive databases. These gateways—the most prevalent being the World Wide Web, or Internet—provide unparalleled opportunities for asynchronous, on-demand learning through access to a remote resource at the student's convenience. Advocates claim tremendous benefits for asynchronous learning, including greater opportunities for collaborative learning among students, greater interaction, and increases in individual productivity (Taylor, Roy, and Moore, 1997).

This new learning paradigm compellingly advocates a departure from what Carol Twigg (1994) calls "faculty-centered curricula," in which faculty habits and interests take precedence over student-centered learning. Twigg asserts that in college and university courses, "design all too frequently begins with the question, 'What do I want to teach?' rather than 'What do the students need to learn?'" She writes: "The concept of course design itself is indicative of a faculty-centered approach: faculty design, faculty select, faculty present. In the process, the student is often little more than a passive recipient of the outcomes of the faculty member's decision-making process" (p. 18).

Herein lies a challenge. When institutions of higher education use technology, the decision regarding what and how students learn should be the result of collaborative decision making by faculty members, instructional design specialists, learning assessment experts, students, and college and university administrators working within the context of the demands for institutional change. Yet such a radical cultural shift cannot come easily or without questions about the nature and effects of the changes. How does an academic institution make decisions about new directions in course content and modes of learning given its shared governance system and a community with diverse values and agendas? Virginia Tech serves well as a case in point, because it is both typical and atypical in its history of using technology in advancing learning.

Most institutions in the corporate and government sector as well as in higher education relied heavily on centralized mainframes and minicomputers throughout the 1970s and 1980s. Personal computers were not taken seriously until their computing power began to rival that of older mainframes, and they did not begin to supersede the centralized mainframe systems until networking and communication technology became an affordable reality in the early 1990s. Although computers have been used for instructional purposes since the late 1960s and on an increasing scale at Virginia Tech since the late 1980s and early 1990s, these early instructional applications were mostly isolated ventures, restricted in their impact to a single course or part of a course. However, two initiatives at Virginia Tech are worth mentioning because of their departure from this pattern.

First, in 1984, the College of Engineering, under the leadership of Dean Paul Torgersen (now president of Virginia Tech), instituted a requirement that each student have his or her own personal computer and a set of associated engineering software tools.
This was the first college of engineering in the nation to institute such a requirement. Each year, as the technology has become more advanced, specifications for the student-purchased PCs are upgraded in order to ensure that engineering students experience and experiment with the leading edge of technology. This is one of the few examples of a technological initiative so widely adopted that it affected every member of a large population.

Second, Virginia Tech was a key participant in the SUCCEED project, again through the College of Engineering. Initially funded by the National Science Foundation in 1992, the Southeastern University and College Coalition for Engineering Education (SUCCEED) is a group of southeastern universities' engineering colleges that make use of new technologies to improve teaching, especially the teaching of highly theoretical courses that traditionally have been difficult for students. The member institutions each contribute a variety of developments to the project. For example, Virginia Tech created and hosted a database of engineering-related images that any faculty member could download for educational use across the then-fledgling Internet. This project was important because it not only had an impact within Virginia Tech but also enabled a cooperative sharing of curricula and ideas among several institutions. One of SUCCEED's major objectives is to improve the engineering curriculum using outcomes assessment results, and one of the "products" produced as a result of Virginia Tech's first five-year SUCCEED grant was a learning objectives and outcomes assessment planning guide for engineering academic administrators [see www.succeed.vt.edu].

Investment in Teaching and Learning

The decade of the 1990s has seen a continuation of Virginia Tech's innovation and steady progress toward the incorporation of technology into teaching and learning methodologies and communications. A number of programs—the Instructional Development Initiative, Cyberschool and the ACCESS project, and the Center for Innovation in Learning—have been initiated, accompanied by an increasing interest in a role for assessment.

Instructional Development Initiative. Virginia Tech Instructional Services (a unit formerly called the Learning Resources Center and then Media Services) has existed since the early 1970s. The Center for Excellence in Undergraduate Teaching (CEUT) was established in 1993 to foster instructional excellence and innovation. Despite these resources and innumerable directives from SCHEV, there was little evidence until recently of interest among faculty members in teaching methodologies that used technologies. For example, SCHEV provided funds in 1992 for a Virginia Tech faculty grants program that allowed Educational Technologies (a department within Instructional Services) to gain valuable experience in managing three technology-based course transformation projects. However, these projects affected only individual courses and had minimal impact outside their home departments.
There was of course a pragmatic reason contributing to the indifference to technological innovation. At Virginia Tech operating budgets had been flat or declining for a decade, and such constrictions naturally limit the discretionary spending of academic administrators. In departments without significant sources of outside income, there had been a particularly acute lack of funds for traditional support, let alone for new equipment. As a result, managerial initiative was thwarted, and only limited discretionary choices remained, except at the highest level of the university administration. Consequently, local experimentation with pedagogical innovation and the development of alternative teaching strategies, especially those using technology, did not enjoy wide appeal or encouragement.

During the early 1990s, Virginia Tech was typical of many universities in undergoing severe budget cuts amid an atmosphere of public hostility toward higher education in general and faculty in particular (see, for example, Walzer, 1993). Not surprisingly, faculty morale had sunk to an all-time low. At this point the vice president for information systems and the university provost made a courageous and far-reaching decision: rather than continue to retrench and cut back, they decided to redirect resources and invest in the faculty, students, and the teaching infrastructure, using technology as a vehicle. Therefore, in 1993, the Instructional Development Initiative (IDI), developed jointly by Information Systems and the Office of the Provost, was proposed and subsequently funded centrally. The goals of the program were outlined in the document Phase II: Instructional Development Initiative, for the 1994–98 fiscal period, and were organized into the following three components:

Faculty Development Initiative (FDI)
• Provide the opportunity for all faculty in the University to participate in this faculty development program. The overarching goal is to motivate them to investigate, create, and utilize alternative instructional strategies.
• Provide participants who complete the program with access to state-of-the-art instructional technology, the knowledge to use it, and the motivation to collaborate with their colleagues in leveraging instructional technology in their courses.

Student Access
• Provide advice to all students on their investment in computer technology in order to maximize its usefulness during their college careers.
• Provide better access to computing resources for all students who do not have their own personal computers and provide computer labs for accessing specialized software which is unique to disciplinary areas (such as Perseus, Mathematica, and Daedalus).
• Provide network-based training materials for students in order to ensure that they have a basic foundation in the use of computing and instructional technology resources.
Course Development
• Support faculty in the development of network accessible courseware and instruction.
• Facilitate the development of electronic libraries of scholarly materials supporting designated courses.
• Provide improved classroom and presentation facilities to support faculty efforts in introducing new technologies into core curriculum courses [Virginia Polytechnic Institute and State University, 1995, p. 15].

An additional, unstated goal was to enable the transformation of the university's computing infrastructure from a mainframe to a client-server architecture.

In fiscal year 1993–94, as the FDI began its first series of workshops but even before it began to have a campuswide influence, a number of concurrent technology-related developments were occurring as a result of the robust technology climate at Virginia Tech. One unique project was the Blacksburg Electronic Village (BEV) (see www.bev.net). BEV was developed as the result of a desire to extend Virginia Tech's network access beyond campus boundaries. For any computer network to be used to its potential for instructional purposes, students living off campus required access to network-based information services. Therefore in 1991, a decision was made to offer, in collaboration with the town of Blacksburg and Bell Atlantic, Internet access to the local community. After two years of infrastructure development the first distribution of BEV software was tested in 1993. The subsequent rapid growth of BEV was a harbinger of the future national development of the Internet and the incredible proliferation of Internet service providers (ISPs) and Web sites around the world, and led to a spate of newspaper and magazine articles, such as one in USA Today that called Blacksburg "the most-wired town in America."

Cyberschool and the ACCESS Project. In November of 1994, a group of arts and sciences faculty—all of whom had participated in the first FDI—proposed that more valuable than getting "computers into the classroom" would be "getting classrooms out of computers." This loosely organized, multidisciplinary group of faculty members has proceeded to test the efficacy and push the limits of computer-mediated communication technologies. What these faculty have discovered is described in a number of position papers that may be found on the Cyberschool Web site (www.cyber.vt.edu/docs/papers.html; see also, more generally, www.cyber.vt.edu/default.html). They have been a constant source of innovation, inspiration, and discovery regarding the strengths and limitations of learning networks. Cyberschool today functions as a forum for information interchange, as a support group for faculty pioneers, and as a creative voice for advocating policy changes to the administration. It remains a valuable channel of communication between early adopters of technology and the majority of faculty and administrators.
As an ad hoc group of faculty, Cyberschool was largely an unfunded entity and remains so today. In order to develop many of the ideas originating in Cyberschool in a substantive and programmatic way, Associate Dean Lucinda Roy, one of Cyberschool's cofounders, initiated a request to the Alfred P. Sloan Foundation for funding to explore the concept of asynchronous learning networks (ALN). The College of Arts and Sciences, Educational Technologies (as part of Instructional Services), and three faculty from the department of biology were subsequently funded to transform four lower-division, introductory biology courses in a project called ACCESS (Asynchronous Communications Courses to Ensure Student Success).

Biology courses were chosen for several reasons. First, the three faculty were not only willing to participate but also had the drive and experience to have introduced creative and far-reaching technological and instructional innovation into their own courses. Second, there was a deep need for change in the department, because it was faced with an unprecedented influx of new students. Finally, biology had, more than most departments, settled on the large lecture model for its lower-division courses, and faculty and administration alike had recently realized that simply continuing to increase the size of classes was no longer a viable option. With growing class sizes, few resources for teaching support, and a student population that was ever more technologically sophisticated and demanding, the large lower-division biology lecture courses were prime candidates for transformation.

ACCESS was funded in December of 1995. Development, implementation, and assessment proceeded for eighteen months and concluded in the summer of 1997. Some ACCESS developments included a widely emulated course Web site that provided round-the-clock access to class materials (lecture slides, notes, the syllabus), class news and announcements, links to outside resources, and a communication forum. The more important innovations included experimentation with a variety of new communication channels among students and between students and faculty, which extended class discussions far beyond the confines of the classroom. The complete results are available in the final report, which was published on-line in the summer of 1997 (see www.edtech.vt.edu/access).

From the point of view of this chapter, one particularly important project innovation was the emphasis placed on evaluation and assessment. The Sloan Foundation requested that assessment be a major part of the project and increased the size of the grant in order to fund that part of the initiative. The assessment effort attempted to answer questions related to asynchronous communication and new teaching and learning models; the use of technology to relieve faculty of repetitive tasks; self-paced learning in large, mixed-ability classes; the efficient development of course materials; the impact of technology on learning, motivation, and student success; and the identification of essential faculty and student skills for the new age of learning.
These were ambitious goals for any assessment effort, but the lessons learned during ACCESS informed an approach to assessment that was used in the much larger technology projects that were to come.

Center for Innovation in Learning. Recently (1996–98), the grassroots technology activity of faculty has increased across campus, spurred on by a heightened level of awareness of the possibilities of the new technologies. Participants in the Faculty Development Initiative workshops have begun to request more help in using technologies in their courses. In addition, the administration has encouraged development of distance learning courses, and deans and department heads have begun to identify those courses that could benefit from the use of technology. In order to aid development, the Center for Innovation in Learning (CIL) was founded in 1996 to integrate research on teaching and learning into the curriculum. CIL's work is based on the assumption that instructional innovation is accomplished through the interaction of design, development, delivery, and marketing. Through faculty projects, the center investigates, develops, researches, tests, and evaluates new technology-based teaching and learning approaches, particularly those related to distance learning, faculty development, instructional technologies, and telecommunications. A third round of annual grant awards was completed in the spring of 1998, and to date sixty-six projects have been funded for over $1.5 million.

The technological and pedagogical changes required to develop distance learning courses and campus Web-based courses overlap to a great extent, and spurred on by CIL funding, the pace of course transformation in both these areas has rapidly increased. Simultaneously, a massive upgrade in the state network infrastructure has created an opportunity for a quantum increase in distance learning course distribution. Information Systems has laid the groundwork for the development of NET.WORK.VIRGINIA, a state-of-the-art, wide area educational network. This advanced broadband network will deliver ATM (asynchronous transfer mode) service statewide and already connects 120 participating sites, including all four-year colleges and universities, the Virginia Community College System, several K–12 school systems, and many state agencies.

Marketing is essential to the promulgation and acceptance of any new initiative, and Virginia Tech Online (VTOnline) was established by the CIL in 1997 as a mechanism for communication among and coordination of net-centered activities. As a central Web site (www.vto.vt.edu), VTOnline serves as a single point of contact for all computer-mediated instructional and information services across the university. It provides easy-to-use information about all instructional, administrative, and public service activity that is available at Virginia Tech over the Internet. It also acts as an organizing point within the university, coordinating information concerning on-line degree programs, short courses, extension activities, and public service initiatives, and supporting the continual development of all existing and new network-centered teaching initiatives, ranging from course and program innovation to assessment and evaluation practices.
Evolution of a Strategy for Assessing Instructional Technologies

As a result of the overwhelming increase in activity requiring technological support, there has been a groundswell of interest in and demand for assessment of technology projects and their impact on the university's most fundamental mission—teaching and learning. Furthermore, as resources have been diverted from other activities into instructional technology projects, new computing and network infrastructure, and course redesign, questions of impact, of costs versus benefits, and of efficiency have naturally arisen. In addition, an awareness of new roles for faculty has begun to permeate the university culture. For example, the idea that faculty need both released time and considerable support (expert assistance, training, consulting, additional hardware and software) in order to revise and transform their courses has evolved from a novel luxury to an accepted expectation.

Yet as time goes by it seems that more questions are raised than answered. Faculty developers ask: How can my courses be improved? Which technology is most appropriate, or more fundamentally, is technology necessary for this improvement? What kind of and how much assistance will I need—how many people or machines or resources will be required? When I start implementing changes to the course, how will the students react, what will their expectations be? These are not questions that faculty are typically experienced in answering or that they expected to confront when they began their academic careers by lecturing, doing research, and serving on committees. Administrators and department heads have a different set of questions. Once the decision to support course transformation is made, they ask: How will we allocate resources, and whom will we support? How will we manage the changes being proposed? How will development activities affect the traditional areas of faculty productivity? How will we monitor the student reaction to these changes? Eventually everyone involved must at some time present a coherent picture of the project and its outcomes to other faculty, upper administration, the board of regents, professional conferences, governmental agencies, legislators, and parents. A good assessment plan can help address most of these questions, and it can certainly aid in communicating results to interested outsiders.

CNI Project: A Unified Approach to Assessment. In a call for statements of interest and experience, the Coalition for Networked Information (CNI) invited institutions of higher education, in December 1996, "to use ongoing assessment techniques to study the uses, impacts, costs, and effectiveness of networks and networked resources and services." One of nine institutions chosen to participate in CNI's Assessing the Academic Networked Environment project, Virginia Tech formed an interdisciplinary team that included representation from the library, computing center, instructional services, and network services.
Work proceeded on assessment of various aspects of network activities and services throughout 1997.

A great deal of uncoordinated and overlapping assessment activity was discovered within Virginia Tech Information Systems. Although committed—and indeed, required—to measure the effectiveness of network use and instructional technology, the people responsible for network infrastructure development, maintenance of communications systems, and the delivery of network-based information services, support, and instruction were not necessarily communicating among themselves. A great benefit of participation in the project was the opportunity to network, share, and coordinate the results of individual assessment efforts. Equally important, it increased understanding of the interdependence in a networked environment.

Purpose of Assessment. As developers and administrators at Virginia Tech, our experience with assessment grew largely out of the ACCESS project, was modified by the group experience with the CNI affiliation, and has continued to evolve with the ongoing assessment of the much larger CIL initiative. The goals for assessment are therefore practical and needs driven. First and foremost, assessment must be useful. It should not involve collecting data for their own sake. Although it uses many of the methods of basic research, assessment is research with an end in mind; in traditional terms, it is applied research.

The two purposes of assessment can best be summarized as feedback and communication. The feedback function (also known as formative evaluation) serves the purpose of informing persons directly involved with the project or intervention about its progress. The central idea is that given timely and useful information, a midcourse correction might be made that could increase the effectiveness and chances of success of the project. To be useful in a rapidly evolving project, feedback data must be quickly collected and analyzed, but by necessity this analysis will be somewhat less rigorous than the final or summative evaluation upon which the final judgment about project outcomes might be made.

Communication is the ultimate rationale for the assessment of projects or interventions. Practically speaking, if results are not communicated to the outside world, then except for the benefits to the immediate participants, the project might as well not have been undertaken in the first place. Above all, did the project meet its goals? For example, increased learning may or may not be a goal, but if it is, how did learning occur, and how was it measured? In any case, what were the outcomes? What are the different parts or aspects of the project, and which were successful? What was the impact? Who was affected by the project and to what extent? Can the project's innovations be transferred or scaled up to more students, to other programs? Can these innovations reduce costs? Are these findings useful in determining whether the project should be continued or expanded? If so, how could it be improved? Moreover, different audiences need different kinds of information. The participants and immediate stakeholders—at Virginia Tech the faculty developers, staff, students, and administrators who form the development and implementation team—will first want the kind of practical, formative evaluation and feedback described in the previous paragraph.
Outsiders, however, will be interested in the long-range, summative outcomes of the project. They either want to apply the findings to their own situations, or they are decision makers who want to know where and how to allocate future resources.

Methodologies. Assessment methods at Virginia Tech are in a constant state of evolution, but in general the emphasis is on a holistic description of the intervention—a snapshot of the status of the project. To achieve this holistic picture, quantitative and qualitative data are combined in a hybrid design and findings are reported in a narrative, descriptive format within which data are embedded as evidence. Most important is the framework for the assessment activities, that is, the questions that need to be answered or the criteria for success of the project or intervention. It is crucial that this framework be reviewed, negotiated, and agreed upon well in advance by all stakeholders. Assessment data include traditional numerical or quantitative measurements (census data, survey results, grades), but they also include a large portion of qualitative data such as material from interviews and observations and also summaries and quotes from various textual sources. The two types of data complement each other but are considerably different and are therefore discussed separately in the following sections.

Qualitative Methods and Data. In Assessing the Academic Networked Environment: Strategies and Options, Charles McClure and Cynthia Lopata (1996) give a succinct definition of qualitative data: "Qualitative data are data that describe, explain, and characterize the subject of investigation using words rather than numbers." They go on to note that qualitative methods are appropriate "where the research problem and the research setting are not well understood" (p. 11). Most course transformations that involve instructional technologies fit this situation. Although faculty and other developers will have certain objectives for their projects, they cannot pretend to know everything that will happen when changes are implemented. Furthermore, they fully expect to make changes along the way. Given this state of change and uncertainty, the usual data-gathering tool, the survey, with its specific questions and limited response criteria, could easily fail to capture major effects. The solution is to use qualitative, ethnographic research methods; that is, talk to the participants and observe them in a natural setting, and above all be good listeners and good observers and be open to the unexpected and the obvious. In this way the participants themselves can raise issues that the planners and evaluators did not anticipate.

Qualitative methods are ideal for finding out, from a holistic point of view, the impact of new technologies or instructional interventions. In addition to observing classes, labs, and other class activities such as field trips, our primary instrument for gathering data was the guided interview (Patton, 1990), in which the interviewer follows a list of questions but also feels free to depart from that script and explore topics that the subject initiates or in which the subject seems to have interest or knowledge.
Other methods we have employed include focus groups, benchmarking, and examination of user activities, through transcripts of on-line discussions, for example. A complete description of qualitative research methods along with a rationale for their use is obviously beyond the scope of this chapter, and the reader is referred to comprehensive works on the subject (Patton, 1990; Denzin and Lincoln, 1994; Miles and Huberman, 1994). Finally, the findings of open-ended interviews help inform the writing of subsequent surveys and point the direction in which to look with more precise and widely accepted quantitative methods.

The primary practical problem in qualitative methods is accumulating more data than can comfortably be organized, analyzed, and interpreted. Open-ended interviews can be lengthy and must be transcribed, checked for accuracy, then coded and studied. The results must be organized, analyzed, summarized, and displayed, or presented. In addition, field observation is time consuming and requires a disciplined researcher to observe, take useful notes or record observations, and then compile the notes into a useful format. Finally, almost any textual material is grist for the qualitative mill, and technology interventions leave an extensive textual trail suitable for analysis: on-line discussions, e-mail interchanges, listservs, threaded discussion groups—all are raw material that can be archived and incorporated into the data set.

For all their difficulties, we have found that qualitative data have a great deal of value and impact in communicating the nature of a project, which was the bottom line of our assessment effort. When embedded in a readable narrative, qualitative data such as quotes from student or faculty interviews make immediate sense and are accessible to everyone, which is often not the case for a complex statistical analysis. We have also found that technology allows the impact and credibility of qualitative data to be extended beyond the normal paper-based report. In the ACCESS project, we videotaped as many of the interviews with students and faculty as possible, which was quite acceptable to almost all of the interview subjects. Because the videotaping and sound recording was done at a high level of quality, clips from these interviews could then be used in a wide variety of applications: as part of traditional videotapes, on CD-ROMs, or as part of multimedia presentations. For example, we included approximately forty of these clips in the on-line version of the ACCESS report (see www.edtech.vt.edu/access). Although the right quote has tremendous impact in a text narrative, it is even more powerful to see and hear the student or professor actually saying those same words.
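The chapter does not name any particular tooling for the transcribe-and-code workflow described in this section. Purely as a minimal sketch, the following Python fragment shows one way coded interview excerpts might be tallied by theme and by course once coding is complete; the course names, theme labels, and quotes are hypothetical and are not drawn from the ACCESS data.

from collections import Counter, defaultdict

# Hypothetical coded excerpts: (course, theme code, quote) -- illustrative only.
coded_excerpts = [
    ("Biology 1005", "asynchronous access", "I could review the lecture slides at midnight."),
    ("Biology 1005", "faculty contact", "E-mail made it easier to ask the professor a question."),
    ("Biology 1006", "asynchronous access", "The class Web site let me catch up after I was sick."),
    ("Biology 1006", "technical frustration", "The modem connection from home was too slow."),
]

# Tally how often each theme appears, overall and by course.
overall = Counter(theme for _, theme, _ in coded_excerpts)
by_course = defaultdict(Counter)
for course, theme, _ in coded_excerpts:
    by_course[course][theme] += 1

print("Overall theme frequencies:", dict(overall))
for course, counts in sorted(by_course.items()):
    print(course, dict(counts))

A simple tally of this kind does not replace interpretation; it only helps the researcher see which themes recur often enough to pursue in the narrative report.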
Quantitative Methods and Data. Our standard method of data collection was the survey, administered to an intact class. Typically, we tried to find out as much as possible about the students' background and their use of computers and networks, including demographics, computer ownership, knowledge of and experience with computer networks, and attitudes toward different ways of learning (cooperative groups, independent study, relative importance of lecture versus lab, and so forth). We also administered a computer attitude scale (Ray and Minch, 1990), usually at the beginning and end of the semester, which gave us an indication of students' anxieties about and alienation from computer technology and how these feelings changed during the course of the semester. Finally, we asked many specific questions about technology in the class: their usage of the class Web site, the impact of the class and technology on their attitudes toward the course subject matter, their opinion on the usefulness of the technology, and so forth.

Most of our analyses used descriptive data as a way of summarizing responses. We were also able to use the data, however, for some interesting correlation studies and tests of significance. For example, many faculty were concerned that students who owned computers or were more favorably disposed toward or adept with computer and network technology might have an unfair advantage in a technology-enhanced class. We were able to report that, to date, there has been no significant correlation between computer ownership, experience, or attitude (that is, computer anxiety and alienation) and course grades. We also found that computer attitude scores in many technology-enhanced courses improved significantly between the beginning and end of the semester.
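The chapter does not say how these statistics were computed. As a minimal sketch of the kind of analysis described above, the following Python fragment runs a paired t-test on pre- and post-semester attitude scores and a Pearson correlation between attitude and course grade; all numbers and variable names are invented for illustration and are not ACCESS results.

from scipy import stats

# Hypothetical scores for one class -- illustrative only.
attitude_pre = [52, 61, 47, 70, 58, 66, 49, 55]    # attitude scale, start of semester
attitude_post = [58, 66, 50, 72, 60, 71, 56, 57]   # same students, end of semester
course_grades = [78, 85, 70, 92, 80, 88, 74, 76]   # final course grades

# Did attitude scores change significantly over the semester? (paired t-test)
t_stat, p_change = stats.ttest_rel(attitude_post, attitude_pre)
print(f"Pre/post attitude change: t = {t_stat:.2f}, p = {p_change:.3f}")

# Is end-of-semester attitude related to course grade? (Pearson correlation)
r, p_corr = stats.pearsonr(attitude_post, course_grades)
print(f"Attitude vs. grade: r = {r:.2f}, p = {p_corr:.3f}")

A paired test is the natural choice for the pre/post comparison because both sets of attitude scores come from the same students; an independent-samples test would ignore that pairing.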
Flashlight Project: A National Link. As useful as these survey data were, we continued to have doubts about whether we were addressing the important questions that were repeatedly asked in conversations, in meetings, and at conferences: Did students learn more as a result of using the technology? Are these technologies cost efficient and better than courses without technology? Fortunately, the Flashlight Project (see www.tltgroup.org) has provided a useful approach and a tool for answering these questions. The developers of this project accept the fact that traditional measures of student outcomes (for example, grades) are not useful or valid indicators for comparing the efficacy of courses with and without technology (Ehrmann and Zuniga, 1997). The reasons are well known: faculty are typically not trained assessors or test writers, they may grade on the curve and change tests and standards from semester to semester, and when they reengineer or transform their courses, they tend to change their tests and assessment methods, thus making comparisons with earlier classes impossible. The Flashlight Project notes that the transformation of courses through technology changes the nature of these courses, the pedagogical approaches used, and eventually the educational goals of the course. A class with technology and a class without it cannot be equivalent, and any comparison between them that tries to assess the impact of technology will be invalid.

Instead, Flashlight Project developers use the findings of educational research as evidence and indirect indicators of increased learning. If certain educational conditions are met or increased as a result of technology—for example, active learning, faculty-student interaction, and rich and prompt feedback—then it is probable that learning outcomes have also improved and technology has indeed enhanced the learning environment. The Flashlight Current Student Inventory uses as its basis a set of acknowledged principles of learning, which are closely allied and overlap with the "seven principles of good practice in undergraduate education," developed by Chickering and Gamson (1987, 1991; for the technology applications of these principles, see Chickering and Ehrmann, 1996). The most important point about this approach is that it is not technology per se but the underlying pedagogy—the instructional design of the course—that makes the crucial difference in the learning environment. If this message alone were to come through in any of our reports and other communications, then we would consider our work to be a success.

Virginia Tech has embraced the Flashlight Project for another reason: as important as the work at Virginia Tech is, it is still an isolated venture. It is always more useful to enlarge one's efforts and be able to compare results to other courses, programs, and institutions. Doing so requires either a controlled experiment (which is both time consuming and logistically and politically difficult) or a standardized method or instrument that is used by many institutions, thus enabling comparisons to be made.

Doing Assessment: Getting Help

If Virginia Tech's case is in any way typical, many other institutions are also facing a crisis of assessment. New technologies, interventions, and programs are proliferating much faster than they can be assessed, given the paucity of available experts. Developers, already pressed for time and unskilled in evaluation, are nevertheless being asked to provide answers to difficult and problematic questions: for example, Do students learn more or more efficiently as a result of the developers' program, and do any added benefits justify the costs? In order for institutions to make sense of these many and varied projects, they must be assessed—but how? One answer is to make use of available instruments, such as the Flashlight Project instrument described previously, and follow the recommendations described in other chapters in this volume. A concurrent approach is to use the expertise that already exists on one's own campus. In most cases this expertise resides in an office of institutional research (IR).

IR personnel typically have a great deal of experience in working with the very entities with which technology developers are concerned: students, faculty, and departments. Similarly, they are experienced in the practical differences between assessment or evaluation and the kind of rigorous basic research in which social scientists engage. As described previously, assessment uses many of the methods of basic research but differs on practical and philosophical issues. However, for the relationship between IR staff and developers to prove beneficial, those involved must recognize several points.

First, the developer must realize that most IR offices are probably already working at or near full capacity. Typically, they must provide a constant stream of data for upper administration, the higher education supervisory board, and legislative committees. However, they are highly skilled in the theory and practice of designing and implementing quantitative studies within the settings of higher education. At a minimum they should be able to provide some consulting services to faculty developers who are embarking on a project.
Second, IR personnel must be approached early in the development process. A typical view of an assessment or evaluation is that it is a one-shot study performed when a project is nearly complete, because it is thought that the effects or outcomes are the project's most important aspect. Unfortunately, by that time a great deal of valuable data has probably been lost, forever. Frequently, it is changes in conditions over time—trend analysis—that provide the most insight about the impact of an intervention. Therefore, the best time to consult with IR is during the planning stages, so that mechanisms can be installed and the right data can be captured when they are available and in a form that will be useful. In addition, the data must be captured consistently and at the proper intervals and then maintained in a healthy state. Furthermore, if conditions within the project change, knowledgeable decisions must be made about which new data to capture, and IR can provide help with making these decisions.

IR can also provide help with planning for the delivery stage, that is, for analysis and reporting of the data that are gathered. Institutional researchers can help developers focus and clarify the ultimate purpose, use, and display of the data and the results of the study. Finally, at the strategic level, institutional researchers can give advice on the ultimate purpose and impact of the study as a whole—who might be interested in the results, the possible impact of the findings, and so forth. IR personnel typically have a great deal of experience in creating useful reports and making them available to the appropriate audience.

However, developers should not expect IR offices to help with all the types of studies described here. Typically, most IR personnel have strong quantitative backgrounds. Although they can offer their perspective on its usefulness, qualitative research may be out of their area of expertise. For example, Virginia's State Council of Higher Education decided only in 1998 to accept some limited qualitative data (student satisfaction responses) from state institutions as part of their periodic reports. In addition, faculty developers should not expect IR personnel to be experts in new technologies or new teaching approaches. A process of education must take place, but with good communication they can usually learn enough, and quickly enough, to help design a useful assessment. Therefore, the important points to be communicated from faculty developers to institutional researchers are not merely the technology and its application but the nature of the pedagogical transformation itself. Most people in or out of higher education implicitly assume that the old models still apply and that knowledge is delivered to the student by lecture and test. It is more likely, however, that new teaching and learning models, new modes of discourse, and asynchronous interaction will form the basis of course or program transformations, and these new models must be the starting point for an assessment design.

References

Chickering, A., and Ehrmann, S. C. "Implementing the Seven Principles: Technology as Lever." AAHE Bulletin, Oct. 1996, pp. 3–6. [www.aahe.org/technology/ehrmann.htm].
Chickering, A., and Gamson, Z. "Seven Principles of Good Practice in Undergraduate Education." AAHE Bulletin, 1987, 39 (7), 3–7.
Chickering, A., and Gamson, Z. (eds.). Applying the Seven Principles for Good Practice in Undergraduate Education. New Directions for Teaching and Learning, no. 47. San Francisco: Jossey-Bass, 1991.
Denzin, N. K., and Lincoln, Y. S. (eds.). Handbook of Qualitative Research. Thousand Oaks, Calif.: Sage, 1994.
Ehrmann, S. C., and Zuniga, R. E. The Flashlight Evaluation Handbook. Washington, D.C.: American Association for Higher Education, 1997.
Gregorian, V., Hawkins, B. L., and Taylor, M. "Integrating Information Technologies: A Research University Perspective." CAUSE/EFFECT, Winter 1992, pp. 2–12.
McClure, C. R., and Lopata, C. L. Assessing the Academic Networked Environment: Strategies and Options. Washington, D.C.: Coalition for Networked Information, 1996. [www.cni.org/projects/assessing].
Miles, M. B., and Huberman, A. M. Qualitative Data Analysis: An Expanded Sourcebook. (2nd ed.) Thousand Oaks, Calif.: Sage, 1994.
Patton, M. Q. Qualitative Evaluation and Research Methods. (2nd ed.) Thousand Oaks, Calif.: Sage, 1990.
Ray, N. M., and Minch, R. P. "Computer Anxiety and Alienation: Toward a Definitive and Parsimonious Measure." Human Factors, 1990, 32 (4), 477–491.
State Council of Higher Education for Virginia. Presentation on Preliminary Restructuring Progress. [www.schev.edu/wumedia/cn95.html]. June 1995.
Sykes, C. J. The Hollow Men: Politics and Corruption in Higher Education. Washington, D.C.: Regnery, 1990.
Taylor, C. D., Roy, L., and Moore, J. F. ACCESS: Asynchronous Communication Courses to Enable Student Success. A report submitted to the Alfred P. Sloan Foundation by Educational Technologies/Instructional Services, Virginia Polytechnic Institute and State University, July 1997. [www.edtech.vt.edu/access].
Twigg, C. "The Need for a National Learning Infrastructure." Educom Review, Sept.–Oct. 1994, pp. 16–20. [www.educause.edu/pub/er/review/reviewarticles/29516.html].
University Task Force on the Impact of Digital Technology on the Classroom Environment. Report of the University Task Force on the Impact of Digital Technology on the Classroom Environment. Roanoke: Virginia Polytechnic Institute and State University, 1989.
Virginia Commission on the University of the Twenty-First Century. The Case for Change. Richmond: Commonwealth of Virginia, 1990.
Virginia Polytechnic Institute and State University. Phase II: Instructional Development Initiative. Blacksburg: Virginia Polytechnic Institute and State University, May 1995.
Walzer, P. "Professors Not Often in Class." Roanoke Times and World News, Sept. 12–14, 1993, p. 1.

C. DAVID TAYLOR is director of educational assessment and technology at the University of Texas Health Science Center at Houston, Dental Branch, and adjunct associate professor in the Department of Dental Public Health and Dental Hygiene. Previously he was assessment project manager for instructional innovations at Virginia Polytechnic Institute and State University.

JOANNE D. EUSTIS is director of the university library at Case Western Reserve University. She formerly held a variety of positions spanning sixteen years at Virginia Polytechnic Institute and State University.