Assessing Learning in Complex Domains

I believe the apparent lack of progress in reaping the benefits of research on
learning and instruction is a result of a failure to deal effectively with learning
and instruction as educational systems.[2] It is the inability to conceptualize
education as involving complex and dynamic systems that inhibits progress.
In this section, I provide an overview of research related to learning in and
about complex systems and focus in particular on an assessment methodology
that is pertinent to evidence-based decision making in learning and instruction.
A system can be roughly defined as a collection of related components with
inputs and outputs. Representations of systems typically restrict the boundaries
of what will be considered as an integral part of the system. Decisions about
what is integral to a system are often arbitrary.

[2] The work reported in this section began at the University of Bergen and was continued at
Syracuse University as part of a National Science Foundation project entitled "The DEEP
Methodology for Assessing Learning in Complex Domains" (Spector & Koszalka, 2004).

6 J.M. Spector

With regard to educational
systems, for example, one might decide that certain aspects of home life
(e.g., the presence of two parents, parental reading at home, etc.) and extracurricular
activities (e.g., participation in school clubs and sports) form an
integral part of the system, whereas other aspects of home life (e.g., diet) and
extracurricular activities (e.g., dating) are not represented as integral aspects.
This discussion is not about what properly constitutes an educational system.
Rather, the focus is on the notion of an educational system, and particularly on
instructional and learning systems within an educational system.
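The boundary decisions described above can be made concrete in a small sketch. This is purely illustrative and not part of the chapter's research; all class, field, and component names here are hypothetical. It models a system as components with inputs and outputs, plus an explicit flag recording the (often arbitrary) decision about what counts as integral:

```python
from dataclasses import dataclass, field

# Illustrative sketch only: the text defines a system loosely as related
# components with inputs and outputs, where the boundary (what counts as
# "integral") is an analyst's decision. All names here are hypothetical.

@dataclass
class Component:
    name: str
    inputs: list = field(default_factory=list)   # factors this component depends on
    outputs: list = field(default_factory=list)  # factors this component produces

@dataclass
class SystemModel:
    components: dict = field(default_factory=dict)

    def add(self, component, integral=True):
        # The 'integral' flag records the boundary decision explicitly.
        self.components[component.name] = (component, integral)

    def boundary(self):
        # Components treated as inside the system boundary.
        return [name for name, (_, integral) in self.components.items() if integral]

model = SystemModel()
model.add(Component("classroom instruction", inputs=["curriculum"], outputs=["graded work"]))
model.add(Component("parental reading at home", outputs=["reading practice"]))
model.add(Component("diet"), integral=False)  # modeled, but excluded by this boundary choice

print(model.boundary())  # diet is represented yet left outside the boundary
```

The point of the flag is that "diet" is still represented in the model even though this particular analyst chose not to treat it as integral, mirroring the home-life example in the text.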
Systems-based approaches to learning and instruction have been around for
more than 50 years and have recently been integrated into assessment and
evaluation (Seel, 2003; Spector, 2004). Findings from systems-oriented research
on learning and instruction in complex, ill-structured problem domains suggest
that learners often fail to comprehend the nature of a system – how various
factors are interrelated and how a change in one part of the system can
dramatically affect another part of the system (Dörner, 1996; Spector & Anderson,
2000). A number of instructional approaches already mentioned address this
deficiency directly, including problem-centered learning (Merrill, 2002) and
variations such as model-centered learning (Seel, 2003) and model-facilitated
learning (Milrad et al., 2003).
A common theme in systems-based approaches is the notion that the full
complexity of a problem situation should eventually be presented to the learner,
and that helping the learner manage that complexity by gradually introducing
additional problem factors can contribute to effective learning. Challenging
problems typically involve a complex system, and instruction should be aimed
not only at a specific facet of the problem but also at the larger system so as to
help learners locate problems in their naturally larger contexts; this has been
called a holistic approach (Spector & Anderson, 2000) or a whole-task
approach (van Merriënboer, 1997). Methods to facilitate understanding in
such complex contexts include presenting multiple representations of problem
situations (Spiro, Feltovich, Jacobson, & Coulson, 1991), interactions with
simulations of complex systems (Milrad et al., 2003), and partially worked
examples (van Merriënboer, 1997). Learner-constructed problem representations
of the type to be described below have implications for instruction as well
as for assessment.
The integration of technology in teaching and learning can be closely linked
to systems-based approaches making use of such technologies as powerful and
affordable computers, broadband networks, wireless technologies, more
powerful and accessible software systems, distributed learning environments,
and so on. Educational technologies provide many valuable affordances for
problem-centered instructional approaches. The learning technology paradigm
has appropriately shifted from structured learning from computers to one
better characterized as learning linked with instructional uses of technology,
or learning with computers (Lowyck & Elen, 2004). The emphasis is on
(a) viewing technology as an ongoing part of change and innovation and
(b) using technology to support higher-order learning in more complex and less
well-defined domains (Jonassen, 2006; Spector & Anderson, 2000). The latter is
a concern for many educational researchers (see, for example, Project Zero at
Harvard University; http://pzweb.harvard.edu/).

Adventures and Advances in Instructional Design Theory and Practice 7

Learning environments and instructional systems are properly viewed as
parts of larger systems rather than as isolated places where learning might
occur. Moreover, learning takes place in more dynamic ways than was true in
the teacher-led paradigm of earlier generations. Many more learning activities
are made possible by technology, and this further complicates instructional
design – that is, determining which learning activities promote improved
understanding, and how, when, and why they do so. Lessons learned in previous
generations of educational technology should be taken into account. For
example, simply putting sophisticated technologies into a learning environment
is not likely to be either efficient or effective. Previous studies have focused on
the effects of a particular technology on attitudes, motivation, and simple
knowledge tests. Such studies perpetuate a wrongheaded debate about the
educational efficacy of media (Clark, 1994; Kozma, 1994). What should be
studied is the impact on learning in terms of improvements in student inquiry
processes and other higher-order aspects of learning, directly relevant to understanding
challenging and complex subject matter (Lowyck et al., 2003; Spector &
Anderson, 2000).
To demonstrate that specific instructional approaches and educational
technologies are effective in improving complex problem-solving skills, a methodology
to determine higher-order learning outcomes appropriate for such
problems is required. A pilot test of such a methodology was demonstrated
and discussed at the 2000 International System Dynamics Conference in Bergen,
Norway (Christensen, Spector, Sioutine, & McCormack, 2000). A similar methodology
developed in Germany has shown promise (Seel, Al-Diban, &
Blumschein, 2000). General findings of a 1-year National Science Foundation
(NSF) study involving this modeling assessment methodology are discussed next.
The NSF project entitled "The DEEP Methodology for Assessing Learning
in Complex Domains" (see Spector & Koszalka, 2004, for detailed findings; only
high-level summaries are reported here) examined the use of annotated problem
representations to determine relative levels of expertise in biology, engineering,
and medicine. Complementary studies with similar results have been reported in
the literature (Seel et al., 2000; Stoyanova & Kommers, 2002; Taricani &
Clariana, 2006). The DEEP study involved the selection of two representative
problem scenarios for each of three complex problem-solving domains (biology,
engineering, and medicine). Subjects included both expert and non-expert
respondents; they were provided with a problem scenario and asked to indicate
what they thought would be relevant to a solution. Subjects were asked to
document these items, providing a short description of each item along with a
brief explanation of how and why it was relevant. Subjects were asked to indicate
and document assumptions about the problem situation that they were making
(initially and again at the end of the activity). Subjects were asked to develop the
representation of a solution approach – but not a solution. Required parts of this
representation included (a) key facts and factors influencing the problem situation;
(b) documentation of each factor – for example, how it influences the
problem; (c) a graphical representation of the problem situation that linked key
factors (see http://deep.lsi.fsu.edu/DMVS/jsp/index.htm for online access to the
DEEP tool); (d) annotations on the graphical representation (descriptions of
each link and each factor); (e) a solution approach based on the representation
already provided, including additional information that would be required to
fully specify a solution; and (f) an indication of other possible solution
approaches (very few of the respondents provided this last item).
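The required parts (a) through (f) amount to a structured, annotated representation rather than free-form notes. The following sketch shows one plausible data structure for such a representation; it is a hypothetical illustration, not the actual schema of the DEEP tool, and the scenario, class, and field names are invented for the example:

```python
# Hypothetical sketch of an annotated problem representation of the kind the
# DEEP study asked subjects to produce: documented factors, annotated links
# between them, recorded assumptions, and a solution approach (not a solution).
# Names are illustrative, not the DEEP tool's actual schema.

class ProblemRepresentation:
    def __init__(self, scenario):
        self.scenario = scenario
        self.factors = {}        # (a)/(b): factor name -> how/why it is relevant
        self.links = []          # (c)/(d): (source, target, annotation) triples
        self.assumptions = []    # assumptions, recorded initially and at the end
        self.solution_approach = None  # (e): approach plus info still required

    def add_factor(self, name, documentation):
        self.factors[name] = documentation

    def link(self, source, target, annotation):
        # Both endpoints must already be documented factors,
        # so every link in the graph is annotated on both ends.
        if source not in self.factors or target not in self.factors:
            raise ValueError("link endpoints must be documented factors")
        self.links.append((source, target, annotation))

rep = ProblemRepresentation("declining fish population in a lake")
rep.add_factor("nutrient runoff", "fertilizer inflow raises algae growth")
rep.add_factor("oxygen level", "algae decay consumes dissolved oxygen")
rep.link("nutrient runoff", "oxygen level", "more runoff eventually lowers oxygen")
rep.assumptions.append("the lake is a closed system over the study period")

print(len(rep.factors), len(rep.links))  # 2 documented factors, 1 annotated link
```

Comparing such representations from experts and non-experts – which factors are present, how they are linked, and how the links are annotated – is what allows relative levels of expertise to be inferred.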
