Partnering to Advance Learning in a Technology-Enhanced World

By Susan Rundell Singer

Emerging from the heightened attention to higher
education access and quality is the potential to
substantively improve the learning experience for all
students—an opportunity that can be leveraged by
technology and actualized through new and creative
partnerships. Advances in connectivity and software
engineering are offering up sophisticated learning platforms while
research on learning is providing new insights into how people learn.
Bringing these two often disparate worlds together can inform online
learning, and research into online learning can inform learning in
all environments.1 Achieving this knowledge integration requires
collaboration among learning scientists, instructional technologists,
faculty members, industry leaders, software engineers, data scientists,
data privacy experts, and learners. For coherence, this article draws on
natural science and engineering contexts, acknowledging that some
aspects of learning are discipline-specific.
EDUCAUSE Review, March/April 2015
A rapidly changing understanding
of when and where learning is occurring creates unprecedented opportunity to realize Lee Shulman’s vision of
“teaching as community property.”2 A
continuum of face-to-face to online
learning environments is emerging,
with increasingly creative approaches
to laboratory learning in massive open
online courses (MOOCs). David Cox, a
Harvard neuroscientist, has included do-it-yourself laboratories in his HarvardX
course “Fundamentals of Neuroscience.”
With a cockroach, a mobile phone, and
an open-source bioamplifier called
a SpikerBox, students can sit at their
kitchen table or in their home office and
listen to neuron activity in a cockroach
leg.3 This example flows into another
learning environment continuum: from
formal to informal learning. Reaching
from secondary school students to
interested retirees and from academic
civic engagement in formal classrooms
to citizen science, large data sets are
being created and mined by atypical
collaborators. An example is Foldit
(http://fold.it/portal/), an online game
that crowdsources solutions to difficult
three-dimensional protein structures,
adding human problem-solving and
pattern-recognition to computational
solutions. Foldit players have solved or
refined a number of protein structures,
including one that may lead to improved
designs for antiretroviral drugs to treat
diseases like HIV.4 Large genomics data
sets are also leveraged in online and
blended learning environments.5 Learners are thus no longer constrained by
bricks-and-mortar classrooms, and they
may be collaborating with other learners
halfway around the globe. MOOC participants learn from each other as well
as from the instructor, and some form
local meet-up groups. Learners in the
City University of New York’s Institute
for Virtual Enterprise (http://www.ive
.cuny.edu/us/) gain international entrepreneurial skills as they participate in a
virtual economy that includes 40 global
partners and 80 postsecondary institutions in the United States.
Building environments that afford
new, high-impact learning opportunities
for tomorrow’s citizens and workforce
requires collaborative expertise. An
understanding of the cognitive foundations of learning is necessary, along with
intra- and interpersonal skills including
motivation and teamwork. This knowledge can then be embedded in emerging
contexts and tools for the learning of
additional concepts and skills.
Learning from the Science of Learning
Much that is known about how people
learn has been synthesized in the
books How People Learn and How
Learning Works.6 In brief,
learning depends on prior
knowledge. Motivation
determines what is
learned. How learners
organize knowledge
affects both learning and the application of what is
learned. The quality of learning is
enhanced by goal-directed practice
and specific feedback. The climate
of the learning environment—intellectual,
social, and emotional—has
a significant impact on
students’ perceptions and outcomes. Successful, lifelong learners are able to monitor their learning
and adjust their approaches to learning.
In terms of research on undergraduate
learning in online environments, much
of the focus, especially of meta-analyses,
has been on the comparative effectiveness of traditional courses, completely
online courses, and blended courses that combine face-to-face and online learning.7
A breadth of disciplines has contributed, and continues to contribute, to
the understanding of how people learn.
Education research draws on a range
of fields and has yet to become fully
theory-driven and theory-generating.
There is opportunity for a greater coherence and integration of knowledge
in the science of learning. Cognitive
psychology investigates perception,
attention, memory, knowledge representation, mental imagery, language,
problem-solving, reasoning, and
decision-making, all in controlled laboratory settings. Cognitive science, an
interdisciplinary field, brings together
artificial intelligence, psychology,
philosophy, linguistics, anthropology, and neuroscience and focuses on
understanding the mind, often through
representation and computation. The
learning sciences emerged in the 1990s
as an interdisciplinary field looking at
“cognition in the wild”—for example,
examining what learning looks like in
a classroom rather than in a controlled
laboratory setting using interventions
aimed at specific outcomes. From within
disciplinary departments, discipline-based education research (DBER)
developed with aims quite similar to
those of the learning sciences but deeply
situated in the needs and the priorities
of the specific discipline, primarily at
the undergraduate level. Advances in
the neurosciences are also providing
insight into learning. There is growing
recognition that collaboration among
all of these fields, as well as across other
relevant social sciences, is imperative to
reaching a deeper, richer understanding
of human learning.
Even though gaps remain, what
has emerged so far from the collective
efforts to understand thinking and
learning can inform the design of learning environments, including online
environments. Cognitive psychology
research has provided a solid foundation to guide instruction. A key goal
of the undergraduate years is to develop
expertise in a major; the student
shifts from participating in learning science to becoming a scientist. K. Anders
Ericsson and his colleagues have shown
that expertise is not transferable from
one domain to another and that reasoning and problem-solving in a specific
domain depend on knowledge and
content.8 That’s a challenge if a learner
is in the third year of study as an economics major and becomes interested
in switching to physics. Developing
expertise requires deliberate practice
with feedback over extended periods of
time. If the activity does not sufficiently
challenge the learner, or is too difficult,
no learning will occur, an important
consideration in instructional design.
Expert thinking and reasoning is
quite different from novice thinking
and reasoning. Curiously, experts have
a “blind spot” and cannot remember
reasoning as a novice. Numerous studies reveal that knowledge becomes automated and that experts unintentionally
omit up to 70 percent of the relevant
information when describing a task.9
Experts may be puzzled when a novice
is stumped by something they take as
obvious—a situation that, unrecognized,
can impede teaching and learning. Peer
and near-peer mentors can help, as
can cognitive task analysis, a research
methodology that can unpack experts’
knowledge and use it to more effectively
structure learning. For example, the
technology behind the Carnegie Mellon
Genetics Cognitive Tutor (http://www
.cs.cmu.edu/~genetics/) was informed
by cognitive task analysis of genetics
problem-solving. When the tutor was
implemented at twelve colleges and
universities, statistically significant
learning gains in reasoning skills were
documented. Cognitive task analysis is
time-intensive, however, and the potential to accelerate this work and improve
learning environments by analyzing
learner data in online environments is
tantalizing.10
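Tutors of this kind typically maintain a running estimate of each learner's mastery of each skill; Bayesian knowledge tracing is the standard model in the cognitive-tutor literature. The sketch below is illustrative only — the function and parameter values are hypothetical, not drawn from the Carnegie Mellon Genetics Cognitive Tutor:

```python
def bkt_update(p_mastery, correct, p_slip=0.1, p_guess=0.2, p_learn=0.3):
    """One Bayesian knowledge tracing step: revise the probability that
    the learner has mastered a skill after one observed answer."""
    if correct:
        joint = p_mastery * (1 - p_slip)           # knew it and didn't slip
        p_obs = joint + (1 - p_mastery) * p_guess  # ...or guessed correctly
    else:
        joint = p_mastery * p_slip                 # knew it but slipped
        p_obs = joint + (1 - p_mastery) * (1 - p_guess)
    posterior = joint / p_obs
    # The learner may also acquire the skill on this practice opportunity.
    return posterior + (1 - posterior) * p_learn

# Mostly correct answers drive the mastery estimate upward from a weak prior.
p = 0.1
for answer in (True, True, False, True, True):
    p = bkt_update(p, answer)
```

A tutor would track one such estimate per skill and use it to select the next problem or decide when a skill is mastered.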
A number of instructional design
principles have been generated from
cognitive psychology research findings.11
For example, frequent testing provides
practice in retrieving knowledge from
memory, which enhances long-term
retention. Spaced practice, rather than
cramming before a final exam, supports
enduring understanding. Elaboration
involves extracting key ideas and making one’s own mental model; a learner
might connect a series of concepts by
drawing a concept map to elaborate an
understanding of the interconnections.
However, catering to the “learning style”
of the learner has not been substantiated
by research. For instance, although learners learn more from words and graphics
than from words alone, extraneous visual
detail, such as showing an elaborate
photo of a skier on a slope in an inclined
plane physics problem, can distract and
impede learning. On the other hand,
placing relevant text close to the graphic
in space or time enhances learning.
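The retrieval-practice and spacing principles above are straightforward to operationalize in a learning platform. The following minimal sketch implements a Leitner-style scheduler; the box intervals and function names are assumptions for illustration, not values drawn from the cited research:

```python
from datetime import date, timedelta

# Days until the next review for each Leitner box. These intervals are
# illustrative placeholders, not research-derived values.
INTERVALS = {1: 1, 2: 3, 3: 7, 4: 14, 5: 30}

def schedule(box, answered_correctly, today):
    """Expanding retrieval practice: a correct recall promotes the item to
    a longer review interval; a lapse sends it back to daily review."""
    box = min(box + 1, 5) if answered_correctly else 1
    return box, today + timedelta(days=INTERVALS[box])

box, due = schedule(1, True, date(2015, 3, 1))   # promoted to box 2
box, due = schedule(box, False, due)             # lapse: back to box 1
```

Each successful recall spaces the next test further out, giving the learner goal-directed practice with feedback distributed over time rather than massed before an exam.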
The list of instructional design principles continues, and deciding what to
use when can be a challenge. Kenneth
Koedinger and his colleagues have
proposed a framework that takes into
account the type of knowledge to be
acquired, the learning that must occur,
and the implications for instructional
design.12 For example, as noted earlier,
frequent testing may support long-term
retention, but it may be less effective in
enabling a learner to later transfer understanding. A second example of aligning
instructional design with the intended
learning goal concerns “flipped” classrooms. Viewing lecture videos online
before encountering and solving physics
problems in class is an intuitively appealing way to improve instruction. Research
by Daniel Schwartz and his colleagues,
however, indicates that students are more
likely to understand the deep structure
of physics and to transfer their learning
to new situations if they first encounter
a problem followed later by an explanation.13 On the other hand, being told how
to solve the problem before encountering
it may have benefits in terms of procedural knowledge. Clearly, instructional
design principles informed by learning science can improve technology-enhanced learning environments, but
they can also be refined and improved
on through research on the learning data
generated in those environments.
From the Science of Learning to Undergraduate Learning
Extending the findings of cognitive
psychology from controlled laboratory
settings to technology-enhanced
or bricks-and-mortar learning environments calls out for partnerships.
Embodied cognition—the concept
that cognition is shaped by the body—
emerged from cognitive science and is
providing rich territory for partnerships
among psychologists, neuroscientists,
and DBER scholars. In one example
focused on undergraduate physics
learners performing angular momentum problems, those who had physically
experienced angular momentum by
tilting an extended axle of a spinning
bicycle wheel outperformed those who
had only observed angular momentum.
Comparisons of fMRI (functional magnetic resonance imaging) data for the two
groups indicate that different regions
of the brain are activated while solving
the problems.14 The benefit of physical
interactivity can inform how to design
hybrid learning environments and how
to balance the use of effective simulations with hands-on activities.
With an eye to research to improve
undergraduate learning, DBER is catalyzing partnerships. Focusing on science and
engineering, a National Research Council
report defined the scope of DBER, noting
that its long-term goals are to
- understand how people learn the concepts, practices, and ways of thinking of science and engineering;
- understand the nature and development of expertise in a discipline;
- help to identify and measure appropriate learning objectives and instructional approaches that advance students toward those objectives;
- contribute to the knowledge base in a way that can guide the translation of DBER findings to classroom practice; and
- identify approaches to make science and engineering education broad and inclusive.15
The report synthesized the knowledge of undergraduate learning across
the sciences and engineering, finding
a solid evidence base for improving
students’ conceptual understanding,
problem-solving abilities, and use of representations and listing effective instructional strategies that can and ought to be
implemented now. Numerous concept
inventories have been developed to test
conceptual understanding, and some
strategies to alter scientifically inaccurate
conceptions have been vetted. Conceptual change is particularly challenging
for concepts involving either very small
or very large spatial or temporal scales.
Unlike experts, novices are distracted by
superficial details when solving problems and often try to work backward in
finding a solution. Open-ended problems and peer-mediated learning help.
Representations are often disciplinary
shorthand for complex ideas, accessible
to the expert but not to a novice. Multiple
representations can be effectively used
to unpack meaning for learners. A range
of instructional strategies that actively
engage learners can be used to complement or replace the traditional lecture for
improved learning outcomes. A recent
meta-analysis of research on these strategies revealed that engaged-learner
strategies also increased retention and
persistence.16
Gaps in Understanding Undergraduate Learning
To date, much of the research on undergraduate learning has been classroom-based, and only a limited number of
studies have disaggregated data to
understand similarities and differences
among different groups of students.
Knowing more about different learners
could lead to improvements in college
persistence and success, given that fewer
than 40 percent of all students who begin
college intending to major in a science,
technology, engineering, or mathematics field complete a degree in that field.
The completion rate is even lower for
members of traditionally underrepresented groups in these fields. Further,
the four-year degree graduation rate for
students with top-scoring SAT or SAT-equivalent scores from the top quartile of
socioeconomic status (SES) is 82 percent,
contrasted with a graduation rate of 44
percent for students with top-scoring
SAT or SAT-equivalent scores from the
lowest SES quartile.17 The need for disaggregated data presents both a caution and
an opportunity for bringing learning to
scale through online environments. That
is, a learning intervention that worked
with one group of learners may need to
be adapted, rather than scaled with fidelity, to a broader group of learners. Online
learning environments afford a research
opportunity to study implementation
across different groups of students. They
also offer a chance to identify what works
across different groups and different
institutions of higher education.
Technology-enhanced learning has
the potential to provide insights into how
learning occurs over time, over multiple
courses, and across disciplines. Longitudinal studies are few, even though the development of expertise occurs over longer time
periods. As DBER arose within individual
disciplines, research on learning and
on the application of learning across
disciplines has been limited. Understanding and improving the ability of teams
to work effectively across disciplines is
imperative at a time when meeting global
challenges requires the best collaborative
innovation of people from diverse cultures with diverse experiences.18
Teamwork is just one of the
intrapersonal and interpersonal
competencies that are necessary for learning but that have
received less attention at the
undergraduate level and are
difficult to measure.19 Motivation, persistence, metacognition (reflecting on
one’s own learning), work
ethic, communication, and
collaboration are examples of
other competencies that are critical in
determining a learner’s actions and
success and that have received less
research attention at the undergraduate level. As technology-enhanced
learning goes global, cultural aspects
of motivation and other noncognitive
aspects of learning gain increasing
importance, and as a result, psychometricians, social psychologists, sociologists, and
cultural anthropologists become valued
partners in understanding and assessing these competencies.
Opportunities also exist to integrate
knowledge across different research
communities, including higher education research, institutional research,
disciplinary research, information
technology research, and research on
student support services and advising. Predictive analytics approaches
are leveraging student records and
intervention data to provide timely support to students to increase retention
and success. The Predictive Analytics Reporting (PAR) Framework uses
multi-institutional, de-identified data
to find patterns to predict student success.20 PAR has partnered with Starfish Retention Solutions (http://www
.starfishsolutions.com/), a platform
that optimizes timely access to college
services with analytical components
that aid institutions in making decisions
about investments in various student
support services.
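As a toy illustration of this kind of predictive analytics, the sketch below scores a retention probability with a logistic model and flags low-probability students for outreach. The feature names, weights, and threshold are entirely hypothetical; PAR's actual models are not reproduced here:

```python
import math

# Hypothetical feature weights, for illustration only. A framework such as
# PAR would fit its models from multi-institutional, de-identified records.
WEIGHTS = {"gpa": 0.9, "credits_attempted": 0.05, "lms_logins_per_week": 0.12}
BIAS = -3.0

def retention_risk(student):
    """Score a student's retention probability with a logistic model and
    flag low-probability students for a timely advising intervention."""
    z = BIAS + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)
    p_retained = 1.0 / (1.0 + math.exp(-z))
    return p_retained, p_retained < 0.5  # (probability, flag for outreach)

p_ret, needs_outreach = retention_risk(
    {"gpa": 2.1, "credits_attempted": 12, "lms_logins_per_week": 1}
)
```

The value of such a model lies less in the scoring itself than in routing the flag to an adviser while there is still time to intervene.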
Technology can help advance the
research on learning, but it is also driving changes in research methodologies.
Research and improvement approaches
that are continuous and nonlinear are
possible in digital environments and
may advance our understanding of
learning more rapidly than the current
“gold standard”: the randomized control
trial.21 A/B testing—in which two alternative approaches are simultaneously
used in an online environment—allows
for rapid, iterative improvements based
on the generated evidence. How cost-effective and efficient A/B testing will be
as the research questions and interventions increase in complexity is an open
question. But it is clear that the growth
of big data is driving the development
of the field of data science and learning
analytics, again pushing on the need for
collaborations and partnerships. In addition, the dramatically shifting terrain
of education research in a digital world
calls for new frameworks and expertise
in data privacy and protection.22
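The A/B comparisons described above can be evaluated with standard statistics; for example, a two-proportion z-test comparing completion rates under two instructional variants. The counts below are made up for illustration:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z statistic comparing the success rates of variants A and B,
    using the pooled standard error under the null hypothesis."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Made-up completion counts for two instructional variants of a course unit.
z = two_proportion_z(successes_a=420, n_a=1000, successes_b=465, n_b=1000)
significant = abs(z) > 1.96  # conventional 5% two-sided threshold
```

In a continuous-improvement setting, a test like this would run on each iteration of the design, with the winning variant becoming the next baseline.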
Partnerships for Learning in a Digital World
Integrating the science of learning into technology-enhanced learning environments and developing a
research agenda and methodologies
to iteratively discover more about
learning in these environments
depend on the multifaceted partnerships alluded to above. The challenge
of widely incorporating evidence-based
practices, drawn from research on learning, into teaching practices is far more
universal than the challenge of integrating them into technology-enhanced
learning environments.23 Evidence of
effectiveness alone has not been sufficient to change practice, and a number of
efforts are under way to change the culture of teaching and learning in higher
education. These efforts include the
Association of American Universities’
Undergraduate STEM Education Initiative (https://stemedhub.org/groups/
aau) and the National Science Foundation’s Improving Undergraduate STEM
Education (IUSE, http://www.nsf.gov/
funding/pgm_summ.jsp?pims_id=505082)
investments. An intriguing example is “An Introduction to
Evidence-Based Undergraduate STEM
Teaching,” a seven-week professional
development MOOC designed to engage
and support graduate students, postdoctoral fellows, and faculty in the use of
evidence-based practices (https://www
.coursera.org/course/stemteaching).
In addition to growing partnerships
that will bring together expertise in the
science of learning and teaching, information technology, software development,
and data privacy, new areas of expertise
are emerging and areas of expertise from
several fields are converging, creating a
new generation of data scientists. Graduate students, envisioning careers as learning engineers, are building robust online
learning environments with a research
agenda aimed at improved learning, joining instructional designers and academic
technologists. Within the sciences, mathematics, and engineering, departments
are hiring DBER scholars. These individuals are physicists, chemists, biologists, geoscientists, mathematicians, and
engineers with research expertise in how
students learn within the discipline. Professional preparation in these emerging
fields spans disciplines and will benefit
from the growing body of work in the
science of team science.
Partnerships to support technology-enhanced education extend beyond the
walls of the campus as well. Working collaboratively with industry partners can
hone the curriculum to prepare students
for the workplace. The National Science
Foundation’s Advanced Technological Education program (http://www.ate
centers.org/) supports the development
of technicians; all projects begin with a
robust industry-academic partnership.
The National Convergence Technology
Center (http://www.connectedtech.org/)
prepares information technologists with
a curriculum that maintains its currency
through quarterly meetings of the Business and Industry Leadership Team.
The Nanotechnology Applications and
Career Knowledge (NACK) Network
(http://nano4me.org/) at Penn State
University provides community college
students across the country with remote
access and control to sophisticated equipment—including field emission scanning electron microscopes and energy
(X-ray) dispersive spectroscopy—that
they will use in the workplace. The
National Center for Welding Education
and Training (https://www.weld-ed.org/)
partners with major manufacturers of
welding equipment and supplies to be
sure that its students have access to the
latest welding technology, including
competency-based assessment through
simulated welding experiences. The
Automotive Manufacturing Technical
Education Collaborative (http://www
.autoworkforce.org/) supports students
in 12 states through partnerships with
37 educational partners and 23 automotive industry partners through a
range of blended learning models that
include working with assembly robots.
All of these examples are offered not only because of the scale of the
partnerships but also because they illustrate the possibilities of
technology-enhanced education.
Networked, remote access to sophisticated equipment offers value in many
learning contexts, not only in technical
education. As with access to large-scale
data sets, like genomics data, creating
platforms and opportunities for diverse
learners in diverse learning environments increases access and opportunity.
As technology advances at warp
speed, as research on learning leaps
forward in different quarters, and as the
demographics of postsecondary learners shift, collaboration will become
increasingly important. Reaching across
boundaries and developing the ability
to fully engage experts in other fields
may be the most significant challenge we
face in advancing technology-enhanced
education.
Notes
  1. Susan R. Singer and William B. Bonvillian, “The
Online Challenge to Higher Education,” Issues
in Science and Technology 29, no. 4 (Summer 2013),
http://issues.org/29-4/the-online-challenge-to-higher-education/.
  2. Lee S. Shulman, “Teaching as Community
Property: Putting an End to Pedagogical
Solitude,” Change 25, no. 6 (November/December
1993).
  3. “Fundamentals of Neuroscience, Part I,”
https://www.edx.org/course/fundamentals-neuroscience-part-i-harvardx-mcb80-1x; Timothy C. Marzullo and
Gregory J. Gage, “The SpikerBox: A Low Cost,
Open-Source BioAmplifier for Increasing Public
Participation in Neuroscience Inquiry,” PLoS
One 7, no. 3 (2012), http://www.ncbi.nlm.nih.gov/
pmc/articles/PMC3310049/.
  4. Firas Khatib, Frank DiMaio, et al., “Crystal
Structure of a Monomeric Retroviral Protease
Solved by Protein Folding Game Players,” Nature
Structural & Molecular Biology 18, no. 10 (October
2011).
  5. For an example, see “Teaching Big Science at
Small Colleges: A Genomics Collaboration”:
http://serc.carleton.edu/genomics/index.html.
  6. John D. Bransford, Ann L. Brown, and Rodney
R. Cocking, eds., How People Learn: Brain, Mind,
Experience, and School, expanded ed. (Washington,
D.C.: National Academy Press, 2000); Susan A.
Ambrose, Michael W. Bridges, Michele DiPietro,
Marsha C. Lovett, and Marie K. Norman, How
Learning Works: Seven Research-Based Principles for
Smart Teaching (San Francisco: Jossey-Bass, 2010).
  7. Barbara Means, Marianne Bakia, and Robert
Murphy, Learning Online: What Research Tells
Us about Whether, When and How (New York:
Routledge, 2014).
  8. Paul J. Feltovich, Michael J. Prietula, and K.
Anders Ericsson, “Studies of Expertise from
Psychological Perspectives,” in K. Anders
Ericsson, Neil Charness, Paul J. Feltovich, and
Robert R. Hoffman, eds., The Cambridge Handbook
of Expertise and Expert Performance (Cambridge:
Cambridge University Press, 2006).
  9. M. E. Sullivan, K. A. Yates, K. Inaba, L. Lam,
and R. E. Clark, “The Use of Cognitive Task
Analysis to Reveal the Instructional Limitations
of Experts in the Teaching of Procedural Skills,”
Academic Medicine 89, no. 5 (May 2014).
10. Albert Corbett, Linda Kauffman, Ben MacLaren,
Angela Wagner, and Elizabeth Jones, “A
Cognitive Tutor for Genetics Problem Solving:
Learning Gains and Student Modeling,” Journal
of Educational Computing Research 42, no. 2
(2010); Kenneth R. Koedinger, Elizabeth A.
McLaughlin, and John C. Stamper, “Data-driven
Learner Modeling to Understand and Improve
Online Learning: MOOCS and Technology to
Advance Learning and Learning Research,”
Ubiquity, May 2014.
11. See Richard E. Mayer, Applying the Science of
Learning (Boston: Pearson, 2011).
12. Kenneth R. Koedinger, Julie L. Booth, and
David Klahr, “Instructional Complexity and
the Science to Constrain It,” Science 342, no. 6161
(November 2013).
13. Daniel L. Schwartz, Catherine C. Chase, Marily
A. Oppezzo, and Doris B. Chin, “Practicing
versus Inventing with Contrasting Cases: The
Effects of Telling First on Learning and Transfer,”
Journal of Educational Psychology 103, no. 4
(November 2011).
14. Sian L. Beilock and Susan M. Fischer, “From
Cognitive Science to Physics Education and
Back,” Physics Education Research Conference
Proceedings, 2013.
15. Susan R. Singer, Natalie R. Nielsen, and Heidi
A. Schweingruber, Discipline-Based Education
Research: Understanding and Improving Learning in
Undergraduate Science and Engineering (Washington,
DC: National Academies Press, 2012), 9.
16. Scott Freeman, Sarah L. Eddy, Miles
McDonough, et al., “Active Learning Increases
Student Performance in Science, Engineering,
and Mathematics,” Proceedings of the National
Academy of Sciences (PNAS) 111, no. 23 (June 10,
2014), http://www.pnas.org/content/111/23/8410.
17. Executive Office of the President, President’s
Council of Advisors on Science and Technology,
Engage to Excel: Producing One Million Additional
College Graduates with Degrees in Science, Technology,
Engineering, and Mathematics, February 2012,
http://www.whitehouse.gov/sites/default/
files/microsites/ostp/pcast-engage-to-excel-final_feb.pdf; Expanding Underrepresented Minority
Participation: America’s Science and Technology Talent
at the Crossroads (Washington, DC: National
Academies Press, 2011), http://www.nap.edu/
openbook.php?record_id=12984; Anthony P.
Carnevale and Jeff Strohl, “How Increasing
College Access Is Increasing Inequality, and
What to Do about It,” in Richard D. Kahlenberg,
ed., Rewarding Strivers: Helping Low-Income Students
Succeed in College (New York: Century Foundation
Press, 2010).
18. Phillip A. Sharp, “Meeting Global Challenges:
Discovery and Innovation through
Convergence,” Science 346, no. 6216 (December
19, 2014), http://www.sciencemag.org/
content/346/6216/1468.full.
19. James W. Pellegrino and Margaret L. Hilton,
eds., Education for Life and Work: Developing
Transferable Knowledge and Skills in the 21st Century
(Washington, DC: National Academies Press,
2012); Brian M. Stecher and Laura S. Hamilton,
Measuring Hard-to-Measure Student Competencies: A
Research and Development Plan (Santa Monica, CA:
RAND, 2014).
20. Ellen Wagner and Beth Davis, “The Predictive
Analytics Reporting (PAR) Framework, WCET,”
EDUCAUSE Review, December 6, 2013, http://
www.educause.edu/ero/article/predictive-analytics-reporting-par-framework-wcet.
21. Office of Educational Technology, Expanding
Evidence Approaches for Learning in a Digital World
(Washington, DC: U.S. Department of Education,
2013).
22. Julia Lane, Victoria Stodden, Stefan Bender,
and Helen Nissenbaum, eds., Privacy, Big Data,
and the Public Good: Frameworks for Engagement
(Cambridge: Cambridge University Press, 2014).
23. Singer, Nielsen, and Schweingruber, Discipline-Based Education Research.
This work was developed with support from the National
Science Foundation. Any opinion, findings, and conclusions
or recommendations expressed in this material are those of the
author and do not necessarily reflect the views of the National
Science Foundation.
© 2015 Susan Rundell Singer
Susan Rundell Singer (ssinger@carleton.edu) is Laurence McKinley Gould Professor in the Biology and Cognitive Science Departments at Carleton College.