Course Planning for Emerging Technology

A colleague of mine recently likened the course planning process to what goes on inside a sausage factory:

Over a century ago, the German statesman Otto von Bismarck supposedly said, “If you like laws and sausages, you should never watch either one being made.” The same point can be made about the way I construct course syllabi…

While some folks may be shocked by the realities of how faculty members plan their courses, I think there is real value in opening up the process. In that spirit, I’m planning to use this blog to reflect on my activities in preparing the Emerging Technologies in Education course that I’m planning for the fall. The planning model that I use looks something like this.

For me, course planning involves balancing three sets of interlocking goals: the learning goals of the individual students, the constraints (and affordances) of accomplishing those goals in a credit-bearing college course, and the “institutional press” of conducting the class within a specific institutional culture. When I plan a class, I try to structure our time together in a way that does justice to the complexity of these three sets of expectations. In a perfect world, the goals would be largely aligned, but in the real world of practice they seldom are.

As a course planner, I make decisions about structure, sequence, timing, grading and the myriad of other details based on my individual interpretation of the context of the class. There are at least four lenses that I use to focus on the particulars of a class.

  • Educational Philosophy: Since the earliest scientific studies on curriculum, planners have noted that course design is a reflection of individual educational philosophy, and there is tremendous variation in the fundamental world views that shape teachers’ decisions. While my practice draws on a variety of perspectives–liberal education, progressivism, sometimes even behaviorism–my primary decision-making lenses are humanistic education and individualized instruction.
  • Authentic Learning: As an intellectual and genetic descendant of John Dewey, I’m committed to building classes that advance authentic learning: learning that uses real-world problems and projects and that allows students to explore and discuss these problems in ways that are relevant to them.
  • Authentic Teaching: One of the dangers of a scientific approach to teaching and learning is that it devalues the relationship between teacher and learner. In planning courses, I try to find topics, techniques and problems that connect to my genuine interests and concerns. In Parker Palmer’s terms: “Our deepest calling is to grow into our own authentic self-hood, whether or not it conforms to some image of who we ought to be. As we do so, we will also find the joy that every human being seeks–we will also find our authentic service in the world. True vocation joins self and service, as Frederick Buechner asserts when he defines vocation as ‘the place where your deep gladness meets the world’s deep need.’”
  • Communities of Practice: I’ve come to agree with John Seely Brown that one of the major goals of education is to bring students into contact with divergent communities with distinct understandings of knowledge and distinct ways of judging what is interesting, valid and significant. The focus of a community of practice is “learning to be” rather than merely mastering a body of knowledge. A major question in my courses is what it means to be an effective learner, citizen, teacher or administrator in a time of unparalleled technological change.

Translating those broad principles into practice—a set of activities and interactions, bounded by time and constrained by the realities of “institutional press”—makes the course planning process an enormously complex one, but one that constitutes the heart of effective teaching.

The Importance of Finishing the Job


Some time ago I was responsible for a camp in the Adirondacks with a big dam that controlled our lake. One summer we had some problems and had to bring in an engineering firm to do some work, and one of the engineers explained why building dams is so expensive. Generally about 90% of the work is invisible: rerouting stream beds, pouring footings and installing other infrastructure elements that, on the surface, have nothing to do with stopping the flow of water. While that’s true, the entire project has no value unless you finish the job and do the last 10% that actually stops the water.

In some ways many academic IT projects are a little like that, but with the numbers reversed. Buying and installing hardware and software, configuring systems and running pilot tests all take time and technical expertise. But for many of our projects, the real work that produces gains in teaching, learning or productivity just begins when those initial projects are completed. Producing those gains may take years of communication, evaluation, training and re-engineering. I wonder how many of us really understand that and commit to finishing the job when we launch some new initiative. Would we focus our time differently if we were more realistic about what we were committing to?

Photo by Flickr User Farm3 under a Creative Commons Attribution License

How Much Is Enough? Focused Research


One of my students came back to visit me after more than a year working with African refugees. While he was away, he said, one of the things he dreamed about was drinking a tall glass of cold orange juice when he got back to the US. When he got back to his home in western New York, he headed down to the local Wegmans grocery store to make his dream come true–only to find that he had to choose from more than 60 kinds–pulp, no pulp; with calcium or without; from concentrate or not from concentrate. After a year living with virtually no choice of what he would eat or drink or wear, he was so overwhelmed by the possibilities that he left without making a decision.

Most Americans assume that choice is a good thing–and that more choice is better. Psychologist Barry Schwartz challenges this central tenet of western societies: freedom of choice. In Schwartz’s estimation, choice has made us not freer but more paralyzed, not happier but more dissatisfied. For a great introduction to Schwartz’s thinking on this topic, check out his TED Talk.

We see the problems with too much choice all the time as we help users integrate technology into their teaching and research. Few users even scratch the surface in using the software they purchase. Experts find that most Word users utilize fewer than 5% of the features–even those for whom word processing is the central productivity tool for their work. One of the most difficult–but most important–tasks for those of us in the Technology Integration Program is to find the balance between unfettered choice and an unwarranted centralization that chokes off creativity. We need to take the lead in exploring new technologies, recommending those that have the widest potential to improve learning and then developing support mechanisms that help faculty adopt new tools quickly and efficiently.

I’ll be writing more about these focused research projects as the summer goes on, but I owe Susan three posts in the next three days, so I’m going to bring this one to a close.

Overcoming Bias: Learning From Your Track Record

Worth a listen on IT Conversations: Overcoming Bias.

We live in a world of cognitive biases and polarized opinions. We consider ourselves to be largely rational, yet we are often prone to systematic errors such as overconfidence, wishful thinking, and the attraction of strong opinions. This means decisions are often driven more by personalities and passions than by technical merits. Economic theorist Robin Hanson explores common errors and points to innovative tools such as prediction markets that can help overcome bias and promote truth.

George Mason economics professor Robin Hanson gave this short speech at the O’Reilly Open Source Conference last year about how cognitive biases get in the way of accomplishing goals that would make our organizations function better. Bias is a systematic tendency to produce errors in our judgment. (One way of defining the purpose of education is to help individuals understand these typical human errors and overcome them for the betterment of society.)

Most of us know that intellectually. We know that our inability to accurately estimate project times and costs creates all kinds of wasted effort, and that it’s incredibly hard to get ideas accepted on many of our campuses because of our natural tendency to glamorize our own ideas and discount those not invented here. This tendency toward wishful thinking and over-optimism is so pronounced that experienced managers of software developers learn to systematically adjust the estimates their teams give them.

Most of us are passionate about something–vegetarianism, open source, the Red Sox–but, in this talk, Hanson suggests that our greatest passion should be for doing the hard work of overcoming biases to come to the truth. One key to getting better at reducing errors in our judgment is to invest some time in investigating our actual track records. Most organizations would be well served by spending some serious time analyzing both failed and successful projects to establish better estimating procedures.
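As a rough illustration of what that track-record analysis might look like–the numbers and the code here are my own invention, not anything from Hanson’s talk–you could compute a simple correction factor from past estimates and actuals, then apply it to a new estimate:

```python
# Invented example data: (estimated_days, actual_days) for past projects.
past_projects = [
    (10, 18),
    (5, 11),
    (20, 34),
]

# Average ratio of actual effort to estimated effort across the track record.
correction = sum(actual / est for est, actual in past_projects) / len(past_projects)

# Apply the correction factor to a fresh, optimistic estimate.
new_estimate_days = 8
calibrated = new_estimate_days * correction

print(round(correction, 2))  # 1.9
print(round(calibrated, 1))  # 15.2
```

Even a crude factor like this, recomputed as each project closes, replaces wishful thinking with evidence about how we actually perform.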

After one of my academic advisors worked with me for a while, she came up with what she called the half-it/double-it rule. She’d ask me when I might get her the next draft, and I’d tell her 20 pages by the end of next week. She’d half it (reduce the goal to ten pages) and double it (increase the time to two weeks). Having goals that were tied to the time it actually took me to write–rather than the speed at which I wished I could write–made the whole project more manageable.
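The rule itself is trivial to state as code. This little sketch (the function name is mine, purely for illustration) just captures the arithmetic:

```python
def half_it_double_it(promised_pages, promised_weeks):
    """Apply the half-it/double-it heuristic: halve the promised
    output and double the promised time."""
    return promised_pages / 2, promised_weeks * 2

# "20 pages by the end of next week" becomes 10 pages in two weeks.
pages, weeks = half_it_double_it(20, 1)
print(pages, weeks)  # 10.0 2
```

The point, of course, is not the arithmetic but the habit of correcting your own estimates against your track record rather than your wishes.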
