Unfold Learning

exploring the best innovations in learning and teaching


How Hard Is It?


The definition of rigor, according to the dictionary, is “strictness, severity, or harshness, as in dealing with people.” Over the past decade, this word has crept into eduspeak more and more, often to argue that students are earning high grades for work that is too easy and does not lead to learning. Do we really mean to make classrooms inflexible places? Probably not.

Search for the words “rigor” and “education” together and you will mostly find a wide range of new “definitions” for this word. More than definitions, many of these are circumlocutions: rambling explanations about challenging students without really getting at how to challenge them or how to ascertain that the challenge constitutes rigor. There are no tools or scales for measuring rigor and no advice on how to identify it.

The buzz about rigor is not new. A post at the Hechinger Report from 2010 states that students are leaving high school unprepared for college-level work or the workplace. The post mentions designing a curriculum that prepares students for college, but gives no specifics. The post also mentions an expert who tells us there is a difference between rigorous teaching, rigorous questioning, and rigorous assessment, but those differences are never actually explained. I have some rigorous questions of my own about “rigor,” a word that seems to be more a political construct with suspect motives than a pedagogical standard.

What is the point of rigor in schools? If the goal is to ensure students are working hard, is the hard work a means to an end or an end in itself?

Challenge is necessary for learning. Doing what is already known is easy, but it does not lead to the further acquisition of skills or knowledge. Learning happens at the edge of what has already been learned and what can be learned next, at the edge of what is easy and what starts to pose a challenge. Vygotsky described this edge when he wrote about the “zone of proximal development” (ZPD). If rigor ensures students are challenged so they are constantly learning, there must be a way to determine whether students are challenged, and by how much. Too much challenge (a leap too far into the ZPD) leads to frustration; too little leads to boredom and a lack of learning. However, “hard” and “easy” vary with each individual, and teachers and students judge the difficulty of any situation very differently, and for different reasons.

The Glossary of Education Reform defines “rigor” as an element of “…lessons that encourage students to question their assumptions and think deeply, rather than […] lessons that merely demand memorization and information recall.” This is not very different from the goal of Deeper Learning, which is defined by the Alliance for Excellent Education and the Hewlett Foundation as academic activities that lead to students acquiring the following six competencies.

  1. Mastery of core content
  2. Critical thinking
  3. Problem solving
  4. Collaboration
  5. Communication
  6. Self-directed learning

If learning activities are designed to challenge students while promoting the development of these competencies, rigor does not need to be quantified and assessed as a separate element. So why is it?

We ask students to learn things that are definitely hard, but have little or no connection to what will be required of them after graduation, and we do it in the name of rigor. As the Glossary of Education Reform notes, “While some educators may equate rigor with difficulty, many educators would argue that academically rigorous learning experiences should be sufficiently and appropriately challenging for individual students or groups of students, not simply difficult” (emphasis added). Until there is a consensus on what rigor means and how it can be observed and measured for individual students, we should ask ourselves questions that help us ensure students are challenged for a clear purpose and that learning outcomes are genuinely productive.



‘Cubic’ ELM Assessments 3: A Problem-Based Learning Course…


This post is the third in a series using “engagement and learning multiplier” (ELM) assessments to examine some common teaching and learning methods. If you’d like to (re)familiarize yourself with how these assessments work, you can refer to this post. If you’d like to compare this current post’s ELM assessment with others I’ve done, you can find the assessment of a laboratory course here, and you can find the rest of the “cubic” assessments in the sidebar (which may be at the bottom of this screen if you’re reading this on a mobile device).

For each of these assessments, I’ll set the learning scenario and then present analysis about why that approach has a given “cubic” shape and why it receives a particular ELM score. These posts are designed to provide useful examples and guidance as you evaluate your own learning approaches and as you make your own teaching and technology choices.

A problem-based learning course

Description

This middle-school social studies course is organized around a single major concept: building a human colony on Mars. Learners work through three separate phases of this concept over the course of the year: 1) the initial planning and colonization — deciding what would be needed to establish a colony and who should be invited as initial settlers; 2) the running of the colony — deciding what system of laws and government should characterize the settled colony; and 3) the expansion of the colony — deciding how to attract new settlers from Earth to come and expand the colony’s capabilities. While the teacher has determined this overall structure, the details of what learners plan, what they make to demonstrate their plan (to both peers and parents), and how they present their work are left entirely up to the learners. The teacher begins the course by introducing learners to Scrum, the collaboration method originally developed to help software developers work more productively together (see this helpful post by Bea Leiderman about Scrum in school). All work in the course is developed by learners using this method, with the teacher serving as the “Product Owner.”

Learners form their own teams of three to five members whom they choose based on a “skills résumé” (accompanied by examples where appropriate) that each learner prepares and presents to the class: descriptions of drawing or artistic ability, experience making movies, writing or math skills, knowledge of particular software or apps, etc. These teams stay together throughout the year — though learners are also encouraged to “cross-pollinate” by seeking help from other teams if they need something no one on their own team can provide. Cross-pollination works by means of barter: teams have to negotiate, with one team offering services the other team wants in exchange for the services the first one needs. Any team that finishes a project before the other teams is designated a “consulting group”: its members are expected to split up and serve the other groups by helping with whatever they need. If a team experiences any interpersonal difficulties, its members are responsible for working those difficulties out themselves (though the teacher offers guidance and resources if the team members request help). The role of Scrum Master rotates through all of the team’s members during the first unit, with every member serving as Scrum Master at least once. After that, team members are allowed to choose their own roles based on their abilities and their team’s collective sense of how they can serve best.

Every class day begins with a “Stand-Up,” during which learners show the products of their work to one another, deal with delays or impediments, and decide what their work for the day will involve. Following this initial meeting, the teacher might briefly present relevant materials, involve students in a mini-project, or ask one of the teams to present some of their recent discoveries or work. She also provides materials on the class blog with the understanding that learners will use these as a starting point for their own explorations and creations. As learners develop their projects, they conduct research, develop media, and share results, all facilitated by the tablet devices the school provides for each learner.

Teams present the results of the first two project phases in December and March during evening assemblies open to the public. Each team posts its assembled materials on the course blog for “public review” one week prior to the assembly, and is expected to use feedback gathered from this review period and from the public forum to revise their work. Each group gives a 10-minute presentation followed by 10 minutes of public Q&A. At the end of each forum, the assembled audience votes on which group presented the most compelling plan, which produced the best presentation, and which demonstrated the best responses to the audience’s questions. Many former class members participate in these public reviews “just for fun,” though the top team from the previous year serves as a formal “review committee” — service they perform both for the honor of the position and for the pizza party they get during final reviews. The “review committee” provides specific observations about what each group has done well and what each group needs to improve.

In late May, learners present the results of the final project phase to the entire school, and the assembled school votes on which settlement they’d most like to join — and why. These three public forums (and the materials prepared for them) take the place of course exams.

The top team (and next year’s “review committee”) is chosen by combining the results of the three public forums and an end-of-year, in-class vote determining which overall project was the best researched, best supported, and best presented — a process which the previous year’s “review committee” referees.




‘Cubic’ ELM Assessments 2: A Laboratory Course…


This post is the second in a series using “engagement and learning multiplier” (ELM) assessments to examine some common teaching and learning methods. If you’d like to (re)familiarize yourself with how these assessments work, you can refer to this previous post. If you’d like to compare this current post’s ELM assessment with others I’ve done, you can find the assessment of lecture here, and you can find the rest of the “cubic” assessments in the sidebar (which may be at the bottom of this screen if you’re reading this on a mobile device).

For each of these assessments, I’ll set the learning scenario and then present analysis about why that approach has a given “cubic” shape and why it receives a particular ELM score. These posts are designed to provide useful examples and guidance as you evaluate your own learning approaches and as you make your own teaching and technology choices.

A laboratory course

Description

This secondary-level chemistry course is designed to introduce learners not only to the main concepts of general chemistry, but also to much of the basic equipment and lab protocols used in this field. As part of the standard science curriculum, this general education course is one all learners are expected to pass prior to graduation. Because of its broad and introductory nature, the teacher has tried to make course concepts accessible and follows a carefully organized curriculum in which more complex concepts and skills build on the simpler ones that precede them.

Each two-week unit begins with an introduction featuring a presentation by the teacher. He uses a variety of media to illustrate concepts, including videos and images made by learners in previous years, which he attributes to their authors. This begins the first phase, focusing on conceptualization. The teacher’s introduction is followed up with homework assignments and in-class scenarios designed to give learners practice in understanding and internalizing the unit’s central concepts. The teacher chooses five learners for each unit (eventually rotating through the whole class twice) who present their homework as a basis for class discussions. Their fellow learners are asked to critique and correct the work presented with the requirement that they double-check the science and also describe “something great and something that needs improvement” for each peer presenter. All learners are encouraged to find resources on the web or in the library that they find helpful, posting links to them in the course’s learning portal so others can access them for help understanding course concepts.

At the next phase, focusing on experimentation, the teacher presents learners with a hypothesis that will anchor their laboratory explorations. He follows this with a brief introduction to lab and safety protocols, introducing learners to the equipment and procedures they’ll use to conduct the unit’s central experiment. Each learner is also assigned two partners for the experiment, with partners changing for every unit. The teacher has designed these rotating partnerships to help learners make more connections with their classmates as well as to distribute the “advantages” offered by high-performing learners. The three-person lab teams record experimental data not only by writing results, but also by making photographs and videos. These will be used to illustrate lab reports, which are jointly authored and submitted digitally. Each partner is assigned specific parts of the lab report and is expected to identify the sections she has authored, but each must also “sign off” on the other partners’ work. During laboratory experiments, the teacher circulates to answer questions, correct improper uses of the equipment or errors in the experimental protocols, and ensure that all teams are on task and distributing work evenly among learners.

The teacher grades lab reports for scientific and experimental accuracy as well as for writing and media quality. For evaluation of the non-empirical aspects of the lab report, the teacher also follows the “something great / something that needs improvement” model, offering all comments via audio files delivered to each team via the learning portal. Learners receive both an individual and a team score. The teacher asks those who produce exemplary reports for permission to use them to illustrate concepts for learners in future classes.



‘Cubic’ ELM Assessments 1: Traditional Lecture…


In my last post, I described the ways the “cubic” model could be used to evaluate learning approaches and described a method for calculating “engagement and learning multiplier” (ELM) scores. If you aren’t familiar with these concepts, you might want to review that post before continuing…

Over the next several posts, I’ll perform cubic ELM assessments of several common learning approaches. For each, I’ll set the learning scenario and then present analysis about why that approach has a given “cubic” shape and why it receives a particular ELM score. Hopefully, these posts will provide useful examples and guidance as you evaluate your own learning approaches and as you make your own teaching and technology choices.

 


A traditional lecture course

Description

In this course, the teacher presents information most days through a combination of lectures and presentations — some conducted using an “interactive” white board. The teacher also incorporates materials from the course textbook in her lectures, highlighting the points learners will have to know for exams. Learners are expected to take careful notes, and exams and other assessments come largely from material the teacher has covered in lecture, though some also comes from exercises and readings in the course text. During class, learners are encouraged to ask questions if they don’t understand a concept, and the teacher organizes weekly discussions where she probes learners’ understanding of course topics. In addition to homework exercises, learners are expected to complete a major research project. This project is designed to introduce learners to important books and journals in the discipline, and they must use materials from the school’s library, including the library’s online, full-text database, to complete it successfully. Learners choose from a list of topics furnished by the teacher, who has ensured that library holdings are adequate to support each topic. Assessment of these projects (as with exams and homework) is completed by the teacher, who writes comments designed to correct errors, to help learners acquire disciplinary literacies and conform to disciplinary norms, and to praise particularly insightful or advanced responses. The teacher periodically presents exceptional or noteworthy homework exercises, exam responses, and final projects to the class, being careful to protect the authors’ anonymity, in order to encourage dedicated, thoughtful work. She makes herself readily available outside of class to discuss course concepts and encourages learners to come by her office or contact her by email if they have questions or difficulties.



Evaluating Learning Approaches in ‘Cubic’ Space: Engagement & Learning Multipliers…


In my preceding posts, I’ve described the dimensional levels of each facet of our “cubic” learning model: content, community, and context. This post (and the next several) will incorporate and build on content from those posts, so if you haven’t yet seen them, you might want to review them before continuing here.

In “cubic” learning, each element’s “dimensionality” increases as the learner becomes more engaged and plays a more central role. The increasing agency, skills, and responsibility learners must demonstrate at each progressive level also means that more and more, they need support rather than direction, individual resources rather than a one-size-fits-all recipe, and companions and partners rather than controllers. More dimensionality means more learner-centered — and learner-driven — learning.

As we’ve seen previously, the dimensional levels of the “cubic” model look like this, with the higher levels increasing the volume of the cube they generate:

Cubic dimensions & values

While “volume” in this model is something of a metaphor, it’s one backed up by research. For example, as both Anderson and Krathwohl’s revision of Bloom’s taxonomy and Webb’s “depth of knowledge” (DOK) argue, creating content not only requires more ability from learners than recalling, it also increases their learning potential: the deeper level of engagement makes learning more likely to take hold. In other words, moving from recall to creation increases the potential “volume” for learning — and therefore makes a bigger cube in this model.
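The author's earlier post defines the actual ELM scoring method, so purely as an illustrative sketch: if we assume each facet (content, community, context) is rated on a numeric dimensional level and the "volume" is simply their product, the metaphor can be expressed in a few lines. The function name, the facet-level inputs, and the product rule here are my assumptions, not the model's official formula.

```python
# Hypothetical sketch of the "cubic" volume metaphor: each facet of the
# model gets a dimensional level, and the learning "volume" is their
# product. The product rule is an illustrative assumption only.

def cubic_volume(content: int, community: int, context: int) -> int:
    """Return the 'volume' generated by the three facet levels."""
    for level in (content, community, context):
        if level < 1:
            raise ValueError("each facet level must be at least 1")
    return content * community * context

# A course at the lowest level on every facet generates minimal volume,
# while raising all three facets multiplies the potential for learning:
lecture_style = cubic_volume(1, 1, 1)        # minimal cube
problem_based = cubic_volume(3, 3, 3)        # much larger cube
```

The multiplicative shape is what makes the metaphor interesting: raising any single facet helps, but raising all three compounds, which matches the claim that deeper engagement across dimensions multiplies learning potential rather than merely adding to it.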



Thanks, SSAT Leadership Legacy Project!


I recently had the great privilege to participate in the SSAT's Leadership Legacy Project, a program designed to grow future leaders for UK schools. What was most encouraging about this event — echoed in the week that I spent with SSAT member teachers and staff at events around England — was the emphasis on developing thoughtful leaders by first developing thoughtful teachers. Too often, we can see school leadership and teaching as disconnected or even oppositional. But for powerful learning environments to be created, every member of a school — from the cleaners to the highest level of leadership — has to be engaged in helping learners. In effect, everyone has to be on the teaching team.

This was the primary message of the amazing Baroness Sue Campbell, chair of the Youth Sport Trust and key figure on the UK’s 2012 Olympic Committee. Sue reminded us that teamwork and mutual support are a far more important foundation for success than focusing on skills and performance. Her generosity and encouragement were a great fit for SSAT’s mission and message.

The best thing about SSAT is its role as connector, bringing people together in a powerful, country-wide network to think, collaborate, imagine, and work. I’m glad to have become part of that network, and I’m looking forward to staying connected!



Dimensions of ‘Cubic’ Learning: Context


In my preceding two posts, I’ve described the levels of two facets of a multidimensional learning model made up of three: content, community, and context. If you haven’t yet seen those previous posts, you might want to review them before continuing here.

However, even though I’ve split this discussion across three posts, this model does not describe three elements that function independently; all three combine to create a single “cubic” learning experience. They’re parts of the same basic entity, facets of a single prism. Splitting them apart, as some learning models do, ignores the influences each dimension has on the others and elides the important ways they cocreate an environment for learning.

In this final post examining each facet’s structural progression, I’ll explore the levels associated with context. Then, in my next post, I’ll map specific teaching approaches onto these three dimensions, offering examples of how this “cubic” model can be used to assess and rate the efficacy of particular learning constructs. Finally, I’ll conclude this series by connecting our “cubic” model to other existing learning models and taxonomies.

    *          *          *

From an educational standpoint, context is at once both a simple and an incredibly complex concept. It’s simple because we’re very used to seeing our classrooms and their equipment as the “theaters” where learning happens. We even have a standard minimum expectation for such spaces: seating and work surfaces for learners, special demonstration equipment for teachers — including chalkboards or white boards, projection screens, and so forth. We know that if we want to do something special — display 3D models, organize work groups, conduct lab demonstrations or explorations, connect in real-time to far-away experts, or stage a performance — we might either need special equipment or we might need to move into some sort of special facility that makes these activities possible. But why would we want to do any of these special activities? The answer is simple: we know that they’ll amplify some portion of content or will enable some form of collaboration that we think will benefit our learners — or both.

And this is where context becomes complex. We instinctively realize that even relatively minor changes in the learning context — introducing new tools, a new space, or even a new classroom “culture” — can powerfully impact learning within our schools. But if that’s true for the few changes or additions we can make inside of a school facility, how many more contexts from outside the world of school could we leverage for learning? And what could we expect from our learners if we could integrate those contexts and opportunities every day and not just once in a while?