While we’ve seen considerable experimentation and exploration scattered across the educational landscape, one area that has largely resisted the transformations of recent technologies is the standard conference presentation. Think about it: because of their logistics and their typical venue — an auditorium with a stage facing row upon row of chairs, or a rigid constellation of tables packed together to maximize attendance — most conference sessions focus primarily on a leader delivering information for an audience’s consumption. If that ‘delivery & consumption’ model is something we’re working to transform in classrooms, couldn’t we also work to transform it at conferences?
This is why it was especially exciting to team with the Bett content team this year to explore ways to do just that. You can read more about our rationale for the experiment and some of the outcomes we were hoping to achieve here. Did we succeed in helping people move from being passive consumers to active partners? We’re still collating data and following up with participants… I’ll post the results here once they’re available. But today, I wanted to consider some of the complexities of the challenge…
Understanding What People Are Thinking…
Gathering real-time feedback from up to a thousand people posed a challenge for us, and we were grateful to have Glisser as a back-end infrastructure to support the effort. We needed not merely to enable back-channel discussions and observations, but also to capture them so we could analyze them later for the purposes of our experiment. Understanding what people are thinking is a bit of a holy grail for all teachers, whether they’re presenting from a conference stage or in a school classroom. But pursuing it generates a significant challenge: how much do the receiver’s thoughts distract from the presenter’s message? One of the important issues to understand as we move forward will be the relative efficacy of monologue, dialogue, and multilogue (as we might frame large-scale, multi-current discussions). What’s the impact of each of these forms on how information moves among us and how we situate that information in our memory and thinking? For our part, we’ll be interested to see whether responding even to questions tied to the very topics being presented might function as a distraction or impediment to the presenter’s message.
Considering the Implications of Structure
Another important factor to consider will be the impact of letting presentations keep an organic structure rather than integrating the ‘cubic’ dimensions into the presentations themselves. In the case of our Bett 2018 experiment, we built questions designed to focus the audience on content, community, and context, but we did not ask presenters to integrate these dimensions into their presentations, nor to structure the elements of their presentations according to these three dimensions. Will an organically structured presentation undergirded by shaped reflection prompts be as effective as a more integrated approach? The results will be interesting to see.
Engaging People’s Prior Knowledge
One of the compelling patterns to emerge in several recent research projects involves the impact of people’s prior knowledge on their reception of (and receptivity to) new information. One of our initial goals with this project was to engage people’s prior knowledge by first asking them questions about their feelings or reactions to the main topic of the impending presentation, and then asking them to provide a detailed description of how that topic actually works. By starting with reaction and then moving to a more analytical or explanatory question, we were hoping to create a kind of shock or conundrum for those in the session: “I have some feelings and reactions to this topic, but when it comes down to it, I’m not exactly sure how it really works…” Though such informational collisions can be uncomfortable, the realizations they engender can help clear the ground so people in a session are more ready to learn.
The Pluses and Minuses of Positive and Negative
The final structural element we wanted to test in our experiment was the impact positive approaches to a topic might have compared to negative approaches. In some cases, we asked people to consider the ways the topic being presented might benefit their schools and communities or integrate positively into them, while in others, we asked people to consider ways their schools or communities might resist or make arguments against the topic being presented. In both cases, we asked them to think about the reasons that might drive these responses and to consider how they might interact with either allies or opponents. As we follow up with session participants, it will be interesting to see which approach was more effective in helping people contemplate the new information they encountered in the sessions and take steps to integrate that information into their contexts and communities.