Have you ever wondered (or meditated during long, long classes) why the typical classroom or university study setting can be so incredibly dull that you just won’t get anything into your head? Meet Berlin-based Henning Breuer, who specializes in interaction design patterns: his current research asks what kind of tools and interfaces we need to engage best in today’s learning environments. After four months of research at Waseda University in Tokyo, PingMag grabbed him for a talk about his interaction work, like the new software for interactive tactile whiteboards he has been developing…
Written by Verena
Henning Breuer with Professor Matsumoto at Waseda University, Tokyo
Henning, you deal with interaction design patterns for learning environments. You don’t only do research at your own company Bovacon, you are also part of the Interaction Design Lab at Potsdam University, and for the last four months you have continued your research at Waseda University in Tokyo. Can you quickly sum up what you do?
My work is basically about human-computer interaction, meaning the design, the implementation and the evaluation of interactive systems for human usability and analyzing the contexts of these usages. It is a mix of computer science, psychology and design, concerning everything connected with interactive interfaces, from recording devices to mobile phones and computers.
And the practical approach of your work would be…?
…about the design of technology from the user’s perspective. Interfaces should be useful and usable, helping instead of hindering people in what they want to do. With computerized systems this tends to get difficult, so my task is to design the things in a useful way according to cognitive and cultural psychology. This may include a shift from a purely task-oriented towards an emotional design and user experience.
“Interaction design and information architecture sound like esoteric, highly technical areas, but these disciplines aren’t really about technology at all. They’re about understanding people, the way they work, and the way they think. By building this understanding into the structure of our product, we help ensure a successful experience for those who have to use it…”From Jesse James Garrett: The Elements of User Experience
So this is about eLearning, am I right?
Not quite. eLearning is usually associated with distance learning, e.g. online learning. Though this gets the most attention in the technical literature, most of today’s learning situations take place face-to-face, for example in a classroom.
I see, so what seems to be the problem with today’s learning situations – don’t we have lots of technical devices already in use?
Despite the many technical devices used in these environments, they are often badly integrated and their interfaces equally badly designed. That is mainly because they were originally developed for scenarios other than learning processes: most of the human-computer interfaces that surround us were conceived within the paradigm of Personal Computing. Or, as I’d put it, the individual and ‘one’ interface: it is always one person with one screen and one keyboard.
A typical classroom environment. More engineered interaction, please.
And that started in the 80s with office workers adjusting to the PC?
In the late 60s, the works of Alan Kay and others notably paved the way for computers to be widely deployed in everyday life. They coined the notion of Personal Computing, which became popular in the 80s.
Closely connected to that is the notion of task-oriented behavior, as there is always a user and a task: like for example sending an email to someone. Regarding the learning context now, I try to shift from a task-oriented design towards a learning-oriented design.
‘Task-oriented’, that sounds rather rigid. One person working on the same thing, again and again… So ‘learning-oriented’ means more flexibility?
The assumption would now be: as a learning person, I first have to learn how to perform a task before I can succeed at the task itself. Also, if I have used a particular interface once, I have different preconditions for the next use. Maybe I have different aims the next time, or I simply learnt something since the last time I opened the software.
Display of the interactive whiteboard you can use via its touch screen, connected to the teacher’s PC.
However, the interface of a piece of software like PowerPoint always remains the same, as it doesn’t adjust to my enhanced skills or knowledge. It doesn’t follow my progress and instead treats me as an abstract user. That is what we need to change!
What exactly did you do during your research period at Waseda University then?
Funded by a research grant from NICT (the National Institute of Information and Communications Technology), we tried to develop new methods for learning situations in the classroom in terms of hardware: modified tactile boards (roughly speaking, a projector projecting onto a whiteboard that behaves similarly to a touch screen), as well as tablet PCs and PDAs connected to the teacher’s PC via Wi-Fi or USB. Our enhanced tactile boards, equipped with a touch-sensitive surface, are already in use in several environments internationally.
Henning did some surveys in classrooms and university courses – this is how students would rather like to work: drawing more freely, making associative connections.
So there are already tactile boards in use – but people use them the wrong way, and you are now trying to find the right modifications, am I right? What exactly happens in the classroom? What did you find out?
We observed the actual use of this technology in classrooms and university environments in Potsdam, Germany, and in Chile: its use is often totally chaotic, with the teacher switching between his or her laptop and the board. Also, teachers too often use it for their linear PowerPoint presentations and simply point at some note with their marker, which the board is not really suitable for at all…
Another situation would be, for example, a language course where the teacher writes all over the board and – to start something new – has to wipe out everything he or she wrote before. That’s not very helpful if a student has a question about something that was written there earlier.
That sounds less effective and more tiring than what we usually have already… How should a contemporary learning situation function instead?
This technology is currently used wrongly, so that the whole face-to-face interaction is cut off, with the teacher watching his monitor and the students looking at the screen. This contradicts contemporary learning theory. Older theories were about the teacher handing knowledge down to the pupils, pounding it into them.
Today we share a view of knowledge always being actively constructed with the pupils having to contribute more actively, like finding out stuff and making their own assumptions. That leaves the teacher more with the part of a moderator.
I see, so the teacher becomes a moderator in the sense that he gathers all the data provided by the students’ devices, such as the PDAs. As part of this interactive process, the collected info then gets projected onto the whiteboard and becomes the basis of all discussion. The teacher can then literally lay his hands on all the students’ information and work with their data. A truly interactive lesson…
Looking at the technical side, how do these whiteboards you constructed now work in detail?
Touch-sensitive surfaces track the position of the pen or a finger on the screen. The interactive board has its own driver software and is connected to the PC, which in turn is connected to a projector that projects onto the surface.
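To make the tracking concrete, here is a minimal sketch of the kind of calibration step such driver software has to perform: mapping raw sensor coordinates from the touch-sensitive surface to the pixels the projector displays. The function names, the two-corner calibration scheme, and all numbers are illustrative assumptions, not the actual driver described in the interview.

```python
# Hypothetical sketch: map raw board coordinates to projected screen
# pixels using a two-corner calibration, assuming the surface is
# axis-aligned (no rotation or skew). Not the actual driver software.

def make_calibration(raw_top_left, raw_bottom_right, screen_w, screen_h):
    """Return a function mapping raw sensor (x, y) to screen pixels."""
    rx0, ry0 = raw_top_left
    rx1, ry1 = raw_bottom_right
    sx = screen_w / (rx1 - rx0)   # horizontal scale factor
    sy = screen_h / (ry1 - ry0)   # vertical scale factor

    def to_screen(x, y):
        # Shift the raw reading to the calibrated origin, then scale.
        return ((x - rx0) * sx, (y - ry0) * sy)

    return to_screen


# Calibrate by touching two known corners, then convert pen positions:
to_screen = make_calibration((100, 50), (1100, 850), 1024, 768)
print(to_screen(600, 450))  # raw midpoint maps to (512.0, 384.0)
```

A real board would also need to correct for keystone distortion from the projector, which is why commercial drivers usually ask for four or more calibration points rather than two.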
The next step in gesture-based interaction: letting students use their PDAs/pen tablets for creating, importing or editing content…
The students’ devices communicate with the teacher’s PC and thus the interactive whiteboard. These are now screenshots of the whiteboard interface: on top see the icons of each student working on the board.
And how do you use it, like an enhanced computer desktop? What can you actually do with those white boards?
With the students at Waseda, especially Christian Sousa, and also Roberto Konow and Professor Baloian from Chile, we are developing several prototypes with gesture-based interaction. It means that certain gestures, like drawing a rectangle, are interpreted as commands by the software – in this case, to create a new window for content. You draw a field, mark it, color it, and then you can move it freely and generate new ones. You can zoom in and out, add nodes and link them to other pages, thus creating a network. You can generate and upload content and use the board as one whole interactive space. You can even save it and send it back to the pupils’ PDAs.
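To illustrate the idea of interpreting a drawn shape as a command, here is a crude, hypothetical heuristic for the rectangle gesture described above: a pen stroke counts as a rectangle if it closes on itself and all its sampled points hug the edges of the stroke’s bounding box. This is an illustrative sketch, not the actual Waseda prototype code; the function name, tolerance value, and detection logic are all assumptions.

```python
# Illustrative sketch of rectangle-gesture detection (not the actual
# Waseda prototype): the stroke must be closed, and every sampled
# point must lie near one of the four bounding-box edges.

def is_rectangle_gesture(points, tolerance=0.15):
    """points: list of (x, y) samples from one pen stroke."""
    if len(points) < 8:
        return False
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, max_x = min(xs), max(xs)
    min_y, max_y = min(ys), max(ys)
    w, h = max_x - min_x, max_y - min_y
    if w == 0 or h == 0:
        return False
    # The stroke must end roughly where it started (a closed shape).
    (sx, sy), (ex, ey) = points[0], points[-1]
    if abs(sx - ex) > tolerance * w or abs(sy - ey) > tolerance * h:
        return False
    # Every point should sit near a bounding-box edge, not the middle.
    for x, y in points:
        near_vertical = min(x - min_x, max_x - x) <= tolerance * w
        near_horizontal = min(y - min_y, max_y - y) <= tolerance * h
        if not (near_vertical or near_horizontal):
            return False
    return True
```

When such a check succeeds, the software would swap the raw ink for a clean, movable content window at the stroke’s bounding box – the “draw a field, mark it, color it” workflow from the interview.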
Reminds me of Jeff Han’s magical touch screen… Moreover, that sounds like another field of interaction that finally gets rid of the mouse! Thank you, Henning Breuer! For more information about his studies, check out the Waseda website!