Friday, 6 May 2011

Thing 12 - LDSE

[Image: LDSE logo]
Thing 12 is one of the most complex Things to introduce, but bear with us and hopefully you'll find it repays investigation. 

After completing Thing 12
  • You'll have experimented with using LDSE to design a module or lecture
  • You'll be aware of its underlying aims as a pedagogic design tool
  • You'll have an updated opinion about the usefulness of such tools
What is LDSE?

The Learning Design Support Environment is free-to-use desktop software for lecturers.  It lets you model a module or session in a way that makes it easy to experiment with different approaches.  As you experiment, it automatically generates an analysis similar to Thing 5's Pedagogy Profile, which you can use to help you compare different approaches. You can choose to start a design from scratch, from one of your existing designs, or from someone else's design.  Sharing designs is straightforward - just save the XML-format file and give it to whoever you want to share it with.
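
The design file's schema isn't documented in the materials we've seen, so the snippet below is only a sketch of the general idea - saving a design as XML so a colleague can load it - written in Python for concreteness, with every element and attribute name invented for illustration.

    import xml.etree.ElementTree as ET

    # Hypothetical structure - LDSE's real XML schema isn't published here,
    # so all tag and attribute names below are invented for illustration.
    session = ET.Element("session", title="Plate tectonics", duration="60")
    aim = ET.SubElement(session, "aim")
    aim.text = "Link between theory and practice"
    ET.SubElement(session, "activity", type="tutor presentation", minutes="20")

    # Save the design; sharing it is then just a matter of sending the file.
    ET.ElementTree(session).write("plate_tectonics_session.xml",
                                  encoding="utf-8", xml_declaration=True)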

The project is funded by the EPSRC and ESRC under their Technology Enhanced Learning programme and has involved computer scientists, educationalists and learning technologists from six HE institutions, including the Oxford team that developed Phoebe (Thing 7).  LDSE has kept the well-received bits of Phoebe and other previous tools, while putting a lot of work into improving usability and developing the conceptual underpinnings.  In a few minutes you'll get to judge for yourself how well it has succeeded.

Two more things are worth pointing out.  First, LDSE is a project, not a product - it's still something of a prototype.  The team is collecting feedback now, so your blog posts will help them decide how to continue.  Second, the main reason LDSE exists is to help lecturers take advantage of new technologies.  All the labour-saving design re-use and drag-and-drop is just a means to this end: it lets the software know something about what you're doing, so that it can highlight pedagogic principles you can apply when deciding how technology might enhance your teaching.


[Video: Learning Design Support Environment, from LDSE on Vimeo]

How is it used?

The tool's focus is 'sessions', which might be lectures or seminars, or more extended work like field trips or unsupervised study.  Future versions may add 'modules', assembled from groups of sessions, and 'teaching and learning activities' from which sessions are assembled.
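
As a rough mental model of that hierarchy (ours, not the project's actual data model), something like the following:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Activity:
        """A 'teaching and learning activity' - the smallest unit."""
        name: str
        minutes: int

    @dataclass
    class Session:
        """A lecture, seminar, field trip or stretch of unsupervised study."""
        title: str
        activities: List[Activity] = field(default_factory=list)

    @dataclass
    class Module:
        """Planned for future versions: a group of sessions."""
        title: str
        sessions: List[Session] = field(default_factory=list)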

The application window has three parts: a 'files' view in the left-hand column, showing your collection of modules, sessions and learning activities; a main central pane, where you put together your designs; and a right-hand 'palette' column, from which you drag items and drop them into your designs.

[Screenshot: The Session Properties tab]

When working on a session the central pane has three tabs: properties, timeline and evaluation.  The properties tab contains the title, timing and description of the session, plus its aims and learning objectives, which need to be dragged in from the palette.  Once dragged in, these very generic statements can be edited to make them specific, e.g. "Students can explain surface features in terms of plate tectonics" could be a contextualisation of "Link between theory and practice".

All LDSE's palette items are 'learning patterns' captured from actual teaching, abstracted from their subject-specific content, and made available to the user as a basis for new content. 
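
In programming terms, a palette item behaves rather like a template: generic text from which the subject-specific content has been stripped, waiting to be filled back in.  A toy illustration, reusing the plate-tectonics example above (the mechanism shown is ours, not LDSE's):

    # A generic statement captured from real teaching, with the
    # subject-specific content abstracted away.
    generic_objective = "Students can explain {phenomenon} in terms of {theory}"

    # Dragging it into a design and editing it restores the specifics.
    contextualised = generic_objective.format(
        phenomenon="surface features",
        theory="plate tectonics",
    )
    print(contextualised)
    # Students can explain surface features in terms of plate tectonics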

Once the properties are complete, users proceed to the 'timeline' view.  The palette changes to offer a range of generic learning activities and assessment types, which, once dragged onto the timeline, can again be edited to fit the specific circumstances.
[Screenshot: The Session Timeline tab]
When the activities have been placed, the 'evaluation' tab gives two visualisations of the designed session.  One is the inventory view; the other shows the proportion of time allocated to individually-tailored versus 'one size fits all' work, with group-based investigation falling between the two.  Both views are informational only - it's up to you to decide what you want more or less of.  The idea is that auto-generating them makes it easy to compare the effects of group size, teaching methods and use of technology on the learning experience and the types of learning involved.
[Screenshot: The Session Evaluation tab]
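
Under the hood, that second visualisation amounts to simple time-proportion arithmetic over the timeline's activities.  A minimal sketch, with the three-way classification assumed from the description above and the activity list invented:

    # Each activity carries a duration and an (assumed) classification:
    # 'individual' (individually-tailored), 'group', or 'whole-class'
    # ('one size fits all').
    activities = [
        ("tutor presentation", "whole-class", 20),
        ("group investigation", "group", 25),
        ("personalised feedback", "individual", 15),
    ]

    total = sum(minutes for _, _, minutes in activities)
    for kind in ("individual", "group", "whole-class"):
        share = sum(m for _, k, m in activities if k == kind) / total
        print(f"{kind}: {share:.0%}")
    # individual: 25%
    # group: 42%
    # whole-class: 33%
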
Versions of the software more recent than the one currently available for download have extra features.  You'll be able to start a design from a 'pedagogical pattern' with pre-set generic learning objective types and session activities.  The evaluation tab will include costs, in terms of your and your students' time, to help you weigh the costs and benefits of alternative teaching approaches.  The software will 'know' how, for instance, a particular kind of learning objective is supported by particular learning activities, or even complete learning designs, and will make recommendations to enhance learners' experience.  Each learning activity and evaluation metric will come with ideas and guidance from an updated version of the Phoebe wiki.  The timeline will be scalable to work in weeks, days, hours or minutes, whichever makes sense for your session, and you'll be able to edit learning activities to take account of the fact, for instance, that your lectures are very interactive rather than purely didactic.
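
The recommendation feature presumably rests on stored links between kinds of objective and the activities held to support them.  A deliberately naive sketch of such a lookup - the mappings below are ours, not LDSE's:

    # Toy knowledge base linking objective kinds to supporting activities.
    # All entries are invented for illustration.
    SUPPORTS = {
        "apply theory to practice": ["case study", "field work", "simulation"],
        "recall key facts": ["quiz", "flashcards", "revision lecture"],
    }

    def recommend(objective_kind):
        """Suggest activities recorded as supporting this kind of objective."""
        return SUPPORTS.get(objective_kind, [])

    print(recommend("apply theory to practice"))
    # ['case study', 'field work', 'simulation']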

Step by step instructions

The downloadable version of the software unfortunately does not include the interesting context-aware guidance features, or the time-saving 'design patterns', so at the moment there's only a limited amount you can do with it.
  1. Download LDSE from the project's downloads page and start the application
  2. Click 'create a new session'
  3. Try describing one of your seminars or lectures using the palette's aims, learning objectives and learning activities, editing them to make them specific to your session as you go.
  4. Look at the evaluation tab.  Would you like to alter the mix?  Try going back and using some different activity types.  Are you happier with the evaluation breakdown?
Blog Thing 12
  1. Did you find the session design process in LDSE intuitive?  If so, how?  If not, please comment on how it fails to support the way you usually work.
  2. What pedagogic insights did you gain into the session you described in the exercise?  How could these help you design or deliver it differently?
  3. What problems do you have with LDSE as you have seen it? What do you like about it? Would you be interested in seeing the finished software?
If you're interested...
  • The LDSE team is collecting examples of teaching sessions from lecturers, which they are making generic so they can be offered back to LDSE users as 'design patterns'.  You can test out the current collection and submit your own using the online Pedagogical Pattern Collector. There's an introductory video on LDSE's Vimeo page.
  • A recent PowerPoint presentation about LDSE is available here
  • Diana Laurillard's 'conversational framework' provides the basis of the evaluation tab's inventory view.  The conversational framework is a general description of the process that teachers and learners go through, and has, amongst other things, been quite widely used to compare educational technologies in terms of which parts of the process they support well.  For a good introduction read the relevant section here, after which you may find the example figure here helpful.
  • Bloom's taxonomy, an influential hierarchy of levels of knowledge, comprehension and critical thinking in a particular topic, forms the basis of many guides to writing learning outcomes (e.g. this one from Oxford Brookes).  It also provides the majority of LDSE's generic learning outcomes. A good chart of the revised taxonomy, which is harder to find, can be found here.
  • The LDSE team have invested a lot of time equipping the software with a concept map, or ontology, of learning and teaching: a knowledge base of connected ideas and properties that is supposed to let it infer that if you want to do X you might be interested in considering Y (a toy sketch of the idea follows this list).  The knowledge base itself isn't published, but you can find human-readable overviews of pedagogic models on HEA Fellow James Atherton's site and the Phoebe wiki.
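
The toy sketch promised above treats the ontology as a small directed graph of 'if you're using X, consider Y' links and follows them a couple of steps.  Everything here is invented for illustration; the real knowledge base is far richer and, as noted, unpublished.

    # Toy 'ontology': edges mean "if you're using X, consider Y".
    # All entries are invented for illustration.
    RELATED = {
        "peer assessment": ["marking rubrics", "anonymous review"],
        "marking rubrics": ["learning outcomes"],
    }

    def suggestions(idea, depth=2):
        """Follow 'consider also' links up to `depth` steps away."""
        found = set()
        frontier = [idea]
        for _ in range(depth):
            nxt = []
            for cur in frontier:
                for linked in RELATED.get(cur, []):
                    if linked not in found:
                        found.add(linked)
                        nxt.append(linked)
            frontier = nxt
        return found

    print(suggestions("peer assessment"))
    # e.g. {'marking rubrics', 'anonymous review', 'learning outcomes'}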

2 comments:

  1. Our main funder, the JISC, recommended the Learning Design Support Environment to us for consideration. The tool’s credibility is boosted by the fact that the team behind it includes Professor Diana Laurillard, a highly respected name in educational technology. If anything comes close to mrj10’s “high-powered, web-enabled gadget that designs curricula at the press of a button”, this should be it, although it focuses more on teaching sessions than larger curricula. LDSE is also a prime example of the kind of tool that is frequently recommended to us at CARET, and which we need to evaluate and filter before coming to a decision on its potential value to Cambridge.

    As one of only two desktop applications amongst our Things, LDSE required more effort than most from our participants. Further effort, albeit of the imagination, was demanded because the version made available for download featured only a few of the remarkable features recently demonstrated by the team at seminars. This is why we made LDSE the last of our 13 Things. However, our respondents were remarkably patient with it.

    The ‘palette’ model of constructing learning objectives excited quite different responses. Socratic Investigations “quickly ... gained the sense that the platform is far *too* supportive. A bit of support is one thing; programmatic digitalizing of a course of learning is another. The former can be of direct service to the learning environment (classroom); the latter is likely to be only of use to extra-learning-environment bureaucrats and institutions ... It is far too restraining for the educator.” For mrj10 however “it is too open ... anything goes [in this version of the tool, at least], so I felt rather lost.” “Are there no better choices/combinations?  No limit on the number of types of activities or suitable durations? .... are there some combinations that make more sense?” “Having a list of possible aims/outcomes/activities provides some food for thought, but ... it’s not clear why I might/should choose a particular selection.”  “While I don’t want to be forced to adopt a particular approach, if there is theory on this it would be good to know.”  

    This non-judgemental approach was a major source of frustration for mrj10. The final version of LDSE is supposed to offer recommendations aplenty, drawing on an internal conceptual map of learning design theory, but the version tested had nothing like this, making it seem an empty exercise: “Without any guidance I can’t see any immediate insight for a session I might be planning.”

    In terms of usability, Socratic Investigations found “The process is straight-forward” and mrj10 reported “It wasn’t hard to work out how to use LDSE.”

    In terms of usefulness, our respondents weren’t able to evaluate the full feature set, which includes sharing and repurposing teaching designs. Based on what they could see, Socratic Investigations felt that “much of the jiggling involved in the LDSE program strikes me as expendable; a bit as relying on a very sophisticated computer to perform calculations such as 5+5”, a sentiment shared by mrj10: “Apart from the evaluation, it’s not clear why this needs to be online.  Most of the rest could be done with pen and paper, so what is gained?”.

    mrj10 explored the reasons for this dissatisfaction: “For me the focus of attention seems back to front.  I would prefer to start with what I am going to teach before thinking about what the learning aims or outcomes might be”. Content-first design is how many teaching academics prefer to work, although it seems to be rejected by the teaching and learning development community. Are so many lecturers wrong, is the teaching and learning development community adrift from reality, or is the difference merely in the understanding of terminology?

    [Continues...]

  2. [Continued...]

    For LDSE as for Thing 5: Pedagogy Profile, mrj10 was also unhappy with the opaque scope and derivation of the choices presented by the tool: “The Palette lists seem reasonable, but I am unable to judge whether they are comprehensive or whether I would want other options.” “From the layout of the timeline there could be seen to be an implication that there should be a mix of types of activities (and maybe of activities within a type), but I don’t know why this should be.” “I would ... like to see examples of how these approaches have been effective before attempting to incorporate them in my teaching (but this doesn’t seem to be something that LDSE offers).”

    LDSE’s course analysis is an interesting feature that could be considered a development of the OU’s Pedagogy Profile (Thing 5). mrj10 “was intrigued to see how the random activities I had entered were ‘evaluated’, but had little sense of why they gave the results they did or whether I should consider this good or bad”, noting “A one hour straight lecture (tutor presentation?) can be fantastic or awful depending on who is teaching it.” Socratic Investigations conceded this platform could “help with laying out facets of educators' concern when planning out an academic course of studies”, although he was concerned this might have more to do with institutional expectations than educators’ concerns.

    Socratic Investigations also worried that such effort invested in planning might come at the cost of responsiveness to learners: “I am aware of the technical "efficiency" of the program (of how "high" the formatting/grid is placed above the particular changing/malleable needs of real learning environments); but I am also aware of the fact that technical efficiency ought not to trap our judgement in a web of sticky pre-formatted expectations”.

    In conclusion Socratic Investigations held that “Essentially what LDSE appears to be aiming at is the gradual replacement of the educator's discerning virtue with "the machine."” The LDSE team would probably say ‘supplement, by means of codified theories of learning design and reusability of designs’, rather than ‘replace’. mrj10 concluded “The basic idea seems reasonable, the choices sensible and it seems fairly straightforward to use.  I’m just not sure about the principles behind it or whether it is approaching curriculum design in the way that matters most for me.”

    Overall our respondents were intrigued but sufficiently unconvinced of the tool’s value that they were unlikely to spend further time investigating. For LDSE, as for Thing 8: Compendium LD, an advocate of the tool might conclude that the problem lies with teaching staff, who should be sufficiently familiar with teaching and learning theory that helping them represent and measure aspects of their courses is enough. In this world view, staff possessed of this information will be able to exercise their academic judgement more effectively. If this is not a world view shared by teaching staff, I think it would be fair to say the problem lies with the tool. To be fair to LDSE, it is still under development and the team may address this in a future version.

    As a general point about complex tools in this space, which we must remember is one of an academic’s core concerns, it is important to approach tool design in a user-centric way, not from first principles of theory. Winning users in the latter approach requires having them trust your judgement unquestioningly (an improbable circumstance for an academic) or having them share your whole theoretical and empirical framework (equally improbable). Far more acceptable to accommodate and support, and optionally supplement, their own working practices - who could object to doing what they already do, but better?
