Learn xAPI MOOC – Week 4 Reflections

Here are my thoughts after completing week 4 (the final week) of the Learn xAPI MOOC. This week is titled “Final Challenge and Post Conference Drinks” and it brings together the technical and strategy tracks.

There are three key questions:

Discussion Point: Learning analytics is too important for L&D to own

I was surprised (and chuffed) to see that the first question was inspired by the blog post I wrote on 28th May. This was a response to a post on learning analytics written by Mark Aberdour.

There were great comments from everyone and I’m really pleased to have contributed to the debate.

Due to semantics (or perhaps just me not being clear in the first place) there were some comments around responsibility and accountability. To clarify – I believe 100% that L&D needs to take ownership of results, and to some extent that will include learning analytics.

My point was more that if organisations are going to invest in employing data scientists (as Mark suggested in the original post to which I was responding) they need to do their work at a level above L&D (and every other department). If the measurement and the analysis of the data remains within L&D there is a very high risk that all we end up measuring is our own performance in the context of our own measures (was it a good piece of learning material?) rather than the impact we are having (did it make any difference to workplace performance?).

L&D certainly needs to be involved in learning analytics. The alignment of learning analytics to performance outcomes is something that should begin at the earliest stages of design.

One of the hardest parts of analysing any data is working out what it actually means. I’m less sure that L&D should be the ones who decide what the data means – that’s where a data scientist (or whoever) looking at this at an organisational level comes in.

Discussion point: Most L&D teams lack the skills and mindset to make effective use of meaningful data. Do you think L&D teams have the potential to develop the necessary skills and will their organisations give them the opportunity to develop them?

This is the question that I was asking myself at the end of week 2.

First of all, this isn’t a question of capability – I expect any good L&D team would be capable of developing the skills. To what degree they should develop these skills will probably depend largely on the size of the team (in small teams it may be more about mindset and understanding rather than deep skills).

Whether they will have the opportunity is harder to answer and this will probably depend on how they are perceived by the organisation. My thinking has moved on a bit since week 2. Back then I was wondering if L&D will be given the opportunity. Now I think that opportunity isn’t something they should wait for – they should own this and get on with developing those skills.

Discussion point: How do we make sure that we don’t get carried away with what’s possible and instead focus on what is valuable? And lastly who is it valuable to?

I think these are the big questions and the ones that I’ve been thinking about since week one.

Focussing on the valuable rather than the possible must be one of the most common challenges facing anyone who works with technology and learning. The only way we do this is by understanding what is valuable to the organisation.

That means going beyond being order takers who simply satisfy the needs of the immediate stakeholder who comes to us for a solution. We need to be able to consider everything we do in the context of the organisation’s goals and, when necessary, challenge the stakeholder if they demand the wrong solution.

The rest of the week’s content was focussed on practical activity around visualisations and telling stories with data.

Summary of my thoughts after four weeks

I haven’t reached a definitive conclusion about xAPI, but that wasn’t my expectation, and my ideas will continue to evolve. Here is a snapshot at week 4:

  • I’m interested in xAPI in that it changes the technical tools we have for measuring activity. If we really need to track activity, xAPI goes beyond many of the limitations of SCORM (such as tracking activity on mobiles and in apps).
  • Whether or not we need to measure activity is another thing entirely. I’m not against measurement, but it needs to be the right measurement and it needs to be actionable.
  • I’d be surprised if we really see many organisations tracking anything other than activity.
  • I fear that clumsy attempts to use xAPI too widely will degrade the experience for learners.
  • After four weeks I still can’t see how the learner benefits from xAPI. The benefits are to L&D and maybe to the wider organisation if they get the data analysis side of it right.
  • xAPI tracks activity (albeit a wider range of activity than SCORM) but it does not track learning.
  • The name is terrible. Domain specific acronyms make communication difficult and put people off by making them outsiders. See here.
  • People seem keen to use it outside L&D, but I think that may be difficult given that it is designed by L&D people, for L&D people to solve the problems that L&D people have.

Learn xAPI MOOC – Week 3 Reflections


Here are my thoughts after completing week 3 of the Learn xAPI MOOC.

The strategy track for this week is titled “Data and Learning Analytics”. I’d already been giving this some thought (see here) as I have concerns about the real benefits of even the simplest tracking and measurement, let alone ‘learning analytics’…

This week started off by addressing the difficulty that people often have in grasping the difference between correlation and causation. This was done by sharing a page from Explain xkcd (which is just brilliant!). I’ve bookmarked this for future use.

There was only one discussion point that interested me this week:

Discussion point: What sources of data could you be tapping into to build a better picture of the connections between training and performance?

There are, I think, only two possible answers:

  1. If you are considering this at a high level, then the answer is simply “whatever sources of data the organisation uses to measure performance”. So if you want to know if your customer service training programme has made any difference, you have to start by looking at whatever data source the organisation uses to measure customer service. Good luck with finding anything other than a superficial link.
  2. The alternative is to consider this at a much lower level. For every training intervention we should be identifying the intended performance outcome at the very earliest stages of design. At that stage we should also be identifying what data we need to measure that performance and from where to source that data. If that data source doesn’t already exist, there’s a good chance that’s because what you intend to change isn’t important enough to measure.

The rest of the week’s content was focussed on analytics, data, visualisation, privacy and other general data related subjects. All interesting topics, but nothing I hadn’t already given a great deal of thought to.

Summary of my thoughts after three weeks

I’m still trying to make sense of xAPI, and after three weeks here are the main threads I’m considering.

  • Before starting the MOOC I suspected that most of the talk about measuring “learning outcomes” actually referred to measuring “training outcomes”. After three weeks on the MOOC, I’m now sure that’s true.
  • I worry that there is more interest in using xAPI to prove a link between L&D and performance in order to demonstrate the value of L&D rather than to actually improve workplace performance.
  • Just as I did at the start I can see how tracking activity may bring plenty of benefits to L&D and pretty much none to learners.
  • In theory we could use xAPI to track a huge amount of activity, but I haven’t seen any compelling argument as to why we should. There seems to be an underlying assumption that measurement is a good thing (you only have to think about school league tables to know that it isn’t).
  • One of the more pertinent things I’ve read on the subject recently is this post from Henry Mintzberg. I’ve seen a lot of “measuring as a replacement for managing” within L&D.

The biggest worry I have is that in trying to measure too many things we will actually degrade the experience for learners. There are a few ways this could happen.

  1. We identify specific activities or content that we want to track and to make that easier to do we move them somewhere they are easier to track – you know, like an LMS.
  2. We give the learner some kind of tool (like a browser bookmarklet or app) and say “each time you learn/do/experience anything, just click this so we know about it. Oh, and make sure you pick the right verb.”
  3. We find some diabolical way to track everything they do and then analyse it in search of learning activity. Seriously, stop thinking about that right now.

I’m clearly not the only one to recognise that L&D lacks the skills needed for the kind of learning analytics enabled by advances such as xAPI.

Mark Aberdour has written a very thoughtful post about the challenges we face and makes this suggestion:

Clearly some of these items require close interaction with the L&D team, but in summary there is a real need to bring experienced data scientists into corporate learning and development, not just to set up analytics programmes but to continually monitor, review and refine the results.

via Building a learning analytics platform | Open Thoughts.

I agree that organisations need people who can interpret this data and make it actionable, but I don’t believe they should sit within L&D. If ownership for learning data remains within L&D we risk continuing the current situation where all we do is measure the most basic elements of our performance (inputs and outputs) rather than the impact of learning on workplace performance.

For learning data to have strategic value it needs to be considered at a higher level, in combination with data from other parts of the organisation. To be objective, ownership for this needs to sit outside any department with a vested interest in the results.

Learn xAPI MOOC – Week 2 Reflections


Here are my thoughts after completing week 2 of the Learn xAPI MOOC.

The strategy track for this week is titled “Building the Business Case for Data”, which suggested that it might hold the answers to some of my questions from week 1. It didn’t.

This week I’ve included some of the discussion points from the MOOC as well as my reaction to them.

The introduction to this week suggests that we can use data to evaluate not just learning activity but the learning experience. However, I have a couple of big doubts about this:

  • I love the idea of being able to assess the end user’s learning experience, but I don’t think that is what xAPI is going to do (at least not as it’s being described in this MOOC). What it seems to be talking about here is measuring the learner’s experience of a training intervention. I’m not saying that there is no value in this, but it’s not the same thing.
  • Given that the people who complete a learning activity are rarely the people who commission the development of that learning activity I’d be interested to know how successful people have been in getting the go ahead to make changes and further develop the learning activity based on that data. If the need of the person who commissions the activity is to “get something out there that proves we’ve done health and safety training” how much will they care about the learner’s experience?

Which brings us nicely to this question in the MOOC.

Discussion point: Does L&D deserve its place as a key influencer in business strategy? Or are we playing second fiddle to other departments and their needs?

In the video for this section, Sean Putman suggests we need to think about who the customers are for our learning interventions and who the customers are for our learning data.

This is logical advice, but with very few exceptions the customer for both of these is usually someone other than the learner. Indeed, it’s quite common that the customer is someone far detached from the learner (and thus even further detached from the organisation’s customers).

L&D puts itself into the place of playing second fiddle when it sees those other departments as its customers and does nothing more than take and satisfy their orders. If L&D wants to be treated as an equal it needs to behave like one – have an opinion and develop its own plan for supporting the organisation’s strategy. xAPI alone isn’t going to fix this, but it could give L&D more data to work with – if it knows what to do with it.

Discussion point: If you actually wanted to measure the performance impact of your learning solutions, who else would you need to work with? Do you think this would be easy to achieve, or are you likely to face road blocks?

It is scary (although accurate) that this question starts with “if”. The fact there is any doubt that L&D might want to measure the performance impact of learning tells you a lot about the state of L&D today.

However, my own experience is that even when that kind of analysis is offered to stakeholders, they don’t want it. I think that is a result of the order/supply relationship that in many cases exists between other departments and L&D.

Discussion point: How could you use this approach in your organisation? What data would you collect and why?

This question was asked in response to this blog post.

I like this idea of generating xAPI statements from the software that someone is using. However, I think I’d be more interested in how I could use the data to improve the user’s experience of the software rather than to better train the users.
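For illustration, generating a statement from within an application and posting it to an LRS boils down to assembling a JSON document and an authenticated HTTP POST. A minimal sketch in Python – the LRS URL, credentials, verb and activity IDs are all placeholders, though the `/statements` endpoint and the `X-Experience-API-Version` header are part of the xAPI specification:

```python
import base64
import json


def build_statement_request(lrs_url, username, password, statement):
    """Assemble the URL, headers and body for an xAPI statement POST.

    Returns the pieces rather than sending them, so the sketch stays
    self-contained. LRSs expect HTTP Basic auth and the
    X-Experience-API-Version header on every request.
    """
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    headers = {
        "Content-Type": "application/json",
        "X-Experience-API-Version": "1.0.3",
        "Authorization": f"Basic {token}",
    }
    return f"{lrs_url.rstrip('/')}/statements", headers, json.dumps(statement)


# Hypothetical example: the application reports that a user saved a document.
url, headers, body = build_statement_request(
    "https://lrs.example.com/xapi",   # placeholder LRS endpoint
    "client_key", "client_secret",    # placeholder credentials
    {
        "actor": {"mbox": "mailto:user@example.com"},
        "verb": {"id": "http://example.com/verbs/saved",
                 "display": {"en-US": "saved"}},
        "object": {"id": "http://example.com/app/report.docx"},
    },
)
```

The interesting design question is less the plumbing than the choice of verbs and activities – which is exactly the “meaning of the data” problem.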

The additional data which xAPI can generate makes it even more important that L&D understands what it is they expect to change through any learning intervention and what actually needs to be measured to see if that change has happened. Defining, collecting and analysing this data is not an easy task – it requires a skillset that few L&D people have and it will be time consuming and costly to do.

As with last week, I’m left with more questions.

  • Will L&D be given the opportunity to develop these skills?
  • Will their customers be prepared for the additional time and effort required to develop solutions?
  • Will an industry pop up around this, with vendors selling promises of systems that do all of the analysis for you?

Learn xAPI MOOC – Week 1 Reflections


Last week I made a start with the Learn xAPI MOOC, which is being run in Curatr and organised by the tried and trusted team of Ben Betts, Martin Couzins and Sam Burrough. This is the first MOOC that I’ve started which I can actually picture myself completing.

For me, the format is spot on – short, focussed pieces of content delivered by people that really understand the topic (who appear to have a real passion for it) with lots of opportunities to share and interact. They’ve also recognised that not everyone has the same kind of interest in this as a subject and so have separated out the content into two tracks – strategy and technical.

A bit of background

I’ve been aware of the Experience API (xAPI) for a long time, and I understand the basics of what it is intended to do:

The xAPI enables tracking of learning experiences, including traditional records, such as scores or completion. It also stores records of learners’ actions, like reading an article or watching a training video.

That description is sourced from this page on the ADL website and if you want a purely functional description of what xAPI is, it’s a good starting point.
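To make that description concrete, an xAPI statement is at heart an actor–verb–object JSON document. A minimal sketch in Python – the verb URI is one of ADL’s published verbs, but the actor and activity here are made-up examples:

```python
import json

# A minimal xAPI statement: who (actor) did what (verb) to what (object).
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",                      # made-up learner
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",  # ADL verb
        "display": {"en-US": "experienced"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/articles/intro-to-xapi",   # made-up activity
        "definition": {"name": {"en-US": "Intro to xAPI article"}},
    },
}

print(json.dumps(statement, indent=2))
```

The same shape covers “read an article”, “watched a video” or “attended a conference” – only the verb and object change, which is what lets xAPI track activity well beyond SCORM content.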

I also understand as much as I currently need and want to about the technical aspects of how it is implemented. What I don’t understand is why we should use it and in what context. I have therefore chosen to go through the strategy track.

Week 1

I’ve come away from the first week with more questions than answers – and that’s great. The MOOC has got me thinking about xAPI in much more detail than before. Here are the things that I’m currently pondering:

  • I get that xAPI allows us to track more things than SCORM, in more ways and with much richer data. What I currently don’t get is why we would track them at all.
  • Is knowing that someone has attended a conference, watched a video or read an article somehow more valuable than knowing that they have completed a piece of elearning?
  • xAPI has the potential to produce a lot of data. Other than storing it in a Learning Record Store (LRS) what do you do with it all? Organisations are generally pretty poor at using the data they have now – will they be able to do a better job with even more of it?
  • For it to have value, data needs to be meaningful and actionable. As far as I can tell the xAPI standard makes no attempt to address this – which makes sense. While it may be possible to standardise the mechanisms for structuring, recording and storing the data, the meaning of that data will be unique.
  • There is much talk about using xAPI to record performance data (which is terrific) – but in that case why is the data kept in a Learning Record Store? That name needs to change if it is to be taken seriously by anyone other than L&D.
  • While I’m thinking about it – I’m not yet sure that an LRS records learning any more than an LMS manages it.
  • Who benefits from the xAPI?
    • I’m pretty sure that so far all of the potential benefits I’ve seen broadly relate to the organisation, L&D or the individual learning designer.
    • I can see that there are benefits to the learner, simply because if the organisation wanted to track activity in a SCORM world, it had to be SCORM content, whereas xAPI seems to be able to track pretty much anything. Whether the learner benefits from that tracking is another question entirely.

I’ve really enjoyed the first week. I certainly know more about xAPI and even if I don’t have the answers yet, I’m getting a better handle on the questions I need to answer.

Noddlepod news


Back in 2011 I wrote about Noddlepod, which at the time I described as the best collaboration tool you’ve never heard of. Since then they’ve gone from strength to strength and I was delighted to see a big announcement from them. Not only have they received a funding package from Finance Wales and four angel investors, they’ve been joined in an advisory capacity by Charles Jennings, Mary McKenna and Nigel Paine.

I’m really pleased for everyone involved. Ollie and Stephen Gardener came up with the idea in 2010 and had already developed it into a company that employs six people – this new investment should help them continue their mission to revolutionise community learning.

You can read the official announcement on the noddlepod blog.

Let’s Talk About Inequality


Today is Blog Action Day and this year the focus is on inequality.

As a father I have hopes and aspirations for my daughter, and one of the most important things is to make sure that she gets a good education. Here in the UK that isn’t really a big problem. Sure, I can agonise over catchment areas and OFSTED reports, but having a choice of schools is a luxury. I don’t have to deal with anyone trying to stop my daughter receiving an education, or limiting the opportunities she has because she is a girl.1

Sadly, the same can’t be said in many other places, in fact according to Plan International 1 in 5 girls is denied an education.2

Making sure that girls get access to good quality education is one of the most significant steps in eradicating inequality. The results are felt far beyond the classroom.

  • It can help give them the knowledge and skills they need to establish a livelihood, and to develop a career.
  • It can give them the opportunity to enter a relationship by choice, when they are ready and as an equal.
  • And it doesn’t just help them as individuals, it can help lift whole communities out of poverty.

There are no limits to what an educated girl can do, they just need to be given the opportunity. Check out this video from Plan International:


It’s apt that I am writing this in the week that Malala Yousafzai was jointly awarded the Nobel Peace Prize for the work she has done to promote child rights, particularly in relation to education. If you have any doubt about the power of an educated girl, watch this video of Malala as she addresses the UN on her 16th birthday.

Take Action

I chose the title of this post, Let’s Talk About Inequality, not just because it is the tagline of this year’s Blog Action Day, but because if we’re going to address inequality we have to talk about it. Spread the word, tell other people how important this is and ask them to tell even more people.

Most importantly, take action:

  • Give your support to charities that work with communities to address inequality and improve education
  • Contact your MP. Politicians aren’t just there to represent your views on local issues, they can have a significant influence on international efforts to address inequality. Visit theyworkforyou.com to find your MP’s contact details and let them know that this issue is important to you.
  1. Just let anyone even try!
  2. http://plan-international.org/what-we-do/education

Packt Publishing $10 offer


I’m a big fan of Packt Publishing’s technical books and videos, and it’s 10 years since they embarked on their mission to deliver effective learning and information services to IT professionals. In that time they’ve published over 2000 titles and helped projects become household names, awarding over $400,000 through their Open Source Project Royalty Scheme.

To celebrate this huge milestone, Packt is offering all of its eBooks and Videos at just $10 each – this promotion covers every title and customers can stock up on as many copies as they like until July 5th.

More information is available at www.packtpub.com/10years

Learning Articulate Storyline – Book Review

Earlier this year I was delighted to be asked by Packt Publishing to act as a Technical Reviewer on their new book Learning Articulate Storyline. The book was recently published and not long afterwards a copy arrived by post. Even though I hadn’t done the hard work of actually writing the book, it was still exciting to see the finished product after having played a small part in its development.

I’m a big fan of the practical approach adopted in Packt’s books, and Learning Articulate Storyline is no exception. If you’ve never used Storyline before you can be confident that after working through this book you will be able to use it to develop elearning content.

There’s no unnecessary theory or explanations; after a quick introduction to Storyline you get stuck into building the first part of a course that you will continue to develop as you progress through the book. You’re introduced to new concepts at the point at which you use them, so there’s always a clear link between theory and practice.

At less than 280 pages you might think that the book is a little short, particularly when compared to typical IT books, but don’t let that fool you. It’s a testament both to the simplicity of the tool and to the practical approach of the book that it covers everything you need to get started – and I should be clear that getting started is what it’s all about.

The book is aimed squarely at novices and it meets their needs well, but this isn’t the book for you if you’re looking for advanced techniques. I can happily recommend it to anyone who is just getting started with Articulate Storyline.