"Results Day....Yay I can stop learning now"

14:03:00 Learning Boffins 0 Comments






This time of year always takes me back to memories of the school summer holidays and the one day we all dreaded: GCSE results day, A Level results day, and in fact every single assignment result I waited for whilst at University. Every time I received these results I breathed a big sigh of relief and thought 'no more learning... until the next one, obviously'. And when my final year marks were given to me at University I thought 'no more learning EVER'. At the time this felt like the greatest feeling, and it was of course celebrated in traditional student fashion.

But I look back now and think 'oh, how wrong I was', and not in a negative way at all. Without even realising it, every day since completing my final year of University (and in between) I have had learning experiences that have helped me in the working world. It took a while for me to realise that, after my educational years, learning can happen outside of the classroom.

In our conversations we can see that many organisations still don't see learning outside of the classroom as 'learning'. The well-known 70:20:10 model (which of course isn't a prescriptive model) suggests that 70% of learning comes from on-the-job experience, 20% from coaching and relationships, and only 10% from formal classroom learning.

It is certainly a mind-set shift to realise that you are learning outside of the classroom, and it is one that organisations can embrace. It can inspire organisations to use non-formal learning as part of other learning programmes or as ongoing learning support. Getting L&D to understand and implement this is the easy bit; where you really need buy-in, and where the shift becomes noticeable, is in getting employees to go on the culture shift with you.

Even when informal support is in place, employees still say that they don't get any training in their organisation. It is therefore important that employees understand the concept of informal learning. Through continued, targeted communication and simple messaging you can start to get employees to recognise 'informal' as learning and, most importantly, to understand and get value out of it.

We have had first-hand experience of taking employees on this mind-shift journey and, as a result, have supported clients through their own journeys. In a formal sense this can be seen as culture change and managed in that way; it is a process and a journey that shouldn't be underestimated.

Take a look at our digital learning infographic: http://capitakpconsultants.blogspot.co.uk/2016/06/our-digital-learning-infographic.html
or tweet us @LearningBoffins. We would love to hear your experience of moving to informal...




What is the difference that makes the difference?

15:01:00 Learning Boffins 0 Comments


We’re all well into the Olympics now, and a question I’ve pondered is how much difference the equipment makes to the performance of our elite athletes?

In diving, for example, it's pretty clear-cut: there is no equipment. Tom Daley and Dan Goodfellow jump through 10m of air into the pool below.  But what difference does Andy Murray's choice of racket make against his opponent?  Or how does the quality of Greg Rutherford's running shoes affect his long jump?  I'll let you decide what difference Amber Hill's spray-painted shotgun and pink cartridges make to her performance.

Following a series of crashes on the cycling course, a TV interview showed that whilst competitive cycling allows some latitude in the dimensions and weight of the bikes, they’re all very similar: perhaps the most significant decision a cyclist makes is how hard to pump the tyres (higher pressure means less drag, but also less grip on corners).

In sport, the rules quite rightly seek to create a ‘level playing field’, so athletes compete as much as possible on equal terms.  Any difference is therefore down to the skill, strength and wit of the athlete, not their equipment.

Now here comes the rather predictable connection to L&D…

… because when it comes to formal training, I notice we make a very different assumption – that the outcomes of learning depend very heavily on the quality of the training course.  Generic content is usually frowned upon in favour of something bespoke: specifically designed for our purpose.  We take a lot of time developing the training.  If we’re outsourcing, a great deal of time and thought goes into choosing the right supplier.  And even if we find something suitable on the shelf, we’ll still want to tailor it for our purposes.  “Content is king” we say (and preferably content built to high standards of instructional design).

Content is important, no doubt about that.  But I have noticed that if you look at learning in terms of outcomes, then content does not influence the result anywhere near as much as our practice would suggest.

Here's my list of the 'Big Four' influences on learning outcomes:
  • Motivation: whether the learner has the desire to learn
  • Relevance: how closely the learning matches what the learner needs to be able to do better
  • Line manager support: encouragement of the learner, before and after the training
  • Application: how much the learner uses the knowledge and skills after the training.

In my experience, unless these things are right, it doesn't much matter whether the content is good, bad or indifferent.  And yet, how often do we give them even the briefest consideration, let alone investment?

My conclusion is that, rather like our top athletes, the outcome is down to the people.


Millennials, learning and PokémonGo

14:04:00 Learning Boffins 0 Comments




Where have you been if you haven’t heard of PokémonGo? I’m a millennial who collected Pokémon cards and played the Red, Blue, Gold and Silver games on my colour Game Boy in the late 1990s and early 2000s. I got a little excited when I heard the game was going to be launched (even though I am now 25).

If you find yourself still asking what PokémonGo is, here is a little intro. It’s an augmented reality game available on iOS and Android devices; it uses GPS and the camera to let you capture, battle and train virtual creatures. When the GPS finds a Pokémon, it shows on the screen as if it were in the same real-world location. You catch it and it's yours. You collect as many as possible, and the aim of the game is to catch them all (especially Pikachu). Looks like I will be doing a lot of walking to find them then!

I wanted to understand why I am so addicted to and engrossed in the game, what factors were driving this, and whether these elements could apply to how we approach and think about learning. So here are my top 3 points that have helped to drive my interest in, and addiction to, the game.

1. Ease of use: 

It’s on my iPhone (basically an attachment to most people now), so I open it when I can: on my walk at lunch, on the train, as and when I want it. It’s also simple and straightforward; with little instruction I understand the objectives of the game. This removes all the barriers to me accessing it and getting involved.


Learning link:

So, looking at this point from a learning perspective: if learning is available when I need it and its objectives are simple, the barriers to getting involved are removed. I'm not saying this increases motivation towards learning, but it certainly doesn't block it.


2. Part of a community:

Being part of the game allows me to be part of a Pokémon community which helps to drive my involvement in the game. I talk about it amongst peers and would feel excluded if I wasn’t part of the Pokémon world.

Learning link:

This feeling of being part of a community and being connected is certainly a characteristic of the millennial world, but as the growth of technology has made us a more connected world, we are seeing all generations say that a community element is something they want in learning. This ‘network effect’ makes learning interactive and participatory, allowing people to learn from others’ experiences by discussing and experiencing content together.


3. Personal:

Although I am part of the wider Pokémon community, it’s my game. I move through the levels as I wish and only interact with the parts that are of interest to me. I move through it at my own pace and don’t feel pressure to get into all parts of the game.

Learning link:

Having something that I choose how and when to work through at my own pace, and only interacting with what is relevant to me, reduces the feeling of pointless learning. It also makes the time spent on learning more productive, because what I am focused on is more relevant to me as a learner.

I think these 3 points are an important take-away from PokémonGo. According to research, looking at ease of use, community elements and personalisation of learning can help to make for more engaging learning environments. However, this article and the research behind it are all from a millennial's perspective. Do we think the elements we have discussed apply across all generations, and not just millennials?


Learning in a VUCA environment: don’t confuse passion with outcomes

15:00:00 Learning Boffins 0 Comments


One of the best acronyms I’ve come across in recent years is VUCA, which stands for Volatile, Uncertain, Complex and Ambiguous.  These four words describe very well the environment in which we currently live and work and learn.

The last few weeks have served up a particularly unsettling series of events, which emphasise just how VUCA our world is.  Whether it’s the result of a referendum or a football match, political leadership contests or resignations, attempted coups or the appointment of new leaders, a terrorist attack or a rail accident, our world is daily shaken by unexpected events.  Even in our personal lives and work, we face complicated situations which demand that we solve problems with half the data missing.

Of all the media output that has recently washed over me, a brief excerpt stood out.  I can’t remember it verbatim, but in a radio interview I heard something like this:

“It’s fashionable for politicians to say how passionate they are about an issue.  But I don’t want my politicians to be passionate - I want them to do something.”

And here’s the thing.  In all the complexity of life and work, we love to hear people who are passionate: it cuts through the confusion and we rightly recognise it as A Good Thing.  Being passionate may well contribute energy to achieving an outcome.

But being passionate is not the same as getting things done.

The parallel with learning is this.  When evaluating learning I hear much about how learners enjoy learning, how learning is important for their development, and anger when access to learning is restricted.  I also see excitement in the eyes of L&D professionals as they plan new content (particularly when it’s of the digital kind).

However, for all these expressions of passion (or as close as learning brings anyone to that emotion), we rarely seem to know what difference learning makes.  Nor, when planning learning, is there a clear view of what difference we expect it to make.

Getting excited about learning is not the same as achieving a learning outcome.

If learning is to achieve anything in this VUCA environment, we absolutely must focus on the outcomes we want learning to achieve.  Being passionate about learning isn’t enough.  Enjoyable learning just helps learners along the journey.  Recognising that learning supports development gives energy but not direction.  Brilliant learning content sows the seed of behavioural change, but real change only occurs if it’s applied in the workplace.  The storms of our VUCA workplaces will too easily blow all this off course.

To stand a chance of learning ever making a difference, we need to focus on the outcomes of learning:

  • Before you do anything, start by describing the tangible outcomes you want by the end.

  • Throughout the learning process, keep on articulating the hoped-for outcomes.  Increasingly, my experience indicates that a precise learning path is much less important than articulating the endpoint.  If learners can describe a worthwhile endpoint, they will get there regardless of what L&D do (or don’t).

  • At the completion of the learning and embedding, ask to what extent you can see the hoped-for outcomes.  I guarantee the answers will give you great insights into the learning process.

Always encourage passion, but never let it be a substitute for actual outcomes.


How ‘DogFest’ reminded me of learning styles

12:00:00 Learning Boffins 0 Comments


Over the weekend I attended ‘DogFest’. It is what it says: a festival for dogs. There were stands with nutrition, bedding, toys, photography, massage and much, much more. But the best thing... dogs. Thousands of dogs all over the place, extremely excited to be socialising with so many others! As a big dog fan I absolutely loved it; everything from tiny Chihuahuas to beautiful Bernese Mountain Dogs was great to see, but especially the puppies of all kinds of breeds... they got my vote!

What I loved most were the agility classes. It is incredible to see the relationship between dog and owner and how they work together to go through the course of jumps, ropes, bridges and tunnels. But the interesting thing for me was the different styles and techniques each owner would use to suit their dog, which they had obviously built up over time. Plus how they got the dog interested in the first place: whether using toys or treats, they knew what worked for their particular pooch. For example, I saw different starting tactics, different command distances from the dog (some doing a lot of running and keeping close, others not), or a squeaky toy or bone for encouragement. It highlighted that even though they were all there to do the same course at the same event, they had all been taught slightly differently, in a way that suited the dog as a learner, to get the best result.

When you think of workplace learning, each learner has a different style, and this is sometimes overlooked. On many occasions one course is rolled out to all employees, meaning they are all faced with learning in the same way.  In reality each of us learns differently, and what works for one doesn’t always work for another. After seeing all the doggy learning styles, I thought it was a great time to revisit the typical learning styles of us humans. One of the most widely used models of learning styles is ‘The Index of Learning Styles’ developed by Dr Richard Felder and Barbara Soloman, based on a learning styles model developed by Dr Felder and Linda Silverman in the late 1980s:


Although some people have a preferred, dominant learning style, many of us actually use a mix of all of these styles. Most commonly, people find themselves using different styles to suit different situations, and people can develop ability in their less dominant styles, so you certainly won't always find yourself sitting in the same bucket. But when asked, most people have an inclination of what their preferred learning style is.

So, with the learning that we deliver to employees, can we say that it accommodates or supports different learning styles?

As learning professionals it seems obvious that we have this covered, right? But when it comes to curriculum design it is something that can be overlooked. In some organisations a 'one-size-fits-all' approach is taken to learning (granted, in terms of compliance this may HAVE to be the case, but surely not with other training?). Not recognising different learning needs and styles can be extremely demotivating for some people. Training content should stem from a desire to improve knowledge in any given area, and should increase employees' value to the company and increase engagement, so we should think about varying the learning to suit learners' needs.

Understanding learning styles is a great tool for us to understand how we can create and support environments in which everyone can learn. The challenge is to deliver or curate a variety of content that helps people learn effectively and to a good standard for the organisation.

Can you be confident that your curriculum supports a number of different learning styles, or that your trainers are able to adapt to support them?

We have experience in curriculum transformation and in building a curriculum that is aligned and impactful for your organisation. Speak to us or tweet us @LearningBoffins to find out more.

    








England in Euro 2016: what can we learn from England's demise?

10:47:00 Learning Boffins 0 Comments



So last night England failed yet again to make it deep into a major competition.  To make matters worse, the unanimous opinion was that this performance was the poorest they have ever given!  In the aftermath of the defeat, no less than 20 minutes after the final whistle, the manager, Roy Hodgson, had resigned, and the twitter-sphere was awash with jokes regarding two exits from Europe in almost as many days.  So where did it all go wrong for Roy and his England team?

Fundamentally, I think it boils down to a lack of strategy on Roy's behalf.  The pundits on BBC and ITV both slated Roy for not having any understanding of what the team's best formation was and who its top players were, despite having ample opportunity to work this out during the qualification process and the pre-tournament friendlies.  This lack of strategy can be seen in the decisions made in each of our group games in terms of substitutions, starting players, playing positions and even the timing of specific decisions.  A common thread post-game was that we lacked purpose and it looked like we had no idea what we were doing on the pitch.  This simply cannot be allowed if you want to succeed.

This is also true for business.  All too often decisions are made, or not made, at a strategic level that directly set the tone for how a business's L&D function will operate and the success they will have.  A good example of this would be IT infrastructure.  In an age where technology is becoming more and more integrated into daily life it is expected that e-learning or learning platforms can be implemented to full effect.  These solutions are often built with cutting edge technology and yet they often struggle to be implemented in a business environment due to the lack of IT infrastructure investment.  Somewhere along the line a short term view has set the strategy for IT in learning and it is now having dramatic consequences on how effectively L&D can deliver learning to the end user.


Another failing of the England performance, and in particular of Roy Hodgson, was the inability to recognise strengths and weaknesses and act accordingly.  There was a lot of media scrutiny of Roy Hodgson's decision to take Jack Wilshere to the Euros this year: a player who had barely any league time due to injury and who could not be considered 'on form'.  Roy chose to take Wilshere instead of other players who had played all season and shown ability in the England squad.  This has been put down to his loyalty to his chosen few.  This loyalty would also govern a number of key decisions he made during the four matches we played at this year's competition.  At its base level, Roy is probably guilty of not being able to spot a problem because of his unconsciously biased viewpoint.

As before, businesses cannot allow similar tendencies to creep into their world.  Our activities, initiatives, programmes of learning, processes, etc. all need to be constantly reviewed.  What worked well a year ago may not work now.  A process of unbiased self-reflection is required in order to truly perform at the highest level possible, and I believe this is one of the most difficult things for a business to do.  Internal politics, lack of time and the day-to-day pressures of an ever-changing world make it hard, but if L&D and the wider business are to succeed they need to be smart about where they focus their time and ensure that they are leveraging their best assets and their successes.

Hindsight is a wonderful thing and it is easy for us all to criticise the England performance after the game, especially given the fact that it is unlikely that anyone reading this is a professional footballer.  However, we do understand our own world of L&D and we should take the hard lessons learned by the England team and apply them to our own day to day activities.  Essentially, have a well thought out strategy that is forward thinking and makes the best use of the assets that you have whilst constantly self-reflecting and improving on your winning formula.


A Brief History of (Formal) Training

16:00:00 Learning Boffins 0 Comments



My industrial history might be a bit patchy, but try this story out for size.

Around the start of the 20th century, pretty much all 'industrial' work was craftwork. It was practical. It was about making things, making them better, more reliable and making more of them faster. This required practical skills, the kind of skills an expert craftsman could show you how to do, often over an extended period of time. 'Management' was discharged by the rich and powerful, and was pretty basic: you worked, you got paid; if you didn't do enough work or good enough work, you got fired. No unions, limited workforce mobility, even less employment law, no need for great management skills either.

Then came mass production and Taylorism, and with it mass employment at relatively low skill levels. With 100s of people doing the same job, you needed them all trained up the same. So training (face to face, of course) also became a large-scale repeatable activity, and it worked, with armies of workers trained to execute practical tasks in the One Best Way.

By the early 21st century, all manner of technological advances had changed the nature of work and given rise to all kinds of work that never previously existed, requiring new skills and different knowledge. Automation, enabled by IT, profoundly changed what we do and how we do it. Telecommunications has diffused the location and timing of work. The skill levels required tend to be higher and more cerebral, and knowledge workers are commonplace. Company value, previously measured in physical assets, is now dependent on intangibles like Human Capital: it’s all about what's in employees' heads, their ability to innovate, etc.

Not only has the work changed, but management now assumes far greater importance. Human Capital depends as much on how humans work together, as it does on the intrinsic level of their skills. Add to this high levels of technological change, economic turmoil, workforce mobility and skills shortages, then management and leadership emerge as vital and complex disciplines.

Given how the world of work has changed so dramatically, I am surprised that our prevailing image of the main task which equips people for work (i.e. the classroom course) persists so strongly. In passing, I’m also curious that at the same time, on-the-job training still holds derogatory connotations, through phrases like “sitting by Nellie”.

We have known for a very long time that formal instruction is just part of the development process. Medieval craftsmen knew this – it was partly why they formed Guilds. For all their faults, Guilds provided a great framework for the development of skills through the medieval forerunner of the apprenticeship.

Today, so much business success is dependent upon employees being skilfully creative, for example:

· interpreting each particular customer situation and responding accordingly;

· regularly making rapid decisions based on incomplete data;

· developing innovative approaches to maintain competitive advantage in an ever-changing marketplace.

In such an environment, how can we expect any kind of standardised formal training programme to really deliver significant improvements?

Don’t get me wrong, large-scale formal training still has a place, but it’s unlikely to deliver high levels of competence, and certainly not on its own. For more advanced skills, some kind of personalised, on-demand (and probably on-the-job) learning is vital. We need to embrace the technologies that facilitate peer-to-peer information sharing on a global basis, and give employees freedom to experiment and innovate.


Kevin Lovell


A Night at the Museum, Our view on Curation

17:06:00 Learning Boffins 0 Comments








I was working for a client, curating content from the internet for them to use as a resource for their learners. I love doing this work, but in terms of efficiency I can’t claim to be making a profit. I will explain why this is the fault of the ‘hyperlink’ and not my fault!

To do this properly you need to assess the material for suitability, and you need criteria for selecting the right materials for your people. Don’t be fooled into thinking this is an easy, cheap or one-off activity for your learners.

Think of curating for a museum. You may have a lot of ‘stuff’, but it doesn’t mean a visitor leaves your museum having had a positive experience. In fact quite the reverse: with too much to look at and no discernible connection or symbiosis between the items, the visitor leaves feeling bewildered, exhausted and frustrated. In most modern museums less is more: they leave space for thought, the footnotes are brief and there is a pathway which tells an evolving story as the visitor progresses through the well-curated space.


We can learn a lot from how museums have changed over time. They know what works for their visitors and their visitors, like ours, are learners. 

So what are the ‘musts’ for meaningful curation?


  • Know your audience: how do they learn? What do they need to know, and what do they want to know?

  • What does your company want the learners to get out of your curated materials? It might be hard fact absorption, it could be to build a collective baseline of understanding in the company, or it could be to develop a sense of curiosity in learners for learning outside of their job role. Is the objective to build a learning organisation?

  • Do you need to create a pathway of developmental levels? (A good idea.) Be open-minded about the learner who will access the highest level first, to see if they can understand the topic at the advanced level, before dipping into the lower levels.

  • Add texture to the learning. Look for videos, blogs, articles, infographics and LinkedIn communities; research papers and free courses, published dissertations, book reviews and TED Talks, home-made videos from your own SMEs. Vary the tone from the serious to the lighthearted, and vary the length from the short hit of an infographic to a 15-hour free Open University course.

  • You might want to create ‘boxed sets’ of learning: perfect for the binge learner who will work through them in order, looking for a plot twist in each article, and who can create their own taxonomy from the learning you put in front of them.

  • At all times focus on quality. It is fine to have content which looks at topics from opposing standpoints, but make sure it is correct; there is a lot of incorrect information on the internet, so stick to reliable sources.
  • Nobody likes governance, but if you are putting up information from the internet, ask people not to click on advertisements, explain that this is curation of ‘free materials’ and that they shouldn’t sign up for anything which costs money, and set a rule about the purchase of books following a book review.



So what about the hyperlinks and my efficiency? I get so absorbed in the material itself, and in the thirst to know where things are referenced from, that I follow every hyperlink there is. Walking through my own endless museum of facts, history and futurism, without ever having to stand in line to view something, is my idea of happiness (though my MD might want to put a closing time on my museum).




For help curating materials for any subject for your organisation contact me – it will be a pleasure to help you. Rachel.kuftinoff@knowledgepool.com




Do happy learners make performance improvements?

16:00:00 Learning Boffins 0 Comments


I think it goes without saying that a happy workforce will drive better business results, but what about when you focus on the workforce just after they have engaged in a piece of learning?  In other words, when they are no longer just workers but learners.

Level 1 evaluations - AKA 'Happy Sheets'

Whenever I ask the question...

"do you evaluate your training?"

I will more than likely get an answer of "yes" followed shortly by a statement something like...

"we send out evaluation forms with every course we run"

These level 1 evaluations, or 'happy sheets', usually provide information on the venue of the training, the facilitator, the course content, course administration, potential indications of the knowledge/skills gained, etc.  What they fail to do is identify any actual real-world implementation of the learning, performance improvement or business results.  So why is it that the majority of organisations never look beyond the 'happy sheet'?

Is it that performing any evaluation, albeit at a very high level, is enough to put a tick in a box and satisfy senior management?  Is there an acknowledgement that more detailed and worthwhile evaluation data requires a much greater amount of resource, and that the time and effort to acquire it cannot be justified?  Or perhaps, more worryingly, do people think that level 1 evaluation data provides enough of an indicator of potential improvement that any other evaluation is unnecessary?

It is this final question that I seek to answer in this post.  Can level 1 evaluation data indicate the eventual performance improvement?

The raw data

Given our broad client base and the fact that we pass roughly 700,000 learners through our systems each year, we have a wealth of data relating to level 1 evaluations.  As you would imagine there are far fewer instances of level 3 evaluations, and this data is required to validate a real performance improvement.  The final data sample I used consists of entries from 4 recent years and across all vertical markets, thereby reducing the effect of any potential outside influences.

The graph below shows the spread of data when level 1 scores are matched to their level 3 counterparts.  The level 1 scores relate to immediate feedback in relation to the training they participated in.  This will include admin, course content, facilitation, etc.  The level 3 data is a score based upon the extent to which the learning has contributed to their performance improvement.


What is clear is that there is little to no correlation in the data.  What is interesting is that, prior to sifting through the data, I myself expected there to be more correlation than there is.  I anticipated that learners who came away from the training enthused and energised about their experience would be more likely to put into practice the learning covered on the course.  It startled me to find that immediate enjoyment of the learning, and even of its content, has very little bearing on improvement in role.
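The kind of check described above can be sketched in a few lines of Python. To be clear, the numbers below are synthetic, randomly generated for illustration (not our client data), and `pearson` is a hand-rolled helper rather than anything from our evaluation tooling; the point is simply that a cluster of high level 1 scores paired with widely scattered level 3 scores produces a correlation coefficient near zero.

```python
import random
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical matched scores (percentages) for 200 learners:
# level 1 "happy sheet" satisfaction vs. level 3 performance impact.
random.seed(42)
level1 = [random.randint(70, 100) for _ in range(200)]  # clustered high
level3 = [random.randint(20, 100) for _ in range(200)]  # much wider spread

r = pearson(level1, level3)
print(f"Pearson r = {r:.2f}")  # near zero when the samples are independent
```

A scatter plot of pairs like these looks like a shapeless cloud, which is exactly the pattern in our graph.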

After observing the lack of correlation I decided to do a deeper dive into the data, the often startling results of which are covered in the sections that follow.


Level 1 evaluations - 70% satisfaction guarantee

Unsurprisingly, level 1 evaluations lean towards awarding full marks more often than not.  What is surprising is the lack of any significant outlier data.  94% of level 1 responses come from individuals scoring the training at 70% or more.  In fact, 49% of all responses were scored at 90% or better.  With only 6% of respondents scoring less than 70% for course satisfaction, does this mean that all training is worthwhile, enjoyable or relevant?

Obviously the answer has to be NO.  We have to remember that these responses are taken close to, if not during, the training programme itself and as such there will likely be an abundance of positive emotions.  Things that evoke a positive reaction may include:

  • Learning something new that may impact their abilities
  • Having a fun / engaging exercise prior to completion of the evaluation
  • Spending 2 days out of the office
  • Meeting colleagues they have not seen in some time
  • Actually just attending training rather than the usual day to day work
  • etc.

This positive slant effectively means that trying to find any correlation between level 1 scores and performance impact becomes much harder.  For example, take the following relatively standard 5 point scale:


If the overall question for a piece of training reads something like...

"Did the training deliver against your development needs?" 

94% of individuals are scoring at either "Agree" or "Strongly Agree".  This does not make for insightful data analysis relating to potential performance improvement.  However, it is a great gauge of learner satisfaction immediately post event.


What exactly are you measuring?

Throughout this post I have mentioned that level 1 evaluation measures general satisfaction with the training, but perhaps more accurately it measures a subjective opinion or feeling in the moment.  This opinion needs no justification or evidence to support it and as such can be easily influenced by outside factors.

Level 3 evaluations on the other hand are based upon more considered thinking.  Whilst these are often opinions they are based upon information gathered over a period of time and then evaluated by the learner.  The following question is a spin on one we use in a large proportion of our level 3 evaluations:

"To what extent has the course been directly responsible for your performance improvement?"

This question seeks to establish a tangible link between the learning and performance.  Level 1 evaluations do not focus on creating any meaningful link, and in fact cannot, given their completion immediately after the learning.  Instead they focus on how suitable the content of the learning was or how much new information / knowledge the learner has gained, all of which is pure speculation.

The crux of it is that level 1 evaluations don't ask the right questions to be able to accurately predict eventual performance improvement.


Dissatisfied learners won't improve!

So I have already discussed the fact that 94% of responses come from people scoring more than 70% in level 1 evaluations and that this has little to no bearing on performance improvement.  However, there is a trend when it comes to the remaining 6% of respondents: those who answer below 70% on their level 1 evaluations.

Those who respond with 60% or less satisfaction at level 1 tend to see a reduced amount of performance improvement that can be related to the training.  In fact, 89% of respondents who scored 60% or less at level 1 attributed less than 50% of their performance improvement to the training they received.  Is this surprising?

In many ways this statistic makes complete sense.  If the training doesn't meet the needs of the individual or map well to their role then it would stand to reason that any performance improvements wouldn't map to the training.  It must also be considered that even if the training is relevant, if the individual has a negative learning experience then they are less likely to attribute any improvements to the learning.

The difficulty with making decisions based upon this information is that it is such a small proportion of the data available.  Discontinuing or changing training based upon this amount of information could not be recommended.  Without level 3 data relating to the success of the actual content, making such suggestions would not be wise.  


Conclusion

To answer my original question... There is no guarantee that happy learners make performance improvements.  This is not to say that level 1 evaluations are not relevant.  They provide meaningful data as long as it is understood what this data measures.  What our data shows us is that you cannot use level 1 data to predict performance improvement.  If your goal is to truly understand the effectiveness of training then time and effort has to be invested into proper detailed evaluation methods.

0 comments:

Our Digital Learning Infographic

11:45:00 Learning Boffins 0 Comments

We speak a lot about the need to support learners with digital, but we never see much about how we actually do this: what steps do we take to move towards a learning approach that is supportive of digital? The Learning Boffins have captured in an infographic the key phases that need to be considered when going on the digital transformation journey. Take a look...


Moving from ILT to digital learning

14:42:00 Learning Boffins 0 Comments







We know that the learning world is full of discussions around using alternative methods of learning. 'The millennial is engaged like this....' and 'this piece of technology can do this....'. Clearly across organisations this is something we are becoming increasingly aware of within L&D. But a question we find ourselves answering is 'so how do we actually move towards and use/support alternative methods of learning in our organisations?'.

So with all the talk around how we should support/implement/explore alternative methods of learning such as social, online, bite-sized, personalised and micro-learning (to name a few), many are not nailing down exactly which phases or steps should be considered to head in the direction of a more 70:20:10 approach.

We Learning Boffins have been looking at exactly this. We have tried to pinpoint, from our experience, the key phases of the journey you go through to move towards a more blended learning approach, so here we are:

Step 1. Business Plan - Create a business plan to facilitate your transformation. Consider the key questions here around current capabilities, future needs and potential implementation, integration and maintenance costs for new technology.

Step 2. Market Analysis - So we know our requirements, what technology is out there that can address and support this?

Step 3. SME and stakeholder buy-in - Appoint the SME for the new technology, get buy-in from your stakeholders and make sure IT have been included!

Step 4. New roles/re-shaping L&D - A new shape, new additions, new technology and new techniques mean L&D require new skills, therefore address the gaps here and appoint new roles accordingly.

Step 5. Content - Where is your learning content coming from? Third parties, curated internally or created internally?

Step 6. Communication and change management - This is a HUGE phase of the journey: communicate, build awareness, get end-users understanding and make sure they get the benefits, the point of it and how they can use it to benefit their own learning. Don't leave this till the end; otherwise all of the previous work becomes obsolete and the investment will be lost.

Step 7. Pilot/test/change/feedback - Employees need to be involved here; allow time for testing and piloting to see how it could potentially be changed, improved or removed. This could be a time-consuming phase, but without buy-in from employees it is difficult for the new learning tools to be utilised and therefore for them to become a positive investment.

Step 8. Roll out and on-going management - Once the technology has been successfully rolled out, it needs to be continually assessed. Ensuring that content remains relevant and that it has the desired business impact.

Every journey will have a slightly different angle and the phases will look different from one organisation to another. It all depends on the particular requirements, purpose, type of technology and scale.

To discuss the journey further and how Learning Boffins are able to support at each of these phases, tweet us @LearningBoffins - we would love to discuss your Digital Learning Journey with you!

 


As featured in Training Journal - What's shaping your L&D?

14:50:00 Learning Boffins 0 Comments




Rachel Kuftinoff asks some serious questions about the future of L&D and invites you to join the debate

There are very real concerns in the sector that learning and development professionals and departments are being pushed into a different shape, and out of board-level decision-making processes. It can appear that we have no control over this, but is that true? The fear within the industry that L&D will shrink beyond vanishing point shows that we need to fight to justify the impact we make on businesses.
As Europe’s number one managed learning service provider, KnowledgePool (part of Capita Learning Services) represents some of the world’s biggest employers and has an incredibly diverse client base, from banking to bread-making. This is why we are perfectly placed to ask this question, and why in 2016 Capita is investing in a new, extensive globalisation study. After a series of events and market research, the investigation will give definitive insight into the challenges, changes and other factors that will shape learning and development over the coming years.
In the course of this research, we will be asking TJ readers, senior learning professionals, and heads of HR for their views, as well as practitioners from over 6000 training providers with which Capita partners. The study will also use the opinions of C-suite leaders such as chief operating officers and chief finance officers who do not work in L&D but feel its impact on organisations.
The insight we receive from the research will give answers on what is shaping the HR and learning services for the UK’s largest employers. Here are some of the emerging themes.
Agility
This recently came up in a discussion with the head of L&D for a large supermarket chain. The company had always had a centralised function, and had been criticised for being too slow to act. By the time learning programmes had been signed off, the situation had changed. Agility is increasingly important as leaner operations mean there are often fewer people working in departments, so fewer learners can be taken out of operations for learning and development opportunities. How can we tackle that reduced access to learning?
Cost
Companies and employees across the board need to be aware of budgets. These were slashed in 2008 and we hoped at the time that this was the limit, but in fact it is even closer to the bone now. The funding hasn’t bounced back since, and there has been not so much talk of ‘rebuilding’, as ‘reshaping’, which often means losing employees. Is this the industry’s fault for failing to articulate, in business language, the value that L&D brings to organisations? Perhaps we need to try harder to “prove it or lose it”, showing the true return on an investment in good learning.
Technology
To implement all the exciting developments like mobile learning, bring your own device, social learning and social interactivity, you need to have the technology available to run them. Many companies are running on Internet browsers that can’t support this. Organisations that have large IT infrastructure and a big user base don’t have the available funds to justify the cost of updating to cutting-edge learning platforms. They therefore simply can’t afford to access some of the most exciting and flexible learning solutions. How can we shape L&D to include those hundreds of thousands of users? Accessibility to learning is an important subject, and I’m not averse to recommending printed workbooks in some appropriate circumstances – such as for learners in remote countries with a lack of Internet connection.
Diversity
Diversity is not just about ethnicity and gender. Diversity of language is an issue in the learner population in the UK alone, where English may not be the first language. Diversity of working environment is a factor too: e-learning can work fine for those in an office environment, but where does that leave the employees on the shop floor, or those who spend much of their time on the road? Diversity of expectation exists between different individuals: for example, the baby boomer generation might be happy not to undertake learning until they have to, but millennials tend to look for much more frequent access to learning, sometimes asking themselves on a weekly basis: “What have I learned today?”
Political
Especially in the current political landscape, with the Brexit vote looming, companies are holding their breath. Risk averse, they are waiting to know some of the unknown before investing, which is understandable. What they do need to think about is their readiness, as whatever direction the political landscape goes, they need to be prepared to lead, jump ahead of the game and take advantage of the situation. They already need to be training.
There are also growing concerns among many employers about the work-readiness of people coming out of education aged 16-19. The style of learning in work is unfamiliar after years of academia. If they aren’t following the same style they did in school, college and university, are they prepared for the more informal learning that we in L&D are promoting? There is a disconnection between the education system and requirements of the workplace. For example, what would be called collaboration in work would be seen as ‘cheating’ in school. The pen and paper environment isn’t representative of the real world, where people need to get used to learning out loud, as well as asking for and getting feedback. Should L&D professionals have more influence over education policy? After all, you spend more time learning as an adult than you do as a non-adult.
Generational
Millennials are a very large generation, who don’t necessarily come from the countries they are now living in. Fresh and vibrant, they want to be educated, amused, valued, appreciated and listened to. They are one of the most different and transformative generations we have seen in decades.
We used to talk about the ‘sheep dip’, a reference to everybody undergoing the same training. However, we shouldn’t treat every learning generation the same way. Millennials want to learn, but they also don’t need to be taught certain things such as political correctness and environmental responsibility. They already get it.
So what’s next? While the research is under way, we can think about becoming learners again ourselves by watching and listening. It’s important to give other people the space and opportunity to learn, and remember that learning doesn’t always require a teacher in the room.
We’d be interested in hearing what you as practitioners have to say about what’s driving the future of L&D. If you'd like to discuss it with us and a panel of senior L&D directors, then apply to join us on Thursday 26th May from 16:30 to 20:00 at The Hospital Club in Covent Garden – apply for the event here.


Data, data everywhere, but not a drop to drink

16:00:00 Learning Boffins 0 Comments



As large L&D departments progressively digitise their learning, they find themselves with systems capable of producing huge quantities of reporting data: learning management systems, purchasing/finance systems, HR systems, talent management systems, social learning platforms and so on.

Sitting on this great wealth of raw data, you’d expect to gain great insights from it. Yet L&D departments seem to be no better off: their decision-making becomes slow and based on guesswork rather than evidence. In a fast-moving economy, this is just not good enough.

Even basic metrics, like how much is spent on learning, are still quoted to the nearest million pounds. Calculations of the amount of training delivered annually are, at best, estimates. We need to get better at management information (MI)!


Why is this?


I think there are three main reasons:

1. Systems are fragmented


There is often more than one LMS, or there’s one system with data about classroom training, and another with e-learning data. Then there’s no way to join together data from different systems to get the whole picture: data from different systems is incompatible, stored in different formats and with no universal indexing fields (for example, individuals in one system are identified by their email address, but elsewhere they are identified by their employee ID).

One central problem is that supplier spend data is held in finance systems, whilst the learning activity data is stored in one or more LMSs or, failing that, in a plethora of Excel spreadsheets. This means there is no way to link spend to activity, to see with any clarity where your spend is being allocated.
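The bridging work involved can be sketched in plain Python. Every name, ID and figure below is invented; the point is the mapping table that joins two incompatible identifier schemes:

```python
# Hypothetical records - all names, IDs and figures are invented.
lms_activity = {"jo@acme.com": 12, "sam@acme.com": 3}          # courses completed
finance_spend = {"E001": 1500.0, "E002": 400.0}                # external spend (GBP)
email_to_id = {"jo@acme.com": "E001", "sam@acme.com": "E002"}  # bridging table

# Join activity to spend through the bridging table; unmatched
# learners default to zero recorded spend.
joined = {
    email: (courses, finance_spend.get(email_to_id.get(email), 0.0))
    for email, courses in lms_activity.items()
}
```

In practice this mapping would come from an HR master record rather than being hand-built, but the shape of the join is the same.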


2. Data quality is low


Data input into these systems is often inconsistent, particularly when that data is not used for subsequent reporting. Once the data starts being used, there's a lot more interest in keying it in right in the first place.

Even when data entry is not involved, you can have problems. For example, when is a piece of e-learning content ‘completed’? Much of the point of e-learning is that the learner does not have to use all of the content, so unless it’s a compliance piece (which insists the learner view all the content, or tests them at the end), the concept of ‘completion’ (which is after all a construct related to ILT) is impossible to pin down. Another example related to digital learning concerns learning duration. Just because some content is open on a learner’s screen does not mean they are reading it. As a result, recorded learning durations can be highly overstated.
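One crude way to sidestep the duration problem is exactly the kind of general assumption mentioned below: cap recorded durations at a multiple of the content's nominal length. A minimal Python sketch, where the factor of 2 is an arbitrary assumption rather than a recommendation:

```python
def cap_duration(recorded_minutes, nominal_minutes, factor=2.0):
    """Cap a recorded duration at `factor` times the content's nominal length.
    The default factor of 2 is an arbitrary assumption, not a recommendation."""
    return min(recorded_minutes, nominal_minutes * factor)

# Content left open on screen for 4 hours against a 30-minute nominal length.
capped = cap_duration(recorded_minutes=240, nominal_minutes=30)  # -> 60.0
```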

Unless you can improve accuracy or sidestep such problems by making some general assumptions, MI built on low quality data can feel like a house built on sand.
3. We can have unrealistic expectations of learning data

Systems tend to capture learning activity, not the outcomes of learning activity. Nowadays every Head of L&D wants to understand the impact of learning, but learning data is always going to struggle to demonstrate the business impact of our learning investments. This was well illustrated by a frustrating conversation I had over Kirkpatrick Level 1 response data, about why you can’t use Level 1 data to demonstrate ROI. We risk having unrealistic expectations of our data; however, there’s still a great deal that MI can do for you.


What is decent MI worth?


In short, it can reduce the total cost of learning by 30-40% in a large organisation. Big organisations are complex and fast-changing: you simply can’t keep track of everything just by keeping an eye on the daily workload. Good MI gives you the means to count everything that happens – that means you have visibility and the means to control what goes on.

I find that the introduction of good MI across the L&D function can save them around a third of their total cost of learning, because it gives them the ability to spot inefficiency and waste, and then work to drive it down. Here are the main areas where MI enables changes to happen:

Make sure learning effort is well prioritised (i.e. aligned to business goals). MI that shows you precisely what spend is going into which learning will show you how good you are at targeting learning on the key needs. You can spot any large investments in low-priority, low-impact learning, which will help you decide how to prevent that in future.

Reduce penalty costs. The cost of no-shows can be 5% of your external spend; good data helps you spot the trends: higher no-shows on a Monday perhaps; repeat offenders; learners booked a long way in advance who forgot or left the organisation. Spotting the trends tells you how to minimise the waste.
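The weekday trend-spotting described above amounts to a simple grouped rate. A Python sketch with invented booking records:

```python
from collections import Counter

# Invented booking records: (weekday, attended) - illustrative only.
bookings = [
    ("Mon", False), ("Mon", False), ("Mon", True),
    ("Tue", True), ("Tue", True),
    ("Wed", False), ("Wed", True),
]

booked = Counter(day for day, _ in bookings)
missed = Counter(day for day, attended in bookings if not attended)
no_show_rate = {day: missed[day] / booked[day] for day in booked}
# A consistently high Monday rate would suggest chasing Monday
# bookings with extra reminders.
```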

Maximise your ILT event occupancy. Make best use of your trainer cost by filling the room. Fill rate analysis shows you how well you are doing – many organisations only fill 60% of their training places, so use the MI to get smarter:

  • don’t schedule more events than you really need;
  • cancel low-fill events in advance, before penalties become due;
  • work out your trainer utilisation in terms of people trained rather than just events delivered.
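Fill rate itself is a simple ratio of places filled to places available, aggregated over events. A minimal Python sketch with hypothetical event data:

```python
def fill_rate(filled, available):
    """ILT occupancy: fraction of available training places actually filled."""
    return filled / available

# Invented quarter of events: (places filled, room capacity) per event.
events = [(8, 12), (10, 12), (5, 12), (12, 12)]
overall = fill_rate(sum(f for f, _ in events), sum(c for _, c in events))
# Results hovering around 0.6 would flag room to consolidate events.
```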

If ILT courses end up running too infrequently, then think about redesigning them into a more flexible delivery mode. Lots of companies have shifted induction training from a traditional classroom event on the first Monday each month, to e-learning, webinar and social media delivery. The results are much better.

Analyse your happy sheet data. Paper feedback sheets are next to useless if you are trying to manage learning for several thousand employees. Capture all your course feedback online, and you can quickly generate data to tell you how your courses are being received. This way you can quickly spot the good and the bad: trainers, courses, suppliers, rooms. And make decisions accordingly.

Sort out your curriculum. Learning activity data that is reliable and comprehensive lets you see who’s doing what. As a rule, I find 90% of an organisation’s learning sits with the top 10% of learning content, yet the curriculum is usually choked with thousands of items that have not been used for 2+ years. Let the data tell you what’s being used, then add anything else you expect to need in the foreseeable future, then discard the rest.

Most curricula need around 500 items: make those easy for learners to find, don’t spend time and money maintaining any more!
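The pruning rule above, i.e. let the usage data tell you what is stale, can be sketched as a date filter. The catalogue entries and the two-year window below are purely illustrative:

```python
from datetime import date, timedelta

# Invented catalogue: item -> last date any learner used it.
last_used = {
    "Induction essentials": date(2016, 3, 1),
    "Lotus Notes basics": date(2012, 6, 15),
    "Presentation skills": date(2015, 11, 2),
}

def stale_items(last_used, today, years=2):
    """Items not used within the last `years` years - pruning candidates."""
    cutoff = today - timedelta(days=365 * years)
    return [item for item, when in last_used.items() if when < cutoff]

candidates = stale_items(last_used, today=date(2016, 5, 1))
```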


What's stopping you?


Your organisation’s IT infrastructure. One global organisation said it took seven years to establish a joined-up suite of enterprise-wide HR systems. Certainly it’s a big investment, but having up-to-date technology is fast becoming an important factor for workforce productivity, and skills development is just one area of benefit.

Great systems but poor implementation. There are market leading learning systems out there, but the benefit realisation depends heavily on how well they are implemented, especially in the area of MI and reporting. Out-of-the-box reporting is basic and always needs some customising to address the organisation’s needs. Also, data quality depends on well-designed processes, system customisation to suit and a degree of discipline on the part of those responsible for data entry.

Disparate systems. Even without enterprise-wide systems, you can build data interfaces between your different learning technologies, perhaps having several data sources feeding into a single database, with a Business Intelligence tool to conduct the analysis. You’ll need to be smart, so the different interfaces populate the database with consistent data, but this is one way of getting better (if not perfect) MI.

The need for learning impact data. If learning impact is what you need, then the data from learning systems will not get you far. You’ll need to look outside learning, to HR and beyond, for business data (and here you may start all over again!).

Getting your masses of learning data into shape isn’t straightforward, but once you crack it, there are big savings to be had.


Looking at learning evaluations in a runner's world

15:00:00 Learning Boffins 0 Comments



From a previous post you may have seen that in April I decided to run 2 half marathons. Leading up to this point I was using a TNA (training needs analysis) approach to make sure my training was targeted, effective and, most importantly, going to achieve what I wanted at both of the half marathon events. In the previous post, 'How to train for a half marathon in a learning consultant's world', I discussed how this approach to training is just as important in workplace learning.

Having now had a break, I have spent this time reflecting: identifying what went well, what was key in the training to enhancing my performance, what I did differently because of the targeted training, and what I think I can improve on for future events.

As I am the only person involved in my running evaluation, it is relatively straightforward. I can pick out exactly what worked and identify what impacted the speed of my half marathons:
- A combination of both fast 'sprint' work and long lengthy runs
- A focus on core strength
- Not consistently feeling the need to run the full distance until race day

What didn't work well:
- In both races I didn't properly prepare in the starting pens
- I was constantly checking the watch (throughout training and on race day) and panicking each time, even when I knew I was on or above target

Overall this is a brief and quick, rather than thorough, evaluation, but it is something I can take away and use to inform how I conduct training going forward, and it also recognises the key areas that led to performance improvement. I achieved my goal, as I got my sub-2-hour half marathon, woo!!

On a larger scale this is something that we Learning Boffins look at within workplace training. A common question for L&D professionals is - what difference has the training made?

Looking at evaluations in the learning world is more complex and difficult to do. Not only is there normally a much larger audience size, there is also a multitude of factors and measures involved.

We have experience of applying a methodology to an organisation's training programme that we know can prove the business impact of learning. The flexible methodology adapts to, and enhances, the current level of evaluation you have. It doesn't require us to have benchmarking data, yet we can still enable clear links to be drawn between the learning delivered and the resulting levels of performance improvement and, where possible, the ROI. The methodology we use is underpinned by the work of Kirkpatrick, Phillips and Brinkerhoff and can draw on a wide range of data sources and anecdotal evidence. We use the diagram below to show the objectives of our methodology. Through questioning techniques and qualitative analysis we aim to recognise the learner journey from the learning layer, to the performance layer and through to the business benefit layer:


The key benefit of this is to protect and justify L&D's value, so we are finding more conversations around the methodology, because if L&D don't start to prove the value of learning it's likely to be taken away! Talk to us if you want to discuss how learning evaluations can support your learning function, or if you have tips to run a sub-1hr45min I would also be keen to hear :)

