I'm currently putting together a more formal training syllabus/plan for the testers on our team. On-the-job coaching has served us pretty well so far (helped by the fact that everyone on the team is a really motivated self-learner, and yes, this does make for an amazing work environment, thanks), but I'd like to make sure we don't miss out on opportunities for external training that might bring in new ideas. I'd also like the plan to be a resource for other roles that still need to develop some degree of testing skill.
I'm not looking for a laundry list of courses, or course recommendations (I already have many more great courses than we could ever afford to send people on), but more of an idea of:
a) How you went about putting together the dept training plan. Consulting my colleagues about what they'd like to see is a no-brainer - but any other advice?
b) How you broke down the skills list involved - what areas did you cover? What wasn't on your list, and why? What factors did you consider? What kind of mix of hard and soft skills did you go for, and how was that influenced by the makeup of your team? Did you mainly focus on skills to develop, or knowledge to gain?
c) What really really hasn't worked for you in the past?
I designed and developed training for MS for about 10 years. The most successful approach I found was to
Perform a business needs analysis - many businesses have required competencies or skills for people in different job functions at different stages in their careers. Even smaller companies likely have a set of expectations and expected levels of proficiency for job roles, or product/technology 'expertise.' Also, interview people already doing the job and ask what they needed to know six months ago to be more effective in their job. Also consider the future direction of the business (e.g. what skills/knowledge are required to grow the business).
Perform a skills gap analysis - Assess the current level of skills, competencies, etc. What skills do people have today, and how proficient are they with them? What do they know about x? (For example, we were primarily targeting CS grads for tester roles, and we learned from interviews with new hires, college boards, etc. that testing techniques and approaches are barely covered in most programs. We also learned that while most new testers knew the general concept of techniques such as equivalence partitioning of data, few were competent enough to decompose complex data sets into equivalent sub-sets for a given task.) (Another example might be internal tool training, processes, etc.)
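To make the equivalence partitioning point concrete, here is a minimal sketch in Python. The validator and its 18-65 range are entirely hypothetical, invented for illustration; the idea is that the input domain decomposes into classes the code treats identically, so one representative per class (plus the boundaries) covers the behavior:

```python
# Hypothetical validator for an "age" form field: valid range is 18-65.
def is_eligible(age: int) -> bool:
    return 18 <= age <= 65

# Equivalence partitioning: the integer domain splits into three classes,
# and the validator behaves the same way for every value within a class.
partitions = {
    "below_range": [-1, 0, 17],   # all expected to be rejected
    "in_range":    [18, 40, 65],  # all expected to be accepted
    "above_range": [66, 120],     # all expected to be rejected
}

# One representative per class would suffice for coverage; boundary
# values (17, 18, 65, 66) are the usual picks. Here we verify that each
# class really is equivalent: all members produce the same result.
for name, values in partitions.items():
    results = {is_eligible(v) for v in values}
    assert len(results) == 1, f"partition {name} is not equivalent!"
```

The skill gap the answer describes is exactly the step before this code: given a complex real-world input (dates, strings, nested records), working out what the classes and boundaries actually are.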
Design targeted training sessions - that address the skills/knowledge gap.
For teaching new concepts I would structure each lesson as follows:
5-10 minute analogy introducing concept (ideally with something they can relate to)
5-10 minute demonstration of practice/skill
20-30 minute practical exercise
15-20 minute discussion of lessons learned and new ideas (This is the part where students move from 'application' to 'analysis' in Bloom's taxonomy.)
A few additional thoughts:
Don't spend a lot of time in the classroom on theory; provide additional material to explain the theory
Keep lessons short, no more than 1 hour spent on a specific topic/concept
Separate 'soft-skills' from more 'technical' skills
Make it hands-on; many people learn a new skill by doing (anyone can read how to rebuild a carburetor, but it is quite different to actually do it successfully)
Encourage collaboration/teamwork on finding solutions to problems