Discover proven leadership training evaluation questions that reveal authentic learning outcomes and behavioural change. Includes 100+ examples by category.
Written by Laura Bouttell • Tue 25th November 2025
Leadership training evaluation questions are targeted enquiries designed to assess the effectiveness of leadership development programmes by measuring participant reactions, knowledge acquisition, behavioural application, and organisational impact. The right questions transform vague impressions into actionable intelligence that drives programme refinement and demonstrates return on investment. Research from TalentLMS indicates that organisations using structured evaluation questions see 41% higher knowledge retention rates compared to those relying on generic feedback requests.
When crafted thoughtfully, evaluation questions serve dual purposes: they gather diagnostic data whilst simultaneously prompting participants to reflect on and commit to applying what they've learned.
The typical post-training survey reads like a politeness ritual rather than a diagnostic instrument. "Did you enjoy the session?" "Would you recommend this to colleagues?" "Please rate the facilitator." These questions measure superficial satisfaction whilst ignoring the elements that actually predict behavioural change and business impact.
Consider the implications of this disconnect. A leadership programme might receive glowing satisfaction ratings because the facilitator told entertaining stories and served excellent refreshments—yet participants leave unable to articulate a single concept they'll apply differently on Monday morning. Conversely, the most transformative development experiences often create initial discomfort as participants confront uncomfortable truths about their leadership patterns.
Effective evaluation questions penetrate surface-level reactions to probe knowledge acquisition, implementation intentions, anticipated barriers, and environmental support factors. They acknowledge that learning leadership resembles learning to play chess—initial exposure provides rules and patterns, but competence emerges only through deliberate practice and reflection.
The strategic value extends beyond programme improvement. Well-designed questions create what psychologists call "retrieval practice"—the act of answering thoughtful questions about content actually strengthens memory and understanding. Your evaluation instrument becomes a learning tool, not merely an assessment mechanism.
Reaction-level questions assess whether participants found the training engaging, relevant, and valuable whilst the experience remains vivid in memory. These questions establish baseline satisfaction and identify logistical or delivery issues that undermine learning.
Programme Quality Questions:
- "On a scale of 1-10, how would you rate the overall quality of this leadership training?" (quantitative)
- "What aspect of the programme provided the most value to you personally?" (qualitative)
- "If you could change one element of this training, what would it be?" (improvement-focused)

Relevance and Applicability Questions:
- "How relevant was this content to your current leadership challenges?" (1-5 scale: Not relevant to Extremely relevant)
- "Which specific topics addressed your most pressing needs as a leader?" (open-ended)
- "What critical leadership topic did this programme not adequately address?" (gap analysis)
The final question proves particularly diagnostic. Participants often identify blind spots invisible to programme designers who work from theoretical frameworks rather than frontline reality.
Facilitator Effectiveness Questions:
- "The facilitator demonstrated deep subject matter expertise." (Strongly disagree to Strongly agree)
- "The facilitator created an environment where I felt comfortable sharing challenges." (1-5 scale)
- "What could have made the facilitator more effective?" (constructive feedback)
According to research highlighted by LearnUpon, these immediate reaction questions should be administered during the final 15 minutes of training whilst impressions remain fresh, optimising response rates and data quality.
Learning-level questions move beyond satisfaction to assess actual competency development and knowledge transfer. These questions reveal whether participants can articulate, apply, and integrate new concepts.
Knowledge Demonstration Questions:
- "Describe the key difference between transactional and transformational leadership in your own words." (concept articulation)
- "Given this scenario [provide specific situation], which leadership approach would be most effective and why?" (application)
- "Rate your confidence in applying the GROW coaching model with your team members." (1-5 scale: Not confident to Very confident)

Comparative Understanding Questions:
- "Before this training, I could effectively facilitate difficult conversations." (Pre-training rating)
- "After this training, I can effectively facilitate difficult conversations." (Post-training rating)
- "What specific technique or framework increased your capability most significantly?" (qualitative insight)
Research from Whatfix demonstrates that 83% of high-performing organisations use pre-and-post comparison questions to quantify knowledge gains, providing objective evidence of learning rather than subjective impressions.
Self-Efficacy Assessment Questions:
- "I feel equipped to handle resistance when implementing change." (Strongly disagree to Strongly agree)
- "I can identify when my communication style might be undermining my message." (confidence rating)
- "I understand how to adapt my leadership approach based on team member readiness levels." (comprehension check)
These self-efficacy questions predict implementation likelihood. Research shows that participants who report high confidence in specific competencies are 3.2 times more likely to attempt application within 30 days.
Behaviour-level questions, administered 30-180 days post-training, assess whether participants have actually changed how they lead. These questions separate training attendance from genuine development.
Implementation Questions (30-Day Follow-Up):
- "I have actively applied concepts from the leadership training in my role." (Never to Regularly)
- "List three specific instances where you used frameworks or techniques from the training." (concrete examples)
- "What prevented you from applying certain concepts you intended to use?" (barrier identification)

Behavioural Change Questions (60-90 Day Follow-Up):
- "My team members have noticed positive changes in my leadership approach." (Strongly disagree to Strongly agree)
- "Describe specific feedback you've received from team members about your changed behaviour." (external validation)
- "How has your approach to delegation evolved since completing the training?" (comparative reflection)
According to the Leapsome research on leadership survey questions, asking for specific examples rather than general ratings dramatically improves data validity—people are poor judges of their own behaviour in the abstract, but quite accurate when describing concrete situations.
Sustainability Questions (120-180 Day Follow-Up):
- "I continue to regularly use at least three frameworks from the training programme." (yes/no/partially)
- "Which concepts have become integrated into your natural leadership style?" (habit formation)
- "What support would help you sustain and deepen your development?" (ongoing needs)
Results-level questions connect leadership development to business outcomes that executives care about—team performance, retention, innovation, and productivity.
Team Performance Questions:
- "Team engagement scores in my area have improved since I completed leadership training." (yes/no/uncertain)
- "What specific team performance metrics have shifted since you applied new leadership approaches?" (quantifiable impact)
- "How has the quality of decision-making in your team evolved?" (qualitative improvement)

Talent Development Questions:
- "I have identified and developed successors or high-potential team members." (Strongly disagree to Strongly agree)
- "How many team members have been promoted or taken on expanded responsibilities?" (quantitative metric)
- "What percentage of open positions in your team have been filled by internal candidates?" (pipeline strength)
Research from Indeed's analysis of 100 post-training evaluation questions reveals that organisations measuring business results see 2.7 times higher leadership development ROI compared to those stopping at satisfaction or knowledge assessment.
Innovation and Initiative Questions:
- "Team members have proposed or implemented innovative improvements." (frequency rating)
- "How many new initiatives or projects has your team launched?" (quantitative data)
- "Describe one business challenge your team has solved differently due to your evolved leadership approach." (concrete impact)
For new manager programmes:

Pre-Training Baseline:
- "What concerns you most about stepping into a leadership role?" (fear/anxiety identification)
- "Rate your current confidence in having difficult conversations." (1-10 scale)
- "What leadership behaviours have you observed that you want to emulate or avoid?" (role model analysis)

Immediate Post-Training:
- "I understand the key differences between being an individual contributor and a people manager." (comprehension)
- "Which transition challenge does this training best equip you to handle?" (readiness assessment)
- "What might prevent you from implementing delegation approaches taught in this programme?" (barrier forecasting)

90-Day Follow-Up:
- "How many one-to-one meetings have you conducted with each team member?" (activity tracking)
- "Describe a difficult conversation you've navigated using frameworks from the training." (application evidence)
- "What aspect of people leadership still feels most uncomfortable?" (ongoing development needs)
For executive leadership programmes:

Immediate Post-Training:
- "How will you apply strategic thinking frameworks to your most pressing business challenge?" (application planning)
- "The training addressed the unique complexities of executive leadership." (relevance rating)
- "What insight surprised you most about your current leadership approach?" (self-awareness)

60-90 Day Follow-Up:
- "I have modified my approach to organisational strategy based on concepts from this programme." (strategic impact)
- "How has your executive team's decision-making process evolved?" (systemic change)
- "What percentage of your calendar now aligns with strategic priorities versus tactical issues?" (time allocation shift)
Research from ELM Learning on training evaluation questions emphasises that executive programmes require longer evaluation windows—strategic changes manifest over quarters, not weeks.
For coaching and team leadership programmes:

Immediate Post-Training:
- "I can articulate the four stages of team development." (knowledge check)
- "Rate your confidence using the GROW coaching model in real conversations." (1-10 scale)
- "Which team dynamic challenge does this training best equip you to address?" (targeted application)

30-60 Day Follow-Up:
- "How many coaching conversations have you conducted using frameworks from the training?" (frequency)
- "My team members report feeling more heard and supported." (perceived impact)
- "Describe one situation where coaching approaches yielded better outcomes than directive leadership." (comparative evidence)

90-Day Follow-Up:
- "Team members increasingly solve problems independently rather than escalating to me." (empowerment metric)
- "What percentage of your leadership interactions are now coaching-oriented versus directive?" (style shift)
- "How has team psychological safety evolved based on observable behaviours?" (culture indicator)
Organisational Culture Questions:
- "My organisational culture encourages the leadership behaviours taught in this programme." (1-5 scale)
- "What cultural norms might conflict with applying these leadership approaches?" (barrier identification)
- "Senior leaders model the behaviours emphasised in this training." (role model presence)
Research from Risely's training evaluation survey guide demonstrates that low scores on cultural alignment questions predict implementation failure regardless of programme quality—a critical diagnostic signal requiring systemic intervention.
Manager Support Questions:
- "My manager will actively support me in implementing new leadership approaches." (Strongly disagree to Strongly agree)
- "I have discussed my development goals from this training with my manager." (yes/no)
- "What specific support from your manager would most accelerate your development?" (need identification)

Resource Availability Questions:
- "I have sufficient time to practice and implement new leadership approaches." (adequacy rating)
- "What resources or tools would help you apply concepts from this training?" (support needs)
- "The competing priorities I face will prevent me from focusing on leadership development." (constraint acknowledgment)
These environmental support questions often reveal inconvenient truths—that organisational rhetoric about leadership development contradicts actual workload expectations and reward systems.
Behavioural Commitment Questions:
- "I will conduct weekly one-to-one meetings with each direct report for the next eight weeks." (specific commitment)
- "I commit to soliciting feedback from my team on my listening skills within the next fortnight." (accountability creation)
- "I will apply the delegation framework to at least two tasks by [specific date]." (implementation intention)
Research on implementation intentions shows that stating specific behavioural commitments increases follow-through likelihood by 42% compared to general intentions like "I'll try to delegate more."
Reflection and Application Planning Questions:
- "Describe one leadership behaviour you will stop, one you will start, and one you will continue." (start/stop/continue framework)
- "What specific team member will you practice coaching approaches with first, and why?" (targeted application)
- "When will you review your progress on implementing concepts from this training?" (self-monitoring plan)

Barrier Anticipation Questions:
- "What obstacle will most likely prevent you from applying these concepts?" (forecasting)
- "How will you address this barrier when it emerges?" (contingency planning)
- "Who can you enlist as an accountability partner for your development?" (support network)
Weak question: "Was the content relevant?"

Strong question: "Which three frameworks from this programme address your most pressing leadership challenges, and how do you plan to apply each one?"
The weak question invites a binary yes/no response requiring no reflection. The strong question demands engagement—participants must identify specific challenges, match concepts to those challenges, and articulate application plans.
Effective evaluations draw on several question formats, each suited to a different diagnostic purpose:
Rating Scale Questions provide quantifiable, aggregatable data essential for trend analysis:
- "Rate your confidence applying situational leadership concepts." (1-10 scale)
- "How likely are you to recommend this programme to peer leaders?" (Net Promoter Score: 0-10)
Use consistent scales throughout your evaluation. According to SurveySparrow's analysis of 100+ post-training survey questions, mixing 5-point, 7-point, and 10-point scales within one survey creates cognitive friction and reduces response quality.
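As a concrete illustration of the arithmetic behind the Net Promoter question, the conventional NPS calculation treats 9-10 ratings as promoters, 0-6 as detractors, and ignores passives (7-8); a minimal sketch:

```python
def nps(scores):
    """Net Promoter Score from 0-10 ratings:
    % promoters (9-10) minus % detractors (0-6); passives (7-8) count
    in the total but in neither group."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

print(nps([10, 9, 8, 7, 6, 3]))  # 2 promoters, 2 detractors of 6 responses -> 0
```

Because the score is a difference of percentages, it ranges from -100 (all detractors) to +100 (all promoters), which is why a single cohort's number is best read against your own prior cohorts rather than an external benchmark.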
Open-Ended Questions capture nuanced insights that numerical ratings miss:
- "What leadership behaviour will you change based on this training?"
- "Describe a recent situation where these concepts would have helped you achieve a better outcome."
- "What barrier might prevent you from applying what you've learned?"
Whilst harder to analyse, these questions often surface the most valuable insights. One financial services firm discovered through open-ended feedback that participants struggled to apply conflict resolution frameworks because their organisational culture punished visible disagreement—a systemic issue no rating scale would reveal.
Comparative Questions quantify change over time:
- "Before this training, I understood how to provide developmental feedback." (rating)
- "After this training, I understand how to provide developmental feedback." (rating)
- "Three months later, I regularly provide developmental feedback using learned frameworks." (behavioural follow-up)
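Once matched pre/post ratings are collected, quantifying the change is simple arithmetic; a minimal sketch, assuming both waves use the same scale and are paired per participant (the function name is illustrative, not from any survey tool):

```python
def mean_gain(pre, post):
    """Average per-participant shift on matched pre/post ratings (same scale)."""
    if len(pre) != len(post):
        raise ValueError("pre and post must be paired per participant")
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

# Three participants rating "I can provide developmental feedback"
# before and after the programme (1-10 scale):
print(mean_gain(pre=[3, 4, 5], post=[7, 6, 8]))  # (4 + 2 + 3) / 3 = 3.0
```

Pairing matters: averaging the two waves separately and subtracting hides who moved, whereas per-participant differences also let you spot the minority who regressed.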
Scenario-Based Questions assess applied knowledge:
- "Your team member consistently misses deadlines. Using concepts from the training, describe your approach to addressing this situation."
- "You're leading a change initiative facing resistance. Which frameworks would you apply, and in what sequence?"
These questions reveal whether participants can transfer concepts from controlled training environments to messy organisational reality.
Mistake: "How satisfied were you with the training?"
This question measures nothing actionable. High satisfaction might reflect entertaining facilitation, comfortable venues, or convenient scheduling—none of which correlate with learning or application.
Solution: Replace generic satisfaction questions with specific diagnostic enquiries:
- "Which session provided concepts you can immediately apply?" (value identification)
- "What would have made this training more applicable to your role?" (gap analysis)
- "How confident do you feel implementing each of the five frameworks covered?" (competency rating)
Mistake: "This training significantly improved your leadership capabilities, didn't it?"
Leading questions bias responses and undermine data validity. Participants feel pressure to confirm suggested responses rather than provide honest assessment.
Solution: Use neutral phrasing that permits negative responses:
- "How has this training affected your leadership capabilities?" (open-ended neutral)
- "Rate the extent to which this training improved your leadership capabilities." (scaled neutral)
Mistake: Asking behaviour-change questions immediately post-training, or asking reaction questions six months later.
Each Kirkpatrick level has an optimal evaluation window. Asking behaviour questions too early captures intentions rather than actions; asking too late risks attribution problems.
Solution: Align questions with appropriate timeframes:
- Immediately post-training: Reaction and learning questions
- 30-60 days: Initial implementation and early behavioural change
- 90-180 days: Sustained behaviour change and team impact
- 6-12 months: Organisational results and ROI metrics
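These windows are straightforward to operationalise in whatever scheduling tool you use; a small sketch, with window lengths taken from the timeframes above and wave names of our own invention:

```python
from datetime import date, timedelta

# Days after training end at which each evaluation wave opens
# (start of each window above; names are illustrative).
WINDOWS = {
    "reaction_and_learning": 0,
    "initial_implementation": 30,
    "sustained_behaviour_change": 90,
    "organisational_results": 180,
}

def evaluation_dates(training_end):
    """Map each evaluation wave to its earliest send date."""
    return {wave: training_end + timedelta(days=d) for wave, d in WINDOWS.items()}

print(evaluation_dates(date(2025, 1, 1))["initial_implementation"])  # 2025-01-31
```

Anchoring every wave to the training end date, rather than scheduling each follow-up ad hoc, is what makes cross-cohort comparisons valid: every cohort is measured at the same distance from the programme.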
Mistake: Comprehensive 50-question evaluations covering every conceivable dimension.
Research from Connecteam on training evaluation questions reveals that response rates drop 14% for every five questions beyond the tenth, and later questions receive predominantly neutral ratings—evidence of respondent fatigue.
Solution: Ruthlessly prioritise. Include only questions directly tied to decisions: "If participants rate this low, we will specifically change X." Questions without attached decisions waste time.
For immediate post-training: 10-15 questions maximum.
For follow-up evaluations: 12-20 questions maximum.
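If you treat the fatigue figure reported above as a relative 14% drop per full block of five questions beyond the tenth (our interpretation; the research does not specify relative versus absolute), you can turn it into a rough planning heuristic:

```python
def projected_response_rate(base_rate, n_questions):
    """Discount an expected response rate by an assumed relative 14% for
    every full block of five questions beyond the tenth."""
    extra_blocks = max(0, n_questions - 10) // 5
    return base_rate * (0.86 ** extra_blocks)

print(round(projected_response_rate(0.80, 25), 3))  # three extra blocks: 0.509
```

On that reading, a 25-question survey that would otherwise get 80% completion projects to roughly half, which is the quantitative case for the ruthless prioritisation urged above.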
Mistake: Requiring names on all evaluations without explanation, or making all feedback anonymous by default.
Solution: Offer choice with context. Explain how you'll use identified data ("to follow up on your feedback and track your development over time") and make identification optional. Research shows that when participants trust the process, 68% voluntarily provide identifying information.
Overall Programme Quality:
1. "Overall, how would you rate this leadership training?" (1-10 scale)
2. "How likely are you to recommend this programme to peer leaders?" (Net Promoter: 0-10)
3. "What aspect of the programme provided the greatest value?"
4. "If you could change one element, what would it be?"

Content Relevance:
5. "How relevant was this content to your current role?" (Not relevant to Extremely relevant)
6. "Which topic addressed your most pressing leadership challenge?"
7. "What critical leadership topic did this programme not adequately address?"
8. "Rate the balance between theory and practical application." (Too theoretical to Too practical)

Delivery and Facilitation:
9. "The facilitator demonstrated deep subject matter expertise." (Strongly disagree to Strongly agree)
10. "The facilitator created an engaging learning environment." (1-5 scale)
11. "The pacing of content was appropriate." (Too slow to Too fast)
12. "What could have made the facilitator more effective?"

Materials and Environment:
13. "The programme materials will serve as useful reference resources." (1-5 scale)
14. "The learning environment was conducive to engagement and participation." (rating)
15. "The programme duration was appropriate for covering this content." (Too short to Too long)
Knowledge Acquisition:
16. "I can clearly articulate the key concepts covered in this training." (Strongly disagree to Strongly agree)
17. "Describe the difference between coaching and mentoring in your own words."
18. "Rate your understanding of situational leadership before and after this programme." (before: 1-10, after: 1-10)
19. "Which framework or model do you understand least clearly?"

Comprehension and Application:
20. "I can identify situations where each leadership style would be most effective." (confidence rating)
21. "Given this scenario [provide situation], which approach would you use and why?"
22. "I understand how to adapt these concepts to my unique team context." (1-5 scale)
23. "What question or concept still confuses you?"

Confidence and Self-Efficacy:
24. "How confident do you feel applying the GROW coaching model?" (Not confident to Very confident)
25. "I can facilitate difficult conversations using frameworks from this training." (1-5 scale)
26. "Rate your confidence in each of these areas: delegation, feedback, coaching, conflict resolution." (1-5 scale for each)
27. "What skill area requires additional development or practice?"
30-Day Implementation:
28. "I have actively applied concepts from the training." (Never to Regularly)
29. "List three specific instances where you used frameworks from the programme."
30. "How many coaching conversations have you conducted using the GROW model?"
31. "What prevented you from applying certain concepts you intended to use?"

60-90 Day Behaviour Change:
32. "My team members have noticed positive changes in my leadership approach." (Strongly disagree to Strongly agree)
33. "Describe specific feedback you've received about your changed behaviour."
34. "How has your approach to delegation evolved?"
35. "What leadership behaviour have you stopped based on insights from the training?"

Manager and Peer Observation:
36. "My manager has observed positive changes in my leadership effectiveness." (yes/no/uncertain)
37. "Colleagues have commented on shifts in my leadership style." (frequency rating)
38. "Describe a situation where you handled something differently than you would have before the training."

Sustainability and Habit Formation:
39. "Which concepts have become integrated into your natural leadership style?"
40. "I continue to reference materials from the training." (Never to Regularly)
41. "What support would help you sustain your development?"
Team Performance:
42. "Team engagement scores in my area have improved." (yes/no/uncertain)
43. "What specific team performance metrics have shifted positively?"
44. "How has the quality of team decision-making evolved?"
45. "Team productivity has increased since I applied new leadership approaches." (Strongly disagree to Strongly agree)

Talent Development:
46. "How many team members have been promoted or taken on expanded responsibilities?"
47. "I have identified and am developing successors or high-potential talent." (1-5 scale)
48. "What percentage of open positions have been filled by internal candidates?"
49. "Team members report stronger career development support." (measured through team surveys)

Innovation and Initiative:
50. "Team members have proposed or implemented innovative improvements." (frequency)
51. "How many new initiatives has your team launched?"
52. "Describe one business challenge your team now solves differently."

Retention and Engagement:
53. "Voluntary turnover in my team has decreased." (yes/no/uncertain)
54. "Team members report feeling more valued and heard." (evidence)
55. "How has psychological safety in your team evolved?"
Organisational Culture:
56. "My organisation's culture supports the leadership behaviours taught in this programme." (1-5 scale)
57. "What cultural norms might conflict with applying these approaches?"
58. "Senior leaders model the behaviours emphasised in this training." (observation rating)

Manager Support:
59. "My manager actively supports my leadership development." (Strongly disagree to Strongly agree)
60. "I have discussed development goals from this training with my manager." (yes/no)
61. "What specific manager support would accelerate your development?"

Resources and Constraints:
62. "I have sufficient time to practice and implement new approaches." (adequacy rating)
63. "What resources or tools would help you apply these concepts?"
64. "Competing priorities prevent me from focusing on leadership development." (Strongly disagree to Strongly agree)
Behavioural Commitments:
65. "I will conduct weekly one-to-ones with each direct report for the next eight weeks." (commit: yes/no)
66. "I commit to soliciting feedback on my listening skills within two weeks." (commit: yes/no)
67. "I will apply the delegation framework to at least two tasks by [date]." (commit: yes/no)

Application Planning:
68. "Describe one leadership behaviour you will stop, one you'll start, and one you'll continue."
69. "Which team member will you practice coaching approaches with first, and why?"
70. "When will you review your progress on implementing concepts from this training?"

Barrier Anticipation:
71. "What obstacle will most likely prevent you from applying these concepts?"
72. "How will you address this barrier when it emerges?"
73. "Who can serve as an accountability partner for your development?"
For Diversity and Inclusion Leadership:
74. "I can identify unconscious bias in decision-making processes." (confidence rating)
75. "How will you modify your recruitment approach based on this training?"
76. "What inclusive leadership behaviour will you implement within the next month?"

For Change Leadership:
77. "I understand the psychological stages of responding to change." (comprehension)
78. "How will you apply change management frameworks to your current initiative?"
79. "What specific stakeholder resistance strategy will you implement?"

For Strategic Leadership:
80. "I can differentiate between strategic and operational priorities." (confidence)
81. "What percentage of your calendar now aligns with strategic versus tactical work?"
82. "How will you modify your team's approach to planning and priority-setting?"
How many questions should a post-training evaluation include?
Aim for 10-15 questions for immediate post-training evaluations, completable in 5-7 minutes. Follow-up evaluations can extend to 15-20 questions. Research from TalentLMS shows response rates drop 14% for every five additional questions beyond the tenth, and later questions receive increasingly neutral ratings due to respondent fatigue. Prioritise ruthlessly—include only questions directly tied to programme decisions. Each question should answer: "If participants rate this low, we will specifically change X."
When should I ask different types of evaluation questions?
Align questions with Kirkpatrick's four levels across appropriate timeframes:
- Immediately post-training (final 15 minutes of session): Reaction and learning questions measuring satisfaction, relevance, and comprehension.
- 30-60 days post-training: Initial behaviour application questions assessing implementation attempts and early barriers.
- 90-180 days: Sustained behavioural change and team impact questions measuring habit formation and observable results.
- 6-12 months: Organisational results questions connecting development to business metrics like retention, promotion rates, and team performance.
Should evaluation questions be anonymous or require identification?
This depends on your purpose and organisational culture. Anonymous evaluations typically generate more candid feedback about facilitators, content weaknesses, or systemic barriers. However, anonymity prevents tracking individual development over time or following up on specific feedback. A balanced approach: make identifying information optional, clearly explaining how you'll use identifiable data (e.g., "to support your continued development and follow up on your feedback"). Research shows 68% of participants voluntarily identify themselves when they trust the evaluation process and see demonstrated commitment to acting on feedback.
What's the difference between satisfaction questions and learning questions?
Satisfaction questions (Level 1) measure whether participants found the experience engaging, relevant, and valuable—essentially, "Did they like it?" Learning questions (Level 2) assess whether participants actually acquired knowledge and skills—"Can they do or explain something they couldn't before?" High satisfaction with low learning suggests entertaining but ineffective training. Low satisfaction with high learning might indicate challenging content that productively disrupted comfortable assumptions. Effective evaluations measure both, recognising they serve different diagnostic purposes and don't always correlate.
How do I write questions that predict actual behavioural change?
Include questions assessing three critical predictors: Environmental support ("My manager will support implementing these approaches," "My culture encourages these behaviours"), Self-efficacy ("I feel confident applying X framework," "I can adapt these concepts to my context"), and Implementation intentions ("I will conduct weekly one-to-ones for eight weeks," "I commit to requesting feedback by [date]"). Research demonstrates that these three factors—supportive environment, confidence, and specific commitments—predict 73% of variance in implementation success. Low scores on any dimension indicate need for intervention before participants leave the programme.
What questions should I ask managers of training participants?
Manager input provides external validation of behavioural change. Effective manager questions include: "I have observed specific behavioural changes in [participant] since they completed leadership training" (yes/no/uncertain), "Describe behaviours you've observed" (specificity), "These changes have positively impacted team performance" (1-5 scale), "What team improvements do you attribute to the participant's development?" (concrete impact), and "What support does this leader need to sustain development?" (ongoing needs). Administer manager surveys 60-90 days post-training to allow sufficient observation time whilst maintaining attribution clarity.
How do I analyse open-ended evaluation responses efficiently?
Use systematic thematic analysis: Read all responses for overall impressions, identify recurring concepts mentioned by multiple participants, code responses by grouping similar themes, quantify theme frequency to prioritise, and extract verbatim quotes illustrating key themes. Modern text analysis tools like Dovetail, Insight7, or even ChatGPT accelerate pattern recognition, though human judgment remains essential for nuanced interpretation. Focus on actionable insights—if 23 of 40 participants mention "difficulty getting manager buy-in," that's a systemic issue requiring intervention, not merely a curriculum adjustment.
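The quantify-theme-frequency step reduces to a simple frequency count once responses have been coded; a minimal sketch, with theme tags and response data invented for illustration:

```python
from collections import Counter

# Hypothetical output of the coding pass: each participant's open-ended
# answer, tagged with one or more themes.
coded_responses = [
    ["manager buy-in", "time pressure"],
    ["manager buy-in"],
    ["unclear frameworks"],
    ["manager buy-in", "time pressure"],
]

# Count how many tagged responses mention each theme, most frequent first.
theme_counts = Counter(t for tags in coded_responses for t in tags)
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} of {len(coded_responses)} respondents")
```

Reporting frequency alongside the total (3 of 4 rather than a bare 3) keeps the prioritisation honest: a theme raised by most of a small cohort warrants different action from one raised by a handful in a large one.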
The ultimate purpose of leadership training evaluation questions extends far beyond measuring programme effectiveness or justifying training budgets. When crafted with care, these questions serve as learning instruments in their own right—prompting reflection, strengthening memory, surfacing barriers, and catalysing commitment to change.
Consider the neurological reality: retrieving information strengthens neural pathways more effectively than passive review. When you ask a participant to describe how they'll apply delegation frameworks to a specific team member, you're not merely gathering data—you're deepening their understanding and increasing implementation likelihood.
The organisations achieving genuine returns on leadership development investments share one characteristic: they ask better questions. Not more questions, not longer questions, but questions that matter. Questions that illuminate gaps between intention and action. Questions that surface systemic barriers masquerading as individual shortcomings. Questions that transform training attendance into sustained development.
Start with the end in mind. Before designing your evaluation, ask yourself: "What decisions will these responses inform?" If a question doesn't connect to a specific action—modify content, change facilitators, add post-training support, address cultural barriers—eliminate it.
The question facing you isn't whether leadership training works. It's whether you're asking the questions that reveal what actually matters.