Did you know that response rates to evaluations are at 23% and declining? Did you know that only 22% of evaluation data is usable, and that only 5% (yes, 5%!) is actually analyzed for performance improvement? The modern learner has limited time to learn, let alone evaluate their learning. As a result, learning managers must adapt to this new environment.

To lighten the load, start by shedding evaluation questions that are duplicative, not meaningful, or don't lead to any improvement. In most cases an evaluation can go from more than 20 questions to fewer than 10. The evaluator continues to capture insights on quality, learning, job impact and business impact, on both end-of-experience evaluations and on-the-job evaluations. We call these lighter yet more articulate evaluations 'pulse evaluations.'

The benefit of pulse evaluations is that they align to methodology but are much easier to complete on mobile devices and for learners who simply don't have much time to evaluate their learning. Further, instead of analyzing 5% of the data, as regularly occurs with traditional learning evaluations, you'll use 100% of it.

Next, let's lighten the reporting load. If it takes multiple reports to understand learning performance, too much resource is being expended on reporting rather than improvement. Simpler, more concise reporting narrows things down to 3 (not 30) reports, each focused on a specific audience so that audience can quickly glean insight.

First is the executive (CLO/VP), who needs an aggregate dashboard or scorecard: all of the data distilled into a single set of KPIs against relevant points of reference. Second is the program manager. This person needs to do exploratory analysis, comparing subsets of data by respondent demographics (e.g., region, business unit) and experience attributes (e.g., modality, provider).
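As a hypothetical sketch of the kind of comparison a program manager might run, the snippet below averages pulse-evaluation scores by region and modality and buckets each cell into a red/yellow/green status. The field names, 1-5 scale, and thresholds are illustrative assumptions, not a specific product's actual scheme.

```python
# Illustrative sketch: average pulse-evaluation scores per (region, modality)
# cell, then color each cell red/yellow/green. Thresholds are assumptions.
from collections import defaultdict

def ryg_status(avg_score, green_at=4.0, yellow_at=3.0):
    """Map an average score (1-5 scale assumed) to a traffic-light status."""
    if avg_score >= green_at:
        return "green"
    if avg_score >= yellow_at:
        return "yellow"
    return "red"

def cross_tab(responses):
    """Group scores by (region, modality), then average and color each cell."""
    cells = defaultdict(list)
    for r in responses:
        cells[(r["region"], r["modality"])].append(r["score"])
    return {
        key: (sum(scores) / len(scores), ryg_status(sum(scores) / len(scores)))
        for key, scores in cells.items()
    }

# Toy data for demonstration only
responses = [
    {"region": "EMEA", "modality": "classroom", "score": 4.5},
    {"region": "EMEA", "modality": "digital", "score": 2.5},
    {"region": "APAC", "modality": "digital", "score": 3.5},
]
print(cross_tab(responses))
```

In practice a reporting tool would render this as a colored grid, but the underlying logic is just this grouping and thresholding.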
This data is typically analyzed in a cross-tab, red-yellow-green format, making it simple yet useful for the program manager to visualize patterns. Finally, there is the tactician, such as the instructor or content curator. They want to look at specific experiences (digital, classroom, etc.) and understand the details of these instances, mining the verbatim comments and individual questions for insight.

Reporting is not just about simplicity; it is also about using technology to find intelligence. Modern tools use AI to mine volumes of verbatim comments and tell you whether the sentiment is positive, negative or neutral. That saves time and lets machines perform the first-pass analysis.

Finally, lighter learning evaluation is not simply about reporting less information in fewer reports. It is about migrating from a reporting mindset to an improvement mindset. The evaluation model evolves to focus on action plans, with workflow analysis to kick off and complete the corrective actions needed to increase quality or optimize impact.

In the end, modern learners need modern evaluations: lighter evaluations that remain credible yet improve the learner experience, and lighter reporting that focuses on key stakeholders, emphasizing improvement and optimization over compliance and reporting.

The migration to lighter evaluation is a process and a journey. It is one worth starting, because the world will continue to evolve toward it anyway.

About Performitiv

Performitiv is a performance improvement technology company. Our solution measures and improves programs, processes and partnerships for lines of business. The system serves as a centralized platform for key performance indicators (KPIs) derived from any data. We have transformed tactical tasks into an automated, strategic process that improves quality, impact, compliance and value.