GenAI adoption is all about people, not about tools. Pharma giant Novo Nordisk offers a great case study in working out what supports useful uptake of AI across a large organization. A case study in MIT Sloan Management Review uncovers a range of useful lessons. Here are some of the most interesting.

🚀 Recognize a mid-cycle drop as normal. Novo Nordisk grew Copilot use from a few hundred to 20,000 users in just over a year, with 23% becoming frequent users within one month. By month three or four, however, 15% of early adopters had dropped off and average time saved per week declined. Recognizing this dip as natural helped avoid panic and kept the focus on re-engagement strategies rather than on getting staff to try the tools for the first time.

🛠 Deliver function-specific training through champion networks. Generic AI onboarding failed to meet the needs of specialized roles. Novo Nordisk succeeded by creating domain-specific training, leveraging internal champions to contextualize AI use, and allowing teams to shape guidance based on their actual work. This addressed "AI shaming" and bridged confidence gaps across functions.

🤝 Use internal champions to overcome cultural resistance. Skepticism wasn't solved by policy; it was shifted by influence. Novo Nordisk identified trusted, high-status employees to openly adopt and advocate for AI tools. Their visible endorsement encouraged hesitant peers to try AI without fear of judgment or failure.

📈 Treat adoption as a change process, not a tech rollout. Rather than pushing a one-time launch, Novo Nordisk framed GenAI as a long-term transformation. This meant investing in ongoing communication, support structures, and iterative learning. The approach acknowledged that adoption would ebb and flow, and prepared the organization to adapt accordingly.

🎯 Emphasize strategic value over time saved. Though average users saved about 2 hours per week, the most meaningful wins came from higher-quality work: more strategic thinking, clearer writing, and better planning. By highlighting these human-centric gains, Novo Nordisk built a stronger case for AI's workplace relevance beyond mere productivity.

📊 Use employee data to shape the deployment strategy. Over 3,000 employee surveys and interviews helped Novo Nordisk spot where and why adoption lagged. This feedback guided real-time adjustments, such as where to invest in new use cases, where to scale back, and how to tailor messaging. It also surfaced which functions had become tool-reliant versus those needing more support.
Implementing a Learning Management System
-
Change management when a company replaces its existing product with a new one! This transition is not about the software; it is about the people, processes, and mindset. When employees are accustomed to a particular system, they develop workflows around it, which makes switching to a new one challenging. The key to successful adoption lies in a structured approach: Awareness, Training, Support, and Feedback.

1️⃣ Awareness: Users must understand why the change is happening. Communicate the benefits, whether it's efficiency, cost savings, or compliance.

2️⃣ Training ensures users are comfortable with the new system before they fully migrate. This involves hands-on sessions, quick reference guides, and scenario-based learning.

3️⃣ Support: No matter how intuitive a system is, users will face challenges. A dedicated support structure (live chat, email assistance, etc.) ensures a smooth transition.

4️⃣ Feedback loops are essential. Gathering user concerns and addressing them refines the process and increases user confidence.

The goal should be to make users feel empowered, not burdened, by the shift.

#changemanagement #shipsandshipping #maritimeindustry #management #training
-
📈 Unlocking the True Impact of L&D: Beyond Engagement Metrics 🚀

I am honored to once again be asked by the LinkedIn Talent Blog to weigh in on this important question. To truly measure the impact of learning and development (L&D), we need to go beyond traditional engagement metrics and look at tangible business outcomes.

🌟 Internal Mobility: Track how many employees advance to new roles or get promoted after participating in L&D programs. This shows that our initiatives are effectively preparing talent for future leadership.

📚 Upskilling in Action: Evaluate performance reviews, project outcomes, and the speed at which employees integrate their new knowledge into their work. Practical application is a strong indicator of training's effectiveness.

🔄 Retention Rates: Compare retention between employees who engage in L&D and those who don't. A higher retention rate among L&D participants suggests our programs are enhancing job satisfaction and loyalty.

💼 Business Performance: Link L&D to specific business performance indicators like sales growth, customer satisfaction, and innovation rates. Demonstrating a connection between employee development and these outcomes shows the direct value L&D brings to the organization.

By focusing on these metrics, we can provide a comprehensive view of how L&D drives business success beyond just engagement. 🌟

🔗 Link to the blog along with insights from other incredible L&D thought leaders (list of thought leaders below): https://lnkd.in/efne_USa

What other innovative ways have you found effective in measuring the impact of L&D in your organization? Share your thoughts below! 👇

Laura Hilgers Naphtali Bryant, M.A. Lori Niles-Hofmann Terri Horton, EdD, MBA, MA, SHRM-CP, PHR Christopher Lind
-
L&D and the business talk about the same problems...in different languages.

Both want better performance and growth. They just explain it in different ways. L&D often talks about programs and learning goals. Business leaders listen for results and outcomes. When those don't match, the message is lost.

This isn't because leaders dislike learning. It's because the value is unclear. They can't see the impact.

Speaking business language does not weaken L&D. It helps others understand the work. It connects learning to real decisions. That's why this vocabulary comparison matters. It helps L&D explain value more clearly. And get taken seriously.

Here is the same program explained two ways. First in L&D language. Then in business language.

L&D language: "We launched a leadership learning program for first-time managers. The program covers goal setting, feedback, and check-ins. Learners complete short modules and practice activities over six weeks. Content is delivered asynchronously. Success is measured through completion rates and surveys."

Business language: "We launched a performance initiative for first-time managers. The initiative sets clear standards for managing goals and feedback. Managers apply these standards in live team and one-on-one meetings. The format avoids time away from work over six weeks. Success is measured by faster time to productivity and fewer escalations."
-
The system worked. The transition failed.

Cloud is live. Code is bug-free. Data migrated successfully. Project status: Complete.

Six weeks later, teams are back in spreadsheets. Adoption rate: 15%.

McKinsey 2024: 70% of digital transformations fail to meet objectives. In 85% of those failures, the technology worked perfectly.

Here's what the radar chart reveals:
Technical System Readiness: 98%
Leadership Role-Modeling: 35%
Shared Meaning & Buy-In: 27%
Skills & Behavioral Mastery: 22%
Incentive & KPI Alignment: 18%

The budget imbalance mirrors this perfectly. 90% allocated to systems. 10% to people. Yet 70% of ROI depends on adoption.

Four mechanisms guarantee failure:

❌ The Hypocrisy Gap
↳ Only 1 in 3 leaders change their habits
↳ CEO asks for the old spreadsheet once, and the transition dies

❌ The Training Fallacy
↳ Most users reach basic awareness, then stop there
↳ Only 20% achieve mastery
↳ The rest build workarounds

❌ The Structural Sabotage
↳ New system launched
↳ Bonuses tied to old behaviors
↳ People choose the bonus every time

❌ The Engagement Exodus
↳ 70% of staff feel change is "done to them"
↳ Not "for them" or "with them"
↳ Resistance becomes their identity

The 48-hour test predicts everything. If leadership modeling sits below 50%, teams revert to shadow processes within 48 hours of launch.

Then the pattern completes: The system gets labeled "broken." The transition gets ignored. The change lead gets fired.

Document this before your next launch:
↳ Leadership modeling score (target: 70%+)
↳ Incentive alignment assessment (currently 18%)
↳ User engagement in the design process
↳ Behavioral mastery milestones beyond training

Your technology budget was never the problem. Your people budget was.

--------
🔔 Follow Justin R. for more Transformation insights
♻️ Share with someone launching a system next quarter
-
If you only have time to measure ONE thing in L&D, stop tracking learner satisfaction. Stop with completion rates. Start measuring Manager Support.

Why? Because research shows that the work environment is the single biggest predictor of whether learning actually sticks.

• Satisfaction ≠ Application. A "5-star" workshop rating doesn't mean a single behavior changed back at the desk.
• The System Always Wins. As Geary Rummler said, "Pit a good performer against a bad system, and the system will win every time."
• Managers are the "Parachute." Without a manager to provide feedback, space, and resources, the learner is going it alone.

If we want to move from an "order-taker" to a strategic partner, we need to change our metrics. Ask:
✅ Did the manager set goals before the session?
✅ Did they provide time to practice after?
✅ Is the new behavior actually being rewarded?

Stop valuing activity. Start valuing the support that drives behavior.

#LearningTransfer #MultiplyTransfer #LAndD #PerformanceConsulting #FutureOfWork #Management
-
Most change initiatives are measured by one number: Adoption.

Did people start using the new system? Did they attend the training? Did they log in?

But just because something was adopted doesn't mean the change worked. Adoption tells you if people used it. It doesn't tell you how well they're using it or whether it made anything better.

To really measure change success, you need to go deeper:

– Is behavior actually different? Are people making decisions in a new way? Are old habits starting to fade?
– Is performance improving? Has the change helped teams deliver better results, faster service, fewer errors, or stronger collaboration?
– Is the change sustainable? Are people still using the new way of working 3, 6, 12 months later, or did things quietly go back to how they were?
– Do people understand why the change matters? Real change sticks when people connect it to their purpose, not just their process.

Success isn't just about launch day. It's about what happens after, when the excitement fades and the real work begins.
-
Demonstrating the value of learning is easier than you think! In a recent workshop with The Institute for Transfer Effectiveness, I demonstrated how!

One workshop participant was designing safety training to help employees use Microsoft 365 strategically to prevent data breaches. She was struggling to capture the value of the program in terms organizational leaders would understand. I used an alignment framework that incorporates Rob Brinkerhoff's 6 L&D value propositions and mapped out how to connect her learning program with metrics that matter to organizational leaders. Here's what that looked like!

Aligning learning activities, initiatives, or programs to strategic business outcomes is like looking for the through line between disparate things: learning, human performance, departmental key performance indicators, and organizational metrics. This can feel nearly impossible. The glue that holds these seemingly disparate things together is Brinkerhoff's 6 L&D value propositions.

In the safety training example, we started by identifying the most relevant value proposition for the program. In this case, it was Regulatory Requirements: a learning program designed to ensure employees are complying with industry-specific rules and regulations.

Then we connected the L&D value proposition (Regulatory Requirements) with the most relevant outcome for the organization. In this case, it was Net Profit. If employees comply with industry-specific rules and regulations, this consistent practice will save the organization money in fines, lawsuits, or dealing with the unpleasant consequences of safety challenges (like a data breach).

Then we did the hard work of unpacking what people will be doing to support the targeted departmental KPIs. If you're struggling to figure out the KPIs, you'll likely find them by asking department leaders what problem they are experiencing on a regular basis that they would like solved. In this case, it was too many data breaches and too many outdated files on the server causing misinformation and inconsistent practices.

I discovered that what people could be doing differently to support the desired KPIs was adhering to updated protocols on how to manage data and documents within the 365 suite. If people followed the protocols with 100% fidelity, departments would experience a reduction in data breaches.

Now we have the behaviors to target in our training program and the data to use to show the value of learning:
Learning metrics: Training attendance and completion rates.
Capability metrics: Percentage of fidelity to data and document protocols before and after training.
KPI metrics: # of outdated documents on the server (target: 20% or lower), # of data breaches per department (target: 1 or fewer annually).
Organizational metric: Net Profit.

How will you use the 6 L&D value propositions and alignment framework to tell your learning value story?

#learninganddevelopment #trainingstrategy #datastrategy
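The capability and KPI metrics above reduce to simple threshold checks per department. A minimal sketch of how that tracking might look, assuming the targets from the post (20% or fewer outdated files, 1 or fewer breaches annually); the department names and figures are hypothetical illustrations, not data from the workshop:

```python
# Hypothetical per-department figures collected after the training rollout.
departments = {
    "Finance":   {"outdated_pct": 15.0, "breaches_per_year": 0},
    "Logistics": {"outdated_pct": 28.0, "breaches_per_year": 1},
    "HR":        {"outdated_pct": 12.0, "breaches_per_year": 2},
}

def meets_kpis(metrics: dict) -> bool:
    """KPI targets from the post: outdated files at 20% or lower,
    and 1 or fewer data breaches per department annually."""
    return metrics["outdated_pct"] <= 20.0 and metrics["breaches_per_year"] <= 1

for name, metrics in departments.items():
    status = "on target" if meets_kpis(metrics) else "needs support"
    print(f"{name}: {status}")
```

Reporting which departments sit outside the targets, rather than raw completion counts, is what lets the learning story connect upward to the Net Profit outcome.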
-
𝗢𝘃𝗲𝗿𝗰𝗼𝗺𝗶𝗻𝗴 𝗥𝗲𝘀𝗶𝘀𝘁𝗮𝗻𝗰𝗲 𝘁𝗼 𝗖𝗵𝗮𝗻𝗴𝗲 𝗶𝗻 𝗟𝗲𝗮𝗿𝗻𝗶𝗻𝗴 𝗮𝗻𝗱 𝗗𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁 🌟

Facing resistance to new learning initiatives or changes in training methods? You're not alone. Resistance from employees and managers can be a significant roadblock, stalling progress and hindering the successful adoption of new skills and technologies. Ignoring this resistance can be costly. It can prevent your organization from staying competitive and adaptable in a fast-evolving business landscape. Here's how to effectively tackle this issue:

📌 Engage Stakeholders Early: Involve employees and managers in the planning phase of new learning initiatives. Seek their input and feedback to make them feel part of the change process. This reduces resistance as they begin to see the change as something they helped shape.

📌 Communicate the Benefits Clearly: Clearly articulate the benefits of the new training methods. Explain how these changes will improve their job performance, career growth, and the organization's overall success. Use real-world examples and success stories to illustrate the positive impact.

📌 Provide Continuous Support: Offer ongoing support throughout the change process. This includes training sessions, Q&A forums, and one-on-one coaching. Ensure that employees know where to seek help and feel supported as they transition to the new methods.

📌 Address Concerns Openly: Create an open dialogue where employees can voice their concerns and questions. Address these concerns transparently and provide solutions or adjustments when possible. Acknowledging and addressing fears can ease the transition.

📌 Leverage Change Champions: Identify and empower change champions within your organization. These individuals can advocate for the new initiatives, share their positive experiences, and encourage their peers to embrace the change.

📌 Monitor and Celebrate Progress: Track the progress of the new initiatives and celebrate milestones and successes. Recognizing and rewarding employees for their adaptability and participation can boost morale and reinforce positive behavior.

📌 Provide Practical Training: Ensure that the new training methods are practical and relevant to the employees' roles. Hands-on, relatable content can make the learning process more engaging and less daunting.

📌 Use a Phased Approach: Implement changes in phases rather than all at once. This gradual approach allows employees to adapt at a manageable pace and reduces the overwhelm that can accompany significant changes.

By engaging stakeholders early, communicating benefits clearly, and providing robust support, you can overcome resistance and pave the way for successful learning and development initiatives.

Got more strategies for overcoming resistance to change in L&D? Share them below! ⬇️

#ChangeManagement #LearningAndDevelopment #EmployeeEngagement #TrainingInnovation #OrganizationalGrowth #LeadershipDevelopment
-
Most L&D teams struggle with their tech platforms having weak reporting and analytics. But what's in the platform to report on is only part of the story. The other part requires something else.

Right now, your LMS should be able to tell you:
- Who logged in
- What they searched for
- What they selected
- How far they got before clicking away
- Who's completed their compliance training

It's all very rudimentary. It tells a story of engagement, interest, and mandatory responsibilities, but it offers no picture of actual development, growth, or improvement. So not a lot to shout about, and very little to change stakeholders' minds and gain any further influence.

At a time when our roles feel precarious, we are struggling to weave these baseline metrics into a compelling story of impact. We have the 'anecdata' from our conversations with stakeholders and our lived evidence that we're doing good work, but this seems separate from the story we're reporting on.

Your reporting and analytics seem weak because engagement in content and programs matters very little if we don't understand, and get close to, the people and performance problems that are hurting our organisation, specific teams, workforce productivity, and employee ambitions.

To tell a story of value, we need a performance-first methodology. We need to move the focus away from 'who showed up' towards a quantifiable narrative that matters:

1. What is the Problem? Stop looking at the platform and look at the business friction. What is the specific challenge? Is it a lag in sales productivity? A spike in technical errors? Define the 'before' state in numbers. If you don't understand the problem, you have no context for the data.

2. What did L&D do? This is where your reporting gains a pulse. You stop reporting on 'enrollments' and start reporting on targeted initiatives. Instead of saying "100 people took the course," your story becomes:
- The Cohort: We identified the 40 managers whose teams had the highest friction scores
- The Context: We mapped and validated the skills required for their actual role and facilitated peer-coaching and real-world application tasks based on their specific live challenges, supported by bespoke digital resources created with subject matter experts
- The Mechanism: We tracked how they moved from theory in the classroom and the LMS to practice in the workflow

Your platform data now acts as a digital footprint for part of the journey of a specific group being equipped to move a specific needle.

3. What changed? Link the platform activity to the shift in the business metric.
- The Challenge: [Metric] was lagging
- The Action: L&D deployed [Solution] to [Target Group]
- The Result: [Metric] improved by [X%]

Weak reporting and analytics are a symptom of a deeper issue: we are measuring the 'solution' before we've defined the problem. When we lead with performance analysis, the data actually has a story to tell.
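The "[Metric] improved by [X%]" result is just before/after arithmetic on the business metric you defined in step 1. A minimal sketch of that calculation; the metric and numbers here are hypothetical placeholders, not figures from the post:

```python
def percent_change(before: float, after: float) -> float:
    """Relative change in a business metric, as a percentage of the 'before' state."""
    return (after - before) / before * 100

# Hypothetical example: average deal cycle time (in days) for the
# targeted cohort, measured before and after the L&D initiative.
before_days = 32.0
after_days = 24.0

change = percent_change(before_days, after_days)
# Cycle time fell, so the change is negative; report the magnitude as the improvement.
print(f"Cycle time improved by {abs(change):.0f}%")
```

The point of defining the 'before' state in numbers first is exactly this: without `before_days`, there is no denominator and no X%.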