Children spend just 190 days of the year in school. That leaves 175 days where learning is shaped elsewhere. Yet our education policy debates often treat schools as though they operate in isolation. Attainment gaps, literacy development, attendance, behaviour and aspiration are not solely the product of what happens between 9am and 3pm. Positive outcomes are influenced by:
• home stability
• access to books
• youth provision
• community safety
• parental confidence
• cultural capital
• enrichment opportunities
If we are serious about raising standards and narrowing gaps, policy cannot stop at the school gates. With so little time spent in school, we need to be innovative about how, when and where we educate our young people. Funding decisions around youth hubs, libraries, early years support, family services and community provision are not peripheral to education policy; they are central to it. We cannot demand that schools compensate for structural disadvantage in 190 days a year while reducing the infrastructure that supports children in the other 175. Education reform must move beyond classroom reform. Outcomes are shaped by ecosystems, not institutions alone.
Data-Driven Education Insights
Explore top LinkedIn content from expert professionals.
-
I am pleased to share our new publication, "Navigating centralized admissions: The role of parental preferences in school segregation in Chile," recently published in the International Journal of Educational Research (co-authored with Macarena Kutscher). https://lnkd.in/gAyJ8iTR

The question we investigated: Why doesn't equal access lead to equal outcomes in school choice?

In 2015, Chile enacted the Ley de Inclusión, eliminating school screening practices: no more entrance exams, parent interviews, or income verification. Every family gained equal access through a centralized, algorithm-based system. Key objective: reduce school segregation.

The result: Recent evidence by Kutscher and Urzua found minimal impact on integration. Our paper confirms and extends these findings. We analyzed 133,000+ prekindergarten applications to understand why equal access hasn't translated into more integrated schools. By examining families' rank-ordered school choices using discrete choice models, we uncovered systematic differences in how low-SES families navigate school selection.

Key findings: Low-income families systematically choose different schools, not because of barriers, but due to distinct preferences:
🔹 They prioritize safety, climate, and belonging over test scores
🔹 They're significantly less likely to apply to high-SES schools
🔹 They strongly favor schools with fewer violent incidents and lower discrimination
🔹 They avoid previously selective schools, even when entitled to fee waivers
🔹 Distance matters far more: they're much less willing to travel

The deeper story: Disadvantaged families seek schools where their children will feel welcomed and safe. They rely on observable signals: student behavior, familiar environments, community connections. These choices reflect legitimate concerns about belonging, but may also reflect information gaps about school quality.

What this means for policy: Simply removing barriers isn't enough.
Effective centralized choice systems need:
✓ Comprehensive information on both academic quality AND school climate
✓ Clear data on safety, inclusiveness, and well-being
✓ Better platform design: parents often spend only minutes applying
✓ Personalized guidance, not just generic rankings
✓ Explicit explanation of how matching algorithms work

The opportunity: Pioneering work by Jishnu Das and colleagues in Pakistan and Chris Neilson and colleagues in Chile demonstrated that targeted information interventions can dramatically improve parental choices. We've replicated these approaches in Haiti, Ecuador, and Peru with similar findings. We're now testing these insights on choice platforms in Recife, Brazil, with promising early results.

The welfare gains from improving school access for disadvantaged students are substantial. This research points toward specific design features that could help centralized choice systems deliver on their promise of integration.
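For readers curious about the mechanics: analyzing rank-ordered school choices is commonly done with an "exploded" (rank-ordered) logit, which treats a family's full ranking as a sequence of first choices from shrinking choice sets. The sketch below is purely illustrative and not the paper's actual specification; the school attributes, their scaling, and the preference coefficients are all hypothetical.

```python
import numpy as np

# Hypothetical school attributes per row: [test_score, safety, distance_km]
schools = np.array([
    [0.9, 0.4, 2.0],
    [0.5, 0.8, 0.5],
    [0.7, 0.6, 1.2],
])

def rank_ordered_logit_loglik(beta, X, ranking):
    """Exploded-logit log-likelihood of one family's ranking.

    The ranking is decomposed into sequential choices: the top-ranked
    school is chosen from all schools, the second from the remainder,
    and so on, each step contributing a standard logit probability.
    """
    utilities = X @ beta
    loglik = 0.0
    remaining = list(range(len(X)))
    for school in ranking:
        u = utilities[remaining]
        loglik += utilities[school] - np.log(np.exp(u).sum())
        remaining.remove(school)
    return loglik

# A family that weighs safety and proximity more than test scores
beta = np.array([0.2, 1.5, -0.8])
print(rank_ordered_logit_loglik(beta, schools, [1, 2, 0]))
```

In estimation, the coefficients would be found by maximizing this likelihood summed over families, which is how systematic differences in preferences (e.g., the weight on safety versus test scores) across SES groups can be quantified.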
-
Hey #highered leaders - if you're still using static pivot tables to inform strategy, this post is for you ⤵

Take a peek at the screenshot below. This example, which shows two "paired predictors", is just one way you can turn data into action: 📈

▶ The top right quadrant are “high achievers”. They have a high GPA + high credit earn ratio. These students might simply receive a message of encouragement.
▶ The top left quadrant are “strivers”. They have lower GPAs, but higher credits earned. These students might receive a nudge related to maximizing their use of available academic resources.
▶ The bottom right quadrant are “setbacks”. They have a higher overall GPA, likely from good grades in their early coursework, but are earning fewer credits towards graduation requirements in key courses in their major. These students should probably receive messaging about the need for high-touch interaction with their advisors to stay on track and not lose their early momentum.
▶ The students in the bottom left quadrant are in "survival mode”. They are below average in both areas. These students are probably due for some real human-to-human conversation to better understand their needs. They may need in-depth intervention, with accompanying supports for finding the most successful path towards goals that match the students’ strengths and interests. You may consider nudging and re-nudging them throughout a term. ⤵

There are so many more examples of how Civitas Learning partners are disaggregating data to close equity gaps. If you're curious to learn more, let's connect 💌 #studentsuccessanalytics
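The paired-predictor quadrants described above amount to a simple two-threshold segmentation. Here is a minimal sketch of that idea; the cutoff values and labels are hypothetical illustrations, not Civitas Learning's actual model.

```python
def classify_quadrant(gpa, credit_ratio, gpa_cutoff=3.0, ratio_cutoff=0.8):
    """Assign a student to one of the four quadrants described above.

    x-axis: GPA; y-axis: credit earn ratio. Cutoffs are illustrative.
    """
    if gpa >= gpa_cutoff and credit_ratio >= ratio_cutoff:
        return "high achiever"   # message of encouragement
    if gpa < gpa_cutoff and credit_ratio >= ratio_cutoff:
        return "striver"         # nudge toward academic resources
    if gpa >= gpa_cutoff and credit_ratio < ratio_cutoff:
        return "setback"         # high-touch advisor outreach
    return "survival mode"       # in-depth human conversation

print(classify_quadrant(3.6, 0.9))  # high achiever
```

In practice each label would map to a different outreach campaign, and the cutoffs would be set from institutional baselines rather than hard-coded.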
-
How can we bridge the gap between academia and policymaking to create more effective public policies? This report provides actionable recommendations on improving academic-policy engagement.

Key recommendations include:
🔶 Proactive Support: Universities and policy institutes should actively provide information and resources to aid academic engagement with policymaking. Effective signposting to these resources is also essential.
🔶 Recognition in Academic Frameworks: Institutions need to acknowledge policy engagement within workload models and career progression frameworks. This is frequently highlighted during research impact training and is crucial to get right.
🔶 Tailored Guidance: Policymakers should create specific resources for academics to navigate policy engagement opportunities.
🔶 Addressing Geographic Disparities: Mechanisms should be developed to increase engagement with universities outside London and the South East.
🔶 Sustained Engagement: Continuous interactions between policymakers and academics should be facilitated, considering the workload implications.
🔶 Case Studies and Transparency: Publicly accessible case studies of successful academic-policy engagements and transparent use of research evidence are essential.

There is wide agreement that engagement between academia and policymakers is a positive step, but it can be challenging to implement. Ensuring that research effectively informs decision-making is key.

#AcademicEngagement #PolicyMaking #ResearchImpact #HigherEducation #PublicPolicy #KnowledgeExchange #EvidenceBasedPolicy #AcademicResearch
-
In schools today, we’re surrounded by a plethora of data - from assessments and observations to a variety of dashboards and feedback loops. But data only matters if it informs what we do next. That’s why here at American International School of Guangzhou we’ve developed the 𝐅𝐀𝐂𝐓𝐒 𝐃𝐚𝐭𝐚 𝐏𝐫𝐨𝐭𝐨𝐜𝐨𝐥 – a structured process designed to help teams move from data collection to meaningful action.

FACTS guides us to:
🔎 𝐅𝐨𝐜𝐮𝐬 on the data that matters most
📊 𝐀𝐧𝐚𝐥𝐲𝐳𝐞 insights and gaps
🎉 𝐂𝐞𝐥𝐞𝐛𝐫𝐚𝐭𝐞 successes and positive trends
🎯 𝐓𝐚𝐫𝐠𝐞𝐭 strategies and interventions
🚀 Define clear 𝐒𝐭𝐞𝐩𝐬 for action and accountability

We’ve recently rolled this out with faculty, middle leaders, senior leadership - as well as with our Operations Team. All with the goal of shifting the way we talk about and act on data across the whole school.

A strong data protocol matters because it:
* 𝐄𝐧𝐬𝐮𝐫𝐞𝐬 𝐮𝐧𝐢𝐟𝐨𝐫𝐦𝐢𝐭𝐲 – establishing consistent guidelines and vocabulary, keeping coherence across departments and educators.
* 𝐂𝐮𝐥𝐭𝐢𝐯𝐚𝐭𝐞𝐬 𝐭𝐞𝐚𝐦𝐰𝐨𝐫𝐤 – giving staff a shared approach that elevates teaching and learning collaboratively.
* 𝐅𝐚𝐜𝐢𝐥𝐢𝐭𝐚𝐭𝐞𝐬 𝐢𝐧𝐟𝐨𝐫𝐦𝐞𝐝 𝐜𝐡𝐨𝐢𝐜𝐞𝐬 – empowering decision-makers to rely on dependable data and implement strategies that truly improve student learning.

Just as importantly, a protocol helps us 𝐝𝐞𝐟𝐢𝐧𝐞 𝐭𝐡𝐞 𝐯𝐚𝐥𝐮𝐞 𝐨𝐟 𝐝𝐚𝐭𝐚 itself: Does it suit our needs? Are there important data points missing? Can we find a way to access them? Having vast amounts of data is one thing - having useful data is another. A protocol like FACTS ensures we make that distinction quickly and clearly.

We’ve also dedicated significant time to our school improvement plan: our 𝐓𝐫𝐚𝐧𝐬𝐟𝐨𝐫𝐦𝐚𝐭𝐢𝐯𝐞 𝐋𝐞𝐚𝐫𝐧𝐢𝐧𝐠 𝐅𝐫𝐚𝐦𝐞𝐰𝐨𝐫𝐤. FACTS helps us implement, monitor, and analyse its impact with greater clarity. By running the full protocol, we ensure every data dive is structured, organised, and results in actionable steps - not just endless exploration.
And beyond the walls of our classrooms and offices, data also helps us 𝐞𝐧𝐠𝐚𝐠𝐞 𝐨𝐮𝐫 𝐰𝐢𝐝𝐞𝐫 𝐜𝐨𝐦𝐦𝐮𝐧𝐢𝐭𝐲 - celebrating successes, building trust, and showing the impact of our collective efforts. Ultimately, regardless of the protocol you use, the true value lies in the cycle itself - structured, collaborative, and action-driven. It’s this cycle that turns information into impact, ensuring data is never for its own sake, but always driving improvement, strengthening our community, and helping every student thrive.
-
A few years ago, I worked with an online education platform facing challenges with student engagement. While they had a significant number of users enrolling in courses, they struggled with low participation rates in course discussions and activities, leading to a decline in course completion rates. The platform needed to identify the causes behind low engagement and implement strategies to encourage more active participation.

Improving Student Engagement Using Data Analytics

1️⃣ Analyzing Engagement Data
We began by analyzing user interaction data, focusing on metrics such as time spent on the platform, participation in discussions, video completion rates, and quiz scores. Using SQL, we aggregated the data to identify patterns and pinpoint where students were losing interest.

SELECT
    student_id,
    course_id,
    AVG(time_spent) AS avg_time_spent,
    COUNT(discussion_post_id) AS posts_made,
    AVG(quiz_score) AS avg_quiz_score
FROM student_activity
GROUP BY student_id, course_id;

🔹 Insight: We identified that students who interacted with course discussions and quizzes had higher completion rates, while others dropped off quickly.

2️⃣ Building a Predictive Model
We then created a predictive model to determine which students were at risk of disengaging based on their activity patterns. The model incorporated features such as time spent on the platform, participation in discussions, and progress through the course material.

# Pseudocode for the predictive model: train on historical
# engagement data, then score current students
def predict_student_engagement(historical_data, current_students):
    model = train_engagement_model(historical_data)
    return model.predict(current_students)

🔹 Insight: This model helped us flag students who were likely to disengage early, allowing for timely interventions.
3️⃣ Implementing Engagement Strategies
Based on insights from the model, we implemented strategies such as sending personalized emails with reminders, offering incentives for completing activities, and increasing interaction opportunities through live Q&A sessions.

# Pseudocode for engagement follow-up: only students the model
# flags as at-risk receive a reminder
def send_engagement_reminder(student_data):
    if model.predict(student_data) == 'at_risk':
        send_email_reminder(student_data)

🔹 Insight: Personalized engagement and incentives led to an increase in student participation.

Challenges Faced
- Identifying meaningful engagement metrics that were predictive of success.
- Finding the right balance so that students were engaged without being overwhelmed.

Business Impact
✔ Student engagement improved, leading to higher completion rates.
✔ Retention rates increased, as more students continued with courses.
✔ Revenue grew, driven by more active and satisfied students.

Key Takeaway: By analyzing user activity and leveraging predictive analytics, businesses can identify disengaged customers early and implement strategies to improve engagement and retention.
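Putting steps 2 and 3 together, here is one runnable way the pseudocode above could be realized. This is a sketch under assumptions: the feature columns mirror the SQL query (avg_time_spent, posts_made, avg_quiz_score), the training data is invented, and a scikit-learn logistic regression stands in for whatever model the team actually used.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical rows: [avg_time_spent, posts_made, avg_quiz_score]
X_hist = np.array([
    [45, 12, 88], [50, 9, 91], [40, 7, 80],
    [10, 0, 40], [8, 1, 35], [5, 0, 30],
])
y_hist = np.array([0, 0, 0, 1, 1, 1])  # 1 = later disengaged

# Train on historical outcomes, then score current students
model = LogisticRegression(max_iter=1000).fit(X_hist, y_hist)

def flag_at_risk(student_features, threshold=0.5):
    """Return True when predicted disengagement probability crosses threshold."""
    prob = model.predict_proba([student_features])[0, 1]
    return bool(prob >= threshold)

# A student with low activity and low scores would trigger a reminder
print(flag_at_risk([7, 0, 33]))
```

The threshold is a tuning knob: lowering it catches more at-risk students at the cost of more false-positive reminders, which connects directly to the "engaging without overwhelming" challenge noted above.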
-
Yesterday’s Office of the Auditor General of Canada / Bureau du vérificateur général du Canada report on Canada’s International Student Program highlights a critical reality: stronger system supports for international students are not optional; they are essential.

This is not simply about volumes or growth. It points to deeper challenges in accountability, coordination, and the integrity of information. When responsibility for ensuring clear, accurate information is diffused, the result is confusion and increased vulnerability for international students. We welcome the Auditor General’s findings and value the opportunity WES had to contribute to the review.

Recently, WES released a seven‑part research series to examine systemic challenges in the information network that international students rely on. Our report highlights key issues such as information gaps, inconsistent policy messaging, unregulated agents, and the growing impact of AI-driven misinformation, while outlining practical, solutions-focused recommendations for policymakers, intermediaries, and post-secondary institutions.

International students play a pivotal role in shaping Canada’s global education standing and reinforce our reputation as a welcoming destination - but only if student voice and protection are treated as core policy outcomes, not afterthoughts.

#IRCC #GAC #AGreport #WESResearch #InternationalStudents #IntlStudents #CdnPoli #CdnImm

Read WES’s Trust Through Information series: https://lnkd.in/erxx_byQ
-
For years, classrooms divided students into “visual,” “auditory,” or “kinesthetic” learners. It sounded intuitive. It also turned out to be wrong. Cognitive science has been clear for a while now: there’s no evidence that teaching to “learning styles” improves learning outcomes.

So what does work? Evidence-based learning strategies backed by decades of research. Here’s the shift that modern educators are making 👇

→ Retrieval Practice — Students remember more when they recall information, not when they re-read it. Low-stakes quizzes and brain dumps beat endless highlighting.
→ Spaced Repetition — Revisiting material over time cements memory. Forget cramming. Learn, forget a bit, then relearn.
→ Interleaving — Mix up topics and problem types. It builds flexible understanding instead of rote familiarity.
→ Dual Coding — Combine words and visuals to deepen comprehension. Diagrams + explanations = stronger mental models.
→ Elaboration — Ask “how” and “why.” Connecting new ideas to existing knowledge builds durable understanding.
→ Concrete Examples — Ground abstract ideas in real-world cases. Students understand faster when they can see the concept in action.

This isn’t about labeling learners. It’s about teaching brains the way brains actually learn. Let’s stop chasing myths and start designing instruction that works. Because great teaching isn’t about how students prefer to learn. It’s about how learning actually happens.

#EducationReform #CognitiveScience #TeachingStrategies #EvidenceBasedLearning #FutureOfEducation
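Spaced repetition is the most mechanical of the strategies above, so it is easy to illustrate. The toy scheduler below uses a simple expanding-interval rule (double the gap after each successful recall, reset after a lapse); this is an illustrative sketch, not a specific published algorithm such as SM-2, and the start date is arbitrary.

```python
from datetime import date, timedelta

def next_interval(last_interval_days, recalled):
    """Expanding-interval rule: double the gap on success, reset on failure."""
    return last_interval_days * 2 if recalled else 1

# Simulate one flashcard: review dates across five recall attempts
interval = 1
schedule = [date(2024, 1, 1)]           # first study session
for recalled in [True, True, True, False, True]:
    interval = next_interval(interval, recalled)
    schedule.append(schedule[-1] + timedelta(days=interval))

print([d.isoformat() for d in schedule])
```

Notice how the gaps grow (2, 4, 8 days) while recall succeeds and snap back to 1 day after the lapse: exactly the "learn, forget a bit, then relearn" rhythm the post describes.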
-
In almost every school I've ever visited, the issue isn't the teachers. It's not the leaders. And it's definitely not the kids. But here's the reality in too many schools: inconsistent instruction, stagnant student achievement, frustrated teachers, and overwhelmed leaders.

That story was no different in a network of 7 schools we've been working with this year. But it's not the story now. Let me share what we did, not because I think it's magic, but because I think anyone can do it.

Here's what we did:

1. Defined the vision for every block of the day:
We mapped out what excellence looked like in every key instructional block:
- What should an effective reading lesson look like?
- What are non-negotiables in math instruction?
- How do we leverage history to build background knowledge?
- How does science become high rigor and high engagement?
- What does student engagement actually look like, sound like, and feel like when we walk into any space in the school?
That level of clarity removed guesswork for teachers and gave leaders a shared framework for observations.

2. Every teacher was coached, every week:
- Short, focused observations (15-20 minutes, not full-period evaluations)
- Immediate, actionable feedback on one key lever, not a laundry list of suggestions
- Weekly one-on-one coaching meetings held sacred

3. Set weekly goals to measure progress:
Instead of waiting for benchmark assessments, we built simple, weekly indicators of progress:
- Are students engaged in learning in every block of the day?
- Are students getting plenty of time to independently practice?
- Are math exit tickets showing mastery of the lesson objective?
- Are teachers implementing feedback from the last coaching session?
Small wins led to big momentum. A narrow focus helped teachers and leaders stop feeling like they were doing the most and not seeing any progress.

4. Action planning based on data:
No more “data meetings” that were just numbers on a slide.
- We reviewed student work together, identified breakdowns, and built immediate next steps.
- Teachers left each meeting with a plan they could apply the next day, not vague goals for next quarter.

The results:
- Student proficiency increased by double digits in both reading and math benchmarks within one year.
- Teachers felt more supported and reported higher confidence in their instruction.
- Leaders shifted from putting out fires to proactively coaching and driving instructional improvement.

If your school or network is struggling with initiative overload, the answer isn’t more programs. It’s more clarity. And the discipline to do some simple things really, really well.
-
In our column for LiveMint, Dr. Bibek Debroy, Shri Sanjay Kumar, and I emphasize that merely providing access to schools isn't enough; the quality of education is paramount. The Right to Education (RTE) Act of 2009 tackled access, but educational quality varies widely across India.

The National Achievement Survey (NAS) 2021 highlights significant disparities and a worrying decline in student performance compared to 2017. Leading states like Punjab, Rajasthan, and Haryana excel, while Meghalaya, Mizoram, Nagaland, and Chhattisgarh fall behind. The decline in scores from Class 3 to Class 10 underscores a failure to build on foundational skills. This pattern suggests current educational strategies aren't reinforcing essential concepts, leading to knowledge gaps over time.

The NEP 2020 aims to address these issues by ensuring foundational literacy and numeracy by Grade 3, promoting continuous professional development for teachers, and shifting to competency-based education. The National Curriculum Framework (NCF) 2023 provides a structured approach to align educational aims and outcomes, allowing precise assessment and targeted interventions.

To bridge educational gaps, states must leverage NAS data, prioritize teacher training, and actively involve parents and communities. Aligning efforts with NEP 2020 is crucial to elevate educational quality across India. The urgency to act has never been greater.