A look at how CS50 has incorporated artificial intelligence (AI), including its new-and-improved rubber duck debugger, and how it has impacted the course already. 🦆 https://lnkd.in/eb-8SAiw

In Summer 2023, we developed and integrated a suite of AI-based software tools into CS50 at Harvard University. These tools were initially available to approximately 70 summer students, then to thousands of students online, and finally to several hundred on campus during Fall 2023. Per the course's own policy, we encouraged students to use these course-specific tools and limited the use of commercial AI software such as ChatGPT, GitHub Copilot, and the new Bing. Our goal was to approximate a 1:1 teacher-to-student ratio through software, thereby equipping students with a pedagogically-minded subject-matter expert by their side at all times, designed to guide students toward solutions rather than offer them outright.

The tools were received positively by students, who noted that they felt like they had "a personal tutor." Our findings suggest that integrating AI thoughtfully into educational settings enhances the learning experience by providing continuous, customized support and enabling human educators to address more complex pedagogical issues.

In this paper, we detail how AI tools have augmented teaching and learning in CS50, specifically in explaining code snippets, improving code style, and accurately responding to curricular and administrative queries on the course's discussion forum. Additionally, we present our methodological approach, implementation details, and guidance for those considering using these tools or AI generally in education.

Paper at https://lnkd.in/eZF4JeiG. Slides at https://lnkd.in/eDunMSyx. #education #community #ai #duck
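To make the "guide, don't just answer" design concrete, here is a minimal sketch of how such a tutor could wrap a chat API behind a pedagogically-minded system prompt. This is not CS50's actual implementation: the prompt wording and model name are placeholders, and the OpenAI Python SDK is used purely as an example backend.

```python
# Sketch of a guidance-first tutor wrapper. Hypothetical prompt and model name;
# CS50's real, course-specific tools are not reproduced here.
from openai import OpenAI

DUCK_PROMPT = (
    "You are a friendly CS tutor. Never hand the student a complete solution. "
    "Instead, ask a clarifying question, point to the relevant concept, or "
    "suggest the next small step the student could try on their own."
)

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_duck(student_question: str) -> str:
    """Send the student's question to the model under the guidance-first prompt."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": DUCK_PROMPT},
            {"role": "user", "content": student_question},
        ],
    )
    return response.choices[0].message.content

# Example: print(ask_duck("Why does my loop print one extra row of bricks?"))
```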
AI in Education Innovation
Explore top LinkedIn content from expert professionals.
-
Last week Google announced Learn Your Way - a research experiment to reimagine the most overused, under-loved artifact in education: the textbook. The problem is obvious: textbooks are one-size-fits-all. Written once, updated rarely, inflicted equally. Great for industrial-scale learning, terrible for actual students.

Learn Your Way tries to fix that with AI: a student picks their grade level and interests (sports, music, food). The system then "relevels" the text, swaps out generic examples for personalized ones (Newton's apple becomes a soccer ball), and builds a personalized core. From there, it spins out multiple formats: immersive text with visuals, section-level quizzes, narrated slides, Socratic dialogues, even mind maps. In a controlled trial with 60 high schoolers, it beat the humble PDF reader across the board: comprehension, retention, and preference.

AI is going to fundamentally change education. The way I see it, we will move from:
▪️Standardization → Personalization: Education has been built for scale: 1 teacher, 30 students, 1 chalkboard. AI flips that. Materials adapt to pace and interest; assessment becomes continuous, not blunt.
▪️Knowledge Transfer → Cognitive Coaching: When facts are instantly accessible, memorization stops being the scarce skill. The real edge is knowing when AI is wrong, asking sharper questions, and connecting ideas across disciplines.
▪️Classrooms → Learning Ecosystems: Teachers shift from lecturers to facilitators and motivators. AI covers explanations and drills; humans teach judgment, values, and meaning. Peer learning deepens when everyone brings AI-augmented insights.
▪️Exams → Evidence of Thinking: With AI co-pilots, recall-based tests lose power. Evaluation moves to process, projects, and defense - not "what's the answer?" but "show your reasoning."
▪️Scarcity → Abundance (with new inequities): AI promises tutoring for anyone with a smartphone. But access to devices, connectivity, and high-quality models could widen divides. A new gap may emerge between students trained to use AI critically and those who consume it passively.

Here's the irony: in making information abundant, AI paradoxically revives the oldest form of teaching. Socrates didn't assign PDFs; he asked questions until you realized you didn't know what you thought you knew. His role wasn't to supply answers but to train skepticism. That is the teacher's role again. Not to out-explain Gemini, but to show when not to trust it. To cultivate judgment, doubt, and the art of better questions. AI hasn't reinvented education so much as rerouted it back to its roots: the Socratic method - only now Socrates is paired with a chatbot that never sleeps and never hesitates.
-
There are already hundreds of new Generative AI EdTech companies and products on the market. Some great, some terrible, and many in between. So how can you vet GenAI EdTech companies to ensure that their product is safe, reliable, effective, and fit-for-purpose? To help, we've put together our Top Six Questions to guide these conversations. You can download the PDF version and get the reasons "why" for each question at this link: bit.ly/3qnP0ma.

1️⃣ We know that generative AI (GenAI) is a new technology with extensive limitations. How does your product indicate when it's uncertain or requires human review? What controls do you have in place to identify and lower hallucinations?
2️⃣ It's important that the tools we use do not cause harm to our students or teachers. What steps are you taking to identify and mitigate biases in your AI models? How will you ensure fair and unbiased outputs?
3️⃣ Protecting student data privacy and ensuring ethical use of data is a top priority for our school. What policies and safeguards can you share to address these concerns?
4️⃣ Our educators need to validate and trust AI-generated content before use. What human oversight and quality control measures do you use? How do you ensure feedback from teachers/students is being collected and actioned?
5️⃣ We need evidence that your AI tool will improve learning outcomes for our student population and/or effectively support our teachers. Can you provide examples, metrics, and/or case studies of positive impact in similar settings?
6️⃣ Our school needs to accommodate diverse learners and varying technical skills among staff. How does your tool ensure accessibility and usability for all our students and staff? What PD is available?

AI for Education #genai #edtech #aiforeducation #schoolleaders #AI
-
AI is reshaping the future of learning, not by replacing educators, but by amplifying human potential. I just read Google's new position paper on 'AI and the Future of Learning', and several points resonate strongly with my own experiences in e-learning, agentic AI, and responsible innovation.

Key takeaways for educators, learning designers, and AI practitioners:
1. Human-in-the-loop matters: AI should empower teachers and learners, not supplant them. Educators remain central in designing, customizing, and supervising AI tools.
2. Personalized, adaptive learning: AI can meet learners where they are and adapt to their pace, strengths, and needs, which is especially powerful in large-scale or resource-constrained settings.
3. Ethics, fairness, transparency: Tools must be built responsibly and be transparent about data usage, bias, and decisions. Learners, teachers, and their families should understand how AI arrives at suggestions and always have recourse.
4. Skills for the future: Beyond knowledge recall, education needs to foster curiosity, metacognition, collaboration, and lifelong learning. AI becomes a partner in cultivating how we learn, not just what we learn.

As someone who leads e-learning and agentic AI initiatives (and is working on courses and frameworks for learning system design), here are some reflections:
1. Design with pedagogy first: When building courses or tools, we must anchor in learning science and best practices. Agents or AI modules should align with what we know about how people learn, including cognitive load, scaffolding, and feedback loops.
2. Build with practitioners: Co-design with educators ensures the AI tools remain grounded in context and helps avoid misalignment or unintended biases.
3. Measure impact holistically: Beyond completion or test scores, we should evaluate growth in learner agency and self-regulation, especially for adult learners or professionals.
4. Scale responsibly: The potential for scaling personalized learning is huge, but we must not lose sight of the social, cultural, and equity aspects of learning design.

🧭 In my upcoming course on Augmenting Collective Intelligence via Autonomous Agents + Human Experts, I'll integrate several of these insights: embedding AI tutors in training, designing feedback loops, and ensuring alignment with ethical & pedagogical frameworks.

💡 Question for my network: How are you balancing AI tool adoption in education or training environments while preserving educator control, equity, and learner agency? Would love to hear your experience or frameworks that are working.

#AI #EdTech #LearningDesign #AgenticAI #LifelongLearning #InstructionalDesign #AIgovernance
-
Part of my work in the world of AI is to spend time sifting through and experimenting with tools to see their educational potential. When I find good ones, I share them with teachers here and on my blog. I know many of you are too busy to keep up with the relentless stream of new tools and platforms. That's why I do the homework. So you don't have to.

The graphic below offers a curated snapshot of AI tools for teachers, organized by how they can support your work, from creating visuals and presentations to lesson planning and academic research. A few personal favorites I've found especially promising in the classroom:
1. Diffit for adapting readings to different levels
2. SlidesAI for turning text into clean, engaging slide decks in minutes
3. Scite and Elicit for research and evidence-gathering (great for student inquiry and teacher PD!)

My criteria? Tools that are intuitive, purpose-aligned, and save teachers time without compromising on quality.

#AIinEducation #EdTech #TeacherTools #ArtificialIntelligence #DigitalLiteracy #medkharbach #educatorstechnology
-
AI agents are widely misunderstood due to their broad scope. To clarify, let's derive their capabilities step-by-step from LLM first principles...

[Level 0] Standard LLM: An LLM takes text as input (prompt) and generates text as output, relying solely on its internal knowledge base (without external information or tools) to solve problems. We may also use reasoning-style LLMs (or CoT prompting) to elicit a reasoning trajectory, allowing more complex reasoning problems to be solved.

[Level 1] Tool use: Relying upon an LLM's internal knowledge base is risky: LLMs have a fixed knowledge cutoff date and a tendency to hallucinate. Instead, we can teach an LLM how to use tools (by generating structured API calls), allowing the model to retrieve useful info and even solve sub-tasks with more specialized / reliable tools. Tool calls are just structured sequences of text that the model learns to insert directly into its token stream!

[Level 2] Orchestration: Complex problems are hard for an LLM to solve in a single step. Instead, we can use an agentic framework like ReAct that allows an LLM to plan how a problem should be solved and solve it sequentially. In ReAct, the LLM solves a problem as follows:
1. Observe the current state.
2. Think (with a chain of thought) about what to do next.
3. Take some action (e.g., output an answer, call an API, look up info, etc.).
4. Repeat.
Decomposing and solving problems is intricately related to tool usage and reasoning; e.g., the LLM may rely upon tools or use reasoning models to create a plan for solving a problem.

[Level 3] Autonomy: The above framework outlines the key functionalities of AI agents. We can make such a system more capable by granting it a greater level of autonomy. For example, we can allow the agent to take concrete actions on our behalf (e.g., buying something, sending an email, etc.) or run in the background (i.e., instead of being directly triggered by a user's prompt).

AI agent spectrum: Combining these concepts, we can create an agent system (sketched below) that:
- Runs asynchronously without any human input.
- Uses reasoning LLMs to formulate plans.
- Uses a standard LLM to synthesize info or think.
- Takes actions in the external world on our behalf.
- Retrieves info via the Google search API (or any other tool).
Different tools and styles of LLMs provide agent systems with many capabilities; the crux of agent systems is seamlessly orchestrating these components. But an agent system may or may not use all of these functionalities; e.g., both a basic tool-use LLM and the above system can be considered "agentic".
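To make Levels 1 and 2 concrete, here is a minimal sketch of the observe-think-act loop in Python. Everything in it is illustrative: call_llm is a stub standing in for whatever model API you use, the two tools (a calculator and a fake search) are hypothetical, and the JSON action format is just one simple way to let the model request a tool or return a final answer.

```python
# Minimal ReAct-style loop: observe -> think -> act -> repeat.
# Illustrative sketch only; call_llm() and the tools below are stubs.
import json

def call_llm(messages: list[dict]) -> str:
    """Stub for a chat-model call. A real system would send `messages` to a
    model API and return text that is either a JSON tool call or a final answer."""
    raise NotImplementedError("plug in your model provider here")

# Hypothetical tools the agent may invoke.
def calculator(expression: str) -> str:
    # Demo only; never eval untrusted input in a real system.
    return str(eval(expression, {"__builtins__": {}}, {}))

def search(query: str) -> str:
    return f"(stub) top results for: {query}"

TOOLS = {"calculator": calculator, "search": search}

SYSTEM_PROMPT = (
    "Solve the task step by step. At each step, either call a tool by replying "
    'with JSON like {"tool": "search", "input": "..."} or finish by replying '
    'with {"final": "..."}.'
)

def run_agent(task: str, max_steps: int = 8) -> str:
    messages = [{"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": task}]
    for _ in range(max_steps):
        reply = call_llm(messages)                  # think: model plans the next action
        messages.append({"role": "assistant", "content": reply})
        action = json.loads(reply)
        if "final" in action:                       # model decided it is done
            return action["final"]
        observation = TOOLS[action["tool"]](action["input"])  # act: run the requested tool
        messages.append({"role": "user",            # observe: feed the result back
                         "content": f"Observation: {observation}"})
    return "Stopped: step budget exhausted."
```

Level 3 autonomy is then largely a question of what triggers run_agent (a scheduler or event rather than a user prompt) and which real-world actions the tools are allowed to take.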
-
Are you leading teams or shaping the future of work? Here's something you need to hear. I just came off a panel at UC Santa Barbara on AI and education—and what I learned from the students wasn’t just insightful. It was a wake-up call. Upperclassmen—juniors and seniors—are using generative AI to prep for exams and polish their writing. But the underclassmen? They’re not just using it to study. They’re using it to operate. To think. To build. Every day. This next wave of talent isn’t learning how to adapt to AI—they’re growing up with it as second nature. By the time they hit the workforce, they won’t just prefer AI-fluent environments. They’ll expect them. For engineering leaders, founders, and decision-makers, this isn’t a minor shift. It’s a signal. Now is the moment to rethink how you structure teams, develop talent, and design innovation pipelines. Because when Gen Z enters your organization, they’ll bring new tools, new rhythms, and new expectations. The question is: Will your culture be ready to meet them? #FutureOfWork #AILeadership #GenZInsights #TalentStrategy #InnovationCulture
-
A recent Gallup survey reveals what many of us working in education have been sensing:
➡️ Nearly half of Gen Z students (47%) believe AI should be allowed in classrooms.
➡️ Even more compelling: over half (52%) believe schools should be required to teach them how to use AI.

Yet, there's a gap: only 28% of students say their school currently permits AI use, and about 49% say their schools don't have clear policies at all. Students are looking for guidance, not just permission. They want to understand how to use these tools responsibly, creatively, and effectively. If schools don't step in to teach these literacies, students will learn with or without us.

At the Michigan Virtual AI Lab, we've witnessed firsthand how thoughtful AI integration can transform learning experiences when schools develop clear, intentional guidelines. Students need not just permission to use these tools, but structured guidance on how to leverage them responsibly and effectively.

We must ask ourselves:
❓What message do we send when we leave students to navigate these powerful technologies without clear guidance?
❓How can we better align our educational practices with the realities our students are already facing?

⬇️ In the comments, I'm sharing a set of free, ready-to-use AI literacy lessons from Michigan Virtual for educators who are ready to start these conversations. Let's bridge this divide together and empower Gen Z to become responsible, capable users of AI.

#AIinEducation #AIliteracy #StudentVoice #MichiganVirtual
-
📣 New research from the Walton Family Foundation, Gallup, and GSV Ventures reveals that while #GenZ is actively engaging with AI, they remain deeply skeptical about its impact on learning, work, and critical thinking.

🔍 Key insights from the report:
-> 79% of Gen Z uses generative AI, yet 41% feel more anxious than excited about it.
-> Only 28% of students say their school allows AI use, and nearly half say schools lack clear policies.
-> Students in wealthier, urban areas are more likely to be taught how to use AI, widening the digital divide.
-> 57% of students in schools with clear AI policies feel prepared to use AI post-graduation, compared to just 36% in schools that ban it.

💡 The takeaway? Gen Z knows AI is their future, but they're asking for guidance. Here is where #IBMSkillsBuild can make a difference. We are committed to equipping learners and educators, no matter their zip code, with:
✔️ Free access to AI learning paths with skills progression
✔️ Micro-credentials that demonstrate AI literacy and career readiness
✔️ Toolkits for faculty and institutions to confidently teach AI in the classroom
✔️ Guided learning experiences and expert-led sessions to build real-world skills

Learn more about the report and SkillsBuild in the comments below.

#AIeducation #GenZ #DigitalSkills #SkillsBuild #FutureOfWork #AIinClassrooms #EdTech #EquityInEducation #Microcredentials #WorkforceDevelopment #SkillsFirst
-
Two recent studies, one from OpenAI's analysis of 2.5 billion daily ChatGPT messages and the other from Google's controlled trial of AI-augmented textbooks, provide converging evidence of a fundamental shift in how people learn.

ChatGPT, with 700 million weekly users, sees 10% of all messages dedicated to tutoring, predominantly from users aged 18-25. Surprisingly, students primarily use AI to deepen understanding rather than complete tasks: 49% of interactions seek explanations and comprehension, not ready-made answers. This organic adoption shows students creating personalized learning experiences that traditional one-size-fits-all textbooks cannot provide.

Google's Learn Your Way validates this approach experimentally. By personalizing textbook content to student interests and reading levels (explaining physics through basketball or economics through music), the system improved test scores by 13 percentage points. Both studies show AI transforms passive reading into active engagement through questions, multiple content representations, and immediate feedback. The gender gap in usage has closed, and adoption is accelerating in lower-income countries, though educated professionals still dominate work-related usage.

The convergence is becoming clearer: millions of students aren't waiting for institutions to provide AI learning tools; they're already using GenAI as a personalized tutor. The data suggests GenAI works best as a learning companion that enhances understanding rather than replacing formal education.

As we move forward, the question isn't whether AI will transform education; that transformation is already underway, driven by millions of students who have discovered that AI can provide something traditional educational materials cannot: personalized, patient, always-available support for learning. The question is how educational institutions, policymakers, and technology developers will respond to and shape this transformation to ensure it enhances rather than undermines human learning and development. https://lnkd.in/gpAxJrfF