CVS checkout is awful. There's always an error (lifted the bag too soon, waiting for the green light to actually scan, items sold by weight ringing up wrong), and a store worker always has to come over, which stops them from stocking, talking to a customer, or doing other work. Then they have to restart - over and over and over. It's excruciatingly stupid from a process view. It's not better or even the same - it's worse.
So I went to Walgreens across the street, after I could have stolen everything I wanted at CVS, because it was faster, easier, and simpler. Plus, let's be real - CVS would just shrug, attribute it to shrink, and still not hire a counter clerk, security, or anyone to really help. I didn't steal anything, but wow, I could have taken half the store.
Oh, and you know what? I couldn't even talk to the person at Walgreens because she only spoke Spanish and I only speak English (except for the usual "I don't speak English/Spanish" that we BOTH know). I still found what I needed without my stupid phone and got out of there in half the time. Half. Am I exaggerating a bit? You bet. But you know what? You kinda don't care after experiencing stupidity.
Your post raises important questions about the role of AI in education, but it also leaves room to dig deeper into the assumptions and implications you're highlighting.
Mulaney’s joke captures a broader cultural anxiety about human connection being replaced by automation—not just in education but across many domains. In K-12, this concern isn’t just about the novelty of chatbots; it’s about the fundamental purpose of education. Is it about efficient information delivery, or is it about fostering curiosity, critical thinking, and human relationships? AI may be a tool, but what happens when tools start defining the terms of engagement?
Your mention of ‘human in the loop’ got me thinking: are we framing teachers as co-pilots or as passive monitors? The difference matters. It’s one thing to use AI to streamline tasks like grading or lesson planning, but if we accept that chatbots will mediate student interactions with knowledge, we risk sidelining the teacher’s role as a mentor, guide, and community builder. How do we safeguard that while still embracing innovation?
I’d also push on the "self-checkout" analogy. Self-checkouts make transactions more efficient, but in education, efficiency isn’t the goal. Learning is messy, relational, and requires a sense of shared purpose. How do we articulate a vision for AI that enhances, rather than undermines, those elements? For instance, can AI amplify teacher capacity in ways that bring more humanity into the classroom—by freeing teachers to spend more time connecting with students, not less?
What’s your take on how we ensure AI tools remain just that—tools—and don’t redefine the heart of teaching and learning?
Thank you for your comment. You asked, "Are we framing teachers as co-pilots or as passive monitors? The difference matters." I hope the answer is neither. Would we say teachers are co-pilots with SmartBoards, calculators, word processing apps, YouTube videos, or other technologies students use? We need to see some actual pedagogical value demonstrated by chatbots and image generators before we worry about them redefining teaching and learning.
A primary concern seems to be that AI might not be reliably left-wing enough to be trusted.
OpenAI whistleblower found dead
https://youtu.be/sYlPQiKy_Ws?feature=shared