<aside>
đź’ˇ
wow so i gotta write 1200 words in fucking record speed. 24 hours from August 27, 2025 7:41 PM (EDT). i believe
</aside>
plan:
talk about my experience as a technical educator and share my observations. couple each observation with what will happen at the limit (once AI is fully adopted) and what it will mean for us (students, educators).
tagline: I’ve seen the future of education with AI, and I hate it
key points
- Students follow the path of least resistance. While the studious of the bunch will still try to learn, the rest will regress to a new, lower mean (worse than today's average).
- This learning gap will exacerbate existing inequities in our education system. Think public vs. private schools.
- Cheating, obviously. Forcing people to cut corners this early in their careers is ridiculous.
- Generative AI is frankly not yet designed for the global majority.
- Without a secure and reliable way to verify if content is AI-generated, we can’t allow it to be…
- used in any form of creative work
- used in education
- “Studies have shown significant bias in GPT (generative pre-trained transformers; e.g., ChatGPT) against non-native English speakers. For example, one study shows over half of non-native English writing samples were misclassified as AI generated (while the accuracy for native English speakers was nearly perfect).”
- Western companies built AI for the West. We are using “Pale Male Data.” Existing measures of success in AI don't reflect the global majority; we are fooling ourselves.
anecdotes
- What I’ve seen with 2nd-year eng students doing labs
- Garbage in, garbage out. Profs use AI to write the labs, students use AI to digest them, and TAs grade the reports using AI.
- Even worse: profs used AI to write exams during the TA strike.
- GenAI can’t reason. We are using tools we don’t understand to create content that nobody will read.
- Apple’s paper on intelligence
As a student:
- AI Centipede
- TA strike causing profs to turn their exams into multiple-choice exams generated with ChatGPT.
- Like students, even profs succumb to the path of least resistance, especially if they don’t understand the risks of AI.
As an educator:
- MREN 178 experience
- APSC 143 students who were allowed to freely use ChatGPT came into MREN 178 fully fucking useless (verrry alarming)
- We are losing recipes!
- AI labs experience
- The kids didn’t even know that GenAI could hallucinate or give them fully incorrect answers.
- Where will AI companies get their data from?
- There are legitimate privacy concerns with AI companies using our private data to train their LLMs. There’s precedent: Google Photos has been rumoured to use your saved photos to train Google’s GenAI models.