The findings from this small-scale study suggest that AI-assisted coding, or “vibe coding,” is neither inherently emancipatory nor inherently damaging to learning. Instead, its pedagogical impact is shaped by when, how, and for whom it is introduced. The contrasting experiences of the two participants highlight a key issue for Computational Arts education: the uneven distribution of prior coding competence, and the risks of assuming a level playing field when introducing powerful AI tools.
Participant A’s experience points to a potential alienation gap. Encountering vibe coding at the very start of their studies, without foundational coding knowledge or structured guidance, led to confusion and a loss of confidence. AI-generated code appeared opaque and difficult to claim ownership over, raising ethical and creative concerns about authorship and understanding. In this context, vibe coding risked accelerating output without supporting comprehension, producing what felt like progress without learning.
Participant B’s experience, by contrast, shows how vibe coding can function as an accelerator once a baseline level of competence is established. Having learned to code prior to the availability of AI assistants, Participant B was able to use conversational coding tools to extend their practice, overcome creative plateaus, and shift focus from technical feasibility towards conceptual intent. Here, AI supported learning rather than replacing it.
These findings suggest a clear pedagogical implication: vibe coding should not be treated as neutral infrastructure or left for students to discover independently. Doing so risks reproducing existing inequalities, where students with prior experience benefit disproportionately, while novices experience anxiety, dependency, or disengagement. An inclusive approach requires making students’ starting competence visible and designing teaching accordingly.
One potential intervention is the introduction of early diagnostic activities that assess students’ confidence, prior exposure, and understanding of coding concepts—not to stream or exclude, but to inform differentiated support. Alongside this, vibe coding could be introduced in a structured and reflective way, framed explicitly as a learning aid rather than a shortcut. This might include guided exercises that require students to explain, annotate, or modify AI-generated code, foregrounding understanding and authorship over output.
Such an approach also responds to wider critiques of AI in education, including those articulated in Current Affairs (Purser, 2025). The article warns of “cognitive debt,” where reliance on AI produces a metacognitive illusion of engagement while eroding underlying skills. While this critique is persuasive, it risks flattening all uses of AI into a single moral failure. The findings of this study suggest a more nuanced position: the danger lies not in AI itself, but in pedagogical models that outsource struggle, reflection, and judgment to automated systems.
In Computational Arts, where learning is already exploratory, non-linear, and practice-based, the challenge is not to ban vibe coding but to foster reflective, intentional use. By scaffolding AI-assisted coding within a broader learning trajectory, one that values difficulty and partial understanding, it may be possible to avoid the pitfall captured in an analogy from schoolteacher Carl Haefemeyer, who said in an interview about AI in schools that “[learning] is like weightlifting. You wouldn’t bring a forklift to the gym. The goal isn’t to get the bar up in the air, the goal is to build your muscle by lifting the bar. And so trying to reemphasize that and having growth as a goal is something that helps just prevent that desire to use it to cheat along the way.” (MPR News, 2025).
Ultimately, a more inclusive pedagogy of vibe coding requires shifting from a permissive “use it if you want” stance to an explicitly taught, critically framed practice informed by ongoing course-team research into new education-led tools such as Google Gemini’s Guided Learning feature. Doing so aligns with the course’s commitment to accessibility and creative agency, while ensuring that AI supports learning rather than quietly displacing it.
References
Purser, R. (2025). AI is Destroying the University and Learning Itself. [online] Current Affairs. Available at: https://www.currentaffairs.org/news/ai-is-destroying-the-university-and-learning-itself [Accessed 8 Dec. 2025].
MPR News (2025). AI in schools: St. Paul teacher says it’s ‘like bringing a forklift to the gym’. [online] MPR News. Available at: https://www.mprnews.org/episode/2025/09/09/ai-in-schools-st-paul-teacher-says-its-like-bringing-a-forklift-to-the-gym# [Accessed 3 Feb. 2026].