ARP Post 9: Q and A

This final post answers the questions I received after the presentation.

1. Could you expand slightly upon the social justice element of the project please?

The social justice element of this project is about access and fairness in how coding is taught and learned on the course. Students arrive with very different levels of prior experience, shaped by factors like schooling, socioeconomic background, confidence, language, and access to technical provision and computers. AI-assisted coding was initially assumed to lower barriers and make code-based work feel more accessible to students who might otherwise feel excluded.

The research, however, highlighted that if vibe coding is left informal or implicit, it can exacerbate or create new inequalities. Students with more technical confidence may benefit quickly, while others are left unsure whether using AI is allowed, how to use it well, or whether it undermines their learning. The takeaway from this research is to focus on making those expectations visible and teaching AI-assisted coding as a supported practice, so that it becomes a shared learning and coding tool rather than a hidden advantage. Framing vibe coding pedagogically is about creating fairer conditions for learning, not just faster outcomes.

2. What are your thoughts about the ethics of AI teaching? We acknowledge this could be an essay in its own right – so what are the main considerations in the context of your ARP?

The ethics of AI in teaching, in the context of this project, are mainly about transparency, care, and responsibility rather than whether AI should be used at all. AI-assisted coding is already present in students’ workflows, so the ethical question becomes how educators respond to that reality. UAL’s position on AI and guidance for students already set out terms for our course to align with, embracing ‘AI through a Curious, Critical, and Compassionate lens’: https://www.arts.ac.uk/about-ual/learning-and-teaching/digital-learning/ai-and-education

Key considerations include being clear about when and how AI use is appropriate, avoiding punitive or ambiguous policies, and ensuring students are not disadvantaged for either using or avoiding AI (again, echoed by UAL’s student guide https://www.arts.ac.uk/about-ual/learning-and-teaching/digital-learning/ai-and-education/student-guide-to-generative-ai). In the context of coding, there is also an ethical responsibility to support learning-to-code rather than encourage dependency, by helping students understand what AI-generated code is doing.

Finally, there are broader ethical concerns around data, surveillance, sustainability, and power, particularly when AI tools are embedded into platforms students are required to use. Within this ARP, the focus is on designing teaching approaches that are transparent, supportive, and aligned with learning aims, while remaining attentive to the wider institutional and societal contexts in which AI operates.

3. Thinking about data analysis, what worked – and what was challenging – in this? And how was it analysing a written interview alongside a spoken interview?

What worked well in the data analysis was the depth of reflection the interviews enabled, even with a very small sample. The semi-structured format allowed participants to surface concerns around confidence, legitimacy, and learning that may not have emerged through a more task-based or survey-led approach. Analysing the interviews comparatively, across a current student and a recent graduate, also helped highlight how quickly issues around AI are shifting.

For the participant who responded in writing rather than through a live interview, the format removed opportunities for follow-up questions, clarification, and conversational nuance. At the same time, the written response was often more considered and precise, offering insights that might not have surfaced in real-time discussion, and clearly the participant felt more comfortable with this format. This evidences a need for more inclusive and flexible methods of data collection.

Analysing the written interview alongside the transcribed spoken interview required treating them slightly differently, while still looking for shared themes. Rather than seeing this as a limitation, it reinforced the value of flexible methods and informed my thinking around more open and creative approaches to data collection for future research.
