
Microteaching session – 5th Feb, with Anna, Ko, Hatie and Rachel as participants.
Object-Based Learning
Object-based learning (OBL) is a mode of pedagogy which involves the active integration of objects in the learning environment, to inspire, inform, and excite learners (Chatterjee 2011, 2016).
My approach to OBL was to teach ‘object-detection’ – a type of Computer Vision algorithm that can detect objects in images – taking a meta approach to the brief by looking at ‘objects’ through the machinic gaze.
Microteaching Plan
Here is the link to the microteaching material that I presented. (*edit: now accessible)
The session was aimed at BA Fine Art: Computational Arts students who would have had some experience with coding, coding editors, and using APIs.
Timings:
4–5 mins: Intro to the object-detection library YOLO
2 mins: Quick recap of Google Colab
3 mins: Quick recap of the Flickr API (used to source the images that YOLO is applied to)
10 mins: Live coding session with active participation, working through a Google Colab notebook together (a sketch of the kind of cell we built is included below)
5 mins: Review of the results, discussing the objects the algorithm detected (and what it failed to detect)
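For context, the live-coding portion roughly followed the pattern sketched below: query the Flickr API for images matching a participant-chosen term, then run a pre-trained YOLO model over the results. This is a minimal reconstruction rather than the session notebook itself; it assumes the ultralytics Python package and a Flickr API key, and the variable names and search term are illustrative.

```python
# A rough sketch of the kind of Colab cell we built together.
# Assumes `pip install ultralytics requests` and a Flickr API key;
# names and the search term are illustrative, not the exact session code.
import requests
from ultralytics import YOLO

FLICKR_KEY = "your-flickr-api-key"  # placeholder
search_term = "farm"                # participants chose the term live

# Ask the Flickr REST API for a handful of photos matching the search term.
resp = requests.get(
    "https://www.flickr.com/services/rest/",
    params={
        "method": "flickr.photos.search",
        "api_key": FLICKR_KEY,
        "text": search_term,
        "per_page": 5,
        "format": "json",
        "nojsoncallback": 1,
    },
).json()

# Build direct image URLs from the photo records Flickr returns.
urls = [
    f"https://live.staticflickr.com/{p['server']}/{p['id']}_{p['secret']}_b.jpg"
    for p in resp["photos"]["photo"]
]

# Run a small pre-trained YOLO model on each image and print what it detects.
model = YOLO("yolov8n.pt")
for url in urls:
    result = model(url)[0]
    labels = [model.names[int(c)] for c in result.boxes.cls]
    print(url, "->", labels or "nothing detected")
```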
Observations and feedback
I wanted to teach this in a live-coding style. This presented challenges that needed to be ironed out in advance: I had to pre-prepare most of the code, which took away the demystification and gentle pace of making something from scratch. Some of the feedback reflected this, suggesting that the session could open with a check-in to assess coding literacy. I admired how the PgCert team uses Teams to run anonymous polls; this could work for future coding workshops to gauge technical literacy at the start of the session.
The goal of my teaching was to interweave technical learning with critical thinking, without jumping between two styles of teaching (coding in a code editor and ‘lecturing’ with slides). I decided to bring some of the slide elements into Colab, such as the session’s title and header image. This approach is worth exploring further. Observers pointed out that jumping between tabs can be hard to follow, so integrating the presentation could improve on this.
There were a lot of tools to introduce at the start, and I also had to briefly mention Gemini, Google’s built-in assistant. This needed a more nuanced introduction that problematised the use of AI, but I was pressed for time. The session had been designed for students who would have been introduced to these tools in previous seminars.
Running the code in real time introduces a sense of anticipation and risk. I encouraged participation by letting others decide what to search for. The feedback I received helped me evaluate this approach: the comments were positive, and participants said the tone was comforting. The liveness was appreciated because it disclosed vulnerability. The risk was also raised – what if the system had returned distressing images? This is a very valuable observation. There are ways I can make the search exclude such images as a precaution (see the sketch below), although critiquing the algorithm’s shortcomings is part of the learning.
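One low-effort precaution, if I keep using Flickr as the image source, would be to pass the API’s safe_search parameter when querying. The snippet below is a hypothetical adjustment to the search parameters from the earlier sketch, not code from the session itself.

```python
# Hypothetical precaution: restrict the Flickr search to "safe" content.
# safe_search=1 asks flickr.photos.search to return only photos Flickr
# has flagged as safe; it reduces but does not eliminate the risk of
# distressing results, so a verbal content note would still be sensible.
params = {
    "method": "flickr.photos.search",
    "api_key": FLICKR_KEY,   # same placeholder key as in the sketch above
    "text": search_term,
    "safe_search": 1,        # 1 = safe, 2 = moderate, 3 = restricted
    "format": "json",
    "nojsoncallback": 1,
}
```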
We tried the search terms ‘doorknobs’, ‘farm’, and ‘teacher’. Interestingly, YOLO isn’t trained to detect doorknobs. The farm example revealed how messy databases can be, as some images tagged ‘farm’ had nothing to do with farms. Finally, the teacher example produced some of the oddest results – meme-like inspirational quotes paired with celebrities, one of whom was Keanu Reeves. This was a great end to the session, as it had everyone laughing. Humour wasn’t discussed much in the feedback, but it is something I enjoy using in my teaching to demonstrate that software is fallible.
References
Chatterjee, H. J. (2011). Object-based learning in higher education: The pedagogical power of museums. http://dx.doi.org/10.18452/86
Chatterjee, H. J., Hannan, L., & Thomson, L. (2016). An introduction to object-based learning and multisensory engagement. In Engaging the senses: Object-based learning in higher education (pp. 1–18). Routledge.