Use of Word Clouds to Understand Target Learners


Currently, the physiotherapy programme includes 120 to 190 students in large-group teaching (lectures). While there are some benefits to delivering teacher-directed content to students in this way, it is difficult to engage students and to establish the current understanding of the cohort as a whole.

While preparing for a lecture that was to be presented following the “lunchtime lull”, I considered introducing something different: to help break the monotony; to give students something to interact with and contribute to; to provide me with an understanding of their current learning so I could refocus the remainder of my lecture; and to give students instant feedback on what others had contributed. To attend to the above, I incorporated a Mentimeter Word Cloud into the middle of my lecture.

Reflection on Understanding Target Learners

While I have reflected on the benefits and constraints of the use of Mentimeter in another post (see 1a), here I will reflect on how the use of the interactive Word Cloud enabled a better understanding of the students’ current learning.

I have always found it difficult to gauge the overall level of learning of students in large-group environments, and, from discussions with others, it appears I am not alone. With close to 200 students in a lecture theatre, it tends to be those in the front row, or the same handful of individuals, who consistently share their opinions during the lecture.

The use of the interactive Word Cloud generated more engagement with the question than asking it verbally would have. For example, when asked “what are some effects of aging on the neuromuscular system?”, there were over 80 responses. I found it interesting to see the momentum grow as subsequent students engaged with the interactive question. It seemed to me that seeing peer responses appear on the data show in “real time” gave others the confidence to share what they were thinking.

It was relatively easy to establish some of the common themes in the students’ responses, with those given multiple times represented larger or more centrally. This helped direct my understanding of their current knowledge, and what to emphasise in the remainder of the lecture. A response being more central did not always equate to it being correct, however; this provided me a great opportunity to address the misunderstanding there and then. Likewise, responses that were smaller or more “on the rim” could be de-emphasised, or discussed as to why they might be warranted (and/or correct).
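The weighting described above — more frequent responses rendered larger and more centrally — can be illustrated with a minimal sketch. This is not Mentimeter’s actual algorithm; the responses, size range, and linear scaling below are all illustrative assumptions.

```python
from collections import Counter

# Hypothetical student responses to the lecture question
# "What are some effects of aging on the neuromuscular system?"
responses = [
    "sarcopenia", "slower reflexes", "sarcopenia", "reduced balance",
    "muscle weakness", "sarcopenia", "slower reflexes", "reduced balance",
]

counts = Counter(responses)

# Scale font size linearly between a minimum and maximum, so the most
# frequent response is rendered largest (and typically most centrally).
MIN_SIZE, MAX_SIZE = 12, 48
top_count = counts.most_common(1)[0][1]

def font_size(count: int) -> int:
    """Map a response count onto a font size for the cloud (assumed linear)."""
    if top_count == 1:
        return MIN_SIZE
    return MIN_SIZE + round((count - 1) / (top_count - 1) * (MAX_SIZE - MIN_SIZE))

for word, count in counts.most_common():
    print(f"{word}: count={count}, size={font_size(count)}")
```

Note that size reflects only frequency, not correctness — which is exactly why a large, central (but wrong) response still needs addressing in the lecture.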

In the future, I will use the Mentimeter Word Cloud function that enables student-directed learning as students self-pace through a series of questions. This may be provided ahead of the lecture (though that may not provide the confidence to answer as described above), or made available only during the lecture itself. Watch this space!

Deploying and Supporting Use of 360 cameras


Healthcare higher education (HHE) continues to grow as it tries to meet the deficit of healthcare professionals in New Zealand (Ministry of Health, 2018). However, this puts pressure on lecturer-to-student ratios, physical classroom sizes, tutorial equipment, and clinical experiences. In response, HHE lecturers need to consider alternative pedagogies to deliver engaging teaching that shifts from teacher-directed to student-directed learning in extended [virtual] clinical environments, given the limited opportunity for clinically situated learning. The use of 360 virtual environments may help to bridge that gap.

Other blogs have outlined the deployment and support of staff using 360 cameras and virtual environments, including a proposal for equipment; educational workshops; development of a “how-to” cheat-sheet; and an instructional video for the online software (SeekBeak).

Reflection on Deployment

Being a relative novice in the use of the 360 cameras myself, I was somewhat nervous about being “called upon” to develop the proposal for the School and to lead teaching sessions with peers (something I find more nerve-wracking than teaching students). However, I received some good feedback from the School workshops, with some expressing enthusiasm for its use in their own teaching (see support below). As I became more confident in its use, I was able to initiate, then develop, the “cheat-sheet” and instructional videos. On reflection, having these resources available has reduced the common “how-to” set-up questions, enabling concentrated face-to-face time on the scenario staff are using the 360 environment for.

Reflection on Support

I always find it rewarding to see others develop something that was initially new to them; even more so when they are using resources I personally developed. Staff, especially first-time users, were supported in their development of scenarios and use of 360 environments through individualised meetings. I learnt that there was significant value in an initial 30-minute meeting to hear what their intention for teaching was, to provide an overview of the resources, and then to leave them with the video and “cheat-sheet”. It meant that by the next meeting they had developed an independent plan of how they would use the 360 environment, had a “play” with the equipment and software, and exercised some “creative license”. One example included working with the Occupational Therapy team on a scenario that focuses on management of an individual with an identified falls risk.

I have also gained personal insight by working with different healthcare disciplines, who have differing perspectives on the priority of care (i.e. the flow of the scenario). This has provided me a broader context for the clinical reasoning behind my own approach to priorities, and for teaching students to consider the perspectives of other professions. I have also learnt to broaden my face-to-face contact when supporting staff: starting with “what is the end result you want the students to understand?”, and acknowledging when the use of a 360 environment is not suited to that learning.

Another example of supporting others was an impromptu invitation to assist with a SeekBeak workshop during the ASCILITE 2017 Conference. By doing so, I in turn developed experience working with others outside HHE, as well as confidence in “student”-directed blended learning, as the workshop was directed by the participants’ needs.

I have learnt that by utilising a case scenario in a virtual environment, (HHE) students can independently investigate the scene to assess, prioritise, then link to other resources. They are also prompted, in their own time, to peer up and complete practical tasks, or to confirm learning with the lecturer. I have been encouraged by how the use of virtual environments has naturally led to student-directed, blended learning.
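The structure of such a scene — points of interest that link out to resources or prompt a practical task — can be sketched as a simple data model. This is a hypothetical illustration only; the class names, fields, and URLs below are my own assumptions, not SeekBeak’s actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Hotspot:
    """A clickable point of interest within a 360 scene (hypothetical model)."""
    label: str           # what the student clicks on in the scene
    resource_url: str    # linked reading, video, or task sheet
    prompt: str = ""     # optional practical/peer task for students

@dataclass
class Scene360:
    """A 360 photo scene annotated with hotspots (hypothetical model)."""
    title: str
    hotspots: list[Hotspot] = field(default_factory=list)

    def add(self, hotspot: Hotspot) -> None:
        self.hotspots.append(hotspot)

# A falls-risk home-visit scene, loosely based on the Occupational
# Therapy example above; labels and URLs are illustrative.
scene = Scene360(title="Home visit: falls risk assessment")
scene.add(Hotspot("Loose rug in hallway", "https://example.org/falls-hazards",
                  prompt="Pair up and list two ways to mitigate this hazard."))
scene.add(Hotspot("Medication on bench", "https://example.org/polypharmacy"))

for h in scene.hotspots:
    print(h.label, "->", h.resource_url)
```

Modelling the scene this way mirrors the pedagogy: students choose which hotspot to investigate, and only some hotspots carry a practical task to complete with a peer.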

“It Takes a Village…” Curriculum Working Group #cmaltcmooc #cmalt

The following is an outline of our physiotherapy programme’s recent initiation of a curriculum review. While there have been some minor amendments to papers, it had been 8 years since the roll-out of our current undergraduate programme. A change in Programme Management in August 2018 has enabled a focused review of the curriculum as a whole, supported by incidental reviews leading up to this time.
In appreciation of the need for a review based on staff and student feedback, the incoming Head of Department (HOD) and I met to discuss a way forward. While not in the Programme Leader role at the time, I was asked if I would be happy to take on the responsibility of reviewing the curriculum. In preparation for this, there were a few considerations I wanted to be clear about:
  1. Not to “go it alone”. If I was to lead the review, I would want to do it with a group of individuals who could help champion the progress forward. Discussions with the HOD led to the advocacy of a “working group” that would have delegated responsibility to review, make informed decisions, and administer changes within the curriculum. At the next available staff meeting, the Head of Department called for expressions of interest in a Curriculum Working Group, and a number of key (and well-respected) staff volunteered to be involved.
  2. “Do a few things well”, then move on. I was keen to establish the idea that we would tackle a couple of areas of the curriculum at a time, though always returning to the agreed framework that had been established, to ensure it fitted with the overall vision for the programme.
Informally, this would follow a design thinking process: a methodology that provides a solution-based approach to solving complex problems that are otherwise ill-defined. This is done by following (not necessarily in sequential order) five steps (adapted from Dam & Siang, 2018):
  1. Empathising– understanding the human needs involved. This includes consulting (and re-consulting) the students, staff, and stakeholders to understand more about their concerns about the programme. Engaging and empathising to understand their experiences of, and motivations within, the current curriculum allows us to set aside our own assumptions about the needs of the programme.
  2. Defining– reframing and defining the problem in human-centric ways. The information gathered is used to define the core strengths and weaknesses that the working group has identified up to this point. This is usually done by creating “Problem Statements”.
  3. Ideating– creating many ideas in ideation sessions. Here, solutions are generated. Now, with an understanding of the needs of the students, staff, and stakeholders, the working group can “think outside the box” to identify new solutions to each problem statement and look for alternative ways of viewing the problem.
  4. Prototyping– adopting a hands-on approach in prototyping. The group then presents a number of versions to the staff (or a sub-group). The aim is to identify the best solution for each problem statement. The solutions are investigated and either accepted, improved and re-examined, or rejected based on the responses.
  5. Testing– developing a prototype/solution to the problem. While the final stage includes the application of the solution, there may be further refinement as a deeper understanding of the impact of the implemented solution is developed.
It can be seen from the above that there are some potential similarities to the phases within Educational Design Research (EDR) as outlined by McKenney & Reeves (2018): Analysis and Exploration; Design and Construction; Evaluation and Reflection; Maturing Intervention and Theoretical Understanding; and Implementation and Spread.
The following is a recent example of how we have utilised a design thinking process to the development of the curriculum:
  1. Empathising– Once the group had been established, I arranged a meeting and set about gathering some background information so we could start the process well informed. This included a review of the framework models and common themes proposed by teams of staff, before amalgamating the models into the “preferred” framework. I also met with the Student Representative Group to ensure that I had the “student voice” in any proposed changes. At the meeting (23rd July), there was general consensus: we agreed that (1) we had some good content already; (2) we wanted the programme to regain the “gold standard” status we had held as a physiotherapy programme; and (3) it was the framework [and delivery] that we needed to turn our attention to, and that, with continued feedback from staff and students, ideas would be informed, presented, redesigned, implemented and evaluated.
  2. Defining– Key members of the Curriculum Working Group were delegated to gather further information on identified “key themes”. Information was gathered then presented back at the next staff meeting (21st August). From this, we agreed to focus initially on two key themes for the overall framework- (1) the integration of research, and (2) the role of optional papers within the programme.
  3. Ideating– Based on information gathered at the empathising stage, and confirmation of the key themes at the staff meeting, I was able to immediately put research and elective options to staff in a survey sent via email (21st August). The aim was to provide staff with six options, then determine a preference, with qualitative data on their reasoning. The survey was open for just under a week (21st to 27th August), which resulted in a response rate of 78% (21 responses from the 27 current teaching staff).
  4. Prototyping– Once I had collated the research and elective survey data, I arranged a second meeting (11th September) with the Curriculum Working Group to discuss the next step for prototyping. Ideas were discussed and documented on an “evolving ideas whiteboard” (GWG Whiteboard 11Sept18.jpg).
The results enabled us to identify two main options, which I then presented in a narrated PowerPoint via an unlisted YouTube link (13th September). The aim was to provide staff a final opportunity to discuss and consider these prior to the next staff meeting (24th September), after which the Curriculum Working Group would proceed with the identified preferred option for research and electives. This cannot be tested until 2020, the first year any curriculum changes would be implemented.
With the regular Curriculum Working Group meetings, and a sense of progression and of being updated, it has been rewarding to see the collective ownership and shared responsibility developing within the group. This includes two staff presenting the pros and cons of the two options at the next staff meeting.
Impact on Teaching, Learning or Pedagogy
My use of technology as the working group has met and then updated staff has not been incidental. By utilising various modes of delivery (PowerPoint, SurveyMonkey, staff meetings and YouTube video), I have showcased alternative ways of engagement that have been commended by others, suggested as good approaches to collating information, and requested as a great way to readily receive information in the future. While this appears to be an involved process for two aspects of a curriculum framework (Research and Optional Papers), so it should be. I will be utilising elements of the design thinking process and Educational Design Research as I lead the Working Group through the curriculum review. Already, it can be seen that by utilising these processes, the appropriate users are engaged and empathised with; ideas are defined; options are developed, prototyped (and later tested), and then evaluated.
It is envisioned that this Blog Post will be updated at the next CMALT review.
Dam, R., & Siang, T. (2018, September 10). 5 Stages in the Design Thinking Process. Retrieved September 20, 2018.
McKenney, S., & Reeves, T. C. (2018). Conducting Educational Design Research (2nd ed.). London: Routledge.

Developing a Guideline for Audio and Video Recording for Assessment Moderation #cmaltcmooc #cmalt

To ensure fair and equitable assessment processes that are consistent with others within the Faculty (and University), each paper/module reports on assessment moderation. While this can be fairly easy to plan, execute and evaluate for paper-based assessments (i.e. written examinations, assignments and reports), consistency in assessing practical skills can be problematic. Currently, the physiotherapy programme includes (1) pre-assessment moderation meetings; (2) cross-marking within the scheduled examination period; and (3) post-assessment moderation meetings.
I was approached by the physiotherapy management team to investigate the use of audio and video moderation of examinations. This subsequently led to the development of a guideline/ policy for the Physiotherapy Department.
On reflection, when considering implementing recording of assessments for moderation, there were a few key questions that I needed to consider:
  • How will the students be informed of the video and audio being captured?
  • How will the video and audio be captured, and will it be able to capture the movement throughout the practical examination?
  • Will there be enough storage space on the device to capture ALL examinations?
There was some contention when I suggested that the newly purchased video cameras (Sony HDR-PJ410) would not be sufficient and that further cameras would need to be purchased. While my argument was based on a comparison of the current cameras against the required specifications at a similar cost, the School budget did not allow the purchase of further cameras.
I also consulted other key staff members who were involved in the practical examinations as well as colleagues in paramedicine and nursing who were already recording their assessments.
When considering all the above, it was clear that while there was value in establishing recording for the purpose of moderation, there were no clear guidelines or policy to work from. This was confirmed at a meeting with the Faculty Associate Dean (SS). Overall, my reflection on the current issue (consistency of moderation of practical examinations) and consultation with others led to the development of a Guideline for Audio and Video Recording for Assessment Moderation.
As the paper booklets for Semester Two 2018 had already been published, these Guidelines could not be implemented in full this semester. It was pleasing that those papers intending to use recording for moderation had already included some information in their published booklets. They then utilised the remaining guideline information to better inform the students online and on the examination timetable.
One paper I am involved in did include the recording of assessments. We met as a team to discuss how we would use the audio recorders: including what we would state on the recording; stopping the recording after each exam to create separate files; and agreeing that we would review any recordings where we were unsure of the final mark, as well as those receiving a fail mark.
In future updates of this guideline, I would include the suggestion of a pre-moderation meeting to discuss the consistency of recording (as outlined above). It has also since been confirmed (November 2018) that, as the recording is part of the moderation of assessment, the student’s right to refuse recording and their ability to withdraw consent for recording can be removed; though students can still request to listen to the recording if they wish.