AUDREY’S EVALUATION PLAN
Unit 7337 care for dogs - husbandry
How will I evaluate my new course design?
I plan to use the SurveyMonkey survey I designed to measure student satisfaction; it asks students how easy they found the new content and whether it was useful to them.
I will also look at the course outcomes: did we manage to get better results this year?
Feedback from local veterinary clinics, gathered through work placement verification, will show whether the students are able to carry out the procedures effectively and safely.
Who’s going to test it before it goes live?
I plan to use a test group of students before the course goes live and use their feedback to redesign it if necessary.
My colleagues will be asked for feedback.
My moderation partner has reviewed the videos and made some suggestions that have already been implemented, but it will be interesting to see what students really think; I find it easy to get swept up in thinking my material is the bee's knees (but it's not always so).
Ongoing monitoring
This will be done using very similar methods to those in the section above. I plan to use face-to-face feedback from students. As I will be teaching this module next year, it will be possible to get instant feedback from students as we review the videos together (as part of the block course content).
We have a junior lecturer starting next year, and she has already spoken to me about joining in with this unit, so it will be possible to get feedback from her as well.
Peer observation is done annually, and I can decide what I want observed (it is usually carried out by the head of school). This will give me formative feedback: rather than judging the content, it will look at how I present the information. That is still valuable, because we can have the best material in the world, but the connection made during delivery is vital for success.
Judging success
Students completing the course seems the most obvious way to judge success, but it can be a bit misleading; we often give students a couple of re-sits for each unit, so this measure on its own only gives a general view of success.
Ongoing formative assessment will be used throughout the delivery. The videos will fit well with the formative workbook that students already complete, as they will be using peer feedback during the course (working through the videos and using discussion groups).
Gathering the data
After reading the forum posts I have learned about another way to survey, using the one that is already set up on the website. I will be using the survey designed and shown in my plan, but I will look at this resource too; I don't want to burden the students with too much surveying and get the 'fed up with surveys' response (tick anything to get through it).
Students already have several opportunities to get industry verification, which gives prospective employers the chance to remark on their performance. While this is qualitative data, I think it is the most valuable data, and the information that will drive the refining of the unit.
Quantitative data from assessments completed (and passed) will show whether we are getting good success rates; we can use this to demonstrate to the OP board that we are meeting targets, which will help with funding.
Using the information
I plan to use the information from the surveys and face-to-face meetings to improve the delivery, giving more of what the students think is useful rather than what I think they need (or what I would like).
I can use the information to look at the delivery method and how it fits in with learners' styles; hopefully the feedback will show the material is useful.
We can use the information gathered to look at how we market the course to new students and the industry. I like the idea of using past students' work (with permission) to show what we provide. I also plan to use more students in the videos (showing real students doing the work rather than just staff); this also lets me address diversity by including indigenous students.