The "Canvas Course Reviews" Service: Individualized Professional Development for Faculty on their LMS Usage
This session highlights the University of Toronto’s Canvas Course Reviews service, a multi-stage process during which a pedagogical consultant provides instructors with formative feedback on how they use Canvas tools to meet their teaching and learning goals. Institutions can adapt this framework to their contexts to improve professional development offerings.
Alrighty. So it's four fifteen, so why don't we get started, everyone. So today's presentation is called the Canvas Course Review service: individualized professional development for faculty on their LMS usage. I'll be speaking about a service that I designed and implemented at the University of Toronto. So I'll give some introduction and background.
I'll provide some information about the range of teaching feedback services that we offer at the University of Toronto. I'll provide some theoretical frameworks that informed the design of this service. I'll give an overview of how this service works from start to finish. I'll share some common themes of what I learned throughout this process, discuss our feedback and evaluation process, and share some advice if you're interested in implementing a similar service at your institution. Sorry, the slide advancer is struggling.
Yes. Okay. There we go. So, just a little bit of background on me: I recently completed my Master of Education at the University of Toronto in online teaching and learning.
Thank you. And I'm a faculty liaison coordinator at the Centre for Teaching Support and Innovation at the University of Toronto; I mainly work on organizing professional development for instructors at the university. The University of Toronto has around a hundred thousand students, we have three campuses, and we are a very decentralized institution. So that's just a little bit of background on U of T: large, research intensive, publicly funded. And these are photos of our three campuses in Scarborough, downtown Toronto, and Mississauga.
And I was gonna do a Mentimeter poll, but I don't know if you've had the same experience; the internet has been very rough. So I just want maybe three people to raise their hands and let me know: when you visit a Canvas course for the first time in the role of a learner, what's the first thing you look for? Ease of navigation. Okay.
Ease of navigation. Yes. We got a second for that. Anything else? To-do list? Okay.
Who's teaching? Who's teaching? Yep. The syllabus? Great. Anything else? Sorry? A decent homepage. Great.
And one more. So the course organization. Perfect. And so before I get into the details of our Canvas course review service, I'll just mention a little bit more about the University of Toronto: we have a collaborative support model for educational technologies. You are not supposed to read this poster.
It's there for show. We do have a link where you can read up on that, but essentially, because we're such a decentralized institution, we do have local educational technology support staff in academic departments and divisions. I represent the central teaching support unit at the university, so this was a central initiative that came from us. So I'll just speak a little bit about teaching feedback services and what they are. Before the pandemic, we offered a service called in-class observations.
We've rebranded them as teaching observations, and this was, basically, a faculty liaison or pedagogical advisor would come to your class, sit in the back of the room for an hour, and provide some feedback to the instructor about what's going on. And during COVID, we did transition a little bit to attending synchronous online sessions and providing feedback on that. But there was a gap: you know, how are instructors using Canvas to meet their teaching and learning goals? Not just in cases where the course is asynchronous, but in courses where it's an online synchronous course and, you know, the Canvas site is hosting the materials, and even for an in-person course before the pandemic where Canvas is really the hub for the course, where students can find materials, submit assignments, and all of that. So there was something missing. And so I developed a service, collaboratively with colleagues, called Canvas Course Reviews, and it's intended to provide instructors with formative feedback on how they're using Canvas to meet their teaching and learning goals.
And so the way it works is that a faculty liaison, so that would be me or one of my colleagues, would actually visit the Quercus, the Canvas course. Sorry, if I say Quercus, that's how we branded Canvas. I will do my best to refer to Canvas throughout, but it might slip out here and there.
We visit their course and provide them with constructive and appreciative feedback on how they're using Canvas tools to meet their teaching and learning goals, and the objective is to improve their pedagogical practice. And so this became a new part of our teaching feedback services. We provide feedback on how they're using the LMS for any course, whether it's in person, hybrid, online synchronous, or online asynchronous. And because there was a gap with teaching observations, where we weren't looking at asynchronous content such as self-paced modules or prerecorded lectures, there was the option through the service to provide feedback on some asynchronous course content. And, just another sidebar, this is how we convey to our community what Canvas can do, or what our academic toolbox can do. So there are four main areas.
It can be used to organize your course content, to connect and communicate with students, to assess student work and provide feedback, and to teach from a distance or teach remotely. And so this is our sort of four-quadrant framework of what the academic toolbox or Canvas can help you do, and I'll be revisiting this throughout the presentation. And so now that I've shared a little bit about how the service works, I will get into more granular details, but just as a reminder, the service is based on us looking at their Canvas course, giving them feedback on how they can improve in the future, and letting them know what they're doing well. I'll give a little bit of theory about what informed the development of the service. How many of you have heard of the TPACK framework? Show of hands.
Okay. A little more than half. So basically, the TPACK framework is looking at the interplay of technology, pedagogy, and content, and how they're basically extremely interdependent, and how, you know, from a faculty development perspective where you want to help instructors improve their teaching practices, you can't just look at the technology, and you can't just look at the pedagogy. They're all mutually reinforcing. And so in designing the service, it's really looking at the interplay of how technology, pedagogy, and to some degree content are interacting. And so faculty development typically consists of workshops and webinars, how-to sessions, one-on-one consultations, and some self-serve resources and documentation. But I felt that there was something missing.
And so if you look back at all of these four, they're really, at the end of the day, training. And training is instructional design, and instructional design is intended to meet a teaching and learning goal that hasn't been met. So something's wrong, or you want to improve, or something's missing; you organize a training session and hopefully, magically, it's fixed. Is that often the case? No. And this clicker is driving me bananas.
And so human performance technology is an extension of instructional design where you're looking at different ways of addressing that possible training issue or that performance issue. And so what you do is you analyze the entire system, what's going on. You design and implement an intervention to address what might be going on in the system, and then you find a way to measure its effectiveness. So let's step out of faculty development for a brief second and pretend that we're working in a factory assembling laptops. And there's a certain configuration needed for different laptops, and you decide, you know, I'm gonna host a one-hour training session to teach all the workers how to do this.
And then you go back onto the factory floor and you notice that there are issues with all the laptops. So maybe training wasn't the right way to tackle that. Maybe instead you needed to reorganize how the assembly line is laid out, or you need to find a different way; you know, maybe they just need a job aid that lets them know, if the laptop comes out this way, you put the thing this way. I have nothing to do with assembly lines.
I'm just making this up, but the idea is training often isn't the right way to do this. And so I analyzed our system in a way: you know, who is our target audience? Maybe there's an instructor that's not gonna reach out for a consultation or submit a support ticket, but they still would benefit from, you know, some one-on-one guidance. Or maybe they're not the ones to reach out for a one-on-one consultation. And so the idea was to create a new entry point for meaningful conversations about teaching, learning, and technology. And so when you're designing an intervention, I'm calling this an intervention,
I'm not gonna go through all of these, but this is basically a really great sort of checklist of things to think about when you're designing something new, because you're trying to influence change, influence behavior, influence performance. So the idea really is to help meet a different type of instructor, one that might not have wanted to, you know, seek out help, but would be open to receiving some feedback. And the service does connect to our unit's strategic plan at the time when I designed it, which was to increase adoption of technology and demonstrate leadership in support for technological innovations. So now that I've given you some theory, let's get into practice.
So I'll go through how the Canvas Course Reviews service works. The first step is an instructor submits a request online. We have a web form. You will be receiving all of the links and all of the materials and everything; I'll be sharing this at the end.
So the instructor will submit a request online, and that will be sent to the service coordinator, who is me. Once the request has been validated, basically just making sure that they've submitted the right request, we will meet for thirty minutes to discuss why they've submitted the request and what it is that they hope to get out of the course review. And so that meeting is about thirty minutes.
We have a structured protocol for that, which will be in the package of documents that I'm sending out after. But basically, it's to sort of unpack what they've included in the request form. So it's like: what do you really want me to look at in your course? What's some feedback you've received from students in the past? Are there any things in your course that you think you're doing particularly well? Is there a particular module that you would like me to zoom in on when I'm looking at your course? And so after the pre-review meeting, we will look at their course for one hour. I could spend days, months, years in some courses, but you only have one hour. And so you use the information from the pre-review meeting to inform how you spend your one hour, how you prioritize your time.
Personally, I like to go in for two or three minutes, just to get my first impression, and I think it goes back to the items you raised at the beginning, like what are the first things you look at? So, like, is there a homepage? Is it just the modules? Am I able to find the syllabus? Stuff like that. And then I go back, you know, a couple hours later, and I finish my review with more detailed notes, but I really like to separate my first impression from getting into the minutiae. After the review, we schedule a one-hour debrief meeting where we go through the course and share the feedback that we found. And I really like it to be a conversation and a dialogue, not a laundry list of, you know, here are the ten things that you need to fix tomorrow. It's: here are the things that I really liked about your course, and here are some things that I think you could do that will help improve student learning. At the end of the day, I really try to bring the student view into the feedback that I provide.
How will making these changes benefit the students? And so after that, I update the written feedback form and I send it to them, and we encourage the instructor to reflect on the feedback and implement the changes as they feel appropriate. And so some key design features that I wanted to highlight: this is instructor initiated. We're never, you know, sent to review anyone's course. It's the instructor's prerogative to open up their course to us, and I'm always very grateful when they do. And I say thank you for letting me look at your course; here's the feedback that I wanna share with you. We only look at one course. Sometimes I've received requests for three at a time, and the answer is no. As you can imagine, it's a very time-intensive process. Well, it's a meaningful process, it's well worth the time, but it is time intensive.
So I ask them to pick one course, and often the feedback from one course will be transferable to their other courses, and they often agree with that. The course can either be a past course, a current course, or a course in the future. This might change the way that I share the feedback back to the instructor, but we do open it up. There's one provision, though: if the course is an upcoming course, it must be student ready.
Like, they must be ready to hit the publish button. And this came from our pilot, where we were doing a future course and there wasn't enough there yet, and I said, you know, can we reschedule this for when it's student ready? So past, present, or future courses are open for review. And like I said, we will only spend one hour looking at their course, and the reason why is you could spend so much time in someone's course, but, you know, at the end of the day, one hour gives you a really good snapshot of things that are working well and could be improved.
We do provide a written document at the end of the review, and this is really helpful for instructors because they can include it in their teaching dossier as evidence of teaching development. Opening up their course, spending time in the pre- and post-review meetings, reflecting on the feedback that they've received: that's a big amount of time that they're spending on their own teaching development. And we do strongly encourage them to reflect on the feedback that they received. So I did mention how U of T is very decentralized, and we have educational technology experts in many of our academic divisions. And so one thing we've done is, I work centrally, but we collaborate with the local teams to provide feedback.
So let's say I receive a request from an instructor in the Faculty of Nursing. I would reach out to the divisional representative from the Faculty of Nursing, and we would do the review together, collaboratively. And so what this means is that the instructor receives feedback from two perspectives. We don't necessarily need to agree on everything; we sometimes can offer two perspectives on how to tackle an issue that we identified. And so that's been really helpful from a central perspective, being able to work more closely with colleagues across the campuses that I might not have had the opportunity to work with otherwise.
We do, however, produce one unified review document. And actually, many of the course reviews that we do are done collaboratively, for training purposes; I'll talk about the training in a little bit. But I really think it's meaningful for an instructor to get two perspectives, and also, going back to that nursing example, the nursing representative knows better what's happening in the nursing courses than I do and can share that knowledge directly with that instructor. And so the timeline for this service: implementation started in summer twenty twenty. We did what I call a pre-pilot, where I really had some extremely drafty materials and I worked with some people that we knew,
just to refine the criteria a little bit. In fall twenty twenty and winter twenty twenty one, so during the pandemic, we ran an official pilot where we reached out to faculty members that we knew and, you know, offered the service to them, and received really good feedback that the materials were working and, you know, about the time commitment on their part, and I adjusted some things on my side. And then we made the service operational in summer twenty twenty one, and in fall twenty twenty one we began collaborating with the divisional teams, so the representatives from the academic departments. And so now we'll go through some common themes that I have observed through running these Canvas course reviews. And so, just a reminder, this is our breakdown of how we speak about our academic toolbox, and conveniently our criteria are based around these four areas. And so I'll give everyone about thirty seconds to read.
Is it a difficult angle to read from this side? I'll give everyone a couple of seconds to read some of our guiding questions. I got a thumbs up. All good. Alright. So this is the first area, organizing content, and I think it speaks to many of the first impressions that you raised at the beginning of the session, and you'll see that there's a slot for questions from the pre-review consultation.
And so if something emerged in that pre-review meeting, the thirty minutes, we'll add something in here just so that it's noted as something that we want in the scope of the review. And so, some things that I've noticed throughout doing many course reviews, and I'll give a number of them. Often there's no homepage, or if there's a homepage, it's very empty. It's not welcoming. It's not offering that social presence that really welcomes students into a course.
Often I see the modules listing as the homepage, and I don't think it's particularly inviting. It doesn't speak to the course content in any way, other than really getting into the minutiae. And so one thing we do is we recommend how to build a homepage. We redirect instructors to our asynchronous, self-paced course on how to build a welcoming homepage.
And so that's definitely one of the main things I've noticed. We at U of T don't recommend the syllabus tool; I don't know if that's the case at other institutions. We typically recommend having it as a link on the homepage or at the very top of the modules, just in terms of the way that the syllabus tool is organized. And so typically we'll recommend where the syllabus could go, or if we found it difficult to find, we'll let them know where they can put it.
You could ask one now, yeah. So why don't you recommend the syllabus tool? The way that it auto-populates some of the information requires very precise data entry, and our syllabus templates are always a document or a PDF, so it's kinda challenging to answer that one, but the output that it creates from your assignments might not actually represent what's happening in your course, and so it requires a level of precision that might not be there. I know it's not the best answer, but typically, at least at our university, instructors put it at the top of their modules or on their homepage in very big text. But, you know, every institution does it a little bit differently. Yes, questions?
Are you talking about in person or online? At your university? Okay. Yeah. So Canvas course reviews can be done for any course modality.
Typically online synchronous or online asynchronous, hybrid, or in person. For an online synchronous course, we'd do a teaching observation. But depending on how the course is structured, basically any course is eligible for review. I think the question was: does your university or school require online courses to follow these specific standards? No. No.
Yep. So another common theme is that there are unnecessary navigation menu items: often the files listing, which we don't recommend making available; the pages listing; what else? The people listing, because of privacy requirements, unless there's a need for people to be visible, we recommend that it be hidden. And so sometimes I'll go into a course and notice there are all these unnecessary items that are basically clouding the student experience when navigating through the course.
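As a side note for anyone who wants to automate that first navigation pass: here's a minimal sketch using the Canvas REST API's course tabs endpoint. The base URL, API token, and course ID are placeholders for your own instance, and the set of menu items to flag simply mirrors the recommendations above.

```python
# Minimal sketch: list a course's navigation tabs and flag visible ones
# that we often recommend hiding. BASE_URL, TOKEN, and COURSE_ID are
# placeholders for your own Canvas instance, API token, and course.
import requests

BASE_URL = "https://canvas.example.edu/api/v1"
TOKEN = "YOUR_API_TOKEN"
COURSE_ID = 12345
OFTEN_UNNECESSARY = {"files", "pages", "people"}  # per the advice above

resp = requests.get(
    f"{BASE_URL}/courses/{COURSE_ID}/tabs",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

for tab in resp.json():
    # Canvas omits the "hidden" field for tabs that are visible to students
    visible = not tab.get("hidden", False)
    flag = "  <- consider hiding" if visible and tab["id"] in OFTEN_UNNECESSARY else ""
    print(f"{tab['label']:<20} visible={visible}{flag}")
```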
Another thing, and this is probably what I spend the most amount of time on, is how the content is organized and whether they're using the right tools to match the type of content that they're offering. So, have they uploaded ten different files as PDFs for readings that could just be on a single page? Could those go in our library reading list tool, stuff like that? Are they using modules for content that isn't exactly meant to be followed in a sequential manner? Are they creating unnecessary pages? And so those are a lot of the things that we notice, and it often leads to a conversation about how your content is structured and ways to organize it pedagogically so that students can, you know, navigate with ease. At the end of the day, the focus is the student learning experience and managing their cognitive load. So, big one here: accessibility issues on pages.
And as well-meaning as instructors are, very few of them know about the accessibility checker. They try to make their content as pretty as possible, and that means adding color and images, but unfortunately those colors aren't accessible, or they might be signifying meaning that doesn't exist, or there's no alt text on the images that they've put up. They're increasing the font size to twenty-four rather than using the built-in headings. And I'm continually surprised by the lack of awareness on this; I think it's a training issue, as I mentioned earlier. And so opening up their course, showing the accessibility checker, and showing how easy it is to fix these issues really raises awareness.
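For readers who want to do a quick pass of their own before a review, here's a minimal sketch that scans published Canvas pages for images missing alt text, one of the most common issues flagged above. It uses the Canvas REST API's pages endpoints; the base URL, token, and course ID are placeholders, and pagination is ignored for brevity.

```python
# Minimal sketch: flag <img> tags without alt text on course pages.
# BASE_URL, the token, and COURSE_ID are placeholders; pagination of
# the Canvas API is ignored for brevity.
import requests
from bs4 import BeautifulSoup

BASE_URL = "https://canvas.example.edu/api/v1"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}
COURSE_ID = 12345

pages = requests.get(
    f"{BASE_URL}/courses/{COURSE_ID}/pages",
    headers=HEADERS,
    params={"per_page": 100},
).json()

for page in pages:
    # Fetch each page individually, since the listing omits the HTML body
    full = requests.get(
        f"{BASE_URL}/courses/{COURSE_ID}/pages/{page['url']}",
        headers=HEADERS,
    ).json()
    soup = BeautifulSoup(full.get("body") or "", "html.parser")
    missing = [img for img in soup.find_all("img") if not img.get("alt")]
    if missing:
        print(f"{page['title']}: {len(missing)} image(s) missing alt text")
```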
And so, going back to the idea about human performance technology versus instructional design, this is another way of addressing some of those concerns. And then internal linking: people are not aware of internal linking and how it can really improve the student experience navigating through a course. Yes. So we'll flag copyright issues, or rather we'll flag copyright resources, but copyright is sort of under the jurisdiction of our library, so we'll redirect them.
And so, and I might get to this later, but some things sort of fall out of scope or are a policy thing. So we'll, you know, redirect them if needed; we really try to stick to the Canvas tools. But, you know, I work in faculty development. I notice issues with learning outcomes all the time, and so, you know, it's not necessarily within the scope of the Canvas review, but they've opened up their course to me. And if I notice that there are no learning outcomes, or learning outcomes that could be phrased a little bit better, I'll send them our learning outcomes guide, as in, you might wanna look at this, in a very gentle way, because it is a little bit out of scope. But if I did notice some obvious potential copyright issue, I would, you know, refer them to the library and also remind them I'm not an expert in that area. Because I'm not.
And so this is the next area. I'll give you about thirty seconds to read through this. This is connecting and communicating with students. I guess give me a thumbs up when you're ready. Alrighty.
So what I've seen in courses, with the announcements tool, is they're either way too frequent, so we're talking six, seven, eight a week sometimes, and I think instructors sometimes forget that students might be taking four or five classes simultaneously. So if that announcement volume is happening in all five courses, that's just email overload. And so we encourage instructors to stick to one announcement a week, you know, barring any unforeseen circumstances. But, you know, maybe there are opportunities to consolidate your messaging or get into a weekly routine of only sending messages on Wednesdays at eleven.
I'm making that up, but trying to get into that practice so that you're not sending an announcement every time there's a little update that probably could wait. On the flip side, there are some instructors that aren't using the announcements tool at all; they might be using email. I don't have access to their Outlook account, I don't want access to their Outlook account, but it's just a good opportunity to remind them, you know, you can use these Canvas tools to communicate; you can take that out of email.
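If you want a rough sense of a course's announcement cadence before a review, here's a minimal sketch that counts announcements per week through the Canvas REST API's announcements endpoint. The base URL, token, course ID, and date range are placeholders, and pagination is again ignored for brevity.

```python
# Minimal sketch: count announcements per ISO week to spot courses that
# send far more than about one per week. All identifiers and dates are
# placeholders; pagination is ignored for brevity.
from collections import Counter
from datetime import datetime
import requests

BASE_URL = "https://canvas.example.edu/api/v1"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}
COURSE_ID = 12345

announcements = requests.get(
    f"{BASE_URL}/announcements",
    headers=HEADERS,
    params={
        "context_codes[]": f"course_{COURSE_ID}",
        "start_date": "2024-01-01",
        "end_date": "2024-04-30",
        "per_page": 100,
    },
).json()

per_week = Counter()
for a in announcements:
    if a.get("posted_at"):
        posted = datetime.fromisoformat(a["posted_at"].replace("Z", "+00:00"))
        year, week, _ = posted.isocalendar()
        per_week[(year, week)] += 1

for (year, week), n in sorted(per_week.items()):
    flag = "  <- heavy week" if n > 2 else ""
    print(f"{year}-W{week:02d}: {n} announcement(s){flag}")
```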
And it's good to remind students to, you know, check their notification settings and all of that. So we typically recommend one announcement a week, barring anything unforeseen. And, yes? Do we have a version of RSI? What's RSI? Sorry. Regular and substantive interaction.
Anti-spam legislation? I'm not aware of anything in Canada along those lines. Yeah. No.
So, one thing I do recommend is sending a nice welcome announcement that fosters a sense of community, social presence, human connection at the beginning of the course, whether it's online, in person, or hybrid. Another thing I've noticed is there's no opportunity or information about Q&A or support. And so we might recommend using the discussions tool to set up a Q&A, or Piazza; we have the Piazza integration approved at the University of Toronto. Another thing that I happen to notice a lot of times with discussions: in the pre-review meeting an instructor might say, you know, students aren't using the discussion board. And then there's either no grades attached or no instructions; they just have an open discussion board with nothing going on.
And so there are either ambiguous or unclear instructions, or, if the discussion is intended to be a bit more meaningful, sometimes the prompts are yes-and-no questions. And so here you can see this is where the pedagogy really starts to intersect with how the technology is being used, and why it's important to have that pedagogical and technological lens when doing a review like this. So this one's short, but I'll still give you about thirty seconds to read through it. This is the assessing student work and providing feedback area.
Just give me a thumbs up when y'all are ready. Alright. So it's really a focus on the instructions, the accessibility of the assessments, and how they're linked to the course syllabus. So often there are no assignment instructions, and the instructor will say, oh, but they're in the syllabus.
And so I say back to them: write in the assignment description "check syllabus" and internally link to those instructions, so that you don't have to worry about maintaining two things, but you're providing an easy access point to that document. Instructors just think, oh, it's all in the syllabus, but how many times have you had the "oh, it's in the syllabus" issue? Then a lot of times, and this is really fascinating to me, instructors will have a rubric, a great rubric, but they're not using the rubrics tool because they didn't know it existed.
And so in the review, I see them, you know, creating a table in the assignment description. And I'm like, did you know there's a rubrics tool in Canvas? And it changes their life. And so it's these little things that, you know, really improve their workflows. Oftentimes, instructors aren't using the grade book properly to calculate their final grades. It's really good if it's a future course, because you can fix it; you let them know, you know, set up a consultation with me or one of your divisional colleagues so that you can get your grade book set up properly, so that your grades will be calculated correctly over the course of the term.
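One quick grade book sanity check that can be automated is whether weighted assignment groups actually sum to one hundred percent. Here's a minimal sketch against the Canvas REST API's assignment groups endpoint; again, the base URL, token, and course ID are placeholders, and the check only applies when the course uses group weighting.

```python
# Minimal sketch: check that assignment group weights sum to 100%,
# a common grade book misconfiguration. Placeholders as before.
import requests

BASE_URL = "https://canvas.example.edu/api/v1"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}
COURSE_ID = 12345

groups = requests.get(
    f"{BASE_URL}/courses/{COURSE_ID}/assignment_groups",
    headers=HEADERS,
).json()

total = 0.0
for g in groups:
    weight = g.get("group_weight") or 0  # None when weighting is off
    print(f"{g['name']:<30} weight={weight}%")
    total += weight

status = "OK" if total == 100 else "check the grade book setup"
print(f"Total: {total}% ({status})")
```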
And so it's really good to have their syllabus, and you can do a little comparison with what they have. The one feature that drives me bananas is when instructors use the one-question-at-a-time quiz setting. I understand there might be some cases where it's needed, but especially when they add the no-backtracking option, I think it's cruel to students, and so I share that with instructors if they're using that feature. Not only does it increase the bandwidth requirements on students, when, you know, you have to reload the quiz for every question, but it just adds a layer of unnecessary anxiety to an assessment experience.
And so, like I said, the service is about creating a new entry point for conversations about pedagogy and technology. And so it's a really good opportunity to talk about, you know, assessment conditions, and whether you think of assessment as assessment for learning. Like, if you have no backtracking on, what happens if I as a student realize, while answering a later question, that, oh, I have the wrong answer to question six? Why should I be penalized for that? And so I like to open up that conversation with instructors when looking at their course. And then, I don't know how many of you know this, but surveys on Canvas aren't actually anonymous; they can be de-anonymized. And so if they're asking for feedback, we recommend that they use Microsoft Forms instead, if it's outside of the official course evaluation period.
And so depending on how they're using surveys, we might recommend using a different tool. And the last area, this one's a little long, so again, put your thumbs up when ready, but this is teaching from a distance, webinars, and lecture recordings. And you'll have access to all this later, so don't worry if you don't finish reading. We're all good? Alright. So one thing I've noticed, and there's a lot of overlap in this area with organizing your course content, but I try to think of it this way:
if you're using the Canvas modules to deliver an asynchronous learning experience, this is about the learning experience itself, not the content. There's overlap, but we'll deal with it. So one thing that I really recommend instructors do, if they are teaching in person or teaching online synchronously, is to adopt a before-during-after framework: things to do to prepare for the session; what we're gonna do during the session and any materials for the session; and things to do after, so reminders about upcoming deadlines and assignments and what's to come. So often it's letting instructors know that I recommend that framework. And often there's a lack of interactivity with asynchronous content.
So there might be, you know, a sixty-minute lecture recording, a prerecorded lecture. I don't think anyone wants to sit through sixty minutes, so maybe five- to ten-minute chunks would be better. So recommending that, but also maybe adding some interactivity: embedding a quiz or, you know, adding some knowledge checks or reflections in the middle of the content, just to make the content more engaging and more focused on student learning. Adding videos, among other asynchronous elements. And then, some other findings, and I think this goes back to the question about copyright: when I'm doing a review, or when my colleagues are doing a review, the instructor has opened up their whole course to us, and that's a moment of vulnerability, but we do notice some other things.
So maybe they're using some third-party tools that aren't approved by the university, and that's okay, they're allowed to use them. But are they using them within the guidelines that we have, which are providing students with an opt-out, offering alternatives, and providing clear language that this is not a University of Toronto approved tool? So letting them know that. And also learning outcomes: I'm seeing a lot of them that begin with "understand" and "recognize" and all of that. And so it's a good conversation.
And so copyright issues might emerge, but, you know, I'd be negligent not to bring up some things that we notice in a course. And so, by the numbers: since that twenty twenty pre-pilot, we've done thirty-nine course reviews at the university for forty-three instructors or course staff, we've worked with twelve academic divisions, and we have twenty-two reviewers across the three campuses that have provided reviews to instructors across the university. And so you don't have to read this.
It'll be in the documents that I'm sharing with you, but we really encourage instructors to reflect on the feedback that they've received. It can be overwhelming to receive feedback, even if it's fully appreciative. And so we encourage instructors to think about: what are the things that I can change now that will have a lot of impact? What might have to wait a little bit? And we also let them know to ask: what feedback am I not prepared to act on? You know, I know Canvas quite well, I know pedagogy pretty well, but there are certain things that they're just not gonna change, and that's okay. They don't need to change everything, and so they have the opportunity to sort of digest the feedback and think about what they can change and what they're not prepared to change.
And we also provide follow-up resources on, you know, whom they can contact for support in implementing some of the feedback that they've received. And so this is a follow-up survey that we send after the reviews, and the feedback has been overwhelmingly positive in terms of them finding the feedback useful, the time investment being reasonable on their part, and whether they would recommend a course review to their colleagues. And they found the resources extremely helpful or very helpful, so the review form and the planning-next-steps guide.
So overall the feedback has been very positive, and I also have some qualitative feedback here, so that hopefully, if you are implementing a similar service at your institution, you can see that it has had value at the University of Toronto. Instructors have reported that, you know, it went above their expectations, that they are gonna take the time to make their course more accessible, that it improved their learning environment, and that it was a fabulous resource that they'll use again and again.
So, some future directions before we open it up for questions. Oh, I have some additional things. We're looking to continue expanding our collaborations with divisional teams; it's a really great opportunity to work with those across our three campuses that I might not otherwise have the opportunity to work with. Increasing our internal capacity to offer these services.
It's time intensive, but I think it pays off on both sides, you know, the faculty developer side as well as the instructor side. More important for the instructor, but I learn a lot too in the process. Looking at ways to evaluate the longer-term impact of the service. And so, if you're familiar with the Kirkpatrick model of evaluation, you wanna look at not only, you know, satisfaction, but are they learning, and are they transferring the knowledge that they've gained?
And then looking at ways to, can we make this a peer review process for instructors? We do have a peer observation of teaching guide, and I would say Canvas very much falls into teaching, regardless of the modality. And so some advice, if you are interested in implementing a similar service, is to pilot it and get feedback. I mentioned earlier we decided that for future courses, we'll only review them if they're student ready, and that came from the pilot, because we noticed that a course was not student ready, and the feedback that I would have given wouldn't have been meaningful at the time. And so pilot the service and you'll get a lot of feedback from instructors; you know, work with the ones that you already know and are your captive audience. They might not be your target audience in the end, but they'll give you feedback about the time and how it works.
The training model works as follows. I'm the service lead. So whenever someone is new to offering the service, they'll be the secondary reviewer; they'll walk through the entire process with me, where I'll be the primary reviewer. I'll lead the meetings and all of that. They'll also review the course independently.
And then when it's time for the second review, we'll flip roles. So they'll be the lead reviewer, they'll lead the meetings, they'll prepare the final document, and I'll be the secondary reviewer, just to make sure that everything's running on course. And depending on their comfort level, typically after two, they're ready to either go on their own or, you know, do it with someone else, with the divisional team, for example. And then develop an evaluation plan.
So think early: what are the questions that you wanna ask? Think about your criteria, and, you know, think about the impact that it's having on your campus. And so here are the resources that I promised. It's a SharePoint folder with the intake form questions, the feedback form, and the reviewer guide. So all of the resources are there. They're downloadable and you can edit them as you see fit; you can just provide credit that it was, you know, inspired by the University of Toronto.
But all the resources are there, and, you know, it has some additional advice on, you know, things I do when I review a course and all of that. But really, the idea here is that you can take these documents and consider implementing a similar service at your own institution. So did everyone get the link? Okay. And so these are some references that I referred to during the session, and we have five minutes for Q&A. Yes?
I mean, you have a really good framework. How do you get it all done in an hour? You only allow an hour for a course? Yeah, there's a lot. So the pre-review meeting really helps you focus your time, because you get a sense of what areas might need the most work; they might express some concern about assessment. It's sort of setting a timer on yourself.
I know some of my colleagues have gone over an hour. I don't. I can't control their time, though. And maybe when you're doing your first few, you know, it'll take more time, but one thing I've noticed is, I mean, you saw that list of common themes, I don't have to rewrite the same statement about, you know, why you should use a homepage. And so I'm gonna be creating a bank of common statements so that it improves the process moving forward.
But after doing four or five, it doesn't take as long, because you end up noticing common patterns. And I use those guiding questions. Yeah. Yeah. So, it depends on how often we're, like, promoting the service.
So it is time intensive, but we do at least five or six a year; it's been slower since returning to campus because we've been working on other things. But during COVID, demand was huge, and instructors really appreciate having this opportunity, that there's, like, focused attention. And that's why in the title I call it individualized professional development, because they're getting one-on-one support and it's not just a consultation. They've initiated the process and they're getting the direct support that they need. So instructor engagement's pretty high, and I know instructors have appreciated the service and they've let their colleagues know.
And then, it's interesting, you do one for one instructor and you get two or three requests, you know, from the same department in the next week or so. So, yeah. Yes? Have you had issues with instructors having difficulty with the feedback, or pushback related to their course? Because sometimes it's, yes, their homepage is lacking something, but it's because they may need some additional support. Yeah. We find that especially, we get, of course, the very common answer: well, this is how I've always done it. Yeah. And it's an in-person course that's moving to hybrid or online, and they can't get out of that mindset. Yeah. So there's a vulnerability, because they've initiated the process.
So I think there's already an openness to receiving feedback, because they initiated the request. That doesn't speak for everyone; I would say there might have been a couple of cases where I've encountered a little bit of resistance, but it's just a matter of sort of explaining your rationale, but also letting them know: this is my perspective, this is still your course, and you can do what you want with it. Like, this is just one perspective, and maybe ask your students about it. But I think it's also about the way that you present the feedback. And so one thing, it's in the guide that I've included, but it's just being very respectful in terms of how you deliver the feedback. So not, oh, you need to change this, but: have you considered, or have you thought about, or, you know, I noticed this one thing, what's one way that we can address this? So trying to be very intentional and respectful in terms of how you share the feedback can help soften the message a little bit. But I think part of it is also that the framing is really designed to be appreciative. It's not: here are ten things you need to fix. It's: here's what's working really well in your course, and here are a few things I would change so that you can improve your course even more.
And so it's starting from a place of appreciation. Yes? The question is: were there sort of operational or institutional exigencies that drove you to develop this process, as opposed to working with an existing framework for course evaluation or course review, QM or OSCQR or something like that? You know, those are very intensive.
Yeah. Just curious, like, what directed you to do this internal development. Yeah. So, the service design, and I think we only have a minute left, but the service design started shortly after our implementation of Canvas, and so there was sort of an appetite for wanting to improve, you know, how they're using the tools and all that. And so that partially inspired it, but also, going back to the idea of human performance technology, it's about trying to reach a different persona of instructor who might not reach out for help, but would be open to feedback. And so that was sort of my analysis of the University of Toronto system, especially me being a new employee, was that there was an opportunity for something else. And it was also this whole idea of an hour: the pre-review meeting was based on our in-class observation framework.
So it was adapted based on that. It was trying to be somewhat parallel, but not identical, to an in-class observation, just being able to offer a service on a different aspect of an instructor's teaching. So I hope that answers it. I think we're up on time. So thank you.
I'll provide some information about the range of teaching feedback services that we offer at the University of Toronto. I'll provide some theoretical frameworks that inform the design of this service. I'll give an overview of how this service works from start to finish. I'll share some common themes of what I learned, throughout this process, discuss our feedback and evaluation process and share some advice if you're interested in implementing a similar service at your institution. So sorry slight advances struggling.
Yes. Okay. There we go. So, just a little bit of background for me. I recently completed my master of education at the University of Toronto in online teaching and learning.
Thank you. And, I'm a faculty liaison court Nator, at the center for teaching support and innovation at the University of Toronto, mainly work on organizing professional development for instructors, at the university. So the University of Toronto has around a hundred thousand students We have three campuses and we are a very decentralized institution. So that's just a little bit of background into UFT, large, research intensive, publicly funded. And these are photos of our three campuses in Scarborough, Downtown Toronto mississauga.
And I was gonna do a metrometer pull, but I don't know if you've had the same experience. The internet has been very rough. So I just want maybe three people to raise their hands and let me know when you visit a Canvas course for the first time in the role of a learner, what's the first thing you look for. Ease of navigation. Okay.
Ease of navigation. Yes. We got a second for that. Anything else. To do list? Okay.
Who's teaching? Who's teaching? Yep. The syllabus? Great. Anything else. Sorry? A decent homepage. Great.
And one more. So the course organization. Perfect. And so before I get into the details of our Canvas course review service, I'll just mention a little bit more about the University of Toronto is that we have a collaborative support model for educational technologies. You are not supposed to read this poster.
It's there for, for show. We do have a link where you can read up on that, but essentially because we're such a decentralized institution, we do have local educational technology support staff in academic departments and divisions. I represent the central teaching support unit University. So this was a central initiative that came from us. So we'll just speak a little bit about teaching feedback services and and what they are so before the pandemic, we offered a service called in class observations.
We've rebranded them as teaching observations, and this was, basically someone, faculty liaison pedagogical advisor would come to your class, sit in in the back of the room for an hour and provide some feedback to the instructor about what's going on. And during COVID, we did transition a little bit to, attending synchronous online sessions and providing feedback on that. But there was a missing gap where know, how are instructors using Canvas to meet their teaching learning goals? Not only not not just in cases where the course is a sync but in courses where it's an online synchronous course and, you know, the canvas site is hosting the materials and even for in person course before the pandemic where Canvas is really the hub for the course, where students can find materials, submit assignments, and all of that. So there was something missing. And so, I developed a service, collaboratively with colleagues called Canvas course reviews, and it's intended to provide instructors with formative feedback on how they're using Canvas to meet their teaching and learning goals.
And so the way it works is that a faculty T liaison. So that would be me or one of my colleagues. We would actually visit the quercus, the canvas course. Sorry, if I say quercus, it's that's how we branded canvas. I will do my best to refer to canvas throughout, but it might slip out here and there.
Where we visit their course and provide them with constructive and appreciative feedback on how they're using Canvas tools to meet their teaching learning goals, and the objective is to improve their pedagogical And so this became a new part of our teaching feedback services. We provide feedback on how using the LMS for any course, whether it's in person, hybrid, online synchronous or online asynchronous. And because there was a gap, with teaching observations where we weren't looking at asynchronous content such as self paced modules or prerecorded lectures, there was the option through the service to provide feedback on some asynchronous course content. And, just another side bar this is how we, convey to our community what Canvas can do, or what our academic toolbox can do. So there are four main areas.
It can be used to organize your course content, to connect and communicate with students to assess student work and provide feedback and to teach from a distance or or teach remotely. And so this is our sort of four quadrant framework of what the academic toolbox or Canvas can help you do and I'll be revisiting this throughout the presentation. And so now that I've shared a little bit about how the service works, I will get into more granular details, but just so you know, as a reminder, the service is based we're looking at their canvas course, and we're giving them feedback on how they can improve in the future and letting them know what they're doing well. I'll give a little bit of theory about how what what informed the development of the service. How many of have heard of the T pack framework, show of hands.
Okay. About a little more than half, but, so basically, the T pack framework, it's looking at the interplay of technology, pedagogy, and content, and how they're basically extremely interdependent and how you know, in a faculty development perspective where you want to help instructors improve their teaching practices, you can't just look at the technology or you can't just look at the pedagogue They all are mutually reinforcing. And so in designing the service, it's really looking at the interplay of how technology pedagogy, and to some degree their content are interacting. And so faculty development typically consists of workshops and webinars, how two sessions, one on one consultations, and some self serve resources and documentation. But I felt that there was something missing.
And so if you look back at all of these four, they're really at the end of the day. Training, and training is instructional design, and instructional design is intended to meet a teaching and learning goal that hasn't been met. So something's wrong and you're or you want to improve or something's missing, you organize a training session and hopefully magically it's fixed. Is that often the case? No. And this, this clicker is driving the bananas.
And so human performance technology is an extension of instructional design where you're looking at different ways of addressing that possible training issue or that performance issue. And so what you do is you add the entire system, what's going on. You design and implement an intervention to address what might be going on in the system, and then you find a way to measure its effectiveness. So let's step out of faculty development for a brief second and pretend that we're working in a factory assembling laptop. And there's a certain configuration needed for different for different laptops and you decide, you know, I'm gonna host a one hour training session to teach all of the all the workers how to do this.
And then you go back onto the factory floor and you notice that there are issues with all the laptops. So maybe training wasn't the right way to tackle that. Maybe instead you need, they, you know, you needed to reorganize how the assembly line is to delivered or you need to find a different way to, you know, maybe they just need a job aid that lets them know if the laptop comes out this way. You put the thing this way. I'm I have nothing to do with assembly lines.
I'm just making this up, but the idea is training often isn't the right way to do this. And so I analyzed our system in a way, you know, who is our target audience? Maybe there's an instructor that's not gonna reach out and for a consultation or submit a support ticket, but they still would benefit from, you know, from some one on one guidance. Or maybe they're not the ones to reach out for a one on one consultation patient. And so the idea was to create a new entry point for meaningful conversations about teaching, learning, and technology. And so when you're designing an intervention, I'm calling this an intervention.
I'm not gonna go through all of these, but it's basically this is a a really great sort of checklist of things to think about when you're designing something new. Because you're trying to influence change influence behavior, influence performance. So the idea really is to help meet a different characteristic of instructor that might not have wanted to, you know, see out help, but would be open to receiving some feedback. And this, the service of the way that I it does connect to our unit strategic plan at the time when I designed it which was to increase adoption of technology and and demonstrate leadership in support for technological innovations. So now that I've given you some theory, let's get into practice.
So this is I'll go through how the canvas course reviews service works. So, the first step is an instructor submits quest online. We have a web form. You will be receiving all of the links and all of the materials and everything. So, I'll be sharing this the end.
So the instructor will submit a request online and that will be sent to the service coordinator who is me. So once the request been validated. Basically, just make sure that they're they've submitted the right request. Will meet for thirty minutes to discuss us why they've submitted the request and what it is that they hope to get out of the course review. And so that meeting is about thirty minutes.
We have a structured view protocol for that, that will be in the package of documents that I'm sending out, that that will be included after. But basically, it's to sort of unpack what they've included in the request form. So it's like, what do you really want me to look at in your course? What are the, you know, what are the, what's what's some feedback you've received from students in the past? Are there any things in your course that you think you're doing particularly well? Is there a particular module that you would like me to zoom in on when I'm looking at your course? And so after the pre review meeting, we will look at their course for one hour. So I could spend days months, years in some course but you only have one hour. And so you use the information from the pre review meeting to inform how you, spend your one hour of time how you prioritize your time.
Personally, I like to go in for two or three minutes. Just get my first impression, and I think it goes back to the items you raised at the beginning, like what are the first things you look at? So, like, is there a homepage? Is it just the modules? Am I able to find the syllabus? Stuff like that? And then I go back, you know, a couple hours later, and then I finish my review with more detailed notes, but I really like to separate my impression from my getting into the minutia. After the review, we schedule a one hour debrief meeting where we go through the course and share the feedback, that we found. And I really like it to be a conversation and a dialogue, not a laundry list of, you know, here are the ten things that you need to fix tomorrow It's here are the things that I really liked about your course, and here are some things that I think you could do that will help improve student learning. At the end of the day, I really try to bring the student view into how, into the feedback that I provide.
How will making these changes benefit the students? And so after that, I update the written feedback form and I send it to them and we encourage the instructor to reflect on the feed back and implement the changes as they feel appropriate. And so some key design features that I wanted to highlight is that this is instructor initiated. We're never, you know, sent to review anyone's course. No one's so it's the instructor's it's the instructor's prerogative to open up their course to us and I'm always very grateful when they do. And I say thank you for letting me look at your course, you know, here's the feedback that.
I wanna share with you. We only look at one course. So sometimes I've received requests for three at a time, and the answer is no. As you can imagine, it's a very time intensive process, a well it's a meaningful process. It's well worth the time, but it's it's, time intensive.
So I ask them to pick one course, and often the feedback from one course will be transferable to their other courses, and they often agree with that. The course can be a past course, a current course, or a future course. This might change the way that I share the feedback back to the instructor, but we do open it up. There's one provision, though: if the course is an upcoming course, it must be student ready.
Like, they must be ready to hit the publish button. And this came from our pilot, where we were doing a future course and there wasn't enough there yet, and I said, you know, can we reschedule this for when it's student ready? So past, present, or future courses are open for review. And like I said, we will only spend one hour looking at their course, and the reason why is you could spend so much time in someone's course, but at the end of the day, one hour gives you a really good snapshot of things that are working well and things that could be improved.
We do provide a written document at the end of the review, and this is really helpful for instructors because they can include it in their teaching dossier as evidence of teaching development. Between opening up their course, spending time in the pre- and post-review meetings, and reflecting on the feedback they've received, that's a big amount of time they're investing in their own teaching development. And we do strongly encourage them to reflect on the feedback that they received. So I did mention how U of T is very decentralized, and we have educational technology experts in many of our academic divisions. And so one thing we've done is, I work centrally, but we collaborated with the local teams to provide feedback.
So let's say I receive a request from an instructor in the Faculty of Nursing. I would reach out to the divisional representative from the Faculty of Nursing, and we would do the review together. It's collaborative, and what this means is that the instructor receives feedback from two perspectives. We don't necessarily need to agree on everything; we can sometimes offer two perspectives on how to tackle an issue that we identified. And that's been really helpful from a central perspective, being able to work more closely with colleagues across the campuses that I might not have had the opportunity to work with otherwise.
We do, however, produce one unified review document. And actually, many of the course reviews that we do are done collaboratively for a training purpose, so I'll talk about the training in a little bit. But I really think it's meaningful for an instructor to get two perspectives, and also, going back to that nursing example, the nursing representative knows better what's happening in the nursing courses than I do and can share that knowledge directly with that instructor. And so, the timeline for this service: implementation started in summer twenty twenty. We did what I call a pre-pilot, where I had some extremely drafty materials and I worked with some people that we knew.
That was just to refine the criteria a little bit. In fall twenty twenty and winter twenty twenty one, so during the pandemic, we ran an official pilot where we reached out to faculty members that we knew and offered the service to them, and we received really good feedback that the materials were working and that the time commitment on their part was reasonable, and I adjusted some things on my side. And then we made the service operational in summer twenty twenty one, and in fall twenty twenty one, we began collaborating with the divisional teams, the representatives from the academic departments. So now we'll go through some common themes that I have observed through running these Canvas course reviews. And just a reminder, this is our breakdown of how we speak about our academic toolbox, and conveniently, our criteria are based around these four areas. And so I'll give everyone about thirty seconds to read.
Is it a difficult angle to read from this side? I'll give everyone a couple of seconds to read some of our guiding questions. I got a thumbs up. All good. Alright. So this is the first area, organizing content, and I think it speaks to many of the first impressions that you raised at the beginning of the session, and you'll see that there's a slot for questions from the pre-review consultation.
And so if something emerged in that pre-review meeting, the thirty minutes, we'll add something in here just so that it's noted as something that we want in the scope of the review. And so, some things that I've noticed throughout doing many course reviews, and I'll name a few. Often there's no homepage, or if there's a homepage, it's very empty. It's not welcoming. It's not offering that social presence that really welcomes students into a course.
Often I see the modules listing as the homepage, and I don't think it's particularly inviting. It doesn't speak to the course content in any way other than getting right into the minutiae. And so one thing we do is we recommend how to build a homepage. We have an asynchronous, self-paced course on how to build a welcoming homepage, and we redirect instructors to it.
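As an aside, if you wanted to screen for missing or empty homepages at scale rather than course by course, the Canvas REST API exposes both a course's default view and its designated front page. Here's a minimal sketch of that check in Python; this is not part of our actual service, and the instance URL, token, course ID, and the "very empty" threshold are all placeholder assumptions you'd adjust.

```python
import requests

BASE = "https://canvas.example.edu/api/v1"            # hypothetical instance
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token

def homepage_check(course_id: int) -> None:
    """Flag courses whose homepage is missing or looks nearly empty."""
    course = requests.get(f"{BASE}/courses/{course_id}", headers=HEADERS).json()
    # default_view is "wiki" when a custom front page is the homepage;
    # "modules", "syllabus", "assignments", or "feed" otherwise.
    view = course.get("default_view")
    if view != "wiki":
        print(f"Homepage is the '{view}' view; consider a welcoming front page.")
        return
    resp = requests.get(f"{BASE}/courses/{course_id}/front_page", headers=HEADERS)
    if not resp.ok:  # no front page has been designated for this course
        print("default_view is 'wiki' but no front page is set.")
        return
    body = (resp.json().get("body") or "").strip()
    if len(body) < 300:  # arbitrary threshold for "very empty"
        print("A front page exists, but it looks nearly empty.")

homepage_check(12345)  # hypothetical course ID
```

In practice we do this by eye during the one-hour review; a script like this would only be a triage aid before the conversation.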
And so that's definitely one of the main things I've noticed. We at U of T don't recommend the syllabus tool; I don't know if that's the case at other institutions. We typically recommend having the syllabus as a link on the homepage or at the very top of the modules, just because of the way that the syllabus tool is organized. So typically we'll recommend where the syllabus could go, or if we found it difficult to find, we'll let them know where they can put it.
You could ask a question now. Yeah. So, why don't we recommend the syllabus tool? The way that it auto-populates some of the information requires very precise data entry, and our syllabus templates are always a document or a PDF. It's kind of challenging to answer that one, but the output that it creates from your assignments might not actually represent what's happening in your course, and it requires a level of precision that might not be there. I know it's not the best answer, but typically, at least at our university, instructors put the syllabus at the top of their modules or on their homepage in very big text. But, you know, every institution does it a little bit differently. Yes, questions?
Am I talking about in-person or online courses at our university? So, Canvas course reviews can be done for any course modality.
Typically online synchronous or online asynchronous, hybrid, or in person. For an online synchronous course, we'd also do a teaching observation. But depending on how the course is structured, basically any course is eligible for review. And does the university or school require online courses to follow these specific standards? No. No.
Yep. So another common theme is that there are unnecessary navigation menu items: often the files listing, which we don't recommend making available; the pages listing; and the people listing, which, because of privacy requirements, we recommend be hidden unless there's a need for it to be visible. So sometimes I'll go into a course and notice there are all these unnecessary items that are basically clouding the student experience when navigating through the course.
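Navigation items are another thing you could triage programmatically before a review, since the Canvas Tabs API reports which course navigation items are hidden. A rough sketch; the instance URL and token are placeholders, and the set of "usually hidden" items is just the examples from this talk, not an official list.

```python
import requests

BASE = "https://canvas.example.edu/api/v1"            # hypothetical instance
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token

# Items we'd often suggest hiding from students; adapt to your institution.
USUALLY_HIDDEN = {"files", "pages", "people"}

def navigation_audit(course_id: int) -> None:
    """List visible course navigation items that are often unnecessary."""
    tabs = requests.get(f"{BASE}/courses/{course_id}/tabs", headers=HEADERS).json()
    for tab in tabs:
        # Tabs the instructor has hidden carry "hidden": true
        # when queried with a teacher or admin token.
        if tab["id"] in USUALLY_HIDDEN and not tab.get("hidden", False):
            print(f"'{tab['label']}' is visible to students; worth a conversation?")

navigation_audit(12345)  # hypothetical course ID
```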
Another thing, and this is probably what I spend the most time on, is how the content is organized and whether they're using the right tools to match the type of content they're offering. So, have they uploaded ten different files as PDF readings that could just be on a single page? Could those go in our library reading list tool, stuff like that? Are they using modules for content that isn't meant to be followed in a sequential manner? Are they creating unnecessary pages? Those are a lot of the things that we notice, and it often leads to a conversation about how your content is structured and ways to organize it pedagogically so that students can navigate with ease. At the end of the day, the focus is the student learning experience and managing their cognitive load. So, big one here: accessibility issues on pages.
As well meaning as instructors are, very few of them know about the accessibility checker. They try to make their content as pretty as possible, and that means adding color and images, but unfortunately those colors aren't accessible, or color might be carrying meaning on its own, or there's no alt text on the images that they've put up. They're increasing the font size to twenty-four rather than using the built-in headings. I'm continually surprised by the lack of awareness on this; I think it's a training issue, as I mentioned earlier. And so opening up their course, showing the accessibility checker, and showing how easy it is to fix these issues really raises awareness.
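The built-in checker in the Rich Content Editor is what we show instructors, but if you wanted a quick pre-review scan for the most common issue, missing alt text, something like the sketch below would work. It assumes the third-party requests and beautifulsoup4 packages, a placeholder instance URL and token, and for brevity it ignores pagination past the first hundred pages.

```python
import requests
from bs4 import BeautifulSoup  # assumes beautifulsoup4 is installed

BASE = "https://canvas.example.edu/api/v1"            # hypothetical instance
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token

def alt_text_audit(course_id: int) -> None:
    """Flag images without alt text on each Canvas page in a course."""
    pages = requests.get(f"{BASE}/courses/{course_id}/pages",
                         headers=HEADERS, params={"per_page": 100}).json()
    for page in pages:
        # The list endpoint omits the HTML body, so fetch each page by its slug.
        full = requests.get(f"{BASE}/courses/{course_id}/pages/{page['url']}",
                            headers=HEADERS).json()
        soup = BeautifulSoup(full.get("body") or "", "html.parser")
        missing = [img for img in soup.find_all("img") if not img.get("alt")]
        if missing:
            print(f"{page['title']}: {len(missing)} image(s) missing alt text")

alt_text_audit(12345)  # hypothetical course ID
```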
Going back to the idea about human performance technology versus instructional design, this is another way of addressing some of those concerns. And then internal linking: people are not aware of internal linking and how it can really improve the student experience navigating through a course. Yes? So, we'll flag copyright resources, but copyright is under the jurisdiction of our library, so we'll redirect them.
And so, and I might get to this later, some things fall out of scope or are a policy thing, so we'll redirect them if needed. We really try to stick to the Canvas tools, but, you know, I work in faculty development. I notice issues with learning outcomes all the time, and it's not necessarily within the scope of the Canvas review, but they've opened up their course to me. And if I notice that there are no learning outcomes, or learning outcomes that could be phrased a little bit better, I'll send them our learning outcomes guide as a "you might wanna look at this," in a very gentle way, because it is a little bit out of scope. And if I did notice some obvious potential copyright issue, I would refer them to the library and also remind them I'm not an expert in that area. Because I'm not.
And so this is the next area. I'll give you about thirty seconds to read through this. This is connecting and communicating with students. I guess give me a thumbs up when you're ready. Alrighty.
So what I've seen in courses is, with the announcements tool, they're either way too frequent, so we're talking six, seven, eight a week sometimes, and I think instructors sometimes forget that students might be taking four or five classes simultaneously. So if that announcement volume is happening in all five courses, that's just email overload. And so we encourage instructors to stick to one announcement a week, barring any unforeseen circumstances. You know, maybe there are opportunities to consolidate your messaging or get into a weekly routine of only sending messages on Wednesdays at eleven.
I'm making that up, but the point is trying to get into that practice so that you're not sending an announcement every time there's a little thing that probably could wait. On the flip side, there are some instructors that aren't using the announcements tool at all. They might be using email. I don't have access to their Outlook account, I don't want access to their Outlook account, but it's just a good opportunity to remind them, you know, you can use these Canvas tools to communicate, you can take that out of email.
And it's good to remind students to check their notification settings and all of that. So we typically recommend one announcement a week, barring anything unforeseen. And, yes? Do we have a version of RSI? What's RSI, sorry? Regular and substantive interaction.
Anti-spam legislation? I'm not aware of anything along those lines in Canada. Yeah. No.
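On the cadence point: if you wanted to check a course's announcement frequency before the debrief rather than scrolling the announcements page, the Canvas Announcements API makes it easy to tally posts per week. A minimal sketch, with a placeholder instance, token, course ID, and term dates:

```python
from collections import Counter
from datetime import datetime
import requests

BASE = "https://canvas.example.edu/api/v1"            # hypothetical instance
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token

def announcement_cadence(course_id: int, start: str, end: str) -> None:
    """Count announcements per ISO week to spot six-a-week (or silent) courses."""
    anns = requests.get(f"{BASE}/announcements", headers=HEADERS,
                        params={"context_codes[]": f"course_{course_id}",
                                "start_date": start, "end_date": end,
                                "per_page": 100}).json()
    weeks = Counter()
    for ann in anns:
        posted = ann.get("posted_at")  # e.g. "2024-09-04T15:00:00Z"
        if posted:
            iso = datetime.fromisoformat(posted.replace("Z", "+00:00")).isocalendar()
            weeks[(iso[0], iso[1])] += 1  # (ISO year, ISO week)
    for week, n in sorted(weeks.items()):
        flag = "  <- more than one a week" if n > 1 else ""
        print(f"{week}: {n} announcement(s){flag}")

announcement_cadence(12345, "2024-09-01", "2024-12-20")  # hypothetical values
```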
So, one thing I do recommend is sending a nice welcome announcement that fosters a sense of community, social presence, human connection at the beginning of the course, whether it's online, in person, or hybrid. Another thing I've noticed is there's no opportunity or information about Q&A or support. And so we might recommend using the discussions tool to set up a Q&A, or Piazza; we have the Piazza integration approved at the University of Toronto. Another thing that I happen to notice a lot with discussions: in the pre-review meeting an instructor might say, you know, students aren't using the discussion board. And then there are either no grades attached or no instructions; they just have an open discussion board with nothing going on.
So there are either ambiguous or unclear instructions, or, if the discussion is intended to be a bit more meaningful, sometimes the prompts are yes-or-no questions. And here you can see where the pedagogy really starts to intersect with how the technology is being used, and why it's important to have that pedagogical and technological lens when doing a review like this. So this one's short, but I'll still give you about thirty seconds to read through it. This is the assessing student work and providing feedback area.
Just give me a thumbs up when y'all are ready. Alright. So really, a focus on the instructions, the accessibility of the assessments, and how they're linked to the course syllabus. So often there are no assignment instructions, and the instructor will say, oh, but they're in the syllabus.
And so I say back to them: write "check syllabus" in the assignment description and internally link to those instructions, so that you don't have to worry about maintaining two things but you're still providing an easy access point to that document. Instructors just think, oh, it's all in the syllabus, but how many times have you had the "oh, it's in the syllabus" issue? And a lot of times, and this is really fascinating to me, the assignment instructions will have a rubric, a great rubric, but they're not using the rubrics tool because they didn't know it existed.
And so in the review, I see them creating a table in the assignment description, and I'm like, did you know there's a rubrics tool in Canvas? And it changes their life. It's these little things that really improve their workflows. Oftentimes, instructors aren't using the grade book properly to calculate their final grades. It's really good if it's a future course, because you can fix it, but otherwise you let them know: in the future, set up a consultation with me or one of your divisional colleagues so that you can get your grade book set up properly and your grades will be calculated correctly over the course of the term. And it's really good to have their syllabus there, because you can do a little comparison with what they have set up.
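One quick way to do that comparison ahead of the debrief: the API exposes whether assignment group weighting is turned on and what the weights are, so you can line them up against the grading scheme promised in the syllabus. A sketch under the usual placeholder assumptions (instance URL, token, course ID):

```python
import requests

BASE = "https://canvas.example.edu/api/v1"            # hypothetical instance
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token

def gradebook_check(course_id: int) -> None:
    """Compare weighted assignment groups against what the syllabus promises."""
    course = requests.get(f"{BASE}/courses/{course_id}", headers=HEADERS).json()
    groups = requests.get(f"{BASE}/courses/{course_id}/assignment_groups",
                          headers=HEADERS).json()
    if not course.get("apply_assignment_group_weights"):
        print("Group weighting is off; final grades are point-based. Intended?")
        return
    total = sum(g.get("group_weight") or 0 for g in groups)
    for g in groups:
        print(f"  {g['name']}: {g.get('group_weight') or 0}%")
    if total != 100:
        print(f"Weights sum to {total}%, not 100%; check against the syllabus.")

gradebook_check(12345)  # hypothetical course ID
```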
The one feature that drives me bananas is when instructors use the one-question-at-a-time setting. I understand there might be some cases where it's needed, but especially when they add the no-backtracking option, I think it's cruel to students, and I share that with instructors if they're using that feature. Not only does it increase the bandwidth requirements on students, since the quiz has to reload for every question, but it just adds a layer of unnecessary anxiety to an assessment experience.
And so, like I said, the service is about creating a new entry point for conversations about pedagogy and technology. It's a really good opportunity to talk about assessments and their conditions, and whether you think of assessment as assessment for learning. Like, if you have no backtracking on, what happens if I as a student realize, while answering a later question, that, oh, I have the wrong answer to question six? Why should I be penalized for that? I like to open up that conversation with instructors when looking at their course. And then, I don't know how many of you know this, but surveys on Canvas aren't actually anonymous; they can be de-anonymized. So if instructors are asking for feedback, we recommend that they use Microsoft Forms instead, if it's outside of the official course evaluation period. And depending on how they're using surveys, we might recommend a different tool.
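Both of those settings are visible through the classic Quizzes API, so you could flag them before ever opening the course; New Quizzes would need a different endpoint. Again a minimal sketch with placeholder credentials, pagination ignored past the first hundred quizzes:

```python
import requests

BASE = "https://canvas.example.edu/api/v1"            # hypothetical instance
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token

def quiz_settings_audit(course_id: int) -> None:
    """Flag no-backtracking quizzes and surveys in classic Quizzes."""
    quizzes = requests.get(f"{BASE}/courses/{course_id}/quizzes",
                           headers=HEADERS, params={"per_page": 100}).json()
    for q in quizzes:
        if q.get("one_question_at_a_time") and q.get("cant_go_back"):
            print(f"'{q['title']}': one question at a time with no backtracking")
        if q.get("quiz_type") in ("survey", "graded_survey"):
            # Even "anonymous" Canvas surveys can be de-anonymized, so for
            # feedback we'd point instructors to an external form tool instead.
            print(f"'{q['title']}': survey; consider an external form instead")

quiz_settings_audit(12345)  # hypothetical course ID
```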
And the last area. This one's a little long, so again, put your thumbs up when ready. This is teaching from a distance, webinars, and lecture recordings. And you'll have access to all this later, so don't worry if you don't finish reading. We're all good? Alright. So one thing I've noticed, and there's a lot of overlap in this area with organizing your course content, but I try to think of it this way:
if you're using the Canvas modules to deliver an asynchronous learning experience, this area is about the learning experience itself, not the content. There's overlap, but we'll deal with it. So, one thing that I really recommend instructors do, if they are teaching in person or teaching online synchronously, is adopt a before-during-after framework: things to do to prepare for the session, what we're gonna do during the session and any materials for it, and things to do after, so reminders about upcoming deadlines and assignments and what's to come. So often it's just letting instructors know that I recommend that framework. Often there's also a lack of interactivity with asynchronous content.
So there might be a sixty-minute prerecorded lecture. I don't think anyone wants to sit through sixty minutes, so maybe five- to ten-minute chunks would be better. So we recommend that, but also maybe adding some interactivity: embedding a quiz, adding some knowledge checks or reflections in the middle of the content, just to make it more engaging and more focused on student learning. Adding videos, among other asynchronous elements. And then, some other findings. I think this goes back to the question about copyright: when I'm doing a review, or when my colleagues are doing a review, they've opened up their whole course to me, and that's a moment of vulnerability, but we do notice some other things.
So maybe they're using some third-party tools that aren't approved by the university, and that's okay, they're allowed to use them. But are they using them within the guidelines that we have, which are providing students with an opt-out, offering alternatives, and providing clear language that this is not a University of Toronto approved tool? So we let them know that. And also learning outcomes: I'm seeing a lot of them that begin with "understand" and "recognize" and all of that. And so it's a good conversation.
And so copyright issues might emerge, but I'd be negligent not to bring up some things that we notice in a course. And so, by the numbers: since that twenty twenty pre-pilot, we've done thirty-nine course reviews at the university for forty-three instructors or course staff. We've worked with twelve academic divisions, and we have twenty-two reviewers across the three campuses that have provided reviews to instructors across the university. And you don't have to read this.
It'll be in the documents that I'm sharing with you, but we really encourage instructors to reflect on the feedback that they've received. It can be overwhelming to receive feedback, even if it's fully appreciative. So we encourage instructors to think about: what are the things that I can change now that will have a lot of impact? What might have to wait a little bit? And we also let them ask: what feedback am I not prepared to act on? You know, I know Canvas quite well, I know pedagogy pretty well, but there are certain things that they're just not gonna change, and that's okay. They don't need to change everything, and so they have the opportunity to digest the feedback and think about what they can change and what they're not prepared to change.
And we also provide follow-up resources on whom they can contact for support in implementing some of the feedback that they've received. And so, this is a follow-up survey that we send after the reviews, and the feedback has been overwhelmingly positive: they found the feedback useful, the time investment on their part was reasonable, and they would recommend a course review to their colleagues. And they found the resources, the review form and the planning-next-steps guide, extremely helpful or very helpful.
So overall the feedback has been very positive, and I also have some qualitative feedback here, so that hopefully, if you are implementing a similar service at your institution, you can see that it has had value at the University of Toronto. Instructors have reported that it went above their expectations, that they are gonna take the time to make their course more accessible, that it improved their learning environment, and that it was a fabulous resource that they'll use again and again.
So, some future directions before we open it up for questions. Oh, I have some additional things. We're looking to continue expanding our collaborations with divisional teams; it's a really great opportunity to work with colleagues across our three campuses that I might not otherwise have the opportunity to work with. And increasing our internal capacity to offer these services.
It's time intensive, but I think it pays off on both the faculty developer side and the instructor side. More important for the instructor, but I learn a lot too in the process. We're also looking at ways to evaluate the longer-term impact of the service. So if you're familiar with the Kirkpatrick model of evaluation, you wanna look at not only satisfaction, but are they learning, and are they transferring the knowledge that they've gained?
And then looking at whether we can make this a peer-review process for instructors. We do have a peer observation of teaching guide, and I would say Canvas very much falls into teaching, regardless of the modality. And so some advice, if you are interested in implementing a similar service, is to pilot it and get feedback. I mentioned earlier we decided that for future courses, we'll only review them if they're student ready, and that came from the pilot, because we noticed that a course was not student ready, and the feedback that I would have given wouldn't have been meaningful at the time. So pilot the service and you'll get a lot of feedback from instructors. Work with the ones that you already know, your captive audience. They might not be your target audience in the end, but they'll give you feedback about the time commitment and how it works.
The training model works as follows. I'm the service lead, so whenever someone is new to offering the service, they'll be the secondary reviewer and walk through the entire process with me, where I'll be the primary reviewer and lead the meetings and all of that. They'll also review the course independently.
And then when it's time for the second review, we'll flip roles. They'll be the lead reviewer, they'll lead the meetings, they'll prepare the final document, and I'll be the secondary reviewer, just to make sure that everything's running on course. And depending on their comfort level, typically after two, they're ready to either go on their own or do it with someone else, with the divisional team, for example. And then, develop an evaluation plan.
So think early: what are the questions that you wanna ask? Think about your criteria, and think about the impact that it's having on your campus. And so here are the resources that I promised. It's a SharePoint folder with the intake form questions, the feedback form, and the reviewer guide. So all of the resources are there. They're downloadable, and you can edit them as you see fit; just provide credit that it was inspired by the University of Toronto.
But all the resources are there, and it has some additional advice on things I do when I review a course and all of that. But really, the idea here is that you can take these documents and consider implementing a similar service at your own institution. So, did everyone get the link? Okay. And these are some references that I referred to during the session, and we have five minutes for Q&A. Yes?
You have a really good framework; how do I get it all done in an hour? You only allow an hour for a course? Yeah, there's a lot. So, the pre-review meeting really helps you focus your time, because you get a sense of what areas might need the most work; they might express some concern about assessment, for example. And it's sort of setting a timer on yourself.
I know some of my colleagues have gone over an hour. I don't. I can't control their time, though. And maybe when you're doing your first few, it'll take more time, but one thing I've noticed is, you saw that list of common themes, I don't have to rewrite the same statement about, you know, why you should use a homepage. And I'm gonna be creating a bank of common statements so that it improves the process moving forward.
But after doing four or five, it doesn't take as long, because you end up noticing common patterns. And I use those guiding questions. Yeah. How many do we do? So, it depends on how often we're, like, profiling the service.
It is time intensive, but at least five or six a year. It's been slower since returning to campus, because we've been working on other things, but during COVID it was huge, and instructors really appreciate having this opportunity, that focused attention. That's why the title calls it individualized professional development: they're getting one-on-one support, and it's not just a consultation. They've initiated the process and they're getting the direct support that they need. So instructor engagement's pretty high, and I know instructors have appreciated the service and they've let their colleagues know.
And it's interesting: you do one for one instructor, and you get two or three requests from the same department in the next week or so. Yeah. Yes? Have I had issues with instructors having difficulty with the feedback, or with larger issues related to their course? Because sometimes, yes, their homepage is lacking something, but it's because they may need some additional support. Yeah. We find we often get the very common answer:
"Well, this is how I've always done it." Yeah. And it's in-person instructors who, when moving to hybrid or online, can't get out of that mindset. Yeah. So, they've opened themselves up; there's a vulnerability because they've initiated the process.
So I think there's already an openness to receiving feedback, because they initiated the request. That doesn't speak for everyone; I would say there might have been a couple of cases where I've encountered a little bit of resistance, but it's just a matter of explaining your rationale, and also letting them know: this is my perspective, this is still your course. This is just one perspective, and maybe ask your students about it. But I think it's also about the way that you present the feedback. And so, one thing.
It's in the guide that I've included, but it's just being very respectful in terms of how you deliver the feedback. So not "oh, you need to change this," but "have you considered," or "have you thought about," or "I noticed this one thing, what's one way that we can address this?" Trying to be very intentional and respectful in how you share the feedback can help soften the message a little bit. But I think part of it is also that the framing is really designed to be: not "here are ten things you need to fix," but "here's what's working really well in your course, and here are a few things I would change so that you can improve your course even more."
And so it's starting from a place of appreciation. Yes? The question to me is: were there operational or institutional exigencies that drove me to develop this process, as opposed to working with an existing framework for course evaluation or course review, like QM or OSCQR or something like that? You know, those are very intensive.
Yeah. Just curious what directed me to do this internal development. Yeah. So, I think we only have a minute left, but the service design started shortly after our implementation of Canvas, and so there was an appetite for wanting to improve how instructors were using the tools and all that. That partially inspired it, but also, going back to the idea of human performance technology, it's about trying to reach a different persona of instructor who might not reach out for help but would be open to feedback. And so that was my analysis of the University of Toronto system, especially me being a new employee: there was an opportunity for something else. And the whole idea of the one hour and the pre-review meeting was based on our in-class observation framework.
So it was adapted based on that. It was trying to be somewhat parallel, but not identical, to an in-class observation, just being able to offer a service on a different aspect of an instructor's teaching. So I hope that answers it. I think we're out of time. So thank you.