Meeting Federal Requirements for Distance Instruction Tracking: Our Experience as a Case Study


One of the challenges in developing higher ed distance curriculum has nothing to do with pedagogy: it's meeting federal regulations. We present a case study of the process we went through to develop our new Data Analytics program and how we ensured it met distance requirements in Canvas.

Video Transcript
So my name is Mason Leftler. I was a middle school teacher that turned into an instructional designer, that turned into a researcher, that turned into an administrator at a technical college now. That's my path. And I'm Josh Beam. I work in our curriculum design and development department over at Bridgeline Technical College. I tried to use an AI image generator for this slide.

I fed it the prompt hoping that I could convince it to look super engaged. I probably should have said super excited, but that's what it gave me. And I was like, whoa.

And I didn't know if that was me looking at you guys or you guys looking up here, but I was like, it's probably right on. Right? Do you want to go to the next one? Okay. So we want to spend a little bit of time figuring out where you guys are at, because we can tailor this depending on whether you're administrators or financial aid people. Anyway, there are a lot of different directions we can go, and we want it to be really powerful for you. So by a raise of hands, do you work in clock hours at your school? So there's a few.

I was hoping there'd be a few. And credit hours, is that everybody else? Okay. Next. How many of you are working with competency-based, like open-entry, open-exit instruction? Good. Good.

How many of you are administrators? All right. How many are instructional designers or teachers? Beautiful. K-12? Alright.

How about two-year colleges? How about higher ed? Okay. I know, I get it too. Well, we're a technical college, and I always have to fill this out when I'm writing NSF grants, and they're always like, are you a two-year college? I'm like, no.

I'm not. We're a technical college, which is something different, but that's where they put me, so I always get a little offended. And then it's like, higher ed, technical college, you know. Anyway. Okay.

And then, who is most interested in learning about tracking progress in online learning? Some of you. Who is most interested in the federal rules for clock hours? Okay, a little bit. And then course design to enhance engagement online? Okay. Alright. Beautiful.

We'll have something for everyone. I'm going to start out by giving some context. Somewhere around twenty nineteen, I was at the Utah State Capitol. Instructure is in Utah, it's based out of Utah, right? And I was sitting around a table and we were talking about the hiring and workforce predicament.

And this was said by one of the VPs at Instructure. He said, we're offering a hundred and twenty thousand dollars for a data analytics technician or data analyst in Utah, and we're getting outbid at a hundred and fifty to a hundred and eighty. That stuck with me as a grant writer and as somebody that's focused on workforce. There were grants at that time to start programs, and I said, well, we should write a grant. And then I just so happened to be at a PI conference at NSF.

And there was a group of people out of Massachusetts that were helping community colleges and technical colleges across the state start data analytics technician programs, in the same way that you have an engineer with, like, six techs underneath them. That didn't exist back in twenty eighteen. Everybody back in twenty eighteen was fighting over who owns data analytics, right? So we started writing. We joined up with them.

They mentored us. We wrote a grant, an in-state grant, and they said, yay, you guys can have the money. That was in April of twenty twenty. A month later, they said, no, you can't have the money, because there was a pandemic.

And then we reapplied the next year, got the money, and started the program. It's a smaller program, four hundred and fifty hours, so think fifteen credits, somewhere around there. Do you want to jump to the next one?

It was geared mostly to the manufacturing sector in Northern Utah. Like, one in four people are in manufacturing in Northern Utah. So we did a KSA. Have you ever heard of that? It's like a DACUM. Same thing, different name.

So we did that to build the program. And one of the things they posed to us: we had people there from Northrop Grumman and from Autoliv. They make probably everybody's, what do you call the thing that pops up when you crash? The airbag. They make all the airbags for the world.

And they said, wow, we need you guys to teach data analytics, and we need you to upskill, like, five hundred people here locally. They wanted all their technicians to have data analytics. Then they wanted their marketing people to have analytics. And then they wanted their management people to know how to do Tableau and Microsoft Power BI. So it was just a big thing.

And they said, we want it to be smaller, but also, can you teach people across state lines? Because if you're teaching everybody here, we want you to teach everyone at our site in Iowa too. And we're like, I don't know, because that's not really what our role is at a university, or at a technical college. We're usually constrained. So anyway, we did our KSA.

We built the curriculum and we started moving in the direction of doing it online, for a few different reasons. Do you want to skip to the next one? This is less important, but we built it around the data cycle. All the courses are designed that way. Let's jump again.

But we were basically focused on three different populations. The other issue we had in Northern Utah at the time is that we had the lowest unemployment rate in the nation, for like the past forty years. We had one point two percent unemployment. So the only people you could really train were high school students, the very, very few unemployed, and then women, because we're in a more traditional area. There are lots of women who are caretakers, either of their parents or of their children, and they're reentering the workforce.

And so a big thing we were trying to do was to get more women. And what would we need to do for that? We'd need to make it distance-enabled. Let's jump again. Okay. Yeah. So once we realized that distance was the direction we were going to want to go, initially it was great.

We've already got Canvas, we're good to go. But then our compliance officer met with us and let us know that we were in fact not ready to just toss it into Canvas. There are all these requirements for what we needed to do, specifically because the technical colleges are all based on a clock-hour system. So there are certain requirements you may not run into in the credit system.

Some of these will apply to you, some of these won't. Like I said, I'm not a financial aid expert, so I don't know which ones will apply to your institution. But the ones we had to be aware of: it has to use one or more of the technologies in paragraph two, which lists out a bunch of them. I think I've even got some of them pulled up here. Let's see.

Yeah. So we've got the internet, which I think is what most of us are doing. And then there's one- or two-way transmission, audio conferencing, video access, DVDs, CD-ROMs, et cetera. But the internet one is the main one. So if you're using the internet, then it's distance.

The second piece is, if you're doing distance, then it has to be interactive. The way they qualify that is that students have to be getting direct instruction; being assessed; being provided information or responding to questions about the content of a course or competency; facilitating a group discussion regarding the content of a course or competency; or doing another instructional activity approved by the institution or program's accrediting agency. So all that boiled down to, we needed to do a couple of things.

We needed to know who was doing the activity, we needed to know how long they were doing it for, and we needed to make sure they were actively engaged in something. We couldn't turn on a video and have a student get an hour of clock time even though they got up and made a sandwich the whole time. So we were trying to figure out, alright, how do we do all these different activities in Canvas while still tracking who's doing it and for how long? This slide is more information about some of the requirements we have. The big one here is that for every sixty minutes we're saying a student was working on our course, we have to prove fifty of them.
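As a rough illustration of that arithmetic, here is a minimal sketch with made-up intervals, not our production check:

```python
# Minimal sketch of the 50-of-60-minute check for clock hours.
# Assumes we already have verified activity intervals (start, end)
# per student from quiz logs, meeting records, and so on.
from datetime import datetime

MINUTES_REQUIRED_PER_CLOCK_HOUR = 50  # prove 50 of every 60 minutes claimed

def documented_minutes(intervals):
    """Sum the verified activity time, in minutes."""
    return sum((end - start).total_seconds() / 60 for start, end in intervals)

def supportable_clock_hours(intervals):
    """How many whole clock hours the documented activity can support."""
    return int(documented_minutes(intervals) // MINUTES_REQUIRED_PER_CLOCK_HOUR)

# Hypothetical example: two tracked work sessions
sessions = [
    (datetime(2024, 3, 4, 9, 0), datetime(2024, 3, 4, 10, 10)),   # 70 minutes
    (datetime(2024, 3, 4, 13, 0), datetime(2024, 3, 4, 13, 45)),  # 45 minutes
]
print(supportable_clock_hours(sessions))  # 115 documented minutes -> 2 clock hours
```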

So, basically, the whole time. Okay. So we started going through and figuring out what the different activities are that our students are going to be doing. As Mason showed briefly in that one slide, we did kind of a backwards-design approach where we took a look at: this is what we want students to know, and these are the assessments we want to be able to use to prove they know how to do it.

And then this is how we're going to prepare them for it. So we already had an idea of what was going to be the best way of assessing our students, but we also needed to track their interaction. So we set up a few things. We wanted there to be regular meetings with students, not only because there's a federal requirement, but because you see lots of attrition with distance education, and we felt that interaction with the teacher was really important. So we set up weekly one-on-one meetings as well as a once-a-week session, because we're not one hundred percent remote.

We have the students all come in for a presentation, where sometimes they'll bring in guest speakers and the students get to interact with their instructor. So there's some kind of face-to-face. This is something that could be done remotely, but we're able to do it in person. Then there are those one-on-one progress meetings, and we require assignment feedback.

If a student submits something, they're going to get feedback on it beyond just a "good job." They're getting some kind of substantive feedback, and then prompt response to student questions. So we've got a couple of things we're doing to make sure there's prompt response. We use the Canvas API to pull a lot of data, and we've asked that all of our instructors communicate with their students through Canvas messaging.

That's the email icon on the side that you can use, because then we can pull who you are talking to and how frequently. So I'll show you one of the dashboards that we use. Possibly. Let me go pull this up. Any questions so far while the slow internet is populating the page? I have a question about the response.

Yeah. There are a lot of tools now, and especially this week we've heard about AI feedback. All of this stuff, is that counted as substantive response? I wouldn't count it, I don't think.

Yeah. We don't have any instructors that are currently using any kind of AI tools, but it's definitely a conversation to be had. My gut reaction would be no, if the teacher's not aware. I'm sure all of you agree, I'm preaching to the choir, but half the benefit of feedback is that the teacher knows how the student is doing on the work, as opposed to just the student knowing how they did. Well, our report's not pulling up because we've got kind of slow Wi-Fi here. Yes, sir.

Quick question on your fifty minutes, is that from... No, that's the federal requirement for clock hours. Clock hour, online. So for the six people that raised their hands that are doing clock hours, it's nasty.

For some reason, we are held to a higher standard than regular credit-based programs, but that's clock hours. I don't know what it is for regular online instruction. I would imagine, you know, having done a few higher ed degrees and been in online courses, it's not this rigorous. Yeah. Yeah.

The other thing is we can't count homework as part of the time that is in the course, whereas credit-hour programs can count homework as part of their time. Yeah, in the back. Yeah, that's a good question.

So I can't go into it in too much detail. The question was, is there a difference between synchronous versus asynchronous? From what we learned from our compliance officer, it's less the difference between synchronous and asynchronous as it is credit versus clock hour. Because for us, the requirement is that there's regular substantive interaction. So whether it's synchronous and the teacher is communicating with the student, or it's asynchronous and the student is getting feedback, delayed feedback.

That still meets that regular substantive interaction. However, there does need to be some amount of synchronous contact where the teacher is meeting with the student, or at least the teacher is reaching out to the student. It has to be teacher-initiated. The student doesn't have to accept. Any other questions? Okay.

So I wasn't able to pull up the report, but one of the other things we look at is response time in Canvas: how quickly teachers are responding to students after they've submitted. We've got it broken down into three categories: is it within twenty-four hours, is it within a week, or is it more than a week? And we're always shooting to be within that twenty-four-hour range. Yeah.

And then there's the way we track all of this, because, again, we have to make sure we're not only doing it but tracking it. For weekly meetings, we have a clock-in system. Students clock in every time they come to school, so they use that. For those weekly meetings, we also have teachers record a summary when they meet with students one on one, both for the requirements and so the teacher, or other teachers in the program, can look back and see what you talked about with your instructor last time. For assignment feedback, obviously, there's Canvas. And then for prompt response, we use Canvas messaging and the API.
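As a rough sketch of the kind of pull involved (not our production script; the instance URL, token, and IDs below are placeholders), you can grab submission timestamps through the Canvas API and bucket instructor turnaround into the three categories we watch:

```python
# Minimal sketch: bucket instructor response times from Canvas submission data.
# CANVAS_URL, TOKEN, COURSE_ID, and ASSIGNMENT_ID are placeholders for this example.
import requests
from datetime import datetime, timedelta

CANVAS_URL = "https://example.instructure.com"   # hypothetical instance
TOKEN = "YOUR_API_TOKEN"                         # hypothetical token
COURSE_ID = 1234
ASSIGNMENT_ID = 5678

def fetch_submissions(course_id, assignment_id):
    """Pull submissions for one assignment (ignores pagination for brevity)."""
    url = f"{CANVAS_URL}/api/v1/courses/{course_id}/assignments/{assignment_id}/submissions"
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"},
                        params={"per_page": 100})
    resp.raise_for_status()
    return resp.json()

def bucket(submitted_at, graded_at):
    """Classify turnaround time into the three categories we track."""
    if graded_at is None:
        return "no response yet"
    delta = graded_at - submitted_at
    if delta <= timedelta(hours=24):
        return "within 24 hours"
    if delta <= timedelta(weeks=1):
        return "within a week"
    return "more than a week"

counts = {}
for sub in fetch_submissions(COURSE_ID, ASSIGNMENT_ID):
    if not sub.get("submitted_at"):
        continue
    submitted = datetime.fromisoformat(sub["submitted_at"].replace("Z", "+00:00"))
    graded = (datetime.fromisoformat(sub["graded_at"].replace("Z", "+00:00"))
              if sub.get("graded_at") else None)
    label = bucket(submitted, graded)
    counts[label] = counts.get(label, 0) + 1
print(counts)
```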

Yeah. For the data that shows the instructor is responding within twenty-four hours, does your data show that? Yeah. So we've got a dashboard, and I really wish I could pull it up, but, well, maybe we'll get it to work.

So it's not... it's my mouse. Well, you were up in the right. You were up in the left-hand corner a second ago. Where am I at now? There, you're going in the middle. Hang on. Alright.

Alright. So I'm just going to select a random program that we've got. Oops. Well, it's not loading. Okay.

So, yes, we have a dashboard. Every morning, we run a report that pulls in all of the submissions across our whole school, and the times they were submitted. And then I created a script in Python that just breaks it up into the different chunks of, this teacher responded to this student within x number of days. And then we have a little graph. Yeah. Is there any pushback from faculty on having that level of oversight and looking into their classes? Yeah.

So all of these tools we developed in conjunction with faculty. We're small enough that I know every single faculty member, and I can go sit in their office and say, this is what we're looking to do, and this is the benefit we see it having for you. Especially if we can get department head buy-in, because they're trying to figure out how to get feedback to their different instructors, then it's less big brother on our end and more, here are the tools we have to help you navigate and figure out which course you need to work in. So not much pushback, but we did a lot of legwork beforehand to prep people for what we're hoping to do. Yeah.

You mentioned shooting for a twenty-four-hour turnaround. How large are these classes? That varies widely from program to program. In our data analytics one, they're roughly, what, five to ten students in the cohort? Well, there were around twelve. Twelve.

It's usually around twelve per class. Yeah. So not massive classes. No.

Sorry. I kept trying to get it to load, and it wasn't loading for me. I apologize. We had it loaded before, but the computer turned everything off on us. So I'm sorry.

Yeah. Sorry. I'll try one more time. I'll try a different department, see if that does it. The script? He's the guru who has that. Yeah.

What tools did you use? How long did it take to put in place? So, yeah, well, he's working on that one; I can speak a little bit to this. We sat there at the very beginning and said, okay, if we're in Canvas, we're probably going to have to steer away from pages, right, because pages don't have interaction on them unless you've got some type of add-on. So we said we're not going to disseminate content through pages unless we're using something where we've got a tool that's gathering data from them.

Then we said, okay, well, we've got assignments, and we can gather data from there. We've got quizzes, and we can gather engagement and time elapsed there. And we just went through all the different structures inside of Canvas and said, okay, where are our gaps? And then we came up with solutions for them.

For instance, with dissemination or lecture-based content, we went out and got Anoto. I don't know if you've heard of Anoto. Anoto basically lets you superimpose quizzes and chats on top of video. So, periodically, we can prove that they've been engaged in the content. And that's the hardest one.

That's the hardest one to figure out: if they're not in a lecture and they're not clocking in, then proving that they're engaging with the content you're disseminating is the most difficult piece. So in large part we said we're going to steer away from using pages inside of Canvas and build everything through quizzes. We'll actually disseminate content in the quiz and then have them take the quiz, because then we know how long they're taking the quiz. Does that make sense? Okay. Yeah.

So here's the list of the different Canvas tools we had, and how we ended up tracking student engagement in them. As Mason mentioned, where we had pages with some kind of pure didactic instructional content, we used the tool called Anoto. If you don't have access to a tool like this, you could get around it by embedding the video into a quiz and then just using those quiz questions as the interaction. For quizzes, we use Canvas's quiz log. Is everybody familiar with the quiz log feature?

For those that aren't, Canvas lets you turn on a setting where it'll keep track of interactions with that quiz: when a student clicks, when they hit the next button, when they hit submit. Because we have this, and that's what that little image is on the right, we now have a log of all their interactions. We can use this as our evidence that students were doing something for a given period of time. We then ran into an issue, though, because it's a data analytics program: we've got a lot of students who are coding, working in Visual Studio Code, writing their Python code, and that's not in Canvas. It's not in any kind of LMS or LTI environment where we can track what they're doing. So we found a tool called TopTracker.
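Before getting to TopTracker, here is a minimal sketch of how you might total active time from those quiz events through the Canvas Quiz Submission Events API; the instance URL, token, IDs, and the idle-gap threshold below are assumptions, not our production values:

```python
# Minimal sketch: estimate active time on a quiz from Canvas quiz submission events.
# IDs, URL, token, and the idle-gap threshold are hypothetical placeholders.
import requests
from datetime import datetime, timedelta

CANVAS_URL = "https://example.instructure.com"
TOKEN = "YOUR_API_TOKEN"
MAX_GAP = timedelta(minutes=5)   # assumption: longer gaps don't count as active time

def quiz_event_times(course_id, quiz_id, submission_id):
    """Fetch the timestamps of a student's quiz submission events, sorted."""
    url = (f"{CANVAS_URL}/api/v1/courses/{course_id}/quizzes/{quiz_id}"
           f"/submissions/{submission_id}/events")
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"},
                        params={"per_page": 100})
    resp.raise_for_status()
    events = resp.json().get("quiz_submission_events", [])
    return sorted(datetime.fromisoformat(e["created_at"].replace("Z", "+00:00"))
                  for e in events)

def active_minutes(times):
    """Sum time between consecutive events, ignoring long idle gaps."""
    total = timedelta()
    for earlier, later in zip(times, times[1:]):
        gap = later - earlier
        if gap <= MAX_GAP:
            total += gap
    return total.total_seconds() / 60
```

The gap cap is the judgment call: it is what keeps the sandwich break from counting as engaged time.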

It's a free-to-use tool. I asked them multiple times, are there any limits on users or anything? And they said, no, just keep using it, it's free. What TopTracker does: for these types of assignments, where we have to know how long a student is doing it and who did it, we have them download this.

It's basically similar to what a remote contractor would use to clock in. They hit the start button, it starts tracking their time, and then it can take little screenshots every five minutes, and they can blur them. So it's not showing any of their personal information, but it's showing that at least the content is up on the computer. For all of our students, we let them know this is an optional ability that'll let you do stuff from home.

But you're always welcome to come into the lab and do it in person, because then we don't have the same distance requirements. We did get some pushback from students saying, I don't want this app on my computer monitoring everything that I do, which I totally get. So we gave them both options: you're always welcome to come in, and if you want to do it from home, if you want that remote ability, then we've got this tool here for you. And then the last thing is, with LTIs, we just needed to make sure that the LTI was doing the tracking before we could use it.

So what we put together is, among a bunch of other questions about LTIs, every time an instructor is interested in getting a new LTI for their program, they fill out this survey, and one of the pieces is we have them send an email to the LTI vendor with a list of all the different things we want to know. We want to know: is it tracking student activity, how frequently is it checking in on the student, does it time out, can a student just leave and not come back and still get credit? If the LTI meets all these requirements, then we're good to go. If it doesn't meet the requirements, it's not necessarily that the tool won't work; it's just that we can't get that course approved for these distance requirements. Okay.

Yes, sir. Yeah, that's a fantastic question. The nice thing is we only have to report how many hours the student actually took, not how long it should have taken them. So if it's a fifty-hour course and a student finishes in a hundred hours, and they have a hundred hours of tracked content, then we're good to go. Or if they took twenty hours and we can track it, it's just the student's time, however much we've got recorded.

Okay, kind of pivoting then. We wanted to make sure what we're doing was actually working. So we started looking at ways of, one, helping students be paced through it, as well as doing some analytics. So one tool, I don't know if any of you have used this.

Canvas has a tool called Course Pacing; it was originally Pace Plans. It's really nice for asynchronous-type learning. For those in a lockstep situation, I don't know how useful it is, but the way the tool works is that from whatever day the student enrolls, their due dates are set accordingly based on that date. Because we've got students that start on Monday, then another student that starts Wednesday, and then another that starts a week after, if we put fixed due dates in Canvas, students are getting locked out all the time. So we've been using Course Pacing in our remote asynchronous courses, where they don't always have that teacher there to help guide them through it, and we're trying to roll it out to our other programs as well.
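To make the idea concrete, the due-date math behind that kind of pacing looks roughly like this; a minimal sketch where the assignment names and durations are invented, not pulled from our program:

```python
# Minimal sketch of rolling due dates from each student's start date,
# the basic idea behind Canvas Course Pacing. Durations are hypothetical.
from datetime import date, timedelta

assignment_durations = [          # days allotted per assignment, in course order
    ("Intro to the data cycle", 3),
    ("Cleaning a dataset", 5),
    ("First dashboard", 4),
]

def due_dates(start_date, durations):
    """Each due date is the previous one plus that assignment's allotted days."""
    schedule, current = [], start_date
    for name, days in durations:
        current += timedelta(days=days)
        schedule.append((name, current))
    return schedule

# A student who enrolls Monday and one who enrolls Wednesday get shifted dates.
for name, due in due_dates(date(2024, 9, 2), assignment_durations):
    print(name, due.isoformat())
```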

And then the second piece is we've just got tons and tons of analytics. So here's the Course Pacing tool, so you can get an idea. You go in and you set, you know, this assignment should take them three days, this assignment should take them five days.

And then it just gives you some information about roughly when the student would be done, based on a given start date. And then the second piece is some of the analytics we started putting together. This is a dashboard we created with some of the information we really wanted to know about students. Our teachers wanted to know, for a given student, where they are.

What courses did they finish? That's the blue. What courses are they active in? That's the green. What courses haven't they started? That's the gray. So at a glance they can see, this is where all of our students are at in the program. When I meet with a student, I don't have to ask them every time,

so, what are you working on right now? They can see it right away. We want to know their SAP, which is a kind of niche number that applies to our situation. It's a measure of progress, based on how much time they are taking to go through the program versus what they should be taking. So we can know, is a student falling behind, are they really speeding through it? Then there's their last login, so we know if they're active in Canvas, and that little graph is a submission history that shows where they've been submitting across courses, because we've got students that are in a variety of courses and they're not going to be working on them all at the same time.
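As a side note, that progress number boils down to a ratio along these lines; this is a simplified sketch, and the real calculation is specific to our program:

```python
# Minimal sketch of a pace-of-progress measure: clock hours used so far
# versus the hours the completed portion of the program was expected to take.
def progress_ratio(hours_used, expected_hours_for_completed_work):
    """> 1.0 means taking longer than expected; < 1.0 means ahead of pace."""
    if expected_hours_for_completed_work == 0:
        return None  # nothing completed yet, nothing to compare against
    return hours_used / expected_hours_for_completed_work

# Hypothetical student: 120 clock hours logged against work budgeted at 100 hours
print(progress_ratio(120, 100))  # 1.2 -> falling behind pace
```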

That submission graph gives us the overarching view we needed of all the submissions a student is doing across Canvas. Any questions about that? Nope. Okay. And we wanted to get student feedback, so we did it on two levels in our data analytics program.

One is our course-level surveys, in which we asked a couple of Likert questions. We wanted to know how difficult the course was for them; not that hard or easy is necessarily bad, but if we're seeing a correlation between students who are having a hard time and students taking a long time to finish, then there may be something we want to look at there. We also wanted to look at course objectives and whether or not the students were even aware of what they're supposed to be learning. A big thing for us is relevancy to industry.

So we want students to know: are they learning things that are going to be relevant to them? Then we had two other questions. We had one on whether or not there was a lot of teacher-student interaction. And then we had another one on... I'm blanking on the fourth one. I can't remember either.

Yeah. It was a good one. I apologize. I can't remember what it was. It's the best one.

You have to come to the next presentation for that one. The other thing we did, though, is module-level surveys. This turned out to be really useful, and we did not get the pushback that I thought we might. After every single module, there is a survey where the student can fill out what went well and what didn't go well. And because we made it optional, if a student didn't want to fill it out, they didn't. If they did want to fill it out, great.

And then we've got, hopefully the internet works for me this time, a little button that we've put at the top of the modules page. The teacher can click it, and it'll collect all the data and lay it out for them, so they can see every single response for a given module rather than having to go through SpeedGrader and click through all the students' responses there. It's going to take a second. It'll pop up right here once it decides to load.

Well, I won't make you wait for it. But, yeah, it's nothing too fancy. It's just a little button that runs an API script that goes through all the quizzes, checks if a quiz has a certain name, and then grabs all the responses to that quiz; a sketch of that is below. Okay. Open discussion.
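That script is roughly along these lines. This is a minimal sketch: the "Module Feedback" naming convention, instance URL, and token are placeholders, and the real version also pulls the response text rather than just counting responses:

```python
# Minimal sketch of the "collect module feedback" button's script: find quizzes
# named like feedback surveys and tally how many responses each one has.
import requests

CANVAS_URL = "https://example.instructure.com"   # hypothetical instance
TOKEN = "YOUR_API_TOKEN"                         # hypothetical token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def get_json(path, **params):
    resp = requests.get(f"{CANVAS_URL}/api/v1{path}", headers=HEADERS,
                        params={"per_page": 100, **params})
    resp.raise_for_status()
    return resp.json()

def feedback_quizzes(course_id, marker="Module Feedback"):
    """All quizzes in the course whose title marks them as a feedback survey."""
    return [q for q in get_json(f"/courses/{course_id}/quizzes")
            if marker.lower() in q["title"].lower()]

def response_count(course_id, quiz_id):
    """How many quiz submissions (i.e., survey responses) the quiz has."""
    subs = get_json(f"/courses/{course_id}/quizzes/{quiz_id}/submissions")
    return len(subs.get("quiz_submissions", []))

course_id = 1234   # hypothetical course
for quiz in feedback_quizzes(course_id):
    print(quiz["title"], response_count(course_id, quiz["id"]))
```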

Yeah. You mentioned evidence earlier. Do you keep all this information ready for an audit, or something along those lines? Yeah. Some of it, anything that's not really easy to pull, we do keep stored in a centralized location. For all the Canvas information, I've got a script that'll run and grab it.

We don't need to pull that ahead of time. If somebody came to audit us, we'd just run the script and it would go grab it. But the other stuff we do collect. Yeah. Go ahead.

Sorry. I did that for the presentation; I turned on a little filter that anonymized the data. But the student or the teacher can see the student's name, because that dashboard is designed for them to see it, so they know who they're working with. Go ahead.

Yeah, it's a great question. We did not find anything that we felt was an adequate solution to the federal requirements. In an ideal world, we would just keep pages, because students can get a little overwhelmed with the constant questions. We tried to chunk things out as best as possible.

The rough rule of thumb is we had to ask a question for about every five minutes of something, which gets to feel like a lot when you're in an hour-long thing and you've been asked twelve questions. But we try to chunk it, so you can do kind of a thirty-minute stint, then answer a handful of questions, then another little bit. Yeah, if anyone's aware of any tools that do that better, send them our way. We'd love to hear about them; we're still investigating, trying to find a better way.

Yeah, right here. There are a couple of answers to that. So for our remote ones... yeah. Okay.

Perfect. For our remote ones, they are required to use the Canvas inbox. And for the ones that aren't remote, they don't necessarily have to. Okay. And I think we are out of time.

I apologize, or we'd have more. Oh, were you saying there's one more question? Oh, sorry, we have more time. Okay.

Alright, go ahead. When you talk about chunking up the videos, like changing things from pages into quizzes, did your faculty... Yeah. So the question was, was it our faculty or our instructional designers that did all the work of moving everything over? It was a team effort, but through the grant we wrote, we did get an instructional designer that was dedicated to our data analytics program.

So they were able to do the bulk of that work. But we wanted the instructors to be involved, because if the instructors don't know why it was set up the way it was, or how it was set up, then one of two things happens: something breaks and they're really confused, or they start undoing things that don't make sense to them, because some of this doesn't make sense unless you look at it in the context of the distance requirements. Any other questions? Yeah, in the back. So is the challenge with pages being able to tell... Yeah.

We have to be able to say how long they were doing something. If it's short content, if it's just a list of links to third-party resources to take a look at, then that could go on a page. But if it's any kind of interactive piece, if we're claiming the student is logging in and doing something, then, yeah, we weren't able to use just the fact that they logged in. I mean, there is data in there that they clicked on the page and then they clicked on the next page, but they wouldn't accept that. I can't remember if it was Wendy or Jean who asked the federal people.

They said, well, can we use timestamps to measure how long somebody is on something? And they're like, no. And we were like, well, what are people using then? They're like, we're not at liberty to tell you. You know how this works.

Apparently video cassettes, you know, cassette tapes, are still in there. I love it that people dealing with something that antiquated are creating the rules for online education. But, no, there are several tools out there. I mean, at one point, there was a dream that Canvas would allow you to insert quizzes periodically on a page. I don't think that ever happened, though.

Yeah. And that's what Anoto is. It's kind of a layer on top of Kaltura that just makes it look a little cleaner. But, yeah, if you've got Kaltura, it's going to do a lot of that. We've got a gentleman in the back there.

In Visual Studio Code. Excellent. So he was talking about a little widget they've got that pushes the time spent in Visual Studio Code to Canvas. And, yeah, we looked at a little Visual Studio Code plug-in, and it may be the one you guys are talking about, that did track time spent coding. We really liked that for this specific use case.

We were trying to find something that was going to be a little more general, so we didn't have to reinvent processes for other departments. Because we're smaller, we kind of have to make sure that everything we dive into is going to be useful for every department. But I do really like that idea: if you've got a large coding program, that would be more useful than something more general like TopTracker, which is what we're using. Yeah. I wish.

We do have to show it. Yeah. So the only piece that can really be student self-reporting is the surveys that we're collecting, and that's more for our internal analysis than for the federal requirement. As far as I understand, they don't trust the students on any of that stuff, unfortunately. Currently, they're just putting it in an Excel sheet; that's one of the pieces that we do have to collect in a central repository.

Ideally, we would love for there to be a digital space where we could then pull and do analytics on those notes, but we're not currently doing anything like that. There are a few companies out there that do that. Like, Dropout Detective has a space where, across environments or different departments at the college, you can put in notes. You know, if your student services has access or your financial aid has access, they can tally things there. We piloted that for a little while.

Well, it's in multiple places right now. It's in Canvas and then in our student information system. Yeah, the question was, where are we compiling it? So we've got Canvas, we've got the student information system.

And then for some of the aggregate data that we could use for reporting, we have an additional server where we pull the aggregate data and store it. That's been anonymized, there's no student information tied to it, but we can use it for reporting. So do you have programs that have both synchronous and asynchronous? Basically, like a blended learning model, where they come to the lab, they start to do the work, and you're trying to track the asynchronous part. Right? Yeah.

How does that work? Well, first of all, what's that challenge like for you? And then secondly, do you have any internships, and how do you get your internship hours mapped in? We haven't tried to do it. This is the first program that we've really done at this level. We've got our business program, business does it as well, so we've got only two programs that are doing online at this point.

And obviously, we teach machining and apprenticeships, and we've got internships and stuff like that. But in these two programs, those aren't prevalent, so we haven't gotten there. Sorry. Yeah.

And the answer to the question about how we'd handle internships: I don't actually understand why internships work, because we don't do all this tracking with the internship, but we've gotten the go-ahead that it's okay to do it. So, yeah, I don't know. Sorry. I would recommend this just as kind of a general plug.

For anybody trying to dive into financial aid stuff, wrap their head around it, and make sure they're in compliance, NASFAA has a really great system, and I'm sure whoever your financial aid compliance person is will have access to it, where you can ask a question. It gets posted out to all the experts and then they'll respond to you. It takes a little bit of time, usually a three-to-four-day turnaround.

But every time I've got a question of, are we really in the clear here, we'll send it out to them and they'll say yes or no, and they'll cite the federal requirements for why we are or are not in the clear. Yeah. How do we track attendance? Yeah, the question is how we track attendance. So we have a tool, what's it called? TimeClick? Yeah.

They have to clock in online to say that they are now working on their homework, and that's where we get that base number. They've said they've done this many hours, and then all the other stuff is to try and prove it. Okay. Any other questions? Yeah. Sorry.

Can you repeat that? Oh, sorry. So that button is to show the feedback. They just click on it and it shows all the student feedback that they've been given through one of the quizzes we have set up as a feedback quiz. However, we do also like to look at whether or not teachers have responded to quizzes or graded their assignments or whatnot. For all that, we use the API, we pull it in, and then we create a little dashboard.

Other questions? Well, thank you so much. Yeah, thanks everyone for coming. Oh, yeah, and there's your code up there somewhere for the picture.

There you go. And I think you're going to get a plug here. Yeah. Sorry. I just want to give our presenters a big thanks for coming today.

For everyone in the room, please take a moment to get into the app, rate this session, and fill out the survey. It really helps us with the feedback, and it helps us develop future InstructureCons. So thanks again for coming out. I appreciate it.
