The Role of GPT-3 in Education: Developing an AI Teaching Assistant
GPT-3 has the potential to revolutionize education. By now you have probably heard of ChatGPT. This presentation demonstrates using the OpenAI GPT-3 API and the Canvas API to create an AI teaching assistant that posts weekly announcements to Canvas about items due that week.
Alright. I got the thumbs up from the back, so we will go ahead and get started, because I realize I am between you and nighttime activities — that, I understand. Absolutely. So if you are here for The Role of GPT-3 in Education: Developing an AI Teaching Assistant, you're in the right spot. My name is Justin Carroll.
I'm the director of academic technology. If you want to scan the QR code, that's my LinkedIn — if you want to get connected, I'd be more than happy to be on that platform with you. In addition, I'm on Twitter — which, I don't know if that's still the name of the platform, but twitter dot com slash j wait c if you wanted to connect with me there as well. Trying to pick that back up.
That's been dormant for a while, so we'll try to get some more activity there. A little bit about me and our team: at Tarleton, for the last three years we've been part of our IT team — we used to be part of our teaching and learning team until a reorg happened. And when generative AI came out, I had stalled out on my dissertation and told my dissertation chair, hey, let's think about this as a topic. So, full steam ahead: we are using this particular application for my dissertation in practice. And with it being a dissertation in practice, one of the requirements — it being more application based — is to have a tangible outcome, something you've created. And so the idea is that we're creating this assistant to meet that objective.
So some of this is research for my dissertation, and the rest is personal aspiration to find out how we can connect things together that normally aren't connected. We'll look through some of the OpenAI platform — that's what the session outcomes are, so we'll preview what that is. And before we do that, we'll have a poll to see where everyone in the room is.
Whether you've explored that or not, we'll talk about some definitions that you've probably all heard from the Instructure main stage that have to do with generative AI, and then the technology stack that we're using at Tarleton. We really didn't add anything — we were using what we had — so you'll see what that looks like. And then, to ground the study, since this is a dissertation we need some theoretical framework, and that's the science behind nudging.
So we'll be talking a little bit about that, and then: where does EdTech go from here? That's what to look out for in the session outcomes. In my undergraduate, I had the privilege of taking classes with Dr. James Gentry, and one of his big — bless you — one of his big quotes was "know thy student." In my opinion, I'm still a student of generative AI; I feel like everyone out here is a student of generative AI. With all that said, what I'd like for you to do is scan this QR code. There's going to be about a six-question survey. It's just going to help me know how we want to frame the conversation for today.
Because I really want to make this something you get value out of, so you can go back to your institution or your district and say, hey, there are ways we can incorporate generative AI into some of our processes — let's explore what that looks like. So take a few minutes to explore that. Does anyone else need the QR code? I'm going to move from this into our form.
As you are taking it, you'll see what the questions look like. Have you already tried out ChatGPT? Have you heard of the OpenAI Playground? Are you familiar with the Canvas APIs? That'll be really important for today. Do you have an understanding of JSON and what that looks like for API calls? Have you used Qualtrics? And do you have an iPaaS solution, like Power Automate, Boomi, MuleSoft — tons of different ones on the marketplace? That's some of how we were able to put this together. Yes? Ah — each one is a radio button.
Oh, no. Well, hey, let's change that on the fly, why don't we? Multiple answers there — if you want to refresh, or you can completely bypass that question altogether — it is fixed. Wonderful. Thank you. Okay.
Very good. And let's take a look and see what our responses look like when we present that. Alright. So a good number of you have already tried out ChatGPT. That is wonderful.
You're going to see kind of a tech end of this in the OpenAI Playground. If we look at that — uh-huh. Cool. This is going to be great. Okay.
So having heard of it is going to be really important for those looking at a technical angle: how can I do some generative AI, but I don't really want to copy and paste — I want it to be part of a workflow? You'll get to see what that looks like as the session continues. And then some of you said, I've heard of it, but I've never used it. So we've got a good set of folks there.
And then Canvas API — so, good. It looks like we're on the technical side of that, so wonderful, you've had exposure there. And JSON — so we have some exposure with what it looks like to construct those items for different REST API calls. And Qualtrics.
So we'll be able to see some new things about the survey builder that you can use and incorporate. And then platform as a service: really, if there's one thing to take away from this, it's the importance of finding a middleware solution, because that really is the glue for a lot of processes you want to do — you can use Qualtrics as the interface and have all this other background stuff happen. So we'll show you what we've done with Microsoft Power Automate for process automation. So, great.
Alright. Last question — okay. We've heard of GPT, as with most things large language models, which we've heard a lot about from the main stage this week. Prompt engineering is really going to be important, and we'll go systematically through the prompt that was used as the baseline for this teaching assistant. And, okay, I think we have a good understanding of how to frame the discussion today.
Perfect. Okay. Which brings us right to our definitions. So, large language models: that is all the parameters you heard Ryan talk about, where it went from GPT-3, which had maybe a hundred million, to GPT-4, which had a billion. That's what it trained on.
Next is GPT: what is the training model you want to use? For OpenAI, you can actually take a base model, train it with your own dataset, and then use that model as your base. So anything you want to do is going to build on a model from OpenAI — they call one DaVinci, they've got Ada, and they've got four or five different ones you can use as the base — and if you want to train on top of that, you can. For this particular session, that's beyond the scope of today.
We're going to do what's called few-shot learning, where you show just a couple of examples within the prompt itself to tell it what you want the generative output to be. Then natural language processing — which I'm really excited about; I was hoping we'd see some announcements. That's where you have that sidebar panel that comes out, and you naturally tell the UX what you want. I think we're moving into an age where, instead of "this takes too many mouse clicks; it doesn't know what I want to do," we go in that direction. And then copilots, which go right along with that natural language processing.
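A minimal sketch of what few-shot prompting looks like in practice — the example lines and phrasing here are hypothetical and simplified, not the production prompt used in the study:

```python
# A few-shot completion prompt: a couple of worked examples steer the model's
# output format without any fine-tuning. The examples below are made up.
FEW_SHOT_PROMPT = """Act as a teaching assistant. Summarize what is due.

Items due: 0
Summary: There are no assignments due this week.

Items due: 2
Summary: There are two assignments due.

Items due: {count}
Summary:"""

def build_prompt(count):
    # Fill in the live count from the Canvas data; the model completes the rest.
    return FEW_SHOT_PROMPT.format(count=count)
```

The model sees the pattern in the examples and completes the final "Summary:" line in the same style.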
So, really excited to see that they're already thinking about that on the Instructure roadmap — which will make some of this obsolete, but that's good. I also caught one little phrase, I don't know if you all did: whatever developers build, they're going to open up through RESTful APIs. I'm like, wouldn't it be cool if there was a generative Instructure API that we could call, instead of needing to put this infrastructure together ourselves?
Again, not part of the scope of today. I'm just thinking: for anybody that's part of Instructure — let's talk. Okay. Alright.
So what does this particular application do? I don't want to underwhelm you — it's really simple for this particular aspect. I was focusing on course announcements: a weekly course announcement. What could we do to save instructors time by providing a weekly course announcement that scrapes the items due in the course for that week and produces a unique output describing what is due — with the title of each item, whether it's an assignment, a discussion, or a quiz, directly linked so the student can go to it? A couple of things come with that, including being able to measure it.
So, adding to the body of research on generative AI: did this move the needle in any way for students to be more successful? That's a question we can ask and measure when the data comes in. Faculty can also weigh in on whether they thought the generative AI output was good or not. So that's the small elevator speech about what this particular application, built on OpenAI, does. Okay.
So, again, I already kind of alluded to this: my personal interest is that I really am into process automation. We started this in our group in the late fall of twenty twenty. We had lost a member during the height of COVID, and we needed to find out how we could augment and continue to do what we need to do. That made us go deep into low-code/no-code solutions like Power Automate, to do a lot of the heavy lifting for us so that we could keep the shop open, if you will. And I really like finding ways to integrate multiple systems that you never would have thought could be linked together.
One of those was that we put a Zoom meeting for faculty in every single session: all they had to do was click on the meeting, it automatically recorded to the cloud, then Panopto picked it back up and routed it back to the course. Those are the kinds of things I get excited about. Then of course there's the academic research interest for the dissertation study. I also really wanted to become familiar with these large language models, knowing there's probably going to be a future cycle of RFPs for EdTech products where every product manager is asking, how do we incorporate this technology into what we currently offer? It was important to me to see under the hood — we've heard about AI ethics, we've heard about biases in AI — so I could start asking a lot of questions and be more informed as we're renegotiating with vendors.
We're talking to new vendors, and they're starting to offer these types of services. So that's the span of why this came about. As for the roadmap: we started in spring of this year, finishing up in the last couple of weeks, to build the application and the workflow. The hardest part was getting approval from academic leadership — because it's something new — and that just happened three weeks ago.
So we're hopefully going to be able to make the call for faculty for the fall to get our first set, and then we'll start analyzing the data as it comes in for spring twenty-four. My plan is to post in the Canvas Community the outcomes of this particular initiative and where we landed on the research on generative AI. Again, we're polling faculty; we're also going to have a note underneath the announcement letting students know this was generative AI — what did you think of it? So I'm really excited to see the depth we'll be able to bring to the questions we'll ask both groups.
And then I also forgot to mention, on the bottom right of this: anytime you have a question — because I know you think of it in the moment and will forget at the end — you can scan that QR code, and I'll review any questions that come in there at the end. They're telling me that for the recording I need to say the question out loud anyway, so it will be easier if I collect them and read them to you all. And again, I know I'm between you and evening activities. Okay. So let's take a look at the OpenAI Playground for a minute and see what it looks like.
So you can tell this is very different from what you have seen with ChatGPT. This is where folks that are programming come to determine how they want to add generative AI to their own products, and it's where I came to learn how I could combine the Canvas APIs with generative AI. This particular model here on the right-hand side — and we won't go into a lot of detail, because there's great documentation from OpenAI — is their chat model.
However, for me, I used their completion model, because I wasn't going to be chatting back and forth across several different prompts. I was just going to make one master prompt and get one response back, and that response was going to be the announcement that ended up in the Canvas course. So we will see the prompt we're using as our base for the model. And I got a really good idea from the panel earlier.
I'm going to incorporate it, since we're still in the phase of launching this: personas. I love that idea. Maybe having three or four different teaching-assistant personas, where you change just a little bit of the text in the prompt — that even adds richness to the study. Okay, now I'm asking faculty: did they like it or not, and which persona did they use? Then I can see which persona was adopted or perceived to be better — again going back to what we can do to determine how generative AI will work in the classroom. All of these items here on the right-hand side control how creative you want the output to be, based on your initial prompt: how far away are you willing to have the large language model move from what you're saying?
If you're in creative writing, you're going to want this temperature on the far right-hand side. For my particular set of items, I needed to be on the far left, because what I noticed is that when I moved it too far, it wasn't adding the HTML items I needed to send students where they needed to go. All I'll say is you can look at this — you'll have options here that were not available to you in ChatGPT. So take a look at what that looks like. And when you get your prompt the way you want it, there is this View code option — and this is why you're seeing an influx of generative AI from OpenAI in products.
Because that particular bit of Python — or, in our case, JSON — is all that's needed to start using this in products. For us, we put that JSON into Power Automate as a custom connector. And when you do it as a custom connector, it's like Lego blocks at that point: where do you want this particular action to go in your process flow? So that's what the OpenAI Playground looks like. Let's go ahead and go back to our presentation.
And here's what the beginning of the prompt looks like — take a minute to read it: act as a teaching assistant for a university; generate an announcement listing items due this week in Canvas; items, with zero... So, the way we take from the Canvas APIs: on the JSON you get from the API, you can use an expression in Power Automate called length, and that tells me how many times something repeats in that JSON — that's where you're getting that count. And if you just do one call, you can get up to a hundred items, but for a week you're probably not going to see more than zero to five, zero to ten.
So: how many items, with zero indicating nothing is due. Then an example phrase — I'm telling it, hey, if you see zero, I want you to say something like this, and that phrase is "There are no assignments due this week." Items greater than zero indicate that something is due; an example phrase for this is "There are two assignments due." Or, if only one item is due: "There is one quiz due." "There is one assignment due."
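The zero/one/many phrasing taught to the model can be sketched as plain logic. In the actual build this lives inside the prompt — the model generates it — so the function below is purely illustrative:

```python
def summary_phrase(count, singular, plural):
    # Mirrors the example phrases given to the model in the prompt:
    # zero, one, and many items each get their own sentence shape.
    if count == 0:
        return f"There are no {plural} due this week."
    if count == 1:
        return f"There is one {singular} due."
    return f"There are {count} {plural} due."
```

The same three-way branch applies per item type: assignments, discussions, and quizzes.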
So we're getting there with few-shot, one-shot training of what we want the output to actually be whenever we get it back from OpenAI's completion API. The next part of the prompt: begin the announcement with a creative hook. We'll have to figure out if the language model understands what a creative hook is. So again, it is not lost on me that I am a white middle-aged man, and this is where your bias can be infused.
Although this is a simple, basic prompt — really, if you've ever had to write mission, vision, and value statements for your institution or your district and wanted to rip your hair out — this is just as important as doing those mission, vision, and values, because whatever you put here is going to influence what the output is going to be. All that to say: it's a basic prompt, and my hope was that I didn't introduce any kind of bias, but again, you want lots of eyes on this to make sure. Just wanted to throw that out there. Then we want to tell it a little bit about what data model to expect. This is what you would consider mail merge.
So through the Canvas API, we're going to be able to tell it the course ID, the course name, and the titles of the items. (I saw that I had a mistake with assignments versus discussions and wasn't able to change that in time.) All that to say, now you're training it on what data it's going to get from the process flow. This is specific to discussions, quizzes, and assignments. And now that we have the top of the prompt — we want you to create an announcement getting students excited for the day — this is what you're going to get in return.
For assignments, I want you to produce this HTML: you've got a line break, you've got a bolding of "assignments, there are," and then replace with the number of assignments due. Then: what's the title, what's the ID number of that assignment — because again, we want to send students back to that particular assignment. That's what it looks like for what is due. And discussions, same thing: repeat this style for each discussion due.
We have the bold for discussions, we have the title, and we have the URL that goes to that particular discussion. Following the same pattern, we've got quizzes: bold, what's due. And again, if nothing is due, it won't produce this HTML in the output. That was a long prompt, but it has every piece we need to successfully get this from the generative AI — we're using OpenAI — into Canvas.
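The HTML the prompt asks the model to emit per item type amounts to something like the fragment below — the link pattern and field names here are assumptions for illustration, since the real structure is produced by the model itself:

```python
def section_html(label, items, course_url):
    # Builds the per-type fragment the prompt describes: a bold heading with
    # the count, then one linked title per item. Empty when nothing is due,
    # so the whole section is omitted. Field names/URL shape are assumed.
    if not items:
        return ""
    parts = [f"<br><b>{label}:</b> there are {len(items)} due"]
    for item in items:
        parts.append(f'<br><a href="{course_url}/{item["id"]}">{item["title"]}</a>')
    return "".join(parts)
```

Calling it once per type (assignments, discussions, quizzes) and concatenating the results yields the announcement body the prompt targets.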
So how do we do that? What does that look like? Let's take a look. Now that we've added that to the Playground, this is what the entire prompt looks like on the OpenAI side — and I think I've got to swing this over here. You can see that it's starting to generate the text, like you've seen with ChatGPT.
So it's just giving you an idea of what that text is going to look like with that prompt ahead of it. It made the announcement, and now we're ready to use it and put it into Canvas. So let's move this back one more time and go to our presentation. No — okay, we're getting things back to where they were. Okay. So this was a sample of what we got from the generative AI. Charles and Texas. Let's get this week off to a great start! This announcement contains information on items due this week in Canvas. Please make sure to refer to the Canvas calendar to submit items on time. And then it's got all the HTML for discussions, how to get there, assignments, and quizzes. And with this output, what it looks like when we get into Canvas is that it came from me.
And this is the completion endpoint for OpenAI. Let me see — okay, that's going to show up later. Okay. So we have the View code; we added that to our Power Automate, and I'll show you what those flows look like in a minute. This is the completion endpoint we showed previously — that's all that's required from OpenAI to send your prompt. The second item there that says prompt: that's where your prompt goes. Everything we said before — that's where it went. We told it what the temperature was going to be, and then the max tokens — that's going to be really, really important, because we haven't talked about cost yet.
I'm funding all this myself, because I didn't want to do that layer with the institution. They bill based on tokens, and the short version is: one token is about three-fourths of a word, and a thousand tokens is about three cents. It looks at what you are sending and what the output is, adds those together, and that's your token count. So about half a penny to two cents is what the cost is going to be each time this runs.
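That back-of-the-envelope cost math can be sketched as follows — the ratios are the rough figures quoted in the session (one token per three-fourths of a word, three cents per thousand tokens), not official pricing:

```python
def estimate_cost_cents(prompt_words, output_words, cents_per_1k_tokens=3.0):
    # Rule of thumb from the talk: ~1 token per 0.75 words, and roughly
    # 3 cents per 1,000 tokens. Both figures are approximations.
    tokens = (prompt_words + output_words) / 0.75
    return tokens / 1000 * cents_per_1k_tokens
```

For example, a 300-word prompt plus a 150-word announcement is about 600 tokens, or roughly 1.8 cents per run — consistent with the half-penny-to-two-cents range quoted.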
So keep that in perspective if you were wanting to do something like this in your shop — that's what the cost would look like. This is the output: we've got the ID of what was there, and then under choices, the text — that's where the response comes back from OpenAI. We're sending something, we're getting something back in the response, and that's how you'd use that connector in Power Automate. Bringing all that together, this is the view from Power Automate.
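Pulling the announcement text out of the response looks roughly like this — the shape shown (choices, text, usage) is the standard completions format, but the sample values are made up:

```python
# A made-up response in the shape the completions endpoint returns.
response = {
    "choices": [{"text": "\nHowdy! Here is what is due this week...", "index": 0}],
    "usage": {"prompt_tokens": 512, "completion_tokens": 180, "total_tokens": 692},
}

# The generated announcement lives under choices[0].text; strip the
# leading newline the model tends to emit.
announcement_html = response["choices"][0]["text"].strip()
```

In the flow, that extracted text is exactly what gets handed to the "create a Canvas announcement" action.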
At the very top, you always have something that starts with a trigger, and for this particular flow, it's when an HTTP request is received. Very similar to this API, Power Automate also makes a POST URL that you send to, and when you send to that POST URL, you've started the whole process flow of everything you want to do. The way this works is that any action that happens toward the top, the next actions can use as dynamic content.
Again, I'm a low-code/no-code citizen developer. Some of you may be like, oh, just give me my Python, give me whatever you code in — I'm sorry, I'm a low-code guy. So you've got some variables that you can call.
Those are for me to clean up some of the model. The one in green is the completion going to OpenAI — and when you get something back, that one action is doing both things: it's sending, but it's also getting the output, because you need that output for the very last item on the list, which is create a Canvas announcement. So: starting with a trigger, we're cleaning up, getting our data model through the process flow, sending it to OpenAI, and then sending it to Canvas.
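That final "create a Canvas announcement" step reduces to a POST against Canvas's discussion-topics endpoint with the announcement flag set. A sketch of the request — the base URL is a placeholder, not the real instance:

```python
def canvas_announcement_request(course_id, title, message_html):
    # Canvas creates announcements through the discussion_topics endpoint
    # with is_announcement set to true. Base URL here is a placeholder.
    url = f"https://canvas.example.edu/api/v1/courses/{course_id}/discussion_topics"
    payload = {"title": title, "message": message_html, "is_announcement": True}
    return url, payload
```

The Power Automate Canvas action is doing the equivalent of this POST, with the generated HTML as the message body.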
If you want to see what that looks like, when you click on an action after something has run, you can see the model and what the output was from OpenAI. And on the Canvas announcement, when you expand it, you can see the course ID and what the announcement looks like. All that to say, this is what it looks like when everything's said and done: you've got it coming from me as the instructor, you've got the title, you've got the announcement, and away students go, able to select the items that are due for that particular week. Now we've got some decisions to make.
Who should post this announcement? Should it be the course instructor? I didn't feel comfortable that it should come from the course instructor; I thought a service account would be best, especially for this particular study. However, we are also talking about whether instructors can see this beforehand. That's another thing: should faculty be able to review and approve it first, or should it just show up in their course?
So those are the things we have to get answers to, to know what's important. For me: we are going to use a service account, and we are going to have faculty review. And because of cost, we're not going to allow regenerating the output — it's just going to be: did you use it or not, and why didn't you use it? That will be part of the study we gather as well. So let's recap everything. Step one is that we have a flow that runs every Monday morning.
That is going to be a scheduled, time-based job. Step two is that it generates output for every single course. I didn't want to get into the weeds on what that particular flow looks like — that's not going to benefit you if you don't even have Power Automate. So anyway, step two is that it takes, from every course, all the due dates, finds out how many there are, and gets them into that data model we saw before, so that, finally, in step three, faculty get an email. They'll be able to see what the AI has generated and tell us whether they accept or reject that output. And if they accept it, it goes through the whole process flow that was on the previous slide.
And finally, it will post into Canvas, and then students can optionally provide their feedback as well. For the research study, I'm looking forward to the weekly touch points from both faculty and students and the data we're going to be able to collect. This is my assumption — we're going to see if it's true or not — that the output made the due dates a more personalized experience than just the due dates in the dashboard, and that it also saves faculty time versus trying to do this themselves. That's what we're going to try to measure. With all that — again, we won't get into this — we're grounding this in the idea of nudging and behavior.
I really loved the main-stage items showing how Canvas plans to do that in their product on the roadmap. And then, this is what faculty will see — this is why I asked the question about Qualtrics. They're going to get a unique link in their email where they launch the survey; it's going to have the generative AI output, and on the bottom they'll be able to approve or reject.
And if they approve, again, it goes to that process flow. If they reject, it asks them some questions: what did we not get right? What do we need to change to make things easier and better for you? If you weren't familiar: in the URL of a Qualtrics survey, you can have predetermined variables listed. We're using what's called a person ID, a PID, as a unique identifier that makes the Qualtrics link unique to that individual, and it acts as a prefill that shows the faculty member their particular output. So we just have to use one survey, by doing a prefill with a web service call, which is what this looks like. If you do a web service call in Qualtrics, you can do single sign-on, and with single sign-on we're going to be able to capture first name, last name, email, and username, associate that with the primary ID to see if they are who they say they are — and if they are, they can continue.
And they'll see their particular output on the survey, to be able to accept or reject. This is what it looks like when we have the prefill: we're prefilling their username and their email and matching them up, and the embedded data is a variable we're setting called weekly generative output. That's what shows on the Qualtrics survey to folks. So they just see a visual survey.
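The per-faculty link amounts to the survey URL plus query-string parameters that Qualtrics captures as embedded data. A sketch — the field names here, including "weekly_generative_output", are assumed labels, not confirmed from the build:

```python
from urllib.parse import urlencode

def prefill_link(survey_url, pid, weekly_output_html):
    # PID identifies the instructor; the embedded-data field name
    # "weekly_generative_output" is an assumed label for illustration.
    params = {"PID": pid, "weekly_generative_output": weekly_output_html}
    return f"{survey_url}?{urlencode(params)}"
```

Qualtrics reads those query parameters into embedded-data fields, so one survey definition serves every instructor with their own prefilled output.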
They don't realize what's happening under the hood. If you want to implement this in Qualtrics, it's called a web service hook, and in the URL, as long as you have something to send to, you can have things brought back into Qualtrics — which is exactly what we're doing here with the AI-generated output. And then finally, we're ready to send all this. And if there was an error — because we know things are not always going to be a hundred percent — we can also put in error logging and send them to a different Qualtrics panel that says, hey, something went wrong; please contact this guy named Justin.
And then they can be on their way, say what happened, and we can look at timestamps to find out what went wrong and where we need to fix things. With all that, let me look and see what questions we have — we're going to open it up for the last couple of minutes. How are we doing? I know that was a lot. Let me see what questions came in through the QR code, and if we didn't get any there, we'll take questions face to face with everybody. Let me pull up — three responses so far.
Okay, let's pull that up. The first question: has this process been impacted by bandwidth, or been taxed? How have you dealt with this? So, this is just finishing up — we're going to be launching it in the fall.
So we will see what that looks like, but in terms of any kind of bandwidth issues — since it's a study, that's something I will be working through. I don't know if that helps answer the question for whoever had it, but: to be determined. Next: does Canvas have any sort of LTI for student-facing generative AI assistance — as in, a student could ask the assistant when assignment X is due, or what the missed-class policy is, pulled from the syllabus? I think that's a great thing we should ask Instructure. I know on the roadmap it looked like you could ask based on the data in your course, but I don't think they've quite gotten there yet. And then: is there any sort of adjustment or framing necessary for the syllabus to be easier to read by AI? That's a good question.
I don't think we have explored that just yet. So, any other questions from the audience? Yes — is there a way, if you have two different sections at the same time — say there's a Tuesday section and a Thursday section with different due dates — does the announcement know to only send to Tuesday's people? That's a good question. We'll have to see if we can segment the announcement like that.
So that's a good point — I'll have to see if we can bake that in. I would hope so, but it's going to depend on whether the announcement itself can be segmented, which I don't know. Something to consider. Yes?
This is just my own experience, from watching a similar process — the background is that a number of bullets led to students picking and choosing. Oh, true. That's true. I think one of the complications might be that it takes up an area — a particular design. Right, sure — that's something to consider, absolutely. You're able to capture due dates and bring those in. So the thought — repeating for the recording — was: have we considered, instead of a bulleted list, a numbered list, so that students didn't pick and choose or think there was flexibility? It's a good point, absolutely. I think that'd be great for the design. Any other questions before we call it a day?
Yes, sir. Yeah — the question is about the security side of things. Absolutely, yes. So on the Power Automate side, once you have found that the process works the way you want, there is a secure inputs and secure outputs slider.
And when you select that after each run, when you try to find out. So you saw on that previous slide, I I could see that, yes, you probably had a lot of questions of why were you able to see this output here? Once you have the model in place, when you try to view this, it is saying, you you can't view it because the security won't let you. So it ensures that encrypts that so that you can't see any run history for any of the items for, that particular run. Now, you might have to open it up if there's things that are not working, so you can, determine what's wrong. But, yes, you wanna, you want to add those sliders to secure inputs and outputs so that they're not viewable for each run history.
Yes. That we're using that as the middleware as the glue to, to make this work. Any other questions? Yes, sir. Can I get this straight? So power automate grants open AI access to all of the information that's in our No. What this is doing is it is, we're we're doing something where we're, those that want to partner with us.
We're looking at their courses on a weekly basis, and we're adding that to a data model. And then only those courses that partner with us are gonna be part of that generative AI. For that one piece. It's not bringing any student data over. It's just bringing the course ID, the discussion ID, the title of the course.
So very small set of information. It's actually going out to generate that response back to may I, like, how do you get that information? Absolutely. Through power automate? It is through power. So we are sending the output through power automate, so it will go to that completion API set. You can just set the criteria.
Correct. Mhmm. Absolutely. Absolutely. Alright.
So if you have any questions, be happy to answer those on LinkedIn, on Twitter, but I really appreciate your time today. I know last session of the day. So thank you very much, and I hope you have a rest, wonderful time of the conference. So Thank you.
Whether you've explored that or not, we'll talk about some definitions that you've probably all heard from Instructure's main stage that have to do with generative AI, and then the technology stack that we're using at Tarleton. We really didn't add anything new; we were using what we had, so you'll see what that looks like. And then, to ground the study — since this is a dissertation, we need a theoretical framework — there's the science behind nudging.
So we'll be talking a little bit about that, and then where does EdTech go from here? That's what we'll cover for the session outcomes. In my undergraduate, I had the privilege of taking classes with Dr. James Gentry.
One of his big quotes was to know thy student. In my opinion, I'm still a student of generative AI; I feel like everyone out here is a student of generative AI. With all that said, if you would scan this QR code, there's going to be about a six-question survey. It's just going to help me know how we want to frame the conversation for today.
Because I really want to make it something you get out of it, where you can go back to your institution or your district and say, hey, there are ways we can incorporate generative AI into some of our processes — let's explore what that looks like. So take a few minutes with that. Does anyone else need the QR code? I'm going to move from this into our form.
As you're taking it, you'll see what the questions look like. Have you already tried out ChatGPT? Have you heard of the OpenAI playground? Are you familiar with the Canvas APIs? That'll be really important for today. Do you have an understanding of JSON and what that looks like for API calls? Do you have Qualtrics? And do you have an iPaaS solution, like Power Automate, Boomi, MuleSoft — there are tons of different ones on the marketplace. That's some of how we were able to put this together. Yes? Ah, each one is a radio button, so you can only pick one.
Oh, no. Well, hey, let's change that on the fly, why don't we? Multiple answers there — if you want to refresh, or you can bypass that question altogether. It is fixed. Wonderful. Thank you. Okay.
Very good. And let's take a look and see what our responses look like when we present that. Alright. So a good number of you have already tried out ChatGPT. That is wonderful.
You're gonna see kind of a tech end of this in the OpenAI playground. If we look at that — uh-huh. Cool. This is gonna be great. Okay.
So having heard of it is going to be really important for those looking at it from a technical background: how can I do some generative AI, but I don't really want to copy and paste? I want it to be part of a workflow. You'll get to see what that looks like as the session continues. And then some of you said, I've heard of it, but I've never used it. So we've got a good set of folks there.
And then the Canvas API. So good — it looks like we're on the technical side of that, so wonderful, you've had exposure there. And JSON — so we have some exposure with what it looks like to construct those items for different REST API calls. And Qualtrics.
So we'll be able to see some new things that maybe you haven't heard about in the survey builder that you can use and incorporate. And then platform as a service. Really, if there's one thing to take away from this, it's the importance of finding a middleware solution, because that really is the glue for a lot of the processes you want to do. You can use Qualtrics as the interface and then have all this other background stuff happen. So we'll show you what we've done with Microsoft Power Automate to do process automation. So great.
Alright, last question. Okay — we've heard of GPT, and mostly large language models, which we've heard a lot about from the main stage this week. Prompt engineering is really going to be important, and we'll walk systematically through the prompt that was used as the baseline for this teaching assistant. And, okay, I think we have a good understanding of how to frame the discussion today.
Perfect. Okay. Which brings us right to our definitions. So, large language models — that's all the parameters that you heard Ryan talk about, where it went from GPT-3, which had maybe a hundred million, to GPT-4, which had a billion. That's what it trained on.
Next is GPT — what training model do you want to use? For OpenAI, you can actually take a base model, train it with your own dataset, and then use that model as your base. So anything you want to do starts from a model from OpenAI — they call them Davinci, Ada; they've got four or five different ones you can use as the base — and if you want to train on top of that, you can. For this particular session, that's beyond the scope of today.
We're going to do what's called few-shot learning, where you show just a couple of examples within the prompt itself to tell it what you want the generative output to be. Then natural language processing, which I was really excited about — I was hoping we'd see some announcements there. That's where you have that sidebar panel that comes out, and you naturally tell the UX what you want. I think we're going to be in an age where, instead of complaining that something takes too many mouse clicks or doesn't know what I want to do, you'll just tell it. We're moving in that direction. And then copilots, which goes right into that natural language processing.
So I'm really excited to see that they're already thinking about that on the Instructure roadmap — which will make some of this obsolete, but that's good. I also caught one little phrase — I don't know if you all did: whatever developers build, they're going to open up to RESTful APIs. I'm like, wouldn't it be cool if there was a generative Instructure API that we could call, instead of needing to put this infrastructure together ourselves?
Again, not part of the scope of today. I'm just thinking out loud for anybody that's part of Instructure: let's talk. Okay. Alright.
So what does this particular application do? I don't want to underwhelm you — it's really simple. I was focusing on course announcements, a weekly course announcement. What could we do to save instructors time by providing a weekly course announcement that would scrape the items due in the course for that week and produce a unique output describing what was due, with the title of each item — whether it's an assignment, a discussion, or a quiz — linked directly to that item so the student could go to it? A couple of things with that: being able to measure it.
So, adding to the body of research on generative AI: did this move the needle in any way for students to be more successful? That's a question we can ask and measure when the data comes in. Faculty can weigh in on whether they thought the generative AI output was good or not. So that's the small elevator speech about what this particular application, which we're building on OpenAI, does. Okay.
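To make the "scrape what's due this week" idea concrete, here's a minimal sketch in Python. The base URL and the `bucket=upcoming` call are assumptions about a typical Canvas setup, not the exact flow from the talk; the date-window logic is the part shown runnable.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical Canvas instance -- substitute your own.
CANVAS_BASE = "https://canvas.example.edu/api/v1"

def week_window(today=None):
    """Return (monday, next_monday) bounding the current week, in UTC."""
    today = today or datetime.now(timezone.utc)
    monday = (today - timedelta(days=today.weekday())).replace(
        hour=0, minute=0, second=0, microsecond=0)
    return monday, monday + timedelta(days=7)

def due_this_week(items, today=None):
    """Filter Canvas items (dicts with an ISO-8601 'due_at') to this week."""
    start, end = week_window(today)
    due = []
    for item in items:
        if not item.get("due_at"):
            continue  # undated items never make the announcement
        when = datetime.fromisoformat(item["due_at"].replace("Z", "+00:00"))
        if start <= when < end:
            due.append(item)
    return due

# Fetching the raw items would look something like:
#   requests.get(f"{CANVAS_BASE}/courses/{course_id}/assignments",
#                headers={"Authorization": f"Bearer {token}"},
#                params={"bucket": "upcoming"})
```

The same filter works for discussions and quizzes, since each Canvas item type carries a `due_at` timestamp.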
So, again, I already kind of alluded to this: the personal interest is that I really am into process automation. We started this in our group in the late fall of twenty twenty. We had lost a member during the height of COVID, and we needed to figure out how to augment the team and continue doing what we needed to do. That sent us deep into low-code, no-code solutions like Power Automate to do a lot of the heavy lifting for us, so that we could keep the shop open, if you will. And I really like finding ways to integrate multiple systems that you never would have thought could be glued together.
One of those was that we put a Zoom meeting for faculty in every single session. All they had to do was click on the meeting; it automatically recorded to the cloud, Panopto picked it back up, and it routed the recording back to the course. Those are the kinds of things that I get excited about. Then, of course, there's the academic research interest for the dissertation study. And I really wanted to become familiar with these large language models, knowing that there's probably a future cycle of RFPs coming where, if you're an EdTech product manager, you're asking how to incorporate this technology into what you currently offer. It was important to me to see under the hood — we've heard about AI ethics, we've heard about biases in AI — so I could start asking a lot of questions and be more informed as we renegotiate with vendors.
We're talking to new vendors, and they're starting to offer these types of services. So that's the span of why this came about. For the roadmap: we started in spring of this year, and in the last couple of weeks we finished building the application and the workflow. The hardest part was getting approval from academic leadership, because it's something new, and that just happened three weeks ago.
So we're hopefully going to put out the call for faculty for the fall to get our first set, and then we'll start analyzing the data as it comes in for spring twenty-four. My plan is to post in the Canvas community the outcomes of this particular initiative and where we landed on the generative AI research. Again, we're polling faculty, and we're also going to have a note underneath the announcement letting students know that this was generative AI — what did you think of it? So I'm really excited to see the depth we'll be able to bring to the questions we'll ask both groups.
And I also forgot to mention, on the bottom right of this slide: anytime you have a question — because I know you think of them in the moment and will forget at the end — you can scan that QR code, and I'll review any questions that come in there at the end. They're telling me that for the recording I need to say the question out loud anyway, so it's just easier if I collect them and read them out to you all. And again, I know I'm between you and evening activities. Okay. So let's take a look at the OpenAI playground for a minute and see what it looks like.
You can tell this is very different from what you've seen with ChatGPT. This is where folks that are programming come to determine how they want to add generative AI into their own products, and it's where I came to learn how I could do this alongside the Canvas APIs. This particular model here on the right-hand side — and we won't go into a lot of detail, because there's great documentation from OpenAI — is their chat model.
However, I used their completion model, because I wasn't going to be chatting back and forth across several different prompts. I was just going to send one master prompt and get one response back, and that response was going to be the announcement that ended up in the Canvas course. So we'll see the prompt that we're using as the base for the model. And I got a really good idea from the panel earlier.
I'm going to incorporate it, since we're still in the phase of launching this: personas. I love that idea. Maybe having three or four different teaching personas, where you just change a little bit of the text in the prompt — that even adds to the richness of the study. Now I'm asking faculty: did they like it, did they not, and which persona did they use? So I can see, well, this persona was adopted better, or was perceived to be better. It all goes back to what we can do to determine how generative AI will work in the classroom. All of these items here on the right-hand side control how creative you want the output to be, relative to your initial prompt — how far away are you willing to let the large language model move from what you're saying?
If you're doing creative writing, you're going to want this temperature on the far right-hand side. For my particular set of items, I needed to be on the far left, because what I noticed is that when I moved it too far, it wasn't adding the HTML items I needed to send students where they needed to go. So you can look at this, and you'll have options here that weren't available to you in ChatGPT. Take a look at what that looks like. And when you get your prompt the way that you want, there is this view code option — and this is why you're seeing an influx of generative AI from OpenAI in products.
Because that particular bit of Python or, in our case, JSON is all that's needed to start using this in products. For us, we put that JSON into Power Automate as a custom connector. And when you do it as a custom connector, it's like Lego blocks at that point: where do you want this particular action to go in your process flow? So that's what the OpenAI platform looks like for the playground. Let's go back to our presentation.
And here's what the beginning of the prompt looks like. Take a minute with it: act as a teaching assistant for a university; generate an announcement listing items due this week in Canvas. Items with zero — the way this works is that on the JSON you get back from the Canvas APIs, you can use an expression in Power Automate called length, and that tells me how many times something repeats in that JSON. That's where the zero comes from. If you just do one call, you can get up to a hundred items, but for a week, you're probably not going to see more than zero to five, zero to ten.
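The Power Automate `length()` expression just counts repetitions in the returned JSON. The same idea in plain Python — with a hypothetical, simplified shape for the parsed Canvas data — looks like:

```python
# Hypothetical shape of the parsed Canvas JSON: lists of item dicts per type.
week = {
    "assignments": [{"title": "Essay 1", "id": 101}],
    "discussions": [],
    "quizzes": [{"title": "Quiz 3", "id": 55}, {"title": "Quiz 4", "id": 56}],
}

# Equivalent of Power Automate's length() applied to each array.
counts = {kind: len(items) for kind, items in week.items()}
# counts == {"assignments": 1, "discussions": 0, "quizzes": 2}
```

Those counts are exactly what the prompt's "items with zero" / "items greater than zero" rules key off of.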
So: how many items, with zero indicating nothing is due. Then an example phrase — I'm telling it, hey, if you see zero, I want you to say something like this, and that phrase is there are no assignments due this week. Items greater than zero indicate that something is due; an example phrase for that is there are two assignments due. Or, if only one item is due: there is one quiz due, there is one assignment due.
So we're giving it few-shot, one-shot training on what we want the output to actually be whenever we get it back from the OpenAI completion API. The next part of the prompt: begin the announcement with a creative hook. We'll have to figure out whether the language model understands what a creative hook is. And again, it is not lost on me that I am a white middle-aged man, and this is where your bias can be infused.
Although this is a simple, basic prompt — really, if you've ever had to write mission, vision, and value statements for your institution or your district and wanted to rip your hair out, this is just as important as doing those, because whatever you put here is going to influence what the output is going to be. All that to say: it's a basic prompt. My hope was that I didn't inject any kind of bias, but again, you want lots of eyes on this to make sure. Just wanted to throw that out there. Then we want to tell it a little bit about what data model to expect. This is what you would consider mail merge.
Through the Canvas API, we're going to be able to tell it the course ID, the course name, and the titles of the items. I saw that I had a mistake on the slide with assignments and discussions that I wasn't able to change in time. All that to say: now you're training it on what data it's going to get from the process flow. This is specific to discussions and quizzes and assignments. And now that we have the top of the prompt — we want you to create an announcement getting students excited for the day — this is what you're going to get in return.
For assignments, I want you to produce this HTML code: you've got a line break, a bolding of assignments, there are, and then replace with the number of assignments due. And then the title and the ID number of that assignment — because, again, we want to send students back to that particular assignment. So that's what it looks like for what is due. For discussions, same thing: repeat this style for each discussion due.
We have the bold for discussions, we have the title, and we have the URL that goes to that particular discussion. Following the same pattern, we've got quizzes: bold, what's due. And again, if nothing is due, it won't produce that HTML in the output. That was a long prompt, but it has every piece we need to get this from the generative AI — we're using OpenAI — into Canvas.
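As a rough sketch — not the exact prompt from the slides — assembling that kind of few-shot prompt from the data model could look like this. The wording and field names here are illustrative:

```python
def build_prompt(course_name, assignments, discussions, quizzes):
    """Assemble a few-shot style completion prompt. The instruction text
    here is illustrative, not the exact prompt used in the study."""
    def listing(label, items):
        joined = "; ".join(f"{i['title']} (id {i['id']})" for i in items)
        return f"{label} due ({len(items)}): {joined}"

    lines = [
        "Act as a teaching assistant for a university.",
        f"Generate an announcement listing items due this week in {course_name}.",
        "If a count is zero, say something like: "
        "'There are no assignments due this week.'",
        "If a count is greater than zero, say something like: "
        "'There are two assignments due.' or 'There is one quiz due.'",
        "Begin the announcement with a creative hook.",
        listing("Assignments", assignments),
        listing("Discussions", discussions),
        listing("Quizzes", quizzes),
        "For each item, emit bolded HTML with a link built from the item's id.",
    ]
    return "\n".join(lines)
```

The counts in each `listing` line are what let the model choose between the zero-items phrase and the n-items phrase.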
So how do we do that? What does that look like? Let's take a look. Now that we've added that to the playground, this is what the entire prompt looks like on the OpenAI side — I think I've got to swing this over here — and you can see that it's starting to generate the text, like you've seen with ChatGPT.
So it's just giving you an idea of what that text is going to look like with that prompt ahead of it. It made the announcement, and now we're ready to use that and put it into Canvas. So let's move this back one more time and go to our presentation. No —
Okay. We're getting things back to where they were. Okay. So this was a sample of what we got from the generative AI. Charles and Texas.
Let's get this week off to a great start! This announcement contains information on items due this week in Canvas. Please make sure to refer to the Canvas calendar and submit items on time. And then it's got all the HTML: discussions and how to get there, assignments, and quizzes. And with this output, what it looks like once it gets into Canvas is that it came from me.
And this is the completion endpoint for Open AI. Let me see. Okay. That's gonna show up later. Okay.
So we have the view code. We added that to our Power Automate, and I'll show you what those flows look like in a minute. But this is the completion endpoint we showed previously; that's all that's required from OpenAI to send your prompt. The second item there, the one that says prompt — that's where your prompt goes. Everything we said before, that's where it went. We told it what the temperature would be, and then the max tokens — that's going to be really important, because we haven't talked about cost yet.
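Outside Power Automate, the same request and response handling can be sketched in a few lines of Python. The model name and parameter values are illustrative; the shape — a `prompt`, a `temperature`, a `max_tokens` cap, and the generated text under `choices[0].text` — is what the talk describes for the legacy completions endpoint:

```python
OPENAI_URL = "https://api.openai.com/v1/completions"

def completion_payload(prompt, temperature=0.2, max_tokens=500,
                       model="text-davinci-003"):
    """Build the JSON body for OpenAI's (legacy) completions endpoint.
    Model name and parameter values here are placeholders."""
    return {
        "model": model,
        "prompt": prompt,
        "temperature": temperature,  # low: stay close to the prompt's HTML
        "max_tokens": max_tokens,    # caps output length, and therefore cost
    }

def extract_text(response_body):
    """The generated announcement comes back under choices[0].text."""
    return response_body["choices"][0]["text"].strip()

# The actual send (e.g. with the requests library) would look like:
#   requests.post(OPENAI_URL, json=completion_payload(prompt),
#                 headers={"Authorization": f"Bearer {api_key}"})
```

In Power Automate, that same payload is what the custom connector carries.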
I'm funding all this myself, because I didn't want to add that layer with the institution. They bill based on tokens, and the short version is that one token is about three-fourths of a word, and a thousand tokens is about three cents. It looks at what you're sending and what the output is, adds those together, and that's your token count. So about half a penny to two cents is what the cost will be each time this runs.
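Using the speaker's rough numbers (1 token ≈ ¾ of a word, 1,000 tokens ≈ $0.03 — a ballpark from the talk, not an official price list), the per-run arithmetic works out like this:

```python
def estimate_cost(prompt_words, output_words,
                  words_per_token=0.75, dollars_per_1k_tokens=0.03):
    """Ballpark cost of one run, billed on prompt plus output tokens."""
    tokens = (prompt_words + output_words) / words_per_token
    return tokens / 1000 * dollars_per_1k_tokens

# A ~300-word prompt plus a ~150-word announcement:
cost = estimate_cost(300, 150)  # 600 tokens -> $0.018, i.e. about 2 cents
```

That lands right in the "half a penny to two cents" range mentioned above.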
So keep that in perspective if you're wanting to do something like this in your shop — that's what the cost would look like. This is the output: we've got the ID of what was there, and then, under choices, the text — that's where the response comes back from OpenAI. We're sending something, we're getting something back, and that's how you'd use that connector in Power Automate. Bringing all that together: this is the view from Power Automate.
At the very top, you always start with a trigger, and for this particular flow, it's when an HTTP request is received. Very similar to this API, Power Automate also gives you a post URL that you send to, and when you send to that URL, you've started the whole process flow of everything you want to do. The way this works is that whatever an action at the top produces, the next actions can use as dynamic content.
Again, I'm a low-code, no-code citizen developer. Some of you may be like, oh, just give me my Python, give me whatever you code in. I'm sorry — I'm a low-code guy. So you've got some variables that you can call.
Those are for me to help clean up some of the model. The one in green is the completion going to OpenAI — and when you get something back, that one action is doing both things: it's sending, but it's also getting the output, because you need that output for the very last item on this list, which is create a Canvas announcement. So: we're cleaning up, we're getting our data model into the process flow, sending it to OpenAI, and then sending it to Canvas.
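That last step — creating the announcement — maps to a single Canvas REST call. In the Canvas API, an announcement is a discussion topic with `is_announcement` set. A sketch in Python (base URL, course ID, and token are placeholders):

```python
def announcement_request(base_url, course_id, title, html_body):
    """Build the POST that creates a Canvas announcement. In Canvas,
    an announcement is a discussion topic with is_announcement=true."""
    return {
        "url": f"{base_url}/api/v1/courses/{course_id}/discussion_topics",
        "data": {
            "title": title,
            "message": html_body,   # the generated HTML announcement body
            "is_announcement": True,
        },
    }

# Sending it (token is a placeholder for a real Canvas API token):
#   req = announcement_request(base, course_id, "This Week in Canvas", html)
#   requests.post(req["url"], data=req["data"],
#                 headers={"Authorization": f"Bearer {token}"})
```

In the flow, the `message` field is exactly the `choices[0].text` that came back from OpenAI.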
I want to show what it looks like when you click on an action after something has run: you can see the model and what the output was from OpenAI. And on the Canvas announcement action, when you expand it, you can see the course ID and what the announcement looks like. All that to say, this is what it looks like when everything's said and done: it comes from me as the instructor, you've got the title, you've got the announcement, and away students go to the items that are due for that particular week. Now we've got some decisions to make.
Who should post this announcement? Should it be the course instructor? I didn't feel comfortable with that; I thought a service account would be best, especially for this particular study. That's another thing: should faculty be able to review and approve this first, or should it just show up in their course?
Those are the things we had to get answers to before knowing what's important. For us: we're going to use a service account, and we are going to have faculty review. And because of cost, we're not going to allow regenerating the output — it's just going to be, did you use it or not, and why didn't you use it? That'll be part of the study data we gather as well. So let's recap everything again. Step one is that we have a flow that runs every Monday morning.
That's a scheduled, time-based job. Step two: it generates output for every single course. I didn't want to get into the weeds of what that particular flow looks like — that's not going to benefit you if you don't even have Power Automate. So step two takes, from every course, all the due dates, finds out how many there are, and gets them into that data model we saw before, so that in step three faculty get an email. They'll see what the AI has generated and be able to accept or reject that output. And finally, if they accept it, it goes through that whole process flow that was on the previous slide.
And finally, it posts into Canvas, and students can optionally provide their feedback as well. For the research study, I'm looking forward to the weekly touch points from both faculty and students and the data we'll be able to collect. My assumption — and we'll see if it's true or not — is that the output made the due dates a more personalized experience than just seeing them in the dashboard, and that it saves faculty time compared to doing this themselves. That's what we're going to try to measure. With all that — and again, we won't get deep into this — we're grounding the study in the idea of nudging behavior.
I really loved the main-stage items showing how Canvas plans to do that in their product on the roadmap. And then this is what faculty will see — this is why I asked the Qualtrics question earlier. They'll get a unique link in their email where they launch the survey; it will have the generative AI output, and on the bottom they'll be able to approve or reject.
If they approve, again, it goes to that process flow. If they reject, it asks them some questions: what did we not get right? What do we need to change to make things easier and better for you? If you weren't familiar, in the URL of a Qualtrics survey you can have predetermined variables listed. We're using what's called a person ID, a PID, to get a unique identifier that makes the Qualtrics link unique to that individual, and it acts as a prefill that shows the faculty member their particular output. So we only need one survey, by doing a prefill with a web service call, which is what this looks like. With a web service call in Qualtrics you can do single sign-on, and with single sign-on we capture first name, last name, email, and username. We can then associate that with the primary ID to verify they are who they say they are, and if so, they can continue.
They'll see their particular output on the survey and be able to accept or reject. This is what it looks like with the prefill: we're prefilling their username and their email, and then we're matching them up. And the embedded data is a variable we set called weekly generative output — that's what shows on the Qualtrics survey to folks. So they just see a simple survey.
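A sketch of how that per-person link might be assembled. The survey URL and field names (`PID`, `CourseID`) are placeholders; the general mechanism — Qualtrics capturing query-string parameters as embedded data — is what the prefill relies on:

```python
from urllib.parse import urlencode

def survey_link(base_survey_url, pid, course_id):
    """Build a per-faculty Qualtrics link. Query-string parameters can be
    captured as embedded data on the Qualtrics side; the field names used
    here are illustrative."""
    return base_survey_url + "?" + urlencode({"PID": pid,
                                              "CourseID": course_id})

link = survey_link("https://example.qualtrics.com/jfe/form/SV_abc123",
                   "jdoe42", 1234)
# -> https://example.qualtrics.com/jfe/form/SV_abc123?PID=jdoe42&CourseID=1234
```

On the Qualtrics side, the flow would match the PID to the SSO identity and use it to look up that person's generated announcement for display.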
They don't realize what's happening under the hood. If you want to implement this in Qualtrics, it's called a web service hook, and in the URL, as long as you have something to send to, you can have things brought back into Qualtrics — which is exactly what we're doing here with the AI-generated output. And then finally, we're ready to send all of this. And if there's an error — because we know things are not always going to be a hundred percent — we also have error logging, and we send them to a different Qualtrics panel that says, hey, something went wrong, please contact this guy named Justin.
Then they can be on their way after telling us what happened, and we can look at timestamps to find out what went wrong and where we need to fix things. With all that, let me see what questions we have — we'll open it up for the last couple of minutes. How are we doing? I know that was a lot. Let me see what questions came in through the QR code, and if we didn't get any there, we'll take questions face to face. So let me pull those up — three responses so far.
Okay, let's pull that up. The first question: has this process been impacted by bandwidth, or been taxed? How have you dealt with this? So, this is just finishing up — we're going to be launching it in the fall.
So we'll see what that looks like, but in terms of any kind of bandwidth issues, since it's a study, that'll be something I'll be working through. I don't know if that helps answer the question, but, yeah, to be determined. Next: does Canvas have any sort of LTI for student-facing generative AI assistance — as in, a student could ask the assistant when x assignment is due, or what the missed-class policy is, pulled from the syllabus? I think that's a great thing we should ask Instructure. I know on the roadmap it looked like they were heading toward being able to ask questions based on the data in your course, but I don't think they've quite gotten there yet. And then: is there any sort of adjustment or framing necessary to make the syllabus easier for AI to read? That's a good question.
I don't think we've explored that just yet. Any other questions across the audience? Yes — is there a way, if you have two different sections at the same time — say there's a Tuesday section and a Thursday section with different due dates — does the announcement know to send only to Tuesday's people? That's a good question. We'll have to see if we can segment out the announcement like that.
So that's a good point. I'll have to see if we can bake that one in. I would hope so, but it's gonna depend on whether the announcement itself can be segmented, which I don't know if it can or not. So, something to consider. Yes.
This is just my own experience, watching a similar process. Mhmm. The background is that a bulleted list of items led to students picking and choosing. Oh, true. That's true. I think one of the complications might be -- Uh-huh.
-- that it takes a particular design. Right. Sure. That's something to consider. Absolutely.
So, repeating for the recording: you're able to capture due dates and bring those in, and the thought was, have we considered, instead of it being a bulleted list, making it a numbered list, so that students don't pick and choose or think there's flexibility in what's due. It's a good point. Absolutely. I think that'd be great for the design. Any other questions before we call it a day?
Yes, sir. Yeah. This may be something I mentioned, but you want to know about the security side of things? Absolutely. Yes. So on the Power Automate side, once you have found that the process works the way that you want, there is a secure input and a secure output slider.
And when you select that, then after each run, when you try to view the details, you can't. So you saw on that previous slide that I could see the output there, and you probably had a lot of questions about why I was able to see it. Once you have those sliders in place, when you try to view it, it says you can't view it because the security setting won't let you. It encrypts the data so that you can't see the run history for any of the items in that particular run. Now, you might have to open it back up if there are things that are not working, so you can determine what's wrong. But, yes, you want to turn on those sliders to secure inputs and outputs so that they're not viewable in each run's history.
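For reference, Power Automate flows compile down to a Logic Apps-style workflow definition, where that toggle corresponds to a secureData entry in the action's runtimeConfiguration. A minimal config sketch; the action name here is illustrative, not from the presenter's actual flow:

```json
{
  "Send_prompt_to_OpenAI": {
    "type": "Http",
    "runtimeConfiguration": {
      "secureData": {
        "properties": ["inputs", "outputs"]
      }
    }
  }
}
```

With this set, the run history obscures that action's inputs and outputs instead of showing them in plain text.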
Yes, we're using that as the middleware, as the glue, to make this work. Any other questions? Yes, sir. Can I get this straight: so Power Automate grants OpenAI access to all of the information that's in our courses? No. What this is doing is, for those instructors that want to partner with us,
we're looking at their courses on a weekly basis, and we're adding that to a data model. And then only those courses that partner with us are gonna be part of that generative AI piece. It's not bringing any student data over. It's just bringing the course ID, the discussion ID, and the title of the course.
So, a very small set of information. That's what actually goes out to generate the response that comes back. And how do you get that information? Through Power Automate? It is through Power Automate. We are sending the output through Power Automate, so it will go to the completions API. You can just set the criteria.
Correct. Mhmm. Absolutely. Absolutely. Alright.
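To make the data-minimization point concrete, here is a hedged Python sketch of the kind of request body that could be built from just those three fields. The function name, prompt wording, and model choice are my assumptions for illustration, not the presenter's actual flow; note that no student data appears in the payload.

```python
# Illustrative sketch: build a chat-completions request from only the
# minimal Canvas fields mentioned in the talk (course ID, discussion ID,
# course title). Names and model choice are assumptions, not the
# presenter's actual schema.

def build_announcement_request(course_id, discussion_id, course_title):
    """Assemble a request body that contains no student data."""
    prompt = (
        f"Write a friendly weekly announcement for the course "
        f"'{course_title}' (Canvas course {course_id}, "
        f"announcement topic {discussion_id}), summarizing what is due this week."
    )
    return {
        "model": "gpt-3.5-turbo",  # assumed model choice
        "messages": [{"role": "user", "content": prompt}],
    }

# Power Automate's HTTP action would POST this JSON to the OpenAI
# chat completions endpoint with an Authorization: Bearer header.
payload = build_announcement_request(1234, 5678, "Intro to Biology")
```

Keeping the prompt assembly in one small function like this makes it easy to audit exactly which fields ever leave the LMS.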
So if you have any questions, I'd be happy to answer those on LinkedIn or on Twitter, but I really appreciate your time today. I know it's the last session of the day. So thank you very much, and I hope you have a wonderful rest of the conference. Thank you.