Discussion boards are another area where GenAI can help in your teaching practice. The discussion board has been a cornerstone of online learning for decades. The activity is analogous to an in-person discussion: guided by the instructor, but student-driven. There are, however, substantial challenges in managing this experience, such as generating engaging topics, ensuring meaningful student participation, and addressing academic integrity issues.
Discussion board assignments should be aligned with applicable learning objectives. Discussions usually involve application, analysis, or the creation of content, all of which are higher-level activities in Bloom’s Taxonomy, and such learning activities are effective when students can first define or identify concepts from the course materials.
Learning objectives tied to this activity should therefore involve students analyzing, applying, evaluating, or synthesizing ideas from the course materials. In the course sequence, discussion boards may need to appear after students have already practiced the basic concepts and core definitions attained through reading, lecture, or other practice activities.
Topic generation is also a design issue for the course. One way to address it is to work with GenAI to generate potential discussion prompts. For example, you could provide the GenAI with a copy of your lecture transcript or assigned reading materials along with a prompt like: “Chat, can you generate some discussion board writing prompts from the enclosed materials that relate to the learning objective …”
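If you prefer to script this step rather than paste materials into a chat window, a minimal sketch using the OpenAI Python SDK might look like the following. The model name, file name, and learning objective are placeholder assumptions, not part of any specific workflow; substitute your own.

```python
# Minimal sketch: generate candidate discussion prompts from course materials.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

# Load the source material, e.g., a lecture transcript or reading excerpt.
with open("lecture_transcript.txt", encoding="utf-8") as f:
    materials = f.read()

# Placeholder learning objective; replace with your own.
objective = "analyze how service-of-process rules apply to modern scenarios"

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {
            "role": "user",
            "content": (
                "Generate five discussion board writing prompts from the "
                "enclosed materials that relate to this learning objective: "
                f"{objective}\n\n---\n{materials}"
            ),
        }
    ],
)

print(response.choices[0].message.content)
```

As with the chat-window version, the output is only a starting point: each candidate prompt still needs to be reviewed against your course objectives before use.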
From the student’s viewpoint, discussion board topics should have some relevance to the stated objectives of the course; otherwise these assignments may be viewed as optional “busywork” that can easily be skipped (or that can be answered with GenAI tools – your students are certainly capable of that). Discussion boards can therefore also create concerns about academic integrity and, more broadly, about achieving the stated learning objectives and establishing a record of student achievement of those objectives.
Suppose, like me, you are teaching a law course. You could ask Chat to brainstorm ideas for a discussion board on pleadings, for example. In seconds, it might offer several suggestions—some good, some not-so-great—but at least you’ve got a starting point. Maybe one idea involves exploring the modern implications of service of process, like whether digital methods should replace in-person delivery.
AI can’t do all the work for you, though. You’ll still need to filter these ideas and tweak them to match your course objectives, and potentially to align them with the results of a diagnostic assessment conducted in advance (for example, did students demonstrate an understanding of the basic definitions and concepts you now want them to write about more extensively in the discussion topic?). In your course design, you may also want discussion boards to depend on the completion of prior activities that demonstrate student understanding of the relevant materials, such as by configuring release conditions in Brightspace.
In response to the prompt above, Chat generated about ten different writing prompt ideas based on the excerpt from my OER materials on pleadings. Not all of them fit where we would be in the course. For example, it generated a topic related to joinder, but that concept may be too complex at this stage, or simply not a focus of my plan for introducing the litigation process.
On the other hand, Chat also came up with the question of whether we ought to have different rules for service of process that permit digital service on a defendant. That kind of topic might be more engaging for students because it is a relevant, current issue, and one that most court rules of procedure for service do not address well.
Discussion prompts that invite informed student opinion, ask for research sources to support the response, and connect by design to prior learning in the course are more likely to be effective in achieving the set learning objective.
Using Chat to generate ideas also works for in-person discussions, particularly if your students have access to a device they can use for research during the discussion.
This gets to the other issue: whether students are simply using GenAI to respond to the prompt without really contributing anything of their own. An effective discussion design would include a policy statement on the appropriate use of tools like GenAI (and of other tools: may they Google it?). In drafting that policy, you should consider whether you can fairly enforce it. [Research on detection tools, false positive rates, and detection-defeat tools]
A second question is whether the discussion board topic challenges students sufficiently. A discussion board where students just need to define a key term is not likely to be viewed as useful or engaging, whereas asking students to write a master’s-level thesis as an expert in the field will likely discourage participation. Part of the instructor’s challenge is to find the bowl of porridge that is neither too hot nor too cold, but just right.
For example, suppose I want my students to apply the service of process rules for serving a corporation. The prompts might ask which Maryland Rules apply to this situation, with a follow-up asking students to explain the steps for effecting service. Whom do you serve? What are your options? How do you research a corporation’s resident agent? Who else could you serve? What happens if there is no resident agent? What are the alternatives to personal service, and when might you be able to use them to effect service?
This is a more interesting discussion topic because it requires the student to investigate the question in multiple steps. Asking students to cite to the applicable Rule or authority (and then verifying that they got it right when assessing the discussion) also improves the discussion topic while reducing concerns about academic integrity.
One thought here is that perhaps we should not get so wrapped around the axle about whether the student’s submission was entirely written by hand. I know that’s a bit controversial to say, but this may be the wrong focus for students entering an AI world.
Instead, maybe we need to train students to evaluate whether the AI drafted the response correctly, placing the student in the position of managing the AI to get the correct result. What would a manager do to make sure the work was right? Putting students in the position of evaluating work product, with appropriate preparation, should lead to a better result. Of course, students first need the opportunity to define the terms and identify the concepts before we ask them to evaluate how Chat did with them.
Perhaps our inquiry for the student should begin by asking what sources the GenAI provided and whether those sources actually exist and support the written statements. Put the student in the position of correcting AI hallucinations by engineering more effective prompts for the GenAI.
Because GenAI can reduce the time needed to complete more basic tasks, I think it is fair to expect more of my students: with its help, they can draft responses that are free of grammatical errors and logically coherent with minimal effort. In exchange for being able to use these tools ethically, I believe I can expect better work product supported by citation to validated authority.
Returning to the idea suggested earlier by Chat concerning digital service: could we develop a discussion prompt where students offer an opinion – supported by evidence – as to whether the court system should permit digital service of process rather than requiring what is essentially in-person-only service?
An opinion-type discussion may be more inviting for students because the barrier to entry is lower: everyone can have an opinion about a topic relevant to the assigned reading. Such a prompt could include a follow-up asking students to provide a source for their opinion and to discuss the potential advantages and disadvantages of traditional in-person service of process.
So, even if students are permitted to use GenAI to get started or to generate ideas about the topic, the work product ultimately needs to include valid sources that the students have examined to support their position, while engaging them in a real-life debate that relates to the course objective of learning how the litigation process works.
GenAI helps faculty, and potentially students, with the “blank page” problem of not knowing where to start by offering ideas that the user can accept or reject. As a course designer, this puts the faculty member in a position to assess whether a particular idea connects sufficiently with the intended learning objective and whether the writing prompt is appropriate for the taxonomy level of that objective.