
Artificial Intelligence

Plagiarism and Attribution

In this video, Jack Bernard, Lecturer at the University of Michigan Law School and Associate General Counsel at the University of Michigan, discusses plagiarism, bias, and discrimination considerations for generative AI tools.

"Staff and student perceptions of plagiarism" by jobadge is licensed under CC BY-NC 2.0.

Excerpt from Transcript

I'm Jack Bernard and I'm an attorney in the general counsel's office here at the University of Michigan. I also teach in our schools of law, education, public policy, and information. I have been thinking about ChatGPT since I first heard about it, which was about eight months ago, and I've responded to lots of questions from people in our community about how it works and what implications it creates.

Well, I think you don't have to plagiarize when you're using ChatGPT, or any other AI that's out there. I think the opportunities with these technologies are amazing for us. There are a tremendous number of things that people might be able to do with these AI technologies that don't involve any academic misconduct or misrepresentation of any kind. That is to say, we use technology all the time when we're doing our work. The challenge with these kinds of AI technologies is that they're able to do a lot of the work for us, and because they do a lot of the work for us, it might be tempting to use them without giving them credit or without acknowledging that we didn't do that work. Of course, in the academy, the coin of the realm is acknowledgment and credit.

We might struggle with this new technology because the lines are unclear right now. For instance, we have students who might go to a friend and say, "Hey, could you just read this paragraph for me and give me a sense of whether it makes sense to you?" Well, we don't typically acknowledge that support, and now lots of students use technologies like the recommendations in Word or Grammarly. I think those things don't get acknowledged. Of course, those things aren't capable of writing an entire paper or an entire response to a question. I think people are apprehensive about the possibility of replacing a student's or even a faculty member's own thinking with the product of something like ChatGPT.

I do think there are lots of ways to use ChatGPT and other AI technologies without engaging in plagiarism, but I also don't think it's hard to slip into a world where someone might have plagiarized. It's easy, when you've run out of time, to lean on another source; we already know our students do that. They go to Wikipedia or they go to someone else's article and just cut bits of it and put it into their own work without acknowledging it.

It's worth also saying something about what plagiarism is. Plagiarism is not a legal violation; it's a violation of academic norms. When someone plagiarizes, they don't acknowledge the source of specific language or even of their thinking, or they attribute it to somebody else, like adding someone else's name to your paper, perhaps in order to attract more readers. I don't think it's necessary, when using technologies like ChatGPT, to keep those things hidden. In fact, I expect that students will say, "I created this paper and I used ChatGPT to help me." That's a possibility. How faculty will respond to that, that's a whole new question.

I think just because ChatGPT looks across a large corpus of information to produce an individualized, independent response to a query, drawing on the experience it has had reading lots of other materials, that doesn't make it plagiarism, in the same way that it doesn't make it plagiarism for me to have formulated all the words I'm producing right now from all of my experiences over a lifetime.
I think ChatGPT is doing essentially what people do, maybe not exactly the same way, but in ways that are actually very similar: it accumulates knowledge and assembles a perspective based on what it has taken in. Now, it probably isn't conscious just yet, but it's a lot like going to somebody else who does the same thing to get information. I don't think it's necessarily plagiarism on the part of ChatGPT to create new expression from being exposed to lots of things. It is possible, I suppose, for ChatGPT to quote a third party without someone knowing about it, and I guess in that case it could be plagiarism, but I think that's unlikely to be the case. From what I can tell, there are some safeguards there. There are probably safeguards that can be circumvented in some way or another; we have yet to see. But for the most part, just assembling an amalgam of the information it's been exposed to and creating original expression, that in itself is not plagiarism. That's just communication from a piece of technology.

I think it's difficult to know what kind of citation culture we're going to create around these smart technologies, these AI technologies. Right now we don't know what to do. I think students, faculty, and publishers are going to be a little confused about how to acknowledge whether their materials were created in part through a collaboration with some AI. I don't think we know the answer to that question yet. If it were me, I would err on the side of letting people know that this is what I used, because I think it's about integrity. We want, in the academy in particular, to make sure that we're citing our sources. I think students and faculty should make their best efforts to create normative standards where there are expectations about what people can and cannot use. Of course, in some classrooms, faculty members will be using ChatGPT as part of the teaching model. I think maybe that will be baked in, and a student wouldn't have to mention it because it's part of the original assignment. But I think if a student or a faculty member, say when making a publication, is using ChatGPT, it's a good idea to acknowledge it, especially in these early stages, because I think it provokes conversations that will enable us to think about what our culture around citing AI will be.