Plagiarism and copyright infringement have become increasingly prevalent online during a time of information uncertainty and overload. Social media acts as a smokescreen, with anonymity serving as a shield against consequences. People increasingly rely on machine learning tools or copy others' work to produce “original” articles, social media posts, and more.
However, there are ways to combat this by following specific criteria and applying media literacy laws.
To prevent copyright infringement and plagiarism from contributing to misinformation in the media, it’s essential to apply Columbia University’s four fair use factors, which weigh the purpose and character of the use, the nature of the copyrighted work, the amount used, and the effect on the work’s market, when using intellectual property in public and academic settings. This approach encourages ethics in research, article and essay writing, and media creation.
When these factors aren’t met, problems can arise with the law, publishers, the original owner, and the audience. In Fairey v. Associated Press, Shepard Fairey created the “Hope” poster of Obama, popularized during the 2008 presidential campaign, based on a photo taken by the Associated Press (AP). When AP accused him of copyright infringement, Fairey sued to deny the violation, and AP countersued, since he had not credited or compensated AP for use of the picture. The parties settled, with Fairey agreeing not to use any AP photos without a license and to share profits with AP from sales of the poster and works based on it. The case shows that anyone can face the consequences of copyright infringement.
CHS Audio Video Production and Journalism teacher Dr. Parrish has extensive experience with copyright and originality in student-written articles and media footage.
“You have to meet the criteria of the… [factors] of journalism. When we had a monthly newscast, we covered the California fires, and…. we’re not affiliated with any of these mass-market media sources…that being said, we… use[d] the footage in limited quantity… it can’t be treated like an Instagram reel,” Parrish said. “Instagram and TikTok are outlets where you think…it’s almost okay to [use copyrighted footage], but it’s not.”
Determining originality objectively is a pressing concern for teachers who want to apply media literacy laws to teach students to identify fake news and biased information, including machine-generated content, as people remain divided on AI’s potential uses and reliability.
While many students use ChatGPT for brainstorming, research, idea generation, or access to a 24/7 editor, some use it dishonestly on assignments, blindly accepting blocks of machine-generated text as a substitute for original work, which sows skepticism among teachers and perpetuates plagiarism.
Roger Wang has firsthand experience with the benefits of using ChatGPT to find information while saving time and energy.
“ChatGPT is a useful tool for providing information, as it answers queries more precisely than search engines can. Personally, I have saved a lot of time by using ChatGPT instead of clicking through convoluted forums and fluffy blogs,” Wang said. “I see it as a meaningful step forward in the convenience of accessing information, the next iteration after search engines.”
At the same time, Wang has observed that people rely blindly on ChatGPT instead of using their own creativity and rational thinking to complete tasks and form conclusions.
“However, [ChatGPT] may also be a hindrance. Many like to use ChatGPT not as a tool for answering queries but as a source of intellectual labor. They rely on ChatGPT for direction, content, and decision-making. While these functions of ChatGPT allow people to be more productive, I think they also act as substitutes for independent thought,” Wang said. “Rather than challenging themselves to create unique solutions to problems, people can just follow ChatGPT’s instructions. Rather than venturing to form individual opinions, people can just adopt the views presented by ChatGPT.”
Ms. Matos, the digital learning coach at Centennial High School, reminds users to carefully evaluate the information AI gives them.
“Keep in mind, AI is biased, and it includes misinformation. It’s only as good as the people who trained it, so it’s whatever information they fed into it, and if they have a bias, that is built into the information that is included…AI is just predicting what a good response would be….So if the information it is pulling from is inaccurate or misleading or missing information, that will be represented in the information that gets back to you.”
As a result, bias may cause AI to underrepresent minorities in the information it produces or to lack nuance when describing their religions and cultures. Accuracy also suffers from AI hallucinations, in which the system makes up information, and from misinformation passed along to users, both of which undermine AI’s credibility as a research supplement.
Some teachers embrace AI as a tool for students; for example, students can use it to summarize and simplify scholarly research sources. The College Board, a not-for-profit organization that assesses college-level readiness, tests students on topics like AI, copyright law, and user privacy in media through courses such as Psychology, English Language and Composition, and Computer Science Principles.
Matos supports the use of AI to supplement learning as long as it doesn’t interfere with understanding the curriculum.
“The best practices [with AI for teaching and learning] would be using it to brainstorm…using it as a support and not as a replacement for learning, and not trying to say it is your work, because it is not,” Matos said. “…It’s also good for if you have a long text, and maybe you’re having some difficulty with understanding it, putting in some of that and asking for a summary of it to help you with your comprehension of it. That’s also a really good use for it.”
Teachers can use different methods to teach media literacy within their limited curriculum.
“How you teach [students]…[is] up to the interpretation of the teacher,” said Parrish. “We still have to teach…ethics. It’s pretty limited…there’s a handful of PowerPoints…that keep us… grounded.”
With copyright infringement and plagiarism posing persistent problems in the media, learning how to avoid them is essential. Educators navigate a rapidly changing technological landscape to foster critical thinking and ethical research practices, preparing students to discern fact from fiction in their academic and personal lives while minimizing plagiarism and copyright infringement in their own online presence. Adhering to Columbia University’s fair use factors and recognizing that ChatGPT offers convenient access to information while raising concerns about dependency and inaccuracy can lead to more responsible use of technology in the digital age.