Tips for Using and Interpreting SafeAssign
Revision as of 17:23, 15 June 2016
SafeAssign is a plagiarism detection tool embedded in Blackboard Learn. It compares student submissions against a set of databases and assesses them for originality.
How to Enable SafeAssign
SafeAssign can be used in two different ways:
1) Part of the Assignment Submission Process
Instructors using the Blackboard Assignments tool can enable SafeAssign so that all student submissions are compared against the databases. To do so, use the following steps:
- When setting up an Assignment, click on Submission Details
- Click the checkbox to enable SafeAssign.
- Select whether or not to allow students to view their own originality reports and whether to exclude student submissions from the global database
- Proceed with setting up the Assignment
For more details on how to use SafeAssign to scan submissions, please see the Blackboard Help Files.
2) Direct Submit
Instructors may also submit assignments themselves directly to SafeAssign. To do so, use the following steps:
- Go to the Control Panel, select Course Tools, and click on SafeAssign
- Click the DirectSubmit link
- Click the Submit Paper button (top right corner of the page)
- Upload the file OR copy and paste the text directly into the provided box.
- Click Submit. Note that processing often takes some time, so the originality report will not be available right away. Once the blue bar under SA Report changes to a green checkmark, the report is ready.
- Click the green checkmark to view the originality report.
For more information on how to use the Direct Submit tool please see the Blackboard Help Files on SafeAssign's Direct Submit feature.
How do I interpret the Originality Reports?
SafeAssign originality reports should be interpreted critically and with caution. While Blackboard suggests that any score under 15% indicates the work is most likely original and any score over 40% indicates excessive reliance on outside sources, this is not necessarily the case. Independent analyses of SafeAssign originality reports demonstrate that, much like other plagiarism detection programs, results include both false positives and false negatives (Hunt & Tompkins, 2014; Hill & Page, 2009; Gillis et al., 2009). A majority of the text highlighted by SafeAssign (up to 70%) will consist of the reference list, properly cited quotations, common or topic-specific phrases and terms, and jargon. SafeAssign may also attribute original sources inaccurately, and it sometimes fails to detect plagiarism that comes from custom essay services, from sources not included in its databases, or from plagiarized material that has been run through article-rewriting software.
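To make Blackboard's suggested thresholds concrete, here is a minimal Python sketch that maps a matching score to the interpretation bands described above. The function name is our own, and, as the discussion cautions, these bands are only a starting point for manual review, not a verdict:

```python
def interpret_score(matching_percent):
    """Map a SafeAssign matching score to Blackboard's suggested reading.

    The bands (under 15%, over 40%) come from Blackboard's guidance quoted
    above; because reports contain false positives and false negatives,
    every result should still be reviewed by the instructor.
    """
    if matching_percent < 15:
        return "likely original per Blackboard's guidance; verify manually"
    elif matching_percent <= 40:
        return "review the matches: references, quotations, and common phrases are often flagged"
    else:
        return "high overlap: inspect the full report before drawing any conclusion"

print(interpret_score(12))
```

Note that a 12% score yields the "likely original" band here, even though, as the examples below show, such a score can still conceal undetected plagiarism.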
Further, research on student writing indicates that misuse of sources, while very common, is often rooted in a lack of skill rather than an intent to copy or deceive (Jamieson, 2013; Howard, Serviss, & Rodrigue, 2010). Students who do not fully comprehend complex sources, and who struggle to synthesize difficult ideas, will often “patchwrite”: borrowing phrases, vocabulary, and syntax from their sources and generating text that is judged “too similar” to the original. Identifying patchwriting as plagiarism is problematic, however, because these students lack the sophistication to ground their ideas in extant research while also bringing their own voice into their writing. Treating patchwriting as plagiarism may penalize students in situations where instruction would be the more appropriate response.
Because patchwriting is often flagged as unoriginal by plagiarism detection software, while a copied article run through “rewriting” software that substitutes enough synonyms can trick the detection algorithm into judging the work original, any originality report from plagiarism detection software should be viewed with skepticism. Please see the attached SafeAssign Examples and Analysis for examples and originality reports that demonstrate some of the challenges of plagiarism detection software.
Note: SafeAssign cannot process documents larger than 10 MB.
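Because oversized uploads are rejected, it can save time to check a file's size before submitting it. A minimal Python sketch, assuming only the 10 MB limit stated above (the constant and helper names are our own):

```python
import os

# 10 MB upload limit noted on this page (assumed to be decimal-binary MiB here)
MAX_SAFEASSIGN_BYTES = 10 * 1024 * 1024

def within_upload_limit(path):
    """Return True if the file at `path` is small enough for SafeAssign."""
    return os.path.getsize(path) <= MAX_SAFEASSIGN_BYTES
```

Running this check locally before using Direct Submit avoids a failed upload for large files such as scanned PDFs.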
Gillis, K., Lang, S., Norris, M., & Palmer, L. (2009). Electronic plagiarism checkers: Barriers to developing an academic voice. The WAC Journal, 20, 51.
Hill, J. D., & Page, E. F. (2009). An empirical research study of the efficacy of two plagiarism-detection applications. Journal of Web Librarianship, 3(3), 169-181. doi:10.1080/19322900903051011
Howard, R. M., Serviss, T., & Rodrigue, T. K. (2010). Writing from sources, writing from sentences. Writing and Pedagogy, 2(2), 177-192.
Hunt, M. A., & Tompkins, P. (2014). A comparative analysis of SafeAssign and Turnitin. Inquiry: The Journal of the Virginia Community Colleges, 19(1), 6.
Jamieson, S. (2013). Reading and engaging sources: What students' use of sources reveals about advanced reading skills. Across the Disciplines, 10(4). http://wac.colostate.edu/atd/reading/jamieson.cfm