Student cheating survey

My journalism students conducted a survey on cheating last week: 65.8% of our high school students admitted to having cheated before. Only 20.6% reported they hadn't, and 13.6% didn't respond to the question.

I guess I'm not surprised. You only need to walk down the hallway in the morning to see kids "helping" each other with last night's homework (which they didn't start until this morning!). I'm all for collaboration, for kids working together and explaining a difficult concept to each other, but I'm afraid that what is probably happening is that one person is copying the other's homework.

Students also have different opinions of what cheating is. Almost half of the 322 students surveyed do NOT consider copying someone's homework to be cheating, though 87% reported that copying off someone's test IS cheating. What's the difference?? See chart (right) for more results.

The comments left on the survey were telling. Students gave reasons why they did or did not cheat.

“I didn’t have time to do the homework … I’m too lazy … I forgot my book at home … I wanted an A … My parents get mad at me if I do poorly in school … It’s just easier to copy your friend’s homework … I don’t want to ruin my GPA … I had to work last night … The assignment was “stupid” … Everyone does it … I didn’t want a 0 on the assignment … I didn’t understand the assignment … My friend lets me copy their work … The teacher assigns extremely hard work … So what?”

“Cheating is wrong … The consequences aren’t worth the risk … I’d feel guilty if I cheated … You are only hurting yourself … I don’t want to get caught … It’s not that hard to just do the work … Cheating is bad”

There isn't as much shame attached to being a cheater these days. Some research I did explained that it is the high-achieving AP students and busy athletes who cheat the most. They justify their cheating as a way to get into a good college, and they explain, "Everybody does it."

An article titled "Everybody Does It" by Regan McMahon (San Francisco Chronicle, Sept. 9, 2007) suggests these top five ways to curb cheating:

  • Create an honor code with student input so they’re invested in it
  • Seriously punish cheaters according to the academic integrity policy
  • Create multiple versions of tests to make purloined answer keys useless
  • Ban electronic devices in testing rooms
  • Develop multiple modes of assessment so the grade is not determined primarily on tests

Doug Johnson, in his Blue Skunk Blog, suggests:

  • Use performance-based assessments that require personal application of or reaction to the topic
  • Be very clear about what will be tested/assessed
  • Make every assignment a group assignment with expectations that the role of each group member be clearly defined
  • Only make assignments that are actually necessary (Alfie Kohn writes that there is little correlation between test scores and homework)
  • Eliminate “objective tests” or make them all open book

So what does this all mean for me? Honestly, I don’t think kids cheat that much in my computer classes because I rarely give “homework” and most assignments are completed in class. Yes, they “help” each other out, but I call that learning. I don’t allow kids to take over the controls of another student’s computer but I encourage collaboration.

I think it is very valid to examine some of the telling reasons why kids cheat. Teachers assign homework and tests to check for understanding, and if kids are copying each other's homework, then the teacher isn't getting valid feedback and cannot adjust instruction accordingly. But at the same time, if a teacher is just assigning busy work to have a "grade" to enter or to "get through" the material, then the kids are getting too much busy work and they know it.

We must discuss cheating with our students. The consequences are too high: now, later in college, and in the business world. Let's examine our current educational system and devise instruction that is so engaging that students WANT to do their best work and are motivated to LEARN.

Isn’t that what this is all about … L E A R N I N G. We need to make that clear.

Learning and assessment

One of my favorite parts about being connected in the edtech blogosphere is that bloggers freely share their professional development ideas and course goals. I was intrigued when Dean Shareski discussed his practices and the goals he wanted his online students to experience:

  • Learning is social and connected
  • Learning is personal and self-directed
  • Learning is shared and transparent
  • Learning is rich in content and diversity

He went on to explain how difficult it is to assess and grade these ideas. They don't fit nicely into an A–F format. Evidence of his students' learning was demonstrated in their blog posts, weekly assignments, and synchronous sessions.

Even though I don't teach a completely online class, I want to see if I can embrace these types of goals in my high school computer classes this year. I pulled out my Understanding by Design book to review how I could incorporate this type of goal for my classes. Some questions stood out for me:

  1. What would count as evidence of such achievement?
  2. What does it look like to meet these goals?
  3. What are the implied performances that should make up the assessment, toward which all teaching and learning should point?

Another way to say it is, "What should the students walk out the door able to understand, regardless of what activities or texts we use?" and "What is evidence of such ability?" and, therefore, "What texts, activities, and methods will best enable such a result?"

I love struggling with these big-picture ideas. It helps me to take some time and mull them over for a bit, because my tendency is to jump right in and say, "Oh, we'll use Wikispaces for a collaborative document, and we'll create blogs with 21 Classes, and then we'll set up shared bookmarks on Del.icio.us …" all before I really know WHY I want to do all those things. My instincts are on the right track, but I want to be deliberate with my goals and objectives and make sure the students know WHY they are doing it too!

A great design tip from UbD is to ask a student: What are you doing? Why are you being asked to do it? What will it help you do? How does it fit with what you have previously done? How will you show that you have learned it?

The whole idea behind UbD is to plan using backwards design. First identify your desired results. Then determine acceptable evidence. Then finally plan learning experiences and instruction.

Dean also shares that they have seven principles to guide assessment practices in his school division:

1. Students are the key assessment users.

2. A balance of assessment for and of learning should be used.

3. Assessment should be constructive; it should focus on achievement and progress.

4. Assessment and instruction are interdependent.

5. Good quality assessments must be followed by effective communication.

6. Assessment expectations and curricular outcomes should be communicated clearly to students from the beginning.

7. Meaningful and appropriate assessments should include evidence about student achievement in the areas of content, process and product.

This is good stuff!

I started poking around his division website and found another nugget: ICE, a framework for assessing learning growth. The framework helps to clarify the characteristics and markers that indicate where learners are along the learning continuum and, in so doing, enables teachers to make instructional decisions that maximize learning. It's a simple assessment tool that I can use to evaluate my students' blogs and other understandings. Thanks, Dean!

Ideas:

  • basic facts
  • vocabulary
  • details
  • concepts
  • the “foundational” stuff

Connections:

  • Demonstrate connections among the basic concepts
  • Demonstrate connections between what was learned and what they already know

Extensions:

  • Students use their learning in new ways
  • Students are able to answer the question: So, what does this mean? How does this shape my view of the world?

Excerpt from Assessment & Learning: The ICE Approach (2000) by Sue Fostaty Young and Robert J. Wilson