Judging Randomness

This page was authored by the CATALST Group at the University of Minnesota

This material was originally developed through CAUSE
as part of its collaboration with the SERC Pedagogic Service.

Summary

This model-eliciting activity has students create rules for judging whether the shuffle feature on a particular iPod appears to produce randomly generated playlists. Because people's intuitions about random events and randomly generated data are often incorrect or misleading, this activity initially focuses students' attention on describing characteristics of 25 playlists that were randomly generated. Students then use these characteristics to come up with rules for judging whether a playlist does NOT appear to be randomly generated.

Students test and revise their rules (model) using five additional playlists. Then, they apply their model to three particular playlists that have been submitted to Apple by an unhappy iPod owner who claims the shuffle feature on his iPod is not generating random playlists. In the final part of the activity, students write a letter to the iPod owner, on behalf of Apple, explaining the use of their model and their final conclusion about whether these three suspicious playlists appear to have been randomly generated.

This lesson provides an introduction to the fundamental ideas of randomness, random sequences and random samples.
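To see what "characteristics of randomly generated playlists" can look like in practice, instructors comfortable with a little programming may find it useful to simulate shuffle-style playlists and tabulate a feature such as the longest run of consecutive songs by one artist. The sketch below is a supplement, not part of the activity handouts; the 8-artist, 10-songs-per-artist library and the playlist length of 20 are illustrative assumptions, not details from the activity's playlists.

```python
import random
from collections import Counter

# Hypothetical music library (assumption): 8 artists, 10 songs each.
LIBRARY = [(artist, song) for artist in range(8) for song in range(10)]

def random_playlist(length=20, rng=random):
    """Shuffle-style playlist: songs drawn without replacement."""
    return rng.sample(LIBRARY, length)

def longest_artist_run(playlist):
    """Length of the longest streak of consecutive songs by one artist."""
    best = run = 1
    for (a1, _), (a2, _) in zip(playlist, playlist[1:]):
        run = run + 1 if a1 == a2 else 1
        best = max(best, run)
    return best

if __name__ == "__main__":
    rng = random.Random(1)
    runs = Counter(longest_artist_run(random_playlist(rng=rng))
                   for _ in range(1000))
    # Truly random playlists often contain back-to-back songs by one
    # artist -- the kind of pattern students may wrongly read as non-random.
    print(sorted(runs.items()))
```

Tallies like these make the point behind the activity concrete: clusters and repeats occur naturally in random output, so a rule for flagging non-randomness has to tolerate them.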


Learning Goals

This activity has the following goals for students:

  1. Expose students to a real-world problem involving data.
  2. Expose students to the ideas of randomness, randomly generated data, and random sequences.
  3. Provide a conceptual foundation for the idea of a chance model.
  4. Engage students in statistical thinking and working as a team.

Context for Use

This activity:

  • Is appropriate for use at any time in an introductory statistics course.
  • May be adapted for junior high, high school, and college-level instruction.
  • Is most effective when students work in groups of 3-4.
  • Lasts 50 - 75 minutes. The reading and individual student responses can take place prior to class, and comparison of student reports can take place at a subsequent class or via an online class management system.

Description and Teaching Materials

  1. Media article: Students individually read the media article to become familiar with the context of the problem. This handout is available here. (iPod Media Article (Microsoft Word 56kB Feb2 10))
  2. Readiness questions: Students individually answer the questions related to the media article to become even more familiar with the context and begin thinking about the problem. This handout is available here. (iPod Readiness Questions (Microsoft Word 37kB Feb2 10))
  3. Problem statement: In teams of three or four, students are given the problem statement and work on the task in a group for 30 - 45 minutes. This time range depends on the amount of self-reflection and revision you want the students to do. This handout is available here. (iPod Problem Statement (Acrobat (PDF) 95kB Feb2 10))
    Students are given 25 randomly generated playlists to examine. Instructors need to make sure students are examining the lists and not relying on their own experiences with iPods. After they have their lists of characteristics, they are given five more randomly generated lists on which to test and revise their rules (Part 3). Students are then provided with the three suspicious playlists to evaluate (Part 4) and are finally asked to write a report to the iPod owner on Apple's behalf (Part 5). The playlists are available here. (25 Random Playlists (Acrobat (PDF) 90kB Feb2 10); 5 Random Playlists (Acrobat (PDF) 51kB Feb2 10); 3 Suspicious Playlists (Acrobat (PDF) 47kB Feb2 10))
  4. Process of sharing solutions: Each team writes their solution in a letter or memo to the client. Then, each team presents their solution to the class. Whole class discussion is integrated with these presentations to discuss the different solutions, the statistics involved, and the effectiveness of the different solutions in meeting the needs of the client.
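During the sharing of solutions, one useful discussion point is how often a team's rule would falsely flag a genuinely random playlist. The sketch below illustrates that check by simulation; the library, the playlist length, and the example rule (flag any playlist with three or more consecutive songs by one artist) are all illustrative assumptions, not a rule taken from the activity.

```python
import random

# Hypothetical music library (assumption): 8 artists, 10 songs each.
LIBRARY = [(artist, song) for artist in range(8) for song in range(10)]

def flags_as_nonrandom(playlist, max_run=3):
    """Example student rule (assumed): flag a playlist if some artist
    appears max_run or more times in a row."""
    run = 1
    for (a1, _), (a2, _) in zip(playlist, playlist[1:]):
        run = run + 1 if a1 == a2 else 1
        if run >= max_run:
            return True
    return False

def false_alarm_rate(rule, trials=5000, length=20, seed=0):
    """Estimate how often the rule flags truly random playlists."""
    rng = random.Random(seed)
    hits = sum(rule(rng.sample(LIBRARY, length)) for _ in range(trials))
    return hits / trials

if __name__ == "__main__":
    rate = false_alarm_rate(flags_as_nonrandom)
    print(f"rule flags about {rate:.1%} of truly random playlists")
```

A rule with a high false-alarm rate would accuse Apple's shuffle of non-randomness too readily, which connects directly to the class discussion of how effective each team's solution is for the client.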

The following supplies and materials are recommended for this MEA.

  • Computers with word-processing programs to write up reports.
  • Optional: Computers with programs such as Fathom and Excel.
  • Optional: Calculators.
  • Optional: Materials for students to create posters to share their solutions.

Teaching Notes and Tips

  1. The purpose of the media article and the readiness questions is to introduce the students to the context of the problem. Depending on the grade level and/or your instructional purposes, you may want to use a more teacher-directed format or a more student-directed format for going through the article and the questions.
  2. Place the students in teams of three or four. If you already use teams in your classroom, it is best if you continue with these same teams since results for MEAs are better when the students have already developed a working relationship.
  3. Encourage (but don't require or assign) the students to select roles such as timer, collector of supplies, writer of letter, etc.
  4. Remind the students that they should share the work of solving the problem.
  5. As students work in groups, the teacher's role should be one of a facilitator and observer. Avoid questions or comments that steer the students toward a particular solution. Try to answer student questions with questions so that the student teams come to their own solutions.
  6. Watch the time and try to urge groups on if they are falling behind.
  7. If students seem to get off task and are not focusing on the data provided, direct them back to the actual data and task.
  8. If more follow-up is desired, after presentations and discussion, allow students to resume their groups and modify their models.

Assessment

Assessment is an integral part of a model-eliciting activity. Each group is required to write a report to a "client" that describes their model, the reasoning that led to the model, and a justification of all decisions made based on the model. Group reports may be assessed for their clarity, completeness, and the soundness of the explanations and justifications. In addition, instructors can decide whether they wish to evaluate the students' presentations. An example rubric and scoring methods for student reports and presentations can be found at: https://engineering.purdue.edu/ENE/Research/SGMM/Problems/CASESTUDIESKIDSWEB/casestudies/airport/tools.htm.

Follow-up questions to the MEA may be used to assess student learning outcomes. For example,

  • What do you think you learned from this activity?
  • What questions do you have as a result of completing this activity?

Additional assessment items may be used depending on the purpose for using the MEA and the nature of the course.

References and Resources