Saturday February 28, Morning Round Table Discussions
1. Guidelines for Developing Visualizations.
Overall Considerations:
- Visualizations are complex; the mind does one thing at a time
- Paper and digital media may need separate tools; the same holds for static versus animated visualizations
Making Scientific Concepts Visual
- When breaking down a concept (e.g., general to specific; process), keep context
- Order the navigation and reading of a visualization using attention devices
- Annotations
- Pictorial: arrows, circles, lines
- Text: numbers, words
- Digital effects: spotlight, etc.
- Space
- Layer (embed)
- Highlight/emphasize essential features of each layer
- Context: wire frame, ghosting, masking, grey scaling, etc., background
- Time: linear; break into steps
- Sidebar with key frame of each step
- Time-line/slider
- Process/causality: linear or branching
- Break into steps with key frame for each step
- Link steps with labels/symbols indicating kind of connection
- Sidebar with causal structure/key frames
- Highlight current key frame
- Build separate processes linearly; summarize to key frame and integrate in overall structure
- Scale
- Insets (for context)
- Zoom
- Perspective
- Brackets, boxes, arrows
- Change perspective, especially between route and survey perspectives, to increase understanding
- Uncertainty: show alternatives, blur, clouds, etc.
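Several of the attention devices above (masking, grey scaling, highlighting essential features while de-emphasizing context) reduce to simple image arithmetic. A minimal sketch in Python/NumPy, assuming an RGB image array with values in [0, 1]; the function name and example data are illustrative, and the luminance weights are the standard Rec. 601 coefficients:

```python
import numpy as np

def grey_out_context(img, mask):
    """De-emphasize context: convert the image to grey scale everywhere
    except the masked region, which keeps its original color.

    img  : (H, W, 3) float RGB image in [0, 1]
    mask : (H, W) boolean array, True where attention should stay
    """
    # Rec. 601 luminance weights for RGB -> grey conversion
    grey = img @ np.array([0.299, 0.587, 0.114])
    out = np.repeat(grey[:, :, None], 3, axis=2)   # grey-scaled context
    out[mask] = img[mask]                          # full-color focus region
    return out

# Example: a 2x2 pure-red image with only the top-left pixel highlighted
img = np.zeros((2, 2, 3))
img[..., 0] = 1.0
mask = np.zeros((2, 2), dtype=bool)
mask[0, 0] = True
result = grey_out_context(img, mask)
```

The same masking idea underlies digital spotlight effects: the grey context is computed once, and the full-color region can follow the viewer's current focus.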
Prediction/virtual laboratory
Data exploration
Help systems (hyperlinks)
- How to navigate interface
- Supplementary information
- Context
- Types
- Processes
- Explanations
Borrow techniques from human-computer interaction, from training, from comics, and from graphic design; borrow 'objects' from education/learning objects (website: MERLOT).
2. How do you assess visualizations?
Research Questions for Assessing Visualizations:
- Assessing the impact on learning of particular visualizations, whether they use technology and whether they are dynamic or static.
- What is it about visualizations that really matters? This question needs to be approached from the perspective of understanding student comprehension or their thinking process, not just the elements of the visual display (use of arrows, colors). Can we see what the interaction is that results in learning?
What makes effective and valid research about learning or teaching?
Defining the goal of your assessment or evaluation research is key; this drives the design. For instance, the need to include control groups depends on your goal.
- Does a particular visualization aid learning? This implies that the measure is relative to something else, and hence that a comparison is needed.
- Questions such as how a visualization affects learning, or how cognitive models develop, do not imply a comparison or the need for a control group.
- Research must include qualitative as well as quantitative approaches. Through interviews we unpack the thinking process; videotape, mouse tracking, protocols, sketching, etc., are good tools.
- Assessments must match teaching method, goals and teaching resources.
How do you take the research and transfer it or apply it?
This is by itself an important research goal. Application might include adjustments for different audiences such as those with physical or learning disabilities.
Caveats when conducting educational research:
- Sometimes interventions are less successful than previous methods and sometimes they are more successful.
- The impact of any single intervention is small and subtle, not life-changing, so denying a learning opportunity to a control group will not have a critical impact on students' lives or learning.
3. How do we learn what students know and think when they come to our courses and what do they see in their diagrams?
This is one of the most important (arguably the most important) questions in geoscience education. What students learn is a direct result of what they already know. If we don't know what students see in geoscience representations, then we cannot use these representations effectively. Moreover, effective assessments and design require that we know where students are beginning. This is exacerbated by the fact that there are important generational gaps in computer use, and particularly in the use of visualizations.
Approach
There are three different levels on which we must understand what students know and see:
The cognitive (and psychometric) level.
- What aspects of the representation do students attend to? What do students look at in the visualization?
- Expertise. Expertise guides attention. Geoscientists know what to look for in a visualization; novices do not. Experts are guided by a mental model; novices are guided by the properties of the image itself.
- Do they understand what the image is intended to represent?
- What mental models do students bring to bear when interpreting geoscience images?
- How and why do students make analogies to prior knowledge?
- How, when, and why does psychometrically-assessed spatial ability influence geoscience understanding? (Does spatial ability matter? When and why?)
- Are students able to think about and relate different time scales?
The educational level
How do students learn from the image?
- How does the diagram relate to what they already know?
- How does it promote generation of new questions?
- How do we measure what they learn from the image?
- How do they understand temporal information?
The geoscience level
- What geosciences content do they see? What features do they see?
- How do they understand and interpret the processes that are represented? What are the impediments/misunderstandings that inhibit this type of learning?
- How do they understand spatial and temporal scales that are outside the range of human perception? What are the impediments/misunderstandings that inhibit this type of learning?
- How much time is necessary for students to assimilate spatial information?
- Can they relate abstracted or scaled information into physical reality?
Methods
A variety of methods are needed to understand and assess what students know and see when they begin to study geoscience. Here are some important methods that will be useful:
- Faculty using think-aloud protocols at the beginning of classes (ask students what they think in an organized format). This would lead to assessment of mental models, including both correct and incorrect assumptions about processes (e.g., that the mantle is liquid).
- Engage the community of evaluators and developers of curriculum design, including subject matter experts, instructional designers and educational technologists.
- Develop a computer-based assessment of geoscience image understanding. Align similarities and differences in various representations to enhance 'linking' of representations. Students would be asked, for example, to point out similarities and differences in two different images of the same feature (or process). Another example would be penetrative learning (Kali & Orion, 1996).
- Case study. Look at a particular student or group of students and follow them through the course. Use a combination of the above methods to assess what they know and how they are learning as they encounter new images.
Recommendations.
- Develop an easy-to-use tip sheet or web site to help geoscience instructors with informal evaluation. This would include suggested methods, images, etc.
- Develop a standard(ized) think-aloud protocol for assessing geoscience conceptions and misconceptions.
- Develop a computer-based assessment of image reading, using key images that illustrate important geoscience phenomena. The assessment would measure basic map-reading skills, image understanding, translation and transformation, etc.
4. How do we engage students in generating their own visualizations and what do we know about the impact of these on their learning?
Members: Steve Reynolds, Janice Gobert, Doug Clark, Peter Guth, and John Geismann
Student generation of visualizations is an important facet of their learning.
Student-generated sketches can take many different forms in many settings, from the simplest (pencil or pen in a class notebook), to a field notebook with a mechanical pencil (for example, in structural geology or field geology), to a tablet PC.
Students must learn how to draw: not animations, and not artistically, but a synthesis sketch or sketches that summarize an animation and their understanding of it.
Many ways exist to assess student understanding through their visualizations. For example, conduct a classroom exercise in which students explain their sketches to one or more of their colleagues, then ask the colleagues whether they understood the explanation and what they learned.
Abundant research exists to demonstrate that students who use sketches tend to learn concepts better. The procedure of (1) reading, (2) drawing, (3) summarizing, and (4) explaining forms a logical approach to involving students in creating visualizations, and facilitating assessment of their knowledge gained through visualization preparation.
Computer-generated student visualizations can take many different forms, from using tools such as microDEM (Peter G.) to using a digital camera and scaffolding with PowerPoint to present, summarize, and explain their observations. As an example, the evaluation of a landscape, viewed as a high-quality image, could include a description of what is seen, a discussion of the processes currently affecting the landscape, and finally a discussion of the processes that led to the current landscape.
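Terrain renderings of the kind microDEM produces rest on straightforward hillshading arithmetic that students can also compute directly. A minimal sketch in Python/NumPy, assuming a small elevation grid; the default lighting (from the northwest at 45 degrees altitude) is conventional, and all names are illustrative:

```python
import numpy as np

def hillshade(elev, azimuth_deg=315.0, altitude_deg=45.0, cellsize=1.0):
    """Classic Lambertian hillshade of an elevation grid.

    elev : (H, W) array of elevations; cellsize is the grid spacing.
    Returns values in [0, 1]; 1 means facing the light source directly.
    """
    az = np.radians(azimuth_deg)
    alt = np.radians(altitude_deg)
    # Surface gradients (dz/dy, dz/dx) from finite differences
    dy, dx = np.gradient(elev, cellsize)
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(-dx, dy)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# Example: a flat surface shades uniformly
flat = np.zeros((4, 4))
print(hillshade(flat)[0, 0])   # sin(45 deg), about 0.707, everywhere
```

Displaying the shaded array as a grey-scale image gives the familiar illuminated-relief view; students could compare it against the photograph of the same landscape as part of the exercise described above.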
5. What do we know about the role of the instructor in effective use of visualizations?
6. What do we know about design of effective activities using visualizations?
7. Review of existing visualization development tools.
Development Tools:
High-end Scientific Visualization/Analysis:
- AVS
- TGS Amira
- Mathematica
- Matlab
- IDL
- OpenDX
- Grass
- Micro DEM
- Java
- ArcGIS
- Fledermaus
- VRCO VGO
- VoxelGEO
- GoCAD
- IESX (Schlumberger)
- GeoFusion
- IDV, Unidata/UCAR
- Vis5D/VisAD, Wisconsin
- Director
- Flash
- Director IDL plugin
- Bryce
- World Construction Kit
- World Builder
- Walkabout (UIC)
- ROMA
Supplementary Tools:
- Map Publisher, imports shape files to Macromedia Freehand
- Walkabout (Santa Barbara, 3D Studio Max)
Issues to keep in mind:
- Course Management Systems will have their own constraints on visualizations.
- Assessment components of learning objects need to be integrated.
Future technologies to anticipate:
- Personal Digital Assistants
- Chat Rooms
Classes of Developers:
- Technically-inclined educator
- Ed Tech developer
- Professional developer
Tips:
- Anticipate the huge gap between high-end visualization tools and courseware development tools.
- Consider usability, for both the student and instructor.
- Determine whether the tool can output self-contained applications, and whether you need this.
- Attempt to maintain a continual, honest, productive dialog among a multidisciplinary group of experts.
- Consider issues of re-purposing and re-use.
- Consider how to incorporate assessment components and cognitive science issues into your interface.