Monday, December 8, 2014

Final Activities


The project on cultural validation of geoscience assessment has come to an end this year, having met all of the goals we set out to accomplish:

(1)  Working toward ensuring equitable assessment of students in higher-education geoscience that complements place-based curriculum and instructional practice for these communities,
(2)  Providing tribal educators with a relevant means for assessing their students’ conceptual understanding,
(3)  Involving educators in designing assessments appropriate for diverse students, and
(4)  Contributing to a conversation about diversity in assessment for the larger science education community.

The final assessment-writing workshops were held with project participants in 2013 and 2014.  The workshops focused on collaborative analysis of student responses to open-ended assessment items (designed in previous workshops) and on authoring new assessment items in selected-response format.  Through this collaboration with students, faculty, and cultural experts, community members identified geoscience topics relevant to their specific communities and were involved in validating the questions throughout the assessment design process, from identifying meaningful content to using language that resonates with students from these communities.  Ultimately, the participants have come away with a suite of place-based geoscience assessment items in both open-ended and selected-response formats.  They have also received training in assessment design and the analysis of qualitative data, which will inform both instruction and program evaluation in these communities.

Monday, December 31, 2012

2012 Year in Review

Much of the 2012 grant activity involved finishing up validation with the participants from the desert southwest.  Fifteen students completed surveys and took part in interviews to review items from the Geoscience Concept Inventory.  The surveys were designed to gather information about the role of culture, place, Native science, and geoscience in assessment, while the interviews were designed to have participants comment on the content, language, and format of existing assessment items.  Together, this information would inform the design of new, place-based geoscience assessment.

With these activities complete, the validation methods (as well as the data gathered in the validation process) were compiled in a manuscript submitted for publication.  These baseline data are essential in providing the context for the new assessment items.  Participants from the northern Rockies completed their validation activities last year and have already authored open-ended assessment items that incorporate the local landscape.  This fall, two leaders from this group helped select items to pilot with students.  The items incorporated the themes of Earth (Geosphere), Water (Hydrosphere-Geosphere), Air (Atmosphere-Geosphere), and Fire (Biosphere-Geosphere).  Piloting is scheduled to begin with these participants early in the spring semester of 2013.  Once responses have been collected from students, the group will convene again to analyze the student responses, refine the existing items, and generate new selected-response items for possible inclusion in the Geoscience Concept Inventory.

Other activities to be accomplished in this final year of the grant include a meeting with participants to design new assessment that incorporates the landscape of the southwest and other contextual elements identified in the validation activities earlier this year.  Depending on the time needed to author the new assessment, these items will also be piloted with students before the end of the grant.  Expected products from the grant include a model for cultural validation that employs a mixed-methods approach, workshops and training for participants in assessment design, and place-based geoscience assessment questions (both open-ended and selected-response items).

Sunday, December 4, 2011

Workshop for Writing Assessment

Spectacular view of the mountains on the morning of the writing workshop

Following the May workshop, participants had commented that they wished the workshop had been longer so that they could spend time writing assessment questions.  While most valued the focus of the workshop (identifying goals, discussing the breadth of assessment types and the data provided by each), time ran short at the end to design questions tailored for students.  We planned this November workshop as a follow-up to the previous one, focused primarily on content (geoscience concepts related to the local landscape) and alignment with the Earth Science Literacy Principles and educational standards.

I seized upon an open weekend in November when most of the cohort could participate in a six-hour workshop.  The open weekend was also a holiday weekend (Veteran's Day), so even though there weren't any school conflicts, plenty of Veteran's Day activities were occurring at the same time as the workshop.  Of the 10 participants in the May workshop, five were able to attend, and we were lucky to have a new language expert join us this time to help with some of the translation/interpretation of geoscience concepts.  Those who were unable to attend will be able to provide feedback on the workshop products via email and in individual meetings with me during my next visit in February.

Instead of beginning with multiple-choice questions, the group started with an activity that got us thinking about place.  Participants began by identifying culturally significant locations that have meaning for the local community and would be appropriate for teaching students about the physical landscape, particularly with respect to geology.  Once we had our list of locales, we came up with words describing each place's physical and cultural attributes, as well as words describing how people interact with the physical landscape.  We then considered how these descriptions of place might change with time.  This collective exercise gave us a context in which to write our geoscience assessment questions, with language that embodied the physical, cultural, and interactive attributes of those places.

We spent the rest of the workshop focused on writing conceptual open-ended questions that aligned with specific Earth Science Literacy Principles and Cultural Standards for Education.  The content of the questions focused on Earth (mountains, geologic structures, rock/mineral properties), Atmosphere (climate, weather), Fire (volcanism, hot springs, forest fires), and Water (rivers), and aligned with the Earth System Science framework identified by faculty and students as important geoscience concepts to understand (see previous posts on questionnaire).  These open-ended questions will be piloted with tribal college students to gather the necessary data in student responses to create the multiple choice concept inventory questions.  Piloting is expected to begin in February, and the cohort will use the open-ended responses to craft multiple choice questions in our May workshop.

Tuesday, July 12, 2011

May: Assessment workshop

The Assessment Triangle 
from National Research Council (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press.

By the end of May, our group was ready to reconvene to begin thinking about assessment design.  Our previous efforts had focused on gathering information about what the experts viewed as the role of assessment and possible sources of testing bias, as well as on identifying locally relevant geoscience topics to be used for the new assessment content.  The goal of this gathering was to discuss the details of assessment development over the course of three evening sessions in an assessment design workshop.

Members of our expert group, tribal college faculty, and K-12 science teachers were invited to join over the three days.  On the first day, the participants (n=10) shared their experiences with assessment, which provided a rich context in which to place our workshop activities.  As facilitators, we provided an overview of the project and a preview of the results from the fall activities, as well as an overview of assessment, and we spent time working in small groups to identify specific learning outcomes for the participants' classroom environments.


This first day of the workshop went well, though I felt we might have stuck too rigidly to the original slides of the assessment workshop.  I had tried to modify and tailor it to the needs of the community of instructors who had planned to attend.  I looked for relevant local, state, and national standards to plug into the cognition-observation-interpretation assessment framework, but I'm not sure how useful this was right off the bat.  This could have been due to the fact that we had a mix of participants: faculty, K-12 teachers, and informal science educators.  Switching gears, we decided to have the participants think up student outcomes of their own, rather than base them upon pre-determined cultural or state standards.  This helped jump-start brainstorming in the working groups and proved more useful for the group as a whole.

In response to the first day's activities, the two other workshop facilitators and I met that evening to revamp some of the activities for Day 2.  Instead of jumping into the analysis of student data (both written and pictorial forms), we decided to take one of the outcomes that the participants had come up with the previous day and provide example assessments that could be used to measure that outcome.  We provided both quantitative and qualitative examples of assessment, and had participants complete a drawing activity of their own and analyze their own responses as a group.  Following this example, we were able to discuss drawings that students might make in response to a question, and then spent time analyzing example student drawings from an introductory geology class.  The participants really seemed to get into this activity and identified many things that weren't necessarily included in the "answer key" coding scheme that the workshop facilitators had prepared.  I thought it was really neat to see participants so engaged in this activity, highlighting new themes from the drawings that they found intriguing.
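The kind of coding we practiced can be sketched roughly in code: each drawing is assigned a set of codes, the codes are tallied against a prepared scheme, and anything outside the scheme surfaces as an emergent theme.  The codes below are hypothetical placeholders, not the actual scheme or data from the workshop:

```python
from collections import Counter

# Hypothetical coding scheme ("answer key") prepared ahead of time.
SCHEME = {"magma_chamber", "layered_strata", "erosion", "water_table"}

# Hypothetical codes applied to each student drawing during group analysis.
coded_drawings = [
    {"magma_chamber", "layered_strata"},
    {"erosion", "people_interacting"},              # a code outside the scheme
    {"layered_strata", "erosion", "seasonal_change"},
]

# Tally how often each code appears across all drawings.
tally = Counter(code for codes in coded_drawings for code in codes)

# Separate codes anticipated by the scheme from emergent themes.
expected = {c: n for c, n in tally.items() if c in SCHEME}
emergent = {c: n for c, n in tally.items() if c not in SCHEME}
```

The useful part for a workshop like ours is the `emergent` dictionary: it is exactly the set of themes the participants noticed that the facilitators' answer key had not anticipated.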


Day three focused on the concept of place, building on the information from the previous days to engage in question development.  My colleague (co-PI and workshop facilitator) is an expert in place-based learning and instruction and provided an overview of how he incorporates place into his geoscience teaching at his institution in the southwest.  The participants were able to compare and contrast the use of place in the southwest with the pedagogy and curriculum they employ at institutions in the northern Rockies, and to begin thinking about how to incorporate this concept of place in designing assessment.

Again, the workshop facilitators modified the original workshop activities to fit the needs of the participants.  Instead of narrowing our focus to solely geoscience content, we asked participants to come up with a question they would ask to elucidate their students' understanding of a scientific concept.  The informal educators in the group had just finished working with students in the 4th-6th grades and identified an ecology question that they would ask their students.  They then provided a plethora of answers that they thought students in this age group would give.  **Note:  Ideally, we would gather data from the student population itself, rather than come up with answers that we think they would use.  Gathering information from the students ensures that the data are grounded and ultimately speaks to the validity of the assessment that is developed from those data.**  Once we had a list of potential answers, we worked together as a group to design a multiple-choice question targeting that scientific concept.

We worked slowly through this activity, primarily because all of the participants were paying close attention to the language used in the question (communication validity) and making sure that the question aligned with accepted item-writing rules (cf. Haladyna and Downing, 1989b; Frey, 2005).  The language expert in our group was a key member of this discussion and was able to refine the meaning of the question asked, identifying ambiguity in the question stem and responses that might not otherwise have been noticed by those who were not Native speakers.  Furthermore, another participant (who has acted as an evaluator on science education projects for the tribal college) was particularly good at applying the question-writing rules and was able to critique the questions as we developed them as a group.  The resultant question (designed for the 4th-6th grade student population) was then modified for upper-level high school and introductory classes at the tribal college by altering the language and infusing higher-level scientific answer options.
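The distractor-generation step at the heart of this process, turning the most common incorrect student answers into multiple-choice options, can be sketched in a few lines.  The question, responses, and counts below are hypothetical placeholders, not the actual ecology item from the workshop:

```python
from collections import Counter

# Hypothetical open-ended responses gathered from (or anticipated for) students.
responses = [
    "plants make food from sunlight",
    "plants eat soil",
    "plants eat soil",
    "plants drink water to grow",
    "plants get food from the rain",
    "plants eat soil",
]

correct = "plants make food from sunlight"

# The most frequent incorrect answers become the distractors, so the
# options are grounded in how students actually think.
counts = Counter(r for r in responses if r != correct)
distractors = [answer for answer, _ in counts.most_common(3)]

# Final option set for the multiple-choice item (order then randomized in use).
options = sorted([correct] + distractors)
```

This is why gathering real student responses matters: distractors built from actual answers carry the misconceptions students hold, rather than the ones we imagine they hold.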

All in all, the workshop allowed participants to explore the tools for assessment design and provided a brief introduction to the collaborative design of conceptual multiple-choice questions grounded in student thinking.  Although one of the main goals of the workshop was to design assessment focused on geoscience content, it soon became apparent that participants first needed the overarching ideas about assessment and familiarity with the methodology of creating assessment using a conceptual framework.  A second workshop has been proposed for the fall that will build upon this initial workshop and will be entirely focused on designing a suite of questions tailored to the geosciences.  A secondary outcome of this workshop is that participants are now able to carefully critique and design assessment, a handy skill when acting as evaluators on science education projects (which many of these participants are or would like to be).  I am already excited for our next meeting!

Wednesday, April 6, 2011

March: Spring meeting with the second expert group

Photo courtesy of S. Semken
Springtime activities began with meeting with the second panel of experts on our project.  Since the visit was scheduled during spring break at both the tribal college and state institutions, it provided us with ample time to travel to our meeting destination and take in the spectacular landscape.  It helped to have a geologist along to give me the tour!


Our experts in the second group (n=2) bring to the project a rich knowledge of language and culture.  Like our first group of experts (who met at the advisory board meeting in November), these experts were asked to complete a questionnaire (the first of two in a Delphi process) and to provide feedback on existing geoscience assessment questions commonly used with college-age students.  Although we were only able to meet with two expert faculty during this visit, we have also identified other potential participants from the academic community (perhaps K-12 teachers?) who could provide us with feedback as well.  This meeting really laid the groundwork for future interactions and allowed all the research participants (experts and university faculty) to (re)connect.

This group of experts preferred to complete the questionnaire orally, with my colleague and me transcribing their answers onto the questionnaire.  The experts allowed their responses to be audio-recorded, which let the interviewers (my colleague and me) listen carefully without feeling pressure to write everything down.  We were able to summarize their answers to the survey questions, while the audio transcript will provide us with the detailed responses.  In the first advisory board meeting, the questionnaire content provided the roadmap for the content of the focus group discussion; in this case, the two activities merged into a larger discussion that addressed the survey questions in more detail.  Though this does not follow the initial research design (Delphi questionnaire first, focus group second), we feel that data collection methods should be appropriate to the context and situation.  We chose to honor the preferences of our experts by conducting the questionnaire orally rather than adhering to a research design that would put our participants ill at ease.

Friday, January 7, 2011

Interview update

I was able to reschedule one of my student interviews to be conducted over the phone.  Though not our preferred method of interview (perhaps too impersonal?), it seemed to work out well.  The student and I had met earlier at the informational meeting, and I think that helped set the stage for the conversation.  In other words, we had some familiarity with one another before heading into the "phone interview" context.  Since the interview, I have been in email contact with her off and on for various reasons, mostly regarding research logistics and paperwork.  In past experience, email has not been a consistent means of communication (some participants do not have internet access at their houses), but in this case it facilitated the exchange of the paperwork necessary for the participant to receive the stipend for work completed for us.  I'm curious to see if/how the Skype chat will work out with the advisory board...

Wednesday, January 5, 2011

November 17 & 18: The Interviews

November 17: First Day of Interviews
 
Because I did not get as many people signed up at the informational meeting as I had hoped (darn snow), I had to track people down the following day.  This was not difficult, as many students and faculty were on campus for their final week of classes before the Thanksgiving holiday break and were aware that I was visiting that week.  By the end of the first day, I had completed three faculty interviews, two student interviews, and one parent questionnaire.  Only one student interview had to be rescheduled due to weather.  I hoped that the following day would be just as productive, even with the snowy weather.

The interviews themselves went quite well.  Participants looked over individual questions from a typical multiple-choice assessment.  Using a loose protocol, participants were asked for their feedback about the question content, communication (language used, etc.), and relevance.  Some participants were quite descriptive in their responses to the interview questions, while others gave more measured feedback.  All in all, each person provided an interesting perspective on the nature of the geoscience assessment questions and I greatly appreciate the time and effort each person put into the interview.

November 18: Second Day of Interviews

I awoke that morning to snow.  A lot of it.  At least 18 inches of fresh snow had accumulated over the past 24 hours, and it was still coming down.  I was literally snowed in at my hotel.  I walked through knee-deep powder to the local cafe to find out about the road report (my hotel was located 12 miles from the tribal college).  On my way, I stopped to help push out a guy with Ohio plates who had slid off the road into a snowbank (a foreshadowing?).  Inside at the cafe counter, I learned that the snow plows were staying on top of it and that if I could get my little compact Nissan rental car onto the main road, I'd be alright.  Why hadn't I upgraded?

The problem was that it was going to take some time to shovel my car out of the motel parking lot, and I was not going to make my scheduled 9 a.m. interview at the tribal college.  I left messages for my interview participant but was pretty sure the snow had her snowed in too.  With a broom and a shovel, I successfully dug out my car and a path to the road, only to become stuck again at an intersection about a block away.  An elderly man walking up the street behind me saw my dilemma and asked if I needed a push.  A sweet gesture, but I was afraid to accept given that he was pretty reliant on his cane.  Luckily, the local grocery store owner was out with her dog, and a big Dodge pickup had pulled up behind me.  Both were able to push me onto the two-lane highway, which was snow-packed and drivable.

I arrived to empty offices at the tribal college soon after.  My colleagues were snowed in.  After about an hour, a few faculty were able to make it into their offices.  Luckily, one of those faculty members was someone we had identified as a potential contributor to the project.  He graciously agreed to an interview on the spot just before heading out to teach his class.  As the snow continued to fall, I walked over to see if I could meet early for my student interview.  I finished up with the student just as an announcement over the PA system said that afternoon and evening classes were canceled and the college would be closed for the rest of the day.  My colleagues suggested that I leave early after hearing that the roads to and from town were about to be closed.  So I hit the road, embarking on a 6-hour-turned-8-hour drive back to my parents' house.  My fall visit had come to a close, and I was heading out with a total of seven interviews...