Reflection on the Virginia Quality Criteria Review Tool for Performance Assessments Experience

During our last class, Dr. Hunt introduced the Virginia Quality Criteria Review Tool for Performance Assessments and provided us with the Farmers Market plan. In reviewing this plan, we noticed significant strengths and significant weaknesses. I found it somewhat difficult to grade the content of the plan within categories on a scale from 0-3. Were there any decisions we made as a class that you disagreed with, and if so, what did you have in mind instead?

I also believe that the time frame given for the activity was unrealistic. As you prepare to complete your own Criteria Review Tool, what are key considerations and other areas of focus that you will present when grading the plan?

Look at the sheets titled “Figure M.11 – Ideas for Performance Tasks,” “Figure M.12 – Student Roles and Audiences,” and “Figure M.13 – Possible Products and Performances.” What elements, if any, of these resources do you hope or plan to use in the assignment and in your future teaching?

Overall, what did you find most helpful about these resources (GRASPS, RAFT, etc.)? Looking at the Principles of Scoring Student Work handout, which two or three principles are most important to you?

6 thoughts on “Reflection on the Virginia Quality Criteria Review Tool for Performance Assessments Experience”

  1. Luis, I too found the assignment to be quite challenging. In class, I found myself getting caught up in the specific language used and found the descriptions of the criteria to be wordy and confusing. I haven’t tackled the assignment yet but am hoping that reading it more slowly and carefully will allow for further understanding.

    As for how to apply this to the classroom, I really like the examples listed in the Possible Products and Performances (Figure M.13) sheet that we were given. Giving students the opportunity to choose different ways of expressing their findings puts them in the driver’s seat of their learning and immediately garners “buy-in” for student-directed learning. The article attached below highlights a school using student-directed learning approaches and incorporates the idea of solving real-world problems in the classroom. I think a blend of student-centered products and teacher-led content is a great method for today’s classroom.

    Regarding the Principles of Scoring Student Work, a teacher I worked with in the past suggested that the “Name” line be written on the back of the paper instead of the front to avoid the temptation of bias. I’ve always thought that was a good idea worth carrying forward. -Erika

  2. Thanks for this post, Luis. You’re asking so many great questions! I’ll focus on your last question: “Looking at the Principles of Scoring Student Work handout, what two or three principles do you find to be the most important to you?”

    The Principles of Scoring Student Work handout really helped me solidify my ideas about scoring work in a K-6 setting. I took to heart the advice to “know the rubric” because it’s so important that students understand the guidelines and expectations for an assignment and furthermore, that the scoring is consistent with those guidelines. I also thought quite a bit about the “know your biases; leave them at the door” principle. I really think abandoning your biases is crucial when working with large classes. Biases are often just a grading shortcut (“He always gets an A” or “She’s a C student!”), and you don’t want to ever under- or overestimate a student’s ability. You always want to judge the merit of the work in front of you, regardless of past performance. Finally, I chuckled a bit when I read the “resist seduction” principle. I am a sucker for a good topic sentence and/or a strong thesis statement, and I often want to forgive any number of writerly sins if a student hands in a paper without major typos. These are, of course, examples of seduction, and I really do need to be sure to stick to my grading rubric!


  3. Hi Luis – I’m so glad you wrote about the Virginia Quality Criteria Review Tool. I’ll be honest, I’m not a huge fan of the VQCRT because I felt very unsure of exactly what I was looking for within each criterion. Honestly, I’m all about being provided explicit examples or a list of common mistakes that occur within a specific criterion to use as a reference tool as I review, so I have a better idea of what I need to look out for.

    I do want to mention that I found a “participant workbook” called “Implementing High-Quality Performance Assessments in Science” that was created by REL AP staff members and Virginia science teachers to support educators in administering their first performance assessment. Here is the link:
    Open the link, then scroll down to page 26, titled “Step 3: Applying the VA Quality Criteria Review Tool.” Although it is on the topic of science, the information on page 26 and the pages that follow helped me feel less overwhelmed, as it laid a more solid foundation for my understanding of the tool. Further, this resource highlights strategies for meeting each criterion with full evidence.

    Regarding your question, “As you prepare to complete your own Criteria Review Tool, what are key considerations and other areas of focus that you will present when grading the plan?”

    There are definitely a few key considerations I will look out for while reviewing our assigned performance assessment. For criterion one, I will ensure that the standards and learning outcomes are listed for all content areas being assessed. For criterion two, I will consider whether the given scenario is an authentic experience for all students. Along those lines, another consideration will be ensuring that the performance task(s) are also authentic and accessible for all students.
    One area I will focus on is whether the student directions, prompt, resources, and materials are all present and explicit.

    In sum, the “participant workbook” will be a reference I will use as I complete the performance assessment review. I hope that you and our other classmates find this helpful as well!

    – Morgan

  4. Luis,
    I’m really glad you brought up the Quality Criteria Review Tool. I think many of us were on the same page in finding the tool a little difficult to digest and interpret. The language seemed overly fluffy, with lofty vocabulary that muddied the point. I liked the rating from 0-3 but also found it difficult to assign a rating when I felt the work did not fall completely into one level or another. I think the tool could be really beneficial if it were edited, revised, or simplified to be more easily understood and applied.
    As far as Figure M.12, “Student Roles and Audiences,” and Figure M.11, “Ideas for Performance Tasks,” I think these lists are great tools for promoting hands-on, creative, and personalized learning. They really require students to apply what they are learning and allow them to create something in a way that places them in someone else’s historical shoes. There definitely has to be caution when using these activities, though, to make sure we are not placing emphasis on students recreating something from a difficult time in history, like slavery, that may be sensitive or harmful to our students. However, when used appropriately, they can be a really fun and interesting way to gauge student understanding, keep students interested, and leave room for creativity.
    Great points and topic, Luis!

  5. Hi Luis,

    Thank you for bringing up the Virginia Quality Criteria Review Tool since we talked about it in class and were assigned homework practicing with the tool. I have mixed feelings about the review; most lean toward disliking the tool, as the 0-3 scale with its strict parameters is hard to work with. It is nice that there are standardized expectations teachers and assessment creators can use when making assessments; however, the review tool is not up to snuff. The criteria often did not align with the part of the assessment we were judging; just like Morgan said, concrete examples of good and bad work are most beneficial to me. The review tool was simple in its requirements yet difficult to use when there were differences between what the tool was asking for and the actual assignment.

    Figure M.13 is most beneficial to me; I believe students are best engaged when they have control of their learning, or at least some say. Student-directed learning offers opportunities for students to show their learning in a way that best suits their strengths. Too often, students are forced to work within tight parameters that can sap motivation; allowing students to make their own decisions and direct their learning will promote learning, give students independence, and produce better results.

  6. Hi Luis,

    I’m glad you brought up the tool, because I too found it difficult. I felt conflicted for most of the sheet as we were going through it as a class. I feel like it isn’t a very thorough way to assess an assignment, given the limited scale. For my own tool, I would want it to focus more on the whole assignment and how it relates to what students need to know. Even without the tool, it is pretty easy to look at an assignment and the standards and see whether they work together. If the tool needs to be more in-depth, it should offer a wider range with descriptive wording for each level.

    Figure M.13 would be the most beneficial to me. It is important to have different products for assignments so that students are not giving you the same thing over and over. This sheet will make it possible for me to differentiate assignments and make them more enjoyable for students.

    As someone who struggled with perspective writing in elementary school, I think the RAFT resource will be the most used and helpful for me and my (hopefully) upper elementary students!
