It is exam day, and your students walk into the room wearing a variety of expressions. For those who are good at studying, we see glimpses of confidence. For those who struggle to memorise content, we may see looks of dread. This scenario is an all too familiar one. Many students suffer from test anxiety, and their scores reflect it. As we prepare for an assessment of our students’ learning, how often do we, as educators, consider their perspective on that assessment? How often do we see our learners as more than a score?

Even after years of schooling, for many students, scores remain an inconsistent and mysterious tool (Guskey, 2014). Traditionally, an emphasis has been placed on summative assessment. This end score becomes the goal for teachers and students, but our students are more likely to feel judged than supported. A final score is perceived as a teacher’s objective judgment of what students took from a learning experience. Scores may be well-intentioned as a form of feedback; however, they tend to override the authentic feedback that comes with a teacher’s comments. Learners who do well ignore feedback that might help them if they reflected on the comments. Those who do poorly may fear what is in the comments. As a result, these one-time grading experiences tend to end learning rather than advance it. The real value of assessment emerges when we help learners separate formative assessment from a final score.

To help learners with this separation, opportunities to reflect on a variety of assessments become essential, as does varying the person who supplies the feedback. The greatest value-add of feedback may come from beyond the classroom. Subject matter experts (SMEs) provide our learners with real and meaningful feedback about the authentic qualities of their work, rather than feedback solely related to their scores on academic standards (Dobbertin, 2010). For example, an engineer who partners with a class of Year 10 female students looking to design better athletic facilities for women in Jeddah, Saudi Arabia, has the real-world experience to provide a new perspective that goes beyond textbook understanding (Kubik, 2017). ACE Leadership in Albuquerque, New Mexico, makes this the norm, bringing in mentors to help develop authentic projects from a client perspective. The school also utilises these SMEs to provide feedback on student work produced for the real world (Vander Ark, 2018). When feedback is diversified in this way, students begin to view it as something that helps them grow rather than something that causes anxiety and fear. These examples do not mean educators are not experts; rather, their expertise is generally limited to education unless they have worked in another career.

Pedagogy as a Participatory Process

Traditional assessments such as homework and assignment sheets rarely take into consideration an individual student’s progress through a learning experience. Instead, these assessments focus on what a teacher needs to know about students. Teachers need this information and are often themselves evaluated using the results. However, traditional assessments rarely get beyond basic levels of student understanding (Armstrong, 2018; August, 2014). This quality-control approach to assessment more closely resembles a factory assembly line. As a result, we too often miss opportunities to assess students’ deeper learning.

Research into both learning and assessment increasingly recommends we see assessment as part of a social learning experience, not separate from it (Mitra, 2013; Price, 2013; Zhao, 2018). Assessment for or as learning argues the need for student and educator collaboration (Earl, 2014; Stiggins, 2005). Educators are urged to use assessment to “inform instructional decisions in a way [sic] that motivate students to try to learn” (Stiggins, 2005, p. 1). Motivated students are more likely to articulate and advocate for their learning. Assessment thus becomes part of their learning, not something done separately by the teacher.

While this more collaborative approach can work with traditional forms of assessment such as quizzes and homework, authentic performance assessments can be more effective because they signal essential shifts in pedagogy (Stecher, 2010). At Project ARC (2018), we facilitate community partnerships to co-create authentic and relevant experiences that promote and assess complex learning. Over the last four years of this work, we have observed that three fundamental shifts in pedagogy are essential:

  1. Students are driven by authentic challenges that differ from traditional assessments because they are more meaningful to students, carrying the potential to effect change in the real world;
  2. Throughout the authentic challenge, students have multiple occasions to demonstrate learning across varying levels and modalities as we accumulate more data related to student performance before any summative assessments;
  3. Students have more diverse opportunities to receive non-graded feedback and learn from outside SMEs who support learning alongside teachers.

Because students are more likely to invest in their work as a result of these shifts, educators are more likely to elicit meaningful data about the learning that results from their instruction within the experiences they design. Generally, students show evidence that they are learning more, and educators use that evidence to focus their instruction. The key is to implement authentic assessments that vary in frequency and format according to the authentic nature of the project learning experience.


Given that our learners’ view of traditional assessments is often negative, we might naturally ask, “What formats and tools can make an assessment less judgmental?” Technology is often celebrated as an answer. Examples include a digital graphic organiser that visually explains a student’s approach to a challenge, or Etherpad, a collaborative, real-time document that allows for peer and teacher feedback. Such tools support the formative data we gather in a way that feels less like a traditional, judgmental assessment, and they are popular as a result. In itself, however, changing the technologies of assessment does not address the pedagogical issues at the core of the assessment debate. Tools such as Etherpad do provide technologies to improve our approach to assessment. The key, however, is to vary the frequency and format of their implementation according to the authentic nature of the project learning experience.

This adjustment in our approach to assessment means we must also make an actual shift in pedagogical focus within the classroom. Many schools and organisations refer to this as project-based learning, and accepted distinctions exist between projects and project-based learning (Larmer & Mergendoller, 2010). However, even learning that is merely ‘based’ in a school project is rarely sufficient to change student perspectives on assessment. Too often, such project-based assignments are designed, assessed, and managed no differently from more traditional lessons. Motivated learners require experiences that are authentic: real-world challenges that are useful outside the classroom context. They must also be relevant to our learners in a way that is personal and meaningful, rather than prompting a “why do I need to know this?” moment.

Authentic project learning experiences must foster a complex challenge instead of a complicated, multi-step instructional assignment. This complexity allows a variety of in-depth investigational paths to emerge for each learner as an individual. Through authenticity, relevancy, and complexity, we develop a learning environment that produces student work that differs from learner to learner rather than looking the same. These authentic project learning experiences decrease the anxiety of the assessment process. Furthermore, when we embed formative feedback from beyond the classroom, even more of these anxieties vanish. In turn, an excitement for learning replaces them.

The shift to authentic project learning experiences is not merely a move from worksheets to problem-based activities, or from lecture-based note-taking to project-based maker spaces. If we fail to address the authenticity, relevancy, and complexity of the project, these shifts can still result in mere project-based assignments. To push our learners into a broader inquiry loop that results in better summative assessment, there are several steps we can take as supportive educators.

First Step: Rubrics as Part of Assessment

Many rubrics imagine a final product (a paper, a presentation, or even an authentic artefact), rather than the authentic project learning experience itself. As a result, these rubrics most often take the form of recipes for product completion. Students choose the quality of the product they want to make and follow the appropriate recipe. Teachers then use the recipe at the end of the project to assess how well students followed it. Rubrics such as these are legacies of a more industrial approach to schooling.

By contrast, a variety of possibilities exists when rubrics indicate only the standards a student must demonstrate in the process of resolving a challenge or problem, rather than a recipe for how to do so. The students’ collaborative role in assessment for learning is then to identify one of those possibilities and explore how it might meet the standards. When students take that active role, teachers are in a position to use the rubrics to coach them towards improved performance during a project, rather than only judge their performance in an exhibition at the end. Indeed, when rubrics are used this way, exhibitions or presentations of learning often become unnecessary; this is even more likely if students have been learning with SMEs all along in a way that mirrors their future after school.

Second Step: Authentic Formats of Assessment

Standards-based, rather than product-based, rubrics immediately open the door to alternative formats of assessment that are authentic to the project. There may still be a place for homework or quizzes. However, homework will likely be done more thoroughly when it is based on work students choose to do at home to meet the challenge. Likewise, quizzes provoke far less anxiety when students appreciate that they are authentic to the work. Authentic quizzes parallel what practitioners outside of school might be expected to do, such as sit an industry certification.

Alternative assessments, such as graphic organisers, project planners, or peer-evaluation logs, offer even more motivation for students. At Project ARC, we have compiled over 50 possibilities that go beyond traditional assessments of learning to engage students in assessment for and as authentic learning. Some of these possibilities include Clarifying Questions, Harkness Discussions, Audio/Video Reflections, and the SWOT Analysis, all of which have parallels in real-world work situations. These formative assessment options allow multiple combinations that offer varied perspectives on the learning process of each student. In turn, these perspectives provide us with the data necessary to inform any instructional shifts we may need to make on an ongoing basis. Thus, we can meet the needs of all of our learners rather than focusing purely on the whole group, an approach that leaves many students behind.

Third Step: Authentic Rhythms of Assessment

Finally, authentic project learning experiences can help a teacher rethink the frequency of assessment. For many teachers, assessment is homework that happens every night, an exit ticket such as a journal entry at the end of every class, or a test at the end of every week and every year. This rhythm is predictable but rarely suited to the demands of an authentic project learning experience. Homework often merely goes into our gradebooks as a completion score, exit tickets are often never read after collection, and tests yield data that arrives too late in the instructional process. Therefore, it is time to shift to authentic rhythms of assessment.

Some topics require frequent formative assessment. Idioms in language, odd spelling and grammar rules, the order of operations, cause and effect in science and social studies, and citation practices are all examples that come to mind. Too often, however, these topics become complicated activities or practice sheets. At higher levels of complexity, such as strategic thinking or creative problem solving, such rigid rhythms are likely to constrain students by prematurely shutting off their thinking. Our students need design cycles of varying lengths appropriate to the challenge, and they require multiple opportunities to work on prototypes through those cycles. Here, the rhythms of formative assessment become a negotiation between the learners and the teachers who guide them towards their mutual learning goals.

Finally, as we move toward a more participatory, reflective, and ongoing process of assessment, we must consider how this change connects learners to more meaningful international standards of learning. A careful review of the ISTE (2016) standards for students reveals a strong connection to assessment for more authentic, relevant, and complex learning experiences. As global collaborators, our learners have the opportunity to “broaden their perspectives” as they interact with others at home and abroad (ISTE, 2016). This collaboration improves their prototyped solutions and, with them, their summative results. Furthermore, as innovative designers who are also knowledge constructors, our learners provide us with far more relevant opportunities to collect data as they develop “new, useful, and innovative” solutions through the “curation of a variety of resources” (ISTE, 2016). Ultimately, this gives us a classroom full of empowered learners who “take an active role in choosing, achieving, and demonstrating competency in their learning goals” (ISTE, 2016). As we centre our classrooms on authentic project learning experiences, we have a room full of learners who are more than an exam score.

About the Authors:

Dayna Laur

Dayna Laur’s arc of professional learning began in 1998. Dayna’s doctoral level research in instructional systems design and technology has focused on connecting learners to subject matter experts as they leverage their value-added feedback in order to improve authentic challenge outcomes. Specifically, she is interested in the ways in which these connections can occur virtually to overcome the logistical challenges many rural schools experience. Her experience includes Special Ed., Advanced Placement, Department Chair, Career Academy Coordinator, and designing online coursework for adult learners. Her diverse facilitation and coaching expertise have allowed her to interact, globally, with pre-K to post-secondary professionals since 2008. At Project ARC, Dayna strives to empower educators and their learners by implementing authentic learning experiences, a topic she pioneered in her 2013 book, Authentic Learning Experiences: A Real-World Approach to PBL.

Dr. Tim Kubik

Dr. Tim Kubik’s arc of professional learning spans 21 years of teaching in primary through post-graduate learning environments. His experience includes Team Leader of three different interdisciplinary grade-level teams, Department Chair, and Associate Head of Upper School. Over the past ten years, he has facilitated workshops for over 5,000 educators from around the world. At Project ARC, Tim brings a spirit of passionate and innovative collaboration to the complex challenges learning organisations face, while striving to offer the same for individual teacher professional development on a personal level. He shares engaging examples of this work in his book, Unprepared for What We Learned: Six Action Research Exercises that Challenge the Ends We Imagine for Education.


  1. Armstrong, P. (2018). Bloom’s taxonomy. Retrieved from
  2. August, G. (2014). Using Webb’s depth of knowledge to increase rigour. Retrieved from
  3. Dobbertin, C. B. (2010). What kids learn from experts: Feedback from experts helps students see how to improve their work–and why it matters. Educational Leadership, 68(1), 64-67.
  4. Earl, L. M. (2012). Assessment as learning: Using classroom assessment to maximise student learning. New York, NY: Corwin Press.
  5. International Society for Technology in Education. (2017). ISTE standards for educators: A guide for teachers and other professionals. Washington, D.C.: ISTE.
  6. Kubik, T. (2017). Education and the world economy: Timely visions. Retrieved from
  7. Larmer, J., & Mergendoller, J. (2010). The main course, not dessert: How are students reaching 21st-century goals? Retrieved from
  8. Mitra, S. (2015). SOLE toolkit: How to bring self-organised learning environments to your community. Newcastle: Newcastle University.
  9. Price, D. (2013). OPEN: How we will work, live and learn in the future. London: Crux Publishing, Ltd.
  10. Project ARC. (2018). Project ARC: The potential difference in educator professional development. Retrieved from
  11. Stecher, B. (2010). Performance assessment in an era of standards-based educational accountability. Stanford, CA: Stanford Center for Opportunity Policy in Education.
  12. Stiggins, R. (2005). Assessment for learning defined. Presentation at the ETS Assessment Training Institute’s International Conference: Promoting Sound Assessment in Every Classroom. Retrieved from
  13. Vander Ark, T. (2018). Mentors enhance project-based learning. Retrieved from
  14. Zhao, Y. (2018). Reach for greatness: Personalizable education for all children. New York, NY: Corwin Press.


©2021 The Reformer