Portfolio Project as Summative Language Assessment: Engaging Learners Online

Sarah Korpi

Vol. 34, No. 2, 2019

Abstract: High-stakes mid-course and final exams have long been a dominant assessment model, but exam performance does not necessarily correlate with learning or with students' ability to apply what they have learned to real-life scenarios outside the classroom.
Reliance on such high-stakes assessments can result in elevated learner stress during exam times and a lack of learner engagement during non-exam times. Active learning requires ongoing student engagement, and assessments of the work students complete in active learning activities offer authentic evidence of learning, demonstrating students' ability to apply that learning to authentic situations, problems, and issues. This article argues that assessment practices based on multiple, low-stakes, iterative assessments within an active learning environment provide more targeted feedback and engage students as active participants in their own learning, which leads to increased student success.

The case study presented in this article is specific to foreign-language courses offered through online education. Combining best practices of learner engagement in the online environment and assessment in the communicative language classroom, language faculty in an open enrollment program developed an assessment model for asynchronous, introductory language courses that relies on multiple low-stakes assessments culminating in a final, summative portfolio project. This article offers examples of how the portfolio project is situated in the course and an overview of portfolio project topics. Example instructions and assessment rubrics are provided, and data from the first full year of implementation are analyzed to begin to assess the impact and effectiveness of this portfolio assessment.

Keywords: summative assessment, online language course, course design, portfolio projects.

Résumé : Placer les examens importants en milieu et fin de cours constitue depuis longtemps un modèle d'évaluation dominant. Mais les résultats aux examens ne sont pas nécessairement liés à l'apprentissage ou à la capacité des étudiants à appliquer leurs apprentissages à des scénarios de la vie réelle à l'extérieur de la classe.

Le recours à de telles évaluations, dont les enjeux sont élevés, peut entraîner un stress élevé chez l'apprenant pendant les périodes d'examen et un manque d'engagement de la part de l'apprenant pendant les périodes où il n'y a pas d'examen. L'apprentissage actif exige un engagement continu de la part des étudiants, et des évaluations portant sur des activités faisant ressortir leur capacité à appliquer leurs apprentissages à des situations, problèmes et enjeux authentiques. Cet article défend l’idée que les pratiques d'évaluation fondées sur des évaluations multiples et itératives à faibles enjeux, dans le cadre d’un apprentissage actif, fournissent une rétroaction plus ciblée et favorisent l’engagement des étudiants dans leur propre apprentissage en tant que participants actifs, ce qui favorise leur réussite.

L'étude de cas présentée dans cet article est centrée sur les cours de langues étrangères offerts en ligne. Dans un programme de formation ouverte, les professeurs de langue ont combiné les meilleures pratiques d'engagement de l'apprenant dans l'environnement en ligne et l'évaluation en classe de communication en langue étrangère afin de développer un modèle d'évaluation pour des cours asynchrones d'introduction à la langue. Celui-ci repose sur de multiples évaluations à faibles enjeux qui aboutissent au projet final de portfolio d’évaluation sommative. Cet article donnera des exemples de la façon dont le projet de portfolio se situe dans le cours et offrira un aperçu des sujets associés au projet de portfolio. Des exemples d'instructions et de rubriques d'évaluation seront proposés. Les données de la première année complète de mise en œuvre seront analysées pour commencer à évaluer l'impact et l'efficacité de ce portfolio d’évaluation.

Mots-clés : évaluation sommative, cours de langue en ligne, conception de cours, projets de portfolio

This work is licensed under a Creative Commons Attribution 3.0 Unported License.

Introduction

The realization that the majority of students who enrolled in one language course sequence in an online education program either did not complete their course at all or did not complete it successfully (defined as a grade of C or better) led to the collaborative creation of a new curricular model for online language instruction. The new curricular model makes extensive use of active learning, an instructional method that requires students to engage in meaningful learning activities and to think about what they are doing as they learn (Prince, 2004). This article reviews the pilot of that curricular model, includes descriptions and examples of key activities, and argues that engaging students in an environment filled with multiple, low-stakes assessments and iterative assignments requires students to become active learners, leading to more meaningful engagement with the course content and, ultimately, higher student success rates. Using an active learning course design resulted in an increase in course completion rates in the third-semester course from 22% (old course model) to 86% in the first year. Completion rates in the other sequence courses increased as well: in the first-semester course, from 37% to 73%; in the second-semester course, from 41% to 62%; and in the fourth-semester course, from 42% to 86%.

Table 1: Pre-Revision Completion Rates

First-semester course: 37%
Second-semester course: 41%
Third-semester course: 22%
Fourth-semester course: 42%

Table 2: Post-Revision Completion Rates

First-semester course: 73%
Second-semester course: 62%
Third-semester course: 86%
Fourth-semester course: 86%

Literature Review

Background: Choosing an Assessment Strategy

High-stakes assessments have long been a dominant model for assessing student learning. This traditional assessment model is useful in some learning environments, for example, large, lecture-based or foundational courses in which students must memorize certain content in order to reach learning outcomes. However, high-stakes assessments also have limitations (Gordon & Reese, 1997; Scott-Clayton, 2012). Reliance on high-stakes assessments can result in elevated learner stress during exam times and a lack of learner engagement during non-exam times (Amrein & Berliner, 2003). They can raise the affective filter (Krashen, 1982), preventing students from learning to their fullest potential. High-stakes exams, in addition to taking a snapshot of student performance on a specific day, can also measure, in part, how well students take exams (Martin, 2015; Plous, n.d.; Poundstone, 2014), a skill often independent of the course content the exam is meant to assess. The most appropriate assessment format for a specific course should therefore be dictated by the course content itself.

It is possible to conduct a high-stakes exam of engaged communication, but such high-stakes exams are not the best assessment type for early language learners. When beginning language learners learn a new feature, their accuracy with previously mastered features decreases (Bachman, 1990). This is caused by the learner's attention shifting to mastering the newer, more complex feature. Once the newer feature has been mastered, student production of the older feature returns to previous accuracy levels.

Selecting the correct assessment strategy is especially important in foundational language classes. The choice of assessment method, whether in the classroom or for research purposes, should be guided by what is being assessed, how to elicit the phenomena being assessed, and who gets assessed and why (Norris & Ortega, 2012). Frequent, low-stakes testing in introductory college courses has been found to promote learner compliance with assigned readings, course attendance, and learner engagement (Pennebaker, Gosling, & Ferrell, 2013; Siadat, Musial, & Sagher, 2008; Schrank, 2016).

Low-stakes assessments have limitations of their own, especially when test takers are not motivated: learners may not be as motivated to perform well on low-stakes assessments (Mislevy, 1995; Wolf & Smith, 1995). This matters because learner motivation is positively correlated with performance on assessments (Pintrich & DeGroot, 1990; Penk & Richter, 2017), and motivated students tend to outperform unmotivated students (Baumert & Demmrich, 2001; Cole et al., 2008; Eklöf et al., 2013; Thelk et al., 2009; Wise & DeMars, 2005). In a review of motivation and low-stakes testing scores, Finn (2015) argues that motivation should be treated as "an essential consideration in evaluating and interpreting low-stakes testing scores," since learner motivation has such a significant impact on low-stakes assessment results (p. 13).

Assessment type is also a consideration. Two common forms of assessment are open and closed assessments. Closed assessments limit the choices available to a learner in a given activity and require learners to answer in a specific way. Examples of closed assessments include multiple-choice, word-bank, and fill-in-the-blank assessments. Open assessments involve tasks for which a variety of answers are equally possible. Examples of open assessment tasks include writing a description of a photograph, giving directions to a specific location in a city, and introducing oneself to a new acquaintance. Students with lower language proficiency levels are able to complete closed assessments with higher accuracy than open assessments (Nunan, 1991; Skehan & Foster, 1997).

The Communicative Approach to Language Teaching

According to communicative language theory (Savignon, 1987, 2006, 2018), language learning requires active learning. The goal of communicative language instruction is for learners to practice language skills in real-life situations or in situations that mimic real life. Regardless of whether assessments are high stakes or low stakes, the assessments selected should be in line with course goals: if the goal is communication, then the assessments used in the course need to test communication.

In order for students to be successful in the communicative classroom, the classroom itself must be a place where active learning takes place. Students engage in the creation of communication by experimenting with new lexical items and grammatical structures. The active use of these new lexical and grammatical forms promotes learning and retention of the content itself. By focusing low-stakes assessments on what students can do, instructors can use the communication learners produce in these activities and communication-based tasks to document an accurate, timely assessment of student learning (Cunliffe, 2002; Norris, 2016).

Language Learning and Online Teaching: A Model for Communicative Learning

The success of face-to-face language immersion programs, which create a space for students to use the target language in real-life situations, has been well documented (e.g., Freed, Segalowitz, & Dewey, 2004). With the assistance of digital tools, students and instructors in the online classroom are able to engage in the same behaviors that make face-to-face immersion programs successful. Online, students and the instructor intentionally use the target language to perform specific tasks. They engage in communication, both written and oral, as well as in a variety of other assessments, such as practice problems, quizzes, group writing activities, and group discussions.

In the online learning environment used for the language courses discussed here, active learning was embedded into the design of the courses through low-stakes assessments. Student language comprehension and production in written and oral forms is constantly under review and assessment, but no individual assessment has a large impact on the student's course grade. Low-stakes assessments are also low-risk assessments, which reduce learner frustration and stress and, as a result, lower the affective filter. Iterative assessments provide for engaged learning by requiring students not only to complete and turn in assessments but also to review instructor feedback and incorporate that feedback into their future work. Finally, wrapping select assessments into a portfolio project offers a more complete picture of student learning throughout the course and provides students a way to document and present their learning in a meaningful way to future employers and others.

Methodology: In the Asynchronous, Online Classroom

The program involved in this study is an online program with roots in the correspondence course model. The courses offered are college-level language courses. The program is open enrollment; there is no application process, and no data is collected by the administrative offices regarding student age, degree status, or the goals that lead students to enroll. Students can enroll when they are ready, any day of the year. Courses are self-paced; once enrolled, students complete their coursework on their own timeline, without externally imposed deadlines. Students can move through familiar content at a faster pace and challenging content at a slower pace. Each student receives personalized instruction from the course instructor based on their individual needs and goals. Group collaborative work is specifically designed so that students can collaborate across units, regardless of their pace and place in the course.

The data for this study were obtained from reports generated by the program's administrative offices. The reports included data on course name, completion status, and (if completed) final grade. Learners were assigned a random identification number that was not associated with their student ID or enrollment ID, and no identifiable student data was provided to the faculty team or the researcher. The researcher compared percentage course completion rates and final grade data in the old course model with the same data in other language sequence courses (including Italian, French, German, Latin, and Greek) over a five-year period, and with completion rates and final grade data in the revised course. The goal of collecting and analyzing data in this way was to explore whether anecdotal observations regarding completion and failure rates could be backed up by data, and whether a change in assessment model could positively impact both completion rates and student learning.
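
To make the comparison concrete, the sketch below shows how a successful completion rate (a grade of C or better) can be computed from anonymized enrollment records. It is purely illustrative: the record layout, field names, and letter-grade scale are assumptions, since the article does not describe the administrative reports in that level of detail.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical record layout; the actual report fields are only described in the
# article as course name, completion status, and (if completed) final grade.
@dataclass
class EnrollmentRecord:
    anon_id: str                       # random identifier, not linked to student or enrollment IDs
    course: str                        # e.g., "third-semester Spanish (old model)"
    completed: bool                    # whether the student finished the course at all
    final_grade: Optional[str] = None  # letter grade if completed, otherwise None

# "Successful completion" is defined in the article as C or better;
# the letter-grade scale itself is an assumption for illustration.
PASSING_GRADES = {"A", "AB", "B", "BC", "C"}

def successful_completion_rate(records: List[EnrollmentRecord]) -> float:
    """Share of enrollments that ended in a grade of C or better."""
    if not records:
        return 0.0
    successes = sum(1 for r in records
                    if r.completed and r.final_grade in PASSING_GRADES)
    return successes / len(records)

# Illustrative data matching the third-semester Spanish counts reported in the
# Results section (4 of 18 successful in the old model, 57 of 66 in the revision).
old_model = [EnrollmentRecord(f"o{i}", "Spanish 3 (old)", i < 4, "B" if i < 4 else None)
             for i in range(18)]
revised = [EnrollmentRecord(f"r{i}", "Spanish 3 (revised)", i < 57, "B" if i < 57 else None)
           for i in range(66)]

print(round(successful_completion_rate(old_model), 2))  # 0.22
print(round(successful_completion_rate(revised), 2))    # 0.86
```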

The data for the case studies were compiled from student experiences that were shared voluntarily and without solicitation. Students who had completed one of the old-version language sequence courses and then enrolled in a revised language sequence course received an email explaining the changes that had been made to the course sequence and what the faculty hoped to achieve with them. After experiencing both course versions, several students were eager to share their experiences with their faculty member and the researcher and to have those experiences reflected in the study. They are included here to highlight the student experience and the impact of the revisions on student satisfaction and success.

Faculty in this program developed an assessment model for asynchronous, introductory college-level language courses in the online environment. The course content was designed by this experienced team to ensure that students would be linguistically prepared for the program's third-year advanced language courses in literature, writing, and conversation, and that the goals of each sequence course would be realistically attainable for learners throughout each course in the sequence. The faculty team believed that an assessment model using multiple low-stakes assessments and culminating in a final, summative portfolio project would have a positive impact on both student completion rates and student success rates. The portfolio project serves as a location where low-stakes assessments are collected, undergo iterative revision, and are assessed as a whole. The topics of the portfolio project were developed to be personal to the student, which had a positive impact on academic integrity, engagement, motivation, and learner autonomy.

Format and Structure of the Course

The assessment strategy was selected purposefully to help students scaffold toward a large, meaningful summative assessment. There are several opportunities for review: each sequence course begins with an introductory lesson and a review of material covered in past courses, and each course also has a mid-course and a final review lesson. Each lesson contains auto-graded online activities that give students instant feedback on their performance, a vocabulary quiz, a grammar quiz, a written assessment, and an oral assessment (either an audio recording or a live conversation, depending on the unit). There are two reading assessments, one after each review lesson. The portfolio project serves as the summative assessment of the course: it is assigned in the introductory lesson, built on expanded drafts of the written and oral assessments submitted throughout the course, and submitted for instructor feedback in the two review lessons.

Portfolio project topics range from a written and video self-introduction in the first-semester course to a welcome or tourist packet for one's hometown in the target language in the fourth-semester course. The portfolio projects all include a written and an oral component and are each designed specifically to assess student written and oral proficiency on the American Council on the Teaching of Foreign Languages (ACTFL) proficiency scale.

In order to address academic integrity and increase learner motivation, the portfolio projects were designed to be intentionally personal to the experiences of each student. At higher course levels, the portfolio project also includes topic choices, so students can further tailor their project to their own interests. Grading rubrics are available to students so they can rate their own work and determine the amount and type of effort they need to put into their project in order to earn their desired grade. In the third-semester college course, students demonstrate their speaking proficiency at the Intermediate Low level and their writing proficiency at the Advanced Mid level on the ACTFL proficiency scale through two written and two oral components (see Appendix 2).

Results

Example 1: Sally and an Active Learning Assessment

Sally is a student who is not a natural language learner. In fact, she had already unsuccessfully tried to complete the third-semester language course, the last course she needed to graduate with her degree. After a few years, she decided it was time to finish her degree and received permission from her degree-granting university to complete her last graduation requirement online. Like many students who complete a language course as a degree requirement, Sally had a gap in her language learning and had not recently used her target language for communication. Without ongoing practice, her linguistic skills were not as sharp as they had once been, and transitioning into the language course sequence was a challenge. Sally believed that she had forgotten everything she had learned in the past, and she was considering withdrawing from the course and giving up on her dream of completing her degree.

In an initial conversation with her instructor via a web conferencing tool, Sally heard that many students share her experience and that the first unit of the course, a review unit, was meant to help her review material she may have forgotten so she could work through the course successfully. The multiple, low-stakes assessment approach to textbook activities (see Appendix 1) helped to lower Sally's affective filter and allowed her to demonstrate her understanding of the material. Sally was encouraged by her performance on the practice activities and quizzes but worried that she would not be able to succeed on the composition. She decided to try the composition anyway and was surprised to find that it was structured as a step-by-step worksheet rather than an open-ended essay prompt.

The pedagogical goal for the structure of the composition is to break the writing process into steps that are manageable for students in their target language. In addition, the composition aims to encourage students to use targeted vocabulary and grammar features while improving their digital literacy. Because the writing process itself is broken down into manageable pieces, Sally was not overwhelmed by the assignment, or by the progressively more challenging composition assignments throughout the course. In step one of their compositions, students determine a topic. In step two, they jot down ideas and details, including characters, setting, beginning, and conclusion; in this step, students ensure that all their brainstorming words are in the target language and look up any words they do not know. In step three, students compose their initial paragraph in the target language. In step four, they narrate the story in the target language using the events generated in step two. In step five, students compose the conclusion to their story in the target language. Finally, in step six, students use a provided guide to edit their narrative.

If Sally had been expected to write a marketing piece for her hometown featuring the history of the town and relevant cultural events in the target language without the support of scaffolding throughout the writing process, she may well have given up before starting. However, because of the carefully planned progression of the writing tasks and the step-by-step work through the writing process on each assignment, Sally was able to use several of her unit-level compositions as the basis of one of the components of her summative portfolio project. Through the design of composition tasks in the course, Sally moved from very basic writing tasks (a letter of introduction for herself or a haiku) to more advanced tasks (compositions about various people, places, and cultural events in her hometown). In addition, this progression of tasks facilitated meaningful interaction between instructors and students: "Great instructor. Her feedback was always helpful."

Example 2: Billy and Oral Conversation Skills in an Online Course

Billy is a student athlete who is past his athletic eligibility window and has struggled with several academic courses, including his language courses. In the past, some of his instructors worried whether his work was really fully his own. In this course, the volume of assessments allows the instructor to get to know Billy, his voice, and his skill level. His written and oral work must be consistent, because the instructor is aware of his ability level through a variety of different assessment types.

Billy greatly benefits from the ability to edit his oral work in this course. The first few oral assessments are recordings that focus on the pronunciation of individual sounds, intonation, and sentence melody. The instructions themselves require students to plan what to say in advance, record themselves, listen to their recording, assess their own pronunciation, intonation, and sentence melody, and re-record to address any issues they identified in their analysis.

In the first activity of the course, Billy practiced specific oral skills by introducing himself in a video blog. Following the process outlined above, Billy re-recorded his introduction several times, until he was content with his pronunciation, intonation, and sentence melody. While not a requirement of this specific activity, the process also encouraged him to take the time to notice errors in word choice, grammar, and pronunciation that are common in oral speech, reflect on them, and correct them before making his introduction available to others. This specific activity does not practice live speech, but the slower pace of the speech act does help students bridge the gap between their often higher accuracy in written communication and lower accuracy in oral communication, and it prepares them for future live conversations with their instructor.

Example 3: Joe’s Final Portfolio — Putting it all Together

Joe didn't expect to complete the course so quickly. They[1] had been dreading the portfolio project and were relieved that they had a few weeks before the end of their course to work on it. Joe was expecting a time-consuming project but was surprised to read in the assignment instructions that they had already done much of the work of the portfolio project.

The final portfolio builds on two written and two oral assessments that students have already submitted for instructor feedback; in it, Joe was required to implement instructor feedback on their written and oral communication. Joe selected a topic from a predetermined list[2] and then selected prior assessments related to that topic to include in their portfolio project. The portfolio project's length requirements exceed those of the prior, low-stakes assessments, so Joe was required to use their instructor's feedback to polish and expand their prior submissions for inclusion in the portfolio. By design, the structure of the portfolio creates student ownership by allowing for student choice, and grading rubrics allow for flexibility of topic while ensuring standard assessment across all possible project types. (See Appendix 2 for portfolio project instructions.)

Because Joe was planning to apply to graduate school, they decided to complete the portfolio option that would help them develop their graduate school application materials in the target language. Joe was convinced that this would set their application apart during the review process. Because of their intent to use the portfolio project as part of their graduate school application, Joe was motivated to make the most of their instructor's feedback on their prior work and to expand their prior submissions to capture evidence of their current linguistic abilities.

Discussion

As Sally's, Billy's, and Joe's experiences show, designing the curriculum intentionally with active learning opportunities to develop meaningful acts of communication had an immediate and positive impact on student success. Students were engaged in their own learning and responded well to their autonomy as learners and to the flexibility to shape how they learn. Following the implementation of the new, active learning curriculum model, students indicated in their course evaluations that their instructors were "available" and "helpful." Students were also able to ask more meaningful questions on the course discussion board, questions that demonstrated their learning. Whereas students in the old curricular model posted questions similar to "I do not understand the topics in unit 6" and "I don't get the past tense," students in the new curricular model demonstrated their engagement with the course content by asking questions such as "I do not understand if I should use the subjunctive when I am talking about my future goals or not" and "Where should I place the verb when using the past tense after a conjunction?"

Questions like these demonstrate that students were actively engaged with the course content and could identify the specific points at which they were confused.

Instructor feedback on students' authentically produced language is another positive impact of the active learning curriculum. Whereas in the old course instructor feedback was limited to correcting errors on closed activities (activities in which there is only one correct answer), in the active learning curriculum automated feedback gives students immediate guidance on closed activities such as practice problems and quizzes. This supports student learning because it helps students notice mistakes as they make them and provides mistake-specific feedback that guides them to the correct answers. It also frees up instructional time for impactful engagement and interaction with students in more open, unscripted activities.

Evidence of the impact of this curricular model can be seen in the successful course completion rate, which, for third-semester Spanish, increased from 22% (four of 18 students earned a C or better) in the last year of the old course to 86% (57 of 66 students earned a C or better) in the first year of the new course. In addition, enrollments in the pilot course, third-semester Spanish, increased by more than 200% in the first year (66 enrollments) compared to the same enrollment period in the prior year (19 enrollments). This reversed a multi-year trend of decreasing enrollments in the third-semester course, which had enrolled an average of 25 students annually over the five years leading up to the course revision.
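
The figures reported above can be checked directly against the raw counts given in the text; a brief worked calculation:

```latex
\[
\text{successful completion rate} = \frac{\text{students earning C or better}}{\text{students enrolled}}:
\qquad \frac{4}{18} \approx 22\%, \qquad \frac{57}{66} \approx 86\%
\]
\[
\text{enrollment growth} = \frac{66 - 19}{19} \approx 247\%, \quad \text{i.e., ``more than 200\%.''}
\]
```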

The success of this approach can perhaps best be seen in the experience of a student who completed both the high-stakes assessment version of third-semester Spanish and the newly revised course with its ongoing, active learning, communication-focused, low-stakes assessments and portfolio project. This student, despite ongoing dedication, was not able to complete the course with high-stakes assessments successfully. The level of accuracy required on all vocabulary and grammar aspects at once on the high-stakes exams pushed a passing course grade out of reach; however, these exam scores were not indicative of the learning completed throughout the course, as evidenced by this student's performance on individual homework assignments in the old version of the course and overall performance in the revised version. The revised version also served to foster student engagement and excitement throughout the learning experience. These experiences are summarized below:

Table 3: Student Feedback

Pre-Revision

I was surprised that homework is not an integral part of the final grade given the amount of homework that is required for the course. The only two grades that count are the midterm and final exam grades. I would also have appreciated specific feedback on the errors that were made on the exams as opposed to just how many points were earned on each section.

Post-Revision

I want to thank you again so much for answering all of my questions. I am excited to tell you that I have passed in everything for all seven modules and am just awaiting feedback on my four portfolio pieces! It was a lot of hard work and very meaningful to me.

I know there is a class evaluation, but I just wanted to email you personally about the portfolio. I really appreciated that this course included options and that both options were relatable to real life situations. I feel like I know my own home town even better than I did before. I truly enjoyed creating my portfolio and I can't wait to see what you folks think! I actually think my hometown is even more special than I thought.

Thank you so much again for clarifying my questions as we went along. I hope things are going well from the instructors' point of view. I am very grateful for the opportunity to take this class.

Conclusions

The assessment method discussed in this article greatly improved course completion and student success rates, but it does bring new challenges to language teachers. It requires more intentional and personalized feedback from instructors, more flexibility in modalities, and a higher level of technical ability on the part of both instructors and students. High-stakes assessments are easier to administer and manage and require less ongoing work on the part of the instructor, but they are not well positioned to help students accomplish the learning goals of communicative language teaching. In the specific case investigated here, the new model required a culture shift on the part of the language instructors, and investments in time and professional development were needed to accomplish that shift. These investments have proven worthwhile through student feedback and the data gathered so far. Engaging students in an environment filled with multiple, low-stakes assessments and iterative assignments required students to become active learners, which led to more meaningful engagement with the course content and higher student success rates.

Implications and Further Research

The immediate goal is to continue to assess the success of this curricular model in the current language sequence while expanding the approach to the other three major world languages taught through this program. Once implemented, we will continue to track the effectiveness of this approach across languages and course levels to test whether its positive impact is transferable beyond the course and language considered here. We also plan to increase the variety of assignment options to improve student choice throughout the courses.

An additional goal is to build a Community of Practice outside of the course. This Community of Practice would build resources for others who would like to adapt this ongoing, low-stakes assessment model to their own courses. We would then look across the curriculum to explore whether this approach could successfully measure the course outcomes of non-language sequence courses. Some of our questions include: Does this approach measure your learning outcomes? Does this approach engage students in active learning in online courses outside of language programs? Are there additional low-stakes assessment types that can be included to provide more student choice and personalize student learning throughout the course?

References

Amrein, A. L., & Berliner, D. (2003). Student motivation and learning. NJ, USA: HW Wilson Co.

Bachman, L. F. (1990). Fundamental considerations in language testing. Oxford University Press.

Baumert, J., & Demmrich, A. (2001). Test motivation in the assessment of student skills: The effects of incentives on motivation and performance. European Journal of Psychology of Education, 16(3), 441–462.

Cole, J. S., Bergin, D. A., & Whittaker, T. A. (2008). Predicting student achievement for low stakes tests with effort and task value. Contemporary Educational Psychology, 33(4), 609–624.

Cunliffe, B. (2002). A communicative approach to second language testing and evaluation. 北海学園大学人文論集, (22), 61-82.

Eklöf, H., Pavešič, B. J., & Grønmo, L. S. (2013). A cross-national comparison of reported effort and mathematics performance in TIMSS Advanced. Applied Measurement in Education.

Finn, B. (2015). Measuring motivation in low-stakes assessments. ETS Research Report Series, 2015(2), 1–17.

Freed, B. F., Segalowitz, N., & Dewey, D. P. (2004). Context of learning and second language fluency in French: Comparing regular classroom, study abroad, and intensive domestic immersion programs. Studies in Second Language Acquisition, 26(2), 275-301.

Gordon, S. P., & Reese, M. (1997). High-stakes testing: Worth the price? Journal of School Leadership, 7, 345-368.

Krashen, S. D. (1982). Principles and practice in second language acquisition. Pergamon Press.

Martin, E. (2015, June 26). 4 Ways to outsmart any multiple choice test. Retrieved from http://www.businessinsider.com/4-ways-to-outsmart-any-multiple-choice-test-2015-6 

Mislevy, R. J. (1995). Test theory and language-learning assessment. Language Testing, 12(3), 341–369.

Norris, J. M. (2016). Current uses for task-based language assessment. Annual Review of Applied Linguistics, 36, 230–244.

Norris, J. M., & Ortega, L. (2012). Assessing learner knowledge. In S. M. Gass & A. Mackey, (Eds.), The Routledge handbook of second language acquisition. Routledge.

Nunan, D. (1991). Communicative tasks and the language curriculum. TESOL Quarterly, 25(2), 279-295.

Penk, C., & Richter, D. (2017). Change in test-taking motivation and its relationship to test performance in low-stakes assessments. Educational Assessment, Evaluation and Accountability, 29(1), 55–79.

Pennebaker, J. W., Gosling, S. D., & Ferrell, J. D. (2013). Daily online testing in large classes: Boost college performance while reducing achievement gaps. PLoS ONE, 8(11), 1–6.

Pintrich, P. R., & DeGroot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33–40.

Plous, S. (n.d.). Tips on Taking Multiple-Choice Tests. Retrieved from https://www.socialpsychology.org/testtips.htm 

Poundstone, W. (2014, September 5). The secret to acing exams. Retrieved from http://www.bbc.com/future/story/20140905-the-secret-to-acing-exams

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223-231.

Savignon, S. J. (1987). Communicative language teaching. Theory Into Practice, 26(4), 235–242.

Savignon, S. J. (2006). Communicative language teaching. Language and Linguistics, 673-679.

Savignon, S. J. (2018). Communicative competence. The TESOL Encyclopedia of English Language Teaching, 1-7.

Schrank, Z. (2016). An assessment of student perceptions and responses to frequent low-stakes testing in introductory sociology classes. Teaching Sociology, 44(2), 118–127.

Scott-Clayton, J. (2012). Do high-stakes placement exams predict college success? CCRC Working Paper No. 41. Community College Research Center, Columbia University.

Siadat, M. V., Musial, P. M., & Sagher, Y. (2008). Keystone Method: A learning paradigm in mathematics. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies, 18(4), 337–348.

Skehan, P., & Foster, P. (1997). Task type and task processing conditions as influences on foreign language performance. Language Teaching Research, 1(3), 185-211.

Thelk, A. D., Sundre, D. L., Horst, S. J., & Finney, S. J. (2009). Motivation matters: Using the Student Opinion Scale to make valid inferences about student performance. The Journal of General Education, 58(3), 129–151.

Wise, S. L., & DeMars, C. E. (2005). Low examinee effort in low-stakes assessment: Problems and potential solutions. Educational Assessment, 10(1), 1–17.

Wolf, L. F., & Smith, J. K. (1995). The consequence of consequence: Motivation, anxiety, and test performance. Applied Measurement in Education, 8(3), 227–242.

Author

Dr. Sarah Korpi is the Director for Independent Learning at the University of Wisconsin, Madison, where she oversees curriculum and instruction of asynchronous, online courses. She completed a B.A. at the University of Minnesota, Duluth and holds an M.A. and Ph.D. from the University of Wisconsin, Madison. Email: sarah.korpi@wisc.edu

Appendices

Appendix 1: Example Grading Rubric for Unit-Level Low-Stakes Assessments

 

The rubric rates each assessment component on four levels: Greatly Exceeds Expectations, Exceeds Expectations, Meets Minimum Requirements, and Does Not Meet Minimum Requirements.

E-learning Platform (30%)

Greatly Exceeds Expectations: First attempt done thoughtfully; following attempts may contain errors but indicate thoughtful completion and ongoing skill development.
Exceeds Expectations: First attempt done thoughtfully; some corrections made in following attempts.
Meets Minimum Requirements: First attempt done thoughtfully; no corrections made in following attempts.
Does Not Meet Minimum Requirements: First attempt submitted blank or not completed thoughtfully; following attempt(s) do not show evidence of thoughtful corrections.

Written Assignment (35%)

Grammar
Greatly Exceeds Expectations: Writing indicates that the student is stretching knowledge of grammar to more fully express ideas. Minimal errors may be present.
Exceeds Expectations: Writing indicates that the student is stretching knowledge of grammar to more fully express ideas. Errors may be present, but are largely related to content that has yet to be covered.
Meets Minimum Requirements: Grammar is appropriate for this level in the course sequence; predictable errors are present.
Does Not Meet Minimum Requirements: Grammar errors that obstruct the meaning are present.

Word Choice
Greatly Exceeds Expectations: Writing indicates that the student is stretching knowledge of word choice to more fully express ideas. Minimal errors may be present.
Exceeds Expectations: Writing indicates that the student is stretching knowledge of word choice to more fully express ideas. Errors may be present, but are largely related to content that has yet to be covered.
Meets Minimum Requirements: Word choice is appropriate for this level in the course sequence; predictable errors are present.
Does Not Meet Minimum Requirements: Word choice errors that obstruct the meaning are present.

Length
Greatly Exceeds Expectations: Meets maximum length requirement.
Exceeds Expectations: Meets minimum length requirement but does not meet or exceed maximum length requirement.
Meets Minimum Requirements: Meets minimum length requirement.
Does Not Meet Minimum Requirements: Does not meet length requirement.

Culturally Appropriate Content
Greatly Exceeds Expectations: Writing directly addresses the prompt and makes exemplary use of culturally appropriate norms.
Exceeds Expectations: Writing directly addresses the prompt and makes use of culturally appropriate norms.
Meets Minimum Requirements: Writing addresses the prompt.
Does Not Meet Minimum Requirements: Writing is off-topic.

Creativity
Greatly Exceeds Expectations: Writing is engaging, making good use of specific examples and transitions. The author is able to "paint a picture" of the experience with adjective and adverb use.
Exceeds Expectations: Writing is engaging, making good use of specific examples and transitions.
Meets Minimum Requirements: Writing is engaging, although predictable at times.
Does Not Meet Minimum Requirements: Writing is void of transitions, details, and descriptions.

Speech Act (35%)

Use of Target Language
Greatly Exceeds Expectations: The speech act is completed exclusively in the target language. No English-language words are present.
Exceeds Expectations: The speech act is completed in the target language. 1-2 English-language words may be present.
Meets Minimum Requirements: The speech act is completed in the target language. 3-5 English-language words may be present.
Does Not Meet Minimum Requirements: English is used throughout the speech act.

Pronunciation
Greatly Exceeds Expectations: Pronunciation and prosody closely resemble target language features appropriate for this course level.
Exceeds Expectations: Pronunciation makes use of target language features appropriate for this level of instruction. Prosody (sentence melody and intonation) is target language appropriate for this course level.
Meets Minimum Requirements: Pronunciation makes use of target language features appropriate for this course level.
Does Not Meet Minimum Requirements: Pronunciation closely follows English-language norms.

Use of Targeted Vocabulary
Greatly Exceeds Expectations: Vocabulary appropriate to this activity and lesson is used without error. Additional appropriate vocabulary that is beyond the level of vocabulary targeted in this unit is present.
Exceeds Expectations: Vocabulary appropriate to this activity and lesson is used. Errors are not present in target vocabulary use.
Meets Minimum Requirements: Vocabulary appropriate to this activity and lesson is used with errors.
Does Not Meet Minimum Requirements: Vocabulary appropriate to this activity and lesson is not present.

Comprehensibility
Greatly Exceeds Expectations: Native speakers would easily understand all aspects of what was said.
Exceeds Expectations: Native speakers would be able to understand the main themes and topics as well as several details discussed.
Meets Minimum Requirements: Native speakers would be able to understand the main themes and topics discussed.
Does Not Meet Minimum Requirements: Native speakers would not be able to understand what was said.
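
As a hypothetical illustration of how the rubric's category weights (30% for the e-learning platform, 35% for the written assignment, and 35% for the speech act) might combine into a single unit-level score, the sketch below assumes a point value for each performance level and a simple average within each category; neither of those choices is specified in the article.

```python
# Hypothetical point values for the four rubric levels; the article does not define a mapping.
LEVEL_POINTS = {
    "Greatly Exceeds Expectations": 1.00,
    "Exceeds Expectations": 0.90,
    "Meets Minimum Requirements": 0.75,
    "Does Not Meet Minimum Requirements": 0.40,
}

# Category weights taken directly from the rubric.
WEIGHTS = {"e_learning": 0.30, "written": 0.35, "speech": 0.35}

def unit_score(e_learning_level: str, written_levels: list, speech_levels: list) -> float:
    """Combine rubric ratings into a weighted unit-level score.

    Written and speech criteria are averaged within their category (an assumption),
    then the three categories are combined using the rubric's stated weights.
    """
    written_avg = sum(LEVEL_POINTS[level] for level in written_levels) / len(written_levels)
    speech_avg = sum(LEVEL_POINTS[level] for level in speech_levels) / len(speech_levels)
    return (WEIGHTS["e_learning"] * LEVEL_POINTS[e_learning_level]
            + WEIGHTS["written"] * written_avg
            + WEIGHTS["speech"] * speech_avg)

# Example: "Exceeds Expectations" on every criterion yields an overall score of about 0.90.
print(unit_score(
    "Exceeds Expectations",
    ["Exceeds Expectations"] * 5,   # Grammar, Word Choice, Length, Cultural Content, Creativity
    ["Exceeds Expectations"] * 4,   # Target Language, Pronunciation, Vocabulary, Comprehensibility
))
```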

Appendix 2: Portfolio Project Instructions: Third-Semester Spanish

Overview of the Portfolio Project

In this course, you will complete a portfolio project that you will be able to use to demonstrate your Spanish proficiency level in the future. The goal is for you to begin working on the project now and to continue working on it throughout the course in manageable pieces. You have two options to choose from to complete this project, both of which are explained in detail below.

The Basics

  1. Both portfolio project options have four components: two writing tasks and two speaking tasks. The speaking tasks require that you make an audio recording, a video recording, or a narrated PowerPoint.
  2. You can begin working on the first writing and speaking tasks here in lesson 1. You will turn in all tasks at the end of the course once you have completed lesson 7.
  3. All tasks have the advantage that you may work on them over the entire course, solicit feedback from your instructor, and turn them in when you are ready for them to be graded.
  4. Lesson 6 asks that you submit, as your written assignment, drafts of at least one written task and at least one oral task for instructor feedback. You may elect to submit both written and both oral tasks in lesson 6 for instructor feedback. This means that you can get comments and suggestions for improvement before you turn in the project for a grade.

The Two Options (Las dos opciones)

To start working on the portfolio project, you will begin by selecting the option that you would like to work toward. Take a moment to reflect on how you plan to use Spanish in the future. If your future use of Spanish is professional, you may prefer option 1. This option is also a good choice if you hope to pursue an advanced degree in your field or demonstrate your Spanish language competency level to your institution through the use of a professional portfolio. If you plan to continue your study of Spanish inside or outside the classroom, or if you prefer creative projects, you may prefer option 2. You may select the project that best fits your needs and interests. Your portfolio project grade will not be impacted by your project decision.

Option 1: The Professional Portfolio

In this option, you will first find a job posting in Spanish for a position that interests you and that fits your qualifications. You also have the option of selecting a graduate program if you are interested in, plan to enroll in, or are currently enrolled in an advanced degree program. This program can be either in a Spanish-speaking country or any other part of the world. In Written Assignment 1 you will share your job posting/graduate program with the instructor and begin brainstorming. After reading your posting carefully, you can begin to work on the project, which consists of:

  1. Two written components:
    1. A cover letter.
    2. A resume (or CV).
    3. Each one should be between 300-600 words.
  2. Two oral components:
    1. In the first recorded audio, video, or narrated PowerPoint, you will discuss in Spanish the academic qualifications and experiences that have prepared you for this job. This recorded audio, video, or narrated PowerPoint should be 3-5 minutes in length.
    2. In the second recorded audio, video, or narrated PowerPoint, you will discuss your professional goals for the position. Students who are considering attending a graduate program may discuss academic goals for the program of study instead. This recorded audio, video, or narrated PowerPoint should be 5-10 minutes in length.

Option 2: A History of my Hometown / County / Region

In this option, you will be asked to become “the expert” on your hometown, county or region within your home state. How much do you know about where you’re from? Have you ever wanted to know more about your county or region? You will begin brainstorming by jotting down ideas about your hometown, county or region. After completing this step in Written Assignment 1, you are ready to begin with:

  1. Two written components:
    1. Complete a short biography of a famous person associated with your hometown, county or region.
    2. Draft a brochure of a historical event associated with the place you have chosen.
    3. Each one should be between 300-600 words.
  2. Two oral components:
    1. In the first recorded audio, video, or narrated PowerPoint, you will discuss a town (county or regional) legend. This recorded audio, video, or narrated PowerPoint should be 3-5 minutes in length.
    2. In the second recorded audio, video, or narrated PowerPoint, you will provide a historical tour of your hometown, county or region that features at least five unique locations or events. These locations and events may be associated with one of the written components and/or the other recorded audio, video, or narrated PowerPoint, or they may be new locations. This recorded audio, video, or narrated PowerPoint should be 5-10 minutes in length.

For the oral components, if you prefer not to record your audio with Vocaroo, there are two alternatives:

  1. If you are interested and able to record yourself on video, that is certainly acceptable. If you do, please ensure that the recorded video is stored in a public location that you can provide a link to so that the course facilitator can grade your efforts.
  2. If you are interested and able to use the audio capabilities of PowerPoint, that is also acceptable. Create slides as appropriate and record your audio narration to accompany them. Submit your PowerPoint to the dropbox when complete.

Notes

  1. The pronouns they/them are used for Joe as an inclusive reflection of non-binary preferred pronouns.
  2. See Appendix 2 for an example of portfolio topics.