Teaching for learning blog

Discussing what matters in higher education.

What does “excellent teaching” really mean?

Tue, 06/11/2019 - 21:00

When I joined the McGill community earlier this year, one of the first projects I worked on was a set of new teaching awards. The Faculty of Management created 9 new awards this year to recognize and reward the excellent teaching that instructors are doing. Teaching awards can help motivate instructors to reflect on and improve their practices, and serve as public recognition for teaching, acknowledging its value and importance. As I spoke with professors about the teaching awards, I started thinking more about what we really mean by excellence in teaching and it made me want to open up a conversation to the larger McGill community about how we recognize and reward teaching.  

I am curious to hear from folks across campus about what criteria you consider to constitute excellent teaching. How would your own discipline define excellence in teaching? What criteria could be used to break down such a complex idea as “excellence in teaching”?

These questions are important both in the context of how you think about your own teaching practice, and in terms of teaching awards. For example, this review  of university teaching awards notes that some programs consider excellent teaching as something that takes a “student centered approach” or where instructors demonstrate “curriculum development efforts” while other programs claim that teaching is an intangible practice, an art form, and cannot be broken down into specific criteria. How can we respond to claims that teaching is something that simply cannot be assessed?  

This brings me to my next question: once we have established a set of criteria that define excellence in teaching, how do we assess those criteria? What evidence can be provided and by whom—students, colleagues, department chairs, the instructors themselves? For example, while a student can speak to the engaging activities in a class, they may not be aware of an instructor’s dedication to improving their practice over time.

Finally, I’d be interested to think together about other ways we can recognize, reward, and ultimately improve teaching beyond the formal teaching award. What meaningful strategies have you encountered at McGill or elsewhere that motivate instructors to improve their practice or that acknowledge when excellent teaching is taking place? One example is the “Thank a Prof” program, where students can anonymously thank a professor who they feel did a great job. The professor then receives a personalized letter of thanks, and their name is published in a list of thanked professors for that year. Professors have commented about how meaningful it is to receive a personalized letter, saying “This makes the job so rewarding to know someone cares” and “What a beautiful initiative and such an unexpected gift”. The Thank a Prof program provides a different model from teaching awards, but one that nonetheless recognizes when a professor is doing a great job. Do your faculties or departments have other methods for recognizing excellent teaching? How can McGill think beyond the teaching award to recognize, reward, and motivate excellent teaching?

The ins and outs of Polling @ McGill

Thu, 06/06/2019 - 09:00

This event recap details the key insights, strategies and questions shared by instructors at the Polling @ McGill event on February 15, 2019. (Originally published on the Office of Science Education blog)

Polling @ McGill, formerly known as “clickers”, is a technology platform that McGill instructors use to ask students questions in class. On February 15, 2019, the Office of Science Education and Teaching and Learning Services (TLS) invited twenty instructors to meet to discuss strategies for using this technology, which may be particularly helpful in large classes. Lawrence Chen (Faculty of Engineering), Catherine-Anne Miller (Faculty of Medicine), Laura Pavelka (Faculty of Science), Kenneth Ragan (Faculty of Science), and Pallavi Sirjoosingh (Faculty of Science) outlined their strategies for engaging students through polling.

Lawrence Chen, a Faculty of Engineering professor, began by affirming his dedication to polling, saying he would “never not use polling”. He sees it as a prime opportunity to promote active learning and takes every opportunity to use it with his students, both undergraduate and graduate, in small and large classes. Lawrence frequently asks students challenging questions that they may not be able to fully work through during class time, giving them the opportunity to think about the concepts over time.

Pallavi Sirjoosingh, an instructor in the Department of Chemistry, uses polling to conduct reviews with students, both as they enter the classroom and throughout the term. Both Pallavi and Laura Pavelka, a fellow instructor in the Department of Chemistry who initially encouraged Pallavi to begin polling, noted the benefits of polling students while they filter into large lecture halls. During the first five minutes of class, as students get settled, they use polling to “get students thinking about chemistry and the content from last class.” To facilitate reviews for major assessments, Laura will often spend an entire class asking polling questions, while Pallavi will frequently use a poll-repoll strategy to check students’ understanding. This involves asking students a question to gauge their understanding, engaging them in a learning activity, then asking the same question to determine if their understanding has improved.

In addition to facilitating review sessions and encouraging peer learning with poll-repoll strategies, Ken Ragan, professor in the Department of Physics, uses polling to assess participation and gather feedback from students. The information he collects about the needs and expertise of his students enables him to tailor his teaching accordingly. He also uses polling to gauge student perspectives on major assessments. For example, once students hand in an assignment, and before they receive their grades, he uses polling to ask questions like “Was the assignment fair?” and “How happy are you with the assessment?”. He then uses the real-time responses to these questions to spark a discussion in class.

Catherine-Anne Miller, faculty lecturer at the Ingram School of Nursing, takes a similar approach to reflexive teaching practice through polling. She keeps students engaged by “ensuring that they’re with her” and getting feedback on the direction of the course. She also uses polling to test out potential exam questions, determining if they make sense to students. Unlike Ken, Catherine-Anne ultimately found that grading participation using polling was too time-consuming. Nevertheless, she has found that her students are still engaged in class and have conveyed that they very much enjoy polling.

The instructors also pointed to the challenges they faced with the technology. The primary issue that arose was dealing with technical barriers, such as challenges posed by the PowerPoint integration and limitations placed on response types.

As Adam Finkelstein, TLS Educational Developer, reminded the group, TLS offers consultations and support to instructors using polling.

› Request a consultation through the TLS website

You may have noticed something new in myCourses…

Tue, 06/04/2019 - 10:34

“What happened to the Content tool in myCourses?” is a question you may have pondered this spring as you busily worked on your course materials. The Content tool received a makeover and a new “Content Experience” was unveiled by the software vendor. The new experience supports the same content types and activities as the previous interface and the changes are mostly cosmetic at the moment. Cleaner navigation and more user-friendly features such as a quick hide/show toggle to make modules visible/invisible are some of the improvements you will notice when working in the new Content Experience.

While this alternate look and feel is still in development, the new Content Experience is being piloted as an opt-in/out option. This means if you would prefer to continue working in the previous interface, then you can simply turn it off.

To revert to the previous interface:

Simply click the arrow to expand the new Content Experience dialogue box.

Then select “Turn it off”.

You will be prompted to provide a reason for why you wish to turn it off. Select a reason from the drop-down list or simply click “Done”. This will revert the Content tool back to the previous interface.

To turn on the new Content Experience:

If you feel like now is the right time to experiment with the new Content Experience, then turn it on and give it a whirl. To do so, click the arrow as described above and select “Turn it on”.

Note that additional features are in progress. Stay tuned for more information about the new Content Experience in the coming months.

Online tools for assessment and engagement in large classes

Thu, 05/30/2019 - 09:00

This post featuring Prof. Ken Ragan is the latest installment in our ongoing series about assessment tools for large classes. On October 7 from 8:30-10:00 a.m., Prof. Ragan will be the guest speaker at a breakfast workshop on “Evaluation and Feedback for Large Classes”. For details and to register, go here.
_____
“Experiment. I used to feel like I couldn’t experiment.”

One might imagine that experimentation would occur naturally in an undergraduate physics course – and indeed, the biweekly laboratory sections of Physics 101 are abuzz with students engaged in active discovery. But what about in a lecture hall filled to the brim with nearly 700 students: is there a place here for experimentation as well? Professor Ken Ragan thinks so, especially when it comes to trying out new ways of engaging, giving feedback, and assessing his students.

Like many instructors, Ken wants his students to leave his course able to demonstrate well-developed analytical and critical thinking skills, both crucial for success in the sciences: “Students need to be able to make simplifying models and apply fundamental concepts – going back to basics, problem-solving … Students are more concerned with getting to the answers, and I want them to go back to basics and understand the problem.”

The challenge was to design his course in such a way that students could develop these important skills and receive adequate feedback on their development throughout the term, without completely overwhelming him (or the TAs) with loads of extra marking. In addition to a mid-term and final exam, Ken uses several online tools that allow students to practice the concepts brought up in class and develop their problem-solving skills.

Using online tools for feedback and assessment

Ken’s students are given weekly online assignments (worth 10% of the final grade) that give them a chance to work through additional problems framed as real-world situations with familiar objects, like an elevator or a scale. The weekly problem sets are based on the material presented during the week’s lectures; they’re released to students the night before the lecture, and are due the day after. The assignments are set up and graded using an online system called LON-CAPA (more information on the system can be found here: http://www.lon-capa.org/), and students access the problems through myCourses.

CAPA personalizes each student’s problems – no two students’ questions are exactly alike, so some independent effort is required – and students get to try their hand at each question up to six times, without penalty, until they get the question right. This iterative process allows students to receive instant feedback on their problem-solving methods, and allows them to troubleshoot and experiment with alternative approaches.
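The post doesn’t show LON-CAPA’s internals, but the behaviour it describes—per-student randomized problem values plus a capped number of penalty-free attempts with instant feedback—can be illustrated with a short, hypothetical Python sketch. The function names, the elevator scenario, and the scoring tolerance below are illustrative assumptions, not LON-CAPA’s actual API.

```python
import hashlib
import random

MAX_ATTEMPTS = 6  # the post mentions up to six tries per question, without penalty


def personalized_problem(student_id: str, assignment_id: str) -> dict:
    """Build a per-student version of a simple physics problem.

    Seeding the generator with the student and assignment IDs means a student
    always sees the same numbers, while classmates almost always see different ones.
    """
    seed = int(hashlib.sha256(f"{student_id}:{assignment_id}".encode()).hexdigest(), 16)
    rng = random.Random(seed)
    mass = rng.randint(50, 90)                   # kg: a person standing on a scale
    acceleration = rng.choice([1.5, 2.0, 2.5])   # m/s^2: elevator accelerating upward
    return {
        "prompt": (f"A {mass} kg person stands on a scale in an elevator accelerating "
                   f"upward at {acceleration} m/s^2. What does the scale read, in newtons?"),
        "answer": round(mass * (9.8 + acceleration), 1),
    }


def grade_attempt(problem: dict, response: float, attempts_used: int, tolerance: float = 0.5) -> str:
    """Give instant feedback on one attempt, enforcing the attempt cap."""
    if attempts_used >= MAX_ATTEMPTS:
        return "No attempts remaining."
    if abs(response - problem["answer"]) <= tolerance:
        return "Correct!"
    remaining = MAX_ATTEMPTS - attempts_used - 1
    return f"Not quite - check your free-body diagram. {remaining} attempt(s) left."


# Two students get structurally identical but numerically different questions.
print(personalized_problem("student_a", "week3")["prompt"])
print(personalized_problem("student_b", "week3")["prompt"])
```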

The CAPA interface can take some getting used to, Ken says, but ultimately it pays off in the form of deeper understanding, and better problem-solving and teamwork skills: “Students work together on these assignments and have CAPA parties in the residence halls, so they really enjoy [the assignments] even if at first they don’t like CAPA so much.” As Ken explains to his students, “the heart of physics is problem solving. Used correctly, the assignments allow you to hone your problem-solving skills”.

In addition to the practice problems, students are assigned two short quizzes every week, on the evenings before lectures: the five questions are based on assigned pre-lecture readings. While the quizzes don’t count towards the final grade per se, students can receive bonus points (up to 5%) for achieving at least 60% on the term’s quizzes (and also meeting a minimum standard of in-class attendance).

This strategy gives students an additional opportunity to engage with the course material and receive feedback on their comprehension prior to the lectures. The added incentive of bonus marks (however small) improves the level of participation in the quizzes, and subsequently his students’ preparedness.
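To make the incentive concrete, here is a minimal, hypothetical sketch of how a bonus rule like the one described above could be computed. The post states only the 60% quiz threshold, the attendance requirement, and the 5% cap; the specific attendance cutoff and the linear pro-rating below are assumptions for illustration, not Ken’s actual formula.

```python
def quiz_bonus(quiz_scores, attendance_rate, quiz_threshold=0.60,
               attendance_minimum=0.80, max_bonus=5.0):
    """Return the bonus (in percentage points) a student earns for quiz performance.

    Assumed scheme: no bonus unless the quiz average reaches the threshold and the
    attendance minimum is met; otherwise the bonus scales linearly up to max_bonus.
    """
    if not quiz_scores or attendance_rate < attendance_minimum:
        return 0.0
    average = sum(quiz_scores) / len(quiz_scores)
    if average < quiz_threshold:
        return 0.0
    return round(max_bonus * (average - quiz_threshold) / (1 - quiz_threshold), 2)


# A student averaging about 76% on quizzes with 90% attendance earns roughly a 2% bonus.
print(quiz_bonus([0.80, 0.70, 0.90, 0.65], attendance_rate=0.90))  # 2.03
```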

The fifth question on each quiz asks students to name something they didn’t understand about the readings, or something they found particularly interesting. Ken spends about an hour reading a third of the students’ answers to question five – about 250 of them. Armed with the information gleaned from the responses, he can adjust his approach: “it gives me an idea of what I need to focus on during the lecture…I don’t really change the lecture [content] but it tells me how to pace [it]”. This last-minute approach is sometimes called “just-in-time teaching”, and it can take a little bit of practice. Says Ken: “It’s taken me some time to get used to just-in-time teaching, and I’m not quite where I want to be yet. The lecture is still largely prepared, but I’m more conscious of how it’s going and where students are.”

He has some advice for others thinking about giving it a try: “It’s easier to do just-in-time teaching in lower-level courses. Don’t do it the first time you teach a course. It takes time to develop an understanding of students and the course … you need to really know the material and be willing to change quickly how you present or order material.”

It’s clear that Ken’s use of these online tools benefits both students and instructor. Students get feedback on their performance early and very often, are given many opportunities to put conceptual and theoretical material into practice, and also are learning good study habits that are transferable to other courses: having so many little tasks to complete on an ongoing basis means they can’t “cram” at the last minute. By the same token, Ken receives continual input and feedback from his students, and is able to tailor his teaching to better facilitate their learning.

Ken’s willingness to try out these online tools may seem like a bold move: many students have come to expect a certain assessment style in very large classes and getting them to buy in to other formats can be challenging. But, says Ken, “I tell them I’m experimenting and that I think what I did before didn’t work as well.”
______________
Would you like to see your students better prepared for class? Would you like to have a better sense of their understanding of the materials? Join Prof. Ken Ragan on October 7 for a discussion about using online assignments to motivate students to read material before coming to class.

Original publication date: September 30, 2014

(Re)Designing a course through the lens of a teacher and a learner

Thu, 05/30/2019 - 09:00

On May 14 and 16, 2019, McGill’s Teaching and Learning Services (TLS) held its semi-annual Course Design Workshop (CDW), where instructors from across the disciplines worked on (re)designing one of their own courses according to a learning-centered course design framework. Instructors engaged in a variety of activities and peer feedback exchanges as they worked toward (re)designing a course where learning outcomes, assessment and teaching strategies were aligned.

I’m a newly hired Learning Technology Consultant at TLS and a Master of Educational Technology student at the University of British Columbia. I’m particularly interested in instructional design and the role technology plays in improving the teaching and learning experience. I attended the CDW and had the opportunity to learn more about teaching by observing how instructors engage in their own learning.

The CDW invites guest instructors to present a piece of their own teaching experience that is relevant to course design. These presentations often illustrate how course design principles can be put into practice. Guest instructor Agus Sasmito (Mining Engineering) illustrated how he uses student comments from his course evaluations to make principled changes to the course content, learning outcomes and assessments in his courses. Agus also explained that he highlights to students the changes he makes to his course design as a result of course evaluation comments from previous students. I learned from this presentation that having open conversations with students about positive ways you use their feedback can be an important driver of change. It also made me realize that course (re)design is not a linear process: it can take time and multiple attempts (over semesters) before getting it right. It was helpful to see that you always have room to test things out, get feedback from students and try again.

Over the last academic year, I wrote a blog series called Strategy Bites, with each post describing a different instructional strategy that promotes active learning. A number of these strategies were employed during the CDW. For example, instructors created concept maps so that they’d have a visual representation of their course content. Creating the map gave them the opportunity to show relationships between course concepts or content modules, and to rearrange or eliminate content. After working on their maps, some instructors commented on the value of this activity for helping them take a step back and look at their course structure from a big-picture perspective. I asked one instructor if he would be interested in implementing this type of activity in his course. His response was, “Yes! I would love to use concept mapping as a way to get my students to create effective study aids!” In effect, students could do concept mapping as a way to review and make sense of course content, and then they could use these maps for studying. I learned that having instructors actually experience active learning in the role of the learner can be a way to get them to buy in to the value of such activities.

Engaging instructors in active learning strategies throughout the CDW gives them the opportunity to learn about course (re)design through the lens of both a teacher and a learner. Attending this workshop has made me, in my new role as a Learning Technology Consultant, feel better equipped to engage in meaningful conversations with instructors when they come to TLS for support.

Digital games to overcome barriers in second language acquisition

Tue, 05/28/2019 - 09:00

Did you know that playing games can help with learning a second language? No matter the age, most people love playing games. What better way to learn something new than through a medium that is fun and rewarding and promotes exploration and curiosity?

Technology is an inseparable part of our lives. Blogs, wikis, podcasting, social networking, video and photo sharing, and computer games are just some examples of the technologies we interact with on a daily basis. However, the education system in many countries has not yet adapted to accommodate the technological changes in society, and students are bored with traditional teaching methods. This has resulted in a disconnect between learners and learning systems. The educational value of games has long been recognized (Lee, 1979); however, growing interest in the pedagogical use of digital games has emerged only in recent years.

Willingness to communicate in a second language

Studies show that different characteristics of serious games can help second language teachers create an environment for their students to improve their willingness to communicate. Willingness to communicate is a fundamental goal in second language learning and increases the possibility that learners transfer these new skills outside of the classroom (Reinders and Wattana, 2011). Studies show that there is a direct relationship between willingness to communicate and the likelihood of students improving their second language skills, particularly productive skills.

According to second language researchers, learners who communicate in a second language more frequently have greater potential to develop language proficiency. However, willingness to communicate in the second language can be affected negatively by such things as anxiety, shyness, and lack of self-confidence during evaluations and when speaking in front of a group. Therefore, it is very important for second language teachers to remove any barriers in order to improve willingness to communicate among students (Reinders and Wattana, 2011).

Gaming is fun and motivating

Given that players consider games fun and engaging, games create a low-anxiety environment. Players like communicating in a gaming environment because it allows them to communicate without anxiety or embarrassment (Reinders and Wattana, 2011). Engagement and motivation are key elements of game play. If players are confused about a game quest, they are increasingly willing to communicate with other players in order to overcome game challenges and move on to the next level. This interaction between players emphasizes the value of communication and gives them an immediate sense of achievement (Reinders and Wattana, 2014).

In multi-player serious games, communication is even more critical since it is key to winning the game. Players are more willing to engage in communication by using the vocabulary and the grammar that they have already learned in the class. Moreover, players have the chance to correct the linguistic mistakes of other players during the game.

Chatting during the game

Chat is a motivating tool that increases students’ willingness to communicate. Players use text chat more than voice chat while playing serious games (Reinders and Wattana, 2011). Text chat gives players enough time to read each other’s messages and prepare their own answers. Most players produce more language output, experience more intrinsic motivation to communicate in the second language, and feel less anxiety about communication when using chat while playing a serious game (Reinders and Wattana, 2014). Using text chat to communicate with other players helps them feel more prepared and confident to communicate in the second language and, as a result, more willing to participate orally in the classroom. Text chat also motivates students who are shy in face-to-face communication to participate and express themselves while playing serious games (Reinders and Wattana, 2011).

Psychologically secure environment

Second language learners who are in good psychological condition are more likely to concentrate on language learning, communicate in a second language, accomplish a task, receive comprehensible input and acquire the second language (Reinders and Wattana, 2015). Therefore, a psychologically secure environment is required to reduce negative barriers (Aoki, 1999). Serious games provide this type of secure environment. Players are less conscious of themselves while playing a game, thus they don’t feel embarrassed or anxious about making mistakes in communication. This increases student enthusiasm, lowers anxiety and improves willingness to communicate (Reinders and Wattana, 2011). Serious games let the students make mistakes and learn while having fun throughout the game play.

Anonymity

Games that allow players to choose an avatar and an associated name help reduce anxiety. Since players don’t have to face other players directly, they worry less about being embarrassed when making mistakes. Therefore, they are more encouraged to communicate in the second language with other players while playing the game (Reinders and Wattana, 2015).

So, now that we are aware of the educational value of digital games in learning a second language, let’s think about adding them to course curriculums!

This blog post was written by Hoda Izadnia, a graduate student in the Master’s program in Educational Technology at Université Laval who completed an internship in Teaching and Learning Services (TLS).

Sources:

Aoki, N. (1999). Affect and the role of teachers in the development of learner autonomy. In J. Arnold (Ed.), Affect in language learning (pp. 142-154). Cambridge: Cambridge University Press.

Reinders, H., & Wattana, S. (2011). Learn English or die: The effects of digital games on interaction and willingness to communicate in a foreign language. 26.

Reinders, H., & Wattana, S. (2014). Can I Say Something? The Effects of Digital Gameplay on Willingness to Communicate. Language Learning, 23.

Reinders, H., & Wattana, S. (2015). Affect and willingness to communicate in digital game-based learning. ReCALL, 27(01), 38‑57.

Where do you stand on learning styles?

Thu, 05/02/2019 - 09:12

The idea that educators should cater to students’ “learning styles” persists despite scant hard evidence that the concept of learning styles holds. A recent article in Inside Higher Ed entitled ‘Neuromyth’ or Helpful Model? revisits the topic.

The authors of Urban Myths about Learning and Education make the point that there may be a difference between what students profess their preferred learning style to be and which teaching/learning strategies actually lead to better learning. The authors draw an analogy with food: what someone chooses to eat might not necessarily be good for them to eat.

Maybe variety is a better choice – expose students to a variety of ways of learning so that they can develop their skills beyond their preferences. Analogy with food? Some kids don’t like to eat vegetables. Exposing them to a variety of types could be good for them … and they might even end up liking some of them!

What’s your opinion on learning styles? Are they fantasy or fact?

Urban Myths about Learning and Education is available online through the McGill Library.

Moving Classroom Participation Beyond “Please Raise Your Hand”

Tue, 04/30/2019 - 09:00

We remember 10% of what we read, 30% of what we see, and 90% of what we do. This suggests that the emphasis in teaching should be less on what students are assigned to read, and more on having them actively participate in their own learning process by doing.

This is often easier said than done. Active engagement in classroom activities and discussions can be an intimidating exercise for many students. Even for those to whom public speaking comes naturally, the benefits of oral participation might not be clear. There may also be social or cultural reasons why students are more or less likely to voluntarily speak in class – students who are traditionally underrepresented in the law school classroom, for example, may feel like their experiences or backgrounds are less valued in classroom discussions.

Classroom participation in law school is traditionally facilitated through a question-and-answer format, where questions are posed to individual students or to the class as a whole. While this can be useful, relying on this method alone doesn’t do much to encourage participation from students who are reluctant talkers.

Thankfully, raising one’s hand or being called on to answer a question is not the only way that participation-based learning can occur. Ensuring that students have a “good first experience” with participation is one way to create a more inclusive and engaged classroom. Professor Sarah Ricks outlines one such strategy, where she has students over-prepare and over-rehearse an oral response to a very straightforward problem scenario. This sets students up to be successful with their first experience speaking in class and gives them the confidence to continue contributing in the future.

The first-year Integration Workshop in McGill’s Faculty of Law is the perfect environment for offering students the type of positive experience that Professor Ricks encourages. I saw the snowball effect of the “good first experience” method early on in the semester: students who spoke up during the first few days and received positive affirmation from the instructor continued to participate. While the first few times were clearly the most difficult, a habit of participation began developing after that. Setting students up for a successful first experience with classroom participation can set them up to be active participators for the rest of their degree.

In order to continue this trend in participation throughout the semester, the instructor I’m working with also uses exercises such as “Instant Summaries” at the end of each session. She asks students to sum up what has been covered in the class, allowing them to respond to this open-ended question with a response that she can build upon if necessary. Not only does this provide students with an opportunity to participate, it also helps the instructor get a very clear idea of what students have actually taken away from the session, especially if multiple students are given the chance to respond.

Along with these particular strategies, I’ve sensed that students are more eager to participate when the instructor makes a point of promoting common decency in the classroom. Making eye contact, demonstrating active listening, giving students time to answer (even if this requires a few awkward silences!) and thanking students for their contributions significantly improve the classroom dynamic. When the instructor is enthusiastic and respectful about engaging with students, participation feels less stressful and more natural.

While the question-and-response structure can be useful for facilitating classroom dialogue, modifications are often necessary to encourage balanced classroom participation. Setting students up for a “good first experience,” incorporating other strategies such as “Instant Summaries,” and ensuring that common decency is promoted in the classroom are all ways of building upon traditional models of student participation. Imagine yourself in the shoes of a first-year law student, thrown into an intimidating new environment – what else might have helped you feel more comfortable participating in class?

Sources 

Kate Exley, “Encouraging Student Participation and Interaction” Reflections (Centre for Educational Development, December 2013), online: <www.qub.ac.uk/directorates/AcademicStudentAffairs/CentreforEducationalDevelopment/FilestoreDONOTDELETE/Filetoupload,432480,en.pdf>.

Sarah E Ricks, “Some Strategies to Teach Reluctant Talkers to Talk About Law” 54:4 (2004) Journal of Legal Education, online: <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1107958>.

Jenny A Van Amburgh, “Lesson 8: Encouraging Classroom Participation” (Northeastern University, October 2017), online: <www.youtube.com/watch?v=e50pIUvYMKA>.


Peer Assessment as a Sustainable Feedback Practice

Thu, 04/25/2019 - 09:00

A number of instructors at McGill have been implementing peer assessment (PA) in their courses and have generously shared some of their reflections on the experience. 

Dror Etzion teaches Strategies for Sustainability (MGPO-440), an elective course offered by the Faculty of Management. Enrolment is typically between 50 and 60 students from Management as well as other Faculties, such as Arts, Sciences, and Engineering. During a conversation with Dror about this course, he shared how he has implemented peer assessment of a team project and peer assessment of students’ contribution to teamwork. He also offered advice to instructors considering implementing peer assessment in their courses. 

What is your motivation for including PA in your course?

As I indicate to students in the course outline, “sustainability is a team sport.” I think it’s important for students in Management to be able to work as part of a team, and when students work as part of a team, they have to be able to give and receive peer feedback. Also, feedback from the professor, well, I think students probably deem it to be necessarily biased because I’m from a different generation. I’m always open to crowd sourcing grades. Peer assessment allows me to do that to a large extent.

How are the student teams formed?

They’re established independently of me. I ask students to work in teams of four, though sometimes there are teams of three or five. I give guidelines, like, it’s good to have people from different disciplinary backgrounds, but students don’t take that to heart as much as they should. They tend to gravitate toward friends or people they’ve seen in other classes. Some of the teams are effective and team members form meaningful links between each other. They establish a sense of community and interest in each other’s work. Other teams are much more instrumental in doing the project and there isn’t any deeper engagement. One time, because this course is open to other faculties, four engineers got together, which was too bad because they didn’t benefit from cross-disciplinary perspectives. Students don’t heed my guidelines as much as they should, but I think it’s up to them to decide what their comfort level is, and sometimes they learn the hard way.

How do you implement PA of the project?

Students are assigned a semester-long team project that gives them an opportunity to learn how to provide constructive feedback to peers. Students decide early in the term what their topic will be. Then, I choose which team gives feedback to which team. I match groups so that there is very little topical overlap, so that there is no disincentive to provide good feedback.

There are two rounds of feedback so that students can improve their projects throughout the semester. The first is not graded. The second is worth 10%. Feedback from the draft stage to a more polished version allows students to see that the project is really getting better as the course progresses.

To help teams get the most out of the peer feedback, each team provides questions in writing to their peers on the specific issues where they want help. The “evaluating team” answers these questions and can provide additional suggestions. I also give time in class for teams to work together and meet the team they’ll give feedback to, like, here’s 15 minutes, go talk to them. This happens shortly after the teams are formed and once students have selected the topics of their projects. I used to hoard my time so that I could project as much information at the class as possible, but I’ve kind of walked away from that to give students more time to feel comfortable in the classroom and work with each other. That also minimizes the chance of one student not understanding what’s going on. Somebody in the group can explain to them what the project task is and how the peer assessment works.

In addition to PA of the project, students assess peers’ participation in the team. How do you implement that?

The first deliverable for the team project is for the team to devise and agree upon a contract that articulates the norms and expectations they have for each other as they work together over the semester. On myCourses, I’ve put a template for a contract, but I encourage teams to develop a tailored contract that their team is comfortable with. At the end of the course, students assess each teammate’s participation and professionalism. I remind students to reference the contract to make sure their assessment is accurate and evidence-based. This assessment is worth 5%. Each student’s grade is the average of the scores given by their team members. Students do the assessment by filling in an online form.
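As a concrete illustration of the averaging described above, here is a minimal, hypothetical Python sketch. The 0–10 rating scale and the function name are assumptions; the interview specifies only that each student’s participation grade is the average of teammates’ scores and is worth 5% of the course grade.

```python
def participation_grade(peer_scores, scale_max=10, course_weight=5.0):
    """Average the ratings a student received from teammates, scaled to the course weight."""
    if not peer_scores:
        return 0.0
    average = sum(peer_scores) / len(peer_scores)
    return round(average / scale_max * course_weight, 2)


# Three teammates rate a student 8, 9, and 7 out of 10 -> 4.0 of the 5 available points.
print(participation_grade([8, 9, 7]))  # 4.0
```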

I clarify up front what students have to do and I think the contract manages to set expectations quite well, so students are prepared by the end of the course to do the assessment. Also, by the time they get to my course, they’re usually already acculturated to the idea that teamwork is a part of many of their courses. So, peer-grading is never really a concern.

What advice do you have for instructors who are considering implementing PA in their courses?

There are a lot of things that go on when doing peer assessment. Students aren’t always sure what to do, like to whom they should submit feedback. So, allocate time to explain the task, and give students opportunities to ask questions so that they really comprehend the mechanics of the exercise. Get an affirmation that everybody understands what they need to do.

Readers: Wondering how students can develop their feedback skills to support teamwork? This brief TLS video addresses how peer assessment can support productive and harmonious team experiences by making students accountable to their team members.

An educational revolution: Should students depose the traditional master of the classroom?

Tue, 04/23/2019 - 09:00

Legal education has long been associated with an intimidating learning environment: rigid course structures, competitive classmates and highly qualified professors. This blog post explores active learning and student-driven learning as an educational pathway to increasing student participation, engagement and fulfillment within the law school experience.

Active learning should be conceptualized as a spectrum which comprises multiple educational strategies to achieve the common goal of maximizing student participation and engagement. Throughout my role as a Tutorial Leader (in the Integration Workshop for first-year law students at McGill), by observing the professor and his interactions with students, I have witnessed several examples of this process. In my experience, the implementation of an open and inclusive learning environment, as an active learning strategy, greatly facilitated the accomplishment of the above goal.  

In particular, a strategy of creating a more relaxed learning environment had the effect of optimizing student participation. For example, the professor started each class by playing a song, with the objective of making the students feel more at ease within the first minutes of the session. Starting class with such an unconventional practice set the tone for the entire class: students were welcomed into a counter-traditional and less intimidating classroom. The professor frequently reminisced about the hardships that he had surmounted during the course of his legal education and academic career. He did not hesitate to share anecdotes about his life experiences and encouraged students to call him by his first name. Not only did student input increase as the semester unfolded, but students also seemed keen to express their own opinions and not simply attempt to give the professor the “correct” answer.  

Witnessing the positive results of the professor’s decision to adopt an active learning approach to teaching has led me to reflect upon other strategies that could be implemented within the active learning spectrum. My reflection will focus on student-driven learning and the impacts that this approach could have on law school courses. The rationale in support of student-driven learning is the following: since students are the ones going through the curriculum, they should have a say in how it is designed and implemented. Accordingly, to further the educational benefits that can be derived from active learning, the student-driven learning strategy could be applied by law professors. 

Student-driven learning could also be used as a tool to deconstruct some of the pervasive myths ingrained in law school culture that impede the fulfillment of the objectives of active learning. Law school is dominated by the idea that the structure and rules of the class should be pre-determined by the professor. Student-driven learning could be applied within the law school context by giving students more decision-making power over how their curriculum is designed and delivered. For example, at the beginning of a given semester, professors could engage in discussions and make decisions with their students about various aspects of the course: the grading scheme, the role of participation and class discussions, peer review as a learning tool, or tutorial sessions, among many others. Instead of relying on feedback from ex-post class evaluations, professors should engage in this type of discussion before the course is underway or while it is progressing, in order to consider the needs and expectations of students as they are learning. This approach allows students to communicate to the professor the solutions that can be implemented to increase their active participation and interest in the class materials.

To conclude, the use of active learning strategies, ranging from the creation of a less stressful learning environment to giving students the possibility to make choices about their course structure, would represent a significant step forward in legal education. Indeed, students’ active engagement and fulfillment in their law courses will ultimately allow them to better their understanding and increase their retention of what they have been taught. However, this post does raise one fundamental question to consider: Even though law professors are progressively adopting active learning strategies, is the law school institution too sclerotic to welcome the students themselves into the pedagogical decision-making process?


Engaging Information Studies students with law and policy: Part 2

Thu, 04/18/2019 - 09:00

In a previous blog post, I discussed my first experience teaching independently, and the pedagogical techniques I tried to apply. The course in question was GLIS 690 Information Policy, a class that aims to help library and information studies students navigate legal, policy, and ethical issues around information. I taught this course in Fall 2018 to 19 Master of Information Studies students, with varied backgrounds and career goals. In this course, I attempted to scaffold students’ exposure to legal and policy concepts while also implementing active learning strategies I learned in AGSEM and TLS workshops. Course feedback was overall positive, although some student evaluations were negative and/or offered me constructive criticism. Apparently, one challenge I faced was finding the right balance between lecturing and active learning activities during class time. An additional challenge was course alignment, particularly aligning content, outcomes, and assessments. I address lessons I learned below.

First, I note that – despite working hard to incorporate frequent breaks, questions for students, think-pair-shares, and hypotheticals – some student evaluations noted that the bulk of my classes were still lecture-intensive. That was disappointing for me, as I thought I had avoided lecturing too much. However, I guess that, even when you think you’re not lecturing too much, you probably are. (Maybe it’s an academic rule of thumb, like that – even when you think doing citations for an assignment won’t take too long – you’re wrong, and they will.) I was not deterred by negative feedback, but will strive to improve this area in future iterations of the course. However, thinking about how to cut lecturing further made me realize that I also struggled somewhat with course alignment. To expand and optimize in-class, active learning activities that will help students apply course concepts, I think I first need to tweak my course alignment.

Alignment, as I have learned through TLS, is the linkage between course goals, learning objectives, activities, and assessment. I struggled with this, as – while preparing to teach Information Policy – I focused on learning a lot of substantive content so that I could teach all those subjects to my students. While I worked hard on articulating my learning objectives in the syllabus and on developing in-class activities, my assessments were not as well thought-out as they could be. I also did not put enough time into connecting different elements of the course, including content, objectives, in-class activities, and assessments. Students, in course evaluations, sometimes pointed out that there was a mismatch between these elements. Now that I know the content I need to teach, I can invest more time in aligning these elements and in preparing students for assessments in future iterations of the course.

There was also a bit of a mismatch between the assessments, the content I was trying to teach, and the level I was teaching. Some student evaluations noted that my mid-term assignment was somewhat easy – and misaimed – for a class of graduate-level students. I am currently thinking of ways to make this assignment more challenging and innovative, while ensuring it helps students practice and apply course concepts. For example, inspired by another TLS workshop I attended in December 2018, I am considering an assignment in which I ask students to apply Canadian copyright law and policy to a hypothetical social media posting on behalf of an organization. This new assignment would ask students not just to make an argument about copyright, but to cite the law and justify a specific course of action on behalf of their organization. 

Further, while my final Information Policy assignment may have interested students more, some students pointed out that I could have laid better groundwork to complete it. The final assignment asked students to design an information policy of their own for a hypothetical organization. I assumed students would be able to design their proposed policy by having studied relevant Canadian laws and example policies from different organizations. However, some evaluations pointed out that grounding my class in the policy development process – the how, not just the what of specific, substantive topics, such as copyright or privacy – would have been appreciated. This is an oversight which I will address with future classes.

In conclusion, I have been told that teaching requires reflective practice. I see now that it does, and that designing and teaching effective courses is not just a reflective and iterative process. It should also be a dialogue between students, instructors and, ideally, other educators (such as my thesis supervisor and TLS) who can help with pedagogical development. I am grateful for the teaching opportunities I have in my PhD program, and hope to deliver an even stronger version of this course in future semesters. 

Engaging Information Studies students with law and policy: Part 1

Tue, 04/16/2019 - 09:00
Introduction  

The legal and ethical questions around information engage people far beyond the legal field. Online privacy, copyright, net neutrality, and ‘fake news’ (with its challenge to democracy) are in the headlines almost daily. People who may not work in law, government, or technology are, nevertheless, Internet users and citizens with a stake in the rules shaping the information society. For that reason, I believe it is important to educate users and information managers about their online rights. That was the guiding philosophy behind the course Information Policy, which I recently taught at McGill University’s School of Information Studies. Teaching this course – engaging students from outside law and policy with these subjects – allowed me to try out several pedagogical techniques. These included: scaffolding and building on students’ existing knowledge (especially given the newsworthiness of course topics); and active learning tools encouraging students to apply course content. My first experience teaching this course offered valuable lessons, both positive and negative. I address some of the positives, or, at least, the pedagogical tools I tried to implement, below. Part 2 of this blog post will address lessons I learned and scope for improving this course. 

Context about the class 

In Fall 2018, I taught Information Policy to 19 Master of Information Studies students. Students had varied interests and career goals, including technology, academic and public libraries, archives, and knowledge management (the business of helping organizations effectively manage employee knowledge). I suspect that most students self-selected into the course because of an interest in information policy, which I define as the legal, regulatory, and ethical issues around information and its governance. The class had not been offered at the School since 2005, which gave me an opportunity to revamp the course significantly. I was eager to succeed at my first time teaching independently. I had taken as many TLS and AGSEM workshops as I could, and set out to make an effective, interesting course which would engage information studies students with legal and regulatory concepts.

Scaffolding: building on existing knowledge 

Like all instructors, I suppose much of what I did could be called scaffolding – helping students achieve progressively stronger understanding of and greater independence with material covered. While I had excellent and motivated students, the challenge lay in the fact that my background (law) is not necessarily one that students had much prior exposure to. An additional challenge is that, as with legal literacy in general, students’ exposure to legal issues could vary within a class or between topics (for example, a student who had worked in a government job applying a given statute might know that area of law better than their peers). I also note that I am in no way blaming my students or anyone outside the legal field for having limited exposure to these concepts, which are not necessarily addressed except in specialized disciplines!

I kept all these challenges in mind and tried to apply one important rule in my teaching. That rule was, “Remember in whose presence I babbled.” I’m not sure how well I succeeded at this, but I tried to define legal terms in resonant, accessible ways. Then, once students had a simple (and, where possible, humorous and memorable) foundation, we moved on to more realistic examples or applications. I used similar precepts in developing examples of legal concepts, starting with fairly obvious ones, and moving up to more nuanced or challenging cases. However, in reflecting after the end of the course, I realized that I may still be able to improve how I define legal terms for my class. I may even be able to organize them in a more interactive, effective way – for example, by using my own definitions as a starting point, but letting students collaborate in a shared Google doc. 

Another issue I built on was the fact that information law and policy are newsworthy. As a result, most students had at least some prior exposure to issues such as copyright, privacy, ‘fake news’, and net neutrality. However, this, too, is a double-edged sword. First, while newsworthiness guided my syllabus to some extent, there were also important topics to address that are not necessarily as ‘hot’ in the media. Similarly, even extensive media coverage of a subject – such as the copyright and piracy wars – may be legally shallow or misleading. I believe – based on student feedback – that I successfully harnessed students’ preexisting interests in important 21st century debates while also enhancing their understanding beyond the things they gleaned from the headlines. I hope to continue doing so in future iterations of the course.

Active learning: “It is the one who does the work, who does the learning” (Doyle, 2008, p. 25). 

Another pedagogical tool I tried was active learning. As someone who’s been a student for ages, I find that active learning just makes sense: no one’s attention span is cut out for lengthy lectures. As various TLS and AGSEM workshops highlight, “the person who does the work does the learning.” I therefore set out to break up my lecturing as much as possible. However, I found this difficult, given that I was teaching concepts to students who were not trained in them at all. I developed the compromise of lecturing for part of each class, then breaking to give students time to work on hypothetical problems in groups. Hypothetical problems could be theoretical (such as identifying and discussing ethical issues in a fraught situation), or more practical (such as looking at a scenario and a piece of legislation, and identifying which provisions might apply). Further, where possible, I tried to break lectures early on to do think-pair-shares and engage students with the material. Initial understandings would, I hope, be refined as I explained concepts, and students would then practice those concepts in answering hypotheticals. Overall, I believe student feedback and evaluations were positive. However, as I will address in the next post, there are still some areas (including balancing lecture and active learning) in which I hope to improve.

References 

Doyle, T. (2008). Helping students learn in a learner-centered environment: A guide to facilitating learning in higher education. Sterling, VA: Stylus Publishing. 

The elephant in the room: Teaching students who don’t know what’s going on

Thu, 04/11/2019 - 09:00

It’s no secret: not every student will be prepared for every single class. As a law student, you constantly have to make compromises: read a summary instead of the full case, read for crim and not contracts because you have a quiz, read for your 8:30am class on the crowded bus ride there, or ask your friend to fill you in two minutes before the class starts. An important point to stress is that this is not a feature of particular students: we all go through this every now and then, because there is simply not enough time to read everything, all the time.  

Contrary to Ewell & Rodgers, who argue that this phenomenon occurs due to “lack of motivation,” I would venture to say that this is not the case in law school. We all work hard, we all want to learn and most importantly, we are all studious. We are not balancing watching TV with studying, but rather, we are balancing studying a with re-reading b and finishing the assignment for c. At the end of the day, we still study the whole alphabet, just not necessarily before class.

The question remains, can we learn in class without having prepared for it? I posit that learning in the classroom depends on whether the professor understands that sometimes there are students that have no idea what is going on.

The Room

In the context of small group classes as in the Integration Workshop (a course that is part of the first-year law curriculum at McGill), the impact of not preparing for class is greater than in a lecture-based class. Through small group exercises, the professor expects students to teach each other the important components of the course. Indeed, educational research shows that we retain much more information when we teach than when we passively sit in a lecture. However, students who have not prepared for class will not benefit and might even hinder their peers’ learning by not being able to contribute fully to the discussions. A professor who understands this will create a learning environment that is reflective of this inevitable reality.

In my group, it was clear that the professor was quite aware of the elephant. She spent a few minutes at the beginning of each session making sure students were familiar with the topic for the class: she asked leading questions and asked students to summarize key points of the readings; she went through short reminders of the important considerations and asked if anyone had questions. This 10-minute refresher was essential, as it enabled unprepared students to catch up.

Although not necessarily causal, the results were positive: the students were engaged in their discussions, they taught each other what wasn’t understood by all, and they took time to help students who simply hadn’t read. More than helping each other learn the material, this type of learning also created an environment of cooperation and exchange that contributed to their confidence in making bold or new assertions about the material studied. The issue with this method, however, is the lack of time. Because the refreshers took time out of the exercise, we often had to rush through the session to make sure we covered all the material.

The Fine Line

Although professors should be aware of the elephant, I am not suggesting that they should assume that students come unprepared for class. This is a fine line to draw, as the best law professors are those who push their students to do better, to be engaged in the classroom and, of course, to come to class prepared. Balancing high expectations with reality, without assuming students cannot contribute meaningfully, is what makes a difference in teaching. Through simple refreshers, professors can truly help students catch up in the classroom and enable them to contribute to peer-to-peer learning.

By addressing the elephant in the room, we can have an honest discussion about what it means to learn in a law school environment. In my opinion, students who come unprepared to class can be an asset: although they will not know the exact vocabulary from the readings, the professor’s refreshers will make them aware of the key points. In a small-group setting, this forces them to rely on their own experiences to contribute, thereby possibly providing the group with insight not found in the readings. This promotes agency and respect for their intelligence and capability to contribute: it is empowering.

In light of this consideration, should we rethink the law school syllabus to provide for a classroom where preparing for class would be limited to key points or large concepts? What if we strived for a classroom where students could bring their individual life experiences into classroom discussions?

Sources:  

Warren Binford, “How to Be the World’s Best Law Professor”

William Henry Ewell & Robert R. Rodgers, “Enhancing Student Preparedness for Class through Course Preparation Assignments: Preliminary Evidence from the Classroom”

Michael Hunter Schwartz et al., “What the Best Law Teachers Do”


Doing research to inform teaching strategies and assessment practices

Tue, 04/09/2019 - 09:00

Alejandra (Sandra) Barriales-Bouche and Sun Young Kim teach Spanish language and German language courses, respectively, in the Department of Languages, Literatures, and Cultures in the Faculty of Arts at McGill. In a conversation about using research to inform teaching strategies, Sandra and Sun Young shared what motivated them to do research with their students and how the research results have informed their teaching. They also offered advice to other instructors considering doing research with their students.

What motivated you to do the research? 

Sandra: We’re always looking for ways to break the routine in class. We wanted to teach vocabulary, in particular, in a more efficient and entertaining way. It happened that Teaching and Learning Services (TLS) invited us to try out a new software tool called Video Assignments (formerly known as YouSeeU) in myCourses. With this software, students can video record their oral presentations and receive instructor and peer comments within the video, all integrated in myCourses. We were keen to try it out and decided this was an opportunity to do some research. 

Sun Young: We wanted to investigate a real problem: students’ public speaking anxiety. We wondered if video presentations would provide students with a less anxiety-inducing alternative to in-class presentations. 

Can you say more about the question(s) you were hoping to answer with this research? 

Sun Young: We were curious about a number of things, such as: 

  • How can we use videos in a meaningful and interactive way?
  • How can the use of videos help us engage students? 
  • How will this kind of project help students advance their language skills?
  • How can we motivate students to produce their own texts?
  • How can we address presentation-induced anxieties?

From these questions, we narrowed the focus and arrived at these four research questions:

  1. To what extent does the use of videos help us engage students in meaningful and interactive ways?
  2. Does the video presentation assignment help students advance their language skills? 
  3. To what extent does the video assignment reduce presentation-induced anxieties?
  4. Does the software support students in succeeding at the video presentation assignment?

What was your research process? 

Sandra: We began by reading some literature, mainly about the pedagogical benefits of students producing their own videos and about educational constructivism. That helped us to refine and concretize our research questions. After that, we designed video assignments for students to do. We also designed a survey for students to fill in. Once we had the survey results, we analyzed the data and used it to make decisions about the use of videos with our students going forward.

How did you collect data? 

Sandra: We planned a few different oral assessments. In each case, they were worth only a small portion of students’ final course grade, around 2-5%. We also designed a survey, which we each gave to more than one of our classes during the last few weeks of the term. The survey was anonymous and made available to students in myCourses. It had four Likert-type questions to be answered on a 5-point scale from strongly disagree (1) to strongly agree (5). Students could also write comments.

How did you analyze the data? 

Sun Young: We graphed the numeric results (Figures 1 and 2), which helped us to visually analyze the data. Then we looked at the comments, sorting them into positive and negative. 

Figure 1. Numeric results from students in one German course
Figure 2. Numeric results from students in one Spanish course

When we analyzed the data, the comments (Figures 3 and 4) helped us understand the numeric results. They helped explain certain ratings. Comments specifically about the presentation and a number of the general comments gave us insight into our first three research questions. Comments about the technology addressed our fourth research question.

Figure 3. Example student comment results from a German course
Figure 4. Example student comment results from a Spanish course
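
For readers curious about the mechanics behind charts like these, here is a minimal sketch of how Likert-type responses could be tabulated and plotted, and how comments might get a rough first sort. It is only an illustration: the file name, column names, and word list are hypothetical, and this is not the tool or procedure Sandra and Sun Young actually used.

  import pandas as pd
  import matplotlib.pyplot as plt

  # Hypothetical file: one row per respondent, columns q1-q4 holding 1-5
  # ratings and a free-text "comment" column.
  responses = pd.read_csv("survey_responses.csv")

  # Count how many students chose each point on the 5-point scale, per question.
  counts = responses[["q1", "q2", "q3", "q4"]].apply(
      lambda col: col.value_counts().reindex(range(1, 6), fill_value=0)
  )

  # Bar chart of the numeric results, one cluster of bars per question.
  counts.T.plot(kind="bar")
  plt.xlabel("Survey question")
  plt.ylabel("Number of students")
  plt.legend(title="Rating (1 = strongly disagree, 5 = strongly agree)")
  plt.tight_layout()
  plt.savefig("likert_results.png")

  # Rough first pass at sorting comments: flag those containing words the
  # instructors consider positive, then review everything by hand.
  positive_words = {"liked", "helpful", "enjoyed", "useful", "appreciated"}
  responses["looks_positive"] = (
      responses["comment"].fillna("").str.lower()
      .apply(lambda text: any(word in text for word in positive_words))
  )
  print(responses[["comment", "looks_positive"]])

Even a quick summary like this makes it easier to see at a glance which questions students rated highly and which comments deserve a closer read.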

What did you learn from the survey results? 

Sun Young: Comments were always fair and constructive. We learned that some students really liked being video recorded and others didn’t. We also learned that not all students like dealing with technology! But the majority of students recommended keeping the assignment in the course because they appreciate having a variety of assessment methods. So we’re thinking the tool could be used to give students an alternative activity. That’s in line with Universal Design for Learning (UDL), which McGill’s Office for Students with Disabilities promotes and which I try to implement in my classes. 

What were students’ reactions to being involved in your research? 

Sandra: They were willing to help throughout the whole process. When there were technology glitches, students were always sympathetic.  

Sun Young: The students were so patient. They saw that we were doing it to improve our teaching, and to help them and maybe future generations improve their learning. I think they enjoyed paying it forward. 

Sandra: And the students got interested in the project because they knew it was a pilot for a tool that could eventually be used by the whole McGill community. They knew that their feedback would be shared with TLS. They really liked that. 

How have you used the results? 

Sun Young: We’ve used the results to inform our teaching. Any modifications I made were based on that data: I thought, “This I should keep; this I shouldn’t.”  

Sandra: For instance, we plan to make assignments more interactive by having pairs or groups of students give presentations. We also got participants’ consent to share their assignments for pedagogical purposes.  

Sun Young: Some past participants also gave me permission to use their videos as models with current students. I always mention to students that the assignment has been modified based on previous students’ comments.  

Sandra: Yes, I always explain why I do what I do. I think students like that we trust them to understand why we do what we do in class, and they like having a say in the content of the class. 

Sun Young: The results were so useful that I will use this process for implementing other teaching strategies that I want to try out. I would tell students what I’m doing and survey them again. We’ve also shared the results of our research with colleagues at the “Language Learning and Technology Professional Development Series” organized by the Arts Multimedia Language Facility. We’re now thinking of doing a presentation at a graduate student orientation in our department so that the grad students who teach can see a new tool.

Sandra: It can be part of their training. 

What advice do you have for instructors who are considering doing research on teaching strategies and assessment practices with their students? 

  • Start with a small project, one that can be done in one semester so that it’s manageable. 
  • Be patient with yourself when trying something new. Changing the usual routine comes with some challenges, but once you get used to the new software, the effort is worth it. 
  • Work with a colleague so that you can brainstorm together and support each other when facing technological hurdles. Working as a team can inspire instructors to use the first project as a stepping stone to exploring new research questions and teaching strategies. 

Success story looking for happy ending: Is community-engaged learning in peril at McGill?

Thu, 04/04/2019 - 09:00

My classroom, empty.

This is at least how I found it to be at 2:36 PM on October 2, 2018—day 1 of my students’ “research partnerships,” for lack of a groovier word (and I have searched for some!), with a community-based organization working for social change.

In its current form, the capstone course for students majoring in Gender, Sexuality, Feminist and Social Justice Studies, GSFS 400, is a relatively new class offered at the Institute for Gender, Sexuality, and Feminist Studies (IGSF) at McGill. It was designed as such in 2017 by the unforgettable Mary Bunch, now assistant professor in Cinema and Media Arts at York, and in Fall 2018 it was adapted—with a littl’ extra music—by the musicologist currently on faculty at IGSF… yours truly. Inspired by Paulo Freire’s Pedagogy of the Oppressed and bell hooks’ description of a liberatory feminist praxis as “action and reflection upon the world in order to transform it” (Freire in hooks 1994, 112), GSFS 400 builds on the principles of collaboration, partnership, and critical praxis to support interdisciplinary teams of student researchers in the development of a research project that responds to needs defined by a community-based organization working for social change. The research partnerships had been initiated or reactivated the summer before thanks in large part to McGill’s Experiential Community-Engaged Learning & Research (ExCELR) program, hosted at the Social Equity and Diversity Education (SEDE) office (beware, more on SEDE below). And after nearly a month of readings, in-class discussions, and guest lectures on the risks and benefits of university-community partnerships, feminist research ethics and design, and how to build respectful research relations, that day, for the first time, my students were free to meet here or to be wherever their organisation needed them to be. No need to come to class, I wrote on my syllabus (although you can—I will be here). 

Luckily for my soon-to-burst empty nest syndrome, at 2:40 PM or so, my all-star team of Indigenous feminism critics walked in with a cardboard box full of surveys that had been done some time ago with service-users at Quebec Native Women (QNW). No one was quite sure whether the answers to the surveys had been catalogued and compiled, let alone analyzed and turned into recommendations, grant proposals, or policy changes. That, they did: “data collected from question 11” eventually led to a fantastic resource handbook aimed at two-spirit service-users at QNW. All the way upstairs in IGSF’s classic “pink office,” my #TeamWorkMakesTheDreamWork team was plotting ways to write a Canada-wide report on best practices for emergency night shelters in order to help the Montreal-based organisation Chez Doris turn their recent $1,000,000 donation into an emergency night shelter for women. Soon, the Fab Four that partnered with AIDS Community Care Montreal also came in with—and I recall this distinctly—a willingness to make the research truly transformative that, I would soon learn, would be as strong as a chain that truly has no weaker link.

The team that was partnering with Femmes* en Musique didn’t make it to class that day; they were working on their literature review for a project on gender-, sexuality-, race-, and age-based discrimination in Quebec’s music industry that would turn out to have wide media appeal (a press conference is to take place at the end of March). The team that worked at the Laboratory for Urban Culture (LUC), an organisation that provides free after-school arts programs to kids in the underprivileged neighbourhood of Little Burgundy, didn’t show up either; they found out quickly enough that the Community Youth Arts Network online platform that the LUC had imagined (C.Y.A.N.! They had already coined the catchy acronym!) would have to be scaled “down” to nine hours a week of volunteering with kids (without which the programs could not even take place), plus recommendations for a five-year sustainability plan. A gem amongst gems, the team that partnered with Project 10, an organization that supports and provides services to queer and questioning youth, was likely benefitting that day from the invaluable guidance provided by the superstar coordinator there regarding how to collect and edit impact stories (#myP10story) as part of their funding and outreach efforts.

Over the following two months, I followed the progress of my students’ research projects through bi-weekly meet-ups, large-group debriefs and workshops, meeting agendas that I could access online, individual self-reflexive journaling, and several installments of research reports that included, among other things, ethical considerations, methodology, and an extensive literature review.

Most students were very forthcoming, particularly in their researcher diaries, about how challenging they found the course to be. Many were confronted for the very first time with the emotional labor involved in actually doing social justice work rather than arguing for “theory-from-the-ground-up” in a term paper. Some found teamwork dynamics to be particularly taxing (though not the #TeamWorkMakesTheDreamWork team, to be sure). Others faced ethical dilemmas with their organisation that made them doubt, at least for a time, their actual commitment to feminist goals. Unanimously, my students found the bridge between theory and practice to be one that was a lot more difficult to build than to critically endorse.

And yet their final journal entries were filled with comments like that of student-researcher Maya Smith, who worked with Chez Doris:

“Throughout this project I have learned so, so much about my preconceived assumptions about homelessness support and care practices. I feel like I went through so many different stages of thinking and rethinking over the last few months and I have been repeatedly challenged. I feel that I entered this project with really lofty, ideological and utopic ideas of how to change the world. And while I think hopes and dreams are definitely important to have, I found myself challenged again and again to reconsider the actual situation on the ground, and how to best address it. At the end of the day, even if I want to take down capitalist racist patriarchy, if there aren’t enough beds in Montreal for all the homeless women in the city, and that’s what homeless women think is needed, then meeting that need should be the goal. I learned how important it is to truly listen to the people you are purporting to be helping, not just in theory, but in actuality.”

Another student-researcher, Corinne Bulger, initially found it difficult to reconcile Indigenous feminist scholarship and praxis as a Canadian settler doing research in an Indigenous-led organization. “There were times when I felt frustrated or worried about my positionality as a white researcher working with an Indigenous community partner,” she wrote:

“I have felt both my academic focuses, of gender and Indigenous studies, have formalized my ongoing knowledges of ethics and ensuring not to take up space in certain contexts, which at times has left me immobile in moving forward with a research question, project, or even comment out of fear of being inequitable or insensitive. I think this class positively pushed me out of my comfort zones and challenged me to be a more forward researcher and student. I believe that being more forward in my research and study practices is something I need to work on, as I have often found myself being passive or feeling a lack of expertise during my undergrad. Through this course, support from my team, and working with our community partner, I have been encouraged to be more aware of this and challenge myself in the future. Overall, this project has been a wonderful learning experience. I am so grateful for the academic skills it has taught me, as well as the positive relationships I gained with my peers and our community partner.”

Surprisingly, most of my GSFS 400 students—most of whom were in their final year of undergrad—also celebrated their newfound appreciation of teamwork! Student-researcher Elsie Chan described her teammates as “great sources of solidarity, community, and strength,” and spoke of how proud she felt about having “met the goals that I have set up for myself,” “got[ten] outside of my comfort zone,” “challenged myself to do things that I usually try to avoid. . . I am no longer as scared of trying new things,” she concluded, “and cannot wait for what comes next in the future!”

Now, what a success story, you might be thinking; students rising up to the challenges offered by their capstone course and flying out of their undergraduate nests with hope for the future! What a happy ending.

Well, not quite—or, not yet. The recent announcement of the restructuring of SEDE, and in particular the relocation to Enrolment Services of Community Engagement Coordinator Anurag Dhir—who has been building and sustaining university-community partnerships for years—is a harsh blow to community-based learning initiatives at McGill. When I stepped into this course assignment late last summer, I could rely on SEDE to “hop on board” in 4 out of the 6 partnerships that my students participated in. Beyond the particular circumstances of this course, SEDE creates sustainable community links that allow the university to circumvent the contingencies related to individual faculty members’ particular expertise, availability (maternity leaves, sabbaticals), and familiarity with specific university regulations, including those of McGill’s Research Ethics Board. Simply put, long-term partnerships are beyond the scope of a single prof. This office is a critical life-support system for community-engaged education at McGill.

Corinne Bulger was categorical in her assessment of the benefits of university-community partnerships for the whole ecosystem: “I believe that the most positive impact we made was through helping bridge a relationship between Quebec Native Women and McGill. A relationship that will offer beautiful opportunities and resource-sharing in the future. I hope this is something that is sustainable and continuous for future semesters between the two parties.”

And so do I.

McGill has identified community-based learning as a priority. But without an institutional structure responsible for sustaining university-community partnerships beyond single-semester courses, community-engaged learning simply cannot thrive at McGill.

At its best, community-based research does not simply produce new statistical data or reveal how structures of discrimination channel certain lives towards the margins of the citizenry. It does what great music always does (says the musicologist): transport us to another world, compel us to imagine a new social order, and in turn shape that alternative lifeworld. In other words, (actually, the words of another music-and-social-justice scholar, George Lipsitz), “in the process of struggle, scholar-activists develop new ways of knowing as well as new ways of being. . . [in order] to become the kinds of people who can create institutions, practices, beliefs, and social relations capable of generating a more just world” (Lipsitz 2008).

A worthy priority indeed.

Bibliography

hooks, bell. Teaching to Transgress: Education as the Practice of Freedom. New York, Routledge, 1994.

Lipsitz, George. “Breaking the Chains and Steering the Ship: How Activism Can Help Change Teaching and Scholarship.” In Engaging Contradictions: Theory, Politics, and Methods of Activist Scholarship, edited by Charles R. Hale, pp. 88-112. Berkeley, University of California Press, 2008.

Student quotations used with permission.

A “pass/fail” grading system can be the “A+” grading system for law school

Tue, 04/02/2019 - 09:00

In a recent sociological study, Kathryne M. Young poignantly describes how law students typically feel as though they have hit an “intellectual wall” when confronted with their first readings and assignments. Additionally, many of them struggle to get over their first grades: they got into law school thanks to an outstanding academic record. Now, they work harder than ever just to fall somewhere in the average “B” range, thanks to the infamous “curve” (or “enforced average”). 

In most law schools, grades cause a disproportionate amount of anxiety among previously overachieving students. Worse, their typically “curved” distribution contributes to a toxic, competitive climate. Many students become obsessed with getting the most impressive letter grades on their transcripts, whether they admit it or not. As a result, they begin to focus on “doing better” than their classmates. Such focus unfortunately keeps them from fully appreciating an otherwise enriching individual learning experience.

Experimenting with alternative pass/fail grading

These criticisms, among others, have long given rise to calls for reform of the grading system in law faculties. In the early 1970s, the University of Michigan, for instance, led an experiment to assess the benefits of an alternative Pass/Fail system, in response to consistent complaints from both high-achieving and low-achieving students about grade-induced pressure.

In a nutshell, the experiment revealed that, although finding a replacement for the traditional grading system was indeed desirable, a total Pass/Fail system might have two main flaws. First, it might decrease students’ overall performance, absent any measurable standard to attain. Second, it might further diminish already inadequate levels of feedback.

In 2012, the University of Toronto’s (U of T’s) Faculty of Law implemented Pass/Fail grading in a way that likely addresses these flaws. For all courses, students now receive one of four grade options ranging from “Low Pass” to “High Honours.” The scale holds students to certain measurable standards, which tempers the likelihood of a decrease in their overall performance. In addition, since professors still have to justify the different “grades” they give, overall feedback levels are unlikely to diminish.

Since 2016, McGill Law has similarly been experimenting with Pass/Fail grading in the context of its first-year Integration Workshop. In this course, although students are ultimately graded on a Pass/Fail basis, their performance on various assignments is thoroughly assessed and graded with rubrics by Tutorial Leaders (TLs) and/or professors. Thus, while students keep working towards meeting certain measurable standards, they are less likely to suffer from the anxiety that curved letter grades typically induce.

Meeting high standards, period

As a TL, I have witnessed how, even when students only need to reach a grade of “7/10” to meet the “Pass” threshold of their first assignment, they still generally aim to write the best possible paper. A majority of them have come to the drop-in hours I offer to review the feedback I provided. Regardless of their grade, most asked me about every single aspect of their assignment that they could improve.

This experience reveals that a transformation of the traditional approach to grading may also transform students’ approach to learning. The students in my section discussed their assignments with an attitude I wish I had had when I was enrolled in this course myself. Back in my first-year law courses, I would have received a curved letter grade for a similar assignment. I admittedly endeavored to beat the odds that my work would be graded as “average.” The students I have met this year have worked as hard as I did, but they have channelled the energy that I would have spent worrying about my performance in comparison to others into an endeavor to go beyond their own initial capacities. As one student said to me, “I am not doing this for a grade. I am doing this to improve.”

In other words, a Pass/Fail grading system may transform counterproductive anxiety from the fear of “being average” into productive motivation to simply reach high standards. As a student comments, the most commonly given grade at U of T Law, “Pass with Merit, is exactly what it sounds like: you have passed the course and met the high standards.”

In the long run, would U of T and McGill’s new grading systems effectively reduce the performance-related anxiety experienced by many students?

References

  1. Robert Lempert, “Law School Grading: An Experiment with Pass-Fail” (1972) 24 Journal of Legal Education 3 at 251. Retrieved from https://www.jstor.org/stable/42892138
  2. Michael Robert, New Grading System Actually Makes Sense [Blog Post], 2012. Retrieved from http://ultravires.ca/2012/09/new-grading-system-actually-makes-sense/.
  3. Kathryne M. Young, How to Be Sort of Happy in Law School (Stanford: Stanford University Press, 2018), Chapter 18 (“Exams and Grades”).

Peer Assessment in Higher Education: A Viable Inclusive Practice

Thu, 03/28/2019 - 09:00

I believe strongly in encouraging individuality and agency in the classroom. After my own experiences with elementary and secondary education, and the experiences of my peers, in which we often felt in some way misunderstood and unacknowledged by our teachers, it has been important for me to advocate for supporting all types of learners with all types of interests within the classroom. Even without knowing much about theoretical aspects of inclusivity, I strived to provide an equitable classroom during my time as a high school teacher. 

The first time I formally explored inclusivity was about a year after stepping out of my teacher role. I was tutoring a student who was in an Early Childhood Education program. Many of their textbooks reflected the importance of individualized support for young students and inclusivity of all students, regardless of ability or interest. The next time my interest in inclusive practices was piqued was when I joined McGill University’s Assessment and Feedback Group (AFG). Here, I brought my experiences as student and teacher, and my support of inclusive approaches to learning, to the group’s conversations. The group inspired me to do some research into inclusive education practices in post-secondary settings. During this research, I found a practical 2-page article published by Plymouth University entitled “7 Steps to: Inclusive Assessment.” The overview of the article effectively summarizes the aim of an inclusive approach to education: 

“Higher Education (HE) expansion has resulted in greater student diversity. Rather than focusing on specific target groups or dimensions of diversity such as disabled students or cultural groups, an inclusive approach aims to make HE accessible, relevant and engaging for all” (Thomas & May, 2010, as cited in Plymouth University, 2014, p. 1).

The Plymouth University article was an important find because it highlights a perceived difficulty for instructors in implementing inclusivity in post-secondary environments. As pointed out in the first chapter of the Benson (2013) casebook on inclusive practices in higher education, implementing inclusive practices in higher education often requires staff development, the provision of teaching and student supports, and the development of more inclusive higher education pedagogies. These requirements suggest that implementing inclusive practices is a daunting task. This perceived difficulty has been echoed in conversations at AFG meetings and other conversations that I have had (or overheard) with instructors and students during lectures and workshops on ableism. It’s a perception that can dissuade instructors from adopting inclusive practices. In this post, I will use peer assessment, an assessment strategy mentioned in the Plymouth article, as an example of an inclusive practice that can be implemented in a university setting. I address two characteristics that illustrate inclusion: choice and social interaction. 

Firstly, peer assessment can be considered an inclusive assessment practice through the characteristic of choice. Choice can be offered through peer assessment in a variety of ways:

  • Students can have a say in choosing the criteria that they use to assess one another’s work, or the format of the feedback that they will give and receive.
  • Students can be allowed to choose the peers to whom they will provide feedback and from whom they will receive it.
  • In the case of teamwork, students can determine the criteria for a team contract and choose their roles within the team.
  • Students might also be offered the choice of using different technologies, such as audio/video recordings or online rubrics/forms, to complete the assessment.

Secondly, peer assessment can be an inclusive assessment practice through the socialization opportunities it provides for students. Students interact with each other when communicating their feedback. Again, in the case of teamwork, students develop their interpersonal skills by working together. Setting team goals and having team accountability are other formative socialization opportunities that can be provided through peer assessment. Peer assessment is participatory; it invites students to take a role in their own and their peers’ academic achievement. As a participatory method, peer assessment invites socialization and the voices of students to be prioritized within the classroom. Accessible and participatory strategies, such as peer assessment, are important to implementing inclusivity in higher education (Benson, 2013). Providing opportunities for social interaction creates an inclusive classroom, as benefits of social interaction can include an increased sense of community and acceptance of all students within the classroom.  

These brief observations regarding two characteristics of peer assessment show the viability of implementing an inclusive assessment practice in higher education classrooms. In general, allowing students choice supports their agency in the classroom and providing students with socialization opportunities helps them develop their interpersonal skills. All students can therefore benefit from engaging in peer assessment. 

References 

Benson, R., Heagney, M., Hewitt, L., Crosling, G., & Devos, A. (2013). Managing and supporting student diversity in higher education: A casebook. Oxford: Chandos Publishing. 

Plymouth University. (2014). 7 steps to: Inclusive assessment. Retrieved from https://www.plymouth.ac.uk/uploads/production/document/path/2/2401/7_Steps_to_Inclusive_Assessment.pdf 

Balancing the Roles of Supervisor, Mentor, and Friend

Tue, 03/26/2019 - 09:00

Mentor. The word itself was originally a name—the name of an advisor in Homer’s epic poem The Odyssey who was impersonated by the goddess Athena. The term’s mythical connotations are all but gone now, but it still describes an advisor and teacher.

But we have another term for that in graduate education at many universities: supervisor. The difference is that supervisors usually focus on helping students along the path toward graduation. Mentors make time to guide some of their students in other aspects of their lives.

How? One of my mentors was not my supervisor but a professor from whom I took two or three courses. (One of the courses happened to be on James Joyce’s massive game-changing novel Ulysses, which was based in part on The Odyssey.) After our courses were over, he offered me research assistantships and later helped me to get a job in the instructional development office that complemented my sessional teaching.

These were my day jobs, which I wisely did not quit, but for years I spent many evenings writing music and playing in a band. My mentor too was a musician, a much more accomplished guitarist, and when he launched an album he asked me to play with his group for the occasion. He knew that classical and flamenco were not my forte but thought the experience would be good for me. And it was, perhaps mostly because it was a vote of confidence and a gesture of friendship.

Most supervisors cannot, and probably should not, be friends of all their supervisees. There’s not enough time. There’s also the unavoidable risk of bias. But mentoring has a mode that M. Christopher Brown II calls “frientoring,” which accepts subjectivity as a good thing. Frientoring also attempts to equalize the power differential between mentor and mentee so that they’re equals on a personal level. So, I play tennis and ride road bikes with my local mentors. I still don’t win many sets—but that’s because of my serve, not an institutional dynamic.

Frientoring and mentoring have fewer risks when they can be separated from supervision, which is one reason that supervisory committees are becoming more common than sole supervisors. One supervisor, however multifaceted, simply doesn’t have all the facets that a group does. And if one supervisor can help a student to focus on degree requirements, then another might safely be friendlier.

The mentor can think through longer-term or personal questions with the student: What’s out there after graduation? Who can a person become? The Supervision website at McGill has a page on mentorship that offers many more such questions for mentors and mentees, in addition to lots of other scholarly advice on the roles and responsibilities of supervisors. For alumni and current students, there’s also the Mentor Program at the Career Planning Service.

Wherever you go, and in whichever field, you will find people who have benefited from a mentor and who have grown into that role themselves with time. They attest to a significant need for continuous advice from people who have probably known you only as an adult, and who understand the many phases that adults go through in their careers and personal lives.

Although the original Mentor was a friend entrusted by the departing Odysseus with continuing the education of his young son Telemachus, most mentors today guide adults. As graduate students gain independence as researchers, mentors become no less important. They are often inspiring, enabling, crucial figures—during and after grad school. In fact, for life.

Original publishing date: April 29, 2014

Strategy Bites: Student-generated questions

Tue, 03/26/2019 - 09:00

At Teaching and Learning Services, we regularly receive questions from instructors asking for ideas to enhance their teaching and improve students’ engagement in class. So, we’ve recorded 2-3 minute video bites that describe how to implement some strategies we’ve chosen based on relative ease of implementation, suitability for different class sizes, and their representation of a variety of interaction types. We’ll be sharing these strategies in the Teaching for Learning @McGill University blog over the coming weeks. Stay tuned!

 Strategy: Student-generated questions 

The student-generated questions strategy involves getting students to write questions about peers’ oral presentations. The questions are shared among the class for discussion and can be submitted to the instructor, who then has a bank of questions that can be used as prompts for online discussions or even quizzes.

Why use this strategy?

As a student, one of the most disappointing experiences while giving a presentation is feeling no engagement from your fellow classmates. Some students may be on their phones, some on their laptops, some quite obviously tuned out and staring into space. While wrapping up the presentation you worked very hard on, you ask if anyone has any questions and all you get is dead air. Of course, while the purpose of presentations might be to have students research a topic and practice their presentation skills, the topic itself is usually an important addition to the course content, and it should therefore be meaningful to the whole class. So how can you teach your students to be a better audience?

An effective strategy is student-generated questions, where students are asked to produce a number of questions during a presentation. The questions can clarify a concept, stimulate discussion, or serve as potential exam questions. When students have to listen actively to a presentation and create questions based on its content, engagement is almost guaranteed to improve. During my time in grad school, one of the most important characteristics of our weekly colloquium for thesis defenses was that fellow grad students were required to come up with two questions based on each defense. There was never time to address every question, but each question had to be submitted at the end of the session. This was a great way to get everybody to listen and practice some critical thinking.

Having students generate questions also gives them practice formulating questions. There may be no such thing as a stupid question, but a quality question that stimulates meaningful discussion is undeniably valuable to the learning experience. After all, asking a good question can be just as powerful a learning tool as giving a good answer.

Would you like to know more?
  • Does having students write questions enhance their learning? Read what one author has to say.
  • Students don’t always know how to ask meaningful questions. Strategies exist for helping students learn to ask meaningful questions.

What strategies do you use to get students to pay attention to peers’ in-class oral presentations? Share your ideas!

Graduate Supervision as Teaching: Let’s talk

Thu, 03/21/2019 - 09:00

While many professors come to Teaching and Learning Services (TLS) to talk with us about undergraduate teaching, hardly anyone ever comes to talk with us about graduate supervision. Considering how many difficulties can arise when a professor is supervising a student, it is rather surprising to me that the topic does not come up more often during consultations. As a supervisor, you may find yourself wearing many different hats: that of an employer, a guide, a role model, a coach, and occasionally, even a friend. Regardless of your approach to graduate supervision, it is essentially a form of teaching. What you are teaching will depend on your discipline as well as the skills the supervisee already brings to the table. It can involve anything from identifying pertinent research questions to choosing the right audience for a publication. In addition to the disciplinary knowledge, teaching may also include time management and organisational skills, the ability to communicate research results and the resilience to deal with setbacks.

Whether research takes place in a library, a lab, or the field, your role as a supervisor is to help students on their journey to becoming independent researchers and, importantly, to obtaining a graduate degree. I am mentioning the latter because it is surprisingly easy to lose sight of the fact that your supervisee’s journey should not meander aimlessly from one interesting topic to the next, but rather follow a curriculum with predefined milestones. Due to the unpredictable nature of research, developing the curriculum and keeping students on track can be challenging. To help you with this task, Graduate and Postdoctoral Studies (GPS) has developed myProgress, an online tool that allows you and your supervisees to keep an eye on their progress towards obtaining their degree. “I expect my students to monitor their own progress,” some of you may interject at this point. That raises important questions: What are your expectations for your graduate students and what can they expect from you as their supervisor? Responsibilities in graduate supervision are often not clear-cut. For example, everyone agrees that you are supposed to provide support to your supervisee, but what kind of support and how much? Are we talking about comments on a research paper or emotional support after the rejection of a conference abstract? What kind of support will actually help your supervisees succeed? While there are no simple answers to these questions, the supervision team is here to help. Together, TLS and GPS have developed a variety of resources and workshops to help you navigate the supervisory role right from the start.

If you are curious about approaches to supervision taken by award-winning supervisors at McGill, take a look at their profiles on the Supervision Snapshots website. The site features the winners of the Carrie M. Derick Award and the David Thomson Award for Graduate Supervision and Teaching at McGill since 2016. If you would like an opportunity to talk with some of these supervisors in person and learn about their experiences, join us at TLS on April 12 for the Supervisors’ Lunch. Whether you are looking for advice or have a successful supervision strategy that you would like to share, this event is your chance to talk to colleagues from across the university. For more information or to register, click here.

