SBG: With A Little Help From My Friends
With some encouragement from colleagues, I’ve decided to try Standards Based Grading in my second-semester calculus-based physics class this spring (2013). Because this is my first attempt at SBG, I leaned heavily on the work already done by others and their advice (namely Andy Rundquist and Frank Noschese). I also gained some instant supporters through Twitter.
So as not to bore readers with all of the finer details, I’ve tried to break up my thoughts by headings below. Feel free to skip around, and if you make it to the end, this post will end up in that discrete minority of entries that are read in their entirety!
Change for change’s sake was not enough to warrant this shift in assessing my students. And, while trying something new can be refreshing for the instructor, there is more to the story here. I really am looking for ways to make the course more meaningful for everyone. And there were some very specific items I wanted to address explicitly:
- Assignments up to this point have always been one-time “values” students earned. If students made improvements later on in the course, traditional grading means could not facilitate “mastery learning/grading” (revisions of grades due to addressing shortcomings in resubmissions).
- While I could generally understand the reasons for choosing certain assigned problems, specific objectives were not clear to students. And, to be honest, choosing end-of-chapter problems was not as focused as I would have liked. When choosing problems, I found myself thinking:
- “Yeah, they need to do one of those; it’s kinda out of the box” and
- “That’s a classic problem, they need to work that one out”
- These rationales amount to assigning problems on a whim, which may change with the mood the instructor is in.
- I loathed deciding how much each problem was “worth” point-wise. About the only thing I standardized was taking 0.5 points off each time units were incorrect or omitted. Partial credit consisted of my inferences about where students’ thought processes were misguided. Longer, more difficult problems were worth more points; shorter, single-step problems were worth less. While this makes sense on the surface, I began to question: “Should credit be based on the content tested rather than on length/difficulty?”
- These students are pre-engineering students. Eventually, they will need to present their ideas to their colleagues and clients. Presenting ideas and explaining your own thought processes are fundamentally different from simply turning in a problem set and being done with it.
With just about anything new, there is an uneasiness that, well, makes one feel uneasy! The following are my greatest apprehensions. Time will tell how warranted they are.
- One way students will need to submit work for standards is by screencasts. I’m just beginning to create screencasts myself and know there’s a bit of a learning curve. Much of the burden will fall on the students here, but there are some wonderful pain-free ways to create them. Screencastomatic is quickly becoming one of my favorites. Jing is what I learned on. If the means to create them requires admin privileges, it could be a show-stopper for some of my students.
- A departure from traditional instruction might not be received well by the institutions these pre-engineers transfer to. However, I reasoned that if students are more thoroughly grasping the most important concepts in physics, are future instructors really going to care how “assignments” were graded?
- I couldn’t break completely free from the last bullet. I’ve kept exams a part of the course, though they are take home exams. There is also a separate lab score. Some standards can be addressed with what is done in lab.
Standards are graded on a 0 – 4 point scale: from 0 = no attempt to 4 = exceeding expectations. Essentially, I’ve used the rubric Andy Rundquist has on his course website (where he credits Frank Noschese). I added the element of “able to come up with examples and applications outside the context of the problem” for a 4. Because I wasn’t sure exactly how many points there would be for the entire course, I decided to use weighted percentages for the different course components:
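Since the weighted breakdown itself isn’t reproduced here, here is a minimal sketch of how weighted components combine into a final percentage. The component names and weights below are placeholders for illustration, not my actual breakdown:

```python
def course_grade(scores, weights):
    """Combine component scores (each on a 0-100 scale) into a weighted total."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[name] * scores[name] for name in weights)

# Hypothetical weights -- the actual course breakdown differs.
weights = {"standards": 0.60, "exams": 0.25, "lab": 0.15}

total = course_grade({"standards": 90, "exams": 75, "lab": 85}, weights)
# 0.60*90 + 0.25*75 + 0.15*85 = 85.5
```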
Full Disclosure To Students
I really wanted to be transparent with my students. Too often, I think they feel in the dark about why they are being asked to solve certain problems or required to do this and that. So below is an excerpt from the list of standards to be handed out to students.
This is what students will see:
Chapter 15: Mechanical Waves
I can successfully compare and contrast different types of waves, the periodicity of waves, and sinusoidal waves, and provide a clear & complete definition of simple harmonic motion.
Assignment: (Recommended, and required for submitting reassessments)
- Describe two different examples that are periodic but not simple harmonic oscillators. Describe two different examples that exhibit simple harmonic motion. Include sketches/graphs for each. (Use multimedia for source material if you wish, like YouTube). Of all of these examples, explain why some are and some are not SHM.
- The point: periodic behavior does not necessarily mean it is SHM!
- There are many types of waves. Give explanations of the three basic types discussed in Chapter 15. Research a fourth different type of wave.
- The point: some waves can be described as combinations of other types of waves. Other wave types or “wave-like” phenomena are completely independent of others.
- Text problems 15.3 and 15.5
- The point: “periodicity” means we can use basic physics to get at the basic physical properties of the wave. But there is a lot more to the story . . .
I heeded the advice I read in Andy’s blog and in the comments left there: standards become “active” when discussed in class, and students then have 2 weeks to submit their work. If they want opportunities to resubmit work, they need to submit the recommended assignment in that same time frame. From then on, they can continue to resubmit so long as the standard has been active within the past 2 weeks. I’m a little fearful of the bookkeeping this will require, but it’s a small class (only 3 students!).
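For what it’s worth, the bookkeeping behind the 2-week window is simple enough to sketch. Everything below (the standard name, the date, the window constant) is illustrative only, not my actual records:

```python
from datetime import date, timedelta

ACTIVE_WINDOW = timedelta(days=14)  # standards stay "active" for 2 weeks

# Hypothetical activation dates, keyed by standard.
activated = {"Ch 15: wave basics": date(2013, 1, 22)}

def can_submit(standard, today):
    """A (re)submission is allowed while the standard is still active."""
    start = activated.get(standard)
    return start is not None and today <= start + ACTIVE_WINDOW

can_submit("Ch 15: wave basics", date(2013, 2, 1))  # within the window
can_submit("Ch 15: wave basics", date(2013, 3, 1))  # window has closed
```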
Other advice taken was to have the policy that scores may go up or down based on repeat assessments. I think higher ed courses can implement this more easily than K-12. I just hope students see the merits of this policy.
Students can submit work for standards in various formats: screencasts, URLs of appropriate media supplemented with their own written narrative, office-hour visits, class quizzes, class discussions, and chance/water-cooler meetings.
Well, that’s it for now–this ended up longer than I’d anticipated. Thanks for reading!
While at a conference recently (AAPT), I was presenting material from an astronomy workshop I’d held with a colleague during the summer of 2012 (at the SLL Observatory). The poster had the usual stuff on it: pictures of the facility, demographics of participants, bulleted lists of what we did, etc. And, even though it was located off the beaten path from the bulk of the posters, it got a lot of traffic — no complaints here.
Those who stopped by to visit had great comments and questions. And then it happened, on more than one occasion… I felt like I was out of the loop; like I’d missed some memo on current buzzwords. I was asked by several individuals
“So what mileage did you get out of doing this?”
To be honest, I didn’t know how to respond at the time. The truth is, working with an NWOSU adjunct who is also a member of a local astronomy organization (Starcreek Astronomical Society), we simply came up with the idea while stargazing as something that would be fun to do: apply for a small grant to host an astronomy workshop in rural Oklahoma. (It was a great success, by the way!)
Anyway–since then, I’ve had time to reflect on the question and its broader meaning. Turns out, it’s an incredibly valid question, and we should all consider answering it before taking on a new project or saying ‘yes’ to the next extracurricular activity around the bend.
When it comes to deciding what to do and why, here are two basic factors to consider:
- Your professional activities ultimately play a role in things like job security, promotion and tenure.
- Extracurricular activities require additional time and will draw on your resources.
While fun and interesting, the activities in item 2 above can mean time away from family and can hinder fulfilling what’s wrapped up in item 1. Plus, saying “yes” to too many things means you may not be able to complete any of them to the quality you’d like. Not to mention that more involvement means less “power down” time for you (we all need a little separation from work!).
The best situation one can find oneself in is one where items 1 and 2 above are aligned. Extracurricular activities don’t have to be an added drain/strain. They can be framed in ways that augment the experiences of more than those they were designed to benefit directly.
For example, I help host a local robotics competition (BEST) and am often asked to launch rockets or do a science demonstration day for area schools. With a little bit of lead time, I get my students involved as much as I can. They help run demonstrations on science safety, judge at the robotics competition and play a role in getting the rocket fleet up and running. And I’m not just pawning off my work to students! Many of them are pre-professional students or future teachers: they need community-based volunteer experiences and public school field experience hours to round out their undergraduate programs of study. In fact, our science majors must complete Science Fair Judging: a service-learning course designed to help majors see the merits of professionals promoting/supporting STEM initiatives among youth. The science fair “extracurricular” activity has turned into a benchmark experience for all of our majors (biology, chemistry and science education).
The mileage I gain from activities like these is not just in helping students complete their degrees or resume building. One of their most critical values is maintaining relationships and continuity with area HS teachers/schools. This has been a tremendous asset in feeding grant-funded professional development opportunities (see ToPPS, for example).
Long story short, the odometer may not be easy to read in all that we do, but if we keep our wits about us and heed our colleagues’ advice (like backseat drivers!), we’re likely to stay on the right path. And those extracurricular activities? They aren’t always exits away from your final destination. . .
If you’ve ever been in a physics class, you know that at some point it’s going to happen. Sometimes it’s once a week; in other classes, it’s every single day.
Teacher: “So we see here, there’s a system that looks like this . . .”
Students: Nodding, acknowledging they see what she/he is referring to.
Teacher: “Ok. Now what will happen if I do this?” (Motions like she/he is going to do something to the system but freezes for drama and to prompt some student reactions).
A multiple choice question pops up on the screen, asking for students to make a prediction of what happens next.
Students: Wrestling with what to say/choose, they begin working through a mental flowchart; a newly established norm for physics class that defines how they approach answering “simple” conceptual questions:
- “Nope, it can’t be that one, because that’s the obvious answer–that’s too easy to be right.”
- “It could be that answer because it’s the opposite of the obvious answer.”
- “It’s best to choose one of the other choices because by default, the other two are what the teacher expects me to choose based on what we know (or don’t yet know).”
For the sake of the physics education road sign theme lately and keeping things simple, I’m going to call the above logic/graphic the Physics Round-A-Bout (though I know it happens a lot in other disciplines, too). It’s based on every teacher’s desire to surprise or impress upon students something that is unexpected. Let’s face it, for the most part teachers like sharing what they know, and teachers are intrigued by counter-intuitive explanations. This in large part is what is responsible for the physics round-a-bout. But the round-a-bout is also partly derivative of learning theory: by creating a discrepant event, disequilibrium or cognitive dissonance among students, students begin to assimilate the new information until it is accommodated within their world view or mental content. Wait, what? Yeah, sorry about the jargon . . . Those familiar with Piaget’s work, the work of Karplus or Strike and Posner’s conceptual change research may be giving a fist pump right now. For the rest of us, here’s the short of it:
You see something that doesn’t make sense. You are perplexed. So naturally, you try to figure it out. You work at it until you arrive at some kind of explanation that makes sense to you (whether accurate/complete or not)– fitting it in with other related stuff tucked away in the back of your mind (sometimes done consciously, sometimes not). Then you feel at ease, ready to move on to other things.
This works, and it works well. Humans have engaged in this practice since birth, and it plays a significant role in how “science” is carried out every day. In fact, science education researchers and practitioners have formalized the process and put it to use. The 3-phase learning cycle, for example, is designed around it (the 5- and 7-phase LCs add assessment and “engagement” elements).
The problem is, when we teachers repeatedly set up lessons or demonstrations to spark interest and get students perplexed or to make predictions, we often do so without first giving them the opportunity to build a knowledge base adequate to tackle the question. So students frequently are not well equipped to address what’s presented. And they fall victim to these teacher traps time after time.
Short version: We teachers routinely set students up to fail in making predictions, and it gets old.
At an AAPT meeting, I recall a presentation on this by Eugenia Etkina. It was some time ago, but it left an impression on me since I’d already begun questioning the “sage on the stage” teaching mode. For further reading on this, this article by Eugenia Etkina on ISLE provides a good background, though the document covers much more than what this post is about. Fast-forwarding to page 26 of the document gets right to the point: repeated conflict, confront and resolve teaching strategies may actually hinder learning. Instead, there is evidence that students may stand a greater chance at remaining tuned in to the content for long term understanding if they are provided with experiences/data to make successful predictions.
When you think about it, it’s pretty obvious: Do you like being accurate or well-informed about what you’re talking about, or do you prefer to always be corrected on your talking points in front of your peers?
So for me, this doesn’t mean “out with all the demonstrations.” It means letting students ask questions and guiding them toward looking into key factors that will help them make informed predictions. Still show them cool stuff; everyone likes a surprise or change of pace every once in a while. But try giving students the opportunity to see the connections and applications of the material before an eye-catching demo. Engage them with content they can wrestle with, and avoid routine magician shows that always have an unexpected result for unsuspecting physics students. Because in the end, that routine may leave them nowhere but further removed from the discipline.
What kinds of factors limit student conceptual gains in introductory physics?
Take-home message: Secondary trends observed in a study suggest that the average HS senior/beginning college student will face a limit to conceptual gains possible in standard lecture-based instruction of introductory physics. Very high conceptual gains among students in a lecture-based course are only attainable by students with the most developed scientific reasoning abilities (which is a small subset of the HS/beginning college student population).
As part of my PhD research at the University of Oklahoma, I investigated the interdependencies of
- changes in everyday usage of technical terminology (in particular, the word “force”),
- conceptual gains in first semester algebra-based physics classes and
- scientific reasoning ability.
The working hypothesis was that those who demonstrated greater scientific reasoning abilities and had greater conceptual gains would stop their loose (and technically inaccurate) usage of “force” in colloquial usage. In short, I expected that students who demonstrated an aptitude for physics would not mix “force” with other terms (like energy, momentum, strength, etc.).
While the hypothesis was not fully supported, there were other trends revealed that dovetailed with other work in Physics Education Research (PER) (see the work of Coletta and Phillips) and can serve as a source for further study. The scatterplot below is pretty loaded. While the axes suggest one trend, the color coding reveals something more subtle:
MLU stands for the Mechanics Language Usage instrument (developed for the study). FCI refers to the instrument used to determine conceptual gains in 1st semester physics. “Developmental Stage” refers to performance on a reasoning ability test and is actually much more detailed than a 3-point scale–it was reported this way for simplicity.
The axes suggest the opposite of what was hypothesized: Namely, those with the greatest conceptual gains actually changed their loose usage of “force” the least. And it wasn’t because they had the usage correct to begin with . . . These students were simply more resistant to changing their language usage. I have to admit, the lack of delineation between a decrease and an increase in the amount of “mixing” (loose usage of “force”) was a bit disappointing. I was hoping for more dots above zero.
But look at the vertical dashed lines added to the graph. Notice that to the right of the green line (a gain of 0.65) there are no green dots. To the right of the blue line (a gain of 0.5), there are no blue dots. These are effectively conceptual limit lines based on performance on the reasoning ability test. The average HS/early college student scores nominally in the blue-to-green range in scientific reasoning ability (an indirect measure of developmental stage). These lines, then, potentially represent gain limits. For additional information on “gain” as reported here, see this document by Richard Hake.
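For readers who want the formula behind Hake’s “gain”: the normalized gain is the actual pre-to-post improvement divided by the maximum possible improvement. A quick sketch (the function name is mine):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain g = (post - pre) / (100 - pre),
    with both scores as percentages on a 0-100 scale."""
    if pre_pct >= 100:
        raise ValueError("pre-test score must be below 100%")
    return (post_pct - pre_pct) / (100 - pre_pct)

# A student moving from 40% to 70% on the FCI:
g = normalized_gain(40, 70)  # (70 - 40) / (100 - 40) = 0.5
```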
- It’s important to note that the participants in this study were enrolled in a typical lecture-based physics class (3 hours lecture, 2 hours lab, 1 hour recitation) with only limited interactive engagement components. So one could argue the data suggest that this format of instruction (primarily lecture) imposes a limit on gains for the average student.
- It would be remiss of me to neglect to mention the successes Modeling Instruction has had in this arena. Modelers report average gains in excess of .85 for students of similar demographics. The highly engaging environment of their classrooms makes conceptual gains achievable despite the varied levels of proficiency in scientific reasoning ability.
- Although this study involved 240 participants across two universities, these findings are exploratory–secondary to the intent of the original research question–which means that, however convincing the arguments above may seem, further study designed to test that question really needs to be done.
On the road, when you come across this sign you immediately begin assessing your choices: Which lane is the safest to navigate? Should I straddle the two lanes? Will I be able to go back and forth between lanes safely if I have/choose to?
I’m responsible for sections of calculus-based and algebra-based physics in the same semester this year. While these are two different tracks of physics to begin with, a theme I’ve noticed is that there are distinctly different yet analogous levels of preparation in each section. I suppose this is generally the case, but a quick, unofficial poll of math/physics background at the start of the semester and some early after-class discussions with students really brought it to light.
These are the uneven lanes of my current physics sections:
- Students taking algebra-based physics who are also in calculus or have AP high school physics experience . . . as compared to others who took college algebra years ago with little to no high school level physics.
- Students enrolled in calculus-based physics who completed algebra-based physics and/or Calculus I & II the previous year for preparation . . . as compared to those who are just now learning the meaning of a derivative and have not yet taken any physics.
I don’t think this is anything too extraordinary. But it raises the question: “How do you make course content accessible, meaningful and challenging across differences in academic demographics?” Maybe more significant: “How do learners respond to the varied academic demographics of their peers?”
Should we choose one lane only?
Should we attempt to cater to both?
Can we switch gears every so often without creating problems?
Something I’ve started doing after asking a whiteboard question and the class discussion that follows is quickly scanning the boards as I collect them into a pile. I hand-pick a few to fuel the discussion a little further. Of the boards I pull aside, I make sure to select one or two that are not as well developed as the others. The purpose: emphasize what worked, even among students’ responses that may have been more limited. This alone won’t bridge the gaps. But if students can see the value in their efforts, even if those efforts are only emergent, then they are more likely to want to contribute more–despite the uneven lanes across their colleagues’ levels of preparedness for the class. It can also be eye-opening for students to see how their peers approach the same problem.
Another thing I do is discuss derivatives in the algebra based physics course. Not at length, but enough to offer intrigue to those who have taken or at least heard of calculus. Care needs to be taken though–some students get awfully nervous when you spend time in class on math that is beyond the scope of the class. I like to think this demonstrates how equations in physics come from someplace; that they didn’t just appear one day as formulas for mortals to memorize. All that said, I don’t get into the calculus very often. Like switching lanes on an uneven road: if you switch lanes too many times, it can get uncomfortable and/or unsafe.
So there you have it. A somewhat scattered post about a couple ideas regarding straddling the lanes of levels of student preparation. To be honest though, I don’t think these are very brave or innovative — I’m more curious how others address uneven lanes in their classrooms . . .
You might think the question above applies only to public school teachers: the battle with the bustle of students’ schedules shows up in creative assignment schemes to accommodate athletics schedules, quiz bowls, club events and the like. Recording classroom sessions and posting them to the web as videos or screencasts is another way students can review/catch up on what they may have missed during regular class time.
However, “cross traffic” can be more than just extracurricular activities acting as a physical time constraint. Other coursework generates additional cognitive load. So even if a student has ample “time” for getting work done, they may be struggling cognitively with the overall influx of material.
If learning 400 years of physics content in a single year is like drinking from a fire hose, then what does this times “n” other coursework equate to?
While we would hope that scheduled courses are well balanced, that everything dovetails seamlessly with the overall curriculum, and that we can relate our course material to everyday life–it doesn’t take long to realize that is an idealistic pipe dream.
For example, when teaching calculus-based physics, I often have students new to physics and concurrently enrolled in calculus. While I can take advantage of this by having students investigate real world applications of elementary calculus (the meaning of a derivative, for example), some care and reservation is required before getting too deep into applications of mathematical integration. So OK, that’s not too bad . . . it’s manageable.
But when I try to capitalize on students’ interests so they can better “relate” to the physics, I’ve had mixed results. Discussing what bull riders or linebackers experience (rodeo and football are big at NWOSU) and how it applies to “jerk,” I’ve found, does not necessarily bring the physics closer to the students. In fact, sometimes entering a discussion of examples like these makes me come across as Shrek over-identifying with Artie in Shrek the Third (go 1:30 minutes into this preview). And then my students post things like #physicsfail.
So what began as a question in a road sign caption and led you to believe I had an answer actually ends with me genuinely soliciting your advice: How do you accommodate students’ extracurricular activities, course loads and other demands in your classroom? Do you:
- Ignore the “noise”?
- Solicit for student interests and target as many of those as you can?
- Use the cross traffic as teaching moments only from time to time (once a week, a couple times a semester, as assignments …)?
- Structure the entire course curriculum around it (see Physics for Future Presidents or Service-Learning: Engineering in Your Community)?
For just about any drive, this sign is loathsome.
It means delays.
It means less time is spent covering new ground.
And it means we might not get where we wanted by the time others expected.
We are unwitting victims of a “productivity paradigm” that has convinced us more can (and must) be done in less time. Our schedules and professions demand it.
To what degree, do you think, has this paradigm carried over into US education? One of the things I’ve noticed in education policy, program assessments and education research is that a significant motivation is to somehow make more “stuff” fit into prescribed time slots in ways that are “more effective”: we want improved test scores over greater breadths of material in less time.
So here’s a thought: Maybe instructors and learners should be prepared to stop every once in a while. Slow down, engage in some formative assessment and allow for ideas to incubate a bit before pushing on.
Some of the most interesting work I’ve seen in science education research breaks from the paradigm described above. These studies investigate how learners and instructors take time to slow down and reflect on what they’ve done before moving forward. This can be accomplished during formative assessment exercises by frequently jotting ideas down on whiteboards individually or in groups for whole class discussion. Or it can be accomplished by taking more extensive time to regularly write and revisit journal entries on one’s own learning. It’s more about having learners stop to think critically about what they know and formalize it in a dialogue with others so they can then move forward with a more complete picture before tackling new content.
But these kinds of practices in a classroom take time.
It means delays.
It means less time is spent covering new ground.
And it means we might not get where we wanted by the time others expected.