Monday, July 17, 2017

3. Thoughts ahead of ADE 2017 Academy

My pedagogical innovations with mobile technology have helped me become a new Apple Distinguished Educator (ADE) in 2017. I'm waiting for my flight to the USA ADE Academy. Before I arrive, I offer a few thoughts about my experiences and philosophy so far.

Excited
To meet and learn from my peers. Here are a few statistics I culled about our 2017 ADE cohort: there are 130 new ADEs, but only 36 are from higher education (like me), and only 25 of those 36 are faculty (like me). I'm excited because this means I have only a couple of peers who are not only in higher ed but are also science (much less biology) professors. Cross-pollination - the exchange of new ideas and approaches between disciplinary (and grade-level) "silos" - will certainly be a highlight of this three-day Academy! I'm particularly interested in learning from K-12 educators how public schools are preparing students (in both study skills and technology skills) for their college experiences.

N.B. Kudos to the California State University (CSU) system - my employer, and the largest four-year public university system in the United States (by enrollment): all five California higher ed faculty who are new ADEs this year are from the CSU system (not the University of California - UC)!

Cautious
I am an Apple advocate. I have been most of my life:

Me (left) in high school, published 1/18/1996 in my hometown newspaper, the Albany (OR) Democrat-Herald

I cut my teeth on an Apple II+ at home during elementary school, later used a Mac Plus at home and various SE/30s and LCs at school through the 12th grade, and then bought my first Mac (a Power Mac 6400), followed by G3 and G4 towers, etc. Although I'm never an early adopter of technology (usually at least two years behind the latest device), I've managed to accumulate quite a bit of Apple tech since becoming an Assistant Professor:

My office, shot while preparing a class lecture, ca. 2016

At CSU Fresno, my home institution, we launched DISCOVERe, a 1:1 tablet program for our students, a few years ago. However, our administration chose not to make it an exclusively Apple program, instead developing a brand-agnostic approach to augmenting our classes with mobile technology. Thus, I expect to feel a conflict of interest over the next few days, while I learn how others use Apple products in their classes, and while Apple employees help me learn about, and develop, best practices for employing their hardware and software.

Throughout the Academy, I will do my best to maintain focus on developing platform-agnostic approaches. So far, we know that a big focus this year will be on using Clips, a social-media-friendly video-capture app, in the classroom. I'm optimistic, though somewhat skeptical, that I'll be able to translate much of the effort I put into this process into my classes, because I know that many of my students have Windows and Android devices (and thus no Clips app). I'll be updating this blog regularly over the coming week with musings and practical applications of mobile technology in education. Grade- and platform-agnostic best practices, here we come (I hope)!

Monday, July 3, 2017

2. Framework for Backward Design

Self-rating of attributes of the process described here (1=good; 2=neutral or not relevant; 3=poor):
Accessibility: 2
Affordability: 1
Engagement: 1
Efficiency: 1
Implementation: 1
Inquiry: 2

I'm proud to be a scientist in this day and age. Preparing students to be evidence-based decision-makers is a hot topic in education, and we are experts at this process. However, we're not necessarily experts in how to help others, like students, achieve this goal! Thus, most professional conferences I attend as a professor or as a scientist have recently had workshops on essential topics like:

  • assessment (how and what data do we gather that help us measure whether our students are learning the information and skills we intend)
  • curricular design (learning outcomes, course goals, aligning these with professional standards, backward design…)

Introduction: the Syllabus
Essential parts of a course syllabus include "Course Goals" and "Student Learning Outcomes" (SLOs). Depending on who you ask, the Goals are meant to be a qualitative, colloquial description of how students should change, or what they should become, by the end of the course. SLOs are specific descriptions of what students should be able to do by the end of the course.

One of my genetics Course Goals:
"You should become comfortable with using facts and terminology to analyze data and to perform quantitative reasoning, decision-making, and critical thinking."

A related SLO from my genetics course:
"Make models to predict experimental outcomes; generate null hypotheses (expected results)"

Thesis
Goals and SLOs must be considered when planning a course - this is the concept of backward design. Here I will provide a generic but detailed framework for achieving backward design, with accompanying examples of how I undertook this process for my genetics course. 

Backward Design
I dislike this term: "backward" here does not mean wrong - it just means that this approach was formalized after most teachers were, apparently, already using a less useful (forward) curriculum design process. In forward design, a teacher would first deliver the information and only later design an assessment (e.g., a test) to measure student understanding. The problem with this process is that the teacher might not have taken the Goals and SLOs into account when preparing each class. Without regularly consulting the Goals and SLOs, by the end of the course the assessment items (exam questions) might not actually measure students' achievement of the SLOs.

Thus, backward design (e.g. Wiggins G, McTighe J. "Understanding by Design." Alexandria, VA: Association for Supervision and Curriculum Development; 2005) emphasizes:

  1. keeping the endpoint (Goals and SLOs) in mind, then
  2. creating assessments that measure the SLOs, and finally
  3. designing exercises and lessons that help students practice and achieve the SLOs

In other (sometimes unsavory, depending on the company you keep) terms: backward design is teaching to the test.

My backward design alignment involves nine steps. Before I describe the first, I must introduce one more concept:

Introduction: Bloom's Taxonomy
It is widely accepted that students can demonstrate several types of skills and abilities, each of which builds on the previous one. This idea was first formalized by Bloom; there are six levels to the hierarchy (the labels below follow the widely used revised version of Bloom's taxonomy), from simplest to most advanced:

  1. Remember
  2. Understand
  3. Apply
  4. Analyze
  5. Evaluate
  6. Create

(Bloom, Benjamin S. "Taxonomy of Educational Objectives" Allyn and Bacon, Boston, MA; 1956).

Faculty often hear about Bloom's taxonomy, as I did, in the context of writing assessments. Students should have a relatively easy time scoring well on exams that comprise mainly lower-level Bloom's questions requiring memorization and understanding (such items might include matching, true/false, fill-in-the-blank, and multiple-choice questions). We are often encouraged to incorporate higher-level Bloom's questions. My perspective is that this is beneficial for testing because it allows faculty to distinguish students with levels of understanding beyond rote memorization.

Framework

(Optional) Step 1. Align Letter Grades with Bloom's
It was for this reason that, a couple of years ago, I thought it would be clever to align my letter grading with Bloom's taxonomy. Of course, with six Bloom's levels and five letter grades, I had to be creative: I combined the highest two levels (evaluating and creating) so that each letter grade corresponded to one Bloom's level:

A: Creating/Evaluating
B: Analyzing
C: Applying
D: Understanding
F: Remembering

Alignment of my custom five Bloom's levels with letter grades


The first thing that happened after I decided to incorporate this alignment was that I realized I would have to revise my summative assessments (exams) so that I had questions for each of these Bloom's levels. At that point, I decided it would be equitable to have an equal number of points available on each exam for each of my five letter grades. In other words, my alignment of percent of points earned to letter grades became:

A: 80-100%
B: 60-80%
C: 40-60%
D: 20-40%
F: 0-20%

My framework for backward design incorporates this philosophy on grading: that letter grade should directly reflect student understanding and ability as defined by Bloom.
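
For the curious, here is how that equal-band arithmetic works out in code. This is a minimal sketch, not anything from my actual gradebook:

```python
# Minimal sketch of the equal-band mapping from percent of points earned
# to letter grade (and hence to my combined five Bloom's levels).
def letter_grade(percent: float) -> str:
    """Map percent of points earned (0-100) to a letter grade
    under the five equal 20% bands described above."""
    bands = [(80, "A"), (60, "B"), (40, "C"), (20, "D"), (0, "F")]
    for cutoff, letter in bands:
        if percent >= cutoff:
            return letter
    return "F"

print(letter_grade(72.5))  # -> "B" (Analyzing)
```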

Step 2. Identify Program-Level Goals and Learning Outcomes
As backward design requires consideration of Goals and Outcomes at the onset of course preparation, it makes sense to start with Goals and Outcomes that might already exist at a level higher than the individual course. My department has an established set of Outcomes for our undergraduate major program, as part of our assessment plan for program review. These Outcomes are based on the American Association for the Advancement of Science publication "Vision and Change," which recommends a set of Core Concepts and Core Competencies for undergraduate biology education. Our entire department mapped our major curriculum to the five Concepts and six Competencies: the instructor of each course reported whether each was "Introduced," "Emphasized," or "Mastered" in that course, which let us identify any gaps in our preparation of students.

Vision and Change


It is also possible that other goals and outcomes will be dictated by professional societies, such as the American Chemical Society. In that case, those recommendations should also be gathered at this point.

Step 3. Create Course Learning Outcomes
With disciplinary and programmatic goals and outcomes in hand, you are ready to generate the Learning Outcomes for your course. After brainstorming a list of Learning Outcomes you think are important, a useful next step is to explicitly align each one with the Outcomes from Step 2. Remember: you should be assessing (testing) your students on each Outcome, so you want to develop a relatively short list (perhaps up to ten total). A great way to decide which Outcomes are most important is to consult your mapping of your own Outcomes against the programmatic and society-driven Outcomes; get rid of those that align least well.

In relation to the Student Learning Outcome I shared (above) from my genetics course:
"Make models to predict experimental outcomes; generate null hypotheses (expected results)"
I decided that this doesn't address a Vision and Change Core Concept, but it does correlate with two Core Competencies: i) apply the process of science, and ii) modeling. Thus, I kept this Outcome in my syllabus.
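
To illustrate this pruning, here is a rough sketch of how draft Outcomes could be scored against program-level outcomes. The Outcomes and competency labels below are hypothetical placeholders, not my actual curriculum map:

```python
# Hypothetical sketch: tag each draft course Outcome with the program-level
# outcomes (e.g., Vision and Change Core Competencies) it aligns with,
# then keep only the Outcomes that align with at least one.
draft_outcomes = {
    "Make models to predict experimental outcomes": ["process of science", "modeling"],
    "Recall the stages of mitosis": [],  # aligns with nothing program-level
    "Interpret pedigree data quantitatively": ["quantitative reasoning"],
}

keep = {lo: aligned for lo, aligned in draft_outcomes.items() if aligned}
for lo, aligned in keep.items():
    print(f"KEEP: {lo}  ->  {', '.join(aligned)}")
```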

Step 4. Draft specific and measurable Student Tasks
Think of this step as writing a generic final exam, with the goal being to list a number of questions (or types of questions) that you would ask students in order to probe the depth of their understanding/ability for each of your Outcomes. This is where the backward design truly begins: aligning your Outcomes with your Assessments. This is incredibly valuable to do before the term begins, because then you have all of the resources at hand to decide what topics/assignments/readings/exercises you will need to provide students to help them understand and practice what they will be tested on.

Assessment Design


For example, one topic I teach is pedigree analysis; this process involves a need to generate null hypotheses - one of my Course SLOs. The related Student Task I wrote was:
"Define a null hypothesis for each inheritance pattern"

In sum, I wrote over 55 Tasks covering the entire semester of the course. This took me a few hours, mainly poring over old exams and reminding myself of all the favorite questions I like to ask about the various outcomes. Now I have a list of over 55 activities in which I expect students to be able to demonstrate competence by the end of the term.

(Optional) Step 5. Categorize Tasks by Bloom's
As I explained above, I like the notion of having an equal number of questions (and/or points) on each exam that tests each of my (five) Bloom's levels. Thus, after I brainstormed the Tasks, I assigned each one to a Bloom's level, based entirely on the first word of the Task, which is always a verb. Considering the verb is an expedient way to align Tasks to Bloom's.
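
As a concrete illustration, here is a minimal sketch of that verb-based categorization. The verb lists are abbreviated placeholders, not my full mapping:

```python
# Sketch: assign each Task to one of my (combined) five Bloom's levels
# based on its leading verb.
BLOOM_VERBS = {
    1: {"define", "list", "state", "label"},        # Remember
    2: {"explain", "describe", "summarize"},        # Understand
    3: {"apply", "calculate", "solve", "use"},      # Apply
    4: {"analyze", "compare", "interpret"},         # Analyze
    5: {"evaluate", "design", "create", "make"},    # Evaluate/Create (combined)
}

def bloom_level(task: str) -> int:
    """Return the Bloom's level implied by the Task's first word (a verb)."""
    verb = task.split()[0].lower()
    for level, verbs in BLOOM_VERBS.items():
        if verb in verbs:
            return level
    return 0  # flag for manual review

print(bloom_level("Define a null hypothesis for each inheritance pattern"))  # -> 1
```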

Step 6. Outline your course schedule
As I do every semester, I created a spreadsheet with one row for each day of class. Then I blocked out the class meetings where I know much or all of the time will be spent on non-instructional activities (Day 1: syllabus; exam review session days; exam days; the student ratings of instruction (course evaluation) day; holidays; and so on).

Once I know how many "instructional days" I will be working with, I decide how I will distribute the Learning Outcomes (course content) and, importantly, when I will be giving summative assessments (exams).
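
For those who prefer to script this step, here is a rough sketch of how such a schedule spreadsheet could be generated. The dates, meeting pattern, and blocked days below are purely hypothetical examples:

```python
# Sketch: one row per class meeting, with non-instructional days blocked out.
import csv
from datetime import date, timedelta

start, end = date(2017, 8, 21), date(2017, 12, 8)   # hypothetical semester
meeting_days = {0, 2, 4}                             # Mon/Wed/Fri
blocked = {date(2017, 9, 4): "Labor Day (holiday)",  # hypothetical examples
           date(2017, 10, 6): "Exam 1"}

with open("schedule.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "type"])
    d = start
    while d <= end:
        if d.weekday() in meeting_days:
            writer.writerow([d.isoformat(), blocked.get(d, "instructional")])
        d += timedelta(days=1)
```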

This coming semester, I've decided that the term will comprise four "Themes" of material:
  • Molecular Genetics
  • The Central Dogma
  • Gene Expression Regulation
  • Transmission Genetics
At the end of each Theme, I will give an exam. Continuing on with backward design, I decided which Tasks (generic exam questions) would be covered during each Theme of the course. This gives me a snapshot of the types of questions I will be asking on each of those exams.

Step 7. Edit Tasks for Consistency across the Course
Now came the fiddly part of the process (although this only took me an hour or two). At this point, I decided it would be best to keep roughly the same number of Tasks per Theme (and hence per exam). Recall that I started with over 55 Tasks; I soon realized that this many might be overwhelming for students.

On reflection, I considered that most of the mid-term exams I write have eight to twelve questions, and that I need more questions on the same content for my comprehensive final exam. Thus, I estimated I probably need between ten and fifteen Tasks for each course Theme. I went back through my 55+ Tasks and removed some, especially from areas of content where I had written more than others (this is where having mapped the Tasks onto the Course Schedule is very helpful - so that I can keep roughly equivalent and reasonable numbers of new Tasks to introduce each class meeting).

Step 8. Edit Tasks for Consistency across Bloom's
To create an exam that is fair to students at all levels of proficiency, I then looked at the mapping of Tasks to Bloom's taxonomy and aimed to have about the same number of Tasks per Theme at each Bloom's level.

For example, in my Molecular Genetics Theme, I had 17 Tasks, but I discovered that most of those questions were Bloom's levels 1, 2 and 5 - I was missing mid-level Tasks. So, at this point I rewrote some level 2 and level 5 Tasks to be level 3 and 4-type Tasks. In the end, I had:

Level:    Number of Tasks:
1            4
2            3
3            4
4            4
5            2

I have fewer level 5 Tasks than others, but these tend to be Tasks that require more time for students to complete during exams, so I decided that I didn't need quite as many of those as the levels 1 through 4.

Moving forward, each mid-term (Theme) exam will assess students on ~2/3 of the Tasks from Bloom's 1 through 4, and one level 5 question. I will create exam questions for the rest of the Tasks for the final exam.

Ultimately, my four Themes have the following numbers of associated Tasks:
  • Molecular Genetics: 17
  • The Central Dogma: 10
  • Gene Expression Regulation: 13
  • Transmission Genetics: 15
and I have Tasks aligned to my (custom) five levels of Bloom's:
  • 1: 12
  • 2: 12
  • 3: 12
  • 4: 11
  • 5: 8
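
Keeping these tallies straight while editing is easiest with a quick script. Here is a minimal sketch of the kind of count involved; the Task list shown is truncated and hypothetical:

```python
# Sketch: check that Tasks are balanced across Themes and Bloom's levels.
from collections import Counter

# Each Task is annotated (theme, bloom_level); only a few examples shown.
tasks = [
    ("Molecular Genetics", 1), ("Molecular Genetics", 3), ("Molecular Genetics", 5),
    ("The Central Dogma", 2), ("Transmission Genetics", 4),
    # ... one entry per Task, 55 in all
]

by_theme = Counter(theme for theme, _ in tasks)
by_level = Counter(level for _, level in tasks)

print("Tasks per Theme:", dict(by_theme))
print("Tasks per Bloom's level:", dict(by_level))
```
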
Step 9. Prepare for class!
Now the process of backward design is almost complete. With an explicit idea of which Tasks will be measured on each summative assessment (exam), I am now starting to compile only those resources (movies, case studies, exercises, chapter readings…) that directly support my Tasks. All of these will be listed in my Syllabus when the semester begins.

Conclusion: Benefits

Having defined 55 Tasks, I was initially inclined to provide them to students in the Syllabus, so that they would know from Day 1 what they should expect to learn in this class, what is expected of them, and how they will be assessed. Providing the list of Tasks is like giving students a study guide and, essentially, a generic final exam for the class. Another benefit to students is that my list of Tasks contains most of the critical vocabulary they will need to understand in order to succeed.

At the same time, I recognize that showing my students the full 55 Tasks on Day 1 of class might be shocking (and perhaps demoralizing) to some. So, a softer approach is in order. Here's how:
Day 1: survey class on what they expect to learn in the course
Day 2: report survey results back to the class, and then have the students map their expectations onto the Goals and Outcomes (as stated in our Syllabus). The goal here is to help students understand how their interests and/or expectations align with what I'll be asking them to do in the class
Day 3: reveal the 55 Tasks that they will need to master to succeed in the class

Finally, backward design also helps my department by ensuring that I am including assessment items (exam questions) that align with our Student Outcomes Assessment Plan, so that those data can be collected and evaluated for our regular program review.

So, why did I do all of this work during my summer break? Because I only need to do this process once, and once I am done with Step 9 (organizing my class resources), my genetics class will require much less regular maintenance and attention during the semester than it usually demands. I'm in it for the students and for my own efficiency!

1. Welcome to EduProffer!

About
I am an Assistant Professor at California State University, Fresno. I love to teach and to make a positive impact. This is probably why I love to teach teachers about teaching - it amplifies the effect. I was trained as a molecular and evolutionary biologist and geneticist, and not at all as a teacher. Everything I have learned about how to teach, aside from my observations of my own professors, I have learned since joining the Fresno State faculty. It has truly been eye-opening, and I'm hooked! As a scientist, I am trained to apply quantitative reasoning and evidence-based decision-making to my genetics research in the laboratory, and I'm just as enthusiastic about applying the same approaches to the efforts you and I make in the classroom.

History
In my first year at Fresno State, I was nominated to take part in an innovative ed tech initiative on campus: teaching classes in which the instructor and every student use a mobile device (tablet, laptop, or smartphone) to improve engagement inside and outside of class. I immediately began blogging the best practices I developed at tabletpedagogy.blogspot.com. As time has passed, I've realized that that blog's name implies a rather limited scope. I often think more broadly about effective pedagogy, and the EduProffer blog is where this content has a new home.

Concept
This blog is named EduProffer not just to indicate that it focuses on higher education (though posts will usually be more grade-agnostic than that), but also to signal that I will use this site to Proffer best practices in Education.

Format
When I conceive of or test a new (to me, at least) educational practice, I'll describe it here. Each post title will start with a serial number, to make it easy to read posts in order. I'll write posts from the perspective of whichever class I'm teaching at the time, but making my practices discipline-, grade-, and technology-agnostic is a point of pride for me. I'll do my best to ensure that posts aren't exclusively applicable to teaching biology, or teaching with technology, or teaching in higher ed institutions.

Values
In addition to this agnostic approach, I hold a number of other values dear when I consider implementing educational practices in my classes:

  • accessibility (for students)
  • affordability (for students and/or faculty)
  • engagement (of students)
  • efficiency (for students and/or faculty)
  • implementation (detailed protocols provided)
  • inquiry (enhances student question-asking)

I will self-rate (1-3, high-low) each post's relevance and impact for each of these values. I hope that this will provide a quick sense of whether a post is relevant to your own values.

Enjoy!
- E.P.