# Category Archives: 2016 Spring

## CUNY Central Recommendations; Spacing Study; Online Homework

Discussing:

- Carpenter, Shana K., et al. “Using spacing to enhance diverse forms of learning: Review of recent research and implications for instruction.” *Educational Psychology Review* 24.3 (2012): 369-378.
- CUNY Office of Academic Affairs, “Best Teaching Practices” (2011)

Announcements: Funding available for STEM learning conferences (contact Kristen).

## (1) What Faculty Expect to Gain From Attending Conferences

Networking, collaborations, new research, present research & get feedback.

## (2) CUNY Central Academic Affairs

Initiative to move away from remediation as a majority of what we offer; integrate remediation into credit-bearing courses (e.g., LaGuardia & Guttman).

Is this a national trend? (We don’t know, but sense that CUNY is at the vanguard.)

If math/QR (quantitative reasoning) requirements are diluted or done away with, at what point do you cross over from being a university to being a technical college like Monroe?

The effect on graduation rates is at the heart of this.

New requirements: (1) Allow calculators on tests; (2) Give non-algebra course to non-STEM students; (3) Pass algebra students even if they fail the final.

Effects on transferability outside CUNY? Will this influence CUNY Central?

## (3) Discussion of CUNY “Best Teaching Practices” Document

Suggestion about studying in different environments; the study referenced is probably not valid as evidence for this practice.

When this was brought up to CUNY Central, their response was that they would not remove the suggestion until a study is done supporting the opposite (essentially, “prove the null hypothesis”). A local survey of students showed that those who *did* study in different environments had no better overall grades or final exam scores (unpublished).

## (4) Discussion of Carpenter Article on Spacing

Many of us thought this might support (3) above by showing that different environments may not be best. But instead it was about the *timing* of learning — gaps between presentation, review, and test.

## (5) Discussion of Options for Online Homework

(E.g.: Software packages including textbook & online homework)

ALEKS — Math, chemistry, physics (K-12 as well as college; www.aleks.com).

Challenges with using online homework/effectiveness for student success.

## First Class Visit; Nehm Paper; Dunlosky Paper

Discussing:

- Nehm, Ross H. “Understanding undergraduates’ problem-solving processes.” *Journal of Microbiology & Biology Education* 11.2 (2010).
- Dunlosky, John, et al. “What works, what doesn’t.” *Scientific American Mind* 24.4 (2013): 46-53.

Attending: Patrick, Daniel, Kristen, Emral, Shoshana.

## (1) Daniel’s visit to Shoshana’s class

They had discussed interspersing lecture and practice in 30-minute cycles (20 minutes of lecture, 10 of practice). Shoshana had been doing both, but at longer intervals; she has found the shorter intervals more effective since then.

Discussion of how the programming software shows you what each line of code is doing as it does it (jGRASP debugger).

## (2) Ross Nehm article

Issues with problem-solving skills; identifying relevant information to use to solve the problem.

Textbooks — do not much help students to *organize* new info like an expert. Could OER help?

Describes well how students need to “deactivate” irrelevant info — how can instructors help? Use in-class activities to individually diagnose & resolve with each student. *Showing* the class an incorrect response often lodges it in everyone’s mind — so that’s less effective.

## (3) “What Works, What Doesn’t” article

Useful tips to share with students.

McGraw-Hill online texts — do electronic highlighting?!

Highlighting emphasizes key terms, but ignores/makes invisible the *connections*, which is what an expert homes in on.

Self-testing (which is basically doing the homework) is what works. Teachers know this works, but do students?

Making a chart with *connections* is harder homework than the questions at the end of the chapter, but more effective.

Assess students’ mastery not just of concepts, but of *organization* & connections (expertise with the concepts). With such assessment, it can be difficult to explain to students what they did “wrong.” Students say “that’s not what I meant”; they need to articulate what they understand in a more precise, expert way.

Giving students examples of graded work prior to formally assessing them can help.

Brief discussion of physical set-up of the room — digital projection *AND* two whiteboards necessary for STEM classes. Desired: splitter for two screens (presenter view, presentation view).

## Freeman on Linear Regression, Active Learning; Pairs for Observations

Discussing:

- Freeman, Scott, et al. “Active learning increases student performance in science, engineering, and mathematics.” *Proceedings of the National Academy of Sciences* 111.23 (2014): 8410-8415.
- Theobald, Roddy, and Scott Freeman. “Is it the intervention or the students? Using linear regression to control for student characteristics in undergraduate STEM education research.” *CBE-Life Sciences Education* 13.1 (2014): 41-48.

Attending: Kristin, Tara, Jennifer, Daniel, Emral, Azure, Patrick, Shoshana.

Explained proposal to informally observe one another’s classes & teaching strategies this semester. Will assign pairs to work together.

## (1) Linear Regression Paper

In our own experience, nonrandom student enrollment and other factors such as instructor *do* affect any conclusions about effects of interventions.

In assessment, factors such as GPA might need to be collected to validate conclusions (here, at KCC).

Will adoption of these methods reduce the number of “publishable” studies (those finding an effect of interventions)? Would this be a disincentive to use these methods? (Depends on the editors of the journals & whether they’re aware of this issue.)
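The confounding problem discussed above can be sketched numerically. This is a toy simulation, not the paper's data or method in detail: all names and numbers are hypothetical, and it simply shows how ordinary least squares with and without a student-characteristic covariate (here, GPA) can give very different estimates of an intervention effect when enrollment is nonrandom.

```python
import numpy as np

# Hypothetical simulation: stronger students (higher GPA) opt into the
# intervention, so a naive comparison overstates the intervention effect.
rng = np.random.default_rng(0)
n = 200
gpa = rng.uniform(2.0, 4.0, n)
treated = (gpa > 3.0).astype(float)          # nonrandom enrollment
score = 50 + 10 * gpa + 2 * treated + rng.normal(0, 5, n)  # true effect = 2

# Naive model: score ~ intercept + treated
X_naive = np.column_stack([np.ones(n), treated])
b_naive, *_ = np.linalg.lstsq(X_naive, score, rcond=None)

# Controlled model: score ~ intercept + treated + gpa
X_ctrl = np.column_stack([np.ones(n), treated, gpa])
b_ctrl, *_ = np.linalg.lstsq(X_ctrl, score, rcond=None)

print(f"naive effect estimate:      {b_naive[1]:.2f}")  # inflated by the GPA confound
print(f"controlled effect estimate: {b_ctrl[1]:.2f}")   # much closer to the true effect of 2
```

This is the sense in which collecting GPA (or similar covariates) at KCC could validate conclusions: without the control, the "effect" largely reflects who enrolled.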

## (2) Active Learning Paper

A bit melodramatic, considering the broad definition of active learning — who doesn’t do at least some of this each lecture?

Some concepts require worked examples and relatively longer explanation (as discussed in a previous semester).

Active learning can often seem to improve learning, but it can mask a deeper misconception until later.

Experience with students analyzing probability of life on Mars, but not actually understanding what a molecule is.

“Black box” or abstract concepts are not interesting to students (by their own report).

Tangible concepts, or at least tangible models, seem to help most students. But, abstract thinking can be critical to new insights. So how do we encourage/teach/help them practice abstract thinking?

Most, maybe all beginning students just learn the “mechanics” — definitions, procedures — without necessarily deeply understanding the concepts at first.

Students in the elementary algebra class are required to “double-check” answers by plugging the solution back in and seeing whether it comes out as it should. This is shown and practiced in class, but on the exam students are perplexed about how to do or interpret the check (particularly how to *do* it). This is interesting: What is the issue? Can we identify it? Do they really understand what “equals” means? This is a major problem in math, and a predictor of success. Focusing on “equals” in class, or focusing on checks, has not made a difference yet.
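The check-by-substitution step can be made concrete. A minimal sketch, using a made-up equation (not one from the class): substituting a candidate solution back in amounts to evaluating both sides and asking whether they are *equal*.

```python
# Check a candidate solution to the (hypothetical) equation 3x + 4 = 19
# by substitution: evaluate each side at the candidate and compare.
def check_solution(x):
    left = 3 * x + 4   # left-hand side evaluated at the candidate
    right = 19         # right-hand side
    return left == right

print(check_solution(5))  # True: 3*5 + 4 = 19, so x = 5 checks out
print(check_solution(6))  # False: 3*6 + 4 = 22, not 19
```

The equality test on the last line is exactly the step students stumble on: the check is a question about whether two values are equal, not another procedure to memorize.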

Focusing on some concept in class, including an active-learning exercise, sends a message to students that it’s important.

## (3) Pairings for Observation

- Daniel – Shoshana
- Emral – Azure
- Kristin – Pat
- Tara & Jen to visit Daniel’s M2 class

Pairs to email each other to arrange; discuss next time.

## Readings for Spring 2016

**1st Session**

- Freeman, Scott, et al. “Active learning increases student performance in science, engineering, and mathematics.” *Proceedings of the National Academy of Sciences* 111.23 (2014): 8410-8415. (Link)
- Theobald, Roddy, and Scott Freeman. “Is it the intervention or the students? Using linear regression to control for student characteristics in undergraduate STEM education research.” *CBE-Life Sciences Education* 13.1 (2014): 41-48. (Link)

**2nd Session**

- Dunlosky, John, et al. “What works, what doesn’t.” *Scientific American Mind* 24.4 (2013): 46-53. (Link)
- Nehm, Ross H. “Understanding undergraduates’ problem-solving processes.” *Journal of Microbiology & Biology Education* 11.2 (2010). (Link)

**3rd Session**

- Carpenter, Shana K., et al. “Using spacing to enhance diverse forms of learning: Review of recent research and implications for instruction.” *Educational Psychology Review* 24.3 (2012): 369-378. (Link)
- CUNY Office of Academic Affairs, “Best Teaching Practices” (2011). (Link)