
32:07
Don’t get me started on curving….

32:34
Ha ha @Brenda +1

32:52
arggghhh. “I’m in a section that did better than the others, so the impact of the curve on us will be…” With touching faith in a process involving many sections and hundreds of students.

33:55
Isn’t the point for everyone to do well on the exam? (Cough)

34:33
Brenda - great example of how the format of a test depends on your pedagogy.

36:39
And please don’t ever say “open internet”.

38:07
Were these questions ever asked prior to spring semester, for comparison on whether this was different before the move to remote?

40:06
Thank you!

40:53
To expand on Jennifer’s comment - we will also be adding questions to our upcoming Fall survey to get comparison data for the student conduct survey items that have been used in the past.

41:00
Very interesting slides and discussion. Will these slides be available after this session so we can share with colleagues who couldn't attend?

41:08
Yes @Roy

41:14
Yes, we will share slides and the recording.

42:28
Do you have this broken out by discipline? My senior capstone stats students would probably gripe about more writing :)

43:51
No, we have not disaggregated the survey results by discipline because we asked students about their experiences holistically, and they may not have been answering questions with a single course in mind.

44:01
Thanks, Alessandra

44:47
We are working on the Fall survey to try to get some more discipline-level context without adding too many questions.

49:02
Great idea to have more than one student provide feedback for other students in peer assessment

49:14
I strongly suggest faculty interested in using rubrics have a look at the AAC&U VALUE rubrics. Wonderful place to start.

50:00
Is there a Sakai solution to making student work available to other students in blinded review, when I want to pick the reviewers (instead of the Sakai random peer feedback)? I’ve been downloading everything and uploading to Box.

50:15
Then I download again, and upload back to Sakai….

50:18
@Amy, good question. Can anyone from DLI comment?

51:39
I’m new to Duke, but I’ve used Sakai at other institutions and it has peer feedback capability in Assignments.

52:44
Thanks — I’m able to let Sakai pick reviewers, but generally I want to control that (e.g., person X does something well that person Y should see, or common themes in their projects) - were you able to assign peer reviewers yourself? That would be awesome!

52:47
Also, any advice on Gradescope for peer feedback?

52:53
Sakai Assignments can do peer feedback, yes, BUT instructors cannot assign specific reviewers.

53:19
Hmm @Joan, that’s a good question - let me think about that

54:51
Is that the Cathy Davidson “students write the syllabus” technique?

55:24
For #5 - some version of learning contracts works well

55:40
@Joan, as far as I know there’s not a way to do peer review within Gradescope itself. I can check with Gradescope support to see if they know of a way that people do it.

56:29
Does Duke use Hypothes.is?

56:47
@Brenda: Yes, we have a pilot of Hypothes.is integrated into Sakai now.

57:31
DLI documentation on Hypothes.is at Duke, if you’re interested: https://learninginnovation.duke.edu/hypothesis/?mc_cid=8377d1124f&mc_eid=7109ae423e

58:07
Thanks!

58:14
@Brenda - Note that Hypothes.is is a pilot and has a few issues we’re working through.

58:21
Hypothes.is has a nice hands-on training workshop if you are interested.

58:57
@Randy, no worries. I had some trouble with its LTI integration at other places as well.

01:00:05
Regarding having students develop part of their assessment criteria I thought this was interesting: https://www.coursehero.com/faculty-club/classroom-tips/benjamin-wiggins/

01:00:31
One “hack” to use Gradescope for peer review is for each reviewer to create their own Gradescope course and then invite their peers to submit. The advantage is that Gradescope rubrics, etc., can be used for grading. The disadvantage is that it would be unmanageable for large numbers of students.

01:00:40
Do you recommend software like LockDown Browser?

01:00:56
yes

01:01:05
Thank you for this wonderful information and your insights! How will the recording be distributed?

01:01:19
sure

01:01:57
@Joan, that’s an interesting workaround for Gradescope, but I agree that it probably wouldn’t scale. There are rubric options in Sakai as well, although my colleagues know a lot more about how they work than I do.

01:02:55
Thanks so very much for this session!

01:02:55
We have a Qualtrics survey template that can be used to collect peer review feedback on projects or papers.

01:03:05
Email me if you’d like that!

01:03:22
@Deb, we will post it to the Office of Assessment website. assessment.trinity.duke.edu. We’re jotting down attendance here and I’ll try to do a direct send to participants.

01:03:32
Thank you so much, Jennifer!

01:03:39
We will also post on the Flex Teaching site.

01:04:15
I have a team of folks eager for this, and some of us somehow received a different room address. We’ll catch up as soon as we can. Again, thank you!

01:06:32
Thanks, everyone!

01:07:52
Thanks for this!

01:07:56
thanks very much for the session

01:08:00
Thanks for this session!

01:08:11
Thank you and everyone at DLI!

01:08:28
Thank you

01:08:35
Thank you

01:08:36
Thank you for this workshop! It’s helped me reconceptualize assessment for my spring courses. I will likely follow up with Learning Innovation on some of the ideas presented here.