Wednesday, March 19, 2008

Database Schema V.6



Changes:
- Removed course.department_id since it's redundant w/ subject.dept_id
- Changed section_statistics to survey_statistics and made section an org so the table holds statistics for all levels (see the sketch after this list).
- Removed target_type_id from org_question_set.
- Removed instructor_id from question_category and question_sub_category.
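
For illustration, here is a minimal sketch of what the reworked survey_statistics table might look like once it is keyed by org rather than section; the column names are assumptions, not the actual schema.

```python
# Minimal sketch (illustrative only): survey_statistics keyed by an org, which
# can now be anything from a campus down to an individual section.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE survey_statistics (
        org_id         INTEGER NOT NULL,  -- campus, college, division, dept, or section
        semester       TEXT    NOT NULL,
        question_id    INTEGER NOT NULL,
        response_count INTEGER,
        mean           REAL,
        std_dev        REAL,
        PRIMARY KEY (org_id, semester, question_id)
    )
""")
```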

ASUH

I had a meeting with a representative of ASUH today. The conversation covered privacy concerns and participation rates, with the focus on the latter.

Ideas and new features:
1.) Allow instructors to know who did surveys so they can offer extra credit or other incentives. As long as we make it clear that the responses can never be tied back to the individual, privacy shouldn't be a concern.
2.) Allow instructors to add their own comments in response to what students write on a survey when publishing. This may encourage instructors to publish, knowing that they can state why they declined to show a certain response. It may also help students understand where their instructor is coming from and why they do things the way they do.
3.) During the time period that eCAFE is open to students, add a link to eCAFE on the main UH site and in the portal.

Action items:
1.) Talk to Accounting dept. head regarding their current practice of giving multiple surveys in a class: one for the instructor, one for the department, and one for the college. Discuss if it's better to put all in one survey, or break the questions for each organization out into a survey answered only once by the students under them. ASUH will establish initial contact.
2.) Talk to Marketing or Statistics departments regarding setting up a focus group with students to gather feedback on ideas to increase participation. ASUH will talk to an instructor in the Marketing department to assess interest.

Email #2 to instructors

The following emails were sent on Wednesday, March 12th. The first one was sent to instructors who are required to give the survey. The second was sent to instructors who had the option to give or not give the survey. The departments determined who is optional and who is mandatory.

These were the second emails sent to instructors; the first was forwarded to them through their department representatives.

-----------------------

Email sent to mandatory instructors:

The new online Course And Faculty Evaluation System (eCAFE) is open for use. Please log in to eCAFE at http://www.hawaii.edu/ecafe using your UH username and password to set up your surveys.

Setting up your surveys involves adding your own questions to those already selected by your campus and department. If you have set up a survey in the past, you can choose to copy those questions into a current survey and edit as desired.

Your students will start using the system on April 21st. Prior to this date, you will receive an email providing instructions on the eCAFE system, which you should forward to your students.

You can also indicate if you want to publish your results at the end of the semester. Publishing your results means that the aggregate results will be displayed through a link on http://www.hawaii.edu/ecafe for all to see. You are not required to publish, and if you decide to publish, you will have the ability to decline showing individual open-ended responses. You can also revoke your decision at any time.

Why would you want to announce that you intend to publish? Because your students will be told of your intent, which gives them an incentive to fill out the survey. This will increase the number of surveys completed, and provide an alternative to the now popular http://www.ratemyprofessors.com, sans hotness ratings.

You have until Friday, April 4th to make your changes. After the 4th, the system will close and you will not be able to make any further changes.

eCAFE should not be used in combination with the paper-based surveys. If you receive the application for a paper survey, please let us know as soon as possible by email or phone ().

As this is a pilot test, any and all feedback is greatly appreciated. Please feel free to email us at cafe-support-l@hawaii.edu.



--------------------

Email sent to non-mandatory (optional) instructors:

The new online Course And Faculty Evaluation System (eCAFE) is open for use. Please log in to eCAFE at http://www.hawaii.edu/ecafe using your UH username and password to set up your surveys.

Your department has specified that your participation in the eCAFE pilot test is voluntary. If you don't want to participate, you must log in to http://www.hawaii.edu/ecafe and click the Disable button next to the survey(s). If you choose to Disable, no further notices will be sent to you or your students, and you can ignore the rest of this email. Otherwise...

Setting up your surveys involves adding your own questions to those already selected by your campus and department. If you have set up a survey in the past, you can choose to copy those questions into a current survey and edit as desired.

Your students will start using the system on April 21st. Prior to this date, you will receive an email providing instructions on the eCAFE system, which you should forward to your students.

You can also indicate if you want to publish your results at the end of the semester. Publishing your results means that the aggregate results will be displayed through a link on http://www.hawaii.edu/ecafe for all to see. You are not required to publish, and if you decide to publish, you will have the ability to decline showing individual open-ended responses. You can also revoke your decision at any time.

Why would you want to announce that you intend to publish? Because your students will be told of your intent, which gives them an incentive to fill out the survey. This will increase the number of surveys completed, and provide an alternative to the now popular http://www.ratemyprofessors.com, sans hotness ratings.

You have until Friday, April 4th to make your changes. After the 4th, the system will close and you will not be able to make any further changes.

eCAFE should not be used in combination with the paper-based surveys. If you receive the application for a paper survey, please let us know as soon as possible by email or phone ().

As this is a pilot test, any and all feedback is greatly appreciated. Please feel free to email us at cafe-support-l@hawaii.edu.

Friday, March 14, 2008

Want to request a feature?

If you want to request a feature, please answer the following questions:

1.) What type of user is the feature for (Staff, Instructor, Student, or Administrator)?

2.) What should the feature do?

3.) What page would the feature be added to?

4.) What sort of forms/buttons are needed to make it work?

5.) What happens when you submit the form or click the buttons?

6.) How many people would be using this feature? (estimate)

7.) How frequently would this feature be used? (ex: every login, once a semester, once ever, etc)

8.) How important is this feature? (ex: Not very, somewhat, a deal breaker, etc)

9.) How tech-savvy is the typical user of this feature?

10.) What other details would a developer need to know to make the feature work the way you envision?

Please include your contact information so we can reach you to discuss the possibilities. Feel free to email your contact information if you don't want to post it.

Thursday, March 13, 2008

Spring 2008 Usage Dates

Staff: January 21 - February 15
Faculty: March 3 - April 4
Students: April 21 - May 9
Results: May 21
Publish: June 9

Friday, March 7, 2008

Database Schema V.5



Changes:
- Made subject into an org to handle ICS/CIS/LIS issue.
- Removed error_report table.
- Removed target_type table.
- Removed section.start_date/end_date.
- Moved section.crosslist to own table.
- Added instructor_id to section_response_set.
- Added missing course.department_id.
- Reevaluated keys/primary keys/indexes.

Thursday, March 6, 2008

eCAFE Specifications Version 2

E-Café

I hope to post a link to a doc file so this can be downloaded, but until I have a place to put the file, you can email me and I'll send it to you as an attachment.

All images can be viewed full size by right clicking on them and selecting "View in new tab/window."

Red: An assumption that needs to be looked into.
Blue: Future feature. Nice to have, but not a version one necessity.
Green: Recent changes

The site can be accessed by instructors, students, and designated staff members (ex: department secretary) by logging in to www.hawaii.edu/ecafe.

Index.html:
When the user goes to the eCAFE site, they will see the index.html page which will give a brief explanation of the system and a form through which the user logs in. Once the user logs in, their permissions will be checked, and they will be sent to the appropriate page for their role or roles (instructor, staff, student).

The time of year changes what users will see. For example, staff can edit surveys in the period consisting of roughly the first quarter of the semester. Outside of that time period, the editing functions will not appear. This applies to all user roles in various ways.

Staff: (Staff.html)

Upon logging in, the staff member sees any past surveys and a single current-semester campus, college, division, or department survey. Note: I will use the word organization to mean any of campus, college, division, or department hereafter.

From here, the user decides if they want to edit the survey for this semester, view current or past surveys, input the instructor-specific settings, or override the instructors for concurrent courses.
Survey questions/options are editable by the staff for the first ¼ of the semester. During this time, the edit, update, override, and clone buttons are available. Outside of this time period, staff are limited to viewing the questions that appear(ed) on all past and current surveys.



Upon selecting the “Edit Instructor Survey” button, the user is taken to a page that shows them a list of questions they can select from. This page also shows which questions all higher organizations (campus, college, etc.) have already selected; those questions are placed in a separate section and highlighted in green.

To select the questions for their own organization, the staff member checks the boxes next to the questions they want and clicks “Save.” With this action, the user is setting which questions are going to appear on the surveys of all instructors in their organization and the organizations under them.



For example, suppose the designated staff member for the Manoa campus selects the question “What is your overall rating of this Course?”, the designated staff member for the College of Arts & Sciences selects “Which aspects of this course were most valuable?”, and the staff member of the ICS department selects “Would you recommend this course to others?” All three questions will appear on the survey of an ICS instructor, while an instructor in the Art department would only get the campus and college questions.
Note that cross-listed courses are a special case. Since a cross-listed course is under multiple departments, the students will see a survey containing questions from all involved departments and their associated colleges and divisions.
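
As a rough sketch of how these selections combine on a single survey (including the crosslisted case, where several departments contribute), something like the following could apply; the function and data names here are hypothetical, not part of the actual system.

```python
# Hypothetical sketch: build one survey's question list from the selections made
# at each organizational level, unioning departments for crosslisted courses.
def questions_for_survey(campus, college, departments, instructor=()):
    seen, combined = set(), []
    for source in [campus, college, *departments, list(instructor)]:
        for q in source:
            if q not in seen:          # drop duplicates, keep first occurrence
                seen.add(q)
                combined.append(q)
    return combined

# The Manoa / Arts & Sciences / ICS example from above:
campus  = ["What is your overall rating of this Course?"]
college = ["Which aspects of this course were most valuable?"]
ics     = ["Would you recommend this course to others?"]
print(questions_for_survey(campus, college, [ics]))  # all three questions
```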

Eventually, there will be a means to create a survey specific to TAs, but for now, each organization is limited to creating only one set of survey questions, which all instructors under them will see on their surveys.

Staff, Clone.html
In setting up the current semester’s survey, the staff member can choose to clone the questions from a past survey. Next to each past survey is a button labelled “clone”. This button takes the user to a page showing all questions from that survey. There is a checkbox next to each question. The staff member can select all or some of these questions and click “Clone” which will cause the selected questions to be copied to the current semester’s survey.

Note that although this action copies the questions from a previous survey, the user can choose to edit the current set of questions after cloning. This means the user can copy questions from a past survey and then go to the edit page to add new ones to the copied set. The newly cloned questions will show up as being already selected. Also, the user can choose to clone questions from multiple past surveys. The system will automatically remove duplicate selections so a question isn’t repeated.



Staff, View.html:
To see questions that were on past surveys, the staff member clicks on the “View Survey” button on the main page. This shows the user the survey exactly as the students see it, sans “submit” button. The only other action allowed on the view page is through the clone button, which takes the user to the same clone page described above.

Staff, Settings.html:
The Staff member must also choose whether or not to set restrictions on what the instructors can do with their surveys. Some departments want the surveys taken unaltered; others allow instructors to add questions. If they allow the instructors to add their own questions, they can choose to limit the number the instructors can include.

The user can change these options by clicking on the “Update Settings” button on the main page. They are taken to the settings page shown here. At the top of this page is a checkbox allowing the user to set whether their instructors are allowed to add their own questions to the survey in addition to the ones set by the organizations. In addition, if instructors can add questions, the staff member can place a limit on the number of questions the instructor may include.



The lower half of this page is where the instructors are set to mandatory or optional. An unchecked box means the instructor is optional. This means that the instructor will have a choice of whether or not to present their surveys to their students. What this means for the instructor will be discussed in detail in the instructor section of this document. A checked box means that the instructor is mandatory and has no choice with regard to the students receiving the survey. The system will present it to students automatically. Note that x99 courses are always exempt from mandatory surveys. Even if an instructor is selected as having a mandatory survey, any x99 course they teach will be exempt.

Any person teaching a course under that organization will appear in this list of instructors, even if the instructor is technically not employed by that organization. This means that a single instructor can appear in the lists of multiple organizations. This also means that if a course is crosslisted, the instructor teaching the course will show up in all departments for that course. For example, if a course is crosslisted as W.S. and HIST, both the W.S. and HIST department staff members will see that instructor on their settings list.


Since both departments’ settings will apply to the same instructor, and those settings may conflict, we are adding a page where the departments can indicate which department is “primary” for a given course. On the page, the staff member will see a list of all courses crosslisted with their department. Next to each course is a checkbox: “This department is the primary funder of this course.” Whichever department checks that box is considered primary, and their settings will apply to that instructor for that crosslisted course.
Once set, all other departments for that crosslisted course will see, instead of a checkbox, a label stating “Department ‘X’ is the primary for this course; if this is in error, please contact …”

For example, Psychology 385 (Consumer Behavior) is crosslisted with Marketing 311. If the Staff member for Psychology logs in before Marketing and sets PSY to be primary for course 385, then the Staff member for Marketing will see "Psychology is the primary department for this course." If this is incorrect, the Marketing Staff member will need to contact Psychology to work things out.


Since Psychology is the primary, when the instructor logs in, they will get whatever settings were placed by that department. So if Psychology set all instructors to optional and limited them to 5 questions, while Marketing set all instructors to mandatory with a limit of 10 questions, then the instructor will be optional with a 5 question limit.

If no one sets their department to be the primary, then the instructor will get the most restrictive settings of the crosslisted departments. So in the above scenario, the instructor will be mandatory and be limited to 5 questions for that course.
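
A minimal sketch of this resolution rule, assuming hypothetical field names: the primary department's settings win outright, and otherwise the most restrictive combination applies.

```python
# Hypothetical sketch of the crosslisted-course settings rule described above.
from dataclasses import dataclass

@dataclass
class OrgSettings:
    mandatory: bool       # True = instructor must give the survey
    question_limit: int   # max instructor-added questions

def resolve_settings(dept_settings, primary=None):
    """dept_settings maps department name -> OrgSettings for one crosslisted
    course; `primary` is the department that checked the primary box, if any."""
    if primary is not None:
        return dept_settings[primary]
    # No primary set: mandatory if any department says so, and the smallest
    # question limit applies (the most restrictive combination).
    return OrgSettings(
        mandatory=any(s.mandatory for s in dept_settings.values()),
        question_limit=min(s.question_limit for s in dept_settings.values()),
    )

# The PSY 385 / MKT 311 example: PSY optional with 5 questions, MKT mandatory with 10.
settings = {"PSY": OrgSettings(False, 5), "MKT": OrgSettings(True, 10)}
print(resolve_settings(settings, primary="PSY"))  # optional, limit 5
print(resolve_settings(settings))                 # mandatory, limit 5
```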



TODO: What happens at the different organizational levels? It should never happen that the college and the department will both be messing around with settings, but it might. If the College of Arts and Sciences sets all instructors as mandatory, but the Art department sets them as optional, whose settings should apply? I’m assuming the department’s, but is that correct?

TODO: Each organization’s staff member needs to see the contact information of the staff members above and below them (if any) in case of disputes.



Staff, Concurrent.html:

The final action available to a Staff member is through the “Override concurrent courses” button. This takes the user to a page where they can change the instructor of a concurrent course.

Concurrent courses are multiple sections of the same course listed under the same instructor, where the individual sections are actually taught by a TA. On this page, the staff member can select from a list of TAs so that the survey will be for the actual instructor (the TA) rather than the listed one.



Instructor: (Instructor.html)

When instructors log in, they are taken to their main page. From this page, they can edit, view, or enable/disable upcoming surveys, view open surveys, and see the results of past surveys.



Instructor, upcoming surveys:
Upcoming surveys are those which have not yet opened for students to take. They are in the editing stage. The instructor can edit their surveys during the two quarters of the semester following the staff’s editing period. After that, no further changes are permitted.

When an instructor with upcoming surveys logs in, they will see each of their courses listed. The buttons they see next to those courses vary depending on the settings provided by staff members.

If the instructor is allowed to add questions to their survey, they will see an “Edit Survey” button, which takes them to the edit page (discussed below). If the instructor is not permitted to add questions, they will see a “View Survey” button instead. The view button shows them the survey exactly as the students will see it, minus the “Submit” button. They will not be able to make any changes to it.

When an instructor has an optional setting, meaning they have the choice of whether or not to give the survey to their students, they see an additional button, either “Enable” or “Disable.” Instructors who are mandatory will not see these buttons. All surveys are enabled by default, so optional instructors will initially see a “Disable” button next to each course’s survey. If the instructor does not wish to give the survey, they have to select the Disable button. Once disabled, the Disable button will be replaced by an “Enable” button should the instructor change their mind. In short, all optional instructors are opted in by default and must opt out if they do not wish to participate. If they are willing to participate, no action is required.

TODO: Should we give organizations the ability to set whether optional instructors are opted-in or out as their default setting? Or should everyone be opted in as I wrote?

Instructor, open surveys:

Open surveys are those which are available for students to fill out. Once open to responses, the instructors will only see a “View Survey” button which takes them to an uneditable view of the survey. Next to each open survey on the main page is a count of how many students are in the class and how many of those students have taken the survey. This shows only the survey, not any of the results being collected.

Instructor, completed surveys:

These are surveys given in past semesters. They are no longer open for students to fill out. Instructors can choose to view the results, see the survey (uneditable view), email their results to someone, save them as a PDF, publish/unpublish the results, and clone the survey to another survey. The image of the instructor’s main page is missing two buttons: there should be one labelled “Clone” next to each of the completed surveys, as well as a “Save to PDF” button.

The “View Survey” button for completed surveys has the same effect as it did for upcoming or open surveys: it shows the user an uneditable view of the survey looking exactly like the students saw it, sans Submit button.



The button “Email Results” takes the user to a form through which they can send an email containing a link to their summarized results (discussed in the results section later). The link will allow the recipient(s) to see the aggregate results only, not the individual surveys. The recipient will not have to log in to see the results, so no account is needed.



“Save to PDF” automatically downloads your aggregate results to your computer in PDF format. The file is named <course>-<semester>.pdf, so ICS 414 in the Fall 2007 semester will result in a file titled “ICS 414-Fall 2007.pdf.”


Instructor, Clone Survey:

Instructors can choose to copy their questions from any past survey into the current semester’s survey.

On the main page, there is a “Clone” button next to each past survey. Clicking that button takes the user to a page showing all their personally selected questions from that survey. This list will not include questions set by organizations.

The instructor can select any or all questions, indicate which course’s survey(s) they want to copy the questions into, and click “Clone.” All checked questions will be copied into the other surveys. Should the user want to make further changes to the upcoming surveys, they can edit them via the Edit button on the main page, as described earlier.



Instructor, Edit survey:

Clicking the Edit button (if provided) next to an upcoming survey on the main page takes instructors to a page where they can add questions to and remove questions from their survey.

The data shown at the top of the page include the following: campus, department, CRN, course name(s), section number(s), instructor name, semester, year, and survey open/close dates. If there are multiple sections of a course assigned to a given instructor, all of them are given the same survey, so the instructor will see multiple courses and sections here in that instance.

This will be followed by a set of questions they can select from in order to set up their survey. Any questions set by their organization or others above them will be displayed as already selected. To add questions to their survey, the instructor checks the boxes next to the questions they want and clicks the “Save” button. If they decide to remove a previously selected question, they uncheck its box and save. While instructors can remove questions that they themselves have selected, they cannot remove questions selected by any organizations.
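
For illustration, a small sketch of that save rule together with the organization's add-question limit from the settings page; the names and the limit handling are assumptions, not the real implementation.

```python
# Hypothetical sketch: save an instructor's edits without ever dropping
# organization-selected questions, honoring the department's add-question limit.
def save_survey_edit(org_questions, checked_questions, question_limit=None):
    org_set = set(org_questions)
    instructor_added = [q for q in checked_questions if q not in org_set]
    if question_limit is not None and len(instructor_added) > question_limit:
        raise ValueError(f"limit is {question_limit} instructor-added questions")
    # Org questions always appear, whether or not they were left checked.
    return list(org_questions) + instructor_added
```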


The previous version of eCAFE had two instructor features that will not be included in the new one: the ability to create their own questions, and the ability to set the order of those questions. See the links on those topics for the reasons why.





Instructor, Publish/Unpublish/Edit Published:

Publishing results means the instructor agrees to have the aggregate results of the survey posted for all to see. Two weeks after the instructors get their results, anyone will be able to go to the published results page and see the names of all instructors who published their results. Clicking on a name will show all published survey results for that instructor. The multiple-choice questions are aggregated; no one but the instructor ever sees the individual survey responses. Open-ended question responses are included too, at the instructor’s discretion.





When an instructor clicks the “Publish” button, they see a page with some explanatory text, and a list of all open-ended responses to their survey questions.

Aggregated responses to multiple choice questions are included in published results by default, but since students sometimes include questionable material in their answers, we allow instructors to select which open-ended responses they will/won’t show.

Each open-ended question is displayed along with a list of the students’ responses, and there is a checkbox next to each response. The instructor unchecks the boxes of the responses they do not want to have displayed, and clicks “Save and Publish.” At that point, the instructor’s name is added to the page of all published instructors.



While instructors can decide not to show some of the open-ended responses, a note will appear next to the results stating something to the effect of “showing x out of y responses,” so there is a disincentive to show only positive responses.
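
A tiny sketch of how the published open-ended section and its disclosure note could be assembled; the structure shown is assumed for illustration.

```python
# Hypothetical sketch: publish one open-ended question, dropping unchecked
# responses and attaching the "showing x out of y responses" note.
def published_open_ended(question, all_responses, unchecked):
    hidden = set(unchecked)
    shown = [r for r in all_responses if r not in hidden]
    note = f"showing {len(shown)} out of {len(all_responses)} responses"
    return {"question": question, "responses": shown, "note": note}
```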

Student: (Student.html)

Students can fill out the surveys starting three weeks prior to finals and ending on the Friday before finals.

At the start of this period, all students who are registered for courses that are giving surveys are sent an email. These emails will repeat each week of the open period until the student either completes all their surveys or opts out. Note that there is no way for a student to opt out of receiving the eCAFE notices during this period other than opting out of the surveys themselves, although they may just end up putting us in their spam filter.

If the student logs into eCAFE outside of this period, they are shown a message stating that eCAFE is not open for surveys and showing the dates that it will be or was open.

When a student logs in during this period, we show them the student.html page regardless of whether or not they have any surveys. If there are no surveys, we show them a message to that effect.

If they do have unfinished surveys, they can do any of the following: complete the surveys, opt out of specific surveys, or opt out of all their surveys. Should they choose one of the opt-out options, they will have to confirm a prompt before the action is put into effect.



Opt-out means that they choose not to do the survey. If selected, they will no longer receive nag notices. For the survey’s records, we mark the survey as opted-out so it doesn’t factor into completion rate statistics.

Classes with three or fewer students will not be given a survey, as there are not enough students in the class to provide a reasonable expectation of anonymity.
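
Putting those two rules together, a minimal sketch (the status values are assumptions): small classes get no survey, and opt-outs are left out of the completion-rate statistics.

```python
# Hypothetical sketch of the two rules above.
def survey_is_given(enrollment):
    """No survey for classes with three or fewer students (anonymity)."""
    return enrollment > 3

def completion_rate(statuses):
    """statuses: per-student values like 'completed', 'pending', 'opted_out'.
    Opted-out students are excluded from the completion-rate statistics."""
    counted = [s for s in statuses if s != "opted_out"]
    if not counted:
        return 0.0
    return sum(s == "completed" for s in counted) / len(counted)

print(survey_is_given(3))                                      # False
print(completion_rate(["completed", "pending", "opted_out"]))  # 0.5
```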

If a student does have surveys to complete, and selects the “Do Survey” button, they are taken to the survey.html page.

Student, Survey.html:
This is where students fill out their surveys and submit them. The page shows the course and instructor data, followed by the questions.

Cancel will prompt the user, telling them any questions they have filled out will be lost. Assuming the user says ok, the survey is abandoned, its status unchanged, and the student is returned to the student.html page where the survey still shows up, complete with “Do Survey” and “Opt-out” buttons.

The submit button causes the survey to be stored in the database. Clicking submit will also cause the survey to be marked as completed on the student’s record. They will then be returned to the student.html page, where the just-finished survey will be marked as completed and will no longer have any buttons next to it. Surveys are final; once submitted, they cannot be retaken.



Instructor, Survey Results:

This is where instructors view their students’ responses to a survey. This data will not be available until the day after grades are due, at which point, instructors will receive an email reminder telling them the results are ready for viewing.

When they first click the “View Results” button of the main page, the instructor will see an aggregate view of all their results.

For each multiple-choice question, it will show the number of students who answered, the number of students who selected each possible answer, the percentage of respondents who selected each possible answer, the mean, the standard deviation, and a comparison between their results and those of everyone in their department or campus who asked the same question. If no one else asked the same question, then the three rows of campus, department, and class will all have the same data. Open-ended responses are shown in a list format.
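
Here is a rough sketch of the per-question aggregate computation, assuming answers are stored as 1-5 choice values; it is illustrative only, not the actual reporting code.

```python
# Hypothetical sketch of the aggregate view for one multiple-choice question,
# assuming answers are recorded as 1-5 choice values.
from collections import Counter
from statistics import mean, pstdev

def aggregate_question(answers, choices=(1, 2, 3, 4, 5)):
    counts = Counter(answers)
    n = len(answers)
    return {
        "responses": n,
        "counts": {c: counts.get(c, 0) for c in choices},
        "percentages": {c: round(100.0 * counts.get(c, 0) / n, 1) for c in choices},
        "mean": mean(answers),
        "std_dev": pstdev(answers),
    }

# The class row would be compared against the same computation run over the
# department's and campus's answers to the same question.
print(aggregate_question([5, 4, 4, 3, 5]))
```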





From the aggregate view, there is a means to see the sets of responses given by each student, similar to having a stack of paper survey responses to flip through. By clicking on the “Individual” link, the instructor will see all responses of one of their students, and can see the others by using a drop-down menu or clicking on the “previous” and “next” links.



Manager (manager.html):

A Manager is someone designated by their organization to have the ability to see the results of all instructors whose surveys are set to mandatory. There can be multiple people with this permission, and one of them is likely to be the organization secretary.

Means of viewing results by non-instructors:
There are a number of ways that others will be able to see the aggregate results of an instructor’s surveys.
1.) If the instructor was set as mandatory for their organization, then their survey(s) will be available to specific people pre-determined by the department. In this case, the instructor will see a list of who will have access to their results on both the setup and results pages, so they can protest if this is in error. Viewing results in this manner is limited to people with the Manager role.
2.) If the instructor emails them a link to the survey results. Any recipient of the email can see these results; no account or login is necessary.
3.) If the instructor has granted them permission to view their results. This feature needs more definition, but the idea is that the instructor will be able to enter the UH usernames of individuals who have ongoing access to their aggregate results. While this permission can be revoked by the instructor at any time, the intent is that they can set who is allowed to see their results and have that permission remain in effect for ongoing semesters. The viewer must log in to eCAFE in order to see the results.

Misc Future Features:
- Add an open-ended question at the end of all surveys to get general comments.
- Add a “rate instructor on a scale of 1-5” and use that to generate “stars” for the publish page.
- Include means to NOT show dept and campus statistics on an individual’s summary page.
- Add links on site to research articles re: response rates and online surveys
- Add link to blog from main site.
- Make sure an individual can use the site even if their organization is not using it.

Big changes from past version:
- Changed handling of cross-listed courses, again.
- Removed instructor ability to create new questions and to order those questions.
- Removed staff ability to reorder department questions

Ideas to Increase Response Rates

Reading through the experiences of other Universities, I've come to the conclusion that we need to take a gamble this semester. Since we've restricted the pilot test to the original four departments on the Manoa campus, I think this is an ideal time to try.

The idea is that we need to provide incentives. Last semester, all we offered was a suggestion to instructors to offer a reward (of their choice) if their class reached a certain participation percentage. For example, a KLS class was told they would have a "play" day instead of a serious work day if they reached 80% participation.

I think we should try offering individual-based incentives. Some examples:


  • Students must complete at least one evaluation before they are permitted to use the campus wireless connection for the day.

  • Blank out or delay grades for classes that they didn't do an evaluation for.

  • Allow students who completed all their evaluations to register one day earlier for the next semester than they would have otherwise.

  • Enter a contest to win something (an iPod?)

  • Allow instructors to see who completed a survey so they can offer bonus points. Maybe we can add a setting where the students can say they're willing to let the instructor see that they did a survey (although it wouldn't be associated with the results)?



The problem that all these ideas share is that students may feel threatened: they may believe that if we know they did a survey, then we know *which* survey they did, and the instructor can find out. Is there a department on campus that might be interested in doing focus groups/research as a class project?

While this is a valid concern, the experiences of other Universities strongly suggest that higher response rates will not happen without some or all of these measures in place. The main objection to the system is the low response rates (~33%); we have to try every option to increase this number.

If none of these are considered viable, then other ideas are:

  • Instructors promise to publish if the response rate reaches a certain percentage

  • Announce the system on the campus portal/main page



I hope to add to these lists if anyone comes up with other ideas.

Wednesday, March 5, 2008

Online Evaluations at Other Institutions

The following is a summary of various experiences reported by other Universities that have or are in the process of transitioning from paper-based course and faculty evaluations to an online method.

I was originally given a list of universities that are considered our peers or benchmarks, but as I went through the list, it appeared that either many of them have no online course evaluation method, or their site just wasn't getting me to what I was looking for. Either way, it became apparent that going to each school individually was inefficient, so I went another route. I went to Google and typed in "experiences with online course evaluations." I figure we can learn from others, even if they aren't on the list.

Yale Law:
They started online evaluations in 2001. Since then, they've revamped the system twice and added incentives. As a result of these changes, they've managed to get a 90% response rate as of the Spring 2005 semester.

It appears they've gone through some of the same learning curve we are in the middle of. Like we did, the first semester they sent general email reminders to their students and offered no incentives, resulting in a response rate of 20% (we had ~30%).

For their second semester, they did a redesign, this time gathering input from student representatives and faculty (we're doing this now). They shortened the survey from 18 questions to 8, added a comment section, and included (dis)incentives. For example, they blocked students from seeing their grades for classes for which they did not complete an evaluation. They also experimented with using class time for the surveys. Using class time resulted in a roughly 90% response rate while blocking grades came in about the same.

The lessons learned:

- Keep the surveys short.
- Provide (dis)incentives
- Use automated reminders.

While eCAFE already does the third item, we can explore the other two. The hardest to accomplish would be survey length, given the resistance we've already encountered on standardizing a campus-wide question set. I've seen surveys as long as 50 questions. Perhaps we can force a limit?

Originally, we considered blocking students' access to their grades, but we decided it wasn't feasible given the many entry points to the grades. Perhaps we need to look directly at how the grades are served up instead. If all entry points go to the same location, can we change the code to hide certain course entries if eCAFE says an evaluation wasn't completed? There is a problem with this in that it won't show results until the next semester. We can tell students that they won't get access to their grades, but they'll ignore us, and then complain bitterly when access is denied. At that point it's too late; they can't do the survey anymore, but hopefully they'll remember for next semester. So, if we implement this, we shouldn't expect too much of a change until the following semester.

University of Denver: Sturm COL Experience

Anecdotal experiences are similar to ours: students are conflicted; they say it's too much of a hassle (time consuming) to go online, but then they report that they like how the online option frees up class time. They also went through a learning curve of initially making the survey period too short, eventually settling on the last two weeks of the semester, ending the day prior to the start of finals. They "nag" their students significantly more than we do: every other day, as opposed to our once a week.

This school listed a number of "potholes" to watch out for, many of which we've already hit. One suggestion is useful: they had the same problem we did of students filling out evaluations thinking they were for a different instructor than the one they were really for. They resolved this issue by placing the instructor's name throughout the evaluation in as many places as possible. Perhaps we could put a banner every five questions or so saying "This evaluation is for instructor 'X', course 'Y'." They've also had the same issue of students asking to retract a survey. Given their matching policy of complete anonymity, they handle it the same way we do and issue a flat "Sorry, we can't help you."

I also particularly liked this problem/solution: "If you've taken on too much - give it back, if it was their job before it was online, it should still be their job." Noted.

They mention that they had incentives, but I didn't see what specifically they used. They note that all evaluations are public, although the Academic Dean can choose to hide particularly negative/slanderous comments. Perhaps that's part of it. I would like to know what they did, as their response rate was 83% out of the gate. Interestingly, their rates dropped to the low 70s in subsequent semesters. The speculation is that the novelty wore off for students. This doesn't appear to be an isolated phenomenon, as other schools are reporting this as well.

Duke University School of Law:
Duke Law started their online program for the same reason we did: the scantron machine.

I liked their idea for an incentive where a 70% response rate is required to share the results with the public. While we can't force instructors to publish their results, we can suggest this as an idea that may help increase their response rates. They also suggested delaying registration for the next semester, or withholding free printing if the student didn't do the surveys. Are there any services that we can deny access to based on this? Maybe not allow them wireless access for the day unless they've completed at least one survey?

University of Minnesota

While I'm providing a summary of this PPT presentation here, I would strongly encourage anyone interested in this topic to look at the original. It compares online to paper based evaluations and I found it extremely informative and interesting. The data was gathered through studies and focus groups. What was most fascinating to me were the comments that came out of the focus groups. I've heard just about every single comment, complaint, and point of view that the author reports. It appears that perceptions and concerns are the same everywhere, and the perceptions are not always accurate.

They also did a study where they took two classes of the same course taught by the same instructor. They had one class do the survey online and the other do it on paper. Then the results were compared to see if there were differences in the response rates, the ratings, or the distribution of responses across demographic boundaries.

They concluded that the students who felt the most strongly about a course/instructor were the most likely to complete the survey online, as opposed to paper, which got a broad spectrum of students. The response rates reflected this, with paper getting a mean completion rate of 76.7% and online getting 55.9%. The final conclusion was that although the response rates differed between the two methods, there were no significant differences in the actual ratings. The mean scores of the responses were statistically equal.

University of Mississippi

This is a rather large PPT presentation (68 slides). They report the same concerns and behaviors that we (and everyone else, it seems) have seen. Their response rates dropped to about 30%. The "middle" group of students was lost, again showing that it's only the students with stronger feelings in either direction who participate when it's made voluntary.

The next semester, they added in the ability for students with 100% completion rates to register one day early for the next semester. They also set it so students with at least 50% completion rate could see their grades right away, while those with less participation had to wait a week after grades were posted to view them. With these changes, the participation rate went up to roughly 60%. Interestingly, even with the doubling of participation, the average scores were comparable. They have also noticed that the student comments appear to be "more extensive and thoughtful."

Tuesday, March 4, 2008

Spec changes: no more user-defined questions

In the current eCAFE, we allow users to create their own questions to add to their surveys. A number of problems arose:

1.) Users didn't bother to check if the question they wanted was already in the set of previously defined questions, so they ended up creating ones that already existed, except now they can't be used for comparison to other department/campus instructors.

2.) Users created the questions, but forgot to add them to their surveys, resulting in confusion when they got their results and the questions were absent.

3.) Users created vaguely worded, multi-part, and/or confusing questions.

I want to get rid of user-defined questions. Only a small percentage (7.5%) of all participating instructors have used this feature, and many of them fall into category 1. The problems and confusion don't seem to be worth keeping it around. Instead, I propose that people can submit questions to be vetted by an organization (current CAFE staff?) and added to the general set as appropriate.

Spec changes: question order

I want to change the way questions are ordered on a survey. In the past, departments and instructors were allowed to set the order of the questions they added. The questions would appear in blocks, first campus questions, then department, followed by instructor. Each block of questions would appear in the order in which the user set them.

There is a fundamental flaw in this manner of ordering the questions. Standard survey practice is to group questions into logical chunks: for example, all student development questions should be grouped together, and all instructor-related questions should appear together. Our original method of handling this resulted in the survey bouncing around between logical groups. For example, the student would see the department questions on student development, followed by the department's course evaluation questions, followed by more student development questions as set by the instructor. This problem is further compounded when secretaries or instructors not familiar with survey theory order questions randomly without regard to category.

Research indicated that when surveys bounce around like this, survey takers are more likely to ignore subsequent sets of questions on a subject for which they have already answered questions. The change I'm proposing is that organizations (departments, etc.) and instructors will no longer be able to order their questions. Instead, they just select the questions they want asked, and the questions will appear in the order they appeared on the selection page. The questions appear on the selection sheet grouped into their logical categories. See http://www.cafe.hawaii.edu/cafe_catalog.asp for the categories and order.

For example, let's say the history department selects questions 1, 5, and 10, and one of their instructors selects questions 3, 7, and 12. When the student views the survey, the questions will be ordered as follows: 1, 3, 5, 7, 10, and 12.
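
A minimal sketch of that ordering rule, assuming question IDs reflect their position in the catalog:

```python
# Sketch: present the union of all selected question IDs in catalog order,
# rather than in per-organization blocks.
def order_questions(catalog_order, *selections):
    selected = set().union(*map(set, selections))
    return [q for q in catalog_order if q in selected]

catalog = list(range(1, 13))   # questions 1..12 in catalog order
history_dept = [1, 5, 10]
instructor = [3, 7, 12]
print(order_questions(catalog, history_dept, instructor))
# -> [1, 3, 5, 7, 10, 12]
```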

Spec changes: crosslisted course handling

During a discussion with the chair of a department that has many crosslisted courses, we arrived at the conclusion that some changes need to be made for the handling of crosslisted courses.

We allow all departments to set which instructors are mandatory vs. optional, if the instructors are allowed to add questions, and if so, how many they can add. Given this, what happens when the instructor is teaching a crosslisted course? The instructor will appear in both departments' list of instructors, and the settings may conflict. What happens when department 'A' says everyone is mandatory and 'B' says everyone is optional?

We decided that there should be a page where a staff member can indicate which department is primary for all their crosslisted courses. This means that the staff member would see a listing of all crosslisted courses in their department, with a checkbox next to each one. By checking the box next to a course, the staff member indicates that their department is the primary for that course.

Once any department has indicated that they are primary for a given course, the other departments will not see the checkbox. Instead, they will see a message to the effect of "Department 'X' is the primary department for this course."

For example, Psychology 385 (Consumer Behavior) is crosslisted with Marketing 311. If the Staff member for Psychology logs in before Marketing and sets PSY to be primary for course 385, then the Staff member for Marketing will see "Psychology is the primary department for this course." If this is incorrect, the Marketing Staff member will need to contact Psychology to work things out.

If no one sets their department to be the primary, then the instructor will get the most restrictive settings of the crosslisted departments. So if Marketing sets all instructors to mandatory and allows them to add 10 questions, while Psychology sets all instructors to optional and allows only 5 questions, then the instructor for the above course will be mandatory and be limited to 5 questions for that course.

Instructor Results Confidentiality

Our official position regarding confidentiality of the CAFE results is as follows. "All information about individual instructors and courses will be held in strict confidence. Results will be released only to the instructor participating in the service, or to his or her designee. However, each individual faculty member is viewed as a professional, under the faculty contract, with the professional responsibility of sharing the results with his/her department chair."

If a department, however, has a policy of intercepting the CAFE results for review prior to their delivery back to faculty, we hope their faculty are fully aware of and agreeable to this policy.