- Who are we surveying?
- Is the survey anonymous?
- When does the survey open?
- Why are we surveying students?
- What does the survey ask?
- How will the results be used?
- What survey tool will be used?
- Is the survey tool considered to be reliable and valid?
- How long will it take to complete the survey?
- Can students start, stop, and return to the survey?
- Will students receive anything for completing the survey?
- How is this different from course evaluation?
- How are we encouraging students to respond?
- What do we expect our response rate will be?
- What can faculty and staff do to help?
- When and how will results be shared?
- What kind of results will we receive and share?
- Who do I contact if I have trouble with the survey, or for more information?
1. Who are we surveying?
Most degree-seeking students who are enrolled in Winter term 2013 will be invited to participate. Students who are also staff members, and students taking only elder or Montessori credits, will not be included in the survey.
2. Is the survey anonymous?
Yes. Students will receive individual logins and passcodes, but these serve only to track who has responded and who has not. All identifying information (names, emails, etc.) will be removed by Noel Levitz, the survey company, before we receive the results.
3. When does the survey open?
The survey will open on February 25. Students will receive emails with personalized links and passcodes to the survey (note: responses will be kept anonymous by the survey company) on the 25th, and non-respondents will receive (up to 3) reminders until the survey closes on March 17.
4. Why are we surveying students?
We want to learn more about our students' experiences and satisfaction, and better support student success as a result. Survey results will provide important data to allow the university to improve.
This effort is also part of a larger undertaking to assess how well we serve students and enact our mission. Data from the survey will contribute to that assessment, along with data about enrollment and financial aid, instructional quality (including student course evaluations and faculty evaluations), and staff and alumni satisfaction, among other things.
5. What does the survey ask?
The survey asks students questions about their overall satisfaction and the reasons they enrolled at Marylhurst, as well as for feedback regarding specific services and supports available at Marylhurst such as advising, career services, financial aid, etc.
The survey includes 70 questions (including 20 crafted by Marylhurst) that ask students to rank the importance of and their satisfaction with aspects of their experience at Marylhurst. Following those questions, students will be asked about their decision to enroll at Marylhurst, a few overall satisfaction questions, and a series of demographic questions.
6. How will the results be used?
We are conducting the survey to learn more about our students' experiences so that we can better support student success. We'll examine the results to look for opportunities for improvement, as well as successes that might highlight what is working well for our students. University leaders will consider the survey results in making decisions about student support efforts, and in ongoing assessment of how well we serve students and enact our mission.
7. What survey tool will be used?
An advisory group of representatives from across the Marylhurst community selected the survey tool -- the Adult Student Priorities Survey (ASPS). The group determined that the ASPS is the best tool to help us learn about our students' perspectives and satisfaction given our institutional goals at this time. The ASPS is also intended for use with undergraduate and graduate students, which was an important consideration.
For more information about the tool, see the website of Noel Levitz, the company that designed and administers the survey.
8. Is the survey tool considered to be reliable and valid?
Yes. Cronbach's coefficient alpha was .93 for the importance scores and .90 for the satisfaction items. The test-retest reliability estimate was .82 for mean importance scores and .81 for mean satisfaction scores. The validity of the Adult Student Priorities Survey has also been tested via both quantitative and qualitative measures: correlations with other tools and with interview responses indicate that the tool measures what it was designed to measure. More information about the reliability and validity of the tool is available upon request (email Kim at email@example.com).
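For readers curious about what a coefficient like ".90" means, here is a minimal sketch of how Cronbach's alpha is computed. The ratings below are made up for illustration; the actual coefficients come from Noel Levitz's own validation studies, not from this calculation.

```python
def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per survey item,
    each the same length (one score per respondent)."""
    k = len(items)                        # number of items
    n = len(items[0])                     # number of respondents

    def var(xs):                          # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var(it) for it in items)
    totals = [sum(it[i] for it in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))

# Hypothetical 1-to-5 satisfaction ratings: 3 items, 4 respondents.
ratings = [
    [4, 5, 3, 4],
    [4, 4, 3, 5],
    [5, 5, 2, 4],
]
print(round(cronbach_alpha(ratings), 2))  # prints 0.82
```

Values closer to 1 indicate that the items on a scale hang together consistently; .90 and above is generally considered very strong for a survey instrument.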
10. Can students start, stop, and return to the survey?
No. The survey must be completed in one sitting. Students who leave the survey before completing it can return, but will have to start over at the beginning.
11. Will students receive anything for completing the survey?
In late March, we'll hold a drawing for thank-you gifts as an incentive to respond. Students who complete the survey (Noel Levitz, the survey company, will track responses for us) will be entered to win one of several $50 gift cards or an iPad mini.
12. How is this different from course evaluation?
The student survey asks about students' experiences at Marylhurst generally. It does not ask about specific courses, instructors, or even degrees or programs of study. While course evaluations help instructors and departments improve teaching and courses, this survey will help us improve other services and supports, and the student experience overall, as well as better understand what is important to our students.
13. How are we encouraging students to respond?
In addition to the drawing for thank-you gifts as an incentive to respond (students who complete the survey will be entered to win one of several $50 gift cards or an iPad mini), we'll also be using student newsletters and social media to promote the survey and encourage a strong response. Our communications with students will explain the importance of the survey and demonstrate our genuine interest in and need for their feedback.
14. What do we expect our response rate will be?
Noel Levitz, the survey company, estimates a response rate of about 20%, which is typical for online surveys in general. When Marylhurst last conducted a similar student survey (in 2005), we had about a 35% response rate. We are hoping for a response rate of about 30%, knowing that survey fatigue has grown since 2005 and that students are very busy. If we send the survey to an estimated 1,600 students, a 30% rate means we expect about 480 to reply.
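The expected-response figure above is simple arithmetic; this quick check uses only the numbers quoted in this answer.

```python
invited = 1600          # estimated number of students invited
hoped_rate = 0.30       # the response rate we're hoping for
typical_rate = 0.20     # Noel Levitz's typical online-survey estimate

print(round(invited * hoped_rate))    # prints 480 (our hoped-for count)
print(round(invited * typical_rate))  # prints 320 (at the typical rate)
```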
15. What can faculty and staff do to help?
As you talk with students, please encourage them to participate in the survey. Answer questions as you are able or direct students to the information on this website as needed. Let students know that this is an important effort for us, and an opportunity for them to share feedback that we intend to use to make improvements. If you or students have questions that you can't otherwise answer, contact Kim at firstname.lastname@example.org.
16. When and how will results be shared?
We expect to have initial results that we can share internally by late April. If you'd like to see what those initial results might look like, an example (not real data) is here. We'll also hold meetings this spring where results will be presented and discussed, and develop a summary report that we'll make available on our website.
17. What kind of results will we receive and share?
We'll receive summary results for all student respondents for each question (e.g., the average rating of the importance of a given item is 3.5) and for composite scales (e.g., the average rating for institutional effectiveness, which is composed of a number of specific questions, is 4.5). Those results will include a set of national results from all other schools that have conducted the survey, allowing us to compare our results to that group. We'll also receive summaries for subgroups of students: undergraduate vs. graduate students, each School/College (e.g., College of Arts & Sciences), and each mode of coursework (online, on-ground, etc.).
We'll share most of the results with university leadership, faculty, and staff via web-based reports and meetings during the spring. From those meetings we'll develop a summary report that we'll make available publicly on our website; that report will contain information about the most important and valuable findings, and what we're doing to respond to them (e.g. improvements we might make or additional information we need to gather).
Update! A full report of results, as well as a video summarizing what we've learned, is now available.
18. Who do I contact if I have trouble with the survey, or for more information?
Jan Dabrowski, Dean, College of Arts & Sciences: email@example.com or 503.699.6275.