You Asked, We’ve Answered
Ithaka S+R US Faculty Survey 2018 FAQs
Earlier this month, we were thrilled to release the Ithaka S+R US Faculty Survey 2018 at the ACRL 2019 conference in Cleveland in a standing-room-only session with 200+ conference attendees. We subsequently had the opportunity to share results from the survey via webinar on April 17th with 250+ attendees (the webinar recording is available here).
We received so many thoughtful questions and comments during these presentations that we wanted to take the time to share both the questions and our answers with a broader audience.
If you have a question about the survey findings that has not been answered below, please feel free to reach out to us directly by email or leave a comment on this post.
What can libraries take away from the survey findings?
Libraries and their staff can use these survey findings for their own strategic planning, and exactly what any individual library might wish to focus on within our findings will depend on its institutional home, user community, and other factors. Here are some of the key areas we suggest considering.
Faculty members generally prefer to be fairly independent in their data management and preservation strategies and are increasingly using cloud storage services to do so, though many would value support from their institution’s library towards these activities. What data management and preservation services is your library currently providing, and how well are these services utilized?
Further, faculty display substantial interest in reducing the costs that students pay for textbooks and other course materials and in using open educational resources (OER). Younger faculty are especially interested in using OER, creating OER, and modifying their pedagogy to be more open. What role can the library play in supporting these activities?
Beyond efforts to use the national findings as a guide, as we mentioned in our presentation at ACRL, 13 libraries ran a local version of this survey at their institutions. These local results allow a library to benchmark its faculty against the national sample and focus on the particular needs of its own campus.
What results did you find to be the most surprising?
This survey helps us see trends over time, but we also included several new areas of coverage in this cycle. It was particularly striking, for example, to see how many more scholars, especially those in the sciences and social sciences, are using Google Scholar as their starting point for research compared to the last survey cycle.
Our newer questions looked at faculty members' attitudes and behaviors around open educational resources, learning analytics, research fraud and data fabrication, and evolving scholarly communication models. One especially illuminating finding was the extent to which faculty are skeptical about the use of learning analytics tools. Neither the faculty who have used such tools nor those who have not believe that the tools will improve their teaching practices or help students who may be struggling. Humanities faculty are notably more concerned than those in other disciplines about how the use of these tools might limit their autonomy in choosing how to teach, as well as how their institution might rely on the algorithms within these tools.
How do faculty attitudes align with their behavior?
We do see some divide between faculty behavior and attitudes, particularly around open educational resources and open access publishing.
Faculty are very interested in using open educational resources (OER), yet a smaller share are actually using them. While 57% of respondents indicated they are interested in using OER, only about half have actually used any type of OER: 32% have used open textbooks, 24% have used open course modules, and 32% have used open video lectures. Why aren't more faculty taking advantage of these resources? They may not be receiving enough support. Only a very small share of faculty think their institution provides excellent training and support for using OER, or that it recognizes or rewards faculty for taking the time to integrate OER into their instructional practices.
Additionally, the majority of faculty members — and a larger share since our last survey cycle — want to see an open access publication model replace the traditional subscription-based one. Yet just a small share of faculty consider whether a journal makes its articles freely available when they decide where to publish their research. We have consistently observed over time that other characteristics of journals — often those concerned with prestige and aligned with traditional scholarly incentives — are viewed as more important in a faculty member's decision-making. Although many faculty would like to see an open access publication model take greater hold, incentives associated with more traditional publication practices, generally tied to tenure and promotion processes, often influence behavior in a different direction.
What do faculty think about e-books?
Faculty continue to prefer print books for many research and teaching activities, especially those that involve long-form reading. Younger faculty are relatively more likely to use e-books, but are also more likely to indicate that the library will need to continue maintaining hard-copy versions into the future. We have speculated that, given their relatively higher use, younger faculty may be more likely to recognize the limitations of the electronic format.
How did you define “learning analytics” and “open educational resources” in the survey?
We gave these definitions to respondents:
Learning Analytics Tools: Those that summarize and/or analyze student activities, learning, or performance, and produce for you a dashboard, early alert emails, etc.
Open Educational Resources: Teaching, learning, and research materials used for educational purposes that reside in the public domain or have been released under an open license, such as Creative Commons, that permits no-cost access, use, adaptation, and redistribution by others with no or limited restrictions.
However, while we provided these definitions to encourage a consistent understanding of the underlying terms, we also recognize that respondents may hold their own conceptualizations.
Which faculty did you group in your medical cohort?
We include in the medical cohort faculty members who teach courses on medical science topics (e.g., family medicine, anesthesiology, ophthalmology, pathology, pediatric medicine, radiology, and surgery). Faculty members teaching non-medical courses have been categorized elsewhere; for example, those teaching public health courses are rolled into the “sciences” macrodisciplinary category. We do not include faculty in the field of nursing in our sample.
For a full list of fields that were collapsed into our five macrodisciplinary categories, please see footnotes 9-12 in the full report.
Why aren’t area studies faculty included in the disciplinary analysis?
While we have included area studies faculty in our aggregate analysis, we have excluded them from disciplinary stratifications due to the low number of respondents associated with this subgroup.
What kind of free-text comments can faculty leave in the survey? Do you analyze these?
We have one field at the end of the survey where faculty can leave free text comments. While we do not report on the analysis of these comments, we do use this feedback to help shape thematic areas of inquiry for future survey cycles and other relevant projects.
How confident can you be in the results from a survey that has a seven percent response rate?
Taking into account the number of responses we received in this study (nearly 11,000) as well as the size of the broader population, we can be highly confident that the responses represent the population with a very small margin of error. To correct for relatively higher or lower response rates from faculty in specific fields, we also re-weight the aggregate responses by field so that the sample matches the population on those parameters.
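For readers who want a sense of the arithmetic, the sketch below illustrates the two ideas in this answer: the margin of error for a large sample drawn from a finite population, and post-stratification weighting by field. It is a minimal illustration, not our survey code, and the population size and field shares in it are hypothetical placeholders rather than figures from the study.

```python
import math

def margin_of_error(n, population, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion,
    with a finite population correction."""
    se = math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((population - n) / (population - 1))
    return z * se * fpc

def field_weights(sample_counts, population_shares):
    """Post-stratification weights so the weighted sample matches
    known population shares for each field."""
    n = sum(sample_counts.values())
    return {field: population_shares[field] * n / count
            for field, count in sample_counts.items()}

# Hypothetical numbers for illustration only: ~11,000 respondents
# drawn from an assumed population of 150,000 faculty.
print(f"margin of error: +/- {margin_of_error(11_000, 150_000):.1%}")

# Hypothetical field counts and population shares.
sample = {"humanities": 3_000, "sciences": 5_000, "social sciences": 3_000}
shares = {"humanities": 0.30, "sciences": 0.40, "social sciences": 0.30}
print(field_weights(sample, shares))
```

With a sample of this size, even the most conservative assumption (p = 0.5) yields a margin of error of roughly one percentage point for aggregate estimates, which is why a modest response rate can still support reliable findings.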
When will the underlying dataset be available?
We have deposited the dataset and associated documentation with ICPSR for long-term preservation and storage, as we have done with previous survey cycles. It often takes ICPSR several months to process a deposit, and we will post on the Ithaka S+R blog when the dataset is available.
If you participated in the local faculty survey in the 2018-19 academic year, in the coming weeks we will provide reports aggregating results within your institution’s Carnegie Classification, against which you can benchmark your local findings.