Frequently Asked Questions
Why did Kennesaw State University conduct a climate assessment?
Kennesaw State University (KSU) is a fifty-year-old institution. At its inception, KSU reflected a very homogeneous racial and ethnic demographic. It was not until the 1980s, with the support of the federally funded Higher Education Advancement Program, that KSU made a concerted effort to recruit and retain faculty, staff, and students of color. Since then, the racial, ethnic, gender, sexual, and ability makeup of the KSU community has expanded to reflect an increasingly diverse society. While diversity at KSU is a welcome advance in creating a more equitable and inclusive university, we recognize the need to assess diversity more systematically. In 2006, President Daniel Papp authorized the first campus-wide initiative to assess the status of diversity and equity on campus. The Diversity and Equity Assessment Initiative (DEAI), composed of faculty, staff, and students, aimed to examine organizational structure, recruitment and retention, overall campus climate, and diversity-focused teaching and research. One of the goals and outgrowths of the DEAI was to develop diversity assessments that could inform leadership decision-making. In 2012, the first comprehensive, campus-wide climate assessment was initiated.
What process was used to develop the questions?
The consultant has administered climate assessments to more than 100 institutions across the nation and has developed a repository of tested questions. To contextualize the survey for KSU, and to capitalize on the many assessment efforts already undertaken, the Campus Culture and Climate Assessment Task Force (CCCATF) was formed, consisting of faculty, staff, and student representatives from various constituent groups at KSU. The committee was responsible for developing the survey questions. The team reviewed selected questions from the consultant’s tested collection and added KSU-specific questions informed by the focus group results.
Why was an outside researcher selected for the project?
In reviewing efforts by other universities to conduct comprehensive climate studies, several best practices were identified. One was the need for external expertise in survey administration. A survey on a very sensitive subject like campus climate is likely to yield higher response rates and more credible findings if led by an independent, outside agency. Members of a university community may feel inhibited from responding honestly to a survey administered by their own institution for fear of retaliation.
What consulting group conducted the survey?
Dr. Susan Rankin of Rankin & Associates Consulting was the consultant working directly with us on this project. Dr. Rankin is an emeritus faculty member of Education Policy Studies and College Student Affairs at The Pennsylvania State University and a senior research associate in the Center for the Study of Higher Education. She has extensive experience in institutional climate assessment and institutional climate transformation based on data-driven action and strategic planning. Dr. Rankin has conducted multi-location institutional climate studies at more than 100 institutions across the country. She developed and utilizes the Transformational Tapestry model as a research design for campus climate studies. The model is a “comprehensive, five-phase strategic model of assessment, planning and intervention. The model is designed to assist campus communities in conducting inclusive assessments of their institutional climate to better understand the challenges facing their respective communities” (Rankin & Reason, 2008).
What was the survey response rate? How does this compare to response rates on similar surveys?
The overall response rate was 17%, or 5,128 surveys. The response rate varied by constituent group (e.g., staff, faculty, and undergraduate and graduate students). According to the consultant, Rankin & Associates, the overall response rate for KSU is slightly lower than the rates for surveys she has administered at similar institutions (four-year comprehensive, public).
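As a rough illustration of how the reported rate and count relate, the size of the eligible population can be back-calculated from the two figures above. The population estimate below is derived from those numbers, not stated in the report:

```python
# Back-calculating the eligible survey population from the reported
# figures. Only `responses` and `response_rate` come from the report;
# the population estimate is illustrative.
responses = 5128
response_rate = 0.17

estimated_population = responses / response_rate
print(round(estimated_population))  # roughly 30,000 eligible participants
```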
Why is this a population (census) survey and not a sample survey?
The survey was taken by 5,128 individuals who are part of the Kennesaw State community. The Campus Culture and Climate Assessment (CCCA) task force reviewed, with Rankin & Associates, the benefits and drawbacks of a population versus a sample survey and decided to conduct a full population survey. One key reason for conducting a full population survey was to examine whether, and to what extent, different sub-populations experience and perceive the campus environment differently. The consultant advised the CCCA that a sample survey might overlook some small sub-populations whose insights would be beneficial to improving the campus climate. Equally important, randomized stratified sampling would not work where population data on particular identities/affinity groups does not exist. This is of particular concern for gender identity and sexual orientation, where the University does not have comprehensive population data.
Why do some demographic questions contain a very large number of response options?
It is important in campus climate research for survey participants to “see” themselves in the response choices, to prevent “othering” an individual or an individual’s characteristics. Some researchers maintain that assigning someone to the status of “other” is a form of marginalization and should be minimized, particularly in campus climate research, which has an intended purpose of inclusiveness. Along these lines, survey respondents will see a long list of possible choices for many demographic questions. While it is practically impossible to include every possible choice for every question, the goal is to reduce the number of respondents who must choose “other.”
How is the confidentiality of participants in the study protected?
Confidentiality is vital to the success of campus climate research, particularly as sensitive and personal topics are discussed. While the survey could not guarantee complete confidentiality because of the nature of multiple demographic questions, the consultant took multiple precautionary measures to enhance individual confidentiality and the de-identification of data. No data already protected through regulation or policy (e.g., Social Security number, campus identification number, medical information) was obtained through the survey. In the event of any publication or presentation resulting from the assessment, no personally identifiable information will be shared.
Confidentiality in participating was maintained to the highest degree permitted by the technology used (e.g., IP addresses were stripped when the survey was submitted). No guarantees could be made regarding the interception of data sent via the Internet by any third parties; however, to avoid interception of data, the survey was run on a firewalled web server with forced 256-bit SSL security. In addition, the consultant and university did not report any group data for groups of fewer than five individuals, because those “small cell sizes” may be small enough to compromise confidentiality. Instead, the consultant and university combined the groups or took other measures to eliminate any potential for demographic information to be identifiable. Additionally, any comments submitted in response to the survey were separated at the time of submission to the consultant so they were not attributed to any individual demographic characteristics. Identifiable information submitted in qualitative comments was redacted, and the university only received these redacted comments.
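The “small cell size” rule described above can be sketched as a simple suppression filter. This is an illustrative sketch only, not the consultant’s actual procedure (in practice small groups were combined rather than simply withheld):

```python
# Sketch of the small-cell suppression rule: counts below the
# threshold are withheld (shown as None) before reporting, so that
# small demographic groups cannot be identified. Illustrative only.
MIN_CELL_SIZE = 5

def suppress_small_cells(group_counts):
    """Return reportable counts, replacing small cells with None."""
    return {
        group: (count if count >= MIN_CELL_SIZE else None)
        for group, count in group_counts.items()
    }

counts = {"Group A": 212, "Group B": 47, "Group C": 3}
print(suppress_small_cells(counts))
# Group C's count is withheld because it falls below the threshold
```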
Participation in the survey was completely voluntary. Participants were not required to answer any question except the first positioning question (staff, faculty) and could skip any other questions they considered uncomfortable. Paper-and-pencil surveys were also available and were sent directly to the consultant.
Information in the introductory section of the survey described the manner in which confidentiality was protected, and additional communication to participants provided expanded information on the nature of confidentiality, possible threats to confidentiality, and the procedures developed to ensure de-identification of data.
What protections are in place for storage of sensitive data, including for future secondary use?
KSU worked with the consultant to develop a research data security description and protocol, which includes specific information on data encryption, the handling of personally identifiable information, physical security, and a protocol for handling unlikely breaches of data security. The data from online participants was submitted to a secure server hosted by the consultant. The survey was run on a firewalled web server with forced 256-bit SSL security, and responses were stored in a SQL database that could only be accessed locally. The server itself could only be accessed using encrypted SSH connections originating from the local network. Rankin & Associates Consulting project coordinator Dr. Susan Rankin had access to the raw data, along with several Rankin & Associates data analysts. All Rankin & Associates analysts have CITI (Human Subjects) training and approval and have worked on similar projects for other institutions.

The web server ran with the SE-Linux security extensions (developed by the NSA). The server also used RAID to greatly reduce the chance of data loss due to hardware failure. The server performed a nightly security audit from data acquired via the system logs and notified the administrators. The number of system administrators was limited, and each administrator underwent required background checks.

The consultant has conducted more than 120 institutional surveys and maintains an aggregate merged database. The data from the KSU project was merged with all other existing climate data stored indefinitely on the consultant’s secure server. No institutional identifiers were included in the full merged data set held by the consultant. The raw unit-level data with institutional identifiers was kept on the server for six months and then destroyed. The paper-and-pencil surveys returned directly to the consultant were kept in a locked file drawer in a locked office.
The consultant destroyed the paper-and-pencil responses after they were merged with the online data. The consultant will notify the committee chairs of any breach or suspected breach of data security on the consultant’s server.
The consultant provided KSU with a data file at the completion of the project.
Which responses were included in the campus report?
If a respondent answered at least 50% of the survey questions, their responses were included in the quantitative data findings. The survey also included a few open-ended questions that solicited narrative comments from respondents in their own words. To protect the confidentiality of individuals, any identifying references included in comments have been redacted in the full report.
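The 50% inclusion rule described above amounts to a simple completion-rate check. The sketch below is illustrative; the question counts shown are hypothetical, not taken from the actual survey:

```python
# Sketch of the inclusion rule: a respondent's answers count toward
# the quantitative findings only if they answered at least half of
# the survey questions. Question totals here are hypothetical.
def meets_inclusion_threshold(answered, total_questions, threshold=0.5):
    """True if the respondent answered at least `threshold` of the questions."""
    return answered / total_questions >= threshold

print(meets_inclusion_threshold(60, 100))  # 60% answered -> True (included)
print(meets_inclusion_threshold(40, 100))  # 40% answered -> False (excluded)
```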
I am a student from a small minority population. The report doesn't break down student responses into various sub-groups (e.g., Native American women with disabilities). Can I see responses from all sub-groups?
This report only includes high-level aggregations of data, but over the coming months and years the data will be analyzed for various sub-populations. As the University moves forward with the disaggregation of data and subsequent analysis, there might be a need to keep some data in aggregate form and limit the number of variables so as to protect the confidentiality of respondents.
What are the strengths and limitations of using survey research data and interpreting results?
Surveys tend to focus on collecting a broad range of data—in this case, perceptions and experiences—from different demographics. With a large sample, surveys have the potential to allow for generalization and the ability to gain a representative picture of the beliefs, attitudes, and practices of a population. Surveys are helpful for ensuring confidentiality, particularly when compared to data collection efforts that include individual interviews and focus groups. Standardized questions provide fixed responses that allow for tabulation and comparisons among groups, while open-ended questions provide an opportunity for respondents to describe experiences in their own words.
It is crucial to understand that the ability to generalize about a population from a survey can be limited. Researchers use response rate, sample size, availability, and other information to try to determine whether the results represent the populations under study. For this particular survey, self-selection bias is an issue, as is the response rate. Self-selection bias is possible in that individuals' decisions to participate might correlate with their perceptions and experiences and, hence, affect the findings. Those with strongly negative and/or positive perceptions and experiences might have been more apt to participate in the study. A more ideal response rate to ensure generalizable results would be 30%; the response rate for this study was 17%.
How will the university use the results to address climate issues?
From the overall findings presented in the report, in consultation with the campus community, 2-3 action items will be developed that can be completed within 12-18 months. The Office of Diversity and Inclusion will further analyze the data and work with academic and administrative units to develop action plans specific to their needs. Each of these specific plans will link to the master diversity plan and strategic plan. Future assessments will measure how well academic and administrative units have addressed issues found in the survey.
How do Kennesaw State University researchers request access to the data? What is the IRB protocol?
Approximately twelve months after the report and findings are made available to the campus community, and after all administrative and academic unit data is disaggregated and corresponding reports are written, researchers can begin submitting requests to have their questions answered using the data. When questions go beyond those addressed in the full report, researchers can submit an application to the IRB office. The IRB office will review the application and, if approved, work with the Office of Diversity and Inclusion to prepare/analyze the data and provide the selected results. In no case will data sets be provided to researchers where participants might be identified by way of demographics, affiliated departments, and/or any other information that might compromise confidentiality.
Will Kennesaw State University conduct a study like this again? If so, when?
Yes. KSU will conduct another Campus Culture and Climate Assessment within the next five years. KSU intends to evaluate current survey results to determine how useful the questions were in identifying issues, establishing baselines, and focusing on action items that will improve the campus climate.