
A Major Step: What Adults Without Degrees Say About Going (Back) to College

Methodology and Topline

August 17 through November 12, 2017, Survey of Adult Prospective Students on Going (Back) to College
Data Collected by Social Science Research Solutions, Inc.
Released May 2018
www.publicagenda.org

The survey results below appear in the Public Agenda report, “A Major Step: What Adults Without Degrees Say About Going (Back) to College,” and in the accompanying research brief. The data are based on a nationally representative survey of adults who are considering enrolling in college to earn an undergraduate degree or certificate, a group we refer to as “adult prospective students.” A total of 1,336 interviews were completed from August 17 through November 12, 2017; 1,328 interviews were included in the analysis. The survey was conducted by telephone, including cell phones, and online. Respondents completed the survey in English.

This research was funded through a grant to Public Agenda from The Kresge Foundation. The survey was fielded by Social Science Research Solutions Inc. (SSRS). SSRS was responsible for data collection only. Public Agenda designed the survey instrument and analyzed the data. When using the data, please cite Public Agenda.

This research follows up on a nationally representative survey of adult prospective students by Public Agenda—fielded and published in 2013—that was also funded by The Kresge Foundation. The methodology of this survey is similar to that of the previous survey to ensure comparability of results over time and to minimize the possibility that any stability or change in findings could be attributed to methodological differences. The methodology differs in that, in this survey, 36 percent of interviews were completed through probability-based phone sampling and the remainder through both a probability-based web panel and a nonprobability-based, opt-in web panel. In the 2013 survey, 70 percent of the interviews were completed through probability-based phone sampling and the remainder through a nonprobability-based web panel provided to SSRS by Research Now. This survey asks some of the same questions that were asked in the previous one, as well as several new questions. For the 2013 survey, the complete methodology, full question wordings, topline findings and sample characteristics can be found at https://www.publicagenda.org/files/IsCollegeWorthItForMe_Methodology_PublicAgenda_2013.pdf.

Defining “adult prospective students”

For the purpose of this study, adult prospective students are Americans who meet the following criteria:

  • They are 18 to 55 years old and do not hold an associate or bachelor’s degree (although they may have earned a postsecondary diploma or certificate).
  • They finished high school but did not enter college straight out of high school.
  • They are not currently enrolled in any kind of higher education institution.
  • They are considering enrolling in a certificate or degree program and say it is likely that they will do so within two years.

The survey

This study used a multimodal design. Data were collected via telephone interviews, including cell phone interviews, and online. A total of 1,336 interviews were completed with adult prospective students, of which 486 were conducted by phone and 850 were completed online.

To enhance data quality, Public Agenda removed online respondents who had been preselected on the basis of previously reported age but who, when surveyed, no longer qualified for the study. The resulting “trimmed” sample comprised 1,328 adult prospective students (eight cases were removed).

Phone sample

Adult prospective students were identified and directly interviewed in the SSRS Omnibus survey for a period of 17 weeks. The SSRS Omnibus is designed to represent the adult U.S. population and gathers 40 percent of its completes via landlines and 60 percent via cell phones. The omnibus covers the 50 states and the District of Columbia.

The SSRS Omnibus uses a single-stage RDD (random digit dialing) sample of landline telephone households and randomly generated cell phone numbers. The landline sample is structured through Marketing Systems Group’s GENESYS database using 18 independent strata, made up of the nine census divisions, split by metro and nonmetro definitions. Sample telephone numbers are computer generated and loaded into online sample files accessed directly by the computer-assisted telephone interviewing (CATI) system.

Web sample

To collect data online, this survey was administered through the SSRS Probability Panel and an opt-in web panel. Some respondents were web panel members who completed the entire survey once they had been determined to be qualified respondents; other web respondents completed the entire survey immediately after answering the eligibility screening questions.

The SSRS Probability Panel is recruited randomly from a dual-frame RDD sample, through the SSRS Omnibus. At the end of the omnibus survey, respondents who have been identified as having Internet access—about 85 percent of the omnibus respondents—are invited to participate in the probability panel. Approximately 1,400 people are invited to join the panel monthly.

SSRS partnered with Critical Mix to recruit online respondents from a nonprobability panel. Before anyone was invited to participate, Critical Mix used a variety of proprietary and third-party methods to validate respondents’ identities and confirm that they were human. Invitations to complete the web survey were sent directly to potential respondents by the web panel company; they stated the length of the survey and included a link to it, along with an opt-out link in the email.

Fielding

The survey was designed to be compatible with web and telephone interviews. Respondents to either could refuse to answer any questions. Questions that allowed the telephone respondent to volunteer “Don’t know” as a response included “Don’t know” as an explicit response category in the web version.

Before the fielding period, the survey was programmed using CfMC computer-assisted telephone interviewing (CATI) software. This software was used to produce both a web and CATI version of the survey. SSRS and members of Public Agenda’s research team checked the programs extensively to ensure skip patterns followed the design of the questionnaire.
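Checks of this kind can be automated. The sketch below illustrates the idea in Python with two hypothetical questions; the question names and the skip rule are invented for illustration, not taken from this survey’s instrument.

    # Illustrative skip-pattern check on interview records. Q1, Q2 and the
    # rule "Q2 is asked only if Q1 == 'yes'" are hypothetical.
    records = [
        {"id": 1, "Q1": "yes", "Q2": "full-time"},
        {"id": 2, "Q1": "no",  "Q2": None},         # correctly skipped
        {"id": 3, "Q1": "no",  "Q2": "part-time"},  # skip violation
    ]
    violations = [r["id"] for r in records
                  if r["Q1"] != "yes" and r["Q2"] is not None]
    print("Skip violations in cases:", violations)  # -> [3]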

The fielding period for this survey was August 17 through November 12, 2017. Telephone interviewers received both written materials on the survey and formal training. These covered the goals of the study, detailed explanations of why questions were being asked, the meanings and pronunciations of key terms, pointers on potential obstacles to getting good answers, and strategies for addressing respondent problems that could be anticipated.

At the outset of fielding, a Public Agenda staff member reviewed a set of recorded interviews. Following the review, the wordings of a few questions were modified slightly. Interviewers were monitored throughout the fielding period and were given feedback, when appropriate, to improve their interview technique and to clarify survey questions.

Within each landline household, a single respondent was selected through the following selection process: First, interviewers asked to speak with the youngest adult male/female at home. The term “male” appeared first for a randomly selected half of the cases and “female” for the other random half. If no males/females were at home during that time, interviewers asked to speak with the youngest female/male at home. Since cell phones were treated as individual devices and the interview might take place outside the respondent’s home, each cell phone interview was conducted with the person answering the phone.

To maximize survey response, the following procedures were enacted:

  • An average of six follow-up attempts were made to contact nonresponsive numbers.
  • Each nonresponsive number was contacted multiple times, with a programmed differential call rule used to vary the times of day and the days of the week of the callback.
  • Respondents were allowed to set the schedule for callbacks.
  • Specially trained interviewers contacted households where the initial calls resulted in a refusal, to attempt to convert the refusals into completed interviews.
  • A $5 incentive was included for cell phone respondents who requested compensation for their time.

The response rate for the phone portion of the survey was calculated to be 7.4 percent using the American Association for Public Opinion Research (AAPOR) Response Rate Three (RR3) formula. The response rate for the web portion of the study was calculated to be 14 percent using the same formula.
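RR3 counts completed interviews against all known-eligible cases plus an estimated eligible share of the cases whose eligibility is unknown. A minimal sketch of the calculation follows; apart from the 486 phone completes reported above, the disposition counts and the eligibility estimate are hypothetical.

    # Illustrative AAPOR RR3 calculation. I is the number of phone completes
    # reported above; all other counts and "e" are hypothetical.
    I, P = 486, 30               # completed and partial interviews
    R, NC, O = 1100, 2700, 100   # refusals, non-contacts, other eligible non-interviews
    U = 8600                     # cases of unknown eligibility
    e = 0.25                     # estimated share of unknown cases that are eligible

    rr3 = I / (I + P + R + NC + O + e * U)
    print(f"RR3 = {rr3:.1%}")    # ~7.4% with these hypothetical counts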

Weighting

The final data were weighted to correct for variance in the likelihood of selection for a given case and to balance the sample to known population parameters in order to correct for systematic under- or overrepresentation of different demographic groups.

The weighting procedure involved the following steps:

First, a base weight was calculated for the telephone sample to correct for (a) the fact that a phone number’s probability of selection depends on the number of phone numbers drawn from the total sample frame; (b) the fact that a sampling unit’s probability of being reached increases with the number of phones answered by the members of a household; and (c) the fact that, in households reached by landline, only a single respondent is selected, so each adult’s probability of selection is inversely related to the number of adults in the household.

The probability panel weighting processes incorporate the omnibus base weight, since the recruitment of the panelists was through the SSRS Omnibus.
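A rough sketch of how a base weight reflecting corrections (a) through (c) might be computed is below; the function and its inputs are illustrative assumptions, not SSRS’s actual weighting code.

    # Minimal sketch of an RDD base weight with the three corrections
    # described above. All names and example values are illustrative.
    def base_weight(frame_size, numbers_drawn, phone_lines, adults_in_household,
                    landline=True):
        w = frame_size / numbers_drawn   # (a) inverse probability of the number being drawn
        w /= phone_lines                 # (b) reachable on several lines -> downweight
        if landline:
            w *= adults_in_household     # (c) one adult selected per landline household
        return w

    # e.g., a landline case: 1 of 50,000 numbers drawn from a 10-million-number
    # stratum, in a household with 2 phone lines and 3 adults
    print(base_weight(10_000_000, 50_000, 2, 3))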

Second, the SSRS Omnibus sample was weighted to census population targets utilizing “raking,” that is, iterative proportional fitting (IPF). Parameter estimates were based on the March 2017 supplement of the U.S. Census Bureau’s Current Population Survey. Eight population parameters were used for post-stratification: age by gender, census region by gender, education, race/ethnicity, Hispanic origin by nativity (born outside the United States), marital status, population density and phone usage.
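Raking adjusts the weights one parameter at a time so that the weighted sample margins match the population targets, cycling through the parameters until the margins stabilize. A minimal sketch follows, using pandas and made-up targets for two parameters; the actual procedure raked to the eight parameters listed above.

    # Minimal raking (IPF) sketch. The data and targets are made up.
    import pandas as pd

    df = pd.DataFrame({
        "gender": ["F", "F", "M", "M", "F", "M"],
        "educ":   ["HS", "SomeCol", "HS", "HS", "SomeCol", "SomeCol"],
        "weight": [1.0] * 6,
    })
    targets = {
        "gender": {"F": 0.51, "M": 0.49},         # hypothetical targets
        "educ":   {"HS": 0.60, "SomeCol": 0.40},  # hypothetical targets
    }

    for _ in range(50):  # cycle until the weighted margins stabilize
        for var, shares in targets.items():
            total = df["weight"].sum()
            current = df.groupby(var)["weight"].transform("sum")
            df["weight"] *= df[var].map(shares) * total / current

    # Weighted margins now match the targets for every raking variable.
    print(df.groupby("gender")["weight"].sum() / df["weight"].sum())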

Benchmarks for the adult prospective student sample were extracted from the original SSRS Omnibus. All the data underwent IPF using the following parameters: age by gender, census region by gender, education, race/ethnicity, marital status and population density.

To reduce possible bias from the web panel sample, a “calibration weight” was applied as the final stage. This involved weighting all the data to questionnaire-level “calibration benchmarks” extracted from the weighted SSRS Omnibus data. Three calibration variables were used: whether respondents know what they want to study; whether respondents are looking to graduate with a certificate, an associate degree or a bachelor’s degree, or are more interested in taking classes without completing a program; and the main reason respondents want to earn such a degree or certificate.

Third, the weights underwent truncation (or “trimming”) to ensure the consistency of the population estimates produced week to week by the SSRS Omnibus. Weights were trimmed so they did not exceed 4 or fall below 0.25. The design effect for the survey was 1.5, and the survey has an overall margin of error of +/- 3.3 percentage points at the 95 percent confidence level.
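The reported margin of error follows from the sample size and the design effect via the standard formula at p = 0.5. A quick check:

    # Verifying the reported margin of error. n and deff are taken from this
    # report; p = 0.5 gives the maximum margin, z = 1.96 for 95% confidence.
    import math

    n, deff, p, z = 1328, 1.5, 0.5, 1.96
    moe = z * math.sqrt(deff * p * (1 - p) / n)
    print(f"+/- {moe:.3f}")  # ~0.033, i.e., +/- 3.3 percentage points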

As in all surveys, question order effects and other nonsampling sources of error can affect the results. Steps were taken to minimize these issues, including pretesting the survey instrument and randomizing the order in which some questions were asked.

Presurvey focus groups

Before developing the survey instrument, we conducted three demographically diverse focus groups with adult prospective students. Focus groups were held in July 2016 in New York City, New York; in July 2016 in Fort Lauderdale, Florida; and in December 2016 in Los Angeles, California. In total, 28 adult prospective students participated in these focus groups. More information about this study can be obtained by emailing research@publicagenda.org.

Topline

For full survey results, download the PDF of the topline results.
