Profiting Higher Education? - Methodology
The findings in “Profiting Higher Education?” are based on nationally representative surveys with 197 current for-profit college undergraduate students, 249 recent graduates who completed certificates or degrees at a for-profit college, and 803 adults who are considering enrolling in college to earn an undergraduate certificate or degree (adult prospective students). This research also included regionally representative surveys with a total of 656 human resources professionals (employers) from four major U.S. metropolitan areas.
Interviews with current undergraduates, graduates and prospective students were conducted from February 7 through June 7, 2013, by phone, including cell phones, and online. Data from employers were collected through telephone interviews from April 4 through May 9, 2013. Public Agenda designed the survey instruments and analyzed the data. Data were collected by Social Science Research Solutions, Inc. (SSRS).
Public Agenda also conducted a total of eight pre-survey focus groups across four major metropolitan areas in the United States. Four groups were conducted with human resources professionals and four with adult prospective students. In addition, we conducted four Learning Curve Research (LCR) focus groups with adult prospective students.
Surveys with students
Current for-profit undergraduate students
To be eligible to participate in the current for-profit undergraduate survey, respondents needed to be enrolled in a for-profit college with the intention either of earning an undergraduate degree, certificate or other credential or of taking classes but not earning a credential.
Respondents qualified as current for-profit undergraduates if the institution they were enrolled at was listed, in spring 2013, by the National Center for Education Statistics as a for-profit institution.
Alumni of for-profit colleges
To be eligible to participate in the for-profit alumni survey, respondents needed to indicate that they had graduated with an undergraduate degree, certificate or other credential, between 2006 and 2013, from a for-profit institution.
Respondents qualified as for-profit alumni if the institution they had graduated from was listed, in spring 2013, by the National Center for Education Statistics as a for-profit institution.
Adult prospective students
For the purpose of this study, adult prospective students are defined as Americans who meet the following criteria:
- They are 18 to 55 years old.
- They do not hold an associate’s or bachelor’s degree (but they may have earned a postsecondary diploma or certificate).
- They are not entering college straight out of high school.
- They are not currently enrolled in any kind of higher education institution.
- They are considering enrolling in a certificate or degree program and say that it is likely that they will do so within two years.
Prior to the beginning of the field period, SSRS screened for qualified respondents for 22 weeks in its weekly dual-frame Excel omnibus survey, which targets 60 percent landline numbers and 40 percent cell phone numbers. At the end of the screening period, SSRS attempted to recontact the qualified respondents so they could complete the survey by phone. In addition, SSRS directly interviewed current undergraduates, alumni and prospective students in the Excel omnibus survey for a period of 12 weeks after the prescreening phase.
Surveys with current undergraduates, alumni and prospective students were also administered through a web panel. The panel was provided to SSRS by ResearchNow. Some undergraduates and alumni from the web panel completed a series of screening questions, then were recontacted to complete the entire survey once it was determined that they attended or had graduated from a qualifying institution. Other web respondents were asked to complete the entire survey immediately after completing the screening questions.
As in all surveys, question order effects and other nonsampling sources of error can affect the results. Steps were taken to minimize these issues, including pretesting the survey instrument and randomizing the order in which some questions were asked.
The final data were weighted to correct for variance in the likelihood of selection for a given case and to balance the sample to known population parameters in order to correct for systematic under- or overrepresentation of different types of students.
The initial weighting procedure utilized an iterative proportional fitting process, or “raking,” and parameter estimates were drawn from data collected in the Excel omnibus survey. To create population targets, data from all of the weeks in which SSRS screened or collected data in the Excel omnibus survey were raked to general population targets based on the 2012 Current Population Survey (CPS). SSRS then selected all respondents who screened into each of the different groups and used these weighted data as population targets for weighting data from both the phone and online surveys.
For each group, the data were balanced to the following parameters:
- Gender × age
- Gender × region
- Education: high school graduate; some college but no degree; certificate; associate’s degree; bachelor’s degree
- Ethnicity: white; African-American; native-born Hispanic; foreign-born Hispanic; other
- Phone use (for phone respondents): cell phone only; not cell phone only
- Metro status: urban/suburban; rural
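The raking procedure described above can be sketched in code. The following is a minimal illustration of iterative proportional fitting, not SSRS's production weighting routine; the dimension names and target shares used in any example call are hypothetical.

```python
import numpy as np

def rake(base_weights, groups, targets, n_iters=100, tol=1e-8):
    """Iterative proportional fitting ("raking").

    base_weights : initial weights, one per respondent
    groups       : dict of dimension name -> array of category codes per respondent
    targets      : dict of dimension name -> {category: target population share}
    """
    w = np.asarray(base_weights, dtype=float).copy()
    for _ in range(n_iters):
        max_adj = 0.0
        # Adjust each marginal in turn; repeat until all marginals match.
        for dim, codes in groups.items():
            for cat, share in targets[dim].items():
                mask = codes == cat
                current = w[mask].sum() / w.sum()
                if current > 0:
                    w[mask] *= share / current
                    max_adj = max(max_adj, abs(share / current - 1))
        if max_adj < tol:  # all adjustment factors are ~1: converged
            break
    return w * (len(w) / w.sum())  # rescale so weights average to 1.0
```

Because each pass adjusts one marginal at a time, earlier adjustments drift slightly as later ones are applied; iterating until the adjustment factors approach 1 brings all marginals to their targets simultaneously.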
To improve accuracy, the weighted samples of current for-profit undergraduates and alumni were then weighted to available known parameters of their respective populations, drawn from the NCES Beginning Postsecondary Students Longitudinal Study (BPS) and the NCES IPEDS Fall 2011 data. The only targets available were for gender by age and race. So as not to discard the more extensive targets noted earlier, the data were raked to these new targets using the final weight from the procedure above as the base weight.
Survey with human resources professionals (employers)
The sample of respondents for the employer survey was drawn from the Philadelphia, Los Angeles, Detroit and El Paso/Las Cruces metropolitan areas. Interviewees were randomly selected from organizations listed in the Dun & Bradstreet database. SSRS sampled private companies as well as public and not-for-profit organizations with a status code indicating they were headquarters or a single location (branches were not included), including both subsidiaries and nonsubsidiaries. If available, the name and title of a human resources professional were appended to the sample records.
The Philadelphia sample was drawn from the pool of businesses and organizations in the Philadelphia-Camden-Wilmington, PA-NJ-DE-MD, Core Based Statistical Area (CBSA) that are located in Pennsylvania or New Jersey and have 50 or more employees. The Detroit sample was drawn from the pool of businesses and organizations in the Detroit-Warren-Livonia, MI, CBSA with 50 or more employees. The Los Angeles sample was drawn from the pool of businesses and organizations in the Los Angeles-Long Beach-Santa Ana, CA, CBSA that are located in Los Angeles County and have 50 or more employees. Owing to the smaller number of employers in the El Paso/Las Cruces area, the sample for the El Paso/Las Cruces region was drawn from businesses and organizations with 10 or more employees in El Paso and Hudspeth Counties in Texas, and Doña Ana and Otero Counties in New Mexico.
All interviews were conducted on the telephone. To be eligible to take part in the employer survey, respondents had to indicate that participating in the hiring and recruitment process in their companies or organizations was a major part of their job. Moreover, the current analysis was restricted to employers who said that positions in their businesses or organizations “sometimes,” “often” or “always” demanded a postsecondary credential.
The response rate for this study was calculated to be 20.4 percent using the American Association for Public Opinion Research (AAPOR) RR3 formula.
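The RR3 formula counts completed interviews against all known-eligible cases plus an estimated share of cases whose eligibility could not be determined. A small sketch follows; the disposition counts in the example call are invented for illustration and are not the study's actual figures.

```python
def aapor_rr3(I, P, R, NC, O, UH, UO, e):
    """AAPOR Response Rate 3: completes divided by estimated eligible cases.

    I: complete interviews       P: partial interviews
    R: refusals and break-offs   NC: non-contacts
    O: other eligible non-interviews
    UH/UO: cases of unknown eligibility (unknown household / other unknown)
    e: estimated share of unknown-eligibility cases that are actually eligible
    """
    return I / ((I + P) + (R + NC + O) + e * (UH + UO))

# Hypothetical disposition counts, purely for illustration:
rate = aapor_rr3(I=200, P=0, R=300, NC=200, O=50, UH=500, UO=0, e=0.5)  # → 0.2
```

The estimate `e` is what distinguishes RR3 from the more conservative RR1 (which treats all unknown-eligibility cases as eligible) and from RR5 (which ignores them entirely).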
The final data were weighted to correct for variance in the likelihood of selection for a given case and to balance the sample to known population parameters in order to correct for systematic under- or overrepresentation of meaningful types of businesses and organizations.
The weighting procedure utilized iterative proportional fitting process, or “raking,” and parameter estimates were drawn from the Dun & Bradstreet database. The data were raked as four separate groups to resemble the distribution of the population of organizations in each of the four metro areas.
For each metro area, the data were balanced to the following parameters:
- Number of employees
- Economic sector based on standard industrial classification (SIC) code
- Status indicator: single location or headquarters
- Subsidiary indicator: subsidiary or nonsubsidiary
- Whether or not the name and title of a human resources professional was appended to the record
Final weights ranged between 0.14 and 5.01. The design effect is 1.46. The weight-adjusted margin of error for this survey is ±4.18 percentage points.
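A design effect and weight-adjusted margin of error can be approximated from the final weights alone. The sketch below uses the common Kish approximation; it is an illustration of the general calculation, not necessarily the exact procedure SSRS applied.

```python
import math

def kish_design_effect(weights):
    """Kish approximation: deff = n * sum(w^2) / (sum(w))^2 = 1 + CV(w)^2."""
    n = len(weights)
    total = sum(weights)
    return n * sum(w * w for w in weights) / (total * total)

def weighted_moe(weights, p=0.5, z=1.96):
    """Weight-adjusted margin of error for a proportion p at 95% confidence."""
    deff = kish_design_effect(weights)
    return z * math.sqrt(deff * p * (1 - p) / len(weights))
```

With equal weights the design effect is exactly 1; the more the weights vary, the larger the effect and the wider the margin of error relative to an unweighted sample of the same size.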
Survey questions about for-profit colleges
Based on our background and qualitative research for this project, we expected that many survey participants, both students and employers, might not be familiar with the term for-profit college.
We therefore preceded all survey questions that employed the term for-profit college with an open-ended question asking respondents, “What comes to mind, if anything, when you hear the term for-profit college?”
Moreover, we provided the following definition before presenting a set of questions that asked respondents to compare for-profit colleges in general with community colleges and public universities, respectively: “For-profit colleges are a growing number of schools that operate as profit-making businesses. Many for-profit colleges are small, private, vocational schools. Some are large, national enterprises such as…”
Blanks were filled randomly for each respondent with three school names from a list of the 12 top for-profit schools by total student enrollment in the fall of 2011, according to NCES IPEDS data, retrieved January 2013. The list included the University of Phoenix, ITT Technical Institute, Ashford University, DeVry University, Kaplan University, the Art Institutes, Strayer University, American Public University, Walden University, Everest University, Grand Canyon University and Capella University.
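The random fill described above amounts to sampling three names without replacement from the list of 12. A minimal sketch, using the school names from this study; the function name is our own:

```python
import random

FOR_PROFIT_SCHOOLS = [
    "the University of Phoenix", "ITT Technical Institute", "Ashford University",
    "DeVry University", "Kaplan University", "the Art Institutes",
    "Strayer University", "American Public University", "Walden University",
    "Everest University", "Grand Canyon University", "Capella University",
]

def fill_definition(rng=random):
    # random.sample draws without replacement, so the three names are distinct
    examples = rng.sample(FOR_PROFIT_SCHOOLS, 3)
    return ("For-profit colleges are a growing number of schools that operate "
            "as profit-making businesses. Many for-profit colleges are small, "
            "private, vocational schools. Some are large, national enterprises "
            "such as " + ", ".join(examples[:2]) + " and " + examples[2] + ".")
```

Drawing a fresh sample per respondent spreads each school name evenly across the sample, so no single institution anchors respondents' impressions of the category.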
In addition, we included a set of questions in the employer survey that, without relying on the term for-profit college, sought to assess whether employers perceived for-profit schools differently in quality from community colleges, public universities or private not-for-profit schools, respectively.
To this end, we presented each employer with names of five higher education institutions in their area—one small, independent for-profit, one national chain or online for-profit, a community college, a public university and a private not-for-profit four-year school—and asked respondents whether they considered the quality of education and training each school provided excellent, good, only fair or poor, or whether they had not heard anything about the school. School names were selected randomly for each respondent from pre-defined lists of up to five schools in each of the five categories.
This methodology allowed us to assess and compare employers’ perceptions of different types of higher education institutions based on their rating of specific schools instead of explicitly asking employers to generalize across categories. To see the lists of specific schools included in this study, across the four metropolitan areas where the survey was conducted, e-mail Public Agenda's Director of Research Carolin Hagelskamp at firstname.lastname@example.org.
Pre-survey focus groups with employers and adult prospective students
Prior to the surveys, Public Agenda conducted four focus groups with human resources professionals in Detroit, El Paso, Los Angeles and Philadelphia. All participants reported that they were involved in making hiring decisions in their organizations.
Through these conversations, we explored employers’ hiring priorities and practices and their views on different kinds of colleges and universities in their areas, including for-profit colleges, public colleges and online schools. Quotes from these focus groups appear in this report to illustrate the views quantified in the survey. A total of 40 human resources professionals participated in this part of the research.
In addition, we conducted four pre-survey focus groups with adult prospective students in Detroit, El Paso, Los Angeles and Philadelphia. Through these conversations, we explored the processes by which adult prospective students research and decide upon their postsecondary educational plans. These data informed the design of the survey instruments.
Learning Curve Research focus groups with adult prospective students
Public Agenda also conducted four Learning Curve Research focus groups with adult prospective students in Detroit, El Paso, Los Angeles and Philadelphia.
Learning Curve Research (LCR) focus groups are distinct from standard focus groups. LCR focus groups are designed to create a deliberative environment in which participants have the chance to express their thoughts and opinions, then confront new information and grapple with its implications.
LCR focus groups typically consist of an extended three-hour group conversation, pre- and postgroup surveys and one-on-one follow-up interviews with each participant a few days after the group met.
This project’s LCR focus groups sought to examine how adult prospective students react to new information and data about higher education institutions and how their thinking and considerations change after they have had a chance to discuss and deliberate about the information and issues presented.
In particular, we employed neutral and descriptive language to introduce participants to the notion of for-profit versus not-for-profit higher education, private and public schools and comparative school performance metrics such as graduation rates, loan default rates and graduates’ labor-market outcomes.
Insights from these focus groups are considered in selective sections of this report and in more detail in our companion publication, "Is College Worth It for Me?"