Planetizen Response to Concerns About the Planetizen Guide, 6th Edition
March 4, 2019
We have seen the critique of the Planetizen Guide to Graduate Urban Planning Programs (Guide), written by the faculty at two universities and published on the Planners 2040 Facebook group and the PLANETNEW email listserv. We share these concerns, and we work continuously to improve the process with each new edition of the Guide. We hope the current conversation offers another opportunity for improvement.
We do this work because we believe the Guide is an essential resource for attracting prospective students to the field of planning. We make every effort to present an academic and professional career in planning as a desirable and worthwhile pursuit. In fact, we place the Guide's value to the field of planning above all other considerations, and we think it is one of the most valuable resources available for attracting students and professionals to the field.
That endeavor is greatly improved by the participation of as many schools as possible, so we hope every school sees the value in participating. When the Guide is full of information from participating schools, distinctions in data and presentation become more apparent. A student looking for the right fit in academic community, curriculum design, and location has much to consider. Every school that participates in the Guide process helps students make the right choice about which program works best for them.
For each edition of the Guide, we have worked with a committee of academics (appointed by ACSP, from ACSP member schools) to ensure transparency in the process and make sure we’re addressing concerns of the academic community we mean to serve. We believe we have established a collegial and productive relationship with the ACSP over the process of publishing five Guides.
One of the ways we have addressed requests from the committee is by presenting a diversity of schools in the multiple rankings categories included in the Guide, such as "Top Small Programs," "Top Programs Allowing Part-Time Study," "Top Programs in Cities with Less than 50,000 people," "Most Diverse Student Body," and "Top Programs without PhD." We also provide a detailed listing by "Specialty," where each program that asserts a specialty in one of 20 specialty areas is listed (not ranked), so students can identify schools most likely to be aligned with their interests.
We also represent a diversity of schools and professional interests in the student and professional profiles. These profiles exemplify the many routes students can take to the study of planning, and the many exciting and socially minded professional paths they will be able to choose from during their careers. In 2018, we opened a student essay contest for inclusion in the Guide to every school, and we have worked with a collection of students diverse in race, gender, geographic location, and professional ambition to publish their work in the forthcoming Guide.
We understand completely the desire to make the Guide data collection process as quick and painless as possible. We have made every effort to streamline data reporting, including developing our own technology platform this year to improve the user experience and reduce the time and work the process requires. We have also deliberately mirrored Planning Accreditation Board data as closely as possible to further ease reporting.
Of the 374 pages that constitute the 5th Edition of the Guide, eight pages were devoted to the rankings. The other 366 pages are detailed profiles of students, professionals, and the programs themselves. We've included at the end of this commentary, as a reference, a PDF of the profile for the University of Colorado Denver that appeared in the 5th Edition, as well as an example of one of many "Student Profiles."
We believe the rankings component of the Guide is necessary to attract the attention of prospective students. We know the value of the rankings from search acquisition analytics on the Planetizen website and from interviews with students. Contemporary students expect rankings. For students who face the prospect of what is likely to be the most expensive personal investment of their lives so far, and the opportunity cost of two years devoted to study, authoritative rankings offer reassurance about academic study in the field. If rankings do not exist for the field, many highly qualified students will instead look to other fields that are represented in other ranking products to validate their investment. Perhaps more importantly, many students who do not intend to limit their search for graduate schools to ranked schools discover the resources of the Guide because of the rankings.
We do not share the exact formulas we use to produce these rankings to minimize the probability of schools gaming the system. However, we have reviewed the overall methodology with the ACSP advisory committee for each Guide. The exact text explaining the methodology of the Planetizen rankings, as published in the 5th Edition of the Guide, follows this message.
We continue to welcome constructive criticism that will help to improve the quality and usefulness of the Guide. We hope we can continue to build trust with the academic community and work together to recruit high-quality students to the field of planning. We have contacted the two schools directly, as well as the ACSP advisory committee, to continue this dialogue and ensure positive steps in addressing these concerns.
James Brasuell, Managing Editor
Chris Steins, Founding Editor
How Planetizen Ranks Schools
Source: Planetizen Guide to Graduate Urban Planning Programs, 5th Edition
Planetizen’s ranking system is based on a combination of statistical data collected from the programs themselves and opinion data gathered from planning educators.
Planetizen sent surveys to 96 schools with master’s programs in planning, requesting data on such measures as incoming student GPA, faculty publications, and financial aid. In the end, 81 programs responded to the survey for this edition. Schools that did not respond to the survey were not included in the Planetizen rankings, but do appear in the directory in the interest of providing a complete reference. Rankings were generated from survey data collected during the fall and winter of 2016. Some program profiles have since been updated with additional information that did not impact these rankings.
To build its sample of educators, Planetizen attempted to survey the entire population of planning faculty, collecting the name and email of each faculty member listed in the ACSP Guide to Undergraduate and Graduate Education in Urban and Regional Planning and/or on the website of each planning program. In addition, Planetizen sent a notice to all educators on a popular Facebook group for planning educators, called Planners 2040.
The sample size was 1,115 educators. Of that total, 408 educators responded to the web-based opinion survey, for a response rate of 36.6 percent. Educators were asked to describe their relationship to each program (currently employed by the university, served on a program review committee, etc.). The opinions of educators about a college or university where the educator was currently employed were not included in the ranking, and other relationships between educators and programs were weighted to account for bias. Between the 4th and 5th editions, the program survey changed to better capture data conforming to the Planning Accreditation Board format generated by the self-study manual and template.
In response to consultation with a special committee of the Association of Collegiate Schools of Planning, Planetizen has continued to refine our methodology. We kept changes to a minimum between the 4th and 5th editions of the Guide for consistency. Examples of changes between editions include 1) moving the total-number-of-students indicator to the Program Characteristics category and 2) expanding the Faculty Characteristics category to incorporate academic citation analysis provided by Dr. Tom Sanchez of Virginia Tech. Dr. Sanchez sourced and analyzed citation data for academic planning faculty using Google Scholar Citations. More details can be found on Dr. Sanchez's blog: www.tomwsanchez.com.

To formulate our program rankings, Planetizen considered 27 indicators across four main criteria areas (categories listed in the table below). For each indicator other than the opinion data, an octile system was used: the median value is determined, values are then divided into eight equal sections, and each program is assigned a rating from 1 to 8, with 8 being the best score.
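The octile step described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not Planetizen's actual code, and it makes one simplifying assumption: ties are broken by position rather than averaged.

```python
def octile_scores(values):
    """Assign each program a 1-8 score by rank octile (8 = best).

    A sketch of octile binning: programs are ranked on an
    indicator, then split into eight equal-count groups.
    Ties are broken by list position here; a production
    implementation would need an explicit tie-handling rule.
    """
    n = len(values)
    # Indices of programs, ordered from lowest to highest value.
    order = sorted(range(n), key=lambda i: values[i])
    scores = [0] * n
    for rank, i in enumerate(order):
        # Map rank 0..n-1 onto octiles 1..8.
        scores[i] = rank * 8 // n + 1
    return scores
```

For example, eight programs with distinct indicator values would receive scores 1 through 8, and with 80 programs each octile would contain ten programs.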
Planetizen accounted for completeness of responses in each major category. Schools that did not answer all questions were assigned a weighted average based on the completeness of their responses across all relevant measures comprising that category.
The final weighted scores for each category were then aggregated to arrive at the final ranking for each program based on the proportions shown in the table below.
Opinion of Planning Educators (30% of Ranking)
Program Characteristics (30% of Ranking)
Indicators include number of students, median course enrollment, merit- and need-based financial aid, accreditation status, student/faculty ratio, student retention, and number of graduates over the past ten years.
Faculty Characteristics (20% of Ranking)
Indicators include number of faculty, faculty publications, faculty citations, percentage of faculty with AICP/FAICP, and gender/ethnic diversity of faculty members.
Student Characteristics (20% of Ranking)
Indicators include the average GPA and GRE scores of incoming students, acceptance rate, the percentage of accepted students who matriculate, the gender/ethnic diversity of the student body, student employment rates after graduation, and the percentage of graduates who seek and achieve AICP certification.
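The final aggregation step can be illustrated with a short worked example. The weights come from the category table above; the category scores and the helper function are hypothetical, invented purely for illustration.

```python
# Category weights as published in the 5th Edition methodology.
WEIGHTS = {
    "educator_opinion": 0.30,  # Opinion of Planning Educators
    "program": 0.30,           # Program Characteristics
    "faculty": 0.20,           # Faculty Characteristics
    "student": 0.20,           # Student Characteristics
}

def overall_score(category_scores):
    """Combine per-category scores into a single ranking score
    as a weighted sum using the published proportions."""
    return sum(WEIGHTS[cat] * score
               for cat, score in category_scores.items())

# A hypothetical program scoring on the 1-8 octile scale:
example = {"educator_opinion": 6.0, "program": 7.0,
           "faculty": 5.0, "student": 8.0}
# 0.30*6 + 0.30*7 + 0.20*5 + 0.20*8 = 6.5
```

Programs would then be ranked by this combined score, with the completeness weighting described above applied within each category first.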