Benchmarking Research Performance
in the Department of Computer Science,
School of Computing, National
University of Singapore
Philip M. Long, Tan Kian Lee, Joxan Jaffar
April 12, 1999
In April 1999, the Department of Computer Science at the National University
of Singapore conducted a study to benchmark its research performance. The
study shows, using publication counts alone, that NUS would rank between 21
and 28 in comparison with a list of the top 70 CS departments in the US.
In this article, we present the methodology adopted and report our findings.
1. Background
As part of its self-assessment effort, the Department of Computer Science at the National University of Singapore conducted a study to benchmark its research performance. The study used publication statistics to estimate where it would have been placed in an authoritative ranking of CS departments.
We chose to use statistics of conference publications rather than journal publications because, in computer science, conferences are the primary means of communicating research results; they are refereed, and some are very selective. We used papers published from 1995 to 1997; we stopped at 1997 so that the proceedings from the most recent year would be likely to be available in the library.
Prior to this exercise, our department had divided conferences into three categories based on their prestige: rank 1 (the most prestigious conferences), rank 2, and rank 3. Since we felt that publications in rank 1 and rank 2 conferences are far more relevant to the standing of a department, and to save on data collection costs, we omitted rank 3 conferences from consideration. A few of the remaining proceedings were not available in the library: our study used the 109 conferences of rank 2 and above whose proceedings were available. We further divided the rank 2 conferences into two groups, picking out a small collection of the better rank 2 conferences, which we will refer to as rank 2A conferences, with the remainder as rank 2B conferences. This was done by consulting faculty in different areas and asking their opinions; they could support their case for a conference using the usual arguments, such as a small acceptance ratio, publication of prominent results, and participation by famous researchers in the conference or on the program committee.
As our "authoritative ranking" of CS departments, we used the ranking
published by the National
Research Council [1]. To save on data collection costs, we used only
the top 70 universities in that ranking (see Table 1). We note that our
estimate is obtained only from publication statistics, whereas the original
ranking done by the NRC took into account other factors.
Table 1: The top 70 US computer science departments in the NRC ranking [1].

1 Stanford University | 26 Purdue University | 51 University of Illinois at Chicago
2 Massachusetts Inst of Technology | 27 Rutgers State Univ-New Brunswick | 52 Washington University
3 University of California-Berkeley | 28 Duke University | 53 Michigan State University
4 Carnegie Mellon University | 29 U of North Carolina-Chapel Hill | 54 CUNY - Grad Sch & Univ Center
5 Cornell University | 30 University of Rochester | 55 Pennsylvania State University
6 Princeton University | 31 State U of New York-Stony Brook | 56 Dartmouth College
7 University of Texas at Austin | 32 Georgia Institute of Technology | 57 State Univ of New York-Buffalo
8 U of Illinois at Urbana-Champaign | 33 University of Arizona | 58 University of California-Davis
9 University of Washington | 34 University of California-Irvine | 59 Boston University
10 University of Wisconsin-Madison | 35 University of Virginia | 60 North Carolina State University
11 Harvard University | 36 Indiana University | 61 Arizona State University
12 California Institute of Technology | 37 Johns Hopkins University | 62 University of Iowa
13 Brown University | 38 Northwestern University | 63 Texas A&M University
14 Yale University | 39 Ohio State University | 64 University of Oregon
15 Univ of California-Los Angeles | 40 University of Utah | 65 University of Kentucky
16 University of Maryland College Park | 41 University of Colorado | 66 Virginia Polytech Inst & State U
17 New York University | 42 Oregon Graduate Inst Sci & Tech | 67 George Washington University
18 U of Massachusetts at Amherst | 43 University of Pittsburgh | 68 Case Western Reserve Univ
19 Rice University | 44 Syracuse University | 69 University of South Florida
20 University of Southern California | 45 University of Pennsylvania a | 70 Oregon State University
21 University of Michigan | 46 University of Florida
22 Univ of California-San Diego | 47 University of Minnesota
23 Columbia University | 48 Univ of California-Santa Barbara
24 University of Pennsylvania b | 49 Rensselaer Polytechnic Inst
25 University of Chicago | 50 Univ of California-Santa Cruz
2. Basic Method and Result
Once we had divided the conferences into three cumulative prestige thresholds
(rank 1 only, ranks 1+2A, and ranks 1+2A+2B), we counted the number of papers
published in the selected conferences by NUS and by each of the 70 US computer
science departments, and checked how well counting the papers published in
conferences at or above each threshold agreed with the ranking published by
the NRC. To measure the degree of disagreement, we counted the number of pairs
of universities with the property that University A was ranked above
University B, but University B had a higher paper count (considering
conferences at or above the threshold). We took the prestige threshold with
the fewest disagreements (this turned out to be rank 1 conferences only) and
used the paper counts at that threshold to place NUS.
Using this method, NUS's estimated ranking among US universities was 26th.
Counting rank 1 conference papers agreed with 80% of the relative rankings
in the NRC study. The other prestige thresholds yielded similar results,
with slightly higher rankings for NUS and slightly more disagreements
with the NRC ranking.
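
For concreteness, the following Python sketch shows one way to compute the
disagreement count and the resulting placement. The department names, paper
counts, and thresholds below are invented placeholders rather than the
study's data, and the rule for placing NUS (one plus the number of US
departments with strictly more papers) is one plausible reading of the
procedure described above, not necessarily the exact rule used.

    from itertools import combinations

    def disagreements(nrc_order, paper_counts):
        # Pairs (A, B) where the NRC ranks A above B but B has strictly
        # more papers than A at the chosen prestige threshold.
        return sum(1 for a, b in combinations(nrc_order, 2)
                   if paper_counts[b] > paper_counts[a])

    def agreement_fraction(nrc_order, paper_counts):
        # Fraction of university pairs on which the paper counts agree
        # with the NRC's relative ranking.
        total = len(nrc_order) * (len(nrc_order) - 1) // 2
        return 1 - disagreements(nrc_order, paper_counts) / total

    def estimated_position(nrc_order, paper_counts, nus_count):
        # Hypothetical placement rule: one plus the number of US
        # departments with strictly more papers than NUS.
        return 1 + sum(1 for u in nrc_order if paper_counts[u] > nus_count)

    # Toy illustration with invented numbers (not the study's data).
    nrc_order = ["Dept1", "Dept2", "Dept3", "Dept4"]
    counts_by_threshold = {
        "rank 1":     {"Dept1": 40, "Dept2": 35, "Dept3": 12, "Dept4": 20},
        "ranks 1+2A": {"Dept1": 70, "Dept2": 80, "Dept3": 30, "Dept4": 45},
    }
    best = min(counts_by_threshold,
               key=lambda t: disagreements(nrc_order, counts_by_threshold[t]))
    print(best, agreement_fraction(nrc_order, counts_by_threshold[best]),
          estimated_position(nrc_order, counts_by_threshold[best], nus_count=25))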
3. Other Methods and Results
Despite the best intentions of the members of the computer science department,
it is natural to suspect that some bias might creep into our departmental
rankings. To address this potential problem, we tried a variety of different
methods, which balanced our prior knowledge about the prestige of conferences
with information obtained by looking at where members of well-respected
universities published.
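
The article does not spell out these methods, so the sketch below is only a
hypothetical illustration of the general idea, under the assumption that a
conference's weight blends a prior prestige score with the observed share of
its papers written at highly ranked NRC departments, and that departments are
then compared on their weighted paper counts.

    def conference_weight(prior_score, top_dept_papers, total_papers, alpha=0.5):
        # Hypothetical blend: prior_score encodes the departmental rank
        # category (e.g. 1.0 for rank 1, 0.5 for rank 2A, 0.25 for rank 2B),
        # and the second term is the share of the conference's papers
        # written at, say, the NRC top 20 departments.
        observed = top_dept_papers / total_papers if total_papers else 0.0
        return alpha * prior_score + (1 - alpha) * observed

    def weighted_count(dept_papers, weights):
        # Weighted paper count for one department; dept_papers maps each
        # conference to the number of papers the department published there,
        # and weights maps each conference to its blended weight.
        return sum(n * weights[conf] for conf, n in dept_papers.items())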