MU Faculty Members Want Transparency, More Time Before Graduate Programs are Cut

Mar 6, 2018

Credit Darren Hellwege / KBIA

The decision on whether to close more than a dozen graduate programs at MU now has a more substantial timeline.

At a special general faculty meeting on Monday, hundreds of MU faculty members packed Jesse Wrench Auditorium at Memorial Union to discuss the Academic Programs Task Force report that recommended the closure, review or consolidation of several graduate programs.

“The question is: how are the cuts being performed, how will the process be determined and who’s going to have roles and input?” Stephen Karian, associate professor of English, said during a news conference after the meeting. “I think, clearly, there was a strong consensus today that faculty members want to have a voice in that process.”


The faculty meeting was called after Karian submitted a petition to Chancellor Alexander Cartwright. The petition, signed by 83 faculty members, asked for time for faculty members to voice their displeasure with some aspects of the Academic Programs Task Force's final report and the possible closure of programs.

Academic Analytics

After introductions from Cartwright and Task Force Co-Chair Cooper Drury, associate dean of Arts & Science, who reiterated how the task force made its recommendations, Karian took the stage and denounced the task force's use of Academic Analytics, a controversial company that provides data on faculty and school productivity.

Karian identified eight problems with the company’s data that he hopes MU leaders will keep in mind before making final decisions:

  • “Confusing Quantity and Quality.” Karian said Academic Analytics does not measure quality and deals only in a “more is always better” approach.
  • “Lack of transparency.” He said most MU faculty members have profiles with Academic Analytics, but very few have been able to view their profiles, even though deans and administrative leaders have access to that information.
  • “Exclusions and Distortions.” He said Academic Analytics’ “one size fits all” measurement strategies are problematic for “departments with diverse research and creative profiles.”
  • “Pervasive Omissions.” Karian said Academic Analytics tends to give too much weight to some of the data collected.
  • “False equivalencies.” He said the company also draws comparisons between data that should not be compared, such as equating a small travel grant with the MacArthur Fellowship.
  • “Book Inflation.” Academic Analytics counts International Standard Book Numbers, or ISBNs, instead of the actual number of books produced by a school, which leads to new editions being counted as new books.
  • “Implausible and volatile Rankings.” Karian cited dramatic shifts in the English department’s rankings. He said around one-third of English programs shifted more than 20 positions up or down, and one department’s ranking fell 93 positions.
  • “Cost.” The campus needs to know how much it costs to subscribe to Academic Analytics, he said.

Cartwright and Drury clarified that although Academic Analytics data was used by the task force, it was just one source of information collected to make recommendations.

“No program is in that report because of Academic Analytics,” Drury said. “We found that each program was unique. There was no piece of information that was more important than others.”

Transparency timeline

When Cartwright opened the floor for faculty members to voice their opinions on the task force’s report, English professor Andrew Hoberek stepped forward with a prepared motion for MU administrators to:


  • Allow ample opportunity to correct inaccurate and misleading data.
  • Provide a budgetary justification for each closure or merger being considered.
  • Provide a set of procedures and a timeline for moving forward that are transparent and include meaningful faculty input.

The statement, which was approved at the meeting by a majority hand vote, was prepared by Hoberek and a small group of faculty members to focus the discussion and to press for more transparency about the process of closing programs.

“The goal was to boil down what faculty wanted in terms of asking for more information and more input as the process went forward,” Hoberek said.

Before faculty members voted on the motion, Cartwright gave a brief warning. He said MU could potentially face a $60 million budget issue this year, with about $21 million due to state budget cuts and $30 million due to low enrollment.

“We have a lot of tough decisions ahead,” Cartwright said before the vote was cast. “This vote that you take today will have an impact on how we make these decisions. Across-the-board cuts will not move this institution forward. I could easily see us continue to struggle. I’ve said I wanted to make decisions (on the closure of programs) by the end of the semester.”

Hoberek said he thinks Cartwright was trying to warn faculty that some people in the state might perceive the vote as an attempt to stall and prevent substantive change.

“I’m not sure that’s necessarily the intention,” Hoberek said. “I think for the most part, people understand that there are things that must be done moving forward, but that this process did not give adequate room for commenting on what, in some cases, was a problematic process.”

After the meeting, Cartwright said he understood the motion as the faculty looking for a public commitment that decisions would not be made until the end of the semester.

“We want to make some of our budget decisions within the next month or two and (decisions on closing graduate programs) towards the end of the semester, around May,” he said. “We’re going to have to work through them and we have to have some scenarios on how we’re going to do it.”


When the report was first released, Cartwright said it was part of a larger process and that final decisions would be made throughout the spring semester.

Last April, UM System President Mun Choi called for an 8 to 12 percent budget cut and a review of all MU programs. Since the task force’s creation in August, Drury and Matthew Martens, professor and provost faculty fellow, held multi-hour review sessions twice a week before the recommendations were made.


The task force reviewed data from multiple sources, including the student census, the Missouri Department of Higher Education and the National Study of Instructional Costs and Productivity. The process also involved 39 meetings with campus faculty, staff, administrators and students.

After the report was released, the MU chapter of the American Association of University Professors, or AAUP, said the task force drew inaccurate conclusions and used careless wording that threatened the reputation of MU.

The statement from AAUP, released Jan. 28, specifically addressed the Academic Programs Task Force’s partial reliance on Academic Analytics. It was signed by six MU faculty members, including MU AAUP Chapter President and Associate Professor of Sociology Victoria Johnson.

“Growing evidence suggests that much of the information produced by (Academic Analytics) is incomplete and inaccurate,” the statement said.

The MU chapter of AAUP said the task force’s report failed to account for important differences among programs, such as the ratios of faculty to graduate and undergraduate students. The chapter said faculty members should be given full access to the task force’s data and enough time to review the information before the administration makes final decisions.

“Research and scholarship that contributes to the state, society and the world are not assembly line products,” the statement said.

After AAUP released its statement, MU spokesman Christian Basi clarified that the task force’s process drew on multiple sources of data and information, not just Academic Analytics.

“We’ve made it very clear that there are many other data points that were part of the process,” he said. “We won’t be making decisions based on one or two data points. We’re using other data points as well as the feedback that we are receiving from deans and faculty.”