For Pittsburg State University in Kansas, success in streamlining HLC reporting began with implementing a faculty activity reporting solution campus-wide. Jan Smith, Assistant Vice President for Institutional Effectiveness at Pitt State, and Stacy Becker, Digital Measures Senior Engagement Consultant and Client Success Manager, presented Pitt State’s success story at the recent Higher Learning Commission (HLC) conference in Chicago. In a previous post, they discussed implementing FAR. Here, Smith and Becker discuss their collaboration on HLC reporting for Pitt State, which evolved into HLC reports available to all universities in the HLC region using Activity Insight.
Pitt State completed its first HLC Assurance Argument under the new criteria in 2013; now in year four, Smith is working to submit the next one. “We did fine our first time around, but there were some areas where we really struggled to find information. We knew we were doing it, but we couldn’t figure out: how can we document this?” Smith said. “The game has changed a bit since 2013.”
HLC revised its guidelines on faculty qualifications in October 2015 and again in March 2016, restating HLC’s “longstanding expectations regarding the qualifications of faculty and the importance of faculty members having appropriate expertise in the subjects they teach.”
“I’ve heard stories from colleagues at other institutions where review teams wanted faculty vitae with the courses they teach, credentials in terms of whether it’s tested experience, what percentage is tested experience—all kinds of information pulled together, which we all have on our campuses. But how do you get it quickly to satisfy reviewers?” Smith shared.
Collaborating for HLC Success
While Pitt State was considering its HLC data collection and reporting challenges, Digital Measures was considering the challenge of HLC reporting. “AACSB has very specific guidelines and templates for accreditation. HLC is a lot different,” Becker acknowledged. “They’re not going to give us a table. They look to individual institutions to determine how to provide evidence, support and reasoning in a way that makes sense for each institution.” Smith and Becker connected after last year’s HLC conference.
A collaboration on HLC reporting that grew to include three additional institutions sprang out of their discussions. “Because HLC doesn’t lay out their reporting requirements as clearly as some accrediting bodies, expanding the scope to include representatives from other institutions allowed Digital Measures to gather feedback and input to ensure the reports would work for most universities,” Becker said. “We also wanted to ensure they would be flexible enough to meet varying needs.”
Smith’s experience as a peer reviewer made her an especially knowledgeable collaborator, Becker noted. “It helped to know what reviewers are looking for.”
Three HLC reports are now available in Activity Insight:
- Faculty Qualification Report, which uses evidence from the revised guidelines to support faculty qualifications
- Scholarship and Research Report, which reports on individuals and rolls up totals for departments and colleges
- Professional Development Summary Report, which counts different types of activities based on an institution’s customized categories
“These reports make it easy to generate a narrative. For example, ‘Four out of five faculty in Department X participated in professional development in this time period,’” Becker shared.
“From a reviewer’s perspective, they can scan this,” Smith added. “The reviewer can look through CVs, coursework and various other materials this way.”
“I’m guessing many of you here [at the HLC conference] have worked on your own Assurance Argument. It’s not usually a time of joy,” Smith said. “It’s manageable, but it’s not often that you have moments of pure joy. The first time I tried out these reports, with literally three clicks, I got what I needed. It was like a wonderful gift—the time savings, but also the report presentation. It really transforms how you can get information for the Assurance Argument. As an institution, we have to show our processes are working and show our outcomes. Rather than write a description of what we do, we can now elevate it to an argument based on analysis of data.”