Radford University’s road to demonstrating faculty influence on student success had a humble beginning: moving faculty annual reviews off of paper and into digital form. At the recent American Association of State Colleges and Universities (AASCU) conference, Charley Cosmato, Director of the Center for Innovative Teaching and Learning at Radford, and Andrew Wiech, Digital Measures Senior Engagement Consultant, shared the university’s evolution from simply digitizing faculty reviews to using faculty activity data to measure faculty’s impact on a range of strategic priorities, including student success and retention. In our last post, we shared Radford’s experience using faculty activity data to demonstrate the ways faculty contribute to student success, but that transformation didn’t happen overnight. Here, we’ll share Radford’s journey from paper-based faculty annual reporting to fully realizing the capabilities of a faculty activity database.
New Technology, New Approach
Prior to implementing Activity Insight, faculty activities were tracked on paper, Cosmato shared. “It wasn’t a usable form of data at all,” he said. Then, when the Virginia university system began an initiative to go paperless, “We collected the paper annual reports, pulled the staples out and ran them through a PDF scanner, and those scans went into the big digital archive of electronic files. But they were equally useless as a form of actionable data,” he continued. “With a dataset this huge, there is no ‘first glance’ at the data. Somebody would have to go through and read line by line. By the time you’re an inch into the stack, everybody starts to look the same, which is a horrible place to be.”
When Radford initially decided to implement Activity Insight, they thought it would just be nice to have a more efficient means of collecting faculty activity data, Cosmato shared. “We didn’t think, hey, we’ve invested in a system for collecting activity data, so what types of questions could we answer that perhaps we couldn’t answer as well previously? We didn’t think about it as a huge opportunity to answer other important questions,” he said. “Time and again, if you’ve been at this for a while, you get this fantastic new technology, this powerful tool, and immediately look to use it to perfectly replace what you had before. So it becomes a replacement technology as opposed to an innovation.”
Radford quickly implemented annual reviews and Association to Advance Collegiate Schools of Business (AACSB) accreditation reporting, then pivoted to considering how faculty activity data could answer larger institutional questions, including faculty’s impact on student success.
Defining the information and answers they needed ensured that Radford could make the most of the activity data faculty input into the system. “I’m sure some of you have had to write this unpopular email: Please by noon tomorrow (and I know you just turned in your faculty annual report) send me a list of all the publications you gave that met these criteria,” Cosmato said. “When you implement activity reporting software, faculty won’t immediately jump up and down and say, ‘Wow! We’re collecting data!’ A big part of faculty buy-in is actually using the data you already collect. Don’t ask for something you already have.”
Perspective From Radford
With several years of working with faculty activity reporting software under its belt, Radford shared some of its lessons learned with the AASCU audience:
Rolling out campus-wide: “Collecting data campus-wide from the start has made it a lot easier for us to get faculty buy-in and demonstrate the usefulness of going to an electronic process,” Cosmato said. “Let’s face it, if since 1910 you’ve been doing annual reporting a particular way, there will be some folks who need convincing that there are benefits for them.” Leveraging annual reporting data for accreditation and other metrics helps prove the value to faculty.
Campus resources: “If you’re going to invest in a system like this and take full advantage of it, you have to determine the right place within your institution, in terms of operational capacity and administrative authority, to be successful. Early on, our system sat with someone who had considerable authority but no operational capacity. And at some point he threw up his hands, spoke to the Provost, and said that to be successful with this, certain things have to happen on a regular basis. And that means we needed people dedicated to the project in order to make it successful,” Cosmato said. Since those resources were put in place, the project has flourished.
Superior support: “We have a regular conversation between our faculty annual reporting specialists and Andrew, and he really is instrumental in prompting us to go back and query, what are the most important initiatives on campus right now? Are you utilizing this dataset in a manner that helps you describe the impact your activities and interventions are having?” Cosmato shared. “Sometimes that means we go back to our campus community and we talk about collecting different pieces of information. While we started with this merely as a replacement tool for our faculty annual report, we’ve discovered that we can inform a lot of other processes.”
About Radford University
Founded by the Virginia General Assembly in 1910 as the State Normal and Industrial School for Women, Radford University has grown well beyond its original mission of educating women as teachers. Radford now offers undergraduate and master’s-level education as well as three doctoral programs to a 10,000-strong, diverse student body. With approximately 500 full-time faculty, Radford has been a client of Digital Measures since 2013.