Gaining the buy-in of faculty and administration is key to the success of an activity reporting database. But it’s not always easy. Linda Brewer, Senior Faculty Research Assistant and Project Manager, and Lucas Turpin, Information Technology Manager, recently shared Oregon State University’s (OSU) success in implementing an activity reporting database at our User Group. In the first post of this two-part series, OSU shares how it set goals for the system, got to “yes” with administration on funding data entry, and managed its data entry team. In part two of the series, learn how OSU overcame resistance, earned faculty buy-in through responsiveness, and what lessons it took away from the project.
Founded in 1868 as a land-grant institution, OSU is one of only two universities in the U.S. to also have sea grant, space grant and sun grant designations. OSU serves more than 31,000 students at its Corvallis, Bend, Newport, Portland and online campuses.
Outgrowing an Internal System
At OSU, as at many institutions, the Extension had developed a campus-built reporting system to help meet its substantial federal reporting requirements. Unfortunately, after the upfront investment, OSU’s internal reporting system didn’t have continued support or funding to ensure that it kept up with evolving needs.
“In the past, the federal government used to give the land grants their slice of the pie based on your population and basically said, ‘Go out and do good,’” Brewer said. “Today, we have to show what it was that we did with the money. And oversight is even more stringent at the state level.”
Meanwhile, several other units at OSU had success partnering with Digital Measures for their reporting needs, so Extension and the College of Agricultural Sciences signed on.
Extension and the College of Agricultural Sciences wanted wide-ranging reporting capabilities from a new activity reporting system. They identified several goals for Activity Insight, including:
- Faculty annual reporting
- Promotion and tenure
- Federal annual reporting
- 10-year program reviews
- Repository for datasets not tracked elsewhere
- Meeting unique departmental, college and university reporting needs
Brewer and Turpin also identified two key risks to the project’s success:
- Volume of faculty data: Research scientists and Extension faculty have extensive CVs, and OSU needed historical data to achieve its goals for the system. “In my view, the only way we were going to have success was if administration agreed to pay for data entry of the faculty activity,” Turpin said. “We had to make the case to administration by establishing the true value.”
- Building trust: Brewer had strong relationships with faculty in Extension, where she works, which made it easier to ask people to try things and offer honest feedback. The team would still have to build trust with faculty and administration in the College of Agricultural Sciences.
Get to “Yes”
To overcome the first risk, Turpin developed a spreadsheet laying out the costs and benefits of paid data entry of historical data covering two years, seven years or full CVs. “We made it clear to administration in both schools that if they wanted to do historical reporting, which they did, that this was going to be the requirement,” Turpin said. “This is what it was going to take to be successful. Our job was to make this an easy decision for them.”
Administration agreed to pay a data entry team to input seven years of historical data, which was the option Brewer and Turpin advocated based on the stated goal of using activity data for 10-year program reviews.
OSU’s Extension and College of Agricultural Sciences are separate units and have separate administrations. In determining what approach to take, Brewer and Turpin considered the culture in each unit:
Extension
- History of online reporting
- Culture of compliance with reporting requirements
- Viewed Activity Insight as the straightforward adoption of a new system
College of Agricultural Sciences
- Resistance to interruption
- Culture of independence
- Extensive reporting needs
- Many unique departmental reports
Leverage Systems of Record
“We have a lot of different systems we track data in, and there’s crossover between what Activity Insight and other systems do,” Turpin said. “We wanted to make sure we honored systems of record, and wherever possible we’re importing data into Activity Insight.” By leveraging data from other systems of record, OSU saves substantial data entry time.
“We’re very quickly getting to the point where we’re exporting data out of Activity Insight to fill the reporting gaps in the institution,” Turpin shared. That helps with faculty buy-in. “When they see this data magically flowing in, and they don’t have to build tables in their CV reports, they actually get very excited, and that’s great,” he said. “It was a big part of getting leadership to say yes to funding data entry.”
Successfully Manage Data Entry
Once they had a “yes” on funding data entry, Brewer and Turpin defined the team needed for the task. They recognized the need for consistent data and hired staff with strong backgrounds in data entry. In addition, they trained the data entry team on the university itself: the structure of university administration; the 36 counties in Oregon, since Extension operates in every county; and the vocabulary needed to understand the departments’ work.
Brewer also worked as an intermediary between faculty and the data entry team, interpreting faculty concerns and requests, delivering those messages to the data entry team and sending responses back to faculty. Frequent check-ins ensured information sharing among the data team, which bolstered efforts to standardize the data.
Brewer and Turpin recognized that it didn’t make sense to actively engage faculty in the system until data entry was underway. In the next post, we’ll share how OSU engaged faculty, overcame pockets of resistance and notched early successes with the system.