Achieving Your Implementation Goals With Washington State University

Washington State University (WSU) implemented Activity Insight system-wide with a primary goal: supporting annual reviews for all faculty. Thanks to a strong project team and plan, WSU built integrations that populate data from its campus HR system, a campus-built faculty activity reporting system, its student information system, and Office of Research systems; customized its database to meet the reporting needs of every unit; and now uses Activity Insight to support faculty annual reviews. Here, we’ll share the best practices behind WSU’s successful implementation.

1. Involve the Right Project Stakeholders

WSU’s campus teams began meeting in early 2017. “Our steering committee was a collaboration between the Provost’s Office; Institutional Research; Enterprise Systems, which is a part of our IT department; and the Office of Research, which supports faculty and student researchers,” said Stephanie Kane, Assistant Director, Office of Institutional Research. This gave the team two key advantages:

  • People who needed reporting from the system had input on what data to capture and how to make that data as useful as possible.
  • People with technical knowledge of each system were at the table to ensure successful integrations and data-quality best practices.

The implementation team emerged from the steering committee and included Kane; Craig Parks, Assistant Vice Provost; Greg Neunherz, Carson College of Business Director of Technology; Coleen McCracken, Administrative Planning Specialist, Employee Data; Jasen Skelton, Application Systems Analyst and Developer; Derek Brown, Research Operations Administrator; and Catherine Taylor, Carson College of Business Activity Insight Administrator.

For an implementation at the scale of WSU’s, Skelton recommends allocating a full-time IT person and a project manager to the project. Skelton was dedicated full time to the project, as was Kane for a portion of the implementation. Neunherz had a half-time appointment as the project manager. And McCracken and others contributed substantial portions of their daily work time to ensuring the project’s success.

2. Pace Yourself

It’s not easy to keep a large project on deadline. Project planning software helped, but in retrospect, the team would have allocated time differently. “We ran out of time on the reporting side,” Kane noted. “We should have probably condensed the time we were doing the road show (see below) to have more time to focus on reporting before our deadline. The screens are great, faculty are really pleased, but on the reporting side, we got more feedback on needed improvements.”

“The reports are a work in progress. More chairs will have ideas for improving them after using them once,” Skelton agreed. Team members wish they had been able to finalize the detail and layout of the screens earlier in the project, to leave more time for scripting, validating data from source systems, and developing reports.

3. Take Your Show on the Road

The WSU team headed out to campuses to gather information on how different colleges use faculty activity data in annual reviews. “This was a system-wide implementation across five physical campuses plus agricultural and Extension research stations,” Kane said. “We went out to a lot of different colleges with a road show and shared the goal of annual reviews. We also took time with Extension and the College of Medicine to better understand their reporting needs.”

“There’s always a lot of hesitation going from an old system, even a hated system, to a new system,” Skelton shared. In order to overcome some of these concerns, the team was responsive to feedback from college and department leadership, incorporating changes quickly and demonstrating a willingness to respond.

“They made a lot of good suggestions we would never have thought of without them looking at the system,” Kane said. “It was especially important for some of the more specialized disciplines like pharmacy and nursing.”

4. Define Your Approach to Data

Data drawn from an ambitious number of source systems populates WSU’s faculty activity records and reduces the amount of data faculty must enter themselves. WSU made a plan for the data coming from each system. “We did a lot of mapping of data to make sure it came in correctly,” Kane said.
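
As a rough illustration of that mapping work, the sketch below translates records from each source system into a common faculty-activity schema before import. The system names, field names, and mapping structure are illustrative assumptions for the example, not WSU’s actual mappings or Activity Insight’s import format.

    # Hypothetical field mapping: one map per source system, translating
    # source field names into a common faculty-activity schema.
    FIELD_MAPS = {
        "hr_system": {
            "EMPLID": "faculty_id",
            "JOB_TITLE": "rank",
            "DEPTID": "department",
        },
        "student_info_system": {
            "INSTRUCTOR_ID": "faculty_id",
            "CRSE_ID": "course_id",
            "ENRL_TOT": "enrollment",
        },
        "office_of_research": {
            "PI_ID": "faculty_id",
            "AWARD_NO": "grant_number",
            "SPONSOR": "sponsor",
        },
    }

    def map_record(source: str, raw: dict) -> dict:
        """Translate one raw record from a source system into the common schema.

        Fields without a mapping are dropped, so only vetted data moves forward.
        """
        field_map = FIELD_MAPS[source]
        return {field_map[key]: value for key, value in raw.items() if key in field_map}

    # Example: an HR row becomes a faculty record in the shared schema.
    hr_row = {"EMPLID": "00123456", "JOB_TITLE": "Associate Professor", "DEPTID": "CHEM"}
    print(map_record("hr_system", hr_row))
    # {'faculty_id': '00123456', 'rank': 'Associate Professor', 'department': 'CHEM'}

Keeping every mapping in one place makes it easier to review with data owners and to trace any reporting field back to its source system.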

In addition, the team made some bedrock data quality choices. “We decided early on that if we were importing data from a source system, that source is the system of truth,” Skelton said. “We set up Activity Insight so that faculty can’t change that data. It must be corrected at the source, because inaccurate data there affects other reporting out of that system.”

As a result, faculty can’t change things like enrollment in courses or contact information directly in Activity Insight, but can follow a process established for getting corrections into source systems.
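
As a loose sketch of how such a rule can be enforced, the example below marks certain fields as owned by a source system and rejects faculty edits to them, pointing the user to the correction process instead. The field names, owning systems, and error type are assumptions for the illustration, not Activity Insight’s actual configuration.

    # Hypothetical "system of truth" rule: fields populated from a source
    # system cannot be edited in the activity reporting database.
    SOURCE_OWNED_FIELDS = {
        "enrollment": "student information system",
        "contact_email": "campus HR system",
        "grant_number": "Office of Research system",
    }

    class ReadOnlyFieldError(Exception):
        """Raised when a user tries to edit a field owned by a source system."""

    def apply_faculty_edit(record: dict, field: str, new_value) -> dict:
        """Apply a faculty-entered edit, refusing changes to source-owned fields."""
        if field in SOURCE_OWNED_FIELDS:
            owner = SOURCE_OWNED_FIELDS[field]
            raise ReadOnlyFieldError(
                f"'{field}' is populated from the {owner}; "
                "request a correction there so downstream reporting stays consistent."
            )
        updated = dict(record)
        updated[field] = new_value
        return updated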

WSU also set a firm date that its legacy activity reporting system would become read-only and not accept new data. All data in the system at that time was imported into Activity Insight, and users had to begin using the new system to make updates.

5. Keep It Simple

With so much variability in disciplines, there’s always some debate on what people value in their annual reviews, the team shared. Some disciplines value certain types of contributions more than others; some want more detail, others want less.

In creating the annual report for WSU, “We balanced the needs of institutional reporting and department reporting,” Kane said. “We had clear goals from strategic planning metrics, and the road shows helped us get an understanding of the details that matter for annual reviews for a department or, say, pharmacy accreditation. The flexibility we built into annual reporting is on the detail end, not so much the bones of it.”

The result the team rolled out was a mostly unified annual report, with an important exception for Extension, which has highly specific reporting requirements.

6. Get LOTS of Feedback

WSU involved faculty in beta testing and presentations after they completed initial research with leadership. “We had a long, wide-ranging pilot test with faculty. We walked them through screens and took notes of their suggestions,” Kane said. “Taking on a system-wide annual report was an ambitious project. There’s so much variability between programs and disciplines and colleges, and we were trying to address the problems that existed in our campus-built system.”

The team felt that while they initially had a good cross section of faculty and leadership for input, they discovered that sometimes key players were still missing. “After ‘go live,’ departments shared additional priorities and nuances,” Skelton said. “It’s been an ongoing effort for the team to ensure that disciplines with unique requirements, such as music, can track and report on the information they need.”

7. Build on Success

“We’re already thinking about how we can make annual reporting even more flexible for them in the future,” Kane said. “One department or individual faculty member may want to show all lab sections while another may not. We want to accommodate that.”

The team also plans to revisit data collection and reporting for Extension to address things they discovered after launching the system. “There’s variability within Extension in how they report things for different types of Extension staff, so we’ll approach those things a little differently,” Skelton said.

About WSU

Founded in 1890 as Washington Agricultural College and School of Science, the state’s original land-grant university became known as Washington State University in 1905. Now with more than 30,000 students, WSU continues its land-grant heritage and tradition of service to society.

Its mission is to advance, extend, and apply knowledge through creative research, innovative educational programs, and local and global engagement.