Leveraging Activity Insight for Program Review with the University of Kansas

Universities use program reviews to foster academic excellence, assess student outcomes, identify opportunities to improve instructional quality and provide insight for administrative decisions. When the Kansas Board of Regents mandated a new set of program review guidelines in 2012, Amanda Kulp, Program Manager, Professional Record Online (PRO), in the Office of Institutional Research and Planning at the University of Kansas (KU), used the opportunity to improve the existing program review process using data from Activity Insight.

Background

KU has been using Activity Insight for five years, and has a university-wide implementation that encompasses nine schools and its College of Liberal Arts and Sciences. As a research university, and a member of the prestigious Association of American Universities since 1909, KU has 1,200 faculty and 400 research faculty equivalents in Activity Insight, as well as about 100 staff using the system.

Upon implementation, Kulp and her team decided to enter all historical faculty information. With that project now complete, a huge historical and current data set populates KU’s system. In addition, the team trained all faculty to use the system, as KU’s first use of Activity Insight was for annual reviews and promotion and tenure.

New Program Review Guidelines

Like most universities, KU has been doing program reviews for decades. However, when the Kansas Board of Regents released its new guidelines, Kulp and her team felt it was a good time to examine their processes and see how they could better leverage their available data.

They discovered that past processes were unstructured and inconsistent between different programs. For instance, KU needed to provide the Board of Regents with a one-page summary for each program, but some programs were compiling hundreds of pages of information.

Overhauling the KU Program Review Process

In order to create a comprehensive, consistent (but simpler) program review process, KU started with a task force led by the Provost. That group defined 23 questions that each program would need to answer. These questions were discipline-neutral, measurable and data-informed. Departments could pull data from Activity Insight as well as other campus systems, and Kulp’s team made it easy for faculty and administrators to do so.

KU uses an internal portal called Campus Labs, which connects to all of KU’s databases, including Activity Insight. When it is time for a program review, the dean or faculty member logs in to Campus Labs, sees each of the 23 questions, and can easily pull data from the relevant databases.
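To make the routing idea concrete, here is a minimal sketch of how each review question can be mapped to the system that answers it. Everything in it is hypothetical: the question IDs, source names, sample values and the answer_question helper are illustrations, not KU's or Campus Labs' actual configuration.

```python
# Hypothetical sketch: routing program-review questions to the data source
# that answers them. Question IDs, source names and sample values are
# illustrative, not KU's actual configuration.

REVIEW_QUESTIONS = {
    "Q07": {"text": "Overall impact of scholarly work", "source": "activity_insight"},
    "Q12": {"text": "Faculty service to the discipline", "source": "activity_insight"},
    "Q18": {"text": "Six-year graduation rate", "source": "student_records"},
}

# Stand-in extracts from each campus system, keyed by program name.
DATA_SOURCES = {
    "activity_insight": {"History": {"Q07": "citation summary", "Q12": "editorial roles"}},
    "student_records": {"History": {"Q18": "graduation-rate figure"}},
}

def answer_question(program, question_id):
    """Look up the configured source for a question and return the program's value."""
    source = REVIEW_QUESTIONS[question_id]["source"]
    return DATA_SOURCES[source].get(program, {}).get(question_id, "data not yet loaded")

for qid, question in REVIEW_QUESTIONS.items():
    print(f"{qid} - {question['text']}: {answer_question('History', qid)}")
```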

Activity Insight Data—A Key Component

Activity Insight held data on important review topics including:

  • Overall impact of scholarly work
  • Community-engaged scholarship
  • Faculty service to the discipline
  • Faculty-student mentoring

After defining the specific data points they wanted to pull from Activity Insight, Kulp created custom reports to easily produce that information.

In addition to those areas, they found that Activity Insight was able to complement data from other systems. For example, when Kulp's team pulled Academic Analytics data from KU's internal system, that data tended to be very discipline-specific and often favored the hard sciences or social sciences. They found that the Academic Analytics data offered a snapshot of faculty productivity, but was more accurate and comprehensive when paired with faculty productivity data from Activity Insight.

In the end, Kulp and her team found Activity Insight reporting to be extremely beneficial for supporting their program review process, as well as helping to structure the new process. “Activity Insight was a great match in providing much of the data needed for our program review,” Kulp said.

Could your program review benefit from centralized data and a streamlined process? Contact us today to learn more about how a faculty activity reporting solution can support your university’s reporting and program review requirements.

Roll Out Campus-wide for FAR Solution Success

Looking for the best return on investment for your faculty activity reporting (FAR) solution? Based on system health metrics such as faculty engagement, data quality and use of reporting tools, we’ve discovered that the surest path to success begins with a big leap—rolling out campus-wide.

Why Campus-wide Works

Having uniform data from across the institution allows you to fully leverage the reporting capabilities of your FAR. This means that your provost can standardize and digitize promotion and tenure review and each of your colleges can run reports for its professional accreditation—all from the same set of data.

When committing to a campus-wide FAR solution, you might think it would be easier to roll out a unit or two at a time rather than to the whole university at once, but a staggered rollout has pitfalls. It may seem counterintuitive to go big rather than taking an incremental approach, but addressing institution-wide as well as unit-specific needs at the beginning of your FAR project pays substantial dividends in your ability to fully realize the many benefits of faculty activity reporting.

The Downsides of Incremental Rollout

Important downsides of an incremental rollout include:

Poor user experience. One of the first steps in implementing a FAR system is customizing it to meet the unique needs of your faculty. When you commit to a full implementation, you immediately begin reviewing all of the data collection and reporting needs for each college as well as the university as a whole. This is essential to making good foundational decisions and reaching needed compromises regarding field names, screen configurations and other matters before substantial time has been invested in customizations that suit only one unit.

Imagine trying to get used to a new tool that is constantly changing as your colleagues in other colleges come on board. Ongoing field and screen changes can frustrate users, so minimizing change makes it easier for faculty to become comfortable working in the system.

Wasted resources. When rolling out unit by unit, numerous tasks must be done for each unit instead of once for the entire university. If you have 10 units to roll out, that’s 10 times the work. In addition, work done at the request of one unit will have to be modified when another unit with somewhat different needs comes online. In a campus-wide implementation, those considerations arise sooner, and are resolved before work begins.

Delayed ROI. Many university reporting requirements happen on an annual basis. With a full implementation, your institution experiences campus-wide annual reporting and other benefits within a year of implementation. In a staggered rollout, those reporting capabilities won’t reflect the whole university until the final unit’s implementation is fully populated with data. If your envisioned rollout process will take multiple years, you delay the full return on your FAR investment by that number of years.

A Common Path to Success

Successful FAR implementations follow a path that begins in the provost’s office. The provost is uniquely positioned to champion your project and marshal needed resources. For example, early in your FAR implementation, you’ll need IT to build integrations with campus source systems for course, HR, grant and other data. This ensures high data quality and minimizes the data entry burden for faculty.

Because FAR solutions become increasingly useful as they are populated with data, our most successful clients begin by using Activity Insight for annual reporting, which requires faculty to enter just one year of data to start.

Next, they use FAR reporting for promotion and tenure reviews. Since annual reporting is already in place, faculty are familiar with the system and a portion of the necessary information has already been entered for the previous year’s annual review.

Faculty expertise profiles come next because faculty benefit from being able to share online the accomplishments they already entered for their annual and promotion and tenure reviews. Again, faculty buy-in means more and better quality data, which makes institution-wide reporting possible.

Finally, these universities use their FAR solution to prepare reporting for regional and professional accreditation such as the Higher Learning Commission (HLC) and the Association to Advance Collegiate Schools of Business (AACSB).

Final Thoughts

Every university considers multiple priorities when implementing a FAR solution. If you’re weighing the pros and cons of implementing unit by unit versus taking the plunge on a full implementation, a Digital Measures Implementation Consultant can help you steer a course to best outcomes. Reach out to us to start the conversation.

Faculty Expertise Profiles with University of Texas, El Paso and University of Washington

Faculty web profiles are the traditional way to share faculty data on a public website, and it's a use case that works perfectly with Activity Insight. Faculty expertise profiles are the evolution of faculty web profiles. Their biggest benefit is that they are more easily searchable for relevant parties, getting data into users' hands faster and more easily.

This post shares the experience of using Activity Insight to create faculty expertise profiles. Didier Hernandez, Office of the Provost, represented The University of Texas, El Paso, and Jan Boyd, Faculty Data Services Specialist from the University of Washington Information School, joined the conversation. Digital Measures Product Manager Kirby Fitch shared some knowledge from his perspective as well.

Background

The University of Texas, El Paso and the University of Washington Information School shared similar circumstances, as explained by Boyd and Hernandez. Each had an existing system that highlighted faculty expertise, but wanted to replace or enhance it using Activity Insight.

The University of Washington Information School (UW iSchool) launched Activity Insight in late 2010. Around the same time, the school was replacing its existing internal web directory, so the projects coordinated well. In May 2012, the new UW iSchool web directory was launched, with faculty profiles and areas of expertise fed by Activity Insight data.

The University of Texas, El Paso had long invested in sharing faculty expertise. Before signing on with Digital Measures, the university already had an extensive site, Expertise Connector, dedicated to sharing knowledge and expertise, connecting faculty and students, fostering research opportunities and more. One of Activity Insight’s primary roles for Hernandez was to simplify the flow of information into Expertise Connector.

UW iSchool Implementation

When Boyd began the project of implementing faculty expertise profiles within Activity Insight, she initially struggled to determine the correct fields or areas of expertise to include. The faculty wanted so many options that the list ended up being overwhelming and not useful.

The project’s focus changed when the new Associate Dean of Research took interest in the project. Her first question when reviewing the current data and fields was, “Who is the intended audience for these areas of expertise?”

There were multiple answers to that question, so Boyd and her team chose to categorize the areas of expertise into four buckets:

  • A “Research Area” section of the website, pulled from a field added to Activity Insight
  • A notation by faculty members who are willing to take on PhD students, pulled from another new field in Activity Insight
  • All faculty names by teaching area on the Academics portion of the website, pulled from Activity Insight data
  • A list of experts for media and external needs

Boyd shared that her team was still working to release this new functionality in conjunction with a new school website scheduled to launch in late 2016 or early 2017.

University of Texas, El Paso Implementation

The University of Texas, El Paso already had a comprehensive faculty expertise and knowledge sharing system in place with their Expertise Connector website. The site had a searchable element and was accomplishing the goals the university wanted.

Activity Insight comes into the picture on the data side of that system. Before integrating with Activity Insight, Expertise Connector needed to pull from several different sources to populate all the data. This included many different databases, along with manual data entry. The process was complex and cumbersome.

By using Activity Insight, Hernandez and his team were able to pull most Expertise Connector data from one source. Either Activity Insight had the data needed, or it could automatically pull from necessary databases, streamlining the data flow into Expertise Connector.
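As a rough sketch of that single-source flow, the example below reshapes one Activity Insight-style export into the kind of record a public expertise site might consume. The field names, export format and to_expertise_profile helper are assumptions for illustration, not UTEP's actual Expertise Connector schema or integration.

```python
# Hypothetical sketch of a single-source feed: one faculty-activity export
# replaces several separate feeds into a public expertise site. Field names
# and the export format are illustrative, not UTEP's actual schema.

import json

activity_insight_export = """
[
  {"name": "A. Researcher", "department": "Biology",
   "keywords": ["genomics", "bioinformatics"],
   "publications": [{"title": "On genomes", "year": 2015}]}
]
"""

def to_expertise_profile(record):
    """Reshape one faculty record into the structure the expertise site consumes."""
    return {
        "display_name": record["name"],
        "unit": record["department"],
        "expertise_tags": sorted(record.get("keywords", [])),
        "recent_work": [p["title"] for p in record.get("publications", []) if p["year"] >= 2012],
    }

profiles = [to_expertise_profile(r) for r in json.loads(activity_insight_export)]
print(json.dumps(profiles, indent=2))
```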

Now that the project is running smoothly, Hernandez’s team has plans to expand the use of Activity Insight throughout the campus.

Have you been using Activity Insight to support faculty expertise profiles at your university? We’d love to hear your story—please comment below. If you’re interested in learning more about this topic, let’s talk. Contact your Digital Measures Solution Specialist today.

Giving Back: DMers Stuff the Bus to Feed the Hungry in Eastern Wisconsin

It’s the time of year when we celebrate holidays with family and friends, and every holiday has one thing in common: food. But for people who struggle to access food, the holidays can be especially stressful. So Digital Measures volunteers pitched in today at Feeding America Eastern Wisconsin’s 19th annual Stuff the Bus food drive. 

The largest food drive in Wisconsin, Stuff the Bus collects food donations at the Brookfield South Pick 'n' Save and loads them into Milwaukee County Transit System buses. The buses transport the food to Feeding America Eastern Wisconsin's Milwaukee warehouse to be unloaded and sorted for distribution to the organization's 550+ members—food pantries and other service organizations that provide those in need with food and other essentials.

Turkey and So Much More

That’s where DMers came in. Today, we unloaded two buses full of food—nearly 5,000 pounds of everything from breakfast cereal to frozen turkeys—then sorted and packed it for distribution. The boxes, containing Thanksgiving favorites like dressing, canned gravy, vegetables and cranberry sauce, as well as staples like peanut butter, soup and pasta, will be distributed to families and individuals through food pantries and other organizations.

"Today is a special event, but what you see happening here today is what we do here every day, and it wouldn't get done without volunteers," said David Berka, Volunteer Engagement Coordinator for Feeding America Eastern Wisconsin. In addition to food and monetary donations, it takes 75,000 volunteer hours per year to provide food to the individuals and families served by Feeding America Eastern Wisconsin's partner organizations, Berka noted.

Commitment to Community

“Volunteering is a core belief at Digital Measures. DMers are encouraged to volunteer two work days per year,” said Meghan Lemerond, Client Engagement Team Lead and chair of the Volunteering and Community Outreach committee. “Involvement in the community is essential for being a well-rounded individual—it’s our responsibility as individuals and as a company to make a positive impact.” The committee coordinates at least one volunteer opportunity each quarter, but DMers are also encouraged to volunteer with organizations that reflect their own concerns and passions.

Feeding America Eastern Wisconsin

Feeding America is a national network of more than 200 food banks across the country, and is the largest domestic hunger-relief agency in the United States. Feeding America makes access to healthy foods a priority, including fresh fruits and vegetables, milk and meat—foods often unavailable to those struggling with hunger.

Feeding America Eastern Wisconsin (https://feedingamericawi.org/) is the leading hunger-relief network in Wisconsin, with food banks in Milwaukee and the Fox Valley that serve more than 550 food pantries, soup kitchens, meal programs, emergency shelters, day care centers and senior centers in 36 counties. Each year, millions of pounds of food are distributed to more than 377,000 individuals, including 124,000 children and 41,000 seniors.

Collaborating Through Separate Implementations with the University of Oklahoma

Implementing new software always brings its share of positive change, insights and, of course, a few challenges. Two campuses at the University of Oklahoma implemented Activity Insight separately, but collaborated through the process. In this blog, we’ll discuss how both implementations benefited from the collaboration.

Background

Three campuses make up the University of Oklahoma system, with the primary one in Norman, Oklahoma. The Norman campus serves all academic fields except for health services. The Oklahoma City campus includes the Center for Health Sciences and the College of Medicine. The third campus is in Tulsa.

Because the Center for Health Sciences and the College of Medicine are separate from the main university, it made the most sense to have two separate Activity Insight implementations. Karen Horne was the Project Manager for the university-wide implementation, and Leah Haines was the OU College of Medicine Project Coordinator.

The university’s Colleges of Engineering and Education already had implementations of Activity Insight when Horne’s team began OU’s university-wide implementation in 2015.

The project team first focused on implementing other colleges, libraries and units, then migrated Engineering and Education to the university-wide system. In the end, between the Norman and Tulsa campuses, the University of Oklahoma had 2,132 Activity Insight users, 1,764 of whom were faculty.

The OU College of Medicine began its project in 2014, and will implement across 31 science and clinical departments by 2017. When complete, the College of Medicine implementation will have 1,494 total Activity Insight users, including 1,331 faculty.

Benefits of Collaboration

Haines and Horne found many benefits of collaborating and sharing ideas while working on their separate Activity Insight implementations. To start with, since both were from the University of Oklahoma system, they were familiar with the constructs and processes of the university—and could discuss sensitive information as needed. The two decided early on to have an open-book policy when it came to sharing their experiences and advice.

Because the two projects overlapped, they could ask each other for advice when facing a specific issue or making a decision. For instance, the OU College of Medicine needed to include several proxies in their system to help enter data, and Haines was able to ask Horne's advice on how to set up user access, credentials and other important details. When Horne's team began importing CV data into their system, Haines also shared a great tip to help the data entry teams enter data correctly: color-coding sections of the CV.

The two shared that they loved having someone to bounce ideas off of and to ask for an opinion before making a big configuration decision. They shared screenshots and ideas throughout their implementation processes.

Advantages in the Differences

One of the first areas where the two implementation teams parted ways was branding. The University of Oklahoma chose to call their system the Faculty Activity System, while the OU College of Medicine decided to stick with Activity Insight. At first, Horne and Haines weren’t sure if the dual names would cause confusion, but the opposite turned out to be true. Because many users were in both systems, it was easy to differentiate between them.

Horne and Haines had another win during their co-existing implementations when Horne experienced an IT problem. While the Tulsa campus was doing faculty reviews, users reported that the system timed out or wasn't accessible at all. As the university-wide coordinator, Horne jumped on the problem, speaking to IT teams from all three campuses and Digital Measures. They discovered a faulty router at the Tulsa campus. Replacing it resolved the issue.

Horne kept Haines in the loop during this hiccup because there were also College of Medicine users at the Tulsa campus. In fact, Haines was planning a rollout at that campus in the coming weeks and would not have known about this issue had Horne not shared it. Due to the prompt communication, Haines delayed her rollout until the issue was resolved.

Have a success story about cross-campus collaboration and implementation? We'd love for you to share your approach. Digital Measures loves collaboration and putting administrators in touch with each other. There is no better way to learn new setups, share advice, and build networks and relationships than talking with the individuals who do this every day. Reach out to your Solution Specialist if there is a college or institution that you would like to be connected to.

Medical Schools Inspire Enhancements to Activity Insight

Digital Measures recently rolled out enhancements to Activity Insight designed to make faculty activity reporting easier for schools of medicine, medical centers and other related health disciplines. We collaborated with several of our clients to examine the customizations they made in Activity Insight and to determine what type of information is needed to accurately capture a unique range of complex activities.

The result? Base screens with the built-in functionality to collect the faculty activity information most critical to schools of medicine including clinical teaching, mentoring, hospital affiliations and more.

Clients who previewed these enhancements at our recent User Group responded enthusiastically. One attendee told us that the new screens cover 90 percent of the customizations done during on-boarding at their College of Medicine, and said that these enhancements will help any medical school go live very quickly.

Unique Needs of Medical Units

The educational model in schools of medicine presents data collection challenges when it comes to tracking faculty activity. This is evident in two areas: reporting faculty's own education and affiliations, and reporting on the specific activities medical faculty perform. These two areas were the focus of the new base screen changes.

Broadening the Scope

Medical training goes far beyond the classroom. For medical faculty, clinical graduate and postgraduate training, such as internships and residencies, is an important part of their CV, as are formal mentorships. Activity Insight now provides the screens and fields needed to record this information.

In addition, medical faculty’s clinical work is calculated separately when looking at total workload, so faculty can now record clinical workload percentages as well as clinical teaching activities that don’t fit into a term-based dating scheme. Clinical activities are categorized differently by different universities, so customizable fields for those categories are now available in Activity Insight.

Additionally, medical units need detailed information about faculty affiliations to fully document their programs, specializations, divisions and units. These can now be easily tracked in Activity Insight, as can specific information on the funding sources of grants that are unique to medical schools.
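To picture what such a record might hold, here is a hypothetical sketch of a clinical activity entry that uses a date range rather than an academic term, along with a clinical workload percentage, an affiliation and a funding-source detail. The field names are illustrative assumptions, not the actual Activity Insight base-screen definitions.

```python
# Illustrative sketch of a clinical activity record with a date range (not a
# term), a clinical workload percentage and an affiliation. Field names are
# assumptions for illustration, not actual base-screen definitions.

from datetime import date

clinical_activity = {
    "activity_type": "Attending rounds",           # locally customizable category
    "start_date": date(2016, 7, 1),                # date range, not a term
    "end_date": date(2016, 12, 31),
    "clinical_workload_percent": 40,               # counted separately from teaching load
    "affiliation": {"hospital": "University Hospital", "division": "Cardiology"},
    "funding_source": "Clinical trial sponsor",    # medical-specific grant detail
}

print(f"{clinical_activity['activity_type']} "
      f"({clinical_activity['start_date']} to {clinical_activity['end_date']}), "
      f"{clinical_activity['clinical_workload_percent']}% clinical workload")
```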

Just the Beginning

Activity Insight remains fully customizable, but for medical units looking for an out-of-the-box solution, these enhancements ensure they can get up and running quickly.

“The current enhancements will be helpful to many kinds of medical units, but we plan to focus on the specific needs of additional health disciplines going forward,” said Kirby Fitch, Digital Measures Product Manager who oversaw the medical school enhancement initiative.

If you’d like to have a conversation about using Activity Insight in your college of medicine, please reach out through our contact page.

Great Reports Start With Accurate Data

The most important component of any faculty activity reporting solution is accurate data, which determines the quality of every report as well as every faculty annual review and promotion and tenure process. Achieving a high level of data quality means you can trust your reports to inform your decisions about issues related to faculty activity and other strategic priorities.

Four Dimensions of Data Quality

To determine the quality of data in your Activity Insight instrument, Digital Measures considers four dimensions:

  • Completeness is the measure of whether data needed to feed your reports are indeed present in your system.
  • Consistency in data collection ensures that all activities of a certain type are entered the same way, in a single source field, and can be consistently extracted to feed any report that requires the information.
  • Currency is the measure of how up to date your data are, and therefore how accurately they reflect your faculty's most recent activities. The more you can count on Activity Insight to provide "fresh" information, the more often it will be seen as the go-to source for on-demand information requests about activities and accomplishments.
  • Accuracy is, well, accuracy. More on that in a bit.

We provide quality scores on the completeness, consistency and currency of your instrument’s data on a quarterly basis.
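As a back-of-the-envelope illustration of the first and third dimensions, the sketch below scores completeness and currency over a couple of sample publication records. The formulas, field names and sample data are assumptions for the example; they are not Digital Measures' actual scoring methodology.

```python
# Minimal sketch of scoring completeness and currency over faculty activity
# records. Formulas and field names are illustrative assumptions.

from datetime import date

REQUIRED_FIELDS = ["title", "journal", "year"]  # hypothetical required fields

records = [
    {"title": "Paper A", "journal": "J. Example", "year": 2016},
    {"title": "Paper B", "journal": None, "year": 2011},
]

def completeness(recs):
    """Share of required fields that are actually populated."""
    filled = sum(1 for r in recs for f in REQUIRED_FIELDS if r.get(f))
    return filled / (len(recs) * len(REQUIRED_FIELDS))

def currency(recs, window_years=2, today=date(2016, 11, 1)):
    """Share of records dated within the last `window_years` years."""
    cutoff = today.year - window_years
    return sum(1 for r in recs if (r.get("year") or 0) >= cutoff) / len(recs)

print(f"completeness: {completeness(records):.0%}")
print(f"currency:     {currency(records):.0%}")
```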

The X Factor—Accuracy

Only you can evaluate data accuracy for your university, so it’s important to regularly review reports for questionable information. The more users there are using reports from the system, the more eyes there are to spot possible inaccuracies.

Check Data From Other Sources

To begin improving accuracy, build a process around regularly updating data that comes from another campus source system such as Banner or PeopleSoft. It's important to have a well-vetted bulk data upload process or web services integration to automate a consistent stream of accurate records. It's useful to evaluate data from screens such as:

  • Personal and Contact Information
  • Yearly Data
  • Permanent Data
  • Scheduled Teaching
  • Academic Advising
  • Contracts, Fellowships, Grants and Sponsored Research
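For instance, a recurring sync job might stage a Scheduled Teaching extract from a source system and reject incomplete rows before upload. The CSV layout and validation rules below are assumptions for illustration, not Digital Measures' actual import format.

```python
# Hypothetical sketch of staging a Scheduled Teaching extract from a campus
# source system (e.g. Banner) for a recurring bulk upload. The CSV layout and
# validation rules are assumptions, not an actual import specification.

import csv
import io

source_extract = """term,course,instructor_id,enrollment
2016FA,BIOL101,1001,142
2016FA,BIOL405,1002,
"""

def stage_rows(raw_csv):
    """Validate rows before upload; route incomplete rows back to the data owner."""
    ready, rejected = [], []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if all(row.get(k) for k in ("term", "course", "instructor_id", "enrollment")):
            ready.append(row)
        else:
            rejected.append(row)
    return ready, rejected

ready, rejected = stage_rows(source_extract)
print(f"{len(ready)} rows staged for upload, {len(rejected)} returned for correction")
```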

Make Alerting Easy

Develop an open channel of communication for faculty to address any inaccuracies in the data imported on their behalf. Adding help text on these screens to direct faculty to the person who can correct the source will ensure that Activity Insight receives corrected data during the next scheduled upload. Frequent updates of data from other campus source systems ensure that Activity Insight data is both current and accurate.

Build Test Reports

It’s also helpful to create custom reports with the specific purpose of testing data quality. These reports can be tailored based on the metrics your campus finds most important. For example, if collaborating with students on research is a key metric, you can surface data quality issues around this metric by building a custom report with a table that displays research records, breaking out collaborators, roles and if a student was involved.
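A test report along those lines boils down to a few simple checks. The sketch below flags research records where a collaborator's role or the student-involvement flag is missing; the record structure is a simplifying assumption, not an Activity Insight schema.

```python
# Illustrative sketch of a data-quality "test report": flag research records
# where collaborator roles or the student-involvement flag are missing.
# The record structure is a simplifying assumption.

research_records = [
    {"title": "Project X",
     "collaborators": [{"name": "R. Smith", "role": "Co-PI", "is_student": False}]},
    {"title": "Project Y",
     "collaborators": [{"name": "T. Chen", "role": None, "is_student": None}]},
]

def quality_issues(record):
    """Return a list of human-readable issues for one research record."""
    issues = []
    for c in record["collaborators"]:
        if not c.get("role"):
            issues.append(f"{c['name']}: missing role")
        if c.get("is_student") is None:
            issues.append(f"{c['name']}: student involvement not recorded")
    return issues

for rec in research_records:
    for issue in quality_issues(rec):
        print(f"{rec['title']}: {issue}")
```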

The Dividends of High Data Quality

When all four dimensions of data quality are high, the payoff is enormous: complete, current and correct reporting. Our Resource Center, a knowledge base and user community for Activity Insight, has in-depth articles to assist clients with addressing all dimensions of data quality. In addition, your Solution Specialist can provide your system’s most recent scores and help you identify opportunities to improve them.

What happened to your reports when you improved data quality? We’d like to hear your story.

Capturing the Impact of Community Engagement with New Mexico State

At our recent User Group, Steven Loring of New Mexico State University led a discussion on capturing the impact of community engagement via Activity Insight. Loring, the Associate Director of the Agricultural Experiment Station System at NMSU, is involved with the tracking and analysis of community engagement for the College of Agricultural, Consumer and Environmental Sciences (ACES) and the university as a whole. Here's a follow-up on his presentation, with additional discussion of the value of reporting impacts at the university level.

Background

New Mexico State University sits on a 900-acre campus and enrolls more than 15,000 students from 49 states and 89 countries. Founded in 1888 as Las Cruces College, the school opened the land-grant Agricultural College and Experiment Station in 1890. NMSU now offers more than 180 degree programs at five campuses. In 2015, the Carnegie Foundation for the Advancement of Teaching selected NMSU to carry its Community Engagement Classification in recognition of the university’s commitment to community impact.

Telling a University’s Whole Story

Loring’s experience and commitment to both community engagement and reporting on impacts come from his work in ACES. Since NMSU is a land-grant university, ACES has reporting obligations to the United States Department of Agriculture (USDA) as well as legislative bodies on which ACES relies for funding. Loring is committed to extending community engagement and impact reporting to the university as a whole, because “Provosts, presidents, chancellors, or whoever is the head of your university, needs to be able to shed light on what the university does for the community. University leaders need to go back to governors, legislators, funders and alumni with the stories of how we make a difference,” he explained.

Tracking activity and outcomes is baked into much of the teaching, research and outreach done by ACES faculty and staff, so it’s easier to gain buy-in for use of Activity Insight, particularly since there’s such a direct connection between impact reporting and funding for ACES. That connection isn’t as direct for other parts of the university.

In taking community engagement impact reporting to the university level, Loring has discovered that collecting data can seem burdensome to faculty in other disciplines. Engaging these faculty is part of Loring’s project to gain a 360-degree view of NMSU’s community engagement impacts.

Measuring the Impact of Community Engagement

When it comes to community engagement, there’s no one-size-fits-all metric for measuring impacts. Some impacts can readily be quantified and others can’t, but both matter, according to Loring. For example, a new integrated pest management program cut application of potato pesticides by 15 percent, resulting in an annual average savings of $2 per acre. Here, the outcome is measurable. But when a faculty member gives a talk about weather to fourth graders, there isn’t a readily measurable impact or outcome. Still, being the public face of the university has an intangible value, and may contribute to future outcomes that can’t be easily accounted for or predicted. “You’re comparing apples and oranges, but both apples and oranges are important,” Loring commented.

Loring is researching methods of collecting data on the less tangible impacts of community engagement via “ripple effects mapping.” For example, if someone from the university gives a talk that inspires someone to begin a project, and that project inspires someone to donate a truck to the project, and that truck helps the project accomplish a particular result, how can the university talk meaningfully about its role in this outcome?

Activity Insight as a Data and Reporting Model

Activity Insight has been an effective tool for reporting on community engagement and impact data for ACES faculty and staff, but as Loring takes his efforts to the university level, it has become clear that faculty data is just one part of the picture.

When it comes to impacts, where else might the university’s story be found? “An engineering intern building a bridge in a third-world country may just seem like an internship, but it’s also community engagement,” Loring said.

Loring serves on a subcommittee focused on issues surrounding data: where it exists within the university, how it’s structured, and how it can be brought together to build a more comprehensive picture of the university’s impact via community engagement activities. The road ahead includes determining how those data sets can be created in compatible formats and integrated with data and reporting from Activity Insight to tell the comprehensive story of the university’s impact via community engagement.

At land-grant universities, Activity Insight’s usefulness in capturing data and reporting on community engagement impacts is often first discovered and leveraged by Colleges of Agriculture. But as NMSU demonstrates, the story of community engagement is important across the university as a whole. Is your university measuring and reporting on its community engagement impacts? We’d love for you to share your story in comments.

Getting Down to Business with AACSB, with Johns Hopkins University & California State University, Los Angeles

At our recent User Group, two universities shared their experience using Activity Insight to streamline the Association to Advance Collegiate Schools of Business (AACSB) accreditation process. This session was presented by Frederick San Mateo, Senior Business Analyst at Johns Hopkins Carey Business School, and Erin Doolittle, Special Projects Manager at the College of Business & Economics in California State University, Los Angeles.

Background: Johns Hopkins Carey Business School

Johns Hopkins University, based in Baltimore, MD, has nine divisions, three of which use Activity Insight, including the Carey Business School. The business school has 88 full-time and 49 part-time faculty, as well as over 2,000 students.

Johns Hopkins Carey Business School is seeking AACSB accreditation for the first time, with a site visit scheduled for November 2016.

Getting Started

Seeking initial AACSB accreditation was a large undertaking, made even more challenging by an accelerated timeframe (15 months) requested by the dean of the Carey Business School. In order to meet that timeline, the project began with a comprehensive data entry initiative, in which graduate students entered faculty CV data into Activity Insight and then worked to validate it and fill in missing information.

Frederick's team found the data was still incomplete, so every month throughout the project they ran a report to review what was missing and request that faculty update it as needed.

Ensuring Accuracy

In addition to working with the faculty to collect data, Frederick’s team also pulled relevant data from their internal registrar and HR systems to fill in missing data or validate discrepancies. During this process, the team discovered a number of nuances that needed addressing so that reports would be accurate and complete (as one example, part-time faculty were only shown as “active” when they got paid, so there were times when these faculty were unintentionally excluded from reports).

To ensure data quality, they created a process and checklist of procedures to perform before they generated AACSB reports, to make certain they had the data needed (e.g., accounting for all part-time faculty, excluding guest lecturers).
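One item on such a checklist can often be automated. The sketch below flags part-time faculty whose pay-driven HR status reads "inactive" even though they taught during the review period, so they are not silently dropped from reports. The field names and statuses are illustrative assumptions, not Carey's actual data model.

```python
# Hypothetical pre-report check: flag part-time faculty whose HR status is
# "inactive" only because they are not currently being paid, so they are not
# excluded from AACSB reports. Field names and statuses are assumptions.

faculty = [
    {"name": "A. Adjunct", "appointment": "part-time", "hr_status": "inactive",
     "taught_in_review_period": True},
    {"name": "B. Guest", "appointment": "guest lecturer", "hr_status": "inactive",
     "taught_in_review_period": True},
]

def needs_manual_review(person):
    """Part-time faculty who taught during the review period should appear in
    reports even if their pay-driven HR status says inactive; guest lecturers
    are excluded by policy."""
    return (person["appointment"] == "part-time"
            and person["hr_status"] == "inactive"
            and person["taught_in_review_period"])

for p in faculty:
    if needs_manual_review(p):
        print(f"Check before reporting: {p['name']} may be wrongly excluded")
```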

The final step in striving for accuracy was having the faculty validate their own records and enter any outstanding data. Although they didn't want to burden faculty, and as such had already done most of the data entry for them, Frederick's team believed that faculty involvement in their records was a crucial step toward data quality. The dean, understanding the accelerated timeline and the need for everyone to participate, supported this effort by basing all future annual reviews only on data entered into Activity Insight. This proved a great incentive for faculty to double-check and update their own records.

Thanks to the team's methodical data collection and entry processes, along with the flexibility of Activity Insight, Johns Hopkins Carey Business School is confident going into its first AACSB accreditation and well-positioned for future reporting needs. The next section features California State University, Los Angeles, which used similar strategies while completing its AACSB accreditation a year earlier.

Background: California State University, Los Angeles College of Business & Economics (CBE)

California State University, Los Angeles is one of 23 California State University campuses and a large urban state school. Unlike some schools in the system, Cal State LA is more teacher-scholar based, with most faculty engaged in professional activity outside of teaching.

The College of Business & Economics (CBE) has 70 full-time and 100 adjunct faculty, along with about 4,600 students. The environment is a competitive one, with many business schools and MBA programs in the LA area. Additionally, the school was in a state of transition at the time of their accreditation project, with a new university president plus a shift from quarters to semesters.

Getting Started

Erin Doolittle shared that CBE hadn't focused on AACSB preparation since its last review four years earlier. They had been using Activity Insight only to capture basic faculty data, so there was a lot of information missing. CBE established a project team, with Erin at the helm as a dedicated resource for this project. Their first step was collecting five years of faculty accomplishments, dating back to the last accreditation review.

Developing a System

As Erin and her team began collecting data, they realized they needed to step back and decide how to categorize it within Activity Insight. They looked at what other business schools that had used Activity Insight had done, combined that with their own needs and capabilities, and then developed a category and point system to track faculty activities.

The team created four categories (peer-reviewed journal, books, other scholarly, and professional activities), plus a "participating" category, each with a point value and a minimum faculty requirement. A total of 53 unique activities were created under those umbrellas. Next the team had to define each unique activity and the data needed to complete each record, work with Digital Measures developers to build the exact screens needed, and begin mapping data. Like Johns Hopkins, CBE chose to involve faculty with data entry, including a coordinated training effort to get faculty comfortable with the new system.
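To illustrate how such a tally might work, here is a minimal sketch that sums category points for one faculty member and compares the total to a minimum. The point values and threshold are made up for the example; the post does not publish CBE's actual weights or the 53 activity definitions.

```python
# Illustrative sketch of a category-and-point tally like the one CBE describes.
# Point values and the minimum are made up for the example.

CATEGORY_POINTS = {
    "peer_reviewed_journal": 5,
    "books": 4,
    "other_scholarly": 2,
    "professional_activities": 1,
}
MINIMUM_POINTS = 10  # hypothetical qualifying threshold per faculty member

faculty_activities = {
    "F. Example": ["peer_reviewed_journal", "peer_reviewed_journal", "professional_activities"],
}

def total_points(activities):
    """Sum category point values across a faculty member's recorded activities."""
    return sum(CATEGORY_POINTS[a] for a in activities)

for name, acts in faculty_activities.items():
    pts = total_points(acts)
    status = "meets" if pts >= MINIMUM_POINTS else "falls below"
    print(f"{name}: {pts} points ({status} the {MINIMUM_POINTS}-point minimum)")
```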

Lessons Learned

Both business schools learned similar lessons to share with other schools planning to use Activity Insight for AACSB reporting. To start with, customizing Activity Insight was the easy part, according to Frederick and Erin, who both worked closely with their Digital Measures teams. The harder part was taking the time to figure out how to define their business school and faculty within AACSB standards and then get the required data into the system. On that note, both recommended starting the planning and research as early as possible, as unexpected issues will arise throughout the process.

Finally, both Frederick and Erin stressed the importance of including faculty in the project. In addition to ensuring their data was clean, faculty became involved as each school chose to build out CVs from the data and use this information for annual reviews. Review teams benefited from consistent data for conducting annual reviews, while faculty had an incentive to keep their data fully up to date within Activity Insight.

What’s your process for AACSB reporting? Have you overcome similar hurdles as Frederick and Erin? We’d love to hear your thoughts in the comments. And if you want to learn more about how Activity Insight can streamline the process, contact us today.

Engaging Your Faculty with UC-Merced

When Mubeena Salaam joined the University of California, Merced a year ago to administer the university's Activity Insight instrument, she identified faculty engagement as a strategic priority. At our recent User Group, Salaam shared her efforts to drive faculty adoption—and enthusiasm—by rolling out a new feature to address one of the biggest pain points for faculty.

Background

UC-Merced is the first new American research university established in the 21st century, with a mission of research, teaching and public service. The newest campus in the University of California system, it began offering undergraduate classes in 2005. In just 11 years of operation, UC-Merced has earned a spot in the US News & World Report ranking of national universities, and is the youngest “doctoral-granting university with higher research activity” in the Carnegie Classification of Institutions of Higher Education.

Turning “Meh” to “Wow!”

“My first order of business as a new administrator was to prioritize the pain points and tackle one that could be quickly resolved and make a significant impact on the mindset of the faculty,” explained Salaam. She learned that data entry was a critical concern for Activity Insight users. She also discovered that the “Import Items” feature hadn’t yet been implemented. This feature allows faculty to import publication citations directly from BibTeX and PubMed rather than manually adding each publication into Activity Insight.

Salaam believed that making data entry easier would significantly impact faculty because they enter publication information so often. A successful rollout of the “Import Items” feature would build enthusiasm for Activity Insight among faculty, renew confidence in the system and build the credibility of her team by demonstrating responsiveness to faculty feedback.
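Conceptually, a citation import parses a structured entry into the discrete fields that populate a publication record. The sketch below does this for a single well-formed BibTeX entry with a minimal regex-based parser; it is an illustration of the idea, not the Import Items implementation.

```python
# Rough illustration of what a citation import does conceptually: parse a
# BibTeX entry into discrete fields that can populate a publication record.
# Not the Import Items implementation; a minimal parser for one clean entry.

import re

bibtex_entry = """@article{doe2016,
  author  = {Doe, Jane and Roe, Richard},
  title   = {An Example Article},
  journal = {Journal of Examples},
  year    = {2016}
}"""

def parse_bibtex(entry):
    """Extract key = {value} pairs from a single BibTeX entry."""
    fields = dict(re.findall(r"(\w+)\s*=\s*\{([^{}]*)\}", entry))
    fields["authors"] = [a.strip() for a in fields.pop("author", "").split(" and ")]
    return fields

print(parse_bibtex(bibtex_entry))
```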

Research, Testing and Planning

The most important and time-consuming phase of the project included a feature study, extensive testing and meticulous planning. After thorough research in the Digital Measures Resource Center, our knowledge base and user community forum, Salaam consulted Solution Specialists Stacy Becker and Brett Bernsteen regarding capabilities of the feature, implementing a Beta environment and establishing an implementation timeline. She also connected with Jeff Adachi and Peter Underwood of University of California Irvine, a peer institution which had already launched the feature, to learn from their implementation experience.

UC-Merced's team did significant beta testing, pulling publications from Google Scholar to discover which data came through and which fields weren't populating, surfacing the system's current limitations and documenting everything. The findings she shared with the DM team allowed us to make changes to UC-Merced's custom screens to ensure the cleanest possible data imports, Stacy noted.
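A field-coverage check of the kind a beta test might use can be as simple as comparing an imported record against the fields the custom screen expects and noting anything that did not populate. The field names below are assumptions for illustration.

```python
# Hypothetical field-coverage check for beta testing an import: compare an
# imported record against expected fields and document what didn't populate.
# Field names are illustrative assumptions.

EXPECTED_FIELDS = ["title", "journal", "year", "doi", "page_range"]

imported_record = {"title": "An Example Article", "journal": "Journal of Examples",
                   "year": "2016", "doi": "", "page_range": None}

missing = [f for f in EXPECTED_FIELDS if not imported_record.get(f)]
print("Fields that did not populate:", ", ".join(missing) or "none")
```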

With these steps complete, Salaam and her team prepared a proposal for the university’s Vice Provost, who was asked to champion the rollout. The proposal included a description of all aspects of the new feature and all the steps that would be taken to ensure successful rollout, including documentation and training to be offered.

“It’s important to have a champion who is aware of system limitations but also recognizes that what the system can do vastly outweighs its limitations,” Salaam noted.

Collaboration Is Key

Since Salaam's team administers Activity Insight at the university level, Salaam reached out to the Activity Insight administrative team for each school in the university. Fearing that a poorly managed rollout could create dissatisfaction among faculty, some of UC-Merced's school-level Activity Insight administrators initially hesitated to introduce a new feature.

To quell concerns and gain buy-in, Salaam’s team gave school-level administrators access to the beta environment. These teams worked with the “Import Items” feature and came back with suggestions that were implemented prior to rollout, including on-screen instructions, since users don’t always refer to training materials while using Activity Insight.

This collaboration removed fear of the unknown and made each school’s administrators an important part of the rollout process, which was crucial, as their support was necessary for successful implementation.

Announce With Authority—and a Roadmap

In order to emphasize university support of Activity Insight, Salaam’s team drafted a message on behalf of the Vice Provost, who then announced the upcoming rollout. The message directed questions about the rollout to Salaam’s team and alerted faculty to watch for emails about walk-in training sessions.

Salaam’s team used the walk-in training sessions to demonstrate the new feature, but used 10 minutes at the end of the session to share the roadmap for future Activity Insight feature implementations. Talking with faculty about current concerns and future fixes built credibility for the university’s Activity Insight administration teams and demonstrated responsiveness to faculty concerns.

Lessons Learned

Salaam offered this advice to Activity Insight administrators planning a new feature rollout:

  • Identify the problem the new feature will solve and gauge impact
  • Test and test again
  • Make a detailed rollout plan
  • Get buy-in and support from a power player such as a Vice Provost
  • Communicate across many channels: email, documentation, information sessions, reminders

"Mubeena's technical skill is impressive, but even more important to the project's success was her vision for the system going forward," Brett said. Salaam's team provided as much information as possible throughout the rollout to ensure success, including a comprehensive user guide for faculty. "Her implementation process was best-practices level," he said.

Her team’s discoveries during beta testing are also contributing to Digital Measures’ ongoing efforts to strengthen and extend the functionality of the Import Items feature, Stacy noted.

Are you planning to implement a feature of Activity Insight? Consider UC-Merced’s successful project as you prepare your rollout—and reach out to your Solution Specialist for guidance on planning based on your Activity Insight instrument.