Barriers to Institutional Effectiveness Research and the Need for Self-Service Analytics
Jan 07, 2008
In a recent article published on the Achieving the Dream website, Institutional Research and the Culture of Evidence at Community Colleges, Vanessa Smith Morest and Davis Jenkins point out several important barriers to high-quality institutional effectiveness research in community colleges. These barriers include:
- A lack of dedicated IR personnel; those who do hold IR roles may lack training in quantitative analysis.
- “Data analysis” typically amounts to “data collection” for compliance and accreditation reporting, and these data are rarely useful for managing the college.
- Data contained in the student information system may be “dirty,” and the system is designed to support administrative functions, not research.
- Skepticism among college presidents about using quantitative methods to manage the institution.
I’ll add that responsibility for data entry, warehousing, and reporting is often scattered among several offices, making coordination of a research agenda difficult at best. In addition, faculty are often not engaged in the management aspects of the institution, which may be due in part to a lack of timely and accurate information. In recent years, however, the proliferation of business intelligence (BI) and analysis tools (e.g., Business Objects, SAS, SPSS, Tableau) has allowed information consumers to access analysis quickly in a “self-service” environment.
Most BI software products have matured to the point where creating reports, and even running sophisticated statistical analyses, is a point-and-click affair. This ease of use helps make the most of the limited staff dedicated to IR. In addition, the self-service nature of BI software extends human resources by making data and analysis more broadly available to administrators, faculty, and staff. There is also real potential to increase the number of research topics, since faculty may use these tools to investigate success rates or retention among their own students. Instead of one or two IR staff stretched to analyze a single topic over several months, there may be dozens of concurrent analyses by curious faculty.
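To make the faculty scenario concrete, here is a minimal sketch of the kind of question a curious instructor could answer in a self-service environment: the course success rate for their own students. The record layout, field names, and the "C or better" success rule are illustrative assumptions, not a standard SIS schema:

```python
# Hypothetical enrollment records, as they might be extracted from an SIS.
# Field names and the "C or better" success rule are illustrative assumptions.
enrollments = [
    {"student_id": 1, "course": "ENG-101", "grade": "A"},
    {"student_id": 2, "course": "ENG-101", "grade": "C"},
    {"student_id": 3, "course": "ENG-101", "grade": "D"},
    {"student_id": 4, "course": "ENG-101", "grade": "W"},  # withdrawal
    {"student_id": 5, "course": "ENG-101", "grade": "B"},
]

PASSING = {"A", "B", "C"}  # assumed success criterion: C or better

def success_rate(records, course):
    """Share of enrollments in `course` that earned a passing grade."""
    graded = [r for r in records if r["course"] == course]
    passed = [r for r in graded if r["grade"] in PASSING]
    return len(passed) / len(graded) if graded else 0.0

print(success_rate(enrollments, "ENG-101"))  # 3 of 5 pass -> 0.6
```

The point is not the code itself but that a BI tool puts this same calculation behind a point-and-click interface, so it can run against live SIS data without an IR analyst in the loop.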
This helps to address items #1 and #2 on the barriers list.
A complication for most institutions right now is the lack of standard data sources for answering questions. Poor data access usually encourages the proliferation of “shadow systems,” which in turn lead to multiple, inconsistent reports on topics like headcounts, retention, or success rates. It’s not always the student information system (SIS) that is at fault: most of the data needed to answer questions or conduct analysis is contained in the SIS. Information consumers usually just don’t know how to get to the data they need. This is where self-service plays an important role.
By implementing BI tools, information consumers can typically make sense of what should be the authoritative data source: the SIS. Surfacing the data contained in the SIS will often reveal where data quality problems lie and point to deficiencies in business processes. Most data quality issues in higher education are the result of breakdowns in processes; in other words, it’s usually not a technology deficiency. This helps to address item #3 on the barriers list.
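As a sketch of how surfacing SIS data exposes process breakdowns rather than technology failures, the check below flags records with missing or implausible values so the offending data-entry process can be traced. The field names and validation rules are assumptions for illustration only:

```python
# Hypothetical student records; field names and rules are illustrative assumptions.
students = [
    {"id": "S001", "zip": "27501", "birth_year": 1985},
    {"id": "S002", "zip": "",      "birth_year": 1990},  # missing ZIP code
    {"id": "S003", "zip": "27604", "birth_year": 2091},  # impossible birth year
]

def data_quality_issues(records, current_year=2008):
    """Flag records whose values suggest a data-entry process breakdown."""
    issues = []
    for r in records:
        if not r["zip"]:
            issues.append((r["id"], "missing ZIP code"))
        if not (current_year - 110 <= r["birth_year"] <= current_year):
            issues.append((r["id"], "implausible birth year"))
    return issues

for student_id, problem in data_quality_issues(students):
    print(student_id, problem)
```

A report like this, refreshed through a self-service BI tool, tells an office *which* records are dirty and hints at *which* business process produced them, which is the first step toward fixing the process rather than blaming the system.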
As for item #4, and for coordination across multiple offices, a governance structure and a commitment to a “culture of evidence” at the highest levels are usually necessary. Providing self-service analytics across an institution using BI tools is one way to foster that culture of evidence.