Evaluate Primo VE
Establish Test Sites
Establish 2-5 campuses that can monitor release notes and test new discovery features as they become available.
Update Wiki Content
Continue to provide ongoing training and support for discovery users via Open Forum sessions throughout the year. Update wiki documentation to bring it in line with new standards and best practices.
Apply Findings from UI/UX Evaluation
Discuss findings from UX studies, such as those conducted by Gabriel Gardner and Heather Cribbs, and provide a set of recommendations.
Address uResolver Matching Issues
Many consortium campuses are reporting records with uResolver matching issues. These include: 1) over-matching of results, 2) under-matching of results, 3) matching across different formats, and 4) title-matching issues.
Address Features Not Working as Expected
Several features are not working as expected, or as we understand them to work. For example:
- LC subject browse does not return expected results,
- the Date Newest facet works only when specific metadata is present in records, and that metadata is inconsistent across CDI and Alma records,
- the search-within-journal feature is useful, but it works only when an ISSN is present and can be disabled only with CSS,
- ranking configuration still eludes many campuses in the consortium, and
- there are performance issues with indexing, front-end load times, and feature activation.
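To illustrate the CSS-only workaround for the search-within-journal feature, a campus could hide the element from its view's customization package with a rule along these lines. The selector names below are assumptions for illustration only; the actual element or class name should be confirmed by inspecting the rendered HTML of the campus view.

```css
/* Hypothetical selectors: verify the actual element/class name in your
   Primo VE view before adding this to the customization package's CSS. */
prm-search-within-journal,
.search-within-journal {
  display: none; /* hides the search-within-journal box view-wide */
}
```

Because there is no configuration toggle for the feature, this kind of rule suppresses it visually for all records in the view, including those with an ISSN.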
Investigate Improvements to Analytics Gathering and Analysis
Primo VE analytics is inadequate for assessing the performance and usage of the discovery tool. While the metrics do provide insight into what users do within Primo VE, page views, real-time stats, and navigation pathways cannot be tracked because of the analytics data delay and the inability to configure third-party tracking code such as Google Analytics.