April 26, 2024

Two Architects of Library Discovery Tools Launch an Altmetrics Venture

Two prominent veterans of the library vendor world recently launched a startup that aims to capitalize on the rapidly growing field of altmetrics.

Andrea Michalek and Mike Buschman had been the director of technology and director of product management, respectively, for ProQuest’s Summon discovery service since its inception. But the pair left the company in November 2011 and in January founded Plum Analytics, deciding that altmetrics presented enough promise to justify surrendering such prominent positions.

“I think you have much more security charting your own destiny than going along for the ride,” Michalek said.

After raising money from friends, family, and angel investors, the duo demoed the public alpha product on March 14 at the monthly Philly Tech Meetup.

“Since that time, we have been talking to libraries interested in becoming beta customers to help build out the next level of the product, as well as take the opportunity to define the next generation of impact metrics,” Buschman said. “This has been very well received so far, and we have many more meetings scheduled.”

Buschman said he had just returned from Copenhagen, where he attended INORMS 2012, a biennial international conference for research managers, and met with a number of librarians there. The company is hoping to sign at least three beta partners.

Altmetrics (short for alternative metrics) provides a new way to measure the impact of scholarly communication. Rather than relying solely on the traditional and slow measure of citations in peer-reviewed articles (the basis of the impact factor), altmetrics offers a complementary, near-instant measurement window that includes all Web-based traces of research communication. It pulls together all the usage data about each individual output a researcher has produced.

“We would capture any piece of engagement data we can find. Anything that you can say ‘yes’ this piece of research was interacted with,” Michalek said.

Plum Analytics and similar ventures in the field aggregate metrics, collected via open APIs, from sources as varied as Twitter, blogs, open access repositories that publish article-level metrics (such as PLoS), data repositories, scholarly social bookmarking sites (e.g., Mendeley or CiteULike), source code repositories (GitHub), presentation sharing sites (SlideShare), grant funding data, link shortener metrics, and more.

Plum Analytics is wading into an incipient but very active field, as a user group on Mendeley and the Twitter hashtag #altmetrics show. In addition to the article-level metrics application that PLoS has been developing, services similar to Plum Analytics, such as CitedIn, ReaderMeter, and Science Card, have also emerged.

One of the more prominent services is Total-Impact, which Jason Priem, a third-year doctoral student at the University of North Carolina at Chapel Hill’s School of Information and Library Science (SILS), and Heather Piwowar, a postdoctoral research associate at the National Evolutionary Synthesis Center (NESCent) in Durham, have developed. In April, they were awarded a $125,000 grant by the Alfred P. Sloan Foundation to further develop their application.

Priem, who wrote an altmetrics manifesto and recently co-authored a paper on altmetrics to be presented at the 17th International Conference on Science and Technology Indicators in Montreal in September, welcomed the emergence of Plum Analytics.

“Looks to me like they’d be pretty direct competitors with Total-Impact and altmetric.com, albeit with some features they’ll hope to differentiate themselves with. I think that’s awesome,” Priem said. “More players in this space is, in some ways, better for all of us…a rising altmetrics tide floats all boats.”

He noted that there was enough money in the citation space to keep Scopus, ISI, and Google Scholar all viable, and he said the same will likely prove true for altmetrics.

In the case of Plum Analytics, the metrics are presently based on 13 main data sources, a list Buschman said they are expanding (including sources more relevant to the social sciences and humanities). Users will likely be allowed to weight the data as they choose (e.g., 50 tweets equal one “like”).
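The user-configurable weighting Buschman describes amounts to a weighted sum over per-source counts. A minimal sketch, where the source names and weights are purely hypothetical (not Plum Analytics’ actual scheme):

```python
# Hypothetical sketch of user-weighted altmetrics scoring.
# Source names and weight values are illustrative only.

def weighted_score(counts, weights):
    """Combine raw per-source counts into a single weighted impact score."""
    return sum(counts.get(source, 0) * weight for source, weight in weights.items())

# A user who decides 50 tweets are worth one "like" sets the tweet weight to 1/50.
weights = {"tweets": 1 / 50, "likes": 1.0, "mendeley_readers": 2.0}
counts = {"tweets": 100, "likes": 3, "mendeley_readers": 5}

print(weighted_score(counts, weights))  # 100/50 + 3*1.0 + 5*2.0 = 15.0
```

Letting each institution or discipline tune the weights reflects Lougee’s point, later in the piece, that different fields value different kinds of engagement.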

“Our beta customers will be helping us prioritize the next sources from which to harvest,” Buschman said.

This type of prioritization may prove important.

Wendy Pradt Lougee, the university librarian at the University of Minnesota, said the library there has a very close partnership with the university’s office of research in order to explore ways of revealing more data about researchers, including metrics beyond citations, and rolling out SciVal from Elsevier and the Harvard Profiles research networking software. But attitudes toward altmetrics can vary considerably depending on the disciplinary context.

“Faculty are very discerning in how they are represented and the reputational value of different publication venues and metrics,” Lougee said. “We have seen a growing interest in research networking systems and tools that help move beyond just citations and represent a faculty’s research repertoire more fully, but each discipline values specific elements, such as traditional or open access citations, with different weights and perspectives.”

Nevertheless, university librarians such as Lougee and Sarah Michalak at the University of North Carolina at Chapel Hill are keeping a close eye on developments, even if they are not yet ready to plunge headfirst into altmetrics.

“We need to interest ourselves in new ways of measuring the impact of scholarship and new and powerful kinds of information tools,” Michalak said, noting that librarians were among the first to see the possibilities in the school’s Reach NC project.

Buschman of Plum Analytics saw the library as a natural ally. The data becomes a tool that libraries can use to help researchers determine which forms of communication generate the most meaningful interaction with their research and also track forms of impact that are not contained in the citation record.

“We believe these tools will allow librarians to better serve the faculty at their institution as well as provide meaningful incentives for researchers to embrace open access options for their output,” Buschman said, noting that one of his goals was to make open access more powerful.

Also, if researchers are able to boost their resumes and publication lists (and possibly improve their chances of tenure) with these new metrics, then it improves the library’s value proposition and can help align libraries with the revenue-producing part of the institution, Buschman said.

“It’s a way to promote the university’s departments and work with the researchers in a way that gives them something real. ‘Hey the library made this possible for you,’ ” Buschman said.

“One of our goals is to be able to show a whole directory of a researcher’s output for their lab, their department, their university, and what kind of interesting and compelling stories that haven’t been able to be told yet but now can be because you have the data. Technology can help tell a story that hasn’t been told before,” Michalek said.

However, pulling together all the usage data about each individual output a researcher has produced presents a number of technological challenges.

For example, research output is often hosted in different places: the preprint of an article can sit in a discipline-specific repository; the “published” version can be available through multiple sites, including the publisher’s website and one or more aggregation sites; and the authors may post versions to their institutional repositories and/or personal websites.

Altmetrics aspires to accumulate metrics from each of these places.

In addition, a given website might have a dozen different URLs that all lead the user to a specific article. Each of these links might be shared, and usage from each needs to be consolidated to give the complete picture.

“The challenge is to aggregate the metrics across all versions where available,” Buschman said. “This is the inverse of how a link resolver functions, and requires a new approach uniquely suited for this sort of aggregation.”
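One way to picture this consolidation is as a mapping from every known URL or identifier for a work onto a single canonical key (a DOI, say), with metrics summed under that key. The alias table and event data below are hypothetical; as Buschman suggests, discovering that mapping is itself the hard part:

```python
# Hypothetical sketch: consolidating usage metrics collected under many
# URLs that all resolve to the same underlying article.

from collections import defaultdict

# Alias URL -> canonical identifier (here, an invented DOI). In practice
# this mapping must itself be discovered and maintained.
ALIASES = {
    "https://publisher.example/article/123": "10.1000/xyz123",
    "https://repository.example/preprint/abc": "10.1000/xyz123",
    "https://dx.doi.org/10.1000/xyz123": "10.1000/xyz123",
}

def consolidate(events):
    """Sum per-alias metric counts under each canonical identifier."""
    totals = defaultdict(lambda: defaultdict(int))
    for url, metric, count in events:
        canonical = ALIASES.get(url, url)  # fall back to the raw URL
        totals[canonical][metric] += count
    return {key: dict(metrics) for key, metrics in totals.items()}

events = [
    ("https://publisher.example/article/123", "views", 40),
    ("https://repository.example/preprint/abc", "views", 25),
    ("https://dx.doi.org/10.1000/xyz123", "tweets", 7),
]
print(consolidate(events))
# {'10.1000/xyz123': {'views': 65, 'tweets': 7}}
```

This is, as Buschman puts it, the inverse of a link resolver: instead of turning one citation into many access points, many access points are folded back into one work.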

Befitting their background as architects of Summon, Buschman and Michalek are also attempting to build a commercial-grade product that can scale up to the challenge of loading the research output of millions of researchers, along with the data sources that report on it.

“The flexibility of data analysis at scale is the sweet spot of our solution,” Buschman said. “We are building toolsets not just for collecting the article-level metrics, but also for mapping the hierarchy of the institution and the affiliations of the researchers.”

Michalek said the sheer quantity of available data can help guard against people who try to game the system, since suspicious patterns can be detected at scale. It’s the same idea that underlies spam filters.
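The spam-filter analogy can be illustrated with the simplest possible anomaly check: flagging counts that deviate sharply from the norm. This toy example (invented data, a basic z-score test, not any vendor’s actual method) shows the principle:

```python
# Hypothetical sketch of pattern-based gaming detection: with enough data,
# counts that deviate wildly from the baseline stand out, much as spam does.

import statistics

def flag_outliers(counts, threshold=2.5):
    """Return indices of counts more than `threshold` std devs above the mean."""
    mean = statistics.mean(counts)
    stdev = statistics.pstdev(counts)
    if stdev == 0:
        return []
    return [i for i, c in enumerate(counts) if (c - mean) / stdev > threshold]

# Daily tweet counts for one article; the spike on the last day is suspicious.
daily_tweets = [2, 3, 1, 4, 2, 3, 2, 250]
print(flag_outliers(daily_tweets))  # [7]
```

Real detection would draw on far richer signals (account age, network structure, timing), but the underlying idea is the same: anomalies are visible only against a large baseline of normal behavior.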

“Most gaming is detectable and as it matures it will be important to be able to combat it,” she said.

The ACM Web Science Conference 2012 will host a workshop on altmetrics in Evanston, IL, on June 21.

“In the long term, this is a chance for libraries to lead the way into a web-native, post-journal world of scholarly communication, in which aggregated conversation and review by experts replaces the slow, closed, inefficient, atavistic system we’ve inherited from the last century,” said Priem of Total-Impact.

About Michael Kelley

Michael Kelley is the former Editor-in-Chief, Library Journal.