Stacy Konkiel is the Director of Marketing & Research at Impactstory, a not-for-profit service that tracks altmetrics.
A former academic librarian, Stacy has written and spoken most often about the potential for altmetrics in academic libraries.
Stacy has been an advocate for Open Scholarship since the beginning of her career, but credits her time at Public Library of Science (PLOS) with sparking her interest in altmetrics and other revolutions in scientific communication.
Prior to PLOS, she earned dual master's degrees in Information Science and Library Science at Indiana University (2008). You can connect with Stacy on Twitter at @skonkiel.
This is the second of two posts about altmetrics: metrics that measure the social impact of your research.
In the first post, I argued that the impact metrics we've been using are good, but too limited in scope to be useful for 21st-century researchers. In this post, I'll give a brief guide to the most popular altmetrics aggregators, plus some tips for getting started collecting and sharing your own impact metrics.

Altmetrics reporting services
A handful of altmetrics reporting services currently exist. Here are the key services and their essential features.
Impactstory
Full disclosure: I’m Director of Marketing & Research for Impactstory.
Impactstory is an open-source, web-based tool that helps researchers explore and share the diverse impacts of all their research products — from traditional ones like journal articles, to emerging products like blog posts, datasets, and software.
By helping researchers tell data-driven stories about their impact, we’re helping to build a new scholarly reward system that values and encourages web-native scholarship. We’re funded by the Alfred P. Sloan Foundation and National Science Foundation, and incorporated as a nonprofit corporation.
Impactstory delivers open metrics, with context, for diverse products:
- Open metrics: Our data (to the extent allowed by providers’ terms of service), code, and governance are all open.
- Context: To help researchers move from raw altmetrics data to impact profiles that tell data-driven stories, we sort metrics by engagement type and audience. We also normalize based on comparison sets: an evaluator may not know whether 5 forks on GitHub is a lot of attention, but they can understand immediately that a project ranked in the 95th percentile of all GitHub repositories created that year (see the sketch after this list).
- Diverse products: Datasets, software, slides, and other research products are presented as an integrated section of a comprehensive impact report, alongside articles. Each genre is recognized and respected as a first-class citizen, equal to but different from journal articles and book chapters, each making its own kind of impact.
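To make the percentile idea concrete, here's a rough sketch of the normalization step (a simplified illustration with made-up fork counts, not our production code):

```python
# Simplified illustration of percentile normalization; not Impactstory's
# production code. A raw metric is converted into a percentile rank
# against a comparison set, e.g. fork counts for all GitHub repositories
# created in the same year.
from bisect import bisect_left

def percentile_rank(value, comparison_set):
    """Return the percentage of the comparison set falling below `value`."""
    ordered = sorted(comparison_set)
    return 100 * bisect_left(ordered, value) / len(ordered)

# Made-up fork counts for repositories created in the same year:
forks_that_year = [0, 0, 0, 0, 1, 1, 2, 2, 3, 5, 8, 13, 40]
print(f"5 forks ranks at the {percentile_rank(5, forks_that_year):.0f}th percentile")
```

The raw count means little on its own; the percentile rank makes it immediately legible to an evaluator.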
Metrics are displayed in user profiles, which users build by connecting Impactstory to third-party services like ORCID, GitHub, and Figshare, and by adding individual products using permanent identifiers like DOIs, PubMed IDs, and URLs. The metrics are contextualized by year and discipline using percentiles, so viewers can better understand how your paper's citations compare with those of other information science papers on Impactstory, or how well your slide deck is doing compared to others published in the same year.
You can click on a product's title or any of its badges to see a drill-down view of the metrics: how many bookmarks a paper got, how many stars a software project received on GitHub, and so on. And you can click through any of the numbers to find the qualitative information behind the metrics: who tweeted your slide deck (and what they said about it), what Mendeley tags your paper got, and so on.
We’ve also recently rolled out the ability to share products directly from your profile page via embedding, and are working (as of early September 2014) to make the profile interface customizable.
Interested? You can sign up at impactstory.org for a free 30-day trial; after that, it costs $60 per year.
Altmetric.com
This London-based startup provides a variety of services aimed at publishers and institutions that want to track altmetrics for publications (including mainstream media mentions and citations in policy documents), as well as a feature that will likely interest researchers: a browser bookmarklet that shows you, at a glance, the attention a paper has received.
Altmetric.com is awesome at tracking metrics for publications that have a DOI, PubMed ID, arXiv ID, or Handle. Their free bookmarklet lets you quickly and easily look up whether a publication has any altmetrics, as well as set email alerts for new metrics.
Here’s how it works: head over to the Altmetric.com site and drag the bookmarklet into your browser’s bookmarks toolbar. Then, the next time you’re reading a paper on a journal website (or on PubMed, arXiv, or an institutional repository), click the bookmarklet.
A new window will appear in your browser, summarizing the altmetrics associated with that article or preprint: a list of the metrics for the document, alongside the Altmetric score displayed in the signature Altmetric donut.
The score is calculated by weighting certain types of altmetric indicators more heavily than others. Though the exact weighting formula is kept secret, a tweet doesn’t add as much to the score as a citation in a policy document does, for example (a toy illustration follows below). The donut’s colors also change depending on the mix of metrics the publication has: it will look more blue if the work is heavily tweeted, and more red if it’s heavily bookmarked in Mendeley, for example.
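To illustrate how such weighting works (with made-up weights; Altmetric’s real ones are proprietary), the score behaves roughly like a weighted sum over mention types:

```python
# Toy illustration only: Altmetric's real weights are proprietary.
# The point is simply that sources contribute unequally to the score.
HYPOTHETICAL_WEIGHTS = {
    "policy_document": 5.0,  # weighted heavily
    "news_story": 8.0,
    "blog_post": 5.0,
    "tweet": 0.25,           # weighted lightly
}

def illustrative_score(mentions):
    """Weighted sum of mention counts, using made-up weights."""
    return sum(HYPOTHETICAL_WEIGHTS.get(source, 0) * count
               for source, count in mentions.items())

print(illustrative_score({"tweet": 40, "policy_document": 2}))
# -> 20.0: two policy citations match forty tweets under these toy weights
```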
You can also click through to find out more information about the metrics: what policy documents the paper has been cited in, who’s been tweeting about the work and what they’ve been saying, summaries of the blog post coverage the paper has gotten, and so on.
Sign up for email alerts to be notified when the article receives new mentions in mainstream media, on Twitter, and elsewhere. It’s as easy as clicking the “Get email updates when this article is shared” link on the drill-down page and providing your email address.
The bookmarklet is a great way to get quick information about the popularity of a paper you’re reading. And best of all, it’s free to use.
Head over to Altmetric.com to download it and start experimenting.
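If you’d rather script your lookups than click a bookmarklet, Altmetric also offers a free public REST API. Here’s a minimal sketch assuming the public v1 DOI endpoint, which (at the time of writing) works without an API key for light, non-commercial use:

```python
# Minimal sketch: fetch a paper's Altmetric summary by DOI via the free
# public v1 API. The API answers 404 (urlopen raises HTTPError) when it
# has no data for a given DOI.
import json
from urllib.request import urlopen

def altmetric_summary(doi):
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    with urlopen(url) as response:
        return json.load(response)

# Example DOI from Altmetric's own API documentation:
data = altmetric_summary("10.1038/nature.2012.9872")
print(data.get("title"), "| Altmetric score:", data.get("score"))
```

The JSON response mirrors the bookmarklet’s summary: the score, plus per-source mention counts.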
PlumX
PlumX is a platform run by Plum Analytics (an EBSCO subsidiary) that helps institutions and funders track the impact of the research they support.
If your institution subscribes, you’re in for a treat: PlumX is a super-robust service that provides impact reports at the institution, department, and researcher level. PlumX offers the most data sources of all the top altmetrics aggregators, meaning you can not only discover when your datasets have been tweeted about or your papers bookmarked, but also find the ratings your books have received on Goodreads or the number of times your thesis has been downloaded from your university’s institutional repository.
PlumX also offers widgets, which summarize the impacts of any research product in their system. Widgets can be embedded in webpages, institutional profiles, and other places where you maintain an online professional presence.
You can test out their platform at Plu.mx, and contact their team to find out whether your institution subscribes.
How to get started with altmetrics
Now that you know the ins and outs of altmetrics and altmetrics aggregators, here’s how you can get started:
Explore altmetrics services and discover which one’s best for you
Dedicate an hour or two on an upcoming Friday morning to signing up for an Impactstory account, installing the Altmetric.com bookmarklet, and just playing around. If your institution subscribes to PlumX, check out your profile there, too.
On each platform, look up some of your best articles, datasets, slide decks, or other research products, and compare how well each service documents their impact. Some services cover certain disciplines better than others; some handle certain types of research products better.
You’ll know better than anyone what sorts of impacts your work is likely to have; use that knowledge to ruthlessly evaluate each service and choose one as your primary altmetrics data source. I know the pain of profile fatigue all too well, so I don’t want to suggest that you sign up for and use all three services on an ongoing basis.
Discover where you’re winning and share it with others
Once you’ve chosen a service, or a combination of services, to use as your altmetrics data source, pay attention to your impacts on a regular basis to learn more about how your research is being received. Set aside a few minutes each week (or month, if you’re strapped for time) to dig into the new impact metrics that have accumulated on your profile.
Look at the data behind the numbers: read the tweets about your paper, learn more about who’s blogging about your recent conference presentation, discover the “shares” your thesis has received on Figshare, and so on. Sometimes you’ll uncover gems (‘A Nobel Laureate said nice things about my research!’); other times, not so much (‘Ugh! This tabloid completely misinterpreted my results and is misleading its readers. I need to set the record straight.’).
In either event, you’ll have actionable data you can use to understand your research’s impact, as well as a starting point for engagement. And once you have that data, you can share it with others.
Using altmetrics
Consider adding altmetrics to your annual review, grant application, tenure and promotion package, or job applications. Tread carefully here: some departments and institutions will be more amenable to altmetrics than others. Still, documenting your impacts with evidence to back up your claims is a good thing: it has helped some researchers earn tenure, and others win grants and awards.
At the very least, test the waters by putting altmetrics into your annual review or work-planning meetings. You can either include the numbers you’ve gathered on your CV, or cite specific examples of mainstream media or social media mentions (like that one you got from the Nobel Laureate) to bolster your case for impact.
Then, if your altmetrics experiment has been warmly received, you can start including altmetrics in grant applications, tenure and promotion packages, and other places where it’s important to show evidence of your impact.
Advocate for altmetrics at your institution
Once you’ve started using altmetrics regularly, you can “level up” by sharing your experiences with others and encouraging them to do the same. Consider talking to your faculty council about incorporating altmetrics and other “alternative” measures of scholarly impact into tenure and promotion dossier preparation guidelines. Get in touch with your subject librarian to let them know about altmetrics, and ask them to consider offering a library workshop on the subject.
Or, better yet, plan your own workshop or brown bag talk on altmetrics for your department, and share your experience with colleagues!
How do you plan to use altmetrics?
Do you practice web-native research? Are you interested in using altmetrics to help document your research impacts, or will you stick with more traditional measures? Let’s chat in the comments below and on Twitter.