Metaphor and Metrics

Instead of a basket, or a tide, let's talk about metrics as a mosaic.


Images influence how we think. That's why metaphors matter. They can clarify and improve our understanding of complex concepts. They can both reflect and shape our attitudes.

The metaphors of journal metrics started with “impact.”

Impact[1]:

a : to have a direct effect or impact on : impinge on

b : to strike forcefully; also : to cause to strike forcefully

The Journal Impact Factor (JIF) represented how a journal interacted with other journals, how it had a direct effect on the literature, how it struck forcefully in the scholarly dialogue. In physics, the concept of "impact" carries a sense of movement, of momentum and the transfer of energy. The nomenclature of indicators at ISI – "Impact", "Immediacy", "half-life" – was mathematically vague, but linguistically rich. The drive to create a new way of perceiving the complex interactions of millions of researchers and their articles was a passion, and it deserved poetry.

Years later, as the number and types of indicators have expanded, the names chosen are descriptive of their critique ("Retraction Index"), their source ("Scimago Journal Rank"), their method of calculation ("Eigenfactor Score"), or even their originator ("h-index"), but they lack the evocative imagery of the first generation of metrics.

As the use of metrics expanded, the abuse of metrics expanded with it, and the images that arose became grim, dire, even diseased: "counting house"[2], "devouring science"[3], "Impactitis"[4]. Metaphor signaled the community's discomfort with how assessment was developing. The words the community chose expressed the vexation and angst about a misrepresentation of the world in which scholarship is done.

But research itself is so often based on measurement; thus the use of measurement in the assessment of research is a compelling prospect. The wish to measure research met the breadth of measurements available, and the images began to express both the number of measures and the need to choose among them: a "tide"[5] and a "basket"[6] full. These metaphors represent metrics as a formless quantity, a non-specific group. They convey the presence of many options but give no sense of the mechanism or purpose of choosing.

I would like to suggest a new metaphor for research assessment and its careful, considered use of many available measurements: a mosaic.[7]

A mosaic uses fragments, disparate parts, shards and pieces, to assemble and represent another truth. Bits of glass and pottery, of tile and shell – they are not, themselves, a part of the ship or the person or image represented on the finished mosaic, but their deliberate, careful assembly results in a picture of something that exists in three dimensions.

The Britannica definition of mosaic has this wonderful phrase: "Mosaic pieces are anonymous fractions of the design"[8]. No single indicator has a meaning in isolation, or, if it does, it is much less compelling in isolation than when incorporated into a larger picture. The tesserae of metrics and descriptors, of citation and usage, of downloads and clicks and mentions create an image of the research. They are not the thing itself – the work and thought and experiment and writing – nor are they the full picture of the value and effect in the world of a work of scholarship, but they can represent the essential work of research. When carefully done, the mosaic is an art form, and has value in its own right. The skill of the mosaic artist, of the assessment scientist, is in the assembly of elements to create a meaningful, integrated, cohesive whole, a representation of scholarly work and of its contribution.

Assessment is anchored in the need to convey information about what has taken place in science and scholarship by creating a picture of how that work has landed in the world, how it has created “impact.”

To discard the work, the science and the craft of research assessment is to belittle the work of the artists and artisans who create mosaics showing the value of research.


Follow the JCR 2018 blog series for further updates.


Photo credit: Angela Martello


[2] Adam, D. (2002). "Citation analysis: The counting house." Nature, 415, 726–729.

[3] Monastersky, R. (2005). "The number that's devouring science." Chronicle of Higher Education, October 14, 2005.

[4] Van Diest, P.J., Holzel, H., Burnett, D., et al. (2001). "Impactitis: new cures for an old disease." Journal of Clinical Pathology, 54, 817–819.

[5] Wilsdon, J. (2015). "The Metric Tide: Report of the independent review of the role of metrics in research assessment and management." HEFCE.

[6] Haustein (2015) and others. The origin of the phrase "basket of metrics" is interestingly obscure.