Today, citations are seen as an indicator of the use and dissemination of research findings. You can count how often your article has been cited in order to assess the impact of your research. More interestingly, you can often see who has cited your research. This can be a good way of identifying potential collaborative partners.
Citations were originally conceived as an alternative to subject terms, making it possible to find related articles with the help of reference lists.
The idea emanated from the general academic practice of identifying relevant publications by assessing the reference lists of other publications. By systematising this form of chain searching, it became possible to search forwards in time and find the more recent literature that cites an article, rather than only the earlier works listed in its references.
Furthermore it became possible to identify articles with a high proportion of common references. The idea was that, since these would have a large subject overlap, searches that could identify articles with similar reference lists could therefore function as subject searches (Garfield, 1955).
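The overlap idea above can be sketched as a simple computation. The following is a minimal illustration, not any database's actual method, and the article names and reference IDs are invented: the more references two articles share, the stronger their presumed subject overlap.

```python
# Bibliographic coupling: two articles are taken to be related in
# subject when their reference lists overlap. All data here is made up.
refs = {
    "article_A": {"ref1", "ref2", "ref3", "ref4"},
    "article_B": {"ref2", "ref3", "ref5"},
    "article_C": {"ref6"},
}

def coupling_strength(a: str, b: str) -> int:
    """Number of references the two articles have in common."""
    return len(refs[a] & refs[b])

print(coupling_strength("article_A", "article_B"))  # -> 2 shared references
print(coupling_strength("article_A", "article_C"))  # -> 0, little subject overlap
```

A search that ranks articles by coupling strength against a known relevant article can then serve as a rough subject search, which is the idea Garfield (1955) describes.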
Citations as an indicator of impact
Citations are increasingly used as an indicator of the level of dissemination and impact of researchers and publications. A number of citation-based indicators have been developed (see, for example, the Journal Impact Factor or the h-index).
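As a concrete illustration of such an indicator, the h-index can be computed from a list of per-publication citation counts: a researcher has index h when h of their publications each have at least h citations. A minimal sketch, with invented citation counts:

```python
def h_index(citation_counts):
    """Largest h such that h publications have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank  # the rank-th most cited paper still has >= rank citations
        else:
            break
    return h

# Invented citation counts for a researcher with eight publications:
print(h_index([25, 8, 5, 3, 3, 2, 1, 0]))  # -> 3
```

Note that the result depends entirely on which citations the underlying database has captured, which is the point developed in the following paragraphs.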
Reference lists in databases make it possible to count citations and allow reference and citation searches. The first database that included reference lists was ISI's Science Citation Index, developed by Eugene Garfield. Since then, a number of other databases that build on Garfield's ideas and make reference lists searchable have emerged (e.g. Scopus and Google Scholar).
Citation counts are ultimately counts of documents in a given database that reference other documents. In practice, however, being referenced is not the same as being cited, as a citation can only be counted once it has been made visible via a citation index/database. Thus, how much value you attach to the results has a great deal to do with the content and size of the database and with the database's coverage of the literature in a particular academic field. Citation databases primarily index articles in journals and, moreover, focus on American publications.
Consequently, your work may be frequently referenced without being frequently cited in databases – for example, if the references occur mostly in non-English-language articles or in books.
Motivations behind references and citations vary
The motivations behind references to certain authors and publications are often complex. They range from being complimentary or inspirational to being critical. Almost all references are made on the basis of more than one motive (Brooks, 1986).
Furthermore, when working on publications, authors have to select among their sources, which means that not all relevant sources are actually referred to (MacRoberts & MacRoberts, 1988). This complexity is not reflected in statistical analyses of citations, as it is not possible to see the background of each individual citation choice (MacRoberts & MacRoberts, 1989).
Owing to this uncertainty, critics of citation analysis argue that citation counts serve no useful function. Supporters, by contrast, point out that citation counts can reveal patterns for the publications cited, provided academics do not reference unthinkingly or haphazardly (van Raan, 1998).
When it comes to counts/statistical analyses of citations, the figures will always be interpreted. Thus citations should be thought of as "one important form of use of scientific information within the framework of documented science communication" (Glänzel & Schoepflin, 1999, p. 32).
This is in keeping with Garfield’s original thoughts on ISI’s Science Citation Index, namely that 'good' articles are often cited more than average, but that "impact is not the same as importance or significance" (Garfield, 1963, p. 290).
Terrence A. Brooks (1986). Evidence of complex citer motivations. In: Journal of the American Society for Information Science, Vol. 37, Iss. 1, pp. 34-36. http://dx.doi.org/10.1002/(SICI)1097-4571(198601)37:1<34::AID-ASI5>3.0.CO;2-0
Eugene Garfield (1955). Citation indexes to science: A new dimension in documentation through association of ideas. In: Science, Vol. 122, Iss. 3159, pp. 108-111. http://dx.doi.org/10.1126/science.122.3159.108. Available in reprint edition: http://garfield.library.upenn.edu/essays/v6p468y1983.pdf
Eugene Garfield (1963). Citation indexes in sociological and historical research. In: American Documentation, Vol. 14, Iss. 4, pp. 289-291. http://dx.doi.org/10.1002/asi.5090140405. Available in reprint edition: http://www.garfield.library.upenn.edu/essays/V1p043y1962-73.pdf
Wolfgang Glänzel & Urs Schoepflin (1999). A bibliometric study of reference literature in the sciences and social sciences. In: Information Processing & Management, Vol. 35, Iss. 1, pp. 31-44. http://dx.doi.org/10.1016/S0306-4573(98)00028-4
Michael H. MacRoberts & Barbara R. MacRoberts (1988). Author motivation for not citing influences: A methodological note. In: Journal of the American Society for Information Science, Vol. 39, Iss. 6, pp. 432-433. http://dx.doi.org/10.1002/(SICI)1097-4571(198811)39:6<432::AID-ASI8>3.0.CO;2-2
Michael H. MacRoberts & Barbara R. MacRoberts (1989). Problems of citation analysis: A critical review. In: Journal of the American Society for Information Science, Vol. 40, Iss. 5, pp. 342-349. http://dx.doi.org/10.1002/(SICI)1097-4571(198909)40:5<342::AID-ASI7>3.0.CO;2-U
Anthony van Raan (1998). In matters of quantitative studies of science the fault of theorists is offering too little and asking too much. In: Scientometrics, Vol. 43, Iss. 1, pp. 129-139. http://dx.doi.org/10.1007/BF02458401