The study of statistical regularities observed in the production and use of information has confirmed the existence of important similarities. The existence of such regularities and measurable ratios makes prediction possible and supports the formulation of laws. In the 1950s, Shannon and Weaver [13] modelled the theory of information transmission. The entropy hypothesis of this theory is that the more ordered a system is, the less information it produces. Theoretical studies have tried to formalise the connection between bibliometric distributions and entropy. In this paper, we extend previous results linking 'the least effort principle' with the analytical slope of a bibliometric distribution. In the first and second parts, we recall some statements about entropy and bibliometric distributions; we then describe the different links between them.
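The entropy statement above can be illustrated with a minimal sketch: for a fixed number of outcomes, a highly concentrated ("ordered") probability distribution has lower Shannon entropy than a uniform one. The two distributions below are illustrative assumptions, not taken from the paper.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits.

    Zero-probability outcomes contribute nothing to the sum.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: maximally "disordered".
uniform = [0.25, 0.25, 0.25, 0.25]

# Highly concentrated distribution over the same 4 outcomes: more "ordered".
ordered = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0 bits, the maximum for 4 outcomes
print(shannon_entropy(ordered))  # well below 2.0 bits
```

The more ordered the distribution, the closer its entropy falls to zero, matching the claim that a more ordered system produces less information.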