Abstract
There are many ways to define and measure organization, or complexity, in music, most of them based on informational entropy as the opposite of organization. Some researchers prompted me to study whether this could be done from the magnitudes of the Fourier coefficients of musical objects (pc-sets or rhythms) instead of from their atomic elements (pitches, rhythmic onsets), and I found that this is indeed a promising new approach to measuring the organization of musical material. This note only purports to present the idea, leaving for future research the task of comparing it with the numerous other definitions. I also sketch one relevant basis for such comparisons that has been little explored: the asymptotics of the entropy of arithmetic sequences modulo n.
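To make the idea concrete, here is a minimal Python sketch of one plausible formalization: the Shannon entropy of the normalized squared magnitudes of the DFT of a pc-set, with the cardinality coefficient a_0 pruned out (cf. notes 4–6 below). The function name and the exact normalization are illustrative assumptions, not necessarily the precise definition used in the body of the paper.

```python
from cmath import exp, pi
from math import log2

def fourier_entropy(pcset, n=12):
    """Shannon entropy of the normalized squared DFT magnitudes of the
    characteristic function of a subset of Z_n, dropping a_0 (which only
    records the cardinality)."""
    mags = [abs(sum(exp(-2j * pi * k * p / n) for p in pcset))
            for k in range(1, n)]
    weights = [m * m for m in mags]
    total = sum(weights)
    return -sum(w / total * log2(w / total) for w in weights if w > 0)

# The whole-tone scale concentrates all its (a_0-pruned) energy in a_6,
# hence near-zero entropy; a less organized hexachord spreads energy out.
low = fourier_entropy({0, 2, 4, 6, 8, 10})
high = fourier_entropy({0, 1, 2, 4, 8, 11})
assert low < high
```

On this convention, a maximally organized object (all energy in one coefficient) has entropy 0, and entropy grows as the energy spreads over more coefficients.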
2010 Mathematics Subject Classification:
Acknowledgments
I am grateful to the anonymous reviewers for their extremely fruitful comments, and to Co-Editor in Chief Jason Yust for enlightening discussions, judicious corrections, and encouragement.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Notes
1 This is essentially Babbitt's theorem; cf. Agon et al. (2011) for a modern viewpoint.
2 These periodic rhythms are characterized by a centre of gravity in the middle of the circle where they can be drawn, generalizing regular divisions of the circle and their unions.
3 The choice of Shannon's entropy is well vindicated in Temperley (2007) and Temperley (2019) in terms of the listener's expectancy.
4 More precisely, pursuing the parallel with physics: the distribution of squared magnitudes is what appears on a diffraction picture.
5 This is optional of course, but since all Fourier coefficients are (algebraically at least) independent, being coordinates in a vector basis, pruning out the cardinality allows one to concentrate on the organization of the material regardless of its size. This would also be possible when considering the entropy of the Interval Content; see the last section.
6 Actually, because of their symmetry, the second half of the coefficients may be discarded too. This would essentially halve the entropy.
7 The former is more delicate and I only sketch the proof here: if all Fourier coefficients of X but one are zero, then it can be shown that this coefficient must be the middle one (see more generally Amiot 2016, Theorem 3.11). Then, by inverse Fourier transform, X is “half of”
(which of the two halves depends on the signum of
).
8 Any generated scale with a generator coprime with n will have the same entropy, which again seems to me a desirable feature.
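This invariance is easy to check numerically: multiplication by a unit of Z_n merely permutes the magnitudes of the coefficients a_1, …, a_{n-1}, leaving their distribution intact. The sketch below assumes (my convention, for illustration) that the entropy is taken over the normalized squared DFT magnitudes with a_0 pruned out.

```python
from cmath import exp, pi
from math import log2

def entropy_of_dft(subset, n=12):
    """Shannon entropy of the normalized squared DFT magnitudes of a subset
    of Z_n, with the cardinality coefficient a_0 pruned out."""
    mags = [abs(sum(exp(-2j * pi * k * p / n) for p in subset))
            for k in range(1, n)]
    weights = [m * m for m in mags]
    total = sum(weights)
    return -sum(w / total * log2(w / total)
                for w in weights if w > 1e-12)

def generated_scale(g, d, n=12):
    """d-note scale generated by g in Z_n: {0, g, 2g, ..., (d-1)g} mod n."""
    return {(k * g) % n for k in range(d)}

# The units of Z_12 are 1, 5, 7, 11; multiplying by a unit permutes the
# coefficient magnitudes, so the 7-note scales they generate (chromatic
# cluster, diatonic-type scales, ...) all share one entropy value.
vals = [entropy_of_dft(generated_scale(g, 7)) for g in (1, 5, 7, 11)]
assert max(vals) - min(vals) < 1e-9
```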
9 This feature appears highly desirable for any definition of the entropy of a process known to be periodic, since repeating the event does not change how it is perceived; it can be proved by computing the Fourier coefficients of the repetition of a musical object, cf. Amiot (2016, Chap. I).
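The invariance under repetition can also be verified numerically. The sketch below (assuming, as an illustration, the entropy of the normalized squared DFT magnitudes with a_0 dropped, and an arbitrary 5-onset rhythm in Z_16) relies on the fact that the coefficients of the repetition are the original ones, scaled and interleaved with zeros, and zeros contribute nothing to Shannon entropy.

```python
from cmath import exp, pi
from math import log2

def entropy_of_dft(onsets, n):
    """Shannon entropy of the normalized squared DFT magnitudes over Z_n,
    with a_0 dropped; (near-)zero coefficients contribute nothing."""
    mags = [abs(sum(exp(-2j * pi * k * t / n) for t in onsets))
            for k in range(1, n)]
    weights = [m * m for m in mags]
    total = sum(weights)
    return -sum(w / total * log2(w / total)
                for w in weights if w / total > 1e-12)

def repeat(onsets, n, r):
    """r consecutive periods of a rhythm in Z_n, seen as a rhythm in Z_{rn}."""
    return {t + j * n for t in onsets for j in range(r)}

# Repetition scales each nonzero coefficient by r and interleaves zeros:
# the normalized distribution, hence the entropy, is unchanged.
rhythm = {0, 3, 6, 10, 12}                     # a 5-onset rhythm in Z_16
once = entropy_of_dft(rhythm, 16)
twice = entropy_of_dft(repeat(rhythm, 16, 2), 32)
assert abs(once - twice) < 1e-9
```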
10 The derivation of the Fourier coefficients of a generated scale/rhythm is done in Amiot (2007, 2016).
11 It barely passes 2 when .
12 By termwise comparison with and Weierstrass' criterion.
13 This was suggested by Jason Yust in a personal communication. However, so far, more studies of entropy have been devoted to pitch or disambiguation of tonality than to rhythm.
14 A thorough comparison with the hundred-odd existing notions of musical entropy, including perceptual testing, would require a book, not an article.