===== Tutorial Vamp Plugins Part 3: Onsets, Beat and Tempo =====

Beat and tempo are determined in several steps using several MIR algorithms.

==== Onset Detection ====

First, the onsets of sound events are determined by //onset detection//, which measures changes within the audio signal using so-called //novelty// functions. Where the changes are large, i.e. the //novelty// is high, it is likely that a new sound event begins at that point. \\
There are several approaches to measuring the novelty value:
  * //Energy-based novelty// refers to changes in the overall energy (intensity/loudness) of the signal.
  * //Spectral-based novelty//, on the other hand, targets changes in the spectral energy distribution.
  * //Complex-domain novelty// combines the energy-based and the spectral-based approaches.
  * //Phase-based novelty// addresses changes in the phase of the signal.

Start Sonic Visualiser and load the file [[https://analyse.hfm-weimar.de/lib/exe/fetch.php?media=aphex_twin_bucephalus_bouncing_ball_selection.mp3|Audio01]].mp3. This is an excerpt from the track "Bucephalus Bouncing Ball" by [[https://en.wikipedia.org/wiki/Aphex_Twin|Aphex Twin]], which we already used in the first of the basic tutorials. Then select the menu item 'Transform' - 'Analysis by maker' - 'Queen Mary, University of London' - 'Note Onset Detector: Note Onsets...'.

{{:onset_detection.png?300 |}}

In the plugin window you can then specify the analysis approach in more detail:
  - Program: Please select the 'Percussive onsets' option here, as the audio consists mainly of percussive sounds.
  - Onset Detection Function Type: By selecting 'Percussive onsets', the detection type automatically switches to 'Broadband Energy Rise', i.e. the energy-based detection approach. Keep this setting.

If you press //OK//, a //Time Instants Layer// opens in which the detected onsets are marked by vertical bars. You can listen to the onset pulses along with the audio.
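To make the energy-based approach concrete, here is a minimal sketch in plain Python (not the Queen Mary plugin's actual implementation): the signal is cut into frames, the energy of each frame is measured, and frames where the energy rises sharply are marked as onset candidates. The signal, frame sizes, and threshold are illustrative assumptions.

```python
import math

def energy_novelty(signal, frame_size=256, hop=128):
    """Return a novelty curve: the positive energy increase per frame."""
    energies = []
    for start in range(0, len(signal) - frame_size + 1, hop):
        frame = signal[start:start + frame_size]
        energies.append(sum(x * x for x in frame) / frame_size)
    # Novelty = half-wave rectified difference of successive frame energies
    return [max(e2 - e1, 0.0) for e1, e2 in zip(energies, energies[1:])]

def detect_onsets(novelty, threshold):
    """Indices of local novelty peaks above the threshold."""
    return [i for i in range(1, len(novelty) - 1)
            if novelty[i] > threshold
            and novelty[i] >= novelty[i - 1]
            and novelty[i] >= novelty[i + 1]]

# Synthetic test signal: silence, then a 440 Hz tone starting at sample 1024
sr = 8000
signal = [0.0] * 1024 + [math.sin(2 * math.pi * 440 * n / sr) for n in range(1024)]
novelty = energy_novelty(signal)
onsets = detect_onsets(novelty, threshold=0.1)
print(onsets)  # peak frame indices near the silence/tone boundary
```

Spectral-, complex-domain-, and phase-based novelty replace the per-frame energy with measures computed from the short-time Fourier transform, but the peak-picking step works the same way.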
At the very bottom of the layer tab (right above the speed wheel) there are three playback mode icons:
  * Dots (left): volume of the onset pulses
  * Loudspeaker (center): turning the onset sounds on and off
  * Mixer (right): selection of the sound (choose 'Beep', it fits the music very well!)
  * With //Show// you can also hide the layer view.

==== Beat Detection ====

Based on regularities in the onsets, a probable basic beat is determined by assuming preferred tempo ranges (between about 40 bpm and 160 bpm). What do you think: will a beat algorithm be able to find a basic beat in the Aphex Twin excerpt? How does it handle the acceleration of the beats?

Now apply 'Transform' - 'Analysis by maker' - 'Queen Mary, University of London' - 'Bar and Beat Tracker' to the Audio01.mp3 file. At what point does the algorithm find a beat? How does it interpret the sections that follow? Did you notice before that the short sections with accelerations are always exactly eight beats (two bars) long?

{{ :audio01_bar_and_beat.png?500 |}}

In the plugin window you can set how many beats a bar of the recording has. The algorithm then counts through the beats cyclically. However, it fails to determine the beginning of the bar correctly.

Now apply the same plugin (Ctrl+T) to Audio02.mp3 ("Come Back, Baby" by Ray Charles). Since it is in 12/8 time, you can set '12' in the plugin window. How reliably does the plugin find the beat of the recording? And what about the beginning of the bar?

==== Tempo ====

Of course, tempo determination can only be as reliable as the beat detection. Since in "Come Back, Baby" the basic eighth-note beat is detected only in some passages, while in others approximately a dotted eighth note is taken as the basic beat, only those areas in which the beat click matches the beat of the recording should be used for tempo determination.
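The idea of inferring a beat from onset regularities within a preferred tempo range can be sketched very roughly as follows (this is an illustration, not the Bar and Beat Tracker's actual method): collect the intervals between successive onsets, keep only those whose implied tempo falls between 40 and 160 bpm, and take their median as the beat period. The onset times are invented for the example.

```python
import statistics

def estimate_beat_period(onset_times, min_bpm=40.0, max_bpm=160.0):
    """Median inter-onset interval (in seconds) within the preferred tempo range."""
    intervals = [t2 - t1 for t1, t2 in zip(onset_times, onset_times[1:])]
    # 40-160 bpm corresponds to intervals between 1.5 s and 0.375 s
    candidates = [d for d in intervals if 60.0 / max_bpm <= d <= 60.0 / min_bpm]
    if not candidates:
        return None
    return statistics.median(candidates)

# Hypothetical onsets at a steady 120 bpm (one every 0.5 s), with slight jitter
onsets = [0.0, 0.51, 1.0, 1.49, 2.0, 2.52, 3.0]
period = estimate_beat_period(onsets)
print(round(60.0 / period))  # prints 120 (the tempo in bpm)
```

The conversion in the last line, tempo = 60 / beat period, is also what turns the beat positions found by the plugin into the bpm values discussed in the next section. Real beat trackers must additionally handle missing or extra onsets and gradual tempo changes such as the accelerations in the Aphex Twin excerpt.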
In these areas, the tempo (in bpm = beats per minute) can be read off directly with the following plugin: //Transform// - //Analysis by maker// - //Queen Mary, University of London// - //Tempo and Beat Tracker//.

==== Deepening ====

For more in-depth coverage, please consult [[https://www.audiolabs-erlangen.de/resources/MIR/FMP/C6/C6.html|Chapter 6: Tempo and Beat Tracking]] of the FMP notebooks by Meinard Müller. Another helpful introduction to the capabilities of //Sonic Visualiser// for musicological analysis, focusing on the exploration of **microtiming** in classical music recordings, is [[https://www.charm.rhul.ac.uk/analysing/p9_1.html|A musicologist's guide to Sonic Visualiser]] by Nicholas Cook and Daniel Leech-Wilkinson.