One application area requiring careful attention to the control stream/audio signal boundary is sampling. Until now our samplers have skirted the issue by looping perpetually. This allows for a rich variety of sound that can be accessed by making continuous changes in parameters such as loop size and envelope shape. However, many uses of sampling require the internal features of a wavetable to emerge at predictable, synchronizable moments in time. For example, recorded percussion sounds are usually played from the beginning, are not often looped, and are usually played in a determined time relationship with the rest of the music.
In this situation, control streams are better adapted than audio signals as triggers. Example C05.sampler.oneshot.pd (Figure 3.14) shows one possible way to accomplish this. The four tilde objects at bottom left form the signal processing network for playback. One vline~ object generates a phase signal (actually just a table lookup index) for the tabread4~ object; this replaces the phasor~ of Example B03.tabread4.pd (Page ) and its derivatives.
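As a point of reference, the lookup stage can be sketched in a few lines of Python. This is only a model of what a 4-point interpolating table read does with a fractional index; it uses the standard 4-point Lagrange formula and is not claimed to reproduce tabread4~'s exact arithmetic:

    # Model of an interpolating table lookup at a fractional index x.
    # Valid indices run from 1 to len(table) - 3, as with tabread4~.
    def tabread4(table, x):
        i = int(x)        # integer part of the index
        f = x - i         # fractional part, 0 <= f < 1
        ym1, y0, y1, y2 = table[i - 1], table[i], table[i + 1], table[i + 2]
        return (-f * (f - 1) * (f - 2) / 6.0 * ym1
                + (f + 1) * (f - 1) * (f - 2) / 2.0 * y0
                - (f + 1) * f * (f - 2) / 2.0 * y1
                + (f + 1) * f * (f - 1) / 6.0 * y2)

Driving this function with a steadily increasing x plays the table back; the slope of x (in table points per sample) sets the playback speed.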
The amplitude of the output of tabread4~ is controlled by a second vline~ object, in order to prevent discontinuities in the output in case a new event is started while the previous event is still playing. The "cutoff" vline~ object ramps the output down to zero (whether or not it is playing) so that, once the output is zero, the index of the wavetable may be changed discontinuously.
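Both the phase and this muting gain are generated by vline~, which produces linear ramps with sample accuracy. As a rough model (it handles only one segment at a time, whereas the real vline~ can schedule a whole list of future segments), one might write:

    # Rough, single-segment model of a vline~-style ramp at sample rate SR.
    SR = 44100.0

    class Ramp:
        def __init__(self, value=0.0):
            self.value = value          # current output value
            self.increment = 0.0        # per-sample change while ramping
            self.samples_left = 0

        def set(self, target, msec=0.0):
            if msec <= 0:               # no time given: jump immediately
                self.value, self.samples_left = target, 0
            else:                       # ramp linearly to the target
                self.samples_left = max(1, int(msec * SR / 1000.0))
                self.increment = (target - self.value) / self.samples_left

        def tick(self):                 # compute one output sample
            if self.samples_left > 0:
                self.value += self.increment
                self.samples_left -= 1
            return self.value

With one Ramp for the phase and one for the gain, each output sample is simply tabread4(table, phase.tick()) * gain.tick(); a click-free restart only requires that the gain reach zero before the phase jumps.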
To start a new "note", the "cutoff" vline~ object is first ramped to zero; then, after a delay of 5 msec (by which time the ramp has reached zero), the phase is reset. This is done with two messages: first, the phase is set to 1, with no time value so that it jumps there without ramping. The value 1 specifies the first readable point of the wavetable, since we are using 4-point interpolation. Second, in the same message box, the phase is ramped to 441,000,000 over a time period of 10,000,000 msec. (In Pd, large numbers are shown in exponential notation; these two appear as 4.41e+08 and 1e+07.) The quotient is 44.1 table points per millisecond, or 44,100 points per second; since this equals the sample rate, the wavetable is read back at its original speed, giving a transposition of one. The upper vline~ object (which generates the phase) receives these messages via the "r phase" object above it.
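The huge target and time values merely ensure that the ramp outlasts any realistic wavetable; only their quotient matters. Here is a sketch of the note-start sequence, with Pd's send/receive replaced by a print stub (the names send and start_note are illustrative, not objects in the patch):

    import time

    SR_KHZ = 44.1                  # sample rate, in table points per millisecond

    def send(dest, *args):
        print(dest, *args)         # stand-in for Pd's "s cutoff" / "s phase"

    def start_note(transposition=1.0):
        send("cutoff", 0, 5)       # ramp the muting gain to zero over 5 msec
        time.sleep(0.005)          # the patch waits 5 msec with a delay object
        # In the patch the next two commands share one message box, separated
        # by a comma: "1, 4.41e+08 1e+07".
        send("phase", 1)                                  # jump, no ramp
        send("phase", transposition * SR_KHZ * 1e7, 1e7)  # 4.41e+08 over 1e+07 msec

Calling start_note(1) reproduces the patch's numbers; other transposition values simply scale the ramp target.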
The example assumes that the wavetable is ramped smoothly to zero at either end, and the bottom right portion of the patch shows how to record such a wavetable (in this case four seconds long). Here a regular (and computationally cheaper) line~ object suffices. Although the wavetable should be at least four seconds long for this to work, you may record shorter wavetables simply by cutting the line~ object off earlier. The only caveat is that, if you are reading and writing the same wavetable simultaneously, you should avoid situations where the read and write operations hit the same portion of the wavetable at once.
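The smooth ramp to zero at both ends amounts to multiplying the incoming sound by a trapezoidal envelope. The patch does this with line~ while recording; the sketch below does the equivalent job after the fact, with an arbitrarily chosen 5-msec ramp (that value is an assumption for illustration, not taken from the patch):

    # Fade a recorded block in and out so it starts and ends at zero.
    SR = 44100

    def enveloped(recording, ramp_msec=5.0):
        n = len(recording)
        r = min(n // 2, int(ramp_msec * SR / 1000.0))   # ramp length in samples
        out = list(recording)
        for i in range(r):
            g = i / float(r)        # gain rises from 0 toward 1
            out[i] *= g             # fade in at the start
            out[n - 1 - i] *= g     # fade out at the end
        return out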
The vline~ objects surrounding the tabread4~ were chosen over line~ because line~ rounds its breakpoints to the nearest block boundary (typically 1.45 msec). If the wavetable is repeated more than 10 or 20 times per second, this rounding makes for audible aperiodicities in the sound, and at higher rates of repetition it would prevent you from getting a clean, periodic result.
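To put the rounding in perspective, here is a quick calculation assuming Pd's default block size of 64 samples at 44.1 kHz (which is where the 1.45-msec figure comes from):

    # Timing quantization of line~ (one block) as a fraction of the
    # repetition period, for several repetition rates.
    SR, BLOCK = 44100.0, 64
    block_msec = 1000.0 * BLOCK / SR        # about 1.45 msec

    for reps_per_sec in (1, 10, 20, 50, 100):
        period_msec = 1000.0 / reps_per_sec
        print("%3d Hz: up to %.1f%% of a period" %
              (reps_per_sec, 100.0 * block_msec / period_msec))

At one repetition per second the error is well under a percent of the period; at 20 per second it approaches three percent, and it only grows from there, which is why vline~'s sample-accurate breakpoints are needed here.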
We will return to vline~-based sampling in the next chapter, to add transposition, envelopes, and polyphony.