Acta Acustica, Volume 7, 2023
Article Number: 58
Number of pages: 6
Section: Musical Acoustics
DOI: https://doi.org/10.1051/aacus/2023055
Published online: 14 November 2023
Technical & Applied Article
How to build a MATLAB demonstrator solving dynamical systems in real-time, with audio output and MIDI control
1 Buffet Crampon, Mantes-la-Ville, France
2 Aix Marseille Univ, CNRS, Centrale Marseille, LMA, Marseille, France
* Corresponding author: tom.colinot@buffetcrampon.com
Received: 21 July 2023
Accepted: 10 October 2023
This paper explains and provides code to synthesize and control, in real-time, the audio signals produced by a dynamical system. The code uses only the Matlab programming language. It can be controlled with an external MIDI (Musical Instrument Digital Interface) device, such as a MIDI keyboard or wind controller, or with mouse-operated sliders. In addition to the audio output, the demonstrator computes and displays the amplitude and fundamental frequency of the signal, which is useful to quantify the dynamics of the model. The example system used here is a type of Van der Pol oscillator, but more complex systems can be handled. The demonstrator holds potential for pedagogical and preliminary research applications on various topics related to dynamical systems: direct and inverse bifurcations, transient effects such as dynamical bifurcations, artifacts introduced by integration schemes, and above all, the dynamics of self-sustained musical instruments.
Key words: Matlab / Real-time audio / Sound synthesis / Virtual musical instruments / Dynamical systems
© The Author(s), Published by EDP Sciences, 2023
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1 Introduction
Autonomous dynamical systems are complicated objects to study and teach. Even some of the simplest ones to formulate are extremely unpredictable. The richness of this behavior is not encapsulated in the usual description of the permanent regimes, i.e. equilibrium points or periodic regimes [1, 2]. Some of their solutions are non-periodic, or coexist with other stable solutions [3]. This makes it difficult to predict which type of solution is obtained in any given situation. When the system parameters vary, complicated transient effects emerge, such as hysteresis cycles [4] or dynamical bifurcations [5].
Self-oscillating musical instruments such as wind instruments or bowed strings are modeled using autonomous dynamical systems [6]. Their example illustrates how transient effects are essential to a complete description of the system’s real-life behavior, since they are experienced (at least) at the beginning and end of each note. In this spirit, a reasonable and compelling approach to experiencing and exploring how a dynamical system reacts is to implement a virtual “musical instrument” demonstrator. By manipulating the control parameters, the user can see and hear phenomena typical of nonlinear systems in real-time, in a very controlled and repeatable environment. This is particularly relevant in all fields related to music, such as musical acoustics and instrument making. In these fields, one can go even further by linking the system’s behavior to musical terms, such as intonation (flat, sharp), nuance (piano, forte) or transient dynamics (staccato, legato). The presented demonstrator aims to be as general as possible, meaning that the example model can be replaced by any simple dynamical system with only minor adjustments. While other environments for real-time sound synthesis exist, such as C++ (notably the JUCE library), Max/MSP, or Faust, this demonstrator is presented in a pure Matlab implementation, using only the Audio Toolbox (audio_system_toolbox, matlab, signal_blocks). This is advantageous for the many researchers who may want to reuse their pre-existing codes, solvers, systems, analysis or display tools. Note that the necessary built-in Matlab tools were not available before R2016a (for the Audio Toolbox and most of the elements of the user interface) and R2018a (for the MIDI message handling).
After a quick presentation of the dynamical system used in the demonstrator in Section 2, the implementation choices are detailed in Section 3. Ideally, the authors wanted this paper to follow the actual lines of code implementing the demonstrator. Sadly, the complete code is too lengthy for a journal paper (notably due to display functions, UI building and general housekeeping). With the intent of being as explicit as possible, Section 3 is built around several code snippets addressing the main challenges of a real-time audio demonstrator. The first of these code excerpts is self-contained and functional, and outputs sound generated by a dynamical system in real-time. The structure of the rest of the section follows that of the code, which is:
Initialization of the audio output object (the current example uses the audioDeviceWriter object).
Construction of the user interface: buttons, sliders and MIDI device handling.
Execution of the synthesis loop, typically a while loop with generation of new sound samples, and output of the audio through the audio output object.
Extraction of audio descriptors and display.
A link to the complete Matlab code is given in the Data Availability Statement, along with a compiled version, obtained with the Matlab compiler, that reacts slightly faster to user controls.
2 A simple system
For this demonstration, we use a modified Van der Pol oscillator with quadratic and fourth-power nonlinearities in the damping term. It can be seen as a simplified model of a self-sustained musical instrument. Notably, some fingerings of the saxophone display similar dynamics [7].
The system is studied in its linear stiffness form in [8]. It is very close to the normal form of the Bautin bifurcation [9]. As such, its behavior is well-known, but rich enough to illustrate a good number of phenomena typical of autonomous nonlinear systems. The governing equation is
Hereafter, we set ν = 0.5 and α = 1 (linear stiffness) or α = 3 (cubic stiffness). The parameters μ and σ are controlled by the user. The amplitude of the oscillations can be approximated analytically as shown in [8], setting x = X cos(t), as
In the range of parameters explored here, this approximation is extremely precise. This formula gives the blue surface in Figure 1. The system exhibits a Hopf bifurcation at μ = 0, meaning that a periodic oscillation emerges from the equilibrium [10]. This corresponds to the point at which the linear damping coefficient changes sign. Hence, the linearized oscillator becomes active (energy is created) when μ < 0. The Hopf bifurcation is supercritical for σ > 0 and subcritical for σ < 0. There is a saddle-node bifurcation at μ = σ^{2}/(4ν) when σ < 0, marking the turning point where the unstable and stable periodic solutions collapse together [11]. The saddle-node bifurcation merges with the Hopf bifurcation at σ = 0, forming the codimension-2 Bautin bifurcation [12]. This structure implies a bistability zone, where both the equilibrium and an oscillating (periodic) regime are stable. In this zone, the user can experience the practical implications of bistability around an inverse Hopf bifurcation, such as hysteresis cycles or the impossibility of obtaining an oscillation of arbitrarily low amplitude.
Figure 1 Interface of the real-time dynamical system demonstrator.
A modified version of equation (1) is used, in order to produce several notes without changing the dynamics of the continuous-time system. With a new parameter ω_{0}, the state-space representation of the system becomes
The time-domain integration of equation (1) is realized using three different discretization schemes. This demonstrates the versatility of the approach, and highlights the influence of the discretization scheme. The user can switch between schemes at any time. The simplest discretization scheme is the explicit Euler method [13], giving the (n + 1)th values of x and y as functions of the nth:
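In Matlab, one explicit Euler step can be sketched as follows. This is an illustrative reconstruction, not the demonstrator's actual code: the function names and the exact form of the right-hand side (a Van der Pol-type damping nonlinearity with assumed signs) are placeholders.

```matlab
% Sketch of one explicit Euler step, X[n+1] = X[n] + F(X[n])/Fs,
% for a two-dimensional state X = [x; y].  The right-hand side
% assumes a Van der Pol-type damping nonlinearity; names and signs
% are illustrative, not the exact demonstrator code.
function X_np1 = eulerStep(X, mu, nu, sigma, omega0, Fs)
    X_np1 = X + stateDerivative(X, mu, nu, sigma, omega0) / Fs;
end

function dX = stateDerivative(X, mu, nu, sigma, omega0)
    x = X(1); y = X(2);
    dX = omega0 * [y; (mu + sigma*x^2 - nu*x^4)*y - x];
end
```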
The second method is a fourth-order Runge–Kutta (RK4) integration scheme [14]. Using the F notation from equation (3), one computes the next sample by
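Using the same generic F notation, the classical RK4 step can be sketched as follows (a standard textbook formulation with fixed step dt = 1/Fs; the function names are illustrative, not the demonstrator's own):

```matlab
% One classical fourth-order Runge-Kutta step for X' = F(X),
% with Ffun a function handle to the state derivative and dt = 1/Fs.
function X_np1 = rk4Step(X, dt, Ffun)
    k1 = Ffun(X);
    k2 = Ffun(X + dt/2 * k1);
    k3 = Ffun(X + dt/2 * k2);
    k4 = Ffun(X + dt * k3);
    X_np1 = X + dt/6 * (k1 + 2*k2 + 2*k3 + k4);
end
```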
The third and last method is Matlab’s built-in ode45 solver [15]. It is also a type of Runge–Kutta integration scheme, but with an auto-adaptive time step.
3 Implementation of the demonstrator
This section describes the implementation of the main functionalities of the demonstrator. The complete source code can be downloaded from the repository in Zenodo https://doi.org/10.5281/zenodo.8413627 [16].
3.1 Realtime audio on Matlab
The current implementation uses the Audio Toolbox object audioDeviceWriter, which communicates with the audio driver of the computer. The Matlab version used during the writing of this article is R2021a. Figure 2 is a self-contained, minimal working example using the system from Section 2 solved with ode45. When running this code, please lower the volume, as the sound can be quite loud.
Figure 2 Minimal working example solving the oscillator of Section 2 in real-time, and outputting the audio stream. This code can be copy-pasted directly into the Matlab command window (depending on the PDF font used, the “^” power character may need to be manually rewritten). It is also provided as an M-file in the code archive that can be downloaded from the Data Availability Statement.
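The actual Figure 2 listing is provided in the code archive; a sketch in the same spirit is given below. The right-hand side, parameter values and scaling are illustrative assumptions (a Van der Pol-type damping nonlinearity), not the published code.

```matlab
% Sketch of a minimal real-time loop: integrate one buffer with ode45,
% then hand it to audioDeviceWriter.  The ODE right-hand side is an
% assumed Van der Pol-type form, not the exact published code.
Fs   = 44100;  Nbuf = 512;
ADW  = audioDeviceWriter('SampleRate', Fs, 'BufferSize', Nbuf);
mu = 0.5; nu0 = 0.5; sigma = 0.5; f0 = 440;
X = [0.01; 0];                           % small initial perturbation
odefun = @(t, Xt) 2*pi*f0 * [Xt(2); ...
    (mu + sigma*Xt(1)^2 - nu0*Xt(1)^4)*Xt(2) - Xt(1)];
for k = 1:round(5*Fs/Nbuf)               % about five seconds of audio
    [~, Xts] = ode45(odefun, (0:Nbuf)/Fs, X);
    X = Xts(end, :).';                   % last sample seeds the next buffer
    ADW(0.1 * Xts(2:end, 1) * [1 1]);    % attenuated mono signal on both channels
end
release(ADW);
```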
Note that, especially on Windows, best results are obtained using ASIO drivers instead of the default audio driver. In that case, the sample rate and buffer size of the ASIO driver should be equal to those of the Matlab audioDeviceWriter object. However, numerous different ASIO drivers exist depending on each user’s setup. Therefore, it is hard to provide a flexible and compact ASIO-based solution. Users of the code are encouraged to adjust the audioDeviceWriter parameters to their particular hardware and driver.
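Such an adjustment might look like the following; the driver availability and device names vary between machines, so these values are placeholders to adapt to one's own hardware.

```matlab
% Example audioDeviceWriter configuration for a Windows ASIO setup.
% Sample rate and buffer size should match the ASIO driver settings;
% the values below are placeholders.
ADW = audioDeviceWriter('SampleRate', 44100, ...
                        'BufferSize', 512, ...
                        'Driver', 'ASIO');
getAudioDevices(ADW)   % lists the output devices visible to this driver
```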
3.2 User interface
The user interface displayed in Figure 1 comprises a main display graph, three control sliders (μ, σ and f_{0} = ω_{0}/(2π)), two button groups setting the integration scheme and the stiffness exponent α, and four buttons for other user actions.
3.3 Musical instrument control through MIDI
A natural way to control the demonstrator is through the MIDI protocol. This is especially relevant for any kind of music-related interpretation, as external MIDI controllers make it possible to play the demonstrator like a keyboard synthesizer or a wind instrument. In a more general context, MIDI controllers allow for more fluid control than a mouse and sliders. For instance, MIDI controllers facilitate the simultaneous variation of several parameters.
Matlab’s Audio Toolbox supports MIDI through the mididevice object. First, the mididevice object is created based on a user input given through the MIDIlistbox object.
midicontroller = ...
    mididevice('Input',MIDIlistbox.Value);
Then, at every iteration of the synthesis loop (i.e. before each audio buffer is filled with samples), the pending MIDI messages are gathered using
msgs = midireceive(midicontroller);
This gives an array of midimsg objects. They signal control parameter changes or note changes, depending on their type. This information can be retrieved by accessing the Type property of the midimsg and comparing it to a string, for instance “NoteOn” or “ControlChange” (CC). However, this is slower (sometimes by a factor of ten) than directly reading and comparing the bytes of the messages, which hold the same information. The gain in speed is especially interesting in the case of a wind controller, which sends ControlChange messages very often to translate the blowing pressure of the musician. An array of the message bytes is created by
msgsbytes = vertcat(msgs.MsgBytes);
and then parsed using the first byte as the message type identifier (176 for ControlChange, 144 for NoteOn). In the case of ControlChange, the second byte in the array indicates the CC number. Different CC numbers can be linked to different parameters. For example, the parameter μ is linked to CC number muCC (2 by default), which is parsed from the MIDI message bytes by
imsgCCmu = find((msgsbytes(:,1)==176)...
    &(msgsbytes(:,2)==muCC),1,'last');
In order to apply only the most recent user-provided command, only the last message is read. The new value of the control is contained in the third byte of that message (at index imsgCCmu in the array):
newCCmu = double(msgsbytes(imsgCCmu,3));
This value, scaled between the control parameter limits, gives the new control parameter value.
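The scaling can be a simple affine map of the 7-bit value onto the parameter range. The bounds muMin and muMax below are placeholders for whatever limits the interface uses, not values taken from the published code.

```matlab
% Affine mapping of a 7-bit MIDI CC value (0-127) onto the parameter
% range.  muMin and muMax are placeholder bounds, e.g. those of the
% corresponding slider.
muMin = -0.5; muMax = 0.5;
if ~isempty(imsgCCmu)
    newCCmu = double(msgsbytes(imsgCCmu,3));
    mu = muMin + (newCCmu/127) * (muMax - muMin);
end
```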
4 Synthesis loop
Each iteration of the synthesis loop produces enough audio samples to fill one audio buffer. Its structure in pseudocode is
while (stopbutton is not pushed)
Read user controls
Solve equation during one audio buffer
Format solution as audio output
Extract signal descriptors from solution
Update display
Record solution
Output sound
Check pending displays or callbacks
end
The following subsections detail each line of this pseudocode block.
4.1 Read user controls
User controls are read using either the MIDI messages or the sliders. There are three methods to assign the value of a slider to a variable. First, it is possible to check the Value property of the slider on each loop iteration. This is done for the f_{0} parameter. This solution is very close to the second method, using the ValueChangedFcn callback. In both cases, the changes are applied when the user releases the slider thumb. The third solution is the slider callback function ValueChangingFcn, which is called while the user moves the slider. This solution is necessary to render progressive variations of the parameters. The control parameters μ and σ are updated with this method. This is useful to follow a quasi-static path along the bifurcation diagram, or to execute a slow attack through the Hopf bifurcation.
The button values are read and stored in separate variables, to be used in the loop.
4.2 Solve equation during one audio buffer
Depending on the structure of the solver function, this step can take two forms. If the solver sets its own time step, and a fortiori if it is auto-adaptive (like ode45), the solver function is called once to generate the total number Nbuf of samples in one audio buffer. This is implemented as
[~,Xts] = ode45(@(t,Xt) VanDerPol5_odefun(...
    Xt,t,mu,nu0,sigma,2*pi*f0),(0:Nbuf)/Fs,X.');
X = Xts(end,:);
positions = Xts(2:end,1);
speeds = Xts(2:end,2);
Note that the initial condition X is returned as the first row of the solution Xts. However, it is (by definition) the last row of the previous solution. Repeating it twice in the audio stream causes clicks and artifacts. Therefore, ode45 is called for Nbuf+1 time steps, and only the last Nbuf constitute the audio output.
If, on the contrary, the solver simply gives X[n +1] as a function of X[n] (like the RK4 and explicit Euler schemes), it is called Nbuf times. Then, the solving step is
for ibuf = 1:Nbuf
    X_np1 = VanDerPol5_backwardsEuler(...
        X,mu,nu0,sigma,2*pi*f0,Fs);
    X = X_np1;
    positions(ibuf) = X(1);
    speeds(ibuf) = X(2);
end
We use a test to estimate the maximum number of oscillators that can be integrated simultaneously. The test is done with no user interface or separate display. It uses a 44.1 kHz sample rate and a 512-sample buffer. On the laptop this was implemented on, between 3 and 9 ode45-solved oscillators can run in parallel while keeping a fluid audio flux (depending on power consumption options). Between 4 and 12 parallel oscillators run with the RK4 solver. Using the simpler explicit Euler scheme allows between 28 and 82 oscillators to run in parallel. This gives an idea of the headroom of this architecture to accommodate bigger systems. As this result heavily depends on the user’s hardware, the code to reproduce the test is given in the code archive linked in the Data Availability Statement, so each user can assess their potential for more complex systems.
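The principle of such a test can be sketched as follows: time the integration of one buffer for increasing oscillator counts and compare it with the buffer duration. This is a sketch of the idea, not the archive's benchmark script; the right-hand side is the illustrative Van der Pol-type form assumed earlier.

```matlab
% Benchmark sketch: real-time operation requires integrating one
% 512-sample buffer for all oscillators in less than the buffer
% duration.  The ODE right-hand side is an assumed illustrative form.
Fs = 44100; Nbuf = 512; tBuf = Nbuf/Fs;
mu = 0.5; nu0 = 0.5; sigma = 0.5; f0 = 440;
odefun = @(t, Xt) 2*pi*f0 * [Xt(2); ...
    (mu + sigma*Xt(1)^2 - nu0*Xt(1)^4)*Xt(2) - Xt(1)];
for nOsc = [1 2 4 8 16 32]
    tic;
    for k = 1:nOsc
        ode45(odefun, (0:Nbuf)/Fs, [0.01; 0]);
    end
    % values above 100% mean the audio stream would stall
    fprintf('%2d oscillators: %5.1f%% of buffer duration\n', ...
        nOsc, 100*toc/tBuf);
end
```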
4.3 Format solution as audio output
The solution of the equation must fit inside an Nbuf-by-2 matrix, which is passed as an argument to the audioDeviceWriter object. Here, for this simple system, minimal processing is applied. The solution is scaled by Xmax, an analytical estimate of the maximum amplitude for the considered control parameter range. The left and right channels receive the position and the speed, respectively. This way, a direct plot of the audio output represents the phase space of the oscillator.
audioout = [positions(:) speeds(:)]/Xmax;
In a more general case, it can be useful to listen to certain physical variables rather than others, or process them in a specific manner (filtering or nonlinear processing) for illustrative or aesthetic purposes.
4.4 Extract signal descriptors from solution
Only basic signal descriptors are extracted in this demonstrator: RMS amplitude (or rather, mean distance to the origin in the phase space) and fundamental frequency. The RMS amplitude is used as the main display indicator. For this system, it is sufficient to differentiate solution branches and locate bifurcations. The fundamental frequency estimate is computed by a simplistic algorithm based on zero-crossings. It helps to quantify the detuning effect of the different integration schemes, and of the cubic stiffness.
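One possible form of such a zero-crossing estimator, operating on one buffer of the position signal, is sketched below. This is a hypothetical implementation of the general idea, not necessarily the exact algorithm used in the demonstrator.

```matlab
% Hypothetical zero-crossing f0 estimator: count rising zero crossings
% in the buffer and divide by the time spanned between the first and
% last crossing.  x is one buffer of samples, Fs the sample rate.
function f0est = zeroCrossingF0(x, Fs)
    idx = find(x(1:end-1) < 0 & x(2:end) >= 0);  % rising crossings
    if numel(idx) < 2
        f0est = NaN;                             % not enough periods in buffer
    else
        f0est = (numel(idx) - 1) * Fs / (idx(end) - idx(1));
    end
end
```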
4.5 Update display
Any systematic real-time display concurrent with an audio process on Matlab needs to be kept as light as possible, so as not to perturb the audio flux. This demonstrator updates a plot with a single point, and an animatedline object, inside the loop. These graphical objects represent the current and recent RMS amplitude of the solution.
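As an illustration, such a lightweight display could be set up once before the loop and updated inside it. The variable names (mu, rmsVal) and the bounded history length are hypothetical, not taken from the published code.

```matlab
% Lightweight display sketch: a single point for the current RMS
% amplitude and an animatedline for its recent history.  Variable
% names (mu, rmsVal) are hypothetical.
figure; hold on
hPoint = plot(NaN, NaN, 'o');                    % current operating point
hLine  = animatedline('MaximumNumPoints', 500);  % bounded history, cheap to redraw
% ... inside the synthesis loop, once per buffer:
set(hPoint, 'XData', mu, 'YData', rmsVal);
addpoints(hLine, mu, rmsVal);
```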
4.6 Record solution if necessary
A button on the interface records the descriptors, control parameters and button values in a structure once per loop iteration. In the present demonstrator, no variable is recorded at audio rate. This prevents excessive memory usage and disk access in the event of a long recording. The data is saved in a MAT-file. It is also used, as soon as the recording is stopped, to provide a multipurpose plot designed to support a quick analysis of the results. An example of this plot is displayed in Figure 3, where the effect of the integration scheme on fundamental frequency is illustrated. Starting with the RK4 solver, the following parameter variation is applied: beginning at μ = 0.5 and σ = 0.5, the μ parameter is decreased slowly to its minimum μ = −0.5, followed by the parameter σ, which also decreases until around σ = −0.6. At this point, the RMS value of the oscillation is at its highest. The parameters σ and then μ are then slowly brought back to their initial values. This scenario is repeated with ode45, starting from approximately 22 s, and with the explicit Euler scheme, starting from approximately 39 s. One can see in the fundamental frequency subplot that the RK4 scheme provides the most consistent fundamental frequency, followed closely by the ode45 solver, whose variations do not exceed 1 Hz around the 440 Hz expected frequency (i.e. the eigenfrequency of the linearized oscillator). However, the explicit Euler scheme entails considerable pitch flattening as the signal amplitude increases, down to about −6 Hz (about −20 cents) from the expected frequency.
Figure 3 Example of the multipurpose plot created at the end of each recording, illustrating the fundamental frequency variation due to different integration schemes. See Section 4.6 for details. 
4.7 Output sound
The audioDeviceWriter object is used to output audio. It is also responsible for the scheduling of the loop.
ADW(audioout);
4.8 Check pending displays or callbacks
The execution of user interface object callbacks and the refreshing of the display are ensured by the command drawnow limitrate, placed at the end of the loop. A simple drawnow would degrade the audio output by introducing too many pauses, but drawnow limitrate limits the number of updates to 20 per second. This is essential to keep a robust audio stream.
5 Video demonstration
The video, also linked in the caption of Figure 4, illustrates possible uses of the demonstrator, and showcases the fluidity of the controls.
Figure 4 A snapshot of the illustrative video linked at https://youtu.be/_ExgRsgB7wc. 
6 Conclusion
The presented demonstrator holds a lot of potential for teachers and researchers combining dynamical systems with audio engineering or music. In terms of teaching, it illustrates transient effects in direct and entertaining ways. It can also be useful for proofs of concept, to quickly assess the behavior of a dynamical system, or to compare two slightly different versions (parameter values, physical hypotheses or integration schemes).
Because it relies solely on Matlab, any researcher who has mainly been coding in Matlab can reuse their usual tools. In particular, we show that a continuous-time formulation of a system can be sufficient to produce sound, simply by leaving the integration to a built-in method such as ode45. There is also reasonable headroom to adapt the demonstrator to a more complex system, especially if one is willing to simplify the integration scheme.
In music-related fields, this kind of demonstrator is all the more interesting as it bridges the gap between the physical model of an instrument and the actual instrument, by allowing a model to be controlled with any musical controller supporting the MIDI protocol. The authors strongly believe in the potential of real-time physical model control as a way to contribute to an objective definition of the “playability” or “ease of playing” of a musical instrument.
Conflict of interest
The authors declare no conflict of interest.
Acknowledgments
This study has been supported by the French ANR LabCom LIAMFI (ANR-16-LCV2-0007-01). The authors are grateful to Teodor Wolter for proofreading the English.
Data Availability Statement
The source code and .exe installer file for the demonstrator, as well as M-files reproducing the results of Sections 3.1 and 4.2, can be downloaded from the repository on Zenodo: https://doi.org/10.5281/zenodo.8413627 [17].
References
[1] N.M. Krylov, N.N. Bogoliubov: Introduction to non-linear mechanics, No. 11, Princeton University Press, 1950.
[2] R. Seydel: Practical bifurcation and stability analysis, Vol. 5, Springer Science & Business Media, 2009.
[3] C. Grebogi, E. Ott, J.A. Yorke: Chaos, strange attractors, and fractal basin boundaries in nonlinear dynamics, Science 238, 4827 (1987) 632–638.
[4] T. Tachibana, K. Takahashi: Sounding mechanism of a cylindrical pipe fitted with a clarinet mouthpiece, Progress of Theoretical Physics 104, 2 (2000) 265–288.
[5] B. Bergeot, C. Vergez: Analytical expressions of the dynamic Hopf bifurcation points of a simplified stochastic model of a reed musical instrument, Nonlinear Dynamics 107 (2022) 3291–3312.
[6] A. Chaigne, J. Kergomard: Acoustique des instruments de musique (Acoustics of musical instruments), Belin, 2008.
[7] T. Colinot: Numerical simulation of woodwind dynamics: investigating nonlinear sound production behavior in saxophone-like instruments, PhD thesis, Aix-Marseille Université, 2020.
[8] D. Dessi, F. Mastroddi, L. Morino: A fifth-order multiple-scale solution for Hopf bifurcations, Computers & Structures 82, 31–32 (2004) 2723–2731.
[9] J. Guckenheimer, Y.A. Kuznetsov: Bautin bifurcation, Scholarpedia 2, 5 (2007) 1853.
[10] Y.A. Kuznetsov: Andronov-Hopf bifurcation, Scholarpedia 1, 10 (2006) 1858.
[11] Y.A. Kuznetsov: Saddle-node bifurcation, Scholarpedia 1, 10 (2006) 1859.
[12] W.-J. Beyn, A. Champneys, E. Doedel, W. Govaerts, Yu.A. Kuznetsov, B. Sandstede: Numerical continuation, and computation of normal forms, in: Handbook of Dynamical Systems, Vol. 2, Elsevier, 2002.
[13] J.C. Butcher: Numerical methods for ordinary differential equations, John Wiley & Sons, 2016.
[14] J.C. Butcher: A history of Runge–Kutta methods, Applied Numerical Mathematics 20, 3 (1996) 247–260.
[15] L.F. Shampine, M.W. Reichelt: The MATLAB ODE suite, SIAM Journal on Scientific Computing 18, 1 (1997) 1–22.
[16] T. Colinot, C. Vergez: Dynamical system audio demonstrator [Code], GitHub, 2023. https://github.com/TomColinot/DynamicalSystemAudioDemonstrator/.
[17] T. Colinot: TomColinot/dynamical-system-audio-demonstrator: resubmission release (v1.1.0-beta), Zenodo, 2023. https://doi.org/10.5281/zenodo.8413627.
Cite this article as: Colinot T. & Vergez C. 2023. How to build a MATLAB demonstrator solving dynamical systems in real-time, with audio output and MIDI control. Acta Acustica, 7, 58.