APE,
I want to give you the benefit of the doubt here; you normally know exactly what you’re talking about, and I would defer to you on any topic regarding audio. We must just be failing to communicate at some level.
For example, your comment:
How exactly does the "After Touch" parameter affect the sound created
Suggests that we might be using different terms for something that I'm sure you must be aware of. Perhaps you know it as "Polyphonic Key Pressure" which can be found in the MIDI spec here:
http://www.midi.org/techspecs/midimessages.php
Also this comment:
a Note On command followed by a Note Off command does not define the duration of a sound.
Though correct in relation to the SOUND, this does not relate to the issue of MIDI transmitting performance data. The data transmitted from a MIDI instrument, AKA controller, IS time based, and thus the “Note On” followed by a “Note Off” does indeed denote how long the note was held on the MIDI controller. A synth can do anything it wants with that information, just as you described, but it is FACT that the MIDI controller is transmitting MIDI performance data, which has NOTHING to do with the actual sounds generated.
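To make the timing point concrete, here is a rough sketch in Python. The timestamps are an assumption for illustration (a real stream just arrives serially over the wire, and the receiver notes when each event arrives); the function itself is invented, not part of any real MIDI library:

```python
def note_durations(events):
    """Return {note: seconds held} from timestamped Note On / Note Off events.

    Each event is (arrival_time, status, note, velocity). Status 144 is
    "channel one Note On"; 128 is "channel one Note Off". A Note On with
    velocity 0 is also treated as a Note Off, per common MIDI practice.
    """
    on_time, durations = {}, {}
    for t, status, note, vel in events:
        if status == 144 and vel > 0:                        # Note On
            on_time[note] = t
        elif status == 128 or (status == 144 and vel == 0):  # Note Off
            durations[note] = t - on_time.pop(note)
    return durations

# A Note On arriving at t=0.0 followed by a Note Off at t=1.5 means the
# key was held for 1.5 seconds on the controller:
print(note_durations([(0.0, 144, 60, 120), (1.5, 128, 60, 0)]))
# {60: 1.5}
```

The duration is never stated in the messages themselves; it falls out of when the two events arrive, which is exactly the "performance data" point above.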
I KNOW digital communication protocols, of which MIDI is a very basic example. MIDI is a time based network protocol, wherein each MIDI event, which is composed of numeric data, is transmitted serially, one event after the other. Regardless of how nodes in the network respond, or what the data being transmitted is, the only thing going over those wires is data. Each data byte is limited to numeric values from 0–127 (status bytes use 128–255), however I can send any number of them that I like, thus any data you can imagine can be sent over MIDI. The MIDI protocol even allows for custom data to be sent, using SYSTEM EXCLUSIVE messages (also in the doc linked above).
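A minimal sketch of that serial framing (not a real MIDI library, just an illustration of the rule that a byte with its high bit set is a status byte starting a new event, and everything after it is that event’s data bytes):

```python
def parse_midi_stream(raw_bytes):
    """Split a serial MIDI byte stream into (status, data...) event tuples."""
    events = []
    current = []
    for b in raw_bytes:
        if b & 0x80:                 # high bit set: status byte (128-255),
            if current:              # which starts a new event
                events.append(tuple(current))
            current = [b]
        else:                        # data byte (0-127), belongs to the
            current.append(b)        # event currently being assembled
    if current:
        events.append(tuple(current))
    return events

# Two events transmitted serially, one after the other:
stream = [144, 60, 120,   # Note On, channel 1: note 60, velocity 120
          128, 60, 0]     # Note Off, channel 1: note 60, velocity 0
print(parse_midi_stream(stream))
# [(144, 60, 120), (128, 60, 0)]
```

Nothing in the parser knows or cares what the events mean; it only sees numbers going by on the wire.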
In response to:
… MIDI contains no parameters or data of any kind which defines the actual circle itself!
This is true, but for that matter neither does MIDI contain any parameters for any sound, it only contains numbers that represent the performance data. There is NOTHING about the actual SOUND in MIDI data.
For example: say we had this MIDI data in our synth buffer: 144 060 120
This is just data… the MIDI protocol only requires the synth to accept these bytes as valid input and forward them out any THRU ports, but beyond that the synth can do what it wants with them.
A piano synth, for example, might interpret the first byte, 144, as a “channel one Note On”, the second byte, 060, as the note “Middle C”, and the third byte, 120, as the velocity at which the “Middle C” key was pressed. The synth then takes the performance data and uses it as input parameters for the SOUND it will generate. The mapping between channels\notes and actual sounds is beyond the scope of this example, but suffice it to say that the performer pushed the Middle C key on his keyboard and a sound not unlike a piano Middle C was generated by the synth.
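A hypothetical sketch of that interpretation (the channel math and note naming follow the MIDI convention of note 60 being Middle C / C4; the function itself is made up for illustration):

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def interpret_as_performance(b1, b2, b3):
    """Read three bytes the way a piano synth would: as performance data."""
    if 144 <= b1 <= 159:                 # 0x90-0x9F: Note On, channels 1-16
        channel = (b1 - 144) + 1
        name = NOTE_NAMES[b2 % 12]       # note number -> pitch class
        octave = b2 // 12 - 1            # MIDI note 60 -> C4, "Middle C"
        return f"channel {channel}: Note On {name}{octave}, velocity {b3}"
    return "not a Note On in this sketch"

print(interpret_as_performance(144, 60, 120))
# channel 1: Note On C4, velocity 120
```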
Now imagine a different synth, the imaginary synth of our analogy. Our new synth would use that same data differently. The imaginary synth would use 144 as the X coordinate on some grid, 060 as the Y coordinate on that same grid, thus defining a “point”, and the third byte, 120, as the radius value. Instead of making sound, our imaginary synth would use those three inputs to plot a circle on a screen or piece of paper at the position (144, 060) with a radius of 120. How the synth does this is irrelevant to the discussion, in the same way the channel\note-to-sound mapping is irrelevant in the Piano synth example above.
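The same three bytes fed to the imaginary circle synth might look like this (again a made-up sketch; it just computes a few points on the circle’s circumference rather than actually drawing anything):

```python
import math

def interpret_as_circle(b1, b2, b3):
    """Read the same three bytes as X, Y, and radius, returning points
    on the circle's circumference that a plotter could draw."""
    x, y, r = b1, b2, b3
    return [(x + r * math.cos(t), y + r * math.sin(t))
            for t in (0, math.pi / 2, math.pi, 3 * math.pi / 2)]

# Identical data to the piano example, entirely different result:
# a circle centred at (144, 60) with radius 120.
points = interpret_as_circle(144, 60, 120)
```

Same bytes on the wire, two completely different outcomes, which is the whole point: the data carries no sound (or circle) in itself.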
Thanks