
The overall goal of this project is to produce good components for this class of instrument. This is done by keeping the components loosely coupled, so that they never get bound too tightly to this exact instrument.

  • The lowest layer is the synth proxy, CoreMIDIRenderer, which is an OS-dependent module. It forwards messages out to CoreMIDI and is tightly bound to iOS. It offers an interface like this (sketched in code after this list):

    • putchar(char)
    • flush()
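A minimal sketch of that proxy interface, with hypothetical C names (the real CoreMIDIRenderer symbols may differ):

```c
/* Hypothetical sketch of the synth-proxy interface: raw MIDI bytes are
   queued one at a time, then flushed out to CoreMIDI in one batch. */
void CoreMIDIRenderer_putch(char midiByte);  /* queue one raw MIDI byte       */
void CoreMIDIRenderer_flush(void);           /* send queued bytes to CoreMIDI */
```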
  • The Fretless layer is the MIDI buffer generator. It has no explicit dependencies on anything. A pair of function pointers (a putchar/flush pair) representing a synth proxy is injected into it on startup, as illustrated below. The front of this interface takes frequency-oriented pitches that are per-finger, where each finger is assigned to a polyphony group. This is done to hide the vulgarities of MIDI channels, as writing MIDI directly involves complex management of channel cycling and manually enabling and disabling notes to handle polyphony and legato.

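As a rough illustration of that dependency injection (all names and types here are invented for the sketch, not the actual Fretless API):

```c
/* Hypothetical types for the injected synth-proxy pair. */
typedef void (*midi_putch_fn)(char midiByte);
typedef void (*midi_flush_fn)(void);

/* Hypothetical Fretless context holding the injected pointers.
   Fretless itself never includes any iOS or CoreMIDI headers. */
struct FretlessCtx {
    midi_putch_fn putch;
    midi_flush_fn flush;
};

static struct FretlessCtx fretless_init(midi_putch_fn p, midi_flush_fn f)
{
    struct FretlessCtx ctx = { p, f };
    return ctx;
}

/* Example: emitting a note-on in terms of the injected pair. The real
   Fretless layer hides channel cycling, pitch bends, and note on/off
   bookkeeping behind its per-finger, frequency-oriented front end. */
static void fretless_note_on(struct FretlessCtx *ctx, int channel, int note, int vel)
{
    ctx->putch((char)(0x90 | (channel & 0x0F))); /* note-on status byte */
    ctx->putch((char)(note & 0x7F));
    ctx->putch((char)(vel & 0x7F));
    ctx->flush();
}
```

On startup, the iOS side would pass CoreMIDIRenderer's putchar/flush pair into something like fretless_init, which is the only point where the two layers meet.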
  • The Fret layer takes candidate pitches and snaps them to frets (the sketch below shows the idea). It maintains the set of selected frets and provides the mechanism to iterate over them. It knows nothing about the screen layout.

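A sketch of the snapping idea, with hypothetical names (not the actual Fret API):

```c
#include <math.h>

/* Hypothetical fret table; pitches are fractional MIDI note numbers. */
struct Frets {
    float pitches[64];  /* selected fret pitches */
    int   count;
};

/* Snap a candidate pitch to the nearest selected fret. The real Fret
   layer also maintains which frets are selected and lets callers
   iterate them; this only shows the snapping step. */
static float fret_snap(const struct Frets *f, float candidate)
{
    float best = candidate;
    float bestDist = INFINITY;
    for (int i = 0; i < f->count; i++) {
        float d = fabsf(candidate - f->pitches[i]);
        if (d < bestDist) {
            bestDist = d;
            best = f->pitches[i];
        }
    }
    return best;
}
```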
  • The PitchHandler layer is a proxy above Fretless and Fret. It includes the rules for laying out the moveable frets, and for time-based pitch drifting. In this layer, it is important that the coordinate system is normalized to the rectangle ((0,0), (1,1)). This layer remembers where fingers are located and what states they are in, so that other parts of the system can query it (see the sketch below). PitchHandler wraps around Fret to some degree, as their responsibilities overlap somewhat.

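For illustration only, a hypothetical per-finger record in that normalized space (the real PitchHandler state is richer):

```c
/* Hypothetical per-finger state in PitchHandler's normalized space,
   where x and y both lie inside the rectangle ((0,0),(1,1)). */
struct FingerState {
    int   fingerId;      /* unique finger integer from TouchMapping      */
    float x, y;          /* normalized position, 0..1 in both dimensions */
    float targetPitch;   /* fret-snapped pitch the finger drifts toward  */
    float currentPitch;  /* pitch after time-based drift is applied      */
    int   isDown;        /* whether the finger is currently touching     */
};

/* Time-based drift toward the snapped target, one small step per tick.
   'rate' is a hypothetical drift-speed parameter in [0,1]. */
static void finger_drift(struct FingerState *f, float rate)
{
    f->currentPitch += (f->targetPitch - f->currentPitch) * rate;
}
```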
  • The TouchMapping layer simply maps touch pointers to unique finger integers, so that iOS-isms don't creep into Fretless (sketched below).

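A sketch of that mapping; touches are treated as opaque pointers so nothing iOS-specific leaks downward (names are invented):

```c
#include <stddef.h>

#define MAX_FINGERS 16

/* Hypothetical table mapping opaque touch pointers (e.g. UITouch*)
   to small, stable finger integers that the lower layers understand. */
static const void *touchTable[MAX_FINGERS];

/* Return the finger index for a touch, allocating a slot on first
   sight of the pointer, or -1 if every slot is already taken. */
static int touchmapping_finger(const void *touch)
{
    int freeSlot = -1;
    for (int i = 0; i < MAX_FINGERS; i++) {
        if (touchTable[i] == touch) return i;
        if (touchTable[i] == NULL && freeSlot < 0) freeSlot = i;
    }
    if (freeSlot >= 0) touchTable[freeSlot] = touch;
    return freeSlot;
}

/* Release the slot when the touch ends. */
static void touchmapping_release(const void *touch)
{
    for (int i = 0; i < MAX_FINGERS; i++)
        if (touchTable[i] == touch) touchTable[i] = NULL;
}
```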
  • The Translation layer handles the rotation of the user interface and the translation from native touch coordinates. Touches arrive oriented to the device's ((0,0), (width,height)) rectangle; if the UI is rotated relative to the device's coordinates, that rotation is also applied here, so that the OpenGL and PitchHandler parts never have to know about it. A sketch of the idea follows.

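A sketch of that normalization plus an assumed 90-degree UI rotation (hypothetical; the real Translation code may handle other orientations):

```c
/* Hypothetical translation from native touch coordinates in
   ((0,0),(width,height)) into the normalized ((0,0),(1,1)) rectangle,
   optionally rotating by 90 degrees when the UI is rotated relative
   to the device, so OpenGL and PitchHandler never see orientation. */
static void translate_touch(float touchX, float touchY,
                            float width, float height,
                            int uiRotated90,
                            float *outX, float *outY)
{
    float nx = touchX / width;   /* normalize to 0..1 */
    float ny = touchY / height;
    if (uiRotated90) {
        float tmp = nx;          /* rotate the normalized point */
        nx = ny;
        ny = 1.0f - tmp;
    }
    *outX = nx;
    *outY = ny;
}
```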
  • The GenericTouches layer collects all of the portable parts of touch handling. It also does the chorusing effect, which is implemented by duplicating MIDI touches, as roughly illustrated below. (Auto-harmony lines would go here as well if they were implemented.) It works with PitchHandler to generate the right pitches for Fretless.

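A rough illustration of chorusing as touch duplication (the callback type and names are invented; the real code routes through PitchHandler and Fretless):

```c
/* Hypothetical note-start callback supplied by the lower layers:
   fingerId picks the polyphony group, pitch is a fractional MIDI
   note number, velocity is 0..127. */
typedef void (*note_begin_fn)(int fingerId, float pitch, int velocity);

/* Chorusing as touch duplication: the same touch is sent twice, the
   second copy slightly detuned and mapped to a separate finger id so
   that it gets its own voice. */
static void begin_chorused_touch(note_begin_fn begin,
                                 int fingerId, float pitch, int velocity)
{
    const float detune = 0.06f;        /* hypothetical detune, in semitones */
    begin(fingerId, pitch, velocity);               /* the real touch       */
    begin(fingerId + 16, pitch + detune, velocity); /* the duplicated touch */
}
```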
  • VertexObjects is an iterator for generating OpenGL objects out of line and triangle primitives (sketched below).

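A sketch of what such an iterator might look like, assuming OpenGL ES 1.x client arrays (the real VertexObjects layout may differ):

```c
#include <OpenGLES/ES1/gl.h>   /* assumed iOS OpenGL ES 1.x header */

/* Hypothetical vertex-object record: one run of line or triangle
   primitives that share a single draw call. */
struct VertexObject {
    GLenum         kind;      /* GL_LINES or GL_TRIANGLES      */
    const GLfloat *vertices;  /* interleaved x,y pairs          */
    const GLfloat *colors;    /* interleaved r,g,b,a per vertex */
    GLsizei        count;     /* number of vertices             */
};

/* Iterate the generated objects and hand each one to OpenGL. */
static void vertexobjects_draw(const struct VertexObject *objs, int n)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_COLOR_ARRAY);
    for (int i = 0; i < n; i++) {
        glVertexPointer(2, GL_FLOAT, 0, objs[i].vertices);
        glColorPointer(4, GL_FLOAT, 0, objs[i].colors);
        glDrawArrays(objs[i].kind, 0, objs[i].count);
    }
}
```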
  • The GenericRendering layer collects all of the portable user interface rendering. It iterates the VertexObjects to generate the user interface, and it also uses the PitchHandler to get information on where the fingers are; a rough sketch follows.

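Putting those last two pieces together, a frame in the portable renderer might look roughly like this (every name here is invented for illustration):

```c
/* Hypothetical glue for the portable render pass: ask PitchHandler
   where the fingers are, then draw a marker for each one. The fret
   lines and the rest of the UI would be emitted the same way. */
struct FingerInfo { float x, y; int isDown; };

typedef int  (*query_fingers_fn)(struct FingerInfo *out, int maxFingers);
typedef void (*draw_marker_fn)(float x, float y);

static void render_fingers(query_fingers_fn queryFingers, draw_marker_fn drawMarker)
{
    struct FingerInfo fingers[16];
    int n = queryFingers(fingers, 16);               /* query PitchHandler */
    for (int i = 0; i < n; i++)
        if (fingers[i].isDown)
            drawMarker(fingers[i].x, fingers[i].y);  /* emit UI geometry   */
}
```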