From: Jay Dugger (til_e@hotmail.com)
Date: Sun Jul 07 2002 - 15:01:33 MDT
Sunday, 07 July 2002
> > But are there other ways of intuitively transferring our creativity into
> > pictures? Blender (and the other high end modellers) have a steep
> > learning curve and a complex interface since they are complex programs
> > where you want control over everything; Bryce has a far simpler
> > interface since it actually is a rather simple program. Do we always
> > have to pay the cost of complex interfaces for advanced programs, or are
> > there ways out? A good musical instrument both allows you to play and to
> > add subtlety - maybe we should look at them for interface ideas?
I don't think musical instruments will offer any good examples for
rendering interfaces. Rendering usually occurs slower than real-time. Most
instruments provide real-time control over their output: all human-powered
musical instruments, some synthesizers, the stereotypical DJ's turntable, and
audio processors like vocoders, mixing boards, and guitar pedals.
Non-real-time musical instruments like player piano rolls, sequencers, machine
composition programs, and most recording machines have already come under
computer control. Their interfaces form a subset of computer user interfaces,
which almost always means some GUI. Csound is about the only exception that
comes to mind.
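To make the real-time/non-real-time distinction concrete, here is a toy sketch
in Python (my own illustration, not any particular sequencer's or Csound's
format) of the kind of note list a non-real-time instrument works from: every
parameter gets fixed ahead of time instead of coming from a performer while the
sound plays.

    # Toy note list: each event fixes its parameters in advance.
    # (start, duration, pitch, amplitude) -- seconds, seconds, Hz, 0..1
    score = [
        (0.0, 0.5, 440.00, 0.8),   # A4
        (0.5, 0.5, 523.25, 0.6),   # C5
        (1.0, 1.0, 392.00, 0.7),   # G4
    ]

    for start, duration, pitch, amplitude in score:
        print("%4.1fs  %7.2f Hz  for %.1fs at amplitude %.1f" %
              (start, pitch, duration, amplitude))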
That said, musical instruments tend to fall into five types of interface: one,
the breath-powered instruments; two, the percussive instruments; three, the
bowed and plucked instruments; four, keyboard-based instruments; five,
everything else. I'll briefly describe what I mean by each, try to induce
some general interface properties, and finally make some suggestions.
Breath-powered instruments include woodwinds and brass instruments.
Typically these control pitch with a set of finger-operated keyed valves.
The trombone player controls pitch by extending and retracting a
tube, but I can't think of another exception. Amplitude and fine pitch
tuning in this type come from breath pressure and embouchure respectively.
Embouchure can also make some difference in timbre, although this varies
with instrument and player skill. Phrasing control comes from tongue and
breath. Interrupting a steady stream of air with the tongue, or ceasing to
provide it altogether, marks the transition between one musical phrase and
another.
The breath-powered type also includes the human voice. Controlling
amplitude, fine tuning, and phrasing resembles methods in other instruments
of this type. Pitch control comes from larynx position.
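As a rough illustration of that division of labor, here is a toy model in
Python (names and numbers invented for the example): the fingering selects a
coarse pitch from a discrete set, while breath pressure and embouchure adjust
amplitude and fine tuning continuously.

    # Toy wind-instrument interface: discrete fingering, continuous breath.
    FINGERINGS = {"open": 261.63, "first": 293.66, "second": 329.63}  # Hz

    def wind_note(fingering, breath_pressure, embouchure):
        """Return (pitch_hz, amplitude) for one blown note.
        breath_pressure: 0..1, embouchure: roughly -1..1 (lips tight/loose)."""
        base = FINGERINGS[fingering]               # coarse pitch from keyed valves
        pitch = base * (1.0 + 0.005 * embouchure)  # small continuous fine-tuning
        amplitude = min(1.0, max(0.0, breath_pressure))
        return pitch, amplitude

    print(wind_note("open", 0.7, -0.5))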
Percussive instruments produce sound when struck. Drums of all kinds
belong to this class. They have a very simple interface due to a very
simple output that varies (usually) only in amplitude. Some percussive
instruments such as the kettle drum and xylophone can vary pitch. The former
varies continuously over a small range under control of a pedal. I think
this works by relaxing the drum head's tension. The xylophone varies its
output pitch by presenting a collection of bars tuned to distinct pitches.
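If the pedal really does work by changing head tension, standard membrane
physics backs up the guess: for an ideal circular membrane the fundamental
scales with the square root of the tension, so slackening the head lowers the
pitch. A quick Python check, with made-up but roughly kettle-drum-sized values:

    import math

    def membrane_fundamental(tension, radius=0.33, surface_density=0.26):
        """Fundamental (Hz) of an ideal circular membrane.
        tension in N/m, radius in m, surface density in kg/m^2."""
        return (2.405 / (2 * math.pi * radius)) * math.sqrt(tension / surface_density)

    for tension in (2000.0, 3000.0, 4000.0):
        print("T = %6.0f N/m  ->  %5.1f Hz" % (tension, membrane_fundamental(tension)))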
Bowed and plucked instruments make sound by making an oscillator
vibrate. The vibration can come from scraping or plucking a string.
Pitch control comes from varying the string's length or by selecting from a
set of strings of varying length. Amplitude control comes from plucking
harder, but I don't know off-hand how bowed instruments control amplitude.
Bowed instruments can also modulate their sound by altering how hard and how
long the bow (scraper) contacts the strings (oscillator). I suspect they
can also make changes in sound by altering how fast the scraper crosses the
oscillator, and they might change sound by changing the direction of
scraping.
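The length control described above follows Mersenne's law for an ideal string,
f = (1/2L) * sqrt(T/mu): halve the vibrating length and the pitch doubles,
which is what stopping a string at its midpoint (the twelfth fret) does. A
short Python illustration with made-up but guitar-like values:

    import math

    def string_pitch(length, tension=70.0, linear_density=0.0008):
        """Fundamental (Hz) of an ideal stretched string.
        length in m, tension in N, linear density in kg/m."""
        return (1.0 / (2.0 * length)) * math.sqrt(tension / linear_density)

    print(string_pitch(0.648))   # full scale length
    print(string_pitch(0.324))   # stopped at the midpoint: one octave up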
Keyboard-based instruments make their sound by presenting a set of
switches (keys) to the player. The switches initially controlled only note-on
or note-off. E.g., pipe organs, harpsichords, and clavichords. Over time,
more parameters of sound fell under control of these same switches.
Amplitude came first, and in modern keyboard controllers these keys can
control parameters of sound based on how fast the keys go down or up
(velocity-sensitivity) and how hard the artist presses the keys
(pressure-sensitivity). Phrasing control comes from playing technique. Since
I have trouble with phrasing on the piano, I'll say no more about it. :-)
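Here is a sketch of that keyboard model in Python, using MIDI-style 0-127
values for velocity and pressure (the event layout is my own invention, not an
actual MIDI implementation): each key remains a switch, but the switch now
reports how fast it moved and how hard it is being held.

    # Toy key events: (note, on/off, velocity, pressure), MIDI-style 0..127.
    phrase = [
        (60, True,  96,  0),   # middle C, firm attack
        (60, False,  0,  0),
        (64, True,  48, 40),   # softer attack, pressure added while held
        (64, False,  0,  0),
    ]

    for note, on, velocity, pressure in phrase:
        state = "on " if on else "off"
        print("note %3d %s  vel %3d  pressure %3d" % (note, state, velocity, pressure))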
"Everything else" includes non-real-time instruments as described
above, exotica such as the theremin, and non-traditional musical instruments
like turntables, industrial machinery, airplane propellers, steam whistles,
etc. Interfaces of this type obviously vary widely.
So real-time musical instruments tend to control only a few parameters:
pitch, amplitude, timbre, and duration. Some instruments let the player vary
the first three over the course of the last. Timbre often varies over the pitch range
of an instrument. Amplitude strongly tends to vary with pitch, so much so
that composers can work against this for emphasis. High and soft flutes
might get the description "like bird song," but high and loud flutes might
get the adjective "shrill." The most expressive instruments, human voices
and bowed strings, have very simple interfaces and correspondingly steep
learning curves.
Musical instruments involve a lot of haptic feedback. Haptic interfaces would
probably prove useful for 3D modelling, but does that help rendering? After all this
thinking and writing, perhaps cameras and cinematography would offer better
suggestions. OTOH, perhaps my imagination just fails me.
-----
Jay Dugger : Til Eulenspiegel
til_e@hotmail.com : duggerj1@home.com
-----
Sometimes the delete key is your best friend.