For the most part, when one chooses any of the built-in templates in Quartz Composer, all of the input data necessary to successfully design that type of template is supplied for you.
For instance, when an Image Filter is chosen, it is delivered with an implicit input image, and it also renders via the published output, even though this is transparent to the user. The same is true of the Mesh Filter. Other protocol templates are basically self-sufficient, and have all of the necessary patches within the confines of the qtz file itself.
The exception to this is the Music Visualizer protocol. Typically, one has to place an Audio Input patch in the Editor and connect it to the necessary patches, or perhaps create output splitters off of the template's protocol inputs and connect the Audio Input patch outputs to the newly created splitter inputs, then tidy up at the end of building a composition by removing the Audio Input patch and reconnecting to the protocol template's inputs. This workflow is obtuse, to say the least.
More importantly, that workflow provides no access to the information needed by the HUD info patch, or to any of the rich information available from the protocol template's Track Info output, such as album cover information delivered via an actual image structure!
This is the resolution:
Opening iTunes, going to the View menu, selecting any of the built-in Quartz Composer based Music Visualizers, and then choosing Show Visualizer results in iTunes.app broadcasting all of the needed Music Visualizer protocol information to Quartz Composer.app, for every open composition marked as belonging to the Music Visualizer protocol.
So, in short: turn on Lathe, Jelly, or Stix, and your running Music Visualizer protocol composition in Quartz Composer.app will suddenly be able to access all of the previously unavailable data: the Track Info structure, Track Position, and Track Signal. This allows you to easily create compositions that use album cover previews, or the other track information, much more effectively.
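Inside Quartz Composer, a "transition on new song" is typically wired up with a Watcher patch or a stateful JavaScript patch reading Track Signal. As a plain-JavaScript sketch of that logic (assuming Track Signal behaves as a value that goes high when a new track begins; the function names are mine, not part of the protocol):

```javascript
// Edge detector: fire a transition only when Track Signal rises from 0.
// A closure stands in for the per-frame state a JavaScript patch would hold.
function makeTransitionTrigger() {
  var previousSignal = 0;
  return function (trackSignal) {
    // Fire only on the rising edge: previous frame 0, current frame > 0.
    var fired = trackSignal > 0 && previousSignal === 0;
    previousSignal = trackSignal;
    return fired;
  };
}

var onNewTrack = makeTransitionTrigger();
console.log(onNewTrack(0)); // false: no signal yet
console.log(onNewTrack(1)); // true: new track started
console.log(onNewTrack(1)); // false: same track, no new edge
```

The closure matters: without remembering the previous frame's value, the transition would re-fire on every frame while the signal stays high.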
A typical example of usage would be to use Track Signal to trigger transitions in your Quartz Composer composition when a new song plays, or to use Track Position in conjunction with the Duration (available via using Structure Key "duration" with the Track Info output) to have an event happen halfway into a song, by using a conditional statement that receives both values.
In addition, all of the needed Audio Peak and Audio Spectrum info is also delivered from iTunes in this mode. Though that data is available by adding the Audio Input patch present in QC, no "re-noodling" is needed to introduce that patch into your composition while programming and designing it; in fact, introducing an Audio Input patch is not necessary at all if this method is used.
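As a small illustration of what one does with the spectrum data once it arrives, here is a hedged sketch that scales spectrum bands into bar heights; the assumption that the bands arrive as magnitudes in roughly the 0–1 range is mine, not a documented value:

```javascript
// Scale an array of spectrum band magnitudes (assumed roughly 0..1)
// into pixel heights for a bar-style visualizer.
function spectrumToBarHeights(spectrum, maxHeight) {
  return spectrum.map(function (band) {
    // Clamp to [0, 1] first, in case a band overshoots the assumed range.
    var clamped = Math.min(Math.max(band, 0), 1);
    return clamped * maxHeight;
  });
}

console.log(spectrumToBarHeights([0.0, 0.5, 1.2], 100)); // [ 0, 50, 100 ]
```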
From my discussions with those who are fond of creating Music Visualizer protocol compositions, these programmers tend to save their work to a Compositions folder, quit and relaunch iTunes to get it to load the composition, and then actually preview their work in iTunes.app. The broadcast method described above is obviously a far more effective way of working.
Besides this undocumented information being of serious benefit to those creating Music Visualizers, it is also interesting to contemplate that when iTunes sends this information to the qtz files that it plays, it doesn't do so via a "direct connection"; it actually sends it to ALL open compositions that conform to the Music Visualizer protocol. It seems that there must be some kind of broadcaster in iTunes, and that somewhere it is specified that the protocol inputs of any Music Visualizer protocol qtz can act as receivers/listeners, which is interesting from a purely technical perspective.
I have filed a request for documentation of this information in the Quartz Composer New Template interface in future editions of Quartz Composer.