Project Glass @ Google

Google has posted information about Project Glass. The photos and video show a stylish, lightweight, eyeworn see-through display operated via a speech interface, which allows you to do many of the things you already do with your iPhone, but without manual interaction. This announcement has been expected for some time, and we mentioned it in an earlier post on HMDs titled Retro-Future. It will be increasingly difficult to justify the purchase of outrageously priced and bulky retro-HMDs when consumer products with superior form factor and functionality come onto the market. Current price estimates for the AR eyewear run from $200 to $600, but this is guesswork. No doubt Google will make APIs readily available for the eyewear, so that R&D can be conducted by anyone with modest means and sufficient motivation. The video, available via YouTube, is probably mostly a mock-up, and I’m wondering where the battery will be located.

Vimeo & YouTube HD Informal Comparison

I uploaded the same HD video file (1280×720 progressive) to YouTube and Vimeo. Watch these on full screen, with resolution set to 720p on YouTube and ‘HD Mode’ selected on Vimeo.

Simple P5 Sketch Circle Animation (720p) from Michael Lyons on Vimeo.

This is an unusual video in that it consists only of moving black lines on a pure white background, so compression artifacts are quite noticeable. The raw data (1800 png image files) is over 200MB, but artifacts were barely visible in the ~100MB H.264-compressed mp4 file I uploaded to both Vimeo and YouTube. That noted, it seems clear that YouTube has the advantage in terms of quality. Informal reports on the web claim otherwise, so I’m not sure whether the quality might improve with Vimeo Plus, a paid upgrade currently available at the discounted price of US$60/year.

Overall, Vimeo offers a calmer, more pleasant user experience than YouTube. The user interface is nicely designed, and the online help files are easy to navigate and genuinely helpful. The content and the community are generally more edifying: there’s no denying there is a lot of trash on YouTube. The advertising on YouTube is also more obtrusive and distracting. As for pure spec: time to upload and process a video is much quicker with YouTube. Upload is slower with Vimeo, and non-paying users must wait at least 30 minutes before the video goes online. Moreover, there are weekly limits on total data, and only one HD video may be uploaded per week. Third-party advertising is less intrusive, but lately Vimeo has been pushing the paid subscription fairly hard.

Addendum: Vimeo offers a slightly better experience when browsing from iOS devices, in that it’s easier to directly open the 720p viewer. But compression artifacts are still more noticeable, with this file, than on YouTube. Note also that YouTube allows upload of higher-resolution videos, such as ‘Full HD’ (1080p).

Simple P5 Animation

This is a simple, lightweight P5 sketch captured to a low-res (320×240), compact video just to illustrate that it is easy to create animations with Processing.

I also uploaded it to Vimeo, but Vimeo makes non-paying users wait before the video goes online.

Here is the same P5 sketch rendered at 720p (watch full screen):

The lower-resolution video was made using Processing’s built-in MovieMaker class. The 720p version was made by saving each frame as a png image file with the saveFrame() method; the frames were then concatenated and encoded in mp4 format using FFmpeg.
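For the concatenation step, an FFmpeg invocation along the following lines works; this is a sketch, and the frame-naming pattern and frame rate here are assumptions, not the settings actually used for the video above:

```shell
# Assumes frames were written by saveFrame("frames/frame-####.png") in Processing,
# which produces zero-padded names like frames/frame-0001.png
ffmpeg -framerate 30 -i frames/frame-%04d.png \
       -c:v libx264 -pix_fmt yuv420p out.mp4
```

The `-pix_fmt yuv420p` option keeps the H.264 output playable in common players, which often reject the pixel format FFmpeg would otherwise choose for png input.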


NIME Zemi Basic Tools Part II: Max/MSP

While we are on the topic of basic tools for this seminar: I suspect everyone already knows about, or at least has heard of, Max/MSP, a multimedia visual programming environment sold by the company Cycling ’74. If you have any intention of doing work in the media arts, and I expect that everyone who joins my seminar has such an interest, then you should try to develop some knowledge of P5, mentioned in the previous post, and of Max/MSP.

Our faculty does not presently offer courses in either of these two tools; however, there are plenty of ways to learn them through self-study. Just as with P5, there are many resources online for learning Max/MSP, including the built-in tutorials and help files. There are also introductory books; a popular one written in Japanese is 2061: A Max Odyssey. The book dates from 2006, and there have been some changes to Max/MSP in the meantime, but the basic way of working with Max/MSP has not changed, and the book is still very useful.

Max/MSP can be even more fun than P5 because it uses a visual programming paradigm: you create a program by connecting little boxes, each with a dedicated function. The programs are called patches because the links between the objects resemble the electronic patch cables of old analog sound synthesizers. Here’s what a simple Max/MSP patch looks like:


This is a Max/MSP patch I created after a few hours experimenting with percussion sequences based on the Fibonacci numbers. I’ve titled it ‘Quasi-Periodic Drum Circle’, but it’s not really quasi-periodic, because the patterns eventually do repeat, and it’s not really a drum circle, because some of the presets use other MIDI instruments, such as whistles and a Cuíca. A more accurate name would be ‘Quasi-Quasi-Periodic Latin Percussion Circle’. This is what a few of the presets sound like:

One of the (many) nice things about the new version of Max/MSP, Max 6, is that it allows you to create standalone applications. If you’d like to try my Percussion Circle as a standalone application on Mac OS X, send me a quick email and I’ll reply with a download link. Because I’ve used several Fibonacci numbers to create the drum patterns, it will take a very long time before the overall pattern repeats (I’ll leave it as an exercise to calculate just how long). So it also functions as an ambient generative Latin percussion app: you can run it as background music, if you like that sort of thing.
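As a sketch of that exercise, with hypothetical loop lengths (the actual lengths used in the patch aren’t listed here): if each voice loops with a period equal to a distinct Fibonacci number of sequencer steps, the combined pattern repeats only after the least common multiple of those lengths:

```python
from math import lcm

# Hypothetical loop lengths, one per voice, each a Fibonacci number of steps
fib_lengths = [3, 5, 8, 13, 21]

# The combined pattern repeats after the least common multiple of the loop lengths
steps_until_repeat = lcm(*fib_lengths)
print(steps_until_repeat)  # 10920 steps
```

Because consecutive Fibonacci numbers are coprime, the LCM grows rapidly as longer loops are added, which is why the repeat time becomes very long.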

Like P5, Max/MSP is multi-platform: there are versions for Windows and Mac. Unlike P5, Max/MSP is not free; however, it is very reasonably priced, and there is a good student discount. Moreover, Cycling ’74 lets you download Max 6 and try the entire software package free for one month. Cycling ’74 is an excellent small company to deal with, and its members have shown up at the annual NIME conference from time to time.

You might be wondering where the name Max/MSP comes from. MSP stands for ‘Max Signal Processing’, because MSP handles the audio signals; it also matches the initials of Miller S. Puckette, who first developed Max. Max is named for Max Mathews, considered the father of computer music and known for, amongst other things, having programmed the song sung by HAL in the film 2001: A Space Odyssey. Unfortunately, Dr. Mathews passed away last year. I have fond memories of meeting him at NIME and other conferences. Here is Max graciously acting as the MC during the first NIME conference concert in 2001.

Retro-Future


I’m pretty bad at drawing, but this image stuck with me from the Graduation Show, and I decided to make a sketch because I wasn’t really happy with my photos. This is a ‘back-to-the-future’ scene in more ways than one: head-mounted displays give me a strong sense of nostalgia for the early 1990s, when they were all the rage in VR research. Of course, HMDs have been around since the 1960s, but only the military (or those with military research grants) could afford to build or buy them. In the mid-1980s, HMDs started to become more widely available commercially, though still at price tags so high only the best-funded labs could afford to p(l)ay. That era has long passed: by the late 1990s the HMD, like the VR ‘Cave’, came to be considered a bit of a dinosaur by the HCI research community, who tend to be much more interested in less brute-force display techniques that don’t obliterate the wonderful natural sense of sight. Besides being impressed by the shiny technology and beautiful kimono, a child might wonder why the model seems to be engaging in a solo session of blind man’s buff. In the 1990s, HCI researchers were wondering the same thing. Many came to the conclusion that blind man’s buff, while fun, might not be a good interaction metaphor.

Recently, on the other hand, there are rumours that Google will soon release hardware ‘Google goggles’ allowing flâneurs to search the Internet while on a stroll and view the results on a lightweight see-through display. That will be pretty cool. It’s not really new, though: I first saw someone doing this (or claiming to) at a conference in the early-to-mid-1990s. There’s good reason to believe it was at least partly a fashion statement and, more likely than not, only partly functional at the time. But if Google’s hardware works well and is affordable, it will be cool indeed. You don’t want to know the price of an HMD of the VR ilk.

p5js doodle

Epicycles & Visual Music

I spent some time this evening looking again at the motion of particles moving along epicyclic trajectories. The position of a single particle q in the complex plane, z, is given by the following parametric function of t:

z_{q}(t) = R e^{i\omega_{q} t} - r e^{i\Omega_{q} t}, \quad \text{with}\ \Omega_{q} > \omega_{q},\ R > r

The animation above is generated using 50 particles moving according to the same parametric equation, with angular velocities given as integer multiples of a fundamental value:

\omega_{q} = q\,\omega_{1}, \quad \Omega_{q} = q\,\Omega_{1}, \quad q = 1, 2, 3, \ldots

With the velocities distributed in this way, the angular positions attain various kinds of harmonic relations, whence the arrangement of the particles in the complex plane falls into simple symmetric patterns. The animation bears some resemblance to the visual music of James and John Whitney. In his later work, John Whitney used digital computer programs to create visual music animations [1], employing particle systems related to the one described here.
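As a minimal sketch of the particle system (the parameter values here are illustrative, not the ones used in the animation), the positions can be computed directly with complex exponentials:

```python
import cmath

# Illustrative parameters (not the values used in the animation above)
R, r = 1.0, 0.4            # outer and inner radii, with R > r
omega1, Omega1 = 1.0, 5.0  # fundamental angular velocities, with Omega1 > omega1
N = 50                     # number of particles

def position(q, t):
    """Position of particle q at time t: z_q(t) = R e^{i q w1 t} - r e^{i q W1 t}."""
    return R * cmath.exp(1j * q * omega1 * t) - r * cmath.exp(1j * q * Omega1 * t)

# Snapshot of all particle positions at one instant
snapshot = [position(q, t=0.3) for q in range((1), N + 1)]
```

At t = 0 every particle sits at R − r on the real axis; as t advances, the integer-multiple velocities bring the particles in and out of the symmetric arrangements described above.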

[1] John Whitney, Digital Harmony: On the Complementarity of Music and Visual Art, New York: McGraw-Hill, 1981.

Singing with your Hands

Currently reported on Gizmodo: friend and collaborator Prof. Sidney Fels of the University of British Columbia and part of his team describe their work on using hand gestures to control speech and singing synthesis. Those interviewed in the video, including Sid, graduate student Johnty Wang, Prof. Bob Pritchard (School of Music, UBC), and professional classical vocalist Marguerite Witvoet, are some of the people I enjoy hanging out with when I attend the annual NIME conference, which Sid and I co-founded in 2001.

The video contains demos and an excerpt from a vocal performance by Marguerite.