
The Window for Lazy People with ADHD

I wanted to post some details on what we built for Hollywood Hackday with Adobe and Flash video, because I think it can be difficult to understand what we did just from the screen capture (there isn't a working version posted online yet).

The focus of the Hackday was "Tech + Music + Video". Our team (which consisted of myself, Evan Squire, Hasan Otuome, and Michael Dela Cruz) didn't have any specific product vision — we just wanted to sit down and see what kind of creative experience we could build in a weekend.

After some discussion, I suggested an update to Mario Klingemann's Flickeur, which pulls user-generated photos from the Flickr API and assembles them into a dynamically generated, expressive multimedia experience. Of course, Flickeur was written ages ago (Flash 8, I believe), and today we have a lot more power. So, the key differences between Flickeur and our app include:

  1. Instead of Flickr for pix, we would use Vimeo for flix.
  2. Instead of tweens and color effects, we would use Pixel Bender to animate shaders on the video.
  3. Instead of sound effects, we would use hit music (hit music makes everything look better). We do still use sound effects, but honestly I need to work on normalizing the levels, because they are essentially inaudible in this version.
  4. And we would add in some Synthia-like logic to correlate data of one kind (weather) with data of another kind (music, video, and special effects); there's a rough sketch of that mapping just after this list.
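
To give a sense of how simple that correlation can be, here's a minimal ActionScript 3 sketch of the mapping step. The bucket names and thresholds here are purely illustrative, not the exact logic we shipped:

    // Map one weather reading to a "mood" that later drives the choice
    // of music, Vimeo search terms, and shader presets. Illustrative only.
    function moodFor(condition:String, tempF:Number):String {
        if (condition == "rain" || condition == "thunderstorm") return "moody";
        if (condition == "snow") return "hushed";
        if (tempF >= 85) return "hazy";
        return "bright";
    }

    trace(moodFor("rain", 62)); // "moody"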

I did about two hours of setup work on Friday night (just setting up the project folder and running one quick proof of concept to ensure I could work with Pixel Bender and the video as I envisioned) so that we could hit the ground running. We arrived on Saturday morning and worked through until just before the final presentations began.

You can view a capture of the project here, on YouTube. This project is not posted online yet, primarily because I need to integrate with a legit music source (like Spotify or Rdio) so I don't go to RIAA prison (this is the same reason that YouTube limits viewing of this video to the United States; sorry, global community), and also because it hasn't been tested for online playback (when you're building at a hackday, you want to build quickly).

So, what does this app actually do?

Well, first the app loads the current weather from a data feed, and then, based on the details of the weather report, it builds a playlist of MP3s and Vimeo videos to reinforce the mood of the climate and environment. (We opted for Vimeo over YouTube support because, well, Vimeo videos tend to look a lot better.)
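
The startup flow looks roughly like this. The feed URL and XML field names are placeholders (the real feed and schema may differ), and moodFor() is the mapping function sketched earlier:

    import flash.events.Event;
    import flash.net.URLLoader;
    import flash.net.URLRequest;

    var weatherLoader:URLLoader = new URLLoader();
    weatherLoader.addEventListener(Event.COMPLETE, onWeather);
    weatherLoader.load(new URLRequest("http://example.com/weather.xml"));

    function onWeather(e:Event):void {
        var report:XML = new XML(weatherLoader.data);
        // moodFor() is the weather-to-mood mapping sketched earlier.
        var mood:String = moodFor(report.condition.text().toString(), Number(report.tempF));
        buildPlaylist(mood);
    }

    function buildPlaylist(mood:String):void {
        // Placeholder: queue the MP3s and Vimeo clips tagged for this mood.
        trace("building playlist for mood: " + mood);
    }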

That's why, as a gag, we called it the 'Window for Lazy People with ADHD' (or 'WLPA'): you can tell what the weather is without getting off your ass or opening a window, and, if you're easily distracted, you can watch an infinite playlist of rapidly changing user-generated videos and hit tunes.

We apply Pixel Bender filters dynamically, as shaders, to the video during playback, to give it that artistic, cinematic feel. And we use GPU acceleration to get great performance, both in the browser and in full screen.
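
The core of that pipeline is surprisingly small. Here's a bare-bones version, assuming a Video object named video already playing on stage, a compiled kernel file named grain.pbj, and a kernel parameter named amount (all three names are my placeholders):

    import flash.display.Shader;
    import flash.events.Event;
    import flash.filters.ShaderFilter;
    import flash.net.URLLoader;
    import flash.net.URLLoaderDataFormat;
    import flash.net.URLRequest;
    import flash.utils.ByteArray;

    // Load the compiled Pixel Bender bytecode (.pbj), wrap it in a Shader,
    // and attach it to the video as a regular BitmapFilter.
    var pbjLoader:URLLoader = new URLLoader();
    pbjLoader.dataFormat = URLLoaderDataFormat.BINARY;
    pbjLoader.addEventListener(Event.COMPLETE, onKernelLoaded);
    pbjLoader.load(new URLRequest("grain.pbj"));

    function onKernelLoaded(e:Event):void {
        var shader:Shader = new Shader(pbjLoader.data as ByteArray);
        shader.data.amount.value = [0.4]; // parameter names vary per kernel
        video.filters = [new ShaderFilter(shader)];
    }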

So, in case this isn't clear: WLPA applies these special effects in real time, to video that is loaded dynamically from an external source. Here are some still frames from the source videos, so you can get a better sense of just how much special-effect work Flash is doing under the hood:


Although this might look like pre-rendered video (in fact, this led to some fun confusion at the event itself, as some attendees believed it was our demo reel rather than the app we produced over the weekend), it is not; this is just one example of what the special-effects and rendering capabilities of Flash offer today. And it's worth noting that this project only uses features that have been in Flash since Player 10 was released THREE YEARS AGO; we don't even touch the stage acceleration or 3D features in Player 11.

Ours was the only entry in the Hollywood Hackday built in Flash. And while many of the other entries were impressive in various respects (the winner, Tunehook, really was the best idea for an app with a real potential market, and had a nicely executed prototype), ours was the only one that featured any real design sensibility, created a genuinely engaging experience, or did anything interesting at all with video content.

And that's because Flash enables massive and rapid creativity with all types of content, including video, in a way that HTML5 simply does not (saying this shouldn't be read as trashing HTML; it's just a true characteristic of the tools we use). And we enjoyed showing the type of experience you can create in just two days when you have high-grade media support and aren't debugging browser-consistency issues.

I did the bulk of the coding, as well as the PixelBender customization for shader animation. Evan Squire handled graphic design, and the overlay effects and animations. Hasan Otuome wrote the Vimeo API integration code. And Michael Dela Cruz handled audio asset production.

There is a good chance that, time permitting, I will continue production on this piece so that we can post and distribute it online (again, with legitimate media).

Tools Used by Team Adobe at Hollywood Hackday
For reference, here's the list of Adobe tools we used, from a slide in our presentation:

  1. Adobe Flash Professional for animation and asset linkages
  2. Adobe Pixel Bender Toolkit to load and preview shaders, so we could learn which settings to manipulate in the animations (there's a small example of that after this list)
  3. Adobe Photoshop for visual asset production
  4. Adobe After Effects for some of the graphic motion animation in the overlays
  5. Adobe Media Encoder for media preparation
  6. Adobe Flash Player to deliver the final experience
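
As promised above, here's what that "shader animation" amounts to in code: a minimal sketch that assumes the shader and video variables from the earlier sketch (with the Shader stored where the frame handler can reach it), with amount again standing in for whatever parameter your kernel exposes:

    import flash.events.Event;
    import flash.filters.ShaderFilter;

    // Drift a kernel parameter every frame. Reassigning the filters array
    // is what makes Flash pick up the new parameter value.
    var phase:Number = 0;
    addEventListener(Event.ENTER_FRAME, onFrame);

    function onFrame(e:Event):void {
        phase += 0.02;
        shader.data.amount.value = [0.3 + 0.2 * Math.sin(phase)];
        video.filters = [new ShaderFilter(shader)];
    }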

It really is empowering when you remind yourself what you can create with Adobe's tools. And we had a great time doing it.

Share and enjoy!

-r

Category: Code & Samples, Events, General Posts, Synthia


6 Responses

  1. DaveW says:

    Very cool! What resolution are the videos? I've only done a little tinkering with Pixel Bender; it always seems to drop frames with 1080p content.

  2. R Blank says:

    This version is at 1024, scaled up; the videos are of different resolutions.
    I'll do more testing on performance when I reopen the app for more dev.

    But if you're not doing anything fancy with the video, then use StageVideo instead of Video for playback and you can support way higher res HD video playback.

    That wasn't an option in this app, because I needed the BitmapData from the video (which you can't access when using StageVideo).

    How are you using Pixel Bender? As a shader or a blend?

  3. DaveW says:

    I've been using Pixel Bender as both a shader and a blend mode, but I haven't had much luck with the one blend mode I really want, which box-blurs everything behind it. It works in the preview but not in Flash, at least not as a blend… it does work as a shader, though. I've been too busy lately to go back to it and see if I can get it working, or whether anything in the Flash Player 11 updates lets it work. It didn't work at all until the 10.2 update.

    • rblank says:

      I forgot to add an important question… what hardware acceleration settings are you using? And are you trying in full screen mode, with full screen acceleration?

  4. DaveW says:

    I tried compiling with use-direct-blit=true and use-gpu=true (99% of what we deliver is projectors), but it doesn't seem to make any difference. I think the biggest problem is the amount of blur I'm trying to add: it does a 9×9 box blur, so that's 81 pixels being sampled and averaged for each pixel being processed.

  5. rblank says:

    Hmm… that's interesting. I don't know if GPU acceleration works at all in projectors; it's been so long since I used them. Do you know?
    You're right: the blurs are the most draining of the filters I used in this project.
    One way I helped combat that (since it's a blur anyway) is to render the video, off the display list, at a smaller size (in this case, 640×480). I then take each frame, draw it into a BitmapData with the filter applied as a shader, and then use a Matrix transformation to scale it up to full screen.
    Many fewer pixels to blur. Does that make sense?
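
    Roughly like this, simplified: I'm using the built-in BlurFilter as a stand-in for the Pixel Bender kernel, and the NetStream setup is omitted.

        import flash.display.Bitmap;
        import flash.display.BitmapData;
        import flash.events.Event;
        import flash.filters.BlurFilter;
        import flash.geom.Matrix;
        import flash.geom.Point;
        import flash.media.Video;

        // The video renders at 640x480 and is never added to the display list.
        var video:Video = new Video(640, 480); // attach your NetStream here
        var small:BitmapData = new BitmapData(640, 480, false, 0x000000);
        var blur:BlurFilter = new BlurFilter(8, 8, 1); // stand-in for the .pbj kernel

        // Show the small buffer through a Bitmap, scaled up to fill the stage.
        var screen:Bitmap = new Bitmap(small, "auto", true);
        var scaleUp:Matrix = new Matrix();
        scaleUp.scale(stage.stageWidth / 640, stage.stageHeight / 480);
        screen.transform.matrix = scaleUp;
        addChild(screen);

        addEventListener(Event.ENTER_FRAME, function(e:Event):void {
            small.draw(video); // grab the current frame at low resolution
            small.applyFilter(small, small.rect, new Point(0, 0), blur);
        });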
