Monthly Archives: December 2009

ASP.NET Web Forms Panel and DefaultButton

Quick, short, and useful ASP.NET feature…

You usually expect that a user pressing Enter inside a non-multiline TextBox will submit the form via the Button at the bottom of the form.

However, if you have multiple “form areas” in your ASP.NET web form, one of the weird “isms” is that the user pressing Enter in one of the fields can have fairly random results, because the browser chooses which Submit button to use, usually based on the order it appears in the HTML. Some conditions will cause your Button_Click event to fire, some not.

A quick and simple way to solve this is to wrap your form section in an <asp:Panel> control and set its DefaultButton attribute. So…

<asp:Panel runat="server" ID="myPanel" DefaultButton="button1">
    <asp:Button runat="server" ID="button1" Text="This is my button" />
</asp:Panel>

… will cause button1 to be the default “enter key” button when the user is within that panel.


XBox Controller in .NET 3.5 with 3 lines of code

XBox controllers are pretty cool little hardware devices. So why not use them in .NET to control something cool?


The only problem is, there’s not a whole lot of code out there for any recent version of .NET, and nobody really likes learning Managed DirectX for fun… In addition, the bits of code that I did actually find, while useful for getting the hardware connected, felt like hacks to actually use it. So…

I’ve pieced together most of the XBox code into what I hope is a clean, ready-to-use and complete .NET DLL to get connected up to a controller and using it in 3 lines of code. Credit goes to some of the code at MDXInfo for giving me a good head start.

First, download the DLL:

Add it as a reference to your .NET project. Then, use it like so:

using X9Tech.XBox.Input;

XBoxControllerManager xcm = new XBoxControllerManager();
var controllers = xcm.GetConnectedControllers(); // get the collection of connected controllers
var controller1 = controllers[0]; // the controller at index 0 will be controller #1
if (controller1.ButtonAPressed)
{
    // do something if Button A is pressed...
}

The XBoxControllerManager class simply has a method to GetConnectedControllers() which gives you a List<XBoxController> collection of connected controllers.

This little library addresses the following common issues when dealing with XBox Controllers…

Percentages instead of raw values – In reality, the thumb pads have values between –32,768 and +32,767. The triggers have values of 0 to 65,000. For all practical purposes, these raw ranges are pretty meaningless, so they are converted to percentages (as a double with full precision). When each thumb pad is at its center position, this is 50% X / 50% Y. So from here on out, we’re going to be talking percentages, mmkay?

Trigger On/Off – The left and right triggers (the little, well, triggers on the front of the controller that resemble the trigger of a gun) are, in reality, never simply “on” or “off”; they’re always somewhere between 0-100%. However, you may notice that the XBoxController class has two boolean properties called TriggerLeftPressed and TriggerRightPressed. These are inferred from the current trigger percentage value and two more properties (TriggerLeftPressThreshold / TriggerRightPressThreshold), which define the minimum percentage value a trigger needs to meet in order to be considered pressed. The default is 10%, as I found this was a pretty comfortable place to consider it “pressed”.

Dead Zone – The thumb pads do have springs to snap them back to “center”. However, with 65,535 positions on each axis, rarely will “center” ever mean the same thing twice. If you’re counting on 50/50 meaning “no thumb on the thumbpad”, this will drive you insane. So, I needed some kind of a ‘dead zone’ to go ahead and assume 50/50 when the stick is in the “close enough to be considered the center” area. Through sheer trial and error I determined that a reliable center is anywhere from a 40% to 60% position. If you want this to be more or less sensitive, just set the SnapDeadZoneTolerance property. The tolerance is the percentage on either side of 50% that we consider to be in the “dead zone”. So, a Dead Zone Tolerance of 10 means “10% on either side of 50%”, or 40% – 60%. A tolerance of 5 would mean the dead zone is between 45% – 55%, and of course a tolerance of 0 would disable the assumed dead zone.
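To make the math above concrete, here’s a minimal sketch of the percentage conversion, the dead-zone snap, and the trigger threshold. This is illustrative math based on the behavior described here, not the library’s actual source:

```csharp
using System;

// Sketch of the percentage conversion, dead-zone snap, and trigger
// threshold described above. Illustrative only, not the library's code.
public static class ThumbStickMath
{
    // Convert a raw signed 16-bit axis value (-32,768..+32,767) to 0-100%.
    public static double ToPercent(short raw)
    {
        return (raw + 32768.0) / 65535.0 * 100.0;
    }

    // Snap values near center to exactly 50%. A tolerance of 10 means
    // anything between 40% and 60% is treated as dead center.
    public static double ApplyDeadZone(double percent, double tolerance)
    {
        return Math.Abs(percent - 50.0) <= tolerance ? 50.0 : percent;
    }

    // A trigger counts as "pressed" once it crosses the threshold
    // percentage (the library defaults this to 10%).
    public static bool IsTriggerPressed(double percent, double threshold)
    {
        return percent >= threshold;
    }
}
```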

Controller Polling / Refresh – To get new controller state information, we have to poll the XBox Controller. This little library will handle it automatically, so you can just grab properties and go. By default, this will poll the device a maximum of every 30 milliseconds. This felt comfortable to me, but if you need faster or slower polling, just adjust the RefreshIntervalMilliseconds property.
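The throttling idea is easy to picture. As a hedged sketch (my guess at the shape of it, not the library’s implementation), property reads only hit the hardware when the interval has elapsed:

```csharp
using System;

// Sketch of the automatic polling throttle described above: the device
// is polled at most once per RefreshIntervalMilliseconds; in between,
// callers just reuse the cached state. Illustrative only.
public class PollThrottle
{
    public int RefreshIntervalMilliseconds { get; set; }
    private DateTime _lastPoll = DateTime.MinValue;

    public PollThrottle() { RefreshIntervalMilliseconds = 30; }

    // Returns true when enough time has passed to hit the hardware again.
    public bool ShouldPoll(DateTime now)
    {
        if ((now - _lastPoll).TotalMilliseconds < RefreshIntervalMilliseconds)
            return false;
        _lastPoll = now;
        return true;
    }
}
```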

Controller Connect/Disconnect – At some point, your user will disconnect the controller while your application is using it; nothing should crash, and the controller’s IsConnected property should return false once it’s disconnected with no blow ups. Reconnection happens automagically.

Vibration Motors – The XBox Controller has 2 vibration motors, a left and right. The left motor (let’s call it “the big vibrations”), when set to 100%, will send your XBox controller walking across whatever surface it’s on. The right motor resembles what’s in your cell phone. You can set the motors (again, with percentages here) by calling SetLeftMotorVibrationSpeed() and SetRightMotorVibrationSpeed().
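Under the hood, XInput motor speeds are 16-bit values (0–65,535), so the percentage setters presumably scale something like this. The clamping and scaling here are my assumption about what SetLeftMotorVibrationSpeed() / SetRightMotorVibrationSpeed() do internally:

```csharp
using System;

// Sketch of mapping a percentage motor speed onto the 16-bit value
// XInput expects (0-65,535). An assumption about the setters' internals.
public static class MotorMath
{
    public static ushort PercentToMotorSpeed(double percent)
    {
        // Clamp to 0-100%, then scale linearly to the full ushort range.
        double clamped = Math.Max(0.0, Math.Min(100.0, percent));
        return (ushort)(clamped / 100.0 * 65535.0);
    }
}
```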

That’s about it. If you use this library and there’s anything I have forgotten, I’d love to hear. Happy coding!

Expression Blend 3 – Crash Issue

I’ve been working with the Microsoft Expression toolset for about a year and a half now… in particular, Blend, for me, takes the cake. Blend is a visual designer that allows me to make pretty nifty UIs in Silverlight and WPF. I would consider it one of the most amazing Microsoft development tools to date.

Lately, though, Blend and I have had a rocky relationship. I’ve installed it on two similar machines (one a Dell mini-tower running Windows 7 x64, and my Dell Precision laptop, also running Windows 7 x64). On the mini-tower, it works fine, and on the laptop, it crashes when I pretty much do anything (including creating a brand new WPF or Silverlight project and placing a Button on the new window).

In fact, to see the reproducible steps, check out my Problem Steps Recorder report:

I’ve gotten to know this set of windows pretty well so far:


Actual debug text:

  Stopped working

Problem signature:
  Problem Event Name:    CLR20r3
  Problem Signature 01:    blend.exe
  Problem Signature 02:    3.0.1927.0
  Problem Signature 03:    4a5d4331
  Problem Signature 04:    WindowsBase
  Problem Signature 05:
  Problem Signature 06:    4a17571f
  Problem Signature 07:    e4
  Problem Signature 08:    1b
  Problem Signature 09:    FatalError
  OS Version:    6.1.7600.
  Locale ID:    1033


So far, I’ve tried (via various crash fixes and forum suggestions):

  • Windows Updates (I’m desperate)
  • Reinstalling Silverlight 3 Tools
  • Installing Silverlight 3 SDK
  • Reinstalling (well, “turning off” and “turning on” in Windows 7) .NET 3.5 SP1
  • Removing / Reinstalling Expression Studio
  • Repair Installing Expression Studio
  • Removing Visual Studio 2008 SP1 / reinstalling
  • Trying to launch Blend with Exception Logging turned on (but nothing is logged)

… in just about every combination possible.

Finally, broke out WinDbg (Debugging Tools for Windows x64) to watch the crash. Here’s the scene leading up to the crash:

System.Windows.Data Warning: 39 : BindingExpression path error: ‘SceneNodeObjectSet’ property not found on ‘object’ ”GradientBrushEditor’ (HashCode=28052000)’. BindingExpression:Path=SceneNodeObjectSet.IsViewRepresentationValid; DataItem=’GradientBrushEditor’ (HashCode=28052000); target element is ‘PropertyContainer’ (Name=’GradientStopOffsetEditor’); target property is ‘NoTarget’ (type ‘Object’)
ModLoad: 00000000`73b40000 00000000`73cd0000   C:\Windows\WinSxS\\gdiplus.dll
CLR: Managed code called FailFast, saying "Unrecoverable system error."
(b2c.1440): WOW64 breakpoint – code 4000001f (first chance)
First chance exceptions are reported before any exception handling.
This exception may be expected and handled.
*** ERROR: Symbol file could not be found.  Defaulted to export symbols for C:\Windows\syswow64\KERNELBASE.dll –
762922a1 cc              int     3

So, .NET is trying to load gdiplus.dll from some WinSxS directory and failing, right? (WinSxS is Windows 7 “Side By Side”, so if you remember the old “dllcache” folder from XP, this is the new way that keeps a bunch of copies of DLLs around for compatibility between applications.) But my first thought was “OK, what if that can’t be found or something?”… well, it’s there, so that theory doesn’t work:


Debugging blend.exe in Visual Studio records the following on crash:

System.ExecutionEngineException was unhandled
  Message="Exception of type ‘System.ExecutionEngineException’ was thrown."

Great, awesome. Thanks. That was helpful… not. All a System.ExecutionEngineException means is that something is blowing up in the .NET runtime.

Looking in the event log, this is confirmed with an event:

.NET Runtime version 2.0.50727.4927 – Unrecoverable system error.

Resolution: Well, I was in the right neighborhood, at least… crashing right after loading GdiPlus.dll (a graphics library DLL) was the biggest clue. Originally, I had allowed Windows 7 to automatically detect, update, and install my display drivers. It did pretty well; my Aero experience had been great and I never had any other problems with it. But then I downloaded the Vista / Win 7 drivers from NVidia, and this fixed it after a restart!

The actual chipset I have is NVidia Quadro FX 1600M (notebook display driver).

How I made a Giant Walking Piano (with C#)

I had 2 projects this year for Christmas for the City – first, the online streaming, and second, to take my idea of a giant walking piano (like in the movie Big), and turn it into reality. The main purpose of this was twofold – first, to give kids a feature that uses interactive technology to bring a smile to their faces, and second, for me to learn what works and doesn’t work well when it comes to interactive displays for projects down the road.


The basic idea is this: Kids walk across the keys of the piano, activating some kind of a pressure sensor, and the keys light up and the key’s sound plays. Sounds fairly simple, right? In all reality, it’s a lot harder than it looks in the movies. 🙂 I decided to go with a 16-key piano (to give me 2 octaves), and to make the design a lot simpler, just to skip the black keys altogether. While I didn’t expect to end up with something as elaborate as this picture, I did intend to create the same experience for a person walking across it.

The Keys

First, the material for the “keys” needed to be lightweight but strong, and readily available. I settled on a Lexan material from Lowe’s. They come in 18” x 24” rectangles, which, when cut in half, give you a 9” x 24” piano key. 2 feet seemed to be wide enough for my purposes, so the next step was to find the best way to let each key ride above some kind of a switch, and hit the switch when a few pounds of pressure are added to it (i.e. a child’s foot).

The picture to the right is when I was testing it in my living room. I had a curved plastic stand that some time ago held a cable modem or access point or something, and I experimented with a couple of different types of switches.

The Switches

The most readily available supply of switches came from Radio Shack, but like a lot of things from Radio Shack, not all of them were all that great. The general idea here is that the switches would be recessed into the piano wood just enough to allow them to be pushed, but not damage them.


The first contender was a Radio Shack Mini SPST Momentary Pushbutton Switch (275-1556). At less than $3 a piece, this appeared to be what I was looking for, but was actually a massive fail. You have to push these down really hard, and even when pushed down, the “momentary contact” isn’t all that consistent, so I wasn’t comfortable trying to make these work.

The next type of switch I tried was an SPST Detect Switch (275-008). It’s a little pie-shaped switch that gets pushed down into the casing to activate the momentary contact. This had the response I was looking for and took very little travel before the contact was connected. I decided to go ahead and roll with this. The only problem was, most Radio Shacks only stock about 3 of these at a time. It took about a week of visiting Radio Shacks in 3 cities to accumulate enough of these things to wire up 16 keys. At one Radio Shack, someone had stolen all of them and left the plastic packaging behind. Nice.

Now, it was time to make…

The Frame


The idea here was pretty simple. Get a few 14-ft 2×4’s, and space 2×4 supports every 9” where the edges of the keys rest on them. Mount the sensors in the supports and off we go, piece of cake.

I learned a valuable lesson here – measurements are never exact. I tried to build the frame first, measure it out, attach it, and then put the keys on, expecting everything to line up perfectly. There is apparently a law of woodworking, and a law of Lowe’s lexan products, that measurements don’t always come out like they do on paper. Add in a little bit of warping wood, and I rebuilt this frame twice before I figured out that I should build and fit the frame around the keys, instead of trying to stay in my idealistic happy bubble where everything works out like it does on paper and fairies dance around singing duets with unicorns.


Here is where I got the first few keys fitted, and it starts to look like the top of a keyboard.





To hold the keys in place, I used molding from Lowe’s. Cheap stuff and worked great for keeping the keys from shifting around.



Wiring It Up


The next step, while the framing was fairly accessible, was to go ahead and get the wiring harness in place. I started out using 1-conductor solid wire, running 2 wires to each key, and after 3 keys, realized this would be a complete and unmanageable mess. I switched over to using A/V 2-conductor + ground install cable (XLR audio cable) that I had left over from an old install. This made things a lot more manageable by keeping it to 1 cable per key.


This added some thickness to the cable that I didn’t really anticipate, so a few extra holes and it worked out fine.







For the actual switches, I carved out a slot in each key support with a Dremel router bit. I left just enough clearance on top for the sensor switch to stick up above the top of the wood. I also found that chunks of foam weatherstrip (the kind you would stick around your home windows to seal them) worked great for the keys to ride on. It held the weight of the lexan above the switch, but also compressed enough for it to hit the switch with no problem.


Also, these switches had a hole on the left side of them that I used to tack a short nail through. This held it in place quite nicely.



Once these were in place, I soldered the key cables up to the switches, which completed the wiring for each key. Once stapled and labeled, it ended up looking pretty clean. This part of the design was very solid and worked extremely well.




Once I attached the keys, it ended up looking like this…


The Microcontroller

This piano was to be computer controlled. The main challenge here is how to make the piano talk to the computer. I wanted to do this via USB… so the plan was:

  1. Get enough inputs together on one microcontroller to sense on/off for 16 piano keys.
  2. Write a simple .NET Micro Framework controller OS that will sense these and notify the computer via USB.
  3. Write a host application on the computer that will talk to said microcontroller and take actions (play sound, turn on lights).

I went down this road, looking at Digi Connect, Tahoe, Arduino, etc, and after longer than I care to admit, realized something… what’s a microcontroller that speaks USB, just has a bunch of on/off contacts, is cheap, and readily available? A computer keyboard.

Things just got a LOT simpler, with a $12 Logitech keyboard from Wal-Mart… or so I thought. I didn’t have a whole lot of time to finish this, so this decision was final, and could probably be considered one of the best from a cost/simplicity standpoint, and one of the worst from a design standpoint.

A computer keyboard is actually really, really simple. When you type, your keys simply push two plastic sheets together that are less than a millimeter apart. When certain spots (where the keys are) connect, this connects 2 pins together on the keyboard circuit board. It’s this combination of pins that determines which key value to send to the computer over USB.


The new plan was:

  1. Rip out the controller, and solder cables to the controller contacts.
  2. Wire the piano key switches to a combination of contacts.
  3. In the computer, map certain keys to notes and lights.

There are 2 parts of a keyboard controller that really made this plan suck:

  • These boards aren’t designed to be soldered. The contact surfaces are really, really close together, making it darn near impossible to get soldered cleanly without shorting two contacts together.
  • These boards are cheap and fragile. In all reality, they don’t have to be robust – a plastic sheet lays against them and that’s pretty much it. But once I soldered cables on, the slightest pull and the copper contacts would simply peel off.

It was too late for me to change it by the time I realized these 2 major points (okay, it was the night before the event, because I thought this would be super easy and left it to the last minute), so I rolled with it. After soldering on 20 wires, out of a possible 108 key combinations, I ended up with about 22 working key combinations. Some of these were special keys (tab, shift, etc) that I didn’t want to use, but I ended up with exactly 16 letters. Score. Once connected, I could step on piano keys, and type random letters into Notepad. I was home free on the “input” side of things at this point.

Now on to the software…

Making Sounds

I used MIDI to make the sounds on the piano. I used an open-source library called NAudio to drive MIDI sounds. NAudio is a wrapper for DirectSound and runs as a 32-bit target. There are tons of tutorials on controlling MIDI, so I won’t cover that here.

Controlling Lights

I filled the hollow space below the keys of the piano with Christmas lights. At Wal-Mart, a pack of 100 Christmas Lights was about $1.50, so they became a cost-effective lighting medium. I connected these up to DMX-controlled dimmers. For those not familiar with DMX, it stands for Digital MultipleX protocol, and is the standard communication protocol used in production lighting systems today. It is really RS-485 (in essence, a serial link like RS-232 that can daisy-chain up to 32 devices and travel 1,000 feet without loss) carrying a constant stream of byte values that represent sequential channels and their current control/intensity values.
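To make that byte stream concrete, here’s a hedged sketch of what a single DMX frame looks like: one start code byte (0 for standard dimmer data) followed by sequential channel values. The serial break timing that precedes each frame on the wire is left out here:

```csharp
using System;

// Sketch of a single DMX512 frame as described above: a start code of 0
// followed by up to 512 sequential channel intensity bytes (0-255 each).
public static class Dmx
{
    public static byte[] BuildFrame(byte[] channelValues)
    {
        if (channelValues.Length > 512)
            throw new ArgumentException("DMX carries at most 512 channels per universe.");
        var frame = new byte[channelValues.Length + 1];
        frame[0] = 0x00; // start code: 0 = standard dimmer data
        Array.Copy(channelValues, 0, frame, 1, channelValues.Length);
        return frame;
    }
}
```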

Most programmers today don’t often come into contact with lighting control protocols used on large intelligent lighting and dimming systems, so there’s not a whole lot of code out there to work with it. There are really 2 ways of having a normal computer talk to lighting systems:

  1. Through a DMX-over-Ethernet protocol called ARTNet. This requires an Ethernet network, and a converter that will translate Ethernet DMX packets into RS-485 DMX traffic to get to the dimmers. The converters usually run between $300 and $5,000.
  2. Through a USB DMX device.

My office lights are already controlled via ARTNet, and I already had code for it, but the converter cost was a deterrent for this project. Keeping costs to a minimum, I went the USB DMX device route with a really cheap device – the Enttec Open DMX USB. Enttec has simply taken a chip from FTDI (232BM) that is a USB to RS-485 converter.

The major pain point here is that FTDI only offers unmanaged DLLs to work with, and the C# wrapper for them that I found was mediocre and halfway-working at best.

Low-level, byte-by-byte programming using non-managed Interop is pretty much not my cup of tea, but to make a long story short, I decided to stick it out and coded a serial communication wrapper in C# based on the DMX protocol spec, and 6 hours and lots of frustration later, some lights were coming on!


Putting It All Together with Software

The lighting code took a LOT longer than I thought, and truth be told, I still hadn’t written the software to put this all together and actually make the piano come alive an hour before the doors opened for the event.

Using a simple .NET dataset, I constructed an XML file format that looked like this:


I made a very quick Windows Forms application that took the keyboard input (key down / key up events), adjusted the DMX output to light up the key, and played/stopped the specified MIDI note value. This application looked like this:


One of the problems I had to solve was key repeat – when you hold a key down in Windows, Windows will automatically repeat the key every few milliseconds. Each repeat is actually fired as a separate KeyDown event on the form. A simple array of “currently pressed keys” solved this pretty quickly, but there was somewhat of a limit on how many keys could be processed at a time because of this.
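The de-duplication can be as simple as tracking which key codes are currently down (the post mentions an array; a HashSet behaves the same way). A sketch of the idea:

```csharp
using System.Collections.Generic;

// Sketch of the key-repeat filter described above: Windows re-fires
// KeyDown while a key is held, so remember which keys are already down
// and only act on the first event of each physical press.
public class KeyRepeatFilter
{
    private readonly HashSet<int> _keysDown = new HashSet<int>();

    // Returns true only for the first KeyDown of a press.
    public bool OnKeyDown(int keyCode)
    {
        return _keysDown.Add(keyCode); // Add returns false if already present
    }

    // Wire this to the form's KeyUp event so the next press registers.
    public void OnKeyUp(int keyCode)
    {
        _keysDown.Remove(keyCode);
    }
}
```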

By the time I finished tweaking, it was literally minutes until the doors opened. The application actually ran in Visual Studio debug mode the first night, until I got it transferred over to a netbook to run the other 2 nights. One of the unintentional but nice side effects of running it straight out of Visual Studio became the ability to update the code whenever a wave of people left the room. 🙂 So, about halfway through the first night, I added a little timer to change the instrument every 30 seconds. Interesting side note here – MIDI bagpipes are a great way to clear out the room quickly to make room for the next wave to come in. There’s not much more annoying than a high-pitched bagpipe note from a kid sitting on one of the keys and not moving.

Below is the video of the finished product in action:


Lessons Learned

  1. Find a better microcontroller than a computer keyboard with small solder points.
  2. A giant piano made out of 2×4’s, lexan, and dimmers weighs in at over 350lbs and is very awkward. (Thanks goes to the 8 people who helped get this thing up and down the stairs since it wouldn’t fit in the elevator.)
  3. The weather strip started to wear out after a few nights of beating and some had to be supplemented when keys would start to stick. I would use larger strips next time.

The piano was a great success. It did exactly what it was supposed to do, and brought smiles to a lot of kids’ faces. Can’t wait to use the experience gained from this project to make something else.

How we did Christmas for the City Streaming (The Technical Details)

For the past 2 years, my church has hosted an event called Christmas for the City. A few hundred volunteers and about 30 other churches partnered with the event from around the area to take over a downtown building for a few days, providing support and resources for people in hard times in Winston-Salem, and a really unique Christmas experience composed of music, art, dance, literary works, and a ton of other features.

One of the goals for 2009 was to bring the event online in a bigger way with a few online streams that would let a virtual visitor surf through some of the main areas with just a couple of clicks. This blog post describes the process of how we ended up with what we did, and some of the challenges to solve for the next event of this type.

The Stream Team

This year’s streaming was very much a team effort, and everyone involved donated a lot of time and resources to make it happen. It wouldn’t be right to finish this post without giving props to:

  • Sean E. – Sean serves as an engineer for Microsoft and provided vision, hardware, some Windows 7 boot-to-VHD ninja, and way too much effort working with Time Warner Cable.
  • Sammy E. / Triad Tech – Sammy, with Triad Technologies, provided wiring through 3 floors of the Millennium Center, giving us the backbone we needed to establish a real network.
  • Jason H. – Silverlight kung-fu, getting the initial application prototype built and into the wild based on some of my very “loose” specs…
  • Brandon M. – Streaming support throughout the event
  • Dillon M. – Mobile Cam Driver
  • WSF Media Team (Aaron R., Chris E.) – These guys rocked the tech world with some awesome animation, and provided cameras and various other accessories and support.
  • X9 Technologies – X9 provided the network infrastructure, access points, servers, bandwidth, and code to get video out of the building and across the ‘net.

Location, and Wireless Networking

The Millennium Center in downtown Winston-Salem was the location of the event. It’s an extremely “historical” building with lots of “character”. We needed a solid wireless network to build on, and it didn’t exist in this building. The “character” (marble floors and walls, solid ceilings, 3-ft-thick plaster walls) meant 2 things – first, it was designated as a nuclear fallout shelter, and second, RF doesn’t work like you would expect in there.


Since wiring is pretty difficult and we only really had one point of access to each floor, the strategy we went with was to cover the “main” area of each floor with one wireless access point. We used three Cisco Aironet APs. Power was not available or not convenient next to the network drops, so we solved this with the magic of Power over Ethernet (PoE). To get the best performance out of these for general internet access (below 5 Mbit), they were optimized for range instead of speed.

Windows Media vs. Flash

Of the streaming options out there right now, the two most viable options are Windows Media and Flash. When it came time to decide between the two, I was kind of 50/50 on both, with a slight tendency to go toward Flash, just because it had the larger install base and generally “just works” (most people have watched a YouTube video on their computer, so they already have what they need). Sean suggested using Windows Media with Silverlight. While we probably could have accomplished the same thing with Flash, we decided to create the player interface with Silverlight, and be able to pretty carefully control the user’s experience. The decision to go Silverlight over Flash pretty much came down to Silverlight and the underlying .NET stack offering us very rapid development that we could rely on without a whole lot of testing and tweaking. Although we knew some users would need to go through the Silverlight install, it was worth the tradeoff.

The Interface

Here’s a screenshot of what the final looked like:

CFTC Streaming Screenshot

The interface is pretty straightforward. The user can select between 3 different camera streams (originally Party Room, Performing Arts Stage, and Multi-Cultural Room). In the upper right hand corner, a real-time scrolling schedule is kept in sync with the event as it happens, so the user can check out whichever scheduled artist they want. In the lower right hand corner is the Twitter integration, displaying tweets from Twitter’s search API that contain the hashtag #CFTC.


I had a little bit of dumb luck when it came to testing this application before the event. The Friday before, we had a pretty good snowstorm here in North Carolina (which never happens), so I used that opportunity to set up a camera and turn it into a “snow cam”. This gave people a reason to go and visit the application to see the snowfall, and gave me some valuable feedback via comments and logs of how many people had Silverlight installed, and whether or not everything was working as expected. Surprisingly, not one person had a problem viewing it out of a couple dozen. Success.

Failover Streams

The only problem with Windows Media Services is that when a stream ends (i.e. in the event of a network interruption), it gets pretty ugly. The player usually just freezes on the last frame, or displays an ugly error message, which is obviously not ideal. The first night, I ran out of time to implement a static failover stream, but by the second night I was able to have it in place.

This is a fairly undocumented method in Windows Media Services, but it requires 2 publishing points. First, the publishing points for the “live cams” (we called them CFTCCam1, CFTCCam2, and CFTCCam3), and second, the publishing points for the “broadcasts” with failover (we called them CFTCBroadcast1, CFTCBroadcast2, and CFTCBroadcast3). The “broadcast” publishing points are actually a Windows Media Server Side Playlist (.WSX file) that contains a <switch> statement with a loop. So, the Broadcast publishing point will try to play the “live cam” feed first, but if it’s offline, it will switch over to a static .WMV animation file (the Christmas for the City animation).
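I can’t share the exact file, but a server-side playlist of this shape looks roughly like the following. The names are illustrative (matching the publishing points above), and the exact src syntax for chaining to a live publishing point varies, so treat this as a sketch rather than a copy-paste file:

```xml
<?wsx version="1.0"?>
<smil>
  <!-- Loop forever: try the live cam publishing point first; when it is
       offline, play the short failover animation, then try again. -->
  <seq repeatCount="indefinite">
    <switch>
      <media src="rtsp://localhost/CFTCCam1" />
      <media src="C:\WMPub\failover.wmv" />
    </switch>
  </seq>
</smil>
```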

The only caveat with doing this is that you really have to keep the “failover” file short, since it must play all the way through before the server attempts to switch back over to the “live” stream. We kept ours down to about 25 seconds (the animation followed by a quick “hang tight!” message), and I considered this acceptable.


Encoding was done with Microsoft Expression Encoder 3. This is a fairly new product in Microsoft’s Expression line of design tools, and it has a fairly slick interface (a major improvement over Windows Media Encoder 9).

We had 3 computers (2 desktops and 1 laptop, 2 running Windows 7 and 1 running Vista), with Sony HVR-1 cameras using FireWire for input. This gave us a pretty good video source with the camera audio taking XLR output from the audio console(s) or camera mic.

I have to say, while Expression Encoder is a great interface, it had its share of “isms”. First, after installing it on a clean Windows 7 or Vista machine, using it in Live Encoding mode with our FireWire cameras, it would crash hard every time we pressed “Start Streaming”. This was fixed pretty easily by installing a QFE found here. Also, it generally does not deal with network hiccups very well. Again, we’re 100% wireless in a nuclear fallout shelter, so whenever there was a hiccup, the stream would die, and our only clue would be a little line of red text next to the Publishing Point output with “Unspecified Error”. Also, the “remember password” option when setting up Publishing Points just plain doesn’t appear to be hooked up, so this became an annoyance with our volunteer team that helped set up and maintain the streams throughout the nights. Overall, these were just slightly annoying problems that I’m sure will be fixed in time in an otherwise great piece of software.


We watched stats in realtime during the event through X9’s Streaming Client Center. This hooks into the backend of the streaming servers and cross-references IP addresses with X9’s Geomap database to give us an idea of our traffic on each of the 3 feeds. We could also overlay the data on top of a Google map to impress people.


The Mobile Random Cam

Watching our stats on Monday, while people were frequenting the Party Room and Performing Arts Stage, we noticed that no one was really watching the “Multi-Cultural Room Cam”, a camera in a pretty narrow room with an 8-ft ceiling. There was no board audio feed on it, and most of the time it was just people standing around in front of the camera.

Tuesday morning, I was helping unload 25 bags of Chex Mix and 20 gallons of Apple Cider from my wife’s car into the Millennium Center, and was attempting to use a caster plate the production company used to cart around truss. By about the 3rd time everything had fallen off, I got frustrated, and went and found a large catering cart. Pushing the cart around, through the elevator, and on to the second floor, I was thinking “man, wouldn’t it be awesome if people online could just virtually ride around on a cart like this……”, and later that evening… ta-dah:


On the back of the cart is a laptop running Expression Encoder, and on the front is the Sony cam. Each laptop would last about 2 hours, so we switched between Sean’s small Lenovo and my personal Dell Precision. We needed something to get the camera lens above the cart handle, so I found a cardboard box. Dillon found a bungee cord in his truck to keep the camera from flying off. The Mobile Random Cam was born.

The stats on the mobile cam were pretty funny. Viewers immediately loved it and started watching it more than the other 2 static feeds. Then we’d go into an elevator, lose our WiFi signal, lose all of our “fans” to the other feeds, and they’d join us back a few minutes later. I think the random-ness combined with the comedy of watching as Dillon and I try to navigate this cart through thousands of people, up and down stairs, and interviewing pretty much everyone we ran into made this stream pretty entertaining. Oh, and we had a blast doing it.

That’s It.

This pretty much sums up the CFTC streaming… by all means, it was a great success, and our team was awesome. We definitely pushed the limits of wireless, Expression Encoder, and put a few miles on a catering cart. Can’t wait to do it again for another event!