Summer 2014
Ultra Low Latency Live Stream
I'd recently implemented a bespoke live bidding service for a local auction house. Built with Node.js and largely communicating via websockets, it allowed their customers to bid remotely on sales and bypassed the additional fees the house had been paying to third-party live bidding providers.
It was working perfectly... but they wanted to take it further. They wanted to broadcast an audio-visual feed directly in the browser.
Going Once... Going Twice...
Before we go into the technological aspects of this project, I want to give you a little background on how traditional auctions work.
Unlike the timed auctions that services such as eBay have made popular, traditional auction houses run fixed, scheduled sales. A sale usually takes place over one or more days, depending on the lot count, and customers either turn up in person, dial in as telephone bidders or log into an online bidding service.
As you've likely seen on TV, the auctioneer sets an increment and customers hold up their paddle, click a button or let the telephone operator know they want to bid. Whoever indicates first, once acknowledged by the auctioneer, becomes the highest bidder. The next increment is then set and the process repeats until there's a significant pause in bidding. Sensing that interest has run out, the auctioneer calls out a fair warning and eventually gives their gavel a smack to end the bidding for that item. Whoever is the highest bidder when the gavel comes down has won the lot being offered.
Why am I telling you this? Well, an auction house might typically get through around 80-100 lots an hour (unless you're Christie's or selling high value items). That means each item is usually offered and dealt with in under a minute, unless there's a bit of a bidding war.
Every Second Counts
Online bidding platforms have a realtime interface that constantly updates to reflect the status of the sale and the current lot. If you're offering a live stream, you have to assume some users will treat it as their indicator of the sale's progress. With a delay of even 5-10 seconds, they might be hearing an increment that has already expired, or attempt to bid on an item after it has ended.
When the auction house expressed their interest in implementing their own live stream, I knew that latency, not quality, would be the primary focus and would ultimately determine whether or not I could achieve a satisfactory solution.
Experimenting
To preface this section, I must say I am not a hardware guy. Sure, I had built a few PCs from scratch rather than buying pre-built... but that's about it. I had next to no experience with AV equipment, networking or streaming in general. I do, however, like a challenge.
Off the bat, social channels such as Facebook Live, Periscope and YouTube straight up didn't meet the latency requirement, so we could rule all of them out.
We began experimenting with a range of potential solutions. The good news was we had a decent head start - the auction house had a dedicated broadband line into the building which provided huge bandwidth, reliable performance and a fixed IP address.
IP Cameras
On the hardware front, we first tried out a bunch of IP cameras. The issue with these was that a lot of them came locked into proprietary software which either didn't support streaming to a wide audience or introduced far too much latency through crap encoding.
We got our hands on an Axis encoder, hooked it up to a compatible IP camera and gave it a dedicated port on the router. The next question was how to broadcast it to an audience. We linked the feed to Wowza and tried out their low latency configuration.
We were getting around 2 seconds of latency initially, but it would quickly creep up to 5-10 seconds and beyond after prolonged usage. After messing with the bitrate and other quality settings in Wowza to no avail, we looked at alternatives.
2024 update: Looking back, I can't actually remember if I even attempted peer-to-peer streaming with the IP camera + encoder setup. I seem to remember having issues with audio for a while. Either way, I think I was pretty close with this solution, but whether it was inexperience or sheer exhaustion, we moved on.
Web Cameras + FFmpeg
I began testing with webcams. After all, we had a bunch of them lying around the office. Software such as OBS didn't give us the level of control we needed to hit the latency requirements, so I ruled that category out pretty early.
Adobe's Flash Media Live Encoder was still a thing at this point in time, but Flash was well on its way out, so I wasn't keen on that approach either.
After some research, I began looking at FFmpeg. This allowed me to take a USB webcam, encode the stream and broadcast it on a websocket.
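The exact setup is long lost, but here's a minimal sketch of the idea: a small Node.js script that spawns FFmpeg and fans the encoded stream out to every connected websocket client. It assumes the `ws` package and typical Linux device paths; the ports, devices and bitrates are placeholders rather than the values we ran in production.

```js
// Sketch only - spawn FFmpeg and relay its MPEG-TS output to every
// connected websocket client. All ports/devices/bitrates are placeholders.
const { spawn } = require('child_process');
const WebSocket = require('ws'); // npm install ws

const wss = new WebSocket.Server({ port: 8082 });

// Capture the USB webcam (V4L2) and microphone (ALSA), encode to
// MPEG1 video + MP2 audio in an MPEG-TS container - a combination that
// can be decoded in the browser - and write the result to stdout.
const ffmpeg = spawn('ffmpeg', [
  '-f', 'v4l2', '-i', '/dev/video0',   // webcam
  '-f', 'alsa', '-i', 'hw:1,0',        // desktop microphone
  '-f', 'mpegts',
  '-codec:v', 'mpeg1video', '-s', '640x480', '-b:v', '800k', '-bf', '0',
  '-codec:a', 'mp2', '-b:a', '64k',
  '-',                                 // write the stream to stdout
]);

// Fan each encoded chunk out to every open connection.
ffmpeg.stdout.on('data', (chunk) => {
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(chunk);
  });
});
```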
I could then take the stream and use JSMpeg to consume and display it in a webpage. Initial tests achieved extremely low latency (< 4 seconds), even over prolonged usage. Dropping the bitrate and resolution got this down to almost nothing.
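For the browser side, the wiring is only a few lines. The snippet below uses JSMpeg's current Player API against the hypothetical relay above; the API I used back in 2014 differed slightly.

```js
// JSMpeg decodes the MPEG1/MP2 stream in JavaScript and renders it to a
// canvas element. The host and port are hypothetical.
const canvas = document.getElementById('live-stream');
const player = new JSMpeg.Player('ws://stream.example.com:8082', {
  canvas: canvas,
  audio: true,
});
```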
Now that I'd identified a low latency solution which supported both audio and video, it was time to test it at scale and work out the logistics of:
- How to keep it running day to day
- How to install it on the rostrum
Raspberry Pi
An auctioneer's rostrum is quite a cramped space as it is. They're already surrounded by auction sheets, cups of tea and various screens showing them remote bidder information. We also needed a solution that required as little setup as possible. The staff were not very tech savvy and had better things to do than fiddle with a tiny computer before each sale. They also had no on-site tech support team - this was a regional auction house after all.
Raspberry Pi had recently released a new model of their micro computer. I actually already had a first gen lying around in a drawer at home from a failed home server attempt, so I dug it out and began replicating the laptop + webcam setup - with some modifications.
I installed the required packages and got the FFmpeg and server scripts to auto-run on boot. The nice thing about this setup is that the staff could literally turn the device on before the auction and the camera feed would start broadcasting within minutes. At the end of the auction they simply turned it off. Another benefit was that if the feed ever slowed down or encountered a problem, a simple troubleshooting step was to restart the device, cross your fingers and hope it fixed the issue (it actually did, a few times).
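I can't vouch for the exact auto-run mechanism after all these years, but the common low-effort approach on the distros of the day was an entry in /etc/rc.local. A sketch, with the user and paths as placeholders:

```sh
#!/bin/sh -e
# Hypothetical /etc/rc.local - launch the capture/relay script as an
# unprivileged user, backgrounded so it doesn't block the rest of boot.
su - pi -c "node /home/pi/stream/relay.js >> /home/pi/stream.log 2>&1 &"
exit 0
```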
Final Touches
Having a tiny computer lying around with a webcam and microphone dangling nearby was a bit of an eyesore and a potential trip hazard. Fortunately, Raspberry Pi also sold a handy camera module and enclosure, so the core setup was about the size of a wallet in the end. I did attempt to use a low profile USB mic, but the background noise was awful. In the end I used a basic Logitech desktop mic which sat nicely on the rostrum, at the expense of an additional cable.
To finalise the physical setup, I attached the Pi to the rostrum with a lazy phone arm, which allowed staff to move it around as needed and didn't take up any additional desk space. I configured dedicated ports for the outgoing live stream and pointed a subdomain at the fixed IP to ensure the Pi was always accessible at the same place.
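The DNS side of that is a single record - something along these lines, with the domain and address entirely made up:

```
; Hypothetical A record pointing the streaming subdomain at the fixed IP
stream.exampleauctions.co.uk.   IN   A   203.0.113.10
```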
While the feed's quality was hardly an IMAX experience, the project succeeded in achieving an ultra low latency audio-visual stream (consistently less than 5 seconds under load).
Limitations and Considerations
Performance
While the stream remained low latency, under high load (over 50 concurrent connections) the feed was prone to stutters. This wasn't a bandwidth issue but a hardware limitation of the Raspberry Pi itself (memory and CPU). A few years later (2018), the Raspberry Pi 3 Model B+ was released with an improved CPU and more onboard memory than our first gen board. Switching over to this largely solved the stuttering and offered an all-round faster stream.
Low Capacity
This setup was for a regional auction house. It was rare that they would have over 100 concurrent users on the realtime bidding application at any given time. No doubt, given a higher load, this solution would have suffered from poor performance, as it relied on the Pi serving every viewer directly over its own websocket connection.
High Bandwidth
I was fortunate in that the final solution could piggyback off a dedicated broadband line into the building. This resulted in a stable connection with a high bandwidth ceiling. Trying to run this setup on a 30 Mbps standard consumer connection might have resulted in a rather underwhelming experience. Unless you have some form of dynamic proxy, you will also need a fixed IP address, which the dedicated line also provided.
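To put rough numbers on it: at, say, the ~800 kbps placeholder bitrate from the earlier sketch, 50 concurrent viewers works out at 800 kbps × 50 = 40 Mbps of upstream traffic - already beyond a 30 Mbps line's headline figure, and consumer connections typically have far less upload capacity than download anyway.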
Beware of the Default Ubuntu User
One sale day I got a call to say the feed wasn't appearing. After a short car journey over to the auction hall, I plugged the Pi into a spare monitor and began debugging. The default user on Ubuntu required a password change at some pre-defined interval. This meant that when the Pi booted up, it couldn't get past login and never reached the automated stream scripts.
The fix was trivial: set up a new user without password expiry and make it the default account when booting up.
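From memory, the whole fix boils down to a couple of commands (the username is hypothetical):

```sh
# Create a dedicated streaming user and disable password ageing for it
sudo adduser stream
sudo chage -M -1 stream   # -1 = password never expires
```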
Process Ordering
During the early stages of the feed being used, there were some instances where it failed to start. This wasn't down to an expired password, but simply that the streaming scripts were triggered before an internet connection had been established.
I can't remember the exact solution, but I'm pretty sure placing the script in the /etc/network/if-up.d/ directory did the trick, which ensured it would run once the Pi was connected to the network.
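For the curious, a minimal version of such a hook - filename, user and paths all hypothetical - would look something like this, replacing the rc.local entry sketched earlier:

```sh
#!/bin/sh
# Hypothetical /etc/network/if-up.d/auction-stream - ifupdown runs every
# executable in this directory each time a network interface comes up.
# Ignore the loopback interface; wait for a real connection.
[ "$IFACE" = "lo" ] && exit 0
# Start the capture/relay script as the streaming user, detached.
su - stream -c "/home/stream/start-stream.sh &"
```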
This project was definitely out of my comfort zone; I had little experience in this space prior to taking it on. I'm grateful to Biddle & Webb, and specifically James, who was managing there at the time, for having complete confidence and patience in me while I found a solution.
I love learning on the fly and often find you learn the most when you're under pressure.