Hobnobbing with the top knobs
Come the day - Sunday the third - we were greeted on the stand by ace PR man Ray Berardinelli of VTM (the DLNA's US-based public-relations firm) and were surprised to be immediately introduced to two of the association's top knobs, Pat Griffis, the DLNA's vice chairman and vice president, and Dr James W Wendorf, the treasurer.
Our surprise only increased when we realised that Pat Griffis is Microsoft's director of worldwide media standards and James Wendorf is a Philips Electronics vice-president and that company's senior manager for technology and standards.
To start the ball rolling, a couple of demonstrations were carried out using a standard Nokia N80 multi-function 3G mobile phone with music-player and camera functions - check out this HEXUS.lifestyle.news headline about the recently announced N80 Internet Edition.
The Nokia is a DLNA-compliant phone and was first demonstrated in conjunction with a Philips SLA5520 Wireless Music Adapter. With a couple of swift button presses on the phone, it had found the Philips gadget and the two kissed one another lightly on the cheeks, continental fashion, to break the ice.
Thereafter, it was possible to use the phone to feed out its store of music via the Philips box and listen to it on an attached audio system. In this set-up, the Nokia was acting not just as a wireless source of music but also as a remote handset - even including volume controls.
Next, the same phone was used to create a dialogue with a DLNA-compliant printer, an HP C7180 multi-function jobbie, and then get the printer to make hard copies of images that the phone itself had shot. Menus appeared on the phone for making various selections, including print size.
The only major downside of that dem - and no one had a good explanation for it - was that the time taken between selecting go and seeing a postcard-size print pop out seemed excessively long.
After these initial dems, it was time to find a quiet room to sit down and hobnob with the top knobs before checking out some NAS devices, an Intel ViiV PC and more.
James Wendorf and Pat Griffis ran us through the DLNA's brief history starting from its founding in 2003 and fielded questions that filled in some gaps.
The association, they said, was set up to try to ensure seamless access to digital media in and out of the home (Hallelujah!) and, with over 300 member companies, now includes almost everyone who is anyone in consumer electronics, mobiles, networking, PCs and telecoms - along with the 'middleware' and semiconductor firms whose glue holds everything together.
Both agreed that the list was incomplete without some content providers but Griffis was quick to point out that they, too, are starting to sign up. However, as best we can tell, the current membership includes just one big-name content provider, Universal Music Group.
The DLNA's first interoperability guidelines were published in June 2004, one year after the organisation was set up by its 17 founder companies. The guidelines - a collaboration between members from various industries - were created after looking at a whole bunch of likely consumer-usage scenarios for digital media and centred on open and established standards within the CE, PC and mobile industries.
The guidelines established three mandatory media formats – JPEG for stills, LPCM for audio and MPEG-2 for video – and they set UPnP AV 1.0 and UPnP Device Architecture 1.0 as the standards to use for device discovery, device control and media management.
HTTP 1.0/1.1 was picked as the media-transport standard and the IPv4 protocol suite was the agreed network-stack standard.
The chosen network standards were Ethernet (802.3i and 802.3u) for wired connections and 802.11a/b/g for wireless.
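For the technically curious, this is roughly what the UPnP discovery step at the bottom of that stack looks like on the wire. The snippet below is a minimal Python sketch of the SSDP search defined by UPnP Device Architecture 1.0, sent over IPv4 UDP multicast; the MediaServer search target is our own assumption for illustration, not something lifted from the DLNA guidelines.

```python
# Minimal sketch of UPnP/SSDP discovery, the step DLNA devices use to find
# one another on an IPv4 network. The MediaServer search target below is an
# illustrative assumption; the multicast address and port come from the
# UPnP Device Architecture 1.0 spec.
import socket

SSDP_ADDR = ("239.255.255.250", 1900)          # UPnP multicast group and port
SEARCH_TARGET = "urn:schemas-upnp-org:device:MediaServer:1"

msearch = (
    "M-SEARCH * HTTP/1.1\r\n"
    f"HOST: {SSDP_ADDR[0]}:{SSDP_ADDR[1]}\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"                                 # devices may wait up to 2s before replying
    f"ST: {SEARCH_TARGET}\r\n"
    "\r\n"
).encode("ascii")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3.0)
sock.sendto(msearch, SSDP_ADDR)

# Each reply is an HTTP-style response whose LOCATION header points at the
# device's XML description; a control point would fetch that over HTTP next.
try:
    while True:
        data, addr = sock.recvfrom(65507)
        for line in data.decode("ascii", "replace").splitlines():
            if line.lower().startswith("location:"):
                print(addr[0], line.split(":", 1)[1].strip())
except socket.timeout:
    pass
finally:
    sock.close()
```

Each matching device answers with an HTTP-style response whose LOCATION header points at its XML description - which is what a DLNA control point, such as the Nokia in the dems above, fetches next over plain HTTP before it starts browsing and streaming media.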
In January 2005, optional support was added for 11 further popular media formats – GIF, TIFF and PNG images; MP3, WMA9, AC3, AAC and ATRAC3plus audio files; and MPEG-1, MPEG-4 and WMV9 video.
Then, in September 2005, the association's certification and logo programme was launched, with ongoing support coming from quarterly testing get-togethers – plugfests – worldwide.
These are intended to verify that products are designed to the DLNA’s Interoperability Guidelines and meet its certification-testing requirements.
In essence, they're to check that one DLNA-certified product will work with any other relevant DLNA-certified product.
Updates to the Home Networked Device Interoperability Guidelines were published in March 2006. These extended the type of devices covered by the DLNA umbrella, adding in printers and extending the functionality of mobiles, "giving consumers more products and features that they want and expect to use".
For mobiles, the AVC (MPEG-4) video coding standard became the mandatory media format for video interoperability – though MPEG-2 remains the only mandatory video standard for non-mobile home kit.
The adoption of AVC was a bit of a no-brainer because it was designed for optimised content storage and transfer and offers high quality at low bit rates.
Bluetooth was added at the same time as an optional wireless transport protocol because it was already in common use in mobile devices.
However, at that time no mains-borne (powerline) networking was included, even as an option, and there is still none today.
To our mind, that's one continuing absence that smells of vested interests at work, rather than being a practical decision taken in the best interest of consumers - even though we do acknowledge that choices will be difficult given that there are a number of competing and incompatible mains-borne standards.
There are others, of course, the most obvious and most worrying being the apparent exclusion of Apple OS from this Windows-centric masterplan.
Ten more classes of devices were also added in the March 2006 update - on top of those for Digital Media Servers (DMS) and Digital Media Players (DMP).
The ability to upload and download between mobile devices and AV products was introduced as well and further "mechanisms" were brought in to improve the reliability of streamed media.
According to Griffis, a lot of work had been done on Ethernet to get video to play reliably and the Real-time Transport Protocol (RTP) - already a common transport for streaming audio and video over the internet - was added because it's more efficient than HTTP and reckoned to simplify support for internet streaming content.
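To give a rough feel for why RTP counts as lightweight, here's a minimal Python sketch - our own illustration, not anything taken from the DLNA guidelines - of the fixed 12-byte packet header defined by RFC 3550; the payload type, sequence number, timestamp and SSRC values are made-up placeholders.

```python
# Minimal sketch of the fixed 12-byte RTP header (RFC 3550). The small,
# fixed-size header is part of what makes RTP an efficient transport for
# streamed audio and video; the values used here are illustrative only.
import struct

def rtp_header(seq: int, timestamp: int, ssrc: int, payload_type: int = 96) -> bytes:
    version = 2                        # RTP version 2
    first_byte = version << 6          # padding, extension and CSRC count all zero
    second_byte = payload_type & 0x7F  # marker bit zero
    return struct.pack("!BBHII", first_byte, second_byte,
                       seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)

# Example: the header for the first packet of a made-up stream.
print(rtp_header(seq=1, timestamp=0, ssrc=0x12345678).hex())
```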
Before the March 2006 guidelines, only two device classes were covered - Digital Media Server (DMS) and Digital Media Player (DMP) – and the available functionality was pretty limited.
Back then, it was possible to pull images, video or audio from a server to a player – using, for instance, a TV set's IR control to pick a video stored on a Digital Media Server and watch it on the set, or using an AV system's handset to choose a song stored on a PC and play it on that system.
Afterwards, 12 classes of device were covered and it became possible to pull and push video, images and audio from a server to a player/renderer.
These changes enabled the two dems that used the Nokia N80 phone and also meant that images could be uploaded from a camera or phone to a PC or TV for viewing.
It also became possible for photos stored on media servers to be viewed on TV sets and for hard copies to be produced by sending images to networked printers.
But what of the future? Click that next-page button to find out...