Saturday, August 30, 2008

Bandwidth: Limits, Speeds, Standards?

Christopher Blizzard recently wrote about Comcast (a major US ISP) imposing bandwidth limits on their customers -- unfortunately, something those of us in Australia know all too well.

My initial thoughts included a comment:


I don't think it's 'that's a huge pile of angry non-Americans' any more than it is 'the rest of us would like to welcome you, the Americans, to needing to put up with what the rest of the world deals with on a daily basis' -- or consumerism.


After pondering on it for a while longer, especially in relation to local users -- I have to wonder:



  • If the average cost of bandwidth in Australia is $50 AUD for 5GB of quota, which at 1.5M per second equates to 3.75hrs of sustained download time -- what does an average household actually do with their internet connection the rest of the time?


  • Comparing the three ISPs I deal with on a daily basis:


  • Telstra ran a TV commercial recently where they demonstrated a 'BigPond Connected Home' with 2 adults, 1 adult child, 2 children and a dog using the internet at the same time -- one streaming TV, one viewing Facebook, one browsing eBay and one booking travel.

    Presuming the family only paid the 'average' fee, $59.95 at 1.5M gives you 600M -- yes, that's megabytes (not a typo) -- with excess usage charged at $0.15 a megabyte.

    Looking at the list of Telstra unmetered sites, Facebook isn't in the list, YouTube isn't, and neither is eBay -- which raises the question: how much is the average family's internet bill per month?



  • SingTel/Optus came out marginally better, providing a 15GB plan for $59.95 AUD a month with no excess data charges -- but a throttled speed of 128kbps once you've hit your limit, as well as no apparent unmetered content.



  • Internode came out better still, providing a 25GB plan at ADSL2 speeds for $59.95 with no excess usage charge, albeit with a slower throttled speed of 64kbps -- and they provide an extensive unmetered mirror of online games and free software, as well as a variety of uncapped site access.




In the last week, I'd been using my Playstation 3 at home a fair amount -- and decided to do a bit of research into how much bandwidth the average game used by running a few demos, plugging my DSL modem directly into the PS3 and thus removing the rest of my LAN's traffic from the equation.



  1. Call Of Duty 4, 6-man multiplayer with a 15 minute time-limit used 23M of data the first time, and 24M of data the second time (the only difference was the map we used).



  2. Grand Theft Auto 4, 12-man multiplayer with a 20 minute time limit used 40M of data each time.



  3. The new demo for EA Sports NHL '09 came out on the Australian PS3 Network -- which I promptly grabbed as a 'normal use case' test for the average household. At 1102M, on its own that accounts for $169.33 AUD of excess usage charge at Telstra, or nearly a tenth of the available quota at Optus, using the above numbers.



After returning my LAN to the equation, I then stumbled over the following article, which talked about the USA lagging Japan in terms of aggregate bandwidth over the medium-to-long term, and thought: sure, Japanese users can reach Japanese sites at 63M, possibly the Japanese eBay or Amazon too, but what's the speed like from there to Europe, or the USA?

I understand that it actually costs a lot of money for anything to get to Australia, and that the cost of cable, backhaul links and maintenance is prohibitive.

I also understand that Telstra is still in a position where they can play overlord to the communications network in this country, and that a mentality of 'we built the links, you paid for them, now we profit from them' is the standard -- but that's the case with any 'Shareholder Concerned' business.

Maybe things would have been different (better?) if the government had forced Telstra to hand over control of the physical line infrastructure to AUSTEL 15-20 years ago, rather than dissolving AUSTEL, forming the ACA (and ACMA) instead and creating yet another mid-range government department while allowing the sole telecoms provider to become a complete juggernaut too.

They didn't -- and now we the consumers are paying and will continue to pay until either:

a) the government regulates in consumers' favour across the country (rather than concentrating on 'the bush')

or

b) the cows come home (especially with the G9 facing resistance from the ACCC over their Fibre-To-The-Node proposal, and Telstra pulling the telecoms version of an all-in, over-the-top raise by launching their own proposal for the rebuilding and revamping of Australia's telecommunications infrastructure).

According to Chris's other comments, at least the pricing situation here isn't as bad as the one in South Africa, where the equivalent of $140 AUD buys you 10GB -- but that's for a userbase of around 10,000 users, not 16.6 million.

The next few years will be interesting.

Saturday, August 23, 2008

Playstation 3, Streaming, Formats ... Ick.

I've been working away at a media streaming project for the last week or two -- mainly because running between the machine that holds most of my web-content and the TV becomes tiresome, especially when you've got most of the content in .mkv or .mp4 formats that don't fit more than an episode or two on a DVD-R.

I've tried TwonkyVision -- widely recommended as the best solution for streaming with transcoding from external sources, but purchase-ware -- as well as both FUPPES and MediaTomb from the world of OSS.

Quite frankly, I came away unimpressed.

Not because any of the software is bad, or particularly difficult to set up (although you need a newer version of FFmpeg to get any reasonable quality from the HDTV transcodes, and that creates Debian pain) ...

More because the Playstation doesn't do half the things you'd expect of it.

Granted, it's behind the XBox360 juggernaut in terms of market time -- but if you're going to do format compatibility, please -- SONY, make a decent job of it.

First, I tried connecting the three media servers using the default instructions for each, enabling uPnP and a default multicast route on the boxes that held the data -- and of course, while that meant my laptops could see the share, the PS3 didn't.
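
For what it's worth, that 'default multicast route' amounts to a one-liner on each Linux box -- a rough sketch, assuming net-tools and eth0 as the LAN-facing interface:

# uPnP/SSDP discovery uses the 239.0.0.0/8 multicast range; give it a route
route add -net 239.0.0.0 netmask 255.0.0.0 dev eth0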

Further investigation concluded it's nigh on impossible to get the PS3 (40GB, running firmware 2.42) to talk to a media server via the wireless interface.

15M of CAT5 later, plus a re-configuration of the PS3 to use the wired interface, we had liftoff. Well -- nearly: the PS3 had dropouts whenever it tried to list directories on the share; it'd start searching, get through the first 20 or 30 entries and stop.

More poking ensued -- it turns out the PS3 needed an explicit route to the box hosting the media. Easily fixed, but in no manual; it only clicked for me because the box doing the serving (running Red Hat Enterprise Linux 4.x) was triggering source route notifications in my logfiles.
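
On a Linux box, that sort of explicit route is another one-liner -- a sketch with hypothetical addresses (which machine it belongs on, and which interface, depends on your layout):

# add an explicit host route (here, to a hypothetical PS3 at 192.168.0.20)
route add -host 192.168.0.20 dev eth0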

Now the PS3 saw the media -- as a variety of MPEG-2s and 'Unknown Data'.

Back to the interschnitzel, to find MediaTomb and FUPPES both have 'a workaround' to make the PS3 see DivX files -- except neither actually works on PS3s with 2.4x firmware 'out of the box' (a phrase that's becoming my new favourite annoyance).

For MediaTomb, this means adding the following to the extension-to-mimetype mappings in config.xml:


<map from="avi" to="video/x-divx"/>
<map from="divx" to="video/x-divx"/>


For FUPPES, this means adding the following to the file settings section of its configuration:


<file ext="avi">
<type>VIDEO_ITEM</type>
<mime_type>video/x-divx</mime_type>
</file>


... to your configurations.

A 'reboot and re-import' later, the PS3 saw a bunch of MPEG-2 and DivX files. Good.

Except half of them wouldn't play. Bad.

In fact, the PS3 seems pickier about which DivX/XviD files it'll play than the documentation suggests -- the exact same file with the exact same settings, encoded with XviD 1.10 and 1.12, plays differently on the PS3: the 1.10 file is labelled 'Corrupted Data', but the 1.12 copy plays normally, albeit with audio skew caused by lag.

Then there are MPEG-2s that won't play if they're in a TS container, but will if the stream is copied into a PS container first.

Matroska (MKV) based H.264 files won't play at all either -- having a platform that says it supports 'new media' and not shipping a Matroska demuxer/parser is ... er, strange.
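
If you'd rather re-wrap than re-encode, FFmpeg can usually copy the streams straight into a friendlier container -- a rough sketch, assuming a recent-ish ffmpeg build and, for the MKV case, an audio track the MP4 container will accept (AAC, say):

# copy an MPEG-2 transport stream into a program stream container
ffmpeg -i input.ts -vcodec copy -acodec copy output.mpg

# re-wrap H.264/AAC from Matroska into MP4
ffmpeg -i input.mkv -vcodec copy -acodec copy output.mp4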

My personal favourite, though, is that transcoding anything high-definition fails using 'chunked' encoding, because the 'buffer' size (I had set to 512k) is too large.

Luckily, that's an easy fix for FUPPES. For MediaTomb it's not as straightforward: you need to change the 'fill' size to work around it ... but again, that's not documented anywhere either.

The working settings I came up with -- which need to be added to each transcoding section where you're converting H.264 (.mkv or .mp4 mostly; unless you watch high-def pornography, in which case I'll leave it as an exercise for the reader) -- were:

MediaTomb:


<buffer size="20971520" chunk-size="524288" fill-size="10485760"/>


FUPPES:


<http_encoding>stream</http_encoding>


In the end, after recompiling both platforms, I settled on MediaTomb -- although I now have RHEL 4 and Ubuntu 8.04 packages for both, built from their respective RCSes as of 20080818, so I can switch easily if I want to change.

There'll be a forthcoming post on how to configure the transcoder scripts and FFmpeg, sometime when I've got hours up my sleeve to document it with sane reasoning and screenshots. I've got it going now -- but if I wrote it up today, it'd look ranty, like I was SONY-bashing.

I agree with this guy. The XMB is nice, and the changing colour for the seasons is a nice touch too, but if SONY doesn't increase the titles available via the Playstation Network (in Australia, we don't have a Madden '09 demo yet, for example -- nor do we have Castlevania, but we do have a bunch of music videos and some streaming from various trade events ... for all the use they are), then it'll be behind the X360 for a while to come.

However, if they don't fix the format capabilities (ideally by this holiday season), then they'll be behind the X360 for a long, long, long time to come.

Really, a Matroska demuxer, support for all three main H.264/AAC profiles, MPEG2-TS and -PS support, and differentiation between DivX (DX50/DX60) and XviD (XVID) would be nice -- at least so I don't have to transcode the latter to view something the console is natively capable of playing in the first place.

If we got that, plus maybe a Dirac and Vorbis implementation (I mean, does anyone use ATRAC?) -- the Playstation 3, to quote a great movie reference, "would become more powerful than you could possibly imagine."

... but at the moment, it's a games platform, with a swanky front-end.

If SONY think it's anything more than that, they owe me my weekend back.

Tuesday, August 19, 2008

Totem Packages Available (see: Totem, GStreamer & nVidia Graphics Cards)

Playing with a possible fix for these bugs in Totem.

There are packages for Ubuntu Hardy here -- the same as the release ones, except for one rather messy hack that shifts the Hue plus-or-minus 90 from whatever position Totem starts in.

It's basically the code from here cleaned up a tad and dropped in as a patch.

They seem to work for me, using the GStreamer pipelines I presented in the last post.

Not a clean solution, but I can play a full playlist of videos without the colour skewing once -- which is better than upstream can do at present.

Friday, August 15, 2008

Bluetooth's "Operation Not Supported By Backend" Message

This morning, while trying to move some files between my mobile phone and my Ubuntu 8.04.1 machine, I was greeted with an "Operation Not Supported By Backend" message containing the address of my phone, and the drag between Nautilus windows was terminated.

Turns out, it is because the GVFS backend for Bluetooth doesn't support the device -- and using the older, GNOME-VFS way isn't supported either.

The 'Send-To Bluetooth' option (right-clicking the files you want to send and selecting your mobile) works as per normal and successfully transferred 25M of files to my phone in under a minute.
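
As an aside, the same push can be kicked off from a terminal with gnome-obex-send (part of gnome-bluetooth) -- a sketch, assuming the tool is installed; if you don't specify a device it should prompt you for one:

# send a photo to the phone over OBEX
gnome-obex-send ~/Pictures/photo.jpg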

Monday, August 11, 2008

Mozilla, SSL & the 'non-optimum' Security Warning

A number of people have been blogging about the state of the SSL Certificate Security Warning since the release of Firefox 3.0.

I must admit, personally I don't mind the dialog that pops up -- it scares the everyday user into thinking twice before sending their data to Nigeria by accident.

It is actually far more awkward to import extra root certificates into the various operating environments than it is to add certificate exceptions on a site-by-site basis.

I found the report that Federico linked to slightly disturbing -- if 58% of certificates are indeed invalid, expired or otherwise bad, that's a hell of a lot of users that are experiencing an all-too-confusing dialog box far too often.

(On that note: If you have an expired certificate, you should really get it renewed -- especially with a commercial signer, after all -- you've built a reputation with that certificate, you shouldn't have customers turning away because that little yellow bar they've been used to becomes a scary looking error message.)

I like CACert myself -- I use it for things regularly, and I've configured several e-commerce installations to use certificates from it, after going through the somewhat painful verification process to get a two-year certificate instead of a three-month one.

For commercial stuff though, CACert isn't really practical -- especially considering very few operating environments include their root certificates by default.
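
To give a sense of the legwork involved: trusting the CACert root in a single Firefox profile by hand looks something like the following, using NSS's certutil (from the libnss3-tools package) -- a sketch, with the profile directory left as a placeholder:

# import the CACert root certificate and trust it for identifying web sites
certutil -A -n "CACert root" -t "C,," -i root.crt -d ~/.mozilla/firefox/<profile>.default

... and that's one user, in one browser, on one machine.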

For semi-commercial stuff, there's no middle ground: there's either commercial CAs, homebrew, or nothing.

For personal use, there's GNU Privacy Guard -- a much better, but less Microsoft-supported way of confirming you really are who you say you are.

I've often thought about the issue in my business, where I see all sorts of certificates on a week-to-week basis -- and often need to handle the case of 'a user complained my certificate was invalid, I bought it and gave it to you, so you must have broken it.'

The thing I haven't been able to come up with yet, is the solution:


  • For big corporates, there's Verisign or Thawte, which is prohibitively expensive for a single user at home.

  • For SMEs, there are second-tier signers like Comodo, GoDaddy or Network Solutions -- for ~$100 USD per year, you can get a certificate that works 'most of the time'.
  • For Free Software Developers and other Personal Use, there's basically CACert, or doing it yourself (a homebrew certificate is only a couple of openssl commands -- see the sketch after this list). Neither of those is supported by anything remotely mainstream without doing a hell of a lot of legwork yourself.
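
For completeness, the do-it-yourself route is at least short on typing, if long on browser warnings -- a sketch, assuming OpenSSL is installed and a year's validity is enough:

# generate an RSA key and a self-signed certificate valid for one year
openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout server.key -out server.crt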



Maybe Mozilla themselves, or Google, could do something to help the situation by running a CA that works in parallel with the other services they provide -- but how would that be any less work than rubber-stamping CACert?

Well, even though the principle is the same, if Google did it, it'd probably be supported everywhere -- but as of now, CACert is still running the gauntlet with Mozilla, and will probably have a much more difficult task getting past Microsoft and Apple.

Wednesday, August 6, 2008

Autodesk Backburner & VMWare Clones

If you've installed Autodesk's Backburner product within VMWare (part of 3D Studio 7/8/9/2008) and have trouble getting the Backburner server to start because of a "UDP interface is not valid in this context" error, the solution is two-fold:

First, power off your VM.
Then, edit the .vmx file for the failing VM and change:



ethernet0.generatedAddressOffset = "0"



To:



ethernet0.generatedAddressOffset = "1"



Power up the VM and log in to Windows -- using Windows Explorer, navigate to the C:\Program Files\Autodesk\Backburner\Network directory.


Delete the backburner.xml file.

Now when you restart the Backburner server application, a new configuration file will be created with the new GUID and SID of the VMWare instance and it should start up and run normally.

Tuesday, August 5, 2008

Encoding Videos for your PS3 using GStreamer

Following up my previous post, I've been playing more with HDTV content on my machines -- and spending more time re-encoding content for my PS3 so I can watch it from the comfort of my lounge.

Using Avidemux is nice, but occasionally it's nicer to stick to a terminal, with relatively low overhead, and use the GStreamer framework to do the same job.

So, here are two examples of how to use the CLI to generate content playable on the PS3.

For Standard Definition (SDTV) content -- you can use:



gst-launch-0.10 filesrc location="input.avi" ! decodebin name="decode" decode. ! queue ! ffmpegcolorspace ! ffenc_mpeg4 bitrate=999999 gop-size=24 qmin=2 qmax=31 flags=0x00000010 ! avimux name=mux ! filesink location="output.avi" decode. ! queue ! audioconvert ! lame name=enc vbr=0 ! queue ! mux.



As of the 2.42 firmware, the PS3 has issues playing VBR audio -- so you have to explicitly turn it off (setting it to CBR audio) using the vbr=0 option to lame.

For High Definition (HDTV) content in .MKV format, the x264 encoder works better, giving a higher quality re-encode at the expense of a larger output file (obviously, if you've downloaded an HDTV file that's already been encoded in the .avi format, you'd want to use the SDTV pipeline above, because the .avi file would already have significant quality loss compared to the original source material).
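
If you're not sure what a given file actually contains, asking FFmpeg to read it with no output file is a quick way to see the container, codecs and resolution (it exits complaining about the missing output, which is fine) -- assuming an ffmpeg binary is on your path:

# print the container and stream details for a file
ffmpeg -i input.mkv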

In HDTV, we can also use AAC audio to generate 4.1 channel audio -- as opposed to recoding to MP3.



gst-launch-0.10 filesrc location="input.avi" ! decodebin name="decode" decode. ! queue ! ffmpegcolorspace ! x264enc ! ffmux_mp4 name=mux ! filesink location="output.avi" decode. ! queue ! audioconvert ! faac name=enc ! queue ! mux.



Where input.avi is the path to your existing media source and output.avi is the path and file that you'd like to save.

Important note: yes, the full stop (after the mux.) in both cases is intentional -- if you take it out, GStreamer will complain about the pipeline.

Monday, August 4, 2008

Totem, GStreamer & nVidia Graphics Cards

I've been bitten by these bugs fairly often on my HP DV6000 laptop -- and with nVidia claiming it's nothing to do with them, I decided to do a little investigation.

Turns out, Totem seems to reset the video settings after each video has been played.

If the quality sliders in Totem are dead center for all four settings (Saturation, Contrast, Hue and Brightness) ...



... the video displays with a bluish tinge unless you use the following GStreamer Video Output pipeline:


ffmpegcolorspace ! video/x-raw-yuv,format=(fourcc)YV12 ! videobalance contrast=1 brightness=0 hue=-1 saturation=1 ! autovideosink


If the colour settings slider for Hue is at the far left (as has been suggested as a solution by several people), the following pipeline works:


ffmpegcolorspace ! video/x-raw-yuv,format=(fourcc)YV12 ! videobalance contrast=1 brightness=0 hue=0 saturation=1 ! autovideosink
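
If you want to try either of these, the pipeline can be plugged in as a custom video output via the Multimedia Systems Selector (gstreamer-properties), or set directly in GConf -- a sketch, assuming the stock GNOME key for the default video sink:

# set the default GStreamer video sink pipeline
gconftool-2 --type string --set /system/gstreamer/0.10/default/videosink "ffmpegcolorspace ! video/x-raw-yuv,format=(fourcc)YV12 ! videobalance contrast=1 brightness=0 hue=-1 saturation=1 ! autovideosink"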




However, regardless of which pipeline one chooses, Totem seems to reset itself each time, seemingly trying to adapt to the optimum setting, which means the first video you play will display correctly, but following videos will be blue.

At this point, I'm not really sure how to fix it -- but nVidia suggest that it isn't their problem and Totem should fix it.

The interesting thing about that is that if I take a screenshot of a playing video, the screenshot is the correct colour, every time.