So, this is the user-agent string that the iPhone's Mobile Safari sends to the server to identify itself:
Mozilla/5.0 (iPhone; U; CPU like Mac OS X; en) AppleWebKit/420+ (KHTML, like Gecko) Version/3.0 Mobile/1C28 Safari/419.3
So it's Mozilla 5, but it's AppleWebKit, but that's KHTML, but like Gecko, oh and it's really Mobile Safari. I mean WTF?
I bet this is Microsoft's fault. I bet some time back in the IE2 or IE3 days they did some stupid, slimy, "try and lock them in" asshattery that started requiring everyone else to bend the user-agent string until we end up with these wretched things which will probably plague us until someone invents the next web. By which time hopefully Microsoft will have made themselves an irrelevance.
If you're wondering why I'm frothing about this, it's because I'm parsing user-agent strings to guess which codec to use (or whether a device can use one at all), and the user-agent string is all we've got.
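Here's roughly what that guesswork looks like, a minimal sketch (the patterns and codec names are mine, purely illustrative, not from any real device database):

```python
import re

# Hypothetical mapping from UA patterns to a codec guess.
# Note the irony: the iPhone's UA contains "like Gecko", so a naive
# substring match on "Gecko" would misfire -- we have to require
# "Gecko/<digits>" to catch actual Gecko browsers.
UA_CODEC_GUESSES = [
    (re.compile(r"iPhone.*AppleWebKit"), "h264"),
    (re.compile(r"Gecko/\d+"), "theora"),
]

def guess_codec(user_agent):
    """Return a codec guess for a user-agent string, or None."""
    for pattern, codec in UA_CODEC_GUESSES:
        if pattern.search(user_agent):
            return codec
    return None  # no idea -- caller falls back to something safe

iphone_ua = ("Mozilla/5.0 (iPhone; U; CPU like Mac OS X; en) "
             "AppleWebKit/420+ (KHTML, like Gecko) Version/3.0 "
             "Mobile/1C28 Safari/419.3")
```

And that's the whole problem: every one of those regexes is a fragile guess at who's lying about what.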
This should not be required.
What these browsers should be passing is an HTTP header that specifies what standards they support, e.g.:
    HTML3  # no exceptions
    HTML4 +some_stupid_extension -bug1 -bug2
    HTML5 +some_custom_extension -buggy_feature -not_yet -and_so_on
and, to make my life easier, some kind of header that lists the supported video and audio codecs. Although all of this would be much easier to deal with if user-agent strings just unambiguously identified the device/browser instead of playing stupid compatibility games.
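If a header like that existed, the server side would collapse to a trivial parse. A sketch, assuming the made-up `STANDARD +feature -broken_feature` token format from the example above (the header itself is hypothetical, nothing sends this):

```python
def parse_capability_line(header_value):
    """Parse one hypothetical capability line, e.g.
    'HTML5 +some_custom_extension -buggy_feature', into
    (standard, supported_extensions, broken_features)."""
    tokens = header_value.split()
    standard = tokens[0]
    supported = {t[1:] for t in tokens[1:] if t.startswith("+")}
    broken = {t[1:] for t in tokens[1:] if t.startswith("-")}
    return standard, supported, broken
```

Three lines of set-building instead of a regex zoo, and no guessing about who's pretending to be whom.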