Joel Martin authored
Also add a wsencoding test client/server program that sends a set of values back and forth between client and server to exercise the encodings. Not enabled by default.

Add support for encoding/decoding UTF-8 in the proxy. This leverages the browser to decode the WebSocket stream directly instead of doing base64 decoding in the browser itself. Unfortunately, in Chrome this has negligible impact (round-trip time increases slightly, likely due to the extra Python processing). In Firefox, which relies on the flash WebSocket emulator, performance is even worse, because it's really awkward to get the flash WebSocket emulator to properly decode a UTF-8 byte stream.

The problem is that the readUTFBytes and readMultiByte methods of an ActionScript ByteArray don't treat 0x00 correctly. They return a string that ends at the first 0x00, but the index into the ByteArray still advances by however many bytes were requested. This is very silly for two reasons: first, ActionScript (and JavaScript) strings can contain 0x00 (they are not null-terminated); second, UTF-8 can legitimately contain 0x00 values. Since UTF-8 is not constant-width, there is no good way to determine whether those methods actually encountered a 0x00 or simply read the number of bytes requested. Doing manual decoding with readUTFByte one character at a time slows things down quite a bit. And to top it all off, those methods don't support the alternate UTF-8 encoding of 0x00 ("\xc0\x80") either; they treat that encoding as end-of-string too.

So to get around this, for now I'm encoding zero as 256 ("\xc4\x80") and then taking mod 256 in JavaScript. This still doesn't yield much benefit in Firefox, but it's an interesting approach that could use some more exploration, so I'm leaving the code in both places.
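The zero-remapping trick above can be sketched roughly as follows (hypothetical helper names, not the actual proxy code): on the sending side each byte becomes one Unicode code point, except that 0x00 is sent as code point 256 (U+0100, which is "\xc4\x80" in UTF-8, and contains no 0x00 byte); the receiving side decodes the UTF-8 and takes each code point mod 256 to recover the original bytes.

```python
def encode_frame(data: bytes) -> bytes:
    """Map each byte to a code point, sending 0x00 as U+0100 (256),
    then serialize as UTF-8 so the stream never contains a raw 0x00."""
    return "".join(chr(256 if b == 0 else b) for b in data).encode("utf-8")

def decode_frame(payload: bytes) -> bytes:
    """Reverse step: decode the UTF-8, then take each code point
    mod 256 so that U+0100 folds back to byte 0x00."""
    return bytes(ord(c) % 256 for c in payload.decode("utf-8"))

# Round trip: 0x00 survives because it travels as "\xc4\x80".
assert encode_frame(b"\x00") == b"\xc4\x80"
assert decode_frame(encode_frame(bytes(range(256)))) == bytes(range(256))
```

In the browser the same mod-256 fold is done in JavaScript on the char codes of the decoded string, which is what lets the flash emulator's UTF-8 decoder stay on the fast path without ever seeing a 0x00.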
507b473a
Files:
- flash-src
- FABridge.js
- README.txt
- WebSocketMain.swf
- sample.html
- swfobject.js
- web_socket.js