You didn’t need to go 64-bit on Linux to make use of 4 GB of RAM; you could have just installed one of the bigmem kernels, which use PAE to address up to 64 GB of RAM on x86-compatible machines. Of course PAE incurs some performance overhead, but nothing too serious. At least nothing compared to the insane I/O lag I was seeing in 64-bit on my quad-core. Whenever something was doing heavy I/O on one or two cores, it’d totally bog the entire machine down until it was unusable.
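(The PAE numbers check out, by the way. A quick back-of-the-envelope sketch, assuming the standard 36-bit physical address width that PAE provides on x86:)

```python
# Physical address limits: classic 32-bit x86 vs. PAE (36-bit).
# Variable names here are just for illustration.
plain_x86_bits = 32  # classic x86: 32-bit physical addresses
pae_bits = 36        # PAE widens physical addresses to 36 bits

plain_limit_gib = 2 ** plain_x86_bits // 2 ** 30
pae_limit_gib = 2 ** pae_bits // 2 ** 30
print(plain_limit_gib, pae_limit_gib)  # 4 64  -> 4 GiB vs. 64 GiB
```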
Ha ha! Too late. I know I could have compiled a bigmem kernel, but I wanted to stick with stock Arch kernels. And I’m a bleeding edge junkie.
Flash and codecs I know about (I’m going to try out the 64-bit flash version, actually), but I’ve not heard of any (recent) problems with Java other than that the startup time is slower on 64-bit. Both jdk and openjdk are available as 64-bit packages on Arch. And OpenJDK runs the only Java program I care about (kolmafia).
You can’t access the whole 4 GB in a single application on a 32-bit OS. But it’s true it’s not worth the trouble, unless you want to make full use of all those new registers in the amd64 arch. That’s why 64-bit is good for compute-intensive applications, like video encoding and such. Compiling is probably faster too.
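(Rough numbers for the per-process limit, assuming the default 3G/1G user/kernel split of 32-bit x86 Linux; the split is configurable, so this is just a sketch:)

```python
# A 32-bit process has 2^32 bytes of virtual address space, but the
# kernel reserves the top chunk of it, so a single application sees less.
ptr_bits = 32
virtual_space = 2 ** ptr_bits      # 4 GiB of virtual addresses total
kernel_reserved = 1 * 2 ** 30      # top 1 GiB mapped for the kernel (3G/1G split)
user_space = virtual_space - kernel_reserved
print(user_space // 2 ** 30)  # 3  -> at most ~3 GiB usable per process
```

So even with PAE and 64 GB of physical RAM, no single 32-bit process can map anywhere near 4 GB at once.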
Well, it’s almost at the “just working” stage. Arch is 64-bit pure, so there are no 32-bit compat libs. I have flash, which is the main proprietary application I care about.
I’ve had a few interesting issues, but many of them are packaging issues in Arch that come up with the less-tested 64-bit packages, or build scripts that work fine on 64-bit but aren’t marked as such. Things like that are the price I pay for using a bare-bones bleeding-edge distro.