Wow! Where do I start? I'm no unix junkie, but I do understand that the PC architecture has been pretty damn sucky (AMD has finally fixed things). Suggesting that single-accumulator CPUs and segmentation are hallmarks of unix compilers is just naive.
While one can associate segmentation with the code/data division, there are plenty of other, far more significant reasons, speed and protection among them, that come into play. Pipelining, for example, and the "RISC way" in general brought with them a methodology borrowed from custom hardware and the analogue world, where all parts are continuously active and streamlined for data-in to info-out signal processing. That streamlining improves when the processor is designed to hold many pieces of code and data simultaneously, and part of the equation shows that keeping code separate from data is a bonus too. And since it costs little in flexibility, everyone is happy.