I thought about that the other day (for development in general), but I couldn't think of any actual use cases. I always want to keep my head straight when I'm coding, but maybe you could use head movements as commands? E.g., tilting your head down slightly scrolls down, tilting it up scrolls up, and so on.
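Just to make the idea concrete, here's a toy sketch of that mapping. Everything in it is invented for illustration: it assumes you can poll a head-pitch angle in degrees from the headset's tracker, and the dead-zone threshold is a guess.

```python
# Toy sketch: turn head pitch into scroll commands.
# Assumes a stream of pitch readings in degrees (negative = tilted down).
# The dead zone keeps ordinary head posture from constantly scrolling.

DEAD_ZONE = 10.0  # degrees of tilt to ignore (made-up value)

def pitch_to_scroll(pitch_degrees):
    """Map one pitch reading to a scroll command, or None inside the dead zone."""
    if pitch_degrees < -DEAD_ZONE:
        return "scroll_down"
    if pitch_degrees > DEAD_ZONE:
        return "scroll_up"
    return None

# A few fake tracker readings:
readings = [-2.0, -15.0, 3.0, 22.0]
print([pitch_to_scroll(p) for p in readings])
# → [None, 'scroll_down', None, 'scroll_up']
```

In practice you'd probably want hysteresis and a repeat rate on top of this, or it'd scroll the moment you glanced at your keyboard.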
That sounds more like a use case for eye tracking. What about, say, a torus or sphere of virtual displays accessible via the Rift? I don't think I'd actually want that, but it's possible someone might, although the resolution might be poor compared to an ordinary IPS panel -- I'm not familiar enough with the Rift to know for sure.
On the whole, I'm less excited by the Rift, as a developer, than I thought I would be; I'm just not seeing all that much in the way of use cases for it. (As a gamer, though, I'm over the moon, especially since Star Citizen will probably support the Rift.)
As a developer, I'm more interested in new input methods, such as the programming-by-voice scheme Tavis Rudd demoed at PyCon 2013; my wrists aren't as young as they used to be, and given that programming is my hobby as well as my profession, anything that takes some of the load off them will be welcome. (I just wish he'd release his damn code already! I've made a halfway decent start from scratch, in that I've got basic dictation working, but not having to reinvent all the glue from first principles would make life a lot easier…)
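By "glue" I mostly mean the layer that turns recognized phrases into the text you actually want typed. A trivial illustration, with no speech engine involved -- the command vocabulary below is entirely made up, though a real setup (Dragon plus custom grammars, as in Rudd's demo) works on a similar phrase-to-output principle, just with hundreds of rules and editor integration:

```python
# Toy illustration of dictation "glue": translate recognized command
# phrases into emitted text. The vocabulary here is invented.

COMMANDS = {
    "deaf": "def ",     # spoken alias for the 'def' keyword
    "laip": "(",        # left paren
    "raip": ")",        # right paren
    "colon slap": ":\n",  # colon followed by newline
}

def transcribe(phrases):
    """Join a sequence of recognized phrases into emitted text."""
    out = []
    for phrase in phrases:
        # Unknown phrases fall through as snake_cased identifiers.
        out.append(COMMANDS.get(phrase, phrase.replace(" ", "_")))
    return "".join(out)

print(transcribe(["deaf", "parse line", "laip", "raw text", "raip", "colon slap"]))
# → def parse_line(raw_text):
```

The hard part, as far as I can tell, isn't this lookup -- it's the grammar design and keeping recognition accurate at speed, which is exactly the stuff I'd rather not reinvent.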
If the headgear is comfortable enough, it might help with RSI/fatigue issues -- body position is no longer narrowly constrained by practical display mounting.
You won't be restricted to physical displays anymore -- what if your alternate desktops floated off to the side / above / wherever, relative to your current "display"?