I'm not super technical, but I've always been skeptical of "cloud processing," especially after the SimCity debacle. Are there any scenarios where actually processing information on a server instead of on the user's home Xbox makes sense, aside from things like WoW and Diablo III, where you need to do the calculations on the server to keep the multiplayer aspect fair?
I can't see many developers putting in the time to offload calculations when they'd have to make the game degrade gracefully across varying levels of latency, or keep working when the servers become unavailable entirely (something like the fallback sketch below). Just look at how difficult it has been for games to utilize even a handful of CPU cores. Microsoft's claims about the cloud remind me of ten years ago, when the .NET label was being applied to everything they made and all software was somehow going to be delivered as web services. Plus, I doubt MS is offering cloud server time to developers for free. Every service they provide to devs comes at a cost.
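To make that "extra work" concrete, here's a minimal sketch of the pattern every offloading developer would have to build: try the cloud, and fall back to a cheaper local calculation when the server is slow or down. The endpoint, the 100 ms budget, and the function names are all hypothetical, not anything Microsoft has published.

```python
import asyncio

CLOUD_TIMEOUT_S = 0.1  # hypothetical per-request budget before giving up

async def query_cloud_ai(world_state: bytes) -> bytes:
    # Placeholder for a real network call; the host and port are made up.
    reader, writer = await asyncio.open_connection("cloud.example.com", 9000)
    writer.write(world_state)
    await writer.drain()
    result = await reader.read(65536)
    writer.close()
    await writer.wait_closed()
    return result

def local_fallback_ai(world_state: bytes) -> bytes:
    # Simpler, cheaper computation the console can always do on its own.
    return b"locally-computed-plan"

async def get_ai_plan(world_state: bytes) -> bytes:
    try:
        # Either the cloud answers within budget, or the game degrades
        # gracefully. This branch is the work developers have to sign up for.
        return await asyncio.wait_for(query_cloud_ai(world_state),
                                      timeout=CLOUD_TIMEOUT_S)
    except (asyncio.TimeoutError, OSError):
        # Timeout, DNS failure, connection refused: all land here.
        return local_fallback_ai(world_state)

print(asyncio.run(get_ai_plan(b"world-state")))
```

Note that the fallback path means shipping two implementations of the same feature, which is exactly why studios might not bother.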
It makes sense for multiplayer games not to be hosted on arbitrary client boxes, but on some kind of neutral, low-latency "cloud" box (I know you asked for scenarios other than this, but I thought this was the biggest selling point for it in the first place).
It also makes sense for "complex" calculations. It's how stuff like Google Search and Google Maps works: you send the query, big machines chomp on it and spit out a result, and your netbook doesn't have to go into overdrive trying to figure anything out. Basically, anything that can tolerate 100 ms or so of round-trip delay could be moved off the appliance and onto Microsoft's servers (rough sketch below). Why? Well, why not? It makes more room for rendering, I guess.
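As a back-of-envelope illustration (same caveat as above: the numbers and names are made up, not anything Microsoft has announced), a 60 fps game has about 16.7 ms per frame, so a 100 ms cloud answer lands roughly six frames after the request. The trick is to fire the request and keep rendering, consuming the result whenever it arrives:

```python
import asyncio

FRAME_TIME_S = 1 / 60  # ~16.7 ms per frame at 60 fps

async def simulated_cloud_query(world_state: bytes) -> bytes:
    # Stand-in for a real network call with a ~100 ms round trip.
    await asyncio.sleep(0.1)
    return b"cloud-computed-plan"

async def game_loop(num_frames: int = 30) -> None:
    # Fire the request and keep rendering; nothing blocks the render loop.
    pending = asyncio.create_task(simulated_cloud_query(b"world-state"))
    plan = b"locally-computed-plan"  # whatever the console can do alone
    for frame in range(num_frames):
        if pending.done():
            plan = pending.result()  # swap in the cloud answer when it lands
            pending = asyncio.create_task(
                simulated_cloud_query(b"world-state"))
        # render(frame, plan) would go here; local horsepower stays
        # on rendering while the "big machines" do the chomping.
        await asyncio.sleep(FRAME_TIME_S)
    pending.cancel()

asyncio.run(game_loop())
```

The six-frame lag is also why this only fits things like AI planning or background world simulation, not anything the very next frame depends on.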