In my opinion, this is the right goal, but I don't think this can be done by running existing apps in the cloud (and remoting their UI). Instead, I think we need a new cloud-native platform so that any app can be written as a cloud app. (Obligatory self-promotion: that's what we're trying to do with https://gridwhale.com)
> The ultimate vision is/was to have everything run in the cloud. Imagine if you could run any app on the most powerful machine in the world.
IMO the core paradigm shift that needs to happen is for the software and the infrastructure to become commoditized. The only way for cloud-everything not to be a nightmarish abuse of end users is for the mode of operation to change, from the current "data comes to the app" to "app comes to the data". That is, I believe the data, the application, and the compute running it need to be independent of each other to the extent possible.
In particular, the choice of an app and where it runs should be entirely up to the user. The user should be able to easily switch from e.g. a cloud run by Amazon to e.g. a "cloud" run by their HOA in the basement of their block of flats. And then possibly switch to a cloud run by a company local to their city, or one run by their employer, etc.
The primary point behind my view is to prevent application vendors from taking their users hostage by keeping users' data under lock and key on their own infrastructure, accessible only through their own software. The secondary point is to increase efficiency and boost local markets worldwide.
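To make the separation concrete, here's a rough sketch of the shape I have in mind. Every interface below is made up purely for illustration; no real platform exposes these:

```typescript
// Made-up sketch of "app comes to the data": the data, the app, and the
// compute running it are independent pieces the user wires together.
// None of these interfaces exist on any real platform.

interface DataStore {
  read(key: string): Promise<string | undefined>;
  write(key: string, value: string): Promise<void>;
}

type App = (data: DataStore) => Promise<void>;

interface ComputeProvider {
  run(app: App, data: DataStore): Promise<void>;
}

// Trivial in-memory store standing in for "the user's data, wherever it lives".
class MemoryStore implements DataStore {
  private items = new Map<string, string>();
  async read(key: string) { return this.items.get(key); }
  async write(key: string, value: string) { this.items.set(key, value); }
}

// Stand-in for Amazon, the HOA basement box, a city-local provider, etc.
class SomeCloud implements ComputeProvider {
  constructor(private name: string) {}
  async run(app: App, data: DataStore) {
    console.log(`running on ${this.name}`);
    await app(data); // the provider executes the app next to the data
  }
}

// The app is just a function over the user's data; it never learns where it runs.
const notesApp: App = async (data) => {
  const existing = (await data.read("notes")) ?? "";
  await data.write("notes", existing + "hello\n");
};

async function main() {
  const myData = new MemoryStore();
  await new SomeCloud("HOA basement").run(notesApp, myData);
  await new SomeCloud("Amazon").run(notesApp, myData); // same app, same data
}
main();
```

The point of the sketch is that swapping providers or stores changes one binding in main(), never the app itself, so no vendor gets to hold the data hostage.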
There needs to be some reason for users to choose to run the code themselves. Maybe if self-hosted software were somehow even cheaper than cloud software? Or maybe they hear about their friend/cousin getting their data stolen by hosting it in the cloud? Or maybe it's so much faster hosted at home that the cloud can't compete. Or especially if it's easier to store it yourself than it is to store it in the cloud somehow. I wonder if self-hosted software can compete on any of these metrics: safety, cost, speed, convenience?
Unless apps can migrate between high- and low-performance instances, at the end of the day you are back to either over-provisioning ($$$$$$) or over-subscribing (shitty performance). This is exactly the problem space that makes everyone hate the VDI experience.
Yes, that's exactly it. Local UI (initially in the browser, but could be on a rich-client or mobile app) but all compute is in the cloud.
The difference from existing thin-client models is that it's a single stack: when you write a program, you write the UI code as if it were running on a local computer. E.g., you just call MessageBox("Hi, there") and the platform is responsible for remoting that as appropriate.
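Roughly, the developer-facing shape looks like this. The event type and transport below are invented placeholders just to show the layering, not our actual implementation; only the MessageBox call is the real idea:

```typescript
// Sketch of the single-stack idea: the app calls a local-looking UI
// primitive, and the platform layer underneath decides how to remote it.
// The transport here is a fake placeholder (it just logs); a real platform
// would serialize the event to the attached browser/rich-client/mobile UI.

type UiEvent = { kind: "messageBox"; text: string };

// Platform-provided: ships UI events to whatever client is attached.
async function sendToClient(event: UiEvent): Promise<void> {
  console.log(`[remoted to client] ${JSON.stringify(event)}`);
}

// What the developer sees: a plain call, as if the program ran locally.
async function MessageBox(text: string): Promise<void> {
  await sendToClient({ kind: "messageBox", text });
}

// App code stays oblivious to everything below this line of abstraction.
async function app(): Promise<void> {
  await MessageBox("Hi, there");
}

app();
```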
"A super massive global mainframe" is the correct analogy, but instead of a text-mode VT100 terminal, you get a full remote GUI (more like X Windows).
See: https://blog.mightyapp.com/mightys-secret-plan-to-invent-the...