Indeed everything old is new again. Way back in the mid-90s, we had dumb X terminals at home that connected to our Unix network at the office via ISDN. They had no storage and couldn't even boot without the network connection. The "cloud" was just the engineering network at the office.
In more recent years -- for many years now -- I have used VNC to connect remotely to Windows and Linux machines at work from wherever I am, with whatever device I happen to have with me. That includes VNC on my iPad. Not having an actual mouse makes the iPad a bit clumsy for driving a Windows or Linux machine, but once you learn VNC's clever mouse emulation (even three-button mouse emulation), it's manageable.
But none of this is new.
So many things do cycle through. I rode the decentralized computing wave and now, as you noted, things are going back the other way.
I'm a little more optimistic about phones and tablets at this point, though. My vision is that the "dumb terminal" is really just a wireless keyboard, mouse, and display set. The smartphone in your pocket will, in a few years, be powerful enough to handle just about any task needing local data and will be able to connect wirelessly to the dumb terminal on any desk.
The phone will have all of your connections and configurations and whatever data you feel needs to be with you at all times.
I'm amused by how everything old is new again.
When I first got involved in computing in the late '70s, the personal computer was just beginning to become established; soon afterward, the original IBM PC started appearing on corporate desktops. The prevailing model was that work was done on a big central computer, and users interacted with it through terminals.
Now we're back to the centralized computing paradigm again, with the cloud serving as the centralized computing resource, and data and the applications that manipulate it actually residing remotely, with the user's machine becoming effectively a terminal accessing the remote resource.
I've been using remote desktop solutions for years, with things like AT&T's VNC software. The work is actually done on a remote host, and only the data needed to display the desktop on my screen is transmitted, with a protocol designed to minimize the bandwidth required.
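The bandwidth-saving idea behind VNC-style protocols can be illustrated with a toy sketch: instead of resending the whole screen, the host divides the framebuffer into tiles and transmits only the tiles that changed since the last update. This is a simplified illustration of the concept, not the actual RFB wire format; the tile size and frame data here are invented for the example.

```python
TILE = 4  # tile size in pixels (arbitrary choice for this toy example)

def dirty_tiles(old, new, tile=TILE):
    """Return (row, col) origins of tiles that differ between two frames."""
    h, w = len(old), len(old[0])
    changed = []
    for ty in range(0, h, tile):
        for tx in range(0, w, tile):
            same = all(
                old[y][x] == new[y][x]
                for y in range(ty, min(ty + tile, h))
                for x in range(tx, min(tx + tile, w))
            )
            if not same:
                changed.append((ty, tx))
    return changed

# Two 8x8 "frames"; only one pixel differs, so only one 4x4 tile
# (a sixteenth of the total pixel data) would need to be resent.
old = [[0] * 8 for _ in range(8)]
new = [row[:] for row in old]
new[5][6] = 1  # e.g. the cursor moved or a window repainted here

print(dirty_tiles(old, new))  # only the tile at (4, 4) changed
```

Real VNC servers go further (run-length and JPEG-style tile encodings, cursor handling on the client side), but "send only what changed" is the core of why a remote desktop stays usable over a slow link.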
The key here is interactive graphical applications and the larger amount of data that must be sent to the user's device to display their current state. The concerns I can see relate to what device the user has. If I'm running a highly graphical application, I may be able to access it from my device, but can I effectively use it?
One of the issues I have with smartphones is that much of what I do graphically really needs a much larger display than any practical phone will have. I see the same problem, only more so, with this sort of solution. I may be able to get to that big graphical application from my phone, but I probably won't be able to really use it.
The underlying concept looks valid, but I'm less enthusiastic about the cross-device aspects. There are a variety of things I simply wouldn't try to do from a phone or tablet, even if those devices could connect to the resources that did it.
One of the many limitations of tablets and smartphones is workspace. The screen is not going to be much bigger than 10". In addition, the screen resolution is going to be limited by price concerns. I'd be happy to hear opinions from heavy CAD users.
Security could certainly be a big advantage for corporate and government users. Instead of needing to secure hundreds or thousands of individual PCs, a few larger systems could get dedicated security attention. Of course, a lot of vulnerability would then be concentrated in one place.
Eventually, I expect that a lot of corporate and government computing will go this route. It may be the only way to really be secure from cyber attacks.
Remote desktop apps like TeamViewer can already connect to any desktop from any mobile device, which is good enough for most purposes. The advantages of cloud virtualization over that approach are not clear.