I'm somewhat certain that the rise of secure remote execution came only a few years ago. I've been playing around with some stuff at work that allows remote execution, but it is all locked into the same network, so it does not really answer my question. When did we get the ability to securely execute remotely? I'm sitting at my desk with a bag full of hobby electric motors and a few spare Raspberry Pi systems, knowing I can make these motors start spinning from anywhere in the world without exposing ports to the internet. I'm probably going to use a combination of shell scripts, Node.js, and Python to do it. But this didn't just spring from the cloud.
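Before diving into the history, here is a minimal sketch of the idea, assuming the paho-mqtt and RPi.GPIO libraries; the broker address, topic name, and GPIO pin are placeholders I made up for illustration. The Pi only ever makes an outbound connection to a broker, so nothing gets exposed to the internet.

# Minimal sketch: the Pi connects outbound to an MQTT broker and spins a motor
# when a command arrives. Assumes paho-mqtt (1.x callback API) and RPi.GPIO;
# broker, topic, and pin are placeholders.
import RPi.GPIO as GPIO
import paho.mqtt.client as mqtt

MOTOR_PIN = 18                   # hypothetical GPIO pin wired to a motor driver
TOPIC = "workbench/motor/spin"   # hypothetical command topic

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOTOR_PIN, GPIO.OUT)

def on_connect(client, userdata, flags, rc):
    # Subscribe once the outbound connection to the broker is up.
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    # Treat the payload as a simple on/off command.
    GPIO.output(MOTOR_PIN, GPIO.HIGH if msg.payload == b"on" else GPIO.LOW)

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.example.com", 1883, 60)   # placeholder broker
client.loop_forever()

Anything that can publish "on" to that topic, whether a phone app, a cloud function, or a one-line shell script, spins the motor, and the Pi never listens on a single inbound port.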
I guess we will need to start from the beginning: the first computing systems, which operated closer to a series of engineered gauges than a piece of hardware running software. Let's try to follow some paths. The very first path I want to look at is the Automatic Computing Engine (ACE). This takes us back almost a century. It was based on classified work that Turing had performed, and as one of the earliest complete stored-program designs, it may be our first real view of how a program runs from memory.
Aggregate 1:
ACE
We will hit fast-forward through some government-contractor and academic nerd fights. There was the Electronic Discrete Variable Automatic Computer (EDVAC), which used binary instead of decimal. The report describing it gave us the von Neumann architecture, which keeps instructions and data in the same memory (unlike the Harvard architecture, which separates them). These were the first steps toward modern computers, but they lacked networking and the ability to run multiple applications simultaneously. Probably the first example of using multiple computers for a single application was SAGE, which combined radar data to create a unified air picture. This eventually led to ARPANET, which was the birth of the internet.
Aggregate 2:
EDVAC
Aggregate 3:
SAGE
This would eventually lead to distributed computing, which has some amazing characteristics: concurrency of components, lack of a global clock, and independent failure of components. Early distributed systems spread the tasking around, but they did not have much security and did not handle failure gracefully. Still, they were amazing when they were created. This is what I had in high school and college on the internet: Napster and online video games. About five years prior, the Object Request Broker (ORB) became a mainstream technology for issuing commands from one computer to another. You could now issue commands from your desktop without having to log in to the system that would execute them. Created without any form of trust, of course. A long step away from our current cloud computing or domain methods. These technologies would converge.
Aggregate 4:
Distributed Computing
Aggregate 5:
ORB
I need to pause for a second to apologize for the heavy reliance on Wikipedia. The first part of this is history; the last part will be a little more cutting-edge education.
Becoming more modern, we have the Common Object Request Broker Architecture (CORBA), which allows ORBs to work across Windows, Linux, Mac, Unix, and BSD. You should be able to use any compliant language to issue commands to other systems. There is also The ACE ORB (TAO); this is not the ACE mentioned in aggregate 1, but a new ACE. We have gotten into the nested-acronym portion of our evolution. This ACE is the Adaptive Communication Environment, a framework that ties together advanced features of operating systems. TAO is a CORBA ORB built on top of ACE that lets systems call each other to execute applications in real time. When I say real time, in this instance it means "as if it were a local command".
Aggregate 6:
TAO
Aggregate 7:
ACE part 2
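To make that "as if it were a local command" idea concrete, here is a rough sketch of a CORBA client call from Python, assuming the omniORBpy bindings and stubs generated from a hypothetical Motors.idl; the interface name, IOR file, and spin() operation are invented for illustration, not taken from any real deployment.

# Rough sketch of a CORBA client call, assuming omniORBpy and stubs generated
# from a hypothetical Motors.idl (omniidl -bpython Motors.idl).
import sys
from omniORB import CORBA
import Motors  # hypothetical generated stub module

orb = CORBA.ORB_init(sys.argv, CORBA.ORB_ID)

# The IOR string identifies the remote object; a real deployment would read it
# from a file published by the server or from a naming service.
ior = open("motor.ior").read()
obj = orb.string_to_object(ior)

motor = obj._narrow(Motors.Controller)   # hypothetical interface from the IDL
motor.spin(1500)                         # the remote call reads like a local one

The whole point of the ORB is that last line: the network plumbing is hidden behind what looks like an ordinary method call.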
As security grew and became an integrated part of computing, some components of the model shifted. Although the security was half-hearted (security features were turned off by default in the OS), it was at least present. Kerberos ruled environments with tickets and realms. Massive SNMP configurations were used to trigger responses to events: a serious mashup of security and automation. I don't feel like linking to these technologies, and I have more to add to them. There were LDAP and AD domains that could link into a realm. SNMP was a shitty stand-in for the kind of MQTT and DDS messaging that should have been configured in the first place. But none of these things mattered much to consumers. They wanted a way to use familiar tools to accomplish trivial tasks quickly.
Microsoft almost got us there with the Simple Object Access Protocol (SOAP). I still remember checking for SOAP port activity at an old job. Messages that could control a networked application or device. SOAP is still heavily used today, and it has been a great stepping stone to what I currently use for cloud-based applications: Representational State Transfer (REST). While SOAP was built to be used on the internet, REST was built of the internet. Commands over HTTP, with no need to open local firewall ports to execute, fit the description of both SOAP and REST. Simple messages that execute quickly, the ability to send code rather than a simple command, and floating on top of existing protocols rather than attempting to become a protocol are what make REST the winner.
Aggregate 8:
REST
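As an illustration of how small a REST command can be, here is a sketch that uses nothing but the Python standard library; the hub URL and JSON body are placeholders of my own.

# A REST command is just an HTTP request riding on plumbing every OS already has.
# The URL and payload below are placeholders.
import json
import urllib.request

payload = json.dumps({"device": "motor-1", "action": "spin"}).encode()
req = urllib.request.Request(
    "https://hub.example.com/api/devices/motor-1/commands",  # placeholder URL
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode())

No SOAP envelope, no WSDL, no extra ports: just an HTTPS request to a machine that was already listening for HTTPS requests.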
The nice thing about using HTTP to execute is that it is baked into every OS. The uniform interface of REST makes it easier to tie in any other method of code execution that can be used. Think about your smart home hub: your lights may operate on a Zigbee spectrum, using MQTT to send the message to each individual light bulb, but the message that the hub is listening for comes in over REST. For the time nerds, it's similar to Precision Time Protocol (PTP) compared to Network Time Protocol (NTP): REST would be the NTP that feeds the PTP that is MQTT. While I have built many devices that use standard networks running on WiFi or Ethernet, converting them to an alternative spectrum like Z-Wave or Zigbee would just be a matter of adding a hub to capture the REST command and send it to the device with MQTT.
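Here is a rough sketch of that hub, assuming paho-mqtt on the device side and the standard-library HTTP server on the REST side; every hostname, port, topic, and path is a placeholder. A REST command comes in over HTTP and gets republished as an MQTT message for the bulbs (or motors) to pick up.

# Sketch of a hub that accepts a REST command and relays it over MQTT.
# Assumes paho-mqtt; hostnames, ports, topics, and paths are placeholders.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
import paho.mqtt.client as mqtt

mqtt_client = mqtt.Client()
mqtt_client.connect("localhost", 1883, 60)   # broker the devices listen to
mqtt_client.loop_start()

class CommandHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the REST body, e.g. {"topic": "lights/kitchen", "state": "on"}
        length = int(self.headers.get("Content-Length", 0))
        command = json.loads(self.rfile.read(length))

        # Relay the command onto the messaging layer the devices speak.
        mqtt_client.publish(command["topic"], command["state"])

        self.send_response(202)
        self.end_headers()

HTTPServer(("0.0.0.0", 8080), CommandHandler).serve_forever()

A single POST to the hub with {"topic": "lights/kitchen", "state": "on"} flips the light, and only the hub ever needs to know that MQTT and Zigbee exist.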
The future will obviously hold advances on the existing systems. If we look at progress in a tick-tock fashion, I would guess that remote execution has ticked. The tock is machine-learned behavior that executes without a command, which is being developed and fleshed out on the cutting edge right now. The next tick will be an increase in speed and rapid reconfiguration based on integrated machine learning to further automate. The next tock is basically what people will be gambling on in the stock market. It could be micro-expression recognition that not only automates, but personalizes based on observed responses to an automation. In ten years, your lights might turn down because your hub knows you are hung over.
I adore comments full of ridiculous speculation about the future.