Phil Venables

Technology - Retrospective

In the late 1980s I was a developer using virtualized systems and containers, software-defined networks, and thin-client endpoints that could graphically render serialized content in a standard markup language. The servers used specialized co-processors for cryptographic functions, storage access, and network communications. The data was stored in a high-performance transactional NoSQL store, with code written in a math-oriented functional language with matrix math operators. Orchestration was done with an advanced scripting language, with policies and configuration in the same SDLC as the code. All of this was under the control of a system-wide role-based access control system accessible by the O/S, data, and application layers. The performance was optimized to support thousands of end-user sessions with millisecond response times in only 8MB of memory. Yes, the late 1980s.

OK, you guessed it: this was the IBM mainframe environment of virtualized MVS regions under VM/370, with SNA/VTAM networking, LU2 3279 terminals processing SGML markup, high-performance host security modules, disk channel controllers, and 3745 network front-end processors.

Data and transaction processing ran in IMS hierarchical data stores with CICS distributed transactions, alongside the APL programming language, REXX scripting, and JES2/JES3 job control with RACF security. (It's been a while, so this might not be totally right and I've probably missed a few things.)

Now, this isn't one of those posts complaining about the reinvention of old concepts. Rather, I love the fact that each generation of technology that reinvents older concepts usually does it better, across multiple dimensions of cost, efficiency, and flexibility. Each reinvention opens up innovation and is more accessible to a wider set of people. I worked on a lot of that as 1990s Unix/client/server computing reinvented the way we do enterprise IT, and so on into the late '90s and 2000s and forward to the Internet and cloud(s) we have today. What does amaze me sometimes, though, is the inevitable snark (I did it myself) of new generations not recognizing these layers of our industrial archeology. Equally annoying is the sneering at them for reinventing things that are, in fact, the better for the reinvention.

Bridging the generations and advancing the profession is something I think other engineering disciplines do better than we do. Perhaps it's because those disciplines have more active and formal professional institutions to encode and improve the body of knowledge, or perhaps it's something else entirely.

In the meantime I'm going to flick through my '80s/'90s collection of the classics: Comer's TCP/IP series, Deitel on operating systems, Comer on distributed systems, Tony Hoare on structured programming, and finish off with the "Dragon Book" on compiler design.
