Phil Venables

Technology - Retrospective

In the late 1980s I was a developer using virtualized systems and containers, software-defined networks, and thin-client endpoints that could graphically render serialized content in a standard markup language. The servers used specialized co-processors for cryptographic functions, storage access, and network communications. The data was stored in a high-performance transactional NoSQL store, with code written in a math-oriented functional language with matrix operators. Orchestration was done with an advanced scripting language, with policies and configuration managed in the same SDLC as the code. All of this was under the control of a system-wide role-based access control system accessible to the O/S, data, and application layers. The performance was optimized to support thousands of end-user sessions with millisecond response times in only 8 MB of memory. Yes, the late 1980s.

OK, you guessed it: this was the IBM mainframe environment of virtualized MVS regions under VM/370. SNA/VTAM networking, LU2 3279 terminals processing SGML markup, high-performance host security modules, disk channel controllers, and 3745 network front-end processors.

Data and transaction processing ran in IMS hierarchical data stores with CICS distributed transactions. APL as the programming language, REXX for scripting, and JES2/JES3 job control with RACF security. (It's been a while, so this might not be totally right and I've probably missed a few things.)

Now, this isn't one of those posts complaining about the reinvention of old concepts. Rather, I love the fact that each generation of technology that reinvents older concepts usually does it better, across multiple dimensions of cost, efficiency, and flexibility. This reinvention opens up innovation and makes it accessible to a wider set of people. I worked on a lot of that as 1990s Unix/client/server computing reinvented the way we do enterprise IT, and so on into the late 1990s/2000s and forward to the Internet and cloud(s) we have today. What does amaze me sometimes, though, is the inevitable snark (I did it myself) of new generations not recognizing these layers of our industrial archeology. Equally annoying is the sneering at them for reinventing things that are, in fact, the better for the reinvention.

Bridging the generations and advancing the profession is something other engineering disciplines seem to do better than we do. Perhaps it's because those disciplines have more active and formal professional institutions to encode and improve the body of knowledge - or perhaps it's something else?

In the meantime I'm going to flick through my '80s/'90s collection of the classics: Comer's TCP/IP series, Deitel on Operating Systems, Comer on Distributed Systems, Tony Hoare on structured programming, and finish off with the "Dragon Book" on compiler design.
