http://www.ivorcatt.co.uk/x1a81.pdf ; “ …. The ingenious nature of the system for sabotaging new invention and industry.”
http://archive.spectator.co.uk/article/2nd-march-1974/25/computers . The fettered giant. “The rejection of high profitability is enshrined in a cliche of business management and accountancy, the well known principle that any business proposal claiming more than a reasonable (say 50 per cent) return on capital invested must be rejected as unsound.“ - cf. “Apple is first public company worth $1 trillion”, that is 1,000,000,000,000 or 10^12 dollars. (This would be £20,000 for every man, woman and child in Britain.)
http://www.ivorcatt.co.uk/x5as.htm “The kiss of death for the wafer as an investment option was the debacle of Gene Amdahl ….”
The missed Window of Opportunity.
Up to 1960, the technologies used for processing and for memory were the same. Then, at an in-house conference at Ferranti, Manchester, held to discuss the implications of the newly arriving integrated circuit (replacing discrete transistors and other components), the brilliant late Ken Johnson pointed out that the new technology would allow “search” within memory. This was a bombshell for me. It meant we could escape from the Von Neumann Bottleneck.
As John Backus put it in his 1977 Turing Award lecture: “Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it.”
We didn’t escape. The industry was too conservative. My own view (ignored) was that a computer should comprise perhaps 50% memory and 50% processing, all interspersed. An example was the “Kernel Machine”. http://www.ivorcatt.com/3ew.htm ; http://www.ivorcatt.co.uk/images/x0106.jpg . In 1990, power dissipation was still not a big issue.
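The contrast between the bottleneck described above and Ken Johnson’s “search within memory” can be sketched in code. This is a hypothetical illustration, not taken from the original text: both functions return the same answer, but the first models the processor dragging every word across a single bus, while the second models memory cells that each compare themselves to the key, so that only the match addresses need to leave the array.

```python
memory = [7, 3, 7, 1, 7, 9]  # a toy memory of words

def sequential_search(mem, key):
    """Von Neumann style: fetch every word through the bottleneck,
    one at a time; most of the traffic is about where to find data."""
    matches = []
    for address in range(len(mem)):   # each iteration = one bus transfer
        word = mem[address]           # word crosses the bottleneck
        if word == key:
            matches.append(address)
    return matches

def associative_search(mem, key):
    """'Search within memory' style: conceptually, every cell compares
    itself to the key at once (simulated sequentially here); only the
    addresses of matches cross the bus."""
    return [addr for addr, word in enumerate(mem) if word == key]

print(sequential_search(memory, 7))   # [0, 2, 4]
print(associative_search(memory, 7))  # [0, 2, 4]
```

Both searches visit the same cells; the point is where the comparison happens. In real content-addressable (associative) hardware the per-cell comparisons run in parallel inside the memory array, so search time does not grow with the number of words fetched across the bus.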
Since then, however, the size of memory has increased not by a factor of a million but by a factor of a thousand million, and the technology for memory has again diverged from that for processing. The new memory consumed no power, a regression to the situation before 1960. This meant that the Von Neumann architecture perhaps became the ideal once again.
Ivor Catt 18.8.2018