Catt Spiral.
https://patents.google.com/patent/US3913072A/en 1972
http://www.ivorcatt.co.uk/x87tbrunel.htm 1977
http://www.computinghistory.org.uk/det/3043/Anamartic-Wafer-Scale-160MB-Solid-State-Disk/
http://archive.spectator.co.uk/article/2nd-february-1974/20/computers
http://archive.spectator.co.uk/article/9th-february-1974/27/the-cam-invention
http://www.ivorcatt.co.uk/x1a81.pdf ; “…. The ingenious nature of the system for sabotaging new invention and industry.”
http://www.ivorcatt.co.uk/x7cf.htm
http://www.ivorcatt.co.uk/x1a81.pdf
http://archive.spectator.co.uk/article/2nd-march-1974/25/computers . The fettered giant. “The rejection of high profitability is enshrined in a cliché of business management and accountancy, the well known principle that any business proposal claiming more than a reasonable (say 50 per cent) return on capital invested must be rejected as unsound.” - cf. “Apple is first public company worth $1 trillion”, 1,000,000,000,000 or 10^12. (This would be £20,000 for every man, woman and child in Britain.)
http://www.ivorcatt.co.uk/x5as.htm “The kiss of death for
the wafer as an investment option was the debacle of Gene Amdahl ….”
http://www.computinghistory.org.uk/det/8199/Anamartic-Limited/
; http://www.computinghistory.org.uk/det/3043/Anamartic-Wafer-Scale-160MB-Solid-State-Disk/ 1989
Kernel. http://www.ivorcatt.co.uk/3ewk.htm
http://archive.spectator.co.uk/article/23rd-september-1972/24/skinflints-city-diary
The missed Window of Opportunity.
Up to 1960, the technologies used for processing and for memory were different. Then, at an in-house conference at Ferranti, Manchester, held to discuss the implications of the newly arriving integrated circuit (replacing discrete transistors and other components), the brilliant late Ken Johnson pointed out that the new technology would allow “search” within memory. This was a bombshell for me. It meant we could escape from the Von Neumann Bottleneck.
https://en.wikipedia.org/wiki/Von_Neumann_architecture#Von_Neumann_bottleneck
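A minimal sketch (mine, not from the source) of the distinction Ken Johnson was drawing: once logic is merged into memory, every word can compare itself against a broadcast key simultaneously, as in a content-addressable memory, instead of the CPU fetching and testing one word at a time. The function names are hypothetical, and the “parallel” comparison is of course sequential in software; the point is that each cell needs only its own word and the key, with no trip through a central processor.

```python
def serial_search(memory, key):
    """Von Neumann style: fetch each word across the bus and test it."""
    return [addr for addr, word in enumerate(memory) if word == key]

def cam_search(memory, key):
    """CAM style: every cell performs its comparison 'in place'.
    Modelled sequentially here, but in hardware all match lines
    would assert in a single cycle."""
    match_lines = [word == key for word in memory]  # all cells at once
    return [addr for addr, hit in enumerate(match_lines) if hit]

mem = [7, 3, 7, 1]
assert serial_search(mem, 7) == cam_search(mem, 7) == [0, 2]
```

Both return the same matches; the difference is where the comparison happens, and hence how much traffic crosses the processor-memory channel.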
The von Neumann bottleneck was
described by John Backus in his 1977 ACM Turing Award lecture.
According to Backus:
Surely there must
be a less primitive way of making big changes in the store than by pushing vast
numbers of words back and forth through the von
Neumann bottleneck. Not only is this tube a literal bottleneck for the data
traffic of a problem, but, more importantly, it is an intellectual bottleneck
that has kept us tied to word-at-a-time thinking instead of encouraging us to
think in terms of the larger conceptual units of the task at hand. Thus
programming is basically planning and detailing the enormous traffic of words
through the von Neumann bottleneck, and much of that traffic concerns not
significant data itself, but where to find it.
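Backus’s complaint can be made concrete with a small sketch (illustrative only, not from the source): summing N words through a single processor-memory channel moves one address and one data word per element, so fully half the channel traffic is bookkeeping about where to find the data rather than the data itself.

```python
def sum_through_bottleneck(memory):
    """Simulate word-at-a-time access, counting channel transfers."""
    transfers = {"address": 0, "data": 0}
    total = 0
    for addr in range(len(memory)):
        transfers["address"] += 1   # CPU sends an address to memory
        word = memory[addr]
        transfers["data"] += 1      # memory returns one data word
        total += word
    return total, transfers

total, traffic = sum_through_bottleneck(list(range(1000)))
print(total, traffic)   # 499500 {'address': 1000, 'data': 1000}
```

For 1000 words of useful data, 1000 addresses also cross the channel: the “enormous traffic of words” Backus describes.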
We didn’t escape. The industry was too conservative. My own view (ignored) was that a computer should have perhaps 50% memory and 50% processing, all interspersed. An example was the “Kernel Machine”. http://www.ivorcatt.com/3ew.htm ; http://www.ivorcatt.co.uk/images/x0106.jpg . In 1990, power dissipation was still not a big issue.
Later, however, the size of memory increased, not by a factor of a million, but by a factor of a thousand million. Once again the technology for memory was different from that for processing. The new memory consumed no power, a regression to the situation before 1960. This meant that perhaps the Von Neumann architecture had again become the ideal.
Ivor Catt, 18.8.2018