Architecture and applications of the Connection Machine

L.W. Tucker, G.G. Robertson
1988, Computer
As the use of computers affects increasingly broad segments of the world economy, many of the problems to which people apply computers grow continually larger and more complex. Demands for faster and larger computer systems increase steadily. Fortunately, the technology base has continued to improve at a steady rate for the last twenty years, increasing in capacity and speed while decreasing in cost per unit of performance. However, the demands outpace the technology. This raises the question: can we make a quantum leap in performance while the rate of technology improvement remains relatively constant?

Computer architects have followed two general approaches in response to this question. The first uses exotic technology in a fairly conventional serial computer architecture; this approach suffers from manufacturing and maintenance problems and high costs. The second approach exploits the parallelism inherent in many problems. The parallel approach seems to offer the best long-term strategy because, as the problems grow, more and more opportunities arise to exploit the parallelism inherent in the data itself.

Where do we find this inherent parallelism, and how do we exploit it? Most computer programs consist of a control sequence (the instructions) and a collection of data elements. Large programs have tens of thousands of instructions operating on these data elements.
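A minimal sketch of this data-parallel pattern appears below. It is written in C with an OpenMP directive purely for illustration, not in the Connection Machine's own data-parallel languages (such as *Lisp or C*), and the array, its size, and the operation applied are assumptions chosen for the example rather than details taken from the article. The point is the shape of the computation: one control sequence applies the same instruction to every data element, so the available parallelism scales with the size of the data.

```c
/* Illustrative sketch only: modern C with OpenMP is used here to show the
 * data-parallel idea, not the Connection Machine's actual programming model.
 * The array contents and the operation are hypothetical. */
#include <stdio.h>

#define N 8   /* tiny stand-in for the thousands or millions of data elements */

int main(void) {
    float data[N]   = {1, 2, 3, 4, 5, 6, 7, 8};
    float result[N];

    /* On a serial machine this loop visits elements one at a time.
     * On a data-parallel machine, conceptually, each element has its own
     * processor and the single instruction below runs on all of them at once. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++) {
        result[i] = data[i] * data[i] + 1.0f;  /* same instruction, every element */
    }

    for (int i = 0; i < N; i++)
        printf("%g\n", result[i]);
    return 0;
}
```

Compiled with OpenMP enabled (for example, `gcc -fopenmp`), the loop iterations may run concurrently; without it, the pragma is ignored and the program behaves as an ordinary serial loop, which mirrors the serial-versus-parallel contrast the article draws.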
doi:10.1109/2.74