GPUs, CPUs, MIPS, and Brain-Based Computation
Quick links to useful diagrams:
Michael Galloy has produced a good chart showing the increase in GPU vs. CPU processing power over the past decade; it nicely continues the line of thought about nonlinear increases in processing power.
Look at: http://michaelgalloy.com/2013/06/11/cpu-vs-gpu-performance.html
See also this post by Karl Rupp: http://www.karlrupp.net/2013/06/cpu-gpu-and-mic-hardware-characteristics-over-time/
Also, this chapter from NVIDIA's GPU Gems 2: http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter29.html
For a detailed discussion (including appropriate algorithms and methods), though without trend figures, see:
http://pcl.intel-research.net/publications/isca319-lee.pdf
"Debunking the 100X GPU vs. CPU Myth: An Evaluation of Throughput Computing on CPU and GPU," by Victor W. Lee et al. (Throughput Computing Lab and Architecture Group, Intel Corporation).
These continue to bear out the super-exponential long-term IT trends identified by Nagy et al. (Santa Fe Institute and MIT, 2011). (See the graphs in that paper, and my discussion: Modeling Trends in Long-Term IT as a Phase Transition and Going Beyond Moore's Law.)
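To make "super-exponential" concrete, here is a minimal Python sketch. It is not taken from the Nagy et al. paper; the performance series and growth coefficients are synthetic, chosen purely to illustrate the distinction. In log space a pure exponential trend is a straight line, while a super-exponential trend curves upward, so fitting both models and comparing residuals shows which regime the data is in:

```python
# Minimal sketch: exponential vs. super-exponential trend fits on
# synthetic "performance per year" data (all numbers illustrative).
import numpy as np
from scipy.optimize import curve_fit

years = np.arange(2000, 2014)
t = years - years[0]
# Hypothetical GFLOPS figures that grow faster than any fixed doubling time.
perf = 10.0 * np.exp(0.35 * t + 0.012 * t ** 2)
log_perf = np.log(perf)

def log_exponential(t, a, b):
    # log y = a + b*t : constant growth rate (a straight line in log space).
    return a + b * t

def log_super_exponential(t, a, b, c):
    # log y = a + b*t + c*t^2 : the growth rate itself increases over time.
    return a + b * t + c * t ** 2

p_exp, _ = curve_fit(log_exponential, t, log_perf)
p_sup, _ = curve_fit(log_super_exponential, t, log_perf)

for name, model, p in [("exponential", log_exponential, p_exp),
                       ("super-exponential", log_super_exponential, p_sup)]:
    resid = log_perf - model(t, *p)
    print(f"{name}: residual sum of squares = {np.sum(resid ** 2):.4f}")
```

On data like this, the straight-line (fixed doubling time) fit leaves systematic residuals, while the quadratic-in-log fit, whose growth rate itself increases over time, absorbs them.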
Quick comment:
Most of the methods that people are using for predictive intelligence are, essentially, linear forecasting methods.
They are not appropriate for dramatic regime changes, or for situations that approach a singularity.
For these situations, entirely different mathematical models are needed.
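As a toy illustration (all numbers here are synthetic), the following Python sketch generates data from a hyperbolic, finite-time-singularity process, y(t) = A / (t_c - t), then fits a log-linear (i.e., exponential) trend to the early observations and extrapolates:

```python
# Minimal sketch: why log-linear extrapolation fails near a singularity.
# Data come from a hyperbolic process y(t) = A / (t_c - t), which diverges
# at the critical time t_c; all numbers are synthetic and illustrative.
import numpy as np

t_c = 20.0                   # critical time of the synthetic singularity
t = np.arange(0, 15)         # observe only well before the singularity
y = 100.0 / (t_c - t)        # hyperbolic growth

# Log-linear (exponential) fit: log y = a + b*t
b, a = np.polyfit(t, np.log(y), 1)

for t_future in (16, 18, 19):
    actual = 100.0 / (t_c - t_future)
    forecast = np.exp(a + b * t_future)
    print(f"t={t_future}: actual={actual:.1f}, "
          f"log-linear forecast={forecast:.1f}")
```

No constant growth rate can reproduce the divergence, so the exponential forecast falls further behind the closer t gets to t_c. The same failure mode applies to any straight-line extrapolation in log space: it assumes the growth rate is constant, which is exactly what breaks down near a regime change.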