The Feynman Lectures on Computation gives a lot of practical examples of how the laws of thermodynamics, engineering constraints, and information theory limit information storage density in such systems. Yes, there is a limit, but it is very large and still a long way off.
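For a rough sense of how far away that limit is, here is a back-of-the-envelope sketch (my own illustration; the 300 K temperature and the per-bit hardware energy are assumed round numbers, not figures from the book). It computes the Landauer bound, the minimum thermodynamic cost of erasing one bit, which is the kind of limit the lectures discuss.

```python
# Rough illustration (my numbers, not from the book): the Landauer bound
# gives the minimum energy needed to erase one bit of information.
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T_ROOM = 300.0       # assumed room temperature, K

landauer_per_bit = K_B * T_ROOM * math.log(2)   # ~2.9e-21 J per erased bit

# Assumed order-of-magnitude energy per bit operation in real hardware,
# just to show the headroom; not a measured figure.
assumed_hardware_per_bit = 1e-14

print(f"Landauer limit at 300 K: {landauer_per_bit:.2e} J/bit")
print(f"Headroom vs. assumed hardware: ~{assumed_hardware_per_bit / landauer_per_bit:.0e}x")
```

The printed ratio comes out around a million-fold, which is the sense in which the fundamental limit is "very big and far away."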
A really interesting book that would complement (or serve as) a course in computer architecture is "The Feynman Lectures on Computation": http://www.amazon.com/Feynman-Lectures-Computation-Richard-P/dp/0738202967 It's a fascinating book that explains computers from basic physics up to a useful machine that does real work. It also has the virtue of being written by Feynman, someone with an amazing ability to explain things!
Hi! I’m Tony Hey, the chief data scientist at the Science and Technology Facilities Council in the UK and a former vice president at Microsoft. I received a doctorate in particle physics from the University of Oxford before moving into computer science, where I studied parallel computing and Big Data for science. The folks at Physics Today magazine asked me to come chat about Richard Feynman, who would have turned 100 years old today. Feynman earned a share of the 1965 Nobel Prize in Physics for his work in quantum electrodynamics and was famous for his accessible lectures and insatiable curiosity. I first met Feynman in 1970 when I began a postdoctoral research job in theoretical particle physics at Caltech. Years later I edited a book about Feynman’s lectures on computation; check out my TEDx talk on Feynman’s contributions to computing.
I’m excited to talk about Feynman’s many accomplishments in particle physics and computing and to share stories about Feynman and the exciting atmosphere at Caltech in the early 1970s. Also feel free to ask me about my career path and computer science work! I’ll be online today at 1pm EDT to answer your questions.
Yes, I know; that is why I mentioned general-purpose computation. Turing wrote a paper about building such a machine, but British intelligence, which funded him during the war, needed a machine to crack codes through brute force, so it didn't need general-purpose computation (his invention); even so, the machine still used fundamental parts of computation invented by Turing.
The ENIAC is a marvel, but it is an implementation of his work; the invention was his. Even Grace Hopper mentioned this.
What the Americans did invent there, though, was the higher-level language and the compiler. That was a brilliant bit of work, but the credit for computation goes to Turing, and in particular for general-purpose computation (this is why the award in my field of computer science is the Turing Award, why a machine with the full set of operations needed for general computation is called Turing complete, and why Turing and Babbage are called the fathers of computation). This conversation is a bit like crediting Edison with the lightbulb. He certainly did not invent the lightbulb; what he did was make it a practical utility by creating a longer-lasting one (the first lightbulb patent was filed some 40 years earlier).
I didn't use the film as a historical reference; I used it because it is part of popular culture, which I imagine you are more familiar with than the history of computation, as shown by the fact that you never mentioned Babbage even though the original assertion was about the invention of "computation", not the first implementation of a general-purpose computer.
> The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.
Here is a bit of what von Neumann (the American creator of the von Neumann architecture we use to this day) had to say:
> The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called the "Universal Computing Machine", later known as the universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (a program) stored on tape, allowing the machine to be programmable.
> The fundamental concept of Turing's design is the stored program, where all the instructions for computing are stored in memory.
> Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.
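To make the quoted "instructions stored on tape" idea concrete, here is a minimal toy Turing-machine simulator (my own sketch, not Turing's formulation and not taken from the quoted passage): a transition table tells the head what to write, which way to move, and which state to enter next. The example table increments a binary number on the tape.

```python
# Toy illustration (mine, not from the quoted text): a minimal Turing-machine
# simulator. The transition table maps (state, symbol) -> (write, move, next state).

def run_turing_machine(tape, transitions, state="start", blank="_", max_steps=1000):
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Increment a binary number: scan to the rightmost digit, then carry leftward.
INCREMENT = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}

print(run_turing_machine("1011", INCREMENT))  # -> 1100
```

The point of the sketch is only that a fixed rule table plus a tape is enough to carry out a computation; a universal machine is one whose rule table can interpret any other machine's table written on its tape.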
TL;DR: History is not on your side, I'm afraid. Babbage invented computation, and Turing invented the programmable computer. Americans invented memory pipelines, the transistor, the compiler, and the first compilable programming language. Here is an American book by a famous Nobel-prize-winning physicist (Richard Feynman) in which the roots of computation are discussed and the credit for the invention is given to Alan Turing. It's called the Feynman Lectures on Computation; you should read it (or perhaps the silly movie is more your speed).
http://www.amazon.com/Feynman-Lectures-On-Computation-Richard/dp/0738202967
http://www.amazon.com/Practical-Cryptography-Niels-Ferguson/dp/0471223573
www.amazon.com/Applied-Cryptography-Protocols-Algorithms-Source/dp/1119096723/
http://www.amazon.com/Concrete-Mathematics-Foundation-Computer-Science/dp/0201558025
He has this book; a few of the chapters are basically this lecture. Not sure if it's what you're looking for, but I recommend it.