> There's also latency vs bandwidth considerations.
So overlooked! Decoding complexity/power is neither inherently good nor bad. Rather, it is an engineering tradeoff -- compact CISC instructions (often just a byte or two) use less memory/cache bandwidth and can be packed densely into caches. The RISC/CISC war was ended by the Pentium Pro (aka P6) -- a RISC-style micro-architecture implementing a CISC architecture, decoding x86 instructions into RISC-like micro-ops. Nearly all of Intel's mainstream desktop/laptop CPUs since then have been P6 derivatives (NetBurst/Pentium 4 and Atom are separate stories). AMD's Zen micro-architecture (and its derivatives) is a ground-up design based on the same model as P6: a RISC micro-architecture implementing a CISC architecture. So pretty much all RISC/CISC debate since ~1996 has been moot.
The real question is what are the tradeoffs? If I make my caches bigger and reduce my core floorplan, what micro-architectural optimizations am I giving up? Obviously 90% cache / 10% core is wrong. But then 90% core and 10% cache is probably wrong, too. So you can't reason in absolutes. You have to run the numbers on real code, using a credible model of the design and see what the various partitions do to real-world performance. There is no shortcut. And there's a whole textbook written on that.
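To make the "run the numbers" point concrete, here's a back-of-the-envelope sketch using the classic average-memory-access-time formula (the design points and numbers below are purely illustrative assumptions, not from any real chip):

```python
# AMAT = hit_time + miss_rate * miss_penalty (cycles).
# A bigger cache lowers the miss rate but costs core area (and maybe hit latency).
# Note what this toy model leaves out: the IPC you lose by shrinking the core.
# That's exactly why you end up running real traces through a credible model.

def amat(hit_time, miss_rate, miss_penalty):
    """Average memory access time, in cycles."""
    return hit_time + miss_rate * miss_penalty

# Hypothetical design points: (label, hit time, miss rate, miss penalty)
design_points = [
    ("small cache, bigger core ", 3, 0.08, 200),
    ("medium cache, medium core", 4, 0.04, 200),
    ("huge cache, smaller core ", 6, 0.02, 200),
]

for label, hit, miss, penalty in design_points:
    print(f"{label}: AMAT = {amat(hit, miss, penalty):.1f} cycles")
```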
Computer Architecture: A Quantitative Approach by Hennessy and Patterson is the definitive book used in universities all over the world. Hennessy created MIPS, and Patterson's Berkeley RISC work led to SPARC.
https://www.amazon.com/Computer-Architecture-Quantitative-Approach-Kaufmann/dp/0128119055/
We used the 2nd edition in school back in the '90s, but it's up to the 6th edition now. We covered branch predictors and Tomasulo's algorithm for out-of-order execution back then. I'm sure the latest edition has significant updates.
We modeled everything in C, but these days chip architects model everything in even higher-level languages like Python. It is easier to simulate the performance effects of different cache sizes, numbers of pipeline stages, etc. in a high-level language, driving the model with traces of program execution.
After the architecture is decided, they write specs, and then the implementation team writes Verilog and goes through the whole process of synthesis, place and route, timing analysis, LVS/DRC, etc.
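For a feel of what that kind of trace-driven modeling looks like, here's a minimal sketch (a toy direct-mapped cache with a made-up trace, not anyone's production simulator):

```python
# Minimal trace-driven, direct-mapped cache model (illustrative only).
# A real architectural simulator also models associativity, replacement,
# prefetchers, etc., but the skeleton is the same: replay addresses, count misses.

def simulate_cache(trace_addresses, cache_size_bytes, line_size_bytes=64):
    num_lines = cache_size_bytes // line_size_bytes
    tags = [None] * num_lines          # one tag per direct-mapped line
    misses = 0
    for addr in trace_addresses:
        line = addr // line_size_bytes
        index = line % num_lines
        if tags[index] != line:        # miss: fill the line
            tags[index] = line
            misses += 1
    return misses / len(trace_addresses)

# Hypothetical "trace": repeated sequential sweeps over a 96 KB working set,
# standing in for a real program trace.
working_set = 96 * 1024
trace = [(i * 64) % working_set for i in range(200_000)]

for size_kb in (16, 32, 64, 128):
    rate = simulate_cache(trace, size_kb * 1024)
    print(f"{size_kb:>4} KB cache: miss rate = {rate:.3f}")
```

Sweeping the cache-size knob like this is exactly the kind of experiment that is painful in RTL and trivial in a high-level model.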
https://www.amazon.com/Digital-Design-Computer-Architecture-Harris/dp/0123944244
This book teaches you how to implement a 32-bit MIPS processor with a single-cycle or multi-cycle design using Verilog.
Is it still this way? I think I saw that modern curricula were switching to RISC-V for educational purposes.
EDIT: using, e.g., this: https://www.amazon.de/Computer-Organization-Design-RISC-V-Architecture/dp/0128122757
I haven't done that LISP project, but I recommend Crafting Interpreters.
I know that you say you want to build a compiler. And this tripped me up when I first started. But an interpreter and a compiler are very, very similar--to the point of being nearly identical. You can build a compiler by building an interpreter, then swapping out the immediate-calculation functions for functions that generate machine code (or LLVM IR or whatever). You will often want at least a basic interpreter for your language anyway, if for no other purpose than simplifying expressions.
I also recommend the "Dragon Book", which is pretty much the textbook on compiler design. Reading this book was absolutely eye-opening, because it presents a bunch of very basic and general information that everybody in language design just assumes you know.
I really struggled when I started before I read this book because so much stuff is very much simpler than I imagined. For instance, for my first language project, I got stuck on codegen and never finished. I was stuck on the basic question: okay, but how do I know what order to generate the code in? It seemed like a very hard question, requiring DAGs and dependency analysis and all sorts of shit like that, and I got discouraged and quit. Especially because so many tutorials are like "now that you've parsed your language, interpreting it is easy and left as an exercise for the reader". Guess what? It is easy. It's just a brain-dead post-order traversal. But literally the only place that ever said it that simply and directly was the Dragon Book.
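To make that concrete, here's a minimal sketch (a made-up three-node AST, not from any particular tutorial) showing that both evaluating and generating code for an expression are the same post-order traversal:

```python
# Tiny expression AST: a node is either a number or (op, left, right).
# Evaluating it is a post-order traversal: children first, then the node itself.

def eval_node(node):
    if isinstance(node, (int, float)):
        return node
    op, left, right = node
    lhs, rhs = eval_node(left), eval_node(right)   # visit children first
    return {"+": lhs + rhs, "-": lhs - rhs,
            "*": lhs * rhs, "/": lhs / rhs}[op]

def compile_node(node, out):
    """Same traversal, but emit stack-machine ops instead of computing values."""
    if isinstance(node, (int, float)):
        out.append(("PUSH", node))
        return out
    op, left, right = node
    compile_node(left, out)
    compile_node(right, out)
    out.append(("BINOP", op))
    return out

ast = ("*", ("+", 1, 2), 4)            # (1 + 2) * 4
print(eval_node(ast))                  # 12
print(compile_node(ast, []))           # [('PUSH', 1), ('PUSH', 2), ('BINOP', '+'), ...]
```

Swap the arithmetic in eval_node for code emission, as in compile_node, and you have the start of a compiler.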
Ars Technica's "Inside the Machine" is an amazing book if you want to really learn how a CPU ticks.
Amazon link:
I would recommend the hardback variant.
The actual nuts and bolts of PC building and such are really best picked up by reading Wikipedia. Absorb specs and numbers and learn the history of things. Some of it will also only come with time, exposure, and conversations about them.
Fair point.
I should also have posted a reference to the Black Magic book, which covers this in great detail.
Amazon link: High Speed Digital Design: A Handbook of Black Magic
Another great book (this is the RISC-V edition, but there's also the classic MIPS edition):
This is also a very very good book:
https://www.amazon.com/Inside-Machine-Introduction-Microprocessors-Architecture/dp/1593276680
I agree, particularly with the first paragraph.
Ben's CPU videos coupled with Harris and Harris https://www.amazon.co.uk/Digital-Design-Computer-Architecture-Harris/dp/0123944244 is a great way to get junior EEs started with FPGAs.
I would start from there (CPU), but you won't be missing or struggling with anything if you start from somewhere else, or jump around a bit.
To be fair, Ben Eater's 8-bit computer uses a microcode ROM to control the processor, which is just one of many control schemes for a processor. Though, that's outside the scope of this comment.
OP, check out Digital Design and Computer Architecture for a good intro after you watch through some of Ben Eater's stuff. I'd HIGHLY recommend doing the questions at the end of each chapter to make sure you actually understand the stuff.
The book mostly covers question 3 (and I guess 4) above. 1 and 2 are, as others have stated, more VLSI questions.
On that front (again, briefly): chip designers use hardware description languages (like Verilog and VHDL) to describe the functionality of the chip at the register-transfer level (RTL). Synthesis tools then convert that RTL description into a gate-level netlist. They then "lay out" the chip (describe how they want things positioned with respect to one another), and the chip fabricator uses its cell libraries to turn that layout into a series of photolithography masks… you know what, this process is too damned complex to go through right now. Sam Zeloof has a great series of videos on this. He built a chip fabrication setup in his folks' garage when he was in high school. He's insane.
But How Do It Know? is a really good book that explains how computers work in a very simple way. There's also a short YouTube video that sums up the concepts in the book.
But How Do It Know? - The Basic Principles of Computers for Everyone https://www.amazon.com/dp/0615303765/ref=cm_sw_r_cp_api_glt_fabc_WM1AQKMRCGNFPYW5T2F5
> Programming principles and practice
Let me look at the book. C++ books will often have the C++ grammar as an appendix at the end of the book. I suspect this is simply part of the book itself.
EDIT: I looked at the book on Amazon and the grammar appears to be for a simple calculator - and not the C++ grammar. Understanding a grammar will be difficult for you if you have not been formally trained at university in computer science. Explaining grammars is probably too complicated for this sub. I don't know what to really tell you. If you have not studied this in college, then you are probably in a bit over your head.
Most of us who studied CS are familiar with the dragon book. This is kind of the quintessential introduction to compilers and grammars and parsers and tokenizers. You will probably be a bit lost if you have not read this book or something similar.
EDIT2:
Here is a reasonable college-level introduction to compilers.
Find it here on Amazon. It's actually a pretty good textbook and worth reading even if you're mostly a software developer.
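For a rough flavor of what a simple calculator grammar like the one in that appendix boils down to in code, here's a minimal sketch (my own toy grammar and recursive-descent parser, not the one from the book):

```python
# Toy grammar, evaluated directly by a recursive-descent parser:
#   expr   -> term  (('+' | '-') term)*
#   term   -> factor (('*' | '/') factor)*
#   factor -> NUMBER | '(' expr ')'

import re

def tokenize(text):
    # Numbers, operators, and parentheses; whitespace is skipped.
    return re.findall(r"\d+\.?\d*|[()+\-*/]", text)

class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def next(self):
        tok = self.peek()
        self.pos += 1
        return tok

    def expr(self):
        value = self.term()
        while self.peek() in ("+", "-"):
            op = self.next()
            rhs = self.term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def term(self):
        value = self.factor()
        while self.peek() in ("*", "/"):
            op = self.next()
            rhs = self.factor()
            value = value * rhs if op == "*" else value / rhs
        return value

    def factor(self):
        tok = self.next()
        if tok == "(":
            value = self.expr()
            self.next()          # consume ')'
            return value
        return float(tok)

print(Parser(tokenize("1 + 2 * (3 + 4)")).expr())   # 15.0
```

Each grammar rule becomes one function, which is why books present the grammar first.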
The "Pillar" book. "Computer Architecture: A Quantitative Approach" 6th Edition by John L. Hennessy (Author), David A. Patterson (Author). It's kinda the bible of high-performance computing. (The "downside" is that you always need the newest edition, because it gets updated every couple of years.)
https://smile.amazon.com/Computer-Organization-Design-RISC-V-Architecture/dp/0128122757
Just start writing the HDL. Put it on an FPGA.
There's also this one, but it tackles much more complex problems; the other one fit better with the topics you mentioned, I think. Kind of like "The Lives of the Saints" vs. the Bible :p
Computer Organization and Design by Patterson and Hennessy is a very, very good textbook on the subject.
These lectures from CMU are probably fine though I haven't watched them.
I also wouldn't hesitate to recommend Pong Chu's Verilog/VHDL books, which are very practical.
Digital Design and Computer Architecture, Second Edition is another book I recommend because it takes primitives and then successively builds components that are finally integrated to produce a RISC processor (bottom-up).
https://www.amazon.com/Digital-Design-Computer-Architecture-Second/dp/0123944244/
Other links you may want to check out:
https://www.reddit.com/r/ECE/comments/50tlkl/fpga_bookresource_reccomendations/d7c08i8/
If you plan on doing more of this, here is one of the better textbooks on the subject. It takes a very practical approach rather than pure theory. It is geared towards higher speeds than what you are working with, but the design techniques still apply.
http://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241
You don't need architecture per se unless you want high performance. If you want high performance, you pretty much want to learn the equivalent of a class in computer organization and computer architecture. So if you aren't dealing with raw pointers directly, advanced multithreading, vector/SIMD math, HPC, 3D graphics, GPUs, etc., you really don't need computer architecture. If any of those things interest you, you should probably learn it, and you can definitely learn it as you learn C++. If you are mainly dealing with the C++ standard library and GUI stuff, you will rarely encounter computer architecture topics.
The standard texts (and my favorites) are
http://www.amazon.com/Computer-Organization-Design-Fifth-Edition/dp/0124077269/ (Patterson and Hennessy).
http://www.amazon.com/Computer-Architecture-Fifth-Edition-Quantitative/dp/012383872X (Computer Architecture: A Quantitative Approach, Fifth Edition, Hennessy and Patterson)
Those are not referral links, and I recommend they be read in that order.
Some people might say those books are a bit rough for a beginner, so try to get a sample or borrow one to see if you like them. There are some other, easier computer org/arch books out there, but I'm not aware of any really good legally free ones off the top of my head.
I would like to thank you for the referral. It helped me a lot!
Would highly recommend this: http://www.amazon.com/Inside-Machine-Introduction-Microprocessors-Architecture/dp/1593276680/ref=sr_1_2?ie=UTF8&qid=1444361271&sr=8-2&keywords=Inside+the+machine
I hope they do an updated version.
Well, depending on how you look at it, it's either Boolean algebra or black magic.
TL;DR: You can click through to play with it and see what happens ;)
The idea is that you set one of the A/B/C inputs and one of the X/Y/Z inputs.
There are 9 outputs corresponding to 9 different score values for a specific game.
Once you set the inputs, one of those outputs will be set to 1, denoting the score.
The circuit uses fairly basic logic gates - AND and OR, plus one NOT.
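If it helps to see the same idea in code, here's a rough software model of what the description above suggests (a guess at the structure, not the actual netlist): each output is essentially the AND of one A/B/C input with one X/Y/Z input.

```python
# Software model of the described circuit (my guess at its structure, not the
# real netlist -- the real one apparently also uses OR and one NOT gate).
# One of A/B/C and one of X/Y/Z is set; each of the 9 outputs is the AND of a
# specific pair, so exactly one output goes to 1.

from itertools import product

def score_outputs(inputs):
    rows = ["A", "B", "C"]
    cols = ["X", "Y", "Z"]
    # Output for (row, col) = row AND col.
    return {r + c: inputs[r] & inputs[c] for r, c in product(rows, cols)}

example = {"A": 0, "B": 1, "C": 0, "X": 0, "Y": 0, "Z": 1}
print(score_outputs(example))   # only 'BZ' is 1
```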
>Where can I read more about how modern CPUs actually work? I write application software in C++ but there isn’t any need for low level optimizations. However I’m curious though.
This is actually a really good book.
Sounds like cool stuff.
For Zynq specifics and great examples on using Vivado for Zynq SoC development, have you seen the Zynq Book? There is a lot of great information in there.
For bottom up courses on FPGAs I'm not super familiar with any. A great text book which simply covers introductory details on hardware design, FPGAs, and hardware description languages is Digital Design and Computer Architecture by Harris and Harris. Not all chapters of the book are relevant to what you're interested in, but some contain the introductory information I think you're looking for. If you're uninterested in paying for the book, it shouldn't be too difficult to find it online 😉.
I don't know if this is even still relevant to your question, but I always wondered how a PC works and I stumbled over this YouTube video: How a CPU works.
Of course, it is only the most basic computer you can build, but with modern computers the basics are the same; you just have some more things that help do other things faster.
The video is based on this book: But How Do It Know?, which I have read and really recommend if you want to know a little more about the topic.
It’s called “computer architecture”.
The seminal textbook is this. Usually called “Hennessy and Patterson”.
Nand2Tetris is also a neat way to learn about it.
Entire books have been written about this topic, and they sell quite briskly.
High Speed Digital Design: A Handbook of Black Magic
And of course the Electronic Design Automation industry sells CAD software which helps you to simulate these phenomena, so you can debug and optimize your design before building boards and performing tests. Prepare to write a check for $100K, or else to use flakeazoid free software from ~~ditzy grad students~~ universities.
Macroeconomists have the purple book on international macro.
Computer scientists have the purple dragon book on compilers.
They serve roughly the same purpose.