I was talking about computers with my husband, who's an electrical engineer. I'm in IT and I understand computers from the OS level up, but it just blows my mind that electrons "know" what's going on in logic gates. He tried to explain it but had some trouble, so we looked for a better source.
We both got a good laugh when we stumbled on a book titled "But How Do It Know?" Highly recommend, this book is awesome.
Entirely unrelated to this post, but you made me think of it :-)
Also check out this book! Amazon.com/ButHowDoITKnow
This one teaches you CPU and memory architecture at the lowest level possible, in a wonderfully intuitive way.
"But How Do It Know?" by J. Clark Scott. I recommend that book to everyone who asks that question, because it's what made everything "click" for me when I was studying computer engineering. It starts from the very basics, and you get to build a CPU!
But How Do It Know? is a great introduction to how computers work.
Hyperthreading is also a way to utilize each core more effectively. Not all programs can run as many instructions simultaneously as emulators can, so hyperthreading lets you run two different threads on one core. It goes without saying that it's not as effective as having more cores, but it helps a lot on mid-range laptops doing typical user workloads.
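One place you can actually see hyperthreading from software is in the processor count the OS reports. A minimal sketch in Python (assuming a hyperthreaded machine; the exact numbers depend on your CPU):

```python
import os

# os.cpu_count() reports *logical* processors, not physical cores.
# On a hyperthreaded 4-core CPU this typically returns 8, because
# the OS schedules two hardware threads per physical core.
logical = os.cpu_count()
print(f"Logical processors visible to the OS: {logical}")
```

Dividing by two only gives the physical core count if every core is hyperthreaded, which isn't guaranteed, so treat this as a rough indicator.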
Faster state switching is another feature that helps with typical workloads. The processor changes (grossly simplified) between higher-performance and lower-performance states, and everything in between, more quickly and efficiently. So when a game suddenly gives the CPU more tasks to do, the CPU is quicker to adjust and deliver the necessary "power". This improves things like waking from sleep and opening programs, and reduces power usage and heat buildup.
There are also a large number of minor things, like branch prediction, and things we don't hear about as much (industry trade secrets), that add up to very tangible improvements. Besides my not knowing much about them, many of them are by nature kept secret, since Intel has plenty of competition.
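To give a flavor of what branch prediction is doing, here's a toy simulation of the classic 2-bit saturating-counter predictor (my own illustrative sketch; real predictors are far more sophisticated). The predictor guesses whether a branch will be taken based on recent history:

```python
def predict_branches(outcomes):
    """Simulate a 2-bit saturating-counter branch predictor.

    States 0-1 predict 'not taken'; states 2-3 predict 'taken'.
    Returns the fraction of branches predicted correctly.
    """
    state = 0  # start out strongly predicting 'not taken'
    correct = 0
    for taken in outcomes:
        prediction = state >= 2
        if prediction == taken:
            correct += 1
        # Nudge the counter toward the actual outcome, saturating at 0 and 3.
        state = min(state + 1, 3) if taken else max(state - 1, 0)
    return correct / len(outcomes)

# A loop branch that is taken 999 times, then falls through once:
loop_branch = [True] * 999 + [False]
print(predict_branches(loop_branch))  # -> 0.997
```

On highly regular patterns like loop branches the predictor is almost always right, which is why the CPU can keep its pipeline full instead of stalling at every branch.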
My reason for writing just a few words about these things was so you can understand that there is a plethora of ways CPUs can improve. And they do. And it might be impossible to get the typical consumer to do more than compare clock speeds (just getting them to factor in core count is hard). In short, the topic is much less simple than it looks at first glance.
If you want to learn more about processors, starting with a simple one helps a huge amount. A guy named J. Clark Scott designed a fully working theoretical 8-bit CPU architecture, similar to actual 8-bit CPUs, just so he could explain in detail how CPUs "know" stuff. He starts off by showing how you can build logic gates using transistors, then how you can build all of the components out of logic gates. And the book is very easy to follow. amazon link
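That progression (transistors to gates, gates to components) can be sketched in a few lines of Python. This is my own toy model, not taken from the book: every gate below is built from NAND alone, and then a half adder is built from those gates.

```python
def nand(a, b):
    # The primitive: in hardware, a couple of transistors.
    return not (a and b)

# Every other gate can be built from NAND alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Add two 1-bit values; returns (sum, carry)."""
    return xor_(a, b), and_(a, b)

for a in (False, True):
    for b in (False, True):
        print(int(a), "+", int(b), "->", half_adder(a, b))
```

Chain half adders into full adders and you can add whole binary numbers, which is essentially how the book builds up an ALU.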
If you're interested in seeing how computers work from the transistor level up, the book "But How Do It Know?" is excellent:
https://smile.amazon.com/But-How-Know-Principles-Computers-ebook/dp/B00F25LEVC?sa-no-redirect=1
The website has a cool virtual CPU that you can step through one clock tick at a time that really helped me understand what was going on:
Highly suggest either of these books if you like to read: https://www.amazon.com/But-How-Know-Principles-Computers-ebook/dp/B00F25LEVC https://www.amazon.com/Code-Language-Computer-Developer-Practices-ebook/dp/B00JDMPOK2/ref=sr_1_2?crid=X11A1TDGZA4H&keywords=code&qid=1660151958&s=digital-text&sprefix=cod%2Cdigital-text%2C64&sr=1-2 They will take you from the simple concept of binary to how a computer works at a low level, and they're written in a very beginner-friendly way (no prior CS knowledge needed). I've read both, and my overall understanding of computer architecture feels much broader and more developed.