Quantum Computation and Quantum Information by Nielsen and Chuang is the standard intro textbook on the subject. It's how I got into the field, and I've taught classes using it as well. Highly recommended.
I don't think jumping directly into a programming language for quantum computing is the best idea. To me it seems like trying to understand addition and multiplication by learning Java. I would tell you to spend time at a university learning the required material, but it doesn't sound like that's an option.
I think your best bet for a shortcut is to pick up Quantum Computation and Quantum Information by Nielsen and Chuang and start reading. It's the standard intro textbook for the field. Whenever you read something you don't understand, find some resources that explain the concept to you. You can always come back here and ask if you're stuck on something.
I'm not very surprised because pathfinder has a very passionate fanbase, but it has to be one of the few fanbases that is obsessed with the better game. Almost every other rpg, except for dnd perhaps, is fully aware of the niche they're targeting.
Pathfinder 1e had 28 official source books, 7714 pages in total; how is it not obvious that they're targeting experienced players? And PF 2e's core rulebook already had more pages than D&D 5e's PHB and XGtE combined. I have a book on quantum computing that's about as long, and as dense, as PF 2e's core rules.
I presume this is the book you are referring to, correct? this
It seems pretty old, considering they have 10th anniversary editions out. Is it still regarded as one of the better textbooks out there for this subject matter today?
/u/the_poope is spot on when separating the 'software' and 'hardware' part of quantum computing. It's an obvious separation when talking about classical computing. You don't need to know how a transistor works in order to learn Java. With your background it's pretty clear you're looking at the software part of quantum computing. I'd strongly recommend getting your hands on Quantum Computation and Quantum Information by Nielsen and Chuang. It will teach you all the basics /u/the_poope mentions and I think you have the prerequisites for it.
Quantum Computation and Quantum Information by Nielsen and Chuang is a brilliant intro book on the subject
Asking to do quantum computing in Python is missing the point entirely. That's like wanting a mathematical proof in a Texan accent. The algorithm is what's important, not whatever language it happens to be written down in.
I wasn't aware that we had a standardized protocol on quantum key distribution.
It is surprising that they would standardize an abort for no reason.
Edit-- Cannot find anything on this standard. By "they" do you refer to the authors of BB84? Because as I linked, they did later work showing that privacy amplification can handle a good amount of error from adversary observations. If you have access to Mike and Ike, they explicitly discuss privacy amplification as part of BB84. And it handles a large fraction of errors.
edit-- I need to relax.
As far as we know quantum computation does not transcend Turing computability. The following is from Nielsen's book on Quantum Computation:
> quantum computers also obey the Church–Turing thesis. That is, quantum computers can compute the same class of functions as is computable by a Turing machine. The difference between quantum computers and Turing machines turns out to lie in the efficiency with which the computation of the function may be performed
YOU HAVE COME TO THE RIGHT PLACE MY BOY! TODAY I GOT LINKS FOR DAYZZZ
IBM-Q is an online, cloud-based 5-qubit quantum computer open to the public. You can "write" simple algorithms on it, and it has resources to learn from. https://www.ibm.com/quantum-computing/learn/what-is-quantum-computing
This video is a good introduction to quantum computing by one of the lead QC scientists/engineers at IBM: https://youtu.be/JRIPV0dPAd4
A video overview of the math involved in quantum computing, https://youtu.be/IrbJYsep45E
For a full understanding of the topic at a college introductory level, buy this book: https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176
Along the way you should also look into languages like Q# by Microsoft. I believe Google uses a Python library (Cirq) for it.
https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176
Best book hands down. This will bring you to the frontier of quantum computing. The book is also very approachable and meant for people trying to learn. It covers some linear algebra as well as physics in order to bring you up to speed.
Michael Nielsen is an amazing educator and expert in the field. His YouTube lecture course, Quantum Computing for the Determined (https://www.youtube.com/playlist?list=PL1826E60FD05B44E4), is a short version of that book. He also has a free book online on Neural Networks that is probably the most referenced source on the matter. http://neuralnetworksanddeeplearning.com/index.html
Hey MachLearningEnthu
I was in the same boat after my Bachelors (Electronic Engineering). I started to follow courses on any platform I could find, to get a basic understanding of Quantum computing. Here are the resources I used (at least what I can recall):
Intro book (good to start with): https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176 (you can probably find this online)
YouTube playlist by Nielsen: https://www.youtube.com/playlist?list=PL1826E60FD05B44E4
Quantum cryptography (online course): https://www.edx.org/course/quantum-cryptography-0
For an introduction to quantum physics: https://ocw.mit.edu/courses/physics/8-04-quantum-physics-i-spring-2013/lecture-videos/ (you can then look at Quantum Physics II and III after)
Since you’ve got a CS background, I think you can start with some quantum cryptography, as it may seem familiar.
I personally use Anki, which is a free program for digital flashcards and spaced repetition. Michael Nielsen (who wrote the book on quantum computing) has written about it here.
I acknowledge my bias, but I think the best way to learn outside an academic institution is the Qiskit textbook here: https://qiskit.org/textbook/preface.html
Note that I do work on this, but we work really hard to produce the best material possible.
Other than that, I hear this is the best material, but don't know as much about it: https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176
Useful resources for getting into quantum computing:
Congrats on graduation! You're entering the field at a promising time.
Like some of the other comments have suggested IBM Qiskit has some great tutorials that get you started with understanding QC and some of the language used around it. In addition I would recommend checking out Xanadu's Pennylane library and tutorials for quantum machine learning, which is a field of great interest in the software/application side of QC right now.
In addition, you will find a lot of people in the field reference this book on introduction to quantum computing as the gold standard. It's a bit expensive on this listing, but if you do some digging there are ways to get cheap printed copies or free PDF's.
Good luck on your studies!
Information, in very broad terms, is some possible state of some object. For example, a lightbulb can be switched |on> or |off> (|state> is the notation for a state).
Classical mechanics says that an object can only be in a single state at a time. For example, a light switch: |on> and |off> are its only possible states.
Quantum mechanics, on the other hand, allows a superposition of states. Imagine we have a perfectly isolated box, and we cannot make any measurements of anything inside it while it's closed. Now, put a lightbulb that is |off> inside, and add an automatic switch that will randomly turn it |on> at some point, with a 50% likelihood of flipping the lightbulb's state in each minute.
After 1 minute, there is a 50% chance the light is on. After 2 minutes, there's a 50% chance it was already turned on in the first minute, and another 25% chance it will be turned on in the second minute, so the total chance is 75%. After 3 minutes, the chance is 87.5%, and so on.
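As a quick sanity check of those numbers: if the switch fires with 50% probability each minute, the chance the light is still off after n minutes is 0.5^n, so the chance it is on is 1 - 0.5^n. A tiny Python sketch (the function name is just mine for illustration):

```python
def prob_on(minutes: int) -> float:
    # The light is still off only if the switch failed every minute,
    # which happens with probability 0.5 ** minutes.
    return 1 - 0.5 ** minutes

print(prob_on(1))  # 0.5
print(prob_on(2))  # 0.75
print(prob_on(3))  # 0.875
```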
Now, because we cannot access the information of whether the light in the box is on, there is no way to observe the exact state of the light. However, we still know it is either |on> or |off> with certain likelihoods. We can therefore describe it as a superposition of these two states: it is not completely in either of them, but in a blend of the two. After 1 minute this could e.g. be √0.5 * |on> + √0.5 * |off> (the squares of the coefficients must add up to 1).
If you try to measure the states (e.g. by opening the box and seeing if the light is on), the superposition will collapse, and we will see it's either on or off. However, if no measurements are made, and we let it interact with some other objects (without measuring the states within interactions), the states may change a bit differently than if we had the measurements.
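To make the amplitude bookkeeping concrete, here's a toy sketch in plain Python (not any real quantum library; the names are mine): a state is a pair of amplitudes, the squared amplitudes give measurement probabilities, and measuring collapses the blend to a single outcome.

```python
import math
import random

# The 1-minute state from above: sqrt(0.5)*|off> + sqrt(0.5)*|on>.
state = (math.sqrt(0.5), math.sqrt(0.5))  # (amplitude_off, amplitude_on)

def measure(state, rng=random.random):
    """Collapse the superposition: return 'off' or 'on' with probability
    equal to the squared amplitude of each component."""
    a_off, a_on = state
    return 'off' if rng() < a_off ** 2 else 'on'

# The squared amplitudes must sum to 1 (a valid probability distribution).
assert abs(state[0] ** 2 + state[1] ** 2 - 1) < 1e-12
```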
Giving a real world example of how this really works is not easy. However, there is the famous [double-slit experiment](https://en.m.wikipedia.org/wiki/Double-slit_experiment), where we have 2 small slits in a piece of cardboard, and we shoot electrons through these openings. Each electron is actually in a superposition of passing through the |left> slit and the |right> slit. A classical particle would pass through exactly one slit and travel in a straight line from the source, while a wave passes through both slits and creates an interference pattern on the other side.
If we do not measure which slit the electrons passed through, they behave like waves, forming the interference pattern on the other side. If we do put a measuring device on the slits, the superposition collapses there, and we get the simpler pattern expected from particles.
This kind of information can be used to represent bits, |0> and |1>. A quantum bit that can be in a superposition of |0> and |1> is called a qubit.
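As a rough illustration of the qubit idea, in plain Python (again a toy, not a real quantum library): a qubit is a 2-vector of amplitudes over |0> and |1>, and a Hadamard gate turns |0> into an equal superposition of the two.

```python
import math

# |0> as an amplitude vector: all the weight on the first component.
ket0 = [1.0, 0.0]

def hadamard(q):
    """Apply the Hadamard gate: maps |0> to (|0> + |1>) / sqrt(2)."""
    s = 1 / math.sqrt(2)
    return [s * (q[0] + q[1]), s * (q[0] - q[1])]

superposed = hadamard(ket0)
probs = [a ** 2 for a in superposed]  # measurement probabilities: [0.5, 0.5]
```

Note that applying the Hadamard gate twice brings the qubit back to |0>, which is one way superpositions differ from mere classical uncertainty.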
Quantum computing then uses these principles for some really useful algorithms, like Grover's algorithm, which searches unstructured data quadratically faster than any classical algorithm, or Shor's algorithm, which factorizes numbers much faster than the best known classical methods.
Here's a link with a basic overview: https://www.aps.org/programs/education/highschool/teachers/quantum.cfm
This is a pretty good book on Quantum Mechanics: https://www.wiley.com/en-us/Quantum+Mechanics%3A+Concepts+and+Applications%2C+2nd+Edition-p-9780470026793
This is a great book on Quantum Computing: https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176
Study this.
https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176
Good luck.
You might like https://www.amazon.it/Quantum-Computation-Information-10th-Anniversary/dp/1107002176 The first chapter goes over what someone wanting to study quantum computing should know about quantum mechanics, from a linear algebra point of view. Very very clear, not much phenomenology, in case you wanted to understand the operator formalism better. Otherwise I will throw this here randomly and suggest asking on physics.stackexchange: https://online.stanford.edu/courses/soe-yeeqmse01-quantum-mechanics-scientists-and-engineers
In terms of books, Nielsen and Chuang is a good basis to start with. If you want to gain some intuition, you could try BLACK OPAL
Everyone else here has the textbook style covered (Nielsen and Chuang is the standard but almost 20 years old now). Quantum Computing Since Democritus is a good contextual approach, and Quantum Computing: A Gentle Introduction rounds out my standard textbook-y recommendations.
I teach a lot of quantum computing (did a PhD in the field) and work as a quantum software developer now, and I wasn't really happy with what existed 2 years ago for helping bootstrap software developers into quantum computing. I'm also just not a fan of the dry textbook approach, so I approached Chris Granade (who helped Aaronson prepare Quantum Computing Since Democritus and is a prolific researcher in the field) and we decided to write one! What resulted was Learn Quantum Computing with Python and Q#. You start learning about quantum computing by using Python to write a quantum simulator to play some games, and then move on to using Q#, a domain-specific programming language, to program real quantum applications like factoring and chemistry simulations. We try to keep it fun and light and give you the tools to explore QC further. Happy to answer any other questions, and you might find some more great resources on awesome lists or the quantum open source foundation.
I like Michael Nielsen's discussions of trendy tech fields, such as deep learning, Google's tech stack, causality, Bitcoin, and quantum computing (which is probably what he's most well known for). I think his write-ups strike a good balance between actual substance and accessibility to a general, but intelligent, audience. He definitely doesn't post at the frequency of Matt Levine, but brand new sci/tech areas don't develop every week either. Some of these posts may be kind of dated at this point, but they were a nice introduction at the time they were posted.
Quantum Computation and Quantum Information is generally considered the Bible of the field. Requirements are calculus and linear algebra.
The quantum algorithm zoo is a fairly comprehensive overview of quantum algorithms. Probably a bit overwhelming at first glance.
Read this, then try again:
Quantum Computation and Quantum Information has almost no calculus in it at all, but lots of linear algebra.
Just Googled the textbooks, as I do not really know the field so well, but are you referring to this book?