
Quantum Computation Lecture Notes (2022)

hedgehog0

De Wolf's notes are also one of the standard references right now, and more up-to-date than the QC&QI book: https://arxiv.org/abs/1907.09415

ofjcihen

Thanks for this.

I’ve recently become very interested in QC and purchased and read Quantum Computation and Quantum Information, which I think is the standard book on the subject right now.

I’m even more interested in applying what I’ve learned, but I’m at a loss as to how to begin working in the industry. Aside from getting a new master’s degree, I wouldn’t know where to begin, and resources on the matter are understandably sparse.

qnleigh

Yes that's still a great book, though it's starting to get a bit outdated. Some recent developments that would belong in an updated edition:

- The section on error correction is still gold, but it doesn't cover "scalable codes" like the Surface Code (and other LDPC codes; lots of exciting progress there)
- Superconducting qubits: https://arxiv.org/abs/1904.06560
- Rydberg atoms: see Nature papers from Misha Lukin's group on the subject
- Photonic quantum computing

These might be hard to follow now, but if you make it through a good chunk of Nielsen and Chuang, then they might become quite readable. Make sure you solve lots of problems or it won't stick.

Like other commenters have pointed out, quantum computing companies need lots of software engineers, so that's a very viable entry into the field for many people. Here's an arbitrary list of some relevant skills:

- Qutip! You can learn sooo much quantum mechanics by playing around in Qutip, and it's quite easy to use (see the sketch after this list).
- Rust or C++ (depending on the company?)
- FPGA programming
- Python (ofc)
- Linear algebra
- ...
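For a flavor of what playing around in QuTiP looks like, here's a minimal sketch (the Hamiltonian and drive strength are made up for illustration):

```python
# Minimal QuTiP sketch: Rabi oscillations of a driven qubit.
# Assumes qutip and numpy are installed; parameters are arbitrary.
import numpy as np
from qutip import basis, sigmax, sigmaz, sesolve

omega = 2 * np.pi * 0.1             # drive strength (arbitrary units)
H = 0.5 * omega * sigmax()          # driving Hamiltonian
psi0 = basis(2, 0)                  # start in |0>
times = np.linspace(0.0, 20.0, 200)

# Solve the Schrodinger equation and track <sigma_z> over time.
result = sesolve(H, psi0, times, e_ops=[sigmaz()])
print(result.expect[0][:5])         # oscillates between +1 and -1
```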

abdullahkhalids

You can definitely work at QC companies even without having a degree in the field. Many QC companies hire people from other fields because they require that expertise, say people with experience in numerical optimization. Of course, many QC companies also hire software engineers because they have complex internal software. If you are a software engineer, you can start there and then start to move laterally within the company.

Source: work at a QC company as a scientist.

ofjcihen

That’s something I didn’t think about. Thank you

dheera

It's actually not that difficult to pick up quantum mechanics and quantum computing if you have a solid foundation in linear algebra. QC really just reduces down to "applied linear algebra on crack".
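To make that concrete, here's a toy numpy sketch of the "applied linear algebra" view: states are unit vectors, gates are unitary matrices, and measurement probabilities come from the Born rule.

```python
# Toy sketch: a Hadamard gate acting on |0>, in plain numpy.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)          # |0> as a column vector
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate (a unitary matrix)

psi = H @ ket0                                  # (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2                        # Born rule
print(psi, probs)                               # amplitudes ~0.707, probabilities 0.5/0.5
```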

If you're in AI, you might be pleased to know that the probability distribution of a particle over its energy states is the softmax of the negative energies divided by the temperature (the Boltzmann distribution), which is where the concept of LLM "temperature" comes from. If you have a linear algebra background, those energies are the eigenvalues of a Hamiltonian. Physics is actually quite beautiful.
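A quick numpy sketch of that connection (the Hamiltonian here is just a random Hermitian matrix, for illustration):

```python
# Boltzmann distribution over energy eigenstates = softmax(-E / T) (k_B = 1),
# the same functional form as LLM sampling temperature.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
H = (A + A.T) / 2                      # a random Hermitian "Hamiltonian"
energies = np.linalg.eigvalsh(H)       # its eigenvalues are the energies

def boltzmann(E, T):
    logits = -E / T
    logits -= logits.max()             # subtract max for numerical stability
    w = np.exp(logits)
    return w / w.sum()

print(boltzmann(energies, T=0.1))      # low T: concentrated on the ground state
print(boltzmann(energies, T=100.0))    # high T: approaches uniform
```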

Getting into industry is another issue though; it seems every company favors credentials over learning ability these days. If you haven't published 1500 papers on the subject, you're automatically rejected.

husky8

I made a podcast in NotebookLM once I saw the equations. Enjoy: https://notebooklm.google.com/notebook/bc7616c4-1c71-4a04-b2...

vismit2000

Getting this message - "Oops! This audio could not be loaded."

mikestorrent

Of course, no discussion of quantum annealing, the only practical form of quantum computation that is likely to exist at scale in our lifetimes.

msgodel

Is it practical? The little I've messed with it, it seemed borderline useless. All it can do is QUBO, and encoding the problem into the machine topology itself is another QUBO problem that has to be solved on a classical computer.

People also keep talking about using it for AI, but all you can train with it are Boltzmann machines, because those are all that map onto QUBO problems.
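For anyone unfamiliar with the term, here's a toy sketch of what a QUBO instance is (the coefficients are made up); an annealer samples low-energy bitstrings instead of brute-forcing:

```python
# QUBO: minimize x^T Q x over binary vectors x in {0,1}^n.
# Brute force over all 2^n assignments, just to show the problem shape.
import itertools
import numpy as np

Q = np.array([[-1.0, 2.0, 0.0],
              [ 0.0, -1.0, 2.0],
              [ 0.0,  0.0, -1.0]])    # example coefficients (made up)

energy = lambda x: np.array(x) @ Q @ np.array(x)
best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best, energy(best))             # (1, 0, 1) with energy -2.0
```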

adastra22

That's a strong statement. Regardless, the content of the course is explicitly targeted at gate-based quantum computing.

fogof

Very funny to me that lecture 21 is one of the only lecture titles that doesn't reference the name of the originator.

polamolo

I feel like I've seen/done this before. Could I be stuck in a Groundhog Day loop?

zara_is_reading

I’ve 70.71% seen it before and 70.71% not seen it before.

rvz

Well, right now I am very skeptical. We have given quantum computing plenty of time (decades, at this point), and I will stay skeptical unless someone can convince me that it is not a scam.

Right now it hasn't amounted to anything useful: other than Shor's algorithm, it's all 'experiments', promises, and applications that do no better than a GPU rack right now.

JanisErdmanis

A scam is a strong word; it implies a malicious interest in selling it without working towards making returns for the investors. But it sure looks like a dead horse now.

The next big challenge will be mounting the control hardware, currently connected via coaxial cables, onto the chip while preventing the introduction of new sources of interference, so that error correction can run. That will take a miracle.

Of course, the alternative is a million coaxial cables connected to a chip cooled to millikelvin temperatures.

japanuspus

> Well, right now I am very skeptical. We have given quantum computing plenty of time (decades, at this point), and I will stay skeptical unless someone can convince me that it is not a scam.

Shor's paper on polynomial-time factoring is from 1997, and the first real demonstration of quantum hardware (Monroe et al.) is from 1995. Yes, quantum has had decades -- but only barely, and it has certainly only now started to span generations.

To see the kind of progress this means, take a look at some of the recent PhD spinouts of leading research groups (Oxford Ionics etc.): there are a lot of organisations with nothing but engineering to go before they reach fault tolerance.

When I came back to quantum three years ago, fault tolerance was still to be based on the surface-code ideas that were floating around when I did my PhD ('04). Today, after everyone has started looking harder, it turns out that a bit of long-range connectivity can cut the error-correction overhead by orders of magnitude (see recent public posts by IBM Quantum): the goalposts for fault tolerance are moving in the right direction.

And this is the key thing about quantum computing: you need error correction, and you need to do it with the same error-prone hardware that you correct for. There is a threshold hardware quality that will let you do this at a reasonable overhead, and before you reach this threshold all you have is a fancy random number generator.
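A rule-of-thumb sketch of that threshold statement (the threshold value and prefactor below are illustrative, not measured numbers): below threshold, making the code bigger suppresses logical errors exponentially; above it, bigger codes only make things worse.

```python
# Sketch of surface-code-style scaling: P_logical ~ A * (p / p_th)^((d+1)/2)
# for physical error rate p and code distance d. Constants are illustrative.
p_th = 1e-2   # assumed threshold physical error rate
A = 0.1       # assumed prefactor

def logical_error(p, d):
    return A * (p / p_th) ** ((d + 1) // 2)

for p in (2e-2, 5e-3, 1e-3):   # above, below, and well below threshold
    print(p, [logical_error(p, d) for d in (3, 7, 11)])
```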

But yes, feel free to be a pessimist -- just remember to own it when quantum happens in a few years.

William_BB

"When quantum happens in a few years" -- do you mean NISQ (i.e. VQA, QAOA) or actual fault tolerant quantum computing that can run Shor?

ttshaw1

We're already at NISQ

andrewla

I'm a skeptic as well, but calling it a "scam" is a bit extreme. I think QC proponents are acting in good faith, and I believe it is worth chasing a little longer, since we don't yet have a convincing model for why QC will or won't work (although I think Gil Kalai's work in this area is intuitively correct, I don't think we have a physical explanation for why quantum error correction would not work).

The current emphasis on NISQ systems is a bit of a desperate measure because the most we can get out of such systems is evidence that quantum computing can work in theory; they do not advance us towards having a workable quantum computer.

n4r9

Although I agree that "scam" is extreme, the commercial side was sullied in the early 2010s by D-Wave selling what they described as "quantum computers" for $10m and spinning up a bunch of misleading PR. Of course you expect a certain degree of "fake it til you make it" in such companies, but they'd been going for over a decade at that point. This was all kicking off as I was doing my PhD in the field. It was eye-opening to see how little attention was paid to serious academics vs hucksters, and how companies like Google could be duped into spending millions on basically nothing.

wasabi991011

Fwiw:

The last paper I saw posted on Hacker News from Gil Kalai, from a few years back, included a few explicit predictions about what would be impossible in quantum error correction.

The problem is that Google has since published results which imply that some of Kalai's predictions turned out false: their recent "below threshold"/"beyond break-even" QEC paper. IIRC, Kalai predicted below-threshold QEC to be impossible, among other things.

Not sure if Kalai has responded or updated his predictions, I haven't been following him closely.

n4r9

For quantum computing, as a rule of thumb I generally look to what Scott Aaronson says. And as you suggest, while some cool stuff is being done both in industry and academia, we are nowhere near general quantum computers. I haven't checked what his outlook is for the next 5-10 years.

honzaik

this may give you an idea about his current outlook https://www.youtube.com/watch?v=DQFyQgA_GE4

bawolff

You should stop viewing it as a tech start up and view it more as a physics experiment.

In some ways it is similar to fusion. People have been working on it for a long time. The benefits are potentially significant (Shor is cute and all, but really the big deal would be a cheap way to simulate other quantum systems), but the challenges are also significant. Real progress is being made. Things that were super challenging 10 years ago are solved now. The field is advancing. But we still have a long way to go.

It is not a scam itself, but a lot of scammers use the language of quantum to sell their scams. You should treat anyone trying to convince you that they will have a useful quantum computer in the next 5 years the same way as someone offering you a fusion reactor (i.e. full of shit).

It's still a worthwhile pursuit, even just as a physics experiment. It pushes the "weirdness" of quantum physics to the limit -- by literally disproving the extended Church-Turing thesis. If we make a real quantum computer, that is proof that quantum physics is really how our world works. It's not just something else that is being misinterpreted.

DebtDeflation

This is exactly it. QC right now is a series of very cool science experiments that are being marketed to [government officials, CEOs, investors, the general public] as product development, which it is not. We're at the stage of scientists in the 1910s creating the first vacuum tubes and noting the ability to use small currents to control larger ones, but these companies are pretending it's instead the early 1980s, with the PC, the 8088, and Moore's Law getting ready to take off like a rocket.

Viliam1234

I am not an expert, but seems to me it is caused by two things:

1) While quantum computers are potentially exponentially faster, they also seem to get exponentially more expensive as the number of qubits grows, so you can't actually save money by building a huge quantum computer. This may or may not change in the future. There is also the problem of error correction, which is made much harder by the nature of quantum computing. Smart people are working on that; I don't know the current state of progress.

2) Despite the hype, only some problems can be computed exponentially faster on a quantum computer, not all of them. This is analogous to parallel computing: having two CPUs instead of one lets you compute some things twice as fast, but other things take exactly the same amount of time because their steps must be done sequentially. Similarly, a quantum computer is like a network of billions of computers spread across the multiverse, but they all need to run the same code, and the results of the gigantic computation must be compressed into about a dozen bytes. So it's great for highly parallelizable tasks where the entire required output is a yes-or-no or a single number... and less useful for everything else. That still includes some important scientific problems, such as simulating atoms and molecules. But those are not the things we typically use computers for.
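A toy numpy sketch of that last point (sizes are arbitrary): n qubits carry 2^n amplitudes, but reading them out yields only a single n-bit string.

```python
# "Many branches, one small answer": a uniform superposition over 2^20
# basis states, from which a measurement returns just one 20-bit outcome.
import numpy as np

n = 20
state = np.full(2**n, 1 / np.sqrt(2**n))   # equal amplitude on every branch
probs = np.abs(state) ** 2                 # Born rule
outcome = np.random.default_rng().choice(len(probs), p=probs)
print(format(int(outcome), f"0{n}b"))      # a single 20-bit string is all you get
```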

JBits

There are multiple competing quantum computing hardware platforms, so you haven't given each of them the same length of time.

vonneumannstan

It's not a scam. There are many applications in materials science for example. Your ignorance of them doesn't make it a scam.

revskill

How can a person become so good at researching?

gaze

I don't know how else to say it but you just have to do it. It's like asking how to get good at running. You run.

graycat

> How can a person become so good at researching?

My time in my Ph.D. program and some of the work in my career (getting paid) suggest that I was "good at researching". But I left such research due to wanting to get paid more, and settled on starting a business, owning it, and making it valuable. If some research can help the business, fine, but the real goal is just the money from a successful business.

On (academic) research, one lesson no one ever mentioned to me but that I eventually formulated: pick a field of research. In that field, a lot about what is expected, respected, intended, valued or not, ..., is not much spoken about and not made clear -- fertile ground for politics. For such questions, the answers you guess or get in one field will likely be quite different in another. Some fields can remind you of the old quip: "Did Haydn write 101 symphonies, or one symphony 101 times?" Or at times you can believe that, with high probability, a paper gets read by just two people, the peer reviewer and the author; in that case, the only accomplishment of papers, good or bad, is that they get counted, as in someone with 50 papers is regarded as better than someone with only 4. Ah, but it's tough to prove that a paper will never get 1000 readers!

For research, one approach is to study a field (assume an academic one), crawl down some narrow alley or rabbit hole, see a question with no answer, consider the broad status of the field, and then, if making progress on the question seems not obviously impossible, give it a try. Within a few days or weeks you should have an answer, a partial answer, or hints that by continuing you might get something.

Another approach is to pick a problem mostly on your own, not by trying to extend published research. You might follow some instance of personal curiosity or something from some other field, e.g., do some math, optimization, or statistics research on problems from the environment (why the ups and downs of lobsters in east Canada?), medical testing, the supply chain, some engineering problem, some business problem, etc.

Do note that in the US, after radar, the proximity fuze, submarine acoustics, code breaking, jet engines, and the "bomb", the US military had plenty of both money and problems, and that funded a lot of US research. Now there seems to be a general view: we don't know what research directions will yield powerful results, but since we can't afford to miss out on big results or fall behind, we will continue to fund research. Non-military research seems less eager for results and has less money.

Ah, be good at the politics, e.g., even follow "Always look for the hidden agenda." If working in an organization, beware of "goal subordination", i.e., others working to have you fail.

11101010001100

What, if anything, did you port from your research career to the business?

graycat

I derived some math, new at least to me, based on my pure math background: measure theory and some functional analysis from Rudin, probability based on measure theory from Neveu, tightness in probability (once used in some statistics for computer science, presented at NASDAQ), some optimization via the Kuhn-Tucker conditions, some stochastic optimal control, etc.

xeonmc

just get nerd-sniped.

lightbendover

(1) be naturally gifted, (2) avoid falling down wells of distraction, (3) be lucky. Don’t sleep on (3), it’s easy to call it capitalizing on opportunity in hindsight when it was honestly just luck.

jasonhong

I'd also add (4) be incredibly curious about lots of things; (5) surround yourself with other smart, curious, and committed people who have a culture of critiquing ideas; and (6) devote a lot of time to deep thinking.

rwoerz

(8) be good in counting things, (h) be consistent in your thinking, (10) have a good memory, (11) be good in counting things, (12) refrain from making silly comments

TechDebtDevin

Luck in terms of natural intelligence and focus. But generally, people can improve their ability to avoid falling down the "wells of distraction". This year I set time filters for apps on my phone and it's made a stark difference. Even though I can turn off "work mode" to get around it, that little reminder has saved me hundreds of hours, as I usually just put down the phone when I see it.

m3kw9

As soon as a math equation comes up I get lost.