Dr. Terry Rudolph, of PsiQuantum and Imperial College London, recently presented this talk, and it's worth a watch.
From the abstract:
All components of a photonic quantum computer can be built in a tier-1 foundry, and packaged in the same back-end-of-line processes used to build laptops and cellphones. Thus with photons we can realistically stare down the sorts of numbers (~1 million qubits) which capture the size of machine required to do useful quantum computation. However, the specific type of entanglement we need between different photons can only be created probabilistically, and is difficult to create in the presence of loss. In this talk I will overview an architecture, Fusion-Based Quantum Computing (FBQC), that sits somewhere between the extremes of matter (circuit) based and one-way (cluster state) quantum computing. It requires the production of only fixed-size entangled states regardless of the size of computation being performed, and these states can have a high probability of loss (or failure to be produced at all). This allows us to attempt creation of the desired entangled states multiple times in parallel, and then to select out successful events. It also allows for the extensive use of high-quality fixed optical delays (e.g. fiber) to amplify the effects of small amounts of entanglement by multiple orders of magnitude.
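The "attempt in parallel, select out successes" idea from the abstract (often called multiplexing) can be illustrated with a toy Monte Carlo sketch. This is not PsiQuantum's actual scheme, just a minimal model: assume each independent attempt at producing an entangled state succeeds with probability `p`, so running `k` attempts in parallel succeeds with probability 1 − (1 − p)^k. The values `p = 0.1` and `k = 32` below are illustrative, not numbers from the talk.

```python
import random

def multiplexed_success(p, k, trials=100_000, seed=0):
    """Monte Carlo estimate of the probability that at least one of k
    parallel attempts at a probabilistic generation event succeeds.
    Each attempt is modeled as an independent Bernoulli(p) trial."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Success if any one of the k parallel attempts heralds success.
        if any(rng.random() < p for _ in range(k)):
            hits += 1
    return hits / trials

# Even a low per-attempt success probability is amplified by parallelism.
p, k = 0.1, 32
print(f"simulated: {multiplexed_success(p, k):.3f}")
print(f"analytic:  {1 - (1 - p) ** k:.3f}")
```

The analytic line shows why fixed-size resource states with a high failure rate are still workable: the overall success probability approaches one exponentially fast in the number of parallel attempts.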