• 0 Posts
  • 5 Comments
Joined 2 years ago
Cake day: July 7th, 2024


  • Quantum computers are theoretically faster because of the non-separable nature of quantum systems.

    Imagine you have a classical computer where some logic gates flip bits randomly, and multi-bit logic gates flip several bits randomly but in a correlated way. These kinds of computers exist and are called probabilistic computers. You can represent all the bits (p-bits) with a single vector of probabilities and the logic gates with matrices called stochastic matrices.
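    To make that concrete, here is a minimal sketch in plain Python/NumPy of two p-bits described by a single probability vector, with a noisy flip gate written as a stochastic matrix. The flip probability 0.3 and the function name are just illustrative choices, not anything from the text above.

```python
import numpy as np

# Joint state of two p-bits as a probability vector over {00, 01, 10, 11}.
# Both p-bits start out definitely 0.
state = np.array([1.0, 0.0, 0.0, 0.0])

def noisy_flip(p):
    """A one-p-bit gate that flips the bit with probability p (columns sum to 1)."""
    return np.array([[1 - p, p],
                     [p, 1 - p]])

# Apply the noisy flip to p-bit 0 only (identity on p-bit 1).
gate = np.kron(noisy_flip(0.3), np.eye(2))
state = gate @ state

print(state)        # [0.7 0.  0.3 0. ]
print(state.sum())  # 1.0 -- stochastic matrices preserve total probability
```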

    This vector is, in general, non-separable: you cannot get the right predictions if you describe the statistics of the computer with a separate vector assigned to each p-bit; you must assign a single vector to all the p-bits taken together. This is because the statistics can become correlated, i.e. the statistics of one p-bit can depend upon another, so describing them with separate vectors loses the information about the correlations between the p-bits.
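    A quick illustration of what non-separable means here, with made-up numbers: take two p-bits that always agree. Their joint distribution cannot be rebuilt from the two per-p-bit vectors.

```python
import numpy as np

# A correlated joint distribution over {00, 01, 10, 11}: the p-bits always agree.
joint = np.array([0.5, 0.0, 0.0, 0.5])

# The marginal statistics of each p-bit on its own.
p0 = np.array([joint[0] + joint[1], joint[2] + joint[3]])  # P(b0=0), P(b0=1)
p1 = np.array([joint[0] + joint[2], joint[1] + joint[3]])  # P(b1=0), P(b1=1)

# Rebuilding a joint vector from the two separate vectors loses the correlation.
print(np.kron(p0, p1))  # [0.25 0.25 0.25 0.25]  -- not [0.5 0 0 0.5]
```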

    The p-bit vector grows in complexity exponentially as you add more p-bits to the system (complexity = 2^N, where N is the number of p-bits), even though the total number of states of the individual p-bits only grows linearly (2 states each, so complexity = 2N). The reason is purely epistemic: the physical system itself only grows in complexity linearly, but because we are ignorant of its actual state, we have to consider all 2^N possible configurations it could be in over an infinite number of repetitions of the experiment.

    The exponential complexity arises from considering what physicists call an “ensemble” of individual systems. We are not considering the state of the physical system as it exists right now (which only has a complexity of 2N), precisely because we do not know the values of the p-bits. Instead we are considering a statistical distribution that represents repeating the same experiment an infinite number of times and tallying the results; in such an ensemble the system takes every possible path, and thus the ensemble has far more complexity (2^N).
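    A tiny sketch of that gap, just printing how 2N and 2^N grow (Python handles the big integers natively):

```python
# Numbers needed to describe N p-bits:
#   separately: 2 numbers per p-bit, so 2N in total;
#   jointly (the ensemble): 2^N numbers.
for n in (2, 10, 50, 300):
    print(f"N = {n:>3}: separate -> {2 * n:>4} numbers, joint -> 2^{n} = {2 ** n}")
```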

    This is a classical computer with p-bits. What about a quantum computer with q-bits? It turns out that you can represent all of quantum mechanics simply by allowing probability theory to contain negative numbers. Probabilities extended with negative numbers are called quasi-probabilities, and they are enough to reproduce the logic of quantum mechanics.

    You can imagine that a quantum computer consists of q-bits that can be either 0 or 1 and logic gates that randomly flip their states. But rather than representing a q-bit in terms of its probability of being 0 or 1, you represent it with four numbers: the first two are associated with its probability of being 0 (summing them gives the real probability of 0) and the second two with its probability of being 1 (summing them gives the real probability of 1).

    Like normal probability theory, the numbers all have to add up to 1, i.e. 100%, but because each state has two numbers assigned to it, some of the quasi-probabilities can be negative while the whole thing still adds up to 100%. (Note: we use two numbers instead of one to describe each state because otherwise introducing negative numbers would break L1 normalization, which is a crucial feature of probability theory.)
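    Here is what that bookkeeping looks like with made-up numbers (the specific values are purely illustrative and not derived from any particular quantum state): four quasi-probabilities per q-bit, pairwise sums giving the real probabilities, and the total still normalized to 1.

```python
import numpy as np

# Four quasi-probabilities for one q-bit: the first two belong to outcome 0,
# the last two to outcome 1. Individual entries may be negative.
q = np.array([0.75, -0.25, 0.25, 0.25])   # illustrative values only

p0 = q[0] + q[1]   # real probability of measuring 0  -> 0.5
p1 = q[2] + q[3]   # real probability of measuring 1  -> 0.5

print(p0, p1, q.sum())   # 0.5 0.5 1.0 -- still adds up to 100%
```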

    Indeed, with that simple modification, the rest of the theory just becomes normal probability theory, and you can do everything you would normally do in normal classical probability theory, such as build probability trees and whatever to predict the behavior of the system.
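    For instance, the analogue of a stochastic matrix is a quasi-stochastic matrix: every column still sums to 1 (so the total stays normalized), but entries may be negative. The matrix below is made up purely to show the bookkeeping and is not claimed to correspond to any particular quantum gate.

```python
import numpy as np

q = np.array([0.75, -0.25, 0.25, 0.25])   # quasi-probability vector from above

# A quasi-stochastic matrix: every column sums to 1, but entries may be negative.
# (Numbers made up for illustration -- a real gate fixes these for you.)
M = np.array([
    [ 0.5,  0.5, -0.5,  0.5],
    [ 0.5, -0.5,  0.5,  0.5],
    [-0.5,  0.5,  0.5,  0.5],
    [ 0.5,  0.5,  0.5, -0.5],
])
print(M.sum(axis=0))        # [1. 1. 1. 1.]

q_new = M @ q
print(q_new, q_new.sum())   # entries can go negative, but the total stays 1.0
```

    (A matrix that represents a physical gate has to satisfy extra constraints so that the resulting pairwise sums stay between 0 and 1; the point here is only that normalization carries over.)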

    However, this is where it gets interesting.

    As we said before, the exponential complexity of classical probability is assumed to be merely epistemic, because we are considering an ensemble of systems even though the physical system in reality only has linear complexity. Yet it is possible to prove that the exponential complexity of a quasi-probabilistic system cannot be treated as epistemic: there is no classical system with linear complexity whose ensemble will give you quasi-probabilistic behavior.

    As you add more q-bits to a quantum computer, its complexity grows exponentially in a way that is irreducible to linear complexity. If you want to simulate it on a classical computer, then every time an additional q-bit is added you have to increase the classical resources in a way that grows exponentially. At just 300 q-bits, the complexity is 2^N = 2^300, and the number of bits you would need to simulate it exceeds the number of atoms in the observable universe.
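    The arithmetic behind that claim, assuming the commonly cited estimate of roughly 10^80 atoms in the observable universe:

```python
# Rough comparison: joint-description size for 300 q-bits vs. the commonly
# cited estimate of ~10^80 atoms in the observable universe.
joint_size = 2 ** 300
atoms = 10 ** 80
print(f"2^300 is roughly 10^{len(str(joint_size)) - 1}")  # about 10^90
print(joint_size > atoms)                                 # True
```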

    This is what I mean by quantum systems being inherently “non-separable.” You cannot take an exponentially complex quantum system and imagine it as separable into an ensemble of many individual linearly complex systems. Even if it turns out that quantum mechanics is not fundamental and there are deeper deterministic dynamics, the deeper deterministic dynamics must still have exponential complexity for the physical state of the system.

    In practice, this increase in complexity does not mean you can always solve problems faster. The system might be more complex, but it takes clever algorithms to actually translate that complexity into problem solving, and currently there are only a handful of known algorithms that quantum computers significantly speed up.

    For reference: https://arxiv.org/abs/0711.4770


  • If you have a very noisy quantum communication channel, you can combine a second protocol, entanglement distillation (sometimes called purification), with quantum teleportation to effectively bypass the noisy quantum channel and send a qubit over a classical communication channel instead. That is the main utility I see for it. Basically, it is very useful for transmitting qubits over a noisy quantum network.


  • The people who named it “quantum teleportation” had in mind Star Trek teleporters which work by “scanning” the object, destroying it, and then beaming the scanned information to another location where it is then reconstructed.

    Quantum teleportation is basically an algorithm that performs a destructive measurement (kind of like “scanning”) of the quantum state of one qubit and then sends the information over a classical communication channel (could even be a beam if you wanted) to another party which can then use that information to reconstruct the quantum state on another qubit.

    The point is that there is still the “beaming” step, i.e. you still have to send the measurement information over a classical channel, which cannot exceed the speed of light.
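    For anyone curious, here is a self-contained statevector sketch of the protocol in plain Python/NumPy (not any particular quantum SDK; the qubit ordering and variable names are my own choices): Alice entangles the qubit to be sent with her half of a Bell pair, measures, “beams” the two classical bits, and Bob applies the corresponding corrections.

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-qubit gates.
I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def op(gates):
    """Tensor single-qubit operators (one per qubit, qubit 0 first) together."""
    out = gates[0]
    for g in gates[1:]:
        out = np.kron(out, g)
    return out

def cnot(control, target, n=3):
    """CNOT on an n-qubit register, qubit 0 being the most significant bit."""
    dim = 2 ** n
    m = np.zeros((dim, dim), dtype=complex)
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control] == 1:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        m[j, i] = 1
    return m

# The state to teleport (any normalized pair of amplitudes works).
alpha, beta = 0.6, 0.8j
psi = np.array([alpha, beta], dtype=complex)          # qubit 0 (Alice)

# Qubits 1 (Alice) and 2 (Bob) share the Bell pair (|00> + |11>)/sqrt(2).
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)                            # full 3-qubit state

# Alice's operations: CNOT(0 -> 1), then Hadamard on qubit 0.
state = cnot(0, 1) @ state
state = op([H, I, I]) @ state

# Alice measures qubits 0 and 1 ("scanning" destroys her copy of the state).
probs = np.abs(state) ** 2
outcome = rng.choice(8, p=probs)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1

# Collapse the state onto the measured values of qubits 0 and 1.
keep = np.array([((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1 for i in range(8)])
state = np.where(keep, state, 0)
state /= np.linalg.norm(state)

# The two bits (m0, m1) are what gets sent over the classical channel.
# Bob's corrections: X if m1 == 1, then Z if m0 == 1.
if m1:
    state = op([I, I, X]) @ state
if m0:
    state = op([I, I, Z]) @ state

# Bob's qubit (qubit 2) now carries alpha|0> + beta|1>.
bob = state.reshape(2, 2, 2)[m0, m1, :]
print("classical bits sent:", m0, m1)
print("Bob's qubit amplitudes:", bob)   # matches (alpha, beta) up to rounding
```

    Note that Bob's qubit only becomes the teleported state after the two classical bits arrive and the corrections are applied, which is why nothing here travels faster than light.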


  • bunchberry@lemmy.world to Memes@lemmy.ml · Victims of Communism

    It is the academic consensus even among western scholars that the Ukrainian famine was indeed a famine, not an intentional genocide. This is not my opinion, but, again, the overwhelming consensus even among the most anti-communist historians like Robert Conquest who described himself as a “cold warrior.” The leading western scholar on this issue, Stephen Wheatcroft, discussed the history of this in western academia in a paper I will link below.

    He discusses how there was strong debate in western academia over whether it was a genocide, up until the Soviet Union collapsed and the Soviet archives were opened. When the archives were opened, many historians expected to find a “smoking gun” showing that the Soviets deliberately had a policy of starving the Ukrainians, but no such thing was ever found, and so even the most hardened anti-communist historians were forced to change their tune (and indeed you can find many documents showing the Soviets ordering food to Ukraine, such as this one and this one).

    Wheatcroft considers Conquest changing his opinion as marking an end to that “era” in academia, but he also mentions that very recently there has been a revival of the claims of “genocide,” but these are clearly motivated and pushed by the Ukrainian state for political reasons and not academic reasons. It is literally a propaganda move. There are hostilities between the current Ukrainian state and the current Russian state, and so the current Ukrainian state has a vested interest in painting the Russian state poorly, and so reviving this old myth is good for its propaganda. But it is just that, state propaganda.

    Discussions in the popular narrative of famine have changed over the years. During Soviet times there was a contrast between ‘man-made’ famine and ‘denial of famine’. ‘Man-made’ at this time largely meant as a result of policy. Then there was a contrast between ‘man-made on purpose’, and ‘man-made by accident’ with charges of criminal neglect and cover up. This stage seemed to have ended in 2004 when Robert Conquest agreed that the famine was not man-made on purpose. But in the following ten years there has been a revival of the ‘man-made on purpose’ side. This reflects both a reduced interest in understanding the economic history, and increased attempts by the Ukrainian government to classify the ‘famine as a genocide’. It is time to return to paying more attention to economic explanations.

    https://www.researchgate.net/publication/326562364