The principles are based on the von Neumann concepts of multiplexing and pipelining, with the quantum computer relying on traditional architecture and processors for these activities. This is required for the use of memory and storage. Classical principles will also be needed to move quantum information between quantum computers through a shared-memory model. Without parallelism and multiplexing, it will be difficult to move qubits between the different parts of the quantum computer. Coherence is absolutely required to sustain the quantum model. The larger the quantum computer, the more important optimization and workload sharing become; these will be discussed below.
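To make the multiplexing idea concrete, here is a minimal sketch, in plain Python, of a classical scheduler that interleaves gate operations from several logical qubits onto one shared control channel. Everything here is illustrative: the `multiplex` function, the operation tuples, and the round-robin policy are assumptions, not any real quantum control API.

```python
from collections import deque

# Hypothetical sketch: a classical round-robin scheduler that multiplexes
# per-qubit operation queues onto a single shared control channel.
# No real quantum hardware interface is implied.

def multiplex(queues):
    """Round-robin merge of per-qubit operation queues onto one channel."""
    channel = []
    pending = [deque(q) for q in queues]
    while any(pending):
        for q in pending:
            if q:
                channel.append(q.popleft())
    return channel

ops_q0 = [("q0", "H"), ("q0", "X")]   # operations for logical qubit 0
ops_q1 = [("q1", "H")]                # operations for logical qubit 1
schedule = multiplex([ops_q0, ops_q1])
# -> [("q0", "H"), ("q1", "H"), ("q0", "X")]
```

A pipelined controller would then process this interleaved schedule in stages, keeping the shared hardware busy while individual qubits wait their turn.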
The other fundamental concept we will discuss is the abstraction layer required to enable Linux to run on a quantum computer without re-educating the entire community. In doing so, the abstraction layer will need to handle the use of POSIX on a quantum computer. This is no simple task, and we have been working with universities around the world, especially in Israel, to make it a reality. The ultimate goal is to create both a large quantum computer model and a small quantum computer model that can be deployed on a single chip. The temperature issue of the quantum computer is also addressed: specific environments will enable each chip to reach the cooling temperature required for processing with minimal electrical and cooling components. The abstraction layer will be based on the open source KVM project.
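One way to picture such an abstraction layer is as a device that classical software drives through a familiar POSIX-style open/ioctl/close pattern, much as KVM exposes virtualization through `/dev/kvm`. The sketch below is purely hypothetical: the `QuantumDevice` class, the `"ALLOC_QUBITS"` command, and the device path in the comments are invented for illustration and do not describe any existing driver.

```python
# Hypothetical sketch of the abstraction-layer idea: a quantum backend
# behind a POSIX-like open/ioctl/close interface, so existing classical
# software patterns carry over. All names here are assumptions.

class QuantumDevice:
    def __init__(self):
        self._open = False
        self.log = []          # record of commands, for inspection

    def open(self):            # analogous to open("/dev/qpu0")
        self._open = True
        return self

    def ioctl(self, cmd, arg=None):   # analogous to ioctl() on the fd
        if not self._open:
            raise RuntimeError("device not opened")
        self.log.append((cmd, arg))
        if cmd == "ALLOC_QUBITS":
            return list(range(arg))   # hand back qubit handles
        return 0

    def close(self):
        self._open = False

dev = QuantumDevice().open()
qubits = dev.ioctl("ALLOC_QUBITS", 4)   # -> [0, 1, 2, 3]
dev.close()
```

The design point this illustrates is that programmers interact with handles and commands they already understand, while the layer underneath translates them into whatever the quantum hardware actually requires.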
With these concepts outlined in the following articles, and through our discussions with the universities, I will show how close we are to a quantum environment that is attainable and programmable without great expense or highly specialized environments. I will specifically cite work going on at the Technion, the Hebrew University, and the Weizmann Institute of Science.