What Is Quantum Computing?

Quantum computing rests on quantum theory, the branch of physics concerned with the world of atoms and the even smaller particles inside them.

You may think that atoms behave the same way as everything else around us. But that belief is far from true: at the level of individual particles, the rules change, and the classical laws of physics we normally take for granted stop working.

“When things are so small, they behave in a way we’ve never seen or experienced before,” said Richard P. Feynman, one of the most famous physicists of the 20th century.

If you have ever studied light, even briefly, you may already have met quantum theory.

You may know that a beam of light sometimes behaves as if it were made of particles (like a stream of raindrops) and sometimes as oscillating energy waves (like waves at sea).

This is called wave-particle duality. It is hard to accept that something can be two things at once, a particle and a wave, because the idea is alien to everyday experience: a car cannot be a bike and a bus at the same time.

But this is just one of the weird and crazy things in quantum theory.

One of the most famous illustrations of something being in two states at once is the thought experiment known as “Schrödinger’s cat.” In short, in the strange world of quantum theory, we can imagine a situation where something like a cat is alive and dead simultaneously!
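To make the idea of superposition slightly more concrete, here is a minimal sketch in Python (assuming NumPy is available) that represents a quantum state as a vector of amplitudes. The two basis states stand in for the cat’s “alive” and “dead,” and the squared magnitudes of the amplitudes give the measurement probabilities. This is an illustration of the textbook formalism, not of any particular quantum hardware:

```python
import numpy as np

# Basis states of a qubit, the quantum analogue of a bit.
ket0 = np.array([1, 0], dtype=complex)  # stands in for "alive"
ket1 = np.array([0, 1], dtype=complex)  # stands in for "dead"

# An equal superposition: the state holds both possibilities at once.
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement collapses the state to one outcome; the probability of
# each outcome is the squared magnitude of its amplitude (the Born rule).
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- a 50/50 chance of "alive" vs "dead"
```

Until the measurement happens, the state genuinely carries both amplitudes, which is the sense in which the cat is “both at once.”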

Part 1 of Quantum Computing:

How can we get a bigger payoff from ever-smaller things? Computers keep shrinking while their efficiency and processing capacity keep growing: a 21st-century mobile phone has more processing and storage capability than a room-sized military computer of 50 years ago.

Unfortunately, despite all this remarkable progress, many complex problems remain beyond even the most advanced computers, and there is no guarantee that machines of this type will ever be able to solve them.

One reason is that transistors, the basic units from which chips and processors are built and which developers keep shrinking, will soon approach the size of individual atoms, which is about as small as they can physically get.

If we want to build smaller and more capable computers than existing ones, we will need to change computing radically.

Stepping down into the world of atoms opens up vast new possibilities through quantum computing, with processors that could, for certain problems, work far faster than current processors.

It sounds great, but the catch is that quantum computing is far more complex than traditional computing. Moreover, it operates in the strange world of quantum physics, where the familiar laws of classical physics no longer apply.

Part 2:

In the first part of this topic on quantum computers, we discussed how traditional computers work and their most prominent limitations, and we stopped at the question of what quantum computing actually is.

Despite the rapid growth of computing power, science is still plagued by complex problems that current computers cannot tackle because they lack the necessary storage and processing capacity.

What Is Traditional Computing?

It is tempting to think of a computer as that stylish little device that sits on your lap and lets you send e-mail, shop online, chat, and play games. But it is both much more and, at the same time, much less than you think.

It’s more because it’s a general-purpose machine: in principle, you can make it do almost anything you want. On the other hand, it’s less than you think for the simple reason that, on the inside, it is little more than a standard calculator following a series of pre-prepared instructions, the so-called program.

The sleek object you see in front of you hides completely ordinary machinery under its cover.

Traditional computers perform so well thanks to two tricks. The first is that they can store numbers in memory. The second is that they can process those stored numbers using simple mathematical operations (such as addition and subtraction).

They can also do more complex things by combining several simple operations into a series called an algorithm.
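As a toy illustration of both tricks at work, here is a short Python sketch (the `multiply` function is a hypothetical example, not from the original text) that builds a slightly more complex operation, multiplication, out of repeated addition, exactly the kind of pre-prepared series of instructions the text calls an algorithm:

```python
# Multiplication built from repeated addition, using only the two
# tricks described above: stored numbers and a simple operation.
def multiply(a: int, b: int) -> int:
    total = 0               # trick 1: a number stored in memory
    for _ in range(b):      # the pre-prepared series of instructions
        total = total + a   # trick 2: a simple operation (addition)
    return total

print(multiply(6, 7))  # 42
```

Everything a traditional computer does, however impressive, ultimately reduces to long chains of steps like these.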