Future Of Computing World – Qubits

March 3, 2014 at 12:33 PM (Technical Articles)

It is always great fun to play around with something new, but if that new thing completely changes the basic laws, it is treated as a joke at first, and the one who invents or proposes it is treated as a joker. Looking back at history, we have plenty of evidence of incidents where a scientist was treated as a mad fellow when he proposed his theory, only for the world to accept it later.



Something similar happened more than 30 years ago. At that time people were not even aware of computers, but there were great minds thinking beyond everyone's imagination in that era of technology. I would like to share a 30-year-old real story which might change the future of the computing world.

Every day we see a new model of computer in the market. Miniaturisation has reached such an extent that one can keep a computer in a pocket. No doubt the world of technology accepted Moore's Law, which states that the number of transistors on a microprocessor doubles roughly every 18 months; at that rate, by 2020 or 2030 the circuits on a microprocessor would be measured on an atomic scale. Thinking of something even beyond that is really, exceptionally mind blowing 🙂.
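Just to get a feel for those numbers, here is a rough back-of-envelope projection. The 2014 starting count of two billion transistors is my own assumed figure, not from any source; the 18-month doubling is straight from the law as quoted above:

```python
# Project transistor counts under Moore's Law: one doubling every 18 months.
count_2014 = 2_000_000_000            # assumed transistor count on a 2014-era chip
doublings = (2030 - 2014) / 1.5       # one doubling per 1.5 years
count_2030 = count_2014 * 2 ** doublings
print(f"projected 2030 transistor count: {count_2030:.2e}")
```

Over a thousand-fold growth in 16 years, which is why atomic-scale circuits stop sounding crazy.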

A physicist at Argonne National Laboratory, Paul Benioff, proposed a theory in 1981. According to his theory, computers could become 100 times faster than today's fastest supercomputers; they could tackle most of the unsolved problems of this era and solve many mathematical problems that have remained impossible for humanity. The theory Paul proposed was almost impossible to realise in practice: one could prove it theoretically but could not actually build such a computer. The idea was not taken further, and scientists dropped it.

Though many were not interested in the idea, there were crazy people who kept doing research on it. David Deutsch said the uniqueness of the proposed theory brings incredible computing power to computers. By his analysis, today's personal computers can perform gigaflops (billions of floating-point operations per second), but a computer built on Paul's theory could perform 10 teraflops (trillions of floating-point operations per second).

Hold on... what is that theory, after all?

I am talking about quantum computers; quantum computers are the computers of the future. Many questions arise from the statements I have given above: What are quantum computers? How do quantum computers work? What is their history? I may not be able to cover all of these in depth, as the concept is beyond my imagination, but I request the reader to take an interest in this and do further investigation.

History
Paul Benioff was the first person to propose a quantum-based concept of computing and processing, in 1981. His theoretical Turing machine was proven on paper but was not possible to build in practice. Some scientists say it might take another 100 years to see such a computer, or that it may not even be possible within a human lifespan on this planet.

But scientists are crazy; they don't want to let go of it. Researchers at IBM formed a team and decided to take it up as a challenge. The team started working on it several years ago, probably 10, building every single component from scratch to make it real. Their incredible effort and research kept leading them towards their dream.

What is quantum computer ?
We know that a computer understands only 0 and 1; the binary system is the heart of computing. At any point in time our processors process and hold either a 0 or a 1. But what if I say a bit can hold 0, or 1, or both 0 and 1 together at the same time?

Being in a state of both 0 and 1 at the same time? Is it not strange? Is it really possible? Quantum computers work on exactly this concept. A bit is a piece of information represented by either 0 or 1; a unit that can be 0, or 1, or both 0 and 1 at the same time, holding that extra state of information, is called a qubit. Qubits are the heart of quantum computers. According to scientists, this extra state gives the system an incredible parallel-processing capability; it is what allows quantum computers to perform millions of calculations at once.

A fully functioning quantum computer could perform millions of calculations at the same time. It would instantly be the most powerful computing device ever created by mankind. “In the past, people have said, maybe it’s 50 years away, it’s a dream, maybe it’ll happen sometime,” the Daily Mail quoted Mark Ketchen of IBM’s Watson Research Centre as saying. IBM scientists were finally able to create a qubit, which exploits a bizarre property of quantum physics: a quantum ‘bit’, or unit of information, can be both 1 and 0 at once.

The concept itself is very interesting, and the capability of these computers even more so: a 250-qubit array would contain more ‘bits’ of information than there are atoms in the entire universe. IBM says that the next step is creating systems that exploit this power.
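To get a feel for how fast that state space blows up (this is just the standard 2^n counting for n qubits, not a figure from IBM):

```python
# An n-qubit register holds amplitudes for 2**n classical bit patterns at once,
# so the state space doubles with every qubit added.
for n in (1, 2, 10, 50, 250):
    print(f"{n:>3} qubits -> 2^{n} = {2 ** n:.3e} simultaneous states")
```

Already at 250 qubits the count is a 76-digit number; no classical machine could even store one amplitude per state.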

“The quantum computing work we are doing shows it is no longer just a brute force physics experiment. It’s time to start creating systems based on this science that will take computing to a new frontier,” said IBM scientist Matthias Steffen. “These properties will have widespread implications, foremost for the field of data encryption, where quantum computers could factor very large numbers like those used to encode and decode sensitive information,” the company added.

When someone asks how a bit of information can be both 0 and 1 at the same time, scientists say one can imagine that it is more like a complex number, which holds an imaginary part and a real part together.
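A minimal sketch of that picture in code (this is the standard textbook convention, not something from the article): a qubit is a pair of complex amplitudes, one for 0 and one for 1, and the squared magnitudes give the probabilities of reading each value.

```python
import math

# An equal superposition: "both 0 and 1 at once".
amp0 = complex(1 / math.sqrt(2), 0)   # amplitude of the 0 state
amp1 = complex(0, 1 / math.sqrt(2))   # amplitude of the 1 state (an imaginary part is allowed)

# Measurement probabilities are the squared magnitudes, and they must sum to 1.
p0 = abs(amp0) ** 2
p1 = abs(amp1) ** 2
print(f"P(read 0) = {p0:.2f}, P(read 1) = {p1:.2f}")   # 0.50 each
```

Until you measure it, the qubit genuinely carries both amplitudes; measuring collapses it to a plain 0 or 1 with those probabilities.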

The following control devices are used to build quantum computers.
QUBIT CONTROL
Computer scientists control the microscopic particles that act as qubits in quantum computers by using control devices.
• Ion traps use optical or magnetic fields (or a combination of both) to trap ions.
• Optical traps use light waves to trap and control particles.
• Quantum dots are made of semiconductor material and are used to contain and manipulate electrons.
• Semiconductor impurities contain electrons by using “unwanted” atoms found in semiconductor material.
• Superconducting circuits allow electrons to flow with almost no resistance at very low temperatures.

What was once upon a time a theoretical concept is now real: Google is already experimenting with a real quantum computer named D-Wave, and IBM is almost done with its experiments and keen to see a real quantum computer. There is no limit to the processing power of the human mind; it cannot be compared to any man-made computer. We can make thousands of quantum computers, but the human mind is beyond all of them. No doubt one day we might cross the solar system and uncover the galaxies by setting foot on them.
Sources –
How Quantum computers work

Success of IBM Research team 

Google’s Try with D-Wave



Now Everything is Smart

July 4, 2012 at 7:33 PM (Technical Articles)

Technology is making everything simple for us; we see many innovative devices like the iPhone, iPad, Google's Project Glass, Android phones, and many more. When we start thinking about the future, we feel we will have a solution for everything. Many times we even try to think up innovative ideas, but I feel we miss things that are right in front of our eyes. When we face problems in our daily life we can imagine the best innovative solutions for them, but we never think beyond the box. Innovators and researchers make things easy for us.

Everything is very difficult in the initial phase, but continuous effort and dedication finally give us one best, efficient product, and such a product makes our life simple and beautiful. Google stuck with its search algorithms, and now we get everything in seconds. Apple always tried to give us innovative gadgets, won the hearts of its fans, and gave us magical devices in the iPhone and iPad. Like this, every innovator brings us something memorable.

I can recall an incident when my friend Ravi Prakash and I were going on a bike. The place was my favourite, Agumbe. It was a very odd time, around 7 PM; the road was completely covered by thick mist, and driving on such dangerous ghats, on such a curving road, is really challenging. The headlights were on and I was driving slowly, but I felt that driving without headlights was the same as driving with them in that thick mist. I could not see anything, so I was driving at around 20 kmph or less. Hairpin curves were visible only at the sharp edges. It was horrible to drive through the whole ghat section.

For a moment I thought: why can't the automobile industry give us headlights sharp enough to let us see everything clearly even in thick mist and heavy rain? I thought about it and left it there; I never considered how it could be possible. Today I came across an article on research into exactly the same idea I had thought over.

I believe that if hardware and software talk to each other nicely, they can make anything possible, and this has been proved several times.

Please read the article given in this link – Smart Headlights


Everything on Cloud…

August 24, 2011 at 3:34 PM (Technical Articles)

Whenever we sing a song, our voice floats through the air; that is fine, but what if your digital data floats in the air? Ah, what am I talking about? Of course, wireless technology is exactly that, right? But still, I want to make it big, very big. Sorry, am I confusing you guys? OK, let me tell you directly: what if our huge data floats through the clouds? Yes, I am talking about cloud computing. This is not a very new topic, but someone requested that I write an article on cloud computing, and I thought of doing it. Even I did not know it completely, but I did a little investigation and collected information. So let me eat your head 🙂



Cloud Computing :

It can be defined as: “A model for on-demand network access to a shared pool of configurable computing resources. A resource here may be a network, server, storage, application, or service.”

A plain definition does not give any clear picture, so let me explain how cloud computing works. Imagine you are an executive at a large corporation. Your responsibility is making sure that all of your employees have the right hardware and software. Whenever you have a new hire, you need to provide the necessary software and hardware for the new employee, and your current software licence must allow another user. It is so stressful that you find it difficult to handle everything once the organisation grows large. Instead of installing an application on every system, what if it were installed on a server somewhere, and a simple web-based service hosted the necessary application upon successful login? There would be no need to install it on each and every system.

Remote machines run the actual applications, maybe an e-mail client, a word processor, or some other complex data-analysis program; everything runs somewhere else, but you can access it on your system through a simple web-based application. This is what is called cloud computing. The user's machine need not provide the heavy hardware or software, so the demands on the user's side decrease greatly. The only thing the user's computer needs to be able to run is the cloud computing system's interface.

This is all about how an application lives in the cloud, but what about data? Say I have a big company, and every day I need to store a huge amount of data for the future; my requirement demands hundreds of GBs of storage every day. Is it not a big problem to handle such huge data? The ultimate solution is cloud computing: give all your data to a trusted cloud computing service provider, and he will store and maintain it. You don't bother about the storage. That is how your data is stored in the cloud. What about data security? Yes, it is a serious matter: all your confidential data is given to someone else! Guys, one good example of cloud computing is Dropbox; I hope many people know about it. If not, search the net and read about Dropbox. It gives every user about 2 GB of space for free; you can store photos, videos, anything you want, and share them with your friends.

Suppose I am working on a project and suddenly my client requests a big server to store a huge database. What should I do? Buy a big server, install it, do the necessary setup, and use it in the project. But how long does it take to install a server, and what difficulties do I face to achieve this? My god, very hectic! Instead, I will not bother about the server; I will raise a request to my cloud computing service provider, and he will provision a server for me. Imagine how long that takes. One day? That may even be too much; they may be able to do it within hours. So my need for a server gets a solution within hours.

In this way cloud computing provides all the necessary resources, software and hardware alike. But still many companies do not go for cloud computing, due to data security: our valuable data is stored somewhere on someone else's machine. Even though cloud computing services are trusted, each person needs to think before handing over confidential data. Keeping this matter aside, let us think about the benefits and functionality of cloud computing, and let us also study its architecture.

Cloud computing Architecture: 

Cloud computing is fully enabled by virtualization technology and virtual appliances, and this is a key advantage. The ability to launch new instances of an application with minimal labour and expense allows application providers to:

1> Scale up and down rapidly.

2> Recover from failure.

Its architecture can be divided into two sections: the front end and the back end, connected to each other through a network. The front end is the side the computer user or client sees; the back end is the “cloud” section of the system.

The front end includes the client's computer and the web-based application required to access the cloud computing service; it is also known as the cloud computing interface. Web-based e-mail programs are cloud computing interfaces too: Gmail, Yahoo Mail, and Hotmail are all examples of cloud computing. On the back end of the system, there are various computers, servers, and data storage systems that create the “cloud” of computing services.

A central server administers the system, monitoring traffic and client demands to ensure everything runs smoothly. It follows sets of rules called protocols and uses a special kind of software called middleware, which allows networked computers to communicate with each other. Which protocols are used is a little difficult for me to explain, but I have given a link; if you are interested, read it and teach me 🙂.
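Here is a toy sketch of that front-end/back-end split, using only Python's standard library. The word-counting “application” is a placeholder of my own, standing in for whatever heavyweight program the cloud actually runs:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# The "back end": a tiny service standing in for the cloud side. It counts the
# words in the requested path, pretending to be a heavyweight application.
class CloudHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        result = {"words": len([p for p in self.path.split("-") if p])}
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep the demo output quiet
        pass

# Start the back end on a free local port, in a background thread.
server = HTTPServer(("127.0.0.1", 0), CloudHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "front end": all the client needs is a thin interface, one HTTP call.
port = server.server_address[1]
with urlopen(f"http://127.0.0.1:{port}/hello-cloud-world") as resp:
    reply = json.loads(resp.read())

server.shutdown()
print(reply)   # {'words': 3}
```

The client never installs the application; it only speaks the interface, which is the whole point of the model.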

Cloud computing design considerations:

To be successful, a cloud computing system should be designed to take care of the following things.

1> Easily scalable

2> Tolerate failures

3> Include management tools.

Cloud Computing Applications: Where can you apply it, and what are the benefits?

1> It brings hardware costs down: a client-side system need not have too much processing power or huge memory; the cloud computing system provides whatever is necessary. Instead of buying a faster computer with huge memory, just buy a monitor, keyboard, mouse, and enough processing power to run one simple middleware application that interacts with the cloud computing system.

2> Anywhere, anytime access: clients can access their data and applications anywhere, at any time, from any computer that is connected to the internet.

3> No need to buy software: no need to buy applications and their licences; the cloud computing service provides everything for you.

These are a few applications of cloud computing technology; there are many more.

Benefits of cloud computing: let me give you some real-world examples.

1> Reduced provisioning cycle time:

According to research, new resources and software can be provisioned in a short time, which reduces the overall provisioning cycle time for an organisation.

examples:

1> New server: new-server provisioning time can be reduced from 7.5 weeks to 3 minutes.

2> New collaboration environment: this can be reduced from 8 weeks to 5 minutes.

3> 64-node Linux cluster: a 64-node Linux cluster can be provisioned in 5 minutes instead of 12 weeks.

See the difference; and there are many such real-world examples which prove the benefits of cloud computing.

Real-world examples:

1> Wipro – Private first, public later.

Benefits:

  • 1> Internal server provisioning: from 46 days to 35 minutes.
  • 2> Utilization: from <10% to 40%.
  • 3> Average server cost: from $2000 to $800.

2> Japan Post – Business-critical cloud

Benefits:

  • Average development time of 3-4 months, which is 3 to 4 times faster than traditional development.
  • High user satisfaction (functional, performance).
  • No performance or security issues.


3> Japan Ministry of Economy, Trade and Industry – consumer site

Benefits:

  • Built in only 3 weeks.
  • 40 million consumers expected to access the site at peak times.
  • Expected to support more than 20 million transactions per month.

4> Package shipping company: FedEx Express.

Benefits:

  • Able to develop a new analytical application that was not economically feasible using earlier infrastructure models.
  • A 4-hour batch process now runs in 20 minutes, and applications are developed in 60% less time.

These few examples show the benefits of cloud computing.

So, when will you experience it? 🙂 🙂

Read the case studies in cloud computing.


Environment Friendly Facebook

April 19, 2011 at 8:08 AM (Technical Articles)

Guys, many of us work in software companies, maybe not as developers, but testers can also understand what I am going to tell you today. Those who don't know a programming language may not understand the technical words, but they will still get something from this article.

Everyone knows how important the performance factor is for an application; let me clear this point for non-IT friends. Suppose you are listening to music in the Winamp player and you press the next button: it should switch to the next song immediately. But what if it takes 5 minutes to switch? That is what we call very low performance; it takes a hell of a lot of time. When the word 'time' comes up, my mind starts thinking too much, because the whole universe and its birth can be described in terms of time. Forget it now 🙂.

So performance is an important factor, and we developers should think a lot before coding (which I won't do 🙂). If our code is poor it takes more time; more time means more processing, more processing means more energy consumption, and more energy consumption means more CO2 released. What the hell is this? Started with coding and ended with CO2? Yes, everybody should be aware of the fact that whatever we do with Google, Facebook, or Twitter is related to the release of CO2. I have already posted an article on Google and CO2 release. ( link ) But this time, while searching for something, I found an important fact about Facebook.

Guys, Facebook is developed in PHP. Every day millions of people use Facebook: uploading videos and pictures, writing stupid words on the Wall 🙂, etc. An analyst says that if Facebook switched from PHP to C++, it could go from 30,000 servers down to just 7,500, and trim 49,000 tons of carbon dioxide emissions from its footprint.

An average server consumes about 200 watts, and with an average SI EER (Site Infrastructure Energy Efficiency Ratio) of 2, this translates to around 400 watts including cooling and other overhead. That brings the total CO2 emission of the Facebook server park to about 59,000 tons of CO2 per year, roughly 1/1000th of the total CO2 emission of Finland (a European country).

Facebook only says “the bulk” is running PHP. Of course we know that C/C++ is more efficient than any other high-level language, but it would definitely be very difficult to develop Facebook in C++. Still, Facebook is a big company and earns like anything. Why can't they invest, recruit C++ developers en masse, and try to achieve it? Why don't they want to give us an environment-friendly Facebook? Hmm, profit! Now they are getting nice money; why the hell would they think about the environment?

The data I have given here may not be exactly correct, but it makes sense if we think it over. Come on, guys, upload your resumes to Facebook and mention in them that we will deliver an environment-friendly Facebook 🙂. Facebook spends $1 million on its electricity bill per month; guys, you can reduce that amount too 🙂.

See, when we talk about Google, Facebook, and Twitter we get this kind of big information. Now just think about the millions of websites and millions of servers; altogether, how much CO2 is being released? Just think. After reading this, girls, please don't stop uploading your photos 🙂 🙂.


Silicon Optical Component

February 27, 2011 at 7:08 PM (Technical Articles)

When I was in BE, our lecturer was discussing the importance of performance and memory in computers. I remember that class because it was a very special one. Our lecturer was describing his experiences: when he was a student, computers were so slow that people would switch on a system and go for tea, and after a 15-minute tea break some systems would still be booting. I bunked the next class (I made many such mistakes, but I request my juniors not to repeat them) and went online to search on the same topic. That day I found a lot of useful information on memory and processor technology. Of course I didn't understand everything I read that day, but I came to know about the development of memory and processor technology. The main component that made the revolution in today's computers is silicon.

After that I read many times on the internet about the speed of computers and their development. I recall an incident from my final-year project with my idiot project mates (it was a great team). One feature of our project was transferring a multimedia file (an image or video) from one system to another by sending an SMS to a server machine; the idea was to use mobile phones as a console terminal like a keyboard or mouse, doing by SMS whatever we would do with a keyboard. OK, that is not the topic to discuss here. When we wrote the code to transfer a small video file, my god, it took a hell of a lot of time. While it was crawling along, I was discussing with my team that there must be something (a technology unknown to this world) that could transfer a huge amount of data in a fraction of a second, something like switching on our TV and getting the display the next moment. The discussion went very well, but we didn't get any idea, and we didn't find any such technology on the internet either. Finally my idiot teammates stopped discussing it.

From that day onwards I kept searching, searching, and searching (sadly, I have to say, I was not thinking 😦, I was simply searching 🙂). Today I found a useful link describing yet another innovation in data transfer, and here in this blog I want to share it with you all. Hmm, of course I didn't understand it completely, but it is still a very nice topic to read and to do R&D on.

Laser – Quick Data Transfer:

Researchers have learned how to make lasers directly on microchips; the result could be computers that download huge files much more quickly. For the first time, researchers have grown lasers from high-performance materials directly on silicon. Bringing electrical and optical components together on computer chips would speed up data transfer within and between computers.

It is a fact that when we play with light we can see dramatic speed, but bringing electrical materials together with light is not an easy task, say the researchers. There are many hurdles in combining the best laser materials with silicon, but researchers at the University of California, Berkeley have surmounted them.

When the nanopillar ( http://en.wikipedia.org/wiki/Nanopillar ) is pumped with light from another laser, the light spirals around inside the pillar, as if running up and down a spiral staircase. The difference in materials between the core and the shell encourages this effect, trapping the photons. They spiral until they reach a high enough energy threshold and are emitted. Researchers say this spiral effect is something that hasn't been seen in other types of laser before.

Another key to making lasers on silicon chips is not letting the temperature get too high. Chang-Hasnain ( http://www.eecs.berkeley.edu/Faculty/Homepages/chang-hasnain.html ) says that her process could eventually be used to grow high-quality lasers on otherwise finished chips and optical components, giving them the capability of encoding data into pulses of light. A lot of progress has been made on silicon optical components, says Intel's Paniccia.

So this is all about the laser technology. I didn't give a detailed explanation of the technical words or concepts here, because it is very complex for me to understand; I just put the idea in front of you guys. If anybody does R&D on this and understands it, please share it with me. It is not easy to implement in today's memory technology, and it may be expensive, but in the future we may play with this. Can anybody guess what kind of data-transfer speed we might get with this technology? You could transfer 10 GB of data in one second: super fast, like light. Every time I build an iPhone app of size 70 MB, it takes 5 minutes to transfer, and I have to wait to run it on the device. Every time, testers log defects like performance hits, memory reaching threshold levels, etc., and we struggle to speed up our applications. How difficult it is to achieve speed, and here we are dreaming of transferring 10 GB in one second! Great idea, is it not?
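Here is the arithmetic behind that daydream. The 70 MB / 5-minute figures come from my own anecdote above, and the 10 GB/s is the speculative laser-link speed, so treat all of these as illustrative numbers:

```python
# Compare today's slow transfer with the dreamed-of laser link.
current_mb_per_s = 70 / (5 * 60)      # the 70 MB app over 5 minutes
laser_mb_per_s = 10 * 1024            # 10 GB per second, in MB
speedup = laser_mb_per_s / current_mb_per_s
app_transfer_s = 70 / laser_mb_per_s  # how long the same app would take
print(f"speedup: {speedup:,.0f}x; the 70 MB app would land in {app_transfer_s * 1000:.1f} ms")
```

A five-minute wait collapsing into a few milliseconds; that is the scale of difference an on-chip optical link promises.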

