If Moore's Law hits a sinkhole...
As posted previously, Moore's Law for computers may be about to hit a brick wall. Fundamental to the problem is that transistors have shrunk to about as small as atomic physics permits. On 15 Aug 2005, EE Times ran this front-page story:
Quote:
Intel's upcoming processors will soon use 45-nanometer transistors. As the EE Times article points out, simulations of a 32-nanometer transistor show massive leakage problems. You have already seen how bad the leakage is. Hold your finger on a Pentium processor without a heatsink. No, the Pentium will not burn up, as so many claim. But your finger will. Transistors are already so thin that Pentiums can produce more heat than a 100-watt light bulb.

Intel recently announced a fundamental change in design philosophy. No big surprise; everyone was predicting it. Traditionally, we put all critical computer functions in one chip so that communication between those functions is fastest. Once we send a signal between two ICs, the message gets really slow. However, Intel must now separate CPU functions - because transistors are leaking and therefore consuming so much electricity. The trick is to separate CPU functions that need not communicate so fast. That literally means changing the entire CPU architecture. The last time a change this radical was performed, it was called the original Pentium.

Intel has also set new corporate limits on how much electricity a chip can consume. IOW, CPU speed is no longer the mantra. Once it was all about doing the same computations only faster. Maximizing CPU performance versus minimizing electric consumption is a new corporate-wide directive. All this to avoid the brick wall.

I am reminded of the last scenes in James Bond's Thunderball, where the hydrofoil is desperately swerving to avoid rock outcroppings. Eventually, a crash into rocks as hard as bricks. In the CPU business, are we about to witness the same end? And who will get the girl? Oh. Back then James never got the girl. He was rescued. And then he was replaced by Roger Moore. Curious coincidence.
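A rough sense of the scale: dynamic switching power goes as P = alpha * C * V^2 * f, and leakage adds a static term that shrinking transistors keep inflating. A minimal back-of-envelope sketch in Python - every component value below is an illustrative assumption, not an Intel figure:

# Back-of-envelope CPU power estimate (all numbers are assumptions).
# Dynamic power: P_dyn = alpha * C * V^2 * f. Leakage is a static
# add-on that grows as gate oxides thin and threshold voltages drop.
alpha = 0.2     # assumed activity factor (fraction of gates switching)
C = 50e-9       # assumed total switched capacitance, farads
V = 1.3         # assumed core voltage, volts
f = 3.2e9       # assumed clock, Hz (Pentium 4 class)

p_dynamic = alpha * C * V**2 * f    # ~54 W with these numbers
p_leakage = 30.0                    # assumed static leakage, watts
print(f"dynamic ~{p_dynamic:.0f} W, total ~{p_dynamic + p_leakage:.0f} W")

With those made-up but plausible inputs you land in 100-watt-light-bulb territory, and the leakage watts buy no computation at all.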
They also want to save power because notebooks are the boom business now, not desktops.
And if Moore's law does finally hit the wall, we will see more multi-CPU solutions, so each chip had better draw less power. It will be difficult to justify 1000-watt desktops with 4 processors in order to run Office 13. I don't think computing power is driving any innovation right now, because all the desktops are already faster than we need. Desktops are about 10 times more powerful than they have to be in order to run Windows, Office, a browser, and full-motion video all at the same time. A minority of people do the things that require a really powerful system: Battlefield 2, Unreal Tournament 2K4, video editing, and massive photo manipulation. Even when DOS was prominent and programmers had to do weird and interesting things to get access to more memory, there were more applications pushing the edge. Nowadays, the cheapest system will run Windows and Office and everything most people need, with practically no difference from the most expensive system.
They are already looking into optics for data storage. They might do the same for CPUs.
Gaming is what drives innovation in the computer industry, and that won't change. The motherboard I'm buying has two PCI Express slots to add two high-end video cards in SLI mode. This means one video card will render the current frame while the other renders the next frame, allowing an ungodly number of frames per second and amazing performance.
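For what it's worth, the mode described is alternate frame rendering (AFR): the cards take turns on whole frames, so throughput roughly doubles while the latency of any single frame does not improve. A toy sketch of the scheduling idea, with invented timings (not NVIDIA's actual driver logic):

# Toy model of SLI alternate frame rendering: two GPUs take turns.
# All timings are invented for illustration.
frame_time_ms = 20.0    # assumed time one GPU needs per frame
gpus = 2
for frame in range(6):
    gpu = frame % gpus                      # round-robin frame assignment
    start = frame * frame_time_ms / gpus    # frames complete twice as often
    print(f"frame {frame} -> GPU {gpu}, done at {start + frame_time_ms:.0f} ms")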
I wouldn't bet on Moore's law failing. Right now Intel, AMD, and other companies, researchers, and universities are working on quantum computer chips. They will be an order of magnitude faster than our current chips. These chips are right around the corner. I've actually heard the chips are already done, but the government won't allow them to be released until encryption methods are improved. The new chips would eat right through current encryption. It would normally take 30+ years, even with supercomputers, to break 1024-bit encryption. A quantum chip could do it in minutes. I'm sure that if the chips are available already, the government is using them.

I know the U.S. government recently purchased the world's largest solid-state storage chip. It holds more than 400 terabytes. This could hold an operating system which could run anything almost instantly. Couple this with the quantum chip and you've got a machine that would allow the government to get into anyone's computer at any time.
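Some context for the encryption claim: the best known classical attack on a factoring-based key (the general number field sieve) is sub-exponential, while Shor's quantum algorithm is polynomial, which is where "decades versus minutes" contrasts come from. A rough scaling sketch - asymptotic operation counts only, not run times, with the commonly quoted cubic estimate standing in for Shor's cost:

# Scaling comparison for factoring an n-bit modulus. GNFS is the
# best classical algorithm; Shor's algorithm scales roughly as
# (log N)^3. Order-of-magnitude operation counts, not benchmarks.
import math

def gnfs_ops(bits):
    n = bits * math.log(2)   # ln N
    return math.exp((64 / 9) ** (1 / 3) * n ** (1 / 3)
                    * math.log(n) ** (2 / 3))

for bits in (512, 1024, 2048):
    print(f"{bits}-bit: GNFS ~1e{math.log10(gnfs_ops(bits)):.0f} ops, "
          f"Shor ~1e{math.log10(bits ** 3):.0f} gates")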
Quote:
It helps to first learn of reality before posting. Quantum computers are but fragile advanced science toys that, last I had heard, were up to about 4 qubits. It takes something like 8 years to double that count. And still they are not even stable enough for a controlled computer room. Meanwhile, if encryption security were a reason for quashing quantum computers, well, quantum encryption is currently being tested in a network in MA. Quantum encryption is already far ahead of quantum computing.

Quantum computer research is not ongoing in 'application' research labs such as AMD and Intel. It is still in 'basic' research labs such as the University of CA (Santa Barbara), the National Institute of Standards and Technology, CERN in Switzerland, the Paul Drude Institute in Berlin, Germany, etc.

If you know about this 400-terabyte memory chip, then share details such as who developed it, where, using what technology, manufactured in what quantities, etc. Why not? Radar ... you have again done exactly as Rush Limbaugh types would do. Hype a lie in hopes that the naive will believe propaganda. Warning message to the naive: Radar is lying because he provides no underlying facts, no supporting numbers, and contradicts 'state of the art' science. Most damning - he provides no details, hoping that is enough for you to 'know' something.

Optics for data storage, as Rich Levy notes, was supposed to first appear on the market this year from IBM. I suspected that was why IBM sold their hard drive division to Hitachi. Optics for CPUs has long been an objective, to eliminate those power-hungry data buses inside a CPU. But currently no one has made a successful or useful optical switching transistor. As EE Times noted, these would be breakthrough technologies that could replace the FET transistor. Unfortunately, as EE Times said, "there really is no replacement".
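Taking tw's own figures at face value - roughly 4 qubits now, doubling every 8 years - the arithmetic is sobering. Both inputs are his claims, not measured data:

# Extrapolating the figures in the post above: ~4 qubits today,
# doubling every 8 years. How long to ~1024 qubits, loosely the
# scale needed to threaten 1024-bit keys? Pure arithmetic.
import math

doublings = math.log2(1024 / 4)     # 8 doublings
print(f"{doublings:.0f} doublings x 8 years = {8 * doublings:.0f} years")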
I shouldn't be surprised by the fact that tw has a woefully inadequate education in technology. Over the years, he's proven himself stupid in a number of areas. But what's really funny is that this idiot actually has the gall to accuse me of being dishonest. Tw isn't fit to stand in my shadow when it comes to computer technology, and for that matter, pretty much everyone else on this board is in that boat. It's not an insult, it's just the truth. I'm one of the best all-around network engineers in the business, and that's not bragging.
I've never lied about anything I've ever said here, or anywhere else on the internet. That is a fact. I remembered reading the article about the government buying the world's largest RAM disk a long time ago. After doing some research when challenged by tw (as if he's a challenge for anyone), I found that the government did indeed buy the world's largest RAM disk and that they intend to use it in the ways I thought, but that it's not 400 terabytes. It's only 2.5 terabytes per unit. I remembered the 400-terabyte number for a reason. I'm guessing it's because they ordered enough of the units to equal 400 TB. Either way, I admit when I'm wrong, and being wrong is not the same thing as being dishonest. It's been something like a year since I read the article, and even then I was glancing through it while at an airport.

http://www.techworld.com/storage/new...fm?NewsID=1176
http://www.atsnn.com/story/37239.html
http://www.physorg.com/news6149.html

Now let's move on to tw's next retarded contention. He stupidly claims that quantum computers are a long way off. I guess for a child like him, 3 years is a long time.

http://www.engadget.com/entry/1234000933047853/
http://physicsweb.org/articles/world/15/4/4/1
http://www.physorg.com/news6149.html

Here's something about quantum computer memory...

http://www.eurekalert.org/pub_releas...-nst100604.php

tw did accidentally get something right. Most of the research for quantum computers is being done in universities. Who is paying for this research? Large computer companies like IBM, Intel, and AMD.

It must suck being such a bitter, sad, pathetic little jackass like tw. What a horrible life he must live. Still, my pity for his stupidity and dishonesty wouldn't stop me from knocking the shit out of him in person if he ever called me a liar. It's a good thing we have an internet so he can talk shit and act brave. He can't do it anywhere else.
Here are most of the places around the world doing research on quantum computing.
http://www.qubit.org/phpscripts/places.php
You talked at first about a 400 TB chip but revised it to a 1.5 TB solid-state RAM disk when you were questioned.

From a practical and engineering standpoint, we can't really tell the difference between lying, being poorly trained, making bad assumptions, reading too fast, having a personality problem that causes you to insist you have the high ground, or simply being mistaken. All an engineer would know is that you were wrong, and in the real world it probably cost somebody money and/or time.
No, I talked about a 400 TB chip, which is how I remembered it after having barely glanced at an article a year ago. I admitted to being wrong about that, and I did state the correction. Also, it's a 2.5 TB RAM disk, which is actually faster than most solid-state storage such as USB drives.

In either case, my point about the U.S. government buying the world's largest high-speed storage to be used against Americans was correct, and so was what I said about quantum computing. In other words, I was correct that it is highly unlikely that Moore's law will fail anytime soon. An engineer (I am one) would not assume dishonesty, even if they did find someone else to be mistaken. They would only bring up the mistake.
Just for the record, Radar - are you predicting that quantum computers will be available in retail for the desktop in 3 years?
No, I said that the leader in quantum computer research (D-Wave Systems) says they'll have a working model in 3 years. It might take a bit longer to get them to market, but there's enough room for other innovations to keep Moore's law going before that happens.

Moore's Law hasn't failed in more than 40 years, and I am not going to bet on it failing anytime soon. I've given reasons why. The odds are with me. I'm guessing we'll see the first quantum computers hit the market in 2010.
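For the record, the 'law' being bet on is just compound doubling: transistor counts doubling roughly every two years. A quick sketch, starting from the approximate published count of a 2005-era Pentium D (treat the numbers as illustrative):

# Compound growth behind Moore's law: counts double roughly every
# two years. Starting point is approximate, for illustration only.
transistors = 230e6     # ~Pentium D Smithfield, 2005
for year in range(2005, 2017, 2):
    print(f"{year}: ~{transistors / 1e6:,.0f} million transistors")
    transistors *= 2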
Quote:
I challenged Radar, making him provide those details. As usual, the devil is in those details. That multi-terabyte memory chip is actually a large system containing thousands of memory chips. Obviously, with a basic grasp of reality and the numbers, a terabyte memory chip did not exist, despite what Radar posted. Meanwhile, IBM's earlier 3-terabyte data storage unit (which does the same thing, slower) has already been retired to places like the National Cryptologic Museum at Fort Meade, MD. The technology is that old. It too was not a memory chip. But IBM's 3-terabyte storage unit, now long since retired, demonstrates that a 2.5-terabyte RAM drive is not a major technological breakthrough. Radar has only demonstrated that someone has deep pockets and a need.

Amazing how the poster changes his claims when challenged. It is a shame that he would also insult those who correctly challenged what was a post chock full of errors. Again, it is that devil - those little details - that forced Radar to change his tune. Rush Limbaugh does the same by simply forgetting those details. It's called propaganda. Propaganda must make extravagant claims - and never provide those details.

Meanwhile, I see nothing new in Radar's citations other than that the quantum computer technology has been performed in silicon, and maybe some better use of quantum dots. Notice the temperature that some of these experiments are performed at ... 4 degrees above absolute zero. It's still a laboratory experiment in 'basic' research - too far from being moved into 'application' research and then into a marketable product.

Radar demonstrates what I had noted in a previous discussion. Quantum physics is to today's teenager what the transistor was to a teenager in 1960. Quantum physics is that important to this nation's future and to so many advances in technology. Need we note products currently based in quantum physics: the gigabyte disk drives and PET medical scans. Radar has promoted the quantum computer like superconductivity was hyped maybe 20 years ago. Notice all those superconductive wires everywhere? Even a first trial in Chicago appears to have failed. But then, taking something from 'basic' research, through 'application' research, and then to a marketable product typically requires at least 20 years.

Now that Radar has provided citations, his posts have returned to the realm of reality. Numerous quantum techniques are being tried and are still in 'basic' research. The quantum computer as a viable product appears to be essential to our future and is at least a decade away. 20 years is not soon enough to avoid the brick wall that Moore's law may be approaching, no matter how many personal insults Radar includes in his reply.
I suppose it's not an insult to publicly call someone a liar.

We agree that quantum computers are extremely important to the future of technology. I say that current advances will carry us along until the first of the quantum computers is up and running, without interrupting Moore's law. You disagree. Right now it's simply a matter of opinion. I've shown the research and given a link to a company that says they'll have a working quantum computer in 3 years. You mentioned superconductivity as though there haven't been any advances in it in the last 20 years, when in reality the temperature at which superconductivity can be reached has gotten much higher using ceramics. In fact, it can be as warm as minus 234 degrees Fahrenheit.
Quote:
My criticisms were based primarily on 'Rush Limbaugh like' claims. Why? Rush Limbaugh is faxed daily from the White House what he should say. Reasons why are totally irrelevant. Therefore Rush Limbaugh lies. It is called propaganda. I have not insulted Rush. I have defined him for what he is by what he does.

Meanwhile, I just read an article in this month's Scientific American on a problem with quantum computing. The way I read it, decoherence means the qubit has maybe 0.5 microseconds to be initialized, perform a logical operation, and have its states read. Decoherence is the "loss of the very quantum properties that such computers would rely on." Fundamentals of quantum computing may be demonstrated in a 'basic' research experiment. But things like decoherence are the 'devilish details' that will add 10+ years to getting a functioning machine out of basic research and through application research.

A functioning transistor was finally demonstrated in 1948. But transistors took another 15 years to eventually appear in products. And even then, transistors were so exotic that a radio was rated by its number of transistors. A 'best' transistor radio was 9 transistors. Quantum computers have yet to achieve the equivalent of a 1948 transistor. They are many years from becoming useful. However, quantum physics is the future. Much like the transistor was in 1948, or blue-green steel in Ayn Rand's "Atlas Shrugged".

Superconductivity saw a breakthrough maybe 15 years ago. For a while, it appeared superconductivity would start appearing in products everywhere. However, superconductive wires in Chicago's ComEd and use in a naval warship still have not succeeded. And still, the subatomic nature that creates superconductivity is not comprehended well enough to predict and then find warmer superconductors. Having so little knowledge of what makes some compounds superconductors means we are still a long way from profitable applications. But then, it too demonstrates the long time periods between basic research and a useful product.

There is nothing in basic research that can rescue Moore's law if the FET transistor does hit that brick wall. As the EE Times article noted, many of the tricks for preserving Moore's law are no longer so promising.
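That decoherence figure implies a simple budget: coherence time divided by the time per logical operation bounds how much work a qubit can do before its state degrades. A sketch - the 0.5-microsecond window is from the article cited above, while the per-gate time is an assumed round number:

# Decoherence budget: how many operations fit in the window before
# the qubit loses its quantum state? Gate time is an assumption.
coherence_s = 0.5e-6    # ~0.5 microseconds, per the article
gate_s = 10e-9          # assumed 10 ns per logical operation
print(f"~{coherence_s / gate_s:.0f} operations before decoherence")   # ~50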
You're forgetting the fact that I'm absolutely NOTHING like Rush Limbaugh and that everything I've said is honest. Your comparison of me to his propaganda spreading holds no weight.

Your opinion is just that... an opinion. It's no better than mine. Mine is based in fact, and you claim yours is too. Only time will tell. If you're smart, you won't put money on Moore's law failing. That's it. There's nothing else to say.
Quote:
Second, at no time in that post was Radar declared another Rush Limbaugh. Please reread the post more carefully - and only for what is specifically stated in that post. Demonstrated is why your earlier post received so much criticism. It made statements in a format that Rush Limbaugh would be proud of.
I actually have known Tom a long time...
Radar,
I've known him since 1991-1992. He is thoroughly well-researched and a respected engineer. He has my respect because he does his homework and is very well-spoken, as he has shown again in this case. At no time did he insult you. He simply stated facts. And yes, you are acting similarly to Rush Limbaugh. If I wanted to hear Rush-like banter, I'd turn on my radio or follow some links off the Drudge Report site, not come here. When I want to talk technology with people who really know what they are talking about, and who do not sling insults, I'll come here. If you want to flame, there's always Fark :).

Mitch
And you are.....?
If you want to talk technology with people who really know what they're talking about, you'd want to talk to me instead of a woefully misinformed guy like tw. I actually AM an engineer, and I actually work in the field. I've used my skills for aerospace, banking, entertainment, and now biotech companies. I'm at the bleeding edge of technology, and I've remained there for the last 20+ years. I could give a shit about you comparing me to Rush Limbaugh, because you're nobody. Before today, I've never seen a single one of your posts, so your opinion means less than nothing.

You claim he stated facts. You are a liar. He did not state any facts. All he did was say, "Nuh uh" to the facts that I stated... and backed up. He didn't provide a single reference where I provided several. So run off, little boy, and let the adults like me do the talking.
Quote:
Radar, you keep talking about your degree. Is your degree in chip design? Quantum mechanics? If not, then your "I am an engineer" statement means squat. Being a 'network engineer' in no way qualifies you as an expert in this domain.
My degree is in computer science engineering, which, yes, includes chip design. I haven't actually designed chips or even used those skills since college years ago, because I went into networking. In either case, I am an engineer and work with the latest and greatest technology, whether it's communication satellites or immunodiagnostic testing equipment. I always keep up on what's current, and what I said was correct. I provided references and he didn't.

Talking shit with nothing to back it up (like tw and dar do) to someone who actually can back it up and who has offered references is an untenable position.
Quote:
I've also had a course in chip design, and I suspect I've been in the computer industry longer than you. But I don't think that makes me an expert in this area, and I don't think you're one either.
On the subject of providing references, tw's first post referenced the front page of the Aug 15 EE Times. Nothing radar has posted is even on the topic of what tw originally posted from the EE Times.
And in fact everything he has posted (as tw noted) suggests that quantum computing is too far off to counter a Moore's law slowdown. In fact, the first meaningful link he posted on quantum computing notes this right in the summary. Quote:

tw has provided a very meaningful reference that backs up his assertion. radar has posted semi-meaningful references that disprove his own assertion. That's my summary of the account so far, but you all could re-read the thread and see for yourselves.
I have a Bachelor's in Geography, but I do not call myself a Geographer.

I have a Master's in Clinical Psychology and do not call myself a Psychologist. Having a degree in engineering does not make you an engineer, unless you have worked in that particular discipline for a significant amount of time. Last we knew, Radar, you were going around to people's offices and making sure they couldn't play Minesweeper.
Quote:
Wolf, I also design, build, and support global computer networks consisting of different security models, wiring, topologies, network protocols, routing protocols, operating systems, hardware, software, databases, email systems, secure VPNs, etc. I have never supported fewer than 800 people in any computer job since 1996. I'm also not too proud to do any small job, especially when a vice-president requests it. So if they want me to booby-trap Minesweeper, go to the house of the CEO to set up his wireless networking, unpack computers, do hardware fixes on old systems, or take a trip to Cabo, Mexico to set up 40 laptops, I do that too.

I provided meaningful and relevant references and facts backing up the assertion that Moore's law is in no danger of being violated, because we'll have working quantum computers within 3 years. He gave worthless opinions backed by nothing.
Quote:
So noted in the Cellar calendar entry for September 20, 2008.
Awesome, UT! This is just like the "Book of Right" in my house. My family has a blank book where we write down all our argument-predictions, so we can tally long-term who's actually right. I want to see more dubious Dwellar claims placed on the calendar!
Most people are too smart/chicken to make predictions like that. I made a prediction earlier today in the hurricane thread that Rita won't be as bad as Katrina. I hope I'm right (especially for the victims' sake), but that prediction might come back to haunt me.
Hi.
I'm not little. I also don't talk about my customers here, as it's frowned upon by them. I also know several of the users here personally, including Undertoad. In my line of work, I move in the same crowd as tw, Undertoad, and a few other users. I have never heard anything but respect for those people, even from their former business partners.

I don't talk buzzwords. I don't brag or make comments about my customers. I also do research, and I know the difference between a website and a peer-reviewed journal. EE Times is peer-reviewed; your sites are not. tw, when he presents research, presents it from those sources. Any corporate researcher or academic will tell you which one is the more credible.

I've been here since 1991. Deal with it. I don't need to justify my posting frequency to you. And BTW, the last guy I met with your attitude got tossed out of a customer's site. People with the attitude you have shown do not last long in any organization. Unfortunately, I encounter too many of you in my travels. People with that personality give IT consultants a bad name.

When you attempt to make blind judgments based on my number of posts, you prove yourself to be a real troll. Go away, little boy. We have had enough of you. Find somewhere else to brag about what you do.
The first posted sentence sounded more like a eulogy at a funeral.
Quote:
Quote:
Quote:
But, but, but, I just ordered mine online.
Btw, great thread.
Quote:
I'm only half joking here. A huge amount of innovation is driven by entertainment. It has been argued that porn provided the early impetus for the VCR industry. More effort has been expended on developing, marketing, and getting health coverage for Viagra than on many cancer medications. More Hummers were sold to soccer moms than the US military ever bought. Collectively speaking, roughly 300 million people in the US (or a few billion worldwide) looking for fun have more economic clout than the military, government, or even business. If there's a solution there, it will be found.

One possibility is moving from electricity to light, as in photonic computing. Another is building a ternary computer instead of a binary one, which would be more efficient. Doing this might require a combination of materials that allows for three states instead of two.
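The usual argument for ternary is 'radix economy': model the hardware cost of representing N values in base r as digits times states per digit, and the cost is minimized near base e, with 3 the closest integer. A quick check (the cost model is the standard textbook simplification, not a hardware measurement):

# Radix economy: cost of representing N distinct values in base r,
# modeled as (digits needed) * (states per digit). Base 3 edges out
# base 2 because 3 is the integer nearest e.
import math

N = 10 ** 6
for r in (2, 3, 4, 10):
    digits = math.ceil(math.log(N, r))
    print(f"base {r}: {digits} digits x {r} states = cost {r * digits}")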
I'm headed out to MicroCenter to buy my quantum computer this morning. Soon, I'll be posting WTF photos from a kick ass machine! I sure hope they don't force me to buy it with Vista on it, though.
Bahahahah
TW,
How does it feel to be proven right, and to see that smack-talker Radar proven wrong?

Mitch
When the entry was added, I didn't say anything (there is usually nothing to be gained by arguing with Radar directly), but it was pretty clear that Radar was putting all his eggs in one company's PR basket ("D-Wave") and that the company was releasing just enough information to attract venture capital, a sure sign that they are too optimistic.
That company has now run two closed demos which attracted much more criticism than money, in which they claim that QC was happening and roughly the rest of the QC community says it was not. The demos, which included solving a Sudoku puzzle (!), involved tasks that a non-QC can perform. After the demos, the last of which happened in November 2007, one pro-D-Wave blogger noted, "I think that there's a lot to be said for D-Wave, and a lot to be said against it, but by far they've got the best chances of anyone at making a quantum computer in the next 5 years." At the next expo where they'll show, they are involved in a talk claiming their disruptive technology, among others such as optical communication and flash memory, "could lead to viable systems in the 2015-2020 timeframe."

Too often Radar uses a "conclusion first" approach: decide what the truth is, and then seek out any corroborating evidence. *Any* evidence will do, no matter how suspect, because it reflects "truth". We all do this from time to time, particularly in politics. The process of picking a side is practically arbitrary, like picking a team to root for in a sport. Once the side is picked, though, the chosen side represents all that is good, and the opposition represents all that is evil or bad.

And so by post #6 in the thread we are already into ad hominem and claims that we are Engineers and Know What We Are Talking About. At that point the whole thread becomes the usual boring insult-trading, and only the Cellar calendar can rescue us.
I'd like to make a prediction. You can put it on the calendar if you like, too, UT.
We will have a new POTUS by January 20, 2009. I have based this projection on careful research and evaluation of all available information, and I am very confident that my conclusion is correct. ;)
You underestimate the power of Cheney. :biglaugh:
UT, I hope you understand
UT/Tony,
I've been calling the Cellar in various forms since 1991, and I really appreciate what you've done here over the years, transforming this from Waffle on UNIX to the current setup. His grandstanding was the ONLY time I have been insulted that way on your system in 17 years of calling/posting here. I have his post to me saved. If I am EVER in a position where he is presented to me or an associate at another company as an IT consultant, I will forward that post to the consulting agency that presents him, along with a strong "will not hire - EVER" notice.

I am no "little boy" who knows nothing. I actually know two people working in this field who have Ph.D.s from Stanford in the related physics disciplines, and who are establishing a business working on practical applications of quantum computing (Tony, you have access to my LinkedIn profile; one of them is directly linked to me there). This is years from production, even with DARPA money, HP money, and IBM money.

Like I said... he's an idiot. 17 years of intelligent dialogue with many cool users here whom I know personally, and he has to come along with his grandstanding attitude, not knowing who he's pissing off (and assuming they're little-boy newbies who know nothing), and insulting everyone who doesn't agree with his point of view.

Radar can go fuck himself with something that has lots of splinters and exposed jagged surfaces.

Mitch
Mitch, don't take it personally. He directs that sort of crap at everyone who disagrees with him.

Your credentials are impeccable. You have gone out of your way to be most helpful on a number of my (and others') computer-related problems, and never steered us wrong. You know your shit. tw, on the other hand, has ventured to be the last word in areas outside his expertise... claiming to be the ultimate arbiter while expounding opinion, thus damaging his credibility. :haha:
http://www.dwavesys.com/index.php?ma...t01returnid=21
What do you know, a working quantum computer being demonstrated. It didn't even take the 3 years I said it would take.

http://www.sciencedaily.com/releases...0903134202.htm
http://news-service.stanford.edu/pr/...um-051408.html
http://www.nanowerk.com/news/newsid=3274.php

Also, new developments to make it work better. Say what you want about me, but the facts speak for themselves, and I was correct.

Mitch, get over it. It was 3 damn years ago, you whining little bitch. ;) I'm not consulting anymore. I've moved up into management. I'm the I.T. Director and lead network engineer for a movie company in L.A.
In the 2 latest links:
Quote:
Quote:
That don't change anything
Radar,
I didn't like seeing that little post of yours dredged up again. You made assumptions that simply were not true then, and are not true now. The level of management you are at doesn't change a damn thing about who you are ;).

When they offer pricing, I'll be on the phone with them. There's at least one doctor in cancer research I know who will pay through the nose to make his equations run faster on his huge datasets (http://www.dwavesys.com/index.php?page=bioinformatics). However, the first applications of this type of physics and computing are going to be in areas such as GPS and encryption (think hardware random number generator first). General-purpose computing is still years off, no matter what a press release from late 2007 says.
Yeah, let's put it this way: if D-Wave had actually produced a working model, they would have suitors lining up on one side to give them tons of money (not just a few million in VC change) and suitors lining up on the other side to run applications.
But they don't. On either side. Quote:
http://scottaaronson.com/blog/?p=291 Quote:
Quote:
Quote:
If I could afford such a computer, I'd be in line to buy one too. It makes sense that they would use it first to attack encryption. The government is always interested in that, and they have invested millions into this research. They always want to get their hands on the new stuff first, and they even force companies to delay the release of certain technology until they get it. I said they'd have a working computer within 3 years. They did. It might not be very practical or useful at the moment, but they have a working version.
Quote:
Quote:
No attack - just my opinion. I believe that engineers are interested primarily in facts. What's wrong with that?
I'll be the first to admit that aside from reading this thread and a couple of the associated links, I do not know anything about quantum computers.
Moving back to the subject, quantum computing is not a solution to all computing problems. QC is only practical for a particular type of computing problem, where numerous possibilities (permutations) exist simultaneously during the computing process. Breaking encryption is one example of where QC can be so productive. QC works by storing and manipulating a large amount of data with few particles. But when the computation is done, only one answer can be read. So, for example, a problem with two or more valid answers cannot be solved efficiently. Problems that don't work well in quantum computing include trying to optimize the packing of odd-sized boxes in a trunk, or finding a route that visits every island connected by bridges exactly once. Such problems are "NP" - nondeterministic polynomial time.

Quantum computers are not a magic solution to all computing, as so many assume. QC is a solution to a limited class of problems that involve permutations.
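One concrete way to see the 'limited problems' point: for unstructured search, Grover's algorithm needs about (pi/4) * sqrt(N) queries versus up to N classically - a quadratic speedup, useful but not magic. A toy comparison, where the classical side actually runs and the quantum side is just the textbook query-count formula, not a simulation:

# Unstructured search: classical brute force vs Grover's quadratic
# speedup. The Grover figure is the standard ~(pi/4)*sqrt(N) query
# count; nothing quantum is simulated here.
import math, random

N = 1_000_000
target = random.randrange(N)
classical = next(i + 1 for i in range(N) if i == target)   # up to N queries
grover = math.ceil(math.pi / 4 * math.sqrt(N))             # ~786 queries
print(f"classical: {classical} queries (worst case {N}); Grover: ~{grover}")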
Quote:
I guess I don't. Thanks for pointing that out. Buh bye.
And why do you hate America?
Is that directed at me, SG?
Dude, she's gotten you twice now with that one. You need to watch more Colbert Report.
Well, she could be serious - you never know with those Brits ;)

They have that odd sense of humor and watch Dr. Who and.....
Moore's Law will hold up even if the new quantum computers take a long time. The new Zii chips from Creative Labs are amazing.
http://ziilabs.com |
Encryption
Quote:
The purpose of quantum computing in random number generation is to make cracking encryption by hostile agencies a lot harder, by increasing the entropy of the system :). Current hardware-based random number generators just don't scale up to the level you need for 10-40 Gbps pipes and beyond. There are certain customers in corporate America and Europe (think banks, multi-national telecoms, and multi-national pharma) that would pay a lot of money to encrypt their dark fiber and SAN traffic efficiently. Defense is right up there too, but that's a given.

Of course, the opposite is also true. With the right code, any "product of primes" encryption such as RSA would be toast with quantum computing.

GPS would be the best application, as you'd have the ability to use the current satellite system to be significantly more accurate (think fractions of a millimeter).

I still think the Zii is more marketing than product :).
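To make the "product of primes" point concrete: RSA's private key falls out immediately once the public modulus is factored, which is exactly what Shor's algorithm would do at scale. A toy-sized sketch with deliberately tiny primes (real keys use primes hundreds of digits long):

# Toy RSA: factoring n = p*q hands an attacker the private key.
# Deliberately tiny numbers; real RSA is safe classically only
# because factoring a 1024-bit n is infeasible.
p, q, e = 61, 53, 17                # demo primes and public exponent
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)                 # private exponent (Python 3.8+)

cipher = pow(42, e, n)              # encrypt 42 with public key (n, e)
print(pow(cipher, d, n))            # 42 - recovered with private key

# An attacker who factors n = 3233 back into 61 * 53 rebuilds d:
print(pow(e, -1, (61 - 1) * (53 - 1)) == d)   # True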
Quote:
Quote:
I'm with you on all of that. I do think Zii has great possibilities, though, and is more than marketing. Come on, man, you've got to admit real-time rendering with ray tracing on a setup with about the same footprint as a standard tower is pretty impressive.