PDA

View Full Version : Moore's Law coming to an end? (HALT ALL YE NON GEEKS) ^_^



Artie
04-12-03, 04:48
Sorry, all you non-geeks out there, just ignore this post, but I've found this VERY intriguing.


"Intel scientists find wall for Moore's Law"
http://news.com.com/2100-7337-5112061.html?tag=nefd_lede


"Myths of Moore's Law"
http://news.com.com/2010-1071_3-1014887.html?tag=st_rn

This has got me thinking. Why would people need more CPU power? Is the 5-nanometer gate limit for binary patterns enough? Should we press on and spend more than $3 billion on fabrication labs? When is enough enough? Personally, I love that we're hitting a limit. That means we'll have to find different ways of making computers, and therefore come up with different means to accomplish them.

Trinary, anyone?

Gestra
04-12-03, 05:03
Moore's law is not a law.

Artie
04-12-03, 05:29
Originally posted by Gestra
Moore's law is not a law.


....yes....is this a relevant comment or just upping your post count? kthxbye.

LuCiD
04-12-03, 05:55
Supposedly there are still 15 years left before they hit the limit of transistor size.

Like people say that when we run out of oil we will HAVE to turn to alternative methods of getting energy, this will apply to computing as well: when they run out of ways to make the transistor smaller, the competition will undoubtedly drive them to create different methods.

If you look at how few years it's taken for MAJOR advances in computing in the 20th century, they may find new methods before they hit the limit on transistors. A few months ago I was reading an article about a theory (I think some IBM engineers had) of making computers that are no longer binary (I can't remember exactly, but they were called "quantum computers" and used multiway switches). If it, or anything like it, a jump in the science, proves feasible, it will happen eventually. They will find the money, because they need to keep making money.

Dribble Joy
04-12-03, 06:16
As far as I know, binary is still, by FAR, the best way to compute.
It is beyond simple; you can apply data correction with ease. 1 or 0, easy peasy.
With respect to quantum computers, these aren't that far off.
Research in the land of Oz has solved one of the most important hurdles in the use of light to operate a binary switch, and the production of a quantum NOT gate.
1 and 0 are represented by the spin of an atom or the polarisation of a photon. Flipping from one to the other is quite easy; however, combining two 'qubits' to produce a computation is harder.
The use of mirrors and beam splitters allows us to create an interference pattern whereby the two particles/waves share quantum properties and the flip is achieved.

Shujin
04-12-03, 06:42
http://www.cs.caltech.edu/~westside/dilbert.gif


Originally posted by Dribble Joy
As far as I know, binary is still, by FAR, the best way to compute.
It is beyond simple; you can apply data correction with ease. 1 or 0, easy peasy.
With respect to quantum computers, these aren't that far off.
Research in the land of Oz has solved one of the most important hurdles in the use of light to operate a binary switch, and the production of a quantum NOT gate.
1 and 0 are represented by the spin of an atom or the polarisation of a photon. Flipping from one to the other is quite easy; however, combining two 'qubits' to produce a computation is harder.
The use of mirrors and beam splitters allows us to create an interference pattern whereby the two particles/waves share quantum properties and the flip is achieved.

"Quantum and conventional computers encode, store and manipulate information as sequences of binary digits, or bits, denoted as 1s and 0s. In a normal computer, each bit is a switch, which can be either 'on' or 'off'.

In a quantum computer, switches can be on, off or in a superposition of states - on and off at the same time. These extra configurations mean that quantum bits, or qubits, can encode more information than classical switches.

That increase in capacity would, in theory, make quantum computers faster and more powerful. In practice it is extremely difficult to maintain a superposition of more than a few quantum states for any length of time. So far, quantum computing has been demonstrated with only four qubits, compared with the billions of bits that conventional silicon microprocessors handle."
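The qubit description above can be sketched in a few lines of entirely classical simulation. The gate matrices (quantum NOT, Hadamard) are the standard textbook ones, but this is just an illustration of the maths, not how a real quantum computer is programmed:

```python
import math

# A qubit is a pair of complex amplitudes (a, b) for |0> and |1>,
# with |a|^2 + |b|^2 = 1. Measuring gives 0 with probability |a|^2.
def apply(gate, state):
    (g00, g01), (g10, g11) = gate
    a, b = state
    return (g00 * a + g01 * b, g10 * a + g11 * b)

X = ((0, 1), (1, 0))          # quantum NOT gate: swaps |0> and |1>
s = 1 / math.sqrt(2)
H = ((s, s), (s, -s))         # Hadamard gate: creates a superposition

zero = (1.0, 0.0)             # the definite state |0>
print(apply(X, zero))         # (0.0, 1.0) -> a definite |1>

plus = apply(H, zero)         # equal superposition: "on and off at once"
print(round(abs(plus[0]) ** 2, 3), round(abs(plus[1]) ** 2, 3))  # 0.5 0.5
```

The last line shows the "extra configuration" the quoted passage describes: after the Hadamard gate, a measurement would come up 0 or 1 with equal probability.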

bounty
04-12-03, 06:45
"The most exciting phrase to hear in science, the one that heralds the most discoveries, is not 'Eureka!' (I found it!) but 'That's funny....'" - Isaac Asimov

Shujin
04-12-03, 06:46
Originally posted by bounty
"The most exciting phrase to hear in science, the one that heralds the most discoveries, is not 'Eureka!' (I found it!) but 'That's funny....'" - Isaac Asimov Nah, it's more like... "oh shit, I'm gonna be rich."

Dribble Joy
04-12-03, 06:54
Whether or not you can store more data within a qubit is not the point. Binary is far simpler, and when you find an error you can immediately correct it: if 1 is 'wrong' then 0 must be right. If the variables are 0, 1, 2, 3, then there are three other states the data can exist in. A simple flip between two states is faster and more reliable, and actual data storage will be somewhat irrelevant (at present), as quantum computers are comparatively instantaneous next to their electronic versions.
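The "if 1 is wrong then 0 must be right" idea is the basis of simple binary error correction. A minimal sketch, using a 3-bit repetition code with majority voting (illustrative only, not any particular hardware scheme):

```python
# Each bit is sent three times; a single flipped copy gets outvoted.
def encode(bits):
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    out = []
    for i in range(0, len(received), 3):
        triple = received[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)  # majority vote per triple
    return out

sent = encode([1, 0, 1])   # [1,1,1, 0,0,0, 1,1,1]
sent[4] = 1                # flip one copy in transit
print(decode(sent))        # [1, 0, 1] -- the error is corrected
```

The correction works precisely because each position can only be 0 or 1: a wrong copy has exactly one alternative, so two good copies are enough to outvote it.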

Quantum computers will still work at the speed of light, however, so speed will still be restricted by how small you can make the gates.
There are ways of passing information beyond the speed of light, but the technology is in its infancy.

I obviously know nothing about this subject compared with you, so what I have said is most likely bollocks.

Shujin
04-12-03, 06:58
Originally posted by Dribble Joy
Whether or not you can store more data within a qubit is not the point. Binary is far simpler, and when you find an error you can immediately correct it: if 1 is 'wrong' then 0 must be right. If the variables are 0, 1, 2, 3, then there are three other states the data can exist in. A simple flip between two states is faster and more reliable, and actual data storage will be somewhat irrelevant (at present), as quantum computers are comparatively instantaneous next to their electronic versions.

Quantum computers will still work at the speed of light, however, so speed will still be restricted by how small you can make the gates.
There are ways of passing information beyond the speed of light, but the technology is in its infancy.

I obviously know nothing about this subject compared with you, so what I have said is most likely bollocks. Notice my quote marks. I copied that out of a book I've got ;P

Dribble Joy
04-12-03, 07:02
furry muff.
I just reread my post, and it is indeed mostly bollocks, being drunk doesn't help much.
Electronic and light signals both travel at the speed of light, but the quantum gates will be much smaller, thus more gates and processing power in a given area.

Artie
04-12-03, 07:05
Originally posted by Dribble Joy
furry muff.
I just reread my post, and it is indeed mostly bollocks, being drunk doesn't help much.
Electronic and light signals both travel at the speed of light, but the quantum gates will be much smaller, thus more gates and processing power in a given area.


See dribble? You're actually a really smart person ^_^ So now you have no excuse for thinking you're mediocre.


Hmm....I feel very inferior compared to you guys in my knowledge of binary and quantum computing. The only thing I'm waiting for is what AMD will call their line of quantum processors.


Maybe the 3200++? :lol:

Shujin
04-12-03, 07:16
Originally posted by Artie
See dribble? You're actually a really smart person ^_^ So now you have no excuse for thinking you're mediocre.


Hmm....I feel very inferior compared to you guys in my knowledge of binary and quantum computing. The only thing I'm waiting for is what AMD will call their line of quantum processors.


Maybe the 3200++? :lol: Willing to bet one of them will call it "Q" ;] or Cue.

Dribble Joy
04-12-03, 07:29
One thing I want them to bring forward more is the instantaneous (and I do mean instantaneous) data-transmission quantum effects.

A group of researchers (quite a while ago now) made a small, thin ring of a superconducting material, about 0.5 cm across and about 2 mm in diameter, with a 'pinch' at some point on the circumference, which has a diameter of around a few microns.
When set into a superconducting state, the spins of the electrons within the ring would align with each other. This is due to the VERY weird and as-yet not entirely understood goings-on inside superconductors.
They found that when the spin of the electrons at one side of the ring was changed, the spin (which was aligned with this) on the other side changed instantly: not just quickly, or at the speed of light, but without delay.
This could prove useful in making gates, and links between gates, that work without delay; a relatively simple and small processor could perform calculations that a conventional supercomputer would take decades to carry out.

Shujin
04-12-03, 07:37
Originally posted by Dribble Joy
One thing I want them to bring forward more is the instantaneous (and I do mean instantaneous) data-transmission quantum effects.

A group of researchers (quite a while ago now) made a small, thin ring of a superconducting material, about 0.5 cm across and about 2 mm in diameter, with a 'pinch' at some point on the circumference, which has a diameter of around a few microns.
When set into a superconducting state, the spins of the electrons within the ring would align with each other. This is due to the VERY weird and as-yet not entirely understood goings-on inside superconductors.
They found that when the spin of the electrons at one side of the ring was changed, the spin (which was aligned with this) on the other side changed instantly: not just quickly, or at the speed of light, but without delay.
This could prove useful in making gates, and links between gates, that work without delay; a relatively simple and small processor could perform calculations that a conventional supercomputer would take decades to carry out.
About a year ago I saw a thing on CNN (it's constantly on at my work) where a lab in Australia was able to teleport a beam of light across the room to another area.
I know it doesn't seem to... yeah, but still kewl.

http://news.bbc.co.uk/1/hi/sci/tech/2049048.stm

Dribble Joy
04-12-03, 07:49
Ah this old thing.

Yes, setting up a photon pair(s) that are 'bonded', so when one has its state altered the other also matches the change.
Could be useful for interstellar communications, but you would have to set up a stream of photons in the first place, which will take time to reach the two locations.
The use of entanglement to destroy one photon and 'recreate' it at some other point in space is interesting from the perspective of teleportation but, as the article says, doesn't breach the speed of light.

robinitnow
04-12-03, 09:28
In 20 years chips will be so powerful that I can't see why we would need something more powerful.

amfest
04-12-03, 09:44
Quantum teleporting is problematic for humans because the original is destroyed in the process of creating the replica.

:lol:

Talk about sucking... sure, we can get you here fast!! ...only you have to die, and it's actually your clone that carries everything out.

ElfinLord
04-12-03, 10:17
Originally posted by robinitnow
in 20 years chips will be so powerful that I can't see why we would need something more powerful.
We're human.

Humans will always search for something bigger, better, faster, and most definitely more powerful.

I mean, how else did we get from throwing stones to nuclear weapons? Not by being satisfied with the fact that what we had was satisfactory for completing the task at hand, that's for sure.

Anyway, quantum computers, traditional computers, it doesn't really matter to me. Just give me the fastest, most powerful computer I can afford and I'm happy. (Just as long as it's more powerful than Fenyx's :D:p)

japata
04-12-03, 10:32
Why spend so much money on something that's going to be useless in a while? What I mean is that the current CPU design, based on unique parts on the circuit, will shortly be outmatched by neural-net processors (yeah yeah, the Terminator's brain, I know. ;) ) - a type of processor that has only similar parts, which can adopt any function that is required, and if one of those parts broke, the CPU would only become slightly slower. Also, this of course couldn't be silicon-based, so they'd probably use the newly found composite developed by Swedish engineers earlier this year, or maybe even light/lasers.

This might sound pretty confusing, so lemme clear it up a bit.

There's the normal processor, which has multiple unique parts in it; if even one weld failed, the processor would be broken. In a neural-net CPU there are hundreds of gates and passages crossing over, combining, and splitting in two, so if one route failed, another would assume its part... Also, if it were made from a shape-adapting electroactive material, those passages would be created when needed - thus it would have unlimited efficiency. No more need for CPU upgrades.

So, instead of spending billions to improve current technology, they should put tens of billions into developing new tech. At this point these neural-net processors are only high-tech science fiction. :)

robinitnow
04-12-03, 10:36
Modern processors, or at least the P4s, are designed with redundant circuitry. When the P4 is switched on, the first thing it does is see which bits are working and then choose the circuits that are to be used. If one circuit breaks, it just looks for one like it and uses that, if it has one.

or so I am told anyway

japata
04-12-03, 11:01
Originally posted by robinitnow
Modern processors, or at least the P4s, are designed with redundant circuitry. When the P4 is switched on, the first thing it does is see which bits are working and then choose the circuits that are to be used. If one circuit breaks, it just looks for one like it and uses that, if it has one.

or so I am told anyway

In a neural-net thingie, if a whole transistor array is overheated and burnt, for example, a new unit - let's call it a primordial circuit - that doesn't have anything to do yet rushes to the site and assumes a new form to replace the blown circuit. It would really work like a human brain: synapses trying to find a correctly working nerve tract, and if none are found, one of the deposited primordial cells is rushed to the site and developed into a new nerve. The finest thing about a neural-net processor is that it would be totally silent and wouldn't heat up, as it would probably use sodium-potassium valves to move the impulse. :)

Well, like I said: sci-fi. ;)

Shadow Dancer
04-12-03, 11:05
Originally posted by amfest
:lol:

Talk about sucking... sure, we can get you here fast!! ...only you have to die, and it's actually your clone that carries everything out.


Oh god, let's not start THAT debate........................... :p

ElfinLord
04-12-03, 11:08
Originally posted by amfest
:lol:

Talk about sucking... sure, we can get you here fast!! ...only you have to die, and it's actually your clone that carries everything out.

Originally posted by Shadow Dancer
Oh god, let's not start THAT debate........................... :p
I want my own personal, real life Gen Rep. :p

J. Folsom
04-12-03, 11:16
Originally posted by robinitnow
in 20 years chips will be so powerful that I can't see why we would need something more powerful. That's what people said about the first 166 MHz Pentiums, and just look at what we have available now.

jernau
04-12-03, 12:05
General point : the death of Moore's Law is called every few years, yet it still stands there, eerily accurate. It's not a real rule, in the scientific sense, as it can't be "proven" true, but it still works.


Specific point : logic with more than two states is not simply a matter of storage. The big advantage comes in calculation, where a circuit can return, for example, "yes", "no", "working", "maybe". Manchester Uni have successfully built trinary computers where "working" is the third state. By doing this you can remove a lot of the control overhead within a CPU that spends its time monitoring and balancing load.
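One way to picture three-state logic is a Kleene-style sketch, with the third value standing in for "still working". This is purely illustrative (the Manchester design will differ):

```python
# Three-valued logic: NO < WORKING < YES, so min/max act as AND/OR.
NO, WORKING, YES = 0, 1, 2

def tri_and(a, b):
    return min(a, b)   # YES only if both are YES; NO if either is NO

def tri_or(a, b):
    return max(a, b)   # YES if either is YES; NO only if both are NO

print(tri_and(YES, WORKING))   # 1: undecided until the pending operand settles
print(tri_or(YES, WORKING))    # 2: already YES regardless of the pending input
```

The second result shows the control-overhead saving: an OR gate can commit to "yes" without waiting for the still-working input, rather than some supervisor circuit tracking which operands have finished.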



/edit - @J.Folsom - don't forget Bill Gates' famous "No-one will ever need more than 640K"

Menolak
04-12-03, 12:53
More powerful, not so useless. Why? Well, all of you geeks know about Star Trek. Just watch all the cool things you could have with a more powerful computer. The holodeck, for example. Imagine playing a game - Neocron, CS, Leisure Suit Larry :lol: - in a holodeck :).

OK, seriously. I've known for a couple of years now that Moore's law will become obsolete in about 15-20 years. Here in Québec some researchers are developing a chip that will work with light instead of electricity. One of the changes this will bring: as someone mentioned earlier, there are switches in your processor, on/off. But the major thing with a light processor will be the near-infinite possibility of "choices" (colour, density, intensity).
Just try to imagine the possible applications. One of the big problems right now with A.I. is the limit of the two-"choice" chip. With a light processor, a computer could eventually draw conclusions, make a choice between different possibilities and act accordingly. As it is right now it can't, because the computer can't create the different solutions all by itself. The computer is limited by the two choices. It can't create. Not of its own will.

Lexxuk
04-12-03, 15:24
Originally posted by jernau

/edit - @J.Folsom - don't forget Bill Gates' famous "No-one will ever need more than 640K"

Don't forget that IBM bloke: "I can't imagine there being a need for, ohh, five computers in the world", or something like that anyhow, hehehe.

Anyway, if I remember Moore's law correctly (without reading up), it's based on the principle of the power of a computer doubling every so often, 6 months or something? Well, exponential leaps are being made in computers, but the first ones based on a new technology might only be twice as powerful as the previous ones :p I dunno, I'm not a geek, honest (but yes, Moore's law is spookily accurate).

IceStorm
04-12-03, 16:13
Since no one's going to bother looking it up...

Intel's page on Moore's Law (http://www.intel.com/research/silicon/mooreslaw.htm)

Moore's law has nothing to do with speed or "power". It has to do with the exponential increase in the number of transistors that can be squeezed into the same area - roughly a doubling every two years.

MjukisDjur
04-12-03, 16:54
I don't have the energy to actually read it either, but they started saying this 5 years ago and Moore is still going strong :P

rubaduckythug
04-12-03, 16:58
Can't remember where I heard this, but somewhere I heard that they would be able to make a transistor the size of atoms, or close... :rolleyes: That would be kool ;)

Vampire222
04-12-03, 17:41
Well, he'll have been right for at least 50 years by the time it ends, so that's pretty good going for a simple law anyway.

Omnituens
04-12-03, 18:03
Apparently, the new Cell technology (being co-developed by Sony and IBM) will defy Moore's Law.

I found an example diagram, but it was just squares with the word 'Cell' in them, so it wasn't really that impressive :lol:

JackScratch
04-12-03, 20:06
The limits of progress have been touted as immutable since the dawn of man. "We will never travel faster than 80 miles per hour; it is not possible, and if we could, all the air would be sucked from our lungs." "We will never travel faster than the speed of sound; we would turn into pure energy." "We will never travel faster than the speed of light; if we did, we would travel back in time, creating a paradox which would undo the act from happening in the first place." Etc. These immutable boundaries placed by those both in and out of the know have always existed and will always exist. Sometimes they are correct (as far as we have discovered) and sometimes they are very wrong. The pursuit of progress is always a good one; however, it should be balanced against other equally valuable pursuits. Do not research at the cost of all else.