
View Full Version : A reason to get PCI Express



jernau
28-06-04, 15:51
http://www20.graphics.tomshardware.com/graphic/20040628/index.html

Kilowatt PSUs here we come. :D

phunqe
28-06-04, 15:58
Yeah I mentioned another article in my computer thread... Basically I got so excited I had to calm down :p

However, they won't be here until Q3-Q4 :mad:

IceStorm
28-06-04, 16:09
There's only one PCI Express chipset that can pull this off in the near future, and it's Intel's Tumwater, a workstation-class chipset. However, Tumwater doesn't support two x16 PCI Express slots - it supports an x16 and an x8. What Iwill did for Alienware is build a board with two x16 slots but route signal traces for only half of the second x16 slot. It'll work because the card will find only half the number of lanes it's equipped for and fall back to x8 mode. Even in x8 mode, it has a ~2GB/sec full-duplex transfer rate (8 x ~250MB/sec per lane). To date, though, I haven't seen another Tumwater board sporting two x16 slots.
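A quick back-of-the-envelope on those numbers, assuming ~250MB/sec per lane, per direction (the first-generation PCI Express rate after 8b/10b encoding); a rough Python sketch, nothing official:

```python
# Rough PCI Express 1.x slot bandwidth, assuming ~250 MB/s per lane, per direction.
PER_LANE_MB_S = 250

def slot_bandwidth_mb_s(lanes: int) -> int:
    """Peak one-way bandwidth of a PCIe slot with the given lane count."""
    return lanes * PER_LANE_MB_S

for lanes in (16, 8, 4, 1):
    gb = slot_bandwidth_mb_s(lanes) / 1000
    print(f"x{lanes}: ~{gb:.1f} GB/s each way")
# x16 -> ~4.0 GB/s and x8 -> ~2.0 GB/s each way, matching the figures in the post.
```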

At least this means 5k and up for an Alienware ALX w/ VideoArray won't be the only option.

jernau
28-06-04, 16:09
@phunqe - Ah, I missed that.

I can live with a small delay as it means the mobos can catch up.

The Alienware solution smells of hackery to me. As they say in the article, it's inevitable that NVidia will provide a suitable board/reference design when the time comes.

msdong
28-06-04, 16:13
lol, 2 NVidia cards. You'd better not touch your tower case because it's going to be HOT :)

But PCI Express is a little too far away, and 2 GF 6xxx cards are a little out of reach until next year...

Ryuben
28-06-04, 16:13
Erm......... so you could have 2 x 256 megs of graphics power........... erm


:o 8| 8|

yeah!

Dual 2.8 Xeons + 4 megs of DDR400 RAM + 2 brand spanking X800s????


*drools*

jernau
28-06-04, 16:16
Dual 2.8 Xeons + 4 megs of DDR400 RAM + 2 brand spanking X800s????


*drools*
Even if ATI had the SLI patents or came up with something of their own, would you want to go within a mile of their drivers for it?

**shudders**

phunqe
28-06-04, 16:17
Semi OT: a 350W PSU is enough for an Athlon64 and an X800XT, right?

Ryuben
28-06-04, 16:18
Semi OT: a 350W PSU is enough for an AMD64 and an X800XT, right?
No, and if you don't have water cooling then you'd better have one mofo cooler + extractor + about 3 chassis fans :p

so try a 550W

jernau
28-06-04, 16:19
Semi OT: a 350W PSU is enough for an Athlon64 and an X800XT, right?
If you like house-fires.

phunqe
28-06-04, 16:19
No, and if you don't have water cooling then you'd better have one mofo cooler + extractor + about 3 chassis fans :p

so try a 550W

Hmm, if you don't overclock, the stock fans on both the CPU and GFX should be enough, plus one general chassis fan?

I've never gone beyond stock fans if not overclocking.

Ryuben
28-06-04, 16:21
Hmm, if you don't overclock, the stock fans on both the CPU and GFX should be enough, plus one general chassis fan?

I've never gone beyond stock fans if not overclocking.


If you only use 1 chassis fan, you'd better put an extractor under the GFX then :p

Oh, unless you want your CPU running at, what, 50°C.

They both produce an amazing amount of heat. My friend put his net card next to his new X800 and it kind of melted the edges of his net card :\ the plastic was warm and bendy.

phunqe
28-06-04, 16:24
Well anyway, I see 350W or higher recommended in most places... So I'll probably go for a 400+ something.

Also, it seems odd to me that you shouldn't be able to run a gfx card out of the box without additional cooling o_O

Strych9
28-06-04, 16:43
The Alienware rig is described in the latest Maximum PC magazine.

It can handle ANY TWO video cards on the planet (that use PCI Express), meaning you could team up NVidia and ATI if you needed to.

It has been tested, and there is no seam line between the top and bottom halves of the screen. Supposed to work VERY well.

Ryuben
28-06-04, 16:49
Well anyway, I see 350W or higher recommended in most places... So I'll probably go for a 400+ something.

Also, it seems odd to me that you shouldn't be able to run a gfx card out of the box without additional cooling o_O


You _can_, I'm just not advising putting anything near it and/or expecting a cool box :p

Also you will need to clean the GFX fan about once every 4 days without the extractor fan :p

phunqe
28-06-04, 16:57
Hmm... I wouldn't mind watercooling for the X800XT o_O

Dunno what fits though... Wapochill has kits for CPU/chipset/GFX but I can't figure out if it's OK for the Athlon64 and X800... Can't see anything whatsoever on their homepage for Socket 939 and X800 cards...

EDIT: Cannot find any at Zalman either apparently. Seems like some odd vendors have it... need to buy it locally though and a mainstream brand would be nice.

EDIT 2: Cooling for the old 754 socket apparently fits for the 939 as well, so no problems there...

[TgR]KILLER
28-06-04, 17:21
I run a whole system on a 460W PSU... but it's not just the watts on a PSU that count...

A 400W Enermax can deliver just as much power as a 550W Q-Tec or other lower-end brand...

retr0n
28-06-04, 17:28
If you like house-fires.


damn that made me crack up...

Nice stuff coming from NVidia though, seems like I should hold off a bit longer before upgrading.

Maarten
28-06-04, 17:30
NVidia will release the nForce 4 chipset later this year, which will feature a dual 16x PCI-E solution for SLI systems... :)

jernau
28-06-04, 17:33
NVidia will release the nForce 4 chipset later this year, which will feature a dual 16x PCI-E solution for SLI systems... :)
Got any specs?

I figure 2 GPUs deserve at least 2 CPUs.....

Carinth
28-06-04, 18:04
Though the second slot is only operating at x8 speed, PCI Express x8 is still way faster than AGP 8x. Also, the system does come with liquid cooling, because yes, two CPUs and two GPUs do generate quite a lot of heat. Alienware came up with the idea based on a similar setup you used to be able to do with the Voodoo2. It was a PCI card used in addition to your AGP graphics card; since it was PCI you could buy a second Voodoo2 and put it in for twice the performance. With PCI Express coming up, they'll be able to repeat this dual-card setup. The gist of it is that Alienware's proprietary software splits the screen in two and assigns a graphics card to each half. Maximum PC tested a prototype of the system Alienware will be offering and said it performed wonderfully, with no fading in the middle or any way to tell there are actually two graphics cards drawing the screen. The performance boost will be ridiculous, goodbye benchmarks : )

The downside though is that with two high-end graphics cards and two CPUs, this system will probably be one of the most expensive home PCs ever. I imagine it'll top 4k USD easily, maybe even hit 5k.
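The screen-splitting scheme described above boils down to something like the following sketch. This is only a toy illustration of split-frame rendering, not Alienware's actual software; render_half() stands in for whatever the real drivers and hardware do:

```python
# Toy illustration of split-frame rendering: two GPUs each draw one horizontal
# band of the frame, and the halves are recombined for output.

def render_half(gpu_id: int, y_start: int, y_end: int, width: int):
    """Pretend GPU `gpu_id` renders scanlines y_start..y_end-1."""
    return [(gpu_id, y) for y in range(y_start, y_end)]

def render_frame(width: int, height: int, split: float = 0.5):
    """Cut the frame at `split`; a load balancer could move this boundary per frame."""
    boundary = int(height * split)
    top = render_half(0, 0, boundary, width)
    bottom = render_half(1, boundary, height, width)
    return top + bottom  # recombined into a single frame

frame = render_frame(1600, 1200)
assert len(frame) == 1200  # every scanline is drawn exactly once, no visible seam
```

The awkward part, as comes up later in the thread, is any effect where a pixel near the boundary needs data that lives on the other card.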

StryfeX
28-06-04, 18:06
The downside though is that with two high-end graphics cards and two CPUs, this system will probably be one of the most expensive home PCs ever. I imagine it'll top 4k USD easily, maybe even hit 5k.

I'd say even more than that if the system is custom-/hand-built using only premium components.

--Stryfe

Samhain
28-06-04, 18:08
Even if ATI had the SLI patents or came up with something of their own, would you want to go within a mile of their drivers for it?

**shudders**

And while we're stating out-of-date facts that companies have since pulled themselves out of: OCZ memory is highly unreliable!

jernau
28-06-04, 18:10
The advantage of the NVidia system is that it's designed as SLI from the start, so it includes load balancing, which will put it well ahead of the Alienware solution.

jernau
28-06-04, 18:18
And while we're stating out-of-date facts that companies have since pulled themselves out of: OCZ memory is highly unreliable!
Erm... their drivers are still flaky. Unless you think palette errors, misaligned textures and z-buffers that don't sort properly are "features".

-REMUS-
28-06-04, 18:26
So when will a decent PCI express capable board be on the market for benchmarking?

jernau
28-06-04, 18:27
September 1st ;).

-REMUS-
28-06-04, 20:02
Oh Jesus Christ, I've got to wait till then! Even NC laughs at my PC from time to time :(

Will that be a Socket 939 mobo? I'm guessing so. Hopefully AMD 64-bit CPU prices will calm down to sensible levels by then, and I'll tuck a nice 3GHz CPU in with the best gfx card I can afford.

But then again, the first mobo will probably be totally crud unless it's released by a manufacturer like Abit?

jernau
28-06-04, 20:22
That was 100% guesswork based on them trying to tie in the launch to that of BDOY.

I would assume that they will bring out an NForce mobo that supports it at the same time or before. I just hope it also supports 2 CPUs.

t0tt3
28-06-04, 20:54
BRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRR x 2 8| :eek: :wtf:

Or did they fix the cooling noise on GF?

jernau
28-06-04, 21:16
Mine are all fine noise-wise.

Samhain
28-06-04, 21:26
Erm... their drivers are still flaky. Unless you think palette errors, misaligned textures and z-buffers that don't sort properly are "features".


I've not seen anything like this recently. I don't doubt ATi still has scattered issues with their drivers, but I've had more trouble with my nVidia graphics card than my ATi one.

Mighty Max
28-06-04, 21:35
Tbh, I don't think this is a good sign.

SLI only makes sense when NVIDIA is unable to find new ways to put more power into ONE chip, since the production cost would be less than for two chips, and a new technology is always an investment in the future.

SLI on the Voodoos basically marked the point where they lost. Instead of implementing new and better algorithms in their chips, they only concentrated on parallel solutions.

While this provides some FPS in the short term, it stops the other technologies from evolving.

Let's look at what an SLI setup actually wins:
T&L? Has to be done twice (so the cards stay in sync) => no win
Pixel shaders? Have to be done twice (a pixel has to be aware of its neighbours, which can't be guaranteed in time if the pixel above or below it is on the other gfx card) => no win
Texture buffer? Each texture needs to be stored on every card so that card can actually draw it onto a polygon => no win
Pixels per cycle? Double the pipes => double the theoretical fillrate

It wouldn't be worth my money for this effect.

Especially in the games where you'd want it to be most effective, you'll be stuck with only marginal wins due to the excessive use of pixel shaders (rough numbers sketched below).
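The objection above is essentially Amdahl's law applied to the frame: only the work that actually splits across the two cards gets faster. A rough sketch with made-up workload fractions (the percentages are purely illustrative, not benchmarks):

```python
# Amdahl-style estimate: work that must be duplicated on both cards (T&L,
# mirrored textures, shading near the split) sees no speedup from a second card.
def sli_speedup(parallel_fraction: float, cards: int = 2) -> float:
    """parallel_fraction = share of frame time that genuinely splits across cards."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cards)

# Purely illustrative numbers, not measurements:
for p in (0.9, 0.6, 0.3):
    print(f"{p:.0%} parallel -> {sli_speedup(p):.2f}x with two cards")
# 90% -> 1.82x, 60% -> 1.43x, 30% -> 1.18x
```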

Samhain
28-06-04, 21:36
SLI on the Voodoos basically marked the point where they lost. Instead of implementing new and better algorithms in their chips, they only concentrated on parallel solutions.

Wasn't Voodoo working on their 250-series chip or whatever, which was supposed to be super good, just before they went out of business?

jernau
28-06-04, 21:44
SLI has many advantages from both a design and a marketing perspective over other options.

The only reason it vanished was that AGP didn't allow for it properly. If it had been available on AGP, 3DFX might have survived a lot longer, and it's rather obvious that NVidia are jumping back onto it as soon as they can (using the 3DFX patents). 3DFX made a killing on SLI at the time; it certainly wasn't a losing tech for them. They had 2 further chip releases after they were forced to drop SLI.

The only technical issue to overcome is where pixels need to know about others that are on the other card. Presumably that is most of the reason why the cards don't often get 200% speed (I can live with 180% tbh), but as it's an all-inclusive design (unlike the Alienware one) this can be solved at the design stage, I'm sure, and indeed the results they are claiming suggest they have done so. It's not true to say everything will need to be done twice - I suspect very little will be, in fact.

Considering the leap NVidia got already in their latest chips, they can hardly be said to have run out of ideas or performance.

Mighty Max
28-06-04, 21:46
I was not referring to their last breath, I was referring to when they lost the lead to NVIDIA.

Why? Remember the cards that followed? Only more chips and more power consumption; the basic principles didn't change anymore. That was when NVIDIA could supply cheaper "one-chip" solutions, which led to the loss of customers, and the end came some time later.

Samhain
28-06-04, 21:48
I doubt that "what's out" is what they have. If nVidia released their latest chipset and then someone discovered a breakthrough that would let their cards attain four times the speed, do you think they'd immediately release it, or would they slowly introduce things that let them stay just ahead of their competition? I think there are more sales to be made in releasing a new "5xxx" version of a card every other month than in putting everything you've got on the table in one go.

jernau
28-06-04, 21:50
I was not referring to their last breath, I was referring to when they lost the lead to NVIDIA.

Why? Remember the cards that followed? Only more chips and more power consumption; the basic principles didn't change anymore. That was when NVIDIA could supply cheaper "one-chip" solutions, which led to the loss of customers, and the end came some time later.
That's because they got lazy. They hung onto the same architecture too long. That's nothing to do with SLI. Voodoo 3s were competitive cards and SLI rigs were still top dog until the GF Ultras appeared a generation and a half later, IIRC.

The plus-points of SLI designs remain regardless of whether the original holder of related patents made unrelated bad decisions and was unable to keep using the SLI tech when standards turned against them.

Mighty Max
28-06-04, 22:05
Sure, SLI has good advantages,

but the question is: why invest millions of $ in this technology when the same money could be put into increasing the real performance of a one-card solution?

SLI is limited, or would you put in 4, 8, 16... cards?

It is therefore not innovative. It does not provide any advantage per $; in fact it reduces the power/$ ratio. Not only for the end user (he can decide if he needs that power badly enough) but also for the manufacturer, who needs to produce the parts twice for less than double the performance.

It might hurt, but I definitely think this will give ATI a good boost in the future, if they decide to put those millions into better one-chip solutions instead of SLI.

It's just like the old advertisement:

32-bit? We don't need a PC with a 32-bit OS, we have 2 with 16-bit ones!

jernau
28-06-04, 22:20
Well for a start they don't have to spend all that money - they got the tech when they bought 3DFX. Bringing it up to date (or keeping it up to date in the background) won't have been a huge overhead.

The advantages of SLI from a marketing/product POV are:
1) It will always be at least a generation ahead of 1-card solutions.
2) It opens up a lot more of the workstation market (which NVidia are pushing into with the Quadro line).
3) It doubles the lifetime of a product, which punters like, but also doubles the revenue, which manufacturers like.
4) Spread cost - buy one now and one in 6-12 months. The second one costs less to buy (woot for the consumer) but it also has a higher profit margin (woot for the manufacturer) due to tech moving on and so forth, so everyone wins. The user gets top-end graphics for 2 generations for the price of 1.5, and the manufacturer avoids the problem of people skipping a gen or two without losing profit margin.

That's before you get into the performance and technical side. The cost of keeping SLI in line with other tech on the card is trivial if it's done right at the start, which by all accounts it was (remember 3DFX were the first to put it into consumer cards, not the first to do it at all). When all is said and done, SLI is just a few extra standards and protocols the GPU designers have to bear in mind while they are designing new silicon.

In effect a real mass-market SLI card binds the customer to that platform for 2 generations, therefore crippling competitor investment. 3DFX never managed to sell that many, but if NVidia's version takes off it would seriously hurt other manufacturers.

plague
28-06-04, 23:02
Ah, the good old days. I remember my favourite 3dfx Voodoo5, and the 2 Voodoo2s before it, lol. Those were the days, running Quake 2 at 1024 res with about 60 fps, lol, that was uber, heh. I'm glad they're finally starting to make SLI cards, such great potential. Can you imagine having 2 Radeon 9800XTs? Hehe, how kewl would that be 0.0

yavimaya
29-06-04, 00:00
Erm......... so you could have 2 x 256 megs of graphics power........... erm


:o 8| 8|

yeah!

Dual 2.8 Xeons + 4 megs of DDR400 RAM + 2 brand spanking X800s????


*drools*

HAHAH, NVidia are bringing out a 512MB version of their 6800, so you'll be able to have 1024MB of graphics power....
And 4 megs of DDR RAM won't get you far, even with 2 processors and 2 GPUs. :P

alig
29-06-04, 00:17
HAHAH, NVidia are bringing out a 512MB version of their 6800, so you'll be able to have 1024MB of graphics power....
And 4 megs of DDR RAM won't get you far, even with 2 processors and 2 GPUs. :P


No one will buy that 512MB 6800... it will be WAY too expensive to justify. It already costs £360-£400 for the 6800 Ultra... it would probably be £100 more, like the 128MB -> 256MB 9800 PROs cost. If you've seen the marginal differences between the 128MB gfx cards and 256MB gfx cards then you'll know why a 512MB gfx card would be completely useless and a total waste of money... we aren't even using 256MB much yet.

On topic: imagine two of these fuckers rigged together :lol: Gainward's 6800 Ultra, overclocked and watercooled, retail (http://www.scan.co.uk/Products/ProductInfo.asp?WebProductID=115960) and http://www.gainward.de/new/main.html - but at a staggering £640 each, that would mean spending a whopping £1280 on just graphics cards 8|

jernau
29-06-04, 00:23
Memory size has almost nothing to do with performance.

I don't know if you will be able to fit novelty cooling systems between the cards. It's going to get pretty cramped in there I reckon.

VetteroX
29-06-04, 00:42
Who the hell needs that? One GF 6800 GT or one X800 will run any game out today, or Doom3/HL2, perfectly. I love high-end comps, but that's just a waste of $.

jernau
29-06-04, 00:45
What does "need" have to do with it?

Besides I think you'll be surprised at how much HL2 and Doom3 are going to need. Both Valve and ID have said or hinted that they don't expect people to run high fps at full detail on launch day. In fact if you could then people would be disappointed.

Scikar
29-06-04, 00:47
Who the hell needs that? One GF 6800 GT or one X800 will run any game out today, or Doom3/HL2, perfectly. I love high-end comps, but that's just a waste of $.
Depends how much you expect of HL2. The GFs have always struggled a little with full anisotropy on; it's perfectly plausible that they might begin to stutter when running the next generation of games at full details with full AA/AF. Also the idea seems to be not that you buy a second one now, but that when your GF 6800 starts to get left behind, you can buy a second one and match the performance of the next generation of cards without having to pay full price for the upgrade.

VetteroX
29-06-04, 00:52
I'm willing to bet the new cards will run Doom/HL just fine; by that I mean 1280x1024, full details, shadows, etc., probably 1600x1200 full detail. This is an unnecessary config, it will cause a lot of heat, and it will need new PSUs to run. The 6900 or X900 (whatever they will be) will be out in 6 months; I'd prefer that.

jernau
29-06-04, 00:54
Like Scikar says, the idea is you buy today's best card now and turn it into next year's best card next year for half price.

Mighty Max
29-06-04, 00:57
I'm a bit stunned here...

Hehe, I thought VetteroX would be a member of one of the two groups buying this thingy.

The LAN bragger. (Not meant badly, but you know, the ones who always have a case-modded high-end system.)



@ the mem thingy: basically it would be 256MB + 256MB ~= 280MB,
because every single texture has to exist on both cards, so only the memory not used for textures (like the front & back buffers) is really doubled.
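Putting that 256MB + 256MB ≈ 280MB estimate into a formula; the texture/buffer split below is just an assumed example to make the arithmetic come out, not a real measurement:

```python
# Effective memory of a dual-card setup when textures are mirrored on both
# cards and only framebuffer-style allocations are unique per card.
def effective_vram_mb(per_card_mb: int, texture_mb: int, cards: int = 2) -> int:
    unique_mb = per_card_mb - texture_mb   # front/back/z buffers etc., per card
    return texture_mb + cards * unique_mb  # textures counted once, buffers per card

# Assumed split: 232MB of mirrored textures, 24MB of per-card buffers.
print(effective_vram_mb(256, 232))  # -> 280, i.e. 256MB + 256MB ~= 280MB
```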

jernau
29-06-04, 01:02
What's the other group?

The memory thing is right, but then most people still seem to think more memory means faster graphics, so the bragging rights probably still work.

Mighty Max
29-06-04, 01:06
As the other group I was thinking of companies (startups & mid-sized companies) who can't afford professional OGL cards or even complete SGIs but need some decent power for their work.

Architects, software developers, engineers, etc.

jernau
29-06-04, 01:16
As the other group I was thinking of companies (startups & mid-sized companies) who can't afford professional OGL cards or even complete SGIs but need some decent power for their work.

Architects, software developers, engineers, etc.
Ah, yeah. They did already announce it will be on the Quadros too.

Having worked in that environment recently, I've no doubt they would snap these up as soon as they can, even if it means buying insane mobos and Xeons to go with them. In fact they would probably want those too anyway.

It's not just small companies that use high-grunt PC workstations now - Pixar, ILM, etc all design on them and only run final-print renders onto the specialist hardware. In fact IIRC the "specialist" stuff is also PC-based now at one of them (can't remember details but The Register had an article on it last year sometime) though they aren't what the average user would recognise as a PC.

I don't think OGL is seen as anything special these days.

IceStorm
29-06-04, 01:55
Though the second slot is only operating at x8 speed, PCI Express x8 is still way faster than AGP 8x.
x8 PCI Express is ~2GB/sec full duplex. AGP 8x is 2.1GB/sec shared. The difference may or may not be noticeable.

Using up all 24 lanes of Tumwater on two video slots means the only I/O bandwidth left is for the old 266MB/sec hub link down to the ICH. It's not much of a problem for a gaming machine that only has a couple disks and a sound card, but there's no room left over for PCI Express options down the road.

SuperMicro has revealed their suite of Tumwater boards. All the EATX boards have x16 PCI Express and x4 PCI Express with a PCI slot in the middle and x16 PCI Express physical connectors on both PCI Express slots. That's ~4GB/sec full-duplex on one slot, ~1GB/sec full-duplex on the other slot, and a space in the middle - perfect for the SLI setup nVidia's unveiled. Most Xeon boards will be using the other four PCI Express lanes for PXH, Intel's PCI-X to PCI Express bridge chip. While there's no room left over for PCI Express x1 cards later on, you at least get three PCI-X slots for things like RAID controllers.
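As a sanity check on the lane counting, here is the same ~250MB/sec-per-lane assumption applied to the two board layouts described above (the layout names and slot labels are just shorthand for this sketch):

```python
# Tumwater exposes 24 PCI Express lanes in total; two ways of spending them,
# per the descriptions above (PCIe 1.x, ~250 MB/s per lane, per direction).
PER_LANE_MB_S = 250

layouts = {
    "Iwill/Alienware-style": {"x16 graphics": 16, "x8 graphics (x16 connector)": 8},
    "SuperMicro EATX-style": {"x16 graphics": 16, "x4 slot (x16 connector)": 4, "PXH bridge": 4},
}

for name, slots in layouts.items():
    print(f"{name}: {sum(slots.values())}/24 lanes used")
    for slot, lanes in slots.items():
        print(f"  {slot}: x{lanes} -> ~{lanes * PER_LANE_MB_S / 1000:.1f} GB/s each way")
```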

This is all first generation, though. It will be interesting to see how this pans out for 925X/915. Hopefully Intel will step up introduction of a x16/x8 PCI Express chipset for the mainstream/performance segment. I won't be holding my breath, though. :-)