PDA

View Full Version : Next-gen card on new unreal engine.



alig
14-04-04, 21:04
Before anyone starts another FarCry-vs-the-world thread, this was taken off the FarCry forums.

Can't get the video now as it's been downloaded too many times. Screenshots I can do, though.

This is all running on the next-gen video cards that are a month away, if that :D

The one without the monsters is FarCry running on a GeForce 6800 with Pixel Shader 3.0 (the next-gen card due out), and the ones with the monsters are the new Unreal engine, all rendered at totally normal FPS (also on the next-gen card).

Just thought I'd post this for anyone it may concern, and because I'm close to being banned for getting pissed off >.<

Opar
14-04-04, 21:10
Obviously the 6800 will be the cheapest card on the market, costing only just over 400 quid!

Kenjuten
14-04-04, 22:07
mou...alig, why you gettin' pissed? =\


The screenshots look great.. o.o;a I wonder..

Possessed
14-04-04, 22:09
PCI Express cards are gonna be sweeeet.

/me sits and twiddles thumbs patiently....

Heavyporker
14-04-04, 22:11
*mops up urine from floor after seeing monsters*

DAMN but that's almost cinematic quality... and the stones with the torchlight.. DAMN!

plague
14-04-04, 22:14
Heh, graphics look uber. Can't wait to see the price tag, lol. Prolly like 600 bucks 8|

alig
14-04-04, 22:14
I'm not, well, at least I know I'm not, but that isn't what the mods think... I've been threatened with a ban if I mention a para again, and threatened with a ban if I mention (well, I can't say it, lol, or I'll get banned) KK and, err... the way (I think) they treat us.

New stuff. Links to the video.

http://www.jamesbambury.pwp.blueyonder.co.uk/unreal3_0002.wmv
http://sanbjoe.home.online.no/nv40/unreal3_0002.wmv
ftp://%20:%20@torstenb.homeip.net/unreal3_0002.wmv

Enjoy. :p

FUCKIN HELL ROFL! check this haha...
http://ubbxforums.ubi.com/6/ubb.x?a=tpc&s=400102&f=452106891&m=324102253

It's now official on the Nvidia site. It nearly doubles the 9800XT on 3DMark03, lmao. Wow. Saves my cash from going on a 9800 Pro :lol: :eek: :D

Vampire222
14-04-04, 22:30
AGP 8x is by far sufficient for graphics cards... no real need for the ultra-high throughput of PCIe. Compare it to a five-lane motorway to some ghost town.

alig
14-04-04, 22:46
Not really. The best gfx card on the market can struggle quite easily on the next-gen game known as FarCry. That's the 9800XT, btw, which is 8x AGP. We need to upgrade hardware or we'd all still be playing Doom on P90s.

Vampire222
14-04-04, 22:57
I just bought a new mobo with a fat processor, no freakin' way I'm upgrading for a while.

alig
14-04-04, 23:01
It says you need a 480-watt PSU to power it, but it isn't so much the wattage as whether you can get one with enough ATX leads :(

Opar
14-04-04, 23:05
ffs. Maybe if I win the lottery I could buy one or two of them, and still have money left to buy a candy bar!

alig
14-04-04, 23:12
It says the cost: $400 for the Ultra version, like any brand-new graphics card costs, like my GF4 Ti4600 did and like the GFFX 5950 Ultra does atm :p

Edit/ I do mean dollars, because I can't be bothered to go to a currency website to check, but it's about £250 I'd say.

Opar
14-04-04, 23:16
223 quid.

Vampire222
14-04-04, 23:17
Now I just have to get a Zalman 400-watt :o

Heavyporker
14-04-04, 23:19
I wonder how Neocron would look on that card....

mmmm....


*strokes self, circling nipples then naughty bits alternately*

Ransom
14-04-04, 23:29
I wonder how Neocron would look on that card....

mmmm....


*strokes self, circling nipples then naughty bits alternately*

Leave the nipples alone... for now.

When the graphical overhaul is implemented, perhaps the graphics will be greatly improved... but that will probably require at least DirectX 10 or greater, and I'm not sure if the devs are upgrading to that standard. But yes, the graphics do look great; perhaps this is the real reason those *other* FPSs have had a release delay.

Ransom

alig
14-04-04, 23:32
Vampire....480 watt :eek: :p

I think you'd maybe top 50fps at OP wars in NC :o with fog on :lol:

Anyway, I keep reading that 26th April is when ATI release their next-gen card for us to drool over. We already know the 9800XT gets put to shame by the NV40; I know you can't compare the two properly, but it gives a good idea of what you'll be getting for your money. If the Ultra is under £350 I'm definitely getting one, though I'll have to see how the R420 does before deciding properly... bang for the buck, man. This is one hell of a leap forward, so we'll get our money's worth for sure.

Heavyporker
14-04-04, 23:34
The cost is still painful... oweee, that's like a whole other computer, you know what I mean?

alig
14-04-04, 23:40
Yeah, it is quite a lot of cash, lol... but it won't get outdated as fast as most gfx cards do :/ If you haven't got/played/seen FarCry running at max settings on a 9800XT, then you won't really think this is much of an improvement. I've got FarCry, and it's beyond mental how much better it looks with Pixel Shader 3.0... look at the tree shadows on the ground.

numb
15-04-04, 00:14
Definitely something worth saving cash for, I have been holding out on upgrades for stuff like this - things that make a huge difference rather than just increased FPS and faster FSAA.

IceStorm
15-04-04, 01:52
Not really.
Yes, really. AGP 8x isn't the limiting factor when there's 256MB of memory on the card. If it had zero memory on the card, then maybe you'd have a point.

I'm happy nVidia fixed their image quality problems with the NV40 (really NV45 pulled in) generation, but the card's not out and I don't expect it'll be out before the X800 XT is. ATI's big R300-ish replacement isn't R420/R423, though, so it'll be interesting to see what the final R300-ish core can do against nVidia's new NV40 series.

ATI's big replacement comes much later this year.

Dribble Joy
15-04-04, 01:57
                 GeForce 6800 Ultra      Radeon 9800XT

Transistors      222 million             115 million
Core Clock       400MHz                  412MHz
Pixel Pipelines  16                      8
Vertex Units     6                       4
Memory Bus       256-bit                 256-bit
Memory Clock     550MHz (1100MHz eff.)   365MHz (730MHz eff.)

You wonder why it beats it. O_o
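The raw specs explain most of that gap. A back-of-the-envelope sketch of the peak theoretical numbers from the table above (spec-sheet figures only, not real-game performance):

```python
# Rough theoretical throughput from the spec table in the post above.
# One pixel per pipeline per clock is the usual peak-rate assumption.

def pixel_fillrate(pipelines, core_mhz):
    """Peak pixel fillrate in Mpixels/s."""
    return pipelines * core_mhz

def memory_bandwidth(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s for a DDR memory bus."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

gf6800u = pixel_fillrate(16, 400)   # GeForce 6800 Ultra: 6400 Mpixels/s
r9800xt = pixel_fillrate(8, 412)    # Radeon 9800XT:      3296 Mpixels/s
print(f"fillrate advantage: {gf6800u / r9800xt:.2f}x")   # ~1.94x

print(f"6800U bandwidth: {memory_bandwidth(256, 1100):.1f} GB/s")  # 35.2
print(f"9800XT bandwidth: {memory_bandwidth(256, 730):.1f} GB/s")  # 23.4
```

Nearly twice the raw fillrate lines up with the "nearly doubles the 9800XT" benchmark claims earlier in the thread.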

jiga
15-04-04, 02:29
Looks like my GFX 5200 is already outdated :(
I'm never spending over £100 on a graphics card, though.

StrongSad
15-04-04, 03:11
The 6800U looks like a beastly card on paper, but it does not perform as some people would assume.

http://www.hardocp.com/article.html?art=NjA2

The 9800XT holds up just fine... and that is amazing considering it is a generation BEHIND the 6800U. I can't wait to see what the R400 or R420 can do. As for your beloved PS3.0, alig: it's only 2.0 in FarCry. Although they said patch 1.1 will support 3.0, the game only uses 2.0 with the latest cards (6800U/9800XT).

Concerning a 'required' 480-watt PSU: not really true either. The HardOCP article explains it better, but in short, a 430-watt PSU can handle it just fine. Even the right 350-watt PSU could handle it, I bet.

Sorry if I shattered anyone's high expectations. :)

IceStorm
15-04-04, 03:30
looks like my gfx 5200 is alrdy outdated
The FX5200 was outdated the day it shipped.

The 9800xt holds up just fine...and that is amazing considering it is a generation BEHIND the 6800u.
No, it doesn't. [H]'s benchmarks are a bit odd. They either used CPU-limited games and resolutions, or the 9800 XT is running in a far lower image quality mode/resolution than the 6800 U. If you read other benches (Tom's, AnandTech) where they set all the cards to the same image quality levels and resolutions (or as close as possible), you end up seeing what it really looks like.

[H] also doesn't seem to understand that by default Far Cry will detect nVidia cards and use an NV3x-specific rendering path that butchers image quality in favor of increased performance (you can thank the NV30 line for that). There is a way to force Far Cry to use the DX9 path (at least, there was for the demo). I'm surprised no reviewers used it.

Dribble Joy
15-04-04, 03:33
Yeah... but the 6800U doesn't perform THAT much better, considering it's supposed to be so much more advanced.
I was expecting to see results WAY higher than the 9800XT's.

IceStorm
15-04-04, 03:39
but the 6800U doesn't perform THAT much better,
30% to 120% faster isn't "THAT" much better? Huh? Is the only bench you're reading the warped [H] review?

How about when the resolution keeps going up and the 6800 U's performance stays the same while the rest drop off dramatically?

You should seriously read the benches from all the sites, not just [H]. I will wait patiently for the X800 XT reviews and final reviews of 6800 U, but I'm fairly certain that right now, I'm going to upgrade prior to switching over to PCIe later this year.

StrongSad
15-04-04, 04:00
HardOCP runs their settings at the highest they can while the framerates stay playable. Yes, the 6800U can run FarCry at 1600x1200 with AA/AF maxed much better than a 9800XT, but the framerates are still unplayable. When you look at the better IQ gained from the increased power of the 6800U, you'll realize it isn't much better than what a 9800XT has to offer.

And about the lower IQ for better performance on the Nvidia cards: you really don't know what you're talking about. First, the "butchered image" is a bug in the FarCry 1.1 patch; everything looked fine with 1.0. And as for the "forced" DX9 "path"... I assume you're talking about true trilinear filtering vs. Nvidia's brilinear filtering. If that's the case, HardOCP ran their review with the "trilinear optimizations" off. They stated that several times.

It's just like all the 5950U vs 9800XT reviews. They crank up the resolution and AA/AF, benchmark the two cards, point to a lead by the 9800XT, and dub it the 'superior' card, when in actuality the 9800XT really isn't much better than the 5950U in real-world gaming situations. HardOCP gives you those real-world gaming situations. You just have to think a little more and not just look for the longest bar on a graph to crown a champion.

IceStorm
15-04-04, 05:00
Yes the 6800u can run farcry at 1600x1200 with AA/AF maxed much better than a 9800XT, but the framrates are still unplayable.
That would be the crux of the problem with [H]'s review. They chose what they felt was playable.

When you look at the better IQ gained by increased power from the 6800u you will realize that it isnt much better than what a 9800XT has to offer.
Well, that's not the point. The point is that they managed to get BACK to ATI's de facto IQ standard and pump out more frames per second to boot. The NV40 is not "broken" like the NV30 was.

And as for the "forced" DX9 "path"...I would assume you talking about true trilinear filtering vs. nividias brilinear filtering.
No, I'm talking about the Nvidia-specific path vs. the true DX9 path (by now everyone should know the NV30's got "issues" when trying to do proper DX9 precision, since the hardware doesn't support 24-bit).

They crank up the resolution and AA/AF then benchmark the two cards and point to a lead by the 9800XT and dubb it a 'superior' card, when in actuality the 9800XT really isnt much better than the 5950u in real-world gaming situations.
It is when you find out the 5950's drivers are blowing IQ to hell to get the framerate up. There's a reason FutureMark hasn't certified anything past 52.16...

HardOCP gives you those real-world gaming situations.
No they don't, they adjusted IQ on each card to get the graphs to look the same.

You just have to think a little more and not just look for the longest bar in a graph to claim a champion.
The UT 2004 (http://www.anandtech.com/video/showdoc.html?i=2023&p=21) and Far Cry (http://www.tomshardware.com/graphic/20040414/feforce_6800-41.htm) benches pretty much do it for me, although I agree on the need to wait for the IQ fix in Far Cry. Further down in Tom's review they turn on AA/AF for UT 2004, though, at resolutions I play at, and I'm pretty happy with what I saw. I don't use my 9800XT in "normal" mode; I adjust it to 16x AF and 4x AA.

Opar
15-04-04, 09:52
Well, ffs. This has really fucked up my plan for buying a new GFX card.

I was gonna go for the GeForce 5900XT (XFX 5900), because it was fairly priced at around £150, and I'm not a millionaire.

Sooooo, think I should stick with buying the 5900XT? Anyone got one? How does it play UT2K4/Far Cry?

Please ^_^

jiga
15-04-04, 10:28
Well, ffs. This has really fucked up my plan for buying a new GFX card.

I was gonna go for the GeForce 5900XT (XFX 5900), because it was fairly priced at around £150, and I'm not a millionaire.

Sooooo, think I should stick with buying the 5900XT? Anyone got one? How does it play UT2K4/Far Cry?

Please ^_^
Yeah, just buy that one. It's not worth going for the absolute best graphics card at around £500, 'cos it'll just be outdated in a few months' time.

Vampire222
15-04-04, 10:28
NEIN, WARTEN! (no, wait!)

Uhhh... I mean, rather wait till this baby is out. To be sure, the non-Ultra version of it will be about as expensive as the card you've got in mind.

evs
15-04-04, 13:02
far cry looks like that on my system at the moment
0_o

icarium
15-04-04, 13:26
far cry looks like that on my system at the moment
0_o

what leet evs just are sayingz0r!!1

Plus, PCI Express won't make *that* much of a difference, tbh; the bottleneck is the RAM on the graphics card, not the AGP slot. That's why, even though AGP 8x is twice as fast as AGP 4x, you get a couple of percentage points' difference max. I still have a 9700 Pro and it runs FarCry at max settings with no noticeable slowdown; my mate's GF4 4600 runs it at max with only a little slowdown. It's not like back in the day where you HAD to have the new card to play the new games.
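The peak numbers back that argument up. A quick sketch using the standard spec-sheet bandwidth figures for each bus (textbook values, not measurements):

```python
# Peak bus bandwidth for the slot interfaces being discussed.
# The card's local memory is an order of magnitude faster than any of
# these buses, so once textures fit in on-card RAM, the slot speed
# rarely matters.

BUS_BANDWIDTH_GBPS = {
    "AGP 4x": 1.06,     # 266 MT/s x 32-bit
    "AGP 8x": 2.1,      # 533 MT/s x 32-bit
    "PCIe x16": 4.0,    # 16 lanes x 250 MB/s, per direction
}

# GeForce 6800 Ultra local memory: 256-bit bus @ 1100MHz effective.
ON_CARD_GBPS = 35.2

for bus, gbps in BUS_BANDWIDTH_GBPS.items():
    ratio = ON_CARD_GBPS / gbps
    print(f"{bus}: {gbps} GB/s ({ratio:.0f}x slower than on-card RAM)")
```

Doubling the slot speed from AGP 4x to 8x barely moves the needle because the working set already lives in the much faster on-card memory, which matches the "couple of percentage points" observed above.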

Incidentally, evs, how's work? ;) If it's any consolation, I feel like shit, lol.

alig
15-04-04, 13:37
FarCry does support PS3.0, and it is a massive improvement. Just because the card doesn't get 500fps at 1600x1200 with max AA/AF and ultra-high settings in FarCry doesn't mean it's rubbish, because a 9800XT can't actually play FarCry on those settings at all... at least this card gets over 30fps, which is playable. Did you go through the entire review? Did you download the video? That card can render a monster with more detail than an entire map in UT2004; if that isn't a massive improvement, then you'll probably never buy a new card, because the jump from 5950 Ultra to 6800 Ultra is one hell of a step forward.

There wasn't one review I read that didn't mention jaw-dropping/unbelievable results. One of them even said, "Compared to the GeForce 6800 Ultra, both the Radeon 9800XT and the FX 5950 Ultra look like outdated toys from the last century," as it gets over double what the supposedly not-so-far-behind 9800XT gets in CoD. And then, for FarCry: "In the indoor level, the GeForce 6800 displays its superiority in both the AVG and the MIN fps. The MIN fps of the 6800 Ultra are higher than the AVG fps of the Radeon 9800XT!!!" Now tell me where "not bad" comes into that?

At the end of the day it's your decision, and if you think upgrading from a 9700 Pro to a 9800 Pro is more of an improvement than going from a 9800XT to a 6800 Ultra, then you're sadly mistaken... you won't get another bargain like this for at least two years, when they release next-gen cards again. Anyway, we'll have to see what the R420 does yet; all I've heard is that it won't be supporting PS3.0, it's staying on PS2.0.

Evs, post a pic of FarCry looking like that, plz. That picture is showing PS3.0, which you can't possibly have.

icarium
15-04-04, 13:47
ali, this is all moot anyway. I would wait to see what ATI bring out before rushing to spend £400 on a new Nvidia. Think rationally and don't let fanboyism cloud your vision, young one! :p It could be that the Nvidia is better than the new ATI, but until it's out we can't be sure. Personally, I don't care who makes my graphics card, only how well it performs :angel:

jernau
15-04-04, 18:48
that card can render a monster with more detail on it than an entire map on UT2004,
Unreal, not UT2004.

kurai
15-04-04, 19:14
*shrug*

Nvidia & ATI will always be leapfrogging each other with new technology steps - some leaps will take one ahead, some will just draw them even.

The NV40 leap is unusual (in recent years, anyway) in that it is a much bigger performance increase over the previous iteration than the more usual 10 to 20%.

The R420 stuff in the pipeline will need to be something astonishing to do more than close the gap again.

Anyway - all that aside. Given a choice of an ATI and an Nvidia card of roughly similar performance, I'd go for the Nvidia without much hesitation - I prefer their driver design and strategy. There are always a lot more ATI problems floating around than Nvidia ones for any given release.

Bragvledt
16-04-04, 23:03
New stuff. Links to the video.

http://www.jamesbambury.pwp.blueyonder.co.uk/unreal3_0002.wmv
http://sanbjoe.home.online.no/nv40/unreal3_0002.wmv
ftp://%20:%20@torstenb.homeip.net/unreal3_0002.wmv



:eek: :eek:
Slowly but surely, we're getting there ...
I wonder how long before we'll see several(!) 50k-polygon models in-game.
Painkiller's bosses are about 10k, so it shouldn't be too long now.

Serpent
16-04-04, 23:20
LOL, and I've got a GeForce 2 :p