Carigamers
Tech Talk => Hardware, Tweaking & Networking => Graphics Cards => Topic started by: W1nTry on July 04, 2006, 10:16:00 AM
-
Read:
Nvidia G80 meets DX 10 spec with 'dis-unified' shader
Horses for courses
By Fuad Abazovic: Monday 03 July 2006, 09:50
WE HEAR Nvidia has been beavering away to meet the DirectX 10 specification.
And the firm decided it doesn't need a unified Shader architecture for its upcoming G80 chipset. Instead, it decided that you will be fine with twice as many pixel Shaders as geometry and vertex Shaders.
As we understand it, if an Nvidia DX10 chip ends up with 32 pixel Shaders, the same chip will have 16 Shaders that can process geometry instancing or vertex information.
ATI's R600 and its unified Shaders work a bit differently. Let's assume the ATI hardware has 64 unified Shaders. That means ATI can process up to 64 shader operations per clock in any mix: say 50 pixel and 14 vertex/geometry operations per clock, or 40 vertex, 10 pixel and 14 geometry operations per clock. Any ratio that adds up to 64 will do. I hope you get this maths.
The Nvidian chippery is limited to 32 pixel and 16 vertex/geometry operations per clock, which might be a winning ratio but it is still too early to say. We don't know who will win the next-generation hardware game and whose approach is better: ATI's unified design or Nvidia's two-to-one ratio.
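To see what that flexibility means in practice, here is a minimal Python sketch, using only the hypothetical 64 and 32/16 unit counts above (illustrative numbers, not confirmed specs), of how much work each design could issue in one clock when the load is vertex-heavy:

def unified_throughput(total_units, pixel_ops, vertex_geom_ops):
    # Unified design: any unit can take any work, so only the total matters.
    return min(pixel_ops + vertex_geom_ops, total_units)

def fixed_throughput(pixel_units, vg_units, pixel_ops, vertex_geom_ops):
    # Fixed 2:1 split: each pool is capped separately and cannot help the other.
    return min(pixel_ops, pixel_units) + min(vertex_geom_ops, vg_units)

# A vertex-heavy clock: 10 pixel ops and 40 vertex/geometry ops queued.
print(unified_throughput(64, 10, 40))       # 50 - everything issues
print(fixed_throughput(32, 16, 10, 40))     # 26 - 24 vertex/geometry ops have to wait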
DirectX 10 actually doesn't care how you do your Shaders, as you talk to an abstraction layer and the hardware can handle its pixel, vertex and geometry data any way it wants. It serves the information up to DirectX 10, which processes it in the end.
Nvidia's G80 is fully DirectX 10 capable as well as Shader Model 4.0 capable, but it won't be unified, according to our sources.
In the end, people care about frames per second, and that is what will ultimately decide who wins the next-generation graphics hardware race. µ
-
Here yah go Crixxx, let the salivation begin:
Nvidia's G80 has 32 pixel pipes
16 vertex & geometry Shaders
By Fuad Abazovic: Thursday 06 July 2006, 10:15
IT TURNS OUT that the fancy Nvidia G80 chip has taped out, and in its working-silicon stage it will have 32 pixel Shaders and, as predicted, 16 vertex and geometry Shaders.
Nvidia wants to stick with a two-to-one ratio and assumes that the games of tomorrow will need twice as many pixel operations as vertex and geometry operations.
ATI believes in a different religion. ATI believes that every Shader should become one, united and beloved. No more segregation into Pixel and Vertex Shaders. If ATI makes a chip with 64 Shader units, all of them can do vertex, pixel or geometry work at any time. You can check the stories at the bottom for a better explanation.
We don’t know the clock speed of the upcoming performer but we don’t believe Nvidia can get more than 700MHz out of it - we could be wrong about that. µ
-
7600gt does 700mhz+ on good air, G80 is supposedly 80nm, slightly cooler.. it should make that with normal air cooling..
-
meaty info.
I like the approach ATI took, sounds more flexible but at the same time, Nvidia hasn't let me down with their 3D hardware in recent times (not since the FX 5600 debacle)
As the article says, who nails the higher fps at the end of the day will win the war.
-
found from tech report - pics of nvidia g80 (http://translate.google.com/translate?u=http%3A%2F%2Fitbbs.pconline.com.cn%2Ftopic.jsp%3Ftid%3D6054769&langpair=zh-CN%7Cen&hl=en&ie=UTF8)
Those look like water cooling barbs to me.
would be interesting to see how it compares to 2 7900gtx cards which may be using less power.
-
home heating fins you mean
-
Since that page Beo put up takes forever and a day, here are the thumbnails (click on for full size) from another source:
(http://img365.imageshack.us/img365/69/g801oy8.th.jpg) (http://img365.imageshack.us/my.php?image=g801oy8.jpg) (http://img365.imageshack.us/img365/9218/g802sk7.th.jpg) (http://img365.imageshack.us/my.php?image=g802sk7.jpg) (http://img331.imageshack.us/img331/517/g803qg3.th.jpg) (http://img331.imageshack.us/my.php?image=g803qg3.jpg) (http://img331.imageshack.us/img331/4308/g804kv3.th.jpg) (http://img331.imageshack.us/my.php?image=g804kv3.jpg)
(http://img331.imageshack.us/img331/4626/g805hg9.th.jpg) (http://img331.imageshack.us/my.php?image=g805hg9.jpg) (http://img386.imageshack.us/img386/3926/g806fo8.th.jpg) (http://img386.imageshack.us/my.php?image=g806fo8.jpg) (http://img388.imageshack.us/img388/9797/g807jn3.th.jpg) (http://img388.imageshack.us/my.php?image=g807jn3.jpg)
That's one hella LARGE @$$ card dread...
-
I can't wait to get my hands on one of these. I'm just waiting for these and the nForce 590 Intel edition based on C55 to come out before I do the upgrade. The card itself is indeed a monstrosity, as I estimate it to be at least a couple inches longer than the 6800 series cards. Also, the whole cooling solution looks to be a giant dual-slot one with those curious looking adjuster thingies poking out of the heatsink setup. One can only imagine the power that b!tch will suck.
-
Don't forget, you'll be needing a 1KW PSU to run it in SLI! :p
-
bout how de cards go have lil adjustable jah antennaes on dem yes
nvidia know how to do it.
-
lol @ adjustable jah antennas
-
Well if HDCP is on your wishlist at least:
Nvidia's G80 has problems
SLI and HDCP broken for now
By Charlie Demerjian: Monday 23 October 2006, 09:03
NVIDIA IS GIVING out G80 cards to editors' day attendees but no real drivers. Why on earth would Nvidia need to do that?
What is broken? Well, it seems G80 has a lot broken, some that will be fixed before launch, and some may be harder to fix.
Drivers on the 25th October will fix SLI, as we understand it is pretty badly borked in the current release. How badly? Enough to crash almost every time, said one source, and so basically unusable.
We have a lot of confidence this will be fixed before launch, possibly in the driver set we told you about earlier. I would only rate this as a corporate embarrassment that will hopefully never leak due to "strong and respected NDAs". The "editors" are wearing chastity belts designed to preserve the new Dawn's virginity.
The more troubling news is that we hear HDCP is badly borked in the first rev. Since we are not sure if this problem is hardware or software, if you care about jumping on the DRM infection train, you might want to look long and hard before you buy a G80.
Nvidia has something of a history of shipping broken functionality and not so much as correcting it on the box. Wait for independent reviews from people not wearing chastity belts that test this before you buy.
Last up, we hear that in the forthcoming 95 series drivers the classic display panels are gone for good. We plan to be holding a candlelight vigil for them at One Inquirer Tower on All Hallows Eve. Punch and pie will be served, along with mournful singers and a goth/emo kid moping in the corner. µ
-
Lol @ punch and pie being served at the candlelight vigil for the classic display panel. I for one will surely miss the classic panel because of its simplicity and effectiveness as opposed to the new control panel, but I'm sure a way will be found to include them anyway.
Concerning the G80, it remains to be seen how much of a performance increase there will be over the current gen cards. Depending on that, a tradeoff will have to be made... a nice performance increase with DX10 support for games such as Crysis plus the HDCP infection, or HDCP-less hardware that is slightly less powerful with DX9 support. I guess it's up to the individual to choose. AFAIK, manufacturers have to pay a license to include HDCP on their cards, so we may even see G80s without the infection coming from manufacturers unwilling to pay for the license.
As far as Nvidia working out the bugs in their as yet unreleased product, I have no doubt that they will; however, it's a question of when. We might get a final product that needs driver support that is not yet available, and have to wait weeks or even months to get it. At this point, it's pure speculation. Only time will tell.
-
Asus, the web-site claims, will release two versions of the GeForce 8800-series graphics cards: EN8800GTX/HTDP/768M and EN8800GTS/HTDP/640M. The naming scheme implies that there will be two versions of the G80: the GeForce 8800 GTX and the GeForce 8800 GTS, which will differ not only in terms of performance, but also in terms of installed memory and interface: the GTX boards will carry 768MB with a 384-bit interface, whereas the GTS cards will have 640MB with a 320-bit interface. The higher-end model 8800 GTX will have its chip clocked at 575MHz and GDDR3 memory operating at 1.80GHz.
http://xbitlabs.com/news/video/display/20061023142704.html
so unfair.......the 7900GTO could barely fit in my case :(. look at the size of this beast. and to add to it.......2xPCI-e connectors!!!
-
stop whining bout that
holy smokes @ 384 and 320 bit!! niceeeeeeeeeee
-
Old news... *sighs* at crixx overacting as per norm... anyways just to give some more info read here:
G80 is big, hot and sports new SLI
G80, the remaining facts
By Fuad Abazovic: Wednesday 25 October 2006, 10:36
G80, GEFORCE 8800 GTX and GTS are old news now. We learned all about the cards, including the fact that the chip has around 700 million transistors, that, as we said many moons ago, it is based on a 90 nanometre process, and that the cards have dual power plugs.
The G80 chip needs a lot of power and is the biggest desktop graphics chip so far. I think the image from Tom's Hardware Italy speaks for itself. here.
Another thing we learned is that Nvidia uses a new SLI. We saw it here, originally spotted at nvnews.net. The new SLI is dual-rail stuff. It is ironic that ATI was the first to introduce such a connector, which lets you have reads and writes performed at the same time.
As for the rest, no one yet knows how many real pipelines this card has, but you can rely on the 1.5GHz scalable clock rates. This doesn't mean much to us, as in theory this card is super fast, but we are anxious to see it in action.
G80, Geforce 8800 GTX and GTS are the two biggest graphics cards ever, and here is the link for the retail Asus card. Samples should be distributed this week and some lucky chaps got the cards back at Nvidia's editors' day. µ
And here's a PIC of the MONSTER of a GPU... GG @ size
(http://img118.imageshack.us/img118/9454/880020gts7phbeyypicnq3cke8.th.jpg) (http://img118.imageshack.us/my.php?image=880020gts7phbeyypicnq3cke8.jpg)
-
that not soooo big
will fit easily into a dell ^_^
-
LMAO @ fit in DELL.... IN WHO'S DREAMS??? not even mikey dell I sure... LOL they will have to RE-Design a case JUST to fit that mc card... dread think of it this way, the card is prolly around 10" long... you see how big the CORE is in comparison to the CARD???? *sighs* @ crixx... I wanna see the COOLER for that thing... that's what I really interested in..
-
it will fit in my dell... granted i dont have pcie but it willl fit
i doh ever close my case
so it will fit ^_^
-
Yuh know it's interesting that most Dell cases (at least the newer ones) don't fit replacement PSUs or mobos well AT ALL. Just the other day meh padna had to take a Dremel and Hacksaw to the case of a Dell to install a new PSU for someone... not to mention the mobos and alignment for mobos are specific to Dell... I hadda see this done. Crixx if/when yuh manage to install a PCIe mobo and THIS card or 1 of comparable size hit we some pix!
-
shall do
btw, dell power supplies are rather interesting goats
dell apparently uses a different measuring system for their wattage rating
so a dell psu (at least from the 8300 series and up) that says it's rated at 300 is in reality actually closer to 450 W.
that's what allows me to run my multiple drives and my 6800 OCed on a supposed 300 W dell psu
-
so who's pre-orderin'? :D
https://www.excaliberpc.com/parts.asp?stxt=8800+gts
https://www.excaliberpc.com/parts.asp?stxt=8800+gtx
with all this rain......u think a crop of "expensive grass" gon grow well? tha gtx lookin real tempting!
-
GG Almighty... those prices are.. well GG!!! I mean sure 640 and 768MB of GDDR3/4.... but DUYAMN... 500USD for the lower of the 2... that's too rich for my pockets I'm afraid. That aside, what are the definite power requirements for those monsters? I hear the performance of these cards is said to be about 12000 in 3DMark06... which is good I guess... but enough to warrant that price? or power req? I'll wait till R600 nears and prices fall.... that TOO HOTT for me in terms of cost. On a side note, it's funny that they went with an odd-size memory bus (384-bit I believe), hence they hadda use 640 and 768MB, as opposed to R600 which will sport a 512-bit internal ring bus/external memory bus, so we'll see 512MB and 1GB cards....
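The odd 768MB and 640MB sizes fall straight out of the bus width if you assume the usual 32-bit-wide GDDR3 chips of 64MB (512Mbit) each - that chip size is my assumption, though the launch write-up later in this thread does mention twelve chips on the GTX:

# Rough sketch: why a 384-bit card carries 768MB and a 320-bit card 640MB.
# Assumes 32-bit-wide GDDR3 chips of 64MB (512Mbit) each - an assumption here.
CHIP_BUS_BITS = 32
CHIP_CAPACITY_MB = 64

for bus_bits in (384, 320):
    chips = bus_bits // CHIP_BUS_BITS            # one chip per 32-bit channel
    print(bus_bits, "bit ->", chips, "chips ->", chips * CHIP_CAPACITY_MB, "MB")

# 384 bit -> 12 chips -> 768 MB
# 320 bit -> 10 chips -> 640 MB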
-
you said 12000 in 3dmark 2k6??
12k each or in sli?
-
Well remember Crixx SLi != 2x single card performance. So I will say its 12000 on a SINGLE card. We could prolly expect 15-17K with SLI. ;) I am sure you're salivating now... I have read however that many don't think it'll be enough to beat R600... but then we don't know the pipeline count or the actual tech used just yet, however based on what ppl know R600 will best that. But still 12K in 3DMark06 ain't nothing to jest @. That is GOSU power. We'll see soon enough as G80 is due out Nov 8th.
-
yep! 12k for a single card......MADNESS!! I doh even wanna think about power requirements for these monsters nah. But I can't wait for actual benchies on Nov 8th :D
-
Impressive, most impressive.
Reminds me of the old Voodoo 6000 cards that required a wheelbarrow attachment to your case for 'em to fit.
Not to mention external power.
I'll watch it in awe from afar but that ain't going anywhere near my machine.
T&TEC stock is gonna go up when these hit the market.
-
such vulgar behavior!!!
(http://img.villagephotos.com/p/2006-10/1222880/1161849814.jpg)
http://vic.expreview.com/index.php
this card looks soooo tempting right now.
-
my brain still trying to figure out how it possible to get 12k in 2k6
far less on a single card....
but wait
omfg is that 13k !!.....
wts house an land for 1337 rig to push 13k !!
-
hahaha. My CPU score in 3DMark06 is 920. I seein 5000+ in tha screenie.
Good LAWD!!!
-
OMFG
vulgar indeed
games will refuse to run on that card because of such obscene performance.
-
Anyone else but me seeing the ATItool application running in the background? I am uncertain about this, but isn't ATItool for Radeon cards, or does it work for Nvidia cards as well? Beyond that, how is it working on hardware not yet released? I smell something fishy about this screenshot...
That aside I have some specs on the SIZE OF THIS MONSTER....I hope you guys been saving for a Lian Li case, cause GG this card wouldn't fit in my case unless I took a hacksaw to the front mounted drives.... AND THEN SOME...
Nvidia should bundle free cases with giant cards
G80, 8800 GTX, GTS numbers resurface
By Fuad Abazovic: Friday 27 October 2006, 07:02
WE MANAGED to get real measurements of both G80-based cards. You can take a ruler, measure your case and see if they are going to fit.
The Geforce 8800 GTX card could be the biggest card ever - we are not sure about the 7900 GX2, as we never had one in our hands. The 8800 GTX is 28 centimetres long. In comparison, we measured the Radeon X1950 XTX, ATI's highest-end card, and it is 23.5 centimetres long.
The cooler is four centimetres thick, takes up both slots, and Nvidia uses the same cooler on both cards. Both cards are 12.5 centimetres high, as this is the standard for all ATX cards.
The GTS is shorter at 24 centimetres, roughly the size of the X1950XTX. It has only one power plug. There is 1.9 centimetres of space from the back of the cooler to the end of the card; on the GTX it is 4.5 centimetres from the back of the cooler to the end of the card. The GTX also has two power plugs located on the top of the card, just next to the cooler. It will be easy to plug in the power from the top of the card.
I hope it will fit in your case but if you have $650 for the card you should be able to afford a new case, or Nvidia could simply bundle some cheap case with the card and make your upgrade life easier. µ
28cm.... it's approx 11" or just shy of 1 foot long... not to mention the cooler is 4 cm thick??? 0_o I am thinking of waiting for G80 but that 450USD price tag + the new case requirement... kinda prohibitive...
-
hey everyone, free home heating!!!!
-
prohibitive for 13k in 2k6?????
i mean do yu fathom 13k !! omfg!!
-
Anyone realize the ATItool in the background? anyone BUT ME think this is a lil suspicious??? the NDA on G80 hasn't lifted yet, but here we have 2 unreleased parts (ala Core2 Quad AND G80..) something just kinda off imho... I can believe 13k since I have been reading 12K so 1K more should be doable... but this particular screenie I have my doubts about.
-
Xfired X1950XTXs hit 11s and 12s, even 13s, so no biggie, plus the guys on xtremesystems have had their hands on quad and even 8!!! core ES chips for months now. Ya I saw ATI tool too! There is a hack for it to work on Nvidia cards, don't know why they'd use that to overclock the card tho?
Basically the scores would show a highly overclocked card on an also highly overclocked quad-core CPU. If anything the scores, while possibly genuine, are over-inflated and an inaccurate representation of the real-world potential of the cards.
-
I still doh think 13k is a big thing. It's kinda expected of a new tech. Each one usually has to double the performance of the last. Since you said it urself X1950XTX Xfire hitting 13s, then the new single Nvidia & AMD parts have to hit the same or close. I am really curious about ur 8 core comment... but that aside, anyone have any specs on pipelines, shaders, shader type, etc? We know it uses a 384-bit wide mem controller and we know it's running on a clock derivative of 500MHz. I hearing the shaders @ 1GHz and up. The mem will be GDDR4 @ freq I not sure on, but I wanna know pipeline count. We speculate/know that R600 has 64 REAL pipes, and at least 64 unified shaders (expect more though) and we know it's HUGE just like G80... but I wanna get an idea of how R600 and G80 will compete... and too I wanna see numbers on both AMD and Intel platforms. I guess we'll hadda put Kentfield vs. 4x4 but so far no dual/single core benchies.. and what about at STOCK?
-
remember not too long ago I posted canterwood SiSandra pics? That was an ES 8-core Intel chip due Q3 07! it will most likely be in Xeon flavour for its release tho. As past experience has shown ATI will match Nvidia at the very least, that's why they release after all the time. As far as 4x4 vs Kents, AMD has to play catch-up and if they do match/beat Kentsfield with 4x4, Intel has 8-core in the waiting; they are making DAMN sure not to do another Presshot fiasco again.
As far as the agp x1950, I would say thats so not worth it as pcie is the way to go!
-
Anyone realize the ATItool in the background......
ATi tool can be used on any video card. I used it on my 7800GT to overclock when I first got the card. Didn't like it tho.....but yea, it can be used with other cards besides Radeon cards.
-
I say again, let them have their fun with those monster size cards. Those will never see the inside of my case.
The more reasonably spec'ed and sized younger brethren is where my targets will be set.
-
LOL. the GTS model is a bit more reasonable. It should be about the size of a 7900GT or so....
-
Interesting....
G80, Geforce 8800 GTX is very CPU dependent
Scores as 7950 GX2 on a non overclocked CPU
By Fuad Abazovic: Tuesday 31 October 2006, 08:51
OUR PALS with a nice toy named G80, Geforce 8800 GTX informed us that unless you overclock your CPU, you won't get the right performance delta.
If you test 3DMark06 with a 1000MHz-overclocked Core 2 Quad or Duo you get 11300 marks. But when you test the same card in the same test on the same CPU non-overclocked at 2.66GHz, the situation completely changes.
You score about 8000+ with a single card, almost identical to our score with the 7950 GX2, the two-GPU card. Shocking, isn't it, as you'd expect more? To be fair to Nvidia's latest, greatest card, the Geforce 7800 GTX scores some 5200+ in the same test, so the Geforce 8800 GTX is significantly faster than this single-GPU card, but not than the dual-chip one. We are sure that this is just the case for 3DMark and that games will benefit more.
Don't say that we haven't warned you - you will need a faster CPU to push this card to its limits. A chap called Victor, mentioned in the overclocking story, actually bought a G80 card; he is not under NDA, which makes it even worse for Nvidia. µ
So basically if yuh doh have Core 2 Quad or a vapour/Nitro chilled 1GHz OCed Core 2 Duo... you not gonna be getting those Gosu scores... not that 8k+ is anything to be ashamed of... but as someone put it... X1950XTX Xfire getting 10K+ (on existing rigs that NOT so GOSU) for a next gen single GPU beast to not be closer to the dual-card of present would be disappointing... with 'reasonable' CPU power that you, me and all other average ppl can afford or would venture.
-
dey go release a driver an fix dat
its nvidia
dont worry
-
dey go release a driver an fix dat
its nvidia
dont worry
Spell fanboy: C R I X X C R E W etc...
I hope that something of that sort is the fix, however I can't see a Video driver fixing CPU horsepower requirements... but hey who knows...
-
its just an nvidia tactic
they gonna let ati think they could come rong and let them toot they horn
an they will release the driver and completely own
dont worry
and there is another w in fan boy ^_^
-
Alright ppl, the type of article I've been waiting on has finally surfaced. Here is a somewhat in-depth analysis of what G80 is, what it's capable of, some of the new features and just raw SPECS, which are nothing short of GOSU. I will highlight sections that will leave many of you with your jaw on the ground next door, but just remember you've been warned. Some of these specs will make you salivate UNCONTROLLABLY, and you will have to keep in mind that to take advantage of these beasts... well let's just say even Kentfield has a hard time ATM feeding this beast (the benchies show that). The parts I highlighted in RED are the bones, so you could just read those if yuh strapped for time. I however advise you read it all... it's a WTMC article :p
Without further ado, let's begin:
Nvidia's G80 innards exposed
GigaThread, Lumenex, True HDR, Quantum Effects - we disclose all!
By Theo Valich: Thursday 02 November 2006, 09:44
THIS ARTICLE REVEALS all of the important information regarding GeForce 8800 series, which is set to be released to the world on November 8th, 2006 in San Jose. We have learned that during traditional Editor's Day in San Francisco nVidia kept its rules, so "no porn surfing" and "no leaks to the Inquirer" banners were shown. But, we have no hard feelings about that. It is up to the companies to either respect millions of our readers, including employees of Nvidia or... not.
As you already know, Adrianne Curry, a Playboy bunny, America's Next Top Model star and an actor from My Fair Brady is the demo chick for G80. After we posted the story, we received a growl from Graphzilla, but we are here to serve you, our dear readers. However, this was just a story about a person who posed for the G80. Now, it's time to reveal the hardware. Everything you want to know, and don't want to wait for November 8th - lies in this article. Get your pop-corn ready; this will be a messy ride.
For starters, the 8800 launch is a hard one, so expect partners to have boards in store for the big day's press conference at 11AM on the 8th. The board delivery will go in several waves, with the first two separated by days. The boards were designed by ASUSTeK, and feature a departure from usual suspects at Micro-Star International. This is also the first ever black graphics card from nVidia. Bear in mind that every 8800GTX and 8800GTS is manufactured by ASUS. AIBs (add-in board vendors) can only change the cooling, while no overclocking is allowed on 1st gen products. Expect a very limited allocation of these boards, with UK alone getting a mediocre 200 boards.
The numbers
G80 is a 681 million transistor chip manufactured by TSMC. Since Graphzilla opted for the traditional approach, it eats up around 140 Watts of power. The rest gets eaten by Nvidia's I/O chip, video memory and the losses in power conversion on the PCB itself.
->GG, so the GPU ALONE consumes 140W... look for a PSU upgrade if you want this forget SLI, we looking at 1KW PSUs for dat.
If you recall the previous marchitecture, the G70 GPU embedded in the 7800GTX 256MB, you will probably remember that the Pixel and Vertex Shader units worked at different clock speeds. G80 takes this one step further, with a massive increase in the clocks of the Shader units.
GigaThread is the name of the G80 marchitecture which supports thousands of executing threads - similar to ATI's RingBus, keeping all of the Shader units well fed. G80 comes with 128 scalar Shader units, which Nvidia calls Stream Processors.
The reason Nvidia went with the SP description is a DirectX 10 function called Stream Output, and the fact that those Shader units will now work on Pixel, Vertex, Geometry and Physics instructions, though not all at the same time. The function, in short, enables data from vertex or geometry shaders to be sent to memory and forwarded back to the top of the GPU pipeline in order to be processed again. This lets developers put more shiny lighting calculations, physics calculations, or just more complex geometry processing into the engine. Read: more stuff for fewer transistors.
In order to enable that, Nvidia pulled a CPU approach and stuffed L1 and L2 cache across the chip. On the other hand, you might like to know that both Geometry and Vertex Shader programs support Vertex Texturing.
And when it comes to texturing itself, G80 features 64 Texture Filtering Units, which can feed the rest of the GPU with 64 pixels in a single clock. For comparison, the GF7800GTX could manage only 24. Depending on the method of texture sampling and filtering used, G80 ranges from 18.4 to 36.8 billion texels in a single second. Pixel-wise, the G80 churns out 36.8 billion finished pixels in a single second.
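Those two texel figures line up with the 575MHz core clock quoted for the GTX elsewhere in this article, if you assume 64 filtered texels per clock in the cheap case and half that in the heavier sampling mode - a back-of-the-envelope check, not an official formula:

# Back-of-the-envelope check of the 18.4 / 36.8 billion texels-per-second range.
# Assumes the 575MHz GTX core clock and that the costlier sampling mode halves
# per-clock texel throughput - both assumptions for illustration.
core_clock_hz = 575e6
print(core_clock_hz * 64 / 1e9)   # 36.8 billion texels/s, all 64 units delivering
print(core_clock_hz * 32 / 1e9)   # 18.4 billion texels/s in the heavier mode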
When it comes to RingBus vs. GigaThread, DAAMIT's X1900 has a branch granularity of 48 pixels, while the X1800 can do 16. The GeForce 8800GTX can do 32-pixel threads in some cases, but mostly the chip will do 16, thus you can expect Nvidia to lose out on the GPGPU front (for instance, in Folding@Home stuff).
However, Nvidia claims 100% efficiency, and we know for sure that ATI is mostly running in high 60s to high 70s in percentage points.
How many pixels can G80 push?
One of the things we are using to describe the traditional pixel pipeline is the number of pixels a chip can render in a single clock. With programmable units, the traditional pipeline died out, but many hacks out there are still using this inaccurate description.
To cut a long story short, on the pixel-rendering side, G80 can render the same amount of pixels as G70 (7800) and G71 (7900) chips.
The G80 chip in its full configuration comes with six Raster Operation Partitions (ROP) and each can render four pixels. So, 8800GTX can churn out 24, and 8800GTS can push 20 pixels per clock. However, these are complete pixels. If you use only Z-processing, you can expect a massive 192 pixels if one sample per pixel is used. If 4x FSAA is being used, then this number drops to 48 pixels per clock.
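Working backwards from those numbers, each ROP partition would handle 4 full pixels or 32 Z-only samples per clock - the 32 figure is inferred from the article's 192, not stated anywhere:

# Reconstructing the per-clock pixel figures quoted above.
rop_partitions = 6            # full G80 / 8800 GTX configuration
pixels_per_partition = 4
z_only_per_partition = 32     # inferred: 192 / 6

print(rop_partitions * pixels_per_partition)        # 24 complete pixels per clock
print(rop_partitions * z_only_per_partition)        # 192 Z-only pixels per clock
print(rop_partitions * z_only_per_partition // 4)   # 48 per clock with 4x FSAA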
For game developers, the important information is that eight MRT (Multiple Render Targets) can be utilised and the ROPs support Frame Buffer blending of FP16 and FP32 render targets and every type of Frame Buffer surface can be used with FSAA and HDR.
If you are not a game developer, this sentence above means that Nvidia now supports FP32 blending, which was not a thing in the past, and FSAA/HDR combination will be supported by default. In fact, 16xAA and 128-bit HDR are supported at the same time.
Lumenex Engine - New FSAA and HDR explained
ROPs are also in charge of AntiAliasing, which has remained very similar to GeForce 7 series, albeit with quality adjustments. The G80 chip supports multi-sampling (MSAA), supersampling (SSAA) and transparency adaptive anti-aliasing (TAA). The four new 1GPU modes are 8x, 8xQ, 16x and 16xQ. Of course, you can't expect that you will be able to have enough horsepower to run the latest games with 16xQ enabled on a single 8800GTX, right?
Wrong. In certain games you can buy today, you can enjoy full 16xQ with the performance of regular 4xAA. The reason is exactly the difference between those 192 and 48 pixels in a single clock. But in games which aren't able to utilise 16x and 16xQ optimisations, you're far better off with lower AntiAliasing settings.
This mode Nvidia now calls "Application Enhanced", joining the two old scoundrels "Application Override" and "Application Controlled". Only "App Enhanced" is the new mode, and the idea is probably that the application talks with Nvidia's driver in order to decide which piece of a scene gets the AA treatment and which does not. Can you say.... partial AA?
Now, where did we hear that one before.... ah, yes. EAA on the Rendition Verite in the late 90s of the past century and the Matrox Parhelia in the early 21st century?
On the HDR (High Dynamic Range) side, Nvidia has designed the feature around the OpenEXR spec, offering 128-bit precision (32-bit FP per component, Red:Green:Blue:Alpha channels) instead of today's 64-bit version. Nvidia is calling its new feature True HDR, although you can bet your arse this isn't the last feature that vendors will call "true". Can't wait for "True AA", "True AF" and so on...
Anisotropic filtering has been raised in quality to match ATI's X1K marchitecture, so now Nvidia offers angle-independent Aniso Filtering as well, thus killing the shimmering effect which was so annoying in numerous battles in Alterac Valley (World of WarCraft), Spywarefied (pardon, BattleField), Enemy Territory and many more. Compared to the smoothness of the GeForce 8 series, GeForce 7 looks like it was in the stone age. Expect interesting screenshots of D3D AF-Tester Ver1.1 in many GF8 reviews on the 8th.
Oh yeah, you can use AA in conjunction with both high-quality AF and 128-bit HDR. The external I/O chip now offers 10-bit DAC and supports over a billion colours, unlike 16.7 million in previous GeForce marchitectures.
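The "over a billion colours" claim is just the jump from 8 bits to 10 bits per channel on the DAC:

print(2 ** (3 * 8))    # 16,777,216 colours with 8 bits per R/G/B channel
print(2 ** (3 * 10))   # 1,073,741,824 colours with 10 bits per channel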
Quantum Effects
Since PhysX failed to take off in a spectacular manner, DAAMIT's Menage-a-Trois and Nvidia's SLI Physics used Havok to create simpler physics computation on their respective GPUs. Quantum Effects should take things to a more professional (usable) level, with hardware calculation of effects such as smoke, fire and explosions added to the mix of rigid body physics, particle effects, fluid, cloth and many more things that should make their way into the games of tomorrow.
GeForce 8800GTX
Developed under the codename P355, the 8800GTX is Nvidia's flagship implementation. It features a fully fledged G80 chip clocked at 575MHz. Inside the GPU there are 128 scalar Shader units clocked at 1.35GHz, and raw Shader power is around 520GFLOPS. So, if anyone starts to talk about teraflops on a single GPU, we can tell you that we're around a year away from that number becoming true. Before G90 and R700, such claims come from marketing alone.
-> MY GOD
768MB of Samsung memory is clocked at 900MHz DDR, or 1800 MegaTransfers (1.8GHz), yielding a commanding 86.4 GB/s of memory bandwidth.
-> Double OMG
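Both headline numbers check out with a bit of arithmetic; for the GFLOPS figure I'm assuming each scalar unit is counted as three flops per clock (a multiply-add plus an extra multiply), since the article doesn't say how Nvidia counts it:

# Quick check of the 8800 GTX headline figures quoted above.
shader_units = 128
shader_clock_hz = 1.35e9
flops_per_unit_per_clock = 3   # assumed MADD + MUL counting
print(shader_units * shader_clock_hz * flops_per_unit_per_clock / 1e9)  # ~518 GFLOPS

bus_width_bits = 384
transfers_per_s = 1.8e9        # 900MHz DDR = 1800 MT/s
print(bus_width_bits / 8 * transfers_per_s / 1e9)                       # 86.4 GB/s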
The PCB is a massive 10.3 inches, or 27 centimetres, and on top of the PCB there are a couple of new things. First of all, there are two power connectors, and secondly the GTX features two new SLI MIO connectors. Their usage is "TBA" (To Be Announced), but we can tell you that this is not the only 8800 you will be seeing on the market. Connectors are two dual-link DVIs and one 7-pin HDTV out. HDMI 1.3 support is here from day one, but we don't think you'll be seeing too many 8800GTXs with an HDMI connection.
Cooling is not a water/air hybrid, but a more manufacturer-friendly aluminium design with a copper heat pipe. The fan is expected to be silent as a grave, and several AIBs are planning a more powerful version for a 2nd gen 8800GTX, expected to be overclocked to 600MHz for the GPU and 1GHz DDR for the memory.
The board's recommended price has changed a couple of times and stands at 599 dollars/euros, or 399 pounds. However, due to an expected massive shortage, expect these prices to hit stratospheric levels.
GeForce 8800GTS
Codenamed P356, the 8800GTS is a smaller brother of the GTX. The G80 chip is the same as on the GTX, but the amount of wiring has been cut, so you have the 320-bit memory controller instead of 384-bit, 96 Shader units instead of 128 and 20 pixels per clock instead of 24.
The board itself is long and comes with a simpler layout than the GTX one. Dual-Link DVI, 7-pin HDTV out come by default. "Only" one 6-pin PEG connector is used, and power-supply requirements are lighter on the wallet.
The clocks have been set at 500MHz for the GPU and 1.2GHz for the Shader units, while the 640MB of memory has been clocked down to 800MHz DDR, or 1600 MegaTransfers (1.6GHz), yielding a bandwidth of 64GB/s. Both pixel and texel fill-rates fall by a significant margin, to 24 billion pixels and 16 to 32 billion texels.
Recommended price is 399 dollars/euros, but who are we kidding? Expect at least 100 dollars/euros higher price.
Performance is CPU Bound
Yes, you've read it correctly. Both GTS and GTX are maxing out the CPUs of today, and even Kentsfield and the upcoming 4x4 will not have enough CPU to max out the graphics card - the G80 chip just eats up all the processing power that a CPU can provide to it.
-> I don't think a driver update will fix this crixx. I think I will be holding off meh upgrade for 4x4, I can't see pairing this GPU or then R600 with my current dual core offering. I think these cards will really shine with 4x4 and higher clocked Kentfields as mentioned below
Having said this, expect fireworks with AMD's 4x4 platform once true quad-core FX chips become available.
In the end
Nvidia has a really strong line-up for the upcoming Yuletide shopping madness. However, within the ranks of Graphzilla's troopers there is an obvious intent to bury all of the more advanced features that the competition will offer in a couple of months' time. A 512-bit memory interface, more pixels per clock, a second-gen RingBus marchitecture... all this is hidden in the dungeons of Markham and DAAMIT's R&D labs in Santa Clara and Marlboro.
Also, we have to say that the market is now set for a repeat of 2005 and the R520/580 vs. G70/71 duel, since Nvidia will probably offer a spring refresh of the high-end model at the same time as DAAMIT launches the long-delayed R600 chip. µ
-
oh talk bout my head spinnin from them numbers
lol @ that memory speed, ahaha thats ridiculous come on.
o m f g@128 shader units... i mean...
-
wow
well that is good news indeed. The card is too powerful for today's games and processors.
That means I have all the more reason not to purchase it in the near future, extending the life and sexiness of my 7900 GT, lol
-
lol yeah for real. especially since even C2D's bottlenecking the card.....
-
Some puuurrty pics:
(http://img136.imageshack.us/img136/3945/1stgtsyp3.th.jpg) (http://img136.imageshack.us/my.php?image=1stgtsyp3.jpg)
(http://img136.imageshack.us/img136/3673/2ndgtsmt7.th.jpg) (http://img136.imageshack.us/my.php?image=2ndgtsmt7.jpg)
These are pics of the GTS model however the modified version (the 2nd pic) is more or less what the GTX looks like as well. These are Leadtek cards.
-
I have to ask... exactly how much room yuh gonna need for these beasts seeing as they're AT LEAST as long as the X1950XTX if not longer??? and judging by the pic i'm about to put up... the X1950XTX is already as wide as the mobo.... that leaves all of WHAT space for drives?
(http://img441.imageshack.us/img441/5403/kentsfieldati2qy0.th.jpg) (http://img441.imageshack.us/my.php?image=kentsfieldati2qy0.jpg)
I know my case right now COULD NOT fit that monster... I guess a Lian Li case is in my future upgrades... no biggie, just another 150+ USD to add to the cost of a system.... that aside here (http://www.vadim.co.uk/Custom+pc+build/14547) is a TRUE monster of a system. Just to give an overview of specs... it's a QX6700 (or Kentfield quad core) with 2GB of DDR2-1066 and has 2 8800GTXs in SLI.... obviously sports some 1KW PSU, is water cooled, has the toppa top X-Fi sound card and even an Ageia physics card... doh worry, its cheap a mere £5400 daz all... :p
-
not too powerful yet
http://www.dailytech.com/article.aspx?newsid=4812
looks like a nice boost if you have a big screen. This will even power a 24 inch screen without needing sli!
-
Not too powerful??? :) well I suppose when yuh run at 2000+x1600+ res yeah... anyways, here's some more info on real game performance... all I have to say is GG!!!
Geforce 8800 GTX is faster than Crossfire X1950XTX
Super fast in games
By Fuad Abazovic: Monday 06 November 2006, 09:11
BEFORE the reviews start popping up on Wednesday, we got the chance to get you some games numbers for the G80.
In Open GL based Quake 4, the Geforce 8800 GTX is faster than two ATI X1950 XTX cards in Crossfire.
The Geforce 8800 GTX scores a few frames faster in every single resolution with and without FSAA 4X and Aniso 8X.
In FEAR, Crossfire keeps its nose in front by seven to eight frames without the effects on, but at 1600x1200 or higher resolutions the 8800 GTX card is faster. In FEAR with 4X FSAA and 16X Aniso on, Crossfire wins in three out of four resolutions, but it is never more than seven to eight frames faster.
We also know that you can play Battlefield 2142 at 1600x1200 with all effects on and have 100 FPS all the time. Nvidia really did a great job this time; it will be a worthy upgrade.
A single Nvidia G80, Geforce 8800 GTX card can beat a pair of ATI's fastest X1950 XTX cards! I think DAAMIT might have a problem until it releases its much-delayed R600 card, next year. µ
Well done! beating X1950XTX Xfire is no easy task, and here we have a single card doing it... granted Xfire still needs some tweaking compared to SLI but still. So the real question is, how does this card compare to 7950 GX2, 7900GTX SLI and Quad SLI...
-
'twas in response to Baego's comment, in relation to the CPU
If it were "too powerful" then the score would be lower as it would be held back by the cpu, and the gains in games would be less.
It's never too powerful ;)
-
The too powerful comes from the fact that as they scale the clock the performance seems to scale as well. That being said, with the current STOCK offerings available to us, the card is TOO powerful for current available parts. I hope that clears things up. At least in relation to my view on too powerful.
-
To add a bit of humour, I'm posting this pic I came across while on google.
It's "confidential" info on the GPU as of 3 months ago.
NSFW!! you are warned!
http://img399.imageshack.us/img399/3839/8800gtx0ep.jpg
-
To add a bit of humour, I'm posting this pic I came across while on google.
It's "confidential" info on the GPU as of 3 months ago.
NSFW!! you are warned!
http://img399.imageshack.us/img399/3839/8800gtx0ep.jpg
AHAHAHAHHHAAAAAAAAAHHAHHAAAAHAHA!!!!!!!!!!!!!!
OH LAWD!!!!
CHAKURA IYMC!!! That is the FUNNIEST shit I see in a LOOOONG TIME hoss!!!
doh worry, its cheap a mere £5400 daz all... :p
O_O............... um 5400 POUNDS!!? Did I read that right!!?
Thats over 54 THOUSAND TT!!! (Nissan B-15 price) lol
Good GOD!!!
-
Lol @ cp f'ing u, shiny metal thing and general "diagram" of card. Lol @ suck it ATI and at least 26x faster than quad sli 7950 ggxxg2
-
omg lol lol lol @ 4 gigs of gddr5 hahahah
and lol lol lol at the benchies from that pic ahahah
-
It just keeps getting better and better....
Nvidia Geforce 8800 GTX tested in SLI
Premature emission 75 per cent faster in FEAR
By Fuad Abazovic: Tuesday 07 November 2006, 15:56
FEAR is one of our favourite benchmarks. And we thought it'd give a pair of G80, Geforce 8800 GTX cards in SLI a workout.
We got a 75 per cent performance increase over a single Geforce 8800 GTX card.
Yes, we are talking about G80, Geforce 8800 GTX times two.
Are you ready for some big numbers?
3Dmark03 scores 49,400.
For comparison, a 7950 GX2 dual-GPU graphics card scores 29,800 in the same test. A single X1950XTX scores 19,500.
In 3DMark06, the 8800 GTX SLI setup reaches almost 11,000. The 7950GX2 achieves around 10,000. In FEAR, the dual G80 cards can score up to 48 frames faster than a single card - a massive performance increase - and some other games will benefit from it too.
Stay tuned for more numbers; we just wanted to give you a hint. Oh yes, a single Geforce 8800 GTX scores 9,800 in 3DMark06, on the fastest AMD CPU to date. µ
The 3DMark06 figures aren't too spectacular when you consider that we've seen 11K+ with an OCed Kentfield. But seeing as these are 'base rig' figures, they're still pretty decent. I wonder if it's not a typo...
-
why did they use amd processors for that test???
-
It's called what was available at the time crixx. That aside, there are a lot of ppl out there that want to know the performance of these beasts on the AMD platform.
-
Ah boy. Things lookin up for ye brethren of AMD money. ^_^
Not that I will be able to afford that monster (in a hurry) in any case,
but I like what I just read there.
It seems the 8800 will pwn all, whether by Intel or AMD.
Niceness.
-
LAUNCH TOMORROW FELLAS!!! (is still tomorrow right?) I can't wait for OFFICIAL benchmarks.....
-
Well it is officially tomorrow... I mean today... I mean NOW... either way here is a SPEW of benchies.. so hold on tight this is gonna get CRAZY!!!!
Leadtek, EVGA Geforce 8800 GTX tested, even in SLI
First INQpression: The fastest graphic thing on earth
By Fuad Abazovic: Wednesday 08 November 2006, 11:09
WE TALKED a lot about G80. It has been in the news for a while and today it is finally ready to launch. It had its last minute tweak-up, as Nvidia says that one of its resistors was too weedy. But the firm and its partners fixed that and now the cards that you are about to buy should be trouble-free.
The G80 is a totally new chip, 90 nanometre, with 681 million transistors and manufactured at TSMC. It is the largest graphics chip built to date, shielded in a big metal heat spreader. The fastest version is clocked at 575MHz while the GTS version works at 500MHz.
Nvidia did a good job and it is the first DirectX 10 chip on market. It is also the first to support the 'unified' marchitecture. Nvidia claims that its chip is completely unified. The same pipeline calculates vertex, pixel and geometry Shader information. Nvidia claims 128 processing streams for its faster version of G80, the Geforce 8800 GTX, while the second in line, the Geforce 8800 GTS, has 96.
The Geforce 8800 GTX has a core working at 575MHz. The GDDR3 memory works in a rather odd 384-bit mode across twelve chips, for a total of 768MB of memory. While we all expected 1024MB would be the next step, Nvidia decided to go for 768MB due to its different memory controller. Nvidia clocked the memory at 1800MHz, for a total bandwidth of a respectable 86.4 GB/s, and a fill rate of 36.8 billion texels a second. The Geforce 8800 GTX is 28 centimetres long and is the longest card we've had in our hands. Only the Geforce 7900 GX2 was longer, but we never managed to get one of those.
Nvidia's unified parallel Shader design has 128 individual stream processors running at 1.35GHz. Each processor can be dynamically allocated to vertex, pixel, geometry or physics operations. We don't have any idea where this 1.35GHz number comes from, but the card renders fast and that's all you should care about for now. And of course it supports DirectX 10, an exclusive for Vista only.
(http://img246.imageshack.us/img246/9498/g80diagff2.th.jpg) (http://img246.imageshack.us/my.php?image=g80diagff2.jpg)
Nvidia finally made a chip that works with FSAA and HDR together. It supports the new 128-bit HDR, which handles 32 bits per component and produces better quality. The Geforce 8800 GTX has 24 ROPs (Raster Operation Units) or, should we say, can render 24 pixels per clock, while the Geforce 8800 GTS can render only 20. You'll need a 450W or higher PSU that can supply 30A on the 12V rail, so a quality 450W unit will do. Our 8800 GTX SLI setup worked with a 700W OCZ GameXstream PSU, so we can recommend that one.
Two six-pin power connectors mean that the card gets 2x75W from the cables plus an additional 75W from the PCIe bus. This brings total power consumption to an earth-heating 225W.
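That 225W figure is simply the PCIe power budget added up, and the recommended 30A on the 12V rail gives a sense of the headroom a PSU needs:

# The card's power budget as described above.
pcie_slot_w = 75             # what a PCIe x16 slot can supply
six_pin_connector_w = 75     # per 6-pin connector, two of them on the GTX
print(pcie_slot_w + 2 * six_pin_connector_w)   # 225W maximum draw for the card

print(30 * 12)   # 360W available on a 30A 12V rail, shared with CPU, drives, fans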
The Geforce 8800 GTX has a dual-slot cooler - massive and heavy, but it does the job. The card always worked at around 55 Celsius in 2D mode and a bit higher in 3D.
We will tell you more about Geforce 8800 GTS in the separate part that will follow. We had two cards on test.
(http://img294.imageshack.us/img294/6245/g80ecardyj4.th.jpg) (http://img294.imageshack.us/my.php?image=g80ecardyj4.jpg) (http://img294.imageshack.us/img294/8769/g80eboxwo3.th.jpg) (http://img294.imageshack.us/my.php?image=g80eboxwo3.jpg) (http://img134.imageshack.us/img134/5673/g80leadcardee7.th.jpg) (http://img134.imageshack.us/my.php?image=g80leadcardee7.jpg) (http://img134.imageshack.us/img134/4234/g80leadboxfi2.th.jpg) (http://img134.imageshack.us/my.php?image=g80leadboxfi2.jpg)
You guys do realize the ramifications of this right? DIFFERENT BRAND SLI???
The first one to arrive in our hands was an EVGA Geforce 8800 GTX with a brand-new ACS3 cooler. EVGA dared to change Nvidia's holy cooler and made some modifications to it. It added a huge metal heat spreader on the back of the card and covered the Nvidia heatpipe cooler with a massive EVGA-branded piece of squared metal. It is as long as the card itself, but you won't have any trouble connecting the power cords - the card has two of them - or the two SLI connectors.
Nvidia decided to put two SLI connectors on the top of the card and G80 works with the existing cables that you get with your motherboard. The only trouble is that Nvidia doesn’t bundle an additional SLI cable so, if you have a single motherboard, you will end up with a single cable. As we had two in the lab we just borrowed one SLI cable from the other board. The card takes two slots and you need a bigger SLI connecting cable to put them together. We tried it and it works in SLI too.
The ACS3 cooler is actually more efficient than Nvidia's reference cooler, as the card works at 55C in 2D mode while the Nvidia reference cooler only cools the card to 60 Celsius. This is what sets the EVGA card apart from the rest of the cards based on the reference cooler design.
The retail box is super small. We could not believe that EVGA managed to pack the card in such a small package and still you have all the CDs and cables you need.
Second to arrive was a Leadtek Geforce 8800 GTX card and the moment we got it we just had to test it in SLI, of course. As we don’t have any support from Nvidia we had to figure out how it works, but we did it.
The Leadtek Geforce 8800 GTX is packed in a nice wide retail box. It includes two games, Spellforce 2 and Trackmania Nations TMN, along with a driver CD and a bonus software pack including Adobe Reader and PowerDVD 6 ORB. It has a single DVI-to-VGA dongle and an S-video to HDTV cable including S-video out. It has two Molex-to-six-pin power connectors and that is all. This is all you need. The card supports HDCP and the driver CD also includes Vista drivers, which we didn't try yet. Leadtek puts in an extra touch and at least brands the Nvidia drivers as Leadtek WinFast ones. They work flawlessly and show that the temperature in 2D mode is around 55 Celsius for the EVGA or 60 Celsius for the Leadtek. The Leadtek card heated up to 75 Celsius in an open, out-of-the-case environment. The EVGA ACS3 works at around 70 Celsius after a heavy 3D load.
The second, smaller chip on the PCB is TMDS display logic, as Nvidia either could not put it in the G80 or had problems with the integrated TMDS. We believe the chip would have been too big with it all included, so Nvidia went for a cheaper two-chip approach.
The driver in both cases has a few new features, including support for 16X anisotropic filtering and 16X FSAA in a few modes. The big news is that Nvidia finally supports 8X and 16X FSAA. Nvidia's Lumenex is a marketing name for incredible image quality that includes support for 16X FSAA, 128-bit HDR and 2560x1600 resolution at a high frame rate.
The new FSAA modes include 8x, 8xQ, 16x, and 16xQ. The 8xQ and 16xQ are brand new but we don’t know what Q here stands for, we will try to play with it at a later date. We didn’t get the cards early enough to test everything we wanted.
(http://img134.imageshack.us/img134/6316/g80sliuf6.th.jpg) (http://img134.imageshack.us/my.php?image=g80sliuf6.jpg)
Benchmarketing
We used:
Foxconn C51XE M2aa Nforce 590 SLI motherboard
Sapphire AM2RD580 with SB600 board for Crossfire
Athlon FX 62 2800 MHz 90 nanometre Windsor core
2x1024 MB DDR2 Corsair CM2X1024-6400C3 memory
Seagate Barracuda 7200.9 500GB SATA NCQ hard drive
Thermaltake Mini Typhoon cooler (for Athlon 64/X2/FX and Intel CPUs)
OCZ 700W GameXstream power supply
For the ATI cards we used the 6.10 drivers, the most current available; the G80-based 8800 GTX cards used the driver supplied on the CD, version 9.6.8.9, and the Gainward 7950 GX2 card used the 91.47 drivers.
The power supply was enough for SLI as well. We plugged both cards into the Foxconn C51XE motherboard and we have to mention that the retention mechanism on this board sucks. It works flawlessly, but it is very hard to unplug a card as long as the 8800 GTX.
We patched the cards together with two SLI connectors from two different boards, installed the drivers for both cards, restarted and voila, it works. SLI is enabled and we did a few benchmarks. The scores can be even higher when you use a Core 2 Duo or Quad CPU, but at press time we decided to use our reference platform based on the FX-62 CPU.
Composite figures:

Card (core/memory clock)                     3DMark03   3DMark05   3DMark06
ATI X1950XTX (650/2000 MHz)                     19741      12348       6283
ATI X1950XT Crossfire (650/2000 MHz)            32951      15725       9909
Gainward BLISS 7950GX2 PCX (500/1200 MHz)       29758      13633       8083
EVGA eGeforce 8800GTX ACS3 (575/1800 MHz)       30485      15210       9814
EVGA-Leadtek 8800GTX SLI 2x (575/1800 MHz)      49390      16374      10974
--> Game benchmarks removed (layout problems in the cut and paste) see here (http://www.theinquirer.net/default.aspx?article=35604) for full details
We started with 3DMark03 and already saw the light. A single EVGA G80 card, the EVGA eGeforce 8800GTX ACS3 575/1800MHz, scores 30485. It is just slightly faster than the 7950 GX2 card, about 2500 slower than Crossfire, and more than 10000 faster than a single X1950XTX, the fastest single ATI card.
It gets even better, as the EVGA-Leadtek 8800GTX SLI 2x 575/1800MHz combo scores 49390, almost 50K. You'll need a faster or overclocked CPU for 50K, it is as simple as beans. SLI is sixty-two per cent faster than a single card. A single G80 is fifty-four per cent faster than an X1950XTX. It scores 194 frames in game test 3, just a bit slower than two ATI cards patched together. It is three times as fast as the Crossfire setup in the Pixel Shader 2.0 test: it scores 1125.5 FPS versus Crossfire's 373.3, three times the score, or 301 per cent. Amazing, isn't it? The Vertex Shader test is twice as fast on G80 as on ATI's fastest card.
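For anyone checking those percentages against the composite table above, they follow directly from the scores; the "301 per cent" is the raw ratio of the two Pixel Shader results expressed as a percentage:

# Recomputing the percentages quoted above from the scores.
def percent_faster(new, old):
    return (new / old - 1) * 100

print(percent_faster(30485, 19741))   # ~54%: single 8800 GTX vs X1950XTX, 3DMark03
print(percent_faster(49390, 30485))   # ~62%: 8800 GTX SLI vs single card, 3DMark03
print(1125.5 / 373.3)                 # ~3.0x: Pixel Shader 2.0 test, G80 vs Crossfire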
In 3DMark05, a single G80 beats ATI by almost 3000 marks. SLI beats Crossfire by only some 600 marks, but it is still faster; 16300 is a great score for this test. The complex vertex Shader test is almost twice as fast as on ATI's card.
Now to 3DMark06: you score 10974 with two cards or 9814 with a single card. This test is super CPU bound. We will do some testing with a Core 2 Duo and overclocking, as we believe we should be reaching 13500 or more with a better or overclocked CPU.
The EVGA eGeforce 8800GTX ACS3 575/1800MHz is eighty-four per cent faster than ATI's X1950XTX in the Shader Model 2.0 test and seventy-one per cent faster in Shader Model 3.0 / HDR testing. It really smashes the competition.
Doom 3 scores around 135 FPS at the first three resolutions and drops to 125 at 20x15; SLI even then scores 135, so this is clearly CPU limited. The EVGA eGeforce 8800GTX ACS3 575/1800MHz is almost 80 per cent faster at 2048x1536. Doom 3 with effects on scores 130 FPS at the first two resolutions and later starts to drop, but is still faster than the 7950 GX2 card all the time. SLI doesn't drop at all at the first three resolutions and only slightly drops at 20x15.
FEAR scores are sky high, with the weakest score of 95 FPS at 20x15 - faster than Crossfire at the last two resolutions and faster than the GX2 and X1950XTX at all times. It is up to 66 per cent faster than the X1950XTX and 68 per cent faster than the Gainward 7950 GX2 card. SLI is again 68 per cent faster than Crossfire, a massive difference.
The EVGA eGeforce 8800GTX ACS3 575/1800MHz scores 53 FPS even at the highest resolution with all the effects on plus 4X FSAA and 16X Aniso, and much more at lower ones. Crossfire beats a single card by three frames at 16x12 and eight at 20x15, but a single ATI card loses by some forty per cent. SLI is twice as fast as the 7950GX2 and 57 per cent faster than Crossfire.
Quake 4 runs up to forty-seven frames faster on G80, and SLI improves the score further, though not by much. G80 is always faster than the GX2 and Crossfire. Quake 4 with FSAA and Aniso runs some forty per cent faster than ATI's fastest card, and G80 SLI is 30 per cent faster than Crossfire.
In Far Cry with effects on, performance is matched between the G80 and X1950XTX, while SLI can outperform both at 20x15.
Serious Sam 2 is faster on ATI at the first two resolutions by three frames, while the EVGA eGeforce 8800GTX ACS3 575/1800MHz wins by eight or fifteen frames at the higher resolutions. SLI is 23 per cent faster than a single G80 and 43 per cent faster than the X1950XTX.
Serious Sam 2 with FSAA and Aniso on always scores faster on the EVGA eGeforce 8800GTX ACS3 575/1800MHz card, but not by that much, some nine to ten per cent, while SLI is 54 per cent faster than a single card and sixty-eight per cent faster than ATI's card.
We decided to reintroduce Oblivion, measured with Fraps, and let me conclude with it. We tested the Nvidia cards in 8X FSAA mode in our custom-developed test, a super-intensive routine with all the settings on. At 8x FSAA + 8x Aniso and HDR, SLI only makes a big stand over a single card at 16x12 and especially at 20x15. It is 21 per cent faster at 16x12 and 67 per cent faster at 20x15.
ATI could not do more than 6X FSAA and even then it runs slightly slower than a single card. At 20x15 it is unplayable with a few FPS but at 4X FSAA it scores 27.26 FPS. This is still the game that can bring the G80 to its knees and SLI can make it barely playable with all effects on.
Before we finished up testing, we decided to do a quick Quake 4 16X and 16xQ FSAA test. Q mode is always about 11 per cent slower than the 16X mode, and maybe that Q stands for Quality. Not sure here, as we didn't get any documentation; we will figure it out later. As for the 16X scores, it is two or more times faster and drops to 29 FPS at 20x15, but if you ask me that should be enough. If you want 60 FPS in Quake 4, the G80 scores that much up to 12x10, the resolution of most 19-inch TFT gaming displays out there.
In Short
Basically, at this time I can say G80 rocks. I love it, it's great and I don’t have anything bad to say about it. Pick up your wallet if you want to spend €/$600 on a graphic card and enjoy it. You can whine that it's long, it can get warm but it's all nonsense, as this is the fastest card on planet Earth and you have to live with the heat and size. It is super stable, that is what matters the most and we didn’t have any issues with stability.
You can set the quality to 16X anisotropic filtering and 16x full-scene anti-aliasing and have 128-bit HDR at the same time. It looks great, and even FSAA and HDR work together on this card - that didn't work on any Geforce 7 series card.
Nvidia can run 16X. It affects your performance but it is still playable, at least at lower resolutions. We will look at the picture quality in the next few days, as we are far from done analysing the G80.
Overall, both the EVGA Geforce 8800GTX ACS3 575/1800MHz and the Leadtek Winfast PX8800 GTX TDH are great cards and you can get either of them. The only advantage is that the EVGA's temperature can get a bit lower, but just a bit. And the good thing is that they both work in SLI together. So maybe you can buy one EVGA and one Leadtek and be like us. :)
If you want the fastest card on the planet then you have to buy a G80. If you want the fastest gaming setup on the planet then you have to buy two Geforce 8800 GTX cards and put them together in SLI. Not much more to say. Nvidia did it and this is the fastest card for some while, at least until the beginning of next year. µ
Reviewed and tested by Sanjin Rados and Fuad Abazovic
And NOW the NEW chick for the Geforce 8xxx series... introducing Adriane... you may know her for being Playboy's Feb 2006 cover, or better known from the VH1 reality TV show 'My Fair Brady'. Either way, like her or not, the rendering is amazing, have a look...
(http://img476.imageshack.us/img476/7835/g803agj0.th.jpg) (http://img476.imageshack.us/my.php?image=g803agj0.jpg)
And lastly... with all good things there is ALWAYS a pinch of salt, to be fair of course. Here is a slight downside concerning the more workstation-type feature Nvidia has apparently been promising for a long time now, a la multi-display in SLI mode...
Nvidia's dual-display SLI implementation is bonkers
Buy three graphic cards!!
By Theo Valich: Wednesday 08 November 2006, 10:33
WE HAVE BEEN asking the Graphzilla about SLI and multi-monitor support ever since it came out with the SLI technology.
The continuous mantra previously growled at us was that it was going to be solved in future revisions of drivers. Then, the 7800GTX series came out and the mantra was changed to: watch out for future revisions of hardware. And now, with the 8800 series coming out, this is changing to: Buy a third graphics card.
Nvidia is launching its third-generation SLI-capable hardware, and users can still forget about multi-monitor support when SLI is enabled. We know the problem lies not in the driver, but in the way two displays are being set-up, GPU-wise.
The presentation about the Nforce 680i chipset, which is being launched today, contains an explanation as to why Graphzilla is now offering three long PCIe slots for placing three graphics cards on the motherboard.
(http://img139.imageshack.us/img139/5892/8800slisk6.th.jpg) (http://img139.imageshack.us/my.php?image=8800slisk6.jpg)
The possibilities offered by the third PCIe slot are six monitors (which is nice), SLI Physics (marked as "Future Support") and SLI + Dual Display support - also marked "Future Support".
Now, wait a bit: Dual Display support and three graphics cards? We won't go into the power consumption debate, but this seems rather excessive and expensive.
As you can see in the picture above, two of the graphics cards are today's babies, the 8800GTXs, while the one in the middle is a 7900GT, probably standing in for a future DX10 mainstream graphics card. And in the future, dual-display support just may be added, as may SLI Physics.
Of course, since the third PCIe connector will be a feature of the 680i only, you really need to forget about lower-end chipsets such as the also-introduced 650i. Bear in mind that the third graphics card will be pretty much idle, since this is not for enabling gaming on the second monitor, but rather for driving the Windows desktop. You know, heavy fillrate-hitting stuff like Windows Explorer, Skype, ICQ, MSN and, oh my, the incredibly complex TeamSpeak screen. At least, according to the powerpointery.
Nvidia should either stop talking about dual display support or pull a George Broussard and utter the words: when it's done. µ
Please do visit the link this was taken from here (http://www.theinquirer.net/default.aspx?article=35604) as there are game FPS benchies in the most popular and taxing titles presently out. You'll get an appreciation for the actual in-game performance and the RIDICULOUS res that these beasts in SLI can run at EVEN with FSAA+Aniso+128-bit HDR and all the bells and whistles the game has to offer.
-
I said GODDAMN!!!! LOOK AT THEM NUMBERS!!!! WTFOMGPWNAGE.
-
www.techreport.com has a review. DAMN, nvidia came good, ati gonna have to pull off a small miracle to beat them. Also shows Nvidia actually cares a bit about us: the die is so huge, 650 million transistors, that they only yield 80 per wafer, but they don't charge anything stupid for it.
Image quality beats ATI, a lot faster, good power. If ATI doesn't come better i'll see about getting one of these next year when they move to 65nm and get small enough to fit in meh case.
Only flaw: that chick not so hot :P
-
Well Trini a lot of the things you said are a no-brainer. Don't you suppose the 'next gen' part would beat out the current gen? DUH.. making the point of better image quality and ATI having to come good is like saying humans need to breathe. NO $H!T... geezus. As for how good ATI has to come, many suspect that R600 will beat it based on preliminary specs; personally I will wait and see. I will hang on to my faithful rig until K8L and R600 cause at the least the prices of these ridiculous performance parts will drop by then. But I eh go wait much beyond that... as it is that FAR off.
-
It just keeps getting better and better:
First watercooler for Geforce 8800 arrives
And it's a beauty
By Theo Valich: Wednesday 08 November 2006, 17:40
SOME CRAZY GERMANS from AwardFabrik.de have shown the world the first waterblock for the yet-to-be-launched 8800 series.
The prototype comes from a company named SilenX, and it looks menacing. Preliminary testing has revealed that the temperature of the GPU drops sharply by 30 degrees Celsius, from 81-85C down to the 50C range. Overclocking scores also show significant improvement.
(http://img227.imageshack.us/img227/8171/blockfrontinqjx4.th.jpg) (http://img227.imageshack.us/my.php?image=blockfrontinqjx4.jpg)
Before the waterblock was installed, the GPU was running in the high 90s at a 625MHz clock. With the watercooler you can enjoy a temperature some 25 degrees C lower and a stable GPU clock of 660MHz, which is almost 100MHz faster than the default clock.
Of course, getting the GPU to run at 660MHz yields a Shader clock of over 1.5GHz (a 90nm part with almost 700 million transistors has parts that run at 1.5GHz), and doing that without AMD or Intel transistor technology is a tribute to the engineers at TSMC and Nvidia, who created a small semiconductor miracle.
(http://img292.imageshack.us/img292/697/blockbakcinqga7.th.jpg) (http://img292.imageshack.us/my.php?image=blockbakcinqga7.jpg)
The reason the cooler works so well is its backside. While the front looks pretty much like many of the waterblocks out there, turning the back reveals the German precision in designing this monster. All of the elements on the PCB are cooled: the G80 GPU, Samsung GDDR3 memory, nV I/O chip and the power regulation.
This enables the G80 PCB to become a warm-to-the-touch board instead of a scorching-hot one, especially on the backside. The values we mentioned earlier are for the GPU alone, and while Windows thermal monitoring did not show as sharp a decline in the temperature of the power regulation, cooling those components greatly reduces the stress on the PCB and will probably lead to a longer life for the board.
The cooler will be available soon, and can be ordered using our complementary L'INQ. We would welcome a test unit, so that we can either confirm or refute the scores in independent testing. µ
BUT WAIT THERE IS MORE BEAUTY!!!!
Sparkle gives the G80 a cold reception
First INQpression: Sparkle's Calibre P880+, the first Peltier Junction-cooled GF 8800GTX card
By Nebojsa Novakovic: Wednesday 08 November 2006, 18:13
BIRTHDAYS ARE FUNNY affairs, sometimes full of surprises. My birthday this year had no surprises, it just happened to be the official birth date of Nvidia's G80, also known as GeForce 8800 (GTX the preferred variety).
By now, all is known about its Lumenex engine with unified shaders and DirectX 10 Shader Model 4 support, the GigaThread array of 128 streaming processors, and (finally) Nvidia's own "Quantum Effects" physics acceleration, not to mention the somewhat odd 384-bit memory bus with 768MB of GDDR3 RAM, providing some 86 GB/s of bandwidth, 7x that of the Athlon AM2 socket or an Intel Core 2 overclocked to FSB1600. Overall, indisputably the world's fastest GPU right now, at least until DAAMIT's R600 comes out in, hopefully, a few months.
Sparkle has also joined the Nvidia fray, announcing a new G80 series of GF8800GTX and GTS cards; however, in this case, they are coming up at the very start with a (slightly) modified offering as well. There are two GeForce 8800GTX models coming out from them - one a standard reference model, the other the high-end Calibre model with a different cooling design (see photo). Sparkle claims that the Calibre P880+ 8800GTX cooler has better cooling performance yet lower noise than the reference design, and, once Nvidia allows G80 family overclocking, it might be of more help there, too.
(http://img165.imageshack.us/img165/4769/calibre1savz4.th.jpg) (http://img165.imageshack.us/my.php?image=calibre1savz4.jpg)
Looks-wise, I'd say that the reference design is more polished, but again, the huge heat sink on the Calibre P880+, with its dual fans, gives an image of sheer power, doesn't it? But it's what is under those two fans that gives these cards the extra oomph (once unlocked) and lower noise with cooler operation: an active Peltier Junction cooling engine, with its own 4-pin power supply (on top of the two 6-pin feeds for the card itself!). Co-developed by Sparkle and TEC, the system is second only to the best water cooling systems when it comes to heat removal efficiency and low-noise operation - after all, you don't want the card's fan noise to overpower even the machine gun fire in your favourite 3-D shootout.
Inside the huge cooling system are a thermoelectric cooler, quad heatpipes and (quite slim) dual 8cm fans. The Peltier effect uses a transducer to produce a cold source that lowers the temperature of the GF8800GTX GPU. A transistor sensor placed near the GPU detects its temperature, while software monitors the overall temperature of the video card. When a certain temperature is reached near the GPU, the transducer turns on the cold source. When the GPU is in idle mode, the transducer turns off automatically. The dual 8cm fans push the hot air out of the card via the quad heatpipes, helping to eliminate the heat remaining in the video card system. The fans have an exhaust cover to minimise the noise.
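To make that on/off behaviour concrete, here is a minimal sketch in Python of that kind of threshold control; the trip-point temperatures and the function name are made up for illustration and are not Sparkle's actual values.

TURN_ON_TEMP_C = 70   # hypothetical load temperature at which the cold source engages
TURN_OFF_TEMP_C = 45  # hypothetical idle temperature at which it switches back off

def update_peltier(gpu_temp_c, peltier_on):
    # one polling tick of the monitoring software: decide the new transducer state
    if not peltier_on and gpu_temp_c >= TURN_ON_TEMP_C:
        return True       # GPU hot under load: turn the cold source on
    if peltier_on and gpu_temp_c <= TURN_OFF_TEMP_C:
        return False      # GPU back at idle temperatures: turn it off
    return peltier_on     # otherwise keep the current state (simple hysteresis)

state = False
for reading in [40, 55, 72, 80, 66, 50, 44]:   # simulated sensor readings in deg C
    state = update_peltier(reading, state)
    print(reading, "on" if state else "off")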
The first sample that I got did not have any heat sinks on the 12 memory chips or the I/O chip on the graphics card. I volunteered and obtained 13 pieces of Zalman's good quality GDDR memory heat sinks and, to avoid removing the whole Peltier assembly, spent over an hour with a pincer, positioning and fixing these heat sinks on the memory chips - under the Peltier cooler! See the photo with the end result.
(http://img165.imageshack.us/img165/9954/calibre2s1uj4.th.jpg) (http://img165.imageshack.us/my.php?image=calibre2s1uj4.jpg)
Now, I ran both cards on two different CPU configurations - one the Intel "Kentsfield" Core 2 QX6700 running at 3.20GHz with FSB1066, whose four cores proved useful in feeding the GPU for 3DMark06 for now at least (till some actual 4-core optimised games come out), and the old Intel "Presler" Pentium XE 965, running at 4.27GHz with its two cores, again on FSB1066. Both of these are the highest frequencies I could run these CPUs on and reliably complete the CPU portion of 3DMark06 repeatedly without a hitch, even when using Zalman 9700 high-end cooler. The CPUs were using the same platform - Intel D975XBX2 with 1GB of Corsair XMS-5400UL memory, enabling 3-2-2-7 low latency at up to around 670 MHz.
The performance was about the same - after all, Nvidia doesn't allow overclocking right now (feel free to share with us your experiences in overcoming this overclocking barrier here) and the GPU and RAM parts are the same. However, there was a definite difference in the noise. Sparkle claims to have reached over 20 deg C heat benefit (60 C working temperature instead of 80C) and 12 decibels less working noise compared to the reference card. I didn't measure the precise sound or heat levels yet, but the card was substantially quieter and, even after several rounds of 3DMark 06 with and without AA, anisotropics and so on, the huge Peltier block stayed coolish to the touch.
Also, I looked at the power consumption during the 3DMark test runs, and there was only a few watts' difference between the reference card and the P880+. A few watts more for far less heat and noise, not to mention some 60 grams less card weight? I guess the deal is good, and will be very good once the overclocking is unlocked. Most importantly, neither card took more watts than the X1900XTX running at 700/1600 speed settings, for way higher performance. In all cases, the system consumption peaked at about 230 watts as measured.
In any case, these guys offer both card versions - the reference design under the Sparkle brand and the Peltier version under the Calibre brand. Also, while the Sparkle version gets the Call Of Duty 2 game CD, the Calibre has the more 'unique' Painkiller game bundled along, as a full retail set.
Here are the initial 3D Mark06 results on the Kentsfield and Presler:
(http://img165.imageshack.us/img165/3507/calibre3tablegv7.jpg)
As you can see, this GPU really brings out the difference between the CPUs, even in plain 3-D graphics. If you want to spend upwards of US$700 on a card like this, you had better allocate a good sum for the right CPU (a quad-core QX6700 or QuadFather isn't a bad choice) and, of course, the right cooling and the right mainboard. Since these GPUs put strong pressure on the CPU, the PCI-E subsystem and memory bandwidth, I'd never go with a dual GF8800GTX SLI configuration on a dual PCI-E X8 platform like, say, the Intel 975X chipset. It simply has to be the new Nforce 680i (or, if you can make it work, one of those rare ATI RD600 chipset based boards).
The Sparkle Calibre P880+ is worth checking out for any high-end "enthusiast" keen on the G80 - I like the idea of a Peltier cooler on such a high-end card, though I have yet to see how it will perform in long-term operation. Also, trying it in SLI mode with a heavily overclocked, water-cooled QX6700 on the Nforce 680i, with a much faster FSB and memory system to feed the twin G80s, would be tempting. But well, that is for next week. µ
-
One from guru3d
http://www.guru3d.com/article/Videocards/391/
-
http://www.nvidia.com/page/home.html
HOLY F**K!!! Let it load, dont take ur eyes off the page. look at the woman......
-
I'm sorry but I must say the following:
1. I am DEFINITELY going to enlarge my e- penis next year
2. The GeForce 8800 is F***ing VIAGRA (and then some) for your PC
3. I will import TWO and sell one for like TT$7000 (yeah, frickin Wizz style pricing!)
4. The GP-F**king-U will e-RAPE and e-sexually abuse any Conroe left near it!!
WTMC!!
-
One from guru3d
http://www.guru3d.com/article/Videocards/391/
That review is THE most extensive I have EVER SEEN for a graphics card.
For ANY component for that matter.
You will not find a more comprehensive review than this one for a while I'm sure.
The performance numbers are simply STAGGERING.
WELL DONE NVIDIA!!! A THOUSAND 'Hoshi-Toshis'!!! *bows and kisses Emperor Nvidia's ring*
ALL HAIL THE NEW KING!!!! ^_^
-
lol Yep. that review was very well put together. im still in awe as to how much a performance jump this card manages! and its on newegg.......
http://www.newegg.com/Product/Product.asp?Item=N82E16814150205
http://www.newegg.com/Product/Product.asp?Item=N82E16814143075
soon......
-
guru 3d site
anyone else notice that the 7950 is now a sub 300 dollar card!
-
Overall I'm actually not impressed.
Truthfully what has me disappointed is all the damn heat and power usage lol.
Huge, overly power-hungry card that is overkill. When they have single-slot-cooler 65nm versions I'll think of switching.
Also ATI has 2.5x the 1950 as its goal, so the uberness of the 8800 won't last too long; ATI now has clear targets to meet and beat.
-
well for one, the 8800 uses less power and produces less heat than a x1900xtx or whatever the a$$ ati calls that thing
at first i, like most people, thought the huge length and width was a bad thing
but thinking bout it, that will be the future of cards, take up more real estate and do bigger better things
the cpus are getting smaller, and the cards are getting much larger; it's not that bad a trade off.
-
Before I go speechless lemme say R600 will prolly pwn the G80 when it arrives AND should consume less power as it's an 80nm part that is actually smaller in transistor count yet so far superior in specs. Anyways the G80 is still nothing to laugh at... now for the speechless part... pictures are worth THOUSANDS of words... and THOUSANDS these are....
(http://img80.imageshack.us/img80/1181/23k200620system20clocksgh1.th.jpg) (http://img80.imageshack.us/my.php?image=23k200620system20clocksgh1.jpg) (http://img300.imageshack.us/img300/4852/3044920system20clocks20screen20mark201wg6.th.jpg) (http://img300.imageshack.us/my.php?image=3044920system20clocks20screen20mark201wg6.jpg) (http://img300.imageshack.us/img300/4383/7498320system20clocks20screen20mark201fh4.th.jpg) (http://img300.imageshack.us/my.php?image=7498320system20clocks20screen20mark201fh4.jpg) (http://img348.imageshack.us/img348/2639/92k200120mark201rl7.th.jpg) (http://img348.imageshack.us/my.php?image=92k200120mark201rl7.jpg)
OMFGWDFWTFPWNAGE..... My jaw is still on the page I got those from.. you can see them here (http://xtremesystems.org/forums/showthread.php?t=122394)
I wonder if there are laws against intertechnology marriage... LMAO!!!! PWNED!!!!
-
^^^ Thousands you say.....
I can't find any. I'm truly speechless.
That system is every PC gamer's wet-dream/fantasy.
-
A review of the MUCH MORE pocket-friendly 8800 GTS can be found here (http://www.theinquirer.net/default.aspx?article=35618) and here (http://www.tweaktown.com/articles/977/5/page_5_benchmarks_test_system_setup_and_3dmark05/index.html). This looks to be many a gamer's sweet spot as it's more affordable and still outperforms the current gen cards to a great extent... oh and @ crixx that 'trade off' you refer to ain't all that balanced as the power these cards consume is increasing faster than the power savings of the CPU... so it's only a matter of time before these GPUs have to cut back on power like the CPUs but still maintain the increase in performance... after all WTMC yuh gwon do when that light bill hit yuh from using G80 in SLI with a 1KW PSU?
-
^^^ do like me, unscrew all de light bulbs in yuh house, put clothes to dry on de door, back of fridge and outside on de line. Do everything that needs light during the day.
Sell yuh tv, cause yuh does download everything yu need to see anyways
So only ting that need power is yuh water heater, yuh pc and yuh fridge
oh an microwave
right so after i fainted after seeing those benchies
now yu understand why i was so vex that they were using amd chips for the benchies before right?
cause nvidia goodness, deserves intel goodness.
Alyuh see that score?? 23000 in 3dmark 2k6?? i sure futuremark didnt even know it was possible for de software to generate numbers that high!
-
Did you see the OC on those CPUS??? 5GHz on Core 2 Duo Extreme??? OMFG... daz how they did it, that aside, LOL @ crixx's suggestions to save power and lastly can you spell 3DMark2006SE??? cause yuh know once someone hits 10K they have to up the ante to get those scores back down...
-
they even get the 5ghz on de freakin kentsfield!!! wtf!!
i sure dem conroes have dilithium crystals in dem an ting
have scotty level giving alll shes got captain!!!
have picard an dem level doin the head thing
-
I didn't even think 5 Ghz stable was possible on current gen CPUs. Intel has really outdone themselves with the Core2Duo/Quadro.
Hopefully by next year this same time the price of the E6400/6600 will be down to piper prices, once the Quadro takes centre-stage.
Maybe by then we might also have a 8800 GS-version as well. Super performance at respectable prices.
-
Actually it's just Quad arc :) but that aside they HAVE to be using OMFGWTFPWNAGE cooling to hit those frequencies.. we talking cascaded phase cooling or nitrogen cooling. Nothing short of that will get the temp low enough to perform such wizardry. As for a GS, we can hope so... but chances are we'll see a 8600xx first and the lower DX10 parts. The GS is more for when DAAMIT has a competing part and they need to scale down the toppa top parts to match. So not for a while i'm afraid.
-
quadro lol
-
Ok I got 1 beef with these cards atm... Nvidia has priced them @ 450USD... (GTS) and newegg starting price is 500USD...WTMC... (not counting shipping and tax to places like NYC)... dread this card is a beast but @ 500USD and 650USD... they have a reason they eh sell out yet. What's also funny is that ALL these cards are made by Asus.... so eVGA, BFG, Leadtek, etc. It's the EXACT SAME CARD with a different sticker. So really and truly there is NO reason to buy a more expensive brand name atm... I guess many will wait til R600, save the crazy rich mcs who have money to waste as children in africa starve... ah well
-
Ok I got 1 beef with these cards atm... Nvidia has priced them @ 450USD... (GTS) and newegg starting price is 500USD...WTMC... (not counting shipping and tax to places like NYC)... dread this card is a beast but @ 500USD and 650USD... they have a reason they eh sell out yet. What's also funny is that ALL these cards are made by Asus....
It's called price gouging; any rich techie who wants it NOW can afford to give newegg a 100 dollar bligh. After 2 weeks or so the price will drop to MSRP, and after a few months it will drop below that.
Nvidia cards can have huge differences between them, especially quality-wise. While Asus, EVGA and BFG are all good and might as well be the same thing, a Chaintech, let's say, will be of lesser quality, but cheaper.
-
quadro lol
Quad/quadro. Whateva. Yuh KNOW wha ah mean.
-
I think what is particularly funny is that QUADRO is an Nvidia WORKSTATION card... so it's not only a naming mistake but somewhat of a technical one as well. But is no seen arc, we all learning.
-
quadro was always a stupid and misleading name
fellas,
the 8800GTS looking too sweet to pass up.
omg.
The GTX is just ridiculous but I definitely see an 8800GTS in my not too distant future.
-
US$500 and that card is mine at some point in the next 3 - 6 months... really can't pass up that kind of power jes so.. you good to go for about 4 years once you get that card... at least you ought to be!
-
hummmmm
http://www.xtremesystems.org/forums/showthread.php?t=122469
They got up to 19000 3dmark06 now!
-
hummmmm
http://www.xtremesystems.org/forums/showthread.php?t=122469
They got up to 19000 3dmark06 now!
Like yuh eh read the previous page? they hit 23000 in 3DMark06 meh boy... 19K is for small frys!
-
It has been released......one of the best price/performance cards around now. read the review here:
http://www.guru3d.com/article/Videocards/416/
nice price too.....
-
oh yeah... saw a similar review on Anandtech..... very good budget card there, only absolute difference between this and the original 8800GTS is the RAM amount.
Does almost as well as the original 8800 except at very high res with 4xAA on.. and that's in certain benchies...
Definitely a good buy when available....
Somehow I think this will be an uber year for upgrades.....
-
http://www.newegg.com/Product/Product.asp?Item=N82E16814130082 (http://www.newegg.com/Product/Product.asp?Item=N82E16814130082)
-
good buy for now, but i'll wait for them to bring the cards to a 65nm process and with 512MB GDDR4 at least... Less power drain, cheaper and more speed.
-
Since I can't seem to find the original G80 thread (ok I too lazy to) I will post this here and change this topic heading. That aside, have a read:
Geforce 8900GTX and 8950GX2 details listed
With 8900 GTS and 8900 GS at 80 nanometres
By Fuad Abazovic: Thursday 15 February 2007, 13:10
JUST A DAY after we uncovered the existence of Nvidia's Geforce 8950 GTX and Geforce 8900 GTX, our friends in Taiwan confirmed the news. You can read our original stories here and here.
The Geforce 8950 GX2 is a dual-chip card based on a new 80 nanometre G80 chip, probably codenamed something else. Both GX2 GPUs are clocked at 550MHz, and the other difference is GDDR4: the card comes with 2x512MB of 256-bit GDDR4 memory clocked at 2000MHz. The card has 96 Shaders per chip. It will be priced at $600.
The second in Nvidia's spring line-up is the Geforce 8900 GTX, clocked at 700MHz on the GPU and 2200MHz on the memory. The chip still has a 384-bit memory interface and comes with 768MB of memory. The card uses a new 80 nanometre chip and has 128 Shaders. Compared to the Radeon X2800XTX it will end up shorter on clock and memory interface. You can compare them here.
The Geforce 8900 GTX is priced at $550.
Meanwhile, the Geforce 8900 GTS is a new card clocked at 600MHz on the GPU with 2000MHz GDDR4 memory. It uses a 320-bit memory controller and comes with 640MB of memory. This card should cost $500 and it uses the 80 nanometre chip. This card will still have 128 Shader units.
The current king of the crop, the 8800 GTX will drop in price to $450, while the Geforce 8800 GTS loaded with 640MB of memory stays up in the $400 price range.
Nvidia has two more 80 nanometre cards. The Geforce 8900GT, with a 600MHz core and a 256-bit memory interface, comes with 512MB of 1800MHz GDDR3 memory. It has 96 Shaders, is built on an 80 nanometre process and will cost $400.
The 8900 GS will be the cheapest G80-based card. The 80 nanometre based beast is clocked at 550MHz on the core and 1600MHz on the memory. The card has a 256-bit memory controller and comes with 256 or 512MB of GDDR3 memory. It also has 96 Shaders. The 256MB version will cost $200 while the 512MB incarnation will end up at around the $250 price mark.
The apparently-leaked document mentions G84 and G86 chips but we have some details to confirm.
-
Wow those are a lot of cards yet to be released. And we aient even talk about the budget 8600 series yet.
-
...brace!! an i jus buy ah XFX 7900GS EE for $200 ...yuh really can't keep up with these manufacturers... dem always tryin to make yuh spend money on de latest toys!
-
nice battle for the X28xx series cards. $550 not bad for it, considering that the 8800 cards came out at $100 more than that.....
-
http://www.nzone.com/object/nzone_cascades_home.html - nvidia's DX10 tech demo for anyone who has an 8xxx card.. Note the new features: on-card physics and geometry shaders, so that just about everything is done on the GPU. Also, with an algorithm you can procedurally generate infinite terrain instead of having to map it out, which means smaller but more sophisticated games in the future.
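For anyone curious what 'procedurally generate terrain' means in practice, here is a minimal sketch in Python of interpolated value noise, the kind of trick such demos build on; it is only an illustration of the general idea, not the technique the Cascades demo actually uses.

import math

def _lattice(ix, iz, seed=1337):
    # deterministic pseudo-random value in [0, 1) for an integer grid point
    n = math.sin(ix * 12.9898 + iz * 78.233 + seed) * 43758.5453
    return n - math.floor(n)

def height(x, z):
    # smoothly interpolated value noise: the same (x, z) always gives the same
    # height, so terrain can be generated on demand instead of stored as a map
    ix, iz = math.floor(x), math.floor(z)
    fx, fz = x - ix, z - iz
    wx, wz = fx * fx * (3 - 2 * fx), fz * fz * (3 - 2 * fz)   # smoothstep weights
    top = (1 - wx) * _lattice(ix, iz) + wx * _lattice(ix + 1, iz)
    bottom = (1 - wx) * _lattice(ix, iz + 1) + wx * _lattice(ix + 1, iz + 1)
    return (1 - wz) * top + wz * bottom

# any patch of the 'infinite' world can be sampled whenever it comes into view
print([round(height(x * 0.3, 2.7), 3) for x in range(5)])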
-
Ah boy, its looking like a good year for hardware upgrades again.
Cheap CPU beasts (Core 2 Duo E4300/4400) coupled with cheap video-card beasts.
8900 GS here I come.
-
For real. The $200 price point for video cards and CPUs looks to feature a very decent lineup.
-
Yeah yeah yeah necroposting :p bite me, it's an update and I didn't feel like making a WHOLE new thread bout OLD hardware
Nvidia cynically fiddled video benchmarks
And suckered the press to boot
By Charlie Demerjian in San Francisco: Tuesday 18 September 2007, 17:10
THERE HAS BEEN quite a bit of talk lately about Nvidia fiddling with certain video tests, but that is only the tip of the iceberg. Understanding how it manipulated the benchmarks only hints at the problem; the real damage is revealed in how it did so - through sleazy, underhanded playing of the media.
The problem is simple: Nvidia can't do video decode well, and the current media-darling benchmark, HQV - especially the HD version - is eating its hardware alive. On quality, Nvidia regularly loses about 25 per cent of its score on the noise reduction test - enough to more or less take it out of that game.
So, what do you do when you are blown out of the water? Cheat with a plausible workaround, and then spin your ass off hoping not to get caught. Well, Nvidia seems to have done both.
First, a little background. Noise is basically data that is out of place - bits that are errors or other artifacts of the process. On sound recordings, you hear it as exactly that: noise, popping and clicking. In video, you see pixels that are anomalous and short-lived; they pop up, flash, and degrade image quality. You see the noise.
One of the ways you deal with it, and one that does the job quite effectively, is a time-based filter called Temporal Noise Reduction (TNR). Filters that smooth edges and the like work across the image, looking at the pixels around a given pixel to decide if it is out of place. Time-based filters do that but also look across frames of video.
If you have a background that is black for five frames, with a couple of white pixels floating about on one frame, then five more completely black frames, chances are you have noise. The filter detects things that pop up briefly without any accompanying pixels. Then again, this is grossly oversimplified, and we just made filter engineers die a little inside with that explanation.
In the end, it can be very effective: you just average across time, and high-frequency pops go away. The problem is that the more you turn it up, the more averaged things become. You end up with washed-out images, blurred edges, and fast-moving things - or things that should be there - getting really ugly. High-frequency detail is also a lot of what makes an image 'sharp' instead of blurry.
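As a rough illustration of what such a time-based filter does (and why cranking it up causes the washing-out just described), here is a minimal sketch in Python using numpy; the blending scheme and the strength value are mine for illustration, not Nvidia's actual filter.

import numpy as np

def temporal_noise_reduction(frames, strength=0.7):
    # blend each frame with a running average of the previous ones; the higher
    # the strength, the more noise is suppressed but the more ghosting you get
    history = frames[0].astype(np.float32)
    out = []
    for frame in frames:
        history = strength * history + (1.0 - strength) * frame.astype(np.float32)
        out.append(history.astype(frame.dtype))
    return out

# toy clip: six black frames with one white 'noise' pixel flashing in frame 2
clip = [np.zeros((4, 4), dtype=np.uint8) for _ in range(6)]
clip[2][1, 1] = 255
cleaned = temporal_noise_reduction(clip)
print(cleaned[2][1, 1])   # the one-frame flash is heavily attenuated by the averaging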
The other problem you see is a kind of ghosting. If you turn the TNR filter up too high, then on transitions between light and dark scenes, especially cuts, you see ghosting for a few frames. If the image is white for 10 frames and then goes black, the first few black frames are classified as anomalous and get whitened or softened. Again, a vast oversimplification of the process, but you see washed-out and lightened transitions. It looks like this.
(http://img338.imageshack.us/img338/9320/nvfilternoisehy8.jpg)
Let me be the first to say that of the four pics I took, this was the best, and it still is pretty awful. That said, you can still see the problem, but in person, it looks far worse. Let me give you a bit more explanation.
These pics were done on identical Dell 30-inch monitors, the ATI 2600XT based box is on the left, the NV 8600GTS is on the right. Throughout the tests, the NV machine had lighter images, a benefit in some places, a downer in others. This is not a good or bad thing, it is just a difference between the cards. This picture was taken with the test paused to show details, what you are seeing is a static image, not motion effects.
If you look at the NV screen, it is more washed out, but there is a lot of banding and blotching where it should be solid black. Some of this is the lighter picture, most appears to be from the TNR filter. If it was only the lighter picture, you would not get the blotching effect, and if it was a problem with the monitors, both would be blotchy. I apologise for the poor image quality, but all I have is an old camera and a new phone to take pics with, and in this case, they don't quite cut it.
Getting back to the problem: using TNR is not a bad thing, in fact it can be a very good thing, since it does improve image quality. But if you overuse it, you degrade image quality in certain ways, and that is where the cynical gaming of the press took place.
When NV put out its 163.11 drivers, it turned the default TNR way up, just in time to give them out to reviewers. Now, if you have TNR turned down, NV basically fails the noise reduction test in HQV. With it cranked up, it passes with flying colours. Fair enough, this is what it is designed to do, and it looks like it does what it is there to do, right?
Yup, perfectly. The first problem though is that HQV's noise reduction test has little movement in it; it is a flower swaying gently to and fro in the breeze. There is next to no motion, so you won't see how much TNR munges the edges and blurs things out. On the other tests, TNR does not affect the outcome because each measures specific performance parameters - it is a synthetic benchmark that measures one performance dimension at a time.
The end result is that TNR makes a single test much better and does not affect the others. You would think this qualifies as a job well done, and if the world was limited to watching HQV loop, you would be right. I don't know about you, but I watch other videos on my computer too, and therein lies the rub.
What Nvidia did with TNR washes out video and notably degrades the quality. Blacks are no longer black during transitions, things blur and lose sharpness. In general, it is a mess. It does reduce noise in those videos though, and if you have a noisy high-def video setup, this is the driver release for you. This all comes at a price, the drivers are broken elsewhere, and they are broken to game a benchmark. Reviewers were not told this, and dutifully reported that the video problems were fixed.
When asked about it, NV will probably respond that it is under user control, and you can set it however you want. That is 100 per cent true, but I don't know about you, I want my video devices to work, not to need adjustment at every scene change. I want my video driver to reduce noise and not make things look like Michael Jackson's privates; a manual slider change at every scene change is unacceptable. It is under your control, but nevertheless it is still broken and ineffective.
The cynical gaming was that it was done in a beta release meant for the members of the press, and handed out to them. The next version, 163.44, vastly lowered the default value, and presumably it will vanish under the waves with successive releases until a new card comes out, rinse and repeat. If I ever get an NV card, I will keep an eye on this.
If you are confused as to how this benefits NV, think about it this way. They released 'benchmark special' drivers, and the reviews were done on these parts. Headlines screamed, or maybe footnotes briefly mentioned, that NV scored XYZ in HQV, which is much better than the XYZ - 25% that they would have gotten without the TNR.
By the time someone sees the problem, weeks have gone by, as is the case now, and the problem is written off as a setting in a beta that was never meant for public use. That would be fair, but guess which number goes on the marketing presentations, the slides, and the boxes? Guess which ones people read in the magazines (remember them?), at the newsstand? A retraction four months later on page 173 is not the same as a cover article with the newest GPU on it.
So, from my point of view, NV duped the press, engineered fake numbers, and got away with it. It is a real pity because if you buy a card, you are not getting the same performance you read about in the skewed reviews; you are getting a scammed set of numbers. This is surreptitious at best and plain old cheating at worst. The firm deserves to be called out, otherwise it will do it again the next time the opportunity arises.
So it falls to us to do the calling. µ
-
got the original link W1n? i'd like to see the images they're talkin about
-
Recheck the post, I added the image in.
-
Aye wam to this man calling we cards ole boy WTMC??? lol
-
'OLD' !!?
*watches W1ntry 'cuteye'* :disgust:
-
LMAO, well guys face it, it's old now. There is a hardware refresh based on the G80 due out Nov and the G90 due sometime Q1/2 next year.... blame it on Nvidia and DAAMIT for having such ridiculously short product life cycles :p
-
This is GOING to piss off GTX and even GTS owners to a certain extent... but such is a refresh cycle of video companies these days...
Nvidia kills 8800GTS 320MB, relaunches 640MB
Boxed in
By Theo Valich: Friday, 05 October 2007, 1:00 PM
We knew that Nvidia was planning to kill the 8800GTS 320MB in order to make room for the 65nm die-shrink that the world has come to know as G92.
But it turns out that you cannot order 320MB versions any more either, since it is being pronounced as an EOL (End of Life) product.
Next in line to go through a change is the 8800GTS 640MB, which is being tweaked up in order to live through the 512MB and 256MB versions of G92.
Nvidia decided to raise the specs by another 16 scalar shader units, so the 8800GTS will now feature 112 scalar shaders, 16 less than 8800GTX/Ultra. Clockspeeds remain the same, as do thermal and other specs.
But there are a lot of miffed Nvidia partners, crying foul over the situation. Imagine the surprise of AIBs that have thousands of printed retail boxes with old specs.
If Nvidia ever says that it's thinking about greening the planet, or spouts any of the other environmental chit-chat manufacturers like to use these days, ask it how many trees it killed with just this sudden announcement. µ
And this next article will piss off Nvidia fanboys :awesome:
Nvidia admits defeat in G92 vs. RV670 battle
3DMark wars
By Theo Valich: Friday, 05 October 2007, 9:39 AM
IN THE FAR EAST, 3DMark is everything. You can say whatever you want about canned benchmarks, but nobody can dodge the influence of the 3DMark06 benchmark.
It's been the same story with previous iterations of graphics cards, and the same will happen with the next, DX10-only workout. When that is coming out, only Futuremark knows.
But something very significant happened in this round of the war, at least according to our highly-ranked sources.
This time around, Nvidia did not tout its G92_200 series as the fastest thing since Niki Lauda, but rather admitted defeat in this all-popular synthetic benchmark at the hands of a yet-unnamed Radeon HD part.
A reference board from Nvidia is capable of scoring 10,800 3DMarks, while a reference board from ATI will score around 11,400 3DMarks, or a clear 550-600 points advantage.
This is a massive leap over previous-gen parts. The current generation's high-end performer from Nvidia, the 8800Ultra, scores 12,500 points. Seeing a mainstream, $250 part scoring barely a thousand less than a current $599 card only makes us wonder how the owners who coughed up so much will feel.
When it comes to ATI's part, you know what to expect in this synthetic benchmark - outscoring the Radeon HD 2900XT is the default mode of operation for the RV670XT. At least at lower resolutions.
Partners are less than happy with Nvidia board politics as well, but this is a subject of another story. µ
-
Hummmmm and I was already planning on a hd2950 pro ( RV670 lol )
-
Ah boy...8800 GT benchmarks (http://www.tomshardware.com/2007/10/29/geforce_8800_gt/) start to show up.
I need to see Crysis performance with this biatch. I NEED to know if it makes sense to upgrade to this, or the 9800 GTS.....
-
i hope there's an ice factory under that 1U slot allocation... or is it more suited to the northern hemisphere, lol :mellow:
-
Based on that review, it seems that the 8800 GT is just about 30% faster overall than the 8800 GTS 320 MB, while consuming less power
and costing just about the same, or a little less. Very nice, considering that the GTS 320 MB has been officially discontinued.
I'm already seeing them here (http://www.tigerdirect.com/applications/SearchTools/search.asp?cat=1826&keywords=8800%20GT&mnf=) and here (http://www.amazon.com/s/ref=sr_nr_p_4_1/105-9930097-8028467?ie=UTF8&rs=172282&keywords=8800%20GT&rh=i%3Aaps%2Ck%3A8800%20GT%2Ci%3Aelectronics%2Cn%3A172282%2Cp%5F4%3AEVGA), available for preorder.
Again, Nvidia seems to have come up trumps here, but it's not enough of an upgrade for current GTS 320 MB owners, especially in light of the rumours that the 9800 series
cards are twice as fast as the 8800 series.
I think I can 'manage' until the 9800 series shows up, THEN we'll see where de fight takin place.....
-
just read that review Arc posted...
Looks like i'll be upgrading again early next year...8800GT here I come!!
-
9800 gt -frickin- s FTW dred!!!!!!
-
^^^^^^^^
(http://img141.imageshack.us/img141/1761/cosignok6.jpg)
:happy0203: :happy0203: :happy0203:
-
Interesting comments from a review:
The Good
This is a fixed G80. Nuff' said. It beats the 2900XT, the 8800GTS 320 and 640MB, and even the 8800GTX in some cases. This card has enough horsepower for games, but do not expect FullHD performance with a single card in every game. Brilliant piece for Crysis, Hellgate and UT3.
The Bad
Drivers are green at this time; install those shipped on the CD, since the rushed Crysis-demo beta driver does not guarantee stability. The 256-bit bus could become a bottleneck in the next batch of games, and the VP processor could have been the newer VP3 and not VP2.
And the Ugly
Single-slot cooling may be cool, but our board heated up significantly. The temperature was always in the high-60s (Celsius) range, and overclocking capabilities were limited due to the single-slot cooling.
The full review can be found here: http://www.theinquirer.net/gb/inquirer/news/2007/10/29/inq-grills-8800gt-dx10-hit (http://www.theinquirer.net/gb/inquirer/news/2007/10/29/inq-grills-8800gt-dx10-hit)
I may just have to get this card... but without a quad core... hmmm crysis...
-
No offense W1ntry, but that review is kinda lame (those graphs suck A$$).
That said, I'm glad they did an immediate Crysis comparison, and the results look promising.
It seems that Nvidia has trumped......Nvidia? ???
I can always count on Tom's Hardware and Anandtech (http://www.anandtech.com/video/showdoc.aspx?i=3140&p=1) to bring some comprehensive reviews to the table.
Based on the performance results in that Anandtech review, it seems that the 8600 and 8800 GTS parts are now 'useless', for users
upgrading from a series 78xx or 79xx card (or ATI X1xxx).
Hmmmm.... *considers* anyone want a 7900 GS to buy?
-
anyone want a 7900 GS to buy?
weyyyyyyyy arc now yuh go say that hoss?
how much yuh calling on it?
EDIT:
newegg aint sticking qui!
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=2000380048+1069633099&name=PCI+Express+2.0
-
:lol: Hey, I have to be sure I can actually GET one of those 8800 GTs.
You KNOW that they'll be a piece of much sought after hardware.
Them Newegg prices kinda steep, seeing as Amazon have em for $269 US.
-
:lol: Hey, I have to be sure I can actually GET one of those 8800 GTs.
You KNOW that they'll be a piece of much sought after hardware.
Them Newegg prices kinda steep, seeing as Amazon have em for $269 US.
fuh true?!
i only gone there via a link and i see pcie 2.0 in the v/c options so i get curious
well if you do plan to sell it him a bro a pm i might be interested
-
As berzerk pointed out to me, these cards seem to have dried up everywhere on the net and the preorder dates are all for 31st December. Apparently these cards are in rather short supply. Either that or people reaaaalllll buy them out.
-
Wintry yuh mc! you said i was aoob when i claimed i would be getting at least an 8800GTX-performing card in a year's time, for around 200 US.. you told me it would take at least two years..
8800gt is the shizz! Even better, apparently AMD's card is around 10% slower, so they're going to sell it cheaper to compete!
http://techreport.com/discussions.x/13495
Times are looking good.
A card just a smidge under the 8800GTS for 199-250, what more can you want but... perhaps 2?
http://techreport.com/articles.x/13479/8
Apparently the only drawback to the 8800gt is that it doesn't support the 10.1 spec as far as i've read, so the slower
AMD 3800s may be the better buy for longevity.
AMD might have a chance to overclock the cards a bit before release; based on a 55nm process it should be able to get a 10% boost in clock without too much hassle.
The 3800 series is basically a die shrink and modification of the X2900; one model runs at the same clock speeds as the 2900xt, and another is higher clocked.
-
The gt is not $200, it's closer to $300 and only in some things does it come CLOSE to a GTX. Your predictions haven't yet come true.
-
mumbles incoherent DiRTy words about recently purchasing a 640MB GTS, and new shizz hittin de market arredy :shakehead:
-
LOL small ting dani once u overclock the card, you're on par and even faster than the gt.
-
MSRP is 250 for the base 512MB models, 199 for 256MB models; for the first few weeks products always appear above recommended pricing so the websites can make a little extra money off of impulse buyers.
Eg. i got my core 2 duo e6300 for 230USD the 2nd week of its release cause i was suffering without a system; MSRP was 184 or so.
if you check newegg there is a sold-out 259USD PNY model; i'm betting the retail price of this one will drop to like 230US in a few weeks.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814133205
I'm expecting overclocked models of the 8800GT to beat the GTX in everything but very high resolutions, where memory bandwidth will limit it.
And of course... Let's not forget Nvidia probably has a 65nm high-end replacement that will make the 8800gt look like what it is.. a mid-range card.
-
And if THIS is midrange, imagine the power of the refreshed GTS's and o_O @ the 9800 xxx, all within the space of 3 months!!!
-
i gots my 8800GT pre-ordered... bar truly better stuff coming out.... 31 December is my ship date... nuff time to save to actually pay for it...
-
just got my order for the 8800 gt cancelled...truly a WTMC worthy post.
ARCMAN tried his darndest best to order the card for me but tiger direct playin up in dey....*$#@!
But wait there's more!!! http://www.dailytech.com/article.aspx?newsid=9474 Nvidia not easy oui, this is quite possibly their most successful card launch...evaaaa and they not done yet...
mebbe this happened for a reason oui, the 8800 gt wasn't even dx10.1 compliant.
The new cards from Ati may be exactly what the doctor ordered.
http://www.tomshardware.com/2007/10/29/amd_hd_3800_to_support_dx_10/
-
All the 8800GTs on newegg are sold out.. The overclocked models that reach 660-700MHz (depending on manufacturer) fetch close to 300 at the moment.
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Description=8800gt&x=0&y=0
According to one site the 8800GT MSRP is: 199-249 USD for the 512MB model and 179-199 USD for the 256MB.
http://www.techarp.com/showarticle.aspx?artno=467
Nvidia would be cannibalizing all their cards above the 8800gt if this is true; it makes no sense buying a 500 dollar 8800 ultra when two 8800gts for the same price will rape it.
ATI is supposed to be pricing their cards even lower in order to compete... being on a 55nm process i'd think they can risk clocking them a bit higher as well.
The base HD 3800 model will only be a little faster than the 2900xt, which the 8800gt beats in most areas...
madness yes.
-
All I'm going to say is NEVER believe anything from inq or fudzilla!!!!
Next, the 8800gt is a killer card, no doubt, but we need to see what the hd 3800 looks like. My guess is the 3800 will suck until 10.1 games are out and amd's designs start to shine; until then the 8800gt will own all.
Also, the 9800 series will not be 2x the 8800!!! It will be faster, but not that much faster, so don't hold your breath too long.
AMD better start getting their act together; they are getting owned all over
-
according to leaked specs, ATi's card is definitely going to be slower than the 8800gt, which is already faster than the 2900xt
http://techreport.com/discussions.x/13531
-
You need to learn to read there. nowhere does it say it will be slower than the 8800gt! Specs also mean nothing without benchies!
-
woahhhhh well looks like there's gonna be some brother vs brother combat. Grab your recliners and doubles pple lets enjoy the show.
It's no use trying to predict a card's performance based on its specs ESPECIALLY when we talking about a DAAMIT product. When the 2900xt was first announced, we all thought that it would blow the 8800's out of the water......and well....the rest is history.
-
according to leaked specs, ATi's card is definitely going to be slower than the 8800gt, which is already faster than the 2900xt
http://techreport.com/discussions.x/13531
Yes, but the price will be lower as well. If you haven't realized, DAAMIT has taken the stance of 'well, performance isn't the key, but on PRICE/performance we'll kick em in the nuts'. The 2900XT is slower than the 8800GTS in most cases, yes (for now; let's see how the newer, more shader-dependent games go as time progresses), but the price on the revised 2900 will be lower than the 8800, so it depends on your budget. Cause let's face it, if you can get 60+ frames with AA and FSAA enabled it doesn't matter if you getting 70 or 90, cause you won't see it. Once the cards perform above 60FPS it's fine, esp at a lower price.
-
woahhhhh well looks like there's gonna be some brother vs brother combat. Grab your recliners and doubles pple lets enjoy the show.
No show, I'll just slap him around the room and he'll go cry in the corner. He does need to learn to read tho.
-
DOH!!! and I was seriously wanting to buy a 8800GT!!! DAAMIT!!!
Why DX10.1 matters to you
Like speed and beauty?
By Charlie Demerjian: Friday, 16 November 2007, 10:24 AM
YOU KNOW THE graphics wars are heating up when the 'presentations' start to fly. The latest one is from NV talking about their upcoming 8800GT/256 which will be out soon, but it is the header that makes me think it is more than innocent advertising.
"Kevin Unangst, Senior Global Director for Microsoft Games for Windows: " DX10.1 is an incremental update that won’t affect any games or gamers in the near future." He is right, mostly because it won't be out until SP1 hits, so no argument there. More importantly, DX10 and onward is tied to the broken malware infestation known as Vista, so it is really irrelevant, but there is more to it than that.
Let's be honest, DX10.1 brings a lot of new features that don't really matter much, if at all, and you can read all about them here. That said, there is one that will matter a lot, contrary to what the MS people say. This magic feature is multi-sample buffer reads and writes (MSBRW). If you are wondering how you missed that big one in the feature list, well, shame on you, read better next time.
What MSBRW does is quite simple: it gives shaders access to the depth and sample info for all samples without having to resolve the whole pixel. Get it now? No? OK, we'll go into a bit more detail. DX10 forced you to compute a pixel for AA (or MSAA) to be functional, and this basically destroyed the underlying samples. The data was gone, and to be honest, there was no need for it to be kept around.
Games like Quake3 would do a lighting pass, then a shader pass, then another lighting pass followed by more shaders, and so on until everything was rendered right. This was quite precise but also quite slow. Dog slow.
To optimize around this, a technique called deferred shading was invented. This does all the lighting passes followed by a single shader pass. If you have five passes, you can basically skip four trips through the shaders. The problem? Because the pixel isn't fully computed - it's just a pile of AA data - there is no way for it to be read. This is horribly simplified, but I don't want to go into the low-level stuff here; go look it up if you really care.
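A back-of-the-envelope way to see the saving being described (a toy Python calculation; the numbers are illustrative only, not from any real engine):

pixels = 1920 * 1080
passes = 5                                   # lighting passes in the scene

forward_shader_work = passes * pixels        # multi-pass: the shaders run every pass
deferred_shader_work = 1 * pixels            # deferred: one shader pass at the end

print(forward_shader_work / deferred_shader_work)   # 5.0 -> four shader trips saved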
What this meant is that you can't turn on AA if you have deferred rendering unless you do supersampling, which is rendering at a higher resolution and sampling down. This is unusably slow, so it went out the door, meaning that if you were designing a game, you picked speed in the form of deferred shading, or beauty in the form of AA. Most DX10 games will go for speed, meaning the AA hardware will sit more or less idle.
DX10.1 brings the ability to read those sub-samples to the party via MSBRW. To the end user, this means that once DX10.1 hits, you can click the AA button on your shiny new game and have it actually do something. This is hugely important.
The first reaction most people have is that if a game is written for DX10, then new 10.1 features won't do anything - AA awareness needs to be coded into the engine. That would be correct, but we are told it is quite patchable, i.e. you will probably see upgrades like the famous 'Chuck patch' for Oblivion. Nothing is guaranteed, but there is a very good chance that most engines will have an upgrade available.
In the end, DX10.1 is mostly fluff with an 800-pound gorilla hiding among the short cropped grass. MSBRW will enable AA and deferred shading, so you can have speed and beauty at the same time, not a bad trade-off.
Since NV has not done the usual 'we can do it too' song and dance when they are being beaten about the head and neck by a bullet point feature they don't have, you can be pretty sure they can't do it.
A close look at the drivers, and more tellingly no PR trumpeting that they will have it out before the release of SP1, almost assuredly means that it will never happen. If you have a G8x or a G9x card, the only feature of DX10.1 you will miss is the important one. µ
-
A worthy MOBILE CHIP???
Nvidia's 8800M GTX mobile chip runs Crysis, shocka
You did see the new graphics chip, right?
By Wily Ferret: Tuesday, 20 November 2007, 12:20 PM
YOU MIGHT NOT HAVE NOTICED, but yesterday Nvidia launched its new mobile graphics chip, the 8800M GTX. Whilst the release may have gotten buried under the weight of Intel and AMD reviews, it's actually probably more interesting than either of those two big dogs.
AMD Spider was, by all accounts, a disappointment, and the Intel quad-core release was barely a speed bump. The new 8800M GTX, on the other hand, is actually a rather good bit of kit, it seems. The number of stream processors is now at 96, up 3x from the previous top-of-the-range kit - the 8600M - which had 32. The memory interface is now 256-bit, which also accounts for a massive increase in frame rates in the latest games.
Perhaps the most striking indication of the performance is in this video review (http://www.unwiredshow.tv/2007/11/19/22-nvidia-notebook-graphics/) which shows the 8600M and the new 8800M side by side benchmarking Crysis. The 8600 is barely more than a slideshow, whilst the 8800 rips through the benchmark like hot butter, as Steve Jobs would say. If a picture is worth a thousand words, we suspect a video is worth at least two Excel charts.
As you'd expect, then, all the usual suspects are lining up to add 8800M GTX parts to their lineups, including Alienware and the UK's very own Rock. If you're after a gaming notebook, it looks like this is the one to have. µ
-
One more thing deserving of its own post
Wintry yuh mc! you said i was aoob when i claimed i would be getting at least an 8800GTX-performing card in a year's time, for around 200 US.. you told me it would take at least two years..
The retail on that part is still >260USD AND its performance is in SOME cases CLOSE to the GTX, however not at the higher res. It will still be another year before you see performance matching and beating the GTX at a 250USD price tag.
-
Well, I finally got my GT, and I must say the package was deceptively light when I received it.
I almost thought that the wrong thing was sent. When I opened it then I understood:
(http://img134.imageshack.us/img134/8333/dsc00023un7.jpg)
I knew it was single slot, but still, it's really sleek and dare I say it...sexy?
It's a BIG step up from the GTS
(http://img209.imageshack.us/img209/7278/dsc00024ze5.jpg)
I cannot WAIT to see how this baby performs. :happy0203:
-
Arc... you changing hardware like a 'worker' from Villas I would imagine does change latex.... wtmc?? yuh selling organs like in Turista oh wha?
-
hhahahah i now go to ask the same thing
looking to adopt a fullly grown male to be yur heir ahwah?
i all for dat!
and yes that is a damn sexy NON RED card w00t
-
:lol: lets just say that I may not be upgrading anything until next year this same time.
Ah putting all that OT money I work hard for this year to good use. There's nothing like seeing a tangible
result of the fruits of your labour.
-
Arc... you changing hardware like a 'worker' from Villas I would imagine does change latex.... wtmc?? yuh selling organs like in Turista oh wha?
Daiz why the man working so hard, plus he single again, so money flowing apparently like milk and honey.....LOL
-
Hmmmm sweet looking card indeed. Let's see if you'll be beating my score b1atch.
-
I working too, yuh doh seem me changing hardware like chain smoker changes fags. And not only that HOW he get his hands on a 8800GT when LARGE online Etailers like newegg cyah keep stock for more than a few minutes???
-
I working too, yuh doh seem me changing hardware like chain smoker changes fags. And not only that HOW he get his hands on a 8800GT when LARGE online Etailers like newegg cyah keep stock for more than a few minutes???
Man have connections, what can I say...LOL
-
Well W1n, MINE is definitely coming by month end...
Due to finances i not sure if i'll keep it or not when I get it (a man done call dibs if i decide to sell it!) but i am sure if I keep it is the last card i'll be getting for a long time..
-
I working too, yuh doh see me changing hardware like a chain smoker changes fags. And not only that, HOW he get his hands on an 8800GT when LARGE online e-tailers like Newegg cyah keep stock for more than a few minutes???
Well, what can I say? Gaming and tech/gadgets are my only vices.
As for the GT...yuh know how LONG I stalking Amazon for it!!? I just happened to see this one for sale by a 3rd-party merchant that
had just this ONE in stock. Needless to say I jump on it like cheetahs on gazelles.
-
it reach in yuh hands yet me boy? :banana:
we waits for the benchies urgently!
-
it reach in yuh hands yet me boy? :banana:
we waits for the benchies urgently!
He post pics of it Dani, check the previous page..
-
:lol: lets just say that I may not be upgrading anything until next year this same time.
Ah putting all that OT money I work hard for this year to good use. There's nothing like seeing a tangible
result of the fruits of your labour.
WORD!!
solid! :ph34r:
(try explaining that to the XYL! :shakehead: )
-
ok where you got that gt man?!!!
-
ok where you got that gt man?!!!
He said AMAZON.COM
-
ok where you got that gt man?!!!
He said AMAZON.COM
Said Abraham....
btw... what clock speed version is your card, Arc? Cause the default is 600MHz and I've seen eVGA cards clocked all the way up to 700MHz... so which version is it and what price did you get it at?
-
EVGA GeForce 8800 GTS Super Super Clocked (SSC) Edition Video Card
http://www.youtube.com/watch?v=sSKLnatJBCE
Beats the 8800GT in most games, including Crysis! Comes with a FREE COPY OF CRYSIS OR ENEMY TERRITORY BTW!!!! Sweet.
-
AT US$400.00?? (http://www.amazon.com/e-GeForce-PCI-Express-Graphics-Card-Lifetime-Warranty/dp/B000YLGWG6/ref=sr_1_1?ie=UTF8&s=electronics&qid=1196369742&sr=8-1)
Tempting, but a US$250 8800GT + US$50 Crysis works out a lot better pricewise, even if this particular GTS beats the average GT hands down.
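Just to spell the arithmetic out (a throwaway sketch using the ballpark prices quoted above, not exact street prices):

# Rough bundle comparison in USD, using the figures from this thread
gts_ssc_with_crysis = 400              # GTS SSC, Crysis included in the box
gt_plus_retail_crysis = 250 + 50       # 8800GT plus a retail copy of Crysis
print(gts_ssc_with_crysis - gt_plus_retail_crysis)  # ~100 USD premium for the GTS SSC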
-
TweakTown Review...
http://www.tweaktown.com/articles/12...ted/index.html
Hmmm, nice-performing card....at US$400 though? I think Sax said it best... An 8800GT + some overclocking + Crysis = best value for money.
-
Well lookee here. The 256 MB 8800 GT models finally show themselves (http://www.amazon.com/gp/product/B0010OT3O4/sr=1-9/qid=1198722615/ref=pd_cp_e_3?ie=UTF8&qid=1198722615&sr=1-9&pf_rd_p=250314601&pf_rd_s=center-41&pf_rd_t=201&pf_rd_i=B0010WJAD0&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=0D9E19D5Y1PNHBRXYSSC)...and at a nice price too.
(http://img176.imageshack.us/img176/3344/screen26122007105346pmvq9.jpg) (http://img186.imageshack.us/img186/945/screen26122007105440pmbx0.jpg)
Shouldn't cost more than $250 US in total (including freight/Customs) to get it in hand in TnT.
This card makes a lot of sense if you're not gaming at resolutions at or above 1600 x 1200.
Who say it? :happy0203:
-
Plenty sense is joke. This card is sure to wtfpwn in any game other than Crysis, and even there it should hold its own nicely.
-
On that Amazon link they have the 512MB OC version for 90 dollars more,
so the question here is: is that 90 dollar savings worth it?
-
Remember that US$90 equates to more than FIVE HUNDRED dollars TT eh (almost six hundred). As I said, it's worth it if
you want to game at lower resolutions (e.g. 1280 x 1024) with all settings maxed out, with 2X or 4X AA on.
Also remember that you need a suitably large LCD to display those higher resolutions, so you have to consider
the native resolution of your monitor/LCD when buying this card.
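For anyone checking the TT figure, here's a quick sketch; the exchange rate is an assumed ~6.3 TTD per USD, roughly where it sat at the time, not an official number:

# Rough USD-to-TTD conversion (6.3 is an assumed rate for illustration)
USD_TO_TTD = 6.3
price_gap_usd = 90
print(price_gap_usd * USD_TO_TTD)  # ~567 TTD, i.e. "more than five hundred, almost six hundred"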
-
I'd say it's worth it if money is really tight and you'd rather put the savings toward a faster CPU, but as far as I remember from the reviews, the 512MB model was considerably faster.
-
The 512 is considerably faster and would be considered a future-proofing buy.
At the speeds CPUs are at, it's almost a null issue;
games have been more GPU-bound for a while.
So even if yuh not playing on higher resolutions NOW,
you could be later,
or when the next gen rolls around you won't be completely inadequate.
-
GeForce 8800GS 192-bit 384MB/768MB GDDR3
Hmmmm competition for Daamit 3850/3870...
http://en.expreview.com/?p=151
-
8800GT 512MB eVGA US$199 (http://www.amazon.com/e-GeForce-512MB-PCI-E-Graphics-512-P3-N801-AR/dp/B0010T8XFY/ref=sr_1_6?ie=UTF8&s=pc&qid=1207842251&sr=1-6/)
Came across this today. Talk about a sweet deal, if only I was wukkin :crybaby2:
-
Damn...great deal indeed.
*ponders*
-
I really think Amazon has some once-in-a-lifetime deals, cause if you bookmark that page and monitor the price, it will most likely change every week. But man, I really need to get a job and utilize
TTdirectbuy and get some serious goodies.
Just seeing that makes me want to sprang something....