
SnowiE's timedemo test (Updated 25th Jan 2011)

Created 24th June 2010 @ 15:24


Skyweb

CPU: Intel Q6600
RAM: 2GB DDR2
GRAPHICS CARD: GeForce 8800GT

RESOLUTION: 1680×1050
WINDOWED/FULLSCREEN: windowed/noborder

Model Detail: low
Texture Detail: low
Shader Detail: low
Water Detail: simple
Shadow Detail: low
Colour Correction: Disabled
Antialiasing: off
Filtering Mode: bilinear
Wait for vertical sync: Disabled
Motion Blur: Disabled
Field of view: 90
Multicore Rendering: enabled
High Dynamic Range: none
Direct X Level: 8.0

Draw viewmodels: On
FPS Config Used (if any): chris’s fps config (maxframes)

Average FPS over 3 demo playthroughs: 111.57

and now the test with other settings:

Model Detail: high
Texture Detail: very high
Shader Detail: high
Water Detail: all
Shadow Detail: high
Colour Correction: enabled
Antialiasing: none (0.o)
Filtering Mode: anisotropic 16x
Wait for vertical sync: disabled (-.-)
Motion Blur: enabled
Field of view: 90
Multicore Rendering: enabled
High Dynamic Range: full
Direct X Level: 9.8

Draw viewmodels: On
FPS Config Used (if any): chris’s fps config (maxquality)

Average FPS over 3 demo playthroughs: 47.88

…. WHY do i play with a config -.-
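For scale, the two runs above work out to roughly a 2.3× gap. A quick check (numbers taken straight from the post; variable names are my own):

```python
# Compare Skyweb's two runs: maxframes (all-low) vs maxquality (all-high).
maxframes_fps = 111.57
maxquality_fps = 47.88

gain = maxframes_fps / maxquality_fps
print(f"{gain:.2f}x faster with the low-detail config")  # 2.33x
```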

kim

idd.

CPU: AMD Phenom X3 8750 Black Edition
RAM: OCZ Titanium 4GB DDR2
GRAPHICS CARD: ASUS ATi Radeon HD 4850 512MB DDR3

WINDOWED/FULLSCREEN: Fullscreen

Video Advanced Settings (Options > Video > Advanced) –
Model Detail: Low
Texture Detail: Low
Shader Detail: Low
Water Detail: Simple reflections
Shadow Detail: Low
Colour Correction: Disabled
Antialiasing: None
Filtering Mode: Bilinear
Wait for vertical sync: Disabled
Motion Blur: Disabled
Field of view: 90
Multicore Rendering: Enabled
High Dynamic Range: None
Direct X Level: 8.0

Draw viewmodels: Off
FPS Config Used (if any): http://fakkelbrigade.eu/chris/configs/cfg/maxframes.htm
Picmip forced to 10 via ATi Tray Tools

Average FPS over 3 demo playthroughs (1024×768, STEAM overlay enabled, core0 affinity enabled): 118.01
Average FPS over 3 demo playthroughs (1024×768, STEAM overlay enabled, core0 affinity disabled): 122.06
Average FPS over 3 demo playthroughs (1024×768, STEAM overlay disabled, core0 affinity disabled): 121.48
Average FPS over 3 demo playthroughs (1680×1050, STEAM overlay enabled, core0 affinity enabled): 115.97
Average FPS over 3 demo playthroughs (1680×1050, STEAM overlay enabled, core0 affinity disabled): 118.59
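The "core0 affinity disabled" runs above keep the game off CPU 0, where most OS and driver interrupt work lands. A minimal sketch of the idea (the `disable_core0` helper is my own; `os.sched_setaffinity` is Linux-only — on Windows, as in this thread, the same thing was done via Task Manager's Set Affinity dialog or ATi Tray Tools):

```python
import os

# Keep a process off CPU 0, mirroring the "core0 affinity disabled"
# runs above. pid=0 means the current process.
def disable_core0(pid=0):
    allowed = os.sched_getaffinity(pid)   # CPUs the process may run on
    if len(allowed) > 1:                  # never leave it with no CPU
        os.sched_setaffinity(pid, allowed - {0})
    return os.sched_getaffinity(pid)

print(sorted(disable_core0()))  # e.g. [1, 2, 3] on a quad-core
```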



aksu

CPU: Phenom II X4 945 @ 3.0GHz
GPU: Vapor-X HD 5770 1GB OC
RAM: Buffalo 4×1GB 800MHz DDR2

Fullscreen 1024×768 on a 17″ crt

Video Advanced Settings (Options > Video > Advanced) –
Model Detail: Low
Texture Detail: Low
Shader Detail: Low
Water Detail: Simple reflections
Shadow Detail: Low
Colour Correction: Disabled
Antialiasing: 8x
Filtering Mode: Bilinear
Wait for vertical sync: Disabled
Motion Blur: Disabled
Field of view: 90
Multicore Rendering: Enabled
High Dynamic Range: None
Direct X Level: 8.0

More’s highfps cfg with mat_filtertextures 1

Average FPS over 3 demo playthroughs: 205

AnimaL

CPU: Intel Q6600 @ 3.4ghz
RAM: 4GIG DDR2 RAM – W7 64bit
GRAPHICS CARD: 8800GT 512MB

RESOLUTION: 1280×960
WINDOWED/FULLSCREEN: FULLSCREEN

Video Advanced Settings (Options > Video > Advanced) –
Antialiasing: None
Direct X Level: 9.5

FPS Config Used (if any): Chris moreframes @ dx9

Normal run:
5125 frames 35.618 seconds 143.89 fps ( 6.95 ms/f) 13.887 fps variability
5125 frames 34.455 seconds 148.75 fps ( 6.72 ms/f) 14.685 fps variability

Core 0 – Disabled:
5125 frames 34.100 seconds 150.29 fps ( 6.65 ms/f) 14.374 fps variability
Which is quite fun :D

Game Booster, core 0 disabled:
5125 frames 33.609 seconds 152.49 fps ( 6.56 ms/f) 14.290 fps variability
(thats +10fps from normal run. I was able to get this gain constantly. Even without bloat ware like gamebooster. )
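The raw `timedemo` lines above follow a fixed format, so averaging runs can be scripted instead of done by hand. A small sketch (the regex and the `average_fps` helper are my own, not part of the game):

```python
import re

# Parse Source-engine timedemo result lines like
# "5125 frames 35.618 seconds 143.89 fps ( 6.95 ms/f) 13.887 fps variability"
# and average the fps figures across runs.
LINE = re.compile(r"(\d+) frames ([\d.]+) seconds ([\d.]+) fps")

def average_fps(lines):
    rates = [float(m.group(3)) for m in map(LINE.search, lines) if m]
    return round(sum(rates) / len(rates), 2)

runs = [
    "5125 frames 35.618 seconds 143.89 fps ( 6.95 ms/f) 13.887 fps variability",
    "5125 frames 34.455 seconds 148.75 fps ( 6.72 ms/f) 14.685 fps variability",
]
print(average_fps(runs))  # 146.32
```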



octochris

(0v0)

Quoted from Skyweb

Antialiasing: none (0.o)

that’s a bug with the menu, it doesn’t support the value “16”.

CPU: AMD Phenom II X2 555 Processor 3.20GHz
RAM: 2GIG DDR2 RAM
GRAPHICS CARD: Nvidia GeForce GTX 260 896MB

RESOLUTION: 1920×1080 (native)
WINDOWED/FULLSCREEN: Fullscreen

Video Advanced Settings (Options > Video > Advanced) –
Model Detail: High
Texture Detail: Very High
Shader Detail: Low
Water Detail: Reflect All
Shadow Detail: Medium
Colour Correction: Disabled
Antialiasing: None
Filtering Mode: Anisotropic 4x
Wait for vertical sync: Disabled
Motion Blur: Disabled
Field of view: 90 (110 actually)
Multicore Rendering: Enabled
High Dynamic Range: None
Direct X Level: 8.0

Draw viewmodels: On
FPS Config Used (if any): Chris’ moreframes http://fakkelbrigade.eu/chris/configs/cfg/moreframes.htm

Average FPS over 3 demo playthroughs: 124.83fps

freshmeatt

‹Con›

OK, so this: http://valid.canardpc.com/show_oc.php?id=1260511 gave me 7296 points in 3DMark03 and an average of 51.51 FPS over 3 demo playthroughs.

Yet these: http://img686.imageshack.us/img686/8783/dsc00093tf.jpg http://valid.canardpc.com/show_oc.php?id=1261189 http://img14.imageshack.us/img14/7936/configgd.png (with Event Log, DHCP/DNS Client, P&P, Windows Audio and Themes enabled) gave me 8536 points and 54.33 FPS.
I have no idea what else I can do now. I could just push the gfx memory to 400MHz and oc the core a bit more, but that won’t give me any notable boost whatsoever.



AnimaL

why would you underclock ur cpu and hope for better fps…

when i had a p4, 50fps was top on middle, usually dropped to even 30, now with all updates, 50fps seems a rather good result + i had 3.2ghz, you only 2.6ghz

also, the p4 might have an advantage from disabling HT (used to be like that before dual core support, dunno now)



octochris

(0v0)

3DMark scores have little to no diagnostic value.

freshmeatt

‹Con›

Quoted from AnimaL

why would you underclock ur cpu and hope for better fps…

Can’t get it to 3.2GHz because of memory, I’ve written that already. My RAM is PC2700 (DDR333), which is already faster than my mobo can theoretically support (PC2100, DDR266). 175×16 = 2800MHz; for 3.2GHz I’d need 3200/16 = 200MHz FSB, and 200×2 = 400MHz of effective memory frequency, about 1.5× faster than it could possibly run at.
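The FSB arithmetic in the post can be sanity-checked in a few lines (the ×16 multiplier, the DDR doubling, and the DDR266 ceiling are taken from the post; the variable names are mine):

```python
# Check the post's FSB arithmetic: with a locked x16 multiplier,
# a 175 MHz FSB gives a 2800 MHz core, and reaching 3.2 GHz needs
# a 200 MHz FSB. DDR memory clocked 1:1 with that FSB runs at an
# effective 400 MHz -- about 1.5x the DDR266 the board supports.
MULTIPLIER = 16

core_mhz = 175 * MULTIPLIER            # 2800
fsb_for_3200 = 3200 // MULTIPLIER      # 200
ddr_effective = fsb_for_3200 * 2       # 400
overshoot = ddr_effective / 266        # vs PC2100 (DDR266)

print(core_mhz, fsb_for_3200, ddr_effective, round(overshoot, 2))  # 2800 200 400 1.5
```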
Quoted from octochris

3DMark scores have little to no diagnostic value.

I know, I posted them just for comparison’s sake. Every time I increased the clock by 10MHz, I got ~150 more points in these tests (lolwut).



AnimaL

keep ocing ur cpu so mobo dies and you have to stop using stone age pc for gaming

or just stop whining, you wont get more fps by ocing ur gpu, ur fps is rather good for that box

btw me wants to see test from guys who have more than 4 cores



octochris

(0v0)

Quoted from AnimaL

keep ocing ur cpu so mobo dies

Quoted from AnimaL

you wont get more fps by ocing ur gpu

that’s some terrific bullshit you’re talking, do carry on.

AnimaL

Quoted from octochris

[…]

[…]

that’s some terrific bullshit you’re talking, do carry on.

i had the same pc build as the guy, just with normal ddr ram and mobo – same fps on any resolution, gpu dependent? :D

not to mention that with any fps cfg ur gpu in tf2 uses 15-50% of its power at most (depending on how good it is)

the only thing that affects gpu usage would be resolution, yet it would require extremely shitty gpu to see the difference on anything thats below 1080p. Even my laptops 4570 doesnt have more than 30% usage @ HD ready res…

funny how u make great cfgs yet dont understand how game works

p.s. try OC 7600`s – the actual performance at ~20% oc would give u not more than 2fps… you can look that up in google ma friend



jgmaster

BM

Shift key :<

octochris

(0v0)

Quoted from AnimaL

i had same pc build as the guy, just with normal ddr ram and mobo, same fps on any resolution

and that makes your nonsense correct? if you genuinely think that every cpu and gpu die is the same then i can do little else than point at you with mocking laughter. stop talking bullshit.

Quoted from AnimaL

not to mention that with any fps cfg ur gpu in tf2 uses 15-50% of its power at most (depending on how good it is)

and that is supposed to mean what exactly? no matter how much it uses, a higher clock speed means a higher fps. you’re not using 100% of your CPU all of the time but you would notice the difference of a large OC whether you were using 50% or 100%. the same goes for a gpu. stop talking bullshit.

Quoted from AnimaL

the only thing that affects gpu usage would be resolution

the only thing that affects gpu usage would be resolution? stop talking bullshit.

Quoted from AnimaL

it would require extremely shitty gpu to see the difference on anything thats below 1080p

i wonder what the demographic of fps configs are? stop talking bullshit.

Quoted from AnimaL

funny how u make great cfgs yet dont understand how game works

funny how you talk such bullshit and yet talk such bullshit. stop talking bullshit.

Quoted from AnimaL

p.s. try OC 7600`s – the actual performance at ~20% oc would give u not more than 2fps… you can look that up in google ma friend

the performance difference when ocing does not work like that; it works based on a number of factors, not to mention it is not entirely relative, and it is linear based on the starting point. stop talking bullshit.

please try harder next time, this post wasn’t even a challenge.


