Shady goings-on at GeCube?

Q&A, advice, reviews, and news about the computers, phones, TVs, stereos, and pretty much anything else that can't be easily whittled out of a stick or chipped out of stone.
FireAza
Redshirt
Posts: 12806
Joined: Fri Feb 14, 2003 10:59 am
Gender: Male
Location: Hasuda City, Japan

Shady goings-on at GeCube?

Post by FireAza » Fri Oct 28, 2005 9:14 am

Twice now I've emailed GeCube asking why the ATI 9800XT they manufactured runs at a default core clock speed of 398 MHz, as opposed to the 412 MHz that's normal for the 9800XT, and twice they've ignored my emails. What do you guys think? Are they trying to hide something? It's really pissing me off that the only way to get the card to run at the default speed is to overclock it. I'd love to see how much faster it would run if it started off at the correct speed.
*EDIT* corrected the core speed, it's 398, not 405 :P
Last edited by FireAza on Mon Oct 31, 2005 12:31 am, edited 1 time in total.
"For AUS$300, you get FireAza drawing your screen image." -MartinBlank "Oh shit. For once, FireAza is right." -Deacon
"FireAza, if you're really that sneaky and quiet then you can sleep in my bed anytime, mister." -kizba

Gowerlypuff
Redshirt
Posts: 2900
Joined: Fri Feb 14, 2003 12:53 pm
Gender: Male
Location: Leamington Spa, UK

Post by Gowerlypuff » Fri Oct 28, 2005 10:17 am

Looking at the GeCube site, you're either running the 9800 Pro clock: http://www.gecube.com/products-detail-s ... cification

or something that does 500 MHz. Take your pick: what does the box look like?
Sloth: Am I a year behind already?
February was some lyrics or quotes month or something. I don't even remember what year all this was.

Post by FireAza » Fri Oct 28, 2005 10:46 am

That's the 256MB edition you linked; I have the 128MB version, which doesn't seem to appear on their site. Either way, according to the store I bought my card from, the only difference between ATI cards manufactured by different partners is the size and brand of memory they use. Sure, there have been examples of pre-overclocked cards being sold, but I don't think it would be legal to intentionally sell underclocked cards without informing the customer.

edge
Redshirt
Posts: 3376
Joined: Mon Jun 02, 2003 9:43 pm
Gender: Male
Location: Pittsburgh, PA

Post by edge » Fri Oct 28, 2005 1:30 pm

That's really not a big enough difference to be concerned about. I highly doubt the card is underclocked; I'd suspect the speed just isn't being reported completely accurately. Ever notice how, on a lot of systems, System Properties shows something like "X-brand CPU 2.0 GHz: 1.72 GHz"?

I suspect that's what you're looking at. Probably nothing to worry over.

Post by FireAza » Fri Oct 28, 2005 1:45 pm

Well, it's being reported incorrectly in two places then, both within ATI's control centre. I'd hope ATI wouldn't screw up the reporting of its own hardware. It just makes me wonder why a friend of mine can play Black & White 2 really smoothly while it lags like all hell on mine. The only part of his system that beats mine is the CPU: he's got an AMD Athlon 64 3000+ and I have an AMD Athlon XP 3000+. He's got an ATI 9600 XT manufactured by Sapphire, by the way. Also, considering the fan on this card fails every time I have it replaced, I'm getting the impression that GeCube is a shady company.

Infin8Cyn
Redshirt
Posts: 6309
Joined: Tue Apr 29, 2003 10:02 pm
Real Name: James
Gender: Male
Location: Albuquerque, New Mexico

Post by Infin8Cyn » Fri Oct 28, 2005 1:56 pm

Big difference between the Athlon 64 3000+ and the Athlon XP 3000+.

I'd also say it's just system differences, tweaks, or something you've got running in the background. I got my 9800 Pro to overclock to something like 45 MHz past its base (it's been 5 months) and I saw diddly for changes.

Post by edge » Fri Oct 28, 2005 6:18 pm

You should also consider things like memory, both system and video. It could just be that the video memory on your card is slower or lower quality. It's not so much that ATI's software is flawed; it's doing its job. That's just the data it gets back from the card or system, depending on how it determines that value. Are you running AGP 4x or 8x?

Lots of "little" things that can make a big difference :-/

Something like a difference of 7MHz isn't going to make a very big difference ;)

Post by FireAza » Sat Oct 29, 2005 12:10 am

A bit more detail: I have 1024MB of DDR400 RAM, and my card is running at AGP 8x. My friend has basically not tweaked his system, and I never run anything in the background when playing games. Yes, the memory on my card is also clocked slower than it should be.

Posted Mon Oct 31, 2005 10:35 am:

I posted a question about the performance of B&W2 on Lionhead's forums; it seems I'm not the only one having problems despite having good hardware. Still, that has nothing to do with my card seemingly being sold underclocked and GeCube trying to dodge my enquiries!

Post by Infin8Cyn » Mon Oct 31, 2005 2:57 pm

That reminds me:
Black & White 2

Boost your average FPS

If, like me, you started the game and got really low FPS despite having a decent graphics card, you should check out this post to fix the problem, or just to boost your average frames per second.

There have been several topics where a graphics card does not run the game at a playable frame rate, despite being, for example, a 6600. People have set the graphics to the lowest possible setting and still got low FPS. This is because the game, depending on your graphics card, decides whether to enable bump mapping, real-time shadows, and the 2.0 Microsoft shaders (there are probably more options I'm not aware of). These options are not disabled by setting the graphics to low, which is why the FPS stays low despite the change: whichever in-game graphics setting you pick, you will still see the real-time shadows and so on.

IMO, the game sets these complex options too high for the graphics card, resulting in low FPS no matter how you alter the in-game video options. So, to turn the complex options off, you have to find and edit the graphics configuration file. This is found in the B&W2 program files, for example:

Program Files>Lionhead Studios>B&W2>Data

In the Data folder, you should see a config file called 'Graphics'. Open it up in Notepad. At the top of the text file, you should see three main options, with the meanings of 1, 2 and 3 next to them. Find your graphics card in the list, then alter the three numbers next to it, the most important one to change being the shaders number. Once you have turned the shader number down, the complex options should be removed, giving you good FPS. The Res and Detail options could potentially be turned up, depending on your card. This is much simpler to understand when you view the config file for yourself.

I can guarantee that this method will boost your FPS and disable the shaders that your graphics card cannot handle (even though the game thinks it can). The game will still use fur and vegetation effects, just with the downgraded shader versions; you will not see bump mapping on rocks, for example, or real-time shadows. Hopefully there will eventually be a proper config file like the ones used in Doom 3 or HL2, so you can alter individual options instead of a collective set. Until then, try this method and tell me what you think. Feel free to ask me any questions. Thanks for reading.
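The edit described above can be sketched in code, but note that everything in this sketch is an assumption: the thread never shows the actual B&W2 'Graphics' file, so the layout used here (a card name followed by three numbers for Res, Detail and Shaders) and the sample card names are placeholders for illustration only. Back up the real file before touching it.

```python
# Hypothetical sketch of the manual edit described above.
# Assumed (not confirmed) layout per line: <card name> <res> <detail> <shaders>

def lower_shaders(config_text, card_name, new_shader_level=1):
    """Return config_text with the shaders number for card_name lowered.

    Lines that don't match the assumed layout, or belong to other
    cards, are passed through unchanged.
    """
    out_lines = []
    for line in config_text.splitlines():
        parts = line.split()
        # Treat the last three fields as Res, Detail, Shaders and
        # everything before them as the card name.
        if len(parts) >= 4 and " ".join(parts[:-3]) == card_name:
            res, detail, _ = parts[-3:]
            line = f"{card_name} {res} {detail} {new_shader_level}"
        out_lines.append(line)
    return "\n".join(out_lines)

sample = "RADEON 9800 3 3 3\nRADEON 9600 2 2 2"
print(lower_shaders(sample, "RADEON 9800"))
```

With a sample in the assumed layout, only the shaders column for the named card changes; the other entries are left untouched, mirroring the "just turn the shader number down" advice.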

Deacon
Shining Adonis
Posts: 44234
Joined: Wed Jul 30, 2003 3:00 pm
Gender: Male
Location: Lakehills, TX

Post by Deacon » Mon Oct 31, 2005 3:30 pm

Infin8Cyn saves the day?
The follies which a man regrets the most in his life are those which he didn't commit when he had the opportunity. - Helen Rowland, A Guide to Men, 1922

Post by FireAza » Mon Oct 31, 2005 11:40 pm

That might explain it: my shaders and so on were automatically set to high for my 9800 :shock: Dear god, did Lionhead even test this?? Now to test. I wonder if the menu will still lag? It's not like menus are so intense that they bring all but the most powerful cards to their knees; it sounds more like a bug to me.

Post by Infin8Cyn » Tue Nov 01, 2005 3:13 pm

Yay/Nay/Disco?

Post by FireAza » Wed Nov 02, 2005 2:58 am

Nope, still runs like a hairy goat. Checking the file, it seems the default settings for the ATI 9600 (what said friend has) are everything on medium. I know he hasn't changed this, and B&W2 runs fine for him. If I can't run the game on those settings, there's something non-hardware-related going on here. Just had a thought: is it possible that, for some reason, B&W2 thinks my card is higher than a 9800? That might explain why the game runs so slowly and changing the settings doesn't make a difference. Then again, the game did look much less detailed with detail set to low...

Post by edge » Wed Nov 02, 2005 2:36 pm

Very rarely do games actually know what kind of card you have. The "settings for X video card" are just settings recommended by the game developer and/or card manufacturer for those particular cards. The OS itself has more to do with knowing what card you have, and really, the driver gives it all that info. You might check whether there's a different version of the driver for your card; check out http://guru3d.com. If they have any, try an alpha driver. I'm sure you're probably already using the latest Catalyst, but see if there's another option. Worth a shot.

Post by Infin8Cyn » Wed Nov 02, 2005 3:05 pm

Also make sure you thoroughly remove the previous ATI drivers first: Driver Cleaner Pro, and the works.
