GoldenEye 007 Nintendo 64 Community: Nintendo 64 Games Discussion
MRKane 007
Joined: 11 Dec 2008  Posts: 1076
Posted: Wed Jan 20, 2021 2:32 am  Post subject: Back on the Overclock warpath
Talking online with a friend, Hypatia, I got it into my head to have another play at overclocking the N64. Spurred on by this video (https://www.youtube.com/watch?v=NCbkDgCIKRQ), I figured there had to be something I'd missed.
So first on my list of things to resolve were the "tearing" artifacts that appear when boosting the X2 clock. I had a theory that it might be down to the VI reading the framebuffer from part of the RAM, so I dropped in the "better on paper" Toshiba RAM, but that changed nothing. Still, it's nice to know that the MS hasn't taken my ability to solder f*cking finicky little pins. Turns out the framebuffer either isn't in that RAM or the process has nothing to do with it. The issue persisted.
Trevor suggested clocking up both X1 and X2 in tandem. Results were as expected: garbled video after swapping X1 for a 20.0 MHz chip and X2 for a 17.7 MHz chip.
This was where I was talking with Hypatia about solutions, and she suggested trying the UltraHDMI - something I'd never tried previously. I said I was dubious, but I did have a spare ribbon and could swap one out of my main console. More horribly tiny soldering, yay. To my amazement, installing the UltraHDMI resulted in unscrambled video (https://www.youtube.com/watch?v=TUlZI7gzJyw).
My next test was to see if feeding the correct clock frequency into the MAV-NUS chip would get a stabilised video output. This didn't work either (https://www.youtube.com/watch?v=hZ5K5jRiKDQ). You can also see in this video my terrible "swappable clock" arrangement on the second console, which will be used for ongoing tests and is naturally as unstable as all hell. And yes, I should have checked with the stock clocks; and yes, I'm sure there's resonance in the line, which wouldn't have helped either; and yes, I'm totally open to suggestions here.
So where to from here? Frankly, the UltraHDMI giving a clean video signal with an X1 overclock is a fantastic result, but I'd like to push a bit further and see if I can get the clocks high enough to drop the CPU to the 1x multiplier instead of its standard 1.5x and thus restore normal play speed. I know there's a theoretical ceiling based on the system specifications, but I'm still going to try (and I wonder if that Toshiba RAM will perform better, like it claims to?).
Annoyingly, my GameShark has been dead for years, and with this clocking the Everdrive just doesn't boot, so I can't run GoldenEye with the MCM or Shadows of the Empire either. Thing is, the performance metrics wouldn't tell us much anyway, as both clocks have had a speed increase, so there's nothing to go off there.
Right, all that said, does anyone have any suggestions? I'm all ears here, and it'd be awesome to come up with a solution that could be written up for the community and used by anyone who wished to. I think the core issue is finding something that can handle the video once the X1 clock has been changed.
I'd also love to know whether going from the native 1.5x multiplier on the CPU down to 1x means I need to overclock it by 33% or 50% to get the same play rate - I'm actually not sure about this one.
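Sketching the arithmetic (on the simplifying assumption that the CPU clock is just X1 times the multiplier; the function name is mine, purely for illustration):

```python
def x1_for_same_cpu_speed(x1_stock, old_mult=1.5, new_mult=1.0):
    """X1 frequency needed at new_mult to match the CPU speed at old_mult,
    assuming CPU clock = X1 * multiplier."""
    return x1_stock * old_mult / new_mult

# The relative increase doesn't depend on the stock frequency:
# dropping from a 1.5x to a 1x multiplier needs X1 raised by 50%, not 33%.
print(x1_for_same_cpu_speed(1.0) - 1.0)  # 0.5, i.e. +50%
```

(33% would be the answer to the reverse question: a 1.5x clock is 50% above 1x, while a 1x clock is 33% below 1.5x.)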
P.S. Even though the gameplay is 12% faster, the extra 20% boost in FPS is noticeable, and PD is seriously enjoyable like this. Frustratingly, the console begins to overheat at about the 10 minute mark.
_________________
No Mr. Bond, I expect you to be re-coded!
MRKane 007
Joined: 11 Dec 2008  Posts: 1076
Posted: Thu Jan 21, 2021 12:16 am
So the assortment of clocks I ordered arrived today.
I could push X1 up to 25.0 MHz but no higher, as the console wouldn't boot.
X2 wouldn't boot at 20.0 MHz, and that was the next step up from the 17.7 MHz chip I had.
So, now within the realm of a 1.5x speed, I checked the timing for comparison and found it wildly out: a 26 second video I took captured 31.85 seconds of "game time", which suggests things aren't quite linear. I wired a switch onto the CPU to toggle between the 1x and 1.5x multipliers and didn't feel that 1x gave better performance. Interestingly, at 1x the "game speed", as shown by the clock in PD, was slower than actual time, with a 26 second video capturing only 20.55 seconds of game time.
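Putting rough numbers on those two captures (the helper and variable names here are mine, just for illustration):

```python
def game_speed_ratio(game_seconds, video_seconds):
    """How fast the in-game clock runs relative to real (video) time."""
    return game_seconds / video_seconds

# 1.5x multiplier: 31.85s of game time in a 26s capture
fast_15x = game_speed_ratio(31.85, 26.0)  # ~1.22, i.e. about 22% fast

# 1x multiplier: 20.55s of game time in a 26s capture
slow_1x = game_speed_ratio(20.55, 26.0)   # ~0.79, i.e. about 21% slow
```

So the 1x setting isn't "ever so slightly" slower by these numbers; it's running at roughly four-fifths of real time.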
Frustratingly, I still can't get the Everdrive to work, and even then it wouldn't be a good measure of the performance improvement, as both clocks have been changed, so this is left sitting in the realm of non-quantitative measurement, i.e. videos and guesswork.
https://www.youtube.com/watch?v=miuaTgaULt0
https://www.youtube.com/watch?v=Y-rxROhngj0&t=1s
And you bet my pet bird "Fluffy" helped out, and I think that's why there are green dots in the HDMI feed - he pulled on the ribbon a bit too much.
https://www.youtube.com/watch?v=3NxiZBS_-pE
So... drawing to the end of a long road for me, and I really don't feel that I've found a "win" at this stage. I'm sure I could get a 20% boost without corrupted video if I could use the Everdrive to adjust the gameplay speed of Perfect Dark or GoldenEye directly, but the Everdrive just won't boot.
I think for the next part of this experiment I'll work through the different X2 clocks and find the point at which the video tearing begins - I might be able to eke out a little performance that way, and a little is better than nothing.
_________________
No Mr. Bond, I expect you to be re-coded!