

N64 console - DVI/HDMI/VGA adapter in the works
bmw
Hacker


Joined: 04 Jan 2006
Posts: 1366
Location: Michigan

Posted: Wed May 14, 2014 2:46 pm

While it did load in-game (see the bottom-most screenshot), that was with all portals turned off (max visibility on), and I can only imagine what a nightmare it would be to properly portal this thing so that any semblance of a reasonable framerate could be achieved in single-player mode.
 
MultiplayerX
007


Joined: 29 Jan 2006
Posts: 1210
Location: USA

Posted: Wed May 14, 2014 3:49 pm

Ah, understandable.

I've been working on something too ;)

https://www.youtube.com/watch?v=imi7wrdbtyI&feature=youtu.be

Should get the Perfect Dark fans' blood flowing. Right now, as it stands, it's in rough beta, but it looks really nifty ;)
_________________
http://codelegends.proboards.com/
 
MRKane
007


Joined: 11 Dec 2008
Posts: 1073

Posted: Wed May 14, 2014 5:05 pm

I've actually just purchased a GameCube S-Video cable and plan on giving it a crack with the N64. The LCD TV I've got at home is a weird, obscure thing that actually has zoom functions built in, which might be put to good use expanding the smaller N64 picture to full screen (if it does shrink it, that is).

It is official hardware, and I've seen videos of people having success with them - mainly in the way of removing the blur from pixels that you get through analogue systems - so I'll post back when I get around to hooking it up and trying it out!

Another thing we could do is run it through some real-time video processing software (so N64 to computer to TV). My machine used to have a graphics card that could have done that, but alas... it exploded. The trick would be to run it through an edge-preserving enlargement algorithm - most of which are freely available by reading thesis papers. I think that'd give the most "attractive" method of improving an N64 image to high definition. Wonder if you could process it through an Arduino core?
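For the curious, here's a minimal sketch of one such edge-preserving enlargement in C - the well-known Scale2x/EPX rule, which doubles an image while keeping hard edges. The function name and the flat 32-bit pixel buffer layout are illustrative, not taken from any particular paper:

#include <stdint.h>

/* Scale2x/EPX: doubles a w x h image of 32-bit pixels into dst (2w x 2h).
   A pixel is only ever copied, never blended, so edges stay crisp. */
static void scale2x(const uint32_t *src, uint32_t *dst, int w, int h)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            uint32_t P = src[y * w + x];
            uint32_t A = src[(y > 0 ? y - 1 : y) * w + x];     /* up    */
            uint32_t B = src[y * w + (x < w - 1 ? x + 1 : x)]; /* right */
            uint32_t C = src[y * w + (x > 0 ? x - 1 : x)];     /* left  */
            uint32_t D = src[(y < h - 1 ? y + 1 : y) * w + x]; /* down  */
            uint32_t *o = &dst[(2 * y) * (2 * w) + 2 * x];
            o[0]         = (C == A && C != D && A != B) ? A : P;
            o[1]         = (A == B && A != C && B != D) ? B : P;
            o[2 * w]     = (D == C && D != B && C != A) ? C : P;
            o[2 * w + 1] = (B == D && B != A && D != C) ? D : P;
        }
    }
}

Since it only ever copies existing colours, it sharpens edges without inventing new ones, and it's cheap enough to run in real time.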
_________________
No Mr. Bond, I expect you to be re-coded!
 
Trevor
007


Joined: 15 Jan 2010
Posts: 926
Location: UK, Friockheim OS:Win11-Dev PerfectGold:Latest

Posted: Tue Nov 11, 2014 12:31 pm

I kinda wanted to mention something here that goes against MultiplayerX's and bmw's recommendations and reasoning.

The N64-HDMI does not deal with ANY analogue signal.
The whole reason you have to solder is that it takes the video from the framebuffer before it's passed to the VI (Video Interface).

Any converter past the VI and DAC (i.e. plugged into the existing AV-out socket) will not improve anything. All it will do is change the signal format; the underlying image will still be the same.


Now, onto a problem I found with taking the framebuffer directly: the VI imparts the final layer of AA (both parts of which are required - the first is done in the RCP per face (an enable flag and an edge value), and the final one in the VI, which takes the edge value and then blends).

So without the VI there is effectively NO AA, and games burn cycles calculating the edge value for nothing.

The VI also does dithering, to prevent Mach bands in 16-bit framebuffers (most commercial games).
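As an aside, a minimal sketch of why dithering hides Mach bands: quantising 8-bit channels down to the 5 bits of a 16-bit framebuffer leaves visible steps of 8, and adding a small position-dependent bias before truncating spreads that error so the eye averages it out. The Bayer matrix here is a stand-in for illustration, not the VI's actual filter:

#include <stdint.h>

/* 4x4 ordered-dither matrix, values 0..15. */
static const uint8_t bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Quantise an 8-bit channel to 5 bits with an ordered-dither bias of
   0..7 (one quantisation step), clamped to avoid overflow. */
static uint8_t dither_to_5bit(uint8_t c, int x, int y)
{
    int v = c + (bayer4[y & 3][x & 3] >> 1);
    if (v > 255) v = 255;
    return (uint8_t)(v >> 3);
}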

The greatest benefit of taking the framebuffer signal directly would be with 640x480x32 framebuffers, where you would get a true 32-bit colour signal, as against the 21-bit signal out of the VI. (Why oh why 21-bit, Nintendo... for all that the extra 3 bits would cost to make it 24-bit true colour...)

Trev
zoinkity
007


Joined: 24 Nov 2005
Posts: 1684

Posted: Wed Nov 12, 2014 8:15 am

Having seen pictures of the output from its development, I can say AA is applied. It intercepts at the point of signal encoding.

On CRTs you can't tell the difference between 21- and 32-bit colour. Unless you have a gradient, chances are you can't tell the difference anyway.
The N64 can output 32-bit just fine. Actually, it does so in a slightly different way programmatically.
Most games don't bother, though, since not only is the output quality difficult to judge, but in 2-cycle output the framebuffers would take double the space. 1-cycle takes the same space as typical double-buffered 21-bit output but has some interesting quirks. Instead of fussing over 32-bit colour it would make much more sense to attempt triple-buffering.
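To put rough numbers on the space argument (illustrative arithmetic, not from the post above) - on a stock 4MB console, double-buffering a 320x240 32-bit framebuffer already costs as much as triple-buffering a 16-bit one:

#include <stdio.h>

/* Illustrative framebuffer budget at 320x240. */
int main(void)
{
    const int w = 320, h = 240;
    const int kb16 = w * h * 2 / 1024;  /* 16-bit: 150 KB per buffer */
    const int kb32 = w * h * 4 / 1024;  /* 32-bit: 300 KB per buffer */
    printf("double-buffered 16-bit: %d KB\n", 2 * kb16);  /* 300 KB */
    printf("triple-buffered 16-bit: %d KB\n", 3 * kb16);  /* 450 KB */
    printf("double-buffered 32-bit: %d KB\n", 2 * kb32);  /* 600 KB */
    return 0;
}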
_________________
(\_/) Beware
(O.o) ze
(> <) Hoppentruppen!
 
Trevor
007


Joined: 15 Jan 2010
Posts: 926
Location: UK, Friockheim OS:Win11-Dev PerfectGold:Latest

Posted: Wed Nov 12, 2014 10:01 am

Not that I question your knowledge, but 16-bit = 32 shades each of R, G and B = 32,768 different combinations.
While this is a big number, it still boils down to 32 different shades across a given 0-255 gradient, which IS noticeable. (Having done a lot of textures for GF I also know from that experience how noticeable it is, and I try to reduce it. Funny, though, because I'm renowned for trying to stick to 4-bit textures hahaha)

At 32-bit (of which 8 bits are wasted on a video display, so really we are talking about the difference between 21-bit and 24-bit), each colour has 256 levels, making the total 16,777,216 colours.

However, at 21-bit that 256-level gradient is cut to 128 (half the shades per channel).
Half the shades just by dropping 1 bit per colour channel... seems such a waste...
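The arithmetic above in runnable form - shades per channel at the three depths under discussion:

#include <stdio.h>

/* Shades per channel and total colours for 16-bit (5 bits/channel),
   the 21-bit VI bus (7 bits) and 24-bit true colour (8 bits). */
int main(void)
{
    const int depths[] = { 5, 7, 8 };
    for (int i = 0; i < 3; i++) {
        long shades = 1L << depths[i];
        printf("%d bits/channel: %ld shades, %ld colours\n",
               depths[i], shades, shades * shades * shades);
    }
    return 0;  /* 32 -> 32,768; 128 -> 2,097,152; 256 -> 16,777,216 */
}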


Ah, cool, it's interesting to know that the AA is still being applied. So the description of it being soldered 'before' the VI is incorrect... it should be 'on' the VI? Or is the FPGA taking on the role of the VI and doing the AA itself?
That would be kinda cool, since these little boards are quite powerful and could, I suppose, be made to do better AA than the original VI.
Also, if they took over the dithering, they could do it post-upscaling, so that the dither pattern doesn't get 'blown up' and made noticeable.
(Dither patterns at a 1:1 pixel relationship are quite effective, but when the pixels are expanded they become an eyesore, probably more so than Mach bands.)

Trev
MRKane
007


Joined: 11 Dec 2008
Posts: 1073

Posted: Wed Nov 12, 2014 3:42 pm

Forgot to post a follow-up on that S-Video:
Pixels were noticeable, with no smoothing blur.
Gamma was through the roof, and I was only *just* able to bring it down through the TV.

In my honest opinion it's more of a novelty. Some games (like Mario Kart) seemed to look a bit tacky (possibly because the characters were sprites), but others with polygon graphics generally got a bit "sharper". If anything, XG and Beetle Racing really seemed to benefit from being more pixelated instead of blurred, as it made the graphics more "understandable"; but in some other games (oddly, Space Circus) it really felt like one was being bombarded by blocks of colour! lol

I personally think that "understandability" is what I want out of any N64 upscaling, as a few too many games (PD included) tended to get difficult to make out - and that's a problem ;)
_________________
No Mr. Bond, I expect you to be re-coded!
 
zoinkity
007


Joined: 24 Nov 2005
Posts: 1684

Posted: Thu Nov 13, 2014 9:29 am

16-bit really was difficult for many people to distinguish at the time (circa 1995 - the info they would have looked at during development).

Output has to be 32-bit, not 24. "Alpha" is technically "no colour scanned", which, if you look at the difference between framebuffers and how they're visually combined, is very important. A "lazy" screenshot of a single FB will not represent what you see on screen.

The point of HDMI output isn't so much upscaling as dealing with a different type of screen. The N64 was expected to play on a CRT TV. Its output is interlaced, and its visual quality was tailored to what those screens could properly represent. They still had to account for the "visible box" - the borders of a CRT limiting what could actually be seen versus what was projected.
Fast-forward to today, and you have larger screens, different resolutions, aspect ratios, etc. They're not interlaced, which is probably the biggest problem that has to be dealt with. It makes sense to alter the output not so much for quality's sake - technically RGB would be the most accurate output - but to better meet the expected display format.
_________________
(\_/) Beware
(O.o) ze
(> <) Hoppentruppen!
 
Trevor
007


Joined: 15 Jan 2010
Posts: 926
Location: UK, Friockheim OS:Win11-Dev PerfectGold:Latest

Posted: Thu Nov 13, 2014 10:54 am

Would you mind expanding on that a bit?
I understand alpha in image compositing, but I just don't get it in display... why would a display require 255 values of 0?

Do you happen to have 2 framebuffer caps?


As to the reasons for HDMI, I guess I must be the opposite of the majority, in that I feel that if you are going to spend the time modifying the console, why not better it?
So, back to HDMI, and upscaling not being the primary reason: I say, why not?
The picture I get from my AV leads is perfectly acceptable.
If I spent the time and money on modding the output I'd expect a big difference: things like post-upscale dithering for 16-bit games and maybe even 2xSaI (or whatever the acronym is) to help smooth off jaggies - better yet if it could utilise the coverage value.

Trev
zoinkity
007


Joined: 24 Nov 2005
Posts: 1684

Posted: Fri Nov 14, 2014 8:18 am

The display doesn't require 256 levels of alpha. What data signals require is aligned data. Therefore, if you have a video buffer with 24-bit colour, you still need to composite in alpha. Even if that is a single bit, the data must be aligned in some sensible way. Accepting separate fields of alpha and colour data would require hardware modification, unless you combine them into a single buffer - 32-bit colour.
At some point I read that the single bit of alpha is actually pushed to a 2-3 bit thing internally. I'd have to ask some of the dev people whether that's really the case, though.

Completely forgot about this, but a certain degree of colour accuracy is dependent on the conversion from internal RGB to YUV. The range of the output YUV, the natural colour bias of YUV, and the method of conversion all have a bearing on what colour you get in the end. So, that said, it's a sad truth that even 24/32-bit colour will get dithered to all snot in some cases. Controlling or normalising that output has its advantages.

Of course, you can go with an RGB mod if you aren't interested in HDMI ;*) HDMI isn't completely the same anyway. For instance, phosphors in a CRT fade at a slower rate than a pel will blacken out. Anything tuned "by eye" could be thrown off.
_________________
(\_/) Beware
(O.o) ze
(> <) Hoppentruppen!
 
Trevor
007


Joined: 15 Jan 2010
Posts: 926
Location: UK, Friockheim OS:Win11-Dev PerfectGold:Latest

Posted: Wed Feb 11, 2015 8:55 am

Apparently there is some information that this topic lacks. To fill it in, and thereby correct my mistakes above, I will post a modified and compiled version of it - with a few inline edits and modifications :P.
I will let the source remain anonymous, but I won't take credit myself; that would just be wrong.



The VI is an internal component of the RCP; you cannot tap its input data unless you are running an emulator, or you have some really advanced (expensive/impossible) equipment to tap the internal busses of an IC xD.

Block Diagram - RCP: [image]


The VI generates 21-bit RGB video data (plus sync signals and other stuff) that it sends out of the RCP and into the video DAC.
As for the 21-bit issue, it's a hardware bus limitation: it's a 7-bit-wide bus that sends the R, G, B and sync bytes sequentially, plus a line for word alignment and a clock line. Over that 21-bit RGB bus you are getting upsampled 16-bit RGB content - nothing to complain about, I'd say.

Edit -
- Only if the VI is set to the dither filter in 16-bit mode.
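A toy sketch of that packing - one 21-bit pixel leaving the RCP as three 7-bit transfers. The actual word order, sync insertion and alignment framing are the hardware's business; the names here are illustrative only:

#include <stdint.h>

/* Split one 21-bit RGB word into three 7-bit bus transfers (R, G, B).
   put7() stands in for whatever latches a word onto the video bus. */
static void send_pixel_21bit(uint32_t rgb21, void (*put7)(uint8_t))
{
    put7((rgb21 >> 14) & 0x7F);  /* R: bits 20..14 */
    put7((rgb21 >> 7)  & 0x7F);  /* G: bits 13..7  */
    put7(rgb21         & 0x7F);  /* B: bits 6..0   */
}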

Block Diagram - VDAC: [image]


This is where the HDMI adapter (and the previous, not so well known, analogue RGB DAC by viletim (Tim Worthington), who reverse-engineered the bus's datastream format) taps the video data.
What you get there is a ~50MHz bus clocking digital RGB video compatible with analogue SDTV timings and levels. The built-in DAC just converts that to composite, S-Video or, in some cases, RGB.
The HDMI adapter performs scaling too, and outputs to more modern video connections.

In short, you shouldn't lose any antialiasing or anything like that, because the VI is part of the RCP, not separate from it.

Also, while the N64 can use a 32-bit framebuffer, most games stick to 16-bit anyway. What you and zoinkity discussed about that can, I think, be explained shortly like this: those 32 bits are really 24 bits + 8 bits of padding, because it's easier to set up one 32-bit r/w operation per pixel than three 8-bit operations, or one 16-bit and one 8-bit, which would add overhead delay and increase the complexity of the protocol. Basically, the wasted 8 bits are less waste than the alternatives would be. 32-bit RGB is, by convention, considered 24-bit RGB + alpha, but of course alpha doesn't make all that much sense in the context of video output, unless it's destined for studio compositing, so the "alpha channel" is really just padding here.
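The alignment argument in code form - with 8 bits of padding, every pixel is one naturally-aligned 32-bit store, whereas a packed 24-bit buffer would need byte stores straddling word boundaries. The field layout here is illustrative:

#include <stdint.h>

/* One aligned word per pixel: R, G, B in the top three bytes, and the
   low byte as the "alpha" that is really just padding. */
static inline void put_rgbx(uint32_t *fb, int i, uint8_t r, uint8_t g, uint8_t b)
{
    fb[i] = ((uint32_t)r << 24) | ((uint32_t)g << 16) | ((uint32_t)b << 8);
}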


For the 32-bit FB I didn't say it was necessarily 16+16; it could be a single 32-bit operation too. But, well, I'm not sure which, or even of any case of a game that actually uses a 32/24-bit RGB FB; surely emu/video-plugin authors would know. As zoinkity said, most games do 16-bit RGB FBs.

Edit -
- I created a 32-bit demo, but of course all emus display 32-bit anyway. On HW all the Mach banding disappears. Also, from what I read, most FB fills are 32-bit (or two 16-bit pixels at a time), so I guess that's why there is no point in 24-bit.
32-bit (2 pixels per clock):
gsDPSetFillColor(colour);   /* colour: one u32 RGBA8888 pixel */
16-bit (4 pixels per clock):
gsDPSetFillColor(GPACK_RGBA5551(255,0,0,1) << 16 | GPACK_RGBA5551(255,0,0,1));
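For context, here is roughly how that fill colour gets used in a complete static display-list fragment under libultra (the solid red and the 320x240 rectangle are arbitrary, and it assumes the colour image was set with gsDPSetColorImage earlier):

#include <ultra64.h>

/* Clear a 320x240 16-bit framebuffer to solid red in RDP fill mode. */
Gfx clear_fb[] = {
    gsDPSetCycleType(G_CYC_FILL),
    gsDPSetFillColor(GPACK_RGBA5551(255, 0, 0, 1) << 16 |
                     GPACK_RGBA5551(255, 0, 0, 1)),
    gsDPFillRectangle(0, 0, 319, 239),
    gsDPPipeSync(),
    gsSPEndDisplayList(),
};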



Going back to the VI. In case you don't know, the N64's RDRAM is 9-bit RAM. The CPU is only concerned with 8 of those bits, and I'm not sure whether the extra bit is used by the RCP's memory controller as a parity check or just ignored, but the VI does use it for the 2nd pass/layer of AA.
16 bits are two 8-bit bytes, so we have 2 extra bits per 16-bit pixel thanks to the 9-bit RAM. That's where the RCP stores the AA cues for the VI. This particular thing had a name, but I forgot what it was.


Edit -
- The 9th bit is the home of the coverage value, used in VI AA for silhouettes (the outer edges of disconnected faces/objects). GE's line mode shows this value as black against a white BG: if CVG = 0 or 1 then colour = BG (white), else colour = 255 * CVG.
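A sketch of the kind of blend that second AA pass performs with the coverage value, assuming 3-bit coverage (0..7). The real VI filter also looks at neighbouring pixels, which is omitted here:

#include <stdint.h>

/* Silhouette blend per channel: full coverage (7) keeps the pixel,
   partial coverage mixes in the background behind the edge. */
static uint8_t blend_edge(uint8_t fg, uint8_t bg, unsigned cvg)
{
    return (uint8_t)((fg * cvg + bg * (7 - cvg)) / 7);
}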


If you pay attention to the schematics I linked, the pixel clock line is fed into both the RCP (VI) and the DAC from an external source. The RCP doesn't set the pixel clock; it's mandated from outside, and the RCP has to do whatever it can with it. In normal games this isn't too noticeable, probably because commercial games tend to use well-tested video modes from the devkit, but in some games (can't quote one now, though) and particularly in homebrew, you'll notice the results of this if the conditions are right.
If you have a flashcart or backup unit, try running the NES emulator Neon64, look closely at the "title screen", and see if you spot something odd.

These columns have an extra repeated pixel in the horizontal plane, because the VI had to repeat a pixel in order to maintain timing. Ideally it would resample, but I doubt the VI has a resampler. And frankly, nobody noticed - just like nobody noticed that the 15/16-bit colour framebuffer was conveyed over a 21-bit RGB bus instead of a 24-bit one xD, particularly on CRTs, and especially under PAL and NTSC encodings.
The next step would be sniffing or hijacking the RAMBUS in order to access the framebuffers directly: somehow have a highly sophisticated addon that could interpret the N64's data formats in RAM in order to locate the framebuffers, then interpret them the same way the VI would, but give out 24/32-bit video instead of 21-bit. But really, what's the point?

.... actually, I thought of a point xD, but it's so damn far-fetched... 3D! The RAM also contains the Z-buffer, which gets written at the same time as the framebuffer. Tracking both at once, an addon sniffing the RAM bus could theoretically reconstruct a 3D view of every surface being drawn to the framebuffer, although the resolution would remain 240p (or 480p for the few games that use it). Hardly usable for 3D glasses.

Anyway, that's all. Hope you found this helpful, interesting, entertaining or an amusing waste of time. I felt like explaining crap and you seemed to be a suitable "victim", so there you go. Bye.





So there we go. I misunderstood the function of the HDMI adaptor.
I also misunderstood the location of the VI; looking over the docs and diagrams again, I now see that what I thought was really silly.

Trev
mistamontiel
007


Joined: 17 Apr 2011
Posts: 843
Location: Miami, FL, CUBA

Posted: Wed Feb 11, 2015 1:31 pm

Beyond odd... me Neon64 just keeps scrolling down nonstop ={
Kerr Avon
007


Joined: 26 Oct 2006
Posts: 913

Posted: Fri Mar 06, 2015 5:03 am

You know how, when you run an N64 emulator, the visuals of a game are much better than on a real N64 - not just because the monitor gives a sharper picture than a TV (which isn't very relevant on an LCD TV anyway), but because the emulator is rendering at a much higher resolution? Well, when this HDMI output thing is released, will it make the game output look as good on a real N64 + LCD TV as it does on a PC + monitor?

I hope so; Perfect Dark looks fantastic on emulators.
 
Trevor
007


Joined: 15 Jan 2010
Posts: 926
Location: UK, Friockheim OS:Win11-Dev PerfectGold:Latest

Posted: Fri Mar 06, 2015 5:46 am

No.

In emulation, N64 RDP code is turned into DirectX or OpenGL code to work on PC graphics cards.
This allows the graphics card to render objects at a higher resolution, usually the desktop's.

The N64 can only output at most 640x480, and is limited by RAM and fill rate from using higher resolutions.

Since the HDMI adapter plugs in just before the DAC, nothing will change except for bypassing the PAL/NTSC encodings, which both have drawbacks, colour loss, etc.


So, the best way to see this is to set the emulator to display GoldenEye at 320x240 and then stretch that to your monitor size.
You won't have sharp edges anymore.


Trev
Kerr Avon
007


Joined: 26 Oct 2006
Posts: 913

Posted: Fri Mar 06, 2015 6:26 am

Trevor wrote:
No.

In emulation, N64 RDP code is turned into DirectX or OpenGL code to work on PC graphics cards.
This allows the graphics card to render objects at a higher resolution, usually the desktop's.

The N64 can only output at most 640x480, and is limited by RAM and fill rate from using higher resolutions.

Since the HDMI adapter plugs in just before the DAC, nothing will change except for bypassing the PAL/NTSC encodings, which both have drawbacks, colour loss, etc.

So, the best way to see this is to set the emulator to display GoldenEye at 320x240 and then stretch that to your monitor size.
You won't have sharp edges anymore.


Trev


Fair enough. Graphics aren't much of a concern for me; it's a pity, but nothing too important. What IS important, though, is that modern TVs seem to be pulling away from being able to display the output of the N64 and other consoles from that era and earlier. I'd be perfectly happy if modern TVs showed the old consoles' output the same as CRT TVs do, but most don't, and the shops mostly don't have a clue what their TVs will display re: older consoles.

The next time I buy a TV, I'm going to have to take my N64 in with me, and if the shop won't allow me to test it, then I'll go elsewhere.
 