I just rebuilt one of my machines as a media center, but TV out via S-Video is not working. When I had both a VGA monitor and a TV connected, I could get it to somewhat work by using mirrored mode (both screens show the same thing) and setting the resolution higher than the TV supported but within what the monitor supported, which meant the TV couldn't display the entire desktop at once. Now, with the monitor disconnected, all I get on the TV is what looks like static and scan lines... sort of. At this point the TV just shows a blank blue screen because it can't make sense of the video signal. That makes me suspect the refresh rate, but I'm not sure. Neither the Windows display settings nor the nVidia software included with the driver will let me change the refresh rate away from 60 Hz.
I don't know whether the refresh rate is the problem or not. The computer displays on the TV during startup, but once Windows loads the driver, the picture breaks up. If I boot in VGA mode or uninstall the driver, it displays no problem, but of course video is slow as hell that way and I'd be unable to play games or do the other things I need this media center for.
FYI, even when the video is messed up I can still get in and change settings by remoting in via VNC, since Remote Desktop does not work.
I have tried reinstalling the driver several times, making sure everything is set to NTSC, changing resolutions, and running the nVidia TV setup wizard (which really does nothing more than ask whether you want mirrored mode).
Windows XP SP3 with all updates
GeForce 7100GS
TV: JVC D-Series, unknown model