Xorg for 1080i HDTV (Sony Grand Wega KF50WE610) - Hardware



Thread: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

  1. Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    projection to an Nvidia FX GeForce 5700 graphics card running the
    100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I was
    able to use the S-Video out connection but the quality with that is
    unspeakably awful.

    Does anyone know what the correct Xorg configuration would be for a 1080i
    HDTV?


  2. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    on Tuesday 17 July 2007 06:01
    in the Usenet newsgroup comp.os.linux.hardware
    General Schvantzkoph wrote:

    > I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    > projection to an Nvidia FX GeForce 5700 graphics card running the
    > 100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I was
    > able to use the S-Video out connection but the quality with that is
    > unspeakably awful.
    >
    > Does anyone know what the correct Xorg configuration would be for a 1080i
    > HDTV?


    Start by finding the specs of the HDTV. Does it really have 1080 lines?
    How many dots per line (what is the aspect ratio)? Does it support
    1080p? You will get a very slightly better picture if you convert the
    1080i signal to 1080p before you send it to the screen.


    --
    sig goes here...
    Peter D.

  3. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    On Tue, 17 Jul 2007 21:40:57 +1000, Peter D. wrote:

    > on Tuesday 17 July 2007 06:01
    > in the Usenet newsgroup comp.os.linux.hardware General Schvantzkoph
    > wrote:
    >
    >> I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    >> projection to an Nvidia FX GeForce 5700 graphics card running the
    >> 100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I was
    >> able to use the S-Video out connection but the quality with that is
    >> unspeakably awful.
    >>
    >> Does anyone know what the correct Xorg configuration would be for a
    >> 1080i HDTV?

    >
    > Start by finding the specs of the HDTV. Is it a real 1080 lines? How
    > many dots per line? (What is the aspect ratio?) Does it support 1080p?
    > You will get a very slightly better picture if you convert 1080i signal
    > to 1080p before you send it to the screen.


    It's a 1080i set. The number of pixels is 1,092,168, which is slightly
    higher than 1920*1080/2 = 1,036,800.

  4. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    On Mon, 16 Jul 2007 20:01:43 +0000, General Schvantzkoph typed this
    message:

    > I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    > projection to an Nvidia FX GeForce 5700 graphics card running the
    > 100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I was
    > able to use the S-Video out connection but the quality with that is
    > unspeakably awful.
    >
    > Does anyone know what the correct Xorg configuration would be for a
    > 1080i HDTV?



    The 5700 is not a top-of-the-line performer. Its max resolution is
    about 1600x1200. Also, nVidia released PureVideo HDTV/HDCP drivers for
    XP/2000; maybe there's a Linux version available.

    http://www.nvidia.com/page/purevideo.html

  5. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    On Wed, 18 Jul 2007 02:29:49 +0000, noi ance wrote:

    > On Mon, 16 Jul 2007 20:01:43 +0000, General Schvantzkoph typed this
    > message:
    >
    >> I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    >> projection to an Nvidia FX GeForce 5700 graphics card running the
    >> 100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I was
    >> able to use the S-Video out connection but the quality with that is
    >> unspeakably awful.
    >>
    >> Does anyone know what the correct Xorg configuration would be for a
    >> 1080i HDTV?

    >
    >
    > The 5700 is not a top of the line performer. It's max resolution is
    > about 1600x1200. Also nVidia released PureVideo HDTV/HDCP drivers for
    > XP/2000, maybe there's a Linux version available.
    >
    > http://www.nvidia.com/page/purevideo.html


    I'm using it with a 24" Dell display at 1920x1200, it supports that
    resolution just fine.

  6. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    on Tuesday 17 July 2007 22:14
    in the Usenet newsgroup comp.os.linux.hardware
    General Schvantzkoph wrote:

    > On Tue, 17 Jul 2007 21:40:57 +1000, Peter D. wrote:
    >
    >> on Tuesday 17 July 2007 06:01
    >> in the Usenet newsgroup comp.os.linux.hardware General Schvantzkoph
    >> wrote:
    >>
    >>> I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    >>> projection to an Nvidia FX GeForce 5700 graphics card running the
    >>> 100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I was
    >>> able to use the S-Video out connection but the quality with that is
    >>> unspeakably awful.
    >>>
    >>> Does anyone know what the correct Xorg configuration would be for a
    >>> 1080i HDTV?

    >>
    >> Start by finding the specs of the HDTV. Is it a real 1080 lines? How
    >> many dots per line? (What is the aspect ratio?) Does it support 1080p?
    >> You will get a very slightly better picture if you convert 1080i signal
    >> to 1080p before you send it to the screen.

    >
    > It's an 1080i set. The number of pixels is 1092168 which is slightly
    > higher then (1920*1080/2 = 1036800).


    A genuine 1920*1080 gives 2,073,600. The "i" (interlace) just means that
    half the rows are written to on one pass and the other half are written
    to on the second pass. A "p" (progressive) means that all lines are
    written to with each pass. Neither affects the number of pixels.

    Is it 1379*792=(197*7)*(2*2*2*3*3*11) ?

    The TV can probably receive and decode 1080i, 720p, 720i, and some
    others, but it is going to have to "process" all of those before
    displaying them.

    I am assuming that your rear projection TV behaves like an LCD
    rather than like a CRT - but I could well be wrong.

    If you are lucky it will simply be a matter of plugging it in and
    turning it on. What distribution are you using? Are you familiar
    with its configuration utilities? Do you know about runlevels?

    Try this: READ THE TV's MANUAL, set the computer's default runlevel
    to 3 (edit /etc/inittab), power down, plug the TV into the video
    card, power up. Can you see the BIOS screen? If not, I think that
    you will have problems; it is probably best to turn the power off
    before Linux gets a chance to boot. If you can see the BIOS screen,
    let it boot, login as root, run the configuration utility, have a
    look at the /etc/X11/xorg.conf file it created (the name could be
    different), and go to runlevel 5. Look at /var/log/Xorg.0.log (the
    name could be different). If everything is O.K., set the default
    runlevel back to 5 in /etc/inittab.
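
    A minimal sketch of those steps (assuming a SysV-init distribution;
    the name of the configuration utility depends on the distro):

    # /etc/inittab -- boot to text mode instead of straight into X
    id:3:initdefault:

    # after rebooting with the TV on the DVI port, from a text console:
    less /etc/X11/xorg.conf       # see what the configuration utility wrote
    telinit 5                     # bring X up (or just run: startx)
    less /var/log/Xorg.0.log      # see what the driver made of the TV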

    Good luck. If it works, say so. If it breaks the TV I'm sorry,
    you'll have to buy another TV. :-(


    --
    sig goes here...
    Peter D.

  7. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    On Thu, 19 Jul 2007 19:25:58 +1000, Peter D. wrote:

    > on Tuesday 17 July 2007 22:14
    > in the Usenet newsgroup comp.os.linux.hardware General Schvantzkoph
    > wrote:
    >
    >> On Tue, 17 Jul 2007 21:40:57 +1000, Peter D. wrote:
    >>
    >>> on Tuesday 17 July 2007 06:01
    >>> in the Usenet newsgroup comp.os.linux.hardware General Schvantzkoph
    >>> wrote:
    >>>
    >>>> I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    >>>> projection to an Nvidia FX GeForce 5700 graphics card running the
    >>>> 100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I
    >>>> was able to use the S-Video out connection but the quality with that
    >>>> is unspeakably awful.
    >>>>
    >>>> Does anyone know what the correct Xorg configuration would be for a
    >>>> 1080i HDTV?
    >>>
    >>> Start by finding the specs of the HDTV. Is it a real 1080 lines? How
    >>> many dots per line? (What is the aspect ratio?) Does it support
    >>> 1080p?
    >>> You will get a very slightly better picture if you convert 1080i
    >>> signal
    >>> to 1080p before you send it to the screen.

    >>
    >> It's an 1080i set. The number of pixels is 1092168 which is slightly
    >> higher then (1920*1080/2 = 1036800).

    >
    > A genuine 1920*1080 gives 2,073,600. The "i" (interlace) just means
    > that half the rows are written to on one pass and the other half are
    > written to on the second pass. A "p" (progressive) means that all lines
    > are written to with each pass. Neither affects the number of pixels.
    >
    > Is it 1379*792=(197*7)*(2*2*2*3*3*11) ?
    >
    > The TV can probably receive and decode 1080i, 720p, 720i, and some
    > others, but it is going to have to "process" all of those before
    > displaying them.
    >
    > I am assuming that your rear projection TV behaves like an LCD rather
    > than like a CRT - but I could well be wrong.
    >
    > If you are lucky it will simply be a matter of plugging it in and
    > turning it on. What distribution are you using? Are you familiar with
    > its configuration utilities? Do you know about runlevels?
    >
    > Try this, READ THE TV's MANUAL, set the computer's default runlevel to 3
    > (edit /etc/inittab), power down, plug the TV in to the video card, power
    > up. Can you see the BIOS screen? If not I think that you will have
    > problems. It is probably best to turn the power off before Linux gets a
    > chance to boot. If you can see the BIOS screen let it boot, login as
    > root, run the configuration utility, have a look at the
    > /etc/X11/xorg.conf file it created (the name could be different), go to
    > runlevel 5. Look at /var/log/Xorg.0.log (the name could be different).
    > If everything is O.K. set the default runlevel back to 5 in
    > /etc/inittab.
    >
    > Good luck. If it works, say so. If it breaks the TV I'm sorry, you'll
    > have to buy another TV. :-(


    It doesn't display the BIOS screen. I'm using Fedora 7.

  8. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    on Thursday 19 July 2007 21:18
    in the Usenet newsgroup comp.os.linux.hardware
    General Schvantzkoph wrote:

    > On Thu, 19 Jul 2007 19:25:58 +1000, Peter D. wrote:


    [snip]
    >> Good luck. If it works, say so. If it breaks the TV I'm sorry, you'll
    >> have to buy another TV. :-(

    >
    > It doesn't display the BIOS screen. I'm using Fedora 7.


    I don't know anything about Fedora's configuration utilities.

    Did you connect the TV instead of the monitor, or as well as?
    I meant for you to replace one with the other. Did you use the
    same socket? If not, is there any configuration option that
    you know of to send the signal to a different socket, or card?

    Did the manual say anything interesting?

    --
    sig goes here...
    Peter D.

  9. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    On Thu, 19 Jul 2007 23:28:02 +1000, Peter D. wrote:

    > on Thursday 19 July 2007 21:18
    > in the Usenet newsgroup comp.os.linux.hardware General Schvantzkoph
    > wrote:
    >
    >> On Thu, 19 Jul 2007 19:25:58 +1000, Peter D. wrote:

    >
    > [snip]
    >>> Good luck. If it works, say so. If it breaks the TV I'm sorry,
    >>> you'll have to buy another TV. :-(

    >>
    >> It doesn't display the BIOS screen. I'm using Fedora 7.

    >
    > I don't know anything about Fedora's configuration utilities.
    >
    > Did you connect the TV instead of the monitor, or as well as? I meant
    > for you to replace one with the other. Did you use the same socket? If
    > not, is there any configuration option that you know of to send the
    > signal to a different socket, or card?
    >
    > Did the manual say anything interesting?


    I hooked up the set by itself. I only have one graphics card so it's one
    or the other. I did run the system in a dual monitor mode with the TV
    connected to the S-Video output, but the picture quality was horrible.


  10. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    On 2007-07-16, General Schvantzkoph wrote:
    > I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    > projection to an Nvidia FX GeForce 5700 graphics card running the
    > 100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I was
    > able to use the S-Video out connection but the quality with that is
    > unspeakably awful.
    >
    > Does anyone know what the correct Xorg configuration would be for a 1080i
    > HDTV?


    Sir,

    OK, there are two things here that I can think of:

    1) actually establishing that HDMI out works
    2) getting 1080i mode

    For me, I had a DVI output on my video card, and used a DVI to HDMI
    converter to attach to the TV. This is slightly different from yours
    (direct HDMI port built into the video card?), but I wouldn't worry
    about that too much.

    Let's diagnose step 1. Get an ssh shell into your box from some other
    machine, and make sure the machine is connected to the TV, the TV is
    on, and it's tuned to the right input.
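
    In practice that's just something like this (the user and host names
    here are placeholders, obviously):

    ssh user@htpc                 # keep a usable console even if the TV stays black
    tail -f /var/log/Xorg.0.log   # watch the driver output while X comes up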

    When you run startx, look for a section of
    /var/log/Xorg.0.log like:

    (II) NVIDIA(0): NVIDIA GPU GeForce4 Ti 4600 at PCI:1:0:0 (GPU-0)
    (--) NVIDIA(0): Memory: 131072 kBytes
    (--) NVIDIA(0): VideoBIOS: 04.25.00.27.32
    (II) NVIDIA(0): Detected AGP rate: 4X
    (--) NVIDIA(0): Interlaced video modes are supported on this GPU

    Right after that it should tell you what devices are attached:

    (--) NVIDIA(0): Connected display device(s) on GeForce4 Ti 4600 at PCI:1:0:0:
    (--) NVIDIA(0): SONY TV (DFP-0)
    (--) NVIDIA(0): SONY TV (DFP-0): 165.0 MHz maximum pixel clock
    (--) NVIDIA(0): SONY TV (DFP-0): External Single Link TMDS
    (II) NVIDIA(0): Assigned Display Device: DFP-0


    If you don't have something like this, where it prints that it's
    connected to a DFP output and assigns the display device, I can't see
    how you can expect to see anything on your display.

    After that you should see some mode select info:

    (II) NVIDIA(0): Validated modes:
    (II) NVIDIA(0): "1280x720"
    (II) NVIDIA(0): "720x480"
    (II) NVIDIA(0): Virtual screen size determined to be 1280 x 720
    (WW) NVIDIA(0): SONY TV (DFP-0)'s EDID does not contain a maximum image size;
    (WW) NVIDIA(0): cannot compute DPI from SONY TV (DFP-0)'s EDID.
    (**) NVIDIA(0): DPI set to (83, 63); computed from "DisplaySize" Monitor
    (**) NVIDIA(0): section option

    You can see, it didn't select 1080i as a valid mode.

    To get that, I had to add something to my Monitor section,
    'UseEdidFreqs'

    Section "Monitor"
    Identifier "Monitor[0]"
    VendorName "SONY"
    ModelName "CPD-G520"
    Option "UseEdidFreqs" "FALSE"
    EndSection

    After I added that, I was able to get a 1080i modeline listed in my
    Xorg.0.log, which I ended up using.
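
    (For reference, the 1080i timing is the standard ATSC one; from memory
    the modeline is very close to the line below, but take the exact
    numbers from your own Xorg.0.log rather than from me.)

    ModeLine "1920x1080_60i"  74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync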

    It still looked like crap, but I gave up-- 720p worked 'ok'.
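
    One more thing worth trying if the log never even shows the TV as a
    DFP: forcing the output in the Device section. ConnectedMonitor is a
    documented nvidia driver option, though whether it actually helps with
    this particular set I don't know:

    Section "Device"
        Identifier "Videocard0"
        Driver     "nvidia"
        # claim a digital flat panel is attached even if no EDID is read
        Option     "ConnectedMonitor" "DFP"
    EndSection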

    Dave

  11. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    On Thu, 19 Jul 2007 15:00:44 +0000, David E. Konerding DSD staff wrote:

    > On 2007-07-16, General Schvantzkoph wrote:
    >> I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    >> projection to an Nvidia FX GeForce 5700 graphics card running the
    >> 100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I was
    >> able to use the S-Video out connection but the quality with that is
    >> unspeakably awful.
    >>
    >> Does anyone know what the correct Xorg configuration would be for a
    >> 1080i HDTV?

    >
    > Sir,
    >
    > OK, there are two things here that I can think of;
    >
    > 1) actually establishing HDMI out works 2) getting 1080i mode
    >
    > For me, I had a DVI output on my video card, and used a DVI to HDMI
    > converter to attach to the TV. This is slightly different from yours
    > (direct HDMI port built into video card?), but I don't worry too much.
    >
    > let's diagnose step 1. Get an ssh shell into your box from some other
    > machine, and make sure you machine is connected to the TV, the TV is on,
    > and tuned to the right input.
    >
    > When you run startx, look for a section of /var/log/Xorg.0.log like:
    >
    > (II) NVIDIA(0): NVIDIA GPU GeForce4 Ti 4600 at PCI:1:0:0 (GPU-0) (--)
    > NVIDIA(0): Memory: 131072 kBytes (--) NVIDIA(0): VideoBIOS:
    > 04.25.00.27.32 (II) NVIDIA(0): Detected AGP rate: 4X (--) NVIDIA(0):
    > Interlaced video modes are supported on this GPU
    >
    > Right after that it should tell you what devices are attached:
    >
    > (--) NVIDIA(0): Connected display device(s) on GeForce4 Ti 4600 at
    > PCI:1:0:0: (--) NVIDIA(0): SONY TV (DFP-0)
    > (--) NVIDIA(0): SONY TV (DFP-0): 165.0 MHz maximum pixel clock (--)
    > NVIDIA(0): SONY TV (DFP-0): External Single Link TMDS (II) NVIDIA(0):
    > Assigned Display Device: DFP-0
    >
    >
    > If you don't have something like this, where it prints that it's connect
    > to a DFP output, and assigns the display device, I can't see how you can
    > expect to see anything on your display.
    >
    > After that you should see some mode select info:
    >
    > (II) NVIDIA(0): Validated modes:
    > (II) NVIDIA(0): "1280x720"
    > (II) NVIDIA(0): "720x480"
    > (II) NVIDIA(0): Virtual screen size determined to be 1280 x 720 (WW)
    > NVIDIA(0): SONY TV (DFP-0)'s EDID does not contain a maximum image size;
    > (WW) NVIDIA(0): cannot compute DPI from SONY TV (DFP-0)'s EDID. (**)
    > NVIDIA(0): DPI set to (83, 63); computed from "DisplaySize" Monitor (**)
    > NVIDIA(0): section option
    >
    > You can see, it didn't select 1080i as a valid mode.
    >
    > To get that, I had to add something to my Monitor section,
    > 'UseEdidFreqs'
    >
    > Section "Monitor"
    > Identifier "Monitor[0]"
    > VendorName "SONY"
    > ModelName "CPD-G520"
    > Option "UseEdidFreqs" "FALSE"
    > EndSection
    >
    > After I added that, I was able to get a 1080i output line modeline
    > listed in my Xorg.0.log, which I ended up using.
    >
    > It still looked like crap, but I gave up-- 720p worked 'ok'.
    >
    > Dave


    I'll give UseEdidFreqs FALSE a try when I have time to do another
    test. The connector on my TV is a DVI connector, so I've been using a
    DVI-to-DVI cable. The cable from my Comcast DVR is an HDMI-to-DVI one.


  12. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    David E. Konerding DSD staff wrote:

    > On 2007-07-16, General Schvantzkoph wrote:
    >> I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    >> projection to an Nvidia FX GeForce 5700 graphics card running the
    >> 100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I
    >> was able to use the S-Video out connection but the quality with
    >> that is unspeakably awful.
    >>
    >> Does anyone know what the correct Xorg configuration would be for a
    >> 1080i HDTV?

    >
    > Sir,
    >
    > OK, there are two things here that I can think of;
    >
    > 1) actually establishing HDMI out works
    > 2) getting 1080i mode
    >
    > For me, I had a DVI output on my video card, and used a DVI to HDMI
    > converter to attach to the TV. This is slightly different from
    > yours (direct HDMI port built into video card?), but I don't worry
    > too much.
    >
    > let's diagnose step 1. Get an ssh shell into your box from some
    > other machine, and make sure you machine is connected to the TV, the
    > TV is on, and tuned to the right input.
    >
    > When you run startx, look for a section of
    > /var/log/Xorg.0.log like:
    >
    > (II) NVIDIA(0): NVIDIA GPU GeForce4 Ti 4600 at PCI:1:0:0 (GPU-0)
    > (--) NVIDIA(0): Memory: 131072 kBytes
    > (--) NVIDIA(0): VideoBIOS: 04.25.00.27.32
    > (II) NVIDIA(0): Detected AGP rate: 4X
    > (--) NVIDIA(0): Interlaced video modes are supported on this GPU
    >
    > Right after that it should tell you what devices are attached:
    >
    > (--) NVIDIA(0): Connected display device(s) on GeForce4 Ti 4600 at
    > PCI:1:0:0:
    > (--) NVIDIA(0): SONY TV (DFP-0)
    > (--) NVIDIA(0): SONY TV (DFP-0): 165.0 MHz maximum pixel clock
    > (--) NVIDIA(0): SONY TV (DFP-0): External Single Link TMDS
    > (II) NVIDIA(0): Assigned Display Device: DFP-0
    >
    >
    > If you don't have something like this, where it prints that it's
    > connect to a DFP output, and assigns the display device, I can't see
    > how you can expect to see anything on your display.
    >
    > After that you should see some mode select info:
    >
    > (II) NVIDIA(0): Validated modes:
    > (II) NVIDIA(0): "1280x720"
    > (II) NVIDIA(0): "720x480"
    > (II) NVIDIA(0): Virtual screen size determined to be 1280 x 720
    > (WW) NVIDIA(0): SONY TV (DFP-0)'s EDID does not contain a maximum
    > image size;
    > (WW) NVIDIA(0): cannot compute DPI from SONY TV (DFP-0)'s EDID.
    > (**) NVIDIA(0): DPI set to (83, 63); computed from "DisplaySize"
    > Monitor
    > (**) NVIDIA(0): section option
    >
    > You can see, it didn't select 1080i as a valid mode.
    >
    > To get that, I had to add something to my Monitor section,
    > 'UseEdidFreqs'
    >
    > Section "Monitor"
    > Identifier "Monitor[0]"
    > VendorName "SONY"
    > ModelName "CPD-G520"
    > Option "UseEdidFreqs" "FALSE"
    > EndSection
    >
    > After I added that, I was able to get a 1080i output line modeline
    > listed in my Xorg.0.log, which I ended up using.
    >
    > It still looked like crap, but I gave up-- 720p worked 'ok'.


    That suggests to me that your TV does not support genuine 1080-line
    resolution. It pretends to, but actually does an extra format
    conversion, and two conversions must be expected to give worse results
    than one.


    --
    Peter D.
    Sig goes here...

  13. Re: Xorg for 1080i HDTV (Sony Grand Wega KF50WE610)

    General Schvantzkoph wrote:

    > On Thu, 19 Jul 2007 15:00:44 +0000, David E. Konerding DSD staff wrote:
    >
    >> On 2007-07-16, General Schvantzkoph wrote:
    >>> I've been trying to hook the HDMI/DVI port of my 50" Sony LCD rear
    >>> projection to an Nvidia FX GeForce 5700 graphics card running the
    >>> 100.14.11 Nvidia driver. When I use the HDMI port I get nothing. I was
    >>> able to use the S-Video out connection but the quality with that is
    >>> unspeakably awful.
    >>>
    >>> Does anyone know what the correct Xorg configuration would be for a
    >>> 1080i HDTV?

    >>
    >> Sir,
    >>
    >> OK, there are two things here that I can think of;
    >>
    >> 1) actually establishing HDMI out works 2) getting 1080i mode
    >>
    >> For me, I had a DVI output on my video card, and used a DVI to HDMI
    >> converter to attach to the TV. This is slightly different from yours
    >> (direct HDMI port built into video card?), but I don't worry too much.
    >>
    >> let's diagnose step 1. Get an ssh shell into your box from some other
    >> machine, and make sure you machine is connected to the TV, the TV is on,
    >> and tuned to the right input.
    >>
    >> When you run startx, look for a section of /var/log/Xorg.0.log like:
    >>
    >> (II) NVIDIA(0): NVIDIA GPU GeForce4 Ti 4600 at PCI:1:0:0 (GPU-0) (--)
    >> NVIDIA(0): Memory: 131072 kBytes (--) NVIDIA(0): VideoBIOS:
    >> 04.25.00.27.32 (II) NVIDIA(0): Detected AGP rate: 4X (--) NVIDIA(0):
    >> Interlaced video modes are supported on this GPU
    >>
    >> Right after that it should tell you what devices are attached:
    >>
    >> (--) NVIDIA(0): Connected display device(s) on GeForce4 Ti 4600 at
    >> PCI:1:0:0: (--) NVIDIA(0): SONY TV (DFP-0)
    >> (--) NVIDIA(0): SONY TV (DFP-0): 165.0 MHz maximum pixel clock (--)
    >> NVIDIA(0): SONY TV (DFP-0): External Single Link TMDS (II) NVIDIA(0):
    >> Assigned Display Device: DFP-0
    >>
    >>
    >> If you don't have something like this, where it prints that it's connect
    >> to a DFP output, and assigns the display device, I can't see how you can
    >> expect to see anything on your display.
    >>
    >> After that you should see some mode select info:
    >>
    >> (II) NVIDIA(0): Validated modes:
    >> (II) NVIDIA(0): "1280x720"
    >> (II) NVIDIA(0): "720x480"
    >> (II) NVIDIA(0): Virtual screen size determined to be 1280 x 720 (WW)
    >> NVIDIA(0): SONY TV (DFP-0)'s EDID does not contain a maximum image size;
    >> (WW) NVIDIA(0): cannot compute DPI from SONY TV (DFP-0)'s EDID. (**)
    >> NVIDIA(0): DPI set to (83, 63); computed from "DisplaySize" Monitor (**)
    >> NVIDIA(0): section option
    >>
    >> You can see, it didn't select 1080i as a valid mode.
    >>
    >> To get that, I had to add something to my Monitor section,
    >> 'UseEdidFreqs'
    >>
    >> Section "Monitor"
    >> Identifier "Monitor[0]"
    >> VendorName "SONY"
    >> ModelName "CPD-G520"
    >> Option "UseEdidFreqs" "FALSE"
    >> EndSection
    >>
    >> After I added that, I was able to get a 1080i output line modeline
    >> listed in my Xorg.0.log, which I ended up using.
    >>
    >> It still looked like crap, but I gave up-- 720p worked 'ok'.
    >>
    >> Dave

    >
    > I'll give the UseEdudFreqs FALSE a try when I have time to do another
    > test. The connector on my TV is a DVI connector so I've been using a DVI
    > to DVI cable. The cable from my Comcast DVR is an HDMI to DVI.


    One Google search turns up a native resolution of 1386 x 768. It is an
    LCD type, so driving it at anything other than its native resolution
    will produce an inferior picture.
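
    If you want to experiment with driving it at (or near) that native
    resolution, the gtf utility that ships with Xorg will generate a
    candidate modeline to paste into the Monitor section (no guarantee the
    TV's DVI input will accept an arbitrary PC timing, though):

    gtf 1386 768 60    # prints a Modeline line to copy into xorg.conf
    # then add the new mode's name to the Modes line in Section "Screen"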
    --
    JosephKK
    Against stupidity the gods themselves contend in vain.
    --Schiller
