Channel: FedoraForum.org

F19 Nvidia Libglx.so segmentation fault

Hi,
I have just installed the Fedora 19 LXDE spin on my old computer. To get better performance I installed the nvidia driver, but if Xorg uses nvidia's libglx.so it crashes with a segmentation fault. If I use Xorg's own libglx.so it works fine, but then there is no direct rendering and GLX applications fail with "Error: glXCreateContext failed".
I tried installing the driver both before and after a system update, but still no luck. I even tried the driver from nvidia.com, with the same result. :(

Specs:
Kernel: 3.9.5-301.fc19.i686 \ 3.10.3-300.fc19.i686 (Tried both of them)
Graphic Card: Fx5500
Driver: akmod-nvidia-173xx-173.14.37-2.fc19.i686
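For reference, these are the commands I used to check the specs above (uname for the kernel, rpm for the installed nvidia packages; rpm is Fedora's package tool):

```shell
# Show the running kernel version
uname -r

# List installed nvidia-related packages (akmod, kmod, xorg driver), if rpm is available
if command -v rpm >/dev/null 2>&1; then
    rpm -qa | grep -i nvidia | sort
fi
```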

Xorg.0.log if I use nvidia's libglx.so:
Code:

[  141.776] (II) "glx" will be loaded by default.
[  141.776] (II) LoadModule: "dri2"
[  141.776] (II) Module "dri2" already built-in
[  141.776] (II) LoadModule: "glamoregl"
[  141.777] (II) Loading /usr/lib/xorg/modules/libglamoregl.so
[  141.843] (II) Module glamoregl: vendor="X.Org Foundation"
[  141.843]        compiled for 1.14.0, module version = 0.5.0
[  141.843]        ABI class: X.Org ANSI C Emulation, version 0.4
[  141.843] (II) LoadModule: "glx"
[  141.843] (II) Loading /usr/lib/xorg/modules/extensions/nvidia/libglx.so
[  141.849] (II) Module glx: vendor="NVIDIA Corporation"
[  141.849]        compiled for 4.0.2, module version = 1.0.0
[  141.849]        Module class: X.Org Server Extension
[  141.849] (II) NVIDIA GLX Module  173.14.37  Wed Mar  6 17:14:50 PST 2013
[  141.849] Loading extension GLX
[  141.849] (II) LoadModule: "nvidia"
[  141.849] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
[  141.851] (II) Module nvidia: vendor="NVIDIA Corporation"
[  141.851]        compiled for 4.0.2, module version = 1.0.0
[  141.851]        Module class: X.Org Video Driver
[  141.851] (II) NVIDIA dlloader X Driver  173.14.37  Wed Mar  6 17:02:37 PST 2013
[  141.851] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[  141.852] (++) using VT number 1

[  141.858] (II) Loading sub module "fb"
[  141.858] (II) LoadModule: "fb"
[  141.860] (II) Loading /usr/lib/xorg/modules/libfb.so
[  141.862] (II) Module fb: vendor="X.Org Foundation"
[  141.862]        compiled for 1.14.2, module version = 1.0.0
[  141.862]        ABI class: X.Org ANSI C Emulation, version 0.4
[  141.862] (II) Loading sub module "wfb"
[  141.862] (II) LoadModule: "wfb"
[  141.864] (II) Loading /usr/lib/xorg/modules/libwfb.so
[  141.867] (II) Module wfb: vendor="X.Org Foundation"
[  141.867]        compiled for 1.14.2, module version = 1.0.0
[  141.867]        ABI class: X.Org ANSI C Emulation, version 0.4
[  141.867] (II) Loading sub module "ramdac"
[  141.867] (II) LoadModule: "ramdac"
[  141.867] (II) Module "ramdac" already built-in
[  141.867] (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
[  141.868] (==) NVIDIA(0): RGB weight 888
[  141.868] (==) NVIDIA(0): Default visual is TrueColor
[  141.868] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
[  141.868] (**) NVIDIA(0): Enabling RENDER acceleration
[  141.868] (II) NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
[  141.868] (II) NVIDIA(0):    enabled.
[  142.932] (II) NVIDIA(0): NVIDIA GPU GeForce FX 5500 (NV34) at PCI:1:0:0 (GPU-0)
[  142.932] (--) NVIDIA(0): Memory: 262144 kBytes
[  142.932] (--) NVIDIA(0): VideoBIOS: 04.34.20.87.00
[  142.932] (II) NVIDIA(0): Detected AGP rate: 8X
[  142.932] (--) NVIDIA(0): Interlaced video modes are supported on this GPU
[  142.932] (--) NVIDIA(0): Connected display device(s) on GeForce FX 5500 at PCI:1:0:0:
[  142.932] (--) NVIDIA(0):    Philips PH107E/V/S/G6 (CRT-0)
[  142.932] (--) NVIDIA(0): Philips PH107E/V/S/G6 (CRT-0): 350.0 MHz maximum pixel clock
[  142.935] (II) NVIDIA(0): Assigned Display Device: CRT-0
[  142.935] (==) NVIDIA(0):
[  142.935] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
[  142.935] (==) NVIDIA(0):    will be used as the requested mode.
[  142.935] (==) NVIDIA(0):
[  142.935] (II) NVIDIA(0): Validated modes:
[  142.935] (II) NVIDIA(0):    "nvidia-auto-select"
[  142.935] (II) NVIDIA(0): Virtual screen size determined to be 1280 x 1024
[  142.939] (--) NVIDIA(0): DPI set to (104, 113); computed from "UseEdidDpi" X config
[  142.939] (--) NVIDIA(0):    option
[  142.940] (==) NVIDIA(0): Enabling 32-bit ARGB GLX visuals.
[  142.940] (--) Depth 24 pixmap format is 32 bpp
[  142.943] (II) NVIDIA(0): Initialized AGP GART.
[  142.950] (II) NVIDIA(0): Unable to connect to the ACPI daemon; the ACPI daemon may not
[  142.950] (II) NVIDIA(0):    be running or the "AcpidSocketPath" X configuration option
[  142.950] (II) NVIDIA(0):    may not be set correctly.  When the ACPI daemon is
[  142.950] (II) NVIDIA(0):    available, the NVIDIA X driver can use it to receive ACPI
[  142.950] (II) NVIDIA(0):    events.  For details, please see the "ConnectToAcpid" and
[  142.950] (II) NVIDIA(0):    "AcpidSocketPath" X configuration options in Appendix B: X
[  142.950] (II) NVIDIA(0):    Config Options in the README.
[  142.950] (II) NVIDIA(0): Setting mode "nvidia-auto-select"
[  143.089] Loading extension NV-GLX
[  143.151] (II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
[  143.165] (II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
[  143.165] (==) NVIDIA(0): Backing store disabled
[  143.165] (==) NVIDIA(0): Silken mouse enabled
[  143.166] (**) NVIDIA(0): DPMS enabled
[  143.166] Loading extension NV-CONTROL
[  143.167] (==) RandR enabled
[  143.205] (II) SELinux: Disabled by boolean
[  143.212] (II) Initializing extension GLX
[  143.213] (EE)
[  143.213] (EE) Backtrace:
[  143.214] (EE) 0: /usr/bin/X (OsLookupColor+0x136) [0x80b69b6]
[  143.223] (EE) 1: ? (?+0x136) [0xb77ed541]
[  143.225] (EE) 2: /usr/lib/xorg/modules/extensions/nvidia/libglx.so (_nv000010gl+0xc55) [0xb65d650a]
[  143.227] (EE) 3: /lib/libc.so.6 (__libc_start_main+0xf3) [0x4dff2963]
[  143.232] (EE) 4: /usr/bin/X (_start+0x21) [0x806884a]
[  143.234] (EE)
[  143.234] (EE) Segmentation fault at address 0x0
[  143.234] (EE)
Fatal server error:
[  143.234] (EE) Caught signal 11 (Segmentation fault). Server aborting
[  143.234] (EE)
[  143.234] (EE)

xorg.conf:
Code:

Section "ServerLayout"
    Identifier    "Layout0"
    Screen      0  "Screen0"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
EndSection

Section "Files"
    FontPath        "/usr/share/fonts/default/Type1"
#    ModulePath  "/usr/lib/xorg/modules/extensions/nvidia"
    ModulePath  "/usr/lib/xorg/modules/drivers"
    ModulePath  "/usr/lib/xorg/modules"
EndSection

Section "InputDevice"
    # generated from default
    Identifier    "Mouse0"
    Driver        "mouse"
    Option        "Protocol" "auto"
    Option        "Device" "/dev/input/mice"
    Option        "Emulate3Buttons" "no"
    Option        "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier    "Keyboard0"
    Driver        "kbd"
EndSection

Section "Monitor"
    Identifier    "Monitor0"
    VendorName    "Unknown"
    ModelName      "Unknown"
    HorizSync      30.0 - 110.0
    VertRefresh    50.0 - 150.0
    Option        "DPMS"
EndSection

Section "Device"
    Identifier    "Device0"
    Driver        "nvidia"
    VendorName    "NVIDIA Corporation"
EndSection

Section "Screen"
    Identifier    "Screen0"
    Device        "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    SubSection    "Display"
        Depth      24
    EndSubSection
EndSection

If I uncomment the line `# ModulePath "/usr/lib/xorg/modules/extensions/nvidia"`, Xorg uses nvidia's libglx.so, but as I said, that causes the segmentation fault.
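A quick way to check which libglx.so the server actually loaded is to grep the log. The sketch below runs against a saved sample at a hypothetical temp path; on a live system you would grep /var/log/Xorg.0.log directly:

```shell
# Save a sample of the relevant log lines (hypothetical path for illustration;
# the live log is normally /var/log/Xorg.0.log)
cat > /tmp/xorg_sample.log <<'EOF'
[  141.843] (II) LoadModule: "glx"
[  141.843] (II) Loading /usr/lib/xorg/modules/extensions/nvidia/libglx.so
EOF

# Which libglx.so was loaded? A path under .../extensions/nvidia means the
# NVIDIA GLX module; .../extensions/libglx.so means Xorg's built-in one.
grep 'Loading.*libglx' /tmp/xorg_sample.log
```

Xorg searches the `ModulePath` entries in the order they appear in the "Files" section, which is why commenting out the nvidia extensions path makes it fall back to its own libglx.so.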
