BillW50 said:
Oh sorry. I am running 9.10 and have 10.10 and 11.something iso.
Well, first off, Linux doesn't work for people who hate it,
so forget that idea right away. The same goes for Windows.
If a person is completely computer-phobic, and their blood pressure
rises any time they see a glowing glass screen (my youngest
brother is like that), they're bound not to enjoy it.
Anyone who works with Linux is eventually going to have to face
the "Xorg monster", a.k.a. X Windows. That, and the various driver
options for running the graphics card. (No, don't assume the
driver needs to be changed just yet... save that for when you're
really, really desperate. Changing drivers can have some
long-term maintenance implications.)
*******
By the way, you don't want the latest release here, as you may end
up stuck with the Unity interface. 10.04 LTS is supported (has a
package server) until 2013-04, so that will do for now. With the
11.10 desktop version, you might not be able to disable Unity (easily).
http://www.ubuntu.com/download/ubuntu/download
This is my VM running 11.10, with the classic interface. I did
this by starting with the *server* version of 11.10 and "adding
crap" to it until it had X Windows and a window manager. For
any of you Linux people out there, don't gag. This is
still Ubuntu, warts and all. I did this to show Canonical
what I think of Unity.
http://img19.imageshack.us/img19/9074/u1110svr.gif
Anyway, stick with 10.04 LTS for now, or try a
derivative, like some version of Linux Mint. It'll allow
you to concentrate on the problem at hand.
*******
You should be looking in /var/log/Xorg.0.log and friends. In
there, you can watch as X tries a large number of resolution
combinations. For each one that doesn't pan out, X will report the
reason why (clock out of range, horizontal or vertical scan rate
out of range, and so on). While the error messages might not
point at the root cause, this file is like a "fingerprint"
and helps "point at the criminal".
When you use Ubuntu "out of the box", X will be running in its
own self-discovery type world. It'll try to develop its own
settings, based on probing and testing. But you can override this. The
tradition was that you crafted your own xorg.conf before you'd
even try to start X. But over the years, the situation has
improved. For example, the Nvidia Linux package has an
application which will generate sensible xorg.conf values
you can actually use. That's what I used as a template for
mine, rather than reading whatever passes for documentation
and typing it in from scratch.
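That application is nvidia-xconfig (you can see its fingerprints in
the header of the file below). If the Nvidia driver package is
installed, running it as root writes a starter /etc/X11/xorg.conf
for you to edit. A sketch, assuming you want your own backup on top
of the one the tool keeps:

    # Keep a copy of the current config, just in case
    sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.mine
    # Generate a fresh xorg.conf from the detected hardware
    sudo nvidia-xconfig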
I run quite a few different distros of Linux in VPC2007,
basically to make it easier to tame any rough edges. And
in each case, I have to "fix" X, because very few distros
configure everything exactly right. (Nobody tests whether
their distro will run well in VPC2007...)
In the VPC2007 environment, the emulated video hardware pretends
to be some kind of ancient S3 graphics chip. The chip is assigned
a "maximum pixel clock rate", which in a virtual environment
doesn't mean a damn thing. But for some reason, they decided
to make their software emulation adhere to how the real
hardware works, right down to reporting that the pixel clock can't
go any higher than 80 MHz. The end result is that most Xorg runs
start the screen in 24-bit mode, at a resolution of 1024x768
or 800x600 (which is too small for my tastes).
To fix that, I generated my own "modeline", and also run
the screen in 16-bit mode instead of 24-bit mode. That allows
the dimensions of the virtual screen to be made much larger
without really compromising other things, since a frame at
16 bits per pixel needs a third less video memory than the
same frame at 24. (Video doesn't play very well in that
environment, because the emulation lacks even the simplest
support for video, so the processor has to grunt too hard
to make good video possible.)
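The modelines themselves came from the cvt utility, which computes
standards-based timings for a given width, height, and refresh rate.
The 50 Hz run produces exactly the line that ends up in the config
below (cvt's comment line trimmed here):

    $ cvt 1152 864 50
    Modeline "1152x864_50.00"  66.25  1152 1208 1320 1488  864 867 871 892 -hsync +vsync

Asking for the more natural "cvt 1152 864 60" instead reports a pixel
clock of roughly 81.75 MHz, which busts the emulated 80 MHz limit.
That's why the refresh rate got dropped to 50 Hz.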
Anyway, this is a copy of one of my xorg.conf files, for your enjoyment.
This goes into /etc/X11/xorg.conf. I haven't attempted to make it
pretty or anything. The file was auto-generated, to save typing
the trivial stuff, and then I added a line or two. Notice that
a modeline of "1152x864_50.00" makes no physical sense, because
you really wouldn't want to run a physical LCD screen at 50 Hz. But because
this is a virtual environment, I can get away with it (the pixmap
from the virtual machine is being scanned at 60 Hz by the real
video card and driver in Windows). The modelines were computed
to stay below the 80.0 MHz pixel clock limit of the S3 emulation.
******* Half-baked xorg.conf, to tame the Xorg beast *******
# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig: version 256.53 (buildmeister@builder101) Fri Aug 27 21:34:01 PDT 2010
Section "ServerLayout"
Identifier "Layout0"
Screen 0 "Screen0"
InputDevice "Keyboard0" "CoreKeyboard"
InputDevice "Mouse0" "CorePointer"
EndSection
Section "Files"
EndSection
Section "InputDevice"
# generated from default
Identifier "Mouse0"
Driver "mouse"
Option "Protocol" "auto"
Option "Device" "/dev/psaux"
Option "Emulate3Buttons" "no"
Option "ZAxisMapping" "4 5"
EndSection
Section "InputDevice"
# generated from default
Identifier "Keyboard0"
Driver "kbd"
EndSection
Section "Monitor"
Identifier "Monitor0"
VendorName "Unknown"
ModelName "Unknown"
# Hacked numbers, to prevent sync rates from limiting operating modes
HorizSync 31.5 - 75.0
VertRefresh 50.0 - 100.0
# Modelines via "cvt".
Modeline "1152x864_50.00" 66.25 1152 1208 1320 1488 864 867 871 892 -hsync +vsync
Modeline "1024x768_60.00" 63.50 1024 1072 1176 1328 768 771 775 790 -hsync +vsync
Option "DPMS"
EndSection
Section "Device"
Identifier "Device0"
Driver "s3"
VendorName "Vanilla Corporation"
EndSection
Section "Screen"
Identifier "Screen0"
Device "Device0"
Monitor "Monitor0"
# Try values like 8, 16, or the default 24 bit
DefaultDepth 16
SubSection "Display"
Depth 16
Modes "1152x864_50.00"
EndSubSection
EndSection
************************************************************
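Once the file is in place, restart X and confirm the mode took.
On a 10.04-era Ubuntu that means bouncing the display manager.
I'm assuming gdm here; substitute kdm or whatever your setup
actually runs:

    sudo service gdm restart   # tears down and restarts the X session
    xrandr                     # the active mode is marked with a *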
Linux is meant to make you sweat. If you're not
editing config files by hand, "you're not holding it right".
HTH,
Paul