(PeteCresswell) said:
> I am running a Windows 7 box headless - using TeamViewer to get
> to it as needed.
> Problem is that as soon as I disconnect the monitor and reboot,
> Windows reverts to VGA mode and I cannot find a means of telling
> it to do otherwise.
> HP p6-2107c, using "AMD Radeon HD 6530D Graphics"
> Maybe replace the Radeon driver with some sort of special driver?

There are a couple of possibilities.

There could be a utility out there to force it. PowerStrip is used
for setting custom resolutions, but I don't know whether it includes
a forcing function (to override the missing EDID). You can evaluate
it for 30 days to see if it does anything for you; for use beyond
30 days, you pay for it.
http://www.entechtaiwan.com/util/ps.shtm

In terms of hardware, there are two aspects to monitor operation.
The video card has "impedance sensing", so it can tell when you've
unplugged the connector. I keep a set of "fake" connectors here,
like a VGA connector with 75 ohm resistors from R to ground, G to
ground, and B to ground. That creates a fake electrical load and
fools the OS into thinking a monitor is present. When I needed to
test dual monitor configurations, back when I owned only the one
LCD monitor, I used my "fake VGA" connector to trick the video
card into running dual monitor mode. So that basically gets the
video card to enable the outputs.
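
If you want to build one, the standard VGA (DE-15) pinout puts the
three video lines and their returns on these pins, and the 75 ohm
value comes from the standard video termination impedance:

    Pin 1 (Red)   ---[75 ohm]--- Pin 6 (Red return)
    Pin 2 (Green) ---[75 ohm]--- Pin 7 (Green return)
    Pin 3 (Blue)  ---[75 ohm]--- Pin 8 (Blue return)

Quarter-watt resistors are plenty; the card drives only about 0.7V
peak into each load, which is a few milliwatts.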

On DVI, faking would need something like 100 ohm differential
termination on the high speed signals. I don't have any solder-tail
DVI connectors to build one like that, but I assume it would work
as well. You can also fake composite and S-Video with resistors.
(Some of my "fake monitor" experiments use composite video
terminators.)

(Screen capture: two video cards, one real monitor, three "fake"
monitor connections. Monitor resolutions (widths): 1280 - 800 -
1280 - 640.)
http://img341.imageshack.us/img341/6043/extendednvidia128080012.gif

The problem with that is that the video driver will limit the
resolution choices to "safe" values, which means I probably can't
get 1600x1200 that way. In the era before multisync monitors
became available, there were resolutions which worked "most of the
time" and were considered "safe" by video card driver writers.
Without information coming from the monitor, the driver may decide
to limit the resolution and refresh rate to those safe values.
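
If you want to see exactly which modes the driver is offering in
the headless state, you can enumerate the mode list through the
Win32 API. A minimal Python sketch using ctypes (the structure is
a truncated DEVMODEW, just enough for the display fields; run it
over TeamViewer and compare against the list you get with a real
monitor attached):

    import ctypes
    from ctypes import wintypes

    user32 = ctypes.windll.user32

    class DEVMODEW(ctypes.Structure):
        # Truncated DEVMODEW: fields up through dmDisplayFrequency.
        # dmSize tells the API how big our copy is, so this is legal.
        _fields_ = [
            ("dmDeviceName",         ctypes.c_wchar * 32),
            ("dmSpecVersion",        wintypes.WORD),
            ("dmDriverVersion",      wintypes.WORD),
            ("dmSize",               wintypes.WORD),
            ("dmDriverExtra",        wintypes.WORD),
            ("dmFields",             wintypes.DWORD),
            ("dmPositionX",          wintypes.LONG),  # display union, flattened
            ("dmPositionY",          wintypes.LONG),
            ("dmDisplayOrientation", wintypes.DWORD),
            ("dmDisplayFixedOutput", wintypes.DWORD),
            ("dmColor",              ctypes.c_short),
            ("dmDuplex",             ctypes.c_short),
            ("dmYResolution",        ctypes.c_short),
            ("dmTTOption",           ctypes.c_short),
            ("dmCollate",            ctypes.c_short),
            ("dmFormName",           ctypes.c_wchar * 32),
            ("dmLogPixels",          wintypes.WORD),
            ("dmBitsPerPel",         wintypes.DWORD),
            ("dmPelsWidth",          wintypes.DWORD),
            ("dmPelsHeight",         wintypes.DWORD),
            ("dmDisplayFlags",       wintypes.DWORD),
            ("dmDisplayFrequency",   wintypes.DWORD),
        ]

    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    i = 0
    # Mode 0, 1, 2 ... until the API runs out of modes.
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        print("%4d x %4d  %2d bpp  %3d Hz" % (dm.dmPelsWidth,
              dm.dmPelsHeight, dm.dmBitsPerPel, dm.dmDisplayFrequency))
        i += 1

Forcing a mode would be ChangeDisplaySettingsW with the width and
height fields filled in and CDS_UPDATEREGISTRY set, but the driver
will still refuse anything it considers unsafe, which is the whole
problem here.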

There are hardware boxes which will copy the EDID from a monitor
and then present that information (read-back mode) when the monitor
is not present. So it's possible to fool the computer that a monitor
is present, entirely in hardware. This Gefen box is an example of
the technology (a 2Kbit EEPROM in a fancy metal box). Off-brand
copycats may be available for less money. I don't consider this
approach to be cost effective for your application; it's aimed at
people with projection TV sets, where the set lacks EDID and the
projector costs many times what the little EDID box does.
http://www.gefen.com/kvm/dproduct.jsp?prod_id=4714
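
Incidentally, you can look at the same EDID blob the Gefen box
would be serving: Windows caches the EDID it last read from each
monitor in the registry. A Python sketch to dump and sanity-check
them (the path is the standard monitor enumeration key; the header
and checksum rules come from the EDID 1.x spec):

    import winreg

    ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

    def edids():
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as disp:
            for i in range(winreg.QueryInfoKey(disp)[0]):
                model = winreg.EnumKey(disp, i)
                with winreg.OpenKey(disp, model) as mkey:
                    for j in range(winreg.QueryInfoKey(mkey)[0]):
                        inst = winreg.EnumKey(mkey, j)
                        try:
                            sub = inst + r"\Device Parameters"
                            with winreg.OpenKey(mkey, sub) as params:
                                blob, _ = winreg.QueryValueEx(params, "EDID")
                                yield model, blob
                        except OSError:
                            pass  # instance with no cached EDID

    for model, blob in edids():
        base = blob[:128]
        header_ok = base[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00"
        sum_ok = sum(base) % 256 == 0  # base block sums to 0 mod 256
        print(model, len(blob), "bytes, header", header_ok,
              "checksum", sum_ok)

A 256 byte blob (base block plus one extension) is exactly the
2Kbit the Gefen EEPROM holds.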

The Macintosh had a simple scheme for forcing resolution: the pins
on the connector included an encoding for the resolution. I used to
use a "dip switch box" to force the resolution to a value my CRT
monitor could handle. The PC also had such a scheme, before the
EDID serial clock and data interface came along, but I don't think
it could represent as many choices. Resolution forcing by sense
pins died out a long time ago, and I don't know whether video cards
still listen to it. In any case, I doubt the resolution values
would be useful ones.

So, test PowerStrip and see if it does anything useful for you.
I "burned up" my evaluation years ago, and I'd probably have to
do a clean install to be able to test it again.

I've tried to find the resolution settings in the Registry, but
it's a maze in there. The programmer at EnTech Taiwan probably
knows what to do, but I don't.
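
One landmark I can offer: on machines here, per-adapter defaults
appear under HKLM\SYSTEM\CurrentControlSet\Control\Video\{adapter
GUID}\000N. A read-only Python sketch to dump them (the
DefaultSettings.* value names are from memory of XP/Win7 era
machines, so treat them as an assumption and verify in regedit):

    import winreg

    ROOT = r"SYSTEM\CurrentControlSet\Control\Video"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as video:
        for i in range(winreg.QueryInfoKey(video)[0]):
            guid = winreg.EnumKey(video, i)
            for n in ("0000", "0001"):
                try:
                    with winreg.OpenKey(video, guid + "\\" + n) as k:
                        x, _ = winreg.QueryValueEx(
                            k, "DefaultSettings.XResolution")
                        y, _ = winreg.QueryValueEx(
                            k, "DefaultSettings.YResolution")
                        print(guid, n, "%dx%d" % (x, y))
                except OSError:
                    pass  # subkey or values not present

Even if you find the right key, whether the driver honors it with
no monitor attached is another question.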
Paul