Alright, normally my 17" monitor (a Dell ES-17 CRT) runs at 1152x870, 75 Hz, in millions of colors. But today, after the machine had been off all night, the login screen came up at 800x600, which is weird. When I logged in, though, I was greeted with a surprise: it was running at 1400x1050, which is what my old 15" CRT did. Then I went into the Monitors control panel and, to my surprise, I had options for 1600x1200 at up to 85 Hz, 1900x1050, and a very high 1900x1600, which made my monitor flash on and off (an error, according to the manual). So I decided to run it at 1600x1200, which was really nice. I looked into my video card, and the documentation says it can do 1600x1200, so I know the hardware is capable.
Well, after running it like that for a few hours, I shut it down. Then my mother got on and logged in. She said that when she went to launch an app, the monitor started flashing again, this time for longer than ten seconds (the point at which it had previously defaulted back to the prior resolution). This time I was forced to three-finger salute it, and it defaulted back to 1152x870, with no higher resolutions available.
Is this just a glitch, or is there some haxie to ratchet it back up? I know that under 7.5.1 through 9.2.2, I could just hold Option while opening the Monitors control panel, and it would unlock all the resolutions supported by the OS. Is there something similar in OS X? And yes, I already tried Option with the Monitors control panel; the Dark Lords of OS X changed it, like they changed almost everything else.
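In case it helps: I haven't found a haxie either, but the Quartz developer docs describe a CGDisplayAvailableModes call that is supposed to list every mode the OS is willing to offer for a display. Here's a quick C sketch I put together from those docs (untested on my machine, so treat it as a guess, not gospel):

  /* listmodes.c - print every display mode Quartz reports for the main display.
     Build: cc listmodes.c -o listmodes -framework ApplicationServices */
  #include <ApplicationServices/ApplicationServices.h>
  #include <stdio.h>

  /* Pull a numeric value out of a mode dictionary, tolerating missing keys. */
  static double getnum(CFDictionaryRef dict, CFStringRef key)
  {
      double value = 0.0;
      CFNumberRef num = CFDictionaryGetValue(dict, key);
      if (num)
          CFNumberGetValue(num, kCFNumberDoubleType, &value);
      return value;
  }

  int main(void)
  {
      /* One CFDictionary per mode the system is willing to offer. */
      CFArrayRef modes = CGDisplayAvailableModes(kCGDirectMainDisplay);
      CFIndex i, count;

      if (modes == NULL) {
          fprintf(stderr, "no mode list returned\n");
          return 1;
      }
      count = CFArrayGetCount(modes);
      for (i = 0; i < count; i++) {
          CFDictionaryRef mode = CFArrayGetValueAtIndex(modes, i);
          printf("%.0fx%.0f @ %.0f Hz\n",
                 getnum(mode, kCGDisplayWidth),
                 getnum(mode, kCGDisplayHeight),
                 getnum(mode, kCGDisplayRefreshRate));
      }
      return 0;
  }

If 1600x1200 still shows up in that list after the glitch clears, at least I'd know the OS still believes the monitor can do it, even if the control panel won't show it.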