Windows (and more specifically, the video card) generally must actually detect a connected device, and sometimes requires a reboot, before it will show the second monitor in the Display properties.
Also, be aware that CrossFire and SLI modes (ATI / NVIDIA) don't actually support dual monitors; they combine all the processing power to drive one display only.
If a second monitor isn't detected, the video driver usually doesn't allocate any resources to it, primarily for power savings and efficiency.
Think about it: a typical, low-end flat-screen display today shows true color at 1280 x 1024 at 60 Hz. That means the video card has to process 5,242,880 bytes every 60th of a second,
or 314,572,800 bytes per second. That's a lot of bandwidth! Why try to drive two such displays if only one is present?
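
If you want to check that arithmetic yourself, here's a quick sketch. It assumes 32-bit true color (4 bytes per pixel), which is where the numbers above come from:

```python
# Back-of-the-envelope framebuffer bandwidth for one display,
# assuming 32-bit true color (4 bytes per pixel) at 60 Hz.
width, height, bytes_per_pixel, refresh_hz = 1280, 1024, 4, 60

bytes_per_frame = width * height * bytes_per_pixel   # 5,242,880 bytes
bytes_per_second = bytes_per_frame * refresh_hz      # 314,572,800 bytes

print(f"Per frame:  {bytes_per_frame:,} bytes")
print(f"Per second: {bytes_per_second:,} bytes (~{bytes_per_second / 2**20:.0f} MiB/s)")
```

That works out to about 300 MiB/s of raw framebuffer traffic per display, before any 3D work is done.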
Additional note: that is also why very large displays (roughly anything beyond 1920 x 1200 pixels at 60 Hz) require a dual-link DVI connection to drive a single panel. Even DVI runs out of bandwidth over a single link.
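
To make that ceiling concrete, here's a rough sketch comparing raw pixel rates against single-link DVI's 165 MHz TMDS pixel-clock limit. Note the rates below ignore blanking intervals, so real pixel clocks run somewhat higher than shown:

```python
# Which modes exceed single-link DVI's 165 MHz pixel-clock ceiling?
# Raw rates ignore blanking, so actual requirements are a bit higher.
SINGLE_LINK_DVI_MHZ = 165

modes = [(1280, 1024, 60), (1920, 1200, 60), (2560, 1600, 60)]
for w, h, hz in modes:
    pixel_clock_mhz = w * h * hz / 1e6
    link = "single link" if pixel_clock_mhz <= SINGLE_LINK_DVI_MHZ else "dual link"
    print(f"{w}x{h} @ {hz} Hz: ~{pixel_clock_mhz:.0f} MHz -> {link}")
```

A 2560 x 1600 panel at 60 Hz needs roughly 246 MHz, well past what a single link can carry, which is exactly why those monitors ship with dual-link DVI.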