Question

I'm trying to find out whether it is possible to detect if the monitor is on or off.

This is what I've tried:

    import java.awt.GraphicsDevice;
    import java.awt.GraphicsEnvironment;

    // Count the raster screens that the local graphics environment reports.
    GraphicsEnvironment g = GraphicsEnvironment.getLocalGraphicsEnvironment();
    GraphicsDevice[] devices = g.getScreenDevices();
    int monitorCount = 0;
    for (GraphicsDevice device : devices) {
        if (device.getType() == GraphicsDevice.TYPE_RASTER_SCREEN) {
            monitorCount++;
        }
    }
    if (monitorCount == 0) {
        System.out.println("Monitor is OFF");
    } else {
        System.out.println("Monitor is ON");
    }

But even if I turn the monitor off (or disconnect it from power entirely), it still counts one monitor.

How can I tell whether the monitor is OFF?

Solution

This is certainly not possible in cross-platform Java, and to be honest it isn't really possible in a reliable sense even if you resort to native code.

The (unreliable) way to do this natively on Windows would be to use GetDevicePowerState, which you'll find in kernel32.dll. However, from experiments I did with this function a while back, I can say it definitely doesn't work with every monitor, and obviously, even if it were reliable, it would be a Windows-only solution.

If you do want to go down this route, bearing in mind the above limitations, use MonitorFromPoint to grab a handle to the primary monitor (pass in (0,0) as the point and use the MONITOR_DEFAULTTOPRIMARY flag).
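
For illustration, here is a minimal sketch of how that could be called from Java using JNA (a third-party library, not part of the JDK). The interface mappings and the class name MonitorPowerCheck are hand-rolled assumptions for this example; it is Windows-only and, as warned above, the answer it gives cannot be trusted for every monitor.

    import com.sun.jna.Native;
    import com.sun.jna.Pointer;
    import com.sun.jna.Structure;
    import com.sun.jna.ptr.IntByReference;
    import com.sun.jna.win32.StdCallLibrary;

    public class MonitorPowerCheck {

        // POINT structure from winuser.h (two 32-bit values).
        @Structure.FieldOrder({"x", "y"})
        public static class POINT extends Structure {
            public int x;
            public int y;
            public static class ByValue extends POINT implements Structure.ByValue {}
        }

        // Hand-rolled mappings for just the two calls we need.
        public interface User32 extends StdCallLibrary {
            User32 INSTANCE = Native.load("user32", User32.class);
            // HMONITOR MonitorFromPoint(POINT pt, DWORD dwFlags); handle returned as an opaque Pointer
            Pointer MonitorFromPoint(POINT.ByValue pt, int dwFlags);
        }

        public interface Kernel32 extends StdCallLibrary {
            Kernel32 INSTANCE = Native.load("kernel32", Kernel32.class);
            // BOOL GetDevicePowerState(HANDLE hDevice, BOOL *pfOn);
            boolean GetDevicePowerState(Pointer hDevice, IntByReference pfOn);
        }

        private static final int MONITOR_DEFAULTTOPRIMARY = 1; // from winuser.h

        public static void main(String[] args) {
            // The point (0,0) always lies on the primary monitor.
            POINT.ByValue origin = new POINT.ByValue(); // x = y = 0
            Pointer hMonitor = User32.INSTANCE.MonitorFromPoint(origin, MONITOR_DEFAULTTOPRIMARY);

            IntByReference on = new IntByReference();
            if (Kernel32.INSTANCE.GetDevicePowerState(hMonitor, on)) {
                System.out.println(on.getValue() != 0 ? "Monitor reports ON" : "Monitor reports OFF");
            } else {
                // FALSE just means the driver gave no answer, not that the monitor is off.
                System.out.println("GetDevicePowerState could not determine the state");
            }
        }
    }

Even when this runs, a failed call or an ON result on a switched-off monitor is common in practice; treat the output as a hint rather than a fact.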

OTHER TIPS

Some information may be in the operating system's hands, especially for laptops: the OS may get a notification when the lid is opened or closed. It is almost certainly not possible with a VGA connector; HDMI or DVI monitors may report something back to the OS.

You should search for OS-specific functions, perhaps something related to power management.

The only way to do it would be to have some sort of power usage monitoring device connected between the monitor and the outlet. Apart from that, I think even the computer can't tell if the monitor is on or off; only whether the signal lead is connected.

GraphicsEnvironment can only tell you the desktop arrangement that the user has configured at the OS level. It doesn't care whether a real monitor is displaying it, whether it's a remote desktop connection, or whether there's nothing at all.
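
As a quick illustration, dumping what the local graphics environment reports shows only that configured arrangement (device IDs and bounds); in most setups the output stays the same when a monitor is merely switched off. This is a small self-contained sketch, with ScreenReport as a placeholder class name:

    import java.awt.GraphicsDevice;
    import java.awt.GraphicsEnvironment;
    import java.awt.Rectangle;

    public class ScreenReport {
        public static void main(String[] args) {
            GraphicsEnvironment ge = GraphicsEnvironment.getLocalGraphicsEnvironment();
            for (GraphicsDevice device : ge.getScreenDevices()) {
                Rectangle bounds = device.getDefaultConfiguration().getBounds();
                // This reflects the OS-level desktop configuration only; it says
                // nothing about whether a physical display is actually showing it.
                System.out.println(device.getIDstring() + " -> " + bounds);
            }
        }
    }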

The whole idea is flawed.

There is no way to tell for sure whether a monitor is connected to, let's say, a VGA port, since it works one way (output only). There is a mechanism added to VGA that lets the monitor tell the computer about its capabilities, but it is completely optional.

It's different with HDMI (the devices actually need to talk to each other), but that doesn't necessarily mean a monitor is connected to the port, even if the graphics-card end believes so. It could just as well be a recording device or anything other than a monitor.

The only case where one could reliably tell whether the display is on is when the display is built in and controlled by the computer itself (e.g. a laptop). Even then, the information is device-specific and may not be available through any OS calls.

Even if you manage to get an indication from the OS (for example through the Windows API, as suggested above), you can never rely on it being correct. It will be the OS's best guess, and that guess will still be systematically wrong in some configurations.

Licensed under: CC-BY-SA with attribution