Programmatically Changing the Monitor for Full-Screen Games in Unity
Introduction
In Unity, you can control which monitor a full-screen game launches on by using the Display and Screen APIs. This is particularly useful in multi-monitor setups, where you might want the game to start on a specific display. Below, we explore how to do this programmatically.
Using the Display API
Unity’s Display class represents a connected monitor, and you can access all connected displays through the Display.displays array. By default, Unity games start on the primary monitor (display 0), but you can activate the other displays and move your game onto one of them.
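Before moving anything, it can help to enumerate what is connected. The short sketch below logs each display's native resolution (the component name is illustrative):

using UnityEngine;

public class DisplayLister : MonoBehaviour {
    void Awake() {
        // Log every connected display with its native (system) resolution.
        for (int i = 0; i < Display.displays.Length; i++) {
            Display d = Display.displays[i];
            Debug.Log($"Display {i}: {d.systemWidth}x{d.systemHeight}");
        }
    }
}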
The component below activates any additional displays and then moves the main window onto the second monitor before applying a full-screen resolution. Note that Screen.GetDisplayLayout() and Screen.MoveMainWindowTo() require Unity 2021.2 or newer.

using System.Collections.Generic;
using UnityEngine;

public class MonitorManager : MonoBehaviour {
    void Start() {
        // Activate additional displays; display 0 (the primary) is always active.
        for (int i = 1; i < Display.displays.Length; i++) {
            Display.displays[i].Activate();
        }
        // Query the OS monitor layout and move the main window to the
        // second monitor, if present. The move completes asynchronously.
        var layout = new List<DisplayInfo>();
        Screen.GetDisplayLayout(layout);
        if (layout.Count > 1) {
            Screen.MoveMainWindowTo(layout[1], Vector2Int.zero);
        }
        // Apply the resolution and full-screen mode on the current monitor.
        Screen.SetResolution(1920, 1080, FullScreenMode.FullScreenWindow);
    }
}
Explanation
- Activate additional displays: By default, only the primary display is active. Loop through Display.displays and call Activate() on each extra display to make it available for rendering.
- Move the main window: Screen.GetDisplayLayout() fills a list of DisplayInfo structs describing the connected monitors, and Screen.MoveMainWindowTo() moves the game window onto the chosen one. The move completes asynchronously.
- Set the resolution: Screen.SetResolution() applies the resolution and full-screen mode on the monitor the window currently occupies. Its optional fourth parameter is a preferred refresh rate, not a display index.
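Activating extra displays is most useful for multi-display rendering, where each camera is routed to its own monitor through Camera.targetDisplay. Here is a minimal sketch; the secondaryCamera field is an assumed reference you would assign in the Inspector:

using UnityEngine;

public class SecondScreenCamera : MonoBehaviour {
    // Assumed reference: assign a second camera in the Inspector.
    public Camera secondaryCamera;

    void Start() {
        if (Display.displays.Length > 1) {
            Display.displays[1].Activate();
            // targetDisplay is 0-based and maps to Display.displays indices.
            secondaryCamera.targetDisplay = 1;
        }
    }
}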
Considerations
- Test Across Systems: Multi-monitor setups vary between users, and Display.displays always reports a single display inside the Editor, so test standalone builds on several configurations.
- Graphics API: Some graphics APIs may have different capabilities or limitations, so consider checking Unity's documentation for specifics related to DirectX or Vulkan if you're targeting those APIs.
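If you are on an older Unity version without Screen.MoveMainWindowTo(), the monitor can also be chosen before launch. Unity's standalone players accept a -monitor N command-line argument (1-based), and the player is widely reported to read a "UnitySelectMonitor" PlayerPrefs entry at startup; treat that key name as a community-reported assumption and verify it on your Unity version:

using UnityEngine;

public static class MonitorPreference {
    // Persist the preferred monitor (0-based) for the next launch.
    // "UnitySelectMonitor" is a community-reported key, not an official API.
    public static void SaveTargetMonitor(int index) {
        PlayerPrefs.SetInt("UnitySelectMonitor", index);
        PlayerPrefs.Save();
    }
}

For example, launching a build with MyGame.exe -monitor 2 opens it on the second monitor.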
Conclusion
By combining Unity's Display and Screen APIs, developers can control which monitor their full-screen game launches on, providing a smoother user experience in multi-monitor setups.