I think it is GetScreenPosition() that is returning the unexpected values. Now that I am looking at and using the wxDisplay routines, I am getting a better picture of what is going on.
Using wxDisplay, I am getting these values for the monitor layout shown above (the loop I used to dump them is sketched after the list):
monitor 1: origin (0, 0) size (3840, 2160); client area: (0, 0, 3840, 2122)
monitor 2: origin (4800, 0) size (2100, 1313); client area: (4800, 0, 2100, 1275)
monitor 3: origin (1128, -1350) size (2400, 1350); client area: (1128, -1350, 2400, 1313)
monitor 4: origin (-3200, -58) size (3200, 1350); client area: (-3200, -58, 3200, 1313)
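For reference, here is roughly the loop I am using to dump those values (a minimal sketch; it assumes a running wxWidgets app, since wxDisplay needs one):

```cpp
#include <wx/display.h>
#include <wx/log.h>

// Dump the geometry and client area of every attached display,
// numbered 1-based to match the list above.
void DumpDisplays()
{
    for ( unsigned i = 0; i < wxDisplay::GetCount(); ++i )
    {
        wxDisplay display(i);
        const wxRect geom = display.GetGeometry();      // full monitor rect
        const wxRect client = display.GetClientArea();  // minus taskbar etc.
        wxLogMessage("monitor %u: origin (%d, %d) size (%d, %d); "
                     "client area: (%d, %d, %d, %d)",
                     i + 1,
                     geom.x, geom.y, geom.width, geom.height,
                     client.x, client.y, client.width, client.height);
    }
}
```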
The monitor sizes returned don't look correct.
Monitor 2 has its resolution set to 1680x1050 (confirmed via VideoModeToText(wxDisplay.GetCurrentMode()) and by checking the OS Display Settings), yet the size returned by wxDisplay.GetGeometry() says that display is 2100 x 1313. Ignoring the height for a moment, how does the 1680 width become 2100? Checking my Win10 Display settings, that monitor is indeed 1680x1050 and has no display scaling. The numbers are exactly what 125% display scaling would produce (1680 x 1.25 = 2100, and 1050 x 1.25 = 1312.5, which rounds to 1313), but that monitor is not set for 125% scaling.
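To make the mismatch easier to see, this cross-check (a sketch, under the assumption that GetCurrentMode() keeps returning the true mode) prints the implied scale factor per monitor by dividing the geometry width by the video mode width:

```cpp
#include <wx/display.h>
#include <wx/log.h>

// Compare each display's reported geometry against its current video
// mode; the ratio is the scaling factor being applied somewhere.
void DumpImpliedScale()
{
    for ( unsigned i = 0; i < wxDisplay::GetCount(); ++i )
    {
        wxDisplay display(i);
        const wxVideoMode mode = display.GetCurrentMode();
        const wxRect geom = display.GetGeometry();
        if ( mode.GetWidth() > 0 )  // GetCurrentMode() can fail
        {
            wxLogMessage("monitor %u: mode %dx%d, geometry %dx%d, "
                         "implied scale %.2f",
                         i + 1, mode.GetWidth(), mode.GetHeight(),
                         geom.width, geom.height,
                         double(geom.width) / mode.GetWidth());
        }
    }
}
```

On monitor 2 that should print an implied scale of 1.25 (2100 / 1680), i.e. exactly the phantom 125% described above.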
The funny thing is, monitor 1 is the only display whose reported sizes fit its set resolution, yet that monitor is the one actually set for 125% display scaling; by that logic it should be the one reporting inflated sizes like the other monitors.
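One theory I want to rule out (an assumption on my part, not something I have confirmed yet): if the process is only system-DPI-aware rather than per-monitor-DPI-aware, Windows virtualizes coordinates on every monitor whose scaling differs from the primary's, which would fit this pattern, since the 125% primary matches the system DPI while the 100% monitors get inflated by 1.25. A quick Win32 check of the awareness level (sketch; GetProcessDpiAwareness needs Windows 8.1+ and linking against Shcore.lib):

```cpp
#include <windows.h>
#include <shellscalingapi.h>  // PROCESS_DPI_AWARENESS; link Shcore.lib
#include <cstdio>

// Report the process DPI-awareness level. Anything less than
// PROCESS_PER_MONITOR_DPI_AWARE means Windows rescales coordinates on
// monitors whose DPI differs from the system (primary) DPI.
void ReportDpiAwareness()
{
    PROCESS_DPI_AWARENESS awareness = PROCESS_DPI_UNAWARE;
    if ( SUCCEEDED(GetProcessDpiAwareness(nullptr, &awareness)) )
    {
        switch ( awareness )
        {
            case PROCESS_DPI_UNAWARE:
                std::puts("DPI unaware: all coordinates virtualized");
                break;
            case PROCESS_SYSTEM_DPI_AWARE:
                std::puts("system DPI aware: monitors whose DPI differs "
                          "from the primary's are virtualized");
                break;
            case PROCESS_PER_MONITOR_DPI_AWARE:
                std::puts("per-monitor DPI aware: raw pixel coordinates");
                break;
        }
    }
}
```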
I'm still investigating and trying to figure out a solution. Just replying so you know I haven't given up on this issue.