I'm currently using the following code to perform a mouse double-click.
[DllImport("user32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall)]
public static extern void mouse_event(uint dwFlags, uint dx, uint dy, uint cButtons, uint dwExtraInfo);
public static void DoubleClick(AutomationElement element)
{
Point p = element.GetClickablePoint();
Cursor.Position = new System.Drawing.Point((int)p.X, (int)p.Y);
mouse_event(0x02 | 0x04, (uint)p.X, (uint)p.Y, 0, 0);
mouse_event(0x02 | 0x04, (uint)p.X, (uint)p.Y, 0, 0);
}
However, I believe this is leaving resources open that cause my automated tests to not exit cleanly, which leads to crashes. Plus, it requires the mouse to actually be on the clickable point, which has the potential to cause issues if the tester moves the mouse at the wrong time. Is there a better way to do this?
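For what it's worth, since the code already has an AutomationElement, one alternative worth sketching (an assumption on my part, not a drop-in replacement: many controls that react to a double-click do not expose InvokePattern) is to drive the control through UI Automation instead of moving the cursor at all:
using System.Windows.Automation;
public static void InvokeInsteadOfClicking(AutomationElement element)
{
    // Uses the element's InvokePattern, so no cursor movement is required
    // and a stray mouse move by the tester cannot break the action.
    object patternObj;
    if (element.TryGetCurrentPattern(InvokePattern.Pattern, out patternObj))
    {
        ((InvokePattern)patternObj).Invoke();
    }
}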
I am trying to make a makeshift onscreen keyboard for Windows 10 and need the background to be transparent, to make it more convenient for the user (the keys are already transparent). I, however, have no idea how to make the background transparent.
Any help would be greatly appreciated.
I believe that I am essentially looking for an updated version of the code in this thread, shown below:
using System;
using System.Runtime.InteropServices;
using UnityEngine;
public class TransparentWindow : MonoBehaviour
{
[SerializeField]
private Material m_Material;
private struct MARGINS
{
public int cxLeftWidth;
public int cxRightWidth;
public int cyTopHeight;
public int cyBottomHeight;
}
[DllImport("user32.dll")]
private static extern IntPtr GetActiveWindow();
[DllImport("user32.dll")]
private static extern int SetWindowLong(IntPtr hWnd, int nIndex, uint dwNewLong);
[DllImport("user32.dll")]
static extern bool ShowWindowAsync(IntPtr hWnd, int nCmdShow);
[DllImport("user32.dll", EntryPoint = "SetLayeredWindowAttributes")]
static extern int SetLayeredWindowAttributes(IntPtr hwnd, int crKey, byte bAlpha, int dwFlags);
[DllImport("user32.dll", EntryPoint = "SetWindowPos")]
private static extern int SetWindowPos(IntPtr hwnd, int hwndInsertAfter, int x, int y, int cx, int cy, int uFlags);
[DllImport("Dwmapi.dll")]
private static extern uint DwmExtendFrameIntoClientArea(IntPtr hWnd, ref MARGINS margins);
const int GWL_STYLE = -16;
const uint WS_POPUP = 0x80000000;
const uint WS_VISIBLE = 0x10000000;
const int HWND_TOPMOST = -1;
void Start()
{
// You really don't want to enable this in the editor, but it works there.
int fWidth = Screen.width;
int fHeight = Screen.height;
var margins = new MARGINS() { cxLeftWidth = -1 };
var hwnd = GetActiveWindow();
SetWindowLong(hwnd, GWL_STYLE, WS_POPUP | WS_VISIBLE);
// Transparent windows with click through
SetWindowLong(hwnd, -20, 524288 | 32); // GWL_EXSTYLE = -20; WS_EX_LAYERED = 0x80000 (524288); WS_EX_TRANSPARENT = 0x20 (32)
SetLayeredWindowAttributes(hwnd, 0, 255, 2); // bAlpha = 255 (fully opaque); LWA_ALPHA = 2
SetWindowPos(hwnd, HWND_TOPMOST, 0, 0, fWidth, fHeight, 32 | 64); //SWP_FRAMECHANGED = 0x0020 (32); //SWP_SHOWWINDOW = 0x0040 (64)
DwmExtendFrameIntoClientArea(hwnd, ref margins);
}
void OnRenderImage(RenderTexture from, RenderTexture to)
{
Graphics.Blit(from, to, m_Material);
}
}
The code given did not work, so I assume that it is outdated. I have no idea how to update it myself, since it is a bit outside my skill set. When I add the script to Unity, it just says that there are errors in the code and that it is not a valid script. When I open the script, however, no errors appear.
I expect to be able to have a relatively good view of whatever is behind my keyboard, like my desktop, but I actually just see a black plane.
Update:
So apparently the error message was caused by my script not having the same name as my class. I spent over 4 hours yesterday trying to fix that error message, and this naming mismatch was the cause :(. Thanks Ruzihm. Anyway, now that the error message is gone, when I run or build the program all I see is my transparent window material rendered as a dark pink. I then changed my Unity version back to 2018.2.16f1, with no success. Then I removed the #if !UNITY_EDITOR directive, which got the transparency to work perfectly in the editor, but not when I build it. Note that click-through does work both when I build it and when I run it in the editor.
As discovered in the comments, the problem was fixed when the camera's clear flags were set to solid color and the pink transparent window material was replaced with a white transparent material.
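For reference, a minimal sketch of that camera setup done from script (my own illustration, not code from the thread; the m_Camera field is a placeholder you would assign in the inspector):
using UnityEngine;
public class TransparentCameraSetup : MonoBehaviour
{
    [SerializeField]
    private Camera m_Camera;
    void Awake()
    {
        // Clear to a solid, fully transparent color so anything the camera
        // does not draw stays transparent in the layered window.
        m_Camera.clearFlags = CameraClearFlags.SolidColor;
        m_Camera.backgroundColor = new Color(0f, 0f, 0f, 0f);
    }
}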
So I found an example from an answer provided here
There was an answer that gave this example of code to move the Notepad window to the top left corner of the screen. I tried it and it worked fine. I then tried it on a small project I am working on, and I couldn't move that window.
NOTE: I did change the "Notepad" to the name at the top of the window I wanted to move.
using System;
using System.Runtime.InteropServices; // For the P/Invoke signatures.
public static class PositionWindowDemo
{
// P/Invoke declarations.
[DllImport("user32.dll", SetLastError = true)]
static extern IntPtr FindWindow(string lpClassName, string lpWindowName);
[DllImport("user32.dll", SetLastError = true)]
static extern bool SetWindowPos(IntPtr hWnd, IntPtr hWndInsertAfter, int X, int Y, int cx, int cy, uint uFlags);
const uint SWP_NOSIZE = 0x0001;
const uint SWP_NOZORDER = 0x0004;
public static void Main()
{
// Find (the first-in-Z-order) Notepad window.
IntPtr hWnd = FindWindow("Notepad", null);
// If found, position it.
if (hWnd != IntPtr.Zero)
{
// Move the window to (0,0) without changing its size or position
// in the Z order.
SetWindowPos(hWnd, IntPtr.Zero, 0, 0, 0, 0, SWP_NOSIZE | SWP_NOZORDER);
}
}
}
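One aside worth noting (my own, not from the linked answer): the first argument to FindWindow is the window class name rather than the caption text, so matching on the text at the top of the window would look more like this, with a placeholder title:
// Match by window title instead of class name; pass null for the class.
IntPtr hWnd = FindWindow(null, "My Project Window");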
I will give an example. Consider Visual Studio and how it has the Solution Explorer window or the Output window, and I can drag them with the mouse, move them, or undock them. Would there be a way, in an application that hosts windows inside of it the way Visual Studio does, to get the position of those inner windows from a program?
I have seen many answers on here about moving a window or finding the active window, etc. However, I am not sure whether I will be able to access a sub-window that lives inside another application.
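As a rough sketch of what that could look like (the class and title strings below are placeholders, and docked tool windows in applications like Visual Studio are not always real child HWNDs, so treat this as an assumption rather than a guaranteed approach):
using System;
using System.Runtime.InteropServices;
public static class ChildWindowDemo
{
    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr FindWindow(string lpClassName, string lpWindowName);
    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr FindWindowEx(IntPtr hWndParent, IntPtr hWndChildAfter, string lpszClass, string lpszWindow);
    [StructLayout(LayoutKind.Sequential)]
    struct RECT { public int Left, Top, Right, Bottom; }
    [DllImport("user32.dll")]
    static extern bool GetWindowRect(IntPtr hWnd, out RECT lpRect);
    public static void Main()
    {
        // Placeholder names; the real class/title strings would have to be
        // discovered with a tool such as Spy++ or Inspect.
        IntPtr parent = FindWindow(null, "My Application");
        IntPtr child = FindWindowEx(parent, IntPtr.Zero, null, "Solution Explorer");
        RECT r;
        if (child != IntPtr.Zero && GetWindowRect(child, out r))
        {
            Console.WriteLine("Child window top-left: ({0}, {1})", r.Left, r.Top);
        }
    }
}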
Thanks
As described in the title, I've been trying to find a way to set the mouse coordinates using Cursor.Position = new Point(58, 128); and then, while holding the left mouse button down, drag in another (random) direction. For example, if I were to go onto Google Earth and set the cursor position to (0, 0), the cursor would then drag the map around. If anyone can help out I would surely appreciate it.
Thanks
Solution: floatas, thanks again for responding to this post. I spent yesterday and today trying to figure this out and I finally got it working. I will post my code in hopes this helps others out.
First of all you will need to import some functions.
To change cursor position:
[DllImport("user32.dll", EntryPoint = "SetCursorPos")]
[return: MarshalAs(UnmanagedType.Bool)]
public static extern bool SetCursorPos(
[In] int X,
[In] int Y);
To simulate mouse events:
[DllImport("user32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall)]
public static extern void mouse_event(
[In] uint dwFlags,
[In] uint dx,
[In] uint dy,
[In] int dwData,
[In] uint dwExtraInfo);
Possible mouse events:
public enum MouseEvents
{
MOUSEEVENTF_LEFTDOWN = 0x02,
MOUSEEVENTF_LEFTUP = 0x04,
MOUSEEVENTF_RIGHTDOWN = 0x08,
MOUSEEVENTF_RIGHTUP = 0x10,
MOUSEEVENTF_WHEEL = 0x0800,
}
You can send mouse down and mouse up in one call, simulating a click:
mouse_event((uint)MouseEvents.MOUSEEVENTF_LEFTDOWN | (uint)MouseEvents.MOUSEEVENTF_LEFTUP, X, Y, 0, 0);
I haven't tested this, but it should press the mouse button, drag, and release:
mouse_event((uint)MouseEvents.MOUSEEVENTF_LEFTDOWN, X, Y, 0, 0);
SetCursorPos((int)X+10, (int)Y+10);
mouse_event((uint)MouseEvents.MOUSEEVENTF_LEFTUP, X+10, Y+10, 0, 0);
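Putting the pieces together, here is a minimal, self-contained sketch of such a drag as a console program (the coordinates, step size, and Thread.Sleep pauses are assumptions added to give the target application time to register each move):
using System;
using System.Threading;
using System.Runtime.InteropServices;
class DragDemo
{
    [DllImport("user32.dll", EntryPoint = "SetCursorPos")]
    [return: MarshalAs(UnmanagedType.Bool)]
    public static extern bool SetCursorPos(int X, int Y);
    [DllImport("user32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall)]
    public static extern void mouse_event(uint dwFlags, uint dx, uint dy, int dwData, uint dwExtraInfo);
    const uint MOUSEEVENTF_LEFTDOWN = 0x02;
    const uint MOUSEEVENTF_LEFTUP = 0x04;
    static void Main()
    {
        // Move to the start point, press the left button, drag in small steps, release.
        SetCursorPos(100, 100);
        mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0);
        for (int i = 1; i <= 20; i++)
        {
            SetCursorPos(100 + i * 5, 100 + i * 5);
            Thread.Sleep(10); // let the target window process each move
        }
        mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, 0);
    }
}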
I'm working on a Windows Forms application in C#, .NET Framework 4 (32-bit).
I have a list that holds coords of the mouse, and I can capture them. So far so good.
But at some point, I want to go to those coords and left-click on them.
This is what it looks like right now:
for (int i = 0; i < coordsX.Count; i++)
{
Cursor.Position = new Point(coordsX[i], coordsY[i]);
Application.DoEvents();
Clicking.SendClick();
}
And the Clicking class:
class Clicking
{
private const UInt32 MOUSEEVENTF_LEFTDOWN = 0x0002;
private const UInt32 MOUSEEVENTF_LEFTUP = 0x0004;
private static extern void mouse_event(
UInt32 dwFlags, // motion and click options
UInt32 dx, // horizontal position or change
UInt32 dy, // vertical position or change
UInt32 dwData, // wheel movement
IntPtr dwExtraInfo // application-defined information
);
// public static void SendClick(Point location)
public static void SendClick()
{
// Cursor.Position = location;
mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, new System.IntPtr());
mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, new System.IntPtr());
}
}
But I'm getting this error:
Could not load type 'program.Clicking' from assembly 'program, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' because the method 'mouse_event' has no implementation (no RVA).
And I really don't understand what the problem is... Do you guys know what the problem is, or do you know a better way to do what I'm trying to do?
Have you included the following line?
[DllImport("user32.dll")]
static extern void mouse_event(uint dwFlags, uint dx, uint dy, uint dwData,
UIntPtr dwExtraInfo);
This will import the function mouse_event from the user32 DLL, which is what you are trying to use within your program. Currently your program does not know about this method within the DLL until you specify where it comes from.
The website PInvoke.net user32 Mouse Event is quite handy for the basics on this sort of thing.
The answer to Directing mouse events [DllImport(“user32.dll”)] click, double click will be of great help to your understanding as well.
The flags are the commands you want to send into the mouse_event function. In that example you can see that he is sending both mouse down and mouse up in the same call; this is fine because the mouse_event function will split those flags up and execute them consecutively.
Also note that this function has been superseded by SendInput; a good example of SendInput and SetMousePos can be found at this blog.
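To make that concrete, here is a minimal sketch of a left click sent through SendInput (my own example, not taken from the linked blog; the single-member INPUT layout below is the common simplified declaration rather than the full union from winuser.h):
using System;
using System.Runtime.InteropServices;
class SendInputClick
{
    [StructLayout(LayoutKind.Sequential)]
    struct MOUSEINPUT
    {
        public int dx;
        public int dy;
        public uint mouseData;
        public uint dwFlags;
        public uint time;
        public IntPtr dwExtraInfo;
    }
    [StructLayout(LayoutKind.Sequential)]
    struct INPUT
    {
        public uint type; // 0 = INPUT_MOUSE
        public MOUSEINPUT mi;
    }
    [DllImport("user32.dll", SetLastError = true)]
    static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);
    const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    const uint MOUSEEVENTF_LEFTUP = 0x0004;
    // Sends a left button down followed by a left button up at the current cursor position.
    public static void Click()
    {
        var inputs = new INPUT[]
        {
            new INPUT { type = 0, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTDOWN } },
            new INPUT { type = 0, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTUP } },
        };
        SendInput((uint)inputs.Length, inputs, Marshal.SizeOf(typeof(INPUT)));
    }
}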
I guess you are missing the following line
[DllImport("user32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall)]
I am writing a small program which can simulate mouse clicks at specified positions.
Using the Win32 API call mouse_event like so:
[DllImport("user32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall)]
public static extern void mouse_event(uint dwFlags, uint dx, uint dy, uint dwData, UIntPtr dwExtraInfo);
[Flags]
public enum MouseEventFlags : uint
{
LEFTDOWN = 0x00000002,
LEFTUP = 0x00000004,
MIDDLEDOWN = 0x00000020,
MIDDLEUP = 0x00000040,
MOVE = 0x00000001,
ABSOLUTE = 0x00008000,
RIGHTDOWN = 0x00000008,
RIGHTUP = 0x00000010
}
mouse_event((uint)(MouseEventFlags.LEFTDOWN | MouseEventFlags.LEFTUP), x, y, 0, UIntPtr.Zero);
Works perfectly fine, except when the mouse cursor is over a Flash application. Flash seems to ignore the simulated mouse click.
What could be the reason for this? And how do I fix it?
Thank you!
Try
mouse_event((uint)
(MouseEventFlags.LEFTDOWN | MouseEventFlags.LEFTUP | MouseEventFlags.ABSOLUTE),
x, y, 0, UIntPtr.Zero);
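One caveat (my addition, not from the original answer): with MOUSEEVENTF_ABSOLUTE set, mouse_event interprets dx and dy as normalized coordinates in the 0 to 65535 range rather than pixels, and the coordinates only take effect when MOVE is also set, so the pixel position may need to be scaled first, along these lines (the Screen.PrimaryScreen lookup assumes Windows Forms is referenced):
// Scale pixel coordinates to the 0-65535 range expected with MOUSEEVENTF_ABSOLUTE.
uint absX = (uint)(x * 65535 / Screen.PrimaryScreen.Bounds.Width);
uint absY = (uint)(y * 65535 / Screen.PrimaryScreen.Bounds.Height);
mouse_event((uint)(MouseEventFlags.MOVE | MouseEventFlags.ABSOLUTE | MouseEventFlags.LEFTDOWN | MouseEventFlags.LEFTUP), absX, absY, 0, UIntPtr.Zero);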
Also, for some strange reason I've had problems with the above P/Invoke calling convention; see Simulating a mouse button click in Windows.