WPF touchscreen monitor touch events detection issue


I am working on making my WPF application touch-enabled. The app runs on a secondary monitor that is a touchscreen.

I am facing an issue: all touch events are being interpreted as mouse (click) events. Using the code below (generated via GPT-4), I tried enumerating all input devices.
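For context, WPF promotes unhandled touch input to mouse events, which would explain the clicks I am seeing. A minimal sketch of what I understand should suppress that promotion (the window class and handler are illustrative, not my actual code):

```csharp
using System.Windows;
using System.Windows.Input;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // Handle touch before WPF promotes it to mouse input.
        // Marking the event as handled stops the touch-to-mouse promotion.
        this.PreviewTouchDown += (sender, e) =>
        {
            TouchPoint p = e.GetTouchPoint(this);
            System.Diagnostics.Debug.WriteLine($"Touch down at {p.Position}");
            e.Handled = true; // suppress the synthesized mouse click
        };
    }
}
```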

Now the weird thing is that when I step through the code below line by line in the debugger, touch events work as expected; however, when I run it normally (i.e. without breakpoints or debugging), touch events do not work.

What could be the reason?

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
struct RAWINPUTDEVICELIST
{
    public IntPtr hDevice;
    public uint dwType;
}

[DllImport("user32.dll")]
static extern uint GetRawInputDeviceList(IntPtr pRawInputDeviceList, ref uint puiNumDevices, uint cbSize);


uint deviceCount = 0;
uint size = (uint)Marshal.SizeOf(typeof(RAWINPUTDEVICELIST));

// First call with IntPtr.Zero: returns the device count in deviceCount
if (GetRawInputDeviceList(IntPtr.Zero, ref deviceCount, size) == 0)
{
    // Allocate unmanaged memory for the device array
    IntPtr pRawInputDeviceList = Marshal.AllocHGlobal((int)(size * deviceCount));
    try
    {
        // Second call fills the allocated buffer with the device list
        GetRawInputDeviceList(pRawInputDeviceList, ref deviceCount, size);

        for (int i = 0; i < deviceCount; i++)
        {
            // Marshal each RAWINPUTDEVICELIST struct out of the buffer
            RAWINPUTDEVICELIST rid = Marshal.PtrToStructure<RAWINPUTDEVICELIST>(
                IntPtr.Add(pRawInputDeviceList, (int)(size * i)));

            Debug.WriteLine($"Device Handle: {rid.hDevice}, Type: {rid.dwType}");
        }
    }
    finally
    {
        // Free the unmanaged buffer (was commented out originally)
        Marshal.FreeHGlobal(pRawInputDeviceList);
    }
}
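As a sanity check (not part of my original code), I understand WPF also exposes the digitizers it has recognized through the managed `Tablet` class, which avoids P/Invoke entirely; something like:

```csharp
using System.Diagnostics;
using System.Windows.Input;

// Enumerate the tablet/touch devices WPF itself has detected.
// If the touchscreen does not appear here, WPF will not raise
// touch events for it, regardless of what raw input reports.
foreach (TabletDevice tablet in Tablet.TabletDevices)
{
    Debug.WriteLine($"Name: {tablet.Name}, Type: {tablet.Type}");
}
```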
