It is fairly easy to detect whether a particular computer/browser supports touch. However, there are now devices available that accept both touch and mouse input. Users of such devices may use a combination of mouse, touch (and of course keyboard) to interact with web applications, and there is no way of predicting how they will interact with apps on these devices.
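For what it's worth, the detection part really is simple; a minimal sketch using plain browser APIs (nothing DHTMLX-specific) could look like this, though it only tells you what the hardware *can* do, not which input the user will actually choose:

```ts
// Capability detection only: a hybrid device will report true for both,
// which is exactly why detection alone doesn't solve the problem.
function supportsTouch(): boolean {
  return "ontouchstart" in window || navigator.maxTouchPoints > 0;
}

function supportsMouse(): boolean {
  // The "pointer" media feature reports the primary pointing device;
  // "fine" usually means a mouse or trackpad.
  return window.matchMedia("(pointer: fine)").matches;
}
```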
To solve this issue, all DHTMLX widgets need to be able to support all types of input. Is this kind of enhancement likely to happen anytime soon?
For my app, I particularly need DHTMLXTree and DHTMLXGrid to support drag and drop using both touch gestures and mouse.
Here’s an article that describes the issue in more detail and points out some pitfalls of half-baked solutions:
html5rocks.com/en/mobile/touchandmouse/
The problem is not detection, but differentiation between scroll and drag gestures. On a touch device there is no real difference between a vertical scroll and a vertical drag: in both cases you touch the screen and move your finger without releasing.
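For example, one common heuristic (just a sketch of the general technique, not how the DHTMLX widgets are implemented) is to treat a touch held in place for a moment as the start of a drag, and any earlier movement as a scroll:

```ts
const HOLD_MS = 500;      // hypothetical hold threshold before a drag starts
const MOVE_TOLERANCE = 8; // pixels of jitter still counted as "holding still"

function attachDragOrScroll(
  el: HTMLElement,
  onDragStart: (e: TouchEvent) => void
): void {
  let timer: number | undefined;
  let startX = 0;
  let startY = 0;

  el.addEventListener("touchstart", (e) => {
    const t = e.touches[0];
    startX = t.clientX;
    startY = t.clientY;
    // If the finger stays put long enough, interpret the gesture as a drag.
    timer = window.setTimeout(() => onDragStart(e), HOLD_MS);
  });

  el.addEventListener("touchmove", (e) => {
    const t = e.touches[0];
    // The finger moved before the hold expired: interpret it as a scroll
    // and cancel the pending drag.
    if (
      Math.abs(t.clientX - startX) > MOVE_TOLERANCE ||
      Math.abs(t.clientY - startY) > MOVE_TOLERANCE
    ) {
      window.clearTimeout(timer);
    }
  });

  el.addEventListener("touchend", () => window.clearTimeout(timer));
}
```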
On top of that, each widget needs some configuration in the code that defines what effect each gesture has.
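Something along the lines of this hypothetical per-widget gesture map (all of the names here are invented for illustration and are not part of any DHTMLX API):

```ts
type Gesture = "tap" | "hold" | "verticalPan" | "horizontalPan";
type Action = "select" | "startDrag" | "scroll" | "none";

// Illustrative only: how a tree widget might map gestures to actions.
const treeGestures: Record<Gesture, Action> = {
  tap: "select",
  hold: "startDrag",     // long press initiates drag and drop
  verticalPan: "scroll", // a plain swipe keeps scrolling the tree
  horizontalPan: "none",
};
```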
Apart from the things mentioned above, I need relatively straightforward things like being able to drag a DHTMLXWindow around the screen, resize DHTMLXLayout panes, etc. My application is already set up so that it doesn't need any scrolling or the default zoom function.
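Given that, one plausible page-level setup (an assumption about my app, not anything DHTMLX-specific) would be to switch off the browser's default touch behaviours entirely, so that every gesture reaches the widgets instead of the page:

```ts
// Disable native panning and pinch-zoom for the whole document.
document.documentElement.style.touchAction = "none";

// The matching viewport tag, normally placed in the page head:
// <meta name="viewport"
//       content="width=device-width, initial-scale=1, user-scalable=no">
```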