Weighted integral rotation and translation for touch interaction
Touch-based interaction is popular in graphical user interface (GUI) systems, as it provides natural and intuitive direct manipulation. Rotation and translation are basic tasks for manipulating graphical objects, and various touch-based interaction techniques have been investigated for this purpose [Hancock et al. 2006]. In early GUI systems, users had to perform rotation and translation independently, switching between the two manipulation modes either through a menu system or by manipulating separate widgets that in many cases made the interface visually cluttered. Recently, two-finger gestures have become common in multi-touch interfaces to perform rotation, translation, and even scaling simultaneously, without visual clutter. However, ergonomic problems arise when the user has to rotate an object by a large angle [Hoggan et al. 2013], which causes strain on the user’s wrist. As a result, users tend to split the manipulation into multiple steps, which might not be suitable for certain applications, such as puppeteering-based animation tools.
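To make the two-finger gesture concrete, the sketch below shows one common way such a gesture is interpreted: rotation from the change in the inter-finger angle, scale from the ratio of inter-finger distances, and translation from the displacement of the finger midpoint. This is a minimal illustrative sketch of the standard gesture, not the weighted integral technique proposed here; the function name and the midpoint-based translation convention are our assumptions.

```python
import math

def two_finger_transform(p1, p2, q1, q2):
    """Derive (translation, rotation, scale) from two touch points
    moving from positions (p1, p2) to (q1, q2).

    Hypothetical helper for illustration; conventions are assumed.
    """
    # Vector between the two fingers before and after the move.
    vx, vy = p2[0] - p1[0], p2[1] - p1[1]
    wx, wy = q2[0] - q1[0], q2[1] - q1[1]
    # Rotation: change in the angle of the inter-finger vector.
    angle = math.atan2(wy, wx) - math.atan2(vy, vx)
    # Scale: ratio of the inter-finger distances.
    scale = math.hypot(wx, wy) / math.hypot(vx, vy)
    # Translation: displacement of the midpoint between the fingers.
    tx = (q1[0] + q2[0]) / 2 - (p1[0] + p2[0]) / 2
    ty = (q1[1] + q2[1]) / 2 - (p1[1] + p2[1]) / 2
    return tx, ty, angle, scale
```

Note that a large rotation computed this way forces the fingers (and wrist) through the same large angle, which is exactly the ergonomic issue described above.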