{"id":14788,"date":"2014-05-08T21:07:49","date_gmt":"2014-05-09T02:07:49","guid":{"rendered":"http:\/\/blog.kenperlin.com\/?p=14788"},"modified":"2014-05-08T21:07:49","modified_gmt":"2014-05-09T02:07:49","slug":"generative-versus-contingent","status":"publish","type":"post","link":"http:\/\/blog.kenperlin.com\/?p=14788","title":{"rendered":"Generative versus contingent"},"content":{"rendered":"<p>During one of the technical papers sessions at the recent SIGCHI (Special Interest Group on Computer-Human Interaction) conference, two papers stood out for me because they represented perfectly opposite philosophies.<\/p>\n<p>One paper looked for &#8212; and found &#8212; all the special cases where real-world tools could be effectively mimicked by putting your fingers on a multitouch surface (e.g., an iPad) as though you were using that tool.  Some tools, like a tape measure, computer mouse, pen and eraser, can be mimicked very well on a tablet.  Others, like scissors, cannot.<\/p>\n<p>The authors found seven real-world tools that could be mimicked beautifully in this way, and after their live demo they got a big spontaneous round of applause from the audience.<\/p>\n<p>But another paper took the opposite approach.  The authors asked &#8220;what are the hand and finger gestures that are inherently powerful and expressive on a multitouch surface?&#8221;  Essentially they came up with a grammar &#8212; a way of building an extremely large and extensible vocabulary of hand gestures that nobody had ever tried before.<\/p>\n<p>The power of the first approach is that it is contingent: It works because there are particular real-world tools that happen to map into finger positions on a multitouch screen.  The power of the second approach is that it is generative: It builds from the inherent richness of what a hand and fingers can express when interacting with a flat surface.<\/p>\n<p>The paper that relied on contingency was more of a crowd pleaser.  
But in the long run, my money is on the generative approach.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>During one of the technical papers sessions at the recent SIGCHI (Special Interest Group on Computer-Human Interaction) conference, two papers stood out for me because they represented perfectly opposite philosophies. One paper looked for &#8212; and found &#8212; all the special cases where real-world tools could be effectively mimicked by putting your &hellip; <a href=\"http:\/\/blog.kenperlin.com\/?p=14788\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Generative versus contingent&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"http:\/\/blog.kenperlin.com\/index.php?rest_route=\/wp\/v2\/posts\/14788"}],"collection":[{"href":"http:\/\/blog.kenperlin.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/blog.kenperlin.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/blog.kenperlin.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/blog.kenperlin.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=14788"}],"version-history":[{"count":1,"href":"http:\/\/blog.kenperlin.com\/index.php?rest_route=\/wp\/v2\/posts\/14788\/revisions"}],"predecessor-version":[{"id":14789,"href":"http:\/\/blog.kenperlin.com\/index.php?rest_route=\/wp\/v2\/posts\/14788\/revisions\/14789"}],"wp:attachment":[{"href":"http:\/\/blog.kenperlin.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=14788"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/blog.kenperlin.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=14788"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/blog.kenperlin.com\/index.php?rest_route=%2Fwp%2Fv2%2F
tags&post=14788"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}