What you see is what you touch?
Editor's Notes

  • #3: These are Gregory Bateson's words, and the user interface has the same problem. The computer is digital; it is controlled by formal theory. We humans are not purely digital; we are analogic rather than digital. Yet we make digital computers and communicate with them. The user interface is therefore the place where the purely digital computer becomes analogic and the analogic human becomes digital, so that the two can communicate. This illustrates Bateson's point that "in the natural world, communication is rarely either purely digital or purely analogic."
  • #5: When we use a computer, what do we do? Almost all of us look at images on an electric display, grab and move a mouse, and type on a keyboard; the right hand holds the mouse in order to point at an image called an icon on the display. This is very 'natural' for us. If our body performs some action on a plastic object, the images on the electric display change. However, this relationship between our body and the image did not exist until the computer, and especially the Graphical User Interface, appeared. I call this phenomenon 'Display Acts': action formed by connecting our body action with the change of images on the electric display. We do not yet understand this kind of action well, because we have studied body actions and images in man-computer communication only separately.
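    A minimal sketch of such a display act, assuming Python with the standard-library tkinter toolkit (the canvas, the rectangle "icon", and the handler name are illustrative, not from the talk): a hand motion on a plastic object arrives as a pointer event, and the program answers by changing an image on the electric display.

        import tkinter as tk

        root = tk.Tk()
        canvas = tk.Canvas(root, width=400, height=300, bg="white")
        canvas.pack()

        # An "icon": just an image drawn on the electric display.
        icon = canvas.create_rectangle(180, 130, 220, 170, fill="steelblue")

        def on_motion(event):
            # The hand moves a plastic object; the toolkit reports (x, y);
            # the program answers by redrawing an image. Body action and
            # image change are coupled only by this handler.
            canvas.coords(icon, event.x - 20, event.y - 20,
                          event.x + 20, event.y + 20)

        canvas.bind("<Motion>", on_motion)
        root.mainloop()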
  • #6: In order to clarify the relationship between our body action and images in the man-computer world, this presentation focuses on the cursor in the electric display and the mouse in the real world. We see the cursor in the electric display: the cursor is "a small mark on a computer screen that can be moved and that shows the position on the screen where, for example, text will be added." Moreover, the cursor's shape is almost always an arrow (→), which is "used to show direction or position." We simply see the arrow cursor in order to know where it is and to point at images on the display. At the same time, we also hold the mouse.
  • #7: What does the cursor show us? It shows us position and direction; however, the cursor often points at nothing, and when the arrow cursor points at nothing, its shape does not work. We do not mind this. Why? Because we are not only seeing the images on the display but also grabbing the mouse. What is the relationship between seeing and grabbing?
  • #8: In order to clarify the relationship between the cursor and the mouse, I refer to Nelson Goodman's idea of exemplification. We do not mind that the cursor points at nothing, because we can move the arrow cursor with the mouse. This shows that the cursor exemplifies not only position and direction for pointing, but also the feeling of moving and touching via the mouse. In Goodman's words, "exemplification is possession plus reference": a sample shows us its own properties, and we interpret those properties according to our interests.
  • #9–#13: We see the shape of the cursor as a sample in order to know the cursor's functions. We derive a pointing function from the arrow shape, and therefore we label the arrow cursor with this pointing function. However, the arrow cursor often points at nothing, so we suppose the arrow possesses another function: perhaps moving in order to point, since the arrow is a shape for pointing. When we only see the display, that is all we can understand of the cursor's functions.
  • #14–#19: However, the user interface also has the mouse. We touch and grab the mouse, and then we see the cursor move and point at an image. This touching is another function of the cursor, because the cursor is designed to be connected with the mouse in order to work on the display. When we only see the display, we cannot know the cursor's touching function; when we touch the mouse, we understand that the cursor can touch an image and change the images on the display. Pointing alone cannot change anything on the display: changing needs touching. In short, when we use the computer we not only see the images on the display but also grab the mouse. Therefore, we have to consider what the relationship between seeing and touching is.
  • #20: Sometimes the arrow cursor suddenly changes into the spinning wait cursor or something else.
  • #21: When we see the cursor's shape transform, the mouse we grasp does not change. However, the moment the arrow cursor changes into the spinning wait cursor, our action with the mouse changes. Before the transformation, we use the arrow cursor and the mouse in order to point at an image on the display, so the cursor exemplifies the function of pointing and our body action with the mouse is formed for pointing. After the transformation, the spinning wait cursor exemplifies not pointing but showing the computer's status, and as a result our action with the mouse is no longer formed for pointing. After a moment, we realize that all we can do with the mouse in hand is move the cursor on the electric display. The spinning wait cursor exemplifies that the computer is still working but has not abandoned us: the cursor and the mouse are still connected, so please wait. Throughout the whole process, we grasp the same plastic object even though the cursor's shape is changing.
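    The transformation described in this note can be reproduced in a few lines; a sketch, again assuming Python/tkinter (the button label and the two-second delay are invented for illustration). The image we see changes from arrow to busy cursor while the object in the hand stays the same; meanwhile, all the mouse can do is move the cursor.

        import tkinter as tk

        root = tk.Tk()

        def slow_task():
            # The cursor we see becomes a wait cursor ...
            root.config(cursor="watch")
            # ... while the plastic object in the hand does not change.
            # Two seconds later the arrow, and pointing, come back.
            root.after(2000, lambda: root.config(cursor=""))

        tk.Button(root, text="Do something slow",
                  command=slow_task).pack(padx=40, pady=40)
        root.mainloop()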
  • #22: In the user interface, the image we see changes while the object we touch does not, and this transformation of the image is sometimes out of our control. Yet we accept this phenomenon naturally. In order to make it clear, I refer to Gregory Bateson's idea of double description. Bateson quotes Shakespeare's Macbeth and says ...; when we think about the user interface, we need double description.
  • #23: However, user interfaces mainly focus on seeing. For example, What You See Is What You Get (WYSIWYG): according to Wikipedia, the main attraction of WYSIWYG is the user's ability to visualize what he or she is producing. Seeing, visualizing, is what is emphasized.
  • #24: One more example: Ben Shneiderman proposed a very important idea for user interfaces, direct manipulation. For Shneiderman, direct manipulation's central idea is the visibility of the object of interest; here again, only visibility. I want to look at direct manipulation from another point of view. According to G. Lakoff and M. Johnson, direct manipulation involves not only seeing but also touching: we see and touch an object in order to create something. They also say that direct manipulation gives rise to the concept of CAUSATION, which is interesting; I will come back to the concept of causation.
  • #25: Søren Pold says that the WYSIWYG interface has many buttons, and that buttons are an essential part of the controls in a GUI. Pold considers icons to be buttons, which is a natural interpretation of what an icon is, because we feel ourselves pushing the icons via the mouse and cursor, and icons are often designed to look like buttons. Pold says that ...; according to Pold, a straight cause-and-effect relation appears, and this cause-and-effect hides the manipulation of several layers of symbolic representation. Furthermore, buttons force decisions into binary choices and lack the sophisticated ways in which our language expresses activity.
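    Pold's binary button can be seen directly in toolkit code; a sketch assuming Python/tkinter (the label and the handler are hypothetical). A single 'command' callback collapses a layered press/release protocol into one fired-or-not outcome.

        import tkinter as tk

        root = tk.Tk()

        def on_command():
            # What the user experiences: one cause, one effect.
            print("button fired")

        # What happens underneath: <ButtonPress-1> arms the widget and
        # <ButtonRelease-1> over the same widget fires it; releasing
        # elsewhere cancels. The binary fired/not-fired outcome hides
        # those layers of symbolic representation.
        tk.Button(root, text="OK", command=on_command).pack(padx=40, pady=40)
        root.mainloop()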
  • #26: Another view of the cursor and mouse. Masaki Fujihata says that the computer has no question of materiality, which means the computer works on a different principle from us. In the man-computer world, if repeated experiences are provided to us through interaction, images are made into objects. We are now passing through a new phenomenon: images are made into objects.
  • #27: However, this 'object' on the display is not a real object. According to Pold, this 'object' on the display has cause-and-effect, and therefore we can handle it with our direct manipulation. But Fujihata says the 'object' does not have its own materiality. Of course we see it, but do we touch it? Maybe yes; maybe no. This 'object' is something like a "switch" as Bateson defined it: it is related to the notion of "change" rather than to the notion of "object". The cursor is given a special relation to time by the mouse.
  • #28: We confuse the "if ... then" of logic with cause and effect. The computer makes us switch our notion and experience of touching an object: we touch not an object but a change, because the computer is a purely logical machine that we made. Bateson asks: can logic simulate all sequences of cause and effect? His answer is, maybe not. However, through the double description of seeing and touching, we want to remake the computer as a cause-and-effect machine. The user interface couples cause-and-effect with logic: we are making pseudo-cause-and-effect with logic. Therefore, we touch the same plastic object even though the 'object' on the display is changing.
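    The confusion of "if ... then" with cause and effect can be made concrete; a minimal plain-Python sketch (the event name and the icon rectangle are invented for illustration). What feels like a caused effect at the interface is, inside the machine, only a conditional over event data.

        def handle_event(event_type, cursor_pos, icon_rect):
            """Pseudo-cause-and-effect: an 'effect' is a satisfied condition."""
            x0, y0, x1, y1 = icon_rect
            x, y = cursor_pos
            # "If the click falls inside the icon, then open it" is logic,
            # which we experience as the click *causing* the opening.
            if event_type == "click" and x0 <= x <= x1 and y0 <= y <= y1:
                return "open icon"
            return "nothing happens"

        print(handle_event("click", (10, 10), (0, 0, 20, 20)))   # open icon
        print(handle_event("click", (50, 50), (0, 0, 20, 20)))   # nothing happens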
  • #29: Why does this happen? Because the computer has changed the relationship between our seeing and our touching. The cursor looks as if it shows cause and effect; however, it stands for the logic in the computer, where there is no time. But we live in time, in a cause-and-effect world. Masaki Fujihata says that if we see cause and effect behind an image, then we can touch it. Moreover, Gregory Bateson tells us about the double description of seeing and touching. When we face the computer, we have to consider this double description in order to clarify the relationship between man and computer. We see the change of an image: it shows us the logic in the computer, but we feel something like cause and effect, made from the computer's pure logic. We touch the same object while the image changes: our grasping of the mouse makes pseudo-cause-and-effect in the computer.
  • #32: We need a man-computer-world version of Merleau-Ponty's "Phenomenology of Perception".
  • #33: (Same note as #27.)
  • #34: (Same note as #29.)