#3:These are Gregory Bateson's words.
The user interface has the same problem.
The computer is digital: it is controlled by formal theory.
We humans are not purely digital; we are analogic rather than digital.
Yet we build digital computers and communicate with them.
So the user interface is the place where the purely digital computer becomes analogic,
and the analogic human becomes digital, so that the two can communicate with each other.
This illustrates Bateson's point that "in the natural world, communication is rarely either purely digital or purely analogic."
#5:When we use a computer, what do we do?
Almost all of us look at images on an electronic display, grab and move a mouse, and type on a keyboard;
our right hand holds the mouse in order to point at an image called an icon on the display.
This feels very 'natural' to us: our body acts on a plastic object, and the images on the electronic display change.
However, this relationship between our body and the image did not exist until the computer, and especially the Graphical User Interface, appeared.
I call this phenomenon 'Display Acts': actions formed by connecting our bodily action with the change of images on the electronic display.
However, we do not understand this action well, because in man-computer communication we have studied bodily actions and images only separately.
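The connection a Display Act makes can be pictured as a tiny program. This is a toy model of my own (not any real GUI toolkit): a bodily action in the real world, moving the mouse, is wired to a change of images on the display, the cursor's position.

```python
# A toy model of a 'Display Act': a bodily action (moving the mouse)
# is connected to a change of images on the display (the cursor position).
# All names here are illustrative assumptions, not a real windowing API.

class ToyDisplay:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.cursor = (0, 0)  # the image we see on the display

    def on_mouse_move(self, dx, dy):
        # The plastic object moves in the real world;
        # only the image on the display changes, clamped to the screen.
        x, y = self.cursor
        self.cursor = (min(max(x + dx, 0), self.width - 1),
                       min(max(y + dy, 0), self.height - 1))

display = ToyDisplay(800, 600)
display.on_mouse_move(120, 80)   # the hand acts on the plastic object...
display.on_mouse_move(-30, 10)   # ...and the image changes in response
print(display.cursor)            # (90, 90)
```

The point of the sketch is the coupling itself: nothing in the physical mouse changes, yet every hand movement is translated into a change of the seen image.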
#6:In order to clarify the relationship between our bodily actions and images in the man-computer world,
this presentation focuses on the cursor on the electronic display and the mouse in the real world.
We see the cursor on the electronic display:
the cursor is "a small mark on a computer screen that can be moved and that shows the position on the screen, where, for example, text will be added."
Moreover, the cursor's shape is usually an arrow (→), which is "used to show direction or position."
We see the arrow cursor in order to know where it is and to point at images on the display. At the same time, we also hold the mouse.
#7:What does the cursor show us?
It shows us position and direction; however, the cursor often points at nothing. When the arrow cursor points at nothing, its shape does not do its job.
Yet we do not care about this.
Why?
Because we not only see the images on the display, we are also grabbing the mouse.
What is the relationship between seeing and grabbing?
#8:In order to clarify the relationship between the cursor and the mouse,
I refer to Nelson Goodman's idea of Exemplification.
We do not care that the cursor points at nothing, because we can move the arrow cursor with the mouse.
This shows that the cursor exemplifies not only position and direction for pointing, but also the feeling of moving and touching via the mouse.
Exemplification is Nelson Goodman's idea:
"Exemplification is possession plus reference."
Possession means that a sample shows us its own properties,
and we interpret those properties according to our interests.
#9:We see the shape of the cursor as a sample in order to learn the cursor's functions.
We derive a pointing function from the arrow shape,
and therefore we attach this pointing function to the arrow cursor.
However, the arrow cursor points at nothing,
and so we suppose the arrow possesses other functions:
perhaps moving for pointing, since the arrow is a shape for pointing.
As long as we only see the display,
that is all we can understand of the cursor's functions.
#14:However, the user interface also has the mouse.
We touch and grab the mouse, and then we see the cursor move and point at an image.
This touching is another function of the cursor,
because the cursor is designed to connect with the mouse in order to work on the display.
When we only see the display,
we cannot know the cursor's touching function.
When we touch the mouse, we understand that the cursor can touch an image
and change the images on the display.
Pointing alone cannot change anything on the display;
changing requires touching.
In short, when we use the computer, we not only see the images on the display but also grab the mouse.
Therefore, we have to consider what the relationship between seeing and touching is.
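The distinction between pointing and touching can be sketched in code. This is a minimal illustration under assumed names (the icons, rectangles, and functions are all invented for this sketch): hovering the cursor over an icon changes nothing, while clicking, the act of touching through the mouse, changes the display's state.

```python
# A sketch of the claim: pointing alone changes nothing on the display;
# changing requires touching (a click). Toy model, not a real toolkit.

icons = {"trash": (50, 50, 80, 80), "folder": (200, 50, 230, 80)}  # name -> rect
opened = set()  # the display's state: which icons have been opened

def hit(pos, rect):
    x, y = pos
    x1, y1, x2, y2 = rect
    return x1 <= x <= x2 and y1 <= y <= y2

def point(cursor_pos):
    # Seeing/pointing: the cursor hovers over an icon, but the
    # display state stays exactly the same.
    return {name for name, rect in icons.items() if hit(cursor_pos, rect)}

def click(cursor_pos):
    # Touching: pressing the mouse button actually changes the display.
    for name, rect in icons.items():
        if hit(cursor_pos, rect):
            opened.add(name)

point((60, 60))          # pointing: 'trash' is under the cursor, nothing opens
click((60, 60))          # touching: now the display state changes
print(opened)            # {'trash'}
```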
#20:Sometimes the arrow cursor suddenly changes into the spinning wait cursor or something else.
#21:When we see the cursor's shape transform, the mouse we grasp does not change.
However, the moment the arrow cursor changes into the spinning wait cursor, our action with the mouse changes.
Before the transformation, we use the arrow cursor and the mouse in order to point at an image on the display,
and so the cursor exemplifies the function of pointing, and our bodily action with the mouse is formed for pointing.
After the transformation, the spinning wait cursor exemplifies not pointing, but showing the computer's status.
As a result, our action with the mouse is no longer formed for pointing.
After a moment, we realize that all we can do with the mouse in hand is move the cursor on the electronic display.
The spinning wait cursor exemplifies that the computer is still working,
but it does not leave us alone: the cursor and the mouse are still connected, please wait.
Throughout the whole process, we grasp the same plastic object, even though the cursor's shape is changing.
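The slide's process can be sketched as a small state machine. This is an assumed model (the state names and methods are mine, not a real windowing API): the mouse in our hand never changes, but when the computer becomes busy the cursor's shape changes, and with it, what our action with the mouse can accomplish.

```python
# A minimal sketch of slide #21: the same mouse, but the cursor's
# exemplified function changes with the computer's status.
# States and method names are illustrative assumptions.

class Cursor:
    def __init__(self):
        self.shape = "arrow"   # exemplifies pointing
        self.pos = (0, 0)

    def set_busy(self, busy):
        # The spinning wait cursor exemplifies the computer's status,
        # not pointing.
        self.shape = "spinning_wait" if busy else "arrow"

    def on_mouse_move(self, dx, dy):
        # Moving always works: the cursor and the mouse stay connected.
        self.pos = (self.pos[0] + dx, self.pos[1] + dy)

    def on_click(self):
        # Pointing only works while the arrow shape is shown.
        return "pointed" if self.shape == "arrow" else "please wait"

cursor = Cursor()
cursor.set_busy(True)
cursor.on_mouse_move(10, 5)   # we can still move the cursor...
print(cursor.pos)             # (10, 5)
print(cursor.on_click())      # please wait
```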
#22:In the user interface, the image we see changes
while the object we touch does not.
Moreover, this transformation of the image is sometimes out of our control.
Yet we naturally accept this phenomenon.
In order to clarify it, I refer to Gregory Bateson's Double Description.
Bateson quotes Shakespeare's Macbeth and says .....
When we think about the user interface, we need Double Description.
#23:However, user interface design mainly focuses on seeing.
For example: What You See Is What You Get, WYSIWYG.
According to Wikipedia, the main attraction of WYSIWYG is
the user's ability to visualize what he or she is producing.
Seeing, visualizing, is what is emphasized.
#24:One more example.
Ben Shneiderman proposed a very important idea for the user interface: Direct Manipulation.
For Shneiderman, the central idea of Direct Manipulation is the visibility of the object of interest.
Here again, only visibility.
I want to look at Direct Manipulation from another point of view.
According to G. Lakoff and M. Johnson,
Direct Manipulation leads us not only to seeing but also to touching.
We see and touch the object in order to create something.
And they say that Direct Manipulation forms the concept of CAUSATION.
This is interesting.
I will come back to the concept of CAUSATION.
#25:Søren Pold says that the WYSIWYG interface has many buttons,
and that buttons are an essential part of the controls in a GUI.
Pold considers the icon to be a button.
This is a natural interpretation of what the icon is,
because we feel ourselves pushing the icons with the mouse and cursor,
and the icons are often designed to look like buttons.
Pold says that .....
According to Pold, a straight cause-and-effect relation appears,
and this cause-and-effect hides the manipulation of several layers of symbolic representation.
Furthermore, the button forces decisions into binary choices
and lacks the sophisticated ways in which our language expresses activity.
#26:Another view of the cursor and the mouse.
Masaki Fujihata says that the computer has no question of materiality.
This means the computer works on a different principle from us.
In the man-computer world, if repeated experiences are provided to us
through interactive experience,
images are made into objects.
We are now passing through a new phenomenon: images are made into objects.
#27:However, this 'object' on the display is not a real object.
According to Pold, this 'object' on the display has cause-and-effect,
and therefore we can handle it with Direct Manipulation.
But Fujihata says the 'object' does not have its own materiality.
Of course we see it, but do we touch it?
Maybe yes; maybe no.
This 'object' is something like a "switch" as Bateson defined it.
The 'object' is related to the notion of "change" rather than to the notion of "object".
The cursor is given a special relation to time by the mouse.
#28:We confuse the if ... then of logic with the if ... then of cause-and-effect.
The computer makes us switch our notion and experience of touching an object:
we touch not an object but a change.
This is because the computer is a purely logical machine that we made.
Bateson asks: can logic simulate all sequences of cause and effect?
His answer is, maybe not.
However, we want to remake the computer as a cause-and-effect machine via the Double Description of seeing and touching.
The user interface couples cause-and-effect with logic.
We are making pseudo-cause-and-effect with logic.
Therefore, we touch the same plastic object, even though the 'object' on the display is changing.
#29:Why does this happen?
Because the computer has changed the relationship between our seeing and our touching.
The cursor looks as if it shows cause and effect; however, it stands for the logic in the computer, where there is no time.
But we live in time: a cause-and-effect world.
Masaki Fujihata says that if we see cause and effect behind the image, then we can touch it.
Moreover, Gregory Bateson gives us the Double Description of seeing and touching.
When we face the computer, we have to consider the double description of seeing and touching in order to clarify the relationship between man and computer.
We see the change of the image: it shows us the logic in the computer, yet we feel something like cause and effect, made from the computer's pure logic.
We touch the same object while the image changes: our grasp on the mouse makes pseudo-cause-and-effect in the computer.
#32:We need a man-computer-world version of Merleau-Ponty's "Phenomenology of Perception".