US 20130080898 A1

(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2013/0080898 A1
     Lavian et al.                           (43) Pub. Date: Mar. 28, 2013
(54) SYSTEMS AND METHODS FOR ELECTRONIC COMMUNICATIONS

(76) Inventors: Tal Lavian, Sunnyvale, CA (US); Zvi Or-Bach, San Jose, CA (US)

(21) Appl. No.: 13/273,187

(22) Filed: Oct. 13, 2011

     Related U.S. Application Data

(63) Continuation-in-part of application No. 13/245,804, filed on Sep. 26, 2011; continuation-in-part of application No. 13/272,212, filed on Oct. 12, 2011.
     Publication Classification

(51) Int. Cl.
     G06F 3/01        (2006.01)
     G06F 3/16        (2006.01)
(52) U.S. Cl.
     USPC ................................ 715/728; 715/738
(57) ABSTRACT

Embodiments of the invention provide a system for enhancing user interaction with objects connected to a network. The system includes a processor, a display screen, and a memory coupled to the processor. The memory comprises a database including a list of two or more objects and instructions executable by the processor to display a menu. The menu is associated with at least two independent objects, and the two independent objects are produced by two independent vendors.
[Representative drawing (FIG. 35B): a Graphical User Interface at device 102 with options Create Cockpit (3504a), Customize Cockpit (3504b), View Cockpit (3506), and Invite Users (3504n), together with Audio Mode (3502a), Video Mode (3502b), Text Mode (3502c), and List Mode (3502n) options.]
[FIG. 1A: exemplary environment 100 with a device 102 connected to a plurality of remote devices.]

[FIG. 1B: exemplary environment 200 with a device 102, a web page 110 (user ID 112a, password 112b), and VMThings 108 connected through a network to a server 114 and remote devices 106a-106n.]

[FIG. 1C: exemplary environment 300 with a display device 118 and an access device 116 including VMThings 108, connected to remote devices 106a-106n.]

[FIG. 1D: display device 118 and access device 116 with VMThings 108 connected to remote devices 106a-106n over a ZigBee network 120.]

[FIG. 1E: display device 118 and access device 116 with VMThings 108 connected to remote devices 106a-106n over a WiMAX network 122.]

[FIG. 1F: display device 118 connected to remote devices over a GSM network 124.]

[FIG. 1G: environment based on a ZigBee network connecting remote devices (figure text largely not recoverable from the scan).]

[FIG. 1H: environment based on a WiMAX network connecting a device 102 to remote devices (figure text largely not recoverable from the scan).]

[FIG. 1I: exemplary environment with a device 102 including VMThings 108 connected through a network 126 and a bridge device 128 to remote devices 106a-106n.]
[FIG. 2A: exemplary environment (second embodiment) with a device connected to a plurality of services 202a-202n.]

[FIG. 2B: device with a web page connected through a network to a server and services 202a-202n.]

[FIG. 2C: display device and access device with VMThings connected to services 202a-202n.]

[FIG. 2D: display device and access device connected to services 202a-202n over a ZigBee network.]

[FIG. 2E: display device and access device connected to services 202a-202n over a WiMAX network.]

[FIG. 2F: environment based on a GSM network connecting a device to services 202a-202n.]

[FIG. 2G: environment based on a ZigBee network connecting a device to services 202a-202n.]

[FIG. 2H: environment based on a WiMAX network connecting a device to services 202a-202n.]

[FIG. 2I: environment based on a combination of a local network and the Internet connecting a device to services 202a-202n.]
[FIG. 3A: device 102 showing a visual access menu 308 with options Remote Devices (302) and Services (304), and an enhanced visual access menu 310 with device options Vehicle (306a), AC (306b), Camera (306c), and Microwave (306n).]

[FIG. 3B: device 102 showing the visual access menu 308 and an enhanced visual access menu 312 with service options 314a-314n (e.g., Entertainment).]

[FIG. 3C: device 102 showing web page 110a with options Remote Devices (302) and Services (304), and web page 110b with device options Vehicle (306a), AC (306b), Camera (306c), and Microwave (306n).]

[FIG. 3D: device 102 showing web page 110a with options Remote Devices (302) and Services (304), and web page 110c with service options Entertainment (314a), Travel (314b), Banking (314c), and Hotels (314n).]
[FIG. 4: device 102 showing an enhanced visual access menu 402, reached from the Remote Devices option, with device options 404a-404l including Car, Truck, and a Regulate control.]

[FIG. 5: device 102 showing an enhanced visual access menu 502, reached from the Services option, with service options 504a-504k including Banking, Entertainment, Travel, Transfer, Details, and Check Bill.]
[FIG. 6: components of device 102, including a display 602, a processor 604, a memory 606 containing a graphical user interface 608, a database 610, and VMThings 612, a radio interface, a network interface, an input/output interface, and peripherals such as a memory card, keyboard, mouse, and USB port.]

[FIG. 7: components of access device 116, including a processor 702, a memory 704 containing a graphical user interface 706, a database 708, and VMThings 710, a radio interface, a network interface, an input/output interface, ports, and peripherals such as a memory card, keyboard, mouse, and USB port.]
[FIG. 8 (flowchart, steps 802-812): access a database of visual access menus through a GUI at a device; display a visual access menu at the device; display an enhanced visual access menu based on a selection of an option by a user; receive a selection of a device option from the user; connect to a remote device based on the selection of the device option; control one or more operations of the remote device based on the selection.]

[FIG. 9 (flowchart, steps 902-912): access a database of visual access menus through a GUI at a device; display a visual access menu at the device; display an enhanced visual access menu based on a selection of an option by a user; receive a selection of a service option from the user; connect to a service based on the selection of the service option; control and display information about the service based on the selection.]
[FIGS. 10A-10C (flowchart, steps 1002-1036): display a GUI for accessing visual access menus at the device; receive an input from the user; if the input is not for accessing remote devices or services, wait for an input; for services, check whether a visual access menu for services is available (retrieving it from a server if not), display the menu including service options, receive a selection of a service option, retrieve any missing information from the server, and display the information; for remote devices, check whether a visual access menu for remote devices is available (retrieving it from the server if not), display the menu including device options, receive a selection of a device option, connect to the remote device based on the received selection, and control the remote device based on one or more user inputs.]
[FIG. 11 (flowchart, steps 1102-1116): open a website through a web browser at the device; authenticate the user's identity at the website; display a visual access menu; receive an input from the user; display an enhanced visual access menu when the input is for accessing remote devices; receive a selection of a device option; connect to a remote device based on the selection; control one or more operations of the remote device.]

[FIG. 12 (flowchart, steps 1202-1216): open a website through a web browser at the device; authenticate the user's identity at the website; display a visual access menu; receive an input from the user; display an enhanced visual access menu when the input is for accessing services; receive a selection of a service option; connect to a service based on the selection; control and display information about the service.]
[FIGS. 13A-13C (flowchart, steps 1302-1340): open a website through a web browser at the device; authenticate the user's identity; display a visual access menu; receive an input from the user; if the input is not for accessing remote devices or services, wait for an input; for services, retrieve the visual access menu from a server if needed, display it including service options, receive a selection of a service option, retrieve any missing information from the server, and display the information at the device; for remote devices, retrieve the visual access menu from the server if needed, display it including device options, receive a selection of a device option, connect to the remote device based on the received selection, and control it based on one or more user inputs.]
[FIG. 14 (flowchart, steps 1402-1414): open a website through a web browser at the device; display a visual access menu; receive an input from the user; display an enhanced visual access menu when the input is for accessing remote devices; receive a selection of a device option; connect to a remote device based on the selection; control one or more operations of the remote device.]

[FIG. 15 (flowchart, steps 1502-1512): access a database of visual access menus through a GUI at an access device; display a visual access menu at a display device; display an enhanced visual access menu at the display device based on a selection of an option by a user; receive a selection of a device option; connect to a remote device based on the selection; control one or more operations of the remote device.]
[FIG. 16 (flowchart, steps 1602-1612): access a database of visual access menus through a GUI at an access device; display a visual access menu at a display device; display an enhanced visual access menu at the display device based on a selection of an option by a user; receive a selection of a service option; connect to a service based on the selection; display information about the service at the display device.]

[FIGS. 17A-17C (flowchart, steps 1702-1736): display a GUI for accessing visual access menus at a display device connected to an access device; receive an input from a user; if the input is not for accessing remote devices or services, wait for an input; for services, display the visual access menu including service options, receive a selection of a service option, retrieve any missing information from the server, and display it; for remote devices, retrieve the visual access menu from the server if needed, display it including device options at the display device, receive a selection of a device option, connect to the remote device based on the received selection, and control it based on one or more user inputs.]
[FIG. 18A: exemplary display of images of remote devices (figure content not recoverable from the scan). FIG. 18B: transfer of a display of images from a device to another device (figure content not recoverable from the scan).]
[FIG. 19: a cockpit 1902 at device 102 containing objects 1904a-1904n, including IVR, Remote Device Control, Services Control, Outlook, Calendar, Other E-mails, Messengers, Games, and Other Objects.]
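For illustration only (not part of the patent disclosure): a minimal sketch, in Python, of a cockpit as a menu aggregating heterogeneous objects such as those shown in FIG. 19. The class and method names (Cockpit, CockpitObject, add_object, render) are hypothetical.

    # Illustrative sketch only; names are hypothetical, not from the patent.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class CockpitObject:
        label: str    # e.g. "IVR", "Remote Device Control"
        ref: str      # reference numeral in FIG. 19, e.g. "1904a"
        kind: str     # "device", "service", "application", ...

    @dataclass
    class Cockpit:
        owner: str
        objects: List[CockpitObject] = field(default_factory=list)

        def add_object(self, obj: CockpitObject) -> None:
            self.objects.append(obj)

        def render(self) -> str:
            # Render the cockpit as a simple textual menu.
            return "\n".join(f"[{o.ref}] {o.label} ({o.kind})" for o in self.objects)

    cockpit = Cockpit(owner="first-user")
    for ref, label, kind in [("1904a", "IVR", "service"),
                             ("1904b", "Remote Device Control", "device"),
                             ("1904d", "Outlook", "application")]:
        cockpit.add_object(CockpitObject(label, ref, kind))
    print(cockpit.render())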
[FIG. 20A: a first device 2002 and a second device, each including VMThings 2004, connected through a network and a proxy server 2008 to remote devices 106a-106n.]
[FIG. 20B: another environment for providing access to a user's cockpit, with a first device and a second device (each including VMThings) connected through a proxy server to a plurality of remote objects.]
[FIG. 21 (flowchart, steps 2102-2110): access a GUI for configuring a cockpit by a first user at a first device; configure the cockpit based on the preferences of the first user; share the cockpit with one or more second users of second devices; translate the cockpit based on the preferences of the one or more second users; display the translated cockpit at the one or more second devices.]
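For illustration only (not part of the patent disclosure): a minimal sketch, in Python, of the share-and-translate flow of FIG. 21, assuming a cockpit represented as a plain dictionary and a hypothetical translate_cockpit function driven by a second user's preferences.

    # Illustrative sketch only; the data layout and field names are hypothetical.
    from copy import deepcopy

    def translate_cockpit(cockpit: dict, preferences: dict) -> dict:
        """Return a copy of the cockpit adapted to a second user's preferences
        (here, a label translation table and a preferred display mode)."""
        translated = deepcopy(cockpit)
        table = preferences.get("label_translations", {})
        translated["objects"] = [
            {**obj, "label": table.get(obj["label"], obj["label"])}
            for obj in cockpit["objects"]
        ]
        translated["mode"] = preferences.get("mode", cockpit.get("mode", "text"))
        return translated

    # Steps 2102-2106: the first user configures a cockpit and shares it.
    first_cockpit = {"owner": "first-user", "mode": "text",
                     "objects": [{"ref": "1904a", "label": "IVR"},
                                 {"ref": "1904b", "label": "Remote Device Control"}]}
    second_prefs = {"mode": "audio",
                    "label_translations": {"Remote Device Control": "Control de dispositivos"}}

    # Steps 2108-2110: translate for the second user and display at the second device.
    print(translate_cockpit(first_cockpit, second_prefs))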
[FIGS. 22A-22B (flowchart, steps 2202-2218): access a GUI for configuring a cockpit at a first device by a first user; configure the cockpit based on the preferences of the first user; share the cockpit with one or more second users; translate the cockpit based on their preferences; display the translated cockpit at the one or more second devices; interact with the cockpit at the second device; store the second users' interactions with the cockpit at a proxy server in a network; ask the first user for permission when a second user changes the cockpit; update the cockpit based on the permission from the first user.]
[FIG. 23 (flowchart, steps 2302-2308): access a database of visual access menus through a GUI for customizing a cockpit at a device; search the database for a cockpit based on an input from a user; customize the cockpit according to the user's preferences; display the customized cockpit at the device.]

[FIG. 24 (flowchart, steps 2402-2410): access a database of visual access menus through a GUI for creating a cockpit at a device; display one or more configuration settings options for creating the visual access menu; receive a selection of one or more settings options from a user; create the cockpit based on the selection received from the user; display the cockpit to the user.]
[FIG. 25 (flowchart, steps 2502-2512): access a database of visual access menus through a GUI for creating a cockpit at a device; display one or more configuration options for customizing or creating the cockpit; create or configure the cockpit based on the selection received from the user; receive ratings for the cockpit from other users in a network; customize the cockpit based on those ratings; display the customized cockpit at the device.]

[FIG. 26 (flowchart, steps 2602-2614): create a first cockpit by accessing a GUI at a first device; download the first cockpit at one or more second devices; customize a second cockpit at the second devices based on the downloaded first cockpit; receive ratings on the customized second cockpit from other users in a network; download the configuration settings of the second cockpit at the first device based on the users' ratings; customize the first cockpit based on the downloaded configuration settings; display the customized first cockpit at the first device.]
[FIG. 27 (flowchart, steps 2702-2708): select from a database a second cockpit of one or more second users whose profiles are similar to the profile of a first user; analyze the second cockpit of the one or more second users; create a first cockpit specific to the first user based on the analysis; display the first cockpit specific to the first user at the device.]

[FIG. 28 (flowchart, steps 2802-2812): access a GUI for creating a cockpit at a first device; provide information about a second user; download the configuration settings of the second user's cockpit at the first device; create or customize a first cockpit based on the second cockpit of the second user; store the first cockpit at the first device; display the first cockpit to the user.]
[FIG. 29 (flowchart, steps 2902-2910): access a GUI for creating a cockpit at a device; download a well-rated cockpit from the Internet; translate or customize the downloaded cockpit according to a language preference of a user; store the customized cockpit at the device; display the customized cockpit at the device.]
[FIG. 30: an environment for accessing a cockpit through a website, with a device 3002, a web page 3004 (user ID, password), a network, a database 3012 including a cockpit, VMThings 3014, and objects 3006a-3006n.]
[FIG. 31 (flowchart, steps 3102-3112): open a website through a web browser at a device; authenticate the user's identity at the website; display one or more configuration options to the user; receive a selection of the one or more configuration options from the user; configure or create a cockpit for the user based on the selection; display the cockpit to the user.]

[FIG. 32 (flowchart, steps 3202-3212): open a website through a web browser at a device; authenticate the user's identity at the website; display a cockpit specific to the user at the device; the user interacts with the cockpit; display an enhanced visual access menu based on the interaction of the user with the cockpit; interact with and control one or more operations of the remote devices.]
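For illustration only (not part of the patent disclosure): a minimal sketch, in Python, of the website flow of FIGS. 31-32 (authenticate the user, then show that user's cockpit). The user store and the authenticate/open_cockpit names are hypothetical, and the plain-text password is for brevity only.

    # Illustrative sketch only; a real system would use a proper credential store.
    USERS = {"alice": {"password": "secret",
                       "cockpit": ["IVR", "Remote Device Control", "Calendar"]}}

    def authenticate(user_id: str, password: str) -> bool:
        record = USERS.get(user_id)
        return record is not None and record["password"] == password

    def open_cockpit(user_id: str, password: str):
        if not authenticate(user_id, password):   # steps 3202-3204: authenticate at the website
            raise PermissionError("authentication failed")
        return USERS[user_id]["cockpit"]          # step 3206: display the user's cockpit

    print(open_cockpit("alice", "secret"))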
[FIG. 33 (flowchart, steps 3302-3312): access a website for creating a cockpit at a first device; invite one or more second users to configure the cockpit; receive one or more inputs from the one or more second users; receive one or more inputs from the first user; configure a cockpit based on the inputs of the first and second users; store the cockpit at the first device.]

[FIG. 34 (flowchart, steps 3402-3410): access a database of visual access menus through a GUI at a device; display a visual access menu along with one or more mode options to a user; receive a selection of a mode option from the user; switch the mode based on the selection of the mode option; play an audio menu to the user when the user selects an audio mode.]
[FIG. 35A: the cockpit 1902 of FIG. 19 at device 102 (IVR, Remote Device Control, Services Control, Outlook, Calendar, Other E-mails, Messengers, Games, Other Objects; 1904a-1904n) shown together with mode options Audio Mode (3502a), Video Mode (3502b), Text Mode (3502c), and List Mode (3502n).]

[FIG. 35B: a Graphical User Interface at device 102 with options Create Cockpit (3504a), Customize Cockpit (3504b), View Cockpit (3506), and Invite Users (3504n), together with mode options Audio Mode (3502a), Video Mode (3502b), Text Mode (3502c), and List Mode (3502n).]
SYSTEMS AND METHODS FOR 
ELECTRONIC COMMUNICATIONS 
CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application is a Continuation-In-Part (CIP) of U.S. Non-Provisional application Ser. No. 13/245,804, entitled 'Systems and Methods for Electronic Communications' and filed on Sep. 26, 2011.

[0002] This application is a Continuation-In-Part (CIP) of U.S. Non-Provisional application Ser. No. 13/272,212, entitled 'Systems and Methods for Electronic Communications' and filed on Oct. 12, 2011.
FIELD OF THE INVENTION

[0003] The present invention relates to electronic communications in a network and, more specifically, to systems and methods for accessing and controlling one or more objects (physical or virtual), such as remote devices and services, from a remote location.
BACKGROUND OF THE INVENTION

[0004] Electronic devices are frequently used in day-to-day life. The electronic devices may include televisions, refrigerators, air conditioners, fans, tube lights, cameras, or other electronic equipment such as transmitters, antennas, etc. All of these devices consume power regularly or at frequent intervals. For efficient power consumption, the electronic devices must be controlled or switched ON/OFF.
[0005] Appliances such as fans, tube lights, or microwaves may be controlled by regulating the electrical parameters associated with them. For example, a user may control the speed of a fan or regulate the operating power of a microwave as required. However, this requires the physical presence of the user to regulate or switch ON/OFF the appliances. A technique for controlling appliances with a remote control device is well known. The remote control device may transmit signals for controlling the appliances; for example, it may simultaneously control air conditioners, fans, or cameras as required. However, the technique is limited by the location of the user. Moreover, the technique is incapable of updating the user on the real-time status of the appliances.
[0006] Another available technique discloses a smart device for controlling the appliances. The smart device is connected to the Internet and to the appliances. A user connected to the smart device via the Internet may control the appliances from a remote location. Moreover, the user may control the appliances by connecting to a processing device via a communication channel. The processing device may be located near the smart device and may further receive signals from the user to control the appliances. However, the technique requires installation of a smart device and/or a processing device for controlling the appliances from a remote location.
[0007] Another available technique discloses real-time position monitoring of vehicles. The user may monitor real-time coordinates of the vehicles based on information received from a transmitter located in the vehicle. The user receives the position coordinates from the transmitter via a GPS server. However, the user is unable to control or update the positional coordinates of the vehicle as desired.
[0008] In light of the above discussion, systems and methods are desired for providing real-time control of the electronic devices and services from a remote location.
SUMMARY

[0009] Embodiments of the invention provide a system for enhancing interaction of a user with objects connected to a network. The system includes a processor, a display screen, and a memory coupled to the processor. The memory comprises a database including a list of two or more objects and instructions executable by the processor to display a menu. The menu is associated with at least two independent objects. Further, the two independent objects are produced by at least two independent vendors.
[0010] Embodiments of the invention further provide a system for enhancing interaction of a user with objects connected to a network. The system includes a processor, a display screen, and a memory coupled to the processor. The memory includes a database comprising a list of one or more objects and instructions executable by the processor to display a menu to the user. The menu includes an icon which may indicate one object made by a vendor. Further, the icon is substantially different from the one provided by said vendor.
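For illustration only (not part of the original disclosure): a minimal data-model sketch of the system described in paragraphs [0009]-[0010], written in Python. The class names (NetworkedObject, Menu) and the vendor and icon values are hypothetical assumptions, not taken from the patent.

    # Illustrative sketch only; names and values are hypothetical.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class NetworkedObject:
        name: str
        vendor: str        # each object may come from an independent vendor
        vendor_icon: str   # icon supplied by the vendor
        display_icon: str  # icon shown in the menu; may differ from the vendor's icon

    @dataclass
    class Menu:
        objects: List[NetworkedObject]

        def entries(self):
            # One menu associated with objects from independent vendors.
            return [(o.display_icon, o.name, o.vendor) for o in self.objects]

    menu = Menu(objects=[
        NetworkedObject("Air Conditioner", "Vendor A", "ac_vendor.png", "ac_generic.png"),
        NetworkedObject("Security Camera", "Vendor B", "cam_vendor.png", "cam_generic.png"),
    ])
    for icon, name, vendor in menu.entries():
        print(icon, name, vendor)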
[0011] Embodiments of the invention provide a method for accessing and controlling remote devices in a network. The method includes accessing a database of visual access menus through a graphical user interface (GUI) at a device. Further, the method includes displaying a visual access menu at the device. The visual access menu may include one or more options. The device may include an Internet of Things application, such as a VMThings, for displaying the visual access menu at the device. The VMThings also enables a user of the device to control the remote devices. The VMThings may be configured to create an Internet of Things menu including representations of recognizable objects. The objects may be physical objects or virtual objects. The Internet of Things menu may be a menu of identifiable objects (physical or virtual) connected in an Internet-like structure. The user may control the remote devices, irrespective of their location, through the visual access menu. The user may select an option from the visual access menu. The method further includes displaying an enhanced visual access menu based on a selection of an option received from the user. The enhanced visual access menu may include one or more device options depending on the selection of the option. The device options are representations corresponding to the remote devices. The method further includes receiving a selection of a device option from the user. The method further includes connecting to a remote device based on the selection of the device option. Further, the method includes controlling one or more operations of the connected remote device based on the selection of the device option.
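For illustration only (not part of the original disclosure): a minimal control-flow sketch of the method in paragraph [0011], written in Python. The in-memory menu database and the connect/control placeholders are hypothetical; in the described system they would correspond to the VMThings retrieving menus and forwarding commands over the network. The option names follow FIG. 3A.

    # Illustrative sketch only; menu contents and connect/control calls are placeholders.
    MENUS = {
        "root": ["Remote Devices", "Services"],
        "Remote Devices": ["Vehicle", "AC", "Camera", "Microwave"],
    }

    def display(menu_name):
        options = MENUS[menu_name]
        for i, option in enumerate(options, 1):
            print(f"{i}. {option}")
        return options

    def connect(device_option):
        print(f"connecting to {device_option} ...")      # placeholder for a real connection
        return {"device": device_option, "connected": True}

    def control(connection, operation):
        print(f"{operation} -> {connection['device']}")  # placeholder for a real command

    options = display("root")              # display the visual access menu
    selection = options[0]                 # the user selects the Remote Devices option
    device_options = display(selection)    # display the enhanced visual access menu
    device = device_options[1]             # the user selects the AC device option
    conn = connect(device)                 # connect to the remote device
    control(conn, "switch ON")             # control an operation of the remote device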
[0012] Embodiments of the invention provide a method for accessing and controlling services from a remote location. The method includes accessing, by a user of a device, a database of visual access menus through a graphical user interface (GUI) at the device. Further, the method includes displaying a visual access menu at the device. The visual access menu may include one or more options. The device may include an Internet of Things application, i.e., a VMThings, for displaying the visual access menu at the device. Further, the VMThings may create an Internet of Things menu including one or more identifiable objects connected to each other in an Internet-like structure. The VMThings may display the visual access menu at the device to enable the user to control the remote services. The method further includes displaying an enhanced visual access menu based on a selection of an option received from the user. The enhanced visual access menu may include one or more service options depending on the selection of the option. The service options are representations corresponding to the services. The method further includes receiving a selection of a service option from the user. The method further includes connecting to a service based on the selection of the service option, i.e., connecting the device to the service. Furthermore, the method includes controlling and displaying information about the service at the device based on the selection of the service option.
[0013] Embodiments of the invention also provide a device for accessing and controlling remote devices in a network. The device may include an Internet of Things application, i.e., a VMThings, configured to enable a user of the device to access a database including visual access menus through a GUI. Further, the VMThings is configured to create an Internet of Things menu including one or more identifiable objects connected in an Internet-like structure. The VMThings may display a visual access menu including one or more options at the device. Further, the VMThings may display an enhanced visual access menu at the device based on a selection of an option received from the user. The enhanced visual access menu may include one or more device options depending on the selection of the option. The device options are representations corresponding to the remote devices. The VMThings may further receive a selection of a device option from the user. The VMThings may also connect the device to a remote device based on the selection of the device option. The VMThings may control one or more operations of the connected remote device based on the selection of the device option.
[0014] Embodiments of the invention also provide a device for accessing and controlling services in a network from a remote location. The device may include an Internet of Things application, such as a VMThings, configured to enable a user of the device to access a database including visual access menus through a GUI. The VMThings is also configured to display a visual access menu including one or more options at the device. Further, the VMThings may display an enhanced visual access menu at the device based on a selection of an option received from the user. The enhanced visual access menu may include one or more service options depending on the selection of the option. The service options are representations corresponding to the services located remotely. The VMThings may further receive a selection of a service option from the user. The VMThings may also connect the device to a service based on the selection of the service option. The VMThings may control and display information of the service at the device based on the selection of the service option.
[0015] Embodiments of the invention also provide a system for accessing and controlling remote devices. The system includes a display device configured to display one or more visual access menus. Further, the system includes an access device connected to the display device. The access device may include an Internet of Things application, i.e., a VMThings, configured to display the one or more visual access menus, including one or more options to control the remote devices, at the display device. The user may create or configure an Internet of Things menu through a Graphical User Interface at the device. In an embodiment of the invention, the VMThings may be configured to create the Internet of Things menu. The VMThings is further configured to enable a user of the access device to access a database including the visual access menus through a GUI. The VMThings may display an enhanced visual access menu at the device based on a selection of an option received from the user. The enhanced visual access menu may include one or more device options depending on the selection of the option. The device options are representations corresponding to the remote devices. The VMThings may further receive a selection of a device option from the user. The VMThings may also connect the device to a remote device based on the selection of the device option. The VMThings may control one or more operations of the connected remote device based on the selection of the device option.
[0016] Embodiments of the invention also provide a system for accessing and controlling services in a network from a remote location. The system may include a display device configured to display one or more visual access menus. Further, the system may include an access device connected to the display device. The access device may include an Internet of Things application, i.e., a VMThings, configured to display the one or more visual access menus, including one or more options to control the remote devices, at the display device. The VMThings is further configured to enable a user of the access device to access a database including the visual access menus through a Graphical User Interface (GUI). The GUI may be used for creating an Internet of Things menu including a plurality of identifiable objects in a network-like structure. The identifiable objects may be physical objects or virtual objects. Further, the VMThings may display an enhanced visual access menu at the device based on a selection of the option received from the user. The enhanced visual access menu may include one or more service options depending on the selection of the option. The service options are representations corresponding to the services. The VMThings may further receive a selection of a service option from the user. The VMThings may also connect the device to a service based on the selection of the service option. The VMThings may control and display information about the service based on the selection of the service option.
[0017] Embodiments of the invention further provide a method for accessing and controlling the remote devices in a network through a web browser. The method includes opening a webpage in the web browser at a device including a VMThings. The method may further include displaying a visual access menu at the device. The VMThings may create or display the visual access menu, or an Internet of Things menu, at the device. The Internet of Things menu may include a plurality of representations corresponding to identifiable objects. The identifiable objects may be physical objects or virtual objects. The visual access menu may include one or more options. Further, the method includes displaying an enhanced visual access menu at the device based on a selection of an option received from the user. The enhanced visual access menu may include one or more device options depending on the selection of the option. The device options are representations corresponding to the remote devices. The method further includes receiving a selection of a device option from the user and connecting the device to a remote device based on the selection of the device option. Further, the method includes controlling the one or more operations of the connected remote device based on the selection of the device option.
[0018] Embodiments of the invention further provide a method for accessing and controlling the services in a network through a web browser. The method includes opening a webpage in the web browser at a device including an Internet of Things application, i.e., a VMThings. The VMThings is configured to enable a user of the device to access a database including the visual access menus through a GUI. The method further includes displaying a visual access menu at the device. The VMThings may display the visual access menu at the device. The visual access menu may include one or more options. Further, the method includes displaying an enhanced visual access menu at the device based on a selection of an option received from the user. The enhanced visual access menu may include one or more service options depending on the selection of the option. The service options are representations corresponding to the services. The method further includes receiving a selection of a service option from the user and connecting the device to a service based on the selection of the service option. Further, the method includes controlling and displaying the information of the service based on the selection of the service option.
[0019] An aspect of the invention is to enable a user to control one or more operations of the remote devices or services through voice commands, gestures, or hand movements. For example, the user may switch on an air conditioner (AC) by showing a thumbs-up gesture in front of the device. The device may include a camera to detect the gesture. The VMThings at the device (or access device) may analyze the gesture and control a remote device based on the analysis.
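For illustration only (not part of the original disclosure): a minimal sketch, in Python, of mapping a detected gesture to a device command, as in the thumbs-up example above. The gesture labels, device names, and command strings are hypothetical.

    # Illustrative sketch only; gesture labels and commands are hypothetical.
    GESTURE_COMMANDS = {
        ("thumb_up", "AC"): "switch_on",
        ("thumb_down", "AC"): "switch_off",
        ("swipe_left", "TV"): "previous_channel",
    }

    def handle_gesture(gesture: str, target_device: str) -> str:
        command = GESTURE_COMMANDS.get((gesture, target_device))
        if command is None:
            return f"no command mapped for {gesture} on {target_device}"
        # In the described system, the VMThings would forward the command to the remote device.
        return f"sending '{command}' to {target_device}"

    print(handle_gesture("thumb_up", "AC"))   # e.g. switches the air conditioner on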
[0020] An aspect of the invention is to transfer the display of a device to another device. The other device may be connected to the device through wireless means.
[0021] Another aspect of the invention is to create a database of visual access menus or enhanced visual access menus. The visual access menus and the enhanced visual access menus are visual menus for controlling one or more objects such as, but not limited to, remote devices, services, and so forth.
BRIEF DESCRIPTION OF THE DRAWINGS

[0022] Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
[0023] FIG. 1A illustrates an exemplary environment, in accordance with a first embodiment of the invention;
[0024] FIG. 1B illustrates another exemplary environment, in accordance with the first embodiment of the invention;
[0025] FIG. 1C illustrates yet another exemplary environment, in accordance with the first embodiment of the invention;
[0026] FIG. 1D illustrates an environment based on a ZigBee network, in accordance with the first embodiment of the invention;
[0027] FIG. 1E illustrates an environment based on a WiMAX network, in accordance with the first embodiment of the invention;
[0028] FIG. 1F illustrates an environment based on a Global System for Mobile Communications (GSM) network, in accordance with the first embodiment of the invention;
[0029] FIG. 1G illustrates an environment based on a ZigBee network, in accordance with the first embodiment of the invention;
[0030] FIG. 1H illustrates an environment based on a WiMAX network, in accordance with the first embodiment of the invention;
[0031] FIG. 1I illustrates an environment based on a combination of a local network and the Internet, in accordance with the first embodiment of the invention;
[0032] FIG. 2A illustrates an exemplary environment, in accordance with a second embodiment of the invention;
[0033] FIG. 2B illustrates another exemplary environment, in accordance with the second embodiment of the invention;
[0034] FIG. 2C illustrates yet another exemplary environment, in accordance with the second embodiment of the invention;
[0035] FIG. 2D illustrates an environment based on a ZigBee network, in accordance with the second embodiment of the invention;
[0036] FIG. 2E illustrates an environment based on a WiMAX network, in accordance with the second embodiment of the invention;
[0037] FIG. 2F illustrates an environment based on a GSM network, in accordance with the second embodiment of the invention;
[0038] FIG. 2G illustrates an environment based on a ZigBee network, in accordance with the second embodiment of the invention;
[0039] FIG. 2H illustrates an environment based on a WiMAX network, in accordance with the second embodiment of the invention;
[0040] FIG. 2I illustrates an environment based on a combination of a local network and the Internet, in accordance with the second embodiment of the invention;
[0041] FIG. 3A illustrates an exemplary visual access menu and enhanced visual access menu at a device, in accordance with the first embodiment of the invention;
[0042] FIG. 3B illustrates an exemplary visual access menu and enhanced visual access menu at the device, in accordance with the second embodiment of the invention;
[0043] FIG. 3C illustrates another exemplary visual access menu and enhanced visual access menu at the device, in accordance with the first embodiment of the invention;
[0044] FIG. 3D illustrates another exemplary visual access menu and enhanced visual access menu at the device, in accordance with the second embodiment of the invention;
[0045] FIG. 4 illustrates an exemplary enhanced visual access menu including one or more device options, in accordance with an embodiment of the invention;
[0046] FIG. 5 illustrates an exemplary enhanced visual access menu including one or more service options, in accordance with an embodiment of the invention;
[0047] FIG. 6 illustrates exemplary components of a device, in accordance with an embodiment of the invention;
[0048] FIG. 7 illustrates exemplary components of an access device, in accordance with an embodiment of the invention;
[0049] FIG. 8 illustrates a flowchart diagram for controlling remote devices, in accordance with an embodiment of the invention;
[0050] FIG. 9 illustrates a flowchart diagram for controlling remote services, in accordance with an embodiment of the invention;
[0051] FIGS. 10A, 10B, and 10C illustrate a flowchart diagram for controlling objects by using a device in a network, in accordance with an embodiment of the invention;
[0052] FIG. 11 illustrates a flowchart diagram for controlling remote devices by using a web browser at a device, in accordance with an embodiment of the invention;
[0053] FIG. 12 illustrates a flowchart diagram for controlling remote services by using a web browser at a device, in accordance with an embodiment of the invention;
[0054] FIGS. 13A, 13B, and 13C illustrate a flowchart diagram for controlling objects in a network through a web browser at a device, in accordance with an embodiment of the invention;
[0055] FIG. 14 illustrates a flowchart diagram for controlling remote devices through a website, in accordance with another embodiment of the invention;
[0056] FIG. 15 illustrates a flowchart diagram for controlling remote devices by using an access device in a network, in accordance with an embodiment of the invention;
[0057] FIG. 16 illustrates a flowchart diagram for controlling remote services by using an access device in a network, in accordance with an embodiment of the invention;
[0058] FIGS. 17A, 17B, and 17C illustrate a flowchart diagram for controlling objects in a network through an access device, in accordance with an embodiment of the invention;
[0059] FIG. 18A illustrates an exemplary display of images of remote devices, in an embodiment of the invention;
[0060] FIG. 18B illustrates transfer of an exemplary display of images from a device to another device, in an embodiment of the invention;
[0061] FIG. 19 illustrates an exemplary cockpit, in accordance with an embodiment of the invention;
[0062] FIGS. 20A-B illustrate exemplary environments for providing access of a cockpit of a user to other users, in accordance with an embodiment of the invention;
[0063] FIG. 21 illustrates a flowchart diagram for providing access control of a cockpit to one or more second users, in accordance with an embodiment of the invention;
[0064] FIG. 22 illustrates a flowchart diagram for providing access control of the cockpit to one or more second users, in accordance with another embodiment of the invention;
[0065] FIG. 23 illustrates a flowchart diagram for configuring a cockpit based on a user's preference, in accordance with an embodiment of the invention;
[0066] FIG. 24 illustrates a flowchart diagram for configuring a cockpit, in accordance with an embodiment of the invention;
[0067] FIG. 25 illustrates a flowchart diagram for customizing a cockpit based on other users' reviews, in accordance with an embodiment of the invention;
[0068] FIG. 26 illustrates a flowchart diagram for downloading and customizing a cockpit at a second device, in accordance with an embodiment of the invention;
[0069] FIG. 27 illustrates a flowchart diagram for configuring a cockpit based on another cockpit of another user, in accordance with an embodiment of the invention;
[0070] FIG. 28 illustrates a flowchart diagram for configuring a cockpit based on another cockpit of another user, in accordance with another embodiment of the invention;
[0071] FIG. 29 illustrates a flowchart for downloading a cockpit from a network, in accordance with an embodiment of the invention;
[0072] FIG. 30 illustrates an environment for accessing a cockpit through a website, in accordance with an embodiment of the invention;
[0073] FIG. 31 illustrates a flowchart diagram for configuring a cockpit through a website, in accordance with an embodiment of the invention;
[0074] FIG. 32 illustrates a flowchart diagram for accessing a cockpit through a website, in accordance with an embodiment of the invention;
[0075] FIG. 33 illustrates a flowchart diagram for configuring a cockpit with the help of other users, in accordance with an embodiment of the invention;
[0076] FIG. 34 illustrates a flowchart diagram for switching a display mode of a cockpit, in accordance with an embodiment of the invention; and
[0077] FIG. 35B illustrates an exemplary display of a GUI along with one or more mode options, in accordance with an embodiment of the invention.
DETAILED DESCRIPTION OF THE INVENTION

[0078] Illustrative embodiments of the invention now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like numbers refer to like elements throughout.
[0079] FIG. lA illustrates an exemplary environment 100, 
in accordance with a first embodiment of the invention. The 
first embodiment describes functionality of an Internet of 
Things application, i.e., a VMThings 108, for controlling a plurality of remote devices 106a-n. A user may create or
configure an Internet of Things menu or cockpit for accessing 
or controlling the plurality of remote devices 106a-n at a
device 102. In an embodiment of the invention, the VMThings 108 may configure or create the Internet of Things menu
or the cockpit. The Internet of Things menu may include 
representations of one or more recognizable or identifiable 
objects such as, but are not limited to, remote devices 106a-n
or services in an Internet or network like structure. The one or 
more identifiable objects may be physical or virtual objects. 
In an embodiment of the invention, a graphical user interface 
(GUI) may be used by the user for creating the Internet of 
Things Menu. The objects may be the remote devices 106a-n or services. The user may use the device 102 for connecting to a plurality of remote devices 106a-n through a network 104 via the Internet of Things menu. The device 102 may be
used by the user to control a plurality of objects in the network 
104. The VMThings 108 may control one or more operations 
of the plurality of objects. In an embodiment of the invention, 
the objects may include remote devices 106a-n. In another
embodiment of the invention, the objects may be services as 
described in FIG. 2A-I. In yet another embodiment of the 
invention, the objects may be a combination of the remote devices 106a-n and services. In an embodiment of the invention,
the device 102 can be a portable device capable of 
communicating and connecting to other devices such as the 
remote devices 106a-n. The device 102 may have a display
screen. In an embodiment of the invention, the device 102
may have a limited display or may not have a display at all. 
Example of the device 102 may include a mobile phone, a 
smart phone, a computer, a personal digital assistant (PDA), a 
tablet computer, a laptop, and so forth. 
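By way of a non-limiting illustration only, the following Python sketch shows one way an Internet of Things menu or cockpit of the kind described above could be represented in the memory of the device 102. The class names, fields, and sample objects are hypothetical and are not taken from the disclosure.

    from dataclasses import dataclass, field

    @dataclass
    class IoTObject:
        """A recognizable object (remote device or service) listed in the menu."""
        object_id: str   # unique identifier, e.g. a remote device ID
        name: str        # label shown in the visual access menu
        vendor: str      # producing vendor; objects may come from independent vendors
        kind: str        # "device" or "service"

    @dataclass
    class IoTMenu:
        """An Internet of Things menu (cockpit) associated with several objects."""
        title: str
        objects: list = field(default_factory=list)

        def add(self, obj: IoTObject) -> None:
            self.objects.append(obj)

    # A menu associated with two independent objects from two independent vendors.
    menu = IoTMenu(title="Home cockpit")
    menu.add(IoTObject("ac-01", "Air conditioner", vendor="VendorA", kind="device"))
    menu.add(IoTObject("cam-02", "Security camera", vendor="VendorB", kind="device"))

    for obj in menu.objects:
        print(f"{obj.name} ({obj.vendor})")

The two sample entries merely illustrate a menu that associates objects produced by two independent vendors, as stated above.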
[0080] The network 104 can be a wired network or a wireless network or a combination of these. The wireless network may use wireless technologies to provide connectivity among various devices. Examples of the wireless technologies include, but are not limited to, Wi-Fi, WiMAX, fixed wireless data, ZigBee, Radio Frequency for Consumer Electronics (RF4CE), HomeRF, IEEE 802.11, 4G or Long Term Evolution (LTE), Bluetooth, Infrared, spread-spectrum, Near Field Communication (NFC), Global System for Mobile Communications (GSM), and Digital-Advanced Mobile Phone Service (D-AMPS). The device 102 is connected to the plurality of remote devices 106a-n through the network 104.
Examples of the wired network include, but are not limited to, 
Local Area Network (LAN), Metropolitan Area Network 
(MAN), Wide Area Network (WAN), and so forth. In an 
embodiment of the invention, the network 104 is the Internet. 
[0081] The plurality of remote devices 106a-n can be electronic equipment such as, but are not limited to, household
devices including electric lights, water pump, generator, fans, 
television (TV), cameras, microwave, doors, windows, com­puter, 
or garage locks, security systems, air-conditioners 
(AC), and so forth. In an embodiment of the invention, the 
plurality of the remote devices 106a-n can be vehicles such as
cars, trucks, vans, and so forth. In an embodiment of the 
invention, the VMThings 108 may present a standard menu 
(or a standard visual access menu) for controlling all remote 
devices 106a-n to the user. The user may be provided with 
different visual access menus based on the location of the 
remote devices 106a-n. For example, the user may be dis­played 
with different visual access menus for remote devices 
present in office, home, factory, and so forth. In another 
embodiment of the invention, the VMThings 108 may display 
a customized menu at the device 102 based on user prefer­ences 
and/or access pattern. In an embodiment of the inven­tion, 
the user may configure the VMThings 108 to control 
remote devices 106a-n present in more than one building. The
buildings may be present at different locations. Similarly, the 
user may control the one or more remote devices 106a-n 
located in his/her office from the home. For example, the user 
may control the door of his/her office cabin, may switch on or switch off his/her office computer/laptop, AC, and so forth. In
an embodiment of the invention, the user may control opera­tions 
of one or more remote devices 106a-n present in a 
factory from the home. Further, the user may access the 
plurality of remote devices 106a-n from a remote location by
using the device 102. Further, the user may use the same 
device 102 for controlling the remote devices located at dif­ferent 
locations such as office, factory, home, etc. The user 
does not have to carry different or multiple devices for controlling different remote devices 106a-n. The device 102 may
include a database including a list of one or more objects. In 
an embodiment of the invention, the device 102 may include 
audio or visual menus of the one or more objects i.e. of the 
remote devices 106a-n. The device 102 may include visual 
access menus and/or enhanced visual access menus corre­sponding 
to various objects. The visual access menu may 
provide an interface to the user to control the one or more 
objects such as remote devices 106a-n. The visual access 
menu may include one or more options such as, but are not limited to, a remote devices option, a services option, and so
forth. In an embodiment of the invention, the visual access 
menus at the device 102 may be updated regularly at pre­defined 
time interval such as after every two days, or once a 
week. The enhanced visual access menus may include one or 
more device options. In an embodiment of the invention, the 
device 102 may include a touch sensitive display. In such a 
scenario, the user may access the one or more options or the 
device options by touching the options directly. In an embodi­ment 
of the invention, the user may connect to the one or more 
objects such as the remote devices 106a-n through applica­tions 
such as, but are not limited to, Skype, Google Talk, 
Yahoo Messenger, Magic Jack, and so forth. 
[0082] Further, the device 102 may include the VMThings 
108 which is configured to enable the user to access the visual 
access menus through a Graphical User Interface (GUI) at the 
device 102. The VMThings 108 may enable the user to con­trol 
the remote devices 106a-n irrespective of their location 
through the network 104. The VMThings 108 may display the 
one or more visual access menus at the device 102. Further, 
the device 102 may include visual access menus associated 
with at least two independent objects. In an embodiment of 
the invention, the at least two independent objects may be produced by two independent vendors. In an embodiment of the invention, the device may include vendor specific visual
access menus or enhanced visual access menus for the remote 
devices 106a-n. Further, the device 102 may also include 
standard menu(s) for accessing the objects. The VMThings 
108 may display the visual access menu depending on the 
independent vendor(s) of the one or more objects. In another 
embodiment of the invention, the VMThings 108 may display 
a visual access menu which is not provided by either of the at 
least two independent vendors of the at least two independent 
objects. In an embodiment of the invention, the user may 
access and control one or more of the remote devices 106a-n
from the remote location by using the device 102. For 
example, the user may use his smart phone to access and 
operate a microwave at his/her home from his/her office. 
Further, the user can use the device 102 at one location to 
monitor and regulate one or more operations of the remote 
devices 106a-n present at another location. The one or more 
operations may include, but are not limited to, switch on,
switch off, regulate, and so forth. 
[0083] Further, the visual access menus may include at 
least one icon indicating one or more objects such as the 
remote devices 106a-n. Further, the icon is substantially different
than the icons provided in the visual access menu 
provided by the vendor. Further, the remote devices 106a-n 
may be grouped into various categories such as, but are not 
limited to, electronics appliances, home devices, buildings, 
doors, room appliances, switches, floor wise, and so forth. 
Further, the remote devices 106a-n may be grouped according to the location of the remote devices, such as home devices, office devices, garage devices, factory devices, home2
devices, farm house devices, and so forth. The VMThings 108 
of the device 102 may store visual access menus and 
enhanced visual access menus corresponding to the remote 
devices 106a-n based on the various categories of the remote devices 106a-n. Each of the remote devices 106a-n may have a unique remote device identity (ID). In an embodiment of the invention, the user may be required to register the remote devices 106a-n with the device 102 so that the remote devices 106a-n may be controlled by using the VMThings 108. In an embodiment of the invention, the user may be required to authenticate or prove his/her identity at the device 102 or for the remote devices 106a-n before controlling one or more operations of the remote devices 106a-n.
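As a minimal illustrative sketch of the registration and grouping described above (all names and sample entries are hypothetical, and the registry is assumed to be kept on the device 102), the grouping of registered remote devices by category or location might look like this:

    from collections import defaultdict

    # Each registered remote device has a unique remote device ID,
    # plus a category and a location used for grouping in the menus.
    registered_devices = [
        {"id": "dev-001", "name": "Garage door", "category": "doors", "location": "home"},
        {"id": "dev-002", "name": "Office AC", "category": "room appliances", "location": "office"},
        {"id": "dev-003", "name": "Factory pump", "category": "switches", "location": "factory"},
    ]

    def group_by(devices, key):
        """Group devices into buckets, e.g. by 'category' or by 'location'."""
        groups = defaultdict(list)
        for dev in devices:
            groups[dev[key]].append(dev["name"])
        return dict(groups)

    print(group_by(registered_devices, "location"))
    # {'home': ['Garage door'], 'office': ['Office AC'], 'factory': ['Factory pump']}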
[0084] Further, the VMThings 108 may display an 
enhanced visual access menu corresponding to the remote 
devices 106a-n. The enhanced visual access menu may 
include one or more device options. The device options may 
be displayed as graphics or icons and/or text representations 
of the remote devices 106a-n. For example, a car may be 
displayed for representing the car option. The user may con­trol 
the remote devices 106a-n by selecting a device option 
from the device options at the device 102. Further, the 
enhanced visual access menu may display the grouping or 
categories of the remote devices 106a-n. The VMThings 108 
may also translate the visual access menu or the enhanced 
visual access menu from a first language to a second lan­guage. 
Examples of the first language and the second lan­guage 
may include, but are not limited to, Spanish, French, 
English, Sanskrit, Hindi, Urdu, Arabic, and so forth. For 
example, the VMThings may translate an English visual 
access menu into a French visual access menu and thereafter, 
it may be displayed at the device 102. The VMThings 108 
may display the visual access menu or the enhanced visual 
access menu at the device 102 based on the user's preferred 
language. 
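The translation of a visual access menu from a first language to a second language, as described above, can be sketched as a simple lookup; the phrase table below is hypothetical, and a real implementation would likely rely on a translation service.

    # Hypothetical phrase table mapping English menu labels to French ones.
    MENU_TRANSLATIONS = {
        ("en", "fr"): {"Remote Devices": "Appareils distants", "Services": "Services"},
    }

    def translate_menu(options, source_lang="en", target_lang="fr"):
        """Return menu options translated into the user's preferred language."""
        table = MENU_TRANSLATIONS.get((source_lang, target_lang), {})
        return [table.get(label, label) for label in options]

    print(translate_menu(["Remote Devices", "Services"]))
    # ['Appareils distants', 'Services']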
[0085] The user may select an option from the visual access 
menu or an enhanced visual access menu. Further, the user 
may select an option (or device options) by using a combina­tion 
of keys on a keypad of the device 102. In an embodiment 
of the invention, the user may select an option by clicking the 
option or the device option by using a mouse device. In an 
embodiment of the invention, the user may select an option by 
touching the screen of the device 102. For example, if the user 
wants to switch on an air conditioner (AC) on the way towards home, the user can select or enter an appropriate key combination on the device 102 or may touch (in case of a touch
sensitive display at the device 102) an option of the visual 
access menu corresponding to the AC. 
[0086] In one embodiment, the user can give a voice com­mand 
to the device 102. Based on the input received by the 
device 102, the air conditioner may be switched on automati­cally. 
Further, the user can also regulate the cooling of the 
room by changing temperature settings of the air conditioner. 
After connecting the device 102 to one or more of the remote 
devices 106a-n, the user can control the one or more opera­tions 
such as, but are not limited to, switch on, switch off, 
reduce temperature, and so forth from a distant location with­out 
being physically present at the location. In one embodi­ment, 
the remote devices 106a-n can be security cameras or 
alarm station installed at the home location of the user. 
[0087] In an embodiment of the invention, the user may 
select an option by making gestures or hand movements at the 
device. For example, the user may do a thumb up gesture to 
switch on an appliance at home or may do a thumb down 
gesture to switch off the same. Similarly, the user may do 
other gestures such as, but are not limited to, waving a hand, 
nodding head, smiling, blinking an eye, and so forth. In an 
embodiment of the invention, the device may include a cam­era 
for detecting the gestures or hand movements. In an 
embodiment of the invention, the VMThings 108 may be 
configured to analyze and interpret the gestures and hand 
movements. Further, the VMThings 108 may include stored 
gestures defined by the user at device 102 and may compare 
or match the real time gestures with the stored gestures. The 
device may include software or hardware such as a microphone
for detecting the voice commands or audio inputs. 
[0088] In another embodiment of the invention, the VMTh­ings 
108 may be configured to analyze the voice commands 
and audio inputs received from the user through voice recog­nition. 
Further, the user may select the option from an Internet 
of Things menu through voice command(s) for controlling 
the remote devices 106a-n. The device 102 may include a list
of voice commands and action to be taken corresponding to 
each command. The VMThings 108 may compare and match 
the received voice command with the stored list and thereafter 
may take an action based on the comparison. In an exemplary 
scenario, the user at the office may switch on the AC present at home by accessing the visual access menu and saying "switch on the AC" on the device 102 (or a smart phone). In an
embodiment of the invention, speech/voice recognition may 
be used to analyze the voice instructions or commands 
received from the user to control the remote devices 106a-n. 
In an embodiment of the invention, the device 102 may 
receive a call from the one or more objects such as a remote 
device. In such a case, the VMThings 108 may display a 
visual access menu of the calling object. 
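The voice-command handling described above can be sketched as a lookup of a recognized utterance against a stored list of commands and corresponding actions. The command list and action names below are hypothetical; the utterance is assumed to have already been recognized by a speech recognizer.

    # Stored list of voice commands and the action to be taken for each one.
    VOICE_COMMANDS = {
        "switch on the ac": ("ac-01", "switch_on"),
        "switch off the ac": ("ac-01", "switch_off"),
        "close the garage door": ("door-03", "close"),
    }

    def handle_voice_command(utterance: str):
        """Compare the recognized utterance with the stored list and pick an action."""
        key = utterance.strip().lower()
        if key in VOICE_COMMANDS:
            device_id, action = VOICE_COMMANDS[key]
            return f"send '{action}' to {device_id}"
        return "command not recognized"

    print(handle_voice_command("Switch on the AC"))   # send 'switch_on' to ac-01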
[0089] In an embodiment of the invention, the VMThings 
108 may determine location of the device or the plurality of 
objects such as the remote devices 106a-n. In an embodiment 
of the invention, the selection of the option may be automatic 
based on one or more predefined instructions of the user of the 
device 102. For example, the predefined instruction may be 
like switch on the AC at 6 PM, switch off the TV at 2 PM, and 
close the door of the garage. The remote devices 106a-n may
be controlled according to these predefined instructions irre­spective 
of the location of the user or the device 102. 
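A predefined instruction such as "switch on the AC at 6 PM," as described above, can be viewed as a small schedule that is checked against the current time. The following sketch is hypothetical and purely illustrative.

    from datetime import datetime

    # Predefined instructions of the user: (hour of day, device ID, action).
    PREDEFINED_INSTRUCTIONS = [
        (18, "ac-01", "switch_on"),    # switch on the AC at 6 PM
        (14, "tv-02", "switch_off"),   # switch off the TV at 2 PM
    ]

    def due_actions(now: datetime):
        """Return the actions whose scheduled hour matches the current hour."""
        return [(dev, act) for hour, dev, act in PREDEFINED_INSTRUCTIONS
                if hour == now.hour]

    print(due_actions(datetime(2013, 3, 28, 18, 0)))
    # [('ac-01', 'switch_on')]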
[0090] In an embodiment of the invention, one or more 
signals may be generated and transmitted by the device 102 
based on the selection of the option or an input received from 
the user. The signals may be transmitted to the remote devices 
106a-n through the network 104. The remote devices 106a-n 
may be controlled based on the signals received from the 
device 102. In an embodiment of the invention, the device 102 
may receive an alert message(s) regarding the operational 
condition of the remote devices 106a-n. For example, an alert 
message like 'Car door left opened' may be received by the 
user at his/her mobile phone for a car standing in a parking 
area. In an embodiment of the invention, the alert message 
may be received through at least one of an SMS, an MMS, an instant message, an e-mail, a phone call, a turning on of the display of the device when it is off, and so forth. In another embodiment of
the invention, the user may further receive alert messages as pop-up messages at the device 102, at a GPS system, at a multi
function display of a car of the user, at a TV, at a picture frame, 
and so forth. Thereafter, the user may control or operate the 
car door through his/her smart phone and from the office 
itself. There is no need for him to rush to the parking area for 
closing the door. In an embodiment of the invention, the user 
may receive alert messages at a predefined time period. For 
example, the user may receive the alert messages regarding 
the connected remote devices 106a-n after every 1 hour, 2 
hour, 30 minutes, and so forth. 
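The alert mechanism described above, in which an alert about a device's operational condition is delivered over one or more channels such as SMS or e-mail, could be sketched as follows. The channel functions are hypothetical placeholders for whatever delivery mechanism is actually used.

    def send_sms(user, text):
        print(f"SMS to {user}: {text}")

    def send_email(user, text):
        print(f"E-mail to {user}: {text}")

    # Hypothetical mapping from a user's preferred channels to delivery functions.
    CHANNELS = {"sms": send_sms, "email": send_email}

    def send_alert(user, message, preferred_channels):
        """Deliver an operational-condition alert over each preferred channel."""
        for name in preferred_channels:
            CHANNELS[name](user, message)

    send_alert("user@example.com", "Car door left opened", ["sms", "email"])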
[0091] Further, the displayed Internet of Things menu or 
the visual access menu may extend or change based on the 
user selection of the option from the visual access menu. In 
another embodiment of the invention, the device 102 may 
receive images, videos, or audio related to the remote devices 106a-n at the predefined time period. Further, the device 102
may receive real-time information such as, but not limited to, images, video, etc. of the plurality of the remote devices
106a-n. In an exemplary scenario, the user can monitor and 
control real-time operation of the remote devices 106a-n such 
as one or more vehicles based on the information received 
through the network 104. For example, the user can receive 
images or videos of the one or more vehicles on the device 
102. Further, the VMThings 108 may display these images of 
remote devices 106a-n to the user. The user can send instructions
or voice response to the one or more vehicles through the 
network 104. For example, the user can track position of the 
one or more vehicles in real-time from the device 102 at 
another location. 
[0092] In an embodiment of the invention, the enhanced 
visual access menus corresponding to the remote devices 
106a-n may be stored at a server 114 in the network 104. As 
discussed with reference to FIG. 1B, the user of the device 
102 may access the visual access menus corresponding to the 
remote devices 106a-n through a web browser in an exem­plary 
environment 200. The environment 200 may include the 
device 102 such as a smart phone capable of connecting to the 
network 104 (or the Internet) via the web browser. In an 
embodiment of the invention, the remote devices 106a-n may
be controlled via a local wireless communication or local 
network. In an embodiment of the invention, the remote 
devices 106a-n may be connected to a bridge device that may
further be connected to the Internet. The web browser may be 
used to connect to the Internet and in turn to the local network. 
Examples of the web browser include, but are not limited to, 
Internet Explorer, Google Chrome, Mozilla Firefox, 
Netscape Navigator, and so forth. The user can enter a Uni­form 
Resource Locator (URL) such as, 'www.ABC.com' in 
the web browser to access a website including a database. The 
database at the website may store a plurality of visual access 
menus or Internet of Things menu or cockpit or enhanced 
visual access menus associated with the remote devices 106a-n.
The enhanced visual access menus are visual access menus 
corresponding to the remote devices 106a-n. Each of the 
enhanced visual access menus may include one or more 
device options. In an embodiment of the invention, the data­base 
may be present in the network 104. 
[0093] A webpage 110 may be displayed at the device 102 
corresponding to the URL entered by the user. The user may 
be required or asked to authenticate his/her identity before 
accessing the visual access menus. The displayed webpage 
110 may include one or more data request fields 112a-b where 
the user may enter his/her details. In an embodiment of the 
invention, the user may access various visual access menus by 
authenticating at the website by entering his/her login details 
such as, but are not limited to, password, user ID, e-mail ID,
date of birth, and so forth, in the one or more data request 
fields 112a-b. Though not shown, a person skilled in the art will appreciate that the webpage 110 may include more than two data request fields 112a-b. One or more options of the visual access menus or the enhanced visual
access menus may be displayed to the user at his/her device 
102. 
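The data request fields 112a-b described above amount to a simple credential check before the stored visual access menus are returned. A hypothetical server-side sketch (the user store, names, and plain-text password are for illustration only) might be:

    # Hypothetical user store: login details mapped to the menus that user may see.
    USERS = {"alice": {"password": "secret", "menus": ["Home cockpit", "Office cockpit"]}}

    def authenticate_and_fetch_menus(user_id: str, password: str):
        """Return the user's visual access menus if the login details match."""
        record = USERS.get(user_id)
        if record is None or record["password"] != password:
            return None                      # authentication failed
        return record["menus"]

    print(authenticate_and_fetch_menus("alice", "secret"))   # ['Home cockpit', 'Office cockpit']
    print(authenticate_and_fetch_menus("alice", "wrong"))    # None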
[0094] In an embodiment of the invention, the user may 
create personalized visual access menus for controlling his/ 
her personal devices of the remote devices 106a-n. In an 
embodiment of the invention, the user may configure or create 
an Internet of things menu for controlling remote devices. The 
Internet of Things menu may include a plurality of represen­tations 
corresponding to identifiable objects such as the 
remote devices 106a-n. Further, the user may customize the 
Internet of Things menu based on his/her preferences such as, 
but not limited to, language preference, theme preference, 
color preference, font size preference, device preference, ser­vice 
preference, and so forth. The VMThings 108 may display 
customized or personalized visual access menu at the device 
102. In an embodiment of the invention, the VMThings 108 
may display visual access menu at a second display con­nected 
to the device 102. The user may select an option from 
the multiple options of the visual access menu. The enhanced 
visual access menu (or the Internet of Things menu) may be 
displayed at the device based on the selection of an option by 
the user at the device 102. In an embodiment of the invention, 
a connection may be established between the user device 102 
and the remote devices 106a-n based on the selection of the 
option by the user. Thereafter, the user can access and control 
the remote devices 106a-n irrespective of a location of the 
user. The user may not have to be in front of or close to the 
remote device 106a-n for controlling the operations of the 
remote devices 106a-n. 
[0095] FIG. 1C illustrates another exemplary environment 
300, in accordance with the first embodiment of the invention.
An access device 116 may be connected to a display device 
118. The access device 116 may access and control the plu­rality 
of remote devices 106a-n connected through the net­work 
104. The access device 116 may be any device capable 
of data and/or voice communications through the network 
104 or the remote devices 106a-n. Examples of the access 
device 116 include, but are not limited to, a router, a tele­phone, 
a set top box, a hub, a gateway, a printer, a music 
system, a mobile phone, a PDA, a smart phone, a picture 
frame, and so forth. In an embodiment of the invention, the 
access device 116 may not have a display or may have limited 
display capability. The access device 116 may include a plu­rality 
of ports for connecting to the network 104, and/or the 
display device 118. The plurality of ports can be such as, but 
are not limited to, parallel ports, serial ports, DB-2 connector, 
IEEE 1284, IEEE 1394 ports, 8P8C ports, PS/2 ports, RS-232 
ports, Registered Jack (RJ) 45 ports, RJ 48 ports, VGA port, 
Small Computer System Interface (SCSI) ports, USB ports, 
DB-25 ports, and so forth. 
[0096] Examples of the display device 118 may include, 
but are not limited to, a television, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, a projector screen, a computer, a laptop, a tablet computer, a picture frame, and so forth. The access
device 116 may provide a network interface to the display 
device 118. The user may use the access device 116 for 
connecting to the network 104. Moreover, the user can access 
the remote devices 106a-n connected to the network 104 by 
using the access device 116. In this embodiment of the inven­tion, 
once connected with the remote devices 106a-n the 
visual access menus or the Internet of Things menus may be 
displayed to the user at the display device 118. In an embodi­ment 
of the invention, the user may have to authenticate 
and/or enter one or more login details before viewing the visual
access menus. The user may authenticate or enter his/her 
personal details at the access device 116. In an embodiment of 
the invention, the user may authenticate or enter the personal 
details at the display screen. 
[0097] In an embodiment of the invention, the access 
device 116 may be a home controller device. The user may 
access the VMThings 108 by logging into this home control­ler 
and may view the visual access menus at his device 102 or
a display device 118. After logging into the home controller
the user may control the objects i.e. remote devices or ser­vices 
associated with the home controller. Therefore, the user 
may control the one or more objects by using a combination 
of devices such as the home controller, smart phone, another 
display device, and so forth. 
[0098] The access device 116 may include an Internet of 
Things application i.e. VMThings 108 application for access­ing 
the visual access menus and the enhanced visual access 
menus. The VMThings 108 may display the visual access 
menus at the display device 118. The user may connect to the
remote devices 106a-n by selecting one or more options of the 
visual access menus. Further, the remote devices 106a-n may
be grouped into various categories such as, but are not limited 
to, electronics appliances, home devices, buildings, doors, 
room appliances, electric switches, cars, windows, and so 
forth. Further, the remote devices 106a-n may be grouped 
according to location, such as home devices, office devices, 
garage devices, and so forth. The VMThings 108 of the access device 116
may store visual access menus and enhanced visual access 
menus according to the various categories of the remote 
devices 106a-n at the access device 116. Further, the user may control any remote device from the remote devices 106a-n by
selecting one or more options from the visual access menu or 
the Internet of Things menu. In an exemplary scenario, the 
user can connect to the network 104 by using a telephone and 
may view the visual access menu on a screen of the television. 
Thereafter, the user may access and control the remote 
devices 106a-n from the telephone by pressing appropriate 
keys/buttons of the telephone. 
[0099] In an embodiment of the invention, the user may 
register the remote devices 106a-n or do some settings at the 
access device 116 or the remote devices 106a-n, so that the 
user may control the remote devices 106a-n from the VMTh­ings 
108. In an embodiment of the invention, the user may be 
required to authenticate or prove his/her identity at the access 
device 116 or for the remote devices 106a-n before control­ling 
one or more operations of the remote devices 106a-n. 
[0100] FIG. 1D illustrates an environment based on a Zig­Bee 
network 120, in accordance with the first embodiment of 
the invention. As shown, the access device 116 may include 
the VMThings 108 for displaying a visual access menu or an 
enhanced visual access menu or an Internet of Things menu at the display device 118. The access device 116 may connect to the remote devices 106a-n through the ZigBee network 120. In
an embodiment of the invention, the remote devices 106a-n 
may be connected to the ZigBee network 120 through a local 
network such as a LAN, a NFC network, a Bluetooth network, 
and so forth. The local network may be connected to the 
ZigBee network 120 through some gateway device such as 
bridge, router, hub, gateway device, switch, and so forth. 
[0101] FIG. 1E illustrates an environment based on a 
WiMAX network 122, in accordance with the first embodi­ment 
of the invention. As shown, the access device 116 may 
include the VMThings 108 for displaying the Internet of 
Things menu or the visual access menu or the enhanced visual 
access menus at the display device 118. The access device 
116 may connect to the remote devices 106a-n through the 
WiMAX network 122. In an embodiment of the invention, the 
remote devices 106a-n may be connected to the WiMAX 
network 122 through a local network such as a LAN, NFC 
network and so forth. In an embodiment of the invention, the 
user may be required to register the remote devices 106a-n or do
some settings at the access device 116 or the remote devices 
106a-n, so that the user may control the remote devices 
106a-n from the VMThings 108. In an embodiment of the 
invention, the user may be required to authenticate or prove 
his/her identity at the access device 116 or for the remote 
devices 106a-n before controlling one or more operations of 
the remote devices 106a-n. The user may access the visual 
access menus and enhanced visual access menus at the access 
device 116 through a GUI. The VMThings 108 may enable 
the user to control the remote devices 106a-n irrespective of 
the location of the remote devices 106a-n. For example, the 
user may control operations of the air conditioner located in 
his/her factory by being at home itself. The user may not have 
to be physically present at the factory or near the air condi­tioner 
for controlling the operations of the air conditioner. 
The user may do the same through the VMThings 108 of the 
access device 116 (or the device 102). 
[0102] FIG. 1F illustrates an environment based on a Glo­bal 
System for Mobile Communication (GSM) network 124, 
in accordance with the first embodiment of the invention. As 
shown, the access device 116 may be connected to the remote 
devices 106a-n through the GSM network 124. Though not 
shown, but a person skilled in the art will appreciate that the 
access device 116 may be connected to the remote devices 
106a-n through other networks, such as, but are not limited to, an RF4CE network, an NFC network, an HSPA network, a LAN, a WAN, a 3rd generation network, a 4th generation network, a CDMA network, an EV-DO network, and so forth.
[0103] FIG. 1G illustrates an environment based on the 
ZigBee network 120, in accordance with the first embodiment 
of the invention. As shown, the device 102 may include the 
VMThings 108. A user may configure an Internet of Things 
menu by using the VMThings at the device 102. The user of 
the device 102 may connect to the remote devices 106a-n by 
using the VMThings 108 through the GUI at the device 102. 
Further, the device 102 may be connected to the remote 
devices 106a-n through the ZigBee network 120. In an 
embodiment of the invention, the device 102 may be con­nected 
to other wireless network such as the WiMAX network 
122, as shown in FIG. 1H. 
[0104] FIG. 1I illustrates an environment based on a com­bination 
of a local network 126 and the Internet 130, in 
accordance with the first embodiment of the invention. The 
remote devices 106a-n may be connected to a local network
126. The local network 126 can be a private network, a wire­less 
network, and so forth. The local network 126 in turn may 
be connected to an external or public network such as, but are 
not limited to, the Internet 130 through a bridge device 128. 
The device 102 may connect to the remote devices 106a-n 
through the Internet 130. The local network 126 and the 
Internet 130 may be connected to each other through other 
devices such as, but are not limited to, a router, a hub, a switch, 
a gateway, and so forth. 
[0105] In an embodiment of the invention, the VMThings 
108 may display an advertisement or multiple advertisements 
along with the visual access menu at the device 102. In an 
embodiment of the invention, the VMThings may display the 
advertisement or multiple advertisements along with an Inter­net 
of Things menu at the device 102. In an embodiment of the 
invention, the advertisement(s) are selected and displayed 
based on the content of the displayed visual access menu or 
the Internet of Things menu. For example, if the visual access 
menu is for controlling the home appliances, then the adver­tisements 
may be about home appliances such as AC, fans, 
etc. In an embodiment of the invention, the visual access
menu and/or advertisements may be displayed at a second 
display or a display device such as a picture frame, LCD, 
television, and so forth connected to the device 102. Further, 
the visual access menus and the advertisements may be dis­played 
at the display device or the second display through 
wireless means such as Wi-Fi, Bluetooth, ZigBee, and so 
forth. 
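Selecting advertisements based on the content of the displayed menu, as described above, can be sketched as a simple topic match. The advertisement catalogue and topic labels below are hypothetical.

    # Hypothetical catalogue mapping menu topics to candidate advertisements.
    AD_CATALOGUE = {
        "home appliances": ["AC sale", "Ceiling fan discount"],
        "banking": ["Insurance offer", "Open a savings account"],
    }

    def pick_ads(menu_topic: str, limit: int = 2):
        """Return advertisements related to the topic of the displayed menu."""
        return AD_CATALOGUE.get(menu_topic, [])[:limit]

    print(pick_ads("home appliances"))   # ['AC sale', 'Ceiling fan discount']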
[0106] FIG. 2A illustrates an exemplary environment 400, 
in accordance with a second embodiment of the invention. 
The user may use the device 102 to connect to a plurality
of services 202a-n through the network 104. The user can 
access the information about the services 202a-n at the device 
102. As discussed with reference to FIG. 1A, the device 102 
can be a portable or hand-held device capable of communi­cating 
and connecting to the network 104 or other devices 
such as the remote devices 106a-n. Examples of the device 102
may include a mobile phone, a smart phone, a computer, a 
personal digital assistant (PDA), a tablet computer, a laptop 
etc. The network 104 can be a wired network such as a Local 
Area Network (LAN) or a Wide Area Network (WAN) or a 
wireless network such as a WiMAX network or a combination of these.
Examples of the services 202a-n include, but are 
not limited to, banking services, travel services, entertain­ment 
services, railways services, movies services, restau­rants, 
and so forth. Further, the banking services may be 
categorized as insurance services, retail banking services, 
internet banking services, loans service, NRI banking, and so 
forth. The entertainment services may be accessed by the user 
to get information about music, movies, theatre, news, car­toons, 
or sports. For examples, the user may access movies 
services to know the new releases in movies. The information 
about services may be displayed in form of an enhanced 
visual access menu. The user may interact with the enhanced 
visual access menu accordingly. 
[0107] In an embodiment of the invention, the VMThings 
108 may display an Internet of Things menu at the device 102. 
The Internet of things menu may include representations of 
one or more recognizable or identifiable objects such as, but 
are not limited to, remote devices 106a-n or services in an 
Internet or network like structure. The one or more identifi­able 
objects may be physical or virtual objects. A graphical 
user interface (GUI) may be used by the user for creating the 
Internet of Things Menu. In an embodiment of the invention, 
the objects may be the services 202a-n. 
[0108] Further, the VMThings 108 may highlight a fre­quently 
accessed service option or preferred service option in 
the enhanced visual access menu for the services 202a-n or 
the Internet of Things menu based on the user's previous 
access patterns. In an embodiment of the invention, the 
VMThings 108 may highlight one or more frequently 
accessed device options or preferred device options in the 
enhanced visual access menu for the remote devices 106a-n. 
Further, the VMThings 108 may store the user access pattern 
at the device 102. In an embodiment of the invention, the 
VMThings 108 may present a standard menu (or a standard 
visual access menu) for controlling all services 202a-n to the 
user. In another embodiment of the invention, the VMThings 
108 may display a customized menu of services 202a-n at the 
device 102 based on user preferences and/or access pattern. 
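Highlighting frequently accessed service or device options based on the stored access pattern, as described above, might be sketched as follows; the access log and option names are hypothetical.

    from collections import Counter

    # Hypothetical log of options the user has selected previously.
    access_log = ["travel", "banking", "travel", "movies", "travel", "banking"]

    def options_to_highlight(log, top_n=2):
        """Return the most frequently selected options, to be highlighted in the menu."""
        return [option for option, _count in Counter(log).most_common(top_n)]

    print(options_to_highlight(access_log))   # ['travel', 'banking']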
[0109] The device 102 may include a Graphical User Inter­face 
(GUI) to enable the user to access the services 202a-n. In 
an embodiment of the invention, the device 102 may include 
audio or visual menus of the services 202a-n. The device 102 
may include visual access menus and/or enhanced visual 
access menus corresponding to the services 202a-n. The 
enhanced visual access menu may include one or more ser­vice 
options. The service options may be displayed as graph­ics 
or icons or text representing the services 202a-n. The user 
may control and get more information about the services 
202a-n by selecting a service option from the service options 
at the device 102. In an embodiment of the invention, the user 
may select a service option by touching the screen of the 
device 102. For example, if the user wants more information 
about the travelling service, the user may select the travel 
service option. In one embodiment, the user can give a voice 
command to the device 102 for selecting a service option 
from the enhanced visual access menu. Further, the user may 
select an option by using a combination of keys on a keypad 
of the device 102. Further, the user may select a service option 
by using a mouse device. In an embodiment of the invention, 
the selection of the service option may be automatic based on 
the one or more predefined instructions of the user of the 
device 102. In an embodiment of the invention, the user may 
have to register him/her or the device 102 to access the ser­vices 
202a-n. In an embodiment the user may have to authen­ticate 
his identity prior to accessing the services 202a-n. In an 
embodiment of the invention, the user may receive alert mes­sages 
related to the services 202a-n. For example, the user 
may receive reminders about making a payment for his/her 
credit card bill. In another embodiment of the invention, the 
user may receive the alert messages regarding the connected 
services 202a-n at a predefined time period such as, but are 
not limited to, after every 1 hour, 2 hour, 30 minutes, and so 
forth. In an embodiment of the invention, the VMThings 108 
may alert the user through at least one of turning on the display of the device 102 from an off state and presenting a menu
(visual access menu or Internet of Things menu or cockpit), 
presenting a menu in a pop up window, sending Short Mes­saging 
Service (SMS) message, sending a Multimedia Mes­saging 
Service (MMS) message, initiating a telephone call, 
and so forth. Further, the user may receive alert message as a 
pop up message at his/her Global Positioning System (GPS) 
device or a multi function display of his/her car or at a screen of
a television or at a mobile phone of the user, and so forth. 
[0110] In another embodiment of the invention, the device 
102 may receive images, videos, or audio related to the services
202a-n at the predefined time period. In an embodiment 
of the invention, the user may access or control the services 
202a-n by giving voice commands or voice inputs. In an 
embodiment of the invention, the user may connect to the 
services 202a-n through applications such as, but are not 
limited to, Skype, Google Talk, Yahoo Messenger, Magic 
Jack, and so forth. 
[0111] Further, the device 102 may include visual access 
menus associated with at least two independent objects or 
services. In an embodiment of the invention, at least two 
independent objects/services may be produced by at least two 
independent vendors. In an embodiment of the invention, the 
device 102 may include vendor specific Internet of Things 
menus or visual access menus or enhanced visual access 
menus for the services 202a-n. Further, the device 102 may 
also include standard menu(s) for accessing the objects. The 
VMThings 108 may display the visual access menu depend­ing 
on the independent vendor(s) of the one or more objects. 
In another embodiment of the invention, the VMThings 108 
may display a visual access menu which is not provided by 
either of the at least two independent vendors of the at least 
two independent objects. Further, the visual access menus
may include at least one icon indicating the one or more 
services 202a-n. Further, the icon is substantially different 
than the icons provided in the visual access menu or the 
Internet of Things menu provided by the vendor. The VMTh­ings 
108 may display customized or personalized visual 
access menu or the Internet of Things menu at the device 102. 
In an embodiment of the invention, the VMThings 108 may 
display visual access menu or the Internet of Things menu at 
a second display connected to the device 102. 
[0112] In an embodiment of the invention, speech/voice 
recognition may be used to analyze the voice instructions or 
commands received from the user to access the services 202a-n.
In an embodiment of the invention, the device 102 may 
receive a call from the services 202a-n. In such a case, the 
VMThings 108 may display a visual access menu and/or an 
Internet of Things menu of the calling service. Further, the 
Internet of Things menu may include one or more options for 
interacting with the service from which the call is received.
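The handling of an incoming call from an object or service, where the menu of the calling party is displayed as described above, could be sketched as a lookup keyed by the caller identity; the mapping and identifiers are hypothetical.

    # Hypothetical mapping from caller identities to their stored menus.
    MENUS_BY_CALLER = {
        "bank-service": ["Check balance", "Pay credit card bill"],
        "ac-01": ["Switch on", "Switch off", "Set temperature"],
    }

    def on_incoming_call(caller_id: str):
        """Look up and return the visual access menu of the calling object/service."""
        return MENUS_BY_CALLER.get(caller_id, ["Default menu"])

    print(on_incoming_call("bank-service"))   # ['Check balance', 'Pay credit card bill']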
[0113] FIG. 2B illustrates another exemplary environment 
500, in accordance with the second embodiment of the inven­tion. 
In an embodiment of the invention, the visual access 
menus or the Internet of Things menu corresponding to the 
services 202a-n may be stored at the server 114 in the network 
104. The user at the device 102 may access an enhanced 
visual access menu corresponding to the services 202a-n by 
using a web browser. The device 102 may be configured to 
connect to the network 104 (or the Internet) by entering a 
URL or a website address in the web browser. Examples of 
the web browser include, but are not limited to, Apple Safari, 
Internet Explorer, Google Chrome, Mozilla Firefox, 
Netscape Navigator, and so forth. The user can enter a URL or
a website address in the web browser to access a database 
including a plurality of enhanced visual access menus corre­sponding 
to the services 202a-n. In an embodiment of the 
invention, the database may be present in the network 104. 
[0114] A webpage 204 including the one or more data 
request fields 112a-b may be displayed at the device 102 
based on the entered URL. The user may enter his/her details 
in the data request fields 112a-b for getting access to the 
database. Thereafter, at least one enhanced visual access 
menu to access the services 202a-n may be displayed to the
user at the device 102. The user may access information about 
the one or more services 202a-n by interacting with the dis­played 
enhanced visual access menus. In an embodiment of 
the invention, the webpage 204 may include at least one of 
images, audio/video files, text, hyperlinks, and so forth.
[0115] In an embodiment of the invention, a new visual 
access menu or a new Internet of things menu may be dis­played 
when the user is directed to a new web site based on the 
user's input or selection. The new visual access menu may be 
an IVR menu or an Internet of Things menu associated with 
the new web site. Further, the new visual access menu may 
include options associated with the new web site. 
[0116] FIG. 2C illustrates yet another exemplary environ­ment 
600, in accordance with the second embodiment of the 
invention. As discussed with reference to FIG. 1C, the user 
may use the access device 116 to access or control services 
202a-n. The access device 116 may be any device capable of 
data and/or voice communications through the network 104. 
In an embodiment of the invention, the access device 116 may 
not have a display or may have limited display capabilities. 
The access device 116 can be such as, but are not limited to, 
a router, a telephone, a set top box, a hub, a gateway, a printer, 
a mobile phone, a smart phone, a PDA, a tablet computer, a 
walkie-talkie, and so forth. Further, the access device 116 
may include a plurality of ports for connecting to the network 
104 or the display device 118 such as a television or an LCD 
display. Examples of the plurality of ports include, but are not 
limited to, parallel ports, serial ports, DB-2 connector, IEEE 
1284, IEEE 1394 ports, 8P8C ports, PS/2 ports, RS-232 ports, 
Registered Jack (RJ) 45 ports, RJ 48 ports, VGA port, Small 
Computer System Interface (SCSI) ports, USB ports, DB-25 
ports, and so forth. 
[0117] The access device 116 may provide a network inter­face 
to the display device 118. The user may use the access 
device 116 for accessing the one or more of the services 
202a-n through the network 104. An enhanced visual access 
menu or an Internet of Things menu corresponding to the 
services 202a-n may be displayed to the user. Thereafter, the 
user may access the information about the services 202a-n 
accordingly. In an embodiment of the invention, the user may 
have to enter one or more login details for authenticating 
himself/herself to gain access to the one or more visual access 
menus. In an exemplary scenario, the user can connect to the 
network 104 by using a telephone and may view the visual 
access menu on a television screen. Thereafter, the user may 
access and control the services 202a-n from the telephone by 
selecting or dialing or pressing one or more combination of 
keys at the telephone. 
[0118] In an embodiment of the invention, the VMThings 
108 may display an advertisement or multiple advertisements 
along with the visual access menu at the display device 118. 
In an embodiment of the invention, the advertisement(s) are 
selected and displayed based on the content of the displayed 
visual access menu. For example, if the visual access menu is 
for controlling the banking services, then the advertisements 
may be about insurance and opening accounts. In an embodi­ment 
of the invention, the visual access menu and/or adver­tisements 
may be displayed at a second display or the display 
device 118 such as a picture frame, LCD, television, and so 
forth connected to the access device 116. Further, the visual 
access menus and the advertisements may be displayed at the 
display device 118 or the second display through wireless 
means such as Wi-Fi, Bluetooth, ZigBee, and so forth. 
[0119] FIG. 2D illustrates an environment based on the 
ZigBee network 120, in accordance with the second embodi­ment 
of the invention. As shown, the access device 116 may 
include the VMThings 108 for displaying a visual access 
menu or an enhanced visual access menu including one or 
more service options at the display device 118. The access 
device 116 may access and/or connect to the services 202a-n 
through the ZigBee network 120. Examples of the services 
202a-n include, but are not limited to, banking services, travel 
services, entertainment services, railways services, movies 
services, restaurants, hotels, and so forth. In an embodiment 
of the invention, the services 202a-n may be accessed through 
the ZigBee network 120 and the local network 126 such as a 
LAN, an NFC network, a Bluetooth network, virtual private 
network (VPN), and so forth. The local network may be 
a privately monitored network with no or limited access to
outside users. The local network 126 may be connected to the 
ZigBee network 120 through some gateway device such as 
the bridge device 128, a router, a hub, a gateway, a switch, and 
so forth. 
[0120] FIG. 2E illustrates an environment based on the 
WiMAX network 122, in accordance with the second 
embodiment of the invention. As shown, the access device 
116 may include the VMThings 108 for displaying a visual
access menu or an enhanced visual access menu including 
one or more service options at the display device 118. The 
access device 116 may connect to the services 202a-n through 
the WiMAX network 122. Examples of the services 202a-n
include, but are not limited to, banking services, travel ser­vices, 
entertainment services, railways services, movies ser­vices, 
restaurants, and so forth. In an embodiment of the 
invention, the services 202a-n may be connected to the 
WiMAX network 122 through a local network such as a LAN, 
an NFC network, and so forth. The local network 126 may be 
connected to the WiMAX network 122. In an embodiment of 
the invention, the user may be required to register for the services
202a-n or do some settings at the access device 116 or the 
remote devices 106a-n, so that the user may control the ser­vices 
202a-n (or remote devices 106a-n) from the access 
device 116. In an embodiment of the invention, the user may 
be required to authenticate or prove his/her identity at the 
access device 116 or the services 202a-n before accessing the 
services 202a-n. The user may access visual access menus 
and enhanced visual access menus at the access device 116 
through a GUI. The VMThings 108 may enable the user to 
access and control the services 202a-n irrespective of the 
location of the user. 
[0121] FIG. 2F illustrates an environment based on the 
Global System for Mobile Communication (GSM) network 
124, in accordance with the second embodiment of the inven­tion. 
As shown, the access device 116 may be connected to the
services 202a-n through the GSM network 124. Though not 
shown, but a person skilled in the art will appreciate that the 
access device 116 may be connected to the services 202a-n 
through other networks, such as, but are not limited to, an 
RF 4CE network, an NFC network, an HSPA network, a LAN, 
a WAN, a 3rd generation network, a 4th generation network, a
Code Division Multiple Access (CDMA) network, an EV-DO 
network, and so forth. 
[0122] FIG. 2G illustrates an environment based on the 
ZigBee network 120, in accordance with the first embodiment 
of the invention. As shown, the device 102 may include the 
VMThings 108 for configuring or customizing or displaying 
an Internet of Things menu at the device 102 by a user. The 
Internet of Things menu may include representations of one 
or more recognizable or identifiable objects such as, but are 
not limited to, remote devices 106a-n or services in an Inter­net 
or network like structure. The one or more identifiable 
objects may be physical or virtual objects. A graphical user 
interface (GUI) may be used by the user for creating the 
Internet of Things Menu. The device 102 can be a portable 
device capable of communicating and connecting to the net­work 
104 or other devices such as the remote devices 106a-n. 
Example of the device 102 may include, but are not limited to, 
a mobile phone, a telephone, a smart phone, a computer, a 
personal digital assistant (PDA), a tablet computer, a laptop, 
and so forth. A user of the device 102 may access the services 202a-n by using the VMThings 108 through the GUI
at the device 102. Further, the device 102 may be connected to 
the services 202a-n through the ZigBee network 120. In an 
embodiment of the invention, the device 102 may be con­nected 
to other wireless network such as the WiMAX network 
122, as shown in FIG. 2H. 
[0123] FIG. 2I illustrates an environment based on a com­bination 
of a local network and the Internet, in accordance 
with the first embodiment of the invention. The services 
202a-n may be interconnected through the local network 126. 
The local network 126 can be a private network, a wireless 
network, a VPN, and so forth. The local network 126 in turn
may be connected to an external or public network such as, 
but are not limited to, the Internet 130 through a bridge device 
128 or a router, or a switch or a gateway device, and so forth. 
The user of the device 102 may connect to or access the services
202a-n through the Internet 130. Further, the VMThings 108 
may display information about services in a preferred lan­guage 
set by the user. For example, if the user wants the 
information in English, the VMThings may display the infor­mation 
about the services 202a-n in English language, and if 
the user is interested in getting information in Spanish lan­guage, 
the VMThings may display the information about the 
services 202a-n in Spanish language. VMThings is config­ured 
to display the visual access menu or the enhanced visual 
access menu in different languages such as, but are not limited 
to, English, Spanish, French, German, Sanskrit, Hindi, and so 
forth. Further, the user may have to register himself or the 
device 102 (or the access device 116) at the website before 
accessing the services 202a-n. In an embodiment of the 
invention, the services 202a-n may be accessed through the 
web browser or the web page 110 as shown in FIG. 2B.
[0124] FIG. 3A illustrates an exemplary visual access menu 
308 and an enhanced visual access menu 310 at a device 102, 
in accordance with the first embodiment of the invention. As 
discussed with reference to FIG. 1A, the device 102 may 
include a graphical user interface (GUI) for accessing the 
visual access menus. Further, the VMThings 108 may display 
the visual access menu 308 (or the Internet of Things menu) 
at the device 102 so as to enable the user to control the remote 
devices 106a-n. A visual access menu 308 may include one or
more options. The options may be a remote devices 302 
option and a services 304 option. Though not shown, a person skilled in the art will appreciate that the visual access menu 308 (or the Internet of Things menu) may include more
than two options. A user of the device 102 may select an 
option of these options from the displayed visual access menu 
308 (or the Internet of Things menu). Further, the user may 
select an option in any of the following ways, but not
limited to, touching an option, through a voice command, 
through a gesture or hand movement, through an audio input, 
by pressing one or more keys at the device 102, and so forth. 
Further, the VMThings 108 may use voice recognition to 
enable the user to make selection of an option or icon from the 
visual access menu 308 (or the Internet of Things menu) 
through a voice command. The device 102 may include a 
voice recognition module to process and analyze the voice 
command(s). 
[0125] Thereafter, an enhanced visual access menu 310 (or 
an enhanced Internet of Things menu) may be displayed 
based on the selection of the option from the visual access 
menu 308. For example, if the user has selected the remote 
devices 302 option, then the enhanced visual access menu 
310 including one or more device options 306a-n may be 
displayed to the user at the device 102. The one or more 
device options may include options corresponding to the 
remote devices 106a-n such as, but are not limited to, a 
vehicle 306a, an air conditioner (AC) 306b, camera 306c, 
microwave 306n, and so forth. The user may select a device 
option of the device options 306a-n. For example, the user 
may select and control a microwave by selecting the micro­wave 
option 306n. For example, if the user may control the 
operations such as switch off, switch on, regulate, and so forth 
through the enhanced visual access menu. Further, the remote 
devices 106a-n may include some predefined settings so that
the user may access and control the remote devices 106a-n 
from a remote location. In an embodiment of the invention, 
the predefined settings may be done by the user. The VMTh­ings 
108 may store these pre-defined settings at the access 
device 116 (or the device 102). In an embodiment of the 
invention, the device 102 may be connected to the services 
based on a local communication protocol for nearby communication and proximity, such as NFC, Bluetooth,
and so forth. Further, the user may have to authenticate his/her 
identity before accessing the remote devices 106a-n. The 
device 102 may connect to the remote devices based on the 
predefined settings. Further, in an embodiment of the inven­tion, 
each remote device of the remote devices 106a-n may 
have a unique remote device identity (ID) to distinguish from 
other remote devices. Further, the user may be allowed to 
access the remote devices 106a-n based on registration and/or
authentication. 
[0126] In an embodiment of the invention, the user may 
personalize or customize the visual access menus or the Inter­net 
of Things menu displayed to him/her according to his/her 
preferences. For example, the user may select remote devices 
such as car, garage, home doors, fans, and lights of his/her 
house. Now the user may be displayed with a visual access 
menu corresponding to his/her preferred remote devices of 
the remote devices 106a-n. Through this visual access menu 
or the Internet of Things menu the user may access and 
control one or more operations of the personal remote 
devices. Similarly, the user may define his/her preferences for 
accessing the remote devices present at his/her office or fac­tory, 
and so forth. Therefore, multiple visual access menus 
may be stored at the devices based on the preferences of the 
user. In an embodiment of the invention, more than one user 
may use the device 102 for accessing remote devices 106a-n. 
For example, in a home, 4 users may be using the same smart phone for controlling the multiple devices of the home. The
VMThings 108 allows different users to access remote 
devices (or services) according to their own preferences at the 
device 102 (or the access device 116). The VMThings 108 
may also store the different preferences corresponding to the 
different users. The VMThings 108 may identify different users based on their unique user IDs or details. Further, the VMThings 108 may highlight a few frequently selected or previously selected options of the visual access menu. Further, the VMThings may display a menu for communicating with the one or more objects made by a vendor. In an embodiment of the invention, the menu is not provided by the vendor.
Further, the one or more objects may comprise at least two 
objects produced by two independent vendors. 
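As a non-limiting illustration of the per-user preferences described above, a small Python sketch follows; the device names, user IDs, and the functions personalized_menu and USER_PREFERENCES are assumptions for illustration only.

# Per-user preferences filter a master list of objects into a personalized
# visual access menu; users without stored preferences get the full menu.
ALL_DEVICES = ["car", "garage", "home doors", "fans", "lights",
               "microwave", "air conditioner", "camera"]

USER_PREFERENCES = {
    "user-1": ["car", "garage", "home doors", "fans", "lights"],  # home profile
    "user-2": ["microwave", "camera"],
}

def personalized_menu(user_id):
    preferred = USER_PREFERENCES.get(user_id)
    if not preferred:
        return list(ALL_DEVICES)
    return [d for d in ALL_DEVICES if d in preferred]

print(personalized_menu("user-1"))   # ['car', 'garage', 'home doors', 'fans', 'lights']
print(personalized_menu("guest"))    # full menu when no preferences are stored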
[0127] Further, the user may provide a language preference 
or a display preference. For example, the VMThings 108 may 
display the visual access menu (or the Internet of Things 
menu) in the Spanish language based on the user's Spanish language preference. In an embodiment of the invention, the visual access menu (or the Internet of Things menu) may be displayed by the VMThings 108 on a bigger display screen in the vicinity of the device 102, such as, but are not limited to, a projector screen, an LCD display, an LED display, a television, and so forth, based on the user's display preference.
Further, the VMThings 108 may store the usage or access 
pattern for the users based on his/her selections of options 
from the visual access menus or the enhanced visual access 
menus (or the Internet of Things menus) at the device 102. In 
an embodiment of the invention, the device 102 may store 
usage patterns for more than one user at the device 102. 
[0128] In an embodiment of the invention, the user may 
select an option from the one or more options at the device 
102 (or the access device 116) through voice inputs. For 
example, the user may switch on a microwave present at home 
by saying "Switch On the Microwave" or just by saying 
"Switch On". In another embodiment of the invention, the 
user may provide inputs at the device 102 by using different 
gestures or hand movements. For example the user may 
switch on an air conditioner by showing a gesture of a thumb 
up at the device 102. In an embodiment of the invention, the 
device 102 may include a camera. Further, the user may 
provide inputs regarding controlling remote devices (or ser­vices) 
at the device 102 by clicking an image. In an embodi­ment 
of the invention, the VMThings 108 may store a list of 
voice commands or gestures or hand movements for selecting 
options from the visual access menus or the enhanced visual 
access menus (or the Internet of Things menus). The VMTh­ings 
108 may store the actions to be taken corresponding to 
these commands or gestures or hand movements. 
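A minimal sketch of the stored command/gesture table described above is given below in Python; the specific phrases, gesture labels, and action names are assumptions for illustration and are not taken from the disclosure.

# Spoken phrases or recognized gesture labels are looked up in a stored table
# and mapped to the device action to be taken.
COMMAND_ACTIONS = {
    "switch on the microwave": ("microwave", "switch_on"),
    "switch on":               ("microwave", "switch_on"),
    "thumb_up":                ("air conditioner", "switch_on"),
    "thumb_down":              ("air conditioner", "switch_off"),
}

def dispatch(command):
    """Return the (device, action) pair for a voice command or gesture label."""
    key = command.strip().lower()
    if key not in COMMAND_ACTIONS:
        raise KeyError("no stored action for input: %r" % command)
    return COMMAND_ACTIONS[key]

print(dispatch("Switch On the Microwave"))   # ('microwave', 'switch_on')
print(dispatch("thumb_up"))                  # ('air conditioner', 'switch_on')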
[0129] FIG. 3B illustrates an exemplary visual access menu 
308 and an enhanced visual access menu 312 of services 
202a-n at the device, in accordance with a second embodiment
of the invention. The user may access information about one 
or more services by selecting the services 304 option from the 
visual access menu 308 (or the Internet of Things menu for 
services 202a-n). An enhanced visual access menu 312 or an enhanced Internet of Things menu corresponding to the services 202a-n may be displayed to the user by the VMThings 108. The enhanced visual access menu 312 may include one or more service options 314a-n for different types of services such as, but are not limited to, entertainment 314a, travel 314b, banking 314c, hotels 314n, movies, airlines, and so forth.
[0130] In an embodiment of the invention, the user can 
further expand the visual access menu for any of the services 
by selecting a service option from the service options 314a-n. 
For example, the user may access more information about 
banking services by selecting a banking option 314c. In an 
embodiment of the invention, the user may customize the 
visual access menu displayed to him by providing his/her 
preferences about the services (or remote devices) he/she 
would like to access or control. For example, the user may 
select preferred services such as entertainment, banking, and 
hotels. Therefore, the user will now be shown an extended visual access menu including options for these three preferred services only. In an embodiment of the invention, the device
102 may be connected to the services based on the local 
communication protocol based on nearby communication 
and proximity such as NFC, Bluetooth, and so forth. Further, 
the user may have to authenticate his/her identity before 
accessing the services 202a-n. Further, in an embodiment of 
the invention, each service of the services 202a-n may have a 
unique service identity (ID) to distinguish from other ser­vices. 
Similarly, every user may have a unique user ID. In an 
embodiment of the invention, the user may be authenticated 
based on the user ID. Further, the user may be allowed to 
access the services 202a-n based on registration and/or 
authentication. 
[0131] In an embodiment of the invention, the user may 
access the remote devices 106a-n and services 202a-n 
through a web browser as shown in FIG. 2B. FIG. 3C illus­trates 
another exemplary visual access menu and an enhanced 
visual access menu at the device 102 when a web browser is 
used to access the visual access menus for controlling the
remote devices 106a-n. The visual access menus may be 
stored at the server 114 in the network 104. In an embodiment 
of the invention, the VMThings may update the database at 
the device 102 (or the access device 116) at a regular interval. 
Further, the database may store a category attribute for each of 
the one or more objects i.e. the remote devices 106a-n and a 
standard menu according to each category attribute. Simi­larly, 
the database may store other attributes or properties 
such as, but not limited to, location, device name, and so forth, 
associated with the plurality of objects. In an embodiment of 
the invention, the user can access the visual access menu 
including the various device options 306a-n through the web 
browser. The user may enter a URL in the web browser. A web 
page 110a including a visual access menu may be displayed 
at the device based on the entered URL. The visual access 
menu at the web page 110a may include options such as, but are not limited to, a remote devices option 302 and a services option 304. In an embodiment of the invention, the user may be asked to enter his/her personal details for authentication prior to getting access to the visual access menu(s). The user may select an option from the remote devices option 302 and the services option 304.
[0132] The display of the device 102 may switch from the 
webpage 110a to webpage 110b when the user selects the 
remote devices option 302. The webpage 110b may include 
an enhanced visual access menu including the device options 
306a-n. The device options 306a-n may be graphics or icon 
and/or text options representing the remote devices 106a-n 
such as, but are not limited to, a vehicle, an air conditioner 
(AC), a camera, a door, a microwave, a window, and so forth. 
Examples of the device options 306a-n include, but are not 
limited to, a vehicle 306a, an AC 306b, a camera 306c, a 
microwave 306n, and so forth. In an embodiment of the 
invention, when the user selects the services option 304 from 
the webpage 110a, the display of the device 102 may change 
from the webpage 110a to a webpage 110c as shown in FIG. 
3D. The webpage 110c may include an enhanced visual 
access menu including the service options 314a-n. The ser­vices 
options 314a-n may include options for accessing the 
services such as, but are not limited to, entertainment 314a, 
travel 314b, banking 314c, hotels 314n, food, and so forth. 
The information may be displayed to the user based on his/her 
selection accordingly. Further, the information may be dis­played 
to the user in a language based on the user's language 
preference. 
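Purely as an illustrative sketch of the page switching described above, the routing from webpage 110a to webpage 110b or 110c could look like the following Python; the page contents and the function next_page are assumptions, not part of the disclosure.

# Selecting the remote devices option on page 110a leads to page 110b, and
# selecting the services option leads to page 110c.
PAGES = {
    "110a": {"remote devices": "110b", "services": "110c"},
    "110b": ["vehicle 306a", "AC 306b", "camera 306c", "microwave 306n"],
    "110c": ["entertainment 314a", "travel 314b", "banking 314c", "hotels 314n"],
}

def next_page(current_page, selected_option):
    """Return the page to display after the user selects an option."""
    routes = PAGES[current_page]
    if isinstance(routes, dict) and selected_option in routes:
        return routes[selected_option]
    return current_page   # no transition defined; stay on the same page

page = next_page("110a", "remote devices")
print(page, PAGES[page])   # 110b ['vehicle 306a', 'AC 306b', ...]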
[0133] FIG. 4 illustrates an exemplary enhanced visual 
access menu 402 (or the Internet of Things menu for remote 
devices 106a-n) including one or more device options 404a-l, 
in accordance with an embodiment of the invention. A visual 
access menu 402 may include the one or more device options 
404a-l. The device options 404a-l may include, but are not limited to, a vehicle 404b, an AC 404d, a camera 404e, a microwave 404f, a car 404g, a truck 404h, and so forth. In an
embodiment of the invention, the user of the device 102 may 
select a device option such as a vehicle option 404b from the 
device options 404a-l by touching the vehicle option 404b. In 
another embodiment of the invention, the user may enter a 
voice command or play an audio at the device 102 or at some 
other device nearby to select a device option of the device 
options 404a-l from the enhanced visual access menu 402 (or 
an enhanced Internet of Things menu for the remote devices 
106a-n). In another embodiment of the invention, the user 
may select device options 404a-l through gestures or hand 
movements such as a thumb up, a thumb down, a waving 
hand, a head nod, and so forth. The enhanced visual access 
menu 402 includes device options 404a-l. The user may close 
the door of the car by selecting the Close option 4041. Simi­larly, 
the user may regulate the temperature of the microwave 
by selecting the regulate option 404i. Though not shown, a person ordinarily skilled in the art will appreciate that the enhanced visual access menu 402 may include different device options and more device options than the device options 404a-l shown. Further, the
the device options 404a-l may differ based on the user's 
preferences such as language, remote devices, and so forth. 
[0134] FIG. 5 illustrates an exemplary visual access menu 
502 (or the Internet of Things menu) including one or more 
service options 504a-k, in accordance with an embodiment of 
the invention. The enhanced visual access menu 502 may 
include a plurality of service options 504a-k. Though not shown, a person skilled in the art will appreciate that the enhanced visual access menu 502 may include more service options than shown. The service options 504a-k may include
services such as, but are not limited to, banking 504b, enter­tainment 
504c, travel 504d, and so forth. Further, the service 
options 504a-k may differ based on the user's preferences 
such as language, services of interest, and so forth. 
[0135] The user may select a service option of the service 
options 504a-k. In an embodiment of the invention, the user of 
the device 102 may select the banking 504b option from the 
service options 504a-k by touching the banking 504b option. 
In an embodiment of the invention, the user may select the 
banking 504b option by using a combination of keys such as 
'12'. The user can enter the key combination by using an input 
device such as a keyboard connected to the device 102 or 
through keypad of the device 102. In another embodiment of 
the invention, the user may enter a voice command or music 
through a microphone of the device 102 to select a service 
option from the service options 504a-k of the visual access 
menu 502. In yet another embodiment of the invention, the 
user may select or control a service through gestures or hand 
movements. The user may get information about credit cards 
by selecting the credit cards 504h option. Similarly, the user 
may retrieve more information about his/her credit card bill 
by selecting the check bill 504k option from the visual access 
menu 502. 
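A minimal sketch of the key-combination selection described above follows, assuming an illustrative mapping table; the combination values and option labels other than '12' for banking 504b are placeholders.

# The entered key combination is looked up and mapped to a service option.
KEY_COMBINATIONS = {
    "12": "banking 504b",
    "13": "entertainment 504c",
    "14": "travel 504d",
}

def select_by_keys(keys):
    """Return the service option associated with the entered key combination."""
    return KEY_COMBINATIONS.get(keys, None)

print(select_by_keys("12"))    # banking 504b
print(select_by_keys("99"))    # None -> no option bound to this combination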
[0136] In an embodiment of the invention, the user may access the local services available in a nearby area or in the vicinity of the device 102 through the VMThings 108. For example, if the user is near some services and has the device 102 or the access device 116, then the VMThings 108 may enable the user to communicate and connect to the local services. Further, the VMThings 108 may provide some suggestion(s) regarding the local services and offerings. For example, the device 102 or the user may communicate with a nearby bank, coffee shop, or train station.
[0137] Further, the user may have to authenticate his/her 
identity before accessing or using the services. For example, 
the user may be asked to enter his personal details for authen­tication 
prior to connecting or accessing the services. The 
authentication process prevents unauthorized users from 
accessing the services. Further, each service may be identified 
through its unique service ID. 
[0138] FIG. 6 illustrates exemplary components of the 
device 102, in accordance with an embodiment of the inven­tion. 
The device 102 may include a system bus 622 to connect 
the various components. Examples of the system bus 622 
include several types of bus structures including a memory 
bus, a peripheral bus, or a local bus using any of a variety of
bus architectures. As discussed with reference to FIG. 1A, the device 102 can be a communication device capable of connecting to other devices such as the remote devices 106a-n through the network 104. Examples of the device 102 include a mobile phone, a smart phone, a computer, a personal digital assistant (PDA), a tablet computer, a laptop, etc. The remote devices 106a-n can be devices such as, but are not limited to, home appliances, vehicles, doors, lights, security systems, garage locks, and so forth. Further, the user may access the remote devices 106a-n from a remote location by using the device 102. In an embodiment of the invention, the remote devices 106a-n may be devices present at a home location. In another embodiment of the invention, the remote devices 106a-n may be devices present at an office location. In yet another embodiment of the invention, the remote devices 106a-n may be present at a factory location.
[0139] The device 102 can connect to the network 104 
through a network interface 616. An Input/Output (IO) inter­face 
618 of the device 102 may be configured to connect to 
external or peripheral devices such as a memory card 620a, a 
keyboard 620b, a mouse 620c, and a Universal Serial Bus 
(USB) device 620d. Although not shown, various other 
devices can be connected through the IO interface 618 to the 
device 102. In an embodiment of the invention, the device 102 
may be connected to a hub that provides various services such 
as voice communication, network access, television services
and so forth. For example, the hub may be a Home Gateway 
device that acts as a hub between the device 102 and the 
network 104. 
[0140] The device 102 may include a display 602 to output 
graphical information or the visual access menus or the Inter­net 
of Things menus to the user of the device 102. In an 
embodiment of the invention, the display 602 may include a
touch sensitive screen. Therefore, the user can provide inputs 
to the device 102 by touching the display 602 or by point and 
click using the mouse 620c. The user can interact with the 
visual access menu (or the Internet of Things menu) by press­ing 
a desired button from the keyboard 620b. For example, the 
user can press a '3' key from the keyboard 620b to select a 
node 3 in the visual access menu. Further, the user can directly 
select the node 3 of the visual access menu from the display 
602, in case of a touch sensitive screen. 
[0141] A memory 606 of the device 102 may store various 
programs, data and/or instructions that can be executed by a 
processor 604 of the device 102. Examples of the memory 
606 include, but are not limited to, a Random Access Memory 
(RAM), a Read Only Memory (ROM), a hard disk, and so 
forth. A person skilled in the art will appreciate that other 
types of computer-readable media which can store data that is 
accessible by a computer, such as magnetic cassettes, flash 
memory cards, digital video disks, and the like, may also be 
used by the device 102. The memory 606 may include a 
graphical user interface (GUI) 604 for accessing the 
enhanced visual access menus (or the enhanced Internet of 
Things menu) for the remote devices 106a-n and/or services 202a-n. The memory 606 may include a database 610 for storing the enhanced visual access menus corresponding to the remote devices 106a-n and/or the plurality of services
202a-n. Further, the database 610 may store user preferences 
related to the enhanced visual access menus of the remote 
devices 106a-n and the plurality of services 202a-n. Further, the database 610 may include a category attribute for each of the objects, i.e., the services 202a-n or the remote devices 106a-n, and a standard menu according to each category
attribute. Further, the database 610 may store the alert and 
reminder messages. In an embodiment of the invention, the 
database 610 may store information about various services 
202a-n and remote devices l06a-n. Further, the database 610 
may be updated at a predefined time interval. For example, the 
database 610 may be updated after every 2 days, once in a 
week, monthly, and so forth. In an embodiment of the inven­tion, 
the updates may be received from the server 114 as 
shown in FIG. 1B. In another embodiment of the invention,
the updates about the visual access menus may be received 
from the network 104. 
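A minimal sketch of how the database 610 might be organized is given below, assuming an SQLite store; the table and column names (objects, standard_menus, category) are illustrative assumptions and not taken from the disclosure.

# Each object (remote device or service) carries a category attribute, and a
# standard menu is kept per category.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE objects (object_id TEXT PRIMARY KEY,
                          name      TEXT,
                          category  TEXT,
                          location  TEXT);
    CREATE TABLE standard_menus (category TEXT PRIMARY KEY,
                                 menu     TEXT);   -- e.g. JSON list of options
""")
conn.execute("INSERT INTO objects VALUES ('mw-01', 'microwave', 'appliance', 'kitchen')")
conn.execute("INSERT INTO standard_menus VALUES ('appliance', "
             "'[\"switch on\", \"switch off\", \"regulate\"]')")

# Look up the standard menu for an object through its category attribute.
row = conn.execute("""
    SELECT m.menu FROM objects o
    JOIN standard_menus m ON m.category = o.category
    WHERE o.object_id = ?""", ("mw-01",)).fetchone()
print(row[0])   # ["switch on", "switch off", "regulate"]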
[0142] In an embodiment of the invention, the VMThings 
612 may update the database 610 based on crowd sourcing. That is, the database 610 may be updated based on feedback or reviews or thoughts of other users. For example, if 10 users out of 15 users visiting a website and accessing the visual access menus say that there is some error in the system for controlling a particular object, then, based on the ratings provided by these users, the record or the menu for the particular object in the database 610 may be updated. The VMThings 612 may also learn about problems associated with the visual access menus or the device or the objects from many other sources and may find a solution based on many other users. Examples of the other sources include, but are not limited to, other network devices, remote devices 106a-n, services 202a-n, users, a server, and so forth.
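The crowd-sourcing rule above can be sketched as follows; the 2/3 threshold and the function name needs_update are assumptions for illustration only (the disclosure gives the 10-of-15 example but no fixed threshold).

# Flag an object's menu record for update when the share of users reporting
# an error reaches the threshold.
def needs_update(error_reports, total_users, threshold=2 / 3):
    """Return True when reporting users meet or exceed the threshold fraction."""
    if total_users == 0:
        return False
    return error_reports / total_users >= threshold

print(needs_update(10, 15))   # True  (the 10-of-15 example above)
print(needs_update(2, 15))    # False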
[0143] In an embodiment of the invention, the database 610 
may be created based on the information of a yellow pages 
directory. The plurality of objects may be categorized based 
on the category mentioned in the yellow pages. Further, the 
visual access menus in the database may be created based on 
the categories of the objects according to the yellow pages. In 
an embodiment of the invention, the database 610 may be 
created by a human operator or an automatic application. 
[0144] Further, the memory 606 may store an Internet of 
Things application such as a VMThings 612 for displaying 
visual access menus corresponding to the objects such as 
remote devices 106a-n or the services 202a-n at the device 102. Further, the VMThings 612 may be configured to connect the device 102 to one or more of the remote devices 106a-n. In an embodiment of the invention, the VMThings 612 may be used to connect to the services 202a-n remotely. The VMThings 612 may be configured to display a visual representation, in the form of enhanced visual access menus, of the remote devices 106a-n or the services 202a-n at the display 602. The device 102 may further include a radio interface 614 configured for wireless communications with other devices in the network 104. The visual access menus may include multiple device options or service options. The user can select one or more options from the visual access menu. Further, the VMThings 612 may connect the user to the remote devices 106a-n or services based on the selection of the options. Further, the VMThings 612 may be configured to enable the device 102 to receive images, videos, and so forth of the connected remote devices 106a-n and services 202a-n irrespective of their location. In an embodiment of the invention, the images are real-time images. In an embodiment of the invention, the VMThings 612 may be implemented as software or firmware or hardware or a combination of these at the device 102.
[0145] In an embodiment of the invention, the VMThings 612 may store one or more selections of options made by the user(s) in the database 610. Further, the VMThings 612
may bookmark the options based on the past history of the
user activity with the visual access menu. The database 610 
may store personalized visual access menus or enhanced 
visual access menu for different users. The database 610 may 
be updated based on user instructions. The user instructions 
may be provided by the user through commands such as, but 
are not limited to, voice commands, gestures, selection of 
keys, and so forth. In an embodiment of the invention, the 
VMThings 612 is also configured to analyze and process the 
voice commands based on the context of the voice command. 
[0146] Further, the database 610 may store visual access 
menu of the one or more objects based on category of the 
objects. In another embodiment of the invention, the database 
may store the visual access menus based on the vendors of the 
one or more objects. In an embodiment of the invention, the 
visual access menus may be stored based on one or more 
properties of the objects such as, but not limited to, location, 
type, distance and so forth. The database 610 may also store 
advertisements related to the one or more objects. In an 
embodiment of the invention, the VMThings 612 may display 
at least one advertisement along with the visual access menu 
at the device or display device. The advertisements may be 
related to the content of the visual access menu. In an embodi­ment 
of the invention, the advertisements may be related to 
the one or more objects, remote devices 106a-n, services 
202a-n, and so forth. In another embodiment of the invention, 
the advertisements may be related to a location of the device 
102 or of the one or more objects. In an embodiment of the 
invention, the advertisements may be displayed to the user 
based on one or more preference of the user. For example, the 
user may prefer to view advertisements of electronic devices 
like computers, etc. Further, the VMThings 108 may high­light 
the one or more options in the visual access menu. In an 
embodiment of the invention, the one or more options may be 
highlighted based on the users' previous selection of options. 
Further, the VMThings 612 may keep a record of user activity 
on the device 102. The VMThings 612 may store the user 
profile and access patterns of the user for accessing the visual 
access menu or interacting with the device 102. 
[0147] In an embodiment of the invention, the database 610 
may be updated based on addition or deletion of the one or 
more objects. For example, if a new remote device is added to 
the list of devices to be controlled then the visual access menu 
will be updated accordingly. Further, the VMThings 612 may 
detect errors which may occur during the user interaction 
with the visual access menu. The VMThings 612 may also 
report to the user about these errors. In an embodiment of the 
invention, the errors may occur due to some other reasons 
such as technical reasons, network failure, and so forth. 
[0148] In an embodiment of the invention, the user may 
receive a call from the controlled one or more objects. Also, 
the user may be presented with a visual access menu associ­ated 
with the object from which the call is received. The 
VMThings 612 may display the visual access menu associ­ated 
with the object from which call is received at the device 
102. 
[0149] Depending on the complexity or number of device options and/or service options in the visual access menu, the size of the visual access menu may differ. Moreover, the size of the display 602 may be limited or small. As a result, all the
options of the visual access menu may not be displayed 
together on the display 602. In such a case, the VMThings 612 
may allow the user to navigate by scrolling horizontally and/ 
or vertically to view options on the visual access menu. Fur­ther, 
the VMThings 612 may detect the capability of the 
device 102 before displaying the visual access menu. For example, if the device 102 is a basic mobile phone with limited display-screen functionality, the application may display the visual access menu in the form of a simple list. Similarly, a list may be displayed in the case of fixed-line or wired telephones. Moreover, if the device 102 includes a high-capability screen, such as, but not limited to, that of an iPad or a television, the visual access menu may be displayed in the form of graphics.
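A sketch of this capability check is given below; the capability labels and the function render_menu are illustrative assumptions rather than names from the disclosure.

# Limited displays get the menu as a plain numbered list; high-capability
# screens get a graphical layout.
def render_menu(options, device_capability):
    if device_capability in ("basic_phone", "fixed_line"):
        # Limited display: fall back to a simple list of options.
        return "\n".join("%d. %s" % (i + 1, o) for i, o in enumerate(options))
    # High-capability screen (e.g. tablet or television): graphical menu.
    return {"layout": "graphical", "icons": options}

options = ["vehicle", "AC", "camera", "microwave"]
print(render_menu(options, "basic_phone"))
print(render_menu(options, "tablet"))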
[0150] Further, the memory 606 may include other appli­cations 
that enable the user to communicate/interact with the 
remote devices 106a-n through the network 104. Examples of
other applications include, but are not limited to, Skype, 
Google Talk, Magic Jack, and so forth. Other applications 
may be stored as software or firmware on the device 102. 
Further, the memory 606 may include an Operating System 
(OS) (not shown) for the device 102 to function properly. 
[0151] Though not shown, the device 102 may include a 
camera, a microphone, a speaker, and so forth. The user may 
provide voice commands by using the microphone. Further, 
the user may provide the input or select the option by clicking 
an image by using the camera. The user may control one or 
more operations of the remote devices 106a-n by making 
gestures or hand movements in front of the camera of the 
device 102. The speaker may be used to output music and 
voice responses to the user. Further, the VMThings 612 may 
record voice commands received from the user. These 
recorded commands then may be stored at the device 102. The 
user may input one or more key or key combinations using the 
keyboard 620b. The keyboard 620b may be a physical key­board 
or a virtual keyboard displayed on a touch screen dis­play 
602 of the device 102. In an embodiment, the keyboard 
620b is a keypad on the device 102. Subsequently, after some 
processing by the application, the enhanced visual access 
menu corresponding to the remote devices 106a-n and/or the
services 202a-n based on the user inputs or selection is 
searched and displayed on the display 602. 
[0152] In an embodiment of the invention, the visual access 
menu or the enhanced visual access menu may be provided in 
real-time to the user. In another embodiment of the invention, 
the visual access menus (or the Internet of Things menus) 
may be downloaded and stored at the device 102 and may be 
accessed by the user later. In an embodiment of the invention, 
the visual access menu may be provided by a messaging 
service such as a Short Messaging Service (SMS). In an 
embodiment of the invention, customized visual access 
menus may be displayed to the user based on one or more 
preferences of the user. In an embodiment of the invention, 
the visual access menu may be customized based on the 
profile of the user. In an embodiment of the invention, the 
profile may be generated based on the access pattern of the user or the data captured by a hub connected to the device 102. Further, in
an embodiment of the invention, the VMThings 108 may 
convert the format of the message including the visual access 
menu into another format based on the user preference related 
to the format. For example, the VMThings 108 may convert 
the format of the visual access menu received in an SMS 
format to an e-mail format based on user preference. 
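Purely as an illustration of the format conversion described above, a small sketch follows; the message fields and placeholder address are assumptions and not part of the disclosure.

# A menu received as an SMS is re-packaged as an e-mail when the user's
# preferred format is e-mail.
def convert_menu_message(message, preferred_format):
    if message["format"] == preferred_format:
        return message
    if message["format"] == "sms" and preferred_format == "email":
        return {
            "format": "email",
            "subject": "Your visual access menu",
            "body": message["text"],          # reuse the SMS text as the body
            "to": "user@example.com",         # placeholder address
        }
    raise ValueError("unsupported conversion")

sms = {"format": "sms", "text": "1. Remote devices  2. Services"}
print(convert_menu_message(sms, "email"))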
[0153] In an embodiment, the memory 606 may include a 
web browser to access and display web pages from the net­work 
104 and/or other computer networks. The user may use 
the web browser to open a website for accessing the visual 
access menu (or the Internet of Things menu). In an embodiment, the user may store the login details for the website(s) at
the device 102. Therefore, the user can connect to the remote 
devices 106a-n or services 202a-n from the web browser 
automatically and may not have to enter his/her login details 
every time to login to the website. The user may navigate 
through the web site and may select a hyperlink embedded in 
the webpage of the website. Based on the selection of the 
hyperlink by the user, he/she may be directed to another 
webpage. In such a scenario, the VMThings 612 may display 
a new Internet of Things menu associated with the new web 
site. In an embodiment of the invention, the VMThings 612 
may display a new visual access menu associated with the 
new web page. 
[0154] FIG. 7 illustrates exemplary components of the 
access device 116, in accordance with an embodiment of the 
invention. The access device 116 may include a system bus 
720 to connect the various components. Examples of system 
bus 720 include several types of bus structures including a 
memory bus, a peripheral bus, or a local bus using any of a 
variety of bus architectures. As discussed with reference to 
FIGS. 1C and 2C, the access device 116 may be any device 
capable of data and/or voice communications through the 
network 104 or the remote devices 106a-n. Examples of the 
access device 116 include, but are not limited to, a router, a 
printer, a music system, a telephone, a set top box, a hub, a 
gateway, a mobile phone, and so forth. In an embodiment of 
the invention, the access device 116 may not have or may have 
limited display capability. The access device 116 may include 
a plurality of ports 722 for connecting to the network 104, 
and/or the display device 118. Examples of the ports 722 may 
include, but are not limited to, parallel ports, serial ports, 
DB-2 connector, IEEE 1284, IEEE 1394 ports, 8P8C ports, 
PS/2 ports, RS-232 ports, Registered Jack (RJ) 45 ports, RJ 
48 ports, VGA port, Small Computer System Interface 
(SCSI) ports, USB ports, DB-25 ports, and so forth. The 
access device 116 may be connected to a display device 118. 
Further, the access device 116 may connect to the remote 
devices 106a-n through the network 104. The access device 
116 may access and control the remote devices 106a-n and 
service 202a-n. In an embodiment of the invention, the access 
device 116 may have a unique access device identity (ID). 
The access device 116 may be authorized based on this 
unique access device ID. 
[0155] The access device 116 can connect to the network 
104 through a network interface 714. An Input/Output (IO) 
interface 716 of the access device 116 may be configured to connect to
external or peripheral devices such as a memory card 718a, a 
keyboard 718b, a mouse 718c, and a Universal Serial Bus 
(USB) device 718d. Although not shown, various other 
devices can be connected through the IO interface 716 to the 
access device 116. In an embodiment of the invention, the 
access device 116 may be connected to a hub or gateway 
device that provides various services such as voice commu­nication, 
network access, television services and so forth. For 
example, the hub may be a Home Gateway device that acts as 
a hub between the access device and the network 104. 
[0156] The access device 116 may use the screen of the 
display device 118 to output graphical information to the user 
of the access device 116. Further, the access device 116 may 
include a memory 704 to store various programs, data and/or 
instructions that can be executed by a processor 702. 
Examples of the memory 704 include, but are not limited to, 
a Random Access Memory (RAM), a Read Only Memory 
(ROM), a hard disk, and so forth. A person skilled in the art 
will appreciate that other types of computer-readable media 
which can store data that is accessible by a computer, such as 
magnetic cassettes, flash memory cards, digital video disks, 
and the like, may also be used by the access device 116. The 
memory 704 may store a graphical user interface (GUI) 706 
for accessing the visual access menus of the remote devices 
106a-n and/or services 202a-n. The GUI may provide an 
interface to the user(s) to access the visual access menus or 
enhanced visual access menus. In an embodiment of the 
invention, the GUI may be used to configure or create the 
Internet of Things menus. The Internet of Things menu may 
include representations of one or more recognizable or iden­tifiable 
objects such as, but are not limited to, remote devices 
106a-n or services in an Internet or network like structure. 
The one or more identifiable objects may be physical or 
virtual objects. 
[0157] The memory 704 may include a database 708 to 
store the visual access menus or the Internet of Things menus 
corresponding to the remote devices 106a-n and/or the ser­vices 
202a-n. Further, the database 708 may store user preferences related to the remote devices 106a-n and the services
202a-n. Further, the database 708 may store the alert and 
reminder messages. In an embodiment of the invention, the 
database 708 may store information about the services 202a-n. Further, the database 708 may be updated at a predefined time interval. For example, the database 708 may be updated after every 4 days, once in a week, monthly, and so forth. In an embodiment of the invention, the updates related to the visual access menus and remote devices 106a-n or services 202a-n may be received from the server 114 as shown in FIG. 2B. In an embodiment of the invention, the updates may be received from the network 104.
[0158] Further, the memory 704 may store an application 
such as a VMThings 710 to connect to the remote devices 
106a-n and the services 202a-n remotely. Further, the VMTh­ings 
710 may connect the access device 116 to the display 
device 118. The VMThings 710 may display a visual repre­sentation 
in form of visual access menus or the Internet of 
Things menus of the remote devices 106a-n or services 
202a-n at the display device 118. The display device 118 may 
further include a radio interface 712 configured for wireless 
communications with other devices. The user can select one 
or more option from the visual access menu or the Internet of 
Things menu to connect to a particular service. Further, the 
VMThings 710 may connect the user to the remote devices 
106a-n or the services 202a-n based on the selection of the 
options. Further, the VMThings 710 may be configured to 
enable the device 102 to receive images, videos, and so forth 
related to the remote devices 106a-n or services 202a-n irre­spective 
of their location. In an embodiment of the invention, 
the VMThings 710 may be implemented as software or firm­ware 
or hardware or a combination of these at the access 
device 116. 
[0159] In an embodiment of the invention, the display 
device 118 may include a touch sensitive screen. Therefore, 
the user can provide inputs or may select an option from the 
visual access menu or the Internet of Things menu by touch­ing 
the screen of the display device 118 or by point and click 
using the mouse 718c. The user can interact with the visual 
access menu or the Internet of Things menu by pressing a desired key or combination of keys from the keyboard 718b. For example, the user can press a '3' key from the keyboard 718b to select a node 3 in the visual access menu or the
Internet of Things menu. Further, the user can directly select
the node 3 of the visual access menu or the Internet of Things 
menu, in case of a touch sensitive screen. 
[0160] Further, the size of the visual access menu or the 
Internet of Things menu may differ depending on the number 
of service options. As a result, all the service options of the 
visual access menu or the Internet of Things menu may not be 
displayed together on the screen of the display device 118. In 
such a case, the VMThings 710 may allow the user to navigate 
by scrolling horizontally and/or vertically to view various 
service options in the visual access menu or the Internet of 
Things menu. Further, the VMThings 710 may detect the 
capability of the screen of the display device 118 before 
displaying the visual access menu or the Internet of Things 
menu. For example, in case the display device 118 is a basic 
mobile phone with limited functionality of the display screen, 
various device options or the service options of the enhanced 
visual access menu or the Internet of Things menu may be 
displayed as a list including one or more options. 
[0161] In an embodiment of the invention, the database 708 
may be updated based on the feedback of the one or more 
users or based on error report received from the other sources. 
In an embodiment of the invention, the VMThings 710 may 
update the database 708 based on crowd sourcing. That is, the database 708 may be updated based on feedback or reviews or thoughts of other users. For example, if 80 users out of 100 users visiting a website and accessing the visual access menus say that there is some error in the system for controlling a particular object, then, based on the ratings provided by these users, the record or the menu for the particular object in the database 708 may be updated. The VMThings 710 may also learn about problems associated with the visual access menus or the device or the objects from many other sources and may find a solution based on many other users. Examples of the other sources include, but are not limited to, other network devices, remote devices 106a-n, services 202a-n, users, a server, and so forth.
[0162] Further, the memory 704 may include other appli­cations 
that enable the user to communicate/interact with the 
services 202a-n through the network 104. Examples of other 
applications include, but are not limited to, Skype, Google 
Talk, Magic Jack, and so forth. Other applications may be 
stored as software or firmware on the display device 118. 
Further, the memory 704 may include an Operating System 
(OS) (not shown) for the access device 116 to function. 
[0163] Though not shown, the access device 116 may 
include a camera, a microphone, a speaker, and so forth. In an 
embodiment of the invention, the display device 118 may 
include the camera or the speaker or the microphone, and so 
forth. The user may provide voice commands by using the 
microphone. Further, the user may provide the input or select 
the option by clicking an image through a camera. The user 
may control one or more operations of the remote devices 
106a-n by making gestures or hand movements in front of the
camera of the device 102. The speaker may also be used to 
output music and voice responses to the user. The user may 
input one or more key or key combinations using the key­board 
718b. The keyboard 718b may be a physical keyboard 
or a virtual keyboard displayed on a touch screen display of 
the display device 118. In an embodiment, the keyboard 718b 
may be a keypad on the access device 116 or the display 
device 118. Subsequently, after some processing by the 
VMThings 710, an enhanced visual access menu corresponding to the services 202a-n based on the user inputs or selection
is searched and displayed on the screen of the display device 
118. 
[0164] In an embodiment of the invention, the VMThings 
710 may be configured to recognize the context of the voice 
inputs received from the users or other sources. The VMTh­ings 
710 may take an action based on the context of the voice 
inputs. 
[0165] Further, the user may forward or move the display of 
the device to another device by providing a selection or input. 
In an embodiment of the invention, the VMThings 710 may 
forward or transfer the display from a device to another device 
based on the user inputs. For example, the user may transfer 
the visual menu displayed on his/her smart phone to another 
smart phone by tapping at the display of the smart phone. The 
input for doing so may be a voice command, a selection of one 
or more keys, touching the display, gesture, and so forth. In an 
embodiment of the invention, the user may transfer the dis­play 
from a device to a wall. 
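A speculative sketch of the display transfer described above follows; the class DisplaySession, the device names, and the trigger values are assumptions for illustration only.

# The display of a menu is moved from one device to another in response to a
# user input such as a tap, voice command, gesture, or key press.
class DisplaySession:
    def __init__(self, menu, device):
        self.menu = menu
        self.device = device      # device currently showing the menu

    def transfer(self, target_device, trigger):
        # Accept any of the inputs mentioned above as a transfer request.
        if trigger in ("tap", "voice", "gesture", "key"):
            self.device = target_device
        return self.device

session = DisplaySession(menu=["vehicle", "AC"], device="smart phone A")
print(session.transfer("smart phone B", "tap"))        # smart phone B
print(session.transfer("projector wall", "gesture"))   # projector wall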
[0166] In an embodiment, the memory 704 may include a 
web browser to display web pages from the network 104 
and/or other computer networks. The user may use the web 
browser to open a website for accessing the visual access 
menu(s). In an embodiment, the user may store the login 
details for the website(s) at the device. Therefore, the user can
connect to the services 202a-n from the web browser auto­matically 
and may not be required to enter his/her login 
details every time to login to the website. 
[0167] In an embodiment of the invention, the database 708 
may be updated based on addition or deletion of the one or 
more objects. For example, if a new remote device or service 
is added to the list of devices or services to be controlled then 
the visual access menu in the database may be updated 
accordingly. Further, the VMThings 710 may detect errors 
which may occur during the user interaction with the visual 
access menu. The VMThings 710 may also report to the user 
about these errors. In an embodiment of the invention, the 
errors may occur due to some other reasons such as technical 
reasons, network failure, and so forth. In an embodiment of 
the invention, the errors may be reported in forms such as, but not limited to, a text report, images, an MMS, an SMS, an e-mail, voice messages, and so forth. In another embodiment
of the invention, the VMThings 710 may maintain and store a 
log of errors reported and actions taken to correct them in the 
database 708. 
[0168] In an embodiment of the invention, the database 708 
may be created by a human operator or an automatic appli­cation. 
The human operator may listen to various options of 
the audio menus of the one or more objects and may create a 
visual access menu or visual Internet of Things menus 
accordingly. In an embodiment of the invention, the database 
708 may be created based on one or more instructions of the 
users by the human operator. 
[0169] In an embodiment of the invention, the database 708 
may be created based on the information of a yellow pages 
directory. The plurality of objects may be categorized based 
on the category mentioned in the yellow pages. Further, the 
visual access menus or the Internet of Things menus in the 
database may be created based on the categories of the objects 
according to the yellow pages. 
[0170] FIG. 8 illustrates a flowchart for controlling remote 
devices when the visual access menus or the Internet of 
Things menus are accessed through an access device, in 
accordance with an embodiment of the invention. As discussed with reference to FIGS. 1A and 2A, the user of the
device such as a smart phone may connect to a plurality of 
objects in the network such as remote devices and services. In 
an embodiment of the invention, the objects may be a com­bination 
of the remote devices and services. Further, the 
device may control one or more operations of the remote 
devices. The device may include an Internet of Things appli­cation 
such as a VMThings configured to display graphical 
information to the user. The VMThings may display visual 
access menus (or enhanced visual access menus) or the Inter­net 
of Things menus at the device for controlling remote 
devices or services irrespective of the location of the remote 
devices or services. In an embodiment of the invention, the 
Internet of Things menu may include representations of one 
or more recognizable or identifiable objects such as, but are 
not limited to, remote devices or services in an Internet or 
network like structure. The one or more identifiable objects 
may be physical or virtual objects. In an embodiment of the 
invention, a graphical user interface (GUI) may be used by the 
user for creating the Internet of Things menu. The objects 
may be the remote devices or services. In an embodiment of 
the invention, the device may be connected to a display device 
such as an LCD screen, a TV, an LED screen, a projector 
screen and so forth. In an embodiment of the invention, the 
device or remote devices may be connected to each other 
through a local network such as a wireless network like Blue­tooth, 
RF4CE network, and so forth or through a wired net­work 
like Local Area Network (LAN). 
[0171] At step 802, a database including visual access 
menus may be accessed through a graphical user interface 
(GUI) at the device. In an embodiment of the invention, the 
GUI may be accessed at the device by the user. At step 804, a 
visual access menu or the Internet of Things menu may be 
displayed at the device. In an embodiment of the invention, 
the VMThings may display the visual access menus and the 
Internet of Things menu at the device. The visual access menu 
may include one or more options such as, but are not limited 
to, a remote devices option, a services option, and so forth. 
The user may select an option from these options. The VMTh­ings 
may receive an input from the user. The input may be a 
selection of option by the user. In an embodiment of the 
invention, the device may include a touch sensitive screen. In 
an embodiment of the invention, the user may select an option 
by touching the screen of the device. In another embodiment 
of the invention, the user may select an option by making a 
gesture or hand movement or through a voice command. The 
gestures, hand movements or the voice commands may be 
detected by the display device. In an embodiment of the 
invention, the VMThings may detect the gestures or hand 
movements or the voice commands. Further, the VMThings 
of the device may understand and accept voice inputs from the 
user in different languages irrespective of the device lan­guage. 
Therefore, the user may control the remote devices by 
giving voice commands in different languages such as, but are not limited to, English, Spanish, French, Hindi, Chinese, Japanese, Hawaiian, German, and so forth.
[0172] At step 806, an enhanced visual access menu or an 
enhanced Internet of Things menu for remote devices based 
on a selection of an option by a user may be displayed at the 
display device when the user selects the remote devices 
option from the visual access menu. The enhanced visual 
access menu for devices may include one or more device 
options. In an embodiment of the invention, the VMThings of 
the device may display a visual access menu or an enhanced 
visual access menu or an Internet of Things menu in different 
languages. Further, the device or the remote devices may have one language while the user may want to control and communicate in a different language; the user may do this via the VMThings application. The user may select a device option from these device options. At step 808, a selection of a device
option may be received from the user. The user may provide 
the selection by touching the screen of the display device or 
by making some gestures or through hand movements in front 
of the display device or the access device. In an embodiment 
of the invention, the user may select a device option through
a voice command or instruction. 
[0173] At step 810, the user may be connected to a remote 
device based on the selection of a device option. In an 
embodiment of the invention, the VMThings may also check 
whether the remote device corresponding to the device 
selected by the user is registered to be monitored by the user 
or not. In another embodiment of the invention, the user may 
be required to authenticate his/her identity before accessing 
or connecting to the remote devices 106a-n. Thereafter, at 
step 812, the user may control one or more operations of the 
remote device based on the selection of the device option. For 
example, the user may view real time pictures of the remote 
device, the user may switch on the remote device, and so 
forth. 
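The FIG. 8 flow (steps 802 through 812) can be condensed into the following sketch; the menu contents, the registration table, and the function control_remote_device are illustrative assumptions only.

# Display the menu, take the user's device selection, verify registration,
# connect to the remote device, and then control one or more operations.
REGISTERED = {("user-1", "microwave 306n")}          # (user, device option) pairs

def control_remote_device(user_id, selected_option, operation):
    menu = ["vehicle 306a", "AC 306b", "camera 306c", "microwave 306n"]  # steps 802-804
    if selected_option not in menu:                                       # step 808
        raise ValueError("unknown device option")
    if (user_id, selected_option) not in REGISTERED:                      # checks in step 810
        raise PermissionError("device not registered for this user")
    # Step 810: connect, then step 812: control one or more operations.
    return "%s: %s -> %s" % (user_id, selected_option, operation)

print(control_remote_device("user-1", "microwave 306n", "switch on"))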
[0174] FIG. 9 illustrates a flowchart for controlling services when the visual access menus or the Internet of Things menus are accessed through an access device, in accordance with an embodiment of the invention. As discussed with reference to FIGS. 1C and 2C, the services may be accessed and/or controlled by using an access device. At step 902, a graphical user interface (GUI) for accessing or creating an Internet of Things menu or
a visual access menu may be displayed at the device. In an 
embodiment of the invention, the VMThings may display the 
GUI at the device. In an embodiment of the invention, the 
GUI may be accessed or opened by the user of the device. The 
visual access menu or the Internet of Things menu may 
include one or more options such as, but are not limited to, a 
remote devices option and a services option. The user may 
select any of these options. 
[0175] At step 904, an input including an option selected by 
the user is received at the device. In an embodiment of the 
invention, the device may include a touch sensitive screen. In 
another embodiment of the invention, the user may select an 
option by making a gesture or hand movement or through a 
voice command. The gestures may be such as, but are not limited to, a thumb up, a head nod, a smile, laughter, a thumb down, showing two fingers, and so forth. In an embodiment of the invention, the VMThings of the device may detect the gestures or hand movements or the voice commands and may receive a selection of the option. Further, the VMThings of the
device may understand and accept voice inputs from the user 
in different languages irrespective of the device language. 
[0176] At step 906, an enhanced visual access menu or an 
enhanced Internet of Things menu for services based on a 
selection of an option by a user may be displayed at the device 
when the user selects the services option from the visual 
access menu. The enhanced visual access menu for services 
may include one or more service options. In an embodiment 
of the invention, the VMThings of the device may display the 
enhanced visual access menu in different languages as per the 
user's instruction or convenience. Further, the device or the 
remote devices may have one language and the user may 
control and communicate in a different language via the
VMThings. In such a scenario, the VMThings may display 
the visual access menu at the device in a language( s) preferred 
by the user. The VMThings will do the required translation of 
language. In an embodiment of the invention, the VMThings 
may display more than one visual access menus at the screen 
of the device. The multiple visual access menus may be 
displayed in different languages. The user may select a ser­vice 
option from these service options. At step 908, a selec­tion 
of a service option may be received from the user. In an 
embodiment of the invention, the user may select a service 
option through a voice command or instruction. 
[0177] At step 910, the user may be connected to a service 
based on the selection of the service option. The VMThings 
may also check whether the information for the selected 
service option is available at the device. If the information is 
not available, then the information may be requested and/or 
received from a server. Thereafter, at step 912, information 
about the service may be displayed at the display device based 
on the selection of the service option. The user may interact 
with the information accordingly. In an embodiment of the 
invention, the information may include text, graphics, audio, 
video, or hyperlinks. 
[0178] FIGS. 10A, 10B, and 10C illustrate a flowchart diagram
for controlling objects by using a device in a network, in 
accordance with an embodiment of the invention. As dis­cussed 
with reference to FIGS. 1A and 2A, the user of the
device such as a smart phone may connect and control various 
objects in the network. In an embodiment of the invention, the 
objects may include remote devices such as a car, a washing 
machine, door, truck, and so forth. In another embodiment of 
the invention, the objects may be services such as entertain­ment, 
banking, hotels, and so forth as described in FIG. 2A-I. 
In yet another embodiment of the invention, the objects may 
be combination of the remote devices and services. Further, 
the device may control one or more operations of the remote 
devices. The user at the device may also view information 
about various services. The device may include an Internet of Things application, i.e., VMThings, configured to display graphical information at the device. In an embodiment of the invention, the VMThings may display the visual access menus at the device for controlling remote devices or services irrespective of the location of the remote devices or services.
[0179] At step 1002, a graphical user interface (GUI) for 
accessing or configuring an Internet of Things menu or a 
visual access menu may be displayed at the device. In an 
embodiment of the invention, the VMThings may display the 
GUI at the device. In an embodiment of the invention, the 
GUI may be opened by the user of the device. The visual 
access menu may include one or more options such as, but are 
not limited to, a remote devices option and a services option. 
The user may select any of these options. 
[0180] At step 1004, an input including an option selected 
by the user is received at the device. At step 1006, it is checked 
whether the input is for accessing services. The input is for 
accessing services when the user selects the services option. 
If the input is for accessing services then the process control 
goes to step 1014, else the process control goes to step 1008. 
[0181] At step 1008, it is checked whether the input is for 
accessing the remote devices. In an embodiment of the inven­tion, 
the input is for accessing remote devices such as car, 
microwave, garage, doors, and so forth, when the user selects 
the remote devices option from the visual access menu. If the 
input is for accessing the remote devices then the control goes 
to step 1012, else the process waits for an input from the user 
at the device at step 1010. 
[0182] At step 1014, it is checked whether a visual access 
menu or an Internet of Things menu for services is available 
at the device. If not available, then at step 1016, the visual access menu of the services may be retrieved from a server in the network; else the process continues to step 1018. At step 1018, the visual access menu of the services, including one or more service options, may be displayed at the device. The service options may be graphics icons and/or text representing services. The user may select an option(s) from the service
options. At step 1020, a selection of a service option may be 
received from the user at the device. Thereafter, at step 1024, it is checked whether information corresponding to the selected service option is available at the device. If not available, the information may be requested and received from the server at step 1024. Then, at step 1026, the information may
be displayed at the device based on the received selection of 
the service option. For example, the user may check his/her 
credit card bill through banking service option and may also 
know different ways of making the payment and information 
about nearby payment office. 
[0183] When, at step 1008, the input is for accessing the remote devices, then, at step 1012, it is checked whether a
visual access menu for remote devices is available at the 
device. If not available, then the visual access menu of the remote devices is retrieved from the server at step 1028. Then, at step 1030, the visual access menu including one or more device
options may be displayed at the device. The device options 
may be graphics icons and/or text representing remote 
devices. The user may select a device option(s) from the 
visual access menu of the remote devices. At step 1032, a 
connection between the device and a remote device is estab­lished 
based on the received selection. Thereafter, the user 
may control the remote device(s) irrespective of location of 
the remote devices. 
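The branching of FIGS. 10A-10C (steps 1002 through 1032) can be sketched as follows; the cached menus, the stand-in fetch_from_server helper, and the return values are illustrative assumptions rather than names from the disclosure.

# Route the input to the services path or the remote-devices path, fetching a
# menu from the server when it is not available locally.
LOCAL_MENUS = {"remote devices": ["vehicle", "AC", "camera", "microwave"]}

def fetch_from_server(kind):
    # Stand-in for the retrieval at steps 1016 / 1028.
    return ["entertainment", "travel", "banking", "hotels"] if kind == "services" else []

def handle_input(selected_option):
    if selected_option == "services":                        # step 1006
        menu = LOCAL_MENUS.get("services") or fetch_from_server("services")   # steps 1014-1016
        return {"display": menu}                             # step 1018
    if selected_option == "remote devices":                  # step 1008
        menu = LOCAL_MENUS.get("remote devices") or fetch_from_server("remote devices")  # steps 1012, 1028
        return {"display": menu, "connect": True}            # steps 1030-1032
    return {"wait": True}                                    # step 1010

print(handle_input("services"))
print(handle_input("remote devices"))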
[0184] FIG. 11 illustrates a flowchart for controlling remote 
devices while accessing the visual access menu or the Internet 
of Things menu through a web browser, in accordance with an
embodiment of the invention. As discussed with reference to 
FIGS. 1B and 2B, the user of the device 102 may access the remote devices and/or services by using a web browser such as Google Chrome or Internet Explorer at the device. In an
embodiment of the invention, the user may access the web 
browser at the access device connected to the display device. 
[0185] At step 1102, the user may open a website through a 
web browser at the device. The user may open the website by 
entering a Uniform Resource Locator (URL) of a website at 
the web browser. The website may allow the user to access 
visual access menus. In an embodiment of the invention, the 
website is displayed at the display device. At step 1104, the 
user may authenticate his/her identity by entering one or more 
details in one or more fields on the web page. The VMThings 
may check whether the user is an authorized user or not based 
on a unique user ID of the user. The VMThings may store the 
user IDs at the device. In an embodiment of the invention, the 
website may maintain the database of user IDs authorized to 
access the remote devices or the services. At step 1106, a 
visual access menu including one or more options is displayed at the device. In an embodiment of the invention, an Internet of Things menu may be displayed. The Internet of Things menu may include representations or icons of one or more recognizable or identifiable objects such as, but not limited to, remote devices 106a-n or services in an Internet- or network-like structure. In an embodiment of the invention, the
VMThings may display the visual access menu or the Internet 
of Things menu at the device. In another embodiment of the 
invention the VMThings may display the visual access menu 
at the display device connected to the access device. The one 
or more options can be such as a remote devices option, a 
services option, and so forth. The user may select an option 
from these options. At step 1108, an input regarding the 
selection of the option may be received from the user at the 
device. 
[0186] At step 1110, an enhanced visual access menu for 
the remote devices may be displayed at a screen of the device 
or the web browser when the user selects the remote devices 
option from the visual access menu. In an embodiment of the 
invention, an enhanced Internet of Things menu for the 
remote devices may be displayed at a screen of the device or 
the web browser when the user selects the remote devices 
option from the visual access menu. As shown in FIG. 3C, the 
display of the device may switch based on the selection of the 
option. In an embodiment of the invention the enhanced 
visual access menu or the Internet of Things menu for the 
remote devices may be retrieved from the server. The 
enhanced visual access menu for the remote devices may 
include one or more device options. In an embodiment of the 
invention, the enhanced Internet of Things menu for the remote devices may include one or more representations corresponding to the remote devices. The user may select a device option from the displayed enhanced visual access menu of the remote devices. Each device option may represent a remote device which the user can control. Further, the options, service options, and device options may be represented as graphics and/or text on the visual access menus. At
step 1112, a selection of a device option may be received from 
the user at the device. In an embodiment of the invention, the 
VMThings may detect the selection received from the user. In 
an embodiment of the invention, the user may select the device option by touching the device option at the display of the device. In an embodiment of the invention, the user may provide the selection of the device option through voice inputs or commands and/or gestures or hand movements such as, but not limited to, a thumbs up, a head nod, and so forth. Further, the voice inputs or commands may be in different languages such as English, Spanish, and so forth. The VMThings may detect, understand, and translate the voice commands into a language which can be understood by the device.
[0187] At step 1114, a connection between the device and the remote device(s) is established by the VMThings. Thereafter, at step 1116, the user may control one or more operations of the connected remote devices irrespective of their location. For example, the user may switch on an AC located at his/her home while driving back home. In an embodiment of the invention, the VMThings at the device may change the voice commands into text and may respond to or control the remote devices accordingly.
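A minimal sketch of this browser-based flow (steps 1102-1116) follows, assuming a simple user-ID check against IDs stored by the VMThings; the AUTHORIZED_IDS set, the RemoteDevice class, and the control_remote_device function are hypothetical names introduced only for illustration.

# Sketch of the FIG. 11 web-browser flow; all names and the authorization model are assumptions.

AUTHORIZED_IDS = {"user-001", "user-002"}   # user IDs the VMThings is assumed to store

class RemoteDevice:
    def __init__(self, name):
        self.name = name
    def execute(self, command):
        print(f"{self.name}: executing '{command}'")

def control_remote_device(user_id, device_option, command, devices):
    """Authenticate the user (step 1104), connect to the selected remote device
    (step 1114), and forward the control operation (step 1116)."""
    if user_id not in AUTHORIZED_IDS:
        raise PermissionError("user is not authorized to access remote devices")
    device = devices[device_option]          # selection received at step 1112
    device.execute(command)                  # e.g. switch on an AC while driving home

devices = {"ac_home": RemoteDevice("Air conditioner (home)")}
control_remote_device("user-001", "ac_home", "switch_on", devices)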
[0188] FIG. 12 illustrates a flowchart for controlling services while accessing the visual access menu through a web browser, in accordance with an embodiment of the invention. As discussed with reference to FIGS. 1B and 2B, the user of the device 102 may access the services by using a web browser such as Google Chrome or Internet Explorer at the
device. In an embodiment of the invention, the user may 
access the web browser at the access device connected to the 
display device. 
[0189] At step 1202, the user may open a website through a 
web browser at the device. The user may open the website by 
entering a Uniform Resource Locator (URL) of a website at 
the web browser, such as Google Chrome. The website may allow the user to access visual access menus. In an embodiment of the invention, the website is displayed at the display device. At step 1204, the user may authenticate his/her identity by entering one or more details in one or more fields on the
web page. At step 1206, a visual access menu including one or 
more options is displayed at the device. In an embodiment of 
the invention, an Internet of Things menu may be displayed at 
the device. In an embodiment of the invention the VMThings 
may display the visual access menu at the device. In another 
embodiment of the invention the VMThings may display the 
visual access menu at the display device connected to the 
access device. The user may select an option from the options 
such as a remote devices option or the services option of the 
visual access menu. At step 1208, an input from the user may 
be received at the device. 
[0190] At step 1210, an enhanced visual access menu for 
the services may be displayed at a screen of the device or the 
web browser when the user selects the services option from 
the visual access menu. In an embodiment of the invention, an 
enhanced Internet of Things menu for the services may be 
displayed at a screen of the device or the web browser when 
the user selects the services option from the Internet of Things menu. As shown in FIG. 3D, the display of the device may
switch based on the selection of the option. In an embodiment 
of the invention, the enhanced visual access menu or the 
enhanced Internet of Things menu for the services including 
the one or more service options may be retrieved from the 
server. The user may select a service option from the displayed enhanced visual access menu of the services. Each service
option may represent a service. At step 1212, a selection of a 
service option may be received from the user at the device. In 
an embodiment of the invention, the VMThings may detect 
the selection received from the user. In an embodiment of the 
invention, the user may select the service option by touching the service option at the display of the device. In an embodiment of the invention, the user may provide the selection of the service option through voice inputs or commands and/or gestures or hand movements such as, but not limited to, a thumbs up, a head nod, and so forth. Further, the voice inputs or commands may be in different languages such as English, Spanish, and so forth. The VMThings may detect, understand, and translate the voice commands into a language which can be understood by the device or the services.
[0191] At step 1214, a connection between the device and 
the remote device(s) may be established by the VMThings. 
Thereafter, at step 1216, the user may control one or more 
operations of the connected remote devices irrespective of 
their location. For example, the user may switch on an AC located at his/her home while driving back home. In an embodiment of the invention, the VMThings at the device may change the voice commands into text and may respond to or access the services accordingly. Further, the VMThings may
store the voice commands in different languages at the device 
(or the access device). The VMThings also stores the list of actions corresponding to the various voice commands, gestures, hand movements, and so forth.
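The stored association between multilingual voice commands (or gestures) and device actions described above might be modeled as a simple lookup table, as in the hypothetical sketch below; the COMMAND_ACTIONS table and the translate_input function are assumptions, not part of the specification.

# Sketch of a mapping from multilingual voice commands and gestures to actions
# the device understands; the table contents are illustrative only.

COMMAND_ACTIONS = {
    ("en", "switch on"):      "power_on",
    ("es", "enciende"):       "power_on",
    ("en", "switch off"):     "power_off",
    ("gesture", "thumbs_up"): "confirm",
}

def translate_input(language, phrase):
    """Translate a voice command or gesture into an action the device understands."""
    action = COMMAND_ACTIONS.get((language, phrase.lower()))
    if action is None:
        raise ValueError(f"unrecognized command: {phrase!r} ({language})")
    return action

print(translate_input("es", "Enciende"))   # -> power_on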
[0192] FIGS. 13A, 13B, and 13C illustrate a flowchart for 
controlling objects in a network while accessing the visual 
access menu through a web browser, in accordance with an 
embodiment of the invention. As discussed with reference to
FIGS. 1B and 2B, the user of the device 102 may access 
various objects such as, but not limited to, remote devices and/or services by using a web browser such as Google Chrome or Internet Explorer at the device. In an embodiment of
the invention, the user may access the web browser at the 
access device connected to the display device. 
[0193] At step 1302, the user may open a website through a 
web browser at the device. The user may open the website by 
entering a Uniform Resource Locator (URL) of a website at 
the web browser. The web site may allow the user to access 
visual access menus. In an embodiment of the invention, the 
website is displayed at the display device. At step 1304, the 
user may authenticate his/her identity by entering one or more 
details in one or more fields on the web page. At step 1306, a visual access menu comprising one or more options is displayed at the device. In an embodiment of the invention, the VMThings may display the visual access menu at the device. In another embodiment of the invention, the VMThings may display the visual access menu at the display device connected to the access device. The one or more options can be such as a remote devices option, a services option, and so
forth. The user may select an option from these options. At 
step 1308, an input from the user may be received at the 
device. Then at step 1310, it is checked whether the input is 
for accessing services. If the outcome of step 1310 is true, then the control goes to step 1316; else step 1312 is followed.
[0194] At step 1312, it is checked whether the input 
received at step 1308 is for accessing remote devices. If true 
then the control goes to step 1330; else the process waits for an input from the user at step 1314. At step 1316, it is checked whether an enhanced visual access menu for services is available at the device. If the enhanced visual access menu is not available, then at step 1318, the enhanced visual access menu may be retrieved from the server; else step 1320 is executed.
Then at step 1320, the enhanced visual access menu including one or more service options, such as for banking, entertainment, etc., is displayed at the device. The user may select a service option from the service options. At step 1322, a selection of a service option from the user may be received. Then
at step 1324, it is checked whether information for the selected service option is available at the device. If not available, then the information may be requested and received from the server. Then at step 1328, the information may be displayed at
the device based on the received selection. 
[0195] If at step 1312 the input is for accessing the remote devices, then at step 1330, it is checked whether an enhanced visual access menu for the remote devices is available at the device. If not available, then at step 1332, the enhanced visual access menu for the remote devices including the one or more device options may be retrieved from the server; else step 1334
may be executed. At step 1334, the enhanced visual access 
menu including the device options may be displayed at the 
device or the web browser. In an embodiment of the invention, the enhanced visual access menu may be displayed at the display device connected to the device or the access device.
[0196] The user may select a device option from the displayed enhanced visual access menu of the remote devices. Each device option may represent a remote device. Further, the options, service options, and device options may be represented as graphics and/or text on the visual access menus.
At step 1336, a selection of a device option may be received 
from the user. In an embodiment of the invention, the user 
may select the device option by touching the device option at the display of the device. In an embodiment of the invention, the user may provide the selection of the device option through voice inputs or commands and/or gestures or hand movements such as, but not limited to, a thumbs up, a head nod,
and so forth. The VMThings may detect, understand and 
translate the voice commands into a language which can be 
understood by the device. In an embodiment of the invention, 
the VMThings at the device may change the voice commands 
into text and may respond or control the remote devices 
accordingly. 
[0197] At step 1338, a connection between the device and the remote device(s) is established by the VMThings. Thereafter, at step 1340, the user may control one or more operations of the connected remote devices irrespective of their location. For example, the user may switch on an AC located at his/her home while driving back home.
[0198] FIG. 14 illustrates a flowchart diagram for controlling the remote devices through a website, in accordance with
another embodiment of the invention. At step 1402, the user 
may open a website through a web browser at the device. The 
website is for accessing the remote devices or visual access 
menus corresponding to the remote devices. The user may 
open the website by entering a Uniform Resource Locator 
(URL) of the website in the web browser. The website may allow the user to access visual access menus of the remote devices (or services, as explained in FIG. 12). In an embodiment of the invention, the website is displayed at the display
device. Each of the remote devices may have an associated 
unique ID. Similarly, the device may also have a unique 
device ID. The remote devices are registered with the device. 
Further, the user may have to register himself/herself so as to be able to access the remote devices.
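The registration scheme described for FIG. 14, in which each remote device and the device 102 carry unique IDs and users must register before access, could be sketched as follows; the Registry class and its methods are illustrative assumptions only.

# Sketch of unique-ID registration for remote devices and users; all names are hypothetical.

class Registry:
    def __init__(self, device_id):
        self.device_id = device_id          # unique ID of the device 102
        self.remote_devices = {}            # unique remote-device ID -> description
        self.registered_users = set()

    def register_remote_device(self, remote_id, description):
        self.remote_devices[remote_id] = description

    def register_user(self, user_id):
        self.registered_users.add(user_id)

    def may_access(self, user_id, remote_id):
        return user_id in self.registered_users and remote_id in self.remote_devices

registry = Registry(device_id="device-102")
registry.register_remote_device("106a", "garage door")
registry.register_user("john")
print(registry.may_access("john", "106a"))   # True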
[0199] At step 1404, a visual access menu including one or 
more options may be displayed at the device. In an embodiment of the invention, the VMThings may display the visual access menu at the device. In another embodiment of the invention, the VMThings may display the visual access menu at the display device connected to the access device. The one or more options can be such as a remote devices option, a services option, and so forth. The user may select an option from
these options. At step 1406, an input including a selection of 
the option may be received at the device from the user. 
[0200] At step 1408, an enhanced visual access menu for 
the remote devices may be displayed at a screen of the device 
or as the web page when the user selects the remote devices 
option from the visual access menu. As shown in FIG. 3C, the 
display of the device may switch based on the selection of the 
option. In an embodiment of the invention the enhanced 
visual access menu for the remote devices including the one 
or more device options may be retrieved from the server. The 
user may select a device option from the displayed enhanced 
visual access menu of the remote devices. Each device option 
may represent a remote device which can be controlled. Further, the options, service options, and device options may be represented as graphics and/or text on the visual access menus.
[0201] At step 1410, a selection of a device option may be 
received from the user at the device. In an embodiment of the 
invention, the VMThings may detect the selection received 
from the user. In an embodiment of the invention, the user 
may select the device option by touching the device option at the display screen of the device. In an embodiment of the invention,
the user may provide the selection of the device option 
through voice inputs or commands and/or gestures or hand movements such as, but not limited to, a thumbs up, a head
nod, and so forth. Further, the voice inputs or commands may 
be in different languages such as English, Spanish, and so 
forth. The VMThings may detect, understand and translate 
the voice commands into a language which can be understood 
by the device. At step 1412, a connection between the device 
and the remote device(s) is established by the VMThings. 
Thereafter, at step 1414, the user may control one or more 
operations of the connected remote devices irrespective of 
their location. For example, the user may switch on an AC located at his/her home while driving back home. In an embodiment of the invention, the VMThings at the device may change the voice commands into text and may respond to or control the remote devices accordingly.
[0202] FIG. 15 illustrates a flowchart for controlling remote
devices when the visual access menus are accessed through 
an access device, in accordance with an embodiment of the 
invention. As discussed with reference to FIGS. 1C and 2C, 
the remote devices may be controlled by using an access 
device. The access device may be any communication device 
capable of connecting to a network or a local network. In an 
embodiment of the invention, the access device may have 
limited display capabilities or no display capabilities. 
Examples of the access device include, but are not limited to, 
a set top box, a home gateway, a hub, a router, a bridge, a 
mobile phone, a smart phone, a printer, a scanner, a computer, 
a PDA, a pager, a watch, a tablet computer, a music player, an 
iPod, a telephone, and so forth. The access device may include an Internet of Things application such as a VMThings
application for displaying visual access menus for controlling 
the remote devices or services at the display device. The 
access device may be connected to a display device such as an 
LCD screen, a projector screen, a television, and so forth. The 
display device may be a device including a display (or a large 
display screen). The access device may further include an 
application VMThings configured to display visual access 
menus and information to the user. In an embodiment of the 
invention the access device may act as the device itself. In 
another embodiment of the invention, the device may also be 
connected to the display device. 
[0203] At step 1502, a database including visual access 
menus may be accessed through a graphical user interface 
(GUI) at the access device. In an embodiment of the inven­tion, 
the GUI may be accessed via the access device by the 
user. At step 1504, a visual access menu may be displayed at 
the display device. In an embodiment of the invention, the 
VMThings may display the visual access menus at the display 
device. The visual access menu may include one or more 
options such as, but are not limited to, a remote devices 
option, a services option, and so forth. The user may select an 
option from these options. The VMThings may receive an 
input from the user. The input may be a selection of option by 
the user. In an embodiment of the invention, the display 
device may include a touch sensitive screen. In an embodi­ment 
of the invention, the user may select an option by touch­ing 
the screen of the display device. In another embodiment of 
the invention, the user may select an option by making a 
gesture or hand movement or through a voice command. The 
gestures, hand movements or the voice commands may be 
detected by the display device. In an embodiment of the 
invention, the VMThings of the access device may detect the 
gestures or hand movements or the voice commands. Further, 
the VMThings of the access device may understand and 
accept voice inputs from the user in different languages irrespective of the device language. Therefore, the user may control the remote devices by giving voice commands in different languages such as, but not limited to, English, Spanish, French, Hindi, Chinese, Japanese, Hawaiian, German, and so forth.
[0204] At step 1506, an enhanced visual access menu for 
remote devices based on a selection of an option by a user may 
be displayed at the display device when the user selects the 
remote devices option from the visual access menu. The 
enhanced visual access menu for devices may include one or 
more device options. In an embodiment of the invention, the 
VMThings of the access device may display visual access 
menu or enhanced visual access menu in different languages. Further, the access device or the remote devices may have one language and the user may want to control and communicate in a different language; the user may do this via the VMThings application. The user may select a device option from these device options. At step 1508, a selection of a device option may be received from the user. The user may provide the selection by touching the screen of the display device or by making some gestures or hand movements in front of the display device or the access device. The gestures may be such as, but not limited to, a thumbs up, a head nod, a smile, laughter, a thumbs down, showing two fingers, and so forth. In an embodiment of the invention, the user may select a device option through a voice command or instruction.
[0205] At step 1510, the user may be connected to a remote device based on the selection of a device option. In an
embodiment of the invention, the VMThings may also check 
whether the remote device corresponding to the device 
selected by the user is registered to be monitored by the user 
or not. Thereafter, at step 1512, the user may control one or 
more operations of the remote device based on the selection 
of the device option. For example, the user may view real time 
pictures of the remote device, the user may switch on the 
remote device, and so forth. 
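A possible reading of steps 1508-1512 is sketched below: the access device normalizes a touch, gesture, or voice event into a device option, checks that the remote device is registered for the user, and then forwards the control operation. All names (REGISTERED, select_option, control) are hypothetical.

# Illustrative sketch of the FIG. 15 access-device flow; the registration table is an assumption.

REGISTERED = {("john", "camera-106c")}     # (user, remote device) pairs allowed to be monitored

def select_option(event):
    """Normalize touch, gesture, or voice events into a device option."""
    kind, value = event
    if kind in ("touch", "voice", "gesture"):
        return value
    raise ValueError("unsupported input type")

def control(user, event, operation):
    option = select_option(event)                  # step 1508
    if (user, option) not in REGISTERED:           # registration check (step 1510)
        raise PermissionError("remote device not registered for this user")
    print(f"{user} -> {option}: {operation}")      # step 1512, e.g. view real-time pictures

control("john", ("voice", "camera-106c"), "show_live_picture")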
[0206] FIG. 16 illustrates a flowchart for controlling services when the visual access menus are accessed through an
access device, in accordance with an embodiment of the 
invention. As discussed with reference to FIGS. 1C and 2C, 
the services may be accessed and/or controlled by using an 
access device. At step 1602, a database including visual 
access menus may be accessed through a graphical user inter­face 
(GUI) at the access device. In an embodiment of the 
invention, the GUI may be accessed via the access device by 
the user. 
[0207] At step 1604, a visual access menu may be displayed 
at the display device. In an embodiment of the invention, the 
VMThings of the access device may display the visual access 
menus at the display device. The visual access menu may 
include one or more options such as, but not limited to, a
remote devices option, a services option, and so forth. The 
user may select an option from these options. The VMThings 
may receive an input from the user. The input may be a 
selection of option by the user. In an embodiment of the 
invention, the display device may include a touch sensitive 
screen. In an embodiment of the invention, the user may select 
an option by touching the screen of the display device. In 
another embodiment of the invention, the user may select an 
option by making a gesture or hand movement or through a 
voice command. The gestures, hand movements or the voice 
commands may be detected by the display device. In an 
embodiment of the invention, the VMThings of the access 
device may detect the gestures or hand movements or the
voice commands. Further, the VMThings of the access device 
may understand and accept voice inputs from the user in 
different languages irrespective of the device language. 
Therefore, the user may control the remote devices by giving 
voice commands in different languages such as, but not limited to, English, Spanish, French, Hindi, Chinese, Japanese, Hawaiian, German, and so forth.
[0208] At step 1606, an enhanced visual access menu for 
services based on a selection of an option by a user may be 
displayed at the display device when the user selects the 
services option from the visual access menu. The enhanced 
visual access menu for services may include one or more 
service options. In an embodiment of the invention, the 
VMThings of the access device may display visual access 
menu or enhanced visual access menu in different languages. 
Further, the access device or the remote devices may have one 
language and the user may want to control and communicate 
in a different language. The user may select a service option 
from these service options. At step 1608, a selection of a 
service option may be received from the user. In an embodi­ment 
of the invention, the user may select a service option 
through a voice command or instruction. 
[0209] At step 1610, the user may be connected to a service based on the selection of a service option. The VMThings
may also check whether the information for the selected 
service option is available at the device. If the information is 
not available, then the information may be requested and/or 
received from a server. Thereafter, at step 1612, information 
about the service may be displayed at the display device based 
on the selection of the service option. The user may interact 
with the information accordingly. In an embodiment of the 
invention, the information may include text, graphics, audio, 
video, or hyperlinks. 
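Steps 1610-1612 amount to a cache-or-fetch pattern for service information, sketched below under the assumption of an in-memory cache and a ServiceServer stub; neither name appears in the specification.

# Sketch of steps 1610-1612: take service information from the device if present,
# otherwise request it from a server, then display it. Names are illustrative.

class ServiceServer:
    def get_info(self, service):
        # The returned information may mix text, graphics, audio, video, or hyperlinks.
        return {"type": "text", "body": f"Latest statement for {service}"}

def show_service_info(service, cache, server):
    info = cache.get(service)
    if info is None:                       # not available at the device
        info = server.get_info(service)    # request and receive from the server
        cache[service] = info
    print("Display:", info["body"])
    return info

show_service_info("banking", {}, ServiceServer())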
[0210] FIGS. 17A, 17B and 17C illustrate a flow diagram 
for controlling various objects in a network through an access 
device, in accordance with an embodiment of the invention. 
At step 1702, a GUI for accessing the visual access menus 
may be displayed at the display device. The VMThings may 
display the visual access menus at the display device. The 
visual access menu may include one or more options such as, but not limited to, a remote devices option, a services option, and so forth. The user may select from these options.
At step 1704, an input from the user may be received. The 
input may be a selection of option by the user. In an embodi­ment 
of the invention, the display device may include a touch 
sensitive screen. In an embodiment of the invention, the user 
may select an option by touching the screen of the display 
device. In another embodiment of the invention, the user may 
select an option by making a gesture or hand movement or 
through a voice command. At step 1706, it is checked whether the input is for accessing the services. If the input is for accessing services, then process control goes to step 1714; else step 1708 is executed. At step 1708, it is checked whether the input received at step 1704 is for accessing remote device(s). If the input is for accessing remote devices, then step 1712 is executed; else the process waits for input from the user at the access device.
[0211] At step 1714, it is checked whether a visual access menu of the services is available at the access device. If the visual access menu for accessing services is available, then process control goes to step 1718; else step 1716 is executed. At step 1716, the visual access menu for accessing the services is received from a server in the network. Examples of
the services may include, but are not limited to, banking 
services, entertainment service, tours and travel services, and 
so forth. 
[0212] At step 1718, the visual access menu including one 
or more service options for accessing the services may be 
displayed at the screen of the display device. The user may 
select a service option from these service options. At step 
1720, a selection of a service option may be received from the 
user. The user may provide the selection by touching the 
screen of the display device or by making some gestures in 
front of the display device or the access device. In an embodi­ment 
of the invention, the user may select a service option 
through a voice command or instruction. 
[0213] At step 1722, it is checked whether the information 
for the selected service option is available at the device. If the 
information is not available, then the information may be 
requested and/or received from the server at step 1724, else 
step 1726 is executed. At step 1726, the information of the 
selected services may be displayed at the display device. 
Thereafter, the user may interact with the visual access menu 
for accessing services accordingly. 
[0214] If at step 1708, the input is for accessing the remote 
devices, then step 1712 is executed. At step 1712, it is checked whether a visual access menu of the remote devices is available at the access device. If the visual access menu for the
remote device is available then step 1730 is executed, else the 
visual access menu of the remote devices is retrieved from the 
server at step 1728. At step 1730, the visual access menu 
including one or more device options is displayed at the 
display device. The device options may be graphics icons 
and/or text representing remote devices. The user may select 
a device option(s) from the visual access menu of the remote devices. At step 1732, a connection between the device and a remote device is established based on the received selection. Thereafter, the user may control the remote device(s) irrespective of a location of the remote devices. For example, the
user sitting in his/her office may regulate the temperature of 
the microwave located at home without being physically 
present at home. 
[0215] FIG. 18A illustrates an exemplary display of 
images, in accordance with an embodiment of the invention. 
As discussed before, the device 102 may receive images of 
the remote devices 106a-n (or services 202a-n) in real-time.
In an embodiment of the invention, the access device 116 may 
receive the images of the remote devices 106a-n in real-time.
In an embodiment of the invention, the images may be 
received at pre-defined time interval. In another embodiment 
of the invention, the VMThings 108 may retrieve the images 
in real-time or based on user's instructions. The images of 
more than one remote device may be displayed at the device 
as shown in FIG. 18A. The image display 1802 includes 
images of multiple remote devices 106a-n. Therefore, the 
user may not have to connect to different remote devices 
individually to see their images. In an embodiment of the 
invention, the device 102 may receive video or audio of the 
remote devices 106a-n. Therefore, the remote devices 106a-n
are registered with the device 102 (or the access device 116). 
The images may be received and stored at the device 102 
which can be accessed by the user as per his/her convenience. 
Further, the remote devices 106a-n may be grouped into 
various categories such as, but not limited to, electronic appliances, home devices, buildings, doors, room appliances,
switches, and so forth. Further, the VMThings 108 may display the images of multiple objects such as remote devices 106a-n and services 202a-n at a single interface or display. Further, the remote devices 106a-n may be grouped based on the
information about the remote devices 106a-n in a yellow 
pages directory. 
[0216] Further, the remote devices 106a-n may be grouped according to location, such as home devices, office devices, garage devices, and so forth. In an embodiment of the invention, the remote devices may be grouped based on other criteria such as, but not limited to, functions of the remote device, utility of the remote device, type of the remote device, and so forth. The VMThings 108 of the device 102 may store visual access menus and enhanced visual access menus corresponding to the remote devices based on the various categories of the remote devices 106a-n. In an embodiment of the invention, the user may be required to register at the remote
devices 106a-n so as to be able to control the remote devices 
106a-n from the VMThings 108. In an embodiment of the 
invention, the user may be required to authenticate or prove his/her identity at the device 102 or for the remote devices 106a-n before controlling one or more operations of the remote devices 106a-n. The VMThings 108 may also display the images of the multiple devices based on these groupings of the remote devices 106a-n. In an embodiment of the invention, the image display 1802 may include images of the remote devices located in the kitchen of the home. In an embodiment of the invention, the VMThings 108 may display one or
more advertisements related to the content of the display 
1802. Further, the advertisements may be displayed based on 
user preferences such as user interest, etc. 
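The grouping of remote devices 106a-n into categories (by location, function, or type) could be expressed as a simple grouping routine, as in the illustrative sketch below; the DEVICES sample data and the group_devices function are assumptions.

# Sketch of grouping remote devices so that images and menus can be shown per group.

from collections import defaultdict

DEVICES = [
    {"id": "106a", "name": "microwave",   "location": "kitchen", "type": "appliance"},
    {"id": "106b", "name": "garage door", "location": "garage",  "type": "door"},
    {"id": "106c", "name": "ceiling fan", "location": "kitchen", "type": "appliance"},
]

def group_devices(devices, key):
    groups = defaultdict(list)
    for device in devices:
        groups[device[key]].append(device["name"])
    return dict(groups)

print(group_devices(DEVICES, "location"))   # e.g. show images of kitchen devices only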
[0217] FIG. 18B illustrates transfer of an exemplary display of images from a device to another device, in an embodiment of the invention. In an embodiment of the invention, the VMThings 108 may connect a device 102a to one or more
devices such as a device 102b and transfer the displayed 
content such as display 1802 from the device 102a to the 
device 102b. As shown in FIG. 18B, the device 102b can be a 
smart phone, a mobile phone, a picture frame, an LCD display, an LED display, a GPS screen, a PDA, a TV, a tablet
computer, a projector screen, a computer, a laptop, and so 
forth. The VMThings 108 of the device 102a may transfer 
display 1802 to the display of the device 102b. Therefore, the 
display 1802 including one or more images of the remote 
devices 106a-n or objects may be displayed at the device 
102b. Further, the VMThings 108 may transfer any display 
such as a visual access menu displayed at the device 102a or 
device 102 to the device 102b. In an embodiment of the 
invention, the device 102b may also include an Internet of 
Things application such as VMThings. In an embodiment of 
the invention, the display 1802 is transferred to the device 
102b based on at least one input from the user. Examples of 
the at least one input may include, but are not limited to, a 
touch, a voice command, a gesture, a hand movement, a 
selection of one or more keys at the device 102, and so forth. 
For example, in the case of a touch sensitive screen at the device 102a, a user may transfer the displayed content to the display of the device 102b by touching the screen of the device 102a.
In an embodiment of the invention, the user may provide the 
selection through dual tone multi frequency (DTMF) tones. 
In an embodiment of the invention, the display 1802 may be 
transferred based on the user input to a projection screen or a 
wall. 
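The transfer of the display 1802 from the device 102a to the device 102b on a qualifying user input might look like the following sketch, which assumes a hypothetical Device class and TRANSFER_INPUTS set; it is not an implementation from the specification.

# Sketch of the FIG. 18B display transfer; the trigger inputs mirror those listed above.

TRANSFER_INPUTS = {"touch", "voice", "gesture", "hand_movement", "key", "dtmf"}

class Device:
    def __init__(self, name):
        self.name = name
        self.display = None
    def show(self, content):
        self.display = content
        print(f"{self.name} now displays: {content}")

def transfer_display(source, target, user_input):
    if user_input not in TRANSFER_INPUTS:
        return False
    target.show(source.display)      # the content at 102a moves to 102b
    return True

a, b = Device("device 102a"), Device("device 102b")
a.show("images of kitchen devices")
transfer_display(a, b, "touch")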
[0218] FIG. 19 illustrates an exemplary display of a cockpit
1902 at the device 102, in accordance with an embodiment of 
the invention. The cockpit 1902 is an interface which enables 
24 
Mar. 28, 2013 
a user to access various services, devices or objects. The 
cockpit 1902 may include a plurality of icons 1904a-n representing various objects which a user or users can access or control. The tabs 1904a-n may be icons or text or a combination of these. The cockpit 1902 may include a tab 1904a which is
an icon representing Interactive Voice Response System 
(IVR). The user may select the IVR tab 1904a to access 
various applications and interfaces for interacting with IVR
systems of various destinations. The destinations may be 
organizations or companies or individual services implementing IVR systems. In an embodiment of the invention, the user of the device 102 may connect to any of these destinations by dialing a telephone number of a destination. A tab
1904b is an icon corresponding to interface for controlling 
remote devices 106a-n. The user may select the Remote 
devices tab 1904b for viewing an enhanced visual access 
menu for controlling remote devices 106a-n. The remote devices may be home equipment, cars, doors, electronic appliances, windows, and so forth. A tab 1904c is an icon corresponding to an interface for controlling services 202a-n. The user may select the Services tab 1904c for viewing a visual access menu for accessing or controlling services
202a-n. 
[0219] Further, the cockpit 1902 includes tabs 1904d-n representing other objects such as, but not limited to, an Outlook 1904d, a Calendar 1904e, Personal E-mails 1904f, Messengers 1904g, Games 1904h, and so forth. The user may use the Outlook tab 1904d to check his/her professional or Outlook mails. The user may select the calendar tab 1904e to view the calendar and to plan his/her day. The user may use the calendar tab to do many other routine tasks such as setting timings for meetings and appointments, etc. In an embodiment of the invention, the user may be connected to an online calendar when he/she selects the calendar tab 1904e. In another embodiment of the invention, the user may be displayed with an offline calendar. The user may also set reminders about meetings and occasions such as anniversaries, birthdays, etc. using the calendar tab 1904e.
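The cockpit 1902 and its tabs 1904a-n can be pictured as a small data structure, as in the hypothetical sketch below; the COCKPIT list and the open_tab function are illustrative only.

# Sketch of the cockpit as an ordered set of tabs; labels follow FIG. 19.

COCKPIT = [
    {"tab": "1904a", "label": "IVR",            "opens": "ivr_interfaces"},
    {"tab": "1904b", "label": "Remote devices", "opens": "remote_device_menu"},
    {"tab": "1904c", "label": "Services",       "opens": "services_menu"},
    {"tab": "1904d", "label": "Outlook",        "opens": "outlook_mail"},
    {"tab": "1904e", "label": "Calendar",       "opens": "calendar"},
]

def open_tab(cockpit, label):
    for tab in cockpit:
        if tab["label"] == label:
            print("Opening", tab["opens"])
            return tab["opens"]
    raise KeyError(label)

open_tab(COCKPIT, "Remote devices")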
[0220] FIG. 20A-B illustrates exemplary environments for 
providing access of the cockpit 1902 of a user to other users, 
in accordance with an embodiment of the invention. As 
shown in FIG. 19, a user may be displayed with the cockpit 
1902 for accessing various objects. Further, in an embodi­ment 
of the invention, the user may create or configure the 
cockpit 1902 by using various predefined controls or settings. 
The cockpit 1902 may include the plurality of tabs 1904a-n 
for enabling the user to access the various objects such as 
remote devices 106a-n, services 202a-n, and so forth. In an 
embodiment of the invention, the user may set up the cockpit 
1902 according to his/her preferences such as language pref­erences, 
theme preferences, and so forth. The user may cus­tomize 
the cockpit 1902 according to his/her convenience or 
preferences. 
[0221] In an embodiment of the invention, a first user of a 
first device 2002 may set up a cockpit such as the cockpit 1902 
for accessing various objects at the first device 2002. The first 
device 2002 may include an IVR application VMThings 
2004. The user may create the cockpit 1902 by using the 
VMThings 2004. Further, the first user may provide the 
access of the cockpit 1902 to one or more second users. The 
one or more second users are associated with one or more 
second devices such as a second device 2006. The second 
device 2006 may include an IVR application VMThings 
2008. The VMThings 2008 may display the cockpit 1902 of
the first user at the second device 2006. In an embodiment of 
the invention, the first device 2002 and the second device 
2006 can be a portable device capable of communicating and connecting to other devices such as the remote devices 106a-n. Examples of the first device 2002 and the second device
2006 may include, but are not limited to, a mobile phone, a 
smart phone, a computer, a personal digital assistant (PDA), a 
tablet computer, a laptop, and so forth. 
[0222] Further, the first device 2002 and the second device 
2006 are connected to each other through a network 104. The 
network 104 can be a wired network or a wireless network or 
a combination of these. The wireless network may use wireless technologies to provide connectivity among various devices. Examples of the wireless technologies include, but are not limited to, Wi-Fi, WiMAX, fixed wireless data, ZigBee, Radio Frequency for Consumer Electronics (RF4CE), HomeRF, IEEE 802.11, 4G or Long Term Evolution (LTE), Bluetooth, Infrared, spread-spectrum, Near Field Communication (NFC), Global System for Mobile communication (GSM), Digital-Advanced Mobile Phone Service
(D-AMPS). The device 102 may connect to the plurality of 
remote devices 106a-n through the network 104. Examples of
the wired network include, but are not limited to, Local Area 
Network (LAN), Metropolitan Area Network (MAN), Wide 
Area Network (WAN), and so forth. In an embodiment of the 
invention, the network 104 is the Internet. 
[0223] Further, the cockpit 1902 may include a visual access menu for controlling the plurality of remote devices 106a-n or
services 202a-n. As shown in FIG. 20A, the first user may 
connect and control the plurality of remote devices 106a-n 
through the network 104. Examples of the remote devices 
include, but are not limited to, household devices including 
electric lights, water pump, generator, fans, television (TV), 
cameras, microwave, doors, windows, computer, or garage 
locks, security systems, air-conditioners (AC), lights, and so 
forth. In an embodiment of the invention, the plurality of the 
remote devices 106a-n can be vehicles such as cars, trucks, 
vans, and so forth. Once set up, the first user may access the 
cockpit 1902 at the first device 2002. In an embodiment of the 
invention, the user may access the cockpit 1902 through a 
website or web browser. The user(s) may have to authenticate before accessing the cockpit. In an embodiment of the invention, the cockpit 1902 may be stored at a proxy server 2010.
Further, the proxy server 2010 may also store cockpits of 
other users. In an embodiment of the invention, the proxy 
server 2010 may maintain a record of the interaction of the 
users with the cockpits. Further, the proxy server 2010 may 
include a list of users and information about access control 
over various cockpits. In an embodiment of the invention, the 
access control permissions of the cockpit 1902 may be pro­vided 
to the one or more second users by the proxy server 
2010. In an embodiment of the invention, the proxy server 
2010 may send a message to the first user to ask for a permis­sion 
regarding some changes in his/her cockpit 1902 by the 
one or more second users. Thereafter, the cockpit 1902 may 
be changed or updated based on the permission from the first 
user. Further, the proxy server 2010 may monitor the cockpit 
1902 of the first user and see if there are unauthorized requests to control the cockpit 1902 or the remote devices 106a-n. In case there are unauthorized requests, the proxy server 2010
may report to the owner of the cockpit 1902 such as the first 
user. In an embodiment of the invention, the proxy server 
2010 may report about unauthorized access to a security 
designated entity. Thereafter, either the security designated 
entity or the first user may take an action to handle the unauthorized access. For example, the first user may block the
users from which unauthorized access requests are received. 
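One way to picture the role of the proxy server 2010 as described above is the sketch below: it keeps per-cockpit permissions, logs interactions, and flags unauthorized requests. The ProxyServer class and its policy are assumptions for illustration only.

# Sketch of proxy-server access control and unauthorized-request reporting.

class ProxyServer:
    def __init__(self):
        self.permissions = {}      # cockpit owner -> set of users allowed to access
        self.interaction_log = []
        self.alerts = []

    def grant(self, owner, user):
        self.permissions.setdefault(owner, set()).add(user)

    def request_access(self, owner, user, action):
        self.interaction_log.append((owner, user, action))
        if user not in self.permissions.get(owner, set()):
            self.alerts.append(f"unauthorized request by {user} on {owner}'s cockpit")
            return False           # would be reported to the owner or a security entity
        return True

proxy = ProxyServer()
proxy.grant("john", "marie")
print(proxy.request_access("john", "marie", "view"))     # True
print(proxy.request_access("john", "stranger", "edit"))  # False, logged as unauthorized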
[0224] In an embodiment of the invention, the user may 
create or configure an Internet of Things menu including 
representations of one or more identifiable objects. The identifiable objects may be virtual or physical objects. The user may share the Internet of Things menu with other users such as friends or relatives.
[0225] In an embodiment of the invention, different users 
may request access to the cockpit 1902 of other users. In an embodiment of the invention, the one or more second users may request to get control over the first user's cockpit 1902. For example, a wife may request her husband to get access to his cockpit. The one or more second users may get access to the cockpit 1902 of the first user based on the permission granted by the first user. In an exemplary scenario, the reverse control
may allow the service provider to get more information and 
control of the cockpit of the users. The service provider can be 
a telecom service provider, a grocery provider, a movie rental 
service provider, an internet provider, and so forth. 
[0226] FIG. 21 illustrates a flowchart diagram for providing 
access control of the cockpit to one or more second users, in 
accordance with an embodiment of the invention. As illustrated in FIG. 20A-B, the first user may configure or customize the cockpit 1902 at the first device 2002. The first user may
communicate with the one or more second users over the 
network 104 such as the Internet. The first device 2002 may 
connect to the second device 2006 through the network 104. 
[0227] At step 2102, the first user may access a graphical 
user interface (GUI) for configuring the cockpit 1902 at the 
first device 2002. At step 2104, the user may configure the 
cockpit 1902 based on his/her one or more preferences. 
Examples of the preferences may include, but are not limited 
to, language selection, font size, and selection of remote 
devices, favorite services, pictures, icons, themes, and so 
forth. For example, the user may select a color and theme for 
his/her cockpit 1902. 
[0228] At step 2106, the first user may share the cockpit 
1902 with the one or more second users. For example, the first user such as John may share the cockpit 1902 for managing and controlling his home devices with his wife Marie or son
Paul so that they may also control the home devices. Further, 
the user may provide limited or full control of the cockpit 
1902 to the second users. Further, the control to the cockpit 
1902 including different tabs representing objects such as 
remote devices may be provided to different second users. In 
an embodiment of the invention, the access to the cockpit 
1902 may be provided on an event basis. For example, the first 
user may provide access to the second user for two days, or till 
Christmas. In an embodiment of the invention, the first user may provide access to the cockpit 1902 based on time, for example, for 4 hours, 3 hours, and so forth.
[0229] In an embodiment of the invention, the first user may 
receive one or more alert messages about the remote devices, 
services or other objects of the cockpit 1902. In an embodi­ment 
of the invention, the VMThings 2004 may send these 
alert messages or control of the cockpit 1902 to the first user 
when he/she is available. In another embodiment of the inven­tion, 
the VMThings 2004 may send the alert messages or 
control of the cockpit 1902 to the other second users when the 
first user is not available. Further, the user may set up a list of 
second users to whom the control of the cockpit 1902 may be passed in the absence of the first user.
[0230] Further, the VMThings 2008 at the second device 2006 may translate the language of the cockpit 1902 based on the language preference of the second user. In an embodiment of
the invention, the VMThings 2008 may translate the cockpit 
1902 of the first user based on the configuration of the second 
device 2006. For example, the VMThings 2008 may translate 
the cockpit 1902 into the Russian language if the second user understands Russian. Then at step 2110, the cockpit 1902 or a menu of the cockpit 1902 may be displayed at the second device 2006. In an embodiment of the invention, the cockpit 1902 may be downloaded at the second device 2006. Thereafter, the second user may interact with the cockpit 1902.
Further, the VMThings 2008 may change the display of the 
second device 2006 to a menu of the shared cockpit 1902. 
Further, the displayed visual access menu or the cockpit 1902 
will be according to the second user's preference(s). 
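Event- or time-limited sharing of the cockpit 1902 (for example, for 4 hours or till Christmas) could be modeled with an expiry per grant, as in this illustrative sketch; the shares table and helper functions are hypothetical.

# Sketch of time-limited cockpit sharing, as described for steps 2106-2110.

from datetime import datetime, timedelta

shares = {}   # second user -> expiry time of the grant

def share_cockpit(user, duration_hours):
    shares[user] = datetime.now() + timedelta(hours=duration_hours)

def may_use_cockpit(user):
    expiry = shares.get(user)
    return expiry is not None and datetime.now() < expiry

share_cockpit("paul", 4)          # access for 4 hours
print(may_use_cockpit("paul"))    # True until the grant expires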
[0231] FIG. 22 illustrates a flowchart diagram for providing access control of the cockpit to one or more second users, in accordance with another embodiment of the invention. As illustrated in FIG. 20A-B, the first user may configure or customize the cockpit 1902 at the first device 2002. The first
user may communicate with the one or more second users 
over the network 104 such as the Internet. The first device 
2002 may connect to the second device 2006 through the 
network 104. 
[0232] At step 2202, the first user may access a graphical 
user interface (GUI) for configuring the cockpit 1902 at the 
first device 2002. The first device 2002 may be a mobile 
phone, a smart phone, a computer, a personal digital assistant 
(PDA), a tablet computer, a laptop, and so forth. At step 2204, 
the user may configure the cockpit 1902 based on his/her one or more preferences. Examples of the one or more preferences may include, but are not limited to, language preference, font size, preferred remote devices, favorite services, pictures, icons, themes, and so forth. For example, the user may select a font size for his/her cockpit 1902.
[0233] At step 2206, the first user may share the cockpit 
1902 with the one or more second users. For example, the first 
user such as John may share the cockpit 1902 for managing 
and controlling his home devices with his wife Marie or son 
Paul so that they may also control the home devices. In an 
embodiment of the invention, the second users may also 
provide control of the cockpit 1902 to one or more third users 
after getting control of the cockpit 1902. The one or more 
second users are the users associated with one or more second 
devices such as the second device 2006. Further, the user may 
provide partial or full control of the cockpit 1902 to the second users. Further, the control of the cockpit 1902, including different objects or remote devices, may be provided to the second users. Further, the access control of the objects may differ for different users. For example, the first user may provide complete control, i.e., viewing, controlling, and modifying permission, of his/her cockpit 1902 to a User A, and may give partial/limited control, such as just viewing and controlling permission, to a User B.
[0234] In an embodiment of the invention, the access to the 
cockpit 1902 may be provided on an event basis. For example, 
the first user may provide access to the second user for two 
days, or till Christmas. In an embodiment of the invention, the first user may provide access to the cockpit 1902 based on time, for example, for 4 hours, 3 hours, till 5:30 PM, and so forth.
[0235] In an embodiment of the invention, the first user may
receive one or more alert messages about the remote devices, 
services or other objects of the cockpit 1902. In an embodi­ment 
of the invention, the VMThings 2004 may send these 
alert messages or control of the cockpit 1902 to the first user 
when he/she is available. In another embodiment of the inven­tion, 
the VMThings 2004 may send the alert messages or 
control of the cockpit 1902 to the other second users when the 
first user is not available. Further, the user may set up a list of 
second users to whom the control of the cockpit 1902 may be passed in the absence of the first user.
[0236] Further, the VMThings 2008 at the second device 2006 may translate the cockpit 1902 based on the language preference of the second user. For example, the VMThings 2008 may translate the cockpit 1902 into the Russian language if the second user understands Russian or wants to view the cockpit 1902 in Russian. In an embodiment of the invention, the VMThings 2008 may translate the language of the cockpit 1902 of the first user based on the configuration of the second device 2006. For example, the VMThings 2008 may translate the cockpit 1902, which is in the English language, into a Russian-language cockpit if the second user understands or wants to view the cockpit in the Russian language. Then at step 2210, the
cockpit 1902 or a menu of the cockpit 1902 may be displayed 
at the second device 2006. Further, the VMThings 2008 may 
change the display of the second device 2006 to a visual menu 
of the shared cockpit 1902. Further, the displayed menu will 
be according to the second user's preference. 
[0237] Thereafter, at step 2212 the one or more second 
users may interact with the cockpit 1902 at their respective 
one or more second devices. The second user(s) may view and 
control the one or more objects in the cockpit 1902 from the 
second device 2006 itself. For example, the second user may 
use his/her smart phone to switch off the microwave associ­ated 
with a home of the first user. Further, the first user may 
receive notifications regarding events at the first device 2002. 
The events may be such as, but not limited to, switch on, switch off, theft, and so forth. In an embodiment of the invention, the first user may receive notifications about changes done by the one or more second users to his/her cockpit 1902.
Further, messages asking to approve these changes by the 
second users may be received by the first user at the first 
device 2002. 
[0238] Further, the proxy server 2010 may maintain a 
record of interactions with the cockpit 1902 by different 
users. Further, the proxy server 2010 may have some level of 
control related to the sharing of the cockpit 1902 with other 
users. In an embodiment of the invention, the first user may 
provide some instructions to the proxy server 2010 regarding 
sharing of the cockpit. The proxy server 2010 may know to 
whom to send the request and when to send the request if it 
does not work for any reason. Further, the proxy server 2010 
may maintain records related to managing ownership of the 
control of the cockpit 1902. The proxy server 2010 may also 
decide to whom to give control and how much control of the 
cockpit 1902 of the first user. In an embodiment of the inven­tion, 
the proxy server 2010 may decide about giving control to 
other users based on predefined settings received from the 
first user (or the users). Further, the proxy server 2010 may 
save the access pattern of the first user or the one or more 
second users. Further, the proxy server 2010 may also store 
profile information such as name, age, and profession etc. of 
the users. Furthermore, the proxy server 2010 may provide 
control to the second users based on one or more parameters such as, but not limited to, time, event, availability of a user at the device, and so forth. Further, the proxy server 2010
may maintain a record of all the changes done to the cockpit 
1902 by the one or more second users. In an embodiment of 
the invention, the first user may roll back all the changes done 
by the other second users based on the record of the changes 
maintained at the proxy server 2010. 
[0239] In an embodiment of the invention, different users 
may request access to cockpit of other users. In an exemplary 
scenario, the one or more second users may request to get 
control over the first user's cockpit 1902. For example, a daughter may request her mom to get access to her cockpit 1902. Therefore, the one or more second users may get access to the cockpit 1902 of the first user based on the permission granted by the first user. The request for sharing the cockpit may be received by the users in the form of SMS, MMS, instant messages,
e-mails, and so forth at their respective devices. The first user 
may provide complete access or limited access to the one or 
more users. In an exemplary scenario, the reverse control may allow the service provider to get more information and control of the cockpit 1902 of users. Further, the proxy server 2010 may monitor the cockpit 1902 of the first user and see if there are unauthorized requests to control the cockpit 1902. In case there are unauthorized requests, the proxy server 2010
may report to the owner of the cockpit 1902 such as the first 
user. In an embodiment of the invention, the proxy server 
2010 may report about unauthorized access to a security 
designated entity. In an embodiment of the invention, the 
proxy server 2010 may itself handle the unauthorized access 
requests. 
[0240] At step 2214, the interactions with the cockpit 1902 
of the first user may be stored at the proxy server 2010. The 
proxy server 2010 may store the interactions in the form of lists, records, text, audio, video, and so forth. At step 2216, the proxy server 2010 may send a message to the first user to ask for permission regarding some changes in his/her cockpit 1902 by the one or more second users. Thereafter, the cockpit 1902 may be changed or modified or updated based on the permission received from the first user.
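The change tracking and roll-back described for the proxy server 2010 might be sketched as a simple change log, shown below; the CockpitLog class is an assumption introduced only to illustrate recording and reverting changes made by second users.

# Sketch of recording second-user changes to a cockpit and rolling them back.

class CockpitLog:
    def __init__(self, settings):
        self.settings = dict(settings)
        self.history = []                      # (user, key, previous value)

    def change(self, user, key, value):
        self.history.append((user, key, self.settings.get(key)))
        self.settings[key] = value             # owner approval could gate this step

    def roll_back(self):
        while self.history:
            _, key, old = self.history.pop()
            self.settings[key] = old

log = CockpitLog({"theme": "blue"})
log.change("marie", "theme", "green")
log.roll_back()
print(log.settings)   # back to {'theme': 'blue'}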
[0241] FIG. 23 illustrates a flowchart diagram for customizing a cockpit based on a user's preference, in accordance with an embodiment of the invention. A user may create or configure a cockpit such as the cockpit 1902 as shown in FIG. 19.
The cockpit 1902 may include a plurality of tabs or icons 
1904a-n representing different types of objects. The cockpit 
1902 may be device specific or user specific. The VMThings 
108 may present a GUI for configuring the cockpit 1902 to a 
user at the device 102. 
[0242] At step 2302, the user may access a database of 
visual access menus through a GUI for customizing a cockpit 
including multiple visual access menus corresponding to 
multiple objects at the device 102. The visual access menus 
may be visual menus for accessing one or more objects such 
as, but not limited to, services 202a-n, remote devices 
106a-n, and so forth. The user may provide one or more inputs 
at the device 102. At step 2304, the VMThings 108 may 
search the database for a cockpit or one or more visual access 
menus based on the one or more inputs received from the user. 
The user may provide inputs at the device by at least one of 
pressing one or more keys at the device 102, giving a voice 
command, through gestures, hand movement, touching the 
screen of the device 102, and so forth. In an embodiment of 
the invention, the VMThings 108 may retrieve a cockpit or 
visual access menu matching the inputs from a server. In 
another embodiment of the invention, the VMThings 108 
may display a message indicating that the cockpit or the visual access 
menu is not available at the device 102. 
[0243] At step 2306, the VMThings 108 may customize the 
cockpit visual access menu according to the user's preference. In 
an embodiment of the invention, the VMThings 108 may 
customize one or more visual access menus or objects of the 
cockpit according to the user's preference. For example, the user 
may be interested in controlling remote devices such as the car, 
garage, home doors, fans, and lights of his/her house only. So, 
the user may be displayed with a visual access menu corresponding 
to his/her preferred remote devices of the remote 
devices 106a-n. Through this visual access menu the user 
may access and control one or more operations of the personal 
remote devices. Similarly, the user may define his/her preferences 
for accessing the remote devices present at his/her 
office or factory, and so forth. Therefore, multiple visual 
access menus may be stored at the devices based on the 
preferences of the user. Examples of the preferences may 
include, but are not limited to, language preference, font size, 
and selection of remote devices, favorite services, pictures, 
icons, themes, and so forth. For example, the user may select 
a color and theme for his/her cockpit to be displayed at the 
device 102. In an embodiment of the invention, the user may 
be displayed with a different visual access menu when the 
user accesses the visual access menu from different devices. 
For example, when the user is accessing a visual access menu 
to control services from his/her laptop, he may see a first 
visual access menu and when the same user accesses the 
visual access menu from his/her smart phone he may be 
presented with a second visual access menu. The purpose or 
functionality of the first visual access menu may be the same as that of 
the second visual access menu. For example, the first and the 
second visual access menu may be the visual menus for 
controlling one or more cars of the user. 
[0244] Thereafter, at step 2308, a customized cockpit or the 
one or more visual access menus may be displayed at the 
device 102. In an embodiment of the invention, the visual 
access menu may be customized based on the user preferences 
received in real time. In another embodiment of the 
invention, the visual access menu may be customized based 
on predefined user preferences. In an embodiment of the 
invention, the customized visual access menu may be stored 
at the device 102 or at a server in a cloud network. 
[0245] In an embodiment of the invention, a standard cockpit 
or visual access menu may be displayed to the user. The 
standard cockpit may be an interface which is not customized 
according to the user preferences. The standard visual access 
menu may be a standard menu which may be displayed without 
any customization specific to the user. 
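As a minimal sketch only, the following Python snippet illustrates how a preference-driven visual access menu of the kind described above could be assembled, with a different layout per device type so that the same user may see one menu on a laptop and another on a smart phone. The data structures and field names (PREFERENCES, favorite_devices, build_visual_access_menu) are hypothetical.

```python
# Hypothetical stored preferences for one user; field names are illustrative.
PREFERENCES = {
    "alice": {
        "language": "en",
        "font_size": 14,
        "theme": "dark",
        "favorite_devices": ["car", "garage", "lights"],
    }
}

ALL_DEVICES = ["car", "garage", "lights", "fans", "home_doors",
               "office_printer", "factory_pump"]

def build_visual_access_menu(user_id: str, device_type: str) -> dict:
    """Return a menu limited to the user's preferred remote devices.

    A different layout is chosen per device type, echoing the idea that
    the purpose of the first and second menus is the same even though
    their presentation differs between devices.
    """
    prefs = PREFERENCES.get(user_id, {})
    devices = [d for d in ALL_DEVICES
               if d in prefs.get("favorite_devices", ALL_DEVICES)]
    layout = "grid" if device_type == "laptop" else "list"
    return {
        "layout": layout,
        "theme": prefs.get("theme", "standard"),
        "font_size": prefs.get("font_size", 12),
        "language": prefs.get("language", "en"),
        "entries": devices,
    }

print(build_visual_access_menu("alice", "laptop"))
print(build_visual_access_menu("alice", "smart_phone"))
```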
[0246] FIG. 24 illustrates a flowchart diagram for configuring 
a cockpit, in accordance with an embodiment of the 
invention. As discussed with reference to FIG. 1A, a user may 
access or control the remote devices 106a-n or services 
202a-n by using the device 102. The device 102 may include 
the VMThings 108 for displaying graphical information at the 
device 102. The user may create a cockpit by using a GUI at 
the device 102. At step 2402, the user may access a database 
of visual access menus through a GUI for creating a cockpit 
such as the cockpit 1902 as shown in FIG. 19. For example, 
the user may access a database of visual access menu at 
his/her smart phone. In an embodiment of the invention, the 
database may be present at the device 102. In another embodiment 
of the invention, the database may be present on a server 
in a cloud network.
[0247] At step 2404, the VMThings 108 may display one or 
more configuration settings options for creating the cockpit to 
the user at the device 102. The user may choose or select one 
or more configuration setting options. In an embodiment of 
the invention, the user may provide inputs regarding the configuration 
settings. At step 2406, a selection of the one or 
more configuration setting options may be received at the 
device 102. In an embodiment of the invention, the VMThings 
108 may detect and receive the selection of the configuration 
options from the user at the device 102. At step 2408, a 
cockpit may be created based on the selection received from 
the user. In an embodiment of the invention, the VMThings 
108 may create the cockpit based on the selection of the 
configuration options. The cockpit created may be a customized 
cockpit specific to the user. The customized cockpit may 
be stored at the device 102. Thereafter, at step 2410, the 
cockpit may be displayed at the device 102. In an embodiment 
of the invention, the cockpit may be displayed at a display 
device such as the display device 118 connected to the device 
102. 
[0248] FIG. 25 illustrates a flowchart diagram for customizing 
a cockpit based on other users' reviews, in accordance 
with an embodiment of the invention. As discussed with 
reference to FIG. 19, the user may access different objects 
through the cockpit 1902. Further, the user may create or 
configure or set up or customize a cockpit specific to the user. 
[0249] At step 2502, a user may access a database including 
a plurality of visual access menus through a GUI for creating 
a cockpit at a device such as the device 102. The visual access 
menus are the visual menus for accessing or controlling multiple 
objects such as remote devices 106a-n or services 202a-n. 
In an embodiment of the invention, the database may be 
present at a server in the network 104. In another embodiment 
of the invention, the database of visual access menus may 
be present at the device 102. 
[0250] At step 2504, one or more configuration options for 
configuring/creating or customizing the cockpit may be displayed 
to the user. In an embodiment of the invention, the 
VMThings 108 may display the one or more configuration 
options to the user. The user may select or choose these one or 
more configuration options to change or modify a standard 
cockpit. At step 2506, the user may create or configure the 
cockpit based on a selection of the one or more configuration 
options received from the user. 
[0251] The user may allow other users to view or check or 
access the cockpit and rate it and provide reviews or feedback 
about the cockpit. At step 2508, the user may receive ratings/ 
reviews/feedback for the cockpit from the other users in the 
network 104. The other users may also suggest some changes 
like addition or deletion in the cockpit to the user. At step 
2510, the cockpit may be customized at the device 102 based 
on the ratings or reviews or feedback received from the other 
users. In an embodiment of the invention, the VMThings 108 
may modify the cockpit based on the reviews or ratings or 
feedback automatically at the device 102. In another embodiment 
of the invention, the user may accept or reject reviews or 
feedback and then he/she may modify the cockpit manually 
or with the help of the VMThings 108 application at the 
device 102. 
[0252] Further, the modified cockpit may be stored in the 
database. Thereafter, at step 2512, the customized or modified 
cockpit may be displayed at the device 102. In an embodiment 
of the invention, the modified cockpit may be displayed at the 
display device 118 such as a projector screen, a TV, a large 
screen and so forth. In an embodiment of the invention, the 
user may not customize the cockpit based on the other users' 
reviews or feedback. 
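The following Python sketch, given only as an illustration, shows one way community ratings and suggested changes could be applied to a cockpit either automatically or after the user filters them, as in the FIG. 25 flow. The feedback structure, rating threshold, and function name apply_feedback are hypothetical.

```python
cockpit = {"tabs": ["car", "lights", "thermostat"], "theme": "standard"}

# Suggested changes collected from other users; the structure is illustrative.
feedback = [
    {"action": "add", "tab": "garage", "rating": 5},
    {"action": "remove", "tab": "thermostat", "rating": 2},
    {"action": "add", "tab": "sprinkler", "rating": 4},
]

def apply_feedback(cockpit: dict, feedback: list, auto: bool = True,
                   min_rating: int = 4) -> dict:
    """Apply community suggestions automatically, or filter them first.

    When `auto` is False, low-rated suggestions are dropped, standing in
    for the case where the user accepts or rejects reviews manually.
    """
    updated = dict(cockpit, tabs=list(cockpit["tabs"]))
    for item in feedback:
        if not auto and item["rating"] < min_rating:
            continue  # rejected by the user
        if item["action"] == "add" and item["tab"] not in updated["tabs"]:
            updated["tabs"].append(item["tab"])
        elif item["action"] == "remove" and item["tab"] in updated["tabs"]:
            updated["tabs"].remove(item["tab"])
    return updated

print(apply_feedback(cockpit, feedback, auto=False))
```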
[0253] FIG. 26 illustrates a flowchart diagram for downloading 
and customizing a cockpit at a second device, in 
accordance with an embodiment of the invention. The user 
may share the cockpit with other users. The cockpit may be 
modified by the other users based on the access control permissions 
from the user. Further, the user may configure or 
customize his/her cockpit with the help of other users in 
his/her social network. The social network may be created by 
the user by using a social networking website. Examples of 
the social networking websites include, but are not limited to, 
Facebook, Google+, Orkut, Twitter, Academia.edu, Athlinks, 
Bebo, Badoo, BIGADDA, BlackPlanet, Buzznet, Cloob, 
Faceparty, Flixter, Fubar, Google Buzz, Hi5, ibibo, MySpace, 
LinkedIn, MyLife, Ning, WAYN, and so forth. For example, 
the user may share or invite other users to help him in creating 
his/her cockpit in real time. 
[0254] At step 2602, a first cockpit may be configured or 
created by accessing a GUI for creating the cockpit at a first 
device. A first user may create the first cockpit at the first 
device. Then at step 2604, the first cockpit may be shared with 
one or more second users and downloaded at their respective 
one or more second devices. Examples of the first device and 
the second devices may include, but are not limited to, a 
mobile phone, a smart phone, a computer, a laptop, an iPod, 
an iPad, a tablet computer, a home controller, a set top box, 
an android device, an android set top box, and so forth. The 
cockpit may be downloaded at the system through at least one 
of an SMS, an MMS, File transfer protocol (FTP), an E-mail, 
through wireless technologies like Bluetooth, ZigBee, 
RF4CE, Wi-Fi, WiMAX, and so forth. 
[0255] At step 2606, the one or more second users may 
modify or customize a second cockpit at the one or more 
second devices based on the downloaded first cockpit. The 
second cockpit is associated with at least one of the one or 
more second users. At step 2608, ratings or reviews or feedback 
may be received on the customized second cockpit of the 
second user from the other users (or one or more third users) 
in his/her social network. For example, a second user may 
receive ratings on the second cockpit from his/her friends or 
relatives in the social network such as on Facebook, Twitter, 
Orkut, Ning, MySpace, ibibo, and so forth. 
[0256] At step 2610, one or more configuration settings of 
the second cockpit are downloaded at the first device based on 
the reviews or ratings of the other users, i.e., the one or more 
third users. At step 2612, the first cockpit may be customized 
based on the downloaded configuration settings and reviews. 
Thereafter, at step 2612, the customized first cockpit may be 
displayed at the first device. In an embodiment of the invention, 
the customized first cockpit may be stored in the database. 
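By way of a minimal sketch, the following Python snippet shows how a cockpit could be serialized on a first device, shared (for example by e-mail, SMS, or FTP), and then adapted on a second device, in the spirit of the FIG. 26 flow. The JSON payload format and the function names are hypothetical choices, not part of the disclosure.

```python
import json

def export_cockpit(cockpit: dict) -> str:
    """Serialize a cockpit so it can be sent by e-mail, SMS, FTP, etc."""
    return json.dumps(cockpit)

def import_and_customize(payload: str, local_overrides: dict) -> dict:
    """Download a shared cockpit and adapt it on the second device."""
    shared = json.loads(payload)
    shared.update(local_overrides)
    return shared

first_cockpit = {"tabs": ["car", "lights"], "theme": "dark", "language": "en"}
payload = export_cockpit(first_cockpit)           # first device shares it
second_cockpit = import_and_customize(payload,    # second device customizes it
                                       {"theme": "light", "tabs": ["car", "tv"]})
print(second_cockpit)
```

The same serialized form could be sent back to the first device so that the second cockpit's configuration settings, once reviewed, are used to update the first cockpit.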
[0257] FIG. 27 illustrates a flowchart diagram for configuring 
a cockpit based on another cockpit of another user, in 
accordance with an embodiment of the invention. As discussed 
with reference to FIG. 1A, every user in the network 
104 may access visual access menus at their respective 
devices. Subsequently through these visual access menus, the 
user may control the one or more functions or operations of 
the one or more objects such as the remote devices 106a-n. As 
discussed with reference to FIGS. 19 and 20, the user may 
configure a cockpit such as the cockpit 1902 according to his/her 
preferences. As discussed with reference to FIG. 26, the user
may configure or customize his/her cockpit with the help of 
other users in his/her social network. The social network may 
be created by the user by using a social networking website. 
Examples of the social networking websites include, but are 
not limited to, Facebook, Google+, Orkut, Twitter, Academia.edu, 
Athlinks, Bebo, Badoo, BIGADDA, BlackPlanet, Buzznet, 
Cloob, Faceparty, Flixter, Fubar, Google Buzz, Hi5, 
ibibo, MySpace, LinkedIn, MyLife, Ning, WAYN, and so 
forth. For example, the user may share or invite other users to 
help him in creating his/her cockpit in real time. 
[0258] At step 2702, at least one second cockpit associated 
with one or more second users is selected from a database. 
The database may be at a first device or at a second device or 
at a server in the network 104. Each user in the network 104 
may have an associated profile stored at the database. The 
profile of a user may include information such as but not 
limited to, name, age, Identity (ID), interests, favorite books, 
and so forth about the user. Further, the at least one second 
cockpit is associated with a second user whose profile is 
similar to a profile of a first user. In an embodiment of the 
invention, the VMThings 108 may search and select the at 
least one cockpit from the database. In an embodiment of the 
invention, the user may select the second cockpit of the one or 
more second users. 
[0259] At step 2704, the second cockpit may be analyzed 
by the VMThings 108. In an embodiment of the invention, the 
analysis may happen at the first device. In another embodiment 
of the invention, the analysis may happen at the server in 
the network 104 or a network device in a cloud network. At 
step 2706, a first cockpit specific to the first user may be 
created or configured based on the analysis of the second 
cockpit of the one or more second users. In an embodiment of 
the invention, the VMThings 108 may create the first cockpit 
based on the second cockpit. In another embodiment of the 
invention, the user may provide inputs for configuring the 
cockpit based on the analysis of the second cockpit. Further, 
the user may invite other users, such as his/her friends, relatives, 
colleagues, and so forth, to configure the cockpit for the user. 
The first cockpit may be stored at the first device. In an 
embodiment of the invention, the first cockpit may be stored 
at the server or the network device. Thereafter, at step 2708, 
the first cockpit may be displayed at the first device to the user. 
In an embodiment of the invention, the first cockpit may be 
displayed at a display device connected to the first device. The 
display device may be connected to the first device through 
wireless or wired means. 
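As an illustrative sketch of the profile-matching step described above, the following Python snippet selects a second cockpit whose owner has a profile most similar to the first user's, and copies it as a starting point. The similarity measure, profile fields, and function names are hypothetical assumptions for the example only.

```python
# Hypothetical profiles; real ones may carry name, age, ID, interests, etc.
profiles = {
    "first_user": {"age": 34, "interests": {"cars", "music", "gardening"}},
    "user_b":     {"age": 36, "interests": {"cars", "music"}},
    "user_c":     {"age": 21, "interests": {"gaming"}},
}
cockpits = {
    "user_b": {"tabs": ["car", "stereo", "garage"]},
    "user_c": {"tabs": ["console", "tv"]},
}

def similarity(p1: dict, p2: dict) -> float:
    """Crude profile similarity: shared interests minus an age penalty."""
    shared = len(p1["interests"] & p2["interests"])
    return shared - abs(p1["age"] - p2["age"]) / 10.0

def pick_template(first: str) -> dict:
    """Select the second cockpit whose owner's profile is most similar."""
    candidates = [(similarity(profiles[first], profiles[u]), u)
                  for u in cockpits]
    _, best = max(candidates)
    return dict(cockpits[best])  # copy, so the first user can modify it

print(pick_template("first_user"))  # starts from user_b's cockpit
```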
[0260] FIG. 28 illustrates a flowchart diagram for configuring 
a cockpit based on another cockpit of another user, in 
accordance with another embodiment of the invention. At 
step 2802, the user may access a graphical user interface 
(GUI) for configuring or creating a cockpit at a first device. At 
step 2804, the first user may provide information or profile of 
at least one second user. The profile may include information 
such as a name, age, devices, services, and so forth. Then at 
step 2806, the VMThings 108 may search for a second cockpit 
of the second user and download it at the first device. At 
2808, the VMThings 108 may customize or configure a first 
cockpit for the first user based on the second cockpit of the at 
least one second user. Further, at step 2810, the VMThings 108 may store the first 
cockpit at the first device. In an embodiment of the invention, 
the first cockpit may be stored at a server in the network 104. 
Further, the user may translate the first cockpit from one 
language to another. The user may change or select a new font 
size, theme, color, etc. for the first cockpit. Thereafter, at step 
2812, the first cockpit may be displayed to the user at the first 
device. In an embodiment of the invention, the first cockpit 
may be displayed at a display device attached or connected to 
the first device. Thereafter, the user may interact and access 
the one or more objects of the first cockpit accordingly. 
[0261] FIG. 29 illustrates a flowchart for downloading a 
cockpit from a network, in accordance with an embodiment 
of the invention. In an embodiment of the invention, the user 
may download the cockpit or one or more configuration settings 
for setting his/her cockpit at a device. At step 2902, a 
graphical user interface (GUI) for creating or configuring or 
copying a cockpit at a device may be accessed by a user. In an 
embodiment of the invention, the user may configure his/her 
cockpit based on the cockpit of other users in the network 104. 
At step 2904, the user may select and download a cockpit 
having good reviews and ratings from the other users from the 
network 104 such as the Internet. The cockpit may be present 
in a cloud network. In an embodiment of the invention, the 
user may customize the downloaded cockpit according to 
his/her preference and device compatibility. At step 2906, the 
cockpit may be customized or translated according to a language 
preference of the user. In an embodiment of the invention, 
the cockpit may be translated or customized by the 
VMThings 108 based on predefined preferences of the user. 
For example, the cockpit language may be changed from 
English to Spanish. In an embodiment of the invention, the 
user may not customize the downloaded cockpit. At step 
2908, the customized cockpit may be stored at the device. In 
an embodiment of the invention, the customized cockpit may be 
stored at a server or in a cloud network. At step 2910, the 
customized cockpit may be displayed at the device or at a 
display device attached to the device. 
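A minimal sketch, in Python, of the download-and-translate behavior described for FIG. 29 follows: it picks the cockpit with the best reviews and then translates the tab labels according to a language preference. The rating values, the toy translation table, and the function names are hypothetical.

```python
# Cockpits available in the network, with average ratings (illustrative).
available = [
    {"name": "home_basic",  "rating": 3.9, "tabs": ["lights", "doors"]},
    {"name": "home_deluxe", "rating": 4.7, "tabs": ["lights", "doors", "car"]},
]

# A toy English-to-Spanish table standing in for a real translation step.
ES = {"lights": "luces", "doors": "puertas", "car": "coche"}

def download_best(cockpits: list) -> dict:
    """Pick the cockpit with the best reviews, as at step 2904."""
    return max(cockpits, key=lambda c: c["rating"])

def translate(cockpit: dict, table: dict) -> dict:
    """Translate the tab labels per the user's language preference."""
    return dict(cockpit, tabs=[table.get(t, t) for t in cockpit["tabs"]])

chosen = download_best(available)
print(translate(chosen, ES))  # home_deluxe with Spanish tab labels
```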
[0262] FIG. 30 illustrates an environment for accessing a 
cockpit through a website, in accordance with an embodiment 
of the invention. As discussed with reference to FIG. 19, the 
cockpit 1902 may include multiple tabs or icons 1904a-n for 
connecting to and controlling multiple objects 3006a-n. The 
objects may include, but are not limited to, remote devices, 
services, applications, and so forth. A user may use a device 
3002 to access a cockpit or visual access menus through a 
website in a network 3004. Examples of the device 3002 may 
include, but are not limited to, smart phone, PDA, a mobile 
phone, a computer, a laptop, a tablet computer, an iPod, and 
so forth. 
[0263] The network 3004 can be a wired network or a 
wireless network or a combination of these. The wireless 
network may use wireless technologies to provide connectivity 
among various devices. Examples of the wireless technologies 
include, but are not limited to, Wi-Fi, WiMAX, fixed 
wireless data, ZigBee, Radio Frequency for Consumer 
Electronics (RF4CE), HomeRF, IEEE 802.11, 4G or 
Long Term Evolution (LTE), Bluetooth, Infrared, spread spectrum, 
Near Field Communication (NFC), Global System 
for Mobile communications (GSM), and Digital-Advanced 
Mobile Phone Service (D-AMPS). The device 102 is connected 
to the plurality of remote devices 106a-n through the 
network 104. Examples of the wired network include, but are 
not limited to, Local Area Network (LAN), Metropolitan 
Area Network (MAN), Wide Area Network (WAN), and so 
forth. In an embodiment of the invention, the network 104 is 
the Internet. In an embodiment of the invention, the one or 
more objects may connect to the network 3004 through a 
network device such as, but not limited to, a router, a bridge,
a switch, a gateway, a home communication device, and so 
forth. In an embodiment of the invention, the objects 3006a-n 
may connect to the network 3004 indirectly through a local 
network. 
[0264] The device 3002 may include a web browser for 
opening a web site. Examples of the web browser include, but 
are not limited to, Internet Explorer, Google Chrome, Mozilla 
Firefox, Netscape Navigator, and so forth. The user can enter 
a Uniform Resource Locator (URL) such as 'www.XYZ.com' 
in the web browser to access the website. Further, when 
the user enters a URL in the web browser, a web page 3008 
may be displayed at the device 3002 based on the URL. The 
web page 3008 may include one or more data request fields 
3010a-n. In an embodiment of the invention, the user may 
have to authenticate his identity to the website before accessing 
the cockpits. The user may enter his/her details in the one 
or more data request fields 3010a-n for authentication. In an 
exemplary scenario, the web page 3008 may include a username 
data request field 3010a, and a password data request 
field 3010b. 
[0265] The network 3004 may include a cockpit database 
3012 or server for storing a plurality of cockpits associated 
with a plurality of users or devices. Further, the cockpit database 
3012 may include a plurality of visual access menus for 
controlling one or more objects. The cockpit database 3012 
may also maintain a list of users, devices, remote devices, 
services and so forth. In an embodiment of the invention, the 
network 3004 may include an IVR application such as 
VMThings 3014. The VMThings 3014 may display graphical 
information to the user at the device 3002. In an embodiment 
of the invention, the graphical information or visual access 
menu may be displayed at a display device such as, but not 
limited to, a television, an LCD screen, an LED screen, a 
computer, a projector screen, a picture frame, and so forth. In 
an embodiment of the invention, the user may configure a 
cockpit at the device 3002 by accessing a graphical user 
interface (GUI) for configuring the cockpit through the website. 
The user may log in to the website by providing one or 
more details. Thereafter, the user may access or configure or 
customize the cockpit. The user may customize the cockpit by 
providing one or more user preferences such as font size, 
theme, color, and so forth. 
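Purely as an illustration of the website-based access described for FIG. 30, the following Python sketch shows a server-side lookup that authenticates the details entered in the data request fields and then returns either the user's stored cockpit or a standard, uncustomized one. The in-memory dictionaries, hashing scheme, and function names are hypothetical.

```python
import hashlib

# Hypothetical cockpit database keyed by username; passwords stored hashed.
USERS = {"alice": hashlib.sha256(b"secret").hexdigest()}
COCKPIT_DB = {"alice": {"tabs": ["car", "lights"], "theme": "dark"}}
STANDARD_COCKPIT = {"tabs": ["help"], "theme": "standard"}

def authenticate(username: str, password: str) -> bool:
    digest = hashlib.sha256(password.encode()).hexdigest()
    return USERS.get(username) == digest

def fetch_cockpit(username: str, password: str) -> dict:
    """Resolve the data request fields of the web page into a cockpit."""
    if not authenticate(username, password):
        raise PermissionError("authentication failed")
    # Fall back to a standard, uncustomized cockpit if none is stored.
    return COCKPIT_DB.get(username, STANDARD_COCKPIT)

print(fetch_cockpit("alice", "secret"))
```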
[0266] FIG. 31 illustrates a flowchart diagram for configuring 
a cockpit through a website, in accordance with an 
embodiment of the invention. As discussed with reference to 
FIG. 30, the user may open a website by entering its network 
address or URL in a web browser such as Internet Explorer, 
Google Chrome, etc. At step 3102, the user may open a 
website through a web browser at a device. The user may 
enter a URL associated with the website to open a webpage. 
In an embodiment of the invention, the website may include a 
plurality of webpages. In an embodiment of the invention, a 
third party may maintain the website for configuring the 
cockpit. In an embodiment of the invention, the website may 
be a website for configuring or creating or setting up a cockpit. 
Based on the URL a web page such as the web page 3008 
may be displayed at the device 3002. The web page 3008 may 
include one or more data request fields 3010a-n. 
[0267] In an embodiment of the invention, the website may 
ask the user to enter his/her personal details for authorization. 
At step 3104, the user may enter one or more personal details 
in the data request fields 3010a-n to authenticate at the website. 
The user may be allowed to access the website based on the 
authorization. The user can access a GUI for configuring the 
cockpit after authorization. At step 3106, the VMThings 3014 
may display one or more configuration options to the user. 
The user may select or choose the one or more configuration 
options to configure the cockpit. At step 3108, the VMThings 
3014 may receive selection of the one or more configuration 
options from the user. The user may select the options by 
touching the screen of the device. In an embodiment of the 
invention, the user may select the options through at least one 
of entering a combination of keys, giving a voice command, 
gestures, hand movements, and so forth. 
[0268] At step 3110, the VMThings 3014 may configure or 
create the cockpit for the user based on the selection of the 
configuration options. In an embodiment of the invention, the 
cockpit may be customized based on the one or more configuration 
options. In an embodiment of the invention, the 
user may create a plurality of cockpits based on his/her preferences. 
For example, the user may create a cockpit for handling 
home appliances, a second cockpit for handling or 
controlling office objects and so forth. Thereafter, at step 
3112, the cockpit may be displayed to the user. The VMThings 
3014 may display the cockpit at the device 3002. In an 
embodiment of the invention, the VMThings 3014 may display 
the cockpit at a display device attached to the device 
3002. The cockpit is then stored at the cockpit database 3012. 
The user may interact or control one or more objects through 
the cockpit. 
[0269] FIG. 32 illustrates a flowchart diagram for accessing 
a cockpit through a website, in accordance with an embodiment 
of the invention. As discussed with reference to FIG. 30, 
the user may access the cockpit through a website. At step 
3202, the user may open a website through a web browser at 
the device 3002. A web page 3008 based on the URL of the 
website may be displayed at the device 3002. The web page 
3008 may include one or more data request fields 3010a-n. 
The user may enter his/her details in the data request fields 
3010a-n. A website server may check whether the user is an 
authorized user or not based on the entered details. Thereafter, 
the VMThings 3014 may search the cockpit database 3012 for 
a cockpit associated with the user. In an embodiment of the 
invention, the cockpit may be present in a cloud network. 
[0270] Then at step 3206, the VMThings 3014 may display 
the cockpit specific to the user at the device 3002. In an 
embodiment of the invention, the cockpit may be displayed at 
a display device. Further, different cockpits may be displayed 
to different users based on their details. In another embodiment 
of the invention, a standard cockpit may be displayed to 
the user. The standard cockpit may be a cockpit including one 
or more objects without any specific changes according to 
different users. In an embodiment of the invention, the 
VMThings 3014 may display the cockpit at the device 3002 
based on current location of the user or the device 3002. The 
icons in the cockpit may differ depending on the location of 
the device 3002 or the user. For example, the user may be 
displayed with a first cockpit when the user is at home and 
may be displayed with a second cockpit when the user is 
travelling. In an embodiment of the invention, the location of 
the user may be determined by using a GPS system at the 
device 3002 or in the network 3004. In an embodiment of the 
invention, the location of the objects being controlled may 
change. For example, a car, pet, wife, or kids may change their 
location. Therefore, VMThings 3014 may display different 
cockpit or visual menus to the user based on the location of 
the controlled objects.
[0271] Subsequently, the user can interact with the cockpit 
at step 3208. The user may select a tab from a plurality of tabs 
or icons of the cockpit for interacting with the objects. At step 
3210, the user may be displayed with an enhanced visual 
access menu based on the selection or interaction of the user 
with the cockpit. As discussed with reference to FIG. 1A to 
FIG. 2I, the enhanced visual access menu may include one or 
more device options or the service options. The device 
options may be the icons representing one or more remote 
devices 106a-n. Similarly, the service options may be the 
icons or graphics representing one or more services 202a-n. 
In an embodiment of the invention, the cockpit may be displayed 
based on one or more preferences of the user such as 
color preference, font size, theme, language preference, and 
so forth. In an embodiment of the invention, the user may 
provide the preferences in real time. In an embodiment of the 
invention, the user preferences are pre-defined and may be 
stored at the cockpit database 3012 or the device 3002. At step 
3212, the user may interact and control one or more operations 
of the objects such as remote devices. 
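The location-dependent display described above can be sketched as follows in Python: the cockpit shown depends on how close the device is to a known place, with a travel cockpit as the fallback. The named locations, distance threshold, and the equirectangular distance approximation are illustrative assumptions only.

```python
import math

# Hypothetical named locations and the cockpit to show near each of them.
LOCATIONS = {
    "home":   {"lat": 37.39, "lon": -122.03, "cockpit": ["lights", "doors", "tv"]},
    "office": {"lat": 37.42, "lon": -122.08, "cockpit": ["printer", "projector"]},
}
TRAVEL_COCKPIT = ["car", "navigation", "music"]

def nearest_cockpit(lat: float, lon: float, radius_km: float = 1.0) -> list:
    """Pick the cockpit for the closest known place, else a travel cockpit."""
    best_name, best_dist = None, float("inf")
    for name, info in LOCATIONS.items():
        # Equirectangular approximation; adequate for short distances.
        dx = (lon - info["lon"]) * math.cos(math.radians(lat)) * 111.32
        dy = (lat - info["lat"]) * 110.57
        dist = math.hypot(dx, dy)
        if dist < best_dist:
            best_name, best_dist = name, dist
    if best_dist <= radius_km:
        return LOCATIONS[best_name]["cockpit"]
    return TRAVEL_COCKPIT

print(nearest_cockpit(37.390, -122.031))  # near home -> home cockpit
print(nearest_cockpit(36.000, -121.000))  # elsewhere -> travel cockpit
```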
[0272] FIG. 33 illustrates a flowchart diagram for configuring 
a cockpit with the help of other users, in accordance 
with an embodiment of the invention. As discussed with 
reference to FIG. 30, a user may access a website for creating 
or configuring or customizing a cockpit through a web 
browser such as Internet Explorer, Google Chrome, and so 
forth. The website may include a plurality of web pages. Each 
of the web pages may display text, images, data request fields, 
and so forth. In an embodiment of the invention, the web page 
may include audio files or video files. 
[0273] In an embodiment of the invention, the user may 
configure an Internet of Things menu by accessing a website. 
The user may login to the website and then may get access to 
various setting controls for configuring the Internet of Things 
menu based on the authorization. In an embodiment of the 
invention, the Internet of Things application, i.e., the VMThings, 
may create the Internet of Things menu for different 
users at the device. Further, the user may share the Internet of 
Things menu with other users. In an embodiment of the 
invention, the Internet of Things menu may include one or 
more options for identifiable objects. Further, the Internet of 
Things menu may be created by inviting other users. 
[0274] At step 3302, a first user may access a website for 
creating or configuring or setting up a cockpit at a first device 
such as a first device 2002 of FIG. 20A-B. The first device 
may be a smart phone. At step 3304, the user may invite one 
or more second users for configuring the cockpit for the first 
user. The first user may invite the one or more second users 
through at least one of an SMS, an MMS, an instant message, 
an e-mail, through face-to-face conversation, or phone, and so 
forth. 
[0275] At step 3306, one or more inputs may be received 
from the one or more second users. Further, the one or more 
second users may provide the one or more inputs at their 
respective second devices. In an embodiment of the invention 
the VMThings 3014 in the network 3004 may receive the one 
or more inputs from the one or more second users. At step 
3308, one or more inputs may be received from the first user. 
Further, the first user may provide the one or more inputs at 
the first device. In an embodiment of the invention, the 
VMThings 3014 may receive the inputs from the first user. 
Further, the first user and the second user may provide the 
inputs by at least one of touching the screen of their devices, 
pressing one or more keys at the devices, giving voice commands, 
gestures, hand movements, and so forth. 
[0276] At step 3310, the VMThings 3014 may configure a 
cockpit for the first user based on the one or more inputs from 
the first user and the one or more second users. In an embodiment 
of the invention, the VMThings 3014 may customize an 
already configured cockpit of the first user based on the one or 
more inputs from the first user and the one or more second 
users. Finally, at step 3312, the cockpit may be stored at the 
first device. In an embodiment of the invention, the cockpit 
may be stored at a server of the website or at the cockpit 
database 3012 in the network 3004. In an embodiment of the 
invention, the first user may provide access to the cockpit to 
the one or more second users. 
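As a minimal illustration of the collaborative configuration described for FIG. 33, the Python sketch below merges suggestions from invited second users with the first user's own inputs, letting the owner's choices win on conflicts. The input format, merge order, and function name merge_inputs are hypothetical.

```python
def merge_inputs(first_user_inputs: dict, second_user_inputs: list) -> dict:
    """Combine suggestions from invited users with the owner's own choices.

    Invited users' suggestions are applied first; the first user's inputs
    are applied last, since the cockpit ultimately belongs to that user.
    """
    cockpit = {"tabs": [], "theme": "standard"}
    for inputs in second_user_inputs + [first_user_inputs]:
        for tab in inputs.get("add_tabs", []):
            if tab not in cockpit["tabs"]:
                cockpit["tabs"].append(tab)
        if "theme" in inputs:
            cockpit["theme"] = inputs["theme"]
    return cockpit

owner = {"add_tabs": ["car"], "theme": "dark"}
invited = [{"add_tabs": ["lights", "doors"]}, {"theme": "blue"}]
print(merge_inputs(owner, invited))
# {'tabs': ['lights', 'doors', 'car'], 'theme': 'dark'}
```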
[0277] FIG. 34 illustrates a flowchart diagram for switching 
a display mode of a cockpit, in accordance with an embodiment 
of the invention. In an embodiment of the invention, the 
cockpit or the visual access menus may be displayed to the 
user based on the user's one or more preferences. Further, the 
cockpit (or visual access menus) may be displayed to the user 
based on the display capabilities of the device. For example, 
the cockpit may be displayed as a list when the device is a 
simple mobile phone and has a small display. In an embodiment 
of the invention, the cockpit may be played to the user 
depending on the user's preference. 
[0278] At step 3402, a user may access a database of visual 
access menus or cockpit through a graphical user interface 
(GUI) at a device. The GUI may provide an interface for 
creating or configuring or customizing or accessing a cockpit. 
As discussed with reference to FIG. 30, the cockpit database 
3012 may include a plurality of cockpits or visual access 
menus for different users and devices. Examples of the device 
may include, but are not limited to, a mobile phone, a smart 
phone, a laptop, an iPod, a tablet computer, a PDA, an electronic 
device, and so forth. The user may receive alerts or 
messages from the one or more objects connected through the 
cockpit or the visual access menus. At step 3404, a cockpit 
along with one or more mode options may be displayed to the 
user. Examples of the mode options may include, but are not 
limited to, video, audio, visual, text, list, and so forth. In an 
embodiment of the invention, the one or more mode options 
may be displayed at the GUI for creating/accessing cockpit. 
[0279] The user may select at least one mode option from 
the one or more mode options. A selection of the video mode 
option may play the cockpit as a video. A selection of the 
audio mode option may play the cockpit options as audio or 
music. A selection of the text mode option may display the 
cockpit options as text. Similarly, a selection of the list mode 
option may display the cockpit options as a list. At step 3406, 
a selection of the at least one mode options may be received 
from the user at the device. In an embodiment of the invention, 
the VMThings at the device may receive the selection of 
the mode option. 
[0280] Based on the selection of the mode option, the mode 
of the display of the device may be switched at step 3408. For 
example, the user may select the audio option, so the display 
may switch to audio mode and various options of the cockpit 
or the visual access menus may be played to the user. Subsequently, 
at step 3410, an audio menu may be played at the 
device when the user selects the audio mode. Thereafter, the 
user may listen to the options and may interact by providing 
one or more inputs. The one or more inputs may be provided 
through at least one of gestures, hand movements, voice commands, 
pressing one or more keys at the device, touching the
display, and so forth. For example, when a user is driving, and 
wants to access the cockpit, he may choose the audio mode 
option. Therefore, the options may be played to the user and 
he/she can interact with the cockpit accordingly. 
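The mode switch of step 3408 can be sketched as a simple dispatch over the selected mode option, as in the Python snippet below. The renderer functions are placeholders (the audio renderer, for instance, only stands in for text-to-speech playback), and all names are hypothetical.

```python
COCKPIT_OPTIONS = ["Car", "Garage", "Lights", "Thermostat"]

def render_text(options):  return "\n".join(options)
def render_list(options):  return [f"{i + 1}. {o}" for i, o in enumerate(options)]
def render_audio(options):
    # Placeholder for text-to-speech playback of each option in turn.
    return [f"<speak option '{o}'>" for o in options]
def render_video(options):
    return f"<play walkthrough video covering {len(options)} options>"

RENDERERS = {
    "text": render_text,
    "list": render_list,
    "audio": render_audio,
    "video": render_video,
}

def switch_mode(mode: str, options=COCKPIT_OPTIONS):
    """Dispatch the cockpit to the selected display mode (step 3408)."""
    try:
        return RENDERERS[mode](options)
    except KeyError:
        raise ValueError(f"unsupported mode: {mode}") from None

print(switch_mode("audio"))  # e.g. chosen while driving
```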
[0281] FIG. 35A illustrates an exemplary display of a cockpit 
along with one or more mode options, in accordance with an 
embodiment of the invention. As discussed with reference to 
FIG. 19, a user may create or configure a cockpit such as the 
cockpit 1902 at the device 102. The cockpit 1902 is an interface 
which enables a user to access various services, devices 
or objects. The cockpit 1902 may include icons 1904a-n 
representing various objects which a user or users can access 
or control. The tabs 1904a-n may be icons or text or a combination 
of these. 
[0282] As discussed with reference to FIG. 34, the VMThings 
108 may display the cockpit along with one or more 
mode options at the device 102. Examples of the mode 
options may include, but are not limited to, video, audio, 
visual, text, list, and so forth. In an embodiment of the invention, 
the one or more mode options may be displayed at a GUI 
3506 for creating/accessing cockpit as shown in FIG. 35B. 
The user may select at least one mode option from the one or 
more mode options. A selection of the video mode option may 
play the cockpit as a video. A selection of the audio mode 
option may play the cockpit options as audio or music. A 
selection of the text mode option may display the cockpit 
options as text. Similarly, a selection of the list mode option 
may display the cockpit options as a list. A display of the 
device 102 may change based on the selection of the mode 
options by the user. For example, if the user selects an audio 
mode option, an audio menu may be played at the device 102. 
Thereafter, the user may listen to the options and may interact 
by providing one or more inputs. 
[0283] As shown in FIG. 35B, the exemplary GUI 3506 may 
include one or more icons/tabs/options 3504a-n. A GUI 
option 3504a may be a Create Cockpit option. A user may 
select this option for creating or configuring or setting up a 
cockpit. A GUI option 3504b may be a Customize Cockpit 
option. The user may use this option to customize an already 
created or stored cockpit. In an embodiment of the invention, 
the cockpit may be stored at the device 102. In an embodiment 
of the invention, the cockpits are maintained by the cockpit 
database 3012 as shown in FIG. 30. A GUI option 3504c may 
be a View Cockpit option. The user may select this option to 
view the cockpits at the device 102. 
[0284] In another embodiment of the invention, a server 
may provide functionality of the VMThings. Further, the 
server may maintain all the information which would otherwise 
be provided by the VMThings. The server may maintain the 
information regarding the one or more visual access menus, 
users, devices, remote devices, services, display device, 
access device, and so forth. A user at the device such as a 
telephone may request information from the server. Further, 
the server may send the information to the requesting device 
over a network. The network may be a wired or a wireless 
network. The connection between the device and the server 
may be a wired or a wireless connection. Further, the server 
may send the information to the requesting device(s) by using 
technologies such as, but not limited to, SMS, MMS, 
e-mail, and so forth. Based on the received information, the 
content may be displayed at the device. For example, if the 
user has requested the information regarding controlling 
remote devices, then information of visual access menu 
related to remote devices may be received from the server. 
Further, the server may display the visual access menu at the 
device. In an embodiment of the invention, the server may 
also provide other functions or features of the VMThings 108 
as explained in the FIGS. 1A-2G. The user may respond or 
select an option from the displayed visual access menus 
through DTMF tones. The device may be a telephone or a 
simple mobile phone. 
[0285] In an embodiment of the invention, the user may 
access the functionalities as described above by logging into 
a second device such as a home controller. The user may see 
and control devices associated with the home controller. 
[0286] Further, the VMThings may store the user activity 
such as selection of options from the visual access menus at 
the device. This user activity information may be used by the 
VMThings for displaying the visual access menu to the same 
user next time. 
[0287] An aspect of the invention allows the user to share 
his/her cockpit for controlling one or more objects with other 
users. 
[0288] Another aspect of the invention allows the users to 
request permission to access or control the one or more 
objects of the cockpit from the other users. 
[0289] Another aspect of the invention provides a cockpit 
including multiple interfaces for controlling multiple objects 
by a user. 
[0290] An aspect of the invention enables a user to configure 
or set up a cockpit with the help of other users in his/her 
social network. Therefore, the user may invite his/her friends 
or other users to set up his cockpit. 
[0291] A further aspect of the invention allows a user to copy 
another user's cockpit. Thereafter, the user may configure his/ 
her cockpit based on the copied cockpit. 
[0292] Another aspect of the invention allows a user to 
download a cockpit from a cloud network or the Internet. 
[0293] Yet another aspect of the invention is to enable a user 
to control one or more operations of the remote devices or 
services through voice commands or gestures or hand movements. 
For example, the user may switch on an air conditioner 
(AC) by showing a thumb up gesture in front of the device. 
The device may include a camera to detect the gesture. The 
VMThings at the device (or access device) may analyze the 
gesture and control a remote device based on the analysis. 
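A minimal Python sketch of such gesture-to-action mapping follows; the gesture recognizer itself (camera plus vision model) is outside its scope, and the gesture names, device names, and command strings are hypothetical examples only.

```python
# Hypothetical mapping of recognized gestures to remote-device commands.
GESTURE_ACTIONS = {
    ("thumb_up", "air_conditioner"): "switch_on",
    ("thumb_down", "microwave"):     "switch_off",
}

def handle_gesture(gesture: str, target_device: str) -> str:
    """Translate a detected gesture into a command for a remote device."""
    action = GESTURE_ACTIONS.get((gesture, target_device))
    if action is None:
        return f"no action stored for gesture '{gesture}' on '{target_device}'"
    # A real system would forward the command over the network here.
    return f"sending '{action}' to '{target_device}'"

print(handle_gesture("thumb_up", "air_conditioner"))   # switches on the AC
print(handle_gesture("wave", "air_conditioner"))       # unknown gesture
```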
[0294] An advantage of the invention relates to visual 
access menus that may accept voice commands. Such a GUI may be 
harder for some users to use due to accent or other problems. 
The database could be provided with the option, as described 
before, for the system to output a voice command 
according to the user's selection of the options or the device 
options or the service options. The device may include a 
microphone for detecting the voice commands. VMThings 
may analyze the voice commands and may take the actions 
accordingly. Further, the disclosed system and methods allow 
the user to give voice commands in different languages. For 
example, the user may select an option by giving a voice 
command in French language. Furthermore, the user may 
select an option (or device options or service options) from 
the visual access menu through one or more gestures or hand 
movements. In an embodiment of the invention, the user may 
store one or more gestures for one or more actions. For 
example, the user may use a thumb up gesture to switch on the 
AC. Similarly the user may store a thumb down gesture to 
switch off an electronic appliance such as microwave. 
[0295] Another advantage of the invention relates to providing 
visual access menus and enhanced visual access
menus in different language(s). In an embodiment of the 
invention, the VMThings of device or the access device may 
display visual access menu or enhanced visual access menu in 
different languages. Further, the device may have one language 
and the user may want to control and communicate in 
a different language. Similarly, the VMThings may understand 
and accept voice inputs from the user in different languages 
irrespective of the device language. Therefore, the 
user may control the remote devices by giving voice commands 
in different languages such as, but not limited to, 
English, Spanish, French, Hindi, Chinese, Japanese, 
Hawaiian, German, and so forth. In an 
embodiment of the invention, the device may not support or 
understand a particular language such as Spanish, but still the 
VMThings can display the visual access menus in Spanish 
language. 
[0296] Another aspect of the invention is to provide information 
about various services to the user using a device such 
as a smart phone anytime anywhere. 
[0297] A further aspect of the invention is to enable a user to 
control operations of the remote devices through a device 
including VMThings application. The user may not have to be 
physically present near the remote devices to control them. 
[0298] Yet another aspect of the invention is to allow users 
to see the images of remote devices in real-time irrespective 
of the location of the remote devices. For example, the user 
may see the remote devices such as home appliances present 
at his/her home by being present at the office. 
[0299] Embodiments of the invention are described above 
with reference to block diagrams and schematic illustrations 
of methods and systems according to embodiments of the 
invention. It will be understood that each block of the diagrams 
and combinations of blocks in the diagrams can be 
implemented by computer program instructions. These computer 
program instructions may be loaded onto one or more 
general purpose computers, special purpose computers, or 
other programmable data processing apparatus to produce 
machines, such that the instructions which execute on the 
computers or other programmable data processing apparatus 
create means for implementing the functions specified in the 
block or blocks. Such computer program instructions may 
also be stored in a computer-readable memory that can direct 
a computer or other programmable data processing apparatus 
to function in a particular manner, such that the instructions 
stored in the computer-readable memory produce an article of 
manufacture including instruction means that implement the 
function specified in the block or blocks. 
[0300] While the invention has been described in connection 
with what is presently considered to be the most practical 
and various embodiments, it is to be understood that the 
invention is not to be limited to the disclosed embodiments, 
but on the contrary, is intended to cover various modifications 
and equivalent arrangements included within the spirit and 
scope of the appended claims. The invention has been 
described in the general context of computing devices, phone 
and computer-executable instructions, such as program modules, 
being executed by a computer. Generally, program modules 
include routines, programs, characters, components, data 
structures, etc., that perform particular tasks or implement 
particular abstract data types. A person skilled in the art will 
appreciate that the invention may be practiced with other 
computer system configurations, including hand-held 
devices, multiprocessor systems, microprocessor-based or 
programmable consumer electronics, network PCs, minicomputers, 
mainframe computers, and the like. Further, the 
invention may also be practiced in distributed computing 
worlds where tasks are performed by remote processing 
devices that are linked through a communications network. In 
a distributed computing world, program modules may be 
located in both local and remote memory storage devices. 
[0301] This written description uses examples to disclose 
the invention, including the best mode, and also to enable any 
person skilled in the art to practice the invention, including 
making and using any devices or systems and performing any 
incorporated methods. The patentable scope of the invention is 
defined in the claims, and may include other examples that 
occur to those skilled in the art. Such other examples are 
intended to be within the scope of the claims if they have 
structural elements that do not differ from the literal language 
of the claims, or if they include equivalent structural elements 
with insubstantial differences from the literal language of 
the claims. 
1. A method for enhancing interaction of a user with 
objects connected to a network, the method comprising: 
displaying a visual access menu associated with at least 
two independent objects, wherein the said two independent 
objects are produced by two independent vendors, 
further wherein a database comprises a list of said 
objects. 
2. The method of claim 1, wherein said visual access menu 
is not provided by either of said independent vendors. 
3. The method of claim 1, wherein said visual access menu 
comprises at least one icon indicating one of said objects, 
wherein said at least one icon is substantially different than 
the one provided by said vendor. 
4. The method of claim 1, wherein said database comprises 
a category attribute for said objects and a standard menu for 
said category. 
5. The method of claim 1 further comprising displaying an 
advertisement, wherein said advertisement is selected based 
on content of said visual access menu. 
6. The method of claim 1, wherein said visual access menu 
is displayed at a display device through wireless means. 
7. The method of claim 1 further comprising selecting an 
option from said visual access menu by said user through a 
voice command, wherein voice recognition enables said user 
to select said option. 
8. A method for enhancing interaction of a user with 
objects connected to a network, the method comprising: 
displaying, to said user, a visual access menu for communicating 
with one or more objects made by a vendor, 
wherein said visual access menu is not provided by said 
vendor, further wherein a database comprises a list of 
said one or more objects. 
9. The method of claim 8, wherein said one or more objects 
comprises at least two objects produced by two independent 
vendors. 
10. The method of claim 8, wherein said menu comprises at 
least one icon indicating one of said one or more objects; 
further wherein said at least one icon is substantially different 
than the one provided by said vendor. 
11. The method of claim 8, wherein said database comprises 
a category attribute for said one or more objects and a 
standard menu for said category. 
12. The method of claim 8 further comprising displaying 
an advertisement, wherein said advertisement is selected 
based on content of said visual access menu.
13. The method of claim 8, wherein said visual access 
menu is displayed at a display device through wireless means. 
14. A method for enhancing interaction of a user with 
objects connected to a network, the method comprising: 
displaying, to said user of a device, a visual access menu 
comprising an icon indicating at least one object made 
by a first vendor, wherein said icon is substantially different 
than the one provided by a second vendor, further 
wherein a database comprises a list of said objects. 
15. The method of claim 14, wherein said visual access 
menu is not provided by either of said first vendor and said 
second vendor. 
16. The method of claim 14, wherein said objects comprises 
at least two objects produced by either of said first 
vendor and said second vendor. 
17. The method of claim 14, wherein said database comprises 
a category attribute for said objects and a standard 
menu for said category. 
18. The method of claim 14 further comprising displaying 
an advertisement, wherein said advertisement is selected 
based on content of said visual access menu. 
19. The method of claim 14, wherein said visual access 
menu is displayed at a display device through a wireless 
means. 
20. The method of claim 14 further comprising selecting an 
option from said visual access menu by said user through a 
voice command, wherein voice recognition enables said user 
to select said option. 
* * * * *

More Related Content

PDF
Systems and methods for visual presentation and selection of ivr menu
PDF
TC74VCX244FT PSpice Model (Free SPICE Model)
TXT
Samrt attendance system using fingerprint
PDF
Аварийный дамп – чёрный ящик упавшей JVM. Андрей Паньгин
TXT
ambil aja
PDF
The Ring programming language version 1.9 book - Part 130 of 210
DOCX
PDF
Ramirez entorno
Systems and methods for visual presentation and selection of ivr menu
TC74VCX244FT PSpice Model (Free SPICE Model)
Samrt attendance system using fingerprint
Аварийный дамп – чёрный ящик упавшей JVM. Андрей Паньгин
ambil aja
The Ring programming language version 1.9 book - Part 130 of 210
Ramirez entorno

What's hot (6)

TXT
PPT
Bowling Game Kata
PDF
ipython notebook poc memory forensics
PDF
Learning iPython Notebook Volatility Memory Forensics
PDF
Instruction Manual Minelab Eureka Gold Metal Detector English Language 4901 ...
PDF
Sentença Fepese
Bowling Game Kata
ipython notebook poc memory forensics
Learning iPython Notebook Volatility Memory Forensics
Instruction Manual Minelab Eureka Gold Metal Detector English Language 4901 ...
Sentença Fepese
Ad

Viewers also liked (20)

PPS
маите 2 част
PPS
самостоятелна работа 1
PPT
Opti cal inc.
PPS
лапландия духът на коледа 5
PDF
People &amp; Performance UK
 
PDF
Method and apparatus for automated negotiation for resources on a switched un...
DOC
самостоятелна работа чо
ODP
Egyptian Museum
PPS
почва
PPT
Enabling Active Flow Manipulation In Silicon-based Network Forwarding Engines
PPTX
панаир на проектите разград горна оряховица
ODP
" new Mercedes-Benz "
PPT
Open Programmability
PPS
коледна украса 4
PPT
Lambda Data Grid: An Agile Optical Platform for Grid Computing and Data-inten...
PPTX
равносметка
PPT
Swine Flu
PPT
звук и буква оо цв.гергова
PPTX
урок 42
PDF
Аз и буквите_2008
маите 2 част
самостоятелна работа 1
Opti cal inc.
лапландия духът на коледа 5
People &amp; Performance UK
 
Method and apparatus for automated negotiation for resources on a switched un...
самостоятелна работа чо
Egyptian Museum
почва
Enabling Active Flow Manipulation In Silicon-based Network Forwarding Engines
панаир на проектите разград горна оряховица
" new Mercedes-Benz "
Open Programmability
коледна украса 4
Systems and methods for electronic communications

Drawings (Sheets 1-64):

Sheets 1-9 (FIGS. 1A-1I): exemplary environments of the first embodiment, connecting a device 102 with a VMThings 108 to remote devices 106a-n, including variants with a server 114 and web page 110 (user ID and password), a display device and access device 116, and ZigBee, WiMAX, GSM, and combined local network/Internet connections with a bridge device.
Sheets 10-18 (FIGS. 2A-2I): corresponding environments and network variants of the second embodiment, in which the objects are services 202a-n.
Sheets 19-22 (FIGS. 3A-3D): exemplary visual access menus and enhanced visual access menus at the device 102, with options such as Remote Devices, Services, Vehicle, AC, Camera, Microwave, Entertainment, Travel, Banking, and Hotels.
Sheet 23 (FIG. 4): enhanced visual access menu 402 with device options 404a-l (e.g., Car, Truck, Regulate).
Sheet 24 (FIG. 5): enhanced visual access menu 502 with service options 504a-k (e.g., Banking, Transfer, Details, Check Bill, Entertainment, Travel).
Sheet 25 (FIG. 6): components of the device 102, including a display, radio interface, processor, network interface, input/output interface, and a memory holding the graphical user interface, database, and VMThings, together with memory card, keyboard, mouse, and USB peripherals.
Sheet 26 (FIG. 7): corresponding components of the access device 116, including ports.
Sheets 27-42 (FIGS. 8-17C): flowchart diagrams for controlling remote devices and services through a device, a web browser, a website, and an access device (accessing a database of visual access menus through a GUI, displaying a visual access menu, displaying an enhanced visual access menu based on a selection of an option by the user, receiving a selection of a device or service option, connecting to the remote device or service, and controlling it or displaying information about it).
Sheets 43-44 (FIGS. 18A-18B): display of images of remote devices and transfer of that display from one device to another device.
Sheet 45 (FIG. 19): exemplary cockpit 1902 at the device 102, with objects 1904a-n such as IVR control, remote device control, services, Outlook, Calendar, other e-mails, messengers, games, and other objects.
Sheets 46-47 (FIGS. 20A-20B): environments in which a first device shares a cockpit with a second device through a network and a proxy server.
Sheets 48-62 (FIGS. 21-34): flowchart diagrams (and, in FIG. 30, an environment) for sharing a cockpit with second users and translating it to their preferences, creating and customizing cockpits from configuration settings, other users' ratings and profiles, downloading cockpits from a network or website, configuring a cockpit with the help of invited users, and switching the cockpit display mode.
Sheet 63 (FIG. 35A): the cockpit of FIG. 19 with audio, video, text, and list mode options 3502a-n.
Sheet 64 (FIG. 35B): a graphical user interface with Create Cockpit, Customize Cockpit, View Cockpit, and Invite Users options 3504a-n and the mode options 3502a-n.
SYSTEMS AND METHODS FOR ELECTRONIC COMMUNICATIONS

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application is a Continuation-In-Part (CIP) of U.S. Non-Provisional application Ser. No. 13/245,804 entitled 'Systems and Methods for Electronic Communications' and filed on Sep. 26, 2011.

[0002] This application is a Continuation-In-Part (CIP) of U.S. Non-Provisional application Ser. No. 13/272,212 entitled 'Systems and Methods for Electronic Communications' and filed on Oct. 12, 2011.

FIELD OF THE INVENTION

[0003] The present invention relates to electronic communications in a network, and more specifically to systems and methods for accessing and controlling one or more objects (physical or virtual), such as remote devices and services, from a remote location by a user.

BACKGROUND OF THE INVENTION

[0004] Electronic devices are frequently used in day-to-day life. Such devices may include televisions, refrigerators, air conditioners, fans, tube lights, cameras, or other electronic equipment such as transmitters, antennas, etc. All of these devices consume power regularly or at frequent intervals. For efficient power consumption, the electronic devices must be controlled or switched ON/OFF.

[0005] Appliances such as fans, tube lights, or microwaves may be controlled by regulating the electrical parameters associated with them. For example, a user may control the speed of a fan or regulate the operating power of a microwave as required. However, this requires the physical presence of the user to regulate or switch ON/OFF the appliances. A technique for controlling appliances with a remote control device is well known: the remote control device transmits signals for controlling the appliances and may, for example, simultaneously control air conditioners, fans, or cameras as required. However, the technique is limited by the location of the user. Moreover, it is incapable of updating the user with the real-time status of the appliances.

[0006] Another available technique discloses a smart device for controlling the appliances. The smart device is connected to the internet and to the appliances, and a user connected to the smart device via the internet may control the appliances from a remote location. The user may also control the appliances by connecting to a processing device via a communication channel; the processing device may be located near the smart device and may receive signals from the user to control the appliances. However, the technique requires installation of a smart device and/or processing device for controlling the appliances from a remote location.

[0007] Another available technique discloses real-time position monitoring of vehicles. The user may monitor the real-time coordinates of a vehicle based on information received from a transmitter located in the vehicle, via a GPS server. However, the user is unable to control or update the positional coordinates of the vehicle as desired.

[0008] In light of the above discussion, systems and methods are desired for providing real-time control of electronic devices and services from a remote location.

SUMMARY

[0009] Embodiments of the invention provide a system for enhancing interaction of a user with objects connected to a network. The system includes a processor, a display screen, and a memory coupled to the processor. The memory comprises a database including a list of two or more objects and instructions executable by the processor to display a menu. The menu is associated with at least two independent objects. Further, the two independent objects are produced by at least two independent vendors.

[0010] Embodiments of the invention further provide a system for enhancing interaction of a user with objects connected to a network. The system includes a processor, a display screen, and a memory coupled to the processor. The memory includes a database comprising a list of one or more objects and instructions executable by the processor to display a menu to the user. The menu includes an icon which may indicate one object made by a vendor. Further, the icon is substantially different from the icon provided by said vendor.

[0011] Embodiments of the invention provide a method for accessing and controlling remote devices in a network. The method includes accessing a database of visual access menus through a graphical user interface (GUI) at a device. Further, the method includes displaying a visual access menu at the device. The visual access menu may include one or more options. The device may include an Internet of Things application such as a VMThings for displaying the visual access menu at the device. The VMThings also enables a user of the device to control the remote devices. The VMThings may be configured to create an Internet of Things menu including representations of recognizable objects. The objects may be physical objects or virtual objects. The Internet of Things menu may be a menu of identifiable objects (physical or virtual) connected in an Internet-like structure. Through the visual access menu, the user may control the remote devices irrespective of their location. The user may select an option from the visual access menu. The method further includes displaying an enhanced visual access menu based on the selection of an option received from the user. The enhanced visual access menu may include one or more device options depending on the selected option. The device options are representations corresponding to the remote devices. The method further includes receiving a selection of a device option from the user, connecting to a remote device based on the selection of the device option, and controlling one or more operations of the connected remote device based on the selection of the device option.
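The system of paragraphs [0009]-[0011] is described only in functional terms; the names below (ManagedObject, VisualAccessMenu, and their fields) are illustrative assumptions, not an interface defined by the application. This minimal sketch shows one way a single menu could list objects produced by independent vendors and carry icons that differ from the vendors' own.

```python
from dataclasses import dataclass, field

# Minimal sketch with hypothetical names; the application does not define
# concrete types for the database of objects or the menu.

@dataclass
class ManagedObject:
    """A physical or virtual object listed in the device's database."""
    object_id: str
    name: str
    vendor: str      # independent objects in one menu may come from different vendors
    kind: str        # e.g. "remote_device" or "service"
    icon: str = ""   # icon shown in the menu; may differ from the vendor's own icon

@dataclass
class VisualAccessMenu:
    """A menu associated with two or more independent objects."""
    title: str
    objects: list = field(default_factory=list)

    def vendors(self):
        return {obj.vendor for obj in self.objects}

# Example: one menu spanning objects produced by two independent vendors.
menu = VisualAccessMenu(
    title="Home",
    objects=[
        ManagedObject("ac-1", "Air Conditioner", vendor="VendorA",
                      kind="remote_device", icon="custom_ac.png"),
        ManagedObject("cam-1", "Camera", vendor="VendorB",
                      kind="remote_device", icon="custom_camera.png"),
    ],
)
assert len(menu.vendors()) >= 2
```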
[0012] Embodiments of the invention provide a method for accessing and controlling services from a remote location. The method includes accessing, by a user of a device, a database of visual access menus through a graphical user interface (GUI) at the device. Further, the method includes displaying a visual access menu at the device. The visual access menu may include one or more options. The device may include an Internet of Things application, i.e. a VMThings, for displaying the visual access menu at the device. Further, the VMThings may create an Internet of Things menu including one or more identifiable objects connected to each other in an Internet-like structure. The VMThings may display the visual access menu at the device to enable the user to control the remote services. The method further includes displaying an enhanced visual access menu based on the selection of an option received from the user. The enhanced visual access menu may include one or more service options depending on the selected option. The service options are representations corresponding to the services. The method further includes receiving a selection of a service option from the user, connecting to a service based on the selection of the service option, and connecting the device to the service. Furthermore, the method includes controlling and displaying information about the service at the device based on the selection of the service option.

[0013] Embodiments of the invention also provide a device for accessing and controlling remote devices in a network. The device may include an Internet of Things application, i.e. a VMThings, configured to enable a user of the device to access a database including visual access menus through a GUI. Further, the VMThings is configured to create an Internet of Things menu including one or more identifiable objects connected in an Internet-like structure. The VMThings may display a visual access menu including one or more options at the device. Further, the VMThings may display an enhanced visual access menu at the device based on the selection of an option received from the user. The enhanced visual access menu may include one or more device options depending on the selected option. The device options are representations corresponding to the remote devices. The VMThings may further receive a selection of a device option from the user, connect the device to a remote device based on the selection of the device option, and control one or more operations of the connected remote device based on the selection of the device option.

[0014] Embodiments of the invention also provide a device for accessing and controlling services in a network from a remote location. The device may include an Internet of Things application such as a VMThings configured to enable a user of the device to access a database including visual access menus through a GUI. The VMThings is also configured to display a visual access menu including one or more options at the device. Further, the VMThings may display an enhanced visual access menu at the device based on the selection of an option received from the user. The enhanced visual access menu may include one or more service options depending on the selected option. The service options are representations corresponding to the services located remotely. The VMThings may further receive a selection of a service option from the user, connect the device to a service based on the selection of the service option, and control and display information of the service at the device based on the selection of the service option.

[0015] Embodiments of the invention also provide a system for accessing and controlling remote devices. The system includes a display device configured to display one or more visual access menus. Further, the system includes an access device connected to the display device. The access device may include an Internet of Things application, i.e. a VMThings, configured to display the one or more visual access menus, including one or more options to control the remote devices, at the display device. The user may create or configure an Internet of Things menu through a graphical user interface at the device. In an embodiment of the invention, the VMThings may be configured to create the Internet of Things menu. The VMThings is further configured to enable a user of the access device to access a database including the visual access menus through a GUI. The VMThings may display an enhanced visual access menu at the device based on the selection of an option received from the user. The enhanced visual access menu may include one or more device options depending on the selected option. The device options are representations corresponding to the remote devices. The VMThings may further receive a selection of a device option from the user, connect the device to a remote device based on the selection of the device option, and control one or more operations of the connected remote device based on the selection of the device option.

[0016] Embodiments of the invention also provide a system for accessing and controlling services in a network from a remote location. The system may include a display device configured to display one or more visual access menus and an access device connected to the display device. The access device may include an Internet of Things application, i.e. a VMThings, configured to display the one or more visual access menus, including one or more options to control the remote devices, at the display device. The VMThings is further configured to enable a user of the access device to access a database including the visual access menus through a Graphical User Interface (GUI). The GUI may be used for creating an Internet of Things menu including a plurality of identifiable objects in a network-like structure. The identifiable objects may be physical objects or virtual objects. Further, the VMThings may display an enhanced visual access menu at the device based on the selection of an option received from the user. The enhanced visual access menu may include one or more service options depending on the selected option. The service options are representations corresponding to the services. The VMThings may further receive a selection of a service option from the user, connect the device to a remote device based on the selection of the service option, and control and display information about the service based on the selection of the service option.

[0017] Embodiments of the invention further provide a method for accessing and controlling the remote devices in a network through a web browser. The method includes opening a webpage in the web browser at a device including a VMThings. The method may further include displaying a visual access menu at the device. The VMThings may create or display the visual access menu or an Internet of Things menu at the device. The Internet of Things menu may include a plurality of representations corresponding to identifiable objects, which may be physical objects or virtual objects. The visual access menu may include one or more options. Further, the method includes displaying an enhanced visual access menu at the device based on the selection of an option received from the user. The enhanced visual access menu may include one or more device options depending on the selected option. The device options are representations corresponding to the remote devices. The method further includes receiving a selection of a device option from the user, connecting to a remote device based on the selection of the device option, connecting the device to the remote device, and controlling one or more operations of the connected remote device based on the selection of the device option.
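Paragraphs [0011]-[0017] all follow the same display, select, connect, and control flow around the VMThings application. The sketch below is a minimal, hypothetical rendering of that flow; the class and method names (VMThings, display_menu, connect, control) and the addresses are assumptions for illustration, not an interface defined by the application.

```python
# Minimal sketch of the display -> select -> connect -> control flow.
# All names and addresses below are illustrative assumptions.

class VMThings:
    def __init__(self, database):
        self.database = database  # {menu title: [device or service options]}

    def display_menu(self, title):
        """Display a visual access menu and return its options."""
        options = self.database[title]
        for index, option in enumerate(options):
            print(f"{index}: {option['label']}")
        return options

    def connect(self, option):
        """Placeholder for opening a connection to the selected remote device."""
        print(f"connecting to {option['label']} at {option['address']} ...")
        return option

    def control(self, device, operation):
        """Placeholder for sending a control operation such as switch on/off."""
        print(f"sending '{operation}' to {device['label']}")


database = {"Remote Devices": [
    {"label": "AC", "address": "10.0.0.21"},
    {"label": "Camera", "address": "10.0.0.22"},
]}

app = VMThings(database)
options = app.display_menu("Remote Devices")   # visual access menu
device = app.connect(options[0])               # user selects a device option
app.control(device, "switch_on")               # control the connected device
```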
[0018] Embodiments of the invention further provide a method for accessing and controlling the services in a network through a web browser. The method includes opening a webpage in the web browser at a device including an Internet of Things application, i.e. a VMThings, configured to enable a user of the device to access a database including the visual access menus through a GUI. The method further includes displaying a visual access menu at the device; the VMThings may display the visual access menu, which may include one or more options. Further, the method includes displaying an enhanced visual access menu at the device based on the selection of an option received from the user. The enhanced visual access menu may include one or more service options depending on the selected option. The service options are representations corresponding to the services. The method further includes receiving a selection of a service option from the user, connecting to a service based on the selection of the service option, connecting the device to the service, and controlling and displaying the information of the service based on the selection of the service option.

[0019] An aspect of the invention is to enable a user to control one or more operations of the remote devices or services through voice commands, gestures, or hand movements. For example, the user may switch on an air conditioner (AC) by showing a thumbs-up gesture in front of the device. The device may include a camera to detect the gesture. The VMThings at the device (or access device) may analyze the gesture and control a remote device based on the analysis.

[0020] An aspect of the invention is to transfer the display of a device to another device. The other device may be connected to the device through wireless means.

[0021] Another aspect of the invention is to create a database of visual access menus or enhanced visual access menus. The visual access menus and the enhanced visual access menus are visual menus for controlling one or more objects such as, but not limited to, remote devices, services, and so forth.
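Paragraph [0019] describes mapping recognized gestures or voice commands to device operations but leaves the mapping unspecified. Below is a minimal sketch under that reading; the gesture labels and the GESTURE_COMMANDS table are assumptions for illustration, and gesture recognition itself (camera input, recognition model) is out of scope here.

```python
# Minimal sketch: translate an already-recognized gesture label into a
# device operation, as in the thumbs-up example of paragraph [0019].
# The labels and target devices below are illustrative assumptions.

GESTURE_COMMANDS = {
    "thumbs_up": ("AC", "switch_on"),
    "thumbs_down": ("AC", "switch_off"),
    "wave": ("lights", "toggle"),
}

def handle_gesture(gesture):
    """Look up the gesture and return the (device, operation) to perform."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return None  # unrecognized gesture: do nothing
    device, operation = command
    print(f"{operation} -> {device}")
    return command

handle_gesture("thumbs_up")  # e.g. the camera detected a thumbs-up
```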
BRIEF DESCRIPTION OF THE DRAWINGS

[0022] Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:

[0023] FIG. 1A illustrates an exemplary environment, in accordance with a first embodiment of the invention;

[0024] FIG. 1B illustrates another exemplary environment, in accordance with the first embodiment of the invention;

[0025] FIG. 1C illustrates yet another exemplary environment, in accordance with the first embodiment of the invention;

[0026] FIG. 1D illustrates an environment based on a ZigBee network, in accordance with the first embodiment of the invention;

[0027] FIG. 1E illustrates an environment based on a WiMAX network, in accordance with the first embodiment of the invention;

[0028] FIG. 1F illustrates an environment based on a Global System for Mobile Communication (GSM) network, in accordance with the first embodiment of the invention;

[0029] FIG. 1G illustrates an environment based on a ZigBee network, in accordance with the first embodiment of the invention;

[0030] FIG. 1H illustrates an environment based on a WiMAX network, in accordance with the first embodiment of the invention;

[0031] FIG. 1I illustrates an environment based on a combination of a local network and the Internet, in accordance with the first embodiment of the invention;

[0032] FIG. 2A illustrates an exemplary environment, in accordance with a second embodiment of the invention;

[0033] FIG. 2B illustrates another exemplary environment, in accordance with the second embodiment of the invention;

[0034] FIG. 2C illustrates yet another exemplary environment, in accordance with the second embodiment of the invention;

[0035] FIG. 2D illustrates an environment based on a ZigBee network, in accordance with the second embodiment of the invention;

[0036] FIG. 2E illustrates an environment based on a WiMAX network, in accordance with the second embodiment of the invention;

[0037] FIG. 2F illustrates an environment based on a GSM network, in accordance with the second embodiment of the invention;

[0038] FIG. 2G illustrates an environment based on a ZigBee network, in accordance with the second embodiment of the invention;

[0039] FIG. 2H illustrates an environment based on a WiMAX network, in accordance with the second embodiment of the invention;

[0040] FIG. 2I illustrates an environment based on a combination of a local network and the Internet, in accordance with the second embodiment of the invention;

[0041] FIG. 3A illustrates an exemplary visual access menu and enhanced visual access menu at a device, in accordance with the first embodiment of the invention;

[0042] FIG. 3B illustrates an exemplary visual access menu and enhanced visual access menu at the device, in accordance with the second embodiment of the invention;

[0043] FIG. 3C illustrates another exemplary visual access menu and enhanced visual access menu at the device, in accordance with the first embodiment of the invention;

[0044] FIG. 3D illustrates another exemplary visual access menu and enhanced visual access menu at the device, in accordance with the second embodiment of the invention;

[0045] FIG. 4 illustrates an exemplary enhanced visual access menu including one or more device options, in accordance with an embodiment of the invention;

[0046] FIG. 5 illustrates an exemplary enhanced visual access menu including one or more service options, in accordance with an embodiment of the invention;

[0047] FIG. 6 illustrates exemplary components of a device, in accordance with an embodiment of the invention;

[0048] FIG. 7 illustrates exemplary components of an access device, in accordance with an embodiment of the invention;

[0049] FIG. 8 illustrates a flowchart diagram for controlling remote devices, in accordance with an embodiment of the invention;

[0050] FIG. 9 illustrates a flowchart diagram for controlling remote services, in accordance with an embodiment of the invention;

[0051] FIGS. 10A, 10B, and 10C illustrate a flowchart diagram for controlling objects by using a device in a network, in accordance with an embodiment of the invention;

[0052] FIG. 11 illustrates a flowchart diagram for controlling remote devices by using a web browser at a device, in accordance with an embodiment of the invention;

[0053] FIG. 12 illustrates a flowchart diagram for controlling remote services by using a web browser at a device, in accordance with an embodiment of the invention;

[0054] FIGS. 13A, 13B, and 13C illustrate a flowchart diagram for controlling objects in a network through a web browser at a device, in accordance with an embodiment of the invention;

[0055] FIG. 14 illustrates a flowchart diagram for controlling remote devices through a website, in accordance with another embodiment of the invention;

[0056] FIG. 15 illustrates a flowchart diagram for controlling remote devices by using an access device in a network, in accordance with an embodiment of the invention;

[0057] FIG. 16 illustrates a flowchart diagram for controlling remote services by using an access device in a network, in accordance with an embodiment of the invention;

[0058] FIGS. 17A, 17B, and 17C illustrate a flowchart diagram for controlling objects in a network through an access device, in accordance with an embodiment of the invention;

[0059] FIG. 18A illustrates an exemplary display of images of remote devices, in an embodiment of the invention;

[0060] FIG. 18B illustrates transfer of an exemplary display of images from a device to another device, in an embodiment of the invention;

[0061] FIG. 19 illustrates an exemplary cockpit, in accordance with an embodiment of the invention;

[0062] FIGS. 20A-B illustrate exemplary environments for providing access of a cockpit of a user to other users, in accordance with an embodiment of the invention;

[0063] FIG. 21 illustrates a flowchart diagram for providing access control of a cockpit to one or more second users, in accordance with an embodiment of the invention;

[0064] FIG. 22 illustrates a flowchart diagram for providing access control of the cockpit to one or more second users, in accordance with another embodiment of the invention;

[0065] FIG. 23 illustrates a flowchart diagram for configuring a cockpit based on a user's preference, in accordance with an embodiment of the invention;

[0066] FIG. 24 illustrates a flowchart diagram for configuring a cockpit, in accordance with an embodiment of the invention;

[0067] FIG. 25 illustrates a flowchart diagram for customizing a cockpit based on other users' reviews, in accordance with an embodiment of the invention;

[0068] FIG. 26 illustrates a flowchart diagram for downloading and customizing a cockpit at a second device, in accordance with an embodiment of the invention;

[0069] FIG. 27 illustrates a flowchart diagram for configuring a cockpit based on another cockpit of another user, in accordance with an embodiment of the invention;

[0070] FIG. 28 illustrates a flowchart diagram for configuring a cockpit based on another cockpit of another user, in accordance with another embodiment of the invention;

[0071] FIG. 29 illustrates a flowchart for downloading a cockpit from a network, in accordance with an embodiment of the invention;

[0072] FIG. 30 illustrates an environment for accessing a cockpit through a website, in accordance with an embodiment of the invention;

[0073] FIG. 31 illustrates a flowchart diagram for configuring a cockpit through a website, in accordance with an embodiment of the invention;

[0074] FIG. 32 illustrates a flowchart diagram for accessing a cockpit through a website, in accordance with an embodiment of the invention;

[0075] FIG. 33 illustrates a flowchart diagram for configuring a cockpit with the help of other users, in accordance with an embodiment of the invention;

[0076] FIG. 34 illustrates a flowchart diagram for switching a display mode of a cockpit, in accordance with an embodiment of the invention; and

[0077] FIG. 35B illustrates an exemplary display of a GUI along with one or more mode options, in accordance with an embodiment of the invention.

DETAILED DESCRIPTION OF THE INVENTION

[0078] Illustrative embodiments of the invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like numbers refer to like elements throughout.
may have a limited display or may not have a display at all. Examples of the device 102 may include a mobile phone, a smart phone, a computer, a personal digital assistant (PDA), a tablet computer, a laptop, and so forth.

[0080] The network 104 can be a wired network or a wireless network or a combination of these. The wireless network may use wireless technologies to provide connectivity among various devices. Examples of the wireless technologies include, but are not limited to, Wi-Fi, WiMAX, fixed wireless data, ZigBee, Radio Frequency for Consumer Electronics (RF4CE), HomeRF, IEEE 802.11, 4G or Long Term Evolution (LTE), Bluetooth, Infrared, spread-spectrum, Near Field Communication (NFC), Global System for Mobile Communication (GSM), and Digital-Advanced Mobile Phone Service (D-AMPS). The device 102 is connected to the plurality of remote devices 106a-n through the network 104. Examples of the wired network include, but are not limited to, Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), and so forth. In an embodiment of the invention, the network 104 is the Internet.

[0081] The plurality of remote devices 106a-n can be electronic equipment such as, but are not limited to, household devices including electric lights, water pump, generator, fans, television (TV), cameras, microwave, doors, windows, computer, garage locks, security systems, air-conditioners (AC), and so forth. In an embodiment of the invention, the plurality of the remote devices 106a-n can be vehicles such as cars, trucks, vans, and so forth. In an embodiment of the invention, the VMThings 108 may present a standard menu (or a standard visual access menu) for controlling all remote devices 106a-n to the user. The user may be provided with different visual access menus based on the location of the remote devices 106a-n. For example, different visual access menus may be displayed to the user for remote devices present in the office, home, factory, and so forth. In another embodiment of the invention, the VMThings 108 may display a customized menu at the device 102 based on user preferences and/or access pattern. In an embodiment of the invention, the user may configure the VMThings 108 to control remote devices 106a-n present in more than one building. The buildings may be present at different locations. Similarly, the user may control the one or more remote devices 106a-n located in his/her office from the home. For example, the user may control the door of his/her office cabin, or may switch on or switch off his/her office computer/laptop, AC, and so forth. In an embodiment of the invention, the user may control operations of one or more remote devices 106a-n present in a factory from the home. Further, the user may access the plurality of remote devices 106a-n from a remote location by using the device 102. Further, the user may use the same device 102 for controlling the remote devices located at different locations such as office, factory, home, etc. The user does not have to carry different or multiple devices for controlling different remote devices 106a-n. The device 102 may include a database including a list of one or more objects. In an embodiment of the invention, the device 102 may include audio or visual menus of the one or more objects, i.e., of the remote devices 106a-n. The device 102 may include visual access menus and/or enhanced visual access menus corresponding to various objects. The visual access menu may provide an interface to the user to control the one or more objects such as remote devices 106a-n. The visual access menu may include one or more options such as, but are not limited to, a remote devices option, a services option, and so forth. In an embodiment of the invention, the visual access menus at the device 102 may be updated regularly at a predefined time interval, such as after every two days or once a week. The enhanced visual access menus may include one or more device options. In an embodiment of the invention, the device 102 may include a touch sensitive display. In such a scenario, the user may access the one or more options or the device options by touching the options directly. In an embodiment of the invention, the user may connect to the one or more objects such as the remote devices 106a-n through applications such as, but are not limited to, Skype, Google Talk, Yahoo Messenger, Magic Jack, and so forth.

[0082] Further, the device 102 may include the VMThings 108, which is configured to enable the user to access the visual access menus through a Graphical User Interface (GUI) at the device 102. The VMThings 108 may enable the user to control the remote devices 106a-n irrespective of their location through the network 104. The VMThings 108 may display the one or more visual access menus at the device 102. Further, the device 102 may include visual access menus associated with at least two independent objects. In an embodiment of the invention, the at least two independent objects may be produced by two independent vendors. In an embodiment of the invention, the device may include vendor-specific visual access menus or enhanced visual access menus for the remote devices 106a-n. Further, the device 102 may also include standard menu(s) for accessing the objects. The VMThings 108 may display the visual access menu depending on the independent vendor(s) of the one or more objects. In another embodiment of the invention, the VMThings 108 may display a visual access menu which is not provided by either of the at least two independent vendors of the at least two independent objects. In an embodiment of the invention, the user may access and control one or more of the remote devices 106a-n from the remote location by using the device 102. For example, the user may use his/her smart phone to access and operate a microwave at his/her home from his/her office. Further, the user can use the device 102 at one location to monitor and regulate one or more operations of the remote devices 106a-n present at another location. The one or more operations may be such as, but are not limited to, switch on, switch off, regulate, and so forth.

[0083] Further, the visual access menus may include at least one icon indicating one or more objects such as the remote devices 106a-n. Further, the icon is substantially different from the icons provided in the visual access menu provided by the vendor. Further, the remote devices 106a-n may be grouped into various categories such as, but are not limited to, electronic appliances, home devices, buildings, doors, room appliances, switches, floor-wise, and so forth. Further, the remote devices 106a-n may be grouped according to location of the remote devices, such as home devices, office devices, garage devices, factory devices, home2 devices, farm house devices, and so forth.
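The grouping of remote devices into location- or category-based visual access menus described in paragraph [0083] can be pictured with a short sketch. The following Python is only a minimal illustration of one possible data layout; the RemoteDevice and build_menus names are hypothetical and do not come from the specification.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class RemoteDevice:
    device_id: str      # unique remote device identity (ID)
    name: str           # e.g. "Air conditioner", "Garage door"
    vendor: str         # objects may be produced by independent vendors
    location: str       # e.g. "home", "office", "factory"

def build_menus(devices):
    """Group registered devices by location so that a separate
    visual access menu can be shown for each location."""
    menus = defaultdict(list)
    for device in devices:
        menus[device.location].append(device.name)
    return dict(menus)

devices = [
    RemoteDevice("dev-1", "Air conditioner", "VendorA", "home"),
    RemoteDevice("dev-2", "Security camera", "VendorB", "office"),
    RemoteDevice("dev-3", "Garage door", "VendorA", "home"),
]

# One menu per location, e.g. {'home': [...], 'office': [...]}
print(build_menus(devices))
```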
The VMThings 108 of the device 102 may store visual access menus and enhanced visual access menus corresponding to the remote devices 106a-n based on the various categories of the remote devices 106a-n. Each of the remote devices 106a-n may have a unique remote device identity (ID). In an embodiment of the invention, the user may be required to register the remote devices 106a-n with the device 102 so that the remote devices 106a-n may be controlled by using the VMThings 108. In an embodiment of the invention, the user may be required to authenticate or prove his/her identity at the device 102 or for the remote devices 106a-n before controlling one or more operations of the remote devices 106a-n.

[0084] Further, the VMThings 108 may display an enhanced visual access menu corresponding to the remote devices 106a-n. The enhanced visual access menu may include one or more device options. The device options may be displayed as graphics or icons and/or text representations of the remote devices 106a-n. For example, a car may be displayed for representing the car option. The user may control the remote devices 106a-n by selecting a device option from the device options at the device 102. Further, the enhanced visual access menu may display the grouping or categories of the remote devices 106a-n. The VMThings 108 may also translate the visual access menu or the enhanced visual access menu from a first language to a second language. Examples of the first language and the second language may include, but are not limited to, Spanish, French, English, Sanskrit, Hindi, Urdu, Arabic, and so forth. For example, the VMThings may translate an English visual access menu into a French visual access menu, and thereafter it may be displayed at the device 102. The VMThings 108 may display the visual access menu or the enhanced visual access menu at the device 102 based on the user's preferred language.

[0085] The user may select an option from the visual access menu or an enhanced visual access menu. Further, the user may select an option (or device options) by using a combination of keys on a keypad of the device 102. In an embodiment of the invention, the user may select an option by clicking the option or the device option by using a mouse device. In an embodiment of the invention, the user may select an option by touching the screen of the device 102. For example, if the user wants to switch on an air conditioner (AC) on the way towards home, the user can select or enter an appropriate key combination on the device 102 or may touch (in case of a touch sensitive display at the device 102) an option of the visual access menu corresponding to the AC.

[0086] In one embodiment, the user can give a voice command to the device 102. Based on the input received by the device 102, the air conditioner may be switched on automatically. Further, the user can also regulate the cooling of the room by changing temperature settings of the air conditioner. After connecting the device 102 to one or more of the remote devices 106a-n, the user can control the one or more operations such as, but are not limited to, switch on, switch off, reduce temperature, and so forth from a distant location without being physically present at the location. In one embodiment, the remote devices 106a-n can be security cameras or an alarm station installed at the home location of the user.

[0087] In an embodiment of the invention, the user may select an option by making gestures or hand movements at the device. For example, the user may do a thumbs-up gesture to switch on an appliance at home or may do a thumbs-down gesture to switch off the same. Similarly, the user may do other gestures such as, but are not limited to, waving a hand, nodding the head, smiling, blinking an eye, and so forth. In an embodiment of the invention, the device may include a camera for detecting the gestures or hand movements. In an embodiment of the invention, the VMThings 108 may be configured to analyze and interpret the gestures and hand movements.
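Paragraph [0087] maps specific gestures (for example a thumbs-up or thumbs-down) to control actions, with the VMThings comparing a detected gesture against gestures the user has defined. The mapping step could look roughly like the sketch below; the gesture labels and the assumption that a camera pipeline has already classified the gesture are illustrative only, not the patented implementation.

```python
# Hypothetical user-defined gesture table: gesture label -> (device, action).
STORED_GESTURES = {
    "thumb_up": ("home_appliance", "switch_on"),
    "thumb_down": ("home_appliance", "switch_off"),
    "wave_hand": ("lights", "toggle"),
}

def handle_gesture(detected_gesture):
    """Match a gesture label (already classified by the camera pipeline)
    against the stored gestures and return the action to perform."""
    match = STORED_GESTURES.get(detected_gesture)
    if match is None:
        return None                      # unrecognised gesture: do nothing
    device, action = match
    print(f"Sending '{action}' to {device}")
    return match

handle_gesture("thumb_up")    # Sending 'switch_on' to home_appliance
```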
Further, the VMThings 108 may include stored gestures defined by the user at the device 102 and may compare or match the real-time gestures with the stored gestures. The device may include software or hardware such as a microphone for detecting the voice commands or audio inputs.

[0088] In another embodiment of the invention, the VMThings 108 may be configured to analyze the voice commands and audio inputs received from the user through voice recognition. Further, the user may select the option from an Internet of Things menu through voice command(s) for controlling the remote devices 106a-n. The device 102 may include a list of voice commands and the action to be taken corresponding to each command. The VMThings 108 may compare and match the received voice command with the stored list and thereafter may take an action based on the comparison. In an exemplary scenario, the user at the office may switch on the AC present at home by accessing the visual access menu and saying "switch on the AC" to the device 102 (or a smart phone). In an embodiment of the invention, speech/voice recognition may be used to analyze the voice instructions or commands received from the user to control the remote devices 106a-n. In an embodiment of the invention, the device 102 may receive a call from the one or more objects such as a remote device. In such a case, the VMThings 108 may display a visual access menu of the calling object.

[0089] In an embodiment of the invention, the VMThings 108 may determine the location of the device or the plurality of objects such as the remote devices 106a-n. In an embodiment of the invention, the selection of the option may be automatic based on one or more predefined instructions of the user of the device 102. For example, the predefined instructions may be: switch on the AC at 6 PM, switch off the TV at 2 PM, and close the door of the garage. The remote devices 106a-n may be controlled according to these predefined instructions irrespective of the location of the user or the device 102.

[0090] In an embodiment of the invention, one or more signals may be generated and transmitted by the device 102 based on the selection of the option or an input received from the user. The signals may be transmitted to the remote devices 106a-n through the network 104. The remote devices 106a-n may be controlled based on the signals received from the device 102. In an embodiment of the invention, the device 102 may receive an alert message(s) regarding the operational condition of the remote devices 106a-n. For example, an alert message like 'Car door left opened' may be received by the user at his/her mobile phone for a car standing in a parking area. In an embodiment of the invention, the alert message may be received through at least one of an SMS, an MMS, an instant message, an e-mail, a phone call, turning on the display of the device when it is off, and so forth. In another embodiment of the invention, the user may further receive alert messages as pop-up messages at the device 102, at a GPS system, at a multi-function display of a car of the user, at a TV, at a picture frame, and so forth. Thereafter, the user may control or operate the car door through his/her smart phone and from the office itself. There is no need for him to rush to the parking area for closing the door. In an embodiment of the invention, the user may receive alert messages at a predefined time period.
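Paragraph [0088] describes matching a received voice command against a stored list of commands and taking the corresponding action. A minimal sketch of that comparison step is given below; the command table, the text normalisation, and the action callbacks are illustrative assumptions rather than the patented implementation.

```python
def switch_on_ac():
    print("Sending 'switch on' signal to the AC over the network")

def close_garage_door():
    print("Sending 'close' signal to the garage door")

# Stored list of voice commands and the action to take for each one.
VOICE_COMMANDS = {
    "switch on the ac": switch_on_ac,
    "close the garage door": close_garage_door,
}

def handle_voice_command(transcribed_text):
    """Compare the recognised utterance with the stored commands and
    run the matching action, if any."""
    key = transcribed_text.strip().lower()
    action = VOICE_COMMANDS.get(key)
    if action is None:
        print(f"No stored command matches: {transcribed_text!r}")
        return False
    action()
    return True

handle_voice_command("Switch on the AC")
```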
For example, the user may receive the alert messages regarding the connected remote devices 106a-n after every 1 hour, 2 hours, 30 minutes, and so forth.

[0091] Further, the displayed Internet of Things menu or the visual access menu may extend or change based on the user's selection of the option from the visual access menu. In another embodiment of the invention, the device 102 may receive images, videos, or audio related to the remote devices 106a-n at the predefined time period. Further, the device 102 may receive real-time information, such as, but is not limited to, images, video, etc. of the plurality of the remote devices 106a-n. In an exemplary scenario, the user can monitor and control real-time operation of the remote devices 106a-n, such as one or more vehicles, based on the information received through the network 104. For example, the user can receive images or videos of the one or more vehicles on the device 102. Further, the VMThings 108 may display these images of remote devices 106a-n to the user. The user can send instructions or a voice response to the one or more vehicles through the network 104. For example, the user can track the position of the one or more vehicles in real-time from the device 102 at another location.

[0092] In an embodiment of the invention, the enhanced visual access menus corresponding to the remote devices 106a-n may be stored at a server 114 in the network 104. As discussed with reference to FIG. 1B, the user of the device 102 may access the visual access menus corresponding to the remote devices 106a-n through a web browser in an exemplary environment 200. The environment 200 may include the device 102, such as a smart phone, capable of connecting to the network 104 (or the Internet) via the web browser. In an embodiment of the invention, the remote devices 106a-n may be controlled via a local wireless communication or local network. In an embodiment of the invention, the remote devices 106a-n may be connected to a bridge device that may further be connected to the Internet. The web browser may be used to connect to the Internet and in turn to the local network. Examples of the web browser include, but are not limited to, Internet Explorer, Google Chrome, Mozilla Firefox, Netscape Navigator, and so forth. The user can enter a Uniform Resource Locator (URL) such as 'www.ABC.com' in the web browser to access a website including a database. The database at the website may store a plurality of visual access menus or Internet of Things menus or cockpits or enhanced visual access menus associated with the remote devices 106a-n. The enhanced visual access menus are visual access menus corresponding to the remote devices 106a-n. Each of the enhanced visual access menus may include one or more device options. In an embodiment of the invention, the database may be present in the network 104.

[0093] A webpage 110 may be displayed at the device 102 corresponding to the URL entered by the user. The user may be required or asked to authenticate his/her identity before accessing the visual access menus. The displayed webpage 110 may include one or more data request fields 112a-b where the user may enter his/her details. In an embodiment of the invention, the user may access various visual access menus by authenticating at the website by entering his/her login details such as, but are not limited to, password, user ID, e-mail ID, date of birth, and so forth, in the one or more data request fields 112a-b. Though not shown, a person skilled in the art will appreciate that the webpage 110 may include more than two data request fields 112a-b. The one or more options of the visual access menus or the enhanced visual access menus may be displayed to the user at his/her device 102.

[0094] In an embodiment of the invention, the user may create personalized visual access menus for controlling his/her personal devices of the remote devices 106a-n.
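Paragraphs [0092] and [0093] describe entering a URL, authenticating with login details in the data request fields, and then receiving the visual access menus stored in the server-side database. The sketch below shows one plausible request flow using only the Python standard library; the endpoint path, field names, and JSON response shape are assumptions made for illustration and are not specified by the application.

```python
import json
import urllib.request

def fetch_visual_access_menus(base_url, user_id, password):
    """Authenticate with the website and retrieve the stored
    visual access menus (or cockpit) for this user."""
    credentials = json.dumps({"user_id": user_id,
                              "password": password}).encode("utf-8")
    request = urllib.request.Request(
        base_url + "/menus",                 # hypothetical endpoint
        data=credentials,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())   # e.g. {"home": [...], ...}

# Example call (the URL is a placeholder, as in the specification):
# menus = fetch_visual_access_menus("https://www.ABC.com", "alice", "secret")
```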
In an embodiment of the invention, the user may configure or create an Internet of Things menu for controlling remote devices. The Internet of Things menu may include a plurality of representations corresponding to identifiable objects such as the remote devices 106a-n. Further, the user may customize the Internet of Things menu based on his/her preferences such as, but not limited to, language preference, theme preference, color preference, font size preference, device preference, service preference, and so forth. The VMThings 108 may display the customized or personalized visual access menu at the device 102. In an embodiment of the invention, the VMThings 108 may display the visual access menu at a second display connected to the device 102. The user may select an option from the multiple options of the visual access menu. The enhanced visual access menu (or the Internet of Things menu) may be displayed at the device based on the selection of an option by the user at the device 102. In an embodiment of the invention, a connection may be established between the user device 102 and the remote devices 106a-n based on the selection of the option by the user. Thereafter, the user can access and control the remote devices 106a-n irrespective of a location of the user. The user may not have to be in front of or close to the remote devices 106a-n for controlling the operations of the remote devices 106a-n.

[0095] FIG. 1C illustrates another exemplary environment 300, in accordance with the first embodiment of the invention. An access device 116 may be connected to a display device 118. The access device 116 may access and control the plurality of remote devices 106a-n connected through the network 104. The access device 116 may be any device capable of data and/or voice communications through the network 104 or with the remote devices 106a-n. Examples of the access device 116 include, but are not limited to, a router, a telephone, a set top box, a hub, a gateway, a printer, a music system, a mobile phone, a PDA, a smart phone, a picture frame, and so forth. In an embodiment of the invention, the access device 116 may not have a display or may have limited display capability. The access device 116 may include a plurality of ports for connecting to the network 104 and/or the display device 118. The plurality of ports can be such as, but are not limited to, parallel ports, serial ports, DB-2 connector, IEEE 1284, IEEE 1394 ports, 8P8C ports, PS/2 ports, RS-232 ports, Registered Jack (RJ) 45 ports, RJ 48 ports, VGA ports, Small Computer System Interface (SCSI) ports, USB ports, DB-25 ports, and so forth.

[0096] Examples of the display device 118 may include, but are not limited to, a television, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, a projector screen, a computer, a laptop, a tablet computer, a picture frame, and so forth. The access device 116 may provide a network interface to the display device 118. The user may use the access device 116 for connecting to the network 104. Moreover, the user can access the remote devices 106a-n connected to the network 104 by using the access device 116. In this embodiment of the invention, once connected with the remote devices 106a-n, the visual access menus or the Internet of Things menus may be displayed to the user at the display device 118. In an embodiment of the invention, the user may have to authenticate and/or enter one or more login details before viewing the visual access menus.
The user may authenticate or enter his/her personal details at the access device 116. In an embodiment of the invention, the user may authenticate or enter the personal details at the display screen.

[0097] In an embodiment of the invention, the access device 116 may be a home controller device. The user may access the VMThings 108 by logging into this home controller and may view the visual access menus at his device 102 or the display device 118. After logging into the home controller, the user may control the objects, i.e., remote devices or services, associated with the home controller. Therefore, the user may control the one or more objects by using a combination of devices such as the home controller, a smart phone, another display device, and so forth.

[0098] The access device 116 may include an Internet of Things application, i.e., a VMThings 108 application, for accessing the visual access menus and the enhanced visual access menus. The VMThings 108 may display the visual access menus at the display device 118. The user may connect to the remote devices 106a-n by selecting one or more options of the visual access menus. Further, the remote devices 106a-n may be grouped into various categories such as, but are not limited to, electronic appliances, home devices, buildings, doors, room appliances, electric switches, cars, windows, and so forth. Further, the remote devices 106a-n may be grouped according to location, such as home devices, office devices, garage devices, and so forth. The VMThings 108 of the access device 116 may store visual access menus and enhanced visual access menus according to the various categories of the remote devices 106a-n at the access device 116. Further, the user may control any remote device from the remote devices 106a-n by selecting one or more options from the visual access menu or the Internet of Things menu. In an exemplary scenario, the user can connect to the network 104 by using a telephone and may view the visual access menu on a screen of the television. Thereafter, the user may access and control the remote devices 106a-n from the telephone by pressing appropriate keys/buttons of the telephone.

[0099] In an embodiment of the invention, the user may register the remote devices 106a-n or do some settings at the access device 116 or the remote devices 106a-n, so that the user may control the remote devices 106a-n from the VMThings 108. In an embodiment of the invention, the user may be required to authenticate or prove his/her identity at the access device 116 or for the remote devices 106a-n before controlling one or more operations of the remote devices 106a-n.

[0100] FIG. 1D illustrates an environment based on a ZigBee network 120, in accordance with the first embodiment of the invention. As shown, the access device 116 may include the VMThings 108 for displaying a visual access menu or an enhanced visual access menu or an Internet of Things menu at the display device 118. The access device 116 may connect to the remote devices 106a-n through the ZigBee network 120. In an embodiment of the invention, the remote devices 106a-n may be connected to the ZigBee network 120 through a local network such as a LAN, an NFC network, a Bluetooth network, and so forth. The local network may be connected to the ZigBee network 120 through some gateway device such as a bridge, router, hub, gateway device, switch, and so forth.

[0101] FIG. 1E illustrates an environment based on a WiMAX network 122, in accordance with the first embodiment of the invention. As shown, the access device 116 may include the VMThings 108 for displaying the Internet of Things menu or the visual access menu or the enhanced visual access menus at the display device 118. The access device 116 may connect to the remote devices 106a-n through the WiMAX network 122.
In an embodiment of the invention, the remote devices 106a-n may be connected to the WiMAX network 122 through a local network such as a LAN, an NFC network, and so forth. In an embodiment of the invention, the user may be required to register the remote devices 106a-n or do some settings at the access device 116 or the remote devices 106a-n, so that the user may control the remote devices 106a-n from the VMThings 108. In an embodiment of the invention, the user may be required to authenticate or prove his/her identity at the access device 116 or for the remote devices 106a-n before controlling one or more operations of the remote devices 106a-n. The user may access the visual access menus and enhanced visual access menus at the access device 116 through a GUI. The VMThings 108 may enable the user to control the remote devices 106a-n irrespective of the location of the remote devices 106a-n. For example, the user may control operations of the air conditioner located in his/her factory while being at home. The user may not have to be physically present at the factory or near the air conditioner for controlling the operations of the air conditioner. The user may do the same through the VMThings 108 of the access device 116 (or the device 102).

[0102] FIG. 1F illustrates an environment based on a Global System for Mobile Communication (GSM) network 124, in accordance with the first embodiment of the invention. As shown, the access device 116 may be connected to the remote devices 106a-n through the GSM network 124. Though not shown, a person skilled in the art will appreciate that the access device 116 may be connected to the remote devices 106a-n through other networks, such as, but are not limited to, an RF4CE network, an NFC network, an HSPA network, a LAN, a WAN, a 3rd generation network, a 4th generation network, a CDMA network, an EV-DO network, and so forth.

[0103] FIG. 1G illustrates an environment based on the ZigBee network 120, in accordance with the first embodiment of the invention. As shown, the device 102 may include the VMThings 108. A user may configure an Internet of Things menu by using the VMThings at the device 102. The user of the device 102 may connect to the remote devices 106a-n by using the VMThings 108 through the GUI at the device 102. Further, the device 102 may be connected to the remote devices 106a-n through the ZigBee network 120. In an embodiment of the invention, the device 102 may be connected to another wireless network such as the WiMAX network 122, as shown in FIG. 1H.

[0104] FIG. 1I illustrates an environment based on a combination of a local network 126 and the Internet 130, in accordance with the first embodiment of the invention. The remote devices 106a-n may be connected to a local network 126. The local network 126 can be a private network, a wireless network, and so forth. The local network 126 in turn may be connected to an external or public network such as, but not limited to, the Internet 130 through a bridge device 128. The device 102 may connect to the remote devices 106a-n through the Internet 130. The local network 126 and the Internet 130 may be connected to each other through other devices such as, but are not limited to, a router, a hub, a switch, a gateway, and so forth.

[0105] In an embodiment of the invention, the VMThings 108 may display an advertisement or multiple advertisements along with the visual access menu at the device 102.
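Paragraph [0104] describes remote devices sitting on a local network 126 that is reachable from the Internet 130 only through a bridge device 128. One simple way to picture that topology is a relay that accepts a command arriving from the public network and forwards it onto the local network. The sketch below models this with plain Python objects under those assumptions and is not tied to any particular transport.

```python
class LocalNetwork:
    """Stands in for the private network 126 to which devices attach."""
    def __init__(self):
        self.devices = {}          # device_id -> handler callable

    def register(self, device_id, handler):
        self.devices[device_id] = handler

    def deliver(self, device_id, command):
        self.devices[device_id](command)

class BridgeDevice:
    """Stands in for the bridge 128 joining the Internet and the local network."""
    def __init__(self, local_network):
        self.local_network = local_network

    def forward(self, device_id, command):
        # A real bridge would also authenticate the sender here.
        self.local_network.deliver(device_id, command)

lan = LocalNetwork()
lan.register("garage-door", lambda cmd: print(f"Garage door: {cmd}"))
bridge = BridgeDevice(lan)
bridge.forward("garage-door", "close")   # command sent from the device 102
```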
In an embodiment of the invention, the VMThings may display the advertisement or multiple advertisements along with an Internet of Things menu at the device 102. In an embodiment of the invention, the advertisement(s) are selected and displayed based on the content of the displayed visual access menu or the Internet of Things menu. For example, if the visual access menu is for controlling the home appliances, then the advertisements may be about home appliances such as ACs, fans, etc. In an embodiment of the invention, the visual access menu and/or advertisements may be displayed at a second display or a display device, such as a picture frame, LCD, television, and so forth, connected to the device 102. Further, the visual access menus and the advertisements may be displayed at the display device or the second display through wireless means such as Wi-Fi, Bluetooth, ZigBee, and so forth.

[0106] FIG. 2A illustrates an exemplary environment 400, in accordance with a second embodiment of the invention. The user may use the device 102 to connect to a plurality of services 202a-n through the network 104. The user can access the information about the services 202a-n at the device 102. As discussed with reference to FIG. 1A, the device 102 can be a portable or hand-held device capable of communicating and connecting to the network 104 or other devices such as the remote devices 106a-n. Examples of the device 102 may include a mobile phone, a smart phone, a computer, a personal digital assistant (PDA), a tablet computer, a laptop, etc. The network 104 can be a wired network such as a Local Area Network (LAN) or a Wide Area Network (WAN), or a wireless network such as a WiMAX network, or a combination of these. Examples of the services 202a-n include, but are not limited to, banking services, travel services, entertainment services, railways services, movies services, restaurants, and so forth. Further, the banking services may be categorized as insurance services, retail banking services, internet banking services, loans services, NRI banking, and so forth. The entertainment services may be accessed by the user to get information about music, movies, theatre, news, cartoons, or sports. For example, the user may access movies services to know about new movie releases. The information about services may be displayed in the form of an enhanced visual access menu. The user may interact with the enhanced visual access menu accordingly.

[0107] In an embodiment of the invention, the VMThings 108 may display an Internet of Things menu at the device 102. The Internet of Things menu may include representations of one or more recognizable or identifiable objects such as, but are not limited to, remote devices 106a-n or services in an Internet or network like structure. The one or more identifiable objects may be physical or virtual objects. A graphical user interface (GUI) may be used by the user for creating the Internet of Things menu. In an embodiment of the invention, the objects may be the services 202a-n.

[0108] Further, the VMThings 108 may highlight a frequently accessed service option or preferred service option in the enhanced visual access menu for the services 202a-n or the Internet of Things menu based on the user's previous access patterns. In an embodiment of the invention, the VMThings 108 may highlight one or more frequently accessed device options or preferred device options in the enhanced visual access menu for the remote devices 106a-n. Further, the VMThings 108 may store the user access pattern at the device 102. In an embodiment of the invention, the VMThings 108 may present a standard menu (or a standard visual access menu) for controlling all services 202a-n to the user. In another embodiment of the invention, the VMThings 108 may display a customized menu of services 202a-n at the device 102 based on user preferences and/or access pattern.

[0109] The device 102 may include a Graphical User Interface (GUI) to enable the user to access the services 202a-n. In an embodiment of the invention, the device 102 may include audio or visual menus of the services 202a-n. The device 102 may include visual access menus and/or enhanced visual access menus corresponding to the services 202a-n. The enhanced visual access menu may include one or more service options. The service options may be displayed as graphics or icons or text representing the services 202a-n. The user may control and get more information about the services 202a-n by selecting a service option from the service options at the device 102. In an embodiment of the invention, the user may select a service option by touching the screen of the device 102. For example, if the user wants more information about the travelling service, the user may select the travel service option. In one embodiment, the user can give a voice command to the device 102 for selecting a service option from the enhanced visual access menu. Further, the user may select an option by using a combination of keys on a keypad of the device 102. Further, the user may select a service option by using a mouse device. In an embodiment of the invention, the selection of the service option may be automatic based on the one or more predefined instructions of the user of the device 102. In an embodiment of the invention, the user may have to register himself/herself or the device 102 to access the services 202a-n. In an embodiment, the user may have to authenticate his/her identity prior to accessing the services 202a-n. In an embodiment of the invention, the user may receive alert messages related to the services 202a-n. For example, the user may receive reminders about making a payment for his/her credit card bill. In another embodiment of the invention, the user may receive the alert messages regarding the connected services 202a-n at a predefined time period such as, but not limited to, after every 1 hour, 2 hours, 30 minutes, and so forth. In an embodiment of the invention, the VMThings 108 may alert the user through at least one of: turning on the display of the device 102 from an off state and presenting a menu (visual access menu or Internet of Things menu or cockpit), presenting a menu in a pop-up window, sending a Short Messaging Service (SMS) message, sending a Multimedia Messaging Service (MMS) message, initiating a telephone call, and so forth. Further, the user may receive an alert message as a pop-up message at his/her Global Positioning System (GPS) device, or at a multi-function display of his/her car, or at the screen of a television, or at a mobile phone of the user, and so forth.

[0110] In another embodiment of the invention, the device 102 may receive images, videos, or audio related to the services 202a-n at the predefined time period. In an embodiment of the invention, the user may access or control the services 202a-n by giving voice commands or voice inputs. In an embodiment of the invention, the user may connect to the services 202a-n through applications such as, but are not limited to, Skype, Google Talk, Yahoo Messenger, Magic Jack, and so forth.

[0111] Further, the device 102 may include visual access menus associated with at least two independent objects or services. In an embodiment of the invention, at least two independent objects/services may be produced by at least two independent vendors. In an embodiment of the invention, the device 102 may include vendor-specific Internet of Things menus or visual access menus or enhanced visual access menus for the services 202a-n.
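Paragraph [0109] lists several ways the VMThings may alert the user: waking the display and presenting a menu, a pop-up window, an SMS or MMS message, or a telephone call. A dispatcher that picks a delivery channel per alert could look roughly like the sketch below; the channel functions simply print, since the actual messaging back-ends are outside the specification.

```python
def wake_display_and_show_menu(message):
    print(f"[display] waking screen, showing menu: {message}")

def send_sms(message):
    print(f"[sms] {message}")

def initiate_call(message):
    print(f"[call] dialing user to report: {message}")

# Hypothetical mapping from channel name to delivery function.
ALERT_CHANNELS = {
    "display": wake_display_and_show_menu,
    "sms": send_sms,
    "call": initiate_call,
}

def alert_user(message, preferred_channels=("display", "sms")):
    """Deliver an alert over each of the user's preferred channels."""
    for channel in preferred_channels:
        ALERT_CHANNELS[channel](message)

alert_user("Car door left opened")
```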
Further, the device 102 may also include standard menu(s) for accessing the objects. The VMThings 108 may display the visual access menu depending on the independent vendor(s) of the one or more objects. In another embodiment of the invention, the VMThings 108 may display a visual access menu which is not provided by either of the at least two independent vendors of the at least two independent objects. Further, the visual access menus may include at least one icon indicating the one or more services 202a-n. Further, the icon is substantially different from the icons provided in the visual access menu or the Internet of Things menu provided by the vendor. The VMThings 108 may display a customized or personalized visual access menu or Internet of Things menu at the device 102. In an embodiment of the invention, the VMThings 108 may display the visual access menu or the Internet of Things menu at a second display connected to the device 102.

[0112] In an embodiment of the invention, speech/voice recognition may be used to analyze the voice instructions or commands received from the user to access the services 202a-n. In an embodiment of the invention, the device 102 may receive a call from the services 202a-n. In such a case, the VMThings 108 may display a visual access menu and/or an Internet of Things menu of the calling service. Further, the Internet of Things menu may include one or more options for interacting with the service from which the call is received.

[0113] FIG. 2B illustrates another exemplary environment 500, in accordance with the second embodiment of the invention. In an embodiment of the invention, the visual access menus or the Internet of Things menus corresponding to the services 202a-n may be stored at the server 114 in the network 104. The user at the device 102 may access an enhanced visual access menu corresponding to the services 202a-n by using a web browser. The device 102 may be configured to connect to the network 104 (or the Internet) by entering a URL or a website address in the web browser. Examples of the web browser include, but are not limited to, Apple Safari, Internet Explorer, Google Chrome, Mozilla Firefox, Netscape Navigator, and so forth. The user can enter a URL or a website address in the web browser to access a database including a plurality of enhanced visual access menus corresponding to the services 202a-n. In an embodiment of the invention, the database may be present in the network 104.

[0114] A webpage 204 including the one or more data request fields 112a-b may be displayed at the device 102 based on the entered URL. The user may enter his/her details in the data request fields 112a-b for getting access to the database. Thereafter, at least one enhanced visual access menu to access the services 202a-n may be displayed to the user at the device 102. The user may access information about the one or more services 202a-n by interacting with the displayed enhanced visual access menus. In an embodiment of the invention, the webpage 204 may include at least one of images, audio/video files, text, hyperlinks, and so forth.

[0115] In an embodiment of the invention, a new visual access menu or a new Internet of Things menu may be displayed when the user is directed to a new website based on the user's input or selection. The new visual access menu may be an IVR menu or an Internet of Things menu associated with the new website. Further, the new visual access menu may include options associated with the new website.

[0116] FIG. 2C illustrates yet another exemplary environment 600, in accordance with the second embodiment of the invention. As discussed with reference to FIG. 1C, the user may use the access device 116 to access or control the services 202a-n. The access device 116 may be any device capable of data and/or voice communications through the network 104. In an embodiment of the invention, the access device 116 may not have a display or may have limited display capabilities. The access device 116 can be such as, but is not limited to, a router, a telephone, a set top box, a hub, a gateway, a printer, a mobile phone, a smart phone, a PDA, a tablet computer, a walkie-talkie, and so forth. Further, the access device 116 may include a plurality of ports for connecting to the network 104 or the display device 118, such as a television or an LCD display. Examples of the plurality of ports include, but are not limited to, parallel ports, serial ports, DB-2 connector, IEEE 1284, IEEE 1394 ports, 8P8C ports, PS/2 ports, RS-232 ports, Registered Jack (RJ) 45 ports, RJ 48 ports, VGA ports, Small Computer System Interface (SCSI) ports, USB ports, DB-25 ports, and so forth.

[0117] The access device 116 may provide a network interface to the display device 118. The user may use the access device 116 for accessing the one or more of the services 202a-n through the network 104. An enhanced visual access menu or an Internet of Things menu corresponding to the services 202a-n may be displayed to the user. Thereafter, the user may access the information about the services 202a-n accordingly. In an embodiment of the invention, the user may have to enter one or more login details for authenticating himself/herself to gain access to the one or more visual access menus. In an exemplary scenario, the user can connect to the network 104 by using a telephone and may view the visual access menu on a television screen. Thereafter, the user may access and control the services 202a-n from the telephone by selecting or dialing or pressing one or more combinations of keys at the telephone.

[0118] In an embodiment of the invention, the VMThings 108 may display an advertisement or multiple advertisements along with the visual access menu at the display device 118. In an embodiment of the invention, the advertisement(s) are selected and displayed based on the content of the displayed visual access menu. For example, if the visual access menu is for controlling the banking services, then the advertisements may be about insurance and opening accounts. In an embodiment of the invention, the visual access menu and/or advertisements may be displayed at a second display or the display device 118, such as a picture frame, LCD, television, and so forth, connected to the access device 116. Further, the visual access menus and the advertisements may be displayed at the display device 118 or the second display through wireless means such as Wi-Fi, Bluetooth, ZigBee, and so forth.

[0119] FIG. 2D illustrates an environment based on the ZigBee network 120, in accordance with the second embodiment of the invention. As shown, the access device 116 may include the VMThings 108 for displaying a visual access menu or an enhanced visual access menu including one or more service options at the display device 118. The access device 116 may access and/or connect to the services 202a-n through the ZigBee network 120. Examples of the services 202a-n include, but are not limited to, banking services, travel services, entertainment services, railways services, movies services, restaurants, hotels, and so forth. In an embodiment of the invention, the services 202a-n may be accessed through the ZigBee network 120 and the local network 126, such as a LAN, an NFC network, a Bluetooth network, a virtual private network (VPN), and so forth.
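Paragraph [0118] states that advertisements are selected based on the content of the displayed visual access menu, for example insurance advertisements alongside a banking menu. A simple keyword-driven selection, sketched below under the assumption of a small static catalogue, is one way such a rule could work; the catalogue and function names are illustrative only.

```python
# Hypothetical advertisement catalogue keyed by menu topic.
AD_CATALOGUE = {
    "banking": ["Insurance plans", "Open a savings account"],
    "home appliances": ["Air conditioners on sale", "Ceiling fans"],
}

def select_advertisements(menu_topic, limit=2):
    """Pick advertisements that match the topic of the displayed menu."""
    return AD_CATALOGUE.get(menu_topic.lower(), [])[:limit]

print(select_advertisements("Banking"))
# ['Insurance plans', 'Open a savings account']
```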
The local network may be a privately monitored network with no or limited access to outside users. The local network 126 may be connected to the ZigBee network 120 through some gateway device such as the bridge device 128, a router, a hub, a gateway, a switch, and so forth.

[0120] FIG. 2E illustrates an environment based on the WiMAX network 122, in accordance with the second embodiment of the invention. As shown, the access device 116 may include the VMThings 108 for displaying a visual access menu or an enhanced visual access menu including one or more service options at the display device 118. The access device 116 may connect to the services 202a-n through the WiMAX network 122. Examples of the services 202a-n include, but are not limited to, banking services, travel services, entertainment services, railways services, movies services, restaurants, and so forth. In an embodiment of the invention, the services 202a-n may be connected to the WiMAX network 122 through a local network such as a LAN, an NFC network, and so forth. The local network 126 may be connected to the WiMAX network 122. In an embodiment of the invention, the user may be required to register for the services 202a-n or do some settings at the access device 116 or the remote devices 106a-n, so that the user may control the services 202a-n (or remote devices 106a-n) from the access device 116. In an embodiment of the invention, the user may be required to authenticate or prove his/her identity at the access device 116 or the services 202a-n before accessing the services 202a-n. The user may access visual access menus and enhanced visual access menus at the access device 116 through a GUI. The VMThings 108 may enable the user to access and control the services 202a-n irrespective of the location of the user.

[0121] FIG. 2F illustrates an environment based on the Global System for Mobile Communication (GSM) network 124, in accordance with the second embodiment of the invention. As shown, the access device 116 may be connected to the services 202a-n through the GSM network 124. Though not shown, a person skilled in the art will appreciate that the access device 116 may be connected to the services 202a-n through other networks, such as, but are not limited to, an RF4CE network, an NFC network, an HSPA network, a LAN, a WAN, a 3rd generation network, a 4th generation network, a Code Division Multiple Access (CDMA) network, an EV-DO network, and so forth.

[0122] FIG. 2G illustrates an environment based on the ZigBee network 120, in accordance with the first embodiment of the invention. As shown, the device 102 may include the VMThings 108 for configuring or customizing or displaying an Internet of Things menu at the device 102 by a user. The Internet of Things menu may include representations of one or more recognizable or identifiable objects such as, but are not limited to, remote devices 106a-n or services in an Internet or network like structure. The one or more identifiable objects may be physical or virtual objects. A graphical user interface (GUI) may be used by the user for creating the Internet of Things menu. The device 102 can be a portable device capable of communicating and connecting to the network 104 or other devices such as the remote devices 106a-n. Examples of the device 102 may include, but are not limited to, a mobile phone, a telephone, a smart phone, a computer, a personal digital assistant (PDA), a tablet computer, a laptop, and so forth. A user of the device 102 may access the services 202a-n by using the VMThings 108 through the GUI at the device 102. Further, the device 102 may be connected to the services 202a-n through the ZigBee network 120. In an embodiment of the invention, the device 102 may be connected to another wireless network such as the WiMAX network 122, as shown in FIG. 2H.

[0123] FIG. 2I illustrates an environment based on a combination of a local network and the Internet, in accordance with the first embodiment of the invention. The services 202a-n may be interconnected through the local network 126. The local network 126 can be a private network, a wireless network, a VPN, and so forth. The local network 126 in turn may be connected to an external or public network such as, but not limited to, the Internet 130 through a bridge device 128, or a router, or a switch, or a gateway device, and so forth. The user of the device 102 may connect to or access the services 202a-n through the Internet 130. Further, the VMThings 108 may display information about services in a preferred language set by the user. For example, if the user wants the information in English, the VMThings may display the information about the services 202a-n in the English language, and if the user is interested in getting information in the Spanish language, the VMThings may display the information about the services 202a-n in the Spanish language. The VMThings is configured to display the visual access menu or the enhanced visual access menu in different languages such as, but are not limited to, English, Spanish, French, German, Sanskrit, Hindi, and so forth. Further, the user may have to register himself/herself or the device 102 (or the access device 116) at the website before accessing the services 202a-n. In an embodiment of the invention, the services 202a-n may be accessed through the web browser or the web page 110 as shown in FIG. 2B.

[0124] FIG. 3A illustrates an exemplary visual access menu 308 and an enhanced visual access menu 310 at the device 102, in accordance with the first embodiment of the invention. As discussed with reference to FIG. 1A, the device 102 may include a graphical user interface (GUI) for accessing the visual access menus. Further, the VMThings 108 may display the visual access menu 308 (or the Internet of Things menu) at the device 102 so as to enable the user to control the remote devices 106a-n. A visual access menu 308 may include one or more options. The options may be a remote devices 302 option and a services 304 option. Though not shown, a person skilled in the art will appreciate that the visual access menu 308 (or the Internet of Things menu) may include more than two options. A user of the device 102 may select an option of these options from the displayed visual access menu 308 (or the Internet of Things menu). Further, the user may select an option in any of the following ways, but not limited to: touching an option, through a voice command, through a gesture or hand movement, through an audio input, by pressing one or more keys at the device 102, and so forth. Further, the VMThings 108 may use voice recognition to enable the user to make a selection of an option or icon from the visual access menu 308 (or the Internet of Things menu) through a voice command. The device 102 may include a voice recognition module to process and analyze the voice command(s).

[0125] Thereafter, an enhanced visual access menu 310 (or an enhanced Internet of Things menu) may be displayed based on the selection of the option from the visual access menu 308. For example, if the user has selected the remote devices 302 option, then the enhanced visual access menu 310 including one or more device options 306a-n may be displayed to the user at the device 102. The one or more device options may include options corresponding to the remote devices 106a-n such as, but are not limited to, a vehicle 306a, an air conditioner (AC) 306b, a camera 306c, a microwave 306n, and so forth.
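Paragraph [0124] allows an option of the visual access menu 308 to be selected by touch, voice command, gesture, audio input, or key presses. The sketch below funnels these input modes into a single selection handler; the mode names and the recogniser stubs are assumptions made only for illustration, not the patented voice recognition module.

```python
MENU_OPTIONS = {"1": "Remote devices", "2": "Services"}

def recognize_voice(audio_text):
    """Toy stand-in for the voice recognition module."""
    return "1" if "device" in audio_text.lower() else "2"

def select_option(mode, payload):
    """Resolve a user input of any supported mode to a menu option."""
    if mode in ("touch", "key"):
        option_id = payload                  # option touched / key pressed
    elif mode == "voice":
        option_id = recognize_voice(payload)
    elif mode == "gesture":
        option_id = "1" if payload == "thumb_up" else "2"
    else:
        raise ValueError(f"Unsupported input mode: {mode}")
    return MENU_OPTIONS[option_id]

print(select_option("voice", "show me the remote devices"))   # Remote devices
print(select_option("key", "2"))                               # Services
```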
The user may select a device option of the device options 306a-n. For example, the user may select and control a microwave by selecting the microwave option 306n. For example, the user may control operations such as switch off, switch on, regulate, and so forth through the enhanced visual access menu. Further, the remote devices 106a-n may include some predefined settings so that the user may access and control the remote devices 106a-n from a remote location. In an embodiment of the invention, the predefined settings may be done by the user. The VMThings 108 may store these predefined settings at the access device 116 (or the device 102). In an embodiment of the invention, the device 102 may be connected to the services based on a local communication protocol based on nearby communication and proximity such as NFC, Bluetooth, and so forth. Further, the user may have to authenticate his/her identity before accessing the remote devices 106a-n. The device 102 may connect to the remote devices based on the predefined settings. Further, in an embodiment of the invention, each remote device of the remote devices 106a-n may have a unique remote device identity (ID) to distinguish it from other remote devices. Further, the user may be allowed to access the remote devices 106a-n based on registration and/or authentication.

[0126] In an embodiment of the invention, the user may personalize or customize the visual access menus or the Internet of Things menu displayed to him/her according to his/her preferences. For example, the user may select remote devices such as the car, garage, home doors, fans, and lights of his/her house. Now the user may be displayed a visual access menu corresponding to his/her preferred remote devices of the remote devices 106a-n. Through this visual access menu or the Internet of Things menu the user may access and control one or more operations of the personal remote devices. Similarly, the user may define his/her preferences for accessing the remote devices present at his/her office or factory, and so forth. Therefore, multiple visual access menus may be stored at the device based on the preferences of the user. In an embodiment of the invention, more than one user may use the device 102 for accessing the remote devices 106a-n. For example, in a home, four users may be using the same smart phone for controlling the multiple devices of the home. The VMThings 108 allows different users to access remote devices (or services) according to their own preferences at the device 102 (or the access device 116). The VMThings 108 may also store the different preferences corresponding to the different users. The VMThings 108 may identify different users based on their unique user IDs or details. Further, the VMThings 108 may highlight a few frequently selected or previously selected options of the visual access menu. Further, the VMThings may display a menu for communicating with the one or more objects made by a vendor. In an embodiment of the invention, the menu is not provided by the vendor. Further, the one or more objects may comprise at least two objects produced by two independent vendors.

[0127] Further, the user may provide a language preference or a display preference. For example, the VMThings 108 may display the visual access menu (or the Internet of Things menu) in the Spanish language based on the user's Spanish language preference. In an embodiment of the invention, the visual access menu (or the Internet of Things menu) may be displayed by the VMThings 108 on a bigger display screen in the vicinity of the device 102, such as, but not limited to, a projector screen, an LCD display, an LED display, a television, and so forth, based on the user's display preference.
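Paragraphs [0126] and [0127] describe keeping separate preferences and access patterns for several users of the same device and highlighting frequently selected options. The counter-based sketch below is one minimal way such per-user state could be kept; the class and method names are illustrative only.

```python
from collections import Counter, defaultdict

class UserPreferences:
    """Per-user menu preferences and option usage, keyed by user ID."""
    def __init__(self):
        self.preferred_devices = defaultdict(list)    # user_id -> devices
        self.usage = defaultdict(Counter)              # user_id -> counts

    def set_preferences(self, user_id, devices):
        self.preferred_devices[user_id] = list(devices)

    def record_selection(self, user_id, option):
        self.usage[user_id][option] += 1

    def highlighted_options(self, user_id, top_n=2):
        """Return the options selected most often by this user."""
        return [opt for opt, _ in self.usage[user_id].most_common(top_n)]

prefs = UserPreferences()
prefs.set_preferences("user-1", ["car", "garage", "lights"])
for option in ["lights", "lights", "garage", "car", "lights"]:
    prefs.record_selection("user-1", option)
print(prefs.highlighted_options("user-1"))   # ['lights', 'garage']
```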
Further, the VMThings 108 may store the usage or access pattern for each user based on his/her selections of options from the visual access menus or the enhanced visual access menus (or the Internet of Things menus) at the device 102. In an embodiment of the invention, the device 102 may store usage patterns for more than one user at the device 102.

[0128] In an embodiment of the invention, the user may select an option from the one or more options at the device 102 (or the access device 116) through voice inputs. For example, the user may switch on a microwave present at home by saying "Switch On the Microwave" or just by saying "Switch On". In another embodiment of the invention, the user may provide inputs at the device 102 by using different gestures or hand movements. For example, the user may switch on an air conditioner by showing a thumbs-up gesture at the device 102. In an embodiment of the invention, the device 102 may include a camera. Further, the user may provide inputs regarding controlling remote devices (or services) at the device 102 by clicking an image. In an embodiment of the invention, the VMThings 108 may store a list of voice commands or gestures or hand movements for selecting options from the visual access menus or the enhanced visual access menus (or the Internet of Things menus). The VMThings 108 may store the actions to be taken corresponding to these commands or gestures or hand movements.

[0129] FIG. 3B illustrates an exemplary visual access menu 308 and an enhanced visual access menu 312 of the services 202a-n at the device, in accordance with the second embodiment of the invention. The user may access information about one or more services by selecting the services 304 option from the visual access menu 308 (or the Internet of Things menu for the services 202a-n). An enhanced visual access menu 312 or an enhanced Internet of Things menu corresponding to the services 202a-n may be displayed to the user by the VMThings 108. The enhanced visual access menu 312 may include one or more service options 314a-n for different types of services such as, but are not limited to, entertainment 314a, travel 314b, banking 314c, hotels 314n, movies, airlines, and so forth.

[0130] In an embodiment of the invention, the user can further expand the visual access menu for any of the services by selecting a service option from the service options 314a-n. For example, the user may access more information about banking services by selecting the banking option 314c. In an embodiment of the invention, the user may customize the visual access menu displayed to him/her by providing his/her preferences about the services (or remote devices) he/she would like to access or control. For example, the user may select preferred services such as entertainment, banking, and hotels. Therefore, the user will now be displayed an extended visual access menu including options for these three preferred services only. In an embodiment of the invention, the device 102 may be connected to the services based on a local communication protocol based on nearby communication and proximity such as NFC, Bluetooth, and so forth. Further, the user may have to authenticate his/her identity before accessing the services 202a-n. Further, in an embodiment of the invention, each service of the services 202a-n may have a unique service identity (ID) to distinguish it from other services. Similarly, every user may have a unique user ID.
In an embodiment of the invention, the user may be authenticated based on the user ID. Further, the user may be allowed to access the services 202a-n based on registration and/or authentication. [0131] In an embodiment of the invention, the user may access the remote devices 106a-n and services 202a-n through a web browser as shown in FIG. 2B. FIG. 3C illustrates another exemplary visual access menu and an enhanced visual access menu at the device 102 when a web browser is used to access the visual access menus for controlling the
remote devices 106a-n. The visual access menus may be stored at the server 114 in the network 104. In an embodiment of the invention, the VMThings may update the database at the device 102 (or the access device 116) at a regular interval. Further, the database may store a category attribute for each of the one or more objects, i.e., the remote devices 106a-n, and a standard menu according to each category attribute. Similarly, the database may store other attributes or properties such as, but not limited to, location, device name, and so forth, associated with the plurality of objects. In an embodiment of the invention, the user can access the visual access menu including the various device options 306a-n through the web browser. The user may enter a URL in the web browser. A web page 110a including a visual access menu may be displayed at the device based on the entered URL. The visual access menu at the web page 110 may include options such as, but not limited to, a remote devices option 302 and a services option 304. In an embodiment of the invention, the user may be asked to enter his/her personal details for authentication prior to getting access to the visual access menu(s). The user may select an option from the remote devices option 302 and the services option 304. [0132] The display of the device 102 may switch from the webpage 110a to the webpage 110b when the user selects the remote devices option 302. The webpage 110b may include an enhanced visual access menu including the device options 306a-n. The device options 306a-n may be graphics or icon and/or text options representing the remote devices 106a-n such as, but not limited to, a vehicle, an air conditioner (AC), a camera, a door, a microwave, a window, and so forth. Examples of the device options 306a-n include, but are not limited to, a vehicle 306a, an AC 306b, a camera 306c, a microwave 306n, and so forth. In an embodiment of the invention, when the user selects the services option 304 from the webpage 110a, the display of the device 102 may change from the webpage 110a to a webpage 110c as shown in FIG. 3D. The webpage 110c may include an enhanced visual access menu including the service options 314a-n. The service options 314a-n may include options for accessing the services such as, but not limited to, entertainment 314a, travel 314b, banking 314c, hotels 314n, food, and so forth. The information may be displayed to the user based on his/her selection accordingly. Further, the information may be displayed to the user in a language based on the user's language preference. [0133] FIG. 4 illustrates an exemplary enhanced visual access menu 402 (or the Internet of Things menu for the remote devices 106a-n) including one or more device options 404a-l, in accordance with an embodiment of the invention. A visual access menu 402 may include the one or more device options 404a-l. The device options 404a-l may be such as, but not limited to, a vehicle 404b, an AC 404d, a camera 404e, a microwave 404f, a car 404g, a truck 404h, and so forth. In an embodiment of the invention, the user of the device 102 may select a device option such as a vehicle option 404b from the device options 404a-l by touching the vehicle option 404b.
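Purely as a non-limiting sketch of how touch, voice, and gesture inputs might be dispatched to a device option as described in paragraph [0133], the following assumes made-up command strings and gesture labels; none of these names come from the specification.

```python
# Illustrative only: dispatch a touch, voice, or gesture input to a device option;
# the option names, command phrases, and gesture labels are hypothetical.
def select_device_option(input_type: str, value: str) -> str:
    voice_commands = {"open the vehicle menu": "vehicle 404b", "switch on the ac": "AC 404d"}
    gestures = {"thumb_up": "AC 404d", "head_nod": "camera 404e"}

    if input_type == "touch":
        return value                      # the touched option is the selection itself
    if input_type == "voice":
        return voice_commands.get(value.lower(), "no matching option")
    if input_type == "gesture":
        return gestures.get(value, "no matching option")
    return "unsupported input type"


print(select_device_option("touch", "vehicle 404b"))
print(select_device_option("voice", "Switch on the AC"))
print(select_device_option("gesture", "thumb_up"))
```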
In another embodiment of the invention, the user may enter a voice command or play an audio at the device 102 or at some other device nearby to select a device option of the device options 404a-l from the enhanced visual access menu 402 (or an enhanced Internet of Things menu for the remote devices 106a-n). In another embodiment of the invention, the user may select the device options 404a-l through gestures or hand movements such as a thumb up, a thumb down, a waving hand, a head nod, and so forth. The enhanced visual access menu 402 includes the device options 404a-l. The user may close the door of the car by selecting the Close option 404l. Similarly, the user may regulate the temperature of the microwave by selecting the regulate option 404i. Though not shown, a person ordinarily skilled in the art will appreciate that the enhanced visual access menu 402 may include different device options and more device options than the device options 404a-l. Further, the device options 404a-l may differ based on the user's preferences such as language, remote devices, and so forth. [0134] FIG. 5 illustrates an exemplary visual access menu 502 (or the Internet of Things menu) including one or more service options 504a-k, in accordance with an embodiment of the invention. The enhanced visual access menu 502 may include a plurality of service options 504a-k. Though not shown, a person skilled in the art will appreciate that the enhanced visual access menu 502 may include more service options than shown. The service options 504a-k may include services such as, but not limited to, banking 504b, entertainment 504c, travel 504d, and so forth. Further, the service options 504a-k may differ based on the user's preferences such as language, services of interest, and so forth. [0135] The user may select a service option of the service options 504a-k. In an embodiment of the invention, the user of the device 102 may select the banking 504b option from the service options 504a-k by touching the banking 504b option. In an embodiment of the invention, the user may select the banking 504b option by using a combination of keys such as '12'. The user can enter the key combination by using an input device such as a keyboard connected to the device 102 or through the keypad of the device 102. In another embodiment of the invention, the user may enter a voice command or music through a microphone of the device 102 to select a service option from the service options 504a-k of the visual access menu 502. In yet another embodiment of the invention, the user may select or control a service through gestures or hand movements. The user may get information about credit cards by selecting the credit cards 504h option. Similarly, the user may retrieve more information about his/her credit card bill by selecting the check bill 504k option from the visual access menu 502. [0136] In an embodiment of the invention, the user may access the local services available in a nearby area or in the vicinity of the device 102 through the VMThings 108. For example, if the user is near some services and has the device 102 or the access device 116, then the VMThings 108 may enable the user to communicate and connect to the local service. Further, the VMThings 108 may provide some suggestion(s) regarding the local services and offerings. For example, the device 102 or the user may communicate with a nearby bank, coffee shop, or train station.
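As a minimal, non-limiting sketch of the nearby-service suggestions mentioned in paragraph [0136], the following assumes an invented list of services with distances and an arbitrary suggestion radius; nothing here is prescribed by the specification.

```python
# Illustrative proximity filter: suggest only the services within a given range of the
# device; the service list, distances, and radius are made up for this example.
NEARBY_SERVICES = [
    {"name": "bank",          "distance_m": 120},
    {"name": "coffee shop",   "distance_m": 60},
    {"name": "train station", "distance_m": 900},
]


def suggest_local_services(services, max_distance_m: float = 300):
    """Return the services close enough to the device to be suggested to the user."""
    return [s["name"] for s in services if s["distance_m"] <= max_distance_m]


print(suggest_local_services(NEARBY_SERVICES))  # ['bank', 'coffee shop']
```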
[0137] Further, the user may have to authenticate his/her identity before accessing or using the services. For example, the user may be asked to enter his personal details for authentication prior to connecting to or accessing the services. The authentication process prevents unauthorized users from accessing the services. Further, each service may be identified through its unique service ID. [0138] FIG. 6 illustrates exemplary components of the device 102, in accordance with an embodiment of the invention. The device 102 may include a system bus 622 to connect the various components. Examples of the system bus 622 include several types of bus structures including a memory bus, a peripheral bus, or a local bus using any of a variety of
bus architectures. As discussed with reference to FIG. 1A, the device 102 can be a communication device capable of connecting to other devices such as the remote devices 106a-n through the network 104. Examples of the device 102 include a mobile phone, a smart phone, a computer, a personal digital assistant (PDA), a tablet computer, a laptop, etc. The remote devices 106a-n can be devices such as, but not limited to, home appliances, vehicles, doors, lights, security systems, garage locks, and so forth. Further, the user may access the remote devices 106a-n from a remote location by using the device 102. In an embodiment of the invention, the remote devices 106a-n may be devices present at a home location. In another embodiment of the invention, the remote devices 106a-n may be devices present at an office location. In yet another embodiment of the invention, the remote devices 106a-n may be present at a factory location. [0139] The device 102 can connect to the network 104 through a network interface 616. An Input/Output (IO) interface 618 of the device 102 may be configured to connect to external or peripheral devices such as a memory card 620a, a keyboard 620b, a mouse 620c, and a Universal Serial Bus (USB) device 620d. Although not shown, various other devices can be connected through the IO interface 618 to the device 102. In an embodiment of the invention, the device 102 may be connected to a hub that provides various services such as voice communication, network access, television services, and so forth. For example, the hub may be a Home Gateway device that acts as a hub between the device 102 and the network 104. [0140] The device 102 may include a display 602 to output graphical information or the visual access menus or the Internet of Things menus to the user of the device 102. In an embodiment of the invention, the display 602 may include a touch sensitive screen. Therefore, the user can provide inputs to the device 102 by touching the display 602 or by point and click using the mouse 620c. The user can interact with the visual access menu (or the Internet of Things menu) by pressing a desired button from the keyboard 620b. For example, the user can press the '3' key on the keyboard 620b to select a node 3 in the visual access menu. Further, the user can directly select the node 3 of the visual access menu from the display 602, in the case of a touch sensitive screen. [0141] A memory 606 of the device 102 may store various programs, data and/or instructions that can be executed by a processor 604 of the device 102. Examples of the memory 606 include, but are not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a hard disk, and so forth. A person skilled in the art will appreciate that other types of computer-readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, and the like, may also be used by the device 102. The memory 606 may include a graphical user interface (GUI) 604 for accessing the enhanced visual access menus (or the enhanced Internet of Things menu) for the remote devices 106a-n and/or the services 202a-n. The memory 606 may include a database 610 for storing the enhanced visual access menus corresponding to the remote devices 106a-n and/or the plurality of services 202a-n. Further, the database 610 may store user preferences related to the enhanced visual access menus of the remote devices 106a-n and the plurality of services 202a-n.
Further, the database 610 may include a category attribute for each of the objects, i.e., the services 202a-n or the remote devices 106a-n, and a standard menu according to each category attribute. Further, the database 610 may store the alert and reminder messages. In an embodiment of the invention, the database 610 may store information about the various services 202a-n and remote devices 106a-n. Further, the database 610 may be updated at a predefined time interval. For example, the database 610 may be updated after every 2 days, once in a week, monthly, and so forth. In an embodiment of the invention, the updates may be received from the server 114 as shown in FIG. 1B. In another embodiment of the invention, the updates about the visual access menus may be received from the network 104. [0142] In an embodiment of the invention, the VMThings 612 may update the database 610 based on crowd sourcing. That is, the database 610 may be updated based on feedback or reviews or thoughts of other users. For example, if 10 users out of 15 users visiting a website and accessing the visual access menus say that there is some error in the system of controlling a particular object, then based on the ratings provided by these users, the record or the menu for the particular object in the database 610 may be updated. The VMThings 612 may also learn the problem associated with the visual access menus or the device or the objects from many other sources and may find a solution based on many other users. Examples of the other sources include, but are not limited to, other network devices, the remote devices 106a-n, the services 202a-n, users, the server, and so forth. [0143] In an embodiment of the invention, the database 610 may be created based on the information of a yellow pages directory. The plurality of objects may be categorized based on the category mentioned in the yellow pages. Further, the visual access menus in the database may be created based on the categories of the objects according to the yellow pages. In an embodiment of the invention, the database 610 may be created by a human operator or an automatic application. [0144] Further, the memory 606 may store an Internet of Things application such as a VMThings 612 for displaying visual access menus corresponding to the objects such as the remote devices 106a-n or the services 202a-n at the device 102. Further, the VMThings 612 may be configured to connect the device 102 to the one or more of the remote devices 106a-n. In an embodiment of the invention, the VMThings 612 may be used to connect to the services 202a-n remotely. The VMThings 612 may be configured to display a visual representation in the form of enhanced visual access menus of the remote devices 106a-n or the services 202a-n at the display 602. The device 102 may further include a radio interface 614 configured for wireless communications with other devices in the network 104. The visual access menus may include multiple device options or service options. The user can select one or more options from the visual access menu. Further, the VMThings 612 may connect the user to the remote devices 106a-n or services based on the selection of the options. Further, the VMThings 612 may be configured to enable the device 102 to receive images, videos, and so forth of the connected remote devices 106a-n and services 202a-n irrespective of their location. In an embodiment of the invention, the images are real-time images.
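The crowd-sourced update rule of paragraph [0142] could be sketched, purely for illustration, as a simple threshold check; the two-thirds threshold mirrors the 10-of-15 example above, and the function name is hypothetical.

```python
# Hypothetical crowd-sourcing rule: flag a menu record for update once the share of
# visiting users reporting an error crosses a threshold (2/3, as in the 10-of-15 example).
def needs_update(error_reports: int, total_users: int, threshold: float = 2 / 3) -> bool:
    """Return True when enough users report a problem with an object's menu."""
    if total_users == 0:
        return False
    return error_reports / total_users >= threshold


print(needs_update(10, 15))  # True  -> update the record in the database
print(needs_update(3, 15))   # False -> keep the current menu
```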
In an embodiment of the invention, the VMThings 612 may be implemented as software or firmware or hardware or a combination of these at the device 102. [0145] In an embodiment of the invention, the VMThings 612 may store one or more selections of options made by the user(s) in the database 610. Further, the VMThings 612 may bookmark the options based on the past history of the
user activity with the visual access menu. The database 610 may store personalized visual access menus or enhanced visual access menus for different users. The database 610 may be updated based on user instructions. The user instructions may be provided by the user through commands such as, but not limited to, voice commands, gestures, selection of keys, and so forth. In an embodiment of the invention, the VMThings 612 is also configured to analyze and process the voice commands based on the context of the voice command. [0146] Further, the database 610 may store the visual access menu of the one or more objects based on the category of the objects. In another embodiment of the invention, the database may store the visual access menus based on the vendors of the one or more objects. In an embodiment of the invention, the visual access menus may be stored based on one or more properties of the objects such as, but not limited to, location, type, distance, and so forth. The database 610 may also store advertisements related to the one or more objects. In an embodiment of the invention, the VMThings 612 may display at least one advertisement along with the visual access menu at the device or display device. The advertisements may be related to the content of the visual access menu. In an embodiment of the invention, the advertisements may be related to the one or more objects, the remote devices 106a-n, the services 202a-n, and so forth. In another embodiment of the invention, the advertisements may be related to a location of the device 102 or of the one or more objects. In an embodiment of the invention, the advertisements may be displayed to the user based on one or more preferences of the user. For example, the user may prefer to view advertisements of electronic devices like computers, etc. Further, the VMThings 108 may highlight the one or more options in the visual access menu. In an embodiment of the invention, the one or more options may be highlighted based on the users' previous selection of options. Further, the VMThings 612 may keep a record of user activity on the device 102. The VMThings 612 may store the user profile and access patterns of the user for accessing the visual access menu or interacting with the device 102. [0147] In an embodiment of the invention, the database 610 may be updated based on addition or deletion of the one or more objects. For example, if a new remote device is added to the list of devices to be controlled then the visual access menu will be updated accordingly. Further, the VMThings 612 may detect errors which may occur during the user interaction with the visual access menu. The VMThings 612 may also report these errors to the user. In an embodiment of the invention, the errors may occur due to some other reasons such as technical reasons, network failure, and so forth. [0148] In an embodiment of the invention, the user may receive a call from the controlled one or more objects. Also, the user may be presented with a visual access menu associated with the object from which the call is received. The VMThings 612 may display the visual access menu associated with the object from which the call is received at the device 102. [0149] Depending on the complexity or number of device options and/or service options in the visual access menu, the size of the visual access menu may differ. Moreover, the size of the display 602 may be limited or small.
As a result, all the options of the visual access menu may not be displayed together on the display 602. In such a case, the VMThings 612 may allow the user to navigate by scrolling horizontally and/or vertically to view options on the visual access menu. Further, the VMThings 612 may detect the capability of the device 102 before displaying the visual access menu. For example, in case the device 102 is a basic mobile phone with limited display functionality, the application may display the visual access menu in the form of a simple list. Similarly, a list may be displayed in the case of fixed line or wired telephones. Moreover, in case the device 102 includes a high-capability screen, such as, but not limited to, that of an iPad or a television, the visual access menu may be displayed in the form of graphics. [0150] Further, the memory 606 may include other applications that enable the user to communicate/interact with the remote devices 106a-n through the network 104. Examples of other applications include, but are not limited to, Skype, Google Talk, Magic Jack, and so forth. Other applications may be stored as software or firmware on the device 102. Further, the memory 606 may include an Operating System (OS) (not shown) for the device 102 to function properly. [0151] Though not shown, the device 102 may include a camera, a microphone, a speaker, and so forth. The user may provide voice commands by using the microphone. Further, the user may provide the input or select the option by clicking an image by using the camera. The user may control one or more operations of the remote devices 106a-n by making gestures or hand movements in front of the camera of the device 102. The speaker may be used to output music and voice responses to the user. Further, the VMThings 612 may record voice commands received from the user. These recorded commands may then be stored at the device 102. The user may input one or more keys or key combinations using the keyboard 620b. The keyboard 620b may be a physical keyboard or a virtual keyboard displayed on a touch screen display 602 of the device 102. In an embodiment, the keyboard 620b is a keypad on the device 102. Subsequently, after some processing by the application, the enhanced visual access menu corresponding to the remote devices 106a-n and/or the services 202a-n based on the user inputs or selection is searched and displayed on the display 602. [0152] In an embodiment of the invention, the visual access menu or the enhanced visual access menu may be provided in real-time to the user. In another embodiment of the invention, the visual access menus (or the Internet of Things menus) may be downloaded and stored at the device 102 and may be accessed by the user later. In an embodiment of the invention, the visual access menu may be provided by a messaging service such as a Short Messaging Service (SMS). In an embodiment of the invention, customized visual access menus may be displayed to the user based on one or more preferences of the user. In an embodiment of the invention, the visual access menu may be customized based on the profile of the user. In an embodiment of the invention, the profile may be generated based on the access pattern of the user or the data captured by a hub connected to the device 102. Further, in an embodiment of the invention, the VMThings 108 may convert the format of the message including the visual access menu into another format based on the user preference related to the format.
For example, the VMThings 108 may convert the format of the visual access menu received in an SMS format to an e-mail format based on the user preference. [0153] In an embodiment, the memory 606 may include a web browser to access and display web pages from the network 104 and/or other computer networks. The user may use the web browser to open a website for accessing the visual access menu (or the Internet of Things menu).
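A minimal, non-limiting sketch of the SMS-to-e-mail format conversion described in paragraph [0152] might look as follows; the function name, dictionary keys, and address are all hypothetical.

```python
# Hypothetical format conversion: wrap a menu received as an SMS-style text into an
# e-mail-style message when that is the user's preferred format.
def convert_menu_format(menu_text: str, preferred_format: str, user_address: str) -> dict:
    if preferred_format == "email":
        return {
            "to": user_address,
            "subject": "Your visual access menu",
            "body": menu_text,
        }
    # Default: leave the message in its original SMS form.
    return {"to": user_address, "sms": menu_text}


print(convert_menu_format("1. AC  2. Camera  3. Microwave", "email", "user@example.com"))
```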
In an embodiment, the user may store the login details for the website(s) at the device 102. Therefore, the user can connect to the remote devices 106a-n or the services 202a-n from the web browser automatically and may not have to enter his/her login details every time to log in to the website. The user may navigate through the website and may select a hyperlink embedded in the webpage of the website. Based on the selection of the hyperlink by the user, he/she may be directed to another webpage. In such a scenario, the VMThings 612 may display a new Internet of Things menu associated with the new website. In an embodiment of the invention, the VMThings 612 may display a new visual access menu associated with the new web page. [0154] FIG. 7 illustrates exemplary components of the access device 116, in accordance with an embodiment of the invention. The access device 116 may include a system bus 720 to connect the various components. Examples of the system bus 720 include several types of bus structures including a memory bus, a peripheral bus, or a local bus using any of a variety of bus architectures. As discussed with reference to FIGS. 1C and 2C, the access device 116 may be any device capable of data and/or voice communications through the network 104 or with the remote devices 106a-n. Examples of the access device 116 include, but are not limited to, a router, a printer, a music system, a telephone, a set top box, a hub, a gateway, a mobile phone, and so forth. In an embodiment of the invention, the access device 116 may not have, or may have limited, display capability. The access device 116 may include a plurality of ports 722 for connecting to the network 104 and/or the display device 118. Examples of the ports 722 may include, but are not limited to, parallel ports, serial ports, DB-2 connectors, IEEE 1284 ports, IEEE 1394 ports, 8P8C ports, PS/2 ports, RS-232 ports, Registered Jack (RJ) 45 ports, RJ 48 ports, VGA ports, Small Computer System Interface (SCSI) ports, USB ports, DB-25 ports, and so forth. The access device 116 may be connected to a display device 118. Further, the access device 116 may connect to the remote devices 106a-n through the network 104. The access device 116 may access and control the remote devices 106a-n and the services 202a-n. In an embodiment of the invention, the access device 116 may have a unique access device identity (ID). The access device 116 may be authorized based on this unique access device ID. [0155] The access device 116 can connect to the network 104 through a network interface 714. An Input/Output (IO) interface 716 of the access device 116 may be configured to connect to external or peripheral devices such as a memory card 718a, a keyboard 718b, a mouse 718c, and a Universal Serial Bus (USB) device 718d. Although not shown, various other devices can be connected through the IO interface 716 to the access device 116. In an embodiment of the invention, the access device 116 may be connected to a hub or gateway device that provides various services such as voice communication, network access, television services, and so forth. For example, the hub may be a Home Gateway device that acts as a hub between the access device and the network 104. [0156] The access device 116 may use the screen of the display device 118 to output graphical information to the user of the access device 116. Further, the access device 116 may include a memory 704 to store various programs, data and/or instructions that can be executed by a processor 702.
Examples of the memory 704 include, but are not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a hard disk, and so forth. A person skilled in the art will appreciate that other types of computer-readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, and the like, may also be used by the access device 116. The memory 704 may store a graphical user interface (GUI) 706 for accessing the visual access menus of the remote devices 106a-n and/or the services 202a-n. The GUI may provide an interface to the user(s) to access the visual access menus or enhanced visual access menus. In an embodiment of the invention, the GUI may be used to configure or create the Internet of Things menus. The Internet of Things menu may include representations of one or more recognizable or identifiable objects such as, but not limited to, the remote devices 106a-n or services in an Internet or network like structure. The one or more identifiable objects may be physical or virtual objects. [0157] The memory 704 may include a database 708 to store the visual access menus or the Internet of Things menus corresponding to the remote devices 106a-n and/or the services 202a-n. Further, the database 708 may store user preferences related to the remote devices 106a-n and the services 202a-n. Further, the database 708 may store the alert and reminder messages. In an embodiment of the invention, the database 708 may store information about the services 202a-n. Further, the database 708 may be updated at a predefined time interval. For example, the database 708 may be updated after every 4 days, once in a week, monthly, and so forth. In an embodiment of the invention, the updates related to the visual access menus and the remote devices 106a-n or services 202a-n may be received from the server 114 as shown in FIG. 2B. In an embodiment of the invention, the updates may be received from the network 104. [0158] Further, the memory 704 may store an application such as a VMThings 710 to connect to the remote devices 106a-n and the services 202a-n remotely. Further, the VMThings 710 may connect the access device 116 to the display device 118. The VMThings 710 may display a visual representation in the form of visual access menus or the Internet of Things menus of the remote devices 106a-n or services 202a-n at the display device 118. The display device 118 may further include a radio interface 712 configured for wireless communications with other devices. The user can select one or more options from the visual access menu or the Internet of Things menu to connect to a particular service. Further, the VMThings 710 may connect the user to the remote devices 106a-n or the services 202a-n based on the selection of the options. Further, the VMThings 710 may be configured to enable the device 102 to receive images, videos, and so forth related to the remote devices 106a-n or services 202a-n irrespective of their location. In an embodiment of the invention, the VMThings 710 may be implemented as software or firmware or hardware or a combination of these at the access device 116. [0159] In an embodiment of the invention, the display device 118 may include a touch sensitive screen. Therefore, the user can provide inputs or may select an option from the visual access menu or the Internet of Things menu by touching the screen of the display device 118 or by point and click using the mouse 718c.
The user can interact with the visual access menu or the Internet of Things menu by pressing a desired key or combination of keys on the keyboard 718b. For example, the user can press the '3' key on the keyboard 718b to select a node 3 in the visual access menu or the Internet of Things menu. Further, the user can directly select
the node 3 of the visual access menu or the Internet of Things menu, in the case of a touch sensitive screen. [0160] Further, the size of the visual access menu or the Internet of Things menu may differ depending on the number of service options. As a result, all the service options of the visual access menu or the Internet of Things menu may not be displayed together on the screen of the display device 118. In such a case, the VMThings 710 may allow the user to navigate by scrolling horizontally and/or vertically to view various service options in the visual access menu or the Internet of Things menu. Further, the VMThings 710 may detect the capability of the screen of the display device 118 before displaying the visual access menu or the Internet of Things menu. For example, in case the display device 118 is a basic mobile phone with limited functionality of the display screen, various device options or the service options of the enhanced visual access menu or the Internet of Things menu may be displayed as a list including one or more options. [0161] In an embodiment of the invention, the database 708 may be updated based on the feedback of the one or more users or based on error reports received from the other sources. In an embodiment of the invention, the VMThings 710 may update the database 708 based on crowd sourcing. That is, the database 708 may be updated based on feedback or reviews or thoughts of other users. For example, if 80 users out of 100 users visiting a website and accessing the visual access menus say that there is some error in the system of controlling a particular object, then based on the ratings provided by these users, the record or the menu for the particular object in the database 708 may be updated. The VMThings 710 may also learn the problem associated with the visual access menus or the device or the objects from many other sources and may find a solution based on many other users. Examples of the other sources include, but are not limited to, other network devices, the remote devices 106a-n, the services 202a-n, users, the server, and so forth. [0162] Further, the memory 704 may include other applications that enable the user to communicate/interact with the services 202a-n through the network 104. Examples of other applications include, but are not limited to, Skype, Google Talk, Magic Jack, and so forth. Other applications may be stored as software or firmware on the display device 118. Further, the memory 704 may include an Operating System (OS) (not shown) for the access device 116 to function. [0163] Though not shown, the access device 116 may include a camera, a microphone, a speaker, and so forth. In an embodiment of the invention, the display device 118 may include the camera or the speaker or the microphone, and so forth. The user may provide voice commands by using the microphone. Further, the user may provide the input or select the option by clicking an image through a camera. The user may control one or more operations of the remote devices 106a-n by making gestures or hand movements in front of the camera of the device 102. The speaker may also be used to output music and voice responses to the user. The user may input one or more keys or key combinations using the keyboard 718b. The keyboard 718b may be a physical keyboard or a virtual keyboard displayed on a touch screen display of the display device 118. In an embodiment, the keyboard 718b may be a keypad on the access device 116 or the display device 118.
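The capability check described in paragraph [0160] above, in which limited screens receive a plain list while capable screens receive graphics, might be sketched as follows; the capability labels are invented for the example and are not defined in the specification.

```python
# Hypothetical capability check: render the menu as a plain list on limited screens and
# as graphics on high-capability displays, along the lines of paragraphs [0149] and [0160].
def render_menu(options, screen_capability: str) -> str:
    if screen_capability in ("basic_phone", "fixed_line"):
        # Simple numbered list for devices with limited display functionality.
        return "\n".join(f"{i}. {opt}" for i, opt in enumerate(options, start=1))
    # Placeholder for a graphical menu on tablets, televisions, and similar screens.
    return f"[graphical menu with {len(options)} icons]"


print(render_menu(["AC", "Camera", "Microwave"], "basic_phone"))
print(render_menu(["AC", "Camera", "Microwave"], "tablet"))
```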
Subsequently, after some processing by the VMThings 710, an enhanced visual access menu corresponding to the services 202a-n based on the user inputs or selection is searched and displayed on the screen of the display device 118. [0164] In an embodiment of the invention, the VMThings 710 may be configured to recognize the context of the voice inputs received from the users or other sources. The VMThings 710 may take an action based on the context of the voice inputs. [0165] Further, the user may forward or move the display of the device to another device by providing a selection or input. In an embodiment of the invention, the VMThings 710 may forward or transfer the display from a device to another device based on the user inputs. For example, the user may transfer the visual menu displayed on his/her smart phone to another smart phone by tapping at the display of the smart phone. The input for doing so may be a voice command, a selection of one or more keys, touching the display, a gesture, and so forth. In an embodiment of the invention, the user may transfer the display from a device to a wall. [0166] In an embodiment, the memory 704 may include a web browser to display web pages from the network 104 and/or other computer networks. The user may use the web browser to open a website for accessing the visual access menu(s). In an embodiment, the user may store the login details for the website(s) at the device. Therefore, the user can connect to the services 202a-n from the web browser automatically and may not be required to enter his/her login details every time to log in to the website. [0167] In an embodiment of the invention, the database 708 may be updated based on addition or deletion of the one or more objects. For example, if a new remote device or service is added to the list of devices or services to be controlled then the visual access menu in the database may be updated accordingly. Further, the VMThings 710 may detect errors which may occur during the user interaction with the visual access menu. The VMThings 710 may also report these errors to the user. In an embodiment of the invention, the errors may occur due to some other reasons such as technical reasons, network failure, and so forth. In an embodiment of the invention, the errors may be reported in forms such as, but not limited to, a text report, images, an MMS, an SMS, an e-mail, voice messages, and so forth. In another embodiment of the invention, the VMThings 710 may maintain and store a log of errors reported and actions taken to correct them in the database 708. [0168] In an embodiment of the invention, the database 708 may be created by a human operator or an automatic application. The human operator may listen to various options of the audio menus of the one or more objects and may create a visual access menu or visual Internet of Things menus accordingly. In an embodiment of the invention, the database 708 may be created based on one or more instructions of the users by the human operator. [0169] In an embodiment of the invention, the database 708 may be created based on the information of a yellow pages directory. The plurality of objects may be categorized based on the category mentioned in the yellow pages. Further, the visual access menus or the Internet of Things menus in the database may be created based on the categories of the objects according to the yellow pages.
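The error log mentioned in paragraph [0167], which records reported errors together with the corrective actions taken, might be sketched as below; the class, method names, and sample entries are hypothetical.

```python
# Hypothetical error log: record errors detected during menu interaction together with
# the corrective action taken, along the lines of paragraph [0167].
import datetime


class ErrorLog:
    def __init__(self) -> None:
        self._entries = []

    def report(self, error: str, action_taken: str) -> None:
        self._entries.append({
            "time": datetime.datetime.now().isoformat(timespec="seconds"),
            "error": error,
            "action": action_taken,
        })

    def entries(self):
        return list(self._entries)


log = ErrorLog()
log.report("network failure while loading menu", "retried and reloaded menu from server")
print(log.entries())
```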
[0170] FIG. 8 illustrates a flowchart for controlling remote devices when the visual access menus or the Internet of Things menus are accessed through an access device, in accordance with an embodiment of the invention. As discussed
with reference to FIGS. 1A and 2A, the user of the device such as a smart phone may connect to a plurality of objects in the network such as remote devices and services. In an embodiment of the invention, the objects may be a combination of the remote devices and services. Further, the device may control one or more operations of the remote devices. The device may include an Internet of Things application such as a VMThings configured to display graphical information to the user. The VMThings may display visual access menus (or enhanced visual access menus) or the Internet of Things menus at the device for controlling remote devices or services irrespective of the location of the remote devices or services. In an embodiment of the invention, the Internet of Things menu may include representations of one or more recognizable or identifiable objects such as, but not limited to, remote devices or services in an Internet or network like structure. The one or more identifiable objects may be physical or virtual objects. In an embodiment of the invention, a graphical user interface (GUI) may be used by the user for creating the Internet of Things menu. The objects may be the remote devices or services. In an embodiment of the invention, the device may be connected to a display device such as an LCD screen, a TV, an LED screen, a projector screen, and so forth. In an embodiment of the invention, the device or remote devices may be connected to each other through a local network such as a wireless network like Bluetooth, an RF4CE network, and so forth, or through a wired network like a Local Area Network (LAN). [0171] At step 802, a database including visual access menus may be accessed through a graphical user interface (GUI) at the device. In an embodiment of the invention, the GUI may be accessed at the device by the user. At step 804, a visual access menu or the Internet of Things menu may be displayed at the device. In an embodiment of the invention, the VMThings may display the visual access menus and the Internet of Things menu at the device. The visual access menu may include one or more options such as, but not limited to, a remote devices option, a services option, and so forth. The user may select an option from these options. The VMThings may receive an input from the user. The input may be a selection of an option by the user. In an embodiment of the invention, the device may include a touch sensitive screen. In an embodiment of the invention, the user may select an option by touching the screen of the device. In another embodiment of the invention, the user may select an option by making a gesture or hand movement or through a voice command. The gestures, hand movements or the voice commands may be detected by the display device. In an embodiment of the invention, the VMThings may detect the gestures or hand movements or the voice commands. Further, the VMThings of the device may understand and accept voice inputs from the user in different languages irrespective of the device language. Therefore, the user may control the remote devices by giving voice commands in different languages such as, but not limited to, English, Spanish, French, Hindi, Chinese, Japanese, Hawaiian, German, and so forth.
[0172] At step 806, an enhanced visual access menu or an enhanced Internet of Things menu for remote devices, based on a selection of an option by a user, may be displayed at the display device when the user selects the remote devices option from the visual access menu. The enhanced visual access menu for devices may include one or more device options. In an embodiment of the invention, the VMThings of the device may display a visual access menu or an enhanced visual access menu or an Internet of Things menu in different languages. Further, where the device or the remote devices may have one language and the user may want to control and communicate in a different language, the user may do this via the VMThings application. The user may select a device option from these device options. At step 808, a selection of a device option may be received from the user. The user may provide the selection by touching the screen of the display device or by making some gestures or through hand movements in front of the display device or the access device. In an embodiment of the invention, the user may select a device option through a voice command or instruction. [0173] At step 810, the user may be connected to a remote device based on the selection of a device option. In an embodiment of the invention, the VMThings may also check whether the remote device corresponding to the device option selected by the user is registered to be monitored by the user or not. In another embodiment of the invention, the user may be required to authenticate his/her identity before accessing or connecting to the remote devices 106a-n. Thereafter, at step 812, the user may control one or more operations of the remote device based on the selection of the device option. For example, the user may view real time pictures of the remote device, the user may switch on the remote device, and so forth. [0174] FIG. 9 illustrates a flowchart for controlling services when the visual access menus are accessed through an access device, in accordance with an embodiment of the invention. As discussed with reference to FIGS. 1C and 2C, the services may be accessed and/or controlled by using an access device. At step 902, a graphical user interface (GUI) for accessing or creating an Internet of Things menu or a visual access menu may be displayed at the device. In an embodiment of the invention, the VMThings may display the GUI at the device. In an embodiment of the invention, the GUI may be accessed or opened by the user of the device. The visual access menu or the Internet of Things menu may include one or more options such as, but not limited to, a remote devices option and a services option. The user may select any of these options. [0175] At step 904, an input including an option selected by the user is received at the device. In an embodiment of the invention, the device may include a touch sensitive screen. In another embodiment of the invention, the user may select an option by making a gesture or hand movement or through a voice command. The gestures may be such as, but not limited to, a thumb up, a head nod, a smile, laughter, a thumb down, showing two fingers, and so forth. In an embodiment of the invention, the VMThings of the device may detect the gestures or hand movements or the voice commands and may receive a selection of the option. Further, the VMThings of the device may understand and accept voice inputs from the user in different languages irrespective of the device language.
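Purely as a non-limiting sketch, the remote-device flow of steps 806-812 above (display device options, receive a selection, check registration, connect, control) could be outlined as follows; every function and device name is a hypothetical stub, not part of the claimed method.

```python
# Hypothetical walk-through of the steps 806-812 flow; all names are illustrative stubs.
def display_device_options(devices):
    return sorted(devices)                                   # step 806: enhanced menu


def control_remote_device(selection: str, registered: set) -> str:
    if selection not in registered:                          # registration/authentication check
        return "selection refused: device not registered for this user"
    return f"connected to {selection}; operations can now be controlled"  # steps 810-812


options = display_device_options({"AC", "camera", "garage door"})
print(options)
print(control_remote_device("garage door", {"AC", "camera", "garage door"}))
```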
[0176] At step 906, an enhanced visual access menu or an enhanced Internet of Things menu for services based on a selection of an option by a user may be displayed at the device when the user selects the services option from the visual access menu. The enhanced visual access menu for services may include one or more service options. In an embodiment of the invention, the VMThings of the device may display the enhanced visual access menu in different languages as per the user's instruction or convenience. Further, the device or the remote devices may have one language and the user may control and communicate in a different language via the
VMThings. In such a scenario, the VMThings may display the visual access menu at the device in a language(s) preferred by the user. The VMThings will perform the required language translation. In an embodiment of the invention, the VMThings may display more than one visual access menu at the screen of the device. The multiple visual access menus may be displayed in different languages. The user may select a service option from these service options. At step 908, a selection of a service option may be received from the user. In an embodiment of the invention, the user may select a service option through a voice command or instruction. [0177] At step 910, the user may be connected to a service based on the selection of the service option. The VMThings may also check whether the information for the selected service option is available at the device. If the information is not available, then the information may be requested and/or received from a server. Thereafter, at step 912, information about the service may be displayed at the display device based on the selection of the service option. The user may interact with the information accordingly. In an embodiment of the invention, the information may include text, graphics, audio, video, or hyperlinks. [0178] FIGS. 10A, 10B, and 10C illustrate a flowchart for controlling objects by using a device in a network, in accordance with an embodiment of the invention. As discussed with reference to FIGS. 1A and 2A, the user of the device such as a smart phone may connect to and control various objects in the network. In an embodiment of the invention, the objects may include remote devices such as a car, a washing machine, a door, a truck, and so forth. In another embodiment of the invention, the objects may be services such as entertainment, banking, hotels, and so forth as described in FIG. 2A-I. In yet another embodiment of the invention, the objects may be a combination of the remote devices and services. Further, the device may control one or more operations of the remote devices. The user at the device may also view information about various services. The device may include an Internet of Things application, i.e., a VMThings, configured to display graphical information at the device. In an embodiment of the invention, the VMThings may display the visual access menus at the device for controlling remote devices or services irrespective of the location of the remote devices or services. [0179] At step 1002, a graphical user interface (GUI) for accessing or configuring an Internet of Things menu or a visual access menu may be displayed at the device. In an embodiment of the invention, the VMThings may display the GUI at the device. In an embodiment of the invention, the GUI may be opened by the user of the device. The visual access menu may include one or more options such as, but not limited to, a remote devices option and a services option. The user may select any of these options. [0180] At step 1004, an input including an option selected by the user is received at the device. At step 1006, it is checked whether the input is for accessing services. The input is for accessing services when the user selects the services option. If the input is for accessing services then the process control goes to step 1014; else the process control goes to step 1008. [0181] At step 1008, it is checked whether the input is for accessing the remote devices.
In an embodiment of the invention, the input is for accessing remote devices such as a car, a microwave, a garage, doors, and so forth, when the user selects the remote devices option from the visual access menu. If the input is for accessing the remote devices then the control goes to step 1012; else the process waits for an input from the user at the device at step 1010. [0182] At step 1014, it is checked whether a visual access menu or an Internet of Things menu for services is available at the device. If not available, then at step 1016 the visual access menu of the services may be retrieved from a server in the network; else the process continues to step 1018. At step 1018, the visual access menu of the services, including one or more service options, may be displayed at the device. The service options may be graphics icons and/or text representing services. The user may select an option(s) from the service options. At step 1020, a selection of a service option may be received from the user at the device. Thereafter, at step 1024, it is checked whether information corresponding to the selected service option is available at the device. If not available, the information may be requested and received from the server at step 1024. Then, at step 1026, the information may be displayed at the device based on the received selection of the service option. For example, the user may check his/her credit card bill through the banking service option and may also learn different ways of making the payment and information about a nearby payment office. [0183] When at step 1008 the input is for accessing the remote devices, then at step 1012 it is checked whether a visual access menu for remote devices is available at the device. If not available, then the visual access menu of the remote devices is retrieved from the server at step 1028. Then, at step 1030, the visual access menu including one or more device options may be displayed at the device. The device options may be graphics icons and/or text representing remote devices. The user may select a device option(s) from the visual access menu of the remote devices. At step 1032, a connection between the device and a remote device is established based on the received selection. Thereafter, the user may control the remote device(s) irrespective of the location of the remote devices. [0184] FIG. 11 illustrates a flowchart for controlling remote devices while accessing the visual access menu or the Internet of Things menu through a web browser, in accordance with an embodiment of the invention. As discussed with reference to FIGS. 1B and 2B, the user of the device 102 may access the remote devices and/or services by using a web browser such as Google Chrome or Internet Explorer at the device. In an embodiment of the invention, the user may access the web browser at the access device connected to the display device. [0185] At step 1102, the user may open a website through a web browser at the device. The user may open the website by entering a Uniform Resource Locator (URL) of a website at the web browser. The website may allow the user to access visual access menus. In an embodiment of the invention, the website is displayed at the display device. At step 1104, the user may authenticate his/her identity by entering one or more details in one or more fields on the web page. The VMThings may check whether the user is an authorized user or not based on a unique user ID of the user. The VMThings may store the user IDs at the device.
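As a minimal, non-limiting sketch of the unique-user-ID authorization check described above, the following assumes an invented set of stored user IDs; it is not the specification's authentication mechanism, only an illustration of the idea.

```python
# Illustrative authorization check against a stored list of user IDs; the IDs are hypothetical.
AUTHORIZED_USER_IDS = {"user-001", "user-002"}


def is_authorized(user_id: str) -> bool:
    """Return True only for users registered to access the remote devices or services."""
    return user_id in AUTHORIZED_USER_IDS


print(is_authorized("user-001"))   # True  -> display the visual access menu
print(is_authorized("guest-999"))  # False -> ask the user to register or re-authenticate
```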
In an embodiment of the invention, the website may maintain the database of user IDs authorized to access the remote devices or the services. At step 1106, a visual access menu including one or more options is displayed at the device. In an embodiment of the invention, an Internet of Things menu may be displayed. The Internet of Things menu may include representations or icons of one or more recognizable or identifiable objects such as, but not
limited to, remote devices 106a-n or services in an Internet or network like structure. In an embodiment of the invention, the VMThings may display the visual access menu or the Internet of Things menu at the device. In another embodiment of the invention, the VMThings may display the visual access menu at the display device connected to the access device. The one or more options can be such as a remote devices option, a services option, and so forth. The user may select an option from these options. At step 1108, an input regarding the selection of the option may be received from the user at the device. [0186] At step 1110, an enhanced visual access menu for the remote devices may be displayed at a screen of the device or the web browser when the user selects the remote devices option from the visual access menu. In an embodiment of the invention, an enhanced Internet of Things menu for the remote devices may be displayed at a screen of the device or the web browser when the user selects the remote devices option from the visual access menu. As shown in FIG. 3C, the display of the device may switch based on the selection of the option. In an embodiment of the invention, the enhanced visual access menu or the Internet of Things menu for the remote devices may be retrieved from the server. The enhanced visual access menu for the remote devices may include one or more device options. In an embodiment of the invention, the enhanced Internet of Things menu for the remote devices may include one or more representations corresponding to the remote devices. The user may select a device option from the displayed enhanced visual access menu of the remote devices. Each device option may represent a remote device which the user can control. Further, the options, service options, and device options may be represented as graphics and/or text on the visual access menus. At step 1112, a selection of a device option may be received from the user at the device. In an embodiment of the invention, the VMThings may detect the selection received from the user. In an embodiment of the invention, the user may select the device option by touching the device option at the display of the device. In an embodiment of the invention, the user may provide the selection of the device option through voice inputs or commands and/or gestures or hand movements such as, but not limited to, a thumb up, a head nod, and so forth. Further, the voice inputs or commands may be in different languages such as English, Spanish, and so forth. The VMThings may detect, understand and translate the voice commands into a language which can be understood by the device. [0187] At step 1114, a connection between the device and the remote device(s) is established by the VMThings. Thereafter, at step 1116, the user may control one or more operations of the connected remote devices irrespective of their location. For example, the user may switch on an AC located at his/her home while driving back home. In an embodiment of the invention, the VMThings at the device may change the voice commands into text and may respond or control the remote devices accordingly.
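The multilingual voice-command handling described in paragraphs [0186] and [0187] might be sketched, purely for illustration, with a tiny phrase lookup; a real system would rely on speech recognition and a translation service, and the phrases below are invented.

```python
# Hypothetical handling of multilingual voice commands: translate a recognized phrase into
# the device's language before dispatching it; the lookup table is illustrative only.
TRANSLATIONS = {
    ("es", "enciende el aire acondicionado"): "switch on the AC",
    ("en", "switch on the ac"): "switch on the AC",
}


def to_device_command(language: str, phrase: str) -> str:
    return TRANSLATIONS.get((language, phrase.lower()), "command not recognized")


print(to_device_command("es", "Enciende el aire acondicionado"))  # switch on the AC
print(to_device_command("en", "Switch on the AC"))                # switch on the AC
```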
[0188] FIG. 12 illustrates a flowchart for controlling services while accessing the visual access menu through a web browser, in accordance with an embodiment of the invention. As discussed with reference to FIGS. 1B and 2B, the user of the device 102 may access the services by using a web browser such as Google Chrome or Internet Explorer at the device. In an embodiment of the invention, the user may access the web browser at the access device connected to the display device. [0189] At step 1202, the user may open a website through a web browser at the device. The user may open the website by entering a Uniform Resource Locator (URL) of a website at the web browser such as Google Chrome. The website may allow the user to access visual access menus. In an embodiment of the invention, the website is displayed at the display device. At step 1204, the user may authenticate his/her identity by entering one or more details in one or more fields on the web page. At step 1206, a visual access menu including one or more options is displayed at the device. In an embodiment of the invention, an Internet of Things menu may be displayed at the device. In an embodiment of the invention, the VMThings may display the visual access menu at the device. In another embodiment of the invention, the VMThings may display the visual access menu at the display device connected to the access device. The user may select an option from the options such as a remote devices option or the services option of the visual access menu. At step 1208, an input from the user may be received at the device. [0190] At step 1210, an enhanced visual access menu for the services may be displayed at a screen of the device or the web browser when the user selects the services option from the visual access menu. In an embodiment of the invention, an enhanced Internet of Things menu for the services may be displayed at a screen of the device or the web browser when the user selects the services option from the Internet of Things menu. As shown in FIG. 3D, the display of the device may switch based on the selection of the option. In an embodiment of the invention, the enhanced visual access menu or the enhanced Internet of Things menu for the services including the one or more service options may be retrieved from the server. The user may select a service option from the displayed enhanced visual access menu of the services. Each service option may represent a service. At step 1212, a selection of a service option may be received from the user at the device. In an embodiment of the invention, the VMThings may detect the selection received from the user. In an embodiment of the invention, the user may select the service option by touching the service option at the display of the device. In an embodiment of the invention, the user may provide the selection of the service option through voice inputs or commands and/or gestures or hand movements such as, but not limited to, a thumb up, a head nod, and so forth. Further, the voice inputs or commands may be in different languages such as English, Spanish, and so forth. The VMThings may detect, understand and translate the voice commands into a language which can be understood by the device or the services. [0191] At step 1214, a connection between the device and the remote device(s) may be established by the VMThings. Thereafter, at step 1216, the user may control one or more operations of the connected remote devices irrespective of their location. For example, the user may switch on an AC located at his/her home while driving back home. In an embodiment of the invention, the VMThings at the device may change the voice commands into text and may respond to or access the services accordingly. Further, the VMThings may store the voice commands in different languages at the device (or the access device).
The VMThings also stores the list of actions corresponding to the various voice commands, gestures, hand movements, and so forth.

[0192] FIGS. 13A, 13B, and 13C illustrate a flowchart for controlling objects in a network while accessing the visual access menu through a web browser, in accordance with an embodiment of the invention. As discussed with reference to
FIGS. 1B and 2B, the user of the device 102 may access various objects such as, but not limited to, remote devices and/or services by using a web browser such as Google Chrome or Internet Explorer at the device. In an embodiment of the invention, the user may access the web browser at the access device connected to the display device.

[0193] At step 1302, the user may open a website through a web browser at the device. The user may open the website by entering a Uniform Resource Locator (URL) of the website at the web browser. The website may allow the user to access visual access menus. In an embodiment of the invention, the website is displayed at the display device. At step 1304, the user may authenticate his/her identity by entering one or more details in one or more fields on the web page. At step 1306, a visual access menu comprising one or more options is displayed at the device. In an embodiment of the invention, the VMThings may display the visual access menu at the device. In another embodiment of the invention, the VMThings may display the visual access menu at the display device connected to the access device. The one or more options can be, for example, a remote devices option, a services option, and so forth. The user may select an option from these options. At step 1308, an input from the user may be received at the device. Then at step 1310, it is checked whether the input is for accessing services. If the outcome of step 1310 is true, then the control goes to step 1316; else step 1312 is followed.

[0194] At step 1312, it is checked whether the input received at step 1308 is for accessing remote devices. If true, then the control goes to step 1330; else the process waits for an input from the user at step 1314. At step 1316, it is checked whether an enhanced visual access menu for services is available at the device. If the enhanced visual access menu is not available, then at step 1318 the enhanced visual access menu may be retrieved from the server; else step 1320 is executed. Then at step 1320, the enhanced visual access menu including one or more service options, such as for banking, entertainment, etc., is displayed at the device. The user may select a service option from the service options. At step 1322, a selection of a service option from the user may be received. Then at step 1324, it is checked whether information for the selected service option is available at the device. If not available, then the information may be requested and received from the server. Then at step 1328, the information may be displayed at the device based on the received selection.

[0195] If at step 1312 the input is for accessing the remote devices, then at step 1330, it is checked whether an enhanced visual access menu for the remote devices is available at the device. If not available, then at step 1332 the enhanced visual access menu for the remote devices, including the one or more device options, may be retrieved from the server; else step 1334 may be executed. At step 1334, the enhanced visual access menu including the device options may be displayed at the device or the web browser. In an embodiment of the invention, the enhanced visual access menu may be displayed at the display device connected to the device or the access device.

[0196] The user may select a device option from the displayed enhanced visual access menu of the remote devices. Each device option may represent a remote device.
Further, the options, service options, and device options may be represented as graphics and/or text on the visual access menus. At step 1336, a selection of a device option may be received from the user. In an embodiment of the invention, the user may select the device option by touching the device option at the display of the device. In an embodiment of the invention, the user may provide the selection of the device option through voice inputs or commands and/or gestures or hand movements such as, but not limited to, a thumbs up, a head nod, and so forth. The VMThings may detect, understand and translate the voice commands into a language which can be understood by the device. In an embodiment of the invention, the VMThings at the device may change the voice commands into text and may respond to or control the remote devices accordingly.

[0197] At step 1338, a connection between the device and the remote device(s) is established by the VMThings. Thereafter, at step 1340, the user may control one or more operations of the connected remote devices irrespective of their location. For example, the user may switch on an AC located at his/her home while driving back home.

[0198] FIG. 14 illustrates a flowchart diagram for controlling the remote devices through a website, in accordance with another embodiment of the invention. At step 1402, the user may open a website through a web browser at the device. The website is for accessing the remote devices or visual access menus corresponding to the remote devices. The user may open the website by entering a Uniform Resource Locator (URL) of the website in the web browser. The website may allow the user to access visual access menus of the remote devices (or services, as explained in FIG. 12). In an embodiment of the invention, the website is displayed at the display device. Each of the remote devices may have an associated unique ID. Similarly, the device may also have a unique device ID. The remote devices are registered with the device. Further, the user may have to register himself/herself so as to be able to access the remote devices.

[0199] At step 1404, a visual access menu including one or more options may be displayed at the device. In an embodiment of the invention, the VMThings may display the visual access menu at the device. In another embodiment of the invention, the VMThings may display the visual access menu at the display device connected to the access device. The one or more options can be, for example, a remote devices option, a services option, and so forth. The user may select an option from these options. At step 1406, an input including a selection of the option may be received at the device from the user.

[0200] At step 1408, an enhanced visual access menu for the remote devices may be displayed at a screen of the device or as the web page when the user selects the remote devices option from the visual access menu. As shown in FIG. 3C, the display of the device may switch based on the selection of the option. In an embodiment of the invention, the enhanced visual access menu for the remote devices, including the one or more device options, may be retrieved from the server. The user may select a device option from the displayed enhanced visual access menu of the remote devices. Each device option may represent a remote device which can be controlled. Further, the options, service options, and device options may be represented as graphics and/or text on the visual access menus.
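Purely as an illustrative sketch of the registration scheme described for FIG. 14 (unique remote-device IDs, a unique device ID, and registered users), the following Python fragment shows one possible bookkeeping structure. The class and field names are assumptions, not the patent's API.

# Illustrative registry: each remote device has a unique ID, the controlling
# device has its own ID, and a remote device can be controlled only if it is
# registered with that device and the user is registered as well.
class DeviceRegistry:
    def __init__(self, device_id: str):
        self.device_id = device_id                 # unique ID of the user's device
        self.remote_devices: dict[str, str] = {}   # remote ID -> description
        self.registered_users: set[str] = set()

    def register_remote(self, remote_id: str, description: str) -> None:
        self.remote_devices[remote_id] = description

    def register_user(self, user_id: str) -> None:
        self.registered_users.add(user_id)

    def can_control(self, user_id: str, remote_id: str) -> bool:
        return user_id in self.registered_users and remote_id in self.remote_devices


registry = DeviceRegistry(device_id="phone-123")
registry.register_remote("ac-01", "Living-room AC")
registry.register_user("john")
assert registry.can_control("john", "ac-01")
assert not registry.can_control("marie", "ac-01")   # not yet registered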
[0201] At step 1410, a selection of a device option may be received from the user at the device. In an embodiment of the invention, the VMThings may detect the selection received from the user. In an embodiment of the invention, the user may select the device option by touching the device option at the display screen of the device. In an embodiment of the invention, the user may provide the selection of the device option through voice inputs or commands and/or gestures or hand
movements such as, but not limited to, a thumbs up, a head nod, and so forth. Further, the voice inputs or commands may be in different languages such as English, Spanish, and so forth. The VMThings may detect, understand and translate the voice commands into a language which can be understood by the device. At step 1412, a connection between the device and the remote device(s) is established by the VMThings. Thereafter, at step 1414, the user may control one or more operations of the connected remote devices irrespective of their location. For example, the user may switch on an AC located at his/her home while driving back home. In an embodiment of the invention, the VMThings at the device may change the voice commands into text and may respond to or control the remote devices accordingly.

[0202] FIG. 15 illustrates a flowchart for controlling remote devices when the visual access menus are accessed through an access device, in accordance with an embodiment of the invention. As discussed with reference to FIGS. 1C and 2C, the remote devices may be controlled by using an access device. The access device may be any communication device capable of connecting to a network or a local network. In an embodiment of the invention, the access device may have limited display capabilities or no display capabilities. Examples of the access device include, but are not limited to, a set top box, a home gateway, a hub, a router, a bridge, a mobile phone, a smart phone, a printer, a scanner, a computer, a PDA, a pager, a watch, a tablet computer, a music player, an iPod, a telephone, and so forth. The access device may include an Internet of Things application such as a VMThings application for displaying visual access menus for controlling the remote devices or services at the display device. The access device may be connected to a display device such as an LCD screen, a projector screen, a television, and so forth. The display device may be a device including a display (or a large display screen). The access device may further include an application VMThings configured to display visual access menus and information to the user. In an embodiment of the invention, the access device may act as the device itself. In another embodiment of the invention, the device may also be connected to the display device.

[0203] At step 1502, a database including visual access menus may be accessed through a graphical user interface (GUI) at the access device. In an embodiment of the invention, the GUI may be accessed via the access device by the user. At step 1504, a visual access menu may be displayed at the display device. In an embodiment of the invention, the VMThings may display the visual access menus at the display device. The visual access menu may include one or more options such as, but not limited to, a remote devices option, a services option, and so forth. The user may select an option from these options. The VMThings may receive an input from the user. The input may be a selection of an option by the user. In an embodiment of the invention, the display device may include a touch sensitive screen. In an embodiment of the invention, the user may select an option by touching the screen of the display device. In another embodiment of the invention, the user may select an option by making a gesture or hand movement or through a voice command. The gestures, hand movements or the voice commands may be detected by the display device. In an embodiment of the invention, the VMThings of the access device may detect the gestures or hand movements or the voice commands. Further, the VMThings of the access device may understand and accept voice inputs from the user in different languages irrespective of the device language. Therefore, the user may control the remote devices by giving voice commands in different languages such as, but not limited to, English, Spanish, French, Hindi, Chinese, Japanese, Hawaiian, German, and so forth.
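The multilingual voice handling described above can be pictured, for illustration only, with the toy Python sketch below: recognized phrases in several languages are mapped to one action the device understands. A real implementation would use speech recognition and machine translation; the phrase table and function name here are stand-in assumptions.

# Toy sketch: map voice commands in several languages to a single
# device-language action (a stand-in for the VMThings translation step).
PHRASE_TO_ACTION = {
    "switch on the lights": "lights:on",     # English
    "enciende las luces": "lights:on",       # Spanish
    "allume les lumieres": "lights:on",      # French
    "switch off the lights": "lights:off",
    "apaga las luces": "lights:off",
}


def interpret_voice_command(phrase: str) -> str | None:
    """Return a device-language action for a recognized phrase, else None."""
    return PHRASE_TO_ACTION.get(phrase.strip().lower())


print(interpret_voice_command("Enciende las luces"))   # -> "lights:on"
print(interpret_voice_command("open sesame"))          # -> None (unrecognized)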
[0204] At step 1506, an enhanced visual access menu for remote devices may be displayed at the display device when the user selects the remote devices option from the visual access menu. The enhanced visual access menu for devices may include one or more device options. In an embodiment of the invention, the VMThings of the access device may display the visual access menu or the enhanced visual access menu in different languages. Further, where the access device or the remote devices use one language and the user wants to control and communicate in a different language, the user may do so via the VMThings application. The user may select a device option from these device options. At step 1508, a selection of a device option may be received from the user. The user may provide the selection by touching the screen of the display device or by making gestures or hand movements in front of the display device or the access device. The gestures may be such as, but not limited to, a thumbs up, a head nod, a smile, laughter, a thumbs down, showing two fingers, and so forth. In an embodiment of the invention, the user may select a device option through a voice command or instruction.

[0205] At step 1510, the user may be connected to a remote device based on the selection of a device option. In an embodiment of the invention, the VMThings may also check whether the remote device corresponding to the device option selected by the user is registered to be monitored by the user. Thereafter, at step 1512, the user may control one or more operations of the remote device based on the selection of the device option. For example, the user may view real-time pictures of the remote device, the user may switch on the remote device, and so forth.

[0206] FIG. 16 illustrates a flowchart for controlling services when the visual access menus are accessed through an access device, in accordance with an embodiment of the invention. As discussed with reference to FIGS. 1C and 2C, the services may be accessed and/or controlled by using an access device. At step 1602, a database including visual access menus may be accessed through a graphical user interface (GUI) at the access device. In an embodiment of the invention, the GUI may be accessed via the access device by the user.

[0207] At step 1604, a visual access menu may be displayed at the display device. In an embodiment of the invention, the VMThings of the access device may display the visual access menus at the display device. The visual access menu may include one or more options such as, but not limited to, a remote devices option, a services option, and so forth. The user may select an option from these options. The VMThings may receive an input from the user. The input may be a selection of an option by the user. In an embodiment of the invention, the display device may include a touch sensitive screen.
In an embodiment of the invention, the user may select an option by touching the screen of the display device. In another embodiment of the invention, the user may select an option by making a gesture or hand movement or through a voice command. The gestures, hand movements or the voice commands may be detected by the display device. In an embodiment of the invention, the VMThings of the access device may detect the gestures or hand movements or the
voice commands. Further, the VMThings of the access device may understand and accept voice inputs from the user in different languages irrespective of the device language. Therefore, the user may control the remote devices by giving voice commands in different languages such as, but not limited to, English, Spanish, French, Hindi, Chinese, Japanese, Hawaiian, German, and so forth.

[0208] At step 1606, an enhanced visual access menu for services may be displayed at the display device when the user selects the services option from the visual access menu. The enhanced visual access menu for services may include one or more service options. In an embodiment of the invention, the VMThings of the access device may display the visual access menu or the enhanced visual access menu in different languages. Further, the access device or the remote devices may use one language while the user wants to control and communicate in a different language. The user may select a service option from these service options. At step 1608, a selection of a service option may be received from the user. In an embodiment of the invention, the user may select a service option through a voice command or instruction.

[0209] At step 1610, the user may be connected to a service based on the selection of a service option. The VMThings may also check whether the information for the selected service option is available at the device. If the information is not available, then the information may be requested and/or received from a server. Thereafter, at step 1612, information about the service may be displayed at the display device based on the selection of the service option. The user may interact with the information accordingly. In an embodiment of the invention, the information may include text, graphics, audio, video, or hyperlinks.

[0210] FIGS. 17A, 17B and 17C illustrate a flow diagram for controlling various objects in a network through an access device, in accordance with an embodiment of the invention. At step 1702, a GUI for accessing the visual access menus may be displayed at the display device. The VMThings may display the visual access menus at the display device. The visual access menu may include one or more options such as, but not limited to, a remote devices option, a services option, and so forth. The user may select from these options. At step 1704, an input from the user may be received. The input may be a selection of an option by the user. In an embodiment of the invention, the display device may include a touch sensitive screen. In an embodiment of the invention, the user may select an option by touching the screen of the display device. In another embodiment of the invention, the user may select an option by making a gesture or hand movement or through a voice command. At step 1706, it is checked whether the input is for accessing the services. If the input is for accessing services, then process control goes to step 1714; else step 1708 is executed. At step 1708, it is checked whether the input received at step 1704 is for accessing remote device(s). If the input is for accessing remote devices, then step 1712 is executed; else the process waits for input from the user at the access device.

[0211] At step 1714, it is checked whether a visual access menu of the services is available at the access device.
If the visual access menu for accessing services is available, then process control goes to step 1718; else step 1716 is executed. At step 1716, the visual access menu for accessing the services is received from a server in the network. Examples of the services may include, but are not limited to, banking services, entertainment services, tours and travel services, and so forth.

[0212] At step 1718, the visual access menu including one or more service options for accessing the services may be displayed at the screen of the display device. The user may select a service option from these service options. At step 1720, a selection of a service option may be received from the user. The user may provide the selection by touching the screen of the display device or by making gestures in front of the display device or the access device. In an embodiment of the invention, the user may select a service option through a voice command or instruction.

[0213] At step 1722, it is checked whether the information for the selected service option is available at the device. If the information is not available, then the information may be requested and/or received from the server at step 1724; else step 1726 is executed. At step 1726, the information of the selected services may be displayed at the display device. Thereafter, the user may interact with the visual access menu for accessing services accordingly.

[0214] If at step 1708 the input is for accessing the remote devices, then step 1712 is executed. At step 1712, it is checked whether a visual access menu of the remote devices is available at the access device. If the visual access menu for the remote devices is available, then step 1730 is executed; else the visual access menu of the remote devices is retrieved from the server at step 1728. At step 1730, the visual access menu including one or more device options is displayed at the display device. The device options may be graphic icons and/or text representing remote devices. The user may select a device option(s) from the visual access menu of the remote devices. At step 1732, a connection between the device and a remote device is established based on the received selection. Thereafter, the user may control the remote device(s) irrespective of a location of the remote devices. For example, the user sitting in his/her office may regulate the temperature of the microwave located at home without being physically present at home.
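The check-locally-then-fetch pattern that recurs in FIGS. 13 and 17 (steps 1714-1716 and 1712/1728) can be sketched, for illustration only, as the small Python cache below. The MenuStore class, the menu names, and the stand-in server lookup are assumptions introduced for the example.

# Hedged sketch: if a visual access menu is not available locally on the
# access device, retrieve it from a server; otherwise use the cached copy.
class MenuStore:
    def __init__(self):
        self._local: dict[str, list[str]] = {}   # menu name -> options

    def _fetch_from_server(self, name: str) -> list[str]:
        # Placeholder for the retrieval described at steps 1716 / 1728.
        server_copy = {"services": ["Banking", "Entertainment", "Travel"],
                       "remote_devices": ["AC", "Garage door", "Lights"]}
        return server_copy[name]

    def get_menu(self, name: str) -> list[str]:
        if name not in self._local:               # availability check
            self._local[name] = self._fetch_from_server(name)
        return self._local[name]


store = MenuStore()
print(store.get_menu("services"))   # fetched from the "server" on first use
print(store.get_menu("services"))   # served from the local copy thereafter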
[0215] FIG. 18A illustrates an exemplary display of images, in accordance with an embodiment of the invention. As discussed before, the device 102 may receive images of the remote devices 106a-n (or services 202a-n) in real time. In an embodiment of the invention, the access device 116 may receive the images of the remote devices 106a-n in real time. In an embodiment of the invention, the images may be received at pre-defined time intervals. In another embodiment of the invention, the VMThings 108 may retrieve the images in real time or based on the user's instructions. The images of more than one remote device may be displayed at the device as shown in FIG. 18A. The image display 1802 includes images of multiple remote devices 106a-n; therefore, the user does not have to connect to different remote devices individually to see their images. In an embodiment of the invention, the device 102 may receive video or audio of the remote devices 106a-n. The remote devices 106a-n are therefore registered with the device 102 (or the access device 116). The images may be received and stored at the device 102, where they can be accessed by the user at his/her convenience. Further, the remote devices 106a-n may be grouped into various categories such as, but not limited to, electronic appliances, home devices, buildings, doors, room appliances, switches, and so forth. Further, the VMThings 108 may display the images of multiple objects such as remote devices
106a-n and services 202a-n at a single interface or display. Further, the remote devices 106a-n may be grouped based on the information about the remote devices 106a-n in a yellow pages directory.

[0216] Further, the remote devices 106a-n may be grouped according to location, such as home devices, office devices, garage devices, and so forth. In an embodiment of the invention, the remote devices may be grouped based on other criteria such as, but not limited to, functions of the remote device, utility of the remote device, type of the remote device, and so forth. The VMThings 108 of the device 102 may store visual access menus and enhanced visual access menus corresponding to the remote devices based on the various categories of the remote devices 106a-n. In an embodiment of the invention, the user may be required to register at the remote devices 106a-n so as to be able to control the remote devices 106a-n from the VMThings 108. In an embodiment of the invention, the user may be required to authenticate or prove his/her identity at the device 102 or for the remote devices 106a-n before controlling one or more operations of the remote devices 106a-n. The VMThings 108 may also display the images of the multiple devices based on these groupings of the remote devices 106a-n. In an embodiment of the invention, the image display 1802 may include images of the remote devices located in the kitchen of the home. In an embodiment of the invention, the VMThings 108 may display one or more advertisements related to the content of the display 1802. Further, the advertisements may be displayed based on user preferences such as user interests, etc.

[0217] FIG. 18B illustrates transfer of an exemplary display of images from a device to another device, in an embodiment of the invention. In an embodiment of the invention, the VMThings 108 may connect a device 102a to one or more devices such as a device 102b and transfer the displayed content, such as the display 1802, from the device 102a to the device 102b. As shown in FIG. 18B, the device 102b can be a smart phone, a mobile phone, a picture frame, an LCD display, an LED display, a GPS screen, a PDA, a TV, a tablet computer, a projector screen, a computer, a laptop, and so forth. The VMThings 108 of the device 102a may transfer the display 1802 to the display of the device 102b. Therefore, the display 1802 including one or more images of the remote devices 106a-n or objects may be displayed at the device 102b. Further, the VMThings 108 may transfer any display, such as a visual access menu displayed at the device 102a or the device 102, to the device 102b. In an embodiment of the invention, the device 102b may also include an Internet of Things application such as VMThings. In an embodiment of the invention, the display 1802 is transferred to the device 102b based on at least one input from the user. Examples of the at least one input may include, but are not limited to, a touch, a voice command, a gesture, a hand movement, a selection of one or more keys at the device 102, and so forth. For example, in the case of a touch sensitive screen at the device 102a, a user may transfer the displayed content to the display of the device 102b by touching the screen of the device 102a. In an embodiment of the invention, the user may provide the selection through dual tone multi frequency (DTMF) tones. In an embodiment of the invention, the display 1802 may be transferred based on the user input to a projection screen or a wall.
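The display hand-off of FIG. 18B can be illustrated, under stated assumptions, with the short Python sketch below: the current image display is serialized on the sending device and reconstructed on the receiving one. JSON serialization and the export/import function names are assumptions; any transport (Wi-Fi, Bluetooth, and so forth) could carry the payload.

# Illustrative sketch of handing the currently displayed content
# (e.g. display 1802) from one device to another.
import json


def export_display(image_refs: list[str]) -> str:
    """Package the current image display for transfer from device 102a."""
    return json.dumps({"type": "image_display", "images": image_refs})


def import_display(payload: str) -> list[str]:
    """Reconstruct the transferred display on the receiving device 102b."""
    data = json.loads(payload)
    assert data["type"] == "image_display"
    return data["images"]


payload = export_display(["cam-kitchen.jpg", "cam-garage.jpg"])
print(import_display(payload))   # the second device now shows the same images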
[0218] FIG. 19 illustrates an exemplary display of a cockpit 1902 at the device 102, in accordance with an embodiment of the invention. The cockpit 1902 is an interface which enables a user to access various services, devices or objects. The cockpit 1902 may include a plurality of icons 1904a-n representing various objects which a user or users can access or control. The tabs 1904a-n may be icons or text or a combination of these. The cockpit 1902 may include a tab 1904a which is an icon representing an Interactive Voice Response (IVR) system. The user may select the IVR tab 1904a to access various applications and interfaces for interacting with IVR systems of various destinations. The destinations may be organizations or companies or individual services implementing IVR systems. In an embodiment of the invention, the user of the device 102 may connect to any of these destinations by dialing a telephone number of a destination. A tab 1904b is an icon corresponding to an interface for controlling the remote devices 106a-n. The user may select the Remote Devices tab 1904b for viewing an enhanced visual access menu for controlling the remote devices 106a-n. The remote devices may be home equipment, cars, doors, electronic appliances, windows, and so forth. A tab 1904c is an icon corresponding to an interface for controlling the services 202a-n. The user may select the Services tab 1904c for viewing a visual access menu for accessing or controlling the services 202a-n.

[0219] Further, the cockpit 1902 includes tabs 1904d-n representing other objects such as, but not limited to, an Outlook tab 1904d, a Calendar tab 1904e, a Personal E-mails tab 1904f, a Messengers tab 1904g, a Games tab 1904h, and so forth. The user may use the Outlook tab 1904d to check his/her professional or Outlook mails. The user may select the Calendar tab 1904e to view the calendar and to plan his/her day. The user may use the calendar tab for many other routine tasks, such as setting timings for meetings and appointments, etc. In an embodiment of the invention, the user may be connected to an online calendar when he/she selects the Calendar tab 1904e. In another embodiment of the invention, the user may be presented with an offline calendar. The user may also set reminders about meetings and occasions such as anniversaries, birthdays, etc. using the Calendar tab 1904e.

[0220] FIGS. 20A-B illustrate exemplary environments for providing access to the cockpit 1902 of a user to other users, in accordance with an embodiment of the invention. As shown in FIG. 19, a user may be presented with the cockpit 1902 for accessing various objects. Further, in an embodiment of the invention, the user may create or configure the cockpit 1902 by using various predefined controls or settings. The cockpit 1902 may include the plurality of tabs 1904a-n for enabling the user to access the various objects such as remote devices 106a-n, services 202a-n, and so forth. In an embodiment of the invention, the user may set up the cockpit 1902 according to his/her preferences such as language preferences, theme preferences, and so forth. The user may customize the cockpit 1902 according to his/her convenience or preferences.

[0221] In an embodiment of the invention, a first user of a first device 2002 may set up a cockpit such as the cockpit 1902 for accessing various objects at the first device 2002. The first device 2002 may include an IVR application VMThings 2004. The user may create the cockpit 1902 by using the VMThings 2004.
Further, the first user may provide access to the cockpit 1902 to one or more second users. The one or more second users are associated with one or more second devices such as a second device 2006. The second device 2006 may include an IVR application VMThings 2008. The VMThings 2008 may display the cockpit 1902 of
the first user at the second device 2006. In an embodiment of the invention, the first device 2002 and the second device 2006 can be portable devices capable of communicating and connecting to other devices such as the remote devices 106a-n. Examples of the first device 2002 and the second device 2006 may include, but are not limited to, a mobile phone, a smart phone, a computer, a personal digital assistant (PDA), a tablet computer, a laptop, and so forth.

[0222] Further, the first device 2002 and the second device 2006 are connected to each other through a network 104. The network 104 can be a wired network or a wireless network or a combination of these. The wireless network may use wireless technologies to provide connectivity among various devices. Examples of the wireless technologies include, but are not limited to, Wi-Fi, WiMAX, fixed wireless data, ZigBee, Radio Frequency for Consumer Electronics (RF4CE), HomeRF, IEEE 802.11, 4G or Long Term Evolution (LTE), Bluetooth, Infrared, spread spectrum, Near Field Communication (NFC), Global System for Mobile communications (GSM), and Digital Advanced Mobile Phone Service (D-AMPS). The device 102 may connect to the plurality of remote devices 106a-n through the network 104. Examples of the wired network include, but are not limited to, Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), and so forth. In an embodiment of the invention, the network 104 is the Internet.

[0223] Further, the cockpit 1902 may include visual access menus for controlling the plurality of remote devices 106a-n or services 202a-n. As shown in FIG. 20A, the first user may connect to and control the plurality of remote devices 106a-n through the network 104. Examples of the remote devices include, but are not limited to, household devices including electric lights, water pumps, generators, fans, televisions (TVs), cameras, microwaves, doors, windows, computers, garage locks, security systems, air conditioners (ACs), and so forth. In an embodiment of the invention, the plurality of remote devices 106a-n can be vehicles such as cars, trucks, vans, and so forth. Once set up, the first user may access the cockpit 1902 at the first device 2002. In an embodiment of the invention, the user may access the cockpit 1902 through a website or web browser. The user(s) may have to authenticate before accessing the cockpit. In an embodiment of the invention, the cockpit 1902 may be stored at a proxy server 2010. Further, the proxy server 2010 may also store cockpits of other users. In an embodiment of the invention, the proxy server 2010 may maintain a record of the interactions of the users with the cockpits. Further, the proxy server 2010 may include a list of users and information about access control over various cockpits. In an embodiment of the invention, the access control permissions for the cockpit 1902 may be provided to the one or more second users by the proxy server 2010. In an embodiment of the invention, the proxy server 2010 may send a message to the first user to ask for permission regarding changes made to his/her cockpit 1902 by the one or more second users. Thereafter, the cockpit 1902 may be changed or updated based on the permission from the first user. Further, the proxy server 2010 may monitor the cockpit 1902 of the first user to see whether there are unauthorized requests to control the cockpit 1902 or the remote devices 106a-n.
In case there are unauthorized requests, the proxy server 2010 may report them to the owner of the cockpit 1902, such as the first user. In an embodiment of the invention, the proxy server 2010 may report the unauthorized access to a designated security entity. Thereafter, either the designated security entity or the first user may take an action to handle the unauthorized access. For example, the first user may block the users from which the unauthorized access requests are received.

[0224] In an embodiment of the invention, the user may create or configure an Internet of Things menu including representations of one or more identifiable objects. The identifiable objects may be virtual or physical objects. The user may share the Internet of Things menu with other users such as friends or relatives.

[0225] In an embodiment of the invention, different users may request access to the cockpit 1902 of other users. In an embodiment of the invention, the one or more second users may request to get control over the first user's cockpit 1902. For example, a wife may request her husband for access to his cockpit. The one or more second users may get access to the cockpit 1902 of the first user based on the permission granted by the first user. In an exemplary scenario, the reverse control may allow a service provider to get more information about and control of the cockpit of the users. The service provider can be a telecom service provider, a grocery provider, a movie rental service provider, an Internet provider, and so forth.

[0226] FIG. 21 illustrates a flowchart diagram for providing access control of the cockpit to one or more second users, in accordance with an embodiment of the invention. As illustrated in FIGS. 20A-B, the first user may configure or customize the cockpit 1902 at the first device 2002. The first user may communicate with the one or more second users over the network 104 such as the Internet. The first device 2002 may connect to the second device 2006 through the network 104.

[0227] At step 2102, the first user may access a graphical user interface (GUI) for configuring the cockpit 1902 at the first device 2002. At step 2104, the user may configure the cockpit 1902 based on his/her one or more preferences. Examples of the preferences may include, but are not limited to, language selection, font size, selection of remote devices, favorite services, pictures, icons, themes, and so forth. For example, the user may select a color and theme for his/her cockpit 1902.

[0228] At step 2106, the first user may share the cockpit 1902 with the one or more second users. For example, the first user, such as John, may share the cockpit 1902 for managing and controlling his home devices with his wife Marie or son Paul so that they may also control the home devices. Further, the user may provide limited or full control of the cockpit 1902 to the second users. Further, control of the cockpit 1902, including different tabs representing objects such as remote devices, may be provided to different second users. In an embodiment of the invention, access to the cockpit 1902 may be provided on an event basis. For example, the first user may provide access to the second user for two days, or until Christmas. In an embodiment of the invention, the first user may provide access to the cockpit 1902 based on time, for example, for 4 hours, 3 hours, and so forth.
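For illustration only, the time-limited sharing with partial or full permissions described above could be bookkept roughly as in the Python sketch below. The CockpitShare class, the permission names ("view", "control", "modify") and the grant/allowed methods are assumptions, not the patent's mechanism.

# Sketch: grant another user time-limited access to a cockpit with a chosen
# permission set, and check whether a given action is currently allowed.
from datetime import datetime, timedelta


class CockpitShare:
    def __init__(self):
        self._grants: dict[str, tuple[set[str], datetime]] = {}

    def grant(self, user: str, permissions: set[str], duration: timedelta) -> None:
        """e.g. permissions = {"view"} or {"view", "control", "modify"}."""
        self._grants[user] = (permissions, datetime.now() + duration)

    def allowed(self, user: str, action: str) -> bool:
        if user not in self._grants:
            return False
        permissions, expires = self._grants[user]
        return action in permissions and datetime.now() < expires


share = CockpitShare()
share.grant("marie", {"view", "control"}, timedelta(days=2))
print(share.allowed("marie", "control"))   # True for the next two days
print(share.allowed("marie", "modify"))    # False: only partial control given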
[0229] In an embodiment of the invention, the first user may receive one or more alert messages about the remote devices, services or other objects of the cockpit 1902. In an embodiment of the invention, the VMThings 2004 may send these alert messages, or control of the cockpit 1902, to the first user when he/she is available. In another embodiment of the invention, the VMThings 2004 may send the alert messages, or control of the cockpit 1902, to the other second users when the first user is not available. Further, the user may set up a list of second users to whom control of the cockpit 1902 may be passed in the absence of the first user.
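The alert routing just described, i.e. deliver to the first user when available and otherwise to the configured fallback users, can be pictured with the minimal Python sketch below. The function name and the notification print are illustrative assumptions only.

# Minimal sketch of routing a cockpit alert to the owner when available,
# otherwise to the owner's configured list of second users.
def route_alert(message: str, owner: str, owner_available: bool,
                fallback_users: list[str]) -> list[str]:
    """Return the users the alert was (nominally) delivered to."""
    recipients = [owner] if owner_available else fallback_users
    for user in recipients:
        print(f"notify {user}: {message}")     # placeholder delivery
    return recipients


route_alert("Garage door left open", owner="john",
            owner_available=False, fallback_users=["marie", "paul"])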
[0230] Further, the VMThings 2008 at the second device 2006 may translate the language of the cockpit 1902 based on the language preference of the second user. In an embodiment of the invention, the VMThings 2008 may translate the cockpit 1902 of the first user based on the configuration of the second device 2006. For example, the VMThings 2008 may translate the cockpit 1902 into the Russian language if the second user understands Russian. Then at step 2110, the cockpit 1902 or a menu of the cockpit 1902 may be displayed at the second device 2006. In an embodiment of the invention, the cockpit 1902 may be downloaded at the second device 2006. Thereafter, the second user may interact with the cockpit 1902. Further, the VMThings 2008 may change the display of the second device 2006 to a menu of the shared cockpit 1902. Further, the displayed visual access menu or cockpit 1902 will be according to the second user's preference(s).

[0231] FIG. 22 illustrates a flowchart diagram for providing access control of the cockpit to one or more second users, in accordance with another embodiment of the invention. As illustrated in FIGS. 20A-B, the first user may configure or customize the cockpit 1902 at the first device 2002. The first user may communicate with the one or more second users over the network 104 such as the Internet. The first device 2002 may connect to the second device 2006 through the network 104.

[0232] At step 2202, the first user may access a graphical user interface (GUI) for configuring the cockpit 1902 at the first device 2002. The first device 2002 may be a mobile phone, a smart phone, a computer, a personal digital assistant (PDA), a tablet computer, a laptop, and so forth. At step 2204, the user may configure the cockpit 1902 based on his/her one or more preferences. Examples of the one or more preferences may include, but are not limited to, language preference, font size, preferred remote devices, favorite services, pictures, icons, themes, and so forth. For example, the user may select a font size for his/her cockpit 1902.

[0233] At step 2206, the first user may share the cockpit 1902 with the one or more second users. For example, the first user, such as John, may share the cockpit 1902 for managing and controlling his home devices with his wife Marie or son Paul so that they may also control the home devices. In an embodiment of the invention, the second users may also provide control of the cockpit 1902 to one or more third users after getting control of the cockpit 1902. The one or more second users are the users associated with one or more second devices such as the second device 2006. Further, the user may provide partial or full control of the cockpit 1902 to the second users. Further, control of the cockpit 1902, including different objects or remote devices, may be provided to the second users. Further, the access control of the objects may differ for different users. For example, the first user may provide complete control of his/her cockpit 1902, i.e. viewing, controlling and modifying permission, to a User A, and may give partial or limited control, such as only viewing and controlling permission, to a User B.

[0234] In an embodiment of the invention, access to the cockpit 1902 may be provided on an event basis. For example, the first user may provide access to the second user for two days, or until Christmas. In an embodiment of the invention, the first user may provide access to the cockpit 1902 based on time.
For example, access may be given for 4 hours, 3 hours, till 5:30 PM, and so forth.

[0235] In an embodiment of the invention, the first user may receive one or more alert messages about the remote devices, services or other objects of the cockpit 1902. In an embodiment of the invention, the VMThings 2004 may send these alert messages, or control of the cockpit 1902, to the first user when he/she is available. In another embodiment of the invention, the VMThings 2004 may send the alert messages, or control of the cockpit 1902, to the other second users when the first user is not available. Further, the user may set up a list of second users to whom control of the cockpit 1902 may be passed in the absence of the first user.

[0236] Further, the VMThings 2008 at the second device 2006 may translate the cockpit 1902 based on the language preference of the second user. For example, the VMThings 2008 may translate the cockpit 1902 into the Russian language if the second user understands Russian or wants to view the cockpit 1902 in Russian. In an embodiment of the invention, the VMThings 2008 may translate the language of the cockpit 1902 of the first user based on the configuration of the second device 2006. For example, the VMThings 2008 may translate a cockpit 1902 which is in the English language into a Russian language cockpit if the second user understands or wants to view the cockpit in Russian. Then at step 2210, the cockpit 1902 or a menu of the cockpit 1902 may be displayed at the second device 2006. Further, the VMThings 2008 may change the display of the second device 2006 to a visual menu of the shared cockpit 1902. Further, the displayed menu will be according to the second user's preference.

[0237] Thereafter, at step 2212, the one or more second users may interact with the cockpit 1902 at their respective one or more second devices. The second user(s) may view and control the one or more objects in the cockpit 1902 from the second device 2006 itself. For example, the second user may use his/her smart phone to switch off the microwave associated with the home of the first user. Further, the first user may receive notifications regarding events at the first device 2002. The events may be such as, but not limited to, switch on, switch off, theft, and so forth. In an embodiment of the invention, the first user may receive notifications about changes made by the one or more second users to his/her cockpit 1902. Further, messages asking the first user to approve these changes by the second users may be received by the first user at the first device 2002.

[0238] Further, the proxy server 2010 may maintain a record of interactions with the cockpit 1902 by different users. Further, the proxy server 2010 may have some level of control related to the sharing of the cockpit 1902 with other users. In an embodiment of the invention, the first user may provide instructions to the proxy server 2010 regarding sharing of the cockpit. The proxy server 2010 may know to whom to send the request and when to send the request if it does not work for any reason. Further, the proxy server 2010 may maintain records related to managing ownership of the control of the cockpit 1902. The proxy server 2010 may also decide to whom to give control, and how much control, of the cockpit 1902 of the first user. In an embodiment of the invention, the proxy server 2010 may decide about giving control to other users based on predefined settings received from the first user (or the users).
Further, the proxy server 2010 may save the access pattern of the first user or the one or more second users. Further, the proxy server 2010 may also store profile information of the users, such as name, age, profession, etc. Furthermore, the proxy server 2010 may provide control to the second users based on one or more parameters such as, but not limited to, time, event, availability of a user at the device, and so forth. Further, the proxy server 2010
may maintain a record of all the changes made to the cockpit 1902 by the one or more second users. In an embodiment of the invention, the first user may roll back all the changes made by the other second users based on the record of the changes maintained at the proxy server 2010.

[0239] In an embodiment of the invention, different users may request access to the cockpits of other users. In an exemplary scenario, the one or more second users may request to get control over the first user's cockpit 1902. For example, a daughter may request her mom for access to her cockpit 1902. Therefore, the one or more second users may get access to the cockpit 1902 of the first user based on the permission granted by the first user. The request for sharing the cockpit may be received by the users in the form of an SMS, MMS, instant message, e-mail, and so forth at their respective devices. The first user may provide complete access or limited access to the one or more users. In an exemplary scenario, the reverse control may allow the service provider to get more information about and control of the cockpit 1902 of users. Further, the proxy server 2010 may monitor the cockpit 1902 of the first user to see whether there are unauthorized requests to control the cockpit 1902. In case there are unauthorized requests, the proxy server 2010 may report them to the owner of the cockpit 1902, such as the first user. In an embodiment of the invention, the proxy server 2010 may report the unauthorized access to a designated security entity. In an embodiment of the invention, the proxy server 2010 may itself handle the unauthorized access requests.

[0240] At step 2214, the interactions with the cockpit 1902 of the first user may be stored at the proxy server 2010. The proxy server 2010 may store the interactions in the form of lists, records, text, audio, video, and so forth. At step 2216, the proxy server 2010 may send a message to the first user to ask for permission regarding changes made to his/her cockpit 1902 by the one or more second users. Thereafter, the cockpit 1902 may be changed or modified or updated based on the permission received from the first user.

[0241] FIG. 23 illustrates a flowchart diagram for customizing a cockpit based on a user's preference, in accordance with an embodiment of the invention. A user may create or configure a cockpit such as the cockpit 1902 as shown in FIG. 19. The cockpit 1902 may include a plurality of tabs or icons 1904a-n representing different types of objects. The cockpit 1902 may be device specific or user specific. The VMThings 108 may present a GUI for configuring the cockpit 1902 to a user at the device 102.

[0242] At step 2302, the user may access a database of visual access menus through a GUI for customizing a cockpit including multiple visual access menus corresponding to multiple objects at the device 102. The visual access menus may be visual menus for accessing one or more objects such as, but not limited to, services 202a-n, remote devices 106a-n, and so forth. The user may provide one or more inputs at the device 102. At step 2304, the VMThings 108 may search the database for a cockpit or one or more visual access menus based on the one or more inputs received from the user. The user may provide the inputs at the device by at least one of pressing one or more keys at the device 102, giving a voice command, making gestures or hand movements, touching the screen of the device 102, and so forth.
In an embodiment of the invention, the VMThings 108 may retrieve a cockpit or visual access menu matching the inputs from a server. In another embodiment of the invention, the VMThings 108 may display a message indicating that the cockpit or the visual access menu is not available at the device 102.

[0243] At step 2306, the VMThings 108 may customize the cockpit visual access menu according to the user's preference. In an embodiment of the invention, the VMThings 108 may customize one or more visual access menus or objects of the cockpit according to the user's preference. For example, the user may be interested in controlling only remote devices such as the car, garage, home doors, fans, and lights of his/her house. So, the user may be presented with a visual access menu corresponding to his/her preferred remote devices among the remote devices 106a-n. Through this visual access menu the user may access and control one or more operations of the personal remote devices. Similarly, the user may define his/her preferences for accessing the remote devices present at his/her office or factory, and so forth. Therefore, multiple visual access menus may be stored at the devices based on the preferences of the user. Examples of the preferences may include, but are not limited to, language preference, font size, selection of remote devices, favorite services, pictures, icons, themes, and so forth. For example, the user may select a color and theme for his/her cockpit to be displayed at the device 102. In an embodiment of the invention, the user may be presented with a different visual access menu when the user accesses the visual access menu from different devices. For example, when the user accesses a visual access menu to control services from his/her laptop, he may see a first visual access menu, and when the same user accesses the visual access menu from his/her smart phone, he may be presented with a second visual access menu. The purpose or functionality of the first visual access menu may be the same as that of the second visual access menu. For example, the first and the second visual access menus may be the visual menus for controlling one or more cars of the user.

[0244] Thereafter, at step 2308, a customized cockpit or the one or more visual access menus may be displayed at the device 102. In an embodiment of the invention, the visual access menu may be customized based on the user preferences received in real time. In another embodiment of the invention, the visual access menu may be customized based on predefined user preferences. In an embodiment of the invention, the customized visual access menu may be stored at the device 102 or at a server in a cloud network.

[0245] In an embodiment of the invention, a standard cockpit or visual access menu may be displayed to the user. The standard cockpit may be an interface which is not customized according to the user preferences. The standard visual access menu may be a standard menu which may be displayed without any customization specific to the user.

[0246] FIG. 24 illustrates a flowchart diagram for configuring a cockpit, in accordance with an embodiment of the invention. As discussed with reference to FIG. 1A, a user may access or control the remote devices 106a-n or services 202a-n by using the device 102. The device 102 may include the VMThings 108 for displaying graphical information at the device 102. The user may create a cockpit by using a GUI at the device 102.
At step 2402, the user may access a database of visual access menus through a GUI for creating a cockpit such as the cockpit 1902 as shown in FIG. 19. For example, the user may access a database of visual access menus at his/her smart phone. In an embodiment of the invention, the database may be present at the device 102. In another embodiment of the invention, the database may be present on a server in a cloud network.
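As an illustrative sketch only of the FIG. 24 flow that follows (present configuration options, receive the user's selections, create and display the cockpit), the Python fragment below builds a customized cockpit from a dictionary of selections. The Cockpit dataclass, the option names and the create_cockpit function are assumptions introduced for the example.

# Sketch of steps 2404-2410: offer configuration options, take selections,
# and build a customized cockpit from them.
from dataclasses import dataclass, field


@dataclass
class Cockpit:
    owner: str
    language: str = "English"
    theme: str = "default"
    tabs: list[str] = field(default_factory=list)


CONFIGURATION_OPTIONS = {
    "language": ["English", "Spanish", "Russian"],
    "theme": ["default", "dark"],
    "tabs": ["IVR", "Remote devices", "Services", "Calendar", "E-mail"],
}


def create_cockpit(owner: str, selections: dict) -> Cockpit:
    """Create the cockpit from the configuration options the user selected."""
    return Cockpit(owner=owner,
                   language=selections.get("language", "English"),
                   theme=selections.get("theme", "default"),
                   tabs=selections.get("tabs", []))


cockpit = create_cockpit("john", {"language": "English",
                                  "theme": "dark",
                                  "tabs": ["Remote devices", "Services"]})
print(cockpit)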
[0247] At step 2404, the VMThings 108 may display one or more configuration setting options for creating the cockpit to the user at the device 102. The user may choose or select one or more configuration setting options. In an embodiment of the invention, the user may provide inputs regarding the configuration settings. At step 2406, a selection of the one or more configuration setting options may be received at the device 102. In an embodiment of the invention, the VMThings 108 may detect and receive the selection of the configuration options from the user at the device 102. At step 2408, a cockpit may be created based on the selection received from the user. In an embodiment of the invention, the VMThings 108 may create the cockpit based on the selection of the configuration options. The cockpit created may be a customized cockpit specific to the user. The customized cockpit may be stored at the device 102. Thereafter, at step 2410, the cockpit may be displayed at the device 102. In an embodiment of the invention, the cockpit may be displayed at a display device such as the display device 118 connected to the device 102.

[0248] FIG. 25 illustrates a flowchart diagram for customizing a cockpit based on other users' reviews, in accordance with an embodiment of the invention. As discussed with reference to FIG. 19, the user may access different objects through the cockpit 1902. Further, the user may create or configure or set up or customize a cockpit specific to the user.

[0249] At step 2502, a user may access a database including a plurality of visual access menus through a GUI for creating a cockpit at a device such as the device 102. The visual access menus are the visual menus for accessing or controlling multiple objects such as remote devices 106a-n or services 202a-n. In an embodiment of the invention, the database may be present at a server in the network 104. In another embodiment of the invention, the database of visual access menus may be present at the device 102.

[0250] At step 2504, one or more configuration options for configuring, creating or customizing the cockpit may be displayed to the user. In an embodiment of the invention, the VMThings 108 may display the one or more configuration options to the user. The user may select or choose these one or more configuration options to change or modify a standard cockpit. At step 2506, the user may create or configure the cockpit based on a selection of the one or more configuration options received from the user.

[0251] The user may allow other users to view or check or access the cockpit, rate it, and provide reviews or feedback about the cockpit. At step 2508, the user may receive ratings, reviews or feedback for the cockpit from the other users in the network 104. The other users may also suggest changes to the user, such as additions to or deletions from the cockpit. At step 2510, the cockpit may be customized at the device 102 based on the ratings or reviews or feedback received from the other users. In an embodiment of the invention, the VMThings 108 may modify the cockpit based on the reviews or ratings or feedback automatically at the device 102. In another embodiment of the invention, the user may accept or reject reviews or feedback and then he/she may modify the cockpit manually or with the help of the VMThings 108 application at the device 102.

[0252] Further, the modified cockpit may be stored in the database. Thereafter, at step 2512, the customized or modified cockpit may be displayed at the device 102. In an embodiment of the invention, the modified cockpit may be displayed at the display device 118 such as a projector screen, a TV, a large screen, and so forth. In an embodiment of the invention, the user may choose not to customize the cockpit based on the other users' reviews or feedback.
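For illustration only, the feedback-driven customization of FIG. 25 (steps 2508-2510), in which the owner may accept or reject suggested changes, could be expressed as in the Python sketch below. The feedback format and the apply_feedback function are assumptions.

# Sketch: apply only the feedback items the cockpit owner accepts; the owner
# may also reject all of them (no customization).
def apply_feedback(tabs: list[str], feedback: list[dict], accept) -> list[str]:
    """feedback items look like {"action": "add"|"remove", "tab": name}."""
    updated = list(tabs)
    for item in feedback:
        if not accept(item):
            continue                     # owner rejects this suggestion
        if item["action"] == "add" and item["tab"] not in updated:
            updated.append(item["tab"])
        elif item["action"] == "remove" and item["tab"] in updated:
            updated.remove(item["tab"])
    return updated


reviews = [{"action": "add", "tab": "Calendar"},
           {"action": "remove", "tab": "Games"}]
print(apply_feedback(["Remote devices", "Games"], reviews,
                     accept=lambda item: item["action"] == "add"))
# -> ['Remote devices', 'Games', 'Calendar']  (only the accepted change applied)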
[0253] FIG. 26 illustrates a flowchart diagram for downloading and customizing a cockpit at a second device, in accordance with an embodiment of the invention. The user may share the cockpit with other users. The cockpit may be modified by the other users based on the access control permissions from the user. Further, the user may configure or customize his/her cockpit with the help of other users in his/her social network. The social network may be created by the user by using a social networking website. Examples of the social networking websites include, but are not limited to, Facebook, Google+, Orkut, Twitter, Academia.edu, Athlinks, Bebo, Badoo, BIGADDA, BlackPlanet, Buzznet, Cloob, Faceparty, Flixster, Fubar, Google Buzz, Hi5, ibibo, MySpace, LinkedIn, MyLife, Ning, WAYN, and so forth. For example, the user may share or invite other users to help him in creating his/her cockpit in real time.

[0254] At step 2602, a first cockpit may be configured or created by accessing a GUI for creating the cockpit at a first device. A first user may create the first cockpit at the first device. Then at step 2604, the first cockpit may be shared with one or more second users and downloaded at their respective one or more second devices. Examples of the first device and the second devices may include, but are not limited to, a mobile phone, a smart phone, a computer, a laptop, an iPod, an iPad, a tablet computer, a home controller, a set top box, an Android device, an Android set top box, and so forth. The cockpit may be downloaded at the system through at least one of an SMS, an MMS, the File Transfer Protocol (FTP), an e-mail, or through wireless technologies such as Bluetooth, ZigBee, RF4CE, Wi-Fi, WiMAX, and so forth.

[0255] At step 2606, the one or more second users may modify or customize a second cockpit at the one or more second devices based on the downloaded first cockpit. The second cockpit is associated with at least one of the one or more second users. At step 2608, ratings or reviews or feedback may be received on the customized second cockpit of the second user from the other users (or one or more third users) in his/her social network. For example, a second user may receive ratings on the second cockpit from his/her friends or relatives in a social network such as Facebook, Twitter, Orkut, Ning, MySpace, ibibo, and so forth.

[0256] At step 2610, one or more configuration settings of the second cockpit are downloaded at the first device based on the reviews or ratings of the other users, i.e. the one or more third users. At step 2612, the first cockpit may be customized based on the downloaded configuration settings and reviews. Thereafter, at step 2614, the customized first cockpit may be displayed at the first device. In an embodiment of the invention, the customized first cockpit may be stored in the database.

[0257] FIG. 27 illustrates a flowchart diagram for configuring a cockpit based on another cockpit of another user, in accordance with an embodiment of the invention. As discussed with reference to FIG.
1A, every user in the network 104 may access visual access menus at their respective devices. Subsequently, through these visual access menus, the user may control the one or more functions or operations of the one or more objects such as the remote devices 106a-n. As discussed with reference to FIGS. 19 and 20, the user may configure a cockpit such as the cockpit 1902 according to his/her preferences. As discussed with reference to FIG. 26, the user
may configure or customize his/her cockpit with the help of other users in his/her social network. The social network may be created by the user by using a social networking website. Examples of the social networking websites include, but are not limited to, Facebook, Google+, Orkut, Twitter, Academia.edu, Athlinks, Bebo, Badoo, BIGADDA, BlackPlanet, Buzznet, Cloob, Faceparty, Flixter, Fubar, Google Buzz, Hi5, ibibo, MySpace, LinkedIn, MyLife, Ning, WAYN, and so forth. For example, the user may share or invite other users to help him/her in creating his/her cockpit in real time.
[0258] At step 2702, at least one second cockpit associated with one or more second users is selected from a database. The database may be at a first device or at a second device or at a server in the network 104. Each user in the network 104 may have an associated profile stored at the database. The profile of a user may include information about the user such as, but not limited to, name, age, Identity (ID), interests, favorite books, and so forth. Further, the at least one second cockpit is associated with a second user whose profile is similar to a profile of a first user. In an embodiment of the invention, the VMThings 108 may search and select the at least one cockpit from the database. In an embodiment of the invention, the user may select the second cockpit of the one or more second users.
[0259] At step 2704, the second cockpit may be analyzed by the VMThings 108. In an embodiment of the invention, the analysis may happen at the first device. In another embodiment of the invention, the analysis may happen at the server in the network 104 or at a network device in a cloud network. At step 2706, a first cockpit specific to the first user may be created or configured based on the analysis of the second cockpit of the one or more second users. In an embodiment of the invention, the VMThings 108 may create the first cockpit based on the second cockpit. In another embodiment of the invention, the user may provide inputs for configuring the cockpit based on the analysis of the second cockpit. Further, the user may invite other users, such as his/her friends, relatives, colleagues, and so forth, to configure the cockpit for the user. The first cockpit may be stored at the first device. In an embodiment of the invention, the first cockpit may be stored at the server or the network device. Thereafter, at step 2708, the first cockpit may be displayed at the first device to the user. In an embodiment of the invention, the first cockpit may be displayed at a display device connected to the first device. The display device may be connected to the first device through wireless or wired means.
[0260] FIG. 28 illustrates a flowchart diagram for configuring a cockpit based on another cockpit of another user, in accordance with another embodiment of the invention. At step 2802, the user may access a graphical user interface (GUI) for configuring or creating a cockpit at a first device. At step 2804, the first user may provide information or a profile of at least one second user. The profile may include information such as a name, age, devices, services, and so forth. Then at step 2806, the VMThings 108 may search for a second cockpit of the second user and download it at the first device. At step 2808, the VMThings 108 may customize or configure a first cockpit for the first user based on the second cockpit of the at least one second user.
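One way to read the profile-matching steps 2702-2706 and 2802-2808 is sketched below; the similarity measure, the profile field names, and the select_base_cockpit helper are assumptions made for illustration, not the method defined by the specification:

# Illustrative-only sketch: pick a stored second cockpit whose owner's
# profile (interests, devices, services) best matches the first user's profile.
def profile_similarity(profile_a, profile_b):
    """Count how many interests, devices, and services two profiles share."""
    score = 0
    for key in ("interests", "devices", "services"):
        score += len(set(profile_a.get(key, [])) & set(profile_b.get(key, [])))
    return score

def select_base_cockpit(first_profile, candidates):
    """Return the cockpit whose owner's profile is most similar to the first user's."""
    best = max(candidates, key=lambda c: profile_similarity(first_profile, c["profile"]))
    return best["cockpit"]

first_user = {"interests": ["music", "travel"], "devices": ["tv", "ac"]}
candidates = [
    {"profile": {"interests": ["music"], "devices": ["tv", "ac"]},
     "cockpit": ["tv", "ac", "stereo"]},
    {"profile": {"interests": ["cooking"], "devices": ["oven"]},
     "cockpit": ["oven"]},
]
print(select_base_cockpit(first_user, candidates))  # ['tv', 'ac', 'stereo']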
Further, at step 2810, the VMThings 108 may store the first cockpit at the first device. In an embodiment of the invention, the first cockpit may be stored at a server in the network 104. Further, the user may translate the first cockpit from one language to another. The user may change or select a new font size, theme, color, etc. for the first cockpit. Thereafter, at step 2812, the first cockpit may be displayed to the user at the first device. In an embodiment of the invention, the first cockpit may be displayed at a display device attached or connected to the first device. Thereafter, the user may interact with and access the one or more objects of the first cockpit accordingly.
[0261] FIG. 29 illustrates a flowchart for downloading a cockpit from a network, in accordance with an embodiment of the invention. In an embodiment of the invention, the user may download the cockpit or one or more configuration settings for setting up his/her cockpit at a device. At step 2902, a graphical user interface (GUI) for creating or configuring or copying a cockpit at a device may be accessed by a user. In an embodiment of the invention, the user may configure his/her cockpit based on the cockpits of other users in the network 104. At step 2904, the user may select and download a cockpit having good reviews and ratings from the other users from the network 104, such as the Internet. The cockpit may be present in a cloud network. In an embodiment of the invention, the user may customize the downloaded cockpit according to his/her preference and device compatibility. At step 2906, the cockpit may be customized or translated according to a language preference of the user. In an embodiment of the invention, the cockpit may be translated or customized by the VMThings 108 based on predefined preferences of the user. For example, the cockpit language may be changed from English to Spanish. In an embodiment of the invention, the user may not customize the downloaded cockpit. At step 2908, the customized cockpit may be stored at the device. In an embodiment of the invention, the customized cockpit may be stored at a server or in a cloud network. At step 2910, the customized cockpit may be displayed at the device or at a display device attached to the device.
[0262] FIG. 30 illustrates an environment for accessing a cockpit through a website, in accordance with an embodiment of the invention. As discussed with reference to FIG. 19, the cockpit 1902 may include multiple tabs or icons 1902a-n for connecting to and controlling multiple objects 3006a-n. The objects may be such as, but not limited to, remote devices, services, applications, and so forth. A user may use a device 3002 to access a cockpit or visual access menus through a website in a network 3004. Examples of the device 3002 may include, but are not limited to, a smart phone, a PDA, a mobile phone, a computer, a laptop, a tablet computer, an I-POD, and so forth.
[0263] The network 3004 can be a wired network or a wireless network or a combination of these. The wireless network may use wireless technologies to provide connectivity among various devices.
Examples of the wireless technologies include, but are not limited to, Wi-Fi, WiMAX, fixed wireless data, ZigBee, Radio Frequency for Consumer Electronics (RF4CE), HomeRF, IEEE 802.11, 4G or Long Term Evolution (LTE), Bluetooth, Infrared, spread spectrum, Near Field Communication (NFC), Global System for Mobile Communications (GSM), and Digital Advanced Mobile Phone Service (D-AMPS). The device 102 is connected to the plurality of remote devices 106a-n through the network 104. Examples of the wired network include, but are not limited to, Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), and so forth. In an embodiment of the invention, the network 104 is the Internet. In an embodiment of the invention, the one or more objects may connect to the network 3004 through a network device such as, but not limited to, a router, a bridge,
a switch, a gateway, a home communication device, and so forth. In an embodiment of the invention, the objects 3006a-n may connect to the network 3004 indirectly through a local network.
[0264] The device 3002 may include a web browser for opening a website. Examples of the web browser include, but are not limited to, Internet Explorer, Google Chrome, Mozilla Firefox, Netscape Navigator, and so forth. The user can enter a Uniform Resource Locator (URL) such as 'www.XYZ.com' in the web browser to access the website. Further, when the user enters a URL in the web browser, a web page 3008 may be displayed at the device 3002 based on the URL. The web page 3008 may include one or more data request fields 3010a-n. In an embodiment of the invention, the user may have to authenticate his/her identity to the website before accessing the cockpits. The user may enter his/her details in the one or more data request fields 3010a-n for authentication. In an exemplary scenario, the web page 3008 may include a username data request field 3010a and a password data request field 3010b.
[0265] The network 3004 may include a cockpit database 3012 or server for storing a plurality of cockpits associated with a plurality of users or devices. Further, the cockpit database 3012 may include a plurality of visual access menus for controlling one or more objects. The cockpit database 3012 may also maintain a list of users, devices, remote devices, services, and so forth. In an embodiment of the invention, the network 3004 may include an IVR application such as the VMThings 3014. The VMThings 3014 may display graphical information to the user at the device 3002. In an embodiment of the invention, the graphical information or visual access menu may be displayed at a display device such as, but not limited to, a television, an LCD screen, an LED screen, a computer, a projector screen, a picture frame, and so forth. In an embodiment of the invention, the user may configure a cockpit at the device 3002 by accessing a graphical user interface (GUI) for configuring the cockpit through the website. The user may log in to the website by providing one or more details. Thereafter, the user may access or configure or customize the cockpit. The user may customize the cockpit by providing one or more user preferences such as font size, theme, color, and so forth.
[0266] FIG. 31 illustrates a flowchart diagram for configuring a cockpit through a website, in accordance with an embodiment of the invention. As discussed with reference to FIG. 30, the user may open a website by entering its network address or URL in a web browser such as Internet Explorer, Google Chrome, etc. At step 3102, the user may open a website through a web browser at a device. The user may enter a URL associated with the website to open a webpage. In an embodiment of the invention, the website may include a plurality of webpages. In an embodiment of the invention, a third party may maintain the website for configuring the cockpit. In an embodiment of the invention, the website may be a website for configuring or creating or setting up a cockpit. Based on the URL, a web page such as the web page 3008 may be displayed at the device 3002. The web page 3008 may include one or more data request fields 3010a-n.
[0267] In an embodiment of the invention, the website may ask the user to enter his/her personal details for authorization.
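A minimal sketch of such an authorization check is shown below; the user store, the hashing scheme, and the function names are assumptions made for the sketch (the specification does not define how credentials are stored), and the cockpit page is served only after the details submitted in the data request fields are verified:

# Illustrative-only sketch: verify the username/password entered in the
# data request fields before serving the user's cockpit from the database.
import hashlib

# Hypothetical user store: username mapped to a SHA-256 digest of the password.
USER_STORE = {"alice": hashlib.sha256(b"secret").hexdigest()}

def authenticate(username, password):
    """Return True if the submitted details match the stored credentials."""
    stored = USER_STORE.get(username)
    return stored is not None and stored == hashlib.sha256(password.encode()).hexdigest()

def open_cockpit_page(username, password, cockpit_database):
    """Serve the user's cockpit only after successful authorization."""
    if not authenticate(username, password):
        return "access denied"
    return cockpit_database.get(username, "standard cockpit")

cockpits = {"alice": ["home lights", "thermostat"]}
print(open_cockpit_page("alice", "secret", cockpits))  # ['home lights', 'thermostat']
print(open_cockpit_page("alice", "wrong", cockpits))   # access denied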
At step 3104, the user may enter one or more personal details in the data request fields 3010a-n to authenticate at the website. The user may be allowed to access the website based on the authorization. The user can access a GUI for configuring the cockpit after authorization. At step 3106, the VMThings 3014 may display one or more configuration options to the user. The user may select or choose the one or more configuration options to configure the cockpit. At step 3108, the VMThings 3014 may receive a selection of the one or more configuration options from the user. The user may select the options by touching the screen of the device. In an embodiment of the invention, the user may select the options through at least one of entering a combination of keys, giving a voice command, gestures, hand movements, and so forth.
[0268] At step 3110, the VMThings 3014 may configure or create the cockpit for the user based on the selection of the configuration options. In an embodiment of the invention, the cockpit may be customized based on the one or more configuration options. In an embodiment of the invention, the user may create a plurality of cockpits based on his/her preferences. For example, the user may create a cockpit for handling home appliances, a second cockpit for handling or controlling office objects, and so forth. Thereafter, at step 3112, the cockpit may be displayed to the user. The VMThings 3014 may display the cockpit at the device 3002. In an embodiment of the invention, the VMThings 3014 may display the cockpit at a display device attached to the device 3002. The cockpit is then stored at the cockpit database 3012. The user may interact with or control one or more objects through the cockpit.
[0269] FIG. 32 illustrates a flowchart diagram for accessing a cockpit through a website, in accordance with an embodiment of the invention. As discussed with reference to FIG. 30, the user may access the cockpit through a website. At step 3202, the user may open a website through a web browser at the device 3002. A web page 3008 based on the URL of the website may be displayed at the device 3002. The web page 3008 may include one or more data request fields 3010a-n. The user may enter his/her details in the data request fields 3010a-n. A website server may check whether the user is an authorized user or not based on the entered details. Thereafter, the VMThings 3014 may search the cockpit database 3012 for a cockpit associated with the user. In an embodiment of the invention, the cockpit may be present in a cloud network.
[0270] Then at step 3206, the VMThings 3014 may display the cockpit specific to the user at the device 3002. In an embodiment of the invention, the cockpit may be displayed at a display device. Further, different cockpits may be displayed to different users based on their details. In another embodiment of the invention, a standard cockpit may be displayed to the user. The standard cockpit may be a cockpit including one or more objects without any specific changes according to different users. In an embodiment of the invention, the VMThings 3014 may display the cockpit at the device 3002 based on the current location of the user or the device 3002. The icons in the cockpit may differ depending on the location of the device 3002 or the user. For example, the user may be displayed with a first cockpit when the user is at home and may be displayed with a second cockpit when the user is travelling.
In an embodiment of the invention, the location of the user may be determined by using a GPS system at the device 3002 or in the network 3004. In an embodiment of the invention, the location of the objects being controlled may change. For example, a car, a pet, a wife, or kids may change their location. Therefore, the VMThings 3014 may display different cockpits or visual menus to the user based on the location of the controlled objects.
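The location-dependent selection described in this paragraph can be pictured with the short sketch below; the location labels and the pick_cockpit helper are illustrative assumptions, not part of the specification:

# Illustrative-only sketch: choose which cockpit to display from the current
# location reported by the device (or by the controlled objects), falling
# back to a standard cockpit when no location-specific cockpit is registered.
def pick_cockpit(location, cockpits_by_location, default="standard"):
    """Return the cockpit registered for the reported location, else the standard one."""
    return cockpits_by_location.get(location, cockpits_by_location.get(default))

cockpits_by_location = {
    "home":       ["lights", "thermostat", "tv"],
    "travelling": ["car tracker", "pet camera"],
    "standard":   ["lights"],
}
print(pick_cockpit("home", cockpits_by_location))    # ['lights', 'thermostat', 'tv']
print(pick_cockpit("office", cockpits_by_location))  # falls back to ['lights']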
[0271] Subsequently, the user can interact with the cockpit at step 3208. The user may select a tab from a plurality of tabs or icons of the cockpit for interacting with the objects. At step 3210, the user may be displayed with an enhanced visual access menu based on the selection or interaction of the user with the cockpit. As discussed with reference to FIG. 1A to FIG. 2I, the enhanced visual access menu may include one or more device options or service options. The device options may be the icons representing one or more remote devices 106a-n. Similarly, the service options may be the icons or graphics representing one or more services 202a-n. In an embodiment of the invention, the cockpit may be displayed based on one or more preferences of the user such as color preference, font size, theme, language preference, and so forth. In an embodiment of the invention, the user may provide the preferences in real time. In an embodiment of the invention, the user preferences are pre-defined and may be stored at the cockpit database 3012 or the device 3002. At step 3212, the user may interact with and control one or more operations of the objects such as remote devices.
[0272] FIG. 33 illustrates a flowchart diagram for configuring a cockpit with the help of other users, in accordance with an embodiment of the invention. As discussed with reference to FIG. 30, a user may access a website for creating or configuring or customizing a cockpit through a web browser such as Internet Explorer, Google Chrome, and so forth. The website may include a plurality of web pages. Each of the web pages may display text, images, data request fields, and so forth. In an embodiment of the invention, the web page may include audio files or video files.
[0273] In an embodiment of the invention, the user may configure an Internet of Things menu by accessing a website. The user may log in to the website and then may get access to various setting controls for configuring the Internet of Things menu based on the authorization. In an embodiment of the invention, the Internet of Things application, i.e., the VMThings, may create the Internet of Things menu for different users at the device. Further, the user may share the Internet of Things menu with other users. In an embodiment of the invention, the Internet of Things menu may include one or more options for identifiable objects. Further, the Internet of Things menu may be created by inviting other users.
[0274] At step 3302, a first user may access a website for creating or configuring or setting up a cockpit at a first device such as a first device 2002 of FIG. 20A-B. The first device may be a smart phone. At step 3304, the user may invite one or more second users for configuring the cockpit for the first user. The first user may invite the one or more second users through at least one of an SMS, an MMS, an instant message, an e-mail, a face-to-face conversation, a phone call, and so forth.
[0275] At step 3306, one or more inputs may be received from the one or more second users. Further, the one or more second users may provide the one or more inputs at their respective second devices. In an embodiment of the invention, the VMThings 3014 in the network 3004 may receive the one or more inputs from the one or more second users. At step 3308, one or more inputs may be received from the first user. Further, the first user may provide the one or more inputs at the first device.
In an embodiment of the invention, the VMThings 3014 may receive the inputs from the first user. Further, the first user and the second user may provide the inputs by at least one of touching the screens of their devices, pressing one or more keys at the devices, giving voice commands, gestures, hand movements, and so forth.
[0276] At step 3310, the VMThings 3014 may configure a cockpit for the first user based on the one or more inputs from the first user and the one or more second users. In an embodiment of the invention, the VMThings 3014 may customize an already configured cockpit of the first user based on the one or more inputs from the first user and the one or more second users. Finally, at step 3312, the cockpit may be stored at the first device. In an embodiment of the invention, the cockpit may be stored at a server of the website or at the cockpit database 3012 in the network 3004. In an embodiment of the invention, the first user may provide access to the cockpit to the one or more second users.
[0277] FIG. 34 illustrates a flowchart diagram for switching a display mode of a cockpit, in accordance with an embodiment of the invention. In an embodiment of the invention, the cockpit or the visual access menus may be displayed to the user based on the user's one or more preferences. Further, the cockpit (or visual access menus) may be displayed to the user based on the display capabilities of the device. For example, the cockpit may be displayed as a list when the device is a simple mobile phone and has a small display. In an embodiment of the invention, the cockpit may be presented to the user depending on the user's preference.
[0278] At step 3402, a user may access a database of visual access menus or cockpits through a graphical user interface (GUI) at a device. The GUI may provide an interface for creating or configuring or customizing or accessing a cockpit. As discussed with reference to FIG. 30, the cockpit database 3012 may include a plurality of cockpits or visual access menus for different users and devices. Examples of the device may include, but are not limited to, a mobile phone, a smart phone, a laptop, an I-pod, a tablet computer, a PDA, an electronics device, and so forth. The user may receive alerts or messages from the one or more objects connected through the cockpit or the visual access menus. At step 3404, a cockpit along with one or more mode options may be displayed to the user. Examples of the mode options may include, but are not limited to, video, audio, visual, text, list, and so forth. In an embodiment of the invention, the one or more mode options may be displayed at the GUI for creating/accessing the cockpit.
[0279] The user may select at least one mode option from the one or more mode options. A selection of the video mode option may play the cockpit as a video. A selection of the audio mode option may play the cockpit options as audio or music. A selection of the text mode option may display the cockpit options as text. Similarly, a selection of the list mode option may display the cockpit options as a list. At step 3406, a selection of the at least one mode option may be received from the user at the device. In an embodiment of the invention, the VMThings at the device may receive the selection of the mode option.
[0280] Based on the selection of the mode option, the mode of the display of the device may be switched at step 3408.
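A minimal sketch of such a mode switch is given below; the render_cockpit helper and the way options are presented in each mode are assumptions made for illustration only (a real audio mode would hand the options to a text-to-speech engine):

# Illustrative-only sketch: present the same cockpit options in the mode
# selected at step 3406 (audio, list, or plain text).
def render_cockpit(options, mode):
    """Present the cockpit options in the selected mode."""
    if mode == "audio":
        # Stand-in for a text-to-speech playback of the options.
        return "speaking: " + ", ".join(options)
    if mode == "list":
        return "\n".join(f"{i + 1}. {opt}" for i, opt in enumerate(options))
    # Default to a simple text rendering.
    return " | ".join(options)

options = ["Create Cockpit", "Customize Cockpit", "View Cockpit", "Invite Users"]
print(render_cockpit(options, "audio"))
print(render_cockpit(options, "list"))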
For example, the user may select the audio option, so the display may switch to audio mode and various options of the cockpit or the visual access menus may be played to the user. Subsequently, at step 3410, an audio menu may be played at the device when the user selects the audio mode. Thereafter, the user may listen to the options and may interact by providing one or more inputs. The one or more inputs may be provided through at least one of gestures, hand movements, voice commands, pressing one or more keys at the device, touching the
display, and so forth. For example, when a user is driving and wants to access the cockpit, he/she may choose the audio mode option. Therefore, the options may be played to the user and he/she can interact with the cockpit accordingly.
[0281] FIG. 35A illustrates an exemplary display of a cockpit along with one or more mode options, in accordance with an embodiment of the invention. As discussed with reference to FIG. 19, a user may create or configure a cockpit such as the cockpit 1902 at the device 102. The cockpit 1902 is an interface which enables a user to access various services, devices, or objects. The cockpit 1902 may include icons 1904a-n representing various objects which a user or users can access or control. The tabs 1904a-n may be icons or text or a combination of these.
[0282] As discussed with reference to FIG. 34, the VMThings 108 may display the cockpit along with one or more mode options at the device 102. Examples of the mode options may include, but are not limited to, video, audio, visual, text, list, and so forth. In an embodiment of the invention, the one or more mode options may be displayed at a GUI 3506 for creating/accessing the cockpit as shown in FIG. 35B. The user may select at least one mode option from the one or more mode options. A selection of the video mode option may play the cockpit as a video. A selection of the audio mode option may play the cockpit options as audio or music. A selection of the text mode option may display the cockpit options as text. Similarly, a selection of the list mode option may display the cockpit options as a list. A display of the device 102 may change based on the selection of the mode options by the user. For example, if the user selects an audio mode option, an audio menu may be played at the device 102. Thereafter, the user may listen to the options and may interact by providing one or more inputs.
[0283] As shown in FIG. 35B, the exemplary GUI 3506 may include one or more icons/tabs/options 3504a-n. A GUI option 3504a may be a Create Cockpit option. A user may select this option for creating or configuring or setting up a cockpit. A GUI option 3504b may be a Customize Cockpit option. The user may use this option to customize an already created or stored cockpit. In an embodiment of the invention, the cockpit may be stored at the device 102. In an embodiment of the invention, the cockpits are maintained by the cockpit database 3012 as shown in FIG. 30. A GUI option 3504c may be a View Cockpit option. The user may select this option to view the cockpits at the device 102.
[0284] In another embodiment of the invention, a server may provide the functionality of the VMThings. Further, the server may maintain all the information which is otherwise provided by the VMThings. The server may maintain the information regarding the one or more visual access menus, users, devices, remote devices, services, display device, access device, and so forth. A user at a device such as a telephone may request information from the server. Further, the server may send the information to the requesting device over a network. The network may be a wired or a wireless network. The connection between the device and the server may be a wired or a wireless connection. Further, the server may send the information to the requesting device(s) by using technologies such as, but not limited to, SMS, MMS, e-mail, and so forth. Based on the received information, the content may be displayed at the device.
For example, if the user has requested the information regarding controlling remote devices, then information of the visual access menu related to the remote devices may be received from the server. Further, the server may display the visual access menu at the device. In an embodiment of the invention, the server may also provide other functions or features of the VMThings 108 as explained in FIGS. 1A-2G. The user may respond or select an option from the displayed visual access menus through DTMF tones. The device may be a telephone or a simple mobile phone.
[0285] In an embodiment of the invention, the user may access the functionalities as described above by logging into a second device such as a home controller. The user may see and control devices associated with the home controller.
[0286] Further, the VMThings may store the user activity such as the selection of options from the visual access menus at the device. This user activity information may be used by the VMThings for displaying the visual access menu to the same user next time.
[0287] An aspect of the invention allows the user to share his/her cockpit for controlling one or more objects with other users.
[0288] Another aspect of the invention allows the users to request permission from the other users to access or control the one or more objects of the cockpit.
[0289] Another aspect of the invention provides a cockpit including multiple interfaces for controlling multiple objects by a user.
[0290] An aspect of the invention enables a user to configure or set up a cockpit with the help of other users in his/her social network. Therefore, the user may invite his/her friends or other users to set up his/her cockpit.
[0291] A further aspect of the invention allows a user to copy another user's cockpit. Thereafter, the user may configure his/her cockpit based on the copied cockpit.
[0292] Another aspect of the invention allows a user to download a cockpit from a cloud network or the Internet.
[0293] Yet another aspect of the invention is to enable a user to control one or more operations of the remote devices or services through voice commands or gestures or hand movements. For example, the user may switch on an air conditioner (AC) by showing a thumb up gesture in front of the device. The device may include a camera to detect the gesture. The VMThings at the device (or access device) may analyze the gesture and control a remote device based on the analysis.
[0294] An advantage of the invention relates to visual access menus that may ask for voice commands. Such a GUI may be harder for some users to use due to accent or other problems. The database could be provided with an option, as described before, for the system to output a voice command according to the user's selection of the options, the device options, or the service options. The device may include a microphone for detecting the voice commands. The VMThings may analyze the voice commands and may take the actions accordingly. Further, the disclosed system and methods allow the user to give voice commands in different languages. For example, the user may select an option by giving a voice command in the French language. Furthermore, the user may select an option (or device options or service options) from the visual access menu through one or more gestures or hand movements. In an embodiment of the invention, the user may store one or more gestures for one or more actions. For example, the user may use a thumb up gesture to switch on the AC.
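A minimal sketch of such a stored gesture-to-action mapping appears below; the gesture labels, the device names, and the handle_gesture helper are illustrative assumptions rather than the mechanism defined by the specification:

# Illustrative-only sketch: translate a detected gesture into a command
# for a remote device using a user-stored mapping.
GESTURE_ACTIONS = {
    "thumb_up":   ("air conditioner", "switch_on"),
    "thumb_down": ("microwave", "switch_off"),
}

def handle_gesture(gesture):
    """Look up the detected gesture and return the command to send."""
    if gesture not in GESTURE_ACTIONS:
        return "unrecognized gesture"
    device, command = GESTURE_ACTIONS[gesture]
    return f"sending '{command}' to {device}"

print(handle_gesture("thumb_up"))  # sending 'switch_on' to air conditioner
print(handle_gesture("wave"))      # unrecognized gesture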
Similarly, the user may store a thumb down gesture to switch off an electronic appliance such as a microwave.
[0295] Another advantage of the invention relates to providing visual access menus and enhanced visual access
menus in different language(s). In an embodiment of the invention, the VMThings of the device or the access device may display the visual access menu or the enhanced visual access menu in different languages. Further, the device may have one language and the user may want to control and communicate in a different language. Similarly, the VMThings may understand and accept voice inputs from the user in different languages irrespective of the device language. Therefore, the user may control the remote devices by giving voice commands in different languages such as, but not limited to, English, Spanish, French, Hindi, Chinese, Japanese, Hawaiian, German, and so forth. In an embodiment of the invention, the device may not support or understand a particular language such as Spanish, but the VMThings can still display the visual access menus in the Spanish language.
[0296] Another aspect of the invention is to provide information about various services to the user using a device such as a smart phone anytime, anywhere.
[0297] A further aspect of the invention is to enable a user to control operations of the remote devices through a device including the VMThings application. The user may not have to be physically present near the remote devices to control them.
[0298] Yet another aspect of the invention is to allow users to see the images of remote devices in real time irrespective of the location of the remote devices. For example, the user may see the remote devices such as home appliances present at his/her home while being present at the office.
[0299] Embodiments of the invention are described above with reference to block diagrams and schematic illustrations of methods and systems according to embodiments of the invention. It will be understood that each block of the diagrams and combinations of blocks in the diagrams can be implemented by computer program instructions. These computer program instructions may be loaded onto one or more general purpose computers, special purpose computers, or other programmable data processing apparatus to produce machines, such that the instructions which execute on the computers or other programmable data processing apparatus create means for implementing the functions specified in the block or blocks. Such computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the function specified in the block or blocks.
[0300] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. The invention has been described in the general context of computing devices, phones, and computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, characters, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
A person skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Further, the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
[0301] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
1. A method for enhancing interaction of a user with objects connected to a network, the method comprising: displaying a visual access menu associated with at least two independent objects, wherein the said two independent objects are produced by two independent vendors, further wherein a database comprises a list of said objects.
2. The method of claim 1, wherein said visual access menu is not provided by either of said independent vendors.
3. The method of claim 1, wherein said visual access menu comprises at least one icon indicating one of said objects, wherein said at least one icon is substantially different than the one provided by said vendor.
4. The method of claim 1, wherein said database comprises a category attribute for said objects and a standard menu for said category.
5. The method of claim 1 further comprising displaying an advertisement, wherein said advertisement is selected based on content of said visual access menu.
6. The method of claim 1, wherein said visual access menu is displayed at a display device through wireless means.
7. The method of claim 1 further comprising selecting an option from said visual access menu by said user through a voice command, wherein voice recognition enables said user to select said option.
8. A method for enhancing interaction of a user with objects connected to a network, the method comprising: displaying, to said user, a visual access menu for communicating with one or more objects made by a vendor, wherein said visual access menu is not provided by said vendor, further wherein a database comprises a list of said one or more objects.
9. The method of claim 8, wherein said one or more objects comprises at least two objects produced by two independent vendors.
10. The method of claim 8, wherein said menu comprises at least one icon indicating one of said one or more objects; further wherein said at least one icon is substantially different than the one provided by said vendor.
11. The method of claim 8, wherein said database comprises a category attribute for said one or more objects and a standard menu for said category.
12. The method of claim 8 further comprising displaying an advertisement, wherein said advertisement is selected based on content of said visual access menu.
13. The method of claim 8, wherein said visual access menu is displayed at a display device through wireless means.
14. A method for enhancing interaction of a user with objects connected to a network, the method comprising: displaying, to said user of a device, a visual access menu comprising an icon indicating at least one object made by a first vendor, wherein said icon is substantially different than the one provided by a second vendor, further wherein a database comprises a list of said objects.
15. The method of claim 14, wherein said visual access menu is not provided by either of said first vendor and said second vendor.
16. The method of claim 14, wherein said objects comprises at least two objects produced by either of said first vendor and said second vendor.
17. The method of claim 14, wherein said database comprises a category attribute for said objects and a standard menu for said category.
18. The method of claim 14 further comprising displaying an advertisement, wherein said advertisement is selected based on content of said visual access menu.
19. The method of claim 14, wherein said visual access menu is displayed at a display device through a wireless means.
20. The method of claim 14 further comprising selecting an option from said visual access menu by said user through a voice command, wherein voice recognition enables said user to select said option.
* * * * *