Blind Mobility and Flying IFR
                                                                       Bjarne Fjeldsenden

Why can a bat "see with its ears" while a blind person cannot, even when given similar information via
Professor Kay's trisensor spectacles and cane (http://www.batforblind.co.nz/index.htm), which work in a similar way?
Because a bat has a brain particularly well suited to interpreting the auditory signals it emits and which
get reflected back. One may say a bat has "an auditory radar". Blind people may be able to move around almost
as well as a sighted person if information can be delivered in a form which a human can easily interpret.

How can a pilot find his way through clouds down to the airfield?
Because the plane is equipped with instruments telling the pilot the exact position of the plane and the glide
slope to follow to reach the runway. But the pilot has to interpret the information from the instruments and act
accordingly, or at least monitor the autopilot.

What have blind mobility and flying IFR, "flying on instruments", in common?
1: Both may use GPS, GIS and other electronic information to know their position and navigate.
GPS gives the position based on satellite data, and GIS (Geographic Information System) is an electronic
map which can be linked to GPS.
2: Both the blind person and the pilot have to rely on "artificial information". "Natural information" is to see the
environment and behave accordingly. So what's the problem? To deliver information in such a way that
it is not misinterpreted. Half the accidents with commercial airliners are attributed to CFIT, Controlled Flight
Into Terrain, and blind people cannot move around as easily as a sighted person.
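As a rough sketch of the kind of computation such a GPS/GIS system performs, the distance and compass direction from the user's current position to a chosen destination can be derived from two coordinate pairs. The function name and the coordinates below are illustrative, not taken from any actual product:

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (metres) and initial compass bearing
    (degrees) from point 1 to point 2, via the haversine formula."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    # Haversine formula for the distance along the Earth's surface
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing, normalized to 0..360 degrees (0 = north)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

# Hypothetical example: from a street corner to a railway station
d, b = distance_and_bearing(63.4305, 10.3951, 63.4360, 10.3990)
print(f"{d:.0f} m, heading {b:.0f} degrees")
```

A talking map would render the result as speech ("railway station, 640 metres, to the north-northeast") rather than as numbers, but the underlying geometry is the same.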

What is the state of the art?
The aviation industry, building commercial airliners, is well on the way to building visual displays that make it easier
for the pilots to fly the plane. Soon they may have VR displays showing something similar to what the pilot sees out of
the cockpit window, but based on electronic information. Two possible weak points are the reliability of the system and
how easy it is for the human to interact with.
When it comes to blind mobility the situation is bleaker. An American firm, Arkenstone, has made a talking
map for the blind based on GPS and GIS. This system gives the blind person the name and number of the street he or she is
in, and by a simple command the system tells the person the direction and distance to a chosen location, e.g. the railway
station. The system doesn't tell the blind person about obstacles in the way. Professor Leslie Kay's ultrasonic spectacles,
see the link above, may do this if the person can interpret the information. Or perhaps "the
ultrasonic dog" by Johann Borenstein from the University of Michigan in the US may be a better solution? This device
consists of ultrasonic sensors placed on a small two-wheeled cart which is pushed by a stick. When an obstacle is in the
way the wheels are activated, turning the cart away from it and towards open space, so the blind person only has to follow
the cart. Other interesting links related to blind mobility can be found on the home page of Tony Heyes.
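The steering idea behind such a device can be sketched in a few lines: given range readings from an arc of ultrasonic sensors, the cart turns toward the direction with the most open space whenever something blocks the path ahead. The sensor angles and the clearance threshold below are invented for illustration:

```python
def steer(readings, clearance=1.0):
    """Pick a steering angle from ultrasonic range readings.

    readings: dict mapping sensor angle in degrees (0 = straight
    ahead, negative = left, positive = right) to measured distance
    in metres. Returns 0 if the path ahead is clear, otherwise the
    angle of the sensor reporting the most open space.
    """
    if readings.get(0, 0.0) >= clearance:
        return 0  # nothing close ahead: keep going straight
    # Obstacle ahead: turn toward the direction with the longest range
    return max(readings, key=readings.get)

# Clear path ahead: keep straight
print(steer({-45: 0.8, 0: 2.5, 45: 1.2}))   # 0
# Obstacle ahead, more room to the right: turn right
print(steer({-45: 0.8, 0: 0.4, 45: 2.0}))   # 45
```

The real device adds a great deal on top of this (wheel odometry, smoothing, the mechanics of the cart), but the core behaviour — turn away from the obstacle, toward open space — is this simple rule.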

What can be done?
One may envisage a development where "the electronic dog" is programmed via waypoints from A to B
in the same way as one can do with airplanes using GPS. The blind person can push "the electronic dog",
which will both avoid obstacles and find the best way, much as a guide dog takes the blind
person from A to B. But maybe most blind people would prefer a live dog to an electronic one?
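A minimal sketch of the waypoint logic such an "electronic dog" would share with a GPS-driven autopilot: walk a list of waypoints, advancing to the next one once the current one is reached. The capture radius and the flat-plane distance are simplifying assumptions for illustration:

```python
import math

def navigate(position, waypoints, capture_radius=5.0):
    """Return (current target waypoint, remaining route), dropping
    any waypoints already within the capture radius of the position.
    Coordinates are local flat-plane (x, y) metres for simplicity."""
    remaining = list(waypoints)
    while remaining and math.dist(position, remaining[0]) < capture_radius:
        remaining.pop(0)  # waypoint reached: advance to the next
    return (remaining[0] if remaining else None), remaining

# Hypothetical route from A to B as local (x, y) coordinates in metres
route = [(0, 10), (20, 30), (50, 60)]
target, route = navigate((0, 8), route)    # within 5 m of (0, 10)
print(target)  # (20, 30)
```

Obstacle avoidance (the `steer`-style rule of the previous section) would run on top of this, nudging the person around obstacles while the waypoint logic keeps pulling towards B.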
What has been suggested above could be realized at NTNU, the Norwegian University of Science and
Technology in Trondheim, Norway, if different groups can cooperate and a project of this nature is economically
feasible. Trondheim also has industry with competence in this area.
Maybe it would be more interesting to make similar systems, based on GPS and GIS, for the aviation
industry or for ships. The new element would be to create programs which take the plane or the ship off a
track that would lead to disaster. This would require a closer integration of GPS and GIS with
steering systems/autopilots than is the case today.

What has this to do with psychology?
More often than not, systems have been created by engineers which are not very compatible with the way
the operator thinks. Some accidents and near accidents with Airbus are examples where the pilot and
the autopilot fought over control of the plane. "They didn't understand each other."
In addition, a system may do things which are hidden from the operator, so a critical situation may suddenly
arise without any forewarning, as with a China Airlines 747 on autopilot, which compensated only up to a
certain point when one engine gradually lost power. The plane suddenly went out of control when the autopilot
could no longer compensate, and it dropped from 35,000 feet to 8,000 feet before the pilots regained control. Or the
American Airlines 757 which flew into a mountain near Cali in Colombia because the pilots programmed the autopilot
to go to a point just north of the airfield. What the autopilot didn't take into account was a high mountain in
between, and the pilots didn't know their exact position. They were not "in the loop". Neither the pilots' nor the
autopilot's cognitive model of the situation was adequate, and this led to disaster.
One goal is to produce systems which make a flexible interaction between the human and the
system/computer possible. Two factors may be considered important in this context:
1: The system must be transparent so the operator can see and understand what the system does.
2: A cognitive model which is adequate, easy to understand, and easy to act upon must be worked out.
This model must be reflected in the construction of the control panel.

About the person behind IDEAS: He has worked in a school for the blind for 4 years, teaches cognitive
and cross-cultural psychology, has been a private pilot since 1982, and has read a lot both about
mobility aids for the blind and about questions related to aviation.