> I understand that when making a database
> you identify items in terms of reciprocal proximity :
> an owl is an animal, bird, of prey, nocturnal, etc.

For nouns, it is possible to define a hierarchy (ontology)
based on the "is a" relation: owl isa bird, bird isa animal,
animal isa creature, creature isa thing (that's the top).
There is a project (WordNet) that does exactly that; it has
been done for English, and a related effort (EuroWordNet)
covers several European languages.
There is also a famous conlang that uses it ... the URL is
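The "is a" chain above can be sketched as a simple lookup table; this is a toy illustration, not WordNet's actual data format:

```python
# A minimal sketch of the "is a" hierarchy described above.
# The entries are illustrative examples, not real WordNet data.
ISA = {
    "owl": "bird",
    "bird": "animal",
    "animal": "creature",
    "creature": "thing",  # "thing" is the top of the hierarchy
}

def hypernym_chain(word):
    """Walk the is-a links from a word up to the top."""
    chain = [word]
    while chain[-1] in ISA:
        chain.append(ISA[chain[-1]])
    return chain

print(hypernym_chain("owl"))
# ['owl', 'bird', 'animal', 'creature', 'thing']
```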

> Maybe experience lingers in your brain as the configuration
> of all links between available pieces of information.
> If such a link is only a pulsed distance between pieces of information
> then I can imagine you could measure it to one distal standard
> and retrieve anything at a given address.

There are other links in the WordNet database for things like
synonyms, antonyms, and some other easily definable relations.
These were compiled by professional lexicographers.
Your "distance metric" would not be a single scalar quantity,
but multi-dimensional, with the relevant components depending
on the task at hand. Which raises the question: what do you
want to do with the data?
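Here is a hedged sketch of what such a multi-dimensional distance might look like: one component per relation type (is-a hops, synonymy), rather than one scalar. The tiny graphs are made-up examples, not real WordNet data:

```python
# Toy relation graphs (illustrative, not WordNet data).
ISA = {"owl": "bird", "hawk": "bird", "bird": "animal"}
SYNONYMS = {("hawk", "falcon")}  # hypothetical synonym pair

def isa_chain(word):
    chain = [word]
    while chain[-1] in ISA:
        chain.append(ISA[chain[-1]])
    return chain

def isa_distance(a, b):
    """Hops to the nearest common ancestor in the is-a tree."""
    ca, cb = isa_chain(a), isa_chain(b)
    common = [w for w in ca if w in cb]
    if not common:
        return None  # no shared ancestor in this toy graph
    anc = common[0]
    return ca.index(anc) + cb.index(anc)

def synonym_distance(a, b):
    return 0 if (a, b) in SYNONYMS or (b, a) in SYNONYMS else 1

def distance(a, b):
    # A vector of criteria, not a single number; which component
    # matters depends on what you want to do with the data.
    return (isa_distance(a, b), synonym_distance(a, b))

print(distance("owl", "hawk"))
# (2, 1): two is-a hops via "bird", not synonyms
```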

> This is why I can't get how you can make an artificial language
> based on *predicate*.

But all programming languages are based on predicates.
So are all hardware architectures.
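In the programming sense, a predicate is just a function from arguments to true/false; the examples below are illustrative, with made-up word sets:

```python
# A predicate maps its arguments to a truth value; programming
# languages (and hardware comparison instructions) build on this.
# The word sets here are illustrative examples.
def is_bird(x):
    return x in {"owl", "hawk", "sparrow"}

def is_nocturnal(x):
    return x in {"owl", "bat"}

# Predicates compose with boolean logic, as in any language:
def nocturnal_bird(x):
    return is_bird(x) and is_nocturnal(x)

print([w for w in ["owl", "hawk", "bat"] if nocturnal_bird(w)])
# ['owl']
```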

> I mean: does the computer handling an artificial language
> only analyse information according to a set of standards, or also speak?