On 3 September 2015 at 01:32, Patrik Austin <[log in to unmask]> wrote:
[...]
> When you look at the mindmap, it suggests that language is a matter of expressing who did what, where and when, and that you have to organise these in a meaningful way. This is pretty much what I mean by logical necessity governing all languages; even though it's been suggested that languages could be "just anything", actually they can't if the purpose is to express something.

That's exactly what And & I have both been saying. You call it a
mindmap, I call it a semantic graph, And calls it predicate-argument
structure. Predicate logic is (one of) the formal system(s) used to
describe those kinds of objects. What you seem to be saying here is
just different words for the same thing that And meant when he said
that a language must encode predicate-argument structure; i.e., a
language must encode the shape of the semantic graph, or the shape of
the mindmap. Either the grammar of FL1 cannot accomplish that, or it
is actually more complex than you have been claiming so far, and I
have been trying to suss out what its *actual* rules really are, such
that they can produce the interpretations that you have claimed for
your examples.
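
Just to pin down what I mean by those three terms naming the same
object, here's a quick sketch in Python (an illustration only, not any
particular formalism's canonical encoding; the predicate and role
names are invented for the example):

# "The dog chased the cat in the park" as a predicate-argument
# structure / semantic graph / mindmap: three names, one object.
# Nodes are the entities and the event; the tuples are labelled edges.
graph = {
    ("chase", "e1"),             # there is a chasing event e1
    ("agent", "e1", "dog"),      # its agent is the dog
    ("patient", "e1", "cat"),    # its patient is the cat
    ("location", "e1", "park"),  # it takes place in the park
}

# The same graph written as a predicate-logic formula:
#   exists e1: chase(e1) & agent(e1, dog) & patient(e1, cat)
#              & location(e1, park)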

> In formal terms, whatever you do, it will always build on aS

Sure. But that's not interesting. It gives you no useful insight. The
*only* thing it tells you is that "human languages put words in
linear order". Well, duh.

> While I said FL1 has one *lexical* category, Gil writes about a monocategorial language with just one *syntactic* category, but I'm not sure what the difference is.

It means that there is only one kind of syntactic node. Lexical
categories *may* correspond to the syntactic categories of *leaf*
nodes, but there may be additional categories for syntactic
constituents that only occur internal to trees and are never embodied
in a single word.

My current best understanding of how FL1 actually works, based on the
evidence of your examples, is that it may have only one lexical class,
but definitely has at least two syntactic classes.
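
To illustrate that leaf-versus-internal distinction, here is a
schematic sketch (only an illustration of the distinction, using the
S / S' split from the grammar I give further down, not a claim about
FL1's real rules):

# One lexical class: every leaf node is just an 'a'.
# Two syntactic classes: internal nodes are S (sentence) or Sp
# (standing in for S', sub-sentence), and neither syntactic class
# is ever embodied in a single word.
tree = ("S",
        ("Sp", "a", "a"),        # first sub-sentence: two words
        ("Sp", "a", "a", "a"))   # second sub-sentence: three words

def leaf_categories(node):
    # Categories of words (leaves) only.
    if isinstance(node, str):
        return {node}
    return set().union(*(leaf_categories(c) for c in node[1:]))

def internal_categories(node):
    # Categories of internal (non-leaf) nodes only.
    if isinstance(node, str):
        return set()
    return {node[0]}.union(*(internal_categories(c) for c in node[1:]))

print(leaf_categories(tree))      # {'a'}: one lexical class
print(internal_categories(tree))  # {'S', 'Sp'}: two syntactic classes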

> One nice thing about the hyper-minimalistic FL1 is that the actual tree/mindmap seems pretty much like that of any human language, so it's easy to consider it as potentially relevant for natural languages.

Which is exactly why And and I (and the vast majority of engelangers
and semanticians) consider PL relevant for natural languages. PL
encodes what you are claiming FL1 encodes. And & I are arguing that
FL1, as you have described it, *can't* actually do that in an
unambiguous manner. (I thought of a couple of logical possibilities
for how it could, which would have been consistent with the evidence
up to that point, but then you didn't like any of them, so....)

> To answer the inevitable question why people did not choose this "optimal" structure in the first place, you can't properly use it in real time; in any effort to make a language that is good for expressing yourself with, you'd have to solve the problem of how to put the complex structures into linear form. In other words aS is the grammar, but it entails a problem that needs to be solved one way or another; my suggestion is that, looking at the mindmap, there are endless ways to do so.

Resolving exactly this problem is the challenge of the ergonomic
engelang, which no one has yet managed to produce a convincing example
of.

> I also believe that the majority of linguists have not accepted - or considered - the idea that natural languages are anyhow based on a structure similar to PL.

That's because they *aren't* based on PL, or on the notations that are
incidentally used to express PL. Rather, predicate logic was developed
in order to abstract *away* from the complexity and ambiguity of
language, and deal directly with the underlying semantics. You think
languages encode mindmaps? That's exactly what PL was intended to
provide a framework for understanding. Predicate logic tells you how
to manipulate mindmaps independently of any one specific language.
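
Here is a trivial sketch of what "manipulating a mindmap" means in
that sense; nothing in it knows about English, FL1, or any word order
(the predicate names are invented for the example):

# One fact and one inference rule: chase(x, y) -> pursue(x, y).
# The derivation operates on the graph directly; no language involved.
facts = {("chase", "dog", "cat")}

def apply_rule(facts):
    derived = set(facts)
    for (pred, *args) in facts:
        if pred == "chase":
            derived.add(("pursue", *args))
    return derived

print(apply_rule(facts))
# {('chase', 'dog', 'cat'), ('pursue', 'dog', 'cat')}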

>> Indeed you can, but S -> aS is not the grammar that describes that
>> structure or allows for the application of those rules.
>> What you need is something like this (playing fast-and-loose with
>> Kleene star because the linear branching direction is totally
>> irrelevant here):
>>
>> S -> S'S'*
>> S' -> aa*
>>
>> Or, in other words, a sentence is composed of a linear sequence of an
>> arbitrary number of sub-sentences (S', or S-primes) greater than or
>> equal to one, and each subsentence is composed of a linear sequence of
>> an arbitrary number of words ('a's) greater than or equal to one.
>
> If you put it all together this way, I'd still prefer to use the aS structure as a key element. Do you have any suggestions for a grammar with aS (or AS), even if it's more complicated? I can't seem to make any one work properly with JFLAP although Kleene star should be allowed.

This should work (reading 0 as the empty string):

S -> S'S | 0
S' -> aS' | 0

Note, however, that this is an ambiguous grammar; the surface forms
that it produces are just lists of 'a's. Disambiguating it requires at
the very least adding an additional syntactic category to mark either
the beginning or the ending of an S' (sub-sentence) constituent (which
could be distinguished from the syntactic category 'a' by morphology,
without necessarily introducing an additional lexical category), or
actually distinguishing a second separate lexical category of function
words (like "topic" and "event") for that purpose.
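
To make the ambiguity concrete, here's a quick brute-force check
(plain Python, nothing to do with JFLAP) that lists the distinct ways
a surface string of 'a's can be grouped into S' constituents under
that grammar, ignoring the further parses involving empty
constituents:

def groupings(s):
    # All ways to cut s into non-empty contiguous S' constituents.
    if not s:
        yield []
        return
    for i in range(1, len(s) + 1):
        for rest in groupings(s[i:]):
            yield [s[:i]] + rest

for parse in groupings("aaa"):
    print(parse)
# Four distinct parses of the same surface string:
# ['a', 'a', 'a'], ['a', 'aa'], ['aa', 'a'], ['aaa']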

>> I repeat: paaS can unambiguously encode arbitrarily
>> complex semantic graphs. FL2 can only unambiguously describe
>> tree-structured graphs. That's nothing to feel ashamed of; natlangs
>> don't do much better. But its expressive power is strictly less than
>> that of paaS.
>
> In formal language theory regular grammars have less expressive power than context free grammars. Are you referring to database theory or something?

Formal language theory in the way you are referring to it (which, by
the way, is primarily the domain of *computer science*, but which has
nevertheless been embraced by mainstream linguists to an arguably
*inappropriate* extent which, in my opinion, has held back progress in
theoretical syntax by decades) deals only with the classes of symbol
strings that a grammar can describe (or, equivalently, the kinds of
computations that it can represent). In other words, it deals strictly
with symbolic syntactic manipulations, completely divorced from
semantics.

But languages are not restricted to encoding information solely in
their syntax. Yes, FL2 syntax contains more information than paaS
syntax. Yes, it has a more powerful grammar at the strictly syntactic
level. But the interpretation rules of paaS allow it to encode
strictly more possible semantic graphs / mindmaps / predicate-argument
structures (specifically *all* of them: arbitrarily complicated graphs)
than the semantics of FL2 (so far as I understand them) are capable of
(specifically, only the tree-structured ones).
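
This is not paaS itself (I'm not restating its rules here), just a
generic illustration of the point: the string below is drawn from a
trivially regular language (word*), yet it pins down an arbitrary
labelled graph, because the graph lives in the interpretation rules
rather than in the string's syntactic structure.

# The flat token string is syntactically just a list of words.
tokens = "dog agent e1 cat patient e1 park location e1".split()

def interpret(tokens):
    # Read the flat string as (node, edge-label, node) triples.
    return {tuple(tokens[i:i + 3]) for i in range(0, len(tokens), 3)}

print(interpret(tokens))
# {('dog', 'agent', 'e1'), ('cat', 'patient', 'e1'),
#  ('park', 'location', 'e1')}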

-l.