> [Logan:] Ah, I think I see what you're going for now. You are effectively
> dividing your sentence into sub-clauses (which in the last example
> are all terminated by the marker "topic"), each of which describes, by
> conjunction of predicates[1], a single entity that is part of the
> structure of the over-arching event described by the whole sentence.

> In essence, "topic" is acting as a function word to divide sub-clauses
> that describe separate entities. That could just be an artifact of the
> particular example you chose;

That's right. As the head I've used words like "action", "so" or "apropos", or the action word itself (e.g. "kick"), omitting the obligatory FL2 coordinator. For what are conventionally subclauses, I've simply used an ordinary subordinator.

> perhaps you do in fact intend that any
> word could come at the end vs. beginning, which preserves your claim
> of monocategoriality, and if this is in fact supposed to work the way
> I think it does, then that's perfectly OK and doesn't necessarily
> break anything. Sub-clauses could be distinguished by intonation /
> punctuation without requiring a special word or even any morphology.
> But there are distinct subclauses, which means there is at least a
> one-level deep tree structure involved, which means that the linear S
> -> aS grammar is not an accurate reflection of the actual syntax of
> the language.

Right, it isn't in the conventional sense. I'm not considering truth values, just the simplest associative map between words; complex meanings can be linked through a common head. The syntax is essentially a mindmap, as I've said. The idea is to show what the core element of human language is. Looking at the mindmap suggests that language is a matter of expressing who did what, where and when, and that you have to organise these elements in a meaningful way. This is pretty much what I mean by logical necessity governing all languages: even though it has been suggested that languages could be "just anything", they actually can't be, if the purpose is to express something.

In formal terms, whatever you do, it will always build on aS; what's more, the semantics of the single category will have to be included. You can have a single-category all-noun language with a variety of grammars. An all-verb language is also possible, but it would require two different classes of verbs, one of which would easily be interpreted as nouns (cf. predicate words functioning as either predicates or constants, or n-ary vs. unary predicates, where all actions are transitive).
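
To make the associative-map idea concrete, here is a minimal sketch in Python (the example words, and the choice of "assault" as the head, are only illustrations, not part of FL1 proper):

    # An FL1-style sentence as a mindmap: every word belongs to the same
    # single category, and the edges are bare associations from a head word.
    mindmap = {
        "assault": ["boy", "girl", "house"],  # the head links its participants
        "house": ["in"],                      # a further association on one entity
    }

    def associated(head, graph):
        """Collect every word reachable from the head by association."""
        seen, stack = [], [head]
        while stack:
            word = stack.pop()
            if word not in seen:
                seen.append(word)
                stack.extend(graph.get(word, []))
        return seen

    print(associated("assault", mindmap))
    # ['assault', 'house', 'in', 'girl', 'boy'] -- order is traversal-dependent

Nothing here carries truth values; the structure only records which words are associated, which is all the single category needs semantically.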

A question that arises when working on this level is whether natural languages have lexical categories in the first place, or whether parts of speech are just lexicalised syntactic tags and semantically all words refer to a nominal idea. While I said FL1 has one *lexical* category, Gil writes about a monocategorial language with just one *syntactic* category, but I'm not sure what the difference is.

One nice thing about the hyper-minimalistic FL1 is that the actual tree/mindmap looks pretty much like that of any human language, so it's easy to consider it potentially relevant for natural languages. To answer the inevitable question of why people did not choose this "optimal" structure in the first place: you can't properly use it in real time. In any effort to make a language that is good for expressing yourself, you have to solve the problem of putting complex structures into linear form. In other words, aS is the grammar, but it entails a problem that must be solved one way or another; my suggestion is that, looking at the mindmap, there are endless ways to do so.

One simple solution is to remove all repeating structures, whereby a previous example becomes "assault-VERB boy-OBJ house-IN girl-SUB", a very basic VOS structure whose word order is potentially free. Another solution is to establish a class of verbs and use a fixed word order: "boy assault girl house-IN". IN can be a grammatical morpheme, but FL1 also explains how it can be a word of its own, as in "boy assault girl in house". Although directly derived from the unambiguous FL1, these structures are now ambiguous. This could serve as one part of the answer as to why natural syntaxes are ambiguous even though it is theoretically simpler to create unambiguous structures.
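
As a rough illustration of how mechanical these linearisations are, here is a Python sketch (the role labels VERB/SUB/OBJ/IN are invented for the purpose, and I've taken the boy as the subject for concreteness):

    # One underlying FL1-style structure: a head plus role-tagged participants.
    event = {"head": "assault",
             "roles": [("SUB", "boy"), ("OBJ", "girl"), ("IN", "house")]}

    def tagged_free_order(ev):
        """Solution 1: keep the role tags; the word order is then free."""
        words = dict(ev["roles"])
        words["VERB"] = ev["head"]
        return " ".join(f"{words[r]}-{r}" for r in ("VERB", "OBJ", "IN", "SUB"))

    def fixed_order(ev):
        """Solution 2: drop most tags and rely on a fixed SVO-like order."""
        roles = dict(ev["roles"])
        return f'{roles["SUB"]} {ev["head"]} {roles["OBJ"]} {roles["IN"]}-IN'

    print(tagged_free_order(event))  # assault-VERB girl-OBJ house-IN boy-SUB
    print(fixed_order(event))        # boy assault girl house-IN

Both functions read off the same underlying map, which is the point: the choice between them is a linearisation decision, not a change in meaning.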

> It's formal semantics. I don't know if you consider that "basic"
> linguistics or not, but that's what it is.

Formal semantics is actually not basic linguistics. For example, at the University of Edinburgh it's placed in the School of Informatics, but it's also part of Speech and Language Technology studies. At Helsinki it's in the Department of Computer Science, but also part of Language Technology studies - call it interdisciplinary. It's not what linguistics is generally about, and the theoretical framework is essentially different, although there are efforts to bring the two together.

I believe that formal semantics may be regarded as marginal from the perspective of a mainstream linguist, although, having a useful application, it has gained a lot more prestige in recent years. I also believe that the majority of linguists have not accepted - or even considered - the idea that natural languages are in any way based on a structure similar to PL. Such an idea might be associated with Montague Grammar, which for instance has a Wikipedia article in only eight languages, excluding Finnish, German, French and Russian among most others. The "Formal semantics (linguistics)" article exists only in the English Wikipedia, as a stub that has been proposed for merging with the "Formal semantics (logic)" article since 2012, which in turn is shorter and available in fewer languages than the linking "Semantics (computer science)" article.

What And Rosta wrote about linguistics earlier seemed pretty eccentric to me, but if the underlying assumption is that formal semantics is somehow fundamental to linguistics, then who knows, maybe it would seem like something of the kind. In this world, though, I don't think so. Of course it's easy for computer scientists to think of the other kinds of people involved with languages - arts people (who are sometimes also called "linguists") - as unskilled. Building on this, try to imagine what a mainstream linguist might (in the worst case) think of a computer scientist's conceptions of linguistic theory.

> And if you extend your syntax trees to the interior of words, then
> agglutinative is exactly isomorphic to isolating.

That's true though. The book has the kind of stuff I could definitely use as a source. I guess what my article adds is (1) an actual method of stripping a complex language down to a monocategorial one, using evidence from natural languages, and (2) a proposed universal semantics (applicable within a sandbox) for particles, or a way to replace them with content words without losing any of the meaning or functionality.
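
A throwaway Python sketch of that isomorphism (the morphemes are invented glosses): the same two-node tree serialises either inside one agglutinated word or as two free words.

    # The same head-plus-dependent tree, spelled out in two ways.
    tree = ("in", "house")  # the head 'in' with 'house' as its dependent

    def agglutinating(t):
        """Word-internal serialisation: the head surfaces as a suffix."""
        head, dep = t
        return f"{dep}-{head.upper()}"   # house-IN

    def isolating(t):
        """Free-word serialisation: the head is a word of its own."""
        head, dep = t
        return f"{head} {dep}"           # in house

    print(agglutinating(tree), "/", isolating(tree))  # house-IN / in house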

Of course some people will always argue that it's "all too theoretical" to directly prove anything; on the other hand, my method is at least far less speculative than any attempt to capture the birth of human language in retrospect, which, as we see, has nevertheless been accepted. I think journals realise it's important to stay open-minded. Of course the difference between me and Gil is that the peer reviewers know him from seminars and recognise his writing immediately; some will be positive about his idea, while the others don't matter. As for my article, the future will tell. I might send it to a journal in spring (the previous one can remain an upload; 1391 downloads to date).

> Indeed you can, but S -> aS is not the grammar that describes that
> structure or allows for the application of those rules.
> What you need is something like this (playing fast-and-loose with
> Kleene star because the linear branching direction is totally
> irrelevant here):

> S -> S'S'*
> S' -> aa*

> Or, in other words, a sentence is composed of a linear sequence of an
> arbitrary number of sub-sentences (S', or S-primes) greater than or
> equal to one, and each subsentence is composed of a linear sequence of
> an arbitrary number of words ('a's) greater than or equal to one.

If you put it all together this way, I'd still prefer to use the aS structure as a key element. Do you have any suggestions for a grammar with aS (or AS), even if it's more complicated? I can't seem to make one work properly in JFLAP, although the Kleene star should be allowed.
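
The closest I've come is compiling the star away into extra productions. A minimal Python sketch (where '#' is an invented boundary symbol standing in for the intonation or punctuation break; without some such boundary the quoted grammar collapses to plain a+):

    import re

    # A star-free right-linear grammar that keeps aS as the key production:
    #   S -> aS | a | aB
    #   B -> #S
    # It generates a+(#a+)*: one or more subsentences of one or more words.

    def in_language(s):
        """Membership check for a+(#a+)*, the language of the grammar above."""
        return re.fullmatch(r"a+(#a+)*", s) is not None

    for w in ["a", "aaa", "aa#a", "aa#a#aaa", "#a", "a#"]:
        print(w, in_language(w))
    # a True / aaa True / aa#a True / aa#a#aaa True / #a False / a# False

Star-free productions of that shape are also the kind a tool like JFLAP takes directly, if I'm reading it right.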

> I repeat: paaS can unambiguously encode arbitrarily
> complex semantic graphs. FL2 can only unambiguously describe
> tree-structured graphs. That's nothing to feel ashamed of; natlangs
> don't do much better. But its expressive power is strictly less than
> that of paaS.

In formal language theory, regular grammars have less expressive power than context-free grammars; for instance, a^n b^n is context-free but not regular. Are you referring to database theory or something?

> S -> abbS is *the same grammar* as S -> paaS. You've just renamed the variables.

Well, obviously. I may be lost sometimes, but not SO lost! :P