Years ago I was involved in building a natural language processor that was to overlay an expert system dealing with Pascal, implemented on a PDP-11. For those not familiar with Pascal, it suffices to say that it was an ideal teaching language - encouraging good structured programming and data structuring. Niklaus Wirth published it in 1970.
By far the biggest challenge was to efficiently convert natural language expressions into machine-interpretable canonical representations. The system needed to perform content analysis, fact-finding, question answering, inference, translation and sense-making. It soon became clear that the universe of features found in English (as in any natural language) is large, complex and potentially unbounded. With great effort, a simplified grammar and a reduced vocabulary, we produced a recognizer of sorts.
The reason I recount this is my interest in functional grammar as taught in schools, and in particular in Kaplan and Bresnan’s work on Lexical Functional Grammar and their treatment of functional structure analysis when looking at syntax. For example, the functional structure (f-structure) encodes linguistic information about the functional relations between the parts of a sentence.
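To make this concrete, here is a toy sketch of what an f-structure for a sentence like "The cat chased a mouse" might look like. This is an illustration only, not Kaplan and Bresnan's formalism itself: the attribute names (PRED, SUBJ, OBJ, TENSE, DEF, NUM) follow common LFG conventions, but the nested-dictionary encoding is my own assumption for the example.

```python
# A toy f-structure for "The cat chased a mouse".
# Attribute names follow common LFG conventions; the nested-dict
# representation is an illustrative assumption, not a standard encoding.
f_structure = {
    "PRED": "chase<SUBJ, OBJ>",  # the predicate and the grammatical functions it governs
    "TENSE": "PAST",
    "SUBJ": {                    # the subject has its own embedded f-structure
        "PRED": "cat",
        "DEF": True,             # "the" marks the subject as definite
        "NUM": "SG",
    },
    "OBJ": {
        "PRED": "mouse",
        "DEF": False,            # "a" marks the object as indefinite
        "NUM": "SG",
    },
}

# Reading off a functional relation: what is the subject of the sentence?
print(f_structure["SUBJ"]["PRED"])
```

The point of the sketch is that grammatical functions such as subject and object are represented directly as attributes, rather than being implicit in word order, which is what makes the representation useful for machine interpretation.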
Now, I know much has been said about the benefits of teaching and using what has been a controversial program - functional grammar. Years ago, teachers who understood functional grammar were keen to see it maintained and used side by side with traditional grammar. Terms such as "participant", "process" and "lexical chain" would sit as comfortably as "noun" and "verb", providing a way of describing how language is used, what language does and how it is applied.
Functional grammar emphasises the ways in which language functions to convey meaning, but it also relies upon knowledge, understanding and use of the terms of traditional grammar (see Fig 1).