Machine-assisted conlanging, imho, has a lot of promise. There isn’t enough time to create many complete conlangs by hand. If they were machine generated, they could be evaluated on various criteria, such as minimal sentence length, reading level of typical sentences, reading level of maximally complex sentences, etc.
I’ve put some thought into this and… I don’t think there is a general solution for grammar generation. At best one can generate a variety of highly similar grammars. There just isn’t an easy way to index all possible communication systems and then pick from them at random. For example, if one were to generate IE-style languages, they would have a variety of endings and prefixes around a root, the affixes would do all the grammatical work, and syntax would be de-emphasized. An English- or Khmer-like language would instead have a variety of order-based syntaxes. And so on. While you can vary syntax parametrically (do the adjectives come first or last? flip a coin!), the fundamental system differences from language to language aren’t a matter of parametric differences. I can’t conceive of a coin to flip that sometimes yields a logical language, sometimes something like an Algonquian language, and sometimes a Near-East-style language with tri-consonantal roots.
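To make the parametric idea concrete, here’s a minimal sketch of what coin-flip generation looks like. All the parameter names and the toy renderer are my own invention for illustration; the point is that every grammar this produces is the same *kind* of system underneath, just with the surface order shuffled:

```python
import random

# Hypothetical word-order parameters; each one is a coin flip (or die roll).
PARAMETERS = {
    "adjective_first": (True, False),          # adjective before or after noun
    "constituent_order": ("SVO", "SOV", "VSO"),
}

def random_grammar(rng=random):
    """Flip a coin for each parameter and return a grammar spec."""
    return {name: rng.choice(values) for name, values in PARAMETERS.items()}

def render(grammar, subject, verb, obj, adjective):
    """Render a toy sentence under the chosen parameters."""
    noun_phrase = (f"{adjective} {obj}" if grammar["adjective_first"]
                   else f"{obj} {adjective}")
    parts = {"S": subject, "V": verb, "O": noun_phrase}
    return " ".join(parts[slot] for slot in grammar["constituent_order"])

if __name__ == "__main__":
    g = random_grammar()
    print(g)
    print(render(g, "cat", "sees", "dog", "small"))
```

No matter how the coins land, you only ever get variations on one fixed-word-order architecture; nothing here can flip its way into a polysynthetic or root-and-pattern system.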
A truly general grammar generator would have to conceive in advance of all possible systems and then pick from them at random.
Anyhow, I’m back to the drawing board, this time to create *specialized* grammar generators, e.g. a program that spews out Romance conlangs or the like.
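A specialized generator can bake the family’s architecture right in. Here’s a toy sketch of the Romance case: apply a fixed cascade of sound changes to Latin-like roots. The rules below are invented for illustration (they only loosely echo real Romance developments), but the structure, an ordered rule cascade, is the part a Romance-conlang generator could randomize over:

```python
import re

# Toy sound-change cascade, applied in order. Rules are illustrative,
# not historically accurate.
SOUND_CHANGES = [
    (r"um$", "o"),                        # final -um > -o
    (r"em$", "e"),                        # final -em > -e
    (r"ct", "it"),                        # -ct- > -it- cluster simplification
    (r"([aeiou])p([aeiou])", r"\1b\2"),   # intervocalic voicing of p
]

def derive(latin_root: str) -> str:
    """Run a root through each sound change in sequence."""
    word = latin_root
    for pattern, replacement in SOUND_CHANGES:
        word = re.sub(pattern, replacement, word)
    return word

if __name__ == "__main__":
    for root in ["noctem", "factum", "sapere"]:
        print(root, "->", derive(root))  # noctem -> noite, etc.
```

To generate *different* Romance conlangs, the program would pick a different (but still Romance-plausible) rule cascade each run, rather than trying to parameterize all possible grammars at once.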