I have spoken before of Kircher’s Universal Language dictionary, which aimed to reduce all words to numbers that could be transmitted to a recipient, who would then look up those numbers in his own vernacular and so read the message.
Of course, from our modern-day viewpoint, all this translation system really does is slow translation down by introducing a dictionary lookup in place of the actual word. Direct word-for-word translation also leads to some interesting mix-ups.
Anyway, in 1663 Kircher published Polygraphia nova et universalis (the New and Universal Polygraphy), a grandiose tract that promised far more than it could deliver. It was a typical Kircher work in that it took the ideas of others, spun them around and presented them in a new manner. Kircher was a master of this art, and his Polygraphia is one of his masterpieces. Notwithstanding that, the Polygraphia is a good example of how the intellectuals of Europe considered language and codes at this point in time, so let’s look at the state of “universal languages” via this book – because its content sits intellectually very much on the dividing line between medieval and Renaissance thinking.
First off, let me distinguish between “universal languages” and codes. The distinction was somewhat blurry at the time, but roughly speaking, there were two conflicting desires. One was to find a “universal language” that would allow any two people from different nations to communicate – the fact that one already existed, Latin, seems not to have bothered those searching for this Utopia. The other was to find a language that nobody could read, what we would today call cryptography. This second desire came from the vague sense that the usual substitution codes of the day were breakable – and indeed, they were. The first desire belonged to the philosophers and took its authority from the Bible; the second belonged to merchants and princes and took its authority from the military and the banks.
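Those substitution codes were breakable because swapping one alphabet for another does nothing to hide letter frequencies. A minimal sketch (the substitution key here is invented for illustration) shows why: counting ciphertext symbols immediately points at the commonest plaintext letters.

```python
# Why the simple substitution ciphers of the day were breakable: letter
# frequencies survive the substitution, so the most frequent ciphertext
# symbol almost certainly stands for a frequent plaintext letter.
from collections import Counter
import string

# An arbitrary monoalphabetic substitution key (illustrative, not historical).
key = dict(zip(string.ascii_lowercase, "qwertyuiopasdfghjklzxcvbnm"))

plaintext = "the quick brown fox jumps over the lazy dog and the cat"
ciphertext = "".join(key.get(c, c) for c in plaintext)

# The codebreaker never sees the key, only the symbol frequencies.
counts = Counter(c for c in ciphertext if c.isalpha())
print(counts.most_common(3))
```

The most common ciphertext symbols here map back to e, t and o – exactly the letters a 17th-century codebreaker armed with a frequency table would guess first.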
Anyway, Kircher promised the following sections in his book:
1.- The Reduction of all Language to One
2.- The Extension of all Language to All
3.- A Technologia; or a universal Steganographic Secret operating by combinations of things whereby through a technique impenetrable to the human mind, one may transmit one’s secrets to another in nearly a thousand ways
It all seemed exciting enough, but as usual the promises were empty, and Kircher’s dreams were simply too large to accommodate the reality. It also wasn’t anything new. All the systems described in the Polygraphia were already known, but Kircher was the first to package them up as a system for the mass market. You can’t accuse the chap of plagiarism, as he was pretty up front about his sources, most of which he acquired via his international network of Jesuit scholars.
Section I, the “Reduction”, was an international code which tried to represent all words as a two part symbol, the first part referring to the meaning of the word (as recorded in a lookup table), the second part representing the grammatical function of the word (based on the morphology of the Latin language – so you had to know Latin to use the system).
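The mechanics of the “Reduction” can be sketched in a few lines. Everything here – the tiny lexicon, the role markers, the symbol format – is invented for illustration; Kircher’s real tables ran to thousands of entries and leaned on the full machinery of Latin morphology.

```python
# A toy sketch of the two-part "Reduction" code: a number for the concept
# (shared across languages) plus a marker for the grammatical role.
# Lexicon, role markers and symbol format are all illustrative inventions.

LEXICON = {"friend": 1, "love": 2, "letter": 3}          # concept -> number
REVERSE = {"it": {1: "amico", 2: "amore", 3: "lettera"}, # number -> vernacular
           "fr": {1: "ami", 2: "amour", 3: "lettre"}}

ROLES = {"nom": "N", "acc": "A", "gen": "G"}             # grammatical function

def encode(word, role):
    """Encode a word as (concept number, role marker)."""
    return f"{LEXICON[word]}.{ROLES[role]}"

def decode(symbol, lang):
    """The recipient looks the number up in his own vernacular table."""
    number, _role = symbol.split(".")
    return REVERSE[lang][int(number)]

msg = [encode("friend", "nom"), encode("letter", "acc")]
print(msg)                              # ['1.N', '3.A']
print([decode(s, "it") for s in msg])   # ['amico', 'lettera']
```

Note that the decoder simply drops the role marker here; in Kircher’s scheme the recipient was expected to know Latin grammar well enough to reinflect the word himself, which is precisely why the system was less universal than advertised.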
The system was based on the Tironian Note system, a manner of shorthand dating back to Roman times and supposedly invented by the secretary of Cicero. It’s been noted that in 1624 a man called Gustavus Selenus had already published a book outlining this system, although Kircher seems in fact to have copied a 1650 system invented by a deaf-mute Jesuit priest from Madrid, upon which Kircher had commented in an article from 1660. Kircher ends this section with his universal dictionary.
Section II, the “Extension”, is the opposite. It is a table of equivalences in which each letter of the alphabet is substituted by Latin words. By substituting these Latin words for his letters, the encoder is gratified to find that his sentence has turned into flowing Latin prose. The receiver then looks up each Latin word in his dictionary and substitutes back the original letter. It’s a steganographic code that replaces letters with words, and hides your meaning in a somewhat stilted Latin composition.
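The scheme is simple enough to sketch directly. The word list below is invented; the real tables offered many Latin words per letter, precisely so the encoder could vary his choices and keep the cover text reading like natural prose.

```python
# A toy version of the letter-for-word steganography of Section II, in the
# spirit of Trithemius's "Ave Maria" cipher. The Latin word list is invented.

TABLE = {"a": "deus", "v": "gloriosus", "e": "clemens"}  # letter -> Latin word
INVERSE = {word: letter for letter, word in TABLE.items()}

def hide(plaintext):
    """Replace each letter with its Latin word, yielding pious-sounding prose."""
    return " ".join(TABLE[c] for c in plaintext)

def reveal(latin_text):
    """The receiver looks up each word and recovers the original letters."""
    return "".join(INVERSE[w] for w in latin_text.split())

secret = hide("ave")
print(secret)          # deus gloriosus clemens
print(reveal(secret))  # ave
```

The weakness is obvious from the sketch: the cover text is exactly as long, word for letter, as the hidden message, so the “flowing prose” balloons quickly and its stilted rhythm betrays it.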
Kircher here admits that he based his system on Johannes Trithemius, whose original work on encryption had for some mad reason been written as if it were a book on sorcery, but that’s another story. We’ll talk about it later.
Section III is an implementation of the Vigenère system, a set of cryptographic techniques comprising substitution ciphers and letter keys.
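The Vigenère principle that Section III packaged is worth a quick modern sketch: each plaintext letter is shifted by the corresponding letter of a repeating key, so the same plaintext letter encrypts differently at different positions – which is what defeated the simple frequency counting that broke monoalphabetic ciphers.

```python
# A minimal Vigenère cipher of the kind Section III mechanised: each letter
# is shifted by the corresponding letter of a repeating key word.
from itertools import cycle

A = ord("A")

def vigenere(text, key, decrypt=False):
    """Encrypt (or decrypt) an uppercase, letters-only text with a key word."""
    sign = -1 if decrypt else 1
    out = []
    for p, k in zip(text.upper(), cycle(key.upper())):
        out.append(chr((ord(p) - A + sign * (ord(k) - A)) % 26 + A))
    return "".join(out)

ct = vigenere("POLYGRAPHIA", "KIRCHER")
print(ct)
print(vigenere(ct, "KIRCHER", decrypt=True))  # POLYGRAPHIA
```

Kircher’s arks did mechanically what the `cycle` over the key does here: align a sliding scale of alphabets against the plaintext, one position per letter.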
The whole book is largely a farce compared with its stated ambitions, which must have been apparent even at the time. Section I is nothing more than a numbered vocabulary list. It is, I suppose, possible that if both parties had the lookup tables in front of them, a stilted conversation could be held, but how would this be any faster than using two phrase books? The system was also extremely limited: it included no real grammar, and the dictionary’s vocabulary was small.
The Extension in Section II could be adapted to other languages, should anyone care to translate the tables from Latin into a vernacular, but it falls far short of the promises made by Kircher in the introduction.
But this is the common problem of all such systems in the 17th century, and it shows a lack of understanding of what language actually is. The philosophers of the time had a hunger for a true One Language, the mythical language spoken before the fall of the Tower of Babel, and in their search they failed to create; they simply tried to reconstruct what they assumed had been destroyed. This, indeed, was why Kircher was so fascinated with hieroglyphics: he felt they must preserve the original Language spoken before the fall of language.
European language theory rested solely on the Biblical interpretation. There was the original Divine Language spoken to Adam & Eve, the language that was lost after the Tower of Babel. Thoughts of a universal language always revolved around this concept. As Umberto Eco puts it, there was a feeling that a divine language that everyone could understand from birth must exist, but that it was hidden from us, and so we had to search for it. Eco, incidentally, tends to interpret these early fumblings as the beginnings of linguistic theory, but personally I tend towards the opposite view – this wave of stilted doxastic attitudes needed to be swept away before any real insight could be gained.
Communication was the key. How to communicate fascinated Kircher, who after all belonged to an order whose principal aim was to teach. His machina cryptologica was a series of bottles on magnets that could transmit a message – it was less a cryptological machine than a semaphore.
Kircher was convinced that communication lay at the heart of things, observing that communication was to be found everywhere, and this was a common enough supposition. Kircher once wrote that:
“I have seen a complete crucifix in an agate-stone [..] in tufa rock I have seen a whole alphabet whose letters were formed of the veins in the stone [..] I once caught a butterfly on whose wings nature had accurately imprinted the face of our Saviour”.
This was how differently the early moderns regarded language. They considered their day-to-day languages to be inferior, temporary, human-made constructs; God had a secret universal communication that he had hidden from us until we were wise enough to use it anew, and they hoped to discover it. It was the doctrine of signatures written into linguistic communication. The idea of deconstructing a vernacular and rebuilding it into an artificial language – such as Esperanto, for example – would have to wait until the 18th and 19th centuries.
Johannes Trithemius had had the same problem. When he wrote his Steganographia in 1499 he stored up no end of trouble for himself by placing it within a magical religious framework. His work was entirely about cryptographic techniques, but on the surface it seemed magical, verging on the demonic. Scholars have puzzled for centuries over why he wrote this innocuous enough tome in a way guaranteed to infuriate the Inquisition; the theory is that he too was thinking of this religious linguistic duality, and so couched his language in a religious form – communicating with spirits – simply because of the same Biblical interpretation of the essence of language.
Of course, once the Renaissance came along and natural philosophers started their evolution into specialised scientists, all this woolly thinking was cut out of the equation and we started getting genuinely artificial languages. But the philosophical framework for such works didn’t exist in the 15th and early 16th centuries.
But universal languages in the first half of the 17th century all followed the same template. Each promised to make learning easy – with just a few hours of study, you would acquire a universal system of writing that could be applied to any language in the world. Each promised to reduce all languages back to their “primitive” form.
Some systems promised to sweep away the current damaged forms of language and create new “characters real” (the term comes from Francis Bacon’s praise of Chinese ideograms), which would allow the student of the system not just to communicate with anyone in the world but also to understand the truth of the thoughts of the ancients. There is an element of Kabbalism here: the idea that behind the words on the page, readable by anyone, hides the “true meaning”, a secret code that needs only a key to be unlocked. Bacon and Descartes were both involved in separate systems.
It wasn’t really until 1674, when Pierre Besnier wrote A Philosophical Essay for the Reunion of the Languages, that the concept of reuniting distinct languages became established. Pierre had spotted that words in different languages often seem similar – he had discovered the basics of etymology. His “reductive epistemology” tried to break languages down to discover how they had evolved – he called each step of evolution an alembic, after the alchemical vessel used to distil liquids into different conditions.
But Pierre’s new direction came too late for Kircher, who in any case had been developing his theories in the 1650s after being commanded by Emperor Ferdinand III to “create a universal language that would allow any Prince to communicate with any of his subjects in any nation or language unimpeded”. Kircher couldn’t fulfil the Emperor’s wish, but rather than admit failure he went off at a tangent and built a mechanical encryption device, something he called a “steganographic ark”. Its principles were based on the musical pipe organs he had built in the previous decade: the arks were large wooden devices designed to encode messages mechanically. In essence, they were giant slide rules that simply automated the encryption system.
This is what went into Section III of his Polygraphia. He designed a paper version for the book (far cheaper than building his arks, which he sent to the Emperor and several rulers, but just as flattering for the second-rate powers that were his next audience), called a Glottotactic Ark, which promised to be “good for writing letters throughout the whole world”. Kircher loved this “magnetic language” philosophy and promised that it would give anyone access to the knowledge of the world. But the arks were not what Kircher promised. He claimed they were a true universal language that allowed their users to bypass vernacular languages and communicate on a true level. In reality, they were just automated encoding machines that transposed vernacular language into a mechanical code, preserving all the features and flaws of the original communication. Descartes had seen this coming when he wrote, as far back as 1637 in the Discours de la méthode, about fantastical machines, concluding in the end that they could not replace the essence of the human spirit. They could, he said (I paraphrase here a bit), only carry out our instructions; they could not make a quantum leap to the spiritual. A language machine was bound to fail.
An alternative to Kircher’s language machines were the volvelles, which I have touched on elsewhere. Georg Philipp’s 1651 Mathematical Recreations included an immensely popular volvelle (illustrated below) that promised to generate 97,209,600 German words automatically, although of course many of these were invented words that didn’t exist.
Georg and his contemporaries were not much bothered about dissecting the true nature of language. Language was instead the linguistic equivalent of the alchemical process. An “Art” for them meant a method for producing objects (as opposed to our modern-day meaning), and they wanted language machines to produce as many combinations as possible. They seem to have had a faint idea that the more words they could produce, the more productive they would be.
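The combinatorial “Art” of the volvelle is easy to sketch: each ring holds word fragments, and every alignment of the rings spells out a candidate word. The tiny rings below are invented for illustration – the real device’s rings held enough fragments to multiply out to tens of millions of combinations, most of them words no German ever spoke.

```python
# A sketch of the volvelle's combinatorial word generation: every alignment
# of the rings concatenates one fragment from each. Ring contents are
# illustrative inventions, not Georg's actual fragments.
from itertools import product
from math import prod

RINGS = [
    ["ge", "ver", ""],       # prefixes (empty string = no prefix)
    ["l", "schr", "spr"],    # initial consonant clusters
    ["a", "ei", "au"],       # vowels
    ["nd", "ch", "ng"],      # final clusters
]

words = ["".join(parts) for parts in product(*RINGS)]
print(len(words), "combinations; e.g.", words[0])

# The word count is simply the product of the ring sizes: 3*3*3*3 = 81 here.
assert len(words) == prod(len(r) for r in RINGS)
```

Scaling the ring sizes up is how the advertised 97,209,600 arises: the count is just the product of the fragments on each ring, with no check at all that the result is a real word – which is exactly the point about productivity over meaning.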
Kircher was simply the standard-bearer of the thoughts of his day, a final swan song before the sun set on his way of thinking for good. There was a cognitive dissonance between his ambitions and his achievements – he wanted to set human communication free, but all he succeeded in doing was imprisoning it within a very strict confine of rules. His inventions were suited to the needs of large organisations that had to carry out tightly controlled communication across different regions and languages, and one can imagine the Jesuits using his schemes to send orders and reports amongst themselves across the five continents in which they were present. But as a means to his original aim of a true universal language, free from human artifice and cultural shackles, he (and those of his time) failed miserably.
For more on this subject, I heartily recommend Haun Saussy’s article “Kircher’s Magnetic Language”. Other references are hyperlinked in the text.