Mar. 2023
Bespoke software / website
Published in The New River, Spring 2023

"I will give you a word. Make a sentence or phrase with a series of words whose first letters sequentially spell out my word. Your sentence doesn't have to have a strong semantic connection with the word I give you.

Here're some good examples:
Cake - Creating amazing kitchen experiences.
Fire - Fierce inferno razed everything.
Smile - Some memories invoke lovely emotions.

Here's a bad example:
Abandon - Alice abandoned her plans to move to the city when she realized the cost of living was too high.

Now make a sentence for /WORD/.

According to our rules, there should be /N/ words in your sentence. Your response should only contain the sentence you make. "

Above is the prompt I wrote for GPT-3.5 Turbo, the large language model that currently powers ChatGPT. In response to my prompt, the model produces acrostics in sentence form. Thousands of word-sentence pairs compose this project, Lecsicon: Linguists Enthusiastically Catalog Symbols, Interpreting Carefully Occurred Nuances.
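The prompt mechanics can be sketched in a few lines of Python. The placeholder tokens /WORD/ and /N/ come from the prompt itself; the helper names (`build_prompt`, `is_strict_acrostic`) are mine for illustration, not the project's actual code, and the template is abbreviated to the prompt's final instructions.

```python
# Abbreviated template: only the closing instructions of the prompt above.
# The /WORD/ and /N/ placeholders are taken from the original prompt text.
PROMPT_TEMPLATE = (
    "Now make a sentence for /WORD/.\n\n"
    "According to our rules, there should be /N/ words in your sentence. "
    "Your response should only contain the sentence you make."
)

def build_prompt(word: str) -> str:
    # N is simply the number of letters in the target word.
    return PROMPT_TEMPLATE.replace("/WORD/", word).replace("/N/", str(len(word)))

def is_strict_acrostic(word: str, sentence: str) -> bool:
    # A response strictly follows the rules when the first letters of its
    # words spell out the target word, exactly one word per letter.
    initials = "".join(w[0].lower() for w in sentence.split())
    return initials == word.lower()
```

A validator like `is_strict_acrostic` is one plausible way the thousands of raw responses could be filtered down to the strictly rule-following pairs described below.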

As usual, the accuracy of GPT's responses was not guaranteed. At times, one or two extra words slipped into the sentence; in extreme cases, the model rambled on and lost track of the initial instructions. Out of over 27,000 attempts, 7,828 results strictly followed my rules. When one visits the Lecsicon web page, those 7,828 word-sentence pairs are typed out letter by letter by a program. Beginning with a random word, the program scans the Lecsicon database for a new word with the smallest Levenshtein distance from the preceding one. The temperature slider on the web page adjusts the scope of this search, yielding stronger or weaker connections between consecutive words displayed. Typing speed is also influenced by temperature, much as molecular motion is.
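The word-selection step can be sketched as follows. The edit-distance function is the standard dynamic-programming Levenshtein algorithm; the way `temperature` widens the candidate pool is my guess at how the slider might work, not the site's actual implementation.

```python
import random

def levenshtein(a: str, b: str) -> int:
    # Standard dynamic-programming edit distance between two strings.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def next_word(current: str, vocabulary: list[str], temperature: int = 1) -> str:
    # At temperature 1 this always picks a closest word; higher temperatures
    # widen the pool of candidates, weakening the connection between
    # consecutive words (an assumed reading of the slider's behavior).
    candidates = sorted((w for w in vocabulary if w != current),
                        key=lambda w: levenshtein(current, w))
    pool = candidates[:max(1, temperature)]
    return random.choice(pool)
```

At the lowest temperature the chain drifts through near-neighbors one edit apart; raising it lets the program jump to more distant words.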

Language games like acrostics captivate people like me because a hidden layer emerges when words are put together according to certain mechanisms. They make sense: the resulting sentences provide a greater context for the original words and reveal new perspectives. It is no coincidence that GPT makes sentences related to the original word's meaning. Rather, it is a deliberate calculation and balancing process that captures words through their relations to each other: a ghost wandering through a latent word-vector space. Any word in such a space is nothing but the linguistic associations around it, computed statistically from the patterns of the text humans have produced. Sentences emerge from their context, and perhaps that is why some entries are intriguing, such as "Music: Many unexpected sounds indicate creativity" or "Date: Dinner and theater experience". Lecsicon is made possible only by the fractal nature of the English language: the way we make sense of letters, words, sentences, and paragraphs.

Lecsicon - exhibition view