Composing Conversational Architecture by Integrating Large Language Model: From Reactive to Suggestive Architecture through Exploring the Mathematical Nature of the Transformer Model

Lok Hang Cheung*, Giancarlo Di Marco

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

First proposed in the 1960s, Conversational Architecture enhances the interaction between humans and a computer-integrated built environment. Today, most interactive designs are based on reaction and automation, and rarely on conversation. Although Natural Language Processing, including Large Language Models (LLMs), is considered a candidate for Human-Computer Interaction (HCI), LLM applications remain limited to verbal communication, and the syntactic relationship between LLMs and architectural composition is underexplored. This paper proposes a qualitative framework that integrates theoretical research on LLMs and HCI into the design of Conversational Architecture. Through a mathematical and algorithmic analysis of the transformer model, the key component of an LLM, its attributes are mapped onto Conversational Architecture parameters. With the identified design implications, a theatre hall design experiment is conducted, and through observation the feasibility and challenges of the proposed framework are analysed.
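For reference, the "mathematical nature" of the transformer referred to in the title centres, in the standard formulation (Vaswani et al. 2017), on scaled dot-product attention; whether the paper's mapping of attributes to Conversational Architecture parameters uses exactly this form is an assumption here:

$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V$$

where $Q$, $K$, and $V$ are the query, key, and value matrices derived from the input sequence, and $d_k$ is the key dimension used to scale the dot products.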

Original language: English
Journal: Nexus Network Journal
Publication status: Published - Nov 2024

Keywords

  • Computational design
  • Conversational architecture
  • Large language model
  • Performative optimisation
  • Transformer model
