Deep Dive: AI Tools for Architectural Research
Architectural research was once defined by dusty archives and endless Google searches. Now we have agents that can read, synthesise, and map millions of documents in seconds.

When we talk about AI in architecture, we usually mean image generation. But the text-processing capabilities of Large Language Models (LLMs) are far more transformative for the research phase of a project.
Imagine feeding a PDF of the local zoning code into an agent and asking it, "What are the setback requirements for a commercial lot on 5th Street?" and getting an instant, cited answer. This is not science fiction; it is simply Retrieval-Augmented Generation (RAG).
Understanding Semantic Search
Traditional search relies on keywords. If you search for "brutalist concrete," you get pages with those exact words.
Semantic search understands concepts. It converts text into numerical vectors (embeddings), in which words with related meanings sit close together: "heavy" and "oppressive" end up mathematically near "brutalist."
This unlocks a new way of finding precedents. You can search for "buildings that feel like a labyrinth" or "structures that mimic cellular biology," and the system will return relevant results even if those exact words never appear in the source text.
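The geometry behind this is simple: each text becomes a vector, and "relatedness" is the angle between vectors, measured by cosine similarity. Here is a minimal sketch in pure Python. The three-dimensional vectors are invented for illustration; real embedding models produce hundreds or thousands of dimensions.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Toy 3-dimensional "embeddings" (invented values, purely to show the idea).
embeddings = {
    "brutalist":  [0.9, 0.8, 0.1],
    "oppressive": [0.8, 0.9, 0.2],
    "pastel":     [0.1, 0.2, 0.9],
}

query = embeddings["brutalist"]
for word, vec in embeddings.items():
    print(f"{word}: {cosine(query, vec):.2f}")
```

Run it and "oppressive" scores far closer to "brutalist" than "pastel" does, even though the words share no letters. That is the whole trick behind concept-level search.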
The Thesis Assistant (RAG Pipeline)
For thesis students, the sheer volume of reading is overwhelming. RAG tools allow you to build a "Second Brain."
You upload your bibliography—50 PDFs on urbanism, philosophy, and history. The AI indexes them. You can then "chat" with your entire library.
- "Summarise what Koolhaas and Tschumi say about programme, citing specific pages."
- "Find contradictions between these two authors regarding density."

Tools You Can Use Today
You do not need to be a coder to use this.
- NotebookLM (Google): Upload up to 50 sources. It generates podcasts, FAQs, and briefing docs instantly. It is arguably the best research tool for students right now.
- Perplexity: A search engine that cites its sources. Use it instead of Google for initial queries.
- Obsidian + Smart Connections plugin: If you take notes in Obsidian, this plugin lets you semantically search your own notes.
The Hallucination Danger
A warning: LLMs are convincing liars. They can hallucinate court cases, scientific studies, and historical events. Never put an AI-supplied citation into your thesis without opening the original source and checking it yourself.