Introducing Graph of Thoughts: A Revolution in the LLM World

  • Author

    Piotr Nyczyk

  • Date

    August 22, 2023

  • Read time

    2 min


Check out the Graph of Thoughts, our newest result of the ongoing collaboration with the Scalable Parallel Computing Lab (SPCL) at ETH Zurich!

Graph of Thoughts (GoT) is a framework that advances prompting capabilities in large language models (LLMs) beyond those offered by paradigms such as Chain-of-Thought or Tree of Thoughts (ToT). The key idea and primary advantage of GoT is the ability to model the information generated by an LLM as an arbitrary graph, where units of information (“LLM thoughts”) are vertices, and edges correspond to dependencies between these vertices. This approach enables combining arbitrary LLM thoughts into synergistic outcomes, distilling the essence of whole networks of thoughts, or enhancing thoughts using feedback loops.
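To make the idea concrete, here is a minimal sketch of the graph abstraction in Python. The names (`Thought`, `aggregate`, `refine`) are illustrative only, not the actual GoT framework API: vertices hold LLM-generated content, edges record which thoughts a new thought depends on, and the two transformations mirror combining thoughts into a synergistic outcome and improving a thought via a feedback loop.

```python
# Illustrative sketch of the graph-of-thoughts abstraction.
# NOTE: Thought, aggregate, and refine are hypothetical names,
# not the real GoT framework's API.

from dataclasses import dataclass, field


@dataclass
class Thought:
    """A vertex in the graph: one unit of LLM-generated information."""
    content: object
    parents: list = field(default_factory=list)  # edges: dependencies


def aggregate(thoughts, combine):
    """Combine several thoughts into one synergistic outcome (new vertex)."""
    merged = combine([t.content for t in thoughts])
    return Thought(merged, parents=list(thoughts))


def refine(thought, feedback):
    """Feedback loop: a new vertex that improves on its single parent."""
    return Thought(feedback(thought.content), parents=[thought])


# Example: sort two sublists independently, then aggregate by merging.
left = Thought(sorted([5, 2, 9]))
right = Thought(sorted([4, 7, 1]))
merged = aggregate([left, right], lambda cs: sorted(cs[0] + cs[1]))
print(merged.content)       # [1, 2, 4, 5, 7, 9]
print(len(merged.parents))  # 2
```

In a real system each transformation would call an LLM to produce the new thought's content; the point of the sketch is that, unlike a chain or a tree, any set of existing vertices can feed into a new one.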

We show that GoT offers advantages over the state of the art on a range of tasks, for example increasing the quality of sorting by 62% over ToT while simultaneously reducing costs by more than 31%. GoT is also extensible with new thought transformations and can thus be used to spearhead new prompting schemes. This work brings LLM reasoning closer to human thinking and to brain mechanisms such as recurrence, both of which form complex networks.

Among other use cases, we illustrate how to use GoT for the effective and efficient creation of documents such as NDAs, while reducing prompting costs compared to all other baselines.

Take a look at the paper here:

P.S. – and here’s the code, with extensive documentation on how to build your own Graphs of Thoughts:
