set_debug

langchain.globals.set_debug(value: bool) → None

Set a new value for the debug global setting.

Parameters:

value (bool) – The new value for the debug global setting.

Return type:

None
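
A minimal usage sketch, assuming only the langchain.globals module documented here (set_debug together with its counterpart get_debug); while the flag is enabled, chains and models typically emit verbose intermediate output:

from langchain.globals import get_debug, set_debug

# Turn on the global debug setting; LangChain components consult this flag
# and print detailed intermediate steps while it is True.
set_debug(True)
assert get_debug() is True

# ... invoke chains, models, or retrievers here to see the debug output ...

# Turn the setting back off once debugging is done.
set_debug(False)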

Examples using set_debug

  • Bittensor
  • How to debug your LLM apps
  • OpaquePrompts
  • TextGen

