get_prompt

langchain_aws.chains.graph_qa.neptune_sparql.get_prompt(examples: str) → BasePromptTemplate

Selects the final prompt template for the Neptune SPARQL QA chain.

Parameters:

    examples (str)

Return type:

    BasePromptTemplate
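
A minimal usage sketch, assuming langchain-aws is installed. Only the call signature above is taken from this page; the examples string and the inspected attribute are illustrative assumptions.

    # Illustrative sketch: the example string below is made up, not from the docs.
    from langchain_aws.chains.graph_qa.neptune_sparql import get_prompt

    # Example question/SPARQL pairs to include when building the prompt.
    examples = """
    Question: How many resources are in the graph?
    SPARQL: SELECT (COUNT(?s) AS ?count) WHERE { ?s ?p ?o }
    """

    prompt = get_prompt(examples)      # returns a BasePromptTemplate
    print(prompt.input_variables)      # inspect which variables the selected template expects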

