Call the API#

results takes in a Metaphor-optimized search query and the number of results to return (up to 500). It returns a list of results, each containing the title, URL, author, and creation date of the page.
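
The call below assumes a MetaphorSearchAPIWrapper instance named search. A minimal setup sketch is shown here for reference; it assumes the API key is supplied via the METAPHOR_API_KEY environment variable, and the placeholder value is hypothetical.

import os

from langchain.utilities import MetaphorSearchAPIWrapper

# Hypothetical placeholder: supply your own Metaphor API key,
# or export METAPHOR_API_KEY before running.
os.environ["METAPHOR_API_KEY"] = "..."

search = MetaphorSearchAPIWrapper()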

search.results("The best blog post about AI safety is definitely this: ", 10)
{'results': [{'url': 'https://www.anthropic.com/index/core-views-on-ai-safety', 'title': 'Core Views on AI Safety: When, Why, What, and How', 'dateCreated': '2023-03-08', 'author': None, 'score': 0.1998831331729889}, {'url': 'https://aisafety.wordpress.com/', 'title': 'Extinction Risk from Artificial Intelligence', 'dateCreated': '2013-10-08', 'author': None, 'score': 0.19801370799541473}, {'url': 'https://www.lesswrong.com/posts/WhNxG4r774bK32GcH/the-simple-picture-on-ai-safety', 'title': 'The simple picture on AI safety - LessWrong', 'dateCreated': '2018-05-27', 'author': 'Alex Flint', 'score': 0.19735534489154816}, {'url': 'https://slatestarcodex.com/2015/05/29/no-time-like-the-present-for-ai-safety-work/', 'title': 'No Time Like The Present For AI Safety Work', 'dateCreated': '2015-05-29', 'author': None, 'score': 0.19408763945102692}, {'url': 'https://www.lesswrong.com/posts/5BJvusxdwNXYQ4L9L/so-you-want-to-save-the-world', 'title': 'So You Want to Save the World - LessWrong', 'dateCreated': '2012-01-01', 'author': 'Lukeprog', 'score': 0.18853715062141418}, {'url': 'https://openai.com/blog/planning-for-agi-and-beyond', 'title': 'Planning for AGI and beyond', 'dateCreated': '2023-02-24', 'author': 'Authors', 'score': 0.18665121495723724}, {'url': 'https://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html', 'title': 'The Artificial Intelligence Revolution: Part 1 - Wait But Why', 'dateCreated': '2015-01-22', 'author': 'Tim Urban', 'score': 0.18604731559753418}, {'url': 'https://forum.effectivealtruism.org/posts/uGDCaPFaPkuxAowmH/anthropic-core-views-on-ai-safety-when-why-what-and-how', 'title': 'Anthropic: Core Views on AI Safety: When, Why, What, and How - EA Forum', 'dateCreated': '2023-03-09', 'author': 'Jonmenaster', 'score': 0.18415069580078125}, {'url': 'https://www.lesswrong.com/posts/xBrpph9knzWdtMWeQ/the-proof-of-doom', 'title': 'The Proof of Doom - LessWrong', 'dateCreated': '2022-03-09', 'author': 'Johnlawrenceaspden', 'score': 0.18159329891204834}, {'url': 'https://intelligence.org/why-ai-safety/', 'title': 'Why AI Safety? - Machine Intelligence Research Institute', 'dateCreated': '2017-03-01', 'author': None, 'score': 0.1814115345478058}]}
[{'title': 'Core Views on AI Safety: When, Why, What, and How',
  'url': 'https://www.anthropic.com/index/core-views-on-ai-safety',
  'author': None,
  'date_created': '2023-03-08'},
 {'title': 'Extinction Risk from Artificial Intelligence',
  'url': 'https://aisafety.wordpress.com/',
  'author': None,
  'date_created': '2013-10-08'},
 {'title': 'The simple picture on AI safety - LessWrong',
  'url': 'https://www.lesswrong.com/posts/WhNxG4r774bK32GcH/the-simple-picture-on-ai-safety',
  'author': 'Alex Flint',
  'date_created': '2018-05-27'},
 {'title': 'No Time Like The Present For AI Safety Work',
  'url': 'https://slatestarcodex.com/2015/05/29/no-time-like-the-present-for-ai-safety-work/',
  'author': None,
  'date_created': '2015-05-29'},
 {'title': 'So You Want to Save the World - LessWrong',
  'url': 'https://www.lesswrong.com/posts/5BJvusxdwNXYQ4L9L/so-you-want-to-save-the-world',
  'author': 'Lukeprog',
  'date_created': '2012-01-01'},
 {'title': 'Planning for AGI and beyond',
  'url': 'https://openai.com/blog/planning-for-agi-and-beyond',
  'author': 'Authors',
  'date_created': '2023-02-24'},
 {'title': 'The Artificial Intelligence Revolution: Part 1 - Wait But Why',
  'url': 'https://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html',
  'author': 'Tim Urban',
  'date_created': '2015-01-22'},
 {'title': 'Anthropic: Core Views on AI Safety: When, Why, What, and How - EA Forum',
  'url': 'https://forum.effectivealtruism.org/posts/uGDCaPFaPkuxAowmH/anthropic-core-views-on-ai-safety-when-why-what-and-how',
  'author': 'Jonmenaster',
  'date_created': '2023-03-09'},
 {'title': 'The Proof of Doom - LessWrong',
  'url': 'https://www.lesswrong.com/posts/xBrpph9knzWdtMWeQ/the-proof-of-doom',
  'author': 'Johnlawrenceaspden',
  'date_created': '2022-03-09'},
 {'title': 'Why AI Safety? - Machine Intelligence Research Institute',
  'url': 'https://intelligence.org/why-ai-safety/',
  'author': None,
  'date_created': '2017-03-01'}]
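
Each result in the list above is a plain dictionary, so individual fields can be pulled out directly. The sketch below binds the return value of the call above to a hypothetical variable named results, then collects the URLs, for example to hand them to a browsing tool as in the next section.

results = search.results(
    "The best blog post about AI safety is definitely this: ", 10
)

# Each result exposes 'title', 'url', 'author', and 'date_created'.
urls = [result["url"] for result in results]
for result in results:
    print(f"{result['title']} ({result['date_created']}): {result['url']}")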

Use Metaphor as a tool#

Metaphor can be used as a tool that retrieves URLs for other tools, such as browsing tools, to act on.

from langchain.agents.agent_toolkits import PlayWrightBrowserToolkit
from langchain.tools.playwright.utils import (
    create_async_playwright_browser,  # A synchronous browser is available, though it isn't compatible with Jupyter.
)

async_browser = create_async_playwright_browser()
toolkit = PlayWrightBrowserToolkit.from_browser(async_browser=async_browser)
tools = toolkit.get_tools()

tools_by_name = {tool.name: tool for tool in tools}
print(tools_by_name.keys())
navigate_tool = tools_by_name["navigate_browser"]
extract_text = tools_by_name["extract_text"]
from langchain.agents import initialize_agent, AgentType
from langchain.chat_models import ChatOpenAI
from langchain.tools import MetaphorSearchResults

llm = ChatOpenAI(model_name="gpt-4", temperature=0.7)

metaphor_tool = MetaphorSearchResults(api_wrapper=search)

agent_chain = initialize_agent(
    [metaphor_tool, extract_text, navigate_tool],
    llm,
    agent=AgentType.STRUCTURED_CHAT_ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)

agent_chain.run("find me an interesting tweet about AI safety using Metaphor, then tell me the first sentence in the post. Do not finish until able to retrieve the first sentence.")
> Entering new AgentExecutor chain...
Thought: I need to find a tweet about AI safety using Metaphor Search.
Action:
```
{
  "action": "Metaphor Search Results JSON",
  "action_input": {
    "query": "interesting tweet AI safety",
    "num_results": 1
  }
}
```
{'results': [{'url': 'https://safe.ai/', 'title': 'Center for AI Safety', 'dateCreated': '2022-01-01', 'author': None, 'score': 0.18083244562149048}]}

Observation: [{'title': 'Center for AI Safety', 'url': 'https://safe.ai/', 'author': None, 'date_created': '2022-01-01'}]
Thought:I need to navigate to the URL provided in the search results to find the tweet.

> Finished chain.
'I need to navigate to the URL provided in the search results to find the tweet.'