Documentation Index

Fetch the complete documentation index at: https://docs.lovi.ai/llms.txt

Use this file to discover all available pages before exploring further.

Websites Management

If Documents are the private files in your office, Websites are the public pages your agent is allowed to read from the internet. 🌐

This section lets you manage individual website URLs associated with this AI agent. Unlike Web Crawlers (which scan entire sites or sitemaps automatically), here you add specific pages or sources that the agent should reference when answering questions.

🎯 What is this for?

  • Control the source list: Add only the websites that matter for this agent.
  • Keep content focused: Use specific URLs instead of broad domains when you only need certain pages.
  • Easy maintenance: Quickly add or remove website sources as your content changes.
  • Complement crawlers: Use this for one-off pages that don’t belong to a full crawl (e.g. a competitor’s spec sheet, an external blog post).

🛠️ How it works

At the top of the page, you will see an input field where you can paste a website URL.
  1. Enter the URL
    Type or paste the website address you want to associate with the agent. Make sure it’s a complete URL (e.g. https://example.com/pricing).
  2. Click “+ Add”
    This saves the URL and adds it to the list below. The system will read the page content so the agent can use it.
  3. Manage the list
    All added websites appear in the table, where you can review or remove them later.
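Before adding a URL in step 1, it helps to confirm it is a complete address (scheme plus host). The sketch below shows a minimal check of that kind in Python; it is purely illustrative and is not Lovi's actual validation logic.

```python
from urllib.parse import urlparse

def is_complete_url(url: str) -> bool:
    """Check that a string looks like a complete website URL
    (scheme + host), as required before clicking "+ Add".
    Illustrative sketch only, not the platform's real validator."""
    parsed = urlparse(url.strip())
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)

# is_complete_url("https://example.com/pricing") -> True
# is_complete_url("example.com/pricing")        -> False (missing scheme)
```

A bare domain without `https://` fails the check, which matches the advice above to always paste the full address.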

📄 Website List

Each row in the list represents one website URL currently associated with the agent.
  • URL: Shows the exact website address that was added.
  • Delete / Bin 🗑️: Removes the URL from the agent’s website list.

🧠 How does the agent use websites?

Imagine you add these URLs:
  1. https://yoursite.com/returns-policy
  2. https://yoursite.com/shipping-info
If a user asks: “Can I return a product and how long does shipping take?”, the agent will:
  1. Read the returns policy from the first URL.
  2. Check shipping details from the second URL.
  3. Combine both and give a complete answer. 🎯
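The source-selection step above can be sketched as a toy ranking over stored page texts. This example uses naive word overlap between the question and each page; the real agent's retrieval is presumably more sophisticated (e.g. semantic search), so treat the function and the sample texts as hypothetical.

```python
def pick_relevant_pages(question: str, pages: dict) -> list:
    """Rank stored website pages by naive word overlap with the question.
    A toy sketch of how an agent might pick which sources to read;
    not the actual retrieval algorithm."""
    q_words = set(question.lower().split())
    scored = []
    for url, text in pages.items():
        overlap = len(q_words & set(text.lower().split()))
        if overlap:
            scored.append((overlap, url))
    # Highest overlap first
    return [url for _, url in sorted(scored, reverse=True)]

# Hypothetical page contents for the two URLs from the example:
pages = {
    "https://yoursite.com/returns-policy": "You can return a product within 30 days.",
    "https://yoursite.com/shipping-info": "Standard shipping takes 3 to 5 business days.",
}
question = "Can I return a product and how long does shipping take?"
# Both pages match, so the agent reads both and combines them into one answer.
```

Because the question touches both returns and shipping, both URLs score above zero and both feed into the final answer, mirroring the three-step flow described above.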

⚠️ Important Notes

  • Make sure the URL is correct and publicly accessible before adding it.
  • Use public URLs that do not require login or authentication.
  • If a website changes or is no longer relevant, remove it so the agent does not rely on outdated sources.
  • Websites behind firewalls or anti-bot protections may not be readable.
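A quick pre-flight check can surface most of these problems before you add a URL. The sketch below maps common HTTP status codes to the caveats listed above (login walls, anti-bot blocks, dead links); the mapping is illustrative, and a real check would also issue the request itself.

```python
def diagnose_status(status_code: int) -> str:
    """Translate an HTTP status code into a human-readable reason a page
    may be unreadable by the agent. Illustrative mapping only."""
    if status_code in (401, 407):
        return "requires login/authentication"
    if status_code == 403:
        return "forbidden (possibly a firewall or anti-bot protection)"
    if status_code == 404:
        return "page not found - check the URL"
    if status_code == 429:
        return "rate limited - the site is throttling automated readers"
    if 200 <= status_code < 300:
        return "publicly accessible"
    return "unexpected status {}".format(status_code)
```

Anything other than "publicly accessible" is a sign the agent may not be able to read the page, so fix or remove the URL.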

🎓 Best Practices Summary (Cheat Sheet)

  • Trusted sources only: Add only URLs you trust and that contain accurate information.
  • Direct pages: Prefer specific page URLs over homepage links — https://site.com/faq is better than https://site.com.
  • Keep it clean: Remove outdated or irrelevant URLs regularly.
  • Don’t duplicate: If a page is already covered by a Web Crawler, you don’t need to add it here too.
  • Check accessibility: Make sure the URL loads in a regular browser before adding it.
With this section configured, your agent can stay connected to the websites that matter most. 🕵️‍♂️📚