LLMs.txt Generation

Create machine-readable brand files that help AI crawlers discover and understand your content.


An llms.txt file is a machine-readable document that helps AI crawlers discover, understand, and cite your website's content. Think of it as a sitemap.xml designed specifically for large language models — it tells AI engines what your site contains and where to find it.

Vizzybl generates these files automatically by crawling your website and producing AI-optimized descriptions for each page.
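To make the format concrete, here is an illustrative llms.txt following the common convention (an H1 site name, a blockquote summary, then H2 sections of annotated links). The site, URLs, and descriptions below are placeholders, not output from any real crawl:

```markdown
# Example Store

> An online retailer of handmade ceramics, with product pages, care guides, and a blog.

## Products

- [Mugs](https://example.com/products/mugs): Hand-thrown stoneware mugs in six glazes
- [Vases](https://example.com/products/vases): One-of-a-kind decorative vases

## Guides

- [Ceramic care](https://example.com/guides/care): How to clean and store handmade ceramics
```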

llms.txt vs. llms-full.txt

Vizzybl generates two complementary file types:

| File | Purpose | Best for |
| --- | --- | --- |
| llms.txt | A compact routing file listing your site's major sections with short descriptions | Directing AI engines to your most important pages quickly |
| llms-full.txt | A comprehensive file including all discovered pages with detailed descriptions | Maximum content discovery across your entire site |

Most sites benefit from both. The llms.txt file acts as a directory, while llms-full.txt provides full coverage.
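As a rough illustration of the difference (hypothetical entries, not generated output), the same page might appear in each file like this:

```text
In llms.txt — one short line per major page:
- [Pricing](https://example.com/pricing): Plans and pricing overview

In llms-full.txt — a fuller description of the same page:
- [Pricing](https://example.com/pricing): Compares the Free, Pro, and Enterprise
  plans, including per-seat pricing, usage limits, and annual-billing discounts
```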

Generating your files

  1. Open the Create section and click the LLMs.txt button
  2. Enter your website URL (e.g., example.com)
  3. Choose your generation mode:
| Mode | Pages crawled | Speed | Best for |
| --- | --- | --- | --- |
| Quick | Up to 20 pages | Immediate results | Small sites or a fast first draft |
| Full | Up to 200 pages | Runs as a background job | Comprehensive coverage of larger sites |

  4. Click Generate

In Quick mode, results appear immediately in the editor. In Full mode, a background job starts and you can track its progress in real time; you can close the tab and return later without losing progress.

Reviewing and refining

After generation, Vizzybl displays a split-screen workspace:

  • Left panel — An AI chat interface for refining your files
  • Right panel — An editor showing the generated content with tabs for llms.txt and llms-full.txt

Use the chat panel to refine your files with natural language. For example:

  • "Rewrite the descriptions to emphasize our enterprise features"
  • "Group the URLs by product category"
  • "Add schema type mentions for each page"
  • "Remove all blog post URLs and focus on product pages"

Each refinement updates the editor in real time.

Exporting your files

Once you're satisfied with the output:

  • Copy to clipboard — Click the copy button to grab the content for a single file
  • Download as ZIP — Export both llms.txt and llms-full.txt together

Upload the files to your website's root directory (e.g., example.com/llms.txt). AI crawlers look for these files at the root path, similar to robots.txt.
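If you want to sanity-check where crawlers will look, here is a small sketch using only Python's standard library. The helper name and the example domains are placeholders; the point is that any page path is discarded and the file is resolved at the site root:

```python
from urllib.parse import urlsplit, urlunsplit

def llms_txt_url(site: str) -> str:
    """Return the root-path llms.txt URL for a site, as crawlers expect it."""
    if "://" not in site:          # allow bare domains like "example.com"
        site = "https://" + site
    parts = urlsplit(site)
    # Like robots.txt, llms.txt is fetched from the root path, so any
    # page path the user pasted is discarded.
    return urlunsplit((parts.scheme, parts.netloc, "/llms.txt", "", ""))

print(llms_txt_url("example.com"))                     # https://example.com/llms.txt
print(llms_txt_url("https://example.com/docs/start"))  # https://example.com/llms.txt
```

A quick `curl -I` (or a browser visit) to the printed URL after uploading confirms the file is actually being served.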

Tip: Regenerate your llms.txt files periodically — especially after adding new pages, products, or content sections — so AI crawlers always have an up-to-date view of your site.

Next steps