LLMS Central - The Robots.txt for AI

A better way to crawl websites with PHP

Freek.dev · 2 min read

Original Article Summary

Our spatie/crawler package is one of the first packages I created. It allows you to crawl a website with PHP. It is used extensively in Oh Dear and in our laravel-sitemap package. Over the years, the API had accumulated some rough edges. With v9, we cleaned …

Read full article at Freek.dev

Our Analysis

Spatie's v9 release of its crawler package, with a cleaned-up API, is a notable improvement for crawling websites with PHP. Site owners who use PHP can benefit from a more efficient and refined crawling process, which can support better site indexing and, indirectly, search engine optimization. The updated API should also improve tools built on the package, such as Oh Dear and laravel-sitemap, ultimately benefiting the site owners who rely on them.

To take advantage of the update, start by reviewing your current crawling setup and consider upgrading to the v9 crawler package. Then monitor your site's crawl traffic and adjust your llms.txt file so that the pages you want crawled and indexed are covered. Finally, tools like Oh Dear can help you track site performance and identify where the improved crawling capabilities can be leveraged.
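As a rough illustration of what integrating the package looks like, here is a minimal sketch of a crawl using spatie/crawler's documented observer pattern: you subclass `CrawlObserver` and pass an instance to the crawler. The class names follow the package's README, but exact method signatures have shifted between major versions, so treat this as an assumption-laden sketch rather than the definitive v9 API.

```php
<?php

require 'vendor/autoload.php';

use GuzzleHttp\Exception\RequestException;
use Psr\Http\Message\ResponseInterface;
use Psr\Http\Message\UriInterface;
use Spatie\Crawler\Crawler;
use Spatie\Crawler\CrawlObservers\CrawlObserver;

// A simple observer that logs each crawled URL and any failures.
class LoggingObserver extends CrawlObserver
{
    public function crawled(
        UriInterface $url,
        ResponseInterface $response,
        ?UriInterface $foundOnUrl = null
    ): void {
        echo 'Crawled: ' . $url . PHP_EOL;
    }

    public function crawlFailed(
        UriInterface $url,
        RequestException $requestException,
        ?UriInterface $foundOnUrl = null
    ): void {
        echo 'Failed: ' . $url . PHP_EOL;
    }
}

// Crawl a site, capping the number of pages so a test run stays small.
Crawler::create()
    ->setCrawlObserver(new LoggingObserver())
    ->setTotalCrawlLimit(50)
    ->startCrawling('https://example.com');
```

A setup like this is also a practical way to verify, before and after upgrading, which of your pages a crawler can actually reach.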

Related Topics

Web Crawling

Track AI Bots on Your Website

See which AI crawlers like ChatGPT, Claude, and Gemini are visiting your site. Get real-time analytics and actionable insights.

Start Tracking Free →