Dead-Link Watchdog
by practicaltools
Automatically find and fix broken links on your site. Schedule scans, get clear reports, and improve your SEO for just $0.0008 per link.
About Dead-Link Watchdog
Ever clicked a link on your own site only to get a 404? It happens to the best of us, and those dead links silently hurt your SEO and frustrate visitors. That's why I built Dead-Link Watchdog.

Think of it as your automated site auditor. You tell it which HTTP status codes to look for—like 404s, redirect chains, or even just sluggish pages—and it crawls your site on a schedule you set. It then hands you a clean, structured report showing exactly what's broken and where. No more manual checking.

The best part? It costs just $0.0008 per link, which is basically pennies for the peace of mind. I use it to keep my client sites clean and fast, ensuring users never hit a dead end and search engines see a well-maintained site. Just set your parameters, schedule your scans, and let it run in the background. It's one less thing to worry about.
What does this actor do?
Dead-Link Watchdog is a link-checking and monitoring tool that runs on the Apify platform. It crawls your site in the cloud, flags broken, redirected, or slow links, and saves structured results you can export or plug into other workflows.
Key Features
- Cloud-based execution - no local setup required
- Scalable infrastructure for large-scale operations
- API access for integration with your applications
- Scheduled runs and webhooks for automation
How to Use
- Click "Try This Actor" to open it on Apify
- Create a free Apify account if you don't have one
- Configure the input parameters as needed
- Run the actor and download your results
Documentation
Dead-Link Watchdog
Automatically scan websites for broken, redirected, or slow links. It's a cloud-based actor that costs $0.0008 per link checked, built on Crawlee and designed for scheduled, automated monitoring.
Overview
Dead-Link Watchdog crawls your site, tests every link it finds, and flags problematic ones based on HTTP status codes. It saves structured results to an Apify Dataset for easy export or integration. Use it for one-off audits or set up recurring scans with Apify's built-in scheduler.
Key Features
- Comprehensive Link Checking: Flags links based on HTTP status codes. By default, it flags all 400+ status errors. You can specify a custom list (e.g., `[404, 500, 301, 302]`).
- Cost-Effective: Pricing is $0.0008 per link checked (e.g., 1,000 links for $0.80).
- Fully Schedulable: Configure automatic daily, weekly, or monthly runs directly via Apify.
- Structured Output: All results are saved to your Apify Dataset and can be exported as CSV or JSON.
- No Infrastructure Needed: Runs on Apify's platform—no proxies, servers, or local setup required.
- Configurable Crawl: Control crawl depth with `maxCrawlDepth` and set a hard limit on pages crawled with `maxRequestsPerCrawl`.
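Crawl depth works the way it usually does for crawlers: depth 0 is the start URL itself, depth 1 is every link found on it, and so on, with `maxRequestsPerCrawl` capping the total page count. Here is a toy sketch of that depth-limited traversal over an in-memory "site" (illustration only, not the actor's actual Crawlee-based implementation):

```python
from collections import deque

# Toy in-memory "site": page -> links it contains (stand-in for real HTTP fetching).
SITE = {
    "/": ["/about", "/blog"],
    "/about": ["/team"],
    "/blog": ["/blog/post-1"],
    "/team": [],
    "/blog/post-1": ["/missing"],   # a dead link two levels down
}

def crawl(start, max_depth, max_requests):
    """Breadth-first crawl limited by depth and total page count,
    mirroring the roles of maxCrawlDepth and maxRequestsPerCrawl."""
    seen, order = {start}, []
    queue = deque([(start, 0)])
    while queue and len(order) < max_requests:
        url, depth = queue.popleft()
        order.append(url)
        if depth >= max_depth:
            continue  # don't follow links any deeper than max_depth
        for link in SITE.get(url, []):
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return order

print(crawl("/", max_depth=2, max_requests=300))
```

With `max_depth=2` the crawler stops before reaching `/missing` (which sits at depth 3); raising the depth to 3 brings it into the crawl.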
How to Use
Input Parameters
Configure the actor using these main inputs:
| Field | Type | Default | Description |
|---|---|---|---|
| `startUrls` | array | `[{ "url": "http://www.example.com" }]` | One or more starting URLs or sitemap URLs. |
| `maxCrawlDepth` | integer | `2` | How many link levels deep to crawl from each start URL. |
| `flagStatusCodes` | array | `[]` | Specific HTTP status codes to flag (e.g., `[404, 500]`). If empty, flags all 400+ codes. |
| `includeExternal` | boolean | `false` | Set to `true` to also check links pointing to external domains. |
| `maxRequestsPerCrawl` | integer | `300` | Maximum number of pages to crawl (`0` = unlimited). |
For a one-off scan: Run the actor manually with your target domain in `startUrls`.
For automated monitoring:
1. Go to the actor's Run page on Apify.
2. Click Schedule → New schedule.
3. Set your frequency (daily, weekly, etc.) and input parameters.
4. Save. The scan will now run automatically at your chosen interval.
Input/Output
Input: Provide a starting URL and configure crawl depth, status codes to flag, and request limits as shown above.
Output: The actor saves a record for each flagged link to your Apify Dataset. Each record includes:
```json
{
  "source": "https://example.com/page-with-broken-link",
  "target": "https://example.com/missing-page",
  "status": 404,
  "latency": 120,
  "timestamp": "2025-10-25T13:10:00Z"
}
```
- `source`: The page where the bad link was found.
- `target`: The problematic destination URL.
- `status`: The HTTP status code returned (e.g., 404, 500, 301).
- `latency`: Response time in milliseconds.
- `timestamp`: When the check was performed.
You can view this data in the Apify console or export it as CSV/JSON for further analysis or reporting.
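Once exported as JSON, the dataset is easy to post-process. A minimal sketch that groups flagged links by the page they appear on, so each page only needs to be fixed once (the sample records are made-up examples in the output format shown above; in practice you would load them from your exported file):

```python
from collections import defaultdict

# Made-up sample records in the actor's output format. In practice:
#   import json; records = json.load(open("dataset.json"))
records = [
    {"source": "https://example.com/blog", "target": "https://example.com/old-post",
     "status": 404, "latency": 95, "timestamp": "2025-10-25T13:10:00Z"},
    {"source": "https://example.com/docs", "target": "https://example.com/moved",
     "status": 301, "latency": 120, "timestamp": "2025-10-25T13:10:02Z"},
    {"source": "https://example.com/blog", "target": "https://example.com/gone",
     "status": 404, "latency": 88, "timestamp": "2025-10-25T13:10:05Z"},
]

# Group hard errors (4xx/5xx) by source page; 3xx redirects are a separate concern.
by_source = defaultdict(list)
for rec in records:
    if rec["status"] >= 400:
        by_source[rec["source"]].append(rec["target"])

for page, targets in sorted(by_source.items()):
    print(f"{page}: {len(targets)} broken link(s)")
```

The same loop is a natural starting point for a weekly email report or a CI check that fails when `by_source` is non-empty.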
Ready to Get Started?
Try Dead-Link Watchdog now on Apify. Free tier available with no credit card required.
Actor Information
- Developer
- practicaltools
- Pricing
- Paid
- Total Runs
- 70
- Active Users
- 8