Dataset Validity Checker

by equidem


About Dataset Validity Checker

Automatically checks whether the default datasets created by runs of an actor differ too much from previously encountered ones, warning you about web scraping problems caused by, e.g., a change in website layout or other significant changes in the resulting data.

What does this actor do?

Dataset Validity Checker is a monitoring and automation tool available on the Apify platform. It helps you catch web scraping problems early by comparing each new dataset an actor produces against the history of datasets from its previous runs.

Key Features

  • Cloud-based execution - no local setup required
  • Scalable infrastructure for large-scale operations
  • API access for integration with your applications
  • Built-in proxy rotation and anti-blocking measures
  • Scheduled runs and webhooks for automation

How to Use

  1. Click "Try This Actor" to open it on Apify
  2. Create a free Apify account if you don't have one
  3. Configure the input parameters as needed
  4. Run the actor and download your results

Documentation

Dataset Validity Checker (DVC)

How it works

DVC goes through the default datasets of the actor specified in its input and calculates indicators that allow it to recognize when something is amiss with datasets from later runs. When it notices such a case, it sends an email to the address specified in the input and writes a note to its console output. All datasets that pass the check are then added to the history DVC uses to determine the validity of later datasets.

You can use DVC for an entire actor or only for a specific task, depending on whether you provide a task ID or an actor ID in its input. Moreover, DVC can be used simultaneously (and independently) for any number of actors/tasks; you only need to run it with different task/actor IDs.

Please note that DVC doesn't check individual items in a dataset, but the dataset as a whole, so it will not catch a mistake affecting only a small portion of the items. Also, it only checks datasets from successful runs.

How to use it

Before DVC can work properly, it needs historical information about the runs of your actor/task. The first run of DVC (and perhaps several more, if you don't yet have enough runs of the checked actor/task) will therefore go through the existing runs and gather the information it needs. Because of this, the first run might take a relatively long time; for ways to shorten it, see the tips below. Please make sure that all the runs the first DVC run processes are valid; otherwise, the accuracy of the check will be lower. For ways to exclude invalid runs you know about, see the tips below.

After the first run, I recommend running DVC after each run of the checked actor/task completes (best achieved using a webhook). If you specify a warning email in DVC's input, it will send you an email for each dataset it considers invalid. If you don't specify one, you can check the console logs from the runs, which contain the same information.

Useful tips

1. Restricting the scope

This tip is useful (especially for the first run of DVC) if at least one of the following applies to you:

a) You already have a large number of runs (e.g. hundreds) in your actor/task, and the first run would therefore take too long.
b) You know there is a mistake somewhere in the previous runs of your actor/task and don't want DVC to consider it normal.
c) You know the website you are scraping changed over time, and you don't want DVC to consider the older state normal.

If any of these apply, you can use the 'Starting At' parameter ('startingAt' in the JSON input) to control the earliest run DVC will process. If you need even more control, you can use the 'Until' parameter ('until' in the JSON input) to define the latest run DVC will process.

2. Clearing the history

The previous tip was mainly concerned with the first run, but what if the website changes significantly without causing an error in your scraping? This could happen, for example, when an e-shop decides to widen the selection of items it offers. To prevent false positives (valid datasets flagged as invalid by DVC) in this case, you can use the 'Clear History' parameter ('clearHistory' in the JSON input) for a single run to delete all previously gathered information about what the dataset should look like, allowing DVC to start anew for that particular actor/task.

3. Adjusting strictness

As with any similar algorithm, there is a tradeoff between false positives and false negatives (invalid datasets flagged as valid by DVC). If the default setting doesn't work well enough for you, you can use the 'Average Multiplying Coefficient' and 'Maximal Multiplying Coefficient' parameters ('averageMultiplyingCoefficient' and 'maximalMultiplyingCoefficient' in the JSON input) to adjust the tradeoff, or change them both at once using the 'Leniency Coefficient' parameter ('leniencyCoefficient' in the JSON input).
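Putting the documented parameters together, a scoped first run might use a JSON input like the sketch below. The parameter names ('startingAt', 'until', 'leniencyCoefficient') come from the documentation above, but the value formats shown (ISO date strings, a plain number) are assumptions; check the actor's input schema on Apify for the exact types it accepts:

```json
{
  "startingAt": "2024-01-01",
  "until": "2024-06-30",
  "leniencyCoefficient": 1.2
}
```

Later, if the target site legitimately changes and you want DVC to rebuild its baseline (tip 2), a one-off run could set the history-clearing flag (again, a boolean value is an assumption based on the parameter's description):

```json
{
  "clearHistory": true
}
```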

Common Use Cases

  • Market Research - gather competitive intelligence and market data
  • Lead Generation - extract contact information for sales outreach
  • Price Monitoring - track competitor pricing and product changes
  • Content Aggregation - collect and organize content from multiple sources

Ready to Get Started?

Try Dataset Validity Checker now on Apify. Free tier available with no credit card required.


Actor Information

  • Developer: equidem
  • Pricing: Paid
  • Total Runs: 6,570
  • Active Users: 28
Apify Platform

Apify provides a cloud platform for web scraping, data extraction, and automation. Build and run web scrapers in the cloud.

