Public Tender Scraper
by stephaniehhnbrg
Public Tender Scraper is an Apify Actor designed to collect public tender information from various government procurement portals.
About Public Tender Scraper
What does this actor do?
Public Tender Scraper is a web scraping and automation tool available on the Apify platform. It's designed to help you extract data and automate tasks efficiently in the cloud.
Key Features
- Cloud-based execution - no local setup required
- Scalable infrastructure for large-scale operations
- API access for integration with your applications
- Built-in proxy rotation and anti-blocking measures
- Scheduled runs and webhooks for automation
How to Use
1. Click "Try This Actor" to open it on Apify
2. Create a free Apify account if you don't have one
3. Configure the input parameters as needed
4. Run the actor and download your results
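The input parameters this actor accepts are described in the documentation below (`keyword`, `maxResults`, `groqApiKey`). A minimal example input, assuming those parameter names and the sample values from the curl example further down, might look like:

```json
{
  "keyword": "Ultraschall",
  "maxResults": "10",
  "groqApiKey": "gsk_..."
}
```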
Documentation
# Public Tender Scraper

Public Tender Scraper is an Apify Actor designed to collect public tender information from multiple government procurement portals. While public tenders ensure transparency and fair competition, counteracting favoritism and corruption, they are not publicly advertised in one place. Each government provides its own platform, if any, on which organizations publish tenders and businesses and suppliers can apply and bid.

Supported platforms:

| Country | National procurement portal |
|:-----------|:----------------------------|
| 🇩🇪 Germany | e-Vergabe |
| 🇮🇪 Ireland | gov.ie eTenders |
| 🇪🇺 tba | Link |

Public Tender Scraper can help businesses ...

- to automate steps in their procurement process
- to find suitable tenders faster and more efficiently across various platforms
- to go international and win contracts abroad

... by providing the following functionalities:
- 🔎 Unified tender search across multiple national platforms
- 🏗️ Automation-ready data for integration into procurement workflows
- 💬 Automatic translation of queries and results to/from the user's language
- 💾 Flexible output formats: JSON and CSV
- 🧩 Simple API integration for seamless embedding in systems
### How to integrate?

1. Generate the API keys
   - Apify API Key
   - Groq API Key (optional, to enable the translation feature)
2. Export the environment variables

   ```bash
   export APIFY_API_KEY=apify_api_...
   export GROQ_API_KEY=gsk_...
   ```

3. Invoke the Actor via curl

   ```bash
   curl -X POST "https://api.apify.com/v2/acts/stephaniehhnbrg~public-tender-scraper-germany/runs?token=$APIFY_API_KEY" \
     -d '{"keyword": "Ultraschall", "maxResults": "10", "groqApiKey": "'"$GROQ_API_KEY"'"}' \
     -H 'Content-Type: application/json'
   ```

4. Retrieve the RUN-ID from the JSON response (`data > id`)
5. Check the status of the run (`data > status`)

   ```bash
   curl "https://api.apify.com/v2/acts/stephaniehhnbrg~public-tender-scraper-germany/runs/<RUN-ID>?token=$APIFY_API_KEY"
   ```

6. Retrieve the DATASET-ID from the JSON response (`data > defaultDatasetId`)
7. Fetch the dataset items as soon as the run reaches the status `SUCCEEDED`.

   ```bash
   curl "https://api.apify.com/v2/datasets/<DATASET-ID>/items?view=overview"
   ```

Alternatively, open the Apify Console link from the status response (`data > consoleUrl`) - https://console.apify.com/view/runs/

### Dev Notes

#### Folder Structure

The project was initialized using the Apify template `ts-crawlee-playwright-chrome`, which provides a standard structure:

```
.actor/               # Actor metadata and I/O schemas
src/                  # Source code
storage/
├── datasets/         # Actor outputs (JSON + CSV)
├── key_value_stores/ # Input variables and run statistics
└── request_queues/   # Crawling data
```

#### Run Actor locally

1. Install Apify - Guide
2. Configure the input parameters (`keyword`, `maxResults`, `groqApiKey`) by editing `INPUT.json`
3. Run the actor locally

   ```bash
   apify run
   ```

4. Review the output
   - as JSON objects: `./storage/datasets/default`
   - as CSV file: `./storage/datasets/result.csv`

#### Publish Actor

After running the following commands:

```bash
apify login
apify push
```

check out the Apify console and publish the actor via the UI.
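The run/poll/fetch cycle from the integration steps above can also be scripted. The sketch below uses only the Python standard library and assumes nothing beyond the endpoints and JSON fields shown in the curl commands (`data > id`, `data > status`, `data > defaultDatasetId`); the `scrape` helper and its parameter names are hypothetical conveniences, not part of the actor's API.

```python
# Minimal sketch of the curl-based integration steps, using only the
# standard library. Endpoints and response fields are taken from the
# curl examples in this README; error handling is deliberately minimal.
import json
import time
import urllib.request

API_BASE = "https://api.apify.com/v2"
ACTOR_ID = "stephaniehhnbrg~public-tender-scraper-germany"

def run_url(token: str) -> str:
    """Endpoint that starts a new actor run (step 3)."""
    return f"{API_BASE}/acts/{ACTOR_ID}/runs?token={token}"

def status_url(run_id: str, token: str) -> str:
    """Endpoint that reports the status of a run (step 5)."""
    return f"{API_BASE}/acts/{ACTOR_ID}/runs/{run_id}?token={token}"

def items_url(dataset_id: str) -> str:
    """Endpoint that returns the dataset items (step 7)."""
    return f"{API_BASE}/datasets/{dataset_id}/items?view=overview"

def scrape(token, keyword, max_results="10", groq_api_key=None, poll_seconds=5):
    """Hypothetical helper: start a run, poll until it finishes, fetch items."""
    payload = {"keyword": keyword, "maxResults": max_results}
    if groq_api_key:
        payload["groqApiKey"] = groq_api_key
    req = urllib.request.Request(
        run_url(token),
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        run = json.load(resp)["data"]          # step 4: data > id
    while run["status"] not in ("SUCCEEDED", "FAILED", "ABORTED"):
        time.sleep(poll_seconds)
        with urllib.request.urlopen(status_url(run["id"], token)) as resp:
            run = json.load(resp)["data"]      # step 5: data > status
    with urllib.request.urlopen(items_url(run["defaultDatasetId"])) as resp:
        return json.load(resp)                 # step 7: dataset items
```

For production use, the official `apify-client` package is likely the more robust choice; this sketch just mirrors the curl steps one-to-one.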
Actor Information
- Developer
- stephaniehhnbrg
- Pricing
- Paid
- Total Runs
- 4
- Active Users
- 3