Unlimited Possibilities in A-Parser

We have gathered all the benefits on one page; detailed information on each feature can be found in the documentation.

Available on Windows, Linux, and macOS (via Docker)

Task Editor

Multiple Parsers in One Task

Use up to 20 parsers in a single task, distributing threads evenly to reduce proxy bans and increase parsing speed.

Parser Presets

Numerous settings for each parser can be saved into separate presets and reused across various tasks.

Query Builder

Split each input query into parts to change the query format on the fly and write additional related data to the results.

Query Formatting

A separate query format for each parser within one task, with control over the order of formatting execution.

Query Deduplication

Duplicate queries are removed automatically, so even messy input data never causes redundant work.

Substitution Macros

Automatic query expansion, substitution of subqueries from files, and iteration through alphanumeric combinations and lists.

Results Formatting

The powerful Template Toolkit lets you apply additional logic to results and output data in various formats, including JSON, SQL, and CSV.

Result Deduplication

Advanced deduplication capabilities guarantee the uniqueness of the strings, links, and domains you receive.

Result Filtering

Save only the data that meets your criteria: substring matches, numerical comparisons, or regular expressions.
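The three filter kinds above are configured in A-Parser's interface; as a conceptual sketch only (field names and data are hypothetical), they correspond to predicates like these applied to scraped rows:

```javascript
// Illustration of the three filter kinds: substring match,
// numerical comparison, and regular expression.
const rows = [
  { domain: "example.com", title: "Buy cheap widgets", rank: 3 },
  { domain: "test.org", title: "Widget reviews", rank: 42 },
  { domain: "example.net", title: "Unrelated page", rank: 7 },
];

const bySubstring = (row) => row.title.toLowerCase().includes("widget"); // substring match
const byNumber = (row) => row.rank <= 10;                                // numerical comparison
const byRegex = (row) => /\.(com|net)$/.test(row.domain);                // regular expression

// Keep only rows that pass all three filters, chained as a pipeline.
const kept = rows.filter(bySubstring).filter(byNumber).filter(byRegex);
```

Chaining filters this way mirrors how multiple filter conditions in one task narrow the saved results step by step.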

Simultaneous Saving to Different Files

Use different formats for different files and apply additional conditions and filters, all within a single task to conserve parsing resources.

Task Logging

A detailed log for each thread and each query allows for quick and convenient task debugging.

Task Chaining

Extend A-Parser's logic by automatically running different tasks in sequence, passing the results of one task as queries for the next.

Saving Deduplication Databases

Building databases from multiple tasks? Saving deduplication databases ensures you always get only new results.

Thread Count Control

By running each task with a specified number of threads, you can ensure A-Parser won't exceed your proxy plan or server resources.

Task Debugger

Use the debugger to quickly verify a task's operation during its creation, with fast execution and a clear log display.

Task Queue and Scheduler

Adding Multiple Tasks

The task queue frees you from waiting for one task to finish. Add an unlimited number of independent tasks.

Simultaneous Task Execution

Control the number of concurrently running tasks, significantly reducing the total time to get results.

Task Management

Start, pause, edit, or delete tasks. Resume tasks from where they left off; A-Parser will continue collecting information.

Task Priorities

With a large task queue, priorities let you control which tasks start first.

Dynamic Thread Limit

Set a global thread limit for all tasks, and A-Parser will automatically distribute threads among active tasks.

Task History

Access a complete history of completed tasks, view statistics, and re-add tasks for another run.

Task Scheduler

Run recurring tasks using the task scheduler with flexible settings for repetition intervals.

Proxy Checkers and Proxy Management

HTTP(S) and SOCKS4/5 Proxy Support

A-Parser works with all proxy protocols, and the proxy checker can test all types simultaneously.

Unlimited Proxy Checkers

Add separate proxy checkers for different proxy sources, each with its own verification settings.

Multi-threaded Proxy Checking and Downloading

Manage the number of checking and downloading threads separately for each proxy checker.

Support for Proxies with Authorization

Specify proxy access credentials in the proxy checker settings or in proxy lists with separate authorization data.

Various Check Types

A-Parser checks proxies for POST method support, anonymity, response time, and other parameters.

Option to Disable Proxy Checking

If you're confident your proxies are working, you can disable checking to save resources.

Proxy Checker Selection per Task

For each task, you can select specific proxy sources, allowing for flexible resource allocation.

Proxy Checker Selection per Parser

For even more flexibility, use different proxies within a single task, such as separate proxies for Google and Yandex scrapers.

Proxy Banning

If a service bans a proxy, A-Parser will stop using it for a specified time, reducing failed requests.

Thread Limit per Proxy

You can limit the maximum number of threads per proxy to avoid overusing its resources.

Proxy Reuse Between Attempts

By default, A-Parser uses a unique proxy for each data download attempt, but this behavior can be changed.

Reserving Proxies

This feature allows you to exclude certain proxies from general use and assign them only to specific tasks.

Flexible Settings

All Settings Organized in Presets

Save groups of settings into different presets and reuse them in various tasks.

Detailed Settings for Each Parser

For instance, the Google scraper allows specifying page count, results per page, language settings, geolocation, and much more.

Import and Export

Export your settings and parsers to share with others, or import ready-made tasks from our catalog.

Multithreading and Performance

Asynchronous Architecture

A-Parser is built on a fully asynchronous architecture, capable of running up to 10,000 concurrent asynchronous threads.

Numerous Optimizations

A-Parser employs many optimizations for better performance, and we constantly profile and improve our code.

Millions and Billions of Queries and Results

There are no limits on the number of queries, the size of query files, or the number of results.

Low Resource Consumption

Most tasks run smoothly on any standard office or home computer, as well as any entry-level VDS.

Load Distribution Across Cores

Currently, A-Parser can efficiently use up to 4 processor cores. A license with unlimited core support is coming soon.

Captcha Recognition

Integration with XEvil and CapMonster

The most popular CAPTCHA recognition software supports many types of CAPTCHAs, including reCAPTCHA v2.

Integration with Online Recognition Services

We support the vast majority of online recognition services, including Anti-Captcha, RuCaptcha, CapMonster.cloud, 2captcha, and others.

In-Scraper Recognition Support

Captcha recognition support is built into all popular scrapers. You can also use it from your own custom JavaScript scrapers.

Developing Presets Based on Regular Expressions

Data Collection from Arbitrary Sites

Apply regular expressions to data obtained from the Net::HTTP scraper or the HTML::LinkExtractor spider.

Working with Variables and Arrays

Collect single data points into variables or repeating blocks (lists, tables) into arrays. Output the data in a convenient format using the templating engine.
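The variable/array distinction can be sketched in plain JavaScript (the HTML fragment and field names are invented for illustration): a single capture fills a variable, while a global match over a repeating block fills an array.

```javascript
// Hypothetical HTML fragment, as might be downloaded via Net::HTTP.
const html = `
  <h1>Product list</h1>
  <ul>
    <li class="item">Widget A - $10</li>
    <li class="item">Widget B - $25</li>
  </ul>`;

// Single data point into a "variable": the page title.
const title = html.match(/<h1>(.*?)<\/h1>/)[1];

// Repeating block into an "array": every list item, split into
// name and price via capture groups (matchAll with the /g flag).
const items = [...html.matchAll(/<li class="item">(.*?) - \$(\d+)<\/li>/g)]
  .map(([, name, price]) => ({ name, price: Number(price) }));
```

Output formatting would then be the templating engine's job, e.g. iterating `items` into CSV rows.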

Expanding Standard Scraper Capabilities

You can apply additional processing to the source data of all built-in scrapers (e.g., Google search results).

Navigating Pagination

Use regular expressions to find links to the next page of pagination, and A-Parser will automatically navigate through all pages.

Data Validation

Use regular expressions to validate content, check for proxy bans, or detect captchas. A-Parser will automatically retry with another proxy upon failure.

Additional Result Processing

With the result constructor, you can perform search-and-replace operations using regular expressions on any scraping results.

Developing Scrapers in JavaScript

Simple and Concise JavaScript Code

Write linear, synchronous-looking code with async/await; A-Parser executes it in a multi-threaded environment.
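As a sketch of the style (the class shape, method and field names below are illustrative assumptions, not A-Parser's exact scraper API): each query is handled by linear async/await code, while the engine runs many such calls concurrently.

```javascript
// Illustrative only: names here are assumptions for demonstration.
class TitleScraper {
  constructor(request) {
    this.request = request; // injected downloader (engine-managed in practice)
  }
  async parse(query) {
    const body = await this.request(query);         // one await per network hop
    const m = body.match(/<title>(.*?)<\/title>/i); // linear extraction logic
    return { query, title: m ? m[1] : null, success: Boolean(m) };
  }
}

// A stub downloader stands in for a proxy-aware HTTP client.
const stubRequest = async (url) => `<title>Hello from ${url}</title>`;
```

Usage: `new TitleScraper(stubRequest).parse("https://example.com")` returns a promise with the extracted fields; in the real product, proxies and retries around `request` are handled for you.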

Handling Proxies and Retries

A-Parser lets you focus on writing data extraction and transformation code, handling all proxy management and retries automatically.

TypeScript Support

Write in modern JavaScript (ES2020+) or use TypeScript for strong typing and syntax highlighting.

Using NodeJS Modules

The vast npm module catalog allows you to extend A-Parser's data extraction and processing capabilities limitlessly.

Chrome Control via Puppeteer with Proxy Support

A-Parser adds proxy support to the popular Puppeteer library, allowing automatic use of different proxies for different tabs.

Calling Built-in and Other JavaScript Scrapers

You can send requests to any built-in or other JavaScript scrapers, enabling the creation of arbitrarily complex logic.

Automation and API

Full Control via HTTP/JSON API

Send HTTP requests from your own programs and scripts, or use our ready-made libraries for NodeJS, Python, PHP, and Perl.
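A request to the JSON API is an ordinary HTTP POST with a JSON body. The sketch below builds such a body; the endpoint, port, and field names follow the general pattern of A-Parser's API but should be verified against the documentation, and the password and preset names are placeholders.

```javascript
// Sketch: building an addTask request body for the HTTP/JSON API.
// Endpoint and field names are assumptions to check against the docs.
const endpoint = "http://127.0.0.1:9091/API"; // assumed default local endpoint
const request = {
  password: "your-api-password", // placeholder
  action: "addTask",
  data: {
    preset: "default",                         // saved task preset to run
    queries: ["example.com", "example.org"],   // queries for this task
  },
};
const payload = JSON.stringify(request);
// Send with any HTTP client, e.g.:
// await fetch(endpoint, { method: "POST", body: payload });
```

The ready-made NodeJS, Python, PHP, and Perl libraries wrap exactly this kind of request so you don't build the JSON by hand.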

Task Creation

Add tasks by preset name or by providing a full JSON structure with detailed settings.

Queue Management

Gain full control over the task queue, track task statuses, and download results.

Single and Bulk Requests in Blocking Mode

Send an HTTP request and receive the results immediately upon completion of the data collection.

Redis API

Our solution for high-load projects. Connect an unlimited number of A-Parser instances to process API requests in a Redis queue with minimal latency.

Updating A-Parser via API

For complete automation, a remote update of A-Parser is available through an API call.

Continuous Improvements and Support

140+ Versions of A-Parser Released Since 2011

The constant evolution of A-Parser provides our users with new capabilities year after year.

Regular Updates to Built-in Scrapers

We test all built-in scrapers daily and automatically, allowing us to release updates promptly in response to any layout or result changes.

Technical Support

Free technical support is available to all our users, who rate it the best among similar products.

Educational Materials

We regularly release educational materials, sample presets and scrapers, as well as tutorial videos on our YouTube channel.

We Listen to Your Feedback on Our Forum

Most new features and scrapers are developed based on requests from our users.

Paid Services

We can save you time by offering custom scraper development on our platform, as well as integration with your business logic and databases.

Choose the Right License

Lifetime license; updates are purchased separately

A-Parser Lite

Basic Google and Yandex scrapers

$179
  • Includes Google & Yandex scrapers
  • 3 months of updates
  • Bonus proxies: 20 threads for 2 weeks
  • Support
Popular

A-Parser Pro

Access All Scrapers

$299
  • Full suite of 110+ scrapers
  • Create your own JavaScript scrapers
  • 6 months of updates
  • Bonus proxies: 50 threads for a month
  • Includes all features from the Lite plan

A-Parser Enterprise

Access All Scrapers and API

$479
  • API control
  • Multi-core task processing
  • Redis integration
  • Includes all features from the Pro plan

Updates: $49 for 3 months, $149 for a year, or $399 lifetime

Paid Solutions

Custom Scraper Development

We believe any data can be scraped.

We provide custom solutions for obtaining any data from any website.

We will create a scraper that exports results in the format you need, based on your requirements.