All the Possibilities in One A-Parser

We have compiled all the benefits for you on one page; detailed information on each feature can be found in the documentation


Task Editor

Multiple Scrapers in One Task

Use up to 20 scrapers in a single task, distributing threads evenly to reduce proxy bans and increase scraping speed

Scraper Presets

The settings of each scraper can be saved as separate presets and reused across tasks

Query Builder

Separating input data lets you change the query type and write additional related data into the results

Query Formatting

Set a separate query format for each scraper within a task and control the order in which formatting is applied

Uniqueness of Queries

If you are unsure about your input data, A-Parser deduplicates queries so that no work is done twice

Substitution Macros

Automatic multiplication of queries, substitution of subqueries from files, enumeration of alphanumeric combinations and lists
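The idea of query multiplication can be sketched in plain JavaScript (the function below is illustrative and does not reflect A-Parser's actual macro syntax):

```javascript
// Sketch: multiply a base query list by substitution lists, similar in
// spirit to substitution macros (names and syntax are illustrative only).
function multiplyQueries(bases, substitutions) {
  // substitutions: an array of string arrays; every combination is appended
  let result = bases;
  for (const list of substitutions) {
    result = result.flatMap(q => list.map(s => `${q} ${s}`));
  }
  return result;
}

const queries = multiplyQueries(
  ['buy phone'],
  [['cheap', 'used'], ['2023', '2024']]
);
// 1 base query x 2 x 2 substitutions = 4 queries
console.log(queries.length); // 4
```

Each substitution list multiplies the number of queries, which is why a handful of short lists can expand into thousands of concrete queries.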

Results Formatting

The powerful Template Toolkit allows you to apply additional logic to results and output data in various formats, including JSON, SQL and CSV
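A-Parser itself uses Template Toolkit templates for this; as a language-neutral illustration, the same idea of applying output logic to results can be sketched in JavaScript:

```javascript
// Sketch: formatting scraped rows as CSV (with proper quoting) and JSON.
// This only illustrates the concept of output templates; it is not
// A-Parser's Template Toolkit syntax.
function toCsv(rows) {
  const escape = v => /[",\n]/.test(v) ? `"${v.replace(/"/g, '""')}"` : v;
  return rows.map(r => r.map(String).map(escape).join(',')).join('\n');
}

const rows = [['example.com', 'Title, with comma'], ['site.org', 'Plain']];
console.log(toCsv(rows));
// example.com,"Title, with comma"
// site.org,Plain
console.log(JSON.stringify(rows)); // the same rows as JSON
```

The key point is that one set of results can be rendered into any output format without re-running the scrape.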

Uniqueness of Results

Advanced deduplication capabilities guarantee the uniqueness of the resulting strings, links and domains
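The kinds of uniqueness checks described above can be sketched as a single keyed filter (a minimal illustration, not A-Parser's implementation):

```javascript
// Sketch: deduplicating results by full link and by domain.
function uniqueBy(items, keyFn) {
  const seen = new Set();
  return items.filter(item => {
    const key = keyFn(item);
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}

const links = [
  'https://example.com/a',
  'https://example.com/b',
  'https://example.com/a',
];
console.log(uniqueBy(links, l => l).length);                   // 2 unique links
console.log(uniqueBy(links, l => new URL(l).hostname).length); // 1 unique domain
```

Changing only the key function switches between string-, link-, and domain-level uniqueness.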

Results Filtering

Save only data that fit your conditions: substring occurrence, numeric comparison, RegEx
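The three filter kinds named above map directly onto simple predicates; a minimal sketch (the data and field names are made up for illustration):

```javascript
// Sketch: filtering result rows by substring, numeric comparison, and RegEx.
const results = [
  { title: 'Best shop online', price: 120 },
  { title: 'Cheap parts', price: 15 },
  { title: 'Shop reviews', price: 48 },
];

const bySubstring = results.filter(r => r.title.toLowerCase().includes('shop'));
const byNumber    = results.filter(r => r.price > 40);
const byRegex     = results.filter(r => /^best/i.test(r.title));

console.log(bySubstring.length, byNumber.length, byRegex.length); // 2 2 1
```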

Simultaneous Saving to Different Files

Use a different format for different files, apply additional conditions and filters, all-in-one task to save scraping resources

Job Logging

A detailed log, kept separately for each thread and each query, lets you debug tasks quickly and easily

Task sequences

Extend A-Parser's logic by automatically running tasks in sequence as they complete, passing the results of one task as the queries of the next

Saving Deduplication Databases

Collecting databases across several different tasks? Saved deduplication databases ensure you always get only new results

Control the Number of Threads

By running each task in a specified number of threads, you can be sure that A-Parser will not exceed your proxy limits or exhaust your server resources

Task Debugger

Use the debugger to quickly check how a task behaves while you are building it, with quick launch and a visual log display

Task Queue and Scheduler

Adding Multiple Tasks

The task queue frees you from waiting for a task to complete. Add an unlimited number of independent tasks

Running Tasks Simultaneously

Control the number of simultaneous tasks, reducing the total time to get results many times over

Task Management

Start, pause, edit, or delete tasks. Resume a task from where it stopped and A-Parser will continue collecting information

Task Priorities

With a large task queue, it is important to be able to control which tasks start before the others

Dynamic Thread Limit

If necessary, set a shared thread limit for all tasks; A-Parser will automatically distribute threads among active jobs

Tasks History

A full history of completed tasks: view task statistics and re-add tasks to the queue

Task Scheduler

Run recurring tasks with the task scheduler and its flexible repetition-interval settings

Proxy Checkers and Proxy Handling

Support for HTTP(S) and SOCKS4/5 Proxies

A-Parser works with all proxy protocols, and the proxy checker can test all proxy types at once

Unlimited Number of Proxies

Add separate proxy checkers for different proxy sources, each with its own check settings

Multi-threaded Proxy Check and Download

Manage the number of check and download threads, separately for each proxy checker

Support for Proxies with Authorization

Specify proxy credentials in the proxy checker settings, or use proxy lists with per-proxy authorization data

Different Types of Tests

A-Parser checks proxies for POST support, anonymity, response time and other parameters

Ability to Disable Proxy Checking

If you are sure all your proxies work, you can disable checking to save resources

Selection of Proxies for each Task

Choose proxy sources for each task and split resources flexibly

Selection of Proxies for each Scraper

Use different proxies within a single task for even more flexibility, for example separate proxies for the Google and Yandex scrapers

Banning Proxies

If a proxy is banned by the service, A-Parser will not use it for a specified period of time, reducing the probability of unsuccessful requests

Threads Limit per Proxy

You can limit the maximum number of threads per proxy in order not to overuse proxy resources

Reusing the Proxy Between Attempts

By default, A-Parser uses a unique proxy for each download attempt; this behavior can be changed

Excluding a Proxy Checker

Exclude a proxy checker from general use so that its proxies are used only in specific tasks

Flexible Settings

All Settings are Organized in Presets

Save groups of settings as presets and reuse them across tasks

Detailed Settings for each Scraper

For example, for Google Scraper you can specify the number of pages, the number of results per page, language settings, geolocation and many other options

Import and Export

Export settings and scrapers, share with other users, import ready-made tasks from our catalog

Multithreading and Performance

Asynchronous Architecture

A-Parser is built on a fully asynchronous architecture and is capable of running up to 10,000 simultaneous asynchronous threads

Lots of Optimizations

A-Parser uses many optimizations for better performance, and we are constantly profiling and improving our code

Millions and Billions of Queries and Results

No limits on the number of queries, query file size or number of results

Low Resource Consumption

For most tasks, any office or home computer is suitable, as well as any entry-level VDS

Load Distribution by Cores

Currently A-Parser can efficiently use up to 4 processor cores; a license with an unlimited number of cores is coming soon

Captcha Recognition

Integration with XEvil and CapMonster

The most popular CAPTCHA recognition software; many captcha types are supported, including ReCaptcha2

Integration with Online Recognition Services

We support integrations with the vast majority of services, including Anti-Captcha, RuCaptcha, CapMonster.cloud, 2captcha and others

Support for Recognition in Scrapers

Support for captcha recognition is added to all popular scrapers, and you can also use recognition from your own JavaScript scrapers

Developing Presets Based on Regular Expressions

Collecting Data from Any Site

Apply RegEx to data retrieved by the Net::HTTP scraper or the HTML::LinkExtractor crawler scraper

Working with Variables and Arrays

Collect single values from a page into variables, or repeating blocks (lists, tables) into arrays, and output the data in a convenient format with the templating engine

Expand the Capabilities of Standard Scrapers

You can apply additional processing to the raw data for all built-in scrapers (for example, Google search results)

Navigating Through Pagination Pages

Use RegEx to find the link to the next pagination page; the scraper will automatically navigate through all pages
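Extracting a "next page" link with a regular expression can be sketched as follows (the HTML structure and `class="next"` marker are made up for illustration):

```javascript
// Sketch: finding the "next page" URL with a RegEx, the way a
// RegEx-based preset can drive pagination (the markup is illustrative).
function nextPageUrl(html) {
  const m = html.match(/<a[^>]*class="next"[^>]*href="([^"]+)"/i);
  return m ? m[1] : null; // null means "last page reached"
}

const page = '<a class="next" href="/search?q=test&page=2">Next</a>';
console.log(nextPageUrl(page));            // "/search?q=test&page=2"
console.log(nextPageUrl('<p>last page</p>')); // null
```

Returning `null` when no match is found gives the pagination loop a natural stopping condition.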

Data Correctness Check

Use RegEx to verify content and detect proxy bans or captcha pages; on failure, A-Parser automatically retries with another proxy

Additional Results Processing

With the Result Builder you can search and replace with RegEx in any scraping results

Developing JavaScript Scrapers

Simple and Concise JavaScript Code

Write linear, synchronous-looking code with async/await; A-Parser executes it in multithreaded mode

Working with Proxies and Retries

A-Parser lets you concentrate on writing data extraction and conversion code; it takes care of all the proxy handling
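The retry-with-a-fresh-proxy behavior that A-Parser provides behind the scenes can be sketched like this (all names here are illustrative, not A-Parser's API):

```javascript
// Sketch: a retry loop that switches proxies between attempts, mimicking
// what A-Parser does for you automatically (names are illustrative).
async function withRetries(fetchFn, proxies, maxAttempts = 3) {
  let lastError;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const proxy = proxies[attempt % proxies.length]; // fresh proxy each attempt
    try {
      return await fetchFn(proxy);
    } catch (err) {
      lastError = err; // remember the failure, try the next proxy
    }
  }
  throw lastError;
}

// Usage with a mock fetch that fails on the first proxy:
const mockFetch = async proxy => {
  if (proxy === 'proxy-a') throw new Error('banned');
  return `ok via ${proxy}`;
};

withRetries(mockFetch, ['proxy-a', 'proxy-b'])
  .then(result => console.log(result)); // "ok via proxy-b"
```

Because the scraper code only sees the successful result, the extraction logic stays free of proxy and error handling.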

TypeScript Support

Write in JavaScript with ES2020+ support or use TypeScript for strict typing and syntax highlighting

Use NodeJS Modules

Any of the modules in the NPM catalog can be used to extend A-Parser's data extraction and processing capabilities

Chrome Control via Puppeteer with Proxy Support

A-Parser adds proxy support to the popular puppeteer library, allowing you to automatically use proxies for different tabs

Referring to Built-In and other JavaScript Scrapers

You can send queries to any built-in scraper, as well as to other JavaScript scrapers, building logic of any complexity

Automation and API

Full Control Over A-Parser via HTTP/JSON API

Send HTTP requests from your software and scripts or use ready-made libraries for NodeJS, Python, PHP and Perl
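The general shape of such an API call can be sketched as below; the field names, action name, and endpoint in the comments are assumptions for illustration, so check the API documentation for the exact schema:

```javascript
// Sketch: building an HTTP/JSON API request body for A-Parser.
// The { password, action, data } layout and the "addTask" action are
// assumptions for illustration; consult the API docs for the real schema.
function buildApiRequest(password, action, data) {
  return JSON.stringify({ password, action, data });
}

const body = buildApiRequest('your-password', 'addTask', {
  preset: 'default',
  queries: ['example query'],
});
// Send with any HTTP client, e.g. (URL is a placeholder):
// fetch('http://127.0.0.1:9091/API', { method: 'POST', body })
console.log(JSON.parse(body).action); // "addTask"
```

Keeping the request construction in one helper makes it easy to drive the same API from scripts, cron jobs, or your own backend.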

Task Creation

Adding tasks by preset name or complete structure with detailed settings

Queue Management

Full control over tasks in the queue, tracking the status of tasks, downloading results

Single and Bulk Requests in Blocking Mode

Send an HTTP request and receive the results as soon as the data collection is finished

Redis API

Our solution for high-load projects. Connect an unlimited number of A-Parser instances to process API requests from a Redis queue with minimal delay

Updating A-Parser via API

For full automation, the ability to remotely update A-Parser by API is available

Regular Improvements and Support

We have released more than 140 versions of A-Parser since 2011

Continuous development of A-Parser gives our users more and more new features every year

Regular Updates of Built-In Scrapers

We test all built-in scrapers daily and automatically, which lets us release updates quickly whenever a site's layout or search output changes

Technical Support

Free technical support is available to all our users and, in their own words, is the best among similar products

Learning Materials

We regularly release tutorials, sample presets and scrapers, and tutorial videos on our YouTube channel

We Listen to Your Opinions on Our Forum

Most of the new features and scrapers appear based on requests from our users

Paid Services

We know how to save you time: we offer scraper development on our platform, as well as integration with your business logic and databases

Choose the Appropriate License

Simple, Transparent Pricing

A-Parser Lite

Basic Google and Yandex Scrapers
Buy Now
Only Google and Yandex Scrapers
3 months of free updates
Bonus proxies: 20 threads for 2 weeks

A-Parser Pro

Access to All Scrapers
Buy Now
A set of 90+ built-in web scrapers
Creating JavaScript scrapers
6 months of free updates
Bonus proxies: 50 threads for a month
All features included in the Lite pack

A-Parser Enterprise

Access to All Scrapers and API
Buy Now
API control
Multi-core task processing
Redis integration
All features included in the Pro pack
Updates: $49 for 3 months, $149 for a year, or $399 for life
Complete Comparison of Versions and Functionality

Custom Web Scraping Solutions

We rely not only on our tools, but also on our people and their expertise
Developing your own web scrapers or web automation robots can take a lot of time and effort. With A-Parser, you can delegate this job to experts who will deliver a turn-key solution just for you
We understand each company has a unique business objective, that’s why our data experts will put the focus clearly on fulfilling your needs and wants
Contact Us

Transform the web into usable data 100x faster with A-Parser

Automate your information gathering and data processing
Sign Up and Try Demo