General information

A-Parser - scraper for professionals


A-Parser is a multi-threaded scraper for search engines, website evaluation services, keywords, content (text, links, arbitrary data), and various other services (YouTube, images, translator, etc.). A-Parser includes over 90 built-in scrapers.


The key features of A-Parser are support for Windows/Linux platforms, a web interface with remote access, the ability to create your own scrapers without writing code, and the ability to build scrapers with complex logic in JavaScript/TypeScript with support for NodeJS modules.

Performance, proxy management, CloudFlare protection bypass, a fast HTTP engine, support for controlling Chrome through puppeteer, managing the scraper through an API, and much more make A-Parser a unique solution. In this documentation, we will try to reveal all the advantages of A-Parser and ways to use it.

Areas of application

A-Parser can solve many tasks. For convenience, we have divided them into categories by areas of application. Follow the links below for details.

SEO specialists and studios

Business and freelancers



Online stores and marketplaces


Features and benefits

In this section, we briefly list the main advantages of A-Parser. More detailed information can be found at the link below.

Overview of all features

⏩ A-Parser webinar: overview and Q&A

Multithreading and performance

  • A-Parser is based on the latest versions of NodeJS and the V8 JavaScript engine
  • AsyncHTTPX - a proprietary HTTP engine supporting HTTP/1.1 and HTTP/2, HTTPS/TLS, and HTTP/SOCKS4/SOCKS5 proxies with optional authorization
  • The scraper can perform 5,000-10,000 concurrent HTTP requests, depending on the computer configuration and the task at hand
  • Each task (set of requests) is parsed in the specified number of threads
  • When multiple scrapers are used in one task, requests to different scrapers are executed simultaneously in separate threads
  • The scraper can run multiple tasks in parallel
  • Checking and loading proxies from sources also takes place in multithreaded mode
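The threading model described above - a fixed number of workers pulling queries from a shared queue - can be illustrated in plain NodeJS terms. The sketch below is a generic concurrency-limited pool, not A-Parser's internal implementation; `fakeRequest` is a stand-in for a real HTTP request.

```javascript
// Minimal concurrency-limited task pool: a fixed number of "threads"
// each pull the next query until the queue is exhausted.
async function runPool(queries, worker, concurrency) {
  const results = new Array(queries.length);
  let next = 0;
  async function thread() {
    while (next < queries.length) {
      const i = next++; // claim the next query (single-threaded JS, so this is safe)
      results[i] = await worker(queries[i]);
    }
  }
  // Start `concurrency` workers and wait for all of them to drain the queue.
  await Promise.all(Array.from({ length: concurrency }, thread));
  return results;
}

// Stand-in worker: a real pool would perform an HTTP request here.
const fakeRequest = async (q) => `result:${q}`;

runPool(['a', 'b', 'c', 'd'], fakeRequest, 2).then((r) => console.log(r));
```

The same pattern scales to thousands of in-flight requests because each worker spends most of its time awaiting I/O rather than blocking a real OS thread.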

Creating your own scrapers

Creating scrapers in JavaScript
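The linked guide describes the real base-class API; the sketch below only illustrates the general shape of a custom scraper - a class whose parse method turns a query and a page body into structured results. The class name, method signature, and result fields here are illustrative assumptions, not A-Parser's actual interface.

```javascript
// Illustrative sketch of a custom scraper: names and fields are
// assumptions, not A-Parser's real base-class API.
class TitleScraper {
  // parse() receives a query and a page body, returns extracted data.
  parse(query, html) {
    const m = /<title>([^<]*)<\/title>/i.exec(html);
    return { query, title: m ? m[1].trim() : null, success: Boolean(m) };
  }
}

const scraper = new TitleScraper();
const page = '<html><head><title>Example Domain</title></head></html>';
console.log(scraper.parse('example.com', page));
// → { query: 'example.com', title: 'Example Domain', success: true }
```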

Powerful tools for query and result formation

  • Query builder and result builder - allow modifying data (search and replace, extracting domain from link, transformations using regular expressions, XPath...)
  • Substitutions for queries: from a file; iteration over words, characters, and numbers, optionally with a specified step
  • Result filtering - by substring, equality, greater/less than
  • Result deduplication - by string, by domain, by main domain (A-Parser knows all top-level domains)
  • Powerful result templating based on Template Toolkit - allows outputting results in any convenient format (text, CSV, HTML, XML, custom formats)
  • The scraper uses a system of presets - multiple preconfigured settings can be created for each scraper to suit different situations
  • Everything can be configured - no constraints or limitations
  • Export and import of settings allow easy exchange of experience with other users
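Builder operations like "extract domain from link" and "deduplicate by domain" boil down to ordinary string transformations. A rough NodeJS equivalent of those two steps (the helper names are mine, not A-Parser's):

```javascript
// Rough equivalents of two result-builder operations: extract the
// domain from a link, then deduplicate a result list by that domain.
function extractDomain(link) {
  return new URL(link).hostname; // WHATWG URL is built into NodeJS
}

function dedupeByDomain(links) {
  const seen = new Set();
  return links.filter((link) => {
    const d = extractDomain(link);
    if (seen.has(d)) return false; // domain already emitted, drop it
    seen.add(d);
    return true;
  });
}

const links = [
  'https://example.com/page1',
  'https://example.com/page2',
  'https://other.org/index',
];
console.log(dedupeByDomain(links));
// → [ 'https://example.com/page1', 'https://other.org/index' ]
```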


API

  • Ability to integrate and manage the scraper from your own programs and scripts
  • Full automation of business processes
  • Clients for PHP, NodeJs, Perl and Python
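An API call from your own script is an ordinary HTTP request with a JSON body. The sketch below only composes such a payload; the field names ("password", "action", "data") and the "ping" action follow a typical request pattern but are assumptions here - check the API documentation or the official clients for the real format.

```javascript
// Sketch of composing a JSON payload for a scraper API call.
// Field names and the action are assumptions; consult the API
// documentation for the actual request format.
function buildApiRequest(password, action, data) {
  return JSON.stringify({ password, action, data });
}

const body = buildApiRequest('your-password', 'ping', {});
console.log(body);
// → {"password":"your-password","action":"ping","data":{}}
```

In a real integration this string would be sent as the body of an HTTP POST to the scraper's API endpoint, which is exactly what the PHP, NodeJs, Perl, and Python clients wrap for you.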