Commit 5e16882

feat: update version

1 parent 00431fd commit 5e16882

File tree: 4 files changed, +31 −7 lines changed


CHANGELOG.md (+24)

@@ -1,3 +1,27 @@
+# [v9.0.0](https://github.com/coder-hxl/x-crawl/compare/v8.3.1..v9.0.0) (2024-03-16)
+
+### 🚨 Breaking Changes
+
+- The enableRandomFingerprint attribute of the XCrawlConfig configuration now defaults to false
+- Drop support for Node 16
+
+### ⛓️ Dependencies
+
+- puppeteer upgraded from 21.6.1 to 22.5.0
+- https-proxy-agent upgraded from 7.0.1 to 7.0.4
+
+---
+
+### 🚨 Breaking Changes
+
+- The enableRandomFingerprint attribute of the XCrawlConfig configuration now defaults to false
+- Drop support for Node 16
+
+### ⛓️ Dependencies
+
+- puppeteer upgraded from 21.6.1 to 22.5.0
+- https-proxy-agent upgraded from 7.0.1 to 7.0.4
 # [v8.3.1](https://github.com/coder-hxl/x-crawl/compare/v8.3.0..v8.3.1) (2023-12-26)
 
 ### 🚀 Features

package.json (+1 −1)

@@ -1,7 +1,7 @@
 {
   "private": true,
   "name": "x-crawl",
-  "version": "8.3.1",
+  "version": "9.0.0",
   "author": "coderHXL",
   "description": "x-crawl is a flexible Node.js multifunctional crawler library.",
   "license": "MIT",

publish/README.md (+3 −3)

@@ -258,7 +258,7 @@ const myXCrawl = xCrawl({
 })
 ```
 
-The enableRandomFingerprint option defaults to true.
+The enableRandomFingerprint option defaults to false.
 
 - true: Enable random device fingerprinting. The fingerprint configuration of the target can be specified through advanced configuration or detailed target configuration.
 - false: Turns off random device fingerprinting; does not affect the fingerprint configuration specified for the target by advanced configuration or detailed target configuration.
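For projects upgrading to v9 that relied on the old default, the fingerprint behavior can be opted back into explicitly. A minimal sketch of the migrated configuration (the option name follows the XCrawlConfig docs quoted above; verify against your installed version of x-crawl):

```js
// Opt back into random device fingerprinting after the v9 default change.
import xCrawl from 'x-crawl'

const myXCrawl = xCrawl({
  enableRandomFingerprint: true // was the default before v9.0.0
})
```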
@@ -1535,7 +1535,7 @@ export interface XCrawlConfig extends CrawlCommonConfig {
 **Default Value**
 
 - mode: 'async'
-- enableRandomFingerprint: true
+- enableRandomFingerprint: false
 - baseUrl: undefined
 - intervalTime: undefined
 - log: { start: true, process: true, result: true }
@@ -2073,7 +2073,7 @@ The crawlPage API has built-in [puppeteer](https://github.com/puppeteer/puppeteer)
 
 ### Using crawlPage API causes the program to crash
 
-If you need to crawl many pages in one crawlPage, it is recommended that after crawling each page, use [onCrawlItemComplete life cycle function] (#onCrawlItemComplete) to process the results of each target and close the page instance. If no shutdown operation is performed, then The program may crash due to too many pages being opened (related to the performance of the device itself).
+If you need to crawl many pages in one crawlPage call, it is recommended that after each page is crawled you use the [onCrawlItemComplete life cycle function](#onCrawlItemComplete) to process that target's result and close the page instance. If no close operation is performed, the program may crash because too many pages stay open (this depends on the performance of the device itself).
 
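The close-each-page advice above can be sketched with the crawler stubbed out, so the pattern runs without a network or a real browser. Here `crawlPageStub` and the `{ data: { page } }` result shape are illustrative stand-ins for x-crawl's crawlPage and its per-item result; check the real signatures against your installed version:

```js
// Stub of the recommended pattern: close every page in onCrawlItemComplete
// so page instances do not accumulate and crash the process.
let openPages = 0

function makeStubPage(url) {
  openPages += 1
  return { url, close() { openPages -= 1 } }
}

// Stand-in for crawlPage: visits each target, then fires the per-item
// lifecycle callback with a result shaped like { data: { page } }.
function crawlPageStub(targets, onCrawlItemComplete) {
  for (const url of targets) {
    const page = makeStubPage(url)
    onCrawlItemComplete({ data: { page } })
  }
}

const processed = []
crawlPageStub(['https://example.com/a', 'https://example.com/b'], (res) => {
  const { page } = res.data
  processed.push(page.url) // process this target's result first
  page.close()             // then release the page instance
})

console.log(openPages)        // 0 — every page was closed
console.log(processed.length) // 2
```

With a real browser behind each page, skipping `page.close()` leaks one tab per target, which is exactly the failure mode the section warns about.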

publish/package.json (+3 −3)

@@ -1,6 +1,6 @@
 {
   "name": "x-crawl",
-  "version": "8.3.1",
+  "version": "9.0.0",
   "author": "coderHXL",
   "description": "x-crawl is a flexible Node.js multifunctional crawler library.",
   "license": "MIT",
@@ -35,11 +35,11 @@
     }
   },
   "engines": {
-    "node": ">=16.0.0"
+    "node": ">=18.0.0"
   },
   "dependencies": {
     "chalk": "4.1.2",
-    "https-proxy-agent": "^7.0.1",
+    "https-proxy-agent": "^7.0.4",
     "puppeteer": "22.5.0"
   }
 }
