
Commit 3fa47b6: update readme

1 parent 49a84aa

3 files changed: +33 additions, -62 deletions

README.md

Lines changed: 27 additions & 26 deletions

````diff
@@ -2,36 +2,37 @@
 
 `npm install scrapfly-sdk`
 
-Quick use:
+## Quick Intro
+
+Get your API Key on [scrapfly.io/dashboard](https://scrapfly.io/dashboard) and start scraping:
 
 ```javascript
-import { ScrapflyClient, ScrapeConfig } from "scrapfly-sdk";
-
-const client = new ScrapflyClient({key: "YOUR SCRAPFLY KEY"});
-const result = await client.scrape(new ScrapeConfig({
-    url: "https://httpbin.dev/html",
-    // optional:
-    aps: true, // enable anti-scraping protection bypass
-    render_js: true, // enable headless browsers for javascript rendering
-    country: "us", // use a US proxy
-    method: "GET", // use GET, POST or other type of requests
-    data: {}, // what data to send if POST is used
-    ...
-}))
-console.log(result.result.content) // html content
+import { ScrapflyClient, ScrapeConfig } from 'scrapfly-sdk';
+
+const key = 'YOUR SCRAPFLY KEY';
+const client = new ScrapflyClient({ key });
+const apiResponse = await client.scrape(
+    new ScrapeConfig({
+        url: 'https://web-scraping.dev/product/1',
+        // optional parameters:
+        // enable javascript rendering
+        render_js: true,
+        // set proxy country
+        country: 'us',
+        // enable anti-scraping protection bypass
+        asp: true,
+        // set residential proxies
+        proxy_pool: 'public_residential_pool',
+        // etc.
+    }),
+);
+console.log(apiResponse.result.content); // html content
 ```
 
-See [/examples](./examples/) for more.
-
-## Get Your API Key
-
-You can create a free account on [Scrapfly](https://scrapfly.io/register) to get your API Key.
-
-- [Usage](https://scrapfly.io/docs/sdk/python)
-- [Python API](https://scrapfly.github.io/python-scrapfly/scrapfly)
-- [Open API 3 Spec](https://scrapfly.io/docs/openapi#get-/scrape)
-- [Scrapy Integration](https://scrapfly.io/docs/sdk/scrapy)
-
+For more see [/examples](/examples/) directory.
+For more on Scrapfly API see full documentation: <https://scrapfly.io/docs>
+For Python see [Scrapfly Python SDK](https://github.com/scrapfly/python-scrapfly)
+
 ## Development
 
 Install and setup environment:
````
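Beyond the reworded intro, this diff also fixes an option-name typo: the old README passed `aps` where the anti-scraping-protection flag is spelled `asp`, and it renames `result` to `apiResponse`. A minimal offline sketch of the corrected config shape, using a hypothetical stand-in class rather than the real `ScrapeConfig` (which also validates and serializes its options and requires the installed SDK):

```javascript
// ScrapeConfigSketch is a stand-in for illustration only, NOT the real
// scrapfly-sdk ScrapeConfig; it simply stores the options it is given.
class ScrapeConfigSketch {
    constructor(options) {
        Object.assign(this, options);
    }
}

const config = new ScrapeConfigSketch({
    url: 'https://web-scraping.dev/product/1',
    render_js: true,   // enable javascript rendering
    country: 'us',     // set proxy country
    asp: true,         // note: 'asp', fixing the old README's 'aps' typo
    proxy_pool: 'public_residential_pool', // set residential proxies
});

if (config.aps !== undefined) throw new Error('old misspelled flag present');
console.log(config.url, config.asp); // → https://web-scraping.dev/product/1 true
```

The rename matters because `aps` would have been silently ignored by the API, leaving protection bypass disabled.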

examples/README.md

Lines changed: 0 additions & 35 deletions

````diff
@@ -1,38 +1,3 @@
 # Scrapfly Typescript SDK Examples
 
 This directory contains commonly used examples for the Scrapfly Typescript SDK.
-
-## Quick Guide
-
-Install the library using npm:
-
-```shell
-$ npm install scrapfly-sdk
-```
-
-Get your API Key on [scrapfly.io/dashboard](https://scrapfly.io/dashboard) and start scraping:
-
-```javascript
-import { ScrapflyClient, ScrapeConfig } from 'scrapfly-sdk';
-
-const key = 'YOUR SCRAPFLY KEY';
-const client = new ScrapflyClient({ key });
-const apiResponse = await client.scrape(
-    new ScrapeConfig({
-        url: 'https://web-scraping.dev/product/1',
-        // optional parameters:
-        // enable javascript rendering
-        render_js: true,
-        // set proxy country
-        country: 'us',
-        // enable anti-scraping protection bypass
-        asp: true,
-        // set residential proxies
-        proxy_pool: 'public_residential_pool',
-        // etc.
-    }),
-);
-console.log(apiResponse.result.content); // html content
-```
-
-for more see [/examples](/examples/) directory.
````

package.json

Lines changed: 6 additions & 1 deletion

```diff
@@ -5,6 +5,11 @@
     "type": "module",
     "types": "build/src/main.d.ts",
     "main": "build/src/main.js",
+    "repository": {"type": "git", "url": "https://github.com/scrapfly/typescript-scrapfly"},
+    "bugs": "https://github.com/scrapfly/typescript-scrapfly/issues",
+    "homepage": "https://scrapfly.io/",
+    "keywords": ["web scraping", "SDK", "scrapfly", "api"],
+    "files": ["build/src"],
     "engines": {
         "node": ">= 18.12 <19"
     },
@@ -35,7 +40,7 @@
         "test:watch": "jest --watch"
     },
     "author": "Bernardas Alisauskas <[email protected]>",
-    "license": "Apache-2.0",
+    "license": "BSD",
     "dependencies": {
         "axios": "^1.4.0",
         "node-fetch": "^3.3.1",
```
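The five fields added here are standard npm package metadata. As a quick sanity check, the fragment below (copied from the diff) parses as JSON and exposes the expected keys:

```javascript
// The metadata fields this commit adds to package.json, kept as a raw JSON
// fragment so a parse failure would flag a malformed diff line.
const addedFields = JSON.parse(`{
    "repository": {"type": "git", "url": "https://github.com/scrapfly/typescript-scrapfly"},
    "bugs": "https://github.com/scrapfly/typescript-scrapfly/issues",
    "homepage": "https://scrapfly.io/",
    "keywords": ["web scraping", "SDK", "scrapfly", "api"],
    "files": ["build/src"]
}`);

console.log(Object.keys(addedFields).join(', '));
// → repository, bugs, homepage, keywords, files
```

One caveat on the second hunk: the bare value `"BSD"` is not a valid SPDX identifier, so `npm publish` will warn about it; `"BSD-2-Clause"` or `"BSD-3-Clause"` are the recognized forms.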

0 commit comments
