Commit e7ccea8

fix: beautify readme && function renaming

1 parent 3010dc4 commit e7ccea8

File tree

3 files changed: +102 −11 lines

README.md

Lines changed: 94 additions & 3 deletions

```
npm i
npm build
```

## Usage Examples

> ### extraHeaders

#### extraHeaders adds one or more headers that the target website specifically requires, without altering the core headers the service generates automatically. This is useful for passing additional information while preserving the integrity of the existing request headers.

#### The following example echoes your request back from httpbin.co; you should see the ‘Key’ header in the headers section of the response.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  extraHeaders: {
    Key: "Value",
  },
});

console.log(response.data);
```

> ### forwardHeaders

#### The forwardHeaders option is ideal when you want to forward your custom headers directly to the target website without the service generating or modifying any additional headers. The request then appears as if it were made directly from your end, preserving the original header structure.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  forwardHeaders: {
    Key: "Value",
  },
});

console.log(response.data);
```

> ### customHeaders

#### The customHeaders option gives you full control over all headers sent to the target website: the headers you provide completely replace the default ones. This is useful when you need to define specific headers such as User-Agent, Accept, or Cookie, ensuring that only your specified headers are sent with the request.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  customHeaders: {
    Key: "Value",
  },
});

console.log(response.data);
```

> ### super (residential proxy)

#### The super parameter enables a residential proxy for the request. When set to true, the request is routed through a residential IP address, which typically appears to belong to a home or mobile network provider, adding a layer of anonymity and making the request look more like regular web traffic.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  super: true,
});

console.log(response.data);
```

> ### render (JavaScript execution - humanized browser rendering)

#### The render parameter enables JavaScript execution during the request, providing full browser-like rendering. When set to true, the service renders the target webpage as if it were loaded in a real browser: executing all JavaScript, loading dynamic content, and handling client-side interactions. This is particularly useful for scraping websites that rely heavily on JavaScript to display their content, providing a more accurate, “humanized” view of the page.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  render: true,
});

console.log(response.data);
```

> ### final bonus example (render, super, geoCode, playWithBrowser)

#### This example combines multiple parameters to showcase advanced scraping capabilities. Together, render, super, geoCode, and playWithBrowser handle tasks that require JavaScript execution, residential proxies, geographic targeting, and interactive browser actions:

- render: true: Enables JavaScript execution to fully render the webpage, allowing dynamic, client-side content to be scraped.
- super: true: Routes the request through a residential proxy so it appears to come from a typical user, providing enhanced anonymity and helping avoid blocks from anti-scraping measures.
- geoCode: "us": Targets a specific geographic location for the request, in this case the United States. This is useful for content that varies by region, such as localized prices or region-specific data.
- playWithBrowser: Lets you interact with the browser while the page renders, for example waiting for specific elements to load or clicking buttons. Here it waits for the `<body>` element to ensure the page is fully loaded before proceeding.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://ExampleProtectedDomain.com/prices",
  render: true,
  super: true,
  geoCode: "us",
  playWithBrowser: [
    {
      Action: "WaitSelector",
      WaitSelector: "body",
    },
  ],
});

console.log(response.data);
```

## More details

#### [Documentation for more information](https://scrape.do/documentation/?utm_source=github&utm_medium=node-client)

src/lib.ts

Lines changed: 1 addition & 1 deletion

```diff
@@ -186,7 +186,7 @@ export class ScrapeDo {
    *
    * @see https://scrape.do/documentation/
    */
-  async doRequest(method: string, options: DoRequest, body?: any, validateStatus?: (status: number) => boolean) {
+  async sendRequest(method: string, options: DoRequest, body?: any, validateStatus?: (status: number) => boolean) {
     if (!validateStatus) {
       validateStatus = (status) => true;
     }
```
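The hunk above also shows how sendRequest defaults its validateStatus parameter: when no predicate is supplied, every HTTP status is treated as acceptable. A minimal, self-contained sketch of that defaulting pattern (the `checkStatus` helper is illustrative, not part of the library):

```typescript
// Illustrative sketch of the validateStatus defaulting seen above:
// with no predicate given, any status passes; otherwise the caller's
// predicate decides.
function checkStatus(status: number, validateStatus?: (s: number) => boolean): boolean {
  if (!validateStatus) {
    validateStatus = () => true; // default: accept any status
  }
  return validateStatus(status);
}

console.log(checkStatus(404)); // default predicate accepts 404
console.log(checkStatus(404, (s) => s >= 200 && s < 300)); // custom predicate rejects it
```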

tests/lib.test.ts

Lines changed: 7 additions & 7 deletions

```diff
@@ -6,7 +6,7 @@ const TOKEN = process.env.TOKEN || "";
 describe("Usability tests", () => {
   test("Should be able to get successful response with extra headers", async () => {
     const client = new ScrapeDo(TOKEN);
-    const response = await client.doRequest("GET", {
+    const response = await client.sendRequest("GET", {
       url: "https://httpbin.co/anything",
       extraHeaders: {
         A123: "Extra Header",
@@ -18,7 +18,7 @@ describe("Usability tests", () => {
   });
   test("Should be able to get successful response with custom headers", async () => {
     const client = new ScrapeDo(TOKEN);
-    const response = await client.doRequest("GET", {
+    const response = await client.sendRequest("GET", {
       url: "https://httpbin.co/anything",
       customHeaders: {
         A123: "Custom Header",
@@ -30,7 +30,7 @@ describe("Usability tests", () => {
   });
   test("Should be able to get successful response with forward headers", async () => {
     const client = new ScrapeDo(TOKEN);
-    const response = await client.doRequest("GET", {
+    const response = await client.sendRequest("GET", {
       url: "https://httpbin.co/anything",
       forwardHeaders: {
         A123: "Forward Header",
@@ -42,7 +42,7 @@ describe("Usability tests", () => {
   });
   test("Should be able to get successful response with cookies", async () => {
     const client = new ScrapeDo(TOKEN);
-    const response = await client.doRequest("GET", {
+    const response = await client.sendRequest("GET", {
       url: "https://httpbin.co/anything",
       setCookies: {
         A123: "Cookie",
@@ -55,7 +55,7 @@ describe("Usability tests", () => {
   test("Should throw error if setCookies is used with customHeaders", async () => {
     const client = new ScrapeDo(TOKEN);
     await expect(
-      client.doRequest("GET", {
+      client.sendRequest("GET", {
         url: "https://httpbin.co/anything",
         setCookies: {
           A123: "Cookie",
@@ -68,7 +68,7 @@ describe("Usability tests", () => {
   });
   test("Should get successful response with render and playWithBrowser", async () => {
     const client = new ScrapeDo(TOKEN);
-    const response = await client.doRequest("GET", {
+    const response = await client.sendRequest("GET", {
       url: "https://httpbin.co/anything",
       render: true,
       playWithBrowser: [
@@ -83,7 +83,7 @@ describe("Usability tests", () => {
   });
   test("Should get successful response with render and super proxy", async () => {
     const client = new ScrapeDo(TOKEN);
-    const response = await client.doRequest("GET", {
+    const response = await client.sendRequest("GET", {
       url: "https://httpbin.co/anything",
       render: true,
       super: true,
```
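The doRequest → sendRequest rename forces every call site above to change in the same commit. One hypothetical way to stage such a rename (not something this commit does) is to keep the old name as a deprecated alias that delegates to the new one, so callers can migrate gradually. The `ClientSketch` class below is a self-contained stand-in, not the real ScrapeDo client:

```typescript
// Hypothetical migration sketch: the old method name survives as a
// deprecated alias that simply delegates to the renamed method.
class ClientSketch {
  async sendRequest(method: string, url: string): Promise<string> {
    // Stand-in for the real HTTP round trip.
    return `${method} ${url}`;
  }

  /** @deprecated Use sendRequest instead. */
  async doRequest(method: string, url: string): Promise<string> {
    return this.sendRequest(method, url);
  }
}

const client = new ClientSketch();
client.doRequest("GET", "https://httpbin.co/anything").then((echo) => console.log(echo));
// prints "GET https://httpbin.co/anything"
```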
