
Commit 6fbd17d

Remove duplicate section

1 parent 8af973b commit 6fbd17d

File tree

1 file changed: +0 −4 lines changed


frontend/docs/docs/user-guide/workflow-setup.md

Lines changed: 0 additions & 4 deletions
@@ -117,10 +117,6 @@ When enabled, the crawler will check for a sitemap at /sitemap.xml and use it to
 
 This can be useful for discovering and capturing pages on a website that aren't linked to from the seed and which might not otherwise be captured.
 
-### Fail Crawl If Not Logged In
-
-When enabled, the crawler will fail the crawl if it behaviors detect that the browser is not logged in for specific supported social media sites (Facebook, Instagram, TikTok, X, YouTube).
-
 ### Link Selectors
 
 Instructs the crawler which HTML elements should be used to extract URLs, i.e. considered a “link.” By default, the crawler checks the `href` value of all anchor (`<a>`) elements on a page.
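The default behavior described in the "Link Selectors" context above (collect the `href` value of every anchor element on a page) can be sketched as follows. This is an illustrative stand-alone example using only the Python standard library, not the crawler's actual implementation; the class and function names are hypothetical.

```python
# Illustrative sketch of the default link-selector behavior: extract the
# `href` attribute of every anchor (<a>) element on a page. Not the
# crawler's real code; names here are invented for the example.
from html.parser import HTMLParser


class AnchorHrefCollector(HTMLParser):
    """Collects the href value of every <a> element encountered."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Only anchors with a non-empty href count as "links".
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html: str) -> list[str]:
    parser = AnchorHrefCollector()
    parser.feed(html)
    return parser.links


page = (
    '<p><a href="/about">About</a> '
    '<a name="x">no href</a> '
    '<a href="https://example.com">Ext</a></p>'
)
print(extract_links(page))  # ['/about', 'https://example.com']
```

Note how the anchor without an `href` attribute is skipped; a custom link selector would broaden or narrow this rule to other elements and attributes.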
