Add a robots.txt file to public resources with SSO enabled #2786
Root-Core started this conversation in Feature Requests
We would need this to be customized in some manner, as some people are using different routes to be protected by SSO, e.g. allow most routes but
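The reply above asks for the rules to be customizable per route. As a rough sketch only (this is not Pangolin's actual code; `ssoEnabled` and `protectedPaths` are assumed names), the served robots.txt body could be built from the SSO configuration like this:

```javascript
// Sketch: build a robots.txt body from a hypothetical ssoEnabled flag
// and an optional list of SSO-protected path prefixes.
function buildRobotsTxt(ssoEnabled, protectedPaths = []) {
  // Without SSO there is nothing special to hide from crawlers.
  if (!ssoEnabled) {
    return "User-agent: *\nAllow: /\n";
  }
  // With SSO, disallow either the listed paths or the whole site.
  const rules = protectedPaths.length > 0
    ? protectedPaths.map((p) => `Disallow: ${p}`).join("\n")
    : "Disallow: /";
  return `User-agent: *\n${rules}\n`;
}

console.log(buildRobotsTxt(true, ["/auth/", "/admin/"]));
// User-agent: *
// Disallow: /auth/
// Disallow: /admin/
```

Serving the result with `Content-Type: text/plain` at `/robots.txt` would cover both the "block everything" default and the selective case this reply describes.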
Summary
Serving a robots.txt file can prevent the indexing of publicly available but not openly accessible resources. If SSO is enabled, the Pangolin authentication page is served and indexed by crawlers.
This increases the attack surface and is likely unwanted behaviour.
Motivation
As @miloschwartz wrote: "We are also aware that the search engines are crawling the pangolin auth pages."
If a resource is protected with SSO, it's unlikely that it's meant to be indexed by search engines.
Proposed Solution
Serve a robots.txt that instructs crawlers to skip indexing this website / resource.
Alternatives Considered
Add meta tags and/or HTTP headers.
However, this has the disadvantage that the crawler still requests the page, and that each tag or header only applies to a single page.
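For reference, a minimal robots.txt matching the proposal, assuming the whole host should be opted out of indexing, could look like:

```
User-agent: *
Disallow: /
```

If only some routes are SSO-protected (as the reply above notes), the rules could instead target just those paths, e.g. `Disallow: /auth/` (path hypothetical). Note that robots.txt is advisory: well-behaved crawlers honour it, but it is not an access control.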
Additional Context
No response