
A couple of questions

Happy weekend! I've got a couple of questions/issues:

1.) What if I need to add, say, 400 redirects, or more complex dynamic ones? I assumed there would be an option to edit the _redirects file that Cloudflare uses, but there is only a very simplistic GUI for redirects. It neither allows bulk management of redirects nor anything other than "redirect slug A to slug B" (a sketch of the kind of file I mean follows below).

a) How can I set up www-to-non-www redirects, or the other way around?
b) How can I set up trailing-slash-to-non-trailing-slash redirects, or the other way around?

Both a) and b) above could be two simple dropdown settings at the project level, to avoid the mess of duplicated URLs being indexed.
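
For illustration, here is the kind of plain-text _redirects file I have in mind (Cloudflare Pages syntax; all paths below are made-up examples):

```
# One redirect per line: source, destination, status code
/old-pricing    /pricing          301
/blog/2019/*    /archive/:splat   301
```

A file like this could hold hundreds of generated lines, which is exactly what the current GUI makes impractical.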

2.) The docs mention that robots.txt rules are supported, but I can't seem to find any setting or way to edit the robots.txt. Do I need to create a new page and use robots.txt as its path, or how does this work? I assume that would just create an HTML file (or XML), as there is no option for TXT/plain text?

3.) There doesn't seem to be an option to noindex the wstd.io subdomains or remove them from the "Publish" settings, even though they aren't needed when using an external domain. The way things currently work, Google indexes duplicates of your customers' websites under your subdomain (see https://www.google.com/search?q=site:wstd.io). There should be a "global noindex" setting for those subdomains, and it might make sense to enable it by default, at least once an external domain has been added to the project.
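
For context, a "global noindex" would just mean that every response served from a *.wstd.io subdomain carries a robots directive; a minimal sketch of what I mean (hypothetical behavior, not an existing Webstudio setting):

```
# Response header sent on every *.wstd.io request:
X-Robots-Tag: noindex
```

An equivalent <meta name="robots" content="noindex"> tag in the generated head would work just as well.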
3 comments
  1. There are two options for redirects: a static one in project settings and a dynamic redirect in page settings, which can construct URLs. a) and b) can maybe only be done at the Cloudflare level.
  2. We generate robots.txt automatically for now, though we will add support for a custom one in the future.
  3. It was a bug: indexing of the home page was always enabled. I think we fixed it some time ago already. You can check that there are no matches from the last month.
Hey @TrySound, thanks for your quick response!

1.) It's still hard to manage redirects this way. I think it would be much easier to add an editable _redirects file to each project by default, allowing plain-text editing of all redirects in one place.

a) and b): Yes, that's indeed the current workaround, but at least for the very basic cases like www vs. non-www and trailing slash vs. no trailing slash, it would be great to control this from within the project settings. That way I wouldn't have to use CF proxy mode to handle the redirects there, and could use a different third-party DNS provider if I wanted to.
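
To make that workaround concrete, a www-to-non-www rule in Cloudflare's redirect rules (requires proxy mode; example.com is a placeholder) looks roughly like this:

```
# When incoming requests match:
http.host eq "www.example.com"

# ...redirect with 301, preserving the query string, to the dynamic target:
concat("https://example.com", http.request.uri.path)
```

This is exactly the kind of rule that could instead be a project-level dropdown.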
2.) That's unfortunate, because I need to be able to disallow certain subfolders for certain crawlers, etc. An automatically generated robots.txt isn't much better than having none at all πŸ™‚ Just like with point 1 above, I think it would be awesome to add a robots.txt "page" to projects by default and allow plain-text editing of it. That should be easy to implement and would allow maximum flexibility.
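
To make the use case concrete, this is the kind of hand-written robots.txt that an auto-generated one can't express (the paths and the second crawler name are placeholder examples):

```
User-agent: *
Disallow: /drafts/
Disallow: /internal/

User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```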

3.) Oh, that's great. I indeed hadn't checked the dates to see whether indexing of those subdomains had recently stopped.