Single Installation for Multiple Domains - Generate different sitemap and robots
Hello All,
I have two domains pointing to a single Umbraco installation. The content will be different for each domain based on location/language.
For example, one domain is general (worldwide) and the other is country-specific:
www.website.com
www.website.fr
Both domains share the single Umbraco installation so the content can be managed centrally.
From an SEO point of view I need a separate sitemap and robots.txt for each website, so that each can be submitted to Google Search Console.
What is the best way to do this?
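To make the requirement concrete, the behaviour I am after is roughly what this hand-rolled sketch would do (purely illustrative, not code we have in place; the controller and its name are made up, and it would still need to be routed to /robots.txt, e.g. via a custom MVC route plus the umbracoReservedUrls setting):

using System;
using System.Text;
using System.Web.Mvc;

public class RobotsController : Controller
{
    // Builds robots.txt from whichever host the request came in on,
    // so www.website.com and www.website.fr each get their own sitemap line.
    public ContentResult Robots()
    {
        var address = Request.Url.GetLeftPart(UriPartial.Authority);

        var sb = new StringBuilder();
        sb.AppendLine("User-agent: *");
        sb.AppendLine("Allow: /");
        sb.AppendLine("Sitemap: " + address + "/sitemap.xml");

        return Content(sb.ToString(), "text/plain");
    }
}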
Thanks,
Amit Patel
Hi Amit
Have a look at the following two packages:
https://our.umbraco.com/packages/website-utilities/friendly-robots/
and
https://our.umbraco.com/packages/website-utilities/friendly-sitemap/
They both allow 'dynamic' versions of robots.txt and an XML sitemap for each configured domain in a multi-site setup.
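For example, with a domain assigned to each site, the idea is that each host serves its own file, something along these lines:

www.website.com/robots.txt
User-agent: *
Sitemap: https://www.website.com/sitemap.xml

www.website.fr/robots.txt
User-agent: *
Sitemap: https://www.website.fr/sitemap.xml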
regards
Marc
Hi Marc,
Thanks for your reply, but it is still not working.
I have tried the package you mentioned and followed the GitHub documentation here: https://github.com/callumbwhyte/friendly-robots
The issue we are facing with this package is that it builds robots.txt on the first request only, and after that it serves that same file to every request.
For example, for our two domains:
www.website.com/robots.txt
www.website.fr/robots.txt
whichever URL is hit first after a site restart determines which sitemap URL ends up in robots.txt, and both domains are then served the same file. We want the configuration to be built for every request so each domain gets its own sitemap URL.
Our code looks like below:

[ComposeAfter(typeof(RobotsComposer))]
public class CustomRobotsComposer : IUserComposer
{
    public void Compose(Composition composition)
    {
        composition.Register(factory => GetConfiguration(), Lifetime.Request);
    }

    public RobotsConfiguration GetConfiguration()
    {
        var request = HttpContext.Current.Request;
        var address = string.Format("{0}://{1}", request.Url.Scheme, request.Url.Host);
        var configuration = new RobotsConfiguration
        {
            UserAgent = "*",
            Allow = new[] { "/" },
            Sitemaps = new[] { address + "/sitemap.xml" }
        };
        return configuration;
    }
}

We have also tried Lifetime.Scope and Lifetime.Transient, but the result is the same. We are not able to figure out what we are doing wrong here.
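For what it is worth, we could add a trace line like the sketch below (illustrative only, reusing the GetConfiguration method above) to confirm whether the factory is even being called on every request, or whether the package caches the generated robots.txt after the first call:

public RobotsConfiguration GetConfiguration()
{
    var request = HttpContext.Current.Request;
    var address = string.Format("{0}://{1}", request.Url.Scheme, request.Url.Host);

    // Illustrative check only: if this line is written once per restart rather than
    // once per request, the output is being cached after the factory runs, not held
    // onto by the container lifetime.
    System.Diagnostics.Trace.WriteLine("Building RobotsConfiguration for " + address);

    return new RobotsConfiguration
    {
        UserAgent = "*",
        Allow = new[] { "/" },
        Sitemaps = new[] { address + "/sitemap.xml" }
    };
}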
Your help will be much appreciated.
Regards,
Amit Patel
Hi Marc,
Can you please provide your input on my last comment?