  • Amit 3 posts 73 karma points
    Aug 19, 2020 @ 14:01

    Single Installation for Multiple Domains - Generate different sitemap and robots

    Hello All,

    I have two domains pointed at a single Umbraco installation. The content differs per domain based on location/language.

    For example, one domain is general (worldwide) and the other is country-specific:

    www.website.com and www.website.fr

    Both domains run on the single Umbraco installation so I can manage the content centrally.

    For SEO purposes I need a separate sitemap and robots.txt for each website, so I can submit them to Google Search Console.

    What is the best way to do this?

    Thanks Amit Patel

  • Marc Goodson 1451 posts 9716 karma points MVP 5x c-trib
    Aug 19, 2020 @ 19:24

    Hi Amit

    Have a look at the following two packages:

    https://our.umbraco.com/packages/website-utilities/friendly-robots/

    and

    https://our.umbraco.com/packages/website-utilities/friendly-sitemap/

    They both allow 'dynamic' versions of robots.txt and an XML sitemap for each configured domain in a multi-site setup.

    regards

    Marc

  • Amit 3 posts 73 karma points
    Aug 21, 2020 @ 15:44

    Hi Marc,

    Thanks for your reply, but it's still not working.

    I tried the package you mentioned and followed the GitHub documentation here - https://github.com/callumbwhyte/friendly-robots

    The issue we are facing with this package is that it generates robots.txt on the first request only, and then serves that same file for every subsequent request.

    For example, for our two domains:

    www.website.com/robots.txt

    www.website.fr/robots.txt

    whichever URL is hit first after a site restart determines the sitemap URL written into robots.txt. We want it generated on every request, based on the requested host.

    Our code looks like this:

    using System.Web;
    using Umbraco.Core.Composing;
    // plus the Friendly Robots namespace for RobotsComposer / RobotsConfiguration

    [ComposeAfter(typeof(RobotsComposer))]
    public class CustomRobotsComposer : IUserComposer
    {
        public void Compose(Composition composition)
        {
            // Register the configuration per request so the host is re-read each time
            composition.Register(factory => GetConfiguration(), Lifetime.Request);
        }

        public RobotsConfiguration GetConfiguration()
        {
            // Build the sitemap URL from the current request's scheme and host
            var request = HttpContext.Current.Request;
            var address = string.Format("{0}://{1}", request.Url.Scheme, request.Url.Host);

            return new RobotsConfiguration
            {
                UserAgent = "*",
                Allow = new[] { "/" },
                Sitemaps = new[] { address + "/sitemap.xml" }
            };
        }
    }
    

    We have also tried "Lifetime.Scope" and "Lifetime.Transient", with the same result. We cannot figure out what we are doing wrong here.
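    The behaviour looks as if the configuration is built once on first use and then reused, rather than our factory running per request. As a plain C# sketch of that difference (no Umbraco types involved, all names are ours, just to illustrate what we are seeing):

    ```csharp
    using System;
    using System.Collections.Generic;

    public class LifetimeSketch
    {
        // Builds the per-host value, standing in for our GetConfiguration()
        public static string BuildConfig(string host)
            => "Sitemap: https://" + host + "/sitemap.xml";

        public static void Main()
        {
            var requests = new[] { "www.website.com", "www.website.fr" };

            // Singleton-style: the value is built on the first request and
            // reused afterwards, so the second domain gets the first
            // domain's sitemap URL - this matches the behaviour we see.
            string cached = null;
            var singletonResults = new List<string>();
            foreach (var host in requests)
            {
                cached = cached ?? BuildConfig(host);
                singletonResults.Add(cached);
            }

            // Per-request style: the factory runs for every request, so each
            // domain gets its own sitemap URL - this is what we want.
            var perRequestResults = new List<string>();
            foreach (var host in requests)
                perRequestResults.Add(BuildConfig(host));

            Console.WriteLine(singletonResults[1]);   // still www.website.com
            Console.WriteLine(perRequestResults[1]);  // www.website.fr
        }
    }
    ```

    In other words, our Lifetime.Request registration behaves like the singleton loop above, even though we expected the per-request loop.
    
    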

    Your help will be much appreciated.

    Regards,

    Amit Patel

  • Amit 3 posts 73 karma points
    Aug 24, 2020 @ 13:15

    Hi Marc,

    Can you please provide your input on my last comment?
