sitemap which contains numerous links/URLs

I am currently working on a purchasing / public auction website. The party providing SEO consultancy is urging us to create URLs for every single combination of items listed on the website, examples below. It just doesn't feel right to me, since each item/category page already has a canonical link, so I'm not really sure how much the new sitemap would gain us. Would someone comment on this approach and offer pointers/recommendations in the given context? Thanks.

Examples:

  • www.mywebsite.com/CPU
  • www.mywebsite.com/CPU/Intel
  • www.mywebsite.com/CPU/Intel/Core2Duo
  • www.mywebsite.com/CPU/AMD
  • www.mywebsite.com/CPU/AMD/Phenom
  • www.mywebsite.com/RAM
  • etc.
0
2019-05-13 02:37:21
Answers: 2

I think only the canonical versions should be on the sitemap. Stick to:

  • www.mywebsite.com/CPU
  • www.mywebsite.com/CPU/Intel
  • www.mywebsite.com/CPU/Intel/Core2Duo
  • www.mywebsite.com/CPU/AMD
  • www.mywebsite.com/CPU/AMD/Phenom

If all those links simply point to www.mywebsite.com/CPU, then just add that one URL to the sitemap. If each page is different (even if it is just the title), then you need to add them all to the sitemap, but beware of duplicate content.
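For reference, a minimal sitemap listing only the canonical pages might look like the fragment below (it reuses the question's example paths; the `https://` scheme and the `urlset` boilerplate follow the standard sitemap protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only canonical category/product pages are listed; -->
  <!-- combination pages that canonicalize elsewhere are left out. -->
  <url>
    <loc>https://www.mywebsite.com/CPU</loc>
  </url>
  <url>
    <loc>https://www.mywebsite.com/CPU/Intel/Core2Duo</loc>
  </url>
</urlset>
```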

0
2019-05-17 11:48:50

There isn't enough detail in your question to give a "good" answer. I tend to think you might be barking up the wrong tree, but I do so by assuming a few things, as follows.

I assume:

  • you have numerous (single) product pages already, and Google hasn't crawled them all (more than 100,000?)
  • products will come and go with such regularity that spidering is a never-ending problem, and of critical importance
  • without a sitemap, you probably won't get a crawl of all product pages, so you rely on the sitemap to expose all the pages you need crawled

My suggestions:

  • engineer a browsable hierarchy for the benefit of the crawlers (only); it is clickable by users, of course, but its purpose is simply for the crawler to crawl
  • the browse hierarchy should put every product no more than 4 pages deep
  • build your URL rewrite to follow the crawl (i.e. first directory = product-catalog (or some similar phrase), second directory = drill down to the first level of product specificity (e.g. "storage"), last directory = a keyword, a hyphen and the product ID, and the last bit of the URL string is Product-Name.html)
  • design-wise, display the browsable directory links inconspicuously to users; you really built this for the crawler, and from a usability standpoint it is better if your users click through the existing UI
  • steer the crawler by marking all other internal links as NoFollow (some might say not *all* other internal links; whatever the follow/nofollow strategy, you need to encourage the crawler to travel the easy hierarchical path, and not bounce around the site using all the other natural links)
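As a sketch of that last suggestion, the hierarchy links are left followable while other internal links carry `rel="nofollow"` (the paths below are hypothetical, built from the question's examples):

```html
<!-- Hierarchy path: followable, so the crawler walks the catalog -->
<a href="/CPU">CPU</a>
<a href="/CPU/Intel">Intel</a>
<a href="/CPU/Intel/Core2Duo">Core2Duo</a>

<!-- Other internal links (sorts, filters, cross-links): nofollow -->
<a href="/CPU?sort=price" rel="nofollow">Sort by price</a>
<a href="/RAM" rel="nofollow">Shoppers also viewed: RAM</a>
```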

In summary, what I'm saying is that you should not make a page for every single combination of products; after all, that is a boundless number and an impossible job. Professionally, I disagree with your SEO person and with @Eric. Instead, I would be sure to engineer a browsable "catalog" for the benefit of the crawlers, and marry your URL rewrite logic to the drill-down clicks AND also to your anchor text as the crawler drills down. I would be happy to share a URL with you to serve as an example outside this forum (e-mail: chris@adragna.com).

If you are still considering the combination pages, the existing URL logic you described, and the sitemaps, work with this math: you can have up to 50,000 URLs per sitemap and up to 1,000 sitemaps. That's a maximum of 50 million pages... if you want to work backwards from that, you can use it to determine the ceiling on how many product combinations to emit.
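That math can be sketched as follows (the 1,000-sitemap figure is this answer's own number, so treat it as an assumption; the 10,000-product scenario is a hypothetical illustration of working backwards):

```python
import math

# Capacity figures quoted in the answer above.
URLS_PER_SITEMAP = 50_000   # URLs allowed in one sitemap file
MAX_SITEMAPS = 1_000        # sitemap files, per the answer

ceiling = URLS_PER_SITEMAP * MAX_SITEMAPS
print(ceiling)  # 50000000 -- "a maximum of 50 million pages"

# Working backwards: if every unordered pair of products got its own
# combination page, roughly 10,000 products already fills the ceiling.
products = 10_000
pair_pages = math.comb(products, 2)
print(pair_pages)  # 49995000 -- just under the 50 million ceiling
```

So even pairwise combinations of a modest catalog exhaust the sitemap space, before any three-way combinations exist at all.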

0
2019-05-17 11:42:44