
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • magento nginx

    Our client's Magento website has been running for at least a decade, so it has accumulated a lot of legacy categories for brands they no longer carry. We're looking to trim down the number of unnecessary URL redirects in Magento. My question is: is there an SEO-safe way to set up permanent redirects at the server level (nginx) that Google will crawl, allowing us at some point to delete the categories and the Magento URL redirects? And if this is good practice, can the server redirects themselves be deleted later once Google has registered them as permanent?

    | Breemcc
    0
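A minimal sketch of what server-level permanent redirects look like in nginx, using hypothetical paths rather than the client's real category URLs:

```nginx
# Hypothetical legacy category URLs redirected at the server level.
# "return 301" is nginx's standard way to issue a permanent redirect.
location = /brands/old-brand {
    return 301 /brands;
}

# A regex location can retire a whole legacy section with one rule:
location ~ ^/brands/old-brand/.+$ {
    return 301 /brands;
}
```

On the second part of the question: Google treats a 301 as a strong signal but still recrawls old URLs periodically, so the safer practice is to keep server-level redirects in place indefinitely rather than deleting them once the new URLs are indexed.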

  • reviews pagination crawler disallow

    Hi experts, I present customer feedback (reviews, basically) on my website for the products that are sold, with pagination to display the available reviews. I want users to be able to flick through and read the reviews to satisfy whatever curiosity they have. My only concern is that each click of the pagination presents roughly the same content; the only thing that changes is the page number in the title tag and H1. I'm thinking this could count as duplication, but I have yet to be notified by Google in my Search Console... Should I block crawlers from crawling beyond page 3 of reviews? Thanks

    | Train4Academy.co.uk
    0
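One common pattern for paginated review listings, sketched as hypothetical `<head>` markup (example domain and URL scheme, not the asker's real site):

```html
<!-- Hypothetical <head> markup for page 2 of a product's reviews.
     Each page keeps a unique title and a self-referencing canonical. -->
<title>Product Reviews - Page 2 | Example Store</title>
<link rel="canonical" href="https://www.example.com/product/reviews?page=2">

<!-- If deeper pages should stay out of the index, noindex,follow lets
     crawlers keep following links without indexing the page itself: -->
<meta name="robots" content="noindex, follow">
```

Note that blocking deep pages in robots.txt would stop crawlers from ever seeing a noindex tag on them, so the meta tag is usually the safer lever if exclusion is wanted at all.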

  • seo

    My website has many pages like this: mywebsite/company1/valuation mywebsite/company2/valuation mywebsite/company3/valuation mywebsite/company4/valuation ... These pages describe the valuation of each company. The pages were never identical, but initially I included a few generic paragraphs (what is valuation, what is a valuation model, etc.) on all of them, so parts of their content were identical. Google marked many of these pages as duplicates in Google Search Console, so I modified them: I removed the generic paragraphs and added information that is unique to each company. As a result, these pages are now very different from each other and share few similarities. Although it has been more than a month since I made the modification, Google still marks the majority of these pages as duplicates, even though it has already crawled the new modified versions. Is there anything else I can do in this situation? Thanks

    | TuanDo9627
    0

  • shopify seo audit seo expert

    Hi Experts, Single filter page: /collections/dining-chairs/black
    -- canonical: self-referencing (/collections/dining-chairs/black)
    -- index, follow
    Double filter page: /collections/dining-chairs/black+fabric
    -- canonical: self-referencing (/collections/dining-chairs/black+fabric)
    -- noindex, follow
    My question is about the double filter page above: is noindexing the better option, or should I change its canonical to /collections/dining-chairs/black? Thank you

    | williamhuynh
    0
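If consolidation to the single-filter page is the goal, the change would be a cross-page canonical on the double-filter URL (hypothetical domain shown). Combining that with noindex is generally discouraged, since noindexing a page can stop its canonical signal from being consolidated:

```html
<!-- On /collections/dining-chairs/black+fabric -->
<link rel="canonical" href="https://www.example.com/collections/dining-chairs/black">
```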

  • HEY EXPERTS, My website's page speed is not improving. I used the WP Rocket plugin, but I am still facing the errors "Reduce unused CSS", "Properly size images", and "Avoid serving legacy JavaScript to modern browsers", as you can see in the image Screenshot (7).png. I have used many plugins for speed optimization but still face these errors. I optimized the images manually using Photoshop, but I still have the image size issue. After the Google Core Web Vitals update, my website's keyword positions dropped due to slow speed. Please guide me on how to increase the page speed of my website https://karmanwalayfabrics.pk Thanks

    | frazashfaq11
    0
  • Unsolved

    ecommerce noindex shopify indexed urls

    Hello everyone, I am very new to SEO and I wanted to get some input and second opinions on a workaround I am planning to implement on our Shopify store. Any suggestions, thoughts, or insights you have are welcome and appreciated! For those who aren't aware, Shopify as a platform doesn't allow us to send a 410 Gone code under any circumstance. When you delete or archive a product/page, it becomes unavailable on the storefront. Unfortunately, the only thing Shopify natively allows me to do is set up a 301 redirect. So when we are forced to discontinue a product, customers currently get a 404 error when trying to go to that old URL. My planned workaround is to automatically detect when a product has been discontinued and add the noindex meta tag to the product page. The product page will stay up but be unavailable for purchase. I am also adjusting the LD+JSON to list the product's availability as Discontinued instead of InStock/OutOfStock.
    Then I let the page sit for a few months so that crawlers have a chance to recrawl it and remove the page from their indexes. I think that is how that works?
    Once 3 or 6 months have passed, I plan on archiving the product, followed by setting up a 301 redirect pointing to our internal search results page. The redirect will send the user to search with a query aimed at similar products. That should prevent people with open tabs, bookmarks, and direct links to that page from receiving a 404 error. I do have Google Search Console set up and integrated with our site, but manually telling Google to remove a page obviously only affects its index. Will this work the way I think it will?
    Will search engines remove the page from their indexes if I add the noindex meta tag after it has already been indexed?
    Is there a better way I should implement this? P.S. For those wondering why I am not disallowing the page URL in robots.txt: Shopify won't let me call collection or product data from within the template that assembles the robots.txt, so I can't automatically add product URLs to the list.

    | BakeryTech
    0
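A rough sketch of the noindex step in Liquid, assuming a hypothetical `custom.discontinued` metafield as the discontinued flag (the real trigger will depend on how the store tracks product status):

```liquid
{% comment %} In the product template: noindex discontinued items while
   they age out of search indexes. The metafield name is hypothetical. {% endcomment %}
{% if product.metafields.custom.discontinued %}
  <meta name="robots" content="noindex">
{% endif %}

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": {{ product.title | json }},
  "offers": {
    "@type": "Offer",
    "availability": "https://schema.org/Discontinued"
  }
}
</script>
```

On the indexing question: https://schema.org/Discontinued is a valid ItemAvailability value, and search engines do honor a noindex added after a page has been indexed; removal simply waits on the next recrawl, which fits the let-it-sit-for-months plan.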

  • technical seo noindex disallow

    Hello everyone, I am reaching out to seek your expert advice on a few technical SEO aspects related to my website. I highly value your expertise in this field and would greatly appreciate your insights.
    Below are the specific areas I would like to discuss: a. Double and Triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows: https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
    https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric Considering the need to optimize my crawl budget, I would like to seek your advice on whether it would be advisable to disallow or noindex these pages. My understanding is that by disallowing or noindexing these URLs, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content. I would greatly appreciate your guidance on this matter. b. Page URLs with parameters: I have noticed that some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, I would like to understand whether it is still recommended to disallow or noindex them to further conserve crawl budget. My understanding is that by doing so, search engines can prevent the unnecessary expenditure of resources on indexing redundant variations of the same content. I would be grateful for your expert opinion on this matter. Additionally, I would be delighted if you could provide any suggestions regarding internal linking strategies tailored to my website's structure and content. Any insights or recommendations you can offer would be highly valuable to me. Thank you in advance for your time and expertise in addressing these concerns. I genuinely appreciate your assistance. If you require any further information or clarification, please let me know. I look forward to hearing from you. Cheers!

    | williamhuynh
    0
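One distinction worth keeping in mind for both questions: a robots.txt disallow stops crawling, while noindex stops indexing but requires the page to remain crawlable, because the tag is only seen during a crawl. A hedged robots.txt sketch for filtered URLs (the wildcard pattern is an illustration, not a recommendation tuned to this site):

```
User-agent: *
# Example pattern: block crawling of filter combinations that use "+"
Disallow: /collections/*+*
```

Combining the two is self-defeating (a disallowed page's noindex is never seen), and disallowing pages that already carry canonicals to /quick-ship also hides those canonical signals from crawlers.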

  • schema schema markup serp features

    Hello Moz Team, I hope everyone is doing well. I need a bit of help regarding schema markup. I am facing an issue specifically with my blog posts: in the majority of my posts I find the error "Missing field "url" (optional)".
    This schema is generated by the Yoast plugin; I haven't applied any custom steps. Recently I published a post https://dailycontributors.com/kisscartoon-alternatives-and-complete-review/ and I tested it on two schema testing platforms: 1. Validator.Schema.org
    2. Search.google.com/test/rich-results The validator shows no errors (Schema without error.PNG), but in the rich results test (Schema with error.PNG) it gives me the warning "Missing field "url" (optional)". Is this really going to be an issue for my ranking? Please help, thanks!

    | JoeySolicitor
    6
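Since the warning concerns an optional field, it should not affect ranking; the warning can still be silenced by ensuring a `url` field is present in the BlogPosting node. A trimmed, hypothetical example of the shape:

```json
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Kisscartoon Alternatives and Complete Review",
  "url": "https://dailycontributors.com/kisscartoon-alternatives-and-complete-review/",
  "mainEntityOfPage": {
    "@type": "WebPage",
    "@id": "https://dailycontributors.com/kisscartoon-alternatives-and-complete-review/"
  }
}
```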

  • Hi all... I'm at a loss; I've never had this happen. Google only shows pages of my site when I search the brand name as one word. When I Google the site as one word (BrandBrand), it only shows my blog page and about us page, plus Twitter and Facebook, on page 1. The homepage does not show up at all. When I Google the site as two words (Brand Brand), my Facebook page is on page 1 but nothing else; the homepage isn't showing up at all. When I search both words on Bing and Yahoo, both index it as two words and show it on page 1. Any ideas?

    | TexasBlogger
    0

  • I have a question about the impact (if any) of site-wide redirects for DNS/hosting change purposes. I am preparing to redirect the domain for a site I manage from https://siteImanage.com to https://www.siteImanage.com. Traffic to the site currently redirects in reverse, from https://www.siteImanage.com to https://siteImanage.com. Based on my research, I understand that making this change should not affect the site’s excellent SEO as long as my canonical tags are updated and a 301 redirect is in place. But I wanted to make sure there wasn’t a potential consequence of this switch I’m not considering. Because this redirect lives at the root of all the site’s slugs and existing redirects, will it technically produce a redirect chain or a redirect loop? If it does, is that problematic? Thanks for your input!

    | mollykathariner_ms
    0

  • 404s

    Sometimes I get alerts from Google Search Console that it has detected soft 404s on different websites, and since I take great care to never have true soft 404s, they are always false positives. Today I got one on a website that has pages promoting events. The language on the page for one sold-out event says that "tickets are no longer available", which seems to have tripped Google into thinking the page is a soft 404. It's kind of incredible to me that in the current era, with things like ChatGPT, Google doesn't seem to understand natural language. But that has me thinking: are there strategies or best practices for how we write copy on the page so Google doesn't flag it as a soft 404? It seems like anything that tells a user an item isn't available could trip it into thinking the page is a 404. In the case of my page, it's actually important to tell the public that an event has sold out, but to use their interest in that event to promote other events. So I don't want the page deindexed or ranking poorly!

    | IrvCo_Interactive
    0
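One hedged option for sold-out event pages is to state the status in machine-readable form as well, so the "no longer available" copy has structured context. A sketch with hypothetical values:

```json
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Event",
  "startDate": "2024-06-01T19:00",
  "eventStatus": "https://schema.org/EventScheduled",
  "offers": {
    "@type": "Offer",
    "url": "https://www.example.com/events/example-event",
    "availability": "https://schema.org/SoldOut"
  }
}
```

https://schema.org/SoldOut is a defined ItemAvailability value; it signals "sold out" without implying the page itself is gone.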

  • crawler cloudflare rogerbot 520 server error

    I am getting a lot of 520 Server Errors in crawl reports. I see this is related to Cloudflare. Since we know 520 is Cloudflare, maybe the Moz team can change this from "unknown" to "Cloudflare 520". Perhaps the Moz team can also update the "how to fix" section in the reporting, if they have suggestions on how to avoid seeing these in the report or if there is a real issue that needs to be addressed. At this point I don't know. There must be a solution Moz can provide, like a setting in Cloudflare that will permit Rogerbot if Cloudflare is blocking it because it doesn't like its behavior. It could also be that Rogerbot crawled my site on a bad day, or at a time when we were deploying a massive site change. If I know when my site will be down, can I pause Rogerbot? I found this: https://developers.cloudflare.com/support/troubleshooting/general-troubleshooting/troubleshooting-crawl-errors/

    | awilliams_kingston
    0

  • url inspection breadcrumbs technical seo seo tactics

    Hello everyone! I am building an ecom store using WordPress and have assigned multiple categories to the same product. What should the URL structure be when users navigate via different product categories?
    Categories assigned: tshirt, blue, striped
    Product name: blue-striped-tshirt
    Option 01: match the site navigation breadcrumb to the product URL
    URL - ecomstore.com/tshirt/blue-striped-tshirt | Breadcrumb - home/tshirt/blue-striped-tshirt
    URL - ecomstore.com/blue/blue-striped-tshirt (canonical to 1 product page) | Breadcrumb - home/color/blue/blue-striped-tshirt
    URL - ecomstore.com/striped/blue-striped-tshirt (canonical to 1 product page) | Breadcrumb - home/type/striped/blue-striped-tshirt
    Option 02: same product URL, different breadcrumbs based on user site navigation
    URL - ecomstore.com/tshirt/blue-striped-tshirt | Breadcrumb - home/tshirt/blue-striped-tshirt
    URL - ecomstore.com/tshirt/blue-striped-tshirt (URL same as 1 product page) | Breadcrumb - home/color/blue/blue-striped-tshirt
    URL - ecomstore.com/tshirt/blue-striped-tshirt (URL same as 1 product page) | Breadcrumb - home/type/striped/blue-striped-tshirt
    I have decided to go with Option 01 so that the product in each category can rank for each category keyword. Which option is best in your experience, or is there another best practice?

    | Dingos
    0
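Whichever option is chosen, the trail shown to search engines can also be declared explicitly with BreadcrumbList markup. A sketch assuming Option 01's primary path (names and URLs are the question's own examples):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://ecomstore.com/" },
    { "@type": "ListItem", "position": 2, "name": "Tshirt", "item": "https://ecomstore.com/tshirt/" },
    { "@type": "ListItem", "position": 3, "name": "Blue Striped Tshirt" }
  ]
}
```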

  • url seo

    Hi All, Some real estate/news companies append a code to the end of a URL: https://www.realestate.com.au/property-house-qld-ormiston-141747584 https://www.brisbanetimes.com.au/national/queensland/childcare-centre-could-face-prosecution-for-leaving-child-on-hot-bus-20230320-p5ctqs.html Can I ask if there are any negative SEO implications for doing this? Cheers Dave

    | Redooo
    0

  • I have looked at lots of different plugins and wanted a recommendation for an easy way for patients of ours to upload pictures of them out partying and having fun and looking beautiful so future users can see the final results instead of sometimes gory or difficult to understand before and after images. I'd like to give them the opportunity to write captions (like facebook or insta posts and would offer them incentives to do so. I don't want it to be too complicated for them or have too many steps or barriers but I do want it to look nice and slick and modern. Also do you think this would have a positive impact on SEO? I was also thinking of a Q&A app where dentists could get Q&A emails and respond - i've been doing AMA sessions and they've been really successful and I would like to bring it into out site and make it native. Thanks in advance 🙂

    | Smileworks_Liverpool
    1

  • Hey Mozers! Moz Crawl tells me I am having an issue with my Wordpress category - it is returning a 5XX error and i'm not sure why? Can anyone help me determine the issue? Crawl Issues and Notices for: http://www.refusedcarfinance.com/news/category/news We found 1 crawler issue(s) for this page. High Priority Issues 1 5XX (Server Error) 5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.

    | RocketStats
    0

  • After migrating a web application from an AWS EC2 instance to Microsoft Azure App Service, we observed that we lost 50% of our traffic. Our site's custom domain is ihealthmantra.com, and the Azure web app has the default domain azurewebsites.net. Azure App Service has the drawback that the default domain remains active even after mapping a custom domain. We mapped the Azure web app hostname to our custom domain as a CNAME record in the DNS table. Now the same site works on two domains, i.e. ihealthmantra.com as well as azurewebsites.net. Once we saw this issue we set up a 301 redirect from the Azure default domain to our custom domain, but there has still been no change in traffic. Google is now showing external links from azurewebsites.net to ihealthmantra.com. We are totally confused and don't know what exactly affected our search traffic. Please help us.

    | DivyaDubey
    0

  • We are implementing new design and removing some pages and adding new content. Task is to correctly map and redirect old pages that no longer exist.

    | KnutDSvendsen
    0

  • Hello,
    Recently, I was checking how my site content is getting indexed in Google and from today I noticed 2 links indexed on google for the same article: This is the proper link - https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/ But why this URL was indexed, I don't know - https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/?utm_source=rss&utm_medium=rss&utm_campaign=hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims Could you please tell me how to solve this issue? Thank you

    | Dinsh007
    1

  • Hey guys, just wondering: my client has 3 websites, 2 of the 3 will be closed down and their domains permanently redirected to the 1 primary domain - however they have some high-quality backlinks pointing at the domains that will be redirected. How does this affect SEO? Domain One (primary - getting redesigned and rebuilt) - not many backlinks
    Domain Two (will redirect to Domain One) - has quality backlinks
    Domain Three (will redirect to Domain One) - has quality backlinks When the new website is launched on Domain One I will contact the backlink providers and request they update their URL - I assume that would be the best approach.

    | thinkLukeSEO
    0

  • Hi guys -- Still waiting on Moz to index a page of mine. We launched a new site over two months ago. In the meantime, I really just need a list of internal links to a specific page because I want to change its URL. Does anybody know how to find that list (of internal links to 1 of my pages) without the Moz index? I appreciate the help!

    | marchexmarketingmcc
    1

  • I have a forum site that I opened 2 months ago, but there is a problem. Although my content is unique, my site's keyword rankings change constantly. Sometimes my site drops out of the top 500, then comes back to the 70s. I haven't done any off-page SEO for the site. What is the problem?

    | tutarmi
    0
  • Unsolved

    technical seo google analytics utm parameters

    Does anyone have a tutorial on how to dynamically capture the referring source to be populated in UTM parameters for Google Analytics? We want to syndicate content and be able to see all of the websites that provided referral traffic for this specific objective. We want to set a specific utm_medium and utm_campaign but have the utm_source be dynamic and capture the referring website. If we set a permanent utm_source, it would appear the same for all incoming traffic. Thanks in advance!

    | peteboyd
    0
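A minimal sketch of the dynamic-source idea: derive `utm_source` from the referrer's hostname at click time and leave medium/campaign fixed. The function and parameter values below are hypothetical; in a browser it would be fed `document.referrer`.

```javascript
// Build a tracked URL whose utm_source is the referring site's hostname.
// The utm_medium / utm_campaign values are example placeholders.
function buildTrackedUrl(targetUrl, referrer) {
  const url = new URL(targetUrl);
  let source = "direct"; // fallback when there is no usable referrer
  if (referrer) {
    try {
      source = new URL(referrer).hostname; // e.g. "partner-site.com"
    } catch (e) {
      // malformed referrer string: keep the "direct" fallback
    }
  }
  url.searchParams.set("utm_source", source);
  url.searchParams.set("utm_medium", "syndication");
  url.searchParams.set("utm_campaign", "content-syndication");
  return url.toString();
}

// In the browser: link.href = buildTrackedUrl(link.href, document.referrer);
```

Note that `document.referrer` is often empty (HTTPS referrer policies, direct visits), so the fallback value is what you will see for a sizable share of traffic.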

  • url encoding

    Hello friends, Will properly encoding a URL hurt my ranking after having it improperly coded? I want to change my &amp;amp; entities to plain & symbols. If I go from:
    http://www.example.com/product.php?attachment=pins&amp;model=cool To:
    http://www.example.com/product.php?attachment=pins&model=cool Will I get hurt if I make the leap?

    | sonic22
    0
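Fixing the encoding should not hurt by itself, especially if the old form redirects or canonicalizes to the new one. The snippet below just illustrates why the entity form is broken: URL parsers read the stray `amp;` as part of a mangled parameter name (example URL only).

```javascript
// A query string carrying a literal "&amp;" produces a bogus parameter key.
const broken = "http://www.example.com/product.php?attachment=pins&amp;model=cool";
console.log([...new URL(broken).searchParams.keys()]); // note the mangled "amp;model" key

// Replacing the stray entity restores the intended parameters:
const fixed = broken.replace(/&amp;/g, "&");
console.log(new URL(fixed).searchParams.get("model"));
```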

  • Hi, Currently we have the following URL structure for all pages, regardless of hierarchy: domain.co.uk/page, such as domain/blog name. Can you please confirm the following: 1. What is the benefit of organising the pages as a hierarchy, i.e. domain/features/feature-name, domain/industries/industry-name, domain/blog/blog name, etc.? 2. This will create many 301s - what is Google's tolerance of redirects? Is it worth changing the URL structure, or would you only recommend adding breadcrumbs? Many thanks Katarina

    | Katarina-Borovska
    1

  • I have four English sites for four different countries, UK, Ireland, Australia and New Zealand and I want to share some content between the sites. On the pages that share the content, which is essentially exactly the same on all 4 sites, do I use the hreflang tags like: or do I add a canonical tag to the other three pointing to the "origin", which would be the UK site? I believe it is best practice to use one or the other, but I'm not sure which make sense in this situation.

    | andrew-mso
    0

  • Hi guys, Had a quick question that I wanted to verify here. After reviewing a Moz report we received some redirect chain error on all of our sites hosted with WPEngine. We noticed that the redirect chain appears to be coming from how the domains are configured in their control panel. Essentially, there is a redirect: from staging/temp -> to live from non-www -> to www SSL redirect from http -> https The issue here is that the non-www is redirecting to www and then redirected again to https://www According to support the only way to get rid of this error is to drop the www version of the domain and to host everything under https://domain.com. To me it seems very odd that you cannot just go from http://non-www to https://www in just 1 301 redirect. Has anyone else experienced this or am I just not looking at the situation correctly?

    | AaronHenry
    0
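For reference, this is the single-hop behavior being asked about, sketched as nginx server blocks on a hypothetical domain; whether WPEngine's managed stack exposes this level of control is a separate question for their support:

```nginx
# Every non-canonical variant goes straight to https://www.example.com
# in one 301, instead of chaining non-www -> www -> https.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://www.example.com$request_uri;
}

server {
    listen 443 ssl;
    server_name example.com;
    # ssl_certificate directives omitted in this sketch
    return 301 https://www.example.com$request_uri;
}
```

A two-hop chain like the one described still passes signals; the cost is mostly latency and crawl overhead rather than a hard SEO penalty.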

  • domain authority optimization

    In my latest site crawl, the domain authority dropped 10 points for no apparent reason. There have been no changes to the site. The only change I have made this month is to block referral spam to the site. My competitors' DAs have stayed the same too. website name: https://knowledgefront.co.uk/ Any ideas?

    | LisaBabblebird
    0
  • Solved

    My page rank for www and non-www is the same. In one keyword instance, my www version performs SO much better. Wanting to consolidate to one or the other. My question is as to whether all these issues would ultimately resolve to my chosen consolidated domain (i.e. www or non-www) regardless of which one I choose. OR, would it be smart to choose the one where I am already ranking high for this significant keyword phrase? Thank you in advance for your help.

    | meditationbunny
    0

  • 1 e-commerce site has bad seo

    I've been working on educating myself about SEO all day, again. All-Star Telescope, up in Canada. We have a competitor that consistently ranks #1 and I don't get it. Their site is full of duplicate content (straight copy and paste from the manufacturer's site). They don't have any meaningful blog or video content to add relevance or value to their site. We have higher page authority and higher domain authority, and the keyword analyzer in Moz says that our page is higher quality than the competitor's page. Our site is slow, but theirs is slower. I can't find a single metric in any tool (Ubersuggest, Moz, Ahrefs, Semrush) that says Telescopes Canada is a better site, or has a better NexStar 8SE product page (a popular telescope). Here's the link to Telescopes Canada's page for the Celestron 8SE: https://telescopescanada.ca/products/celestron-nexstar-8se-computerized-telescope-11069?_pos=1&_sid=f0aa91cc2&_ss=r Here's a link to the Celestron 8SE page from the manufacturer's website: https://www.celestron.com/products/nexstar-8se-computerized-telescope?_pos=1&_sid=56abdabd4&_ss=r#description Telescopes Canada has just copied and pasted. There is no original content aside from adding the shipping and return policy to a tab and having some options for selecting accessories on the page. Here is our page: https://all-startelescope.com/products/celestron-nexstar-8se Our titles are good, and our metadata is good (though I don't think that's been a serious ranking factor for about ten years). The text is original, it's relevant, and we have healthy internal links to the page. We have invested in some excellent blog content, and we're adding new products to the website so that we rank for more keywords. All of those things are helping, but I fundamentally don't understand why Telescopes Canada is #1 almost across the board on every key product in our market. There is something I'm not seeing here, something that isn't being captured by the tools that I have.
    Is it simply the fact that they get more traffic? Is that why some people go and buy traffic? Can you see any metric, any tool in your toolbox, that indicates why they rank at the top, or even higher than we do, for these search terms specific to that product: Celestron NexStar 8SE
    NexStar 8SE
    Celestron NexStar 8SE Canada
    NexStar 8SE Canada We've worked with two highly ranked SEOs to try to figure this out, one in Canada and one in the USA. I haven't seen a confidence-inspiring answer from either of them. Posting on a forum is a bit of an act of desperation. I'll continue to work the problem, but it's discouraging to see the leader in my industry look like he's just phoning it in with his website.

    | nkennett
    1

  • duplicate content

    Hello, I had an issue with some pages being listed as duplicate content in my weekly Moz report. I've since discussed it with my web dev team and we decided to stop the pages from being crawled. The web dev team added this coding to the pages <meta name='robots' content='max-image-preview:large, noindex dofollow' />, but the Moz report is still reporting the pages as duplicate content. Note from the developer "So as far as I can see we've added robots to prevent the issue but maybe there is some subtle change that's needed here. You could check in Google Search Console to see how its seeing this content or you could ask Moz why they are still reporting this and see if we've missed something?" Any help much appreciated!

    | rj_dale
    0
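One detail worth checking in the developer's tag: "dofollow" is not a recognized robots directive (the valid token is "follow", which is also the default behavior), and directives are conventionally comma-separated. A corrected version would look like:

```html
<meta name="robots" content="noindex, follow, max-image-preview:large" />
```

Separately, Moz's report will keep flagging the pages until its crawler revisits them, so some lag after a correct fix is expected.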

  • 301 redirect page rank

    I have a blog which is ranking well for certain terms, and would like to repurpose it to better explain these terms it is ranking for, including updating the url to the new term the blog will be about. The plan being to 301 redirect the old url to new. In the past, I've done this with other pages, and have actually lost much of the rankings that I had earned on the original URL. What is your take on this? Maybe repurpose blog, but maintain original URL just to be on the safe side? Thanks

    | CitimarineMoz
    0

  • A page on our WordPress powered website has had an error message thrown up in GSC to say it is included in the sitemap but set to 'noindex'. The page has also been removed from Google's search results. Page is https://www.onlinemortgageadvisor.co.uk/bad-credit-mortgages/how-to-get-a-mortgage-with-bad-credit/ Looking at the page code, plus using Screaming Frog and Ahrefs crawlers, the page is very clearly still set to 'index'. The SEO plugin we use has not been changed to 'noindex' the page. I have asked for it to be reindexed via GSC but I'm concerned why Google thinks this page was asked to be noindexed. Can anyone help with this one? Has anyone seen this before, been hit with this recently, got any advice...?

    | d.bird
    0

  • moz crawler

    Hi, I need help regarding "Moz Can't Crawl Your Site". I'm sharing the screenshot text: Moz was unable to crawl your site on Mar 26, 2022. Our crawler was not able to access the robots.txt file on your site. This often occurs because of a server error from the robots.txt. Although this may have been caused by a temporary outage, we recommend making sure your robots.txt file is accessible and that your network and server are working correctly. Typically errors like this should be investigated and fixed by the site webmaster.
    My robots.txt is OK; I checked it.
    Here is my website: https://whiskcreative.com.au
    Please check it as soon as possible.

    | JasonTorney
    0

  • indexation keyword rankings

    Hello everyone, For one of our websites, we have optimized for many keywords. However, it seems that every keyword ranks with the home page rather than the relevant subpage, so the keywords are not ranked properly. This occurs on only one of our many websites. Does anyone know the cause of this issue, and how to solve it? Thank you.

    | Ginovdw
    1

  • Hi mozzers, I am running an audit and disabled cookies on our homepage for testing purposes, and the page returned a 404 HTTP response. I tried other pages and they loaded correctly. I assume this is not normal? Why is this happening, and could it harm the site's SEO? Thanks!

    | Taysir
    0

  • seo audit domain rankings

    I have a domain for my UAE-based project, https://mydubaiseo.com/, but one of my colleagues suggested going with a .ae domain instead.
    If we change the domain as suggested, will we get results sooner than with the .com domain? Which domain, .com or .ae, ranks faster in the UAE if the same SEO strategies are followed?

    | 0eup.ombitao
    0

  • disavow spam

    I received a huge number of spammy links (most with a spam score of 100). Currently my disavow file is around 85,000 lines, but I have at least 100,000 more domains I should add. All of my entries are domains; I don't have any individual backlink URLs in the file. My problem is that Google doesn't accept disavow files larger than the limit and shows this message: File too big: Maximum file size is 100,000 lines and 2MB. What should I do now?

    | sforoughi
    0
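Since the file is all domains, the usual way to get under the 100,000-line / 2 MB cap is to consolidate duplicates into unique `domain:` lines. A naive sketch (the last-two-labels logic is a simplification; correct handling of suffixes like .co.uk needs a public-suffix list):

```javascript
// Collapse a disavow list to unique "domain:" entries.
function consolidateDisavow(lines) {
  const domains = new Set();
  for (const line of lines) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith("#")) continue; // skip blanks and comments
    const host = trimmed
      .replace(/^domain:/, "")          // already a domain: directive
      .replace(/^https?:\/\//, "")      // full URL entry
      .split("/")[0];                   // drop any path
    // Naive registrable-domain guess: keep the last two labels.
    domains.add("domain:" + host.split(".").slice(-2).join("."));
  }
  return [...domains].sort();
}
```

If the list is still over the limit after deduplication, the pragmatic fallback is to prioritize the highest-spam-score domains, since a property only gets one disavow file.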

  • I'm not sure whether page speed matters for Google rankings. I have been working on the keyword "flower delivery in Bangalore" for the last few months, and I've seen websites on Google's first page that have low page speed but still rank, so I'm worried about my own page, which also has low speed. Will my Bangalore page rank on Google's first page if its speed is low? Kindly suggest more tips on the ranking factors that really work in 2020. Also, does domain authority really matter this year? I've seen websites with low domain authority ranking on Google's first page. Home page: Flowerportal. Bangalore page: https://flowerportal.in/flower-delivery/bangalore/ Focus keywords: flower delivery in Bangalore, send flowers to Bangalore

    | vidi3423
    1


