Familiarize yourself with SEO. Here is a good guide: https://moz.com/beginners-guide-to-seo
You can do many things without an SEO agency, and if you still need more professional help, you will know how to ask the right questions to "verify" the SEO marketer.
One thing to stay away from is anyone who guarantees you a #1 position or claims to "know the secret" or something like that. Also have them write down the work they plan to do for you, along with the results you can expect from the beginning and the results you can expect in the future.
Best of luck to you, mate!
Hey, this is Russ from Moz. We crawl the web fast (although not as fast as Ahrefs), but there is no way for us to know when we will pick up a particular link. You can check Fresh Web Explorer (https://moz.com/researchtools/fwe/), which is a separate crawl of the web and gets to blogs a little quicker than our standard crawl, which doesn't prioritize them.
Rest assured we will get to them though. How long has it been?
What you want is called "multivariate testing", and if I understood your question correctly, you can't do it by having one version of your site appear in the SERPs some of the time and another version the rest of the time.
You will only ever have a single listing in the SERPs; on the site itself you need to implement a mechanism that serves different versions of the landing page to your users and analyses how each version affects your chosen signal. The process isn't exactly hard, but it's not simple either. Moz talks about it here.
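If it helps, here's a minimal client-side sketch of the kind of mechanism involved (the cookie name, variant copy and 50/50 split are all placeholder assumptions; a real test would more likely use a dedicated testing tool or a server-side split):

```html
<!-- Assigns each visitor a variant once, remembers it in a cookie,
     and swaps the headline for variant B. "ab_variant" and the
     headline copy are made up for this example. -->
<h1 id="hero-heading">Original headline</h1>
<script>
  var match = document.cookie.match(/(?:^|; )ab_variant=([AB])/);
  var variant = match ? match[1] : (Math.random() < 0.5 ? "A" : "B");
  document.cookie = "ab_variant=" + variant + "; path=/; max-age=2592000";
  if (variant === "B") {
    document.getElementById("hero-heading").textContent = "Alternative headline";
  }
  // Report the variant to your analytics tool here, so you can
  // compare your chosen signal (e.g. conversions) per variant.
</script>
```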
Kind of. I think agencies mostly did this so they could maintain the connections you've made whilst working for them in the event that you leave, because staff turnover at agencies is pretty high. When guest blogging was a mainstream tactic, this also helped give them a bit of credibility, because people could see their author profiles around the web on all the sites "they" had written at.
And like X years ago, someone ran a test showing that women's names got a better response rate, so some people got excited about using fake names. One of my colleagues at the time used to do outreach as "Scarlett Huxley" so he could apparently do 39.4% less work per link earned.
I've noticed that when we sent emails from "a marketing agency representing the client" to journalists, we got way less co-operation than when sending emails as the client itself. I think it's because journalists/bloggers' bullshit detectors go off for the former, less so for the latter. I prefer to send out emails using the identity of the client themselves, partially because... every client will leave you one day, and they deserve the contacts you've made during their billable hours... but this kind of bites you in the arse if you need the same writer for another client.
None of the above is applicable to you.... so do whatever you're happy with.
AFAIK there are only 2 companies that improve on data from GKP: Ahrefs & Moz. Each of us has its own model of combining GKP data with clickstream data, and thus each tool gives slightly different numbers. Though in many cases our search volume estimates are very close.
As for the rest of the tools, they rely 100% on GKP (AFAIK), and you never know how often they pull fresh data (search volume changes pretty much every month). That is why different tools that get data from GKP show different numbers for the same keywords, even though they use the same source.
And as for GKP data reliability, /u/karmaceutical did a superb job explaining what's wrong with their search volumes and why they can't be trusted (see his articles at Moz blog).
No I assure you it was not, although I see why you may think so. I wasn't aware Moz provided this specific feature.
But it appears to have been disabled now. In the thread you shared, meghan (who appears to be part of the Moz team) linked this as the answer.
Which shows that this feature was retired a few months ago. I'd appreciate any other leads you may have, and thanks for your help.
P.s. I don't know why answers are getting downvoted in this thread. I found your comment at 0 karma.
It's an honor to be listed on an unprofitable (or barely profitable) company website these days?
Hey, thanks for the response.
> Yes they do.
My responsibilities do not. Perhaps you are right that I am biased towards Moz because I work there, but it isn't my responsibility to do anything on reddit, or anywhere else for that matter outside of the moz blog and moz.com/q
> Doesn't Moz mind that you're detrimentally affecting their reputation?
I think if I did damage their reputation in any material way that it would reflect negatively on me, but I have never heard of a Moz employee being reprimanded for speaking their mind. There is no Moz Whitehat SEO Handbook that we all sign requiring that we speak one way or another about SEO techniques, nor is there a prohibition against responding emotionally online.
> Don't you need to insert a statement that your posts are your opinion and not those of Moz?
"Moz" doesn't have an "opinion". We have core values which are represented by TAGFEE, which actually encourage things like being "authentic". Being upset with your initial response to a post about a lot of my friends losing their jobs is me being me.
> Because I'm taking these posts as official responses from Moz
This would be a mistake. The only people really in any position to make statements on behalf of Moz would be its Officers and Founder, and even then they would only do it in collaboration with one another.
> and I've lost all respect for Moz
I am sorry that is the case.
> I can see it all over some of the SEM journals now.
You seem really hurt by my comment "stay classy". I am sorry if it offended you in some way.
Have you viewed the Google cache? http://webcache.googleusercontent.com/search?q=cache:8KvOsp2liQgJ:www.artforweb.co.uk/&num=1&hl=en&gl=uk&strip=1&vwsrc=0
Those lovely injected links appear there.
So the 'malware' is only inserting the links for Googlebot (and similar). You can set your own user agent to Googlebot if you want to see them yourself, but the Google cache shows what Google is being served.
You're using http://piwigo.org/, it appears.
Google 'piwigo malware' (and similar); there is plenty of discussion of similar issues which may help you.
Have you checked your file permissions? Do you know what this means? There seems to be a recurring issue with Piwigo installs having their file permissions exploited.
What was the name of the plugin which was hacked?
This is covered in a bunch of places across the web: https://www.seroundtable.com/crawled-currently-not-indexed-google-quality-issue-31677.html
https://moz.com/blog/crawled-currently-not-indexed-coverage-status
Hello. My name is David Black and I'm the Director of Customer Success here at SEMrush. Shinjetsu01, this is a very generalized statement considering the scope of tools, data and overall functionality that we provide. Do you care to elaborate on what exactly you feel is inaccurate?
I guess I can say that our Support team deals with the question of accuracy a lot. In short, no tool is ever going to be 100% accurate due to how the data are typically collected, but I can tell you that in terms of what we do (and how we do it), SEMrush is by far the most accurate, and it keeps getting better every day. Even Rand Fishkin of Moz has spoken highly of the accuracy of our data, placing us above Moz itself for predictive SEO traffic indicators.
https://moz.com/rand/traffic-prediction-accuracy-12-metrics-compete-alexa-similarweb/
When you consider everything you have access to - the crazy amount of organic and ads competitor data, plus marketing tools like Site Audit, Position Tracking and PDF report building, all for such a competitive price that is small in comparison to the ROI it delivers - I personally feel that you can at least TRY the software for a month or two, talk to our Customer Success team, access our educational content and form your own opinion without killing your budget.
Reach out to me personally, I'll help you get set up!
For creating the structure, I recently developed a Google Sheets extension that lists all the headings from a list of URLs in one overview, so you can quickly review the articles that already rank and then build a better structure based on that info: https://workspace.google.com/marketplace/app/nicheknife_content_analyzer/136984433266
It's obviously free, and I built it more or less for fun.
Here is my OPML file from Feedly. It's not really well organized, but the Digital Marketing section is what you're looking for (it's mostly SEO):
https://drive.google.com/file/d/0B1aSTilKEYd4U2lCbk1UMXRKdjA/edit?usp=sharing
Scariest... Probably going through depression in 2013/2014. https://moz.com/rand/long-ugly-year-depression-thats-finally-fading/
Your brain convinces you that nothing matters, that nothing you do is good, that it's all pointless. Not existing seems really tempting in that headspace. Awful, awful thing I wouldn't wish on anyone.
> But please remember that if your question can be answered with a link to https://moz.com/learn/seo then it probably doesn't belong here.
Sub guidelines on submissions. Asking a question that is partially answered on Moz Learn SEO isn't what we're going for on here. The intro...
> BigSEO is a community that's trying to be SEO for Grownups.
> Big has multiple meanings. Big as in big people (grown ups), big shots (people who know their trade) and big ideas (not getting caught up in meaningless SEO debates over whether or not to use a pipe in title tags).
> Never be afraid to ask a question. Post something fun, something interesting or something you read that you learnt from.
If something is asked that can be easily Googled, we tend to either wait for reports or, if the poster's thread was culled by AutoModerator, just not approve it. This time it was approved to see if it would get reported.
We're aiming for "content that would be useful for someone with decent basic knowledge" to try to differentiate the subreddit from /r/seo - nothing malicious.
The one-post-every-week-or-so methodology does create a good signal-to-noise ratio. Moz has published 12 articles since this one was released, some of which were very good IMHO, like this one on RankBrain and Keyword Research. Of course, we cover a lot of issues that Ahrefs doesn't - 3 were targeted at Local Search, for example.
But we will certainly take the critique in stride. Perhaps we should slow our release cadence and focus on bigger pieces.
Don't give up, keep writing and doing outreach.
Frequently check your GSC's Search Query data (and other resources) to find new queries you can write content about.
Check out this article: https://moz.com/blog/beyond-the-seo-plateau-after-optimizing-your-website-whats-next.
Hi,
IIRC /u/Purpose2 avoids multiple H1 tags like the plague.
I'm less convinced.
Headings, and on-page optimisation in general, seem to matter more for certain types of keyword - especially ecommerce. They seem to matter less for services and more abstract terms. Google "project management software", for example.
The HTML5 specification allows it once per section. Google use multiple H1s in their own products, e.g. https://play.google.com/store/apps/details?id=com.ubercab&hl=en_GB (the main heading is just a div).
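For reference, the once-per-section pattern the spec permits looks like this (a hypothetical page):

```html
<!-- One H1 per sectioning element is valid HTML5. -->
<body>
  <h1>Project management software</h1>
  <section>
    <h1>Features</h1>
    <p>...</p>
  </section>
  <section>
    <h1>Pricing</h1>
    <p>...</p>
  </section>
</body>
```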
Perhaps any large heading at the top of the page is going to be treated as the main heading - regardless of which tag it uses.
My approach is not to try to create multiple H1 tags, but also not to cry about it if they exist. Technical SEO is triage, and usually something like this is way down the list. If it's something that can be fixed in 5 minutes, do it, maybe as part of another job.
I'm more concerned about the content of them (keyword, keyword variant, abstract).
Think these studies may help you: https://moz.com/blog/title-tags-seo
(disclosure: I am the author)
Short answer: It can depend on a host of factors, including the strength of your brand, and/or if people are searching for the keywords used in the boilerplate part of your title. Emotional/marketing elements sometimes come into play as well. The only way to know for sure is to test.
QuicCloud is great, but please let's not spread FUD; there are literally dozens of Cloudflare nodes in India and all over Asia.
https://www.cloudflare.com/en-gb/network/
> Asia
Baku, AZ
Bandar Seri Begawan, BN
Bangkok, TH
Bengaluru, IN
Cagayan, PH
Cebu, PH
Chennai, IN
Chittagong, BD
Colombo, LK
Dhaka, BD
Ekaterinburg, RU
Hanoi, VN
Ho Chi Minh City, VN
Hong Kong
Hyderabad, IN
Islamabad, PK
Jakarta, ID
Jashore, BD
Johor Bahru, MY
Karachi, PK
Kathmandu, NP
Kolkata, IN
Krasnoyarsk, RU
Kuala Lumpur, MY
Lahore, PK
Macau
Male, MV
Manila, PH
Mumbai, IN
Nagpur, IN
New Delhi, IN
Osaka, JP
Phnom Penh, KH
Seoul, KR
Singapore, SG
Surat Thani, TH
Taipei
Tbilisi, GE
Thimphu, BT
Tokyo, JP
Ulaanbaatar, MN
Vientiane, LA
Yangon, MM
Yerevan, AM
Interesting question - I would say the most unexpected thing I have noticed is that things you would expect to drive traffic don't always, and often you are completely blindsided by an activity or site placement which refers hundreds of thousands of visits.
For example, a placement on Mashable would probably be universally accepted as likely to drive hundreds or even thousands of clicks - not always the case.
We've had forum posts on places like MoneySavingExpert.com that have driven thousands of visits in a few hours... http://www.thisiswhyimbroke.com/ is one of the biggest referrers for one of the businesses I am an investor in.
For sure, there is a graph that shows the top affected niches in this Moz article here; medical was just one of the most affected categories of the bunch, which is where the name came from.
Hey /u/karmaceutical, how come you created a comparison table that includes Ahrefs without running it by me? I thought we were friends :)
site:ask.metafilter.com/tags
At least as late as March of 2014, ask.metafilter.com allowed their tags pages to be indexed - to the tune of 450,000+ pages: https://web.archive.org/web/20140330015339/http://ask.metafilter.com/tags/internet
Now they are finally blocking it. Google did discover this section was full of trash (in this case, tons of tag pages).
well, actually we did partner with Moz recently to study our metrics side-by-side (as part of their annual study).
You can see the results here:
MOZ: https://moz.com/search-ranking-factors/correlations#3
AHREFS: https://moz.com/search-ranking-factors/correlations#10
As you can see, Ahrefs metrics correlated better. But they didn't include our "URL Rank" there for some reason, so I'm not sure if it's really better or worse than PA.
> One way to find out if this is real is to do a poll like the folks of Hacker News did: https://news.ycombinator.com/item?id=7672910

Great idea! A big corporation like Google, with big stakeholders, could easily be doing that.
First of all, congratulations and great job on that massive growth.
You might know this or you might not, but at this stage I would start looking for an agency with strong e-commerce experience (or even specialization) that can show you how, over the past 12 months, they've helped grow another ecommerce website. Luckily, I run just that kind of agency... just kidding! Or it can be one guy, as long as he's good. The nice thing about e-commerce is that if you're good, the numbers easily show it.
But I'd also warn you to manage your expectations. I don't know the search volume around your keywords (and you guys have a lot of products) but growing 3x in 6 months might be too fast. Then again, 0 to 175K in a year is quite the feat.
I went to your site and the first thing I noticed was that it was really slow to load. It seems like those images are really slowing you down. You're using Big Commerce's CDN, so that might be an issue (even though CDNs normally help, I'm not sure this one is). I just ran it through PageSpeed Insights (here - https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fwww.dankstop.com&tab=desktop) and your site failed miserably; they call out those images, so it IS an issue. (I'm sure you've heard about Amazon's famous study on page load speed and how much it causes drop-off in ecomm businesses; if not, google it.) It also looks like your styling is on the page instead of in a separate CSS file (which might also be a Big Commerce setup thing).
Just some thoughts.
You're still getting the link juice from the backlink. Don't worry.
I saw this case study about how backlinks from such sites still impact the ranking very well. It just takes a few months.
I'm at an agency, so we're doing this pretty often.
Like what others have said here, it's largely the same game as traditional SEO, except now you're working on 10 pages instead of 1.
A big play that I've seen do a lot is building out your company's profiles on other sites, such as CrunchBase, Behance, About.me, etc. Make sure to actually optimize these pages with your keyword and related terms (TF-IDF analysis works great).
Also make sure you have the obvious ones up, like Facebook, Twitter, LinkedIn, etc.
From there, building links to these already-high-DA pages with optimized on-page copy will do a lot for you in a relatively short period of time (by SEO standards).
A big part of SEO is problem finding, and problem solving.
Bltonwhite has already noticed one thing that needs optimising. I clicked your site, and it took 6.7 seconds to load (https://gtmetrix.com/reports/www.actisspartners.com/KjdhxS1l).
Keep working, man :)
I always start with:
For which highly popular and relevant transactional queries related to your main products or services are all (or some) of your competitors highly ranked while you're still not?
Which are their top ranked & traffic driving pages and categories vs. yours?
Now you have those pages to prioritize in your content optimization and development process, as well as to validate first from a technical perspective.
- Which are those top pages that rank very well but get very low CTR in the search results?
- Which are those top pages with high organic search traffic but a high bounce rate?
- Which are those pages that rank and convert well on desktop but not on mobile?
Usually the cause of these issues is that there's a cannibalization problem:
Product pages ranking instead of category pages, or vice versa, because they haven't been correctly targeted from a content perspective or organized internally to effectively rank for their targeted queries without overlapping.
The wrong product or category ranking instead of another, or many of them ranking and switching back and forth, because there's no specific page targeting those queries.
The mobile web version not being correctly optimized, as the desktop version might be.
In general, I follow the approach that I explained in my BrightonSEO session here: https://www.slideshare.net/aleydasolis/how-to-drive-growth-through-your-seo-audits-at-brightonseo - and specifically with an ecommerce site I also usually pay specific attention to what I described in this presentation I did in Oslo last year: https://www.slideshare.net/aleydasolis/seo-for-ecommerce-5-must-follow-tips-at-maxnordic
http://www.amazon.com/The-Elements-Style-William-Strunk/dp/1557427283
The Elements of Style. It's dirt cheap and a short read; check it out. I think it'd help you hone the conversational voice you're looking for in your post!
https://imgur.com/a/md3kxJR taken from here.
Some people might say Moz isn't a reliable resource or is outdated, but from my experience, it's much easier to fuck up if you put it in multiple places.
See here https://www.cloudflare.com/en-gb/plans/#overview
On the free and ALL plans you get:
"Globally Load Balanced CDN
Our 250 data centers located across the globe provide visitors with location-based access to your website, while removing latency and improving performance."
> Especially given how Majestic seem to be cheating on their backlink counts!
Here's the relevant snippet for anyone who doesn't want to read the whole article:
> This happens because Majestic fail to strip URL parameters from URLs, which results in the same backlink being duplicated hundreds, sometimes even thousands of times.
The author's claim that Majestic is "cheating" seems a bit disingenuous. Is the author suggesting that URLs with parameters should all be counted as equivalent to the same URL without the parameters? The issue is not so clear. Many sites deliver unique content based on the key/value pairs of the parameters. In those cases, it's definitely preferable to have each crawlable instance listed. Even if the backlink sources are duplicated (let's call it spam) content, it's worth having a resource that allows you to see that.
Is it ideal? No, but until they (Majestic and others) make a crawler that can assess and classify duplicate content (possibly even letting you filter the export), I'd rather have the extra rows than not, and sort it out myself.
For what it's worth, this post (https://moz.com/blog/big-fast-strong-backlink-index-comparisons) by Russ Jones (/u/karmaceutical) is a far better comparison of the major backlink tools.
I guess I should have specified for SERP snippets, but here is this on Moz, https://moz.com/blog/how-long-should-your-meta-description-be-2018 and this is on Search Engine Land, https://searchengineland.com/google-officially-increases-length-snippets-search-results-287596
Whoa... That's a hard one :-)
How about this - every year I write a "predictions" post (here's my one from 2017 - https://moz.com/blog/8-predictions-for-seo-in-2017). I'll do that again in January of 2018. It's granular and specific, though relatively short-term focused (because I believe none of us are good at predicting the long term future, and it's not particularly useful to try).
My understanding is that you can reference other domains in your hreflang annotation but you have to remember that for it to work it can only be on an individual URL basis and the other domain's URL must reference back to the initial one. I've not implemented across domains myself but this resource from Moz gives a lot of different scenarios which might help you: https://moz.com/blog/hreflang-behaviour-insights
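To illustrate the reciprocal requirement (example.com and example.de are placeholder domains):

```html
<!-- On https://www.example.com/page (the English URL): -->
<link rel="alternate" hreflang="en" href="https://www.example.com/page" />
<link rel="alternate" hreflang="de" href="https://www.example.de/seite" />

<!-- And on https://www.example.de/seite (the German URL) the
     annotations must point back, or the pair will be ignored: -->
<link rel="alternate" hreflang="en" href="https://www.example.com/page" />
<link rel="alternate" hreflang="de" href="https://www.example.de/seite" />
```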
I'm on my phone but you can find the deck under the schedule portion here: https://moz.com/mozcon
On Wednesday night. Just hit download deck. Rand was the last presentation. There's also a ton more decks you can download. :) a lot of valuable info in them.
They call these Infinite PAAs. IIRC, it doesn't always happen but it is really interesting when it does. Britney Muller at Moz wrote about it recently if you want to see more examples.
> I can't believe Google is not aware of this.
They're almost certainly aware of it. I think they've been indecisive over whether they should actively exclude this traffic from Analytics. I imagine they see modifying data as some larger technical or accountability challenge (i.e. they don't want to be responsible for changing data that goes into GA). I have yet to see any evidence that the Exclude Known Bots feature does anything.
Best thing to do is set up a known-hostname filter on your main reporting view, which removes the majority of this, and hope that at some point Google takes action on the spam issue.
I also don't think the source issue has anything to do with SEO - these are straight-up spam websites using bots to brute-force guess your UA profile, hoping some suckers visit and they ultimately get some incremental traffic out of it. There should be no tangible ranking benefit; Google doesn't look in your GA profile to see if they're legit ;)
Very interesting stuff!
Does anybody have a working example using the Schema.org Markup with Google Custom Search? This is our code right now:
```html
<section class="grid_3 omega widget same-height-right" style="height: 224px;">
  <div id="c290902" class="csc-default">
    <a id="c278724"></a>
    <h2 class="title csc-header">Haben Sie Fragen?</h2>
    <div>
      <div class="holder">
        <p class="bodytext"><br>Einfach einen Begriff in das Suchfeld eingeben und sofort themenrelevante Ergebnisse erhalten!</p>
      </div>
      <div class="box_content">
        <div id="cse-search-form">
          <form id="cse-search-box" action="/suche/q.php">
            <input type="hidden" value="017002835581306790937:802l7atpqxs" name="cx">
            <input type="hidden" value="UTF-8" name="ie">
            <input id="googleSearchField" type="text" size="31" name="q" style="background: url('https://www.google.com/cse/images/google_custom_search_watermark.gif') no-repeat scroll 50% 50% transparent;">
            <input class="btn" type="submit" value="Suchen" name="sa">
          </form>
        </div>
      </div>
    </div>
  </div>
</section>
```
Thanks!
Edit: formatting
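For what it's worth, the sitelinks search box markup Google documents is JSON-LD on the homepage along these lines (the domain is a placeholder and the /suche/q.php pattern is only a guess based on the form above; not tested against this particular CSE setup):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.de/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.de/suche/q.php?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>
```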
I recommend checking out Swiftype (https://swiftype.com/site-search) - it's easy to set up and comes with real-time analytics that allow you to see what visitors are searching for and clicking on. Swiftype has faceted search, drag and drop result ranking and an optimized mobile search experience (for both mobile web and native apps via SDKs). Once you set up Swiftype, new content is indexed automatically so your search stays up to date.
I think you are correct about only 2 companies doing blend right now.
Of course, there are a few other sources. The clickstream companies themselves often have estimates based on clickstream alone (so not GKP) - Jumpshot and Similarweb, for example. There are also companies that use other sources, like search data from third-party search engines, and some of them offer GKP data side-by-side (not sure if any do modeling). These would be sites like the venerable WordTracker.com, KeywordDiscovery.com, and WordStream (IIRC). I did a comparison of some a few years back, before joining Moz.
Why not manually log in to or create the business accounts on Yahoo and Bing and manage them yourself?
As for Apple Maps, according to the Moz Local blog, it does take care of that for you.
First tip: don't trust GSC data as gospel. It often screws up or purposely gives you half the story.
Second tip: get some sort of rank-checking setup to track the term rankings you care about (specifically ones that help you meet business objectives).
Third tip: monitor your traffic. Without this you'll have a hell of a time figuring out whether a SERP change actually harmed you.
One tip for you. On the product landing page (https://moz.com/products/pro/keyword-explorer?), I clicked on 'Features' expecting to see a list of features for Keyword Explorer. This actually pulls up a list of features for Moz Pro in general. I think you might end up confusing people. Now that I look back at the page, I see Moz Pro in the nav bar, but it doesn't stand out that that sub-nav is for Moz Pro specifically. Just a general UX suggestion :)
Cheers for the new tool, I will add it to my arsenal.
This question was answered in Moz Q&A recently, not sure if you're the same user or it's just a coincidence so in case it wasn't you... the answers are correct.
Let me just put this link in here for all y'all Moz haters:
https://moz.com/blog/big-data-big-problems-link-indexes-compared
TL;DR: a backlink checker's value is how proportionally it represents Google's link graph.
Analysing your organic landing pages can also give you an indication of which keywords are driving traffic.
Other strategies have been outlined here: https://moz.com/blog/easing-the-pain-of-google-keyword-not-provided
There are countless beginner SEO guides out there. Since SEO is such an incredibly broad subject, I'd recommend reading through one (or preferably several) of these and subscribing to a few industry blogs. Here's something to get you started:
Everyone usually recommends the Moz beginner's guide, so I guess you can't go wrong with that. I'm not their biggest fan, though.
https://moz.com/beginners-guide-to-seo
There's also this book you could check out:
http://www.amazon.com/SEO-Like-Im-Beginners-Optimization/dp/1500865206
There are 800 thousand articles and sites for beginners, from black hat to white hat. Being informed ABOUT SEO will probably help you not get burned in the future, it's true. Let me know if I can help in any other way.
Good luck!
Don't do subdomains. Google knows about its issues with subdomain ranking and hasn't figured out a way to fix them yet.
I would also add that people should do a little bit of research to see what kind of search volume exists for their local product or service. You can use the Google Ad Planner as a good place to start.
I've seen a couple of instances where the industry term is not what a typical customer uses to describe the product or service.
Like any popular CMS, we get reports of security flaws; we handle them privately and rapidly, then release security fixes for all major branches. Piwigo has existed for several years now and is quite safe. BTW, the release notes of http://piwigo.org/releases/2.8.0 don't contain anything related to SQL injection or security fixes.
This big-ass list of outsourced content creation options (which I've contributed to) is your bible. Enjoy :)
Yep - make sure you set up the right redirects ahead of time and do as much testing as physically possible.
You will want to look into using Xenu (or similar) to make sure that the links are all working on an internal development copy of the site before going live.
As long as you do your research beforehand, get the right redirects in place, and ensure that your WordPress setup is optimised before going live, you shouldn't have a problem.
You might see fluctuations for the next couple of months, but spend the time doing all your research and extensive testing now to save yourself loads of issues in 4 weeks' time.
Good luck!
Have you tried setting up a Google Dashboard in Analytics for them? You can set it to email them a report based on the metrics you set up however frequently you like. Visuals aren't fantastic but you can get some basic pie charts, line graphs, etc.
The best part is you can use the gallery of submitted widgets/reports to pick and choose what works best for you.
If you already have data in Excel, you could also try using Google Sheets and one of the Analytics add-ons (open a Google Sheet, select Add-ons and then 'Get add-ons'). I haven't been able to play around with the few I found, but I remember some of them being pretty promising if you fiddle with them. And they can connect to your Analytics account to pull data directly, or you might be able to load in Excel data if you already have it.
*edited for clarity
Don't renew your wildcard certificate past Jan 2018; Let's Encrypt is coming out with wildcard SSL certs: https://letsencrypt.org/2017/07/06/wildcard-certificates-coming-jan-2018.html
Working on a new look for https://saijogeorge.com/css-puns/, that one is a bit dated.
Sure, that's the simplest way. WebPageTest is a great tool for assessing page speed.
Chrome and Firefox both have fantastic developer tools with similar features too.
I won't have time to search for sources, but I would recommend you to avoid AMP:
First of all, because it copies certain parts of your site to Google's servers. This reduces the privacy level of your site, and allows the big G to track your visitors in a way that wouldn't be possible without AMP (remember that the use of HTTPS protects users to a point).
Yet mainly because it is unnecessary. Get a VPS with HTTP/2 and you get almost the same result without giving Google anything.
SEO still depends on the same two things:
As you may see, AMP is not part of the equation.
~ ~ ~ ~ ~ ~ ~
If I may add a personal comment, I really wish to see Google being ignored now and then. A few years ago, they said "now everybody please add a rel="nofollow" if you are not vouching for the link", and people obeyed! Because of that, now rel="nofollow" is part of the HTML standard, causing all of us (but Google) more harm than good.
Now they ask for AMP, to get some of your website's code onto their servers. Why? Because they think they can do it better than us? Because they see themselves as the saviours of the web? I disagree. AMP does not benefit a website's performance more than correct optimization does; therefore, it is unnecessary.
At IMN we have crazy amounts of internal tools that the public doesn't have access to...But we do have a good deal of amazing free tools at: http://www.internetmarketingninjas.com/tools/
Also as far as other tools, I use ScreamingFrog quite a bit: http://www.screamingfrog.co.uk/seo-spider/
At the agency I used to work in we used Wrike: https://www.wrike.com/
We created client logins, but found that clients would rarely log in and would just ask us how things were going. It's good if you can change the mindset of your clients. ;)
I'm reading Jessica Bowman's new book right now and I'm really enjoying it. I wish I had it when I first started in SEO! https://www.amazon.com/Executive-SEO-Playbook-Company-Wide-Profitability-ebook/dp/B073X3RMVD
I don't know the impact for branding. I can say that Google has been moving in the direction of giving more weight to brand authority and domain authority, so it's almost certainly better to have all four products on the same domain (remember that you can have separate landing pages optimized for each product). A lot of companies have been ditching exact match domains in favor of brand consolidation -- Wayfair Stores is the key example: http://techcrunch.com/2011/09/01/online-retail-giant-csn-stores-rolls-its-200-shopping-sites-into-one-brand-wayfair-com/
Here are a few things you can do:
It seems that backlinks from authoritative sites have the strongest effect on local SEO too.
Here is a link to slides of a case study by Darren Shaw (WhiteSpark founder) from this year's BrightonSEO conference: https://www.slideshare.net/darrenshaw1/darren-shaw-brightonseo-2019-can-you-rank-in-local-without-a-website
Among all other things, backlinks to the site had the strongest effect on local SEO relevance (jump to slide 85).
Hope that helps.
Just beware: a canonical is not a directive the way a redirect is. I prefer to keep it simple and efficient with a proper 301.
Even if we can use a canonical kinda like a 301, it's made to help sort out internal duplicate content.
About OP's question: Google claims subdomains are fine and act as directories, if the subdomain is strongly related to the root. For example, you can open a free blog on Wordpress.com. The URL will be a subdomain. However, Google needs to figure out that it is not "attached" to the root domain and should be treated as a separate website.
When a subdomain belongs to the root, Google should be able to treat it like a directory. Is Google really able to figure it out every time? Even if they claim it's all good on their end, I would be careful.
About the loss of PageRank: there will be some in theory, but in practice it's negligible. It won't hurt your ranking if the transfer is done properly. If you have access to some backlinks, it's obviously a good idea to correct the URL.
It is also a good idea to build up new backlinks. You want to give an extra push to the new URL.
We are at a pivotal crossroads in SEO, as we recently entered the mobile-first era. SEO is no longer about MORE CONTENT, since the way we consume content on mobile devices is much different (the online attention span of Gen Z is something like 2.8 seconds). I really think SEOs are beginning to think about less content, organized better, based on the intent of the user.
I have adopted a style of optimization with this in mind and have seen amazing results (increased traffic from 30,000 to 1 million sessions per month in under one year).
If you search “attention span of gen z” you don’t want to read a 10,000 word bullshit SEO article, so just answer the question in a short manner and move on. (This example is so meta, I know).
Bonus: the one area where UX and SEO still clash badly is alt text. Unfortunately, blind folks get totally screwed at the hands of SEOs trying to stuff their keywords into alts. Sorry, blind folks. However, I do predict alt text will become irrelevant in the next 5 years as Google's AI gets better at understanding the content of images on its own.
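A hypothetical example of the clash (product and filename invented):

```html
<!-- What a screen reader user needs: a plain description. -->
<img src="mug.jpg" alt="Blue ceramic coffee mug with a white handle">

<!-- What keyword stuffing looks like, at blind users' expense: -->
<img src="mug.jpg" alt="buy coffee mug cheap coffee mugs best coffee mug online">
```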
I've done a load of writing and speaking on this topic; you can find one of the presentations here: https://www.slideshare.net/mobile/johnleoweber/the-role-of-ux-in-seo
And this also ties into ADA compliance which is a pretty interesting topic too, more about that here: https://www.searchenginejournal.com/ada-compliant-website/200106/
How often are you changing the URL of a given page?
URLs are a minor ranking factor, as I understand it; it sounds like you're creating way more trouble for yourself by changing the URL than it's worth.
> URLs are a minor ranking factor search engines use when determining a particular page or resource's relevance to a search query. While they do give weight to the authority of the overall domain itself, keyword use in a URL can also act as a ranking factor.
> While using a URL that includes keywords can improve your site's search visibility, URLs themselves generally do not have a major impact on a page’s ability to rank. So, while it’s worth thinking about, don’t create otherwise unuseful URLs simply to include a keyword in them.
The total amount of link juice a page can pass on through its links is divided up among all the links on the page. Not equally, but that is another discussion.
When you add nofollow to a link, the same amount of link juice still passes on through that link, but it does not get credited to the destination page. It just disappears into the ether more or less.
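For reference, the markup being discussed (URL hypothetical):

```html
<!-- Post-2009: this link still consumes its share of the page's
     link juice, but the destination page gets no credit for it. -->
<a href="https://example.com/some-page" rel="nofollow">anchor text</a>
```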
I rarely would point to Moz for anything useful, but the images in this article do a pretty good job explaining how it works after Google adjusted how they handle nofollow in 2009.
https://moz.com/blog/google-maybe-changes-how-the-pagerank-algorithm-handles-nofollow
Actually, /u/karmaceutical wrote a great post on the Moz blog showing that data from GSC is quite inconsistent as well: Google Search Console Reliability: Webmaster Tools on Trial.
In other words, no tool is perfect... but we don't have anything better as of today :)
When it comes to URLs you can do pretty much as you wish depending on the website. If it makes sense to have a page by area (i.e. /wedding/areas/{city} ) and a listing page that will list all the area ( i.e. /wedding/areas/ ) so be it.
From an SEO perspective, there are only very small benefits you can expect from the URL (assuming you don't make them too ugly).
I personally don't like (and avoid) having too many directory levels, just because it can quickly become a mess, and you might not want a website full of noindex listing pages (especially on a small site).
You probably saw this already, but I think this summarizes pretty much everything: https://moz.com/blog/15-seo-best-practices-for-structuring-urls (this URL is actually pretty neat)
We believe our numbers are accurate within the range ~95% of the time. Russ Jones can give more detail on this, but you can see methodology here: https://moz.com/blog/google-keyword-unplanner-clickstream-data-to-the-rescue
Basic story is that nearly every other tool relies on Google AdWords data, but Moz doesn't. We find that AdWords conflates multiple terms, doesn't effectively account for seasonality, hides volume for many terms, and uses buckets hidden behind actual-looking numbers.
If you have examples of KWs where we've reported them to be very low, but you've seen many more searches for those (via your AdWords account, which is the only real way to test for true volume anymore), please let Russ know! He's taken a look at a bunch of cases, but still feels pretty good about the methodology.
A country-specific TLD does have an impact on ranking. Here's a quite good article on the Hover website. I live in Australia, and recently searched for a local company to print lanyards - I literally only clicked on results with the local .com.au TLD.
A keyword in a TLD (eg .diamonds) does not have an impact on ranking. John Mueller from Google is quoted as saying that in the article above. However, a poor quality TLD will also have an impact. Moz advises you to avoid uncommon TLDs, but for human reasons (they're less commonly known). Some are undeniably a bit spammy, particularly .biz.
Search engines do use keywords in domains as a ranking factor. So, if you're trying to use fenetres.org, fenster.org and windows.org for a window manufacturing company, then that's probably a good thing. However, ".org" normally means, to most people, a charitable group rather than a company, and they might be a bit saddened to discover a company there.
After checking whether they drive traffic and convert (check assisted conversions too), test adding a noindex,follow meta robots instruction to the header.
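That's this tag, for reference:

```html
<!-- Keeps the page out of the index while still letting crawlers
     follow (and pass signals through) the links on it. -->
<meta name="robots" content="noindex, follow">
```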
We did this recently and it was a wash. http://www.seerinteractive.com/blog/eliminated-84-low-quality-posts/
Everett Sizemore did this on an ecommerce site and saw double digit gains (noindexed 11,000 pages): https://moz.com/blog/pruning-your-ecommerce-site
You can always remove the noindex and quickly resubmit pages/sitemaps in GSC if you see it hurting the site somehow.
Are you a professional SEO?
> and gives whatever traffic (or SEO juice) to the newsletter to your main blog URL.
This isn't correct. A canonical is not a redirect; it does not pass link juice or traffic.
Use Search Console to remove your page from the index, or add the noindex tag.
Turn this - https://moz.com/blog/build-content-keyword-map-for-seo-whiteboard-friday - into a piece of software.
Would solve so many monthly SEO management issues for me.
(karmaceutical you should create this too :)
It will tell you if they have sufficient data for a particular page. Look for Field Data and Origin Summary just below the score on the results page.
Try simple keyword tool with their new Bulk Analyze feature to quickly determine the list of your competitors and get the maximum information in the shortest possible time, or their more detailed comparison, Confront Domains, for better keyword selection.
I think GsusChord may mean https://serposcope.serphacker.com/en/. It looks good, but I haven't tried it myself.
How large is "a large number of keywords"? Are we talking 200, 2,000 or 20,000?
I would recommend Wincher, but they recently adjusted their pricing from unlimited keywords for just €2 per domain to €3 per domain with a max of 100 keywords, plus €3 per extra 100 keywords. Kind of a dick move if you ask me, but they probably have their reasons.
> Do you think I need to step up the guest blogging?

That's why I mentioned creating a network of SEOs who can leverage their network of clients to accomplish this.
I use SEO PowerSuite's LinkAssistant for automating (as much as it can be automated) the research, outreach and guest post requests. I also use Scrapebox, though with a highly optimized list and process in that software. Been using it for this for many, many years.
What I see currently on forums/blogs are people about to slit their wrists open. So many people are down and out right now from these most recent Google updates. It's pretty bad. Definitely check them out, though, so you can get a feel for the types of sites hit the most/least, and don't forget to get regular "forecast" data from semrush.com/sensor, Moz, etc.
You're looking for an online reputation management tool, here are all the vendors that are a solution for review monitoring and management + user reviews of those tools: https://www.g2.com/categories/online-reputation-management
Karma - can you talk a bit about the discrepancy between the # of links found through Moz vs Moz's index (https://moz.com/products/api/updates)?
It seems comparable to Ahrefs' index found on their homepage, yet Ahrefs consistently finds more links and RDs than OSE.
Awesome news! And thanks for commenting Russ! As a data nerd I may have a mild fanboy crush on you :p
Q: In your post (https://moz.com/blog/google-keyword-planner-dirty-secrets) it sounds like you use a regression model to compare clickstream data to Keyword Planner data and then adjust the clickstream numbers based on your model. Super cool, and I'm guessing that's what KWE uses in part to get its traffic numbers?
But if Keyword Planner muddies the waters by combining search traffic for different keywords (ex. 'apartment rent' vs 'apartment rental'), what's your workaround? It seems like you'd be using a regression model to compare clickstream data against now-flawed Keyword Planner data. Does that make sense?
If you want to download the ranking info for all your keywords (not limited to 1000), you can do this via the API (Search Console + Analytics). It's quite technical but it's not extremely difficult to setup - more info here: https://moz.com/ugc/how-to-get-the-data-you-need-from-googles-search-analytics-api/
Example: https://moz.com/blog/google-keyword-unplanner-clickstream-data-to-the-rescue
We aren't going to replace our image as a pro white hat, content-first provider, but we will append to that characterization: a damn good data and tool provider.
But then why are there other regions supported? Take for instance:
https://moz.com/explorer/keyword/suggestions?q=advocaat+bouwrecht (keyword means: 'lawyer construction law', selected region: 'Netherlands'). Here the suggestions are way off. The first couple are good, they look like they're coming from Google Suggest. But then it becomes interesting. There once was a soccer coach called Dick Advocaat. He's not a lawyer ('advocaat'), but that's just his last name. And then there's a whole list of soccer coaches as keyword suggestions :)
> Some of my research tells me that when they launched their new website in May, the amount of pages indexed decreased substantially.
Have you set up a multi-sitemap approach?
If you don't mind me asking, what kind of a living are you making at this?
Most of the information I've found points to good money to be made in this area.
https://moz.com/industry-survey
I know it's a couple years old, but a median salary of $60,000 seems pretty good to me.
I have two pieces of anecdotal evidence as well. I spoke to the CEO of a San Francisco marketing firm. He suggested I get into SEO due to skills being emphasized over degrees, and said that a good SEO can make $100,000 a year. I spoke with the marketing director (who also does some SEO) of Glassdoor, and he said that he was prepared to pay $130,000 a year for an SEO specialist and it took him 6 months to fill the position.
I hear what you're saying, but no matter what area I get into someone is going to tell me why the market is tough. I get that.
If SEO is really that bad, where would you suggest I go?
EDIT: Additionally, I'm using this as a vehicle to teach myself a number of useful skills: HTML, CSS, and JavaScript for starters. The goal is SEO but, as I understand it, I'll need those skills in some way, shape or form to be effective at this work. I can learn others as I move on as well.
This is a pretty solid checklist, much as I roll my eyes at Moz much of the time: https://moz.com/blog/seo-tips-https-ssl
The key thing to remember is that an https and an http site are considered entirely different sites. So you have to treat it as you would a site migration. The two protocols use entirely different ports.
If there is complexity to your site (CDNs, widgets, etc.) you need to ensure they all move.
This is an interesting subject, actually. I attended BrightonSEO last year, and Pi Data Metrics seem to have come to the conclusion that multiple pages ranking for the same query could be the cause of a page not ranking as high as it should. You can see a post on YouMoz where someone has asked a question similar to yours: https://moz.com/community/q/similar-pages-on-a-site
You can find more on the case studies here: https://econsultancy.com/blog/64992-are-your-own-sites-harming-your-seo-strategy/
Whether I totally buy into this is another question though. Would be good to get everyone's thoughts on this.
The website I work on ranks three times for the same lucrative query in P1, P2 and P3. That's not something we'd try to change obviously!
Sounds like you may be more interested in CRO (Conversion Rate Optimization) than SEO. There is inherently some overlap between the two, but I'd start by searching for CRO info/books. No books immediately come to mind, but I've had this bookmarked.
I'm currently in charge of reducing the page load time on our site. www.webpagetest.org helped me a lot, together with this Moz article where they explain how the waterfall diagram actually works. Most of the time the site can be made much faster just by doing: server optimization, image compression, combining/minifying JavaScript/CSS resources, and caching.
In addition to the two links /u/gensher posted, [Moz covered this a few years ago](https://moz.com/blog/rel-confused-answers-to-your-rel-canonical-questions):
>That’s a tougher question. First off, Google may choose to ignore cross-domain use of rel=canonical if the pages seem too different or it appears manipulative. The ideal use of cross-domain rel=canonical would be a situation where multiple sites owned by the same entity share content, and that content is useful to the users of each individual site. In that case, you probably wouldn’t want to use 301-redirects (it could confuse users and harm the individual brands), but you may want to avoid duplicate content issues and control which property Google displays in search results. I would not typically use rel=canonical cross-domain just to consolidate PageRank.
I have been dealing with these issues a lot. My site has full-text syndication deals with Yahoo and other big news sites. Your best bet is to ask for all of these: canonicals, citation links, noindex.
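For reference, this is what you'd be asking the syndication partner to add to their copy of each article (URLs hypothetical):

```html
<!-- Cross-domain canonical pointing back to the original: -->
<link rel="canonical" href="https://www.example.com/original-article">

<!-- Plus a visible citation link back to the source: -->
<p>This article originally appeared on
  <a href="https://www.example.com/original-article">Example.com</a>.</p>
```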
If you are looking to do your own local SEO, here are a couple of good resources: https://moz.com/local and http://www.localseoguide.com. Tell Andrew I sent you (no affiliation, just good PR). Good luck.
(A heads up I’m likely biased on this one as I work at Raven, but we've also got tons of love for Moz. Great tools and they've been brilliant at educating folks as the industry evolves.)
I'd say if you haven't tried both trials, give them both a spin. I know at least the Raven trial doesn't require a credit card, so you can poke around without committing to anything.
We’ve found loads of folks use Raven alongside Moz (like THEsolid85 and jiminy_christmas have pointed out) plus folks often use one or both of us alongside Buffer, MajesticSEO, the list goes on, so based on your needs it may make sense to use one, both, or your own custom suite of tools.
Let me know if I can help with any questions as you review your needs!
Here is a link to the site. I'm currently using prerender.io to serve up static versions of the dynamic content. I've had no problem getting indexed, but I'm thinking my root domain won't rank in non-English countries because spiders will see the site as defaulting to English. Unless they use separate user agents for each country, which would trigger that language to display as default; then they could display the appropriate title/description in the SERPs... Uncharted territory for me.
The site isn't for ecommerce. Here it is: it's a web app for teaching users how to code. My only concern is that users see challenges in their native language and that the root domain ranks for relevant terms in the respective languages.
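One hedged option, assuming you eventually publish language-specific URLs (the paths here are hypothetical): hreflang annotations with an x-default for the root, so crawlers don't have to rely on user-agent or language detection at all:

```html
<!-- Served on every language version, including the root: -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
```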