How to Solve 6 Brutal Problems in Google Analytics

Ever have one of those days where you’re digging through Google Analytics (GA) and you say to yourself “What the f*@$ is this s%$#!?!?”

Yeah – me too.

Don’t get me wrong – I love GA. It’s way easier to navigate than most of the other analytics platforms out there. But just like its users, GA isn’t perfect.

As an expert on both analytics and all things brutal (I'm a big death metal fan), I'd like to call out some of the most frustrating aspects of GA, as well as solutions to work around these obstacles.

NOTE: These brutal issues will be listed from the simplest to the most complex. If you’re new to GA, start reading here. If you’re a GA wizard, feel free to scroll down to more complex topics.

Problem: Limited visuals

When you present data you want your charts to look sparkly clean. GA’s basic visuals are fine, but when you try to show multiple metrics, segments, or time intervals…things get ugly:

[Screenshot: looks more like a Richter scale than analytics data]

Even with simple data points, the visuals in GA aren’t great:

[Screenshot: a default GA chart]

We want sexy data visualizations, not the bare minimum. The visuals can't be customized within GA, so we have to look elsewhere to enhance our charts.

Solution: Microsoft Excel

Export the data as a CSV, open it in Excel, and create your own visuals. Want to pick your favorite colors? Done! Want the labels to actually be readable from a distance? Done!

[Screenshot: the same data charted in Excel]

If you’re lacking confidence in your Excel skills, check out my deck on visualizing analytics data in Excel. It’ll show you the basics on making kick-ass charts from your GA data.
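If you'd rather script your charts than build them by hand, the same idea works outside Excel too. Here's a minimal sketch in Python (pandas + matplotlib), assuming a hypothetical export named ga_export.csv with "Day Index" and "Sessions" columns – rename them to match whatever your export actually contains.

    # A minimal sketch: chart a GA CSV export with pandas + matplotlib.
    # "ga_export.csv" and its column names are assumptions - rename to
    # match your own export.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("ga_export.csv", parse_dates=["Day Index"])

    fig, ax = plt.subplots(figsize=(10, 4))
    ax.plot(df["Day Index"], df["Sessions"], color="#1f77b4", linewidth=2)
    ax.set_title("Sessions by day", fontsize=14)
    ax.set_ylabel("Sessions")
    ax.tick_params(labelsize=12)   # labels you can read from a distance
    fig.autofmt_xdate()            # tilt the dates so they don't collide
    plt.tight_layout()
    plt.savefig("sessions.png", dpi=150)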

Problem: Names of metrics and reports keep changing

One day you go into GA and try to look up how many unique visitors came to your site. But then you can't find that metric, or other metrics, or even the Channels report you looked at yesterday.

Maybe you think you’re going crazy. It was all here yesterday!

You’re (probably) not crazy. GA decided to rename metrics and reports out of the blue. Now your unique visitors are called users, your visits are called sessions, and your Channels report is hidden under the All Traffic button. How dare they make you click one more button to find your report!

Solution: Search box

You can use the search box in the top left corner of the interface. If you need the Channels report, instead of clicking around the dropdown lists you can type it in the box:

[Screenshot: typing "Channels" into the search box]

And select the report from the results:

[Screenshot: selecting the Channels report from the search results]

Note that you can also use this for finding recently viewed reports.

As for the renamed metrics, be on the lookout for announcements from GA when these changes occur. Here’s a quick list of common metrics from Google’s last renaming batch in 2014:

  • Visits are now Sessions
  • Visitors are now Users
  • Avg. Time on Site is now Avg. Session Duration

Problem: Data sampling

When you try to look at lots of data (like millions of sessions) with several dimensions and segments, eventually you'll hit a data wall. GA will build your report from a sample of less than 100% of your sessions. It usually looks like this:

[Screenshot: GA's sampling notice]

This can be especially frustrating when you've already selected the option telling GA to accept a slower response for greater precision. Response time is exactly why the sampling occurs. GA doesn't want to spend all day building out your report, so it provides a sample.

Solution(s):

  1. Purchase GA Premium to eliminate almost all sampling from your reports. You put down the cash and GA will work harder to bring you all of your data. But Premium costs $100,000+, so let’s assume that isn’t possible.
  2. Slim your report down to the essentials. Strip away extra segments and dimensions to obtain the largest sample possible.
  3. Worst case scenario – split the time interval into smaller parts and move the data into one Excel file. This is crazy annoying, but it technically works (a quick sketch of the stitching step follows this list).
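For option 3, here's a rough sketch of the stitching step in Python with pandas. The file names and column headers are hypothetical – adjust them to match your own exports.

    # A sketch of option 3: combine several smaller, unsampled exports
    # into one dataset. File names and column headers are hypothetical.
    import glob
    import pandas as pd

    frames = [pd.read_csv(path) for path in sorted(glob.glob("ga_export_*.csv"))]
    combined = pd.concat(frames, ignore_index=True)

    # Re-aggregate, since the same landing page can appear in every export
    summary = (combined.groupby("Landing Page", as_index=False)["Sessions"]
               .sum()
               .sort_values("Sessions", ascending=False))
    summary.to_csv("combined_unsampled.csv", index=False)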

Problem: Traffic segments aren’t accessible in all reports

Custom segments are a fantastic way to view specific parts of your data. They're easy to apply: click on the top ribbon, find (or create) the one you need, and you're good to go!

[Screenshot: the segment ribbon at the top of a report]

But if you venture to the Multi-Channel Funnels report, you’re screwed:

[Screenshot: the Multi-Channel Funnels report, with no segment ribbon]

Same goes for the Goal Funnel report:

[Screenshot: the Goal Funnel report, also with no segment ribbon]

C’mon GA! What’s up with that?

Solution: Write Google an expletive-filled letter of complaint

I'm kidding, of course. But you'd think segments would be available consistently across the entire interface by now. Until then, we have to wait for the GA team to mercifully add the segment ribbon to these reports.

Problem: Report filters lack an "or" option

Let’s say you want to view all of your pages based around your site’s newsletter. The URLs contain either the word “mail” or “newsletter”. You can make a report filter to find URLs for mail, but when you want a filter to also capture newsletter…uh oh:

[Screenshot: the report filter options]

We can only select “and”, meaning that the URLs would have to include both of those words. In this scenario the filters wouldn’t work.

Brutal.

Solution: Regular Expressions (RegEx)

This nifty language can help you access any combination of dimensions you’d want in a report. It even has a character that represents the word “or”, the vertical bar | (that’s not a lowercase L, or an uppercase i, but a vertical bar |).

If you select RegEx in the filter options, you can create your filter with a single condition:

[Screenshot: a single RegEx filter using mail|newsletter]

New to RegEx? No problem! Here’s a handy cheat sheet, as well as this free RegEx testing tool to verify whether or not your expressions are capturing what you need.
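If you want to see the alternation idea outside of GA, here's a tiny sketch using Python's re module. The sample paths are made up, but the vertical bar works exactly the same way in GA's filter box.

    # The same "or" logic GA's RegEx filter uses, shown with Python's re
    # module. The sample paths below are invented for illustration.
    import re

    pattern = re.compile(r"mail|newsletter")

    paths = [
        "/newsletter/signup",
        "/mail-preferences",
        "/blog/analytics-tips",   # should NOT match
    ]

    for path in paths:
        result = "matches" if pattern.search(path) else "no match"
        print(f"{path}: {result}")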

Problem: Multi-Channel Funnels revenue isn't consistent with other reports

My colleague Michael Wiegand brought this one up. When you go to the Multi-Channel Funnels (MCF) report, direct traffic is awarded conversions differently than in every other GA report. Google has its reasons, but I'll make it simple:

If a conversion takes place during a direct visit, the MCF report awards the conversion to the direct channel. Every other GA report defers the conversion to the last non-direct visit from the user who just converted.

For example, here’s revenue in the Channels report:

[Screenshot: revenue by channel in the Channels report]

Then the same profile in the MCF report:

[Screenshot: revenue by channel in the Multi-Channel Funnels report]

They both add up to the same total revenue, but the distribution is inconsistent.

Some argue that this is more accurate since direct traffic is not ignored. But it creates an inconsistency when using multiple reports.
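To make the difference concrete, here's a toy sketch of the two attribution rules. The conversion path is invented; the point is that the MCF report credits the final (direct) visit, while other GA reports fall back to the last non-direct visit.

    # Toy illustration of the inconsistency. The path below is made up.
    path = ["organic", "email", "direct"]   # visits, oldest to newest

    # MCF-style last click: credit the final visit, even if it's direct
    mcf_credit = path[-1]

    # Standard GA reports: credit the last NON-direct visit if one exists
    non_direct = [channel for channel in path if channel != "direct"]
    standard_credit = non_direct[-1] if non_direct else "direct"

    print(f"MCF report credits:      {mcf_credit}")         # direct
    print(f"Other GA reports credit: {standard_credit}")    # email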

Solution: Less emphasis on last-click attribution

This brutal situation is an important lesson for all analytics users: last-click is not the only attribution model. Michael has an excellent post on the flaws of viewing conversions with only last-click attribution.

Go play around in the Attribution Model Comparison tool to get more thorough insights on how your channels contribute to your conversions.

Next time you start to have a meltdown while in GA, just remember that many issues on the platform have solutions. Even if there isn't a fix today, Google updates the platform at least once per year to help solve these issues.

But don’t hesitate to take a screenshot of an issue and tweet them “WTF?” – they need the feedback.

Guide to Personalized Search Results

If you grew up watching Sesame Street like me, you might have heard this song:

One of these things is not like the others,
One of these things just doesn’t belong,
Can you tell which thing is not like the others
By the time I finish my song?

The search results you see in your browser are not the same as everyone else's; each person sees different results. That's because the magic elves who place links on Google's results pages know that not everyone is the same, so they customize your search results to better fit your needs. This personalization is driven by multiple factors, and from those signals Google serves you more relevant results and gets you to the page you're looking for.

What affects my search results?

There are many factors that go into personalizing your search results, but here are some of the top ones:

Location

Google knows where you sleep. They also know where you work, go to school, and where you go on your weekends.

Don’t believe me? Take a look at this:
[Screenshot: Google knows where you have been]

This is the location data Google has collected on me for the past 30 days.

Of course, I have an Android phone and take Google everywhere I go, but have a look here and find out if Google already knows what you did last weekend:
https://maps.google.com/locationhistory/b/0

This precise location data allows Google to give you information based on your current location as well as the places you have visited in the past.

If you are not connected to Google via a mobile device, it will get your location from the IP address of your internet connection. That may not be as precise as GPS, but it gives them the general area where you're located.

This location data is used to help you find information on nearby restaurants or other local businesses. These custom results are very helpful to the user, but in my tests they caused the biggest fluctuation in the rankings.

[Screenshot: search results with local listings highlighted]

You will also see local results from a couple different sources. One source is the content on your site. Google will look for the best content based on the location and the search query. These results will show up in the regular organic results (see the blue highlighted listings above).

There is also a section of local listings grouped within the search results. This data comes from Google My Business listings; Google finds local businesses near your location and places them on a map to help you find a store near you.

Search History

Google tracks the different terms that you search for to help understand the context of your search. Google first announced personalized search way back in 2005, which used your personal search history to influence your results. This was only available to users that had a Google account.

Then four years later, in 2009, Google announced that it was giving personalized search to everyone whether they were signed into their Google account or not.

 Previously, we only offered Personalized Search for signed-in users, and only when they had Web History enabled on their Google Accounts. What we’re doing today is expanding Personalized Search so that we can provide it to signed-out users as well. This addition enables us to customize search results for you based upon 180 days of search activity linked to an anonymous cookie in your browser. It’s completely separate from your Google Account and Web History (which are only available to signed-in users).
http://googleblog.blogspot.com/2009/12/personalized-search-for-everyone.html

 So you can see that Google remembers your other recent searches. The image below shows the same search query for ‘JavaScript,’ but you can see personalized results on the right based on previous searches.

[Screenshot: two sets of 'JavaScript' search results side by side]

In the search on the right I searched for ‘Programming Textbooks’ and ‘Books on HTML’ before I searched ‘JavaScript’. This changed the results by bringing in three book listings that were not on the original set of results at all.

Web History

If you are signed into a Google account and use Chrome or the Google Toolbar, your web history is being collected and stored in a vast Google data center somewhere. Google uses this web history to learn what kinds of sites you like and bases your search results partly on that.

When testing this, I saw Twitter rise in the rankings over Facebook since I tend to visit Twitter more often. Otherwise, I didn’t see any major changes.

Google+

When you create a Google+ account, you give Google a lot of demographic data about yourself, including your age, sex, where you live, other places you used to live, where you work, who your friends are, and what your favorite '80s TV show is (mine is Misfits of Science).

You would think they would use this demographic data to target you, but during my tests I didn't see any clear indications of this. The only major change I saw based on Google+ was the addition of reviews or ratings by people I have in my circles.

[Screenshot: local results showing reviews from people in my circles]

I moved to Seattle about the same time that Johnathon Colman moved to California, but I have been followed by his ghost ever since. I have Johnathon in my Google+ circles and because of that he shows up every time I’m looking for local businesses.

I didn't notice any of these reviews changing ranking positions, but they do make the listing more visible, which would likely increase its click-through rate.

What does this mean to me?

There is no consistent search experience because of personalization. This means that you can track the keyword rankings for your site using generic, non-personalized search results, but they don't match up 1:1 with what your customers are seeing. It's still OK to track your keywords, but you need to realize that they aren't giving you the full picture of what is going on in the wild. Use these rankings to see how you are trending, not to obsess over where a specific keyword ranks this week.
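If you do keep an eye on rankings, treat them as a trend line. Here's a minimal sketch (with invented rank numbers) of smoothing daily positions into a rolling weekly trend:

    # Smooth noisy daily rankings into a trend. The rank values here are
    # invented purely for illustration.
    import pandas as pd

    daily_rank = pd.Series(
        [4, 6, 3, 5, 7, 4, 5, 6, 4, 3, 5, 4, 3, 3],
        index=pd.date_range("2014-08-01", periods=14, freq="D"),
    )

    trend = daily_rank.rolling(window=7).mean()
    print(trend.dropna().round(1))   # direction of travel, not today's rank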

When you are trying to increase the rankings of your site, it is best to take a holistic approach and include onsite and offsite optimization, localization, and social visibility. All of these factor into your rankings and will help you increase your search visibility in personalized and non-personalized search results.

How to Go Incognito for International Search Queries – and Why

The first thing to do when you’re starting out your quest to be the master of international search is to go incognito. You can’t just search on Google.com for your product/service and expect to see the same thing that everyone else in the world will. Google personalizes search results according to your location, your search history, your Google+ social circles, and more. You have to de-personalize yourself in order to see results that are closer to what your audience might see.  Let’s take this one step at a time.

How to unGoogle Yourself:

I used to manipulate the search query string itself in the URL bar to make it do what I wanted. Now, I’m too lazy for that, and there are ways to automate the process. But first, let’s look at the search query bar to learn what all the moving parts are:

[Screenshot: the anatomy of an international search query URL]

You could technically just play around with this search bar until you get the results for the country and languages you're targeting. But not everyone knows what the country and language codes are, so here are some work-arounds.

There are a few methods to manually set this up:

  • Sign out of all your Google accounts
  • Clear your browser’s cache and cookies
  • Change your location in Google using the Search Tools box. Note: you have to set your location according to the country-coded top-level domain that Google is currently using. So if you are on google.com, you can only select a location within the USA.
  • Use the Incognito Browsing mode in Google Chrome. When you fire up Chrome, click the icon on the far right of the search box that looks like three horizontal bars, and then select Open New Incognito Window. You’ll know you’ve done it correctly when you see:

[Screenshot: Chrome's incognito window]

One of the easiest ways to automate this process is to go to http://www.impersonal.me, put in your site, and then select one of the presets that matches the country you want to target, or select Options in order to change the interface language, the TLD, and the location of the search. Then impersonal.me does the rest. An alternative is http://isearchfrom.com.

A more advanced option is to use a proxy, or alternatively, a VPN. Make sure you are using a trusted proxy, not one of the freebies.
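And if you're still fond of the query-string approach mentioned at the top of this section, here's a minimal sketch of building a country- and language-targeted search URL in Python. The hl (interface language), gl (country), and pws=0 (turn off personalization) parameters are the commonly cited ones – treat their exact behavior as something to verify, since Google can change or ignore them at any time.

    # A sketch of the manual query-string approach. Parameter behavior
    # is an assumption to verify - Google can change it whenever it likes.
    from urllib.parse import urlencode

    def google_search_url(query, country="gb", language="en"):
        params = {"q": query, "gl": country, "hl": language, "pws": "0"}
        return "https://www.google.com/search?" + urlencode(params)

    print(google_search_url("daniel day-lewis"))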

Let’s Just Talk About Daniel Day-Lewis for a While

Why the British Daniel Day-Lewis was so unbelievably perfect as Abraham Lincoln, my American brain will never understand. Anyway, I did some digging around between the US and UK Google search results to find some interesting results on my favorite actor.

The first thing I noticed is that, regardless of whether I was signed in or out of my Google accounts, nothing in the search results changed. This means that no one in my Google+ circles cares about Daniel Day-Lewis or has shared anything about him that Google thought I might like to see. In other words, my Google+ friends need to focus more on what really matters in life, like fine method acting.

The second thing I noticed while searching on Google.com is that the 11th result (below the in-depth articles) was an article from http://www.dailymail.co.uk titled "Daniel Day-Lewis to receive a knighthood," which shouldn't have surprised me because if Sir Elton John is a knight, Day-Lewis deserves to be one too.

[Screenshot: the dailymail.co.uk knighthood result on Google.com]

This search result is puzzling because the rest of the first-page results all come from US top-level domains, but not too puzzling when we remember that Day-Lewis is British, so there are bound to be some British articles that float to the top, even in the US.

Sure enough, when I switched over to GB (Great Britain) as my location, the dailymail.co.uk result was #2, behind the British newspaper The Guardian, which had also made an appearance at #7 in the In-Depth Articles section on the US results. The Guardian is a bit of an interesting exception because they recently migrated from .co.uk to .com, so I am assuming that Google thinks they swing both ways.

So, the 2 British sites that made it onto the first page in the US also dominate the search results in the UK. But other than those two results (and the resource box on the right), there is no other overlap in organic listings.

The US results:

[Screenshot: de-personalized, signed-out US results for Daniel Day-Lewis]

The UK results:

[Screenshot: signed-out UK results for Daniel Day-Lewis]

The most important thing to notice here is that if you are searching from the US, you get a fairly different set of search results than if you search in the UK for the same thing. Also, it appears that in the US, search results are more oriented towards Daniel Day-Lewis as an actor, while in the UK, the search results are more oriented towards him as a person. Intriguing.

Tools for Tracking Keyword Rankings Globally

There are some great tools for tracking keyword rankings in different locations across the globe. Advanced Web Rankings and SEMRush are two that come to mind.

In Advanced Web Rankings, or AWR, you can use their online web app or their desktop tool to create custom reports. You can create a project in which you can designate websites to track, such as your website alongside your competitors, your targeted keywords, and specific search engines.  You can target search engines by country or by region, such as Google Organic USA (loc: salt lake city, ut), which means that AWR will look at keyword rankings specifically as if they were located in Salt Lake City. You can also select different languages for the search engines. For instance, German search results in Switzerland as opposed to French results in Switzerland.

In SEMRush, when you log in to your online account, you can view the organic positions report and then click through the different country tabs to view your rankings in those regions. But you can't select different languages, so you don't have as much flexibility as with AWR. SEMRush has 26 countries to select from, while AWR has hundreds, including major search engine indexes in the main regional languages for each country. Also, in AWR you can select multiple search engines per project, while in SEMRush you have to create a separate project for each search engine you want to track. So AWR wins this battle.

Parting Thoughts

People all over the world might be searching for your product or service and are probably seeing very different results based on their location. It would be a false sense of security to think that your global rankings are as good as the results in your neighborhood. In order to determine what your customers are most likely seeing on Google, you need to go incognito. This is just the first step in evaluating and tracking your SEO success on a global scale, but it’s a crucial one.



Team Portent Weighs In On the Loss of Organic Keywords

[Image: a length of rope almost broken with the strain]

Well it finally happened. In what they claim is a move to make search data more secure, Google has begun to encrypt all searches, effectively placing all organic traffic into the (not provided) category. This means business owners will never see the keywords people used to get to their site.

We’ve already gone over what this means for SEO but, since big changes like this are always accompanied by big opinions, I decided to ask around Portent and see what people here thought about this new era of SEO. With that, let’s meet the players:

Ian Lurie, CEO, Founder, Dictator. Probably already knew this was happening from some important-people news feed.

Elizabeth Marsten, Senior Director of Search. The Commander Riker to Ian’s Darth Vader aboard the Galactica.

Josh Patrice, Director of SEO. “Directs” the SEO team through the troubled waters of the industry. Recently lost compass.

Michael Wiegand, Senior PPC & Google Analytics Strategist. Still has all his keyword data.

Aviva Jorstad, Director of Accounts. Courier of terrible and depressing news from SEOs to clients.

Ken Colborn, SEO & Analytics Strategist. Informed team of Google’s update by loudly sobbing into his keyboard like a little, baby girl.

Travis Brown, Offsite SEO Strategist. Couldn’t be happier that Google is making life miserable for other SEOs.

Nick Bernard, SEO Strategist. Lives in Montana. Keyword research process almost certainly involves fly fishing in some form.

Marianne Sweeny, Senior Search Strategist. Has been warning colleagues, clients, and people walking on the sidewalk about this for years.

Matthew Henry, SEO Developer. Half robot, half cyborg, half wizard.

David Portney, SEO Strategist. (bio not provided)

Kiko Correa, PPC Strategist. Uses the word “clicks” in almost every sentence, except when talking about “cliques.”

George Freitag, SEO Strategist. Author of this article, making his opinions the most important.

Where were you when you found out about Google’s switch?

Aviva: At the airport bar in DC, checking Facebook on my phone. Ian had shared the SEL article.

Elizabeth: At my desk, smashing through emails.

David: Working at my desk.

Michael: Probably eating a sandwich.

Matthew: Sitting in my flooded apartment trying to roll out some code.

Ken: I heard about it first from Twitter and the angry mob of SEOs declaring the end of the world.

Travis: At my desk across from THE Ken Colborn during a beautifully dreary morning.

George: Eavesdropping on Travis and Ken.

Kiko: Looking at a search term report in AdWords.

What were your initial thoughts on the move?

Marianne: “No surprises here.”

Josh: I don’t think that we can print my initial thoughts. This is a family blog.

Aviva: The writing’s been on the wall for nearly two years. We knew this was coming.

Elizabeth: Well that’s a hell of a thing… but I’m PPC oriented, so it really doesn’t affect me. If anything, I just got more useful.

Matthew: I figured they would do this eventually, but I was surprised they did it so soon, and so completely.

Michael: It’s a token gesture on Google’s part to their searcher base. But ultimately, one that’s likely to garner them respect outside of the search community.

Ken: I was surprised at first. While it was a move that I was expecting, I didn’t think it would happen this soon. I thought it was going to be a more gradual move over the next year.

Nick: Like most people I’m sure, I’m surprised they just flipped the switch and turned it secure for everyone. I thought they would drag it out some more in little increments, like “This month, all searches from BLACKBERRIES are secure!” (Were they already?)

Travis: Google has been moving towards this direction for a while, and it was only a matter of time. While there are going to be negative side effects and an adjusting period to having no data, the future will be better because we will not be slowly hemorrhaging data for years to come. Instead, it is all gone now, and we have to adjust now. In the end, it is going to be more “wheat from chaff” for agencies.

Kiko: Thank goodness I work in PPC. Total job security… for me.

How do you think this will impact the industry?

Ian: Keyword rankings and data became a poor metric several years ago, when personalized search hit. If you haven’t already changed your focus to true KPIs that impact the business, and started treating SEO as a single channel in a larger strategy, then this would be a reallllly good time to start.

Matthew: People will scream bloody murder for a while, then everyone will eventually calm down and adapt to make use of what information we do have.

Josh: Well, the levels of panic are going to rise in the near future, but if we’re really doing our best to optimize a site, then we don’t necessarily need this data. Sure, it’s helpful; we can build content around long-tail queries, we can chase changes in the lexicon for a site or a product or a category, and we can make assumptions around our audience. In the end, most of the time what we as SEOs really need to be doing is putting the right content on the right pages. I feel that we do that already.

Aviva: There will be freak-outs. There will be outrage. For content-focused, whole-brained Internet marketing agencies like Portent, not much – in fact, in many ways it sets us apart from the pack. My point is, thinking in terms of individual keywords is really, really limiting. As marketers, this move is exciting because it forces companies to be more strategic and holistic with their online marketing efforts. At Portent, we’ve always pushed clients to start from a higher level, and approach SEO as an integrated effort that is part of everything they do online. Now, we have more leverage to push high-quality, link-attracting, and social-buzz-getting content. We have more leverage to talk about user experience and site speed. We have more leverage to stop the obsession with keywords and rankings and look at overall visibility. Can you tell I’m excited?

Marianne: Without organic keyword data, keyword research will have to change as ad behavior is markedly different from organic data. User experience practices will become instrumental in optimizing websites for organic ranking.

Elizabeth: After the crying dies down and we all remember that Google is a privately held company that can do anything they want to in reality, 3rd party tools are going to become a booming industry, anyone who can do correlation fun (i.e. with paid search keywords) is going to enjoy job security, and I think we’re going to see a lot more innovation over all. New tools, new technologies, new math even.

Travis: Rankings may not return as the KPI to watch, but they will continue to be an indicator of performance. Google could get more people running advertisements or paying for the data. Using an analytics platform to appropriately segment attribution and measure page-level performance will be even more important. From a link building perspective, it is a non-issue. There are more interesting KPIs for off-site to judge performance, and anchor text should already be diversified.

Michael: SEOs will look for more creative ways to siphon data from PPC. Ironically, there's a new report that shows the click-through balance on a given term when you have paid running, when you have an organic listing, or when you have both. Additionally, Google Analytics' eventual move to tracking users instead of cookies will render a lot of what we used to ascertain through search queries – customer intent, namely – useless, as we'll get a much bigger window into how many visits and influences lead to a purchase. We'll need to start solving for the entire marketing mix and not just one keyword on one channel.

[Image: keyword spy]

Google claims that they are doing this for enhanced security. Do you feel there is any legitimacy to this reasoning?

Josh: Ha.

Ian: I question Google’s statement that this is privacy-motivated, given that they still store the data (I’m sure) and they still show all AdWords clicks.

Kiko: Did someone say enhanced? Seriously though, are you implying Google would have an ulterior motive behind hiding keyword info behind a pay wall? $hocking.

David: At SMX Advanced 2013, Matt Cutts passionately argued this as a justifiably important protection for searchers, but that seems hypocritical when the data is available if you pay for it via AdWords advertising.

Matthew: Nope. I think they are withdrawing this information because they have no real motive to give it to us, and because the information makes it easier to manipulate.

Elizabeth: No, it’s crap. That’s the kind of thing that’s thought up by a scriptwriter for a movie or TV show to cover up the real reason. I’ve got cable. There have to be other mitigating factors and one of them (it wouldn’t surprise me) has to be around the fatigue of fighting spam and jerks trying to “game” the system. Take away the stuff they’re using to do it and you’re left with fewer options. Like creating good content for people.

Michael: While I think they've responded appropriately to NSA activity in general by crying out for more government transparency, I think the query data they're storing to benefit their AdWords user base is at odds with any legitimately good motives they might have on the privacy/security front.

Aviva: User data is still available for sale. And we have encryption technologies that make it possible to protect users, which are or will soon be enabled anyway. So no, this has nothing to do with security.

Marianne: Google's justification fig leaf of protecting privacy is very small and extremely thin. User privacy was never compromised, because the keyword data was not accompanied by the data points of who and where. Also, Google still retains all user data for use at their end. How private is that? IMHO, the motivation for this move on the part of Google is all of that tantalizing Big Data and its richness of actual user behavior data.

Travis: Yes, there is legitimacy to the Big Bad Wolf’s reasoning. What is not legitimate is keeping the data for paying parties. To reinforce their claim, Google is moving towards more transparency by showing the amount of requests they receive from government agencies and probably would do more if they were legally able. Recently, there has been buzz about tracking users without cookies. How Google accomplishes that will be a huge hint at how they truly feel about privacy and whether they are walking the walk.

George: I do think that Google being able to state that they no longer give your search data to marketing agencies can play pretty well for them from a political standpoint. Even if it is a totally empty gesture.

Any final thoughts on the matter?

Ken: While we lose some valuable insights on keyword data, our main goal should stay the same: create great content that is truly useful to our customers.

Kiko: In all honesty this seems like a poorly motivated move by Google that will have an unintended positive impact on marketing. Crap marketers will still be crap, but they'll have one less leg to stand on. After the initial shock to client expectations, people doing the real quality work will have no problem getting the job done.

David: We just have to adapt accordingly. Search marketing will undergo radical changes as Google works toward its dream of a “Star Trek” computer and continues to serve itself and its shareholders, being a publicly traded company and all.

Matthew: The SEO industry has always had to adapt to squeeze as much use as possible out of very limited information. When we are given something useful, and then it is later taken away, it’s easy to fall into a sense of entitlement. “Google OWES us that keyword data!” but, of course, they don’t really owe us anything.

Michael: AdWords will still have a ton of data that’ll be useful for SEOs. Hopefully this’ll be the (albeit awful) thing that drives legitimate cooperation between organic and paid search folks for good. We’re in the same game and it’s been stupid of us to create these borders – blog posts about cannibalization, mainly – between our goals, which should be to grow search holistically.

Travis: ¡Viva la Rankings!

What were your thoughts about Google’s switch to secure? Do you have any questions? Any tips? Share your thoughts and stories below and keep the conversation going!

A Day in the Life or: How I Learned to Stop Worrying and Love (not provided)

I read the news today, oh boy…

Danny Sullivan announced that Google is actively moving towards 100% encrypted search results. This should come as no surprise to anyone in the industry. When Google introduced secure search back in 2011, we saw the writing on the wall. As Google increased their number of users, the number of (not provided) results would increase as well. The percentage of (not provided) traffic grew from 3% to 5% overnight, and upwards to 10% by the first quarter of 2012.

We’ve been able to get by for the past two years. We’ve explained to our clients what (not provided) means, why it exists, everything.

and though the news was rather sad…

Though it might not be tomorrow, next month, or even this year, soon enough 100% of Google’s organic traffic will be unknown to us. Sure, there will be a number of hacks we can use to infer data, or we can just lean heavily on AdWords (shut up and take my money!) to provide the information, but the truth is keyword-based marketing as we have known it is dead. We will no longer know what keyword drove that visit that drove that sale. It won’t exist. Not in Google Analytics, not in Site Catalyst, not in your log files, nowhere.

This has led to quite a bit of panic.

People are assuming that we’ll have to spend thousands of dollars on AdWords just to see what terms are working, and to find what terms we need to build a strategy around. Some are claiming that all the data Google collects is going to go away eventually, and that we’re heading to a paid-only world.

There’s even talk that we’ll have to start tracking keyword rankings like a hawk. That the only metric to determine keyword effectiveness will be ranking, and that we’ll map each page of a site to just a set of unique terms and we’ll weigh how much traffic those pages generate by how successful we are.

I just had to laugh…

At the end of the day, this really changes nothing.

While it has been very useful to have that data over the years, and I know we’ve all been able to glean new ideas for pages, blog posts, etc., it’s not crucial. That’s because, when you take a step back, everything we do is still just a part of Web marketing. In fact, we view SEO as a result of proper marketing.

If your site has a thorough hierarchy, uses clear architecture, is well designed, provides a good user experience, is fast, and has proper title tags, headings, well-crafted content, etc. then it’ll likely do just fine. It will get links and visits and will rank for whatever term you like. Write content that speaks to what you’re selling whether it’s a product, a service, or just an idea. We’re afforded a great opportunity as Web marketers; our audience already knows that they want something akin to what we’re offering. All we have to do is close the deal.


Seal the Deal

I know that I'm not going to start relying solely on rank tracking to convey keyword success. There are a lot of tactics we can use to get granular about how profitable the traffic tied to a specific search may be, and none of them involve heavy scrutiny of keyword rankings. Rankings shuffle all the time for various keywords. You could rank in position 3 for your most profitable term today, and rank number 8 tomorrow. It's just not a part of a strong long-term strategy.

If anything, this is good news for all of us who have been in a meeting where a client says, "more sales are great, but we're not number 1 for 'widget keyword that doesn't convert'." Or think of it this way: imagine you sold umbrellas, and one day you found that 5% of your visitors came to your site using the term "parasols for rain." Would you revamp your entire site? Of course not. A little traffic or a bump in search rankings doesn't affect your bottom line as much as a sale does. So why not talk about site performance from that standpoint instead?

This is perhaps the best outcome of all. We can focus on how organic traffic and site performance influence unique visitors, conversions, sales, revenue, etc. If you haven’t done so already, now is the time to shift your conversation towards meaningful KPIs and away from silly things like ranking and traffic.

So, in the end, thank you (not provided) for our industry’s evolution away from outdated “metrics” and toward meaningful results.

3 Google Algorithms We Know About & 200 We Don't

When I meet with clients or present at conferences, I am always asked: "How do I rank high on Google for (insert keyword-phrase-du-jour)?" I give the standard answer: "Only the search engineers at Google can tell you, and they aren't talking."

Inevitably, the questioner looks dejected, mutters a slur on my credentials, and walks away.  I scream silently in my head: “Don’t kill the messenger because we are all hapless Wile E. Coyotes chasing the Larry and Sergey Road Runner with zero chance of catching them, no matter what we order from ACME!”

Thirteen years ago, before the Cone of Silence dropped on Google’s method of operation, we got a glimpse of the method behind their madness. This, combined with the common knowledge of the foundational tenets of all search engines, gives us some idea of what’s going on behind that not-so-simple box on the white page.

In this post, I am going to explore the 3 algorithms that we know for sure Google is using to produce search results, and speculate about the 200+ other algorithms that we suspect they are using based on patent filings, reverse engineering, and the Ouija board.

What is an algorithm (you might ask)?

There are many definitions of algorithm. The National Institute of Standards and Technology defines an algorithm as “a computable set of steps to achieve a desired result.”  Ask a developer and they will tell you that an algorithm is “a set of instructions (procedures or functions) that is used to accomplish a certain task.” My favorite definition, and the one that I’m going with, comes from MIT’s Kevin Slavin’s TED Talk “How Algorithms Shape Our World”: algorithms are “math that computers use to decide stuff.”

3 Google algorithms we know about

PageRank

The most famous Google algorithm is PageRank, a pre-query value that has no relationship to the search query. In its infancy, the PageRank algorithm used links pointing to a page as an indication of its importance. Larry Page, after whom the algorithm is named, used the academic citation model, where papers citing another paper are endorsements of its authority. Strangely enough, academia does not have citation rings or citation-buying schemes the way the web has link schemes. Warning: scary, eye-bleeding computational math ahead.

PR(A) = (1 − d) + d · (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

The initial PageRank formula: d is a damping factor, T1…Tn are the pages linking to page A, and C(T) is the number of outbound links on page T.

To combat spam, a Random Surfer algorithm was added to PageRank. This algorithm "imagined" a Random Surfer that traveled the Web and followed the links on each page. Sometimes, however, much like us thought-processing bipeds, the Random Surfer would arbitrarily stop following links and "jump" to another page instead. The algorithm steps are:

  1. At any time t, surfer is on some page P
  2. At time t+1, the surfer follows an outlink at random
  3. Surfer ends up on some page Q (from page P)
  4. The process repeats indefinitely

That's the benefit of algorithms: no overtime, and they never get tired or bored.
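For the curious, here's a toy random-surfer simulation in Python over a made-up four-page web. With probability d the surfer follows a random outlink; otherwise it jumps to a random page, and the visit counts approximate PageRank. This is a sketch of the concept, not Google's implementation.

    # Toy random-surfer simulation; the four-page "web" is invented.
    import random

    links = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],
    }
    d = 0.85                                   # damping factor
    visits = {page: 0 for page in links}
    page = "A"

    for _ in range(100_000):
        visits[page] += 1
        if random.random() < d and links[page]:
            page = random.choice(links[page])          # follow an outlink
        else:
            page = random.choice(list(links))          # "jump" anywhere

    total = sum(visits.values())
    for page, count in sorted(visits.items(), key=lambda kv: -kv[1]):
        print(f"{page}: {count / total:.3f}")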

Hilltop Algorithm

The Surf's Up, Dude algorithm worked for about 10 minutes before the SEO community found the hole in its wetsuit and used it to manipulate rankings. In the early 2000s, processors caught up to computational mathematics and Google was able to deploy the Hilltop Algorithm (around 2001). This algorithm was the first introduction of semantic influence on search results, inasmuch as a machine can be trained to understand semantics.

Hilltop is like a linguistic Ponzi scheme that assigns a quality to links based on the authority of the document pointing the link at the page. One of Hilltop's algorithms segments the web into a corpus of broad topics. If a document in a topic area has lots of links from unaffiliated experts within the same topic area, that document must be an authority. Links from authority documents carry more weight. Authority documents tend to link to other authorities on the same subject and to Hubs, pages that have lots of links to documents on the same subject.

Topic-Sensitive PageRank

The Topic-Sensitive PageRank algorithm is a set of algorithms that take the semantic reasoning a few steps further. Ostensibly the algorithm uses the Open Directory ontology (dmoz.org) to sort documents by topic.

Another algorithm calculates a score for context sensitive relevance rank based on a set of “vectors”. These vectors represent the context of term use in a document, the context of the term used in the history of queries, and the context of previous use by the user as contained in the user profile.

So, I know what you're thinking: how can they do that for the whole web? They don't. They use predictive modeling algorithms to perform these operations on a representative subset of the web, collect the vectors, and apply the findings to all of the "nearest neighbors."

D’oh!

[Added May 16, 2013]
There are a lot of algorithms for indexing, processing, and clustering documents that I left out, because including them would have many of you face-first in your cereal from boredom. However, it is NOT OK to leave out the mother of all information retrieval algorithms, TF-IDF, known affectionately to search geeks as Term Frequency-Inverse Document Frequency.

Introduced in the 1970s, this primary ranking algorithm uses the presence, number of occurrences, and locations of occurrence to produce a statistical weight on the importance of a particular term in the document. It includes a normalization feature to prevent long, boring documents from taking up residence in search results due to the sheer nature of their girth. This is my favorite algorithm because it supports Woody Allen's maxim that 80% of success is showing up.
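Here's a bare-bones TF-IDF sketch in Python over three toy "documents." Real implementations differ in smoothing and normalization details, but the showing-up-matters idea is all here.

    # Bare-bones TF-IDF over toy documents. Smoothing choices vary by
    # implementation; this is just the core idea.
    import math

    docs = [
        "google ranks pages with links",
        "links links links everywhere",
        "pandas eat bamboo",
    ]
    tokenized = [doc.split() for doc in docs]

    def tf_idf(term, doc_tokens, corpus):
        tf = doc_tokens.count(term) / len(doc_tokens)        # term frequency
        containing = sum(1 for d in corpus if term in d)     # document frequency
        idf = math.log(len(corpus) / (1 + containing)) + 1   # inverse doc frequency
        return tf * idf

    for tokens in tokenized:
        scores = {t: round(tf_idf(t, tokens, tokenized), 3) for t in set(tokens)}
        print(scores)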

The 200+ we don’t know about

All of the search engines closely guard their complete algorithm structure for ranking documents. However, we live in a wonderful country that has patent protection for ideas. These patents provide insight into Google’s thinking and you can usually pinpoint which ones are deployed.

Panda, the most famous update, is an evolving set of algorithms that are combined to determine the quality of the content and user experience on a particular website. There are algorithms that apply decision trees to large data sets of user behavior.

These decision trees apply if-this/then-that logic:

  • If the page has crossed a certain threshold for the ratio of images to text, then it is a poor user experience.
  • If a significant portion of searchers do not engage with anything on the page (links, navigation, interaction points), then the page is not interesting for searchers using that particular query.
  • If the searchers do not select the page from the results set, then it is not relevant to the query.
  • If the searcher returns to the search engine results to select another result or refine the search, then the content was not relevant and not a good user experience.

Complementing the decision trees could be any one of a number of page layout algorithms that evaluate the number and placement of images on a page relative to the amount of content and to a searcher's focus of attention.
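Purely as speculation in the same spirit as the guesses above, those if/then rules might look something like this in code. Every threshold and signal name below is invented for illustration – nobody outside Google knows the real numbers.

    # Speculative sketch of Panda-style threshold checks. All numbers
    # and signal names are invented for illustration.
    def quality_warnings(page):
        warnings = []
        if page["images"] / max(page["words"], 1) > 0.05:
            warnings.append("image-to-text ratio too high")
        if page["engagement_rate"] < 0.10:
            warnings.append("searchers rarely engage with the page")
        if page["serp_ctr"] < 0.01:
            warnings.append("rarely selected from the results set")
        if page["return_to_serp_rate"] > 0.60:
            warnings.append("searchers bounce back to the results")
        return warnings

    example = {"images": 40, "words": 300, "engagement_rate": 0.05,
               "serp_ctr": 0.02, "return_to_serp_rate": 0.70}
    print(quality_warnings(example))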

Following on the heels of Panda are the Penguin algorithms. These algorithms are specifically targeted at detecting and removing web spam. They use Google's vast data resources to evaluate the quality of links pointing to a site, the rate of link acquisition, the relationship of the link source to the page subject, shared domain ownership of the linking sites, and relationships between the linking sites.

Once a site passes an established threshold, another algorithm likely flags the site for additional review by a human evaluator or automatically re-ranks the page so that it drops in search results.

Let’s stop, guess, and go with what we know

As with the formula for Coca-Cola or the recipe for Colonel Sanders’ Kentucky Fried Chicken, specifics on what Google uses to decide who gets where in the search results set are a closely guarded secret. Instead of speculating on what we might know, let’s focus on what we do know:

  • In order to rank for a term, that term must be present in the document. Sometimes, a contextual or semantic match for a term will get you into the SERP swimsuit competition for placement. Don’t count on that though.
  • Being picky and realistic about what you want to rank for is the best start.
  • Text on the page drives inclusion in the results for a searcher’s query. Be about some thing instead of many things.
  • Quality content is focused, fresh and engaging.
  • Custom titles that describe the page using contextually relevant keywords are THE low hanging fruit. Pick it for your most important pages.
  • Compelling description text in search results will draw clicks. Meeting the searcher’s expectations with rich content will keep them on the page.
  • Pictures, images, and ads are most effective when used in moderation.
  • Links are important, but only the right kind. For Google, the "right" kinds are links from pages that are about the same subject and that place high in contextually related searches.

Are there any major algorithms we missed?  Let us know in the comments.

5 SEO Strategies We Swear Aren't Going Anywhere


It seems like every other day, some SEO technique that used to be accepted is now being devalued or, even worse, penalized. (Remember when meta keywords and nofollow tags were totally legit?  Ah, the good old days…)  And now Google is threatening to crack down on two staples of the SEO stable: anchor text and infographics.

With all of these changes it can be hard for businesses to know which search strategies are long-term and future-proof.

Business: “Sure I could invest a bunch of time and resources into this new strategy the weird search person is suggesting, but how do I know it won’t just change next year?”

SEO: “Err…”

Well, as much as things change, there are a few basic guidelines that are here to stay.  And while I’m not going to go into details on every nuance, as long as your SEO efforts keep with these general strategies, you should be in good shape (at least for a few years).

1. Page speed and efficiency

As a general rule, fast things are better, and in this case, the “better” means rankings.  Google and Bing have been saying this for a while now: page speed counts. Google even made a cool little tool for everyone to measure their website’s page speed. It even gives instructions on how to improve it. Why would they do that if they didn’t consider it important?

Image optimization, JavaScript and CSS consolidation, minification, caching, compression – the list goes on. We've given a few pointers in past posts, but the message is that improving your site's page speed is a long-term strategy.

Why is it so important? Because it's one of the main things real people look at when they decide whether or not to use a site. Amazon has stated that every tenth of a second of increased load time resulted in a 1% drop in sales. That means people really, really care about this. And if people really, really care, you know all the little Oompa Loompas and elves working at Google and Bing are trying to make their magic robots care as well.
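If you like scripting, a crude timing check like the Python sketch below gives you a quick feel for where you stand – it's no substitute for Google's PageSpeed tool, and the URL is just an example.

    # Crude speed sanity check: rough time to first byte, whether the
    # response was compressed, and the page weight. Example URL only.
    import requests

    response = requests.get("https://www.example.com/", timeout=10)

    print(f"Time to first byte (approx): {response.elapsed.total_seconds():.2f}s")
    print(f"Content-Encoding:            {response.headers.get('Content-Encoding', 'none')}")
    print(f"Page size:                   {len(response.content) / 1024:.0f} KB")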

2. Fixing duplicate anything

Duplication happens. Title tags get repeated, URLs accidentally get indexed with parameters, content gets scraped, mobile sites get indexed separately – it sucks. Like Oompa Loompas, the ways in which content can get duplicated go on forever.

[Image: Oompa Loompas are a result of a dynamic query parameter on a self-referencing link]

The search engines are getting better at identifying duplicate content. But a search engine robot being 99.9% sure that your page is the one that should rank out of the 15 others indexed is still not as good as the robot being 100% sure because it only has one page to choose from.

The best defense with duplicate content is to avoid it altogether:

  • Don't use parameters in places where they'll get indexed (a small URL-cleanup sketch follows this list).
  • Use consistent URLs for both mobile and desktop versions of your site.
  • If you can’t fix a wrong link, use 301 redirects.
  • If you can't get rid of duplicate content and you can't redirect, then use tags like rel=canonical or the noindex meta tag.
  • If you can’t do that, then just fight it any way you can.
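As promised above, here's a small Python sketch of one defense: normalizing URLs by stripping tracking parameters and lowercasing the host before links ever go out. The parameter names are just common examples.

    # Normalize URLs so tracking parameters don't spawn duplicates.
    # The parameter names below are common examples, not a complete list.
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

    def canonicalize(url):
        parts = urlsplit(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query)
                if k not in TRACKING_PARAMS]
        return urlunsplit((parts.scheme, parts.netloc.lower(),
                           parts.path.rstrip("/") or "/", urlencode(kept), ""))

    print(canonicalize("https://Example.com/widgets/?utm_source=news&color=red"))
    # -> https://example.com/widgets?color=red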

3. Resource and instructional content

Updates like Penguin have got everyone freaking out. “What makes a page good?” “What makes a page spammy?” “What makes a link good?” “Does Google like me or like like me?”

Well, here’s the thing – if you have a legitimately useful page for something, it will always be considered good. So, if you sell Frisbees, write a page about how to throw a Frisbee. If you’re trying to get more Renaissance Fair enthusiasts to visit your site, write something on how to care for jousting armor.

This goes back to Google’s humble beginnings as an indexing engine for academic documents.

In academia, if you reference something, you’re doing it because it’s something that you found useful when writing your dissertation on Bigfoot or Jetpacks (or whatever it is smart people write about).

This is how search engines wish you used your links. Lots of SEOs complain about Wikipedia always being in the number 1 spot, but few can argue that it isn’t the most relevant result for most searches.

I realize not everyone can be Wikipedia, but as long as people are linking to you because you’re useful, you will be in good shape.

How do you do this?

  • Create resource pages about your industry.
  • Create some data-oriented blog posts.
  • Make an instructional page about how to use your product or a related product.

This is how links were always intended to be used, and it’s why they became a ranking factor in the first place. These pages are naturally good content, and links to them will tend to be good links.

4. Good site structure

Should I link to every page from the homepage or just the main ones? Should I repeat everything in the footer or should I cut the footer altogether? Can I hide the homepage text?

Good site structure typically falls more into the UI/UX category, but it’s an SEO concern as well. You see, when you design a site with the intention of getting your user or shopper to the right page, or giving them the information they’re looking for, you’re naturally creating a site that does the same for the search engines.

Confusing instructions in an Asian language.

Search engines are complicated because the elves are trying very hard to make them emulate how humans view a website.

A page with lots of links is confusing; it implies that you consider them all equally important. If your homepage has only a few links, on the other hand, it looks like you really care about those pages.

Using this sort of user-oriented thought process is a future-proof way to predict what search engines will care about within your site. Sure, you still have to help the robots with things like filters and search boxes, but they are very good at finding links on their own.
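
A crude gut check on the “lots of links” problem is simply counting the anchors on a page. This is only a sketch (the URL is a placeholder, and there’s no magic threshold); it just tells you how many destinations you’re asking robots and people to weigh:

# Sketch: count the <a href> links on a page as a rough gut check on whether
# the page is trying to "show off" everything at once.
import urllib.request
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

url = "https://www.example.com/"   # placeholder -- point it at your homepage
with urllib.request.urlopen(url, timeout=10) as response:
    html = response.read().decode("utf-8", errors="replace")

counter = LinkCounter()
counter.feed(html)
print(f"{url} has {len(counter.links)} links")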

So remember to:

  • Show off the links you want to show off.
  • Use page hierarchy to group internal pages into categories and subcategories.
  • Link to related products. These really make sense to your users and naturally lay out the relationship to the search engines.

5. Anything local

The bottom line for businesses and consultants is this: People who are actually looking for products aren’t changing their settings in Google or using a proxy to see the universal search results. They are clicking on those search results with the little letters next to them.

If searching for any of your keywords displays a local search result, you need to spend time on local. And if people can walk into a storefront, you really need to care about local.

Screen cap of costume stores Google search.

Local search isn’t going anywhere; in fact it’s getting more popular. On mobile devices it pretty much dominates the search results. So if you have one store or one thousand stores, you need to spend some time in Google+ Local and Bing Local.

You need to:

  • Claim your listings, check your NAP (name, address, phone number), and monitor your reviews.
  • Create storefront pages and make sure they are associated with your local listings (a markup sketch follows this list).
  • If you don’t have a storefront, make some pages that talk about the area you serve.
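
Storefront pages can also carry the same NAP details you keep in your listings as structured markup. The list above doesn’t require any particular format, so treat this as one optional approach: a minimal sketch that builds a schema.org LocalBusiness snippet, where every business detail is a made-up placeholder.

# Sketch: build a schema.org LocalBusiness JSON-LD block for a storefront page,
# using the same NAP (name, address, phone) you keep in your local listings.
# All of the business details below are hypothetical placeholders.
import json

storefront = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Costume Shop",
    "telephone": "+1-206-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example Ave",
        "addressLocality": "Seattle",
        "addressRegion": "WA",
        "postalCode": "98101",
    },
    "url": "https://www.example.com/stores/seattle",
}

snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(storefront, indent=2)
    + "\n</script>"
)
print(snippet)  # drop the output into the storefront page template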

Is that all?

No! This doesn’t mean the other stuff doesn’t matter. Don’t ignore a social network today just because it might not be around in 10 years. If Google comes out with a hot new tag, then you should absolutely use it, even if it might be ignored a few months later. Infographics still work!

But if you have limited resources, these guidelines can help you evaluate whether something is worth investing a ton of time and money in or if there is something more effective you could be working on.

Disagree? Totally agree? Not sure what level of agreement you’re feeling? Leave a comment below and share your thoughts!

The post 5 SEO Strategies We Swear Aren’t Going Anywhere appeared first on Portent.

Are my links poopy? Know a spammy link when you see one. http://www.eigene-homepage-erstellen.net/blog/seo/are-my-links-poopy-know-a-spammy-link-when-you-see-one.htm http://www.eigene-homepage-erstellen.net/blog/seo/are-my-links-poopy-know-a-spammy-link-when-you-see-one.htm#comments Fri, 18 Jan 2013 00:25:52 +0000 http://www.eigene-homepage-erstellen.net/?p=15274 Short version of this article: If you’ve been penalized for unnatural links by Google, either manually or under Penguin, you need to cut deep or you won’t recover. Now, the long version, with examples: Here’s a joke I learned in hebrew school, an unmentionable number of decades ago: Three guys are walking down the sidewalk.… Read More

clean up after your links, k?

Short version of this article: If you’ve been penalized for unnatural links by Google, either manually or under Penguin, you need to cut deep or you won’t recover. Now, the long version, with examples:

Here’s a joke I learned in Hebrew school, an unmentionable number of decades ago:

Three guys are walking down the sidewalk. They come upon a pile of dog poo. The first one kneels down and gives it a big sniff. He says “Oy, that smells like poo.” The next guy touches it and says, “Oy, that feels like poo.” The last guy tastes it and says, “Oy, that tastes like poo.” Then, they all say, “Wow, sure am glad we didn’t step in it!!!”

The moral here: If it looks like poo, and it smells like poo, you probably don’t need to touch or taste it.

But when Google penalizes us for link spam, we ignore that rule. When it comes to links, apparently, we need to not only touch and taste, we need to roll around in it for a while like a Black Lab on speed, then jump up and down in front of Google yelling “This is OK! This is OK! This is OK!”

I get it: If you’ve been penalized under Google Penguin, it’s hard to know spam links from good ones. Your justification motor kicks into high gear. The problem with justification, though, is that Google doesn’t want it. They want to get rid of the spam.

So you need to do a really good job of sussing out the spammy links. I’ve done several reinclusion requests now, and I’ve put together some examples.

If you know about Google Penguin, skip the next section. You don’t need it.

The story so far: Google Penguin and link penalties

Just to catch you up: Last Spring, Google began rolling out something called “Penguin.” Google Penguin targets any website attempting to move up in the rankings through ‘unnatural’ link acquisition. When it launched, the big G sent out warnings that looked something like this:

A Google link warning

Then, your Google traffic plunges:

google organic traffic takes a dive

GAAAAHHHH!!!!

And then everyone starts screaming. Primal-type screams. Screams that would chill the very soul of the most cynical, shrivel-hearted meanie on the planet.

Once the screaming stops, most folks look for a way to fix the problem. The fix: Remove all the unnatural links, then go back to Google on bended knee. Hopefully, you get the penalty lifted, and life is good again.

I’m talking about the manual penalty and reinclusion process here. The ‘real’ Penguin penalty happens algorithmically, and can be a lot harder to detect and fix. We can save that nightmare for another post, yes?

Unnatural = spammy

What is an unnatural link? Hmmmm. Good question. Google’s not going to tell you. You can splutter angrily about it (I did), but the truth is, Google doesn’t have to. Unnatural (spammy, aka poopy) links are usually as obvious as a gargantuan turd on the sidewalk in front of you. I collected data on a bit over 200,000 links pointing at penalized sites, and after using some really advanced computer niftiness to automatically evaluate the links, I concluded:

If a link looks like poop, it’s poopy.

Examples of spammy links

If you got a link by stuffing it into a press release where it makes no sense, it’s poopy.

a press release stuffed with links

Why exactly are all these links in here?

If you got a link by dumping it into a forum post while thinking “Well, this will help me rank higher!”, it’s poopy.

If you got a link by posting to a site with thousands of pages of barely-readable drivel on subjects ranging from STDs to outdoor patio furniture, it’s poopy.

yes, this site's for real

Yes, this site really exists.

If you got a link by adding it to a list of links on a page with tons of other completely unrelated links, it’s poopy.

awful link list

Yes folks, from Archery to Babylon 5, we’ve got it all!

If you got a link by clicking away on bookmarking sites like a fiend, it’s poopy.

bookmark spam

No.

If you got a huge number of links from pages that could be totally legitimate, but are all one type of page — a link page, or a forum thread, or a press release page — then even though they’re not individually poopy, they may be poopy in aggregate. Get rid of them.
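
That “poopy in aggregate” pattern is easy to check mechanically. Here’s a rough sketch with made-up sample data and an arbitrary threshold: it groups a backlink export by page type and by anchor text, then flags anything that dominates the profile.

# Sketch: flag link-profile patterns that look fine one at a time but smell
# bad in aggregate -- e.g. hundreds of press-release links, or the same
# keyword-rich anchor text over and over. Sample data and the threshold are
# made up; feed it your real backlink export.
from collections import Counter

backlinks = [
    # (linking domain, page type, anchor text)
    ("prnewswire-clone.example", "press release", "buy cheap frisbees"),
    ("forum.example",            "forum post",    "buy cheap frisbees"),
    ("hobby-blog.example",       "blog post",     "Portent"),
    ("links4u.example",          "link page",     "buy cheap frisbees"),
]

by_type = Counter(page_type for _, page_type, _ in backlinks)
by_anchor = Counter(anchor for _, _, anchor in backlinks)
total = len(backlinks)

for label, counts in (("page type", by_type), ("anchor text", by_anchor)):
    for value, count in counts.most_common():
        share = count / total
        if share > 0.3:   # arbitrary threshold -- tune for your own profile
            print(f"{share:.0%} of links share the {label} '{value}' -- look closer")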

If you’re under a Google manual link penalty, and a link is spammy, or even seems a teeny bit spammy, or even a teeny-tiny-itty-bitty-bit spammy, then it’s poo. REMOVE IT.

No exceptions

You read all of these, and they’re obvious, right? You didn’t get these links because they were good marketing. You didn’t get them because someone loved your stuff. You got them to improve your rankings. Which makes them unnatural. But you’ll still try to justify those links. I know I do. I can hear this little voice in my head saying things like:

This press release only has 2 keyword-rich links in it. That’s better than 3, so it’ll be OK.

or

This spammy link directory is purely focused on kevlar products. It’ll be fine!

or

I worked damned hard to get this #!# link. I am not taking it down.

Whatever. Google doesn’t want them justified. Google wants them gone.

If you’re under penalty, remove all links you obtained by:

  • Paying someone other than a charity or foundation.
  • Using any tool with ‘Amazing,’ ‘Super,’ ‘Crusher,’ ‘Stomper’ or any other superlative/smashing reference in the name.
  • Begging someone for a link that adds no value to their site.
  • Trading.
  • Writing the same article 100000 times.

Just do it

If you didn’t get a link through real, honest-to-god marketing, take it down. Yes, that’s scary. You’re going to lose authority. You’re going to lose some good links in the process. But it’s also your best bet. Cyrus Shepard has a great case study on just this subject. Read what he had to do. It worked.

Or, justify away, file your reinclusion request and see what happens. It’s only a few more months in purgatory if you get rejected. What’s the worst that can happen?

One last tip. If you’re using disavow, use the domain: command generously. This is advice directly from the Google search quality team. Otherwise, you can miss a lot of spam links on a site, no matter how thorough you are. Or, spam links can sneak back in later.

The post Are my links poopy? Know a spammy link when you see one. appeared first on Portent.

The Dos and Don’ts for Google’s New Disavow Links Tool http://www.eigene-homepage-erstellen.net/blog/seo/google-disavow-links-tool-best-practices.htm http://www.eigene-homepage-erstellen.net/blog/seo/google-disavow-links-tool-best-practices.htm#comments Tue, 16 Oct 2012 23:43:50 +0000 http://www.eigene-homepage-erstellen.net/?p=13285 For the first time in what might be ever, Google has followed Bing’s lead and announced a tool to disavow links. We asked (or demanded), and they listened! Cleverly named the Disavow Links tool, Google Webmaster Tools’ latest feature gives power back to webmasters and takes it away from spammers. Here are our tips for… Read More

Disavow Link Tool dog
For the first time in what might be ever, Google has followed Bing’s lead and announced a tool to disavow links. We asked (or demanded), and they listened! Cleverly named the Disavow Links tool, Google Webmaster Tools’ latest feature gives power back to webmasters and takes it away from spammers.

Disavow Links tool

Here are our tips for using the new Google Disavowinator™:

Don’t disavow links unless you’ve carefully researched them

What does that mean? Well, it’s not just looking at domain names, domain authority, or whatever else you use to get a rough understanding of the quality of said site. Thoroughly researching your links is much more complicated and will take time. However, you’ll be better off in the end for having done it.

Building a database of your current backlinks is your first step… and then there are about 19 other steps after that. Follow Ian’s master list – he’s a good man, and thorough.

Don’t use this as a shortcut

The disavowal tool is not a magic shortcut. Only submit links that you’ve already tried to remove.
As you read the rest of this article, you’ll see Google makes it clear that this isn’t a 100% guaranteed way to get links removed from your profile. Nor is it there to make life ‘easier’ so much as it’s there to tip the balance of power away from spammers and back towards webmasters. Treat it as such.

Don’t disavow an entire domain unless you’re 100% sure every link on that domain is garbage

This applies to the odd industry blog, news site, content site, etc. Yeah, they look like garbage. Yeah, they smell like garbage. And yes, they likely are 90% garbage… but think about the other 10%: the piece that earned a link from Microsoft, or Yahoo, or CNN back before they let their site go to hell. That link still matters, that piece of content still matters, and your link from them might still matter.

If the domain is filled with spammy content and other forms of web crap, go ahead. To quote Ian, “If not, think. Use your brain.” (Seriously, he says this all the time)

Don’t assume that all rankings issues can be fixed by disavowing links

There are a lot of reasons your site may have dropped in the rankings. In fact, your site’s drop in the rankings on the same day as the last Penguin update might just be a coincidence. Yes, it’s true! Your site might not be all that good. That’s OK. You can make it better, but disavowing a bunch of questionable links is not going to turn it around in a heartbeat.

Be smart. Disavow the links you know to be spam and that you’ve already tried to remove. Then re-evaluate your site, and see where you can improve.

Don’t file a re-inclusion request until you’ve uploaded a disavowal file

Oh man, this would be a classic FAIL.

We saw quite a few examples of sites filing for re-inclusion before they’d even done any link cleanup after the first Penguin release. You know what that does? When the request is answered, it will likely keep you out of Google even longer, because it may trigger additional review of your site.

Before you file for re-inclusion, make sure you’ve done your research. Leave no stone unturned, and be sure to clean up the rest of your site as well.

Don’t become a serial uploader

Find a bunch of lousy links. Submit them. Then wait.

Don’t sit there submitting every 3 hours and then wondering why you’re not back in the rankings. This is a new, fairly advanced tool. It’s best to proceed with caution. Google agrees.

Disavow Links tool suggestions

Do party like it’s 1999

We got the power to filter out spammy links. This is pretty huge. Enjoy it for a day. Then, get to work.

Do properly fill out your file of links to be disavowed

All you need is a plain text file with one URL per line. Simple, I know, but someone will screw that up. Google also gives us a few commands:

  • Lines that begin with a hash (#) are considered comments.
  • Lines that start with “domain:” allow you to disavow all links from a particular domain.

Your file will look something like this:

Disavow Links example file
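
The format really is that simple. As a minimal sketch, here’s one way to generate a disavow file using the comment and domain: rules described above; the domains and URLs are placeholders.

# Sketch: write a disavow file in the format described above -- one URL per
# line, "#" for comments, "domain:" to disavow an entire domain.
# The domains and URLs below are hypothetical placeholders.
bad_domains = ["spammy-directory.example", "links4u.example"]
bad_urls = ["http://mostly-ok-site.example/old-link-page.html"]

lines = ["# Disavow file generated after link cleanup attempts"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))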

Do use Webmaster Tools: Links to Your Site

Don’t be ridiculous: go to the source to find the easy links first. Webmaster Tools is your insight into Google’s view of your site. Grab the links they’re reporting first, and filter through those.

This is also a helpful reminder for any site that doesn’t have access to SEOmoz, MajesticSEO, or Ahrefs – you can still find links pointing to your site. While the database might not be as grand, it’s still going to help you fix your site’s backlink issues.

Do think of this like rel=”canonical”

Google is equating this tool to rel="canonical" in that it’s more of a strong suggestion than a directive. Think about it: they’re not going to give us the keys to the kingdom just because we all complained about the power the spammers gained from Penguin.

It’s important to note that, just like rel="canonical," this is meant to be used when necessary. We’re still expected to clean up as many links as possible on our own, request that webmasters take links down, etc. Then we can use the disavow tool.

Do give it time

You won’t see anything change overnight. Google says:

We need to recrawl and reindex the URLs you disavowed before your disavowals go into effect, which can take multiple weeks.

I wouldn’t expect to see positive results for a month. Have you checked out the Disavow Links tool yet? Let us know in the comments.

The post The Dos and Don’ts for Google’s New Disavow Links Tool appeared first on Portent.

How Google gave the spammers all the power http://www.eigene-homepage-erstellen.net/blog/seo/how-google-gave-the-spammers-all-the-power.htm http://www.eigene-homepage-erstellen.net/blog/seo/how-google-gave-the-spammers-all-the-power.htm#comments Thu, 05 Jul 2012 21:55:05 +0000 http://www.eigene-homepage-erstellen.net/?p=10660 Google launched the Penguin update to filter out spammy links. Great! That’s fantastic! Lead us, oh tuxedoed little birdie, to the Golden Age of SEO! Content, marketing and all-around smarts will win the day! Penguin is not subtle. It targets link profiles that flunk fairly obvious, common-sense criteria: Don’t buy links; don’t get site-wide footer… Read More

Surprise! No Penguin here

Google launched the Penguin update to filter out spammy links. Great! That’s fantastic! Lead us, oh tuxedoed little birdie, to the Golden Age of SEO! Content, marketing and all-around smarts will win the day!

Penguin is not subtle. It targets link profiles that flunk fairly obvious, common-sense criteria: Don’t buy links; don’t get site-wide footer links in 5-point type; don’t spam forums; don’t get 2,000 links with identical anchor text. In short, don’t acquire crappy links. You can read a longer explanation of Penguin here.

If Google penalizes you, you can file a re-consideration request.

But the common sense ends where the request process starts: The spammers are in control. Here’s why:

Unsettling trend…

In the last few weeks, I’ve noticed legitimate sites, selling real stuff in a thoroughly businesslike manner, getting their re-consideration requests slapped down.

I’m not Google. But I know a wee bit about what spammy SEO looks like. The sites in question have reformed—they’ve given up their dastardly link-dealing ways.

So I did a little digging, and discovered that in every rejected case, some of the spammy links remain. They remain because we can’t get the site owners to remove the bloody links. We’re not hiding them. We’re listing them. Hell, I’d personally go to Google on bended knee and beg that they devalue the links if I could.

How much cleanup is enough?

That led me to write a little ditty:

If any spammy links remain, your request goes down the drain.

1,200 link removal requests isn’t enough. If you want to get back into the rankings, you have to actually get all of those spammy links removed. Or just pray the re-inclusion team misses the remaining links in a spot check.

Google gave the Batmobile to the Joker

Which leaves me in a pickle. I’m a strong believer in TOS-compliant search engine optimization. Do what Google and Bing want: Help them find, correctly categorize and rank the best stuff. Simple. If you screw up, fix it.

Except that now, site owners can’t do that. If a spammer decides not to remove a link, or if they’ve abandoned their site, Joe the Siteowner is hosed. Google, you’ve just put all the power in the hands of the people you most detest: Spammers. People who sell links, build blog networks and otherwise wreak havoc on your algorithm.

Um. Why, exactly?

Why, Google? Why?

The answer: Disavowal, or an ‘A’ for effort

Google, we need that link disavow tool. Now. It’ll totally wipe out a nice little consulting division here at Portent. And I’m OK with that. Just give site owners a way to yank links out of their profile.

Or, when a site owner sends you an excruciatingly detailed list of every link they’ve tried to remove, give them an ‘A’ for effort. Then devalue the remaining links, and move on.

Just please, put the power back in the hands of the site owners, OK?

The post How Google gave the spammers all the power appeared first on Portent.
