Everything Non-SEOs Need To Know About SEO (Webinar)


Last month, I did a webinar called “Everything Non-SEOs Need To Know About SEO.” The goal was to give developers, designers, and other web professionals a core understanding of some of the more complex SEO concepts.

Here’s the full video along with the link bundle that includes the slidedeck. Enjoy!

You can find the link bundle here.

Transcript:

Ariana:  So without further ado, please join me in welcoming George.  Hey, George.

 

George:  Hi, Ariana.  Thank you for that introduction.  All right, so the title of the webinar, like it says on the screen, is "Everything Non-SEOs Need to Know About SEO."  What I wanted to do was discuss some of the more common conflicts that can happen between other web folk, particularly designers and developers, and the SEO.  The goal here is to give a high-level understanding of some of the more technical and sometimes invasive SEO concerns, and what SEOs are looking for when they start butting into other people's business.  At the very least, I hope you come away understanding these issues, but ideally we can avoid them altogether and make a website that really lives up to its full potential.  So with that, we'll move forward.

Like Ariana said, here’s the hashtag, #portentu so if you have any questions or comments or you just want to yell at us, you can tweet them out to this hashtag.  And then this is where the link bundle is gonna be.  I’m gonna put the slides up later on after we’re done here.  Right now, there’s just a few links, but this is where everything is going to be.

So with that, let's get started, and let's start at the beginning, and that all starts with a website.  A business owner or marketing person has an idea to start a new website, revamp an old one, or do something along those lines, so they get a team together to build it.  They start with a designer, typically: someone to make the website look nice and make sure everything's findable and set up in a way people can use.  They get a developer: someone to make the website actually work, make sure it loads fast, all the gears are moving, and all of that.  Next, a content person of some kind, maybe a writer or in some cases a merchandising person: someone to give actual meat and substance to the website so there's something there to use, something that's a useful resource.  And last of all comes the SEO.  Usually later in the process, you bring in an SEO for the purpose of making the website findable.

So the designer, developer, and everyone else get together.  They put together a website that everyone likes and they’re ready to launch it.  That’s when the SEO comes along and they’re looking for some advice on how to make sure that it’s found.  So the SEO starts making some changes.  Some of this might be expected.  Basic changes like adding content here and changing an image here, but then they start making more changes, getting into the actual design and structure and now they’re talking about how the website should actually function, talking about getting into the developer’s territory and moving along with that sort of stuff.  Next thing you know, we’ve got navigation changes made to the website.  Then they start asking some vague questions without any real reason, maybe some vague recommendations on how the website should be acting, telling you to get rid of pages altogether, and sooner or later, you sort of feel like this lady right here with the SEO being this dude with the blue elbow pads.

I’m sure you’re all familiar with this guy, the annoying guy hovering over you, behind you, nitpicking where you click, when you do, what you do, and all you want to do is get your work done.  Nobody likes this guy.  He’s got those elbow pads.  They’re just kind of weird.  So that brings me to the secondary title of this webinar and the one that I think is actually a little bit more accurate and that’s “How to Keep SEOs from Getting All Up in Your Business.”  Because this is really what it’s about.  Everyone has stuff that they need to get done and work they need to do.  You don’t need elbow pads guy here telling you how to run your day to day.  You’ve got your own work that you need to get done and you’ve got your own concerns related to how the website works and all of that.

So I want to give advice on how to make a website without having an SEO like this guy getting into your day-to-day business.  So what I’m going to be doing is speaking mostly to designers and developers and talking about how you can avoid some of the big issues related to SEO.  Marketers and business owners, hopefully you can learn about a lot of these potential conflicts, get a high-level understanding of what these conflicts are and in a lot of ways how they can just be avoided altogether with simple planning.  And SEOs, if everything in here is something that you already know, then maybe at least you can understand that these are pain points and they should be treated as such and we should learn to avoid them.

So let's get to the bottom of this.  Like a lot of things in life, a lot of these conflicts come from expectations, and when people hire an SEO, I tend to find they're looking for specific types of recommendations.  They're looking for things related to keywords, or anything related to words: maybe an analysis of the existing keywords bringing in traffic, new keyword opportunities, anything like that.  Of course, title tags: new title tags that target the right phrases.  Content, content strategy, anything related to content or words.  And if they're getting really technical, maybe a 301 redirect or a couple of 301 redirects to get people to the right pages.

This is what people tend to expect when they hire an SEO, and from this perspective, it really makes sense why the SEO tends to be brought in so late in the game, because these are all things that can be changed.  You can always change a title tag, you can always write a new blog post, and a redirect can happen at any time.  If that's all you expect from an SEO, it makes total sense to bring them in late in the process, because changing the content of a page is something that's very doable, and something you do regularly anyway.

What tends to happen is that there's a series of things SEOs start actually doing that are sort of unexpected: messing around with the navigation, telling you to change things in there, telling you to move pages around, moving a page to a different category or changing how a page is displayed, moving around the images.  Beyond that, adding alt text.  Maybe they want you to remove an image altogether because it's in the way of something else.  Maybe asking you to change how the site behaves, what's happening in the CMS, how the site's functioning, stuff like that.  Messing around with the URL structure or how the URLs are being delivered.  Or possibly the worst thing yet, giving UX or UI advice to the designer or developer working on the site.  Sometimes the SEO will actually have advice there and make comments on how people should be finding the site navigation items and things like that.

So a lot of people wonder what SEOs are actually doing.  Believe it or not, SEOs aren't just trying to make people's lives miserable when they ask questions like these.  Whenever an SEO ventures outside the world of keywords and content and blogs, it's because we're looking for something very specific: some big issues.  There's one underlying concern all SEOs share, and it's that we're worried no one will ever find your website.  We want to be sure your website can always be found, particularly and specifically by the search engines.

So I'm gonna skip over a lot of the basics, because I assume you know those already, and focus on the technical issues.  I assume you already know the good things, like title tags and anything related to onsite content, and also what doesn't work anymore, so we'll go straight to the large, major technical issues that tend to result in conflicts between SEOs and other web professionals.  Typically these problems fall into three major categories.  I'm gonna go over each one specifically and hopefully give you an understanding of the concerns an SEO has when they're attempting to address them.

So let's go into the three big problems.  You have indexation, duplication, and canonicalization.  Let's start with indexation.  Indexation problems can be caused by many things, but the distinction an SEO draws is this: when there's no content on the website that matches the search, you've got a content problem.  But if the search engines can't actually find the words or the content on your website, that's when you have an indexation problem.  So we'll go over how indexation problems can occur, and I'll briefly cover, at a high level, how the search engine works just to give you a concept of it.

So it always begins with a query.  Google goes out with the query you put in and finds all the pages that contain the words you've typed.  Then it takes those pages and puts them into order, resulting in the search results, using things like links, offsite metrics, PageRank, a little bit of trust, and whatever other magic factors Google uses to sort through all those pages.  But the point here is that before any of that can happen, it starts with a query looking for a specific word.  So it starts with your content.

So basically it doesn't matter how good your site is: if people or robots can't find the content, none of the rest matters.  There are lots of ways this can happen.  The most direct and brutal way is the classic robots.txt block.  This is something a lot of developers are familiar with, and it's a great method for preventing crawlers from getting to your website when you're setting up a staging site.  But if the staging site gets launched with the block still in place, the site gets blocked from the search engines.  A lot of times the argument that allows this to happen is that it's only on for a few minutes, so it might be okay.  The problem is that it's actually not, and the reason why is that everything is always connected on the web.  That's why it's called the web.  If something changes over here, it can be felt over there.

You have a lot of links pointing to your website.  If Google happens to crawl one of the websites that links to yours, it's going to find the link that goes to your website, crawl it, and notice something has changed.  What this means is that the more popular your brand is, or the more links you have going to your website, the more likely it is that your site gets crawled when something changes.  So when you launch a new website, that will probably trigger a crawler to start crawling your site, especially if you're popular.  That is why the robots block is such a big deal and why SEOs are always so concerned about it: if you are a big brand, chances are you will get crawled while blocked.  This happens a lot.  Almost half of the sites that I've assisted with have launched with a robots.txt block in place, and it's just something that should be avoided.
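For reference, the block itself is usually just two lines in robots.txt, which is what makes it so easy to ship by accident.  A minimal sketch of the typical staging setup:

  # Staging robots.txt: tells every crawler to stay out of the entire site.
  # If this file goes live at launch, search engines stop crawling the site.
  User-agent: *
  Disallow: /

The production version should drop that rule (an empty "Disallow:" blocks nothing), and that swap has to happen before launch.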

So the solution to this is just to change it before it goes live and I’ll leave it at that.  That’s the most direct way of blocking content and creating indexation problems.  But there’s lots of other ways that SEOs will get pushy about how the content is being indexed and that’s because they’re concerned about this happening, that you’ve got some nice content but it’s out here where no one will find it.

So to understand this, we'll go into how the search engines crawl a website.  They're using something often called the reasonable surfer model, which basically means they try to crawl a page the same way a person does.  Essentially, it's a robot pretending to be a human when it crawls your site: like Data from Star Trek, Michael Fassbender's android in Prometheus, or Zuckerberg in The Social Network, a robot that's pretending to be a human.

And so what the search engines do is get to your website and start clicking around, touching things, looking around, basically doing everything a human would do on your site.  But the problem is this: the search engine robots aren't actual humans, and there's a series of things they can't do when crawling your website and looking for content.

The most common issue is text in images.  This is something a lot of people are familiar with, but basically the problem is that you've got some great content, great words describing what your website is about, but it's in an image.  That means the search engines can't see it.  I understand that you don't always want to be limited to Times New Roman and Arial for every single font, and there are lots of reasons, like branding, personality, and overall style, why you want good type rather than just the boring defaults that are available everywhere.

Before I go any further: I'm not talking about your logo.  Logos are always okay.  You can make that an image, and you don't have to worry about it.  If your SEO is asking about that, you can win that argument by telling them to go away, 'cause your logo is fine.  But for everything else, you really do want to make sure the text is visible as actual HTML text, and there are ways to do this.  There's an old way where you put in the image anyway and shove the HTML text off-screen by 10,000 pixels.  You don't want to do this anymore; it's something search engines specifically look for, so if an SEO finds it, they'll ask you to stop.

But the good thing is that there's not really any reason to do this anymore, because of all the web fonts you can load on your website.  So instead of putting text in images, you just load the fonts onto the site itself.  The easiest way to do that is with Google Web Fonts, something a lot of people are familiar with, and there are lots of other fonts available online for download.  Whenever there's text on a page, if you use actual HTML text instead of an image, you're going to avoid a lot of SEO issues, and that's one less thing they have to bug you about.
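As a quick sketch of how little work this takes (the font family here is just an example), a Google Web Font is one stylesheet link plus an ordinary CSS rule, and the text underneath stays real, crawlable HTML:

  <link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Open+Sans">
  <style>
    h1 { font-family: 'Open Sans', sans-serif; }
  </style>
  <h1>Real, crawlable headline text</h1>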

But there are also other ways of hiding content.  Those were two of the more direct ways, but some of the others aren't as well known.  It starts with something like this: sometimes SEOs ask to make changes to the navigation.  They'll take a look at the site you've already spent time on and have you put links in the navigation that you don't necessarily want.  The reason SEOs do this is that they want to make sure the website functions in this way: they want a link from your home page to go to the category page, and from there to a product page.  They want to be able to click from one place to another.

So when an SEO is getting involved in UI stuff, it's because they're concerned this isn't happening: that you don't have an architecture that's communicating the hierarchy of your website.  You need to make sure your website communicates what each page is about, both in context and in the actual words on the page.  And to complicate this a little further, there are things that are great for users but totally unusable for search engines, things like search bars and filters.  They're great for user experience, you should totally have them on your website, and they totally make sense, but you have to create a workaround because the search engines can't use them.  In addition to a filter, there needs to be a way to get to every single page that reflects both relevancy and importance contextually.

And before I go any further, this does not mean just putting a link at the bottom in the footer.  That's something search engines can detect and know to discount, so it's actually not a solution.  It needs to be someplace where someone will actually click.  So you need a link in the primary navigation, or somewhere on the site, that goes to each product, and a way to get to every single product.  Even if there's a great filter in place that already accomplishes this, the SEOs don't care about the filter, because the search engines can't use it, and that's where their concern comes from.

So the argument often is "no one ever uses that," and in this case, you just have to do something that lets the search engine crawl the website like a human would, even if a human isn't actually doing it.

Then there's another way of hiding content that can cause indexation issues, and that's when you're delivering new content but the URL doesn't change.  In the past this would happen with frames or iframes or Flash.  All three of those aren't really used as often anymore, but content is still being delivered to pages without changing the URL, now through JavaScript or AJAX: great new technology creating great new opportunities in web design for interactive pages.  And basically the issue is this: you've got your great website with content, you click on something, new content comes in from the side or wherever, but you've got the same URL.

So in this case, the surefire way to keep SEOs away is to make sure that every page on your website can always be reached through clicks, so you can actually click from one page to another, without using a filter and without using a search.  And whenever the content changes, the URL should change as well.  This makes sense for indexation and search engines, but it also makes sense for sharing.  If the only way to get to a piece of content is by clicking on something within the page, that's really limiting to your site in general, and it's going to keep that dynamically loaded content from being shared on blogs or other social networks.

So let's move on to another group of big problems that can occur within your website and cause a lot of conflicts between SEOs and other web people.  Most people understand duplication, but basically it's this: it's when your site says the same thing as a bunch of other websites.  There are a couple of ways this can happen.  One of the more frequent ways is scraping content: taking other people's content using some sort of automated effort.  This is something that should be avoided, and generally is avoided, but it can cause direct penalties from the search engines, so it's something SEOs are always looking for.

The other thing is copying specific content.  Even if it’s just something like the about page or the legal page or something like a how-to page, it’s something that’s taken from another website and put on your own.  It should be avoided as well, ‘cause again, this is something that search engines specifically look for.

Another common thing that happens is pulling in feeds: product feeds, news feeds, anything where you're automatically publishing content from another provider.  This could almost be its own webinar, because there are lots of nuances, so I won't go into the details here.  I'll just say: contact an SEO.  This is a good opportunity to reach out to one and figure out the best way to do it.  There are lots of websites with a business model built around this type of strategy, and there are ways to do it that are okay, but there's not really a way to cover it all here, so I'm going to skip over that.

But scraping and copying content are still big deals, and here's why: they're such big deals that Google has actually launched a series of updates targeting these specific problems with duplicate content.  The updates were called the Panda updates, and they all target duplicated content: any case of websites copying other websites or pulling in content from other websites.  It's a huge deal, and here's the basic reasoning.  Right here, we've got the top four search results for this query.  The people who run Google know that these four sites are going to get about 66 percent of all the clicks, so most people are going to click on these four results.  What that means is that this is what Google is giving you: its best guess, its next best guess, its third and fourth, and the last one is sort of a longshot, 'cause it gets a tiny percentage of the clicks compared to the first ones.

If Google doesn't get this right, then the people who run it know that you're either gonna bail and do another search or, worse yet, go to the competitor, in this case Bing.  (And for Bing it's the reverse: you don't like what you get there, and you go to Google.)  So Google's not gonna show any result that contains stuff that's already found on other websites.  Search engines are only gonna show you new content for this reason.  They don't want you to bail on their search results; they want you to stay and trust them to deliver content that's actually useful.

Everything I've covered here, scraping, copying, and pulling in feeds, are all variations of pulling other people's content onto your site.  These are the things people are usually aware of when they think about duplication or duplicate content.  Another thing SEOs tend to look for, one that's not as often noticed by the people running the website, is when you're pushing your own content out to other websites, using some sort of syndication.  That's something that might cause alarm with SEOs, and something they might be looking for when they're working on your website.

And you might think that it's your content, so it's okay.  The issue is this: going back to those four results I was showing earlier, Google's only got four results to work with.  So if it's got one option of showing a site that has all your stuff on it, like your site, or an option of showing a site that has all your stuff plus more stuff, it's gonna go with option 2, because that's the more valuable search result for someone searching on their search engine.  So even if you own the content, it can still cause problems, and it's something you should try to avoid.

So the surefire way in this case to keep SEOs away is to just not let your content be on other websites.  But I understand that that's not always possible, so here are the backup plans if you're talking with an SEO and they have concerns about you pushing out content or, in some cases, pulling in content; these are the strategies you can already have in your tool belt when you start having those conversations.  First, unique descriptions: if you've got descriptions that you're pushing out to other websites, having descriptions on your own website that are different from those is a great option for keeping your site unique.

Exclusive articles: even if you push news articles or other content out to other websites, having stuff that can only be found on your website can be very helpful.  Unique category descriptions: even if all the individual product descriptions get pushed out to other websites, category pages that are unique to your site still count for something.  It's basically anything that adds value or makes your website a little bit different from all the other ones showing your stuff.

And one more thing: what can you do about people stealing your stuff?  The truth is, you can't really prevent it.  If you're programmatically pushing your content out, or programmatically pulling content in, that's a problem you can control.  But if your content just gets scraped or stolen every once in a while, there's nothing you can really do to prevent it; you just have to be aware of it.  Fixing it on a case-by-case level is a lot easier than fixing a duplication problem that's the result of some sort of programmatic effort.

So now that we've talked about uniqueness, we'll stay on a similar subject and go into canonicalization.  This is typically what causes the biggest headaches for SEOs, but it's also the most difficult thing for them to communicate to website owners, developers, and business owners, because it's sort of complicated and it's not immediately clear why it's a problem.  So I want to spend a few minutes going over what exactly canonicalization issues are.

So let's start with the difference between a canonicalization issue and a duplication issue.  Duplication is when the same content appears on different pages.  Canonicalization means the same page is appearing in different places.  I'll explain this with a little analogy.  Let's say you're on a dating site, the weekend arrives, and your first date shows up, and it's this guy.  That date ends.  Your next date starts the next weekend, and it's a different person, but they've got the same bag of tricks and they're saying the exact same things.  Date No. 3 shows up, and again it's a different person saying the exact same things.  This is a duplication problem.  You've got three different dudes, and they're all saying the exact same thing.

A canonicalization problem would be this: this guy shows up at your door; the next weekend, the same guy shows up at your door again; and the weekend after that, it's the same guy showing up yet again.  Even if the colors or the style are a little bit different, it's still the same thing showing up over and over.  This is a canonicalization problem: the same person shows up over and over and over again.

So how does this happen, and why should you worry about it?  SEOs are concerned about all of this because it can cause a bunch of serious problems for the website.  The biggest concern is crawl budget.  Google crawls every website online, and that takes a lot of time.  Because of that, they budget a certain amount of time to crawl your site in particular.  That means if Google spends all of its time crawling the same page over and over again, or variations of the same page, it might not have time to get to other pages.

Another issue is cannibalization.  If you have multiple versions of the same page, they're going to be competing with each other for the same position, and Google might not know which one to rank.  In that case, you're competing with yourself and cannibalizing your own search results.

The last thing is split authority.  This has to do with links or shares going to a certain page.  If there's a bunch of different versions of a page being linked to, you've got a bunch of different links going to a bunch of different URLs, when instead all those links could be going to the same page and giving it all the authority from all the links and shares out there.

So let's get down to how it happens and how you can avoid it.  The most common case is a canonicalization issue on the home page itself, and it's the first thing SEOs will typically look for when they come to a website: your home page shows up at one URL, and it also shows up at another, say with and without the www.  That's a canonicalization issue: your home page under two different URLs.

But these problems can happen, too: suppose your home page exists at a specific file like default.aspx, and maybe on top of that you don't have any capitalization handling on your server, so it shows up as both lowercase and uppercase.  Now you've got four different versions of your home page.  Now let's say you've got a parameter from the mobile site, so when you click a link from the mobile site to your home page, you get yet another URL.  That's one more URL delivering the exact same page, showing up differently as a canonicalization issue.

Now let's say you've got a secure version of the site, so you've got HTTPS and HTTP: that's all the previous versions, plus those same versions under HTTPS.  Those are even more versions of the home page, and this can kind of go on and on.  This is the most common issue.  It was frequently called URL normalization and is more often called URL canonicalization now, but either way, the concept is that you want to make sure each page can only show up at one URL.  So your home page will always show as www.YourSite.com and not any of those variations, and if someone types in one of those variations, ideally they get redirected to this version.

In Apache, you're going to do this in the .htaccess file, or in whatever equivalent configuration file your server takes.  Here are some resources that I'll put in the link bundle on how to do that; there are lots of different resources out there, and this is pretty straightforward.  In IIS, it's a little different: you go into the Internet Information Services (IIS) Manager and click on a few things in there.  But basically, what you're trying to do is ensure that your home page can only be indexed one way.
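As a minimal sketch of the Apache approach (assuming mod_rewrite is enabled, and with yoursite.com standing in for your own domain), the rule set looks something like this:

  # Force the www version of the domain with a permanent (301) redirect
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
  RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]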

The bonus here is that if you use a directory-level URL structure instead of a page-level URL structure, so a page exists at /page rather than /page.html, you can sometimes get a trailing slash in there that also causes a canonicalization issue.  URL normalization solves those problems as well, so it takes care of a lot of internal canonicalization issues.  This is the most common problem, and it's also the one with the most straightforward fix.

But here are the issues that can cause larger headaches for SEOs.  What causes some of the bigger conflicts between developers and SEOs are issues dealing with parameters, and one of the reasons why is that they're difficult to explain, and sometimes SEOs aren't exactly clear about why this is such a big concern and what they're looking for.  They happen like this: let's say you've got a page with a product on it, and you click on "medium," and it takes you to a variation of that same page with a little parameter up in the URL that says, "this is the page for the medium-sized version of this product."  Then you click on "green," and it does the same thing with another parameter saying, "this is medium and green."

Now you've got three different URLs, all for the exact same page.  And those aren't the only variations, of course.  Add more combinations of sizes and colors, and you've got many versions of the URL all describing the same page.

One of the main reasons this is such a big concern: let's say this particular product page gets popular and gets shared all over the place.  People start tweeting about it.  It gets on news sites, blogs, Google Plus, Facebook.  Everyone who goes to this page wants to share it.

What happens if you've got a canonicalization problem is that all these links are now pointed at different URLs.  So even though they're all for the same page, and people really are sharing the exact same thing, everyone's sharing a different version of it.  That means all the links and votes you're getting from other sites are going to different URLs, when instead they could all be going to the same one and giving that page all the authority to rank.  So that's one of the more common canonicalization problems.

The best way to keep SEOs away from parameter issues like this is to just avoid using parameters for variations of the same page.  The other option is to use a hash fragment (#).  So if something needs a variation based on color or the like, you can use a fragment to distinguish it.  Fragments are currently something the search engines ignore: they treat the result as the exact same page, so they won't spend time crawling different fragments over and over again, at least for now.  Since honoring this is the search engines' choice, it's something that can always be taken away.  So I recommend using it now; there might be better solutions in the future, but for now, if you need to distinguish the color, use a fragment, because all those variations will be attributed to the same page.
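Side by side, the two approaches look like this (hypothetical product URLs):

  http://www.yoursite.com/widget?color=green   (a second URL the crawler will index separately)
  http://www.yoursite.com/widget#color=green   (the same single URL as far as the crawler is concerned)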

So the only thing worse than a little parameter issue like that is a big parameter issue.  These are some of the worst things that can happen on a website, and this is where SEOs will really dig in their heels, and they've got a good reason.  Sometimes when you have a page like this, a product or a category with a couple of related links on it, maybe through some related products, you can end up with something like this:

You've got a link here to a related product.  Let's say you click on that related product link, and it adds a referrer parameter to the end of the URL saying where you came from.  Now let's say you click on another related product right here; it adds another referrer parameter.  You do that again, it adds another one.  You do that again, and now you've created what's called a spider trap.  This can be one of the biggest concerns, and this is why SEOs are so worried about parameters and might ask you to remove them or not build them into a website.  These can cause some of the bigger headaches.
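To make the trap concrete, here is how the URLs pile up after just a few clicks around three related products (hypothetical URLs; the parameter names don't matter):

  /product-a
  /product-b?ref=product-a
  /product-c?ref=product-a&ref2=product-b
  /product-a?ref=product-a&ref2=product-b&ref3=product-c

Every one of those is a "new" URL to the crawler, so it can keep crawling the same three pages forever.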

So what can you do about this?  The best solution is just to avoid using parameters for behavior tracking.  I realize that's built into a lot of CMSes, but there are much better ways to do it.  Google Analytics will let you do good behavior tracking without ever changing the URL, and other analytics platforms will let you do the same.  That should be the way you track behavior, not adding parameters.  Even if the parameters aren't causing a spider trap, they're still going to cause other canonicalization issues, because they resolve in different variations of the URL.
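As a sketch of the alternative (using the classic ga.js Google Analytics API; the category and label names here are made up), a click gets recorded as an event while the link keeps its one clean URL:

  <a href="/product-b"
     onclick="_gaq.push(['_trackEvent', 'related-products', 'click', 'product-b']);">
    Product B
  </a>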

Before I go any further: ads and e-mails are okay.  I understand that if you're doing an e-mail blast, or you've got AdWords or display ads, a lot of times those will tack on a tracking parameter; Google will tell you to use those.  Those are all okay.  The biggest thing you want to be sure of is that those URLs don't get indexed, so keep them in the ads and the e-mails and just don't link to them on your own website.  If they get linked on other websites, you can deal with those on a case-by-case basis, but don't use them on your own site, because they'll also screw up your numbers anyway.

The last solution for dealing with a canonicalization issue is the canonical tag.  The reason I didn't bring this up as a primary solution is that while it's great and very useful, it's not something you should actually depend on, and here's why.  In case you're not familiar with it, it's a great little tag that basically says, "Hey, I know the URL up there says this, but really this page is supposed to live right here."  If it's honored, it does transfer authority, and it can address serious canonicalization issues and help with a lot of problems on a website.
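The tag itself is a single line in the page's <head> (the URL here is an example):

  <link rel="canonical" href="http://www.yoursite.com/widget">

Every parameter-laden variation of the page carries that same tag, pointing each version back at the one true URL.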

But the problem is that it's far from perfect, and here's why: while it's a great little tag that helps a lot, in the end it's just a tag on your website, and Google is Google.  Google does not have to follow it, and sometimes it doesn't.  If Google finds a lot of problems on your website, or the canonical tag isn't correctly implemented, it will ignore it altogether.  So if that's your only defense against a canonicalization issue, you've got nothing if Google chooses to ignore it.

Another thing is that it doesn't prevent Google from crawling the same pages over and over and using up crawl budget, so it doesn't help you in that regard either.  That's why the canonical tag is not always the best solution.  You should absolutely use it, absolutely include it on the website, and it's a good thing for a lot of reasons, but it should be the backup plan to straightforward canonicalization: just keeping one consistent URL per page.

So if there's one thing you can learn from all this, about how to keep SEOs from bugging you with parameter issues, or raising canonicalization concerns, or saying the word "canonical" (because it is kind of hard to say), it's that you should just follow this rule: one page, one URL, always.  Anytime a page's content changes, it should have its own URL, and it should only have one URL.  So you might ask if that even applies to mobile, and the answer is yes.

And this is why SEOs love responsive design.  We did a webinar on responsive design; it was great.  There are lots of great things about it, it's very cool, and it opens up a whole world of possibilities, but SEOs really only care about one thing: responsive design always results in consistent URLs from page to page.  Each version of the website, no matter where it's showing up, always has the same URL.  So regardless of your opinions about responsive design, whether you love it or think it's just a passing trend, when the SEO is recommending it, it's only because they have this one thing in mind: it provides a consistent URL structure.

So if for some reason you don't want to do responsive design, you still want to make sure consistent URLs are implemented in whatever way you can, because that's what avoids mobile canonicalization issues, which are pretty common, and if you can avoid them, you always should.

So those are the three big problems.  So let’s go over what they were.  We talked about indexation, duplication, and canonicalization.  These are the three things that SEOs are looking for and so when they’re butting their heads into the day-to-day life of developers and designers and content people, it’s because these are the three things that they’re looking for, issues related to these.

So in essence, the way to avoid those conflicts with the SEO is just to make sure that when you put content on your site, put a new page, make sure that the content can always be found by a robot.  You can click to any content and it’s indexable.  Your content is unique.  It’s not found anywhere else on any other page.  And the content always displays under a single URL.  So regardless of how you get to it, any sort of variation of the actual page, it’s always displaying under one URL.

So is this everything?  Will this resolve all the conflicts?  The thing is, this will actually take care of a lot of them, because, like I've said over and over, these are the big technical things SEOs are looking for when they get into a website.  And they're great to handle up front, because preventing these problems is pretty simple.  If you do it during the planning stages, it involves a couple of conversations and then the implementation, which itself can be sort of complicated, but it's worth it, because fixing these problems later can take forever.  It involves diagnosis: identifying what the exact problems are and where they're occurring.  Then you've got to roll out the actual fixes and figure out how to make sure the site avoids these big pitfalls.  That always involves fixing or updating internal links, which, with a blog especially, can be a real big pain.

External links can be almost impossible: you're asking other websites to update their links so that your site stops being indexed in ways you don't want.  Then there's de-indexing content, asking Google to remove things from its index, and re-indexing other content.  These are big headaches that can take months and months.  If it ever gets to the point where you need content removed from Google or replaced with existing content, it can take forever.  And then there's a lot of begging Google for just one last chance, and whatever rain dance you need to do to make sure Google crawls your site again and indexes it the way you want.

So all these issues can be avoided a lot more easily than they can be fixed.  The best way to do that: instead of waiting until just before launch to call the SEO, bring the SEO in early on and get these big problems out of the way.  Because this is the big secret, and if you take away one thing from all of this, I hope it's this: if you take care of these issues, if you make sure the site is indexable, original, canonical, and shareable, SEOs don't really have much else to do during the design process.  That means they'll totally leave you alone.

You're allowed to do whatever you want.  Designers can design the website, developers can be left alone, and everyone can stick to what they do, without the SEO constantly bothering them about parameters, because all those issues were taken care of ahead of time.  That lets you do what you do best, and it lets the SEOs do what they like to do, which is all the stuff you typically hire them for in the first place: keywords, title tags, content, and redirects.  This is what we want to be doing and what we look forward to doing.

Building Successful Low Budget PPC: Keywords

In charge of building your first PPC account? Don’t have a lot of time or money to spend within AdWords? Well you came to the right place. We’re in week three of our six-part blog series in which Portent PPC Strategists Chad Kearns and Tim Johnson  lay down the knowledge on best practices for achieving PPC success. Follow along to pick up tips on how to build your first PPC account like a PPC superstar.

Post #1: Building Successful Low Budget PPC: Account Structure

Post #2: Building Successful Low Budget PPC: Understanding your Campaign Settings


If the account structure is the backbone of your AdWords account, then keywords are the blood running through its veins. Building a descriptive keyword list that closely represents your product will give life to your account, put your ads in front of a relevant audience, and reduce wasteful spending. You may ask yourself: Which keywords should I bid on? How much should I bid? What’s a match type?

In this post, we explain some techniques for building a keyword list that will keep your account alive and performing well.

Finding keywords

When building out your initial keyword list, you should try and put yourself in the shoes of your customers. I like to ask myself “What would my mom search for?” Qualified customers that you want to target are searching on Google because they have a specific problem or want. Your keywords should reflect that want in the same way that your product should be their solution.

There are many different tools out there to help you expand your keyword list from a few core keywords to a fully developed list. In AdWords, you can use the Keyword Tool:

[Screenshot: the Keyword Tool in the AdWords dropdown]

The Keyword Tool will take a list of existing keywords or your website URL and generate new related keyword ideas that you probably didn’t think of the first time brainstorming.

There are many other tools out there to find new keywords. A couple of my favorite research tools are Google search suggest and the “searches related to” tool. These are tools right in the Google search engine itself.

With Google search suggest you can type a keyword into Google and then simply take note of the more long-tail keywords that Google suggests to you. Chances are there are a few frequently searched terms right there for the taking.

[Screenshot: Google search suggest]

"Searches related to" can be found at the bottom of the search results page.

[Screenshot: "searches related to" at the bottom of the results page]

Match types

Now that you have an initial keyword list, you need to set appropriate match types for all of your new keywords. There are three main match types to select from: broad, phrase, and exact.

Broad match is the default for all keywords. When a keyword is set on broad match, AdWords will show your ads when someone searches for that keyword as well as some slight variations of that keyword.

[Screenshot: broad match example]

When a keyword is in phrase match, ads will only show when that exact phrase is searched, but additional words are allowed before or after the keyword (e.g. "buy steak online").

[Screenshot: phrase match example]

Exact match (e.g. [buy steak online]) limits your ads to showing only when the search query is exactly the same as the keyword, with no additional words.

There is no right or wrong time for each match type. In general, it is best to be as specific as possible. So utilize phrase and exact match when applicable to prevent gaining excess impressions from unqualified customers.
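Put side by side, the three notations from the examples above look like this:

  Broad:   buy steak online     (also matches slight variations of the keyword)
  Phrase:  "buy steak online"   (exact phrase, words allowed before or after)
  Exact:   [buy steak online]   (query must match the keyword exactly)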

After you upload your first list

Once you have your keywords uploaded into AdWords, there are a couple of other useful features in AdWords for finding additional keywords, missed keyword opportunities, and negative keywords.

Add Keywords tool & Search Query Reports

In AdWords under the Keywords tab, you can find new keyword suggestions by clicking the green Add Keywords button. After your keywords have run for a short period of time, you can click on Keyword details > All to find a list of search queries that have triggered an ad in the past. This is called the Search Query Report. This list of queries is a great resource for finding new keywords you should be bidding on, as well as negative keywords that are irrelevant and should be excluded.

Setting Bids

The final step is setting your bids. A great tool for deciding how much to bid initially is the AdWords Traffic Estimator.

[Screenshot: the Traffic Estimator tool]

With this tool you can enter in groups of keywords and AdWords will give you an estimate for how many clicks you can expect depending on your bid. You will get a graph that looks something like this:

[Screenshot: Traffic Estimator graph]

From this you can determine an appropriate starting bid for that keyword group.

Based on the keywords I entered to generate this graph, I would start my bidding near $1.50-$2.00, because that is the point where increased bids stop yielding higher click totals.

After you have strong keyword lists built and uploaded into your account, it’s time to start crafting your ads.

Watch out next Friday for our next post on best practices for building your first PPC account!

Feel free to ask any questions you may have in the comments.

3 Google Algorithms We Know About & 200 We Don’t

When I meet with clients or present at conferences, I am always asked: "How do I rank high on Google for (insert keyword-phrase-du-jour)?" I give the standard answer: "Only the search engineers at Google can tell you, and they aren't talking."

Inevitably, the questioner looks dejected, mutters a slur on my credentials, and walks away.  I scream silently in my head: “Don’t kill the messenger because we are all hapless Wile E. Coyotes chasing the Larry and Sergey Road Runner with zero chance of catching them, no matter what we order from ACME!”

Thirteen years ago, before the Cone of Silence dropped on Google’s method of operation, we got a glimpse of the method behind their madness. This, combined with the common knowledge of the foundational tenets of all search engines, gives us some idea of what’s going on behind that not-so-simple box on the white page.

In this post, I am going to explore the 3 algorithms that we know for sure Google is using to produce search results, and speculate about the 200+ other algorithms that we suspect they are using based on patent filings, reverse engineering, and the Ouija board.

What is an algorithm (you might ask)?

There are many definitions of algorithm. The National Institute of Standards and Technology defines an algorithm as “a computable set of steps to achieve a desired result.”  Ask a developer and they will tell you that an algorithm is “a set of instructions (procedures or functions) that is used to accomplish a certain task.” My favorite definition, and the one that I’m going with, comes from MIT’s Kevin Slavin’s TED Talk “How Algorithms Shape Our World”: algorithms are “math that computers use to decide stuff.”

3 Google algorithms we know about

PageRank

The most famous Google algorithm is PageRank, a pre-query value that has no relationship to the search query. In its infancy, the PageRank algorithm used links pointing to a page as an indication of its importance. Larry Page, after whom the algorithm is named, used the academic citation model, where papers citing another paper are endorsements of its authority. Strangely enough, academia does not have citation rings or citation-buying schemes the way the web has link schemes. Warning: scary, eye-bleeding computational math ahead.

[Image: the initial PageRank formula]
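The formula, as published in Page and Brin's original paper, is:

  PR(A) = (1 - d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)

where T_1 through T_n are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor, typically set to 0.85.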

To combat spam, a Random Surfer algorithm was added to PageRank. This algorithm "imagined" a random surfer that traveled the Web and followed the links on each page. Sometimes, however, the Random Surfer would arbitrarily, much like us thought-processing bipeds, keep going without returning to the original page, or would stop following links and "jump" to another page. The algorithm steps are:

  1. At any time t, surfer is on some page P
  2. At time t+1, the surfer follows an outlink at random
  3. Surfer ends up on some page Q (from page P)
  4. The process repeats indefinitely

That’s the benefit of algorithms, no overtime and they never get tired or bored.
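If you want to see those steps in action, here is a tiny simulation sketch in JavaScript; the three-page link graph and the 0.85 damping value are illustrative, not Google's. The fraction of time the surfer spends on each page approximates its PageRank:

  // Random Surfer: follow a random outlink with probability d,
  // otherwise "jump" to a random page. Visit counts approximate PageRank.
  const links = {
    home: ["about", "products"],
    about: ["home"],
    products: ["home", "about"],
  };
  const pages = Object.keys(links);
  const d = 0.85; // damping factor
  const visits = Object.fromEntries(pages.map((p) => [p, 0]));

  let current = "home";
  for (let t = 0; t < 1000000; t++) {
    visits[current]++;
    const out = links[current];
    current = Math.random() < d && out.length > 0
      ? out[Math.floor(Math.random() * out.length)]        // follow an outlink
      : pages[Math.floor(Math.random() * pages.length)];   // random jump
  }
  pages.forEach((p) => console.log(p, (visits[p] / 1000000).toFixed(3)));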

Hilltop Algorithm

The Surf's-Up-Dude algorithm worked for about 10 minutes before the SEO community found the hole in its wetsuit and used it to manipulate rankings. In the early 2000s, processors caught up to the computational mathematics, and Google was able to deploy the Hilltop algorithm (around 2001). This algorithm was the first introduction of semantic influence on search results, inasmuch as a machine can be trained to understand semantics.

Hilltop is like a linguistic Ponzi scheme that attributes quality to links based on the authority of the document pointing the link at the page. One of Hilltop's algorithms segments the web into a corpus of broad topics. If a document in a topic area has lots of links from unaffiliated experts within the same topic area, that document must be an authority. Links from authority documents carry more weight. Authority documents tend to link to other authorities on the same subject and to hubs: pages that have lots of links to documents on the same subject.

Topic-Sensitive PageRank

The Topic-Sensitive PageRank algorithm is a set of algorithms that take the semantic reasoning a few steps further. Ostensibly the algorithm uses the Open Directory ontology (dmoz.org) to sort documents by topic.

Another algorithm calculates a score for context sensitive relevance rank based on a set of “vectors”. These vectors represent the context of term use in a document, the context of the term used in the history of queries, and the context of previous use by the user as contained in the user profile.

So, I know what you’re thinking. How can they do that for the whole web? They don’t. They use predictive modeling algorithms to perform these operations on a representational subset of the web, collect the vectors, and apply the findings to all of the “nearest neighbors.”

D’oh!

[Added May 16, 2013]
There are a lot of algorithms for indexing, processing, and clustering documents that I left out, because including them would have many of you face-first in your cereal from boredom. However, it is NOT OK to leave out the mother of all information retrieval algorithms, TF-IDF, known affectionately to search geeks as Term Frequency-Inverse Document Frequency.

Introduced in the 1970s, this primary ranking algorithm uses a term's presence, number of occurrences, and locations of occurrence to produce a statistical weight for the importance of that term in the document. It includes a normalization feature to prevent long, boring documents from taking up residence in search results due to the sheer nature of their girth. This is my favorite algorithm, because it supports Woody Allen's maxim that 80% of success is showing up.
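In its simplest textbook form, the weight for term t in document d is:

  \mathrm{tfidf}(t, d) = \mathrm{tf}(t, d) \times \log\frac{N}{\mathrm{df}(t)}

where tf(t, d) is the number of times t occurs in d, N is the number of documents in the collection, and df(t) is the number of documents containing t. Real systems layer smoothing and length normalization on top, which is the anti-girth feature mentioned above.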

The 200+ we don’t know about

All of the search engines closely guard their complete algorithm structure for ranking documents. However, we live in a wonderful country that has patent protection for ideas. These patents provide insight into Google’s thinking and you can usually pinpoint which ones are deployed.

Panda, the most famous update, is an evolving set of algorithms that are combined to determine the quality of the content and user experience on a particular website. Among them are algorithms that apply decision trees to large data sets of user behavior.

These decision trees look at if-this/then-that rules:

  • If the page crosses a certain threshold ratio of images to text, then it is a poor user experience.
  • If a significant portion of searchers do not engage with anything on the page (links, navigation, interaction points), then the page is not interesting for searchers using that particular query.
  • If the searchers do not select the page from the results set, then it is not relevant to the query.
  • If the searcher returns to the search engine results to select another result or refine the search, then the content was not relevant and not a good user experience.

Complementing the decision trees could be any one of a number of page layout algorithms that determine the number and placement of images on a page relative to the amount of content and to a searcher’s focus of attention.
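Nobody outside Google knows the real features or thresholds, but the if-this/then-that shape is easy to picture. Every field name and number in this sketch is invented.

    # Toy Panda-style quality rules; thresholds are invented, not Google's.
    def quality_strikes(page):
        strikes = []
        if page["image_to_text_ratio"] > 3.0:
            strikes.append("too many images for the amount of text")
        if page["engagement_rate"] < 0.05:
            strikes.append("searchers rarely interact with the page")
        if page["serp_click_rate"] < 0.01:
            strikes.append("searchers don't pick it from the results")
        if page["bounce_back_rate"] > 0.80:
            strikes.append("searchers go back and pick another result")
        return strikes

    page = {"image_to_text_ratio": 4.2, "engagement_rate": 0.02,
            "serp_click_rate": 0.005, "bounce_back_rate": 0.9}
    print(quality_strikes(page))  # each tripped rule counts against the page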

Following on the heels of Panda are the Penguin algorithms. These algorithms are specifically targeted at detecting and removing web spam. They use Google’s vast data resources to evaluate the quality of links pointing to a site: the rate of link acquisition, each link source’s relationship to the page subject, shared domain ownership of the linking sites, and relationships between the linking sites.

Once a site passes an established threshold, another algorithm likely flags the site for additional review by a human evaluator or automatically re-ranks the page so that it drops in search results.
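Again, this is a guess at the shape of the thing, not the thing itself: a toy link-profile scorer whose features and threshold are invented.

    # Toy Penguin-style link-profile check; every number here is made up.
    def spam_score(profile):
        score = min(profile["links_per_day"] / 100, 1.0)  # unnatural acquisition rate
        score += profile["offtopic_fraction"]             # links from unrelated subjects
        score += profile["shared_ownership_fraction"]     # linking sites under one owner
        return score / 3                                  # normalize to 0..1

    profile = {"links_per_day": 250, "offtopic_fraction": 0.7,
               "shared_ownership_fraction": 0.6}
    if spam_score(profile) > 0.5:                         # invented threshold
        print("flag for human review or demote in results")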

Let’s stop, guess, and go with what we know

As with the formula for Coca-Cola or the recipe for Colonel Sanders’ Kentucky Fried Chicken, specifics on what Google uses to decide who gets where in the search results set are a closely guarded secret. Instead of speculating on what we might know, let’s focus on what we do know:

  • In order to rank for a term, that term must be present in the document. Sometimes, a contextual or semantic match for a term will get you into the SERP swimsuit competition for placement. Don’t count on that, though.
  • Being picky and realistic about what you want to rank for is the best start.
  • Text on the page drives inclusion in the results for a searcher’s query. Be about one thing instead of many things.
  • Quality content is focused, fresh and engaging.
  • Custom titles that describe the page using contextually relevant keywords are THE low-hanging fruit. Pick it for your most important pages.
  • Compelling description text in search results will draw clicks. Meeting the searcher’s expectations with rich content will keep them on the page.
  • Pictures, images, and ads are most effective when used in moderation.
  • Links are important, but only the right kind. For Google, the “right” kinds are links from pages that are about the same subject and place high in contextually related searches.

Are there any major algorithms we missed?  Let us know in the comments.

5 SEO Strategies We Swear Aren’t Going Anywhere http://www.eigene-homepage-erstellen.net/blog/seo/5-seo-strategies-we-swear-arent-going-anywhere.htm Wed, 01 May 2013 14:00:52 +0000

Hand Graphic of Scout's Honor

It seems like every other day, some SEO technique that used to be accepted is now being devalued or, even worse, penalized. (Remember when meta keywords and nofollow tags were totally legit?  Ah, the good old days…)  And now Google is threatening to crack down on two staples of the SEO stable: anchor text and infographics.

With all of these changes it can be hard for businesses to know which search strategies are long-term and future-proof.

Business: “Sure I could invest a bunch of time and resources into this new strategy the weird search person is suggesting, but how do I know it won’t just change next year?”

SEO: “Err…”

Well, as much as things change, there are a few basic guidelines that are here to stay. And while I’m not going to go into details on every nuance, as long as your SEO efforts keep to these general strategies, you should be in good shape (at least for a few years).

1. Page speed and efficiency

As a general rule, fast things are better, and in this case, “better” means rankings. Google and Bing have been saying this for a while now: page speed counts. Google even made a cool little tool for everyone to measure their website’s page speed, and it gives instructions on how to improve it. Why would they do that if they didn’t consider it important?

dog sticking head out car window
Image optimization, javascript and CSS consolidation, minification, caching, compression – the list goes on. We’ve given a few pointers in past posts, but the message is that improving your site’s page speed is a long term strategy.

Why is it so important? Because it’s one of the main things real people look at when they decide whether or not to use a site. Amazon has stated that every tenth of a second of increased load time resulted in a 1% drop in sales. That means people really, really care about this. And if people really, really care, you know all the little Oompa Loompas and elves working at Google and Bing are trying to make their magic robots care as well.
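If you want to feel what a tenth of a second means, a rough standard-library timing check is below; the URL is a placeholder, and Google’s PageSpeed tool remains the real diagnostic.

    # Rough-and-ready load-time check; swap in your own page's URL.
    import time
    from urllib.request import urlopen

    start = time.time()
    urlopen("http://www.example.com").read()
    print("fetched in {:.2f} seconds".format(time.time() - start))
    # By Amazon's stated numbers, each extra 0.1s costs roughly 1% of sales.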

2. Fixing duplicate anything

Duplication happens. Title tags get repeated, URLs accidentally get indexed with parameters, content gets scraped, mobile sites get indexed separately – it sucks. Like Oompa Loompas, the ways in which content can get duplicated go on forever.

Oompa Loompas with query text.

Oompa Loompas are a result of a dynamic query parameter on a self referencing link.

The search engines are getting better at identifying duplicate content. But a search engine robot being 99.9% sure that your page is the one that should rank out of the 15 others indexed is still not as good as the robot being 100% sure because it only has one page to choose from.

The best defense with duplicate content is to avoid it altogether:

  • Don’t use parameters in places where they’ll get indexed.
  • Use consistent URLs for both mobile and desktop versions of your site.
  • If you can’t fix a wrong link, use 301 redirects.
  • If you can’t get rid of duplicate content and you can’t redirect, then use tags like rel=canonical or the noindex meta tag.
  • If you can’t do that, then just fight it any way you can.

3. Resource and instructional content

Updates like Penguin have got everyone freaking out. “What makes a page good?” “What makes a page spammy?” “What makes a link good?” “Does Google like me or like like me?”

Well, here’s the thing – if you have a legitimately useful page for something, it will always be considered good. So, if you sell Frisbees, write a page about how to throw a Frisbee. If you’re trying to get more Renaissance Fair enthusiasts to visit your site, write something on how to care for jousting armor.

This goes back to Google’s humble beginnings as an indexing engine for academic documents.

In academia, if you reference something, you’re doing it because it’s something that you found useful when writing your dissertation on Bigfoot or Jetpacks (or whatever it is smart people write about).

This is how search engines wish you used your links. Lots of SEOs complain about Wikipedia always being in the number 1 spot, but few can argue that it isn’t the most relevant result for most searches.

I realize not everyone can be Wikipedia, but as long as people are linking to you because you’re useful, you will be in good shape.

How do you do this?

  • Create resource pages about your industry.
  • Create some data oriented blog posts.
  • Make an instructional page about how to use your product or a related product.

This is how links were always intended to be used, and it’s why they became a ranking factor in the first place. These pages are naturally good content, and links to these pages will tend to be good links.

4. Good site structure

Should I link to every page from the homepage or just the main ones? Should I repeat everything in the footer or should I cut the footer all together? Can I hide the homepage text?

Good site structure typically falls more into the UI/UX category, but it’s an SEO concern as well. You see, when you design a site with the intention of getting your user or shopper to the right page or giving them the information they’re looking for, you’re naturally creating a page that does the same for the search engines.

Confusingly translated instructions.

Search engines are complicated because the elves are trying very hard to make them emulate human factors when viewing a website.

A page with lots of links is confusing. It implies that you consider them all equally important. If your homepage has only a few links, on the other hand, it looks like you really care about those pages.

Using this sort of user-oriented thought process is a future-proof strategy to predict what search engines will care about within your site. Sure, you still have to help the robots with filters and search boxes, but they are very good at finding links.

So remember to:

  • Show off the links you want to show off.
  • Use page hierarchy to group internal pages into categories and subcategories.
  • Link to related products. These really make sense to your users and naturally lay out the relationship to the search engines.

5. Anything local

The bottom line for businesses and consultants is this: People who are actually looking for products aren’t changing their settings in Google or using a proxy to see the universal search results. They are clicking on those search results with the little letters next to them.

If searching for any of your keywords displays a local search result, you need to spend time on local. And if people can walk into a storefront, you really need to care about local.

Screen cap of costume stores Google search.

Local search isn’t going anywhere; in fact it’s getting more popular. On mobile devices it pretty much dominates the search results. So if you have one store or one thousand stores, you need to spend some time in Google+ Local and Bing Local.

You need to:

  • Claim your listings, check your NAP, and monitor your reviews.
  • Create storefront pages and make sure they are associated with your local listings.
  • If you don’t have a storefront, make some pages that talk about the area you serve.

Is that all?

No! This doesn’t mean you shouldn’t worry about any of the other things. Don’t ignore a social network today just because it might not be around in 10 years. If Google comes out with a hot new tag, then you should absolutely use it, even if it may be ignored a few months later. Infographics still work!

But if you have limited resources, these guidelines can help you evaluate whether something is worth investing a ton of time and money in or if there is something more effective you could be working on.

Disagree? Totally agree? Not sure what level of agreement you’re feeling? Leave a comment below and share your thoughts!

Why You Should Market Like You (Want to) Cook http://www.eigene-homepage-erstellen.net/blog/internet-marketing/why-you-should-market-like-you-want-to-cook.htm Thu, 11 Apr 2013 14:00:27 +0000

Portent Marketing Recipe

Great marketing should be mouthwatering, scrumptious, and satisfying. Don’t you want customers pining for your next email, devouring every bit of information, and anxiously awaiting more?

We don’t always think about marketing in terms of a delicious meal, but restaurants and marketers ultimately have the same goals. Give customers a taste of something yummy, satisfy their appetites, and keep them coming back for more.

Cooks have the advantage because they directly appeal to one of the most basic human needs (nourishment) and there are few things—if any—more satisfying than the perfect combination of dishes (okay, maybe the perfect combination of dishes and beverages). For me, few things beat a perfectly seared medium-rare fatty ribeye steak with a bold juicy red wine.

If we start thinking about the reactions we evoke from customers in the same way that culinary masterminds do, we can appeal more to their basic needs and desires, achieving maximum satisfaction for them while delivering profitability to us.

Here are five ways we can apply the secrets of successful kitchens and gastronomy to our marketing plans.

1. Be creative

Both marketing and cooking encourage and allow endless creativity and innovation. This is the magical allure of each.

But do not abuse the creativity by straying too far from your core competency. Find ways to innovate and expand your audience from within your brand voice and image.

If you’re looking for a new interpretation of the s’mores sandwich—of course, a waste of time because it’s perfect—don’t try to make it more savory with a garlic aioli. When sprucing up a traditional French dish, a Tex-Mex flair might not be the most cohesive approach.

On the other hand, lemongrass matzo ball soup (it even sounds delicious) is a brilliant and delightful play on the traditional dish.

Lemongrass Matzo Ball Soup

Courtesy of www.bonappetit.com

That’s what you want to be.

In the marketing world if you envision creating a viral video, a piece of highly shareable content, or something to engage a new audience, think about what’s on your menu. How can you take what you already do well, and elevate it by adding a new dimension?

Starbucks’ #spreadthecheer hashtag on Twitter this past fall is a great example of a discombobulated marketing flavor profile. Yes, it’s a thing. This unfortunately coincided with widespread public criticism of Starbucks’ labor practices and questionable corporate tax policy. Starbucks did not consider how all of the elements would mesh, resulting in the equivalent of a lemon-curd mocha. Cheer was not spread.

Instead, take a lesson from something brilliantly simple like the Extreme Diet Coke & Mentos Experiment. A perfect balance of two things many people like—Diet Coke and Mentos—combined with a few things everyone loves—a sweet bass line, theatrics, and synchronized fountains—yields over 16 million views. That’s tasty!

2. Time it right

Whether cooking a delicious meal or brainstorming marketing ideas, keen awareness of your surroundings is imperative. Both require multi-tasking. When you know what’s happening, new opportunities arise.

Experienced and passionate cooks know that for optimal flavor, it’s best to use the freshest local produce. Ideally, produce should not have a long commute to your table. When selecting vegetables to cook in the midst of a Washington winter (Portent is located in Seattle), discerning chefs try to stay away from corn—it couldn’t have come from anywhere local because it’s primarily a summer vegetable. Instead, one might opt for kale or apropos snow peas because they are winter vegetables more likely to be sourced closer to home.

Whether taking advantage of the best seasonal produce or a trending topic for your marketing plan, utilize convenience and existing opportunities whenever possible.

If I had planned this blog post better, I could have coordinated it with the “Top Chef: Seattle” season finale last month. (There would have been a lot of searches around “top chef.”)

Check the calendar, look for upcoming holidays, and identify potential opportunities to coordinate with events that already attract a large audience, like the Super Bowl. How about a campaign for donating a percentage of your April revenues to a charity supporting Earth Day on April 22? Do you have a grand idea to coincide with the increasing ads for products and gifts targeted towards Mother’s Day on May 12?

3. Practice good management

Gordon Ramsay has a very distinct approach to managing his personnel and TV show participants: bitch, yell, intimidate, and then deliver just the right amount of sincere praise (when it’s due)—all of it in an entertainingly muddy British accent.

 

Iron Chef Morimoto is more demure and quiet, with a head-down approach. (When you make ridiculously innovative and incredible food for that long, people drop everything to listen at a second’s notice.)

The chefs have different styles of communication, but they both balance all of the moving parts of their kitchen with impeccable timing. They know every ingredient and item their sous chefs prepare. This doesn’t come off the cuff; it’s strategically organized so that everyone knows their role.

First choose a solid team able to execute your general marketing idea. Consider the theme of your campaign: what is the flavor profile? When that’s solidified, determine a specific timeline for completion of each piece and decide if that piece is a dish served independently or assembled into a grander meal.

If the campaign involves an aspect of outreach (which it likely should), assign someone to research relevant parties, organizations, and people to contact in advance. But remember: asking people to help or participate never bears fruit immediately. Account for an estimated time frame to hear back from contacts and build it into the deadline for completing outreach.

Are the proper discussion points, graphics, and videos prepared for your social media team to deliver on Facebook and Twitter? Perfect timing requires the timeline and all supporting materials to be ready well in advance.

When cooking, you must ensure the excellence of every ingredient. In business and marketing, if each piece of the campaign isn’t quality by itself, it will only bring down the entire dish. Realistically, things happen on the fly all of the time, but it’s always best to account for error and create as much time for review as possible.

This boils down to another element of timing: if the copy and personnel are in place to contact potential supporters, but the graphics and media aren’t ready, it can’t happen. Everything must be ready at the right time.

4. Adjust as needed

When steak runs out and the kitchen has to 86 (cancel) the dish, great chefs in nimble kitchens see an opportunity to create a new nightly special by using excess available ingredients from other menu items.

Marketers can adjust the same way. Say a customer writes a scathing review on your Facebook wall. Rather than ignoring it or responding defensively, use the situation as an opportunity to showcase stellar customer service: respond to it directly and apologetically, offering multiple solutions to fix the problem, replace the product, or offer a free trial. This turns what could be a blemish on your profile into a demonstration of high-quality, genuine customer service.

Or maybe you didn’t sell all of your discounted flash sale products, leaving you with excess inventory. Instead of leaving it to collect dust in your online store, use each item as part of an advertised giveaway to collect email addresses or increase your social following. Be creative to continue getting value from that inventory.

5. Present it tastefully

Chefs taste as they cook to ensure that all components of the dish and meal (including its marketability) are on point. That means tasty, unique, and cohesive.

The final product is only as good as the sum of its parts, or only as good as the weakest link, or insert other overused business cliché here. (These sayings are overused for a reason.)

Quality control at each step is essential. That doesn’t mean micro-managing and smothering creativity, but there must be adequate time to check every element.

When delivering a dish or final product, make it attractive. People eat with their eyes and their nose before ever putting anything in their mouth. It’s a mélange of the senses building to the most pleasurable and successful experience possible. That is the crux of both brilliant cuisine and marketing.

We don’t want our customers to just “go through the motions” by clicking on our advertisements or entering an email address—both are the equivalent of eating a bland meal.  We want them overwhelmed with excitement to get more information, join our community, or sign up for a trial.

This pork marketing campaign is a great example. A display ad showing an enticing image appears on a cooking site—great initial steps with effective audience targeting and appealing creative.

Caribbean Pork Chops Image

Courtesy of www.porkbeinspired.com

Click on the ad and it takes me directly to the website showing: a great slogan (entirely echoing my sentiments that pork makes everything better), beautiful images of delicious pork goodness, links to a variety of recipes thus broadening appeal (people love options), an easy recipe video, and a call to action inviting users to share recipes.

This isn’t perfect and it doesn’t apply to all forms of marketing, but in essence this contains many of the core concepts we want our campaigns to echo.

  • Targeted audience
  • Enticing creativity
  • Engaging content
  • Succinct and appropriate call to action

These aren’t brand new concepts to us as marketers, but next time you’re planning a marketing campaign, or even a one-off email, try a new approach.

Think about your favorite restaurant (your mother’s kitchen counts) and the feelings you experience there. It can be anything from a dimly lit romantic setting to childhood nostalgia.  Then ask yourself: what do you experience when eating the meal? Can you imagine the presentation, recall the smell and excitement? At the end of the meal do you feel entirely satisfied or crave more?

Now, change your approach to best evoke these feelings and reactions from your customers to get them raving and coming back with their friends.

Please comment, share your experiences and passion for cooking and marketing. Do you know a restaurant or organization that does both well? Can you think of a cooking metaphor I didn’t already abuse?

Why Web Professionals Hate SEOs http://www.eigene-homepage-erstellen.net/blog/seo/why-web-professionals-hate-seos.htm Tue, 05 Mar 2013 14:00:25 +0000


Before I even became an SEO I learned to hate SEOs. After all, SEOs would take the beautiful, functional site the designers and developers spent their nights creating and cram it full of footer links, anchor text, H tags, nofollows, and any other piece of code they could think of until the site couldn’t breathe.

Next, SEOs would spend weeks plastering the website’s address anywhere they could; clogging up blogs, forums, and message boards. SEOs would then have their developers try new tricks and tactics to serve secret content and hide text in different places.

The brave new world of SEO

Then, one day all of that changed. Google cracked down on link spam. Bad practices were devalued. We could add fonts and functionality to sites in ways that didn’t hide the content. Designers, developers, and SEOs everywhere rejoiced, new friendships blossomed, and we all marched hand-in-hand to this new era of web design.

Except not really. In fact, for many web professionals, the old views about SEOs, what we do, and why we do the terrible things we do are just as they always have been. Most SEOs are familiar with developers who don’t take them seriously or designers who won’t take their calls. So what gives?

Common complaints about SEOs

To find out, I decided to ask web designers, developers, and managers all over what they thought about SEO. I got 7 responses. Well, 8 if you count one response from a message board: “SEOs are spammers. There’s your quote.”

But let’s get to it. What’s going on here? Well, it turns out there are lots of things SEOs are still doing wrong that we need to fix if we want others to take us seriously.

SEOs don’t get the big picture

“The priorities should just be dependent on the goals of the project. Sometimes SEO is the prime objective, sometimes it’s something else.” Rick Murphy, Web Designer, Hardly Code

“It’s very rare to find an SEO group who thinks about overall site experience, as opposed to magnifying the attractiveness of single screens. This myopic view is usually at the detriment of context or user interaction. It’s rare to find SEO teams that consider brand or experience as a factor in what they’re trying to accomplish.” Andrew Heaton, Web Designer, Revinity

SEO is an important part of the site. I mean without SEO no one will ever see your site, right? Well, here’s a hard truth for all SEOs:  Sometimes SEO is not the most important thing.

Are you OK? Things like making sure the cart works or incorporating the brand into the site often are, in fact, higher priorities. It’s true. We can huff and puff about how they don’t get it and how they need us, but that’s probably not going to help our likability.

Designers and developers have a million things to take into account when building a site, making our pleas to give all the images alt attributes seem like just another seagull squawking among a whole flock of irritating Internet marketers.

SEO involves a lot of different things, and we need to be able to distinguish between a vital problem that will block the entire site and a minor change that will help a single page rank. This way, our fellow webheads can prioritize our suggestions among the countless other things they need to do to make the site work, which brings me to our next problem:

SEOs don’t get what’s involved with the implementation

“As a product manager, I devote my energy to trying to come up with ways to improve user experience. With a complex website and a diverse audience, nailing down the right new feature is a daunting task. However, the experience of seeing that feature come to life and the anticipation of getting real user feedback is thrilling; that is, until I’m forced to roll it back on day 2 because—gasp!—we’ve experienced a rankings change.” Brittany McCullough, Program Manager, Guide to Online Schools

“Custom CMS builds mean sites are all different. Changing something may take 10 seconds or 2 hours; it all depends on how familiar I am with the CMS being used, and how well the templates were originally developed.” RJ LaCount, Web Designer, RJLaCount.com

A lot of SEOs aren’t designers. Some of us aren’t the world’s greatest coders either. Very few of us are programmers. So when we tell our clients to “consolidate images into CSS sprites” or “use more keyword-rich URLs,” how many of us actually know what that involves? What CMS is the site built on? How many of the images can actually be merged into a sprite? Tiled background images can’t be used in CSS sprites, but all of you already knew that, right?

When you casually tell another web professional to do something that requires a complete overhaul of the site, all you are doing is telling them that you don’t actually know what you’re talking about. How receptive do you think the developer will be the next time you give them a recommendation, especially one that has less to do with their immediate skillset? Some of the more vague recommendations we give, in turn, lead to the next problem:

Unclear justifications

“For our clients who may or may not be the most tech-savvy people on the planet, listening to us (the computer geeks) explain SEO to them, even in the simplest terms possible, may still feel like they are sitting in a classroom being taught Japanese. Over the years I have gained quite a few analogies (thankfully) but still find this to be the most difficult part. A client that doesn’t understand how something could help their business is not going to buy a product, thus making client understanding a difficult but essential hurdle.” Danielle Nyhof, Web Interaction Designer, DK Designs

“Context, it seems, is the great missing variable no SEO group wants muddling up their equations: too squishy, too volatile, too hard to quantify.” Andrew Heaton, Web Designer, Revinity

Most of the SEOs I’ve met are pretty good talkers. We can talk about page authority, crawlers, link juice, canonicalization, C-blocks, trust flow, nofollowed links, and cross-domain snippet rank indexation (did I make that up?) for days and days and days. So what do all of those words and phrases have in common? They don’t actually mean anything. Seriously.

If you’re an SEO then they might make sense (though I would argue that we are overusing them) but to everyone else they’re gibberish. This makes it sound like a) You’re explaining something no one can possibly understand or b) You’re making it up. Designers, developers, managers, and other professionals are usually smart people. So they go with b. In fact, there is one very specific thing that all the web professionals have no problem calling us on:

We don’t know the algorithm either

“Search engines have the ability to change their algorithms at any time. I already have to devote a lot of my time to keeping up on development changes.” Chris McGrath, Web Application Developer, ChrisMcGrath.net

“They’re just as in the dark about what Google is doing next as anyone else, and building your strategy around certain tricks can sometimes backfire or have no effect at all.” Anonymous Web Designer

“Is this rankings change random? Maybe. Will it even last? Maybe not. Have we given the users a chance to give us feedback via their interactions? Definitely not.” Brittany McCullough, Program Manager, Guide to Online Schools

This is how this line of reasoning works.

SEO: You need us because the algorithm is always changing.

Experienced Web Professional: But you don’t know what the algorithm is, either.

SEO: …

Congratulations! You’ve not only killed your own credibility with your client, but mine as well. And all the other SEOs out there, too. We need to stop using “the algorithm” as a reason to do anything. We are helping people find websites. Yes, the algorithm is part of the process, but it’s not the reason. And as much as we might like to think we’re John Connor using his Atari Portfolio to hack into Google, we’re not. So we need to stop using this to make us sound more mysterious.

SEOs do real things. We identify traffic opportunities. We fix technical problems. We give content ideas. We don’t need to hide behind the facade of an advanced computer program we’re predicting because most of us aren’t doing that. Most are doing actual SEO work (see above). Another problem, though, is that some of us aren’t:

Some of us are still giving outdated advice

“I have read quite a few articles online from mom-and-pop SEOers (as I like to call them) who still believe in keyword stuffing. These individuals are the only negative experiences I have had recently, as they are marketing themselves as knowledgeable SEOers and are filling the population of our potential clients with incorrect information and wildly high expectations.” Danielle Nyhof, Web Interaction Designer, DK Designs

“There’s no shortage of amateurs with a high profile in any field, and SEO is certainly no exception. We’ve all encountered the ‘SEO expert’ who’s still working with methods and mentalities from years past, still clutching that last keyword-choked meta tag as if it were a drop of virgin’s blood, a Gríma Wormtongue trying to sway the client with promises of free traffic and that top spot on the first page of Google results.” Andrew Heaton, Web Designer, Revinity

Whether it’s out of laziness or stubbornness, people are still giving bad advice. Keyword stuffing, comment spamming, and footer links are all classic examples. Though Penguin, Panda and the other updates have definitely cut down on the bad advice being given these days, the bad aftertaste still lingers. Remember, Penguin is only 10 months old. That means for many web professionals, the last SEO they worked with could have been giving them all the same, spammy advice that they were using for years.

So what can SEOs do?

Now that we’ve identified some of the popular reasons why other web pros avoid us in the cafeteria, what’s the solution? Well, all of these complaints center on two things: knowledge and communication.

Know your stuff from top to bottom. Be able to explain how to implement page speed suggestions. Understand how the URLs are being generated. Be able to explain how to get text on the page without affecting the design. Familiarize yourself with how fonts work and how to position text. Brush up on trivia. For example, most SEOs know dashes are better than underscores. But can you explain exactly why? (Hint)

Then we need to be able to talk about it. Practice talking about how a site ranks without using SEO terms like link juice and domain authority. Be able to explain how the SEO process works at both a fundamental level and at a technical level.

But most of all, we need to stop trying to trick people and get better at what we actually do. We do SEO and it’s a real thing. Because frankly, we were never really tricking anyone other than ourselves.

Into The Fold: Why Web Design is More than One “Rule” http://www.eigene-homepage-erstellen.net/blog/design-dev/into-the-fold-design-rules.htm Thu, 24 Jan 2013 14:00:10 +0000

Pick Up The Paper Graphic

If you want to call yourself a web designer these days, you’d better be ready to keep up. Daily study is required or you’ll be left in the dust.

I follow countless blogs, listen to podcasts during my commute, and work on random projects at home just to try out new things.

When I start talking about design, my friends’ eyes gloss over and they suddenly need to go home – but what can I say? I’m really into this stuff.

That being said, I have a confession that might make the modern web design community cringe a bit: I care about the fold.

The what?

“The Fold” is a dated concept originally stemming from printed newspapers. If you wanted your headline to pop, it needed to be no more than halfway down the page: the point where newspapers were “folded” for display.

When digital media burst onto the scene, the average screen resolution was 1024×768 (or if you lived on the bleeding edge, 1280×1024). Consequently, designers determined that the safe digital equivalent of “the fold” was 600px from the top of the page. If your content was below this point, it was less likely to be seen.

Now, things are a bit different.

eigene-homepage-erstellen.net visits: Comparing screen resolutions in 2008 and 2013

In the future, that chart’s only going to become more fragmented. We’re already browsing the web on our phones, tablets and TVs. Pretty soon we’ll be browsing the web on our bathroom mirrors.

So, the fold is dead, right?

A few years ago, I was riding high on the “Fold Is Dead” bandwagon, preaching about how “users these days know how to scroll!”

But working at an Internet marketing agency that deals in real stats and analytics knocks you off that bandwagon real quick.

It’s still true: content above the fold gets looked at more than anything else.

But a couple of other things are also true:

  1. Users are comfortable scrolling (especially on small / mobile screens).
  2. The traditional approach of defining a quantifiable pixel number for “the fold” is no longer relevant.

The fold is not dead… but you also shouldn’t be trying to force everything important above some arbitrary line.

Alright, I’m pulling my hair out now. What should I do?

Calm down, for starters.

The purpose of defining the fold was never to cram every possible thing above it. Because if everything is important, guess what? Nothing’s important.

Let’s go back to fundamentals: the whole purpose of considering the fold is to entice your audience to continue reading or, in the context of the web, to scroll. Take a look at one of my favorite examples:

 

Notice how there’s almost nothing above the fold on this page, yet it screams “scroll down and continue reading.”

Whether it’s the headline suggesting there is something interesting below or the giant glacier submerged just below the cutoff, you’re compelled to read more. Additionally, simple design and large typography make it less intimidating for the user to dive down into more content. This is how you use the fold.

What I’m hearing is that I should stop trying so hard…

Exactly.

Perfection (in design) is achieved not when there is nothing more to add, but rather when there is nothing more to take away. – Antoine de Saint-Exupéry

I love the web’s current explosion of screen resolutions and form factors. It’s whipping a wicked curveball right to the high inside corner of all these “rules” – rules that have evolved over time into completely skewed versions of what they were originally intended to be.

Some of your “above the fold” content is inevitably going to be cut off.

The resolution of screens viewing the web is constantly improving. No one can predict what your site may look like on some Internet-capable washing machine or car dashboard of the future, so don’t blindly obey hard and fast design “rules” without question. The fold is a guideline, not a mandate.

If a user starts scrolling down your main page, the fold has accomplished its goal. Now all you have to worry about is the rest of the site! (No pressure or anything.)

Have you come across any great examples of sites using the fold correctly? Let us know in the comments.

Get All Your Questions about Local SEO Answered! http://www.eigene-homepage-erstellen.net/blog/seo/local-seo-q-and-a.htm Tue, 18 Sep 2012 13:00:10 +0000

Fresh Local SEO

Last week we announced our new referral program for our PPC Essentials Package. There was a blog post about it. The world changed. We decided we really liked that feeling and wanted to hold onto it a little longer. So this week we’re announcing a special for our recently revamped and reworked Local SEO Package. For a limited time, we’re taking $500 off our normal price and offering our Local SEO Package for a one-time cost of $1,500.

To celebrate, we’re launching the Portent Local Series! That means for the next 2 weeks, every Tuesday and Thursday we’re going to be writing a local-specific post, all about local search and search for small and local business owners.

Portent’s Local SEO Q&A

Kicking it off, we’re throwing an anything-goes Local SEO Q&A where we’ll answer all your questions about Local SEO. Want to know where your reviews go? Which directories you should pay attention to? What Google’s calling its local service these days? Post a question in the comments and we’ll collect them all for a blog post on October 2nd. We’ll also be digging through our microfilm records to find old blog posts and grab those questions, as well.

Woman pondering her Local SEO

“How will I ever decide which business categories to choose for my listing?”

So, if there’s anything you’re dying to know about local, ask us below in the comments. Then check back next week, when we’ll answer everything you’ve ever wanted to know about local. Because we love local and want you to, too.

And remember, if you’re a small business owner and you’ve been waiting for just the right time to give your search strategy a jumpstart, head on over to our Local SEO page and take advantage of our limited offer!

Goodbye Google Places, Hello Google+ Local http://www.eigene-homepage-erstellen.net/blog/seo/new-google-plus-local.htm Wed, 30 May 2012 21:57:07 +0000

Business owners got a treat this morning as Google rolled out yet another update to its local platform, this time, ditching Google Places and replacing it with Google+ Local. This move completely integrates Google’s local listings with the current Google+ social network.

Google+ Local Result

The new Google+ Local Search

Coincidentally, and equally newsworthy, Portent just happens to be moving to our new office tomorrow, so we were lucky enough to give the new interface an in-depth examination that we’re sharing here with you.

 

Portent Move

Ian is in the box on the bottom

So what do you need to know, as a business owner? Let’s go over the top 5 things we’ve noticed with the new Google+ Local.

1. How do I get on Google+ Local?

You’re probably already on Google+ Local. And if you’re not, you will be soon. This was an automatic update so all Place pages are being moved to the new Google+ Local format as it rolls out. If you’ve claimed your page, all the contact information, pictures and reviews (sort of – we’ll get to that in a bit) are now seen in the new platform. If you’re not seeing it, then you probably just don’t have the update yet.

2. What about my old reviews?

Google will be moving the reviews from your Place page to your +Local page. However, since the update pairs each review with a public Google+ Profile, they will show up anonymously as “A Google User,” unless the reviewer goes through a brief verification process to publish them publicly.

Publish reviews on Google+

Publishing my Place reviews to +Local

3. What’s with the rating system?

If you’ve seen the update, you’ll notice that the 5-star system is gone. Replacing it is a new 30-point scoring system developed by Zagat. Each reviewer can rate a business from 0 to 3. Google then averages the ratings, multiplies by 10, and gives the new score for the business. If you have enough reviews in different categories, you’ll get a score with multiple aspects. Otherwise, you’ll just get a single number.

Google+ Local new Review System

Google+ Local’s new 30-point review system
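The arithmetic is simple enough to check by hand. A quick sketch with hypothetical ratings:

    ratings = [3, 2, 3, 1]  # hypothetical reviewers, each scoring 0-3

    score = sum(ratings) / len(ratings) * 10  # average, then multiply by 10
    print(score)  # 22.5 on the 30-point scale; the displayed score is presumably rounded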

4. What if I already set up a Google+ Business Page?

For the moment, Google+ Local and Google+ Business Pages are completely separate. So your Business Page and Local listing exist completely independently of one another, with no sharing of information. For us, it meant we had to update our address twice: once for each Google page. I imagine that this will eventually change, but since Business Pages currently only allow one address and many businesses have multiple locations, it could be a while before we see any changes here. For now, though, you have two Google+ pages to manage. Yay.

5. Why the change?

As David Mihm describes it, this is primarily a user-oriented interface update. Basically it’s a new face that puts local businesses firmly in the Google+ social arena. The hope is that after everyone looks up businesses, rates them and gets directions, they’ll keep hanging out on Google+ and give it the attention it so desperately needs. This is why, again for now, the dashboard for business owners hasn’t changed. This is for the reviewers, not the business owners.

What have you noticed with the new Google+ Local page? And how is your “+” key holding up these days?

Wanted: Enterprise Search Gods http://www.eigene-homepage-erstellen.net/blog/seo/wanted-enterprise-search-gods.htm Tue, 15 May 2012 17:30:40 +0000


Lately, I’ve been feeling like a grade schooler punished along with the rest of the class for one student’s bad behavior. With Google’s Pandas, Penguins and other pandemonium-producing updates yet to come, we’re all Bart Simpson being forced to write on the SEO chalkboard, “I will not outsource link building to third-world countries that spam blogs.” Sheesh!

It’s times like these that my mind turns to flights of fancy… If only I could make Google do whatever I wanted… Well, I can and so can you — just not for the whole Web. We can control search engine performance inside our workplaces, or on the United Airlines site so that it can find a flight from Seattle to New York that does not take six connections over two days.

Enterprise SEO Rules, Web SEO Drools

With Web SEO, there’s Google throwing the proverbial wrench into site rankings with algorithm changes that drop results from view like stones falling from the sky, causing grown SEO strategists to cry and black hatters to sharpen their stilettos. With enterprise SEO, the strategist or developer is the one who tunes the algorithm to the behavior, culture, or whims of the users.

Do you like keyword metadata? With enterprise search optimization, you can make it a significant ranking factor, because you are the search god and do not have to compensate for spam content. With enterprise search optimization, the strategist can create a relevant landscape that functions for their workforce, not for everyone with a copy of Dreamweaver and an FTP connection.

  • Not enough linking between internal documents? Not a problem. Manually designate authority pages and create number-one results by mapping best bets (or editorialized results) to specific queries.
  • Worried about spelling and term variations? Stop. Most of the quality enterprise search engines will let you map terms so the same results come up no matter which spelling variation your clueless colleague used.
  • Can’t figure out what keywords to optimize? Just ask the users sitting right next to you and all around you, because they are your coworkers and colleagues.

Be the Hero, Not the Goat

International Data Group (IDG) estimates that the average worker spends 2.5 hours a day looking for the information needed to do their job. One Fortune 500 company estimated that improving internal search would contribute $2 million a month in productivity gains. There is money to be made from figuring out how to make search within the workplace work as well as search outside of it. And that, my friends, is raise-worthy.

I will be giving out more details on how to design a perfect enterprise search experience with enterprise-specific data mining and user-centered design at the Enterprise Search Summit this week in New York City.
