Portent » tools: Internet Marketing: SEO, PPC & Social, Seattle, WA

Announcing Our SERP Preview Tool
Wed, 17 Sep 2014

The post Announcing Our SERP Preview Tool appeared first on Portent.

Has this ever happened to you?

You’re working on your website, updating title tags and meta descriptions, when all of a sudden you’re panicked at the thought of a truncated title tag! Sure, you kept it under 55 characters, but you know that doesn’t always work, because Google truncates to a pixel width, not a number of characters.

We all know that risking a low CTR is out of the question, and that it’s annoying to push a page live and have to go back and change it until you get it right.

There has to be a better way!

Now there is: Portent’s SERP Preview Tool!
SERP Preview Tool
Preview your title tags, meta descriptions, and URLs to see how they display in the SERPs before you push them live! Best of all, our tool measures title tags by pixel length, just like the search engines do. No more guesswork: enter your desired title tag and, presto, a preview appears before your eyes!
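If you want a rough sense of the math involved, here’s a sketch in Python. The per-character widths and the ~512-pixel cutoff are illustrative assumptions, not Google’s actual metrics (the tool above does the real measuring):

```python
# Rough sketch of pixel-based title measurement. The per-character widths
# below are illustrative approximations, and the ~512px cutoff is the
# commonly cited desktop limit at the time -- treat both as assumptions.
CHAR_WIDTHS = {"i": 5, "l": 5, "j": 5, "f": 6, "t": 7, "r": 7, " ": 6,
               "m": 16, "w": 15, "M": 17, "W": 19}
DEFAULT_WIDTH = 10  # fallback for characters not in the table

def estimated_pixel_width(title):
    """Sum approximate per-character widths to estimate rendered width."""
    return sum(CHAR_WIDTHS.get(ch, DEFAULT_WIDTH) for ch in title)

def likely_truncated(title, limit_px=512):
    return estimated_pixel_width(title) > limit_px
```

A title full of wide characters like “W” hits the limit far sooner than one full of “i”s with the same character count, which is exactly why a 55-character rule alone misleads.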

Be our guest, try it out today!

How to Generate Video Sitemaps using Google Docs
Tue, 26 Mar 2013

The post How to Generate Video Sitemaps using Google Docs appeared first on Portent.

Let me just start out with the ending. The Google Doc below takes a list of pages containing embedded YouTube or Vimeo videos and uses magic to grab all the data you need for a video XML sitemap. Then it makes the sitemap for you.

You can have it by going to the link below and selecting “Make a copy” under “File.” The instructions are right in the document.

Video Sitemap Generator v1

If you want to know more about how it works, keep reading. If not, thanks for stopping by, and hopefully this makes your life a bit easier!

Quick Disclaimer

Before I go into the details, I do recommend taking the time to learn more about video sitemaps. Rather than repeat what others have already said, I’ll just tell you to start with the horse’s mouth then read other articles like this great post by Phil Nottingham or this one by Justin Hammack.

Second, I should say that, even though this tool generates a fully compliant video sitemap you could hand straight to Google, I don’t recommend doing that without reviewing the data first. This isn’t the only video sitemap generator out there, but I wanted something a bit more straightforward that would let me easily customize the various fields.

So I made this to specifically help with some of the more time-consuming tasks but still let me optimize the data for each video. I’m still working on the tool that does my entire job for me.

Using the Google Doc

Graph of Using the Google Doc

You’ll notice several tabs on the spreadsheet, but basically the doc does two things. First, it sucks up all the metadata from the actual video pages on YouTube or Vimeo. Second, it organizes all that data so you can customize it, then marks it up in XML sitemap format.

Here’s the play by play:

1. Get your list of landing pages. Start with a list of URLs you know have either YouTube videos or Vimeo videos embedded on them. If you have a large site, you can use Screaming Frog or another crawler to scan your sites for pages that have the embed code.

YouTube has had a few types of embed codes over the years, so I recommend searching for code that contains either “youtube.com/v/” or “youtube.com/embed/”. For Vimeo, the links are going to contain either “player.vimeo.com/video” or “vimeo.com/moogaloop.swf?clip_id”, so that’s the text you want to look for.
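If you’re scripting your own scan, the search boils down to a couple of regular expressions. Here’s a rough Python sketch using the patterns above:

```python
import re

# Patterns from the post: old and new YouTube embed URLs, plus the two
# Vimeo player URL styles. The capture groups grab the video IDs.
YOUTUBE = re.compile(r'youtube\.com/(?:v|embed)/([\w-]+)')
VIMEO = re.compile(
    r'(?:player\.vimeo\.com/video/|vimeo\.com/moogaloop\.swf\?clip_id=)(\d+)')

def find_embedded_videos(html):
    """Return (service, video_id) pairs for embeds found in a page's HTML."""
    hits = [("youtube", vid) for vid in YOUTUBE.findall(html)]
    hits += [("vimeo", vid) for vid in VIMEO.findall(html)]
    return hits
```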

2. Go to the YouTube or Vimeo extractor sheet. Pages with YouTube videos go on the “YouTube Embed Extractor” sheet. Pages with Vimeos go to the “Vimeo Embed Extractor.”

You’ll notice there are also two other tabs in case you already have a list of video pages and would rather pull the data directly from there. That’s fine, but remember that you need the URL from your own site to make a video sitemap.

Chart of sitemap generator

3. Paste the URLs into the sheet. I know I say it’ll handle 15 URLs in the doc, but I really wouldn’t recommend doing more than 10 URLs at a time. The doc is pulling HTML from both the landing page and video page. This means that when you run 5 URLs, the doc is storing all of the HTML from 10 pages.

4. Drag down the formula rows. There are a few hidden cells, so be sure to include the orange cell on the right when you’re dragging down the formulas.

chart of drag down

5. Copy all the cells in the green zone (including the original landing page) and paste values only into the “Generator” sheet. Seriously, remember to do “values only” otherwise your browser might kill itself.

5.1 Rinse and repeat. Keep adding to the “Generator” sheet until you’ve finished all of the pages. Do some Vimeos, then some YouTubes if you want. If there are other video types, you can just type those into the “Generator” sheet, too. That’s still easier than marking up a text file manually.

6. SEO your data. If you think you’re going to get rich snippets by copying the titles and descriptions directly from YouTube and Vimeo, good luck. Plus, chances are some of the titles and descriptions suck anyway, so just rewrite them. Also, I have columns for some of the more recommended optional tags like “video:category” and “video:uploader.” Add those where you can or leave them blank if you have nothing to put there.

(If you’re using the “Publication Date,” remember to put a single quote (‘) before the date; otherwise the Doc will convert it to MM/DD/YYYY format, which isn’t what the XML sitemap uses. Unfortunately, you can’t just switch the column to “text” either, since that kills the IF(ISBLANK) condition in the next sheet that prevents empty <video:publication_date> tags from appearing when you leave those cells blank.)

chart of generator tab

7. Go to the “Video Sitemap” tab and drag down that red cell. If your “Generator” sheet goes down to row 56, drag the red cell in “Video Sitemap” down to 56. It’s got a bunch of IF conditions in there, so if you left some of the optional tags blank, they won’t show at all.

screencap of sitemap tab

8. Close the <urlset>. Just write </urlset> below the last row in Column A.

9. Export as .txt and rename. Boom. Video Sitemap.
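For the curious, the end product of those steps is just one <url> entry per landing page inside a video-namespaced <urlset>. Here’s a minimal Python sketch of that structure, using only the required video tags and made-up example URLs:

```python
from xml.sax.saxutils import escape

def video_url_entry(page_url, title, description, thumbnail, player):
    """Build one <url> element for a video sitemap (required tags only)."""
    return (
        "<url><loc>{}</loc><video:video>"
        "<video:thumbnail_loc>{}</video:thumbnail_loc>"
        "<video:title>{}</video:title>"
        "<video:description>{}</video:description>"
        "<video:player_loc>{}</video:player_loc>"
        "</video:video></url>"
    ).format(escape(page_url), escape(thumbnail), escape(title),
             escape(description), escape(player))

# Example entry with hypothetical URLs, wrapped in the sitemap envelope.
entries = [video_url_entry("http://example.com/page", "My video",
                           "A description", "http://example.com/thumb.jpg",
                           "http://www.youtube.com/embed/abc123")]
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" '
    'xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">\n'
    + "\n".join(entries) + "\n</urlset>"
)
```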

Customizing the document/how it works

The document uses the ImportData function to pull in the HTML from the landing pages. Then it uses a combination of RegexExtract and RegexReplace to grab the video source from the embed code, concatenate that as an actual URL, import that HTML, and grab all the Schema data. Vimeo doesn’t use the “name” or “description” itemprops in Schema, so the doc just grabs the Open Graph title and descriptions instead.

To get even nerdier for a minute, Schema uses ISO format for duration (like PT1H23M45S for 1 hour, 23 minutes and 45 seconds), but the XML sitemaps need everything in seconds, so the doc even takes that ugly duration format and converts it into seconds for you.
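That ISO-to-seconds conversion is easy to sketch in Python, if you ever need it outside the doc:

```python
import re

def iso_duration_to_seconds(duration):
    """Convert Schema.org ISO 8601 durations like 'PT1H23M45S' to seconds."""
    match = re.fullmatch(r'PT(?:(\d+)H)?(?:(\d+)M)?(?:(\d+)S)?', duration)
    hours, minutes, seconds = (int(g) if g else 0 for g in match.groups())
    return hours * 3600 + minutes * 60 + seconds
```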

The doc does a lot, but not everything. At least not out of the box. If you want, though, you can edit it to do more. Let’s say you want to pull the Open Graph data instead of Schema from YouTube. You’ll just need to go to the cells that are grabbing the Schema data.

Chart of schema name regex

Then rewrite the function to look for the og:title element instead.

Chart of OG Title Regex

If you want to look for another video type, that’s when you’d go into the hidden cells. Here you’ll see where the imported HTML is being stored. You can also see the messy expression that looks for YouTube embed codes.

Chart of YouTube extract

If you wanted to look for Viddler videos, for example, you’d want to edit this to extract something like “viddler.com.embed.([0-9]+)” here. Then you’d reconstruct the URL in J3 so that K3 can import the data. Since Viddler uses Open Graph instead of Schema on the embed pages, you’d follow the steps described earlier to extract the OG data.

(Note: I have not actually tried the steps described above to extract Viddler video data; I’m just outlining the process I would hypothetically follow if I were to do this. You’ll have to do some testing yourself.)

The doc I created already has fields to manually add “category”, “tags”, “uploader”, and “upload date”. But if you wanted to add <video:expiration_date>, for example, you’d simply create another column in the “Generator” sheet.

Then you’d go to the “Video Sitemap” sheet and take a look at the formula in the red cell. The formula is long, but you can see it’s just a concatenation of all the cells in the generator sheet.

Somewhere before the </video:video> tag you’ll simply add “<video:expiration_date>”,Generator!N3,”</video:expiration_date>” being sure to use commas to separate the different tags.

If only some of the videos in the sitemap have an expiration date, then you’ll need to wrap that in an IF(ISBLANK) condition. The end result would look something like this:

concatenate

The char(10) isn’t really necessary, but it makes the cell easier to read in the doc by adding a line break. Unfortunately the line break doesn’t transfer when the document is exported as a .txt, so each <url> element will be one huge line when you view it.

Tips and Common Errors

Remember, folks, this is a Google Doc. Which means it totally works almost all of the time. But sometimes it breaks and sometimes it really, really breaks.

Sometimes you’ll see “<!DOCTYPE…” show up in the Schema/OG cells. This is because when RegexReplace can’t find a match, it just imports everything from the cell it’s referencing. So in this doc, that’s all of the source code from the page. Sometimes this might even blow the sheet’s margins all over the place.

The most common culprit is a video that’s been either removed or marked as private. In that case it’s a great opportunity to do a quick audit of videos on your site.

The second most common cause is something along the lines of “WTF IS WRONG WITH YOU I HATE COMPUTERS?!” because, like I said, it’s a Google Doc.

Also, be sure to delete rows that aren’t actively pulling data. Remember to follow the same rules in Step 4, and include the orange cell when you’re deleting the rows, because the hidden columns are where most of the data is stored.

Google Docs only lets you import 50 URLs at a time, so, again, delete rows after you’ve copied and pasted them, and be mindful of your space.

Enjoy!

Questions? Suggestions? Tips? Write them in the comments below. I’ll be sure to keep this post updated as I make any changes/improvements to the Doc so check back if you’re wondering.

(I will also likely mention them on Twitter or Google+ but I would never do any shameless self-promotion like that on here.)

Feature requests will be dependent on how easy they are and what I happen to be doing at that moment.

Tool Review: Broken Link Finder
Tue, 12 Mar 2013

The post Tool Review: Broken Link Finder appeared first on Portent.

Ptolemy's map of the world

Some people need help with their maps.

I’m a huge fan of Garrett French’s link building tool, the Link Prospector, so I was jazzed when he released the Broken Link Finder. I wanted to give an overview of why broken link building is important and how this tool can save you a ton of time and effort.

What is broken link building?

As the web grows, it changes: pages get moved around, deleted, or simply neglected. The content is gone, but the links to those URLs remain. The result is a 404 “File Not Found” status, which is a pain for users but a link builder’s dream. Broken link building is the practice of asking a webmaster to change those links to point at relevant replacement pages (that just so happen to be on your client’s site).

Think of it this way: say you’re a pizza connoisseur, so you pick up a map of the best pizza joints in town at your local pizza community resource center. You follow the map to one of the pizzerias, only to find that the place is closed, the building has been demolished, or, worse, there’s a new establishment in its place selling generic Viagra pills. Either way, you’ll probably be disappointed.

Broken link building is like contacting the map maker and suggesting a different Italian restaurant to replace the delinquent one. The pizza cartographer updates the map, and the new establishment enjoys the benefits of leveraging resources and connections that were already in place.

Scalability

Like any other SEO strategy, broken link building takes time. Using the Check My Links extension to discover dead links on resources pages can be tedious work. The enterprising minds at Citation Labs thought, “There must be a way to make this process scalable….”
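If you were scripting that manual check yourself, the core of it might look like this Python sketch. The get_status function here is a stand-in; in practice it would issue an HTTP request for each link:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags on a resource page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_links(html, get_status):
    """Return links whose status (via the supplied get_status function)
    is 404. In practice get_status would make an HTTP HEAD request."""
    parser = LinkExtractor()
    parser.feed(html)
    return [url for url in parser.links if get_status(url) == 404]
```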

Enter the Broken Link Finder. This tool does most of the heavy lifting for you by returning dead pages based on keywords you provide. Here’s an outline of how it works.

1. Create a campaign

Users familiar with Link Prospector will be comfortable with the interface of the new tool. I organize my campaigns by the types of linkable content I’m hoping to find, rather than by client or website.

Pizza Resource Links chart

2. Choose your keywords

Link Prospector allowed for a lot of trial and error for picking keywords to use for search queries; however, because the credit structure is different, your pocketbook will be in a world of hurt if you dump keywords in willy-nilly. Thankfully, they’ve provided a tool to help choose keywords that will yield the most results.

Graphic of keyword grades for Alaska

As you can see, it’s better to think broadly and more abstractly than you would for classic link building queries. It’s important to remember that you’re searching for opportunities for replacing dead content; these aren’t necessarily immediate wins.

3. Vet opportunities

After you add the keywords for the campaign, some sort of link wizardry happens and the tool produces a report.

Alaska opportunities

What would have been hours of sleuthing and a significant amount of luck is just sitting there for the taking. You can filter the list to see URLs by the number of referring links or by the link opportunity’s “grade,” which is based on its relevance to your target keywords. They even provide a link to the cached version of the page from the Wayback Machine so you can see what content your page should offer!

4. Reserve and outreach

Once you find the pages you want to contact, you can reserve them. This removes the URL from the tool’s index and prevents other link builders from moving in on your territory.

Reserved Opportunities screen shot

You can also “dive” URLs, which creates a separate campaign that generates new opportunities based on that specific page.

Of course, premium prospects carry a premium price tag. At $7.50 per credit, this tool gets expensive very quickly. It forces you to be calculated with your research and content strategy.

To link builders, however, the value should be clear. In the example above, the report produced a page with 303 dead links. It could’ve taken me hours to find the same number of quality opportunities that the tool found in 15 minutes. This is scalability—without sacrificing relevance or value.

The future of link building

What I like most about this tool is that it gets us thinking about what’s really important for SEO: content and user experience. Fixing links helps users avoid the dreaded 404 page, and if you can create great content to fill that gap, everybody wins.

Thinking in terms of what content could be helpful to our users is the only way link building can be successful in the future. The Broken Link Finder is a great tool to help us do just that.

Have you used the Broken Link Finder? What strategies do you use for broken link building? Let us know in the comments!

Recording and slides from today’s webinar
Thu, 01 Mar 2012

The post Recording and slides from today’s webinar appeared first on Portent.

If you missed today’s webinar, “SEO: 50 tips in 50 minutes,” you can get the slides here:

SEO: 50 tips in 50 minutes

View more presentations from Ian Lurie

SEO, optimize thyself: Get more results for your effort
Fri, 23 Dec 2011

The post SEO, optimize thyself: Get more results for your effort appeared first on Portent.

slow slug

Yesterday I wrote about how effort does not equal results. Results are always better.

Great results come from testing and tweaking your own routine. You can set up a virtuous cycle: You get more efficient, and have more time to learn more about your job, which helps you be more efficient, and so on.

I’ve worked on this myself for years, by attacking the problem on three fronts:

  • Time management: Removing time sucks.
  • Repetitive tasks: Getting rid of them through tools, or any other way I can.
  • Learning: Figuring out where to find the good stuff.

These are little tricks I learned from books like Getting Things Done, sites like 43 Folders, and tons of great advice from friends and colleagues. Read through ’em, and if you have more ideas, please add them to the collection.

Time management: Projects, tasks, and sprints

The single biggest productivity increase I’ve made? I started using a timer to break my day into sprints. It works like this:

At the start of each week

I list all of my projects. A ‘project’, for me, is anything that’s going to require more than 45 minutes to complete. Bizarre, I know, but it’ll make sense in a minute.

I also make a list of things I’d really like to do. These are not ‘musts’—they’re things that, if I can get through the rest of my list, I’ll work on. An example might be learning Ruby, or working on an analytics idea that I think is cool, but don’t really need.

Then, I break all my projects up into grouped to-dos. I use a super-simple text to-do list manager. That gives me a list that looks like this:

todolist.txt in action


The further into the future the todo item, the more general it is. So a todo that’s more than a day in the future could be so general that David Allen would tsk at me. This may not work for you—find the best balance between detail and practicality.

Each day

  1. Each day, in the morning, I pick the next 3-4 big to-dos and break them down into smaller tasks (actions) that’ll take no more than 45 minutes each.
  2. I break up my days into 1-hour sprints.
  3. In each sprint, I work for 45 minutes, uninterrupted. I turn off e-mail, set my instant messenger to ‘away’, and put up the do not disturb sign.
  4. Then, I take 15 minutes to check messages, answer staff questions and (gasp) maybe even take a break.
  5. At the end of each day, I list out any stuff that didn’t get done, so I know what my next action (thank you, David Allen) is for the next day. That ensures that I can dive right back in the next morning.

That’s time management in a nutshell.

Further reading

If you want to learn more about these kinds of techniques, read:

  • Anything about Scrum. While the whole Scrum thing never worked for me, there are fantastic ideas in the methodology that have quite literally changed the way I work.
  • Read David Allen’s Getting Things Done. I’ll keep telling you this until you read it.
  • Check out the latest thinking on agile methodologies in general. Again, don’t adopt it all unless it just works for you, but there are lots of great nuggets.
  • Read Merlin Mann’s 43folders.com site.
  • Read Gina Trapani’s smarterware.org. She gives good geek.

Repetitive tasks

Next up: Getting rid of repetitive tasks. I’m not going to go too far into this. It’d take a year.

If you see stuff here that you want to learn more about, tell me! I’m always looking for more ideas.

Instead, I’ll list some tools and things I’ve learned that save me time:

  • TextExpander creates a sort of keyboard shorthand, so I can type longer phrases, signatures and other text snippets really quickly. For example, with TextExpander, I’ve configured my trusty MacBook so that “,badd” automatically types out my company’s billing address. So I turned about 60 characters into five. That may not seem like much, but trust me, it adds up.
  • 1Password stores all my passwords for me, so I can log in more quickly. Again, that saves me a moment here and there, and it adds up fast.
  • I use Dropbox to store an encrypted version of my 1Password logins and my TextExpander snippets, so that my iMac, Macbook and iPad all access the same stuff.
  • I use Evernote to store more detailed notes about stuff. Since it runs on all my devices, including my cell phone, I can always jot stuff down.
  • I learned to use Apple’s Automator, Bash scripting and Python so that I can automate a lot of simple tasks myself. I try to automate anything I have to do more than twice. Even if it takes a few hours, I learn a ton, I get a great new tool, and I save a lot more time in the long run.
  • Oh, and I run Quicksilver, so I can start applications and deal with files without resorting to mouse clicks.

Learning

What I wrote in this post, way back when, still holds true for me. I use Google Reader plus Trunk.ly to store links from all of my various social networks.

Trunkly, we shall miss ye


Of course, Delicious has swallowed up Trunk.ly like some monstrous space amoeba. But chances are it’ll be incorporated into Delicious. I hope? Maybe? Please?

At lunch each day, I skim through my Reader list and my Trunk.ly links, finding interesting stuff. Then I read, and poof, good learning.

Find your own way

These are just ideas. I’m hardly the shining example of efficiency, what with my 30-minute tank-driving breaks

Employee training device


and other time wasters. But I do what I can. If you get ideas, or have questions, post ’em below. Thanks!

New content strategies tool: The Gramanator
Wed, 27 Jul 2011

The post New content strategies tool: The Gramanator appeared first on Portent.

Announcing a new tool: The Gramanator

During my MozCon content creation session today, I announced a new tool: The Gramanator.

It is mighty.

It is the Gramanator:

The Gramanator

Bow down before it.

What it does

The Gramanator will take any public Google Reader Shared Feed and:

  1. Clean it up, removing stop words.
  2. Assemble all of the different feed items in the feed into a single ‘corpus’ or index.
  3. Pull out the top 20-30 terms in that corpus and show frequency.
  4. Retrieve the top 20-30 bigrams (two word phrases) and trigrams (three word phrases) from the corpus.
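That pipeline is straightforward to sketch in Python. The stop word list below is illustrative, not the Gramanator’s actual list:

```python
import re
from collections import Counter

# Illustrative stop word list; a real one would be much longer.
STOP_WORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "that", "as"}

def top_ngrams(corpus, n, limit=20):
    """Count the most frequent n-grams after stripping stop words."""
    words = [w for w in re.findall(r"[a-z']+", corpus.lower())
             if w not in STOP_WORDS]
    grams = zip(*(words[i:] for i in range(n)))
    return Counter(" ".join(g) for g in grams).most_common(limit)
```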

Why it’s useful

You can use Google Reader to assemble lots of different RSS feeds—from Google Alerts, Twitter, sites like Hacker News and other sources—into a single aggregated feed. See the e-book I wrote about this a while ago for more information.

Once all that stuff is together in one aggregate feed, you can mine that text for all kinds of useful information. Here’s an example where I pulled the 2- and 3-word phrases from an aggregate RSS feed I assembled about the debt ceiling crisis:

The Gramanator report

I haven’t seen the news today, but I’m guessing from the noise that the stock market fell about 200 points. Things don’t sound too optimistic, either, with phrases like ‘dour tone’ popping up.

That report is great brainstorming fodder, if you’re looking for writing ideas. It can also help you figure out trending topics, if you run it regularly.

More information coming soon

I’m working on a longer blog post for tomorrow that will outline my presentation at MozCon. That will explain a lot more about why this tool is useful, and how to use it. For now, I wanted to make sure I got the link up for folks who want to have a look.

Enjoy!

The Gramanator – free, no signup required.


Ad Comparison Tool
Wed, 13 Jul 2011

The post Ad Comparison Tool appeared first on Portent.

Make ad decisions faster!


A few weeks back, I wrote a guest post on PPC Hero about the Profit per Impression (PPI) metric.

In the post, I talk about calculating PPI manually using an Excel Spreadsheet.

To that end, I created a free spreadsheet to do just that: the Ad Comparison Tool!

How does the Ad Comparison Tool work?

You supply the impressions, cost and revenue data for up to 10 ads you’re testing:

Just “Paste as Values” into the tool.

What does the Ad Comparison Tool give you?

Based on your ad data, the tool will automatically calculate PPI and a Confidence rating telling you how trustworthy that PPI stat is:

Let the tool help you make decisions!
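The PPI arithmetic itself is simple: profit (revenue minus cost) divided by impressions. Here’s a quick Python sketch with made-up numbers (the tool’s Confidence rating is its own calculation and isn’t reproduced here):

```python
def profit_per_impression(impressions, cost, revenue):
    """PPI: profit spread across every impression, so ads are compared
    on what each time they were shown was actually worth."""
    return (revenue - cost) / impressions

# Hypothetical test data for two ads.
ads = {
    "Ad A": profit_per_impression(10000, 250.0, 900.0),
    "Ad B": profit_per_impression(8000, 300.0, 1100.0),
}
best = max(ads, key=ads.get)  # the ad earning the most per impression
```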

Download the Ad Comparison Tool

Google Apps Script Tip #1: Finding the last row
Fri, 08 Apr 2011

The post Google Apps Script Tip #1: Finding the last row appeared first on Portent.

I’ve been building some custom reports for Portent in Google Spreadsheets. They do fun stuff like grab Google Analytics data, insert data from other APIs and such.

One thing that nearly drove me batty, though, was figuring out how to automatically add a new row below the last row with data.
Turns out, Google App Script has it’s own nifty command, called getLastRow. Here’s how you do it:

function FindRows() {
  // getLastRow() returns the position of the last row that has content.
  var range = SpreadsheetApp.getActiveSheet().getLastRow();
  return range;
}

That’s it. It’ll return the value range. That value is the number of the last row on the sheet that has data. You can then pass that back to other functions to start adding new rows in the right place, delete rows, etc..

If this is all gibberish, fear not. Marketing Ian will be back Monday. Today is Nerd Ian.

Can Quora drive traffic?
Tue, 11 Jan 2011

The post Can Quora drive traffic? appeared first on Portent.

]]>
I’m addicted to Quora. There, I’ve said it.

I know some folks have doubts. I love StackOverflow too. But something about Quora has me on there, all the time, answering questions.

I do have an ulterior motive: If I can get on there and help lots of people out, I can build an audience. If I build an audience, that’s one more pool of interested people who might someday hire my company, buy a book, and so on.

See, I’m not actually addicted. I have business motives! I can quit any time I want.

But that business motive means I gotta see Quora actually do something as a traffic or lead generator. So I’ve started keeping score.

Traffic: Bleagh

Traffic is a dead loser. After 4 weeks of posting 3-4 answers a day, and accumulating just over 400 followers, I’ve had a total of 42 visitors. Wooooo.

Quora Referrals

Quality and potential

But it pays to look a little deeper. On a site like Quora, I’m not necessarily looking for volume. I’m looking for quality. I want really, truly interested people to get in touch with me, follow me on Twitter, and so on.

So, take another look at the numbers:

Quora numbers

Of the top 50 referring sites for Conversation Marketing, Quora generates the third highest time-on-site. If folks are spending that much time on my site after coming from Quora, then this might just be worth it. In my own, weird scoring system, these are high quality visitors.

On the other hand, goal conversion from Quora is zero. Zilch. Nada. But it’s not much of a sample size.

The jury’s still out

We’ll see how Quora does. My 30-minute-per-day investment really only requires one new client per quarter to pay off.

I’ll keep you posted.

The post Can Quora drive traffic? appeared first on Portent.

]]>
http://www.eigene-homepage-erstellen.net/blog/random/can-quora-drive-traffic.htm/feed 9
Python code to grab KeywordDiscovery API data http://www.eigene-homepage-erstellen.net/blog/random/python_code_to_grab_keyworddis.htm http://www.eigene-homepage-erstellen.net/blog/random/python_code_to_grab_keyworddis.htm#comments Wed, 17 Nov 2010 07:39:46 +0000 http://www.conversationmarketing.com/2010/11/python_code_to_grab_keyworddis.htm If you use the KeywordDiscovery API, and Python, my pain is your gain. It took me a few hours to get this to work. You can grab it and go. Here’s the function, written in my usual Python Pigdin. I don’t recommend using it without a passing knowledge of Python, but that’s up to you:… Read More

The post Python code to grab KeywordDiscovery API data appeared first on Portent.

]]>
If you use the KeywordDiscovery API, and Python, my pain is your gain. It took me a few hours to get this to work. You can grab it and go. Here’s the function, written in my usual Python pidgin. I don’t recommend using it without a passing knowledge of Python, but that’s up to you:

def kwdiscovery(username, password, phraselist):
    base64string = base64.encodestring('%s:%s' % (username, password))[:-1]
    authheader = "Basic %s" % base64string
    apiurl = "http://api.keyworddiscovery.com/queries.php?queries="
    separator = "%0D%0A"
    counter = 1
    for phrase in phraselist:
        # make sure there are no funny characters
        try:
            phrase.decode('ascii')
        except UnicodeDecodeError:
            continue
        phrase = phrase.replace(" ", "+")
        phrase = phrase.replace("\n", "")
        if counter > 1:
            apiurl = apiurl + separator + phrase
        else:
            apiurl = apiurl + phrase
        counter = counter + 1
    apiurl = apiurl + "&empty=1"
    req = urllib2.Request(apiurl)
    req.add_header("Authorization", authheader)
    blah = urllib2.urlopen(req)
    # because sometimes, things just go wrong
    try:
        result = ET.parse(blah)
        resultlist = []
        lst = result.findall("r")
        for item in lst:
            this = item.attrib["q"], item.attrib["m"]
            resultlist.append(this)
    except:
        this = "__one of the words in this request caused an error:", apiurl
        resultlist = [this]
    return resultlist
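A side note on that hand-rolled replace(" ", "+"): it works for simple phrases, but the standard library’s quote_plus handles other reserved characters too. A small sketch in modern Python 3 (the function above is Python 2, so this is an aside, not a drop-in):

```python
from urllib.parse import quote_plus

# quote_plus turns spaces into "+" and percent-encodes reserved characters
print(quote_plus("seo title tag"))   # seo+title+tag
print(quote_plus("q&a session"))     # q%26a+session
```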

And here’s how you’d use the function:

#!/usr/bin/python
import string
import sys
import httplib
import urllib2
from urllib2 import Request, urlopen, URLError
import xml.etree.ElementTree as ET
import base64

f = open('longw.txt', 'r')
g = open('words_countedlongtail.txt', 'w')
words = f.readlines()
username = "ENTER KEYWORDDISCOVERY USERNAME HERE"
password = "ENTER KEYWORDDISCOVERY PASSWORD HERE"
start = 0
count = len(words)
while count > 0:
    # send the phrases nine at a time
    count = count - 9
    end = start + 9
    a = words[start:end]
    print "sent ", a
    resultlist = kwdiscovery(username, password, a)
    for l in resultlist:
        q = str(l[0])
        m = str(l[1])
        line = q + "\t" + m + "\n"
        g.write(line)
        print "received ", line
    start = end
f.close()
g.close()
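The while loop above walks the word list nine phrases at a time. The same batching can be written as a small generator, which is a bit easier to reuse; this is just a sketch of the slicing idea, not part of the original script:

```python
def chunks(items, size):
    """Yield successive fixed-size slices of a list."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# e.g. batching five keywords two at a time
print(list(chunks(["a", "b", "c", "d", "e"], 2)))
```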

Who knows, I might even create a web interface one of these days. In my spare time.

The post Python code to grab KeywordDiscovery API data appeared first on Portent.

]]>
http://www.eigene-homepage-erstellen.net/blog/random/python_code_to_grab_keyworddis.htm/feed 0