March 25th, 2008 - Pay-per-Click Best Practices
Get the Most from Pay-Per-Click Advertising
There are some big numbers involved when it comes to search engine advertising. Just ask Patrick Norman, Co-Founder and VP of Ridgeland, Miss.-based remote support solutions provider Bomgar Corp. Pay-per-click advertising was the breakthrough tactic that spurred the growth of the business when the only source of funding was the CEO’s personal debit card. In four years Bomgar has gained 2,500 customers in 30 countries, and the company is still spending on search engine marketing, these days to the tune of about $90,000 each month. “Not less than 40% of our leads come through SEM [search engine marketing] and we consistently see a 3-to-1 return on the dollars we spend on PPC [pay-per-click] advertising,” Norman says.
Big numbers indeed, but heed the warning of Karen Jensen, Director of eBusiness at Irvine, Calif.-based Printronix, a NASDAQ-listed printing solutions provider. She warns that those big numbers can as easily be outflows as inflows. “Don’t jump in, wade in slowly,” she says. “Because if you don’t understand how much it can cost you, and how to set the limits, you can waste a whole lot of money really fast.”
Such a disparity of results is possible with any form of marketing, but the wild successes and stinging failures of pay-per-click advertising deserve special attention. Contextual advertising, which emerged in 2003, has quickly become an essential part of a web site marketing program. Its explosive growth has pushed online advertising estimates to the US$80 billion mark by 2011, according to Piper Jaffray.
August 20th, 2007 - Matt Cutts Interview
Through no fault of Matt’s, it took quite some time for me to connect with him for this interview, conducted in late July 2007. It was definitely worth the wait. Matt is a brilliant guy and for anyone who uses Google (I’m sure there are a few), his work is quite relevant to the quality of information we receive. I used some of this material for an article that appeared in the Sydney Morning Herald and Melbourne’s The Age. You can read Matt’s latest musings on his blog.
Dan Skeen: So, Matt, why don’t you tell me a little bit about your history with dealing with Web spam and how you first got involved in that?
Matt Cutts: Ah, that’s an interesting question. When I joined Google, I knew nothing about Web spam. I was a computer graphics, computer vision sort of guy. But one of the very first assignments that I got was to develop SafeSearch, which is Google’s family filter. In the process of that, I ran across at least a site or two that appeared to be trying to cheat; and back then, this was early 2000, PageRank was thought of as nearly completely unassailable. The whole idea of spamming PageRank was a bit alien to many people around the Web. So, it was a little bit of a wake-up call; and within a year, I had gone and essentially asked to work on Web spam full time, and since about April 2001, I have essentially worked nonstop on Web spam and search quality in general.
Matt Cutts: Sure. A lot of the early search engines used on-page factors a lot more than links, and it actually took a couple of years before very many search engine optimizers or webmasters realized just how much of a difference things like hyperlinks and anchor text would make. So, a lot of the early spam attempts we’d see would be things like keyword stuffing, completely random gibberish, people doing dictionaries of tons of words on a page. You’d also see things like cloaking, which is showing different content to search engines than you show to users; and so back in those days, it was a little more like the Wild West, and people would try to show a page to search engines about G-rated cartoons. Then, when you actually visited that page, they might try to show you porn.
So, it’s interesting to watch the evolution of the market over time because back in the early days, you could go to a large Search Engine Optimization firm and get counseled to say okay, let’s try this Black Hat technique, this technique that violates search engines’ quality guidelines. It was not difficult to find large companies that would propose those sorts of schemes. These days, thankfully, that’s quite rare. If you go to reputable SEO firms, for the most part, they’re quite up front about what they’re doing and they’ll at least inform you about the things that will possibly involve some risk.
So, one nice thing is you see fewer scams. Of course, over time, people have tried a lot of different techniques, everything from going to a bunch of guestbooks and signing them and saying hey, great site, check out my site, to all sorts of things in between. What we’re seeing these days is more of a trend where people essentially say I might be able to make some money for a short term doing shortcuts or tricks; but if I want traffic that lasts for a long time, it’s actually easier to go ahead and follow White Hat techniques and build links in an organic way or by using some smart gimmicks or neat hooks. That sort of traffic and those sort of rankings tend to last for a much longer time.
So, we’re seeing an increase in the amount of interest that people have in search; but you’re also seeing a lot more people who are willing to use these valid White Hat techniques.
Dan Skeen: Okay. How have those Black Hat techniques evolved? What are some of the latest tricks that you’re encountering and perhaps engaged in dealing with right now?
August 20th, 2007 - Todd Malicoat Interview
This is a complete transcript of my interview with Todd Malicoat, SEO expert and mastermind behind the popular blog Stuntdubl. The interview was a valuable source for an article I wrote for the Sydney Morning Herald. The interview was conducted in early August, 2007. I caught Todd at the end of a long day and he was very gracious in lending me the remaining time in his day.
Todd Malicoat: Yeah, definitely right around the 5-6 year range, kind of a transition from doing the webmastering kind of stuff to actually focusing solely on improvements in search rankings and a little bit of a progressive transitory period there.
I kind of just started dabbling and doing Web sites and everything else and worked for a hospitality company doing their Web site and got interested in it. I was kind of the junior network admin and ran around and fixed everybody’s printers and computers and all that good stuff. The webmaster left; and so I by default became the new proud owner of the Web site, and during the downtime, just kind of stumbled across Wordtracker and ended up going to a small business meeting or something where somebody was talking about meta tags. It was like
More recently, I kind of went into just a traditional SEO agency at We Build Pages here in upstate
Todd Malicoat: It’s actually Meta4creations, but Stuntdubl is usually what I go by and certainly the site that everybody’s more familiar with, incorporated as just a different name, though.
Dan Skeen: Good stuff. Are there any clients that you would care to mention?
Todd Malicoat: I usually don’t; I don’t do a lot of ongoing stuff, and it’s always been kind of a catch-22 because it’s nice to list clients, but at the same time they can get a stigma attached to them if they’re doing SEO, so…I traditionally don’t myself.
Todd Malicoat: I think a lot of times it’s because there’s a lot of transparency in terms of competition within the SEO marketplace. So, if one of my clients mentions that they’re doing SEO with me, their competitors may see me at a conference or something and start drilling me for information or be able to catch on to what my client’s doing. I would say that’s probably the number one reason, that and fear and paranoia of the search engines having a problem or taking a closer look at what they’re doing.
July 5th, 2007 - Eye Tracking Article in Wired
Well, it’s a short news piece, but nonetheless my first byline in Wired News:
It’s about a pretty interesting eye tracking device developed at Queen’s University in Kingston, Ontario. It helps advertisers understand when media is getting noticed by tracking the presence of the “red-eye” effect in the images it records.
June 5th, 2007 - Interview With Dan Warner on Domain Research and Domain Investing
This is the full transcript of my interview with Dan Warner at DarkBlueSea. The end result was a feature story in the Next technology section that appeared in the Sydney Morning Herald, The Age, and the Brisbane Times. There was some great interview material that didn’t make it into the article so with Dan’s gracious permission I’ve posted the full interview transcript here.
Dan Warner: We own around 550,000 domains or Web sites ourselves. That makes us the second largest of all the professional domain portfolios in the world.
Dan Warner: The biggest is actually BuyDomains. They own, I believe, 675,000; we own 550,000. It goes down relatively quickly. There are probably only 15 domain portfolios in the world that own more than 100,000. I used to have a breakdown of exactly what it was; but when I did the breakdown, which was almost two years ago, it was eight people had more than 100,000, 18 or 19 had more than 10,000, and there was only something like 100 that owned more than 1,000 domains of value. Now, it’s just blown out. Everyone’s been buying like mad. It’s a havoc world.
Anyway, in the domain space I’m the guy that gets up and speaks at all the conferences. I speak at pretty much every domain conference, and I do a lot of analytics on the industry; so I pull it apart and put it back together again and figure out what’s right and wrong and report on details. I figure out who owns what domains and what portfolio and whether or not they have value and who owns trademark domains. One of the things that plagues our industry is that there is a portion of the domains that are owned, I think it’s a little bit less than 2 percent that actually have a trademark connotation like they’ve got the word Google or Yahoo or Microsoft in them, those kinds of domains. That’s just a part of the industry. Like any industry, you have people who are black hat in it; but those are the domains that a lot of people like to talk about.
Dan Skeen: What is your policy on those? When you’re buying domains, do you steer clear of anything with trademarks associated?
Dan Warner: With us, we actually have a 100 percent “no” policy on it; so we don’t buy anything like that. The other thing to know, though, is that when you register 550,000 domains yourself it is impossible to vet them all for trademarks. So occasionally you run into domains that you bought, and you don’t have any idea that they were a trademark. But when you find out, the good thing is that you drop them.
Dan Skeen: Tell me a little bit about your role in the company, how long you’ve been with them.
Dan Warner: I’ve been with the company about five years. My role here at the company is a dual role of chief strategy officer and chief operations officer. I probably spend about 80 percent of my time on strategy and 20 percent on operations. I’m the spokesperson for the company as well.
Dan Skeen: And how did you accumulate all this strategic knowledge about domain buying and the domain market?
Dan Warner: We created a market intelligence engine about six years ago, or five years ago, and basically it goes and picks up data. There is a lot of data mining for things like search phrases, bids on search phrases, anything that can give us a window of information into the commercial or mind share viability of domains. We looked for all the domains that are based on search phrases or bid prices on search phrases that had volume, like users were actually searching for these phrases, and then bought what was left in the market. And people naturally type these domains into their search bar and into their address bar; it’s a searching behavior. It’s absolutely identical to searching…people search for domains in search bars and people search for phrases in address bars as domains.
Dan Skeen: It amazes me how much volume there is actually with people doing that. That’s turned into quite a viable strategy for people with parked domains, hasn’t it?
Dan Warner: Oh, absolutely. The traffic is what pays the rent. There is a distinctive property analogy in this in that domain sales is actually realizing the asset value of domains. What pays the rent is the actual traffic. As you hold domains, they produce traffic; and that is oftentimes people’s primary means of making revenue. But, a secondary means is to sell domains because when you sell a domain, typically, you are getting between 25 and 100 years of revenue in a single transaction.
Dan Skeen: Based on what that revenue would be if it were just parked with no maintenance or contact?
Dan Warner: Yes, traffic.
Dan Skeen: That’s very interesting.
Dan Warner: When you can sell something for like 50 years or 100 years revenue, you do.
Dan Skeen: Yes. I would think so. So, there has been a huge accumulation based on the numbers you’re describing over recent years, in your case, using this intelligence. What’s driving it on the broader front? Why was there so much rapid accumulation of domains in the last four or five years?
Dan Warner: We’re more able to monetize revenue. The people who actually monetize this revenue are either Google or Yahoo. Those are the two primary means of actually getting parking revenue. We have a parking company ourselves, which is fabulous.com. It’s our registrar and parking company, and we park third-party domains, along with our own. We’re one of the largest services in the world for doing that. Now, we’re enabled by Google and Yahoo who go out there and find all the advertisers. So, they find all the advertisers, and then we make a call for them on every domain; and they tell us what advertisers they want to serve, and then they take a commission. Then they give us the rest of the money.
That’s definitely where we want to get some money. All right? There’s a certain amount of money that’s made in typo traffic, and there’s a certain amount of money that’s made in trademark traffic; those are the ones that people like to talk about. GoogleNews.com was not owned by Google. It was owned by a domainer who was parking it, who doesn’t actually get to make any money out of it anymore because Google stopped it, but these things have happened in the past. It’s a bit of a gold rush mentality; you get those that work on the edges, and then you get the highly professional corporations who won’t have anything to do with any of that. We’re finding a major cleanup in the industry has happened over time, and a lot of the trademark domains and typos and things have actually made their way out of the industry, and it’s getting cleaner all the time.
Dan Skeen: Yet, it seems like there’s always some new scandal. Can you comment on kiting? What’s your take on that?
June 3rd, 2007 - Five Years of Web Innovation
Here’s the first draft of an article that ran in the Next technology section that appears in several Fairfax newspapers including the Sydney Morning Herald, The Age, and the Brisbane Times. You can view a published version here. Next was celebrating its fifth-year anniversary and the editor asked me to write something about five years of changes on the Web. Basically I sat down and listed the key innovations that stood out in my mind. Then I dug up details on dates and arranged them chronologically. Next came the tricky part, linking them all together. I wasn’t sure I could pull it off but in the end it came together alright.
Looking back over five years of changes on the Web, one theme that stands out is the increasing connectedness of the internet experience. More services, media, and publishers bring us together in a web of growing intricacy and reach.
With inter-connectedness as our theme, let’s look at some links between the internet’s biggest catalysts for change in the last five years.
Wikipedia: Launched in 2001, this volunteer-compiled resource has grown explosively over the last five years. Today it contains 1,753,739 articles, including one on….
Technorati: Founded in 2002, this blog search engine became a hub for bloggers. It currently tracks more than 75 million blogs. It also maintains a list of the 100 most popular blogs, in which #41 belongs to ….
Robert Scoble: In February of 2003 Scoble published his “corporate blogging manifesto”, an inspiring list of 20 principles for bloggers. Scoble is a popular resident in ….
Second Life: This multi-player online world launched a beta version in 2003. Today Second Life has over five million registered accounts as reported on the Linden Life blog, which is run on…
WordPress: The popular open source blogging platform first appeared in 2003 as a joint effort between Matt Mullenweg and Mike Little. Today Wordpress.com hosts over 893,000 blogs. Many of these feature ads from ….
AdSense: AdSense was born with Google’s 2003 acquisition of Applied Semantics. The “Ads by Google” contextual advertising units made web publishing much more effective and lucrative for independent website owners. Revenue is based on advertisers’ bids for specific keywords, such as the current top bid of $1.04 for the search term….
iPod: The original unit was launched in 2001, but the release of the iTunes Store and the Windows version of iTunes software in 2003 sent sales skyward. Apple’s portable media player drove demand for automated downloads of audio (and later video) content, commonly known as podcasts. Podcasting tools are available from many publishers, including….
MySpace: Founded in July 2003, the social networking site rapidly grew to one of the world’s most popular websites. It was bought in 2005 for US $580 million by Rupert Murdoch’s News Corporation. MySpace made Murdoch shine at a time when many were calling his internet strategy a ….
Miserable failure: In October of 2003, several bloggers linked to the US White House site with the link text “miserable failure”, propelling George W Bush’s bio to top spot in Google for that phrase. It is the most popular example of “Google bombing”, a term tagged by 160 users of ….
Del.icio.us: This popular social bookmarking site, launched in late 2003, allowed users to categorize and share web pages using descriptive tags. The company was acquired by Yahoo in 2005. By then tagging was popular on other sites, including …..
Flickr: Ludicorp began building a massively multiplayer role-playing game, but instead used the tools they’d created to launch a photo sharing site. Flickr used tags to create the first tag cloud, a visual representation of popular tags. Flickr has over 1,600 photos tagged with the term ….
VOIP: 2004 saw the mass market roll-out of voice-over-IP services through providers like Skype. Offering a radical departure from traditional Telco pricing, Skype had nine million online users in January 2007. You can find 1,700 videos related to Skype on…
YouTube: Founded in February 2005 by three former employees of PayPal, YouTube uses Adobe’s Flash technology to display videos submitted by users. In 2006, the company was acquired for US$1.65 billion by the makers of ….
Google Maps: This free service was first announced on the Google Blog in February of 2005. It stood out from competitors through its fluid transitions between screen states, delivered by a programming method referred to as ….
Windows Vista: After a long production cycle, Windows Vista was released in late 2006, with the worldwide consumer release occurring in early 2007. It was the software manufacturer’s first new OS in five years, causing much excitement among users of….
Twitter: A micro-blogging service launched in 2006, Twitter allows users to send short messages to their network through SMS, instant messaging, or the web. Bringing the offline experience closer to the online, it further blurs the line between our “real” and digital lives.
May 23rd, 2007 - The Origin of Twitter
This is my first podcast, so don’t expect much. Sad part is I have friends in A/V production who will no doubt have a good laugh at my first attempt. Ah well, here goes nothing…
During my interview with Biz Stone of Twitter, he described how the team first came up with the idea for what has become a wildly successful messaging tool. It’s a good story and I didn’t get a chance to share it in my column on Twitter, so here is the audio. Or you can read the full transcript.
May 23rd, 2007 - Interview with Biz Stone, Co-Founder of Twitter
Dan Skeen: I’m going to ask you a question probably a million people have already; so bear with me, but tell me about the origins of Twitter, where the idea came from and how it all got started?
Biz Stone: Actually, the basic idea for Twitter came from my colleague and co-founder Jack Dorsey. A little history about Jack is that he had been, it turned out, obsessed with the idea of dispatch, like with regard to a taxicab and so forth, since the age of about 14. He had always wanted to write software that would help dispatch taxis, but he was living in St. Louis at the time. He wrote software for couriers, like bike couriers, and they didn’t really need that anywhere in St. Louis. To make a long story short, he ended up moving to New York City, starting a company, and writing software for taxi cab dispatch, which is basically, at its core, a sort of messaging service. At some point, he started mulling over the idea: wouldn’t it be neat if people could do the same thing, in a similar fashion, but very simply.
One of the things that struck him was the way people were using that little status field in, when you use like AIM or other instant message clients, you know how there’s that status field that says, at a meeting, going to lunch, whatever. It turns out that, for about five years prior to working with us, he thought to himself that it would really be cool to make a whole service out of just that little status field; but he wasn’t exactly sure how that would manifest. He was working on another project, so he didn’t have time to do it; but when he started working at Odeo with myself and Evan Williams and a bunch of other folks here, at one point he decided he would just mention the idea. He said I’ve got this great idea, I really want to create a tool that focuses on something really simple like statuses as a way to keep friends and relatives sort of connecting at a very simple sort of ambient way.
At the same time, we had been discussing various interesting use cases for SMS; so when he brought the idea to us, we merged it with the idea of, well, what if you could set this so-called status with an SMS, making it totally mobile and creating the ability for you and your friends to constantly be in touch by following each other’s status updates over SMS. That’s when we decided, well, that’s a cool idea. So, we took two weeks; we were working at Odeo, which was the podcasting company at the time; but Evan decided that Jack and I should sort of go off in a corner and spend two weeks building a prototype, which we did. We presented it to the rest of the team, and everyone just totally loved it; it was something that just caught on very quickly. It was sort of giggle-inducing to be sitting, you know, working at home, ripping up carpet, having your phone vibrate, and there’s your friend sipping wine in Napa and another friend heading to the beach, and you’re just laughing at sort of the context of it all, and just knowing that they’re doing that.
We decided then that we should work on it a little bit longer, but basically, that was the origin of it. Then later, we decided to begin adding multiple devices to it. So, rather than just SMS, you’d be able to update over an instant message or the Web, or increasingly other ways, so that what we really have, technology-wise, is just a device-agnostic messaging system. The most popular use case right now for it is social; people are very much using it to stay connected with family or friends.
Dan Skeen: Now, it sounds like you guys are pretty stoked about the product right off the bat, but I would guess that it’s wildly exceeded your expectations in terms of adoption and popularity. Can you talk to me a bit about what you expected from the product and compare that to the actual state?
May 11th, 2007 - New Interviews and Articles
I’ve got some great material that I’ll be posting here quite soon. Here are some of the interviews I’ve done recently:
- Biz Stone, one of the founders of Twitter
- Dan Warner, Chief Strategy Officer at Dark Blue Sea, owner of the world’s second largest domain name portfolio
- An expert at flipping web sites, that is, buying cheap web properties and turning them around at a huge profit
- Darren Rowse at Problogger
- Warren Adelman, COO of GoDaddy, the world’s largest domain registrar
There’s lots of audio too so I’ll post my first podcasts here as well. Subscribe via RSS or email and you won’t miss a thing.
April 28th, 2007 - Sitemap Best Practices
Here’s the original draft of this article that appeared in BizTech Magazine.
You will find two different sitemaps representing the Stephen Hawking website on the internet. Far from parallel universes, one is a utilitarian collection of links that represents the hierarchical structure of his website. The other is pure eye-candy – a stylistic collection of images and graphical pathways illustrating all the dimensions of the physicist’s life, career, and writing.
At some point the site owner (presumably not Mr. Hawking himself, though I bet he’d be awesome with cascading style sheets) reviewed both and chose one over the other. Score one for simplicity and none for aesthetics, because the stylistic version rests on an obscure domain, essentially unused, while the other is accessed each day by many Hawking devotees. In making this decision, the site administrators likely surmised that the image-rich sitemap, while visually impressive, suffers from search engine indexing and site maintenance flaws. When it comes to sitemaps, function usually trumps form.
Sitemaps have traditionally served a simple purpose in an unexciting way: in the pre-Google years they were an effective navigation tool, helping users survey a site’s material at a glance and quickly reach the information they wanted. Today sitemaps are also used by search engine optimization experts, who rely on them to help automated search engine spiders properly index all of a site’s pages. These days you can create an XML-format sitemap that is available only to search engines. It’s tempting for some site owners, confident that visitors will find their way through navigational menus and site search, to tuck the traditional, skeleton-like HTML sitemap into the closet.
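For illustration, here is a minimal sketch of what an XML sitemap looks like under the sitemaps.org protocol (the URL and values below are placeholders; only the loc element is required for each entry):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to index -->
  <url>
    <loc>http://www.example.com/</loc>
    <!-- Optional hints to the crawler: -->
    <lastmod>2007-04-01</lastmod>       <!-- date of last modification -->
    <changefreq>weekly</changefreq>     <!-- how often the page changes -->
    <priority>0.8</priority>            <!-- relative importance, 0.0 to 1.0 -->
  </url>
</urlset>
```

Once uploaded (commonly as sitemap.xml in the site root), the file can be submitted to search engines directly; the major engines have also recently begun supporting a Sitemap: line in robots.txt pointing to its location.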
Not so for Helen Whelan, President of Success Television and owner of www.successtelevision.com. “With a deep, content-rich site, the sitemap is a wonderful means of helping users find the article or videos that are most relevant to them,” she says. “We regularly track our sitemap metrics to see who’s coming there, where from, and where they visit next so we can improve upon our content offerings.”
Helen uses Google Analytics, a web statistics package, to monitor the ins and outs of her sitemap. She particularly likes the site overlay feature, which graphically shows her which links visitors are clicking within the sitemap. Her regard for the sitemap is not uncommon: it remains a popular navigation tool for surfers.
A sizable contingent of web surfers follow a navigation menu, then site search, then sitemap pattern. They’ll look at the nav menu options first; if they can’t find what they’re looking for, they’ll search; and if the results disappoint them, they’ll try the sitemap. As such, the sitemap is their last hope before moving on to the next site.
There are pros and cons to each form of sitemap mentioned so far: