Tag Archive | "Talk"

How to Talk to Your Clients in a Language They Understand

Posted by Lindsay_Halsey

A few years ago, while enjoying a day of skiing at Aspen Highlands with a group of girlfriends, a skier crashed into me from above, out of nowhere. He was a professional skier traveling at an exceptionally fast speed, and I felt lucky to get away with a mere leg injury. I couldn’t put weight on my leg, though, so I went to the local emergency room.

After a few hours of various doctors and nurses running scans to diagnose the issue, a new doctor whom I’d never met walked into the room. The first words out of his mouth were, “You have a radial tear in your medial meniscus.” I had no idea what he was talking about. He continued speaking in words better suited for a medical peer than a patient.

I wasn’t at all interested in medical-speak. I was a new mom, anxious to return to my family. I wanted to know for how long and to what extent this injury would impact us, and how active I could be at home while caring for our son.

I didn’t get the answers to any of those questions. Instead, my doctor left me feeling overwhelmed, lost, and frustrated.

Using industry jargon is easy to do

Whether you are a doctor, marketer, SEO, or another specialized professional, industry jargon is easy to slip into. This experience made me realize that I was susceptible myself — I’d been speaking to clients all the time with words that made them feel alienated and confused.

The words and phrases that mean a lot to us as SEO professionals mean little or nothing to our customers.

When we utilize these phrases in conversations and assume we’re communicating effectively, we may be leaving our prospects and clients feeling overwhelmed, lost, and frustrated.

Years ago, feeling that way motivated businesses to hire SEO consultants and agencies. Ample industry jargon was tossed about in the sales process, leaving a prospect set on hiring a professional since SEO was too hard to understand.

There was no way that prospect felt confident in taking a DIY approach to getting found by the search engines; there was no other option besides signing on the dotted line. With a signature in hand, an SEO consultant could begin working some behind-the-scenes magic and deliver impactful results.

Today — and over the last five years — this approach no longer works.

Collaboration is the foundation of SEO

Today, we drive results by building a business’s expertise, authority, and trust online. Sure, there are technical SEO tasks to accomplish (and we can’t forget about foundational action items like dialing in title tags and meta descriptions). But long term, significant growth comes from impacting a business’s E-A-T. And for that, collaboration is required.

As an SEO professional, I often think of myself as a rafting guide in the search engine waters. I’ve been down this river before and already know what to expect around the next bend. I’m responsible for leading a team; our collaborative success (or failure) ultimately depends on my timely, appropriate guidance.

Yet it’s not all about me. The team (or client) is just as invested in our success. We’re sharing the same raft, and we’ve chosen to navigate the same river. They have their paddles in the water and are actively engaged in our journey, eager to work together. Working together — collaboration — means success for us all.

Communication is key to collaboration

Effective communication is critical to a collaborative environment; communication relies on language. If a rafting guide says “port side paddle forward,” his team will likely look at him with confusion. If he says “left side paddle forward,” his team will understand his language and take the right action.

One way to improve communication with prospects and clients is to remove industry jargon from our vocabulary. Over the past few years, I’ve challenged myself to use more everyday words in client communication. As a result, we are closing more business and have more satisfied customers seeing better results. It’s a win, win, win.

Here are some practical examples for communicating (and therefore better collaborating) with SEO clients:

XML Sitemap // Your Website’s Resume 

Instead of telling a client that their website “lacks an XML sitemap,” I explain that this file is like a website’s resume. You wouldn’t show up to a job interview without a resume that lists out your assets in an easily digestible format. A resume quickly summarizes your “contents,” or the structure of your relevant roles and experience — just like a sitemap summarizes the contents and structure of a website.
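For reference, a minimal XML sitemap is just a short file listing the pages you’d like search engines to know about. A sketch of one (the example.com URLs and dates are placeholders, not a real site):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2019-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/services/</loc>
        <lastmod>2019-01-10</lastmod>
      </url>
    </urlset>

Each <url> entry is one line on the “resume”: the page’s address and, optionally, when it was last updated.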

Link Building // Relationships 

When a client hears you talk about link building, they instantly recall how they feel when they receive spammy emails requesting a favor in the form of a link exchange. They may worry that this tactic is too aggressive or short-sighted, and in violation of Google’s terms of service. Consider describing “link building” as building a network of a business’s professional relationships that the search engines can quickly and easily understand: it’s like putting up signposts that search engines can read.

Featured Snippet // Above #1

Clients are often hyper-focused on their rankings. If you talk to them about “gaining a featured snippet result,” that language will leave them lost and therefore unengaged in the initiative. Instead, focus on what they want: to rank #1 for a keyword they’ve chosen. If you’re working with a client on a new piece of complete content (to help propel them to the top of the search results by sharing their expertise), you can get the client onboard by telling them the goal is to be “above #1.” 

SEO // Getting Found

Perhaps the most important term of all is “SEO.” We all assume our prospects and clients understand what SEO stands for and why it is important. But more often than not, the acronym alone can lead to confusion. Try substituting “getting found in Google” anytime you’re tempted to say “SEO,” and your client will be connected to the value instead of confounded by the vocabulary.

Removing industry jargon has been the most impactful of our changes to client communication. We also recommend (and practice) sending monthly reports, actively seeking feedback, and setting clear expectations. Read more client communication tips on the Moz blog and at Pathfinder SEO.

What expressions and words do you use in client communications?

Let’s create a shared, jargon-free vocabulary to improve how we talk to our clients. Let’s stop leaving our clients feeling overwhelmed, lost, or frustrated with SEO. After all, collaboration is the foundation of SEO. And to collaborate, we must create — and meet on — shared ground.

Please share your ideas and experiences in the comments below.



Moz Blog

Posted in IM News | Comments Off

Is link building dead? Depends on who you talk to

Some SEOs argue any form of proactive link building is a waste of time. Some say it should be a part of any SEO strategy. So which is it?



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

Real Talk about Moving Forward with Your Big Idea

Great to see you again! This week on Copyblogger, we looked at how to make progress on projects and opportunities that might seem intimidating at first. Stefanie Flaxman showed us how to take that Big Idea (exciting, challenging, scary) and break it down until you discover your first (or next) move. She shared a process
Read More…

The post Real Talk about Moving Forward with Your Big Idea appeared first on Copyblogger.


Copyblogger

Posted in IM News | Comments Off

Good News: We Launched a New Index Early! Let’s Also Talk About 2015’s Mozscape Index Woes

Posted by LokiAstari

Good news, everyone: November’s Mozscape index is here! And it’s arrived earlier than expected.

Of late, we’ve faced some big challenges with the Mozscape index — and that’s hurt our customers and our data quality. I’m glad to say that we believe a lot of the troubles are now behind us and that this index, along with those to follow, will provide higher-quality, more relevant, and more useful link data.

Here are some details about this index release:

  • 144,950,855,587 (145 billion) URLs
  • 4,693,987,064 (4 billion) subdomains
  • 198,411,412 (198 million) root domains
  • 882,209,713,340 (882 billion) links
  • Followed vs nofollowed links
    • 3.25% of all links found were nofollowed
    • 63.49% of nofollowed links are internal
    • 36.51% are external
  • Rel canonical: 24.62% of all pages employ the rel=canonical tag
  • The average page has 81 links on it
    • 69 internal links on average
    • 12 external links on average
  • Correlations with Google Rankings:
    • Page Authority: 0.307
    • Domain Authority: 0.176
    • Linking C-Blocks to URL: 0.255

You’ll notice this index is a bit smaller than much of what we’ve released this year. That’s intentional on our part, in order to get fresher, higher-quality stuff and cut out a lot of the junk you may have seen in older indices. DA and PA scores should be more accurate in this index (accurate meaning more representative of how a domain or page will perform in Google based on link equity factors), and that accuracy should continue to climb in the next few indices. We’ll keep a close eye on it and, as always, report the metrics transparently on our index update release page.

What’s been up with the index over the last year?

Let’s be blunt: the Mozscape index has had a hard time this year. We’ve been slow to release, and the size of the index has jumped around.

Before we get down into the details of what happened, here’s the good news: We’re confident that we have found the underlying problem and the index can now improve. For our own peace of mind and to ensure stability, we will be growing the index slowly in the next quarter, planning for a release at least once a month (or quicker, if possible).

Also on the bright side, some of the improvements we made while trying to find the problem have increased the speed of our crawlers, and we are now hitting just over a billion pages a day.

We had a bug.

There was a small bug in our scheduling code (this is different from the code that creates the index, so our metrics were still good). Previously, this bug had been benign, but due to several other minor issues (when it rains, it pours!), it had a snowball effect and caused some large problems. This made identifying and tracking down the original problem relatively hard.

The bug had far-reaching consequences…

The bug caused lower-value domains to be crawled more frequently than they should have been. Here’s how: we crawled a huge number of low-quality sites over a 30-day period (we’ll elaborate on this further down) and then generated an index that included them. That raised those sites’ Domain Authority above the threshold at which the bug had previously been benign. Once they crossed that threshold (moving from a DA of 0 to a DA of 1), the bug kicked in, and when crawls were scheduled, those domains were treated as if they had a DA of 5 or 6. Billions of low-quality pages flooded the schedule, consuming crawl budget that should have gone to pages on high-quality sites.

…And index quality was affected.

We noticed the drop in high-quality domain pages being crawled. As a result, we started using more and more data to build the index, increasing the size of our crawler fleet so that we expanded daily capacity to offset the low numbers and make sure we had enough pages from high-quality domains to get a quality index that accurately reflected PA/DA for our customers. This was a bit of a manual process, and we got it wrong twice: once on the low side, causing us to cancel index #49, and once on the high side, making index #48 huge.

Though we worked aggressively to maintain the quality of the index, importing more data meant it took longer to process the data and build the index. Additionally, because of the odd shape of some of the domains (see below) our algorithms and hardware cluster were put under some unusual stress that caused hot spots in our processing, exaggerating some of the delays.

However, in the final analysis, we maintained the approximate size and shape of good-quality domains, and thus PA and DA were being preserved in their quality for our customers.

There were a few contributing factors:

We imported a new set of domains from a partner company.

We basically did a swap with them: we showed them all the domains we had seen, and they showed us all the domains they had seen. We had a corpus of 390 million domains, while they had 450 million domains. A lot of this was overlap, but afterwards, we had approximately 470 million domains available to our schedulers.

On the face of it, that doesn’t sound so bad. However, it turns out a large chunk of the new domains we received were domains in .pw and .cn. Not a perfect fit for Moz, as most of our customers are in North America and Europe, but it does provide a more accurate description of the web, which in turn creates better Page/Domain authority values (in theory). More on this below.

Palau, a small island nation in the middle of the Pacific Ocean.

Palau has the TLD of .pw. Seems harmless, right? In the last couple of years, the domain registrar of Palau has been aggressively marketing itself as the “Professional Web” TLD. This seems to have attracted a lot of spammers (enough that even Symantec took notice).

The result was that we got a lot of spam from Palau in our index. That shouldn’t have been a big deal, in the grand scheme of things. But, as it turns out, there’s a lot of spam in Palau. In one index, domains with the .pw extension reached 5% of the domains in our index. As a reference point, that’s more than most European countries.

More interestingly, though, there seem to be a lot of links to .pw domains, but very few outlinks from .pw to any other part of the web.

Here’s a graph showing the outlinks per domain for each region of the index:


China and its subdomains (also known as FQDNs).

In China, it seems to be relatively common for domains to have lots of subdomains. Normally, we can handle a site with a lot of subdomains (blogspot.com and wordpress.com are perfect examples of sites with many, many subdomains). But within the .cn TLD, 2% of domains have over 10,000 subdomains, and 80% have several thousand subdomains. This is much rarer in North America and Europe, in spite of a few outliers like WordPress and Blogspot.

Historically, the Mozscape index has slowly grown its total number of FQDNs, from ¼ billion in 2010 to 1 billion in 2013. Then, in 2014, we started to expand and reached 6 billion FQDNs in the index. In 2015, one index had 56 billion FQDNs!

We found that a whopping 45 billion of those FQDNS were coming from only 250,000 domains. That means, on average, these sites had 180,000 subdomains each. (The record was 10 million subdomains for a single domain.)

Chinese sites are fond of links.

We started running across pages with thousands of links per page. It’s not terribly uncommon to have a large number of links on a particular page. However, we started to run into domains with tens of thousands of links per page, and tens of thousands of pages on the same site with these characteristics.

At the peak, we had two pages in the index with over 16,000 links on each of these pages. These could have been quite legitimate pages, but it was hard to tell, given the language barrier. However, in terms of SEO analysis, these pages were providing very little link equity and thus not contributing much to the index.

This is not exclusively a problem with the .cn TLD; this happens on a lot of spammy sites. But we did find a huge cluster of sites in the .cn TLD that were close together lexicographically, causing a hot spot in our processing cluster.

We had a 12-hour DNS outage that went unnoticed.

DNS is the backbone of the Internet. It should never die. If DNS fails, the Internet more or less dies, as it becomes impossible to look up the IP address of a domain. Our crawlers, unfortunately, experienced a DNS outage.

The crawlers continued to crawl, but marked all the pages they crawled as DNS failures. Generally, when we have a DNS failure, it’s because a domain has “died,” or been taken offline. (Fun fact: the average life expectancy of a domain is 40 days.) This information is passed back to the schedulers, and the domain is blacklisted for 30 days, then retried. If it fails again, then we remove it from the schedulers.
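To illustrate the mechanism, here is a hypothetical sketch in Python. This is not Moz’s actual scheduler code; all names, structures, and the simple two-strike rule are invented to mirror the behavior described above (30-day blacklist on a DNS failure, removal after a failed retry):

    from datetime import datetime, timedelta

    BLACKLIST_DAYS = 30  # how long a domain sits out after a DNS failure

    # Hypothetical in-memory state; the real scheduler is far more involved.
    blacklist = {}   # domain -> datetime when it may be retried
    failures = {}    # domain -> count of consecutive DNS failures
    removed = set()  # domains dropped from scheduling entirely

    def record_dns_failure(domain, now=None):
        """Blacklist the domain for 30 days; drop it after a failed retry."""
        now = now or datetime.utcnow()
        failures[domain] = failures.get(domain, 0) + 1
        if failures[domain] >= 2:
            # Failed again after the 30-day retry: remove from the schedulers.
            removed.add(domain)
        else:
            blacklist[domain] = now + timedelta(days=BLACKLIST_DAYS)

    def is_schedulable(domain, now=None):
        """A domain can be scheduled only if it isn't removed or blacklisted."""
        now = now or datetime.utcnow()
        if domain in removed:
            return False
        retry_at = blacklist.get(domain)
        return retry_at is None or now >= retry_at

During the outage, every domain the crawlers touched in that 12-hour window took the “blacklisted” path, which is why so many high-value domains dropped out of the schedule for 30 days.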

In a 12-hour period, we crawl a lot of sites (approximately 500,000). We ended up banning a lot of sites from being recrawled for a 30-day period, and many of them were high-value domains.

Because we banned a lot of high-value domains, we filled that space with lower-quality domains for 30 days. This isn’t a huge problem for the index, as we use more than 30 days of data — in the end, we still included the quality domains. But it did cause a skew in what we crawled, and we took a deep dive into the .cn and .pw TLDs.

This caused the perfect storm.

We imported a lot of new domains (whose initial DA is unknown) that we had not seen previously. These would have been crawled slowly over time and would likely have been assigned a DA of 0, because their linkage with other domains in the index would be minimal.

But, because we had a DNS outage that caused a large number of high-quality domains to be banned, we replaced them in the schedule with a lot of low-quality domains from the .pw and .cn TLDs for a 30-day period. These domains, though not connected to other domains in the index, were highly connected to each other. Thus, when an index was generated with this information, a significant percentage of these domains gained enough DA to make the bug in scheduling non-benign.

With lots of low-quality domains now being available for scheduling, we used up a significant percentage of our crawl budget on low-quality sites. This had the effect of making our crawl of high-quality sites more shallow, while the low-quality sites were either dead or very slow to respond — this caused a reduction in the total number of actual pages crawled.

Another side effect was the shape of the domains we crawled. As noted above, domains on the .pw and .cn TLDs seem to follow a different linking strategy, both externally to one another and internally to themselves, compared with North American and European sites. This shape caused a couple of problems during processing and increased the time needed to build the index (due to the unexpected shape and the resulting hot spots in our processing cluster).

What measures have we taken to solve this?

We fixed the originally benign bug in scheduling. This was a two-line code change to make sure that domains were correctly categorized by their Domain Authority. We use DA to determine how deeply to crawl a domain.
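The post doesn’t share the actual fix, but conceptually the scheduling decision is a mapping from a domain’s DA to how deeply it gets crawled. A hypothetical Python sketch (the buckets and page budgets below are made up for illustration; they are not Moz’s real values):

    def crawl_depth_for(domain_authority):
        """Map a domain's DA (0-100) to a per-domain page budget (illustrative)."""
        if domain_authority <= 1:
            return 10          # barely-known domains get a token crawl
        elif domain_authority < 10:
            return 500
        elif domain_authority < 40:
            return 50_000
        else:
            return 1_000_000   # well-established domains are crawled deeply

The bug described above effectively dropped DA-1 domains into a higher bucket, so billions of low-quality pages ate crawl budget that should have gone to high-quality sites.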

During this year, we have increased our crawler fleet and added some extra checks in the scheduler. With these new additions and the bug fix, we are now crawling at record rates and seeing more than 1 billion pages a day being checked by our crawlers.

We’ve also improved.

There’s a silver lining to all of this. The interesting shapes of data we saw caused us to examine several bottlenecks in our code and optimize them. This helped improve our performance in generating an index. We can now automatically handle some odd shapes in the data without any intervention, so we should see fewer issues with the processing cluster.

More restrictions were added.

  1. We have a maximum link limit per page: we keep only the first 2,000 links (see the sketch after this list).
  2. We have banned domains with an excessive number of subdomains.
    • Any domain that has more than 10,000 subdomains has been banned…
    • …Unless it is explicitly whitelisted (e.g. WordPress.com).
      • We have ~70,000 whitelisted domains.
    • This ban affects approximately 250,000 domains (most with .cn and .pw TLDs)…
      • …and has removed 45 billion subdomains. Yes, BILLION! You can bet that was clogging up a lot of our crawl bandwidth with sites Google probably doesn’t care much about.
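Here’s a rough sketch of how those two restrictions might be applied (illustrative only; the function and variable names are invented, and the whitelist shown is a tiny stand-in for the real ~70,000-domain list):

    MAX_LINKS_PER_PAGE = 2000    # keep only the first 2,000 links on a page
    MAX_SUBDOMAINS = 10_000      # ban domains with more subdomains than this

    WHITELIST = {"wordpress.com", "blogspot.com"}  # ~70,000 entries in practice

    def keep_links(links):
        """Truncate a page's outlinks to the first 2,000."""
        return links[:MAX_LINKS_PER_PAGE]

    def is_domain_banned(domain, subdomain_count):
        """Ban domains with excessive subdomains unless explicitly whitelisted."""
        if domain in WHITELIST:
            return False
        return subdomain_count > MAX_SUBDOMAINS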

We made positive changes.

  1. Better monitoring of DNS (complete with alarms).
  2. Banning domains after DNS failure is not automatic for high-quality domains (but still is for low-quality domains).
  3. Several code quality improvements that will make generating the index faster.
  4. We’ve doubled our crawler fleet, with more improvements to come.

Now, how are things looking for 2016?

Good! But I’ve been told I need to be more specific. :-)

Before we get to 2016, we still have a good portion of 2015 to go. Our plan is to stabilize the index at around 180 billion URLs by the end of the year and release an index predictably every three weeks.

We are also in the process of improving our correlations to Google’s index. Currently our fit is pretty good at a 75% match, but we’ve been higher at around 80%; we’re testing a new technique to improve our metrics correlations and Google coverage beyond that. This will be an ongoing process, and though we expect to see improvements in 2015, these improvements will continue on into 2016.

Our index struggles this year have taught us some very valuable lessons. We’ve identified some bottlenecks and their causes. We’re going to attack these bottlenecks and improve the performance of the processing cluster to get the index out quicker for you.

We’ve improved the crawling cluster and now exceed a billion pages a day. That’s a lot of pages. And guess what? We still have some spare bandwidth in our data center to crawl more sites. We plan to improve the crawlers to increase our crawl rate, reducing the number of historical days in our index and allowing us to see much more recent data.

In summary, in 2016, expect to see larger indexes, at a more consistent time frame, using less historical data, that maps closer to Google’s own index. And thank you for bearing with us, through the hard times and the good — we could never do it without you.



Moz Blog

Posted in IM News | Comments Off

Marketers Talk Hummingbird, ‘(Not Provided)’ & More Ahead of SES Chicago 2013

Several Meetup Groups plan to converge on the SES Chicago conference this week. Here, organizers of several of these groups share insights on Hummingbird’s impact on SEO; how to cope without keyword data; and their Google holiday wish list.
Search Engine Watch – Latest

Posted in IM News | Comments Off

Which States Like to Talk About Films and Other Social Movie Facts

It’s Friday, which is the perfect time to talk movies. Have you seen any of this summer’s blockbusters? Are there films on your list that you’re still hoping to see before they head to DVD? And more importantly, have you been sharing your movie dreams online?

If, like me, you live in California, then you probably went to the movies but you probably didn’t talk about it. If you live in Wyoming, that’s a whole ‘nother story.

According to ShareThis, folks in the coastal states (the blue ones) tend to watch more movies, but it’s the folks in the central states that talk about movies online.

Action/Adventure and Sci-Fi movies were the most shared, with 2.6x the shares per film of any other category. The most popular film was Iron Man 3. People began sharing blog posts, pics, and video trailers a full 40 days before the release, and they kept on talking until two weeks after the premiere. Kudos to the PR team that helped initiate that buzz.

Facebook may be the king of social networking, but when it comes to movie talk, Twitter reigns supreme. According to ShareThis, 46% of movie conversations happened outside of Facebook. In addition to Twitter, movie fans also like to share by email and Reddit.

In “yeah, that makes sense” news, people who shared movie-related content were 6 times more likely to buy movie tickets.


Family oriented and animated films had the biggest lift but Sci-Fi fans also liked to talk then buy.

Sharing Time

Most sharing happens just before the release but moviegoers, unlike the average social media maven, tend to post between noon and 3pm. For the average poster, that spike happens between 7 and 10 in the evening. I say it’s because movie goers are busy watching TV at night, so they post during the day while they’re bored at work.

How do you fit into this picture? Are you a frequent movie-goer or a frequent movie-sharer? Why not give your favorite film a boost this week by talking about it online? Write a review, share an article or a photo. It doesn’t have to be a new movie. Share your old movie love, too.

Me? I’ve got Sunset Boulevard on the DVR and I hear Jaws 3 is playing on Encore. That’s me, all over the place.

See you at the movies!

 

 


Marketing Pilgrim – Internet News and Opinion


Posted in IM News | Comments Off

Nuance Creates Mobile Advertisements That Talk Back

Conversations, word-of-mouth, chat — it’s amazing how many auditory words we use to describe text-based actions. Replying to comments on Facebook is a conversation, sharing a product review on a blog is word-of-mouth, and chatrooms have functioned for years without any chatter at all.

Nuance is going to change that. They’re using voice recognition technology to create ads that actually converse with the consumer. As in “talk” with sound, not just words scrolling across the width of a box.

Nuance’s voice ads borrow from some old-school ideas that work. For example, the moving ad. Remember those early banner ads that challenged you to click on a moving target? By replicating the carnival game experience, they got thousands of people to click through to a sales website.

Moving things catch our eye, that’s why video works so well. Add in gamification and people can’t resist. There’s something even more compelling about voice activation. We’d use it to turn the toaster on even if pushing a button was more convenient because it’s fun, it’s futuristic.

Nuance packed all of those elements into an ad for a make-believe product. Here it is in action:

Pretty cool (they released this yesterday, but it’s not an April Fool’s joke, is it?)

Because these ads are designed for mobile, they can react based on location. I could see a big restaurant chain using this ad to help people find the nearest restaurant. Tell it what kind of food you want and your price point and it leads you there.

Michael Thompson, executive vice president and general manager for Nuance Mobile, says:

“Voice has already changed the mobile interface, making it faster and easier for consumers to discover and access information, and find people and content. Mobile advertising shouldn’t be any different, and should be designed specifically around the unique capabilities of the mobile device.”

And even though any sane person knows that the ad is simply choosing from a random collection of responses, we want to believe, so we will believe. Just like we believe in the Magic 8 Ball, fortune cookies and every silly decision app in the iTunes store.

Voice ads are one step toward creating smartphone and tablet apps that truly take advantage of what these devices do best. Let’s see what else we can come up with.


Marketing Pilgrim – Internet News and Opinion

Posted in IM News | Comments Off

Why Does Business Talk That Way?

It goes like this.

Our Mission is to provide the best SEO services in the world. We nurture win-win scenarios to create enduring value for our customers. We were voted top SEO agency in Texas and voted the best place to work. We value our staff – we want people to be the best we can be, so as we can maintain our preeminent position in the search industry

Place gun to my head. Pull trigger.

How many times have you come across corporate-speak and thought “who are these people trying to kid”? Yet, when many business people sit down to write, that is the sort of thing they invariably come up with.

Why?

Because they are business people. They are talking about business. That is how business sounds.

Well, it’s how they think business should sound, because that’s the way it has always sounded – a monotone drone of description, chest puffed out. These people are stuck in the business-speak echo chamber.

No one sounds like business-speak in real life. If you ask someone how their job is going, a lot of them will invariably say “it sucks”, “too busy”, or “it’s okay”. These same people might work for the firm that says it was nominated “best place to work”. The image and the reality don’t match. At best, people will ignore business-speak. No one really believes it.

There are better ways to communicate.

Truth

A lot of business-speak fails to communicate because it isn’t rooted in truth.

I once worked at a telecommunications company. The marketing team was having a meeting about a new brochure and came up with the slogan – I am not making this up – “(Company Name) – first in service!”. Once I stopped wondering how any of these people ever managed to land a job in marketing, I asked how we knew we were “first in service”? It seemed a reasonable question, but I may as well have asked the Pope if he really believed in God.

Apparently, it was self-evident we were first in service! There was no basis of truth in it, of course. Just an empty slogan, meaning nothing. No measurement. It was a phrase that “sounded positive!”

I doubt any customers believed it, especially those waiting in call queues.

Do you notice how some small companies try to appear large? They list multiple offices when, in reality, they consist of two guys with a call forwarding service. I’m not quite sure why a company would pretend to be bigger than it actually is, because as soon as they get a customer, they are going to get found out. The feeling they’ll likely leave with that customer is that they are fundamentally dishonest.

Which is a strange approach to take.

Many customers consider small to be an advantage. Small can mean you are more connected with your customers as there is no barrier between you and the customer. They can talk directly to you. They can email you. They can see you Twittering. Many customers love that. Big companies have “policies”. They have call centers. They have barriers to entry. It’s no wonder they talk in business-speak. It’s just another means to keep people at a distance.

Small companies sometimes try to appear big because they think they need to be big in order to attract big companies as clients. This is sometimes true, but mostly false. It is true that big companies often like to deal with other big companies, mostly so they can successfully sue them if they stuff up. It is false because smart big companies will know a great idea when they hear one, and size simply won’t be a consideration so long as the small company has got something the big company wants.

For example, I mentioned I’d been reading “The Pumpkin Plan” recently. There is a story about a tiny two person company. They came up with a new way of marketing pharmaceuticals.

One major problem many pharmaceutical companies face is that they need to change their marketing approach in different regions, even though they are marketing the exact same product.

In some areas, they have to market based on price (Los Angeles). In other markets they need to influence the cardiologists (Boston). In other areas they must talk directly to African-American patients (Atlanta). Exact same product, different marketing strategy for each city. Get the strategy wrong, and they waste a lot of money and lose market share.

Two guys came up with a way to crunch the numbers that tell pharmaceutical companies exactly what the biggest driver of performance is in each territory.

Through a network of colleagues, they managed to land a meeting with a pharmaceutical company. Not just any meeting – they go straight to the top floor and talk to the Chairman of Johnson & Johnson Pharmaceuticals. They barely get five slides into their presentation when the Chairman stops them to call in his VP of marketing. They both love the idea! This solves a big problem for Johnson & Johnson. The result is that this two-person company lands $500K worth of business on the spot, $4m worth of business in the first two years, and $14.2m by year four. They expand, of course.

So, they were two guys pitching to one of the biggest pharmaceutical businesses in the world. They landed millions of dollars worth of business because Johnson & Johnson liked their idea. They didn’t need to convince Johnson & Johnson they were anything more than two guys with a good idea. It didn’t require any business-speak about mission statements, just a focus on finding and solving a real problem.

Tell A True Story About You (And Them)

If you’re ever tempted to write business-speak, try telling a story instead. Turn your pitches into stories. Turn your proposal into stories. Turn your presentations into stories. Make them true stories. Tell them in your authentic voice. People love to be told a story as stories are both familiar and revealing. A string of facts is never going to have the same impact. Business-speak will invariably leave an audience focused on their smartphones.

A story can be about how you solved a problem in the past. A problem just like the one your prospective clients are having. What was the problem? Why was it painful? What did you do to solve it? What was the result?

Easy and memorable. You can structure almost anything as a story. Stories move from the status quo, straight into a crisis (business problem), then the crisis is resolved, and a new status quo is reached. Start with a problem. Explain why it is painful. Bring in the hero – you – and tell them what you did to solve the problem. Then tell them the result – the new status quo.

Are you more likely to recall the text of my opening paragraph, or the story about the two guys pitching to Johnson and Johnson?

Stories can be so much more effective than business-speak.


SEO Book.com

Posted in IM News | Comments Off

We need to talk about Google – the trouble with ranking signals


The results are in trouble

Partly due to abuse throughout the search industry and partly due to a lack of foresight, Google appear to be in an awkward position where the current search signals are causing large amounts of disruption. Negative SEO has become more prevalent, people are now even demanding money to take down links, and webmasters are threatening to sue. The situation has got so bad that Google are penalizing themselves. It’s a mess.

I’m not going to go through a point by point analysis of why links don’t work as others have already put it far better than I could.

The key here is that businesses that want to rank cannot be trusted to play nice. Links as votes don’t work if the knowledgeable can manipulate the vote.

Google’s response is a heavier reliance on social media signals, which are themselves as unreliable as links, if not more so: we have all witnessed the rise in social media spam. These signals seem to leave the search results even further from the ideal that what’s best for the user will rank highest.

Google is frantically reaching for ways to disrupt this pattern. Google+ isn’t an attempt to outdo Facebook; it’s an attempt to sort out the mess the search results are in right now. As the search engine giant struggles, algorithm updates may become even more drastic than those we have seen already.

Google need to go back to looking at content

One of the core problems is that they are relying too heavily on user-generated signals to show them what is the best thing to rank. Links and social are great, but not when used in the broad way that Google are using them. A search engine should be able to pinpoint the exact information I want, not paint big, broad brush strokes based on what’s popular.

This is why Google need to step away from their reliance on these signals and go back to the content, looking not just at what a page says but at what it means. We all know ‘an apple’ is an apple, but Google has almost completely forgotten about the fruit:




I’m no language expert but I know if someone is searching for ‘an apple’ rather than ‘apple’ they are most likely interested in information about the fruit. Not a tech firm. Because of the weight placed on the link and social signals, pages about boring old fruit get overwhelmed. The exception here is the ubiquitous Wikipedia, the only on topic page with the power to rank in this search.



Exciting things have stronger signals than boring things. You can have the best, most well researched, informative piece of content out there, yet you will rank below the guy who has run a series of competitions, linkbait and infographics on his site. You’re too busy writing and researching great content to market it. But Google would rather take the other guy who has pushed hard and garnered a lot of attention. So now it’s not about who’s creating the best content but who’s marketing their content best.



Google seem to have forgotten what we use search engines for. What we are looking for isn’t ‘stuff’ but information: precise facts, products, and details. If we just want to find out what everyone is talking about, that’s what we use social media for. My Twitter feed brings me a constant stream of ‘stuff’ that I might be interested in. With the rise of social media, Google needs to get serious: I want the best answer to my question, not the most popular. For instance, 58% of Brits believe that Mt. Everest is the UK’s tallest mountain. It’s not; it’s in Nepal, for a start. I don’t want the content to be dictated by the crowd, I want it to be dictated by quality.



Here’s another example:








In the UK, I barely even know that there is a shop called Target at all. They have no UK online store or presence that I’m aware of. Google is so desperate to show me a brand, though, that rather than showing me the wealth of other things called Target, like a 2004 movie or a UK TV show, they flood the first page of the results with repetitive information about something that is of no use to me.



Just fixing spam won’t fix this


The job of the search engine is to find the content that will best satisfy my query. As a webmaster I should be able to just create great content and leave Google to do the rest. However we all know that this doesn’t work. If you build it they won’t come, but if you market it they will.


So let’s say in an ‘ideal’ world Google would clear out the spam and count only the sites which have genuine signals. Anyone who works with clients in markets such as finance will know how difficult it is to gain genuine links in these sectors.



Many markets aren't exciting, and a specialist site in a ‘boring’ niche is not a great natural link generator. More general sites which operate across more niches are better link generators. For example, think about Wikipedia and how it already ranks almost everywhere.



The sites which can succeed are most likely to be bigger, where they can leverage scale, partnerships and expertise to their advantage. This means that the experts carefully crafting content around a single area will be drowned out by the jack of all trades covering a range of topics.



So what should Google do?



This is an issue of trust. Google can quickly establish which of the sites in their index may have the best content for you; however, this isn’t very exact. There are lots of sites with very similar content, meaning Google need additional metrics to tell them which of those sites they should display first.



Better understanding of user intent is key. By making the initial relevancy decision (i.e. which of this big list of sites is most relevant to the query), they would be able to start off with a more refined list. This would mean less reliance on those trust metrics.



User behavior also holds some of the answers here; Google does take notice of how we behave in the search results. For instance, click-through rates from the results are part of a signal Google use to detect where low-quality content has crept into the results. More reliance on these behavioral metrics, which can be made hard to manipulate, could help refine results better.



More openness and honesty with webmasters would go a long way. Whenever Google makes a change, the whole industry scrambles to react, trying to bring their sites in line with the changes. Information around these changes is usually sketchy at best. This means that the changes made can be unnecessary at best or damaging at worst. That doesn’t make the search results better; it creates confusing signals for Google and costs businesses time and money.



All we want is to get our content found. If the way to do this were to create the best content possible, as Google is so fond of telling us, then that is what we would do. However, we all know that’s just not enough. If Google could openly show that this was actually the case, then webmasters would gladly divert the money spent on other dubious SEO tactics into just creating the best content possible, and surely that would create a better web for everyone.

 

Wordtracker Blog

Posted in IM News | Comments Off

New Social Study: Men Get Personal, Women Talk Shop

Women may be tops when it comes to communicating but online, they’re more careful about what they say and to whom.

A new study from UK company uSamp shows that overall, men were more apt to share personal information online, topping women in every area except one: brands liked.

78% of the UK women surveyed said they’d be happy to share information about what they buy and even 74% of men said they’d share that info, too.

One of the biggest gaps between “I’d share” and “I wouldn’t” was date of birth. 55% of men said they’d share their birth date but only 45% of women were willing to admit how old they were.

Men were also much more willing to share their phone number online, 12% vs only 4% of women.

The phone number, like the home address, is a security issue. But income was another taboo subject, with men coming in highest at only 10%.

The big surprise? 70% of women were happy to share their relationship status. Men actually topped that number with 73%. The survey didn’t say, but I wonder how many of those men were married.

Education and occupation were two other bits of info that scored high on the “happy to share” scale.

The survey doesn’t only relate to public, social sharing. It can be inferred that folks feel the same way about sharing this information with websites and retailers. Keep this in mind when you’re putting together your registration screen and profile pages. More and more sites are making a phone number a required box and that’s likely to drive away customers. Feel free to ask for the moon, but don’t require your potential customers to give out more info than you really need to service them properly.




Marketing Pilgrim – Internet News and Opinion

Posted in IM News | Comments Off

