Tag Archive | "Used"

Case Study: How a Media Company Grew 400% and Used SEO to Get Acquired

Posted by Gaetano-DiNardi-NYC

Disclaimer: I’m currently the Director of Demand Generation at Nextiva, and writing this case study post-mortem as the former VP of Marketing at Sales Hacker (Jan. 2017 – Sept. 2018).



Every B2B company is investing in content marketing right now. Why? Because they all want the same thing: Search traffic that leads to website conversions, which leads to money.

But here’s the challenge: Companies are struggling to get traction because competition has reached an all-time high. Keyword difficulty (and CPC) has skyrocketed in most verticals. In my current space, Unified Communications as a Service (UCaaS), some of the CPCs have nearly doubled since 2017, with many keywords hovering close to $300 per click.

Not to mention, organic CTRs are declining, and zero-click queries are rising.

Bottom line: If you’re not creating 10x quality content based on strategic keyword research that satisfies searcher intent and aligns back to business goals, you’re completely wasting your time.

So, that’s exactly what we did. The outcome? We grew from 19k monthly organic sessions to over 100k monthly organic sessions in approximately 14 months, leading to an acquisition by Outreach.io.

We validated our hard work by measuring organic growth (traffic and keywords) against our email list growth and revenue, which correlated positively, as we expected. 
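
For anyone who wants to run the same sanity check, here is a minimal sketch, assuming you have exported the monthly numbers to a CSV (the filename and column names are illustrative):

```python
# Sanity-check that organic growth tracks list growth and revenue.
# Assumes a monthly CSV export; the filename and column names are illustrative.
import pandas as pd

df = pd.read_csv("monthly_metrics.csv")  # columns: month, organic_sessions, email_list_size, revenue

# Pearson correlations near +1 mean the series are rising together,
# which is the positive correlation described above.
print(df[["organic_sessions", "email_list_size", "revenue"]].corr(method="pearson"))
```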

Organic Growth Highlights

January 2017–June 2018

As soon as I was hired at Sales Hacker as Director of Marketing, I began making SEO improvements from day one. While I didn’t waste any time, you’ll also notice that there was no silver bullet.

This was the result of daily blocking and tackling. Pure execution and no growth hacks or gimmicks. However, I firmly believe that the homepage redesign (in July 2017) was a tremendous enabler of growth.

Organic Growth to Present Day

I officially left Sales Hacker in August of 2018, when the company was acquired by Outreach.io. However, I thought it would be interesting to see the lasting impact of my work by sharing a present-day screenshot of the organic traffic trend, via Google Analytics. There appears to be a dip immediately following my departure; however, it looks like my successor, Colin Campbell, has picked up the slack and gotten the train back on the rails. Well done!

Unique considerations — Some context behind Sales Hacker’s growth

Before I dive into our findings, here’s a little context behind Sales Hacker’s growth:

  • Sales Hacker’s blog is 100 percent community-generated — This means we didn’t pay “content marketers” to write for us. Sales Hacker is a publishing hub led by B2B sales, marketing, and customer success contributors. This can be a blessing and a curse at the same time — on one hand, the site gets loads of amazing free content. On the other hand, the posts are not even close to being optimized upon receiving the first draft. That means the editorial process is intense and laborious.
  • Aggressive publishing cadence (4–5x per week) — Sales Hacker built an incredible reputation in the B2B Sales Tech niche — we became known as the go-to destination for unbiased thought leadership for practitioners in the space (think of Sales Hacker as the sales equivalent to Growth Hackers). Due to high demand and popularity, we had more content available than we could handle. While it’s a good problem to have, we realized we needed to keep shipping content in order to avoid a content pipeline blockage and a backlog of unhappy contributors.
  • We had to “reverse engineer” SEO — In short, we got free community-generated and sponsored content from top sales and marketing leaders at SaaS companies like Intercom, HubSpot, Pipedrive, LinkedIn, Adobe and many others, but none of it was strategically built for SEO out of the box. We also had contributors like John Barrows, Richard Harris, Lauren Bailey, Tito Bohrt, and Trish Bertuzzi giving us a treasure trove of amazing content to work with. However, we had to collaborate with each contributor from beginning to end and guide them through the entire process. Topical ideation (based on what they were qualified to write about), keyword research, content structure, content type, etc. So, the real secret sauce was in our editorial process. Shout out to my teammate Alina Benny for learning and inheriting my SEO process after we hired her to run content marketing. She crushed it for us!
  • Almost all content was evergreen and highly tactical — I made it a rule that we’d never agree to publish fluffy pieces, whether it was sponsored or not. Plain and simple. Because we didn’t allow “content marketers” to publish with us, our content had a positive reputation, since it was coming from highly respected practitioners. We focused on evergreen content strategies in order to fuel our organic growth. Salespeople don’t want fluff. They want actionable and tactical advice they can implement immediately. I firmly believe that achieving audience satisfaction with our content was a major factor in our SEO success.
    • Outranking the “big guys” — If you look at the highest-ranking sales content, it’s the usual suspects: HubSpot, Salesforce, Forbes, Inc, and many other sites that were far more powerful than Sales Hacker. But it didn’t matter as much as traditional SEO wisdom tells us, largely because we had authenticity and rawness to our content. We realized most sales practitioners would rather read insights from their peers in their community than the traditional “Ultimate Guides,” which tended to be a tad dry.
    • We did VERY little manual link building — Our link building was literally an email from me, or our CEO, to a site we had a great relationship with. “Yo, can we get a link?” It was that simple. We never did large-scale outreach to build links. We were a very lean, remote digital marketing team, and therefore lacked the bandwidth to allocate resources to link building. However, we knew that we would acquire links naturally due to the popularity of our brand and the highly tactical nature of our content.
    • Our social media and brand firepower helped us naturally acquire links — It helps A LOT when you have a popular brand on social media and a well-known CEO who authored an essential book called “Hacking Sales.” Most of Sales Hacker’s articles would get widely circulated by 50+ SaaS partners, which helped drive natural links.
    • Updating stale content was the lowest-hanging fruit — The biggest chunk of our newfound organic traffic came from updating and refreshing old posts. We have specific examples of this coming up later in the post.
    • Email list growth was the “north star” metric — Because Sales Hacker is not a SaaS company, and the “product” is the audience, there was no need for aggressive website CTAs like “book a demo.” Instead, we built a very relationship-heavy, referral-based sales cadence that was supported by marketing automation, so list growth was the metric to pay attention to. This was also a key component of positioning Sales Hacker for acquisition. Here’s how the email growth progression was trending.

    So, now that I’ve set the stage, let’s dive into exactly how I built this SEO strategy.

    Bonus: You can also watch the interview I had with Dan Shure on the Evolving SEO Podcast, where I break down this strategy in great detail.

    1) Audience research

    Imagine you are the new head of marketing for a well-known startup brand. You are tasked with tackling growth and need to show fast results — where do you start?

    That’s the exact position I was in. There were a million things I could have done, but I decided to start by surveying and interviewing our audience and customers.

    Because Sales Hacker is a business built on content, I knew this was the right choice.

    I also knew that I would be able to stand out in an unglamorous industry by talking to customers about their content interests.

    Think about it: B2B tech sales is all about numbers and selling stuff. Very few brands are really taking the time to learn about the types of content their audiences would like to consume.

    When I was asking people if I could talk to them about their media and content interests, their response was: “So, wait, you’re actually not trying to sell me something? Sure! Let’s talk!”

    Here’s what I set out to learn:

    • Goal 1 — Find one major brand messaging insight.
    • Goal 2 — Find one major audience development insight.
    • Goal 3 — Find one major content strategy insight.
    • Goal 4 — Find one major UX / website navigation insight.
    • Goal 5 — Find one major email marketing insight.

    In short, I accomplished all of these learning goals and implemented changes based on what the audience told me.

    If you’re curious, you can check out my entire UX research process for yourself, but here are some of the key learnings:

    Based on these outcomes, I was able to determine the following:

    • Topical “buckets” to focus on — Based on the most common daily tasks, the data told us to build content on sales prospecting, building partnerships and referral programs, outbound sales, sales management, sales leadership, sales training, and sales ops.
    • Thought leadership — 62 percent of site visitors said they kept coming back purely due to thought leadership content, so we had to double down on that.
    • Content types — Step-by-step guides, checklists, and templates were highly desired. This told me that fluffy BS content had to be ruthlessly eliminated at all costs.
    • Sales Hacker Podcast — 76 percent of respondents said they would listen to the Sales Hacker Podcast (if it existed), so we had to launch it!

    2) SEO site audit — Key findings

    I can’t fully break down how to do an SEO site audit step by step in this post (because it would be way too much information), but I will share the key findings and takeaways from our own Site Audit that led to some major improvements in our website performance.

    Lack of referring domain growth

    Sales Hacker was not able to acquire referring domains at the same rate as competitors. I knew this wasn’t because of a link building acquisition problem, but due to a content quality problem.

    Lack of organic keyword growth

    Sales Hacker had been publishing blog content for years (before I joined) and there wasn’t much to show for it from an organic traffic standpoint. However, I do feel the brand experienced a remarkable social media uplift by building content that was helpful and engaging. 

    Sales Hacker did happen to get lucky and rank for some non-branded keywords by accident, but the amount of content published versus the amount of traffic it was getting didn’t add up.

    To me, this immediately screamed that there was an issue with on-page optimization and keyword targeting. It wasn’t anyone’s fault – this was largely due to a startup founder thinking about building a community first, and then bringing SEO into the picture later. 

    At the end of the day, Sales Hacker was only ranking for 6k keywords at an estimated organic traffic cost of $8.9k — which is nothing. By the time Sales Hacker got acquired, the site had an organic traffic cost of $122k.
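
For context, “organic traffic cost” is a third-party estimate (SEMrush-style) of what your organic clicks would cost if you bought them as PPC ads: roughly, the sum over your ranking keywords of estimated clicks times CPC. A tiny sketch of the arithmetic, with invented numbers:

```python
# "Organic traffic cost" arithmetic: what the same clicks would cost as PPC.
# Keyword rows are invented purely to illustrate the calculation.
keywords = [
    # (keyword, estimated monthly organic clicks, estimated CPC in dollars)
    ("sales operations", 1200, 6.50),
    ("cold calling tips", 800, 4.25),
    ("best sales books", 500, 2.10),
]

traffic_cost = sum(clicks * cpc for _, clicks, cpc in keywords)
print(f"Estimated monthly organic traffic cost: ${traffic_cost:,.2f}")  # $12,250.00
```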

    Non-optimized URLs

    This is common among startups that are just looking to get content out. This is just one example, but truth be told, there was a whole mess of non-descriptive URLs that had to get cleaned up.

    Poor internal linking structure

    The internal linking concentration was poorly distributed. Most of the equity was pointing to some of the lowest value pages on the site.
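
One way to get this picture for your own site is to count inbound internal links per URL from a crawler export. A minimal sketch, assuming a CSV of source/target link pairs (the filename and column names are illustrative):

```python
# Rough picture of internal link equity: count inbound internal links per URL.
# Assumes a crawler export of internal links with source/target columns;
# the filename and column names are illustrative.
import csv
from collections import Counter

inbound = Counter()
with open("internal_links.csv", newline="") as f:
    for row in csv.DictReader(f):  # columns: source, target
        inbound[row["target"]] += 1

# The most-linked pages receive the most equity; check that these are
# actually high-value pages rather than tag pages or other cruft.
for url, count in inbound.most_common(20):
    print(f"{count:>5}  {url}")
```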

    Poor taxonomy, site structure, and navigation

    I created a mind-map of how I envisioned the new site structure and internal linking scheme. I wanted all the content pages to be organized into categories and subcategories.

    The new proposed taxonomy was designed to accomplish the following:

    • Increase engagement from natural site visitor exploration
    • Allow users to navigate to the most important content on the site
    • Improve landing page visibility from an increase in relevant internal links pointing to them.

    Topical directories and category pages eliminated with redirects

    Topical landing pages used to exist on SalesHacker.com, but they were eliminated with 301 redirects and disallowed in robots.txt. I didn’t agree with this configuration. Example: /social-selling/
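
The reason this configuration is a problem: if a redirected URL is also disallowed in robots.txt, crawlers that obey robots.txt never get to see the 301, so the redirect can’t consolidate link equity. Here is a minimal sketch of a checker for that conflict, assuming the third-party requests package (the site and paths are illustrative):

```python
# Flag URLs that 301-redirect but are also disallowed in robots.txt, so
# crawlers that obey robots.txt may never see the redirect.
# Requires the third-party `requests` package; site and paths are illustrative.
from urllib import robotparser
import requests

SITE = "https://www.saleshacker.com"
PATHS = ["/social-selling/"]  # the example path mentioned above

rp = robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

for path in PATHS:
    url = SITE + path
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308) and not rp.can_fetch("Googlebot", url):
        print(f"CONFLICT: {url} redirects ({resp.status_code}) but is blocked by robots.txt")
```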

    Trailing slash vs. non-trailing slash duplicate content with canonical errors

    Multiple pages existed for the same exact intent, and no canonical version was specified.
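
Here is a minimal sketch of how you might spot this issue, assuming the third-party requests package (the URL and the naive regex are illustrative):

```python
# Compare the slash and no-slash variants of a path: ideally one redirects
# to the other, or both declare the same canonical URL.
# Requires `requests`; the URL and the naive regex are illustrative.
import re
import requests

def canonical_of(url):
    resp = requests.get(url, timeout=10, allow_redirects=False)
    if resp.status_code in (301, 302, 308):
        return f"redirects to {resp.headers.get('Location')}"
    # Naive pattern: assumes rel comes before href inside the link tag.
    match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', resp.text)
    return match.group(1) if match else "NO CANONICAL TAG"

for variant in ("https://www.saleshacker.com/sales-operations",
                "https://www.saleshacker.com/sales-operations/"):
    print(variant, "->", canonical_of(variant))
```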

    Branded search problems — “Sales Hacker Webinar”

    Some of the site’s most important content is not discoverable from search due to technical problems. For example, a search for “Sales Hacker Webinar” returns irrelevant results in Google because there isn’t an optimized indexable hub page for webinar content. It doesn’t get that much search volume (0–10 monthly volume according to Keyword Explorer), but still, that’s 10 potential customers you are pissing off every month by not fixing this.

    3) Homepage — Before and after

    Sooooo, this beauty right here (screenshot below) was the homepage I inherited in early 2017 when I took over the site.

    Fast forward six months later, and this was the new homepage we built after doing audience and customer research…

    New homepage goals

    • Tell people EXACTLY what Sales Hacker is and what we do.
    • Make it stupidly simple to sign up for the email list.
    • Allow visitors to easily and quickly find the content they want.
    • Add social proof.
    • Improve internal linking.

    I’m proud to say that it all went according to plan. I’m also proud to say that, as a result, organic traffic skyrocketed shortly after.

    Special Note: Major shout out to Joshua Giardino, the lead developer who worked with me on the homepage redesign. Josh is one of my closest friends and my marketing mentor. I would not be writing this case study today without him!

    There wasn’t one super measurable thing we isolated in order to prove this. We just knew intuitively that there was a positive correlation with organic traffic growth, and figured it was due to the internal linking improvements and increased average session duration from improving the UX.

    4) Updating and optimizing existing content

    Special note: We enforced “Ditch the Pitch”

    Before I get into the nitty-gritty SEO stuff, I’ll tell you right now that one of the most important things we did was block contributors and sponsors from linking to product pages and injecting screenshots of product features into blog articles, webinars, etc.

    Side note: One thing we also had to do was add a nofollow attribute to all outbound links within sponsored content that sent referral traffic back to partner websites (which is no longer applicable due to the acquisition).
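
If you’re curious what that looks like in practice, here is a minimal sketch of an automated nofollow pass (the actual Sales Hacker implementation isn’t documented, and was likely a WordPress-side rule, so treat this as illustrative):

```python
# Add rel="nofollow" to outbound links in a post's HTML.
# Sales Hacker's actual implementation isn't documented (likely handled in
# WordPress), so this is a generic sketch using the third-party BeautifulSoup.
from urllib.parse import urlparse
from bs4 import BeautifulSoup

OUR_DOMAIN = "saleshacker.com"

def nofollow_outbound(html):
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if host and not host.endswith(OUR_DOMAIN):
            rel = set(a.get("rel") or [])  # rel is multi-valued in bs4
            rel.add("nofollow")
            a["rel"] = sorted(rel)
    return str(soup)

print(nofollow_outbound('<p>Try <a href="https://partner.example.com">this tool</a>.</p>'))
```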

    The #1 complaint we discovered in our audience research was that people were getting irritated with content that was “too salesy” or “too pitchy” — and rightfully so, because who wants to get pitched at all day?

    So we made it all about value. Pure education. School of hard knocks style insights. Actionable and tactical. No fluff. No nonsense. To the point.

    And that’s where things really started to take off.

    Before and after: “Best sales books”

    What you are about to see is classic SEO on-page optimization at its finest.

    This is what the post originally looked like (and it didn’t rank well for “best sales books”).

    And then after…

    And the result…

    Before and after: “Sales operations”

    What we noticed here was a crappy article attempting to explain the role of sales operations.

    Here are the steps we took to rank #1 for “Sales Operations:”

    • Built a super optimized mega guide on the topic.
    • Since the old crappy article had some decent links, we 301 redirected it to the new mega guide (a verification sketch follows this list).
    • Promoted it on social, email, and our normal channels.
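
A quick way to confirm a redirect like that is doing its job: request the old URL with redirects disabled and check the status code and Location header. A minimal sketch, assuming the third-party requests package (both URLs are illustrative):

```python
# Confirm the old post 301s to the new mega guide so its link equity is
# consolidated. Requires `requests`; both URLs are illustrative.
import requests

OLD = "https://www.saleshacker.com/old-sales-operations-post/"
NEW = "https://www.saleshacker.com/sales-operations/"

resp = requests.head(OLD, allow_redirects=False, timeout=10)
assert resp.status_code == 301, f"expected 301, got {resp.status_code}"
assert resp.headers.get("Location") == NEW, f"unexpected target: {resp.headers.get('Location')}"
print("301 in place:", OLD, "->", NEW)
```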

    Here’s what the new guide on Sales Ops looks like…

    And the result…

    5) New content opportunities

    One thing I quickly realized Sales Hacker had to its advantage was topical authority. Exploiting this was going to be our secret weapon, and boy, did we do it well: 

    “Cold calling”

    We knew we could win this SERP by creating content that was super actionable and tactical with examples.

    Most of the competing articles in the SERP were definition style and theory-based, or low-value roundups from domains with high authority.

    In this case, DA doesn’t really matter. The better man wins.

    “Best sales tools”

    Because Sales Hacker is an aggregator website, we had the advantage of easily out-ranking vendor websites for best and top queries.

    Of course, it also helps when you build a super helpful mega list of tools. We included 150+ options to choose from in the list, whereas SERP competitors did not even come close.

    “Channel sales”

    Notice how Sales Hacker’s article from 2017 still beats HubSpot’s 2019 version. Why? Because we probably satisfied user intent better than they did.

    For this query, we figured out that users really want to know about Direct Sales vs Channel Sales, and how they intersect.

    HubSpot went for the generic, “factory style” Ultimate Guide tactic.

    Don’t get me wrong, it works very well for them (especially with their 91 DA), but here is another example where nailing the user intent wins.

    “Sales excel templates”

    This was pure lead gen gold for us. Everyone loves templates, especially sales Excel templates.

    The SERP was easily winnable because the competition was so BORING in their copy. Not only did we build a better content experience, but we used numbers, lists, and power words that salespeople like to see, such as FAST and Pipeline Growth.

    Special note: We never used long intros

    The one trend you’ll notice is that all of our content gets RIGHT TO THE POINT. This is inherently obvious, but we also uncovered it during audience surveying. Salespeople don’t have time for fluff. They need to cut to the chase ASAP, get what they came for, and get back to selling. It’s really that straightforward.

    When you figure out something THAT important to your audience, (like keeping intros short and sweet), and then you continuously leverage it to your advantage, it’s really powerful.

    6) Featured Snippets

    Featured snippets became a huge part of our quest for SERP dominance. Even for SERPs where organic clicks have declined, we didn’t mind as much, because we knew we were getting the snippet and free brand exposure.

    Here are some of the best featured snippets we got!

    Featured snippet: “Channel sales”

    Featured snippet: “Sales pipeline management”

    Featured snippet: “BANT”

    Featured snippet: “Customer success manager”

    Featured snippet: “How to manage a sales team”

    Featured snippet: “How to get past the gatekeeper”

    Featured snippet: “Sales forecast modeling”

    Featured snippet: “How to build a sales pipeline”

    7) So, why did Sales Hacker get acquired?

    At first, it seems weird. Why would a SaaS company buy a blog? It really comes down to one thing — community (and the leverage you get with it).

    Two learnings from this acquisition are:

    1. It may be worth acquiring a niche media brand in your space

    2. It may be worth starting your own niche media brand in your space

    I feel like most B2B companies (not all, but most) come across as only trying to sell a product — because most of them are. You don’t see the majority of B2B brands doing a good job on social. They don’t know how to market to emotion. They completely ignore top-funnel in many cases and, as a result, get minimal engagement with their content.

    There are really so many areas of opportunity to exploit in B2B marketing if you know how to leverage that human emotion — it’s easy to stand out if you have a soul. Sales Hacker became that “soul” for Outreach — that voice and community.

    But one final reason why a SaaS company would buy a media brand is to get the edge over a rival competitor. Especially in a niche where two giants are battling over the top spot.

    In this case, it’s Outreach’s good old arch-nemesis, Salesloft. You see, both Outreach and Salesloft are fighting tooth and nail to win a new category called “Sales Engagement”.

    As part of the acquisition process, I prepared a deck that highlighted how beneficial it would be for Outreach to acquire Sales Hacker, purely based on the traffic advantage it would give them over Salesloft.

    Sales Hacker vs. Salesloft vs. Outreach — Total organic keywords

    This chart from 2018 (data exported via SEMrush) shows that Sales Hacker was ranking for more total organic keywords than Salesloft and Outreach combined.

    Sales Hacker vs. Salesloft vs. Outreach — Estimated traffic cost

    This chart from 2018 (data exported via SEMrush) shows the estimated cost of organic traffic by domain. Sales Hacker had the highest traffic cost because it ranked for more commercial terms.

    Sales Hacker vs. Salesloft vs. Outreach — Rank zone distributions

    This chart from 2018 (data exported via SEMrush) shows the rank zone distribution by domain. Sales Hacker ranked for more organic keywords across all search positions.

    Sales Hacker vs. Salesloft vs. Outreach — Support vs. demand keywords

    This chart from 2018 (data exported via SEMrush) shows support vs. demand keywords by domain. Because Sales Hacker did not have a support portal, all of its keywords were inherently demand-focused.

    Meanwhile, Outreach was mostly ranking for support keywords at the time, which put it at a massive disadvantage compared to Salesloft.

    Conclusion

    I wouldn’t be writing this right now without the help, support, and trust that I got from so many people along the way.

    • Joshua Giardino — Lead developer at Sales Hacker, my marketing mentor, and the older brother I never had. Couldn’t have done this without you!
    • Max Altschuler — Founder of Sales Hacker, and the man who gave me a shot at the big leagues. You built an incredible platform and I am eternally grateful to have been a part of it.
    • Scott Barker — Head of Partnerships at Sales Hacker. Thanks for being in the trenches with me! It’s a pleasure to look back on this wild ride, and wonder how we pulled this off.
    • Alina Benny — My marketing protégée. Super proud of your growth! You came into Sales Hacker with no fear and seized the opportunity.
    • Mike King — Founder of iPullRank, and the man who gave me my very first shot in SEO. Thanks for taking a chance on an unproven kid from the Bronx who was always late to work.
    • Yaniv Masjedi — Our phenomenal CMO at Nextiva. Thank you for always believing in me and encouraging me to flex my thought leadership muscle. Your support has enabled me to truly become a high-impact growth marketer.

    Thanks for reading — tell me what you think below in the comments!



    Moz Blog


    I Used To Live In A Caravan: Here’s How I Made Enough Money To Live Anywhere In The World

    This is Part 2 of my new Services Arbitrage free introductory training series. If you have not read Part 1, go do that now – How To Start An Online Business Selling Services Other People Deliver (I Call It ‘Services Arbitrage’) (10-minute read). Press play on the video above to hear the story behind the caravan, or […]

    The post I Used To Live In A Caravan: Here’s How I Made Enough Money To Live Anywhere In The World appeared first on Yaro.Blog.

    Entrepreneurs-Journey.com by Yaro Starak


    Greg Smith: Founder Of Canadian Tech Startup Thinkific Explains How They Used MVPs To Build A Hugely Successful Subscription Software Company

     [ Download MP3 | Transcript | iTunes | Soundcloud | Raw RSS ] One of the hottest business models in the tech startup world is anything with a recurring subscription business model, especially if it’s software based. Another hot online business model for talented individuals who want to make money from their knowledge, is […]

    The post Greg Smith: Founder Of Canadian Tech Startup Thinkific Explains How They Used MVPs To Build A Hugely Successful Subscription Software Company appeared first on Yaro.Blog.

    Entrepreneurs-Journey.com by Yaro Starak


    Google My Business Insights adds queries used to find your business

    Learn how people find your business in Google Maps and local search within Google My Business Insights with this new report.



    Please visit Search Engine Land for the full article.


    Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


    Moz’s Link Data Used to Suck… But Not Anymore! The New Link Explorer is Here – Whiteboard Friday

    Posted by randfish

    Earlier this week we launched our brand-new link building tool, and we’re happy to say that Link Explorer addresses and improves upon a lot of the big problems that have plagued our legacy link tool, Open Site Explorer. In today’s Whiteboard Friday, Rand transparently lists out many of the biggest complaints we’ve heard about OSE over the years and explains the vast improvements Link Explorer provides, from DA scores updated daily to historic link data to a huge index of almost five trillion URLs.

    Moz's Link Data Used to Suck... But Not Anymore! The New Link Explorer is Here - Whiteboard Friday

    Click on the whiteboard image above to open a high-resolution version in a new tab!


    Video Transcription

    Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m very excited to say that Moz’s Open Site Explorer product, which had a lot of challenges with it, is finally being retired, and we have a new product, Link Explorer, that’s taking its place. So let me walk you through why and how Moz’s link data for the last few years has really kind of sucked. There’s no two ways about it.

    If you heard me here on Whiteboard Friday, if you watched me at conferences, if you saw me blogging, you’d probably see me saying, “Hey, I personally use Ahrefs, or I use Majestic for my link research.” Moz has a lot of other good tools. The crawler is excellent. Moz Pro is good. But Open Site Explorer was really lagging, and today, that’s not the case. Let me walk you through this.

    The big complaints about OSE/Mozscape

    1. The index was just too small

    Mozscape was probably about a fifth to a tenth the size of its competitors. While it got a lot of the good-quality links of the web, it just didn’t get enough. As SEOs, we need to know all of the links, the good ones and the bad ones.

    2. The data was just too old

    So, in Mozscape, a link that you built on November 1st, you got a link added to a website, you’re very proud of yourself. That’s excellent. You should expect that a link tool should pick that up within maybe a couple weeks, maybe three weeks at the outside. Google is probably picking it up within just a few days, sometimes hours.

    Yet, when Mozscape would crawl that, it would often be a month or more later, and by the time Mozscape processed its index, it could be another 40 days after that, meaning that you could see a 60- to 80-day delay, sometimes even longer, between when your link was built and when Mozscape actually found it. That sucks.

    3. PA/DA scores took forever to update

    PA/DA scores, likewise, took forever to update because of this link problem. So the index would say, oh, your DA is over here. You’re at 25, and now maybe you’re at 30. But in reality, you’re probably far ahead of that, because you’ve been building a lot of links that Mozscape just hasn’t picked up yet. So this is this lagging indicator. Sometimes there would be links that it just didn’t even know about. So PA and DA just wouldn’t be as accurate or precise as you’d want them to be.

    4. Some scores were really confusing and out of date

    MozRank and MozTrust relied on essentially the original Google PageRank paper from 1997, and there’s no way that’s what’s being used today. Google certainly uses some view of link equity that’s passed between links that is similar to PageRank, and I think they probably internally call that PageRank, but it likely looks nothing like what MozRank measured.

    Likewise, MozTrust, way out of date, from a paper in I think 2002 or 2003. Much more advancements in search have happened since then.

    Spam score was also out of date. It used a system that was correlated with what spam looked like three, four years ago, so much more up to date than these two, but really not nearly as sophisticated as what Google is doing today. So we needed to toss those out and find their replacements as well.

    5. There was no way to see links gained and lost over time

    Mozscape had no way to see gained and lost links over time, and folks thought, “Gosh, these other tools in the SEO space give me this ability to show me links that their index has discovered or links they’ve seen that we’ve lost. I really want that.”

    6. DA didn’t correlate as well as it should have

    So over time, DA became a less and less indicative measure of how well you were performing in Google’s rankings. That needed to change as well. The new DA, by the way, much, much better on this front.

    7. Bulk metrics checking and link reporting was too hard and manual

    So folks would say, “Hey, I have this giant spreadsheet with all my link data. I want to upload that. I want you guys to crawl it. I want to go fetch all your metrics. I want to get DA scores for these hundreds or thousands of websites that I’ve got. How do I do that?” We didn’t provide a good way for you to do that either unless you were willing to write code and loop in our API.
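
For illustration, the kind of loop this refers to looked something like the following. The endpoint, auth scheme, and response field reflect the legacy Mozscape v1 API and may have changed, so treat them as assumptions and check the current Moz API docs:

```python
# Loop a list of domains through the URL Metrics API to fetch DA in bulk.
# Endpoint, auth scheme, and the "pda" response field reflect the legacy
# Mozscape v1 API and may have changed; treat them as illustrative.
# Requires the third-party `requests` package.
import requests

ACCESS_ID = "your-access-id"   # placeholder credentials
SECRET_KEY = "your-secret-key"

domains = ["moz.com", "saleshacker.com", "outreach.io"]

for domain in domains:
    resp = requests.get(
        f"https://lsapi.seomoz.com/linkscape/url-metrics/{domain}",
        auth=(ACCESS_ID, SECRET_KEY),
        timeout=10,
    )
    print(domain, "DA:", resp.json().get("pda"))
```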

    8. People wanted distribution of their links by DA

    They wanted distributions of their links by domain authority. Show me where my links come from, yes, but also what sorts of buckets of DA do I have versus my competition? That was also missing.

    So, let me show you what the new Link Explorer has.

    Moz's new Link Explorer

    Click on the whiteboard image above to open a high-resolution version in a new tab!

    Wow, look at that magical board change, and it only took a fraction of a second. Amazing.

    What Link Explorer has done, as compared to the old Open Site Explorer, is pretty exciting. I’m actually very proud of the team. If you know me, you know I am a picky SOB. I usually don’t even like most of the stuff that we put out here, but oh my god, this is quite an incredible product.

    1. Link Explorer has a GIANT index

    So I mentioned index size was a big problem. Link Explorer has got a giant index. Frankly, it’s about 20 times larger than what Open Site Explorer had and, as you can see, very, very competitive with the other services out there. Majestic Fresh says they have about a trillion URLs from their I think it’s the last 60 days. Ahrefs, about 3 trillion. Majestic’s historic, which goes all time, has about 7 trillion, and Moz, just in the last 90 days, which I think is our index — maybe it’s a little shorter than that, 60 days — 4.7 trillion, so almost 5 trillion URLs. Just really, really big. It covers a huge swath of the web, which is great.

    2. All data updates every 24 hours

    So, unlike the old index, it is very fresh. Every time it finds a new link, it updates PA scores and DA scores, and every morning the interface can show you all the links it found just the day before.

    3. DA and PA are tracked daily for every site

    You don’t have to track them yourself. You don’t have to put them into your campaigns. Every time you go and visit a domain, you will see this graph showing you domain authority over time, which has been awesome.

    For my new company, I’ve been tracking all the links that come in to SparkToro, and I can see my DA rising. It’s really exciting. I put out a good blog post, I get a bunch of links, and my DA goes up the next day. How cool is that?

    4. Old scores are gone, and new scores are polished and high quality

    So we got rid of MozRank and MozTrust, which were very old metrics and, frankly, very few people were using them, and most folks who were using them didn’t really know how to use them. PA basically takes care of both of them. It includes the weight of links that come to you and the trustworthiness. So that makes more sense as a metric.

    Spam score is now on a 0 to 100% risk model, instead of the old 0 to 17 flags where each flag correlated with some percentage of risk. Spam score is basically a machine-learning model built against sites that Google penalized or banned.

    So we took a huge number of domains and ran their names through Google. If they couldn’t rank for their own name, we said they were penalized. If we did a site:domain.com search and Google had de-indexed them, we said they were banned. Then we built this risk model. So a 90% score means 90% of sites with these qualities were penalized or banned; a 2% score means only 2% were. If you have a 30% spam score, that’s not too bad. If you have a 75% spam score, it’s getting a little sketchy.
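
In other words, it is a standard supervised classifier: label sites as penalized/banned or clean, train a model on site features, and read the predicted probability as a 0 to 100% score. A minimal sketch with scikit-learn, using invented features and data (Moz’s actual features and model are not public):

```python
# Sketch of the risk model described above: train a classifier on sites
# labeled penalized/banned (1) vs. fine (0), then read the predicted
# probability as a 0-100% spam score. Features and data are invented;
# Moz's real features and model are not public. Requires scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is a site; columns are illustrative quality signals.
X = np.array([
    # [thin_content_ratio, exact_match_anchor_pct, outbound_links_per_page]
    [0.9, 0.80, 120.0],
    [0.1, 0.05, 8.0],
    [0.7, 0.60, 95.0],
    [0.2, 0.10, 12.0],
])
y = np.array([1, 0, 1, 0])  # 1 = penalized or banned in the Google checks

model = LogisticRegression().fit(X, y)

new_site = np.array([[0.5, 0.4, 60.0]])
print(f"Spam score: {100 * model.predict_proba(new_site)[0, 1]:.0f}%")
```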

    5. Discovered and lost links are available for every site, every day

    So again, for this new startup that I’m doing, I’ve been watching as I get new links and I see where they come from, and then sometimes I’ll reach out on Twitter and say thank you to those folks who are linking to my blog posts and stuff. But it’s very, very cool to see links that I gain and links that I lose every single day. This is a feature that Ahrefs and Majestic have had for a long time, and frankly Moz was behind on this. So I’m very glad that we have it now.

    6. DA is back as a high-quality leading indicator of ranking ability

    So, a note that is important: everyone’s DA has changed. Your DA has changed. My DA has changed. Moz’s DA changed. Google’s DA changed. I think it went from a 98 to a 97. My advice is take a look at yourself versus all your competitors that you’re trying to rank against and use that to benchmark yourself. The old DA was an old model on old data on an old, tiny index. The new one is based on this 4.7 trillion size index. It is much bigger. It is much fresher. It is much more accurate. You can see that in the correlations.

    7. Building link lists, tracking links that you want to acquire, and bulk metrics checking is now easy

    Building link lists, tracking links that you want to acquire, and bulk metrics checking (which we never had before and, in fact, not a lot of the other tools have) are now available through possibly my favorite feature in the tool, called Link Tracking Lists. If you’ve used Keyword Explorer and you’ve set up your keywords to watch them over time and to build a keyword research set, it’s very, very similar. If you have links you want to acquire, you add them to this list. If you have links that you want to check on, you add them to this list. It will give you all the metrics, and it will tell you: Does this link to your website that you can associate with a list, or does it not? Or does it link to some page on the domain, but maybe not exactly the page that you want? It will tell you that too. Pretty cool.

    8. Link distribution by DA

    Finally, we do now have link distribution by DA. You can find that right on the Overview page at the bottom.

    Look, I’m not saying Link Explorer is the absolute perfect, best product out there, but it’s really, really damn good. I’m incredibly proud of the team. I’m very proud to have this product out there.

    If you’d like, I’ll be writing some more about how we went about building this product and about the agency folks we spent time with to develop it; I would like to thank all of them, of course. A huge thank you to the Moz team.

    I hope you’ll do me a favor. Check out Link Explorer. I think, very frankly, this team has earned 30 seconds of your time to go check it out.

    Try out Link Explorer!

    All right. Thanks, everyone. We’ll see you again for another edition of Whiteboard Friday. Take care.

    Video transcription by Speechpad.com



    Moz Blog


    Google Confirms Chrome Usage Data Used to Measure Site Speed

    Posted by Tom-Anthony

    During a discussion with Google’s John Mueller at SMX Munich in March, he told me an interesting bit of data about how Google evaluates site speed nowadays. It has gotten a bit of interest from people when I mentioned it at SearchLove San Diego the week after, so I followed up with John to clarify my understanding.

    The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regards to rankings). This is a positive move (IMHO) as it means we don’t need to treat optimizing site speed for Google as a separate task from optimizing for users.

    Previously, it has not been clear how Google evaluates site speed, and it was generally believed to be measured by Googlebot during its visits — a belief enhanced by the presence of speed charts in Search Console. However, the onset of JavaScript-enabled crawling made it less clear what Google is doing — they obviously want the most realistic data possible, but it’s a hard problem to solve. Googlebot is not built to replicate how actual visitors experience a site, and so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was the mechanism).

    In this post, I want to recap the pertinent data around this news quickly and try to understand what this may mean for users.

    Google Search Console

    Firstly, we should clarify our understanding of what the “time spent downloading a page” metric in Google Search Console is telling us. Most of us will recognize graphs like this one:

    Until recently, I was unclear about exactly what this graph was telling me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention):

    John clarified what this graph is showing:

    It’s technically not “downloading the page” but rather “receiving data in response to requesting a URL” – it’s not based on rendering the page, it includes all requests made.

    And that it is:

    this is the average over all requests for that day

    Because Google may be fetching a very different set of resources every day when it’s crawling your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the real performance of your site.

    For that reason, John points out that:

    Focusing blindly on that number doesn’t make sense.

    With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I’m a big fan).

    Okay, so now that we understand that graph and what it represents, let’s look at the next option: the Google WRS.

    Googlebot & the Web Rendering Service

    Google’s WRS is their headless browser mechanism based on Chrome 41, which is used for things like “Fetch as Googlebot” in Search Console, and is increasingly what Googlebot is using when it crawls pages.

    However, we know that this isn’t how Google evaluates pages because of a Twitter conversation between Aymen Loukil and Google’s Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed that WRS is not responsible for evaluating site speed:

    Twitter conversation with Gary Illyes

    At the time, Gary was unable to clarify what was being used to evaluate site performance (perhaps because the Chrome User Experience Report hadn’t been announced yet). It seems as though things have progressed since then, however. Google is now able to tell us a little more, which takes us on to the Chrome User Experience Report.

    Chrome User Experience Report

    Introduced in October last year, the Chrome User Experience Report “is a public dataset of key user experience metrics for top origins on the web,” whereby “performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled.”

    Essentially, certain Chrome users allow their browser to report back load time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public data set.

    In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.

    However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he explains they’re using this data along with some other data sources (he doesn’t say which, though notes that it is in part because the data set does not cover all domains).

    At SMX John also pointed out how Google’s PageSpeed Insights tool now includes data from the Chrome User Experience Report:

    The public dataset of performance data for the top million domains is also available in a public BigQuery project, if you’re into that sort of thing!
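
If you want to poke at it, here is a minimal sketch using the google-cloud-bigquery Python client; the query mirrors the standard CrUX example (the table month and origin are illustrative):

```python
# Pull the first-contentful-paint histogram for one origin from the public
# Chrome User Experience Report dataset. Requires the google-cloud-bigquery
# package and a GCP project; the table month and origin are illustrative.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT bin.start AS fcp_ms_bucket, SUM(bin.density) AS density
    FROM `chrome-ux-report.all.201804`,
         UNNEST(first_contentful_paint.histogram.bin) AS bin
    WHERE origin = 'https://www.example.com'
    GROUP BY fcp_ms_bucket
    ORDER BY fcp_ms_bucket
"""

for row in client.query(query):  # runs the query and iterates the result rows
    print(row.fcp_ms_bucket, row.density)
```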

    We can’t be sure what all the other factors Google is using are, but we now know they are certainly using this data. As I mentioned above, I also imagine they are using data on more sites than are perhaps provided in the public dataset, but this is not confirmed.

    Pay attention to users

    Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.
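
You can check what protocol your site actually negotiates with modern clients. A minimal sketch using the third-party httpx library (the URL is illustrative):

```python
# Check whether a site negotiates HTTP/2 with a modern client, the sort of
# improvement Chrome users (and therefore CrUX data) would see even though
# Googlebot's crawler would not. Requires `pip install httpx[http2]`.
import httpx

with httpx.Client(http2=True) as client:
    resp = client.get("https://www.example.com/")
    print(resp.http_version)  # "HTTP/2" if negotiated, otherwise "HTTP/1.1"
```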

    The same is true if you were to use service workers for advanced caching behaviors — Googlebot wouldn’t be aware, but users would. There are certainly other such examples.

    Essentially, this means that there’s no longer a reason to worry about pagespeed for Googlebot, and you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, which is a separate task.

    If you are unsure where to look for site speed advice, then you should look at:

    That’s all for now! If you have questions, please comment here and I’ll do my best! Thanks!



    Moz Blog


    5 Brands That Used Influencer Marketing to Raise Their Profile

    Influencer marketing is more than just a marketing buzzword these days. More companies are utilizing this marketing method to boost sales and grow their brands.

    For those still confused about what influencer marketing is, it’s simply the act of promoting or selling products or services via influencers: people whose voice and reach can affect a brand’s visibility and sales. Whereas the main influencers used to be celebrities and industry leaders, today’s influencers are more varied. Nowadays, top brands are seeking out bloggers, food critics, makeup mavens, and celebrities who rose to fame on platforms like YouTube and Instagram.

    Brands that Benefited from Influencer Marketing

    Influencer marketing provides a lot of benefits. Brands can reach the relevant demographic and enjoy high levels of engagement. It’s also affordable and can help retain a brand’s authenticity. Numerous companies have already successfully leveraged these people to give their brand a boost.

    Clinique for Men

    Clinique is renowned for its hypoallergenic skincare for women. When the iconic cosmetic company launched a men’s line, they raised product awareness by partnering with a disparate group of male influencers from various professions. These influencers consisted of filmmakers, outdoorsmen, stylists, and lifestyle bloggers, each representing a group of men who would be interested in using Clinique for Men. Every post used in the campaign was unique and defined the influencer. For instance, surfer Mikey de Temple posted a photo of himself wearing his surf gear, with his surfboard in the background, along with a Clinique product.

    Clinique’s campaign was golden for several reasons. One, the company’s choice of influencers was so diverse that it expanded the product’s reach. Also, the posts integrated the product smoothly into settings that felt natural to each influencer. This helped create a more organic interest in Clinique’s men’s line.

    Fashion Nova

    One brand that has truly embraced influencer marketing is Fashion Nova. According to the company’s founder and CEO, Richard Saghian, Fashion Nova is a viral store that works with 3,000 to 5,000 influencers. Its aggressive marketing efforts rely on lots of model and celebrity influencers, like Kylie Jenner and beauty vlogger Blissful Brii. The former has 93.8 million followers on Instagram while the latter has 93 thousand subscribers on YouTube. These two influencers alone have garnered millions of engagements, likes, and comments for the company.

    While other brands go for low-key but very relatable influencers, Fashion Nova went for the celebrities. Although this will obviously net a company high levels of engagement, it can also be costly. But as Fashion Nova has proven, it’s a worthwhile investment.

    Lagavulin’s Whiskey Yule Log

    This is a magnificent example of how an influencer marketing campaign made a product culturally relevant to a generation. Young people might not have a taste for single malt whiskey, but Lagavulin’s 2016 campaign featuring Nick Offerman changed that. Offerman’s iconic Parks and Rec character, Ron Swanson, is known for his love of whisky. Lagavulin’s 45-minute video took inspiration from YouTube’s yule log videos and simply showed Offerman quietly sipping and enjoying his whiskey next to a fireplace.

    The campaign was a success because Lagavulin found the perfect influencer for its brand. Offerman’s character proved to be a natural match for the target audience. As a matter of fact, the campaign was so good that it won an award for Best Influencer & Celebrity Campaign.

    Zafferano

    Zafferano does not have the same name recall as Nobu or other famous restaurants. But this Singapore-based establishment is a prime example of how social media can be used to boost audience engagement. The company tapped 11 Instagram influencers who are popular in the lifestyle and food category, inviting them to the restaurant for a special meal; in turn, the influencers shared photos of the dishes on Instagram, described the dishes and their dining experience, and included details like price and availability.

    Zafferano’s campaign is notable because of the experience it created for the influencers. This, in turn, helped them come up with authentic and sincere reviews. Since the campaign had such a genuine feel, it encouraged followers to interact and engage with the posts.

    Zara

    Clothing powerhouse Zara was one of the most profitable companies in 2015, and that’s partly because of its successful influencer marketing campaign. The company’s social media marketing campaign got some help from several top fashion-forward Instagrammers. The Instagram posts shared by these popular influencers showcased Zara’s clothing lines and their followers used these photos to get ideas on what’s currently trending as well as tips on how to work a particular style.

    Zara’s campaign was a success because the company handed the control over to the fashion influencers, the people that customers looked to for fashion advice. The content that was used in the campaign was subtle and useful, which made it even more valuable to the influencers’ thousands of followers.

    [Featured image via YouTube]

    The post 5 Brands That Used Influencer Marketing to Raise Their Profile appeared first on WebProNews.


    WebProNews


    New gTLDs are Like Used Cars

    There may be a couple exceptions which prove the rule, but new TLDs are generally an awful investment for everyone except the registry operator.

    Here is the short version…

    And the long version…

    Diminishing Returns

    About a half-decade ago I wrote about how Google devalued domain names from an SEO perspective & there have been a number of leading “category killer” domains which have repeatedly been recycled from startup to acquisition to shut down to PPC park page to buy now for this once in a lifetime opportunity in an endless water cycle.

    The central web platforms are becoming ad heavy, which in turn decreases the reach of anything which is not an advertisement. For the most valuable concepts / markets / keywords, ads eat up the entire interface for the first screenful of results. Key markets like hotels might get a second round of vertical ads to further displace the concept of organic results.

    Proprietary, Closed-Ecosystem Roach Motels

    The tech monopolies can only make so much money by stuffing ads onto their own platform. To keep increasing their take they need to increase the types, varieties & formats of media they host and control & keep the attention on their platform.

    Both Google & Facebook are promoting scams where they feed on desperate publishers & suck a copy of the publisher’s content into being hosted by the tech monopoly platform de jour & sprinkle a share of the revenues back to the content sources.

    They may even pay a bit upfront for new content formats, but then after the market is primed the deal shifts to where (once again) almost nobody other than the tech monopoly platform wins.

    The attempt to “own” the web & never let users go is so extreme both companies will make up bogus statistics to promote their proprietary / fake open / actually closed standards.

    If you ignore how Google’s AMP double, triple, or quadruple counts visitors in Google Analytics the visit numbers look appealing.

    But the flip side of those fake metrics is actual revenues do not flow.

    Facebook has the same sort of issues, with frequently needing to restate various metrics while partners fly blind.

    These companies are restructuring society & the race to the bottom to try to make the numbers work in an increasingly unstable & parasitic set of platform choices is destroying adjacent markets:

    Have you tried Angry Birds lately? It’s a swamp of dark patterns. All extractive logic meant to trick you into another in-app payment. It’s the perfect example of what happens when product managers have to squeeze ever-more-growth out of ever-less-fertile lands to hit their targets year after year. … back to the incentives. It’s not just those infused by venture capital timelines and return requirements, but also the likes of tax incentives favoring capital gains over income. … that’s the truly insidious part of the tech lords solution to everything. This fantasy that they will be greeted as liberators. When the new boss is really a lot like the old boss, except the big stick is replaced with the big algorithm. Depersonalizing all punishment but doling it out just the same. … this new world order is being driven by a tiny cabal of monopolies. So commercial dissent is near impossible. … competition is for the little people. Pitting one individual contractor against another in a race to the bottom. Hoarding all the bargaining power at the top. Disparaging any attempts against those at the bottom to organize with unions or otherwise.

    To be a success on the attention platforms you have to push toward the edges. But as you become successful you become a target.

    And the dehumanized “algorithm” is not above politics & public relations.

    Pewdiepie is the biggest success story on the YouTube platform. When he made a video showing some of the absurd aspects of Fiverr, it led to a WSJ investigation which “uncovered” a pattern of anti-Semitism. And yet one of the reporters who worked on that story wrote far more offensive and anti-Semitic tweets. The hypocrisy of the hit job didn’t matter. They were still able to go after Pewdiepie’s ad relationships to cut him off from Disney’s Maker Studios & the premium tier of YouTube ads.

    The fact that he is an individual with broad reach means he’ll still be fine economically, but many other publishers would quickly end up in a death spiral from the above sequence.

    If it can happen to a leading player in a closed ecosystem then the risk to smaller players is even greater.

    In some emerging markets Facebook effectively *is* the Internet.

    The Decline of Exact Match Domains

    Domains have been so devalued (from an SEO perspective) that some names like PaydayLoans.net sell for about $3,000 at auction.

    $3,000 can sound like a lot to someone with no money, but names like that were going for 6 figures at their peak.

    Professional domain sellers participate in the domain auctions on sites like NameJet & SnapNames. Auctions for big keywords like [payday loans] in core, trusted extensions do not go unnoticed. So if the 98% decline in price were an anomaly, at least one of those professionals would have bid more in that auction.

    Why did exact match domains fall so hard? In part because Google shifted from scoring the web based on links to considering things like brand awareness in rankings. And it is very hard to run a large brand-oriented ad campaign promoting a generically descriptive domain name. Sure there are a few exceptions like Cars.com & Hotels.com, but if you watch much TV you’ll see a lot more ads associated with businesses that are not built on generically descriptive domain names.

    Not all domains have fallen quite that hard in price, but the further into the tail you go, the less the domain acts as a memorable differentiator. If the barrier to entry increases, then the justification for spending a lot on a domain name as part of a go-to-market strategy makes less sense.

    Brandable Names Also Lost Value

    Arguably EMDs have lost more value than brandable domain names, but even brandable names have sharply slid.

    If you go back a decade or two, tech startups would secure their name (say, Snap.com or Monster.com) & then try to build a business on it.

    But in the current marketplace, with many paths to market, some startups don’t even have a domain name at launch; they begin as iPhone or Android apps.

    Now people try to create success on a good-enough but cheap domain name & then, as success comes, they buy a better domain name.

    Jelly was recently acquired by Pinterest. Rather than buying jelly.com, they were still using AskJelly.com for their core site & Jelly.co for their blog.

    As long as domain redirects work, there’s no reason to spend heavily on a domain name for a highly speculative new project.
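    The mechanics behind that are trivial. Here is a minimal sketch in Python of the kind of permanent (301) redirect that makes the cheap-name-first approach workable; the domain name is hypothetical, and in practice any web server or registrar-level forwarding service does the same job:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical upgraded name; the old, cheap launch domain points at this server.
    NEW_HOME = "https://example.com"

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # 301 signals a permanent move, so browsers, bookmarks & search
            # engines follow type-in traffic and link equity to the new name.
            self.send_response(301)
            self.send_header("Location", NEW_HOME + self.path)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), RedirectHandler).serve_forever()

    Because the redirect preserves the path, nothing published on the launch name is lost in the upgrade.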

    Rather than spending 6 figures on a domain name & then seeing if there is market fit, it is far more common to launch a site on something like getapp.com, joinapp.com, app.io, app.co, businessnameapp.com, etc.

    This in turn means that rather than tens of thousands of startups all chasing their core .com domain name from the start, people test whatever is good enough & priced close to $10. Then only after they are successful do they try to upgrade to better, more memorable & far more expensive domain names.

    Money isn’t spent on the domain names until the project has already shown market fit.

    In aggregate, one in a thousand startups spending $1 million works out to about $1,000 per startup, while one in three startups spending $100,000 works out to roughly $33,000 per startup. Far less money flows into domain names under the deferred model.
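    As a quick sanity check on those figures, here is the arithmetic as a small Python sketch (the ratios & prices are the illustrative numbers from above, not market data):

    # Expected domain spend per startup under each model.
    old_model = (1 / 3) * 100_000        # most startups buying a good name upfront
    new_model = (1 / 1_000) * 1_000_000  # only the rare winner upgrading later

    print(f"old model: ${old_model:,.0f} per startup")  # ~$33,333
    print(f"new model: ${new_model:,.0f} per startup")  # $1,000

    Per-startup spending on domain names drops by more than 30x under the deferred-purchase model.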

    New TLDs Undifferentiated, Risky & Overpriced

    No Actual Marketing Being Done

    Some of the companies that operate registries for new TLDs talk up investing in marketing & differentiation, but very few of them are doing much on the marketing front.

    You may see their banner ads on domainer blogs & they may even pay for placement with some of the registrars, but there isn’t much going on in terms of cultivating a stable ecosystem.

    When Google or Facebook try to enter & dominate a new vertical, the end destination may be extractive rent seeking by a monopoly BUT off the start they are at least willing to shoulder some of the risk & cost upfront to try to build awareness.

    Where are the domain registries who have built successful new businesses on some of their new TLDs? Where are the subsidies offered to key talent to help drive awareness & promote the new strings?

    As far as I know, none of that stuff exists.

    In fact, what is prevalent is the exact opposite.

    Greed-Based Anti-Marketing

    So many of them are short-sighted, greed-based plays that they do the exact opposite of building an ecosystem … they hold back any domain which might potentially not be complete garbage so they can juice it for a premium ask price in the tens of thousands of dollars.

    While searching on GoDaddy Auctions for a client project, I have seen new TLDs like .link listed for sale for MORE THAN the asking price of similar .org names.

    If those prices had any sort of legitimate foundation, then the person asking $30,000 for a .link would have bulk-bought all the equivalent .net and .org names which are listed for cheaper prices.

    But the prices are based on fantasy & almost nobody is dumb enough to pay those sorts of prices.

    Anyone dumb enough to pay that would be better off buying their own registry rather than a single name.

    The holding back of names is the exact opposite of savvy marketing investment. It means there’s no reason to use the new TLD if you either have to pay through the nose or use a really crappy name nobody will remember.

    I didn’t buy more than 15 of Uniregistry’s domains because all names were reserved in the first place and I didn’t feel like buying 2nd-tier domains … Domainers were angry when the first 2 of Uniregistry’s new gTLDs (.sexy and .tattoo) came out and all remotely good names were reserved, despite Frank saying that Uniregistry would not reserve any domains.

    Who defeats the race to the bottom aspects of the web by starting off from a “we only sell shit” standpoint?

    Nobody.

    And that’s why these new TLDs are a zero.

    Defaults Have Value

    Many online verticals are driven by winner take most monopoly economics. There’s a clear dominant leader in each of these core markets: social, search, short-form video, long-form video, retail, auctions, real estate, job search, classifieds, etc. Some other core markets have consolidated down to 3 or 4 core players who among them own about 50 different brands that attack different parts of the market.

    Almost all the category leading businesses which dominate aggregate usage are on .com domains.

    Contrast the lack of marketing for new TLDs with all the marketing one sees for the .com domain name.

    Local country code domain names & .com are not going anywhere. And both .org and .net are widely used & unlikely to face extreme price increases.

    Hosing The Masses…

    A decade ago domainers were frustrated that Verisign increased the price of .com domains in ~5% increments:

    Every mom, every pop, every company that holds a domain name had no say in the matter. ICANN basically said to Verisign: “We agree to let you hose the masses if you stop suing us”.

    I don’t necessarily mind paying more for domains so much as I mind the money going to a monopolistic regulator which has historically had little regard for the registrants/registrars it should be serving.

    Those 5% or 10% shifts were considered “hosing the masses.”

    Imagine what sort of blowback PIR would get from influential charities if they tried to increase the price of .org domains 30-fold overnight. It would be such a public relations disaster it would never be considered.

    Domain registries are not particularly expensive to run. A person who has a number of them can run each of them for less than the cost of a full-time employee – say $25,000 to $50,000 per year.

    And yet, the very people who complained about Verisign’s benign price increases, monopolistic abuses & rent extraction are now pushing massive price hikes:

    .Hosting and .juegos are going up from about $10–$20 retail to about $300 (a 30x jump, or roughly a 3,000% increase). Other domains will also see price increases.

    Here’s the thing with new TLD pricing: registry operators can increase prices as much as they want with just six months’ notice.

    In its applications, Uniregistry said it planned to enter into a contractual agreement to not increase its prices for five years.

    Why would anyone want to build a commercial enterprise (or anything they care about) on such a shoddy foundation?

    If a person promises…

    • no holdbacks of premium domains, then reserves tens of thousands of domains
    • no price hikes for 5 years, then hikes prices
    • price hikes only in line with inflation, then hikes prices 3,000%

    That’s 3 strikes and the batter is out.

    Doing the Math

    The claim that the new TLDs need more revenue to exist is untrue. Running an extension costs maybe $50,000 per year. If a registry operator wanted to build a vibrant & stable ecosystem, the first step would be dumping the concept of premium domains to encourage wide usage & adoption.
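    A back-of-the-envelope version of that math, using the rough $50,000 running cost from above and an assumed commodity-level $10 annual fee (an illustrative figure, not a quoted rate):

    annual_cost = 50_000  # rough cost to run one extension (figure from above)
    annual_fee = 10       # assumed commodity-level registration fee

    breakeven = annual_cost / annual_fee
    print(f"{breakeven:,.0f} registrations cover the running costs")  # 5,000

    At commodity pricing, a few thousand registrations cover the costs of an extension, which is exactly the argument for chasing volume & adoption instead of premium holdbacks.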

    There are hundreds of these new TLD extensions and almost none of them can be trusted to be a wise investment when compared against similar names in established extensions like .com, .net, .org & ccTLDs like .co.uk or .fr.

    There’s no renewal price protection & no need to take on that risk, especially as prices on the core TLDs have come down sharply.

    Domain Pricing Trends

    Aggregate stats are somewhat hard to come by as many deals are not reported publicly & many sites which aggregate sales data also list minimum prices.

    However, domains have lost value for many reasons:

    • declining SEO-related value due to the search results becoming overrun with ads (Google keeps increasing their ad clicks 20% to 30% year over year)
    • broad market consolidation in key markets like travel, ecommerce, search & social
      • Google & Facebook are eating OVER 100% of online advertising growth – the rest of the industry is shrinking in aggregate
      • are there any major news sites which haven’t struggled to monetize mobile?
      • there is a reason there are few great indy blogs compared to a decade ago
    • rising technical costs in implementing independent websites (responsive design, HTTPS, AMP, etc.): “Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don’t like standardization … it looks like rent seeking behaviors on top of friction” – Gabe Newell
    • harder to break into markets with brand-biased relevancy algorithms (increased chunk size of competition)
    • less value in trying to build a brand on a generic name, which struggles to rank in a landscape of brand-biased algorithms (inability to differentiate while being generically descriptive)
    • decline in PPC park page ad revenues
      • for many years Yahoo! hid the deterioration in their core business by relying heavily on partners for ad click volumes, but after they switched to leveraging Bing search, Microsoft was far more interested in click quality than click quantity
      • absent the competitive bid from Yahoo!, Google drastically reduced partner payouts
      • most web browsers have replaced web address bars with dual function search boxes, drastically reducing direct navigation traffic

    All the above are the mechanics of “why” prices have been dropping, but it is also worth noting many of the leading portfolios have been sold.

    If the domain aftermarket were as vibrant as some people claim, there’s no way the Marchex portfolio of 200,000+ domains would have sold for only $28.1 million a couple of years ago.

    RegistrarStats shows .com registrations have stopped growing & other extensions like .net, .org, .biz & .info are now shrinking.

    Both aftermarket domain prices & the pool of registered domains on established gTLDs are dropping.

    I know I’ve dropped hundreds & hundreds of domains over the past year. That might be due to my cynical views of the market, but I did hold many names for a decade or more.

    As the barrier to entry increases, many of the legacy domains which could one day have been worth developing have lost much of their value.

    And the picked over new TLDs are an even worse investment due to the near infinite downside potential of price hikes, registries outright folding, etc.

    In the face of declining value there is a rush of oversupply WITH irrational above-market pricing. And then the registries which spend next to nothing on marketing can’t understand why their great new namespaces went nowhere.

    As much as I cringe at .biz & .info, I’d prefer either of them over just about any new TLD.

    Any baggage they may carry is less than the risk of going with an unproven new extension without any protections whatsoever.

    Losing Faith in the Zimbabwe Dollar

    The people who really lose are those who read what these domain registry operators wrote & trusted them.

    Uniregistry does not believe that registry fees should rise when the costs of other technology services have uniformly trended downward, simply because a registry operator believes it can extract higher profit from its base of registrants.

    How does one justify a 3,000% price hike after stating “Our prices are fixed and only indexed to inflation after 5 years”?

    Are they pricing these names in Zimbabwe Dollars? Or did they just change their minds in a way that hurt anyone who trusted them & invested in their ecosystem?

    Frank Schilling warned about the dangers of lifting price controls:

    The combination of “presumptive renewal” and the “lifting of price controls on registry services” is incredibly dangerous.
    Imagine buying a home, taking on a large mortgage, remodeling, moving in, only to be informed 6 months later that your property taxes will go up 10,000% with no better services offered by local government. The government doesn’t care if you can’t pay your tax/mortgage because they don’t really want you to pay your tax… they want you to abandon your home so they can take your property and resell it to a higher payer for more money, pocketing the difference themselves, leaving you with nothing.

    This agreement as written leaves the door open to exactly that type of scenario

    He didn’t believe the practice to be poor.

    Rather, he felt he would have been made poorer, unless he was the person doing it:

    It would be the mother of all Internet tragedies and a crippling blow to ICANN’s relevance if millions of pioneering registrants were taxed out of their internet homes as a result of the greed of one registry and the benign neglect, apathy or tacit support of its master.

    It is a highly nuanced position.

    Categories: SEO Book

    I Used To Live In A Caravan: Here’s How I Made Enough Money To Live Anywhere In The World

    Most people don’t know this, but when I was younger my ‘bedroom’ was a small caravan (sometimes called a campervan, RV or motorhome, depending on where in the world you come from). This picture is a pretty good representation of my setup at the time, with my caravan at the back…

    Entrepreneurs-Journey.com by Yaro Starak
