Tag Archive | "Used"

Greg Smith: Founder Of Canadian Tech Startup Thinkific Explains How They Used MVPs To Build A Hugely Successful Subscription Software Company

 [ Download MP3 | Transcript | iTunes | Soundcloud | Raw RSS ] One of the hottest business models in the tech startup world is anything with a recurring subscription business model, especially if it’s software based. Another hot online business model for talented individuals who want to make money from their knowledge, is […]

The post Greg Smith: Founder Of Canadian Tech Startup Thinkific Explains How They Used MVPs To Build A Hugely Successful Subscription Software Company appeared first on Yaro.Blog.

Entrepreneurs-Journey.com by Yaro Starak

Posted in IM News | Comments Off

Google My Business Insights adds queries used to find your business

Learn how people find your business in Google Maps and local search within Google My Business Insights with this new report.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

Moz’s Link Data Used to Suck… But Not Anymore! The New Link Explorer is Here – Whiteboard Friday

Posted by randfish

Earlier this week we launched our brand-new link building tool, and we’re happy to say that Link Explorer addresses and improves upon a lot of the big problems that have plagued our legacy link tool, Open Site Explorer. In today’s Whiteboard Friday, Rand transparently lists out many of the biggest complaints we’ve heard about OSE over the years and explains the vast improvements Link Explorer provides, from DA scores updated daily to historic link data to a huge index of almost five trillion URLs.

Click on the whiteboard image above to open a high-resolution version in a new tab!


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m very excited to say that Moz’s Open Site Explorer product, which had a lot of challenges with it, is finally being retired, and we have a new product, Link Explorer, that’s taking its place. So let me walk you through why and how Moz’s link data for the last few years has really kind of sucked. There’s no two ways about it.

If you heard me here on Whiteboard Friday, if you watched me at conferences, if you saw me blogging, you’d probably see me saying, “Hey, I personally use Ahrefs, or I use Majestic for my link research.” Moz has a lot of other good tools. The crawler is excellent. Moz Pro is good. But Open Site Explorer was really lagging, and today, that’s not the case. Let me walk you through this.

The big complaints about OSE/Mozscape

1. The index was just too small

Mozscape was probably about a fifth to a tenth the size of its competitors. While it got a lot of the quality good links of the web, it just didn’t get enough. As SEOs, we need to know all of the links, the good ones and the bad ones.

2. The data was just too old

So, in Mozscape, a link that you built on November 1st, you got a link added to a website, you’re very proud of yourself. That’s excellent. You should expect that a link tool should pick that up within maybe a couple weeks, maybe three weeks at the outside. Google is probably picking it up within just a few days, sometimes hours.

Yet, when Mozscape would crawl that, it would often be a month or more later, and by the time Mozscape processed its index, it could be another 40 days after that, meaning that you could see a 60- to 80-day delay, sometimes even longer, between when your link was built and when Mozscape actually found it. That sucks.

3. PA/DA scores took forever to update

PA/DA scores, likewise, took forever to update because of this link problem. So the index would say, oh, your DA is over here. You’re at 25, and now maybe you’re at 30. But in reality, you’re probably far ahead of that, because you’ve been building a lot of links that Mozscape just hasn’t picked up yet. So this is this lagging indicator. Sometimes there would be links that it just didn’t even know about. So PA and DA just wouldn’t be as accurate or precise as you’d want them to be.

4. Some scores were really confusing and out of date

MozRank and MozTrust relied on essentially the original Google PageRank paper from 1997, and there's no way that's what's being used today. Google certainly uses some view of link equity that's passed between links, something similar to PageRank, and I think they probably still call it PageRank internally, but it looks nothing like the 1997 formulation that MozRank was modeled on.

Likewise, MozTrust was way out of date, based on a paper from, I think, 2002 or 2003. Many more advancements in search have happened since then.

Spam score was also out of date. It used a system correlated with what spam looked like three or four years ago, so it was much more up to date than those two, but still not nearly as sophisticated as what Google is doing today. So we needed to toss those out and find their replacements as well.

5. There was no way to see links gained and lost over time

Mozscape had no way to see gained and lost links over time, and folks thought, “Gosh, these other tools in the SEO space give me this ability to show me links that their index has discovered or links they’ve seen that we’ve lost. I really want that.”

6. DA didn’t correlate as well as it should have

So over time, DA became a less and less indicative measure of how well you were performing in Google's rankings. That needed to change as well. The new DA, by the way, is much, much better on this front.

7. Bulk metrics checking and link reporting was too hard and manual

So folks would say, “Hey, I have this giant spreadsheet with all my link data. I want to upload that. I want you guys to crawl it. I want to go fetch all your metrics. I want to get DA scores for these hundreds or thousands of websites that I’ve got. How do I do that?” We didn’t provide a good way for you to do that either, unless you were willing to write code and loop through our API.
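For anyone wondering what that looked like in practice, here is a rough sketch of the kind of loop being described. The endpoint, auth header, and response field below are placeholders made up for illustration, not the actual Moz API contract; swap in whichever link-metrics API you actually use.

    # Rough sketch of the "write code and loop through our API" workflow above.
    # METRICS_ENDPOINT, the auth header, and the "domain_authority" field are
    # hypothetical placeholders, not the real Moz API contract.
    import csv
    import time
    import requests

    METRICS_ENDPOINT = "https://api.example.com/url-metrics"  # placeholder
    API_KEY = "YOUR_KEY"

    def bulk_domain_authority(domains, pause_seconds=10):
        """Fetch a DA-style score for each domain, politely rate-limited."""
        scores = {}
        for domain in domains:
            resp = requests.get(
                METRICS_ENDPOINT,
                params={"target": domain},
                headers={"Authorization": f"Bearer {API_KEY}"},
                timeout=30,
            )
            resp.raise_for_status()
            scores[domain] = resp.json().get("domain_authority")
            time.sleep(pause_seconds)  # stay inside rate limits
        return scores

    if __name__ == "__main__":
        with open("links.csv") as f:
            targets = [row[0] for row in csv.reader(f) if row]
        for domain, score in bulk_domain_authority(targets).items():
            print(domain, score)

The Link Tracking Lists feature described later handles this inside the product, so a loop like this is only needed if you want the data outside the UI.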

8. People wanted distribution of their links by DA

They wanted distributions of their links by domain authority. Show me where my links come from, yes, but also what sorts of buckets of DA do I have versus my competition? That was also missing.

So, let me show you what the new Link Explorer has.

Moz's new Link Explorer

Click on the whiteboard image above to open a high-resolution version in a new tab!

Wow, look at that magical board change, and it only took a fraction of a second. Amazing.

What Link Explorer has done, as compared to the old Open Site Explorer, is pretty exciting. I’m actually very proud of the team. If you know me, you know I am a picky SOB. I usually don’t even like most of the stuff that we put out here, but oh my god, this is quite an incredible product.

1. Link Explorer has a GIANT index

So I mentioned index size was a big problem. Link Explorer has got a giant index. Frankly, it’s about 20 times larger than what Open Site Explorer had and, as you can see, very, very competitive with the other services out there. Majestic Fresh says they have about a trillion URLs from their I think it’s the last 60 days. Ahrefs, about 3 trillion. Majestic’s historic, which goes all time, has about 7 trillion, and Moz, just in the last 90 days, which I think is our index — maybe it’s a little shorter than that, 60 days — 4.7 trillion, so almost 5 trillion URLs. Just really, really big. It covers a huge swath of the web, which is great.

2. All data updates every 24 hours

So, unlike the old index, it is very fresh. Every time it finds a new link, it updates PA scores and DA scores. Every morning, the whole interface can show you all the links that it found just yesterday.

3. DA and PA are tracked daily for every site

You don’t have to track them yourself. You don’t have to put them into your campaigns. Every time you go and visit a domain, you will see this graph showing you domain authority over time, which has been awesome.

For my new company, I’ve been tracking all the links that come in to SparkToro, and I can see my DA rising. It’s really exciting. I put out a good blog post, I get a bunch of links, and my DA goes up the next day. How cool is that?

4. Old scores are gone, and new scores are polished and high quality

So we got rid of MozRank and MozTrust, which were very old metrics and, frankly, very few people were using them, and most folks who were using them didn’t really know how to use them. PA basically takes care of both of them. It includes the weight of links that come to you and the trustworthiness. So that makes more sense as a metric.

Spam score is now on a 0 to 100% risk model instead of the old 0 to 17 flags, where the number of flags correlated to some percentage. Spam score is basically a machine learning model built against sites that Google penalized or banned.

So we took a huge number of domains and ran their names through Google. If a site couldn't rank for its own name, we said it was penalized. If we did a site:domain.com search and Google had de-indexed it, we said it was banned. Then we built this risk model. A 90% spam score means that 90% of sites with those qualities were penalized or banned; a 2% score means only 2% were. If you have a 30% spam score, that's not too bad. If you have a 75% spam score, it's getting a little sketchy.
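To make that labeling step concrete, here is a toy sketch of the heuristic as described; top_results() is a hypothetical helper standing in for however you would fetch Google's top results for a query, and the rules are a simplification of what the Moz team actually did.

    # Toy sketch of the penalized/banned labeling described above. top_results(query)
    # is a hypothetical helper returning the ranking URLs for a query; the real Moz
    # model is trained on these labels rather than applying rules like this directly.
    from urllib.parse import urlparse

    def label_domain(domain, top_results):
        """Return 'banned', 'penalized', or 'ok' for a single domain."""
        if not top_results(f"site:{domain}"):
            return "banned"                    # nothing left in Google's index
        brand_query = domain.split(".")[0]     # e.g. "example" for example.com
        ranks_for_own_name = any(
            urlparse(url).netloc.endswith(domain)
            for url in top_results(brand_query)
        )
        return "ok" if ranks_for_own_name else "penalized"

The 0 to 100% score is then just the share of similar-looking sites in the training data that ended up labeled penalized or banned.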

5. Discovered and lost links are available for every site, every day

So again, for this new startup that I’m doing, I’ve been watching as I get new links and I see where they come from, and then sometimes I’ll reach out on Twitter and say thank you to those folks who are linking to my blog posts and stuff. But it’s very, very cool to see links that I gain and links that I lose every single day. This is a feature that Ahrefs and Majestic have had for a long time, and frankly Moz was behind on this. So I’m very glad that we have it now.

6. DA is back as a high-quality leading indicator of ranking ability

So, a note that is important: everyone’s DA has changed. Your DA has changed. My DA has changed. Moz’s DA changed. Google’s DA changed. I think it went from a 98 to a 97. My advice is take a look at yourself versus all your competitors that you’re trying to rank against and use that to benchmark yourself. The old DA was an old model on old data on an old, tiny index. The new one is based on this 4.7 trillion size index. It is much bigger. It is much fresher. It is much more accurate. You can see that in the correlations.

7. Building link lists, tracking links that you want to acquire, and bulk metrics checking is now easy

Building link lists, tracking links that you want to acquire, and bulk metrics checking, which we never had before (and, in fact, not many of the other tools have this link tracking ability), is now available through possibly my favorite feature in the tool, called Link Tracking Lists. If you’ve used Keyword Explorer and you’ve set up your keywords to watch them over time and build a keyword research set, it’s very, very similar. If you have links you want to acquire, you add them to this list. If you have links that you want to check on, you add them to this list. It will give you all the metrics, and it will tell you: does this site link to the website you’ve associated with the list, or does it not? Or does it link to some page on the domain, but maybe not exactly the page that you want? It will tell you that too. Pretty cool.

8. Link distribution by DA

Finally, we do now have link distribution by DA. You can find that right on the Overview page at the bottom.

Look, I’m not saying Link Explorer is the absolute perfect, best product out there, but it’s really, really damn good. I’m incredibly proud of the team. I’m very proud to have this product out there.

If you’d like, I’ll be writing some more about how we went about building this product and about the agency folks we spent time with while developing it; I would like to thank all of them, of course. A huge thank you to the Moz team.

I hope you’ll do me a favor. Check out Link Explorer. I think, very frankly, this team has earned 30 seconds of your time to go check it out.

Try out Link Explorer!

All right. Thanks, everyone. We’ll see you again for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

Google Confirms Chrome Usage Data Used to Measure Site Speed

Posted by Tom-Anthony

During a discussion with Google’s John Mueller at SMX Munich in March, he told me an interesting bit of data about how Google evaluates site speed nowadays. It has gotten a bit of interest from people when I mentioned it at SearchLove San Diego the week after, so I followed up with John to clarify my understanding.

The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regards to rankings). This is a positive move (IMHO) as it means we don’t need to treat optimizing site speed for Google as a separate task from optimizing for users.

Previously, it has not been clear how Google evaluates site speed, and it was generally believed to be measured by Googlebot during its visits — a belief enhanced by the presence of speed charts in Search Console. However, the onset of JavaScript-enabled crawling made it less clear what Google is doing — they obviously want the most realistic data possible, but it’s a hard problem to solve. Googlebot is not built to replicate how actual visitors experience a site, and so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was the mechanism).

In this post, I want to recap the pertinent data around this news quickly and try to understand what this may mean for users.

Google Search Console

Firstly, we should clarify our understanding of what the “time spent downloading a page” metric in Google Search Console is telling us. Most of us will recognize graphs like this one:

Until recently, I was unclear about exactly what this graph was telling me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention):

John clarified what this graph is showing:

It’s technically not “downloading the page” but rather “receiving data in response to requesting a URL” – it’s not based on rendering the page, it includes all requests made.

And that it is:

this is the average over all requests for that day

Because Google may be fetching a very different set of resources every day when it’s crawling your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the real performance of your site.

For that reason, John points out that:

Focusing blindly on that number doesn’t make sense.

With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I’m a big fan).
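If all you want is a rough feel for raw response times (which is essentially the flavor of data that Search Console graph averages, with no rendering involved), a few lines of scripting will do; this is a minimal sketch with placeholder URLs, not a replacement for a proper tool like WebPageTest.

    # Minimal sketch: time raw HTTP responses for a handful of URLs.
    # Like the Search Console metric, this ignores rendering entirely; it only
    # shows how quickly the server answers each individual request.
    import requests

    urls = [
        "https://www.example.com/",
        "https://www.example.com/robots.txt",
        "https://www.example.com/sitemap.xml",
    ]

    for url in urls:
        resp = requests.get(url, timeout=30)
        # resp.elapsed measures request-sent to response-headers-received.
        print(f"{url}\t{resp.status_code}\t{resp.elapsed.total_seconds():.3f}s")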

Okay, so now we understand that graph and what it represents, let’s look at the next option: the Google WRS.

Googlebot & the Web Rendering Service

Google’s WRS is their headless browser mechanism based on Chrome 41, which is used for things like “Fetch as Googlebot” in Search Console, and is increasingly what Googlebot is using when it crawls pages.

However, we know that this isn’t how Google evaluates pages because of a Twitter conversation between Aymen Loukil and Google’s Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed that WRS is not responsible for evaluating site speed:

Twitter conversation with Gary Illyes

At the time, Gary was unable to clarify what was being used to evaluate site performance (perhaps because the Chrome User Experience Report hadn’t been announced yet). It seems as though things have progressed since then, however. Google is now able to tell us a little more, which takes us on to the Chrome User Experience Report.

Chrome User Experience Report

Introduced in October last year, the Chrome User Experience Report “is a public dataset of key user experience metrics for top origins on the web,” whereby “performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled.”

Essentially, certain Chrome users allow their browser to report back load time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public data set.

In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.

However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he explains they’re using this data along with some other data sources (he doesn’t say which, though notes that it is in part because the data set does not cover all domains).

At SMX John also pointed out how Google’s PageSpeed Insights tool now includes data from the Chrome User Experience Report:

The public dataset of performance data for the top million domains is also available in a public BigQuery project, if you’re into that sort of thing!
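If you want to poke at that dataset yourself, a query along these lines pulls the first contentful paint distribution for a single origin. The monthly table name and the origin below are assumptions you would adjust, and you need your own Google Cloud project with the google-cloud-bigquery client installed.

    # Sketch: query the public Chrome User Experience Report dataset for one
    # origin's first contentful paint histogram. Assumes a Google Cloud project
    # with BigQuery access and `pip install google-cloud-bigquery`; the table
    # month (201804) and origin are placeholders to adjust.
    from google.cloud import bigquery

    client = bigquery.Client()

    query = """
        SELECT
          bin.start AS fcp_ms,
          SUM(bin.density) AS density
        FROM
          `chrome-ux-report.all.201804`,
          UNNEST(first_contentful_paint.histogram.bin) AS bin
        WHERE
          origin = 'https://www.example.com'
        GROUP BY
          fcp_ms
        ORDER BY
          fcp_ms
    """

    for row in client.query(query).result():
        print(f"{row.fcp_ms:>6} ms  {row.density:.4f}")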

We can’t be sure what all the other factors Google is using are, but we now know they are certainly using this data. As I mentioned above, I also imagine they are using data on more sites than are perhaps provided in the public dataset, but this is not confirmed.

Pay attention to users

Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.

The same is true if you were to use service workers for advanced caching behaviors — Googlebot wouldn’t be aware, but users would. There are certainly other such examples.

Essentially, this means that there’s no longer a reason to worry about pagespeed for Googlebot, and you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, which is a separate task.

If you are unsure where to look for site speed advice, then you should look at:

That’s all for now! If you have questions, please comment here and I’ll do my best! Thanks!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

5 Brands That Used Influencer Marketing to Raise Their Profile

Influencer marketing is more than just a marketing buzzword these days. More companies are utilizing this marketing method to boost sales and grow their brands.

For those still confused about what influencer marketing is, it’s simply the act of promoting or selling products or services via influencers, or people who have the ability to affect a brand. Where the main influencers before were celebrities and industry leaders, today’s influencers are more varied. Nowadays, top brands are seeking out bloggers, food critics, makeup mavens and celebrities who rose to fame on platforms like YouTube and Instagram.

Brands that Benefited from Influencer Marketing

Influencer marketing provides a lot of benefits. Brands can reach the relevant demographic and enjoy high levels of engagement. It’s also affordable and can help retain a brand’s authenticity. Numerous companies have already successfully leveraged these people to give their brand a boost.

Clinique for Men

Clinique is renowned for its hypoallergenic skincare for women. When the iconic cosmetic company launched a men’s line, they raised product awareness by partnering with a disparate group of male influencers from various professions. These influencers consisted of filmmakers, outdoorsmen, stylists, and lifestyle bloggers, each representing a group of men who would be interested in using Clinique for Men. Every post used in the campaign was unique and defined the influencer. For instance, surfer Mikey de Temple posted a photo of himself wearing his surf gear, with his surfboard in the background, along with a Clinique product.

Clinique’s campaign was golden for several reasons. One, the company’s choice of influencers was so diverse that it expanded the product’s reach. Also, the posts integrated the product smoothly into settings that felt natural to each influencer. This helped create a more organic interest in Clinique’s men’s line.

Fashion Nova

One brand that has truly embraced influencer marketing is Fashion Nova. According to the company’s founder and CEO, Richard Saghian, Fashion Nova is a viral store that works with 3,000 to 5,000 influencers. Its aggressive marketing efforts rely on lots of model and celebrity influencers, like Kylie Jenner and beauty vlogger Blissful Brii. The former has 93.8 million followers on Instagram while the latter has 93 thousand subscribers on YouTube. These two influencers alone have garnered millions of engagements, likes, and comments for the company.

While other brands go for low-key but very relatable influencers, Fashion Nova went for the celebrities. While this will obviously net a company high levels of engagement, it can also be costly. But as Fashion Nova has proven, it’s a worthwhile investment.

Lagavulin’s Whiskey Yule Log

This is a magnificent example of how an influencer marketing campaign made a product culturally relevant to a generation. Young people might not have a taste for single malt whiskey, but Lagavulin’s 2016 campaign featuring Nick Offerman changed that. Offerman’s iconic Parks and Rec character, Ron Swanson, is known for his love of whiskey. Lagavulin’s 45-minute video took inspiration from YouTube’s yule log videos and simply showed Offerman quietly sipping and enjoying his whiskey next to a fireplace.

The campaign was a success because Lagavulin found the perfect influencer for its brand. Offerman’s character proved to be a critical match for the target audience. As a matter of fact, the campaign was so good that it won an award for Best Influencer & Celebrity Campaign.

Zafferano

Zafferano does not have the same name recall as Nobu or other famous restaurants. But this Singapore-based establishment is a prime example of how social media can be used to boost audience engagement. The company tapped 11 Instagram influencers who are popular in the lifestyle and food category and invited them to the restaurant for a special meal; in turn, the influencers shared photos of the dishes on Instagram. The influencers also described the dishes and their dining experience. Details like price and availability were also included.

Zafferano’s campaign is notable because of the experience it created for the influencers. This, in turn, helped them come up with authentic and sincere reviews. Since the campaign had such a genuine feel, it encouraged followers to interact and engage with the posts.

Zara

Clothing powerhouse Zara was one of the most profitable companies in 2015, and that’s partly because of its successful influencer marketing campaign. The company’s social media marketing campaign got some help from several top fashion-forward Instagrammers. The Instagram posts shared by these popular influencers showcased Zara’s clothing lines and their followers used these photos to get ideas on what’s currently trending as well as tips on how to work a particular style.

Zara’s campaign was a success because the company handed the control over to the fashion influencers, the people that customers looked to for fashion advice. The content that was used in the campaign was subtle and useful, which made it even more valuable to the influencers’ thousands of followers.

[Featured image via YouTube]

The post 5 Brands That Used Influencer Marketing to Raise Their Profile appeared first on WebProNews.


WebProNews

Posted in IM News | Comments Off

New gTLDs are Like Used Cars

There may be a couple exceptions which prove the rule, but new TLDs are generally an awful investment for everyone except the registry operator.

Here is the short version…

And the long version…

Diminishing Returns

About a half-decade ago I wrote about how Google devalued domain names from an SEO perspective. Since then a number of leading “category killer” domains have been repeatedly recycled from startup to acquisition to shut down to PPC park page to “buy now for this once in a lifetime opportunity,” in an endless water cycle.

The central web platforms are becoming ad heavy, which in turn decreases the reach of anything which is not an advertisement. For the most valuable concepts / markets / keywords ads eat up the entire interface for the first screen full of results. Key markets like hotels might get a second round of vertical ads to further displace the concept of organic results.

Proprietary, Closed-Ecosystem Roach Motels

The tech monopolies can only make so much money by stuffing ads onto their own platform. To keep increasing their take they need to increase the types, varieties & formats of media they host and control & keep the attention on their platform.

Both Google & Facebook are promoting scams where they feed on desperate publishers & suck a copy of the publisher’s content into being hosted by the tech monopoly platform du jour & sprinkle a share of the revenues back to the content sources.

They may even pay a bit upfront for new content formats, but then after the market is primed the deal shifts to where (once again) almost nobody other than the tech monopoly platform wins.

The attempt to “own” the web & never let users go is so extreme both companies will make up bogus statistics to promote their proprietary / fake open / actually closed standards.

If you ignore how Google’s AMP double, triple, or quadruple counts visitors in Google Analytics, the visit numbers look appealing.

But the flip side of those fake metrics is actual revenues do not flow.

Facebook has the same sort of issues, with frequently needing to restate various metrics while partners fly blind.

These companies are restructuring society & the race to the bottom to try to make the numbers work in an increasingly unstable & parasitic set of platform choices is destroying adjacent markets:

Have you tried Angry Birds lately? It’s a swamp of dark patterns. All extractive logic meant to trick you into another in-app payment. It’s the perfect example of what happens when product managers have to squeeze ever-more-growth out of ever-less-fertile lands to hit their targets year after year. … back to the incentives. It’s not just those infused by venture capital timelines and return requirements, but also the likes of tax incentives favoring capital gains over income. … that’s the truly insidious part of the tech lords solution to everything. This fantasy that they will be greeted as liberators. When the new boss is really a lot like the old boss, except the big stick is replaced with the big algorithm. Depersonalizing all punishment but doling it out just the same. … this new world order is being driven by a tiny cabal of monopolies. So commercial dissent is near impossible. … competition is for the little people. Pitting one individual contractor against another in a race to the bottom. Hoarding all the bargaining power at the top. Disparaging any attempts against those at the bottom to organize with unions or otherwise.

To be a success on the attention platforms you have to push toward the edges. But as you become successful you become a target.

And the dehumanized “algorithm” is not above politics & public relations.

Pewdiepie is the biggest success story on the YouTube platform. When he made a video showing some of the absurd aspects of Fiverr, it led to a WSJ investigation which “uncovered” a pattern of anti-Semitism. And yet one of the reporters who worked on that story wrote far more offensive and anti-Semitic tweets. The hypocrisy of the hit job didn’t matter. They still were able to go after Pewdiepie’s ad relationships to cut him off from Disney’s Maker Studios & the premium tier of YouTube ads.

The fact that he is an individual with broad reach means he’ll still be fine economically, but many other publishers would quickly end up in a death spiral from the above sequence.

If it can happen to a leading player in a closed ecosystem then the risk to smaller players is even greater.

In some emerging markets Facebook effectively *is* the Internet.

The Decline of Exact Match Domains

Domains have been so devalued (from an SEO perspective) that some names like PaydayLoans.net sell for about $3,000 at auction.

$3,000 can sound like a lot to someone with no money, but names like that were going for 6 figures at their peak.

Professional domain sellers participate in the domain auctions on sites like NameJet & SnapNames. Big keywords like [payday loans] in core trusted extensions are not missed. So if the 98% decline in price were an anomaly, at least one of them would have bid more in that auction.

Why did exact match domains fall so hard? In part because Google shifted from scoring the web based on links to considering things like brand awareness in rankings. And it is very hard to run a large brand-oriented ad campaign promoting a generically descriptive domain name. Sure there are a few exceptions like Cars.com & Hotels.com, but if you watch much TV you’ll see a lot more ads associated with businesses that are not built on generically descriptive domain names.

Not all domains have fallen quite that hard in price, but the more into the tail you go the less the domain acts as a memorable differentiator. If the barrier to entry increases, then the justification for spending a lot on a domain name as part of a go to market strategy makes less sense.

Brandable Names Also Lost Value

Arguably EMDs have lost more value than brandable domain names, but even brandable names have sharply slid.

If you go back a decade or two tech startups would secure their name (say Snap.com or Monster.com or such) & then try to build a business on it.

But in the current marketplace with there being many paths to market, some startups don’t even have a domain name at launch, but begin as iPhone or Android apps.

Now people try to create success on a good enough, but cheap domain name & then as success comes they buy a better domain name.

Jelly was recently acquired by Pinterest. Rather than buying jelly.com they were still using AskJelly.com for their core site & Jelly.co for their blog.

As long as domain redirects work, there’s no reason to spend heavily on a domain name for a highly speculative new project.

Rather than spending 6 figures on a domain name & then seeing if there is market fit, it is far more common to launch a site on something like getapp.com, joinapp.com, app.io, app.co, businessnameapp.com, etc.

This in turn means that rather than 10,000s of startups all chasing their core .com domain name off the start, people test whatever is good enough & priced close to $10. Then only after they are successful do they try to upgrade to better, more memorable & far more expensive domain names.
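Mechanically, that upgrade path is just a permanent redirect from the good-enough name to the better one once you buy it. Here is a minimal Flask sketch with placeholder domain names; in practice this is usually a one-line rule at your web server or CDN rather than an app.

    # Minimal sketch: 301-redirect every request on the old "good enough" domain
    # to the upgraded domain, preserving path and query string. Domain names are
    # placeholders for illustration.
    from flask import Flask, redirect, request

    app = Flask(__name__)
    NEW_DOMAIN = "https://newdomain.example"

    @app.route("/", defaults={"path": ""})
    @app.route("/<path:path>")
    def upgrade(path):
        target = f"{NEW_DOMAIN}/{path}"
        if request.query_string:
            target += "?" + request.query_string.decode("utf-8")
        return redirect(target, code=301)  # moved permanently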

Money isn’t spent on the domain names until the project has already shown market fit.

One in a thousand startups spending $1 million on a domain adds up to far less than one in three startups spending $100,000 (per 1,000 startups, that is $1 million in aggregate versus roughly $33 million).

New TLDs Undifferentiated, Risky & Overpriced

No Actual Marketing Being Done

Some of the companies which are registries for new TLDs talk up investing in marketing & differentiation for the new TLDs, but very few of them are doing much on the marketing front.

You may see their banner ads on domainer blogs & they may even pay for placement with some of the registries, but there isn’t much going on in terms of cultivating a stable ecosystem.

When Google or Facebook try to enter & dominate a new vertical, the end destination may be extractive rent seeking by a monopoly BUT off the start they are at least willing to shoulder some of the risk & cost upfront to try to build awareness.

Where are the domain registries who have built successful new businesses on some of their new TLDs? Where are the subsidies offered to key talent to help drive awareness & promote the new strings?

As far as I know, none of that stuff exists.

In fact, what is prevalent is the exact opposite.

Greed-Based Anti-Marketing

So many of them are short-sighted, greed-based plays that they do the exact opposite of building an ecosystem … they hold back any domain which potentially might not be complete garbage so they can juice it for a premium ask price in the 10s of thousands of dollars.

While searching on GoDaddy Auctions for a client project I have seen new TLDs like .link listed for sale for MORE THAN the asking price of similar .org names.

If those prices had any sort of legitimate foundation then the person asking $30,000 for a .link would have bulk bought all the equivalent .net and .org names which are listed for cheaper prices.

But the prices are based on fantasy & almost nobody is dumb enough to pay those sorts of prices.

Anyone dumb enough to pay that would be better off buying their own registry rather than a single name.

The holding back of names is the exact opposite of savvy marketing investment. It means there’s no reason to use the new TLD if you either have to pay through the nose or use a really crappy name nobody will remember.

I didn’t buy more than 15 of Uniregistry’s domains because all names were reserved in the first place and I didn’t feel like buying 2nd tier domains … Domainers were angry when the first 2 Uniregistry’s New gTLDs (.sexy and .tattoo) came out and all remotely good names were reserved despite Frank saying that Uniregistry would not reserve any domains.

Who defeats the race to the bottom aspects of the web by starting off from a “we only sell shit” standpoint?

Nobody.

And that’s why these new TLDs are a zero.

Defaults Have Value

Many online verticals are driven by winner take most monopoly economics. There’s a clear dominant leader in each of these core markets: social, search, short-form video, long-form video, retail, auctions, real estate, job search, classifieds, etc. Some other core markets have consolidated down to 3 or 4 core players who among them own about 50 different brands that attack different parts of the market.

Almost all the category leading businesses which dominate aggregate usage are on .com domains.

Contrast the lack of marketing for new TLDs with all the marketing one sees for the .com domain name.

Local country code domain names & .com are not going anywhere. And both .org and .net are widely used & unlikely to face extreme price increases.

Hosing The Masses…

A decade ago domainers were frustrated that Verisign increased the price of .com domains in ~5% increments:

Every mom, every pop, every company that holds a domain name had no say in the matter. ICANN basically said to Verisign: “We agree to let you hose the masses if you stop suing us”.

I don’t necessarily mind paying more for domains so much as I mind the money going to a monopolistic regulator which has historically had little regard for the registrants/registrars it should be serving

Those 5% or 10% shifts were considered “hosing the masses.”

Imagine what sort of blowback PIR would get from influential charities if they tried to increase the price of .org domains 30-fold overnight. It would be such a public relations disaster it would never be considered.

Domain registries are not particularly expensive to run. A person who has a number of them can run each of them for less than the cost of a full-time employee – say $25,000 to $50,000 per year.

And yet, the very people who complained about Verisign’s benign price increases, monopolistic abuses & rent extraction are now pushing massive price hikes:

.Hosting and .juegos are going up from about $10-$20 retail to about $300. Other domains will also see price increases.

Here’s the thing with new TLD pricing: registry operators can increase prices as much as they want with just six months’ notice.

in its applications, Uniregistry said it planned to enter into a contractual agreement to not increase its prices for five years.

Why would anyone want to build a commercial enterprise (or anything they care about) on such a shoddy foundation?

If a person promises…

  • no hold backs of premium domains, then reserves 10s of thousands of domains
  • no price hikes for 5 years, then hikes prices
  • the eventual price hikes being in line with inflation, then hikes prices 3,000%

That’s 3 strikes and the batter is out.

Doing the Math

The claim that the new TLDs need more revenues to exist is untrue. Running an extension costs maybe $50,000 per year. If a registry operator wanted to build a vibrant & stable ecosystem, the first step would be dumping the concept of premium domains to encourage wide usage & adoption.

There are hundreds of these new TLD extensions and almost none of them can be trusted to be a wise investment when compared against similar names in established extensions like .com, .net, .org & CCTLDs like .co.uk or .fr.

There’s no renewal price protection & there’s no need, especially as prices on the core TLDs have sharply come down.

Domain Pricing Trends

Aggregate stats are somewhat hard to come by as many deals are not reported publicly & many sites which aggregate sales data also list minimum prices.

However, domains have lost value for many reasons:

  • declining SEO-related value due to the search results becoming over-run with ads (Google keeps increasing their ad clicks 20% to 30% year over year)
  • broad market consolidation in key markets like travel, ecommerce, search & social
    • Google & Facebook are eating OVER 100% of online advertising growth – the rest of industry is shrinking in aggregate
    • are there any major news sites which haven’t struggled to monetize mobile?
    • there is a reason there are few great indy blogs compared to a decade ago
  • rising technical costs in implementing independent websites (responsive design, HTTPS, AMP, etc.) “Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don’t like standardization … it looks like rent seeking behaviors on top of friction” – Gabe Newell
  • harder to break into markets with brand-biased relevancy algorithms (increased chunk size of competition)
  • less value in trying to build a brand on a generic name, which struggles to rank in a landscape of brand-biased algorithms (inability to differentiate while being generically descriptive)
  • decline in PPC park page ad revenues
    • for many years Yahoo! hid the deterioration in their core business by relying heavily on partners for ad click volumes, but after they switched to leveraging Bing search, Microsoft was far more interested with click quality vs click quantity
    • absent the competitive bid from Yahoo!, Google drastically reduced partner payouts
    • most web browsers have replaced web address bars with dual function search boxes, drastically reducing direct navigation traffic

All the above are the mechanics of “why” prices have been dropping, but it is also worth noting many of the leading portfolios have been sold.

If the domain aftermarket is as vibrant as some people claim, there’s no way the Marchex portfolio of 200,000+ domains would have sold for only $28.1 million a couple years ago.

RegistrarStats shows .com registrations have stopped growing & other extensions like .net, .org, .biz & .info are now shrinking.

Both aftermarket domain prices & the pool of registered domains on established gTLDs are dropping.

I know I’ve dropped hundreds & hundreds of domains over the past year. That might be due to my cynical views of the market, but I did hold many names for a decade or more.

As barrier to entry increases, many of the legacy domains which could have one day been worth developing have lost much of their value.

And the picked over new TLDs are an even worse investment due to the near infinite downside potential of price hikes, registries outright folding, etc.

Into this face of declining value there is a rush of oversupply WITH irrational above-market pricing. And then the registries which spend next to nothing on marketing can’t understand why their great new namespaces went nowhere.

As much as I cringe at .biz & .info, I’d prefer either of them over just about any new TLD.

Any baggage they may carry is less than the risk of going with an unproven new extension without any protections whatsoever.

Losing Faith in the Zimbabwe Dollar

Who really loses is anyone who read what these domain registry operators wrote & trusted them.

Uniregistry does not believe that registry fees should rise when the costs of other technology services have uniformly trended downward, simply because a registry operator believes it can extract higher profit from its base of registrants.

How does one justify a 3,000% price hike after stating “Our prices are fixed and only indexed to inflation after 5 years”?

Are they pricing these names in Zimbabwe Dollars? Or did they just change their minds in a way that hurt anyone who trusted them & invested in their ecosystem?

Frank Schilling warned about the dangers of lifting price controls

The combination of “presumptive renewal” and the “lifting of price controls on registry services” is incredibly dangerous.
Imagine buying a home, taking on a large mortgage, remodeling, moving in, only to be informed 6 months later that your property taxes will go up 10,000% with no better services offered by local government. The government doesn’t care if you can’t pay your tax/mortgage because they don’t really want you to pay your tax… they want you to abandon your home so they can take your property and resell it to a higher payer for more money, pocketing the difference themselves, leaving you with nothing.

This agreement as written leaves the door open to exactly that type of scenario

He didn’t believe the practice to be poor.

Rather he felt he would have been made poorer, unless he was the person doing it:

It would be the mother of all Internet tragedies and a crippling blow to ICANN’s relevance if millions of pioneering registrants were taxed out of their internet homes as a result of the greed of one registry and the benign neglect, apathy or tacit support of its master.

It is a highly nuanced position.

SEO Book

Posted in IM News | Comments Off

I Used To Live In A Caravan: Here’s How I Made Enough Money To Live Anywhere In The World

Most people don’t know this, but when I was younger my ‘bedroom’ was a small caravan (sometimes called a campervan, RV or motorhome, depending on where in the world you come from). This picture is a pretty good representation of my setup at the time, with my caravan at the back…

The post I Used To Live In A Caravan: Here’s How I Made Enough Money To Live Anywhere In The World appeared first on Entrepreneurs-Journey.com.

Entrepreneurs-Journey.com by Yaro Starak

Posted in IM News | Comments Off

3 Different Marketing Experiments I Used To Grow My Online Editing Business During The Early Days

Note from Yaro: This blog post was originally a two-part series I wrote in January 2005, shortly after this blog was first created. I combined the two parts into this one article, which details how I started and grew my online editing business, BetterEdit. I was still 100% focused on…

The post 3 Different Marketing Experiments I Used To Grow My Online Editing Business During The Early Days appeared first on Entrepreneurs-Journey.com.

Entrepreneurs-Journey.com by Yaro Starak

Posted in IM News | Comments Off

New take on Showcase Shopping ads? Categories of used items showing for retailer outlet queries

Similar to the Showcase ad format introduced this summer, the ads link to Google Shopping pages.

The post New take on Showcase Shopping ads? Categories of used items showing for retailer outlet queries appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off
