In case you hadn’t already heard, AdWords can now spend up to double your campaign’s daily budget… which is pretty darned irritating! Fortunately, your favorite PPC superhero is here to save the day. Yep, here I am! So let’s see if we can’t script our way out of this mess.

For 99 percent of campaigns, I’d normally recommend not using budget caps at all — I like to “tap it, not cap it,” which basically means it’s better to control spend through bids (and ROI) rather than closing up shop with budgets. However, there are certain instances where budgets are not just useful, but essential — for example, if a client has a specific budget attached to a particular campaign. Yes, Google, some people actually have limited marketing budgets!

At the very least, you should know when the overspend is happening, so you can judge for yourself whether said overspend should continue. If you’d really like to keep a close eye on costs, have a look at our script to track your account’s spend every hour. For those who only want to be alerted when campaigns are over their budgets, this is where the new script comes in!

This latest script from Brainlabs (my employer) checks each campaign’s spend and budget. All you need to do is set a multiplier threshold — if the spend is larger than the budget multiplied by the threshold, then the campaign is labeled. You’ll get an email listing the newly labeled campaigns, along with their spend and budgets. And if you want, you can set another threshold so that if the spend gets too far over your budget, the campaign will be paused.
To use the script, copy the code below into a new AdWords Script and change the settings at the top:
Preview the script to make sure it’s working as expected (and check the logs in case there are any warnings). Then set up a schedule so the script runs hourly. A few things to note:
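The published Brainlabs script itself isn't reproduced above, but the behavior it describes (label a campaign when spend exceeds budget times one threshold, optionally pause it past a second threshold) can be sketched as plain JavaScript. The settings, function name and sample campaigns below are my own illustrative assumptions, not the published code; in a real AdWords Script the campaign data would come from AdWordsApp.campaigns() and the alert from MailApp.sendEmail().

```javascript
// Illustrative settings, analogous to the "settings at the top" the article mentions.
var LABEL_THRESHOLD = 1.2; // label when spend > budget * 1.2
var PAUSE_THRESHOLD = 2.0; // pause when spend > budget * 2.0 (optional; set to null to disable)

// Pure decision logic: given today's spend and the daily budget,
// return which action (if any) the script should take.
function budgetAction(spend, budget, labelThreshold, pauseThreshold) {
  if (pauseThreshold && spend > budget * pauseThreshold) return "pause";
  if (spend > budget * labelThreshold) return "label";
  return "none";
}

// In a real AdWords Script you would iterate over AdWordsApp.campaigns(),
// read campaign.getStatsFor("TODAY").getCost() and campaign.getBudget().getAmount(),
// then call campaign.applyLabel(...) or campaign.pause(), and email a summary
// of newly labeled campaigns with MailApp.sendEmail(...).
// Here we just demo the logic on made-up sample data.
var campaigns = [
  { name: "Brand",       spend: 130, budget: 100 },
  { name: "Generic",     spend: 210, budget: 100 },
  { name: "Remarketing", spend: 90,  budget: 100 }
];
campaigns.forEach(function (c) {
  console.log(c.name + ": " + budgetAction(c.spend, c.budget, LABEL_THRESHOLD, PAUSE_THRESHOLD));
});
```

Keeping the threshold comparison in a small pure function like this makes the labeling/pausing rule easy to preview against yesterday's numbers before you trust it with a live account.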
Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.

About The Author
Daniel Gilbert is the CEO at Brainlabs, the best paid media agency in the world (self-declared). He has started and invested in a number of big data and technology startups since leaving Google in 2010.
Barry Schwartz
Barry Schwartz is Search Engine Land's News Editor and owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry can be followed on social media at @rustybrick, +BarrySchwartz and Facebook. For more background information on Barry, see his full bio and disclosures.

At last week's SMX East conference, Google's webmaster trends analyst Gary Illyes took questions from the dual moderators — Barry Schwartz and Michelle Robbins — as well as from the audience in a session called "Ask Me Anything." In this post, I will cover that question-and-answer dialogue, though what you'll see below are paraphrases rather than exact quotes. I have grouped the questions and used section headers to help improve the flow and readability.

Off-site signals

Barry: You've been saying recently that Google looks at other off-site signals, in addition to links, and some of this sounded like Google is doing some form of sentiment analysis.

Gary: I did not say that Google did sentiment analysis, but others assumed that was what I meant. What I was attempting to explain is that how people perceive your site will affect your business, but will not necessarily affect how Google ranks your site. Mentions on third-party sites, however, might help you, because Google looks to them to get a better idea of what your site is about and to get keyword context. And that, in turn, might help you rank for more keywords. Imagine the Google ranking algo is more like a human. If a human sees a lot of brand mentions, they will remember that, and the context in which they saw them. As a result, they may associate that brand with something that they didn't before. That can happen with the Google algorithm as well.

Mobile First, AMP, PWAs and such

Michelle: Where should SEOs focus their efforts in 2018?
Gary: If you are not mobile-friendly, then address that. That said, I believe the fear of the mobile-first index will be much greater than the actual impact in the end.

Michelle: When will mobile-first roll out?

Gary: Google doesn't have a fixed timeline, but I can say that we have moved some sites over to it already. We are still monitoring those sites to make sure that we are not harming them inadvertently. Our team is working really hard to move over sites that are ready to the mobile-first index, but I don't want to give a timeline because I'm not good at it. It will probably take years, and even then, will probably not be 100 percent converted. The mobile-first index as a phrase is a new thing, but we have been telling developers to go mobile for seven years. If you have a responsive site, you are pretty much set. But if you have a mobile site, you need to check for content parity and structured data parity between your desktop and mobile pages. You should also check for hreflang tags, and that you've also moved all media and images over.

Michelle: Where does AMP fit? Is AMP separate from mobile-first? Is the only AMP benefit the increased site speed?

Gary: Yes, this is correct. AMP is an alternate version of the site. If you have a desktop site, and no mobile site, but do have an AMP site, we will still index the desktop site.

Michelle: If half a site is a progressive web app (PWA), and half is responsive, how does that impact search performance?

Gary: PWAs are JavaScript apps. If they can render, they will do pretty much the same as the responsive site. However, we are currently using Chrome Version 41 for rendering, and that's not the latest, so there are newer APIs not supported by V41. If you are using those APIs, you may have a problem. Google is working to get to the latest version of Chrome for rendering, which will solve that issue.

Barry: I've seen desktop search showing one result and a mobile device showing a different page as an AMP result.
Gary: This happens because of our emphasis on indexing mobile-friendly sites. AMP is an alternate version of the regular mobile page. First, the mobile page gets selected to be ranked. Then the AMP page gets swapped in.

Michelle: So that means AMP is inconsequential in ranking?

Gary: Yes.

Michelle: Will there be a penalty for spamming news carousels?

Gary: We get that question a lot. I do not support most penalties. I (and many others at Google) would like to have algorithms that ignore those things [like spam] and eliminate the benefit. I've spoken with the Top Stories team about this, and they are looking into a solution.

Michelle: What about progressive web apps (PWAs)? Do they get the same treatment as AMP, i.e., no ranking boost?

Gary: If you have a standalone app, it will show up in the mobile-first index. But if you have both a PWA and an AMP page, the AMP page will be shown.

Michelle: What if the elements removed from your mobile-first site are ads? [Would that make the AMP version rank higher?]

Gary: Your site will become faster [by adopting AMP and eliminating these ads]. The "above the fold" algorithm looks at how many ads there are, and if it sees too many, it may not let your site rank as highly as it otherwise might. But when we're looking at whether sites are ready for the mobile-first index, we're more concerned about parity regarding content, annotations and structured data than ads.

Michelle: What about author markup?

Gary: Because AMP pages on a media site can show up in the news carousel, the AMP team said that you shouldn't remove the author info when you're creating AMP pages.

Search Console

Barry: When will SEOs be able to see voice search query information in Search Console?

Gary: I have no update on that. I'm waiting for the search team leads to take action on it.

Barry: How is the Search Console beta going?

Gary: It's going well. There are a significant number of sites in the beta. We're getting good feedback and making changes.
We want to launch something that works really well. I'm not going to predict when it will come out of beta.

Barry: When will they get a year's worth of data?

Gary: They have started collecting the data. Not sure if it will launch. The original plan was to launch with the new UI. [Gary doesn't know if plans have changed, or when the new UI will launch.]

Barry: Why is there no Featured Snippet data in Search Console? You built it, tested it, and then didn't launch it.

Gary: There is internal resistance at Google. The internal team leads want to know how it would be useful to publishers. How would publishers use it?

Barry: It would give us info on voice search.

Gary: I need something to work with to argue for it (to persuade the team leads internally at Google that it would be a good thing to release).

This question about how the featured snippet data would be used was then sent to the audience. Eric Enge (your author) spoke from the audience: I'd like to use the data to show clients just how real the move to voice search is. There are things they need to do to get ready, such as understand how interactions with their customers will change.

Michelle: So, that data could be used to drive adoption. For now, that sounds like more of a strategic insight than immediately actionable information.

Gary: The problem is that voice search has been here for a couple of years. Voice search is currently optimized for what we have, and people shouldn't need to change anything about their sites. Maybe there will be new technologies in the future that will help users.

Michelle: I think that it's more complicated than that. There are things that you can do with your content that will help it surface better in search, and brands can invest resources in structuring content that can handle conversations better.

Ads on Google and the user experience

Michelle: As you (Google) push organic results below the fold [to give more prominence to ads and carousels] … is that a good user experience?
Gary: I click on a lot of search ads. (Note that Googler clicks that occur on our internal network don't count as clicks for advertisers, so this costs you nothing.) I believe that ads in search are more relevant than the 10 blue links. On every search page, there's pretty aggressive bidding going on for every single position. Since bids correlate to relevance and the quality of the site, this does tend to result in relevant results.

Barry: Sometimes the ads are more relevant than the organic results …?

Gary: Especially on international searches.

Michelle: How is that determined?

Gary: This is done algorithmically.

Michelle: How can you compare ads to organic if the two aren't working together?

Gary: The concept of a bidding process and the evaluation of quality are used by both sides. The separation between the groups is more about keeping the ads people who talk to clients away from the organic people, so they don't try to influence them. The ads engineering people, they can talk to the organic side; that's not forbidden.

Ranking factors and featured snippets

Michelle: Does Google factor non-search traffic into rankings?

Gary: First of all, search traffic is not something we use in rankings. As for other kinds of traffic, Google might see that through Analytics, but I swear we do not use Analytics data for search rankings. We also have data from Chrome, but Chrome is insanely noisy. I actually evaluated the potential for using that data but couldn't determine how it could be effectively used in ranking.

Barry: What about indirect signals from search traffic, such as pogosticking? Previously, Google has said that they do not use that directly for ranking.

Gary: Yes, we use it only for QA of our ranking algorithms.

Barry: At one point, Jeff Dean said that Google does use them.

Gary: I do not know what he was talking about. The RankBrain team is using a lot of different data sources.
There was a long internal email thread on this topic, but I was never able to get to the bottom of it.

Michelle: Is RankBrain used to validate featured snippets?

Gary: RankBrain is a generic ranking algorithm which focuses on the 10 blue links. It tries to predict what results will work better based on historical query results. The featured snippets team uses their own result algorithm to generate a good result. I have not looked into what that means on their side. RankBrain is not involved, except that it will evaluate the related blue link.

Barry: Featured snippets themselves are fascinating. You said that they are changing constantly. Please explain.

Gary: The context for that discussion was about future developments for featured snippets. The team is working around the clock to improve their relevancy. The codebase underlying it is constantly changing.

Michelle: Does the device being used by the searcher factor in?

Gary: I don't think so.

Schema and markup

Gary: I want to live in a world where schema is not that important, but currently, we need it. If a team at Google recommends it, you probably should make use of it, as schema helps us understand the content on the page, and it is used in certain search features (but not in rankings algorithms).

Michelle: Why do you want to be less reliant on it?

Gary: I'm with Sergey and Larry on this. Google should have algorithms that can figure out things without needing schema, and there really should not be a need for penalties.

Michelle: Schema is being used as training data?

Gary: No, it's being used for rich snippets.

Michelle: Eventually the algo will not need the schema?

Gary: I hope so. The algorithms should not need the extra data.

Barry: Is there a team actively working on that?

Gary: Indirectly, absolutely. It probably involves some sort of machine learning, and if so, it's the Brain team that works on it. I do not know if they have an active project for that.

Barry: How did you get entity data in the past?
Gary: From Freebase and the Knowledge Graph.

Panda and thin content

Barry: You said that pruning content was a bad idea. If you're hit by Panda, how do people proceed?

Gary: Panda is part of our core ranking algorithm. I don't think that anyone in a responsible position at Google thinks of Panda as a penalty. It's very similar to other parts of the algorithm. It's a ranking algorithm. If you do something to attempt to rank higher than you should, it basically tries to remove the advantage you got, but not punish you. Ultimately, you want to have a great site that people love. That is what Google is looking for, and our users look for that, as well. If users leave comments or mention your site on their site and things like that, that will help your ranking. Pruning does not help with Panda. It's very likely that you did not get Pandalyzed because of your low-quality content. It's more about ensuring the content that is actually ranking doesn't rank higher than it should.

Barry: Pruning bad content is advice that SEOs have been giving for a long time to try and help people deal with Panda.

Gary: I do not think that would ever have worked. It definitely does not work with the current version of the core algorithm, and it may just bring your traffic farther down. Panda basically disregards things you do to rank artificially. You should spend resources on improving content instead, but if you don't have the means to do that, maybe remove it instead.

Using disavow

Michelle: Should you use disavow on the bad links to your site?

Gary: I have a site that gets 100,000 visits every two weeks. I haven't looked at the links to it for two years, even though I've been told that it has some porn site links. I'm fine with that. I don't use the disavow file. Don't overuse it. It is a big gun. Overusing it can destroy your rankings in a matter of hours. Don't be afraid of sites that you don't know. There's no way you can know them all.
If they have content, and they are not spammy, why would you disavow them? Sites like this are very unlikely to hurt you, and they may help you. I personally trust the Google filters.

Barry: Penguin just ignores the links.

Gary: Penguin does that, too. (Gary's phrasing implies that there are other algorithms that might filter bad links out, as well.)

Join our social media and CX experts as they explain how social customer service tools can help brands provide winning digital customer experiences. They'll discuss how to manage that experience across multiple social touch points, leverage evolving social customer service tools and platforms to deliver long-term value and act on real-time customer insights to drive social ROI. Attend this webinar and learn:
Register today for "CX in the Age of Social Media," produced by Digital Marketing Depot and sponsored by Lithium.

About The Author
Digital Marketing Depot is a resource center for digital marketing strategies and tactics. We feature hosted white papers and E-Books, original research, and webcasts on digital marketing topics -- from advertising to analytics, SEO and PPC campaign management tools to social media management software, e-commerce to e-mail marketing, and much more about internet marketing. Digital Marketing Depot is a division of Third Door Media, publisher of Search Engine Land and Marketing Land, and producer of the conference series Search Marketing Expo and MarTech.

For those who missed it, Whitespark's overhaul of the US Local Search Ecosystem interactive tool was recently released, and it does a fantastic job of showing how vast and complex the search industry has become. The ecosystem visualizes the web of search engines, data providers, publishers, directories and other businesses that use local data about businesses to power one simple action that people do every day: search online. For example, the infographic identifies Infogroup, Acxiom, Neustar/Localeze and Factual as the primary data aggregators, which collect and validate location data from businesses and share that data with publishers such as Apple, Bing, Foursquare and Google. (I refer to data aggregators and large publishers collectively as data amplifiers because they share a business's location data not just directly with searchers, but also with other apps, tools, websites and businesses that, in turn, reshare that data to people across the digital world.)
In Whitespark's words, the ecosystem "shows how business information is distributed online, who the primary data providers are, how search engines use the data, and how it flows." The interactive tool helps you understand the importance of sharing accurate location data and the consequences of maintaining inaccurate data. For example, because data aggregators influence a web of businesses across the ecosystem, it's imperative that businesses meet the data formatting requirements of each aggregator. And as you can see, the ecosystem is complex:

Local search expert David Mihm originally developed this infographic in 2009, and over the years, the ecosystem has changed dramatically to reflect the rich palette of destinations that people weave together throughout the process of discovery, as well as the number of companies that influence whether a business's location data appears as it should when, say, a searcher finds them on Facebook, Yelp or Uber. A post on the Whitespark blog by Nyagoslav Zhekov dramatizes this evolution, tracing some of the businesses that have joined and departed. For instance, back in 2009, Apple did not even appear on the ecosystem, and Myspace did. In 2017, Apple is one of the principal data amplifiers, and Myspace is not a factor. You can tell from a quick glance at the 2009 version of the infographic how far the industry has grown:

Now, here's the interesting part: as far-reaching as the new infographic is, it's just the tip of the iceberg. The infographic does not come close to identifying all the companies that license business information from data amplifiers or use it as a starting point to build out their own curated business directory.
For instance, a quick glance at the following three lists of local citation sources shows dozens of additional places where business information exists:

Many of the businesses that appear on these lists overlap with those on Whitespark's local search ecosystem, and they have the same role: receiving and sharing location data that influences which locations appear in search results. But many names on the top citations lists didn't make the cut and are not part of the infographic. Why? Because of two factors that influence each other:
The 2017 local search ecosystem is a brilliant foundation to get businesses grounded in the most influential sources of location data. But as the above examples demonstrate, the scope of location data companies far exceeds the Whitespark infographic. Put another way: Consider each wedge on the infographic to be a gateway to even more specialty sites by category. The scope of location data directories, publishers and aggregators can seem overwhelming. But if you manage multiple brick-and-mortar storefronts, don’t despair. You need not have a presence on every directory on the lists I’ve cited. It’s far more important to focus your efforts on building relationships with data amplifiers. When you share your data with the core aggregators and publishers, you create two advantages for yourself:
Understand the scope and richness of the location data ecosystem. Make sure you are constantly optimizing your data and content to be found everywhere. And let the data amplifiers help you succeed across the ecosystem.

About The Author
Adam Dorfman is the Senior Vice President of Product & Technology at SIM Partners, where he leads the teams responsible for the best-in-class local automation platform Velocity. Follow him on Twitter @phixed.

Video is booming as a content marketing medium. People love watching videos online, and producing great video content is quickly becoming one of the most surefire ways to command attention and grow a following. In fact, by 2019, video is expected to drive an astonishing 80% of all internet traffic. Clearly, it's important for businesses to start working on their video content sooner rather than later. And while producing great content is essential, that's only half the battle. For your videos to benefit your business, people have to be able to find them, and that involves optimization. So which video search engines should you focus on optimizing for? This article will explore the differences between YouTube and Google Videos, the two biggest video search engines on the web. Keep reading to learn more about the types of traffic these search engines will bring you – and why your videos might rank well in one but not the other.

How do people find your videos?

There's no shortage of video search engines and video hosting sites on the Internet. YouTube, of course, is the web's video giant, with 300 hours of new video uploaded every minute. Other video hosting sites like Daily Motion and Vimeo also get a significant amount of traffic.
Social media sites like Facebook, Instagram, and Snapchat incorporate short video into their platforms as well. Social videos are gaining steam, and they may become a threat to YouTube in the future. For now, though, YouTube still dominates the online video world the way that Google dominates other search engines. And while plenty of video searches happen through Google, most of them return YouTube videos. If you produce video content, there's a good chance your watchers are finding you either through YouTube's built-in search function or through Google Videos searches. Google Videos returns mostly (but not exclusively) results from YouTube. This search for "video content marketing" also returned a video from lynda.com.

Comparing YouTube and Google Videos searches

If you search for the same keyword on YouTube and Google Videos, how similar will your results be? Not that similar, as it turns out. Take a look at the following example. Here are the first few results from a Google Videos search for "how to improve video SEO":

[Image: The top Google Videos results for the query "how to improve video SEO"]

And here are the first few results for the same query on YouTube:

[Image: The top YouTube results for the query "how to improve video SEO"]

In this case, there's no overlap at all between the top four results. Clearly, these two search engines don't use the same criteria for ranking videos. "Wait a minute," you might say. "Doesn't Google own YouTube?" Yes it does. In fact, Google has owned YouTube for more than ten years. However, the two sites serve distinct purposes. Someone who visits YouTube probably isn't looking for the same thing as someone who types a question into Google. Thanks to this difference in user intent, Google Videos and YouTube don't use the same algorithms to rank videos, so it makes sense to think about them as two different search engines.
Why YouTube and Google Videos display different results

Earlier this year, Stone Temple released a study that found that YouTube and Google Videos return different top results for the same query more than half of the time. In fact, the more YouTube results show up in a Google Videos query, the more dramatically Google's results differ from YouTube's.

[Image: Stone Temple found that the more YouTube videos appear in Google Videos results, the more results for that query vary between the two search engines.]

The study goes on to explore the reasons behind these differences. In a nutshell, it comes down to both user intent and monetization.

Google as a video search engine

Specific searches
When someone goes to Google, they tend to be looking for something specific. They want to find out how to do something, track down a particular fact, or research the difference between several options. Google is most often used as a tool for finding other things, not as a medium in itself.

Immediate resources
The videos Google displays tend to be to-the-point and useful. Google's video results tend to favor how-tos and other specific, immediate resources. Videos made for entertainment purposes are probably less likely to rank highly in Google, although of course this is dependent on the search query and the individual video.

Quality results
Google also places a great deal of importance on user satisfaction, since that's what keeps people coming back. Thus, they're likely to favor higher-quality videos over lower-quality ones, even if the creators of those lower-quality videos are bidding higher in AdWords than their competitors. Of course, "quality" is a vague and somewhat subjective metric, and Google is famously tight-lipped about how their algorithm determines quality. The important thing to understand, though, is that Google won't sacrifice good results for more ad money.

YouTube as a video search engine

Entertainment-focused
On the whole, people go to YouTube to find entertainment.
Google wants to solve people's problems and send them on their way as quickly as possible, but YouTube wants to keep users watching. This is partly because view time is an indicator of a video's quality. If people stick around and watch a whole video, it's a good sign that that video is interesting, useful, or entertaining. View time also tends to be correlated with user satisfaction. People who find and watch lots of enjoyable, high-quality videos will probably keep coming back to YouTube.

Longer videos favored
For YouTube, view time is also linked to making money. The longer someone watches a video, the more ads YouTube gets to show them. This is also why YouTube tends to favor longer videos over shorter ones in its rankings.

These differences shed some light on why Google Videos and YouTube use different algorithms, but unfortunately, we still don't know exactly what the differences between those algorithms are. Considering how closely Google guards its secrets, we're not likely to find out anytime soon, either. In the meantime, though, it's important not to forget that the two search engines often have a lot of overlap in their results, even though they're not exactly the same. Thus, it stands to reason that there are some general principles for ranking well in both places.

How to rank well on video search engines

First, and most obviously, create great content. Your bounce rate says a lot about the quality of your videos. If a lot of people hit the "back" button within the first ten seconds of a video, YouTube and Google will both assume it's not very good. So do your best to start each video with a compelling opening, and then give people a reason to keep watching. Include plenty of text-based information with your video. Search engines can't watch a video and determine what it's about, but they can read the accompanying text. Your title is important – it should be descriptive and use your main keyword, preferably at the beginning.
Take the time to write an in-depth description of your video as well. Captions and transcripts aren't necessary to include, but they improve accessibility, and they could give you a keyword boost. Finally, tag your video with some useful and relevant tags. Getting views and comments will help your rankings, but don't be tempted to purchase these. YouTube has gotten smarter about figuring out when views and comments are fake. Promote your content through social media to get more engagement, and be patient – if you do great work, people will discover it in time.

So, which is better: YouTube or Google Videos?

At the end of the day, it's hard to say whether YouTube is "better" than Google Videos, or vice versa. The two search engines tend to be used differently, but both of them are very popular, and both of them are valuable sources of traffic if you optimize your videos correctly. The type of content you create could have an impact on your rankings in each search engine. For instance, if you make short videos geared towards answering specific questions, you might have an easier time gaining traction in Google. If you make longer, more entertainment-focused videos, you might see better results from YouTube. This is far from a hard-and-fast rule, though. The main thing to remember? High-quality videos have a good chance of doing well in both search engines, regardless of other factors. We don't know exactly which metrics Google Videos and YouTube use to determine rankings, but we do know viewers prefer well-made, informative, and entertaining videos. Focus on making the best video content you can, and you'll probably find that your rankings take care of themselves. Have you noticed a difference in your videos' rankings between different video search engines? Share your observations in the comments!

Amanda DiSilvestro is a writer for NoRiskSEO, a full service SEO agency, and a contributor to SEW.
You can connect with Amanda on Twitter and LinkedIn, or check out her services at amandadisilvestro.com.

As we approach Halloween and our Netflix queues again fill up with all manner of spooky, startling and downright horrifying monsters, I'm reminded of another kind of monster we should all be afraid of: outdated SEO tactics. These tactics range from harmless but ineffective (like Casper the Friendly Ghost) all the way to completely devastating (like Freddy Krueger). And much like the bad guy in so many of the horror movies we all grew up watching, these tactics never seem to die, despite common sense, SEO professionals, and even Google warning people away from them. So today, we're going to delve into 13 outdated SEO tactics that you should be terrified of and avoid at all costs.

1. Link and article directories

Link directories are generally useless today, with the exception of high-quality, niche-specific directories that follow strict editorial guidelines. Long before search engines were as powerful and effective as they are today, link directories served as a way to categorize websites so that people could find what they were looking for.
Thanks to the simplicity of installing and using the software that powers them, marketers’ insatiable appetite for fast and easy links, and website owners’ hunt for additional revenue streams, link directories exploded in popularity. But, since they didn’t provide any real value to visitors, search engines began to ignore many of these link directories — and they quickly lost their effectiveness as a link-building tactic. Eventually, link directories became a toxic wasteland of low-quality links that could actually get your website penalized. Article directories are even worse. What started off as a way to share your brilliant insight with a larger audience while earning links quickly became a tactic ripe for abuse. Marketers began using software to “rewrite” their articles and submit them to thousands of article directories at a time. As with link directories, article directories — now bloated with low-quality content — simply hit a point at which they provided no value to visitors. Marketers just used them for fast and easy links. Indeed, the glut of low-quality content flooding the web through these article directories appeared to be the proverbial straw that broke the camel’s back right before the release of Google’s Panda update, which decimated countless websites. With the exception of high-quality, niche-specific link directories — and you may only find one or two in any given industry — you should avoid link and article directories entirely.

2. Exact-match domains

For a while, exact-match domains (EMDs) were a hot topic because they became a silver bullet for search engine optimization. It was easy to throw up a microsite on an exact-match domain and rank far more quickly than a traditional, branded domain — often in weeks, sometimes in days. With an EMD, your domain matches the exact keyword phrase you’re targeting. For example:
But much like a werewolf when the full moon wanes, EMDs quickly lost their power as Google adjusted their algorithm. Exact-match domains have the potential to rank as well as any other domain, but they also seem to have a higher potential to be flagged for spam, either algorithmically or manually. They become an even riskier proposition when you consider that they generally aren’t as “brandable,” and as a result, the domain will generally be viewed as less trustworthy, which can reduce conversions and make link building more difficult.

3. Reciprocal linking

Search engines view a link to another site as a “vote” for that site — so reciprocal linking is essentially saying, “If you vote for me, I’ll vote for you.” This is the very definition of manipulative linking practices, yet that didn’t stop millions of marketers from blindly trading links, even with websites that had zero relevance to theirs. Worse yet, rather than links embedded within valuable content, these links were often simply dumped on a “links” or “resources” page, sometimes broken into categorical pages, along with hundreds of other links, offering no value to visitors. This tactic, though ineffective today, still stumbles slowly along like a putrid and rotting zombie, more than a decade after its death.

4. Flat URL architecture

This isn’t really a “tactic” as much as it is just the default way WordPress is set up, and most people don’t know that they need to change it. Ex. 1: http://ift.tt/1f1fzRH vs. Ex. 2: http://ift.tt/2gOSNBS A flat URL structure (Ex. 1) makes it more difficult for search engines to understand the hierarchy of your website because all of your pages are treated with the same level of importance, while a faceted or nested URL structure (Ex. 2) clearly communicates each page’s importance relative to every other page on your site. The first step is to change your default permalink settings.
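To make the flat-to-nested idea concrete, here’s a small Python sketch. The slugs and category names below are invented for illustration; the point is that every post URL gains a category segment, and every old flat URL needs a permanent (301) redirect to its new home.

```python
# Hypothetical example: made-up slugs and categories, not from a real site.
# Each blog post is assigned to a category that mirrors a second-level page.
posts = {
    "kitchen-remodeling-trends": "remodeling",
    "choosing-a-contractor": "hiring",
    "bathroom-renovation-costs": "remodeling",
}

def nested_url(slug: str, category: str) -> str:
    """Build the faceted URL: /<category>/<slug>/."""
    return f"/{category}/{slug}/"

# Redirect map: old flat URL -> new nested URL. These are the 301
# (permanent) redirects you would feed to a redirect plugin or server config.
redirects = {f"/{slug}/": nested_url(slug, category)
             for slug, category in posts.items()}

for old, new in sorted(redirects.items()):
    print(f"301: {old} -> {new}")
```

In WordPress terms, `nested_url` corresponds to switching the permalink structure to include the category, and the `redirects` map is what you are forced to build if you change the structure after publishing — which is why it pays to settle this early.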
Then, if you haven’t already, publish your second-level pages, and create corresponding blog categories; or, if they already exist, move them and set up any applicable redirects. The slugs for your categories must exactly match the slugs for your second-level pages. This little detail is critical because it determines how search engines will value each page of your website relative to the others. Once properly configured, each third-level page and blog post will appear as a sub-page of the applicable second-level page based on the blog category it is assigned to. In other words, each third-level page/post adds more authority to the page it appears nested under. It’s important to think this through thoroughly because changing it later means having to redirect all of the pages in your website and potentially losing ranking.

5. Indiscriminate guest blogging

Contrary to what some people claim, guest blogging is far from dead. However, it has changed dramatically. To fully understand the context, it’s important to understand the evolution of guest blogging over the years. Guest blogging has roots in traditional public relations. The basic premise is that you’re trying to leverage a larger, existing audience by publishing your article on an established publication. This helps you to:
In the early days, you would seek out publications for guest posting opportunities based on the size and, more importantly, the relevance of their audience. The intent was to get in front of more of the right people, and this involved writing killer content that their audience would find valuable, which would usually include a short bio, and maybe even one or more links back to your own website. Website owners attempting to keep Google happy by constantly adding fresh content were all too eager to publish a steady stream of posts from guest authors, and because links are the lifeblood of SEO, people quickly latched onto this tactic to build links and sucked the life out of it like a ravenous vampire. Marketers soon began submitting guest posts to any website that would accept them in an attempt to acquire a link. Your website is about construction? Great! Let me submit an article on construction trends, along with a bio that includes a link back to my crochet website — relevance be damned! The next predictable step was that many marketers began submitting completely off-topic articles, and website owners eagerly published them. This is why we can’t have nice things. Google understandably showed up like a mob of angry villagers with pitchforks and torches to put an end to this nonsense and, as they often do, created a lot of collateral damage in the process. Websites were penalized, and while some took years to recover, many never did, so their owners had to start over on a new domain. A lot even went out of business. For a while, people shied away from guest blogging, but today, it’s returned to its traditional roots.

6. Keyword stuffing

Back when search engines were only capable of interpreting simple signals, like keyword density, stuffing keywords by the truckload into a web page to make it seem more relevant was all the rage. What should have been just a few instances of a particular phrase sprinkled throughout a web page grew faster than a zombie outbreak.
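For the curious, “keyword density” was exactly as crude as it sounds: a simple ratio of keyword occurrences to total words. Here’s a toy Python sketch (the stuffed copy below is invented) of the kind of signal early engines leaned on, which is precisely why it was so easy to game:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Fraction of words in `text` that belong to occurrences of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == target)
    return hits * n / len(words) if words else 0.0

# Deliberately stuffed copy (invented example):
stuffed = ("Tampa contractor services from the best Tampa contractor. "
           "Call our Tampa contractor team for Tampa contractor quotes.")
print(f"density: {keyword_density(stuffed, 'Tampa contractor'):.0%}")
# prints "density: 47%" — nearly half the words are the target phrase
```

A page like this reads terribly to a human, which is the whole problem.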
This doesn’t work — and more importantly, it makes it look like you employ drunk toddlers to write your copy, which doesn’t do much to inspire trust in your company.

7. Exact-match anchor text

At one point, anchor text — the clickable text of a link — was a huge ranking factor. For example, if you wanted to rank for “Tampa contractor,” you would have tried to acquire as many links as you could using Tampa contractor as the anchor text. Marketers predictably abused this tactic (seeing a trend yet?), and Google clamped down on it and dropped the rankings of websites with what they deemed to be unnatural amounts of keyword-rich anchor text backlinks. The anchor text distribution for a natural link profile will generally have a lot of variety. That’s because if 100 different people linked to the same page on your website, each link would likely be used in a different context within their content. One person might link to your web page using anchor text that describes the product (“blue widgets,” for example), while another may link using anchor text that describes the price, and yet another might even link using nondescript anchor text like “click here” or something similar. Below is an example of the anchor text distribution for Search Engine Land. The majority of your anchor text will not be an exact match to the keyword topics you’re targeting unless they are part of your brand or domain name. And this is OK because today, rather than anchor text, Google places more emphasis on:
I wouldn’t put too much effort into controlling the specific anchor text that others use to link to your website — it’s a waste of time, and it can potentially harm your ranking if you go overboard and create an unnatural pattern. The majority of anchor text for most websites with a natural link profile will generally be for branded terms anyway.

8. Pages for every keyword variation

Keyword phrases, in the traditional thinking, are dead. The old approach involved creating a separate page for every keyword variation, but fortunately, search engines are a lot smarter today, so this isn’t necessary. Google’s Knowledge Graph, based on latent semantic indexing, started to kill off traditional thinking, but RankBrain drove a stake into its heart. Today, websites that still follow this antiquated tactic perform a lot like the zombie hordes you see mindlessly wandering around in a George Romero movie in search of fresh brains to devour. RankBrain is just a catchy name for Google’s machine-learning artificial intelligence system (Skynet was already taken, apparently) that helps it to better understand the user intent behind a query. It can even help Google to (appropriately) rank a web page for keyword phrases that aren’t in the content! This means that if you write content for a page about HVAC services, RankBrain understands that it would also very likely be a good match for a user entering any of the following queries:
If you’ve created individual pages for each keyword variation in the past, you may be tempted to leave them and just stop doing it in the future, but that’s not enough. You need to prune the unnecessary pages, merge content that can be merged, and create any applicable 301 redirects, because these unnecessary pages will have a negative impact on how Google views your website, and how often and how thoroughly it is crawled. So, instead of creating an individual page for every keyword phrase you want to rank for, create a more comprehensive page for a keyword topic. Using the HVAC example we mentioned earlier, this would involve creating a page about HVAC services, along with a subheading and content for each of the additional highly related phrases.

9. Paid links

Paying for PageRank-passing links has been a clear violation of Google’s webmaster guidelines for a long time, but like the machete-wielding villain at Camp Crystal Lake, this one simply refuses to die. I take a pragmatic view of buying links: They can work to improve your ranking in the short term, but you may eventually get caught and penalized, so is it really even worth it? You might think you can be really careful — buy just a few links to get some traction and stay under Google’s radar — but that’s not going to happen. They are always hunting for both link buyers and link sellers, and it’s shockingly easy because all they have to do is follow the links. You might be thinking, “Pffft… I know what I’m doing, Jeremy! I’m careful when I buy links!” Sure you are. But can you say the same thing about the site owners you buy the links from? Or everyone else who buys links from them? Let’s say Google catches one link buyer by identifying an unnatural pattern of inbound links — all they need to do next is evaluate the outbound links of anyone linking to that buyer to identify more link sellers. In turn, that will uncover more link buyers, which again uncovers more link sellers.
See how fast it all goes downhill? So just don’t buy links.

10. Low-quality content

I recently gave a presentation on digital marketing to a group of franchisees of a large national brand. While discussing the type of content they should be producing for their websites, one of the franchisees said in frustration, “I can’t write articles for my website — it takes too much time and effort just to do what I’m doing now!” Effective SEO requires you to regularly produce amazing content — which is, understandably, difficult for time-strapped marketers. A lack of time and resources can often lead to rushing content creation, or worse yet, outsourcing it to non-English speakers or budget services like Fiverr or Upwork. The resulting content is often the text equivalent of the unintelligible grunts from Frankenstein’s monster. The days of simply producing content just for the sake of publishing something are, fortunately, far behind us thanks to Google’s Panda update in 2011. Since then, Panda has been further refined and worked into Google’s core algorithm. Your content should be robust, well-written, accurate and engaging. There is no minimum or maximum ideal length; it just needs to be long enough to serve its purpose. Sometimes that may mean just a few hundred words, and other times, that may mean several thousand words. While we’re on the subject of writing content…

11. Writing for bots rather than people

If you’ve ever seen a web page or an article that repeats a particular keyword over and over, awkwardly forces a keyword phrase into a sentence in a way that doesn’t make sense or incorporates unnecessary heading tags, then you’ve probably seen an example of someone writing for bots rather than people. SEO has come a long way since the early days, when we had to really spell everything out in order for the search engines to understand and rank a page. You don’t need to do that anymore. Write for people, because they will be the ones buying your products or services.
12. Creating multiple interlinked websites

There are two approaches to creating multiple interlinked websites — and neither one is an effective SEO tactic today. The first approach is interlinking several legitimate websites that you own. This is the lesser of two evils because if done properly, it won’t result in a penalty. However, it also won’t have much impact, if any, on your SEO efforts, since search engines place a high value on the number of linking root domains, not just the total number of links. Another black mark against this approach is that it reduces the resources you can direct to marketing your primary website. An example of this being done properly would be when a residential home builder links to a mortgage company that they also own, because there is a high relevance between both websites. The second approach, which is unquestionably black hat, is to create a series of websites just for the purpose of linking to other websites you own. Since this tactic requires you to create an ever-growing network of websites on such a scale that the only way to describe it would be a gremlin pool party, it’s an absolute certainty that you will also create a pattern that Google can identify, which will result in a penalty. Instead of trying to build, manage and market multiple websites just to acquire a few measly links, focus your efforts on earning lots of high-quality links from other legitimate websites. An added benefit is that as those websites become more authoritative, their links to your website will become more powerful.

13. Automated link building

When links became an essential part of SEO, marketers predictably sought ways to maximize their link-building efforts using a variety of automated software programs.
They blasted their links into guestbooks, blog comments and forums, submitted their websites to bookmarking services and link directories, and spun poorly written articles by the thousands for submission to every article directory they could find. I’m all for automating certain tasks to improve efficiency within your business, but link building is not one of them, because the only kind of links that can be built this way violate Google’s webmaster guidelines. You can call me a purist, but there is simply no way to automate high-quality link building. That requires creating amazing content and developing relationships to earn links to it. There are no shortcuts.

About The Author

Jeremy Knauff is the founder of Spartan Media, a digital marketing agency in Tampa, Florida. He's also a proud father, husband, and US Marine Corps veteran. After 18 years in the digital marketing industry, he's learned a thing or two, and today, while still serving his clients, he's working to share his knowledge with the industry to help even more people.

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land:
Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:
Search News From Around The Web:
The post SearchCap: Bing Ads grows, Amazon ad revenues & PPC audits appeared first on Search Engine Land.

In a keynote presentation at SMX East in New York City last week, Scott Brinker, aka “Chief MarTech,” laid out nine reasons he believes search marketers are poised for leadership as marketing functions become increasingly technology-dependent. Search marketers, of course, employ any number of tools and technologies in their work, and the industry has spawned hundreds of products and solutions. Brinker outlined how the work of search marketers touches 22 of the 49 categories he has identified in the Marketing Technology Landscape infographic he has been compiling to track the growth in marketing technology companies. Brinker, program chair for the MarTech Conference series and VP of platform ecosystem at HubSpot, highlighted the core functions of search marketing — testing, analysis, conversion optimization and so on — that encompass the overlap of marketing, technology and management. With more and more companies creating the role of chief marketing technologist, Brinker says search marketers have long been on the cutting edge of this growing trend.

About The Author

As Third Door Media's paid media reporter, Ginny Marvin writes about paid online marketing topics including paid search, paid social, display and retargeting for Search Engine Land and Marketing Land. With more than 15 years of marketing experience, Ginny has held both in-house and agency management positions. She provides search marketing and demand generation advice for ecommerce companies and can be found on Twitter as @ginnymarvin.

If you run a PPC agency, you’ll know it’s not that unusual for clients to occasionally bring in an outside auditor to review their PPC accounts.
Sometimes, your client will let you know in advance; sometimes, you’ll find out when you see a request to access the account. And sometimes, you won’t find out until after the fact, when the final report is forwarded to you for discussion! I completely understand why some clients like to have an outside audit of their PPC accounts. For some companies, it’s simply part of their due diligence. For others, an executive will come up with the idea and push it through. And for some, it’s impossible to resist the allure of a “free” audit. I can also understand why clients might hesitate to inform their PPC agency of their decision. They might feel embarrassed or uncomfortable about the situation. Or they may feel ambivalent about the audit itself. In some cases, it may be that the client doesn’t trust the agency not to do some quick “fixes” in anticipation of the audit. (Although I have to say, if you don’t trust your agency enough to let them know of the audit in advance, you definitely shouldn’t trust them to run your campaigns!) But whatever the situation, external audits are something that PPC agencies have to expect. So what’s it like to go through one? And how could the process be improved? Today, I’m going to tell you about a recent external audit one of our clients initiated and some of the issues the process raised.

When your client brings in an external auditor

In this case, our client let me know up front that they were bringing in an external auditor, which I appreciated. But at the same time, I was rather surprised, too. This was an account we’d held for about five years, and we had good communication with them. Moreover, we’d gotten them some excellent results, and everyone seemed very happy all round. As we learned later, the audit came about because a different executive in the company had been approached with the offer of a free PPC audit, and he felt the company had nothing to lose. So they agreed to it.
Meanwhile, my contact at the company reassured me that they were happy with our work. She said they had worked with “good” and “bad” agencies before and knew the difference. She also recognized that the outside auditor wasn’t entirely neutral in this process. (Was this free audit a marketing strategy by the auditor? We weren’t sure. But assuredly, any “free” audit has strings attached.) At the same time, I reminded myself that my agency had never lost a client due to an audit (knock wood!). More importantly, we had nothing to hide, and I had total confidence in my team and our work. And who knows? Maybe the report would have some helpful recommendations. Having a fresh set of eyes on an account is never a bad idea. Besides, how detailed would a “free” audit be? A few days later, my client presented me with the report. And it was huge! It ran about 35 pages and was very detailed and thorough. At first, I was excited. Surely this would yield all kinds of valuable information! But once I started to dig into it, my enthusiasm started to flag. Because as it turned out, the report suffered from two major problems:
Problem #1: A regurgitation of existing data

Unfortunately, the report didn’t contain anything surprising or new. It was mostly a detailed recounting of what was currently happening with the account. And of course, we already knew what was happening with the account. If my client had asked, I could have easily filled her in on account details without going to an outside auditor. And my team and I do make a concerted effort to communicate with our clients. We usually have weekly or bimonthly standing calls with them, and we also provide them with relevant reports. Is it possible that the client was looking for information we weren’t providing? Possibly. But again, if we had been alerted to this need, we would have been more than happy to provide it. (If nothing else, the lesson here is to occasionally check in with the client to see if they want more detailed, or different, reporting.) Much more problematic than the redundancy in the report was its lack of recommendations. The vast bulk of the report was focused on current account status, not suggestions for changes or improvements — which seemed like a lost opportunity.

Problem #2: Incorrect assumptions

Another major issue with the report was that many of its conclusions were based on incorrect assumptions. The auditor lacked the context to clearly understand what was going on with the account. Repeatedly, the auditor found “errors” that weren’t errors at all — which he would have known if he’d had more background information. Without this context, the value of the whole audit exercise comes into question. What kind of information was the auditor lacking? I can think of four specific areas the auditor should have inquired about before even logging into AdWords:

1. What is the company’s business? What are its goals?

Whenever we land a new client, we ask the owner or marketing team to complete an onboarding questionnaire. The questionnaire allows us to better understand their business and its goals.
It only seems logical that an auditor would go through a similar process. After all, how can you audit a PPC account when you know little about the company? We can also extend this “context for understanding” to PPC tools. Not everything happens in AdWords. In this case, my team and I were using Google Analytics for some of our tracking, and the auditor missed this point completely.

2. What tests are the agency currently running?

As an agency, we use labels religiously to clarify what we’re doing in client accounts — especially in terms of testing. But not all agencies do. And even so, it can be impossible to capture the complexity of these tests in one little label. Auditors would need to get more detailed information outside of the account to fully understand what’s being tested and why. For example, we were in the process of testing the “optimize for clicks” setting on some of our client’s campaigns. Of course, the auditor saw this setting selected and immediately marked it with a big red “X” in the report. We knew (and the client knew) why we were testing this setting. But the auditor didn’t — and therefore he spent several paragraphs explaining why this isn’t an optimal setting in most cases.

3. What strategies and tactics have been tried in the past and haven’t worked?

Similarly, it would be helpful for the auditor to know what things we’ve tested in the past — and the results. For this particular client, the auditor noted that we didn’t have any non-branded keywords live. Why? Because the nature of this client’s business is seasonal. And in the past, we had heavily tested non-branded keywords in peak season, with disappointing results each time. This year, we decided (in consultation with the client) to ditch non-branded keywords during peak season and expand our Google Display Network efforts instead. The result: a major success! But of course, the auditor didn’t know any of this.
So he marked another big X and wrote a few more paragraphs explaining why non-branded keywords are important.

4. What projects are slated for testing in the next quarter or two?

As with all our clients, we had plans in place for testing over the next few months, including device adjustments and audience tests. But again, the auditor wasn’t aware of these plans. When he noted their absence, he assigned more red Xs and gave more lengthy explanations for why they should be done. But we knew that already.

Make your audit worth your time

Based on this experience, I can only conclude that audits can eat up a lot of hours. The client had to spend time arranging for the audit and reviewing the report. I had to spend time reviewing the report and responding to the findings. And I can only imagine how many hours the auditor spent auditing the accounts and writing his report. Therefore, we can conclude that even a free audit comes at a cost. So if you decide to move forward with one, whether free or not, make it worth your time by ensuring that the auditor has answers to the questions outlined above. And suggest that they put more emphasis on making recommendations than recapping current status. Hopefully, by putting these pieces in place, you’ll end up with an accurate and valuable final report — one that doesn’t immediately get filed in the circular folder.

About The Author

Pauline Jakober is CEO of Group Twenty Seven, a boutique online advertising agency specializing in the Google AdWords and Bing Ads networks. As a Google AdWords Certified Partner, Jakober and her team practice cutting-edge paid search strategy and management for clients across many industries.