
Thursday 24 December 2015

These reasons could be why your website has been penalized by Google




Google’s on an uncompromising mission. It wants to give its users access to accurate information, unique content and the finest writers. It continually tweaks and improves its algorithms so that the best of the web gets the exposure it deserves.
Unfortunately, there’s a flipside: a penalty. That’s the consequence of Google taking issue with something on your site. Sometimes a penalty is well deserved, but even if you know you’re in the wrong, you probably want to do something about it.

What Is a Google Penalty?

Google has been changing its ranking algorithms since December 2000. That’s when it released its toolbar extension. At the time, the toolbar update represented a sea change that would create the SEO industry as we know it. In fact, it was the first time PageRank was published in a meaningful or usable form.
Over the next decade-and-a-bit, Google continued to refine the quality of its search results. Over time, it began to eliminate poor quality content and elevate the good stuff to the top of the SERPs. That’s where penalties come in.
The Penguin update was rolled out in 2012. It hit more than 1 in 10 search results overnight, wiped some sites out of search entirely, pushed poor quality content off the map and forced optimizers to think much more carefully about their content strategy. Since then, SEO professionals have been very tuned in to Google’s plans, fearing the next update in case it results in a penalty for a site they’re working on.

Recognizing a Penalty

Penalties can be automatic or manual. With manual penalties, you’ll probably be told, but you may not always know you’ve been targeted if the cause is algorithmic. Those penalties may take even the most experienced SEO professionals by surprise.
For algorithmic penalties, here are some sure-fire clues.
  • Your website is not ranking well for your brand name any more. That’s a dead giveaway. Even if your site doesn’t rank for much else, it should at least do well on that one keyword.
  • Any page one positions you had are slipping back to page two or three without any action on your part.
  • PageRank for your site has inexplicably dropped from a respectable two or three to a big fat zero (or a measly PR of one).
  • The entire website has been removed from Google’s cached search results overnight.
  • Running a site search – site:yourdomain.com keyword – yields no results.
  • Your listing – when you eventually find it in Google – is for a page on your site other than the home page.
If you see one or more of these factors, you can be pretty sure that a penalty has affected your site.

Why Has Google Penalized My Site?
Google is continually tweaking and revising the way it indexes content.
While it does publish clues about its algorithm updates, it rarely comes clean about all of its reasons for changes. Fixing things can be tough.
To get you off on the right track, here’s the part you’ve been waiting for: 50 common reasons for Google taking issue with your site. While we’re not saying we know the definite reasons for a penalty, we do know that these factors all contribute.

  1. Buying links. Some swear they’ve never been caught, but the evidence is mixed. Buying links could certainly be seen as an attempt to manipulate PageRank, and therein lies the controversy. If you’ve been buying bad links (and lots of them), your actions could have caught up with you.
  2. Excessive reciprocal links. Swapping links was once an innocent marketing tactic until it started to be abused. If you’ve been exchanging lots of links with clients, it could be seen as a manipulation attempt.
  3. Duplicate content. Hopefully this one’s obvious: any duplicate content on your site makes it less useful in Google’s view, and that could result in a penalty. Make sure your content is unique and well-written; use tools like Copyscape and CopyGator too.
  4. Overusing H1 tags. Correctly structuring content helps with SEO. The H1 tag helps Google to understand what the page is about. Excessive H1 tags could be seen as an attempt to pump Google’s listing with keywords.
  5. Internal 404s. Google wants to know that you tend to your content and weed out any errors and problems. If you’re delivering 404s inside your own website, it’s a sure fire signal that your users aren’t getting the information they ask for.
  6. Links from sites in another language. This one seems unfair, right? You’ve got a legitimate link from a client in another country, yet it’s technically counted against you. Well, Google’s reasoning is sound: users generally tend to prefer one language, so linking to sites in another language isn’t that useful for them.
  7. Keyword stuffed content. There are all kinds of weird and wonderful ‘rules’ about keyword density in content. The truth is that none of these rules are proven, and a very high keyword density is a flag for poorly written content. If Google detects a weirdly high number of keywords in a page, it may penalize you – rightly or wrongly.
  8. Footer links. Some web designers use footer links as a navigational aid; some try to manipulate PageRank by using the footer as a place to pass link juice unnaturally. There’s a short discussion about this on Moz.
  9. Missing sitemap data. Google uses the XML sitemap to parse your site’s structure and learn how it’s put together. Make sure your XML sitemap is available and up-to-date, and then submit it in your Webmaster Tools account.
  10. Hidden links. All of the links on your site should be visible and useful to users. Anything that’s hidden is considered suspicious. Never make a link the same color as the background of a page or button, even if you have an innocent reason.
  11. Broken external links. If you don’t keep links up-to-date, Google will assume you don’t care about the user experience and are happy to pack visitors off to various 404 error pages. Check links periodically and pull the duff ones.
  12. Scraped content. Sometimes website managers pull content from other sites in order to bulk out their own pages. Often, this is done with good intentions, and it may be an innocent error. But Google sees this as pointless duplication. Replace it with your own original content instead.
  13. Hidden content. Less ethical optimization tactics include disguising text on a page to manipulate the theme or keyword weighting. It goes without saying that this is a big no-no.
  14. Anchor text overuse. Once upon a time, SEO experts worked on linking certain keywords in order to reinforce their authority. Since the 2012 Penguin update, the over-use of anchor text linking is strongly discouraged. Switch out your forced, unnatural keyword links for honest links phrased in real English.
  15. Neglecting hreflang. Neglecting what now? ‘Hreflang’ is designed to notify Google that you have intentionally published duplicate content for different languages or localities. The jury’s out as to whether it really helps, but using it can’t hurt in the meantime.
  16. Website timing out or down. When a website goes down, everyone gets upset: the visitor, the webmaster and the search engine. If Google can’t find your site, it would rather de-index it than keep sending visitors to a dead end.
  17. Keyword domains. While domain names aren’t that risky in themselves, domain names with keywords in might be. Consider the anchor text linking issue: if we repeatedly link to that domain, Google might see that as anchor text manipulation. If you do use an exact match domain, make sure it has plenty of great content on it, otherwise Google will assume you’re trying to fool people into clicking.
  18. Rented links. Some experts still believe rented links are valid and useful for SEO. They pay for them on a monthly basis and change them around occasionally. However, we’d consider them paid links, and so would most of these experts on Quora.
  19. Using blog networks. As far as Google is concerned, any kind of network is a sign of potential SERP manipulation. Most blog networks have now shut down or given users the chance to delete all of these incoming links. You should too.
  20. Affiliate links all over the place. Google isn’t necessarily opposed to affiliate websites, but a high number of affiliate links is a red flag that the content may not be up to scratch. Although it’s possible to mask affiliate links with redirects, Google is wise to this tactic, so don’t rely on it.
  21. Site-wide links. We all need to link pages together, but Google is constantly scanning those links for unnatural patterns. A classic example is a web developer credit in the footer of a page. Don’t just nofollow: remove them entirely.
  22. Overusing meta keywords. Meta keywords have been a topic of debate for some time: they are far too easy to manipulate. If you use them at all, keep it to no more than five per page.
  23. Slow speeds. If your site’s slow to load, your users will get frustrated. Many, many factors affect hosting speeds, so this is quite a tricky problem to assess and troubleshoot. Use a caching plugin or a CDN right away. You could also move your site to a data center closer to your most frequent visitors: that’s a little more involved.
  24. Spun content. Spinning is content theft. It could land you in hot water if the Google penalty doesn’t catch up with you first. Bought some super-cheap articles? Sometimes content is spun by the ‘writer’, so you may not even know about it. If the price was too good to be true, that’s a sign you may have bought spun articles.
  25. Comment spam. Most commenting systems have an automated spam detection system, but some comments still make it through. Keep a close eye on the comments you’re getting. Also, don’t let spam build up; if you don’t have time to moderate it, switch commenting off entirely.
  26. Black hat SEO advice. If you publish information about manipulating SERPs using black hat methods, expect to be penalized. Matt Cutts hinted at this in a video blog.
  27. Hacked content. If your site has been hacked, Google will quickly remove it from SERPs. Act quickly to contain hacking attempts and restore sites from backup if the worst does happen.
  28. Speedy link building. It’s natural to want your new site to rank quickly. Don’t overdo it. Lots of similar links pointing to the same place is a sign of automation. Don’t artificially bump your link velocity: make gradual changes over time.
  29. Spam reports. Google has published an online form for spam site reporting. Your site might have been submitted as a potential source of spam, genuinely or maliciously.
  30. Forum linking. We’ve all used forums awash with signature links. Sometimes there are so many, it can be hard to locate the actual posts. If you add a forum link, use good, natural linking techniques and consider making it a nofollow too.
  31. Hiding your sponsors. Having a sponsor is no bad thing. Plenty of sites wouldn’t exist without them. Don’t try to hide your sponsors, but follow the rules: nofollow sponsor links and make sure Google’s news bot doesn’t crawl pages where those links can be found.
  32. Robots.txt flaws. The robots.txt file should be used to tell search engines how to deal with your site. While there are legitimate reasons for excluding pages from robots.txt, do it sparingly: excessive blocking could be the cause of your penalty.
  33. Links to suspicious sites. Never associate yourself with a website that is doing something ethically or legally dubious. Hacking, porn and malware-ridden sites should be avoided. Also, try to remove links to other sites that have been penalized in the past, assuming you know about it.
  34. Landing pages. Businesses sometimes try to use multiple landing pages in order to improve their position in SERPs. Some companies also try to improve their position by creating lots of one-page websites optimized for a single keyword, then funneling users through to another site. Google considers this kind of thing to be bad practice.
  35. Over-optimization. Google doesn’t like to see too much of a good thing. An over-optimization penalty usually means you’ve gone a step too far in your bid to obsessively out-SEO everyone else in your industry. Cool it and publish some natural content before your rank suffers.
  36. Advertorials. The controversy around advertorial content was perhaps the most well-known of the pre-Penguin 2 debates. An advertorial is basically a page of content riddled with paid links, and often these pages were being used for aggressive manipulation of search results. The most famous example was Interflora: read about its penalty here.
  37. Too many outbound links. When linking to other websites, keep it natural. A high quantity of links is a sign that you’re swapping links with people for the sake of mutual SEO benefit.
  38. Redirection. If you’ve received a penalty on your site, using a 301 redirect could transfer the penalty to a new location. What’s more, the penalty could linger if you remove the redirect later. To be safe, don’t do it.
  39. Error codes. Aside from the obvious 404 error, there are a range of others that Google really hates to see. 302 (temporarily moved) isn’t ideal; if you really must redirect something, use 301. Also, if you see any 500 errors, deal with the root cause as soon as you can. Find invisible errors with this WebConfs HTTP Header Check tool.
  40. Duplicate metadata. Some blogging tools and CMS platforms make it all too easy to create duplicate metadata by accident. While metadata isn’t a cause for a penalty on its own, it can be a sign of a duplicate content issue on your site. In any case, it’s undesirable; try to deal with it.
  41. Malicious backlinks. Your site NEVER deserves this penalty – but it is something you should know about. If you’re really unlucky, an unethical competitor may try to shove your site down the SERPs by getting it penalized. The most common cause is a malicious backlink campaign.
  42. Targeted keywords. Google is waging war against some of the keywords most frequently appearing in spam sites. ‘Payday loans’ is a good example of a keyword that has already been targeted, although some people feel that it could do more. If you legitimately operate in an industry that’s rife with spam, expect to be caught in the crossfire.
  43. Smuggled links. Don’t be sneaky and put links into script files. Google is much better at analyzing scripts and picking out weird links that shouldn’t be there.
  44. Poor mobile websites. Google can normally detect a valid link between your mobile site and your website. If it’s poorly designed, it may not. Make sure the mobile site is sent to a device where the user agent is set to mobile. Matt Cutts also suggests using a separate subdomain.
  45. Few outbound links. Google wants to see content that references other content of a similar standard. If you don’t share the love, it might look like an attempt to attract traffic unnaturally.
  46. Domain has a bad rep. You may have innocently purchased a domain with a bad history, and that could cause you problems when you try to build a new site around it. Unfortunately this is often a dead end street; you may be best cutting your losses and buying another domain rather than throwing more money at the problem.
  47. Content theft. Even if you don’t steal content, someone else could steal yours. This is troublesome, since getting the content removed could involve filing multiple DMCA takedown notices or pursuing sites in court. If you’re penalized for this, try asking Google to remove the stolen content.
  48. Prominent ads. Advertising is OK when treated as a secondary concern. Ads should never dominate the page content or play second fiddle to an article or blog.
  49. Using a content farm. Over the two years since Panda was phased in, it has been considered poor form to buy content from a ‘farm’ (defined as “sites with shallow or low-quality content”). If your content is poorly researched, light on detail or exists mainly to fill up the page, employ a professional to rewrite it.
  50. Beware of quick fixes. Don’t employ anyone that claims to have a magical, foolproof technique that will help to get your site to the top of the SERPs. The only way to rank well is to put in the groundwork over time.
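Several of the reasons above (5, 11 and 39) boil down to the same chore: catching bad HTTP status codes before Google does. As a rough illustrative sketch — not a production crawler — here’s a short Python script, standard library only, that pulls the links out of a page and flags anything answering with a 4xx or 5xx:

```python
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

class LinkExtractor(HTMLParser):
    """Collects every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def is_broken(status):
    """Treat any 4xx or 5xx response as broken."""
    return status >= 400

def check_links(page_url):
    """Fetch a page and return (link, status) pairs for broken links.
    Note: relative links would need urllib.parse.urljoin() against
    page_url before checking; this sketch assumes absolute URLs."""
    parser = LinkExtractor()
    with urlopen(page_url) as resp:
        parser.feed(resp.read().decode("utf-8", errors="replace"))
    broken = []
    for link in parser.links:
        try:
            status = urlopen(Request(link, method="HEAD")).status
        except HTTPError as err:
            status = err.code      # 404, 500 and friends land here
        except URLError:
            status = 599           # unreachable host; flag it too
        if is_broken(status):
            broken.append((link, status))
    return broken
```

Run something like this over your key pages periodically; anything it reports is a link to fix or pull.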
How to Deal With a Penalty
Figured out the cause for your penalty? You’re halfway to fixing it – if it’s fixable at all.
Every problem will require a slightly different solution, but here are some things you can try.
  • Don’t panic. Even massive websites suffer from penalties.
  • Disavow troublesome links. Ask Google not to count troublesome links that are harming your website. Here’s the link to the disavow tool, and here’s Matt Cutts’ advice on the subject.
  • Get some links removed. While disavow is good, it’s not perfect. Put in some legwork and try to get some of the links taken down.
  • Request reconsideration if your penalty was manual.
  • Wait it out. Sometimes it takes Google a while to act on your changes and disavow requests, and then it could take a while for it to re-crawl your site.
In a few cases, it’s better to abandon a site rather than fight a Google penalty: if your domain has been tarnished, there’s little you can do. But most penalties can be fixed with a little effort, some hard work and an ethical approach to rebuilding your site.
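If you do go the disavow route, Google expects a plain text file uploaded through Webmaster Tools, with one URL or `domain:` entry per line and optional comments starting with #. A minimal sketch (the domains here are placeholders):

```
# Asked spamdomain1.example to remove these links on 1/1/2016; no reply
domain:spamdomain1.example
# Individual pages we could not get taken down
http://spamdomain2.example/bad-links.html
```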

Wednesday 28 October 2015

Google's Latest Update: "RankBrain"

What Does It Mean For SEO:

Google announced yesterday that they were adding another layer to their already complicated ranking algorithm. This time, they are focusing on machine learning, and they are calling this update RankBrain.

As I covered in a recent article, machine learning has been making inroads into the search algorithms for quite some time now. Apple has publicly announced their own forays into search and machine learning, and Bing has talked about it extensively.

And at our digital agency we’re always paranoid about changes Google makes to their algorithm. So, we jumped right in when we heard the news and tried to decipher what all of this means for marketers.

But what are we really talking about when we discuss machine learning and the search engine algorithms?




Still Human Run

Even though we’d like to think that companies worth hundreds of billions of dollars, like Google and Apple, must have advanced technologies we mere mortals haven’t even heard of, the truth of the matter is they still have a lot of manual processes and antiquated code wrapped up in their operating systems and search algorithms.

So, while they may not have developed warp drive yet, they have been trying to remove the human element from their search algorithms. The reason is that humans can only process what they know and what they recognize, and gathering data to learn from is a very manual process.

This means that all of the rules that currently govern how the search algorithm works, from link tracking and content crawling to hundreds of other activities, have been written by a human being. And that human being wrote each rule because they recognized a pattern and adjusted the algorithm accordingly.

And while humans are great at recognizing patterns, we’re still very slow at doing it. Thus, the search companies have been enlisting machines to help write and govern the rules that run the algorithms. But machines aren’t naturally great at recognizing patterns either. Yes, they’re much faster than humans at almost everything, but they still struggle to spot patterns and to understand how those patterns fit into the big picture.

RankBrain

So, if a machine can’t run the system on its own, what will RankBrain really do? In short, it will look for patterns that humans have programmed it to look for and make pre-determined changes and adjustments based on the pattern recognition.

In truth, this isn’t true machine intelligence, it’s still a computer – albeit an incredibly powerful one – running a set of predetermined protocols and taking predetermined actions. Now, there may be some level of learning going on in the system, which theoretically would allow RankBrain to evolve its processes gradually over time. But you can rest assured that a human will still be spot-checking that evolution and making sure everything is progressing as it should.

Third Most Important Signal

Google has said that they consider the signals coming from RankBrain to be the third most important signal they base their rankings on. For a full list, you can read this great article on Search Engine Land by Danny Sullivan.

But what exactly is RankBrain tracking and improving? For the most part, it will be looking at intent. It will ask, “What was the intent of this search?”

Meaning that if you searched for “homes in San Diego,” Google’s algorithm already has synonym code in place that will also bring up results matching “houses in San Diego.” But that line of code was written by a human. The hope is that as the billions of searches come in every day, RankBrain will track what users are clicking on and will begin making these correlations without a human having to manually write them into the code.

This will help Google keep up with trending topics, slang, and new queries that it hasn’t encountered before.
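To make that distinction concrete, here’s a toy sketch in Python — entirely illustrative, nothing like Google’s actual code — of a hand-written synonym rule versus one inferred from click data:

```python
# Hand-written rule: an engineer noticed the pattern and coded it.
SYNONYMS = {"homes": "houses"}

def expand_query(query):
    """Rewrite known synonyms so 'homes in San Diego' also matches 'houses'."""
    return " ".join(SYNONYMS.get(word, word) for word in query.split())

# 'Learned' rule: infer a synonym when users who typed one word
# overwhelmingly clicked results containing another.
def learn_synonym(click_log, threshold=0.8):
    """click_log maps (typed_word, clicked_word) -> click count."""
    totals = {}
    for (typed, _), count in click_log.items():
        totals[typed] = totals.get(typed, 0) + count
    learned = {}
    for (typed, clicked), count in click_log.items():
        if typed != clicked and count / totals[typed] >= threshold:
            learned[typed] = clicked
    return learned

print(expand_query("homes in San Diego"))  # -> houses in San Diego
print(learn_synonym({("flat", "apartment"): 9, ("flat", "flat"): 1}))
```

The point of the second function is the article’s point: once enough click data accumulates, the mapping writes itself instead of waiting for an engineer to notice the pattern.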

What Does This Mean For SEO?

Honestly, not a whole lot just yet. This is an algorithm update that helps the users and gives them a better experience. But it doesn’t really change how marketers should be tackling their SEO. They should still be focusing on well-designed pages that load quickly and give a great user experience. Content is still as important as ever. And every other SEO best practice should still be followed.

However, in the future, the increase in machine learning technologies will make the algorithm more agile and will begin to reward the marketers who are doing SEO the right way more and more. So, it’s a great day for people who practice SEO the right way and often get frustrated that old spammy tactics still produce results for their competitors. It’s just a matter of time before the machines are fully integrated and then finally they’ll be able to stay ahead of their human counterparts.

Article Source 

Friday 25 September 2015

Best SEO Interview Questions 2016



1- If a page includes more than one rel="canonical", Google will... - Ignore them all.
2- If the robots <META> tag is missing, the default is: - INDEX, FOLLOW
3- Which of the following links is likely to pass the most value from a single page? - A link contained in the main body text
4- If you want a page to pass value through its links, but stay out of the search engines' indices, which of the following tags should you place in the header? - NOINDEX, FOLLOW
5- Which HTTP status code is best to serve when your site is down for maintenance? - 503
6- To give credit for duplicate content that appears on another site, use... - A cross-domain canonical tag
7- Which Google update was closely associated with speed, faster indexing and fresher web results? - Caffeine
8- The de-facto version of a page located on the primary URL you want associated with the content is known as: - The canonical version
9- What is the maximum number of URLs typically allowed in an XML sitemap file? - 50,000
10- What are valid reasons why your webpage's title may not appear in Google's search results exactly as it does in the page title element in your HTML? - Your title does not contain your brand name or other key terms in the user's search query (or doesn't include them at the start of the title element), so Google is using text from elsewhere on the page.
11- Which of the following types of sitemaps is NOT supported by Google? - Product type
12- If these URLs have the same content, example.com/avocado and example.com/avocado/ are technically considered duplicate content and should be fixed. - True
13- Which type of link has NOT been mentioned by Google as being risky for SEO? - Text link advertisements with rel="nofollow"
14- True or false: rel="canonical" can be used to point to content on a different domain. - True
15- The X-Robots-Tag should be located: - In the HTTP headers
16- Which of the following statements about anchor text is true? - When an embedded image is linking, the alt attribute of the image may be treated as anchor text.
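Several of these answers refer to specific tags. As a quick reference, this is roughly how they look in practice (the URL is a placeholder):

```html
<!-- rel="canonical": the de-facto version of this content (Q8);
     also valid cross-domain (Q14) -->
<link rel="canonical" href="https://example.com/avocado/">

<!-- Stay out of the index while still passing link value (Q4);
     omitting the tag entirely defaults to index, follow (Q2) -->
<meta name="robots" content="noindex, follow">

<!-- The equivalent X-Robots-Tag lives in the HTTP response headers,
     not in the HTML (Q15):
     X-Robots-Tag: noindex -->
```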

Friday 19 June 2015

Frequently asked Questions About Facebook Advertising



1) Which of these statements is true when it comes to Facebook’s Cover Photo?
a. Covers can’t be deceptive, misleading, or infringe on anyone else’s copyright.
b. The text on the cover photo must be less than 20%
c. You cannot use call-to-actions in the cover photo
Answer: A

Last year Facebook removed all of their cover photo guidelines except that the covers cannot be deceptive or misleading. All else goes!

2) What is Facebook engagement?
a) When someone views your page
b) When someone views your post
c) When someone likes, comments or shares your content
Answer: C

Engagement is key to generating more views of your content. Facebook decides if the content that you posted is worthy, by the amount of engagement it receives. The more engagement you have, the more people will see your post; so make it a good one!

3) Which of the following is NOT an audience characteristic that can be used to define your target audience for a Facebook Ad?
a) Location
b) Shoe size
c) Political Affiliation
Answer: C

Facebook allows you to target your ads to a very select group using many different demographics including location, interests, previous Facebook activity, and much more.

4) Your ad on Facebook will be approved if…?
a) it invites people to buy beer at your store
b) it leads to a landing page with a pop-up
c) it uses a “before and after” image
Answer: A.

Ads may not lead to websites with a pop-up or pop-under, and they may not contain “before and after” images. Full details on the policies.

5) How do you optimize your Facebook ad to get the lowest cost per like?
a) use broad keywords
b) use specific keywords
c) use the categories they provide
Answer: B

The more specific your keywords are, the less the cost per like will be. For example, instead of using the keyword “dessert,” use the keyword “pecan pie.” By making your keywords specific and by using an extensive amount of them you will find your cost per “like” will drop significantly.

6) Which Facebook metric is most important for figuring out how many people see your content?
A) Number of Likes
B) Number of Shares and Comments
C) Organic Reach
Answer: C

Organic Reach tells you how many people your content is reaching, while the other two numbers tell you how many people are engaging with your content. They are related, but separate, measurements.

7) What type of Facebook post typically receives the most actions/engagement from fans?
a) Links
b) Images/Photos
C) Video
Answer: B

Images/Photos generally receive the most actions or engagement from fans. This is especially true with Facebook’s layout changes that add emphasis on high quality images.

8) When is the most effective time to post for maximum engagement?
a) Monday afternoon
b) Wednesday morning
c) Thursday evening
Answer: C

Facebook shows a dramatic spike in engagement on Thursday evenings.

9) What are the proper dimensions for a picture to be optimized on a Facebook status?
a) 851 x 315
b) 403 x 403
c) 800 x 1200
Answer: B

10) What are the three primary factors that EdgeRank (Facebook’s algorithm) consists of?
a) Time Decay, Number of Characters, Affinity
b) Affinity, Weight, Time Decay
c) Number or Characters, Likes, Images
Answer: B

The Time Decay factor is based on how long ago the edge (an action taken on Facebook) was created, so the older an edge, the worse it ranks. Weight reflects the type of action: a comment or share carries more weight than a simple like.

Affinity simply means how engaged the follower is with your page. Do they comment on your posts, like your photos, etc.? The more a follower interacts with your page, the higher you rank in the Affinity metric.

11) Facebook posts with images have better reach and engagement than those that are text only. True or false?
a) True
b) False
Answer: A

Photo posts get 39% more interaction than links, videos or text-based updates. Photos get 53% more likes, 104% more comments and 84% more click-throughs on links than text posts, according to Kissmetrics.

12) What’s the best way to use Facebook ads that target people who have visited your website?
a) Run a ‘Page Post Engagement’ ad
b) Run a ‘Clicks to Website’ ad
c) Run a ‘Website Conversions’ ad
Answer: C

In order to target people who have visited your website you must add a Facebook conversion tracking pixel to your website and then develop a custom audience that uses the data collected with it.

13) What is the ideal length for a Facebook post?
a) Less than 40 characters
b) 80 characters
c) Greater than 100 Characters
Answer: A


14) When creating a lookalike audience using Facebook’s Power Editor, which browser must be used?
a) Chrome
b) Safari
c) Firefox
d) Internet Explorer
Answer: A

Chrome is the only supported browser that works with Power Editor. While ads can be created in other browsers, when developing lookalike audiences through Power Editor you must use Chrome.

15) Which of the following actions can’t be performed with Facebook Power Editor?
a) Target users by their name
b) Target users by their email ID
c) Target users by their phone number
Answer: A

In Facebook power editor, you can target users by their Unique Identification IDs (UIDs) but you can’t target them specifically by name.

16) What is a dark Facebook post?
A) A Facebook post that is unpublished on a page’s timeline, but can be used for advertising purposes.
B) A Facebook post generates a high volume of negative comments.
C) A Facebook post that receives less than 100 impressions.
Answer: A

A post that is unpublished on a page’s timeline, but can be used for advertising purposes. Through Facebook’s Power Editor tool, page administrators can create a post for the sole purpose of using it for an advertisement. This allows brands to test various types of posts without cluttering their timeline with the same topic.

17) When running a contest on Facebook, which of the following is not a legal means of entrance?
a. Like to Enter
b. Share on your/your friend’s Timeline to enter
c. Comment to Enter
Answer: B.

Facebook recently changed their promotion rules so that businesses can ask Fans to Like or Comment on a Post in order to be entered into a promotion. However, it is still against Facebook's rules to ask Fans to Share a post as a means of entering.

18) Where do you place a conversion tracking pixel?
a) Facebook Page
b) Initial landing page
c) Thank you page
Answer: C
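Placement matters because the pixel fires whenever its page loads. Put it on the thank-you page — the page a visitor only reaches after converting — and each fire represents a completed conversion. As a rough sketch of the pixel’s page-specific part (the event name and values are placeholders; always paste the exact snippet Facebook generates for you, and note that the base pixel code must already be installed site-wide):

```html
<!-- Thank-you page only: fires once per completed conversion -->
<script>
  fbq('track', 'Purchase', { value: 25.00, currency: 'USD' });
</script>
```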