
Thursday 24 December 2015

Reasons Your Website Could Be Penalized by Google




Google’s on an uncompromising mission. It wants to give its users access to accurate information, unique content and the finest writers. It continually tweaks and improves its algorithms so that the best of the web gets the exposure it deserves.
Unfortunately, there’s a flipside: a penalty. That’s the consequence of Google taking issue with something on your site. Sometimes a penalty is well deserved, but even if you know you’re in the wrong, you probably want to do something about it.

What Is a Google Penalty?

Google has been changing its ranking algorithms since December 2000. That’s when it released its toolbar extension. At the time, the toolbar update represented a sea change that would create the SEO industry as we know it. In fact, it was the first time PageRank was published in a meaningful or usable form.
Over the next decade-and-a-bit, Google continued to refine the quality of its search results. Over time, it began to eliminate poor quality content and elevate the good stuff to the top of the SERPs. That’s where penalties come in.
The Penguin update was rolled out in 2012. It hit a significant share of search results overnight, wiped some sites out of search entirely, pushed poor quality content off the map and forced optimizers to think much more carefully about their content strategy. Since then, SEO professionals have been very tuned in to Google’s plans, fearing the next update in case it results in a penalty for a site they’re working on.

Recognizing a Penalty

Penalties can be automatic or manual. With manual penalties, you’ll probably be told, but you may not always know you’ve been targeted if the cause is algorithmic. Those penalties may take even the most experienced SEO professionals by surprise.
For algorithmic penalties, here are some sure-fire clues.
  • Your website is not ranking well for your brand name any more. That’s a dead giveaway. Even if your site doesn’t rank for much else, it should at least do well on that one keyword.
  • Any page one positions you had are slipping back to page two or three without any action on your part.
  • PageRank for your site has inexplicably dropped from a respectable two or three to a big fat zero (or a measly PR of one).
  • The entire website has been removed from Google’s cached search results overnight.
  • Running a site search – site:yourdomain.com keyword – yields no results.
  • Your listing – when you eventually find it in Google – is for a page on your site other than the home page.
If you see one or more of these factors, you can be pretty sure that a penalty has affected your site.

Why Has Google Penalized My Site?
Google is continually tweaking and revising the way it indexes content.
While it does publish clues about its algorithm updates, it rarely comes clean about all of its reasons for changes. Fixing things can be tough.
To get you off on the right track, here’s the part you’ve been waiting for: 50 common reasons for Google taking issue with your site. While we’re not saying we know the definite reasons for a penalty, we do know that these factors all contribute.

  1. Buying links. Some swear it doesn’t happen, but actual evidence is mixed. Buying links could certainly be seen as an attempt to manipulate PageRank, and therein lies the controversy. If you’ve been buying bad links (and lots of them), your actions could have caught up with you.
  2. Excessive reciprocal links. Swapping links was once an innocent marketing tactic until it started to be abused. If you’ve been exchanging lots of links with clients, it could be seen as a manipulation attempt.
  3. Duplicate content. Hopefully this one’s obvious: any duplicate content on your site makes it less useful in Google’s view, and that could result in a penalty. Make sure your content is unique and well-written; use tools like Copyscape and CopyGator, too.
  4. Overusing H1 tags. Correctly structuring content helps with SEO. The H1 tag helps Google to understand what the page is about. Excessive H1 tags could be seen as an attempt to pump Google’s listing with keywords.
  5. Internal 404s. Google wants to know that you tend to your content and weed out any errors and problems. If you’re delivering 404s inside your own website, it’s a sure-fire signal that your users aren’t getting the information they ask for.
  6. Links from sites in another language. This one seems unfair, right? You’ve got a legitimate link from a client in another country, yet it’s technically counted against you. Well, Google’s reasoning is sound: users generally tend to prefer one language, so linking to sites in another language isn’t that useful for them.
  7. Keyword stuffed content. There are all kinds of weird and wonderful ‘rules’ about keyword density in content. The truth is that none of these rules are proven, and a very high keyword density is a flag for poorly written content. If Google detects a weirdly high number of keywords in a page, it may penalize you – rightly or wrongly.
  8. Footer links. Some web designers use footer links as a navigational aid; some try to manipulate PageRank by using the footer as a place to pass link juice unnaturally. There’s a short discussion about this on Moz.
  9. Missing sitemap data. Google uses the XML sitemap to parse your site’s structure and learn how it’s put together. Make sure your XML sitemap is available and up-to-date, and then submit it in your Webmaster Tools account.
  10. Hidden links. All of the links on your site should be visible and useful to users. Anything that’s hidden is considered suspicious. Never make a link the same color as the background of a page or button, even if you have an innocent reason.
  11. Broken external links. If you don’t keep links up-to-date, Google will assume you don’t care about the user experience and are happy to pack visitors off to various 404 error pages. Check links periodically and pull the duff ones.
  12. Scraped content. Sometimes website managers pull content from other sites in order to bulk out their own pages. Often, this is done with good intentions, and it may be an innocent error. But Google sees this as pointless duplication. Replace it with your own original content instead.
  13. Hidden content. Less ethical optimization tactics include disguising text on a page to manipulate the theme or keyword weighting. It goes without saying that this is a big no-no.
  14. Anchor text overuse. Once upon a time, SEO experts worked on linking certain keywords in order to reinforce their authority. Since the 2012 Penguin update, the over-use of anchor text linking is strongly discouraged. Switch out your forced, unnatural keyword links for honest links phrased in real English.
  15. Neglecting hreflang. Neglecting what now? ‘Hreflang’ is designed to notify Google that you have intentionally published duplicate content for different languages or localities. The jury’s out as to whether it really helps, but using it can’t hurt in the meantime.
  16. Website timing out or down. When a website goes down, everyone gets upset: the visitor, the webmaster and the search engine. If Google can’t find your site, it would rather de-index it than keep sending visitors to a dead end.
  17. Keyword domains. While domain names aren’t that risky in themselves, domain names with keywords in them might be. Consider the anchor text linking issue: if we repeatedly link to that domain, Google might see that as anchor text manipulation. If you do use an exact match domain, make sure it has plenty of great content on it, otherwise Google will assume you’re trying to fool people into clicking.
  18. Rented links. Some experts still believe rented links are valid and useful for SEO. They pay for them on a monthly basis and change them around occasionally. However, we’d consider them paid links, and so would most of these experts on Quora.
  19. Using blog networks. As far as Google is concerned, any kind of network is a sign of potential SERP manipulation. Most blog networks have now shut down or given users the chance to delete all of these incoming links. You should too.
  20. Affiliate links all over the place. Google isn’t necessarily opposed to affiliate websites, but a high number of affiliate links is a red flag that the content may not be up to scratch. Although it’s possible to mask affiliate links with redirects, Google is wise to this tactic, so don’t rely on it.
  21. Site-wide links. We all need to link pages together, but Google is constantly scanning those links for unnatural patterns. A classic example is a web developer credit in the footer of a page. Don’t just nofollow: remove them entirely.
  22. Overusing meta keywords. Meta keywords have been a topic for debate for some time. They are way too easy to manipulate. Make sure you use no more than five per page.
  23. Slow speeds. If your site’s slow to load, your users will get frustrated. Many, many factors affect hosting speeds, so this is quite a tricky problem to assess and troubleshoot. Use a caching plugin or a CDN right away. You could also move your site to a data center closer to your most frequent visitors: that’s a little more involved.
  24. Spun content. Spinning is content theft. It could land you in hot water if the Google penalty doesn’t catch up with you first. Bought some super-cheap articles? Sometimes content is spun by the ‘writer’, so you may not even know about it. If the price was too good to be true, that’s a sign you may have bought spun articles.
  25. Comment spam. Most commenting systems have an automated spam detection system, but some comments still make it through. Keep a close eye on the comments you’re getting. Also, don’t let spam build up; if you don’t have time to moderate it, switch commenting off entirely.
  26. Black hat SEO advice. If you publish information about manipulating SERPs using black hat methods, expect to be penalized. Matt Cutts hinted at this in a video blog.
  27. Hacked content. If your site has been hacked, Google will quickly remove it from SERPs. Act quickly to contain hacking attempts and restore sites from backup if the worst does happen.
  28. Speedy link building. It’s natural to want your new site to rank quickly. Don’t overdo it. Lots of similar links pointing to the same place is a sign of automation. Don’t artificially bump your link velocity: make gradual changes over time.
  29. Spam reports. Google has published an online form for spam site reporting. Your site might have been submitted as a potential source of spam, genuinely or maliciously.
  30. Forum linking. We’ve all used forums awash with signature links. Sometimes there are so many, it can be hard to locate the actual posts. If you add a forum link, use good, natural linking techniques and consider making it a nofollow too.
  31. Hiding your sponsors. Having a sponsor is no bad thing. Plenty of sites wouldn’t exist without them. Don’t try to hide your sponsors, but follow the rules: nofollow sponsor links and make sure Google’s news bot doesn’t crawl pages where those links can be found.
  32. Robots.txt flaws. The robots.txt file should be used to tell search engines how to deal with your site. While there are legitimate reasons for excluding pages from robots.txt, do it sparingly: excessive blocking could be the cause of your penalty.
  33. Links to suspicious sites. Never associate yourself with a website that is doing something ethically or legally dubious. Hacking, porn and malware-ridden sites should be avoided. Also, try to remove links to other sites that have been penalized in the past, assuming you know about it.
  34. Landing pages. Businesses sometimes try to use multiple landing pages in order to improve their position in SERPs. Some companies also try to improve their position by creating lots of one-page websites optimized for a single keyword, then funneling users through to another site. Google considers this kind of thing to be bad practice.
  35. Over-optimization. Google doesn’t like to see too much of a good thing. An over-optimization penalty usually means you’ve gone a step too far in your bid to obsessively out-SEO everyone else in your industry. Cool it and publish some natural content before your rank suffers.
  36. Advertorials. The controversy around advertorial content was perhaps the most well-known of the pre-Penguin 2 debates. An advertorial is basically a page of content riddled with paid links, and often these pages were being used for aggressive manipulation of search results. The most famous example was Interflora: read about its penalty here.
  37. Too many outbound links. When linking to other websites, keep it natural. A high quantity of links is a sign that you’re swapping links with people for the sake of mutual SEO benefit.
  38. Redirection. If you’ve received a penalty on your site, using a 301 redirect could transfer the penalty to a new location. What’s more, the penalty could linger if you remove the redirect later. To be safe, don’t do it.
  39. Error codes. Aside from the obvious 404 error, there are a range of others that Google really hates to see. 302 (temporarily moved) isn’t ideal; if you really must redirect something, use 301. Also, if you see any 500 errors, deal with the root cause as soon as you can. Find invisible errors with this WebConfs HTTP Header Check tool.
  40. Duplicate metadata. Some blogging tools and CMS platforms make it all too easy to create duplicate metadata by accident. While metadata isn’t a cause for a penalty on its own, it can be a sign of a duplicate content issue on your site. In any case, it’s undesirable; try to deal with it.
  41. Malicious backlinks. Your site NEVER deserves this penalty – but it is something you should know about. If you’re really unlucky, an unethical competitor may try to shove your site down the SERPs by getting it penalized. The most common cause is a malicious backlink campaign.
  42. Targeted keywords. Google is waging war against some of the keywords most frequently appearing in spam sites. ‘Payday loans’ is a good example of a keyword that has already been targeted, although some people feel that it could do more. If you legitimately operate in an industry that’s rife with spam, expect to be caught in the crossfire.
  43. Smuggled links. Don’t be sneaky and put links into script files. Google is much better at analyzing scripts and picking out weird links that shouldn’t be there.
  44. Poor mobile websites. Google can normally detect a valid link between your mobile site and your main website. If it’s poorly designed, it may not. Make sure the mobile site is served when the requesting user agent is a mobile device. Matt Cutts also suggests using a separate subdomain.
  45. Few outbound links. Google wants to see content that references other content of a similar standard. If you don’t share the love, it might look like an attempt to attract traffic unnaturally.
  46. Domain has a bad rep. You may have innocently purchased a domain with a bad history, and that could cause you problems when you try to build a new site around it. Unfortunately this is often a dead end street; you may be best cutting your losses and buying another domain rather than throwing more money at the problem.
  47. Content theft. Even if you don’t steal content, someone else could steal yours. This is troublesome, since getting the content removed could involve filing multiple DMCA takedown notices or pursuing sites in court. If you’re penalized for this, try asking Google to remove the stolen content.
  48. Prominent ads. Advertising is OK when treated as a secondary concern. Ads should never dominate the page content or play second fiddle to an article or blog.
  49. Using a content farm. Over the two years since Panda was phased in, it has been considered poor form to buy content from a ‘farm’ (defined as “sites with shallow or low-quality content”). If your content is poorly researched, light on detail or exists mainly to fill up the page, employ a professional to rewrite it.
  50. Beware of quick fixes. Don’t employ anyone that claims to have a magical, foolproof technique that will help to get your site to the top of the SERPs. The only way to rank well is to put in the groundwork over time.
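Several of the reasons above (5, 11 and 39) boil down to auditing the HTTP status codes behind your links. Here’s a minimal sketch; `fetch_status` is a hypothetical callable standing in for a real HTTP client, injected so the example stays self-contained:

```python
def audit_links(urls, fetch_status):
    """Classify each URL by the HTTP status code it returns.

    fetch_status is a stand-in for a real HTTP client; in practice it
    might wrap requests.head(url, allow_redirects=False).status_code.
    """
    report = {"ok": [], "use_301_instead": [], "broken": [], "server_error": []}
    for url in urls:
        code = fetch_status(url)
        if code == 302:
            report["use_301_instead"].append(url)  # temporary redirect: prefer a 301
        elif code in (404, 410):
            report["broken"].append(url)           # fix or remove these links
        elif 500 <= code < 600:
            report["server_error"].append(url)     # deal with the root cause
        else:
            report["ok"].append(url)
    return report
```

Run something like this periodically over your internal and outbound links, and act on anything in the broken and server error buckets.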
How to Deal With a Penalty
Figured out the cause for your penalty? You’re halfway to fixing it – if it’s fixable at all.
Every problem will require a slightly different solution, but here are some things you can try.
  • Don’t panic. Even massive websites suffer from penalties.
  • Disavow troublesome links. Ask Google not to count troublesome links that are harming your website. Here’s the link to the disavow tool, and here’s Matt Cutts’ advice on the subject.
  • Get some links removed. While disavow is good, it’s not perfect. Put in some legwork and try to get some of the links taken down.
  • Request reconsideration if your penalty was manual.
  • Wait it out. Sometimes it takes Google a while to act on your changes and disavow requests, and then it could take a while for it to re-crawl your site.
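On the disavow point: the tool expects a plain text file with one entry per line, either a full URL or an entire domain prefixed with `domain:`; lines beginning with `#` are comments. A minimal example (the domains here are placeholders):

```
# Spammy directory that ignored our removal request
domain:spammy-directory.example
# One specific page carrying a bad link
http://blog.example.net/paid-links-page.html
```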
In a few cases, it’s better to abandon a site rather than fight a Google penalty: if your domain has been tarnished, there’s little you can do. But most penalties can be fixed with a little effort, some hard work and an ethical approach to rebuilding your site.

Wednesday 28 October 2015

Google's Latest Update: "RankBrain"

What Does It Mean For SEO:

Google announced yesterday that they were adding another layer to their already complicated ranking algorithm. This time, they are focusing on machine learning and calling the update RankBrain.

As I covered in a recent article, machine learning has been making inroads into the search algorithms for quite some time now. Apple has publicly announced their own forays into search and machine learning, and Bing has talked about it extensively.

And at our digital agency we’re always paranoid about changes Google makes to their algorithm. So, we jumped right in when we heard the news and tried to decipher what all of this means for marketers.

But what are we really talking about when we discuss machine learning and the search engine algorithms?




Still Human Run

Even though we’d like to think that companies worth hundreds of billions of dollars, like Google and Apple, should have advanced technologies that we mere mortals haven’t even heard of, the truth of the matter is that they still have a lot of manual processes and antiquated code wrapped up in their operating systems and search algorithms.

So, while they may not have developed warp drive yet, they have been trying to remove the human element from their search algorithms. The reason is that humans can only process what they know and what they recognize. And gathering data to learn from is a very manual process.

This means that all of the rules that currently govern how the search algorithm works, from link tracking, content crawling, and the hundreds of other activities, have all been written by a human being. And that human being wrote that rule because they recognized a pattern and adjusted the algorithm accordingly.

And while humans are great at recognizing patterns, we’re still very slow at doing this. Thus, the search companies have been enlisting machines to help write and govern the rules that run the algorithms. But machines are not great at recognizing patterns. Yes, they’re much faster than humans at almost everything, but they still struggle with recognizing patterns and understanding how those patterns interact with the big picture.

RankBrain

So, if a machine can’t run the system on its own, what will RankBrain really do? In short, it will look for patterns that humans have programmed it to look for and make pre-determined changes and adjustments based on the pattern recognition.

In truth, this isn’t true machine intelligence, it’s still a computer – albeit an incredibly powerful one – running a set of predetermined protocols and taking predetermined actions. Now, there may be some level of learning going on in the system, which theoretically would allow RankBrain to evolve its processes gradually over time. But you can rest assured that a human will still be spot-checking that evolution and making sure everything is progressing as it should.

Third Most Important Signal

Google has said that they consider the signals coming from RankBrain to be the third most important signal they base their rankings on. For a full list, you can read this great article on Search Engine Land by Danny Sullivan.

But what exactly is RankBrain tracking and improving? For the most part, it will be looking at intent. It will ask, “What was the intent of this search?”

Meaning, if you searched for “homes in San Diego”, Google’s algorithm already has a synonym rule in place that will bring up results matching “houses in San Diego” as well. But that line of code was written by a human. The hope is that, as billions of searches come in every day, RankBrain will track what users click on and begin making these correlations without a human having to write them into the code manually.

This will help Google keep up with trending topics, slang, and new queries that it hasn’t encountered before.
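To make the idea concrete, here’s a toy sketch of click-based correlation. This is emphatically not Google’s code, just an illustration of the principle: queries whose users click the same results get treated as related:

```python
from collections import defaultdict

def learn_related_queries(click_log, min_overlap=2):
    """Toy sketch: treat two queries as related when at least
    min_overlap of their clicked URLs coincide.

    click_log is an iterable of (query, clicked_url) pairs.
    """
    clicks = defaultdict(set)
    for query, url in click_log:
        clicks[query].add(url)

    related = defaultdict(set)
    queries = list(clicks)
    for i, q1 in enumerate(queries):
        for q2 in queries[i + 1:]:
            # shared clicks suggest the queries have the same intent
            if len(clicks[q1] & clicks[q2]) >= min_overlap:
                related[q1].add(q2)
                related[q2].add(q1)
    return related
```

A real system would weight by click counts, dwell time and much more, but the core signal is the same: shared user behavior stands in for a hand-written synonym rule.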

What Does This Mean For SEO?

Honestly, not a whole lot just yet. This is an algorithm update that helps the users and gives them a better experience. But it doesn’t really change how marketers should be tackling their SEO. They should still be focusing on well-designed pages that load quickly and give a great user experience. Content is still as important as ever. And every other SEO best practice should still be followed.

However, in the future, the increase in machine learning technologies will make the algorithm more agile and will begin to reward the marketers who are doing SEO the right way more and more. So, it’s a great day for people who practice SEO the right way and often get frustrated that old spammy tactics still produce results for their competitors. It’s just a matter of time before the machines are fully integrated and then finally they’ll be able to stay ahead of their human counterparts.

Article Source 

Friday 25 September 2015

Best SEO Interview Questions 2016



1. If a page includes more than one rel="canonical", Google will... - Ignore them all.
2. If the robots <META> tag is missing, the default is: - INDEX, FOLLOW
3. Which of the following links is likely to pass the most value from a single page? - A link contained in the main body text
4. If you want a page to pass value through its links, but stay out of the search engines' indices, which of the following tags should you place in the header? - NOINDEX, FOLLOW
5. Which HTTP status code is best to serve when your site is down for maintenance? - 503
6. To give credit for duplicate content that appears on another site, use... - A cross-domain canonical tag
7. Which Google update was closely associated with speed, faster indexing and fresher web results? - Caffeine
8. The de-facto version of a page located on the primary URL you want associated with the content is known as: - Canonical Version
9. What is the maximum number of URLs typically allowed in an XML sitemap file? - 50,000
10. What are valid reasons why your webpage's title may not appear in Google's search results exactly as it does in the page title element in your HTML? - Your title does not contain your brand name or other key terms in the user's search query (or doesn't include them at the start of the title element), so Google is using text from elsewhere on the page.
11. Which of the following types of sitemaps is NOT supported by Google? - Product type
12. If these URLs have the same content, example.com/avocado and example.com/avocado/ are technically considered duplicate content and should be fixed. - True
13. Which type of link has NOT been mentioned by Google as being risky for SEO? - Text link advertisements with rel="nofollow"
14. True or false: rel="canonical" can be used to point to content on a different domain. - True
15. The X-Robots-Tag should be located: - In the HTTP headers
16. Which of the following statements about anchor text is true? - When an embedded image is linking, the alt attribute of the image may be treated as anchor text.
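Questions 2 and 15 pair nicely: robots directives can arrive via the robots <META> tag or the X-Robots-Tag HTTP header, and when neither is present the default is index, follow. Here's a small sketch of how the directives combine, on the assumption that the most restrictive directive wins:

```python
def effective_robots(meta_content=None, x_robots_header=None):
    """Combine robots directives from the <meta> tag and the
    X-Robots-Tag HTTP header. With no directives at all, the
    default is "index, follow"; a noindex/nofollow from either
    source overrides the default.
    """
    directives = set()
    for source in (meta_content, x_robots_header):
        if source:
            directives.update(d.strip().lower() for d in source.split(","))
    index = "noindex" not in directives
    follow = "nofollow" not in directives
    return ("index" if index else "noindex") + ", " + ("follow" if follow else "nofollow")
```

So a page with no tag and no header is crawlable and indexable by default; you only need markup when you want to restrict something.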

Thursday 26 March 2015

Best SEO & PPC Questions and Answers II


1. What does the term 'keyword stuffing' refer to?
Answers:
Overloading a page with keywords.
• Posting SEO rich comments on your own webpage.
• Repeating the same SEO terms over multiple social media channels.
• Overusing SEO in Gmail subject lines
2. Why is it a bad idea, from an SEO perspective, to host free articles and write-ups that are very common on the internet?
Answers:
• Because people could turn up claiming copyright infringement
• Because they will not lead to fresh traffic
• Because you will not get the benefits of proper keyword targeting
Because you could be penalized by search engines for using duplicate content
3. What does SERP stand for?
Answers:
Search engine results page.
• Search engine results plan.
• Search engine results position.
• Search engine recurring payment.
• Search engine retirement plan.
4. Which type of url redirect is the most favorable for SEO?
Answers:
• 404
• 413
• 818
301
5. How often should a blog be updated for the best SEO results?
Answers:
• As often as humanly possible
• Once a month to build "Thirst"
• Rarely, but with high content quality
• Once a week
Consistently within reason and with quality content
6. Long-tail keywords are:
Answers:
• Used most often as social media hashtags
• Key phrases with repeating words
• Keywords generated through email chains
Searched less often than common keywords
7. How often a keyword is mentioned relatively to the space on a page is called:
Answers:
Keyword density
• Keyword saturation
• Keyword consistency
• Keyword frequency
8. How can one site 'recommend' another in the eyes of Google?
Answers:
• Emails to Google's tech department
• Collaborating on online ad campaign
Linking to the site on their own page
• 'Liking' or 'starring' their Facebook page, Google+ profile, and Tweets
9. Links used in 'link trading' are called:
Answers:
• Co-links
Reciprocal links
• Mutually Beneficial Links
• Symbiotic Linking
10. Flash Ads, by default, have a higher Quality Score than Static Ads.
Answers:
• True
False
11. The top social media network for appearing high in Google's SEO rankings is:
Answers:
• Twitter
Google Plus
• Instagram
• Facebook
12. What type of written content leads to higher page rankings?
Answers:
• Bulleted lists with relevant keywords
• Strategic abbreviations coupled with relevant keywords
• Keywords stylized with hashtags and unique text formatting
Complete sentences with relevant keywords
13. "Link building" increases the number and quality of
Answers:
• Links shared through social media
• Search engine results
• Outbound links
Inbound links
14. What are breadcrumb lists?
Answers:
• A series of links from the same website found on the same page of search engine results.
A row of internal links allowing visitors to quickly navigate back to previous pages.
• A list of social media followers who link and share your content.
• Backlinks posted on related blogs.
15. What was the name of the 2011 change to Google's search results ranking algorithm?
Answers:
• Google Plebian
Google Panda
• Google Pelican
• Google Penguin
16. Alexa is a traffic ranking platform owned by:
Answers:
• Apple
Amazon
• Microsoft
• Google
17. Which of the following does NOT benefit link building efforts?
Answers:
• Blog comments
Breadcrumb links
• Forum signature linking
• Website directory submission
18. If you are driving a lot of PPC traffic to your website, but none of that traffic converts into leads, it’s an indication that:
Answers:
• Most of the traffic is coming from social media
• Your search term is too specific and should be shortened
Your marketing offer is insufficient or hard to identify
• You are bidding too high for your PPC terms
19. Which of the following statements about FFA pages is true?
Answers:
• They contain numerous inbound links
• They are Paid Listings
They are also called Link Farms
• They are greatly beneficial to SEO
20. If a website's search engine saturation for a particular search engine is 20%, what does it mean?
Answers:
• Search engines often pass on long tail searches to lesser known and new websites
• 20% of the websites pages will never be indexed
• Only 20% of the pages of the website will ever be indexed by the search engine
20% of the website pages have been indexed by the search engine
21. What factor is the largest contributor to a site's authority?
Answers:
Quality of sites providing inbound links
• Likes on Facebook + Followers on Twitter
• Number of social media networks a business is present on
• How useful the content of a site is
22. Which type of content will deliver better ROI over time?
Answers:
• Articles optimized for more than 5 keywords
• Articles with over 500 words
• Content with a high number of outbound links
• Content that is submitted to directories
Evergreen Content
23. What is the difference between search engine marketing (SEM) and Search Engine Optimization (SEO)?
Answers:
• SEM campaigns encourage marketers to develop landing pages and conversion rate optimization while SEO professionals only optimize keyword rankings on organic search result pages.
• SEM is part of SEO process
SEM Campaigns are designed to utilize both Paid and Organic methodologies to promote businesses, while SEO only promotes traffic through organic search.
• SEM covers social media marketing while SEO focuses only on organic
24. What is the illegal act of copying a page by unauthorized parties, in order to divert traffic to another site, called?
Answers:
• View Jacking
Page Jacking
• Visitors Jacking
• Traffic jacking
25. After Google, what is the web's most trafficked content destination?
Answers:
• Ask
• Aol
YouTube
• Yahoo!
• Bing
26. True or False? Google recognizes text within pictures.
Answers:
• True
False
27. SEO favors a consistent title for all pages within a website over unique titles for each page.
Answers:
• True
False
28. In AdWords, a value that's used to determine your ad position, where ads are shown on a page is:
Answers:
• CPC Bid
• Edge Rank
• Ad Relevance
• Quality Score
Ad Rank
29. What happens if you ONLY raise your bids?
Answers:
• Increase Conversion Rate
• Increase CTR
• Increase Quality Score
Increase Ad Rank
30. When you delete a page of your website, what should you do?
Answers:
• Redirect to the URL address of the category of the page, if any.
Remove the URL address via Google Webmaster Tools.
• Redirect the URL address to your homepage.
• Do nothing.
31. A -50% bid adjustment at device level will make your Mobile CPC half of your desktop CPC.
Answers:
True
• False
32. What is called "Page Rank"?
Answers:
• The quality of webpage content.
Google's link analysis algorithm
• A method used to differentiate positive and negative SEO campaigns.
• How pages cumulatively rank across all search engines.
33. What term is used to describe the word or phrase users enter into a Google search?
Answers:
• Identifier
• Keyword
Query
• Input
34. True or False? Black Hat SEO has been eliminated via Google Spiders
Answers:
• TRUE
False
35. Which factor does NOT influence Quality Score in AdWords campaign?
Answers:
• past CTR
• landing page quality
• ad relevance
CPC bid
36. To enable conversion optimizer in Google AdWords, a campaign must:
Answers:
• Have at least 30 conversions in the past 30-days
• Conversion optimizer is the default setting for all campaigns in Google AdWords
Have at least 15 conversions in the last 30 days.
• Have conversion tracking, but no minimum conversions are required
• Have at least 15 conversions in the past 15-days
37. Generic keywords and key terms are called:
Answers:
Head Terms
• Vanilla Terms
• White Terms
• Ground Terms
38. Meta descriptions factor into Google's ranking algorithms for web search.
Answers:
• True
False
39. Can you use Google Authorship tag on company pages?
Answers:
• Yes, if the content of the page is longer than 1500 words.
• Yes, if you are one of the managers of company's Google+ page.
• Yes, if you linked your Gmail account with your company email service(e.g. first_last_name@companydomain.com).
No, its use is intended for blog purposes only.
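Question 7's keyword density is easy to compute. Here's a simple sketch for a multi-word phrase (the regex tokenizer is a simplifying assumption, not how any search engine actually tokenizes):

```python
import re

def keyword_density(text, phrase):
    """Keyword density per question 7's definition: how often the phrase
    is mentioned relative to the total word count, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    target = phrase.lower().split()
    n = len(target)
    # count non-overlapping-style sliding-window matches of the phrase
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits * n / len(words)
```

As the earlier post on penalties notes, there is no proven "safe" density; the number is mainly useful as a red flag when it gets weirdly high.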