Top 20 Google Analytics Interview Questions 2015

Here I am sharing the top 20 Google Analytics interview questions and answers for all students preparing for the Google Analytics exam or looking to clear a Google Analytics interview, covering 2015-2016. My second blog will be on SEO & PPC interview questions.


Q1. What is a ‘Treemap’ in Google Analytics?

Q2. What is a Heatmap in Google Analytics?

Q3. What is the difference between Goals & Funnels?

Q4. What types of goals can Google Analytics track?

Q5. What is available in Google’s Real Time Reporting?

Q6. How can you track user engagement on websites that use Flash or AJAX and are located on one HTML page?

Q7. What is a segment in Google Analytics?

Q8. What is the difference between a visit and a session?

Q9. What do you understand by assisted conversions?

Q10. I want to track how many organic visits I am getting on a weekly basis for a predefined set of keywords. What is the best way to check that on a regular basis investing the least possible time?

Q11. How can I identify the keywords that are sending paid traffic to any site?

Q12. How will I identify the popular pages on my site?

Q13. How does Google calculate time on page?

Q14. How can you track Flash Events with Google Analytics code?

Q15. What is the purpose of a virtual pageview?

Q16. What is Google Tag Manager?

Q17. What are UTM parameters?

Q18. What are the three elements of Event Tracking?

Q19. Using regular expressions, how could you filter out a range of IP addresses?

Q20. Which is better: Custom Dimensions or Custom Variables?
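Q19 is easiest to see with a concrete sketch. Assuming a hypothetical range of 192.168.1.1 through 192.168.1.25 (the question above doesn't specify one), a Google Analytics exclude filter would use a regular expression like the one below; here it's tested in Python:

```python
import re

# Hypothetical range: 192.168.1.1 through 192.168.1.25.
# In a GA view filter you would paste only the pattern itself.
#   [1-9]   matches the last octet 1-9
#   1[0-9]  matches 10-19
#   2[0-5]  matches 20-25
IP_RANGE = re.compile(r"^192\.168\.1\.([1-9]|1[0-9]|2[0-5])$")

def is_in_range(ip: str) -> bool:
    """Return True if the IP address falls inside the filtered range."""
    return IP_RANGE.match(ip) is not None

print(is_in_range("192.168.1.19"))  # True
print(is_in_range("192.168.1.26"))  # False
```

The same alternation technique extends to any range: break the range into chunks that a single character class can express, then join the chunks with `|`.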


Written by Piyush Yadav



Google’s 200 Ranking Factors: Google’s 2015 Update for Secure Rankings.


1. Domain Age: In a video, Matt Cutts states:

“The difference between a domain that’s six months old versus one year old is really not that big at all.”

In other words, they do use domain age…but it’s not very important.

2. Keyword Appears in Top Level Domain: Doesn’t give the boost that it used to, but having your keyword in the domain still acts as a relevancy signal. After all, they still bold keywords that appear in a domain name.

3. Keyword As First Word in Domain: Moz’s 2011 Search Engine Ranking Factors panelists agreed that a domain that starts with their target keyword has an edge over sites that either don’t have the keyword in their domain or have the keyword in the middle or end of their domain.


4. Domain Registration Length: A Google patent states:

“Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain”.

5. Keyword in Subdomain Name: Moz’s panel also agreed that a keyword appearing in the subdomain boosts rank.


6. Domain History: A site with volatile ownership (via whois) or several drops may tell Google to “reset” the site’s history, negating links pointing to the domain.

7. Exact Match Domain: EMDs may still give you an edge…if it’s a quality site. But if the EMD happens to be a low-quality site, it’s vulnerable to the EMD update.


8. Public vs. Private WhoIs: Private WhoIs information may be a sign of “something to hide”. Matt Cutts is quoted as stating at Pubcon 2006:

“…When I checked the whois on them, they all had “whois privacy protection service” on them. That’s relatively unusual.  …Having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.”

9. Penalized WhoIs Owner: If Google identifies a particular person as a spammer it makes sense that they would scrutinize other sites owned by that person.

10. Country TLD extension: Having a Country Code Top Level Domain (.cn, .pt, .ca) helps the site rank for that particular country…but limits the site’s ability to rank globally.

Page-Level Factors


11. Keyword in Title Tag: The title tag is a webpage’s second most important piece of content (besides the content of the page) and therefore sends a strong on-page SEO signal.

12. Title Tag Starts with Keyword: According to Moz data, title tags that start with a keyword tend to perform better than title tags with the keyword towards the end of the tag.


13. Keyword in Description Tag: Another relevancy signal. Not especially important now, but still makes a difference.

14. Keyword Appears in H1 Tag: H1 tags are a “second title tag” that sends another relevancy signal to Google, according to results from this correlation study.


15. Keyword is Most Frequently Used Phrase in Document: Having a keyword appear more than any other likely acts as a relevancy signal.

16. Content Length: Content with more words can cover a wider breadth and is likely preferred over shorter, superficial articles. SERPIQ found that content length correlated with SERP position.

17. Keyword Density: Although not as important as it once was, keyword density is still something Google uses to determine the topic of a webpage. But going overboard can hurt you.

18. Latent Semantic Indexing Keywords in Content (LSI): LSI keywords help search engines extract meaning from words with more than one meaning (Apple the computer company vs. the fruit). The presence/absence of LSI probably also acts as a content quality signal.

19. LSI Keywords in Title and Description Tags: As with webpage content, LSI keywords in page meta tags probably help Google discern between synonyms. May also act as a relevancy signal.

20. Page Loading Speed via HTML: Both Google and Bing use page loading speed as a ranking factor. Search engine spiders can estimate your site speed fairly accurately based on a page’s code and filesize.

21. Duplicate Content: Identical content on the same site (even slightly modified) can negatively influence a site’s search engine visibility.

22. Rel=Canonical: When used properly, use of this tag may prevent Google from considering pages duplicate content.
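As a sketch of how the tag works (the domain and URLs here are hypothetical), the duplicate page declares the preferred version in its head:

```html
<!-- Served on the duplicate URL, e.g. https://example.com/page?sessionid=123 -->
<head>
  <!-- Tells Google the plain URL is the preferred (canonical) version -->
  <link rel="canonical" href="https://example.com/page">
</head>
```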

23. Page Loading Speed via Chrome: Google may also use Chrome user data to get a better handle on a page’s loading time as this takes into account server speed, CDN usage and other non HTML-related site speed signals.

24. Image Optimization: Images on-page send search engines important relevancy signals through their file name, alt text, title, description and caption.
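A minimal sketch of what those signals look like in markup (the file name and product are made up for illustration):

```html
<!-- File name, alt text and title all describe the image,
     giving search engines relevancy signals for the page -->
<img src="blue-widget-pro.jpg"
     alt="Blue Widget Pro 3000 on a workbench"
     title="Blue Widget Pro 3000">
```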

25. Recency of Content Updates: Google’s Caffeine update favors recently updated content, especially for time-sensitive searches. Highlighting this factor’s importance, Google shows the date of a page’s last update for certain pages.


26. Magnitude of Content Updates: The significance of edits and changes is also a freshness factor. Adding or removing entire sections is a more significant update than switching around the order of a few words.

27. Historical Page Updates: How often has the page been updated over time? Daily, weekly, every 5 years? Frequency of page updates also plays a role in freshness.

28. Keyword Prominence: Having a keyword appear in the first 100 words of a page’s content appears to be a significant relevancy signal.

29. Keyword in H2, H3 Tags: Having your keyword appear as a subheading in H2 or H3 format may be another weak relevancy signal. Moz’s panel agrees.


30. Keyword Word Order: An exact match of a searcher’s keyword in a page’s content will generally rank better than the same keyword phrase in a different order. For example, consider a search for “cat shaving techniques”. A page optimized for the phrase “cat shaving techniques” will rank better than a page optimized for “techniques for shaving a cat”. This is a good illustration of why keyword research is really, really important.

31. Outbound Link Quality: Many SEOs think that linking out to authority sites helps send trust signals to Google.

32. Outbound Link Theme: According to Moz, search engines may use the content of the pages you link to as a relevancy signal. For example, if you have a page about cars that links to movie-related pages, this may tell Google that your page is about the movie Cars, not the automobile.

33. Grammar and Spelling: Proper grammar and spelling is a quality signal, although Cutts gave mixed messages in 2011 on whether or not this was important.

34. Syndicated Content: Is the content on the page original? If it’s scraped or copied from an indexed page, it won’t rank as well as the original, or it may end up in the Supplemental Index.

35. Helpful Supplementary Content: According to a now-public Google Rater Guidelines Document, helpful supplementary content is an indicator of a page’s quality (and therefore, Google ranking). Examples include currency converters, loan interest calculators and interactive recipes.

36. Number of Outbound Links: Too many dofollow OBLs may “leak” PageRank, which can hurt that page’s rankings.

37. Multimedia: Images, videos and other multimedia elements may act as a content quality signal.

38. Number of Internal Links Pointing to Page: The number of internal links to a page indicates its importance relative to other pages on the site.

39. Quality of Internal Links Pointing to Page: Internal links from authoritative pages on the domain have a stronger effect than links from pages with no or low PR.

40. Broken Links: Having too many broken links on a page may be a sign of a neglected or abandoned site. The Google Rater Guidelines Document uses broken links as one way to assess a homepage’s quality.

41. Reading Level: There’s no doubt that Google estimates the reading level of webpages.


But what they do with that information is up for debate. Some say that a basic reading level will help your page rank because it will appeal to the masses. However, Linchpin SEO discovered that reading level was one factor that separated quality sites from content mills.

42. Affiliate Links: Affiliate links themselves probably won’t hurt your rankings. But if you have too many, Google’s algorithm may pay closer attention to other quality signals to make sure you’re not a “thin affiliate site”.

43. HTML errors/W3C validation: Lots of HTML errors or sloppy coding may be a sign of a poor quality site. While controversial, many in SEO think that WC3 validation is a weak quality signal.

44. Page Host’s Domain Authority: All things being equal, a page on an authoritative domain will rank higher than a page on a domain with less authority.

45. Page’s PageRank: Not perfectly correlated. But in general higher PR pages tend to rank better than low PR pages.

46. URL Length: Search Engine Journal notes that excessively long URLs may hurt search visibility.

47. URL Path: A page closer to the homepage may get a slight authority boost.

48. Human Editors: Although never confirmed, Google has filed a patent for a system that allows human editors to influence the SERPs.

49. Page Category: The category the page appears on is a relevancy signal. A page that’s part of a closely related category should get a relevancy boost compared to a page that’s filed under an unrelated or less related category.

50. WordPress Tags: Tags are a WordPress-specific relevancy signal. According to one description:

“The only way it improves your SEO is by relating one piece of content to another, and more specifically a group of posts to each other.”

51. Keyword in URL: Another important relevancy signal.

52. URL String: The categories in the URL string are read by Google and may provide a thematic signal as to what a page is about.


53. References and Sources: Citing references and sources, like research papers do, may be a sign of quality. The Google Quality Guidelines states that reviewers should keep an eye out for sources when looking at certain pages: “This is a topic where expertise and/or authoritative sources are important…”.

54. Bullets and Numbered Lists: Bullets and numbered lists help break up your content for readers, making them more user friendly. Google likely agrees and may prefer content with bullets and numbers.

55. Priority of Page in Sitemap: The priority a page is given via the sitemap.xml file may influence ranking.
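For illustration (the URLs are hypothetical), priority is a hint between 0.0 and 1.0 set per URL in the sitemap.xml file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <priority>1.0</priority><!-- homepage: most important -->
  </url>
  <url>
    <loc>https://example.com/archive/old-post</loc>
    <priority>0.3</priority><!-- a less important page -->
  </url>
</urlset>
```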

56. Too Many Outbound Links: Straight from the aforementioned Quality rater document:

“Some pages have way, way too many links, obscuring the page and distracting from the Main Content”

57. Quantity of Other Keywords Page Ranks For: If the page ranks for several other keywords it may give Google an internal sign of quality.

58. Page Age: Although Google prefers fresh content, an older page that’s regularly updated may outperform a newer page.

59. User Friendly Layout: Citing the Google Quality Guidelines Document yet again:

“The page layout on highest quality pages makes the Main Content immediately visible”

60. Parked Domains: A Google update in December of 2011 decreased search visibility of parked domains.

61. Useful Content:  As pointed out by Backlinko reader Jared Carrizales, Google may distinguish between “quality” and “useful” content.

Site-Level Factors


62. Content Provides Value and Unique Insights: Google has stated that they’re on the hunt for sites that don’t bring anything new or useful to the table, especially thin affiliate sites.

63. Contact Us Page: The aforementioned Google Quality Document states that they prefer sites with an “appropriate amount of contact information”. There’s a supposed bonus if your contact information matches your whois info.

64. Domain Trust/TrustRank: Site trust — measured by how many links away your site is from highly-trusted seed sites — is a massively important ranking factor. You can read more about TrustRank here.

65. Site Architecture: A well put-together site architecture (especially a silo structure) helps Google thematically organize your content.

66. Site Updates: How often a site is updated — and especially when new content is added to the site — is a site-wide freshness factor.

67. Number of Pages: The number of pages a site has is a weak sign of authority. At the very least a large site helps distinguish it from thin affiliate sites.

68. Presence of Sitemap: A sitemap helps search engines index your pages easier and more thoroughly, improving visibility.

69. Site Uptime: Lots of downtime from site maintenance or server issues may hurt your ranking (and can even result in deindexing if not corrected).

70. Server Location: Server location may influence where your site ranks in different geographical regions. Especially important for geo-specific searches.

71. SSL Certificate:  Google has confirmed that they index SSL certificates and that they use HTTPS as a ranking signal.

72. Terms of Service and Privacy Pages: These two pages help tell Google that a site is a trustworthy member of the internet.

73. Duplicate Meta Information On-Site: Duplicate meta information across your site may bring down all of your pages’ visibility.

74. Breadcrumb Navigation: This is a style of user-friendly site architecture that helps users (and search engines) know where they are on a site.


Ethical SEO Consulting, among others, claims that this set-up may be a ranking factor.

75. Mobile Optimized: Google’s official stance on mobile is to create a responsive site. It’s likely that responsive sites get an edge in searches from a mobile device. In fact, they now add “Mobile friendly” tags to sites that display well on mobile devices. Google also started penalizing sites in mobile search that aren’t mobile friendly.

76. YouTube: There’s no doubt that YouTube videos are given preferential treatment in the SERPs (probably because Google owns it).


In fact, Search Engine Land found that YouTube traffic increased significantly after Google Panda.

77. Site Usability: A site that’s difficult to use or to navigate can hurt ranking by reducing time on site, pages viewed and bounce rate. This may be an independent algorithmic factor gleaned from massive amounts of user data.

78. Use of Google Analytics and Google Webmaster Tools: Some think that having these two programs installed on your site can improve your pages’ indexing. They may also directly influence rank by giving Google more data to work with (i.e. a more accurate bounce rate, whether or not you get referral traffic from your backlinks, etc.).

79. User Reviews/Site Reputation: A site’s reviews on review sites likely play an important role in the algorithm. Google even posted a rarely candid outline of their approach to user reviews after an eyeglass site was caught ripping off customers in an effort to get backlinks.

Backlink Factors


80. Linking Domain Age: Backlinks from aged domains may be more powerful than new domains.

81. # of Linking Root Domains: The number of referring domains is one of the most important ranking factors in Google’s algorithm, as you can see from this chart from Moz (bottom axis is SERP position).


82. # of Links from Separate C-Class IPs: Links from separate C-class IP addresses suggest a wider breadth of sites linking to you.

83. # of Linking Pages: The total number of linking pages — even if some are on the same domain — is a ranking factor.

84. Alt Tag (for Image Links): Alt text is an image’s version of anchor text.

85. Links from .edu or .gov Domains: Matt Cutts has stated that TLD doesn’t factor into a site’s importance. However, that doesn’t stop SEOs from thinking that there’s a special place in the algo for .gov and .edu TLDs.

86. Authority of Linking Page: The authority (PageRank) of the referring page is an extremely important ranking factor.

87. Authority of Linking Domain: The referring domain’s authority may play an independent role in a link’s importance (i.e. a link from a PR2 page on a site with a PR3 homepage may be worth less than a link from a PR2 page on a site with a PR8 homepage).

88. Links From Competitors: Links from other pages ranking in the same SERP may be more valuable for a page’s rank for that particular keyword.

89. Social Shares of Referring Page: The amount of page-level social shares may influence the link’s value.

90. Links from Bad Neighborhoods: Links from “bad neighborhoods” may hurt your site.

91. Guest Posts: Although guest posting can be part of a white hat SEO campaign, links coming from guest posts — especially in an author bio area — may not be as valuable as a contextual link on the same page.

92. Links to Homepage Domain that Page Sits On: Links to a referring page’s homepage may play special importance in evaluating a site’s — and therefore a link’s — weight.

93. Nofollow Links: One of the most controversial topics in SEO. Google’s official word on the matter is:

“In general, we don’t follow them.”

Which suggests that they do…at least in certain cases. Having a certain % of nofollow links may also indicate a natural vs. unnatural link profile.

94. Diversity of Link Types: Having an unnaturally large percentage of your links come from a single source (i.e. forum profiles, blog comments) may be a sign of webspam. On the other hand, links from diverse sources are a sign of a natural link profile.

95. “Sponsored Links” Or Other Words Around Link: Words like “sponsors”, “link partners” and “sponsored links” may decrease a link’s value.

96. Contextual Links: Links embedded inside a page’s content are considered more powerful than links on an empty page or found elsewhere on the page.


A good example of contextual links are backlinks from guestographics.

97. Excessive 301 Redirects to Page: Links coming from 301 redirects dilute some (or even all) PR, according to a Webmaster Help Video.

98. Backlink Anchor Text: As noted in this description of Google’s original algorithm:

“First, anchors often provide more accurate descriptions of web pages than the pages themselves.”

Obviously, anchor text is less important than before (and likely a webspam signal). But it still sends a strong relevancy signal in small doses.

99. Internal Link Anchor Text: Internal link anchor text is another relevancy signal, although probably weighed differently than backlink anchor text.

100. Link Title Attribution: The link title (the text that appears when you hover over a link) is also used as a weak relevancy signal.

101. Country TLD of Referring Domain: Getting links from country-specific top-level domain extensions (.de, .cn, etc.) may help you rank better in that country.

102. Link Location In Content: Links in the beginning of a piece of content carry slightly more weight than links placed at the end of the content.

103. Link Location on Page: Where a link appears on a page is important. Generally, links embedded in a page’s content are more powerful than links in the footer or sidebar area.

104. Linking Domain Relevancy: A link from a site in a similar niche is significantly more powerful than a link from a completely unrelated site. That’s why any effective SEO strategy today focuses on obtaining relevant links.

105. Page-Level Relevancy: The Hilltop Algorithm states that a link from a page that’s closely tied to your page’s content is more powerful than a link from an unrelated page.

106. Text Around Link Sentiment: Google has probably figured out whether or not a link to your site is a recommendation or part of a negative review. Links with positive sentiments around them likely carry more weight.

107. Keyword in Title: Google gives extra love to links on pages that contain your page’s keyword in the title (“experts linking to experts”).

108. Positive Link Velocity: A site with positive link velocity usually gets a SERP boost.

109. Negative Link Velocity: Negative link velocity can significantly reduce rankings as it’s a signal of decreasing popularity.

110. Links from “Hub” Pages: Aaron Wall claims that links from pages that are considered top resources (or hubs) on a certain topic are given special treatment.

111. Links from Authority Sites: A link from a site considered an “authority site” likely passes more juice than a link from a small, microniche site.

112. Linked to as Wikipedia Source: Although the links are nofollow, many think that getting a link from Wikipedia gives you a little added trust and authority in the eyes of search engines.

113. Co-Occurrences: The words that tend to appear around your backlinks help tell Google what that page is about.

114. Backlink Age: According to a Google patent, older links have more ranking power than newly minted backlinks.

115. Links from Real Sites vs. Splogs: Due to the proliferation of blog networks, Google probably gives more weight to links coming from “real sites” than from fake blogs. They likely use brand and user-interaction signals to distinguish between the two.

116. Natural Link Profile: A site with a “natural” link profile is going to rank highly and be more durable to updates.

117. Reciprocal Links: Google’s Link Schemes page lists “Excessive link exchanging” as a link scheme to avoid.

118. User Generated Content Links: Google is able to identify links generated from UGC vs. links placed by the actual site owner. For example, they know that a link from a site’s official blog is very different from a link on a user-created subdomain.

119. Links from 301: Links from 301 redirects may lose a little bit of juice compared to a direct link. However, Matt Cutts says that a 301 is similar to a direct link.

120. Microformats: Pages that support microformats may rank above pages without them. This may be a direct boost, or a result of the fact that pages with microformatting have a higher SERP CTR.
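One common flavor of microformatting is schema.org microdata. A minimal, hypothetical review snippet (the product name is made up) that could yield a star-rating rich snippet looks like:

```html
<!-- schema.org microdata for a review; rich snippets generated from
     markup like this can raise a result's click-through rate -->
<div itemscope itemtype="https://schema.org/Review">
  <span itemprop="name">Widget Pro 3000 Review</span>
  <div itemprop="reviewRating" itemscope itemtype="https://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4.5</span> out of
    <span itemprop="bestRating">5</span>
  </div>
</div>
```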


121. DMOZ Listed: Many believe that Google gives DMOZ listed sites a little extra trust.

122. TrustRank of Linking Site: The trustworthiness of the site linking to you determines how much “TrustRank” gets passed onto you.

123. Number of Outbound Links on Page: PageRank is finite. A link on a page with hundreds of OBLs passes less PR than a page with only a few OBLs.

124. Forum Profile Links: Because of industrial-level spamming, Google may significantly devalue links from forum profiles.

125. Word Count of Linking Content: A link from a 1000-word post is more valuable than a link inside of  a 25-word snippet.

126. Quality of Linking Content: Links from poorly written or spun content don’t pass as much value as links from well-written, multimedia-enhanced content.

127. Sitewide Links: Matt Cutts has confirmed that sitewide links are “compressed” to count as a single link.

User Interaction


128. Organic Click-Through Rate for a Keyword: Pages with a higher organic CTR may get a SERP boost for that particular keyword.

129. Organic CTR for All Keywords: A page’s (or site’s) organic CTR for all the keywords it ranks for may be a human-based, user-interaction signal.

130. Bounce Rate: Not everyone in SEO agrees that bounce rate matters, but it may be a way for Google to use its users as quality testers (a page that people quickly bounce from probably isn’t very good).

131. Direct Traffic: It’s confirmed that Google uses data from Google Chrome to determine whether or not people visit a site (and how often). Sites with lots of direct traffic are likely higher quality than sites that get very little direct traffic.

132. Repeat Traffic: They may also look at whether or not users go back to a page or site after visiting. Sites with repeat visitors may get a Google ranking boost.

133. Blocked Sites: Google has discontinued this feature in Chrome. However, Panda used this feature as a quality signal.

134. Chrome Bookmarks: We know that Google collects Chrome browser usage data. Pages that get bookmarked in Chrome might get a boost.

135. Google Toolbar Data: Search Engine Watch’s Danny Goodwin reports that Google uses toolbar data as a ranking signal. However, besides page loading speed and malware, it’s not known what kind of data they glean from the toolbar.

136. Number of Comments: Pages with lots of comments may be a signal of user-interaction and quality.

137. Dwell Time: Google pays very close attention to “dwell time”: how long people spend on your page when coming from a Google search. This is also sometimes referred to as “long clicks vs short clicks”. If people spend a lot of time on your site, that may be used as a quality signal.

Special Algorithm Rules


138. Query Deserves Freshness: Google gives newer pages a boost for certain searches.

139. Query Deserves Diversity: Google may add diversity to a SERP for ambiguous keywords, such as “Ted”, “WWF” or “ruby”.

140. User Browsing History: Sites that you frequently visit while signed into Google get a SERP bump for your searches.

141. User Search History: Search chains influence search results for later searches. For example, if you search for “reviews” and then search for “toasters”, Google is more likely to show toaster-review sites higher in the SERPs.

142. Geo Targeting: Google gives preference to sites with a local server IP and country-specific domain name extension.

143. Safe Search: Search results with curse words or adult content won’t appear for people with Safe Search turned on.

144. Google+ Circles: Google shows higher results for authors and sites that you’ve added to your Google+ Circles.

145. DMCA Complaints: Google “downranks” pages with DMCA complaints.

146. Domain Diversity: The so-called “Bigfoot Update” supposedly added more domains to each SERP page.

147. Transactional Searches: Google sometimes displays different results for shopping-related keywords, like flight searches.

148. Local Searches: Google often places Google+ Local results above the “normal” organic SERPs.


149. Google News Box: Certain keywords trigger a Google News box.


150. Big Brand Preference: After the Vince Update, Google began giving big brands a boost for certain short-tail searches.

151. Shopping Results: Google sometimes displays Google Shopping results in organic SERPs.


152. Image Results: Google elbows out organic listings with image results for searches commonly used on Google Image Search.

153. Easter Egg Results: Google has a dozen or so Easter Egg results. For example, when you search for “Atari Breakout” in Google image search, the search results turn into a playable game (!).  Shout out to Victor Pan for this one.

154. Single Site Results for Brands: Domain or brand-oriented keywords bring up several results from the same site.

Social Signals


155. Number of Tweets: Like links, the tweets a page has may influence its rank in Google.

156. Authority of Twitter User Accounts: It’s likely that tweets coming from aged, authority Twitter profiles with a ton of followers (like Justin Bieber’s) have more of an effect than tweets from new, low-influence accounts.

157. Number of Facebook Likes: Although Google can’t see most Facebook accounts, it’s likely they consider the number of Facebook likes a page receives as a weak ranking signal.

158. Facebook Shares: Facebook shares — because they’re more similar to a backlink — may have a stronger influence than Facebook likes.

159. Authority of Facebook User Accounts: As with Twitter, Facebook shares and likes coming from popular Facebook pages may pass more weight.

160. Pinterest Pins: Pinterest is an insanely popular social media site with lots of public data. It’s probable that Google considers Pinterest pins a social signal.

161. Votes on Social Sharing Sites: It’s possible that Google uses shares at sites like Reddit, Stumbleupon and Digg as another type of social signal.

162. Number of Google +1’s: Although Matt Cutts has gone on the record as saying Google+ has “no direct effect” on rankings, it’s hard to believe that they’d ignore their own social network.

163. Authority of Google+ User Accounts: It’s logical that Google would weigh +1’s coming from authoritative accounts more than from accounts without many followers.

164. Known Authorship: In February 2013, Google CEO Eric Schmidt famously claimed:

“Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results.”

Although the Google+ authorship program has been shut down, it’s likely Google uses some form of authorship to determine influential content producers online (and give them a boost in rankings).

165. Social Signal Relevancy: Google probably uses relevancy information from the account sharing the content and the text surrounding the link.

166. Site Level Social Signals: Site-wide social signals may increase a site’s overall authority, which will increase search visibility for all of its pages.

Brand Signals


167. Brand Name Anchor Text: Branded anchor text is a simple — but strong — brand signal.

168. Branded Searches: It’s simple: people search for brands. If people search for your site in Google (i.e. “Backlinko twitter”, Backlinko + “ranking factors”), Google likely takes this into consideration when determining a brand.

169. Site Has Facebook Page and Likes: Brands tend to have Facebook pages with lots of likes.

170. Site Has Twitter Profile with Followers: A Twitter profile with a lot of followers signals a popular brand.

171. Official Linkedin Company Page: Most real businesses have company Linkedin pages.

172. Employees Listed at Linkedin: Rand Fishkin thinks that having Linkedin profiles that say they work for your company is a brand signal.

173. Legitimacy of Social Media Accounts: A social media account with 10,000 followers and 2 posts is probably interpreted a lot differently than another 10,000-follower strong account with lots of interaction.

174. Brand Mentions on News Sites: Really big brands get mentioned on Google News sites all the time. In fact, some brands even have their own Google News feed on the first page.

google news for brands

175. Co-Citations: Brands get mentioned without getting linked to. Google likely looks at non-hyperlinked brand mentions as a brand signal.

176. Number of RSS Subscribers: Considering that Google owns the popular Feedburner RSS service, it makes sense that they would look at RSS Subscriber data as a popularity/brand signal.

177. Brick and Mortar Location With Google+ Local Listing: Real businesses have offices. It’s possible that Google fishes for location-data to determine whether or not a site is a big brand.

178. Website is Tax Paying Business: Moz reports that Google may look at whether or not a site is associated with a tax-paying business.

On-Site WebSpam Factors


179. Panda Penalty: Sites with low-quality content (particularly content farms) are less visible in search after getting hit by a Panda penalty.

180. Links to Bad Neighborhoods: Linking out to “bad neighborhoods” — like pharmacy or payday loan sites — may hurt your search visibility.

181. Redirects: Sneaky redirects are a big no-no. If caught, they can get a site not just penalized, but de-indexed.

182. Popups or Distracting Ads: The official Google Rater Guidelines Document says that popups and distracting ads are a sign of a low-quality site.

183. Site Over-Optimization: Includes on-page factors like keyword stuffing, header tag stuffing, excessive keyword decoration.

184. Page Over-Optimization: Many people report that — unlike Panda — Penguin targets individual pages (and even then just for certain keywords).

185. Ads Above the Fold: The “Page Layout Algorithm” penalizes sites with lots of ads (and not much content) above the fold.

186. Hiding Affiliate Links: Going too far when trying to hide affiliate links (especially with cloaking) can bring on a penalty.

187. Affiliate Sites: It’s no secret that Google isn’t the biggest fan of affiliates. And many think that sites that monetize with affiliate links are put under extra scrutiny.

188. Autogenerated Content: Google isn’t a big fan of autogenerated content. If they suspect that your site’s pumping out computer-generated content, it could result in a penalty or de-indexing.

189. Excess PageRank Sculpting: Going too far with PageRank sculpting — by nofollowing all outbound links or most internal links — may be a sign of gaming the system.

190. IP Address Flagged as Spam: If your server’s IP address is flagged for spam, it may hurt all of the sites on that server.

191. Meta Tag Spamming: Keyword stuffing can also happen in meta tags. If Google thinks you’re adding keywords to your meta tags to game the algo, they may hit your site with a penalty.


Off Page Webspam Factors


192. Unnatural Influx of Links: A sudden (and unnatural) influx of links is a sure-fire sign of phony links.

193. Penguin Penalty: Sites that were hit by Google Penguin are significantly less visible in search.

194. Link Profile with High % of Low Quality Links: Lots of links from sources commonly used by black hat SEOs (like blog comments and forum profiles) may be a sign of gaming the system.

195. Linking Domain Relevancy: The famous analysis found that sites with an unnaturally high amount of links from unrelated sites were more susceptible to Penguin.


196. Unnatural Links Warning: Google sent out thousands of “Google Webmaster Tools notice of detected unnatural links” messages. This usually precedes a ranking drop, although not 100% of the time.

197. Links from the Same Class C IP: Getting an unnatural amount of links from sites on the same server IP may be a sign of blog network link building.

198. “Poison” Anchor Text: Having “poison” anchor text (especially pharmacy keywords) pointed to your site may be a sign of spam or a hacked site. Either way, it can hurt your site’s ranking.

199. Manual Penalty: Google has been known to hand out manual penalties, like in the well-publicized Interflora fiasco.

200. Selling Links: Selling links can definitely impact toolbar PageRank and may hurt your search visibility.

201. Google Sandbox: New sites that get a sudden influx of links are sometimes put in the Google Sandbox, which temporarily limits search visibility.

202. Google Dance: The Google Dance can temporarily shake up rankings. According to a Google Patent, this may be a way for them to determine whether or not a site is trying to game the algorithm.

203. Disavow Tool: Use of the Disavow Tool may remove a manual or algorithmic penalty for sites that were the victims of negative SEO.

204. Reconsideration Request: A successful reconsideration request can lift a penalty.

205. Temporary Link Schemes: Google has (apparently) caught onto people who create — and quickly remove — spammy links, also known as a temporary link scheme.

“How Can I Use This Information For My Site?”

I created a free step-by-step checklist that you can use to quickly apply the most important information from this post to your site.

The checklist contains the 10 most important ranking factors on this list…

…and super-actionable strategies that you can use to get higher rankings and more traffic.

by Brian Dean | Last updated May 12, 2015

What is Bot traffic?

Bot traffic is the portion of internet traffic that comes from automated bots and web spiders rather than from humans.

Bot traffic is hard to detect, but according to authoritative sources it may account for 10% to 25% of all traffic. Every site automatically receives some level of bot traffic, so the higher the bot traffic, the lower the share of human traffic.

Bots can also be human-made: a publisher may generate bot traffic to his own pages to create artificial ad impressions and keep clients happy. This produces no real revenue, only a huge number of fake impressions.


There are two types of bots:

  1. Official (Good) Bots.
  • Good bots provide numerous useful internet services.
  • Search engine bots like Googlebot, Yahoo Slurp, and Bingbot.
  • Bots used for usability and response-time monitoring.
  • E-reputation spider bots.
  • Advertising measurement bots.
  2. Bad Bots.
  • Email address harvesting.
  • Automated account sign-up (to create multiple email accounts).
  • Content spinning.
  • Blog and comment spam.
  • Click and impression fraud.
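As a purely illustrative sketch (not from the report), a first-pass user-agent check might separate known good bots from everything else. The bot tokens below are real crawler names, but the list is a placeholder, and since impersonator bots spoof these strings, real detection also needs reverse-DNS checks and behavioural analysis.

```javascript
// Hypothetical first-pass bot classifier by user-agent substring.
// Sample (non-exhaustive) list of good-bot tokens:
var KNOWN_GOOD_BOTS = ['Googlebot', 'Slurp', 'bingbot'];

function classifyUserAgent(ua) {
  for (var i = 0; i < KNOWN_GOOD_BOTS.length; i++) {
    if (ua.indexOf(KNOWN_GOOD_BOTS[i]) !== -1) return 'good-bot';
  }
  // Human, bad bot, or impersonator: can't tell from the UA string alone.
  return 'unknown';
}

console.log(classifyUserAgent(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
)); // good-bot
```

Because the check only looks at a self-reported string, a spoofed "Googlebot" UA passes it, which is exactly the impersonator problem discussed below.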

Bot traffic wastes internet infrastructure resources and may affect web analytics and advertising.

Captchas are a good way to prevent bot traffic, but they are also an annoyance for website visitors.


Brands Will Lose Billions to Bots in 2015

According to research by White Ops and the Association of National Advertisers, 11 percent of display ad impressions, 23 percent of video ad impressions, and 52 percent of sourced traffic are fraudulent.


Report Highlights

Bot Traffic is up by 21%

Compared to the previous report from 2012, we see a 21% growth in total bot traffic, which now represents 61.5% of website visitors. The bulk of that growth is attributed to increased visits by good bots (i.e., certified agents of legitimate software, such as search engines) whose presence increased from 20% to 31% in 2013. Looking at user-agent data we can provide two plausible explanations of this growth:

  • Evolution of Web Based Services: Emergence of new online services introduces new bot types into the pool. For instance, we see newly established SEO oriented services that crawl a site at a rate of 30-50 daily visits or more.
  • Increased activity of existing bots: Visitation patterns of some good bots (e.g., search engine type crawlers) consist of re-occurring cycles. In some cases we see that these cycles are getting shorter and shorter to allow higher sampling rates, which also results in additional bot traffic.

31% of Bots Are Still Malicious, but with Much Fewer Spammers

While the relative percentage of malicious bots remains unchanged, there is a noticeable reduction in Spam Bot activity, which decreased from 2% in 2012 to 0.5% in 2013. The most plausible explanation for this steep decrease is Google’s anti-spam campaign, which includes the recent Penguin 2.0 and 2.1 updates.

SEO link building was always a major motivation for automated link spamming. With its latest Penguin updates Google managed to increase the perceivable risk for comment spamming SEO techniques, while also driving down their actual effectiveness.

Based on our figures, it looks like Google was able to discourage link spamming practices, causing a 75% decrease in automated link spamming activity.

Evidence of More Sophisticated Hacker Activity

Another point of interest is the 8% increase in the activity of “Other Impersonators” – a group which consists of unclassified bots with hostile intentions.

The common denominator for this group is that all of its members are trying to assume someone else’s identity. For example, some of these bots use browser user-agents while others try to pass themselves as search engine bots or agents of other legitimate services. The goal is always the same – to infiltrate their way through the website’s security measures.

The generalized definition of such non-human agents also reflects on these bots’ origins. Where other malicious bots are agents of known malware with a dedicated developer, GUI, “brand” name and patch history, these “Impersonators” are custom-made bots, usually crafted for a very specific malicious activity.



Written By 


Digital Marketing Expert

18 Best SEO Interview Questions 2015. Write your answers in the comment section.

Q1. If a page includes more than one rel=”canonical”, then Google will …..

A. Ignore them all.
B. Count the first one.
C. Count the last one.

Q2. When including Open Graph (OG) tags on a webpage, which of the following is NOT a required property?

A: og:title
B: og:description
C: og:image
D: og:url
E: og:type

Q3. What is the optimal HTTP status code to permanently remove a URL from Google’s index?

A: 301
B: 302
C: 404
D: 410
E: 500

Q4. Best practices for rel=”canonical” link elements include:
A: Relative URLs
B: Absolute URLs
C: Nofollow the link element

Q5. Which of the following are true about the growth of mobile vs. desktop search in Google?

A: Mobile search queries overtook desktop in 2013 and are expected to continue growing faster.
B: Desktop search queries are now shrinking, and mobile is growing, but will not overtake desktop in the next 2-3 years.
C: Both desktop and mobile are growing, but mobile is growing faster and is expected to overtake desktop in the next 1-2 years.
D: Desktop is not growing (but not shrinking much). Mobile is expected to overtake desktop in number of queries by 2016.

Q6. The X-Robots-Tag should be located:

A: In your robots.txt file
B: In the <head>
C: In the HTTP headers
D: Anywhere inside the <body> element
E: In Robert Downey Jr.’s basement

Q7. What is the maximum number of URLs typically allowed in an XML sitemap file?

A: 500
B: 5,000
C: 50,000
D: 500,000
E: 5,000,000

Q8. The minimum REQUIRED tags in an XML sitemap include:

A: <urlset>
B: <url>
C: <loc>
D: None of these
E: A, B, and C

Q9. What is an example of a “soft” 404 error?

A: A page that returns a 404 HTTP response code, but quickly changes to a 200 a short time later.
B: A page that returns a 200 HTTP response code, and displays a blank page.
C: A page that displays a 404 HTTP response code, and displays a 404 error page.
D: A page that returns a 404 code with pillows.

Q10. How can meta description tags help with the practice of search engine optimization?

A: They’re an important ranking factor in the search algorithms.
B: They help to tell the search engines which keywords are most important on your page.
C: They serve as the copy that will entice searchers to click on your listing.
D: Trick question; meta descriptions are NOT important.


Q11. Which of the following would be the best choice of URL structure (for both search engines and humans)?

Q12. Which group of ranking factors do SEOs generally consider to have the largest influence on rankings?

A: Engagement Metrics
B: Social shares
C: Link-based metrics
D: Site speed
E: Exact match domains

Q13. True or false: For best results, incorporate authorship markup on every page of your website, including the homepage and category pages.

Q14. Which HTTP status code is best to serve when your site is down for maintenance?

A: 200
B: 302
C: 404
D: 503
E: 90210
Q15.  What are valid reasons why your webpage’s title may not appear in Google’s search results exactly as it does in the page title element in your HTML?

A: Google has overwritten your title element with a title taken from the Yahoo! Directory.
B: Your title does not contain your brand name or other key terms in the users’ search query (or doesn’t include them at the start of the title element), so Google is using text from elsewhere on the page.
C: Google is pulling a title for your webpage from one you entered into an AdWords ad for that URL (that produces a higher clickthrough rate).
D: Your page contains the meta “title-h1” tag specifying that Google should use the text within the H1 in place of the title element for the search results listing.
E: Both options B & D

Q16. True or false: In general, internal links pass about the same value as external links.

Q17. Implementing structured data may help:

A: Lead to rich snippets in search results
B: Enhance CTR from search engine results
C: Search engines understand your content
D: Your content to appear in specialized search results like “in-depth Articles”
E: All of these

Q18. Which of the following types of sitemaps is NOT supported by Google?

A: News
B: Product
C: Image
D: Mobile
E: Video

NOTE: Please post your answers in the comment box; we will approve them and send the full answer sheet to your mailbox.
Written by Piyush Yadav

Difference between Universal Analytics and Google Analytics

The differences between ga.js and analytics.js fall under the following areas:

  1. Data Collection and integration
  2. Data Processing
  3. Custom Dimensions and metrics
  4. Custom variables
  5. User Interface
  6. Javascript library
  7. Tracking Code
  8. Technical Knowledge
  9. Referrals Processing
  10. Cookies
  11. Privacy and data usage

Data Collection and Integration

  • Universal Analytics (UA) provides more ways to collect and integrate different types of data than Google Analytics (GA).
  • Through UA you can integrate data across multiple devices and platforms. This is something which is not possible with GA.
  • Consequently, UA provides a better understanding than GA of the relationship between the online and offline marketing channels that drive sales and conversions.

Data processing

  • The data processing in UA is visitor based instead of visit based.
  • Consequently UA is more visitor centric than visit centric.

Custom Dimensions and Metrics

  • In UA you can create and use your own dimensions and metrics to collect the type of data GA does not automatically collect (like phone call data, CRM data etc).
  • These user defined dimensions and metrics are known as ‘custom dimensions’ and ‘custom metrics’.
  • Through custom dimensions you can import additional data into your analytics account.
  • GA does not allow you to define your own dimensions and metrics.

New Custom Dimensions & Custom Metrics

Custom dimensions and custom metrics are like default dimensions and metrics in your Analytics account, except you create and define them yourself. They’re powerful tools you can use to collect and segment data that Google Analytics doesn’t automatically track, like product details, levels in games, or authors of content pages.

More configuration options

Universal Analytics gives you more configuration options in your Google Analytics account, so you don’t have to adjust your tracking code to make modifications. From the Admin page in your account, you can now control these settings:

  • Organic search sources
  • Session and campaign timeout handling
  • Referral exclusions
  • Search term exclusions

Custom Variables

  • UA uses custom dimensions instead of custom variables
  • GA uses custom variables instead of custom dimensions.
  • While ‘custom dimensions’ are not available in GA, custom variables are not supported in UA (they have been replaced by custom dimensions).


User Interface

  • Interface wise both UA and GA reports look the same.
  • The difference is in how each collects, integrates, and processes the data.
  • However once you start using custom dimensions and custom metrics, your UA reports may look very different from your GA reports.


JavaScript Library

  • UA uses the ‘analytics.js’ JavaScript library, whereas GA uses the ‘ga.js’ JavaScript library.
  • The ‘analytics.js’ library is similar to the ‘ga.js’ library but provides a new set of features for collecting and integrating data.
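For reference, the core difference is visible in the tracking snippets themselves. The sketch below reproduces only the command-queue part of the standard analytics.js snippet (the real snippet also injects the script tag that loads the library); ‘UA-XXXXX-Y’ is a placeholder property ID, and a plain object stands in for the browser window so the sketch runs anywhere.

```javascript
// Minimal reproduction of the analytics.js command queue stub: calls to ga()
// are buffered until the library loads and drains the queue.
var fakeWindow = {}; // stand-in for the browser window (sketch assumption)

(function (w, name) {
  w.GoogleAnalyticsObject = name;          // tells analytics.js which global to use
  w[name] = w[name] || function () {
    (w[name].q = w[name].q || []).push(arguments); // buffer commands until load
  };
  w[name].l = 1 * new Date();              // record the snippet-load timestamp
})(fakeWindow, 'ga');

// Commands queue up exactly as they would on a real page:
fakeWindow.ga('create', 'UA-XXXXX-Y', 'auto'); // placeholder property ID
fakeWindow.ga('send', 'pageview');

console.log(fakeWindow.ga.q.length); // 2
```

The older ga.js snippet works the same way in spirit, but with a plain array (`_gaq`) that you `push` command arrays onto instead of a `ga()` function.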

Technical Knowledge

To use all the features of UA you need good technical knowledge of your development environment/ platform or you should know someone who has such knowledge.

Otherwise you may have a hard time using custom dimensions, custom metrics and integrating data across multiple devices/ platforms.

Without this technical knowledge, UA is not very useful for you. This is not really the case with GA.


Referrals Processing

In UA referrals are processed differently.

By default all referrals trigger a new web session in UA. This can affect the total number of web sessions in your analytics reports.

For example, suppose a visitor arrives on your website from a third-party referrer (such as a payment gateway), then later returns to your website from that same referrer.

Each arrival from the referrer triggers a new web session for your website, so a single visitor can generate multiple sessions in one sitting.

If you do not want a new web session to be triggered every time the visitor returns from that referrer, you need to add it to the referral exclusion list.


Cookies

A cookie is a text file used to store information about a visitor: his preferences, location, browsing behaviour, and other details.

While GA can use up to 4 cookies (_utma,_utmb,_utmz and _utmv) to collect visitors’ usage data, UA uses only 1 cookie (called _ga).
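As an illustration of that single cookie, the _ga value has the documented form ‘GA1.2.&lt;random&gt;.&lt;first-visit timestamp&gt;’, and the last two fields make up the client ID. A minimal parsing sketch (my own helper, with error handling reduced to a format check; the sample value is illustrative):

```javascript
// Extract the client ID from a Universal Analytics _ga cookie value.
// Format: "GA1.<domain-depth>.<random number>.<first-visit timestamp>"
function clientIdFromGaCookie(cookieValue) {
  var parts = cookieValue.split('.');
  if (parts.length < 4) return null;   // unexpected format
  return parts.slice(-2).join('.');    // random component + timestamp = client ID
}

console.log(clientIdFromGaCookie('GA1.2.1701241469.1413716603'));
// 1701241469.1413716603
```

This client ID (the `cid` parameter) is what ties hits from the same browser together in UA reports.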

Related article: Google Analytics & Universal Analytics Cookies – Complete Guide

Privacy and Data usage

Google warns against collecting any personally identifiable data in your UA accounts. Google can terminate your analytics account if you breach this policy.

You need to give your end users proper notice and get consent about what data you will collect via UA. You also need to give your end users the opportunity to ‘opt out’ from being tracked.

That means you need to make changes to your privacy and data usage policies. Google recommends using the Google Analytics opt-out browser add-on if you want to block Google Analytics.

Note: You can learn more about the UA usage guidelines from here and about the privacy from here.

Server Side Configuration settings in Universal Analytics

UA lets you change the following server-side configuration settings via the account admin:

  1. Change session and campaigns timeout settings.
  2. Add/delete search engines
  3. Exclude referral traffic sources
  4. Exclude search terms


With GA, you need to add special tracking code to every web page on your website to change each of the aforesaid server-side configurations.

UA has simplified changing these server configurations by providing easy-to-use controls in the account ‘admin panel’, which don’t require editing the existing tracking code on every web page of your website.


The Measurement Protocol

UA uses a new measurement protocol (a protocol is a set of rules) which lets you send data from any device, system, or environment (including smartphones, tablets, call-center data, digital appliances, point-of-purchase systems, or any online or offline customer contact point) to your Google Analytics account, provided you have formatted your data according to the protocol.

Through this protocol you can import offline conversion data into Google Analytics. The new ‘analytics.js’ JavaScript library itself sends data using the Measurement Protocol.

Just like GA, the Universal Analytics tracking code also sends tracking data to the analytics server each time a web page is loaded into the browser (ga.js requests an invisible ‘__utm.gif’ image, while analytics.js sends a hit to the ‘/collect’ endpoint). This request is pretty long and looks something like the one below (the beginning is truncated):

…&sd=24-bit&sr=1600×900&vp=1583×330&je=1&fl=15.0%20r0&_u=MACAAAQBI~&jid=1551246513&cid=1701241469.1413716603&tid=UA-52269-5&_r=1&z=1412809081

Note: In the case of UA, the request parameters are different from those used by GA.
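The protocol itself is just URL-encoded key=value pairs sent to Google’s collection endpoint. Here is a minimal sketch of building such a hit; the parameter names (v, tid, cid, t, dp) are real Measurement Protocol fields, while the tracking ID, client ID, and page path below are placeholders.

```javascript
// Build a Measurement Protocol hit payload (a sketch, not official client code).
function buildHit(params) {
  return Object.keys(params)
    .map(function (k) {
      return encodeURIComponent(k) + '=' + encodeURIComponent(params[k]);
    })
    .join('&');
}

var payload = buildHit({
  v: 1,                       // protocol version
  tid: 'UA-XXXXX-Y',          // placeholder property ID
  cid: '555',                 // anonymous client ID (placeholder)
  t: 'pageview',              // hit type
  dp: '/offline/store-visit'  // document path being recorded (placeholder)
});

console.log(payload);
// v=1&tid=UA-XXXXX-Y&cid=555&t=pageview&dp=%2Foffline%2Fstore-visit
```

In practice you would POST this payload to www.google-analytics.com/collect, which is how offline systems (call centers, point-of-sale, etc.) get their data into UA.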

To learn more about the Measurement Protocol, check out the article: Understanding Universal Analytics Measurement Protocol


Setting up new Universal Analytics Account

Follow the steps below to set up your new Universal Analytics account:

Step-1: Get administrative access to your Google Analytics account.

Step-2: Go to ‘Admin’ area and from the property drop down menu, select ‘create new property’.

Setting up new Universal Analytics Account

This will create a new ‘web property’ in your existing Google Analytics account.



Once you have completed the form, click on the ‘Get Tracking ID’ button

The UA tracking code will now appear in a box. Copy-paste this tracking code in the head section (<head>….</head>) of every web page on your website. Remove the old Google Analytics tracking code at the same time.

Step-5: Check the source code of the home page and other web pages on your website and look for ‘analytics.js’. If you can see this JavaScript library in your HTML code then the website is using the UA tracking code. Also check your real time reports to make sure that you are getting data into your Universal Analytics reports.

Step-6: Change server side configuration options via your Admin panel (explained later in this article).

Step-7: Set up custom dimensions and custom metrics (if required).

Change session and campaigns timeout settings

Go to the ‘Admin’ area of your account and then click on ‘Session Settings’ link under ‘Tracking Info’ drop down menu.

By default, both GA and UA end a web session after 30 minutes of inactivity on a website or when the browser window is closed.

By default, the attribution to a marketing campaign expires (times out) 6 months after the last time a visitor visited your website.

There are certain situations in which you may need to change the default time at which a web session or a marketing campaign ends.

For example:

1. If your website automatically logs out a visitor after, say, 2 minutes of inactivity (common on bank websites), or if visitors spend 5 minutes on your website on average, then it doesn’t make any sense to end a web session after 30 minutes of inactivity.

A session timeout of 5 minutes may be better in this case.

Choose a session timeout that matches the average time spent on your website/web pages (verify this trend over a long period of time: at least 3 or more months).

Note: Your session timeout cannot be less than 1 minute or greater than 4 hours.


2. The majority of marketing campaigns become irrelevant for conversion attribution after a few weeks, so it doesn’t make any sense to set the campaign timeout to 6 months.

Choose a campaign timeout that matches the time you think your campaigns will remain relevant for attributing conversions.

Note: Your campaign timeout cannot be greater than 24 months (2 years).


To change the session timeout in GA, use the _setSessionCookieTimeout() method. Similarly, to change the campaign timeout in GA, use the _setCampaignCookieTimeout() method.

Following is an example of how you can call these methods in your Google Analytics Tracking Code:

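A sketch of how those calls fit into the classic asynchronous ga.js snippet (the property ID is a placeholder; both methods take their timeout in milliseconds):

```javascript
// Classic ga.js async tracking code with adjusted session/campaign timeouts.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']);                       // placeholder ID
_gaq.push(['_setSessionCookieTimeout', 5 * 60 * 1000]);         // 5-minute session
_gaq.push(['_setCampaignCookieTimeout', 30 * 24 * 60 * 60 * 1000]); // 30-day campaign
_gaq.push(['_trackPageview']);

console.log(_gaq.length); // 4
```

The timeout calls must be pushed before _trackPageview so they take effect on the first hit; on a real page the snippet would also load ga.js itself.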

To be continued.

SEO Copywriter : Piyush Yadav


What do you mean by Domain Cross-linking?

If you run several businesses and are thinking about building 20-30 websites that all link to each other, that is called domain cross-linking. If you add rel=”nofollow” to those links, at least that shows good SEO sense.

But domain cross-linking is still not good for SEO. It looks spammy if your websites represent different businesses and all link to each other.

In an interview, Google web spam head Matt Cutts said that cross-linking is OK if your business is located in different locations: multiple domains that work for the same company, where it makes sense to link between them. If you want to follow Matt Cutts’ advice, follow these 2 points:

1. Don’t use footer links; use a country locator page.

2. Keep them as normal, static HTML.


Written By :

Piyush Yadav

SEO @seobysearch

How to set up Custom Dimensions and Metrics in Google Analytics?


 What are dimensions and metrics?

A metric is a count of some data type, like pageviews, average session views, or CTR percentage. Dimensions are data types that give descriptive information, such as city, screen name, browser, and device. Google Analytics offers over 200 pre-defined dimensions and metrics.

Using custom dimensions and metrics, you can define your own dimensions and metrics.

Why are custom dimensions and metrics useful?

Custom dimensions and metrics allow you to bring in data you might have outside of Google Analytics. For example:

1. If you store the gender of signed-in users in a CRM system, you might want to combine that with your Google Analytics data to see pageviews by gender.

2. If you’re a game developer, metrics like level completions or high scores may be more relevant to you than pre-defined metrics like screen views or average time on site.

How are custom dimensions different from custom variables?

The fundamental difference between the two is that custom dimensions are primarily managed on the server side, whereas custom variables are primarily managed on the client side.

  • With custom dimensions, less data needs to be sent in each hit. Only the index and value need to be sent at collection time.
  • Custom dimension definitions are more flexible: the name and scope can be edited in the property settings without changing any code.
  • Each property has 20 available custom dimension indices, while custom variables are managed on the client side in five custom variable “slots” per property.
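Those differences show up directly in the calls each library uses. The sketch below contrasts the ga.js _setCustomVar call with the analytics.js custom dimension call; simple queue stubs stand in for the real libraries so the sketch runs anywhere, and the names, values, and indices are placeholders.

```javascript
var _gaq = [];                                                   // ga.js command queue
var ga = function () { (ga.q = ga.q || []).push(arguments); };   // analytics.js stub

// ga.js custom variable: slot 1 of 5; the NAME, value, and scope
// (1 = visitor level) are all sent with the hit.
_gaq.push(['_setCustomVar', 1, 'memberType', 'premium', 1]);

// analytics.js custom dimension: only the index ("dimension1") and value are
// sent; the name and scope live in the property settings (server side).
ga('set', 'dimension1', 'premium');

console.log(_gaq[0].length, ga.q[0].length); // 5 3
```

The shorter analytics.js call is precisely the “less data per hit” point above: the dimension’s name and scope never travel over the wire.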

Understanding Custom Dimensions and Metrics

The lifecycle of a custom dimension or metric has four stages:

  • Configuration – you define your custom dimensions and metrics with an index, a name, and other properties like scope.
  • Collection – you send custom dimension and metric values to Google Analytics from your implementation.
  • Processing – your data is processed using your custom dimension and metric definitions and any view (profile) filters.
  • Reporting – you build new reports using your custom dimensions and metrics in the web interface.
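The Collection stage can be sketched with analytics.js. Here ‘dimension1’ refers to a custom dimension configured at index 1 in the Admin interface; the value and property ID are placeholders, and a tiny queue stub stands in for the real ga() global so the sketch runs outside a browser.

```javascript
// Queue stub standing in for the real analytics.js ga() global (sketch only).
var ga = function () { (ga.q = ga.q || []).push(arguments); };

ga('create', 'UA-XXXXX-Y', 'auto');      // placeholder property ID
// Attach the custom dimension value to a single hit:
ga('send', 'pageview', { dimension1: 'premium-member' });
// Custom metrics travel the same way, e.g. { metric1: 1 } on an event hit.

console.log(ga.q[1][2].dimension1); // premium-member
```

Only the index (1) and the value travel with the hit; Processing then joins them to the name and scope you defined in Configuration.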


When you define a custom dimension or metric, you specify its name and other configuration values at a particular index. Custom Dimensions have the following configuration values:

  • Name – the name of the custom dimension as it will appear in your reports.
  • Scope – specifies to which data the custom dimension or metric will be applied. Learn more about Scope.
  • Active – whether the custom dimension or metric value will be processed. Inactive custom dimensions may still appear in reporting, but their values will not be processed.

Custom metrics have the following configuration values:

  • Name – the name of the custom metric as it will appear in your reports.
  • Type – determines how the custom metric value will be displayed in reports.
  • Minimum / Maximum Value – the minimum and maximum values that will be processed and displayed in your reports.
  • Active – whether the custom metric value will be processed. Inactive custom metrics may still appear in reporting, but their values will not be processed. [In my next blog post: read about Collection, Processing, and Reporting.]

Read How to Set up custom dimensions

  1. Sign in to Google Analytics.
  2. Select the Admin tab and navigate to the property to which you want to add custom dimensions.
  3. In the PROPERTY column, click Custom Definitions, then click Custom Dimensions.
  4. Click New Custom Dimension.
  5. Add a Name.
    This can be any string, but use something unique so it’s not confused with any other dimension or metric in your reports.
  6. Select the Scope.
    Choose to track at the Hit, Session, User, or Product level. Read more about scope and how custom dimensions are processed in our Developer Guide.
  7. Check the Active box to start collecting data and see the dimension in your reports right away. To create the dimension but have it remain inactive, uncheck the box.
  8. Click Create.

Read How to Set up custom metrics

  1. Sign in to Google Analytics.
  2. Select the Admin tab and navigate to the property to which you want to add custom metrics.
  3. In the PROPERTY column, click Custom Definitions, then Custom Metrics.
  4. Click the New Custom Metric button.
  5. Add a Name.
    This can be any string, but use something unique so it’s not confused with another dimension or metric in your reports.
  6. From the Formatting Type dropdown, select an Integer, Currency, or Time.
    An integer can be any number. The currency type will match the view settings (i.e., USD, Yen, etc.) and should be entered as a decimal number. Specify time in seconds, but it appears as hh:mm:ss in your reports.
  7. Check the Active box to start collecting data and see the metric in your reports right away. To create the metric but have it remain inactive, uncheck the box.
  8. Click Create.
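To see how the Time formatting type behaves, the sketch below reproduces the seconds-to-hh:mm:ss conversion that reports apply when displaying such a metric (my own illustrative helper, not Google code):

```javascript
// Convert a Time-type custom metric value (sent in seconds) to the
// hh:mm:ss form shown in reports.
function secondsToHms(totalSeconds) {
  var h = Math.floor(totalSeconds / 3600);
  var m = Math.floor((totalSeconds % 3600) / 60);
  var s = totalSeconds % 60;
  function pad(n) { return (n < 10 ? '0' : '') + n; }
  return pad(h) + ':' + pad(m) + ':' + pad(s);
}

console.log(secondsToHms(3725)); // 01:02:05
```

So a hit carrying `{ metric1: 3725 }` for a Time-type metric would surface as 01:02:05 in the report column.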

Modify your tracking code

After you create custom dimensions or metrics in your property, you must also modify your tracking code. This should be completed by a qualified developer, following the instructions in the Developer Guide for your specific environment.

Edit custom dimensions and metrics

Custom dimensions and metrics can’t be deleted once created, but you can return to these settings in your account to manage and edit them. To stop using an existing custom dimension, uncheck the Active box, and click Save.