1_Search Optimization Basics.pdf
Unit 1: Search Engine Optimization Basics

What Is SEO? Search Engine + Optimization
Search Engine:
○ A search engine is a software system designed to search for information on the World Wide Web according to the user's query.
Top Search Engines:
○ Google, Yahoo, Bing, Yandex, DuckDuckGo
Optimization:
○ The action of making the best or most effective use of a situation or resource.
Search Engine Optimization:
○ The process of making your website/web pages better for search engines, so that they can rank higher.
Do we optimize search engines in SEO?
○ No; we optimize our websites/web pages.

Need for SEO
○ Organic search is most often the primary source of website traffic.
○ SEO builds trust and credibility.
○ SEO impacts the buying cycle.
○ If you're not on page 1, you're not winning the click.

Site Architecture
Website architecture (also called website structure) is how well the pages on your site are organized and connected to each other.
Why is website architecture important?
○ It helps your pages rank on Google.
○ It can impact Google's ability to find and index all of your important pages.
How to Structure a Website
○ Use internal links strategically.
○ Make sure users can access your pages in just a few clicks.
○ Create an easy-to-follow navigation menu.
○ Optimize your site's URLs.
○ Use breadcrumbs. Breadcrumb navigation is a text-based navigation path made up of links; it shows users where they are in your site structure.
○ Use HTML and XML sitemaps.

Optimization Types

On-Page SEO
On-page SEO (also called on-site SEO) is the process of optimizing webpages and their content for both search engines and users. It can help pages rank higher on Google and drive more organic traffic.
Why is on-page SEO important?
○ Search engines use keywords and other on-page SEO elements to check whether a page matches a user's search intent.
○ And if the page is relevant and useful, Google serves it to the user.

On-page optimization techniques:
○ Write unique, helpful content
○ Place target keywords strategically
○ Write keyword-rich title tags
○ Write click-worthy meta descriptions
○ Use headings and subheadings to structure your page
○ Optimize URLs
○ Add internal links
○ Add external links
○ Include and optimize images

Write unique, helpful content
○ Incorporate keywords naturally into your content (and avoid keyword stuffing).
○ Make sure your content matches the search intent of your target keyword.
○ Fully answer the query; your content should be useful to users.
○ Write unique content that offers something competitors don't.
○ Include visual content (more on that later).

Place target keywords strategically
○ Google scans your content to see what a page is about, and readers will likely do the same.
○ So you should include your target keywords in these key areas: the H1, the first paragraph, and subheaders (H2s, H3s, etc.).
○ This will help Google gain context about the topic of your page, and users will be able to quickly tell whether the page matches their search intent.

Write keyword-rich title tags
A few tips to follow when writing your title tags:
○ Keep it brief. Keep title tags between 50 and 60 characters so Google doesn't cut them off.
○ Include your target keyword. This helps both Google and users determine what your page is about.
○ Be unique. Avoid duplicate title tags so that each individual page's purpose is clear to Google (and users know what they're clicking on).

Write click-worthy meta descriptions
○ A meta description tag is an HTML element that provides a brief summary of the page.

Use headings and subheadings to structure your page

Optimize URLs
○ Use words that are relevant to your content so users can tell what your page is about.
○ Not random numbers, publish dates, or full sentences.
○ Using your target keyword in your URL is a good way to ensure your URL matches the topic of your content.
○ If Google understands what a page is about, it's able to match it with relevant search queries.

Add internal links
○ Internal links are hyperlinks that point to different pages on the same site.
○ Internal links are an important part of on-page SEO optimization:
▪ They help search engines understand your site's structure and how pages are related to each other.
▪ They allow Google crawlers to discover and navigate to new pages.
▪ They signal to Google that the linked-to page is valuable.
▪ They help users navigate through your website (and keep them on your site longer).

Add external links
○ External links are links on your site that point to other sites.
○ Only link to authoritative and trustworthy sites related to the topic.
○ Use descriptive and natural anchor text to show readers what to expect when they click.

Include and optimize images
○ Including images in your content increases your chances of ranking in Google Images, which is a great way to get more traffic to your site.
○ A good place to start optimizing your images is by writing descriptive alt text for them.
○ Alt text (short for alternative text) is text included in HTML code that describes an image on a webpage.

Off-Page SEO
Off-page SEO refers to SEO tactics applied outside of a website to improve its rankings. The goal of off-page SEO is to get search engines (and users) to see your site as more trustworthy and authoritative.
Why is off-page SEO important?
○ Think of off-page SEO as building your site's reputation.
○ Highly reputable websites tend to rank better because search engines consider them to have more Expertise, Authoritativeness, and Trustworthiness (E-A-T).

Off-page optimization techniques:
Backlink Building:
○ When other reputable websites link to your content, it signals to search engines that your site is trustworthy and valuable.
○ Quality matters more than quantity.

Social Media Presence:
○ Social signals play a role in SEO. Having a strong and active presence on social media platforms can contribute to your online visibility and indirectly impact your search engine rankings.

Influencer Marketing:
○ Collaborating with influencers in your industry can amplify your reach. When influencers mention or link to your content, it can bring in new audiences and improve your site's authority.

Content Marketing:
○ Creating valuable and shareable content attracts natural backlinks. When people find your content helpful or interesting, they are more likely to link to it, contributing to your off-page SEO efforts.

Steps of Most Search Processes
1. Experience the need for an answer, a solution, or a piece of information.
2. Formulate that need in a string of words and phrases (the query).
3. Execute the query, check the results, see whether you got what you wanted and, if not, try a refined query.
The goal throughout: obtain information relevant to the inquiry.

Search Query
A search query is a phrase or a keyword combination users enter in search engines to find things of interest.
Search query types:
○ Navigational Search Query
○ Informational Search Query
○ Transactional Search Query

Informational
Informational search queries refer to searches where users are looking for information.

Navigational
People who perform navigational searches are usually looking to find a particular web page related to a specific brand or product. This could be a website, a social media account, or a blog post related to that particular brand or product.

Transactional
Transactional search queries refer to searches where users are looking to perform a transaction, usually with the intention of making a purchase. These queries often include keywords like "order", "buy", or "purchase". Sometimes, they will also include the specific brands or products they want to transact with or buy.

Search Engine Result Page
SERP layout:
1.
Search Query Box
○ The search engine shows the query you've performed and allows you to edit or re-enter a new query from the search results page.
2. Vertical Navigation
○ Each engine offers the option to search different verticals, such as images, news, video, or maps.
3. Result Information
○ This section provides a small amount of meta-information about the results that you're viewing, including an estimate of the number of pages relevant to that particular query.
4. Paid Search Advertising
○ Paid search results are marked with a small "Ad" icon in the top left corner of their snippet.
5. Organic (Natural/Algorithmic) Results
○ The organic results are determined by Google's complex algorithm (which has 200+ ranking signals). Even though Google's algorithm is top secret, they have publicly confirmed a few key ranking factors, including:
▪ Off-page SEO signals (the number of websites linking to a specific page, also known as "backlinks")
▪ On-page SEO signals (the keywords you use on your page)
▪ Site loading speed
▪ Brand presence and trust signals
6. Query Refinement Suggestions
○ The goal of these links is to let users search with a more specific and possibly more relevant query that will satisfy their intent.

Featured Snippets
Featured Snippets are a short section of content pulled from a webpage or video.

Knowledge Graph and Knowledge Panel
Knowledge Graphs and panels usually show up on the right side of the organic results.

Search Engine Architecture
How do search engines work? Search engines work through three primary functions:
1. Crawling: Search the Internet for content, looking over the code/content for each URL they find.
2. Indexing: Store and organize the content found during the crawling process. Once a page is in the index, it's in the running to be displayed as a result to relevant queries.
3. Ranking: Provide the pieces of content that will best answer a searcher's query, which means that results are ordered from most relevant to least relevant.
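The three functions above can be sketched in miniature. The following is a toy illustration only: the "web" here is an invented in-memory dictionary, whereas a real crawler fetches URLs over HTTP and follows the links in the HTML it downloads.

```python
# Toy sketch of a search engine's three functions over an invented "web".
# WEB maps each URL to (page text, list of outbound links).
WEB = {
    "a.com": ("chocolate chip cookies recipe", ["b.com"]),
    "b.com": ("cookies and milk", ["c.com"]),
    "c.com": ("gardening tips", []),
}

def crawl(start):
    """Crawling: discover pages by following links from a seed URL."""
    seen, frontier = set(), [start]
    while frontier:
        url = frontier.pop()
        if url in seen:
            continue
        seen.add(url)
        _, links = WEB[url]
        frontier.extend(links)
    return seen

def build_index(urls):
    """Indexing: map each word to the set of pages that contain it."""
    index = {}
    for url in urls:
        text, _ = WEB[url]
        for word in text.split():
            index.setdefault(word, set()).add(url)
    return index

def rank(index, query):
    """Ranking: order pages by how many query words they contain."""
    scores = {}
    for word in query.split():
        for url in index.get(word, set()):
            scores[url] = scores.get(url, 0) + 1
    return sorted(scores, key=scores.get, reverse=True)

index = build_index(crawl("a.com"))
print(rank(index, "chocolate cookies"))  # a.com ranks first: it matches both words
```

Real engines replace the word-count ranking with far richer relevancy signals, but the crawl-index-rank pipeline is the same shape.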
[Diagram: a crawler/spider feeds content to an indexer, which builds the indexes; the ranking algorithm draws on those indexes to order results.]

Crawling
Crawling is the discovery process in which search engines send out a team of robots (known as crawlers, spiders, or bots) to find new and updated content. Content can vary: it could be a webpage, an image, a video, a PDF, etc. Content is discovered by links.
Googlebot starts out by fetching a few web pages, and then follows the links on those webpages to find new URLs. By hopping along this path of links, the crawler is able to find new content and add it to the index.

Search Engine Index
Search engines process and store the information they find in an index, a huge database of all the content they've discovered and deem good enough to serve up to searchers.

Ranking
When someone performs a search, search engines search their index for highly relevant content and then order that content in the hope of solving the searcher's query. This ordering of search results by relevance is known as ranking. In general, you can assume that the higher a website is ranked, the more relevant the search engine believes that site is to the query.

Search Engine Types
Search engines are classified into the following four categories based on how they work:
1. Crawler-based search engines
2. Human-powered directories
3. Hybrid search engines
4. Meta search engines

Crawler-Based Search Engines
All crawler-based search engines use a crawler (also called a bot or spider) for crawling and indexing new content into the search database. There are four basic steps every crawler-based search engine follows before displaying any sites in the search results:
○ Crawling
○ Indexing
○ Calculating Relevancy
○ Retrieving the Result

1. Crawling
Search engines crawl the whole web to fetch the web pages available. A piece of software called a crawler, bot, or spider performs the crawling of the entire web. The crawling frequency depends on the search engine, and it may take a few days between crawls.
This is the reason you can sometimes see old or deleted page content showing in the search results. The search results will show the new, updated content once the search engines crawl your site again.

2. Indexing
Indexing is the next step after crawling; it is the process of identifying the words and expressions that best describe the page. The identified words are referred to as keywords, and the page is assigned to those keywords. Sometimes, when the crawler does not understand the meaning of your page, your site may rank lower in the search results. Here you need to optimize your pages for search engine crawlers to make sure the content is easily understandable. Once the crawlers pick up the correct keywords, your page will be assigned to those keywords and rank high in search results.

3. Calculating Relevancy
The search engine compares the search string in the search request with the indexed pages in its database. Since it is likely that more than one page contains the search string, the search engine starts calculating the relevancy of each of the pages in its index to the search string. There are various algorithms to calculate relevancy. Each of these algorithms has different relative weights for common factors like keyword density, links, or meta tags. That is why different search engines give different search results pages for the same search string. It is a known fact that all major search engines periodically change their algorithms. If you want to keep your site at the top, you also need to adapt your pages to the latest changes.

4. Retrieving Results
The last step in a search engine's activity is retrieving the results. Basically, this is simply displaying them in the browser in order: the search engine sorts the endless pages of search results from the most relevant to the least relevant sites.
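The point that different engines weight the same factors differently can be sketched as a weighted sum. The factor names, scores, and weights below are invented for illustration; real relevancy functions use hundreds of signals.

```python
# Toy relevancy scoring: two hypothetical engines assign different
# weights to the same on-page factors, so the same page scores differently.
def relevancy(page_factors, weights):
    """Weighted sum of per-factor scores (all values invented)."""
    return sum(weights[f] * page_factors.get(f, 0) for f in weights)

page = {"keyword_density": 0.4, "inbound_links": 0.9, "meta_tags": 0.2}

engine_a = {"keyword_density": 0.5, "inbound_links": 0.3, "meta_tags": 0.2}
engine_b = {"keyword_density": 0.2, "inbound_links": 0.7, "meta_tags": 0.1}

print(relevancy(page, engine_a))  # ~0.51
print(relevancy(page, engine_b))  # ~0.73: engine_b values inbound links more
```

Since each engine ranks by its own score, the same page can appear at different positions on different engines; this is exactly why their results pages differ for the same search string.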
Human-Powered Directories
Human-powered directories, also referred to as open directory systems, depend on human activity for listings. Below is how indexing in human-powered directories works:
○ The site owner submits a short description of the site to the directory, along with the category in which it is to be listed.
○ The submitted site is then manually reviewed and either added to the appropriate category or rejected for listing.
○ Keywords entered in a search box are matched with the descriptions of the sites. This means that changes made to the content of a web page are not taken into consideration, as it is only the description that matters.
○ A good site with good content is more likely to be reviewed for free compared to a site with poor content.

Hybrid Search Engines
Hybrid search engines use both crawler-based and manual indexing for listing sites in search results. Most crawler-based search engines, like Google, basically use crawlers as the primary mechanism and human-powered directories as a secondary mechanism. For example, Google may take the description of a webpage from a human-powered directory and show it in the search results. As human-powered directories are disappearing, hybrid types are becoming more and more crawler-based.
Still, manual filtering of search results happens to remove copied and spammy sites. When a site is identified for spammy activities, the website owner needs to take corrective action and resubmit the site to the search engines. Experts do a manual review of the submitted site before including it again in the search results. In this manner, though the crawlers control the process, monitoring is manual so that search results are shown naturally.

Meta Search Engines
Metasearch engines are a way to get a wide range of search results from different search engines.
The theory is that you can get a wider breadth of answers and information to inform your decisions better. In simple terms, a metasearch engine takes the query you've entered and gathers results from multiple search engines online, such as Google, Bing, Yahoo, and more. They aggregate the results for you so you can choose the best information from the search results provided.
Some service-based industries, such as airlines and hotel chains, use a form of metasearch engine. If you are searching for a hotel room in a city, you will likely find websites that will scour hotels in that city. The search engine will then return different results from searching specific hotel websites. You can sometimes even see the same hotel room for different prices at other websites due to your metasearch.

Ranking Factors
The factors that affect search engine rankings are grouped into the following general categories:
1. Visible on-page factors
2. Invisible on-page factors
3. Time-based factors
4. External factors

Visible On-Page Factors
The visible on-page factors covered here are the following:
1. Page title
2. Page headings
3. Page copy
4. Outbound links
5. Keywords in URLs and domain name
6. Internal link structure and anchors
7. Overall site topicality

Page Title
The page title is a string of text, defined by the contents of the <title> element in the <head> section of the HTML document. The title is visible both in the title bar of a browser window and as the headline of a search engine result. Make sure each page has a descriptive <title> tag and headings.
One of the biggest mistakes web developers make is to set the title for all pages on a web site to the same generic text. Frequently, this text is the company name and/or a slogan. In this case, at best your pages will be indexed poorly. At worst, the site could receive a penalty if the search engines see the pages as duplicate content. Be sure all pages on a dynamic site have unique and relevant titles.
When writing titles, it is also wise to insert some targeted keywords.

Page Headings
Page headings are sections of text set off from web page copy to indicate overall context and meaning. They are usually larger in size than the other copy within the document. They are typically created using <hx> tags in HTML, where x is a number between 1 and 6. They have been abused in the past to manipulate search rankings, but they are still an important on-page factor, and they also serve to help the user navigate a page.

Page Copy
It is intuitively clear that a page containing the keywords a user is looking for should be relevant to his or her search query. Search engine algorithms take this into account as well. Keyword insertion, however, should not be done to excess. Mentioning the keywords in various inflections (plural, singular, past, present, and so on) is likely beneficial, as is varying word order ("chocolate chip cookies" versus "cookies with chocolate chips").

Outbound Links
Search engines will evaluate the links that a document contains. A related link on a web page is valuable content in and of itself, and is treated as such by search engines. However, links to totally irrelevant or spam content can potentially hurt the rankings of a page. Linking to a "bad neighborhood" of spam sites, or even to lots of irrelevant sites, can hurt a site's rankings.

Keywords in URLs and Domain Name
It is likely that keywords contained in a URL, whether in the domain name or in the file name, have a minor but apparently positive effect on ranking. They also likely have an effect on CTR, because keywords in the URL may make a user more likely to click a link due to an increase in perceived relevance. The URL, like the page title, is also often selected as the anchor text for a link. This may have the same previously mentioned beneficial effect.
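As a quick illustration of keywords in URLs (both URLs here are invented), the first link tells the user, and the search engine, what the page is about, while the second says nothing:

```html
<!-- Hypothetical example: keyword-rich URL vs. opaque URL for the same page -->
<a href="https://example.com/recipes/chocolate-chip-cookies">Chocolate chip cookies</a>
<a href="https://example.com/page.php?id=8412">Chocolate chip cookies</a>
```

Note that when the raw URL itself is used as anchor text, only the first version carries any keyword signal.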
Internal Link Structure and Anchors
Search engines may make the assumption that pages not linked to within a web site's internal link structure are less important. Linking from the home page to content that you would like to rank can improve that page's rankings, as can linking to it from a sitemap and from various related content within the site.
Home Page ➪ Article Part 1 ➪ Article Part 2 ➪ Article Part 3 ➪ Article Part 4
Don't use simple pagination. Rather than a page with only "< prev" and "next >" links, also add links to the individual pages, that is, "< prev 1 2 3 4 next >." This creates a better navigation scheme to all pages. Add a sitemap with links to all the pages.

Invisible On-Page Factors
The invisible on-page factors covered here are the following:
1. Meta description
2. Meta keywords
3. Alt and title attributes
4. Page structure considerations

Meta Description
The importance of a meta description lies in the fact that search engines may choose to use it in the SERPs.
Example: <meta name="description" content="A brief summary of the page.">

Meta Keywords
It helps tell search engines what the topic of the page is.
Example: <meta name="keywords" content="keyword one, keyword two">

Alt Attribute
Example: <img src="photo.jpg" alt="A description of the image">

Page Structure Considerations
Search engines use block-level elements, for example <p>, <div>, or <td> elements, to group related text. Using block-level elements indiscriminately for layout may be harmful: <div>Dog</div> <div>food</div> is likely to be less relevant for "dog food" than <div>dog food</div>.

Time-Based Factors
The time-based factors covered here are the following:
1. Site and page age
2. Link age
3. Domain registration age

Site and Page Age
A website that has existed for many years ranks better than a new site. But it is also good to update the content on a page over time; it indicates that the site is active and includes fresh content.

Link Age
Links on other good sites pointing to a web site acquire more value over time.
A link actually appreciates in value.

Domain Registration Age
Although content quality and update frequency matter most, how long a domain has been running also influences a website's rank on Google. This is known as the domain registration age. Search engines may view a long-period domain name registration as an indication that a web site is not engaging in spam. Avoid domain names longer than 15 characters.

External Factors
The external factors covered here are the following:
○ Quantity, quality, and relevance of inbound links
○ Link churn
○ Number of links on a page
○ Link location
○ Semantic relationships among links on a page
○ IP addresses of cross-linked sites

Quantity, Quality, and Relevance of Inbound Links
A site with many inbound links is likely to be relevant, but the quality of each inbound link is also a concern. Search engines struggle to identify quality, and they use very complex algorithms to do so.

Link Churn
Links that appear and disappear on pages may be judged to be part of a linking scheme. The rate at which these links appear and disappear is known as link churn. In the worst case, your website will be regarded as spam and penalized.

Number of Links on a Page
A link on a page with few outbound links is generally worth more. This concept is also implied by the formula for Google's PageRank.

Semantic Relationships among Links on a Page
Search engines may assume that a page with many links to pages that are not semantically related to it is a trick to manipulate rankings.

Link Location
A link presented in content near the center of the page may be regarded by search engines as more important. Links embedded in content near the bottom of a page are usually less important.

IP Addresses of Cross-Linked Sites
When sites are interlinked with many links that come from similar IP addresses, they will be considered suspicious, and those links may be devalued.
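The PageRank idea referenced above (a link from a page with few outbound links passes more value) can be sketched with the classic iterative formula PR(A) = (1 - d) + d * sum(PR(T)/C(T)), where T ranges over pages linking to A, C(T) is T's outbound link count, and d is the damping factor. The three-page link graph below is invented for illustration:

```python
# Minimal PageRank sketch over an invented three-page link graph.
# links[p] lists the pages p links out to; d is the usual damping factor.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
d = 0.85

pr = {page: 1.0 for page in links}       # initial scores
for _ in range(50):                      # iterate toward the fixed point
    new = {}
    for page in links:
        # sum PR(T)/C(T) over every page T that links to `page`
        inbound = sum(pr[t] / len(out) for t, out in links.items() if page in out)
        new[page] = (1 - d) + d * inbound
    pr = new

print({p: round(v, 3) for p, v in pr.items()})
# "c" ends up highest: it is linked from both "a" and "b", and "b"
# passes its full score through a single outbound link.
```

Note how "b" receives the least score even though "a" links to it: "a" splits its vote across two outbound links, which is exactly the "number of links on a page" effect described above.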
SEO Techniques

Black Hat SEO
Black hat SEO refers to any practices aimed at increasing a website's ranking in search results that violate search engine policies. Black hat SEO attempts to manipulate search engines and send organic search traffic to low-quality or even malicious websites. Although Google has made significant progress in fighting spam and questionable SEO tactics through its algorithms, black hat SEO continues to exist and can be effective in certain industries. However, using such tactics carries substantial risks, such as facing manual actions (penalties) or algorithmic actions if the website is detected engaging in these practices.

Black Hat SEO Techniques
○ Keyword Stuffing
○ Doorway Pages
○ Hidden Text or Links
○ Link Farms
○ Cloaking

Keyword Stuffing
Keyword stuffing involves filling up a page with the phrase you want to rank for. While keywords help search engines understand what your content is about, overusing them can land you in trouble. Using the same terms repeatedly can make your text sound unnatural, and it can also make your site look spammy. Search engines are very good at detecting keyword stuffing, and they penalize offenders by not displaying their content. Therefore, you'll want to use keywords moderately and naturally.

Doorway Pages
Doorways are websites or pages created to rank for similar keywords and direct users to the same content. For example, a website owner may publish several pages targeted at specific regions that all direct users to the same landing page. With this setup, users who look up a particular search query may see several similar pages in the results. When they click on any of these options, they will be redirected to a different (and often unrelated) page. This practice goes against Google's guidelines and may have harmful consequences for your website.

Link Farms
A link farm is a website or a collection of websites developed solely for the purpose of link building.
Each website links out to the site or sites they want to rank higher on search engines. Link farms often have low-quality content and lots of links. The links normally contain the keyword they want the site to rank for in the anchor text. Search engines like Google can easily detect link farms, and using them should be avoided. Instead, you should use white hat SEO tactics like creating amazing content, graphs, data, interviews, or any other content that allows you to acquire backlinks naturally over time.

Hidden Text
Text which search engines can view but readers can't is known as hidden text. This technique is used to incorporate irrelevant keywords and hide text or links to increase keyword density or improve internal link structure. Some of the ways to hide text are to set the font size to zero, create white text on a white background, etc.

Cloaking
Cloaking involves showing one piece of content to users and a different piece of content to search engines. Websites practicing black hat SEO will do this in order to make content rank for a variety of terms irrelevant to their content. Spam websites will often do this to try to avoid a search engine bot finding out about the spam content they serve to users.

White Hat SEO
The term "white hat SEO" refers to SEO tactics that are in line with the terms and conditions of the major search engines, including Google. White hat SEO is the opposite of black hat SEO. Generally, white hat SEO refers to any practice that improves your search rankings on a search engine results page (SERP) while maintaining the integrity of your website and staying within the search engines' terms of service. These tactics stay within the bounds defined by Google.
Examples of white hat SEO include:
○ Offering quality content and services
○ Fast site loading times and mobile-friendliness
○ Using descriptive, keyword-rich meta tags
○ Making your site easy to navigate

White Hat SEO Techniques
○ Web Feed
○ Social Bookmarking
○ Link Bait
○ Traditional and Search Engine Sitemaps

Social Bookmarking
Social bookmarking is an online service to store, organize, and manage bookmarks of web pages for future reference. The web pages bookmarked at social bookmarking sites are considered quality pages by search engines. It is one of the easiest ways to earn some good links for your website: find some good social bookmarking sites and submit your website for bookmarking. There are many high-authority and high-PR sites available for social bookmarking.

Sitemap
A sitemap is a file that tells search engines like Google what pages you have on your website. It helps them find and index your site. Sitemaps are available in extensible markup language (XML) and hypertext markup language (HTML) formats. While sitemaps are typically created for crawling purposes, companies also build sitemaps when they're planning their website architecture.

Why Are Sitemaps Important?
When search engine bots crawl your site, they follow links to discover pages. But sometimes they can miss a few nooks and crannies, especially if your site is large or has complex navigation. That's where sitemaps come to the rescue. By creating a sitemap, you're giving search engines a handy directory of all your pages. Your pages need to be found before they can rank in search results, and sitemaps help with that. There are two main kinds of sitemaps: XML and HTML.

HTML/Traditional Sitemap
An HTML sitemap is a page on your website listing all important website pages. It serves as a table of contents and helps both search engine bots and human visitors easily navigate through your site. HTML sitemaps are designed primarily for users.
They provide a handy overview of your website's structure and allow visitors to find specific pages quickly.

XML/Search Engine Sitemap
An XML sitemap is a file that lists all the pages on your website, which makes it easier for search engines to crawl and index your content. XML sitemaps are written for search engine bots, not users. Along with the list of pages, an XML sitemap can also include other technical details, like when the page was last modified, how frequently the page content is likely to change, and the page's priority relative to other pages on the site (indicated on a scale ranging from 0.0 to 1.0).

What Are the Differences Between XML Sitemaps and HTML Sitemaps?
XML sitemaps are:
○ Intended for search engines
○ Written in XML code
○ Able to include URLs in any order
○ Not designed for human readability or navigation
HTML sitemaps are:
○ Intended for users
○ Created in HTML and displayed as webpages
○ Helpful for providing a structured list of links to pages within the site
○ Designed for human readability and navigation, though they can also be used by search engines for crawling.
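A minimal XML sitemap entry, showing the technical details described above, looks like this (the URL and values are invented; the tag names come from the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/recipes/chocolate-chip-cookies</loc>
    <lastmod>2024-01-15</lastmod>        <!-- when the page was last modified -->
    <changefreq>monthly</changefreq>     <!-- how often the content is likely to change -->
    <priority>0.8</priority>             <!-- relative priority, 0.0 to 1.0 -->
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>`, `<changefreq>`, and `<priority>` are the optional hints mentioned above, and search engines treat them as suggestions rather than commands.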