SEO Glossary
Every SEO term explained in plain language. 58+ terms with real definitions, not jargon-filled summaries.
Alt Text
Alt text (alternative text) is a written description of an image that appears in the HTML code. Search engines use alt text to understand image content since they cannot "see" images, and screen readers use it to describe images to visually impaired users. Good alt text is concise, descriptive, and includes relevant keywords naturally.
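A quick way to spot images missing alt text is to scan the HTML. Here's an illustrative sketch using Python's standard-library parser (the sample markup and filenames are made up):

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collects the src of every <img> tag that has no alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:
                self.missing.append(attrs.get("src", "(no src)"))

finder = MissingAltFinder()
finder.feed('<img src="shoe.jpg" alt="Red trail running shoe"><img src="logo.png">')
print(finder.missing)  # images that still need alt text
```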
Anchor Text
Anchor text is the visible, clickable text in a hyperlink. Search engines use anchor text to understand what the linked page is about. For example, if many sites link to a page with the anchor text "best running shoes," Google interprets that page as relevant for that topic. Over-optimized anchor text (using exact-match keywords repeatedly) can trigger spam penalties.
Backlink
A backlink is a link from one website to another. Backlinks are one of Google's most important ranking factors because they act as "votes of confidence" from other sites. A backlink from a high-authority, relevant website is significantly more valuable than hundreds of links from low-quality or unrelated sites.
Black Hat SEO
Black hat SEO refers to aggressive techniques that violate search engine guidelines to manipulate rankings. Common tactics include keyword stuffing, cloaking (showing different content to search engines and users), link schemes, and hidden text. While these methods may produce short-term gains, they risk severe penalties including complete removal from search results.
Bounce Rate
Bounce rate is the percentage of visitors who leave your site after viewing only one page without taking any further action. A high bounce rate can signal that your content doesn't match user intent, your page loads too slowly, or your design is off-putting. However, a high bounce rate isn't always bad -- for single-page resources like recipes or definitions, it can be normal.
Broken Link
A broken link (also called a dead link) is a hyperlink that leads to a page that no longer exists, returning a 404 error. Broken links hurt SEO by wasting crawl budget, creating poor user experiences, and leaking link equity. Regularly auditing and fixing broken links is an essential part of technical SEO maintenance.
Canonical URL
A canonical URL is an HTML element (rel="canonical") that tells search engines which version of a page is the "master" copy when multiple URLs have identical or very similar content. This prevents duplicate content issues caused by URL parameters, HTTP/HTTPS variants, or www/non-www versions. Proper canonical tags consolidate link equity to a single URL.
Click-Through Rate (CTR)
Click-through rate is the percentage of people who click on your link after seeing it in search results. It's calculated by dividing clicks by impressions. A higher CTR signals to search engines that your result is relevant and useful. You can improve CTR by writing compelling title tags and meta descriptions that match search intent.
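The calculation is simple enough to sketch in a few lines of Python (the click and impression counts are hypothetical):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage: clicks / impressions * 100."""
    if impressions == 0:
        return 0.0
    return clicks / impressions * 100

# A result shown 2,000 times and clicked 90 times:
print(round(ctr(90, 2000), 1))  # 4.5
```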
Content Quality
Content quality refers to how well your page content serves the user's search intent. Google evaluates factors like depth of coverage, originality, readability, accuracy, freshness, and whether the content provides unique value compared to competing pages. High-quality content comprehensively answers the user's question and provides actionable information.
Core Web Vitals
Core Web Vitals are a set of three specific metrics Google uses to evaluate user experience: Largest Contentful Paint (LCP) measures loading speed, Interaction to Next Paint (INP) measures interactivity, and Cumulative Layout Shift (CLS) measures visual stability. These metrics have been an official Google ranking factor since 2021 and are measured in the field from real-user data; INP replaced First Input Delay (FID) as the interactivity metric in 2024.
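Google's published "good" thresholds (LCP up to 2.5 s, INP up to 200 ms, CLS up to 0.1) can be checked with a small sketch like this (the metric values passed in are hypothetical):

```python
def core_web_vitals_pass(lcp_s: float, inp_ms: float, cls: float) -> dict:
    """Pass/fail verdict per metric against Google's 'good' thresholds:
    LCP <= 2.5 s, INP <= 200 ms, CLS <= 0.1."""
    return {
        "LCP": lcp_s <= 2.5,
        "INP": inp_ms <= 200,
        "CLS": cls <= 0.1,
    }

print(core_web_vitals_pass(2.1, 180, 0.05))  # all three pass
print(core_web_vitals_pass(3.4, 250, 0.02))  # LCP and INP fail
```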
Crawl Budget
Crawl budget is the number of pages a search engine bot will crawl on your site within a given timeframe. For large websites (10,000+ pages), managing crawl budget is critical -- if Google wastes time crawling low-value pages, important content may not get indexed. Factors like site speed, duplicate content, and server errors all affect crawl budget efficiency.
Crawler (Bot/Spider)
A crawler (also called a bot or spider) is an automated program that systematically browses the web to discover and index content. Google's main crawler is called Googlebot. Crawlers follow links from page to page, read content, and report back to the search engine's index. Your robots.txt file controls which parts of your site crawlers can access.
Duplicate Content
Duplicate content refers to blocks of content that appear on more than one URL, either within the same site or across different sites. Search engines struggle to determine which version to index and rank, which can dilute ranking signals across all versions. Common causes include URL parameters, printer-friendly pages, and content syndication without proper canonical tags.
E-E-A-T
E-E-A-T stands for Experience, Expertise, Authoritativeness, and Trustworthiness. It's a framework from Google's Search Quality Rater Guidelines used to evaluate content quality. Google wants to see that content creators have first-hand experience with the topic, demonstrable expertise, recognition from other authorities, and that the site is trustworthy. E-E-A-T is especially critical for YMYL (Your Money or Your Life) topics.
External Link
An external link is a hyperlink that points from one website to a different website. External links serve two purposes: outbound links to authoritative sources can boost your content's credibility, while inbound external links (backlinks) from other sites signal to Google that your content is valuable. The quality of the linking domain matters far more than the quantity of links.
Featured Snippet
A featured snippet is a special search result that appears at the top of Google's organic results in a box, providing a direct answer to the user's query. Types include paragraph snippets, list snippets, table snippets, and video snippets. Winning a featured snippet can dramatically increase visibility and click-through rates, even if your page doesn't rank #1 organically.
Fetch and Render
Fetch and Render is a tool (now part of Google Search Console's URL Inspection) that lets you see how Googlebot crawls and renders your page. It shows you both the raw HTML and the rendered version, helping identify issues where JavaScript content might not be visible to search engines. This is critical for sites using client-side rendering frameworks.
Google Algorithm
Google's algorithm is the complex system used to retrieve data from its search index and deliver the best possible results for a query. It considers hundreds of ranking factors including relevance, quality, usability, and context. Major algorithm updates (like Panda, Penguin, and Helpful Content) can significantly impact rankings, making it important to follow SEO best practices rather than trying to game the system.
Google Search Console
Google Search Console (GSC) is a free tool from Google that helps website owners monitor and troubleshoot their site's presence in Google Search results. It provides data on search queries, impressions, clicks, indexing status, Core Web Vitals, mobile usability, and security issues. GSC is essential for any serious SEO effort.
H1 Tag
The H1 tag is the main heading of a webpage and typically the most prominent text on the page. Every page should have exactly one H1 tag that clearly describes the page's main topic and ideally includes the primary target keyword. While the H1 tag is not as powerful a ranking factor as it once was, it helps search engines and users quickly understand what a page is about.
HTTPS
HTTPS (HyperText Transfer Protocol Secure) is the encrypted version of HTTP, using SSL/TLS certificates to secure data transmitted between a user's browser and the website. Google has confirmed HTTPS as a ranking signal since 2014, and modern browsers display "Not Secure" warnings for HTTP pages. All websites should use HTTPS to protect user data and maintain search rankings.
Hreflang
Hreflang is an HTML attribute that tells search engines which language and regional version of a page to show to users in different locations. For example, a site with both English and Spanish versions would use hreflang tags to ensure Spanish-speaking users see the Spanish page. Incorrect hreflang implementation is one of the most common technical SEO errors on multilingual sites.
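A sketch of what hreflang link tags look like for a page with English, Spanish, and default versions (the URLs are placeholders; `x-default` tells Google which version to show when no language matches):

```python
# Hypothetical language versions of one page.
versions = {
    "en": "https://example.com/en/page",
    "es": "https://example.com/es/page",
    "x-default": "https://example.com/en/page",
}

# Each version gets a <link rel="alternate"> tag in the page <head>.
tags = [
    f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
    for lang, url in versions.items()
]
print("\n".join(tags))
```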
Index/Indexing
Indexing is the process by which search engines store and organize web page content in their database (index). When Google indexes a page, it means the page has been crawled, processed, and added to its search database, making it eligible to appear in search results. Pages that are not indexed cannot rank. You can check index status in Google Search Console.
Internal Link
An internal link is a hyperlink that connects one page of a website to another page on the same website. Internal links help search engines discover new pages, understand site structure, and distribute link equity (PageRank) throughout the site. A strategic internal linking structure ensures your most important pages receive the most link authority.
JSON-LD (Structured Data)
JSON-LD (JavaScript Object Notation for Linked Data) is Google's recommended format for adding structured data to web pages. Structured data helps search engines understand the content and context of a page, enabling rich results like star ratings, FAQ dropdowns, recipe cards, and event listings in search results. Common schema types include Article, Product, FAQ, and Organization.
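A minimal sketch of what JSON-LD looks like in practice, using Python to build a hypothetical Article object (all field values are placeholders):

```python
import json

# Hypothetical article data -- names and dates are placeholders.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "SEO Glossary",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2024-01-15",
}

# JSON-LD lives in a <script type="application/ld+json"> block in the page.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(article, indent=2)
    + "\n</script>"
)
print(snippet)
```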
Keyword
A keyword is a word or phrase that users type into search engines to find information. In SEO, keywords are the terms you optimize your content to rank for. Effective keyword strategy involves researching what your target audience searches for, understanding search intent behind those queries, and naturally incorporating keywords into your content, titles, and meta tags.
Keyword Density
Keyword density is the percentage of times a keyword appears on a page relative to the total word count. For example, if a 1,000-word article mentions a keyword 15 times, the keyword density is 1.5%. While there's no "perfect" density, modern SEO focuses on natural language use rather than hitting a specific number. Search engines are sophisticated enough to understand topics without exact keyword repetition.
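The formula from the example above, sketched in Python (the sample text is synthetic):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of keyword divided by total word count, as a percentage."""
    words = re.findall(r"[A-Za-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) * 100

# 15 mentions in a 1,000-word text -> 1.5%, matching the example above.
sample = "run " * 15 + "pad " * 985
print(keyword_density(sample, "run"))  # 1.5
```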
Keyword Stuffing
Keyword stuffing is the practice of overloading a page with target keywords in an attempt to manipulate search rankings. This outdated tactic includes repeating keywords unnaturally, hiding keywords in the page code, or using irrelevant keywords. Google's algorithms can easily detect keyword stuffing and will penalize pages that do it, often dropping them significantly in rankings or removing them entirely.
Link Building
Link building is the process of acquiring hyperlinks from other websites to your own. It's one of the most challenging but rewarding aspects of SEO. Effective link building strategies include creating exceptional content that naturally attracts links, guest posting on relevant sites, digital PR, broken link building, and building relationships with industry influencers. Quality always trumps quantity.
Link Equity (Link Juice)
Link equity (informally called "link juice") is the value or authority passed from one page to another through hyperlinks. When a high-authority page links to your site, it passes some of its authority to you. Factors affecting link equity include the linking page's authority, the relevance of the linking site, the number of other outbound links on the page, and whether the link is followed or nofollowed.
LLM Optimization
LLM Optimization (also called GEO, short for Generative Engine Optimization) is the practice of optimizing content to be cited and referenced by AI language models like ChatGPT, Google AI Overviews, and Perplexity. This involves structuring content with clear question-answer formats, using authoritative data with citations, and making information easily extractable by AI systems.
Long-Tail Keyword
A long-tail keyword is a specific, usually longer search phrase that gets lower search volume but typically has higher conversion rates and less competition. For example, "shoes" is a head term while "best waterproof trail running shoes for women" is a long-tail keyword. Long-tail keywords make up the majority of all searches and are often easier to rank for.
Meta Description
A meta description is an HTML attribute that provides a brief summary of a page's content, displayed as the snippet text below the title in search results. While not a direct ranking factor, a well-written meta description (150-160 characters) can significantly improve click-through rates by compelling users to choose your result over competitors.
Mixed Content
Mixed content occurs when an HTTPS page loads resources (images, scripts, stylesheets) over insecure HTTP connections. This creates security vulnerabilities and triggers browser warnings that erode user trust. Modern browsers may block mixed content entirely, causing broken images or non-functional features. All resources on HTTPS pages should be loaded via HTTPS.
Mobile-First Indexing
Mobile-first indexing means Google primarily uses the mobile version of your website for indexing and ranking. Since most users now search on mobile devices, Google evaluates the mobile experience as the baseline. This means your mobile site must have the same content, structured data, and meta tags as your desktop version, and must provide a good user experience on small screens.
Nofollow
Nofollow is a link attribute (rel="nofollow") that tells search engines not to pass link equity through a specific hyperlink. It was originally created to combat comment spam. Google now treats nofollow as a "hint" rather than a directive. Related attributes include rel="sponsored" for paid links and rel="ugc" for user-generated content like forum posts and comments.
Noindex
Noindex is a directive that tells search engines not to include a specific page in their search index. It can be implemented via a meta robots tag in the HTML or an X-Robots-Tag HTTP header. Common uses include preventing indexation of thin content, duplicate pages, admin areas, and staging environments. A page with noindex will not appear in search results.
Open Graph
Open Graph is a protocol created by Facebook that controls how URLs are displayed when shared on social media. Open Graph meta tags specify the title, description, image, and URL that appear in social shares. Properly configured OG tags ensure your content looks professional and compelling when shared on Facebook, LinkedIn, Slack, and other platforms that support the protocol.
Organic Traffic
Organic traffic refers to visitors who reach your website by clicking on unpaid search results, as opposed to paid advertising, social media, or direct visits. Organic traffic is considered the most valuable traffic source because it's free, sustainable, and indicates that your content matches real user search intent. Growing organic traffic is the primary goal of SEO.
Outbound Link
An outbound link is a hyperlink on your page that points to a page on a different website. Outbound links to authoritative, relevant sources can boost your content's credibility and help search engines understand your page's topic. However, linking to spammy or low-quality sites can hurt your rankings. Always review your outbound links for quality and relevance.
Page Speed
Page speed measures how quickly a web page loads its content. It's both a Google ranking factor and a critical user experience metric. Slow pages lead to higher bounce rates, lower conversions, and reduced crawl efficiency. Key metrics include Time to First Byte (TTFB), First Contentful Paint (FCP), and Largest Contentful Paint (LCP). Optimization techniques include image compression, code minification, and CDN usage.
PageRank
PageRank is Google's original algorithm for ranking web pages based on the quantity and quality of links pointing to them. Named after Google co-founder Larry Page, it treats each link as a "vote" for a page, with votes from high-authority pages carrying more weight. While Google no longer publicly shares PageRank scores, the underlying concept of link-based authority still drives modern rankings.
Redirect (301/302)
A redirect automatically sends users and search engines from one URL to another. A 301 redirect is permanent and passes most link equity to the new URL, while a 302 redirect is temporary and may not transfer link equity. Redirects are essential when moving or restructuring content. Redirect chains (A redirects to B redirects to C) should be avoided as they slow crawling and dilute link equity.
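Chain detection can be sketched as a walk over a redirect map (the URLs and the mapping here are hypothetical; a real audit would build the map from HTTP responses):

```python
def redirect_chain(start: str, redirects: dict) -> list:
    """Follow a URL through a redirect map and return the full hop sequence."""
    chain = [start]
    seen = {start}
    while chain[-1] in redirects:
        nxt = redirects[chain[-1]]
        chain.append(nxt)
        if nxt in seen:  # redirect loop -- stop here
            break
        seen.add(nxt)
    return chain

# A -> B -> C is a chain that should be collapsed to a single A -> C redirect.
hops = redirect_chain("/old", {"/old": "/interim", "/interim": "/new"})
print(hops)  # ['/old', '/interim', '/new'] -- two hops, worth flattening
```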
Robots.txt
Robots.txt is a text file placed in a website's root directory that tells search engine crawlers which pages or sections they are allowed or not allowed to crawl. It's used to manage crawl budget, prevent indexing of sensitive areas, and point crawlers to sitemaps. Note that robots.txt is a suggestion, not a security measure -- pages blocked by robots.txt can still appear in search results if other sites link to them.
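Python's standard library ships a robots.txt parser, which makes the allow/disallow logic easy to sketch (the rules and URLs below are a hypothetical example, parsed from a string rather than fetched):

```python
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))    # True
print(rp.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
```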
Schema Markup
Schema markup is a structured data vocabulary (from schema.org) that you add to your HTML to help search engines understand your content's meaning and context. It enables rich results in search, such as star ratings, event dates, recipe details, and FAQ accordions. Implementing schema markup can improve click-through rates by making your search listings more visually appealing and informative.
SERP
SERP stands for Search Engine Results Page -- the page displayed by a search engine in response to a query. Modern SERPs include organic results, paid ads, featured snippets, knowledge panels, image packs, video carousels, "People Also Ask" sections, and local pack results. Understanding SERP features for your target keywords helps you optimize content for maximum visibility.
Sitemap (XML)
An XML sitemap is a file that lists all the important pages on your website, helping search engines discover and crawl your content efficiently. It includes metadata like when a page was last modified and how frequently it changes. Sitemaps are especially important for large sites, new sites with few backlinks, and sites with pages not well-linked through internal navigation.
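A minimal sitemap can be sketched with Python's standard XML tools (the URLs and dates are placeholders):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

# Hypothetical pages; lastmod uses the W3C YYYY-MM-DD date format.
for loc, lastmod in [
    ("https://example.com/", "2024-06-01"),
    ("https://example.com/seo-glossary", "2024-06-15"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

xml_out = ET.tostring(urlset, encoding="unicode")
print(xml_out)
```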
Spam Score
Spam score is a metric that estimates the likelihood that a website will be penalized or banned by search engines based on characteristics commonly found in spam sites. Factors include thin content, excessive advertising, suspicious link profiles, hidden text, and domain age. A high spam score doesn't guarantee a penalty, but it indicates elevated risk that should be investigated.
Title Tag
The title tag is an HTML element that specifies the title of a web page. It appears in browser tabs, search engine results, and social media shares. Title tags are one of the most important on-page SEO elements. Best practices include keeping them under 60 characters, placing the primary keyword near the beginning, making each title unique, and writing compelling copy that encourages clicks.
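The length check is easy to sketch in Python (the 60-character figure is a guideline for avoiding truncation, not a hard cutoff, and the sample title is made up):

```python
def check_title(title: str, limit: int = 60) -> dict:
    """Flag titles that risk being truncated in search results."""
    return {
        "length": len(title),
        "within_limit": len(title) <= limit,
    }

print(check_title("SEO Glossary: 58+ Terms Explained in Plain Language"))
# fits under the 60-character guideline
```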
Toxic Backlink
A toxic backlink is a link from a spammy, low-quality, or suspicious website that can harm your search rankings. Common sources include link farms, hacked websites, irrelevant directories, and sites with manipulative link schemes. Google's Penguin algorithm targets sites with unnatural link profiles. If you have toxic backlinks, you can disavow them through Google Search Console.
URL Structure
URL structure refers to how your page URLs are formatted and organized. SEO-friendly URLs are short, descriptive, include relevant keywords, use hyphens to separate words, and follow a logical hierarchy. Good URL structure helps both search engines and users understand what a page is about before clicking. Avoid long URLs with random parameters, session IDs, or unnecessary subdirectories.
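A typical slug-building routine can be sketched like this (a simplified illustration, not a complete slugifier; it ignores accented characters, for example):

```python
import re

def slugify(title: str) -> str:
    """Lowercase, replace runs of non-alphanumerics with hyphens, trim ends."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

print(slugify("Best Waterproof Trail-Running Shoes (2024)!"))
# best-waterproof-trail-running-shoes-2024
```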
User Experience (UX)
User Experience (UX) encompasses every aspect of how a user interacts with your website, including page speed, navigation, readability, mobile-friendliness, and visual design. Google increasingly uses UX signals (like Core Web Vitals and mobile usability) as ranking factors. Sites that provide excellent user experiences tend to have lower bounce rates, longer session durations, and higher conversion rates.
White Hat SEO
White hat SEO refers to optimization strategies that follow search engine guidelines and focus on providing genuine value to users. Techniques include creating high-quality content, earning natural backlinks, optimizing page speed, using proper meta tags, and building a great user experience. White hat SEO produces sustainable, long-term results without the risk of penalties.
X-Robots-Tag
The X-Robots-Tag is an HTTP response header that provides crawling and indexing directives to search engines, similar to the meta robots tag but applied at the server level. It's particularly useful for controlling indexation of non-HTML files like PDFs, images, and documents that cannot contain meta tags. Directives include noindex, nofollow, noarchive, and nosnippet.
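A sketch of how a header check might look, using a hypothetical headers dict in place of a real HTTP response:

```python
# Hypothetical response headers for a PDF file; in practice these come
# from an HTTP client's response object.
headers = {"Content-Type": "application/pdf", "X-Robots-Tag": "noindex, nofollow"}

directives = [d.strip() for d in headers.get("X-Robots-Tag", "").split(",") if d.strip()]
print("noindex" in directives)  # True -- this PDF will not be indexed
```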
Put Your Knowledge Into Practice
Now that you understand the terms, run a free audit to see how your website measures up across all 19 SEO checks.
Start Free SEO Audit