SEO Reference

SEO Glossary

Every SEO term explained in plain language. 58+ terms with real definitions, not jargon-filled summaries.

A

Alt Text

Alt text (alternative text) is a written description of an image that appears in the HTML code. Search engines use alt text to understand image content since they cannot "see" images, and screen readers use it to describe images to visually impaired users. Good alt text is concise, descriptive, and includes relevant keywords naturally.
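
For illustration, an image tag with descriptive alt text might look like this (the file name and wording are made up):

<img src="waterproof-trail-shoes.jpg" alt="Pair of waterproof trail running shoes on a muddy forest path">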

Anchor Text

Anchor text is the visible, clickable text in a hyperlink. Search engines use anchor text to understand what the linked page is about. For example, if many sites link to a page with the anchor text "best running shoes," Google interprets that page as relevant for that topic. Over-optimized anchor text (using exact-match keywords repeatedly) can trigger spam penalties.
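
As a sketch, a link with descriptive anchor text might look like this (the URL and wording are illustrative); the text between the opening and closing tags is the anchor text:

<a href="https://example.com/guides/running-shoes">best running shoes for beginners</a>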

Check your anchor text with Internal Links Checker

Authority (Domain Authority)

Authority refers to a website's perceived credibility and trustworthiness in the eyes of search engines. Domain Authority (DA) is a metric developed by Moz, scored from 1 to 100, that predicts how well a site will rank in search results. Higher-authority sites generally rank better and pass more value through their outbound links.

Analyze your authority with Backlinks Checker

B

Black Hat SEO

Black hat SEO refers to aggressive techniques that violate search engine guidelines to manipulate rankings. Common tactics include keyword stuffing, cloaking (showing different content to search engines and users), link schemes, and hidden text. While these methods may produce short-term gains, they risk severe penalties including complete removal from search results.

Bounce Rate

Bounce rate is the percentage of visitors who leave your site after viewing only one page without taking any further action. A high bounce rate can signal that your content doesn't match user intent, your page loads too slowly, or your design is off-putting. However, a high bounce rate isn't always bad -- for single-page resources like recipes or definitions, it can be normal.

C

Canonical URL

A canonical URL is the preferred version of a page, declared with an HTML link element (rel="canonical") that tells search engines which version is the "master" copy when multiple URLs have identical or very similar content. This prevents duplicate content issues caused by URL parameters, HTTP/HTTPS variants, or www/non-www versions. Proper canonical tags consolidate link equity to a single URL.
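
A canonical tag placed in a page's <head> might look like this (the URL is illustrative):

<link rel="canonical" href="https://example.com/products/blue-widget">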

Check your canonical URLs

Click-Through Rate (CTR)

Click-through rate is the percentage of people who click on your link after seeing it in search results. It's calculated by dividing clicks by impressions. A higher CTR signals to search engines that your result is relevant and useful. You can improve CTR by writing compelling title tags and meta descriptions that match search intent.
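
As a worked example with made-up numbers: a result shown 2,000 times (impressions) that receives 50 clicks has a CTR of 50 / 2,000 = 2.5%.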

Optimize your meta tags for better CTR

Content Quality

Content quality refers to how well your page content serves the user's search intent. Google evaluates factors like depth of coverage, originality, readability, accuracy, freshness, and whether the content provides unique value compared to competing pages. High-quality content comprehensively answers the user's question and provides actionable information.

Score your content quality

Core Web Vitals

Core Web Vitals are a set of three specific metrics Google uses to evaluate user experience: Largest Contentful Paint (LCP) measures loading speed, Interaction to Next Paint (INP) measures interactivity, and Cumulative Layout Shift (CLS) measures visual stability. These metrics have been an official Google ranking factor since 2021 and are measured in the field from real users.
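
As a rough guide, Google's published "good" thresholds are an LCP under 2.5 seconds, an INP under 200 milliseconds, and a CLS under 0.1, measured at the 75th percentile of page loads.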

Check your Core Web Vitals

Crawl Budget

Crawl budget is the number of pages a search engine bot will crawl on your site within a given timeframe. For large websites (10,000+ pages), managing crawl budget is critical -- if Google wastes time crawling low-value pages, important content may not get indexed. Factors like site speed, duplicate content, and server errors all affect crawl budget efficiency.

Optimize crawl budget with Robots.txt Checker

Crawler (Bot/Spider)

A crawler (also called a bot or spider) is an automated program that systematically browses the web to discover and index content. Google's main crawler is called Googlebot. Crawlers follow links from page to page, read content, and report back to the search engine's index. Your robots.txt file controls which parts of your site crawlers can access.

Check your robots.txt configuration

D

Domain Authority (DA)

Domain Authority is a search engine ranking score developed by Moz that predicts how likely a website is to rank in search engine result pages (SERPs). Scored from 1 to 100, it's calculated based on factors like the number and quality of backlinks, the age of the domain, and overall trust signals. While not a Google metric, it's widely used as a comparative benchmark.

Check your domain authority signals

Duplicate Content

Duplicate content refers to blocks of content that appear on more than one URL, either within the same site or across different sites. Search engines struggle to determine which version to index and rank, which can dilute ranking signals across all versions. Common causes include URL parameters, printer-friendly pages, and content syndication without proper canonical tags.

Fix duplicate content with Canonical URL Checker

E

E-E-A-T

E-E-A-T stands for Experience, Expertise, Authoritativeness, and Trustworthiness. It's a framework from Google's Search Quality Rater Guidelines used to evaluate content quality. Google wants to see that content creators have first-hand experience with the topic, demonstrable expertise, recognition from other authorities, and that the site is trustworthy. E-E-A-T is especially critical for YMYL (Your Money or Your Life) topics.

Evaluate your E-E-A-T signals

F

Fetch and Render

Fetch and Render is a tool (now part of Google Search Console's URL Inspection) that lets you see how Googlebot crawls and renders your page. It shows you both the raw HTML and the rendered version, helping identify issues where JavaScript content might not be visible to search engines. This is critical for sites using client-side rendering frameworks.

G

Google Algorithm

Google's algorithm is the complex system used to retrieve data from its search index and deliver the best possible results for a query. It considers hundreds of ranking factors including relevance, quality, usability, and context. Major algorithm updates (like Panda, Penguin, and Helpful Content) can significantly impact rankings, making it important to follow SEO best practices rather than trying to game the system.

Google Search Console

Google Search Console (GSC) is a free tool from Google that helps website owners monitor and troubleshoot their site's presence in Google Search results. It provides data on search queries, impressions, clicks, indexing status, Core Web Vitals, mobile usability, and security issues. GSC is essential for any serious SEO effort.

H

H1 Tag

The H1 tag is the main heading of a webpage and typically the most prominent text on the page. Every page should have exactly one H1 tag that clearly describes the page's main topic and ideally includes the primary target keyword. While the H1 tag is not as powerful a ranking factor as it once was, it helps search engines and users quickly understand what a page is about.
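
For example, a buying guide's main heading might simply be (wording illustrative):

<h1>How to Choose Trail Running Shoes</h1>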

Check your heading structure

HTTPS

HTTPS (HyperText Transfer Protocol Secure) is the encrypted version of HTTP, using SSL/TLS certificates to secure data transmitted between a user's browser and the website. Google has confirmed HTTPS as a ranking signal since 2014, and modern browsers display "Not Secure" warnings for HTTP pages. All websites should use HTTPS to protect user data and maintain search rankings.

Detect mixed content issues

Hreflang

Hreflang is an HTML attribute that tells search engines which language and regional version of a page to show to users in different locations. For example, a site with both English and Spanish versions would use hreflang tags to ensure Spanish-speaking users see the Spanish page. Incorrect hreflang implementation is one of the most common technical SEO errors on multilingual sites.
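
A minimal sketch for an English/Spanish page pair might look like this (URLs are illustrative); each version should list all alternates, including itself, and x-default marks the fallback page:

<link rel="alternate" hreflang="en" href="https://example.com/en/page/">
<link rel="alternate" hreflang="es" href="https://example.com/es/page/">
<link rel="alternate" hreflang="x-default" href="https://example.com/en/page/">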

I

Index/Indexing

Indexing is the process by which search engines store and organize web page content in their database (index). When Google indexes a page, it means the page has been crawled, processed, and added to its search database, making it eligible to appear in search results. Pages that are not indexed cannot rank. You can check index status in Google Search Console.

Ensure your pages are indexable

J

JSON-LD (Structured Data)

JSON-LD (JavaScript Object Notation for Linked Data) is Google's recommended format for adding structured data to web pages. Structured data helps search engines understand the content and context of a page, enabling rich results like star ratings, FAQ dropdowns, recipe cards, and event listings in search results. Common schema types include Article, Product, FAQ, and Organization.
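
A minimal sketch of a JSON-LD block for an article might look like this (the headline, author, and date are made up):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Choose Trail Running Shoes",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-05-01"
}
</script>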

K

Keyword

A keyword is a word or phrase that users type into search engines to find information. In SEO, keywords are the terms you optimize your content to rank for. Effective keyword strategy involves researching what your target audience searches for, understanding search intent behind those queries, and naturally incorporating keywords into your content, titles, and meta tags.

Analyze your keywords

Keyword Density

Keyword density is the percentage of times a keyword appears on a page relative to the total word count. For example, if a 1,000-word article mentions a keyword 15 times, the keyword density is 1.5%. While there's no "perfect" density, modern SEO focuses on natural language use rather than hitting a specific number. Search engines are sophisticated enough to understand topics without exact keyword repetition.

Check your keyword density

Keyword Stuffing

Keyword stuffing is the practice of overloading a page with target keywords in an attempt to manipulate search rankings. This outdated tactic includes repeating keywords unnaturally, hiding keywords in the page code, or using irrelevant keywords. Google's algorithms can easily detect keyword stuffing and will penalize pages that do it, often dropping them significantly in rankings or removing them entirely.

Check for spam signals

L

LLM Optimization

LLM Optimization (also called GEO, or Generative Engine Optimization) is the practice of optimizing content to be cited and referenced by AI systems like ChatGPT, Google AI Overviews, and Perplexity. This involves structuring content with clear question-answer formats, using authoritative data with citations, and making information easily extractable by AI systems.

Check your LLM optimization score

Long-Tail Keyword

A long-tail keyword is a specific, usually longer search phrase that gets lower search volume but typically has higher conversion rates and less competition. For example, "shoes" is a head term while "best waterproof trail running shoes for women" is a long-tail keyword. Long-tail keywords make up the majority of all searches and are often easier to rank for.

Discover long-tail keywords

M

Meta Description

A meta description is an HTML tag that provides a brief summary of a page's content, displayed as the snippet text below the title in search results. While not a direct ranking factor, a well-written meta description (roughly 150-160 characters) can significantly improve click-through rates by compelling users to choose your result over competitors.
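
In the page's <head>, a meta description might look like this (wording illustrative):

<meta name="description" content="Learn how to choose trail running shoes, with sizing tips, waterproofing advice, and budget picks.">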

Check your meta descriptions

Meta Tags

Meta tags are snippets of HTML code that provide search engines and browsers with information about a web page. Key meta tags include the title tag (shown in search results and browser tabs), meta description (snippet text), meta robots (crawling instructions), Open Graph tags (social sharing), and viewport tags (mobile display). Proper meta tag optimization is fundamental to on-page SEO.

Audit all your meta tags

Mixed Content

Mixed content occurs when an HTTPS page loads resources (images, scripts, stylesheets) over insecure HTTP connections. This creates security vulnerabilities and triggers browser warnings that erode user trust. Modern browsers may block mixed content entirely, causing broken images or non-functional features. All resources on HTTPS pages should be loaded via HTTPS.
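
For example, a page served over HTTPS that still references an image over plain HTTP triggers a mixed content warning (the URL is illustrative):

<img src="http://example.com/images/banner.jpg" alt="Banner">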

Detect mixed content issues

Mobile-First Indexing

Mobile-first indexing means Google primarily uses the mobile version of your website for indexing and ranking. Since most users now search on mobile devices, Google evaluates the mobile experience as the baseline. This means your mobile site must have the same content, structured data, and meta tags as your desktop version, and must provide a good user experience on small screens.

N

Nofollow

Nofollow is a link attribute (rel="nofollow") that tells search engines not to pass link equity through a specific hyperlink. It was originally created to combat comment spam. Google now treats nofollow as a "hint" rather than a directive. Related attributes include rel="sponsored" for paid links and rel="ugc" for user-generated content like forum posts and comments.
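
A sketch of the three attributes in use (URLs and wording are illustrative):

<a href="https://example.com/some-page" rel="nofollow">untrusted link</a>
<a href="https://example.com/partner" rel="sponsored">paid placement</a>
<a href="https://example.com/user-post" rel="ugc">link from a comment</a>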

Noindex

Noindex is a directive that tells search engines not to include a specific page in their search index. It can be implemented via a meta robots tag in the HTML or an X-Robots-Tag HTTP header. Common uses include preventing indexation of thin content, duplicate pages, admin areas, and staging environments. A page with noindex will not appear in search results.
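
The meta tag version, placed in the page's <head>, looks like this:

<meta name="robots" content="noindex">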

Check for noindex directives

O

Open Graph

Open Graph is a protocol created by Facebook that controls how URLs are displayed when shared on social media. Open Graph meta tags specify the title, description, image, and URL that appear in social shares. Properly configured OG tags ensure your content looks professional and compelling when shared on Facebook, LinkedIn, Slack, and other platforms that support the protocol.
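
A typical set of Open Graph tags might look like this (values are illustrative):

<meta property="og:title" content="How to Choose Trail Running Shoes">
<meta property="og:description" content="Sizing tips, waterproofing advice, and budget picks.">
<meta property="og:image" content="https://example.com/images/trail-shoes.jpg">
<meta property="og:url" content="https://example.com/guides/trail-running-shoes">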

Check your Open Graph tags

Organic Traffic

Organic traffic refers to visitors who reach your website by clicking on unpaid search results, as opposed to paid advertising, social media, or direct visits. Organic traffic is considered the most valuable traffic source because it's free, sustainable, and indicates that your content matches real user search intent. Growing organic traffic is the primary goal of SEO.

P

Page Speed

Page speed measures how quickly a web page loads its content. It's both a Google ranking factor and a critical user experience metric. Slow pages lead to higher bounce rates, lower conversions, and reduced crawl efficiency. Key metrics include Time to First Byte (TTFB), First Contentful Paint (FCP), and Largest Contentful Paint (LCP). Optimization techniques include image compression, code minification, and CDN usage.

Test your page speed

PageRank

PageRank is Google's original algorithm for ranking web pages based on the quantity and quality of links pointing to them. Named after Google co-founder Larry Page, it treats each link as a "vote" for a page, with votes from high-authority pages carrying more weight. While Google no longer publicly shares PageRank scores, the underlying concept of link-based authority still drives modern rankings.

R

Redirect (301/302)

A redirect automatically sends users and search engines from one URL to another. A 301 redirect is permanent and passes most link equity to the new URL, while a 302 redirect is temporary and may not transfer link equity. Redirects are essential when moving or restructuring content. Redirect chains (A redirects to B redirects to C) should be avoided as they slow crawling and dilute link equity.
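
As one illustration, a permanent redirect in an Apache .htaccess file might look like this (paths are made up); other servers and CMS platforms have their own equivalents:

Redirect 301 /old-page https://example.com/new-page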

Detect redirect chains

Robots.txt

Robots.txt is a text file placed in a website's root directory that tells search engine crawlers which pages or sections they are allowed or not allowed to crawl. It's used to manage crawl budget, prevent indexing of sensitive areas, and point crawlers to sitemaps. Note that robots.txt is a suggestion, not a security measure -- pages blocked by robots.txt can still appear in search results if other sites link to them.
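
A minimal robots.txt sketch (the path and sitemap URL are illustrative):

User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml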

Analyze your robots.txt

S

Schema Markup

Schema markup is a structured data vocabulary (from schema.org) that you add to your HTML to help search engines understand your content's meaning and context. It enables rich results in search, such as star ratings, event dates, recipe details, and FAQ accordions. Implementing schema markup can improve click-through rates by making your search listings more visually appealing and informative.

SERP

SERP stands for Search Engine Results Page -- the page displayed by a search engine in response to a query. Modern SERPs include organic results, paid ads, featured snippets, knowledge panels, image packs, video carousels, "People Also Ask" sections, and local pack results. Understanding SERP features for your target keywords helps you optimize content for maximum visibility.

Sitemap (XML)

An XML sitemap is a file that lists all the important pages on your website, helping search engines discover and crawl your content efficiently. It includes metadata like when a page was last modified and how frequently it changes. Sitemaps are especially important for large sites, new sites with few backlinks, and sites with pages not well-linked through internal navigation.
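
A minimal sitemap sketch with a single URL (the address and date are made up):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/guides/trail-running-shoes</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
</urlset>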

Validate your sitemap

Spam Score

Spam score is a metric that estimates the likelihood that a website will be penalized or banned by search engines based on characteristics commonly found in spam sites. Factors include thin content, excessive advertising, suspicious link profiles, hidden text, and domain age. A high spam score doesn't guarantee a penalty, but it indicates elevated risk that should be investigated.

Check your spam score

T

Title Tag

The title tag is an HTML element that specifies the title of a web page. It appears in browser tabs, search engine results, and social media shares. Title tags are one of the most important on-page SEO elements. Best practices include keeping them under 60 characters, placing the primary keyword near the beginning, making each title unique, and writing compelling copy that encourages clicks.
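
For example, a concise, keyword-led title might look like this (wording illustrative):

<title>Trail Running Shoes Buyer's Guide | Example Store</title>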

Optimize your title tags

U

URL Structure

URL structure refers to how your page URLs are formatted and organized. SEO-friendly URLs are short, descriptive, include relevant keywords, use hyphens to separate words, and follow a logical hierarchy. Good URL structure helps both search engines and users understand what a page is about before clicking. Avoid long URLs with random parameters, session IDs, or unnecessary subdirectories.

Analyze your URL structure

User Experience (UX)

User Experience (UX) encompasses every aspect of how a user interacts with your website, including page speed, navigation, readability, mobile-friendliness, and visual design. Google increasingly uses UX signals (like Core Web Vitals and mobile usability) as ranking factors. Sites that provide excellent user experiences tend to have lower bounce rates, longer session durations, and higher conversion rates.

Check your performance and UX
W

White Hat SEO

White hat SEO refers to optimization strategies that follow search engine guidelines and focus on providing genuine value to users. Techniques include creating high-quality content, earning natural backlinks, optimizing page speed, using proper meta tags, and building a great user experience. White hat SEO produces sustainable, long-term results without the risk of penalties.

X

X-Robots-Tag

The X-Robots-Tag is an HTTP response header that provides crawling and indexing directives to search engines, similar to the meta robots tag but applied at the server level. It's particularly useful for controlling indexation of non-HTML files like PDFs, images, and documents that cannot contain meta tags. Directives include noindex, nofollow, noarchive, and nosnippet.
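
For example, a server response that keeps a file out of the index might include this header (a sketch, not tied to any particular server setup):

X-Robots-Tag: noindex, nofollow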

Check your X-Robots-Tag headers

Put Your Knowledge Into Practice

Now that you understand the terms, run a free audit to see how your website measures up across all 19 SEO checks.

Start Free SEO Audit