The SEO Glossary (2025 Edition)
Here’s a glossary of terms I’ll commonly refer to in my writing, along with definitions and references.
If there’s something you think I’ve missed, please do get in touch.
A
Accessibility
The capacity of a website to serve users with specific needs. Accessible sites are able to cater for users with:
Visual impairment (colourblind/high-contrast mode, large text, alt text for screen readers, etc.)
Hearing impairment (subtitles on videos, visual cues in place of sound effects, etc.)
Motion sickness (avoiding excessive on-screen animations)
Motor impairment (full keyboard navigation for users who don’t use a mouse)
Agentic AI
See: AI Agent.
AGI
Artificial General Intelligence (or, Strong AI) refers to an AI system which would be capable of understanding, learning and applying its knowledge to almost any task, in a way comparable to humans. Unlike ANI, which is limited to specific tasks, AGI would be adaptable and able to apply its problem-solving skills across different contexts.
At the time of writing, no artificial general intelligence has yet been created. It’s still a theoretical concept which is the goal of much AI research happening today.
AI Agent
AI agents are AI systems which act on behalf of a user. They go beyond answering questions by acting as a digital assistant, able to make decisions and take actions such as booking meetings, writing and deploying code, or buying things.
This is in contrast to Generative AI, which only generates content without taking further action.
For more, check out the article on AI agents.
Alt Text
The “alternative text” is an HTML attribute used to describe the content of an image. It gives search engines context about what the image depicts and is what screen readers will read out for visually impaired users.
Also see: Accessibility
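A minimal sketch of the attribute in use (the filename and description are hypothetical):

```html
<!-- The alt attribute describes the image for screen readers and crawlers -->
<img src="golden-retriever.jpg" alt="A golden retriever catching a frisbee in a park">
```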
Anchor Text
The clickable, visible text of a hyperlink. Anchor text gives both users and search engines context about the destination page.
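For example (the URL is hypothetical), the words between the tags are the anchor text:

```html
<a href="https://example.com/seo-guide">beginner's guide to SEO</a>
```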
ANI
Artificial Narrow Intelligence (or, Weak AI) refers to AI programs which are trained to specialise within a domain, but where the same algorithm can’t be applied to learn with the same versatility as humans.
For example, a human can learn to write code, recognise pictures of flowers and play tennis. ChatGPT can analyse and produce both text and images, but can’t seamlessly transition to unrelated domains such as writing music or diagnosing X-ray images.
Every AI currently in operation, including ChatGPT, Perplexity and Gemini, is narrow.
This is opposed to Strong AI, or Artificial General Intelligence/AGI.
B
Backlink
A link from another website pointing to your own. Backlinks act as an indication of trust and value.
Bot
See: Crawler.
C
Canonical Tag
An HTML tag which tells search engines which version of a page should be treated as the main one. This helps prevent duplicate content issues, where search engines might spread the ranking equity between two identical pages.
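A minimal sketch of the tag (the URL is hypothetical):

```html
<!-- Placed in the <head>; declares which URL is the main version of this page -->
<link rel="canonical" href="https://example.com/products/blue-widget">
```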
CLS
Cumulative Layout Shift measures how much page elements move around unexpectedly while the page loads. It’s one of the Core Web Vitals.
Commercial Intent
See: User Intent
Core Web Vitals
A set of metrics defined by Google to measure real-world user experience on a website. They focus on loading performance, interactivity, and visual stability, specifically:
Largest Contentful Paint (LCP)
Interaction to Next Paint (INP, replacing FID)
Cumulative Layout Shift (CLS).
Crawlability
The ease and speed with which a crawler can navigate through your site.
We usually want to make crawling as easy as possible to improve our chances of being indexed by Google, having our brand exposed to AI, and so on.
Crawler (or, Bot/Spider)
A bot which automatically analyses and indexes your website. It typically starts with a sitemap and explores the pages listed within; if it arrives on your site without one, it scans content by reading each page and navigating through links. Solid internal linking is a great way to help crawlers do this.
For SEO purposes, the crawler could belong to an AI or a Search Engine, but crawlers exist for all sorts of reasons.
Crawl Budget
The number of pages a search engine crawler is willing/able to crawl on a site within a given timeframe. Crawl budget is especially important for large or complex sites, where inefficient use of it can lead to important pages not being indexed.
D
DA
Domain Authority is a third-party metric developed by Moz. It’s a score on a scale from 1-100, assessed on factors like the strength and number of a site’s backlinks. Stronger and more numerous backlinks tend to lead to a higher DA, meaning the site is more likely to show up in search results.
Other SEO tools provide a similar score such as Ahrefs (Domain Rating; DR) and SEMrush (Authority Score; AS).
It’s important to note that DA isn’t used by Google. Google instead evaluates signals such as backlinks, relevance, and quality at the individual page level.
The Dark Web
A part of the internet which is only accessible with specialised software, allowing users and website operators to be anonymous and untraceable.
The Deep Web
The parts of the internet which aren’t indexable by Search Engines, including content behind a paywall, private content such as a person’s email inbox, and any other content that can’t be found by crawling the public-facing portion of the internet.
Duplicate Content
Content that appears in more than one place on the internet, either within the same site or across different domains. Search engines may struggle to decide which version to rank, which can dilute visibility and authority.
For example, if an eCommerce site copies manufacturer descriptions for every product, it may compete with dozens of other sites showing the same text.
E
EEAT
Experience, Expertise, Authoritativeness, and Trustworthiness — the quality framework from Google’s Search Quality Rater Guidelines. These four signals are hard for AI to replicate, and demonstrating them is widely believed to influence which content search engines and AI systems choose to recommend.
F
Featured Snippet
A highlighted box that appears at the top of Google’s search results, showing direct answers pulled from webpages. Featured snippets can be paragraphs, lists, tables, or videos.
For example, searching “how to boil an egg” may display a short step-by-step list directly in the SERP, with a link to the source page.
G
GEO
Generative Engine Optimisation. The counterpart to Search Engine Optimisation which, instead of targeting Search Engines, focuses on visibility within Generative AI tools such as ChatGPT and Perplexity.ai.
GSC
Google Search Console. This is a free tool used for monitoring the search performance and indexing of a website, such as the site’s impressions, clicks and which keywords it’s ranking for.
Generative AI
This term refers to Artificial Intelligence which “generates” content such as text, images and sound. Examples include ChatGPT, Midjourney and ElevenLabs.
H
Header Tags (H1-H6)
Semantic HTML tags which help to structure a page and its content hierarchy such that it can be more easily understood by Search Engines and AI crawlers.
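A sketch of a sensible hierarchy (headings are illustrative):

```html
<h1>The SEO Glossary</h1>      <!-- One main heading per page -->
  <h2>Technical SEO</h2>       <!-- A major section -->
    <h3>Core Web Vitals</h3>   <!-- A subsection within it -->
```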
I
Impressions
How many times a page/website appeared in search results over a certain timeframe, even if it wasn’t then clicked on.
Indexing
The process Search Engines and AI crawlers use to store and organise content from across the web so it can be found when answering queries.
Informational Intent
See: User Intent
INP
The Interaction to Next Paint measures how responsive a page is when the user interacts by tracking the delay until the next visible update. It’s one of the Core Web Vitals.
J
JSON-LD
JavaScript Object Notation for Linked Data is a type of structured data, or schema. It formats data in a way that search engines and AI crawlers can interpret.
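A minimal sketch, using schema.org’s Article type (the headline and author are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "The SEO Glossary",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```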
K
Keyword
The word or phrase which users type (or speak) into a search engine.
Keyword Cannibalisation
When multiple pages on the same site compete for the same keyword, splitting ranking signals between them and potentially lowering the visibility of each.
L
LCP
The Largest Contentful Paint is a measure of how quickly the largest visible element (such as an image or heading) loads onto the screen. It’s one of the Core Web Vitals.
Link Equity
The value or “authority” passed from one page to another through hyperlinks. Search engines treat links as signals of trust, and the amount of link equity transferred depends on factors like the linking page’s authority, relevance, and whether the link is marked as follow or nofollow.
For example, a link from a respected news site to your online store passes more equity than a link from an unrelated directory.
Link Juice
See: Link Equity.
LLM
A Large Language Model is a type of AI trained on huge amounts of text data to understand and generate human-like language. They work by predicting the most likely next word or sequence in a text, allowing them to answer questions, write code, translate languages or perform other text-based tasks.
ChatGPT and Google’s Gemini are both LLMs.
Long-Tail Keyword
A keyword phrase which is longer and more specific than broad, single-word search terms. Long-tail keywords usually have lower search volume but higher intent, meaning they often drive more qualified traffic.
For example, “running shoes” is broad, while “women’s lightweight trail running shoes” is a long-tail keyword.
M
Meta Description
An HTML meta tag which summarises the content of a webpage. It is often displayed under the title in search results, though it doesn’t directly influence rankings.
Well-written meta descriptions can improve click-through rates by enticing users to visit.
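A sketch of the tag (the description text is illustrative):

```html
<!-- Placed in the <head>; often shown as the snippet under the title in results -->
<meta name="description" content="A plain-English glossary of SEO and AI search terms, from alt text to zero-click searches.">
```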
N
Navigational Intent
See: User Intent
Nofollow
An HTML link attribute which suggests to Search Engines that link equity shouldn’t be passed to the linked page. AI crawlers pay much less attention to this attribute.
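For example (the URL is hypothetical):

```html
<!-- rel="nofollow" suggests no link equity should pass to the destination -->
<a href="https://example.com/untrusted-page" rel="nofollow">an untrusted source</a>
```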
O
Organic Traffic
Organic traffic refers to visitors who arrive at a website through unpaid search engine results. This is opposed to paid traffic (from adverts) or referral traffic (from links on other sites) and is a key metric for measuring SEO performance.
P
Q
R
Rich Results
Rich Results are enhanced SERP listings with additional features (such as ratings and FAQs).
Robots.txt
A simple text file placed at the root of a website that tells crawlers which parts of the site should/shouldn’t be accessed. Most search engines respect robots.txt, but it’s not an enforceable rule.
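A minimal sketch of the file (the paths are hypothetical):

```text
# robots.txt — placed at https://example.com/robots.txt
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```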
S
Schema (Structured Data)
Schema is structured data (often written as JSON-LD) which formats information in a way that makes it easier for crawlers to understand, and increases a page’s eligibility for showing Rich Results.
For example, on an FAQ page, not using schema means crawlers have only the markup to infer which parts of the page are questions, which are answers, and where one question ends and the next begins. Using schema means we can give crawlers this information exactly.
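The FAQ case above could be marked up with schema.org’s FAQPage type, a sketch of which might look like this (the question and answer are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is a crawler?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A bot which automatically analyses and indexes your website."
    }
  }]
}
</script>
```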
Search Engine
An online tool, such as Google, Yahoo or Bing, which lets users enter a query and find related content from across the internet.
SEO
Search Engine Optimisation, the art of optimising a website to appear at the top of Search Engine Results Pages (SERPs) on engines such as Google.
SEO
Search Everywhere Optimisation, a new interpretation of SEO coined in 2025 by Neil Patel of NP Digital. The term is intended to reflect that digital marketing now goes far beyond optimising for search engines, and includes optimising for AI, social media, and beyond.
SERP
Search Engine Results Page — this is the “10 blue links” page that comes up in traditional search on Google (or other Search Engine) when entering a query.
Sitemap (XML)
A file which lists important URLs on a site for search engines to crawl. Sitemaps help ensure that pages can be discovered even when they’re not internally linked.
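A minimal sketch of the format defined by the sitemaps.org protocol (the URL and date are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-01-01</lastmod>
  </url>
</urlset>
```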
SGE
Search Generative Experience was Google’s experimental feature which integrated generative AI into search results; it has since rolled out more broadly as AI Overviews. Instead of only showing a list of links, it creates AI-written summaries, answers, and recommendations directly in the SERP.
For example, a search for “best email marketing tools” might generate a synthesized overview comparing Mailchimp, Klaviyo, and ConvertKit, before listing traditional organic results underneath.
Spider
See: Crawler.
Strong AI
See: AGI.
Structured Data
See: Schema.
The Surface Web
Any part of the internet which is accessible and indexable by search engines. This is opposed to the Deep Web, which refers to content which can’t be found by web crawlers.
T
Technical SEO
The parts of Search Engine Optimisation concerned with the technical features of a page which allow Search Engines and AI crawlers to understand and index it. These include the page’s:
Loading speed
Mobile-responsiveness
Structured data
Internal linking
Title Tag
An HTML element which declares the title of a webpage. It appears in browser tabs and is usually shown as the main headline in search results.
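A sketch of the element:

```html
<!-- Placed in the <head>; shown in browser tabs and usually as the headline in results -->
<title>The SEO Glossary (2025 Edition)</title>
```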
Traditional SEO
“Traditional Search Engine Optimisation” refers to the kind of SEO which businesses have used to get to the top of Google over the last 20 years. This mostly involves targeting keywords and earning backlinks, and excludes strategies being used to earn citations from AI chatbots.
Transactional intent
See: User Intent
U
User Intent
A user’s intent is what they’re really trying to achieve when they search for something. The keywords they use are classified into four buckets of user intent:
Informational intent; the user wants to learn about something.
Navigational intent; the user wants to find a specific page or resource.
Commercial intent; the user wants to research something they want to buy.
Transactional intent; the user wants to take action such as buying.
V
W
Weak AI
See: ANI.
X
Y
YMYL
Your Money or Your Life. “YMYL” topics like health, finance or safety are held to a higher standard, and Search Engines like Google take longer to trust sites covering them.
Z
Zero-Click Search
Queries which are answered directly in the SERP by AI Overviews, Featured Snippets, Knowledge Panels, etc. which don’t result in a click through to a third-party website.