Search engine optimization


Search engine optimization (SEO) is the process of improving the quality and quantity of traffic to a website or a web page from search engines.<ref>Template:Cite web</ref><ref>Template:Cite journal</ref> SEO targets unpaid search traffic (usually referred to as "organic" results) rather than direct traffic, referral traffic, social media traffic, or paid traffic.

Organic search engine traffic originates from a variety of searches, including image search, video search, academic search,<ref name="aseo">Template:Cite web</ref> news search, industry-specific vertical search engines, and large language models.

As an Internet marketing strategy, SEO considers how search engines work, the algorithms that dictate search engine results, what people search for, the actual search queries or keywords typed into search engines, and which search engines are preferred by a target audience. SEO helps websites attract more visitors from a search engine and rank higher within a search engine results page (SERP), aiming to either convert the visitors or build brand awareness.<ref>Ortiz-Cordova, A. and Jansen, B. J. (2012) Classifying Web Search Queries in Order to Identify High Revenue Generating Customers. Template:Webarchive. Journal of the American Society for Information Science and Technology. 63(7), 1426–1441.</ref>

History

Webmasters and content providers began optimizing websites for search engines in the mid-1990s as the first search engines were cataloging the early Web. Search engine users would query the URL of a page, and then receive information found on the page, if it existed in the search engine's index.

ALIWEB and the earliest search engines required website developers to manually upload website index files in order to be searchable, and generally did not apply any form of ranking algorithm to user queries.<ref>Template:Cite web</ref> Automated web crawlers later emerged and were used to proactively discover and index websites. This led website developers to optimize their websites' search signals, including the use of meta tags, to achieve greater visibility in search results.

According to a 2004 article by former industry analyst and current Google employee Danny Sullivan, the phrase "search engine optimization" came into use in 1997. Sullivan credits SEO practitioner Bruce Clay as one of the first people to popularize the term.<ref>Template:Cite web See Google groups thread Template:Webarchive.</ref>

In some cases, early search algorithms weighted particular HTML attributes in ways that could be leveraged by web content providers to manipulate their search rankings.<ref>Template:Cite web</ref> As early as 1997, search engine providers began adjusting their algorithms to prevent these actions.<ref name="infoseeknyt">Template:Cite news</ref> Eventually, search engines would incorporate more meaningful measures of page purpose, including the more recent development of semantic search.<ref>Template:Cite web</ref>

Some search engines sponsor SEO conferences, webchats, and seminars, and the major search engines provide information and guidelines to help with website optimization.<ref name="g-wmguide" /><ref name="ms-wmguide" /> Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.<ref name="googlesitemaps">Template:Cite web</ref> Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and lets them track the index status of their web pages.

In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products, resulting in brands and marketers shifting toward mobile-first experiences.<ref>Template:Cite web</ref>

In the 2020s, the rise of generative AI tools such as ChatGPT, Claude, Perplexity, and Gemini gave rise to discussion around a concept variously referred to as generative engine optimization, answer engine optimization or artificial intelligence optimization. This approach focuses on optimizing content for inclusion in AI-generated answers provided by large language models (LLMs). This shift has led digital marketers to discuss content formats, authority signals, and how structured data is presented to make content more "promptable".<ref>Template:Cite web</ref>

It has also been argued that each of these tactics should be considered as subsets of "search experience optimization," described by Ahrefs as "optimizing a brand’s presence for non-linear search journeys over multiple platforms, not just Google."<ref>Template:Cite web</ref>

Relationship between Google and SEO industry

In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.<ref name="lgscalehyptxt">Template:Cite web</ref> PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
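
The random-surfer model described above can be made concrete with a short worked example. The following is a minimal sketch of PageRank computed by power iteration on a hypothetical three-page link graph; the damping factor of 0.85 is the value commonly cited in the PageRank literature, not a figure from this article.

```python
# A minimal power-iteration sketch of PageRank on a toy link graph.
# The graph and the damping value are illustrative assumptions.
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}        # start from a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                   # dangling page: spread rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:        # each outlink passes an equal share
                    new_rank[target] += share
        rank = new_rank
    return rank

# Two pages link to A, so A accumulates the highest rank:
toy_graph = {"A": ["B"], "B": ["A", "C"], "C": ["A"]}
print(pagerank(toy_graph))
```

In this toy graph, page A ends up with the highest score because it receives links from both B and C, illustrating how inbound links make a page more likely to be reached by the random surfer.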

Page and Brin founded Google in 1998.<ref>Template:Cite web</ref> Google attracted a loyal following among the growing number of Internet users, who liked its simple design.<ref name="bbc-1">Template:Cite news</ref> Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes involved the creation of thousands of sites for the sole purpose of link spamming.<ref>Template:Cite web</ref>

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.<ref name="nyt0607">Template:Cite news</ref> The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.<ref>Template:Cite web</ref> Patents related to search engines can provide information to better understand search engines.<ref>Template:Cite web</ref> In 2005, Google began personalizing search results for each user: depending on a logged-in user's history of previous searches, Google crafted results for that user.<ref>Template:Cite web</ref>

In 2007, Google announced a campaign against paid links that transfer PageRank.<ref>Template:Cite web</ref> On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting through the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.<ref>Template:Cite web</ref> As a result of this change, using nofollow caused PageRank to evaporate rather than be redistributed. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additional suggested solutions include the use of iframes, Flash, and JavaScript.<ref>Template:Cite web</ref>

In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.<ref>Template:Cite web</ref> On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to let users find news results, forum posts, and other content much sooner after publication, Google Caffeine changed the way Google updated its index so that new content would appear in results more quickly. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."<ref>Template:Cite web</ref> Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.<ref>Template:Cite web</ref>

Google has implemented numerous algorithm updates to improve search quality, including Panda (2011) for content quality, Penguin (2012) for link spam, Hummingbird (2013) for natural language processing, and BERT (2019) for query understanding. These updates reflect the ongoing evolution of search technology and Google's efforts to combat spam while improving user experience.

On May 20, 2025, Google announced that AI Mode would be released to all US users. AI Mode uses what Google calls a "query fan-out technique", which breaks the search query down into multiple sub-topics and generates additional search queries on the user's behalf.<ref>Template:Cite web</ref>
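
Google has not published the implementation of this technique. The sketch below illustrates only the general fan-out pattern; decompose() and search() are hypothetical stand-ins for an LLM-based topic decomposition and a conventional search backend, and nothing here describes AI Mode's actual internals.

```python
# A conceptual sketch of a query fan-out pattern: break a query into
# sub-topics, run the sub-queries, and merge the results. All function
# bodies are illustrative placeholders, not Google's implementation.
from concurrent.futures import ThreadPoolExecutor

def decompose(query: str) -> list[str]:
    # Placeholder: a real system might use a language model to derive sub-topics.
    return [f"{query} overview", f"{query} examples", f"{query} comparison"]

def search(sub_query: str) -> list[str]:
    # Placeholder for a call to a conventional search index.
    return [f"result for '{sub_query}'"]

def fan_out(query: str) -> list[str]:
    sub_queries = decompose(query)
    with ThreadPoolExecutor() as pool:
        # Issue the generated sub-queries in parallel.
        result_lists = list(pool.map(search, sub_queries))
    merged, seen = [], set()
    for results in result_lists:           # merge and deduplicate for synthesis
        for r in results:
            if r not in seen:
                seen.add(r)
                merged.append(r)
    return merged

print(fan_out("search engine optimization"))
```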

Methods

Getting indexed

A simple illustration of the PageRank algorithm; percentages show perceived importance.

The leading search engines, such as Google, Bing, Brave Search and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.<ref>Template:Cite web</ref> Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,<ref>Template:Cite web</ref> in addition to its URL submission console.<ref>Template:Cite web</ref> Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;<ref>Template:Cite web</ref> however, this practice was discontinued in 2009.
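
For illustration, a minimal XML sitemap of the kind submitted through Search Console can be generated with a few lines of code. The sketch below uses only Python's standard library; the URLs are placeholders, not addresses from this article.

```python
# A minimal sketch that writes an XML sitemap with Python's standard
# library. The resulting file lists one <loc> entry per indexable page.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_sitemap(urls: list[str], path: str = "sitemap.xml") -> None:
    urlset = ET.Element("urlset", xmlns=NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url   # location of one page
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Placeholder URLs; a real sitemap would enumerate the site's pages.
write_sitemap(["https://example.com/", "https://example.com/about"])
```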

Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.<ref name="cho">Template:Cite web</ref>

Mobile devices are used for the majority of Google searches.<ref>Template:Cite web</ref> In November 2016, Google announced a major change to the way it crawls websites and started to make its index mobile-first, which means the mobile version of a given website becomes the starting point for what Google includes in its index.<ref>Template:Cite web</ref> In May 2019, Google updated the rendering engine of its crawler to the latest version of Chromium (74 at the time of the announcement) and indicated that it would regularly update the Chromium rendering engine to the latest version.<ref>Template:Cite web</ref> In December 2019, Google began updating the User-Agent string of its crawler to reflect the latest Chrome version used by its rendering service. The delay was to allow webmasters time to update code that responded to particular bot User-Agent strings. Google ran evaluations and felt confident the impact would be minor.<ref>Template:Cite web</ref>

Preventing crawling

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots-specific meta tag (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish to have crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.<ref>Template:Cite web</ref>
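
As an illustration of how a well-behaved crawler applies such rules, the sketch below uses Python's standard urllib.robotparser module; the disallow rules and URLs are illustrative placeholders.

```python
# A minimal sketch of robots.txt handling from the crawler's side.
# The rules below block a shopping-cart directory and internal search
# results, the kinds of pages the article notes are typically excluded.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /cart/
Disallow: /search
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)   # normally the file is fetched from /robots.txt

# Internal search results are excluded; ordinary product pages are not.
print(parser.can_fetch("*", "https://example.com/search?q=shoes"))  # False
print(parser.can_fetch("*", "https://example.com/products/1"))      # True
```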

In 2020, Google sunsetted the standard (and open-sourced its code) and now treats it as a hint rather than a directive. To adequately ensure that pages are not indexed, a page-level robots meta tag should be included.<ref>Template:Cite web</ref>

Increasing prominence

A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility. Page design makes users trust a site and want to stay once they find it. When people bounce off a site, it counts against the site and affects its credibility.<ref name=":0">Template:Cite book</ref>

Writing content that includes frequently searched keyword phrases so as to be relevant to a wide variety of search queries will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element<ref>Template:Cite web</ref> or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score. These are known as incoming links, which point to the URL and can count towards the page's link popularity score, impacting the credibility of a website.<ref name=":0" />
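
As an illustration of redirect-based canonicalization, the sketch below uses Python's standard wsgiref module to issue a 301 redirect from any non-canonical hostname to a single canonical one. The hostname example.com is a placeholder, and this is one simple approach, not the only way to consolidate duplicate URLs.

```python
# A minimal sketch of 301-redirect canonicalization: requests to any
# non-canonical host are permanently redirected to the same path on the
# canonical host, so inbound links to duplicate URLs consolidate on one.
from wsgiref.simple_server import make_server

CANONICAL_HOST = "example.com"   # placeholder canonical hostname

def app(environ, start_response):
    host = environ.get("HTTP_HOST", "")
    path = environ.get("PATH_INFO", "/")
    if host != CANONICAL_HOST:
        # 301 tells search engines to credit the canonical URL instead.
        start_response("301 Moved Permanently",
                       [("Location", f"https://{CANONICAL_HOST}{path}")])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html><body>Canonical page</body></html>"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```

The canonical link element achieves a similar consolidation without a redirect, which is useful when both URLs must remain reachable.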

White hat versus black hat techniques


Common white-hat methods of search engine optimization

SEO techniques can be classified into two broad categories: techniques that search engine companies recommend ("white hat") and techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter techniques, among them spamdexing. Industry commentators have classified these methods and the practitioners who employ them as either white hat SEO or black hat SEO.<ref>Template:Cite web</ref> White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.<ref>Template:Cite web</ref>

An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines<ref name="g-wmguide">Template:Cite web</ref><ref name="ms-wmguide">Template:Cite web</ref><ref>Template:Cite web</ref> are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online algorithms, rather than attempting to divert the algorithm from its intended purpose. White hat SEO has been compared to web development that promotes accessibility,<ref>Template:Cite web</ref> although the two are not identical.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similarly to the background, in an invisible div, or positioned off-screen. Another method, known as cloaking, serves a different page depending on whether the page is being requested by a human visitor or a search engine. A third category sometimes used is grey hat SEO. This is in between the black hat and white hat approaches: the methods employed avoid the site being penalized but do not aim at producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
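
For illustration, one crude way to probe for User-Agent-based cloaking is to fetch the same URL with a browser-like and a crawler-like User-Agent header and compare the responses. The sketch below uses only Python's standard library; the URL and header strings are placeholders, and cloaking keyed to crawler IP ranges rather than headers would evade this check entirely.

```python
# A minimal, heuristic sketch of detecting User-Agent-based cloaking:
# fetch the page twice with different User-Agent headers and flag large
# differences between the two responses.
from urllib.request import Request, urlopen

def fetch(url: str, user_agent: str) -> bytes:
    req = Request(url, headers={"User-Agent": user_agent})
    with urlopen(req) as resp:
        return resp.read()

def looks_cloaked(url: str) -> bool:
    as_browser = fetch(url, "Mozilla/5.0")    # browser-like placeholder UA
    as_crawler = fetch(url, "Googlebot/2.1")  # crawler-like placeholder UA
    # A large size difference suggests different content is served to
    # crawlers than to human visitors. Real checks compare content, too.
    longest = max(len(as_browser), len(as_crawler), 1)
    return abs(len(as_browser) - len(as_crawler)) > 0.5 * longest

print(looks_cloaked("https://example.com/"))
```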

Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices.<ref name="intwebspam">Template:Cite web</ref> Both companies subsequently apologized, fixed the offending pages, and were restored to Google's search engine results page.<ref>Template:Cite web</ref>

Companies that employ black hat techniques or other spammy tactics can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.<ref>Template:Cite news</ref> Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.<ref name="wired09082005">Template:Cite magazine</ref> Google's Matt Cutts later confirmed that Google had banned Traffic Power and some of its clients.<ref>Template:Cite web</ref>

As marketing strategy

SEO is one approach within digital marketing, alongside other strategies such as pay-per-click advertising and social media marketing. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM focuses on prominence more than relevance; website developers should regard SEM as critically important to visibility, as most users navigate to the primary listings of their search results.<ref>Template:Cite journal</ref> A successful Internet marketing campaign may also depend on building high-quality web pages to engage and persuade internet users, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.<ref>Template:Cite web</ref><ref>Template:Cite web</ref>

In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,<ref>Template:Cite web</ref> which revealed a shift in its focus towards "usefulness" and mobile local search. In recent years the mobile market has exploded, overtaking the use of desktops: StatCounter analyzed 2.5 million websites in October 2016 and found that 51.3% of the pages were loaded by a mobile device.<ref>Template:Cite news</ref> Google has capitalized on the popularity of mobile usage by encouraging websites to use its Google Search Console and Mobile-Friendly Test, which allow companies to assess their website against the search engine results and determine how user-friendly their websites are. The closer together keywords appear, the more a page's ranking will improve for those key terms.<ref name=":0" />

SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantee and the uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.<ref>Template:Cite magazine</ref> Search engines can change their algorithms at any time, impacting a website's ranking and possibly resulting in a serious loss of traffic: according to Google's then-CEO Eric Schmidt, in 2010 Google made over 500 algorithm changes – almost 1.5 per day.<ref>Template:Cite web</ref> In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.

International markets and SEO

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. Google has maintained dominant market share in most regions, with varying percentages by market.<ref>Template:Cite news</ref> In markets outside the United States, Google's share is often larger, and data showed Google was the dominant search engine worldwide as of 2007.<ref>Template:Cite web</ref> As of 2006, Google had an 85–90% market share in Germany.<ref name="grehan-1">Template:Cite web</ref> As of March 2024, Google still had a significant market share of 89.85% in Germany.<ref>Template:Cite web</ref> As of March 2024, Google's market share in the UK was 93.61%.<ref>Template:Cite web</ref>

Successful search engine optimization (SEO) for international markets requires more than just translating web pages. It may also involve registering a domain name with a country-code top-level domain (ccTLD) or a relevant top-level domain (TLD) for the target market, choosing web hosting with a local IP address or server, and using a Content Delivery Network (CDN) to improve website speed and performance globally. It is also important to understand the local culture so that the content feels relevant to the audience. This includes conducting keyword research for each market, using hreflang tags to target the right languages, and building local backlinks. However, the core SEO principles—such as creating high-quality content, improving user experience, and building links—remain the same, regardless of language or region.<ref name="grehan-1" />
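
As an illustration of hreflang annotation, the sketch below generates the alternate-link tags for a single page available in several locales. The locale codes and URLs are placeholders; this shows the tag structure only, and hreflang can equivalently be declared in HTTP headers or sitemaps.

```python
# A minimal sketch of generating hreflang alternate-link tags for one
# page served in multiple locales. "x-default" marks the fallback URL
# for users whose language matches no listed locale.
ALTERNATES = {
    "en-us": "https://example.com/page",
    "de-de": "https://example.com/de/page",
    "x-default": "https://example.com/page",
}

def hreflang_tags(alternates: dict[str, str]) -> str:
    """Render <link rel="alternate" hreflang=...> tags for the <head>."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

print(hreflang_tags(ALTERNATES))
```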

Regional search engines maintain a strong presence in specific markets: Baidu in China, Naver in South Korea, Yandex in Russia, and Seznam in the Czech Republic.

Multilingual SEO

By the early 2000s, businesses recognized that the web and search engines could help them reach global audiences. As a result, the need for multilingual SEO emerged.<ref>Template:Cite journal</ref> In the early years of international SEO development, simple translation was seen as sufficient. However, over time, it became clear that localization and transcreation—adapting content to local language, culture, and emotional resonance—were more effective than basic translation.<ref>Template:Cite web</ref>

Legal precedents

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."<ref>Template:Cite web</ref><ref>Template:Cite web</ref>

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.<ref>Template:Cite web</ref><ref>Template:Cite web</ref>


References

Template:Reflist
