What “SEO” Means
Search engine optimization (SEO) is the process of improving how easily people can discover and understand your web content via search engines. Government and university guidance frames SEO as aligning site structure, content, and technical signals so pages are crawled, indexed, and presented clearly in search results. This includes on‑page organization (titles, headings, links), technical files (XML sitemaps, robots.txt), and overall site health (fixing broken links, improving load time). (med.stanford.edu)
Why SEO Matters
Public-sector guidance emphasizes that strong SEO improves transparency and user experience by helping people find accurate information quickly, which reduces frustration and support burden. It is treated as part of delivering a usable, accessible government website, not merely a marketing tactic. (digital.gov)
Core On‑Page Practices
Page titles and meta descriptions
Government guidance underscores the importance of giving every page a unique HTML title and a meta description. Titles are often used as the clickable result text; descriptions frequently display in search results and help users decide to click. These elements also assist screen readers, linking SEO with accessibility. (digital.gov)
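As a minimal sketch, both elements live in the page's head; the domain, agency name, and wording below are illustrative examples, not from the cited guidance:

```html
<head>
  <!-- Unique, descriptive title: often used as the clickable result text -->
  <title>Renew a Passport by Mail | Example Agency</title>
  <!-- Meta description: frequently shown beneath the title in search results -->
  <meta name="description"
        content="How to renew a passport by mail: eligibility, forms, fees, and processing times.">
</head>
```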
Headings and semantic HTML
Use semantic HTML so search engines can understand page structure and so assistive technologies can navigate it. Government guidance calls for a clear hierarchy (one H1 for the page title, followed by H2/H3 for sections and subsections). Stanford’s accessibility guidance likewise explains that headings are crucial for both structure and SEO. (digital.gov)
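The hierarchy described above might look like the following sketch (page topic and section names are invented for illustration):

```html
<!-- One H1 for the page title, then H2/H3 for sections and subsections -->
<h1>Renew Your Passport</h1>

<h2>Who Is Eligible</h2>

<h2>How to Apply</h2>
<h3>Step 1: Complete the Form</h3>
<h3>Step 2: Pay the Fee</h3>
```

Because the hierarchy is expressed in markup rather than visual styling alone, both crawlers and screen-reader users can skim the same outline.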
Internal links, site structure, and page health
University guidance highlights a “flat” site structure that is easy to crawl, meaningful internal linking, fixing broken links, and optimizing images and other assets that affect load time. These practices contribute to overall site health and findability. (med.stanford.edu)
Avoiding empty or placeholder pages
Federal web style guidance warns against publishing blank or “coming soon” pages: when crawlers find that a previously content-rich page has become empty, its rankings can suffer and the page may be excluded from results. (gsa.gov)
Core Technical Practices
XML sitemaps
An XML sitemap lists the URLs on a site to help search engines index content more intelligently and keep their index up to date. Federal guidance recommends listing sitemaps in robots.txt and, for large sites, using a sitemap index. Include lastmod dates when possible. (search.gov)
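A small sitemap following the sitemaps.org protocol might look like this; the example.gov URLs and dates are placeholders, not real pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.gov/passports/renewal</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.gov/passports/fees</loc>
    <lastmod>2024-04-18</lastmod>
  </url>
</urlset>
```

For large sites, a sitemap index file uses the same structure but lists the URLs of child sitemaps instead of pages.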
robots.txt
A robots.txt file instructs crawlers which areas of a site they may access. Properly configured robots.txt supports discoverability and can reference your sitemap locations. Digital.gov provides definitions and examples from federal sites. (digital.gov)
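A simple robots.txt along these lines, with an illustrative path and sitemap location, ties both ideas together (the /search/ exclusion is a common pattern, not a requirement):

```text
# Allow all crawlers, but keep internal search result pages out of the index
User-agent: *
Disallow: /search/

# Point crawlers at the sitemap (or sitemap index) for this site
Sitemap: https://www.example.gov/sitemap.xml
```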
URL naming
University guidance for editors notes URL best practices such as using lowercase letters, hyphens to separate words, avoiding special characters and spaces, and keeping URLs readable by humans—principles that also aid search. (sites.utexas.edu)
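Applied to a hypothetical admissions page, those conventions contrast roughly like this:

```text
Readable, search-friendly:  https://example.edu/admissions/transfer-credit
Harder to read and share:   https://example.edu/Admissions/Transfer%20Credit(v2).aspx
```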
Canonicalization (one preferred URL)
University IT guidance explains that multiple URL variants (http/https, www/non‑www) can create duplicate versions of the same page. Defining a canonical URL helps consolidate signals so the preferred version is indexed and ranked. (it.umn.edu)
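In HTML, the preferred version is typically declared with a canonical link element in the head of every variant; the URL below is a placeholder:

```html
<!-- Consolidates http/https and www/non-www variants onto one preferred URL -->
<link rel="canonical" href="https://www.example.gov/passports/renewal">
```

Server-side redirects from the non-preferred variants to the canonical URL reinforce the same signal.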
PDFs and non‑HTML files
Federal guidance details how to make PDFs discoverable: use descriptive file names, ensure text is searchable (OCR for scanned PDFs), and set appropriate document properties (e.g., Title). These factors influence how such files appear and match queries. (digital.gov)
Accessibility, Plain Language, and SEO
Government resources connect SEO with accessibility and clear communication. Semantic HTML and proper headings improve assistive technology navigation, while plain‑language writing helps users understand content on first read and supports discoverability. CDC and Stanford resources provide plain‑language and accessibility checklists and training that complement SEO goals. (digital.gov)
Performance and Mobile Considerations
University materials tie faster load times and optimized media to better user experience and site health—factors that support findability. Government editor training also stresses optimizing content for varied browsers and connection speeds. (med.stanford.edu)
Measurement and Monitoring
Federal analytics training describes commonly used web metrics (e.g., sessions, users, pageviews, bounce rate, referrers) that inform ongoing optimization. Universities commonly use automated scanning platforms to monitor accessibility, broken links, and SEO issues across sites as part of routine quality assurance. (digital.gov)
Key Government and University Resources (Selected)
- Digital.gov primers on search, SEO, robots.txt, and XML sitemaps provide federal best practices for discoverability. (digital.gov)
- Search.gov documentation explains how sitemaps and robots.txt support indexing and discoverability. (search.gov)
- Stanford and other university guides cover headings, page titles, site health, and monitoring for SEO and accessibility. (sitesuserguide.stanford.edu)
- CDC plain‑language resources align content clarity with public comprehension—an aim that also serves search effectiveness. (cdc.gov)

