iFactory Insights

How a website redesign sparks campus-wide transformation

Every campus thinks it's just fixing the website, until the project surfaces questions about audiences, brand, and ownership that no one has aligned on.

The website is the largest, most-used, always-on marketing property a higher-education institution has. It’s the asset every audience touches at once, the only one that has to work for prospective students, current students, faculty, parents, donors, and alumni at the same time. No brochure does that. No campaign does that. No open house or press release does that.

That’s why a redesign is rarely just a tech project or a brand refresh. It’s the moment the institution is forced, sometimes for the first time, to look at itself the way a prospective student does. Most internal stakeholders don’t have direct access to that view. Their day-to-day work is shaped by their office, their colleagues, their committees, their accreditation cycles. The redesign cuts across all of that, which is why the conversation gets bigger than the website almost immediately, and why the value the project produces is bigger than the website too.

This post walks through how that happens and what to do with it:

Outside-in: how prospective students navigate, and why it doesn’t match the way the institution is structured.
Inside-out: how the site grew up, and why an AI-search reframe is now unavoidable.
The pivot: how UX research and information architecture turn opinion-driven debates into evidence-based decisions.
What survives the launch: the institutional capabilities that make the redesign worth more than the website it produces.

Outside-in: students don't care about your org chart

Prospective students arrive at your website with a question. The question almost never maps to a single office on your campus. Cost crosses admissions and financial aid. Fit crosses admissions, the program, residential life, and student affairs. Transfer credit crosses admissions, the registrar, the program, and sometimes the dean’s office. Accommodations cross disability services, academic affairs, residential life, and the program itself. The student doesn’t know about any of that, and they shouldn’t have to.

The site often reflects how the institution is organized. That is confusing for the prospective student. That gap is the actual problem the redesign is solving, and most of the territorial debates inside a redesign project are echoes of it.

Internal stakeholders are usually surprised by how flat that gap looks from the outside. From inside, the institution has a structure that took decades to build, and that structure is meaningful to the people who do the work. From outside, the institution is one entity with a question to answer. The structure is invisible. What’s visible is whether the answer to the question is on the site, whether it’s easy to find, and whether it agrees with itself across the pages it shows up on.

AI search makes the gap harder to hide. Large-language-model retrieval doesn’t navigate the org chart any more than a prospective student does. It synthesizes across whatever pages it pulls and produces a single answer, and when those pages were written by different offices using different language under different assumptions, the answer is fragmented and low-confidence. The audience cost of that is real. According to EAB’s October–November 2025 survey of more than 5,000 high school students, 18% had already removed an institution from consideration based on what AI tools told them, and 34% said their interest in a school had grown because of AI research. The site’s job, increasingly, is to be the source AI tools synthesize from, not just the destination they direct students to.

18%

of high school students have already removed a college from consideration based on what AI tools told them.

Source: EAB survey of 5,000+ high school students, October–November 2025.

The redesign is the project that has to address this directly. Not by flattening the institution’s structure, which isn’t the redesign team’s job, but by producing a public-facing site whose architecture answers prospective students’ questions instead of mirroring the campus directory. To understand why that’s harder than it sounds, it helps to look at how the site got built in the first place.

The site wasn't only built to be a prospective-student marketing channel. AI search just made it one.

Most higher-ed websites started as marketing. Admissions content, viewbook material, “about the college” pages, the work of presenting the institution to prospective students. The infrastructure layer that most campus sites now carry, the academic catalog, the policy library, the directory, the news archive, the committee minutes, accumulated afterward, as faculty, departments, and offices realized the web was a useful place to put their work. Marketing was the original purpose. The rest grew up around it, page by page, with each addition doing a job other than marketing.

That history shapes how most non-marketing pages get written. The default frame is outbound: someone on campus has information that needs to go out, and the page is the vehicle. A policy, an appointment, a curriculum change, a deadline. The audience is presumed to be already engaged in some way, a current student looking up a deadline, a faculty colleague checking a policy, a parent confirming a date. The writer’s job is to deliver the information accurately, and a meaningful share of what lives on a higher-ed site is best served by exactly that.

What’s changed is that the marketing surface area is no longer bounded. Prospective students arrive through Google and AI search, often at URLs nobody wrote with them in mind. They haven’t chosen the institution. They haven’t decided to engage. They’re asking a question (cost, transfer credit, outcomes, accommodations), and a search algorithm has handed them a page and called it an answer. That’s a different job from outbound communication. It’s an inbound job. The page assumes nothing on the reader’s part except that they have a question, and it has to do work the writer almost certainly didn’t sign up for.

Page owners can be resistant during a redesign, and the honest description of why is that they’re thinking about what they need to communicate out, not about who the page might find on the other end. They’ve been working in the outbound frame because that’s how the job was defined for them, often before AI search existed and before the homepage-and-landing-pages model of campus marketing started to leak into the rest of the site. When the redesign team asks them to evaluate their pages against an inbound frame (prospective-student search, a reader who never chose anything), they’re being asked to re-examine work that was correctly done under one set of assumptions, using assumptions they’ve never been responsible for. That’s not the same as bad intent. But it does mean the resistance often shows up as a focus on what the page owner needs to say, not on whether the prospective student arriving via search can use what’s there.

The redesign is the moment your institution can introduce the inbound frame on purpose, as something the site now also has to do across more of its surface area. Not a replacement for outbound work. There’s still real institutional information that needs to be communicated, and the site is still where a lot of it lives. It’s an additional layer your institution decides where to apply, page by page, with criteria most campuses don’t currently have. That decision is what the next section is about.

The pivot: shared evidence beats shared opinion

Most arguments about a website are really arguments about priorities, audiences, and resources, and those arguments don’t resolve as long as everyone is operating on instinct. The same program page can be defended three different ways by three different campus offices, all of them right under their own frame, and none of them holding the data that would let the others change their mind.

UX research and information architecture are how the redesign turns those arguments into evidence-based decisions. Not because data settles every question. It doesn’t. But the work surfaces what prospective students actually do, want, and miss when they encounter the site without context. That changes what the conversation is about. The question stops being “what should the page lead with” and becomes “what does someone arriving via search at this page need to be able to do in the first thirty seconds.”

A few specific instruments do most of this work.

Audience modeling and analytics review show who actually visits the site, in what order, with what tasks. Most higher-ed institutions discover that a small set of pages (program pages, admissions, cost, outcomes, and a handful of department landing pages) concentrates the meaningful traffic, while the bulk of the site receives almost none. That finding alone reorients most internal debates. A lot of what feels contested in a redesign is contested over pages almost no prospective student visits.

Surveys of prospective students, current students, parents, and alumni put structured data behind questions that otherwise get answered by whoever’s most senior in the room. What did students consider when they were choosing? What did they wish the site had told them earlier? What questions did they ask their counselor or their parents that the site should have answered first? Surveys make institutional knowledge that’s currently sitting in admissions counselors’ heads available to the redesign as evidence.

Stakeholder interviews bring quiet internal voices forward in the same way. The financial aid counselor who has answered the same five questions for ten years has insight nobody has documented. So does the admissions reader who knows which program pages cause confusion, the alumni officer who knows what graduates wish their younger selves had known, the academic advisor who knows where transfer students get stuck. Interviews don’t generate hypotheses out of nothing. They make existing institutional knowledge available to the redesign team.

Card sorting is worth adding when the site’s structure is the central problem the redesign needs to solve. Asking prospective students, current students, and internal stakeholders to sort the same content categories produces a clear picture of how each group expects information to be organized, and the gaps between them are usually where navigation breaks down. Internal stakeholders tend to sort by how the institution is organized. Prospective students sort by their question, like cost, fit, outcomes, and life on campus. When the gap is large, card sorting gives the redesign team something concrete to point at instead of arguing about navigation in the abstract.

Journey mapping follows a prospective student across pages and across offices. The handoffs are usually where the site fails. Between admissions and financial aid. Between a program page and the registrar. Between a campus visit page and the calendar of events. Internal stakeholders tend not to see these failures because each office’s content works fine on its own. The journey map shows where it stops working as a connected experience.

The output of this work isn’t a sitemap or a wireframe. It’s a shared agreement about who the site is for, what they need it to do, and what good looks like. The sitemap and wireframes come from that agreement, not the other way around. Without it, design decisions get made by whoever’s loudest in the meeting, which means they get unmade the moment someone louder shows up. With it, the redesign team has something to point to when the inevitable late-stage debate breaks out. The institution has something to govern against after launch, which is what the next section is about.

Most arguments about a website are really arguments about priorities. UX research turns opinion-vs-opinion into evidence-vs-evidence.

What survives the launch

A good redesign launches a website. A great redesign launches an institutional capability. The site is the visible deliverable, the thing leadership will see at the all-hands meeting and the thing the press release will mention. The invisible deliverables are what determine whether the investment compounds over the next five years or starts decaying the day the redesign team leaves.

Four institutional capabilities should survive the launch.

Governance. 

Documented ownership, publishing rights, review cycles, retirement criteria. Without governance, even a freshly redesigned site accumulates clutter within twelve to eighteen months, and the conditions that produced the original problem reassert themselves. Pages get added because someone has authority to add them, not because anyone has decided what the site should be doing more of. Pages stay up because nobody has authority to retire them, not because they’re still serving anyone. The redesign is the moment the institution can name who decides what gets published, who decides what gets retired, and what the criteria are for both. Documenting that, and giving it a calendar, is governance. (iFactory’s content audit and governance work covers this in more depth.)

Content strategy as a practice.

Editorial standards, templates, the rules for adding new pages, the criteria for what belongs in marketing’s surface area and what belongs elsewhere. The redesign produces a set of these as deliverables, but the deliverables decay if nobody owns the practice. Content strategy as a practice means a person or a small team has the standing authority to enforce standards across the site, to push back when a new page proposal doesn’t meet them, and to update the standards as student behavior, AI search, and the institution’s own priorities evolve. Most institutions don’t have this role today. The redesign is the moment to define it.

Service improvements that outlast the website. 

Some of the most valuable work the redesign produces isn’t on the website at all. When the journey mapping reveals that prospective students get three different answers about transfer credit from three different offices, the long-term fix isn’t a better page. It’s a single source of truth across the offices, and the page just reflects it. When the survey data shows that students don’t understand the financial aid timeline, the long-term fix isn’t a clearer page about the timeline. It’s a process change inside financial aid that the page then describes. The redesign surfaces these problems by forcing the institution to look at the prospective-student experience as a connected whole. Acting on them is what turns the redesign into a service-improvement project, and the institutions that do it get returns the website itself can’t deliver.

Sustained discipline about what’s public-facing, what’s publicly accessible, and what should be neither. 

This is the capability the inside-out section was building toward. The redesign can produce a clean classification at launch, but classifications drift. Six months in, a department posts an internal-facing meeting summary in the public root because that’s where they have permission to publish. A year in, a faculty committee uploads a draft policy with the wrong sharing settings. Two years in, the careful classification work the redesign produced has been quietly undone in a hundred small ways. The capability that has to survive isn’t the classification itself. It’s the institutional habit of asking “where does this belong?” every time something new gets created, and having a governance structure that can answer the question without a meeting.

The website is the visible asset. The habit of seeing the institution through prospective students’ eyes, and the four capabilities that protect that habit, are the durable ones. That’s what the redesign is actually delivering. The site is just the artifact that proves the work happened.

Three buckets every redesign has to sort content into

The classification keeps internal content from appearing in AI search.

01

Public-facing
Optimized for prospective students and AI search. Schema, structured content, consistent entity reference.

02

Publicly accessible
Not prioritized, but required: compliance disclosures, archived policies, technical reference material.

03

Authenticated
Behind login. Portals and intranets for current faculty, staff, and students. Authentication keeps them from being publicly indexed.
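As a rough illustration of how the three buckets might translate into crawl-and-index behavior, here is a minimal Python sketch. The bucket names follow the post; the audience labels, function name, and directive fields are hypothetical, not a prescribed implementation.

```python
# Illustrative sketch: mapping the three content buckets to indexing
# treatment. Bucket names come from the post; everything else is assumed.

PUBLIC_AUDIENCES = {"prospective student", "current student", "parent", "donor", "alum"}

def classify_page(intended_audience: str, internal_only: bool) -> dict:
    """Return the bucket a page belongs in and the treatment it implies."""
    if internal_only:
        # Bucket 3: behind login, never publicly indexed.
        return {"bucket": "authenticated", "indexable": False, "requires_login": True}
    if intended_audience in PUBLIC_AUDIENCES:
        # Bucket 1: optimized for prospective students and AI search.
        return {"bucket": "public-facing", "indexable": True, "requires_login": False}
    # Bucket 2: publicly accessible, real but narrow audience
    # (compliance disclosures, archives, technical reference).
    return {"bucket": "publicly accessible", "indexable": True, "requires_login": False}

print(classify_page("prospective student", internal_only=False))
print(classify_page("auditor", internal_only=False))
print(classify_page("faculty committee", internal_only=True))
```

The point of a rule like this isn't the code; it's that the classification call can be made the same way every time a new page is created, without a meeting.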

Not Sure Where Your Institution Stands?

iFactory’s AI Search Readiness Audit evaluates whether tools like ChatGPT, Perplexity, Google AI Overviews, and Claude can find, understand, and recommend your institution to prospective students.

Frequently asked questions

Why does a website redesign turn into a campus-wide conversation?

Because the website is the only marketing property that has to work for every audience at once. When the redesign team starts looking at how prospective students actually search, navigate, and decide, the institution is forced to confront the gap between how it’s organized internally and how prospective students think about it from outside, as one entity with a question. Most internal stakeholders don’t have direct access to the prospective-student view in their day-to-day work. The redesign is the first time the institution sees itself through that lens, and the questions that surface, about audiences, brand, ownership, and what belongs on the public site, were already there. The redesign just made them unavoidable.

Can we skip the research and go straight to design?

You can go straight to design. Most institutions that do find themselves making the same arguments six months in that they would have made on day one, just with more sunk cost. The reason UX research exists is that it converts opinion-driven debates about audiences and priorities into evidence-driven decisions. Surveys, audience modeling, journey mapping, stakeholder interviews, and (when the site’s structure is the central problem) card sorting do most of this work. Skipping the research doesn’t make the arguments go away. It just makes them louder later.

What does it mean to make the site AI-search ready?

It means treating the entire public-facing site, not just the marketing pages, as potential source material for AI search, and making structural decisions accordingly. Schema markup, structured content, consistent entity references, and a deliberate classification of what belongs in the public root versus what should move behind authentication or be deindexed. AI readiness isn’t a separate workstream from the redesign. It’s a property of doing the redesign well. The institutions that bake it in will see compounding returns over the next several years. The institutions that don’t will find themselves doing it as a remediation project later.
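For concreteness, the schema markup and entity references described here usually take the form of JSON-LD embedded in the page. Below is a minimal, hypothetical example for a program page, built and serialized in Python; the institution, program name, and values are placeholders, and schema.org offers many more properties than shown.

```python
import json

# Hypothetical JSON-LD for a program page, using schema.org types.
# "Example University" and the program details are placeholders.
program_schema = {
    "@context": "https://schema.org",
    "@type": "EducationalOccupationalProgram",
    "name": "B.S. in Computer Science",
    "provider": {
        "@type": "CollegeOrUniversity",
        "name": "Example University",
        "sameAs": "https://www.example.edu",
    },
    "educationalCredentialAwarded": "Bachelor of Science",
    "timeToComplete": "P4Y",  # ISO 8601 duration: four years
}

# The serialized object is embedded in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(program_schema, indent=2))
```

Consistent entity reference means the institution is named and linked the same way (here via `sameAs`) on every page that carries markup, so retrieval systems can connect the pages to one entity.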

How do we decide which bucket a page belongs in?

A useful frame is to ask whether a prospective student, current student, parent, donor, or alum is the intended reader. If yes, public-facing. If the page exists for an internal audience (faculty committee minutes, departmental archives, internal-facing policy drafts), it usually belongs behind authentication or in a system other than the main CMS. The middle category, publicly accessible but de-prioritized, covers content with a real but narrow audience: compliance disclosures, archived policies, technical reference material. Most institutions have never been forced to make this classification call deliberately. The redesign is the moment to make it on purpose, with criteria the team can apply to new content as it gets created.
