CU Reviews Admin Dashboard

Overhauling the internal system of a course review website.

Role

Product Designer

Timeline

September 2024 - December 2024

Team

1 Product Manager, 2 Designers, 5 Developers, 1 Marketing Manager

Toolkit

Figma, User Research, User Testing

OVERVIEW

What is CU Reviews?

CU Reviews is a platform where Cornell students share honest course reviews with each other. Built by Cornell's largest software development team, it’s grown to over 17,000 users and more than 5,000 reviews.

But while the user-facing side was thriving, the admin side... not so much.

THE PROBLEM

An outdated admin experience

When I joined the CU Reviews team, I discovered our admin dashboard hadn’t been touched since its initial setup in 2019. The existing system for managing reviews was outdated and limited in functionality, causing delayed review approvals and confused admin users. My task was to overhaul this experience while adding crucial features for managing admin users and tracking analytics.

THE SOLUTION

Approving reviews

Admins can now confidently sort through reviews with safeguard pop-ups and confirmation snackbars. They can also seamlessly navigate tabs to view recently approved reviews and reported reviews.

Viewing analytics

Admins can finally monitor key metrics to understand how CU Reviews is performing. They can also select a specific time period, allowing them to see trends over time and measure the impact of marketing events.

Managing admin users

Users can filter, sort, and search to find specific admins in a centralized interface. They can also add, edit, or remove admin accounts.

Testing developer tools

Developers now have a place to test in-progress functions and easily view testing history. They can also quickly access GPT Costing information.

RESEARCH

I started by interviewing 5 admin users.

I chose users across roles: designers, developers, PMs, and marketers. My research goal was to understand 1) what people wished the dashboard could do and 2) how people actually used the dashboard. One thing became immediately clear: almost no one used the admin dashboard, and when they did, it left them frustrated and confused. As one admin put it, "I barely remember the last time I used the page – there’s nothing that’s relevant to me."

Why wasn't anyone using the admin dashboard?

Through affinity mapping of the interview responses, I uncovered several critical pain points:

1.  Cluttered with non-functional features

Most buttons didn’t work. Developers clarified that they were leftovers from backend testing, but other roles were left confused.

2.  Lack of feedback

The interface gave no visible response when features were used, leaving our team unsure whether their actions had gone through.

3. No meaningful metrics

The existing metrics weren't helping our marketing and product teams actually make decisions.

INFORMATION ARCHITECTURE

An all-in-one dashboard?

Guided by the user needs, I decided on content requirements for the new admin site. My first instinct was to create a consolidated dashboard with everything on a single screen: review management, admin controls, and analytics.

However, during a discussion with developers, our technical PM made a compelling point: "Our admin interface shouldn't try to do everything at once. It should be organized and focused." That insight prompted a restructure.

Instead of one overwhelming page, I broke the experience into separate pages, each serving a specific purpose.

ITERATIONS

Deciding on button layouts

As I designed the review approval flow, I was torn between two button layouts: spacing the buttons apart to prevent mistakes versus placing them close together for speed. To answer this dilemma, I brought both versions to users for testing, and quickly found that people preferred adjacent buttons – efficiency beat out error prevention.


I also added universally recognizable icons alongside the text for quicker, more intuitive understanding, and adjusted our design system's color contrast ratios to meet WCAG standards, making the buttons clear and distinguishable for users of all visual abilities.
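To make the contrast work concrete, here is a minimal sketch of the WCAG 2.1 check behind those adjustments, written in TypeScript purely as an illustration. The relative-luminance formula and the 4.5:1 threshold for normal text come from the WCAG spec; the example colors are hypothetical, not our actual palette.

```ts
// Illustrative WCAG 2.1 contrast check; colors below are examples only.

/** Convert an 8-bit sRGB channel to linear light (WCAG 2.1 definition). */
const linearize = (channel: number): number => {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
};

/** Relative luminance of an [r, g, b] color. */
const luminance = ([r, g, b]: [number, number, number]): number =>
  0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);

/** Contrast ratio between two colors, always >= 1. */
const contrastRatio = (
  a: [number, number, number],
  b: [number, number, number]
): number => {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
};

// Example: white button text on a hypothetical brand red.
const ratio = contrastRatio([255, 255, 255], [203, 46, 46]);
console.log(ratio.toFixed(2), ratio >= 4.5 ? "passes AA (normal text)" : "fails AA");
```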

WORKING COLLABORATIVELY

Crafting the analytics dashboard

Designing the analytics dashboard came with a new question: "Should we use library components or custom graphs?"

I consulted with developers on my team and learned that pre-built components would be faster to implement, while custom graphs would give us the flexibility to evolve with our needs. I decided on a flexible solution: designing custom graphs that could be implemented with either approach.
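As a sketch of what that flexibility could look like in practice, each chart can be described by a small, library-agnostic props contract that either a pre-built charting component or a custom SVG graph could accept. The type names below are illustrative assumptions, not the team's actual code.

```ts
// A library-agnostic contract for the analytics charts. Both a pre-built
// charting component and a custom SVG graph could accept the same props,
// so the design doesn't lock developers into either approach.

export interface TimeSeriesPoint {
  date: string;   // ISO date, e.g. "2024-11-01"
  value: number;  // metric value for that day (reviews submitted, page visits, ...)
}

export interface MetricChartProps {
  title: string;                          // e.g. "Reviews submitted"
  data: TimeSeriesPoint[];
  range: { start: string; end: string }; // the admin-selected time period
}

// Either implementation satisfies the same contract:
//   <LibraryMetricChart {...props} />    (pre-built library components)
//   <CustomSvgMetricChart {...props} />  (custom-drawn SVG graphs)
```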

After creating several iterations of the data visualizations, I shared versions with my subteam for feedback. Through our discussions, we landed on three key elements that told the clearest story: a consolidated overview adjustable by date, a linear progress bar to visualize goal metrics, and categorized bar graphs that made data easy to understand.

FEEDBACK FROM MORE PERSPECTIVES

Realizing I made a mistake

I kept the "Add Admin" feature in a header just because that's where it was in the old interface. The whole team had gotten used to it, so none of us questioned it.

Then a new team member tested it and asked: "Why is this in a header and not a pop-up?" A day later, I brought it to a design critique, and other designers said the exact same thing: the overhead bar looked like a search filter, not an admin tool. They suggested a fix: make it a button that opens a modal.

Getting the same feedback twice from completely different people was the wake-up call I needed. I went back and redesigned the admin management flow, this time as a pop-up modal.

INTERACTION DESIGN

Designing for clarity and confidence

As I redesigned the interface, I kept returning to Don Norman’s The Design of Everyday Things. His breakdown of affordances (knowing what will happen) and feedback (knowing what did happen) shaped a lot of my decisions.

I added affordances and feedback to guide user actions.

I spent time digging into Material Design principles on material.io to understand established best practices. For affordances, I added hover states to signal that buttons were active and clickable. On the feedback side, I added cues that told users their actions were received and understood, including loading states, color changes, and snackbars to confirm successfully approved reviews.
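To illustrate how those feedback cues fit together, here is a minimal sketch of the approve action with a loading state and a snackbar-style confirmation, assuming a React/TypeScript frontend. The function name, endpoint, and markup are hypothetical placeholders rather than CU Reviews' real code.

```tsx
import { useState } from "react";

// Hypothetical stand-in for the real approval request; the actual endpoint
// isn't shown in this case study.
async function approveReview(reviewId: string): Promise<void> {
  const res = await fetch(`/api/admin/reviews/${reviewId}/approve`, { method: "POST" });
  if (!res.ok) throw new Error("approval failed");
}

export function ApproveButton({ reviewId }: { reviewId: string }) {
  // Track the action's lifecycle so the UI can show feedback at each step.
  const [status, setStatus] = useState<"idle" | "loading" | "done" | "error">("idle");

  const handleApprove = async () => {
    setStatus("loading");               // feedback: the click registered, work is in progress
    try {
      await approveReview(reviewId);
      setStatus("done");                // feedback: the review really was approved
    } catch {
      setStatus("error");               // feedback: something failed, so the admin can retry
    }
  };

  return (
    <>
      <button onClick={handleApprove} disabled={status === "loading"}>
        {status === "loading" ? "Approving..." : "Approve"}
      </button>
      {/* Snackbar-style confirmation instead of silent success */}
      {status === "done" && <div role="status">Review approved ✓</div>}
      {status === "error" && <div role="alert">Approval failed, please try again</div>}
    </>
  );
}
```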

I used safeguards to prevent “oops” moments.

For critical actions like deleting reviews or removing admins, I added pop-up modals. It’s a simple reminder to double-check, but it gives users a moment to pause and avoid mistakes.
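Below is a minimal sketch of that safeguard pattern, again assuming React and using hypothetical names: the destructive action only runs after the admin explicitly confirms in the modal.

```tsx
import { useState } from "react";

// A reusable confirmation step: the caller passes in the destructive action,
// and it only runs after the admin explicitly confirms.
export function ConfirmAction({
  label,
  warning,
  onConfirm,
}: {
  label: string;        // e.g. "Remove admin"
  warning: string;      // e.g. "This will revoke their access immediately."
  onConfirm: () => void;
}) {
  const [open, setOpen] = useState(false);

  return (
    <>
      <button onClick={() => setOpen(true)}>{label}</button>
      {open && (
        <div role="dialog" aria-modal="true">
          <p>{warning}</p>
          <button onClick={() => setOpen(false)}>Cancel</button>
          <button
            onClick={() => {
              setOpen(false);
              onConfirm(); // the destructive action only fires here
            }}
          >
            Yes, continue
          </button>
        </div>
      )}
    </>
  );
}

// Usage (removeAdmin is a hypothetical handler):
// <ConfirmAction label="Remove admin" warning="This can't be undone." onConfirm={removeAdmin} />
```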

CONSISTENT DESIGN

Unifying patterns across the platform

As I was finalizing my designs, I made an important discovery: a fellow designer on a different CU Reviews project had created filter and sort patterns, but included a reset button – a thoughtful detail I had missed in my own work. Even though their feature wasn't live yet, I knew we needed to be consistent. I quickly updated my designs to match.

This moment stuck with me. It reminded me that good design isn't about getting everything perfect the first time – it's about staying open to improvements, even in those final stages. While it's easy to rush toward the finish line, taking a step back to align with other designers' work ultimately creates a better, more cohesive experience.

FINAL PROTOTYPES

Approving reviews

Admins can now confidently sort through reviews with safeguard pop-ups and confirmation snackbars. They can also seamlessly navigate tabs to view recently approved reviews and reported reviews.

Viewing analytics

Admins can finally monitor key metrics to understand how CU Reviews is performing. They can also select a specific time period, allowing them to see trends over time and measure the impact of marketing events.

Managing admin users

Users can filter, sort, and search to find specific admins in a centralized interface. They can also add, edit, or remove admin accounts.

Testing developer tools

Developers now have a place to test in-progress functions and easily view testing history. They can also quickly access GPT Costing information.

WHAT'S NEXT

As I wrap up my design work for CU Reviews, I’ve handed off detailed documentation to the developers to guide the next phase: implementation. While I’ll be shifting my focus to new design projects, I’m not stepping away completely. I’ll stay involved through check-ins to answer questions, clarify design choices, and ensure that the thoughtful decisions I made throughout the design process come to life as intended. After all, the success of this redesign will ultimately be measured by how well it serves our admin users in practice.

REFLECTIONS

Learning to work cross-functionally

This project was my first deep dive into cross-functional collaboration, and it reshaped how I approach design. I found myself reaching out to team members across different roles for their unique perspectives: my PM helped prioritize which metrics mattered most, our marketing manager shared ideas for effective data visualization, and developers kept my goals grounded in technical realities.

One moment that stood out was when I initially suggested color-changing progress bars to indicate completion status. It seemed like a great idea to me, but the developers explained it would add unnecessary complexity for an internal tool. Conversations like these clarified my goals, helping me focus on what mattered in the project’s context.

Feedback, feedback, feedback!

I made it a habit to seek feedback at every stage, whether through methodical user testing, formal critique sessions with other designers, or even quick questions in our team Slack. Each conversation helped shape the rationale behind my design decisions. This constant feedback loop was essential in helping me spot areas for improvement, uncover perspectives I hadn’t considered, and ultimately craft a well-rounded solution.

thanks for stopping by. leave a message!

renee du © 2025
