// webdev · nextjs · typescript · networking
I Built My Own CCNA Practice Exam App
The honest origin of this project: I didn't want to pay for a Udemy course.
I was studying for the CCNA and the practice exam platforms I found were either expensive subscriptions or question banks that felt outdated. So I did what any reasonable person would do — spent several weekends building my own, which almost certainly took longer and cost more in time than a subscription ever would. No regrets.
What it does
The app is a full CCNA 200-301 exam simulator. The core of it is a 293-question bank I wrote and categorised across the five CCNA domains — networking concepts, network implementation, network operations, network security, and network troubleshooting. Each question has four options and a detailed explanation of the correct answer.
At exam time it pulls 120 random questions, starts an 80-minute timer, and requires 82.5% to pass, mirroring the real exam's 825/1000 pass threshold. Beyond the main exam there's a domain-specific study mode, a subnet calculator with binary visualisation, a searchable Cisco IOS command reference, and five guided Packet Tracer labs.
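The pass check itself is simple enough to pin down in code. A minimal sketch (the function name is mine, not the app's):

```typescript
// Hypothetical helper, not the app's actual code. Integer arithmetic avoids
// floating-point rounding right at the 82.5% boundary.
function hasPassed(correct: number, total: number): boolean {
  return correct * 1000 >= total * 825
}

console.log(hasPassed(99, 120)) // true: 99/120 is exactly 82.5%
console.log(hasPassed(98, 120)) // false: 98/120 is about 81.7%
```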
The question bank
Sourcing and organising 293 questions was the most time-consuming part of the project. I gathered them from various practice resources across the internet — forums, study guides, community dumps — then went through each one, verified the answers, wrote or cleaned up the explanations, and categorised them by CCNA domain. A lot of the raw material needed work before it was usable: ambiguous wording, wrong answers, explanations that didn't actually explain anything.
That curation process was more useful than just sitting a practice exam would have been. Going through a question and asking "is this actually correct, and can I explain why?" is a different level of engagement than just clicking an answer and moving on.
The data structure is simple:
interface Question {
  id: number
  question: string
  options: string[] // always 4
  answer: number // index into options[]
  domain: string
  explanation: string
}
All 293 questions live in a single TypeScript file. No database for question storage — just a typed array that gets imported where it's needed. Simple, fast, and easy to add to.
The exam engine
The part I'm most pleased with is how the randomisation works. Every time you start an exam, two things happen: 120 questions are selected at random from the 293, and then the answer options within each question are shuffled independently.
That second part matters. Without it you can start memorising answer positions — "the answer to the OSPF question is always B". Reshuffling the options on every attempt means you have to actually know why an answer is correct:
return selected.map((q) => {
  // Remember the correct answer by value before shuffling...
  const correctOption = q.options[q.answer]
  const shuffledOptions = [...q.options].sort(() => Math.random() - 0.5)
  return {
    ...q,
    options: shuffledOptions,
    // ...then find its new index after the shuffle
    answer: shuffledOptions.indexOf(correctOption),
  }
})
The key thing here is finding the correct answer's new position after shuffling by value, not by index. Straightforward once you see it, but it took me a moment to think through cleanly.
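One caveat with the snippet above: `sort(() => Math.random() - 0.5)` is a biased shuffle, and JavaScript engines make no guarantees about how an inconsistent comparator behaves. Fisher-Yates is the standard unbiased alternative. A sketch of how the same select-then-shuffle step could look with it (function names are mine):

```typescript
// Same Question shape as the interface defined earlier.
interface Question {
  id: number
  question: string
  options: string[]
  answer: number
  domain: string
  explanation: string
}

// In-place Fisher-Yates over a copy: every permutation is equally likely,
// unlike sort(() => Math.random() - 0.5).
function shuffle<T>(items: T[]): T[] {
  const out = [...items]
  for (let i = out.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1))
    ;[out[i], out[j]] = [out[j], out[i]]
  }
  return out
}

// Pick n questions at random, then reshuffle each question's options,
// re-deriving the answer index by value as before.
function buildExam(bank: Question[], n: number): Question[] {
  return shuffle(bank).slice(0, n).map((q) => {
    const correctOption = q.options[q.answer]
    const options = shuffle(q.options)
    return { ...q, options, answer: options.indexOf(correctOption) }
  })
}
```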
The timer colours shift from grey to amber under ten minutes and to red under five — a small thing, but it adds genuine urgency. The exam auto-submits at zero.
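The threshold logic is tiny. A sketch of the colour mapping as a pure function (names hypothetical):

```typescript
type TimerColour = 'grey' | 'amber' | 'red'

// Hypothetical helper: grey by default, amber under ten minutes, red under five.
function timerColour(secondsLeft: number): TimerColour {
  if (secondsLeft < 5 * 60) return 'red'
  if (secondsLeft < 10 * 60) return 'amber'
  return 'grey'
}
```

Keeping it a pure function of the remaining seconds also makes the thresholds trivially testable.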
Persistence: Supabase + localStorage
I wanted exam history to persist across sessions, which meant I needed a backend. I used Supabase — it handles auth and gives you a Postgres database with a straightforward JavaScript client.
The pattern I landed on was dual persistence: every completed exam saves to localStorage immediately, and if the user is logged in, it also writes to Supabase. Guests get full functionality with local history; logged-in users get cloud sync across devices. No account required to use the app.
// Always save locally (trim history to the 50 most recent results)
localStorage.setItem('ccna_history', JSON.stringify(recent.slice(0, 50)))

// Additionally persist to Supabase if authenticated
if (user) {
  await supabase.from('exam_results').insert({
    user_id: user.id,
    score: correct,
    total,
    passed,
    domain_stats: domainStats,
  })
}
The results screen breaks down your score by domain and flags anything below 80% with a "Drill →" link that takes you straight to the study mode filtered to that domain. If you're consistently failing Network Security questions, you go drill Network Security questions.
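A sketch of how that per-domain tally might work, assuming each answered question records its domain and whether it was answered correctly (types and names are mine, not the app's):

```typescript
interface DomainStat { correct: number; total: number }

// Tally correct/total per domain in one pass over the answered questions.
function domainBreakdown(
  results: { domain: string; correct: boolean }[],
): Record<string, DomainStat> {
  const stats: Record<string, DomainStat> = {}
  for (const r of results) {
    const s = (stats[r.domain] ??= { correct: 0, total: 0 })
    s.total++
    if (r.correct) s.correct++
  }
  return stats
}

// Domains scoring under 80% get flagged for drilling.
function weakDomains(stats: Record<string, DomainStat>): string[] {
  return Object.entries(stats)
    .filter(([, s]) => s.correct / s.total < 0.8)
    .map(([domain]) => domain)
}
```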
Hiding the navbar during the exam
A small detail: the navbar disappears when you're in an active exam. I didn't want to use conditional rendering and cause layout shift, so I handled it through context instead.
A NavBarContext exposes a setHidden function. The exam component calls it based on the current phase:
useEffect(() => {
  setHidden(phase === 'exam')
}, [phase, setHidden])
The navbar component reads from context and returns null if hidden. Clean, no flicker.
The labs
Alongside the exam engine I built five Packet Tracer labs — hands-on exercises covering VLANs and trunking, Spanning Tree Protocol, inter-VLAN routing, OSPFv2, and access control lists.
Each lab follows the same structure: objectives, a topology diagram with IP addressing, step-by-step IOS commands, a troubleshooting scenario with realistic faults, and a six-question knowledge check that auto-grades and explains wrong answers. The .pkt files are served from /public/ for download.
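That shared structure suggests a typed shape for the labs. A hypothetical sketch; the field names are my guesses at something that fits the description, not the app's actual types:

```typescript
// Illustrative types only, not the app's code.
interface KnowledgeCheckQuestion {
  question: string
  options: string[]
  answer: number // index into options
  explanation: string // shown when the answer is wrong
}

interface Lab {
  title: string
  objectives: string[]
  topologyDiagram: string // path to the diagram with IP addressing
  commands: string[] // step-by-step IOS commands
  troubleshooting: string // the seeded-fault scenario
  knowledgeCheck: KnowledgeCheckQuestion[] // six questions, auto-graded
  pktFile: string // download path under /public/
}
```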
The troubleshooting scenarios were deliberately tricky. The STP lab, for example, ships with SW3 unexpectedly winning the root bridge election because it happened to have the lowest MAC address. The OSPF lab has mismatched hello/dead timers. The kind of faults that catch you out in the real world because they're not immediately obvious from show commands alone.
What I'd do differently
The question bank lives in a TypeScript file. That was fine at 50 questions. At 293 it's a 2,800-line file that's annoying to edit. If I were starting again I'd store questions in JSON or markdown with frontmatter and load them at build time — easier to edit, easier to diff in git.
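A sketch of what that migration could look like: questions stored as plain JSON, validated by a small loader at build time so a malformed record still fails the build the way a type error does today. The loader is hypothetical:

```typescript
// Hypothetical build-time loader, not the app's code. Validates each record
// so a question missing a field (e.g. its explanation) fails the build
// rather than surfacing as a runtime bug.
interface Question {
  id: number
  question: string
  options: string[]
  answer: number
  domain: string
  explanation: string
}

const REQUIRED_FIELDS = ['id', 'question', 'options', 'answer', 'domain', 'explanation'] as const

function parseQuestions(raw: string): Question[] {
  const data = JSON.parse(raw)
  if (!Array.isArray(data)) throw new Error('question file must be a JSON array')
  return data.map((q, i) => {
    for (const field of REQUIRED_FIELDS) {
      if (!(field in q)) throw new Error(`question ${i}: missing "${field}"`)
    }
    if (!Array.isArray(q.options) || q.options.length !== 4) {
      throw new Error(`question ${i}: expected exactly 4 options`)
    }
    if (q.answer < 0 || q.answer >= 4) {
      throw new Error(`question ${i}: answer index out of range`)
    }
    return q as Question
  })
}
```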
I also underestimated how much a proper review mode would matter. The current results screen shows incorrect answers inline, which works, but a dedicated flashcard-style review loop would be more useful for spaced repetition.
The stack
Next.js 16 App Router, TypeScript, Tailwind CSS v4, Supabase for auth and exam history. No UI component library — everything built from scratch with Tailwind. No state management library — React hooks and context throughout.
TypeScript was a deliberate choice over the JavaScript I'd used elsewhere on this site. With 293 questions and a non-trivial exam engine, having the type system catch shape mismatches at compile time was worth the overhead. A question missing its explanation field would have been a silent runtime bug in JavaScript; TypeScript turns it into a build error.
Building this reinforced something I keep coming back to: active engagement with material beats passive consumption. Curating the questions, verifying the answers, cleaning up the explanations, building the lab scenarios — all of it required actually understanding the content rather than just reading it.
It also meant I went into my Network+ exam having worked through variations of most of the concepts on the syllabus multiple times. That didn't hurt.