Use Cases

Real Problems NORDON Solves

Every developer who uses AI coding assistants hits the same walls. Here are the everyday scenarios where NORDON makes the biggest difference.

01

Multi-Session Debugging

Without NORDON

You're debugging a tricky issue. You spend an hour with your AI assistant, try several approaches, and narrow it down to a race condition. But your session times out. When you start a new session, the AI has no idea what you already tried. You re-explain the error, re-share the stack trace, and watch it suggest the same fixes you already ruled out.

With NORDON

NORDON remembers every failed attempt. When your next session starts, your AI already knows: the error message, what was tried, what didn't work, and what the leading hypothesis was. It picks up exactly where you left off.

What your AI sees at session start

<nordon-context>
  <memory type="failure" importance="0.95">
  ## Race condition in OrderService.checkout()
  Intermittent 500 error when two requests hit checkout simultaneously.
  Tried: mutex lock (deadlocked), optimistic locking (still races on read).
  Root cause narrowed to: stale cache read between validateInventory()
  and deductInventory(). Next step: try read-through cache invalidation.
  </memory>
</nordon-context>
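The stale-read race in this memory can be sketched deterministically. Everything below is illustrative: `validate_inventory` and `deduct_inventory` echo the memory entry, but the store and checkout classes are hypothetical stand-ins, not NORDON or project code.

```python
# Hypothetical sketch of the stale-cache race above: validation reads a
# cached stock count, so two overlapping checkouts can both pass it.

class InventoryStore:
    """Source of truth for the stock count."""
    def __init__(self, stock):
        self.stock = stock

class CachedCheckout:
    def __init__(self, store):
        self.store = store
        self.cache = store.stock  # snapshot taken at request start; may go stale

    def validate_inventory(self):
        # BUG: validates against the cached snapshot, not the store
        return self.cache > 0

    def deduct_inventory(self):
        self.store.stock -= 1

    def checkout(self):
        if not self.validate_inventory():
            raise RuntimeError("out of stock")
        self.deduct_inventory()

class InvalidatingCheckout(CachedCheckout):
    """Read-through invalidation: re-read the store right before validating."""
    def checkout(self):
        self.cache = self.store.stock  # invalidate the stale snapshot
        if not self.validate_inventory():
            raise RuntimeError("out of stock")
        self.deduct_inventory()
```

With one item in stock, two `CachedCheckout` requests created back-to-back both validate against the stale snapshot and drive stock negative; the invalidating version rejects the second. Note this only narrows the race window; a production fix would also make the check-and-deduct step atomic.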

02

Onboarding to a New Codebase

Without NORDON

A new developer joins your team. They spend days asking questions: Where's the deployment config? Why is this service split into two? What's the naming convention for API routes? Every answer lives in someone's head or a Slack thread from six months ago.

With NORDON

Your team's accumulated knowledge -- architecture decisions, deployment procedures, naming conventions, gotchas -- is already stored as NORDON memories. The new dev's AI assistant automatically gets all of it from day one.

What your AI sees at session start

<nordon-context>
  <memory type="procedure" importance="0.85">
  ## Deployment to Production
  1. Run tests: npm run test:ci
  2. Build: npm run build:prod
  3. Deploy via: kubectl apply -f k8s/prod/
  4. IMPORTANT: Always run db:migrate BEFORE deploying the app container
  5. Verify: curl https://api.example.com/health
  Note: Never deploy on Fridays. See incident #247.
  </memory>
</nordon-context>
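A procedure memory like this can also be encoded so its constraints are checked rather than remembered. The commands below come from the memory entry; `plan_deploy` and its Friday guard are a hypothetical sketch, not an actual NORDON or team script.

```python
# Illustrative sketch: encode the deploy steps above as data, and assert
# the two rules from the memory entry (migrate before the app container,
# no Friday deploys per incident #247) instead of trusting recall.

import datetime

STEPS = [
    "npm run test:ci",
    "npm run build:prod",
    "npm run db:migrate",               # IMPORTANT: before the app container
    "kubectl apply -f k8s/prod/",
    "curl https://api.example.com/health",
]

def plan_deploy(today=None):
    """Return the ordered commands, refusing to plan a Friday deploy."""
    today = today or datetime.date.today()
    if today.weekday() == 4:  # Monday is 0, so Friday is 4
        raise RuntimeError("No Friday deploys (see incident #247)")
    # Ordering constraint from step 4 of the procedure:
    assert STEPS.index("npm run db:migrate") < STEPS.index("kubectl apply -f k8s/prod/")
    return STEPS
```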

03

Maintaining Consistency Across PRs

Without NORDON

Developer A decides to use REST for the new payments API. Two weeks later, Developer B starts building the notifications API using GraphQL. Nobody remembers (or communicates) the original decision. Now you have two API styles and a mess to clean up.

With NORDON

NORDON surfaces past architecture decisions automatically. When Developer B starts a new API, their AI assistant already knows the team chose REST and why. Conflicts get caught before code is written.

What your AI sees at session start

<nordon-context>
  <memory type="decision" importance="0.9">
  ## REST over GraphQL for all public APIs
  Decision date: 2025-11-15
  Chose REST for API layer. Benchmarks showed 3x better performance
  on our read-heavy workload with proper caching. GraphQL overhead
  not justified for our use case. Team agreed unanimously.
  Applies to: all new API endpoints in /api/v2/
  </memory>
</nordon-context>

04

Avoiding Repeated Mistakes

Without NORDON

Every few weeks, someone on the team hits the same deployment issue: the migration runs after the app container starts, causing 500 errors for 30 seconds. Each time, someone debugs it from scratch, wastes an hour, and fixes it the same way.

With NORDON

NORDON stores failure memories. When similar patterns are detected -- like someone writing a deployment script or modifying the CI pipeline -- the relevant failure memory surfaces automatically with the fix.

What your AI sees at session start

<nordon-context>
  <memory type="failure" importance="0.92">
  ## Migration timing bug in CI/CD pipeline
  Problem: App container starts before DB migration completes.
  Symptom: 500 errors for 30-60s after deploy.
  Fix: Add initContainer to k8s deployment that runs migrations
  before app pod starts. See commit abc123.
  WARNING: This will recur if anyone modifies deploy.yaml
  without the initContainer dependency.
  </memory>
</nordon-context>

05

Complex Feature Development

Without NORDON

You're building a multi-file feature over several days. Day one, you set up the data model. Day two, the API layer. Day three, the frontend. But by day three, your AI doesn't remember the data model decisions from day one, or the API contract from day two.

With NORDON

NORDON's branch-aware memory keeps all feature context scoped to the branch. Every session on that branch gets the full picture: what was built, what's left, and what decisions shaped the architecture.

What your AI sees at session start

<nordon-context>
  <memory type="context" importance="0.88">
  ## Feature: User permissions system (branch: feat/permissions)
  Day 1: Created Permission and Role models with RBAC schema.
  Day 2: Built /api/v2/permissions endpoints. Uses middleware
  pattern for route-level checks. Added caching layer.
  Remaining: Frontend permission gates, admin UI for role mgmt.
  Key constraint: Must be backward-compatible with existing
  session tokens. See constraint memory #142.
  </memory>
</nordon-context>

06

Code Review Context

Without NORDON

You're reviewing a PR and see an unusual implementation choice -- why did they use polling instead of WebSockets? Why is there a 500ms delay hardcoded? Without context, you either approve blindly or leave a comment that wastes everyone's time.

With NORDON

NORDON's decision memories capture the 'why' behind implementation choices. Reviewers (and their AI assistants) can see the reasoning that led to each decision, making reviews faster and more informed.

What your AI sees at session start

<nordon-context>
  <memory type="decision" importance="0.82">
  ## Polling over WebSockets for dashboard updates
  WebSockets caused connection storms behind our load balancer
  (ALB has 100-connection limit per target). Polling every 5s
  with ETag caching gives us near-real-time updates with 90%
  fewer connections. Acceptable tradeoff for our scale.
  Related: constraint memory about ALB connection limits.
  </memory>
</nordon-context>
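The polling-plus-ETag tradeoff in this memory can be sketched with a stand-in for the dashboard API. Only the 200/304 `If-None-Match` semantics here are standard HTTP; `make_fake_server` and `poll_once` are hypothetical, not the team's real client.

```python
# Illustrative sketch of ETag-cached polling: the client sends its last
# ETag, and the server answers 304 (no body) when nothing changed.

def make_fake_server():
    """Return (state, fetch): a mutable state dict and a stand-in fetch."""
    state = {"etag": "v1", "body": {"orders": 42}}
    def fetch(if_none_match=None):
        if if_none_match == state["etag"]:
            return 304, state["etag"], None   # unchanged: no body transferred
        return 200, state["etag"], state["body"]
    return state, fetch

def poll_once(fetch, cached_etag, cached_body):
    """One poll cycle: send If-None-Match, keep the cache on a 304."""
    status, etag, body = fetch(if_none_match=cached_etag)
    if status == 304:
        return cached_etag, cached_body, False  # nothing new to render
    return etag, body, True                     # dashboard should re-render
```

In a real client the loop would run every 5 seconds; most cycles hit the 304 path, which is what cuts the connection and transfer load the memory describes.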

Stop repeating yourself

Install NORDON and let your AI assistant remember what matters.