docs: add phase 4-7 instruction guides

Joseph Doherty
2026-02-26 06:23:13 -05:00
parent 1bc64cf36e
commit ca6ed0f09f
4 changed files with 864 additions and 0 deletions


@@ -0,0 +1,186 @@
# Phase 4: .NET Solution Design
Design the target .NET 10 solution structure and map every Go item to its .NET counterpart. This phase translates the Go codebase decomposition (from Phases 1-2) and library mappings (from Phase 3) into a concrete .NET implementation plan.
## Objective
Every module, feature, and test in the porting database must have either a .NET mapping (project, namespace, class, method) or a justified N/A status. The result is a complete blueprint for the porting work in Phase 6.
## Prerequisites
- Phases 1-3 complete: all Go items in the DB, all libraries mapped
- Verify with: `dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db`
## Solution Structure
Define the .NET solution layout following standard conventions:
```
src/
NATS.Server/ # Main server library (all core logic)
Protocol/ # Wire protocol parsing, commands
Subscriptions/ # SubList trie, subject matching
JetStream/ # Stream management, consumers
Cluster/ # Routes, gateways, leaf nodes
Auth/ # Authentication, accounts, JWT
...
NATS.Server.Host/ # Host/entry point (Program.cs, DI, config)
tests/
NATS.Server.Tests/ # Unit tests for NATS.Server
Protocol/
Subscriptions/
JetStream/
...
NATS.Server.IntegrationTests/ # Cross-module and end-to-end tests
```
The `NATS.Server` project holds all portable logic. `NATS.Server.Host` is the thin entry point that wires up dependency injection, configuration, and hosting. Tests mirror the source structure.
## Naming Conventions
Follow these rules consistently when mapping Go items to .NET:
| Aspect | Convention | Example |
|--------|-----------|---------|
| Classes | PascalCase | `NatsParser`, `SubList`, `JetStreamController` |
| Methods | PascalCase | `TryParse`, `Match`, `ProcessMessage` |
| Namespaces | `NATS.Server.[Module]` | `NATS.Server.Protocol`, `NATS.Server.Subscriptions` |
| Test classes | `[ClassName]Tests` | `NatsParserTests`, `SubListTests` |
| Test methods | `[Method]_[Scenario]_[Expected]` | `TryParse_ValidInput_ReturnsTrue` |
| Interfaces | `I[Name]` | `IMessageRouter`, `ISubListAccess` |
| Projects | `NATS.Server[.Suffix]` | `NATS.Server`, `NATS.Server.Host` |
Avoid abbreviations unless they are universally understood (e.g., `TCP`, `TLS`, `JWT`). Prefer descriptive names over short ones.
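Several rows of the table combine in a short sketch (the names here are illustrative, not taken from the actual mapping):

```csharp
namespace NATS.Server.Subscriptions;              // NATS.Server.[Module]

public interface IMessageRouter                   // I[Name]
{
    bool TryRoute(string subject);                // PascalCase method
}

public sealed class SubList : IMessageRouter      // PascalCase class
{
    public bool TryRoute(string subject) => subject.Length > 0;
}
```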
## Steps
### Step 1: Map modules
For each module in the database, assign a .NET project, namespace, and class. The `--namespace` and `--class` options are optional but recommended.
```bash
# List all modules to review
dotnet run --project tools/NatsNet.PortTracker -- module list --db porting.db
# Map a module to its .NET target
dotnet run --project tools/NatsNet.PortTracker -- module map <id> \
--project "NATS.Server" \
--namespace "NATS.Server.Protocol" \
--class "NatsParser" \
--db porting.db
```
Work through all modules systematically. Group related Go files into the same namespace:
| Go package/file pattern | .NET namespace |
|------------------------|----------------|
| `server/parser.go` | `NATS.Server.Protocol` |
| `server/sublist.go` | `NATS.Server.Subscriptions` |
| `server/jetstream*.go` | `NATS.Server.JetStream` |
| `server/route.go`, `server/gateway.go` | `NATS.Server.Cluster` |
| `server/auth.go`, `server/accounts.go` | `NATS.Server.Auth` |
| `server/pse/` | Likely N/A (Go-specific platform code) |
### Step 2: Map features
For each feature (function/method), assign the .NET class and method name:
```bash
# List features for a specific module
dotnet run --project tools/NatsNet.PortTracker -- feature list --module <module_id> --db porting.db
# Map a feature
dotnet run --project tools/NatsNet.PortTracker -- feature map <id> \
--project "NATS.Server" \
--class "NatsParser" \
--method "TryParse" \
--db porting.db
```
When mapping Go functions to .NET methods:
- Go free functions become static methods or instance methods on the appropriate class
- Go methods with receivers map to instance methods on the corresponding .NET class
- Go `init()` functions typically map to static constructors or initialization in DI setup
- Go `goroutine` launches map to `Task`-based async methods
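The first three rules, sketched in C# (class and member names are illustrative):

```csharp
namespace NATS.Server.Subscriptions;

// Go:  func (s *Sublist) Match(subject string) *SublistResult
// The pointer receiver becomes `this` on a reference type, so the
// mutation semantics carry over without extra work.
public sealed class SubList
{
    public SubListResult Match(string subject)
        => throw new NotImplementedException();   // body ported in Phase 6
}

public sealed class SubListResult { }

// Go:  func init() { ... }
// A static constructor runs once, before the type is first used.
public static class ProtocolTables
{
    static ProtocolTables()
    {
        // build lookup tables here instead of Go's package init()
    }
}
```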
### Step 3: Map tests
For each test function, assign the .NET test class and method:
```bash
# List tests for a module
dotnet run --project tools/NatsNet.PortTracker -- test list --module <module_id> --db porting.db
# Map a test
dotnet run --project tools/NatsNet.PortTracker -- test map <id> \
--project "NATS.Server.Tests" \
--class "NatsParserTests" \
--method "TryParse_ValidInput_ReturnsTrue" \
--db porting.db
```
Go test naming (`TestParserValid`) translates to .NET naming (`TryParse_ValidInput_ReturnsTrue`). Each Go `Test*` function maps to one or more `[Fact]` or `[Theory]` methods. Table-driven Go tests often become `[Theory]` with `[InlineData]` or `[MemberData]`.
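For example, a table-driven Go parser test might collapse into a single `[Theory]` (a sketch assuming the `NatsParser.TryParse` mapping from Step 2; the cases are illustrative):

```csharp
using Xunit;

public class NatsParserTests
{
    // Each row of the Go test table becomes one [InlineData] case.
    [Theory]
    [InlineData("PING\r\n", true)]
    [InlineData("PONG\r\n", true)]
    [InlineData("", false)]
    public void TryParse_Input_ReturnsExpected(string input, bool expected)
    {
        var parser = new NatsParser();
        Assert.Equal(expected, parser.TryParse(input));
    }
}
```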
### Step 4: Mark N/A items
Some Go code has no .NET equivalent. Mark these with a clear reason:
```bash
# Mark a module as N/A
dotnet run --project tools/NatsNet.PortTracker -- module set-na <id> \
--reason "Go-specific platform code, not needed in .NET" \
--db porting.db
# Mark a feature as N/A
dotnet run --project tools/NatsNet.PortTracker -- feature set-na <id> \
--reason "Go signal handling, replaced by .NET host lifecycle" \
--db porting.db
```
### Common N/A categories
Items that typically do not need a .NET port:
| Go item | Reason |
|---------|--------|
| `pse_darwin.go`, `pse_linux.go`, `pse_windows.go` | Go-specific platform syscall wrappers; use .NET `System.Diagnostics.Process` instead |
| `disk_avail_windows.go`, `disk_avail_linux.go` | Go-specific disk APIs; use .NET `System.IO.DriveInfo` instead |
| Custom logger (`logger.go`, `log.go`) | Replaced by Serilog via `NATS.Server.Host` |
| Signal handling (`signal.go`) | Replaced by .NET Generic Host `IHostLifetime` |
| Go `sync.Pool`, `sync.Map` wrappers | .NET has `ObjectPool<T>`, `ConcurrentDictionary<K,V>` built-in |
| Build tags / `_test.go` helpers specific to Go test infra | Replaced by xUnit attributes and test fixtures |
| `go:embed` directives | Replaced by embedded resources or `IFileProvider` |
Every N/A must include a reason. Bare N/A status without explanation is not acceptable.
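The first two table rows, for instance, reduce to a few BCL calls — a sketch of why the Go platform wrappers need no port, not the actual replacement code:

```csharp
using System;
using System.Diagnostics;
using System.IO;

// pse_*.go equivalent: per-process CPU and memory stats from the BCL.
var proc = Process.GetCurrentProcess();
Console.WriteLine($"rss={proc.WorkingSet64} cpu={proc.TotalProcessorTime}");

// disk_avail_*.go equivalent: free space on the drive holding a path.
var root = Path.GetPathRoot(Path.GetFullPath("."))!;
Console.WriteLine($"free={new DriveInfo(root).AvailableFreeSpace}");
```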
## Verification
After mapping all items, run a quick check:
```bash
# Count unmapped items (should be 0)
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
# Review all modules — every row should show DotNet Project filled or status n_a
dotnet run --project tools/NatsNet.PortTracker -- module list --db porting.db
# Review N/A items to confirm they all have reasons
dotnet run --project tools/NatsNet.PortTracker -- module list --status n_a --db porting.db
dotnet run --project tools/NatsNet.PortTracker -- feature list --status n_a --db porting.db
```
## Completion Criteria
- Every module has `dotnet_project` and `dotnet_namespace` set, or status is `n_a` with a reason
- Every feature has `dotnet_project`, `dotnet_class`, and `dotnet_method` set, or status is `n_a` with a reason
- Every test has `dotnet_project`, `dotnet_class`, and `dotnet_method` set, or status is `n_a` with a reason
- Naming follows PascalCase and the namespace hierarchy described above
- No two features map to the same class + method combination (no collisions)
## Related Documentation
- [Phase 3: Library Mapping](phase-3-library-mapping.md) -- library mappings inform .NET class choices
- [Phase 5: Mapping Verification](phase-5-mapping-verification.md) -- next phase, validates all mappings
- [Phase 6: Porting](phase-6-porting.md) -- uses these mappings as the implementation blueprint


@@ -0,0 +1,193 @@
# Phase 5: Mapping Verification
Verify that every Go item in the porting database is either mapped to a .NET target or justified as N/A. This phase is a quality gate between design (Phase 4) and implementation (Phase 6).
## Objective
Confirm zero unmapped items, validate all N/A justifications, enforce naming conventions, and detect collisions. The porting database must be a complete, consistent blueprint before any code is written.
## Prerequisites
- Phase 4 complete: all items have .NET mappings or N/A status
- Verify with: `dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db`
## Steps
### Step 1: Confirm zero unmapped items
Run the summary report and verify that no items remain in `not_started` status without a .NET mapping:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
```
The output shows counts per status. All items should be in one of these categories:
- `not_started` with .NET mapping fields populated (ready for Phase 6)
- `n_a` with a reason in the notes field
If any items lack both a mapping and N/A status, go back to Phase 4 and address them.
### Step 2: Review N/A items
Every N/A item must have a justification. Review them by type:
```bash
# Review N/A modules
dotnet run --project tools/NatsNet.PortTracker -- module list --status n_a --db porting.db
# Review N/A features
dotnet run --project tools/NatsNet.PortTracker -- feature list --status n_a --db porting.db
# Review N/A tests
dotnet run --project tools/NatsNet.PortTracker -- test list --status n_a --db porting.db
```
For each N/A item, verify:
1. The reason is documented (check with `module show <id>`, `feature show <id>`, or `test show <id>`)
2. The reason is valid (the item genuinely has no .NET equivalent or is replaced by a .NET facility)
3. No dependent items rely on this N/A item being ported
```bash
# Check if anything depends on an N/A item
dotnet run --project tools/NatsNet.PortTracker -- dependency show module <id> --db porting.db
dotnet run --project tools/NatsNet.PortTracker -- dependency show feature <id> --db porting.db
```
If a non-N/A item depends on an N/A item, either the dependency needs to be resolved differently or the N/A classification is wrong.
### Step 3: Verify naming conventions
Walk through the mappings and check for naming compliance:
**PascalCase check**: All `dotnet_class` and `dotnet_method` values must use PascalCase. No `snake_case`, no `camelCase`.
```bash
# List all mapped modules and spot-check names
dotnet run --project tools/NatsNet.PortTracker -- module list --db porting.db
# List all mapped features for a module and check class/method names
dotnet run --project tools/NatsNet.PortTracker -- feature list --module <id> --db porting.db
```
**Namespace hierarchy check**: Namespaces must follow `NATS.Server.[Module]` pattern:
| Valid | Invalid |
|-------|---------|
| `NATS.Server.Protocol` | `Protocol` (missing root) |
| `NATS.Server.JetStream` | `NATS.Server.jetstream` (wrong case) |
| `NATS.Server.Subscriptions` | `NATSServer.Subscriptions` (wrong root) |
**Test naming check**: Test classes must end in `Tests`. Test methods must follow `[Method]_[Scenario]_[Expected]` pattern:
| Valid | Invalid |
|-------|---------|
| `NatsParserTests` | `ParserTest` (wrong suffix) |
| `TryParse_ValidInput_ReturnsTrue` | `TestParserValid` (Go-style naming) |
| `Match_WildcardSubject_ReturnsSubscribers` | `test_match` (snake_case) |
### Step 4: Check for collisions
No two features should map to the same class + method combination. This would cause compile errors or overwrite conflicts.
```bash
# Export the full mapping report for review
dotnet run --project tools/NatsNet.PortTracker -- report export --format md --output porting-mapping-report.md --db porting.db
```
Open `porting-mapping-report.md` and search for duplicate class + method pairs. If the database is large, run a targeted SQL query:
```bash
sqlite3 porting.db "
SELECT dotnet_class, dotnet_method, COUNT(*) as cnt
FROM features
WHERE dotnet_class IS NOT NULL AND dotnet_method IS NOT NULL
GROUP BY dotnet_class, dotnet_method
HAVING cnt > 1;
"
```
If collisions are found, rename one of the conflicting methods. Common resolution: add a more specific suffix (`ParseHeaders` vs `ParseBody` instead of two `Parse` methods).
### Step 5: Validate cross-references
Verify that test mappings reference the correct test project:
```bash
# All tests should target NATS.Server.Tests or NATS.Server.IntegrationTests
dotnet run --project tools/NatsNet.PortTracker -- test list --db porting.db
```
Check that:
- Unit tests point to `NATS.Server.Tests`
- Integration tests (if any) point to `NATS.Server.IntegrationTests`
- No tests accidentally point to `NATS.Server` (the library project)
### Step 6: Run phase check
Run the built-in phase verification:
```bash
dotnet run --project tools/NatsNet.PortTracker -- phase check 5 --db porting.db
```
This runs automated checks and reports any remaining issues. All checks must pass.
### Step 7: Export final mapping report
Generate the definitive mapping report that serves as the implementation reference for Phase 6:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report export \
--format md \
--output porting-mapping-report.md \
--db porting.db
```
Review the exported report for completeness. This document becomes the source of truth for the porting work.
## Troubleshooting
### Unmapped items found
```bash
# Find features with no .NET mapping and not N/A
dotnet run --project tools/NatsNet.PortTracker -- feature list --status not_started --db porting.db
```
For each unmapped item, either map it (Phase 4 Step 2) or set it to N/A with a reason.
### N/A item has dependents
If a non-N/A feature depends on an N/A feature:
1. Determine if the dependency is real or an artifact of the Go call graph
2. If real, the N/A classification is likely wrong -- map the item instead
3. If the dependency is Go-specific, remove or reclassify it
### Naming collision detected
Rename one of the colliding methods to be more specific:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature map <id> \
--method "ParseHeadersFromBuffer" \
--db porting.db
```
## Completion Criteria
- Zero items in `not_started` status without a .NET mapping
- All N/A items have a documented, valid reason
- All `dotnet_class` and `dotnet_method` values follow PascalCase
- All namespaces follow `NATS.Server.[Module]` hierarchy
- No two features map to the same class + method combination
- All tests target the correct test project
- `phase check 5` passes with no errors
- Mapping report exported and reviewed
## Related Documentation
- [Phase 4: .NET Solution Design](phase-4-dotnet-design.md) -- the mapping phase this verifies
- [Phase 6: Porting](phase-6-porting.md) -- uses the verified mappings for implementation


@@ -0,0 +1,252 @@
# Phase 6: Initial Porting
Port Go code to .NET 10 C#, working through the dependency graph bottom-up. This is the main implementation phase where the actual code is written.
## Objective
Implement every non-N/A module, feature, and test in the porting database. Work from leaf nodes (items with no unported dependencies) upward through the dependency graph. Keep the database current as work progresses.
## Prerequisites
- Phase 5 complete: all mappings verified, no collisions, naming validated
- .NET solution structure created:
- `src/NATS.Server/NATS.Server.csproj`
- `src/NATS.Server.Host/NATS.Server.Host.csproj`
- `tests/NATS.Server.Tests/NATS.Server.Tests.csproj`
- `tests/NATS.Server.IntegrationTests/NATS.Server.IntegrationTests.csproj`
- Library dependencies (NuGet packages) added per Phase 3 mappings
- Verify readiness: `dotnet run --project tools/NatsNet.PortTracker -- phase check 5 --db porting.db`
## Porting Workflow
This is the core loop. Repeat until all items are complete.
### Step 1: Find ready items
Query for items whose dependencies are all ported (status `complete`, `verified`, or `n_a`):
```bash
dotnet run --project tools/NatsNet.PortTracker -- dependency ready --db porting.db
```
This returns modules and features that have no unported dependencies. Start with these.
### Step 2: Pick an item and mark as stub
Choose an item from the ready list. Mark it as `stub` to signal work has begun:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature update <id> --status stub --db porting.db
```
### Step 3: Create the skeleton
In the .NET project, create the class and method skeleton based on the mapping:
1. Look up the mapping: `dotnet run --project tools/NatsNet.PortTracker -- feature show <id> --db porting.db`
2. Create the file at the correct path under `src/NATS.Server/` following the namespace hierarchy
3. Add the class declaration, method signature, and a `throw new NotImplementedException()` body
For batch scaffolding of an entire module:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature update 0 --status stub \
--all-in-module <module_id> --db porting.db
```
### Step 4: Implement the logic
Reference the Go source code. The database stores the Go file path and line number for each feature:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature show <id> --db porting.db
```
The output includes `Go File`, `Go Line`, and `Go LOC` fields. Open the Go source at those coordinates and translate the logic to C#.
Key translation patterns:
| Go pattern | .NET equivalent |
|-----------|-----------------|
| `goroutine` + `channel` | `Task` + `Channel<T>` or `async/await` |
| `sync.Mutex` | `lock` statement or `SemaphoreSlim` |
| `sync.RWMutex` | `ReaderWriterLockSlim` |
| `sync.WaitGroup` | `Task.WhenAll` or `CountdownEvent` |
| `defer` | `try/finally` or `using`/`IDisposable` |
| `interface{}` / `any` | `object` or generics |
| `[]byte` | `byte[]`, `ReadOnlySpan<byte>`, or `ReadOnlyMemory<byte>` |
| `map[K]V` | `Dictionary<K,V>` or `ConcurrentDictionary<K,V>` |
| `error` return | Exceptions or `Result<T>` pattern |
| `panic/recover` | Exceptions (avoid `Environment.FailFast` for recoverable cases) |
| `select` on channels | `Task.WhenAny` or `Channel<T>` reader patterns |
| `context.Context` | `CancellationToken` |
| `io.Reader/Writer` | `Stream`, `PipeReader/PipeWriter` |
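The first row, for example, might translate like this (a sketch with the message type simplified to `string`):

```csharp
using System;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;

// Go:  ch := make(chan string, 64); go worker(ctx, ch)
// .NET: a bounded Channel<T> plus a Task, with context.Context -> CancellationToken.
var channel = Channel.CreateBounded<string>(capacity: 64);

async Task WorkerAsync(ChannelReader<string> reader, CancellationToken ct)
{
    // Go's `for msg := range ch` becomes ReadAllAsync;
    // cancellation via the token replaces ctx.Done().
    await foreach (var msg in reader.ReadAllAsync(ct))
    {
        Console.WriteLine(msg);
    }
}

using var cts = new CancellationTokenSource();
var worker = WorkerAsync(channel.Reader, cts.Token);

await channel.Writer.WriteAsync("hello", cts.Token);
channel.Writer.Complete();   // Go: close(ch)
await worker;                // Go: sync.WaitGroup.Wait()
```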
### Step 5: Mark complete
Once the implementation compiles and the basic logic is in place:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature update <id> --status complete --db porting.db
```
### Step 6: Run targeted tests
If tests exist for this feature, run them:
```bash
dotnet test --filter "FullyQualifiedName~NATS.Server.Tests.Protocol" \
tests/NATS.Server.Tests/
```
Fix any failures before moving on.
### Step 7: Check what is now unblocked
Completing items may unblock others that depend on them:
```bash
dotnet run --project tools/NatsNet.PortTracker -- dependency ready --db porting.db
```
Return to Step 2 with the newly available items.
## DB Update Discipline
The porting database must stay current. Update status at every transition:
```bash
# Starting work on a feature
dotnet run --project tools/NatsNet.PortTracker -- feature update 42 --status stub --db porting.db
# Feature implemented
dotnet run --project tools/NatsNet.PortTracker -- feature update 42 --status complete --db porting.db
# Batch scaffolding for all features in a module
dotnet run --project tools/NatsNet.PortTracker -- feature update 0 --status stub \
--all-in-module 3 --db porting.db
# Module fully ported (all its features are complete)
dotnet run --project tools/NatsNet.PortTracker -- module update 3 --status complete --db porting.db
# Check progress
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
```
Status transitions follow this progression:
```
not_started -> stub -> complete -> verified (Phase 7)
\-> n_a (if determined during porting)
```
Never skip `stub` -- it signals that work is in progress and prevents duplicate effort.
## Porting Order Strategy
### Start with leaf modules
Leaf modules have no dependencies on other unported modules. They are safe to port first because nothing they call is missing.
```bash
# These are the leaves — port them first
dotnet run --project tools/NatsNet.PortTracker -- dependency ready --db porting.db
```
Typical leaf modules include:
- Utility/helper code (string manipulation, byte buffer pools)
- Constants and enums
- Configuration types (options, settings)
- Error types and codes
### Then work upward
After leaves are done, modules that depended only on those leaves become ready. Continue up the dependency graph:
```
Leaf utilities -> Protocol types -> Parser -> Connection handler -> Server
```
### Port tests alongside features
When porting a feature, also port its associated tests in the same pass. This provides immediate validation:
```bash
# List tests for a feature
dotnet run --project tools/NatsNet.PortTracker -- test list --module <module_id> --db porting.db
# After porting a test
dotnet run --project tools/NatsNet.PortTracker -- test update <id> --status complete --db porting.db
```
## Progress Tracking
Check overall progress regularly:
```bash
# Summary stats
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
# What is still blocked
dotnet run --project tools/NatsNet.PortTracker -- dependency blocked --db porting.db
# Phase-level check
dotnet run --project tools/NatsNet.PortTracker -- phase check 6 --db porting.db
```
## Handling Discoveries During Porting
During implementation, you may find:
### Items that should be N/A
If a feature turns out to be unnecessary in .NET (discovered during implementation):
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature set-na <id> \
--reason "Go-specific memory management, handled by .NET GC" --db porting.db
```
### Missing dependencies
If the Go analyzer missed a dependency:
```bash
# The dependency is tracked in the DB via the dependencies table
# For now, just ensure the target is ported before continuing
dotnet run --project tools/NatsNet.PortTracker -- dependency show feature <id> --db porting.db
```
### Design changes
If the .NET design needs to differ from the original mapping (e.g., splitting a large Go function into multiple .NET methods), update the mapping:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature map <id> \
--class "NewClassName" \
--method "NewMethodName" \
--db porting.db
```
## Tips
1. **Keep the build green.** The solution should compile after each feature is completed. Do not leave unresolved references.
2. **Write idiomatic C#.** Do not transliterate Go line-by-line. Use .NET patterns (async/await, LINQ, Span, dependency injection) where they produce cleaner code.
3. **Use `CancellationToken` everywhere.** The Go code uses `context.Context` pervasively -- mirror this with `CancellationToken` parameters.
4. **Prefer `ReadOnlySpan<byte>` for hot paths.** The NATS parser processes bytes at high throughput. Use spans and avoid allocations in the critical path.
5. **Do not port Go comments verbatim.** Translate the intent into C# XML doc comments where appropriate.
6. **Run `dotnet build` frequently.** Catch compile errors early rather than accumulating them.
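Tip 4 in miniature — a hypothetical helper, not the actual parser:

```csharp
using System;
using System.Text;

// Allocation-free check for a protocol verb at the start of a buffer.
static bool StartsWithVerb(ReadOnlySpan<byte> buffer, ReadOnlySpan<byte> verb)
    => buffer.Length >= verb.Length && buffer[..verb.Length].SequenceEqual(verb);

byte[] received = Encoding.UTF8.GetBytes("PING\r\nextra");
// "PING\r\n"u8 is a compile-time UTF-8 literal: no per-call allocation.
Console.WriteLine(StartsWithVerb(received, "PING\r\n"u8));   // True
```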
## Completion Criteria
- All non-N/A modules have status `complete` or better
- All non-N/A features have status `complete` or better
- All non-N/A tests have status `complete` or better
- The solution compiles without errors: `dotnet build`
- `dependency blocked` returns no items (or only items waiting for Phase 7 verification)
- `report summary` shows the expected completion counts
## Related Documentation
- [Phase 5: Mapping Verification](phase-5-mapping-verification.md) -- the verified mappings this phase implements
- [Phase 7: Porting Verification](phase-7-porting-verification.md) -- targeted testing of the ported code


@@ -0,0 +1,233 @@
# Phase 7: Porting Verification
Verify all ported code through targeted testing per module. This phase does NOT run the full test suite as a single pass -- it systematically verifies each module, marks items as verified, and confirms behavioral equivalence with the Go server.
## Objective
Every ported module passes its targeted tests. Every item in the database reaches `verified` or `n_a` status. Cross-module integration tests pass. Key behavioral scenarios produce equivalent results between the Go and .NET servers.
## Prerequisites
- Phase 6 complete: all non-N/A items at `complete` or better
- All tests ported and compilable
- Verify readiness: `dotnet run --project tools/NatsNet.PortTracker -- phase check 6 --db porting.db`
## Verification Workflow
Work through modules one at a time. Do not move to the next module until the current one is fully verified.
### Step 1: List modules to verify
```bash
# Show all modules — look for status 'complete' (not yet verified)
dotnet run --project tools/NatsNet.PortTracker -- module list --db porting.db
```
Start with leaf modules (those with the fewest dependencies) and work upward, same order as the porting phase.
### Step 2: List tests for the module
For each module, identify all mapped tests:
```bash
dotnet run --project tools/NatsNet.PortTracker -- test list --module <module_id> --db porting.db
```
This shows every test associated with the module, its status, and its .NET method name.
### Step 3: Run targeted tests
Run only the tests for this module using `dotnet test --filter`:
```bash
# Filter by namespace (matches all tests in the module's namespace)
dotnet test --filter "FullyQualifiedName~NATS.Server.Tests.Protocol" \
tests/NATS.Server.Tests/
# Filter by test class
dotnet test --filter "FullyQualifiedName~NATS.Server.Tests.Protocol.NatsParserTests" \
tests/NATS.Server.Tests/
# Filter by specific test method
dotnet test --filter "FullyQualifiedName~NATS.Server.Tests.Protocol.NatsParserTests.TryParse_ValidInput_ReturnsTrue" \
tests/NATS.Server.Tests/
```
The `--filter` flag uses partial matching on the fully qualified test name. Use the namespace pattern for module-wide runs, and the class or method pattern for debugging specific failures.
### Step 4: Handle failures
When tests fail:
1. **Read the failure output.** The test runner prints the assertion that failed, the expected vs actual values, and the stack trace.
2. **Locate the Go reference.** Look up the test in the database to find the original Go test and source:
```bash
dotnet run --project tools/NatsNet.PortTracker -- test show <test_id> --db porting.db
```
3. **Compare Go and .NET logic.** Open the Go source at the stored line number. Check for translation errors: off-by-one, missing edge cases, different default values.
4. **Fix and re-run.** After fixing, re-run only the failing test:
```bash
dotnet test --filter "FullyQualifiedName~NATS.Server.Tests.Protocol.NatsParserTests.TryParse_EmptyInput_ReturnsFalse" \
tests/NATS.Server.Tests/
```
5. **Then re-run the full module.** Confirm no regressions:
```bash
dotnet test --filter "FullyQualifiedName~NATS.Server.Tests.Protocol" \
tests/NATS.Server.Tests/
```
Common failure causes:
| Symptom | Likely cause |
|---------|-------------|
| Off-by-one in buffer parsing | Go slicing `s[start:end]` is half-open; C# `Span.Slice(start, length)` takes a length, not an end index |
| Timeout in async test | Missing `CancellationToken`, or `Task` not awaited |
| Wrong byte sequence | Go uses `[]byte("string")` which is UTF-8; ensure C# uses `Encoding.UTF8` |
| Nil vs null behavior | Go nil checks behave differently from C# null; check for `default` values |
| Map iteration order | Go maps iterate in random order; if the test depends on order, sort first |
### Step 5: Mark module as verified
Once all tests pass for a module:
```bash
# Mark the module itself as verified
dotnet run --project tools/NatsNet.PortTracker -- module update <module_id> --status verified --db porting.db
# Mark all features in the module as verified
dotnet run --project tools/NatsNet.PortTracker -- feature update 0 --status verified \
--all-in-module <module_id> --db porting.db
# Mark individual tests as verified
dotnet run --project tools/NatsNet.PortTracker -- test update <test_id> --status verified --db porting.db
```
### Step 6: Move to next module
Repeat Steps 2-5 for each module. Track progress:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
```
## Integration Testing
After all modules are individually verified, run integration tests that exercise cross-module behavior.
### Step 7: Run integration tests
```bash
dotnet test tests/NATS.Server.IntegrationTests/
```
Integration tests cover scenarios like:
- Client connects, subscribes, receives published messages
- Multiple clients with wildcard subscriptions
- Connection lifecycle (connect, disconnect, reconnect)
- Protocol error handling (malformed commands, oversized payloads)
- Configuration loading and server startup
Fix any failures by tracing through the modules involved and checking the interaction boundaries.
### Step 8: Behavioral comparison
Run both the Go server and the .NET server with the same workload and compare behavior. This catches semantic differences that unit tests might miss.
**Setup:**
1. Start the Go server:
```bash
cd golang/nats-server && go run . -p 4222
```
2. Start the .NET server:
```bash
dotnet run --project src/NATS.Server.Host -- --port 4223
```
**Comparison scenarios:**
| Scenario | What to compare |
|----------|----------------|
| Basic pub/sub | Publish a message, verify subscriber receives identical payload |
| Wildcard matching | Subscribe with `foo.*` and `foo.>`, publish to `foo.bar`, verify same match results |
| Queue groups | Multiple subscribers in a queue group, verify round-robin distribution |
| Protocol errors | Send malformed commands, verify same error responses |
| Connection info | Connect and check `INFO` response fields |
| Graceful shutdown | Send SIGTERM, verify clean disconnection |
Use the `nats` CLI tool to drive traffic:
```bash
# Subscribe on Go server
nats sub -s nats://localhost:4222 "test.>"
# Subscribe on .NET server
nats sub -s nats://localhost:4223 "test.>"
# Publish to both and compare
nats pub -s nats://localhost:4222 test.hello "payload"
nats pub -s nats://localhost:4223 test.hello "payload"
```
Document any behavioral differences. Some differences are expected (e.g., server name, version string) while others indicate bugs.
### Step 9: Final verification
Run the complete check:
```bash
# Phase 7 check — all tests verified
dotnet run --project tools/NatsNet.PortTracker -- phase check 7 --db porting.db
# Final summary — all items should be verified or n_a
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
# Export final report
dotnet run --project tools/NatsNet.PortTracker -- report export \
--format md \
--output porting-final-report.md \
--db porting.db
```
## Troubleshooting
### Test passes individually but fails in module run
Likely a test ordering dependency or shared state. Check for:
- Static mutable state not reset between tests
- Port conflicts if tests start servers
- File system artifacts from previous test runs
Fix by adding proper test cleanup (`IDisposable`, `IAsyncLifetime`) and using unique ports/paths per test.
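A fixture sketch using xUnit's `IAsyncLifetime` (`TestServer` is a hypothetical helper standing in for whatever harness the suite uses):

```csharp
using System.Threading.Tasks;
using Xunit;

// Per-test server fixture: fresh instance each test, deterministic teardown.
public class ServerLifecycleTests : IAsyncLifetime
{
    private TestServer? _server;

    public async Task InitializeAsync()
    {
        // port: 0 asks the OS for an ephemeral port, so parallel tests don't collide.
        _server = await TestServer.StartAsync(port: 0);
    }

    public async Task DisposeAsync()
    {
        if (_server is not null)
            await _server.StopAsync();   // no shared state leaks into the next test
    }

    [Fact]
    public void Server_AfterStart_IsRunning() => Assert.True(_server!.IsRunning);
}
```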
### Module passes but integration test fails
The issue is at a module boundary. Check:
- Interface implementations match expectations
- Serialization/deserialization is consistent across modules
- Thread safety at module interaction points
- Async patterns are correct (no fire-and-forget `Task` without error handling)
### Behavioral difference with Go server
1. Identify the specific protocol message or state that differs
2. Trace through both implementations step by step
3. Check the NATS protocol specification for the correct behavior
4. Fix the .NET implementation to match (the Go server is the reference)
## Completion Criteria
- All non-N/A modules have status `verified`
- All non-N/A features have status `verified`
- All non-N/A tests have status `verified`
- All targeted tests pass: `dotnet test tests/NATS.Server.Tests/`
- All integration tests pass: `dotnet test tests/NATS.Server.IntegrationTests/`
- Key behavioral scenarios produce equivalent results on Go and .NET servers
- `phase check 7` passes with no errors
- Final report exported and reviewed
## Related Documentation
- [Phase 6: Porting](phase-6-porting.md) -- the implementation phase this verifies
- [Phase 4: .NET Solution Design](phase-4-dotnet-design.md) -- the original design mappings