10 Commits

Author SHA1 Message Date
Joseph Doherty
8d68f63e6c chore: update project structure, naming, and add reporting infrastructure
- Update all 7 phase docs with source/target location references
  (golang/ for Go source, dotnet/ for .NET version)
- Rename NATS.Server to ZB.MOM.NatsNet.Server in phase 4-7 docs
- Update solution layout to dotnet/src/ and dotnet/tests/ structure
- Create CLAUDE.md with project summary and phase links
- Update .gitignore: track porting.db, add standard .NET patterns
- Add golang/nats-server as git submodule
- Add reports/generate-report.sh and pre-commit hook
- Add documentation_rules.md to version control
2026-02-26 06:38:56 -05:00
Joseph Doherty
ca6ed0f09f docs: add phase 4-7 instruction guides 2026-02-26 06:23:13 -05:00
Joseph Doherty
1bc64cf36e docs: add phase 1-3 instruction guides 2026-02-26 06:22:21 -05:00
Joseph Doherty
cecbb49653 feat(porttracker): add all remaining commands (feature, test, library, dependency, report, phase) 2026-02-26 06:17:43 -05:00
Joseph Doherty
c31bf6050d feat(go-analyzer): add SQLite writer, complete analyzer pipeline
Add sqlite.go with DBWriter that writes analysis results (modules,
features, tests, dependencies, library mappings) to the porting
database. Successfully analyzes nats-server: 12 modules, 3673
features, 3257 tests, 36 library mappings, 11 dependencies.
2026-02-26 06:15:01 -05:00
Joseph Doherty
6f5a063307 feat(porttracker): add module commands (list, show, update, map, set-na) 2026-02-26 06:12:40 -05:00
Joseph Doherty
864749f681 feat(go-analyzer): add file-to-module grouping logic 2026-02-26 06:11:09 -05:00
Joseph Doherty
f0f5d6d6b3 feat(go-analyzer): add AST parsing and analysis engine 2026-02-26 06:11:06 -05:00
Joseph Doherty
9fe6a8ee36 feat(porttracker): add DB access layer and init command
Add Database.cs with SQLite connection management and helper methods
(Execute, ExecuteScalar, Query), Schema.cs for schema initialization,
and replace default Program.cs with System.CommandLine v3 CLI featuring
global --db/--schema options and an init command.
2026-02-26 06:08:27 -05:00
Joseph Doherty
3b43922f5c feat(go-analyzer): add data model types 2026-02-26 06:06:30 -05:00
30 changed files with 4152 additions and 6 deletions

.gitignore (vendored, 32 lines changed)

@@ -1,12 +1,29 @@
# SQLite database (local state)
porting.db
# SQLite transient files (WAL mode)
porting.db-journal
porting.db-wal
porting.db-shm
# .NET build output
tools/NatsNet.PortTracker/bin/
tools/NatsNet.PortTracker/obj/
**/bin/
**/obj/
# .NET user / IDE integration files
*.user
*.suo
*.userosscache
*.sln.docstates
# Visual Studio cache/options directory
.vs/
# NuGet
*.nupkg
*.snupkg
project.lock.json
project.fragment.lock.json
*.nuget.props
*.nuget.targets
packages/
# Go build output
tools/go-analyzer/go-analyzer
@@ -14,3 +31,10 @@ tools/go-analyzer/go-analyzer
# OS files
.DS_Store
Thumbs.db
# IDE files
.idea/
.vscode/*.code-workspace
.vscode/settings.json
.vscode/tasks.json
.vscode/launch.json

.gitmodules (vendored, new file, 3 lines)

@@ -0,0 +1,3 @@
[submodule "golang/nats-server"]
path = golang/nats-server
url = https://github.com/nats-io/nats-server.git

CLAUDE.md (new file, 61 lines)

@@ -0,0 +1,61 @@
# CLAUDE.md
## Project Summary
This project is porting the NATS server from Go to .NET 10 C#. The Go source (~130K LOC across 109 non-test files, 85 test files) is at `golang/nats-server/`. The porting process is tracked via an SQLite database (`porting.db`) and managed by two tools: a Go AST analyzer and a .NET PortTracker CLI.
## Folder Layout
```
natsnet/
├── golang/nats-server/                          # Go source (reference)
├── dotnet/                                      # .NET ported version
│   ├── src/
│   │   ├── ZB.MOM.NatsNet.Server/               # Main server library
│   │   └── ZB.MOM.NatsNet.Server.Host/          # Host/entry point
│   └── tests/
│       ├── ZB.MOM.NatsNet.Server.Tests/         # Unit tests
│       └── ZB.MOM.NatsNet.Server.IntegrationTests/  # Integration tests
├── tools/
│   ├── go-analyzer/                             # Go AST analyzer (Phases 1-2)
│   └── NatsNet.PortTracker/                     # .NET CLI tool (all phases)
├── docs/plans/phases/                           # Phase instruction guides
├── reports/                                     # Generated porting reports
├── porting.db                                   # SQLite tracking database
├── porting-schema.sql                           # Database schema
└── documentation_rules.md                       # Documentation conventions
```
## Tools
### Go AST Analyzer
```bash
cd tools/go-analyzer && CGO_ENABLED=1 go build -o go-analyzer . && cd ../.. && \
  ./tools/go-analyzer/go-analyzer --source golang/nats-server --db porting.db --schema porting-schema.sql
```
### .NET PortTracker CLI
```bash
dotnet run --project tools/NatsNet.PortTracker -- <command> --db porting.db
```
## Phase Instructions
- **Phase 1: Go Codebase Decomposition** - `docs/plans/phases/phase-1-decomposition.md`
- **Phase 2: Verification of Captured Items** - `docs/plans/phases/phase-2-verification.md`
- **Phase 3: Library Mapping** - `docs/plans/phases/phase-3-library-mapping.md`
- **Phase 4: .NET Solution Design** - `docs/plans/phases/phase-4-dotnet-design.md`
- **Phase 5: Mapping Verification** - `docs/plans/phases/phase-5-mapping-verification.md`
- **Phase 6: Initial Porting** - `docs/plans/phases/phase-6-porting.md`
- **Phase 7: Porting Verification** - `docs/plans/phases/phase-7-porting-verification.md`
## Naming Conventions
.NET projects use the `ZB.MOM.NatsNet.Server` prefix. Namespaces follow the `ZB.MOM.NatsNet.Server.[Module]` pattern.
## Reports
- `reports/current.md` always has the latest porting status.
- `reports/report_{commit_id}.md` snapshots are generated on each commit via pre-commit hook.
- Run `./reports/generate-report.sh` manually to regenerate.

docs/plans/phases/phase-1-decomposition.md (new file, 298 lines)

@@ -0,0 +1,298 @@
# Phase 1: Go Codebase Decomposition
## Objective
Parse the Go NATS server source code into a structured SQLite database, extracting
modules, features (functions/methods), unit tests, external library imports, and
inter-module dependencies. This database becomes the single source of truth that
drives all subsequent porting phases.
## Prerequisites
| Requirement | Version / Notes |
|---|---|
| Go | 1.25+ (required by `tools/go-analyzer/go.mod`) |
| .NET SDK | 10.0+ |
| SQLite3 CLI | 3.x (optional, for manual inspection) |
| CGO | Must be enabled (`CGO_ENABLED=1`); the Go analyzer uses `github.com/mattn/go-sqlite3` |
| Go source | Cloned at `golang/nats-server/` relative to the repo root |
Verify prerequisites before starting:
```bash
go version # should print go1.25 or later
dotnet --version # should print 10.x
sqlite3 --version # optional, any 3.x
echo $CGO_ENABLED # should print 1 (or be unset; we set it explicitly below)
```
## Source and Target Locations
| Component | Path |
|---|---|
| Go source code | `golang/` (specifically `golang/nats-server/`) |
| .NET ported version | `dotnet/` |
## Steps
### Step 1: Initialize the porting database
Create a fresh SQLite database with the porting tracker schema. This creates
`porting.db` in the repository root with all tables, indexes, and triggers.
```bash
dotnet run --project tools/NatsNet.PortTracker -- init --db porting.db --schema porting-schema.sql
```
Expected output:
```
Database initialized at porting.db
```
If the database already exists, this command is idempotent -- it applies
`CREATE TABLE IF NOT EXISTS` statements and will not destroy existing data.
### Step 2: Build the Go analyzer
The Go analyzer is a standalone tool that uses `go/ast` to parse Go source files
and writes results directly into the SQLite database.
```bash
cd tools/go-analyzer && CGO_ENABLED=1 go build -o go-analyzer . && cd ../..
```
This produces the binary `tools/go-analyzer/go-analyzer`. If the build fails,
see the Troubleshooting section below.
### Step 3: Run the Go analyzer
Point the analyzer at the NATS server source and the porting database:
```bash
./tools/go-analyzer/go-analyzer \
--source golang/nats-server \
--db porting.db \
--schema porting-schema.sql
```
Expected output (counts will vary with the NATS server version):
```
Analysis complete:
Modules: <N>
Features: <N>
Unit Tests: <N>
Dependencies: <N>
Imports: <N>
```
The analyzer does the following:
1. Walks `golang/nats-server/server/` for all `.go` files (skipping `configs/`
and `testdata/` directories).
2. Groups files into logical modules by directory.
3. Parses each non-test file, extracting every `func` and method as a feature.
4. Parses each `_test.go` file, extracting `Test*` and `Benchmark*` functions.
5. Infers module-level dependencies from cross-package imports.
6. Collects all import paths and classifies them as stdlib or external.
7. Writes modules, features, unit_tests, dependencies, and library_mappings to
the database.
### Step 4: Review module groupings
List all modules that were created to confirm the grouping makes sense:
```bash
dotnet run --project tools/NatsNet.PortTracker -- module list --db porting.db
```
This shows every module with its ID, name, status, Go package, and line count.
Verify that the major areas of the NATS server are represented (e.g., core,
jetstream, client, auth, protocol, route, gateway, leafnode, mqtt, websocket,
monitoring, logging, errors, subscriptions, tls, events, raft, config, accounts).
### Step 5: Spot-check individual modules
Inspect a few modules to verify that features and tests were correctly extracted:
```bash
# Check the first module
dotnet run --project tools/NatsNet.PortTracker -- module show 1 --db porting.db
# Check a few more
dotnet run --project tools/NatsNet.PortTracker -- module show 2 --db porting.db
dotnet run --project tools/NatsNet.PortTracker -- module show 3 --db porting.db
```
For each module, confirm that:
- Features list functions and methods from the corresponding Go files.
- Each feature has a `go_file`, `go_method`, and `go_line_number`.
- Tests are listed under the module and have `go_file` and `go_method` populated.
- Dependencies point to other valid modules.
### Step 6: Verify test extraction
List all extracted tests to confirm test files were parsed:
```bash
dotnet run --project tools/NatsNet.PortTracker -- test list --db porting.db
```
Spot-check a few individual tests for detail:
```bash
dotnet run --project tools/NatsNet.PortTracker -- test show 1 --db porting.db
```
Verify that each test has a module assignment and that the `feature_id` link is
populated where the analyzer could infer the connection from naming conventions
(e.g., `TestConnect` links to a feature named `Connect`).
### Step 7: Review inter-module dependencies
Check the dependency graph for a representative module:
```bash
dotnet run --project tools/NatsNet.PortTracker -- dependency show module 1 --db porting.db
```
Also view the full blocked-items report to see the overall dependency shape:
```bash
dotnet run --project tools/NatsNet.PortTracker -- dependency blocked --db porting.db
```
And check which items have no unported dependencies and are ready to start:
```bash
dotnet run --project tools/NatsNet.PortTracker -- dependency ready --db porting.db
```
### Step 8: Review extracted library mappings
List all external Go libraries that were detected:
```bash
dotnet run --project tools/NatsNet.PortTracker -- library list --db porting.db
```
All entries should have status `not_mapped` at this point. Library mapping is
handled in Phase 3.
### Step 9: Generate a baseline summary report
Create a summary snapshot to use as a reference for Phase 2 verification:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
```
Optionally export a full markdown report for archival:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report export --format md --output docs/reports/phase-1-baseline.md --db porting.db
```
## Completion Criteria
Phase 1 is complete when ALL of the following are true:
- [ ] `porting.db` exists and contains data in all five tables (modules, features,
unit_tests, dependencies, library_mappings).
- [ ] All Go source files under `golang/nats-server/server/` are accounted for
(no files silently skipped without a logged warning).
- [ ] All public and private functions/methods are extracted as features.
- [ ] All `Test*` and `Benchmark*` functions are extracted as unit_tests.
- [ ] Test-to-feature links are populated where naming conventions allow inference.
- [ ] Module-level dependencies are recorded in the dependencies table.
- [ ] External import paths are recorded in the library_mappings table.
- [ ] `phase check 1` shows every analysis item marked `[x]` (only the
  "All libraries mapped" item remains unchecked):
```bash
dotnet run --project tools/NatsNet.PortTracker -- phase check 1 --db porting.db
```
Expected:
```
Phase 1: Analysis & Schema
Run Go AST analyzer, populate DB schema, map libraries
Phase 1 Checklist:
[x] Modules populated: <N>
[x] Features populated: <N>
[x] Unit tests populated: <N>
[x] Dependencies mapped: <N>
[x] Libraries identified: <N>
[ ] All libraries mapped: 0/<N>
```
Note: The "All libraries mapped" item will be unchecked -- that is expected.
Library mapping is the concern of Phase 3.
## Troubleshooting
### CGO_ENABLED not set or build fails with "gcc not found"
The `go-sqlite3` driver requires C compilation. Make sure you have a C compiler
installed and CGO is enabled:
```bash
# macOS -- Xcode command line tools
xcode-select --install
# Debian/Ubuntu -- C toolchain
sudo apt-get install -y build-essential
# Then build with explicit CGO (from tools/go-analyzer/)
CGO_ENABLED=1 go build -o go-analyzer .
```
### "cannot find package" errors during Go build
The analyzer depends on `github.com/mattn/go-sqlite3`. Run:
```bash
cd tools/go-analyzer
go mod download
go mod verify
```
### Wrong source path
The `--source` flag must point to the root of the cloned nats-server repository
(the directory that contains the `server/` subdirectory). If you see
"discovering files: no such file or directory", verify:
```bash
ls golang/nats-server/server/
```
### Database locked errors
If you run the analyzer while another process has `porting.db` open, SQLite may
report a lock error. Close any other connections (including `sqlite3` CLI
sessions) and retry. The schema enables WAL mode to reduce lock contention:
```sql
PRAGMA journal_mode=WAL;
```
### Analyzer prints warnings but continues
Warnings like "Warning: skipping server/foo.go: <parse error>" mean an individual
file could not be parsed. The analyzer continues with remaining files. Investigate
any warnings -- they may indicate a Go version mismatch or syntax not yet
supported by the `go/ast` parser at your Go version.
### Empty database after analyzer runs
If the analyzer prints zeros for all counts, verify that:
1. The `--source` path is correct and contains Go files.
2. The `--schema` path points to a valid `porting-schema.sql`.
3. The `--db` path is writable.
You can inspect the database directly:
```bash
sqlite3 porting.db "SELECT COUNT(*) FROM modules;"
sqlite3 porting.db "SELECT COUNT(*) FROM features;"
sqlite3 porting.db "SELECT COUNT(*) FROM unit_tests;"
```

docs/plans/phases/phase-2-verification.md (new file, 322 lines)

@@ -0,0 +1,322 @@
# Phase 2: Verification of Captured Items
## Objective
Verify that the Phase 1 decomposition captured every Go source file, function,
test, and dependency accurately. Compare database counts against independent
baselines derived directly from the filesystem. Identify and fix any gaps before
proceeding to library mapping and porting.
## Prerequisites
- Phase 1 is complete (`porting.db` is populated).
- The Go source at `golang/nats-server/` has not changed since the Phase 1
analyzer run. If the source was updated, re-run the Phase 1 analyzer first.
- `dotnet`, `sqlite3`, `find`, `grep`, and `wc` are available on your PATH.
## Source and Target Locations
| Component | Path |
|---|---|
| Go source code | `golang/` (specifically `golang/nats-server/`) |
| .NET ported version | `dotnet/` |
## Steps
### Step 1: Generate the summary report
Start with a high-level view of what the database contains:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
```
Record the counts for modules, features, unit tests, and library mappings. These
are the numbers you will verify in subsequent steps.
### Step 2: Count Go source files on disk
Count non-test `.go` files under the server directory (the scope of the analyzer):
```bash
find golang/nats-server/server -name "*.go" ! -name "*_test.go" ! -path "*/configs/*" ! -path "*/testdata/*" | wc -l
```
This should produce approximately 109 files. Compare this count against the
number of distinct `go_file` values in the features table:
```bash
sqlite3 porting.db "SELECT COUNT(DISTINCT go_file) FROM features;"
```
If the database count is lower, some source files may have been skipped. Check
the analyzer stderr output for warnings, or list the missing files:
```bash
sqlite3 porting.db "SELECT DISTINCT go_file FROM features ORDER BY go_file;" > /tmp/db_files.txt
find golang/nats-server/server -name "*.go" ! -name "*_test.go" ! -path "*/configs/*" ! -path "*/testdata/*" -exec realpath --relative-to=golang/nats-server {} \; | sort > /tmp/disk_files.txt
diff /tmp/db_files.txt /tmp/disk_files.txt
```
(The `--relative-to` option requires GNU `realpath`; on macOS, install coreutils and use `grealpath`.)
### Step 3: Count Go test files on disk
```bash
find golang/nats-server/server -name "*_test.go" ! -path "*/configs/*" ! -path "*/testdata/*" | wc -l
```
This should produce approximately 85 files. Compare against distinct test files
in the database:
```bash
sqlite3 porting.db "SELECT COUNT(DISTINCT go_file) FROM unit_tests;"
```
### Step 4: Compare function counts
Count all exported and unexported functions in source files on disk:
```bash
grep -r "^func " golang/nats-server/server/ --include="*.go" --exclude="*_test.go" | grep -v "/configs/" | grep -v "/testdata/" | wc -l
```
Compare against the features count from the database:
```bash
sqlite3 porting.db "SELECT COUNT(*) FROM features;"
```
The numbers should be close but rarely identical. Small discrepancies can occur because:
- `grep` matches any line beginning with `func `, so it can overcount by picking
  up `func` lines inside raw string literals or commented-out code.
- The analyzer may have skipped files it could not parse (check its warnings),
  which lowers the database count.
The AST parser used by the analyzer is the more accurate of the two; it finds
every genuine `func` declaration regardless of formatting.
If the database count is significantly lower (more than 5% off), investigate.
### Step 5: Compare test function counts
Count test functions on disk:
```bash
grep -r "^func Test" golang/nats-server/server/ --include="*_test.go" | wc -l
```
Also count benchmarks:
```bash
grep -r "^func Benchmark" golang/nats-server/server/ --include="*_test.go" | wc -l
```
Compare the combined total against the unit_tests table:
```bash
sqlite3 porting.db "SELECT COUNT(*) FROM unit_tests;"
```
### Step 6: Run the phase check command
The PortTracker has a built-in Phase 1 checklist that verifies all tables are
populated:
```bash
dotnet run --project tools/NatsNet.PortTracker -- phase check 1 --db porting.db
```
All items except "All libraries mapped" should show `[x]`.
### Step 7: Check for orphaned items
Look for features that are not linked to any module (should be zero):
```bash
sqlite3 porting.db "SELECT COUNT(*) FROM features WHERE module_id NOT IN (SELECT id FROM modules);"
```
Look for tests that are not linked to any module (should be zero):
```bash
sqlite3 porting.db "SELECT COUNT(*) FROM unit_tests WHERE module_id NOT IN (SELECT id FROM modules);"
```
Look for test-to-feature links that point to non-existent features:
```bash
sqlite3 porting.db "SELECT COUNT(*) FROM unit_tests WHERE feature_id IS NOT NULL AND feature_id NOT IN (SELECT id FROM features);"
```
Look for dependencies that reference non-existent source or target items:
```bash
sqlite3 porting.db "
SELECT COUNT(*) FROM dependencies
WHERE (source_type = 'module' AND source_id NOT IN (SELECT id FROM modules))
OR (target_type = 'module' AND target_id NOT IN (SELECT id FROM modules))
OR (source_type = 'feature' AND source_id NOT IN (SELECT id FROM features))
OR (target_type = 'feature' AND target_id NOT IN (SELECT id FROM features))
OR (source_type = 'unit_test' AND source_id NOT IN (SELECT id FROM unit_tests))
OR (target_type = 'unit_test' AND target_id NOT IN (SELECT id FROM unit_tests));
"
```
All of these queries should return 0.
### Step 8: Review the largest modules
The largest modules are the most likely to have issues. List modules sorted by
feature count:
```bash
sqlite3 porting.db "
SELECT m.id, m.name, m.go_line_count,
COUNT(f.id) as feature_count
FROM modules m
LEFT JOIN features f ON f.module_id = m.id
GROUP BY m.id
ORDER BY feature_count DESC
LIMIT 10;
"
```
For each of the top 3 modules, do a manual spot-check:
```bash
dotnet run --project tools/NatsNet.PortTracker -- module show <id> --db porting.db
```
Scroll through the features list and verify that the functions look correct
(check a few against the actual Go source file).
### Step 9: Validate the dependency graph
Check for any circular module dependencies (modules that depend on each other):
```bash
sqlite3 porting.db "
SELECT d1.source_id, d1.target_id
FROM dependencies d1
JOIN dependencies d2
ON d1.source_type = d2.target_type AND d1.source_id = d2.target_id
AND d1.target_type = d2.source_type AND d1.target_id = d2.source_id
WHERE d1.source_type = 'module' AND d1.target_type = 'module';
"
```
Circular dependencies are not necessarily wrong (Go packages can have them via
interfaces), but they should be reviewed.
Check which items are blocked by unported dependencies:
```bash
dotnet run --project tools/NatsNet.PortTracker -- dependency blocked --db porting.db
```
And confirm that at least some items are ready to port (have no unported deps):
```bash
dotnet run --project tools/NatsNet.PortTracker -- dependency ready --db porting.db
```
### Step 10: Verify library import completeness
Ensure every external import found in the source is tracked:
```bash
sqlite3 porting.db "SELECT COUNT(*) FROM library_mappings;"
```
Cross-check against a manual count of unique non-stdlib imports:
```bash
grep -rh "\"" golang/nats-server/server/ --include="*.go" | \
grep -oP '"\K[^"]+' | \
grep '\.' | \
sort -u | \
wc -l
```
This is an approximate check. The AST-based analyzer is more accurate than grep
for import extraction, but the numbers should be in the same ballpark.
### Step 11: Export a verification snapshot
Save the current state as a markdown report for your records:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report export \
--format md \
--output docs/reports/phase-2-verification.md \
--db porting.db
```
## Completion Criteria
Phase 2 is complete when ALL of the following are true:
- [ ] Source file counts on disk match distinct `go_file` counts in the database
(within a small margin for intentionally excluded directories).
- [ ] Feature counts from `grep` are within 5% of the database count (AST is the
authoritative source).
- [ ] Test function counts from `grep` match the database count closely.
- [ ] No orphaned features (all linked to valid modules).
- [ ] No orphaned tests (all linked to valid modules).
- [ ] No broken test-to-feature links.
- [ ] No dangling dependency references.
- [ ] Dependency graph is reviewed -- circular deps (if any) are acknowledged.
- [ ] `dependency ready` returns at least one item (the graph has valid roots).
- [ ] Library mappings table contains all external imports.
- [ ] `phase check 1` passes with all items except "All libraries mapped" checked.
## Troubleshooting
### File count mismatch is large
If the disk file count exceeds the database count by more than a few files,
re-run the analyzer with stderr visible:
```bash
./tools/go-analyzer/go-analyzer \
--source golang/nats-server \
--db porting.db \
--schema porting-schema.sql 2>&1 | tee /tmp/analyzer.log
```
Search for warnings:
```bash
grep "Warning" /tmp/analyzer.log
```
Common causes:
- Files with build tags that prevent parsing (e.g., `//go:build ignore`).
- Files in excluded directories (`configs/`, `testdata/`).
- Syntax errors in Go files that the parser cannot handle.
### Feature count is significantly different
The AST parser counts every `func` declaration, including unexported helpers,
and ignores text inside comments and string literals. The `grep` baseline only
matches lines beginning with `func `, so it can be fooled. For example, a raw
string literal that happens to contain Go-like source:
```go
var example = `
func notARealDeclaration() {}
`
```
...is counted by grep but correctly skipped by the AST parser. Trust the
database count as authoritative.
### Orphaned records found
If orphaned records exist, the analyzer may have a bug or the database was
partially populated from a prior run. The safest fix is to:
1. Delete the database: `rm porting.db`
2. Re-run Phase 1 from Step 1.
### Tests not linked to features
The analyzer uses naming conventions to link tests to features (e.g.,
`TestConnect` maps to a feature containing `Connect`). If many tests show
`feature_id = NULL`, this is expected for tests whose names do not follow the
convention. These links can be manually added later if needed.

docs/plans/phases/phase-3-library-mapping.md (new file, 290 lines)

@@ -0,0 +1,290 @@
# Phase 3: Library Mapping
## Objective
Map every external Go dependency detected by the analyzer to its .NET equivalent.
This includes Go standard library packages, well-known third-party libraries, and
NATS ecosystem packages. When Phase 3 is complete, the `library suggest` command
returns an empty list and every import has a documented .NET migration path.
## Prerequisites
- Phases 1 and 2 are complete (database is populated and verified).
- Familiarity with both the Go standard library and .NET BCL / NuGet ecosystem.
- A working `dotnet` CLI for running PortTracker commands.
## Source and Target Locations
| Component | Path |
|---|---|
| Go source code | `golang/` (specifically `golang/nats-server/`) |
| .NET ported version | `dotnet/` |
## Steps
### Step 1: List all unmapped libraries
Start by seeing everything that needs attention:
```bash
dotnet run --project tools/NatsNet.PortTracker -- library suggest --db porting.db
```
This shows all library_mappings entries with status `not_mapped`, sorted by import
path. The output includes the Go import path, the library name, and a usage
description.
If the list is empty, all libraries are already mapped and Phase 3 is complete.
### Step 2: Review the full library list
To see both mapped and unmapped libraries in one view:
```bash
dotnet run --project tools/NatsNet.PortTracker -- library list --db porting.db
```
You can also filter by status:
```bash
dotnet run --project tools/NatsNet.PortTracker -- library list --status not_mapped --db porting.db
dotnet run --project tools/NatsNet.PortTracker -- library list --status mapped --db porting.db
```
### Step 3: Map each library
For each unmapped library, determine the appropriate .NET equivalent using the
reference table below, then record the mapping:
```bash
dotnet run --project tools/NatsNet.PortTracker -- library map <id> \
--package "<NuGet package or BCL>" \
--namespace "<.NET namespace>" \
--notes "<migration notes>" \
--db porting.db
```
**Example -- mapping `encoding/json`:**
```bash
dotnet run --project tools/NatsNet.PortTracker -- library map 1 \
--package "System.Text.Json" \
--namespace "System.Text.Json" \
--notes "Use JsonSerializer. Consider source generators for AOT." \
--db porting.db
```
**Example -- mapping `github.com/klauspost/compress`:**
```bash
dotnet run --project tools/NatsNet.PortTracker -- library map 12 \
--package "System.IO.Compression" \
--namespace "System.IO.Compression" \
--notes "S2/Snappy codec needs evaluation; may need custom impl or IronSnappy NuGet." \
--db porting.db
```
Repeat for every entry in the `library suggest` output.
### Step 4: Handle libraries that need custom implementations
Some Go libraries have no direct .NET equivalent and will require custom code.
For these, record the mapping with a descriptive note:
```bash
dotnet run --project tools/NatsNet.PortTracker -- library map <id> \
--package "Custom" \
--namespace "NatsNet.Internal" \
--notes "No direct equivalent; requires custom implementation. See <details>." \
--db porting.db
```
### Step 5: Verify all libraries are mapped
Run the suggest command again -- it should return an empty list:
```bash
dotnet run --project tools/NatsNet.PortTracker -- library suggest --db porting.db
```
Expected output:
```
All libraries have been mapped!
```
Also verify via the full list:
```bash
dotnet run --project tools/NatsNet.PortTracker -- library list --db porting.db
```
Every entry should show status `mapped` (or `verified` if you have already
validated the mapping in code).
### Step 6: Run the phase check
Confirm Phase 1 now shows full completion including library mappings:
```bash
dotnet run --project tools/NatsNet.PortTracker -- phase check 1 --db porting.db
```
Expected:
```
Phase 1 Checklist:
[x] Modules populated: <N>
[x] Features populated: <N>
[x] Unit tests populated: <N>
[x] Dependencies mapped: <N>
[x] Libraries identified: <N>
[x] All libraries mapped: <N>/<N>
```
### Step 7: Export the mapping report
Save the complete library mapping state for reference during porting:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report export \
--format md \
--output docs/reports/phase-3-library-mapping.md \
--db porting.db
```
## Common Go to .NET Mappings Reference
Use this table as a starting point when determining .NET equivalents. Adapt based
on the specific usage patterns found in the NATS server source.
### Go Standard Library
| Go Package | .NET Equivalent | Notes |
|---|---|---|
| `encoding/json` | `System.Text.Json` | Use `JsonSerializer`; consider source generators for performance |
| `encoding/binary` | `System.Buffers.Binary.BinaryPrimitives` | For endian-aware reads/writes |
| `encoding/base64` | `System.Convert` | `Convert.ToBase64String` / `Convert.FromBase64String` |
| `encoding/hex` | `System.Convert` | `Convert.ToHexString` (.NET 5+) |
| `encoding/pem` | `System.Security.Cryptography.PemEncoding` | .NET 5+ |
| `sync` | `System.Threading` | `Mutex` -> `lock` / `Monitor`; `RWMutex` -> `ReaderWriterLockSlim`; `WaitGroup` -> `CountdownEvent`; `Once` -> `Lazy<T>` |
| `sync/atomic` | `System.Threading.Interlocked` | `Interlocked.Increment`, `CompareExchange`, etc. |
| `net` | `System.Net.Sockets` | `TcpListener`, `TcpClient`, `Socket` |
| `net/http` | `System.Net.Http` / `Microsoft.AspNetCore` | Client: `HttpClient`; Server: Kestrel / minimal APIs |
| `net/url` | `System.Uri` | `Uri`, `UriBuilder` |
| `crypto/tls` | `System.Net.Security.SslStream` | Wrap `NetworkStream` with `SslStream` |
| `crypto/x509` | `System.Security.Cryptography.X509Certificates` | `X509Certificate2` |
| `crypto/sha256` | `System.Security.Cryptography.SHA256` | `SHA256.HashData()` (.NET 5+) |
| `crypto/ed25519` | `System.Security.Cryptography` | `Ed25519` support in .NET 9+ |
| `crypto/rand` | `System.Security.Cryptography.RandomNumberGenerator` | `RandomNumberGenerator.Fill()` |
| `time` | `System.TimeSpan` / `System.Threading.Timer` | `time.Duration` -> `TimeSpan`; `time.Ticker` -> `PeriodicTimer`; `time.After` -> `Task.Delay` |
| `time` (parsing) | `System.DateTime` / `System.DateTimeOffset` | `DateTime.Parse`, custom formats |
| `fmt` | String interpolation / `String.Format` | `$"..."` for most cases; `String.Format` for dynamic |
| `io` | `System.IO` | `Reader` -> `Stream`; `Writer` -> `Stream`; `io.Copy` -> `Stream.CopyTo` |
| `io/fs` | `System.IO` | `Directory`, `File`, `FileInfo` |
| `bufio` | `System.IO.BufferedStream` | Or `StreamReader` / `StreamWriter` |
| `bytes` | `System.Buffers` / `MemoryStream` | `bytes.Buffer` -> `MemoryStream` or `ArrayBufferWriter<byte>` |
| `strings` | `System.String` / `System.Text.StringBuilder` | Most methods have direct equivalents |
| `strconv` | `int.Parse`, `double.Parse`, etc. | Or `Convert` class |
| `context` | `CancellationToken` | `context.Context` -> `CancellationToken`; `context.WithCancel` -> `CancellationTokenSource` |
| `os` | `System.Environment` / `System.IO` | `os.Exit` -> `Environment.Exit`; file ops -> `File` class |
| `os/signal` | `System.Runtime.InteropServices.PosixSignalRegistration` | `PosixSignalRegistration.Create` (.NET 6+); `Console.CancelKeyPress` for Ctrl-C |
| `path/filepath` | `System.IO.Path` | `Path.Combine`, `Path.GetDirectoryName`, etc. |
| `sort` | `System.Linq` / `Array.Sort` | LINQ `.OrderBy()` or in-place `Array.Sort` |
| `math` | `System.Math` | Direct equivalent |
| `math/rand` | `System.Random` | `Random.Shared` for thread-safe usage (.NET 6+) |
| `regexp` | `System.Text.RegularExpressions.Regex` | Consider source generators for compiled patterns |
| `errors` | `System.Exception` | Go errors -> .NET exceptions; `errors.Is` -> pattern matching |
| `log` | `Serilog` | Project choice: Serilog via `Microsoft.Extensions.Logging` |
| `testing` | `xUnit` | `testing.T` -> xUnit `[Fact]`/`[Theory]`; `testing.B` -> BenchmarkDotNet |
| `flag` | `System.CommandLine` | Or `Microsoft.Extensions.Configuration` |
| `embed` | Embedded resources | `.csproj` `<EmbeddedResource>` items |
| `runtime` | `System.Runtime` / `System.Environment` | `runtime.GOOS` -> `RuntimeInformation.IsOSPlatform` |
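The `time` and `context` rows above cover the two patterns that appear most often in the Go server. A minimal sketch of the translation, with illustrative intervals (not taken from the actual source):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Go: ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(5));

// Go: ticker := time.NewTicker(500 * time.Millisecond)
using var ticker = new PeriodicTimer(TimeSpan.FromMilliseconds(500));

try
{
    // Go: for { select { case <-ticker.C: ...; case <-ctx.Done(): return } }
    while (await ticker.WaitForNextTickAsync(cts.Token))
        Console.WriteLine("tick");
}
catch (OperationCanceledException)
{
    // Equivalent of <-ctx.Done(): the timeout token fired.
}
```

Note that `WaitForNextTickAsync` signals cancellation by throwing, not by returning `false` (it returns `false` only when the timer is disposed), so the `catch` plays the role of the `ctx.Done()` branch.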
### NATS Ecosystem Libraries
| Go Package | .NET Equivalent | Notes |
|---|---|---|
| `github.com/nats-io/jwt/v2` | Custom / evaluate existing | JWT claims for NATS auth; may need custom implementation matching NATS JWT spec |
| `github.com/nats-io/nkeys` | Custom implementation | Ed25519 key pairs for NATS authentication; use `System.Security.Cryptography` Ed25519 |
| `github.com/nats-io/nuid` | Custom / `System.Guid` | NATS unique IDs; simple custom implementation or adapt to `Guid` if format is flexible |
| `github.com/nats-io/nats.go` | `NATS.Net` (official) | Only used in tests; the official .NET NATS client |
### Third-Party Libraries
| Go Package | .NET Equivalent | Notes |
|---|---|---|
| `github.com/klauspost/compress` | `System.IO.Compression` | General compression: `GZipStream`, `DeflateStream`. S2/Snappy: evaluate IronSnappy NuGet or custom port |
| `github.com/minio/highwayhash` | Custom / NuGet | HighwayHash implementation; search NuGet or port the algorithm |
| `golang.org/x/crypto` | `System.Security.Cryptography` / NuGet | `bcrypt` -> BCrypt.Net-Next NuGet (`Rfc2898DeriveBytes` is PBKDF2, not bcrypt-compatible); `argon2` -> Konscious.Security.Cryptography NuGet |
| `golang.org/x/sys` | `System.Runtime.InteropServices` | Platform-specific syscalls -> P/Invoke or `RuntimeInformation` |
| `golang.org/x/time` | `System.Threading.RateLimiting` | `rate.Limiter` -> `RateLimiter` (.NET 7+); `TokenBucketRateLimiter` or `SlidingWindowRateLimiter` |
| `golang.org/x/text` | `System.Globalization` | Unicode normalization, encoding detection |
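For the `golang.org/x/time` row, the `rate.Limiter` translation is not one-to-one: Go specifies a rate plus burst, while `TokenBucketRateLimiter` specifies a replenishment period plus tokens per period. A sketch of an equivalent configuration (the numbers are illustrative, not values from the NATS source):

```csharp
using System;
using System.Threading.RateLimiting;

// Go: limiter := rate.NewLimiter(rate.Limit(100), 10)  // 100 permits/sec, burst 10
var limiter = new TokenBucketRateLimiter(new TokenBucketRateLimiterOptions
{
    TokenLimit = 10,                                      // burst size
    TokensPerPeriod = 10,                                 // refill amount per period
    ReplenishmentPeriod = TimeSpan.FromMilliseconds(100), // 10 tokens / 100 ms = 100/sec
    QueueLimit = 0,                                       // fail fast, like limiter.Allow()
    AutoReplenishment = true,
});

// Go: if limiter.Allow() { ... }
using RateLimitLease lease = await limiter.AcquireAsync(permitCount: 1);
if (lease.IsAcquired)
    Console.WriteLine("permitted");
```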
## Mapping Decision Guidelines
When choosing a .NET equivalent, follow these priorities:
1. **BCL first**: Prefer built-in .NET Base Class Library types over NuGet packages.
2. **Official packages second**: If BCL does not cover it, prefer
`Microsoft.*` or `System.*` NuGet packages.
3. **Well-maintained NuGet third**: Choose packages with active maintenance,
high download counts, and compatible licenses.
4. **Custom implementation last**: Only write custom code when no suitable
package exists. Document the rationale in the mapping notes.
For each mapping, consider:
- **API surface**: Does the .NET equivalent cover all methods used in the Go code?
- **Performance**: Are there performance-critical paths that need benchmarking?
- **Thread safety**: Go's concurrency model differs from .NET. Note any
synchronization concerns.
- **Platform support**: Does the .NET package work on all target platforms
(Linux, macOS, Windows)?
## Completion Criteria
Phase 3 is complete when ALL of the following are true:
- [ ] `library suggest` returns "All libraries have been mapped!"
- [ ] Every entry in `library list` shows status `mapped` or `verified`.
- [ ] Each mapping includes a `--package` (the NuGet package or BCL assembly),
a `--namespace` (the .NET namespace to use), and `--notes` (migration
guidance).
- [ ] `phase check 3` shows all items checked, including "All libraries mapped".
- [ ] A mapping report has been exported for reference.
## Troubleshooting
### "Library <id> not found"
The ID you passed to `library map` does not exist. Run `library suggest` to get
the current list of IDs and their import paths.
### Unsure which .NET package to use
For unfamiliar Go packages:
1. Check what the package does in the Go source (look at the import usage in the
files listed by the analyzer).
2. Search NuGet.org for equivalent functionality.
3. Check if the Go package is a thin wrapper around a well-known algorithm that
.NET implements natively.
4. When in doubt, map it as "Custom" with detailed notes and revisit during
the porting phase.
### Multiple .NET options for one Go package
When there are several valid .NET equivalents (e.g., `Newtonsoft.Json` vs
`System.Text.Json`), prefer the one that:
- Is part of the BCL or a Microsoft package.
- Has better performance characteristics.
- Has source generator support for AOT compilation.
Record the alternatives in the `--notes` field so the decision can be revisited.
### Stdlib packages showing as unmapped
The analyzer classifies imports as stdlib vs external based on whether the first
path component contains a dot. Standard library packages like `encoding/json`,
`net/http`, etc. should still be recorded in the library_mappings table so that
every import path has a documented .NET migration path. Map them using the
reference table above.
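Stated as code, the classification rule is tiny. A hypothetical C# rendering of the heuristic (the actual analyzer implements this in Go):

```csharp
using System;

// An import is "external" when its first path segment contains a dot,
// i.e. looks like a host name; otherwise it is treated as stdlib.
static bool IsExternal(string importPath) =>
    importPath.Split('/')[0].Contains('.');

Console.WriteLine(IsExternal("encoding/json"));           // False (stdlib)
Console.WriteLine(IsExternal("github.com/nats-io/nuid")); // True  (external)
```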

# Phase 4: .NET Solution Design
Design the target .NET 10 solution structure and map every Go item to its .NET counterpart. This phase translates the Go codebase decomposition (from Phases 1-2) and library mappings (from Phase 3) into a concrete .NET implementation plan.
## Objective
Every module, feature, and test in the porting database must have either a .NET mapping (project, namespace, class, method) or a justified N/A status. The result is a complete blueprint for the porting work in Phase 6.
## Prerequisites
- Phases 1-3 complete: all Go items in the DB, all libraries mapped
- Verify with: `dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db`
## Source and Target Locations
- **Go source code** is located in the `golang/` folder (specifically `golang/nats-server/`)
- **.NET ported version** is located in the `dotnet/` folder
## Solution Structure
Define the .NET solution layout following standard conventions:
```
dotnet/
ZB.MOM.NatsNet.sln
src/
ZB.MOM.NatsNet.Server/ # Main server library (all core logic)
Protocol/ # Wire protocol parsing, commands
Subscriptions/ # SubList trie, subject matching
JetStream/ # Stream management, consumers
Cluster/ # Routes, gateways, leaf nodes
Auth/ # Authentication, accounts, JWT
...
ZB.MOM.NatsNet.Server.Host/ # Host/entry point (Program.cs, DI, config)
tests/
ZB.MOM.NatsNet.Server.Tests/ # Unit tests for ZB.MOM.NatsNet.Server
Protocol/
Subscriptions/
JetStream/
...
ZB.MOM.NatsNet.Server.IntegrationTests/ # Cross-module and end-to-end tests
```
The `ZB.MOM.NatsNet.Server` project holds all portable logic. `ZB.MOM.NatsNet.Server.Host` is the thin entry point that wires up dependency injection, configuration, and hosting. Tests mirror the source structure.
## Naming Conventions
Follow these rules consistently when mapping Go items to .NET:
| Aspect | Convention | Example |
|--------|-----------|---------|
| Classes | PascalCase | `NatsParser`, `SubList`, `JetStreamController` |
| Methods | PascalCase | `TryParse`, `Match`, `ProcessMessage` |
| Namespaces | `ZB.MOM.NatsNet.Server.[Module]` | `ZB.MOM.NatsNet.Server.Protocol`, `ZB.MOM.NatsNet.Server.Subscriptions` |
| Test classes | `[ClassName]Tests` | `NatsParserTests`, `SubListTests` |
| Test methods | `[Method]_[Scenario]_[Expected]` | `TryParse_ValidInput_ReturnsTrue` |
| Interfaces | `I[Name]` | `IMessageRouter`, `ISubListAccess` |
| Projects | `ZB.MOM.NatsNet.Server[.Suffix]` | `ZB.MOM.NatsNet.Server`, `ZB.MOM.NatsNet.Server.Host` |
Avoid abbreviations unless they are universally understood (e.g., `TCP`, `TLS`, `JWT`). Prefer descriptive names over short ones.
## Steps
### Step 1: Map modules
For each module in the database, assign a .NET project, namespace, and class. The `--namespace` and `--class` options are optional but recommended.
```bash
# List all modules to review
dotnet run --project tools/NatsNet.PortTracker -- module list --db porting.db
# Map a module to its .NET target
dotnet run --project tools/NatsNet.PortTracker -- module map <id> \
--project "ZB.MOM.NatsNet.Server" \
--namespace "ZB.MOM.NatsNet.Server.Protocol" \
--class "NatsParser" \
--db porting.db
```
Work through all modules systematically. Group related Go files into the same namespace:
| Go package/file pattern | .NET namespace |
|------------------------|----------------|
| `server/parser.go` | `ZB.MOM.NatsNet.Server.Protocol` |
| `server/sublist.go` | `ZB.MOM.NatsNet.Server.Subscriptions` |
| `server/jetstream*.go` | `ZB.MOM.NatsNet.Server.JetStream` |
| `server/route.go`, `server/gateway.go` | `ZB.MOM.NatsNet.Server.Cluster` |
| `server/auth.go`, `server/accounts.go` | `ZB.MOM.NatsNet.Server.Auth` |
| `server/pse/` | Likely N/A (Go-specific platform code) |
### Step 2: Map features
For each feature (function/method), assign the .NET class and method name:
```bash
# List features for a specific module
dotnet run --project tools/NatsNet.PortTracker -- feature list --module <module_id> --db porting.db
# Map a feature
dotnet run --project tools/NatsNet.PortTracker -- feature map <id> \
--project "ZB.MOM.NatsNet.Server" \
--class "NatsParser" \
--method "TryParse" \
--db porting.db
```
When mapping Go functions to .NET methods:
- Go free functions become static methods or instance methods on the appropriate class
- Go methods with receivers map to instance methods on the corresponding .NET class
- Go `init()` functions typically map to static constructors or initialization in DI setup
- Go `goroutine` launches map to `Task`-based async methods
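As a sketch of the receiver and goroutine rules above (the class and method names here are illustrative, not entries from the actual mapping database):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Go: func (s *Server) startMonitoring() { go s.monitorLoop() }
public sealed class Server
{
    private Task? _monitorTask;

    // Method with receiver -> instance method; goroutine launch -> Task.
    public void StartMonitoring(CancellationToken ct) =>
        _monitorTask = Task.Run(() => MonitorLoopAsync(ct), ct);

    private static async Task MonitorLoopAsync(CancellationToken ct)
    {
        try
        {
            while (!ct.IsCancellationRequested)
                await Task.Delay(TimeSpan.FromSeconds(1), ct);
        }
        catch (OperationCanceledException)
        {
            // Normal shutdown path; swallow so the Task completes cleanly.
        }
    }
}
```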
### Step 3: Map tests
For each test function, assign the .NET test class and method:
```bash
# List tests for a module
dotnet run --project tools/NatsNet.PortTracker -- test list --module <module_id> --db porting.db
# Map a test
dotnet run --project tools/NatsNet.PortTracker -- test map <id> \
--project "ZB.MOM.NatsNet.Server.Tests" \
--class "NatsParserTests" \
--method "TryParse_ValidInput_ReturnsTrue" \
--db porting.db
```
Go test naming (`TestParserValid`) translates to .NET naming (`TryParse_ValidInput_ReturnsTrue`). Each Go `Test*` function maps to one or more `[Fact]` or `[Theory]` methods. Table-driven Go tests often become `[Theory]` with `[InlineData]` or `[MemberData]`.
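The table-driven pattern translates mechanically: each row of the Go test table becomes one `[InlineData]` attribute. A self-contained sketch, where `NatsParser` is a hypothetical stand-in for the mapped class (not the finished API):

```csharp
using Xunit;

// Hypothetical subject under test, standing in for the real mapped class.
public static class NatsParser
{
    public static bool TryParse(string input, out string op)
    {
        op = input.TrimEnd('\r', '\n');
        return op.Length > 0;
    }
}

// Go table-driven test:
//   cases := []struct{ in string; ok bool }{{"PING\r\n", true}, {"", false}}
//   for _, tc := range cases { ... }
public class NatsParserTests
{
    [Theory]
    [InlineData("PING\r\n", true)]  // each attribute replaces one table row
    [InlineData("", false)]
    public void TryParse_Input_ReturnsExpectedResult(string input, bool expected)
    {
        Assert.Equal(expected, NatsParser.TryParse(input, out _));
    }
}
```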
### Step 4: Mark N/A items
Some Go code has no .NET equivalent. Mark these with a clear reason:
```bash
# Mark a module as N/A
dotnet run --project tools/NatsNet.PortTracker -- module set-na <id> \
--reason "Go-specific platform code, not needed in .NET" \
--db porting.db
# Mark a feature as N/A
dotnet run --project tools/NatsNet.PortTracker -- feature set-na <id> \
--reason "Go signal handling, replaced by .NET host lifecycle" \
--db porting.db
```
### Common N/A categories
Items that typically do not need a .NET port:
| Go item | Reason |
|---------|--------|
| `pse_darwin.go`, `pse_linux.go`, `pse_windows.go` | Go-specific platform syscall wrappers; use .NET `System.Diagnostics.Process` instead |
| `disk_avail_windows.go`, `disk_avail_linux.go` | Go-specific disk APIs; use .NET `System.IO.DriveInfo` instead |
| Custom logger (`logger.go`, `log.go`) | Replaced by Serilog via `ZB.MOM.NatsNet.Server.Host` |
| Signal handling (`signal.go`) | Replaced by .NET Generic Host `IHostLifetime` |
| Go `sync.Pool`, `sync.Map` wrappers | .NET provides `ConcurrentDictionary<K,V>` in the BCL and `ObjectPool<T>` via the Microsoft.Extensions.ObjectPool package |
| Build tags / `_test.go` helpers specific to Go test infra | Replaced by xUnit attributes and test fixtures |
| `go:embed` directives | Replaced by embedded resources or `IFileProvider` |
Every N/A must include a reason. Bare N/A status without explanation is not acceptable.
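For the `sync.Pool` row above, the replacement is a drop-in. A sketch using the Microsoft.Extensions.ObjectPool NuGet package (`StringBuilder` is an illustrative pooled type, not one from the NATS source):

```csharp
using System;
using System.Text;
using Microsoft.Extensions.ObjectPool;

// Go: var bufPool = sync.Pool{ New: func() any { return new(bytes.Buffer) } }
ObjectPool<StringBuilder> pool = ObjectPool.Create<StringBuilder>();

StringBuilder sb = pool.Get();   // bufPool.Get()
try
{
    sb.Append("hello");
    Console.WriteLine(sb.ToString());
}
finally
{
    sb.Clear();                  // reset before returning, as with bytes.Buffer
    pool.Return(sb);             // bufPool.Put(buf)
}
```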
## Verification
After mapping all items, run a quick check:
```bash
# Count unmapped items (should be 0)
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
# Review all modules — every row should show DotNet Project filled or status n_a
dotnet run --project tools/NatsNet.PortTracker -- module list --db porting.db
# Review N/A items to confirm they all have reasons
dotnet run --project tools/NatsNet.PortTracker -- module list --status n_a --db porting.db
dotnet run --project tools/NatsNet.PortTracker -- feature list --status n_a --db porting.db
```
## Completion Criteria
- Every module has `dotnet_project` and `dotnet_namespace` set, or status is `n_a` with a reason
- Every feature has `dotnet_project`, `dotnet_class`, and `dotnet_method` set, or status is `n_a` with a reason
- Every test has `dotnet_project`, `dotnet_class`, and `dotnet_method` set, or status is `n_a` with a reason
- Naming follows PascalCase and the namespace hierarchy described above
- No two features map to the same class + method combination (collisions)
## Related Documentation
- [Phase 3: Library Mapping](phase-3-library-mapping.md) -- library mappings inform .NET class choices
- [Phase 5: Mapping Verification](phase-5-mapping-verification.md) -- next phase, validates all mappings
- [Phase 6: Porting](phase-6-porting.md) -- uses these mappings as the implementation blueprint

# Phase 5: Mapping Verification
Verify that every Go item in the porting database is either mapped to a .NET target or justified as N/A. This phase is a quality gate between design (Phase 4) and implementation (Phase 6).
## Objective
Confirm zero unmapped items, validate all N/A justifications, enforce naming conventions, and detect collisions. The porting database must be a complete, consistent blueprint before any code is written.
## Prerequisites
- Phase 4 complete: all items have .NET mappings or N/A status
- Verify with: `dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db`
## Source and Target Locations
- **Go source code** is located in the `golang/` folder (specifically `golang/nats-server/`)
- **.NET ported version** is located in the `dotnet/` folder
## Steps
### Step 1: Confirm zero unmapped items
Run the summary report and verify that no items remain in `not_started` status without a .NET mapping:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
```
The output shows counts per status. All items should be in one of these categories:
- `not_started` with .NET mapping fields populated (ready for Phase 6)
- `n_a` with a reason in the notes field
If any items lack both a mapping and N/A status, go back to Phase 4 and address them.
### Step 2: Review N/A items
Every N/A item must have a justification. Review them by type:
```bash
# Review N/A modules
dotnet run --project tools/NatsNet.PortTracker -- module list --status n_a --db porting.db
# Review N/A features
dotnet run --project tools/NatsNet.PortTracker -- feature list --status n_a --db porting.db
# Review N/A tests
dotnet run --project tools/NatsNet.PortTracker -- test list --status n_a --db porting.db
```
For each N/A item, verify:
1. The reason is documented (check with `module show <id>`, `feature show <id>`, or `test show <id>`)
2. The reason is valid (the item genuinely has no .NET equivalent or is replaced by a .NET facility)
3. No dependent items rely on this N/A item being ported
```bash
# Check if anything depends on an N/A item
dotnet run --project tools/NatsNet.PortTracker -- dependency show module <id> --db porting.db
dotnet run --project tools/NatsNet.PortTracker -- dependency show feature <id> --db porting.db
```
If a non-N/A item depends on an N/A item, either the dependency needs to be resolved differently or the N/A classification is wrong.
### Step 3: Verify naming conventions
Walk through the mappings and check for naming compliance:
**PascalCase check**: All `dotnet_class` and `dotnet_method` values must use PascalCase. No `snake_case`, no `camelCase`.
```bash
# List all mapped modules and spot-check names
dotnet run --project tools/NatsNet.PortTracker -- module list --db porting.db
# List all mapped features for a module and check class/method names
dotnet run --project tools/NatsNet.PortTracker -- feature list --module <id> --db porting.db
```
**Namespace hierarchy check**: Namespaces must follow `ZB.MOM.NatsNet.Server.[Module]` pattern:
| Valid | Invalid |
|-------|---------|
| `ZB.MOM.NatsNet.Server.Protocol` | `Protocol` (missing root) |
| `ZB.MOM.NatsNet.Server.JetStream` | `ZB.MOM.NatsNet.Server.jetstream` (wrong case) |
| `ZB.MOM.NatsNet.Server.Subscriptions` | `NATSServer.Subscriptions` (wrong root) |
**Test naming check**: Test classes must end in `Tests`. Test methods must follow `[Method]_[Scenario]_[Expected]` pattern:
| Valid | Invalid |
|-------|---------|
| `NatsParserTests` | `ParserTest` (wrong suffix) |
| `TryParse_ValidInput_ReturnsTrue` | `TestParserValid` (Go-style naming) |
| `Match_WildcardSubject_ReturnsSubscribers` | `test_match` (snake_case) |
### Step 4: Check for collisions
No two features should map to the same class + method combination. This would cause compile errors or overwrite conflicts.
```bash
# Export the full mapping report for review
dotnet run --project tools/NatsNet.PortTracker -- report export --format md --output porting-mapping-report.md --db porting.db
```
Open `porting-mapping-report.md` and search for duplicate class + method pairs. If the database is large, run a targeted SQL query:
```bash
sqlite3 porting.db "
SELECT dotnet_class, dotnet_method, COUNT(*) as cnt
FROM features
WHERE dotnet_class IS NOT NULL AND dotnet_method IS NOT NULL
GROUP BY dotnet_class, dotnet_method
HAVING cnt > 1;
"
```
If collisions are found, rename one of the conflicting methods. Common resolution: add a more specific suffix (`ParseHeaders` vs `ParseBody` instead of two `Parse` methods).
### Step 5: Validate cross-references
Verify that test mappings reference the correct test project:
```bash
# All tests should target ZB.MOM.NatsNet.Server.Tests or ZB.MOM.NatsNet.Server.IntegrationTests
dotnet run --project tools/NatsNet.PortTracker -- test list --db porting.db
```
Check that:
- Unit tests point to `ZB.MOM.NatsNet.Server.Tests`
- Integration tests (if any) point to `ZB.MOM.NatsNet.Server.IntegrationTests`
- No tests accidentally point to `ZB.MOM.NatsNet.Server` (the library project)
### Step 6: Run phase check
Run the built-in phase verification:
```bash
dotnet run --project tools/NatsNet.PortTracker -- phase check 5 --db porting.db
```
This runs automated checks and reports any remaining issues. All checks must pass.
### Step 7: Export final mapping report
Generate the definitive mapping report that serves as the implementation reference for Phase 6:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report export \
--format md \
--output porting-mapping-report.md \
--db porting.db
```
Review the exported report for completeness. This document becomes the source of truth for the porting work.
## Troubleshooting
### Unmapped items found
```bash
# Find features with no .NET mapping and not N/A
dotnet run --project tools/NatsNet.PortTracker -- feature list --status not_started --db porting.db
```
For each unmapped item, either map it (Phase 4 Step 2) or set it to N/A with a reason.
### N/A item has dependents
If a non-N/A feature depends on an N/A feature:
1. Determine if the dependency is real or an artifact of the Go call graph
2. If real, the N/A classification is likely wrong -- map the item instead
3. If the dependency is Go-specific, remove or reclassify it
### Naming collision detected
Rename one of the colliding methods to be more specific:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature map <id> \
--method "ParseHeadersFromBuffer" \
--db porting.db
```
## Completion Criteria
- Zero items in `not_started` status without a .NET mapping
- All N/A items have a documented, valid reason
- All `dotnet_class` and `dotnet_method` values follow PascalCase
- All namespaces follow `ZB.MOM.NatsNet.Server.[Module]` hierarchy
- No two features map to the same class + method combination
- All tests target the correct test project
- `phase check 5` passes with no errors
- Mapping report exported and reviewed
## Related Documentation
- [Phase 4: .NET Solution Design](phase-4-dotnet-design.md) -- the mapping phase this verifies
- [Phase 6: Porting](phase-6-porting.md) -- uses the verified mappings for implementation

# Phase 6: Initial Porting
Port Go code to .NET 10 C#, working through the dependency graph bottom-up. This is the main implementation phase where the actual code is written.
## Objective
Implement every non-N/A module, feature, and test in the porting database. Work from leaf nodes (items with no unported dependencies) upward through the dependency graph. Keep the database current as work progresses.
## Prerequisites
- Phase 5 complete: all mappings verified, no collisions, naming validated
- .NET solution structure created:
- `dotnet/src/ZB.MOM.NatsNet.Server/ZB.MOM.NatsNet.Server.csproj`
- `dotnet/src/ZB.MOM.NatsNet.Server.Host/ZB.MOM.NatsNet.Server.Host.csproj`
- `dotnet/tests/ZB.MOM.NatsNet.Server.Tests/ZB.MOM.NatsNet.Server.Tests.csproj`
- `dotnet/tests/ZB.MOM.NatsNet.Server.IntegrationTests/ZB.MOM.NatsNet.Server.IntegrationTests.csproj`
- Library dependencies (NuGet packages) added per Phase 3 mappings
- Verify readiness: `dotnet run --project tools/NatsNet.PortTracker -- phase check 5 --db porting.db`
## Source and Target Locations
- **Go source code** is located in the `golang/` folder (specifically `golang/nats-server/`)
- **.NET ported version** is located in the `dotnet/` folder
## Porting Workflow
This is the core loop. Repeat until all items are complete.
### Step 1: Find ready items
Query for items whose dependencies are all ported (status `complete`, `verified`, or `n_a`):
```bash
dotnet run --project tools/NatsNet.PortTracker -- dependency ready --db porting.db
```
This returns modules and features that have no unported dependencies. Start with these.
### Step 2: Pick an item and mark as stub
Choose an item from the ready list. Mark it as `stub` to signal work has begun:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature update <id> --status stub --db porting.db
```
### Step 3: Create the skeleton
In the .NET project, create the class and method skeleton based on the mapping:
1. Look up the mapping: `dotnet run --project tools/NatsNet.PortTracker -- feature show <id> --db porting.db`
2. Create the file at the correct path under `dotnet/src/ZB.MOM.NatsNet.Server/` following the namespace hierarchy
3. Add the class declaration, method signature, and a `throw new NotImplementedException()` body
For batch scaffolding of an entire module:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature update 0 --status stub \
--all-in-module <module_id> --db porting.db
```
### Step 4: Implement the logic
Reference the Go source code. The database stores the Go file path and line number for each feature:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature show <id> --db porting.db
```
The output includes `Go File`, `Go Line`, and `Go LOC` fields. Open the Go source at those coordinates and translate the logic to C#.
Key translation patterns:
| Go pattern | .NET equivalent |
|-----------|-----------------|
| `goroutine` + `channel` | `Task` + `Channel<T>` or `async/await` |
| `sync.Mutex` | `lock` statement or `SemaphoreSlim` |
| `sync.RWMutex` | `ReaderWriterLockSlim` |
| `sync.WaitGroup` | `Task.WhenAll` or `CountdownEvent` |
| `defer` | `try/finally` or `using`/`IDisposable` |
| `interface{}` / `any` | `object` or generics |
| `[]byte` | `byte[]`, `ReadOnlySpan<byte>`, or `ReadOnlyMemory<byte>` |
| `map[K]V` | `Dictionary<K,V>` or `ConcurrentDictionary<K,V>` |
| `error` return | Exceptions or `Result<T>` pattern |
| `panic/recover` | `throw` / `try-catch`; reserve `Environment.FailFast` for truly unrecoverable states |
| `select` on channels | `Task.WhenAny` or `Channel<T>` reader patterns |
| `context.Context` | `CancellationToken` |
| `io.Reader/Writer` | `Stream`, `PipeReader/PipeWriter` |
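The first and fifth rows above combine in the most common translation in this codebase: a goroutine feeding a channel, with `defer`-based cleanup. A minimal sketch:

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

// Go:
//   ch := make(chan string, 16)
//   go func() { defer close(ch); ch <- "msg" }()
//   for m := range ch { ... }
var channel = Channel.CreateBounded<string>(16);

var producer = Task.Run(async () =>
{
    try
    {
        await channel.Writer.WriteAsync("msg");
    }
    finally
    {
        channel.Writer.Complete();   // defer close(ch)
    }
});

// for m := range ch -- completes when the writer calls Complete()
await foreach (string m in channel.Reader.ReadAllAsync())
    Console.WriteLine(m);

await producer;
```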
### Step 5: Mark complete
Once the implementation compiles and the basic logic is in place:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature update <id> --status complete --db porting.db
```
### Step 6: Run targeted tests
If tests exist for this feature, run them:
```bash
dotnet test --filter "FullyQualifiedName~ZB.MOM.NatsNet.Server.Tests.Protocol" \
dotnet/tests/ZB.MOM.NatsNet.Server.Tests/
```
Fix any failures before moving on.
### Step 7: Check what is now unblocked
Completing items may unblock others that depend on them:
```bash
dotnet run --project tools/NatsNet.PortTracker -- dependency ready --db porting.db
```
Return to Step 2 with the newly available items.
## DB Update Discipline
The porting database must stay current. Update status at every transition:
```bash
# Starting work on a feature
dotnet run --project tools/NatsNet.PortTracker -- feature update 42 --status stub --db porting.db
# Feature implemented
dotnet run --project tools/NatsNet.PortTracker -- feature update 42 --status complete --db porting.db
# Batch scaffolding for all features in a module
dotnet run --project tools/NatsNet.PortTracker -- feature update 0 --status stub \
--all-in-module 3 --db porting.db
# Module fully ported (all its features are complete)
dotnet run --project tools/NatsNet.PortTracker -- module update 3 --status complete --db porting.db
# Check progress
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
```
Status transitions follow this progression:
```
not_started -> stub -> complete -> verified (Phase 7)
\-> n_a (if determined during porting)
```
Never skip `stub` -- it signals that work is in progress and prevents duplicate effort.
## Porting Order Strategy
### Start with leaf modules
Leaf modules have no dependencies on other unported modules. They are safe to port first because nothing they call is missing.
```bash
# These are the leaves — port them first
dotnet run --project tools/NatsNet.PortTracker -- dependency ready --db porting.db
```
Typical leaf modules include:
- Utility/helper code (string manipulation, byte buffer pools)
- Constants and enums
- Configuration types (options, settings)
- Error types and codes
### Then work upward
After leaves are done, modules that depended only on those leaves become ready. Continue up the dependency graph:
```
Leaf utilities -> Protocol types -> Parser -> Connection handler -> Server
```
### Port tests alongside features
When porting a feature, also port its associated tests in the same pass. This provides immediate validation:
```bash
# List tests for a feature
dotnet run --project tools/NatsNet.PortTracker -- test list --module <module_id> --db porting.db
# After porting a test
dotnet run --project tools/NatsNet.PortTracker -- test update <id> --status complete --db porting.db
```
## Progress Tracking
Check overall progress regularly:
```bash
# Summary stats
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
# What is still blocked
dotnet run --project tools/NatsNet.PortTracker -- dependency blocked --db porting.db
# Phase-level check
dotnet run --project tools/NatsNet.PortTracker -- phase check 6 --db porting.db
```
## Handling Discoveries During Porting
During implementation, you may find:
### Items that should be N/A
If a feature turns out to be unnecessary in .NET (discovered during implementation):
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature set-na <id> \
--reason "Go-specific memory management, handled by .NET GC" --db porting.db
```
### Missing dependencies
If the Go analyzer missed a dependency:
```bash
# The dependency is tracked in the DB via the dependencies table
# For now, just ensure the target is ported before continuing
dotnet run --project tools/NatsNet.PortTracker -- dependency show feature <id> --db porting.db
```
### Design changes
If the .NET design needs to differ from the original mapping (e.g., splitting a large Go function into multiple .NET methods), update the mapping:
```bash
dotnet run --project tools/NatsNet.PortTracker -- feature map <id> \
--class "NewClassName" \
--method "NewMethodName" \
--db porting.db
```
## Tips
1. **Keep the build green.** The solution should compile after each feature is completed. Do not leave unresolved references.
2. **Write idiomatic C#.** Do not transliterate Go line-by-line. Use .NET patterns (async/await, LINQ, Span, dependency injection) where they produce cleaner code.
3. **Use `CancellationToken` everywhere.** The Go code uses `context.Context` pervasively -- mirror this with `CancellationToken` parameters.
4. **Prefer `ReadOnlySpan<byte>` for hot paths.** The NATS parser processes bytes at high throughput. Use spans and avoid allocations in the critical path.
5. **Do not port Go comments verbatim.** Translate the intent into C# XML doc comments where appropriate.
6. **Run `dotnet build` frequently.** Catch compile errors early rather than accumulating them.
## Completion Criteria
- All non-N/A modules have status `complete` or better
- All non-N/A features have status `complete` or better
- All non-N/A tests have status `complete` or better
- The solution compiles without errors: `dotnet build`
- `dependency blocked` returns no items (or only items waiting for Phase 7 verification)
- `report summary` shows the expected completion counts
## Related Documentation
- [Phase 5: Mapping Verification](phase-5-mapping-verification.md) -- the verified mappings this phase implements
- [Phase 7: Porting Verification](phase-7-porting-verification.md) -- targeted testing of the ported code

# Phase 7: Porting Verification
Verify all ported code through targeted testing per module. This phase does NOT run the full test suite as a single pass -- it systematically verifies each module, marks items as verified, and confirms behavioral equivalence with the Go server.
## Objective
Every ported module passes its targeted tests. Every item in the database reaches `verified` or `n_a` status. Cross-module integration tests pass. Key behavioral scenarios produce equivalent results between the Go and .NET servers.
## Prerequisites
- Phase 6 complete: all non-N/A items at `complete` or better
- All tests ported and compilable
- Verify readiness: `dotnet run --project tools/NatsNet.PortTracker -- phase check 6 --db porting.db`
## Source and Target Locations
- **Go source code** is located in the `golang/` folder (specifically `golang/nats-server/`)
- **.NET ported version** is located in the `dotnet/` folder
## Verification Workflow
Work through modules one at a time. Do not move to the next module until the current one is fully verified.
### Step 1: List modules to verify
```bash
# Show all modules — look for status 'complete' (not yet verified)
dotnet run --project tools/NatsNet.PortTracker -- module list --db porting.db
```
Start with leaf modules (those with the fewest dependencies) and work upward, same order as the porting phase.
### Step 2: List tests for the module
For each module, identify all mapped tests:
```bash
dotnet run --project tools/NatsNet.PortTracker -- test list --module <module_id> --db porting.db
```
This shows every test associated with the module, its status, and its .NET method name.
### Step 3: Run targeted tests
Run only the tests for this module using `dotnet test --filter`:
```bash
# Filter by namespace (matches all tests in the module's namespace)
dotnet test --filter "FullyQualifiedName~ZB.MOM.NatsNet.Server.Tests.Protocol" \
dotnet/tests/ZB.MOM.NatsNet.Server.Tests/
# Filter by test class
dotnet test --filter "FullyQualifiedName~ZB.MOM.NatsNet.Server.Tests.Protocol.NatsParserTests" \
dotnet/tests/ZB.MOM.NatsNet.Server.Tests/
# Filter by specific test method
dotnet test --filter "FullyQualifiedName~ZB.MOM.NatsNet.Server.Tests.Protocol.NatsParserTests.TryParse_ValidInput_ReturnsTrue" \
dotnet/tests/ZB.MOM.NatsNet.Server.Tests/
```
The `--filter` flag uses partial matching on the fully qualified test name. Use the namespace pattern for module-wide runs, and the class or method pattern for debugging specific failures.
### Step 4: Handle failures
When tests fail:
1. **Read the failure output.** The test runner prints the assertion that failed, the expected vs actual values, and the stack trace.
2. **Locate the Go reference.** Look up the test in the database to find the original Go test and source:
```bash
dotnet run --project tools/NatsNet.PortTracker -- test show <test_id> --db porting.db
```
3. **Compare Go and .NET logic.** Open the Go source at the stored line number. Check for translation errors: off-by-one, missing edge cases, different default values.
4. **Fix and re-run.** After fixing, re-run only the failing test:
```bash
dotnet test --filter "FullyQualifiedName~ZB.MOM.NatsNet.Server.Tests.Protocol.NatsParserTests.TryParse_EmptyInput_ReturnsFalse" \
dotnet/tests/ZB.MOM.NatsNet.Server.Tests/
```
5. **Then re-run the full module.** Confirm no regressions:
```bash
dotnet test --filter "FullyQualifiedName~ZB.MOM.NatsNet.Server.Tests.Protocol" \
dotnet/tests/ZB.MOM.NatsNet.Server.Tests/
```
Common failure causes:
| Symptom | Likely cause |
|---------|-------------|
| Off-by-one in buffer parsing | Go slices are half-open `[start:end)`; C# ranges (`start..end`) match, but hand-written index arithmetic may differ |
| Timeout in async test | Missing `CancellationToken`, or `Task` not awaited |
| Wrong byte sequence | Go uses `[]byte("string")` which is UTF-8; ensure C# uses `Encoding.UTF8` |
| Nil vs null behavior | Go nil checks behave differently from C# null; check for `default` values |
| Map iteration order | Go maps iterate in random order; if the test depends on order, sort first |
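Several of these pitfalls come from Go semantics that can be verified in isolation. A minimal sketch in plain Go (unrelated to the server code) that demonstrates the half-open slice, UTF-8 byte, and map-ordering rows of the table:

```go
package main

import (
	"fmt"
	"sort"
)

func main() {
	// Half-open slicing: buf[1:3] covers indices 1 and 2 only.
	buf := []byte{10, 20, 30, 40}
	fmt.Println(buf[1:3]) // [20 30]

	// []byte("...") is UTF-8: non-ASCII runes occupy multiple bytes.
	// A .NET port must use Encoding.UTF8, not char counts or ASCII.
	fmt.Println(len("héllo"), len([]rune("héllo"))) // 6 5

	// Map iteration order is unspecified: sort keys before asserting order.
	m := map[string]int{"b": 2, "a": 1, "c": 3}
	keys := make([]string, 0, len(m))
	for k := range m {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	fmt.Println(keys) // [a b c]
}
```

When a .NET test disagrees with its Go counterpart, reproducing the Go behavior in a scratch program like this is often the fastest way to pin down which side is wrong.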
### Step 5: Mark module as verified
Once all tests pass for a module:
```bash
# Mark the module itself as verified
dotnet run --project tools/NatsNet.PortTracker -- module update <module_id> --status verified --db porting.db
# Mark all features in the module as verified
dotnet run --project tools/NatsNet.PortTracker -- feature update 0 --status verified \
--all-in-module <module_id> --db porting.db
# Mark individual tests as verified
dotnet run --project tools/NatsNet.PortTracker -- test update <test_id> --status verified --db porting.db
```
### Step 6: Move to next module
Repeat Steps 2-5 for each module. Track progress:
```bash
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
```
## Integration Testing
After all modules are individually verified, run integration tests that exercise cross-module behavior.
### Step 7: Run integration tests
```bash
dotnet test dotnet/tests/ZB.MOM.NatsNet.Server.IntegrationTests/
```
Integration tests cover scenarios like:
- Client connects, subscribes, receives published messages
- Multiple clients with wildcard subscriptions
- Connection lifecycle (connect, disconnect, reconnect)
- Protocol error handling (malformed commands, oversized payloads)
- Configuration loading and server startup
Fix any failures by tracing through the modules involved and checking the interaction boundaries.
### Step 8: Behavioral comparison
Run both the Go server and the .NET server with the same workload and compare behavior. This catches semantic differences that unit tests might miss.
**Setup:**
1. Start the Go server:
```bash
cd golang/nats-server && go run . -p 4222
```
2. Start the .NET server:
```bash
dotnet run --project dotnet/src/ZB.MOM.NatsNet.Server.Host -- --port 4223
```
**Comparison scenarios:**
| Scenario | What to compare |
|----------|----------------|
| Basic pub/sub | Publish a message, verify subscriber receives identical payload |
| Wildcard matching | Subscribe with `foo.*` and `foo.>`, publish to `foo.bar`, verify same match results |
| Queue groups | Multiple subscribers in a queue group, verify each message is delivered to exactly one member and load is spread across the group |
| Protocol errors | Send malformed commands, verify same error responses |
| Connection info | Connect and check `INFO` response fields |
| Graceful shutdown | Send SIGTERM, verify clean disconnection |
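The wildcard scenario above hinges on two rules: `*` matches exactly one token and `>` matches one or more trailing tokens. A minimal matcher sketch that illustrates the subject semantics (not the server's actual `SubList` implementation, which uses a trie):

```go
package main

import (
	"fmt"
	"strings"
)

// matches reports whether subject matches the subscription pattern.
// "*" matches exactly one token; ">" matches one or more trailing tokens.
func matches(pattern, subject string) bool {
	p := strings.Split(pattern, ".")
	s := strings.Split(subject, ".")
	for i, tok := range p {
		if tok == ">" {
			return len(s) > i // at least one token must remain
		}
		if i >= len(s) {
			return false
		}
		if tok != "*" && tok != s[i] {
			return false
		}
	}
	return len(p) == len(s)
}

func main() {
	fmt.Println(matches("foo.*", "foo.bar"))     // true
	fmt.Println(matches("foo.*", "foo.bar.baz")) // false: * is one token
	fmt.Println(matches("foo.>", "foo.bar.baz")) // true: > spans the rest
	fmt.Println(matches("foo.>", "foo"))         // false: > needs a token
}
```

Both servers must agree on all four of these cases; any divergence in the comparison run points at the subject-matching module.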
Use the `nats` CLI tool to drive traffic:
```bash
# Subscribe on Go server
nats sub -s nats://localhost:4222 "test.>"
# Subscribe on .NET server
nats sub -s nats://localhost:4223 "test.>"
# Publish to both and compare
nats pub -s nats://localhost:4222 test.hello "payload"
nats pub -s nats://localhost:4223 test.hello "payload"
```
Document any behavioral differences. Some differences are expected (e.g., server name, version string) while others indicate bugs.
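One way to automate the `INFO` comparison is to parse both responses and drop the fields that are expected to differ before comparing the rest. A sketch in Go; the payloads and field names in `main` are illustrative, not the servers' actual output:

```go
package main

import (
	"encoding/json"
	"fmt"
	"reflect"
)

// compareInfo unmarshals two INFO JSON payloads and compares them after
// removing fields that are expected to differ between the two servers.
func compareInfo(goInfo, netInfo []byte, ignore ...string) (bool, error) {
	var a, b map[string]any
	if err := json.Unmarshal(goInfo, &a); err != nil {
		return false, err
	}
	if err := json.Unmarshal(netInfo, &b); err != nil {
		return false, err
	}
	for _, k := range ignore {
		delete(a, k)
		delete(b, k)
	}
	return reflect.DeepEqual(a, b), nil
}

func main() {
	// Hypothetical captures from the two servers.
	goInfo := []byte(`{"server_name":"go-ref","version":"2.10.0","max_payload":1048576}`)
	netInfo := []byte(`{"server_name":"natsnet","version":"0.1.0","max_payload":1048576}`)
	same, err := compareInfo(goInfo, netInfo, "server_name", "version")
	fmt.Println(same, err) // true <nil>
}
```

The ignore list is exactly the set of "expected differences" you document; anything else that differs is a candidate bug.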
### Step 9: Final verification
Run the complete check:
```bash
# Phase 7 check — all tests verified
dotnet run --project tools/NatsNet.PortTracker -- phase check 7 --db porting.db
# Final summary — all items should be verified or n_a
dotnet run --project tools/NatsNet.PortTracker -- report summary --db porting.db
# Export final report
dotnet run --project tools/NatsNet.PortTracker -- report export \
--format md \
--output porting-final-report.md \
--db porting.db
```
## Troubleshooting
### Test passes individually but fails in module run
Likely a test ordering dependency or shared state. Check for:
- Static mutable state not reset between tests
- Port conflicts if tests start servers
- File system artifacts from previous test runs
Fix by adding proper test cleanup (`IDisposable`, `IAsyncLifetime`) and using unique ports/paths per test.
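For unique ports, a common technique is to bind port 0 and let the OS choose. A Go sketch of the idea (the .NET tests would do the equivalent with `TcpListener` on port 0):

```go
package main

import (
	"fmt"
	"net"
)

// freePort asks the OS for an unused TCP port by listening on port 0,
// then releases it for the test server to bind. A small race remains
// between Close and the server's bind, but it avoids hard-coded ports
// colliding across parallel tests.
func freePort() (int, error) {
	l, err := net.Listen("tcp", "127.0.0.1:0")
	if err != nil {
		return 0, err
	}
	defer l.Close()
	return l.Addr().(*net.TCPAddr).Port, nil
}

func main() {
	p, err := freePort()
	fmt.Println(p > 0, err) // true <nil>
}
```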
### Module passes but integration test fails
The issue is at a module boundary. Check:
- Interface implementations match expectations
- Serialization/deserialization is consistent across modules
- Thread safety at module interaction points
- Async patterns are correct (no fire-and-forget `Task` without error handling)
### Behavioral difference with Go server
1. Identify the specific protocol message or state that differs
2. Trace through both implementations step by step
3. Check the NATS protocol specification for the correct behavior
4. Fix the .NET implementation to match (the Go server is the reference)
## Completion Criteria
- All non-N/A modules have status `verified`
- All non-N/A features have status `verified`
- All non-N/A tests have status `verified`
- All targeted tests pass: `dotnet test dotnet/tests/ZB.MOM.NatsNet.Server.Tests/`
- All integration tests pass: `dotnet test dotnet/tests/ZB.MOM.NatsNet.Server.IntegrationTests/`
- Key behavioral scenarios produce equivalent results on Go and .NET servers
- `phase check 7` passes with no errors
- Final report exported and reviewed
## Related Documentation
- [Phase 6: Porting](phase-6-porting.md) -- the implementation phase this verifies
- [Phase 4: .NET Solution Design](phase-4-dotnet-design.md) -- the original design mappings

documentation_rules.md Normal file
@@ -0,0 +1,318 @@
# Documentation Rules
This document defines the documentation system for the NATS .NET server project. It provides guidelines for generating, updating, and maintaining project documentation.
The documentation is intended for internal team reference — explaining what the system is, how it works, how to extend it, and how to debug it.
## Folder Structure
```
Documentation/
├── Instructions/ # Guidelines for LLMs (meta-documentation)
│ └── (this file serves as the single instructions reference)
├── GettingStarted/ # Onboarding, prerequisites, first run
├── Protocol/ # Wire protocol, parser, command types
├── Subscriptions/ # SubList trie, subject matching, wildcards
├── Server/ # NatsServer orchestrator, NatsClient handler
├── Configuration/ # NatsOptions, appsettings, CLI arguments
├── Operations/ # Deployment, monitoring, health checks, troubleshooting
└── Plans/ # Design documents and implementation plans
```
Future module folders (add as modules are ported):
```
├── Authentication/ # Auth mechanisms, NKeys, JWT, accounts
├── Clustering/ # Routes, gateways, leaf nodes
├── JetStream/ # Streams, consumers, storage, RAFT
├── Monitoring/ # HTTP endpoints (/varz, /connz, etc.)
├── WebSocket/ # WebSocket transport
└── TLS/ # TLS configuration and setup
```
---
## Style Guide
### Tone and Voice
- **Technical and direct** — no marketing language. Avoid "powerful", "robust", "seamless", "blazing fast".
- **Assume the reader is a .NET developer** — don't explain dependency injection, async/await, or LINQ basics.
- **Explain "why" not just "what"** — document reasoning behind patterns and decisions.
- **Use present tense** — "The parser reads..." not "The parser will read..."
### Formatting Rules
| Aspect | Convention |
|--------|------------|
| File names | `PascalCase.md` |
| H1 (`#`) | Document title only, Title Case |
| H2 (`##`) | Major sections, Title Case |
| H3+ (`###`) | Subsections, Sentence case |
| Code blocks | Always specify language (`csharp`, `json`, `bash`, `xml`) |
| Code snippets | 5-25 lines typical; include class/method context |
| Cross-references | Relative paths: `[See SubList](../Subscriptions/SubList.md)` |
| Inline code | Backticks for code refs: `NatsServer`, `SubList.Match()`, `NatsOptions` |
| Lists | Bullets for unordered, numbers for sequential steps |
| Tables | For structured reference (config options, command formats) |
### Naming Conventions
- Match code terminology exactly: `SubList` not "Subject List", `NatsClient` not "NATS Client Handler"
- Use backticks for all code references: `NatsParser`, `appsettings.json`, `dotnet test`
- Spell out acronyms on first use: "Neural Autonomic Transport System (NATS)" — common acronyms that don't need expansion: API, JSON, TCP, HTTP, TLS, JWT
### Code Snippet Guidelines
**Do:**
- Copy snippets from actual source files
- Include enough context (class name, method signature)
- Specify the language in code blocks
- Show 5-25 line examples
```csharp
// Good — shows class context
public sealed class NatsServer : IMessageRouter, ISubListAccess, IDisposable
{
public async Task StartAsync(CancellationToken ct)
{
_listener = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
_listener.Bind(new IPEndPoint(IPAddress.Parse(_options.Host), _options.Port));
_listener.Listen(128);
}
}
```
**Don't:**
- Invent example code that doesn't exist in the codebase
- Include 100+ line dumps without explanation
- Use pseudocode when real code is available
- Omit the language specifier on code blocks
### Structure Conventions
Every documentation file must include:
1. **Title and purpose** — H1 heading with 1-2 sentence description
2. **Key concepts** — if the topic requires background understanding
3. **Code examples** — embedded snippets from actual codebase
4. **Configuration** — if the component has configurable options
5. **Related documentation** — links to related topics
Organize content from general to specific:
1. Overview/introduction
2. Key concepts
3. Basic usage
4. Advanced usage / internals
5. Configuration
6. Troubleshooting
7. Related documentation
End each document with:
```markdown
## Related Documentation
- [Related Topic](../Component/Topic.md)
```
### What to Avoid
- Don't document the obvious (e.g., "The constructor creates a new instance")
- Don't duplicate source code comments — reference the file instead
- Don't include temporary information (dates, version numbers, "coming soon")
- Don't over-explain .NET basics
---
## Generating Documentation
### Document Types
Each component folder should contain these standard files:
| File | Purpose |
|------|---------|
| `Overview.md` | What the component does, key concepts, architecture |
| `Development.md` | How to add/modify features, patterns to follow |
| `Configuration.md` | All configurable options with defaults and examples |
| `Troubleshooting.md` | Common issues, error messages, debugging steps |
Create additional topic-specific files as needed (e.g., `Protocol/Parser.md`, `Subscriptions/SubList.md`).
### Generation Process
1. **Identify scope** — which component folder does this belong to? (See Component Map below)
2. **Read source code** — understand the current implementation, identify key classes/methods/patterns, note configuration options
3. **Check existing documentation** — avoid duplication, cross-reference rather than repeat
4. **Write documentation** — follow the style guide, use real code snippets
5. **Verify accuracy** — confirm snippets match source, verify file paths and class names, test commands
### Creating New Component Folders
1. Create the folder under `Documentation/`
2. Add at minimum `Overview.md`
3. Add other standard files as content warrants
4. Update the Component Map section below
5. Add cross-references from related documentation
---
## Updating Documentation
### Update Triggers
| Code Change | Update These Docs |
|-------------|-------------------|
| New protocol command | `Protocol/` relevant file |
| Parser modified | `Protocol/Parser.md` |
| Subject matching changed | `Subscriptions/SubjectMatch.md` |
| SubList trie modified | `Subscriptions/SubList.md` |
| New subscription type | `Subscriptions/Overview.md` |
| NatsServer changed | `Server/Overview.md` |
| NatsClient changed | `Server/Client.md` |
| Config option added/removed | Component's `Configuration.md` |
| NatsOptions changed | `Configuration/Overview.md` |
| Host startup changed | `Operations/Deployment.md` + `Configuration/` |
| New test patterns | Corresponding component docs |
| Auth mechanism added | `Authentication/` (create if needed) |
| Clustering added | `Clustering/` (create if needed) |
### Update Process
1. **Identify affected documentation** — use the Component Map to determine which docs need updating
2. **Read current documentation** — understand existing structure before making changes
3. **Make targeted updates** — only modify sections affected by the code change; don't rewrite unaffected sections
4. **Update code snippets** — if the code change affects documented examples, update them to match
5. **Update cross-references** — add links to newly related docs, remove links to deleted content
6. **Add verification comment** — at the bottom: `<!-- Last verified against codebase: YYYY-MM-DD -->`
### Deletion Handling
- When code is removed, remove corresponding doc sections
- When code is renamed, update all references (docs, snippets, cross-reference links)
- If an entire feature is removed, delete the doc file and update any index/overview docs
- Search all docs for links to removed content
### What Not to Update
- Don't reformat documentation that wasn't affected by the code change
- Don't update examples that still work correctly
- Don't add new content unrelated to the code change
- Don't change writing style in unaffected sections
---
## Component Map
### Source to Documentation Mapping
| Source Path | Documentation Folder |
|-------------|---------------------|
| `src/NATS.Server/Protocol/NatsParser.cs` | `Protocol/` |
| `src/NATS.Server/Protocol/NatsProtocol.cs` | `Protocol/` |
| `src/NATS.Server/Subscriptions/SubList.cs` | `Subscriptions/` |
| `src/NATS.Server/Subscriptions/SubjectMatch.cs` | `Subscriptions/` |
| `src/NATS.Server/Subscriptions/Subscription.cs` | `Subscriptions/` |
| `src/NATS.Server/Subscriptions/SubListResult.cs` | `Subscriptions/` |
| `src/NATS.Server/NatsServer.cs` | `Server/` |
| `src/NATS.Server/NatsClient.cs` | `Server/` |
| `src/NATS.Server/NatsOptions.cs` | `Configuration/` |
| `src/NATS.Server.Host/Program.cs` | `Operations/` and `Configuration/` |
| `tests/NATS.Server.Tests/` | Document in corresponding component |
| `golang/nats-server/server/` | Reference material (not documented separately) |
### Component Details
#### Protocol/
Documents the wire protocol and parser.
**Source paths:**
- `src/NATS.Server/Protocol/NatsParser.cs` — state machine parser
- `src/NATS.Server/Protocol/NatsProtocol.cs` — constants, ServerInfo, ClientOptions
**Typical files:**
- `Overview.md` — NATS protocol format, command types, wire format
- `Parser.md` — Parser implementation, `TryParse` flow, state machine
- `Commands.md` — Individual command formats (PUB, SUB, UNSUB, MSG, etc.)
#### Subscriptions/
Documents subject matching and the subscription trie.
**Source paths:**
- `src/NATS.Server/Subscriptions/SubList.cs` — trie + cache
- `src/NATS.Server/Subscriptions/SubjectMatch.cs` — validation and wildcard matching
- `src/NATS.Server/Subscriptions/Subscription.cs` — subscription model
- `src/NATS.Server/Subscriptions/SubListResult.cs` — match result container
**Typical files:**
- `Overview.md` — Subject namespace, wildcard rules, queue groups
- `SubList.md` — Trie internals, cache invalidation, thread safety
- `SubjectMatch.md` — Validation rules, wildcard matching algorithm
#### Server/
Documents the server orchestrator and client connection handler.
**Source paths:**
- `src/NATS.Server/NatsServer.cs` — accept loop, message routing
- `src/NATS.Server/NatsClient.cs` — per-connection read/write, subscription tracking
**Typical files:**
- `Overview.md` — Server architecture, connection lifecycle, message flow
- `Client.md` — Client connection handling, command dispatch, write serialization
- `MessageRouting.md` — How messages flow from PUB to subscribers
#### Configuration/
Documents server configuration options.
**Source paths:**
- `src/NATS.Server/NatsOptions.cs` — configuration model
- `src/NATS.Server.Host/Program.cs` — CLI argument parsing, Serilog setup
**Typical files:**
- `Overview.md` — All options with defaults and descriptions
- `Logging.md` — Serilog configuration, log levels, LogContext usage
#### Operations/
Documents deployment and operational concerns.
**Source paths:**
- `src/NATS.Server.Host/` — host application
**Typical files:**
- `Overview.md` — Running the server, CLI arguments
- `Deployment.md` — Deployment procedures
- `Troubleshooting.md` — Common issues and debugging
#### GettingStarted/
Documents onboarding and project overview.
**Typical files:**
- `Setup.md` — Prerequisites, building, running
- `Architecture.md` — System overview, Go reference mapping
- `Development.md` — Development workflow, testing, contributing
### Ambiguous Cases
| Code Type | Document In |
|-----------|-------------|
| Logging setup | `Configuration/Logging.md` |
| Integration tests | `Operations/Testing.md` or corresponding component |
| Shared interfaces (`IMessageRouter`, `ISubListAccess`) | `Server/Overview.md` |
| Go reference code | Don't document separately; reference in `.NET` component docs |
### Adding New Components
When a new module is ported (Authentication, Clustering, JetStream, etc.):
1. Create a new folder under `Documentation/`
2. Add at minimum `Overview.md`
3. Add this mapping table entry
4. Update CLAUDE.md documentation index if it has one
5. Cross-reference from related component docs

golang/nats-server Submodule

Submodule golang/nats-server added at 66e9bbc7f8

reports/generate-report.sh Executable file
@@ -0,0 +1,34 @@
#!/bin/bash
# Generate porting tracker reports
# Writes reports/current.md (always) and reports/report_{commit_id}.md (snapshot)
set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"
DB_PATH="$REPO_ROOT/porting.db"
TRACKER_PROJECT="$REPO_ROOT/tools/NatsNet.PortTracker"
CURRENT_REPORT="$SCRIPT_DIR/current.md"

# Check if DB exists
if [ ! -f "$DB_PATH" ]; then
    echo "Warning: $DB_PATH not found, skipping report generation"
    exit 0
fi

# Generate current.md
dotnet run --project "$TRACKER_PROJECT" -- report export \
    --format md \
    --output "$CURRENT_REPORT" \
    --db "$DB_PATH" 2>/dev/null || {
    echo "Warning: report generation failed, skipping"
    exit 0
}

# Generate commit-specific snapshot
COMMIT_ID=$(git -C "$REPO_ROOT" rev-parse --short HEAD 2>/dev/null || echo "unknown")
COMMIT_REPORT="$SCRIPT_DIR/report_${COMMIT_ID}.md"
cp "$CURRENT_REPORT" "$COMMIT_REPORT"

echo "Reports generated: current.md, report_${COMMIT_ID}.md"


@@ -0,0 +1,146 @@
using System.CommandLine;
using NatsNet.PortTracker.Data;
namespace NatsNet.PortTracker.Commands;
public static class DependencyCommands
{
public static Command Create(Option<string> dbOption, Option<string> schemaOption)
{
var depCommand = new Command("dependency", "Manage dependencies");
// show
var showType = new Argument<string>("type") { Description = "Item type (module, feature, unit_test)" };
var showId = new Argument<int>("id") { Description = "Item ID" };
var showCmd = new Command("show", "Show dependencies for an item");
showCmd.Add(showType);
showCmd.Add(showId);
showCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var type = parseResult.GetValue(showType)!;
var id = parseResult.GetValue(showId);
using var db = new Database(dbPath);
var deps = db.Query(
"SELECT target_type, target_id, dependency_kind FROM dependencies WHERE source_type = @type AND source_id = @id",
("@type", type), ("@id", id));
Console.WriteLine($"Dependencies of {type} #{id} ({deps.Count}):");
foreach (var d in deps)
Console.WriteLine($" -> {d["target_type"]} #{d["target_id"]} [{d["dependency_kind"]}]");
var rdeps = db.Query(
"SELECT source_type, source_id, dependency_kind FROM dependencies WHERE target_type = @type AND target_id = @id",
("@type", type), ("@id", id));
Console.WriteLine($"\nReverse dependencies (depends on {type} #{id}) ({rdeps.Count}):");
foreach (var d in rdeps)
Console.WriteLine($" <- {d["source_type"]} #{d["source_id"]} [{d["dependency_kind"]}]");
});
// blocked
var blockedCmd = new Command("blocked", "Show items blocked by unported dependencies");
blockedCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
using var db = new Database(dbPath);
var sql = @"
SELECT 'module' as item_type, m.id, m.name, m.status, d.target_type, d.target_id, d.dependency_kind
FROM modules m
JOIN dependencies d ON d.source_type = 'module' AND d.source_id = m.id
WHERE m.status NOT IN ('complete', 'verified', 'n_a')
AND (
(d.target_type = 'module' AND d.target_id IN (SELECT id FROM modules WHERE status NOT IN ('complete', 'verified', 'n_a')))
OR (d.target_type = 'feature' AND d.target_id IN (SELECT id FROM features WHERE status NOT IN ('complete', 'verified', 'n_a')))
OR (d.target_type = 'unit_test' AND d.target_id IN (SELECT id FROM unit_tests WHERE status NOT IN ('complete', 'verified', 'n_a')))
)
UNION ALL
SELECT 'feature' as item_type, f.id, f.name, f.status, d.target_type, d.target_id, d.dependency_kind
FROM features f
JOIN dependencies d ON d.source_type = 'feature' AND d.source_id = f.id
WHERE f.status NOT IN ('complete', 'verified', 'n_a')
AND (
(d.target_type = 'module' AND d.target_id IN (SELECT id FROM modules WHERE status NOT IN ('complete', 'verified', 'n_a')))
OR (d.target_type = 'feature' AND d.target_id IN (SELECT id FROM features WHERE status NOT IN ('complete', 'verified', 'n_a')))
OR (d.target_type = 'unit_test' AND d.target_id IN (SELECT id FROM unit_tests WHERE status NOT IN ('complete', 'verified', 'n_a')))
)
ORDER BY 1, 2";
var rows = db.Query(sql);
if (rows.Count == 0)
{
Console.WriteLine("No blocked items found.");
return;
}
Console.WriteLine($"{"Type",-10} {"ID",-5} {"Name",-30} {"Status",-15} {"Blocked By",-15} {"Dep ID",-8} {"Kind",-10}");
Console.WriteLine(new string('-', 93));
foreach (var row in rows)
{
Console.WriteLine($"{row["item_type"],-10} {row["id"],-5} {Truncate(row["name"]?.ToString(), 29),-30} {row["status"],-15} {row["target_type"],-15} {row["target_id"],-8} {row["dependency_kind"],-10}");
}
Console.WriteLine($"\n{rows.Count} blocking relationships found.");
});
// ready
var readyCmd = new Command("ready", "Show items ready to port (no unported dependencies)");
readyCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
using var db = new Database(dbPath);
var sql = @"
SELECT 'module' as item_type, m.id, m.name, m.status
FROM modules m
WHERE m.status IN ('not_started', 'stub')
AND NOT EXISTS (
SELECT 1 FROM dependencies d
WHERE d.source_type = 'module' AND d.source_id = m.id
AND (
(d.target_type = 'module' AND d.target_id IN (SELECT id FROM modules WHERE status NOT IN ('complete', 'verified', 'n_a')))
OR (d.target_type = 'feature' AND d.target_id IN (SELECT id FROM features WHERE status NOT IN ('complete', 'verified', 'n_a')))
OR (d.target_type = 'unit_test' AND d.target_id IN (SELECT id FROM unit_tests WHERE status NOT IN ('complete', 'verified', 'n_a')))
)
)
UNION ALL
SELECT 'feature' as item_type, f.id, f.name, f.status
FROM features f
WHERE f.status IN ('not_started', 'stub')
AND NOT EXISTS (
SELECT 1 FROM dependencies d
WHERE d.source_type = 'feature' AND d.source_id = f.id
AND (
(d.target_type = 'module' AND d.target_id IN (SELECT id FROM modules WHERE status NOT IN ('complete', 'verified', 'n_a')))
OR (d.target_type = 'feature' AND d.target_id IN (SELECT id FROM features WHERE status NOT IN ('complete', 'verified', 'n_a')))
OR (d.target_type = 'unit_test' AND d.target_id IN (SELECT id FROM unit_tests WHERE status NOT IN ('complete', 'verified', 'n_a')))
)
)
ORDER BY 1, 2";
var rows = db.Query(sql);
if (rows.Count == 0)
{
Console.WriteLine("No items are ready to port (all items either have unported deps or are already done).");
return;
}
Console.WriteLine($"{"Type",-10} {"ID",-5} {"Name",-40} {"Status",-15}");
Console.WriteLine(new string('-', 70));
foreach (var row in rows)
{
Console.WriteLine($"{row["item_type"],-10} {row["id"],-5} {Truncate(row["name"]?.ToString(), 39),-40} {row["status"],-15}");
}
Console.WriteLine($"\n{rows.Count} items ready to port.");
});
depCommand.Add(showCmd);
depCommand.Add(blockedCmd);
depCommand.Add(readyCmd);
return depCommand;
}
private static string Truncate(string? s, int maxLen)
{
if (s is null) return "";
return s.Length <= maxLen ? s : s[..(maxLen - 2)] + "..";
}
}


@@ -0,0 +1,183 @@
using System.CommandLine;
using NatsNet.PortTracker.Data;
namespace NatsNet.PortTracker.Commands;
public static class FeatureCommands
{
public static Command Create(Option<string> dbOption, Option<string> schemaOption)
{
var featureCommand = new Command("feature", "Manage features");
// list
var listModule = new Option<int?>("--module") { Description = "Filter by module ID" };
var listStatus = new Option<string?>("--status") { Description = "Filter by status" };
var listCmd = new Command("list", "List features");
listCmd.Add(listModule);
listCmd.Add(listStatus);
listCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var moduleId = parseResult.GetValue(listModule);
var status = parseResult.GetValue(listStatus);
using var db = new Database(dbPath);
var sql = "SELECT f.id, f.name, f.status, f.module_id, m.name as module_name, f.go_method, f.dotnet_method FROM features f LEFT JOIN modules m ON f.module_id = m.id";
var parameters = new List<(string, object?)>();
var clauses = new List<string>();
if (moduleId is not null)
{
clauses.Add("f.module_id = @module");
parameters.Add(("@module", moduleId));
}
if (status is not null)
{
clauses.Add("f.status = @status");
parameters.Add(("@status", status));
}
if (clauses.Count > 0)
sql += " WHERE " + string.Join(" AND ", clauses);
sql += " ORDER BY m.name, f.name";
var rows = db.Query(sql, parameters.ToArray());
Console.WriteLine($"{"ID",-5} {"Name",-30} {"Status",-15} {"Module",-20} {"Go Method",-25} {"DotNet Method",-25}");
Console.WriteLine(new string('-', 120));
foreach (var row in rows)
{
Console.WriteLine($"{row["id"],-5} {Truncate(row["name"]?.ToString(), 29),-30} {row["status"],-15} {Truncate(row["module_name"]?.ToString(), 19),-20} {Truncate(row["go_method"]?.ToString(), 24),-25} {Truncate(row["dotnet_method"]?.ToString(), 24),-25}");
}
Console.WriteLine($"\nTotal: {rows.Count} features");
});
// show
var showId = new Argument<int>("id") { Description = "Feature ID" };
var showCmd = new Command("show", "Show feature details");
showCmd.Add(showId);
showCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(showId);
using var db = new Database(dbPath);
var features = db.Query(
"SELECT f.*, m.name as module_name FROM features f LEFT JOIN modules m ON f.module_id = m.id WHERE f.id = @id",
("@id", id));
if (features.Count == 0)
{
Console.WriteLine($"Feature {id} not found.");
return;
}
var f = features[0];
Console.WriteLine($"Feature #{f["id"]}: {f["name"]}");
Console.WriteLine($" Module: #{f["module_id"]} ({f["module_name"]})");
Console.WriteLine($" Status: {f["status"]}");
Console.WriteLine($" Go File: {f["go_file"]}");
Console.WriteLine($" Go Class: {f["go_class"]}");
Console.WriteLine($" Go Method: {f["go_method"]}");
Console.WriteLine($" Go Line: {f["go_line_number"]}");
Console.WriteLine($" Go LOC: {f["go_line_count"]}");
Console.WriteLine($" .NET: {f["dotnet_project"]} / {f["dotnet_class"]} / {f["dotnet_method"]}");
Console.WriteLine($" Notes: {f["notes"]}");
var deps = db.Query(
"SELECT d.target_type, d.target_id, d.dependency_kind FROM dependencies d WHERE d.source_type = 'feature' AND d.source_id = @id",
("@id", id));
Console.WriteLine($"\n Dependencies ({deps.Count}):");
foreach (var d in deps)
Console.WriteLine($" -> {d["target_type"]} #{d["target_id"]} [{d["dependency_kind"]}]");
var rdeps = db.Query(
"SELECT d.source_type, d.source_id, d.dependency_kind FROM dependencies d WHERE d.target_type = 'feature' AND d.target_id = @id",
("@id", id));
Console.WriteLine($"\n Reverse Dependencies ({rdeps.Count}):");
foreach (var d in rdeps)
Console.WriteLine($" <- {d["source_type"]} #{d["source_id"]} [{d["dependency_kind"]}]");
});
// update
var updateId = new Argument<int>("id") { Description = "Feature ID (use 0 with --all-in-module)" };
var updateStatus = new Option<string>("--status") { Description = "New status", Required = true };
var updateAllInModule = new Option<int?>("--all-in-module") { Description = "Update all features in this module ID" };
var updateCmd = new Command("update", "Update feature status");
updateCmd.Add(updateId);
updateCmd.Add(updateStatus);
updateCmd.Add(updateAllInModule);
updateCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(updateId);
var status = parseResult.GetValue(updateStatus)!;
var allInModule = parseResult.GetValue(updateAllInModule);
using var db = new Database(dbPath);
if (allInModule is not null)
{
var affected = db.Execute(
"UPDATE features SET status = @status WHERE module_id = @module",
("@status", status), ("@module", allInModule));
Console.WriteLine($"Updated {affected} features in module {allInModule} to '{status}'.");
}
else
{
var affected = db.Execute(
"UPDATE features SET status = @status WHERE id = @id",
("@status", status), ("@id", id));
Console.WriteLine(affected > 0 ? $"Feature {id} updated to '{status}'." : $"Feature {id} not found.");
}
});
// map
var mapId = new Argument<int>("id") { Description = "Feature ID" };
var mapProject = new Option<string?>("--project") { Description = "Target .NET project" };
var mapClass = new Option<string?>("--class") { Description = "Target .NET class" };
var mapMethod = new Option<string?>("--method") { Description = "Target .NET method" };
var mapCmd = new Command("map", "Map feature to .NET method");
mapCmd.Add(mapId);
mapCmd.Add(mapProject);
mapCmd.Add(mapClass);
mapCmd.Add(mapMethod);
mapCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(mapId);
var project = parseResult.GetValue(mapProject);
var cls = parseResult.GetValue(mapClass);
var method = parseResult.GetValue(mapMethod);
using var db = new Database(dbPath);
var affected = db.Execute(
"UPDATE features SET dotnet_project = COALESCE(@project, dotnet_project), dotnet_class = COALESCE(@cls, dotnet_class), dotnet_method = COALESCE(@method, dotnet_method) WHERE id = @id",
("@project", project), ("@cls", cls), ("@method", method), ("@id", id));
Console.WriteLine(affected > 0 ? $"Feature {id} mapped." : $"Feature {id} not found.");
});
// set-na
var naId = new Argument<int>("id") { Description = "Feature ID" };
var naReason = new Option<string>("--reason") { Description = "Reason for N/A", Required = true };
var naCmd = new Command("set-na", "Mark feature as N/A");
naCmd.Add(naId);
naCmd.Add(naReason);
naCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(naId);
var reason = parseResult.GetValue(naReason)!;
using var db = new Database(dbPath);
var affected = db.Execute(
"UPDATE features SET status = 'n_a', notes = @reason WHERE id = @id",
("@reason", reason), ("@id", id));
Console.WriteLine(affected > 0 ? $"Feature {id} set to N/A: {reason}" : $"Feature {id} not found.");
});
featureCommand.Add(listCmd);
featureCommand.Add(showCmd);
featureCommand.Add(updateCmd);
featureCommand.Add(mapCmd);
featureCommand.Add(naCmd);
return featureCommand;
}
private static string Truncate(string? s, int maxLen)
{
if (s is null) return "";
return s.Length <= maxLen ? s : s[..(maxLen - 2)] + "..";
}
}

View File

@@ -0,0 +1,98 @@
using System.CommandLine;
using NatsNet.PortTracker.Data;
namespace NatsNet.PortTracker.Commands;
public static class LibraryCommands
{
public static Command Create(Option<string> dbOption, Option<string> schemaOption)
{
var libraryCommand = new Command("library", "Manage library mappings");
// list
var listStatus = new Option<string?>("--status") { Description = "Filter by status" };
var listCmd = new Command("list", "List library mappings");
listCmd.Add(listStatus);
listCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var status = parseResult.GetValue(listStatus);
using var db = new Database(dbPath);
var sql = "SELECT id, go_import_path, go_library_name, dotnet_package, dotnet_namespace, status FROM library_mappings";
var parameters = new List<(string, object?)>();
if (status is not null)
{
sql += " WHERE status = @status";
parameters.Add(("@status", status));
}
sql += " ORDER BY go_import_path";
var rows = db.Query(sql, parameters.ToArray());
Console.WriteLine($"{"ID",-5} {"Go Import Path",-40} {"Go Library",-20} {"DotNet Package",-25} {"DotNet Namespace",-25} {"Status",-12}");
Console.WriteLine(new string('-', 127));
foreach (var row in rows)
{
Console.WriteLine($"{row["id"],-5} {Truncate(row["go_import_path"]?.ToString(), 39),-40} {Truncate(row["go_library_name"]?.ToString(), 19),-20} {Truncate(row["dotnet_package"]?.ToString(), 24),-25} {Truncate(row["dotnet_namespace"]?.ToString(), 24),-25} {row["status"],-12}");
}
Console.WriteLine($"\nTotal: {rows.Count} library mappings");
});
// map
var mapId = new Argument<int>("id") { Description = "Library mapping ID" };
var mapPackage = new Option<string?>("--package") { Description = ".NET NuGet package" };
var mapNamespace = new Option<string?>("--namespace") { Description = ".NET namespace" };
var mapNotes = new Option<string?>("--notes") { Description = "Usage notes" };
var mapCmd = new Command("map", "Map Go library to .NET package");
mapCmd.Add(mapId);
mapCmd.Add(mapPackage);
mapCmd.Add(mapNamespace);
mapCmd.Add(mapNotes);
mapCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(mapId);
var package = parseResult.GetValue(mapPackage);
var ns = parseResult.GetValue(mapNamespace);
var notes = parseResult.GetValue(mapNotes);
using var db = new Database(dbPath);
var affected = db.Execute(
"UPDATE library_mappings SET dotnet_package = COALESCE(@package, dotnet_package), dotnet_namespace = COALESCE(@ns, dotnet_namespace), dotnet_usage_notes = COALESCE(@notes, dotnet_usage_notes), status = 'mapped' WHERE id = @id",
("@package", package), ("@ns", ns), ("@notes", notes), ("@id", id));
Console.WriteLine(affected > 0 ? $"Library {id} mapped." : $"Library {id} not found.");
});
// suggest
var suggestCmd = new Command("suggest", "Show unmapped libraries");
suggestCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
using var db = new Database(dbPath);
var rows = db.Query(
"SELECT id, go_import_path, go_library_name, go_usage_description FROM library_mappings WHERE status = 'not_mapped' ORDER BY go_import_path");
if (rows.Count == 0)
{
Console.WriteLine("All libraries have been mapped!");
return;
}
Console.WriteLine($"{"ID",-5} {"Go Import Path",-45} {"Library",-20} {"Usage",-40}");
Console.WriteLine(new string('-', 110));
foreach (var row in rows)
{
Console.WriteLine($"{row["id"],-5} {Truncate(row["go_import_path"]?.ToString(), 44),-45} {Truncate(row["go_library_name"]?.ToString(), 19),-20} {Truncate(row["go_usage_description"]?.ToString(), 39),-40}");
}
Console.WriteLine($"\n{rows.Count} unmapped libraries need attention.");
});
libraryCommand.Add(listCmd);
libraryCommand.Add(mapCmd);
libraryCommand.Add(suggestCmd);
return libraryCommand;
}
private static string Truncate(string? s, int maxLen)
{
if (s is null) return "";
return s.Length <= maxLen ? s : s[..(maxLen - 2)] + "..";
}
}

View File

@@ -0,0 +1,153 @@
using System.CommandLine;
using NatsNet.PortTracker.Data;
namespace NatsNet.PortTracker.Commands;
public static class ModuleCommands
{
public static Command Create(Option<string> dbOption, Option<string> schemaOption)
{
var moduleCommand = new Command("module", "Manage modules");
// list
var listStatus = new Option<string?>("--status") { Description = "Filter by status" };
var listCmd = new Command("list", "List modules");
listCmd.Add(listStatus);
listCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var status = parseResult.GetValue(listStatus);
using var db = new Database(dbPath);
var sql = "SELECT id, name, status, go_package, go_line_count, dotnet_project, dotnet_class FROM modules";
var parameters = new List<(string, object?)>();
if (status is not null)
{
sql += " WHERE status = @status";
parameters.Add(("@status", status));
}
sql += " ORDER BY name";
var rows = db.Query(sql, parameters.ToArray());
Console.WriteLine($"{"ID",-5} {"Name",-25} {"Status",-15} {"Go Pkg",-15} {"LOC",-8} {"DotNet Project",-25} {"DotNet Class",-20}");
Console.WriteLine(new string('-', 113));
foreach (var row in rows)
{
Console.WriteLine($"{row["id"],-5} {row["name"],-25} {row["status"],-15} {row["go_package"],-15} {row["go_line_count"],-8} {row["dotnet_project"] ?? "",-25} {row["dotnet_class"] ?? "",-20}");
}
Console.WriteLine($"\nTotal: {rows.Count} modules");
});
// show
var showId = new Argument<int>("id") { Description = "Module ID" };
var showCmd = new Command("show", "Show module details");
showCmd.Add(showId);
showCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(showId);
using var db = new Database(dbPath);
var modules = db.Query("SELECT * FROM modules WHERE id = @id", ("@id", id));
if (modules.Count == 0)
{
Console.WriteLine($"Module {id} not found.");
return;
}
var mod = modules[0];
Console.WriteLine($"Module #{mod["id"]}: {mod["name"]}");
Console.WriteLine($" Status: {mod["status"]}");
Console.WriteLine($" Go Package: {mod["go_package"]}");
Console.WriteLine($" Go File: {mod["go_file"]}");
Console.WriteLine($" Go LOC: {mod["go_line_count"]}");
Console.WriteLine($" .NET: {mod["dotnet_project"]} / {mod["dotnet_namespace"]} / {mod["dotnet_class"]}");
Console.WriteLine($" Notes: {mod["notes"]}");
var features = db.Query(
"SELECT id, name, status, go_method, dotnet_method FROM features WHERE module_id = @id ORDER BY name",
("@id", id));
Console.WriteLine($"\n Features ({features.Count}):");
foreach (var f in features)
Console.WriteLine($" #{f["id"],-5} {f["name"],-35} {f["status"],-15} {f["dotnet_method"] ?? ""}");
var tests = db.Query(
"SELECT id, name, status, dotnet_method FROM unit_tests WHERE module_id = @id ORDER BY name",
("@id", id));
Console.WriteLine($"\n Tests ({tests.Count}):");
foreach (var t in tests)
Console.WriteLine($" #{t["id"],-5} {t["name"],-35} {t["status"],-15} {t["dotnet_method"] ?? ""}");
var deps = db.Query(
"SELECT d.target_type, d.target_id, d.dependency_kind, m.name as target_name FROM dependencies d LEFT JOIN modules m ON d.target_type = 'module' AND d.target_id = m.id WHERE d.source_type = 'module' AND d.source_id = @id",
("@id", id));
Console.WriteLine($"\n Dependencies ({deps.Count}):");
foreach (var d in deps)
Console.WriteLine($" -> {d["target_type"]} #{d["target_id"]} ({d["target_name"]}) [{d["dependency_kind"]}]");
});
// update
var updateId = new Argument<int>("id") { Description = "Module ID" };
var updateStatus = new Option<string>("--status") { Description = "New status", Required = true };
var updateCmd = new Command("update", "Update module status");
updateCmd.Add(updateId);
updateCmd.Add(updateStatus);
updateCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(updateId);
var status = parseResult.GetValue(updateStatus)!;
using var db = new Database(dbPath);
var affected = db.Execute("UPDATE modules SET status = @status WHERE id = @id",
("@status", status), ("@id", id));
Console.WriteLine(affected > 0 ? $"Module {id} updated to '{status}'." : $"Module {id} not found.");
});
// map
var mapId = new Argument<int>("id") { Description = "Module ID" };
var mapProject = new Option<string>("--project") { Description = "Target .NET project", Required = true };
var mapNamespace = new Option<string?>("--namespace") { Description = "Target namespace" };
var mapClass = new Option<string?>("--class") { Description = "Target class" };
var mapCmd = new Command("map", "Map module to .NET project");
mapCmd.Add(mapId);
mapCmd.Add(mapProject);
mapCmd.Add(mapNamespace);
mapCmd.Add(mapClass);
mapCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(mapId);
var project = parseResult.GetValue(mapProject)!;
var ns = parseResult.GetValue(mapNamespace);
var cls = parseResult.GetValue(mapClass);
using var db = new Database(dbPath);
var affected = db.Execute(
"UPDATE modules SET dotnet_project = @project, dotnet_namespace = @ns, dotnet_class = @cls WHERE id = @id",
("@project", project), ("@ns", ns), ("@cls", cls), ("@id", id));
Console.WriteLine(affected > 0 ? $"Module {id} mapped to {project}." : $"Module {id} not found.");
});
// set-na
var naId = new Argument<int>("id") { Description = "Module ID" };
var naReason = new Option<string>("--reason") { Description = "Reason for N/A", Required = true };
var naCmd = new Command("set-na", "Mark module as N/A");
naCmd.Add(naId);
naCmd.Add(naReason);
naCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(naId);
var reason = parseResult.GetValue(naReason)!;
using var db = new Database(dbPath);
var affected = db.Execute(
"UPDATE modules SET status = 'n_a', notes = @reason WHERE id = @id",
("@reason", reason), ("@id", id));
Console.WriteLine(affected > 0 ? $"Module {id} set to N/A: {reason}" : $"Module {id} not found.");
});
moduleCommand.Add(listCmd);
moduleCommand.Add(showCmd);
moduleCommand.Add(updateCmd);
moduleCommand.Add(mapCmd);
moduleCommand.Add(naCmd);
return moduleCommand;
}
}

View File

@@ -0,0 +1,212 @@
using System.CommandLine;
using NatsNet.PortTracker.Data;
namespace NatsNet.PortTracker.Commands;
public static class PhaseCommands
{
private static readonly (int Number, string Name, string Description)[] Phases =
[
(1, "Analysis & Schema", "Run Go AST analyzer, populate DB schema, map libraries"),
(2, "Core Infrastructure", "Port foundational modules (logging, errors, options)"),
(3, "Message Layer", "Port message parsing, headers, protocol handling"),
(4, "Connection Layer", "Port connection management, reconnection logic"),
(5, "Client API", "Port publish, subscribe, request-reply"),
(6, "Advanced Features", "Port JetStream, KV, Object Store, Services"),
(7, "Testing & Verification", "Port and verify all unit tests, integration tests"),
];
public static Command Create(Option<string> dbOption, Option<string> schemaOption)
{
var phaseCommand = new Command("phase", "Manage porting phases");
// list
var listCmd = new Command("list", "List all phases with status");
listCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
using var db = new Database(dbPath);
Console.WriteLine($"{"Phase",-7} {"Name",-25} {"Description",-55} {"Status",-12}");
Console.WriteLine(new string('-', 99));
foreach (var (number, name, description) in Phases)
{
var status = CalculatePhaseStatus(db, number);
Console.WriteLine($"{number,-7} {name,-25} {description,-55} {status,-12}");
}
});
// check
var checkPhase = new Argument<int>("phase") { Description = "Phase number (1-7)" };
var checkCmd = new Command("check", "Check phase completion status");
checkCmd.Add(checkPhase);
checkCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var phase = parseResult.GetValue(checkPhase);
if (phase < 1 || phase > 7)
{
Console.WriteLine("Phase must be between 1 and 7.");
return;
}
// Validate before opening the database so an invalid phase number
// does not create the file as a side effect (ReadWriteCreate mode).
using var db = new Database(dbPath);
var (_, name, description) = Phases[phase - 1];
Console.WriteLine($"Phase {phase}: {name}");
Console.WriteLine($" {description}\n");
RunPhaseCheck(db, phase);
});
phaseCommand.Add(listCmd);
phaseCommand.Add(checkCmd);
return phaseCommand;
}
private static string CalculatePhaseStatus(Database db, int phase)
{
return phase switch
{
1 => CalculatePhase1Status(db),
2 => CalculateModulePhaseStatus(db, "Core Infrastructure"),
3 => CalculateModulePhaseStatus(db, "Message Layer"),
4 => CalculateModulePhaseStatus(db, "Connection Layer"),
5 => CalculateModulePhaseStatus(db, "Client API"),
6 => CalculateModulePhaseStatus(db, "Advanced Features"),
7 => CalculatePhase7Status(db),
_ => "unknown"
};
}
private static string CalculatePhase1Status(Database db)
{
var totalModules = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules");
var totalLibraries = db.ExecuteScalar<long>("SELECT COUNT(*) FROM library_mappings");
if (totalModules == 0 && totalLibraries == 0) return "not_started";
var mappedLibraries = db.ExecuteScalar<long>("SELECT COUNT(*) FROM library_mappings WHERE status != 'not_mapped'");
if (totalModules > 0 && (totalLibraries == 0 || mappedLibraries == totalLibraries)) return "complete";
return "in_progress";
}
private static string CalculateModulePhaseStatus(Database db, string phaseDescription)
{
// phaseDescription is currently unused: modules are not yet tagged by phase,
// so phases 2-6 all report status based on overall module completion.
_ = phaseDescription;
var total = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules");
if (total == 0) return "not_started";
var done = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules WHERE status IN ('complete', 'verified', 'n_a')");
if (done == total) return "complete";
var started = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules WHERE status != 'not_started'");
return started > 0 ? "in_progress" : "not_started";
}
private static string CalculatePhase7Status(Database db)
{
var totalTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests");
if (totalTests == 0) return "not_started";
var doneTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE status IN ('complete', 'verified', 'n_a')");
if (doneTests == totalTests) return "complete";
var startedTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE status != 'not_started'");
return startedTests > 0 ? "in_progress" : "not_started";
}
private static void RunPhaseCheck(Database db, int phase)
{
switch (phase)
{
case 1:
CheckPhase1(db);
break;
case 2:
case 3:
case 4:
case 5:
case 6:
CheckModulePhase(db, phase);
break;
case 7:
CheckPhase7(db);
break;
}
}
private static void CheckPhase1(Database db)
{
var totalModules = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules");
var totalFeatures = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features");
var totalTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests");
var totalLibs = db.ExecuteScalar<long>("SELECT COUNT(*) FROM library_mappings");
var mappedLibs = db.ExecuteScalar<long>("SELECT COUNT(*) FROM library_mappings WHERE status != 'not_mapped'");
var totalDeps = db.ExecuteScalar<long>("SELECT COUNT(*) FROM dependencies");
Console.WriteLine("Phase 1 Checklist:");
Console.WriteLine($" [{(totalModules > 0 ? "x" : " ")}] Modules populated: {totalModules}");
Console.WriteLine($" [{(totalFeatures > 0 ? "x" : " ")}] Features populated: {totalFeatures}");
Console.WriteLine($" [{(totalTests > 0 ? "x" : " ")}] Unit tests populated: {totalTests}");
Console.WriteLine($" [{(totalDeps > 0 ? "x" : " ")}] Dependencies mapped: {totalDeps}");
Console.WriteLine($" [{(totalLibs > 0 ? "x" : " ")}] Libraries identified: {totalLibs}");
Console.WriteLine($" [{(totalLibs > 0 && mappedLibs == totalLibs ? "x" : " ")}] All libraries mapped: {mappedLibs}/{totalLibs}");
}
private static void CheckModulePhase(Database db, int phase)
{
var totalModules = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules");
var doneModules = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules WHERE status IN ('complete', 'verified', 'n_a')");
var stubModules = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules WHERE status = 'stub'");
var totalFeatures = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features");
var doneFeatures = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE status IN ('complete', 'verified', 'n_a')");
var stubFeatures = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE status = 'stub'");
var modPct = totalModules > 0 ? (double)doneModules / totalModules * 100 : 0;
var featPct = totalFeatures > 0 ? (double)doneFeatures / totalFeatures * 100 : 0;
Console.WriteLine($"Phase {phase} Progress:");
Console.WriteLine($" Modules: {doneModules}/{totalModules} complete ({modPct:F1}%), {stubModules} stubs");
Console.WriteLine($" Features: {doneFeatures}/{totalFeatures} complete ({featPct:F1}%), {stubFeatures} stubs");
// Show incomplete modules
var incomplete = db.Query(
"SELECT id, name, status FROM modules WHERE status NOT IN ('complete', 'verified', 'n_a') ORDER BY name");
if (incomplete.Count > 0)
{
Console.WriteLine($"\n Incomplete modules ({incomplete.Count}):");
foreach (var m in incomplete)
Console.WriteLine($" #{m["id"],-5} {m["name"],-30} {m["status"]}");
}
}
private static void CheckPhase7(Database db)
{
var totalTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests");
var doneTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE status IN ('complete', 'verified', 'n_a')");
var stubTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE status = 'stub'");
var verifiedTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE status = 'verified'");
var naTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE status = 'n_a'");
var pct = totalTests > 0 ? (double)doneTests / totalTests * 100 : 0;
Console.WriteLine("Phase 7 Progress:");
Console.WriteLine($" Total tests: {totalTests}");
Console.WriteLine($" Complete: {doneTests - verifiedTests - naTests}");
Console.WriteLine($" Verified: {verifiedTests}");
Console.WriteLine($" N/A: {naTests}");
Console.WriteLine($" Stubs: {stubTests}");
// doneTests covers complete/verified/n_a; the remainder may be not_started or in progress.
Console.WriteLine($" Remaining: {totalTests - doneTests - stubTests}");
Console.WriteLine($" Progress: {pct:F1}%");
// Modules with incomplete tests
var modulesWithIncomplete = db.Query(@"
SELECT m.id, m.name, COUNT(*) as total,
SUM(CASE WHEN t.status IN ('complete', 'verified', 'n_a') THEN 1 ELSE 0 END) as done
FROM unit_tests t
JOIN modules m ON t.module_id = m.id
GROUP BY m.id, m.name
HAVING done < total
ORDER BY m.name");
if (modulesWithIncomplete.Count > 0)
{
Console.WriteLine($"\n Modules with incomplete tests ({modulesWithIncomplete.Count}):");
foreach (var m in modulesWithIncomplete)
Console.WriteLine($" #{m["id"],-5} {m["name"],-30} {m["done"]}/{m["total"]}");
}
}
}

View File

@@ -0,0 +1,58 @@
using System.CommandLine;
using NatsNet.PortTracker.Data;
using NatsNet.PortTracker.Reporting;
namespace NatsNet.PortTracker.Commands;
public static class ReportCommands
{
public static Command Create(Option<string> dbOption, Option<string> schemaOption)
{
var reportCommand = new Command("report", "Generate reports");
// summary
var summaryCmd = new Command("summary", "Show status summary");
summaryCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
using var db = new Database(dbPath);
ReportGenerator.PrintSummary(db);
});
// export
var exportFormat = new Option<string>("--format") { Description = "Export format (md)", DefaultValueFactory = _ => "md" };
var exportOutput = new Option<string?>("--output") { Description = "Output file path (stdout if not specified)" };
var exportCmd = new Command("export", "Export status report");
exportCmd.Add(exportFormat);
exportCmd.Add(exportOutput);
exportCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var format = parseResult.GetValue(exportFormat)!;
var output = parseResult.GetValue(exportOutput);
using var db = new Database(dbPath);
if (!string.Equals(format, "md", StringComparison.OrdinalIgnoreCase))
{
Console.WriteLine($"Unsupported format: {format}. Supported: md");
return;
}
var markdown = ReportGenerator.ExportMarkdown(db);
if (output is not null)
{
File.WriteAllText(output, markdown);
Console.WriteLine($"Report exported to {output}");
}
else
{
Console.Write(markdown);
}
});
reportCommand.Add(summaryCmd);
reportCommand.Add(exportCmd);
return reportCommand;
}
}

View File

@@ -0,0 +1,143 @@
using System.CommandLine;
using NatsNet.PortTracker.Data;
namespace NatsNet.PortTracker.Commands;
public static class TestCommands
{
public static Command Create(Option<string> dbOption, Option<string> schemaOption)
{
var testCommand = new Command("test", "Manage unit tests");
// list
var listModule = new Option<int?>("--module") { Description = "Filter by module ID" };
var listStatus = new Option<string?>("--status") { Description = "Filter by status" };
var listCmd = new Command("list", "List unit tests");
listCmd.Add(listModule);
listCmd.Add(listStatus);
listCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var moduleId = parseResult.GetValue(listModule);
var status = parseResult.GetValue(listStatus);
using var db = new Database(dbPath);
var sql = "SELECT t.id, t.name, t.status, t.module_id, m.name as module_name, t.go_method, t.dotnet_method FROM unit_tests t LEFT JOIN modules m ON t.module_id = m.id";
var parameters = new List<(string, object?)>();
var clauses = new List<string>();
if (moduleId is not null)
{
clauses.Add("t.module_id = @module");
parameters.Add(("@module", moduleId));
}
if (status is not null)
{
clauses.Add("t.status = @status");
parameters.Add(("@status", status));
}
if (clauses.Count > 0)
sql += " WHERE " + string.Join(" AND ", clauses);
sql += " ORDER BY m.name, t.name";
var rows = db.Query(sql, parameters.ToArray());
Console.WriteLine($"{"ID",-5} {"Name",-30} {"Status",-15} {"Module",-20} {"Go Method",-25} {"DotNet Method",-25}");
Console.WriteLine(new string('-', 120));
foreach (var row in rows)
{
Console.WriteLine($"{row["id"],-5} {Truncate(row["name"]?.ToString(), 29),-30} {row["status"],-15} {Truncate(row["module_name"]?.ToString(), 19),-20} {Truncate(row["go_method"]?.ToString(), 24),-25} {Truncate(row["dotnet_method"]?.ToString(), 24),-25}");
}
Console.WriteLine($"\nTotal: {rows.Count} tests");
});
// show
var showId = new Argument<int>("id") { Description = "Test ID" };
var showCmd = new Command("show", "Show test details");
showCmd.Add(showId);
showCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(showId);
using var db = new Database(dbPath);
var tests = db.Query(
"SELECT t.*, m.name as module_name FROM unit_tests t LEFT JOIN modules m ON t.module_id = m.id WHERE t.id = @id",
("@id", id));
if (tests.Count == 0)
{
Console.WriteLine($"Test {id} not found.");
return;
}
var t = tests[0];
Console.WriteLine($"Test #{t["id"]}: {t["name"]}");
Console.WriteLine($" Module: #{t["module_id"]} ({t["module_name"]})");
Console.WriteLine($" Feature: {(t["feature_id"] is not null ? $"#{t["feature_id"]}" : "(none)")}");
Console.WriteLine($" Status: {t["status"]}");
Console.WriteLine($" Go File: {t["go_file"]}");
Console.WriteLine($" Go Class: {t["go_class"]}");
Console.WriteLine($" Go Method: {t["go_method"]}");
Console.WriteLine($" Go Line: {t["go_line_number"]}");
Console.WriteLine($" Go LOC: {t["go_line_count"]}");
Console.WriteLine($" .NET: {t["dotnet_project"]} / {t["dotnet_class"]} / {t["dotnet_method"]}");
Console.WriteLine($" Notes: {t["notes"]}");
var deps = db.Query(
"SELECT d.target_type, d.target_id, d.dependency_kind FROM dependencies d WHERE d.source_type = 'unit_test' AND d.source_id = @id",
("@id", id));
Console.WriteLine($"\n Dependencies ({deps.Count}):");
foreach (var d in deps)
Console.WriteLine($" -> {d["target_type"]} #{d["target_id"]} [{d["dependency_kind"]}]");
});
// update
var updateId = new Argument<int>("id") { Description = "Test ID" };
var updateStatus = new Option<string>("--status") { Description = "New status", Required = true };
var updateCmd = new Command("update", "Update test status");
updateCmd.Add(updateId);
updateCmd.Add(updateStatus);
updateCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(updateId);
var status = parseResult.GetValue(updateStatus)!;
using var db = new Database(dbPath);
var affected = db.Execute("UPDATE unit_tests SET status = @status WHERE id = @id",
("@status", status), ("@id", id));
Console.WriteLine(affected > 0 ? $"Test {id} updated to '{status}'." : $"Test {id} not found.");
});
// map
var mapId = new Argument<int>("id") { Description = "Test ID" };
var mapProject = new Option<string?>("--project") { Description = "Target .NET project" };
var mapClass = new Option<string?>("--class") { Description = "Target .NET test class" };
var mapMethod = new Option<string?>("--method") { Description = "Target .NET test method" };
var mapCmd = new Command("map", "Map test to .NET test method");
mapCmd.Add(mapId);
mapCmd.Add(mapProject);
mapCmd.Add(mapClass);
mapCmd.Add(mapMethod);
mapCmd.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var id = parseResult.GetValue(mapId);
var project = parseResult.GetValue(mapProject);
var cls = parseResult.GetValue(mapClass);
var method = parseResult.GetValue(mapMethod);
using var db = new Database(dbPath);
var affected = db.Execute(
"UPDATE unit_tests SET dotnet_project = COALESCE(@project, dotnet_project), dotnet_class = COALESCE(@cls, dotnet_class), dotnet_method = COALESCE(@method, dotnet_method) WHERE id = @id",
("@project", project), ("@cls", cls), ("@method", method), ("@id", id));
Console.WriteLine(affected > 0 ? $"Test {id} mapped." : $"Test {id} not found.");
});
testCommand.Add(listCmd);
testCommand.Add(showCmd);
testCommand.Add(updateCmd);
testCommand.Add(mapCmd);
return testCommand;
}
private static string Truncate(string? s, int maxLen)
{
if (s is null) return "";
return s.Length <= maxLen ? s : s[..(maxLen - 2)] + "..";
}
}

View File

@@ -0,0 +1,77 @@
using Microsoft.Data.Sqlite;
namespace NatsNet.PortTracker.Data;
public sealed class Database : IDisposable
{
private readonly SqliteConnection _connection;
public Database(string dbPath)
{
var connectionString = new SqliteConnectionStringBuilder
{
DataSource = dbPath,
Mode = SqliteOpenMode.ReadWriteCreate,
ForeignKeys = true
}.ToString();
_connection = new SqliteConnection(connectionString);
_connection.Open();
// Enable write-ahead logging so readers do not block the writer.
using var cmd = _connection.CreateCommand();
cmd.CommandText = "PRAGMA journal_mode=WAL;";
cmd.ExecuteNonQuery();
}
public SqliteConnection Connection => _connection;
public SqliteCommand CreateCommand(string sql)
{
var cmd = _connection.CreateCommand();
cmd.CommandText = sql;
return cmd;
}
public int Execute(string sql, params (string name, object? value)[] parameters)
{
using var cmd = CreateCommand(sql);
foreach (var (name, value) in parameters)
cmd.Parameters.AddWithValue(name, value ?? DBNull.Value);
return cmd.ExecuteNonQuery();
}
public T? ExecuteScalar<T>(string sql, params (string name, object? value)[] parameters)
{
using var cmd = CreateCommand(sql);
foreach (var (name, value) in parameters)
cmd.Parameters.AddWithValue(name, value ?? DBNull.Value);
var result = cmd.ExecuteScalar();
if (result is null or DBNull) return default;
return (T)Convert.ChangeType(result, typeof(T));
}
public List<Dictionary<string, object?>> Query(string sql, params (string name, object? value)[] parameters)
{
using var cmd = CreateCommand(sql);
foreach (var (name, value) in parameters)
cmd.Parameters.AddWithValue(name, value ?? DBNull.Value);
var results = new List<Dictionary<string, object?>>();
using var reader = cmd.ExecuteReader();
while (reader.Read())
{
var row = new Dictionary<string, object?>();
for (int i = 0; i < reader.FieldCount; i++)
{
row[reader.GetName(i)] = reader.IsDBNull(i) ? null : reader.GetValue(i);
}
results.Add(row);
}
return results;
}
public void Dispose()
{
_connection.Dispose();
}
}

View File

@@ -0,0 +1,10 @@
namespace NatsNet.PortTracker.Data;
public static class Schema
{
public static void Initialize(Database db, string schemaPath)
{
// Microsoft.Data.Sqlite executes multi-statement batches in a single command,
// so the entire schema file can be run at once.
var sql = File.ReadAllText(schemaPath);
db.Execute(sql);
}
}

View File

@@ -1,2 +1,44 @@
-// See https://aka.ms/new-console-template for more information
-Console.WriteLine("Hello, World!");
using System.CommandLine;
using NatsNet.PortTracker.Commands;
using NatsNet.PortTracker.Data;
var dbOption = new Option<string>("--db")
{
Description = "Path to the SQLite database file",
DefaultValueFactory = _ => Path.Combine(Directory.GetCurrentDirectory(), "porting.db"),
Recursive = true
};
var schemaOption = new Option<string>("--schema")
{
Description = "Path to the SQL schema file",
DefaultValueFactory = _ => Path.Combine(Directory.GetCurrentDirectory(), "porting-schema.sql"),
Recursive = true
};
var rootCommand = new RootCommand("NATS .NET Porting Tracker");
rootCommand.Add(dbOption);
rootCommand.Add(schemaOption);
// init command
var initCommand = new Command("init", "Create or reset the database schema");
initCommand.SetAction(parseResult =>
{
var dbPath = parseResult.GetValue(dbOption)!;
var schemaPath = parseResult.GetValue(schemaOption)!;
using var db = new Database(dbPath);
Schema.Initialize(db, schemaPath);
Console.WriteLine($"Database initialized at {dbPath}");
});
rootCommand.Add(initCommand);
rootCommand.Add(ModuleCommands.Create(dbOption, schemaOption));
rootCommand.Add(FeatureCommands.Create(dbOption, schemaOption));
rootCommand.Add(TestCommands.Create(dbOption, schemaOption));
rootCommand.Add(LibraryCommands.Create(dbOption, schemaOption));
rootCommand.Add(DependencyCommands.Create(dbOption, schemaOption));
rootCommand.Add(ReportCommands.Create(dbOption, schemaOption));
rootCommand.Add(PhaseCommands.Create(dbOption, schemaOption));
var parseResult = rootCommand.Parse(args);
return await parseResult.InvokeAsync();

View File

@@ -0,0 +1,95 @@
using NatsNet.PortTracker.Data;
namespace NatsNet.PortTracker.Reporting;
public static class ReportGenerator
{
public static void PrintSummary(Database db)
{
Console.WriteLine("=== Porting Status Summary ===\n");
PrintTableSummary(db, "modules", "Modules");
PrintTableSummary(db, "features", "Features");
PrintTableSummary(db, "unit_tests", "Unit Tests");
PrintLibrarySummary(db);
// Overall progress
var totalItems = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules") +
db.ExecuteScalar<long>("SELECT COUNT(*) FROM features") +
db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests");
var doneItems = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules WHERE status IN ('complete', 'verified', 'n_a')") +
db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE status IN ('complete', 'verified', 'n_a')") +
db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE status IN ('complete', 'verified', 'n_a')");
var pct = totalItems > 0 ? (double)doneItems / totalItems * 100 : 0;
Console.WriteLine($"\nOverall Progress: {doneItems}/{totalItems} ({pct:F1}%)");
}
public static string ExportMarkdown(Database db)
{
var sb = new System.Text.StringBuilder();
sb.AppendLine("# NATS .NET Porting Status Report");
sb.AppendLine($"\nGenerated: {DateTime.UtcNow:yyyy-MM-dd HH:mm:ss} UTC\n");
AppendTableMarkdown(sb, db, "modules", "Modules");
AppendTableMarkdown(sb, db, "features", "Features");
AppendTableMarkdown(sb, db, "unit_tests", "Unit Tests");
AppendLibraryMarkdown(sb, db);
// Overall
var totalItems = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules") +
db.ExecuteScalar<long>("SELECT COUNT(*) FROM features") +
db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests");
var doneItems = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules WHERE status IN ('complete', 'verified', 'n_a')") +
db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE status IN ('complete', 'verified', 'n_a')") +
db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE status IN ('complete', 'verified', 'n_a')");
var pct = totalItems > 0 ? (double)doneItems / totalItems * 100 : 0;
sb.AppendLine($"\n## Overall Progress\n");
sb.AppendLine($"**{doneItems}/{totalItems} items complete ({pct:F1}%)**");
return sb.ToString();
}
private static void PrintTableSummary(Database db, string table, string label)
{
var rows = db.Query($"SELECT status, COUNT(*) as cnt FROM {table} GROUP BY status ORDER BY status");
var total = rows.Sum(r => Convert.ToInt64(r["cnt"]));
Console.WriteLine($"{label} ({total} total):");
foreach (var row in rows)
Console.WriteLine($" {row["status"],-15} {row["cnt"],5}");
Console.WriteLine();
}
private static void PrintLibrarySummary(Database db) =>
PrintTableSummary(db, "library_mappings", "Library Mappings");
private static void AppendTableMarkdown(System.Text.StringBuilder sb, Database db, string table, string label)
{
var rows = db.Query($"SELECT status, COUNT(*) as cnt FROM {table} GROUP BY status ORDER BY status");
var total = rows.Sum(r => Convert.ToInt64(r["cnt"]));
sb.AppendLine($"## {label} ({total} total)\n");
sb.AppendLine("| Status | Count |");
sb.AppendLine("|--------|-------|");
foreach (var row in rows)
sb.AppendLine($"| {row["status"]} | {row["cnt"]} |");
sb.AppendLine();
}
private static void AppendLibraryMarkdown(System.Text.StringBuilder sb, Database db) =>
AppendTableMarkdown(sb, db, "library_mappings", "Library Mappings");
}


@@ -0,0 +1,344 @@
package main
import (
"fmt"
"go/ast"
"go/parser"
"go/token"
"os"
"path/filepath"
"sort"
"strings"
)
// Analyzer parses Go source code and extracts structural information.
type Analyzer struct {
sourceDir string
fset *token.FileSet
}
// NewAnalyzer creates a new Analyzer for the given source directory.
func NewAnalyzer(sourceDir string) *Analyzer {
return &Analyzer{
sourceDir: sourceDir,
fset: token.NewFileSet(),
}
}
// Analyze runs the full analysis pipeline.
func (a *Analyzer) Analyze() (*AnalysisResult, error) {
serverDir := filepath.Join(a.sourceDir, "server")
// 1. Discover all Go files grouped by directory
fileGroups, err := a.discoverFiles(serverDir)
if err != nil {
return nil, fmt.Errorf("discovering files: %w", err)
}
// 2. Parse each group into modules
result := &AnalysisResult{}
allImports := make(map[string]*ImportInfo)
for dir, files := range fileGroups {
module, imports, err := a.parseModule(dir, files)
if err != nil {
return nil, fmt.Errorf("parsing module %s: %w", dir, err)
}
result.Modules = append(result.Modules, *module)
for _, imp := range imports {
imp := imp // copy before taking the address (avoids aliasing the loop variable on Go <1.22)
if existing, ok := allImports[imp.ImportPath]; ok {
existing.UsedInFiles = append(existing.UsedInFiles, imp.UsedInFiles...)
} else {
allImports[imp.ImportPath] = &imp
}
}
}
// 3. Build module-level dependencies from import analysis
result.Dependencies = a.buildDependencies(result.Modules)
// 4. Collect imports
for _, imp := range allImports {
result.Imports = append(result.Imports, *imp)
}
sort.Slice(result.Imports, func(i, j int) bool {
return result.Imports[i].ImportPath < result.Imports[j].ImportPath
})
// Sort modules by name
sort.Slice(result.Modules, func(i, j int) bool {
return result.Modules[i].Name < result.Modules[j].Name
})
return result, nil
}
// discoverFiles walks the source tree and groups .go files by directory.
func (a *Analyzer) discoverFiles(root string) (map[string][]string, error) {
groups := make(map[string][]string)
err := filepath.Walk(root, func(path string, info os.FileInfo, err error) error {
if err != nil {
return err
}
if info.IsDir() {
if info.Name() == "configs" || info.Name() == "testdata" {
return filepath.SkipDir
}
return nil
}
if !strings.HasSuffix(info.Name(), ".go") {
return nil
}
dir := filepath.Dir(path)
groups[dir] = append(groups[dir], path)
return nil
})
return groups, err
}
// parseModule parses all Go files in a directory into a Module.
func (a *Analyzer) parseModule(dir string, files []string) (*Module, []ImportInfo, error) {
moduleName := a.moduleNameFromDir(dir)
module := &Module{
Name: moduleName,
GoPackage: moduleName,
GoFile: dir,
}
var sourceFiles []string
var testFiles []string
for _, f := range files {
if strings.HasSuffix(f, "_test.go") {
testFiles = append(testFiles, f)
} else {
sourceFiles = append(sourceFiles, f)
}
}
var allImports []ImportInfo
totalLines := 0
for _, f := range sourceFiles {
features, imports, lines, err := a.parseSourceFile(f)
if err != nil {
fmt.Fprintf(os.Stderr, "Warning: skipping %s: %v\n", f, err)
continue
}
module.Features = append(module.Features, features...)
allImports = append(allImports, imports...)
totalLines += lines
}
for _, f := range testFiles {
tests, _, lines, err := a.parseTestFile(f)
if err != nil {
fmt.Fprintf(os.Stderr, "Warning: skipping test %s: %v\n", f, err)
continue
}
module.Tests = append(module.Tests, tests...)
totalLines += lines
}
module.GoLineCount = totalLines
return module, allImports, nil
}
// parseSourceFile extracts functions, methods, and imports from a Go source file.
func (a *Analyzer) parseSourceFile(filePath string) ([]Feature, []ImportInfo, int, error) {
src, err := os.ReadFile(filePath)
if err != nil {
return nil, nil, 0, err
}
file, err := parser.ParseFile(a.fset, filePath, src, parser.ParseComments)
if err != nil {
return nil, nil, 0, err
}
lines := strings.Count(string(src), "\n") + 1
relPath := a.relPath(filePath)
var features []Feature
var imports []ImportInfo
for _, imp := range file.Imports {
path := strings.Trim(imp.Path.Value, "\"")
imports = append(imports, ImportInfo{
ImportPath: path,
IsStdlib: isStdlib(path),
UsedInFiles: []string{relPath},
})
}
for _, decl := range file.Decls {
fn, ok := decl.(*ast.FuncDecl)
if !ok {
continue
}
feature := Feature{
Name: fn.Name.Name,
GoFile: relPath,
GoMethod: fn.Name.Name,
GoLineNumber: a.fset.Position(fn.Pos()).Line,
}
startLine := a.fset.Position(fn.Pos()).Line
endLine := a.fset.Position(fn.End()).Line
feature.GoLineCount = endLine - startLine + 1
if fn.Recv != nil && len(fn.Recv.List) > 0 {
feature.GoClass = a.receiverTypeName(fn.Recv.List[0].Type)
feature.Name = feature.GoClass + "." + fn.Name.Name
}
if fn.Doc != nil {
feature.Description = strings.TrimSpace(fn.Doc.Text())
}
features = append(features, feature)
}
return features, imports, lines, nil
}
// parseTestFile extracts test functions from a Go test file.
func (a *Analyzer) parseTestFile(filePath string) ([]TestFunc, []ImportInfo, int, error) {
src, err := os.ReadFile(filePath)
if err != nil {
return nil, nil, 0, err
}
file, err := parser.ParseFile(a.fset, filePath, src, parser.ParseComments)
if err != nil {
return nil, nil, 0, err
}
lines := strings.Count(string(src), "\n") + 1
relPath := a.relPath(filePath)
var tests []TestFunc
var imports []ImportInfo
for _, imp := range file.Imports {
path := strings.Trim(imp.Path.Value, "\"")
imports = append(imports, ImportInfo{
ImportPath: path,
IsStdlib: isStdlib(path),
UsedInFiles: []string{relPath},
})
}
for _, decl := range file.Decls {
fn, ok := decl.(*ast.FuncDecl)
if !ok {
continue
}
name := fn.Name.Name
if !strings.HasPrefix(name, "Test") && !strings.HasPrefix(name, "Benchmark") {
continue
}
startLine := a.fset.Position(fn.Pos()).Line
endLine := a.fset.Position(fn.End()).Line
test := TestFunc{
Name: name,
GoFile: relPath,
GoMethod: name,
GoLineNumber: startLine,
GoLineCount: endLine - startLine + 1,
}
if fn.Doc != nil {
test.Description = strings.TrimSpace(fn.Doc.Text())
}
test.FeatureName = a.inferFeatureName(name)
tests = append(tests, test)
}
return tests, imports, lines, nil
}
// buildDependencies creates module-level dependencies. The current rule is a
// coarse heuristic: the central "server" module is assumed to call every other
// module. A real import-graph walk could refine this later.
func (a *Analyzer) buildDependencies(modules []Module) []Dependency {
var deps []Dependency
for _, m := range modules {
if m.Name != "server" && m.GoPackage != "server" {
deps = append(deps, Dependency{
SourceModule: "server",
TargetModule: m.Name,
DependencyKind: "calls",
})
}
}
return deps
}
// moduleNameFromDir converts a directory path to a module name (its base name).
func (a *Analyzer) moduleNameFromDir(dir string) string {
return filepath.Base(dir)
}
// relPath returns a path relative to the analyzer's source directory.
func (a *Analyzer) relPath(absPath string) string {
rel, err := filepath.Rel(a.sourceDir, absPath)
if err != nil {
return absPath
}
return rel
}
// receiverTypeName extracts the type name from a method receiver.
func (a *Analyzer) receiverTypeName(expr ast.Expr) string {
switch t := expr.(type) {
case *ast.StarExpr:
return a.receiverTypeName(t.X)
case *ast.Ident:
return t.Name
default:
return ""
}
}
// inferFeatureName attempts to derive a feature name from a test name.
func (a *Analyzer) inferFeatureName(testName string) string {
name := testName
for _, prefix := range []string{"Test", "Benchmark"} {
if strings.HasPrefix(name, prefix) {
name = strings.TrimPrefix(name, prefix)
break
}
}
if name == "" {
return ""
}
if idx := strings.Index(name, "_"); idx > 0 {
name = name[:idx] + "." + name[idx+1:]
}
return name
}
// isStdlib reports whether an import path belongs to the Go standard library.
// Heuristic: the first path element of a stdlib import never contains a dot,
// while third-party module paths start with a host name such as github.com.
func isStdlib(importPath string) bool {
first, _, _ := strings.Cut(importPath, "/")
return !strings.Contains(first, ".")
}
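The two naming heuristics above (`isStdlib` and `inferFeatureName`) are easy to sanity-check in isolation. A self-contained sketch, with both functions copied from the analyzer so the snippet runs on its own:

```go
package main

import (
	"fmt"
	"strings"
)

// isStdlib mirrors the analyzer's heuristic: stdlib import paths have no
// dot in their first path element, module paths start with a host name.
func isStdlib(importPath string) bool {
	first, _, _ := strings.Cut(importPath, "/")
	return !strings.Contains(first, ".")
}

// inferFeatureName mirrors the test-name convention: strip the
// Test/Benchmark prefix and turn the first underscore into a dot.
func inferFeatureName(testName string) string {
	name := testName
	for _, prefix := range []string{"Test", "Benchmark"} {
		if strings.HasPrefix(name, prefix) {
			name = strings.TrimPrefix(name, prefix)
			break
		}
	}
	if name == "" {
		return ""
	}
	if idx := strings.Index(name, "_"); idx > 0 {
		name = name[:idx] + "." + name[idx+1:]
	}
	return name
}

func main() {
	fmt.Println(isStdlib("crypto/tls"))                  // true
	fmt.Println(isStdlib("github.com/mattn/go-sqlite3")) // false
	fmt.Println(inferFeatureName("TestSublist_Match"))   // Sublist.Match
	fmt.Println(inferFeatureName("BenchmarkParse"))      // Parse
}
```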


@@ -1,3 +1,5 @@
module github.com/natsnet/go-analyzer
go 1.25.5
require github.com/mattn/go-sqlite3 v1.14.34

tools/go-analyzer/go.sum

@@ -0,0 +1,2 @@
github.com/mattn/go-sqlite3 v1.14.34 h1:3NtcvcUnFBPsuRcno8pUtupspG/GM+9nZ88zgJcp6Zk=
github.com/mattn/go-sqlite3 v1.14.34/go.mod h1:Uh1q+B4BYcTPb+yiD3kU8Ct7aC0hY9fxUwlHK0RXw+Y=


@@ -0,0 +1,113 @@
package main
import (
"path/filepath"
"sort"
"strings"
)
// ModuleGrouper groups Go source files into logical modules.
type ModuleGrouper struct {
Prefixes map[string]string
}
// DefaultGrouper creates a grouper with default prefix rules for nats-server.
func DefaultGrouper() *ModuleGrouper {
return &ModuleGrouper{
Prefixes: map[string]string{
"jetstream": "jetstream",
"consumer": "jetstream",
"stream": "jetstream",
"store": "jetstream",
"filestore": "jetstream",
"memstore": "jetstream",
"raft": "raft",
"gateway": "gateway",
"leafnode": "leafnode",
"route": "route",
"client": "client",
"client_proxyproto": "client",
"server": "core",
"service": "core",
"signal": "core",
"reload": "core",
"opts": "config",
"auth": "auth",
"auth_callout": "auth",
"jwt": "auth",
"nkey": "auth",
"accounts": "accounts",
"ocsp": "tls",
"ocsp_peer": "tls",
"ocsp_responsecache": "tls",
"ciphersuites": "tls",
"parser": "protocol",
"proto": "protocol",
"sublist": "subscriptions",
"subject_transform": "subscriptions",
"monitor": "monitoring",
"monitor_sort_opts": "monitoring",
"mqtt": "mqtt",
"websocket": "websocket",
"events": "events",
"msgtrace": "events",
"log": "logging",
"errors": "errors",
"errors_gen": "errors",
"const": "core",
"util": "core",
"ring": "core",
"sendq": "core",
"ipqueue": "core",
"rate_counter": "core",
"scheduler": "core",
"sdm": "core",
"dirstore": "core",
"disk_avail": "core",
"elastic": "core",
},
}
}
// GroupFiles takes a flat list of Go files and returns them grouped by module name.
func (g *ModuleGrouper) GroupFiles(files []string) map[string][]string {
groups := make(map[string][]string)
for _, f := range files {
base := filepath.Base(f)
base = strings.TrimSuffix(base, ".go")
base = strings.TrimSuffix(base, "_test")
for _, suffix := range []string{"_windows", "_linux", "_darwin", "_bsd",
"_solaris", "_wasm", "_netbsd", "_openbsd", "_dragonfly", "_zos", "_other"} {
base = strings.TrimSuffix(base, suffix)
}
module := g.classify(base)
groups[module] = append(groups[module], f)
}
for k := range groups {
sort.Strings(groups[k])
}
return groups
}
// classify determines which module a file belongs to based on its base name.
func (g *ModuleGrouper) classify(baseName string) string {
if module, ok := g.Prefixes[baseName]; ok {
return module
}
bestMatch := ""
bestModule := "core"
for prefix, module := range g.Prefixes {
if strings.HasPrefix(baseName, prefix) && len(prefix) > len(bestMatch) {
bestMatch = prefix
bestModule = module
}
}
return bestModule
}
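The longest-prefix rule in `classify` is worth a quick illustration: an exact key wins, otherwise the longest matching prefix, otherwise the `core` fallback. A self-contained sketch using a trimmed-down stand-in for the real prefix table (the three-entry map is illustrative only):

```go
package main

import (
	"fmt"
	"strings"
)

// classify mirrors the grouper's rule with an explicit prefix table:
// exact match first, then longest prefix, then the "core" default.
func classify(prefixes map[string]string, baseName string) string {
	if module, ok := prefixes[baseName]; ok {
		return module
	}
	bestMatch := ""
	bestModule := "core"
	for prefix, module := range prefixes {
		if strings.HasPrefix(baseName, prefix) && len(prefix) > len(bestMatch) {
			bestMatch = prefix
			bestModule = module
		}
	}
	return bestModule
}

func main() {
	prefixes := map[string]string{
		"jetstream": "jetstream",
		"auth":      "auth",
		"mqtt":      "mqtt",
	}
	fmt.Println(classify(prefixes, "jetstream_cluster")) // jetstream (prefix match)
	fmt.Println(classify(prefixes, "auth"))              // auth (exact match)
	fmt.Println(classify(prefixes, "ring"))              // core (default)
}
```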

tools/go-analyzer/sqlite.go

@@ -0,0 +1,154 @@
package main
import (
"database/sql"
"fmt"
"os"
_ "github.com/mattn/go-sqlite3"
)
// OpenDB opens or creates the SQLite database and applies the schema.
func OpenDB(dbPath, schemaPath string) (*sql.DB, error) {
db, err := sql.Open("sqlite3", dbPath+"?_journal_mode=WAL&_foreign_keys=ON")
if err != nil {
return nil, fmt.Errorf("opening database: %w", err)
}
schema, err := os.ReadFile(schemaPath)
if err != nil {
return nil, fmt.Errorf("reading schema: %w", err)
}
if _, err := db.Exec(string(schema)); err != nil {
return nil, fmt.Errorf("applying schema: %w", err)
}
return db, nil
}
// DBWriter writes analysis results to the SQLite database.
type DBWriter struct {
db *sql.DB
}
// NewDBWriter creates a new DBWriter.
func NewDBWriter(db *sql.DB) *DBWriter {
return &DBWriter{db: db}
}
// WriteAll writes all analysis results to the database in a single transaction.
func (w *DBWriter) WriteAll(result *AnalysisResult) error {
tx, err := w.db.Begin()
if err != nil {
return fmt.Errorf("beginning transaction: %w", err)
}
defer tx.Rollback()
moduleIDs := make(map[string]int64)
featureIDs := make(map[string]int64)
for _, mod := range result.Modules {
modID, err := w.insertModule(tx, &mod)
if err != nil {
return fmt.Errorf("inserting module %s: %w", mod.Name, err)
}
moduleIDs[mod.Name] = modID
for _, feat := range mod.Features {
featID, err := w.insertFeature(tx, modID, &feat)
if err != nil {
return fmt.Errorf("inserting feature %s: %w", feat.Name, err)
}
featureIDs[mod.Name+":"+feat.Name] = featID
}
for _, test := range mod.Tests {
var featureID *int64
if test.FeatureName != "" {
if fid, ok := featureIDs[mod.Name+":"+test.FeatureName]; ok {
featureID = &fid
}
}
if err := w.insertTest(tx, modID, featureID, &test); err != nil {
return fmt.Errorf("inserting test %s: %w", test.Name, err)
}
}
}
for _, dep := range result.Dependencies {
sourceID, ok := moduleIDs[dep.SourceModule]
if !ok {
continue
}
targetID, ok := moduleIDs[dep.TargetModule]
if !ok {
continue
}
if err := w.insertDependency(tx, "module", sourceID, "module", targetID, dep.DependencyKind); err != nil {
return fmt.Errorf("inserting dependency %s->%s: %w", dep.SourceModule, dep.TargetModule, err)
}
}
for _, imp := range result.Imports {
if imp.IsStdlib {
continue
}
if err := w.insertLibrary(tx, &imp); err != nil {
return fmt.Errorf("inserting library %s: %w", imp.ImportPath, err)
}
}
return tx.Commit()
}
func (w *DBWriter) insertModule(tx *sql.Tx, mod *Module) (int64, error) {
res, err := tx.Exec(
`INSERT INTO modules (name, description, go_package, go_file, go_line_count, status)
VALUES (?, ?, ?, ?, ?, 'not_started')`,
mod.Name, mod.Description, mod.GoPackage, mod.GoFile, mod.GoLineCount,
)
if err != nil {
return 0, err
}
return res.LastInsertId()
}
func (w *DBWriter) insertFeature(tx *sql.Tx, moduleID int64, feat *Feature) (int64, error) {
res, err := tx.Exec(
`INSERT INTO features (module_id, name, description, go_file, go_class, go_method, go_line_number, go_line_count, status)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, 'not_started')`,
moduleID, feat.Name, feat.Description, feat.GoFile, feat.GoClass, feat.GoMethod, feat.GoLineNumber, feat.GoLineCount,
)
if err != nil {
return 0, err
}
return res.LastInsertId()
}
func (w *DBWriter) insertTest(tx *sql.Tx, moduleID int64, featureID *int64, test *TestFunc) error {
_, err := tx.Exec(
`INSERT INTO unit_tests (module_id, feature_id, name, description, go_file, go_class, go_method, go_line_number, go_line_count, status)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, 'not_started')`,
moduleID, featureID, test.Name, test.Description, test.GoFile, test.GoClass, test.GoMethod, test.GoLineNumber, test.GoLineCount,
)
return err
}
func (w *DBWriter) insertDependency(tx *sql.Tx, srcType string, srcID int64, tgtType string, tgtID int64, kind string) error {
_, err := tx.Exec(
`INSERT OR IGNORE INTO dependencies (source_type, source_id, target_type, target_id, dependency_kind)
VALUES (?, ?, ?, ?, ?)`,
srcType, srcID, tgtType, tgtID, kind,
)
return err
}
func (w *DBWriter) insertLibrary(tx *sql.Tx, imp *ImportInfo) error {
_, err := tx.Exec(
`INSERT OR IGNORE INTO library_mappings (go_import_path, go_library_name, status)
VALUES (?, ?, 'not_mapped')`,
imp.ImportPath, imp.ImportPath,
)
return err
}
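`WriteAll` links a test to its feature only when the `module:feature` key resolves in `featureIDs`; otherwise it passes a nil `*int64`, which `database/sql` binds as SQL NULL in `feature_id`. A self-contained sketch of that lookup (the `resolveFeatureID` helper and the sample IDs are hypothetical, for illustration only):

```go
package main

import "fmt"

// resolveFeatureID mirrors WriteAll's linking step: return a pointer to the
// feature's row ID when the "module:feature" key exists, nil otherwise.
// database/sql stores a nil *int64 parameter as SQL NULL.
func resolveFeatureID(featureIDs map[string]int64, module, feature string) *int64 {
	if feature == "" {
		return nil
	}
	if id, ok := featureIDs[module+":"+feature]; ok {
		return &id
	}
	return nil
}

func main() {
	ids := map[string]int64{"subscriptions:Sublist.Match": 42}
	if p := resolveFeatureID(ids, "subscriptions", "Sublist.Match"); p != nil {
		fmt.Println("linked to feature", *p) // linked to feature 42
	}
	if resolveFeatureID(ids, "subscriptions", "Unknown") == nil {
		fmt.Println("unlinked test -> NULL feature_id")
	}
}
```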


@@ -0,0 +1,77 @@
package main
// AnalysisResult holds all extracted data from Go source analysis.
type AnalysisResult struct {
Modules []Module
Dependencies []Dependency
Imports []ImportInfo
}
// TotalFeatures returns the count of all features across all modules.
func (r *AnalysisResult) TotalFeatures() int {
count := 0
for _, m := range r.Modules {
count += len(m.Features)
}
return count
}
// TotalTests returns the count of all tests across all modules.
func (r *AnalysisResult) TotalTests() int {
count := 0
for _, m := range r.Modules {
count += len(m.Tests)
}
return count
}
// Module represents a logical grouping of Go source files.
type Module struct {
Name string
Description string
GoPackage string
GoFile string // primary file or directory
GoLineCount int
Features []Feature
Tests []TestFunc
}
// Feature represents a function or method extracted from Go source.
type Feature struct {
Name string
Description string
GoFile string
GoClass string // receiver type, empty for package-level functions
GoMethod string
GoLineNumber int
GoLineCount int
}
// TestFunc represents a test function extracted from Go source.
type TestFunc struct {
Name string
Description string
GoFile string
GoClass string
GoMethod string
GoLineNumber int
GoLineCount int
// FeatureName links this test to a feature by naming convention
FeatureName string
}
// Dependency represents a call relationship between two items.
type Dependency struct {
SourceModule string
SourceFeature string // empty for module-level deps
TargetModule string
TargetFeature string // empty for module-level deps
DependencyKind string // "calls"
}
// ImportInfo represents a Go import path found in source files.
type ImportInfo struct {
ImportPath string
IsStdlib bool
UsedInFiles []string
}