Detailed step-by-step plan covering: SQLite schema, Go AST analyzer (5 files), .NET PortTracker CLI (8 command groups), and 7 phase instruction documents. Includes native task tracking and persistence.
# Porting Tracker Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use superpowers-extended-cc:executing-plans to implement this plan task-by-task.

**Goal:** Build the SQLite database, Go AST analyzer, .NET PortTracker CLI, and 7 phase instruction guides for tracking the NATS server Go-to-.NET port.

**Architecture:** Two tools + one DB. A Go program parses Go source via AST and populates SQLite. A .NET 10 CLI app manages the DB for all subsequent phases. Seven markdown guides provide step-by-step instructions.

**Tech Stack:** Go 1.25 (go/ast, go/parser, golang.org/x/tools/go/callgraph, mattn/go-sqlite3), .NET 10 (System.CommandLine, Microsoft.Data.Sqlite), SQLite 3.

---

### Task 0: Project scaffolding and .gitignore

**Files:**

- Create: `porting-schema.sql`
- Create: `.gitignore`
- Create: `tools/go-analyzer/go.mod`
- Create: `tools/NatsNet.PortTracker/NatsNet.PortTracker.csproj`

**Step 1: Create .gitignore**

```bash
cat > .gitignore << 'GITEOF'
# SQLite database (local state)
porting.db
porting.db-journal
porting.db-wal
porting.db-shm

# .NET build output
tools/NatsNet.PortTracker/bin/
tools/NatsNet.PortTracker/obj/

# Go build output
tools/go-analyzer/go-analyzer

# OS files
.DS_Store
Thumbs.db
GITEOF
```

**Step 2: Create SQLite schema file**

```sql
-- porting-schema.sql
-- Schema for NATS server Go-to-.NET porting tracker

PRAGMA journal_mode=WAL;
PRAGMA foreign_keys=ON;

CREATE TABLE IF NOT EXISTS modules (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    name TEXT NOT NULL,
    description TEXT,
    go_package TEXT,
    go_file TEXT,
    go_line_start INTEGER,
    go_line_count INTEGER,
    status TEXT NOT NULL DEFAULT 'not_started'
        CHECK (status IN ('not_started', 'stub', 'complete', 'verified', 'n_a')),
    dotnet_project TEXT,
    dotnet_namespace TEXT,
    dotnet_class TEXT,
    notes TEXT,
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE IF NOT EXISTS features (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    module_id INTEGER NOT NULL REFERENCES modules(id) ON DELETE CASCADE,
    name TEXT NOT NULL,
    description TEXT,
    go_file TEXT,
    go_class TEXT,
    go_method TEXT,
    go_line_number INTEGER,
    go_line_count INTEGER,
    status TEXT NOT NULL DEFAULT 'not_started'
        CHECK (status IN ('not_started', 'stub', 'complete', 'verified', 'n_a')),
    dotnet_project TEXT,
    dotnet_class TEXT,
    dotnet_method TEXT,
    notes TEXT,
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE IF NOT EXISTS unit_tests (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    module_id INTEGER NOT NULL REFERENCES modules(id) ON DELETE CASCADE,
    feature_id INTEGER REFERENCES features(id) ON DELETE SET NULL,
    name TEXT NOT NULL,
    description TEXT,
    go_file TEXT,
    go_class TEXT,
    go_method TEXT,
    go_line_number INTEGER,
    go_line_count INTEGER,
    status TEXT NOT NULL DEFAULT 'not_started'
        CHECK (status IN ('not_started', 'stub', 'complete', 'verified', 'n_a')),
    dotnet_project TEXT,
    dotnet_class TEXT,
    dotnet_method TEXT,
    notes TEXT,
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE IF NOT EXISTS dependencies (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    source_type TEXT NOT NULL CHECK (source_type IN ('module', 'feature', 'unit_test')),
    source_id INTEGER NOT NULL,
    target_type TEXT NOT NULL CHECK (target_type IN ('module', 'feature', 'unit_test')),
    target_id INTEGER NOT NULL,
    dependency_kind TEXT DEFAULT 'calls',
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
    UNIQUE (source_type, source_id, target_type, target_id)
);

CREATE TABLE IF NOT EXISTS library_mappings (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    go_import_path TEXT NOT NULL UNIQUE,
    go_library_name TEXT,
    go_usage_description TEXT,
    dotnet_package TEXT,
    dotnet_namespace TEXT,
    dotnet_usage_notes TEXT,
    status TEXT NOT NULL DEFAULT 'not_mapped'
        CHECK (status IN ('not_mapped', 'mapped', 'verified')),
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
);

-- Indexes
CREATE INDEX IF NOT EXISTS idx_features_module ON features(module_id);
CREATE INDEX IF NOT EXISTS idx_features_status ON features(status);
CREATE INDEX IF NOT EXISTS idx_unit_tests_module ON unit_tests(module_id);
CREATE INDEX IF NOT EXISTS idx_unit_tests_feature ON unit_tests(feature_id);
CREATE INDEX IF NOT EXISTS idx_unit_tests_status ON unit_tests(status);
CREATE INDEX IF NOT EXISTS idx_deps_source ON dependencies(source_type, source_id);
CREATE INDEX IF NOT EXISTS idx_deps_target ON dependencies(target_type, target_id);
CREATE INDEX IF NOT EXISTS idx_library_status ON library_mappings(status);
CREATE INDEX IF NOT EXISTS idx_modules_status ON modules(status);

-- Triggers to auto-update updated_at
CREATE TRIGGER IF NOT EXISTS trg_modules_updated AFTER UPDATE ON modules
BEGIN
    UPDATE modules SET updated_at = CURRENT_TIMESTAMP WHERE id = NEW.id;
END;

CREATE TRIGGER IF NOT EXISTS trg_features_updated AFTER UPDATE ON features
BEGIN
    UPDATE features SET updated_at = CURRENT_TIMESTAMP WHERE id = NEW.id;
END;

CREATE TRIGGER IF NOT EXISTS trg_unit_tests_updated AFTER UPDATE ON unit_tests
BEGIN
    UPDATE unit_tests SET updated_at = CURRENT_TIMESTAMP WHERE id = NEW.id;
END;

CREATE TRIGGER IF NOT EXISTS trg_library_mappings_updated AFTER UPDATE ON library_mappings
BEGIN
    UPDATE library_mappings SET updated_at = CURRENT_TIMESTAMP WHERE id = NEW.id;
END;
```
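
Re-running the analyzer must not duplicate rows: the `UNIQUE (source_type, source_id, target_type, target_id)` constraint on `dependencies`, paired with `INSERT OR IGNORE` in Task 5, makes inserts idempotent. The same dedupe rule can be sketched in Go by keying a map on the full tuple (the `depKey` type here is illustrative, not part of the plan's code):

```go
package main

import "fmt"

// depKey mirrors the UNIQUE constraint on the dependencies table:
// two rows are duplicates exactly when all four fields match.
type depKey struct {
	SourceType string
	SourceID   int64
	TargetType string
	TargetID   int64
}

func main() {
	seen := make(map[depKey]bool)
	inserts := []depKey{
		{"module", 1, "module", 2},
		{"module", 1, "module", 2}, // duplicate: INSERT OR IGNORE would skip it
		{"module", 1, "module", 3},
	}
	kept := 0
	for _, k := range inserts {
		if !seen[k] {
			seen[k] = true
			kept++
		}
	}
	fmt.Println(kept) // 3 attempted inserts, 2 distinct rows
}
```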

**Step 3: Initialize Go module**

```bash
mkdir -p tools/go-analyzer
cd tools/go-analyzer
go mod init github.com/natsnet/go-analyzer
```

**Step 4: Initialize .NET project**

```bash
mkdir -p tools/NatsNet.PortTracker
cd tools/NatsNet.PortTracker
dotnet new console --framework net10.0
dotnet add package Microsoft.Data.Sqlite --version 9.*
dotnet add package System.CommandLine --version 2.*
```

**Step 5: Commit scaffolding**

```bash
git add .gitignore porting-schema.sql tools/go-analyzer/go.mod tools/NatsNet.PortTracker/
git commit -m "scaffold: add project structure, schema, and gitignore"
```

---

### Task 1: Go AST Analyzer — main.go (CLI entry point)

**Files:**

- Create: `tools/go-analyzer/main.go`

**Step 1: Write main.go**

This is the entry point that parses CLI flags, opens the SQLite DB, runs the analyzer, and writes results.

```go
package main

import (
	"flag"
	"fmt"
	"log"
	"os"
)

func main() {
	sourceDir := flag.String("source", "", "Path to Go source root (e.g., ../../golang/nats-server)")
	dbPath := flag.String("db", "", "Path to SQLite database file (e.g., ../../porting.db)")
	schemaPath := flag.String("schema", "", "Path to SQL schema file (e.g., ../../porting-schema.sql)")
	flag.Parse()

	if *sourceDir == "" || *dbPath == "" || *schemaPath == "" {
		fmt.Fprintf(os.Stderr, "Usage: go-analyzer --source <path> --db <path> --schema <path>\n")
		flag.PrintDefaults()
		os.Exit(1)
	}

	// Open DB and apply schema
	db, err := OpenDB(*dbPath, *schemaPath)
	if err != nil {
		log.Fatalf("Failed to open database: %v", err)
	}
	defer db.Close()

	// Run analysis
	analyzer := NewAnalyzer(*sourceDir)
	result, err := analyzer.Analyze()
	if err != nil {
		log.Fatalf("Analysis failed: %v", err)
	}

	// Write to DB
	writer := NewDBWriter(db)
	if err := writer.WriteAll(result); err != nil {
		log.Fatalf("Failed to write results: %v", err)
	}

	fmt.Printf("Analysis complete:\n")
	fmt.Printf("  Modules:      %d\n", len(result.Modules))
	fmt.Printf("  Features:     %d\n", result.TotalFeatures())
	fmt.Printf("  Unit Tests:   %d\n", result.TotalTests())
	fmt.Printf("  Dependencies: %d\n", len(result.Dependencies))
	fmt.Printf("  Imports:      %d\n", len(result.Imports))
}
```
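
The required-flags rule in `main()` can be exercised in isolation by mirroring it on a `flag.FlagSet`, which avoids touching `os.Args`. The `parseArgs` helper below is a hypothetical test harness, not part of main.go:

```go
package main

import (
	"flag"
	"fmt"
	"io"
)

// parseArgs mirrors main()'s flag handling on an isolated FlagSet so the
// validation rule (all three flags required) can be checked directly.
func parseArgs(args []string) (source, db, schema string, err error) {
	fs := flag.NewFlagSet("go-analyzer", flag.ContinueOnError)
	fs.SetOutput(io.Discard)
	src := fs.String("source", "", "Path to Go source root")
	d := fs.String("db", "", "Path to SQLite database file")
	sch := fs.String("schema", "", "Path to SQL schema file")
	if err := fs.Parse(args); err != nil {
		return "", "", "", err
	}
	if *src == "" || *d == "" || *sch == "" {
		return "", "", "", fmt.Errorf("all of --source, --db, --schema are required")
	}
	return *src, *d, *sch, nil
}

func main() {
	// Missing --db and --schema is rejected.
	_, _, _, err := parseArgs([]string{"--source", "x"})
	fmt.Println(err != nil) // true

	// All three flags present parses cleanly.
	s, d, sch, err := parseArgs([]string{"--source", "a", "--db", "b", "--schema", "c"})
	fmt.Println(err == nil, s, d, sch) // true a b c
}
```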

**Step 2: Verify it compiles (with stubs)**

Full compilation needs the files from Tasks 2-5 first, so for now just verify the syntax:

```bash
cd tools/go-analyzer
go vet ./... 2>&1 || echo "Expected errors — stubs not yet written"
```

**Step 3: Commit**

```bash
git add tools/go-analyzer/main.go
git commit -m "feat(go-analyzer): add CLI entry point"
```

---

### Task 2: Go AST Analyzer — types.go (data model)

**Files:**

- Create: `tools/go-analyzer/types.go`

**Step 1: Write types.go**

Defines the data structures that the analyzer produces and the DB writer consumes.

```go
package main

// AnalysisResult holds all extracted data from Go source analysis.
type AnalysisResult struct {
	Modules      []Module
	Dependencies []Dependency
	Imports      []ImportInfo
}

// TotalFeatures returns the count of all features across all modules.
func (r *AnalysisResult) TotalFeatures() int {
	count := 0
	for _, m := range r.Modules {
		count += len(m.Features)
	}
	return count
}

// TotalTests returns the count of all tests across all modules.
func (r *AnalysisResult) TotalTests() int {
	count := 0
	for _, m := range r.Modules {
		count += len(m.Tests)
	}
	return count
}

// Module represents a logical grouping of Go source files.
type Module struct {
	Name        string
	Description string
	GoPackage   string
	GoFile      string // primary file or directory
	GoLineCount int
	Features    []Feature
	Tests       []TestFunc
}

// Feature represents a function or method extracted from Go source.
type Feature struct {
	Name         string
	Description  string
	GoFile       string
	GoClass      string // receiver type, empty for package-level functions
	GoMethod     string
	GoLineNumber int
	GoLineCount  int
}

// TestFunc represents a test function extracted from Go source.
type TestFunc struct {
	Name         string
	Description  string
	GoFile       string
	GoClass      string
	GoMethod     string
	GoLineNumber int
	GoLineCount  int
	// FeatureName links this test to a feature by naming convention
	FeatureName string
}

// Dependency represents a call relationship between two items.
type Dependency struct {
	SourceModule   string
	SourceFeature  string // empty for module-level deps
	TargetModule   string
	TargetFeature  string // empty for module-level deps
	DependencyKind string // "calls"
}

// ImportInfo represents a Go import path found in source files.
type ImportInfo struct {
	ImportPath  string
	IsStdlib    bool
	UsedInFiles []string
}
```
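
A quick sketch of how the roll-up helpers behave, with struct fields trimmed to the minimum (this is not the full types.go, just enough to show the aggregation):

```go
package main

import "fmt"

// Trimmed copies of the plan's types: only what the roll-up helpers touch.
type Feature struct{ Name string }
type TestFunc struct{ Name string }

type Module struct {
	Name     string
	Features []Feature
	Tests    []TestFunc
}

type AnalysisResult struct{ Modules []Module }

// TotalFeatures sums feature counts across every module.
func (r *AnalysisResult) TotalFeatures() int {
	count := 0
	for _, m := range r.Modules {
		count += len(m.Features)
	}
	return count
}

// TotalTests sums test counts across every module.
func (r *AnalysisResult) TotalTests() int {
	count := 0
	for _, m := range r.Modules {
		count += len(m.Tests)
	}
	return count
}

func main() {
	r := &AnalysisResult{Modules: []Module{
		{Name: "jetstream", Features: make([]Feature, 3), Tests: make([]TestFunc, 2)},
		{Name: "auth", Features: make([]Feature, 1), Tests: make([]TestFunc, 4)},
	}}
	fmt.Println(r.TotalFeatures(), r.TotalTests()) // 4 6
}
```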

**Step 2: Verify compilation**

```bash
cd tools/go-analyzer && go build ./... 2>&1 || echo "Expected — other files not yet written"
```

**Step 3: Commit**

```bash
git add tools/go-analyzer/types.go
git commit -m "feat(go-analyzer): add data model types"
```

---

### Task 3: Go AST Analyzer — analyzer.go (AST parsing + call graph)

**Files:**

- Create: `tools/go-analyzer/analyzer.go`

**Step 1: Write analyzer.go**

This is the core analysis engine. It walks Go source files using `go/ast` and `go/parser` to extract functions, methods, and imports. This version derives module-level dependencies from the package layout; type-checked call-graph analysis via `golang.org/x/tools` can be layered on later.

```go
package main

import (
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
	"os"
	"path/filepath"
	"sort"
	"strings"
)

// Analyzer parses Go source code and extracts structural information.
type Analyzer struct {
	sourceDir string
	fset      *token.FileSet
}

// NewAnalyzer creates a new Analyzer for the given source directory.
func NewAnalyzer(sourceDir string) *Analyzer {
	return &Analyzer{
		sourceDir: sourceDir,
		fset:      token.NewFileSet(),
	}
}

// Analyze runs the full analysis pipeline.
func (a *Analyzer) Analyze() (*AnalysisResult, error) {
	serverDir := filepath.Join(a.sourceDir, "server")

	// 1. Discover all Go files grouped by directory
	fileGroups, err := a.discoverFiles(serverDir)
	if err != nil {
		return nil, fmt.Errorf("discovering files: %w", err)
	}

	// 2. Parse each group into modules
	result := &AnalysisResult{}
	allImports := make(map[string]*ImportInfo)

	for dir, files := range fileGroups {
		module, imports, err := a.parseModule(dir, files)
		if err != nil {
			return nil, fmt.Errorf("parsing module %s: %w", dir, err)
		}
		result.Modules = append(result.Modules, *module)
		for _, imp := range imports {
			if existing, ok := allImports[imp.ImportPath]; ok {
				existing.UsedInFiles = append(existing.UsedInFiles, imp.UsedInFiles...)
			} else {
				allImports[imp.ImportPath] = &imp
			}
		}
	}

	// 3. Build module-level dependencies from import analysis
	result.Dependencies = a.buildDependencies(result.Modules)

	// 4. Collect imports
	for _, imp := range allImports {
		result.Imports = append(result.Imports, *imp)
	}
	sort.Slice(result.Imports, func(i, j int) bool {
		return result.Imports[i].ImportPath < result.Imports[j].ImportPath
	})

	// Sort modules by name
	sort.Slice(result.Modules, func(i, j int) bool {
		return result.Modules[i].Name < result.Modules[j].Name
	})

	return result, nil
}

// discoverFiles walks the source tree and groups .go files by directory.
func (a *Analyzer) discoverFiles(root string) (map[string][]string, error) {
	groups := make(map[string][]string)
	err := filepath.Walk(root, func(path string, info os.FileInfo, err error) error {
		if err != nil {
			return err
		}
		// Skip non-Go files and config/testdata directories
		if info.IsDir() {
			if info.Name() == "configs" || info.Name() == "testdata" {
				return filepath.SkipDir
			}
			return nil
		}
		if !strings.HasSuffix(info.Name(), ".go") {
			return nil
		}
		dir := filepath.Dir(path)
		groups[dir] = append(groups[dir], path)
		return nil
	})
	return groups, err
}

// parseModule parses all Go files in a directory into a Module.
func (a *Analyzer) parseModule(dir string, files []string) (*Module, []ImportInfo, error) {
	moduleName := a.moduleNameFromDir(dir)

	module := &Module{
		Name:      moduleName,
		GoPackage: moduleName,
		GoFile:    dir,
	}

	var sourceFiles []string
	var testFiles []string
	for _, f := range files {
		if strings.HasSuffix(f, "_test.go") {
			testFiles = append(testFiles, f)
		} else {
			sourceFiles = append(sourceFiles, f)
		}
	}

	var allImports []ImportInfo
	totalLines := 0

	// Parse source files
	for _, f := range sourceFiles {
		features, imports, lines, err := a.parseSourceFile(f)
		if err != nil {
			fmt.Fprintf(os.Stderr, "Warning: skipping %s: %v\n", f, err)
			continue
		}
		module.Features = append(module.Features, features...)
		allImports = append(allImports, imports...)
		totalLines += lines
	}

	// Parse test files
	for _, f := range testFiles {
		tests, _, lines, err := a.parseTestFile(f)
		if err != nil {
			fmt.Fprintf(os.Stderr, "Warning: skipping test %s: %v\n", f, err)
			continue
		}
		module.Tests = append(module.Tests, tests...)
		totalLines += lines
	}

	module.GoLineCount = totalLines
	return module, allImports, nil
}

// parseSourceFile extracts functions, methods, and imports from a Go source file.
func (a *Analyzer) parseSourceFile(filePath string) ([]Feature, []ImportInfo, int, error) {
	src, err := os.ReadFile(filePath)
	if err != nil {
		return nil, nil, 0, err
	}

	file, err := parser.ParseFile(a.fset, filePath, src, parser.ParseComments)
	if err != nil {
		return nil, nil, 0, err
	}

	lines := strings.Count(string(src), "\n") + 1
	relPath := a.relPath(filePath)

	var features []Feature
	var imports []ImportInfo

	// Extract imports
	for _, imp := range file.Imports {
		path := strings.Trim(imp.Path.Value, "\"")
		imports = append(imports, ImportInfo{
			ImportPath:  path,
			IsStdlib:    isStdlib(path),
			UsedInFiles: []string{relPath},
		})
	}

	// Extract functions and methods
	for _, decl := range file.Decls {
		fn, ok := decl.(*ast.FuncDecl)
		if !ok {
			continue
		}

		feature := Feature{
			Name:         fn.Name.Name,
			GoFile:       relPath,
			GoMethod:     fn.Name.Name,
			GoLineNumber: a.fset.Position(fn.Pos()).Line,
		}

		// Calculate line count for this function
		startLine := a.fset.Position(fn.Pos()).Line
		endLine := a.fset.Position(fn.End()).Line
		feature.GoLineCount = endLine - startLine + 1

		// If it's a method, extract receiver type
		if fn.Recv != nil && len(fn.Recv.List) > 0 {
			feature.GoClass = a.receiverTypeName(fn.Recv.List[0].Type)
			feature.Name = feature.GoClass + "." + fn.Name.Name
		}

		// Build description from doc comment
		if fn.Doc != nil {
			feature.Description = strings.TrimSpace(fn.Doc.Text())
		}

		features = append(features, feature)
	}

	return features, imports, lines, nil
}

// parseTestFile extracts test functions from a Go test file.
func (a *Analyzer) parseTestFile(filePath string) ([]TestFunc, []ImportInfo, int, error) {
	src, err := os.ReadFile(filePath)
	if err != nil {
		return nil, nil, 0, err
	}

	file, err := parser.ParseFile(a.fset, filePath, src, parser.ParseComments)
	if err != nil {
		return nil, nil, 0, err
	}

	lines := strings.Count(string(src), "\n") + 1
	relPath := a.relPath(filePath)

	var tests []TestFunc
	var imports []ImportInfo

	for _, imp := range file.Imports {
		path := strings.Trim(imp.Path.Value, "\"")
		imports = append(imports, ImportInfo{
			ImportPath:  path,
			IsStdlib:    isStdlib(path),
			UsedInFiles: []string{relPath},
		})
	}

	for _, decl := range file.Decls {
		fn, ok := decl.(*ast.FuncDecl)
		if !ok {
			continue
		}
		name := fn.Name.Name
		if !strings.HasPrefix(name, "Test") && !strings.HasPrefix(name, "Benchmark") {
			continue
		}

		startLine := a.fset.Position(fn.Pos()).Line
		endLine := a.fset.Position(fn.End()).Line

		test := TestFunc{
			Name:         name,
			GoFile:       relPath,
			GoMethod:     name,
			GoLineNumber: startLine,
			GoLineCount:  endLine - startLine + 1,
		}

		if fn.Doc != nil {
			test.Description = strings.TrimSpace(fn.Doc.Text())
		}

		// Try to link to a feature by naming convention:
		// TestFoo -> Foo, TestServer_Foo -> Server.Foo
		test.FeatureName = a.inferFeatureName(name)

		tests = append(tests, test)
	}

	return tests, imports, lines, nil
}

// buildDependencies creates module-level dependencies based on cross-package imports.
func (a *Analyzer) buildDependencies(modules []Module) []Dependency {
	// Map package names to module names
	pkgToModule := make(map[string]string)
	for _, m := range modules {
		pkgToModule[m.GoPackage] = m.Name
	}

	// For now, build module-level dependencies based on directory structure.
	// Cross-file function calls within the same package are tracked at feature level.
	var deps []Dependency

	// Subdirectory packages are dependencies of the main server package
	for _, m := range modules {
		if m.Name != "server" && m.GoPackage != "server" {
			deps = append(deps, Dependency{
				SourceModule:   "server",
				TargetModule:   m.Name,
				DependencyKind: "calls",
			})
		}
	}

	return deps
}

// moduleNameFromDir converts a directory path to a module name.
func (a *Analyzer) moduleNameFromDir(dir string) string {
	// If it's the server root directory, use "server"
	base := filepath.Base(dir)
	if base == "server" {
		return "server"
	}
	// For subdirectories, use the subdirectory name
	return base
}

// relPath returns a path relative to the analyzer's source directory.
func (a *Analyzer) relPath(absPath string) string {
	rel, err := filepath.Rel(a.sourceDir, absPath)
	if err != nil {
		return absPath
	}
	return rel
}

// receiverTypeName extracts the type name from a method receiver.
func (a *Analyzer) receiverTypeName(expr ast.Expr) string {
	switch t := expr.(type) {
	case *ast.StarExpr:
		return a.receiverTypeName(t.X)
	case *ast.Ident:
		return t.Name
	default:
		return ""
	}
}

// inferFeatureName attempts to derive a feature name from a test name.
// TestFoo -> Foo, TestServer_Foo -> Server.Foo (the first underscore becomes a dot).
func (a *Analyzer) inferFeatureName(testName string) string {
	name := testName
	for _, prefix := range []string{"Test", "Benchmark"} {
		if strings.HasPrefix(name, prefix) {
			name = strings.TrimPrefix(name, prefix)
			break
		}
	}
	if name == "" {
		return ""
	}
	// Replace first underscore with dot for struct.Method convention
	if idx := strings.Index(name, "_"); idx > 0 {
		name = name[:idx] + "." + name[idx+1:]
	}
	return name
}

// isStdlib checks if an import path is a Go standard library package.
func isStdlib(importPath string) bool {
	// Stdlib packages don't contain dots in their first path element
	firstSlash := strings.Index(importPath, "/")
	var first string
	if firstSlash < 0 {
		first = importPath
	} else {
		first = importPath[:firstSlash]
	}
	return !strings.Contains(first, ".")
}
```

**Step 2: Add Go dependencies**

```bash
cd tools/go-analyzer
go mod tidy
```
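
The two pure helpers, `inferFeatureName` and `isStdlib`, encode conventions worth pinning down before they feed the database. Copied here (without the `Analyzer` receiver) so their behavior can be checked standalone:

```go
package main

import (
	"fmt"
	"strings"
)

// inferFeatureName: TestFoo -> Foo, TestServer_Foo -> Server.Foo.
func inferFeatureName(testName string) string {
	name := testName
	for _, prefix := range []string{"Test", "Benchmark"} {
		if strings.HasPrefix(name, prefix) {
			name = strings.TrimPrefix(name, prefix)
			break
		}
	}
	if name == "" {
		return ""
	}
	// Replace first underscore with dot for struct.Method convention
	if idx := strings.Index(name, "_"); idx > 0 {
		name = name[:idx] + "." + name[idx+1:]
	}
	return name
}

// isStdlib: stdlib import paths have no dot in their first element.
func isStdlib(importPath string) bool {
	first := importPath
	if i := strings.Index(importPath, "/"); i >= 0 {
		first = importPath[:i]
	}
	return !strings.Contains(first, ".")
}

func main() {
	fmt.Println(inferFeatureName("TestServer_Foo"))      // Server.Foo
	fmt.Println(inferFeatureName("TestFoo"))             // Foo
	fmt.Println(isStdlib("encoding/json"))               // true
	fmt.Println(isStdlib("github.com/mattn/go-sqlite3")) // false
}
```

Note that `golang.org/x/tools` is correctly classified as non-stdlib because `golang.org` contains a dot, even though it is Go-project-hosted.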

**Step 3: Verify compilation**

```bash
cd tools/go-analyzer && go build ./... 2>&1 || echo "Expected — sqlite.go not yet written"
```

**Step 4: Commit**

```bash
git add tools/go-analyzer/analyzer.go
git commit -m "feat(go-analyzer): add AST parsing and analysis engine"
```

---

### Task 4: Go AST Analyzer — grouper.go (file-to-module grouping)

**Files:**

- Create: `tools/go-analyzer/grouper.go`

**Step 1: Write grouper.go**

Handles the logic for grouping related Go files into logical modules (e.g., `jetstream.go` + `jetstream_api.go` + `jetstream_cluster.go` -> "jetstream"). Note that `Analyze()` in Task 3 groups files by directory, so this grouper must be wired into discovery for the single `server/` package to split into these logical modules.

```go
package main

import (
	"path/filepath"
	"sort"
	"strings"
)

// ModuleGrouper groups Go source files into logical modules.
type ModuleGrouper struct {
	// Prefixes maps a file prefix to a module name.
	// Files starting with this prefix are grouped together.
	Prefixes map[string]string
}

// DefaultGrouper creates a grouper with default prefix rules for nats-server.
func DefaultGrouper() *ModuleGrouper {
	return &ModuleGrouper{
		Prefixes: map[string]string{
			"jetstream":          "jetstream",
			"consumer":           "jetstream",
			"stream":             "jetstream",
			"store":              "jetstream",
			"filestore":          "jetstream",
			"memstore":           "jetstream",
			"raft":               "raft",
			"gateway":            "gateway",
			"leafnode":           "leafnode",
			"route":              "route",
			"client":             "client",
			"client_proxyproto":  "client",
			"server":             "core",
			"service":            "core",
			"signal":             "core",
			"reload":             "core",
			"opts":               "config",
			"auth":               "auth",
			"auth_callout":       "auth",
			"jwt":                "auth",
			"nkey":               "auth",
			"accounts":           "accounts",
			"ocsp":               "tls",
			"ocsp_peer":          "tls",
			"ocsp_responsecache": "tls",
			"ciphersuites":       "tls",
			"parser":             "protocol",
			"proto":              "protocol",
			"sublist":            "subscriptions",
			"subject_transform":  "subscriptions",
			"monitor":            "monitoring",
			"monitor_sort_opts":  "monitoring",
			"mqtt":               "mqtt",
			"websocket":          "websocket",
			"events":             "events",
			"msgtrace":           "events",
			"log":                "logging",
			"errors":             "errors",
			"errors_gen":         "errors",
			"const":              "core",
			"util":               "core",
			"ring":               "core",
			"sendq":              "core",
			"ipqueue":            "core",
			"rate_counter":       "core",
			"scheduler":          "core",
			"sdm":                "core",
			"dirstore":           "core",
			"disk_avail":         "core",
			"elastic":            "core",
		},
	}
}

// GroupFiles takes a flat list of Go files and returns them grouped by module name.
func (g *ModuleGrouper) GroupFiles(files []string) map[string][]string {
	groups := make(map[string][]string)

	for _, f := range files {
		base := filepath.Base(f)
		base = strings.TrimSuffix(base, ".go")
		base = strings.TrimSuffix(base, "_test")

		// Remove platform suffixes
		for _, suffix := range []string{"_windows", "_linux", "_darwin", "_bsd",
			"_solaris", "_wasm", "_netbsd", "_openbsd", "_dragonfly", "_zos", "_other"} {
			base = strings.TrimSuffix(base, suffix)
		}

		module := g.classify(base)
		groups[module] = append(groups[module], f)
	}

	// Sort files within each group
	for k := range groups {
		sort.Strings(groups[k])
	}

	return groups
}

// classify determines which module a file belongs to based on its base name.
func (g *ModuleGrouper) classify(baseName string) string {
	// Exact match first
	if module, ok := g.Prefixes[baseName]; ok {
		return module
	}

	// Prefix match (longest prefix wins)
	bestMatch := ""
	bestModule := "core" // default
	for prefix, module := range g.Prefixes {
		if strings.HasPrefix(baseName, prefix) && len(prefix) > len(bestMatch) {
			bestMatch = prefix
			bestModule = module
		}
	}

	return bestModule
}
```
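
The resolution order here is easy to get wrong: exact map hit first, then longest matching prefix, then the `core` default, with platform and `_test` suffixes stripped before classification. A scaled-down copy of `classify` with a trimmed prefix table shows each path:

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// A small subset of the grouper's prefix table, enough to exercise
// exact-match, prefix-match, and default resolution.
var prefixes = map[string]string{
	"jetstream":         "jetstream",
	"client":            "client",
	"client_proxyproto": "client",
	"server":            "core",
}

// classify mirrors ModuleGrouper.classify over the trimmed table.
func classify(baseName string) string {
	if module, ok := prefixes[baseName]; ok {
		return module // exact match
	}
	bestMatch, bestModule := "", "core" // default module
	for prefix, module := range prefixes {
		if strings.HasPrefix(baseName, prefix) && len(prefix) > len(bestMatch) {
			bestMatch, bestModule = prefix, module
		}
	}
	return bestModule
}

// baseOf mirrors GroupFiles' suffix stripping (subset of platforms).
func baseOf(file string) string {
	base := strings.TrimSuffix(filepath.Base(file), ".go")
	base = strings.TrimSuffix(base, "_test")
	for _, suffix := range []string{"_windows", "_linux", "_darwin"} {
		base = strings.TrimSuffix(base, suffix)
	}
	return base
}

func main() {
	fmt.Println(classify(baseOf("jetstream_cluster.go"))) // jetstream (prefix match)
	fmt.Println(classify(baseOf("client_proxyproto.go"))) // client (exact match)
	fmt.Println(classify(baseOf("signal_windows.go")))    // core (default)
	fmt.Println(classify(baseOf("server_test.go")))       // core (exact match on "server")
}
```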

**Step 2: Commit**

```bash
git add tools/go-analyzer/grouper.go
git commit -m "feat(go-analyzer): add file-to-module grouping logic"
```

---
### Task 5: Go AST Analyzer — sqlite.go (database writer)
|
|
|
|
**Files:**
|
|
- Create: `tools/go-analyzer/sqlite.go`
|
|
|
|
**Step 1: Write sqlite.go**
|
|
|
|
Handles opening the database, applying the schema, and writing analysis results.
|
|
|
|
```go
|
|
package main
|
|
|
|
import (
|
|
"database/sql"
|
|
"fmt"
|
|
"os"
|
|
|
|
_ "github.com/mattn/go-sqlite3"
|
|
)
|
|
|
|
// OpenDB opens or creates the SQLite database and applies the schema.
|
|
func OpenDB(dbPath, schemaPath string) (*sql.DB, error) {
|
|
db, err := sql.Open("sqlite3", dbPath+"?_journal_mode=WAL&_foreign_keys=ON")
|
|
if err != nil {
|
|
return nil, fmt.Errorf("opening database: %w", err)
|
|
}
|
|
|
|
schema, err := os.ReadFile(schemaPath)
|
|
if err != nil {
|
|
return nil, fmt.Errorf("reading schema: %w", err)
|
|
}
|
|
|
|
if _, err := db.Exec(string(schema)); err != nil {
|
|
return nil, fmt.Errorf("applying schema: %w", err)
|
|
}
|
|
|
|
return db, nil
|
|
}
|
|
|
|
// DBWriter writes analysis results to the SQLite database.
|
|
type DBWriter struct {
|
|
db *sql.DB
|
|
}
|
|
|
|
// NewDBWriter creates a new DBWriter.
|
|
func NewDBWriter(db *sql.DB) *DBWriter {
|
|
return &DBWriter{db: db}
|
|
}
|
|
|
|
// WriteAll writes all analysis results to the database in a single transaction.
|
|
func (w *DBWriter) WriteAll(result *AnalysisResult) error {
|
|
tx, err := w.db.Begin()
|
|
if err != nil {
|
|
return fmt.Errorf("beginning transaction: %w", err)
|
|
}
|
|
defer tx.Rollback()
|
|
|
|
// Track module name -> DB id for dependency resolution
|
|
moduleIDs := make(map[string]int64)
|
|
// Track "module:feature" -> DB id for test linking
|
|
featureIDs := make(map[string]int64)
|
|
|
|
// 1. Insert modules and their features/tests
|
|
for _, mod := range result.Modules {
|
|
modID, err := w.insertModule(tx, &mod)
|
|
if err != nil {
|
|
return fmt.Errorf("inserting module %s: %w", mod.Name, err)
|
|
}
|
|
moduleIDs[mod.Name] = modID
|
|
|
|
for _, feat := range mod.Features {
|
|
featID, err := w.insertFeature(tx, modID, &feat)
|
|
if err != nil {
|
|
return fmt.Errorf("inserting feature %s: %w", feat.Name, err)
|
|
}
|
|
featureIDs[mod.Name+":"+feat.Name] = featID
|
|
}
|
|
|
|
for _, test := range mod.Tests {
|
|
var featureID *int64
|
|
if test.FeatureName != "" {
|
|
if fid, ok := featureIDs[mod.Name+":"+test.FeatureName]; ok {
|
|
featureID = &fid
|
|
}
|
|
}
|
|
if err := w.insertTest(tx, modID, featureID, &test); err != nil {
|
|
return fmt.Errorf("inserting test %s: %w", test.Name, err)
|
|
}
|
|
}
|
|
}
|
|
|
|
// 2. Insert dependencies
|
|
for _, dep := range result.Dependencies {
|
|
sourceID, ok := moduleIDs[dep.SourceModule]
|
|
if !ok {
|
|
continue
|
|
}
|
|
targetID, ok := moduleIDs[dep.TargetModule]
|
|
if !ok {
|
|
continue
|
|
}
|
|
if err := w.insertDependency(tx, "module", sourceID, "module", targetID, dep.DependencyKind); err != nil {
|
|
return fmt.Errorf("inserting dependency %s->%s: %w", dep.SourceModule, dep.TargetModule, err)
|
|
}
|
|
}
|
|
|
|
// 3. Insert library mappings (Go-side only)
|
|
for _, imp := range result.Imports {
|
|
if imp.IsStdlib {
|
|
continue // Skip stdlib for now; they can be added manually
|
|
}
|
|
if err := w.insertLibrary(tx, &imp); err != nil {
|
|
return fmt.Errorf("inserting library %s: %w", imp.ImportPath, err)
|
|
}
|
|
}
|
|
|
|
	return tx.Commit()
}

func (w *DBWriter) insertModule(tx *sql.Tx, mod *Module) (int64, error) {
	res, err := tx.Exec(
		`INSERT INTO modules (name, description, go_package, go_file, go_line_count, status)
		 VALUES (?, ?, ?, ?, ?, 'not_started')`,
		mod.Name, mod.Description, mod.GoPackage, mod.GoFile, mod.GoLineCount,
	)
	if err != nil {
		return 0, err
	}
	return res.LastInsertId()
}

func (w *DBWriter) insertFeature(tx *sql.Tx, moduleID int64, feat *Feature) (int64, error) {
	res, err := tx.Exec(
		`INSERT INTO features (module_id, name, description, go_file, go_class, go_method, go_line_number, go_line_count, status)
		 VALUES (?, ?, ?, ?, ?, ?, ?, ?, 'not_started')`,
		moduleID, feat.Name, feat.Description, feat.GoFile, feat.GoClass, feat.GoMethod, feat.GoLineNumber, feat.GoLineCount,
	)
	if err != nil {
		return 0, err
	}
	return res.LastInsertId()
}

func (w *DBWriter) insertTest(tx *sql.Tx, moduleID int64, featureID *int64, test *TestFunc) error {
	_, err := tx.Exec(
		`INSERT INTO unit_tests (module_id, feature_id, name, description, go_file, go_class, go_method, go_line_number, go_line_count, status)
		 VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, 'not_started')`,
		moduleID, featureID, test.Name, test.Description, test.GoFile, test.GoClass, test.GoMethod, test.GoLineNumber, test.GoLineCount,
	)
	return err
}

func (w *DBWriter) insertDependency(tx *sql.Tx, srcType string, srcID int64, tgtType string, tgtID int64, kind string) error {
	_, err := tx.Exec(
		`INSERT OR IGNORE INTO dependencies (source_type, source_id, target_type, target_id, dependency_kind)
		 VALUES (?, ?, ?, ?, ?)`,
		srcType, srcID, tgtType, tgtID, kind,
	)
	return err
}

func (w *DBWriter) insertLibrary(tx *sql.Tx, imp *ImportInfo) error {
	_, err := tx.Exec(
		`INSERT OR IGNORE INTO library_mappings (go_import_path, go_library_name, status)
		 VALUES (?, ?, 'not_mapped')`,
		imp.ImportPath, imp.ImportPath,
	)
	return err
}
```
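Note that `INSERT OR IGNORE` only deduplicates when the target table declares a matching `UNIQUE` constraint — the schema file is assumed to declare one on the `dependencies` columns and on `library_mappings.go_import_path`. The behavior can be checked standalone with a toy table (not the real schema):

```shell
# INSERT OR IGNORE needs a UNIQUE constraint to silently skip duplicates.
db=$(mktemp)
sqlite3 "$db" "CREATE TABLE libs (path TEXT UNIQUE);"
sqlite3 "$db" "INSERT OR IGNORE INTO libs VALUES ('golang.org/x/crypto');"
sqlite3 "$db" "INSERT OR IGNORE INTO libs VALUES ('golang.org/x/crypto');"
sqlite3 "$db" "SELECT COUNT(*) FROM libs;"   # prints 1: second insert was ignored
rm -f "$db"
```

Without the `UNIQUE` constraint, re-running the analyzer would insert duplicate rows instead of being idempotent.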

**Step 2: Add sqlite3 dependency**

```bash
cd tools/go-analyzer
go get github.com/mattn/go-sqlite3
go mod tidy
```

Note: `mattn/go-sqlite3` uses cgo, so the build requires a C compiler and `CGO_ENABLED=1`.

**Step 3: Build the full analyzer**

```bash
cd tools/go-analyzer && go build -o go-analyzer .
```

Expected: successful build.

**Step 4: Run against the nats-server source**

```bash
cd tools/go-analyzer
./go-analyzer --source ../../golang/nats-server --db ../../porting.db --schema ../../porting-schema.sql
```

Expected output: counts of modules, features, tests, dependencies, and imports.

**Step 5: Verify data in SQLite**

```bash
sqlite3 ../../porting.db "SELECT name, go_line_count FROM modules ORDER BY go_line_count DESC LIMIT 10;"
sqlite3 ../../porting.db "SELECT COUNT(*) FROM features;"
sqlite3 ../../porting.db "SELECT COUNT(*) FROM unit_tests;"
sqlite3 ../../porting.db "SELECT COUNT(*) FROM library_mappings;"
```

**Step 6: Commit**

```bash
git add tools/go-analyzer/sqlite.go tools/go-analyzer/go.mod tools/go-analyzer/go.sum
git commit -m "feat(go-analyzer): add SQLite writer, complete analyzer pipeline"
```

---

### Task 6: .NET PortTracker — Project setup and DB access layer

**Files:**
- Create: `tools/NatsNet.PortTracker/Data/Database.cs`
- Create: `tools/NatsNet.PortTracker/Data/Schema.cs`
- Modify: `tools/NatsNet.PortTracker/Program.cs`

**Step 1: Write Database.cs — connection management**
```csharp
using Microsoft.Data.Sqlite;

namespace NatsNet.PortTracker.Data;

public sealed class Database : IDisposable
{
    private readonly SqliteConnection _connection;

    public Database(string dbPath)
    {
        var connectionString = new SqliteConnectionStringBuilder
        {
            DataSource = dbPath,
            Mode = SqliteOpenMode.ReadWriteCreate,
            ForeignKeys = true
        }.ToString();

        _connection = new SqliteConnection(connectionString);
        _connection.Open();

        // Enable WAL mode
        using var cmd = _connection.CreateCommand();
        cmd.CommandText = "PRAGMA journal_mode=WAL;";
        cmd.ExecuteNonQuery();
    }

    public SqliteConnection Connection => _connection;

    public SqliteCommand CreateCommand(string sql)
    {
        var cmd = _connection.CreateCommand();
        cmd.CommandText = sql;
        return cmd;
    }

    public int Execute(string sql, params (string name, object? value)[] parameters)
    {
        using var cmd = CreateCommand(sql);
        foreach (var (name, value) in parameters)
            cmd.Parameters.AddWithValue(name, value ?? DBNull.Value);
        return cmd.ExecuteNonQuery();
    }

    public T? ExecuteScalar<T>(string sql, params (string name, object? value)[] parameters)
    {
        using var cmd = CreateCommand(sql);
        foreach (var (name, value) in parameters)
            cmd.Parameters.AddWithValue(name, value ?? DBNull.Value);
        var result = cmd.ExecuteScalar();
        if (result is null or DBNull) return default;
        return (T)Convert.ChangeType(result, typeof(T));
    }

    public List<Dictionary<string, object?>> Query(string sql, params (string name, object? value)[] parameters)
    {
        using var cmd = CreateCommand(sql);
        foreach (var (name, value) in parameters)
            cmd.Parameters.AddWithValue(name, value ?? DBNull.Value);

        var results = new List<Dictionary<string, object?>>();
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            var row = new Dictionary<string, object?>();
            for (int i = 0; i < reader.FieldCount; i++)
            {
                row[reader.GetName(i)] = reader.IsDBNull(i) ? null : reader.GetValue(i);
            }
            results.Add(row);
        }
        return results;
    }

    public void Dispose()
    {
        _connection.Dispose();
    }
}
```
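Both the Go analyzer's schema (`PRAGMA journal_mode=WAL;`) and this constructor set WAL mode. That double-set is harmless: journal mode is a persistent property of the database file, so a later `PRAGMA journal_mode=WAL` is a no-op, and every new connection sees WAL. A quick illustrative check with the sqlite3 CLI:

```shell
db=$(mktemp)
sqlite3 "$db" "PRAGMA journal_mode=WAL;"   # switches the file to WAL, prints "wal"
sqlite3 "$db" "PRAGMA journal_mode;"       # a fresh connection still reports "wal"
rm -f "$db" "$db-wal" "$db-shm"
```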

**Step 2: Write Schema.cs — schema initialization from the schema SQL file**
```csharp
namespace NatsNet.PortTracker.Data;

public static class Schema
{
    public static void Initialize(Database db, string schemaPath)
    {
        var sql = File.ReadAllText(schemaPath);
        db.Execute(sql);
    }
}
```
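This works because Microsoft.Data.Sqlite executes multi-statement batches in a single command, so the entire schema file can be run with one `Execute`. The equivalent manual step with the sqlite3 CLI, handy when debugging a schema change (a toy schema file, not the real one):

```shell
# Apply a schema file by piping it into sqlite3, then inspect the result.
db=$(mktemp)
cat > /tmp/demo-schema.sql << 'EOF'
CREATE TABLE IF NOT EXISTS modules (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
EOF
sqlite3 "$db" < /tmp/demo-schema.sql
sqlite3 "$db" "SELECT name FROM sqlite_master WHERE type='table';"   # prints: modules
rm -f "$db" /tmp/demo-schema.sql
```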

**Step 3: Write minimal Program.cs with init command**
```csharp
using System.CommandLine;
using NatsNet.PortTracker.Data;

// Note: this uses the System.CommandLine 2.0.0-beta4 API surface
// (Option constructor overloads, AddGlobalOption, SetHandler).

var dbOption = new Option<string>(
    "--db",
    getDefaultValue: () => Path.Combine(Directory.GetCurrentDirectory(), "porting.db"),
    description: "Path to the SQLite database file");

var schemaOption = new Option<string>(
    "--schema",
    getDefaultValue: () => Path.Combine(Directory.GetCurrentDirectory(), "porting-schema.sql"),
    description: "Path to the SQL schema file");

var rootCommand = new RootCommand("NATS .NET Porting Tracker");
rootCommand.AddGlobalOption(dbOption);
rootCommand.AddGlobalOption(schemaOption);

// init command
var initCommand = new Command("init", "Create or reset the database schema");
initCommand.SetHandler((string dbPath, string schemaPath) =>
{
    using var db = new Database(dbPath);
    Schema.Initialize(db, schemaPath);
    Console.WriteLine($"Database initialized at {dbPath}");
}, dbOption, schemaOption);

rootCommand.AddCommand(initCommand);

return await rootCommand.InvokeAsync(args);
```

**Step 4: Build and test init command**

```bash
cd tools/NatsNet.PortTracker && dotnet build
dotnet run -- init --db ../../porting.db --schema ../../porting-schema.sql
```

Expected: "Database initialized at ../../porting.db"

**Step 5: Commit**

```bash
git add tools/NatsNet.PortTracker/
git commit -m "feat(porttracker): add project scaffolding, DB layer, and init command"
```

---

### Task 7: .NET PortTracker — Module commands (list, show, update, map, set-na)

**Files:**
- Create: `tools/NatsNet.PortTracker/Commands/ModuleCommands.cs`
- Modify: `tools/NatsNet.PortTracker/Program.cs`

**Step 1: Write ModuleCommands.cs**
```csharp
using System.CommandLine;
using NatsNet.PortTracker.Data;

namespace NatsNet.PortTracker.Commands;

public static class ModuleCommands
{
    public static Command Create(Option<string> dbOption, Option<string> schemaOption)
    {
        var moduleCommand = new Command("module", "Manage modules");

        // list
        var listStatus = new Option<string?>("--status", "Filter by status");
        var listCmd = new Command("list", "List modules");
        listCmd.AddOption(listStatus);
        listCmd.SetHandler((string dbPath, string schemaPath, string? status) =>
        {
            using var db = new Database(dbPath);
            var sql = "SELECT id, name, status, go_package, go_line_count, dotnet_project, dotnet_class FROM modules";
            var parameters = new List<(string, object?)>();
            if (status is not null)
            {
                sql += " WHERE status = @status";
                parameters.Add(("@status", status));
            }
            sql += " ORDER BY name";

            var rows = db.Query(sql, parameters.ToArray());
            Console.WriteLine($"{"ID",-5} {"Name",-25} {"Status",-15} {"Go Pkg",-15} {"LOC",-8} {"DotNet Project",-25} {"DotNet Class",-20}");
            Console.WriteLine(new string('-', 113));
            foreach (var row in rows)
            {
                Console.WriteLine($"{row["id"],-5} {row["name"],-25} {row["status"],-15} {row["go_package"],-15} {row["go_line_count"],-8} {row["dotnet_project"] ?? "",-25} {row["dotnet_class"] ?? "",-20}");
            }
            Console.WriteLine($"\nTotal: {rows.Count} modules");
        }, dbOption, schemaOption, listStatus);

        // show
        var showId = new Argument<int>("id", "Module ID");
        var showCmd = new Command("show", "Show module details");
        showCmd.AddArgument(showId);
        showCmd.SetHandler((string dbPath, string schemaPath, int id) =>
        {
            using var db = new Database(dbPath);
            var modules = db.Query("SELECT * FROM modules WHERE id = @id", ("@id", id));
            if (modules.Count == 0)
            {
                Console.WriteLine($"Module {id} not found.");
                return;
            }
            var mod = modules[0];
            Console.WriteLine($"Module #{mod["id"]}: {mod["name"]}");
            Console.WriteLine($"  Status:     {mod["status"]}");
            Console.WriteLine($"  Go Package: {mod["go_package"]}");
            Console.WriteLine($"  Go File:    {mod["go_file"]}");
            Console.WriteLine($"  Go LOC:     {mod["go_line_count"]}");
            Console.WriteLine($"  .NET:       {mod["dotnet_project"]} / {mod["dotnet_namespace"]} / {mod["dotnet_class"]}");
            Console.WriteLine($"  Notes:      {mod["notes"]}");

            var features = db.Query(
                "SELECT id, name, status, go_method, dotnet_method FROM features WHERE module_id = @id ORDER BY name",
                ("@id", id));
            Console.WriteLine($"\n  Features ({features.Count}):");
            foreach (var f in features)
                Console.WriteLine($"    #{f["id"],-5} {f["name"],-35} {f["status"],-15} {f["dotnet_method"] ?? ""}");

            var tests = db.Query(
                "SELECT id, name, status, dotnet_method FROM unit_tests WHERE module_id = @id ORDER BY name",
                ("@id", id));
            Console.WriteLine($"\n  Tests ({tests.Count}):");
            foreach (var t in tests)
                Console.WriteLine($"    #{t["id"],-5} {t["name"],-35} {t["status"],-15} {t["dotnet_method"] ?? ""}");

            var deps = db.Query(
                "SELECT d.target_type, d.target_id, d.dependency_kind, m.name as target_name FROM dependencies d LEFT JOIN modules m ON d.target_type = 'module' AND d.target_id = m.id WHERE d.source_type = 'module' AND d.source_id = @id",
                ("@id", id));
            Console.WriteLine($"\n  Dependencies ({deps.Count}):");
            foreach (var d in deps)
                Console.WriteLine($"    -> {d["target_type"]} #{d["target_id"]} ({d["target_name"]}) [{d["dependency_kind"]}]");
        }, dbOption, schemaOption, showId);

        // update
        var updateId = new Argument<int>("id", "Module ID");
        var updateStatus = new Option<string>("--status", "New status") { IsRequired = true };
        var updateCmd = new Command("update", "Update module status");
        updateCmd.AddArgument(updateId);
        updateCmd.AddOption(updateStatus);
        updateCmd.SetHandler((string dbPath, string schemaPath, int id, string status) =>
        {
            using var db = new Database(dbPath);
            var affected = db.Execute("UPDATE modules SET status = @status WHERE id = @id",
                ("@status", status), ("@id", id));
            Console.WriteLine(affected > 0 ? $"Module {id} updated to '{status}'." : $"Module {id} not found.");
        }, dbOption, schemaOption, updateId, updateStatus);

        // map
        var mapId = new Argument<int>("id", "Module ID");
        var mapProject = new Option<string>("--project", "Target .NET project") { IsRequired = true };
        var mapNamespace = new Option<string?>("--namespace", "Target namespace");
        var mapClass = new Option<string?>("--class", "Target class");
        var mapCmd = new Command("map", "Map module to .NET project");
        mapCmd.AddArgument(mapId);
        mapCmd.AddOption(mapProject);
        mapCmd.AddOption(mapNamespace);
        mapCmd.AddOption(mapClass);
        mapCmd.SetHandler((string dbPath, string schemaPath, int id, string project, string? ns, string? cls) =>
        {
            using var db = new Database(dbPath);
            var affected = db.Execute(
                "UPDATE modules SET dotnet_project = @project, dotnet_namespace = @ns, dotnet_class = @cls WHERE id = @id",
                ("@project", project), ("@ns", ns), ("@cls", cls), ("@id", id));
            Console.WriteLine(affected > 0 ? $"Module {id} mapped to {project}." : $"Module {id} not found.");
        }, dbOption, schemaOption, mapId, mapProject, mapNamespace, mapClass);

        // set-na
        var naId = new Argument<int>("id", "Module ID");
        var naReason = new Option<string>("--reason", "Reason for N/A") { IsRequired = true };
        var naCmd = new Command("set-na", "Mark module as N/A");
        naCmd.AddArgument(naId);
        naCmd.AddOption(naReason);
        naCmd.SetHandler((string dbPath, string schemaPath, int id, string reason) =>
        {
            using var db = new Database(dbPath);
            var affected = db.Execute(
                "UPDATE modules SET status = 'n_a', notes = @reason WHERE id = @id",
                ("@reason", reason), ("@id", id));
            Console.WriteLine(affected > 0 ? $"Module {id} set to N/A: {reason}" : $"Module {id} not found.");
        }, dbOption, schemaOption, naId, naReason);

        moduleCommand.AddCommand(listCmd);
        moduleCommand.AddCommand(showCmd);
        moduleCommand.AddCommand(updateCmd);
        moduleCommand.AddCommand(mapCmd);
        moduleCommand.AddCommand(naCmd);

        return moduleCommand;
    }
}
```

**Step 2: Register in Program.cs**

Add a `using NatsNet.PortTracker.Commands;` directive at the top of Program.cs, then register the command after the init command registration:

```csharp
rootCommand.AddCommand(ModuleCommands.Create(dbOption, schemaOption));
```

**Step 3: Build and test**

```bash
cd tools/NatsNet.PortTracker && dotnet build
dotnet run -- module list --db ../../porting.db
dotnet run -- module show 1 --db ../../porting.db
```

**Step 4: Commit**

```bash
git add tools/NatsNet.PortTracker/
git commit -m "feat(porttracker): add module commands (list, show, update, map, set-na)"
```

---

### Task 8: .NET PortTracker — Feature commands

**Files:**
- Create: `tools/NatsNet.PortTracker/Commands/FeatureCommands.cs`
- Modify: `tools/NatsNet.PortTracker/Program.cs`

**Step 1: Write FeatureCommands.cs**

Same pattern as ModuleCommands, but for the `features` table. Adds a `--module` filter on `list` and an `--all-in-module` batch update on `update`.
```csharp
using System.CommandLine;
using NatsNet.PortTracker.Data;

namespace NatsNet.PortTracker.Commands;

public static class FeatureCommands
{
    public static Command Create(Option<string> dbOption, Option<string> schemaOption)
    {
        var featureCommand = new Command("feature", "Manage features");

        // list
        var listModule = new Option<int?>("--module", "Filter by module ID");
        var listStatus = new Option<string?>("--status", "Filter by status");
        var listCmd = new Command("list", "List features");
        listCmd.AddOption(listModule);
        listCmd.AddOption(listStatus);
        listCmd.SetHandler((string dbPath, string schemaPath, int? moduleId, string? status) =>
        {
            using var db = new Database(dbPath);
            var sql = "SELECT f.id, f.name, f.status, f.go_file, f.go_method, f.go_line_count, f.dotnet_class, f.dotnet_method, m.name as module_name FROM features f JOIN modules m ON f.module_id = m.id WHERE 1=1";
            var parameters = new List<(string, object?)>();
            if (moduleId is not null)
            {
                sql += " AND f.module_id = @moduleId";
                parameters.Add(("@moduleId", moduleId));
            }
            if (status is not null)
            {
                sql += " AND f.status = @status";
                parameters.Add(("@status", status));
            }
            sql += " ORDER BY m.name, f.name";

            var rows = db.Query(sql, parameters.ToArray());
            Console.WriteLine($"{"ID",-6} {"Module",-18} {"Name",-35} {"Status",-13} {"Go LOC",-8} {"DotNet",-30}");
            Console.WriteLine(new string('-', 110));
            foreach (var row in rows)
            {
                var dotnet = row["dotnet_class"] is not null ? $"{row["dotnet_class"]}.{row["dotnet_method"]}" : "";
                Console.WriteLine($"{row["id"],-6} {row["module_name"],-18} {Truncate(row["name"]?.ToString(), 35),-35} {row["status"],-13} {row["go_line_count"],-8} {Truncate(dotnet, 30),-30}");
            }
            Console.WriteLine($"\nTotal: {rows.Count} features");
        }, dbOption, schemaOption, listModule, listStatus);

        // show
        var showId = new Argument<int>("id", "Feature ID");
        var showCmd = new Command("show", "Show feature details");
        showCmd.AddArgument(showId);
        showCmd.SetHandler((string dbPath, string schemaPath, int id) =>
        {
            using var db = new Database(dbPath);
            var rows = db.Query("SELECT f.*, m.name as module_name FROM features f JOIN modules m ON f.module_id = m.id WHERE f.id = @id", ("@id", id));
            if (rows.Count == 0) { Console.WriteLine($"Feature {id} not found."); return; }
            var f = rows[0];
            Console.WriteLine($"Feature #{f["id"]}: {f["name"]}");
            Console.WriteLine($"  Module: {f["module_name"]} (#{f["module_id"]})");
            Console.WriteLine($"  Status: {f["status"]}");
            Console.WriteLine($"  Go:     {f["go_file"]}:{f["go_line_number"]} ({f["go_class"]}.{f["go_method"]}, {f["go_line_count"]} lines)");
            Console.WriteLine($"  .NET:   {f["dotnet_project"]} / {f["dotnet_class"]}.{f["dotnet_method"]}");
            Console.WriteLine($"  Notes:  {f["notes"]}");
        }, dbOption, schemaOption, showId);

        // update
        var updateId = new Argument<int>("id", "Feature ID");
        var updateStatus = new Option<string>("--status", "New status") { IsRequired = true };
        var updateAllInModule = new Option<int?>("--all-in-module", "Update all features in this module");
        var updateCmd = new Command("update", "Update feature status");
        updateCmd.AddArgument(updateId);
        updateCmd.AddOption(updateStatus);
        updateCmd.AddOption(updateAllInModule);
        updateCmd.SetHandler((string dbPath, string schemaPath, int id, string status, int? allInModule) =>
        {
            using var db = new Database(dbPath);
            if (allInModule is not null)
            {
                // When --all-in-module is given, the positional id is ignored.
                var affected = db.Execute("UPDATE features SET status = @status WHERE module_id = @mid",
                    ("@status", status), ("@mid", allInModule));
                Console.WriteLine($"Updated {affected} features in module {allInModule} to '{status}'.");
            }
            else
            {
                var affected = db.Execute("UPDATE features SET status = @status WHERE id = @id",
                    ("@status", status), ("@id", id));
                Console.WriteLine(affected > 0 ? $"Feature {id} updated to '{status}'." : $"Feature {id} not found.");
            }
        }, dbOption, schemaOption, updateId, updateStatus, updateAllInModule);

        // map
        var mapId = new Argument<int>("id", "Feature ID");
        var mapProject = new Option<string>("--project", "Target .NET project") { IsRequired = true };
        var mapClass = new Option<string>("--class", "Target class") { IsRequired = true };
        var mapMethod = new Option<string?>("--method", "Target method");
        var mapCmd = new Command("map", "Map feature to .NET class/method");
        mapCmd.AddArgument(mapId);
        mapCmd.AddOption(mapProject);
        mapCmd.AddOption(mapClass);
        mapCmd.AddOption(mapMethod);
        mapCmd.SetHandler((string dbPath, string schemaPath, int id, string project, string cls, string? method) =>
        {
            using var db = new Database(dbPath);
            var affected = db.Execute(
                "UPDATE features SET dotnet_project = @project, dotnet_class = @cls, dotnet_method = @method WHERE id = @id",
                ("@project", project), ("@cls", cls), ("@method", method), ("@id", id));
            Console.WriteLine(affected > 0 ? $"Feature {id} mapped to {cls}.{method}." : $"Feature {id} not found.");
        }, dbOption, schemaOption, mapId, mapProject, mapClass, mapMethod);

        // set-na
        var naId = new Argument<int>("id", "Feature ID");
        var naReason = new Option<string>("--reason", "Reason for N/A") { IsRequired = true };
        var naCmd = new Command("set-na", "Mark feature as N/A");
        naCmd.AddArgument(naId);
        naCmd.AddOption(naReason);
        naCmd.SetHandler((string dbPath, string schemaPath, int id, string reason) =>
        {
            using var db = new Database(dbPath);
            var affected = db.Execute("UPDATE features SET status = 'n_a', notes = @reason WHERE id = @id",
                ("@reason", reason), ("@id", id));
            Console.WriteLine(affected > 0 ? $"Feature {id} set to N/A: {reason}" : $"Feature {id} not found.");
        }, dbOption, schemaOption, naId, naReason);

        featureCommand.AddCommand(listCmd);
        featureCommand.AddCommand(showCmd);
        featureCommand.AddCommand(updateCmd);
        featureCommand.AddCommand(mapCmd);
        featureCommand.AddCommand(naCmd);

        return featureCommand;
    }

    private static string Truncate(string? s, int maxLen) =>
        s is null ? "" : s.Length <= maxLen ? s : s[..(maxLen - 3)] + "...";
}
```

**Step 2: Register in Program.cs**

```csharp
rootCommand.AddCommand(FeatureCommands.Create(dbOption, schemaOption));
```

**Step 3: Build and test**

```bash
cd tools/NatsNet.PortTracker && dotnet build
dotnet run -- feature list --db ../../porting.db
```

**Step 4: Commit**

```bash
git add tools/NatsNet.PortTracker/
git commit -m "feat(porttracker): add feature commands (list, show, update, map, set-na)"
```

---

### Task 9: .NET PortTracker — Test commands

**Files:**
- Create: `tools/NatsNet.PortTracker/Commands/TestCommands.cs`
- Modify: `tools/NatsNet.PortTracker/Program.cs`

Follow the exact same structure as FeatureCommands, but against the `unit_tests` table: `list`, `show`, `update`, and `map`. There is no `set-na` for tests, since a test follows its feature's status. Register with `rootCommand.AddCommand(TestCommands.Create(dbOption, schemaOption));`.
**Step 1: Write TestCommands.cs (same pattern as FeatureCommands)**

**Step 2: Register in Program.cs**

**Step 3: Build and test**

```bash
cd tools/NatsNet.PortTracker && dotnet build
dotnet run -- test list --db ../../porting.db
```

**Step 4: Commit**

```bash
git add tools/NatsNet.PortTracker/
git commit -m "feat(porttracker): add test commands (list, show, update, map)"
```

---
### Task 10: .NET PortTracker — Library commands

**Files:**
- Create: `tools/NatsNet.PortTracker/Commands/LibraryCommands.cs`
- Modify: `tools/NatsNet.PortTracker/Program.cs`

**Step 1: Write LibraryCommands.cs**

Commands: `list`, `map`, `suggest`. The `suggest` command shows all unmapped libraries.
```csharp
using System.CommandLine;
using NatsNet.PortTracker.Data;

namespace NatsNet.PortTracker.Commands;

public static class LibraryCommands
{
    public static Command Create(Option<string> dbOption, Option<string> schemaOption)
    {
        var libCommand = new Command("library", "Manage library mappings");

        // list
        var listStatus = new Option<string?>("--status", "Filter by status");
        var listCmd = new Command("list", "List library mappings");
        listCmd.AddOption(listStatus);
        listCmd.SetHandler((string dbPath, string schemaPath, string? status) =>
        {
            using var db = new Database(dbPath);
            var sql = "SELECT id, go_import_path, go_library_name, dotnet_package, dotnet_namespace, status FROM library_mappings";
            var parameters = new List<(string, object?)>();
            if (status is not null)
            {
                sql += " WHERE status = @status";
                parameters.Add(("@status", status));
            }
            sql += " ORDER BY go_import_path";

            var rows = db.Query(sql, parameters.ToArray());
            Console.WriteLine($"{"ID",-5} {"Go Import",-45} {"Status",-12} {"DotNet Package",-30} {"DotNet Namespace",-25}");
            Console.WriteLine(new string('-', 117));
            foreach (var row in rows)
                Console.WriteLine($"{row["id"],-5} {row["go_import_path"],-45} {row["status"],-12} {row["dotnet_package"] ?? "",-30} {row["dotnet_namespace"] ?? "",-25}");
            Console.WriteLine($"\nTotal: {rows.Count} libraries");
        }, dbOption, schemaOption, listStatus);

        // map
        var mapId = new Argument<int>("id", "Library mapping ID");
        var mapPackage = new Option<string>("--package", "NuGet package or BCL") { IsRequired = true };
        var mapNamespace = new Option<string?>("--namespace", "Target namespace");
        var mapNotes = new Option<string?>("--notes", "Usage notes");
        var mapCmd = new Command("map", "Map Go library to .NET equivalent");
        mapCmd.AddArgument(mapId);
        mapCmd.AddOption(mapPackage);
        mapCmd.AddOption(mapNamespace);
        mapCmd.AddOption(mapNotes);
        mapCmd.SetHandler((string dbPath, string schemaPath, int id, string package_, string? ns, string? notes) =>
        {
            using var db = new Database(dbPath);
            var affected = db.Execute(
                "UPDATE library_mappings SET dotnet_package = @pkg, dotnet_namespace = @ns, dotnet_usage_notes = @notes, status = 'mapped' WHERE id = @id",
                ("@pkg", package_), ("@ns", ns), ("@notes", notes), ("@id", id));
            Console.WriteLine(affected > 0 ? $"Library {id} mapped to {package_}." : $"Library {id} not found.");
        }, dbOption, schemaOption, mapId, mapPackage, mapNamespace, mapNotes);

        // suggest
        var suggestCmd = new Command("suggest", "Show unmapped libraries");
        suggestCmd.SetHandler((string dbPath, string schemaPath) =>
        {
            using var db = new Database(dbPath);
            var rows = db.Query("SELECT id, go_import_path, go_library_name FROM library_mappings WHERE status = 'not_mapped' ORDER BY go_import_path");
            if (rows.Count == 0)
            {
                Console.WriteLine("All libraries are mapped!");
                return;
            }
            Console.WriteLine($"Unmapped libraries ({rows.Count}):\n");
            foreach (var row in rows)
                Console.WriteLine($"  #{row["id"],-5} {row["go_import_path"]}");
        }, dbOption, schemaOption);

        libCommand.AddCommand(listCmd);
        libCommand.AddCommand(mapCmd);
        libCommand.AddCommand(suggestCmd);

        return libCommand;
    }
}
```

**Step 2: Register in Program.cs**

**Step 3: Build, test, commit**

```bash
cd tools/NatsNet.PortTracker && dotnet build
dotnet run -- library list --db ../../porting.db
dotnet run -- library suggest --db ../../porting.db
git add tools/NatsNet.PortTracker/
git commit -m "feat(porttracker): add library commands (list, map, suggest)"
```

---

### Task 11: .NET PortTracker — Dependency commands

**Files:**
- Create: `tools/NatsNet.PortTracker/Commands/DependencyCommands.cs`
- Modify: `tools/NatsNet.PortTracker/Program.cs`

**Step 1: Write DependencyCommands.cs**

Commands: `show <type> <id>`, `blocked`, `ready`.

The `ready` command is the most important — it queries for items whose dependencies are ALL at status `complete`, `verified`, or `n_a`, meaning they can be worked on.

The `blocked` command shows items that have at least one dependency still at `not_started` or `stub`.
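The readiness predicate ("every dependency is complete, verified, or n_a") can be prototyped directly in sqlite3 before wiring it into C#. Below is a simplified sketch on a toy dataset with module-to-module dependencies only (table and column names mirror the plan's schema; the real query also unions features and unit tests):

```shell
db=$(mktemp)
sqlite3 "$db" << 'EOF'
CREATE TABLE modules (id INTEGER PRIMARY KEY, name TEXT, status TEXT);
CREATE TABLE dependencies (source_type TEXT, source_id INTEGER, target_type TEXT, target_id INTEGER);
INSERT INTO modules VALUES (1, 'auth', 'not_started'), (2, 'sublist', 'complete'), (3, 'routes', 'not_started');
-- auth depends on sublist (done); routes depends on auth (not done)
INSERT INTO dependencies VALUES ('module', 1, 'module', 2), ('module', 3, 'module', 1);
-- "ready": not_started modules with no unfinished dependency
SELECT name FROM modules m
WHERE m.status = 'not_started'
  AND NOT EXISTS (
    SELECT 1 FROM dependencies d
    JOIN modules t ON t.id = d.target_id AND d.target_type = 'module'
    WHERE d.source_type = 'module' AND d.source_id = m.id
      AND t.status NOT IN ('complete', 'verified', 'n_a'));
EOF
rm -f "$db"
```

Only `auth` is printed: its single dependency is `complete`, while `routes` is blocked on `auth`.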

```csharp
using System.CommandLine;
using NatsNet.PortTracker.Data;

namespace NatsNet.PortTracker.Commands;

public static class DependencyCommands
{
    public static Command Create(Option<string> dbOption, Option<string> schemaOption)
    {
        var depCommand = new Command("dependency", "Manage dependencies");

        // show
        var showType = new Argument<string>("type", "Item type: module, feature, or unit_test");
        var showId = new Argument<int>("id", "Item ID");
        var showCmd = new Command("show", "Show dependencies for an item");
        showCmd.AddArgument(showType);
        showCmd.AddArgument(showId);
        showCmd.SetHandler((string dbPath, string schemaPath, string type, int id) =>
        {
            using var db = new Database(dbPath);

            // Items this depends on
            var deps = db.Query(
                "SELECT target_type, target_id, dependency_kind FROM dependencies WHERE source_type = @type AND source_id = @id",
                ("@type", type), ("@id", id));
            Console.WriteLine($"Dependencies of {type} #{id} ({deps.Count}):");
            foreach (var d in deps)
            {
                var targetName = ResolveItemName(db, d["target_type"]!.ToString()!, Convert.ToInt32(d["target_id"]));
                Console.WriteLine($"  -> {d["target_type"]} #{d["target_id"]} ({targetName}) [{d["dependency_kind"]}]");
            }

            // Items that depend on this
            var rdeps = db.Query(
                "SELECT source_type, source_id, dependency_kind FROM dependencies WHERE target_type = @type AND target_id = @id",
                ("@type", type), ("@id", id));
            Console.WriteLine($"\nDepended on by ({rdeps.Count}):");
            foreach (var d in rdeps)
            {
                var srcName = ResolveItemName(db, d["source_type"]!.ToString()!, Convert.ToInt32(d["source_id"]));
                Console.WriteLine($"  <- {d["source_type"]} #{d["source_id"]} ({srcName}) [{d["dependency_kind"]}]");
            }
        }, dbOption, schemaOption, showType, showId);

        // blocked
        var blockedCmd = new Command("blocked", "Show items with unported dependencies");
        blockedCmd.SetHandler((string dbPath, string schemaPath) =>
        {
            using var db = new Database(dbPath);
            // Items that have at least one dependency not yet complete/verified/n_a
            var sql = @"
                SELECT DISTINCT d.source_type, d.source_id,
                    CASE d.source_type
                        WHEN 'module' THEN (SELECT name FROM modules WHERE id = d.source_id)
                        WHEN 'feature' THEN (SELECT name FROM features WHERE id = d.source_id)
                        WHEN 'unit_test' THEN (SELECT name FROM unit_tests WHERE id = d.source_id)
                    END as name,
                    COUNT(*) as blocking_count
                FROM dependencies d
                WHERE EXISTS (
                    SELECT 1 FROM (
                        SELECT 'module' as type, id, status FROM modules
                        UNION ALL SELECT 'feature', id, status FROM features
                        UNION ALL SELECT 'unit_test', id, status FROM unit_tests
                    ) items
                    WHERE items.type = d.target_type AND items.id = d.target_id
                      AND items.status NOT IN ('complete', 'verified', 'n_a')
                )
                GROUP BY d.source_type, d.source_id
                ORDER BY d.source_type, blocking_count DESC";
            var rows = db.Query(sql);
            Console.WriteLine($"Blocked items ({rows.Count}):\n");
            Console.WriteLine($"{"Type",-12} {"ID",-6} {"Name",-35} {"Blocking Deps",-15}");
            Console.WriteLine(new string('-', 68));
            foreach (var row in rows)
                Console.WriteLine($"{row["source_type"],-12} {row["source_id"],-6} {row["name"],-35} {row["blocking_count"],-15}");
        }, dbOption, schemaOption);

        // ready
        var readyCmd = new Command("ready", "Show items ready to port (all deps satisfied)");
        readyCmd.SetHandler((string dbPath, string schemaPath) =>
        {
            using var db = new Database(dbPath);
            // Items with status not_started or stub whose dependencies are ALL complete/verified/n_a
            // (or items with no dependencies at all)
            var sql = @"
                SELECT 'module' as type, id, name, status FROM modules
                WHERE status IN ('not_started', 'stub')
                  AND NOT EXISTS (
                    SELECT 1 FROM dependencies d
                    JOIN (
                        SELECT 'module' as type, id, status FROM modules
                        UNION ALL SELECT 'feature', id, status FROM features
                        UNION ALL SELECT 'unit_test', id, status FROM unit_tests
                    ) items ON items.type = d.target_type AND items.id = d.target_id
                    WHERE d.source_type = 'module' AND d.source_id = modules.id
                      AND items.status NOT IN ('complete', 'verified', 'n_a')
                  )
                UNION ALL
                SELECT 'feature', id, name, status FROM features
                WHERE status IN ('not_started', 'stub')
                  AND NOT EXISTS (
                    SELECT 1 FROM dependencies d
                    JOIN (
                        SELECT 'module' as type, id, status FROM modules
                        UNION ALL SELECT 'feature', id, status FROM features
                        UNION ALL SELECT 'unit_test', id, status FROM unit_tests
                    ) items ON items.type = d.target_type AND items.id = d.target_id
                    WHERE d.source_type = 'feature' AND d.source_id = features.id
                      AND items.status NOT IN ('complete', 'verified', 'n_a')
                  )
                ORDER BY type, name";
            var rows = db.Query(sql);
            Console.WriteLine($"Items ready to port ({rows.Count}):\n");
            Console.WriteLine($"{"Type",-12} {"ID",-6} {"Name",-40} {"Status",-15}");
            Console.WriteLine(new string('-', 73));
            foreach (var row in rows)
                Console.WriteLine($"{row["type"],-12} {row["id"],-6} {row["name"],-40} {row["status"],-15}");
        }, dbOption, schemaOption);

        depCommand.AddCommand(showCmd);
        depCommand.AddCommand(blockedCmd);
        depCommand.AddCommand(readyCmd);

        return depCommand;
    }

    private static string ResolveItemName(Database db, string type, int id)
    {
        var table = type switch
        {
            "module" => "modules",
            "feature" => "features",
            "unit_test" => "unit_tests",
            _ => "modules"
        };
        return db.ExecuteScalar<string>($"SELECT name FROM {table} WHERE id = @id", ("@id", id)) ?? "?";
    }
}
```

**Step 2: Register in Program.cs**

**Step 3: Build, test, commit**

```bash
cd tools/NatsNet.PortTracker && dotnet build
dotnet run -- dependency ready --db ../../porting.db
dotnet run -- dependency blocked --db ../../porting.db
git add tools/NatsNet.PortTracker/
git commit -m "feat(porttracker): add dependency commands (show, blocked, ready)"
```

---

### Task 12: .NET PortTracker — Report commands

**Files:**
- Create: `tools/NatsNet.PortTracker/Reporting/ReportGenerator.cs`
- Create: `tools/NatsNet.PortTracker/Commands/ReportCommands.cs`
- Modify: `tools/NatsNet.PortTracker/Program.cs`

**Step 1: Write ReportGenerator.cs**

```csharp
using NatsNet.PortTracker.Data;

namespace NatsNet.PortTracker.Reporting;

public class ReportGenerator
{
    private readonly Database _db;

    public ReportGenerator(Database db) => _db = db;

    public void PrintSummary()
    {
        PrintTableSummary("Modules", "modules");
        PrintTableSummary("Features", "features");
        PrintTableSummary("Unit Tests", "unit_tests");
        PrintTableSummary("Libraries", "library_mappings");
    }

    private void PrintTableSummary(string label, string table)
    {
        // All tracked tables share the same status column, so no per-table indirection is needed.
        var rows = _db.Query($"SELECT status, COUNT(*) as count FROM {table} GROUP BY status ORDER BY status");
        var total = rows.Sum(r => Convert.ToInt32(r["count"]));

        Console.WriteLine($"\n{label} ({total} total):");
        foreach (var row in rows)
        {
            var count = Convert.ToInt32(row["count"]);
            var pct = total > 0 ? (count * 100.0 / total).ToString("F1") : "0.0";
            Console.WriteLine($"  {row["status"],-15} {count,6} ({pct}%)");
        }
    }

    public string ExportMarkdown()
    {
        var sb = new System.Text.StringBuilder();
        sb.AppendLine("# Porting Status Report");
        sb.AppendLine($"\nGenerated: {DateTime.UtcNow:yyyy-MM-dd HH:mm} UTC\n");

        AppendMarkdownSection(sb, "Modules", "modules");
        AppendMarkdownSection(sb, "Features", "features");
        AppendMarkdownSection(sb, "Unit Tests", "unit_tests");
        AppendMarkdownSection(sb, "Library Mappings", "library_mappings");

        // Per-module breakdown
        sb.AppendLine("\n## Per-Module Breakdown\n");
        sb.AppendLine("| Module | Features | Tests | Status |");
        sb.AppendLine("|--------|----------|-------|--------|");
        var modules = _db.Query("SELECT id, name, status FROM modules ORDER BY name");
        foreach (var mod in modules)
        {
            var fCount = _db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE module_id = @id", ("@id", mod["id"]!));
            var tCount = _db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE module_id = @id", ("@id", mod["id"]!));
            sb.AppendLine($"| {mod["name"]} | {fCount} | {tCount} | {mod["status"]} |");
        }

        return sb.ToString();
    }

    private void AppendMarkdownSection(System.Text.StringBuilder sb, string label, string table)
    {
        sb.AppendLine($"\n## {label}\n");
        sb.AppendLine("| Status | Count | % |");
        sb.AppendLine("|--------|-------|---|");
        var rows = _db.Query($"SELECT status, COUNT(*) as count FROM {table} GROUP BY status ORDER BY status");
        var total = rows.Sum(r => Convert.ToInt32(r["count"]));
        foreach (var row in rows)
        {
            var count = Convert.ToInt32(row["count"]);
            var pct = total > 0 ? (count * 100.0 / total).ToString("F1") : "0.0";
            sb.AppendLine($"| {row["status"]} | {count} | {pct}% |");
        }
        sb.AppendLine($"| **Total** | **{total}** | |");
    }
}
```

**Step 2: Write ReportCommands.cs**

```csharp
using System.CommandLine;
using NatsNet.PortTracker.Data;
using NatsNet.PortTracker.Reporting;

namespace NatsNet.PortTracker.Commands;

public static class ReportCommands
{
    public static Command Create(Option<string> dbOption, Option<string> schemaOption)
    {
        var reportCommand = new Command("report", "Generate reports");

        // summary
        var summaryCmd = new Command("summary", "Show overall progress");
        summaryCmd.SetHandler((string dbPath, string schemaPath) =>
        {
            using var db = new Database(dbPath);
            var report = new ReportGenerator(db);
            report.PrintSummary();
        }, dbOption, schemaOption);

        // export
        var exportFormat = new Option<string>("--format", getDefaultValue: () => "md", "Export format (md)");
        var exportOutput = new Option<string?>("--output", "Output file path (stdout if not set)");
        var exportCmd = new Command("export", "Export status report");
        exportCmd.AddOption(exportFormat);
        exportCmd.AddOption(exportOutput);
        exportCmd.SetHandler((string dbPath, string schemaPath, string format, string? output) =>
        {
            using var db = new Database(dbPath);
            var report = new ReportGenerator(db);
            var md = report.ExportMarkdown();
            if (output is not null)
            {
                File.WriteAllText(output, md);
                Console.WriteLine($"Report exported to {output}");
            }
            else
            {
                Console.Write(md);
            }
        }, dbOption, schemaOption, exportFormat, exportOutput);

        reportCommand.AddCommand(summaryCmd);
        reportCommand.AddCommand(exportCmd);

        return reportCommand;
    }
}
```

**Step 3: Register in Program.cs, build, test, commit**

```bash
cd tools/NatsNet.PortTracker && dotnet build
dotnet run -- report summary --db ../../porting.db
git add tools/NatsNet.PortTracker/
git commit -m "feat(porttracker): add report commands (summary, export)"
```

---

### Task 13: .NET PortTracker — Phase commands

**Files:**
- Create: `tools/NatsNet.PortTracker/Commands/PhaseCommands.cs`
- Modify: `tools/NatsNet.PortTracker/Program.cs`

**Step 1: Write PhaseCommands.cs**

The `list` command shows all 7 phases with a calculated status. The `check` command runs verification queries for a specific phase.

```csharp
using System.CommandLine;
using NatsNet.PortTracker.Data;

namespace NatsNet.PortTracker.Commands;

public static class PhaseCommands
{
    private static readonly (int Number, string Name, string Description)[] Phases =
    [
        (1, "Decomposition", "Break down Go codebase into modules, features, tests, dependencies"),
        (2, "Verification", "Verify all items captured and dependencies populated"),
        (3, "Library Mapping", "Map Go libraries to .NET equivalents"),
        (4, ".NET Design", "Design .NET solution and map items"),
        (5, "Mapping Verification", "Verify all Go items mapped to .NET"),
        (6, "Porting", "Port Go code to .NET"),
        (7, "Port Verification", "Verify ported code via targeted testing"),
    ];

    public static Command Create(Option<string> dbOption, Option<string> schemaOption)
    {
        var phaseCommand = new Command("phase", "Manage phases");

        // list
        var listCmd = new Command("list", "Show all phases and their status");
        listCmd.SetHandler((string dbPath, string schemaPath) =>
        {
            using var db = new Database(dbPath);
            Console.WriteLine($"{"#",-4} {"Phase",-25} {"Status",-15} {"Description"}");
            Console.WriteLine(new string('-', 85));
            foreach (var (num, name, desc) in Phases)
            {
                var status = GetPhaseStatus(db, num);
                Console.WriteLine($"{num,-4} {name,-25} {status,-15} {desc}");
            }
        }, dbOption, schemaOption);

        // check
        var checkNum = new Argument<int>("phase", "Phase number (1-7)");
        var checkCmd = new Command("check", "Run verification for a phase");
        checkCmd.AddArgument(checkNum);
        checkCmd.SetHandler((string dbPath, string schemaPath, int phase) =>
        {
            using var db = new Database(dbPath);
            RunPhaseCheck(db, phase);
        }, dbOption, schemaOption, checkNum);

        phaseCommand.AddCommand(listCmd);
        phaseCommand.AddCommand(checkCmd);

        return phaseCommand;
    }

    private static string GetPhaseStatus(Database db, int phase)
    {
        return phase switch
        {
            1 => db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules") > 0 ? "done" : "pending",
            2 => db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules") > 0 ? "ready" : "blocked",
            3 => db.ExecuteScalar<long>("SELECT COUNT(*) FROM library_mappings WHERE status = 'not_mapped'") == 0
                 && db.ExecuteScalar<long>("SELECT COUNT(*) FROM library_mappings") > 0 ? "done" : "pending",
            4 => db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules WHERE dotnet_project IS NULL AND status != 'n_a'") == 0
                 && db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules") > 0 ? "done" : "pending",
            5 => db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE dotnet_class IS NULL AND status != 'n_a'") == 0
                 && db.ExecuteScalar<long>("SELECT COUNT(*) FROM features") > 0 ? "done" : "pending",
            6 => db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE status NOT IN ('complete', 'verified', 'n_a')") == 0
                 && db.ExecuteScalar<long>("SELECT COUNT(*) FROM features") > 0 ? "done" : "pending",
            7 => db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE status != 'verified' AND status != 'n_a'") == 0
                 && db.ExecuteScalar<long>("SELECT COUNT(*) FROM features") > 0 ? "done" : "pending",
            _ => "unknown"
        };
    }

    private static void RunPhaseCheck(Database db, int phase)
    {
        Console.WriteLine($"Phase {phase} verification:\n");
        switch (phase)
        {
            case 1:
                var mods = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules");
                var feats = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features");
                var tests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests");
                var deps = db.ExecuteScalar<long>("SELECT COUNT(*) FROM dependencies");
                Console.WriteLine($"  Modules: {mods}");
                Console.WriteLine($"  Features: {feats}");
                Console.WriteLine($"  Unit Tests: {tests}");
                Console.WriteLine($"  Dependencies: {deps}");
                Console.WriteLine(mods > 0 && feats > 0 ? "\n  PASS: Data populated" : "\n  FAIL: Missing data");
                break;
            case 2:
                var orphanFeatures = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE module_id NOT IN (SELECT id FROM modules)");
                var orphanTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE module_id NOT IN (SELECT id FROM modules)");
                Console.WriteLine($"  Orphaned features: {orphanFeatures}");
                Console.WriteLine($"  Orphaned tests: {orphanTests}");
                Console.WriteLine(orphanFeatures == 0 && orphanTests == 0 ? "\n  PASS: No orphans" : "\n  FAIL: Orphaned items found");
                break;
            case 3:
                var unmapped = db.ExecuteScalar<long>("SELECT COUNT(*) FROM library_mappings WHERE status = 'not_mapped'");
                var total = db.ExecuteScalar<long>("SELECT COUNT(*) FROM library_mappings");
                Console.WriteLine($"  Total libraries: {total}");
                Console.WriteLine($"  Unmapped: {unmapped}");
                Console.WriteLine(unmapped == 0 ? "\n  PASS: All libraries mapped" : $"\n  FAIL: {unmapped} libraries unmapped");
                break;
            case 4:
            case 5:
                var unmappedMods = db.ExecuteScalar<long>("SELECT COUNT(*) FROM modules WHERE dotnet_project IS NULL AND status != 'n_a'");
                var unmappedFeats = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE dotnet_class IS NULL AND status != 'n_a'");
                var unmappedTests = db.ExecuteScalar<long>("SELECT COUNT(*) FROM unit_tests WHERE dotnet_class IS NULL AND status != 'n_a'");
                Console.WriteLine($"  Unmapped modules: {unmappedMods}");
                Console.WriteLine($"  Unmapped features: {unmappedFeats}");
                Console.WriteLine($"  Unmapped tests: {unmappedTests}");
                var allMapped = unmappedMods == 0 && unmappedFeats == 0 && unmappedTests == 0;
                Console.WriteLine(allMapped ? "\n  PASS: All items mapped" : "\n  FAIL: Unmapped items remain");
                break;
            case 6:
                var notPortedF = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE status NOT IN ('complete', 'verified', 'n_a')");
                var totalF = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features");
                Console.WriteLine($"  Features not ported: {notPortedF} / {totalF}");
                Console.WriteLine(notPortedF == 0 ? "\n  PASS: All features ported" : $"\n  FAIL: {notPortedF} features remaining");
                break;
            case 7:
                var notVerified = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features WHERE status NOT IN ('verified', 'n_a')");
                var totalV = db.ExecuteScalar<long>("SELECT COUNT(*) FROM features");
                Console.WriteLine($"  Features not verified: {notVerified} / {totalV}");
                Console.WriteLine(notVerified == 0 ? "\n  PASS: All features verified" : $"\n  FAIL: {notVerified} features not verified");
                break;
            default:
                Console.WriteLine($"  Unknown phase: {phase}");
                break;
        }
    }
}
```

**Step 2: Register in Program.cs, build, test, commit**

```bash
cd tools/NatsNet.PortTracker && dotnet build
dotnet run -- phase list --db ../../porting.db
dotnet run -- phase check 1 --db ../../porting.db
git add tools/NatsNet.PortTracker/
git commit -m "feat(porttracker): add phase commands (list, check)"
```

---

### Task 14: Write Phase 1-3 instruction documents

**Files:**
- Create: `docs/plans/phases/phase-1-decomposition.md`
- Create: `docs/plans/phases/phase-2-verification.md`
- Create: `docs/plans/phases/phase-3-library-mapping.md`

**Step 1: Write phase-1-decomposition.md**

Full step-by-step guide for breaking down the Go codebase. Includes exact commands to run, expected output patterns, and troubleshooting.

Key sections:
- Prerequisites (Go toolchain, SQLite, built tools)
- Step-by-step: init DB, run analyzer, review modules, spot-check features, verify tests, review deps
- Verification: `porttracker phase check 1`
- Troubleshooting: common issues with Go parsing
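
The spot-check step can be scripted directly with `sqlite3`. A minimal sketch of the sampling pattern, shown on a throwaway database so the command shape is self-contained; against the real data, point `sqlite3` at `porting.db` instead (only columns the plan's own queries already use: `id`, `name`, `status`):

```shell
# Sample a few captured rows at random, then compare each against the Go source
# by hand. The scratch DB here just demonstrates the query shape.
db=$(mktemp)
sqlite3 "$db" "CREATE TABLE features(id INTEGER PRIMARY KEY, name TEXT, status TEXT);"
sqlite3 "$db" "INSERT INTO features(name, status) VALUES ('accept_loop','not_started'), ('parse_proto','not_started');"
sqlite3 "$db" "SELECT id, name, status FROM features ORDER BY RANDOM() LIMIT 5;"
rm -f "$db"
```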

**Step 2: Write phase-2-verification.md**

Cross-checking captured data against baselines. Includes exact shell commands for counting functions/files in Go source to compare against DB counts.
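
The counting side of that cross-check can be a one-liner. A sketch assuming the `func TestXxx(` naming convention, demonstrated on a throwaway tree so it runs anywhere; point the same grep at the real source directory and compare the number with `SELECT COUNT(*) FROM unit_tests`:

```shell
# Count Go test functions under a tree; run against the NATS source, this
# should roughly match the unit_tests row count (TestMain inflates it by one).
src=$(mktemp -d)
printf 'func TestParse(t *testing.T) {}\nfunc TestRoute(t *testing.T) {}\n' > "$src/server_test.go"
grep -rEh '^func Test[A-Za-z0-9_]*\(' --include='*_test.go' "$src" | wc -l
rm -rf "$src"
```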

**Step 3: Write phase-3-library-mapping.md**

Guide for mapping each Go import to .NET. Includes a reference table of common Go stdlib -> .NET BCL mappings as a starting point.
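
A few well-known pairings the reference table could seed from (illustrative starting points; each must still be verified against the actual import list):

| Go import | .NET starting point |
|-----------|---------------------|
| `encoding/json` | `System.Text.Json` |
| `crypto/tls` | `System.Net.Security.SslStream` |
| `net` | `System.Net.Sockets` |
| `sync` | `System.Threading` primitives |
| `time` | `System.DateTime` / `System.Threading.PeriodicTimer` |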

**Step 4: Commit**

```bash
git add docs/plans/phases/
git commit -m "docs: add phase 1-3 instruction guides"
```

---

### Task 15: Write Phase 4-7 instruction documents

**Files:**
- Create: `docs/plans/phases/phase-4-dotnet-design.md`
- Create: `docs/plans/phases/phase-5-mapping-verification.md`
- Create: `docs/plans/phases/phase-6-porting.md`
- Create: `docs/plans/phases/phase-7-porting-verification.md`

**Step 1: Write phase-4-dotnet-design.md**

Guide for designing the .NET solution structure, mapping modules/features/tests. References `documentation_rules.md` for naming conventions.

**Step 2: Write phase-5-mapping-verification.md**

Verification checklist: all items mapped or N/A with justification, naming validated, no collisions.

**Step 3: Write phase-6-porting.md**

Detailed porting workflow: dependency-ordered work, DB update discipline, commit patterns. Includes exact `porttracker` commands for the workflow loop.
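
One iteration of that loop might look like the following sketch (the `feature update` status subcommand and the item id are hypothetical placeholders; substitute whatever the CLI's feature command group actually exposes):

```bash
dotnet run -- dependency ready --db ../../porting.db      # 1. list unblocked items
# 2. port the chosen feature in the .NET solution, then record it:
dotnet run -- feature update 42 --status complete --db ../../porting.db   # hypothetical subcommand
git add -A && git commit -m "feat: port feature #42"
```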

**Step 4: Write phase-7-porting-verification.md**

Targeted testing guide: per-module test execution using `dotnet test --filter`, status update flow, behavioral comparison approach.
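
Per-module selection relies on `dotnet test`'s built-in filter expressions, where `~` performs a substring match on the fully qualified test name (the test class name below is a hypothetical example):

```bash
dotnet test --filter "FullyQualifiedName~SublistTests"
```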

**Step 5: Commit**

```bash
git add docs/plans/phases/
git commit -m "docs: add phase 4-7 instruction guides"
```

---

### Task 16: Final integration test and cleanup

**Files:**
- Modify: `tools/NatsNet.PortTracker/Program.cs` (ensure all commands registered)

**Step 1: Clean build of both tools**

```bash
cd tools/go-analyzer && go build -o go-analyzer .
cd ../../tools/NatsNet.PortTracker && dotnet build
```

**Step 2: End-to-end test**

```bash
# Delete any existing DB (including WAL sidecar files, since the schema enables WAL mode)
rm -f porting.db porting.db-*

# Init
cd tools/NatsNet.PortTracker
dotnet run -- init --db ../../porting.db --schema ../../porting-schema.sql

# Run Go analyzer
cd ../go-analyzer
./go-analyzer --source ../../golang/nats-server --db ../../porting.db --schema ../../porting-schema.sql

# Test all porttracker commands
cd ../NatsNet.PortTracker
dotnet run -- module list --db ../../porting.db
dotnet run -- feature list --db ../../porting.db | head -20
dotnet run -- test list --db ../../porting.db | head -20
dotnet run -- library suggest --db ../../porting.db
dotnet run -- dependency ready --db ../../porting.db
dotnet run -- report summary --db ../../porting.db
dotnet run -- phase list --db ../../porting.db
dotnet run -- phase check 1 --db ../../porting.db
dotnet run -- report export --format md --output ../../porting-status.md --db ../../porting.db
```

**Step 3: Verify DB contents via sqlite3**

```bash
sqlite3 ../../porting.db ".tables"
sqlite3 ../../porting.db "SELECT COUNT(*) FROM modules;"
sqlite3 ../../porting.db "SELECT COUNT(*) FROM features;"
sqlite3 ../../porting.db "SELECT COUNT(*) FROM unit_tests;"
```

**Step 4: Final commit**

```bash
git add -A
git commit -m "feat: complete porting tracker tooling — Go analyzer, .NET CLI, phase guides"
```