refactor(docs): move requirements and test infra docs into docs/ subdirectories

Organize documentation by moving requirements (HighLevelReqs, Component-*,
lmxproxy_protocol) to docs/requirements/ and test infrastructure docs to
docs/test_infra/. Updates all cross-references in README, CLAUDE.md,
infra/README, component docs, and 23 plan files.
Author: Joseph Doherty
Date: 2026-03-21 01:11:35 -04:00
Parent: 0a85a839a2
Commit: d91aa83665
52 changed files with 486 additions and 124 deletions

# Test Infrastructure
This document describes the local Docker-based test infrastructure for ScadaLink development. Seven services provide the external dependencies needed to run and test the system locally. The first six run in `infra/docker-compose.yml`; Traefik runs alongside the cluster nodes in `docker/docker-compose.yml`.
## Services
| Service | Image | Port(s) | Config | Compose File |
|---------|-------|---------|--------|-------------|
| OPC UA Server | `mcr.microsoft.com/iotedge/opc-plc:latest` | 50000 (OPC UA), 8080 (web) | `infra/opcua/nodes.json` | `infra/` |
| LDAP Server | `glauth/glauth:latest` | 3893 | `infra/glauth/config.toml` | `infra/` |
| MS SQL 2022 | `mcr.microsoft.com/mssql/server:2022-latest` | 1433 | `infra/mssql/setup.sql` | `infra/` |
| SMTP (Mailpit) | `axllent/mailpit:latest` | 1025 (SMTP), 8025 (web) | Environment vars | `infra/` |
| REST API (Flask) | Custom build (`infra/restapi/Dockerfile`) | 5200 | `infra/restapi/app.py` | `infra/` |
| LmxFakeProxy | Custom build (`infra/lmxfakeproxy/Dockerfile`) | 50051 (gRPC) | Environment vars | `infra/` |
| Traefik LB | `traefik:v3.4` | 9000 (proxy), 8180 (dashboard) | `docker/traefik/` | `docker/` |
## Quick Start
```bash
cd infra
docker compose up -d
```
After the first startup, run the SQL setup and seed scripts:
```bash
docker exec -i scadalink-mssql /opt/mssql-tools18/bin/sqlcmd \
-S localhost -U sa -P 'ScadaLink_Dev1#' -C \
-i /docker-entrypoint-initdb.d/setup.sql
docker exec -i scadalink-mssql /opt/mssql-tools18/bin/sqlcmd \
-S localhost -U sa -P 'ScadaLink_Dev1#' -C \
-i /docker-entrypoint-initdb.d/machinedata_seed.sql
```
## Per-Service Documentation
Each service has a dedicated document with configuration details, verification steps, and troubleshooting:
- [test_infra_opcua.md](test_infra_opcua.md) — OPC UA test server (Azure IoT OPC PLC)
- [test_infra_ldap.md](test_infra_ldap.md) — LDAP test server (GLAuth)
- [test_infra_db.md](test_infra_db.md) — MS SQL 2022 database
- [test_infra_smtp.md](test_infra_smtp.md) — SMTP test server (Mailpit)
- [test_infra_restapi.md](test_infra_restapi.md) — REST API test server (Flask)
- [test_infra_lmxfakeproxy.md](test_infra_lmxfakeproxy.md) — LmxProxy fake server (OPC UA bridge)
- Traefik LB — see `docker/README.md` and `docker/traefik/` (runs with the cluster, not in `infra/`)
## Connection Strings
For use in `appsettings.Development.json`:
```json
{
"ConnectionStrings": {
"ScadaLinkConfig": "Server=localhost,1433;Database=ScadaLinkConfig;User Id=scadalink_app;Password=ScadaLink_Dev1#;TrustServerCertificate=true",
"ScadaLinkMachineData": "Server=localhost,1433;Database=ScadaLinkMachineData;User Id=scadalink_app;Password=ScadaLink_Dev1#;TrustServerCertificate=true"
},
"Ldap": {
"Server": "localhost",
"Port": 3893,
"BaseDN": "dc=scadalink,dc=local",
"UseSsl": false
},
"OpcUa": {
"EndpointUrl": "opc.tcp://localhost:50000"
},
"Smtp": {
"Server": "localhost",
"Port": 1025,
"AuthMode": "None",
"FromAddress": "scada-notifications@company.com",
"ConnectionTimeout": 30
},
"ExternalSystems": {
"TestApi": {
"BaseUrl": "http://localhost:5200",
"AuthMode": "ApiKey",
"ApiKey": "scadalink-test-key-1"
}
}
}
```
## Stopping & Teardown
```bash
cd infra
docker compose down # stop containers, preserve SQL data volume
docker compose stop opcua    # stop a single service (also: ldap, mssql, smtp, restapi, lmxfakeproxy)
```
**Full teardown** (removes volumes, optionally images and venv):
```bash
cd infra
./teardown.sh # stop containers + delete SQL data volume
./teardown.sh --images # also remove downloaded Docker images
./teardown.sh --all # also remove the Python venv
```
After a full teardown, the next `docker compose up -d` starts fresh — re-run the SQL setup script.
## Files
```
infra/
  docker-compose.yml           # All six services
teardown.sh # Teardown script (volumes, images, venv)
glauth/config.toml # LDAP users and groups
mssql/setup.sql # Database and user creation
mssql/machinedata_seed.sql # Machine Data tables, stored procedures, sample data
opcua/nodes.json # Custom OPC UA tag definitions
restapi/app.py # Flask REST API server
restapi/Dockerfile # REST API container build
lmxfakeproxy/ # .NET gRPC proxy bridging LmxProxy protocol to OPC UA
tools/ # Python CLI tools (opcua, ldap, mssql, smtp, restapi)
README.md # Quick-start for the infra folder
docker/
traefik/traefik.yml # Traefik static config (entrypoints, file provider)
traefik/dynamic.yml # Traefik dynamic config (load balancer, health check routing)
```

# Test Infrastructure: MS SQL 2022 Database
## Overview
The test database uses Microsoft SQL Server 2022 Developer Edition running in Docker. It provides two databases for ScadaLink: `ScadaLinkConfig` starts empty (its schema is created by EF Core migrations at application startup in dev mode), while `ScadaLinkMachineData` is seeded with sample tables and data.
## Image & Ports
- **Image**: `mcr.microsoft.com/mssql/server:2022-latest`
- **Port**: 1433
- **Edition**: Developer (free, full-featured)
## Credentials
| Account | Username | Password | Purpose |
|---------|----------|----------|---------|
| SA | `sa` | `ScadaLink_Dev1#` | Server admin (setup only) |
| App | `scadalink_app` | `ScadaLink_Dev1#` | Application login (db_owner on both databases) |
## Databases
| Database | Purpose |
|----------|---------|
| `ScadaLinkConfig` | Configuration Database component — templates, deployments, users, audit log |
| `ScadaLinkMachineData` | Machine/operational data storage |
Both databases are created by `infra/mssql/setup.sql`. EF Core migrations populate the `ScadaLinkConfig` schema. The `ScadaLinkMachineData` database is seeded with sample tables and stored procedures by `infra/mssql/machinedata_seed.sql`.
## Data Persistence
SQL data is stored in the named Docker volume `scadalink-mssql-data`. Data survives container restarts and `docker compose down`. To reset the database completely:
```bash
docker compose down -v
docker compose up -d
# Re-run setup.sql after the container starts
```
## First-Time Setup
After the first `docker compose up -d`, run the setup script:
```bash
docker exec -i scadalink-mssql /opt/mssql-tools18/bin/sqlcmd \
-S localhost -U sa -P 'ScadaLink_Dev1#' -C \
-i /docker-entrypoint-initdb.d/setup.sql
```
This creates the databases and the `scadalink_app` login. Then seed the Machine Data database with sample tables, stored procedures, and data:
```bash
docker exec -i scadalink-mssql /opt/mssql-tools18/bin/sqlcmd \
-S localhost -U sa -P 'ScadaLink_Dev1#' -C \
-i /docker-entrypoint-initdb.d/machinedata_seed.sql
```
You only need to run these once (or again after deleting the volume).
## Connection Strings
For `appsettings.Development.json`:
```
Server=localhost,1433;Database=ScadaLinkConfig;User Id=scadalink_app;Password=ScadaLink_Dev1#;TrustServerCertificate=true
```
```
Server=localhost,1433;Database=ScadaLinkMachineData;User Id=scadalink_app;Password=ScadaLink_Dev1#;TrustServerCertificate=true
```
## Verification
1. Check the container is running:
```bash
docker ps --filter name=scadalink-mssql
```
2. Query using `sqlcmd` inside the container:
```bash
docker exec -it scadalink-mssql /opt/mssql-tools18/bin/sqlcmd \
-S localhost -U sa -P 'ScadaLink_Dev1#' -C \
-Q "SELECT name FROM sys.databases"
```
3. Verify the app login:
```bash
docker exec -it scadalink-mssql /opt/mssql-tools18/bin/sqlcmd \
-S localhost -U scadalink_app -P 'ScadaLink_Dev1#' -C \
-d ScadaLinkConfig \
-Q "SELECT DB_NAME()"
```
## CLI Tool
The `infra/tools/mssql_tool.py` script provides a convenient CLI for interacting with the SQL Server.
**Install dependencies** (one-time):
```bash
pip install -r infra/tools/requirements.txt
```
**Commands**:
```bash
# Check connectivity and verify expected databases exist
python infra/tools/mssql_tool.py check
# Run the first-time setup script (uses autocommit mode for CREATE DATABASE)
python infra/tools/mssql_tool.py setup --script infra/mssql/setup.sql
# List tables in a database
python infra/tools/mssql_tool.py tables --database ScadaLinkConfig
# Run an ad-hoc query
python infra/tools/mssql_tool.py query --database ScadaLinkConfig --sql "SELECT name FROM sys.tables"
```
Use `--host`, `--port`, `--user`, `--password` to override defaults (localhost:1433, sa, ScadaLink_Dev1#). Run with `--help` for full usage.
## Machine Data Tables & Stored Procedures
The `machinedata_seed.sql` script creates the following in `ScadaLinkMachineData`:
**Tables**:
| Table | Description | Sample Data |
|-------|-------------|-------------|
| `TagHistory` | Time-series tag values from OPC UA / custom protocols | Pressure, flow, level, temperature, speed readings for SiteA/SiteB |
| `ProductionCounts` | Shift/line production totals (good, reject, efficiency) | 3 days of 2-shift data across 2 sites |
| `EquipmentEvents` | State changes, faults, maintenance, alarm events | Pump faults, belt inspections, batch starts |
| `BatchRecords` | Production batch tracking (start, complete, abort) | 5 batches including one in-progress |
| `AlarmHistory` | Historical alarm activations, acks, and clears | Active, acknowledged, and cleared alarms |
**Stored Procedures**:
| Procedure | Description |
|-----------|-------------|
| `usp_GetTagHistory` | Get tag values for a tag path within a date range |
| `usp_GetProductionSummary` | Aggregate production by line over a date range |
| `usp_InsertBatchRecord` | Insert a new batch (for `CachedWrite` testing) |
| `usp_CompleteBatch` | Complete or abort a batch |
| `usp_GetEquipmentEvents` | Get recent equipment events with optional filters |
| `usp_GetActiveAlarms` | Get active/acknowledged alarms by severity |
## Relevance to ScadaLink Components
- **Configuration Database** — primary consumer; EF Core context targets `ScadaLinkConfig`.
- **Deployment Manager** — reads/writes deployment records in `ScadaLinkConfig`.
- **Template Engine** — reads/writes template definitions in `ScadaLinkConfig`.
- **Security & Auth** — user/role data stored in `ScadaLinkConfig`.
- **External System Gateway** — scripts use `Database.Connection("machineDataConnection")` to query `ScadaLinkMachineData` tables and stored procedures.
- **Site Runtime** — scripts call stored procedures via `Database.Connection()` and `Database.CachedWrite()` for batch recording and data queries.
- **Inbound API** — methods can query machine data via named database connections.
## Notes
- The `sa` password must meet SQL Server complexity requirements (uppercase, lowercase, digit, special character, 8+ characters).
- If the container fails to start, check Docker has at least 2GB RAM allocated (SQL Server minimum requirement).
- The setup script is idempotent — safe to run multiple times.

# Test Infrastructure: LDAP Server
## Overview
The test LDAP server uses [GLAuth](https://glauth.github.io/), a lightweight LDAP server backed by a TOML config file. It provides test users and groups that map to ScadaLink's role-based authorization model.
## Image & Ports
- **Image**: `glauth/glauth:latest`
- **LDAP port**: 3893 (plain LDAP, no TLS — dev only)
## Base DN
```
dc=scadalink,dc=local
```
## Test Users
All users have the password `password`.
| Username | Email | Primary Group | Additional Groups | ScadaLink Role |
|----------|-------|---------------|-------------------|----------------|
| `admin` | admin@scadalink.local | SCADA-Admins | — | Full administrator |
| `designer` | designer@scadalink.local | SCADA-Designers | — | Template designer |
| `deployer` | deployer@scadalink.local | SCADA-Deploy-All | — | Deploy to all sites |
| `site-deployer` | site-deployer@scadalink.local | SCADA-Deploy-SiteA | — | Deploy to SiteA only |
| `multi-role` | multi-role@scadalink.local | SCADA-Admins | SCADA-Designers, SCADA-Deploy-All | Multiple roles |
## Groups
| Group | GID | Purpose |
|-------|-----|---------|
| SCADA-Admins | 5501 | Full administrative access |
| SCADA-Designers | 5502 | Template creation and editing |
| SCADA-Deploy-All | 5503 | Deploy to any site |
| SCADA-Deploy-SiteA | 5504 | Deploy to SiteA only (site-scoped) |
## User DNs
Users bind with their full DN, which includes the primary group as an OU:
```
cn=<username>,ou=<PrimaryGroupName>,ou=users,dc=scadalink,dc=local
```
For example: `cn=admin,ou=SCADA-Admins,ou=users,dc=scadalink,dc=local`
The full DNs for all test users:
| Username | Full DN |
|----------|---------|
| `admin` | `cn=admin,ou=SCADA-Admins,ou=users,dc=scadalink,dc=local` |
| `designer` | `cn=designer,ou=SCADA-Designers,ou=users,dc=scadalink,dc=local` |
| `deployer` | `cn=deployer,ou=SCADA-Deploy-All,ou=users,dc=scadalink,dc=local` |
| `site-deployer` | `cn=site-deployer,ou=SCADA-Deploy-SiteA,ou=users,dc=scadalink,dc=local` |
| `multi-role` | `cn=multi-role,ou=SCADA-Admins,ou=users,dc=scadalink,dc=local` |
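The DN pattern is mechanical enough to generate. A minimal sketch of the `cn=<username>,ou=<PrimaryGroupName>,ou=users,…` convention, using the values from the table above:

```python
BASE_DN = "dc=scadalink,dc=local"

def bind_dn(username: str, primary_group: str) -> str:
    """Build a GLAuth bind DN from a username and its primary group OU."""
    return f"cn={username},ou={primary_group},ou=users,{BASE_DN}"

print(bind_dn("admin", "SCADA-Admins"))
# cn=admin,ou=SCADA-Admins,ou=users,dc=scadalink,dc=local
```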
## Verification
1. Check the container is running:
```bash
docker ps --filter name=scadalink-ldap
```
2. Test a user bind with `ldapsearch`:
```bash
ldapsearch -H ldap://localhost:3893 \
-D "cn=admin,ou=SCADA-Admins,ou=users,dc=scadalink,dc=local" \
-w password \
-b "dc=scadalink,dc=local" \
"(objectClass=*)"
```
3. Search for group membership:
```bash
ldapsearch -H ldap://localhost:3893 \
-D "cn=admin,ou=SCADA-Admins,ou=users,dc=scadalink,dc=local" \
-w password \
-b "dc=scadalink,dc=local" \
"(cn=multi-role)"
```
## CLI Tool
The `infra/tools/ldap_tool.py` script provides a convenient CLI for interacting with the LDAP server.
**Install dependencies** (one-time):
```bash
pip install -r infra/tools/requirements.txt
```
**Commands**:
```bash
# Check LDAP connectivity and list entries
python infra/tools/ldap_tool.py check
# Test user authentication
python infra/tools/ldap_tool.py bind --user designer --password password
# List all users with group memberships
python infra/tools/ldap_tool.py users
# List all groups with members
python infra/tools/ldap_tool.py groups
# Search with an arbitrary LDAP filter
python infra/tools/ldap_tool.py search --filter "(cn=multi-role)"
```
Use `--host` and `--port` to override defaults (localhost:3893). Run with `--help` for full usage.
## Relevance to ScadaLink Components
- **Security & Auth** — test LDAP bind authentication, group-to-role mapping, and multi-group resolution.
- **Central UI** — test login flows with different role combinations.
## Notes
- GLAuth uses plain LDAP on port 3893. ScadaLink's Security & Auth component requires LDAPS/StartTLS in production. For dev testing, configure the LDAP client to allow plaintext connections.
- To add users or groups, edit `infra/glauth/config.toml` locally and restart the container: `docker compose restart ldap`. Note that the file is named `config.toml` on the host but is mounted into the container as `/app/config/config.cfg` (the path GLAuth expects).
- The `admin` user is configured with `[[users.capabilities]]` (`action = "search"`, `object = "*"`) in the GLAuth config. This grants the admin account permission to perform LDAP search operations, which is required for user/group lookups.
- Anonymous bind is not allowed. All LDAP operations (including searches) require an authenticated bind. Use the `admin` account for search operations.

# Test Infrastructure: LmxFakeProxy
## Overview
LmxFakeProxy is a .NET gRPC server that implements the `scada.ScadaService` proto (full parity with the real LmxProxy server) but bridges to the OPC UA test server instead of System Platform MXAccess. This enables end-to-end testing of `RealLmxProxyClient` and the LmxProxy DCL adapter.
## Image & Ports
- **Image**: Custom build (`infra/lmxfakeproxy/Dockerfile`)
- **gRPC endpoint**: `localhost:50051`
## Configuration
| Environment Variable | Default | Description |
|---------------------|---------|-------------|
| `PORT` | `50051` | gRPC listen port |
| `OPC_ENDPOINT` | `opc.tcp://localhost:50000` | Backend OPC UA server |
| `OPC_PREFIX` | `ns=3;s=` | Prefix prepended to LMX tags to form OPC UA NodeIds |
| `API_KEY` | *(none)* | If set, enforces API key on all gRPC calls |
## Tag Address Mapping
LMX-style flat addresses are mapped to OPC UA NodeIds by prepending the configured prefix:
| LMX Tag | OPC UA NodeId |
|---------|--------------|
| `Motor.Speed` | `ns=3;s=Motor.Speed` |
| `Pump.FlowRate` | `ns=3;s=Pump.FlowRate` |
| `Tank.Level` | `ns=3;s=Tank.Level` |
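The mapping is plain prefix concatenation. A sketch of the rule, mirroring the `OPC_PREFIX` default from the table above:

```python
OPC_PREFIX = "ns=3;s="  # default value of the OPC_PREFIX environment variable

def lmx_to_node_id(lmx_tag: str, prefix: str = OPC_PREFIX) -> str:
    """Map an LMX-style flat tag address to an OPC UA NodeId."""
    return prefix + lmx_tag

print(lmx_to_node_id("Motor.Speed"))  # ns=3;s=Motor.Speed
```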
## Supported RPCs
Full parity with the `scada.ScadaService` proto:
- **Connect / Disconnect / GetConnectionState** — Session management
- **Read / ReadBatch** — Read tag values via OPC UA
- **Write / WriteBatch / WriteBatchAndWait** — Write values via OPC UA
- **Subscribe** — Server-streaming subscriptions via OPC UA MonitoredItems
- **CheckApiKey** — API key validation
## Verification
1. Ensure the OPC UA test server is running:
```bash
docker ps --filter name=scadalink-opcua
```
2. Start the fake proxy:
```bash
docker compose up -d lmxfakeproxy
```
3. Check logs:
```bash
docker logs scadalink-lmxfakeproxy
```
4. Test with the ScadaLink CLI or a gRPC client.
## Running Standalone (without Docker)
```bash
cd infra/lmxfakeproxy
dotnet run -- --opc-endpoint opc.tcp://localhost:50000 --opc-prefix "ns=3;s="
```
With API key enforcement:
```bash
dotnet run -- --api-key my-secret-key
```
## Relevance to ScadaLink Components
- **Data Connection Layer** — Test `RealLmxProxyClient` and `LmxProxyDataConnection` against real OPC UA data
- **Site Runtime** — Deploy instances with LmxProxy data connections pointing at this server
- **Integration Tests** — End-to-end tests of the LmxProxy protocol path

# Test Infrastructure: OPC UA Server
## Overview
The test OPC UA server uses [Azure IoT OPC PLC](https://github.com/Azure-Samples/iot-edge-opc-plc), a simulated OPC UA server that generates realistic data. It is configured with custom nodes that match ScadaLink attribute patterns.
## Image & Ports
- **Image**: `mcr.microsoft.com/iotedge/opc-plc:latest`
- **OPC UA endpoint**: `opc.tcp://localhost:50000`
- **Web/config UI**: `http://localhost:8080`
## Startup Flags
```
--autoaccept Accept all client certificates
--unsecuretransport Enable plain (non-TLS) OPC UA connections for dev tools
--sph Show PLC heartbeat on console
--sn=5 --sr=10 --st=uint 5 slow-changing nodes (10s cycle, uint)
--fn=5 --fr=1 --ft=uint 5 fast-changing nodes (1s cycle, uint)
--gn=5 5 stepping nodes
--nf=/app/config/nodes.json Custom node definitions
--pn=50000 Listen port
```
## Custom Nodes
The file `infra/opcua/nodes.json` defines a single `ConfigFolder` object (not an array) with a root "ScadaLink" folder containing four equipment subfolders. Tags match typical ScadaLink instance attribute patterns:
| Folder | Tags | Types |
|--------|------|-------|
| Motor | Speed, Temperature, Current, Running, FaultCode | Double, Boolean, UInt32 |
| Pump | FlowRate, Pressure, Running | Double, Boolean |
| Tank | Level, Temperature, HighLevel, LowLevel | Double, Boolean |
| Valve | Position, Command | Double, UInt32 |
| JoeAppEngine | BTCS, AlarmCntsBySeverity, Scheduler/ScanTime | String, Int32[], DateTime |
All custom nodes hold their initial/default values (0 for numerics, false for booleans, empty for strings, epoch for DateTime) until written. OPC PLC's custom node format does not support random value generation for these nodes.
Custom nodes live in namespace 3 (`http://microsoft.com/Opc/OpcPlc/`). Node IDs follow the pattern `ns=3;s=<Folder>.<Tag>` (e.g., `ns=3;s=Motor.Speed`). Nested folders use dot notation: `ns=3;s=JoeAppEngine.Scheduler.ScanTime`.
The browse path from the Objects root is: `OpcPlc > ScadaLink > Motor|Pump|Tank|Valve|JoeAppEngine`.
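The NodeId convention reduces to a one-line rule (a sketch; folder and tag names are taken from the table above):

```python
def node_id(*path: str) -> str:
    """Build a custom-node NodeId: namespace 3, dot-joined browse path."""
    return "ns=3;s=" + ".".join(path)

print(node_id("Motor", "Speed"))                         # ns=3;s=Motor.Speed
print(node_id("JoeAppEngine", "Scheduler", "ScanTime"))  # ns=3;s=JoeAppEngine.Scheduler.ScanTime
```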
## Verification
1. Check the container is running:
```bash
docker ps --filter name=scadalink-opcua
```
2. Verify the OPC UA endpoint using any OPC UA client (e.g., UaExpert, opcua-commander):
```bash
# Using opcua-commander (npm install -g opcua-commander)
opcua-commander -e opc.tcp://localhost:50000
```
3. Check the web UI at `http://localhost:8080` for server status and node listing.
## CLI Tool
The `infra/tools/opcua_tool.py` script provides a convenient CLI for interacting with the OPC UA server.
**Install dependencies** (one-time):
```bash
pip install -r infra/tools/requirements.txt
```
**Commands**:
```bash
# Check server status, namespaces, and endpoints
python infra/tools/opcua_tool.py check
# Browse the Objects folder (top-level)
python infra/tools/opcua_tool.py browse
# Browse a specific equipment folder
python infra/tools/opcua_tool.py browse --path "3:OpcPlc.3:ScadaLink.3:Motor"
# Read a tag value
python infra/tools/opcua_tool.py read --node "ns=3;s=Motor.Speed"
# Write a value to a tag
python infra/tools/opcua_tool.py write --node "ns=3;s=Motor.Running" --value true --type Boolean
# Monitor value changes for 15 seconds
python infra/tools/opcua_tool.py monitor --nodes "ns=3;s=Motor.Speed,ns=3;s=Pump.FlowRate" --duration 15
```
Use `--endpoint` to override the default endpoint (`opc.tcp://localhost:50000`). Run with `--help` for full usage.
## Relevance to ScadaLink Components
- **Data Connection Layer** — connect to this server to test OPC UA subscription, read/write, and reconnection behavior.
- **Site Runtime / Instance Actors** — deploy instances with tag mappings pointing at these nodes.
- **Template Engine** — design templates with attributes matching the Motor/Pump/Tank/Valve folder structure.

# Test Infrastructure: REST API Server (Flask)
## Overview
The test REST API server is a lightweight Python/Flask application that provides HTTP endpoints matching the patterns used by ScadaLink's External System Gateway and Inbound API components. It supports simple parameter/response methods, complex nested object/list methods, authentication, and error simulation.
## Image & Ports
- **Image**: Custom build from `infra/restapi/Dockerfile` (Python 3.13 + Flask)
- **API port**: 5200
## Configuration
| Setting | Value | Description |
|---------|-------|-------------|
| `API_NO_AUTH` | `0` | Set to `1` to disable API key authentication |
| `PORT` | `5200` | Server listen port |
## Authentication
The server validates requests using one of two methods:
- **API Key**: `X-API-Key: scadalink-test-key-1` header
- **Basic Auth**: Any username/password (accepts all credentials)
The `GET /api/Ping` endpoint is always unauthenticated (health check).
Auth can be disabled entirely by setting `API_NO_AUTH=1` in the Docker Compose environment or passing `--no-auth` when running directly.
For `appsettings.Development.json` (External System Gateway):
```json
{
"ExternalSystems": {
"TestApi": {
"BaseUrl": "http://localhost:5200",
"AuthMode": "ApiKey",
"ApiKey": "scadalink-test-key-1"
}
}
}
```
## Endpoints
### Simple Methods
| Method | Path | HTTP | Params | Response | Description |
|--------|------|------|--------|----------|-------------|
| Ping | `/api/Ping` | GET | — | `{"pong": true}` | Health check (no auth) |
| Add | `/api/Add` | POST | `{"a": 5, "b": 3}` | `{"result": 8}` | Add two numbers |
| Multiply | `/api/Multiply` | POST | `{"a": 4, "b": 7}` | `{"result": 28}` | Multiply two numbers |
| Echo | `/api/Echo` | POST | `{"message": "hello"}` | `{"message": "hello"}` | Echo back input |
| GetStatus | `/api/GetStatus` | POST | `{}` | `{"status": "running", "uptime": 123.4}` | Server status |
### Complex Methods (nested objects + lists)
| Method | Path | HTTP | Description |
|--------|------|------|-------------|
| GetProductionReport | `/api/GetProductionReport` | POST | Production report with line details |
| GetRecipe | `/api/GetRecipe` | POST | Recipe with ingredients list |
| SubmitBatch | `/api/SubmitBatch` | POST | Submit batch with items (complex input) |
| GetEquipmentStatus | `/api/GetEquipmentStatus` | POST | Equipment list with nested status objects |
### Error Simulation
| Method | Path | HTTP | Params | Description |
|--------|------|------|--------|-------------|
| SimulateTimeout | `/api/SimulateTimeout` | POST | `{"seconds": 5}` | Delay response (max 60s) |
| SimulateError | `/api/SimulateError` | POST | `{"code": 500}` | Return specified HTTP error (400–599) |
### Method Discovery
| Method | Path | HTTP | Description |
|--------|------|------|-------------|
| methods | `/api/methods` | GET | List all available methods with signatures |
## Response Examples
**GetProductionReport** (`{"siteId": "SiteA", "startDate": "2026-03-01", "endDate": "2026-03-16"}`):
```json
{
"siteName": "Site SiteA",
"totalUnits": 14250,
"lines": [
{ "lineName": "Line-1", "units": 8200, "efficiency": 92.5 },
{ "lineName": "Line-2", "units": 6050, "efficiency": 88.1 }
]
}
```
**GetRecipe** (`{"recipeId": "R-100"}`):
```json
{
"recipeId": "R-100",
"name": "Standard Mix",
"version": 3,
"ingredients": [
{ "name": "Material-A", "quantity": 45.0, "unit": "kg" },
{ "name": "Material-B", "quantity": 12.5, "unit": "L" }
]
}
```
**SubmitBatch** (complex input with items list):
```json
{
"siteId": "SiteA",
"recipeId": "R-100",
"items": [
{ "materialId": "MAT-001", "quantity": 45.0, "lotNumber": "LOT-2026-001" },
{ "materialId": "MAT-002", "quantity": 12.5, "lotNumber": "LOT-2026-002" }
]
}
```
Response: `{"batchId": "BATCH-A1B2C3D4", "accepted": true, "itemCount": 2}`
**GetEquipmentStatus** (`{"siteId": "SiteA"}`):
```json
{
"siteId": "SiteA",
"equipment": [
{
"equipmentId": "PUMP-001",
"name": "Feed Pump A",
"status": { "state": "running", "health": 98.5, "lastMaintenance": "2026-02-15" }
},
{
"equipmentId": "TANK-001",
"name": "Mix Tank 1",
"status": { "state": "idle", "health": 100.0, "lastMaintenance": "2026-03-01" }
},
{
"equipmentId": "CONV-001",
"name": "Conveyor B",
"status": { "state": "alarm", "health": 72.3, "lastMaintenance": "2026-01-20" }
}
]
}
```
## Verification
1. Check the container is running:
```bash
docker ps --filter name=scadalink-restapi
```
2. Test the health endpoint:
```bash
curl http://localhost:5200/api/Ping
```
3. Test an authenticated call:
```bash
curl -X POST http://localhost:5200/api/Add \
-H "X-API-Key: scadalink-test-key-1" \
-H "Content-Type: application/json" \
-d '{"a": 2, "b": 3}'
```
## CLI Tool
The `infra/tools/restapi_tool.py` script provides a CLI for interacting with the REST API server. This tool requires the `requests` library (included in `infra/tools/requirements.txt`).
**Commands**:
```bash
# Check API server connectivity and status
python infra/tools/restapi_tool.py check
# Call a simple method
python infra/tools/restapi_tool.py call --method Add --params '{"a": 2, "b": 3}'
# Call a complex method
python infra/tools/restapi_tool.py call --method GetProductionReport --params '{"siteId": "SiteA", "startDate": "2026-03-01", "endDate": "2026-03-16"}'
# Simulate an error
python infra/tools/restapi_tool.py call --method SimulateError --params '{"code": 503}'
# List all available methods
python infra/tools/restapi_tool.py methods
```
Use `--url` to override the base URL (default: `http://localhost:5200`), `--api-key` for the API key. Run with `--help` for full usage.
## Relevance to ScadaLink Components
- **External System Gateway** — test HTTP/REST calls (`ExternalSystem.Call()` and `CachedCall()`), API key authentication, error classification (5xx vs 4xx), and timeout handling.
- **Inbound API** — test the `POST /api/{methodName}` pattern, flat JSON parameters, and extended type system (Object, List) with complex nested responses.
- **Store-and-Forward Engine** — verify buffered retry by using `SimulateError` to return transient errors (503, 408, 429) and observing store-and-forward behavior.
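The transient-vs-permanent distinction behind that retry test can be sketched as a small classifier. The status-code sets here are illustrative assumptions, not ScadaLink's actual policy:

```python
# Illustrative classification: 5xx plus common retryable 4xx codes are
# treated as transient (buffer and retry); other 4xx are permanent failures.
TRANSIENT_4XX = {408, 429}

def is_transient(status: int) -> bool:
    """Return True if an HTTP error status should be retried via store-and-forward."""
    return status >= 500 or status in TRANSIENT_4XX

print(is_transient(503))  # True  -> buffer and retry
print(is_transient(404))  # False -> fail immediately
```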
## Notes
- The server is stateless — no data persistence between container restarts.
- `SimulateTimeout` caps at 60 seconds to prevent accidental container hangs.
- `SimulateError` accepts codes 400–599; other values default to 500.
- `SubmitBatch` generates a unique `batchId` per call (UUID-based).
- The `/api/methods` endpoint provides machine-readable method discovery (useful for CLI tool and automated testing).
- To simulate connection failures for store-and-forward testing, stop the container: `docker compose stop restapi`. Restart with `docker compose start restapi`.

# Test Infrastructure: SMTP Server (Mailpit)
## Overview
The test SMTP server uses [Mailpit](https://mailpit.axllent.org/), a lightweight email testing tool that captures all outgoing emails without delivering them. It provides both an SMTP server for sending and a web UI for inspecting captured messages.
## Image & Ports
- **Image**: `axllent/mailpit:latest`
- **SMTP port**: 1025
- **Web UI / API**: `http://localhost:8025`
## Configuration
| Setting | Value | Description |
|---------|-------|-------------|
| `MP_SMTP_AUTH_ACCEPT_ANY` | `1` | Accept any SMTP credentials (or none) — no real authentication |
| `MP_SMTP_AUTH_ALLOW_INSECURE` | `1` | Allow auth over plain SMTP (no TLS required) — dev only |
| `MP_MAX_MESSAGES` | `500` | Maximum stored messages before oldest are auto-deleted |
Mailpit accepts all emails regardless of sender/recipient domain. No emails leave the server — they are captured and viewable in the web UI.
## SMTP Connection Settings
For `appsettings.Development.json` (Notification Service):
```json
{
"Smtp": {
"Server": "localhost",
"Port": 1025,
"AuthMode": "None",
"FromAddress": "scada-notifications@company.com",
"ConnectionTimeout": 30
}
}
```
Since `MP_SMTP_AUTH_ACCEPT_ANY` is enabled, the Notification Service can use any auth mode:
- **No auth**: Connect directly, no credentials needed.
- **Basic Auth**: Any username/password will be accepted (useful for testing the auth code path without a real server).
- **OAuth2**: Not supported by Mailpit. For OAuth2 testing, use a real Microsoft 365 tenant.
## Mailpit API
Mailpit exposes a REST API at `http://localhost:8025/api` for programmatic access:
| Endpoint | Method | Description |
|----------|--------|-------------|
| `/api/v1/info` | GET | Server info (version, message count) |
| `/api/v1/messages` | GET | List messages (supports `?limit=N`) |
| `/api/v1/message/{id}` | GET | Read a specific message |
| `/api/v1/messages` | DELETE | Delete all messages |
| `/api/v1/search?query=...` | GET | Search messages |
## Verification
1. Check the container is running:
```bash
docker ps --filter name=scadalink-smtp
```
2. Open the web UI at `http://localhost:8025` to view captured emails.
3. Send a test email using `curl` or any SMTP client:
```bash
# Using Python's smtplib (one-liner)
python3 -c "
import smtplib; from email.mime.text import MIMEText
msg = MIMEText('Test body'); msg['Subject'] = 'Test'; msg['From'] = 'test@example.com'; msg['To'] = 'user@example.com'
smtplib.SMTP('localhost', 1025).sendmail('test@example.com', ['user@example.com'], msg.as_string())
print('Sent')
"
```
## CLI Tool
The `infra/tools/smtp_tool.py` script provides a convenient CLI for interacting with the SMTP server and Mailpit API. This tool uses only Python standard library modules — no additional dependencies required.
**Commands**:
```bash
# Check SMTP connectivity and Mailpit status
python infra/tools/smtp_tool.py check
# Send a test email
python infra/tools/smtp_tool.py send --to user@example.com --subject "Alarm: Tank High Level" --body "Tank level exceeded 95%"
# Send with BCC (matches ScadaLink notification delivery pattern)
python infra/tools/smtp_tool.py send --to scada-notifications@company.com --bcc "operator1@company.com,operator2@company.com" --subject "Shift Report"
# List captured messages
python infra/tools/smtp_tool.py list
# Read a specific message by ID
python infra/tools/smtp_tool.py read --id <message-id>
# Clear all messages
python infra/tools/smtp_tool.py clear
```
Use `--host` and `--port` to override SMTP defaults (localhost:1025), `--api` for the Mailpit API URL. Run with `--help` for full usage.
## Relevance to ScadaLink Components
- **Notification Service** — test SMTP delivery, BCC recipient handling, plain-text formatting, and store-and-forward retry behavior (Mailpit can be stopped/started to simulate transient failures).
- **Store-and-Forward Engine** — verify buffered retry by stopping the SMTP container and observing queued notifications.
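The BCC delivery pattern can be sketched with the standard library (assumes Mailpit on localhost:1025; note the BCC list goes into the SMTP envelope, never into the message headers):

```python
import smtplib
from email.message import EmailMessage

def build_notification(to: str, bcc: list, subject: str, body: str):
    """Build a plain-text notification; BCC recipients go in the envelope only."""
    msg = EmailMessage()
    msg["From"] = "scada-notifications@company.com"
    msg["To"] = to
    msg["Subject"] = subject
    msg.set_content(body)
    recipients = [to] + bcc  # envelope recipients include the BCC list
    return msg, recipients

msg, recipients = build_notification(
    "scada-notifications@company.com",
    ["operator1@company.com", "operator2@company.com"],
    "Shift Report", "All lines nominal.",
)
# With Mailpit running, deliver with:
# smtplib.SMTP("localhost", 1025).send_message(msg, to_addrs=recipients)
```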
## Notes
- Mailpit does **not** support OAuth2 Client Credentials authentication. To test the OAuth2 code path, use a real Microsoft 365 tenant (see Q12 in `docs/plans/questions.md`).
- To simulate SMTP failures for store-and-forward testing, stop the container: `docker compose stop smtp`. Restart with `docker compose start smtp`.
- The web UI at `http://localhost:8025` provides real-time message inspection, search, and message source viewing.
- No data persistence — messages are stored in a temporary database inside the container and lost on container removal.