Auto: ablegacy-11 — RSLogix 500/PLC-5 CSV symbol import

Closes #254
This commit is contained in:
Joseph Doherty
2026-04-26 04:13:13 -04:00
parent 4fdeef7a6c
commit 4e8df38bb2
19 changed files with 1644 additions and 0 deletions

View File

@@ -186,6 +186,52 @@ parse time — both combinations are semantically meaningless against a contiguo
For `B`-files the Rockwell convention is "one BOOL per word, not per bit": `B3:0,10`
returns `bool[10]` (one per word's non-zero state), not `bool[160]`.
### `import-rslogix`
ablegacy-11 / [#254](https://github.com/dohertj2/lmxopcua/issues/254) — bulk-import RSLogix
500 / 5 CSV symbol exports into an `appsettings.json` tag fragment. Avoids hand-typing every
`N7:0` / `F8:12` / `B3:0/5` row of a several-hundred-tag PLC. Binary `.RSS` / `.RSP` project
files are out of scope; export to CSV first.
```powershell
# Default: emit JSON fragment to stdout
otopcua-ablegacy-cli import-rslogix `
--file C:\plc\plc-export.csv `
--device ab://192.168.1.20/1,0
# Write the fragment to a file + print a summary line to stdout
otopcua-ablegacy-cli import-rslogix `
--file C:\plc\plc-export.csv `
--device ab://192.168.1.20/1,0 `
--output tags.json
# Filter by Scope column — only import Local:1 program-scoped tags
otopcua-ablegacy-cli import-rslogix `
--file C:\plc\plc-export.csv `
--device ab://192.168.1.20/1,0 `
--scope Local:1
# Summary mode — one-line counter for CI / health checks
otopcua-ablegacy-cli import-rslogix `
--file C:\plc\plc-export.csv `
--device ab://192.168.1.20/1,0 `
--emit summary
```
| Flag | Default | Purpose |
|---|---|---|
| `-f` / `--file` | **required** | RSLogix CSV path |
| `-d` / `--device` | **required** | `ab://host[:port]/cip-path` every imported tag binds to |
| `--emit` | `appsettings-fragment` | `appsettings-fragment` (JSON) or `summary` (one-line counter) |
| `-o` / `--output` | stdout | Optional output file path |
| `--scope` | none | Scope filter — `Global` / `Local:N` (case-insensitive); empty Scope counts as Global |
| `--max-rows` | unlimited | Defensive cap on rows imported |
| `--strict` | off | Fail-fast on first malformed row (default permissive: skip + log) |
See [drivers/AbLegacy-RSLogix-Import.md](drivers/AbLegacy-RSLogix-Import.md) for the full
column reference, file-letter → `AbLegacyDataType` mapping, and the API surface
(`IRsLogixImporter`, `AbLegacyDriverOptions.AddRsLogixImport`).
## Known caveat — ab_server upstream gap
The integration-fixture `ab_server` Docker container accepts TCP but its PCCC

View File

@@ -67,6 +67,19 @@ their flag values to the already-shipped driver.
then the other. The plausible result identifies the correct setting
for that device family. (Modbus, S7.)
## Family-specific commands
Most drivers ship the four shared verbs and nothing else. AB Legacy adds a
fifth family-specific verb for bulk symbol-table import:
| Driver | Extra verb | Doc |
|---|---|---|
| AB Legacy | `import-rslogix` — read RSLogix 500/5 CSV symbol exports + emit a JSON tag fragment | [drivers/AbLegacy-RSLogix-Import.md](drivers/AbLegacy-RSLogix-Import.md) |
Binary RSLogix project files (`.RSS` / `.RSP`) are out of scope for v1 — the
format is proprietary and undocumented; no parser ships in libplctag or any
community library. Export to CSV first.
## Known gaps
- **AB Legacy cip-path quirk** — libplctag's ab_server requires a

View File

@@ -0,0 +1,163 @@
# AB Legacy — RSLogix symbol & data-table import
ablegacy-11 / [#254](https://github.com/dohertj2/lmxopcua/issues/254) — bulk-import
RSLogix 500 / 5 symbol exports into the AB Legacy driver. Saves operators from
hand-typing every `N7:0` / `F8:12` / `B3:0/5` row of a several-hundred-tag PLC
into `appsettings.json`.
## Supported formats — v1
| Format | Status | Notes |
|---|---|---|
| `.CSV` "Database Export" | **supported** | Header columns `Symbol,Address,Description,DataType,Scope`; quoted fields, doubled-quote escapes, comment lines (`;` / `#`) all honoured |
| `.SLC` text export | **supported** | RSLogix 500's "Save As Text" emits the same column shape — point the importer at the file directly |
| `.RSS` (RSLogix 500 binary project) | **out of scope** | Proprietary; no parser ships in libplctag or any community project. Export to CSV first |
| `.RSP` (RSLogix 5 binary project) | **out of scope** | Same as `.RSS` |
The binary `.RSS` / `.RSP` non-goal isn't a "we don't have time" decision —
Rockwell's binary format is undocumented + tied to RSLogix's internal page
layout, and the only known parsers are commercial IDE plugins. v1 ships with
text/CSV only and a clean abstraction (`IRsLogixImporter`) so a binary parser
can slot in later without reshaping the call sites.
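The quoted-field and doubled-quote handling the CSV contract promises can be sketched in a few lines. This is an illustrative Python model of the splitting rules, not the shipped C# parser:

```python
# Illustrative sketch of RFC 4180-ish field splitting: quoted fields may carry
# embedded commas, and a doubled quote inside a quoted field collapses to one
# literal quote. Not the shipped parser; semantics only.
def split_csv(line: str) -> list[str]:
    fields, buf, in_quotes = [], [], False
    i = 0
    while i < len(line):
        ch = line[i]
        if in_quotes:
            if ch == '"':
                if i + 1 < len(line) and line[i + 1] == '"':
                    buf.append('"')  # "" inside quotes -> literal "
                    i += 1
                else:
                    in_quotes = False  # closing quote
            else:
                buf.append(ch)
        elif ch == '"':
            in_quotes = True
        elif ch == ',':
            fields.append(''.join(buf))
            buf = []
        else:
            buf.append(ch)
        i += 1
    fields.append(''.join(buf))
    return fields

print(split_csv('RecipeName,ST10:0,"Recipe name, free-form text",STRING,Global'))
# -> ['RecipeName', 'ST10:0', 'Recipe name, free-form text', 'STRING', 'Global']
```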
## CSV column reference
| Column | Required | Notes |
|---|---|---|
| `Symbol` | yes | OPC UA tag name. RSLogix symbols are already stable; the importer uses them verbatim |
| `Address` | yes | PCCC address. File letter implies `DataType` (see below); the importer's resolution wins over the CSV's `DataType` column |
| `Description` | no | Parsed but currently unused — `AbLegacyTagDefinition` has no `Description` field at the v2 schema layer (see [#248](https://github.com/dohertj2/lmxopcua/issues/248)). Held in the column contract for future schema bumps |
| `DataType` | no | RSLogix-supplied (`INT` / `REAL` / `BOOL` / `TIMER` / …). Ignored at import time; the importer derives the type from the file letter |
| `Scope` | no | `Global` (default when blank) or `Local:N` for ladder-file-N-scoped tags. Acts as a filter when `--scope` is set on the CLI |
### File-letter → `AbLegacyDataType` mapping
| Letter | Example | Maps to | Notes |
|---|---|---|---|
| `N` | `N7:0` | `Int` (signed 16-bit) | |
| `F` | `F8:0` | `Float` (32-bit IEEE-754) | |
| `B` | `B3:0/0` | `Bit` | Bit-within-word also forces Bit when `BitIndex` is set |
| `L` | `L9:0` | `Long` (signed 32-bit) | SLC 5/05+ only |
| `ST` | `ST10:0` | `String` | 82-byte fixed-length + length word |
| `T` | `T4:0.ACC` | `TimerElement` | Sub-element implied by `.ACC` / `.PRE` / `.EN` / `.DN` |
| `C` | `C5:0.ACC` | `CounterElement` | |
| `R` | `R6:0.LEN` | `ControlElement` | |
| `A` | `A14:0` | `AnalogInt` | Older hardware |
| `I` / `O` / `S` | `I:0/0` | `Int` (or `Bit` with bit suffix) | I/O + status files |
| `PD` / `MG` / `PLS` / `BT` | `PD9:0` | `PidElement` etc. | Family-gated; PD/MG common on SLC500 + PLC-5; PLS/BT PLC-5 only |
| `RTC` / `HSC` / `DLS` / … | `RTC:0.YR` | `MicroLogixFunctionFile` | MicroLogix 1100 / 1400 only |
A bit suffix (`/N`) on any file letter forces `Bit`, regardless of the file
letter's normal classification — `N7:0/3` parses as Bit, not Int.
## Scope filter
The `Scope` column distinguishes program-scoped tags (`Local:1`, `Local:2`, …)
from globals. RSLogix exports usually mix both. The CLI's `--scope` flag (and
`ImportOptions.ScopeFilter` at the API level) keeps only the rows whose
`Scope` value matches case-insensitively; rows with no `Scope` column count as
`Global`.
```powershell
# Import only the Global symbols
otopcua-ablegacy-cli import-rslogix `
--file plc-export.csv `
--device ab://192.168.1.20/1,0 `
--scope Global
# Import only the file-2 program-scope tags
otopcua-ablegacy-cli import-rslogix `
--file plc-export.csv `
--device ab://192.168.1.20/1,0 `
--scope Local:2
```
## CLI subcommand — `import-rslogix`
```powershell
otopcua-ablegacy-cli import-rslogix --help
```
| Flag | Default | Purpose |
|---|---|---|
| `-f` / `--file` | **required** | Path to the CSV export |
| `-d` / `--device` | **required** | Canonical AB Legacy gateway URI every imported tag binds to |
| `--emit` | `appsettings-fragment` | `appsettings-fragment` (JSON) or `summary` (one-line counter) |
| `-o` / `--output` | stdout | Optional path; when set the JSON fragment is written there + summary line goes to stdout |
| `--scope` | none | Optional Scope filter (case-insensitive) |
| `--max-rows` | unlimited | Defensive cap on rows imported |
| `--strict` | off | Fail-fast on the first malformed row (default permissive: skip + log) |
### `appsettings-fragment` output shape
The default `--emit appsettings-fragment` mode writes a JSON object whose
`Tags` array is shaped like the `AbLegacyDriverConfigDto.Tags` array — paste
straight into the driver-instance config under
`Drivers/<instance>/Config/Tags`.
```json
{
"Tags": [
{
"Name": "MotorSpeed",
"DeviceHostAddress": "ab://192.168.1.20/1,0",
"Address": "N7:0",
"DataType": "Int",
"Writable": true
}
]
}
```
### Summary line
`--emit summary` writes a single line:
```
Imported 142 tag(s), skipped 3, errors 0.
```
`skipped` covers Scope-filter rejections and missing-required-field rows; `errors`
covers rows whose `Address` failed to parse as a PCCC address.
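A CI job consuming `--emit summary` can pull the counters out with a regular expression keyed to the documented line shape. A hypothetical Python helper, not part of the shipped tooling:

```python
import re

# Assumed CI-side helper: extract the three counters from the summary line so a
# pipeline can fail when errors is non-zero.
def parse_summary(line: str) -> dict[str, int]:
    m = re.fullmatch(
        r"Imported (\d+) tag\(s\), skipped (\d+), errors (\d+)\.", line.strip())
    if m is None:
        raise ValueError(f"unexpected summary line: {line!r}")
    imported, skipped, errors = map(int, m.groups())
    return {"imported": imported, "skipped": skipped, "errors": errors}

counts = parse_summary("Imported 142 tag(s), skipped 3, errors 0.")
print(counts)  # {'imported': 142, 'skipped': 3, 'errors': 0}
```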
## API surface — `IRsLogixImporter` + `AddRsLogixImport`
For server-side / bootstrap use-cases the importer is also reachable via:
```csharp
using ZB.MOM.WW.OtOpcUa.Driver.AbLegacy;
using ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Import;
var options = new AbLegacyDriverOptions
{
Devices = [new AbLegacyDeviceOptions("ab://192.168.1.20/1,0")],
};
// Append imported tags onto an existing options object.
var updated = options.AddRsLogixImport(
path: @"C:\plc\plc-export.csv",
deviceHostAddress: "ab://192.168.1.20/1,0",
out var result);
// result.ParsedCount / SkippedCount / ErrorCount surface the import telemetry.
Console.WriteLine($"Imported {result.ParsedCount} tags");
```
For a hand-managed importer instance (e.g. supplying a custom `ILogger`) call
`new RsLogixSymbolImport(logger).Parse(stream, deviceHostAddress, opts)`
directly.
## Operational notes
- The importer is **additive**: `AddRsLogixImport` concatenates onto the
existing `Tags` list rather than replacing it. Hand-rolled tags (system-status
variables, computed fields the operator added by hand) survive a re-import.
- Re-imports are not idempotent today — calling `AddRsLogixImport` twice will
produce duplicate tag rows. Operators are expected to either start from a
clean options object or de-duplicate themselves; a future schema rev may add
a `replace=true` switch.
- Description metadata is dropped on the floor — see the column reference
above. When [#248](https://github.com/dohertj2/lmxopcua/issues/248) lands a
`Description` field on `AbLegacyTagDefinition` the importer will start
populating it without further changes to the CSV contract.
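Until a replace-style switch exists, callers that re-import can de-duplicate the merged tag list themselves, for example by keeping the first tag seen per PCCC address. A hypothetical sketch over fragment-shaped dicts, not a shipped helper:

```python
# Assumed de-dup strategy for non-idempotent re-imports: first occurrence of
# each Address wins, so hand-edited rows placed earlier in the list survive.
def dedupe_by_address(tags: list[dict]) -> list[dict]:
    seen: set[str] = set()
    out: list[dict] = []
    for tag in tags:
        key = tag["Address"].upper()  # PCCC addresses are case-insensitive
        if key not in seen:
            seen.add(key)
            out.append(tag)
    return out

tags = [
    {"Name": "MotorSpeed", "Address": "N7:0"},
    {"Name": "MotorSpeed", "Address": "N7:0"},  # duplicate from a re-import
    {"Name": "TankLevel", "Address": "F8:0"},
]
print([t["Name"] for t in dedupe_by_address(tags)])  # ['MotorSpeed', 'TankLevel']
```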

View File

@@ -59,6 +59,28 @@ supplies a `FakeAbLegacyTag`.
`_Diagnostics/<host>/<name>` short-circuit returns the live snapshot through
`ReadAsync` without bumping `RequestCount`; two devices keep counters
independent.
- `RsLogixSymbolImportTests` — ablegacy-11 / #254 RSLogix CSV symbol-import parser:
canonical 8-row CSV (one row per N/F/B/L/ST/T/C/R) → 8 typed
`AbLegacyTagDefinition`s with the right `DataType`; header + comment-line
(`;` / `#`) skipping; malformed-row → log warning + skip (`IgnoreInvalid=true`
default) vs. `InvalidDataException` (`IgnoreInvalid=false`); empty stream →
empty result; UTF-8 BOM survival; embedded comma in quoted Description;
doubled-quote escape; `--scope` filter (Global vs. Local:N); `MaxRowsToImport`
cap; missing required header column → `InvalidDataException` regardless of
`IgnoreInvalid`; `TryResolveDataType` rejects garbage + bit-suffix overrides
the file letter (`N7:0/3` → Bit).
- `RsLogixSymbolImportGoldenTests` — golden-snapshot integration: loads
`Fixtures/rslogix-canonical.csv` (8-row canonical export covering every v1
file letter), serialises the resulting tag list, and compares to
`Fixtures/rslogix-canonical-expected.json`. On mismatch the actual JSON is
dumped to `%TEMP%/rslogix-canonical-actual.json` and the path printed in the
failure message so the dev can `cp` the golden after reviewing the diff.
- `AbLegacyDriverFactoryAddRsLogixImportTests` — covers the
`AbLegacyDriverFactoryExtensions.AddRsLogixImport` extension method:
appends imported tags onto an existing options object without dropping the
hand-rolled tags or the device list; mutates by-copy (immutability
guarantee); `AddRsLogixImportWithResult` tuple overload returns both the
modified options and the import counters.
- `AbLegacyDeadbandTests` — PR 8 per-tag deadband / change filter:
absolute-only suppression sequence `[10.0, 10.5, 11.5, 11.6] -> [10.0, 11.5]`, absolute-only suppression sequence `[10.0, 10.5, 11.5, 11.6] -> [10.0, 11.5]`,
percent-only suppression with a zero-prev short-circuit, both-set logical-OR
@@ -173,5 +195,10 @@ falsely marked Stopped just because the driver-wide probe timeout is tight.
— known-limitations write-up + resolution paths
- `tests/ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Tests/FakeAbLegacyTag.cs` - `tests/ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Tests/FakeAbLegacyTag.cs`
in-process fake + factory
- `tests/ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Tests/Fixtures/rslogix-canonical.csv`
— ablegacy-11 / #254 8-row canonical RSLogix CSV symbol export, one row per
v1 file letter (N/F/B/L/ST/T/C/R)
- `tests/ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Tests/Fixtures/rslogix-canonical-expected.json`
— golden snapshot the import tests compare against
- `src/ZB.MOM.WW.OtOpcUa.Driver.AbLegacy/AbLegacyDriver.cs` — scope remarks - `src/ZB.MOM.WW.OtOpcUa.Driver.AbLegacy/AbLegacyDriver.cs` — scope remarks
at the top of the file at the top of the file

View File

@@ -196,5 +196,54 @@ if ($DiagnosticsRequestCountNodeId) {
}
}
# ablegacy-11 / #254 — RSLogix CSV import smoke. Builds an in-memory canonical CSV
# (one row per N/F/B/L/ST/T/C/R file letter), invokes `import-rslogix --emit
# appsettings-fragment` against it, parses the resulting JSON, and asserts the Tags
# array carries exactly 8 entries. Doesn't talk to the PLC — purely offline parser
# coverage.
Write-Header "RSLogix CSV import"
$importCsvPath = Join-Path $env:TEMP "ablegacy-rslogix-canonical-$([guid]::NewGuid()).csv"
$importJsonPath = Join-Path $env:TEMP "ablegacy-rslogix-fragment-$([guid]::NewGuid()).json"
@"
Symbol,Address,Description,DataType,Scope
MotorSpeed,N7:0,Motor speed setpoint,INT,Global
TankLevel,F8:0,Tank level (gallons),REAL,Global
RunFlag,B3:0/0,Run command flag,BOOL,Global
TotalCount,L9:0,Total piece count,LONG,Global
RecipeName,ST10:0,"Recipe name, free-form text",STRING,Global
DwellTimer,T4:0.ACC,Dwell timer accumulator,TIMER,Global
PieceCounter,C5:0.ACC,Piece counter accumulator,COUNTER,Global
StateMachine,R6:0.LEN,State-machine control length,CONTROL,Global
"@ | Set-Content -Path $importCsvPath -Encoding UTF8
try {
$importResult = Invoke-Cli -Cli $abLegacyCli `
-Args @("import-rslogix", "--file", $importCsvPath, "--device", $Gateway,
"--emit", "appsettings-fragment", "--output", $importJsonPath)
if ($importResult.ExitCode -ne 0) {
Write-Fail "import-rslogix exit=$($importResult.ExitCode): $($importResult.Output)"
$results += @{ Passed = $false; Reason = "import-rslogix exit $($importResult.ExitCode)" }
}
elseif (-not (Test-Path $importJsonPath)) {
Write-Fail "import-rslogix produced no output file at $importJsonPath"
$results += @{ Passed = $false; Reason = "no output file" }
}
else {
$fragment = Get-Content $importJsonPath -Raw | ConvertFrom-Json
$tagCount = @($fragment.Tags).Count
if ($tagCount -eq 8) {
Write-Pass "import-rslogix emitted $tagCount tag(s) — matches CSV row count"
$results += @{ Passed = $true }
} else {
Write-Fail "import-rslogix emitted $tagCount tag(s); expected 8"
$results += @{ Passed = $false; Reason = "tag count $tagCount" }
}
}
}
finally {
Remove-Item -Path $importCsvPath -ErrorAction SilentlyContinue
Remove-Item -Path $importJsonPath -ErrorAction SilentlyContinue
}
Write-Summary -Title "AB Legacy e2e" -Results $results
if ($results | Where-Object { -not $_.Passed }) { exit 1 }

View File

@@ -0,0 +1,130 @@
using System.IO;
using System.Text.Json;
using CliFx;
using CliFx.Attributes;
using CliFx.Exceptions;
using CliFx.Infrastructure;
using ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Import;
namespace ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Cli.Commands;
/// <summary>
/// ablegacy-11 / #254 — read an RSLogix 500 / 5 "Database Export" CSV and emit either an
/// <c>appsettings.json</c> tag fragment or a summary line. Avoids the AbLegacyCommandBase
/// hierarchy because import is a purely-offline operation: no gateway, no driver, no
/// timeout. Mirrors the
/// <a href="https://github.com/dohertj2/lmxopcua/issues/254">#254 plan section</a>'s CLI
/// specification verbatim.
/// </summary>
[Command("import-rslogix", Description =
"Read an RSLogix 500/5 CSV symbol export and emit a JSON tag fragment for appsettings.json. " +
"Binary .RSS / .RSP project files are out of scope (see docs/drivers/AbLegacy-RSLogix-Import.md).")]
public sealed class ImportRslogixCommand : ICommand
{
[CommandOption("file", 'f', Description =
"Path to the RSLogix CSV export. RFC 4180-ish format with header columns " +
"Symbol,Address,Description,DataType,Scope; quoted fields + doubled-quote escapes " +
"are honoured; comment lines starting with ; or # are skipped.",
IsRequired = true)]
public string File { get; init; } = default!;
[CommandOption("device", 'd', Description =
"Canonical AB Legacy gateway URI (ab://host[:port]/cip-path) every imported tag " +
"binds to. Required even though import is offline — the resulting tag definitions " +
"carry the gateway address as their DeviceHostAddress.",
IsRequired = true)]
public string Device { get; init; } = default!;
[CommandOption("emit", Description =
"Output shape: 'appsettings-fragment' (default) emits a JSON object with a Tags array " +
"ready to paste into appsettings.json; 'summary' emits one human-readable counter line.")]
public string Emit { get; init; } = "appsettings-fragment";
[CommandOption("output", 'o', Description =
"Optional output file. When omitted, the result goes to stdout.")]
public string? Output { get; init; }
[CommandOption("scope", Description =
"Optional Scope filter — match the row's Scope column against this value " +
"case-insensitively. Common values: 'Global', 'Local:1', 'Local:2'. Rows with " +
"no Scope column count as Global.")]
public string? Scope { get; init; }
[CommandOption("max-rows", Description =
"Defensive cap on the number of rows imported. Beyond the cap the parser stops " +
"and emits a warning; useful for dry-running a large export against the CLI.")]
public int? MaxRows { get; init; }
[CommandOption("strict", Description =
"When set, the first malformed row throws and the CLI exits non-zero. Default is " +
"permissive (skip + log).")]
public bool Strict { get; init; }
public async ValueTask ExecuteAsync(IConsole console)
{
if (!System.IO.File.Exists(File))
{
// Surface a clean exit-code-1 with a one-line error rather than letting
// FileNotFoundException bubble up through CliFx's default exception path —
// the CLI tests and operators both prefer `import-rslogix --file missing.csv`
// to print "file not found" rather than a stack trace.
throw new CommandException($"RSLogix CSV not found: {File}", exitCode: 1);
}
var opts = new ImportOptions(
ScopeFilter: Scope,
MaxRowsToImport: MaxRows,
IgnoreInvalid: !Strict);
RsLogixImportResult result;
using (var stream = System.IO.File.OpenRead(File))
{
var importer = new RsLogixSymbolImport();
result = importer.Parse(stream, Device, opts);
}
var emit = Emit?.Trim().ToLowerInvariant();
var payload = emit switch
{
"summary" => FormatSummary(result),
"appsettings-fragment" or null or "" => FormatFragment(result),
_ => throw new CommandException(
$"Unknown --emit value '{Emit}'. Use 'appsettings-fragment' or 'summary'.",
exitCode: 2),
};
if (Output is { Length: > 0 })
{
await System.IO.File.WriteAllTextAsync(Output, payload);
await console.Output.WriteLineAsync(
$"Wrote {result.ParsedCount} tag(s) to {Output} (skipped={result.SkippedCount}, errors={result.ErrorCount}).");
}
else
{
await console.Output.WriteLineAsync(payload);
}
}
/// <summary>
/// Serialise the imported tag list as a JSON fragment shaped like the
/// <c>AbLegacyDriverConfigDto</c>'s <c>Tags</c> array — drop straight into the
/// <c>appsettings.json</c> driver config under
/// <c>Drivers/&lt;instance&gt;/Config/Tags</c>.
/// </summary>
internal static string FormatFragment(RsLogixImportResult result)
{
var tags = result.Tags.Select(t => new
{
Name = t.Name,
DeviceHostAddress = t.DeviceHostAddress,
Address = t.Address,
DataType = t.DataType.ToString(),
Writable = t.Writable,
}).ToArray();
var doc = new { Tags = tags };
return JsonSerializer.Serialize(doc, new JsonSerializerOptions { WriteIndented = true });
}
private static string FormatSummary(RsLogixImportResult result) =>
$"Imported {result.ParsedCount} tag(s), skipped {result.SkippedCount}, errors {result.ErrorCount}.";
}

View File

@@ -1,6 +1,10 @@
using System.IO;
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using ZB.MOM.WW.OtOpcUa.Core.Hosting;
using ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Import;
using ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.PlcFamilies;
namespace ZB.MOM.WW.OtOpcUa.Driver.AbLegacy;
@@ -75,6 +79,80 @@ public static class AbLegacyDriverFactoryExtensions
return new AbLegacyDriver(options, driverInstanceId);
}
/// <summary>
/// ablegacy-11 / #254 — append RSLogix CSV symbol-export rows to
/// <paramref name="options"/> as <see cref="AbLegacyTagDefinition"/> entries bound to
/// <paramref name="deviceHostAddress"/>. Returns a new <see cref="AbLegacyDriverOptions"/>
/// with the imported tags concatenated onto the existing <c>Tags</c> list — useful both
/// at startup-time (server-side bootstrap that wants to seed a device's address space
/// from a customer-supplied CSV) and from the CLI (<c>import-rslogix</c> emits the
/// resulting JSON fragment for hand-merging into an appsettings file).
/// </summary>
/// <remarks>
/// <para>
/// The importer is permissive by default — malformed rows are logged and skipped;
/// the resulting <see cref="RsLogixImportResult"/> counts surface on
/// <paramref name="result"/> for callers that want to assert "we got the row count
/// we expected".
/// </para>
/// <para>
/// RSLogix 500's <c>.RSS</c> + RSLogix 5's <c>.RSP</c> binary project files are
/// out of scope for v1 — the binary format is proprietary and undocumented; no
/// libplctag or community parser exists. Customers must export to text/CSV via
/// RSLogix's "Tools → Database → Save" or "Database Export" before pointing the
/// importer at the file. See <c>docs/drivers/AbLegacy-RSLogix-Import.md</c>.
/// </para>
/// </remarks>
public static AbLegacyDriverOptions AddRsLogixImport(
this AbLegacyDriverOptions options,
string path,
string deviceHostAddress,
out RsLogixImportResult result,
ImportOptions? importOptions = null,
ILogger<RsLogixSymbolImport>? logger = null)
{
ArgumentNullException.ThrowIfNull(options);
ArgumentException.ThrowIfNullOrWhiteSpace(path);
ArgumentException.ThrowIfNullOrWhiteSpace(deviceHostAddress);
using var stream = File.OpenRead(path);
var importer = new RsLogixSymbolImport(logger ?? NullLogger<RsLogixSymbolImport>.Instance);
result = importer.Parse(stream, deviceHostAddress, importOptions);
// Concat onto whatever's already on the options — the importer is additive so
// hand-edited Tags rows (e.g., system-status fields not surfaced by RSLogix) keep
// sitting alongside the bulk-imported symbol rows. Use init-syntax with-expression
// so the returned options keeps every other field (Devices, Probe, Timeout, …)
// untouched.
var merged = new List<AbLegacyTagDefinition>(options.Tags.Count + result.Tags.Count);
merged.AddRange(options.Tags);
merged.AddRange(result.Tags);
return new AbLegacyDriverOptions
{
Devices = options.Devices,
Tags = merged,
Probe = options.Probe,
Timeout = options.Timeout,
Retries = options.Retries,
};
}
/// <summary>
/// CLI-friendly overload that returns the <see cref="RsLogixImportResult"/> alongside
/// the modified options as a tuple. Mirrors <see cref="AddRsLogixImport"/> but avoids
/// the <c>out</c> parameter for call sites that prefer pattern-matched destructuring.
/// </summary>
public static (AbLegacyDriverOptions Options, RsLogixImportResult Result) AddRsLogixImportWithResult(
this AbLegacyDriverOptions options,
string path,
string deviceHostAddress,
ImportOptions? importOptions = null,
ILogger<RsLogixSymbolImport>? logger = null)
{
var updated = options.AddRsLogixImport(path, deviceHostAddress, out var result, importOptions, logger);
return (updated, result);
}
private static T ParseEnum<T>(string? raw, string driverInstanceId, string field,
string? tagName = null, T? fallback = null) where T : struct, Enum
{

View File

@@ -0,0 +1,40 @@
namespace ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Import;
/// <summary>
/// Materialises <see cref="AbLegacyTagDefinition"/> entries from an RSLogix export. v1 ships
/// a single implementation (<see cref="RsLogixSymbolImport"/>) for text/CSV "Database
/// Export" — RSLogix 500's <c>.RSS</c> and RSLogix 5's <c>.RSP</c> binary project files are
/// proprietary and out of scope (no parser ships with libplctag or any community library at
/// the time of writing). The interface exists so a binary parser can slot in later without
/// reshaping the call sites.
/// </summary>
/// <remarks>
/// <para>
/// The <c>deviceHostAddress</c> parameter on <see cref="Parse"/> is required because RSLogix
/// exports list addresses scoped to a single PLC; the importer needs to stamp every
/// resulting tag with the gateway address that the runtime layer will use to reach
/// it. Multi-device deployments call the importer once per device, then concatenate.
/// </para>
/// <para>
/// <see cref="Parse"/> never throws on parse errors when
/// <see cref="ImportOptions.IgnoreInvalid"/> is <c>true</c> (default) — malformed rows
/// are skipped with a structured warning logged via the importer's <c>ILogger</c>, and
/// the counts surface on <see cref="RsLogixImportResult"/>. With
/// <see cref="ImportOptions.IgnoreInvalid"/> set to <c>false</c> the first malformed row
/// throws <see cref="System.IO.InvalidDataException"/>.
/// </para>
/// </remarks>
public interface IRsLogixImporter
{
/// <summary>
/// Read the entire <paramref name="stream"/> and emit one
/// <see cref="AbLegacyTagDefinition"/> per recognised symbol row.
/// </summary>
/// <param name="stream">Open, readable stream over the RSLogix export. Caller owns it.</param>
/// <param name="deviceHostAddress">
/// Canonical AB Legacy gateway URI (<c>ab://host[:port]/cip-path</c>) the resulting
/// tags should bind to.
/// </param>
/// <param name="options">Filter + safety knobs; <c>null</c> ≡ default options.</param>
RsLogixImportResult Parse(Stream stream, string deviceHostAddress, ImportOptions? options = null);
}

View File

@@ -0,0 +1,27 @@
namespace ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Import;
/// <summary>
/// Options that drive an <see cref="IRsLogixImporter"/> run. Captures the few knobs that
/// reasonably differ between projects without forcing a dedicated subclass per import shape:
/// scope filter (Global vs. Local:N), maximum rows to keep (defensive cap on suspicious
/// exports), and whether to silently drop malformed rows or surface a parse exception.
/// </summary>
/// <remarks>
/// <para>
/// <see cref="ScopeFilter"/> matches the optional <c>Scope</c> column on RSLogix CSV
/// exports — "Global" tags live at the project root; "Local:1" / "Local:2" / etc. are
/// scoped to ladder file 1, ladder file 2, etc. When non-null, only rows whose
/// <c>Scope</c> value matches case-insensitively are emitted; rows with no <c>Scope</c>
/// column are treated as Global.
/// </para>
/// <para>
/// <see cref="IgnoreInvalid"/> defaults to <c>true</c> — RSLogix exports tend to carry
/// the occasional cosmetic row (single-letter alias, comment-only rows, blank lines)
/// and the v1 contract is "import what we can, log a warning for everything else".
/// Set to <c>false</c> to fail-fast on the first malformed row (useful for CI lint).
/// </para>
/// </remarks>
public sealed record ImportOptions(
string? ScopeFilter = null,
int? MaxRowsToImport = null,
bool IgnoreInvalid = true);

View File

@@ -0,0 +1,14 @@
namespace ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Import;
/// <summary>
/// Outcome of a single <see cref="IRsLogixImporter"/> run. <see cref="Tags"/> carries the
/// imported tag definitions ready to drop into <c>AbLegacyDriverOptions.Tags</c>;
/// <see cref="ParsedCount"/>, <see cref="SkippedCount"/>, and <see cref="ErrorCount"/>
/// give the operator a single line of telemetry ("imported 142 / skipped 3 / errored 0")
/// suitable for either a CLI summary or a startup-time log line.
/// </summary>
public sealed record RsLogixImportResult(
IReadOnlyList<AbLegacyTagDefinition> Tags,
int ParsedCount,
int SkippedCount,
int ErrorCount);

View File

@@ -0,0 +1,325 @@
using System.Globalization;
using System.IO;
using System.Text;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
namespace ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Import;
/// <summary>
/// Materialises <see cref="AbLegacyTagDefinition"/> entries from RSLogix 500 / 5
/// "Database Export" CSV. The expected column shape is
/// <c>Symbol,Address,Description,DataType,Scope</c> — a slight superset of what RSLogix
/// itself emits ("DataType" is RSLogix-supplied for symbol exports but ignored here in
/// favour of the file-letter prefix on <c>Address</c>; it is left in the schema for
/// forward-compatibility with editor tools that prefer to drive the type explicitly).
/// </summary>
/// <remarks>
/// <para>
/// The parser is deliberately tolerant: header row + comment lines (starting with
/// <c>;</c> or <c>#</c>) are skipped silently, headers are matched case-insensitively,
/// and quoted fields handle embedded commas the way RFC 4180 prescribes ("foo,bar"
/// → <c>foo,bar</c>; doubled quotes inside a quoted field collapse to a single
/// literal quote).
/// </para>
/// <para>
/// Type resolution defers to <see cref="AbLegacyAddress.TryParse(string?)"/> +
/// <see cref="TryResolveDataType"/> so the whole "what kind of file is N7?" knowledge
/// lives in one place. Function-file (<c>RTC</c>, <c>HSC</c>, …) and structure-file
/// (<c>PD</c>, <c>MG</c>, <c>PLS</c>, <c>BT</c>) prefixes are accepted but parsed
/// conditionally on <see cref="PlcFamilies.AbLegacyPlcFamily"/>; for the import path
/// we don't yet know the family so we use Slc500 as the parser context — that family
/// covers every common letter <see cref="RsLogixSymbolImport"/> needs to classify.
/// </para>
/// <para>
/// <see cref="System.IO.InvalidDataException"/> surfaces only when
/// <see cref="ImportOptions.IgnoreInvalid"/> is <c>false</c> — the default permissive
/// path logs a warning per malformed row and bumps the <c>SkippedCount</c> /
/// <c>ErrorCount</c> totals on <see cref="RsLogixImportResult"/>.
/// </para>
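/// <para>
/// Typical call shape (the file path and device URI below are illustrative):
/// <code>
/// var importer = new RsLogixSymbolImport();
/// using var fs = File.OpenRead(@"C:\plc\plc-export.csv");
/// var result = importer.Parse(fs, "ab://192.168.1.20/1,0",
///     new ImportOptions(ScopeFilter: "Local:1"));
/// // result.Tags carries one AbLegacyTagDefinition per imported row;
/// // result.ParsedCount / SkippedCount / ErrorCount summarise the run.
/// </code>
/// </para>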
/// </remarks>
public sealed class RsLogixSymbolImport : IRsLogixImporter
{
private readonly ILogger<RsLogixSymbolImport> _logger;
public RsLogixSymbolImport() : this(NullLogger<RsLogixSymbolImport>.Instance) { }
public RsLogixSymbolImport(ILogger<RsLogixSymbolImport> logger)
{
_logger = logger ?? NullLogger<RsLogixSymbolImport>.Instance;
}
/// <inheritdoc />
public RsLogixImportResult Parse(Stream stream, string deviceHostAddress, ImportOptions? options = null)
{
ArgumentNullException.ThrowIfNull(stream);
ArgumentException.ThrowIfNullOrWhiteSpace(deviceHostAddress);
var opts = options ?? new ImportOptions();
var tags = new List<AbLegacyTagDefinition>();
var parsed = 0;
var skipped = 0;
var errors = 0;
            // detectEncodingFromByteOrderMarks=true honours UTF-8 BOMs (RSLogix tools on Windows
            // often emit them) without making the caller reach for a pre-decoded TextReader.
// leaveOpen=true lets the caller manage the stream's lifecycle.
using var reader = new StreamReader(stream, Encoding.UTF8, detectEncodingFromByteOrderMarks: true, bufferSize: 4096, leaveOpen: true);
int? symbolIdx = null;
int? addressIdx = null;
int? descriptionIdx = null;
int? dataTypeIdx = null;
int? scopeIdx = null;
var headerSeen = false;
var lineNumber = 0;
string? line;
while ((line = reader.ReadLine()) is not null)
{
lineNumber++;
if (string.IsNullOrWhiteSpace(line)) continue;
var trimmed = line.TrimStart();
if (trimmed.StartsWith(';') || trimmed.StartsWith('#')) continue;
var fields = SplitCsv(line);
if (fields.Count == 0) continue;
if (!headerSeen)
{
// First non-blank, non-comment row — treat as header. Map every column we
// recognise; missing required columns short-circuit the whole run with a
// single InvalidDataException because the failure is structural, not
// per-row.
for (var i = 0; i < fields.Count; i++)
{
var header = fields[i].Trim().ToLowerInvariant();
switch (header)
{
case "symbol": symbolIdx = i; break;
case "address": addressIdx = i; break;
case "description": descriptionIdx = i; break;
case "datatype":
case "data type":
case "type": dataTypeIdx = i; break;
case "scope": scopeIdx = i; break;
}
}
if (symbolIdx is null || addressIdx is null)
{
throw new InvalidDataException(
$"RSLogix import header at line {lineNumber} is missing required Symbol or Address column. " +
$"Got: {string.Join(",", fields)}");
}
headerSeen = true;
continue;
}
if (opts.MaxRowsToImport is int cap && parsed >= cap)
{
_logger.LogWarning(
"RSLogix import hit MaxRowsToImport={Cap} at line {LineNumber}; remaining rows skipped.",
cap, lineNumber);
break;
}
// Per-row error scoping — we want a single bad row to skip cleanly without
// dropping the rest of the file. The else branch in IgnoreInvalid=false mode
// re-throws to surface the failure to the caller.
try
{
// symbolIdx + addressIdx are guaranteed non-null past the header gate above.
var symbol = SafeField(fields, symbolIdx!.Value);
var address = SafeField(fields, addressIdx!.Value);
var description = descriptionIdx.HasValue ? SafeField(fields, descriptionIdx.Value) : null;
var scope = scopeIdx.HasValue ? SafeField(fields, scopeIdx.Value) : null;
if (string.IsNullOrWhiteSpace(symbol) || string.IsNullOrWhiteSpace(address))
{
skipped++;
_logger.LogWarning(
"RSLogix CSV row at line {LineNumber} skipped — missing Symbol or Address (symbol='{Symbol}', address='{Address}').",
lineNumber, symbol, address);
continue;
}
// Scope filter: row's Scope (or "Global" when blank) must match the filter
// case-insensitively. RSLogix CSV scope values look like "Global" or
// "Local:N" / "LOCAL:1" depending on the tool that emitted them.
if (opts.ScopeFilter is { } wanted)
{
var actual = string.IsNullOrWhiteSpace(scope) ? "Global" : scope.Trim();
if (!string.Equals(actual, wanted.Trim(), StringComparison.OrdinalIgnoreCase))
{
skipped++;
continue;
}
}
if (!TryResolveDataType(address.Trim(), out var dataType))
{
if (!opts.IgnoreInvalid)
{
throw new InvalidDataException(
$"RSLogix CSV row at line {lineNumber} has unrecognised PCCC address '{address}'.");
}
errors++;
_logger.LogWarning(
"RSLogix CSV row at line {LineNumber} skipped — unrecognised PCCC address '{Address}'.",
lineNumber, address);
continue;
}
// Description column is parsed but currently unused — AbLegacyTagDefinition
// doesn't carry a Description field today (the v2 schema ledger lives on the
// server's metadata side of the bridge per #248). We retain the column in the
// CSV header contract so a future schema bump can pick it up without breaking
// existing exports. _ discard suppresses the unused-local warning.
_ = description;
tags.Add(new AbLegacyTagDefinition(
Name: symbol.Trim(),
DeviceHostAddress: deviceHostAddress,
Address: address.Trim(),
DataType: dataType,
Writable: true));
parsed++;
}
catch (InvalidDataException) when (opts.IgnoreInvalid)
{
errors++;
_logger.LogWarning("RSLogix CSV row at line {LineNumber} skipped — invalid data.", lineNumber);
}
catch (Exception ex) when (opts.IgnoreInvalid)
{
errors++;
_logger.LogWarning(ex, "RSLogix CSV row at line {LineNumber} skipped — parser threw.", lineNumber);
}
}
if (!headerSeen)
{
// Empty CSV (only blanks / comments) — return an empty result rather than
// surface a "no header found" error. The CLI will report parsed=0 which is the
// honest answer.
return new RsLogixImportResult([], 0, skipped, errors);
}
return new RsLogixImportResult(tags, parsed, skipped, errors);
}
/// <summary>
/// Resolve a PCCC <paramref name="address"/> to the matching
/// <see cref="AbLegacyDataType"/>. Returns <c>false</c> for unparsable addresses.
/// </summary>
/// <remarks>
/// The mapping follows the file-letter table on
/// <see cref="AbLegacyAddress"/> doc comments:
/// N→Int, F→Float, B→Bit, L→Long, ST→String, T→TimerElement, C→CounterElement,
/// R→ControlElement, A→AnalogInt, S/I/O→Int (status / I/O bits resolve as Bit when
/// the address carries a <c>/N</c> bit suffix), PD→PidElement, MG→MessageElement,
/// PLS→PlsElement, BT→BlockTransferElement, function-file letters (RTC/HSC/etc.) →
/// MicroLogixFunctionFile.
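    /// <para>
    /// Examples (per the file-letter table above):
    /// <code>
    /// RsLogixSymbolImport.TryResolveDataType("F8:3", out var dt);  // true, dt == Float
    /// RsLogixSymbolImport.TryResolveDataType("N7:0/3", out dt);    // true, dt == Bit (bit suffix wins)
    /// RsLogixSymbolImport.TryResolveDataType("garbage", out dt);   // false
    /// </code>
    /// </para>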
/// </remarks>
public static bool TryResolveDataType(string address, out AbLegacyDataType dataType)
{
dataType = AbLegacyDataType.Int;
        // Try each family in turn: PLC-5 first (octal I:/O: addressing plus the PD/MG/PLS/BT
        // structure files), then SLC 500, then MicroLogix (function files). Family-specific
        // gating only matters for runtime addressing, not for shape classification at import
        // time, so the first successful parse wins.
var parsed = AbLegacyAddress.TryParse(address, PlcFamilies.AbLegacyPlcFamily.Plc5)
?? AbLegacyAddress.TryParse(address, PlcFamilies.AbLegacyPlcFamily.Slc500)
?? AbLegacyAddress.TryParse(address, PlcFamilies.AbLegacyPlcFamily.MicroLogix);
if (parsed is null) return false;
var letter = parsed.FileLetter;
// Bit-within-word references on N/L/I/O/S files surface as Bit regardless of the
// base file type. B-file references with no bit suffix are rare in real exports
// but still classify as Bit (the wire-level element is a single word — Rockwell
// convention is one bool per word).
if (parsed.BitIndex is not null)
{
dataType = AbLegacyDataType.Bit;
return true;
}
dataType = letter switch
{
"N" => AbLegacyDataType.Int,
"F" => AbLegacyDataType.Float,
"B" => AbLegacyDataType.Bit,
"L" => AbLegacyDataType.Long,
"ST" => AbLegacyDataType.String,
"T" => AbLegacyDataType.TimerElement,
"C" => AbLegacyDataType.CounterElement,
"R" => AbLegacyDataType.ControlElement,
"A" => AbLegacyDataType.AnalogInt,
"I" or "O" or "S" => AbLegacyDataType.Int,
"PD" => AbLegacyDataType.PidElement,
"MG" => AbLegacyDataType.MessageElement,
"PLS" => AbLegacyDataType.PlsElement,
"BT" => AbLegacyDataType.BlockTransferElement,
_ when AbLegacyAddress.IsFunctionFileLetter(letter) => AbLegacyDataType.MicroLogixFunctionFile,
_ => AbLegacyDataType.Int,
};
return true;
}
private static string SafeField(IReadOnlyList<string> fields, int idx) =>
idx >= 0 && idx < fields.Count ? fields[idx] : string.Empty;
/// <summary>
/// RFC 4180-ish CSV splitter — quoted fields, doubled-quote escape, embedded comma
/// inside quoted fields. Avoids a third-party CSV dependency for a five-column
/// parser.
/// </summary>
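    /// <remarks>
    /// Behaviour sketch (inputs shown as C# string literals):
    /// <code>
    /// SplitCsv("a,\"b,c\",d")              // ["a", "b,c", "d"]
    /// SplitCsv("x,\"He said \"\"hi\"\"\"") // ["x", "He said \"hi\""]
    /// </code>
    /// </remarks>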
internal static List<string> SplitCsv(string line)
{
var fields = new List<string>();
var sb = new StringBuilder(line.Length);
var inQuotes = false;
for (var i = 0; i < line.Length; i++)
{
var c = line[i];
if (inQuotes)
{
if (c == '"')
{
// Doubled quote inside a quoted field is a literal `"`; otherwise the
// quote terminates the quoted segment.
if (i + 1 < line.Length && line[i + 1] == '"')
{
sb.Append('"');
i++;
}
else
{
inQuotes = false;
}
}
else
{
sb.Append(c);
}
}
else
{
switch (c)
{
case '"':
inQuotes = true;
break;
case ',':
fields.Add(sb.ToString());
sb.Clear();
break;
default:
sb.Append(c);
break;
}
}
}
fields.Add(sb.ToString());
return fields;
}
}


@@ -22,6 +22,10 @@
Decision #41 — AbLegacy split from AbCip since PCCC addressing (file-based N7:0) and
Logix addressing (symbolic Motor1.Speed) pull the abstraction in incompatible directions. -->
<PackageReference Include="libplctag" Version="1.5.2"/>
<!-- ablegacy-11 / #254 — RsLogixSymbolImport logs warnings for malformed CSV rows
via ILogger so import-time issues surface in Serilog without making the importer
throw. Abstractions only — runtime sink is the host's responsibility. -->
<PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.0"/>
</ItemGroup>
<ItemGroup>


@@ -0,0 +1,190 @@
using System.IO;
using System.Text.Json;
using CliFx.Exceptions;
using CliFx.Infrastructure;
using Shouldly;
using Xunit;
using ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Cli.Commands;
namespace ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Cli.Tests;
/// <summary>
/// Coverage for the <c>import-rslogix</c> CLI command. The command is intentionally
/// thin (open file, hand to <c>RsLogixSymbolImport</c>, serialise) — these tests focus
/// on the I/O + flag-handling shape rather than re-running the parser.
/// </summary>
[Trait("Category", "Unit")]
public sealed class ImportRslogixCommandTests
{
private const string CanonicalCsv = """
Symbol,Address,Description,DataType,Scope
MotorSpeed,N7:0,Motor speed,INT,Global
TankLevel,F8:0,Tank level,REAL,Global
RunFlag,B3:0/0,Run flag,BOOL,Global
""";
[Fact]
public async Task Execute_with_valid_csv_emits_json_fragment_with_three_tags()
{
var path = Path.Combine(Path.GetTempPath(), $"rslogix-cli-{Guid.NewGuid():N}.csv");
File.WriteAllText(path, CanonicalCsv);
try
{
using var console = new FakeInMemoryConsole();
var cmd = new ImportRslogixCommand
{
File = path,
Device = "ab://10.0.0.5/1,0",
Emit = "appsettings-fragment",
};
await cmd.ExecuteAsync(console);
var output = console.ReadOutputString();
output.ShouldContain("\"Tags\"");
// Parse the emitted JSON and assert the structural properties — flake-resistant
// vs. comparing whitespace-sensitive text.
using var doc = JsonDocument.Parse(output);
var tags = doc.RootElement.GetProperty("Tags");
tags.GetArrayLength().ShouldBe(3);
tags[0].GetProperty("Name").GetString().ShouldBe("MotorSpeed");
tags[0].GetProperty("DataType").GetString().ShouldBe("Int");
tags[2].GetProperty("DataType").GetString().ShouldBe("Bit");
}
finally
{
try { File.Delete(path); } catch { /* best-effort */ }
}
}
[Fact]
public async Task Execute_with_summary_emit_prints_counters()
{
var path = Path.Combine(Path.GetTempPath(), $"rslogix-cli-{Guid.NewGuid():N}.csv");
File.WriteAllText(path, CanonicalCsv);
try
{
using var console = new FakeInMemoryConsole();
var cmd = new ImportRslogixCommand
{
File = path,
Device = "ab://10.0.0.5/1,0",
Emit = "summary",
};
await cmd.ExecuteAsync(console);
var output = console.ReadOutputString();
output.ShouldContain("Imported 3");
output.ShouldContain("skipped 0");
}
finally
{
try { File.Delete(path); } catch { /* best-effort */ }
}
}
[Fact]
public async Task Execute_with_output_path_writes_file_and_prints_summary()
{
var inputPath = Path.Combine(Path.GetTempPath(), $"rslogix-cli-{Guid.NewGuid():N}.csv");
var outputPath = Path.Combine(Path.GetTempPath(), $"rslogix-cli-{Guid.NewGuid():N}.json");
File.WriteAllText(inputPath, CanonicalCsv);
try
{
using var console = new FakeInMemoryConsole();
var cmd = new ImportRslogixCommand
{
File = inputPath,
Device = "ab://10.0.0.5/1,0",
Emit = "appsettings-fragment",
Output = outputPath,
};
await cmd.ExecuteAsync(console);
File.Exists(outputPath).ShouldBeTrue();
var fileBody = File.ReadAllText(outputPath);
using var doc = JsonDocument.Parse(fileBody);
doc.RootElement.GetProperty("Tags").GetArrayLength().ShouldBe(3);
// Stdout still gets the human-readable summary line.
console.ReadOutputString().ShouldContain("Wrote 3");
}
finally
{
try { File.Delete(inputPath); } catch { /* best-effort */ }
try { File.Delete(outputPath); } catch { /* best-effort */ }
}
}
[Fact]
public async Task Execute_with_missing_file_throws_command_exception()
{
var missing = Path.Combine(Path.GetTempPath(), $"does-not-exist-{Guid.NewGuid():N}.csv");
using var console = new FakeInMemoryConsole();
var cmd = new ImportRslogixCommand
{
File = missing,
Device = "ab://10.0.0.5/1,0",
};
var ex = await Should.ThrowAsync<CommandException>(async () => await cmd.ExecuteAsync(console));
ex.ExitCode.ShouldBe(1);
}
[Fact]
public async Task Execute_with_unknown_emit_throws_command_exception()
{
var path = Path.Combine(Path.GetTempPath(), $"rslogix-cli-{Guid.NewGuid():N}.csv");
File.WriteAllText(path, CanonicalCsv);
try
{
using var console = new FakeInMemoryConsole();
var cmd = new ImportRslogixCommand
{
File = path,
Device = "ab://10.0.0.5/1,0",
Emit = "yaml",
};
var ex = await Should.ThrowAsync<CommandException>(async () => await cmd.ExecuteAsync(console));
ex.ExitCode.ShouldBe(2);
}
finally
{
try { File.Delete(path); } catch { /* best-effort */ }
}
}
[Fact]
public async Task Execute_scope_filter_only_imports_matching_rows()
{
var path = Path.Combine(Path.GetTempPath(), $"rslogix-cli-{Guid.NewGuid():N}.csv");
File.WriteAllText(path, """
Symbol,Address,Description,DataType,Scope
G1,N7:0,desc,INT,Global
L1,N7:1,desc,INT,Local:1
L2,N7:2,desc,INT,Local:2
""");
try
{
using var console = new FakeInMemoryConsole();
var cmd = new ImportRslogixCommand
{
File = path,
Device = "ab://10.0.0.5/1,0",
Emit = "summary",
Scope = "Local:1",
};
await cmd.ExecuteAsync(console);
console.ReadOutputString().ShouldContain("Imported 1");
}
finally
{
try { File.Delete(path); } catch { /* best-effort */ }
}
}
}


@@ -0,0 +1,60 @@
{
"Tags": [
{
"Name": "MotorSpeed",
"DeviceHostAddress": "ab://192.168.1.20/1,0",
"Address": "N7:0",
"DataType": "Int",
"Writable": true
},
{
"Name": "TankLevel",
"DeviceHostAddress": "ab://192.168.1.20/1,0",
"Address": "F8:0",
"DataType": "Float",
"Writable": true
},
{
"Name": "RunFlag",
"DeviceHostAddress": "ab://192.168.1.20/1,0",
"Address": "B3:0/0",
"DataType": "Bit",
"Writable": true
},
{
"Name": "TotalCount",
"DeviceHostAddress": "ab://192.168.1.20/1,0",
"Address": "L9:0",
"DataType": "Long",
"Writable": true
},
{
"Name": "RecipeName",
"DeviceHostAddress": "ab://192.168.1.20/1,0",
"Address": "ST10:0",
"DataType": "String",
"Writable": true
},
{
"Name": "DwellTimer",
"DeviceHostAddress": "ab://192.168.1.20/1,0",
"Address": "T4:0.ACC",
"DataType": "TimerElement",
"Writable": true
},
{
"Name": "PieceCounter",
"DeviceHostAddress": "ab://192.168.1.20/1,0",
"Address": "C5:0.ACC",
"DataType": "CounterElement",
"Writable": true
},
{
"Name": "StateMachine",
"DeviceHostAddress": "ab://192.168.1.20/1,0",
"Address": "R6:0.LEN",
"DataType": "ControlElement",
"Writable": true
}
]
}


@@ -0,0 +1,13 @@
; ablegacy-11 / #254 — canonical RSLogix CSV symbol export covering one row per
; file letter the v1 importer recognises (N/F/B/L/ST/T/C/R). Comment lines
; (starting with `;` or `#`) are skipped by the parser so this header doc
; survives a round-trip without affecting the imported tag count.
Symbol,Address,Description,DataType,Scope
MotorSpeed,N7:0,Motor speed setpoint,INT,Global
TankLevel,F8:0,Tank level (gallons),REAL,Global
RunFlag,B3:0/0,Run command flag,BOOL,Global
TotalCount,L9:0,Total piece count,LINT,Global
RecipeName,ST10:0,"Recipe name, free-form text",STRING,Global
DwellTimer,T4:0.ACC,Dwell timer accumulator,TIMER,Global
PieceCounter,C5:0.ACC,Piece counter accumulator,COUNTER,Global
StateMachine,R6:0.LEN,State-machine control length,CONTROL,Global


@@ -0,0 +1,82 @@
using System.IO;
using Shouldly;
using Xunit;
using ZB.MOM.WW.OtOpcUa.Driver.AbLegacy;
using ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.PlcFamilies;
namespace ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Tests.Import;
/// <summary>
/// Coverage for <see cref="AbLegacyDriverFactoryExtensions.AddRsLogixImport"/> — the
/// extension method that opens a CSV file and concatenates the resulting tag definitions
/// onto an existing <see cref="AbLegacyDriverOptions"/>.
/// </summary>
[Trait("Category", "Unit")]
public sealed class AbLegacyDriverFactoryAddRsLogixImportTests
{
[Fact]
public void AddRsLogixImport_appends_tags_and_preserves_existing_options()
{
// Existing options have one device + one hand-rolled tag. The importer should
// append on top — never replace — so the device + the original tag survive.
var path = Path.Combine(Path.GetTempPath(), $"rslogix-import-{Guid.NewGuid():N}.csv");
File.WriteAllText(path, """
Symbol,Address,Description,DataType,Scope
New1,N7:0,desc,INT,Global
New2,F8:0,desc,REAL,Global
""");
try
{
var existingTag = new AbLegacyTagDefinition(
Name: "Manual",
DeviceHostAddress: "ab://10.0.0.1/1,0",
Address: "S:0",
DataType: AbLegacyDataType.Int);
var options = new AbLegacyDriverOptions
{
Devices = [new AbLegacyDeviceOptions("ab://10.0.0.1/1,0", AbLegacyPlcFamily.Slc500)],
Tags = [existingTag],
};
var updated = options.AddRsLogixImport(path, "ab://10.0.0.1/1,0", out var result);
// Imported counts surface on the result.
result.ParsedCount.ShouldBe(2);
// Devices + the original Manual tag are preserved on the returned options.
updated.Devices.Count.ShouldBe(1);
updated.Tags.Count.ShouldBe(3);
updated.Tags[0].Name.ShouldBe("Manual");
updated.Tags[1].Name.ShouldBe("New1");
updated.Tags[2].Name.ShouldBe("New2");
// Original options object is unchanged (immutability guarantee).
options.Tags.Count.ShouldBe(1);
}
finally
{
try { File.Delete(path); } catch { /* best-effort */ }
}
}
[Fact]
public void AddRsLogixImportWithResult_returns_tuple()
{
var path = Path.Combine(Path.GetTempPath(), $"rslogix-import-{Guid.NewGuid():N}.csv");
File.WriteAllText(path, """
Symbol,Address,Description,DataType,Scope
T,N7:0,desc,INT,Global
""");
try
{
var options = new AbLegacyDriverOptions();
var (updated, result) = options.AddRsLogixImportWithResult(path, "ab://10.0.0.1/1,0");
result.ParsedCount.ShouldBe(1);
updated.Tags.Count.ShouldBe(1);
}
finally
{
try { File.Delete(path); } catch { /* best-effort */ }
}
}
}


@@ -0,0 +1,107 @@
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Microsoft.Extensions.Logging.Abstractions;
using Shouldly;
using Xunit;
using ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Import;
namespace ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Tests.Import;
/// <summary>
/// End-to-end golden-snapshot test. Loads the canonical CSV fixture from
/// <c>Fixtures/rslogix-canonical.csv</c>, runs it through
/// <see cref="RsLogixSymbolImport"/>, then compares the resulting tag list to
/// <c>Fixtures/rslogix-canonical-expected.json</c>.
/// </summary>
/// <remarks>
/// <para>
/// Mismatch path: the test writes the actual JSON to a temp file path and prints the
/// path in the failure message, so the dev can run
/// <c>cp $TEMP/rslogix-canonical-actual.json tests/.../Fixtures/rslogix-canonical-expected.json</c>
/// to bless the new shape. Treats both sides as <see cref="JsonNode"/> trees so
/// insignificant whitespace + key-order differences don't false-fail.
/// </para>
/// </remarks>
[Trait("Category", "Unit")]
public sealed class RsLogixSymbolImportGoldenTests
{
private const string Device = "ab://192.168.1.20/1,0";
private static string FixturePath(string name) =>
Path.Combine(AppContext.BaseDirectory, "Fixtures", name);
[Fact]
public void Canonical_csv_matches_golden_json()
{
var importer = new RsLogixSymbolImport(NullLogger<RsLogixSymbolImport>.Instance);
using var stream = File.OpenRead(FixturePath("rslogix-canonical.csv"));
var result = importer.Parse(stream, Device);
result.ParsedCount.ShouldBe(8);
result.SkippedCount.ShouldBe(0);
result.ErrorCount.ShouldBe(0);
var actualPayload = new
{
Tags = result.Tags.Select(t => new
{
Name = t.Name,
DeviceHostAddress = t.DeviceHostAddress,
Address = t.Address,
DataType = t.DataType.ToString(),
Writable = t.Writable,
}).ToArray()
};
var actualJson = JsonSerializer.Serialize(actualPayload,
new JsonSerializerOptions { WriteIndented = true });
var expectedJson = File.ReadAllText(FixturePath("rslogix-canonical-expected.json"));
var actualNode = JsonNode.Parse(actualJson)!;
var expectedNode = JsonNode.Parse(expectedJson)!;
if (!JsonTreesEqual(actualNode, expectedNode))
{
// Dump the actual JSON to a discoverable temp path so the dev can `cp` it over
// the fixture once they've reviewed the diff. The test message points straight
// at the file.
var dump = Path.Combine(Path.GetTempPath(), "rslogix-canonical-actual.json");
File.WriteAllText(dump, actualJson);
throw new Xunit.Sdk.XunitException(
$"RSLogix golden mismatch. Actual written to: {dump}\n--- Expected ---\n{expectedJson}\n--- Actual ---\n{actualJson}");
}
}
/// <summary>
/// Structural JSON equality — recursively compares two <see cref="JsonNode"/> trees
/// by shape + value, ignoring property-order on objects. Cheaper than pulling in a
/// dedicated diff library for one assertion.
/// </summary>
private static bool JsonTreesEqual(JsonNode? a, JsonNode? b)
{
if (a is null && b is null) return true;
if (a is null || b is null) return false;
if (a is JsonObject ao && b is JsonObject bo)
{
if (ao.Count != bo.Count) return false;
foreach (var kvp in ao)
{
if (!bo.TryGetPropertyValue(kvp.Key, out var bv)) return false;
if (!JsonTreesEqual(kvp.Value, bv)) return false;
}
return true;
}
if (a is JsonArray aa && b is JsonArray ba)
{
if (aa.Count != ba.Count) return false;
for (var i = 0; i < aa.Count; i++)
{
if (!JsonTreesEqual(aa[i], ba[i])) return false;
}
return true;
}
// Primitives — fall back to canonical JSON form for value equality.
return a.ToJsonString() == b.ToJsonString();
}
}


@@ -0,0 +1,266 @@
using System.IO;
using System.Text;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using Shouldly;
using Xunit;
using ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Import;
namespace ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.Tests.Import;
/// <summary>
/// Unit coverage for <see cref="RsLogixSymbolImport"/>. Drives the parser through
/// synthesised in-memory streams — the golden-snapshot fixture has its own dedicated
/// test class (<see cref="RsLogixSymbolImportGoldenTests"/>).
/// </summary>
[Trait("Category", "Unit")]
public sealed class RsLogixSymbolImportTests
{
private const string Device = "ab://10.0.0.5/1,0";
private static RsLogixImportResult ParseString(string csv, ImportOptions? opts = null)
{
var importer = new RsLogixSymbolImport(NullLogger<RsLogixSymbolImport>.Instance);
using var stream = new MemoryStream(Encoding.UTF8.GetBytes(csv));
return importer.Parse(stream, Device, opts);
}
[Fact]
public void Parse_canonical_eight_file_letters_yields_eight_typed_tags()
{
// One row per file letter the v1 contract supports — N/F/B/L/ST/T/C/R. The expected
// DataType is the file-letter resolution from RsLogixSymbolImport.TryResolveDataType,
// not whatever the RSLogix-supplied DataType column says.
const string csv = """
Symbol,Address,Description,DataType,Scope
S_N,N7:0,n,INT,Global
S_F,F8:0,f,REAL,Global
S_B,B3:0/0,b,BOOL,Global
S_L,L9:0,l,LINT,Global
S_ST,ST10:0,s,STRING,Global
S_T,T4:0.ACC,t,TIMER,Global
S_C,C5:0.ACC,c,COUNTER,Global
S_R,R6:0.LEN,r,CONTROL,Global
""";
var result = ParseString(csv);
result.ParsedCount.ShouldBe(8);
result.SkippedCount.ShouldBe(0);
result.ErrorCount.ShouldBe(0);
result.Tags.Count.ShouldBe(8);
result.Tags[0].DataType.ShouldBe(AbLegacyDataType.Int);
result.Tags[1].DataType.ShouldBe(AbLegacyDataType.Float);
result.Tags[2].DataType.ShouldBe(AbLegacyDataType.Bit);
result.Tags[3].DataType.ShouldBe(AbLegacyDataType.Long);
result.Tags[4].DataType.ShouldBe(AbLegacyDataType.String);
result.Tags[5].DataType.ShouldBe(AbLegacyDataType.TimerElement);
result.Tags[6].DataType.ShouldBe(AbLegacyDataType.CounterElement);
result.Tags[7].DataType.ShouldBe(AbLegacyDataType.ControlElement);
// Every tag should bind to the supplied device gateway and use the symbol verbatim
// for its Name (no synthesised key — RSLogix symbols are already stable).
result.Tags.ShouldAllBe(t => t.DeviceHostAddress == Device);
result.Tags[0].Name.ShouldBe("S_N");
result.Tags[2].Address.ShouldBe("B3:0/0");
}
[Fact]
public void Parse_skips_header_and_comment_lines()
{
// Comment lines (`;` / `#`) live both before and after the header — both forms must
// survive the parser without bumping the tag count.
const string csv = """
; top-level comment
# also a comment
Symbol,Address,Description,DataType,Scope
; mid-stream comment
Tag1,N7:0,desc,INT,Global
# another comment
Tag2,F8:0,desc,REAL,Global
""";
var result = ParseString(csv);
result.ParsedCount.ShouldBe(2);
result.Tags[0].Name.ShouldBe("Tag1");
result.Tags[1].Name.ShouldBe("Tag2");
}
[Fact]
public void Parse_malformed_row_skips_with_log_warning()
{
// Malformed rows: missing address. Default IgnoreInvalid=true skips them with a
// warning logged — the surviving row still imports cleanly.
var collector = new ListLogger<RsLogixSymbolImport>();
const string csv = """
Symbol,Address,Description,DataType,Scope
Good,N7:0,ok,INT,Global
Broken,,still here,INT,Global
AlsoGood,F8:0,ok,REAL,Global
""";
var importer = new RsLogixSymbolImport(collector);
using var stream = new MemoryStream(Encoding.UTF8.GetBytes(csv));
var result = importer.Parse(stream, Device);
result.ParsedCount.ShouldBe(2);
result.SkippedCount.ShouldBe(1);
// The warning channel saw exactly one entry for the broken row.
collector.Messages.ShouldContain(m => m.Contains("Broken") || m.Contains("missing"));
}
[Fact]
public void Parse_empty_stream_returns_empty_result()
{
var result = ParseString(string.Empty);
result.ParsedCount.ShouldBe(0);
result.SkippedCount.ShouldBe(0);
result.ErrorCount.ShouldBe(0);
result.Tags.ShouldBeEmpty();
}
[Fact]
public void Parse_handles_quoted_field_with_embedded_comma()
{
// Description with an embedded `,` must round-trip through the RFC-4180 splitter
// without splitting the row into extra fields. Address column resolution still
// lands on the correct file letter.
const string csv = """
Symbol,Address,Description,DataType,Scope
Mixer,N7:5,"Mixer speed, RPM",INT,Global
""";
var result = ParseString(csv);
result.ParsedCount.ShouldBe(1);
result.Tags[0].Name.ShouldBe("Mixer");
result.Tags[0].Address.ShouldBe("N7:5");
result.Tags[0].DataType.ShouldBe(AbLegacyDataType.Int);
}
[Fact]
public void Parse_doubled_quote_inside_quoted_field_decodes_to_single_quote()
{
// RFC 4180 doubled-quote escape — `""` inside a quoted field is a literal `"`.
// Four-quote raw delimiter so the embedded triple-quote sequence in the CSV
// payload doesn't terminate the literal early.
const string csv = """"
Symbol,Address,Description,DataType,Scope
Quoted,N7:0,"He said ""hi""",INT,Global
"""";
var result = ParseString(csv);
result.ParsedCount.ShouldBe(1);
// The description goes to /dev/null today (AbLegacyTagDefinition has no Description
// field) but the parser still has to consume the row without splitting on the inner
// quotes — a parse-side regression would emit ParsedCount=0 / ErrorCount>=1.
result.ErrorCount.ShouldBe(0);
}
[Fact]
public void Parse_scope_filter_drops_non_matching_rows()
{
const string csv = """
Symbol,Address,Description,DataType,Scope
G,N7:0,global,INT,Global
L1,N7:1,local one,INT,Local:1
L2,N7:2,local two,INT,Local:2
""";
var result = ParseString(csv, new ImportOptions(ScopeFilter: "Local:1"));
result.ParsedCount.ShouldBe(1);
result.Tags[0].Name.ShouldBe("L1");
result.SkippedCount.ShouldBe(2);
}
[Fact]
public void Parse_handles_utf8_bom()
{
        // RSLogix tools on Windows emit UTF-8 with BOM — make sure the StreamReader's
        // detectEncodingFromByteOrderMarks=true strips the BOM rather than letting it
        // become part of the first column header (which would knock out the Symbol mapping).
const string csv = "Symbol,Address,Description,DataType,Scope\nT,N7:0,desc,INT,Global\n";
var bom = Encoding.UTF8.GetPreamble();
var bytes = Encoding.UTF8.GetBytes(csv);
var withBom = new byte[bom.Length + bytes.Length];
bom.CopyTo(withBom, 0);
bytes.CopyTo(withBom, bom.Length);
var importer = new RsLogixSymbolImport(NullLogger<RsLogixSymbolImport>.Instance);
using var stream = new MemoryStream(withBom);
var result = importer.Parse(stream, Device);
result.ParsedCount.ShouldBe(1);
result.Tags[0].Name.ShouldBe("T");
}
[Fact]
public void Parse_strict_mode_throws_on_first_invalid_address()
{
const string csv = """
Symbol,Address,Description,DataType,Scope
Good,N7:0,ok,INT,Global
Broken,not-a-pccc-address,bad,INT,Global
""";
// IgnoreInvalid=false → the unrecognised PCCC address surfaces as InvalidDataException
// rather than the silent-skip path.
Should.Throw<InvalidDataException>(
() => ParseString(csv, new ImportOptions(IgnoreInvalid: false)));
}
[Fact]
public void Parse_max_rows_caps_imports()
{
const string csv = """
Symbol,Address,Description,DataType,Scope
A,N7:0,a,INT,Global
B,N7:1,b,INT,Global
C,N7:2,c,INT,Global
D,N7:3,d,INT,Global
""";
var result = ParseString(csv, new ImportOptions(MaxRowsToImport: 2));
result.ParsedCount.ShouldBe(2);
result.Tags.Count.ShouldBe(2);
}
[Fact]
public void Parse_missing_required_column_throws_invalid_data()
{
// No Address column at all — structural failure, not per-row. Throws regardless of
// the IgnoreInvalid knob (the latter governs per-row failures, not header shape).
const string csv = """
Symbol,Description,DataType,Scope
T,desc,INT,Global
""";
Should.Throw<InvalidDataException>(() => ParseString(csv));
}
[Fact]
public void TryResolveDataType_returns_false_for_garbage()
{
RsLogixSymbolImport.TryResolveDataType("not a pccc address", out _).ShouldBeFalse();
RsLogixSymbolImport.TryResolveDataType("", out _).ShouldBeFalse();
}
[Fact]
public void TryResolveDataType_bit_index_overrides_file_letter()
{
// N7:0/3 — bit 3 of word 0 of integer file 7. The bit suffix forces Bit regardless
// of N's normal Int classification.
RsLogixSymbolImport.TryResolveDataType("N7:0/3", out var dt).ShouldBeTrue();
dt.ShouldBe(AbLegacyDataType.Bit);
}
/// <summary>
/// Minimal in-memory <see cref="ILogger{TCategoryName}"/> implementation so the unit
/// tests can assert on the warning side-channel without depending on a logging
/// framework. Captures the formatted message verbatim.
/// </summary>
private sealed class ListLogger<T> : ILogger<T>
{
public List<string> Messages { get; } = new();
public IDisposable? BeginScope<TState>(TState state) where TState : notnull => null;
public bool IsEnabled(LogLevel logLevel) => true;
public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception? exception,
Func<TState, Exception?, string> formatter)
{
Messages.Add(formatter(state, exception));
}
}
}


@@ -23,6 +23,16 @@
<ProjectReference Include="..\..\src\ZB.MOM.WW.OtOpcUa.Driver.AbLegacy\ZB.MOM.WW.OtOpcUa.Driver.AbLegacy.csproj"/>
</ItemGroup>
<ItemGroup>
<!-- ablegacy-11 / #254 — RSLogix CSV import fixture + golden snapshot. -->
<None Update="Fixtures\rslogix-canonical.csv">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</None>
<None Update="Fixtures\rslogix-canonical-expected.json">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</None>
</ItemGroup>
<ItemGroup>
<NuGetAuditSuppress Include="https://github.com/advisories/GHSA-37gx-xxp4-5rgx"/>
<NuGetAuditSuppress Include="https://github.com/advisories/GHSA-w3x6-4m5h-cxqf"/>