Reformat / cleanup

README.md
@@ -1,16 +1,21 @@
 # CBDD
 
-CBDD is an embedded, document-oriented database engine for .NET 10. It targets internal platform teams that need predictable ACID behavior, low-latency local persistence, and typed access patterns without running an external database server.
+CBDD is an embedded, document-oriented database engine for .NET 10. It targets internal platform teams that need
+predictable ACID behavior, low-latency local persistence, and typed access patterns without running an external database
+server.
 
 ## Purpose And Business Context
 
-CBDD provides a local data layer for services and tools that need transactional durability, deterministic startup, and high-throughput reads/writes. The primary business outcome is reducing operational overhead for workloads that do not require a networked database cluster.
+CBDD provides a local data layer for services and tools that need transactional durability, deterministic startup, and
+high-throughput reads/writes. The primary business outcome is reducing operational overhead for workloads that do not
+require a networked database cluster.
 
 ## Ownership And Support
 
 - Owning team: CBDD maintainers (repository owner: `@dohertj2`)
 - Primary support path: open a Gitea issue in this repository with labels `incident` or `bug`
-- Escalation path: follow [`docs/runbook.md`](docs/runbook.md) and page the release maintainer listed in the active release PR
+- Escalation path: follow [`docs/runbook.md`](docs/runbook.md) and page the release maintainer listed in the active
+release PR
 
 ## Architecture Overview
 
@@ -22,6 +27,7 @@ CBDD has four primary layers:
 4. Source-generated mapping (`src/CBDD.SourceGenerators`)
 
 Detailed architecture material:
 
 - [`docs/architecture.md`](docs/architecture.md)
 - [`RFC.md`](RFC.md)
 - [`C-BSON.md`](C-BSON.md)
@@ -36,34 +42,44 @@ Detailed architecture material:
 ## Setup And Local Run
 
 1. Clone the repository.
 
 ```bash
 git clone https://gitea.dohertylan.com/dohertj2/CBDD.git
 cd CBDD
 ```
 
 Expected outcome: local repository checkout with `CBDD.slnx` present.
 
 2. Restore dependencies.
 
 ```bash
 dotnet restore
 ```
 
 Expected outcome: restore completes without package errors.
 
 3. Build the solution.
 
 ```bash
 dotnet build CBDD.slnx -c Release
 ```
 
 Expected outcome: solution builds without compiler errors.
 
 4. Run tests.
 
 ```bash
 dotnet test CBDD.slnx -c Release
 ```
 
 Expected outcome: all tests pass.
 
 5. Run the full repository fitness check.
 
 ```bash
 bash scripts/fitness-check.sh
 ```
 
 Expected outcome: format, build, tests, coverage threshold, and package checks complete.
 
 ## Configuration And Secrets
@@ -135,9 +151,12 @@ if (!result.Executed)
 
 Common issues and remediation:
 
-- Build/test environment failures: [`docs/troubleshooting.md#build-and-test-failures`](docs/troubleshooting.md#build-and-test-failures)
-- Data-file recovery procedures: [`docs/troubleshooting.md#data-file-and-recovery-issues`](docs/troubleshooting.md#data-file-and-recovery-issues)
-- Query/index behavior verification: [`docs/troubleshooting.md#query-and-index-issues`](docs/troubleshooting.md#query-and-index-issues)
+- Build/test environment failures: [
+`docs/troubleshooting.md#build-and-test-failures`](docs/troubleshooting.md#build-and-test-failures)
+- Data-file recovery procedures: [
+`docs/troubleshooting.md#data-file-and-recovery-issues`](docs/troubleshooting.md#data-file-and-recovery-issues)
+- Query/index behavior verification: [
+`docs/troubleshooting.md#query-and-index-issues`](docs/troubleshooting.md#query-and-index-issues)
 
 ## Change Governance
 
@@ -150,4 +169,5 @@ Common issues and remediation:
 
 - Documentation home: [`docs/README.md`](docs/README.md)
 - Major feature inventory: [`docs/features/README.md`](docs/features/README.md)
-- Architecture decisions: [`docs/adr/0001-storage-engine-and-source-generation.md`](docs/adr/0001-storage-engine-and-source-generation.md)
+- Architecture decisions: [
+`docs/adr/0001-storage-engine-and-source-generation.md`](docs/adr/0001-storage-engine-and-source-generation.md)
@@ -1,4 +1,4 @@
-using System;
+using System.Collections.Concurrent;
 
 namespace ZB.MOM.WW.CBDD.Bson;
 
@@ -8,15 +8,15 @@ namespace ZB.MOM.WW.CBDD.Bson;
 /// </summary>
 public sealed class BsonDocument
 {
+    private readonly ConcurrentDictionary<ushort, string>? _keys;
     private readonly Memory<byte> _rawData;
-    private readonly System.Collections.Concurrent.ConcurrentDictionary<ushort, string>? _keys;
 
     /// <summary>
     /// Initializes a new instance of the <see cref="BsonDocument" /> class from raw BSON memory.
     /// </summary>
     /// <param name="rawBsonData">The raw BSON data.</param>
     /// <param name="keys">The optional key dictionary.</param>
-    public BsonDocument(Memory<byte> rawBsonData, System.Collections.Concurrent.ConcurrentDictionary<ushort, string>? keys = null)
+    public BsonDocument(Memory<byte> rawBsonData, ConcurrentDictionary<ushort, string>? keys = null)
     {
         _rawData = rawBsonData;
         _keys = keys;
@@ -27,7 +27,7 @@ public sealed class BsonDocument
     /// </summary>
     /// <param name="rawBsonData">The raw BSON data.</param>
     /// <param name="keys">The optional key dictionary.</param>
-    public BsonDocument(byte[] rawBsonData, System.Collections.Concurrent.ConcurrentDictionary<ushort, string>? keys = null)
+    public BsonDocument(byte[] rawBsonData, ConcurrentDictionary<ushort, string>? keys = null)
     {
         _rawData = rawBsonData;
         _keys = keys;
@@ -46,7 +46,11 @@ public sealed class BsonDocument
    /// <summary>
    /// Creates a reader for this document
    /// </summary>
-    public BsonSpanReader GetReader() => new BsonSpanReader(_rawData.Span, _keys ?? new System.Collections.Concurrent.ConcurrentDictionary<ushort, string>());
+    public BsonSpanReader GetReader()
+    {
+        return new BsonSpanReader(_rawData.Span,
+            _keys ?? new ConcurrentDictionary<ushort, string>());
+    }
 
    /// <summary>
    /// Tries to get a field value by name.
@@ -70,7 +74,7 @@ public sealed class BsonDocument
            if (type == BsonType.EndOfDocument)
                break;
 
-            var name = reader.ReadElementHeader();
+            string name = reader.ReadElementHeader();
 
            if (name == fieldName && type == BsonType.String)
            {
@@ -105,7 +109,7 @@ public sealed class BsonDocument
            if (type == BsonType.EndOfDocument)
                break;
 
-            var name = reader.ReadElementHeader();
+            string name = reader.ReadElementHeader();
 
            if (name == fieldName && type == BsonType.Int32)
            {
@@ -140,7 +144,7 @@ public sealed class BsonDocument
            if (type == BsonType.EndOfDocument)
                break;
 
-            var name = reader.ReadElementHeader();
+            string name = reader.ReadElementHeader();
 
            if (name == fieldName && type == BsonType.ObjectId)
            {
@@ -160,7 +164,8 @@ public sealed class BsonDocument
    /// <param name="keyMap">The key map used for field name encoding.</param>
    /// <param name="buildAction">The action that populates the builder.</param>
    /// <returns>The created BSON document.</returns>
-    public static BsonDocument Create(System.Collections.Concurrent.ConcurrentDictionary<string, ushort> keyMap, Action<BsonDocumentBuilder> buildAction)
+    public static BsonDocument Create(ConcurrentDictionary<string, ushort> keyMap,
+        Action<BsonDocumentBuilder> buildAction)
    {
        var builder = new BsonDocumentBuilder(keyMap);
        buildAction(builder);
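The `Create` change above shortens the key-map parameter type but keeps the same call pattern. A minimal usage sketch (hypothetical caller code; the builder's write methods are not shown in this diff, so `WriteString` below is an assumed name, not a confirmed CBDD API):

```csharp
using System;
using System.Collections.Concurrent;
using ZB.MOM.WW.CBDD.Bson;

// Key maps are shared so element headers can use compact ushort ids
// instead of full field names on disk.
var keyMap = new ConcurrentDictionary<string, ushort>();      // name -> id (write side)
var reverseKeys = new ConcurrentDictionary<ushort, string>(); // id -> name (read side)

// Build a document via the delegate-based factory shown in the diff.
var doc = BsonDocument.Create(keyMap, builder =>
{
    builder.WriteString("name", "example"); // assumed builder method
});

// Read it back through the span-based reader.
var reader = doc.GetReader();
int size = reader.ReadDocumentSize();
Console.WriteLine(size);
```

The delegate shape keeps the mutable builder scoped to the callback, so callers never hold a half-built document.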
@@ -173,15 +178,15 @@ public sealed class BsonDocument
 /// </summary>
 public sealed class BsonDocumentBuilder
 {
+    private readonly ConcurrentDictionary<string, ushort> _keyMap;
     private byte[] _buffer = new byte[1024]; // Start with 1KB
     private int _position;
-    private readonly System.Collections.Concurrent.ConcurrentDictionary<string, ushort> _keyMap;
 
    /// <summary>
    /// Initializes a new instance of the <see cref="BsonDocumentBuilder" /> class.
    /// </summary>
    /// <param name="keyMap">The key map used for field name encoding.</param>
-    public BsonDocumentBuilder(System.Collections.Concurrent.ConcurrentDictionary<string, ushort> keyMap)
+    public BsonDocumentBuilder(ConcurrentDictionary<string, ushort> keyMap)
    {
        _keyMap = keyMap;
        var writer = new BsonSpanWriter(_buffer, _keyMap);
@@ -270,7 +275,7 @@ public sealed class BsonDocumentBuilder
    public BsonDocument Build()
    {
        // Layout: [int32 size][field bytes...][0x00 terminator]
-        var totalSize = _position + 5;
+        int totalSize = _position + 5;
        var finalBuffer = new byte[totalSize];
 
        BitConverter.TryWriteBytes(finalBuffer.AsSpan(0, 4), totalSize);
@@ -1,4 +1,3 @@
-using System;
 using System.Buffers;
 using System.Buffers.Binary;
 using System.Text;
@@ -11,8 +10,7 @@ namespace ZB.MOM.WW.CBDD.Bson;
 /// </summary>
 public ref struct BsonBufferWriter
 {
-    private IBufferWriter<byte> _writer;
-    private int _totalBytesWritten;
+    private readonly IBufferWriter<byte> _writer;
 
    /// <summary>
    /// Initializes a new instance of the <see cref="BsonBufferWriter" /> struct.
@@ -21,20 +19,20 @@ public ref struct BsonBufferWriter
    public BsonBufferWriter(IBufferWriter<byte> writer)
    {
        _writer = writer;
-        _totalBytesWritten = 0;
+        Position = 0;
    }
 
    /// <summary>
    /// Gets the current write position in bytes.
    /// </summary>
-    public int Position => _totalBytesWritten;
+    public int Position { get; private set; }
 
    private void WriteBytes(ReadOnlySpan<byte> data)
    {
        var destination = _writer.GetSpan(data.Length);
        data.CopyTo(destination);
        _writer.Advance(data.Length);
-        _totalBytesWritten += data.Length;
+        Position += data.Length;
    }
 
    private void WriteByte(byte value)
@@ -42,7 +40,7 @@ public ref struct BsonBufferWriter
        var span = _writer.GetSpan(1);
        span[0] = value;
        _writer.Advance(1);
-        _totalBytesWritten++;
+        Position++;
    }
 
    /// <summary>
@@ -67,12 +65,15 @@ public ref struct BsonBufferWriter
    public int BeginDocument()
    {
        // Write placeholder for size (4 bytes)
-        var sizePosition = _totalBytesWritten;
+        int sizePosition = Position;
        var span = _writer.GetSpan(4);
        // Initialize with default value (will be patched later)
-        span[0] = 0; span[1] = 0; span[2] = 0; span[3] = 0;
+        span[0] = 0;
+        span[1] = 0;
+        span[2] = 0;
+        span[3] = 0;
        _writer.Advance(4);
-        _totalBytesWritten += 4;
+        Position += 4;
        return sizePosition;
    }
 
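`BeginDocument` writes a placeholder because a BSON document's first four bytes hold its own total length, which is unknown until the body has been written; writers therefore reserve the slot up front and patch it afterward. A standalone sketch of that length-prefix framing pattern (illustrative only, not CBDD's API):

```csharp
using System;
using System.Buffers.Binary;

// Reserve 4 bytes, write the body, then patch the little-endian total
// size (prefix + body + trailing 0x00 terminator) into the reserved slot.
byte[] buffer = new byte[64];
int pos = 0;

int sizeSlot = pos;   // remember where the size goes
pos += 4;             // reserve the placeholder

buffer[pos++] = 0x08; // example body bytes (e.g. a type tag and payload)
buffer[pos++] = 0x00;

buffer[pos++] = 0x00; // document terminator
BinaryPrimitives.WriteInt32LittleEndian(buffer.AsSpan(sizeSlot, 4), pos - sizeSlot);

Console.WriteLine(BinaryPrimitives.ReadInt32LittleEndian(buffer.AsSpan(0, 4))); // 7
```

With an `IBufferWriter<byte>` sink, as in the diff, the patch step needs the returned `sizePosition` because the writer itself only moves forward.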
@@ -129,7 +130,7 @@ public ref struct BsonBufferWriter
        var span = _writer.GetSpan(4);
        BinaryPrimitives.WriteInt32LittleEndian(span, value);
        _writer.Advance(4);
-        _totalBytesWritten += 4;
+        Position += 4;
    }
 
    private void WriteInt64Internal(long value)
@@ -137,7 +138,7 @@ public ref struct BsonBufferWriter
        var span = _writer.GetSpan(8);
        BinaryPrimitives.WriteInt64LittleEndian(span, value);
        _writer.Advance(8);
-        _totalBytesWritten += 8;
+        Position += 8;
    }
 
    /// <summary>
@@ -189,7 +190,7 @@ public ref struct BsonBufferWriter
    private void WriteStringValue(string value)
    {
        // String: length (int32) + UTF8 bytes + null terminator
-        var bytes = Encoding.UTF8.GetBytes(value);
+        byte[] bytes = Encoding.UTF8.GetBytes(value);
        WriteInt32Internal(bytes.Length + 1); // +1 for null terminator
        WriteBytes(bytes);
        WriteByte(0);
@@ -200,7 +201,7 @@ public ref struct BsonBufferWriter
        var span = _writer.GetSpan(8);
        BinaryPrimitives.WriteDoubleLittleEndian(span, value);
        _writer.Advance(8);
-        _totalBytesWritten += 8;
+        Position += 8;
    }
 
    /// <summary>
@@ -243,7 +244,7 @@ public ref struct BsonBufferWriter
 
    private void WriteCString(string value)
    {
-        var bytes = Encoding.UTF8.GetBytes(value);
+        byte[] bytes = Encoding.UTF8.GetBytes(value);
        WriteBytes(bytes);
        WriteByte(0); // Null terminator
    }
@@ -1,5 +1,5 @@
-using System;
 using System.Buffers.Binary;
+using System.Collections.Concurrent;
 using System.Text;
 
 namespace ZB.MOM.WW.CBDD.Bson;
@@ -11,30 +11,29 @@ namespace ZB.MOM.WW.CBDD.Bson;
 public ref struct BsonSpanReader
 {
     private ReadOnlySpan<byte> _buffer;
-    private int _position;
-    private readonly System.Collections.Concurrent.ConcurrentDictionary<ushort, string> _keys;
+    private readonly ConcurrentDictionary<ushort, string> _keys;
 
    /// <summary>
    /// Initializes a new instance of the <see cref="BsonSpanReader" /> struct.
    /// </summary>
    /// <param name="buffer">The BSON buffer to read.</param>
    /// <param name="keys">The reverse key dictionary used for compressed element headers.</param>
-    public BsonSpanReader(ReadOnlySpan<byte> buffer, System.Collections.Concurrent.ConcurrentDictionary<ushort, string> keys)
+    public BsonSpanReader(ReadOnlySpan<byte> buffer, ConcurrentDictionary<ushort, string> keys)
    {
        _buffer = buffer;
-        _position = 0;
+        Position = 0;
        _keys = keys;
    }
 
    /// <summary>
    /// Gets the current read position in the buffer.
    /// </summary>
-    public int Position => _position;
+    public int Position { get; private set; }
 
    /// <summary>
    /// Gets the number of unread bytes remaining in the buffer.
    /// </summary>
-    public int Remaining => _buffer.Length - _position;
+    public int Remaining => _buffer.Length - Position;
 
    /// <summary>
    /// Reads the document size (first 4 bytes of a BSON document)
@@ -44,8 +43,8 @@ public ref struct BsonSpanReader
        if (Remaining < 4)
            throw new InvalidOperationException("Not enough bytes to read document size");
 
-        var size = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(_position, 4));
-        _position += 4;
+        int size = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(Position, 4));
+        Position += 4;
        return size;
    }
 
@@ -57,8 +56,8 @@ public ref struct BsonSpanReader
        if (Remaining < 1)
            throw new InvalidOperationException("Not enough bytes to read BSON type");
 
-        var type = (BsonType)_buffer[_position];
-        _position++;
+        var type = (BsonType)_buffer[Position];
+        Position++;
        return type;
    }
 
@@ -67,15 +66,15 @@ public ref struct BsonSpanReader
    /// </summary>
    public string ReadCString()
    {
-        var start = _position;
-        while (_position < _buffer.Length && _buffer[_position] != 0)
-            _position++;
+        int start = Position;
+        while (Position < _buffer.Length && _buffer[Position] != 0)
+            Position++;
 
-        if (_position >= _buffer.Length)
+        if (Position >= _buffer.Length)
            throw new InvalidOperationException("Unterminated C-string");
 
-        var nameBytes = _buffer.Slice(start, _position - start);
-        _position++; // Skip null terminator
+        var nameBytes = _buffer.Slice(start, Position - start);
+        Position++; // Skip null terminator
 
        return Encoding.UTF8.GetString(nameBytes);
    }
@@ -86,15 +85,15 @@ public ref struct BsonSpanReader
    /// <param name="destination">The destination character span.</param>
    public int ReadCString(Span<char> destination)
    {
-        var start = _position;
-        while (_position < _buffer.Length && _buffer[_position] != 0)
-            _position++;
+        int start = Position;
+        while (Position < _buffer.Length && _buffer[Position] != 0)
+            Position++;
 
-        if (_position >= _buffer.Length)
+        if (Position >= _buffer.Length)
            throw new InvalidOperationException("Unterminated C-string");
 
-        var nameBytes = _buffer.Slice(start, _position - start);
-        _position++; // Skip null terminator
+        var nameBytes = _buffer.Slice(start, Position - start);
+        Position++; // Skip null terminator
 
        return Encoding.UTF8.GetChars(nameBytes, destination);
    }
@@ -104,14 +103,14 @@ public ref struct BsonSpanReader
    /// </summary>
    public string ReadString()
    {
-        var length = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(_position, 4));
-        _position += 4;
+        int length = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(Position, 4));
+        Position += 4;
 
        if (length < 1)
            throw new InvalidOperationException("Invalid string length");
 
-        var stringBytes = _buffer.Slice(_position, length - 1); // Exclude null terminator
-        _position += length;
+        var stringBytes = _buffer.Slice(Position, length - 1); // Exclude null terminator
+        Position += length;
 
        return Encoding.UTF8.GetString(stringBytes);
    }
@@ -124,8 +123,8 @@ public ref struct BsonSpanReader
        if (Remaining < 4)
            throw new InvalidOperationException("Not enough bytes to read Int32");
 
-        var value = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(_position, 4));
-        _position += 4;
+        int value = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(Position, 4));
+        Position += 4;
        return value;
    }
 
@@ -137,8 +136,8 @@ public ref struct BsonSpanReader
        if (Remaining < 8)
            throw new InvalidOperationException("Not enough bytes to read Int64");
 
-        var value = BinaryPrimitives.ReadInt64LittleEndian(_buffer.Slice(_position, 8));
-        _position += 8;
+        long value = BinaryPrimitives.ReadInt64LittleEndian(_buffer.Slice(Position, 8));
+        Position += 8;
        return value;
    }
 
@@ -150,8 +149,8 @@ public ref struct BsonSpanReader
        if (Remaining < 8)
            throw new InvalidOperationException("Not enough bytes to read Double");
 
-        var value = BinaryPrimitives.ReadDoubleLittleEndian(_buffer.Slice(_position, 8));
-        _position += 8;
+        double value = BinaryPrimitives.ReadDoubleLittleEndian(_buffer.Slice(Position, 8));
+        Position += 8;
        return value;
    }
 
@@ -162,20 +161,20 @@ public ref struct BsonSpanReader
    public (double, double) ReadCoordinates()
    {
        // Skip array size (4 bytes)
-        _position += 4;
+        Position += 4;
 
        // Skip element 0 header: Type(1) + Name("0\0") (3 bytes)
-        _position += 3;
-        var x = BinaryPrimitives.ReadDoubleLittleEndian(_buffer.Slice(_position, 8));
-        _position += 8;
+        Position += 3;
+        double x = BinaryPrimitives.ReadDoubleLittleEndian(_buffer.Slice(Position, 8));
+        Position += 8;
 
        // Skip element 1 header: Type(1) + Name("1\0") (3 bytes)
-        _position += 3;
-        var y = BinaryPrimitives.ReadDoubleLittleEndian(_buffer.Slice(_position, 8));
-        _position += 8;
+        Position += 3;
+        double y = BinaryPrimitives.ReadDoubleLittleEndian(_buffer.Slice(Position, 8));
+        Position += 8;
 
        // Skip end of array marker (1 byte)
-        _position++;
+        Position++;
 
        return (x, y);
    }
@@ -189,11 +188,11 @@ public ref struct BsonSpanReader
            throw new InvalidOperationException("Not enough bytes to read Decimal128");
 
        var bits = new int[4];
-        bits[0] = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(_position, 4));
-        bits[1] = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(_position + 4, 4));
-        bits[2] = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(_position + 8, 4));
-        bits[3] = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(_position + 12, 4));
-        _position += 16;
+        bits[0] = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(Position, 4));
+        bits[1] = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(Position + 4, 4));
+        bits[2] = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(Position + 8, 4));
+        bits[3] = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(Position + 12, 4));
+        Position += 16;
 
        return new decimal(bits);
    }
@@ -206,8 +205,8 @@ public ref struct BsonSpanReader
        if (Remaining < 1)
            throw new InvalidOperationException("Not enough bytes to read Boolean");
 
-        var value = _buffer[_position] != 0;
-        _position++;
+        bool value = _buffer[Position] != 0;
+        Position++;
        return value;
    }
 
@@ -216,7 +215,7 @@ public ref struct BsonSpanReader
    /// </summary>
    public DateTime ReadDateTime()
    {
-        var milliseconds = ReadInt64();
+        long milliseconds = ReadInt64();
        return DateTimeOffset.FromUnixTimeMilliseconds(milliseconds).UtcDateTime;
    }
 
@@ -225,7 +224,7 @@ public ref struct BsonSpanReader
    /// </summary>
    public DateTimeOffset ReadDateTimeOffset()
    {
-        var milliseconds = ReadInt64();
+        long milliseconds = ReadInt64();
        return DateTimeOffset.FromUnixTimeMilliseconds(milliseconds);
    }
 
@@ -234,7 +233,7 @@ public ref struct BsonSpanReader
    /// </summary>
    public TimeSpan ReadTimeSpan()
    {
-        var ticks = ReadInt64();
+        long ticks = ReadInt64();
        return TimeSpan.FromTicks(ticks);
    }
 
@@ -243,7 +242,7 @@ public ref struct BsonSpanReader
    /// </summary>
    public DateOnly ReadDateOnly()
    {
-        var dayNumber = ReadInt32();
+        int dayNumber = ReadInt32();
        return DateOnly.FromDayNumber(dayNumber);
    }
 
@@ -252,7 +251,7 @@ public ref struct BsonSpanReader
    /// </summary>
    public TimeOnly ReadTimeOnly()
    {
-        var ticks = ReadInt64();
+        long ticks = ReadInt64();
        return new TimeOnly(ticks);
    }
 
@@ -272,8 +271,8 @@ public ref struct BsonSpanReader
|
|||||||
if (Remaining < 12)
|
if (Remaining < 12)
|
||||||
throw new InvalidOperationException("Not enough bytes to read ObjectId");
|
throw new InvalidOperationException("Not enough bytes to read ObjectId");
|
||||||
|
|
||||||
var oidBytes = _buffer.Slice(_position, 12);
|
var oidBytes = _buffer.Slice(Position, 12);
|
||||||
_position += 12;
|
Position += 12;
|
||||||
return new ObjectId(oidBytes);
|
return new ObjectId(oidBytes);
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -283,19 +282,19 @@ public ref struct BsonSpanReader
|
|||||||
/// <param name="subtype">When this method returns, contains the BSON binary subtype.</param>
|
/// <param name="subtype">When this method returns, contains the BSON binary subtype.</param>
|
||||||
public ReadOnlySpan<byte> ReadBinary(out byte subtype)
|
public ReadOnlySpan<byte> ReadBinary(out byte subtype)
|
||||||
{
|
{
|
||||||
var length = ReadInt32();
|
int length = ReadInt32();
|
||||||
|
|
||||||
if (Remaining < 1)
|
if (Remaining < 1)
|
||||||
throw new InvalidOperationException("Not enough bytes to read binary subtype");
|
throw new InvalidOperationException("Not enough bytes to read binary subtype");
|
||||||
|
|
||||||
subtype = _buffer[_position];
|
subtype = _buffer[Position];
|
||||||
_position++;
|
Position++;
|
||||||
|
|
||||||
if (Remaining < length)
|
if (Remaining < length)
|
||||||
throw new InvalidOperationException("Not enough bytes to read binary data");
|
throw new InvalidOperationException("Not enough bytes to read binary data");
|
||||||
|
|
||||||
var data = _buffer.Slice(_position, length);
|
var data = _buffer.Slice(Position, length);
|
||||||
_position += length;
|
Position += length;
|
||||||
return data;
|
return data;
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -308,37 +307,37 @@ public ref struct BsonSpanReader
|
|||||||
switch (type)
|
switch (type)
|
||||||
{
|
{
|
||||||
case BsonType.Double:
|
case BsonType.Double:
|
||||||
_position += 8;
|
Position += 8;
|
||||||
break;
|
break;
|
||||||
case BsonType.String:
|
case BsonType.String:
|
||||||
var stringLength = ReadInt32();
|
int stringLength = ReadInt32();
|
||||||
_position += stringLength;
|
Position += stringLength;
|
||||||
break;
|
break;
|
||||||
case BsonType.Document:
|
case BsonType.Document:
|
||||||
case BsonType.Array:
|
case BsonType.Array:
|
||||||
var docLength = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(_position, 4));
|
int docLength = BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(Position, 4));
|
||||||
_position += docLength;
|
Position += docLength;
|
||||||
break;
|
break;
|
||||||
case BsonType.Binary:
|
case BsonType.Binary:
|
||||||
var binaryLength = ReadInt32();
|
int binaryLength = ReadInt32();
|
||||||
_position += 1 + binaryLength; // subtype + data
|
Position += 1 + binaryLength; // subtype + data
|
||||||
break;
|
break;
|
||||||
case BsonType.ObjectId:
|
case BsonType.ObjectId:
|
||||||
_position += 12;
|
Position += 12;
|
||||||
break;
|
break;
|
||||||
case BsonType.Boolean:
|
case BsonType.Boolean:
|
||||||
_position += 1;
|
Position += 1;
|
||||||
break;
|
break;
|
||||||
case BsonType.DateTime:
|
case BsonType.DateTime:
|
||||||
case BsonType.Int64:
|
case BsonType.Int64:
|
||||||
case BsonType.Timestamp:
|
case BsonType.Timestamp:
|
||||||
_position += 8;
|
Position += 8;
|
||||||
break;
|
break;
|
||||||
case BsonType.Decimal128:
|
case BsonType.Decimal128:
|
||||||
_position += 16;
|
Position += 16;
|
||||||
break;
|
break;
|
||||||
case BsonType.Int32:
|
case BsonType.Int32:
|
||||||
_position += 4;
|
Position += 4;
|
||||||
break;
|
break;
|
||||||
case BsonType.Null:
|
case BsonType.Null:
|
||||||
// No data
|
// No data
|
||||||
@@ -355,8 +354,8 @@ public ref struct BsonSpanReader
|
|||||||
{
|
{
|
||||||
if (Remaining < 1)
|
if (Remaining < 1)
|
||||||
throw new InvalidOperationException("Not enough bytes to read byte");
|
throw new InvalidOperationException("Not enough bytes to read byte");
|
||||||
var value = _buffer[_position];
|
byte value = _buffer[Position];
|
||||||
_position++;
|
Position++;
|
||||||
return value;
|
return value;
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -367,7 +366,7 @@ public ref struct BsonSpanReader
|
|||||||
{
|
{
|
||||||
if (Remaining < 4)
|
if (Remaining < 4)
|
||||||
throw new InvalidOperationException("Not enough bytes to peek Int32");
|
throw new InvalidOperationException("Not enough bytes to peek Int32");
|
||||||
return BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(_position, 4));
|
return BinaryPrimitives.ReadInt32LittleEndian(_buffer.Slice(Position, 4));
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -378,13 +377,11 @@ public ref struct BsonSpanReader
|
|||||||
if (Remaining < 2)
|
if (Remaining < 2)
|
||||||
throw new InvalidOperationException("Not enough bytes to read BSON element key ID");
|
throw new InvalidOperationException("Not enough bytes to read BSON element key ID");
|
||||||
|
|
||||||
var id = BinaryPrimitives.ReadUInt16LittleEndian(_buffer.Slice(_position, 2));
|
ushort id = BinaryPrimitives.ReadUInt16LittleEndian(_buffer.Slice(Position, 2));
|
||||||
_position += 2;
|
Position += 2;
|
||||||
|
|
||||||
if (!_keys.TryGetValue(id, out var key))
|
if (!_keys.TryGetValue(id, out string? key))
|
||||||
{
|
|
||||||
throw new InvalidOperationException($"BSON Key ID {id} not found in reverse key dictionary.");
|
throw new InvalidOperationException($"BSON Key ID {id} not found in reverse key dictionary.");
|
||||||
}
|
|
||||||
|
|
||||||
return key;
|
return key;
|
||||||
}
|
}
|
||||||
@@ -392,5 +389,8 @@ public ref struct BsonSpanReader
|
|||||||
/// <summary>
|
/// <summary>
|
||||||
/// Returns a span containing all unread bytes.
|
/// Returns a span containing all unread bytes.
|
||||||
/// </summary>
|
/// </summary>
|
||||||
public ReadOnlySpan<byte> RemainingBytes() => _buffer[_position..];
|
public ReadOnlySpan<byte> RemainingBytes()
|
||||||
|
{
|
||||||
|
return _buffer[Position..];
|
||||||
|
}
|
||||||
}
|
}
|
||||||
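As the reader hunks above show, this format replaces BSON's name cstrings in element headers with a 2-byte little-endian key ID that is resolved through a reverse key dictionary (`_keys`). A minimal sketch of decoding that header layout — the key table and values here are illustrative assumptions, not taken from the repository:

```python
import struct

def decode_element_header(buf: bytes, pos: int, keys: dict[int, str]) -> tuple[int, str, int]:
    """Return (bson_type, key_name, new_position) for one element header.

    Layout assumed from the diff: 1 type-tag byte, then a 2-byte
    little-endian key ID looked up in a reverse dictionary (id -> name).
    """
    bson_type = buf[pos]
    (key_id,) = struct.unpack_from("<H", buf, pos + 1)
    if key_id not in keys:
        raise KeyError(f"BSON Key ID {key_id} not found in reverse key dictionary.")
    return bson_type, keys[key_id], pos + 3

keys = {7: "age"}                              # hypothetical reverse key table
header = bytes([0x10]) + struct.pack("<H", 7)  # 0x10 = Int32 tag in standard BSON
print(decode_element_header(header, 0, keys))  # (16, 'age', 3)
```

Resolving a 2-byte ID instead of scanning a NUL-terminated name is what lets `ReadElementHeader` advance by a fixed 2 bytes.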
@@ -1,5 +1,5 @@
-using System;
 using System.Buffers.Binary;
+using System.Collections.Concurrent;
 using System.Text;
 
 namespace ZB.MOM.WW.CBDD.Bson;
@@ -11,39 +11,38 @@ namespace ZB.MOM.WW.CBDD.Bson;
 public ref struct BsonSpanWriter
 {
     private Span<byte> _buffer;
-    private int _position;
-    private readonly System.Collections.Concurrent.ConcurrentDictionary<string, ushort> _keyMap;
+    private readonly ConcurrentDictionary<string, ushort> _keyMap;
 
     /// <summary>
     /// Initializes a new instance of the <see cref="BsonSpanWriter" /> struct.
     /// </summary>
     /// <param name="buffer">The destination buffer to write BSON bytes into.</param>
     /// <param name="keyMap">The cached key-name to key-id mapping.</param>
-    public BsonSpanWriter(Span<byte> buffer, System.Collections.Concurrent.ConcurrentDictionary<string, ushort> keyMap)
+    public BsonSpanWriter(Span<byte> buffer, ConcurrentDictionary<string, ushort> keyMap)
     {
         _buffer = buffer;
         _keyMap = keyMap;
-        _position = 0;
+        Position = 0;
     }
 
     /// <summary>
     /// Gets the current write position in the buffer.
     /// </summary>
-    public int Position => _position;
+    public int Position { get; private set; }
 
     /// <summary>
     /// Gets the number of bytes remaining in the buffer.
     /// </summary>
-    public int Remaining => _buffer.Length - _position;
+    public int Remaining => _buffer.Length - Position;
 
     /// <summary>
     /// Writes document size placeholder and returns the position to patch later
     /// </summary>
     public int WriteDocumentSizePlaceholder()
     {
-        var sizePosition = _position;
-        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(_position, 4), 0);
-        _position += 4;
+        int sizePosition = Position;
+        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(Position, 4), 0);
+        Position += 4;
         return sizePosition;
     }
 
@@ -53,7 +52,7 @@ public ref struct BsonSpanWriter
     /// <param name="sizePosition">The position where the size placeholder was written.</param>
     public void PatchDocumentSize(int sizePosition)
     {
-        var size = _position - sizePosition;
+        int size = Position - sizePosition;
         BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(sizePosition, 4), size);
     }
 
@@ -64,16 +63,15 @@ public ref struct BsonSpanWriter
     /// <param name="name">The field name.</param>
     public void WriteElementHeader(BsonType type, string name)
     {
-        _buffer[_position] = (byte)type;
-        _position++;
+        _buffer[Position] = (byte)type;
+        Position++;
 
-        if (!_keyMap.TryGetValue(name, out var id))
-        {
-            throw new InvalidOperationException($"BSON Key '{name}' not found in dictionary cache. Ensure all keys are registered before serialization.");
-        }
+        if (!_keyMap.TryGetValue(name, out ushort id))
+            throw new InvalidOperationException(
+                $"BSON Key '{name}' not found in dictionary cache. Ensure all keys are registered before serialization.");
 
-        BinaryPrimitives.WriteUInt16LittleEndian(_buffer.Slice(_position, 2), id);
-        _position += 2;
+        BinaryPrimitives.WriteUInt16LittleEndian(_buffer.Slice(Position, 2), id);
+        Position += 2;
     }
 
     /// <summary>
@@ -81,10 +79,10 @@ public ref struct BsonSpanWriter
     /// </summary>
     private void WriteCString(string value)
     {
-        var bytesWritten = Encoding.UTF8.GetBytes(value, _buffer[_position..]);
-        _position += bytesWritten;
-        _buffer[_position] = 0; // Null terminator
-        _position++;
+        int bytesWritten = Encoding.UTF8.GetBytes(value, _buffer[Position..]);
+        Position += bytesWritten;
+        _buffer[Position] = 0; // Null terminator
+        Position++;
     }
 
     /// <summary>
@@ -92,8 +90,8 @@ public ref struct BsonSpanWriter
     /// </summary>
     public void WriteEndOfDocument()
     {
-        _buffer[_position] = 0;
-        _position++;
+        _buffer[Position] = 0;
+        Position++;
     }
 
     /// <summary>
@@ -105,17 +103,17 @@ public ref struct BsonSpanWriter
     {
         WriteElementHeader(BsonType.String, name);
 
-        var valueBytes = Encoding.UTF8.GetByteCount(value);
-        var stringLength = valueBytes + 1; // Include null terminator
+        int valueBytes = Encoding.UTF8.GetByteCount(value);
+        int stringLength = valueBytes + 1; // Include null terminator
 
-        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(_position, 4), stringLength);
-        _position += 4;
+        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(Position, 4), stringLength);
+        Position += 4;
 
-        Encoding.UTF8.GetBytes(value, _buffer[_position..]);
-        _position += valueBytes;
+        Encoding.UTF8.GetBytes(value, _buffer[Position..]);
+        Position += valueBytes;
 
-        _buffer[_position] = 0; // Null terminator
-        _position++;
+        _buffer[Position] = 0; // Null terminator
+        Position++;
     }
 
     /// <summary>
@@ -126,8 +124,8 @@ public ref struct BsonSpanWriter
     public void WriteInt32(string name, int value)
     {
         WriteElementHeader(BsonType.Int32, name);
-        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(_position, 4), value);
-        _position += 4;
+        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(Position, 4), value);
+        Position += 4;
     }
 
     /// <summary>
@@ -138,8 +136,8 @@ public ref struct BsonSpanWriter
     public void WriteInt64(string name, long value)
     {
         WriteElementHeader(BsonType.Int64, name);
-        BinaryPrimitives.WriteInt64LittleEndian(_buffer.Slice(_position, 8), value);
-        _position += 8;
+        BinaryPrimitives.WriteInt64LittleEndian(_buffer.Slice(Position, 8), value);
+        Position += 8;
     }
 
     /// <summary>
@@ -150,8 +148,8 @@ public ref struct BsonSpanWriter
     public void WriteDouble(string name, double value)
     {
         WriteElementHeader(BsonType.Double, name);
-        BinaryPrimitives.WriteDoubleLittleEndian(_buffer.Slice(_position, 8), value);
-        _position += 8;
+        BinaryPrimitives.WriteDoubleLittleEndian(_buffer.Slice(Position, 8), value);
+        Position += 8;
     }
 
     /// <summary>
@@ -164,27 +162,27 @@ public ref struct BsonSpanWriter
     {
         WriteElementHeader(BsonType.Array, name);
 
-        var startPos = _position;
-        _position += 4; // Placeholder for array size
+        int startPos = Position;
+        Position += 4; // Placeholder for array size
 
         // Element 0: X
-        _buffer[_position++] = (byte)BsonType.Double;
-        _buffer[_position++] = 0x30; // '0'
-        _buffer[_position++] = 0x00; // Null
-        BinaryPrimitives.WriteDoubleLittleEndian(_buffer.Slice(_position, 8), coordinates.Item1);
-        _position += 8;
+        _buffer[Position++] = (byte)BsonType.Double;
+        _buffer[Position++] = 0x30; // '0'
+        _buffer[Position++] = 0x00; // Null
+        BinaryPrimitives.WriteDoubleLittleEndian(_buffer.Slice(Position, 8), coordinates.Item1);
+        Position += 8;
 
         // Element 1: Y
-        _buffer[_position++] = (byte)BsonType.Double;
-        _buffer[_position++] = 0x31; // '1'
-        _buffer[_position++] = 0x00; // Null
-        BinaryPrimitives.WriteDoubleLittleEndian(_buffer.Slice(_position, 8), coordinates.Item2);
-        _position += 8;
+        _buffer[Position++] = (byte)BsonType.Double;
+        _buffer[Position++] = 0x31; // '1'
+        _buffer[Position++] = 0x00; // Null
+        BinaryPrimitives.WriteDoubleLittleEndian(_buffer.Slice(Position, 8), coordinates.Item2);
+        Position += 8;
 
-        _buffer[_position++] = 0x00; // End of array marker
+        _buffer[Position++] = 0x00; // End of array marker
 
         // Patch array size
-        var size = _position - startPos;
+        int size = Position - startPos;
         BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(startPos, 4), size);
     }
 
@@ -198,12 +196,12 @@ public ref struct BsonSpanWriter
         WriteElementHeader(BsonType.Decimal128, name);
         // Note: usage of C# decimal bits for round-trip fidelity within ZB.MOM.WW.CBDD.
         // This makes it compatible with CBDD Reader but strictly speaking not standard IEEE 754-2008 Decimal128.
-        var bits = decimal.GetBits(value);
-        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(_position, 4), bits[0]);
-        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(_position + 4, 4), bits[1]);
-        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(_position + 8, 4), bits[2]);
-        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(_position + 12, 4), bits[3]);
-        _position += 16;
+        int[] bits = decimal.GetBits(value);
+        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(Position, 4), bits[0]);
+        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(Position + 4, 4), bits[1]);
+        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(Position + 8, 4), bits[2]);
+        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(Position + 12, 4), bits[3]);
+        Position += 16;
     }
 
     /// <summary>
@@ -214,8 +212,8 @@ public ref struct BsonSpanWriter
     public void WriteBoolean(string name, bool value)
     {
         WriteElementHeader(BsonType.Boolean, name);
-        _buffer[_position] = (byte)(value ? 1 : 0);
-        _position++;
+        _buffer[Position] = (byte)(value ? 1 : 0);
+        Position++;
     }
 
     /// <summary>
@@ -226,9 +224,9 @@ public ref struct BsonSpanWriter
     public void WriteDateTime(string name, DateTime value)
     {
         WriteElementHeader(BsonType.DateTime, name);
-        var milliseconds = new DateTimeOffset(value.ToUniversalTime()).ToUnixTimeMilliseconds();
-        BinaryPrimitives.WriteInt64LittleEndian(_buffer.Slice(_position, 8), milliseconds);
-        _position += 8;
+        long milliseconds = new DateTimeOffset(value.ToUniversalTime()).ToUnixTimeMilliseconds();
+        BinaryPrimitives.WriteInt64LittleEndian(_buffer.Slice(Position, 8), milliseconds);
+        Position += 8;
     }
 
     /// <summary>
@@ -239,9 +237,9 @@ public ref struct BsonSpanWriter
     public void WriteDateTimeOffset(string name, DateTimeOffset value)
     {
         WriteElementHeader(BsonType.DateTime, name);
-        var milliseconds = value.ToUnixTimeMilliseconds();
-        BinaryPrimitives.WriteInt64LittleEndian(_buffer.Slice(_position, 8), milliseconds);
-        _position += 8;
+        long milliseconds = value.ToUnixTimeMilliseconds();
+        BinaryPrimitives.WriteInt64LittleEndian(_buffer.Slice(Position, 8), milliseconds);
+        Position += 8;
     }
 
     /// <summary>
@@ -252,8 +250,8 @@ public ref struct BsonSpanWriter
     public void WriteTimeSpan(string name, TimeSpan value)
     {
         WriteElementHeader(BsonType.Int64, name);
-        BinaryPrimitives.WriteInt64LittleEndian(_buffer.Slice(_position, 8), value.Ticks);
-        _position += 8;
+        BinaryPrimitives.WriteInt64LittleEndian(_buffer.Slice(Position, 8), value.Ticks);
+        Position += 8;
     }
 
     /// <summary>
@@ -264,8 +262,8 @@ public ref struct BsonSpanWriter
     public void WriteDateOnly(string name, DateOnly value)
     {
         WriteElementHeader(BsonType.Int32, name);
-        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(_position, 4), value.DayNumber);
-        _position += 4;
+        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(Position, 4), value.DayNumber);
+        Position += 4;
     }
 
     /// <summary>
@@ -276,8 +274,8 @@ public ref struct BsonSpanWriter
     public void WriteTimeOnly(string name, TimeOnly value)
     {
         WriteElementHeader(BsonType.Int64, name);
-        BinaryPrimitives.WriteInt64LittleEndian(_buffer.Slice(_position, 8), value.Ticks);
-        _position += 8;
+        BinaryPrimitives.WriteInt64LittleEndian(_buffer.Slice(Position, 8), value.Ticks);
+        Position += 8;
     }
 
     /// <summary>
@@ -298,8 +296,8 @@ public ref struct BsonSpanWriter
     public void WriteObjectId(string name, ObjectId value)
     {
         WriteElementHeader(BsonType.ObjectId, name);
-        value.WriteTo(_buffer.Slice(_position, 12));
-        _position += 12;
+        value.WriteTo(_buffer.Slice(Position, 12));
+        Position += 12;
     }
 
     /// <summary>
@@ -322,14 +320,14 @@ public ref struct BsonSpanWriter
     {
         WriteElementHeader(BsonType.Binary, name);
 
-        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(_position, 4), data.Length);
-        _position += 4;
+        BinaryPrimitives.WriteInt32LittleEndian(_buffer.Slice(Position, 4), data.Length);
+        Position += 4;
 
-        _buffer[_position] = subtype;
-        _position++;
+        _buffer[Position] = subtype;
+        Position++;
 
-        data.CopyTo(_buffer[_position..]);
-        _position += data.Length;
+        data.CopyTo(_buffer[Position..]);
+        Position += data.Length;
     }
 
     /// <summary>
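`WriteString` above emits a BSON string value as an int32 byte length (which includes the trailing NUL), the UTF-8 bytes, and a single `0x00` terminator. A sketch of just that value layout, independent of the element header:

```python
import struct

def encode_bson_string_value(value: str) -> bytes:
    # BSON string value layout, mirroring WriteString: little-endian int32
    # length (UTF-8 byte count + 1 for the NUL), UTF-8 bytes, 0x00 terminator.
    data = value.encode("utf-8")
    return struct.pack("<i", len(data) + 1) + data + b"\x00"

print(encode_bson_string_value("abc"))  # b'\x04\x00\x00\x00abc\x00'
```

Note the length prefix counts bytes, not characters, which is why `WriteString` calls `Encoding.UTF8.GetByteCount` rather than using `value.Length`.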
@@ -1,7 +1,5 @@
-using System;
+namespace ZB.MOM.WW.CBDD.Bson;
 
-namespace ZB.MOM.WW.CBDD.Bson
-{
 [AttributeUsage(AttributeTargets.Property)]
 public class BsonIdAttribute : Attribute
 {
@@ -11,4 +9,3 @@ namespace ZB.MOM.WW.CBDD.Bson
 public class BsonIgnoreAttribute : Attribute
 {
 }
-}
@@ -1,6 +1,6 @@
 namespace ZB.MOM.WW.CBDD.Bson.Schema;
 
-public partial class BsonField
+public class BsonField
 {
     /// <summary>
     /// Gets the field name.
@@ -33,7 +33,7 @@ public partial class BsonField
     /// <param name="writer">The BSON writer.</param>
     public void ToBson(ref BsonSpanWriter writer)
     {
-        var size = writer.BeginDocument();
+        int size = writer.BeginDocument();
         writer.WriteString("n", Name);
         writer.WriteInt32("t", (int)Type);
         writer.WriteBoolean("b", IsNullable);
@@ -44,10 +44,7 @@ public partial class BsonField
             NestedSchema.ToBson(ref writer);
         }
 
-        if (ArrayItemType != null)
-        {
-            writer.WriteInt32("a", (int)ArrayItemType.Value);
-        }
+        if (ArrayItemType != null) writer.WriteInt32("a", (int)ArrayItemType.Value);
 
         writer.EndDocument(size);
     }
@@ -61,9 +58,9 @@ public partial class BsonField
     {
         reader.ReadInt32(); // Read doc size
 
-        string name = "";
-        BsonType type = BsonType.Null;
-        bool isNullable = false;
+        var name = "";
+        var type = BsonType.Null;
+        var isNullable = false;
         BsonSchema? nestedSchema = null;
         BsonType? arrayItemType = null;
 
@@ -72,7 +69,7 @@ public partial class BsonField
            var btype = reader.ReadBsonType();
            if (btype == BsonType.EndOfDocument) break;
 
-            var key = reader.ReadElementHeader();
+            string key = reader.ReadElementHeader();
            switch (key)
            {
                case "n": name = reader.ReadString(); break;
@@ -121,8 +118,14 @@ public partial class BsonField
     }
 
     /// <inheritdoc />
-    public override bool Equals(object? obj) => Equals(obj as BsonField);
+    public override bool Equals(object? obj)
+    {
+        return Equals(obj as BsonField);
+    }
 
     /// <inheritdoc />
-    public override int GetHashCode() => (int)GetHash();
+    public override int GetHashCode()
+    {
+        return (int)GetHash();
+    }
 }
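`BsonField.ToBson` brackets its payload with `BeginDocument`/`EndDocument`, which rest on the placeholder-and-patch pattern shown in the writer hunks: reserve 4 bytes for the size up front, then overwrite them once the end position is known. A sketch of that pattern over a growable buffer (function names and the fake payload are illustrative, not the repository's API):

```python
import struct

def begin_document(buf: bytearray) -> int:
    # Reserve a 4-byte little-endian size placeholder and remember where
    # it sits, mirroring WriteDocumentSizePlaceholder.
    pos = len(buf)
    buf += b"\x00\x00\x00\x00"
    return pos

def end_document(buf: bytearray, size_pos: int) -> None:
    # Append the 0x00 document terminator, then patch the placeholder with
    # the total length measured from the placeholder (PatchDocumentSize).
    buf.append(0)
    struct.pack_into("<i", buf, size_pos, len(buf) - size_pos)

buf = bytearray()
pos = begin_document(buf)
buf += b"\x10\x07\x00"  # fake 3-byte element payload
end_document(buf, pos)
print(struct.unpack_from("<i", buf, 0)[0], len(buf))  # 8 8
```

The patched length covers the size field itself and the terminator, matching BSON's convention for document lengths.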
@@ -1,6 +1,6 @@
 namespace ZB.MOM.WW.CBDD.Bson.Schema;
 
-public partial class BsonSchema
+public class BsonSchema
 {
     /// <summary>
     /// Gets or sets the schema title.
@@ -23,16 +23,17 @@ public partial class BsonSchema
     /// <param name="writer">The BSON writer to write into.</param>
     public void ToBson(ref BsonSpanWriter writer)
     {
-        var size = writer.BeginDocument();
+        int size = writer.BeginDocument();
         if (Title != null) writer.WriteString("t", Title);
         if (Version != null) writer.WriteInt32("_v", Version.Value);
 
-        var fieldsSize = writer.BeginArray("f");
-        for (int i = 0; i < Fields.Count; i++)
+        int fieldsSize = writer.BeginArray("f");
+        for (var i = 0; i < Fields.Count; i++)
         {
             writer.WriteElementHeader(BsonType.Document, i.ToString());
             Fields[i].ToBson(ref writer);
         }
+
         writer.EndArray(fieldsSize);
 
         writer.EndDocument(size);
@@ -54,7 +55,7 @@ public partial class BsonSchema
            var btype = reader.ReadBsonType();
            if (btype == BsonType.EndOfDocument) break;
 
-            var key = reader.ReadElementHeader();
+            string key = reader.ReadElementHeader();
            switch (key)
            {
                case "t": schema.Title = reader.ReadString(); break;
@@ -68,6 +69,7 @@ public partial class BsonSchema
                        reader.ReadElementHeader(); // index
                        schema.Fields.Add(BsonField.FromBson(ref reader));
                    }
+
                    break;
                default: reader.SkipValue(btype); break;
            }
@@ -84,10 +86,7 @@ public partial class BsonSchema
     {
         var hash = new HashCode();
         hash.Add(Title);
-        foreach (var field in Fields)
-        {
-            hash.Add(field.GetHash());
-        }
+        foreach (var field in Fields) hash.Add(field.GetHash());
         return hash.ToHashCode();
     }
 
@@ -103,10 +102,16 @@ public partial class BsonSchema
     }
 
     /// <inheritdoc />
-    public override bool Equals(object? obj) => Equals(obj as BsonSchema);
+    public override bool Equals(object? obj)
+    {
+        return Equals(obj as BsonSchema);
+    }
 
     /// <inheritdoc />
-    public override int GetHashCode() => (int)GetHash();
+    public override int GetHashCode()
+    {
+        return (int)GetHash();
+    }
 
     /// <summary>
     /// Enumerates all field keys in this schema, including nested schema keys.
@@ -118,12 +123,8 @@ public partial class BsonSchema
         {
             yield return field.Name;
             if (field.NestedSchema != null)
-            {
-                foreach (var nestedKey in field.NestedSchema.GetAllKeys())
-                {
+                foreach (string nestedKey in field.NestedSchema.GetAllKeys())
                     yield return nestedKey;
                 }
             }
         }
-}
-}
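`BsonSchema.GetHash` folds the title and each field's hash into one value, in field order, so two schemas match only when their fields agree. A deterministic stand-in for that idea — the real code uses `System.HashCode`, whose values are randomized per process, so a cryptographic digest is used here purely to make the sketch stable; the `(name, type)` tuples are an illustrative simplification of `BsonField`:

```python
import hashlib

def schema_fingerprint(title: str, fields: list[tuple[str, int]]) -> str:
    # Combine the schema title and each field's (name, type) pair, in
    # order, into one stable digest. Order-sensitive by construction.
    h = hashlib.sha256()
    h.update(title.encode("utf-8"))
    for name, btype in fields:
        h.update(name.encode("utf-8"))
        h.update(btype.to_bytes(4, "little"))
    return h.hexdigest()

a = schema_fingerprint("User", [("name", 2), ("age", 16)])
b = schema_fingerprint("User", [("age", 16), ("name", 2)])
print(a != b)  # True: field order changes the fingerprint
```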
```diff
@@ -1,4 +1,3 @@
-using System;
 using System.Runtime.InteropServices;
 
 namespace ZB.MOM.WW.CBDD.Bson;
@@ -16,12 +15,12 @@ public readonly struct ObjectId : IEquatable<ObjectId>
     /// <summary>
     /// Empty ObjectId (all zeros)
     /// </summary>
-    public static readonly ObjectId Empty = new ObjectId(0, 0);
+    public static readonly ObjectId Empty = new(0, 0);
 
     /// <summary>
     /// Maximum ObjectId (all 0xFF bytes) - useful for range queries
     /// </summary>
-    public static readonly ObjectId MaxValue = new ObjectId(int.MaxValue, long.MaxValue);
+    public static readonly ObjectId MaxValue = new(int.MaxValue, long.MaxValue);
 
     /// <summary>
     /// Initializes a new instance of the <see cref="ObjectId" /> struct from raw bytes.
@@ -53,7 +52,7 @@ public readonly struct ObjectId : IEquatable<ObjectId>
     public static ObjectId NewObjectId()
     {
         var timestamp = (int)DateTimeOffset.UtcNow.ToUnixTimeSeconds();
-        var random = Random.Shared.NextInt64();
+        long random = Random.Shared.NextInt64();
         return new ObjectId(timestamp, random);
     }
 
@@ -90,17 +89,32 @@ public readonly struct ObjectId : IEquatable<ObjectId>
     /// </summary>
     /// <param name="other">The object to compare with this instance.</param>
     /// <returns><see langword="true" /> if the values are equal; otherwise, <see langword="false" />.</returns>
-    public bool Equals(ObjectId other) =>
-        _timestamp == other._timestamp && _randomAndCounter == other._randomAndCounter;
+    public bool Equals(ObjectId other)
+    {
+        return _timestamp == other._timestamp && _randomAndCounter == other._randomAndCounter;
+    }
 
     /// <inheritdoc />
-    public override bool Equals(object? obj) => obj is ObjectId other && Equals(other);
+    public override bool Equals(object? obj)
+    {
+        return obj is ObjectId other && Equals(other);
+    }
 
     /// <inheritdoc />
-    public override int GetHashCode() => HashCode.Combine(_timestamp, _randomAndCounter);
+    public override int GetHashCode()
+    {
+        return HashCode.Combine(_timestamp, _randomAndCounter);
+    }
 
-    public static bool operator ==(ObjectId left, ObjectId right) => left.Equals(right);
-    public static bool operator !=(ObjectId left, ObjectId right) => !left.Equals(right);
+    public static bool operator ==(ObjectId left, ObjectId right)
+    {
+        return left.Equals(right);
+    }
+
+    public static bool operator !=(ObjectId left, ObjectId right)
+    {
+        return !left.Equals(right);
+    }
 
     /// <inheritdoc />
     public override string ToString()
```
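The hunks above convert `ObjectId`'s expression-bodied equality members into block bodies without changing behavior. The underlying pattern is standard .NET value equality over the struct's two backing fields: `Equals(T)`, `Equals(object?)`, `GetHashCode`, and both operators must all consult the same fields. A minimal self-contained sketch of that pattern (the `MiniId` type here is hypothetical, not from the repository):

```csharp
using System;

// Hypothetical two-field struct following the same value-equality contract
// as ObjectId: all five members agree on the same pair of fields.
public readonly struct MiniId : IEquatable<MiniId>
{
    private readonly int _timestamp;
    private readonly long _randomAndCounter;

    public MiniId(int timestamp, long randomAndCounter)
    {
        _timestamp = timestamp;
        _randomAndCounter = randomAndCounter;
    }

    public bool Equals(MiniId other)
    {
        return _timestamp == other._timestamp && _randomAndCounter == other._randomAndCounter;
    }

    public override bool Equals(object? obj)
    {
        return obj is MiniId other && Equals(other);
    }

    public override int GetHashCode()
    {
        // Equal values must hash equally, so combine exactly the compared fields.
        return HashCode.Combine(_timestamp, _randomAndCounter);
    }

    public static bool operator ==(MiniId left, MiniId right) => left.Equals(right);
    public static bool operator !=(MiniId left, MiniId right) => !left.Equals(right);
}
```

Keeping `GetHashCode` derived from exactly the fields that `Equals` compares is what makes the type safe to use as a dictionary or index key.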
```diff
@@ -1,18 +1,16 @@
-using System;
 using System.Collections.Concurrent;
-using System.Collections.Generic;
-using System.Threading;
 using System.Threading.Channels;
-using System.Threading.Tasks;
 
 namespace ZB.MOM.WW.CBDD.Core.CDC;
 
 internal sealed class ChangeStreamDispatcher : IDisposable
 {
     private readonly Channel<InternalChangeEvent> _channel;
-    private readonly ConcurrentDictionary<string, ConcurrentDictionary<ChannelWriter<InternalChangeEvent>, byte>> _subscriptions = new();
-    private readonly ConcurrentDictionary<string, int> _payloadWatcherCounts = new();
     private readonly CancellationTokenSource _cts = new();
+    private readonly ConcurrentDictionary<string, int> _payloadWatcherCounts = new();
+
+    private readonly ConcurrentDictionary<string, ConcurrentDictionary<ChannelWriter<InternalChangeEvent>, byte>>
+        _subscriptions = new();
 
     /// <summary>
     /// Initializes a new change stream dispatcher.
@@ -28,6 +26,15 @@ internal sealed class ChangeStreamDispatcher : IDisposable
         Task.Run(ProcessEventsAsync);
     }
 
+    /// <summary>
+    /// Releases dispatcher resources.
+    /// </summary>
+    public void Dispose()
+    {
+        _cts.Cancel();
+        _cts.Dispose();
+    }
+
     /// <summary>
     /// Publishes a change event to subscribers.
     /// </summary>
@@ -44,7 +51,7 @@ internal sealed class ChangeStreamDispatcher : IDisposable
     /// <returns><see langword="true" /> if payload watchers exist; otherwise, <see langword="false" />.</returns>
     public bool HasPayloadWatchers(string collectionName)
     {
-        return _payloadWatcherCounts.TryGetValue(collectionName, out var count) && count > 0;
+        return _payloadWatcherCounts.TryGetValue(collectionName, out int count) && count > 0;
     }
 
     /// <summary>
@@ -66,12 +73,10 @@ internal sealed class ChangeStreamDispatcher : IDisposable
     /// <returns>An <see cref="IDisposable" /> that removes the subscription when disposed.</returns>
     public IDisposable Subscribe(string collectionName, bool capturePayload, ChannelWriter<InternalChangeEvent> writer)
     {
-        if (capturePayload)
-        {
-            _payloadWatcherCounts.AddOrUpdate(collectionName, 1, (_, count) => count + 1);
-        }
+        if (capturePayload) _payloadWatcherCounts.AddOrUpdate(collectionName, 1, (_, count) => count + 1);
 
-        var collectionSubs = _subscriptions.GetOrAdd(collectionName, _ => new ConcurrentDictionary<ChannelWriter<InternalChangeEvent>, byte>());
+        var collectionSubs = _subscriptions.GetOrAdd(collectionName,
+            _ => new ConcurrentDictionary<ChannelWriter<InternalChangeEvent>, byte>());
         collectionSubs.TryAdd(writer, 0);
 
         return new Subscription(() => Unsubscribe(collectionName, capturePayload, writer));
@@ -79,15 +84,9 @@ internal sealed class ChangeStreamDispatcher : IDisposable
 
     private void Unsubscribe(string collectionName, bool capturePayload, ChannelWriter<InternalChangeEvent> writer)
     {
-        if (_subscriptions.TryGetValue(collectionName, out var collectionSubs))
-        {
-            collectionSubs.TryRemove(writer, out _);
-        }
+        if (_subscriptions.TryGetValue(collectionName, out var collectionSubs)) collectionSubs.TryRemove(writer, out _);
 
-        if (capturePayload)
-        {
-            _payloadWatcherCounts.AddOrUpdate(collectionName, 0, (_, count) => Math.Max(0, count - 1));
-        }
+        if (capturePayload) _payloadWatcherCounts.AddOrUpdate(collectionName, 0, (_, count) => Math.Max(0, count - 1));
     }
 
     private async Task ProcessEventsAsync()
@@ -96,38 +95,23 @@ internal sealed class ChangeStreamDispatcher : IDisposable
         {
             var reader = _channel.Reader;
             while (await reader.WaitToReadAsync(_cts.Token))
-            {
                 while (reader.TryRead(out var @event))
-                {
                     if (_subscriptions.TryGetValue(@event.CollectionName, out var collectionSubs))
-                    {
                         foreach (var writer in collectionSubs.Keys)
-                        {
                             // Optimized fan-out: non-blocking TryWrite.
                             // If a subscriber channel is full (unlikely with Unbounded),
                             // we skip or drop. Usually, subscribers will also use Unbounded.
                             writer.TryWrite(@event);
-                        }
-                    }
-                }
-            }
         }
-        catch (OperationCanceledException) { }
+        catch (OperationCanceledException)
+        {
+        }
         catch (Exception)
         {
             // Internal error logging could go here
         }
     }
 
-    /// <summary>
-    /// Releases dispatcher resources.
-    /// </summary>
-    public void Dispose()
-    {
-        _cts.Cancel();
-        _cts.Dispose();
-    }
-
     private sealed class Subscription : IDisposable
     {
         private readonly Action _onDispose;
```
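The dispatcher hunks above keep the same fan-out scheme: one unbounded input channel drained by a background task, which pushes each event to every subscriber's `ChannelWriter` via non-blocking `TryWrite`. A minimal self-contained sketch of that pattern, assuming unbounded channels throughout (the names `input`, `subscribers`, and `subA` are illustrative, not the repository's types):

```csharp
using System.Collections.Concurrent;
using System.Threading.Channels;
using System.Threading.Tasks;

var input = Channel.CreateUnbounded<string>();
var subscribers = new ConcurrentDictionary<ChannelWriter<string>, byte>();

// One subscriber, itself backed by an unbounded channel.
var subA = Channel.CreateUnbounded<string>();
subscribers.TryAdd(subA.Writer, 0);

// Pump: drain the input channel and fan each event out to all subscribers.
var pump = Task.Run(async () =>
{
    while (await input.Reader.WaitToReadAsync())
        while (input.Reader.TryRead(out var evt))
            foreach (var writer in subscribers.Keys)
                writer.TryWrite(evt); // non-blocking; would drop if a bounded subscriber were full
});

input.Writer.TryWrite("doc-inserted");
input.Writer.Complete();
await pump;

subA.Reader.TryRead(out var received); // the subscriber sees "doc-inserted"
```

`TryWrite` keeps a slow or stalled subscriber from ever blocking the pump, which is the trade-off the in-code comment describes: delivery is best-effort per subscriber rather than backpressured.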
```diff
@@ -1,4 +1,3 @@
-using System;
 using ZB.MOM.WW.CBDD.Core.Transactions;
 
 namespace ZB.MOM.WW.CBDD.Core.CDC;
```
```diff
@@ -1,9 +1,5 @@
-using System;
 using System.Collections.Concurrent;
-using System.Collections.Generic;
 using System.Threading.Channels;
-using System.Threading.Tasks;
-using System.Threading;
 using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Core.Collections;
 using ZB.MOM.WW.CBDD.Core.Indexing;
@@ -12,11 +8,11 @@ namespace ZB.MOM.WW.CBDD.Core.CDC;
 
 internal sealed class ChangeStreamObservable<TId, T> : IObservable<ChangeStreamEvent<TId, T>> where T : class
 {
-    private readonly ChangeStreamDispatcher _dispatcher;
-    private readonly string _collectionName;
     private readonly bool _capturePayload;
-    private readonly IDocumentMapper<TId, T> _mapper;
+    private readonly string _collectionName;
+    private readonly ChangeStreamDispatcher _dispatcher;
     private readonly ConcurrentDictionary<ushort, string> _keyReverseMap;
+    private readonly IDocumentMapper<TId, T> _mapper;
 
     /// <summary>
     /// Initializes a new observable wrapper for collection change events.
@@ -60,14 +56,13 @@ internal sealed class ChangeStreamObservable<TId, T> : IObservable<ChangeStreamE
         return new CompositeDisposable(dispatcherSubscription, cts, channel.Writer, bridgeTask);
     }
 
-    private async Task BridgeChannelToObserverAsync(ChannelReader<InternalChangeEvent> reader, IObserver<ChangeStreamEvent<TId, T>> observer, CancellationToken ct)
+    private async Task BridgeChannelToObserverAsync(ChannelReader<InternalChangeEvent> reader,
+        IObserver<ChangeStreamEvent<TId, T>> observer, CancellationToken ct)
     {
         try
         {
             while (await reader.WaitToReadAsync(ct))
-            {
                 while (reader.TryRead(out var internalEvent))
-                {
                     try
                     {
                         // Deserialize the ID
@@ -76,9 +71,8 @@ internal sealed class ChangeStreamObservable<TId, T> : IObservable<ChangeStreamE
                         // Deserialize the payload (if present)
                         T? entity = default;
                         if (internalEvent.PayloadBytes.HasValue)
-                        {
-                            entity = _mapper.Deserialize(new BsonSpanReader(internalEvent.PayloadBytes.Value.Span, _keyReverseMap));
-                        }
+                            entity = _mapper.Deserialize(new BsonSpanReader(internalEvent.PayloadBytes.Value.Span,
+                                _keyReverseMap));
 
                         var externalEvent = new ChangeStreamEvent<TId, T>
                         {
@@ -98,8 +92,7 @@ internal sealed class ChangeStreamObservable<TId, T> : IObservable<ChangeStreamE
                         // Or we can stop the observer.
                         observer.OnError(ex);
                     }
-                }
-            }
+
             observer.OnCompleted();
         }
         catch (OperationCanceledException)
@@ -114,10 +107,10 @@ internal sealed class ChangeStreamObservable<TId, T> : IObservable<ChangeStreamE
 
     private sealed class CompositeDisposable : IDisposable
     {
-        private readonly IDisposable _dispatcherSubscription;
-        private readonly CancellationTokenSource _cts;
-        private readonly ChannelWriter<InternalChangeEvent> _writer;
         private readonly Task _bridgeTask;
+        private readonly CancellationTokenSource _cts;
+        private readonly IDisposable _dispatcherSubscription;
+        private readonly ChannelWriter<InternalChangeEvent> _writer;
         private bool _disposed;
 
         /// <summary>
@@ -127,7 +120,8 @@ internal sealed class ChangeStreamObservable<TId, T> : IObservable<ChangeStreamE
         /// <param name="cts">The cancellation source controlling the bridge task.</param>
         /// <param name="writer">The channel writer for internal change events.</param>
         /// <param name="bridgeTask">The running bridge task.</param>
-        public CompositeDisposable(IDisposable dispatcherSubscription, CancellationTokenSource cts, ChannelWriter<InternalChangeEvent> writer, Task bridgeTask)
+        public CompositeDisposable(IDisposable dispatcherSubscription, CancellationTokenSource cts,
+            ChannelWriter<InternalChangeEvent> writer, Task bridgeTask)
         {
             _dispatcherSubscription = dispatcherSubscription;
             _cts = cts;
```
```diff
@@ -1,6 +1,5 @@
 using System.Collections.Concurrent;
 using ZB.MOM.WW.CBDD.Core.Collections;
-using ZB.MOM.WW.CBDD.Core.Indexing;
 using ZB.MOM.WW.CBDD.Core.Transactions;
 
 namespace ZB.MOM.WW.CBDD.Core.CDC;
@@ -13,11 +12,11 @@ namespace ZB.MOM.WW.CBDD.Core.CDC;
 /// <typeparam name="T">Document type.</typeparam>
 internal sealed class CollectionCdcPublisher<TId, T> where T : class
 {
-    private readonly ITransactionHolder _transactionHolder;
     private readonly string _collectionName;
-    private readonly IDocumentMapper<TId, T> _mapper;
     private readonly ChangeStreamDispatcher? _dispatcher;
     private readonly ConcurrentDictionary<ushort, string> _keyReverseMap;
+    private readonly IDocumentMapper<TId, T> _mapper;
+    private readonly ITransactionHolder _transactionHolder;
 
     /// <summary>
     /// Initializes a new instance of the <see cref="CollectionCdcPublisher" /> class.
@@ -74,15 +73,11 @@ internal sealed class CollectionCdcPublisher<TId, T> where T : class
             return;
 
         ReadOnlyMemory<byte>? payload = null;
-        if (!docData.IsEmpty && _dispatcher.HasPayloadWatchers(_collectionName))
-        {
-            payload = docData.ToArray();
-        }
+        if (!docData.IsEmpty && _dispatcher.HasPayloadWatchers(_collectionName)) payload = docData.ToArray();
 
-        var idBytes = _mapper.ToIndexKey(id).Data.ToArray();
+        byte[] idBytes = _mapper.ToIndexKey(id).Data.ToArray();
 
         if (transaction is Transaction t)
-        {
             t.AddChange(new InternalChangeEvent
             {
                 Timestamp = DateTime.UtcNow.Ticks,
@@ -94,4 +89,3 @@ internal sealed class CollectionCdcPublisher<TId, T> where T : class
             });
-        }
     }
 }
```
```diff
@@ -1,10 +1,6 @@
-using System;
-using System.Buffers;
 using ZB.MOM.WW.CBDD.Bson;
-using ZB.MOM.WW.CBDD.Core.Indexing;
-using System.Linq;
-using System.Collections.Generic;
 using ZB.MOM.WW.CBDD.Bson.Schema;
+using ZB.MOM.WW.CBDD.Core.Indexing;
 
 namespace ZB.MOM.WW.CBDD.Core.Collections;
 
@@ -52,14 +48,20 @@ public abstract class DocumentMapperBase<TId, T> : IDocumentMapper<TId, T> where
     /// </summary>
     /// <param name="id">The identifier value.</param>
     /// <returns>The index key representation of the identifier.</returns>
-    public virtual IndexKey ToIndexKey(TId id) => IndexKey.Create(id);
+    public virtual IndexKey ToIndexKey(TId id)
+    {
+        return IndexKey.Create(id);
+    }
 
     /// <summary>
     /// Converts an index key back into a typed identifier value.
     /// </summary>
     /// <param name="key">The index key to convert.</param>
     /// <returns>The typed identifier value.</returns>
-    public virtual TId FromIndexKey(IndexKey key) => key.As<TId>();
+    public virtual TId FromIndexKey(IndexKey key)
+    {
+        return key.As<TId>();
+    }
 
     /// <summary>
     /// Gets all mapped field keys used by this mapper.
@@ -70,7 +72,10 @@ public abstract class DocumentMapperBase<TId, T> : IDocumentMapper<TId, T> where
     /// Builds the BSON schema for the mapped entity type.
     /// </summary>
     /// <returns>The generated BSON schema.</returns>
-    public virtual BsonSchema GetSchema() => BsonSchemaGenerator.FromType<T>();
+    public virtual BsonSchema GetSchema()
+    {
+        return BsonSchemaGenerator.FromType<T>();
+    }
 }
 
 /// <summary>
@@ -79,10 +84,16 @@ public abstract class DocumentMapperBase<TId, T> : IDocumentMapper<TId, T> where
 public abstract class ObjectIdMapperBase<T> : DocumentMapperBase<ObjectId, T>, IDocumentMapper<T> where T : class
 {
     /// <inheritdoc />
-    public override IndexKey ToIndexKey(ObjectId id) => IndexKey.Create(id);
+    public override IndexKey ToIndexKey(ObjectId id)
+    {
+        return IndexKey.Create(id);
+    }
 
     /// <inheritdoc />
-    public override ObjectId FromIndexKey(IndexKey key) => key.As<ObjectId>();
+    public override ObjectId FromIndexKey(IndexKey key)
+    {
+        return key.As<ObjectId>();
+    }
 }
 
 /// <summary>
@@ -91,10 +102,16 @@ public abstract class ObjectIdMapperBase<T> : DocumentMapperBase<ObjectId, T>, I
 public abstract class Int32MapperBase<T> : DocumentMapperBase<int, T> where T : class
 {
     /// <inheritdoc />
-    public override IndexKey ToIndexKey(int id) => IndexKey.Create(id);
+    public override IndexKey ToIndexKey(int id)
+    {
+        return IndexKey.Create(id);
+    }
 
     /// <inheritdoc />
-    public override int FromIndexKey(IndexKey key) => key.As<int>();
+    public override int FromIndexKey(IndexKey key)
+    {
+        return key.As<int>();
+    }
 }
 
 /// <summary>
@@ -103,10 +120,16 @@ public abstract class Int32MapperBase<T> : DocumentMapperBase<int, T> where T :
 public abstract class StringMapperBase<T> : DocumentMapperBase<string, T> where T : class
 {
     /// <inheritdoc />
-    public override IndexKey ToIndexKey(string id) => IndexKey.Create(id);
+    public override IndexKey ToIndexKey(string id)
+    {
+        return IndexKey.Create(id);
+    }
 
     /// <inheritdoc />
-    public override string FromIndexKey(IndexKey key) => key.As<string>();
+    public override string FromIndexKey(IndexKey key)
+    {
+        return key.As<string>();
+    }
 }
 
 /// <summary>
@@ -115,8 +138,14 @@ public abstract class StringMapperBase<T> : DocumentMapperBase<string, T> where
 public abstract class GuidMapperBase<T> : DocumentMapperBase<Guid, T> where T : class
 {
     /// <inheritdoc />
-    public override IndexKey ToIndexKey(Guid id) => IndexKey.Create(id);
+    public override IndexKey ToIndexKey(Guid id)
+    {
+        return IndexKey.Create(id);
+    }
 
     /// <inheritdoc />
-    public override Guid FromIndexKey(IndexKey key) => key.As<Guid>();
+    public override Guid FromIndexKey(IndexKey key)
+    {
+        return key.As<Guid>();
+    }
 }
```
```diff
@@ -1,16 +1,15 @@
-using System.Reflection;
-using System.Linq;
 using System.Collections;
-using System.Collections.Generic;
 using System.Collections.Concurrent;
+using System.Reflection;
 using ZB.MOM.WW.CBDD.Bson;
-using System;
 using ZB.MOM.WW.CBDD.Bson.Schema;
 
 namespace ZB.MOM.WW.CBDD.Core.Collections;
 
 public static class BsonSchemaGenerator
 {
+    private static readonly ConcurrentDictionary<Type, BsonSchema> _cache = new();
+
     /// <summary>
     /// Generates a BSON schema for the specified CLR type.
     /// </summary>
@@ -21,8 +20,6 @@ public static class BsonSchemaGenerator
         return FromType(typeof(T));
     }
 
-    private static readonly ConcurrentDictionary<Type, BsonSchema> _cache = new();
-
     /// <summary>
     /// Generates a BSON schema for the specified CLR type.
     /// </summary>
@@ -47,10 +44,7 @@ public static class BsonSchemaGenerator
             AddField(schema, prop.Name, prop.PropertyType);
         }
 
-        foreach (var field in fields)
-        {
-            AddField(schema, field.Name, field.FieldType);
-        }
+        foreach (var field in fields) AddField(schema, field.Name, field.FieldType);
 
         return schema;
     }
@@ -60,10 +54,7 @@ public static class BsonSchemaGenerator
         name = name.ToLowerInvariant();
 
         // Convention: id -> _id for root document
-        if (name.Equals("id", StringComparison.OrdinalIgnoreCase))
-        {
-            name = "_id";
-        }
+        if (name.Equals("id", StringComparison.OrdinalIgnoreCase)) name = "_id";
 
         var (bsonType, nestedSchema, itemType) = GetBsonType(type);
 
@@ -106,11 +97,9 @@ public static class BsonSchemaGenerator
         // Nested Objects / Structs
         // If it's not a string, not a primitive, and not an array/list, treat as Document
         if (type != typeof(string) && !type.IsPrimitive && !type.IsEnum)
-        {
             // Avoid infinite recursion?
             // Simple approach: generating nested schema
             return (BsonType.Document, FromType(type), null);
-        }
 
         return (BsonType.Undefined, null, null);
     }
@@ -126,9 +115,7 @@ public static class BsonSchemaGenerator
 
         // If type itself is IEnumerable<T>
         if (type.IsGenericType && type.GetGenericTypeDefinition() == typeof(IEnumerable<>))
-        {
             return type.GetGenericArguments()[0];
-        }
 
         var enumerableType = type.GetInterfaces()
             .FirstOrDefault(i => i.IsGenericType && i.GetGenericTypeDefinition() == typeof(IEnumerable<>));
```
```diff
@@ -18,8 +18,8 @@ public partial class DocumentCollection<TId, T> where T : class
         if (predicate == null) throw new ArgumentNullException(nameof(predicate));
 
         var transaction = _transactionHolder.GetCurrentTransactionOrStart();
-        var txnId = transaction.TransactionId;
-        var pageCount = _storage.PageCount;
+        ulong txnId = transaction.TransactionId;
+        uint pageCount = _storage.PageCount;
         var buffer = new byte[_storage.PageSize];
         var pageResults = new List<T>();
 
@@ -28,10 +28,7 @@ public partial class DocumentCollection<TId, T> where T : class
             pageResults.Clear();
             ScanPage(pageId, txnId, buffer, predicate, pageResults);
 
-            foreach (var doc in pageResults)
-            {
-                yield return doc;
-            }
+            foreach (var doc in pageResults) yield return doc;
         }
     }
 
@@ -46,7 +43,7 @@ public partial class DocumentCollection<TId, T> where T : class
         if (predicate == null) throw new ArgumentNullException(nameof(predicate));
 
         var transaction = _transactionHolder.GetCurrentTransactionOrStart();
-        var txnId = transaction.TransactionId;
+        ulong txnId = transaction.TransactionId;
         var pageCount = (int)_storage.PageCount;
 
         if (degreeOfParallelism <= 0)
@@ -61,15 +58,14 @@ public partial class DocumentCollection<TId, T> where T : class
                 var localResults = new List<T>();
 
                 for (int i = range.Item1; i < range.Item2; i++)
-                {
                     ScanPage((uint)i, txnId, localBuffer, predicate, localResults);
-                }
 
                 return localResults;
             });
     }
 
-    private void ScanPage(uint pageId, ulong txnId, byte[] buffer, Func<BsonSpanReader, bool> predicate, List<T> results)
+    private void ScanPage(uint pageId, ulong txnId, byte[] buffer, Func<BsonSpanReader, bool> predicate,
+        List<T> results)
     {
         _storage.ReadPage(pageId, txnId, buffer);
         var header = SlottedPageHeader.ReadFrom(buffer);
@@ -80,7 +76,7 @@ public partial class DocumentCollection<TId, T> where T : class
         var slots = MemoryMarshal.Cast<byte, SlotEntry>(
             buffer.AsSpan(SlottedPageHeader.Size, header.SlotCount * SlotEntry.Size));
 
-        for (int i = 0; i < header.SlotCount; i++)
+        for (var i = 0; i < header.SlotCount; i++)
         {
             var slot = slots[i];
 
```
File diff suppressed because it is too large.
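The parallel scan above partitions the page range, gives each worker its own buffer and result list, and merges the partial results at the end. The same partition-and-merge shape can be sketched in Python; the in-memory `PAGES` store and `scan_page` helper below are invented stand-ins for the storage engine, not CBDD's API:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical page store: 8 "pages", each holding 10 documents.
PAGES = [[{"id": p * 10 + s} for s in range(10)] for p in range(8)]

def scan_page(page_id, predicate, local_results):
    """Scan one page, appending matching documents to a worker-local list."""
    for doc in PAGES[page_id]:
        if predicate(doc):
            local_results.append(doc)

def parallel_scan(predicate, degree_of_parallelism=4):
    page_count = len(PAGES)
    chunk = -(-page_count // degree_of_parallelism)  # ceiling division
    ranges = [(i, min(i + chunk, page_count)) for i in range(0, page_count, chunk)]

    def scan_range(r):
        local_results = []  # worker-local, so no cross-thread sharing
        for page_id in range(r[0], r[1]):
            scan_page(page_id, predicate, local_results)
        return local_results

    with ThreadPoolExecutor(max_workers=degree_of_parallelism) as pool:
        partials = pool.map(scan_range, ranges)
    return [doc for part in partials for doc in part]

matches = parallel_scan(lambda d: d["id"] % 2 == 0)
```

Each worker touches only its own list, so no locking is needed until the final merge — the same reason the C# version allocates `localResults` per range.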
@@ -1,9 +1,6 @@
 using ZB.MOM.WW.CBDD.Bson;
-using ZB.MOM.WW.CBDD.Core.Indexing;
-using System;
-using System.Buffers;
-using System.Collections.Generic;
 using ZB.MOM.WW.CBDD.Bson.Schema;
+using ZB.MOM.WW.CBDD.Core.Indexing;

 namespace ZB.MOM.WW.CBDD.Core.Collections;

@@ -1,5 +1,3 @@
-using System;
-
 namespace ZB.MOM.WW.CBDD.Core.Collections;

 public readonly struct SchemaVersion
@@ -26,5 +24,8 @@ public readonly struct SchemaVersion
 }

 /// <inheritdoc />
-public override string ToString() => $"v{Version} (0x{Hash:X16})";
+public override string ToString()
+{
+    return $"v{Version} (0x{Hash:X16})";
+}
 }
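`SchemaVersion.ToString` keeps the same output either way: a version number plus the schema hash padded to 16 uppercase hex digits. The equivalent formatting in Python, with illustrative argument names:

```python
def schema_version_string(version, hash_value):
    """Mirror of SchemaVersion.ToString: version plus a 16-hex-digit hash.
    C#'s {Hash:X16} pads to 16 uppercase hex digits, as :016X does here."""
    return f"v{version} (0x{hash_value:016X})"

label = schema_version_string(3, 0xDEADBEEF)
```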
@@ -55,9 +55,10 @@ public readonly struct CompressedPayloadHeader
 /// <param name="codec">Compression codec used for payload bytes.</param>
 /// <param name="originalLength">Original uncompressed payload length.</param>
 /// <param name="compressedPayload">Compressed payload bytes.</param>
-public static CompressedPayloadHeader Create(CompressionCodec codec, int originalLength, ReadOnlySpan<byte> compressedPayload)
+public static CompressedPayloadHeader Create(CompressionCodec codec, int originalLength,
+    ReadOnlySpan<byte> compressedPayload)
 {
-    var checksum = ComputeChecksum(compressedPayload);
+    uint checksum = ComputeChecksum(compressedPayload);
     return new CompressedPayloadHeader(codec, originalLength, compressedPayload.Length, checksum);
 }

@@ -89,9 +90,9 @@ public readonly struct CompressedPayloadHeader
     throw new ArgumentException($"Source must be at least {Size} bytes.", nameof(source));

 var codec = (CompressionCodec)source[0];
-var originalLength = BinaryPrimitives.ReadInt32LittleEndian(source.Slice(4, 4));
-var compressedLength = BinaryPrimitives.ReadInt32LittleEndian(source.Slice(8, 4));
-var checksum = BinaryPrimitives.ReadUInt32LittleEndian(source.Slice(12, 4));
+int originalLength = BinaryPrimitives.ReadInt32LittleEndian(source.Slice(4, 4));
+int compressedLength = BinaryPrimitives.ReadInt32LittleEndian(source.Slice(8, 4));
+uint checksum = BinaryPrimitives.ReadUInt32LittleEndian(source.Slice(12, 4));
 return new CompressedPayloadHeader(codec, originalLength, compressedLength, checksum);
 }

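The `ReadFrom` offsets above imply a little-endian, 16-byte layout: codec byte at offset 0, original length at 4, compressed length at 8, checksum at 12. A sketch of that layout with Python's `struct`; bytes 1–3 are assumed here to be reserved padding, since the excerpt does not show what the real struct stores there:

```python
import struct

# "<B3xiiI": little-endian codec byte, 3 assumed-padding bytes, two int32
# lengths, one uint32 checksum -- 16 bytes total, matching the Slice offsets.
HEADER = struct.Struct("<B3xiiI")

def write_header(codec, original_length, compressed_length, checksum):
    return HEADER.pack(codec, original_length, compressed_length, checksum)

def read_header(source):
    if len(source) < HEADER.size:
        raise ValueError(f"Source must be at least {HEADER.size} bytes.")
    return HEADER.unpack_from(source)

raw = write_header(2, 4096, 1234, 0xCAFEBABE)
fields = read_header(raw)
```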
@@ -108,7 +109,10 @@ public readonly struct CompressedPayloadHeader
 /// Compute Checksum.
 /// </summary>
 /// <param name="payload">Payload bytes.</param>
-public static uint ComputeChecksum(ReadOnlySpan<byte> payload) => Crc32Calculator.Compute(payload);
+public static uint ComputeChecksum(ReadOnlySpan<byte> payload)
+{
+    return Crc32Calculator.Compute(payload);
+}

 private static class Crc32Calculator
 {
@@ -121,10 +125,10 @@ public readonly struct CompressedPayloadHeader
 /// <param name="payload">Payload bytes.</param>
 public static uint Compute(ReadOnlySpan<byte> payload)
 {
-    uint crc = 0xFFFFFFFFu;
-    for (int i = 0; i < payload.Length; i++)
+    var crc = 0xFFFFFFFFu;
+    for (var i = 0; i < payload.Length; i++)
     {
-        var index = (crc ^ payload[i]) & 0xFF;
+        uint index = (crc ^ payload[i]) & 0xFF;
         crc = (crc >> 8) ^ Table[index];
     }

@@ -137,10 +141,7 @@ public readonly struct CompressedPayloadHeader
 for (uint i = 0; i < table.Length; i++)
 {
     uint value = i;
-    for (int bit = 0; bit < 8; bit++)
-    {
-        value = (value & 1) != 0 ? (value >> 1) ^ Polynomial : value >> 1;
-    }
+    for (var bit = 0; bit < 8; bit++) value = (value & 1) != 0 ? (value >> 1) ^ Polynomial : value >> 1;

     table[i] = value;
 }
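`Crc32Calculator` is a table-driven reflected CRC-32: each table entry is its index shifted right eight times through the polynomial, and the payload loop folds one byte at a time into a 0xFFFFFFFF seed. The excerpt does not show the `Polynomial` constant or the final step, so the sketch below assumes the standard reflected CRC-32 (polynomial 0xEDB88320, final XOR with 0xFFFFFFFF), which is what `zlib.crc32` implements:

```python
import zlib

POLYNOMIAL = 0xEDB88320  # assumed; the C# Polynomial constant is not shown

def _build_table():
    """Same shape as the C# table builder: 8 right shifts per entry."""
    table = []
    for i in range(256):
        value = i
        for _ in range(8):
            value = (value >> 1) ^ POLYNOMIAL if value & 1 else value >> 1
        table.append(value)
    return table

TABLE = _build_table()

def crc32(payload):
    crc = 0xFFFFFFFF  # seed, as in Crc32Calculator.Compute
    for byte in payload:
        index = (crc ^ byte) & 0xFF
        crc = (crc >> 8) ^ TABLE[index]
    return crc ^ 0xFFFFFFFF  # final XOR (assumed)
```

With these assumptions the output matches `zlib.crc32` byte for byte, including the standard check value for `"123456789"`.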
@@ -59,16 +59,19 @@ public sealed class CompressionOptions
 throw new ArgumentOutOfRangeException(nameof(MinSizeBytes), "MinSizeBytes must be non-negative.");

 if (candidate.MinSavingsPercent is < 0 or > 100)
-    throw new ArgumentOutOfRangeException(nameof(MinSavingsPercent), "MinSavingsPercent must be between 0 and 100.");
+    throw new ArgumentOutOfRangeException(nameof(MinSavingsPercent),
+        "MinSavingsPercent must be between 0 and 100.");

 if (!Enum.IsDefined(candidate.Codec))
     throw new ArgumentOutOfRangeException(nameof(Codec), $"Unsupported codec: {candidate.Codec}.");

 if (candidate.MaxDecompressedSizeBytes <= 0)
-    throw new ArgumentOutOfRangeException(nameof(MaxDecompressedSizeBytes), "MaxDecompressedSizeBytes must be greater than 0.");
+    throw new ArgumentOutOfRangeException(nameof(MaxDecompressedSizeBytes),
+        "MaxDecompressedSizeBytes must be greater than 0.");

 if (candidate.MaxCompressionInputBytes is <= 0)
-    throw new ArgumentOutOfRangeException(nameof(MaxCompressionInputBytes), "MaxCompressionInputBytes must be greater than 0 when provided.");
+    throw new ArgumentOutOfRangeException(nameof(MaxCompressionInputBytes),
+        "MaxCompressionInputBytes must be greater than 0 when provided.");

 return candidate;
 }
@@ -24,10 +24,7 @@ public sealed class CompressionService
 if (additionalCodecs == null)
     return;

-foreach (var codec in additionalCodecs)
-{
-    RegisterCodec(codec);
-}
+foreach (var codec in additionalCodecs) RegisterCodec(codec);
 }

 /// <summary>
@@ -45,7 +42,10 @@ public sealed class CompressionService
 /// </summary>
 /// <param name="codec">The codec identifier to resolve.</param>
 /// <param name="compressionCodec">When this method returns, contains the resolved codec when found.</param>
-/// <returns><see langword="true"/> when a codec is registered for <paramref name="codec"/>; otherwise, <see langword="false"/>.</returns>
+/// <returns>
+/// <see langword="true" /> when a codec is registered for <paramref name="codec" />; otherwise,
+/// <see langword="false" />.
+/// </returns>
 public bool TryGetCodec(CompressionCodec codec, out ICompressionCodec compressionCodec)
 {
     return _codecs.TryGetValue(codec, out compressionCodec!);
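`TryGetCodec` is the dictionary try-get pattern: look the codec up by identifier, report success through the return value rather than an exception. Since Python has no `out` parameters, a sketch returning a tuple serves the same purpose; the codec identifiers and objects are placeholders:

```python
class CodecRegistry:
    """Dictionary-backed registry mirroring the TryGetCodec lookup shape."""

    def __init__(self):
        self._codecs = {}

    def register(self, codec_id, codec):
        self._codecs[codec_id] = codec

    def try_get_codec(self, codec_id):
        """Return (True, codec) when registered, else (False, None)."""
        codec = self._codecs.get(codec_id)
        return (codec is not None, codec)

registry = CodecRegistry()
registry.register("none", object())
found, codec = registry.try_get_codec("none")
missing, _ = registry.try_get_codec("brotli")
```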
@@ -81,10 +81,14 @@ public sealed class CompressionService
 /// </summary>
 /// <param name="input">The compressed payload bytes.</param>
 /// <param name="codec">The codec to use.</param>
-/// <param name="expectedLength">The expected decompressed byte length, or a negative value to skip exact-length validation.</param>
+/// <param name="expectedLength">
+/// The expected decompressed byte length, or a negative value to skip exact-length
+/// validation.
+/// </param>
 /// <param name="maxDecompressedSizeBytes">The maximum allowed decompressed byte length.</param>
 /// <returns>The decompressed payload bytes.</returns>
-public byte[] Decompress(ReadOnlySpan<byte> input, CompressionCodec codec, int expectedLength, int maxDecompressedSizeBytes)
+public byte[] Decompress(ReadOnlySpan<byte> input, CompressionCodec codec, int expectedLength,
+    int maxDecompressedSizeBytes)
 {
     return GetCodec(codec).Decompress(input, expectedLength, maxDecompressedSizeBytes);
 }
@@ -97,12 +101,70 @@ public sealed class CompressionService
 /// <param name="level">The compression level.</param>
 /// <param name="maxDecompressedSizeBytes">The maximum allowed decompressed byte length.</param>
 /// <returns>The decompressed payload bytes after roundtrip.</returns>
-public byte[] Roundtrip(ReadOnlySpan<byte> input, CompressionCodec codec, CompressionLevel level, int maxDecompressedSizeBytes)
+public byte[] Roundtrip(ReadOnlySpan<byte> input, CompressionCodec codec, CompressionLevel level,
+    int maxDecompressedSizeBytes)
 {
-    var compressed = Compress(input, codec, level);
+    byte[] compressed = Compress(input, codec, level);
     return Decompress(compressed, codec, input.Length, maxDecompressedSizeBytes);
 }

+private static byte[] CompressWithCodecStream(ReadOnlySpan<byte> input, Func<Stream, Stream> streamFactory)
+{
+    using var output = new MemoryStream(input.Length);
+    using (var codecStream = streamFactory(output))
+    {
+        codecStream.Write(input);
+        codecStream.Flush();
+    }
+
+    return output.ToArray();
+}
+
+private static byte[] DecompressWithCodecStream(
+    ReadOnlySpan<byte> input,
+    Func<Stream, Stream> streamFactory,
+    int expectedLength,
+    int maxDecompressedSizeBytes)
+{
+    if (maxDecompressedSizeBytes <= 0)
+        throw new ArgumentOutOfRangeException(nameof(maxDecompressedSizeBytes));
+
+    using var compressed = new MemoryStream(input.ToArray(), false);
+    using var codecStream = streamFactory(compressed);
+    using var output = expectedLength > 0
+        ? new MemoryStream(expectedLength)
+        : new MemoryStream();
+
+    byte[] buffer = ArrayPool<byte>.Shared.Rent(8192);
+    try
+    {
+        var totalWritten = 0;
+        while (true)
+        {
+            int bytesRead = codecStream.Read(buffer, 0, buffer.Length);
+            if (bytesRead <= 0)
+                break;
+
+            totalWritten += bytesRead;
+            if (totalWritten > maxDecompressedSizeBytes)
+                throw new InvalidDataException(
+                    $"Decompressed payload exceeds max allowed size ({maxDecompressedSizeBytes} bytes).");
+
+            output.Write(buffer, 0, bytesRead);
+        }
+
+        if (expectedLength >= 0 && totalWritten != expectedLength)
+            throw new InvalidDataException(
+                $"Expected decompressed length {expectedLength}, actual {totalWritten}.");
+
+        return output.ToArray();
+    }
+    finally
+    {
+        ArrayPool<byte>.Shared.Return(buffer);
+    }
+}
+
 private sealed class NoneCompressionCodec : ICompressionCodec
 {
 /// <summary>
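`DecompressWithCodecStream` reads in 8 KiB chunks and aborts as soon as the running total exceeds the cap, so a decompression bomb fails fast instead of exhausting memory, and an exact-length check catches truncated or padded payloads. The same guard can be sketched in Python with zlib's `decompressobj` and its `max_length` argument standing in for the codec stream (the C# helper is codec-agnostic):

```python
import zlib

def decompress_capped(data, expected_length, max_decompressed_size, chunk=8192):
    """Stream-decompress in bounded chunks, failing once the output cap is
    exceeded -- the same guard DecompressWithCodecStream applies."""
    if max_decompressed_size <= 0:
        raise ValueError("max_decompressed_size must be positive")

    d = zlib.decompressobj()
    out = bytearray()
    buf = data
    while buf:
        # Emit at most `chunk` bytes per step (like the 8 KiB rented buffer);
        # input not yet processed is carried in d.unconsumed_tail.
        out += d.decompress(buf, chunk)
        if len(out) > max_decompressed_size:
            raise ValueError(
                f"Decompressed payload exceeds max allowed size ({max_decompressed_size} bytes).")
        buf = d.unconsumed_tail

    out += d.flush()
    if len(out) > max_decompressed_size:
        raise ValueError(
            f"Decompressed payload exceeds max allowed size ({max_decompressed_size} bytes).")
    if expected_length >= 0 and len(out) != expected_length:
        raise ValueError(f"Expected decompressed length {expected_length}, actual {len(out)}.")
    return bytes(out)
```

As in the C# version, a negative `expected_length` skips the exact-length check while the size cap still applies.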
@@ -116,7 +178,10 @@ public sealed class CompressionService
 /// <param name="input">The payload bytes to copy.</param>
 /// <param name="level">The requested compression level.</param>
 /// <returns>The copied payload bytes.</returns>
-public byte[] Compress(ReadOnlySpan<byte> input, CompressionLevel level) => input.ToArray();
+public byte[] Compress(ReadOnlySpan<byte> input, CompressionLevel level)
+{
+    return input.ToArray();
+}

 /// <summary>
 /// Validates and returns an uncompressed payload copy.
@@ -128,10 +193,12 @@ public sealed class CompressionService
 public byte[] Decompress(ReadOnlySpan<byte> input, int expectedLength, int maxDecompressedSizeBytes)
 {
     if (input.Length > maxDecompressedSizeBytes)
-        throw new InvalidDataException($"Decompressed payload exceeds max allowed size ({maxDecompressedSizeBytes} bytes).");
+        throw new InvalidDataException(
+            $"Decompressed payload exceeds max allowed size ({maxDecompressedSizeBytes} bytes).");

     if (expectedLength >= 0 && expectedLength != input.Length)
-        throw new InvalidDataException($"Expected decompressed length {expectedLength}, actual {input.Length}.");
+        throw new InvalidDataException(
+            $"Expected decompressed length {expectedLength}, actual {input.Length}.");

     return input.ToArray();
 }
@@ -152,19 +219,24 @@ public sealed class CompressionService
 /// <returns>The compressed payload bytes.</returns>
 public byte[] Compress(ReadOnlySpan<byte> input, CompressionLevel level)
 {
-    return CompressWithCodecStream(input, stream => new BrotliStream(stream, level, leaveOpen: true));
+    return CompressWithCodecStream(input, stream => new BrotliStream(stream, level, true));
 }

 /// <summary>
 /// Decompresses Brotli-compressed payload bytes.
 /// </summary>
 /// <param name="input">The compressed payload bytes.</param>
-/// <param name="expectedLength">The expected decompressed byte length, or a negative value to skip exact-length validation.</param>
+/// <param name="expectedLength">
+/// The expected decompressed byte length, or a negative value to skip exact-length
+/// validation.
+/// </param>
 /// <param name="maxDecompressedSizeBytes">The maximum allowed decompressed byte length.</param>
 /// <returns>The decompressed payload bytes.</returns>
 public byte[] Decompress(ReadOnlySpan<byte> input, int expectedLength, int maxDecompressedSizeBytes)
 {
-    return DecompressWithCodecStream(input, stream => new BrotliStream(stream, CompressionMode.Decompress, leaveOpen: true), expectedLength, maxDecompressedSizeBytes);
+    return DecompressWithCodecStream(input,
+        stream => new BrotliStream(stream, CompressionMode.Decompress, true), expectedLength,
+        maxDecompressedSizeBytes);
 }
 }

@@ -183,74 +255,24 @@ public sealed class CompressionService
 /// <returns>The compressed payload bytes.</returns>
 public byte[] Compress(ReadOnlySpan<byte> input, CompressionLevel level)
 {
-    return CompressWithCodecStream(input, stream => new DeflateStream(stream, level, leaveOpen: true));
+    return CompressWithCodecStream(input, stream => new DeflateStream(stream, level, true));
 }

 /// <summary>
 /// Decompresses Deflate-compressed payload bytes.
 /// </summary>
 /// <param name="input">The compressed payload bytes.</param>
-/// <param name="expectedLength">The expected decompressed byte length, or a negative value to skip exact-length validation.</param>
+/// <param name="expectedLength">
+/// The expected decompressed byte length, or a negative value to skip exact-length
+/// validation.
+/// </param>
 /// <param name="maxDecompressedSizeBytes">The maximum allowed decompressed byte length.</param>
 /// <returns>The decompressed payload bytes.</returns>
 public byte[] Decompress(ReadOnlySpan<byte> input, int expectedLength, int maxDecompressedSizeBytes)
 {
-    return DecompressWithCodecStream(input, stream => new DeflateStream(stream, CompressionMode.Decompress, leaveOpen: true), expectedLength, maxDecompressedSizeBytes);
+    return DecompressWithCodecStream(input,
+        stream => new DeflateStream(stream, CompressionMode.Decompress, true), expectedLength,
+        maxDecompressedSizeBytes);
 }
 }

-private static byte[] CompressWithCodecStream(ReadOnlySpan<byte> input, Func<Stream, Stream> streamFactory)
-{
-    using var output = new MemoryStream(capacity: input.Length);
-    using (var codecStream = streamFactory(output))
-    {
-        codecStream.Write(input);
-        codecStream.Flush();
-    }
-
-    return output.ToArray();
-}
-
-private static byte[] DecompressWithCodecStream(
-    ReadOnlySpan<byte> input,
-    Func<Stream, Stream> streamFactory,
-    int expectedLength,
-    int maxDecompressedSizeBytes)
-{
-    if (maxDecompressedSizeBytes <= 0)
-        throw new ArgumentOutOfRangeException(nameof(maxDecompressedSizeBytes));
-
-    using var compressed = new MemoryStream(input.ToArray(), writable: false);
-    using var codecStream = streamFactory(compressed);
-    using var output = expectedLength > 0
-        ? new MemoryStream(capacity: expectedLength)
-        : new MemoryStream();
-
-    var buffer = ArrayPool<byte>.Shared.Rent(8192);
-    try
-    {
-        int totalWritten = 0;
-        while (true)
-        {
-            var bytesRead = codecStream.Read(buffer, 0, buffer.Length);
-            if (bytesRead <= 0)
-                break;
-
-            totalWritten += bytesRead;
-            if (totalWritten > maxDecompressedSizeBytes)
-                throw new InvalidDataException($"Decompressed payload exceeds max allowed size ({maxDecompressedSizeBytes} bytes).");
-
-            output.Write(buffer, 0, bytesRead);
-        }
-
-        if (expectedLength >= 0 && totalWritten != expectedLength)
-            throw new InvalidDataException($"Expected decompressed length {expectedLength}, actual {totalWritten}.");
-
-        return output.ToArray();
-    }
-    finally
-    {
-        ArrayPool<byte>.Shared.Return(buffer);
-    }
-}
 }
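.NET's `DeflateStream` reads and writes a raw DEFLATE stream (RFC 1951, no zlib or gzip header). A Python analogue of the Compress/Decompress pair uses `wbits=-15` to select the same raw format:

```python
import zlib

def deflate_compress(data, level=6):
    """Raw DEFLATE, the format DeflateStream produces (wbits=-15 = headerless)."""
    c = zlib.compressobj(level, zlib.DEFLATED, -15)
    return c.compress(data) + c.flush()

def deflate_decompress(data, expected_length):
    """Decompress raw DEFLATE and validate the exact output length,
    mirroring the expectedLength check in the C# Decompress."""
    out = zlib.decompress(data, -15)
    if expected_length >= 0 and len(out) != expected_length:
        raise ValueError(f"Expected decompressed length {expected_length}, actual {len(out)}.")
    return out

original = b"abc" * 1000
roundtripped = deflate_decompress(deflate_compress(original), len(original))
```

This mirrors the service's `Roundtrip` shape: compress, then decompress with the original length as the expected length.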
@@ -9,30 +9,37 @@ public readonly struct CompressionStats
 /// Gets or sets the CompressedDocumentCount.
 /// </summary>
 public long CompressedDocumentCount { get; init; }

 /// <summary>
 /// Gets or sets the BytesBeforeCompression.
 /// </summary>
 public long BytesBeforeCompression { get; init; }

 /// <summary>
 /// Gets or sets the BytesAfterCompression.
 /// </summary>
 public long BytesAfterCompression { get; init; }

 /// <summary>
 /// Gets or sets the CompressionCpuTicks.
 /// </summary>
 public long CompressionCpuTicks { get; init; }

 /// <summary>
 /// Gets or sets the DecompressionCpuTicks.
 /// </summary>
 public long DecompressionCpuTicks { get; init; }

 /// <summary>
 /// Gets or sets the CompressionFailureCount.
 /// </summary>
 public long CompressionFailureCount { get; init; }

 /// <summary>
 /// Gets or sets the ChecksumFailureCount.
 /// </summary>
 public long ChecksumFailureCount { get; init; }

 /// <summary>
 /// Gets or sets the SafetyLimitRejectionCount.
 /// </summary>
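`CompressionStats` exposes `BytesBeforeCompression` and `BytesAfterCompression`, from which a savings percentage — the quantity a `MinSavingsPercent`-style threshold would compare against — can be derived. How CBDD computes the ratio internally is not shown; the sketch below uses the conventional definition:

```python
def savings_percent(bytes_before, bytes_after):
    """Space saved by compression as a percentage of the input size.
    Returns 0.0 for empty input to avoid dividing by zero."""
    if bytes_before <= 0:
        return 0.0
    return 100.0 * (bytes_before - bytes_after) / bytes_before

pct = savings_percent(1000, 250)
```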
@@ -1,5 +1,3 @@
-using System.Threading;
-
 namespace ZB.MOM.WW.CBDD.Core.Compression;

 /// <summary>
@@ -7,21 +5,21 @@ namespace ZB.MOM.WW.CBDD.Core.Compression;
 /// </summary>
 public sealed class CompressionTelemetry
 {
-private long _compressionAttempts;
-private long _compressionSuccesses;
-private long _compressionFailures;
-private long _compressionSkippedTooSmall;
-private long _compressionSkippedInsufficientSavings;
-private long _decompressionAttempts;
-private long _decompressionSuccesses;
-private long _decompressionFailures;
-private long _compressionInputBytes;
-private long _compressionOutputBytes;
-private long _decompressionOutputBytes;
-private long _compressedDocumentCount;
-private long _compressionCpuTicks;
-private long _decompressionCpuTicks;
-private long _checksumFailureCount;
-private long _safetyLimitRejectionCount;
+private long _checksumFailureCount;
+private long _compressedDocumentCount;
+private long _compressionAttempts;
+private long _compressionCpuTicks;
+private long _compressionFailures;
+private long _compressionInputBytes;
+private long _compressionOutputBytes;
+private long _compressionSkippedInsufficientSavings;
+private long _compressionSkippedTooSmall;
+private long _compressionSuccesses;
+private long _decompressionAttempts;
+private long _decompressionCpuTicks;
+private long _decompressionFailures;
+private long _decompressionOutputBytes;
+private long _decompressionSuccesses;
+private long _safetyLimitRejectionCount;

 /// <summary>
@@ -128,44 +126,68 @@ public sealed class CompressionTelemetry
 /// <summary>
 /// Records a failed compression operation.
 /// </summary>
-public void RecordCompressionFailure() => Interlocked.Increment(ref _compressionFailures);
+public void RecordCompressionFailure()
+{
+    Interlocked.Increment(ref _compressionFailures);
+}

 /// <summary>
 /// Records that compression was skipped because the payload was too small.
 /// </summary>
-public void RecordCompressionSkippedTooSmall() => Interlocked.Increment(ref _compressionSkippedTooSmall);
+public void RecordCompressionSkippedTooSmall()
+{
+    Interlocked.Increment(ref _compressionSkippedTooSmall);
+}

 /// <summary>
 /// Records that compression was skipped due to insufficient expected savings.
 /// </summary>
-public void RecordCompressionSkippedInsufficientSavings() => Interlocked.Increment(ref _compressionSkippedInsufficientSavings);
+public void RecordCompressionSkippedInsufficientSavings()
+{
+    Interlocked.Increment(ref _compressionSkippedInsufficientSavings);
+}

 /// <summary>
 /// Records a decompression attempt.
 /// </summary>
-public void RecordDecompressionAttempt() => Interlocked.Increment(ref _decompressionAttempts);
+public void RecordDecompressionAttempt()
+{
+    Interlocked.Increment(ref _decompressionAttempts);
+}

 /// <summary>
 /// Adds CPU ticks spent performing compression.
 /// </summary>
 /// <param name="ticks">The CPU ticks to add.</param>
-public void RecordCompressionCpuTicks(long ticks) => Interlocked.Add(ref _compressionCpuTicks, ticks);
+public void RecordCompressionCpuTicks(long ticks)
+{
+    Interlocked.Add(ref _compressionCpuTicks, ticks);
+}

 /// <summary>
 /// Adds CPU ticks spent performing decompression.
 /// </summary>
 /// <param name="ticks">The CPU ticks to add.</param>
-public void RecordDecompressionCpuTicks(long ticks) => Interlocked.Add(ref _decompressionCpuTicks, ticks);
+public void RecordDecompressionCpuTicks(long ticks)
+{
+    Interlocked.Add(ref _decompressionCpuTicks, ticks);
+}

 /// <summary>
 /// Records a checksum validation failure.
 /// </summary>
-public void RecordChecksumFailure() => Interlocked.Increment(ref _checksumFailureCount);
+public void RecordChecksumFailure()
+{
+    Interlocked.Increment(ref _checksumFailureCount);
+}

 /// <summary>
 /// Records a decompression rejection due to safety limits.
 /// </summary>
-public void RecordSafetyLimitRejection() => Interlocked.Increment(ref _safetyLimitRejectionCount);
+public void RecordSafetyLimitRejection()
+{
+    Interlocked.Increment(ref _safetyLimitRejectionCount);
+}

 /// <summary>
 /// Records a successful decompression operation.
@@ -180,7 +202,10 @@ public sealed class CompressionTelemetry
 /// <summary>
 /// Records a failed decompression operation.
 /// </summary>
-public void RecordDecompressionFailure() => Interlocked.Increment(ref _decompressionFailures);
+public void RecordDecompressionFailure()
+{
+    Interlocked.Increment(ref _decompressionFailures);
+}

 /// <summary>
 /// Returns a point-in-time snapshot of compression telemetry.
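The telemetry counters above rely on `Interlocked.Increment`/`Interlocked.Add` for lock-free atomic updates from concurrent workers. Python has no equivalent atomic integer, so a lock-based sketch with a reduced field set plays the same role (field and method names follow the C# class; the snapshot method mirrors its point-in-time semantics):

```python
import threading

class CompressionTelemetry:
    """Lock-based analogue of the Interlocked counters: the lock guarantees
    each increment is applied exactly once under concurrency."""

    def __init__(self):
        self._lock = threading.Lock()
        self._compression_failures = 0
        self._compression_cpu_ticks = 0

    def record_compression_failure(self):
        with self._lock:
            self._compression_failures += 1

    def record_compression_cpu_ticks(self, ticks):
        with self._lock:
            self._compression_cpu_ticks += ticks

    def snapshot(self):
        """Point-in-time copy, taken under the lock for consistency."""
        with self._lock:
            return {"failures": self._compression_failures,
                    "cpu_ticks": self._compression_cpu_ticks}

telemetry = CompressionTelemetry()
workers = [threading.Thread(
    target=lambda: [telemetry.record_compression_failure() for _ in range(1000)])
    for _ in range(4)]
for t in workers:
    t.start()
for t in workers:
    t.join()
```

Without the lock (or `Interlocked` in C#), concurrent `+=` operations could lose updates; with it, four workers of 1000 increments always total 4000.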
@@ -1,13 +1,10 @@
+using ZB.MOM.WW.CBDD.Bson;
+using ZB.MOM.WW.CBDD.Core.CDC;
 using ZB.MOM.WW.CBDD.Core.Collections;
+using ZB.MOM.WW.CBDD.Core.Compression;
+using ZB.MOM.WW.CBDD.Core.Metadata;
 using ZB.MOM.WW.CBDD.Core.Storage;
 using ZB.MOM.WW.CBDD.Core.Transactions;
-using ZB.MOM.WW.CBDD.Core.Metadata;
-using ZB.MOM.WW.CBDD.Core.Compression;
-using System.Threading;
-using System;
-using System.Collections.Generic;
-using System.Threading.Tasks;
-using ZB.MOM.WW.CBDD.Bson;

 namespace ZB.MOM.WW.CBDD.Core;

@@ -24,26 +21,16 @@ internal interface ICompactionAwareCollection
 /// Inherit and add DocumentCollection{T} properties for your entities.
 /// Use partial class for Source Generator integration.
 /// </summary>
-public abstract partial class DocumentDbContext : IDisposable, ITransactionHolder
+public abstract class DocumentDbContext : IDisposable, ITransactionHolder
 {
-private readonly IStorageEngine _storage;
-internal readonly CDC.ChangeStreamDispatcher _cdc;
-protected bool _disposed;
-private readonly SemaphoreSlim _transactionLock = new SemaphoreSlim(1, 1);
-
-/// <summary>
-/// Gets the current active transaction, if any.
-/// </summary>
-public ITransaction? CurrentTransaction
-{
-    get
-    {
-        if (_disposed)
-            throw new ObjectDisposedException(nameof(DocumentDbContext));
-        return field != null && (field.State == TransactionState.Active) ? field : null;
-    }
-    private set;
-}
+internal readonly ChangeStreamDispatcher _cdc;
+private readonly List<ICompactionAwareCollection> _compactionAwareCollections = new();
+private readonly IReadOnlyDictionary<Type, object> _model;
+private readonly List<IDocumentMapper> _registeredMappers = new();
+private readonly IStorageEngine _storage;
+private readonly SemaphoreSlim _transactionLock = new(1, 1);
+protected bool _disposed;

 /// <summary>
 /// Creates a new database context with default configuration
@@ -91,7 +78,7 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
 throw new ArgumentNullException(nameof(databasePath));
|
||||||
|
|
||||||
_storage = new StorageEngine(databasePath, config, compressionOptions, maintenanceOptions);
|
_storage = new StorageEngine(databasePath, config, compressionOptions, maintenanceOptions);
|
||||||
_cdc = new CDC.ChangeStreamDispatcher();
|
_cdc = new ChangeStreamDispatcher();
|
||||||
_storage.RegisterCdc(_cdc);
|
_storage.RegisterCdc(_cdc);
|
||||||
|
|
||||||
// Initialize model before collections
|
// Initialize model before collections
|
||||||
@@ -102,16 +89,18 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Initializes document collections for the context.
|
/// Gets the current active transaction, if any.
|
||||||
/// </summary>
|
/// </summary>
|
||||||
protected virtual void InitializeCollections()
|
public ITransaction? CurrentTransaction
|
||||||
{
|
{
|
||||||
// Derived classes can override to initialize collections
|
get
|
||||||
|
{
|
||||||
|
if (_disposed)
|
||||||
|
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
||||||
|
return field != null && field.State == TransactionState.Active ? field : null;
|
||||||
|
}
|
||||||
|
private set;
|
||||||
}
|
}
|
||||||
|
|
||||||
private readonly IReadOnlyDictionary<Type, object> _model;
|
|
||||||
private readonly List<IDocumentMapper> _registeredMappers = new();
|
|
||||||
private readonly List<ICompactionAwareCollection> _compactionAwareCollections = new();
|
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Gets the concrete storage engine for advanced scenarios in derived contexts.
|
/// Gets the concrete storage engine for advanced scenarios in derived contexts.
|
||||||
@@ -133,6 +122,49 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
/// </summary>
|
/// </summary>
|
||||||
protected CompressionTelemetry CompressionTelemetry => _storage.CompressionTelemetry;
|
protected CompressionTelemetry CompressionTelemetry => _storage.CompressionTelemetry;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Releases resources used by the context.
|
||||||
|
/// </summary>
|
||||||
|
public void Dispose()
|
||||||
|
{
|
||||||
|
if (_disposed)
|
||||||
|
return;
|
||||||
|
|
||||||
|
_disposed = true;
|
||||||
|
|
||||||
|
_storage?.Dispose();
|
||||||
|
_cdc?.Dispose();
|
||||||
|
_transactionLock?.Dispose();
|
||||||
|
|
||||||
|
GC.SuppressFinalize(this);
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Gets the current active transaction or starts a new one.
|
||||||
|
/// </summary>
|
||||||
|
/// <returns>The active transaction.</returns>
|
||||||
|
public ITransaction GetCurrentTransactionOrStart()
|
||||||
|
{
|
||||||
|
return BeginTransaction();
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Gets the current active transaction or starts a new one asynchronously.
|
||||||
|
/// </summary>
|
||||||
|
/// <returns>The active transaction.</returns>
|
||||||
|
public async Task<ITransaction> GetCurrentTransactionOrStartAsync()
|
||||||
|
{
|
||||||
|
return await BeginTransactionAsync();
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Initializes document collections for the context.
|
||||||
|
/// </summary>
|
||||||
|
protected virtual void InitializeCollections()
|
||||||
|
{
|
||||||
|
// Derived classes can override to initialize collections
|
||||||
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Override to configure the model using Fluent API.
|
/// Override to configure the model using Fluent API.
|
||||||
/// </summary>
|
/// </summary>
|
||||||
@@ -158,7 +190,7 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
string? customName = null;
|
string? customName = null;
|
||||||
EntityTypeBuilder<T>? builder = null;
|
EntityTypeBuilder<T>? builder = null;
|
||||||
|
|
||||||
if (_model.TryGetValue(typeof(T), out var builderObj))
|
if (_model.TryGetValue(typeof(T), out object? builderObj))
|
||||||
{
|
{
|
||||||
builder = builderObj as EntityTypeBuilder<T>;
|
builder = builderObj as EntityTypeBuilder<T>;
|
||||||
customName = builder?.CollectionName;
|
customName = builder?.CollectionName;
|
||||||
@@ -167,18 +199,12 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
_registeredMappers.Add(mapper);
|
_registeredMappers.Add(mapper);
|
||||||
var collection = new DocumentCollection<TId, T>(_storage, this, mapper, customName);
|
var collection = new DocumentCollection<TId, T>(_storage, this, mapper, customName);
|
||||||
if (collection is ICompactionAwareCollection compactionAwareCollection)
|
if (collection is ICompactionAwareCollection compactionAwareCollection)
|
||||||
{
|
|
||||||
_compactionAwareCollections.Add(compactionAwareCollection);
|
_compactionAwareCollections.Add(compactionAwareCollection);
|
||||||
}
|
|
||||||
|
|
||||||
// Apply configurations from ModelBuilder
|
// Apply configurations from ModelBuilder
|
||||||
if (builder != null)
|
if (builder != null)
|
||||||
{
|
|
||||||
foreach (var indexBuilder in builder.Indexes)
|
foreach (var indexBuilder in builder.Indexes)
|
||||||
{
|
|
||||||
collection.ApplyIndexBuilder(indexBuilder);
|
collection.ApplyIndexBuilder(indexBuilder);
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
_storage.RegisterMappers(_registeredMappers);
|
_storage.RegisterMappers(_registeredMappers);
|
||||||
|
|
||||||
@@ -190,7 +216,10 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
/// </summary>
|
/// </summary>
|
||||||
/// <typeparam name="T">The type of entity to retrieve the document collection for. Must be a reference type.</typeparam>
|
/// <typeparam name="T">The type of entity to retrieve the document collection for. Must be a reference type.</typeparam>
|
||||||
/// <returns>A DocumentCollection<ObjectId, T> instance for the specified entity type.</returns>
|
/// <returns>A DocumentCollection<ObjectId, T> instance for the specified entity type.</returns>
|
||||||
public DocumentCollection<ObjectId, T> Set<T>() where T : class => Set<ObjectId, T>();
|
public DocumentCollection<ObjectId, T> Set<T>() where T : class
|
||||||
|
{
|
||||||
|
return Set<ObjectId, T>();
|
||||||
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Gets a collection for managing documents of type T, identified by keys of type TId.
|
/// Gets a collection for managing documents of type T, identified by keys of type TId.
|
||||||
@@ -200,23 +229,9 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
/// <typeparam name="T">The type of the document to be managed. Must be a reference type.</typeparam>
|
/// <typeparam name="T">The type of the document to be managed. Must be a reference type.</typeparam>
|
||||||
/// <returns>A DocumentCollection<TId, T> instance for performing operations on documents of type T.</returns>
|
/// <returns>A DocumentCollection<TId, T> instance for performing operations on documents of type T.</returns>
|
||||||
public virtual DocumentCollection<TId, T> Set<TId, T>() where T : class
|
public virtual DocumentCollection<TId, T> Set<TId, T>() where T : class
|
||||||
=> throw new InvalidOperationException($"No collection registered for entity type '{typeof(T).Name}' with key type '{typeof(TId).Name}'.");
|
|
||||||
|
|
||||||
/// <summary>
|
|
||||||
/// Releases resources used by the context.
|
|
||||||
/// </summary>
|
|
||||||
public void Dispose()
|
|
||||||
{
|
{
|
||||||
if (_disposed)
|
throw new InvalidOperationException(
|
||||||
return;
|
$"No collection registered for entity type '{typeof(T).Name}' with key type '{typeof(TId).Name}'.");
|
||||||
|
|
||||||
_disposed = true;
|
|
||||||
|
|
||||||
_storage?.Dispose();
|
|
||||||
_cdc?.Dispose();
|
|
||||||
_transactionLock?.Dispose();
|
|
||||||
|
|
||||||
GC.SuppressFinalize(this);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -252,7 +267,7 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
if (_disposed)
|
if (_disposed)
|
||||||
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
||||||
|
|
||||||
bool lockAcquired = false;
|
var lockAcquired = false;
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
await _transactionLock.WaitAsync(ct);
|
await _transactionLock.WaitAsync(ct);
|
||||||
@@ -270,24 +285,6 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
|
||||||
/// Gets the current active transaction or starts a new one.
|
|
||||||
/// </summary>
|
|
||||||
/// <returns>The active transaction.</returns>
|
|
||||||
public ITransaction GetCurrentTransactionOrStart()
|
|
||||||
{
|
|
||||||
return BeginTransaction();
|
|
||||||
}
|
|
||||||
|
|
||||||
/// <summary>
|
|
||||||
/// Gets the current active transaction or starts a new one asynchronously.
|
|
||||||
/// </summary>
|
|
||||||
/// <returns>The active transaction.</returns>
|
|
||||||
public async Task<ITransaction> GetCurrentTransactionOrStartAsync()
|
|
||||||
{
|
|
||||||
return await BeginTransactionAsync();
|
|
||||||
}
|
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Commits the current transaction if one is active.
|
/// Commits the current transaction if one is active.
|
||||||
/// </summary>
|
/// </summary>
|
||||||
@@ -296,7 +293,6 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
if (_disposed)
|
if (_disposed)
|
||||||
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
||||||
if (CurrentTransaction != null)
|
if (CurrentTransaction != null)
|
||||||
{
|
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
CurrentTransaction.Commit();
|
CurrentTransaction.Commit();
|
||||||
@@ -306,7 +302,6 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
CurrentTransaction = null;
|
CurrentTransaction = null;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Commits the current transaction asynchronously if one is active.
|
/// Commits the current transaction asynchronously if one is active.
|
||||||
@@ -317,7 +312,6 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
if (_disposed)
|
if (_disposed)
|
||||||
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
||||||
if (CurrentTransaction != null)
|
if (CurrentTransaction != null)
|
||||||
{
|
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
await CurrentTransaction.CommitAsync(ct);
|
await CurrentTransaction.CommitAsync(ct);
|
||||||
@@ -327,7 +321,6 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
CurrentTransaction = null;
|
CurrentTransaction = null;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Executes a checkpoint using the requested mode.
|
/// Executes a checkpoint using the requested mode.
|
||||||
@@ -348,7 +341,8 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
/// <param name="mode">Checkpoint mode to execute.</param>
|
/// <param name="mode">Checkpoint mode to execute.</param>
|
||||||
/// <param name="ct">The cancellation token.</param>
|
/// <param name="ct">The cancellation token.</param>
|
||||||
/// <returns>The checkpoint execution result.</returns>
|
/// <returns>The checkpoint execution result.</returns>
|
||||||
public Task<CheckpointResult> CheckpointAsync(CheckpointMode mode = CheckpointMode.Truncate, CancellationToken ct = default)
|
public Task<CheckpointResult> CheckpointAsync(CheckpointMode mode = CheckpointMode.Truncate,
|
||||||
|
CancellationToken ct = default)
|
||||||
{
|
{
|
||||||
if (_disposed)
|
if (_disposed)
|
||||||
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
||||||
@@ -437,10 +431,7 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
|
|
||||||
private void RefreshCollectionBindingsAfterCompaction()
|
private void RefreshCollectionBindingsAfterCompaction()
|
||||||
{
|
{
|
||||||
foreach (var collection in _compactionAwareCollections)
|
foreach (var collection in _compactionAwareCollections) collection.RefreshIndexBindingsAfterCompaction();
|
||||||
{
|
|
||||||
collection.RefreshIndexBindingsAfterCompaction();
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -515,7 +506,8 @@ public abstract partial class DocumentDbContext : IDisposable, ITransactionHolde
|
|||||||
/// </summary>
|
/// </summary>
|
||||||
/// <param name="options">Compression migration options.</param>
|
/// <param name="options">Compression migration options.</param>
|
||||||
/// <param name="ct">Cancellation token for the asynchronous operation.</param>
|
/// <param name="ct">Cancellation token for the asynchronous operation.</param>
|
||||||
public Task<CompressionMigrationResult> MigrateCompressionAsync(CompressionMigrationOptions? options = null, CancellationToken ct = default)
|
public Task<CompressionMigrationResult> MigrateCompressionAsync(CompressionMigrationOptions? options = null,
|
||||||
|
CancellationToken ct = default)
|
||||||
{
|
{
|
||||||
if (_disposed)
|
if (_disposed)
|
||||||
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
throw new ObjectDisposedException(nameof(DocumentDbContext));
|
||||||
|
|||||||
@@ -1,24 +1,21 @@
 using System.Buffers;
 using ZB.MOM.WW.CBDD.Core.Storage;
-using ZB.MOM.WW.CBDD.Bson;
-using System.Collections.Generic;
-using System;
 
 namespace ZB.MOM.WW.CBDD.Core.Indexing;
 
 internal sealed class BTreeCursor : IBTreeCursor
 {
+    private readonly List<IndexEntry> _currentEntries;
     private readonly BTreeIndex _index;
-    private readonly ulong _transactionId;
     private readonly IIndexStorage _storage;
+    private readonly ulong _transactionId;
+    private int _currentEntryIndex;
+    private BTreeNodeHeader _currentHeader;
+    private uint _currentPageId;
+    private bool _isValid;
 
     // State
     private byte[] _pageBuffer;
-    private uint _currentPageId;
-    private int _currentEntryIndex;
-    private BTreeNodeHeader _currentHeader;
-    private List<IndexEntry> _currentEntries;
-    private bool _isValid;
 
     /// <summary>
     /// Initializes a new instance of the <see cref="BTreeCursor" /> class.
@@ -55,7 +52,7 @@ internal sealed class BTreeCursor : IBTreeCursor
     public bool MoveToFirst()
     {
         // Find left-most leaf
-        var pageId = _index.RootPageId;
+        uint pageId = _index.RootPageId;
         while (true)
         {
             LoadPage(pageId);
@@ -63,7 +60,7 @@ internal sealed class BTreeCursor : IBTreeCursor
 
             // Go to first child (P0)
             // Internal node format: [Header] [P0] [Entry1] ...
-            var dataOffset = 32 + 20;
+            int dataOffset = 32 + 20;
             pageId = BitConverter.ToUInt32(_pageBuffer.AsSpan(dataOffset, 4));
         }
 
@@ -77,7 +74,7 @@ internal sealed class BTreeCursor : IBTreeCursor
     public bool MoveToLast()
     {
         // Find right-most leaf
-        var pageId = _index.RootPageId;
+        uint pageId = _index.RootPageId;
         while (true)
         {
             LoadPage(pageId);
@@ -93,16 +90,17 @@ internal sealed class BTreeCursor : IBTreeCursor
             // We want the last pointer.
 
             // Re-read P0 just in case
-            uint lastPointer = BitConverter.ToUInt32(_pageBuffer.AsSpan(32 + 20, 4));
+            var lastPointer = BitConverter.ToUInt32(_pageBuffer.AsSpan(32 + 20, 4));
 
-            var offset = 32 + 20 + 4;
-            for (int i = 0; i < _currentHeader.EntryCount; i++)
+            int offset = 32 + 20 + 4;
+            for (var i = 0; i < _currentHeader.EntryCount; i++)
             {
                 var keyLen = BitConverter.ToInt32(_pageBuffer.AsSpan(offset, 4));
                 offset += 4 + keyLen;
                 lastPointer = BitConverter.ToUInt32(_pageBuffer.AsSpan(offset, 4));
                 offset += 4;
             }
 
             pageId = lastPointer;
         }
 
@@ -119,12 +117,12 @@ internal sealed class BTreeCursor : IBTreeCursor
     public bool Seek(IndexKey key)
     {
         // Use Index to find leaf
-        var leafPageId = _index.FindLeafNode(key, _transactionId);
+        uint leafPageId = _index.FindLeafNode(key, _transactionId);
         LoadPage(leafPageId);
         ParseEntries();
 
         // Binary search in entries
-        var idx = _currentEntries.BinarySearch(new IndexEntry(key, default(DocumentLocation)));
+        int idx = _currentEntries.BinarySearch(new IndexEntry(key, default(DocumentLocation)));
 
         if (idx >= 0)
         {
@@ -133,8 +131,7 @@ internal sealed class BTreeCursor : IBTreeCursor
             _isValid = true;
             return true;
         }
-        else
-        {
+
         // Not found, ~idx is the next larger value
         _currentEntryIndex = ~idx;
 
@@ -143,8 +140,7 @@ internal sealed class BTreeCursor : IBTreeCursor
             _isValid = true;
             return false; // Positioned at next greater
         }
-        else
-        {
+
         // Key is larger than max in this page, move to next page
         if (_currentHeader.NextLeafPageId != 0)
         {
@@ -162,8 +158,6 @@ internal sealed class BTreeCursor : IBTreeCursor
             _isValid = false;
             return false;
         }
-        }
-    }
 
     /// <summary>
     /// Moves the cursor to the next entry.
@@ -174,10 +168,7 @@ internal sealed class BTreeCursor : IBTreeCursor
         if (!_isValid) return false;
 
         _currentEntryIndex++;
-        if (_currentEntryIndex < _currentEntries.Count)
-        {
-            return true;
-        }
+        if (_currentEntryIndex < _currentEntries.Count) return true;
 
         // Move to next page
         if (_currentHeader.NextLeafPageId != 0)
@@ -199,10 +190,7 @@ internal sealed class BTreeCursor : IBTreeCursor
         if (!_isValid) return false;
 
         _currentEntryIndex--;
-        if (_currentEntryIndex >= 0)
-        {
-            return true;
-        }
+        if (_currentEntryIndex >= 0) return true;
 
         // Move to prev page
         if (_currentHeader.PrevLeafPageId != 0)
@@ -215,6 +203,18 @@ internal sealed class BTreeCursor : IBTreeCursor
         return false;
     }
 
+    /// <summary>
+    /// Releases cursor resources.
+    /// </summary>
+    public void Dispose()
+    {
+        if (_pageBuffer != null)
+        {
+            ArrayPool<byte>.Shared.Return(_pageBuffer);
+            _pageBuffer = null!;
+        }
+    }
+
     private void LoadPage(uint pageId)
     {
         if (_currentPageId == pageId && _pageBuffer != null) return;
@@ -229,9 +229,9 @@ internal sealed class BTreeCursor : IBTreeCursor
         // Helper to parse entries from current page buffer
         // (Similar to BTreeIndex.ReadLeafEntries)
         _currentEntries.Clear();
-        var dataOffset = 32 + 20;
+        int dataOffset = 32 + 20;
 
-        for (int i = 0; i < _currentHeader.EntryCount; i++)
+        for (var i = 0; i < _currentHeader.EntryCount; i++)
        {
            // Read Key
            var keyLen = BitConverter.ToInt32(_pageBuffer.AsSpan(dataOffset, 4));
@@ -257,13 +257,11 @@ internal sealed class BTreeCursor : IBTreeCursor
             _isValid = true;
             return true;
         }
-        else
-        {
+
         // Empty page? Should not happen in helper logic unless root leaf is empty
         _isValid = false;
         return false;
     }
-    }
 
     private bool PositionAtEnd()
     {
@@ -274,22 +272,8 @@ internal sealed class BTreeCursor : IBTreeCursor
         _isValid = true;
         return true;
         }
-        else
-        {
         _isValid = false;
         return false;
         }
     }
 
-    /// <summary>
-    /// Releases cursor resources.
-    /// </summary>
-    public void Dispose()
-    {
-        if (_pageBuffer != null)
-        {
-            ArrayPool<byte>.Shared.Return(_pageBuffer);
-            _pageBuffer = null!;
-        }
-    }
-}
@@ -1,8 +1,6 @@
|
|||||||
using ZB.MOM.WW.CBDD.Bson;
|
using System.Buffers;
|
||||||
|
using System.Text.RegularExpressions;
|
||||||
using ZB.MOM.WW.CBDD.Core.Storage;
|
using ZB.MOM.WW.CBDD.Core.Storage;
|
||||||
using ZB.MOM.WW.CBDD.Core.Transactions;
|
|
||||||
using System;
|
|
||||||
using System.Collections.Generic;
|
|
||||||
|
|
||||||
namespace ZB.MOM.WW.CBDD.Core.Indexing;
|
namespace ZB.MOM.WW.CBDD.Core.Indexing;
|
||||||
|
|
||||||
@@ -11,10 +9,9 @@ namespace ZB.MOM.WW.CBDD.Core.Indexing;
|
|||||||
/// </summary>
|
/// </summary>
|
||||||
public sealed class BTreeIndex
|
public sealed class BTreeIndex
|
||||||
{
|
{
|
||||||
private readonly IIndexStorage _storage;
|
|
||||||
private readonly IndexOptions _options;
|
|
||||||
private uint _rootPageId;
|
|
||||||
internal const int MaxEntriesPerNode = 100; // Low value to test splitting
|
internal const int MaxEntriesPerNode = 100; // Low value to test splitting
|
||||||
|
private readonly IndexOptions _options;
|
||||||
|
private readonly IIndexStorage _storage;
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Initializes a new instance of the <see cref="BTreeIndex" /> class.
|
/// Initializes a new instance of the <see cref="BTreeIndex" /> class.
|
||||||
@@ -41,15 +38,15 @@ public sealed class BTreeIndex
|
|||||||
{
|
{
|
||||||
_storage = storage ?? throw new ArgumentNullException(nameof(storage));
|
_storage = storage ?? throw new ArgumentNullException(nameof(storage));
|
||||||
_options = options;
|
_options = options;
|
||||||
_rootPageId = rootPageId;
|
RootPageId = rootPageId;
|
||||||
|
|
||||||
if (_rootPageId == 0)
|
if (RootPageId == 0)
|
||||||
{
|
{
|
||||||
// Allocate new root page (cannot use page 0 which is file header)
|
// Allocate new root page (cannot use page 0 which is file header)
|
||||||
_rootPageId = _storage.AllocatePage();
|
RootPageId = _storage.AllocatePage();
|
||||||
|
|
||||||
// Initialize as empty leaf
|
// Initialize as empty leaf
|
||||||
var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
// Clear buffer
|
// Clear buffer
|
||||||
@@ -58,7 +55,7 @@ public sealed class BTreeIndex
|
|||||||
// Write headers
|
// Write headers
|
||||||
var pageHeader = new PageHeader
|
var pageHeader = new PageHeader
|
||||||
{
|
{
|
||||||
PageId = _rootPageId,
|
PageId = RootPageId,
|
||||||
PageType = PageType.Index,
|
PageType = PageType.Index,
|
||||||
FreeBytes = (ushort)(_storage.PageSize - 32),
|
FreeBytes = (ushort)(_storage.PageSize - 32),
|
||||||
NextPageId = 0,
|
NextPageId = 0,
|
||||||
@@ -75,11 +72,11 @@ public sealed class BTreeIndex
|
|||||||
};
|
};
|
||||||
nodeHeader.WriteTo(pageBuffer.AsSpan(32));
|
nodeHeader.WriteTo(pageBuffer.AsSpan(32));
|
||||||
|
|
||||||
_storage.WritePageImmediate(_rootPageId, pageBuffer);
|
_storage.WritePageImmediate(RootPageId, pageBuffer);
|
||||||
}
|
}
|
||||||
finally
|
finally
|
||||||
{
|
{
|
||||||
System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
|
ArrayPool<byte>.Shared.Return(pageBuffer);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -87,7 +84,7 @@ public sealed class BTreeIndex
|
|||||||
/// <summary>
|
/// <summary>
|
||||||
/// Gets the current root page identifier for the B+Tree.
|
/// Gets the current root page identifier for the B+Tree.
|
||||||
/// </summary>
|
/// </summary>
|
||||||
public uint RootPageId => _rootPageId;
|
public uint RootPageId { get; private set; }
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Updates the in-memory root page identifier.
|
/// Updates the in-memory root page identifier.
|
||||||
@@ -98,7 +95,7 @@ public sealed class BTreeIndex
|
|||||||
if (rootPageId == 0)
|
if (rootPageId == 0)
|
||||||
throw new ArgumentOutOfRangeException(nameof(rootPageId));
|
throw new ArgumentOutOfRangeException(nameof(rootPageId));
|
||||||
|
|
||||||
_rootPageId = rootPageId;
|
RootPageId = rootPageId;
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -129,14 +126,14 @@ public sealed class BTreeIndex
|
|||||||
/// <param name="transactionId">The optional transaction identifier.</param>
|
/// <param name="transactionId">The optional transaction identifier.</param>
|
||||||
public void Insert(IndexKey key, DocumentLocation location, ulong? transactionId = null)
|
public void Insert(IndexKey key, DocumentLocation location, ulong? transactionId = null)
|
||||||
{
|
{
|
||||||
var txnId = transactionId ?? 0;
|
ulong txnId = transactionId ?? 0;
|
||||||
var entry = new IndexEntry(key, location);
|
var entry = new IndexEntry(key, location);
|
||||||
var path = new List<uint>();
|
var path = new List<uint>();
|
||||||
|
|
||||||
// Find the leaf node for insertion
|
// Find the leaf node for insertion
|
||||||
var leafPageId = FindLeafNodeWithPath(key, path, txnId);
|
uint leafPageId = FindLeafNodeWithPath(key, path, txnId);
|
||||||
|
|
||||||
var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
ReadPage(leafPageId, txnId, pageBuffer);
|
ReadPage(leafPageId, txnId, pageBuffer);
|
||||||
@@ -158,7 +155,7 @@ public sealed class BTreeIndex
|
|||||||
}
|
}
|
||||||
finally
|
finally
|
||||||
{
|
{
|
||||||
System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
|
ArrayPool<byte>.Shared.Return(pageBuffer);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -171,25 +168,25 @@ public sealed class BTreeIndex
     public bool TryFind(IndexKey key, out DocumentLocation location, ulong? transactionId = null)
     {
         location = default;
-        var txnId = transactionId ?? 0;
+        ulong txnId = transactionId ?? 0;

-        var leafPageId = FindLeafNode(key, txnId);
+        uint leafPageId = FindLeafNode(key, txnId);

         Span<byte> pageBuffer = stackalloc byte[_storage.PageSize];
         ReadPage(leafPageId, txnId, pageBuffer);

         var header = BTreeNodeHeader.ReadFrom(pageBuffer[32..]);
-        var dataOffset = 32 + 20; // Page header + BTree node header
+        int dataOffset = 32 + 20; // Page header + BTree node header

         // Linear search in leaf (could be optimized with binary search)
-        for (int i = 0; i < header.EntryCount; i++)
+        for (var i = 0; i < header.EntryCount; i++)
         {
             var entryKey = ReadIndexKey(pageBuffer, dataOffset);

             if (entryKey.Equals(key))
             {
                 // Found - read DocumentLocation (6 bytes: 4 for PageId + 2 for SlotIndex)
-                var locationOffset = dataOffset + entryKey.Data.Length + 4; // +4 for key length prefix
+                int locationOffset = dataOffset + entryKey.Data.Length + 4; // +4 for key length prefix
                 location = DocumentLocation.ReadFrom(pageBuffer.Slice(locationOffset, DocumentLocation.SerializedSize));
                 return true;
             }
@@ -208,32 +205,35 @@ public sealed class BTreeIndex
     /// <param name="maxKey">The upper bound key.</param>
     /// <param name="direction">The scan direction.</param>
     /// <param name="transactionId">The optional transaction identifier.</param>
-    public IEnumerable<IndexEntry> Range(IndexKey minKey, IndexKey maxKey, IndexDirection direction = IndexDirection.Forward, ulong? transactionId = null)
+    public IEnumerable<IndexEntry> Range(IndexKey minKey, IndexKey maxKey,
+        IndexDirection direction = IndexDirection.Forward, ulong? transactionId = null)
     {
-        var txnId = transactionId ?? 0;
-        var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
+        ulong txnId = transactionId ?? 0;
+        byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);

         try
         {
             if (direction == IndexDirection.Forward)
             {
-                var leafPageId = FindLeafNode(minKey, txnId);
+                uint leafPageId = FindLeafNode(minKey, txnId);

                 while (leafPageId != 0)
                 {
                     ReadPage(leafPageId, txnId, pageBuffer);

                     var header = BTreeNodeHeader.ReadFrom(pageBuffer.AsSpan(32));
-                    var dataOffset = 32 + 20; // Adjusted for 20-byte header
+                    int dataOffset = 32 + 20; // Adjusted for 20-byte header

-                    for (int i = 0; i < header.EntryCount; i++)
+                    for (var i = 0; i < header.EntryCount; i++)
                     {
                         var entryKey = ReadIndexKey(pageBuffer, dataOffset);

                         if (entryKey >= minKey && entryKey <= maxKey)
                         {
-                            var locationOffset = dataOffset + 4 + entryKey.Data.Length;
-                            var location = DocumentLocation.ReadFrom(pageBuffer.AsSpan(locationOffset, DocumentLocation.SerializedSize));
+                            int locationOffset = dataOffset + 4 + entryKey.Data.Length;
+                            var location =
+                                DocumentLocation.ReadFrom(pageBuffer.AsSpan(locationOffset,
+                                    DocumentLocation.SerializedSize));
                             yield return new IndexEntry(entryKey, location);
                         }
                         else if (entryKey > maxKey)
@@ -250,7 +250,7 @@ public sealed class BTreeIndex
             else // Backward
             {
                 // Start from the end of the range (maxKey)
-                var leafPageId = FindLeafNode(maxKey, txnId);
+                uint leafPageId = FindLeafNode(maxKey, txnId);

                 while (leafPageId != 0)
                 {
@@ -267,13 +267,8 @@ public sealed class BTreeIndex
                     {
                         var entry = entries[i];
                         if (entry.Key <= maxKey && entry.Key >= minKey)
-                        {
                             yield return entry;
-                        }
-                        else if (entry.Key < minKey)
-                        {
-                            yield break; // Exceeded range (below min)
-                        }
+                        else if (entry.Key < minKey) yield break; // Exceeded range (below min)
                     }

                     // Check if we need to continue to previous leaf
@@ -296,7 +291,7 @@ public sealed class BTreeIndex
         }
         finally
         {
-            System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
+            ArrayPool<byte>.Shared.Return(pageBuffer);
         }
     }

@@ -314,8 +309,8 @@ public sealed class BTreeIndex

     private uint FindLeafNodeWithPath(IndexKey key, List<uint> path, ulong transactionId)
     {
-        var currentPageId = _rootPageId;
-        var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
+        uint currentPageId = RootPageId;
+        byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);

         try
         {
@@ -324,10 +319,7 @@ public sealed class BTreeIndex
                 ReadPage(currentPageId, transactionId, pageBuffer);
                 var header = BTreeNodeHeader.ReadFrom(pageBuffer.AsSpan(32));

-                if (header.IsLeaf)
-                {
-                    return currentPageId;
-                }
+                if (header.IsLeaf) return currentPageId;

                 path.Add(currentPageId);
                 currentPageId = FindChildNode(pageBuffer, header, key);
@@ -335,7 +327,7 @@ public sealed class BTreeIndex
         }
         finally
         {
-            System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
+            ArrayPool<byte>.Shared.Return(pageBuffer);
         }
     }

@@ -348,24 +340,21 @@ public sealed class BTreeIndex
         // [Entry 2: Key2, P2]
         // ...

-        var dataOffset = 32 + 20;
+        int dataOffset = 32 + 20;
         var p0 = BitConverter.ToUInt32(nodeBuffer.Slice(dataOffset, 4));
         dataOffset += 4;

         uint childPageId = p0;

         // Linear search for now (optimize to binary search later)
-        for (int i = 0; i < header.EntryCount; i++)
+        for (var i = 0; i < header.EntryCount; i++)
         {
             var entryKey = ReadIndexKey(nodeBuffer, dataOffset);
-            var keyLen = 4 + entryKey.Data.Length;
-            var pointerOffset = dataOffset + keyLen;
+            int keyLen = 4 + entryKey.Data.Length;
+            int pointerOffset = dataOffset + keyLen;
             var nextPointer = BitConverter.ToUInt32(nodeBuffer.Slice(pointerOffset, 4));

-            if (key < entryKey)
-            {
-                return childPageId;
-            }
+            if (key < entryKey) return childPageId;

             childPageId = nextPointer;
             dataOffset += keyLen + 4; // Key + Pointer
@@ -395,15 +384,12 @@ public sealed class BTreeIndex
     public IEnumerable<IndexEntry> Equal(IndexKey key, ulong transactionId)
     {
         using var cursor = CreateCursor(transactionId);
-        if (cursor.Seek(key))
-        {
-            yield return cursor.Current;
+        if (cursor.Seek(key)) yield return cursor.Current;
         // Handle duplicates if we support them? Current impl looks unique-ish per key unless multi-value index.
         // BTreeIndex doesn't strictly prevent duplicates in structure, but usually unique keys.
         // If unique, yield one. If not, loop.
         // Assuming unique for now based on TryFind.
-        }
     }

     /// <summary>
     /// Returns entries greater than the specified key.
@@ -418,9 +404,8 @@ public sealed class BTreeIndex
         bool found = cursor.Seek(key);

         if (found && !orEqual)
-        {
-            if (!cursor.MoveNext()) yield break;
-        }
+            if (!cursor.MoveNext())
+                yield break;

         // Loop forward
         do
@@ -471,15 +456,15 @@ public sealed class BTreeIndex
     /// <param name="endInclusive">If true, includes entries equal to <paramref name="end" />.</param>
     /// <param name="transactionId">The transaction identifier used for isolation.</param>
     /// <returns>An enumerable sequence of matching entries.</returns>
-    public IEnumerable<IndexEntry> Between(IndexKey start, IndexKey end, bool startInclusive, bool endInclusive, ulong transactionId)
+    public IEnumerable<IndexEntry> Between(IndexKey start, IndexKey end, bool startInclusive, bool endInclusive,
+        ulong transactionId)
     {
         using var cursor = CreateCursor(transactionId);
         bool found = cursor.Seek(start);

         if (found && !startInclusive)
-        {
-            if (!cursor.MoveNext()) yield break;
-        }
+            if (!cursor.MoveNext())
+                yield break;

         // Iterate while <= end
         do
@@ -489,7 +474,6 @@ public sealed class BTreeIndex
             if (current.Key == end && !endInclusive) yield break;

             yield return current;
-
         } while (cursor.MoveNext());
     }

@@ -509,13 +493,18 @@ public sealed class BTreeIndex
         {
             var current = cursor.Current;
             string val;
-            try { val = current.Key.As<string>(); }
-            catch { break; }
-
+            try
+            {
+                val = current.Key.As<string>();
+            }
+            catch
+            {
+                break;
+            }
             if (!val.StartsWith(prefix)) break;

             yield return current;

         } while (cursor.MoveNext());
     }

@@ -531,13 +520,9 @@ public sealed class BTreeIndex
         using var cursor = CreateCursor(transactionId);

         foreach (var key in sortedKeys)
-        {
             if (cursor.Seek(key))
-            {
                 yield return cursor.Current;
-            }
-        }
     }

     /// <summary>
     /// Returns string-key entries that match a SQL-like pattern.
@@ -547,14 +532,14 @@ public sealed class BTreeIndex
     /// <returns>An enumerable sequence of matching entries.</returns>
     public IEnumerable<IndexEntry> Like(string pattern, ulong transactionId)
     {
-        string regexPattern = "^" + System.Text.RegularExpressions.Regex.Escape(pattern)
+        string regexPattern = "^" + Regex.Escape(pattern)
             .Replace("%", ".*")
             .Replace("_", ".") + "$";

-        var regex = new System.Text.RegularExpressions.Regex(regexPattern, System.Text.RegularExpressions.RegexOptions.Compiled);
+        var regex = new Regex(regexPattern, RegexOptions.Compiled);

-        string prefix = "";
-        for (int i = 0; i < pattern.Length; i++)
+        var prefix = "";
+        for (var i = 0; i < pattern.Length; i++)
         {
             if (pattern[i] == '%' || pattern[i] == '_') break;
             prefix += pattern[i];
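The hunk above builds the `LIKE` matcher by escaping the pattern first and then rewriting the SQL wildcards `%` and `_` into `.*` and `.`, anchored on both ends. The same translation can be sketched in Python for illustration (the function name is ours, not part of CBDD; Python 3.7+ `re.escape`, like `Regex.Escape`, leaves `%` and `_` unescaped, so the replacements are safe):

```python
import re

def like_to_regex(pattern: str) -> "re.Pattern[str]":
    # Escape regex metacharacters first, then rewrite the SQL wildcards:
    # '%' matches any run of characters, '_' matches exactly one character.
    translated = re.escape(pattern).replace("%", ".*").replace("_", ".")
    # Anchor on both ends, mirroring the "^" ... "$" concatenation above.
    return re.compile("^" + translated + "$")
```

Escaping before the replacements is what keeps literal metacharacters such as `.` in the pattern from widening the match.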
@@ -563,30 +548,34 @@ public sealed class BTreeIndex
         using var cursor = CreateCursor(transactionId);

         if (!string.IsNullOrEmpty(prefix))
-        {
             cursor.Seek(IndexKey.Create(prefix));
-        }
         else
-        {
             cursor.MoveToFirst();
-        }

         do
         {
             IndexEntry current;
-            try { current = cursor.Current; } catch { break; } // Safe break if cursor invalid
-
-            if (!string.IsNullOrEmpty(prefix))
-            {
-                try
-                {
-                    string val = current.Key.As<string>();
-                    if (!val.StartsWith(prefix)) break;
-                }
-                catch { break; }
-            }
+            try
+            {
+                current = cursor.Current;
+            }
+            catch
+            {
+                break;
+            } // Safe break if cursor invalid
+
+            if (!string.IsNullOrEmpty(prefix))
+                try
+                {
+                    var val = current.Key.As<string>();
+                    if (!val.StartsWith(prefix)) break;
+                }
+                catch
+                {
+                    break;
+                }

-            bool match = false;
+            var match = false;
             try
             {
                 match = regex.IsMatch(current.Key.As<string>());
@@ -597,7 +586,6 @@ public sealed class BTreeIndex
             }

             if (match) yield return current;
-
         } while (cursor.MoveNext());
     }

@@ -605,10 +593,10 @@ public sealed class BTreeIndex
     {
         // Read current entries to determine offset
         var header = BTreeNodeHeader.ReadFrom(pageBuffer[32..]);
-        var dataOffset = 32 + 20;
+        int dataOffset = 32 + 20;

         // Skip existing entries to find free space
-        for (int i = 0; i < header.EntryCount; i++)
+        for (var i = 0; i < header.EntryCount; i++)
         {
             var keyLen = BitConverter.ToInt32(pageBuffer.Slice(dataOffset, 4));
             dataOffset += 4 + keyLen + DocumentLocation.SerializedSize; // Length + Key + DocumentLocation
@@ -635,37 +623,34 @@ public sealed class BTreeIndex

     private void SplitNode(uint nodePageId, List<uint> path, ulong transactionId)
     {
-        var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
+        byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
         try
         {
             ReadPage(nodePageId, transactionId, pageBuffer);
             var header = BTreeNodeHeader.ReadFrom(pageBuffer.AsSpan(32));

             if (header.IsLeaf)
-            {
                 SplitLeafNode(nodePageId, header, pageBuffer, path, transactionId);
-            }
             else
-            {
                 SplitInternalNode(nodePageId, header, pageBuffer, path, transactionId);
-            }
-        }
         }
         finally
         {
-            System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
+            ArrayPool<byte>.Shared.Return(pageBuffer);
         }
     }

-    private void SplitLeafNode(uint nodePageId, BTreeNodeHeader header, Span<byte> pageBuffer, List<uint> path, ulong transactionId)
+    private void SplitLeafNode(uint nodePageId, BTreeNodeHeader header, Span<byte> pageBuffer, List<uint> path,
+        ulong transactionId)
     {
         var entries = ReadLeafEntries(pageBuffer, header.EntryCount);

-        var splitPoint = entries.Count / 2;
+        int splitPoint = entries.Count / 2;
         var leftEntries = entries.Take(splitPoint).ToList();
         var rightEntries = entries.Skip(splitPoint).ToList();

         // Create new node for right half
-        var newNodeId = CreateNode(isLeaf: true, transactionId);
+        uint newNodeId = CreateNode(true, transactionId);

         // Update original node (left)
         // Next -> RightNode
@@ -678,10 +663,7 @@ public sealed class BTreeIndex
         WriteLeafNode(newNodeId, rightEntries, header.NextLeafPageId, nodePageId, transactionId);

         // Update Original Next Node's Prev pointer to point to New Node
-        if (header.NextLeafPageId != 0)
-        {
-            UpdatePrevPointer(header.NextLeafPageId, newNodeId, transactionId);
-        }
+        if (header.NextLeafPageId != 0) UpdatePrevPointer(header.NextLeafPageId, newNodeId, transactionId);

         // Promote key to parent (first key of right node)
         var promoteKey = rightEntries[0].Key;
@@ -690,7 +672,7 @@ public sealed class BTreeIndex

     private void UpdatePrevPointer(uint pageId, uint newPrevId, ulong transactionId)
     {
-        var buffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
+        byte[] buffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
         try
         {
             ReadPage(pageId, transactionId, buffer);
@@ -701,24 +683,27 @@ public sealed class BTreeIndex
         }
         finally
         {
-            System.Buffers.ArrayPool<byte>.Shared.Return(buffer);
+            ArrayPool<byte>.Shared.Return(buffer);
         }
     }

-    private void SplitInternalNode(uint nodePageId, BTreeNodeHeader header, Span<byte> pageBuffer, List<uint> path, ulong transactionId)
+    private void SplitInternalNode(uint nodePageId, BTreeNodeHeader header, Span<byte> pageBuffer, List<uint> path,
+        ulong transactionId)
     {
-        var (p0, entries) = ReadInternalEntries(pageBuffer, header.EntryCount);
-        var splitPoint = entries.Count / 2;
+        (uint p0, var entries) = ReadInternalEntries(pageBuffer, header.EntryCount);
+        int splitPoint = entries.Count / 2;

         // For internal nodes, the median key moves UP to parent and is excluded from children
         var promoteKey = entries[splitPoint].Key;

         var leftEntries = entries.Take(splitPoint).ToList();
         var rightEntries = entries.Skip(splitPoint + 1).ToList();
-        var rightP0 = entries[splitPoint].PageId; // Attempting to use the pointer associated with promoted key as P0 for right node
+        uint rightP0 =
+            entries[splitPoint]
+                .PageId; // Attempting to use the pointer associated with promoted key as P0 for right node

         // Create new internal node
-        var newNodeId = CreateNode(isLeaf: false, transactionId);
+        uint newNodeId = CreateNode(false, transactionId);

         // Update left node
         WriteInternalNode(nodePageId, p0, leftEntries, transactionId);
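The comment in the hunk above marks the key difference between the two split paths: a leaf split keeps the median entry in the right half and only copies its key up, while an internal split moves the median key up to the parent and drops it from both halves. A minimal sketch of that arithmetic, with plain Python lists standing in for node entries (illustration only, not CBDD API):

```python
def split_leaf(entries):
    # Leaf split: the right half keeps the median entry;
    # its first key is copied (not moved) into the parent.
    mid = len(entries) // 2
    left, right = entries[:mid], entries[mid:]
    return left, right[0], right

def split_internal(entries):
    # Internal split: the median key moves up to the parent
    # and is excluded from both children.
    mid = len(entries) // 2
    return entries[:mid], entries[mid], entries[mid + 1:]
```

The `Skip(splitPoint + 1)` in the internal path is exactly the exclusion of the promoted median; the leaf path uses `Skip(splitPoint)` and keeps it.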
@@ -730,7 +715,8 @@ public sealed class BTreeIndex
         InsertIntoParent(nodePageId, promoteKey, newNodeId, path, transactionId);
     }

-    private void InsertIntoParent(uint leftChildPageId, IndexKey key, uint rightChildPageId, List<uint> path, ulong transactionId)
+    private void InsertIntoParent(uint leftChildPageId, IndexKey key, uint rightChildPageId, List<uint> path,
+        ulong transactionId)
     {
         if (path.Count == 0 || path.Last() == leftChildPageId)
         {
@@ -747,10 +733,10 @@ public sealed class BTreeIndex
             }
         }

-        var parentPageId = path.Last();
+        uint parentPageId = path.Last();
         path.RemoveAt(path.Count - 1); // Pop parent for recursive calls

-        var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
+        byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
         try
         {
             ReadPage(parentPageId, transactionId, pageBuffer);
@@ -764,7 +750,8 @@ public sealed class BTreeIndex
             // But wait, to Split we need the median.
             // Better approach: Read all, add new entry, then split the collection and write back.

-            var (p0, entries) = ReadInternalEntries(pageBuffer.AsSpan(0, _storage.PageSize), header.EntryCount);
+            (uint p0, var entries) =
+                ReadInternalEntries(pageBuffer.AsSpan(0, _storage.PageSize), header.EntryCount);

             // Insert new key/pointer in sorted order
             var newEntry = new InternalEntry(key, rightChildPageId);
@@ -773,14 +760,14 @@ public sealed class BTreeIndex
             else entries.Insert(insertIndex, newEntry);

             // Now split these extended entries
-            var splitPoint = entries.Count / 2;
+            int splitPoint = entries.Count / 2;
             var promoteKey = entries[splitPoint].Key;
-            var rightP0 = entries[splitPoint].PageId;
+            uint rightP0 = entries[splitPoint].PageId;

             var leftEntries = entries.Take(splitPoint).ToList();
             var rightEntries = entries.Skip(splitPoint + 1).ToList();

-            var newParentId = CreateNode(isLeaf: false, transactionId);
+            uint newParentId = CreateNode(false, transactionId);

             WriteInternalNode(parentPageId, p0, leftEntries, transactionId);
             WriteInternalNode(newParentId, rightP0, rightEntries, transactionId);
@@ -790,21 +777,22 @@ public sealed class BTreeIndex
             else
             {
                 // Insert directly
-                InsertIntoInternal(parentPageId, header, pageBuffer.AsSpan(0, _storage.PageSize), key, rightChildPageId, transactionId);
+                InsertIntoInternal(parentPageId, header, pageBuffer.AsSpan(0, _storage.PageSize), key, rightChildPageId,
+                    transactionId);
             }
         }
         finally
         {
-            System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
+            ArrayPool<byte>.Shared.Return(pageBuffer);
         }
     }

     private void CreateNewRoot(uint leftChildId, IndexKey key, uint rightChildId, ulong transactionId)
     {
-        var newRootId = CreateNode(isLeaf: false, transactionId);
-        var entries = new List<InternalEntry> { new InternalEntry(key, rightChildId) };
+        uint newRootId = CreateNode(false, transactionId);
+        var entries = new List<InternalEntry> { new(key, rightChildId) };
         WriteInternalNode(newRootId, leftChildId, entries, transactionId);
-        _rootPageId = newRootId; // Update in-memory root
+        RootPageId = newRootId; // Update in-memory root

         // TODO: Update root in file header/metadata block so it persists?
         // For now user passes rootPageId to ctor. BTreeIndex doesn't manage master root pointer persistence yet.
@@ -812,8 +800,8 @@ public sealed class BTreeIndex

     private uint CreateNode(bool isLeaf, ulong transactionId)
     {
-        var pageId = _storage.AllocatePage();
-        var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
+        uint pageId = _storage.AllocatePage();
+        byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
         try
         {
             Array.Clear(pageBuffer, 0, _storage.PageSize);
@@ -846,7 +834,7 @@ public sealed class BTreeIndex
         }
         finally
         {
-            System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
+            ArrayPool<byte>.Shared.Return(pageBuffer);
         }

         return pageId;
@@ -855,42 +843,45 @@ public sealed class BTreeIndex
     private List<IndexEntry> ReadLeafEntries(Span<byte> pageBuffer, int count)
     {
         var entries = new List<IndexEntry>(count);
-        var dataOffset = 32 + 20;
+        int dataOffset = 32 + 20;

-        for (int i = 0; i < count; i++)
+        for (var i = 0; i < count; i++)
         {
             var key = ReadIndexKey(pageBuffer, dataOffset);
-            var locationOffset = dataOffset + 4 + key.Data.Length;
+            int locationOffset = dataOffset + 4 + key.Data.Length;
             var location = DocumentLocation.ReadFrom(pageBuffer.Slice(locationOffset, DocumentLocation.SerializedSize));
             entries.Add(new IndexEntry(key, location));
             dataOffset = locationOffset + DocumentLocation.SerializedSize;
         }

         return entries;
     }

     private (uint P0, List<InternalEntry> Entries) ReadInternalEntries(Span<byte> pageBuffer, int count)
     {
         var entries = new List<InternalEntry>(count);
-        var dataOffset = 32 + 20;
+        int dataOffset = 32 + 20;

         var p0 = BitConverter.ToUInt32(pageBuffer.Slice(dataOffset, 4));
         dataOffset += 4;

-        for (int i = 0; i < count; i++)
+        for (var i = 0; i < count; i++)
         {
             var key = ReadIndexKey(pageBuffer, dataOffset);
-            var ptrOffset = dataOffset + 4 + key.Data.Length;
+            int ptrOffset = dataOffset + 4 + key.Data.Length;
             var pageId = BitConverter.ToUInt32(pageBuffer.Slice(ptrOffset, 4));
             entries.Add(new InternalEntry(key, pageId));
             dataOffset = ptrOffset + 4;
         }

         return (p0, entries);
     }

-    private void WriteLeafNode(uint pageId, List<IndexEntry> entries, uint nextLeafId, uint prevLeafId, ulong? transactionId = null)
+    private void WriteLeafNode(uint pageId, List<IndexEntry> entries, uint nextLeafId, uint prevLeafId,
+        ulong? transactionId = null)
     {
-        var txnId = transactionId ?? 0;
-        var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
+        ulong txnId = transactionId ?? 0;
+        byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
         try
         {
             Array.Clear(pageBuffer, 0, _storage.PageSize);
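The read/write pairs above all walk the same assumed on-page layout: a 32-byte page header, a 20-byte node header, then for each leaf entry a 4-byte key-length prefix, the key bytes, and a 6-byte DocumentLocation (4-byte PageId + 2-byte SlotIndex). A little-endian sketch of that entry encoding in Python (helper names are ours, not CBDD API; endianness is an assumption matching `BitConverter` on typical platforms):

```python
import struct

HEADERS = 32 + 20  # page header + node header, as in the C# offsets

def pack_leaf_entry(key: bytes, page_id: int, slot: int) -> bytes:
    # [4-byte key length][key bytes][4-byte PageId][2-byte SlotIndex]
    return struct.pack("<i", len(key)) + key + struct.pack("<IH", page_id, slot)

def unpack_leaf_entries(page: bytes, count: int):
    entries, offset = [], HEADERS
    for _ in range(count):
        (key_len,) = struct.unpack_from("<i", page, offset)
        key = page[offset + 4 : offset + 4 + key_len]
        page_id, slot = struct.unpack_from("<IH", page, offset + 4 + key_len)
        entries.append((key, page_id, slot))
        offset += 4 + key_len + 6  # length prefix + key + DocumentLocation
    return entries
```

Round-tripping entries reproduces the advance-by-`4 + keyLen + DocumentLocation.SerializedSize` stride used in `ReadLeafEntries`.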
@@ -919,7 +910,7 @@ public sealed class BTreeIndex
             nodeHeader.WriteTo(pageBuffer.AsSpan(32, 20));

             // Write entries with DocumentLocation (6 bytes instead of ObjectId 12 bytes)
-            var dataOffset = 32 + 20;
+            int dataOffset = 32 + 20;
             foreach (var entry in entries)
             {
                 BitConverter.TryWriteBytes(pageBuffer.AsSpan(dataOffset, 4), entry.Key.Data.Length);
@@ -933,13 +924,13 @@ public sealed class BTreeIndex
|
|||||||
}
|
}
|
||||||
finally
|
finally
|
||||||
{
|
{
|
||||||
System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
|
ArrayPool<byte>.Shared.Return(pageBuffer);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private void WriteInternalNode(uint pageId, uint p0, List<InternalEntry> entries, ulong transactionId)
|
private void WriteInternalNode(uint pageId, uint p0, List<InternalEntry> entries, ulong transactionId)
|
||||||
{
|
{
|
||||||
var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
Array.Clear(pageBuffer, 0, _storage.PageSize);
|
Array.Clear(pageBuffer, 0, _storage.PageSize);
|
||||||
@@ -967,7 +958,7 @@ public sealed class BTreeIndex
|
|||||||
nodeHeader.WriteTo(pageBuffer.AsSpan(32, 20));
|
nodeHeader.WriteTo(pageBuffer.AsSpan(32, 20));
|
||||||
|
|
||||||
// Write P0
|
// Write P0
|
||||||
var dataOffset = 32 + 20;
|
int dataOffset = 32 + 20;
|
||||||
BitConverter.TryWriteBytes(pageBuffer.AsSpan(dataOffset, 4), p0);
|
BitConverter.TryWriteBytes(pageBuffer.AsSpan(dataOffset, 4), p0);
|
||||||
dataOffset += 4;
|
dataOffset += 4;
|
||||||
|
|
||||||
@@ -985,14 +976,15 @@ public sealed class BTreeIndex
|
|||||||
}
|
}
|
||||||
finally
|
finally
|
||||||
{
|
{
|
||||||
System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
|
ArrayPool<byte>.Shared.Return(pageBuffer);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private void InsertIntoInternal(uint pageId, BTreeNodeHeader header, Span<byte> pageBuffer, IndexKey key, uint rightChildId, ulong transactionId)
|
private void InsertIntoInternal(uint pageId, BTreeNodeHeader header, Span<byte> pageBuffer, IndexKey key,
|
||||||
|
uint rightChildId, ulong transactionId)
|
||||||
{
|
{
|
||||||
// Read, insert, write back. In production do in-place shift.
|
// Read, insert, write back. In production do in-place shift.
|
||||||
var (p0, entries) = ReadInternalEntries(pageBuffer, header.EntryCount);
|
(uint p0, var entries) = ReadInternalEntries(pageBuffer, header.EntryCount);
|
||||||
|
|
||||||
var newEntry = new InternalEntry(key, rightChildId);
|
var newEntry = new InternalEntry(key, rightChildId);
|
||||||
int insertIndex = entries.FindIndex(e => e.Key > key);
|
int insertIndex = entries.FindIndex(e => e.Key > key);
|
||||||
@@ -1025,11 +1017,11 @@ public sealed class BTreeIndex
|
|||||||
/// <param name="transactionId">The optional transaction identifier.</param>
|
/// <param name="transactionId">The optional transaction identifier.</param>
|
||||||
public bool Delete(IndexKey key, DocumentLocation location, ulong? transactionId = null)
|
public bool Delete(IndexKey key, DocumentLocation location, ulong? transactionId = null)
|
||||||
{
|
{
|
||||||
var txnId = transactionId ?? 0;
|
ulong txnId = transactionId ?? 0;
|
||||||
var path = new List<uint>();
|
var path = new List<uint>();
|
||||||
var leafPageId = FindLeafNodeWithPath(key, path, txnId);
|
uint leafPageId = FindLeafNodeWithPath(key, path, txnId);
|
||||||
|
|
||||||
var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
ReadPage(leafPageId, txnId, pageBuffer);
|
ReadPage(leafPageId, txnId, pageBuffer);
|
||||||
@@ -1037,14 +1029,11 @@ public sealed class BTreeIndex
|
|||||||
|
|
||||||
// Check if key exists in leaf
|
// Check if key exists in leaf
|
||||||
var entries = ReadLeafEntries(pageBuffer, header.EntryCount);
|
var entries = ReadLeafEntries(pageBuffer, header.EntryCount);
|
||||||
var entryIndex = entries.FindIndex(e => e.Key.Equals(key) &&
|
int entryIndex = entries.FindIndex(e => e.Key.Equals(key) &&
|
||||||
e.Location.PageId == location.PageId &&
|
e.Location.PageId == location.PageId &&
|
||||||
e.Location.SlotIndex == location.SlotIndex);
|
e.Location.SlotIndex == location.SlotIndex);
|
||||||
|
|
||||||
if (entryIndex == -1)
|
if (entryIndex == -1) return false; // Not found
|
||||||
{
|
|
||||||
return false; // Not found
|
|
||||||
}
|
|
||||||
|
|
||||||
// Remove entry
|
// Remove entry
|
||||||
entries.RemoveAt(entryIndex);
|
entries.RemoveAt(entryIndex);
|
||||||
@@ -1055,34 +1044,29 @@ public sealed class BTreeIndex
|
|||||||
// Check for underflow (min 50% fill)
|
// Check for underflow (min 50% fill)
|
||||||
// Simplified: min 1 entry for now, or MaxEntries/2
|
// Simplified: min 1 entry for now, or MaxEntries/2
|
||||||
int minEntries = MaxEntriesPerNode / 2;
|
int minEntries = MaxEntriesPerNode / 2;
|
||||||
if (entries.Count < minEntries && _rootPageId != leafPageId)
|
if (entries.Count < minEntries && RootPageId != leafPageId) HandleUnderflow(leafPageId, path, txnId);
|
||||||
{
|
|
||||||
HandleUnderflow(leafPageId, path, txnId);
|
|
||||||
}
|
|
||||||
|
|
||||||
return true;
|
return true;
|
||||||
}
|
}
|
||||||
finally
|
finally
|
||||||
{
|
{
|
||||||
System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
|
ArrayPool<byte>.Shared.Return(pageBuffer);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private void HandleUnderflow(uint nodeId, List<uint> path, ulong transactionId)
|
private void HandleUnderflow(uint nodeId, List<uint> path, ulong transactionId)
|
||||||
{
|
{
|
||||||
if (path.Count == 0)
|
if (path.Count == 0)
|
||||||
{
|
|
||||||
// Node is root
|
// Node is root
|
||||||
if (nodeId == _rootPageId)
|
if (nodeId == RootPageId)
|
||||||
{
|
|
||||||
// Special case: Collapse root if it has only 1 child (and is not a leaf)
|
// Special case: Collapse root if it has only 1 child (and is not a leaf)
|
||||||
// For now, simpliest implementation: do nothing for root underflow unless it's empty
|
// For now, simpliest implementation: do nothing for root underflow unless it's empty
|
||||||
// If it's a leaf root, it can be empty.
|
// If it's a leaf root, it can be empty.
|
||||||
return;
|
return;
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
var parentPageId = path[^1]; // Parent is last in path (before current node removed? No, path contains ancestors)
|
uint
|
||||||
|
parentPageId =
|
||||||
|
path[^1]; // Parent is last in path (before current node removed? No, path contains ancestors)
|
||||||
// Wait, FindLeafNodeWithPath adds ancestors. So path.Last() is not current node, it's parent.
|
// Wait, FindLeafNodeWithPath adds ancestors. So path.Last() is not current node, it's parent.
|
||||||
// Let's verify FindLeafNodeWithPath:
|
// Let's verify FindLeafNodeWithPath:
|
||||||
// path.Add(currentPageId); currentPageId = FindChildNode(...);
|
// path.Add(currentPageId); currentPageId = FindChildNode(...);
|
||||||
@@ -1091,37 +1075,34 @@ public sealed class BTreeIndex
|
|||||||
// Correct.
|
// Correct.
|
||||||
// So path.Last() is the parent.
|
// So path.Last() is the parent.
|
||||||
|
|
||||||
var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
ReadPage(parentPageId, transactionId, pageBuffer);
|
ReadPage(parentPageId, transactionId, pageBuffer);
|
||||||
var parentHeader = BTreeNodeHeader.ReadFrom(pageBuffer.AsSpan(32));
|
var parentHeader = BTreeNodeHeader.ReadFrom(pageBuffer.AsSpan(32));
|
||||||
var (p0, parentEntries) = ReadInternalEntries(pageBuffer, parentHeader.EntryCount);
|
(uint p0, var parentEntries) = ReadInternalEntries(pageBuffer, parentHeader.EntryCount);
|
||||||
|
|
||||||
// Find index of current node in parent
|
// Find index of current node in parent
|
||||||
int childIndex = -1;
|
int childIndex = -1;
|
||||||
if (p0 == nodeId) childIndex = -1; // -1 indicates P0
|
if (p0 == nodeId) childIndex = -1; // -1 indicates P0
|
||||||
else
|
else
|
||||||
{
|
|
||||||
childIndex = parentEntries.FindIndex(e => e.PageId == nodeId);
|
childIndex = parentEntries.FindIndex(e => e.PageId == nodeId);
|
||||||
}
|
|
||||||
|
|
||||||
// Try to borrow from siblings
|
// Try to borrow from siblings
|
||||||
if (BorrowFromSibling(nodeId, parentPageId, childIndex, parentEntries, p0, transactionId))
|
if (BorrowFromSibling(nodeId, parentPageId, childIndex, parentEntries, p0,
|
||||||
{
|
transactionId)) return; // Rebalanced
|
||||||
return; // Rebalanced
|
|
||||||
}
|
|
||||||
|
|
||||||
// Borrow failed, valid siblings are too small -> MERGE
|
// Borrow failed, valid siblings are too small -> MERGE
|
||||||
MergeWithSibling(nodeId, parentPageId, childIndex, parentEntries, p0, path, transactionId);
|
MergeWithSibling(nodeId, parentPageId, childIndex, parentEntries, p0, path, transactionId);
|
||||||
}
|
}
|
||||||
finally
|
finally
|
||||||
{
|
{
|
||||||
System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
|
ArrayPool<byte>.Shared.Return(pageBuffer);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private bool BorrowFromSibling(uint nodeId, uint parentId, int childIndex, List<InternalEntry> parentEntries, uint p0, ulong transactionId)
|
private bool BorrowFromSibling(uint nodeId, uint parentId, int childIndex, List<InternalEntry> parentEntries,
|
||||||
|
uint p0, ulong transactionId)
|
||||||
{
|
{
|
||||||
// TODO: Implement rotation (borrow from left or right sibling)
|
// TODO: Implement rotation (borrow from left or right sibling)
|
||||||
// Complexity: High. Need to update Parent, Sibling, and Node.
|
// Complexity: High. Need to update Parent, Sibling, and Node.
|
||||||
@@ -1130,7 +1111,8 @@ public sealed class BTreeIndex
|
|||||||
return false;
|
return false;
|
||||||
}
|
}
|
||||||
|
|
||||||
private void MergeWithSibling(uint nodeId, uint parentId, int childIndex, List<InternalEntry> parentEntries, uint p0, List<uint> path, ulong transactionId)
|
private void MergeWithSibling(uint nodeId, uint parentId, int childIndex, List<InternalEntry> parentEntries,
|
||||||
|
uint p0, List<uint> path, ulong transactionId)
|
||||||
{
|
{
|
||||||
// Identify sibling to merge with.
|
// Identify sibling to merge with.
|
||||||
// If P0 (childIndex -1), merge with right sibling (Entry 0).
|
// If P0 (childIndex -1), merge with right sibling (Entry 0).
|
||||||
@@ -1167,14 +1149,10 @@ public sealed class BTreeIndex
|
|||||||
|
|
||||||
// Remove separator key and right pointer from Parent
|
// Remove separator key and right pointer from Parent
|
||||||
if (childIndex == -1)
|
if (childIndex == -1)
|
||||||
{
|
|
||||||
parentEntries.RemoveAt(0); // Removing Entry 0 (Key 0, P1) - P1 was Right Node
|
parentEntries.RemoveAt(0); // Removing Entry 0 (Key 0, P1) - P1 was Right Node
|
||||||
// P0 remains P0 (which was Left Node)
|
// P0 remains P0 (which was Left Node)
|
||||||
}
|
|
||||||
else
|
else
|
||||||
{
|
|
||||||
parentEntries.RemoveAt(childIndex); // Remove entry pointing to Right Node
|
parentEntries.RemoveAt(childIndex); // Remove entry pointing to Right Node
|
||||||
}
|
|
||||||
|
|
||||||
// Write updated Parent
|
// Write updated Parent
|
||||||
WriteInternalNode(parentId, p0, parentEntries, transactionId);
|
WriteInternalNode(parentId, p0, parentEntries, transactionId);
|
||||||
@@ -1186,23 +1164,23 @@ public sealed class BTreeIndex
|
|||||||
|
|
||||||
// Recursive Underflow Check on Parent
|
// Recursive Underflow Check on Parent
|
||||||
int minInternal = MaxEntriesPerNode / 2;
|
int minInternal = MaxEntriesPerNode / 2;
|
||||||
if (parentEntries.Count < minInternal && parentId != _rootPageId)
|
if (parentEntries.Count < minInternal && parentId != RootPageId)
|
||||||
{
|
{
|
||||||
var parentPath = new List<uint>(path.Take(path.Count - 1)); // Path to grandparent
|
var parentPath = new List<uint>(path.Take(path.Count - 1)); // Path to grandparent
|
||||||
HandleUnderflow(parentId, parentPath, transactionId);
|
HandleUnderflow(parentId, parentPath, transactionId);
|
||||||
}
|
}
|
||||||
else if (parentId == _rootPageId && parentEntries.Count == 0)
|
else if (parentId == RootPageId && parentEntries.Count == 0)
|
||||||
{
|
{
|
||||||
// Root collapse: Root has 0 entries (only P0).
|
// Root collapse: Root has 0 entries (only P0).
|
||||||
// P0 becomes new root.
|
// P0 becomes new root.
|
||||||
_rootPageId = p0; // P0 is the merged node (LeftNode)
|
RootPageId = p0; // P0 is the merged node (LeftNode)
|
||||||
// TODO: Update persistent root pointer if stored
|
// TODO: Update persistent root pointer if stored
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private void MergeNodes(uint leftNodeId, uint rightNodeId, IndexKey separatorKey, ulong transactionId)
|
private void MergeNodes(uint leftNodeId, uint rightNodeId, IndexKey separatorKey, ulong transactionId)
|
||||||
{
|
{
|
||||||
var buffer = System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
byte[] buffer = ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
// Read both nodes
|
// Read both nodes
|
||||||
@@ -1218,7 +1196,9 @@ public sealed class BTreeIndex
|
|||||||
var leftEntries = ReadLeafEntries(buffer, leftHeader.EntryCount);
|
var leftEntries = ReadLeafEntries(buffer, leftHeader.EntryCount);
|
||||||
|
|
||||||
ReadPage(rightNodeId, transactionId, buffer);
|
ReadPage(rightNodeId, transactionId, buffer);
|
||||||
var rightEntries = ReadLeafEntries(buffer.AsSpan(0, _storage.PageSize), ((BTreeNodeHeader.ReadFrom(buffer.AsSpan(32))).EntryCount)); // Dirty read reuse buffer? No, bad hygiene.
|
var rightEntries = ReadLeafEntries(buffer.AsSpan(0, _storage.PageSize),
|
||||||
|
BTreeNodeHeader.ReadFrom(buffer.AsSpan(32))
|
||||||
|
.EntryCount); // Dirty read reuse buffer? No, bad hygiene.
|
||||||
// Re-read right clean
|
// Re-read right clean
|
||||||
var rightHeader = BTreeNodeHeader.ReadFrom(buffer.AsSpan(32));
|
var rightHeader = BTreeNodeHeader.ReadFrom(buffer.AsSpan(32));
|
||||||
rightEntries = ReadLeafEntries(buffer, rightHeader.EntryCount);
|
rightEntries = ReadLeafEntries(buffer, rightHeader.EntryCount);
|
||||||
@@ -1229,24 +1209,23 @@ public sealed class BTreeIndex
|
|||||||
// Update Left
|
// Update Left
|
||||||
// Next -> Right.Next
|
// Next -> Right.Next
|
||||||
// Prev -> Left.Prev (unchanged)
|
// Prev -> Left.Prev (unchanged)
|
||||||
WriteLeafNode(leftNodeId, leftEntries, rightHeader.NextLeafPageId, leftHeader.PrevLeafPageId, transactionId);
|
WriteLeafNode(leftNodeId, leftEntries, rightHeader.NextLeafPageId, leftHeader.PrevLeafPageId,
|
||||||
|
transactionId);
|
||||||
|
|
||||||
// Update Right.Next's Prev pointer to point to Left (since Right is gone)
|
// Update Right.Next's Prev pointer to point to Left (since Right is gone)
|
||||||
if (rightHeader.NextLeafPageId != 0)
|
if (rightHeader.NextLeafPageId != 0)
|
||||||
{
|
|
||||||
UpdatePrevPointer(rightHeader.NextLeafPageId, leftNodeId, transactionId);
|
UpdatePrevPointer(rightHeader.NextLeafPageId, leftNodeId, transactionId);
|
||||||
}
|
}
|
||||||
}
|
|
||||||
else
|
else
|
||||||
{
|
{
|
||||||
// Internal Node Merge
|
// Internal Node Merge
|
||||||
ReadPage(leftNodeId, transactionId, buffer);
|
ReadPage(leftNodeId, transactionId, buffer);
|
||||||
// leftHeader is already read and valid
|
// leftHeader is already read and valid
|
||||||
var (leftP0, leftEntries) = ReadInternalEntries(buffer, leftHeader.EntryCount);
|
(uint leftP0, var leftEntries) = ReadInternalEntries(buffer, leftHeader.EntryCount);
|
||||||
|
|
||||||
ReadPage(rightNodeId, transactionId, buffer);
|
ReadPage(rightNodeId, transactionId, buffer);
|
||||||
var rightHeader = BTreeNodeHeader.ReadFrom(buffer.AsSpan(32));
|
var rightHeader = BTreeNodeHeader.ReadFrom(buffer.AsSpan(32));
|
||||||
var (rightP0, rightEntries) = ReadInternalEntries(buffer, rightHeader.EntryCount);
|
(uint rightP0, var rightEntries) = ReadInternalEntries(buffer, rightHeader.EntryCount);
|
||||||
|
|
||||||
// Add Separator Key (from parent) pointing to Right's P0
|
// Add Separator Key (from parent) pointing to Right's P0
|
||||||
leftEntries.Add(new InternalEntry(separatorKey, rightP0));
|
leftEntries.Add(new InternalEntry(separatorKey, rightP0));
|
||||||
@@ -1260,7 +1239,7 @@ public sealed class BTreeIndex
|
|||||||
}
|
}
|
||||||
finally
|
finally
|
||||||
{
|
{
|
||||||
System.Buffers.ArrayPool<byte>.Shared.Return(buffer);
|
ArrayPool<byte>.Shared.Return(buffer);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
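The hunks above serialize internal B-tree nodes by hand: after a 32-byte page header and 20-byte node header (`dataOffset = 32 + 20`), a little-endian `uint` P0 (leftmost child pointer) is followed by repeated `[4-byte key length][key bytes][4-byte child page id]` entries. The C# in this diff is not directly runnable here, so as a language-neutral sketch of that on-page layout, here is a minimal Python round-trip; the offsets come from the diff, everything else (function names, page size) is assumed:

```python
import struct

HEADER_END = 32 + 20  # page header (32 bytes) + node header (20 bytes), per the diff


def write_internal(page_size, p0, entries):
    """entries: list of (key_bytes, child_page_id). Loosely mirrors WriteInternalNode."""
    buf = bytearray(page_size)
    off = HEADER_END
    struct.pack_into("<I", buf, off, p0)  # P0: leftmost child pointer
    off += 4
    for key, child in entries:
        struct.pack_into("<I", buf, off, len(key))        # 4-byte key length
        buf[off + 4:off + 4 + len(key)] = key             # raw key bytes
        struct.pack_into("<I", buf, off + 4 + len(key), child)  # child page id
        off += 4 + len(key) + 4
    return bytes(buf)


def read_internal(buf, count):
    """Loosely mirrors ReadInternalEntries: returns (p0, [(key, child_page_id), ...])."""
    off = HEADER_END
    p0 = struct.unpack_from("<I", buf, off)[0]
    off += 4
    entries = []
    for _ in range(count):
        klen = struct.unpack_from("<I", buf, off)[0]
        key = bytes(buf[off + 4:off + 4 + klen])
        child = struct.unpack_from("<I", buf, off + 4 + klen)[0]
        entries.append((key, child))
        off += 4 + klen + 4
    return p0, entries
```

Note how `dataOffset = ptrOffset + 4` in the C# corresponds to `off += 4 + klen + 4` here: entry sizes are variable, so the node can only be walked sequentially.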
@@ -1,6 +1,5 @@
 using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Core.Storage;
-using System;

 namespace ZB.MOM.WW.CBDD.Core.Indexing;

@@ -120,7 +119,7 @@ public struct BTreeNodeHeader
 if (destination.Length < 20)
     throw new ArgumentException("Destination must be at least 20 bytes");

-BitConverter.TryWriteBytes(destination[0..4], PageId);
+BitConverter.TryWriteBytes(destination[..4], PageId);
 destination[4] = (byte)(IsLeaf ? 1 : 0);
 BitConverter.TryWriteBytes(destination[5..7], EntryCount);
 BitConverter.TryWriteBytes(destination[7..11], ParentPageId);
@@ -140,17 +139,14 @@ public struct BTreeNodeHeader

 var header = new BTreeNodeHeader
 {
-    PageId = BitConverter.ToUInt32(source[0..4]),
+    PageId = BitConverter.ToUInt32(source[..4]),
     IsLeaf = source[4] != 0,
     EntryCount = BitConverter.ToUInt16(source[5..7]),
     ParentPageId = BitConverter.ToUInt32(source[7..11]),
     NextLeafPageId = BitConverter.ToUInt32(source[11..15])
 };

-if (source.Length >= 20)
-{
-    header.PrevLeafPageId = BitConverter.ToUInt32(source[15..19]);
-}
+if (source.Length >= 20) header.PrevLeafPageId = BitConverter.ToUInt32(source[15..19]);

 return header;
 }
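The `WriteTo`/`ReadFrom` pair above fixes the 20-byte node header layout: `PageId` at bytes 0–3, `IsLeaf` at 4, `EntryCount` at 5–6, `ParentPageId` at 7–10, `NextLeafPageId` at 11–14, and `PrevLeafPageId` at 15–18, all little-endian. As a language-neutral illustration, a Python round-trip of the same layout (the trailing 20th byte being unused padding is an assumption; the diff only addresses offsets 0–18):

```python
import struct

# Field order matches WriteTo/ReadFrom in the diff, all little-endian:
# PageId(u32), IsLeaf(u8), EntryCount(u16), ParentPageId(u32),
# NextLeafPageId(u32), PrevLeafPageId(u32) = 19 bytes + 1 padding byte.
FMT = "<IBHIII"


def write_header(page_id, is_leaf, entry_count, parent, next_leaf, prev_leaf):
    buf = bytearray(20)  # destination must be at least 20 bytes, as in WriteTo
    struct.pack_into(FMT, buf, 0, page_id, 1 if is_leaf else 0, entry_count,
                     parent, next_leaf, prev_leaf)
    return bytes(buf)


def read_header(buf):
    page_id, leaf, count, parent, nxt = struct.unpack_from("<IBHII", buf, 0)
    # PrevLeafPageId is only read when the source is long enough, as in ReadFrom.
    prev = struct.unpack_from("<I", buf, 15)[0] if len(buf) >= 20 else 0
    return page_id, leaf != 0, count, parent, nxt, prev
```

Because `<` in the format string disables alignment padding, the six fields pack to exactly 19 bytes, mirroring the C# offsets.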
@@ -1,6 +1,4 @@
-using System;
 using System.Linq.Expressions;
-using ZB.MOM.WW.CBDD.Bson;

 namespace ZB.MOM.WW.CBDD.Core.Indexing;

@@ -11,6 +9,44 @@ namespace ZB.MOM.WW.CBDD.Core.Indexing;
 /// <typeparam name="T">Document type</typeparam>
 public sealed class CollectionIndexDefinition<T> where T : class
 {
+    /// <summary>
+    /// Creates a new index definition
+    /// </summary>
+    /// <param name="name">Index name</param>
+    /// <param name="propertyPaths">Property paths for the index</param>
+    /// <param name="keySelectorExpression">Expression to extract key from document</param>
+    /// <param name="isUnique">Enforce uniqueness</param>
+    /// <param name="type">Index structure type (BTree or Hash)</param>
+    /// <param name="isPrimary">Is this the primary key index</param>
+    /// <param name="dimensions">The vector dimensions for vector indexes.</param>
+    /// <param name="metric">The distance metric for vector indexes.</param>
+    public CollectionIndexDefinition(
+        string name,
+        string[] propertyPaths,
+        Expression<Func<T, object>> keySelectorExpression,
+        bool isUnique = false,
+        IndexType type = IndexType.BTree,
+        bool isPrimary = false,
+        int dimensions = 0,
+        VectorMetric metric = VectorMetric.Cosine)
+    {
+        if (string.IsNullOrWhiteSpace(name))
+            throw new ArgumentException("Index name cannot be empty", nameof(name));
+
+        if (propertyPaths == null || propertyPaths.Length == 0)
+            throw new ArgumentException("Property paths cannot be empty", nameof(propertyPaths));
+
+        Name = name;
+        PropertyPaths = propertyPaths;
+        KeySelectorExpression = keySelectorExpression ?? throw new ArgumentNullException(nameof(keySelectorExpression));
+        KeySelector = keySelectorExpression.Compile(); // Compile for performance
+        IsUnique = isUnique;
+        Type = type;
+        IsPrimary = isPrimary;
+        Dimensions = dimensions;
+        Metric = metric;
+    }
+
 /// <summary>
 /// Unique name for this index (auto-generated or user-specified)
 /// </summary>
@@ -54,44 +90,6 @@ public sealed class CollectionIndexDefinition<T> where T : class
 /// </summary>
 public bool IsPrimary { get; }

-    /// <summary>
-    /// Creates a new index definition
-    /// </summary>
-    /// <param name="name">Index name</param>
-    /// <param name="propertyPaths">Property paths for the index</param>
-    /// <param name="keySelectorExpression">Expression to extract key from document</param>
-    /// <param name="isUnique">Enforce uniqueness</param>
-    /// <param name="type">Index structure type (BTree or Hash)</param>
-    /// <param name="isPrimary">Is this the primary key index</param>
-    /// <param name="dimensions">The vector dimensions for vector indexes.</param>
-    /// <param name="metric">The distance metric for vector indexes.</param>
-    public CollectionIndexDefinition(
-        string name,
-        string[] propertyPaths,
-        Expression<Func<T, object>> keySelectorExpression,
-        bool isUnique = false,
-        IndexType type = IndexType.BTree,
-        bool isPrimary = false,
-        int dimensions = 0,
-        VectorMetric metric = VectorMetric.Cosine)
-    {
-        if (string.IsNullOrWhiteSpace(name))
-            throw new ArgumentException("Index name cannot be empty", nameof(name));
-
-        if (propertyPaths == null || propertyPaths.Length == 0)
-            throw new ArgumentException("Property paths cannot be empty", nameof(propertyPaths));
-
-        Name = name;
-        PropertyPaths = propertyPaths;
-        KeySelectorExpression = keySelectorExpression ?? throw new ArgumentNullException(nameof(keySelectorExpression));
-        KeySelector = keySelectorExpression.Compile(); // Compile for performance
-        IsUnique = isUnique;
-        Type = type;
-        IsPrimary = isPrimary;
-        Dimensions = dimensions;
-        Metric = metric;
-    }
-
 /// <summary>
 /// Converts this high-level definition to low-level IndexOptions for BTreeIndex
 /// </summary>
@@ -136,11 +134,9 @@ public sealed class CollectionIndexDefinition<T> where T : class
 if (propertyPaths.Length > PropertyPaths.Length)
     return false;

-for (int i = 0; i < propertyPaths.Length; i++)
-{
+for (var i = 0; i < propertyPaths.Length; i++)
     if (!PropertyPaths[i].Equals(propertyPaths[i], StringComparison.OrdinalIgnoreCase))
         return false;
-}

 return true;
 }
@@ -148,8 +144,8 @@ public sealed class CollectionIndexDefinition<T> where T : class
 /// <inheritdoc />
 public override string ToString()
 {
-    var uniqueStr = IsUnique ? "Unique" : "Non-Unique";
-    var paths = string.Join(", ", PropertyPaths);
+    string uniqueStr = IsUnique ? "Unique" : "Non-Unique";
+    string paths = string.Join(", ", PropertyPaths);
     return $"{Name} ({uniqueStr} {Type} on [{paths}])";
 }
 }
@@ -197,6 +193,7 @@ public sealed class CollectionIndexInfo
 /// <inheritdoc />
 public override string ToString()
 {
-    return $"{Name}: {string.Join(", ", PropertyPaths)} ({EstimatedDocumentCount} docs, {EstimatedSizeBytes:N0} bytes)";
+    return
+        $"{Name}: {string.Join(", ", PropertyPaths)} ({EstimatedDocumentCount} docs, {EstimatedSizeBytes:N0} bytes)";
 }
 }
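The loop in the `@@ -136` hunk above implements the usual compound-index matching rule: an index over `PropertyPaths` can serve a query whose paths are a leading, case-insensitive prefix of them. A short Python sketch of that rule (the function name is mine, only the comparison logic comes from the diff):

```python
def index_covers(index_paths, query_paths):
    """True if query_paths is a case-insensitive prefix of index_paths,
    mirroring the PropertyPaths check in the diff above."""
    if len(query_paths) > len(index_paths):
        return False  # query asks for more columns than the index has
    return all(a.lower() == b.lower() for a, b in zip(index_paths, query_paths))
```

So an index on `["Name", "Age"]` can answer queries on `["name"]` or `["Name", "Age"]`, but not on `["Age"]` alone, because only leading components are ordered in the index.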
@@ -1,7 +1,4 @@
|
|||||||
using System;
|
|
||||||
using System.Collections.Generic;
|
|
||||||
using System.Linq.Expressions;
|
using System.Linq.Expressions;
|
||||||
using ZB.MOM.WW.CBDD.Bson;
|
|
||||||
using ZB.MOM.WW.CBDD.Core.Collections;
|
using ZB.MOM.WW.CBDD.Core.Collections;
|
||||||
using ZB.MOM.WW.CBDD.Core.Storage;
|
using ZB.MOM.WW.CBDD.Core.Storage;
|
||||||
using ZB.MOM.WW.CBDD.Core.Transactions;
|
using ZB.MOM.WW.CBDD.Core.Transactions;
|
||||||
@@ -16,12 +13,12 @@ namespace ZB.MOM.WW.CBDD.Core.Indexing;
|
|||||||
/// <typeparam name="T">Document type</typeparam>
|
/// <typeparam name="T">Document type</typeparam>
|
||||||
public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
|
public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
|
||||||
{
|
{
|
||||||
private readonly Dictionary<string, CollectionSecondaryIndex<TId, T>> _indexes;
|
|
||||||
private readonly IStorageEngine _storage;
|
|
||||||
private readonly IDocumentMapper<TId, T> _mapper;
|
|
||||||
private readonly object _lock = new();
|
|
||||||
private bool _disposed;
|
|
||||||
private readonly string _collectionName;
|
private readonly string _collectionName;
|
||||||
|
private readonly Dictionary<string, CollectionSecondaryIndex<TId, T>> _indexes;
|
||||||
|
private readonly object _lock = new();
|
||||||
|
private readonly IDocumentMapper<TId, T> _mapper;
|
||||||
|
private readonly IStorageEngine _storage;
|
||||||
|
private bool _disposed;
|
||||||
private CollectionMetadata _metadata;
|
private CollectionMetadata _metadata;
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -41,7 +38,8 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
|
|||||||
/// <param name="storage">The storage abstraction used to persist index state.</param>
|
/// <param name="storage">The storage abstraction used to persist index state.</param>
|
||||||
/// <param name="mapper">The document mapper for the collection.</param>
|
/// <param name="mapper">The document mapper for the collection.</param>
|
||||||
/// <param name="collectionName">An optional collection name override.</param>
|
/// <param name="collectionName">An optional collection name override.</param>
|
||||||
internal CollectionIndexManager(IStorageEngine storage, IDocumentMapper<TId, T> mapper, string? collectionName = null)
|
internal CollectionIndexManager(IStorageEngine storage, IDocumentMapper<TId, T> mapper,
|
||||||
|
string? collectionName = null)
|
||||||
{
|
{
|
||||||
_storage = storage ?? throw new ArgumentNullException(nameof(storage));
|
_storage = storage ?? throw new ArgumentNullException(nameof(storage));
|
||||||
_mapper = mapper ?? throw new ArgumentNullException(nameof(mapper));
|
_mapper = mapper ?? throw new ArgumentNullException(nameof(mapper));
|
||||||
@@ -49,17 +47,53 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
|
|||||||
_indexes = new Dictionary<string, CollectionSecondaryIndex<TId, T>>(StringComparer.OrdinalIgnoreCase);
|
_indexes = new Dictionary<string, CollectionSecondaryIndex<TId, T>>(StringComparer.OrdinalIgnoreCase);
|
||||||
|
|
||||||
// Load existing metadata via storage
|
// Load existing metadata via storage
|
||||||
_metadata = _storage.GetCollectionMetadata(_collectionName) ?? new CollectionMetadata { Name = _collectionName };
|
_metadata = _storage.GetCollectionMetadata(_collectionName) ??
|
||||||
|
new CollectionMetadata { Name = _collectionName };
|
||||||
|
|
||||||
// Initialize indexes from metadata
|
// Initialize indexes from metadata
|
||||||
foreach (var idxMeta in _metadata.Indexes)
|
foreach (var idxMeta in _metadata.Indexes)
|
||||||
{
|
{
|
||||||
var definition = RebuildDefinition(idxMeta.Name, idxMeta.PropertyPaths, idxMeta.IsUnique, idxMeta.Type, idxMeta.Dimensions, idxMeta.Metric);
|
var definition = RebuildDefinition(idxMeta.Name, idxMeta.PropertyPaths, idxMeta.IsUnique, idxMeta.Type,
|
||||||
|
idxMeta.Dimensions, idxMeta.Metric);
|
||||||
var index = new CollectionSecondaryIndex<TId, T>(definition, _storage, _mapper, idxMeta.RootPageId);
|
var index = new CollectionSecondaryIndex<TId, T>(definition, _storage, _mapper, idxMeta.RootPageId);
|
||||||
_indexes[idxMeta.Name] = index;
|
_indexes[idxMeta.Name] = index;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Gets the root page identifier for the primary index.
|
||||||
|
/// </summary>
|
||||||
|
public uint PrimaryRootPageId => _metadata.PrimaryRootPageId;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Releases resources used by the index manager.
|
||||||
|
/// </summary>
|
||||||
|
public void Dispose()
|
||||||
|
{
|
||||||
|
if (_disposed)
|
||||||
|
return;
|
||||||
|
|
||||||
|
// No auto-save on dispose to avoid unnecessary I/O if no changes
|
||||||
|
|
||||||
|
lock (_lock)
|
||||||
|
{
|
||||||
|
foreach (var index in _indexes.Values)
|
||||||
|
try
|
||||||
|
{
|
||||||
|
index.Dispose();
|
||||||
|
}
|
||||||
|
catch
|
||||||
|
{
|
||||||
|
/* Best effort */
|
||||||
|
}
|
||||||
|
|
||||||
|
_indexes.Clear();
|
||||||
|
_disposed = true;
|
||||||
|
}
|
||||||
|
|
||||||
|
GC.SuppressFinalize(this);
|
||||||
|
}
|
||||||
|
|
||||||
private void UpdateMetadata()
|
private void UpdateMetadata()
|
||||||
{
|
{
|
||||||
_metadata.Indexes.Clear();
|
_metadata.Indexes.Clear();
|
||||||
@@ -129,7 +163,7 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
             throw new ArgumentNullException(nameof(keySelector));

         // Extract property paths from expression
-        var propertyPaths = ExpressionAnalyzer.ExtractPropertyPaths(keySelector);
+        string[] propertyPaths = ExpressionAnalyzer.ExtractPropertyPaths(keySelector);

         // Generate name if not provided
         name ??= GenerateIndexName(propertyPaths);
@@ -158,10 +192,11 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
     /// <param name="metric">Distance metric used by the vector index.</param>
     /// <param name="name">Optional index name.</param>
     /// <returns>The created or existing index.</returns>
-    public CollectionSecondaryIndex<TId, T> CreateVectorIndex<TKey>(Expression<Func<T, TKey>> keySelector, int dimensions, VectorMetric metric = VectorMetric.Cosine, string? name = null)
+    public CollectionSecondaryIndex<TId, T> CreateVectorIndex<TKey>(Expression<Func<T, TKey>> keySelector,
+        int dimensions, VectorMetric metric = VectorMetric.Cosine, string? name = null)
     {
-        var propertyPaths = ExpressionAnalyzer.ExtractPropertyPaths(keySelector);
-        var indexName = name ?? GenerateIndexName(propertyPaths);
+        string[] propertyPaths = ExpressionAnalyzer.ExtractPropertyPaths(keySelector);
+        string indexName = name ?? GenerateIndexName(propertyPaths);

         lock (_lock)
         {
@@ -169,15 +204,13 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
                 return existing;

             var body = keySelector.Body;
-            if (body.Type != typeof(object))
-            {
-                body = Expression.Convert(body, typeof(object));
-            }
+            if (body.Type != typeof(object)) body = Expression.Convert(body, typeof(object));

             // Reuse the original parameter from keySelector to avoid invalid expression trees.
             var lambda = Expression.Lambda<Func<T, object>>(body, keySelector.Parameters);

-            var definition = new CollectionIndexDefinition<T>(indexName, propertyPaths, lambda, false, IndexType.Vector, false, dimensions, metric);
+            var definition = new CollectionIndexDefinition<T>(indexName, propertyPaths, lambda, false, IndexType.Vector,
+                false, dimensions, metric);
             return CreateIndex(definition);
         }
     }
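The "Reuse the original parameter from keySelector" comment in the hunk above reflects a real expression-tree constraint: a rebuilt lambda must be given the same `ParameterExpression` instances its body references, or `Compile()` fails. A standalone sketch (the `Person` type and the values are illustrative, not from this repository) demonstrates both outcomes:

```csharp
using System;
using System.Linq.Expressions;

Expression<Func<Person, int>> keySelector = p => p.Age;

// Wrap the body so it matches Func<Person, object>.
var body = Expression.Convert(keySelector.Body, typeof(object));

// Correct: reuse the original lambda's parameters, so the ParameterExpression
// inside `body` stays bound by the new lambda's parameter list.
var rebound = Expression.Lambda<Func<Person, object>>(body, keySelector.Parameters);
Console.WriteLine(rebound.Compile()(new Person { Age = 42 })); // prints 42

// Broken: a fresh parameter with the same name is a *different* node, so the
// body references an undefined variable and Compile() throws.
var fresh = Expression.Parameter(typeof(Person), "p");
var broken = Expression.Lambda<Func<Person, object>>(body, fresh);
try { broken.Compile(); }
catch (InvalidOperationException) { Console.WriteLine("unbound parameter"); }

class Person { public int Age { get; set; } }
```

This is why the refactor keeps `keySelector.Parameters` instead of calling `Expression.Parameter` again.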
@@ -194,7 +227,7 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
         string? name = null,
         bool unique = false)
     {
-        var propertyPaths = ExpressionAnalyzer.ExtractPropertyPaths(keySelector);
+        string[] propertyPaths = ExpressionAnalyzer.ExtractPropertyPaths(keySelector);
         name ??= GenerateIndexName(propertyPaths);

         lock (_lock)
@@ -220,10 +253,7 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
     {
         // Convert LambdaExpression to Expression<Func<T, object>> properly by sharing parameters
         var body = keySelector.Body;
-        if (body.Type != typeof(object))
-        {
-            body = Expression.Convert(body, typeof(object));
-        }
+        if (body.Type != typeof(object)) body = Expression.Convert(body, typeof(object));

         var lambda = Expression.Lambda<Func<T, object>>(body, keySelector.Parameters);

@@ -244,8 +274,8 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
         VectorMetric metric = VectorMetric.Cosine,
         string? name = null)
     {
-        var propertyPaths = ExpressionAnalyzer.ExtractPropertyPaths(keySelector);
-        var indexName = name ?? GenerateIndexName(propertyPaths);
+        string[] propertyPaths = ExpressionAnalyzer.ExtractPropertyPaths(keySelector);
+        string indexName = name ?? GenerateIndexName(propertyPaths);

         lock (_lock)
         {
@@ -253,14 +283,12 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
                 return existing;

             var body = keySelector.Body;
-            if (body.Type != typeof(object))
-            {
-                body = Expression.Convert(body, typeof(object));
-            }
+            if (body.Type != typeof(object)) body = Expression.Convert(body, typeof(object));

             var lambda = Expression.Lambda<Func<T, object>>(body, keySelector.Parameters);

-            var definition = new CollectionIndexDefinition<T>(indexName, propertyPaths, lambda, false, IndexType.Vector, false, dimensions, metric);
+            var definition = new CollectionIndexDefinition<T>(indexName, propertyPaths, lambda, false, IndexType.Vector,
+                false, dimensions, metric);
             return CreateIndex(definition);
         }
     }
@@ -275,8 +303,8 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
         LambdaExpression keySelector,
         string? name = null)
     {
-        var propertyPaths = ExpressionAnalyzer.ExtractPropertyPaths(keySelector);
-        var indexName = name ?? GenerateIndexName(propertyPaths);
+        string[] propertyPaths = ExpressionAnalyzer.ExtractPropertyPaths(keySelector);
+        string indexName = name ?? GenerateIndexName(propertyPaths);

         lock (_lock)
         {
@@ -284,14 +312,12 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
                 return existing;

             var body = keySelector.Body;
-            if (body.Type != typeof(object))
-            {
-                body = Expression.Convert(body, typeof(object));
-            }
+            if (body.Type != typeof(object)) body = Expression.Convert(body, typeof(object));

             var lambda = Expression.Lambda<Func<T, object>>(body, keySelector.Parameters);

-            var definition = new CollectionIndexDefinition<T>(indexName, propertyPaths, lambda, false, IndexType.Spatial);
+            var definition =
+                new CollectionIndexDefinition<T>(indexName, propertyPaths, lambda, false, IndexType.Spatial);
             return CreateIndex(definition);
         }
     }
@@ -425,10 +451,7 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class

         lock (_lock)
         {
-            foreach (var index in _indexes.Values)
-            {
-                index.Insert(document, location, transaction);
-            }
+            foreach (var index in _indexes.Values) index.Insert(document, location, transaction);
         }
     }

@@ -440,7 +463,8 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
    /// <param name="oldLocation">Physical location of old document</param>
    /// <param name="newLocation">Physical location of new document</param>
    /// <param name="transaction">Transaction context</param>
-    public void UpdateInAll(T oldDocument, T newDocument, DocumentLocation oldLocation, DocumentLocation newLocation, ITransaction transaction)
+    public void UpdateInAll(T oldDocument, T newDocument, DocumentLocation oldLocation, DocumentLocation newLocation,
+        ITransaction transaction)
     {
         if (oldDocument == null)
             throw new ArgumentNullException(nameof(oldDocument));
@@ -450,11 +474,9 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
         lock (_lock)
         {
             foreach (var index in _indexes.Values)
-            {
                 index.Update(oldDocument, newDocument, oldLocation, newLocation, transaction);
-            }
         }
     }

    /// <summary>
    /// Deletes a document from all indexes
@@ -469,10 +491,7 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class

         lock (_lock)
         {
-            foreach (var index in _indexes.Values)
-            {
-                index.Delete(document, location, transaction);
-            }
+            foreach (var index in _indexes.Values) index.Delete(document, location, transaction);
         }
     }

@@ -484,20 +503,17 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
         return $"idx_{string.Join("_", propertyPaths)}";
     }

-    private CollectionIndexDefinition<T> RebuildDefinition(string name, string[] paths, bool isUnique, IndexType type, int dimensions = 0, VectorMetric metric = VectorMetric.Cosine)
+    private CollectionIndexDefinition<T> RebuildDefinition(string name, string[] paths, bool isUnique, IndexType type,
+        int dimensions = 0, VectorMetric metric = VectorMetric.Cosine)
     {
         var param = Expression.Parameter(typeof(T), "u");
         Expression body;

         if (paths.Length == 1)
-        {
             body = Expression.PropertyOrField(param, paths[0]);
-        }
         else
-        {
             body = Expression.NewArrayInit(typeof(object),
                 paths.Select(p => Expression.Convert(Expression.PropertyOrField(param, p), typeof(object))));
-        }

         var objectBody = Expression.Convert(body, typeof(object));
         var lambda = Expression.Lambda<Func<T, object>>(objectBody, param);
@@ -505,11 +521,6 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
         return new CollectionIndexDefinition<T>(name, paths, lambda, isUnique, type, false, dimensions, metric);
     }

-    /// <summary>
-    /// Gets the root page identifier for the primary index.
-    /// </summary>
-    public uint PrimaryRootPageId => _metadata.PrimaryRootPageId;
-
     /// <summary>
     /// Rebinds cached metadata and index instances from persisted metadata.
     /// </summary>
@@ -525,8 +536,13 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
             throw new ObjectDisposedException(nameof(CollectionIndexManager<TId, T>));

         foreach (var index in _indexes.Values)
+            try
             {
-                try { index.Dispose(); } catch { /* Best effort */ }
+                index.Dispose();
+            }
+            catch
+            {
+                /* Best effort */
             }

         _indexes.Clear();
@@ -534,7 +550,8 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class

         foreach (var idxMeta in _metadata.Indexes)
         {
-            var definition = RebuildDefinition(idxMeta.Name, idxMeta.PropertyPaths, idxMeta.IsUnique, idxMeta.Type, idxMeta.Dimensions, idxMeta.Metric);
+            var definition = RebuildDefinition(idxMeta.Name, idxMeta.PropertyPaths, idxMeta.IsUnique, idxMeta.Type,
+                idxMeta.Dimensions, idxMeta.Metric);
             var index = new CollectionSecondaryIndex<TId, T>(definition, _storage, _mapper, idxMeta.RootPageId);
             _indexes[idxMeta.Name] = index;
         }
@@ -561,37 +578,16 @@ public sealed class CollectionIndexManager<TId, T> : IDisposable where T : class
    /// Gets the current collection metadata.
    /// </summary>
    /// <returns>The collection metadata.</returns>
-    public CollectionMetadata GetMetadata() => _metadata;
+    public CollectionMetadata GetMetadata()
+    {
+        return _metadata;
+    }

    private void SaveMetadata()
    {
        UpdateMetadata();
        _storage.SaveCollectionMetadata(_metadata);
    }

-    /// <summary>
-    /// Releases resources used by the index manager.
-    /// </summary>
-    public void Dispose()
-    {
-        if (_disposed)
-            return;
-
-        // No auto-save on dispose to avoid unnecessary I/O if no changes
-
-        lock (_lock)
-        {
-            foreach (var index in _indexes.Values)
-            {
-                try { index.Dispose(); } catch { /* Best effort */ }
-            }
-
-            _indexes.Clear();
-            _disposed = true;
-        }
-
-        GC.SuppressFinalize(this);
-    }
 }

 /// <summary>
@@ -607,35 +603,30 @@ public static class ExpressionAnalyzer
    public static string[] ExtractPropertyPaths(LambdaExpression expression)
    {
        if (expression.Body is MemberExpression memberExpr)
-        {
            // Simple property: p => p.Age
            return new[] { memberExpr.Member.Name };
-        }
-        else if (expression.Body is NewExpression newExpr)
+
+        if (expression.Body is NewExpression newExpr)
-        {
            // Compound key via anonymous type: p => new { p.City, p.Age }
            return newExpr.Arguments
                .OfType<MemberExpression>()
                .Select(m => m.Member.Name)
                .ToArray();
-        }
-        else if (expression.Body is UnaryExpression { NodeType: ExpressionType.Convert } unaryExpr)
+
+        if (expression.Body is UnaryExpression { NodeType: ExpressionType.Convert } unaryExpr)
        {
            // Handle Convert(Member) or Convert(New)
            if (unaryExpr.Operand is MemberExpression innerMember)
-            {
                // Wrapped property: p => (object)p.Age
                return new[] { innerMember.Member.Name };
-            }
-            else if (unaryExpr.Operand is NewExpression innerNew)
+
+            if (unaryExpr.Operand is NewExpression innerNew)
-            {
                // Wrapped anonymous type: p => (object)new { p.City, p.Age }
                return innerNew.Arguments
                    .OfType<MemberExpression>()
                    .Select(m => m.Member.Name)
                    .ToArray();
-            }
        }
-
        throw new ArgumentException(
            "Expression must be a property accessor (p => p.Property) or anonymous type (p => new { p.Prop1, p.Prop2 })",
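The branches of `ExtractPropertyPaths` shown in the hunk above can be exercised with a self-contained sketch. This re-implements the same pattern matching outside the repository; the `Person` type and the expected results are illustrative assumptions, not repository code:

```csharp
using System;
using System.Linq;
using System.Linq.Expressions;

static string[] ExtractPropertyPaths(LambdaExpression expression)
{
    if (expression.Body is MemberExpression memberExpr)
        // Simple property: p => p.Age
        return new[] { memberExpr.Member.Name };

    if (expression.Body is NewExpression newExpr)
        // Compound key via anonymous type: p => new { p.City, p.Age }
        return newExpr.Arguments.OfType<MemberExpression>()
            .Select(m => m.Member.Name).ToArray();

    if (expression.Body is UnaryExpression { NodeType: ExpressionType.Convert } unaryExpr)
    {
        // Handle Convert(Member) or Convert(New), e.g. boxing of a value-type key
        if (unaryExpr.Operand is MemberExpression innerMember)
            return new[] { innerMember.Member.Name };
        if (unaryExpr.Operand is NewExpression innerNew)
            return innerNew.Arguments.OfType<MemberExpression>()
                .Select(m => m.Member.Name).ToArray();
    }

    throw new ArgumentException("Expression must be a property accessor or anonymous type");
}

// int key arrives wrapped in a boxing Convert node; the anonymous type hits the New branch.
Expression<Func<Person, object>> simple = p => p.Age;
Expression<Func<Person, object>> compound = p => new { p.City, p.Age };
Console.WriteLine(string.Join(",", ExtractPropertyPaths(simple)));   // prints Age
Console.WriteLine(string.Join(",", ExtractPropertyPaths(compound))); // prints City,Age

class Person { public string City = ""; public int Age; }
```

The refactor in the hunk only flattens the `else if` chain into early returns; the extraction behavior is unchanged.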
@@ -1,11 +1,8 @@
 using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Core.Collections;
+using ZB.MOM.WW.CBDD.Core.Indexing.Internal;
 using ZB.MOM.WW.CBDD.Core.Storage;
 using ZB.MOM.WW.CBDD.Core.Transactions;
-using ZB.MOM.WW.CBDD.Core.Indexing.Internal;
-using System;
-using System.Linq;
-using System.Collections.Generic;

 namespace ZB.MOM.WW.CBDD.Core.Indexing;

@@ -18,28 +15,11 @@ namespace ZB.MOM.WW.CBDD.Core.Indexing;
 /// <typeparam name="T">Document type</typeparam>
 public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
 {
-    private readonly CollectionIndexDefinition<T> _definition;
-    private readonly BTreeIndex? _btreeIndex;
-    private readonly VectorSearchIndex? _vectorIndex;
-    private readonly RTreeIndex? _spatialIndex;
     private readonly IDocumentMapper<TId, T> _mapper;
+    private readonly RTreeIndex? _spatialIndex;
+    private readonly VectorSearchIndex? _vectorIndex;
     private bool _disposed;

-    /// <summary>
-    /// Gets the index definition
-    /// </summary>
-    public CollectionIndexDefinition<T> Definition => _definition;
-
-    /// <summary>
-    /// Gets the underlying BTree index (for advanced scenarios)
-    /// </summary>
-    public BTreeIndex? BTreeIndex => _btreeIndex;
-
-    /// <summary>
-    /// Gets the root page identifier for the underlying index structure.
-    /// </summary>
-    public uint RootPageId => _btreeIndex?.RootPageId ?? _vectorIndex?.RootPageId ?? _spatialIndex?.RootPageId ?? 0;
-
     /// <summary>
     /// Initializes a new instance of the <see cref="CollectionSecondaryIndex{TId, T}" /> class.
     /// </summary>
@@ -57,7 +37,8 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
     }

     /// <summary>
-    /// Initializes a new instance of the <see cref="CollectionSecondaryIndex{TId, T}"/> class from index storage abstractions.
+    /// Initializes a new instance of the <see cref="CollectionSecondaryIndex{TId, T}" /> class from index storage
+    /// abstractions.
     /// </summary>
     /// <param name="definition">The index definition.</param>
     /// <param name="storage">The index storage abstraction.</param>
@@ -69,7 +50,7 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
         IDocumentMapper<TId, T> mapper,
         uint rootPageId = 0)
     {
-        _definition = definition ?? throw new ArgumentNullException(nameof(definition));
+        Definition = definition ?? throw new ArgumentNullException(nameof(definition));
         _mapper = mapper ?? throw new ArgumentNullException(nameof(mapper));

         var indexOptions = definition.ToIndexOptions();
@@ -77,23 +58,53 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
         if (indexOptions.Type == IndexType.Vector)
         {
             _vectorIndex = new VectorSearchIndex(storage, indexOptions, rootPageId);
-            _btreeIndex = null;
+            BTreeIndex = null;
             _spatialIndex = null;
         }
         else if (indexOptions.Type == IndexType.Spatial)
         {
             _spatialIndex = new RTreeIndex(storage, indexOptions, rootPageId);
-            _btreeIndex = null;
+            BTreeIndex = null;
             _vectorIndex = null;
         }
         else
         {
-            _btreeIndex = new BTreeIndex(storage, indexOptions, rootPageId);
+            BTreeIndex = new BTreeIndex(storage, indexOptions, rootPageId);
             _vectorIndex = null;
             _spatialIndex = null;
         }
     }

+    /// <summary>
+    /// Gets the index definition
+    /// </summary>
+    public CollectionIndexDefinition<T> Definition { get; }
+
+    /// <summary>
+    /// Gets the underlying BTree index (for advanced scenarios)
+    /// </summary>
+    public BTreeIndex? BTreeIndex { get; }
+
+    /// <summary>
+    /// Gets the root page identifier for the underlying index structure.
+    /// </summary>
+    public uint RootPageId => BTreeIndex?.RootPageId ?? _vectorIndex?.RootPageId ?? _spatialIndex?.RootPageId ?? 0;
+
+    /// <summary>
+    /// Releases resources used by this index wrapper.
+    /// </summary>
+    public void Dispose()
+    {
+        if (_disposed)
+            return;
+
+        // BTreeIndex doesn't currently implement IDisposable
+        // Future: may need to flush buffers, close resources
+
+        _disposed = true;
+        GC.SuppressFinalize(this);
+    }
+
     /// <summary>
     /// Inserts a document into this index
     /// </summary>
@@ -106,7 +117,7 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
             throw new ArgumentNullException(nameof(document));

         // Extract key using compiled selector (fast!)
-        var keyValue = _definition.KeySelector(document);
+        object? keyValue = Definition.KeySelector(document);
         if (keyValue == null)
             return; // Skip null keys

@@ -114,32 +125,24 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
         {
             // Vector Index Support
             if (keyValue is float[] singleVector)
-            {
                 _vectorIndex.Insert(singleVector, location, transaction);
-            }
             else if (keyValue is IEnumerable<float[]> vectors)
-            {
-                foreach (var v in vectors)
-                {
+                foreach (float[] v in vectors)
                     _vectorIndex.Insert(v, location, transaction);
-                }
-            }
         }
         else if (_spatialIndex != null)
         {
             // Geospatial Index Support
             if (keyValue is ValueTuple<double, double> t)
-            {
                 _spatialIndex.Insert(GeoBox.FromPoint(new GeoPoint(t.Item1, t.Item2)), location, transaction);
-            }
         }
-        else if (_btreeIndex != null)
+        else if (BTreeIndex != null)
         {
             // BTree Index logic
             var userKey = ConvertToIndexKey(keyValue);
             var documentId = _mapper.GetId(document);
             var compositeKey = CreateCompositeKey(userKey, _mapper.ToIndexKey(documentId));
-            _btreeIndex.Insert(compositeKey, location, transaction?.TransactionId);
+            BTreeIndex.Insert(compositeKey, location, transaction?.TransactionId);
         }
     }

@@ -152,7 +155,8 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
    /// <param name="oldLocation">Physical location of old document</param>
    /// <param name="newLocation">Physical location of new document</param>
    /// <param name="transaction">Optional transaction</param>
-    public void Update(T oldDocument, T newDocument, DocumentLocation oldLocation, DocumentLocation newLocation, ITransaction transaction)
+    public void Update(T oldDocument, T newDocument, DocumentLocation oldLocation, DocumentLocation newLocation,
+        ITransaction transaction)
     {
         if (oldDocument == null)
             throw new ArgumentNullException(nameof(oldDocument));
@@ -160,8 +164,8 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
             throw new ArgumentNullException(nameof(newDocument));

         // Extract keys from both versions
-        var oldKey = _definition.KeySelector(oldDocument);
-        var newKey = _definition.KeySelector(newDocument);
+        object? oldKey = Definition.KeySelector(oldDocument);
+        object? newKey = Definition.KeySelector(newDocument);

         // If keys are the same, no index update needed (optimization)
         if (Equals(oldKey, newKey))
@@ -174,7 +178,7 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
         {
             var oldUserKey = ConvertToIndexKey(oldKey);
             var oldCompositeKey = CreateCompositeKey(oldUserKey, _mapper.ToIndexKey(documentId));
-            _btreeIndex?.Delete(oldCompositeKey, oldLocation, transaction?.TransactionId);
+            BTreeIndex?.Delete(oldCompositeKey, oldLocation, transaction?.TransactionId);
         }

         // Insert new entry if it has a key
@@ -182,7 +186,7 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
         {
             var newUserKey = ConvertToIndexKey(newKey);
             var newCompositeKey = CreateCompositeKey(newUserKey, _mapper.ToIndexKey(documentId));
-            _btreeIndex?.Insert(newCompositeKey, newLocation, transaction?.TransactionId);
+            BTreeIndex?.Insert(newCompositeKey, newLocation, transaction?.TransactionId);
         }
     }

@@ -198,7 +202,7 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
             throw new ArgumentNullException(nameof(document));

         // Extract key
-        var keyValue = _definition.KeySelector(document);
+        object? keyValue = Definition.KeySelector(document);
         if (keyValue == null)
             return; // Nothing to delete

@@ -207,7 +211,7 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class

         // Create composite key and delete
         var compositeKey = CreateCompositeKey(userKey, _mapper.ToIndexKey(documentId));
-        _btreeIndex?.Delete(compositeKey, location, transaction?.TransactionId);
+        BTreeIndex?.Delete(compositeKey, location, transaction?.TransactionId);
     }

     /// <summary>
@@ -222,17 +226,16 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
             return null;

         if (_vectorIndex != null && key is float[] query)
-        {
             return _vectorIndex.Search(query, 1, transaction: transaction).FirstOrDefault().Location;
-        }

-        if (_btreeIndex != null)
+        if (BTreeIndex != null)
         {
             var userKey = ConvertToIndexKey(key);
-            var minComposite = CreateCompositeKeyBoundary(userKey, useMinObjectId: true);
-            var maxComposite = CreateCompositeKeyBoundary(userKey, useMinObjectId: false);
-            var firstEntry = _btreeIndex.Range(minComposite, maxComposite, IndexDirection.Forward, transaction?.TransactionId).FirstOrDefault();
-            return firstEntry.Location.PageId == 0 ? null : (DocumentLocation?)firstEntry.Location;
+            var minComposite = CreateCompositeKeyBoundary(userKey, true);
+            var maxComposite = CreateCompositeKeyBoundary(userKey, false);
+            var firstEntry = BTreeIndex
+                .Range(minComposite, maxComposite, IndexDirection.Forward, transaction?.TransactionId).FirstOrDefault();
+            return firstEntry.Location.PageId == 0 ? null : firstEntry.Location;
         }

         return null;
@@ -246,7 +249,8 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
    /// <param name="efSearch">The search breadth parameter.</param>
    /// <param name="transaction">Optional transaction.</param>
    /// <returns>The matching vector search results.</returns>
-    public IEnumerable<VectorSearchResult> VectorSearch(float[] query, int k, int efSearch = 100, ITransaction? transaction = null)
+    public IEnumerable<VectorSearchResult> VectorSearch(float[] query, int k, int efSearch = 100,
+        ITransaction? transaction = null)
     {
         if (_vectorIndex == null)
             throw new InvalidOperationException("This index is not a vector index.");
@@ -260,16 +264,14 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : class
    /// <param name="center">The center point.</param>
    /// <param name="radiusKm">The search radius in kilometers.</param>
    /// <param name="transaction">Optional transaction.</param>
-    public IEnumerable<DocumentLocation> Near((double Latitude, double Longitude) center, double radiusKm, ITransaction? transaction = null)
+    public IEnumerable<DocumentLocation> Near((double Latitude, double Longitude) center, double radiusKm,
+        ITransaction? transaction = null)
     {
         if (_spatialIndex == null)
             throw new InvalidOperationException("This index is not a spatial index.");

         var queryBox = SpatialMath.BoundingBox(center.Latitude, center.Longitude, radiusKm);
|
var queryBox = SpatialMath.BoundingBox(center.Latitude, center.Longitude, radiusKm);
|
||||||
foreach (var loc in _spatialIndex.Search(queryBox, transaction))
|
foreach (var loc in _spatialIndex.Search(queryBox, transaction)) yield return loc;
|
||||||
{
|
|
||||||
yield return loc;
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -278,7 +280,8 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : cla
|
|||||||
/// <param name="min">The minimum latitude/longitude corner.</param>
|
/// <param name="min">The minimum latitude/longitude corner.</param>
|
||||||
/// <param name="max">The maximum latitude/longitude corner.</param>
|
/// <param name="max">The maximum latitude/longitude corner.</param>
|
||||||
/// <param name="transaction">Optional transaction.</param>
|
/// <param name="transaction">Optional transaction.</param>
|
||||||
public IEnumerable<DocumentLocation> Within((double Latitude, double Longitude) min, (double Latitude, double Longitude) max, ITransaction? transaction = null)
|
public IEnumerable<DocumentLocation> Within((double Latitude, double Longitude) min,
|
||||||
|
(double Latitude, double Longitude) max, ITransaction? transaction = null)
|
||||||
{
|
{
|
||||||
if (_spatialIndex == null)
|
if (_spatialIndex == null)
|
||||||
throw new InvalidOperationException("This index is not a spatial index.");
|
throw new InvalidOperationException("This index is not a spatial index.");
|
||||||
@@ -295,9 +298,10 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : cla
|
|||||||
/// <param name="direction">Scan direction.</param>
|
/// <param name="direction">Scan direction.</param>
|
||||||
/// <param name="transaction">Optional transaction to read uncommitted changes</param>
|
/// <param name="transaction">Optional transaction to read uncommitted changes</param>
|
||||||
/// <returns>Enumerable of document locations in key order</returns>
|
/// <returns>Enumerable of document locations in key order</returns>
|
||||||
public IEnumerable<DocumentLocation> Range(object? minKey, object? maxKey, IndexDirection direction = IndexDirection.Forward, ITransaction? transaction = null)
|
public IEnumerable<DocumentLocation> Range(object? minKey, object? maxKey,
|
||||||
|
IndexDirection direction = IndexDirection.Forward, ITransaction? transaction = null)
|
||||||
{
|
{
|
||||||
if (_btreeIndex == null) yield break;
|
if (BTreeIndex == null) yield break;
|
||||||
|
|
||||||
// Handle unbounded ranges
|
// Handle unbounded ranges
|
||||||
IndexKey actualMinKey;
|
IndexKey actualMinKey;
|
||||||
@@ -313,12 +317,12 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : cla
|
|||||||
{
|
{
|
||||||
actualMinKey = new IndexKey(new byte[0]);
|
actualMinKey = new IndexKey(new byte[0]);
|
||||||
var userMaxKey = ConvertToIndexKey(maxKey!);
|
var userMaxKey = ConvertToIndexKey(maxKey!);
|
||||||
actualMaxKey = CreateCompositeKeyBoundary(userMaxKey, useMinObjectId: false); // Max boundary
|
actualMaxKey = CreateCompositeKeyBoundary(userMaxKey, false); // Max boundary
|
||||||
}
|
}
|
||||||
else if (maxKey == null)
|
else if (maxKey == null)
|
||||||
{
|
{
|
||||||
var userMinKey = ConvertToIndexKey(minKey);
|
var userMinKey = ConvertToIndexKey(minKey);
|
||||||
actualMinKey = CreateCompositeKeyBoundary(userMinKey, useMinObjectId: true); // Min boundary
|
actualMinKey = CreateCompositeKeyBoundary(userMinKey, true); // Min boundary
|
||||||
actualMaxKey = new IndexKey(Enumerable.Repeat((byte)0xFF, 255).ToArray());
|
actualMaxKey = new IndexKey(Enumerable.Repeat((byte)0xFF, 255).ToArray());
|
||||||
}
|
}
|
||||||
else
|
else
|
||||||
@@ -330,17 +334,15 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : cla
|
|||||||
// Create composite boundaries:
|
// Create composite boundaries:
|
||||||
// Min: (userMinKey, ObjectId.Empty) - captures all docs with key >= userMinKey
|
// Min: (userMinKey, ObjectId.Empty) - captures all docs with key >= userMinKey
|
||||||
// Max: (userMaxKey, ObjectId.MaxValue) - captures all docs with key <= userMaxKey
|
// Max: (userMaxKey, ObjectId.MaxValue) - captures all docs with key <= userMaxKey
|
||||||
actualMinKey = CreateCompositeKeyBoundary(userMinKey, useMinObjectId: true);
|
actualMinKey = CreateCompositeKeyBoundary(userMinKey, true);
|
||||||
actualMaxKey = CreateCompositeKeyBoundary(userMaxKey, useMinObjectId: false);
|
actualMaxKey = CreateCompositeKeyBoundary(userMaxKey, false);
|
||||||
}
|
}
|
||||||
|
|
||||||
// Use BTreeIndex.Range with WAL-aware reads and direction
|
// Use BTreeIndex.Range with WAL-aware reads and direction
|
||||||
// Extract DocumentLocation from each entry
|
// Extract DocumentLocation from each entry
|
||||||
foreach (var entry in _btreeIndex.Range(actualMinKey, actualMaxKey, direction, transaction?.TransactionId))
|
foreach (var entry in BTreeIndex.Range(actualMinKey, actualMaxKey, direction, transaction?.TransactionId))
|
||||||
{
|
|
||||||
yield return entry.Location;
|
yield return entry.Location;
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Gets statistics about this index
|
/// Gets statistics about this index
|
||||||
@@ -349,16 +351,38 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : cla
|
|||||||
{
|
{
|
||||||
return new CollectionIndexInfo
|
return new CollectionIndexInfo
|
||||||
{
|
{
|
||||||
Name = _definition.Name,
|
Name = Definition.Name,
|
||||||
PropertyPaths = _definition.PropertyPaths,
|
PropertyPaths = Definition.PropertyPaths,
|
||||||
IsUnique = _definition.IsUnique,
|
IsUnique = Definition.IsUnique,
|
||||||
Type = _definition.Type,
|
Type = Definition.Type,
|
||||||
IsPrimary = _definition.IsPrimary,
|
IsPrimary = Definition.IsPrimary,
|
||||||
EstimatedDocumentCount = 0, // TODO: Track or calculate document count
|
EstimatedDocumentCount = 0, // TODO: Track or calculate document count
|
||||||
EstimatedSizeBytes = 0 // TODO: Calculate index size
|
EstimatedSizeBytes = 0 // TODO: Calculate index size
|
||||||
};
|
};
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Converts a CLR value to an IndexKey for BTree storage.
|
||||||
|
/// Supports all common .NET types.
|
||||||
|
/// </summary>
|
||||||
|
private IndexKey ConvertToIndexKey(object value)
|
||||||
|
{
|
||||||
|
return value switch
|
||||||
|
{
|
||||||
|
ObjectId objectId => new IndexKey(objectId),
|
||||||
|
string str => new IndexKey(str),
|
||||||
|
int intVal => new IndexKey(intVal),
|
||||||
|
long longVal => new IndexKey(longVal),
|
||||||
|
DateTime dateTime => new IndexKey(dateTime.Ticks),
|
||||||
|
bool boolVal => new IndexKey(boolVal ? 1 : 0),
|
||||||
|
byte[] byteArray => new IndexKey(byteArray),
|
||||||
|
|
||||||
|
// For compound keys or complex types, use ToString and serialize
|
||||||
|
// TODO: Better compound key serialization
|
||||||
|
_ => new IndexKey(value.ToString() ?? string.Empty)
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
#region Composite Key Support (SQLite-style for Duplicate Keys)
|
#region Composite Key Support (SQLite-style for Duplicate Keys)
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -388,7 +412,7 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : cla
|
|||||||
{
|
{
|
||||||
// For range boundaries, we use an empty key for Min and a very large key for Max
|
// For range boundaries, we use an empty key for Min and a very large key for Max
|
||||||
// to wrap around all possible IDs for this user key.
|
// to wrap around all possible IDs for this user key.
|
||||||
IndexKey idBoundary = useMinObjectId
|
var idBoundary = useMinObjectId
|
||||||
? new IndexKey(Array.Empty<byte>())
|
? new IndexKey(Array.Empty<byte>())
|
||||||
: new IndexKey(Enumerable.Repeat((byte)0xFF, 16).ToArray()); // Using 16 as a safe max for GUID/ObjectId
|
: new IndexKey(Enumerable.Repeat((byte)0xFF, 16).ToArray()); // Using 16 as a safe max for GUID/ObjectId
|
||||||
|
|
||||||
@@ -402,7 +426,7 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : cla
|
|||||||
private IndexKey ExtractUserKey(IndexKey compositeKey)
|
private IndexKey ExtractUserKey(IndexKey compositeKey)
|
||||||
{
|
{
|
||||||
// Composite key = UserKey + ObjectId(12 bytes)
|
// Composite key = UserKey + ObjectId(12 bytes)
|
||||||
var userKeyLength = compositeKey.Data.Length - 12;
|
int userKeyLength = compositeKey.Data.Length - 12;
|
||||||
if (userKeyLength <= 0)
|
if (userKeyLength <= 0)
|
||||||
return compositeKey; // Fallback for malformed keys
|
return compositeKey; // Fallback for malformed keys
|
||||||
|
|
||||||
@@ -411,41 +435,4 @@ public sealed class CollectionSecondaryIndex<TId, T> : IDisposable where T : cla
|
|||||||
}
|
}
|
||||||
|
|
||||||
#endregion
|
#endregion
|
||||||
|
|
||||||
/// <summary>
|
|
||||||
/// Converts a CLR value to an IndexKey for BTree storage.
|
|
||||||
/// Supports all common .NET types.
|
|
||||||
/// </summary>
|
|
||||||
private IndexKey ConvertToIndexKey(object value)
|
|
||||||
{
|
|
||||||
return value switch
|
|
||||||
{
|
|
||||||
ObjectId objectId => new IndexKey(objectId),
|
|
||||||
string str => new IndexKey(str),
|
|
||||||
int intVal => new IndexKey(intVal),
|
|
||||||
long longVal => new IndexKey(longVal),
|
|
||||||
DateTime dateTime => new IndexKey(dateTime.Ticks),
|
|
||||||
bool boolVal => new IndexKey(boolVal ? 1 : 0),
|
|
||||||
byte[] byteArray => new IndexKey(byteArray),
|
|
||||||
|
|
||||||
// For compound keys or complex types, use ToString and serialize
|
|
||||||
// TODO: Better compound key serialization
|
|
||||||
_ => new IndexKey(value.ToString() ?? string.Empty)
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
/// <summary>
|
|
||||||
/// Releases resources used by this index wrapper.
|
|
||||||
/// </summary>
|
|
||||||
public void Dispose()
|
|
||||||
{
|
|
||||||
if (_disposed)
|
|
||||||
return;
|
|
||||||
|
|
||||||
// BTreeIndex doesn't currently implement IDisposable
|
|
||||||
// Future: may need to flush buffers, close resources
|
|
||||||
|
|
||||||
_disposed = true;
|
|
||||||
GC.SuppressFinalize(this);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
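The `Composite Key Support (SQLite-style for Duplicate Keys)` region touched above documents its layout only in comments (`Composite key = UserKey + ObjectId(12 bytes)`; an empty key for the min boundary, a run of `0xFF` for the max). A standalone sketch of that idea follows; the helper name `Compose` is hypothetical, since the diff does not show the body of `CreateCompositeKey`:

```csharp
// Sketch only: illustrates the SQLite-style composite-key trick under the
// 12-byte-id assumption stated in the ExtractUserKey comment.
static byte[] Compose(byte[] userKey, byte[] objectId12)
{
    var composite = new byte[userKey.Length + objectId12.Length];
    Buffer.BlockCopy(userKey, 0, composite, 0, userKey.Length);
    Buffer.BlockCopy(objectId12, 0, composite, userKey.Length, objectId12.Length);
    return composite; // unique even when many documents share the same user key
}

// Range boundaries wrap every document id stored under one user key:
//   min = userKey + (empty suffix)    -> sorts before any real composite key
//   max = userKey + 16 bytes of 0xFF  -> sorts after any real 12-byte id
// matching the comment inside CreateCompositeKeyBoundary in the hunk above.
```

Because the appended id participates in ordering only after the user key, a range scan over `[min, max]` still visits entries in user-key order, which is what `FindFirst` and `Range` rely on.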
@@ -10,7 +10,8 @@ public static class GeoSpatialExtensions
     /// <param name="center">The center point (Latitude, Longitude) for the proximity search.</param>
     /// <param name="radiusKm">The radius in kilometers.</param>
     /// <returns>True if the point is within the specified radius.</returns>
-    public static bool Near(this (double Latitude, double Longitude) point, (double Latitude, double Longitude) center, double radiusKm)
+    public static bool Near(this (double Latitude, double Longitude) point, (double Latitude, double Longitude) center,
+        double radiusKm)
     {
         return true;
     }
@@ -23,7 +24,8 @@ public static class GeoSpatialExtensions
     /// <param name="min">The minimum (Latitude, Longitude) of the bounding box.</param>
     /// <param name="max">The maximum (Latitude, Longitude) of the bounding box.</param>
     /// <returns>True if the point is within the specified bounding box.</returns>
-    public static bool Within(this (double Latitude, double Longitude) point, (double Latitude, double Longitude) min, (double Latitude, double Longitude) max)
+    public static bool Within(this (double Latitude, double Longitude) point, (double Latitude, double Longitude) min,
+        (double Latitude, double Longitude) max)
     {
         return true;
     }
@@ -1,7 +1,4 @@
-using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Core.Storage;
-using System;
-using System.Collections.Generic;

 namespace ZB.MOM.WW.CBDD.Core.Indexing;

@@ -32,9 +29,9 @@ public sealed class HashIndex
     public void Insert(IndexKey key, DocumentLocation location)
     {
         if (_options.Unique && TryFind(key, out _))
-            throw new InvalidOperationException($"Duplicate key violation for unique index");
+            throw new InvalidOperationException("Duplicate key violation for unique index");

-        var hashCode = key.GetHashCode();
+        int hashCode = key.GetHashCode();

         if (!_buckets.TryGetValue(hashCode, out var bucket))
         {
@@ -54,19 +51,17 @@ public sealed class HashIndex
     public bool TryFind(IndexKey key, out DocumentLocation location)
     {
         location = default;
-        var hashCode = key.GetHashCode();
+        int hashCode = key.GetHashCode();

         if (!_buckets.TryGetValue(hashCode, out var bucket))
             return false;

         foreach (var entry in bucket)
-        {
             if (entry.Key == key)
             {
                 location = entry.Location;
                 return true;
             }
-        }

         return false;
     }
@@ -79,13 +74,12 @@ public sealed class HashIndex
     /// <returns><see langword="true" /> if an entry is removed; otherwise, <see langword="false" />.</returns>
     public bool Remove(IndexKey key, DocumentLocation location)
     {
-        var hashCode = key.GetHashCode();
+        int hashCode = key.GetHashCode();

         if (!_buckets.TryGetValue(hashCode, out var bucket))
             return false;

-        for (int i = 0; i < bucket.Count; i++)
-        {
+        for (var i = 0; i < bucket.Count; i++)
             if (bucket[i].Key == key &&
                 bucket[i].Location.PageId == location.PageId &&
                 bucket[i].Location.SlotIndex == location.SlotIndex)
@@ -97,7 +91,6 @@ public sealed class HashIndex

                 return true;
             }
-        }

         return false;
     }
@@ -109,15 +102,13 @@ public sealed class HashIndex
     /// <returns>All matching index entries.</returns>
     public IEnumerable<IndexEntry> FindAll(IndexKey key)
     {
-        var hashCode = key.GetHashCode();
+        int hashCode = key.GetHashCode();

         if (!_buckets.TryGetValue(hashCode, out var bucket))
             yield break;

         foreach (var entry in bucket)
-        {
             if (entry.Key == key)
                 yield return entry;
-        }
     }
 }
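A minimal caller for the `HashIndex` surface shown in these hunks (`Insert`, `TryFind`, `FindAll`, `Remove`). The constructor shape and the `DocumentLocation` values are assumptions; neither appears in the diff:

```csharp
// Hypothetical usage sketch; the HashIndex constructor is not shown in the diff.
var index = new HashIndex(IndexOptions.CreateHash("email"));
var key = new IndexKey("user@example.com");

index.Insert(key, someLocation); // someLocation: a DocumentLocation obtained elsewhere

if (index.TryFind(key, out var loc))
{
    // Buckets are keyed by GetHashCode(), so colliding keys share a bucket;
    // the linear scan then compares full keys with ==, resolving collisions.
}

// A non-unique hash index can hold several locations under one key:
foreach (var entry in index.FindAll(key))
{
    // entry.Location points at one matching document
}
```

Note that `Remove` matches on both key and location (`PageId` and `SlotIndex`), so deleting one duplicate leaves the others in place.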
@@ -1,6 +1,3 @@
-using ZB.MOM.WW.CBDD.Core.Indexing;
-using System;
-
 namespace ZB.MOM.WW.CBDD.Core.Indexing;

 /// <summary>
@@ -1,6 +1,5 @@
+using System.Text;
 using ZB.MOM.WW.CBDD.Bson;
-using System;
-using System.Linq;

 namespace ZB.MOM.WW.CBDD.Core.Indexing;

@@ -17,12 +16,12 @@ public struct IndexKey : IEquatable<IndexKey>, IComparable<IndexKey>
     /// <summary>
     /// Gets the minimum possible index key.
     /// </summary>
-    public static IndexKey MinKey => new IndexKey(Array.Empty<byte>());
+    public static IndexKey MinKey => new(Array.Empty<byte>());

     /// <summary>
     /// Gets the maximum possible index key.
     /// </summary>
-    public static IndexKey MaxKey => new IndexKey(Enumerable.Repeat((byte)0xFF, 32).ToArray());
+    public static IndexKey MaxKey => new(Enumerable.Repeat((byte)0xFF, 32).ToArray());

     /// <summary>
     /// Initializes a new instance of the <see cref="IndexKey" /> struct from raw key bytes.
@@ -71,7 +70,7 @@ public struct IndexKey : IEquatable<IndexKey>, IComparable<IndexKey>
     /// <param name="value">The string value.</param>
     public IndexKey(string value)
     {
-        _data = System.Text.Encoding.UTF8.GetBytes(value);
+        _data = Encoding.UTF8.GetBytes(value);
         _hashCode = ComputeHashCode(_data);
     }

@@ -95,18 +94,19 @@ public struct IndexKey : IEquatable<IndexKey>, IComparable<IndexKey>
     /// </summary>
     /// <param name="other">The key to compare with.</param>
     /// <returns>
-    /// A value less than zero if this key is less than <paramref name="other"/>, zero if equal, or greater than zero if greater.
+    /// A value less than zero if this key is less than <paramref name="other" />, zero if equal, or greater than zero if
+    /// greater.
     /// </returns>
     public readonly int CompareTo(IndexKey other)
     {
         if (_data == null) return other._data == null ? 0 : -1;
         if (other._data == null) return 1;

-        var minLength = Math.Min(_data.Length, other._data.Length);
+        int minLength = Math.Min(_data.Length, other._data.Length);

-        for (int i = 0; i < minLength; i++)
+        for (var i = 0; i < minLength; i++)
         {
-            var cmp = _data[i].CompareTo(other._data[i]);
+            int cmp = _data[i].CompareTo(other._data[i]);
             if (cmp != 0)
                 return cmp;
         }
@@ -131,17 +131,46 @@ public struct IndexKey : IEquatable<IndexKey>, IComparable<IndexKey>
     }

     /// <inheritdoc />
-    public override readonly bool Equals(object? obj) => obj is IndexKey other && Equals(other);
+    public readonly override bool Equals(object? obj)
+    {
+        return obj is IndexKey other && Equals(other);
+    }

     /// <inheritdoc />
-    public override readonly int GetHashCode() => _hashCode;
+    public readonly override int GetHashCode()
+    {
+        return _hashCode;
+    }

-    public static bool operator ==(IndexKey left, IndexKey right) => left.Equals(right);
-    public static bool operator !=(IndexKey left, IndexKey right) => !left.Equals(right);
-    public static bool operator <(IndexKey left, IndexKey right) => left.CompareTo(right) < 0;
-    public static bool operator >(IndexKey left, IndexKey right) => left.CompareTo(right) > 0;
-    public static bool operator <=(IndexKey left, IndexKey right) => left.CompareTo(right) <= 0;
-    public static bool operator >=(IndexKey left, IndexKey right) => left.CompareTo(right) >= 0;
+    public static bool operator ==(IndexKey left, IndexKey right)
+    {
+        return left.Equals(right);
+    }
+
+    public static bool operator !=(IndexKey left, IndexKey right)
+    {
+        return !left.Equals(right);
+    }
+
+    public static bool operator <(IndexKey left, IndexKey right)
+    {
+        return left.CompareTo(right) < 0;
+    }
+
+    public static bool operator >(IndexKey left, IndexKey right)
+    {
+        return left.CompareTo(right) > 0;
+    }
+
+    public static bool operator <=(IndexKey left, IndexKey right)
+    {
+        return left.CompareTo(right) <= 0;
+    }
+
+    public static bool operator >=(IndexKey left, IndexKey right)
+    {
+        return left.CompareTo(right) >= 0;
+    }

     private static int ComputeHashCode(ReadOnlySpan<byte> data)
     {
@@ -167,7 +196,8 @@ public struct IndexKey : IEquatable<IndexKey>, IComparable<IndexKey>
         if (typeof(T) == typeof(Guid)) return new IndexKey((Guid)(object)value);
         if (typeof(T) == typeof(byte[])) return new IndexKey((byte[])(object)value);

-        throw new NotSupportedException($"Type {typeof(T).Name} is not supported as an IndexKey. Provide a custom mapping.");
+        throw new NotSupportedException(
+            $"Type {typeof(T).Name} is not supported as an IndexKey. Provide a custom mapping.");
     }

     /// <summary>
@@ -182,10 +212,11 @@ public struct IndexKey : IEquatable<IndexKey>, IComparable<IndexKey>
         if (typeof(T) == typeof(ObjectId)) return (T)(object)new ObjectId(_data);
         if (typeof(T) == typeof(int)) return (T)(object)BitConverter.ToInt32(_data);
         if (typeof(T) == typeof(long)) return (T)(object)BitConverter.ToInt64(_data);
-        if (typeof(T) == typeof(string)) return (T)(object)System.Text.Encoding.UTF8.GetString(_data);
+        if (typeof(T) == typeof(string)) return (T)(object)Encoding.UTF8.GetString(_data);
         if (typeof(T) == typeof(Guid)) return (T)(object)new Guid(_data);
         if (typeof(T) == typeof(byte[])) return (T)(object)_data;

-        throw new NotSupportedException($"Type {typeof(T).Name} cannot be extracted from IndexKey. Provide a custom mapping.");
+        throw new NotSupportedException(
+            $"Type {typeof(T).Name} cannot be extracted from IndexKey. Provide a custom mapping.");
     }
 }
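The expression-bodied operators rewritten in the `IndexKey` hunks keep their semantics: ordering stays byte-wise lexicographic via `CompareTo`, and `==` delegates to `Equals`. A small illustration, assuming `Equals(IndexKey)` compares key bytes consistently with `CompareTo` (its body is not shown in the diff):

```csharp
var a = new IndexKey(new byte[] { 0x01, 0x02 });
var b = new IndexKey(new byte[] { 0x01, 0x03 });

bool less = a < b;  // true: the first differing byte (0x02 vs 0x03) decides the order
bool same = a == new IndexKey(new byte[] { 0x01, 0x02 }); // identical bytes compare equal
```

This byte-wise ordering is what makes the composite-key boundaries elsewhere in the commit work: an empty suffix sorts before any real id bytes, and a run of `0xFF` sorts after them.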
@@ -83,36 +83,45 @@ public readonly struct IndexOptions
     /// </summary>
     /// <param name="fields">The indexed field names.</param>
     /// <returns>The configured index options.</returns>
-    public static IndexOptions CreateBTree(params string[] fields) => new()
+    public static IndexOptions CreateBTree(params string[] fields)
+    {
+        return new IndexOptions
         {
             Type = IndexType.BTree,
             Unique = false,
             Fields = fields
         };
+    }

     /// <summary>
     /// Creates unique B+Tree index options.
     /// </summary>
     /// <param name="fields">The indexed field names.</param>
     /// <returns>The configured index options.</returns>
-    public static IndexOptions CreateUnique(params string[] fields) => new()
+    public static IndexOptions CreateUnique(params string[] fields)
+    {
+        return new IndexOptions
         {
             Type = IndexType.BTree,
             Unique = true,
             Fields = fields
         };
+    }

     /// <summary>
     /// Creates hash index options.
     /// </summary>
     /// <param name="fields">The indexed field names.</param>
     /// <returns>The configured index options.</returns>
-    public static IndexOptions CreateHash(params string[] fields) => new()
+    public static IndexOptions CreateHash(params string[] fields)
+    {
+        return new IndexOptions
         {
             Type = IndexType.Hash,
             Unique = false,
             Fields = fields
         };
+    }

     /// <summary>
     /// Creates vector index options.
@@ -123,7 +132,10 @@ public readonly struct IndexOptions
     /// <param name="ef">The candidate list size used during index construction.</param>
     /// <param name="fields">The indexed field names.</param>
     /// <returns>The configured index options.</returns>
-    public static IndexOptions CreateVector(int dimensions, VectorMetric metric = VectorMetric.Cosine, int m = 16, int ef = 200, params string[] fields) => new()
+    public static IndexOptions CreateVector(int dimensions, VectorMetric metric = VectorMetric.Cosine, int m = 16,
+        int ef = 200, params string[] fields)
+    {
+        return new IndexOptions
         {
             Type = IndexType.Vector,
             Unique = false,
@@ -133,16 +145,20 @@ public readonly struct IndexOptions
             M = m,
             EfConstruction = ef
         };
+    }

     /// <summary>
     /// Creates spatial index options.
     /// </summary>
     /// <param name="fields">The indexed field names.</param>
     /// <returns>The configured index options.</returns>
-    public static IndexOptions CreateSpatial(params string[] fields) => new()
+    public static IndexOptions CreateSpatial(params string[] fields)
+    {
+        return new IndexOptions
         {
             Type = IndexType.Spatial,
             Unique = false,
             Fields = fields
         };
     }
+}
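The `IndexOptions` factories above were converted from expression bodies to block bodies; the values they return are unchanged. Usage stays the same, for example:

```csharp
var unique = IndexOptions.CreateUnique("email");
// unique.Type == IndexType.BTree, unique.Unique == true, unique.Fields == ["email"]

var vector = IndexOptions.CreateVector(dimensions: 768, metric: VectorMetric.Cosine,
    m: 16, ef: 200, "embedding");
// M and EfConstruction carry the <param> meanings documented above:
// m is the graph parameter, ef the candidate list size used during construction.
```

The `dimensions: 768` value is only an illustrative choice; the factories accept any dimension.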
@@ -23,6 +23,11 @@ internal record struct GeoBox(double MinLat, double MinLon, double MaxLat, doubl
     /// </summary>
     public static GeoBox Empty => new(double.MaxValue, double.MaxValue, double.MinValue, double.MinValue);

+    /// <summary>
+    /// Gets the area of this bounding box.
+    /// </summary>
+    public double Area => Math.Max(0, MaxLat - MinLat) * Math.Max(0, MaxLon - MinLon);
+
     /// <summary>
     /// Determines whether this box contains the specified point.
     /// </summary>
@@ -82,9 +87,4 @@ internal record struct GeoBox(double MinLat, double MinLon, double MaxLat, doubl
             Math.Max(MaxLat, other.MaxLat),
             Math.Max(MaxLon, other.MaxLon));
     }
-
-    /// <summary>
-    /// Gets the area of this bounding box.
-    /// </summary>
-    public double Area => Math.Max(0, MaxLat - MinLat) * Math.Max(0, MaxLon - MinLon);
 }
@@ -1,5 +1,3 @@
-using ZB.MOM.WW.CBDD.Core.Indexing;
-
 namespace ZB.MOM.WW.CBDD.Core.Indexing;

 public struct InternalEntry
@@ -1,8 +1,7 @@
+using System.Buffers;
+using ZB.MOM.WW.CBDD.Core.Indexing.Internal;
 using ZB.MOM.WW.CBDD.Core.Storage;
 using ZB.MOM.WW.CBDD.Core.Transactions;
-using ZB.MOM.WW.CBDD.Core.Collections;
-using ZB.MOM.WW.CBDD.Core.Indexing.Internal;
-using System.Buffers;
 
 namespace ZB.MOM.WW.CBDD.Core.Indexing;
 
@@ -12,11 +11,10 @@ namespace ZB.MOM.WW.CBDD.Core.Indexing;
 /// </summary>
 internal class RTreeIndex : IDisposable
 {
-    private readonly IIndexStorage _storage;
-    private readonly IndexOptions _options;
-    private uint _rootPageId;
     private readonly object _lock = new();
+    private readonly IndexOptions _options;
     private readonly int _pageSize;
+    private readonly IIndexStorage _storage;
 
     /// <summary>
     /// Initializes a new instance of the <see cref="RTreeIndex" /> class.
@@ -28,30 +26,37 @@ internal class RTreeIndex : IDisposable
     {
         _storage = storage ?? throw new ArgumentNullException(nameof(storage));
         _options = options;
-        _rootPageId = rootPageId;
+        RootPageId = rootPageId;
         _pageSize = _storage.PageSize;
 
-        if (_rootPageId == 0)
-        {
-            InitializeNewIndex();
-        }
+        if (RootPageId == 0) InitializeNewIndex();
     }
 
     /// <summary>
     /// Gets the current root page identifier.
     /// </summary>
-    public uint RootPageId => _rootPageId;
+    public uint RootPageId { get; private set; }
+
+    /// <summary>
+    /// Releases resources used by the index.
+    /// </summary>
+    public void Dispose()
+    {
+    }
 
     private void InitializeNewIndex()
     {
-        var buffer = RentPageBuffer();
+        byte[] buffer = RentPageBuffer();
         try
        {
-            _rootPageId = _storage.AllocatePage();
-            SpatialPage.Initialize(buffer, _rootPageId, true, 0);
-            _storage.WritePageImmediate(_rootPageId, buffer);
+            RootPageId = _storage.AllocatePage();
+            SpatialPage.Initialize(buffer, RootPageId, true, 0);
+            _storage.WritePageImmediate(RootPageId, buffer);
         }
-        finally { ReturnPageBuffer(buffer); }
+        finally
+        {
+            ReturnPageBuffer(buffer);
+        }
     }
 
     /// <summary>
@@ -62,12 +67,12 @@ internal class RTreeIndex : IDisposable
     /// <returns>A sequence of matching document locations.</returns>
     public IEnumerable<DocumentLocation> Search(GeoBox area, ITransaction? transaction = null)
     {
-        if (_rootPageId == 0) yield break;
+        if (RootPageId == 0) yield break;
 
         var stack = new Stack<uint>();
-        stack.Push(_rootPageId);
+        stack.Push(RootPageId);
 
-        var buffer = RentPageBuffer();
+        byte[] buffer = RentPageBuffer();
         try
         {
             while (stack.Count > 0)
@@ -78,25 +83,24 @@ internal class RTreeIndex : IDisposable
                 bool isLeaf = SpatialPage.GetIsLeaf(buffer);
                 ushort count = SpatialPage.GetEntryCount(buffer);
 
-                for (int i = 0; i < count; i++)
+                for (var i = 0; i < count; i++)
                 {
                     SpatialPage.ReadEntry(buffer, i, out var mbr, out var pointer);
 
                     if (area.Intersects(mbr))
                     {
                         if (isLeaf)
-                        {
                             yield return pointer;
-                        }
                         else
-                        {
                             stack.Push(pointer.PageId);
-                        }
                     }
                 }
             }
         }
-        finally { ReturnPageBuffer(buffer); }
+        finally
+        {
+            ReturnPageBuffer(buffer);
+        }
     }
 
     /// <summary>
@@ -109,7 +113,7 @@ internal class RTreeIndex : IDisposable
     {
         lock (_lock)
         {
-            var leafPageId = ChooseLeaf(_rootPageId, mbr, transaction);
+            uint leafPageId = ChooseLeaf(RootPageId, mbr, transaction);
             InsertIntoNode(leafPageId, mbr, loc, transaction);
         }
     }
@@ -117,7 +121,7 @@ internal class RTreeIndex : IDisposable
     private uint ChooseLeaf(uint rootId, GeoBox mbr, ITransaction? transaction)
     {
         uint currentId = rootId;
-        var buffer = RentPageBuffer();
+        byte[] buffer = RentPageBuffer();
         try
         {
             while (true)
@@ -127,10 +131,10 @@ internal class RTreeIndex : IDisposable
 
                 ushort count = SpatialPage.GetEntryCount(buffer);
                 uint bestChild = 0;
-                double minEnlargement = double.MaxValue;
-                double minArea = double.MaxValue;
+                var minEnlargement = double.MaxValue;
+                var minArea = double.MaxValue;
 
-                for (int i = 0; i < count; i++)
+                for (var i = 0; i < count; i++)
                 {
                     SpatialPage.ReadEntry(buffer, i, out var childMbr, out var pointer);
 
@@ -156,12 +160,15 @@ internal class RTreeIndex : IDisposable
                 currentId = bestChild;
             }
         }
-        finally { ReturnPageBuffer(buffer); }
+        finally
+        {
+            ReturnPageBuffer(buffer);
+        }
     }
 
     private void InsertIntoNode(uint pageId, GeoBox mbr, DocumentLocation pointer, ITransaction? transaction)
     {
-        var buffer = RentPageBuffer();
+        byte[] buffer = RentPageBuffer();
         try
         {
             _storage.ReadPage(pageId, transaction?.TransactionId, buffer);
@@ -186,17 +193,20 @@ internal class RTreeIndex : IDisposable
                 SplitNode(pageId, mbr, pointer, transaction);
             }
         }
-        finally { ReturnPageBuffer(buffer); }
+        finally
+        {
+            ReturnPageBuffer(buffer);
+        }
     }
 
     private void UpdateMBRUpwards(uint pageId, ITransaction? transaction)
     {
-        var buffer = RentPageBuffer();
-        var parentBuffer = RentPageBuffer();
+        byte[] buffer = RentPageBuffer();
+        byte[] parentBuffer = RentPageBuffer();
         try
         {
             uint currentId = pageId;
-            while (currentId != _rootPageId)
+            while (currentId != RootPageId)
             {
                 _storage.ReadPage(currentId, transaction?.TransactionId, buffer);
                 var currentMbr = SpatialPage.CalculateMBR(buffer);
@@ -206,9 +216,9 @@ internal class RTreeIndex : IDisposable
 
                 _storage.ReadPage(parentId, transaction?.TransactionId, parentBuffer);
                 ushort count = SpatialPage.GetEntryCount(parentBuffer);
-                bool changed = false;
+                var changed = false;
 
-                for (int i = 0; i < count; i++)
+                for (var i = 0; i < count; i++)
                 {
                     SpatialPage.ReadEntry(parentBuffer, i, out var mbr, out var pointer);
                     if (pointer.PageId == currentId)
@@ -218,6 +228,7 @@ internal class RTreeIndex : IDisposable
                             SpatialPage.WriteEntry(parentBuffer, i, currentMbr, pointer);
                             changed = true;
                         }
+
                         break;
                     }
                 }
@@ -241,8 +252,8 @@ internal class RTreeIndex : IDisposable
 
     private void SplitNode(uint pageId, GeoBox newMbr, DocumentLocation newPointer, ITransaction? transaction)
     {
-        var buffer = RentPageBuffer();
-        var newBuffer = RentPageBuffer();
+        byte[] buffer = RentPageBuffer();
+        byte[] newBuffer = RentPageBuffer();
         try
         {
             _storage.ReadPage(pageId, transaction?.TransactionId, buffer);
@@ -253,11 +264,12 @@ internal class RTreeIndex : IDisposable
 
             // Collect all entries including the new one
             var entries = new List<(GeoBox Mbr, DocumentLocation Pointer)>();
-            for (int i = 0; i < count; i++)
+            for (var i = 0; i < count; i++)
             {
                 SpatialPage.ReadEntry(buffer, i, out var m, out var p);
                 entries.Add((m, p));
             }
+
             entries.Add((newMbr, newPointer));
 
             // Pick Seeds
@@ -277,8 +289,8 @@ internal class RTreeIndex : IDisposable
             SpatialPage.WriteEntry(newBuffer, 0, seed2.Mbr, seed2.Pointer);
             SpatialPage.SetEntryCount(newBuffer, 1);
 
-            GeoBox mbr1 = seed1.Mbr;
-            GeoBox mbr2 = seed2.Mbr;
+            var mbr1 = seed1.Mbr;
+            var mbr2 = seed2.Mbr;
 
             // Distribute remaining entries
             while (entries.Count > 0)
@@ -320,7 +332,7 @@ internal class RTreeIndex : IDisposable
             }
 
             // Propagate split upwards
-            if (pageId == _rootPageId)
+            if (pageId == RootPageId)
             {
                 // New Root
                 uint newRootId = _storage.AllocatePage();
@@ -334,7 +346,7 @@ internal class RTreeIndex : IDisposable
                 else
                     _storage.WritePageImmediate(newRootId, buffer);
 
-                _rootPageId = newRootId;
+                RootPageId = newRootId;
 
                 // Update parent pointers
                 UpdateParentPointer(pageId, newRootId, transaction);
@@ -356,7 +368,7 @@ internal class RTreeIndex : IDisposable
 
     private void UpdateParentPointer(uint pageId, uint parentId, ITransaction? transaction)
     {
-        var buffer = RentPageBuffer();
+        byte[] buffer = RentPageBuffer();
         try
         {
             _storage.ReadPage(pageId, transaction?.TransactionId, buffer);
@@ -366,17 +378,20 @@ internal class RTreeIndex : IDisposable
             else
                 _storage.WritePageImmediate(pageId, buffer);
         }
-        finally { ReturnPageBuffer(buffer); }
+        finally
+        {
+            ReturnPageBuffer(buffer);
+        }
     }
 
-    private void PickSeeds(List<(GeoBox Mbr, DocumentLocation Pointer)> entries, out (GeoBox Mbr, DocumentLocation Pointer) s1, out (GeoBox Mbr, DocumentLocation Pointer) s2)
+    private void PickSeeds(List<(GeoBox Mbr, DocumentLocation Pointer)> entries,
+        out (GeoBox Mbr, DocumentLocation Pointer) s1, out (GeoBox Mbr, DocumentLocation Pointer) s2)
     {
-        double maxWaste = double.MinValue;
+        var maxWaste = double.MinValue;
         s1 = entries[0];
         s2 = entries[1];
 
-        for (int i = 0; i < entries.Count; i++)
-        {
+        for (var i = 0; i < entries.Count; i++)
         for (int j = i + 1; j < entries.Count; j++)
         {
             var combined = entries[i].Mbr.ExpandTo(entries[j].Mbr);
@@ -389,7 +404,6 @@ internal class RTreeIndex : IDisposable
             }
         }
     }
-    }
 
     private byte[] RentPageBuffer()
     {
@@ -400,11 +414,4 @@ internal class RTreeIndex : IDisposable
     {
         ArrayPool<byte>.Shared.Return(buffer);
     }
-
-    /// <summary>
-    /// Releases resources used by the index.
-    /// </summary>
-    public void Dispose()
-    {
-    }
 }
@@ -13,7 +13,10 @@ public static class SpatialMath
     /// <param name="p1">The first point.</param>
     /// <param name="p2">The second point.</param>
     /// <returns>The distance in kilometers.</returns>
-    internal static double DistanceKm(GeoPoint p1, GeoPoint p2) => DistanceKm(p1.Latitude, p1.Longitude, p2.Latitude, p2.Longitude);
+    internal static double DistanceKm(GeoPoint p1, GeoPoint p2)
+    {
+        return DistanceKm(p1.Latitude, p1.Longitude, p2.Latitude, p2.Longitude);
+    }
 
     /// <summary>
     /// Calculates distance between two coordinates on Earth using Haversine formula.
@@ -42,7 +45,10 @@ public static class SpatialMath
     /// <param name="center">The center point.</param>
     /// <param name="radiusKm">The radius in kilometers.</param>
     /// <returns>The bounding box.</returns>
-    internal static GeoBox BoundingBox(GeoPoint center, double radiusKm) => BoundingBox(center.Latitude, center.Longitude, radiusKm);
+    internal static GeoBox BoundingBox(GeoPoint center, double radiusKm)
+    {
+        return BoundingBox(center.Latitude, center.Longitude, radiusKm);
+    }
 
     /// <summary>
     /// Creates a bounding box from a coordinate and radius.
@@ -51,7 +57,10 @@ public static class SpatialMath
     /// <param name="lon">The center longitude.</param>
     /// <param name="radiusKm">The radius in kilometers.</param>
     /// <returns>The bounding box.</returns>
-    internal static GeoBox InternalBoundingBox(double lat, double lon, double radiusKm) => BoundingBox(lat, lon, radiusKm);
+    internal static GeoBox InternalBoundingBox(double lat, double lon, double radiusKm)
+    {
+        return BoundingBox(lat, lon, radiusKm);
+    }
 
     /// <summary>
     /// Creates a bounding box centered at a coordinate with a given radius in kilometers.
@@ -72,6 +81,13 @@ public static class SpatialMath
             lon + dLon);
     }
 
-    private static double ToRadians(double degrees) => degrees * Math.PI / 180.0;
-    private static double ToDegrees(double radians) => radians * 180.0 / Math.PI;
+    private static double ToRadians(double degrees)
+    {
+        return degrees * Math.PI / 180.0;
+    }
+
+    private static double ToDegrees(double radians)
+    {
+        return radians * 180.0 / Math.PI;
+    }
 }
@@ -1,7 +1,5 @@
-using System.Runtime.Intrinsics;
-using System.Runtime.Intrinsics.X86;
-using System.Runtime.InteropServices;
 using System.Numerics;
+using System.Runtime.InteropServices;
 
 namespace ZB.MOM.WW.CBDD.Core.Indexing;
 
@@ -56,7 +54,7 @@ public static class VectorMath
             throw new ArgumentException("Vectors must have same length");
 
         float dot = 0;
-        int i = 0;
+        var i = 0;
 
         // SIMD Optimization for .NET
         if (Vector.IsHardwareAccelerated && v1.Length >= Vector<float>.Count)
@@ -65,20 +63,14 @@ public static class VectorMath
             var v1Span = MemoryMarshal.Cast<float, Vector<float>>(v1);
             var v2Span = MemoryMarshal.Cast<float, Vector<float>>(v2);
 
-            foreach (var chunk in Enumerable.Range(0, v1Span.Length))
-            {
-                vDot += v1Span[chunk] * v2Span[chunk];
-            }
+            foreach (int chunk in Enumerable.Range(0, v1Span.Length)) vDot += v1Span[chunk] * v2Span[chunk];
 
             dot = Vector.Dot(vDot, Vector<float>.One);
             i = v1Span.Length * Vector<float>.Count;
         }
 
         // Remaining elements
-        for (; i < v1.Length; i++)
-        {
-            dot += v1[i] * v2[i];
-        }
+        for (; i < v1.Length; i++) dot += v1[i] * v2[i];
 
         return dot;
     }
@@ -95,7 +87,7 @@ public static class VectorMath
             throw new ArgumentException("Vectors must have same length");
 
         float dist = 0;
-        int i = 0;
+        var i = 0;
 
         if (Vector.IsHardwareAccelerated && v1.Length >= Vector<float>.Count)
         {
@@ -103,7 +95,7 @@ public static class VectorMath
             var v1Span = MemoryMarshal.Cast<float, Vector<float>>(v1);
             var v2Span = MemoryMarshal.Cast<float, Vector<float>>(v2);
 
-            foreach (var chunk in Enumerable.Range(0, v1Span.Length))
+            foreach (int chunk in Enumerable.Range(0, v1Span.Length))
             {
                 var diff = v1Span[chunk] - v2Span[chunk];
                 vDist += diff * diff;
@@ -9,7 +9,10 @@ public static class VectorSearchExtensions
     /// <param name="vector">The vector property of the entity.</param>
     /// <param name="query">The query vector to compare against.</param>
     /// <param name="k">Number of nearest neighbors to return.</param>
-    /// <returns>True if the document is part of the top-k results (always returns true when evaluated in memory for compilation purposes).</returns>
+    /// <returns>
+    /// True if the document is part of the top-k results (always returns true when evaluated in memory for
+    /// compilation purposes).
+    /// </returns>
     public static bool VectorSearch(this float[] vector, float[] query, int k)
     {
         return true;
@@ -22,7 +25,10 @@ public static class VectorSearchExtensions
     /// <param name="vectors">The vector collection of the entity.</param>
     /// <param name="query">The query vector to compare against.</param>
     /// <param name="k">Number of nearest neighbors to return.</param>
-    /// <returns>True if the document is part of the top-k results (always returns true when evaluated in memory for compilation purposes).</returns>
+    /// <returns>
+    /// True if the document is part of the top-k results (always returns true when evaluated in memory for
+    /// compilation purposes).
+    /// </returns>
     public static bool VectorSearch(this IEnumerable<float[]> vectors, float[] query, int k)
     {
         return true;
@@ -1,6 +1,6 @@
+using System.Buffers;
 using ZB.MOM.WW.CBDD.Core.Storage;
 using ZB.MOM.WW.CBDD.Core.Transactions;
-using System.Collections.Generic;
 
 namespace ZB.MOM.WW.CBDD.Core.Indexing;
 
@@ -10,17 +10,10 @@ namespace ZB.MOM.WW.CBDD.Core.Indexing;
 /// </summary>
 public sealed class VectorSearchIndex
 {
-    private struct NodeReference
-    {
-        public uint PageId;
-        public int NodeIndex;
-        public int MaxLevel;
-    }
+    private readonly IndexOptions _options;
+    private readonly Random _random = new(42);
 
     private readonly IIndexStorage _storage;
-    private readonly IndexOptions _options;
-    private uint _rootPageId;
-    private readonly Random _random = new(42);
 
     /// <summary>
     /// Initializes a new vector search index.
@@ -43,13 +36,13 @@ public sealed class VectorSearchIndex
     {
         _storage = storage ?? throw new ArgumentNullException(nameof(storage));
        _options = options;
-        _rootPageId = rootPageId;
+        RootPageId = rootPageId;
    }
 
     /// <summary>
     /// Gets the root page identifier of the index.
     /// </summary>
-    public uint RootPageId => _rootPageId;
+    public uint RootPageId { get; private set; }
 
     /// <summary>
     /// Inserts a vector and its document location into the index.
@@ -60,28 +53,33 @@ public sealed class VectorSearchIndex
     public void Insert(float[] vector, DocumentLocation docLocation, ITransaction? transaction = null)
     {
         if (vector.Length != _options.Dimensions)
-            throw new ArgumentException($"Vector dimension mismatch. Expected {_options.Dimensions}, got {vector.Length}");
+            throw new ArgumentException(
+                $"Vector dimension mismatch. Expected {_options.Dimensions}, got {vector.Length}");
 
         // 1. Determine level for new node
         int targetLevel = GetRandomLevel();
 
         // 2. If index is empty, create first page and first node
-        if (_rootPageId == 0)
+        if (RootPageId == 0)
         {
-            _rootPageId = CreateNewPage(transaction);
-            var pageBuffer = RentPageBuffer();
+            RootPageId = CreateNewPage(transaction);
+            byte[] pageBuffer = RentPageBuffer();
             try
             {
-                _storage.ReadPage(_rootPageId, transaction?.TransactionId, pageBuffer);
+                _storage.ReadPage(RootPageId, transaction?.TransactionId, pageBuffer);
                 VectorPage.WriteNode(pageBuffer, 0, docLocation, targetLevel, vector, _options.Dimensions);
                 VectorPage.IncrementNodeCount(pageBuffer); // Helper needs to be added or handled
 
                 if (transaction != null)
-                    _storage.WritePage(_rootPageId, transaction.TransactionId, pageBuffer);
+                    _storage.WritePage(RootPageId, transaction.TransactionId, pageBuffer);
                 else
-                    _storage.WritePageImmediate(_rootPageId, pageBuffer);
+                    _storage.WritePageImmediate(RootPageId, pageBuffer);
             }
-            finally { ReturnPageBuffer(pageBuffer); }
+            finally
+            {
+                ReturnPageBuffer(pageBuffer);
+            }
 
             return;
         }
 
@@ -92,9 +90,7 @@ public sealed class VectorSearchIndex
 
         // 4. Greedy search down to targetLevel+1
         for (int l = entryPoint.MaxLevel; l > targetLevel; l--)
-        {
             currentPoint = GreedySearch(currentPoint, vector, l, transaction);
-        }
 
         // 5. Create the new node
         var newNode = AllocateNode(vector, docLocation, targetLevel, transaction);
@@ -105,23 +101,18 @@ public sealed class VectorSearchIndex
             var neighbors = SearchLayer(currentPoint, vector, _options.EfConstruction, l, transaction);
             var selectedNeighbors = SelectNeighbors(neighbors, vector, _options.M, l, transaction);
 
-            foreach (var neighbor in selectedNeighbors)
-            {
-                AddBidirectionalLink(newNode, neighbor, l, transaction);
-            }
+            foreach (var neighbor in selectedNeighbors) AddBidirectionalLink(newNode, neighbor, l, transaction);
 
             // Move currentPoint down for next level if available
             currentPoint = GreedySearch(currentPoint, vector, l, transaction);
         }
 
         // 7. Update entry point if new node is higher
-        if (targetLevel > entryPoint.MaxLevel)
-        {
-            UpdateEntryPoint(newNode, transaction);
-        }
+        if (targetLevel > entryPoint.MaxLevel) UpdateEntryPoint(newNode, transaction);
     }
 
-    private IEnumerable<NodeReference> SelectNeighbors(IEnumerable<NodeReference> candidates, float[] query, int m, int level, ITransaction? transaction)
+    private IEnumerable<NodeReference> SelectNeighbors(IEnumerable<NodeReference> candidates, float[] query, int m,
+        int level, ITransaction? transaction)
     {
         // Simple heuristic: just take top M nearest.
         // HNSW Paper suggests more complex heuristic to maintain connectivity diversity.
@@ -136,14 +127,14 @@ public sealed class VectorSearchIndex
 
     private void Link(NodeReference from, NodeReference to, int level, ITransaction? transaction)
     {
-        var buffer = RentPageBuffer();
+        byte[] buffer = RentPageBuffer();
         try
         {
            _storage.ReadPage(from.PageId, transaction?.TransactionId, buffer);
             var links = VectorPage.GetLinksSpan(buffer, from.NodeIndex, level, _options.Dimensions, _options.M);
 
             // Find first empty slot (PageId == 0)
-            for (int i = 0; i < links.Length; i += 6)
+            for (var i = 0; i < links.Length; i += 6)
             {
                 var existing = DocumentLocation.ReadFrom(links.Slice(i, 6));
                 if (existing.PageId == 0)
@@ -160,7 +151,10 @@ public sealed class VectorSearchIndex
             // If full, we should technically prune or redistribute links as per HNSW paper.
             // For now, we assume M is large enough or we skip (limited connectivity).
         }
-        finally { ReturnPageBuffer(buffer); }
+        finally
+        {
+            ReturnPageBuffer(buffer);
+        }
     }
 
     private NodeReference AllocateNode(float[] vector, DocumentLocation docLoc, int level, ITransaction? transaction)
@@ -168,10 +162,10 @@ public sealed class VectorSearchIndex
         // Find a page with space or create new
         // For simplicity, we search for a page with available slots or append to a new one.
         // Implementation omitted for brevity but required for full persistence.
-        uint pageId = _rootPageId; // Placeholder: need allocation strategy
-        int index = 0;
+        uint pageId = RootPageId; // Placeholder: need allocation strategy
+        var index = 0;
 
-        var buffer = RentPageBuffer();
+        byte[] buffer = RentPageBuffer();
         try
         {
             _storage.ReadPage(pageId, transaction?.TransactionId, buffer);
@@ -184,7 +178,10 @@ public sealed class VectorSearchIndex
             else
                 _storage.WritePageImmediate(pageId, buffer);
         }
-        finally { ReturnPageBuffer(buffer); }
+        finally
+        {
+            ReturnPageBuffer(buffer);
+        }
 
         return new NodeReference { PageId = pageId, NodeIndex = index, MaxLevel = level };
     }
@@ -197,7 +194,7 @@ public sealed class VectorSearchIndex
 
     private NodeReference GreedySearch(NodeReference entryPoint, float[] query, int level, ITransaction? transaction)
|
private NodeReference GreedySearch(NodeReference entryPoint, float[] query, int level, ITransaction? transaction)
|
||||||
{
|
{
|
||||||
bool changed = true;
|
var changed = true;
|
||||||
var current = entryPoint;
|
var current = entryPoint;
|
||||||
float currentDist = VectorMath.Distance(query, LoadVector(current, transaction), _options.Metric);
|
float currentDist = VectorMath.Distance(query, LoadVector(current, transaction), _options.Metric);
|
||||||
|
|
||||||
@@ -215,10 +212,12 @@ public sealed class VectorSearchIndex
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
return current;
|
return current;
|
||||||
}
|
}
|
||||||
|
|
||||||
private IEnumerable<NodeReference> SearchLayer(NodeReference entryPoint, float[] query, int ef, int level, ITransaction? transaction)
|
private IEnumerable<NodeReference> SearchLayer(NodeReference entryPoint, float[] query, int ef, int level,
|
||||||
|
ITransaction? transaction)
|
||||||
{
|
{
|
||||||
var visited = new HashSet<NodeReference>();
|
var visited = new HashSet<NodeReference>();
|
||||||
var candidates = new PriorityQueue<NodeReference, float>();
|
var candidates = new PriorityQueue<NodeReference, float>();
|
||||||
@@ -233,14 +232,13 @@ public sealed class VectorSearchIndex
|
|||||||
{
|
{
|
||||||
float d_c = 0;
|
float d_c = 0;
|
||||||
candidates.TryPeek(out var c, out d_c);
|
candidates.TryPeek(out var c, out d_c);
|
||||||
result.TryPeek(out var f, out var d_f);
|
result.TryPeek(out var f, out float d_f);
|
||||||
|
|
||||||
if (d_c > -d_f) break;
|
if (d_c > -d_f) break;
|
||||||
|
|
||||||
candidates.Dequeue();
|
candidates.Dequeue();
|
||||||
|
|
||||||
foreach (var e in GetNeighbors(c, level, transaction))
|
foreach (var e in GetNeighbors(c, level, transaction))
|
||||||
{
|
|
||||||
if (!visited.Contains(e))
|
if (!visited.Contains(e))
|
||||||
{
|
{
|
||||||
visited.Add(e);
|
visited.Add(e);
|
||||||
@@ -255,7 +253,6 @@ public sealed class VectorSearchIndex
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
// Convert result to list (ordered by distance)
|
// Convert result to list (ordered by distance)
|
||||||
var list = new List<NodeReference>();
|
var list = new List<NodeReference>();
|
||||||
@@ -268,20 +265,23 @@ public sealed class VectorSearchIndex
|
|||||||
{
|
{
|
||||||
// For now, assume a fixed location or track it in page 0 of index
|
// For now, assume a fixed location or track it in page 0 of index
|
||||||
// TODO: Real implementation
|
// TODO: Real implementation
|
||||||
return new NodeReference { PageId = _rootPageId, NodeIndex = 0, MaxLevel = 0 };
|
return new NodeReference { PageId = RootPageId, NodeIndex = 0, MaxLevel = 0 };
|
||||||
}
|
}
|
||||||
|
|
||||||
private float[] LoadVector(NodeReference node, ITransaction? transaction)
|
private float[] LoadVector(NodeReference node, ITransaction? transaction)
|
||||||
{
|
{
|
||||||
var buffer = RentPageBuffer();
|
byte[] buffer = RentPageBuffer();
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
_storage.ReadPage(node.PageId, transaction?.TransactionId, buffer);
|
_storage.ReadPage(node.PageId, transaction?.TransactionId, buffer);
|
||||||
float[] vector = new float[_options.Dimensions];
|
var vector = new float[_options.Dimensions];
|
||||||
VectorPage.ReadNodeData(buffer, node.NodeIndex, out _, out _, vector);
|
VectorPage.ReadNodeData(buffer, node.NodeIndex, out _, out _, vector);
|
||||||
return vector;
|
return vector;
|
||||||
}
|
}
|
||||||
finally { ReturnPageBuffer(buffer); }
|
finally
|
||||||
|
{
|
||||||
|
ReturnPageBuffer(buffer);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -292,24 +292,22 @@ public sealed class VectorSearchIndex
|
|||||||
/// <param name="efSearch">The search breadth parameter.</param>
|
/// <param name="efSearch">The search breadth parameter.</param>
|
||||||
/// <param name="transaction">Optional transaction context.</param>
|
/// <param name="transaction">Optional transaction context.</param>
|
||||||
/// <returns>The nearest vector search results.</returns>
|
/// <returns>The nearest vector search results.</returns>
|
||||||
public IEnumerable<VectorSearchResult> Search(float[] query, int k, int efSearch = 100, ITransaction? transaction = null)
|
public IEnumerable<VectorSearchResult> Search(float[] query, int k, int efSearch = 100,
|
||||||
|
ITransaction? transaction = null)
|
||||||
{
|
{
|
||||||
if (_rootPageId == 0) yield break;
|
if (RootPageId == 0) yield break;
|
||||||
|
|
||||||
var entryPoint = GetEntryPoint();
|
var entryPoint = GetEntryPoint();
|
||||||
var currentPoint = entryPoint;
|
var currentPoint = entryPoint;
|
||||||
|
|
||||||
// 1. Greedy search through higher layers to find entry point for level 0
|
// 1. Greedy search through higher layers to find entry point for level 0
|
||||||
for (int l = entryPoint.MaxLevel; l > 0; l--)
|
for (int l = entryPoint.MaxLevel; l > 0; l--) currentPoint = GreedySearch(currentPoint, query, l, transaction);
|
||||||
{
|
|
||||||
currentPoint = GreedySearch(currentPoint, query, l, transaction);
|
|
||||||
}
|
|
||||||
|
|
||||||
// 2. Comprehensive search on level 0
|
// 2. Comprehensive search on level 0
|
||||||
var nearest = SearchLayer(currentPoint, query, Math.Max(efSearch, k), 0, transaction);
|
var nearest = SearchLayer(currentPoint, query, Math.Max(efSearch, k), 0, transaction);
|
||||||
|
|
||||||
// 3. Return top-k results
|
// 3. Return top-k results
|
||||||
int count = 0;
|
var count = 0;
|
||||||
foreach (var node in nearest)
|
foreach (var node in nearest)
|
||||||
{
|
{
|
||||||
if (count++ >= k) break;
|
if (count++ >= k) break;
|
||||||
@@ -322,26 +320,29 @@ public sealed class VectorSearchIndex
|
|||||||
|
|
||||||
private DocumentLocation LoadDocumentLocation(NodeReference node, ITransaction? transaction)
|
private DocumentLocation LoadDocumentLocation(NodeReference node, ITransaction? transaction)
|
||||||
{
|
{
|
||||||
var buffer = RentPageBuffer();
|
byte[] buffer = RentPageBuffer();
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
_storage.ReadPage(node.PageId, transaction?.TransactionId, buffer);
|
_storage.ReadPage(node.PageId, transaction?.TransactionId, buffer);
|
||||||
VectorPage.ReadNodeData(buffer, node.NodeIndex, out var loc, out _, new float[0]); // Vector not needed here
|
VectorPage.ReadNodeData(buffer, node.NodeIndex, out var loc, out _, new float[0]); // Vector not needed here
|
||||||
return loc;
|
return loc;
|
||||||
}
|
}
|
||||||
finally { ReturnPageBuffer(buffer); }
|
finally
|
||||||
|
{
|
||||||
|
ReturnPageBuffer(buffer);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private IEnumerable<NodeReference> GetNeighbors(NodeReference node, int level, ITransaction? transaction)
|
private IEnumerable<NodeReference> GetNeighbors(NodeReference node, int level, ITransaction? transaction)
|
||||||
{
|
{
|
||||||
var buffer = RentPageBuffer();
|
byte[] buffer = RentPageBuffer();
|
||||||
var results = new List<NodeReference>();
|
var results = new List<NodeReference>();
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
_storage.ReadPage(node.PageId, transaction?.TransactionId, buffer);
|
_storage.ReadPage(node.PageId, transaction?.TransactionId, buffer);
|
||||||
var links = VectorPage.GetLinksSpan(buffer, node.NodeIndex, level, _options.Dimensions, _options.M);
|
var links = VectorPage.GetLinksSpan(buffer, node.NodeIndex, level, _options.Dimensions, _options.M);
|
||||||
|
|
||||||
for (int i = 0; i < links.Length; i += 6)
|
for (var i = 0; i < links.Length; i += 6)
|
||||||
{
|
{
|
||||||
var loc = DocumentLocation.ReadFrom(links.Slice(i, 6));
|
var loc = DocumentLocation.ReadFrom(links.Slice(i, 6));
|
||||||
if (loc.PageId == 0) break; // End of links
|
if (loc.PageId == 0) break; // End of links
|
||||||
@@ -349,7 +350,11 @@ public sealed class VectorSearchIndex
|
|||||||
results.Add(new NodeReference { PageId = loc.PageId, NodeIndex = loc.SlotIndex });
|
results.Add(new NodeReference { PageId = loc.PageId, NodeIndex = loc.SlotIndex });
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
finally { ReturnPageBuffer(buffer); }
|
finally
|
||||||
|
{
|
||||||
|
ReturnPageBuffer(buffer);
|
||||||
|
}
|
||||||
|
|
||||||
return results;
|
return results;
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -357,29 +362,43 @@ public sealed class VectorSearchIndex
|
|||||||
{
|
{
|
||||||
// Probability p = 1/M for each level
|
// Probability p = 1/M for each level
|
||||||
double p = 1.0 / _options.M;
|
double p = 1.0 / _options.M;
|
||||||
int level = 0;
|
var level = 0;
|
||||||
while (_random.NextDouble() < p && level < 15)
|
while (_random.NextDouble() < p && level < 15) level++;
|
||||||
{
|
|
||||||
level++;
|
|
||||||
}
|
|
||||||
return level;
|
return level;
|
||||||
}
|
}
|
||||||
|
|
||||||
private uint CreateNewPage(ITransaction? transaction)
|
private uint CreateNewPage(ITransaction? transaction)
|
||||||
{
|
{
|
||||||
uint pageId = _storage.AllocatePage();
|
uint pageId = _storage.AllocatePage();
|
||||||
var buffer = RentPageBuffer();
|
byte[] buffer = RentPageBuffer();
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
VectorPage.Initialize(buffer, pageId, _options.Dimensions, _options.M);
|
VectorPage.Initialize(buffer, pageId, _options.Dimensions, _options.M);
|
||||||
_storage.WritePageImmediate(pageId, buffer);
|
_storage.WritePageImmediate(pageId, buffer);
|
||||||
return pageId;
|
return pageId;
|
||||||
}
|
}
|
||||||
finally { ReturnPageBuffer(buffer); }
|
finally
|
||||||
|
{
|
||||||
|
ReturnPageBuffer(buffer);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private byte[] RentPageBuffer() => System.Buffers.ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
private byte[] RentPageBuffer()
|
||||||
private void ReturnPageBuffer(byte[] buffer) => System.Buffers.ArrayPool<byte>.Shared.Return(buffer);
|
{
|
||||||
|
return ArrayPool<byte>.Shared.Rent(_storage.PageSize);
|
||||||
|
}
|
||||||
|
|
||||||
|
private void ReturnPageBuffer(byte[] buffer)
|
||||||
|
{
|
||||||
|
ArrayPool<byte>.Shared.Return(buffer);
|
||||||
|
}
|
||||||
|
|
||||||
|
private struct NodeReference
|
||||||
|
{
|
||||||
|
public uint PageId;
|
||||||
|
public int NodeIndex;
|
||||||
|
public int MaxLevel;
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
public record struct VectorSearchResult(DocumentLocation Location, float Distance);
|
public record struct VectorSearchResult(DocumentLocation Location, float Distance);
|
||||||
@@ -1,5 +1,5 @@
-using ZB.MOM.WW.CBDD.Core.Indexing;
 using System.Linq.Expressions;
+using ZB.MOM.WW.CBDD.Core.Indexing;

 namespace ZB.MOM.WW.CBDD.Core.Metadata;

@@ -54,7 +54,8 @@ public class EntityTypeBuilder<T> where T : class
     /// <param name="name">The optional index name.</param>
     /// <param name="unique">A value indicating whether the index is unique.</param>
     /// <returns>The current entity type builder.</returns>
-    public EntityTypeBuilder<T> HasIndex<TKey>(Expression<Func<T, TKey>> keySelector, string? name = null, bool unique = false)
+    public EntityTypeBuilder<T> HasIndex<TKey>(Expression<Func<T, TKey>> keySelector, string? name = null,
+        bool unique = false)
     {
         Indexes.Add(new IndexBuilder<T>(keySelector, name, unique));
         return this;
@@ -69,7 +70,8 @@ public class EntityTypeBuilder<T> where T : class
     /// <param name="metric">The vector similarity metric.</param>
     /// <param name="name">The optional index name.</param>
     /// <returns>The current entity type builder.</returns>
-    public EntityTypeBuilder<T> HasVectorIndex<TKey>(Expression<Func<T, TKey>> keySelector, int dimensions, VectorMetric metric = VectorMetric.Cosine, string? name = null)
+    public EntityTypeBuilder<T> HasVectorIndex<TKey>(Expression<Func<T, TKey>> keySelector, int dimensions,
+        VectorMetric metric = VectorMetric.Cosine, string? name = null)
     {
         Indexes.Add(new IndexBuilder<T>(keySelector, name, false, IndexType.Vector, dimensions, metric));
         return this;
@@ -108,10 +110,7 @@ public class EntityTypeBuilder<T> where T : class
     /// <returns>The current entity type builder.</returns>
     public EntityTypeBuilder<T> HasConversion<TConverter>()
     {
-        if (!string.IsNullOrEmpty(PrimaryKeyName))
-        {
-            PropertyConverters[PrimaryKeyName] = typeof(TConverter);
-        }
+        if (!string.IsNullOrEmpty(PrimaryKeyName)) PropertyConverters[PrimaryKeyName] = typeof(TConverter);
         return this;
     }

@@ -123,7 +122,7 @@ public class EntityTypeBuilder<T> where T : class
     /// <returns>A builder for the selected property.</returns>
     public PropertyBuilder Property<TProperty>(Expression<Func<T, TProperty>> propertyExpression)
     {
-        var propertyName = ExpressionAnalyzer.ExtractPropertyPaths(propertyExpression).FirstOrDefault();
+        string? propertyName = ExpressionAnalyzer.ExtractPropertyPaths(propertyExpression).FirstOrDefault();
         return new PropertyBuilder(this, propertyName);
     }

@@ -149,10 +148,7 @@ public class EntityTypeBuilder<T> where T : class
     /// <returns>The current property builder.</returns>
     public PropertyBuilder ValueGeneratedOnAdd()
     {
-        if (_propertyName == _parent.PrimaryKeyName)
-        {
-            _parent.ValueGeneratedOnAdd = true;
-        }
+        if (_propertyName == _parent.PrimaryKeyName) _parent.ValueGeneratedOnAdd = true;
         return this;
     }

@@ -163,10 +159,7 @@ public class EntityTypeBuilder<T> where T : class
     /// <returns>The current property builder.</returns>
     public PropertyBuilder HasConversion<TConverter>()
     {
-        if (!string.IsNullOrEmpty(_propertyName))
-        {
-            _parent.PropertyConverters[_propertyName] = typeof(TConverter);
-        }
+        if (!string.IsNullOrEmpty(_propertyName)) _parent.PropertyConverters[_propertyName] = typeof(TConverter);
         return this;
     }
 }
@@ -174,6 +167,26 @@ public class EntityTypeBuilder<T> where T : class

 public class IndexBuilder<T>
 {
+    /// <summary>
+    /// Initializes a new instance of the <see cref="IndexBuilder{T}" /> class.
+    /// </summary>
+    /// <param name="keySelector">The index key selector expression.</param>
+    /// <param name="name">The optional index name.</param>
+    /// <param name="unique">A value indicating whether the index is unique.</param>
+    /// <param name="type">The index type.</param>
+    /// <param name="dimensions">The vector dimensions.</param>
+    /// <param name="metric">The vector metric.</param>
+    public IndexBuilder(LambdaExpression keySelector, string? name, bool unique, IndexType type = IndexType.BTree,
+        int dimensions = 0, VectorMetric metric = VectorMetric.Cosine)
+    {
+        KeySelector = keySelector;
+        Name = name;
+        IsUnique = unique;
+        Type = type;
+        Dimensions = dimensions;
+        Metric = metric;
+    }
+
     /// <summary>
     /// Gets the index key selector expression.
     /// </summary>
@@ -203,23 +216,4 @@ public class IndexBuilder<T>
     /// Gets the vector metric.
     /// </summary>
     public VectorMetric Metric { get; }
-
-    /// <summary>
-    /// Initializes a new instance of the <see cref="IndexBuilder{T}"/> class.
-    /// </summary>
-    /// <param name="keySelector">The index key selector expression.</param>
-    /// <param name="name">The optional index name.</param>
-    /// <param name="unique">A value indicating whether the index is unique.</param>
-    /// <param name="type">The index type.</param>
-    /// <param name="dimensions">The vector dimensions.</param>
-    /// <param name="metric">The vector metric.</param>
-    public IndexBuilder(LambdaExpression keySelector, string? name, bool unique, IndexType type = IndexType.BTree, int dimensions = 0, VectorMetric metric = VectorMetric.Cosine)
-    {
-        KeySelector = keySelector;
-        Name = name;
-        IsUnique = unique;
-        Type = type;
-        Dimensions = dimensions;
-        Metric = metric;
-    }
 }
@@ -1,6 +1,3 @@
-using System.Linq.Expressions;
-using ZB.MOM.WW.CBDD.Core.Indexing;
-
 namespace ZB.MOM.WW.CBDD.Core.Metadata;

 public class ModelBuilder
@@ -14,11 +11,12 @@ public class ModelBuilder
     /// <returns>The entity builder for <typeparamref name="T" />.</returns>
     public EntityTypeBuilder<T> Entity<T>() where T : class
     {
-        if (!_entityBuilders.TryGetValue(typeof(T), out var builder))
+        if (!_entityBuilders.TryGetValue(typeof(T), out object? builder))
         {
             builder = new EntityTypeBuilder<T>();
             _entityBuilders[typeof(T)] = builder;
         }

         return (EntityTypeBuilder<T>)builder;
     }

@@ -26,5 +24,8 @@ public class ModelBuilder
     /// Gets all registered entity builders.
     /// </summary>
     /// <returns>A read-only dictionary of entity builders keyed by entity type.</returns>
-    public IReadOnlyDictionary<Type, object> GetEntityBuilders() => _entityBuilders;
+    public IReadOnlyDictionary<Type, object> GetEntityBuilders()
+    {
+        return _entityBuilders;
+    }
 }
@@ -9,13 +9,15 @@ internal class BTreeExpressionVisitor : ExpressionVisitor
     /// <summary>
     /// Gets the query model built while visiting an expression tree.
     /// </summary>
-    public QueryModel GetModel() => _model;
+    public QueryModel GetModel()
+    {
+        return _model;
+    }

     /// <inheritdoc />
     protected override Expression VisitMethodCall(MethodCallExpression node)
     {
         if (node.Method.DeclaringType == typeof(Queryable))
-        {
             switch (node.Method.Name)
             {
                 case "Where":
@@ -35,7 +37,6 @@ internal class BTreeExpressionVisitor : ExpressionVisitor
                     VisitSkip(node);
                     break;
             }
-        }

         return base.VisitMethodCall(node);
     }
@@ -1,6 +1,6 @@
-using System.Linq;
 using System.Linq.Expressions;
 using System.Reflection;
+using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Core.Collections;
 using static ZB.MOM.WW.CBDD.Core.Query.IndexOptimizer;

@@ -30,8 +30,7 @@ public class BTreeQueryProvider<TId, T> : IQueryProvider where T : class
         try
         {
             return (IQueryable)Activator.CreateInstance(
-                typeof(BTreeQueryable<>).MakeGenericType(elementType),
-                new object[] { this, expression })!;
+                typeof(BTreeQueryable<>).MakeGenericType(elementType), this, expression)!;
         }
         catch (TargetInvocationException ex)
         {
@@ -80,45 +79,30 @@ public class BTreeQueryProvider<TId, T> : IQueryProvider where T : class
         IEnumerable<T> sourceData = null!;

         // A. Try Index Optimization (Only if Where clause exists)
-        var indexOpt = IndexOptimizer.TryOptimize<T>(model, _collection.GetIndexes());
+        var indexOpt = TryOptimize<T>(model, _collection.GetIndexes());
         if (indexOpt != null)
         {
             if (indexOpt.IsVectorSearch)
-            {
                 sourceData = _collection.VectorSearch(indexOpt.IndexName, indexOpt.VectorQuery!, indexOpt.K);
-            }
             else if (indexOpt.IsSpatialSearch)
-            {
                 sourceData = indexOpt.SpatialType == SpatialQueryType.Near
                     ? _collection.Near(indexOpt.IndexName, indexOpt.SpatialPoint, indexOpt.RadiusKm)
                     : _collection.Within(indexOpt.IndexName, indexOpt.SpatialMin, indexOpt.SpatialMax);
-            }
             else
-            {
                 sourceData = _collection.QueryIndex(indexOpt.IndexName, indexOpt.MinValue, indexOpt.MaxValue);
-            }
         }

         // B. Try Scan Optimization (if no index used)
         if (sourceData == null)
         {
-            Func<ZB.MOM.WW.CBDD.Bson.BsonSpanReader, bool>? bsonPredicate = null;
-            if (model.WhereClause != null)
-            {
-                bsonPredicate = BsonExpressionEvaluator.TryCompile<T>(model.WhereClause);
-            }
+            Func<BsonSpanReader, bool>? bsonPredicate = null;
+            if (model.WhereClause != null) bsonPredicate = BsonExpressionEvaluator.TryCompile<T>(model.WhereClause);

-            if (bsonPredicate != null)
-            {
-                sourceData = _collection.Scan(bsonPredicate);
-            }
+            if (bsonPredicate != null) sourceData = _collection.Scan(bsonPredicate);
         }

         // C. Fallback to Full Scan
-        if (sourceData == null)
-        {
-            sourceData = _collection.FindAll();
-        }
+        if (sourceData == null) sourceData = _collection.FindAll();

         // 3. Rewrite Expression Tree to use Enumerable
         // Replace the "Root" IQueryable with our sourceData IEnumerable
@@ -140,10 +124,8 @@ public class BTreeQueryProvider<TId, T> : IQueryProvider where T : class
         // We need to turn it into a Func<TResult> and invoke it.

         if (rewrittenExpression.Type != typeof(TResult))
-        {
             // If TResult is object (non-generic Execute), we need to cast
             rewrittenExpression = Expression.Convert(rewrittenExpression, typeof(TResult));
-        }

         var lambda = Expression.Lambda<Func<TResult>>(rewrittenExpression);
         var compiled = lambda.Compile();
@@ -162,11 +144,9 @@ public class BTreeQueryProvider<TId, T> : IQueryProvider where T : class
     {
         // If we found a Queryable, that's our root source
         if (Root == null && node.Value is IQueryable q)
-        {
             // We typically want the "base" queryable (the BTreeQueryable instance)
             // In a chain like Coll.Where.Select, the root is Coll.
             Root = q;
-        }
         return base.VisitConstant(node);
     }
 }
@@ -1,5 +1,4 @@
|
|||||||
using System.Collections;
|
using System.Collections;
|
||||||
using System.Linq;
|
|
||||||
using System.Linq.Expressions;
|
using System.Linq.Expressions;
|
||||||
|
|
||||||
namespace ZB.MOM.WW.CBDD.Core.Query;
|
namespace ZB.MOM.WW.CBDD.Core.Query;
|
||||||
|
|||||||
@@ -29,12 +29,11 @@ internal static class BsonExpressionEvaluator
|
|||||||
}
|
}
|
||||||
|
|
||||||
if (left is MemberExpression member && right is ConstantExpression constant)
|
if (left is MemberExpression member && right is ConstantExpression constant)
|
||||||
{
|
|
||||||
// Check if member is property of parameter
|
// Check if member is property of parameter
|
||||||
if (member.Expression == expression.Parameters[0])
|
if (member.Expression == expression.Parameters[0])
|
||||||
{
|
{
|
||||||
var propertyName = member.Member.Name.ToLowerInvariant();
|
string propertyName = member.Member.Name.ToLowerInvariant();
|
||||||
var value = constant.Value;
|
object? value = constant.Value;
|
||||||
|
|
||||||
// Handle Id mapping?
|
// Handle Id mapping?
|
||||||
// If property is "id", Bson field is "_id"
|
// If property is "id", Bson field is "_id"
|
||||||
@@ -43,12 +42,13 @@ internal static class BsonExpressionEvaluator
|
|||||||
return CreatePredicate(propertyName, value, nodeType);
|
 return CreatePredicate(propertyName, value, nodeType);
 }
 }
-}

 return null;
 }

-private static ExpressionType Flip(ExpressionType type) => type switch
+private static ExpressionType Flip(ExpressionType type)
+{
+return type switch
 {
 ExpressionType.GreaterThan => ExpressionType.LessThan,
 ExpressionType.LessThan => ExpressionType.GreaterThan,
@@ -56,8 +56,10 @@ internal static class BsonExpressionEvaluator
 ExpressionType.LessThanOrEqual => ExpressionType.GreaterThanOrEqual,
 _ => type
 };
+}

-private static Func<BsonSpanReader, bool>? CreatePredicate(string propertyName, object? targetValue, ExpressionType op)
+private static Func<BsonSpanReader, bool>? CreatePredicate(string propertyName, object? targetValue,
+ExpressionType op)
 {
 // We need to return a delegate that searches for propertyName in BsonSpanReader and compares

@@ -71,13 +73,11 @@ internal static class BsonExpressionEvaluator
 var type = reader.ReadBsonType();
 if (type == 0) break;

-var name = reader.ReadElementHeader();
+string name = reader.ReadElementHeader();

 if (name == propertyName)
-{
 // Found -> read value and compare
 return Compare(ref reader, type, targetValue, op);
-}

 reader.SkipValue(type);
 }
@@ -86,6 +86,7 @@ internal static class BsonExpressionEvaluator
 {
 return false;
 }

 return false; // Not found
 };
 }
@@ -97,9 +98,8 @@ internal static class BsonExpressionEvaluator

 if (type == BsonType.Int32)
 {
-var val = reader.ReadInt32();
+int val = reader.ReadInt32();
 if (target is int targetInt)
-{
 return op switch
 {
 ExpressionType.Equal => val == targetInt,
@@ -111,13 +111,12 @@ internal static class BsonExpressionEvaluator
 _ => false
 };
 }
-}
 else if (type == BsonType.String)
 {
-var val = reader.ReadString();
+string val = reader.ReadString();
 if (target is string targetStr)
 {
-var cmp = string.Compare(val, targetStr, StringComparison.Ordinal);
+int cmp = string.Compare(val, targetStr, StringComparison.Ordinal);
 return op switch
 {
 ExpressionType.Equal => cmp == 0,
@@ -1,6 +1,3 @@
-using System;
-using System.Collections.Generic;
-using System.Linq;
 using System.Linq.Expressions;
 using System.Reflection;

@@ -26,10 +23,7 @@ internal class EnumerableRewriter : ExpressionVisitor
 protected override Expression VisitConstant(ConstantExpression node)
 {
 // Replace the IQueryable source with the materialized IEnumerable
-if (node.Value == _source)
-{
-return Expression.Constant(_target);
-}
+if (node.Value == _source) return Expression.Constant(_target);
 return base.VisitConstant(node);
 }

@@ -38,11 +32,11 @@ internal class EnumerableRewriter : ExpressionVisitor
 {
 if (node.Method.DeclaringType == typeof(Queryable))
 {
-var methodName = node.Method.Name;
+string methodName = node.Method.Name;
 var typeArgs = node.Method.GetGenericArguments();
 var args = new Expression[node.Arguments.Count];

-for (int i = 0; i < node.Arguments.Count; i++)
+for (var i = 0; i < node.Arguments.Count; i++)
 {
 var arg = Visit(node.Arguments[i]);

@@ -52,6 +46,7 @@ internal class EnumerableRewriter : ExpressionVisitor
 var lambda = (LambdaExpression)quote.Operand;
 arg = Expression.Constant(lambda.Compile());
 }

 args[i] = arg;
 }
@@ -5,6 +5,227 @@ namespace ZB.MOM.WW.CBDD.Core.Query;

 internal static class IndexOptimizer
 {
+public enum SpatialQueryType
+{
+Near,
+Within
+}
+
+/// <summary>
+/// Attempts to optimize a query model using available indexes.
+/// </summary>
+/// <typeparam name="T">The document type.</typeparam>
+/// <param name="model">The query model.</param>
+/// <param name="indexes">The available collection indexes.</param>
+/// <returns>An optimization result when optimization is possible; otherwise, <see langword="null" />.</returns>
+public static OptimizationResult? TryOptimize<T>(QueryModel model, IEnumerable<CollectionIndexInfo> indexes)
+{
+if (model.WhereClause == null) return null;
+
+return OptimizeExpression(model.WhereClause.Body, model.WhereClause.Parameters[0], indexes);
+}
+
+private static OptimizationResult? OptimizeExpression(Expression expression, ParameterExpression parameter,
+IEnumerable<CollectionIndexInfo> indexes)
+{
+// ... (Existing AndAlso logic remains the same) ...
+if (expression is BinaryExpression binary && binary.NodeType == ExpressionType.AndAlso)
+{
+var left = OptimizeExpression(binary.Left, parameter, indexes);
+var right = OptimizeExpression(binary.Right, parameter, indexes);
+
+if (left != null && right != null && left.IndexName == right.IndexName)
+return new OptimizationResult
+{
+IndexName = left.IndexName,
+MinValue = left.MinValue ?? right.MinValue,
+MaxValue = left.MaxValue ?? right.MaxValue,
+IsRange = true
+};
+return left ?? right;
+}
+
+// Handle Simple Binary Predicates
+(string? propertyName, object? value, var op) = ParseSimplePredicate(expression, parameter);
+if (propertyName != null)
+{
+var index = indexes.FirstOrDefault(i => Matches(i, propertyName));
+if (index != null)
+{
+var result = new OptimizationResult { IndexName = index.Name };
+switch (op)
+{
+case ExpressionType.Equal:
+result.MinValue = value;
+result.MaxValue = value;
+result.IsRange = false;
+break;
+case ExpressionType.GreaterThan:
+case ExpressionType.GreaterThanOrEqual:
+result.MinValue = value;
+result.MaxValue = null;
+result.IsRange = true;
+break;
+case ExpressionType.LessThan:
+case ExpressionType.LessThanOrEqual:
+result.MinValue = null;
+result.MaxValue = value;
+result.IsRange = true;
+break;
+}
+
+return result;
+}
+}
+
+// Handle StartsWith
+if (expression is MethodCallExpression call && call.Method.Name == "StartsWith" &&
+call.Object is MemberExpression member)
+if (member.Expression == parameter && call.Arguments[0] is ConstantExpression constant &&
+constant.Value is string prefix)
+{
+var index = indexes.FirstOrDefault(i => Matches(i, member.Member.Name));
+if (index != null && index.Type == IndexType.BTree)
+{
+string nextPrefix = IncrementPrefix(prefix);
+return new OptimizationResult
+{
+IndexName = index.Name,
+MinValue = prefix,
+MaxValue = nextPrefix,
+IsRange = true
+};
+}
+}
+
+// Handle Method Calls (VectorSearch, Near, Within)
+if (expression is MethodCallExpression mcall)
+{
+// VectorSearch(this float[] vector, float[] query, int k)
+if (mcall.Method.Name == "VectorSearch" && mcall.Arguments[0] is MemberExpression vMember &&
+vMember.Expression == parameter)
+{
+float[] query = EvaluateExpression<float[]>(mcall.Arguments[1]);
+var k = EvaluateExpression<int>(mcall.Arguments[2]);
+
+var index = indexes.FirstOrDefault(i => i.Type == IndexType.Vector && Matches(i, vMember.Member.Name));
+if (index != null)
+return new OptimizationResult
+{
+IndexName = index.Name,
+IsVectorSearch = true,
+VectorQuery = query,
+K = k
+};
+}
+
+// Near(this (double, double) point, (double, double) center, double radiusKm)
+if (mcall.Method.Name == "Near" && mcall.Arguments[0] is MemberExpression nMember &&
+nMember.Expression == parameter)
+{
+var center = EvaluateExpression<(double, double)>(mcall.Arguments[1]);
+var radius = EvaluateExpression<double>(mcall.Arguments[2]);
+
+var index = indexes.FirstOrDefault(i => i.Type == IndexType.Spatial && Matches(i, nMember.Member.Name));
+if (index != null)
+return new OptimizationResult
+{
+IndexName = index.Name,
+IsSpatialSearch = true,
+SpatialType = SpatialQueryType.Near,
+SpatialPoint = center,
+RadiusKm = radius
+};
+}
+
+// Within(this (double, double) point, (double, double) min, (double, double) max)
+if (mcall.Method.Name == "Within" && mcall.Arguments[0] is MemberExpression wMember &&
+wMember.Expression == parameter)
+{
+var min = EvaluateExpression<(double, double)>(mcall.Arguments[1]);
+var max = EvaluateExpression<(double, double)>(mcall.Arguments[2]);
+
+var index = indexes.FirstOrDefault(i => i.Type == IndexType.Spatial && Matches(i, wMember.Member.Name));
+if (index != null)
+return new OptimizationResult
+{
+IndexName = index.Name,
+IsSpatialSearch = true,
+SpatialType = SpatialQueryType.Within,
+SpatialMin = min,
+SpatialMax = max
+};
+}
+}
+
+return null;
+}
+
+private static string IncrementPrefix(string prefix)
+{
+if (string.IsNullOrEmpty(prefix)) return null!;
+char lastChar = prefix[prefix.Length - 1];
+if (lastChar == char.MaxValue) return prefix; // Cannot increment
+return prefix.Substring(0, prefix.Length - 1) + (char)(lastChar + 1);
+}
+
+private static T EvaluateExpression<T>(Expression expression)
+{
+if (expression is ConstantExpression constant) return (T)constant.Value!;
+
+// Evaluate more complex expressions (closures, properties, etc.)
+var lambda = Expression.Lambda(expression);
+var compiled = lambda.Compile();
+return (T)compiled.DynamicInvoke()!;
+}
+
+private static bool Matches(CollectionIndexInfo index, string propertyName)
+{
+if (index.PropertyPaths == null || index.PropertyPaths.Length == 0) return false;
+return index.PropertyPaths[0].Equals(propertyName, StringComparison.OrdinalIgnoreCase);
+}
+
+private static (string? propertyName, object? value, ExpressionType op) ParseSimplePredicate(Expression expression,
+ParameterExpression parameter)
+{
+if (expression is BinaryExpression binary)
+{
+var left = binary.Left;
+var right = binary.Right;
+var nodeType = binary.NodeType;
+
+if (right is MemberExpression && left is ConstantExpression)
+{
+(left, right) = (right, left);
+nodeType = Flip(nodeType);
+}
+
+if (left is MemberExpression member && right is ConstantExpression constant)
+if (member.Expression == parameter)
+return (member.Member.Name, constant.Value, nodeType);
+
+// Handle Convert
+if (left is UnaryExpression unary && unary.Operand is MemberExpression member2 &&
+right is ConstantExpression constant2)
+if (member2.Expression == parameter)
+return (member2.Member.Name, constant2.Value, nodeType);
+}
+
+return (null, null, ExpressionType.Default);
+}
+
+private static ExpressionType Flip(ExpressionType type)
+{
+return type switch
+{
+ExpressionType.GreaterThan => ExpressionType.LessThan,
+ExpressionType.LessThan => ExpressionType.GreaterThan,
+ExpressionType.GreaterThanOrEqual => ExpressionType.LessThanOrEqual,
+ExpressionType.LessThanOrEqual => ExpressionType.GreaterThanOrEqual,
+_ => type
+};
+}
+
 /// <summary>
 /// Represents the selected index and bounds for an optimized query.
 /// </summary>
@@ -75,225 +296,4 @@ internal static class IndexOptimizer
 /// </summary>
 public SpatialQueryType SpatialType { get; set; }
 }

-public enum SpatialQueryType { Near, Within }
-
-/// <summary>
-/// Attempts to optimize a query model using available indexes.
-/// </summary>
-/// <typeparam name="T">The document type.</typeparam>
-/// <param name="model">The query model.</param>
-/// <param name="indexes">The available collection indexes.</param>
-/// <returns>An optimization result when optimization is possible; otherwise, <see langword="null"/>.</returns>
-public static OptimizationResult? TryOptimize<T>(QueryModel model, IEnumerable<CollectionIndexInfo> indexes)
-{
-if (model.WhereClause == null) return null;
-
-return OptimizeExpression(model.WhereClause.Body, model.WhereClause.Parameters[0], indexes);
-}
-
-private static OptimizationResult? OptimizeExpression(Expression expression, ParameterExpression parameter, IEnumerable<CollectionIndexInfo> indexes)
-{
-// ... (Existing AndAlso logic remains the same) ...
-if (expression is BinaryExpression binary && binary.NodeType == ExpressionType.AndAlso)
-{
-var left = OptimizeExpression(binary.Left, parameter, indexes);
-var right = OptimizeExpression(binary.Right, parameter, indexes);
-
-if (left != null && right != null && left.IndexName == right.IndexName)
-{
-return new OptimizationResult
-{
-IndexName = left.IndexName,
-MinValue = left.MinValue ?? right.MinValue,
-MaxValue = left.MaxValue ?? right.MaxValue,
-IsRange = true
-};
-}
-return left ?? right;
-}
-
-// Handle Simple Binary Predicates
-var (propertyName, value, op) = ParseSimplePredicate(expression, parameter);
-if (propertyName != null)
-{
-var index = indexes.FirstOrDefault(i => Matches(i, propertyName));
-if (index != null)
-{
-var result = new OptimizationResult { IndexName = index.Name };
-switch (op)
-{
-case ExpressionType.Equal:
-result.MinValue = value;
-result.MaxValue = value;
-result.IsRange = false;
-break;
-case ExpressionType.GreaterThan:
-case ExpressionType.GreaterThanOrEqual:
-result.MinValue = value;
-result.MaxValue = null;
-result.IsRange = true;
-break;
-case ExpressionType.LessThan:
-case ExpressionType.LessThanOrEqual:
-result.MinValue = null;
-result.MaxValue = value;
-result.IsRange = true;
-break;
-}
-return result;
-}
-}
-
-// Handle StartsWith
-if (expression is MethodCallExpression call && call.Method.Name == "StartsWith" && call.Object is MemberExpression member)
-{
-if (member.Expression == parameter && call.Arguments[0] is ConstantExpression constant && constant.Value is string prefix)
-{
-var index = indexes.FirstOrDefault(i => Matches(i, member.Member.Name));
-if (index != null && index.Type == IndexType.BTree)
-{
-var nextPrefix = IncrementPrefix(prefix);
-return new OptimizationResult
-{
-IndexName = index.Name,
-MinValue = prefix,
-MaxValue = nextPrefix,
-IsRange = true
-};
-}
-}
-}
-
-// Handle Method Calls (VectorSearch, Near, Within)
-if (expression is MethodCallExpression mcall)
-{
-// VectorSearch(this float[] vector, float[] query, int k)
-if (mcall.Method.Name == "VectorSearch" && mcall.Arguments[0] is MemberExpression vMember && vMember.Expression == parameter)
-{
-var query = EvaluateExpression<float[]>(mcall.Arguments[1]);
-var k = EvaluateExpression<int>(mcall.Arguments[2]);
-
-var index = indexes.FirstOrDefault(i => i.Type == IndexType.Vector && Matches(i, vMember.Member.Name));
-if (index != null)
-{
-return new OptimizationResult
-{
-IndexName = index.Name,
-IsVectorSearch = true,
-VectorQuery = query,
-K = k
-};
-}
-}
-
-// Near(this (double, double) point, (double, double) center, double radiusKm)
-if (mcall.Method.Name == "Near" && mcall.Arguments[0] is MemberExpression nMember && nMember.Expression == parameter)
-{
-var center = EvaluateExpression<(double, double)>(mcall.Arguments[1]);
-var radius = EvaluateExpression<double>(mcall.Arguments[2]);
-
-var index = indexes.FirstOrDefault(i => i.Type == IndexType.Spatial && Matches(i, nMember.Member.Name));
-if (index != null)
-{
-return new OptimizationResult
-{
-IndexName = index.Name,
-IsSpatialSearch = true,
-SpatialType = SpatialQueryType.Near,
-SpatialPoint = center,
-RadiusKm = radius
-};
-}
-}
-
-// Within(this (double, double) point, (double, double) min, (double, double) max)
-if (mcall.Method.Name == "Within" && mcall.Arguments[0] is MemberExpression wMember && wMember.Expression == parameter)
-{
-var min = EvaluateExpression<(double, double)>(mcall.Arguments[1]);
-var max = EvaluateExpression<(double, double)>(mcall.Arguments[2]);
-
-var index = indexes.FirstOrDefault(i => i.Type == IndexType.Spatial && Matches(i, wMember.Member.Name));
-if (index != null)
-{
-return new OptimizationResult
-{
-IndexName = index.Name,
-IsSpatialSearch = true,
-SpatialType = SpatialQueryType.Within,
-SpatialMin = min,
-SpatialMax = max
-};
-}
-}
-}
-
-return null;
-}
-
-private static string IncrementPrefix(string prefix)
-{
-if (string.IsNullOrEmpty(prefix)) return null!;
-char lastChar = prefix[prefix.Length - 1];
-if (lastChar == char.MaxValue) return prefix; // Cannot increment
-return prefix.Substring(0, prefix.Length - 1) + (char)(lastChar + 1);
-}
-
-private static T EvaluateExpression<T>(Expression expression)
-{
-if (expression is ConstantExpression constant)
-{
-return (T)constant.Value!;
-}
-
-// Evaluate more complex expressions (closures, properties, etc.)
-var lambda = Expression.Lambda(expression);
-var compiled = lambda.Compile();
-return (T)compiled.DynamicInvoke()!;
-}
-
-private static bool Matches(CollectionIndexInfo index, string propertyName)
-{
-if (index.PropertyPaths == null || index.PropertyPaths.Length == 0) return false;
-return index.PropertyPaths[0].Equals(propertyName, StringComparison.OrdinalIgnoreCase);
-}
-
-private static (string? propertyName, object? value, ExpressionType op) ParseSimplePredicate(Expression expression, ParameterExpression parameter)
-{
-if (expression is BinaryExpression binary)
-{
-var left = binary.Left;
-var right = binary.Right;
-var nodeType = binary.NodeType;
-
-if (right is MemberExpression && left is ConstantExpression)
-{
-(left, right) = (right, left);
-nodeType = Flip(nodeType);
-}
-
-if (left is MemberExpression member && right is ConstantExpression constant)
-{
-if (member.Expression == parameter)
-return (member.Member.Name, constant.Value, nodeType);
-}
-
-// Handle Convert
-if (left is UnaryExpression unary && unary.Operand is MemberExpression member2 && right is ConstantExpression constant2)
-{
-if (member2.Expression == parameter)
-return (member2.Member.Name, constant2.Value, nodeType);
-}
-}
-return (null, null, ExpressionType.Default);
-}
-
-private static ExpressionType Flip(ExpressionType type) => type switch
-{
-ExpressionType.GreaterThan => ExpressionType.LessThan,
-ExpressionType.LessThan => ExpressionType.GreaterThan,
-ExpressionType.GreaterThanOrEqual => ExpressionType.LessThanOrEqual,
-ExpressionType.LessThanOrEqual => ExpressionType.GreaterThanOrEqual,
-_ => type
-};
 }
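The `StartsWith` branch in IndexOptimizer turns a prefix predicate into a closed B-tree range: every key beginning with `prefix` sorts between `prefix` (inclusive) and the prefix with its last character incremented (exclusive). A minimal sketch of that bound computation, in Python for brevity (the repository code is C#; the function name mirrors the diff's `IncrementPrefix`):

```python
def increment_prefix(prefix: str):
    """Smallest string greater than every string that starts with `prefix`."""
    if not prefix:
        return None                      # no usable upper bound for an empty prefix
    last = prefix[-1]
    if last == '\uffff':                 # cannot increment; mirror the diff and return unchanged
        return prefix
    return prefix[:-1] + chr(ord(last) + 1)

# A range scan for keys k with prefix <= k < increment_prefix(prefix)
# returns exactly the StartsWith matches:
assert increment_prefix("app") == "apq"
assert "app" <= "apple" < "apq"
```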
@@ -1,6 +1,6 @@
-using System.Runtime.InteropServices;
+using System.Buffers;
+using System.Buffers.Binary;
 using System.Text;
-using ZB.MOM.WW.CBDD.Core;

 namespace ZB.MOM.WW.CBDD.Core.Storage;

@@ -49,8 +49,8 @@ public struct DictionaryPage
 header.WriteTo(page);

 // 2. Initialize Counts
-System.Buffers.Binary.BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(CountOffset), 0);
+BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(CountOffset), 0);
-System.Buffers.Binary.BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(FreeSpaceEndOffset), (ushort)page.Length);
+BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(FreeSpaceEndOffset), (ushort)page.Length);
 }

 /// <summary>
@@ -63,29 +63,26 @@ public struct DictionaryPage
 /// <returns><see langword="true" /> if the entry was inserted; otherwise, <see langword="false" />.</returns>
 public static bool Insert(Span<byte> page, string key, ushort value)
 {
-var keyByteCount = Encoding.UTF8.GetByteCount(key);
+int keyByteCount = Encoding.UTF8.GetByteCount(key);
 if (keyByteCount > 255) throw new ArgumentException("Key length must be <= 255 bytes");

 // Entry Size: KeyLen(1) + Key(N) + Value(2)
-var entrySize = 1 + keyByteCount + 2;
+int entrySize = 1 + keyByteCount + 2;
-var requiredSpace = entrySize + 2; // +2 for Offset entry
+int requiredSpace = entrySize + 2; // +2 for Offset entry

-var count = System.Buffers.Binary.BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(CountOffset));
+ushort count = BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(CountOffset));
-var freeSpaceEnd = System.Buffers.Binary.BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(FreeSpaceEndOffset));
+ushort freeSpaceEnd = BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(FreeSpaceEndOffset));

-var offsetsEnd = OffsetsStart + (count * 2);
+int offsetsEnd = OffsetsStart + count * 2;
-var freeSpace = freeSpaceEnd - offsetsEnd;
+int freeSpace = freeSpaceEnd - offsetsEnd;

-if (freeSpace < requiredSpace)
-{
-return false; // Page Full
-}
+if (freeSpace < requiredSpace) return false; // Page Full

 // 1. Prepare Data
 var insertionOffset = (ushort)(freeSpaceEnd - entrySize);
 page[insertionOffset] = (byte)keyByteCount; // Write Key Length
 Encoding.UTF8.GetBytes(key, page.Slice(insertionOffset + 1, keyByteCount)); // Write Key
-System.Buffers.Binary.BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(insertionOffset + 1 + keyByteCount), value); // Write Value
+BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(insertionOffset + 1 + keyByteCount), value); // Write Value

 // 2. Insert Offset into Sorted List
 // Find insert Index using spans
@@ -95,21 +92,21 @@ public struct DictionaryPage
 // Shift offsets if needed
 if (insertIndex < count)
 {
-var src = page.Slice(OffsetsStart + (insertIndex * 2), (count - insertIndex) * 2);
+var src = page.Slice(OffsetsStart + insertIndex * 2, (count - insertIndex) * 2);
-var dest = page.Slice(OffsetsStart + ((insertIndex + 1) * 2));
+var dest = page.Slice(OffsetsStart + (insertIndex + 1) * 2);
 src.CopyTo(dest);
 }

 // Write new offset
-System.Buffers.Binary.BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(OffsetsStart + (insertIndex * 2)), insertionOffset);
+BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(OffsetsStart + insertIndex * 2), insertionOffset);

 // 3. Update Metadata
-System.Buffers.Binary.BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(CountOffset), (ushort)(count + 1));
+BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(CountOffset), (ushort)(count + 1));
-System.Buffers.Binary.BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(FreeSpaceEndOffset), insertionOffset);
+BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(FreeSpaceEndOffset), insertionOffset);

 // Update FreeBytes in header (approximate)
 var pageHeader = PageHeader.ReadFrom(page);
-pageHeader.FreeBytes = (ushort)(insertionOffset - (OffsetsStart + ((count + 1) * 2)));
+pageHeader.FreeBytes = (ushort)(insertionOffset - (OffsetsStart + (count + 1) * 2));
 pageHeader.WriteTo(page);

 return true;
@@ -125,27 +122,27 @@ public struct DictionaryPage
 public static bool TryFind(ReadOnlySpan<byte> page, ReadOnlySpan<byte> keyBytes, out ushort value)
 {
 value = 0;
-var count = System.Buffers.Binary.BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(CountOffset));
+ushort count = BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(CountOffset));
 if (count == 0) return false;

 // Binary Search
-int low = 0;
+var low = 0;
 int high = count - 1;

 while (low <= high)
 {
 int mid = low + (high - low) / 2;
-var offset = System.Buffers.Binary.BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(OffsetsStart + (mid * 2)));
+ushort offset = BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(OffsetsStart + mid * 2));

 // Read Key at Offset
-var keyLen = page[offset];
+byte keyLen = page[offset];
 var entryKeySpan = page.Slice(offset + 1, keyLen);

 int comparison = entryKeySpan.SequenceCompareTo(keyBytes);

 if (comparison == 0)
 {
-value = System.Buffers.Binary.BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(offset + 1 + keyLen));
+value = BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(offset + 1 + keyLen));
 return true;
 }

@@ -167,14 +164,15 @@ public struct DictionaryPage
 /// <param name="value">When this method returns, contains the found value.</param>
 /// <param name="transactionId">Optional transaction identifier for isolated reads.</param>
 /// <returns><see langword="true" /> if the key was found; otherwise, <see langword="false" />.</returns>
-public static bool TryFindGlobal(StorageEngine storage, uint startPageId, string key, out ushort value, ulong? transactionId = null)
+public static bool TryFindGlobal(StorageEngine storage, uint startPageId, string key, out ushort value,
+ulong? transactionId = null)
 {
-var keyByteCount = Encoding.UTF8.GetByteCount(key);
+int keyByteCount = Encoding.UTF8.GetByteCount(key);
-Span<byte> keyBytes = keyByteCount <= 256 ? stackalloc byte[keyByteCount] : new byte[keyByteCount];
+var keyBytes = keyByteCount <= 256 ? stackalloc byte[keyByteCount] : new byte[keyByteCount];
 Encoding.UTF8.GetBytes(key, keyBytes);

-var pageId = startPageId;
+uint pageId = startPageId;
-var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(storage.PageSize);
+byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(storage.PageSize);
 try
 {
 while (pageId != 0)
@@ -183,10 +181,7 @@ public struct DictionaryPage
 storage.ReadPage(pageId, transactionId, pageBuffer);

 // TryFind in this page
-if (TryFind(pageBuffer, keyBytes, out value))
-{
-return true;
-}
+if (TryFind(pageBuffer, keyBytes, out value)) return true;

 // Move to next page
 var header = PageHeader.ReadFrom(pageBuffer);
@@ -195,7 +190,7 @@ public struct DictionaryPage
 }
 finally
 {
-System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
+ArrayPool<byte>.Shared.Return(pageBuffer);
 }

 value = 0;
@@ -204,15 +199,15 @@ public struct DictionaryPage

 private static int FindInsertIndex(ReadOnlySpan<byte> page, int count, ReadOnlySpan<byte> keyBytes)
 {
-int low = 0;
+var low = 0;
 int high = count - 1;

 while (low <= high)
 {
 int mid = low + (high - low) / 2;
-var offset = System.Buffers.Binary.BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(OffsetsStart + (mid * 2)));
+ushort offset = BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(OffsetsStart + mid * 2));

-var keyLen = page[offset];
+byte keyLen = page[offset];
 var entryKeySpan = page.Slice(offset + 1, keyLen);

 int comparison = entryKeySpan.SequenceCompareTo(keyBytes);
@@ -223,6 +218,7 @@ public struct DictionaryPage
 else
 high = mid - 1;
 }

 return low;
 }

@@ -233,18 +229,20 @@ public struct DictionaryPage
 /// <returns>All key-value pairs in the page.</returns>
 public static IEnumerable<(string Key, ushort Value)> GetAll(ReadOnlySpan<byte> page)
|
||||||
{
|
{
|
||||||
var count = System.Buffers.Binary.BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(CountOffset));
|
ushort count = BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(CountOffset));
|
||||||
var list = new List<(string Key, ushort Value)>();
|
var list = new List<(string Key, ushort Value)>();
|
||||||
for (int i = 0; i < count; i++)
|
for (var i = 0; i < count; i++)
|
||||||
{
|
{
|
||||||
var offset = System.Buffers.Binary.BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(OffsetsStart + (i * 2)));
|
ushort offset = BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(OffsetsStart + i * 2));
|
||||||
var keyLen = page[offset];
|
byte keyLen = page[offset];
|
||||||
var keyStr = Encoding.UTF8.GetString(page.Slice(offset + 1, keyLen));
|
string keyStr = Encoding.UTF8.GetString(page.Slice(offset + 1, keyLen));
|
||||||
var val = System.Buffers.Binary.BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(offset + 1 + keyLen));
|
ushort val = BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(offset + 1 + keyLen));
|
||||||
list.Add((keyStr, val));
|
list.Add((keyStr, val));
|
||||||
}
|
}
|
||||||
|
|
||||||
return list;
|
return list;
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
/// Retrieves all key-value pairs across a chain of DictionaryPages.
|
/// Retrieves all key-value pairs across a chain of DictionaryPages.
|
||||||
/// Used for rebuilding the in-memory cache.
|
/// Used for rebuilding the in-memory cache.
|
||||||
@@ -253,10 +251,11 @@ public struct DictionaryPage
|
|||||||
/// <param name="startPageId">The first page in the dictionary chain.</param>
|
/// <param name="startPageId">The first page in the dictionary chain.</param>
|
||||||
/// <param name="transactionId">Optional transaction identifier for isolated reads.</param>
|
/// <param name="transactionId">Optional transaction identifier for isolated reads.</param>
|
||||||
/// <returns>All key-value pairs across the page chain.</returns>
|
/// <returns>All key-value pairs across the page chain.</returns>
|
||||||
public static IEnumerable<(string Key, ushort Value)> FindAllGlobal(StorageEngine storage, uint startPageId, ulong? transactionId = null)
|
public static IEnumerable<(string Key, ushort Value)> FindAllGlobal(StorageEngine storage, uint startPageId,
|
||||||
|
ulong? transactionId = null)
|
||||||
{
|
{
|
||||||
var pageId = startPageId;
|
uint pageId = startPageId;
|
||||||
var pageBuffer = System.Buffers.ArrayPool<byte>.Shared.Rent(storage.PageSize);
|
byte[] pageBuffer = ArrayPool<byte>.Shared.Rent(storage.PageSize);
|
||||||
try
|
try
|
||||||
{
|
{
|
||||||
while (pageId != 0)
|
while (pageId != 0)
|
||||||
@@ -265,10 +264,7 @@ public struct DictionaryPage
|
|||||||
storage.ReadPage(pageId, transactionId, pageBuffer);
|
storage.ReadPage(pageId, transactionId, pageBuffer);
|
||||||
|
|
||||||
// Get all entries in this page
|
// Get all entries in this page
|
||||||
foreach (var entry in GetAll(pageBuffer))
|
foreach (var entry in GetAll(pageBuffer)) yield return entry;
|
||||||
{
|
|
||||||
yield return entry;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Move to next page
|
// Move to next page
|
||||||
var header = PageHeader.ReadFrom(pageBuffer);
|
var header = PageHeader.ReadFrom(pageBuffer);
|
||||||
@@ -277,7 +273,7 @@ public struct DictionaryPage
|
|||||||
}
|
}
|
||||||
finally
|
finally
|
||||||
{
|
{
|
||||||
System.Buffers.ArrayPool<byte>.Shared.Return(pageBuffer);
|
ArrayPool<byte>.Shared.Return(pageBuffer);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -9,15 +9,18 @@ internal interface IIndexStorage
     /// Gets or sets the PageSize.
     /// </summary>
     int PageSize { get; }

     /// <summary>
     /// Executes AllocatePage.
     /// </summary>
     uint AllocatePage();

     /// <summary>
     /// Executes FreePage.
     /// </summary>
     /// <param name="pageId">The page identifier.</param>
     void FreePage(uint pageId);

     /// <summary>
     /// Executes ReadPage.
     /// </summary>
@@ -25,6 +28,7 @@ internal interface IIndexStorage
     /// <param name="transactionId">The optional transaction identifier.</param>
     /// <param name="destination">The destination buffer.</param>
     void ReadPage(uint pageId, ulong? transactionId, Span<byte> destination);

     /// <summary>
     /// Executes WritePage.
     /// </summary>
@@ -32,6 +36,7 @@ internal interface IIndexStorage
     /// <param name="transactionId">The transaction identifier.</param>
     /// <param name="data">The source page data.</param>
     void WritePage(uint pageId, ulong transactionId, ReadOnlySpan<byte> data);

     /// <summary>
     /// Executes WritePageImmediate.
     /// </summary>
@@ -61,7 +61,8 @@ internal interface IStorageEngine : IIndexStorage, IDisposable
     /// </summary>
     /// <param name="isolationLevel">The transaction isolation level.</param>
     /// <param name="ct">A cancellation token.</param>
-    Task<Transaction> BeginTransactionAsync(IsolationLevel isolationLevel = IsolationLevel.ReadCommitted, CancellationToken ct = default);
+    Task<Transaction> BeginTransactionAsync(IsolationLevel isolationLevel = IsolationLevel.ReadCommitted,
+        CancellationToken ct = default);

     /// <summary>
     /// Gets collection metadata by name.
@@ -59,25 +59,12 @@ public readonly struct PageFileConfig
 /// </summary>
 public sealed class PageFile : IDisposable
 {
-    private readonly string _filePath;
     private readonly PageFileConfig _config;
-    private FileStream? _fileStream;
-    private MemoryMappedFile? _mappedFile;
     private readonly object _lock = new();
     private bool _disposed;
-    private bool _wasCreated;
-    private uint _nextPageId;
+    private FileStream? _fileStream;
     private uint _firstFreePageId;
+    private MemoryMappedFile? _mappedFile;

-    /// <summary>
-    /// Gets the next page identifier that will be allocated.
-    /// </summary>
-    public uint NextPageId => _nextPageId;
-
-    /// <summary>
-    /// Indicates whether this file was newly created on the current open call.
-    /// </summary>
-    public bool WasCreated => _wasCreated;
-
     /// <summary>
     /// Initializes a new instance of the <see cref="PageFile" /> class.
@@ -86,10 +73,20 @@ public sealed class PageFile : IDisposable
     /// <param name="config">The page file configuration.</param>
     public PageFile(string filePath, PageFileConfig config)
     {
-        _filePath = filePath ?? throw new ArgumentNullException(nameof(filePath));
+        FilePath = filePath ?? throw new ArgumentNullException(nameof(filePath));
         _config = config;
     }

+    /// <summary>
+    /// Gets the next page identifier that will be allocated.
+    /// </summary>
+    public uint NextPageId { get; private set; }
+
+    /// <summary>
+    /// Indicates whether this file was newly created on the current open call.
+    /// </summary>
+    public bool WasCreated { get; private set; }
+
     /// <summary>
     /// Gets the configured page size in bytes.
     /// </summary>
@@ -98,7 +95,7 @@ public sealed class PageFile : IDisposable
     /// <summary>
     /// Gets the underlying file path.
     /// </summary>
-    public string FilePath => _filePath;
+    public string FilePath { get; }

     /// <summary>
     /// Gets the current physical file length in bytes.
@@ -120,6 +117,31 @@ public sealed class PageFile : IDisposable
     /// </summary>
     public PageFileConfig Config => _config;

+    /// <summary>
+    /// Releases resources used by the page file.
+    /// </summary>
+    public void Dispose()
+    {
+        if (_disposed)
+            return;
+
+        lock (_lock)
+        {
+            // 1. Flush any pending writes from memory-mapped file
+            if (_fileStream != null) _fileStream.Flush(true);
+
+            // 2. Close memory-mapped file first
+            _mappedFile?.Dispose();
+
+            // 3. Then close file stream
+            _fileStream?.Dispose();
+
+            _disposed = true;
+        }
+
+        GC.SuppressFinalize(this);
+    }
+
     /// <summary>
     /// Opens the page file, creating it if it doesn't exist
     /// </summary>
@@ -130,26 +152,28 @@ public sealed class PageFile : IDisposable
         if (_fileStream != null)
             return; // Already open

-        var fileExists = File.Exists(_filePath);
+        bool fileExists = File.Exists(FilePath);

         _fileStream = new FileStream(
-            _filePath,
+            FilePath,
             FileMode.OpenOrCreate,
             FileAccess.ReadWrite,
             FileShare.None,
-            bufferSize: 4096,
+            4096,
             FileOptions.RandomAccess);

-        _wasCreated = !fileExists || _fileStream.Length == 0;
-        if (_wasCreated)
+        WasCreated = !fileExists || _fileStream.Length == 0;
+        if (WasCreated)
         {
             // Initialize new file with 2 pages (Header + Collection Metadata)
-            _fileStream.SetLength(_config.InitialFileSize < _config.PageSize * 2 ? _config.PageSize * 2 : _config.InitialFileSize);
+            _fileStream.SetLength(_config.InitialFileSize < _config.PageSize * 2
+                ? _config.PageSize * 2
+                : _config.InitialFileSize);
             InitializeHeader();
         }

         // Initialize next page ID based on file length
-        _nextPageId = (uint)(_fileStream.Length / _config.PageSize);
+        NextPageId = (uint)(_fileStream.Length / _config.PageSize);

         _mappedFile = MemoryMappedFile.CreateFromFile(
             _fileStream,
@@ -157,7 +181,7 @@ public sealed class PageFile : IDisposable
             _fileStream.Length,
             _config.Access,
             HandleInheritability.None,
-            leaveOpen: true);
+            true);

         // Read free list head from Page 0
         if (_fileStream.Length >= _config.PageSize)
@@ -228,7 +252,7 @@ public sealed class PageFile : IDisposable
         if (_mappedFile == null)
             throw new InvalidOperationException("File not open");

-        var offset = (long)pageId * _config.PageSize;
+        long offset = pageId * _config.PageSize;

         using var accessor = _mappedFile.CreateViewAccessor(offset, _config.PageSize, MemoryMappedFileAccess.Read);
         var temp = new byte[_config.PageSize];
@@ -249,16 +273,15 @@ public sealed class PageFile : IDisposable
         if (_mappedFile == null)
             throw new InvalidOperationException("File not open");

-        var offset = (long)pageId * _config.PageSize;
+        long offset = pageId * _config.PageSize;

         // Ensure file is large enough
         if (offset + _config.PageSize > _fileStream!.Length)
-        {
             lock (_lock)
             {
                 if (offset + _config.PageSize > _fileStream.Length)
                 {
-                    var newSize = Math.Max(offset + _config.PageSize, _fileStream.Length * 2);
+                    long newSize = Math.Max(offset + _config.PageSize, _fileStream.Length * 2);
                     _fileStream.SetLength(newSize);

                     // Recreate memory-mapped file with new size
@@ -269,8 +292,7 @@ public sealed class PageFile : IDisposable
                         _fileStream.Length,
                         _config.Access,
                         HandleInheritability.None,
-                        leaveOpen: true);
+                        true);
                 }
             }
-        }

@@ -294,7 +316,7 @@ public sealed class PageFile : IDisposable
         // 1. Try to reuse a free page
         if (_firstFreePageId != 0)
         {
-            var recycledPageId = _firstFreePageId;
+            uint recycledPageId = _firstFreePageId;

             // Read the recycled page to update the free list head
             var buffer = new byte[_config.PageSize];
@@ -311,13 +333,13 @@ public sealed class PageFile : IDisposable
         }

         // 2. No free pages, append new one
-        var pageId = _nextPageId++;
+        uint pageId = NextPageId++;

         // Extend file if necessary
-        var requiredLength = (long)(pageId + 1) * _config.PageSize;
+        long requiredLength = (pageId + 1) * _config.PageSize;
         if (requiredLength > _fileStream.Length)
         {
-            var newSize = Math.Max(requiredLength, _fileStream.Length * 2);
+            long newSize = Math.Max(requiredLength, _fileStream.Length * 2);
             _fileStream.SetLength(newSize);

             // Recreate memory-mapped file with new size
@@ -328,7 +350,7 @@ public sealed class PageFile : IDisposable
                 _fileStream.Length,
                 _config.Access,
                 HandleInheritability.None,
-                leaveOpen: true);
+                true);
         }

         return pageId;
@@ -401,11 +423,13 @@ public sealed class PageFile : IDisposable
         if (_mappedFile == null)
             throw new InvalidOperationException("File not open");

-        var absoluteOffset = 32 + extensionOffset;
+        int absoluteOffset = 32 + extensionOffset;
         if (absoluteOffset + destination.Length > _config.PageSize)
-            throw new ArgumentOutOfRangeException(nameof(destination), "Requested range exceeds page 0 extension region.");
+            throw new ArgumentOutOfRangeException(nameof(destination),
+                "Requested range exceeds page 0 extension region.");

-        using var accessor = _mappedFile.CreateViewAccessor(absoluteOffset, destination.Length, MemoryMappedFileAccess.Read);
+        using var accessor =
+            _mappedFile.CreateViewAccessor(absoluteOffset, destination.Length, MemoryMappedFileAccess.Read);
         var temp = new byte[destination.Length];
         accessor.ReadArray(0, temp, 0, temp.Length);
         temp.CopyTo(destination);
@@ -427,11 +451,12 @@ public sealed class PageFile : IDisposable
         if (_mappedFile == null)
             throw new InvalidOperationException("File not open");

-        var absoluteOffset = 32 + extensionOffset;
+        int absoluteOffset = 32 + extensionOffset;
         if (absoluteOffset + source.Length > _config.PageSize)
             throw new ArgumentOutOfRangeException(nameof(source), "Requested range exceeds page 0 extension region.");

-        using var accessor = _mappedFile.CreateViewAccessor(absoluteOffset, source.Length, MemoryMappedFileAccess.Write);
+        using var accessor =
+            _mappedFile.CreateViewAccessor(absoluteOffset, source.Length, MemoryMappedFileAccess.Write);
         accessor.WriteArray(0, source.ToArray(), 0, source.Length);
     }

@@ -443,7 +468,7 @@ public sealed class PageFile : IDisposable
     {
         lock (_lock)
         {
-            _fileStream?.Flush(flushToDisk: true);
+            _fileStream?.Flush(true);
         }
     }

@@ -460,14 +485,11 @@ public sealed class PageFile : IDisposable
     {
         EnsureFileOpen();

-        var directory = Path.GetDirectoryName(destinationPath);
-        if (!string.IsNullOrWhiteSpace(directory))
-        {
-            Directory.CreateDirectory(directory);
-        }
+        string? directory = Path.GetDirectoryName(destinationPath);
+        if (!string.IsNullOrWhiteSpace(directory)) Directory.CreateDirectory(directory);

-        _fileStream!.Flush(flushToDisk: true);
-        var originalPosition = _fileStream.Position;
+        _fileStream!.Flush(true);
+        long originalPosition = _fileStream.Position;
         try
         {
             _fileStream.Position = 0;
@@ -476,10 +498,10 @@ public sealed class PageFile : IDisposable
                 FileMode.Create,
                 FileAccess.Write,
                 FileShare.None,
-                bufferSize: 128 * 1024,
+                128 * 1024,
                 FileOptions.SequentialScan | FileOptions.WriteThrough);
             _fileStream.CopyTo(destination);
-            destination.Flush(flushToDisk: true);
+            destination.Flush(true);
         }
         finally
         {
@@ -508,11 +530,12 @@ public sealed class PageFile : IDisposable
             FileMode.Open,
             FileAccess.Read,
             FileShare.Read,
-            bufferSize: 128 * 1024,
+            128 * 1024,
             FileOptions.SequentialScan);

         if (source.Length <= 0 || source.Length % _config.PageSize != 0)
-            throw new InvalidDataException($"Replacement file length must be a positive multiple of page size ({_config.PageSize}).");
+            throw new InvalidDataException(
+                $"Replacement file length must be a positive multiple of page size ({_config.PageSize}).");

         _mappedFile?.Dispose();
         _mappedFile = null;
@@ -520,7 +543,7 @@ public sealed class PageFile : IDisposable
         _fileStream!.SetLength(source.Length);
         _fileStream.Position = 0;
         source.CopyTo(_fileStream);
-        _fileStream.Flush(flushToDisk: true);
+        _fileStream.Flush(true);

         _mappedFile = MemoryMappedFile.CreateFromFile(
             _fileStream,
@@ -528,16 +551,17 @@ public sealed class PageFile : IDisposable
             _fileStream.Length,
             _config.Access,
             HandleInheritability.None,
-            leaveOpen: true);
+            true);

-        _nextPageId = (uint)(_fileStream.Length / _config.PageSize);
+        NextPageId = (uint)(_fileStream.Length / _config.PageSize);
         _firstFreePageId = 0;

         if (_fileStream.Length >= _config.PageSize)
         {
             const int pageHeaderSizeBytes = 32;
             var headerSpan = new byte[pageHeaderSizeBytes];
-            using var accessor = _mappedFile.CreateViewAccessor(0, pageHeaderSizeBytes, MemoryMappedFileAccess.Read);
+            using var accessor =
+                _mappedFile.CreateViewAccessor(0, pageHeaderSizeBytes, MemoryMappedFileAccess.Read);
             accessor.ReadArray(0, headerSpan, 0, pageHeaderSizeBytes);
             var header = PageHeader.ReadFrom(headerSpan);
             _firstFreePageId = header.NextPageId;
@@ -563,7 +587,10 @@ public sealed class PageFile : IDisposable
     /// <summary>
     /// Normalizes the free-list by rebuilding it from a deterministic sorted free page set.
     /// </summary>
-    /// <param name="includeEmptyPages">If set to <see langword="true"/>, all-zero pages are converted into explicit free-list pages.</param>
+    /// <param name="includeEmptyPages">
+    /// If set to <see langword="true" />, all-zero pages are converted into explicit free-list
+    /// pages.
+    /// </param>
     /// <returns>The number of pages in the normalized free-list.</returns>
     public int NormalizeFreeList(bool includeEmptyPages = true)
     {
@@ -580,7 +607,10 @@ public sealed class PageFile : IDisposable
     /// Truncates contiguous reclaimable pages at the end of the file.
     /// Reclaimable tail pages include explicit free pages and truly empty pages.
     /// </summary>
-    /// <param name="minimumPageCount">Minimum number of pages that must remain after truncation (defaults to header + metadata pages).</param>
+    /// <param name="minimumPageCount">
+    /// Minimum number of pages that must remain after truncation (defaults to header + metadata
+    /// pages).
+    /// </param>
     /// <returns>Details about the truncation operation.</returns>
     public TailTruncationResult TruncateReclaimableTailPages(uint minimumPageCount = 2)
     {
@@ -588,31 +618,22 @@ public sealed class PageFile : IDisposable
        {
            EnsureFileOpen();

-           if (_nextPageId <= minimumPageCount)
-           {
-               return TailTruncationResult.None(_nextPageId);
-           }
+           if (NextPageId <= minimumPageCount) return TailTruncationResult.None(NextPageId);

-           var freePages = new HashSet<uint>(CollectFreePageIds(includeEmptyPages: true));
-           var originalPageCount = _nextPageId;
-           var newPageCount = _nextPageId;
+           var freePages = new HashSet<uint>(CollectFreePageIds(true));
+           uint originalPageCount = NextPageId;
+           uint newPageCount = NextPageId;
            var pageBuffer = new byte[_config.PageSize];

            while (newPageCount > minimumPageCount)
            {
-               var candidatePageId = newPageCount - 1;
+               uint candidatePageId = newPageCount - 1;
-               if (!IsReclaimableTailPage(candidatePageId, freePages, pageBuffer))
-               {
-                   break;
-               }
+               if (!IsReclaimableTailPage(candidatePageId, freePages, pageBuffer)) break;

                newPageCount--;
            }

-           if (newPageCount == originalPageCount)
-           {
-               return TailTruncationResult.None(originalPageCount);
-           }
+           if (newPageCount == originalPageCount) return TailTruncationResult.None(originalPageCount);

            freePages.RemoveWhere(pageId => pageId >= newPageCount);
            var remainingFreePages = freePages.ToList();
@@ -622,10 +643,10 @@ public sealed class PageFile : IDisposable
            _mappedFile?.Dispose();
            _mappedFile = null;

-           var previousLengthBytes = _fileStream!.Length;
-           var newLengthBytes = (long)newPageCount * _config.PageSize;
+           long previousLengthBytes = _fileStream!.Length;
+           long newLengthBytes = newPageCount * _config.PageSize;
            _fileStream.SetLength(newLengthBytes);
-           _fileStream.Flush(flushToDisk: true);
+           _fileStream.Flush(true);

            _mappedFile = MemoryMappedFile.CreateFromFile(
                _fileStream,
@@ -633,9 +654,9 @@ public sealed class PageFile : IDisposable
                newLengthBytes,
                _config.Access,
                HandleInheritability.None,
-               leaveOpen: true);
+               true);

-           _nextPageId = newPageCount;
+           NextPageId = newPageCount;

            return new TailTruncationResult(
                originalPageCount,
@@ -655,8 +676,8 @@ public sealed class PageFile : IDisposable
        {
            EnsureFileOpen();

-           var targetLengthBytes = (long)_nextPageId * _config.PageSize;
-           var currentLengthBytes = _fileStream!.Length;
+           long targetLengthBytes = NextPageId * _config.PageSize;
+           long currentLengthBytes = _fileStream!.Length;
            if (currentLengthBytes <= targetLengthBytes)
                return 0;

@@ -664,7 +685,7 @@ public sealed class PageFile : IDisposable
            _mappedFile = null;

            _fileStream.SetLength(targetLengthBytes);
-           _fileStream.Flush(flushToDisk: true);
+           _fileStream.Flush(true);

            _mappedFile = MemoryMappedFile.CreateFromFile(
                _fileStream,
@@ -672,7 +693,7 @@ public sealed class PageFile : IDisposable
                targetLengthBytes,
                _config.Access,
                HandleInheritability.None,
-               leaveOpen: true);
+               true);

            return currentLengthBytes - targetLengthBytes;
        }
@@ -701,7 +722,7 @@ public sealed class PageFile : IDisposable
        {
            EnsureFileOpen();

-           if (pageId >= _nextPageId)
+           if (pageId >= NextPageId)
                throw new ArgumentOutOfRangeException(nameof(pageId));

            var pageBuffer = new byte[_config.PageSize];
@@ -742,7 +763,7 @@ public sealed class PageFile : IDisposable
        if (!IsSlottedPageType(header.PageType))
            return SlottedPageDefragmentationResult.None;

-       var slotArrayEnd = SlottedPageHeader.Size + (header.SlotCount * SlotEntry.Size);
+       int slotArrayEnd = SlottedPageHeader.Size + header.SlotCount * SlotEntry.Size;
        if (slotArrayEnd > pageBuffer.Length)
            return SlottedPageDefragmentationResult.None;

@@ -750,29 +771,29 @@ public sealed class PageFile : IDisposable

        for (ushort i = 0; i < header.SlotCount; i++)
        {
-           var slotOffset = SlottedPageHeader.Size + (i * SlotEntry.Size);
+           int slotOffset = SlottedPageHeader.Size + i * SlotEntry.Size;
            var slot = SlotEntry.ReadFrom(pageBuffer.Slice(slotOffset, SlotEntry.Size));

            if ((slot.Flags & SlotFlags.Deleted) != 0 || slot.Length == 0)
                continue;

-           var dataEnd = slot.Offset + slot.Length;
+           int dataEnd = slot.Offset + slot.Length;
            if (slot.Offset < slotArrayEnd || dataEnd > pageBuffer.Length)
|
if (slot.Offset < slotArrayEnd || dataEnd > pageBuffer.Length)
|
||||||
return SlottedPageDefragmentationResult.None;
|
return SlottedPageDefragmentationResult.None;
|
||||||
|
|
||||||
var slotData = pageBuffer.Slice(slot.Offset, slot.Length).ToArray();
|
byte[] slotData = pageBuffer.Slice(slot.Offset, slot.Length).ToArray();
|
||||||
activeSlots.Add((i, slot, slotData));
|
activeSlots.Add((i, slot, slotData));
|
||||||
}
|
}
|
||||||
|
|
||||||
var newFreeSpaceStart = (ushort)slotArrayEnd;
|
var newFreeSpaceStart = (ushort)slotArrayEnd;
|
||||||
var writeCursor = pageBuffer.Length;
|
int writeCursor = pageBuffer.Length;
|
||||||
var changed = false;
|
var changed = false;
|
||||||
var relocatedSlots = 0;
|
var relocatedSlots = 0;
|
||||||
var oldFreeBytes = header.AvailableFreeSpace;
|
int oldFreeBytes = header.AvailableFreeSpace;
|
||||||
|
|
||||||
for (var i = 0; i < activeSlots.Count; i++)
|
for (var i = 0; i < activeSlots.Count; i++)
|
||||||
{
|
{
|
||||||
var (slotIndex, slot, slotData) = activeSlots[i];
|
(ushort slotIndex, var slot, byte[] slotData) = activeSlots[i];
|
||||||
writeCursor -= slotData.Length;
|
writeCursor -= slotData.Length;
|
||||||
if (writeCursor < newFreeSpaceStart)
|
if (writeCursor < newFreeSpaceStart)
|
||||||
return SlottedPageDefragmentationResult.None;
|
return SlottedPageDefragmentationResult.None;
|
||||||
@@ -786,57 +807,24 @@ public sealed class PageFile : IDisposable
|
|||||||
changed = true;
|
changed = true;
|
||||||
}
|
}
|
||||||
|
|
||||||
var slotOffset = SlottedPageHeader.Size + (slotIndex * SlotEntry.Size);
|
int slotOffset = SlottedPageHeader.Size + slotIndex * SlotEntry.Size;
|
||||||
slot.WriteTo(pageBuffer.Slice(slotOffset, SlotEntry.Size));
|
slot.WriteTo(pageBuffer.Slice(slotOffset, SlotEntry.Size));
|
||||||
}
|
}
|
||||||
|
|
||||||
if (writeCursor > newFreeSpaceStart)
|
if (writeCursor > newFreeSpaceStart)
|
||||||
{
|
|
||||||
pageBuffer.Slice(newFreeSpaceStart, writeCursor - newFreeSpaceStart).Clear();
|
pageBuffer.Slice(newFreeSpaceStart, writeCursor - newFreeSpaceStart).Clear();
|
||||||
}
|
|
||||||
|
|
||||||
if (header.FreeSpaceStart != newFreeSpaceStart || header.FreeSpaceEnd != writeCursor)
|
if (header.FreeSpaceStart != newFreeSpaceStart || header.FreeSpaceEnd != writeCursor) changed = true;
|
||||||
{
|
|
||||||
changed = true;
|
|
||||||
}
|
|
||||||
|
|
||||||
header.FreeSpaceStart = newFreeSpaceStart;
|
header.FreeSpaceStart = newFreeSpaceStart;
|
||||||
header.FreeSpaceEnd = (ushort)writeCursor;
|
header.FreeSpaceEnd = (ushort)writeCursor;
|
||||||
header.WriteTo(pageBuffer);
|
header.WriteTo(pageBuffer);
|
||||||
|
|
||||||
var newFreeBytes = header.AvailableFreeSpace;
|
int newFreeBytes = header.AvailableFreeSpace;
|
||||||
var reclaimedBytes = Math.Max(0, newFreeBytes - oldFreeBytes);
|
int reclaimedBytes = Math.Max(0, newFreeBytes - oldFreeBytes);
|
||||||
return new SlottedPageDefragmentationResult(changed, reclaimedBytes, relocatedSlots);
|
return new SlottedPageDefragmentationResult(changed, reclaimedBytes, relocatedSlots);
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
|
||||||
/// Releases resources used by the page file.
|
|
||||||
/// </summary>
|
|
||||||
public void Dispose()
|
|
||||||
{
|
|
||||||
if (_disposed)
|
|
||||||
return;
|
|
||||||
|
|
||||||
lock (_lock)
|
|
||||||
{
|
|
||||||
// 1. Flush any pending writes from memory-mapped file
|
|
||||||
if (_fileStream != null)
|
|
||||||
{
|
|
||||||
_fileStream.Flush(flushToDisk: true);
|
|
||||||
}
|
|
||||||
|
|
||||||
// 2. Close memory-mapped file first
|
|
||||||
_mappedFile?.Dispose();
|
|
||||||
|
|
||||||
// 3. Then close file stream
|
|
||||||
_fileStream?.Dispose();
|
|
||||||
|
|
||||||
_disposed = true;
|
|
||||||
}
|
|
||||||
|
|
||||||
GC.SuppressFinalize(this);
|
|
||||||
}
|
|
||||||
|
|
||||||
private void EnsureFileOpen()
|
private void EnsureFileOpen()
|
||||||
{
|
{
|
||||||
if (_fileStream == null || _mappedFile == null)
|
if (_fileStream == null || _mappedFile == null)
|
||||||
@@ -846,26 +834,23 @@ public sealed class PageFile : IDisposable
|
|||||||
private List<uint> CollectFreePageIds(bool includeEmptyPages)
|
private List<uint> CollectFreePageIds(bool includeEmptyPages)
|
||||||
{
|
{
|
||||||
var freePages = new HashSet<uint>();
|
var freePages = new HashSet<uint>();
|
||||||
if (_nextPageId <= 2)
|
if (NextPageId <= 2)
|
||||||
return [];
|
return [];
|
||||||
|
|
||||||
var pageBuffer = new byte[_config.PageSize];
|
var pageBuffer = new byte[_config.PageSize];
|
||||||
|
|
||||||
var seen = new HashSet<uint>();
|
var seen = new HashSet<uint>();
|
||||||
var current = _firstFreePageId;
|
uint current = _firstFreePageId;
|
||||||
while (current != 0 && current < _nextPageId && seen.Add(current))
|
while (current != 0 && current < NextPageId && seen.Add(current))
|
||||||
{
|
{
|
||||||
if (current > 1)
|
if (current > 1) freePages.Add(current);
|
||||||
{
|
|
||||||
freePages.Add(current);
|
|
||||||
}
|
|
||||||
|
|
||||||
ReadPage(current, pageBuffer);
|
ReadPage(current, pageBuffer);
|
||||||
var header = PageHeader.ReadFrom(pageBuffer);
|
var header = PageHeader.ReadFrom(pageBuffer);
|
||||||
current = header.NextPageId;
|
current = header.NextPageId;
|
||||||
}
|
}
|
||||||
|
|
||||||
for (uint pageId = 2; pageId < _nextPageId; pageId++)
|
for (uint pageId = 2; pageId < NextPageId; pageId++)
|
||||||
{
|
{
|
||||||
ReadPage(pageId, pageBuffer);
|
ReadPage(pageId, pageBuffer);
|
||||||
var header = PageHeader.ReadFrom(pageBuffer);
|
var header = PageHeader.ReadFrom(pageBuffer);
|
||||||
@@ -876,10 +861,7 @@ public sealed class PageFile : IDisposable
|
|||||||
continue;
|
continue;
|
||||||
}
|
}
|
||||||
|
|
||||||
if (includeEmptyPages && IsTrulyEmptyPage(pageBuffer))
|
if (includeEmptyPages && IsTrulyEmptyPage(pageBuffer)) freePages.Add(pageId);
|
||||||
{
|
|
||||||
freePages.Add(pageId);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
var ordered = freePages.ToList();
|
var ordered = freePages.ToList();
|
||||||
@@ -900,8 +882,8 @@ public sealed class PageFile : IDisposable
|
|||||||
|
|
||||||
for (var i = 0; i < sortedFreePageIds.Count; i++)
|
for (var i = 0; i < sortedFreePageIds.Count; i++)
|
||||||
{
|
{
|
||||||
var pageId = sortedFreePageIds[i];
|
uint pageId = sortedFreePageIds[i];
|
||||||
var nextPageId = i + 1 < sortedFreePageIds.Count ? sortedFreePageIds[i + 1] : 0;
|
uint nextPageId = i + 1 < sortedFreePageIds.Count ? sortedFreePageIds[i + 1] : 0;
|
||||||
|
|
||||||
Array.Clear(pageBuffer, 0, pageBuffer.Length);
|
Array.Clear(pageBuffer, 0, pageBuffer.Length);
|
||||||
var freeHeader = new PageHeader
|
var freeHeader = new PageHeader
|
||||||
@@ -928,7 +910,7 @@ public sealed class PageFile : IDisposable
|
|||||||
|
|
||||||
private bool IsReclaimableTailPage(uint pageId, HashSet<uint> explicitFreePages, byte[] pageBuffer)
|
private bool IsReclaimableTailPage(uint pageId, HashSet<uint> explicitFreePages, byte[] pageBuffer)
|
||||||
{
|
{
|
||||||
if (pageId <= 1 || pageId >= _nextPageId)
|
if (pageId <= 1 || pageId >= NextPageId)
|
||||||
return false;
|
return false;
|
||||||
|
|
||||||
if (explicitFreePages.Contains(pageId))
|
if (explicitFreePages.Contains(pageId))
|
||||||
@@ -945,10 +927,8 @@ public sealed class PageFile : IDisposable
|
|||||||
private static bool IsTrulyEmptyPage(ReadOnlySpan<byte> pageBuffer)
|
private static bool IsTrulyEmptyPage(ReadOnlySpan<byte> pageBuffer)
|
||||||
{
|
{
|
||||||
for (var i = 0; i < pageBuffer.Length; i++)
|
for (var i = 0; i < pageBuffer.Length; i++)
|
||||||
{
|
|
||||||
if (pageBuffer[i] != 0)
|
if (pageBuffer[i] != 0)
|
||||||
return false;
|
return false;
|
||||||
}
|
|
||||||
|
|
||||||
return true;
|
return true;
|
||||||
}
|
}
|
||||||
|
|||||||
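The defragmentation hunk above first snapshots every live slot's payload (`activeSlots` holds `byte[]` copies) and only then repacks the payloads against the end of the page, leaving one contiguous free region between the slot array and the data. A standalone sketch of that repacking step, using a simplified hypothetical layout rather than the engine's actual `SlotEntry`/`SlottedPageHeader` types:

```csharp
using System;
using System.Collections.Generic;

static class DefragSketch
{
    // Simplified slot entry: payload location, length, tombstone flag.
    public record struct Slot(int Offset, int Length, bool Deleted);

    // Repacks live payloads against the end of the page and returns the new
    // end of free space (the role of writeCursor / FreeSpaceEnd above).
    public static int Repack(byte[] page, List<Slot> slots)
    {
        // Snapshot live payloads first, so a later write cannot clobber a
        // payload that has not been moved yet.
        var live = new List<(int Index, byte[] Data)>();
        for (var i = 0; i < slots.Count; i++)
            if (!slots[i].Deleted)
                live.Add((i, page[slots[i].Offset..(slots[i].Offset + slots[i].Length)]));

        int writeCursor = page.Length;
        foreach (var (index, data) in live)
        {
            writeCursor -= data.Length;
            data.CopyTo(page, writeCursor);
            slots[index] = slots[index] with { Offset = writeCursor };
        }
        return writeCursor;
    }

    public static void Main()
    {
        var page = new byte[64];
        // Two live 8-byte payloads with a deleted slot (a hole) between them.
        var slots = new List<Slot> { new(40, 8, false), new(48, 8, true), new(56, 8, false) };
        Console.WriteLine(Repack(page, slots)); // 48: 64 - 2 * 8 bytes of live data
    }
}
```

The snapshot-then-write order is the load-bearing detail: moving payloads in place without copies can overwrite a not-yet-moved payload when source and destination ranges overlap.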
@@ -11,43 +11,31 @@ namespace ZB.MOM.WW.CBDD.Core.Storage;
 public struct PageHeader
 {
     /// <summary>Page ID (offset in pages from start of file)</summary>
-    [FieldOffset(0)]
-    public uint PageId;
+    [FieldOffset(0)] public uint PageId;

     /// <summary>Type of this page</summary>
-    [FieldOffset(4)]
-    public PageType PageType;
+    [FieldOffset(4)] public PageType PageType;

     /// <summary>Number of free bytes in this page</summary>
-    [FieldOffset(5)]
-    public ushort FreeBytes;
+    [FieldOffset(5)] public ushort FreeBytes;

     /// <summary>ID of next page in linked list (0 if none). For Page 0 (Header), this points to the First Free Page.</summary>
-    [FieldOffset(7)]
-    public uint NextPageId;
+    [FieldOffset(7)] public uint NextPageId;

     /// <summary>Transaction ID that last modified this page</summary>
-    [FieldOffset(11)]
-    public ulong TransactionId;
+    [FieldOffset(11)] public ulong TransactionId;

     /// <summary>Checksum for data integrity (CRC32)</summary>
-    [FieldOffset(19)]
-    public uint Checksum;
+    [FieldOffset(19)] public uint Checksum;

     /// <summary>Dictionary Root Page ID (Only used in Page 0 / File Header)</summary>
-    [FieldOffset(23)]
-    public uint DictionaryRootPageId;
+    [FieldOffset(23)] public uint DictionaryRootPageId;

-    [FieldOffset(27)]
-    private byte _reserved5;
-    [FieldOffset(28)]
-    private byte _reserved6;
-    [FieldOffset(29)]
-    private byte _reserved7;
-    [FieldOffset(30)]
-    private byte _reserved8;
-    [FieldOffset(31)]
-    private byte _reserved9;
+    [FieldOffset(27)] private byte _reserved5;
+    [FieldOffset(28)] private byte _reserved6;
+    [FieldOffset(29)] private byte _reserved7;
+    [FieldOffset(30)] private byte _reserved8;
+    [FieldOffset(31)] private byte _reserved9;

     /// <summary>
     /// Writes the header to a span
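Joining each `[FieldOffset]` attribute onto its field is purely cosmetic; the explicit layout is unchanged. The offsets in the hunk above imply a gap-free 32-byte header, which a quick check over the declared `(offset, size)` pairs can confirm:

```csharp
using System;

static class PageHeaderLayout
{
    // (name, offset, size) exactly as declared by the [FieldOffset] attributes above.
    static readonly (string Name, int Offset, int Size)[] Fields =
    {
        ("PageId", 0, 4),
        ("PageType", 4, 1),
        ("FreeBytes", 5, 2),
        ("NextPageId", 7, 4),
        ("TransactionId", 11, 8),
        ("Checksum", 19, 4),
        ("DictionaryRootPageId", 23, 4),
        ("_reserved5.._reserved9", 27, 5),
    };

    // Walks the declared offsets, verifies the layout is gap-free and
    // non-overlapping, and returns the total header size.
    public static int ComputeSize()
    {
        var expected = 0;
        foreach (var (name, offset, size) in Fields)
        {
            if (offset != expected)
                throw new InvalidOperationException($"gap or overlap before {name}");
            expected = offset + size;
        }
        return expected;
    }

    public static void Main() => Console.WriteLine($"header size = {ComputeSize()} bytes"); // 32
}
```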
@@ -1,3 +1,4 @@
+using System.Buffers.Binary;
 using System.Runtime.InteropServices;

 namespace ZB.MOM.WW.CBDD.Core.Storage;
@@ -10,36 +11,28 @@ namespace ZB.MOM.WW.CBDD.Core.Storage;
 public struct SlottedPageHeader
 {
     /// <summary>Page ID</summary>
-    [FieldOffset(0)]
-    public uint PageId;
+    [FieldOffset(0)] public uint PageId;

     /// <summary>Type of page (Data, Overflow, Index, Metadata)</summary>
-    [FieldOffset(4)]
-    public PageType PageType;
+    [FieldOffset(4)] public PageType PageType;

     /// <summary>Number of slot entries in this page</summary>
-    [FieldOffset(8)]
-    public ushort SlotCount;
+    [FieldOffset(8)] public ushort SlotCount;

     /// <summary>Offset where free space starts (grows down with slots)</summary>
-    [FieldOffset(10)]
-    public ushort FreeSpaceStart;
+    [FieldOffset(10)] public ushort FreeSpaceStart;

     /// <summary>Offset where free space ends (grows up with data)</summary>
-    [FieldOffset(12)]
-    public ushort FreeSpaceEnd;
+    [FieldOffset(12)] public ushort FreeSpaceEnd;

     /// <summary>Next overflow page ID (0 if none)</summary>
-    [FieldOffset(14)]
-    public uint NextOverflowPage;
+    [FieldOffset(14)] public uint NextOverflowPage;

     /// <summary>Transaction ID that last modified this page</summary>
-    [FieldOffset(18)]
-    public uint TransactionId;
+    [FieldOffset(18)] public uint TransactionId;

     /// <summary>Reserved for future use</summary>
-    [FieldOffset(22)]
-    public ushort Reserved;
+    [FieldOffset(22)] public ushort Reserved;

     public const int Size = 24;

@@ -90,16 +83,13 @@ public struct SlottedPageHeader
 public struct SlotEntry
 {
     /// <summary>Offset to document data within page</summary>
-    [FieldOffset(0)]
-    public ushort Offset;
+    [FieldOffset(0)] public ushort Offset;

     /// <summary>Length of document data in bytes</summary>
-    [FieldOffset(2)]
-    public ushort Length;
+    [FieldOffset(2)] public ushort Length;

     /// <summary>Slot flags (deleted, overflow, etc.)</summary>
-    [FieldOffset(4)]
-    public SlotFlags Flags;
+    [FieldOffset(4)] public SlotFlags Flags;

     public const int Size = 8;

@@ -144,7 +134,7 @@ public enum SlotFlags : uint
     HasOverflow = 1 << 1,

     /// <summary>Document data is compressed</summary>
-    Compressed = 1 << 2,
+    Compressed = 1 << 2
 }

 /// <summary>
@@ -157,6 +147,7 @@ public readonly struct DocumentLocation
     /// Gets the page identifier containing the document.
     /// </summary>
     public uint PageId { get; init; }

     /// <summary>
     /// Gets the slot index within the page.
     /// </summary>
@@ -182,8 +173,8 @@ public readonly struct DocumentLocation
         if (destination.Length < 6)
             throw new ArgumentException("Destination must be at least 6 bytes", nameof(destination));

-        System.Buffers.Binary.BinaryPrimitives.WriteUInt32LittleEndian(destination, PageId);
-        System.Buffers.Binary.BinaryPrimitives.WriteUInt16LittleEndian(destination.Slice(4), SlotIndex);
+        BinaryPrimitives.WriteUInt32LittleEndian(destination, PageId);
+        BinaryPrimitives.WriteUInt16LittleEndian(destination.Slice(4), SlotIndex);
     }

     /// <summary>
@@ -195,8 +186,8 @@ public readonly struct DocumentLocation
         if (source.Length < 6)
             throw new ArgumentException("Source must be at least 6 bytes", nameof(source));

-        var pageId = System.Buffers.Binary.BinaryPrimitives.ReadUInt32LittleEndian(source);
-        var slotIndex = System.Buffers.Binary.BinaryPrimitives.ReadUInt16LittleEndian(source.Slice(4));
+        uint pageId = BinaryPrimitives.ReadUInt32LittleEndian(source);
+        ushort slotIndex = BinaryPrimitives.ReadUInt16LittleEndian(source.Slice(4));

         return new DocumentLocation(pageId, slotIndex);
     }
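The `WriteTo`/`ReadFrom` pair above (now calling `BinaryPrimitives` directly via the added `using System.Buffers.Binary;`) fixes a 6-byte little-endian wire layout: a `uint` page id followed by a `ushort` slot index. A self-contained round-trip of that layout:

```csharp
using System;
using System.Buffers.Binary;

static class DocumentLocationWire
{
    // Encodes the 6-byte little-endian layout used by DocumentLocation.WriteTo:
    // uint32 PageId at offset 0, uint16 SlotIndex at offset 4.
    public static byte[] Encode(uint pageId, ushort slotIndex)
    {
        var buf = new byte[6];
        BinaryPrimitives.WriteUInt32LittleEndian(buf, pageId);
        BinaryPrimitives.WriteUInt16LittleEndian(buf.AsSpan(4), slotIndex);
        return buf;
    }

    // Mirrors DocumentLocation.ReadFrom.
    public static (uint PageId, ushort SlotIndex) Decode(ReadOnlySpan<byte> source) =>
        (BinaryPrimitives.ReadUInt32LittleEndian(source),
         BinaryPrimitives.ReadUInt16LittleEndian(source.Slice(4)));

    public static void Main()
    {
        var bytes = Encode(0x12345678, 0xABCD);
        // Little-endian: least significant byte first in each field.
        Console.WriteLine(Convert.ToHexString(bytes)); // 78563412CDAB
        Console.WriteLine(Decode(bytes)); // (305419896, 43981)
    }
}
```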
@@ -1,6 +1,5 @@
 using System.Buffers.Binary;
 using System.Runtime.InteropServices;
-using ZB.MOM.WW.CBDD.Core.Indexing;
 using ZB.MOM.WW.CBDD.Core.Indexing.Internal;

 namespace ZB.MOM.WW.CBDD.Core.Storage;
@@ -59,49 +58,70 @@ internal struct SpatialPage
     /// </summary>
     /// <param name="page">The page buffer.</param>
     /// <returns><see langword="true" /> if the page is a leaf node; otherwise, <see langword="false" />.</returns>
-    public static bool GetIsLeaf(ReadOnlySpan<byte> page) => page[IsLeafOffset] == 1;
+    public static bool GetIsLeaf(ReadOnlySpan<byte> page)
+    {
+        return page[IsLeafOffset] == 1;
+    }

     /// <summary>
     /// Gets the tree level stored in the page.
     /// </summary>
     /// <param name="page">The page buffer.</param>
     /// <returns>The level value.</returns>
-    public static byte GetLevel(ReadOnlySpan<byte> page) => page[LevelOffset];
+    public static byte GetLevel(ReadOnlySpan<byte> page)
+    {
+        return page[LevelOffset];
+    }

     /// <summary>
     /// Gets the number of entries in the page.
     /// </summary>
     /// <param name="page">The page buffer.</param>
     /// <returns>The number of entries.</returns>
-    public static ushort GetEntryCount(ReadOnlySpan<byte> page) => BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(EntryCountOffset));
+    public static ushort GetEntryCount(ReadOnlySpan<byte> page)
+    {
+        return BinaryPrimitives.ReadUInt16LittleEndian(page.Slice(EntryCountOffset));
+    }

     /// <summary>
     /// Sets the number of entries in the page.
     /// </summary>
     /// <param name="page">The page buffer.</param>
     /// <param name="count">The entry count to set.</param>
-    public static void SetEntryCount(Span<byte> page, ushort count) => BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(EntryCountOffset), count);
+    public static void SetEntryCount(Span<byte> page, ushort count)
+    {
+        BinaryPrimitives.WriteUInt16LittleEndian(page.Slice(EntryCountOffset), count);
+    }

     /// <summary>
     /// Gets the parent page identifier.
     /// </summary>
     /// <param name="page">The page buffer.</param>
     /// <returns>The parent page identifier.</returns>
-    public static uint GetParentPageId(ReadOnlySpan<byte> page) => BinaryPrimitives.ReadUInt32LittleEndian(page.Slice(ParentPageIdOffset));
+    public static uint GetParentPageId(ReadOnlySpan<byte> page)
+    {
+        return BinaryPrimitives.ReadUInt32LittleEndian(page.Slice(ParentPageIdOffset));
+    }

     /// <summary>
     /// Sets the parent page identifier.
     /// </summary>
     /// <param name="page">The page buffer.</param>
     /// <param name="parentId">The parent page identifier.</param>
-    public static void SetParentPageId(Span<byte> page, uint parentId) => BinaryPrimitives.WriteUInt32LittleEndian(page.Slice(ParentPageIdOffset), parentId);
+    public static void SetParentPageId(Span<byte> page, uint parentId)
+    {
+        BinaryPrimitives.WriteUInt32LittleEndian(page.Slice(ParentPageIdOffset), parentId);
+    }

     /// <summary>
     /// Gets the maximum number of entries that can fit in a page.
     /// </summary>
     /// <param name="pageSize">The page size in bytes.</param>
     /// <returns>The maximum number of entries.</returns>
-    public static int GetMaxEntries(int pageSize) => (pageSize - DataOffset) / EntrySize;
+    public static int GetMaxEntries(int pageSize)
+    {
+        return (pageSize - DataOffset) / EntrySize;
+    }

     /// <summary>
     /// Writes an entry at the specified index.
@@ -112,7 +132,7 @@ internal struct SpatialPage
     /// <param name="pointer">The document location pointer.</param>
     public static void WriteEntry(Span<byte> page, int index, GeoBox mbr, DocumentLocation pointer)
     {
-        int offset = DataOffset + (index * EntrySize);
+        int offset = DataOffset + index * EntrySize;
         var entrySpan = page.Slice(offset, EntrySize);

         // Write MBR (4 doubles)
@@ -135,7 +155,7 @@ internal struct SpatialPage
     /// <param name="pointer">When this method returns, contains the entry document location.</param>
     public static void ReadEntry(ReadOnlySpan<byte> page, int index, out GeoBox mbr, out DocumentLocation pointer)
     {
-        int offset = DataOffset + (index * EntrySize);
+        int offset = DataOffset + index * EntrySize;
         var entrySpan = page.Slice(offset, EntrySize);

         var doubles = MemoryMarshal.Cast<byte, double>(entrySpan.Slice(0, 32));
@@ -153,13 +173,14 @@ internal struct SpatialPage
         ushort count = GetEntryCount(page);
         if (count == 0) return GeoBox.Empty;

-        GeoBox result = GeoBox.Empty;
-        for (int i = 0; i < count; i++)
+        var result = GeoBox.Empty;
+        for (var i = 0; i < count; i++)
         {
             ReadEntry(page, i, out var mbr, out _);
             if (i == 0) result = mbr;
             else result = result.ExpandTo(mbr);
         }

         return result;
     }
 }
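`GetMaxEntries` divides the space after the fixed page header by the per-entry size, where each entry holds a 4-double MBR (32 bytes, as the `Slice(0, 32)` cast in `ReadEntry` shows) plus a document pointer. A sketch of that capacity arithmetic using hypothetical values for `DataOffset` and `EntrySize` (the real constants live elsewhere in `SpatialPage` and are not shown in this diff):

```csharp
using System;

static class SpatialCapacitySketch
{
    // Hypothetical constants for illustration only.
    const int DataOffset = 16; // assumed fixed header size
    const int EntrySize = 40;  // assumed: 32-byte MBR + pointer, padded

    // Mirrors the shape of SpatialPage.GetMaxEntries: whole entries only,
    // so any remainder at the end of the page is simply unused.
    public static int GetMaxEntries(int pageSize) => (pageSize - DataOffset) / EntrySize;

    public static void Main()
    {
        Console.WriteLine(GetMaxEntries(4096)); // (4096 - 16) / 40 = 102
    }
}
```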
@@ -1,9 +1,5 @@
-using System;
-using System.Collections.Generic;
-using System.IO;
-using ZB.MOM.WW.CBDD.Bson;
-using ZB.MOM.WW.CBDD.Core.Indexing;
 using ZB.MOM.WW.CBDD.Core.Collections;
+using ZB.MOM.WW.CBDD.Core.Indexing;

 namespace ZB.MOM.WW.CBDD.Core.Storage;
@@ -81,38 +77,6 @@ public sealed partial class StorageEngine
             .FirstOrDefault(x => string.Equals(x.Name, name, StringComparison.OrdinalIgnoreCase));
     }

-    /// <summary>
-    /// Returns all collection metadata entries currently registered in page 1.
-    /// </summary>
-    public IReadOnlyList<CollectionMetadata> GetAllCollectionMetadata()
-    {
-        var result = new List<CollectionMetadata>();
-        var buffer = new byte[PageSize];
-        ReadPage(1, null, buffer);
-
-        var header = SlottedPageHeader.ReadFrom(buffer);
-        if (header.PageType != PageType.Collection || header.SlotCount == 0)
-            return result;
-
-        for (ushort i = 0; i < header.SlotCount; i++)
-        {
-            var slotOffset = SlottedPageHeader.Size + (i * SlotEntry.Size);
-            var slot = SlotEntry.ReadFrom(buffer.AsSpan(slotOffset));
-            if ((slot.Flags & SlotFlags.Deleted) != 0)
-                continue;
-
-            if (slot.Offset < SlottedPageHeader.Size || slot.Offset + slot.Length > buffer.Length)
-                continue;
-
-            if (TryDeserializeCollectionMetadata(buffer.AsSpan(slot.Offset, slot.Length), out var metadata) && metadata != null)
-            {
-                result.Add(metadata);
-            }
-        }
-
-        return result;
-    }
-
     /// <summary>
     /// Saves collection metadata to the metadata page.
     /// </summary>
@@ -133,10 +97,7 @@ public sealed partial class StorageEngine
                 writer.Write((byte)idx.Type);
                 writer.Write(idx.RootPageId);
                 writer.Write(idx.PropertyPaths.Length);
-                foreach (var path in idx.PropertyPaths)
-                {
-                    writer.Write(path);
-                }
+                foreach (string path in idx.PropertyPaths) writer.Write(path);

                 if (idx.Type == IndexType.Vector)
                 {
@@ -145,7 +106,7 @@ public sealed partial class StorageEngine
             }
         }

-        var newData = stream.ToArray();
+        byte[] newData = stream.ToArray();

         var buffer = new byte[PageSize];
         ReadPage(1, null, buffer);
@@ -155,7 +116,7 @@ public sealed partial class StorageEngine

         for (ushort i = 0; i < header.SlotCount; i++)
         {
-            var slotOffset = SlottedPageHeader.Size + (i * SlotEntry.Size);
+            int slotOffset = SlottedPageHeader.Size + i * SlotEntry.Size;
             var slot = SlotEntry.ReadFrom(buffer.AsSpan(slotOffset));
             if ((slot.Flags & SlotFlags.Deleted) != 0) continue;

@@ -163,7 +124,7 @@ public sealed partial class StorageEngine
             {
                 using var ms = new MemoryStream(buffer, slot.Offset, slot.Length, false);
                 using var reader = new BinaryReader(ms);
-                var name = reader.ReadString();
+                string name = reader.ReadString();

                 if (string.Equals(name, metadata.Name, StringComparison.OrdinalIgnoreCase))
                 {
@@ -171,22 +132,23 @@ public sealed partial class StorageEngine
                     break;
                 }
             }
-            catch { }
+            catch
+            {
+            }
         }

         if (existingSlotIndex >= 0)
         {
-            var slotOffset = SlottedPageHeader.Size + (existingSlotIndex * SlotEntry.Size);
+            int slotOffset = SlottedPageHeader.Size + existingSlotIndex * SlotEntry.Size;
             var slot = SlotEntry.ReadFrom(buffer.AsSpan(slotOffset));
             slot.Flags |= SlotFlags.Deleted;
             slot.WriteTo(buffer.AsSpan(slotOffset));
         }

         if (header.AvailableFreeSpace < newData.Length + SlotEntry.Size)
-        {
             // Compact logic omitted as per current architecture
-            throw new InvalidOperationException("Not enough space in Metadata Page (Page 1) to save collection metadata.");
-        }
+            throw new InvalidOperationException(
+                "Not enough space in Metadata Page (Page 1) to save collection metadata.");

         int docOffset = header.FreeSpaceEnd - newData.Length;
         newData.CopyTo(buffer.AsSpan(docOffset));
@@ -202,7 +164,7 @@ public sealed partial class StorageEngine
             header.SlotCount++;
         }

-        var newSlotEntryOffset = SlottedPageHeader.Size + (slotIndex * SlotEntry.Size);
+        int newSlotEntryOffset = SlottedPageHeader.Size + slotIndex * SlotEntry.Size;
         var newSlot = new SlotEntry
         {
             Offset = (ushort)docOffset,
@@ -213,14 +175,52 @@ public sealed partial class StorageEngine

         header.FreeSpaceEnd = (ushort)docOffset;
         if (existingSlotIndex == -1)
-        {
-            header.FreeSpaceStart = (ushort)(SlottedPageHeader.Size + (header.SlotCount * SlotEntry.Size));
-        }
+            header.FreeSpaceStart = (ushort)(SlottedPageHeader.Size + header.SlotCount * SlotEntry.Size);

         header.WriteTo(buffer);
         WritePageImmediate(1, buffer);
     }

+    /// <summary>
+    /// Registers all BSON keys used by a set of mappers into the global dictionary.
+    /// </summary>
+    /// <param name="mappers">The mappers whose keys should be registered.</param>
+    public void RegisterMappers(IEnumerable<IDocumentMapper> mappers)
+    {
+        var allKeys = mappers.SelectMany(m => m.UsedKeys).Distinct();
+        RegisterKeys(allKeys);
+    }
+
+    /// <summary>
+    /// Returns all collection metadata entries currently registered in page 1.
+    /// </summary>
+    public IReadOnlyList<CollectionMetadata> GetAllCollectionMetadata()
+    {
+        var result = new List<CollectionMetadata>();
+        var buffer = new byte[PageSize];
+        ReadPage(1, null, buffer);
+
+        var header = SlottedPageHeader.ReadFrom(buffer);
+        if (header.PageType != PageType.Collection || header.SlotCount == 0)
|
||||||
|
return result;
|
||||||
|
|
||||||
|
for (ushort i = 0; i < header.SlotCount; i++)
|
||||||
|
{
|
||||||
|
int slotOffset = SlottedPageHeader.Size + i * SlotEntry.Size;
|
||||||
|
var slot = SlotEntry.ReadFrom(buffer.AsSpan(slotOffset));
|
||||||
|
if ((slot.Flags & SlotFlags.Deleted) != 0)
|
||||||
|
continue;
|
||||||
|
|
||||||
|
if (slot.Offset < SlottedPageHeader.Size || slot.Offset + slot.Length > buffer.Length)
|
||||||
|
continue;
|
||||||
|
|
||||||
|
if (TryDeserializeCollectionMetadata(buffer.AsSpan(slot.Offset, slot.Length), out var metadata) &&
|
||||||
|
metadata != null) result.Add(metadata);
|
||||||
|
}
|
||||||
|
|
||||||
|
return result;
|
||||||
|
}
|
||||||
|
|
||||||
private static bool TryDeserializeCollectionMetadata(ReadOnlySpan<byte> rawBytes, out CollectionMetadata? metadata)
|
private static bool TryDeserializeCollectionMetadata(ReadOnlySpan<byte> rawBytes, out CollectionMetadata? metadata)
|
||||||
{
|
{
|
||||||
metadata = null;
|
metadata = null;
|
||||||
@@ -230,16 +230,16 @@ public sealed partial class StorageEngine
|
|||||||
using var ms = new MemoryStream(rawBytes.ToArray());
|
using var ms = new MemoryStream(rawBytes.ToArray());
|
||||||
using var reader = new BinaryReader(ms);
|
using var reader = new BinaryReader(ms);
|
||||||
|
|
||||||
var collName = reader.ReadString();
|
string collName = reader.ReadString();
|
||||||
var parsed = new CollectionMetadata { Name = collName };
|
var parsed = new CollectionMetadata { Name = collName };
|
||||||
parsed.PrimaryRootPageId = reader.ReadUInt32();
|
parsed.PrimaryRootPageId = reader.ReadUInt32();
|
||||||
parsed.SchemaRootPageId = reader.ReadUInt32();
|
parsed.SchemaRootPageId = reader.ReadUInt32();
|
||||||
|
|
||||||
var indexCount = reader.ReadInt32();
|
int indexCount = reader.ReadInt32();
|
||||||
if (indexCount < 0)
|
if (indexCount < 0)
|
||||||
return false;
|
return false;
|
||||||
|
|
||||||
for (int j = 0; j < indexCount; j++)
|
for (var j = 0; j < indexCount; j++)
|
||||||
{
|
{
|
||||||
var idx = new IndexMetadata
|
var idx = new IndexMetadata
|
||||||
{
|
{
|
||||||
@@ -249,12 +249,12 @@ public sealed partial class StorageEngine
|
|||||||
RootPageId = reader.ReadUInt32()
|
RootPageId = reader.ReadUInt32()
|
||||||
};
|
};
|
||||||
|
|
||||||
var pathCount = reader.ReadInt32();
|
int pathCount = reader.ReadInt32();
|
||||||
if (pathCount < 0)
|
if (pathCount < 0)
|
||||||
return false;
|
return false;
|
||||||
|
|
||||||
idx.PropertyPaths = new string[pathCount];
|
idx.PropertyPaths = new string[pathCount];
|
||||||
for (int k = 0; k < pathCount; k++)
|
for (var k = 0; k < pathCount; k++)
|
||||||
idx.PropertyPaths[k] = reader.ReadString();
|
idx.PropertyPaths[k] = reader.ReadString();
|
||||||
|
|
||||||
if (idx.Type == IndexType.Vector)
|
if (idx.Type == IndexType.Vector)
|
||||||
@@ -274,14 +274,4 @@ public sealed partial class StorageEngine
|
|||||||
return false;
|
return false;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
|
||||||
/// Registers all BSON keys used by a set of mappers into the global dictionary.
|
|
||||||
/// </summary>
|
|
||||||
/// <param name="mappers">The mappers whose keys should be registered.</param>
|
|
||||||
public void RegisterMappers(IEnumerable<IDocumentMapper> mappers)
|
|
||||||
{
|
|
||||||
var allKeys = mappers.SelectMany(m => m.UsedKeys).Distinct();
|
|
||||||
RegisterKeys(allKeys);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
@@ -89,7 +89,8 @@ public sealed class CollectionCompressionRatioEntry
|
|||||||
/// <summary>
|
/// <summary>
|
||||||
/// Gets the compression ratio.
|
/// Gets the compression ratio.
|
||||||
/// </summary>
|
/// </summary>
|
||||||
public double CompressionRatio => BytesAfterCompression <= 0 ? 1.0 : (double)BytesBeforeCompression / BytesAfterCompression;
|
public double CompressionRatio =>
|
||||||
|
BytesAfterCompression <= 0 ? 1.0 : (double)BytesBeforeCompression / BytesAfterCompression;
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -182,7 +183,7 @@ public sealed partial class StorageEngine
|
|||||||
/// </summary>
|
/// </summary>
|
||||||
public IReadOnlyList<PageTypeUsageEntry> GetPageUsageByPageType()
|
public IReadOnlyList<PageTypeUsageEntry> GetPageUsageByPageType()
|
||||||
{
|
{
|
||||||
var pageCount = _pageFile.NextPageId;
|
uint pageCount = _pageFile.NextPageId;
|
||||||
var buffer = new byte[_pageFile.PageSize];
|
var buffer = new byte[_pageFile.PageSize];
|
||||||
var counts = new Dictionary<PageType, int>();
|
var counts = new Dictionary<PageType, int>();
|
||||||
|
|
||||||
@@ -190,7 +191,7 @@ public sealed partial class StorageEngine
|
|||||||
{
|
{
|
||||||
_pageFile.ReadPage(pageId, buffer);
|
_pageFile.ReadPage(pageId, buffer);
|
||||||
var pageType = PageHeader.ReadFrom(buffer).PageType;
|
var pageType = PageHeader.ReadFrom(buffer).PageType;
|
||||||
counts[pageType] = counts.TryGetValue(pageType, out var count) ? count + 1 : 1;
|
counts[pageType] = counts.TryGetValue(pageType, out int count) ? count + 1 : 1;
|
||||||
}
|
}
|
||||||
|
|
||||||
return counts
|
return counts
|
||||||
@@ -221,27 +222,23 @@ public sealed partial class StorageEngine
|
|||||||
pageIds.Add(metadata.SchemaRootPageId);
|
pageIds.Add(metadata.SchemaRootPageId);
|
||||||
|
|
||||||
foreach (var indexMetadata in metadata.Indexes)
|
foreach (var indexMetadata in metadata.Indexes)
|
||||||
{
|
|
||||||
if (indexMetadata.RootPageId != 0)
|
if (indexMetadata.RootPageId != 0)
|
||||||
pageIds.Add(indexMetadata.RootPageId);
|
pageIds.Add(indexMetadata.RootPageId);
|
||||||
}
|
|
||||||
|
|
||||||
foreach (var location in EnumeratePrimaryLocations(metadata))
|
foreach (var location in EnumeratePrimaryLocations(metadata))
|
||||||
{
|
{
|
||||||
pageIds.Add(location.PageId);
|
pageIds.Add(location.PageId);
|
||||||
if (TryReadFirstOverflowPage(location, out var firstOverflowPage))
|
if (TryReadFirstOverflowPage(location, out uint firstOverflowPage))
|
||||||
{
|
|
||||||
AddOverflowChainPages(pageIds, firstOverflowPage);
|
AddOverflowChainPages(pageIds, firstOverflowPage);
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
int data = 0;
|
var data = 0;
|
||||||
int overflow = 0;
|
var overflow = 0;
|
||||||
int indexPages = 0;
|
var indexPages = 0;
|
||||||
int other = 0;
|
var other = 0;
|
||||||
|
|
||||||
var pageBuffer = new byte[_pageFile.PageSize];
|
var pageBuffer = new byte[_pageFile.PageSize];
|
||||||
foreach (var pageId in pageIds)
|
foreach (uint pageId in pageIds)
|
||||||
{
|
{
|
||||||
if (pageId >= _pageFile.NextPageId)
|
if (pageId >= _pageFile.NextPageId)
|
||||||
continue;
|
continue;
|
||||||
@@ -250,22 +247,14 @@ public sealed partial class StorageEngine
|
|||||||
var pageType = PageHeader.ReadFrom(pageBuffer).PageType;
|
var pageType = PageHeader.ReadFrom(pageBuffer).PageType;
|
||||||
|
|
||||||
if (pageType == PageType.Data)
|
if (pageType == PageType.Data)
|
||||||
{
|
|
||||||
data++;
|
data++;
|
||||||
}
|
|
||||||
else if (pageType == PageType.Overflow)
|
else if (pageType == PageType.Overflow)
|
||||||
{
|
|
||||||
overflow++;
|
overflow++;
|
||||||
}
|
|
||||||
else if (pageType == PageType.Index || pageType == PageType.Vector || pageType == PageType.Spatial)
|
else if (pageType == PageType.Index || pageType == PageType.Vector || pageType == PageType.Spatial)
|
||||||
{
|
|
||||||
indexPages++;
|
indexPages++;
|
||||||
}
|
|
||||||
else
|
else
|
||||||
{
|
|
||||||
other++;
|
other++;
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
results.Add(new CollectionPageUsageEntry
|
results.Add(new CollectionPageUsageEntry
|
||||||
{
|
{
|
||||||
@@ -298,7 +287,8 @@ public sealed partial class StorageEngine
|
|||||||
|
|
||||||
foreach (var location in EnumeratePrimaryLocations(metadata))
|
foreach (var location in EnumeratePrimaryLocations(metadata))
|
||||||
{
|
{
|
||||||
if (!TryReadSlotPayloadStats(location, out var isCompressed, out var originalBytes, out var storedBytes))
|
if (!TryReadSlotPayloadStats(location, out bool isCompressed, out int originalBytes,
|
||||||
|
out int storedBytes))
|
||||||
continue;
|
continue;
|
||||||
|
|
||||||
docs++;
|
docs++;
|
||||||
@@ -343,8 +333,8 @@ public sealed partial class StorageEngine
|
|||||||
/// </summary>
|
/// </summary>
|
||||||
public FragmentationMapReport GetFragmentationMap()
|
public FragmentationMapReport GetFragmentationMap()
|
||||||
{
|
{
|
||||||
var freePageSet = new HashSet<uint>(_pageFile.EnumerateFreePages(includeEmptyPages: true));
|
var freePageSet = new HashSet<uint>(_pageFile.EnumerateFreePages());
|
||||||
var pageCount = _pageFile.NextPageId;
|
uint pageCount = _pageFile.NextPageId;
|
||||||
var buffer = new byte[_pageFile.PageSize];
|
var buffer = new byte[_pageFile.PageSize];
|
||||||
var pages = new List<FragmentationPageEntry>((int)pageCount);
|
var pages = new List<FragmentationPageEntry>((int)pageCount);
|
||||||
|
|
||||||
@@ -354,17 +344,12 @@ public sealed partial class StorageEngine
|
|||||||
{
|
{
|
||||||
_pageFile.ReadPage(pageId, buffer);
|
_pageFile.ReadPage(pageId, buffer);
|
||||||
var pageHeader = PageHeader.ReadFrom(buffer);
|
var pageHeader = PageHeader.ReadFrom(buffer);
|
||||||
var isFreePage = freePageSet.Contains(pageId);
|
bool isFreePage = freePageSet.Contains(pageId);
|
||||||
|
|
||||||
int freeBytes = 0;
|
var freeBytes = 0;
|
||||||
if (isFreePage)
|
if (isFreePage)
|
||||||
{
|
|
||||||
freeBytes = _pageFile.PageSize;
|
freeBytes = _pageFile.PageSize;
|
||||||
}
|
else if (TryReadSlottedFreeSpace(buffer, out int slottedFreeBytes)) freeBytes = slottedFreeBytes;
|
||||||
else if (TryReadSlottedFreeSpace(buffer, out var slottedFreeBytes))
|
|
||||||
{
|
|
||||||
freeBytes = slottedFreeBytes;
|
|
||||||
}
|
|
||||||
|
|
||||||
totalFreeBytes += freeBytes;
|
totalFreeBytes += freeBytes;
|
||||||
|
|
||||||
@@ -378,7 +363,7 @@ public sealed partial class StorageEngine
|
|||||||
}
|
}
|
||||||
|
|
||||||
uint tailReclaimablePages = 0;
|
uint tailReclaimablePages = 0;
|
||||||
for (var i = pageCount; i > 2; i--)
|
for (uint i = pageCount; i > 2; i--)
|
||||||
{
|
{
|
||||||
if (!freePageSet.Contains(i - 1))
|
if (!freePageSet.Contains(i - 1))
|
||||||
break;
|
break;
|
||||||
@@ -386,12 +371,12 @@ public sealed partial class StorageEngine
|
|||||||
tailReclaimablePages++;
|
tailReclaimablePages++;
|
||||||
}
|
}
|
||||||
|
|
||||||
var fileBytes = Math.Max(1L, _pageFile.FileLengthBytes);
|
long fileBytes = Math.Max(1L, _pageFile.FileLengthBytes);
|
||||||
return new FragmentationMapReport
|
return new FragmentationMapReport
|
||||||
{
|
{
|
||||||
Pages = pages,
|
Pages = pages,
|
||||||
TotalFreeBytes = totalFreeBytes,
|
TotalFreeBytes = totalFreeBytes,
|
||||||
FragmentationPercent = (totalFreeBytes * 100d) / fileBytes,
|
FragmentationPercent = totalFreeBytes * 100d / fileBytes,
|
||||||
TailReclaimablePages = tailReclaimablePages
|
TailReclaimablePages = tailReclaimablePages
|
||||||
};
|
};
|
||||||
}
|
}
|
||||||
@@ -403,11 +388,9 @@ public sealed partial class StorageEngine
|
|||||||
|
|
||||||
var index = new BTreeIndex(this, IndexOptions.CreateUnique("_id"), metadata.PrimaryRootPageId);
|
var index = new BTreeIndex(this, IndexOptions.CreateUnique("_id"), metadata.PrimaryRootPageId);
|
||||||
|
|
||||||
foreach (var entry in index.Range(IndexKey.MinKey, IndexKey.MaxKey, IndexDirection.Forward, transactionId: 0))
|
foreach (var entry in index.Range(IndexKey.MinKey, IndexKey.MaxKey, IndexDirection.Forward, 0))
|
||||||
{
|
|
||||||
yield return entry.Location;
|
yield return entry.Location;
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|
||||||
private bool TryReadFirstOverflowPage(in DocumentLocation location, out uint firstOverflowPage)
|
private bool TryReadFirstOverflowPage(in DocumentLocation location, out uint firstOverflowPage)
|
||||||
{
|
{
|
||||||
@@ -419,7 +402,7 @@ public sealed partial class StorageEngine
|
|||||||
if (location.SlotIndex >= header.SlotCount)
|
if (location.SlotIndex >= header.SlotCount)
|
||||||
return false;
|
return false;
|
||||||
|
|
||||||
var slotOffset = SlottedPageHeader.Size + (location.SlotIndex * SlotEntry.Size);
|
int slotOffset = SlottedPageHeader.Size + location.SlotIndex * SlotEntry.Size;
|
||||||
var slot = SlotEntry.ReadFrom(pageBuffer.AsSpan(slotOffset, SlotEntry.Size));
|
var slot = SlotEntry.ReadFrom(pageBuffer.AsSpan(slotOffset, SlotEntry.Size));
|
||||||
if ((slot.Flags & SlotFlags.Deleted) != 0)
|
if ((slot.Flags & SlotFlags.Deleted) != 0)
|
||||||
return false;
|
return false;
|
||||||
@@ -441,7 +424,7 @@ public sealed partial class StorageEngine
|
|||||||
|
|
||||||
var buffer = new byte[_pageFile.PageSize];
|
var buffer = new byte[_pageFile.PageSize];
|
||||||
var visited = new HashSet<uint>();
|
var visited = new HashSet<uint>();
|
||||||
var current = firstOverflowPage;
|
uint current = firstOverflowPage;
|
||||||
|
|
||||||
while (current != 0 && current < _pageFile.NextPageId && visited.Add(current))
|
while (current != 0 && current < _pageFile.NextPageId && visited.Add(current))
|
||||||
{
|
{
|
||||||
@@ -472,12 +455,12 @@ public sealed partial class StorageEngine
|
|||||||
if (location.SlotIndex >= header.SlotCount)
|
if (location.SlotIndex >= header.SlotCount)
|
||||||
return false;
|
return false;
|
||||||
|
|
||||||
var slotOffset = SlottedPageHeader.Size + (location.SlotIndex * SlotEntry.Size);
|
int slotOffset = SlottedPageHeader.Size + location.SlotIndex * SlotEntry.Size;
|
||||||
var slot = SlotEntry.ReadFrom(pageBuffer.AsSpan(slotOffset, SlotEntry.Size));
|
var slot = SlotEntry.ReadFrom(pageBuffer.AsSpan(slotOffset, SlotEntry.Size));
|
||||||
if ((slot.Flags & SlotFlags.Deleted) != 0)
|
if ((slot.Flags & SlotFlags.Deleted) != 0)
|
||||||
return false;
|
return false;
|
||||||
|
|
||||||
var hasOverflow = (slot.Flags & SlotFlags.HasOverflow) != 0;
|
bool hasOverflow = (slot.Flags & SlotFlags.HasOverflow) != 0;
|
||||||
isCompressed = (slot.Flags & SlotFlags.Compressed) != 0;
|
isCompressed = (slot.Flags & SlotFlags.Compressed) != 0;
|
||||||
|
|
||||||
if (!hasOverflow)
|
if (!hasOverflow)
|
||||||
@@ -492,7 +475,8 @@ public sealed partial class StorageEngine
|
|||||||
if (slot.Length < CompressedPayloadHeader.Size)
|
if (slot.Length < CompressedPayloadHeader.Size)
|
||||||
return false;
|
return false;
|
||||||
|
|
||||||
var compressedHeader = CompressedPayloadHeader.ReadFrom(pageBuffer.AsSpan(slot.Offset, CompressedPayloadHeader.Size));
|
var compressedHeader =
|
||||||
|
CompressedPayloadHeader.ReadFrom(pageBuffer.AsSpan(slot.Offset, CompressedPayloadHeader.Size));
|
||||||
originalBytes = compressedHeader.OriginalLength;
|
originalBytes = compressedHeader.OriginalLength;
|
||||||
return true;
|
return true;
|
||||||
}
|
}
|
||||||
@@ -501,7 +485,7 @@ public sealed partial class StorageEngine
|
|||||||
return false;
|
return false;
|
||||||
|
|
||||||
var primaryPayload = pageBuffer.AsSpan(slot.Offset, slot.Length);
|
var primaryPayload = pageBuffer.AsSpan(slot.Offset, slot.Length);
|
||||||
var totalStoredBytes = BinaryPrimitives.ReadInt32LittleEndian(primaryPayload.Slice(0, 4));
|
int totalStoredBytes = BinaryPrimitives.ReadInt32LittleEndian(primaryPayload.Slice(0, 4));
|
||||||
if (totalStoredBytes < 0)
|
if (totalStoredBytes < 0)
|
||||||
return false;
|
return false;
|
||||||
|
|
||||||
@@ -522,8 +506,8 @@ public sealed partial class StorageEngine
|
|||||||
else
|
else
|
||||||
{
|
{
|
||||||
storedPrefix.CopyTo(headerBuffer);
|
storedPrefix.CopyTo(headerBuffer);
|
||||||
var copied = storedPrefix.Length;
|
int copied = storedPrefix.Length;
|
||||||
var nextOverflow = BinaryPrimitives.ReadUInt32LittleEndian(primaryPayload.Slice(4, 4));
|
uint nextOverflow = BinaryPrimitives.ReadUInt32LittleEndian(primaryPayload.Slice(4, 4));
|
||||||
var overflowBuffer = new byte[_pageFile.PageSize];
|
var overflowBuffer = new byte[_pageFile.PageSize];
|
||||||
|
|
||||||
while (copied < CompressedPayloadHeader.Size && nextOverflow != 0 && nextOverflow < _pageFile.NextPageId)
|
while (copied < CompressedPayloadHeader.Size && nextOverflow != 0 && nextOverflow < _pageFile.NextPageId)
|
||||||
@@ -533,7 +517,8 @@ public sealed partial class StorageEngine
|
|||||||
if (overflowHeader.PageType != PageType.Overflow)
|
if (overflowHeader.PageType != PageType.Overflow)
|
||||||
return false;
|
return false;
|
||||||
|
|
||||||
var available = Math.Min(CompressedPayloadHeader.Size - copied, _pageFile.PageSize - SlottedPageHeader.Size);
|
int available = Math.Min(CompressedPayloadHeader.Size - copied,
|
||||||
|
_pageFile.PageSize - SlottedPageHeader.Size);
|
||||||
overflowBuffer.AsSpan(SlottedPageHeader.Size, available).CopyTo(headerBuffer.Slice(copied));
|
overflowBuffer.AsSpan(SlottedPageHeader.Size, available).CopyTo(headerBuffer.Slice(copied));
|
||||||
copied += available;
|
copied += available;
|
||||||
nextOverflow = overflowHeader.NextOverflowPage;
|
nextOverflow = overflowHeader.NextOverflowPage;
|
||||||
|
|||||||
@@ -1,17 +1,92 @@
 using System.Collections.Concurrent;
-using System.Text;
 
 namespace ZB.MOM.WW.CBDD.Core.Storage;
 
 public sealed partial class StorageEngine
 {
     private readonly ConcurrentDictionary<string, ushort> _dictionaryCache = new(StringComparer.OrdinalIgnoreCase);
 
+    // Lock for dictionary modifications (simple lock for now, could be RW lock)
+    private readonly object _dictionaryLock = new();
     private readonly ConcurrentDictionary<ushort, string> _dictionaryReverseCache = new();
     private uint _dictionaryRootPageId;
     private ushort _nextDictionaryId;
 
-    // Lock for dictionary modifications (simple lock for now, could be RW lock)
-    private readonly object _dictionaryLock = new();
+    /// <summary>
+    /// Gets the key-to-id dictionary cache.
+    /// </summary>
+    /// <returns>The key-to-id map.</returns>
+    public ConcurrentDictionary<string, ushort> GetKeyMap()
+    {
+        return _dictionaryCache;
+    }
+
+    /// <summary>
+    /// Gets the id-to-key dictionary cache.
+    /// </summary>
+    /// <returns>The id-to-key map.</returns>
+    public ConcurrentDictionary<ushort, string> GetKeyReverseMap()
+    {
+        return _dictionaryReverseCache;
+    }
+
+    /// <summary>
+    /// Gets the ID for a dictionary key, creating it if it doesn't exist.
+    /// Thread-safe.
+    /// </summary>
+    /// <param name="key">The dictionary key.</param>
+    /// <returns>The dictionary identifier for the key.</returns>
+    public ushort GetOrAddDictionaryEntry(string key)
+    {
+        key = key.ToLowerInvariant();
+        if (_dictionaryCache.TryGetValue(key, out ushort id)) return id;
+
+        lock (_dictionaryLock)
+        {
+            // Double checked locking
+            if (_dictionaryCache.TryGetValue(key, out id)) return id;
+
+            // Try to find in storage (in case cache is incomplete or another process?)
+            // Note: FindAllGlobal loaded everything, so strict cache miss means it's not in DB.
+            // BUT if we support concurrent writers (multiple processed), we should re-check DB.
+            // Current CBDD seems to be single-process exclusive lock (FileShare.None).
+            // So in-memory cache is authoritative after load.
+
+            // Generate New ID
+            ushort nextId = _nextDictionaryId;
+            if (nextId == 0) nextId = DictionaryPage.ReservedValuesEnd + 1; // Should be init, but safety
+
+            // Insert into Page
+            // usage of default(ulong) or null transaction?
+            // Dictionary updates should ideally be transactional or immediate?
+            // "Immediate" for now to simplify, as dictionary is cross-collection.
+            // If we use transaction, we need to pass it in. For now, immediate write.
+
+            // We need to support "Insert Global" which handles overflow.
+            // DictionaryPage.Insert only handles single page.
+
+            // We need logic here to traverse chain and find space.
+            if (InsertDictionaryEntryGlobal(key, nextId))
+            {
+                _dictionaryCache[key] = nextId;
+                _dictionaryReverseCache[nextId] = key;
+                _nextDictionaryId++;
+                return nextId;
+            }
+
+            throw new InvalidOperationException("Failed to insert dictionary entry (Storage Full?)");
+        }
+    }
+
+    /// <summary>
+    /// Registers a set of keys in the global dictionary.
+    /// Ensures all keys are assigned an ID and persisted.
+    /// </summary>
+    /// <param name="keys">The keys to register.</param>
+    public void RegisterKeys(IEnumerable<string> keys)
+    {
+        foreach (string key in keys) GetOrAddDictionaryEntry(key.ToLowerInvariant());
+    }
 
     private void InitializeDictionary()
     {
@@ -57,13 +132,14 @@ public sealed partial class StorageEngine
 
         // Warm cache
         ushort maxId = DictionaryPage.ReservedValuesEnd;
-        foreach (var (key, val) in DictionaryPage.FindAllGlobal(this, _dictionaryRootPageId))
+        foreach ((string key, ushort val) in DictionaryPage.FindAllGlobal(this, _dictionaryRootPageId))
         {
-            var lowerKey = key.ToLowerInvariant();
+            string lowerKey = key.ToLowerInvariant();
             _dictionaryCache[lowerKey] = val;
             _dictionaryReverseCache[val] = lowerKey;
             if (val > maxId) maxId = val;
         }
 
         _nextDictionaryId = (ushort)(maxId + 1);
     }
 
@@ -72,78 +148,10 @@ public sealed partial class StorageEngine
 
         // Pre-register common array indices to avoid mapping during high-frequency writes
        var indices = new List<string>(101);
-        for (int i = 0; i <= 100; i++) indices.Add(i.ToString());
+        for (var i = 0; i <= 100; i++) indices.Add(i.ToString());
         RegisterKeys(indices);
     }
 
-    /// <summary>
-    /// Gets the key-to-id dictionary cache.
-    /// </summary>
-    /// <returns>The key-to-id map.</returns>
-    public ConcurrentDictionary<string, ushort> GetKeyMap() => _dictionaryCache;
-
-    /// <summary>
-    /// Gets the id-to-key dictionary cache.
-    /// </summary>
-    /// <returns>The id-to-key map.</returns>
-    public ConcurrentDictionary<ushort, string> GetKeyReverseMap() => _dictionaryReverseCache;
-
-    /// <summary>
-    /// Gets the ID for a dictionary key, creating it if it doesn't exist.
-    /// Thread-safe.
-    /// </summary>
-    /// <param name="key">The dictionary key.</param>
-    /// <returns>The dictionary identifier for the key.</returns>
-    public ushort GetOrAddDictionaryEntry(string key)
-    {
-        key = key.ToLowerInvariant();
-        if (_dictionaryCache.TryGetValue(key, out var id))
-        {
-            return id;
-        }
-
-        lock (_dictionaryLock)
-        {
-            // Double checked locking
-            if (_dictionaryCache.TryGetValue(key, out id))
-            {
-                return id;
-            }
-
-            // Try to find in storage (in case cache is incomplete or another process?)
-            // Note: FindAllGlobal loaded everything, so strict cache miss means it's not in DB.
-            // BUT if we support concurrent writers (multiple processed), we should re-check DB.
-            // Current CBDD seems to be single-process exclusive lock (FileShare.None).
-            // So in-memory cache is authoritative after load.
-
-            // Generate New ID
-            ushort nextId = _nextDictionaryId;
-            if (nextId == 0) nextId = DictionaryPage.ReservedValuesEnd + 1; // Should be init, but safety
-
-            // Insert into Page
-            // usage of default(ulong) or null transaction?
-            // Dictionary updates should ideally be transactional or immediate?
-            // "Immediate" for now to simplify, as dictionary is cross-collection.
-            // If we use transaction, we need to pass it in. For now, immediate write.
-
-            // We need to support "Insert Global" which handles overflow.
-            // DictionaryPage.Insert only handles single page.
-
-            // We need logic here to traverse chain and find space.
-            if (InsertDictionaryEntryGlobal(key, nextId))
-            {
-                _dictionaryCache[key] = nextId;
-                _dictionaryReverseCache[nextId] = key;
-                _nextDictionaryId++;
-                return nextId;
-            }
-            else
-            {
-                throw new InvalidOperationException("Failed to insert dictionary entry (Storage Full?)");
-            }
-        }
-    }
-
     /// <summary>
     /// Gets the dictionary key for an identifier.
     /// </summary>
@@ -151,14 +159,14 @@ public sealed partial class StorageEngine
     /// <returns>The dictionary key if found; otherwise, <see langword="null" />.</returns>
     public string? GetDictionaryKey(ushort id)
     {
-        if (_dictionaryReverseCache.TryGetValue(id, out var key))
+        if (_dictionaryReverseCache.TryGetValue(id, out string? key))
             return key;
         return null;
     }
 
     private bool InsertDictionaryEntryGlobal(string key, ushort value)
     {
-        var pageId = _dictionaryRootPageId;
+        uint pageId = _dictionaryRootPageId;
         var pageBuffer = new byte[PageSize];
 
         while (true)
@@ -182,7 +190,7 @@ public sealed partial class StorageEngine
         }
 
         // No Next Page - Allocate New
-        var newPageId = AllocatePage();
+        uint newPageId = AllocatePage();
         var newPageBuffer = new byte[PageSize];
         DictionaryPage.Initialize(newPageBuffer, newPageId);
 
@@ -203,17 +211,4 @@ public sealed partial class StorageEngine
             return true;
         }
     }
-
-    /// <summary>
-    /// Registers a set of keys in the global dictionary.
-    /// Ensures all keys are assigned an ID and persisted.
-    /// </summary>
-    /// <param name="keys">The keys to register.</param>
-    public void RegisterKeys(IEnumerable<string> keys)
-    {
-        foreach (var key in keys)
-        {
-            GetOrAddDictionaryEntry(key.ToLowerInvariant());
-        }
-    }
 }
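The moved `GetOrAddDictionaryEntry` keeps its double-checked locking shape: an optimistic lock-free read of the concurrent cache, then a second check under the lock before a new id is allocated and persisted. A minimal language-neutral sketch of that pattern (Python here; the class and method names are illustrative, not CBDD's API):

```python
import threading


class KeyDictionary:
    """Assigns stable integer ids to keys, double-checking the cache on a miss."""

    def __init__(self, first_id=1):
        self._cache = {}  # key -> id
        self._lock = threading.Lock()
        self._next_id = first_id

    def get_or_add(self, key):
        key = key.lower()  # mirror the case-insensitive lookup
        # Fast path: no lock taken when the key is already cached.
        ident = self._cache.get(key)
        if ident is not None:
            return ident
        with self._lock:
            # Second check: another thread may have inserted it meanwhile.
            ident = self._cache.get(key)
            if ident is not None:
                return ident
            ident = self._next_id
            self._next_id += 1
            # Publish only after the id is final (persist-then-cache in CBDD).
            self._cache[key] = ident
            return ident


d = KeyDictionary()
a = d.get_or_add("Name")
b = d.get_or_add("name")  # case-insensitive: same id as "Name"
```

The second lookup under the lock is what makes the cheap unlocked first lookup safe: two threads can both miss, but only one will allocate the id.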
@@ -39,7 +39,8 @@ internal readonly struct StorageFormatMetadata
     /// </summary>
     public bool CompressionCapabilityEnabled => (FeatureFlags & StorageFeatureFlags.CompressionCapability) != 0;
 
-    private StorageFormatMetadata(bool isPresent, byte version, StorageFeatureFlags featureFlags, CompressionCodec defaultCodec)
+    private StorageFormatMetadata(bool isPresent, byte version, StorageFeatureFlags featureFlags,
+        CompressionCodec defaultCodec)
     {
         IsPresent = isPresent;
         Version = version;
@@ -53,7 +54,8 @@ internal readonly struct StorageFormatMetadata
     /// <param name="version">The storage format version.</param>
     /// <param name="featureFlags">Enabled feature flags.</param>
     /// <param name="defaultCodec">The default compression codec.</param>
-    public static StorageFormatMetadata Present(byte version, StorageFeatureFlags featureFlags, CompressionCodec defaultCodec)
+    public static StorageFormatMetadata Present(byte version, StorageFeatureFlags featureFlags,
+        CompressionCodec defaultCodec)
     {
         return new StorageFormatMetadata(true, version, featureFlags, defaultCodec);
     }
@@ -88,12 +90,13 @@ public sealed partial class StorageEngine
             return metadata;
 
         if (!_pageFile.WasCreated)
-            return StorageFormatMetadata.Legacy(_compressionOptions.Codec);
+            return StorageFormatMetadata.Legacy(CompressionOptions.Codec);
 
-        var featureFlags = _compressionOptions.EnableCompression
+        var featureFlags = CompressionOptions.EnableCompression
             ? StorageFeatureFlags.CompressionCapability
             : StorageFeatureFlags.None;
-        var initialMetadata = StorageFormatMetadata.Present(CurrentStorageFormatVersion, featureFlags, _compressionOptions.Codec);
+        var initialMetadata =
+            StorageFormatMetadata.Present(CurrentStorageFormatVersion, featureFlags, CompressionOptions.Codec);
         WriteStorageFormatMetadata(initialMetadata);
         return initialMetadata;
     }
@@ -104,11 +107,11 @@ public sealed partial class StorageEngine
         if (source.Length < StorageFormatMetadata.WireSize)
             return false;
 
-        var magic = BinaryPrimitives.ReadUInt32LittleEndian(source.Slice(0, 4));
+        uint magic = BinaryPrimitives.ReadUInt32LittleEndian(source.Slice(0, 4));
         if (magic != StorageFormatMagic)
             return false;
 
-        var version = source[4];
+        byte version = source[4];
         var featureFlags = (StorageFeatureFlags)source[5];
         var codec = (CompressionCodec)source[6];
         if (!Enum.IsDefined(codec))
|
|||||||
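The metadata-parsing hunk above reads a small fixed header: a little-endian `uint` magic number, then one byte each for version, feature flags, and default codec. A minimal Python sketch of that layout, assuming a hypothetical magic value and a 7-byte `WireSize` (the real `StorageFormatMagic` and wire size are defined elsewhere in the C# source and are not shown in this diff):

```python
import struct

# Hypothetical values for illustration only; the real constants live in the C# source.
STORAGE_FORMAT_MAGIC = 0x43424444  # assumption: "CBDD"-style tag as a little-endian u32
WIRE_SIZE = 7                      # assumption: 4-byte magic + version + flags + codec

def parse_storage_format_metadata(source: bytes):
    """Mirror the TryParse logic: magic (u32 LE), then version/flags/codec bytes."""
    if len(source) < WIRE_SIZE:
        return None
    magic = struct.unpack_from("<I", source, 0)[0]
    if magic != STORAGE_FORMAT_MAGIC:
        return None
    return {"version": source[4], "flags": source[5], "codec": source[6]}

blob = struct.pack("<I", STORAGE_FORMAT_MAGIC) + bytes([1, 0x01, 2])
meta = parse_storage_format_metadata(blob)
```

A malformed or short buffer yields `None`, matching the `return false` paths in the C# version.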
File diff suppressed because it is too large
@@ -1,5 +1,3 @@
-using ZB.MOM.WW.CBDD.Core.Transactions;
-
 namespace ZB.MOM.WW.CBDD.Core.Storage;

 public sealed partial class StorageEngine
@@ -117,7 +117,8 @@ public sealed partial class StorageEngine
     /// </summary>
     /// <param name="options">Optional compression migration options.</param>
     /// <param name="ct">A token used to cancel the operation.</param>
-    public async Task<CompressionMigrationResult> MigrateCompressionAsync(CompressionMigrationOptions? options = null, CancellationToken ct = default)
+    public async Task<CompressionMigrationResult> MigrateCompressionAsync(CompressionMigrationOptions? options = null,
+        CancellationToken ct = default)
     {
         var normalized = NormalizeMigrationOptions(options);

@@ -147,13 +148,13 @@ public sealed partial class StorageEngine
         {
             ct.ThrowIfCancellationRequested();

-            if (!TryReadStoredPayload(location, out var storedPayload, out var isCompressed))
+            if (!TryReadStoredPayload(location, out byte[] storedPayload, out bool isCompressed))
             {
                 docsSkipped++;
                 continue;
             }

-            if (!TryGetLogicalPayload(storedPayload, isCompressed, out var logicalPayload))
+            if (!TryGetLogicalPayload(storedPayload, isCompressed, out byte[] logicalPayload))
             {
                 docsSkipped++;
                 continue;
@@ -162,15 +163,14 @@ public sealed partial class StorageEngine
             docsScanned++;
             bytesBefore += logicalPayload.Length;

-            var targetStored = BuildTargetStoredPayload(logicalPayload, normalized, out var targetCompressed);
+            byte[] targetStored =
+                BuildTargetStoredPayload(logicalPayload, normalized, out bool targetCompressed);
             bytesEstimatedAfter += targetStored.Length;

-            if (normalized.DryRun)
-            {
-                continue;
-            }
+            if (normalized.DryRun) continue;

-            if (!TryRewriteStoredPayloadAtLocation(location, targetStored, targetCompressed, out var actualStoredBytes))
+            if (!TryRewriteStoredPayloadAtLocation(location, targetStored, targetCompressed,
+                    out int actualStoredBytes))
             {
                 docsSkipped++;
                 continue;
@@ -184,9 +184,9 @@ public sealed partial class StorageEngine
         if (!normalized.DryRun)
         {
             var metadata = StorageFormatMetadata.Present(
-                version: 1,
-                featureFlags: StorageFeatureFlags.CompressionCapability,
-                defaultCodec: normalized.Codec);
+                1,
+                StorageFeatureFlags.CompressionCapability,
+                normalized.Codec);
             WriteStorageFormatMetadata(metadata);
             _pageFile.Flush();
         }
@@ -221,7 +221,8 @@ public sealed partial class StorageEngine
         var normalized = options ?? new CompressionMigrationOptions();

         if (!Enum.IsDefined(normalized.Codec) || normalized.Codec == CompressionCodec.None)
-            throw new ArgumentOutOfRangeException(nameof(options), "Migration codec must be a supported non-None codec.");
+            throw new ArgumentOutOfRangeException(nameof(options),
+                "Migration codec must be a supported non-None codec.");

         if (normalized.MinSizeBytes < 0)
             throw new ArgumentOutOfRangeException(nameof(options), "MinSizeBytes must be non-negative.");
@@ -250,7 +251,8 @@ public sealed partial class StorageEngine
             .ToList();
     }

-    private byte[] BuildTargetStoredPayload(ReadOnlySpan<byte> logicalPayload, CompressionMigrationOptions options, out bool compressed)
+    private byte[] BuildTargetStoredPayload(ReadOnlySpan<byte> logicalPayload, CompressionMigrationOptions options,
+        out bool compressed)
     {
         compressed = false;

@@ -259,10 +261,10 @@ public sealed partial class StorageEngine

         try
         {
-            var compressedPayload = _compressionService.Compress(logicalPayload, options.Codec, options.Level);
-            var storedLength = CompressedPayloadHeader.Size + compressedPayload.Length;
-            var savings = logicalPayload.Length - storedLength;
-            var savingsPercent = logicalPayload.Length == 0 ? 0 : (int)((savings * 100L) / logicalPayload.Length);
+            byte[] compressedPayload = CompressionService.Compress(logicalPayload, options.Codec, options.Level);
+            int storedLength = CompressedPayloadHeader.Size + compressedPayload.Length;
+            int savings = logicalPayload.Length - storedLength;
+            int savingsPercent = logicalPayload.Length == 0 ? 0 : (int)(savings * 100L / logicalPayload.Length);
             if (savings <= 0 || savingsPercent < options.MinSavingsPercent)
                 return logicalPayload.ToArray();

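`BuildTargetStoredPayload` above only keeps the compressed form when it beats a savings threshold: it compares the logical size against header-plus-compressed size and falls back to storing the payload uncompressed otherwise. A sketch of that gating in Python, using `zlib` as a stand-in codec and an assumed 8-byte header size (the real `CompressedPayloadHeader.Size` is not shown in the diff):

```python
import zlib

HEADER_SIZE = 8  # assumption: stand-in for CompressedPayloadHeader.Size

def build_target_stored_payload(logical: bytes, min_savings_percent: int):
    """Compress, then keep the compressed form only if it actually saves space."""
    compressed = zlib.compress(logical, 6)
    stored_length = HEADER_SIZE + len(compressed)
    savings = len(logical) - stored_length
    savings_percent = 0 if len(logical) == 0 else (savings * 100) // len(logical)
    if savings <= 0 or savings_percent < min_savings_percent:
        return logical, False   # store uncompressed; compression not worth it
    return compressed, True     # store header + compressed body

highly_redundant = b"abc" * 1000
stored, was_compressed = build_target_stored_payload(highly_redundant, min_savings_percent=10)
tiny = b"x"
tiny_stored, tiny_compressed = build_target_stored_payload(tiny, min_savings_percent=10)
```

The point of the gate is that small or incompressible documents would otherwise grow by the header overhead.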
@@ -308,11 +310,11 @@ public sealed partial class StorageEngine

         try
         {
-            logicalPayload = _compressionService.Decompress(
+            logicalPayload = CompressionService.Decompress(
                 compressedPayload,
                 header.Codec,
                 header.OriginalLength,
-                Math.Max(header.OriginalLength, _compressionOptions.MaxDecompressedSizeBytes));
+                Math.Max(header.OriginalLength, CompressionOptions.MaxDecompressedSizeBytes));
             return true;
         }
         catch
@@ -336,13 +338,13 @@ public sealed partial class StorageEngine
         if (location.SlotIndex >= header.SlotCount)
             return false;

-        var slotOffset = SlottedPageHeader.Size + (location.SlotIndex * SlotEntry.Size);
+        int slotOffset = SlottedPageHeader.Size + location.SlotIndex * SlotEntry.Size;
         var slot = SlotEntry.ReadFrom(pageBuffer.AsSpan(slotOffset, SlotEntry.Size));
         if ((slot.Flags & SlotFlags.Deleted) != 0)
             return false;

         isCompressed = (slot.Flags & SlotFlags.Compressed) != 0;
-        var hasOverflow = (slot.Flags & SlotFlags.HasOverflow) != 0;
+        bool hasOverflow = (slot.Flags & SlotFlags.HasOverflow) != 0;

         if (!hasOverflow)
         {
@@ -354,14 +356,14 @@ public sealed partial class StorageEngine
             return false;

         var primaryPayload = pageBuffer.AsSpan(slot.Offset, slot.Length);
-        var totalStoredLength = BinaryPrimitives.ReadInt32LittleEndian(primaryPayload.Slice(0, 4));
-        var nextOverflow = BinaryPrimitives.ReadUInt32LittleEndian(primaryPayload.Slice(4, 4));
+        int totalStoredLength = BinaryPrimitives.ReadInt32LittleEndian(primaryPayload.Slice(0, 4));
+        uint nextOverflow = BinaryPrimitives.ReadUInt32LittleEndian(primaryPayload.Slice(4, 4));
         if (totalStoredLength < 0)
             return false;

         var output = new byte[totalStoredLength];
         var primaryChunk = primaryPayload.Slice(8);
-        var copied = Math.Min(primaryChunk.Length, output.Length);
+        int copied = Math.Min(primaryChunk.Length, output.Length);
         primaryChunk.Slice(0, copied).CopyTo(output);

         var overflowBuffer = new byte[_pageFile.PageSize];
@@ -372,7 +374,7 @@ public sealed partial class StorageEngine
             if (overflowHeader.PageType != PageType.Overflow)
                 return false;

-            var chunk = Math.Min(output.Length - copied, _pageFile.PageSize - SlottedPageHeader.Size);
+            int chunk = Math.Min(output.Length - copied, _pageFile.PageSize - SlottedPageHeader.Size);
             overflowBuffer.AsSpan(SlottedPageHeader.Size, chunk).CopyTo(output.AsSpan(copied));
             copied += chunk;
             nextOverflow = overflowHeader.NextOverflowPage;
@@ -403,12 +405,12 @@ public sealed partial class StorageEngine
         if (location.SlotIndex >= pageHeader.SlotCount)
             return false;

-        var slotOffset = SlottedPageHeader.Size + (location.SlotIndex * SlotEntry.Size);
+        int slotOffset = SlottedPageHeader.Size + location.SlotIndex * SlotEntry.Size;
         var slot = SlotEntry.ReadFrom(pageBuffer.AsSpan(slotOffset, SlotEntry.Size));
         if ((slot.Flags & SlotFlags.Deleted) != 0)
             return false;

-        var oldHasOverflow = (slot.Flags & SlotFlags.HasOverflow) != 0;
+        bool oldHasOverflow = (slot.Flags & SlotFlags.HasOverflow) != 0;
         uint oldOverflowHead = 0;
         if (oldHasOverflow)
         {
@@ -442,12 +444,12 @@ public sealed partial class StorageEngine
         if (slot.Length < 8)
             return false;

-        var primaryChunkSize = slot.Length - 8;
+        int primaryChunkSize = slot.Length - 8;
         if (primaryChunkSize < 0)
             return false;

         var remainder = newStoredPayload.Slice(primaryChunkSize);
-        var newOverflowHead = BuildOverflowChainForMigration(remainder);
+        uint newOverflowHead = BuildOverflowChainForMigration(remainder);

         var slotPayload = pageBuffer.AsSpan(slot.Offset, slot.Length);
         slotPayload.Clear();
@@ -475,22 +477,22 @@ public sealed partial class StorageEngine
         if (overflowPayload.IsEmpty)
             return 0;

-        var chunkSize = _pageFile.PageSize - SlottedPageHeader.Size;
+        int chunkSize = _pageFile.PageSize - SlottedPageHeader.Size;
         uint nextOverflowPageId = 0;

-        var tailSize = overflowPayload.Length % chunkSize;
-        var fullPages = overflowPayload.Length / chunkSize;
+        int tailSize = overflowPayload.Length % chunkSize;
+        int fullPages = overflowPayload.Length / chunkSize;

         if (tailSize > 0)
         {
-            var tailOffset = fullPages * chunkSize;
+            int tailOffset = fullPages * chunkSize;
             var tailSlice = overflowPayload.Slice(tailOffset, tailSize);
             nextOverflowPageId = AllocateOverflowPageForMigration(tailSlice, nextOverflowPageId);
         }

-        for (var i = fullPages - 1; i >= 0; i--)
+        for (int i = fullPages - 1; i >= 0; i--)
         {
-            var chunkOffset = i * chunkSize;
+            int chunkOffset = i * chunkSize;
             var chunk = overflowPayload.Slice(chunkOffset, chunkSize);
             nextOverflowPageId = AllocateOverflowPageForMigration(chunk, nextOverflowPageId);
         }
@@ -500,7 +502,7 @@ public sealed partial class StorageEngine

     private uint AllocateOverflowPageForMigration(ReadOnlySpan<byte> payloadChunk, uint nextOverflowPageId)
     {
-        var pageId = _pageFile.AllocatePage();
+        uint pageId = _pageFile.AllocatePage();
         var buffer = new byte[_pageFile.PageSize];

         var header = new SlottedPageHeader
@@ -524,13 +526,13 @@ public sealed partial class StorageEngine
     {
         var buffer = new byte[_pageFile.PageSize];
         var visited = new HashSet<uint>();
-        var current = firstOverflowPage;
+        uint current = firstOverflowPage;

         while (current != 0 && current < _pageFile.NextPageId && visited.Add(current))
         {
             _pageFile.ReadPage(current, buffer);
             var header = SlottedPageHeader.ReadFrom(buffer);
-            var next = header.NextOverflowPage;
+            uint next = header.NextOverflowPage;
             _pageFile.FreePage(current);
             current = next;
         }
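`BuildOverflowChainForMigration` above writes the overflow chain back to front: the tail chunk first, then the full chunks in reverse order, so that each page can record the id of the page that follows it in the payload. A sketch of the same chunking and linking order in Python, with list indices standing in for allocated page ids:

```python
def build_overflow_chain(payload: bytes, chunk_size: int):
    """Split payload into chunks, allocating tail-first then full chunks in
    reverse, so each page links forward to the next chunk of the payload."""
    pages = []        # pages[i] = (chunk, next_page_index or None), in allocation order
    next_page = None  # stand-in for nextOverflowPageId (0 == "no next" in the C#)
    tail_size = len(payload) % chunk_size
    full_pages = len(payload) // chunk_size
    if tail_size > 0:
        pages.append((payload[full_pages * chunk_size:], next_page))
        next_page = len(pages) - 1
    for i in range(full_pages - 1, -1, -1):
        pages.append((payload[i * chunk_size:(i + 1) * chunk_size], next_page))
        next_page = len(pages) - 1
    return pages, next_page  # next_page is the head of the chain

pages, head = build_overflow_chain(b"abcdefghij", 4)  # chunks: abcd, efgh, ij
```

Walking the chain from the head reassembles the payload in order, even though the pages were allocated in reverse.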
@@ -1,4 +1,4 @@
-using ZB.MOM.WW.CBDD.Core.Transactions;
+using System.Collections.Concurrent;

 namespace ZB.MOM.WW.CBDD.Core.Storage;

@@ -20,17 +20,17 @@ public sealed partial class StorageEngine
         if (transactionId.HasValue &&
             transactionId.Value != 0 &&
             _walCache.TryGetValue(transactionId.Value, out var txnPages) &&
-            txnPages.TryGetValue(pageId, out var uncommittedData))
+            txnPages.TryGetValue(pageId, out byte[]? uncommittedData))
         {
-            var length = Math.Min(uncommittedData.Length, destination.Length);
+            int length = Math.Min(uncommittedData.Length, destination.Length);
             uncommittedData.AsSpan(0, length).CopyTo(destination);
             return;
         }

         // 2. Check WAL index (committed but not checkpointed)
-        if (_walIndex.TryGetValue(pageId, out var committedData))
+        if (_walIndex.TryGetValue(pageId, out byte[]? committedData))
         {
-            var length = Math.Min(committedData.Length, destination.Length);
+            int length = Math.Min(committedData.Length, destination.Length);
             committedData.AsSpan(0, length).CopyTo(destination);
             return;
         }
@@ -54,10 +54,10 @@ public sealed partial class StorageEngine

         // Get or create transaction-local cache
         var txnPages = _walCache.GetOrAdd(transactionId,
-            _ => new System.Collections.Concurrent.ConcurrentDictionary<uint, byte[]>());
+            _ => new ConcurrentDictionary<uint, byte[]>());

         // Store defensive copy
-        var copy = data.ToArray();
+        byte[] copy = data.ToArray();
         txnPages[pageId] = copy;
     }

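The page-read hunk above establishes a three-level precedence: a transaction's own uncommitted pages win, then committed-but-not-checkpointed pages in the WAL index, and finally the durable page file. A minimal sketch of that lookup order in Python, with plain dicts standing in for the concurrent caches and the page file:

```python
def read_page(page_id, txn_id, wal_cache, wal_index, page_file):
    """Read precedence: txn-local uncommitted pages, then the committed WAL
    index, then the base page file."""
    if txn_id and txn_id in wal_cache and page_id in wal_cache[txn_id]:
        return wal_cache[txn_id][page_id]   # 1. uncommitted, visible to this txn only
    if page_id in wal_index:
        return wal_index[page_id]           # 2. committed, not yet checkpointed
    return page_file.get(page_id, b"\x00")  # 3. durable base file

wal_cache = {42: {1: b"uncommitted"}}
wal_index = {1: b"committed", 2: b"committed2"}
page_file = {1: b"base", 2: b"base2", 3: b"base3"}
```

This ordering is what gives a transaction read-your-own-writes semantics while isolating its pages from other readers until commit.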
@@ -50,7 +50,7 @@ public sealed partial class StorageEngine
             lockAcquired = _commitLock.Wait(0);
             if (!lockAcquired)
             {
-                var walSize = _wal.GetCurrentSize();
+                long walSize = _wal.GetCurrentSize();
                 return new CheckpointResult(mode, false, 0, walSize, walSize, false, false);
             }
         }
@@ -66,19 +66,18 @@ public sealed partial class StorageEngine
         }
         finally
         {
-            if (lockAcquired)
-            {
-                _commitLock.Release();
-            }
+            if (lockAcquired) _commitLock.Release();
         }
     }

     private void CheckpointInternal()
-        => _ = CheckpointInternal(CheckpointMode.Truncate);
+    {
+        _ = CheckpointInternal(CheckpointMode.Truncate);
+    }

     private CheckpointResult CheckpointInternal(CheckpointMode mode)
     {
-        var walBytesBefore = _wal.GetCurrentSize();
+        long walBytesBefore = _wal.GetCurrentSize();
         var appliedPages = 0;
         var truncated = false;
         var restarted = false;
@@ -91,10 +90,7 @@ public sealed partial class StorageEngine
         }

         // 2. Flush PageFile to ensure durability.
-        if (appliedPages > 0)
-        {
-            _pageFile.Flush();
-        }
+        if (appliedPages > 0) _pageFile.Flush();

         // 3. Clear in-memory WAL index (now persisted).
         _walIndex.Clear();
@@ -109,6 +105,7 @@ public sealed partial class StorageEngine
                     _wal.WriteCheckpointRecord();
                     _wal.Flush();
                 }
+
                 break;
             case CheckpointMode.Truncate:
                 if (walBytesBefore > 0)
@@ -116,6 +113,7 @@ public sealed partial class StorageEngine
                     _wal.Truncate();
                     truncated = true;
                 }
+
                 break;
             case CheckpointMode.Restart:
                 _wal.Restart();
@@ -126,7 +124,7 @@ public sealed partial class StorageEngine
                 throw new ArgumentOutOfRangeException(nameof(mode), mode, "Unsupported checkpoint mode.");
         }

-        var walBytesAfter = _wal.GetCurrentSize();
+        long walBytesAfter = _wal.GetCurrentSize();
         return new CheckpointResult(mode, true, appliedPages, walBytesBefore, walBytesAfter, truncated, restarted);
     }

@@ -153,7 +151,7 @@ public sealed partial class StorageEngine
             lockAcquired = await _commitLock.WaitAsync(0, ct);
             if (!lockAcquired)
             {
-                var walSize = _wal.GetCurrentSize();
+                long walSize = _wal.GetCurrentSize();
                 return new CheckpointResult(mode, false, 0, walSize, walSize, false, false);
             }
         }
@@ -170,10 +168,7 @@ public sealed partial class StorageEngine
         }
         finally
         {
-            if (lockAcquired)
-            {
-                _commitLock.Release();
-            }
+            if (lockAcquired) _commitLock.Release();
        }
    }

@@ -189,35 +184,28 @@ public sealed partial class StorageEngine
         // 1. Read WAL and locate the latest checkpoint boundary.
         var records = _wal.ReadAll();
         var startIndex = 0;
-        for (var i = records.Count - 1; i >= 0; i--)
-        {
+        for (int i = records.Count - 1; i >= 0; i--)
             if (records[i].Type == WalRecordType.Checkpoint)
             {
                 startIndex = i + 1;
                 break;
             }
-        }

         // 2. Replay WAL in source order with deterministic commit application.
         var pendingWrites = new Dictionary<ulong, List<(uint pageId, byte[] data)>>();
         var appliedAny = false;

-        for (var i = startIndex; i < records.Count; i++)
+        for (int i = startIndex; i < records.Count; i++)
         {
             var record = records[i];
             switch (record.Type)
             {
                 case WalRecordType.Begin:
                     if (!pendingWrites.ContainsKey(record.TransactionId))
-                    {
                         pendingWrites[record.TransactionId] = new List<(uint, byte[])>();
-                    }
                     break;
                 case WalRecordType.Write:
-                    if (record.AfterImage == null)
-                    {
-                        break;
-                    }
+                    if (record.AfterImage == null) break;

                     if (!pendingWrites.TryGetValue(record.TransactionId, out var writes))
                     {
@@ -228,12 +216,9 @@ public sealed partial class StorageEngine
                     writes.Add((record.PageId, record.AfterImage));
                     break;
                 case WalRecordType.Commit:
-                    if (!pendingWrites.TryGetValue(record.TransactionId, out var committedWrites))
-                    {
-                        break;
-                    }
+                    if (!pendingWrites.TryGetValue(record.TransactionId, out var committedWrites)) break;

-                    foreach (var (pageId, data) in committedWrites)
+                    foreach ((uint pageId, byte[] data) in committedWrites)
                     {
                         _pageFile.WritePage(pageId, data);
                         appliedAny = true;
@@ -251,19 +236,13 @@ public sealed partial class StorageEngine
         }

         // 3. Flush PageFile to ensure durability.
-        if (appliedAny)
-        {
-            _pageFile.Flush();
-        }
+        if (appliedAny) _pageFile.Flush();

         // 4. Clear in-memory WAL index (redundant since we just recovered).
         _walIndex.Clear();

         // 5. Truncate WAL (all changes now in PageFile).
-        if (_wal.GetCurrentSize() > 0)
-        {
-            _wal.Truncate();
-        }
+        if (_wal.GetCurrentSize() > 0) _wal.Truncate();
     }
     finally
     {
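The recovery code above does two passes over the WAL: scan backwards for the last checkpoint record to find the replay start, then replay forward, buffering each transaction's writes and applying them to the page file only when that transaction's commit record is reached. Uncommitted (torn) transactions are simply dropped. A compact Python sketch of the same replay, with records modeled as tuples:

```python
# Records modeled as (kind, txn_id, page_id, data); kind is one of
# "begin" / "write" / "commit" / "checkpoint".
def replay_wal(records):
    """Replay records after the last checkpoint; apply writes only at commit."""
    start = 0
    for i in range(len(records) - 1, -1, -1):      # backwards scan for checkpoint
        if records[i][0] == "checkpoint":
            start = i + 1
            break
    pending, applied = {}, {}
    for kind, txn, page_id, data in records[start:]:
        if kind == "begin":
            pending.setdefault(txn, [])
        elif kind == "write":
            pending.setdefault(txn, []).append((page_id, data))
        elif kind == "commit":
            for pid, d in pending.pop(txn, []):
                applied[pid] = d                   # stand-in for _pageFile.WritePage
    return applied

records = [
    ("begin", 1, None, None), ("write", 1, 7, b"old"), ("commit", 1, None, None),
    ("checkpoint", 0, None, None),                 # everything before this is skipped
    ("begin", 2, None, None), ("write", 2, 7, b"new"),
    ("write", 2, 9, b"x"), ("commit", 2, None, None),
    ("begin", 3, None, None), ("write", 3, 5, b"torn"),  # never committed: dropped
]
applied = replay_wal(records)
```

Buffering until commit is what makes recovery atomic: either all of a transaction's pages reach the page file or none do.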
@@ -1,5 +1,3 @@
-using System;
-using System.Collections.Generic;
 using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Bson.Schema;

@@ -17,7 +15,7 @@ public sealed partial class StorageEngine
         var schemas = new List<BsonSchema>();
         if (rootPageId == 0) return schemas;

-        var pageId = rootPageId;
+        uint pageId = rootPageId;
         var buffer = new byte[PageSize];

         while (pageId != 0)
@@ -33,7 +31,7 @@ public sealed partial class StorageEngine
             var reader = new BsonSpanReader(buffer.AsSpan(32, used), GetKeyReverseMap());
             while (reader.Remaining >= 4)
             {
-                var docSize = reader.PeekInt32();
+                int docSize = reader.PeekInt32();
                 if (docSize <= 0 || docSize > reader.Remaining) break;

                 var schema = BsonSchema.FromBson(ref reader);
@@ -60,7 +58,7 @@ public sealed partial class StorageEngine
         var tempBuffer = new byte[PageSize];
         var tempWriter = new BsonSpanWriter(tempBuffer, GetKeyMap());
         schema.ToBson(ref tempWriter);
-        var schemaSize = tempWriter.Position;
+        int schemaSize = tempWriter.Position;

         if (rootPageId == 0)
         {
@@ -106,7 +104,7 @@ public sealed partial class StorageEngine
         else
         {
             // Allocate new page
-            var newPageId = AllocatePage();
+            uint newPageId = AllocatePage();
             lastHeader.NextPageId = newPageId;
             lastHeader.WriteTo(buffer);
             WritePageImmediate(lastPageId, buffer);
@@ -4,6 +4,208 @@ namespace ZB.MOM.WW.CBDD.Core.Storage;

 public sealed partial class StorageEngine
 {
+    /// <summary>
+    /// Gets the number of active transactions (diagnostics).
+    /// </summary>
+    public int ActiveTransactionCount => _walCache.Count;
+
+    /// <summary>
+    /// Prepares a transaction: writes all changes to WAL but doesn't commit yet.
+    /// Part of 2-Phase Commit protocol.
+    /// </summary>
+    /// <param name="transactionId">Transaction ID</param>
+    /// <param name="writeSet">All writes to record in WAL</param>
+    /// <returns>True if preparation succeeded</returns>
+    public bool PrepareTransaction(ulong transactionId)
+    {
+        try
+        {
+            _wal.WriteBeginRecord(transactionId);
+
+            foreach (var walEntry in _walCache[transactionId])
+                _wal.WriteDataRecord(transactionId, walEntry.Key, walEntry.Value);
+
+            _wal.Flush(); // Ensure WAL is persisted
+            return true;
+        }
+        catch
+        {
+            // TODO: Log error?
+            return false;
+        }
+    }
+
+    /// <summary>
+    /// Prepares a transaction asynchronously by writing pending changes to the WAL.
+    /// </summary>
+    /// <param name="transactionId">The transaction identifier.</param>
+    /// <param name="ct">The cancellation token.</param>
+    /// <returns><see langword="true" /> if preparation succeeds; otherwise, <see langword="false" />.</returns>
+    public async Task<bool> PrepareTransactionAsync(ulong transactionId, CancellationToken ct = default)
+    {
+        try
+        {
+            await _wal.WriteBeginRecordAsync(transactionId, ct);
+
+            if (_walCache.TryGetValue(transactionId, out var changes))
+                foreach (var walEntry in changes)
+                    await _wal.WriteDataRecordAsync(transactionId, walEntry.Key, walEntry.Value, ct);
+
+            await _wal.FlushAsync(ct); // Ensure WAL is persisted
+            return true;
+        }
+        catch
+        {
+            return false;
+        }
+    }
+
+    /// <summary>
+    /// Commits a transaction:
+    /// 1. Writes all changes to WAL (for durability)
+    /// 2. Writes commit record
+    /// 3. Flushes WAL to disk
+    /// 4. Moves pages from cache to WAL index (for future reads)
+    /// 5. Clears WAL cache
+    /// </summary>
+    /// <param name="transactionId">Transaction to commit</param>
+    /// <param name="writeSet">All writes performed in this transaction (unused, kept for compatibility)</param>
+    public void CommitTransaction(ulong transactionId)
+    {
+        _commitLock.Wait();
+        try
+        {
+            CommitTransactionCore(transactionId);
+        }
+        finally
+        {
+            _commitLock.Release();
+        }
+    }
+
+    private void CommitTransactionCore(ulong transactionId)
+    {
+        // Get ALL pages from WAL cache (includes both data and index pages)
+        if (!_walCache.TryGetValue(transactionId, out var pages))
+        {
+            // No writes for this transaction, just write commit record
+            _wal.WriteCommitRecord(transactionId);
+            _wal.Flush();
+            return;
+        }
+
+        // 1. Write all changes to WAL (from cache, not writeSet!)
+        _wal.WriteBeginRecord(transactionId);
+
+        foreach ((uint pageId, byte[] data) in pages) _wal.WriteDataRecord(transactionId, pageId, data);
+
+        // 2. Write commit record and flush
+        _wal.WriteCommitRecord(transactionId);
+        _wal.Flush(); // Durability: ensure WAL is on disk
+
+        // 3. Move pages from cache to WAL index (for reads)
+        _walCache.TryRemove(transactionId, out _);
+        foreach (var kvp in pages) _walIndex[kvp.Key] = kvp.Value;
+
+        // Auto-checkpoint if WAL grows too large
+        if (_wal.GetCurrentSize() > MaxWalSize) CheckpointInternal();
+    }
+
+    /// <summary>
+    /// Commits a prepared transaction asynchronously by identifier.
+    /// </summary>
+    /// <param name="transactionId">The transaction identifier.</param>
+    /// <param name="ct">The cancellation token.</param>
+    public async Task CommitTransactionAsync(ulong transactionId, CancellationToken ct = default)
+    {
+        await _commitLock.WaitAsync(ct);
+        try
+        {
+            await CommitTransactionCoreAsync(transactionId, ct);
+        }
+        finally
+        {
+            _commitLock.Release();
+        }
+    }
+
+    private async Task CommitTransactionCoreAsync(ulong transactionId, CancellationToken ct)
|
||||||
|
{
|
||||||
|
// Get ALL pages from WAL cache (includes both data and index pages)
|
||||||
|
if (!_walCache.TryGetValue(transactionId, out var pages))
|
||||||
|
{
|
||||||
|
// No writes for this transaction, just write commit record
|
||||||
|
await _wal.WriteCommitRecordAsync(transactionId, ct);
|
||||||
|
await _wal.FlushAsync(ct);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// 1. Write all changes to WAL (from cache, not writeSet!)
|
||||||
|
await _wal.WriteBeginRecordAsync(transactionId, ct);
|
||||||
|
|
||||||
|
foreach ((uint pageId, byte[] data) in pages) await _wal.WriteDataRecordAsync(transactionId, pageId, data, ct);
|
||||||
|
|
||||||
|
// 2. Write commit record and flush
|
||||||
|
await _wal.WriteCommitRecordAsync(transactionId, ct);
|
||||||
|
await _wal.FlushAsync(ct); // Durability: ensure WAL is on disk
|
||||||
|
|
||||||
|
// 3. Move pages from cache to WAL index (for reads)
|
||||||
|
_walCache.TryRemove(transactionId, out _);
|
||||||
|
foreach (var kvp in pages) _walIndex[kvp.Key] = kvp.Value;
|
||||||
|
|
||||||
|
// Auto-checkpoint if WAL grows too large
|
||||||
|
if (_wal.GetCurrentSize() > MaxWalSize)
|
||||||
|
// Checkpoint might be sync or async. For now sync inside the lock is "safe" but blocking.
|
||||||
|
// Ideally this should be async too.
|
||||||
|
CheckpointInternal();
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Marks a transaction as committed after WAL writes.
|
||||||
|
/// Used for 2PC: after Prepare() writes to WAL, this finalizes the commit.
|
||||||
|
/// </summary>
|
||||||
|
/// <param name="transactionId">Transaction to mark committed</param>
|
||||||
|
public void MarkTransactionCommitted(ulong transactionId)
|
||||||
|
{
|
||||||
|
_commitLock.Wait();
|
||||||
|
try
|
||||||
|
{
|
||||||
|
_wal.WriteCommitRecord(transactionId);
|
||||||
|
_wal.Flush();
|
||||||
|
|
||||||
|
// Move from cache to WAL index
|
||||||
|
if (_walCache.TryRemove(transactionId, out var pages))
|
||||||
|
foreach (var kvp in pages)
|
||||||
|
_walIndex[kvp.Key] = kvp.Value;
|
||||||
|
|
||||||
|
// Auto-checkpoint if WAL grows too large
|
||||||
|
if (_wal.GetCurrentSize() > MaxWalSize) CheckpointInternal();
|
||||||
|
}
|
||||||
|
finally
|
||||||
|
{
|
||||||
|
_commitLock.Release();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Rolls back a transaction: discards all uncommitted changes.
|
||||||
|
/// </summary>
|
||||||
|
/// <param name="transactionId">Transaction to rollback</param>
|
||||||
|
public void RollbackTransaction(ulong transactionId)
|
||||||
|
{
|
||||||
|
_walCache.TryRemove(transactionId, out _);
|
||||||
|
_wal.WriteAbortRecord(transactionId);
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Writes an abort record for the specified transaction.
|
||||||
|
/// </summary>
|
||||||
|
/// <param name="transactionId">The transaction identifier.</param>
|
||||||
|
internal void WriteAbortRecord(ulong transactionId)
|
||||||
|
{
|
||||||
|
_wal.WriteAbortRecord(transactionId);
|
||||||
|
}
|
||||||
|
|
||||||
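The commit path in the hunk above follows the standard WAL protocol: append a begin record, append one data record per dirty page, append a commit record, flush (the durability point), then publish the pages to the WAL index so later reads see them. The sketch below models that sequence in Python; it is a hypothetical illustration (`MiniWal`, `MiniEngine`, and all names are invented), not the CBDD API.

```python
# Minimal model of the WAL commit protocol shown above (hypothetical, not CBDD's API).
class MiniWal:
    def __init__(self):
        self.log = []  # append-only record list, standing in for the WAL file

    def write_begin(self, txn):
        self.log.append(("BEGIN", txn))

    def write_data(self, txn, page_id, data):
        self.log.append(("DATA", txn, page_id, data))

    def write_commit(self, txn):
        self.log.append(("COMMIT", txn))

    def flush(self):
        pass  # a real engine would fsync here; this is the durability point


class MiniEngine:
    def __init__(self):
        self.wal = MiniWal()
        self.wal_cache = {}  # txn_id -> {page_id: data}, uncommitted writes
        self.wal_index = {}  # page_id -> data, committed but not yet checkpointed

    def write_page(self, txn, page_id, data):
        self.wal_cache.setdefault(txn, {})[page_id] = data

    def commit(self, txn):
        pages = self.wal_cache.pop(txn, None)
        self.wal.write_begin(txn)
        if pages:
            for page_id, data in pages.items():
                self.wal.write_data(txn, page_id, data)
        self.wal.write_commit(txn)
        self.wal.flush()                  # durable once flushed
        if pages:
            self.wal_index.update(pages)  # publish for future reads

    def rollback(self, txn):
        self.wal_cache.pop(txn, None)     # uncommitted writes simply vanish


engine = MiniEngine()
engine.write_page(1, 42, b"hello")
engine.commit(1)
engine.write_page(2, 43, b"discarded")
engine.rollback(2)
```

Rollback never touches the WAL index or the baseline file, which is why it is cheap: uncommitted writes only ever live in the per-transaction cache.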
     #region Transaction Management

     /// <summary>
@@ -16,7 +218,7 @@ public sealed partial class StorageEngine
         _commitLock.Wait();
         try
         {
-            var txnId = _nextTransactionId++;
+            ulong txnId = _nextTransactionId++;
             var transaction = new Transaction(txnId, this, isolationLevel);
             _activeTransactions[txnId] = transaction;
             return transaction;
@@ -33,12 +235,13 @@ public sealed partial class StorageEngine
     /// <param name="isolationLevel">The transaction isolation level.</param>
     /// <param name="ct">The cancellation token.</param>
     /// <returns>The started transaction.</returns>
-    public async Task<Transaction> BeginTransactionAsync(IsolationLevel isolationLevel = IsolationLevel.ReadCommitted, CancellationToken ct = default)
+    public async Task<Transaction> BeginTransactionAsync(IsolationLevel isolationLevel = IsolationLevel.ReadCommitted,
+        CancellationToken ct = default)
     {
         await _commitLock.WaitAsync(ct);
         try
         {
-            var txnId = _nextTransactionId++;
+            ulong txnId = _nextTransactionId++;
             var transaction = new Transaction(txnId, this, isolationLevel);
             _activeTransactions[txnId] = transaction;
             return transaction;
@@ -121,236 +324,4 @@ public sealed partial class StorageEngine
     // but for consistency we might consider it. For now, sync is fine as it's not the happy path bottleneck.

     #endregion

-    /// <summary>
-    /// Prepares a transaction: writes all changes to WAL but doesn't commit yet.
-    /// Part of 2-Phase Commit protocol.
-    /// </summary>
-    /// <param name="transactionId">Transaction ID</param>
-    /// <param name="writeSet">All writes to record in WAL</param>
-    /// <returns>True if preparation succeeded</returns>
-    public bool PrepareTransaction(ulong transactionId)
-    {
-        try
-        {
-            _wal.WriteBeginRecord(transactionId);
-
-            foreach (var walEntry in _walCache[transactionId])
-            {
-                _wal.WriteDataRecord(transactionId, walEntry.Key, walEntry.Value);
-            }
-
-            _wal.Flush(); // Ensure WAL is persisted
-            return true;
-        }
-        catch
-        {
-            // TODO: Log error?
-            return false;
-        }
-    }
-
-    /// <summary>
-    /// Prepares a transaction asynchronously by writing pending changes to the WAL.
-    /// </summary>
-    /// <param name="transactionId">The transaction identifier.</param>
-    /// <param name="ct">The cancellation token.</param>
-    /// <returns><see langword="true"/> if preparation succeeds; otherwise, <see langword="false"/>.</returns>
-    public async Task<bool> PrepareTransactionAsync(ulong transactionId, CancellationToken ct = default)
-    {
-        try
-        {
-            await _wal.WriteBeginRecordAsync(transactionId, ct);
-
-            if (_walCache.TryGetValue(transactionId, out var changes))
-            {
-                foreach (var walEntry in changes)
-                {
-                    await _wal.WriteDataRecordAsync(transactionId, walEntry.Key, walEntry.Value, ct);
-                }
-            }
-
-            await _wal.FlushAsync(ct); // Ensure WAL is persisted
-            return true;
-        }
-        catch
-        {
-            return false;
-        }
-    }
-
-    /// <summary>
-    /// Commits a transaction:
-    /// 1. Writes all changes to WAL (for durability)
-    /// 2. Writes commit record
-    /// 3. Flushes WAL to disk
-    /// 4. Moves pages from cache to WAL index (for future reads)
-    /// 5. Clears WAL cache
-    /// </summary>
-    /// <param name="transactionId">Transaction to commit</param>
-    /// <param name="writeSet">All writes performed in this transaction (unused, kept for compatibility)</param>
-    public void CommitTransaction(ulong transactionId)
-    {
-        _commitLock.Wait();
-        try
-        {
-            CommitTransactionCore(transactionId);
-        }
-        finally
-        {
-            _commitLock.Release();
-        }
-    }
-
-    private void CommitTransactionCore(ulong transactionId)
-    {
-        // Get ALL pages from WAL cache (includes both data and index pages)
-        if (!_walCache.TryGetValue(transactionId, out var pages))
-        {
-            // No writes for this transaction, just write commit record
-            _wal.WriteCommitRecord(transactionId);
-            _wal.Flush();
-            return;
-        }
-
-        // 1. Write all changes to WAL (from cache, not writeSet!)
-        _wal.WriteBeginRecord(transactionId);
-
-        foreach (var (pageId, data) in pages)
-        {
-            _wal.WriteDataRecord(transactionId, pageId, data);
-        }
-
-        // 2. Write commit record and flush
-        _wal.WriteCommitRecord(transactionId);
-        _wal.Flush(); // Durability: ensure WAL is on disk
-
-        // 3. Move pages from cache to WAL index (for reads)
-        _walCache.TryRemove(transactionId, out _);
-        foreach (var kvp in pages)
-        {
-            _walIndex[kvp.Key] = kvp.Value;
-        }
-
-        // Auto-checkpoint if WAL grows too large
-        if (_wal.GetCurrentSize() > MaxWalSize)
-        {
-            CheckpointInternal();
-        }
-    }
-
-    /// <summary>
-    /// Commits a prepared transaction asynchronously by identifier.
-    /// </summary>
-    /// <param name="transactionId">The transaction identifier.</param>
-    /// <param name="ct">The cancellation token.</param>
-    public async Task CommitTransactionAsync(ulong transactionId, CancellationToken ct = default)
-    {
-        await _commitLock.WaitAsync(ct);
-        try
-        {
-            await CommitTransactionCoreAsync(transactionId, ct);
-        }
-        finally
-        {
-            _commitLock.Release();
-        }
-    }
-
-    private async Task CommitTransactionCoreAsync(ulong transactionId, CancellationToken ct)
-    {
-        // Get ALL pages from WAL cache (includes both data and index pages)
-        if (!_walCache.TryGetValue(transactionId, out var pages))
-        {
-            // No writes for this transaction, just write commit record
-            await _wal.WriteCommitRecordAsync(transactionId, ct);
-            await _wal.FlushAsync(ct);
-            return;
-        }
-
-        // 1. Write all changes to WAL (from cache, not writeSet!)
-        await _wal.WriteBeginRecordAsync(transactionId, ct);
-
-        foreach (var (pageId, data) in pages)
-        {
-            await _wal.WriteDataRecordAsync(transactionId, pageId, data, ct);
-        }
-
-        // 2. Write commit record and flush
-        await _wal.WriteCommitRecordAsync(transactionId, ct);
-        await _wal.FlushAsync(ct); // Durability: ensure WAL is on disk
-
-        // 3. Move pages from cache to WAL index (for reads)
-        _walCache.TryRemove(transactionId, out _);
-        foreach (var kvp in pages)
-        {
-            _walIndex[kvp.Key] = kvp.Value;
-        }
-
-        // Auto-checkpoint if WAL grows too large
-        if (_wal.GetCurrentSize() > MaxWalSize)
-        {
-            // Checkpoint might be sync or async. For now sync inside the lock is "safe" but blocking.
-            // Ideally this should be async too.
-            CheckpointInternal();
-        }
-    }
-
-    /// <summary>
-    /// Marks a transaction as committed after WAL writes.
-    /// Used for 2PC: after Prepare() writes to WAL, this finalizes the commit.
-    /// </summary>
-    /// <param name="transactionId">Transaction to mark committed</param>
-    public void MarkTransactionCommitted(ulong transactionId)
-    {
-        _commitLock.Wait();
-        try
-        {
-            _wal.WriteCommitRecord(transactionId);
-            _wal.Flush();
-
-            // Move from cache to WAL index
-            if (_walCache.TryRemove(transactionId, out var pages))
-            {
-                foreach (var kvp in pages)
-                {
-                    _walIndex[kvp.Key] = kvp.Value;
-                }
-            }
-
-            // Auto-checkpoint if WAL grows too large
-            if (_wal.GetCurrentSize() > MaxWalSize)
-            {
-                CheckpointInternal();
-            }
-        }
-        finally
-        {
-            _commitLock.Release();
-        }
-    }
-
-    /// <summary>
-    /// Rolls back a transaction: discards all uncommitted changes.
-    /// </summary>
-    /// <param name="transactionId">Transaction to rollback</param>
-    public void RollbackTransaction(ulong transactionId)
-    {
-        _walCache.TryRemove(transactionId, out _);
-        _wal.WriteAbortRecord(transactionId);
-    }
-
-    /// <summary>
-    /// Writes an abort record for the specified transaction.
-    /// </summary>
-    /// <param name="transactionId">The transaction identifier.</param>
-    internal void WriteAbortRecord(ulong transactionId)
-    {
-        _wal.WriteAbortRecord(transactionId);
-    }
-
-    /// <summary>
-    /// Gets the number of active transactions (diagnostics).
-    /// </summary>
-    public int ActiveTransactionCount => _walCache.Count;
 }
@@ -1,4 +1,5 @@
 using System.Collections.Concurrent;
+using ZB.MOM.WW.CBDD.Core.CDC;
 using ZB.MOM.WW.CBDD.Core.Compression;
 using ZB.MOM.WW.CBDD.Core.Transactions;

@@ -6,7 +7,6 @@ namespace ZB.MOM.WW.CBDD.Core.Storage;

 /// <summary>
 /// Central storage engine managing page-based storage with WAL for durability.
-///
 /// Architecture (WAL-based like SQLite/PostgreSQL):
 /// - PageFile: Committed baseline (persistent on disk)
 /// - WAL Cache: Uncommitted transaction writes (in-memory)
@@ -16,14 +16,15 @@ namespace ZB.MOM.WW.CBDD.Core.Storage;
 /// </summary>
 public sealed partial class StorageEngine : IStorageEngine, IDisposable
 {
+    private const long MaxWalSize = 4 * 1024 * 1024; // 4MB
+
+    // Transaction Management
+    private readonly ConcurrentDictionary<ulong, Transaction> _activeTransactions;
+
+    // Global lock for commit/checkpoint synchronization
+    private readonly SemaphoreSlim _commitLock = new(1, 1);
     private readonly PageFile _pageFile;
     private readonly WriteAheadLog _wal;
-    private readonly CompressionOptions _compressionOptions;
-    private readonly CompressionService _compressionService;
-    private readonly CompressionTelemetry _compressionTelemetry;
-    private readonly StorageFormatMetadata _storageFormatMetadata;
-    private readonly MaintenanceOptions _maintenanceOptions;
-    private CDC.ChangeStreamDispatcher? _cdc;

     // WAL cache: TransactionId → (PageId → PageData)
     // Stores uncommitted writes for "Read Your Own Writes" isolation
@@ -32,16 +33,8 @@ public sealed partial class StorageEngine : IStorageEngine, IDisposable
     // WAL index cache: PageId → PageData (from latest committed transaction)
     // Lazily populated on first read after commit
     private readonly ConcurrentDictionary<uint, byte[]> _walIndex;
-
-    // Global lock for commit/checkpoint synchronization
-    private readonly SemaphoreSlim _commitLock = new(1, 1);
-
-    // Transaction Management
-    private readonly ConcurrentDictionary<ulong, Transaction> _activeTransactions;
     private ulong _nextTransactionId;
-
-    private const long MaxWalSize = 4 * 1024 * 1024; // 4MB

     /// <summary>
     /// Initializes a new instance of the <see cref="StorageEngine" /> class.
     /// </summary>
@@ -55,13 +48,13 @@ public sealed partial class StorageEngine : IStorageEngine, IDisposable
         CompressionOptions? compressionOptions = null,
         MaintenanceOptions? maintenanceOptions = null)
     {
-        _compressionOptions = CompressionOptions.Normalize(compressionOptions);
-        _compressionService = new CompressionService();
-        _compressionTelemetry = new CompressionTelemetry();
-        _maintenanceOptions = maintenanceOptions ?? new MaintenanceOptions();
+        CompressionOptions = CompressionOptions.Normalize(compressionOptions);
+        CompressionService = new CompressionService();
+        CompressionTelemetry = new CompressionTelemetry();
+        MaintenanceOptions = maintenanceOptions ?? new MaintenanceOptions();

         // Auto-derive WAL path
-        var walPath = Path.ChangeExtension(databasePath, ".wal");
+        string walPath = Path.ChangeExtension(databasePath, ".wal");

         // Initialize storage infrastructure
         _pageFile = new PageFile(databasePath, config);
@@ -72,14 +65,11 @@ public sealed partial class StorageEngine : IStorageEngine, IDisposable
         _walIndex = new ConcurrentDictionary<uint, byte[]>();
         _activeTransactions = new ConcurrentDictionary<ulong, Transaction>();
         _nextTransactionId = 1;
-        _storageFormatMetadata = InitializeStorageFormatMetadata();
+        StorageFormatMetadata = InitializeStorageFormatMetadata();

         // Recover from WAL if exists (crash recovery or resume after close)
         // This replays any committed transactions not yet checkpointed
-        if (_wal.GetCurrentSize() > 0)
-        {
-            Recover();
-        }
+        if (_wal.GetCurrentSize() > 0) Recover();

         _ = ResumeCompactionIfNeeded();

@@ -91,35 +81,35 @@ public sealed partial class StorageEngine : IStorageEngine, IDisposable
         // _checkpointManager.StartAutoCheckpoint();
     }

-    /// <summary>
-    /// Page size for this storage engine
-    /// </summary>
-    public int PageSize => _pageFile.PageSize;
-
     /// <summary>
     /// Compression options for this engine instance.
     /// </summary>
-    public CompressionOptions CompressionOptions => _compressionOptions;
+    public CompressionOptions CompressionOptions { get; }

     /// <summary>
     /// Compression codec service for payload roundtrip operations.
     /// </summary>
-    public CompressionService CompressionService => _compressionService;
+    public CompressionService CompressionService { get; }

     /// <summary>
     /// Compression telemetry counters for this engine instance.
     /// </summary>
-    public CompressionTelemetry CompressionTelemetry => _compressionTelemetry;
-
-    /// <summary>
-    /// Returns a point-in-time snapshot of compression telemetry counters.
-    /// </summary>
-    public CompressionStats GetCompressionStats() => _compressionTelemetry.GetSnapshot();
+    public CompressionTelemetry CompressionTelemetry { get; }

     /// <summary>
     /// Gets storage format metadata associated with the current database.
     /// </summary>
-    internal StorageFormatMetadata StorageFormatMetadata => _storageFormatMetadata;
+    internal StorageFormatMetadata StorageFormatMetadata { get; }
+
+    /// <summary>
+    /// Gets the registered change stream dispatcher, if available.
+    /// </summary>
+    internal ChangeStreamDispatcher? Cdc { get; private set; }
+
+    /// <summary>
+    /// Page size for this storage engine
+    /// </summary>
+    public int PageSize => _pageFile.PageSize;

     /// <summary>
     /// Checks if a page is currently being modified by another active transaction.
@@ -132,13 +122,14 @@ public sealed partial class StorageEngine : IStorageEngine, IDisposable
     {
         foreach (var kvp in _walCache)
         {
-            var txId = kvp.Key;
+            ulong txId = kvp.Key;
             if (txId == excludingTxId) continue;

             var txnPages = kvp.Value;
             if (txnPages.ContainsKey(pageId))
                 return true;
         }

         return false;
     }

@@ -151,13 +142,15 @@ public sealed partial class StorageEngine : IStorageEngine, IDisposable
         if (_activeTransactions != null)
         {
             foreach (var txn in _activeTransactions.Values)
-            {
                 try
                 {
                     RollbackTransaction(txn.TransactionId);
                 }
-                catch { /* Ignore errors during dispose */ }
-            }
+                catch
+                {
+                    /* Ignore errors during dispose */
+                }

             _activeTransactions.Clear();
         }

@@ -168,32 +161,38 @@ public sealed partial class StorageEngine : IStorageEngine, IDisposable
         _commitLock?.Dispose();
     }

+    /// <inheritdoc />
+    void IStorageEngine.RegisterCdc(ChangeStreamDispatcher cdc)
+    {
+        RegisterCdc(cdc);
+    }
+
+    /// <inheritdoc />
+    ChangeStreamDispatcher? IStorageEngine.Cdc => Cdc;
+
+    /// <inheritdoc />
+    CompressionOptions IStorageEngine.CompressionOptions => CompressionOptions;
+
+    /// <inheritdoc />
+    CompressionService IStorageEngine.CompressionService => CompressionService;
+
+    /// <inheritdoc />
+    CompressionTelemetry IStorageEngine.CompressionTelemetry => CompressionTelemetry;
+
+    /// <summary>
+    /// Returns a point-in-time snapshot of compression telemetry counters.
+    /// </summary>
+    public CompressionStats GetCompressionStats()
+    {
+        return CompressionTelemetry.GetSnapshot();
+    }
+
     /// <summary>
     /// Registers the change stream dispatcher used for CDC notifications.
     /// </summary>
     /// <param name="cdc">The change stream dispatcher instance.</param>
-    internal void RegisterCdc(CDC.ChangeStreamDispatcher cdc)
+    internal void RegisterCdc(ChangeStreamDispatcher cdc)
     {
-        _cdc = cdc;
+        Cdc = cdc;
     }
-
-    /// <summary>
-    /// Gets the registered change stream dispatcher, if available.
-    /// </summary>
-    internal CDC.ChangeStreamDispatcher? Cdc => _cdc;
-
-    /// <inheritdoc />
-    void IStorageEngine.RegisterCdc(CDC.ChangeStreamDispatcher cdc) => RegisterCdc(cdc);
-
-    /// <inheritdoc />
-    CDC.ChangeStreamDispatcher? IStorageEngine.Cdc => _cdc;
-
-    /// <inheritdoc />
-    CompressionOptions IStorageEngine.CompressionOptions => _compressionOptions;
-
-    /// <inheritdoc />
-    CompressionService IStorageEngine.CompressionService => _compressionService;
-
-    /// <inheritdoc />
-    CompressionTelemetry IStorageEngine.CompressionTelemetry => _compressionTelemetry;
 }
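The field comments above imply a three-tier read path: a transaction first sees its own uncommitted writes (WAL cache, "Read Your Own Writes"), then the latest committed pages (WAL index), then the checkpointed baseline (PageFile). A minimal sketch of that lookup order, as a hypothetical Python model rather than the engine's actual read routine:

```python
# Hypothetical model of the page-read priority implied by the comments above:
# WAL cache (own uncommitted writes) -> WAL index (committed, not yet
# checkpointed) -> PageFile (committed baseline on disk).
def read_page(txn_id, page_id, wal_cache, wal_index, page_file):
    txn_pages = wal_cache.get(txn_id)
    if txn_pages and page_id in txn_pages:
        return txn_pages[page_id]   # read-your-own-writes
    if page_id in wal_index:
        return wal_index[page_id]   # committed since the last checkpoint
    return page_file.get(page_id)   # checkpointed baseline


wal_cache = {7: {1: b"mine"}}                     # txn 7 has modified page 1
wal_index = {1: b"committed", 2: b"new"}          # committed, pre-checkpoint
page_file = {1: b"old", 2: b"old", 3: b"base"}    # on-disk baseline
```

Under this model a checkpoint is simply moving `wal_index` entries into `page_file` and truncating the WAL; readers never observe an intermediate state because each tier shadows the one below it.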
@@ -1,5 +1,5 @@
|
|||||||
|
using System.Buffers.Binary;
|
||||||
using System.Runtime.InteropServices;
|
using System.Runtime.InteropServices;
|
||||||
using ZB.MOM.WW.CBDD.Core.Indexing;
|
|
||||||
|
|
||||||
namespace ZB.MOM.WW.CBDD.Core.Storage;
|
namespace ZB.MOM.WW.CBDD.Core.Storage;
|
||||||
|
|
||||||
@@ -30,7 +30,7 @@ public struct VectorPage
|
|||||||
public static void IncrementNodeCount(Span<byte> page)
|
public static void IncrementNodeCount(Span<byte> page)
|
||||||
{
|
{
|
||||||
int count = GetNodeCount(page);
|
int count = GetNodeCount(page);
|
||||||
System.Buffers.Binary.BinaryPrimitives.WriteInt32LittleEndian(page.Slice(NodeCountOffset), count + 1);
|
BinaryPrimitives.WriteInt32LittleEndian(page.Slice(NodeCountOffset), count + 1);
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -52,17 +52,17 @@ public struct VectorPage
|
|||||||
};
|
};
|
||||||
header.WriteTo(page);
|
header.WriteTo(page);
|
||||||
|
|
||||||
System.Buffers.Binary.BinaryPrimitives.WriteInt32LittleEndian(page.Slice(DimensionsOffset), dimensions);
|
BinaryPrimitives.WriteInt32LittleEndian(page.Slice(DimensionsOffset), dimensions);
|
||||||
System.Buffers.Binary.BinaryPrimitives.WriteInt32LittleEndian(page.Slice(MaxMOffset), maxM);
|
BinaryPrimitives.WriteInt32LittleEndian(page.Slice(MaxMOffset), maxM);
|
||||||
|
|
||||||
// Node Size Calculation:
|
// Node Size Calculation:
|
||||||
// Location (6) + MaxLevel (1) + Vector (dim * 4) + Links (maxM * 10 * 6) -- estimating 10 levels for simplicity
|
// Location (6) + MaxLevel (1) + Vector (dim * 4) + Links (maxM * 10 * 6) -- estimating 10 levels for simplicity
|
||||||
// Better: Node size is variable? No, let's keep it fixed per index configuration to avoid fragmentation.
|
// Better: Node size is variable? No, let's keep it fixed per index configuration to avoid fragmentation.
|
||||||
// HNSW standard: level 0 has 2*M links, levels > 0 have M links.
|
// HNSW standard: level 0 has 2*M links, levels > 0 have M links.
|
||||||
// Max level is typically < 16. Let's reserve space for 16 levels.
|
// Max level is typically < 16. Let's reserve space for 16 levels.
|
||||||
int nodeSize = 6 + 1 + (dimensions * 4) + (maxM * (2 + 15) * 6);
|
int nodeSize = 6 + 1 + dimensions * 4 + maxM * (2 + 15) * 6;
|
||||||
System.Buffers.Binary.BinaryPrimitives.WriteInt32LittleEndian(page.Slice(NodeSizeOffset), nodeSize);
|
BinaryPrimitives.WriteInt32LittleEndian(page.Slice(NodeSizeOffset), nodeSize);
|
||||||
System.Buffers.Binary.BinaryPrimitives.WriteInt32LittleEndian(page.Slice(NodeCountOffset), 0);
|
BinaryPrimitives.WriteInt32LittleEndian(page.Slice(NodeCountOffset), 0);
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -70,24 +70,30 @@ public struct VectorPage
    /// </summary>
    /// <param name="page">The page buffer.</param>
    /// <returns>The node count.</returns>
    public static int GetNodeCount(ReadOnlySpan<byte> page)
    {
        return BinaryPrimitives.ReadInt32LittleEndian(page.Slice(NodeCountOffset));
    }

    /// <summary>
    /// Gets the configured node size for the page.
    /// </summary>
    /// <param name="page">The page buffer.</param>
    /// <returns>The node size in bytes.</returns>
    public static int GetNodeSize(ReadOnlySpan<byte> page)
    {
        return BinaryPrimitives.ReadInt32LittleEndian(page.Slice(NodeSizeOffset));
    }

    /// <summary>
    /// Gets the maximum number of nodes that can fit in the page.
    /// </summary>
    /// <param name="page">The page buffer.</param>
    /// <returns>The maximum node count.</returns>
    public static int GetMaxNodes(ReadOnlySpan<byte> page)
    {
        return (page.Length - DataOffset) / GetNodeSize(page);
    }

    /// <summary>
    /// Writes a node to the page at the specified index.
@@ -98,10 +104,11 @@ public struct VectorPage
    /// <param name="maxLevel">The maximum graph level for the node.</param>
    /// <param name="vector">The vector values to store.</param>
    /// <param name="dimensions">The vector dimensionality.</param>
    public static void WriteNode(Span<byte> page, int nodeIndex, DocumentLocation loc, int maxLevel,
        ReadOnlySpan<float> vector, int dimensions)
    {
        int nodeSize = GetNodeSize(page);
        int offset = DataOffset + nodeIndex * nodeSize;
        var nodeSpan = page.Slice(offset, nodeSize);

        // 1. Document Location
@@ -127,10 +134,11 @@ public struct VectorPage
    /// <param name="loc">When this method returns, contains the node document location.</param>
    /// <param name="maxLevel">When this method returns, contains the node max level.</param>
    /// <param name="vector">The destination span for vector values.</param>
    public static void ReadNodeData(ReadOnlySpan<byte> page, int nodeIndex, out DocumentLocation loc, out int maxLevel,
        Span<float> vector)
    {
        int nodeSize = GetNodeSize(page);
        int offset = DataOffset + nodeIndex * nodeSize;
        var nodeSpan = page.Slice(offset, nodeSize);

        loc = DocumentLocation.ReadFrom(nodeSpan.Slice(0, 6));
@@ -152,23 +160,19 @@ public struct VectorPage
    public static Span<byte> GetLinksSpan(Span<byte> page, int nodeIndex, int level, int dimensions, int maxM)
    {
        int nodeSize = GetNodeSize(page);
        int nodeOffset = DataOffset + nodeIndex * nodeSize;

        // Link offset: Location(6) + MaxLevel(1) + Vector(dim*4)
        int linkBaseOffset = nodeOffset + 7 + dimensions * 4;

        int levelOffset;
        if (level == 0)
            levelOffset = 0;
        else
            // Level 0 has 2*M links
            levelOffset = 2 * maxM * 6 + (level - 1) * maxM * 6;

        int count = level == 0 ? 2 * maxM : maxM;
        return page.Slice(linkBaseOffset + levelOffset, count * 6);
    }
}
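The offset arithmetic in `GetLinksSpan` can be checked in isolation. The sketch below mirrors the same layout (links start after Location(6) + MaxLevel(1) + Vector(dim * 4); level 0 stores 2*M six-byte entries, higher levels M each) and returns the region relative to the node start; the sample parameters are illustrative.

```csharp
using System;

// Mirrors the link layout used by GetLinksSpan: offset and length of the
// link region for a given level, relative to the start of the node.
static (int Offset, int Length) LinkRegion(int level, int dimensions, int maxM)
{
    int baseOffset = 7 + dimensions * 4;                 // Location(6) + MaxLevel(1) + Vector
    int levelOffset = level == 0 ? 0 : 2 * maxM * 6 + (level - 1) * maxM * 6;
    int count = level == 0 ? 2 * maxM : maxM;            // level 0 has 2*M links
    return (baseOffset + levelOffset, count * 6);
}

Console.WriteLine(LinkRegion(0, 128, 16)); // (519, 192)
Console.WriteLine(LinkRegion(2, 128, 16)); // (807, 96)
```

Because every region is a pure function of `(level, dimensions, maxM)`, no per-node link directory needs to be stored in the page.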
@@ -1,7 +1,3 @@
namespace ZB.MOM.WW.CBDD.Core.Transactions;

/// <summary>
@@ -3,24 +3,30 @@ namespace ZB.MOM.WW.CBDD.Core.Transactions;
/// <summary>
/// Defines a contract for managing and providing access to the current transaction context.
/// </summary>
/// <remarks>
/// Implementations of this interface are responsible for tracking the current transaction and starting a
/// new one if none exists. This is typically used in scenarios where transactional consistency is required across
/// multiple operations.
/// </remarks>
public interface ITransactionHolder
{
    /// <summary>
    /// Gets the current transaction if one exists; otherwise, starts a new transaction.
    /// </summary>
    /// <remarks>
    /// Use this method to ensure that a transaction context is available for the current operation.
    /// If a transaction is already in progress, it is returned; otherwise, a new transaction is started and returned.
    /// The caller is responsible for managing the transaction's lifetime as appropriate.
    /// </remarks>
    /// <returns>An <see cref="ITransaction" /> representing the current transaction, or a new transaction if none is active.</returns>
    ITransaction GetCurrentTransactionOrStart();

    /// <summary>
    /// Gets the current transaction if one exists; otherwise, starts a new transaction asynchronously.
    /// </summary>
    /// <returns>
    /// A task that represents the asynchronous operation. The task result contains an <see cref="ITransaction" />
    /// representing the current or newly started transaction.
    /// </returns>
    Task<ITransaction> GetCurrentTransactionOrStartAsync();
}
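The "return the ambient transaction or start one" contract can be sketched without the real CBDD types. The snippet below uses a plain counter as a stand-in for `ITransaction` (all names here are illustrative); the point is only the get-or-start semantics that implementations of `ITransactionHolder` are expected to provide.

```csharp
using System;
using System.Threading.Tasks;

ulong? current = null;   // the ambient transaction, if any
ulong next = 0;          // toy transaction-id generator

// Return the ambient "transaction" if present; otherwise start one.
ulong GetCurrentTransactionOrStart()
{
    current ??= ++next;
    return current.Value;
}

// The async variant returns the same ambient value.
Task<ulong> GetCurrentTransactionOrStartAsync() =>
    Task.FromResult(GetCurrentTransactionOrStart());

ulong a = GetCurrentTransactionOrStart();
ulong b = await GetCurrentTransactionOrStartAsync();
Console.WriteLine(a == b); // True: both calls observe the same transaction
```

Real implementations would typically scope `current` per request or per async flow (e.g. an `AsyncLocal<T>`), which is exactly the lifetime question the interface remarks leave to the caller.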
@@ -1,8 +1,6 @@
using ZB.MOM.WW.CBDD.Bson;
using ZB.MOM.WW.CBDD.Core.CDC;
using ZB.MOM.WW.CBDD.Core.Storage;

namespace ZB.MOM.WW.CBDD.Core.Transactions;

@@ -12,12 +10,8 @@ namespace ZB.MOM.WW.CBDD.Core.Transactions;
/// </summary>
public sealed class Transaction : ITransaction
{
    private readonly List<InternalChangeEvent> _pendingChanges = new();
    private readonly StorageEngine _storage;
    private bool _disposed;

    /// <summary>
@@ -30,41 +24,32 @@ public sealed class Transaction : ITransaction
        StorageEngine storage,
        IsolationLevel isolationLevel = IsolationLevel.ReadCommitted)
    {
        TransactionId = transactionId;
        _storage = storage ?? throw new ArgumentNullException(nameof(storage));
        IsolationLevel = isolationLevel;
        StartTime = DateTime.UtcNow;
        State = TransactionState.Active;
    }

    /// <summary>
    /// Gets the configured transaction isolation level.
    /// </summary>
    public IsolationLevel IsolationLevel { get; }

    /// <summary>
    /// Gets the UTC start time of the transaction.
    /// </summary>
    public DateTime StartTime { get; }

    /// <summary>
    /// Gets the unique transaction identifier.
    /// </summary>
    public ulong TransactionId { get; }

    /// <summary>
    /// Gets the current transaction state.
    /// </summary>
    public TransactionState State { get; private set; }

    /// <summary>
    /// Adds a write operation to the transaction's write set.
@@ -74,13 +59,13 @@ public sealed class Transaction : ITransaction
    /// <param name="operation">The write operation to add.</param>
    public void AddWrite(WriteOperation operation)
    {
        if (State != TransactionState.Active)
            throw new InvalidOperationException($"Cannot add writes to transaction in state {State}");

        // Defensive copy: necessary to prevent use-after-return if caller uses pooled buffers
        byte[] ownedCopy = operation.NewValue.ToArray();
        // StorageEngine handles all transactional writes
        _storage.WritePage(operation.PageId, TransactionId, ownedCopy);
    }

    /// <summary>
@@ -88,13 +73,13 @@ public sealed class Transaction : ITransaction
    /// </summary>
    public bool Prepare()
    {
        if (State != TransactionState.Active)
            return false;

        State = TransactionState.Preparing;

        // StorageEngine handles WAL writes
        return _storage.PrepareTransaction(TransactionId);
    }

    /// <summary>
@@ -104,23 +89,19 @@ public sealed class Transaction : ITransaction
    /// </summary>
    public void Commit()
    {
        if (State != TransactionState.Preparing && State != TransactionState.Active)
            throw new InvalidOperationException($"Cannot commit transaction in state {State}");

        // StorageEngine handles WAL writes and buffer management
        _storage.CommitTransaction(TransactionId);

        State = TransactionState.Committed;

        // Publish CDC events after successful commit
        if (_pendingChanges.Count > 0 && _storage.Cdc != null)
            foreach (var change in _pendingChanges)
                _storage.Cdc.Publish(change);
    }

    /// <summary>
    /// Asynchronously commits the transaction.
@@ -128,38 +109,19 @@ public sealed class Transaction : ITransaction
    /// <param name="ct">A cancellation token.</param>
    public async Task CommitAsync(CancellationToken ct = default)
    {
        if (State != TransactionState.Preparing && State != TransactionState.Active)
            throw new InvalidOperationException($"Cannot commit transaction in state {State}");

        // StorageEngine handles WAL writes and buffer management
        await _storage.CommitTransactionAsync(TransactionId, ct);

        State = TransactionState.Committed;

        // Publish CDC events after successful commit
        if (_pendingChanges.Count > 0 && _storage.Cdc != null)
            foreach (var change in _pendingChanges)
                _storage.Cdc.Publish(change);
    }

    /// <summary>
    /// Rolls back the transaction (discards all writes)
@@ -171,12 +133,12 @@ public sealed class Transaction : ITransaction
    /// </summary>
    public void Rollback()
    {
        if (State == TransactionState.Committed)
            throw new InvalidOperationException("Cannot rollback committed transaction");

        _pendingChanges.Clear();
        _storage.RollbackTransaction(TransactionId);
        State = TransactionState.Aborted;

        OnRollback?.Invoke();
    }
@@ -189,15 +151,37 @@ public sealed class Transaction : ITransaction
        if (_disposed)
            return;

        if (State == TransactionState.Active || State == TransactionState.Preparing)
            // Auto-rollback if not committed
            Rollback();

        _disposed = true;
        GC.SuppressFinalize(this);
    }

    /// <summary>
    /// Adds a pending CDC change to be published after commit.
    /// </summary>
    /// <param name="change">The change event to buffer.</param>
    internal void AddChange(InternalChangeEvent change)
    {
        _pendingChanges.Add(change);
    }

    /// <summary>
    /// Marks the transaction as committed without writing to PageFile.
    /// Used by TransactionManager with lazy checkpointing.
    /// </summary>
    internal void MarkCommitted()
    {
        if (State != TransactionState.Preparing && State != TransactionState.Active)
            throw new InvalidOperationException($"Cannot commit transaction in state {State}");

        // StorageEngine marks transaction as committed and moves to committed buffer
        _storage.MarkTransactionCommitted(TransactionId);

        State = TransactionState.Committed;
    }
}

/// <summary>
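The CDC handling in `Commit`/`Rollback` above follows a simple buffer-then-publish pattern: change events accumulate in `_pendingChanges` while the transaction runs, are published only after the storage engine reports a durable commit, and are discarded on rollback. The toy sketch below shows just that ordering with plain strings standing in for change events (all names are illustrative).

```csharp
using System;
using System.Collections.Generic;

var published = new List<string>();                         // stand-in for the CDC sink
var pending = new List<string> { "insert doc1", "update doc2" };

bool committed = true;           // flip to false to simulate Rollback()
if (committed)
    published.AddRange(pending); // Commit(): publish only after the durable commit
else
    pending.Clear();             // Rollback(): buffered events are dropped

Console.WriteLine(published.Count); // 2
```

Publishing strictly after commit means downstream CDC consumers never observe events from transactions that later abort.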
@@ -1,3 +1,5 @@
using System.Buffers;

namespace ZB.MOM.WW.CBDD.Core.Transactions;

/// <summary>
@@ -18,10 +20,10 @@ public enum WalRecordType : byte
/// </summary>
public sealed class WriteAheadLog : IDisposable
{
    private readonly SemaphoreSlim _lock = new(1, 1);
    private readonly string _walPath;
    private bool _disposed;
    private FileStream? _walStream;

    /// <summary>
    /// Initializes a new instance of the <see cref="WriteAheadLog" /> class.
@@ -36,11 +38,34 @@ public sealed class WriteAheadLog : IDisposable
            FileMode.OpenOrCreate,
            FileAccess.ReadWrite,
            FileShare.None, // Exclusive access like PageFile
            64 * 1024); // 64KB buffer for better sequential write performance
        // REMOVED FileOptions.WriteThrough for SQLite-style lazy checkpointing
        // Durability is ensured by explicit Flush() calls
    }

    /// <summary>
    /// Releases resources used by the write-ahead log.
    /// </summary>
    public void Dispose()
    {
        if (_disposed)
            return;

        _lock.Wait();
        try
        {
            _walStream?.Dispose();
            _disposed = true;
        }
        finally
        {
            _lock.Release();
            _lock.Dispose();
        }

        GC.SuppressFinalize(this);
    }

    /// <summary>
    /// Writes a begin transaction record
    /// </summary>
@@ -70,7 +95,7 @@ public sealed class WriteAheadLog : IDisposable
        try
        {
            // Use ArrayPool for async I/O compatibility (cannot use stackalloc with async)
            byte[] buffer = ArrayPool<byte>.Shared.Rent(17);
            try
            {
                buffer[0] = (byte)WalRecordType.Begin;
@@ -81,7 +106,7 @@ public sealed class WriteAheadLog : IDisposable
            }
            finally
            {
                ArrayPool<byte>.Shared.Return(buffer);
            }
        }
        finally
@@ -132,7 +157,7 @@ public sealed class WriteAheadLog : IDisposable
        await _lock.WaitAsync(ct);
        try
        {
            byte[] buffer = ArrayPool<byte>.Shared.Rent(17);
            try
            {
                buffer[0] = (byte)WalRecordType.Commit;
@@ -143,7 +168,7 @@ public sealed class WriteAheadLog : IDisposable
            }
            finally
            {
                ArrayPool<byte>.Shared.Return(buffer);
            }
        }
        finally
@@ -194,7 +219,7 @@ public sealed class WriteAheadLog : IDisposable
        await _lock.WaitAsync(ct);
        try
        {
            byte[] buffer = ArrayPool<byte>.Shared.Rent(17);
            try
            {
                buffer[0] = (byte)WalRecordType.Abort;
@@ -205,7 +230,7 @@ public sealed class WriteAheadLog : IDisposable
            }
            finally
            {
                ArrayPool<byte>.Shared.Return(buffer);
            }
        }
        finally
@@ -251,7 +276,7 @@ public sealed class WriteAheadLog : IDisposable
        await _lock.WaitAsync(ct);
        try
        {
            byte[] buffer = ArrayPool<byte>.Shared.Rent(17);
            try
            {
                buffer[0] = (byte)WalRecordType.Checkpoint;
@@ -261,7 +286,7 @@ public sealed class WriteAheadLog : IDisposable
            }
            finally
            {
                ArrayPool<byte>.Shared.Return(buffer);
            }
        }
        finally
@@ -310,15 +335,16 @@ public sealed class WriteAheadLog : IDisposable
    /// <param name="afterImage">The page contents after modification.</param>
    /// <param name="ct">The cancellation token.</param>
    /// <returns>A task that represents the asynchronous write operation.</returns>
    public async ValueTask WriteDataRecordAsync(ulong transactionId, uint pageId, ReadOnlyMemory<byte> afterImage,
        CancellationToken ct = default)
    {
        await _lock.WaitAsync(ct);
        try
        {
            var headerSize = 17;
            int totalSize = headerSize + afterImage.Length;

            byte[] buffer = ArrayPool<byte>.Shared.Rent(totalSize);
            try
            {
                buffer[0] = (byte)WalRecordType.Write;
@@ -332,7 +358,7 @@ public sealed class WriteAheadLog : IDisposable
            }
            finally
            {
                ArrayPool<byte>.Shared.Return(buffer);
            }
        }
        finally
@@ -345,9 +371,9 @@ public sealed class WriteAheadLog : IDisposable
    {
        // Header: type(1) + txnId(8) + pageId(4) + afterSize(4) = 17 bytes
        var headerSize = 17;
        int totalSize = headerSize + afterImage.Length;

        byte[] buffer = ArrayPool<byte>.Shared.Rent(totalSize);
        try
        {
            buffer[0] = (byte)WalRecordType.Write;
@@ -361,7 +387,7 @@ public sealed class WriteAheadLog : IDisposable
        }
        finally
        {
            ArrayPool<byte>.Shared.Return(buffer);
        }
    }

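The 17-byte write-record header noted in the comment above (`type(1) + txnId(8) + pageId(4) + afterSize(4)`) can be round-tripped in a few lines. The sketch uses `BitConverter`, matching the reader shown later in this file; the type tag and field values below are illustrative placeholders, not the real `WalRecordType` byte values.

```csharp
using System;

// Encode a 17-byte WAL write-record header:
// type(1) + txnId(8) + pageId(4) + afterSize(4).
byte[] header = new byte[17];
header[0] = 5;                                      // placeholder record-type tag
BitConverter.GetBytes(42UL).CopyTo(header, 1);      // transaction id
BitConverter.GetBytes(7U).CopyTo(header, 9);        // page id
BitConverter.GetBytes(4096).CopyTo(header, 13);     // after-image size in bytes

// Decode the same way ReadAllRecords slices the 16 bytes after the type byte.
ulong txnId = BitConverter.ToUInt64(header, 1);
uint pageId = BitConverter.ToUInt32(header, 9);
int afterSize = BitConverter.ToInt32(header, 13);
Console.WriteLine($"{txnId} {pageId} {afterSize}"); // 42 7 4096
```

Note that `BitConverter` uses the platform's native byte order, so this format is only stable across machines of the same endianness; `BinaryPrimitives` (as used in `VectorPage`) would pin it to little-endian explicitly.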
@@ -377,7 +403,7 @@ public sealed class WriteAheadLog : IDisposable
        _lock.Wait();
        try
        {
            _walStream?.Flush(true);
        }
        finally
        {
@@ -395,9 +421,7 @@ public sealed class WriteAheadLog : IDisposable
        await _lock.WaitAsync(ct);
        try
        {
            if (_walStream != null) await _walStream.FlushAsync(ct);
            // FlushAsync doesn't guarantee flushToDisk on all platforms/implementations in the same way as Flush(true)
            // but FileStream in .NET 6+ handles this reasonably well.
            // For strict durability, we might still want to invoke a sync flush or check platform specifics,
@@ -407,7 +431,6 @@ public sealed class WriteAheadLog : IDisposable
            // To be safe for WAL, we might care about fsync.
            // For now, just FlushAsync();
        }
        finally
        {
            _lock.Release();
@@ -445,7 +468,7 @@ public sealed class WriteAheadLog : IDisposable
            {
                _walStream.SetLength(0);
                _walStream.Position = 0;
                _walStream.Flush(true);
            }
        }
        finally
@@ -468,7 +491,7 @@ public sealed class WriteAheadLog : IDisposable
            FileMode.Create,
            FileAccess.ReadWrite,
            FileShare.None,
            64 * 1024);
        }
        finally
        {
@@ -498,17 +521,15 @@ public sealed class WriteAheadLog : IDisposable

        while (_walStream.Position < _walStream.Length)
        {
            int typeByte = _walStream.ReadByte();
            if (typeByte == -1) break;

            var type = (WalRecordType)typeByte;

            // Check for invalid record type (file padding or corruption)
            if (typeByte == 0 || !Enum.IsDefined(typeof(WalRecordType), type))
                // Reached end of valid records (file may have padding)
                break;

            WalRecord record;

@@ -519,14 +540,12 @@ public sealed class WriteAheadLog : IDisposable
                case WalRecordType.Abort:
                case WalRecordType.Checkpoint:
                    // Read common fields (txnId + timestamp = 16 bytes)
                    int bytesRead = _walStream.Read(headerBuf);
                    if (bytesRead < 16)
                        // Incomplete record, stop reading
                        return records;

                    var txnId = BitConverter.ToUInt64(headerBuf[..8]);
                    var timestamp = BitConverter.ToInt64(headerBuf[8..16]);

                    record = new WalRecord
@@ -542,30 +561,24 @@ public sealed class WriteAheadLog : IDisposable
                    // Read txnId + pageId + afterSize = 16 bytes
                    bytesRead = _walStream.Read(headerBuf);
                    if (bytesRead < 16)
                        // Incomplete write record header, stop reading
                        return records;

                    txnId = BitConverter.ToUInt64(headerBuf[..8]);
                    var pageId = BitConverter.ToUInt32(headerBuf[8..12]);
                    var afterSize = BitConverter.ToInt32(headerBuf[12..16]);

                    // Validate afterSize to prevent overflow or corruption
|
||||||
if (afterSize < 0 || afterSize > 100 * 1024 * 1024) // Max 100MB per record
|
if (afterSize < 0 || afterSize > 100 * 1024 * 1024) // Max 100MB per record
|
||||||
{
|
|
||||||
// Corrupted size, stop reading
|
// Corrupted size, stop reading
|
||||||
return records;
|
return records;
|
||||||
}
|
|
||||||
|
|
||||||
var afterImage = new byte[afterSize];
|
var afterImage = new byte[afterSize];
|
||||||
|
|
||||||
// Read afterImage
|
// Read afterImage
|
||||||
if (_walStream.Read(afterImage) < afterSize)
|
if (_walStream.Read(afterImage) < afterSize)
|
||||||
{
|
|
||||||
// Incomplete after image, stop reading
|
// Incomplete after image, stop reading
|
||||||
return records;
|
return records;
|
||||||
}
|
|
||||||
|
|
||||||
record = new WalRecord
|
record = new WalRecord
|
||||||
{
|
{
|
||||||
@@ -592,29 +605,6 @@ public sealed class WriteAheadLog : IDisposable
|
|||||||
_lock.Release();
|
_lock.Release();
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
|
||||||
/// Releases resources used by the write-ahead log.
|
|
||||||
/// </summary>
|
|
||||||
public void Dispose()
|
|
||||||
{
|
|
||||||
if (_disposed)
|
|
||||||
return;
|
|
||||||
|
|
||||||
_lock.Wait();
|
|
||||||
try
|
|
||||||
{
|
|
||||||
_walStream?.Dispose();
|
|
||||||
_disposed = true;
|
|
||||||
}
|
|
||||||
finally
|
|
||||||
{
|
|
||||||
_lock.Release();
|
|
||||||
_lock.Dispose();
|
|
||||||
}
|
|
||||||
|
|
||||||
GC.SuppressFinalize(this);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
|
|||||||
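The recovery loop in the hunks above decodes a fixed 16-byte header per WAL record with span-based `BitConverter` overloads and validates `afterSize` before allocating the after-image buffer. A minimal standalone sketch of that decoding, assuming the field layout visible in the diff (txnId at bytes 0..8, pageId at 8..12, afterSize at 12..16); the values written into the buffer here are illustrative only, not taken from the commit:

```csharp
using System;

// Sketch of the 16-byte WAL write-record header decoding shown in the diff.
// Field offsets mirror the diff; the demo values are invented.
class WalHeaderDemo
{
    static void Main()
    {
        Span<byte> headerBuf = stackalloc byte[16];
        BitConverter.TryWriteBytes(headerBuf[..8], 42UL);    // txnId
        BitConverter.TryWriteBytes(headerBuf[8..12], 7U);    // pageId
        BitConverter.TryWriteBytes(headerBuf[12..16], 4096); // afterSize

        var txnId = BitConverter.ToUInt64(headerBuf[..8]);
        var pageId = BitConverter.ToUInt32(headerBuf[8..12]);
        var afterSize = BitConverter.ToInt32(headerBuf[12..16]);

        // Guard against corrupt sizes before allocating, as the diff does.
        if (afterSize < 0 || afterSize > 100 * 1024 * 1024)
            throw new InvalidOperationException("Corrupted WAL record size");

        Console.WriteLine($"txn={txnId} page={pageId} afterSize={afterSize}");
    }
}
```

Note that `headerBuf[..8]` and `headerBuf[0..8]` are equivalent range expressions; the commit's change there is purely stylistic.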
```diff
@@ -1,12 +1,11 @@
-using System;
 using System.Collections.Generic;
 using System.Linq;
 using Microsoft.CodeAnalysis;
 using ZB.MOM.WW.CBDD.SourceGenerators.Helpers;
 using ZB.MOM.WW.CBDD.SourceGenerators.Models;

-namespace ZB.MOM.WW.CBDD.SourceGenerators
-{
+namespace ZB.MOM.WW.CBDD.SourceGenerators;
+
 public static class EntityAnalyzer
 {
     /// <summary>
@@ -28,14 +27,13 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
         var tableAttr = AttributeHelper.GetAttribute(entityType, "Table");
         if (tableAttr != null)
         {
-            var tableName = tableAttr.ConstructorArguments.Length > 0 ? tableAttr.ConstructorArguments[0].Value?.ToString() : null;
-            var schema = AttributeHelper.GetNamedArgumentValue(tableAttr, "Schema");
+            string? tableName = tableAttr.ConstructorArguments.Length > 0
+                ? tableAttr.ConstructorArguments[0].Value?.ToString()
+                : null;
+            string? schema = AttributeHelper.GetNamedArgumentValue(tableAttr, "Schema");

-            var collectionName = !string.IsNullOrEmpty(tableName) ? tableName! : entityInfo.Name;
-            if (!string.IsNullOrEmpty(schema))
-            {
-                collectionName = $"{schema}.{collectionName}";
-            }
+            string collectionName = !string.IsNullOrEmpty(tableName) ? tableName! : entityInfo.Name;
+            if (!string.IsNullOrEmpty(schema)) collectionName = $"{schema}.{collectionName}";
             entityInfo.CollectionName = collectionName;
         }

@@ -44,10 +42,11 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators

         // Check if entity needs reflection-based deserialization
         // Include properties with private setters or init-only setters (which can't be set outside initializers)
-        entityInfo.HasPrivateSetters = entityInfo.Properties.Any(p => (!p.HasPublicSetter && p.HasAnySetter) || p.HasInitOnlySetter);
+        entityInfo.HasPrivateSetters =
+            entityInfo.Properties.Any(p => (!p.HasPublicSetter && p.HasAnySetter) || p.HasInitOnlySetter);

         // Check if entity has public parameterless constructor
-        var hasPublicParameterlessConstructor = entityType.Constructors
+        bool hasPublicParameterlessConstructor = entityType.Constructors
             .Any(c => c.DeclaredAccessibility == Accessibility.Public && c.Parameters.Length == 0);
         entityInfo.HasPrivateOrNoConstructor = !hasPublicParameterlessConstructor;

@@ -63,20 +62,14 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
         {
             // Fallback to convention: property named "Id"
             var idProp = entityInfo.Properties.FirstOrDefault(p => p.Name == "Id");
-            if (idProp != null)
-            {
-                idProp.IsKey = true;
-            }
+            if (idProp != null) idProp.IsKey = true;
         }

         // Check for AutoId (int/long keys)
         if (entityInfo.IdProperty != null)
         {
-            var idType = entityInfo.IdProperty.TypeName.TrimEnd('?');
-            if (idType == "int" || idType == "Int32" || idType == "long" || idType == "Int64")
-            {
-                entityInfo.AutoId = true;
-            }
+            string idType = entityInfo.IdProperty.TypeName.TrimEnd('?');
+            if (idType == "int" || idType == "Int32" || idType == "long" || idType == "Int64") entityInfo.AutoId = true;
         }

         return entityInfo;
@@ -109,20 +102,22 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
                 continue;

             var columnAttr = AttributeHelper.GetAttribute(prop, "Column");
-            var bsonFieldName = AttributeHelper.GetAttributeStringValue(prop, "BsonProperty") ??
+            string? bsonFieldName = AttributeHelper.GetAttributeStringValue(prop, "BsonProperty") ??
                 AttributeHelper.GetAttributeStringValue(prop, "JsonPropertyName");

             if (bsonFieldName == null && columnAttr != null)
-            {
-                bsonFieldName = columnAttr.ConstructorArguments.Length > 0 ? columnAttr.ConstructorArguments[0].Value?.ToString() : null;
-            }
+                bsonFieldName = columnAttr.ConstructorArguments.Length > 0
+                    ? columnAttr.ConstructorArguments[0].Value?.ToString()
+                    : null;

             var propInfo = new PropertyInfo
             {
                 Name = prop.Name,
                 TypeName = SyntaxHelper.GetTypeName(prop.Type),
                 BsonFieldName = bsonFieldName ?? prop.Name.ToLowerInvariant(),
-                ColumnTypeName = columnAttr != null ? AttributeHelper.GetNamedArgumentValue(columnAttr, "TypeName") : null,
+                ColumnTypeName = columnAttr != null
+                    ? AttributeHelper.GetNamedArgumentValue(columnAttr, "TypeName")
+                    : null,
                 IsNullable = SyntaxHelper.IsNullableType(prop.Type),
                 IsKey = AttributeHelper.IsKey(prop),
                 IsRequired = AttributeHelper.HasAttribute(prop, "Required"),
@@ -131,7 +126,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
                 HasInitOnlySetter = prop.SetMethod?.IsInitOnly == true,
                 HasAnySetter = prop.SetMethod != null,
                 IsReadOnlyGetter = isReadOnlyGetter,
-                BackingFieldName = (prop.SetMethod?.DeclaredAccessibility != Accessibility.Public)
+                BackingFieldName = prop.SetMethod?.DeclaredAccessibility != Accessibility.Public
                     ? $"<{prop.Name}>k__BackingField"
                     : null
             };
@@ -143,11 +138,12 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
             var stringLengthAttr = AttributeHelper.GetAttribute(prop, "StringLength");
             if (stringLengthAttr != null)
             {
-                if (stringLengthAttr.ConstructorArguments.Length > 0 && stringLengthAttr.ConstructorArguments[0].Value is int max)
+                if (stringLengthAttr.ConstructorArguments.Length > 0 &&
+                    stringLengthAttr.ConstructorArguments[0].Value is int max)
                     propInfo.MaxLength = max;

-                var minLenStr = AttributeHelper.GetNamedArgumentValue(stringLengthAttr, "MinimumLength");
-                if (int.TryParse(minLenStr, out var min))
+                string? minLenStr = AttributeHelper.GetNamedArgumentValue(stringLengthAttr, "MinimumLength");
+                if (int.TryParse(minLenStr, out int min))
                     propInfo.MinLength = min;
             }

@@ -215,8 +211,8 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators

             foreach (var prop in nestedProps)
             {
-                var fullTypeName = prop.NestedTypeFullName!;
-                var simpleName = prop.NestedTypeName!;
+                string fullTypeName = prop.NestedTypeFullName!;
+                string simpleName = prop.NestedTypeName!;

                 // Avoid cycles
                 if (analyzedTypes.Contains(fullTypeName)) continue;
@@ -254,8 +250,8 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
                 targetNestedTypes[fullTypeName] = nestedInfo;

                 // Recurse
-                AnalyzeNestedTypesRecursive(nestedInfo.Properties, nestedInfo.NestedTypes, semanticModel, analyzedTypes, currentDepth + 1, maxDepth);
-            }
+                AnalyzeNestedTypesRecursive(nestedInfo.Properties, nestedInfo.NestedTypes, semanticModel, analyzedTypes,
+                    currentDepth + 1, maxDepth);
             }
         }
     }
```
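For context on the conventions these analyzer hunks resolve — a `Table` attribute name plus optional `Schema` for the collection name, a property named `Id` as the fallback key, `int`/`long` keys marked `AutoId`, and non-public setters flagged for generated Expression Tree setters — a hypothetical entity (not part of the commit; the class and its members are invented for illustration) would be analyzed like this:

```csharp
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

// Illustrative entity. Assuming the analyzer behavior shown in the diff:
// - [Table("orders", Schema = "sales")]  => CollectionName "sales.orders"
// - int property named "Id"             => key by convention, AutoId = true
// - [Required]/[StringLength]           => validation metadata on PropertyInfo
// - private setter on Total             => HasPrivateSetters = true
[Table("orders", Schema = "sales")]
public class Order
{
    public int Id { get; set; } // key by convention; int => AutoId

    [Required]
    [StringLength(64, MinimumLength = 3)]
    public string Customer { get; set; } = "";

    public decimal Total { get; private set; } // needs a generated setter
}
```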
```diff
@@ -1,12 +1,10 @@
-using System;
-using System.Collections.Generic;
+using System.Globalization;
 using System.Linq;
 using System.Text;
 using ZB.MOM.WW.CBDD.SourceGenerators.Models;
-using ZB.MOM.WW.CBDD.SourceGenerators.Helpers;

-namespace ZB.MOM.WW.CBDD.SourceGenerators
-{
+namespace ZB.MOM.WW.CBDD.SourceGenerators;
+
 public static class CodeGenerator
 {
     /// <summary>
@@ -18,26 +16,27 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
     public static string GenerateMapperClass(EntityInfo entity, string mapperNamespace)
     {
         var sb = new StringBuilder();
-        var mapperName = GetMapperName(entity.FullTypeName);
+        string mapperName = GetMapperName(entity.FullTypeName);
         var keyProp = entity.Properties.FirstOrDefault(p => p.IsKey);
-        var isRoot = entity.IdProperty != null;
+        bool isRoot = entity.IdProperty != null;

         sb.AppendLine("#pragma warning disable CS8604");

         // Class Declaration
         if (isRoot)
         {
-            var baseClass = GetBaseMapperClass(keyProp, entity);
+            string baseClass = GetBaseMapperClass(keyProp, entity);
             // Ensure FullTypeName has global:: prefix if not already present (assuming FullTypeName is fully qualified)
             var entityType = $"global::{entity.FullTypeName}";
-            sb.AppendLine($" public class {mapperName} : global::ZB.MOM.WW.CBDD.Core.Collections.{baseClass}{entityType}>");
+            sb.AppendLine(
+                $" public class {mapperName} : global::ZB.MOM.WW.CBDD.Core.Collections.{baseClass}{entityType}>");
         }
         else
         {
             sb.AppendLine($" public class {mapperName}");
         }

-        sb.AppendLine($" {{");
+        sb.AppendLine(" {");

         // Converter instance
         if (keyProp?.ConverterTypeName != null)
@@ -47,26 +46,34 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
         }

         // Generate static setters for private properties (Expression Trees)
-        var privateSetterProps = entity.Properties.Where(p => (!p.HasPublicSetter && p.HasAnySetter) || p.HasInitOnlySetter).ToList();
+        var privateSetterProps = entity.Properties
+            .Where(p => (!p.HasPublicSetter && p.HasAnySetter) || p.HasInitOnlySetter).ToList();
         if (privateSetterProps.Any())
         {
-            sb.AppendLine($" // Cached Expression Tree setters for private properties");
+            sb.AppendLine(" // Cached Expression Tree setters for private properties");
             foreach (var prop in privateSetterProps)
             {
                 var entityType = $"global::{entity.FullTypeName}";
-                var propType = QualifyType(prop.TypeName);
-                sb.AppendLine($" private static readonly global::System.Action<{entityType}, {propType}> _setter_{prop.Name} = CreateSetter<{entityType}, {propType}>(\"{prop.Name}\");");
+                string propType = QualifyType(prop.TypeName);
+                sb.AppendLine(
+                    $" private static readonly global::System.Action<{entityType}, {propType}> _setter_{prop.Name} = CreateSetter<{entityType}, {propType}>(\"{prop.Name}\");");
             }

             sb.AppendLine();

-            sb.AppendLine($" private static global::System.Action<TObj, TVal> CreateSetter<TObj, TVal>(string propertyName)");
-            sb.AppendLine($" {{");
-            sb.AppendLine($" var param = global::System.Linq.Expressions.Expression.Parameter(typeof(TObj), \"obj\");");
-            sb.AppendLine($" var value = global::System.Linq.Expressions.Expression.Parameter(typeof(TVal), \"val\");");
-            sb.AppendLine($" var prop = global::System.Linq.Expressions.Expression.Property(param, propertyName);");
-            sb.AppendLine($" var assign = global::System.Linq.Expressions.Expression.Assign(prop, value);");
-            sb.AppendLine($" return global::System.Linq.Expressions.Expression.Lambda<global::System.Action<TObj, TVal>>(assign, param, value).Compile();");
-            sb.AppendLine($" }}");
+            sb.AppendLine(
+                " private static global::System.Action<TObj, TVal> CreateSetter<TObj, TVal>(string propertyName)");
+            sb.AppendLine(" {");
+            sb.AppendLine(
+                " var param = global::System.Linq.Expressions.Expression.Parameter(typeof(TObj), \"obj\");");
+            sb.AppendLine(
+                " var value = global::System.Linq.Expressions.Expression.Parameter(typeof(TVal), \"val\");");
+            sb.AppendLine(
+                " var prop = global::System.Linq.Expressions.Expression.Property(param, propertyName);");
+            sb.AppendLine(" var assign = global::System.Linq.Expressions.Expression.Assign(prop, value);");
+            sb.AppendLine(
+                " return global::System.Linq.Expressions.Expression.Lambda<global::System.Action<TObj, TVal>>(assign, param, value).Compile();");
+            sb.AppendLine(" }");
             sb.AppendLine();
         }

@@ -78,7 +85,8 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
         }
         else if (entity.Properties.All(p => !p.IsKey))
         {
-            sb.AppendLine($"// #warning Entity '{entity.Name}' has no defined primary key. Mapper may not support all features.");
+            sb.AppendLine(
+                $"// #warning Entity '{entity.Name}' has no defined primary key. Mapper may not support all features.");
         }

         // Serialize Method
@@ -95,39 +103,41 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
             GenerateIdAccessors(sb, entity);
         }

-        sb.AppendLine($" }}");
+        sb.AppendLine(" }");
         sb.AppendLine("#pragma warning restore CS8604");

         return sb.ToString();
     }

-    private static void GenerateSerializeMethod(StringBuilder sb, EntityInfo entity, bool isRoot, string mapperNamespace)
+    private static void GenerateSerializeMethod(StringBuilder sb, EntityInfo entity, bool isRoot,
+        string mapperNamespace)
     {
         var entityType = $"global::{entity.FullTypeName}";

         // Always generate SerializeFields (writes only fields, no document wrapper)
         // This is needed even for root entities, as they may be used as nested objects
         // Note: BsonSpanWriter is a ref struct, so it must be passed by ref
-        sb.AppendLine($" public void SerializeFields({entityType} entity, ref global::ZB.MOM.WW.CBDD.Bson.BsonSpanWriter writer)");
-        sb.AppendLine($" {{");
+        sb.AppendLine(
+            $" public void SerializeFields({entityType} entity, ref global::ZB.MOM.WW.CBDD.Bson.BsonSpanWriter writer)");
+        sb.AppendLine(" {");
         GenerateFieldWritesCore(sb, entity, mapperNamespace);
-        sb.AppendLine($" }}");
+        sb.AppendLine(" }");
         sb.AppendLine();

         // Generate Serialize method (with document wrapper)
-        var methodSig = isRoot
+        string methodSig = isRoot
             ? $"public override int Serialize({entityType} entity, global::ZB.MOM.WW.CBDD.Bson.BsonSpanWriter writer)"
             : $"public int Serialize({entityType} entity, global::ZB.MOM.WW.CBDD.Bson.BsonSpanWriter writer)";

         sb.AppendLine($" {methodSig}");
-        sb.AppendLine($" {{");
-        sb.AppendLine($" var startingPos = writer.BeginDocument();");
+        sb.AppendLine(" {");
+        sb.AppendLine(" var startingPos = writer.BeginDocument();");
         sb.AppendLine();
         GenerateFieldWritesCore(sb, entity, mapperNamespace);
         sb.AppendLine();
-        sb.AppendLine($" writer.EndDocument(startingPos);");
-        sb.AppendLine($" return writer.Position;");
-        sb.AppendLine($" }}");
+        sb.AppendLine(" writer.EndDocument(startingPos);");
+        sb.AppendLine(" return writer.Position;");
+        sb.AppendLine(" }");
     }

     private static void GenerateFieldWritesCore(StringBuilder sb, EntityInfo entity, string mapperNamespace)
@@ -140,37 +150,41 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
             if (prop.ConverterTypeName != null)
             {
                 var providerProp = new PropertyInfo { TypeName = prop.ProviderTypeName ?? "string" };
-                var idWriteMethod = GetPrimitiveWriteMethod(providerProp, allowKey: true);
+                string? idWriteMethod = GetPrimitiveWriteMethod(providerProp, true);
                 if (idWriteMethod == "WriteString")
                 {
-                    sb.AppendLine($" var convertedId = _idConverter.ConvertToProvider(entity.{prop.Name});");
-                    sb.AppendLine($" if (convertedId != null)");
-                    sb.AppendLine($" {{");
-                    sb.AppendLine($" writer.WriteString(\"_id\", convertedId);");
-                    sb.AppendLine($" }}");
-                    sb.AppendLine($" else");
-                    sb.AppendLine($" {{");
-                    sb.AppendLine($" writer.WriteNull(\"_id\");");
-                    sb.AppendLine($" }}");
+                    sb.AppendLine(
+                        $" var convertedId = _idConverter.ConvertToProvider(entity.{prop.Name});");
+                    sb.AppendLine(" if (convertedId != null)");
+                    sb.AppendLine(" {");
+                    sb.AppendLine(" writer.WriteString(\"_id\", convertedId);");
+                    sb.AppendLine(" }");
+                    sb.AppendLine(" else");
+                    sb.AppendLine(" {");
+                    sb.AppendLine(" writer.WriteNull(\"_id\");");
+                    sb.AppendLine(" }");
                 }
                 else
                 {
-                    sb.AppendLine($" writer.{idWriteMethod}(\"_id\", _idConverter.ConvertToProvider(entity.{prop.Name}));");
+                    sb.AppendLine(
+                        $" writer.{idWriteMethod}(\"_id\", _idConverter.ConvertToProvider(entity.{prop.Name}));");
                 }
             }
             else
             {
-                var idWriteMethod = GetPrimitiveWriteMethod(prop, allowKey: true);
+                string? idWriteMethod = GetPrimitiveWriteMethod(prop, true);
                 if (idWriteMethod != null)
                 {
                     sb.AppendLine($" writer.{idWriteMethod}(\"_id\", entity.{prop.Name});");
                 }
                 else
                 {
-                    sb.AppendLine($"#warning Unsupported Id type for '{prop.Name}': {prop.TypeName}. Serialization of '_id' will fail.");
+                    sb.AppendLine(
+                        $"#warning Unsupported Id type for '{prop.Name}': {prop.TypeName}. Serialization of '_id' will fail.");
                     sb.AppendLine($" // Unsupported Id type: {prop.TypeName}");
                 }
             }

             continue;
         }

@@ -181,40 +195,37 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators

     private static void GenerateValidation(StringBuilder sb, PropertyInfo prop)
     {
-        var isString = prop.TypeName == "string" || prop.TypeName == "String";
+        bool isString = prop.TypeName == "string" || prop.TypeName == "String";

         if (prop.IsRequired)
         {
             if (isString)
-            {
-                sb.AppendLine($" if (string.IsNullOrEmpty(entity.{prop.Name})) throw new global::System.ComponentModel.DataAnnotations.ValidationException(\"Property {prop.Name} is required.\");");
-            }
+                sb.AppendLine(
+                    $" if (string.IsNullOrEmpty(entity.{prop.Name})) throw new global::System.ComponentModel.DataAnnotations.ValidationException(\"Property {prop.Name} is required.\");");
             else if (prop.IsNullable)
-            {
-                sb.AppendLine($" if (entity.{prop.Name} == null) throw new global::System.ComponentModel.DataAnnotations.ValidationException(\"Property {prop.Name} is required.\");");
-            }
+                sb.AppendLine(
+                    $" if (entity.{prop.Name} == null) throw new global::System.ComponentModel.DataAnnotations.ValidationException(\"Property {prop.Name} is required.\");");
         }

         if (prop.MaxLength.HasValue && isString)
-        {
-            sb.AppendLine($" if ((entity.{prop.Name}?.Length ?? 0) > {prop.MaxLength}) throw new global::System.ComponentModel.DataAnnotations.ValidationException(\"Property {prop.Name} exceeds max length {prop.MaxLength}.\");");
-        }
+            sb.AppendLine(
+                $" if ((entity.{prop.Name}?.Length ?? 0) > {prop.MaxLength}) throw new global::System.ComponentModel.DataAnnotations.ValidationException(\"Property {prop.Name} exceeds max length {prop.MaxLength}.\");");
         if (prop.MinLength.HasValue && isString)
-        {
-            sb.AppendLine($" if ((entity.{prop.Name}?.Length ?? 0) < {prop.MinLength}) throw new global::System.ComponentModel.DataAnnotations.ValidationException(\"Property {prop.Name} is below min length {prop.MinLength}.\");");
-        }
+            sb.AppendLine(
+                $" if ((entity.{prop.Name}?.Length ?? 0) < {prop.MinLength}) throw new global::System.ComponentModel.DataAnnotations.ValidationException(\"Property {prop.Name} is below min length {prop.MinLength}.\");");

         if (prop.RangeMin.HasValue || prop.RangeMax.HasValue)
         {
-            var minStr = prop.RangeMin?.ToString(System.Globalization.CultureInfo.InvariantCulture) ?? "double.MinValue";
-            var maxStr = prop.RangeMax?.ToString(System.Globalization.CultureInfo.InvariantCulture) ?? "double.MaxValue";
-            sb.AppendLine($" if ((double)entity.{prop.Name} < {minStr} || (double)entity.{prop.Name} > {maxStr}) throw new global::System.ComponentModel.DataAnnotations.ValidationException(\"Property {prop.Name} is outside range [{minStr}, {maxStr}].\");");
+            string minStr = prop.RangeMin?.ToString(CultureInfo.InvariantCulture) ?? "double.MinValue";
+            string maxStr = prop.RangeMax?.ToString(CultureInfo.InvariantCulture) ?? "double.MaxValue";
+            sb.AppendLine(
+                $" if ((double)entity.{prop.Name} < {minStr} || (double)entity.{prop.Name} > {maxStr}) throw new global::System.ComponentModel.DataAnnotations.ValidationException(\"Property {prop.Name} is outside range [{minStr}, {maxStr}].\");");
         }
     }

     private static void GenerateWriteProperty(StringBuilder sb, PropertyInfo prop, string mapperNamespace)
     {
-        var fieldName = prop.BsonFieldName;
+        string fieldName = prop.BsonFieldName;

         if (prop.IsCollection)
         {
@@ -222,11 +233,11 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
             if (prop.IsNullable)
             {
                 sb.AppendLine($" if (entity.{prop.Name} != null)");
-                sb.AppendLine($" {{");
+                sb.AppendLine(" {");
             }

             var arrayVar = $"{prop.Name.ToLower()}Array";
-            var indent = prop.IsNullable ? " " : "";
+            string indent = prop.IsNullable ? " " : "";
             sb.AppendLine($" {indent}var {arrayVar}Pos = writer.BeginArray(\"{fieldName}\");");
             sb.AppendLine($" {indent}var {prop.Name.ToLower()}Index = 0;");
             sb.AppendLine($" {indent}foreach (var item in entity.{prop.Name})");
@@ -236,23 +247,26 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
                 if (prop.IsCollectionItemNested)
                 {
                     sb.AppendLine($" {indent} // Nested Object in List");
-                    var nestedMapperTypes = GetMapperName(prop.NestedTypeFullName!);
-                    sb.AppendLine($" {indent} var {prop.Name.ToLower()}ItemMapper = new global::{mapperNamespace}.{nestedMapperTypes}();");
+                    string nestedMapperTypes = GetMapperName(prop.NestedTypeFullName!);
+                    sb.AppendLine(
+                        $" {indent} var {prop.Name.ToLower()}ItemMapper = new global::{mapperNamespace}.{nestedMapperTypes}();");

-                    sb.AppendLine($" {indent} var itemStartPos = writer.BeginDocument({prop.Name.ToLower()}Index.ToString());");
-                    sb.AppendLine($" {indent} {prop.Name.ToLower()}ItemMapper.SerializeFields(item, ref writer);");
+                    sb.AppendLine(
+                        $" {indent} var itemStartPos = writer.BeginDocument({prop.Name.ToLower()}Index.ToString());");
+                    sb.AppendLine(
+                        $" {indent} {prop.Name.ToLower()}ItemMapper.SerializeFields(item, ref writer);");
                     sb.AppendLine($" {indent} writer.EndDocument(itemStartPos);");
                 }
                 else
                 {
                     // Simplified: pass a dummy PropertyInfo with the item type for primitive collection items
                     var dummyProp = new PropertyInfo { TypeName = prop.CollectionItemType! };
-                    var writeMethod = GetPrimitiveWriteMethod(dummyProp);
+                    string? writeMethod = GetPrimitiveWriteMethod(dummyProp);
                     if (writeMethod != null)
-                    {
-                        sb.AppendLine($" {indent} writer.{writeMethod}({prop.Name.ToLower()}Index.ToString(), item);");
-                    }
+                        sb.AppendLine(
+                            $" {indent} writer.{writeMethod}({prop.Name.ToLower()}Index.ToString(), item);");
                 }

                 sb.AppendLine($" {indent} {prop.Name.ToLower()}Index++;");

                 sb.AppendLine($" {indent}}}");
@@ -261,49 +275,51 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
             // Close the null check if block
             if (prop.IsNullable)
             {
-                sb.AppendLine($" }}");
-                sb.AppendLine($" else");
-                sb.AppendLine($" {{");
+                sb.AppendLine(" }");
+                sb.AppendLine(" else");
+                sb.AppendLine(" {");
                 sb.AppendLine($" writer.WriteNull(\"{fieldName}\");");
-                sb.AppendLine($" }}");
+                sb.AppendLine(" }");
             }
         }
         else if (prop.IsNestedObject)
         {
             sb.AppendLine($" if (entity.{prop.Name} != null)");
-            sb.AppendLine($" {{");
+            sb.AppendLine(" {");
             sb.AppendLine($" var {prop.Name.ToLower()}Pos = writer.BeginDocument(\"{fieldName}\");");
-            var nestedMapperType = GetMapperName(prop.NestedTypeFullName!);
+            string nestedMapperType = GetMapperName(prop.NestedTypeFullName!);
             sb.AppendLine($" var {prop.Name.ToLower()}Mapper = new global::{mapperNamespace}.{nestedMapperType}();
```
|
sb.AppendLine(
|
||||||
sb.AppendLine($" {prop.Name.ToLower()}Mapper.SerializeFields(entity.{prop.Name}, ref writer);");
|
$" var {prop.Name.ToLower()}Mapper = new global::{mapperNamespace}.{nestedMapperType}();");
|
||||||
|
sb.AppendLine(
|
||||||
|
$" {prop.Name.ToLower()}Mapper.SerializeFields(entity.{prop.Name}, ref writer);");
|
||||||
sb.AppendLine($" writer.EndDocument({prop.Name.ToLower()}Pos);");
|
sb.AppendLine($" writer.EndDocument({prop.Name.ToLower()}Pos);");
|
||||||
sb.AppendLine($" }}");
|
sb.AppendLine(" }");
|
||||||
sb.AppendLine($" else");
|
sb.AppendLine(" else");
|
||||||
sb.AppendLine($" {{");
|
sb.AppendLine(" {");
|
||||||
sb.AppendLine($" writer.WriteNull(\"{fieldName}\");");
|
sb.AppendLine($" writer.WriteNull(\"{fieldName}\");");
|
||||||
sb.AppendLine($" }}");
|
sb.AppendLine(" }");
|
||||||
}
|
}
|
||||||
else
|
else
|
||||||
{
|
{
|
||||||
var writeMethod = GetPrimitiveWriteMethod(prop, allowKey: false);
|
string? writeMethod = GetPrimitiveWriteMethod(prop);
|
||||||
if (writeMethod != null)
|
if (writeMethod != null)
|
||||||
{
|
{
|
||||||
if (prop.IsNullable || prop.TypeName == "string" || prop.TypeName == "String")
|
if (prop.IsNullable || prop.TypeName == "string" || prop.TypeName == "String")
|
||||||
{
|
{
|
||||||
sb.AppendLine($" if (entity.{prop.Name} != null)");
|
sb.AppendLine($" if (entity.{prop.Name} != null)");
|
||||||
sb.AppendLine($" {{");
|
sb.AppendLine(" {");
|
||||||
// For nullable value types, use .Value to unwrap
|
// For nullable value types, use .Value to unwrap
|
||||||
// String is a reference type and doesn't need .Value
|
// String is a reference type and doesn't need .Value
|
||||||
var isValueTypeNullable = prop.IsNullable && IsValueType(prop.TypeName);
|
bool isValueTypeNullable = prop.IsNullable && IsValueType(prop.TypeName);
|
||||||
var valueAccess = isValueTypeNullable
|
string valueAccess = isValueTypeNullable
|
||||||
? $"entity.{prop.Name}.Value"
|
? $"entity.{prop.Name}.Value"
|
||||||
: $"entity.{prop.Name}";
|
: $"entity.{prop.Name}";
|
||||||
sb.AppendLine($" writer.{writeMethod}(\"{fieldName}\", {valueAccess});");
|
sb.AppendLine($" writer.{writeMethod}(\"{fieldName}\", {valueAccess});");
|
||||||
sb.AppendLine($" }}");
|
sb.AppendLine(" }");
|
||||||
sb.AppendLine($" else");
|
sb.AppendLine(" else");
|
||||||
sb.AppendLine($" {{");
|
sb.AppendLine(" {");
|
||||||
sb.AppendLine($" writer.WriteNull(\"{fieldName}\");");
|
sb.AppendLine($" writer.WriteNull(\"{fieldName}\");");
|
||||||
sb.AppendLine($" }}");
|
sb.AppendLine(" }");
|
||||||
}
|
}
|
||||||
else
|
else
|
||||||
{
|
{
|
||||||
@@ -312,16 +328,18 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
|
|||||||
}
|
}
|
||||||
else
|
else
|
||||||
{
|
{
|
||||||
sb.AppendLine($"#warning Property '{prop.Name}' of type '{prop.TypeName}' is not directly supported and has no converter. It will be skipped during serialization.");
|
sb.AppendLine(
|
||||||
|
$"#warning Property '{prop.Name}' of type '{prop.TypeName}' is not directly supported and has no converter. It will be skipped during serialization.");
|
||||||
sb.AppendLine($" // Unsupported type: {prop.TypeName} for {prop.Name}");
|
sb.AppendLine($" // Unsupported type: {prop.TypeName} for {prop.Name}");
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private static void GenerateDeserializeMethod(StringBuilder sb, EntityInfo entity, bool isRoot, string mapperNamespace)
|
private static void GenerateDeserializeMethod(StringBuilder sb, EntityInfo entity, bool isRoot,
|
||||||
|
string mapperNamespace)
|
||||||
{
|
{
|
||||||
var entityType = $"global::{entity.FullTypeName}";
|
var entityType = $"global::{entity.FullTypeName}";
|
||||||
var needsReflection = entity.HasPrivateSetters || entity.HasPrivateOrNoConstructor;
|
bool needsReflection = entity.HasPrivateSetters || entity.HasPrivateOrNoConstructor;
|
||||||
|
|
||||||
// Always generate a public Deserialize method that accepts ref (for nested/internal usage)
|
// Always generate a public Deserialize method that accepts ref (for nested/internal usage)
|
||||||
GenerateDeserializeCore(sb, entity, entityType, needsReflection, mapperNamespace);
|
GenerateDeserializeCore(sb, entity, entityType, needsReflection, mapperNamespace);
|
||||||
@@ -330,18 +348,21 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
|
|||||||
if (isRoot)
|
if (isRoot)
|
||||||
{
|
{
|
||||||
sb.AppendLine();
|
sb.AppendLine();
|
||||||
sb.AppendLine($" public override {entityType} Deserialize(global::ZB.MOM.WW.CBDD.Bson.BsonSpanReader reader)");
|
sb.AppendLine(
|
||||||
sb.AppendLine($" {{");
|
$" public override {entityType} Deserialize(global::ZB.MOM.WW.CBDD.Bson.BsonSpanReader reader)");
|
||||||
sb.AppendLine($" return Deserialize(ref reader);");
|
sb.AppendLine(" {");
|
||||||
sb.AppendLine($" }}");
|
sb.AppendLine(" return Deserialize(ref reader);");
|
||||||
|
sb.AppendLine(" }");
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private static void GenerateDeserializeCore(StringBuilder sb, EntityInfo entity, string entityType, bool needsReflection, string mapperNamespace)
|
private static void GenerateDeserializeCore(StringBuilder sb, EntityInfo entity, string entityType,
|
||||||
|
bool needsReflection, string mapperNamespace)
|
||||||
{
|
{
|
||||||
// Public method that always accepts ref for internal/nested usage
|
// Public method that always accepts ref for internal/nested usage
|
||||||
sb.AppendLine($" public {entityType} Deserialize(ref global::ZB.MOM.WW.CBDD.Bson.BsonSpanReader reader)");
|
sb.AppendLine(
|
||||||
sb.AppendLine($" {{");
|
$" public {entityType} Deserialize(ref global::ZB.MOM.WW.CBDD.Bson.BsonSpanReader reader)");
|
||||||
|
sb.AppendLine(" {");
|
||||||
// Use object initializer if possible or constructor, but for now standard new()
|
// Use object initializer if possible or constructor, but for now standard new()
|
||||||
// To support required properties, we might need a different approach or verify if source generators can detect required.
|
// To support required properties, we might need a different approach or verify if source generators can detect required.
|
||||||
// For now, let's assume standard creation and property setting.
|
// For now, let's assume standard creation and property setting.
|
||||||
@@ -351,14 +372,16 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
|
|||||||
// Declare temp variables for all properties
|
// Declare temp variables for all properties
|
||||||
foreach (var prop in entity.Properties)
|
foreach (var prop in entity.Properties)
|
||||||
{
|
{
|
||||||
var baseType = QualifyType(prop.TypeName.TrimEnd('?'));
|
string baseType = QualifyType(prop.TypeName.TrimEnd('?'));
|
||||||
|
|
||||||
// Handle collections init
|
// Handle collections init
|
||||||
if (prop.IsCollection)
|
if (prop.IsCollection)
|
||||||
{
|
{
|
||||||
var itemType = prop.CollectionItemType;
|
string? itemType = prop.CollectionItemType;
|
||||||
if (prop.IsCollectionItemNested) itemType = $"global::{prop.NestedTypeFullName}"; // Use full name with global::
|
if (prop.IsCollectionItemNested)
|
||||||
sb.AppendLine($" var {prop.Name.ToLower()} = new global::System.Collections.Generic.List<{itemType}>();");
|
itemType = $"global::{prop.NestedTypeFullName}"; // Use full name with global::
|
||||||
|
sb.AppendLine(
|
||||||
|
$" var {prop.Name.ToLower()} = new global::System.Collections.Generic.List<{itemType}>();");
|
||||||
}
|
}
|
||||||
else
|
else
|
||||||
{
|
{
|
||||||
@@ -368,49 +391,50 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
|
|||||||
|
|
||||||
|
|
||||||
// Read document size and track boundaries
|
// Read document size and track boundaries
|
||||||
sb.AppendLine($" var docSize = reader.ReadDocumentSize();");
|
sb.AppendLine(" var docSize = reader.ReadDocumentSize();");
|
||||||
sb.AppendLine($" var docEndPos = reader.Position + docSize - 4; // -4 because size includes itself");
|
sb.AppendLine(" var docEndPos = reader.Position + docSize - 4; // -4 because size includes itself");
|
||||||
sb.AppendLine();
|
sb.AppendLine();
|
||||||
sb.AppendLine($" while (reader.Position < docEndPos)");
|
sb.AppendLine(" while (reader.Position < docEndPos)");
|
||||||
sb.AppendLine($" {{");
|
sb.AppendLine(" {");
|
||||||
sb.AppendLine($" var bsonType = reader.ReadBsonType();");
|
sb.AppendLine(" var bsonType = reader.ReadBsonType();");
|
||||||
sb.AppendLine($" if (bsonType == global::ZB.MOM.WW.CBDD.Bson.BsonType.EndOfDocument) break;");
|
sb.AppendLine(" if (bsonType == global::ZB.MOM.WW.CBDD.Bson.BsonType.EndOfDocument) break;");
|
||||||
sb.AppendLine();
|
sb.AppendLine();
|
||||||
sb.AppendLine($" var elementName = reader.ReadElementHeader();");
|
sb.AppendLine(" var elementName = reader.ReadElementHeader();");
|
||||||
sb.AppendLine($" switch (elementName)");
|
sb.AppendLine(" switch (elementName)");
|
||||||
sb.AppendLine($" {{");
|
sb.AppendLine(" {");
|
||||||
|
|
||||||
foreach (var prop in entity.Properties)
|
foreach (var prop in entity.Properties)
|
||||||
{
|
{
|
||||||
var caseName = prop.IsKey ? "_id" : prop.BsonFieldName;
|
string caseName = prop.IsKey ? "_id" : prop.BsonFieldName;
|
||||||
sb.AppendLine($" case \"{caseName}\":");
|
sb.AppendLine($" case \"{caseName}\":");
|
||||||
|
|
||||||
// Read Logic -> assign to local var
|
// Read Logic -> assign to local var
|
||||||
GenerateReadPropertyToLocal(sb, prop, "bsonType", mapperNamespace);
|
GenerateReadPropertyToLocal(sb, prop, "bsonType", mapperNamespace);
|
||||||
|
|
||||||
sb.AppendLine($" break;");
|
sb.AppendLine(" break;");
|
||||||
}
|
}
|
||||||
|
|
||||||
sb.AppendLine($" default:");
|
sb.AppendLine(" default:");
|
||||||
sb.AppendLine($" reader.SkipValue(bsonType);");
|
sb.AppendLine(" reader.SkipValue(bsonType);");
|
||||||
sb.AppendLine($" break;");
|
sb.AppendLine(" break;");
|
||||||
sb.AppendLine($" }}");
|
sb.AppendLine(" }");
|
||||||
sb.AppendLine($" }}");
|
sb.AppendLine(" }");
|
||||||
sb.AppendLine();
|
sb.AppendLine();
|
||||||
|
|
||||||
// Construct object - different approach if needs reflection
|
// Construct object - different approach if needs reflection
|
||||||
if (needsReflection)
|
if (needsReflection)
|
||||||
{
|
{
|
||||||
// Use GetUninitializedObject + Expression Trees for private setters
|
// Use GetUninitializedObject + Expression Trees for private setters
|
||||||
sb.AppendLine($" // Creating instance without calling constructor (has private members)");
|
sb.AppendLine(" // Creating instance without calling constructor (has private members)");
|
||||||
sb.AppendLine($" var entity = (global::{entity.FullTypeName})global::System.Runtime.CompilerServices.RuntimeHelpers.GetUninitializedObject(typeof(global::{entity.FullTypeName}));");
|
sb.AppendLine(
|
||||||
|
$" var entity = (global::{entity.FullTypeName})global::System.Runtime.CompilerServices.RuntimeHelpers.GetUninitializedObject(typeof(global::{entity.FullTypeName}));");
|
||||||
sb.AppendLine();
|
sb.AppendLine();
|
||||||
|
|
||||||
// Set properties using setters (Expression Trees for private, direct for public)
|
// Set properties using setters (Expression Trees for private, direct for public)
|
||||||
foreach (var prop in entity.Properties)
|
foreach (var prop in entity.Properties)
|
||||||
{
|
{
|
||||||
var varName = prop.Name.ToLower();
|
string varName = prop.Name.ToLower();
|
||||||
var propValue = varName;
|
string propValue = varName;
|
||||||
|
|
||||||
if (prop.IsCollection)
|
if (prop.IsCollection)
|
||||||
{
|
{
|
||||||
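The generated deserializers bound each document scan with `docEndPos = reader.Position + docSize - 4`, relying on BSON's rule that the leading int32 length counts itself. That framing arithmetic can be sketched as follows (Python, illustrative only; `document_end` is a hypothetical helper, not part of CBDD):

```python
import struct

def document_end(buf: bytes, pos: int) -> int:
    """Return the offset just past the BSON document starting at pos.

    BSON prefixes every document with a little-endian int32 holding the
    total document size *including* the 4 size bytes themselves, so after
    reading the prefix the remaining body spans size - 4 bytes.
    """
    size = struct.unpack_from("<i", buf, pos)[0]
    return pos + size  # == (pos + 4) + (size - 4)

# Smallest valid document: int32 size (5) plus the trailing 0x00 terminator.
empty_doc = struct.pack("<i", 5) + b"\x00"
assert document_end(empty_doc, 0) == len(empty_doc)
```

The same `size - 4` adjustment appears again below for arrays, which BSON encodes as documents whose element names are the stringified indices.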
```diff
@@ -421,8 +445,10 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 }
 else if (prop.CollectionConcreteTypeName != null)
 {
-var concreteType = prop.CollectionConcreteTypeName;
+string? concreteType = prop.CollectionConcreteTypeName;
-var itemType = prop.IsCollectionItemNested ? $"global::{prop.NestedTypeFullName}" : prop.CollectionItemType;
+string? itemType = prop.IsCollectionItemNested
+    ? $"global::{prop.NestedTypeFullName}"
+    : prop.CollectionItemType;

 if (concreteType.Contains("HashSet"))
 propValue = $"new global::System.Collections.Generic.HashSet<{itemType}>({propValue})";
@@ -441,27 +467,24 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators

 // Use appropriate setter
 if ((!prop.HasPublicSetter && prop.HasAnySetter) || prop.HasInitOnlySetter)
-{
 // Use Expression Tree setter (for private or init-only setters)
 sb.AppendLine($"                _setter_{prop.Name}(entity, {propValue} ?? default!);");
-}
 else
-{
 // Direct property assignment
 sb.AppendLine($"                entity.{prop.Name} = {propValue} ?? default!;");
 }
-}
 sb.AppendLine();
-sb.AppendLine($"            return entity;");
+sb.AppendLine("            return entity;");
 }
 else
 {
 // Standard object initializer approach
 sb.AppendLine($"            return new {entityType}");
-sb.AppendLine($"            {{");
+sb.AppendLine("            {");
 foreach (var prop in entity.Properties)
 {
-var val = prop.Name.ToLower();
+string val = prop.Name.ToLower();
 if (prop.IsCollection)
 {
 // Convert to appropriate collection type
@@ -471,128 +494,128 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 }
 else if (prop.CollectionConcreteTypeName != null)
 {
-var concreteType = prop.CollectionConcreteTypeName;
+string? concreteType = prop.CollectionConcreteTypeName;
-var itemType = prop.IsCollectionItemNested ? $"global::{prop.NestedTypeFullName}" : prop.CollectionItemType;
+string? itemType = prop.IsCollectionItemNested
+    ? $"global::{prop.NestedTypeFullName}"
+    : prop.CollectionItemType;

 // Check if it needs conversion from List
 if (concreteType.Contains("HashSet"))
-{
 val = $"new global::System.Collections.Generic.HashSet<{itemType}>({val})";
-}
 else if (concreteType.Contains("ISet"))
-{
 val = $"new global::System.Collections.Generic.HashSet<{itemType}>({val})";
-}
 else if (concreteType.Contains("LinkedList"))
-{
 val = $"new global::System.Collections.Generic.LinkedList<{itemType}>({val})";
-}
 else if (concreteType.Contains("Queue"))
-{
 val = $"new global::System.Collections.Generic.Queue<{itemType}>({val})";
-}
 else if (concreteType.Contains("Stack"))
-{
 val = $"new global::System.Collections.Generic.Stack<{itemType}>({val})";
-}
 else if (concreteType.Contains("IReadOnlyList") || concreteType.Contains("IReadOnlyCollection"))
-{
 val += ".AsReadOnly()";
-}
 // Otherwise keep as List (works for List<T>, IList<T>, ICollection<T>, IEnumerable<T>)
 }
 }

 // For nullable properties, don't use ?? default! since null is a valid value
 if (prop.IsNullable)
-{
 sb.AppendLine($"                {prop.Name} = {val},");
-}
 else
-{
 sb.AppendLine($"                {prop.Name} = {val} ?? default!,");
 }
-}
-sb.AppendLine($"            }};");
+sb.AppendLine("            };");
 }
-sb.AppendLine($"        }}");
+sb.AppendLine("        }");
 }

-private static void GenerateReadPropertyToLocal(StringBuilder sb, PropertyInfo prop, string bsonTypeVar, string mapperNamespace)
+private static void GenerateReadPropertyToLocal(StringBuilder sb, PropertyInfo prop, string bsonTypeVar,
+    string mapperNamespace)
 {
-var localVar = prop.Name.ToLower();
+string localVar = prop.Name.ToLower();

 if (prop.IsCollection)
 {
-var arrVar = prop.Name.ToLower();
+string arrVar = prop.Name.ToLower();
 sb.AppendLine($"            // Read Array {prop.Name}");
 sb.AppendLine($"            var {arrVar}ArrSize = reader.ReadDocumentSize();");
 sb.AppendLine($"            var {arrVar}ArrEndPos = reader.Position + {arrVar}ArrSize - 4;");
 sb.AppendLine($"            while (reader.Position < {arrVar}ArrEndPos)");
-sb.AppendLine($"            {{");
+sb.AppendLine("            {");
-sb.AppendLine($"                var itemType = reader.ReadBsonType();");
+sb.AppendLine("                var itemType = reader.ReadBsonType();");
-sb.AppendLine($"                if (itemType == global::ZB.MOM.WW.CBDD.Bson.BsonType.EndOfDocument) break;");
-sb.AppendLine($"                reader.ReadElementHeader(); // Skip index key");
+sb.AppendLine(
+    "                if (itemType == global::ZB.MOM.WW.CBDD.Bson.BsonType.EndOfDocument) break;");
+sb.AppendLine("                reader.ReadElementHeader(); // Skip index key");

 if (prop.IsCollectionItemNested)
 {
-var nestedMapperTypes = GetMapperName(prop.NestedTypeFullName!);
+string nestedMapperTypes = GetMapperName(prop.NestedTypeFullName!);
-sb.AppendLine($"                var {prop.Name.ToLower()}ItemMapper = new global::{mapperNamespace}.{nestedMapperTypes}();");
-sb.AppendLine($"                var item = {prop.Name.ToLower()}ItemMapper.Deserialize(ref reader);");
+sb.AppendLine(
+    $"                var {prop.Name.ToLower()}ItemMapper = new global::{mapperNamespace}.{nestedMapperTypes}();");
+sb.AppendLine(
+    $"                var item = {prop.Name.ToLower()}ItemMapper.Deserialize(ref reader);");
 sb.AppendLine($"                {localVar}.Add(item);");
 }
 else
 {
-var readMethod = GetPrimitiveReadMethod(new PropertyInfo { TypeName = prop.CollectionItemType! });
+string? readMethod = GetPrimitiveReadMethod(new PropertyInfo { TypeName = prop.CollectionItemType! });
 if (readMethod != null)
 {
-var cast = (prop.CollectionItemType == "float" || prop.CollectionItemType == "Single") ? "(float)" : "";
+string cast = prop.CollectionItemType == "float" || prop.CollectionItemType == "Single"
+    ? "(float)"
+    : "";
 sb.AppendLine($"                var item = {cast}reader.{readMethod}();");
 sb.AppendLine($"                {localVar}.Add(item);");
 }
 else
 {
-sb.AppendLine($"                reader.SkipValue(itemType);");
+sb.AppendLine("                reader.SkipValue(itemType);");
 }
 }
-sb.AppendLine($"            }}");
+sb.AppendLine("            }");
 }
 else if (prop.IsKey && prop.ConverterTypeName != null)
 {
 var providerProp = new PropertyInfo { TypeName = prop.ProviderTypeName ?? "string" };
-var readMethod = GetPrimitiveReadMethod(providerProp);
+string? readMethod = GetPrimitiveReadMethod(providerProp);
-sb.AppendLine($"            {localVar} = _idConverter.ConvertFromProvider(reader.{readMethod}());");
+sb.AppendLine(
+    $"            {localVar} = _idConverter.ConvertFromProvider(reader.{readMethod}());");
 }
 else if (prop.IsNestedObject)
 {
 sb.AppendLine($"            if ({bsonTypeVar} == global::ZB.MOM.WW.CBDD.Bson.BsonType.Null)");
-sb.AppendLine($"            {{");
+sb.AppendLine("            {");
 sb.AppendLine($"                {localVar} = null;");
-sb.AppendLine($"            }}");
+sb.AppendLine("            }");
-sb.AppendLine($"            else");
+sb.AppendLine("            else");
-sb.AppendLine($"            {{");
+sb.AppendLine("            {");
-var nestedMapperType = GetMapperName(prop.NestedTypeFullName!);
+string nestedMapperType = GetMapperName(prop.NestedTypeFullName!);
-sb.AppendLine($"                var {prop.Name.ToLower()}Mapper = new global::{mapperNamespace}.{nestedMapperType}();");
-sb.AppendLine($"                {localVar} = {prop.Name.ToLower()}Mapper.Deserialize(ref reader);");
+sb.AppendLine(
+    $"                var {prop.Name.ToLower()}Mapper = new global::{mapperNamespace}.{nestedMapperType}();");
+sb.AppendLine(
+    $"                {localVar} = {prop.Name.ToLower()}Mapper.Deserialize(ref reader);");
-sb.AppendLine($"            }}");
+sb.AppendLine("            }");
 }
 else
 {
-var readMethod = GetPrimitiveReadMethod(prop);
+string? readMethod = GetPrimitiveReadMethod(prop);
 if (readMethod != null)
 {
-var cast = (prop.TypeName == "float" || prop.TypeName == "Single") ? "(float)" : "";
+string cast = prop.TypeName == "float" || prop.TypeName == "Single" ? "(float)" : "";

 // Handle nullable types - check for null in BSON stream
 if (prop.IsNullable)
 {
-sb.AppendLine($"            if ({bsonTypeVar} == global::ZB.MOM.WW.CBDD.Bson.BsonType.Null)");
-sb.AppendLine($"            {{");
+sb.AppendLine(
+    $"            if ({bsonTypeVar} == global::ZB.MOM.WW.CBDD.Bson.BsonType.Null)");
+sb.AppendLine("            {");
 sb.AppendLine($"                {localVar} = null;");
-sb.AppendLine($"            }}");
+sb.AppendLine("            }");
-sb.AppendLine($"            else");
+sb.AppendLine("            else");
-sb.AppendLine($"            {{");
+sb.AppendLine("            {");
 sb.AppendLine($"                {localVar} = {cast}reader.{readMethod}();");
-sb.AppendLine($"            }}");
+sb.AppendLine("            }");
 }
 else
 {
@@ -615,7 +638,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 {
 if (string.IsNullOrEmpty(fullTypeName)) return "UnknownMapper";
 // Remove global:: prefix
-var cleanName = fullTypeName.Replace("global::", "");
+string cleanName = fullTypeName.Replace("global::", "");
 // Replace dots, plus (nested classes), and colons (global::) with underscores
 return cleanName.Replace(".", "_").Replace("+", "_").Replace(":", "_") + "Mapper";
 }
```
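The `GetMapperName` helper shown above flattens a fully qualified CLR type name into an identifier-safe mapper class name by dropping `global::` and replacing `.` (namespaces), `+` (nested classes), and `:` with underscores. The same transformation, sketched in Python as an illustrative port rather than the shipped code:

```python
def get_mapper_name(full_type_name: str) -> str:
    """Mirror of the generator's GetMapperName: turn a fully qualified
    CLR type name into a single identifier-safe mapper class name."""
    if not full_type_name:
        return "UnknownMapper"
    clean = full_type_name.replace("global::", "")
    # Dots (namespaces), plus signs (nested classes), and any stray
    # colons all become underscores so the result is a legal identifier.
    for ch in ".+:":
        clean = clean.replace(ch, "_")
    return clean + "Mapper"

assert get_mapper_name("global::Acme.Orders+LineItem") == "Acme_Orders_LineItemMapper"
```

Because the mapping is purely textual, two distinct types that differ only in `.` versus `+` placement could collide; the generator accepts that trade-off for deterministic, reflection-free naming.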
@@ -627,14 +650,10 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
|
|||||||
// Use CollectionIdTypeFullName if available (from DocumentCollection<TId, T> declaration)
|
// Use CollectionIdTypeFullName if available (from DocumentCollection<TId, T> declaration)
|
||||||
string keyType;
|
string keyType;
|
||||||
if (!string.IsNullOrEmpty(entity.CollectionIdTypeFullName))
|
if (!string.IsNullOrEmpty(entity.CollectionIdTypeFullName))
|
||||||
{
|
|
||||||
// Remove "global::" prefix if present
|
// Remove "global::" prefix if present
|
||||||
keyType = entity.CollectionIdTypeFullName!.Replace("global::", "");
|
keyType = entity.CollectionIdTypeFullName!.Replace("global::", "");
|
||||||
}
|
|
||||||
else
|
else
|
||||||
{
|
|
||||||
keyType = keyProp?.TypeName ?? "ObjectId";
|
keyType = keyProp?.TypeName ?? "ObjectId";
|
||||||
}
|
|
||||||
|
|
||||||
// Normalize keyType - remove nullable suffix for the methods
|
// Normalize keyType - remove nullable suffix for the methods
|
||||||
// We expect Id to have a value during serialization/deserialization
|
// We expect Id to have a value during serialization/deserialization
|
||||||
@@ -655,34 +674,31 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
|
|||||||
}
|
}
|
||||||
|
|
||||||
var entityType = $"global::{entity.FullTypeName}";
|
var entityType = $"global::{entity.FullTypeName}";
|
||||||
var qualifiedKeyType = keyType.StartsWith("global::") ? keyType : (keyProp?.ConverterTypeName != null ? $"global::{keyProp.TypeName.TrimEnd('?')}" : keyType);
|
string qualifiedKeyType = keyType.StartsWith("global::") ? keyType :
|
||||||
|
keyProp?.ConverterTypeName != null ? $"global::{keyProp.TypeName.TrimEnd('?')}" : keyType;
|
||||||
|
|
||||||
var propName = keyProp?.Name ?? "Id";
|
string propName = keyProp?.Name ?? "Id";
|
||||||
|
|
||||||
// GetId can return nullable if the property is nullable, but we add ! to assert non-null
|
// GetId can return nullable if the property is nullable, but we add ! to assert non-null
|
||||||
// This helps catch bugs where entities are created without an Id
|
// This helps catch bugs where entities are created without an Id
|
||||||
if (keyProp?.IsNullable == true)
|
if (keyProp?.IsNullable == true)
|
||||||
{
|
sb.AppendLine(
|
||||||
sb.AppendLine($" public override {qualifiedKeyType} GetId({entityType} entity) => entity.{propName}!;");
|
$" public override {qualifiedKeyType} GetId({entityType} entity) => entity.{propName}!;");
|
||||||
}
|
|
||||||
else
|
else
|
||||||
{
|
sb.AppendLine(
|
||||||
sb.AppendLine($" public override {qualifiedKeyType} GetId({entityType} entity) => entity.{propName};");
|
$" public override {qualifiedKeyType} GetId({entityType} entity) => entity.{propName};");
|
||||||
}
|
|
||||||
|
|
||||||
// If the ID property has a private or init-only setter, use the compiled setter
|
// If the ID property has a private or init-only setter, use the compiled setter
|
||||||
if (entity.HasPrivateSetters && keyProp != null && (!keyProp.HasPublicSetter || keyProp.HasInitOnlySetter))
|
if (entity.HasPrivateSetters && keyProp != null && (!keyProp.HasPublicSetter || keyProp.HasInitOnlySetter))
|
||||||
{
|
sb.AppendLine(
|
||||||
sb.AppendLine($" public override void SetId({entityType} entity, {qualifiedKeyType} id) => _setter_{propName}(entity, id);");
|
$" public override void SetId({entityType} entity, {qualifiedKeyType} id) => _setter_{propName}(entity, id);");
|
||||||
}
|
|
||||||
else
|
else
|
||||||
{
|
sb.AppendLine(
|
||||||
sb.AppendLine($" public override void SetId({entityType} entity, {qualifiedKeyType} id) => entity.{propName} = id;");
|
$" public override void SetId({entityType} entity, {qualifiedKeyType} id) => entity.{propName} = id;");
|
||||||
}
|
|
||||||
|
|
||||||
if (keyProp?.ConverterTypeName != null)
|
if (keyProp?.ConverterTypeName != null)
|
||||||
{
|
{
|
||||||
var providerType = keyProp.ProviderTypeName ?? "string";
|
string providerType = keyProp.ProviderTypeName ?? "string";
|
||||||
// Normalize providerType
|
// Normalize providerType
|
||||||
switch (providerType)
|
switch (providerType)
|
||||||
{
|
{
|
||||||
@@ -694,36 +710,32 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 }

 sb.AppendLine();
-sb.AppendLine($" public override global::ZB.MOM.WW.CBDD.Core.Indexing.IndexKey ToIndexKey({qualifiedKeyType} id) => ");
-sb.AppendLine($" global::ZB.MOM.WW.CBDD.Core.Indexing.IndexKey.Create(_idConverter.ConvertToProvider(id));");
+sb.AppendLine(
+    $" public override global::ZB.MOM.WW.CBDD.Core.Indexing.IndexKey ToIndexKey({qualifiedKeyType} id) => ");
+sb.AppendLine(
+    " global::ZB.MOM.WW.CBDD.Core.Indexing.IndexKey.Create(_idConverter.ConvertToProvider(id));");
 sb.AppendLine();
-sb.AppendLine($" public override {qualifiedKeyType} FromIndexKey(global::ZB.MOM.WW.CBDD.Core.Indexing.IndexKey key) => ");
+sb.AppendLine(
+    $" public override {qualifiedKeyType} FromIndexKey(global::ZB.MOM.WW.CBDD.Core.Indexing.IndexKey key) => ");
 sb.AppendLine($" _idConverter.ConvertFromProvider(key.As<{providerType}>());");
 }
 }

 private static string GetBaseMapperClass(PropertyInfo? keyProp, EntityInfo entity)
 {
-if (keyProp?.ConverterTypeName != null)
-{
-return $"DocumentMapperBase<global::{keyProp.TypeName}, ";
-}
+if (keyProp?.ConverterTypeName != null) return $"DocumentMapperBase<global::{keyProp.TypeName}, ";

 // Use CollectionIdTypeFullName if available (from DocumentCollection<TId, T> declaration)
 string keyType;
 if (!string.IsNullOrEmpty(entity.CollectionIdTypeFullName))
-{
 // Remove "global::" prefix if present
 keyType = entity.CollectionIdTypeFullName!.Replace("global::", "");
-}
 else
-{
 keyType = keyProp?.TypeName ?? "ObjectId";
-}

 // Normalize type by removing nullable suffix (?) for comparison
 // At serialization time, we expect the Id to always have a value
-var normalizedKeyType = keyType.TrimEnd('?');
+string normalizedKeyType = keyType.TrimEnd('?');

 if (normalizedKeyType.EndsWith("Int32") || normalizedKeyType == "int") return "Int32MapperBase<";
 if (normalizedKeyType.EndsWith("Int64") || normalizedKeyType == "long") return "Int64MapperBase<";
@@ -736,18 +748,14 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators

 private static string? GetPrimitiveWriteMethod(PropertyInfo prop, bool allowKey = false)
 {
-var typeName = prop.TypeName;
+string typeName = prop.TypeName;
 if (prop.ColumnTypeName == "point" || prop.ColumnTypeName == "coordinate" || prop.ColumnTypeName == "geopoint")
-{
 return "WriteCoordinates";
-}

 if (typeName.Contains("double") && typeName.Contains(",") && typeName.StartsWith("(") && typeName.EndsWith(")"))
-{
 return "WriteCoordinates";
-}

-var cleanType = typeName.TrimEnd('?').Trim();
+string cleanType = typeName.TrimEnd('?').Trim();

 if (cleanType.EndsWith("Int32") || cleanType == "int") return "WriteInt32";
 if (cleanType.EndsWith("Int64") || cleanType == "long") return "WriteInt64";
@@ -769,18 +777,14 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators

 private static string? GetPrimitiveReadMethod(PropertyInfo prop)
 {
-var typeName = prop.TypeName;
+string typeName = prop.TypeName;
 if (prop.ColumnTypeName == "point" || prop.ColumnTypeName == "coordinate" || prop.ColumnTypeName == "geopoint")
-{
 return "ReadCoordinates";
-}

 if (typeName.Contains("double") && typeName.Contains(",") && typeName.StartsWith("(") && typeName.EndsWith(")"))
-{
 return "ReadCoordinates";
-}

-var cleanType = typeName.TrimEnd('?').Trim();
+string cleanType = typeName.TrimEnd('?').Trim();

 if (cleanType.EndsWith("Int32") || cleanType == "int") return "ReadInt32";
 if (cleanType.EndsWith("Int64") || cleanType == "long") return "ReadInt64";
@@ -804,7 +808,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 {
 // Check if the type is a value type (struct) that requires .Value unwrapping when nullable
 // String is a reference type and doesn't need .Value
-var cleanType = typeName.TrimEnd('?').Trim();
+string cleanType = typeName.TrimEnd('?').Trim();

 // Common value types
 if (cleanType.EndsWith("Int32") || cleanType == "int") return true;
@@ -830,8 +834,8 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 if (string.IsNullOrEmpty(typeName)) return "object";
 if (typeName.StartsWith("global::")) return typeName;

-var isNullable = typeName.EndsWith("?");
-var baseType = typeName.TrimEnd('?').Trim();
+bool isNullable = typeName.EndsWith("?");
+string baseType = typeName.TrimEnd('?').Trim();

 if (baseType.StartsWith("(") && baseType.EndsWith(")")) return typeName; // Tuple

@@ -869,7 +873,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators

 private static bool IsPrimitive(string typeName)
 {
-var cleanType = typeName.TrimEnd('?').Trim();
+string cleanType = typeName.TrimEnd('?').Trim();
 if (cleanType.StartsWith("(") && cleanType.EndsWith(")")) return true;

 switch (cleanType)
@@ -895,4 +899,3 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 }
 }
 }
-}

@@ -1,13 +1,15 @@
 using System.Collections.Generic;
+using System.IO;
 using System.Linq;
 using System.Text;
 using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CSharp;
 using Microsoft.CodeAnalysis.CSharp.Syntax;
 using ZB.MOM.WW.CBDD.SourceGenerators.Helpers;
 using ZB.MOM.WW.CBDD.SourceGenerators.Models;

-namespace ZB.MOM.WW.CBDD.SourceGenerators
-{
+namespace ZB.MOM.WW.CBDD.SourceGenerators;

 public class DbContextInfo
 {
 /// <summary>
@@ -43,17 +45,18 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 /// <summary>
 /// Gets or sets a value indicating whether the DbContext inherits from another DbContext.
 /// </summary>
-public bool HasBaseDbContext { get; set; } // True if inherits from another DbContext (not DocumentDbContext directly)
+public bool
+    HasBaseDbContext { get; set; } // True if inherits from another DbContext (not DocumentDbContext directly)

 /// <summary>
 /// Gets or sets the entities discovered for this DbContext.
 /// </summary>
-public List<EntityInfo> Entities { get; set; } = new List<EntityInfo>();
+public List<EntityInfo> Entities { get; set; } = new();

 /// <summary>
 /// Gets or sets the collected nested types keyed by full type name.
 /// </summary>
-public Dictionary<string, NestedTypeInfo> GlobalNestedTypes { get; set; } = new Dictionary<string, NestedTypeInfo>();
+public Dictionary<string, NestedTypeInfo> GlobalNestedTypes { get; set; } = new();
 }

 [Generator]
@@ -68,8 +71,8 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 // Find all classes that inherit from DocumentDbContext
 var dbContextClasses = context.SyntaxProvider
 .CreateSyntaxProvider(
-predicate: static (node, _) => IsPotentialDbContext(node),
-transform: static (ctx, _) => GetDbContextInfo(ctx))
+static (node, _) => IsPotentialDbContext(node),
+static (ctx, _) => GetDbContextInfo(ctx))
 .Where(static context => context is not null)
 .Collect()
 .SelectMany(static (contexts, _) => contexts.GroupBy(c => c!.FullClassName).Select(g => g.First())!);
@@ -81,14 +84,13 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators

 var sb = new StringBuilder();
 sb.AppendLine($"// Found DbContext: {dbContext.ClassName}");
-sb.AppendLine($"// BaseType: {(dbContext.HasBaseDbContext ? "inherits from another DbContext" : "inherits from DocumentDbContext directly")}");
+sb.AppendLine(
+    $"// BaseType: {(dbContext.HasBaseDbContext ? "inherits from another DbContext" : "inherits from DocumentDbContext directly")}");

 foreach (var entity in dbContext.Entities)
-{
 // Aggregate nested types recursively
 CollectNestedTypes(entity.NestedTypes, dbContext.GlobalNestedTypes);
-}

 // Collect namespaces
 var namespaces = new HashSet<string>
@@ -101,169 +103,149 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators

 // Add Entity namespaces
 foreach (var entity in dbContext.Entities)
-{
 if (!string.IsNullOrEmpty(entity.Namespace))
 namespaces.Add(entity.Namespace);
-}
 foreach (var nested in dbContext.GlobalNestedTypes.Values)
-{
 if (!string.IsNullOrEmpty(nested.Namespace))
 namespaces.Add(nested.Namespace);
-}

 // Sanitize file path for name uniqueness
-var safeName = dbContext.ClassName;
+string safeName = dbContext.ClassName;
 if (!string.IsNullOrEmpty(dbContext.FilePath))
 {
-var fileName = System.IO.Path.GetFileNameWithoutExtension(dbContext.FilePath);
+string fileName = Path.GetFileNameWithoutExtension(dbContext.FilePath);
 safeName += $"_{fileName}";
 }

 sb.AppendLine("// <auto-generated/>");
 sb.AppendLine("#nullable enable");
-foreach (var ns in namespaces.OrderBy(n => n))
-{
-sb.AppendLine($"using {ns};");
-}
+foreach (string ns in namespaces.OrderBy(n => n)) sb.AppendLine($"using {ns};");
 sb.AppendLine();

 // Use safeName (Context + Filename) to avoid collisions
 var mapperNamespace = $"{dbContext.Namespace}.{safeName}_Mappers";
 sb.AppendLine($"namespace {mapperNamespace}");
-sb.AppendLine($"{{");
+sb.AppendLine("{");

 var generatedMappers = new HashSet<string>();

 // Generate Entity Mappers
 foreach (var entity in dbContext.Entities)
-{
 if (generatedMappers.Add(entity.FullTypeName))
-{
 sb.AppendLine(CodeGenerator.GenerateMapperClass(entity, mapperNamespace));
-}
-}

 // Generate Nested Mappers
 foreach (var nested in dbContext.GlobalNestedTypes.Values)
-{
 if (generatedMappers.Add(nested.FullTypeName))
 {
 var nestedEntity = new EntityInfo
 {
 Name = nested.Name,
 Namespace = nested.Namespace,
-FullTypeName = nested.FullTypeName, // Ensure FullTypeName is copied
+FullTypeName = nested.FullTypeName // Ensure FullTypeName is copied
 // Helper to copy properties
 };
 nestedEntity.Properties.AddRange(nested.Properties);

 sb.AppendLine(CodeGenerator.GenerateMapperClass(nestedEntity, mapperNamespace));
 }
-}

-sb.AppendLine($"}}");
+sb.AppendLine("}");
 sb.AppendLine();

 // Partial DbContext for InitializeCollections (Only for top-level partial classes)
 if (!dbContext.IsNested && dbContext.IsPartial)
 {
 sb.AppendLine($"namespace {dbContext.Namespace}");
-sb.AppendLine($"{{");
+sb.AppendLine("{");
 sb.AppendLine($" public partial class {dbContext.ClassName}");
-sb.AppendLine($" {{");
-sb.AppendLine($" protected override void InitializeCollections()");
-sb.AppendLine($" {{");
+sb.AppendLine(" {");
+sb.AppendLine(" protected override void InitializeCollections()");
+sb.AppendLine(" {");

 // Call base.InitializeCollections() if this context inherits from another DbContext
-if (dbContext.HasBaseDbContext)
-{
-sb.AppendLine($" base.InitializeCollections();");
-}
+if (dbContext.HasBaseDbContext) sb.AppendLine(" base.InitializeCollections();");

 foreach (var entity in dbContext.Entities)
-{
 if (!string.IsNullOrEmpty(entity.CollectionPropertyName))
 {
-var mapperName = $"global::{mapperNamespace}.{CodeGenerator.GetMapperName(entity.FullTypeName)}";
-sb.AppendLine($" this.{entity.CollectionPropertyName} = CreateCollection(new {mapperName}());");
+var mapperName =
+    $"global::{mapperNamespace}.{CodeGenerator.GetMapperName(entity.FullTypeName)}";
+sb.AppendLine(
+    $" this.{entity.CollectionPropertyName} = CreateCollection(new {mapperName}());");
 }
-}

-sb.AppendLine($" }}");
+sb.AppendLine(" }");
 sb.AppendLine();

 // Generate Set<TId, T>() override
 var collectionsWithProperties = dbContext.Entities
-.Where(e => !string.IsNullOrEmpty(e.CollectionPropertyName) && !string.IsNullOrEmpty(e.CollectionIdTypeFullName))
+.Where(e => !string.IsNullOrEmpty(e.CollectionPropertyName) &&
+            !string.IsNullOrEmpty(e.CollectionIdTypeFullName))
 .ToList();

 if (collectionsWithProperties.Any())
 {
-sb.AppendLine($" public override global::ZB.MOM.WW.CBDD.Core.Collections.DocumentCollection<TId, T> Set<TId, T>()");
-sb.AppendLine($" {{");
+sb.AppendLine(
+    " public override global::ZB.MOM.WW.CBDD.Core.Collections.DocumentCollection<TId, T> Set<TId, T>()");
+sb.AppendLine(" {");

 foreach (var entity in collectionsWithProperties)
 {
 var entityTypeStr = $"global::{entity.FullTypeName}";
-var idTypeStr = entity.CollectionIdTypeFullName;
+string? idTypeStr = entity.CollectionIdTypeFullName;
-sb.AppendLine($" if (typeof(TId) == typeof({idTypeStr}) && typeof(T) == typeof({entityTypeStr}))");
-sb.AppendLine($" return (global::ZB.MOM.WW.CBDD.Core.Collections.DocumentCollection<TId, T>)(object)this.{entity.CollectionPropertyName};");
+sb.AppendLine(
+    $" if (typeof(TId) == typeof({idTypeStr}) && typeof(T) == typeof({entityTypeStr}))");
+sb.AppendLine(
+    $" return (global::ZB.MOM.WW.CBDD.Core.Collections.DocumentCollection<TId, T>)(object)this.{entity.CollectionPropertyName};");
 }

 if (dbContext.HasBaseDbContext)
-{
-sb.AppendLine($" return base.Set<TId, T>();");
-}
+sb.AppendLine(" return base.Set<TId, T>();");
 else
-{
-sb.AppendLine($" throw new global::System.InvalidOperationException($\"No collection registered for entity type '{{typeof(T).Name}}' with key type '{{typeof(TId).Name}}'.\");");
-}
+sb.AppendLine(
+    " throw new global::System.InvalidOperationException($\"No collection registered for entity type '{typeof(T).Name}' with key type '{typeof(TId).Name}'.\");");

+sb.AppendLine(" }");
 }

-sb.AppendLine($" }}");
-}
+sb.AppendLine(" }");
+sb.AppendLine("}");

-sb.AppendLine($" }}");
-sb.AppendLine($"}}");
 }

 spc.AddSource($"{dbContext.Namespace}.{safeName}.Mappers.g.cs", sb.ToString());
 });
 }

-private static void CollectNestedTypes(Dictionary<string, NestedTypeInfo> source, Dictionary<string, NestedTypeInfo> target)
+private static void CollectNestedTypes(Dictionary<string, NestedTypeInfo> source,
+    Dictionary<string, NestedTypeInfo> target)
 {
 foreach (var kvp in source)
-{
 if (!target.ContainsKey(kvp.Value.FullTypeName))
 {
 target[kvp.Value.FullTypeName] = kvp.Value;
 CollectNestedTypes(kvp.Value.NestedTypes, target);
 }
 }
-}

-private static void PrintNestedTypes(StringBuilder sb, Dictionary<string, NestedTypeInfo> nestedTypes, string indent)
+private static void PrintNestedTypes(StringBuilder sb, Dictionary<string, NestedTypeInfo> nestedTypes,
+    string indent)
 {
 foreach (var nt in nestedTypes.Values)
 {
 sb.AppendLine($"//{indent}- {nt.Name} (Depth: {nt.Depth})");
 if (nt.Properties.Count > 0)
-{
 // Print properties for nested type to be sure
 foreach (var p in nt.Properties)
 {
 var flags = new List<string>();
 if (p.IsCollection) flags.Add($"Collection<{p.CollectionItemType}>");
 if (p.IsNestedObject) flags.Add($"Nested<{p.NestedTypeName}>");
-var flagStr = flags.Any() ? $" [{string.Join(", ", flags)}]" : "";
+string flagStr = flags.Any() ? $" [{string.Join(", ", flags)}]" : "";
 sb.AppendLine($"//{indent} - {p.Name}: {p.TypeName}{flagStr}");
 }
-}

-if (nt.NestedTypes.Any())
-{
-PrintNestedTypes(sb, nt.NestedTypes, indent + " ");
-}
+if (nt.NestedTypes.Any()) PrintNestedTypes(sb, nt.NestedTypes, indent + " ");
 }
 }

@@ -281,7 +263,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 var classDecl = (ClassDeclarationSyntax)context.Node;
 var semanticModel = context.SemanticModel;

-var classSymbol = semanticModel.GetDeclaredSymbol(classDecl) as INamedTypeSymbol;
+var classSymbol = ModelExtensions.GetDeclaredSymbol(semanticModel, classDecl) as INamedTypeSymbol;
 if (classSymbol == null) return null;

 if (!SyntaxHelper.InheritsFrom(classSymbol, "DocumentDbContext"))
@@ -299,7 +281,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 Namespace = classSymbol.ContainingNamespace.ToDisplayString(),
 FilePath = classDecl.SyntaxTree.FilePath,
 IsNested = classSymbol.ContainingType != null,
-IsPartial = classDecl.Modifiers.Any(m => m.IsKind(Microsoft.CodeAnalysis.CSharp.SyntaxKind.PartialKeyword)),
+IsPartial = classDecl.Modifiers.Any(m => m.IsKind(SyntaxKind.PartialKeyword)),
 HasBaseDbContext = hasBaseDbContext
 };

@@ -313,7 +295,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 var entityCalls = SyntaxHelper.FindMethodInvocations(onModelCreating, "Entity");
 foreach (var call in entityCalls)
 {
-var typeName = SyntaxHelper.GetGenericTypeArgument(call);
+string? typeName = SyntaxHelper.GetGenericTypeArgument(call);
 if (typeName != null)
 {
 // Try to find the symbol
@@ -324,15 +306,12 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 entityType = symbols.OfType<INamedTypeSymbol>().FirstOrDefault();

 // 2. Try by metadata name (if fully qualified)
-if (entityType == null)
-{
-entityType = semanticModel.Compilation.GetTypeByMetadataName(typeName);
-}
+if (entityType == null) entityType = semanticModel.Compilation.GetTypeByMetadataName(typeName);

 if (entityType != null)
 {
 // Check for duplicates
-var fullTypeName = SyntaxHelper.GetFullName(entityType);
+string fullTypeName = SyntaxHelper.GetFullName(entityType);
 if (!info.Entities.Any(e => e.FullTypeName == fullTypeName))
 {
 var entityInfo = EntityAnalyzer.Analyze(entityType, semanticModel);
@@ -349,35 +328,53 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 var conversionCalls = SyntaxHelper.FindMethodInvocations(onModelCreating, "HasConversion");
 foreach (var call in conversionCalls)
 {
-var converterName = SyntaxHelper.GetGenericTypeArgument(call);
+string? converterName = SyntaxHelper.GetGenericTypeArgument(call);
 if (converterName == null) continue;

 // Trace back: .Property(x => x.Id).HasConversion<T>() or .HasKey(x => x.Id).HasConversion<T>()
-if (call.Expression is MemberAccessExpressionSyntax { Expression: InvocationExpressionSyntax propertyCall } &&
-propertyCall.Expression is MemberAccessExpressionSyntax { Name: IdentifierNameSyntax { Identifier: { Text: var propertyMethod } } } &&
+if (call.Expression is MemberAccessExpressionSyntax
+    {
+        Expression: InvocationExpressionSyntax propertyCall
+    } &&
+    propertyCall.Expression is MemberAccessExpressionSyntax
+    {
+        Name: IdentifierNameSyntax { Identifier: { Text: var propertyMethod } }
+    } &&
 (propertyMethod == "Property" || propertyMethod == "HasKey"))
 {
-var propertyName = SyntaxHelper.GetPropertyName(propertyCall.ArgumentList.Arguments.FirstOrDefault()?.Expression);
+string? propertyName =
+    SyntaxHelper.GetPropertyName(propertyCall.ArgumentList.Arguments.FirstOrDefault()?.Expression);
 if (propertyName == null) continue;

 // Trace further back: Entity<T>().Property(...)
-if (propertyCall.Expression is MemberAccessExpressionSyntax { Expression: InvocationExpressionSyntax entityCall } &&
-entityCall.Expression is MemberAccessExpressionSyntax { Name: GenericNameSyntax { Identifier: { Text: "Entity" } } })
-{
-var entityTypeName = SyntaxHelper.GetGenericTypeArgument(entityCall);
+if (propertyCall.Expression is MemberAccessExpressionSyntax
+    {
+        Expression: InvocationExpressionSyntax entityCall
+    } &&
+    entityCall.Expression is MemberAccessExpressionSyntax
+    {
+        Name: GenericNameSyntax { Identifier: { Text: "Entity" } }
+    })
+{
+string? entityTypeName = SyntaxHelper.GetGenericTypeArgument(entityCall);
 if (entityTypeName != null)
 {
-var entity = info.Entities.FirstOrDefault(e => e.Name == entityTypeName || e.FullTypeName.EndsWith("." + entityTypeName));
+var entity = info.Entities.FirstOrDefault(e =>
+    e.Name == entityTypeName || e.FullTypeName.EndsWith("." + entityTypeName));
 if (entity != null)
 {
 var prop = entity.Properties.FirstOrDefault(p => p.Name == propertyName);
 if (prop != null)
 {
 // Resolve TProvider from ValueConverter<TModel, TProvider>
-var converterType = semanticModel.Compilation.GetTypeByMetadataName(converterName) ??
-semanticModel.Compilation.GetSymbolsWithName(converterName).OfType<INamedTypeSymbol>().FirstOrDefault();
+var converterType =
+    semanticModel.Compilation.GetTypeByMetadataName(converterName) ??
+    semanticModel.Compilation.GetSymbolsWithName(converterName)
+        .OfType<INamedTypeSymbol>().FirstOrDefault();

-prop.ConverterTypeName = converterType != null ? SyntaxHelper.GetFullName(converterType) : converterName;
+prop.ConverterTypeName = converterType != null
+    ? SyntaxHelper.GetFullName(converterType)
+    : converterName;

 if (converterType != null && converterType.BaseType != null &&
 converterType.BaseType.Name == "ValueConverter" &&
@@ -391,11 +388,13 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 var converterBaseType = converterType.BaseType;
 while (converterBaseType != null)
 {
-if (converterBaseType.Name == "ValueConverter" && converterBaseType.TypeArguments.Length == 2)
+if (converterBaseType.Name == "ValueConverter" &&
+    converterBaseType.TypeArguments.Length == 2)
 {
 prop.ProviderTypeName = converterBaseType.TypeArguments[1].Name;
 break;
 }

 converterBaseType = converterBaseType.BaseType;
 }
 }
@@ -410,10 +409,8 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
 // Analyze properties to find DocumentCollection<TId, TEntity>
 var properties = classSymbol.GetMembers().OfType<IPropertySymbol>();
 foreach (var prop in properties)
-{
 if (prop.Type is INamedTypeSymbol namedType &&
 namedType.OriginalDefinition.Name == "DocumentCollection")
-{
 // Expecting 2 type arguments: TId, TEntity
 if (namedType.TypeArguments.Length == 2)
 {
@@ -424,13 +421,11 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators
|
|||||||
if (entityInfo != null)
|
if (entityInfo != null)
|
||||||
{
|
{
|
||||||
entityInfo.CollectionPropertyName = prop.Name;
|
entityInfo.CollectionPropertyName = prop.Name;
|
||||||
entityInfo.CollectionIdTypeFullName = namedType.TypeArguments[0].ToDisplayString(SymbolDisplayFormat.FullyQualifiedFormat);
|
entityInfo.CollectionIdTypeFullName = namedType.TypeArguments[0]
|
||||||
}
|
.ToDisplayString(SymbolDisplayFormat.FullyQualifiedFormat);
|
||||||
}
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
return info;
|
return info;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
|
||||||
|
|||||||
@@ -1,9 +1,8 @@
-using System;
 using System.Linq;
 using Microsoft.CodeAnalysis;
 
-namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
-{
+namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers;
+
 public static class AttributeHelper
 {
     /// <summary>
@@ -38,10 +37,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
     public static string? GetAttributeStringValue(ISymbol symbol, string attributeName)
     {
         var attr = GetAttribute(symbol, attributeName);
-        if (attr != null && attr.ConstructorArguments.Length > 0)
-        {
-            return attr.ConstructorArguments[0].Value?.ToString();
-        }
+        if (attr != null && attr.ConstructorArguments.Length > 0) return attr.ConstructorArguments[0].Value?.ToString();
 
         return null;
     }
@@ -56,9 +52,8 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
     {
         var attr = GetAttribute(symbol, attributeName);
         if (attr != null && attr.ConstructorArguments.Length > 0)
-        {
-            if (attr.ConstructorArguments[0].Value is int val) return val;
-        }
+            if (attr.ConstructorArguments[0].Value is int val)
+                return val;
 
         return null;
     }
@@ -118,4 +113,3 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
         return GetAttribute(symbol, attributeName) != null;
     }
 }
-}
@@ -1,11 +1,10 @@
 using System.Collections.Generic;
 using System.Linq;
 using Microsoft.CodeAnalysis;
-using Microsoft.CodeAnalysis.CSharp;
 using Microsoft.CodeAnalysis.CSharp.Syntax;
 
-namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
-{
+namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers;
+
 public static class SyntaxHelper
 {
     /// <summary>
@@ -23,6 +22,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
                 return true;
             current = current.BaseType;
         }
 
         return false;
     }
 
@@ -39,9 +39,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
             .Where(invocation =>
             {
                 if (invocation.Expression is MemberAccessExpressionSyntax memberAccess)
-                {
                     return memberAccess.Name.Identifier.Text == methodName;
-                }
                 return false;
             })
             .ToList();
@@ -57,9 +55,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
         if (invocation.Expression is MemberAccessExpressionSyntax memberAccess &&
             memberAccess.Name is GenericNameSyntax genericName &&
             genericName.TypeArgumentList.Arguments.Count > 0)
-        {
             return genericName.TypeArgumentList.Arguments[0].ToString();
-        }
         return null;
     }
 
@@ -71,22 +67,13 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
     public static string? GetPropertyName(ExpressionSyntax? expression)
     {
         if (expression == null) return null;
-        if (expression is LambdaExpressionSyntax lambda)
-        {
-            return GetPropertyName(lambda.Body as ExpressionSyntax);
-        }
-        if (expression is MemberAccessExpressionSyntax memberAccess)
-        {
-            return memberAccess.Name.Identifier.Text;
-        }
-        if (expression is PrefixUnaryExpressionSyntax prefixUnary && prefixUnary.Operand is MemberAccessExpressionSyntax prefixMember)
-        {
-            return prefixMember.Name.Identifier.Text;
-        }
-        if (expression is PostfixUnaryExpressionSyntax postfixUnary && postfixUnary.Operand is MemberAccessExpressionSyntax postfixMember)
-        {
+        if (expression is LambdaExpressionSyntax lambda) return GetPropertyName(lambda.Body as ExpressionSyntax);
+        if (expression is MemberAccessExpressionSyntax memberAccess) return memberAccess.Name.Identifier.Text;
+        if (expression is PrefixUnaryExpressionSyntax prefixUnary &&
+            prefixUnary.Operand is MemberAccessExpressionSyntax prefixMember) return prefixMember.Name.Identifier.Text;
+        if (expression is PostfixUnaryExpressionSyntax postfixUnary &&
+            postfixUnary.Operand is MemberAccessExpressionSyntax postfixMember)
             return postfixMember.Name.Identifier.Text;
-        }
         return null;
     }
 
@@ -115,15 +102,9 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
             return GetTypeName(underlyingType) + "?";
         }
 
-        if (type is IArrayTypeSymbol arrayType)
-        {
-            return GetTypeName(arrayType.ElementType) + "[]";
-        }
+        if (type is IArrayTypeSymbol arrayType) return GetTypeName(arrayType.ElementType) + "[]";
 
-        if (type is INamedTypeSymbol nt && nt.IsTupleType)
-        {
-            return type.ToDisplayString();
-        }
+        if (type is INamedTypeSymbol nt && nt.IsTupleType) return type.ToDisplayString();
 
         return type.ToDisplayString();
     }
@@ -137,9 +118,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
     {
         if (type is INamedTypeSymbol namedType &&
             namedType.OriginalDefinition.SpecialType == SpecialType.System_Nullable_T)
-        {
             return true;
-        }
         return type.NullableAnnotation == NullableAnnotation.Annotated;
     }
 
@@ -167,7 +146,7 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
         // Check if the type itself is IEnumerable<T>
         if (type is INamedTypeSymbol namedType && namedType.IsGenericType)
         {
-            var typeDefName = namedType.OriginalDefinition.ToDisplayString();
+            string typeDefName = namedType.OriginalDefinition.ToDisplayString();
             if (typeDefName == "System.Collections.Generic.IEnumerable<T>" && namedType.TypeArguments.Length == 1)
             {
                 itemType = namedType.TypeArguments[0];
@@ -198,14 +177,12 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
     {
         if (type is INamedTypeSymbol namedType &&
             namedType.OriginalDefinition.SpecialType == SpecialType.System_Nullable_T)
-        {
             type = namedType.TypeArguments[0];
-        }
 
         if (type.SpecialType != SpecialType.None && type.SpecialType != SpecialType.System_Object)
             return true;
 
-        var typeName = type.Name;
+        string typeName = type.Name;
         if (typeName == "Guid" || typeName == "DateTime" || typeName == "DateTimeOffset" ||
             typeName == "TimeSpan" || typeName == "DateOnly" || typeName == "TimeOnly" ||
             typeName == "Decimal" || typeName == "ObjectId")
@@ -250,4 +227,3 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Helpers
             .Any(f => f.AssociatedSymbol?.Equals(property, SymbolEqualityComparer.Default) == true);
     }
 }
-}
@@ -1,7 +1,7 @@
 using System.Collections.Generic;
 
-namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
-{
+namespace ZB.MOM.WW.CBDD.SourceGenerators.Models;
+
 public class DbContextInfo
 {
     /// <summary>
@@ -22,11 +22,10 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// <summary>
     /// Gets the entity types discovered for the DbContext.
     /// </summary>
-    public List<EntityInfo> Entities { get; } = new List<EntityInfo>();
+    public List<EntityInfo> Entities { get; } = new();
 
     /// <summary>
     /// Gets global nested types keyed by type name.
     /// </summary>
-    public Dictionary<string, NestedTypeInfo> GlobalNestedTypes { get; } = new Dictionary<string, NestedTypeInfo>();
+    public Dictionary<string, NestedTypeInfo> GlobalNestedTypes { get; } = new();
 }
-}
@@ -1,8 +1,8 @@
 using System.Collections.Generic;
 using System.Linq;
 
-namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
-{
+namespace ZB.MOM.WW.CBDD.SourceGenerators.Models;
+
 /// <summary>
 /// Contains metadata describing an entity discovered by source generation.
 /// </summary>
@@ -12,22 +12,27 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// Gets or sets the entity name.
     /// </summary>
     public string Name { get; set; } = "";
 
     /// <summary>
     /// Gets or sets the entity namespace.
     /// </summary>
     public string Namespace { get; set; } = "";
 
     /// <summary>
     /// Gets or sets the fully qualified entity type name.
     /// </summary>
     public string FullTypeName { get; set; } = "";
 
     /// <summary>
     /// Gets or sets the collection name for the entity.
     /// </summary>
     public string CollectionName { get; set; } = "";
 
     /// <summary>
     /// Gets or sets the collection property name.
     /// </summary>
     public string? CollectionPropertyName { get; set; }
 
     /// <summary>
     /// Gets or sets the fully qualified collection identifier type name.
     /// </summary>
@@ -37,14 +42,17 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// Gets the key property for the entity if one exists.
     /// </summary>
     public PropertyInfo? IdProperty => Properties.FirstOrDefault(p => p.IsKey);
 
     /// <summary>
     /// Gets or sets a value indicating whether IDs are automatically generated.
     /// </summary>
     public bool AutoId { get; set; }
 
     /// <summary>
     /// Gets or sets a value indicating whether the entity uses private setters.
     /// </summary>
     public bool HasPrivateSetters { get; set; }
 
     /// <summary>
     /// Gets or sets a value indicating whether the entity has a private or missing constructor.
     /// </summary>
@@ -53,15 +61,17 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// <summary>
     /// Gets the entity properties.
     /// </summary>
-    public List<PropertyInfo> Properties { get; } = new List<PropertyInfo>();
+    public List<PropertyInfo> Properties { get; } = new();
 
     /// <summary>
     /// Gets nested type metadata keyed by type name.
     /// </summary>
-    public Dictionary<string, NestedTypeInfo> NestedTypes { get; } = new Dictionary<string, NestedTypeInfo>();
+    public Dictionary<string, NestedTypeInfo> NestedTypes { get; } = new();
 
     /// <summary>
     /// Gets property names that should be ignored by mapping.
     /// </summary>
-    public HashSet<string> IgnoredProperties { get; } = new HashSet<string>();
+    public HashSet<string> IgnoredProperties { get; } = new();
 }
 
 /// <summary>
@@ -73,18 +83,22 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// Gets or sets the property name.
     /// </summary>
     public string Name { get; set; } = "";
 
     /// <summary>
     /// Gets or sets the property type name.
     /// </summary>
     public string TypeName { get; set; } = "";
 
     /// <summary>
     /// Gets or sets the BSON field name.
     /// </summary>
     public string BsonFieldName { get; set; } = "";
 
     /// <summary>
     /// Gets or sets the database column type name.
     /// </summary>
     public string? ColumnTypeName { get; set; }
 
     /// <summary>
     /// Gets or sets a value indicating whether the property is nullable.
     /// </summary>
@@ -94,18 +108,22 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// Gets or sets a value indicating whether the property has a public setter.
     /// </summary>
     public bool HasPublicSetter { get; set; }
 
     /// <summary>
     /// Gets or sets a value indicating whether the property uses an init-only setter.
     /// </summary>
     public bool HasInitOnlySetter { get; set; }
 
     /// <summary>
     /// Gets or sets a value indicating whether the property has any setter.
     /// </summary>
     public bool HasAnySetter { get; set; }
 
     /// <summary>
     /// Gets or sets a value indicating whether the getter is read-only.
     /// </summary>
     public bool IsReadOnlyGetter { get; set; }
 
     /// <summary>
     /// Gets or sets the backing field name if available.
     /// </summary>
@@ -115,22 +133,27 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// Gets or sets a value indicating whether the property is the key.
     /// </summary>
     public bool IsKey { get; set; }
 
     /// <summary>
     /// Gets or sets a value indicating whether the property is required.
     /// </summary>
     public bool IsRequired { get; set; }
 
     /// <summary>
     /// Gets or sets the maximum allowed length.
     /// </summary>
     public int? MaxLength { get; set; }
 
     /// <summary>
     /// Gets or sets the minimum allowed length.
     /// </summary>
     public int? MinLength { get; set; }
 
     /// <summary>
     /// Gets or sets the minimum allowed range value.
     /// </summary>
     public double? RangeMin { get; set; }
 
     /// <summary>
     /// Gets or sets the maximum allowed range value.
     /// </summary>
@@ -140,14 +163,17 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// Gets or sets a value indicating whether the property is a collection.
     /// </summary>
     public bool IsCollection { get; set; }
 
     /// <summary>
     /// Gets or sets a value indicating whether the property is an array.
     /// </summary>
     public bool IsArray { get; set; }
 
     /// <summary>
     /// Gets or sets the collection item type name.
     /// </summary>
     public string? CollectionItemType { get; set; }
 
     /// <summary>
     /// Gets or sets the concrete collection type name.
     /// </summary>
@@ -157,22 +183,27 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// Gets or sets a value indicating whether the property is a nested object.
     /// </summary>
     public bool IsNestedObject { get; set; }
 
     /// <summary>
     /// Gets or sets a value indicating whether collection items are nested objects.
     /// </summary>
     public bool IsCollectionItemNested { get; set; }
 
     /// <summary>
     /// Gets or sets the nested type name.
     /// </summary>
     public string? NestedTypeName { get; set; }
 
     /// <summary>
     /// Gets or sets the fully qualified nested type name.
     /// </summary>
     public string? NestedTypeFullName { get; set; }
 
     /// <summary>
     /// Gets or sets the converter type name.
     /// </summary>
     public string? ConverterTypeName { get; set; }
 
     /// <summary>
     /// Gets or sets the provider type name used by the converter.
     /// </summary>
@@ -188,14 +219,17 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// Gets or sets the nested type name.
     /// </summary>
     public string Name { get; set; } = "";
 
     /// <summary>
     /// Gets or sets the nested type namespace.
     /// </summary>
     public string Namespace { get; set; } = "";
 
     /// <summary>
     /// Gets or sets the fully qualified nested type name.
     /// </summary>
     public string FullTypeName { get; set; } = "";
 
     /// <summary>
     /// Gets or sets the depth of the nested type.
     /// </summary>
@@ -204,10 +238,10 @@ namespace ZB.MOM.WW.CBDD.SourceGenerators.Models
     /// <summary>
     /// Gets the nested type properties.
     /// </summary>
-    public List<PropertyInfo> Properties { get; } = new List<PropertyInfo>();
+    public List<PropertyInfo> Properties { get; } = new();
 
     /// <summary>
     /// Gets nested type metadata keyed by type name.
     /// </summary>
-    public Dictionary<string, NestedTypeInfo> NestedTypes { get; } = new Dictionary<string, NestedTypeInfo>();
+    public Dictionary<string, NestedTypeInfo> NestedTypes { get; } = new();
 }
-}
@@ -1,6 +1,6 @@
+using System.Text;
 using BenchmarkDotNet.Attributes;
 using BenchmarkDotNet.Configs;
-using BenchmarkDotNet.Jobs;
 using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Core.Collections;
 using ZB.MOM.WW.CBDD.Core.Storage;
@@ -15,19 +15,20 @@ namespace ZB.MOM.WW.CBDD.Tests.Benchmark;
 [JsonExporterAttribute.Full]
 public class CompactionBenchmarks
 {
+    private readonly List<ObjectId> _insertedIds = [];
+    private DocumentCollection<Person> _collection = null!;
+
+    private string _dbPath = string.Empty;
+    private StorageEngine _storage = null!;
+    private BenchmarkTransactionHolder _transactionHolder = null!;
+    private string _walPath = string.Empty;
+
     /// <summary>
     /// Gets or sets the number of documents used per benchmark iteration.
     /// </summary>
     [Params(2_000)]
     public int DocumentCount { get; set; }
 
-    private string _dbPath = string.Empty;
-    private string _walPath = string.Empty;
-    private StorageEngine _storage = null!;
-    private BenchmarkTransactionHolder _transactionHolder = null!;
-    private DocumentCollection<Person> _collection = null!;
-    private List<ObjectId> _insertedIds = [];
-
     /// <summary>
     /// Prepares benchmark state and seed data for each iteration.
     /// </summary>
@@ -53,10 +54,7 @@ public class CompactionBenchmarks
         _transactionHolder.CommitAndReset();
         _storage.Checkpoint();
 
-        for (var i = _insertedIds.Count - 1; i >= _insertedIds.Count / 3; i--)
-        {
-            _collection.Delete(_insertedIds[i]);
-        }
+        for (int i = _insertedIds.Count - 1; i >= _insertedIds.Count / 3; i--) _collection.Delete(_insertedIds[i]);
 
         _transactionHolder.CommitAndReset();
         _storage.Checkpoint();
@@ -135,7 +133,7 @@ public class CompactionBenchmarks
 
     private static string BuildPayload(int seed)
     {
-        var builder = new System.Text.StringBuilder(2500);
+        var builder = new StringBuilder(2500);
         for (var i = 0; i < 80; i++)
         {
             builder.Append("compact-");
@@ -1,7 +1,7 @@
+using System.IO.Compression;
+using System.Text;
 using BenchmarkDotNet.Attributes;
 using BenchmarkDotNet.Configs;
-using BenchmarkDotNet.Jobs;
-using System.IO.Compression;
 using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Core.Collections;
 using ZB.MOM.WW.CBDD.Core.Compression;
@@ -19,6 +19,15 @@ public class CompressionBenchmarks
 {
     private const int SeedCount = 300;
     private const int WorkloadCount = 100;
+    private DocumentCollection<Person> _collection = null!;
+
+    private string _dbPath = string.Empty;
+
+    private Person[] _insertBatch = Array.Empty<Person>();
+    private ObjectId[] _seedIds = Array.Empty<ObjectId>();
+    private StorageEngine _storage = null!;
+    private BenchmarkTransactionHolder _transactionHolder = null!;
+    private string _walPath = string.Empty;
 
     /// <summary>
     /// Gets or sets whether compression is enabled for the benchmark run.
@@ -38,15 +47,6 @@ public class CompressionBenchmarks
     [Params(CompressionLevel.Fastest, CompressionLevel.Optimal)]
     public CompressionLevel Level { get; set; }
 
-    private string _dbPath = string.Empty;
-    private string _walPath = string.Empty;
-    private StorageEngine _storage = null!;
-    private BenchmarkTransactionHolder _transactionHolder = null!;
-    private DocumentCollection<Person> _collection = null!;
-
-    private Person[] _insertBatch = Array.Empty<Person>();
-    private ObjectId[] _seedIds = Array.Empty<ObjectId>();
-
     /// <summary>
     /// Prepares benchmark storage and seed data for each iteration.
     /// </summary>
@@ -73,14 +73,14 @@ public class CompressionBenchmarks
         _seedIds = new ObjectId[SeedCount];
         for (var i = 0; i < SeedCount; i++)
         {
-            var doc = CreatePerson(i, includeLargeBio: true);
+            var doc = CreatePerson(i, true);
            _seedIds[i] = _collection.Insert(doc);
         }
 
         _transactionHolder.CommitAndReset();
 
         _insertBatch = Enumerable.Range(SeedCount, WorkloadCount)
-            .Select(i => CreatePerson(i, includeLargeBio: true))
+            .Select(i => CreatePerson(i, true))
             .ToArray();
     }
 
@@ -141,10 +141,7 @@ public class CompressionBenchmarks
         for (var i = 0; i < WorkloadCount; i++)
         {
             var person = _collection.FindById(_seedIds[i]);
-            if (person != null)
-            {
-                checksum += person.Age;
-            }
+            if (person != null) checksum += person.Age;
         }
 
         _transactionHolder.CommitAndReset();
@@ -158,7 +155,7 @@ public class CompressionBenchmarks
             Id = ObjectId.NewObjectId(),
             FirstName = $"First_{i}",
             LastName = $"Last_{i}",
-            Age = 20 + (i % 50),
+            Age = 20 + i % 50,
             Bio = includeLargeBio ? BuildBio(i) : $"bio-{i}",
             CreatedAt = DateTime.UnixEpoch.AddMinutes(i),
             Balance = 100 + i,
@@ -183,7 +180,7 @@ public class CompressionBenchmarks
|
|||||||
|
|
||||||
private static string BuildBio(int seed)
|
private static string BuildBio(int seed)
|
||||||
{
|
{
|
||||||
var builder = new System.Text.StringBuilder(4500);
|
var builder = new StringBuilder(4500);
|
||||||
for (var i = 0; i < 150; i++)
|
for (var i = 0; i < 150; i++)
|
||||||
{
|
{
|
||||||
builder.Append("bio-");
|
builder.Append("bio-");
|
||||||
|
|||||||
@@ -1,6 +1,4 @@
 using ZB.MOM.WW.CBDD.Bson;
-using System;
-using System.Collections.Generic;

 namespace ZB.MOM.WW.CBDD.Tests.Benchmark;

@@ -10,10 +8,12 @@ public class Address
     /// Gets or sets the Street.
     /// </summary>
     public string Street { get; set; } = string.Empty;
+
     /// <summary>
     /// Gets or sets the City.
     /// </summary>
     public string City { get; set; } = string.Empty;
+
     /// <summary>
     /// Gets or sets the ZipCode.
     /// </summary>
@@ -26,14 +26,17 @@ public class WorkHistory
     /// Gets or sets the CompanyName.
     /// </summary>
     public string CompanyName { get; set; } = string.Empty;
+
     /// <summary>
     /// Gets or sets the Title.
     /// </summary>
     public string Title { get; set; } = string.Empty;
+
     /// <summary>
     /// Gets or sets the DurationYears.
     /// </summary>
     public int DurationYears { get; set; }
+
     /// <summary>
     /// Gets or sets the Tags.
     /// </summary>
@@ -46,22 +49,27 @@ public class Person
     /// Gets or sets the Id.
     /// </summary>
     public ObjectId Id { get; set; }
+
     /// <summary>
     /// Gets or sets the FirstName.
     /// </summary>
     public string FirstName { get; set; } = string.Empty;
+
     /// <summary>
     /// Gets or sets the LastName.
     /// </summary>
     public string LastName { get; set; } = string.Empty;
+
     /// <summary>
     /// Gets or sets the Age.
     /// </summary>
     public int Age { get; set; }
+
     /// <summary>
     /// Gets or sets the Bio.
     /// </summary>
     public string? Bio { get; set; } = string.Empty;
+
     /// <summary>
     /// Gets or sets the CreatedAt.
     /// </summary>
@@ -72,10 +80,12 @@ public class Person
     /// Gets or sets the Balance.
     /// </summary>
     public decimal Balance { get; set; }
+
     /// <summary>
     /// Gets or sets the HomeAddress.
     /// </summary>
     public Address HomeAddress { get; set; } = new();
+
     /// <summary>
     /// Gets or sets the EmploymentHistory.
     /// </summary>
@@ -1,7 +1,5 @@
 using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Core.Collections;
-using System.Buffers;
-using System.Runtime.InteropServices;

 namespace ZB.MOM.WW.CBDD.Tests.Benchmark;

@@ -11,15 +9,21 @@ public class PersonMapper : ObjectIdMapperBase<Person>
     public override string CollectionName => "people";

     /// <inheritdoc />
-    public override ObjectId GetId(Person entity) => entity.Id;
+    public override ObjectId GetId(Person entity)
+    {
+        return entity.Id;
+    }

     /// <inheritdoc />
-    public override void SetId(Person entity, ObjectId id) => entity.Id = id;
+    public override void SetId(Person entity, ObjectId id)
+    {
+        entity.Id = id;
+    }

     /// <inheritdoc />
     public override int Serialize(Person entity, BsonSpanWriter writer)
     {
-        var sizePos = writer.BeginDocument();
+        int sizePos = writer.BeginDocument();

         writer.WriteObjectId("_id", entity.Id);
         writer.WriteString("firstname", entity.FirstName);
@@ -36,34 +40,32 @@ public class PersonMapper : ObjectIdMapperBase<Person>
         writer.WriteDouble("balance", (double)entity.Balance);

         // Nested Object: Address
-        var addrPos = writer.BeginDocument("homeaddress");
+        int addrPos = writer.BeginDocument("homeaddress");
         writer.WriteString("street", entity.HomeAddress.Street);
         writer.WriteString("city", entity.HomeAddress.City);
         writer.WriteString("zipcode", entity.HomeAddress.ZipCode);
         writer.EndDocument(addrPos);

         // Collection: EmploymentHistory
-        var histPos = writer.BeginArray("employmenthistory");
-        for (int i = 0; i < entity.EmploymentHistory.Count; i++)
+        int histPos = writer.BeginArray("employmenthistory");
+        for (var i = 0; i < entity.EmploymentHistory.Count; i++)
         {
             var item = entity.EmploymentHistory[i];
             // Array elements are keys "0", "1", "2"...
-            var itemPos = writer.BeginDocument(i.ToString());
+            int itemPos = writer.BeginDocument(i.ToString());

             writer.WriteString("companyname", item.CompanyName);
             writer.WriteString("title", item.Title);
             writer.WriteInt32("durationyears", item.DurationYears);

             // Nested Collection: Tags
-            var tagsPos = writer.BeginArray("tags");
-            for (int j = 0; j < item.Tags.Count; j++)
-            {
-                writer.WriteString(j.ToString(), item.Tags[j]);
-            }
+            int tagsPos = writer.BeginArray("tags");
+            for (var j = 0; j < item.Tags.Count; j++) writer.WriteString(j.ToString(), item.Tags[j]);
             writer.EndArray(tagsPos);

             writer.EndDocument(itemPos);
         }

         writer.EndArray(histPos);

         writer.EndDocument(sizePos);
@@ -84,7 +86,7 @@ public class PersonMapper : ObjectIdMapperBase<Person>
            if (type == BsonType.EndOfDocument)
                break;

-           var name = reader.ReadElementHeader();
+           string name = reader.ReadElementHeader();

            switch (name)
            {
@@ -105,7 +107,7 @@ public class PersonMapper : ObjectIdMapperBase<Person>
                    {
                        var addrType = reader.ReadBsonType();
                        if (addrType == BsonType.EndOfDocument) break;
-                       var addrName = reader.ReadElementHeader();
+                       string addrName = reader.ReadElementHeader();

                        // We assume strict schema for benchmark speed, but should handle skipping
                        if (addrName == "street") person.HomeAddress.Street = reader.ReadString();
@@ -113,6 +115,7 @@ public class PersonMapper : ObjectIdMapperBase<Person>
                        else if (addrName == "zipcode") person.HomeAddress.ZipCode = reader.ReadString();
                        else reader.SkipValue(addrType);
                    }
+
                    break;

                case "employmenthistory":
@@ -130,11 +133,20 @@ public class PersonMapper : ObjectIdMapperBase<Person>
                        {
                            var itemType = reader.ReadBsonType();
                            if (itemType == BsonType.EndOfDocument) break;
-                           var itemName = reader.ReadElementHeader();
+                           string itemName = reader.ReadElementHeader();

-                           if (itemName == "companyname") workItem.CompanyName = reader.ReadString();
-                           else if (itemName == "title") workItem.Title = reader.ReadString();
-                           else if (itemName == "durationyears") workItem.DurationYears = reader.ReadInt32();
+                           if (itemName == "companyname")
+                           {
+                               workItem.CompanyName = reader.ReadString();
+                           }
+                           else if (itemName == "title")
+                           {
+                               workItem.Title = reader.ReadString();
+                           }
+                           else if (itemName == "durationyears")
+                           {
+                               workItem.DurationYears = reader.ReadInt32();
+                           }
                            else if (itemName == "tags")
                            {
                                reader.ReadDocumentSize(); // Enter Tags Array
@@ -149,10 +161,15 @@ public class PersonMapper : ObjectIdMapperBase<Person>
                                    reader.SkipValue(tagType);
                                }
                            }
-                           else reader.SkipValue(itemType);
+                           else
+                           {
+                               reader.SkipValue(itemType);
+                           }
                        }

                        person.EmploymentHistory.Add(workItem);
                    }

                    break;

                default:
@@ -1,4 +1,3 @@
-using ZB.MOM.WW.CBDD.Core;
 using ZB.MOM.WW.CBDD.Core.Storage;
 using ZB.MOM.WW.CBDD.Core.Transactions;

@@ -19,6 +18,14 @@ internal sealed class BenchmarkTransactionHolder : ITransactionHolder, IDisposab
         _storage = storage ?? throw new ArgumentNullException(nameof(storage));
     }

+    /// <summary>
+    /// Disposes this holder and rolls back any outstanding transaction.
+    /// </summary>
+    public void Dispose()
+    {
+        RollbackAndReset();
+    }
+
     /// <summary>
     /// Gets the current active transaction or starts a new one.
     /// </summary>
@@ -28,9 +35,7 @@ internal sealed class BenchmarkTransactionHolder : ITransactionHolder, IDisposab
         lock (_sync)
         {
             if (_currentTransaction == null || _currentTransaction.State != TransactionState.Active)
-            {
                 _currentTransaction = _storage.BeginTransaction();
-            }

             return _currentTransaction;
         }
@@ -52,16 +57,11 @@ internal sealed class BenchmarkTransactionHolder : ITransactionHolder, IDisposab
     {
         lock (_sync)
         {
-            if (_currentTransaction == null)
-            {
-                return;
-            }
+            if (_currentTransaction == null) return;

             if (_currentTransaction.State == TransactionState.Active ||
                 _currentTransaction.State == TransactionState.Preparing)
-            {
                 _currentTransaction.Commit();
-            }

             _currentTransaction.Dispose();
             _currentTransaction = null;
@@ -75,27 +75,14 @@ internal sealed class BenchmarkTransactionHolder : ITransactionHolder, IDisposab
     {
         lock (_sync)
         {
-            if (_currentTransaction == null)
-            {
-                return;
-            }
+            if (_currentTransaction == null) return;

             if (_currentTransaction.State == TransactionState.Active ||
                 _currentTransaction.State == TransactionState.Preparing)
-            {
                 _currentTransaction.Rollback();
-            }

             _currentTransaction.Dispose();
             _currentTransaction = null;
         }
     }
-
-    /// <summary>
-    /// Disposes this holder and rolls back any outstanding transaction.
-    /// </summary>
-    public void Dispose()
-    {
-        RollbackAndReset();
-    }
 }
@@ -1,5 +1,6 @@
 using Microsoft.Extensions.Logging;
 using Serilog;
+using ILogger = Microsoft.Extensions.Logging.ILogger;

 namespace ZB.MOM.WW.CBDD.Tests.Benchmark;

@@ -17,7 +18,7 @@ internal static class Logging
     /// </summary>
     /// <typeparam name="T">The logger category type.</typeparam>
     /// <returns>A logger for <typeparamref name="T" />.</returns>
-    public static Microsoft.Extensions.Logging.ILogger CreateLogger<T>()
+    public static ILogger CreateLogger<T>()
     {
         return LoggerFactory.CreateLogger<T>();
     }
@@ -32,7 +33,7 @@ internal static class Logging
         return Microsoft.Extensions.Logging.LoggerFactory.Create(builder =>
         {
             builder.ClearProviders();
-            builder.AddSerilog(serilogLogger, dispose: true);
+            builder.AddSerilog(serilogLogger, true);
         });
     }
 }
@@ -3,17 +3,17 @@ using BenchmarkDotNet.Configs
 using BenchmarkDotNet.Exporters;
 using BenchmarkDotNet.Reports;
 using BenchmarkDotNet.Running;
-using Microsoft.Extensions.Logging;
+using Perfolizer.Horology;
 using Serilog.Context;

 namespace ZB.MOM.WW.CBDD.Tests.Benchmark;

-class Program
+internal class Program
 {
-    static void Main(string[] args)
+    private static void Main(string[] args)
     {
         var logger = Logging.CreateLogger<Program>();
-        var mode = args.Length > 0 ? args[0].Trim().ToLowerInvariant() : string.Empty;
+        string mode = args.Length > 0 ? args[0].Trim().ToLowerInvariant() : string.Empty;

         if (mode == "manual")
         {
@@ -84,6 +84,6 @@ class Program
             .AddExporter(HtmlExporter.Default)
             .WithSummaryStyle(SummaryStyle.Default
                 .WithRatioStyle(RatioStyle.Trend)
-                .WithTimeUnit(Perfolizer.Horology.TimeUnit.Microsecond));
+                .WithTimeUnit(TimeUnit.Microsecond));
     }
 }
@@ -1,18 +1,13 @@
 using BenchmarkDotNet.Attributes;
 using BenchmarkDotNet.Configs;
-using BenchmarkDotNet.Jobs;
-using ZB.MOM.WW.CBDD.Bson;
-using ZB.MOM.WW.CBDD.Core;
-using ZB.MOM.WW.CBDD.Core.Collections;
-using ZB.MOM.WW.CBDD.Core.Storage;
-using ZB.MOM.WW.CBDD.Core.Transactions;
 using Microsoft.Extensions.Logging;
 using Serilog.Context;
-using System.IO;
+using ZB.MOM.WW.CBDD.Bson;
+using ZB.MOM.WW.CBDD.Core.Collections;
+using ZB.MOM.WW.CBDD.Core.Storage;

 namespace ZB.MOM.WW.CBDD.Tests.Benchmark;

-
 [InProcess]
 [MemoryDiagnoser]
 [GroupBenchmarksBy(BenchmarkLogicalGroupRule.ByCategory)]
@@ -23,15 +18,15 @@ public class InsertBenchmarks
     private const int BatchSize = 1000;
     private static readonly ILogger Logger = Logging.CreateLogger<InsertBenchmarks>();

+    private Person[] _batchData = Array.Empty<Person>();
+    private DocumentCollection<Person>? _collection;
+
     private string _docDbPath = "";
     private string _docDbWalPath = "";
+    private Person? _singlePerson;

-    private StorageEngine? _storage = null;
-    private BenchmarkTransactionHolder? _transactionHolder = null;
-    private DocumentCollection<Person>? _collection = null;
-
-    private Person[] _batchData = Array.Empty<Person>();
-    private Person? _singlePerson = null;
+    private StorageEngine? _storage;
+    private BenchmarkTransactionHolder? _transactionHolder;

     /// <summary>
     /// Tests setup.
@@ -39,17 +34,14 @@ public class InsertBenchmarks
     [GlobalSetup]
     public void Setup()
     {
-        var temp = AppContext.BaseDirectory;
+        string temp = AppContext.BaseDirectory;
         var id = Guid.NewGuid().ToString("N");
         _docDbPath = Path.Combine(temp, $"bench_docdb_{id}.db");
         _docDbWalPath = Path.ChangeExtension(_docDbPath, ".wal");

         _singlePerson = CreatePerson(0);
         _batchData = new Person[BatchSize];
-        for (int i = 0; i < BatchSize; i++)
-        {
-            _batchData[i] = CreatePerson(i);
-        }
+        for (var i = 0; i < BatchSize; i++) _batchData[i] = CreatePerson(i);
     }

     private Person CreatePerson(int i)
@@ -59,7 +51,7 @@ public class InsertBenchmarks
             Id = ObjectId.NewObjectId(),
             FirstName = $"First_{i}",
             LastName = $"Last_{i}",
-            Age = 20 + (i % 50),
+            Age = 20 + i % 50,
             Bio = null, // Removed large payload to focus on structure
             CreatedAt = DateTime.UtcNow,
             Balance = 1000.50m * (i + 1),
@@ -72,8 +64,7 @@ public class InsertBenchmarks
         };

         // Add 10 work history items to stress structure traversal
-        for (int j = 0; j < 10; j++)
-        {
+        for (var j = 0; j < 10; j++)
             p.EmploymentHistory.Add(new WorkHistory
             {
                 CompanyName = $"TechCorp_{i}_{j}",
@@ -81,7 +72,6 @@ public class InsertBenchmarks
                 DurationYears = j,
                 Tags = new List<string> { "C#", "BSON", "Performance", "Database", "Complex" }
             });
-        }

         return p;
     }
@@ -111,7 +101,7 @@ public class InsertBenchmarks
         _storage?.Dispose();
         _storage = null;

-        System.Threading.Thread.Sleep(100);
+        Thread.Sleep(100);

         if (File.Exists(_docDbPath)) File.Delete(_docDbPath);
         if (File.Exists(_docDbWalPath)) File.Delete(_docDbWalPath);
@@ -1,12 +1,8 @@
 using BenchmarkDotNet.Attributes;
 using BenchmarkDotNet.Configs;
-using BenchmarkDotNet.Jobs;
 using ZB.MOM.WW.CBDD.Bson;
-using ZB.MOM.WW.CBDD.Core;
 using ZB.MOM.WW.CBDD.Core.Collections;
 using ZB.MOM.WW.CBDD.Core.Storage;
-using ZB.MOM.WW.CBDD.Core.Transactions;
-using System.IO;

 namespace ZB.MOM.WW.CBDD.Tests.Benchmark;

@@ -19,16 +15,16 @@ namespace ZB.MOM.WW.CBDD.Tests.Benchmark;
 public class ReadBenchmarks
 {
     private const int DocCount = 1000;
+    private DocumentCollection<Person> _collection = null!;
+
     private string _docDbPath = null!;
     private string _docDbWalPath = null!;

-    private StorageEngine _storage = null!;
-    private BenchmarkTransactionHolder _transactionHolder = null!;
-    private DocumentCollection<Person> _collection = null!;
-
     private ObjectId[] _ids = null!;

+    private StorageEngine _storage = null!;
     private ObjectId _targetId;
+    private BenchmarkTransactionHolder _transactionHolder = null!;

     /// <summary>
     /// Tests setup.
@@ -36,7 +32,7 @@ public class ReadBenchmarks
     [GlobalSetup]
     public void Setup()
     {
-        var temp = AppContext.BaseDirectory;
+        string temp = AppContext.BaseDirectory;
         var id = Guid.NewGuid().ToString("N");
         _docDbPath = Path.Combine(temp, $"bench_read_docdb_{id}.db");
         _docDbWalPath = Path.ChangeExtension(_docDbPath, ".wal");
@@ -49,11 +45,12 @@ public class ReadBenchmarks
         _collection = new DocumentCollection<Person>(_storage, _transactionHolder, new PersonMapper());

         _ids = new ObjectId[DocCount];
-        for (int i = 0; i < DocCount; i++)
+        for (var i = 0; i < DocCount; i++)
         {
             var p = CreatePerson(i);
             _ids[i] = _collection.Insert(p);
         }

         _transactionHolder.CommitAndReset();

         _targetId = _ids[DocCount / 2];
@@ -79,7 +76,7 @@ public class ReadBenchmarks
             Id = ObjectId.NewObjectId(),
             FirstName = $"First_{i}",
             LastName = $"Last_{i}",
-            Age = 20 + (i % 50),
+            Age = 20 + i % 50,
             Bio = null,
             CreatedAt = DateTime.UtcNow,
             Balance = 1000.50m * (i + 1),
@@ -92,8 +89,7 @@ public class ReadBenchmarks
         };

         // Add 10 work history items
-        for (int j = 0; j < 10; j++)
-        {
+        for (var j = 0; j < 10; j++)
             p.EmploymentHistory.Add(new WorkHistory
             {
                 CompanyName = $"TechCorp_{i}_{j}",
@@ -101,7 +97,6 @@ public class ReadBenchmarks
                 DurationYears = j,
                 Tags = new List<string> { "C#", "BSON", "Performance", "Database", "Complex" }
             });
-        }

         return p;
     }
@@ -1,7 +1,8 @@
+using System.Collections.Concurrent;
+using System.Text.Json;
 using BenchmarkDotNet.Attributes;
 using BenchmarkDotNet.Configs;
 using ZB.MOM.WW.CBDD.Bson;
-using System.Text.Json;

 namespace ZB.MOM.WW.CBDD.Tests.Benchmark;

@@ -13,32 +14,37 @@ namespace ZB.MOM.WW.CBDD.Tests.Benchmark;
 public class SerializationBenchmarks
 {
     private const int BatchSize = 10000;
-    private Person _person = null!;
-    private List<Person> _people = null!;
-    private PersonMapper _mapper = new PersonMapper();
+    private static readonly ConcurrentDictionary<string, ushort> _keyMap = new(StringComparer.OrdinalIgnoreCase);
+    private static readonly ConcurrentDictionary<ushort, string> _keys = new();

+    private readonly List<byte[]> _bsonDataList = new();
+    private readonly List<byte[]> _jsonDataList = new();
+    private readonly PersonMapper _mapper = new();
     private byte[] _bsonData = Array.Empty<byte>();
     private byte[] _jsonData = Array.Empty<byte>();
-    private List<byte[]> _bsonDataList = new();
-    private List<byte[]> _jsonDataList = new();
+    private List<Person> _people = null!;
+    private Person _person = null!;

     private byte[] _serializeBuffer = Array.Empty<byte>();

-    private static readonly System.Collections.Concurrent.ConcurrentDictionary<string, ushort> _keyMap = new(StringComparer.OrdinalIgnoreCase);
-    private static readonly System.Collections.Concurrent.ConcurrentDictionary<ushort, string> _keys = new();
-
     static SerializationBenchmarks()
     {
         ushort id = 1;
-        string[] initialKeys = { "_id", "firstname", "lastname", "age", "bio", "createdat", "balance", "homeaddress", "street", "city", "zipcode", "employmenthistory", "companyname", "title", "durationyears", "tags" };
-        foreach (var key in initialKeys)
+        string[] initialKeys =
+        {
+            "_id", "firstname", "lastname", "age", "bio", "createdat", "balance", "homeaddress", "street", "city",
+            "zipcode", "employmenthistory", "companyname", "title", "durationyears", "tags"
+        };
+        foreach (string key in initialKeys)
         {
             _keyMap[key] = id;
             _keys[id] = key;
             id++;
         }

         // Add some indices for arrays
-        for (int i = 0; i < 100; i++)
+        for (var i = 0; i < 100; i++)
         {
             var s = i.ToString();
             _keyMap[s] = id;
@@ -55,10 +61,7 @@ public class SerializationBenchmarks
     {
         _person = CreatePerson(0);
         _people = new List<Person>(BatchSize);
-        for (int i = 0; i < BatchSize; i++)
-        {
-            _people.Add(CreatePerson(i));
-        }
+        for (var i = 0; i < BatchSize; i++) _people.Add(CreatePerson(i));

         // Pre-allocate buffer for BSON serialization
         _serializeBuffer = new byte[8192];
@@ -66,7 +69,7 @@ public class SerializationBenchmarks
         var writer = new BsonSpanWriter(_serializeBuffer, _keyMap);

         // Single item data
-        var len = _mapper.Serialize(_person, writer);
+        int len = _mapper.Serialize(_person, writer);
|
||||||
_bsonData = _serializeBuffer.AsSpan(0, len).ToArray();
|
_bsonData = _serializeBuffer.AsSpan(0, len).ToArray();
|
||||||
_jsonData = JsonSerializer.SerializeToUtf8Bytes(_person);
|
_jsonData = JsonSerializer.SerializeToUtf8Bytes(_person);
|
||||||
|
|
||||||
@@ -98,8 +101,7 @@ public class SerializationBenchmarks
|
|||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
for (int j = 0; j < 10; j++)
|
for (var j = 0; j < 10; j++)
|
||||||
{
|
|
||||||
p.EmploymentHistory.Add(new WorkHistory
|
p.EmploymentHistory.Add(new WorkHistory
|
||||||
{
|
{
|
||||||
CompanyName = $"TechCorp_{i}_{j}",
|
CompanyName = $"TechCorp_{i}_{j}",
|
||||||
@@ -107,7 +109,6 @@ public class SerializationBenchmarks
|
|||||||
DurationYears = j,
|
DurationYears = j,
|
||||||
Tags = new List<string> { "C#", "BSON", "Performance", "Database", "Complex" }
|
Tags = new List<string> { "C#", "BSON", "Performance", "Database", "Complex" }
|
||||||
});
|
});
|
||||||
}
|
|
||||||
return p;
|
return p;
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -174,10 +175,7 @@ public class SerializationBenchmarks
|
|||||||
[BenchmarkCategory("Batch")]
|
[BenchmarkCategory("Batch")]
|
||||||
public void Serialize_List_Json()
|
public void Serialize_List_Json()
|
||||||
{
|
{
|
||||||
foreach (var p in _people)
|
foreach (var p in _people) JsonSerializer.SerializeToUtf8Bytes(p);
|
||||||
{
|
|
||||||
JsonSerializer.SerializeToUtf8Bytes(p);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@@ -187,7 +185,7 @@ public class SerializationBenchmarks
|
|||||||
[BenchmarkCategory("Batch")]
|
[BenchmarkCategory("Batch")]
|
||||||
public void Deserialize_List_Bson()
|
public void Deserialize_List_Bson()
|
||||||
{
|
{
|
||||||
foreach (var data in _bsonDataList)
|
foreach (byte[] data in _bsonDataList)
|
||||||
{
|
{
|
||||||
var reader = new BsonSpanReader(data, _keys);
|
var reader = new BsonSpanReader(data, _keys);
|
||||||
_mapper.Deserialize(reader);
|
_mapper.Deserialize(reader);
|
||||||
@@ -201,9 +199,6 @@ public class SerializationBenchmarks
|
|||||||
[BenchmarkCategory("Batch")]
|
[BenchmarkCategory("Batch")]
|
||||||
public void Deserialize_List_Json()
|
public void Deserialize_List_Json()
|
||||||
{
|
{
|
||||||
foreach (var data in _jsonDataList)
|
foreach (byte[] data in _jsonDataList) JsonSerializer.Deserialize<Person>(data);
|
||||||
{
|
|
||||||
JsonSerializer.Deserialize<Person>(data);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -1,4 +1,5 @@
 using System.Diagnostics;
+using System.IO.Compression;
 using Microsoft.Extensions.Logging;
 using Serilog.Context;
 using ZB.MOM.WW.CBDD.Bson;
@@ -10,45 +11,45 @@ namespace ZB.MOM.WW.CBDD.Tests.Benchmark;

 internal static class DatabaseSizeBenchmark
 {
+    private const int BatchSize = 50_000;
+    private const int ProgressInterval = 1_000_000;
     private static readonly int[] TargetCounts = [10_000, 1_000_000, 10_000_000];

     private static readonly CompressionOptions CompressedBrotliFast = new()
     {
         EnableCompression = true,
         MinSizeBytes = 256,
         MinSavingsPercent = 0,
         Codec = CompressionCodec.Brotli,
-        Level = System.IO.Compression.CompressionLevel.Fastest
+        Level = CompressionLevel.Fastest
     };

     private static readonly Scenario[] Scenarios =
     [
         // Separate compression set (no compaction)
         new(
-            Set: "compression",
-            Name: "CompressionOnly-Uncompressed",
-            CompressionOptions: CompressionOptions.Default,
-            RunCompaction: false),
+            "compression",
+            "CompressionOnly-Uncompressed",
+            CompressionOptions.Default,
+            false),
         new(
-            Set: "compression",
-            Name: "CompressionOnly-Compressed-BrotliFast",
-            CompressionOptions: CompressedBrotliFast,
-            RunCompaction: false),
+            "compression",
+            "CompressionOnly-Compressed-BrotliFast",
+            CompressedBrotliFast,
+            false),
         // Separate compaction set (compaction enabled)
         new(
-            Set: "compaction",
-            Name: "Compaction-Uncompressed",
-            CompressionOptions: CompressionOptions.Default,
-            RunCompaction: true),
+            "compaction",
+            "Compaction-Uncompressed",
+            CompressionOptions.Default,
+            true),
         new(
-            Set: "compaction",
-            Name: "Compaction-Compressed-BrotliFast",
-            CompressionOptions: CompressedBrotliFast,
-            RunCompaction: true)
+            "compaction",
+            "Compaction-Compressed-BrotliFast",
+            CompressedBrotliFast,
+            true)
     ];

-    private const int BatchSize = 50_000;
-    private const int ProgressInterval = 1_000_000;

     /// <summary>
     /// Tests run.
     /// </summary>
@@ -62,12 +63,12 @@ internal static class DatabaseSizeBenchmark
         logger.LogInformation("Scenarios: {Scenarios}", string.Join(", ", Scenarios.Select(x => $"{x.Set}:{x.Name}")));
         logger.LogInformation("Batch size: {BatchSize:N0}", BatchSize);

-        foreach (var targetCount in TargetCounts)
-        {
+        foreach (int targetCount in TargetCounts)
         foreach (var scenario in Scenarios)
         {
-            var dbPath = Path.Combine(Path.GetTempPath(), $"cbdd_size_{scenario.Name}_{targetCount}_{Guid.NewGuid():N}.db");
-            var walPath = Path.ChangeExtension(dbPath, ".wal");
+            string dbPath = Path.Combine(Path.GetTempPath(),
+                $"cbdd_size_{scenario.Name}_{targetCount}_{Guid.NewGuid():N}.db");
+            string walPath = Path.ChangeExtension(dbPath, ".wal");
             using var _ = LogContext.PushProperty("TargetCount", targetCount);
             using var __ = LogContext.PushProperty("Scenario", scenario.Name);
             using var ___ = LogContext.PushProperty("ScenarioSet", scenario.Set);
@@ -97,38 +98,31 @@ internal static class DatabaseSizeBenchmark
             var inserted = 0;
             while (inserted < targetCount)
             {
-                var currentBatchSize = Math.Min(BatchSize, targetCount - inserted);
+                int currentBatchSize = Math.Min(BatchSize, targetCount - inserted);
                 var documents = new SizeBenchmarkDocument[currentBatchSize];
-                var baseValue = inserted;
+                int baseValue = inserted;

-                for (var i = 0; i < currentBatchSize; i++)
-                {
-                    documents[i] = CreateDocument(baseValue + i);
-                }
+                for (var i = 0; i < currentBatchSize; i++) documents[i] = CreateDocument(baseValue + i);

                 collection.InsertBulk(documents);
                 transactionHolder.CommitAndReset();

                 inserted += currentBatchSize;
                 if (inserted == targetCount || inserted % ProgressInterval == 0)
-                {
                     logger.LogInformation("Inserted {Inserted:N0}/{TargetCount:N0}", inserted, targetCount);
                 }
-            }

             insertStopwatch.Stop();
             preCompactDbBytes = File.Exists(dbPath) ? new FileInfo(dbPath).Length : 0;
             preCompactWalBytes = File.Exists(walPath) ? new FileInfo(walPath).Length : 0;

             if (scenario.RunCompaction)
-            {
                 compactionStats = storage.Compact(new CompactionOptions
                 {
                     EnableTailTruncation = true,
                     DefragmentSlottedPages = true,
                     NormalizeFreeList = true
                 });
-            }

             postCompactDbBytes = File.Exists(dbPath) ? new FileInfo(dbPath).Length : 0;
             postCompactWalBytes = File.Exists(walPath) ? new FileInfo(walPath).Length : 0;
@@ -165,14 +159,12 @@ internal static class DatabaseSizeBenchmark
                 TryDelete(dbPath);
                 TryDelete(walPath);
             }
-        }

         logger.LogInformation("=== Size Benchmark Summary ===");
         foreach (var result in results
                      .OrderBy(x => x.Set)
                      .ThenBy(x => x.TargetCount)
                      .ThenBy(x => x.Scenario))
-        {
             logger.LogInformation(
                 "{Set,-11} | {Scenario,-38} | {Count,12:N0} docs | insert={Elapsed,12} | pre={Pre,12} | post={Post,12} | shrink={Shrink,12} | compact={CompactBytes,12} | ratio={Ratio}",
                 result.Set,
@@ -184,7 +176,6 @@ internal static class DatabaseSizeBenchmark
                 FormatBytes(result.ShrinkBytes),
                 FormatBytes(result.CompactionStats.ReclaimedFileBytes),
                 result.CompressionRatioText);
-        }

         WriteSummaryCsv(results, logger);
     }
@@ -201,10 +192,7 @@ internal static class DatabaseSizeBenchmark

     private static void TryDelete(string path)
     {
-        if (File.Exists(path))
-        {
-            File.Delete(path);
-        }
+        if (File.Exists(path)) File.Delete(path);
     }

     private static string FormatBytes(long bytes)
@@ -224,9 +212,9 @@ internal static class DatabaseSizeBenchmark

     private static void WriteSummaryCsv(IEnumerable<SizeResult> results, ILogger logger)
     {
-        var outputDirectory = Path.Combine(Directory.GetCurrentDirectory(), "BenchmarkDotNet.Artifacts", "results");
+        string outputDirectory = Path.Combine(Directory.GetCurrentDirectory(), "BenchmarkDotNet.Artifacts", "results");
         Directory.CreateDirectory(outputDirectory);
-        var outputPath = Path.Combine(outputDirectory, "DatabaseSizeBenchmark-results.csv");
+        string outputPath = Path.Combine(outputDirectory, "DatabaseSizeBenchmark-results.csv");

         var lines = new List<string>
         {
@@ -234,7 +222,6 @@ internal static class DatabaseSizeBenchmark
         };

         foreach (var result in results.OrderBy(x => x.Set).ThenBy(x => x.TargetCount).ThenBy(x => x.Scenario))
-        {
             lines.Add(string.Join(",",
                 result.Set,
                 result.Scenario,
@@ -246,7 +233,6 @@ internal static class DatabaseSizeBenchmark
                 result.ShrinkBytes.ToString(),
                 result.CompactionStats.ReclaimedFileBytes.ToString(),
                 result.CompressionRatioText));
-        }

         File.WriteAllLines(outputPath, lines);
         logger.LogInformation("Database size summary CSV written to {OutputPath}", outputPath);
@@ -271,10 +257,12 @@ internal static class DatabaseSizeBenchmark
     /// Gets or sets the pre compact total bytes.
     /// </summary>
     public long PreCompactTotalBytes => PreCompactDbBytes + PreCompactWalBytes;

     /// <summary>
     /// Gets or sets the post compact total bytes.
     /// </summary>
     public long PostCompactTotalBytes => PostCompactDbBytes + PostCompactWalBytes;

     /// <summary>
     /// Gets or sets the shrink bytes.
     /// </summary>
@@ -295,10 +283,12 @@ internal static class DatabaseSizeBenchmark
     /// Gets or sets the id.
     /// </summary>
     public ObjectId Id { get; set; }

     /// <summary>
     /// Gets or sets the value.
     /// </summary>
     public int Value { get; set; }

     /// <summary>
     /// Gets or sets the name.
     /// </summary>
@@ -311,15 +301,21 @@ internal static class DatabaseSizeBenchmark
     public override string CollectionName => "size_documents";

     /// <inheritdoc />
-    public override ObjectId GetId(SizeBenchmarkDocument entity) => entity.Id;
+    public override ObjectId GetId(SizeBenchmarkDocument entity)
+    {
+        return entity.Id;
+    }

     /// <inheritdoc />
-    public override void SetId(SizeBenchmarkDocument entity, ObjectId id) => entity.Id = id;
+    public override void SetId(SizeBenchmarkDocument entity, ObjectId id)
+    {
+        entity.Id = id;
+    }

     /// <inheritdoc />
     public override int Serialize(SizeBenchmarkDocument entity, BsonSpanWriter writer)
     {
-        var sizePos = writer.BeginDocument();
+        int sizePos = writer.BeginDocument();
         writer.WriteObjectId("_id", entity.Id);
         writer.WriteInt32("value", entity.Value);
         writer.WriteString("name", entity.Name);
@@ -336,12 +332,9 @@ internal static class DatabaseSizeBenchmark
         while (reader.Remaining > 0)
         {
             var bsonType = reader.ReadBsonType();
-            if (bsonType == BsonType.EndOfDocument)
-            {
-                break;
-            }
+            if (bsonType == BsonType.EndOfDocument) break;

-            var name = reader.ReadElementHeader();
+            string name = reader.ReadElementHeader();
             switch (name)
             {
                 case "_id":
@@ -1,5 +1,4 @@
 using System.Diagnostics;
-using System.IO;
 using System.Text;
 using Microsoft.Extensions.Logging;
 using Serilog.Context;
@@ -8,7 +7,7 @@ namespace ZB.MOM.WW.CBDD.Tests.Benchmark;

 public class ManualBenchmark
 {
-    private static StringBuilder _log = new();
+    private static readonly StringBuilder _log = new();

     private static void Log(ILogger logger, string message = "")
     {
@@ -60,10 +59,7 @@ public class ManualBenchmark
         try
         {
             var sw = Stopwatch.StartNew();
-            for (int i = 0; i < 1000; i++)
-            {
-                readBench.DocumentDb_FindById();
-            }
+            for (var i = 0; i < 1000; i++) readBench.DocumentDb_FindById();
             sw.Stop();
             readByIdMs = sw.ElapsedMilliseconds;
             Log(logger, $"  CBDD FindById x1000: {readByIdMs} ms ({(double)readByIdMs / 1000:F3} ms/op)");
@@ -101,13 +97,10 @@ public class ManualBenchmark
         Log(logger, $"FindById x1000: {readByIdMs} ms");
         Log(logger, $"Single Insert: {singleInsertMs} ms");

-        var artifactsDir = Path.Combine(AppContext.BaseDirectory, "BenchmarkDotNet.Artifacts", "results");
-        if (!Directory.Exists(artifactsDir))
-        {
-            Directory.CreateDirectory(artifactsDir);
-        }
+        string artifactsDir = Path.Combine(AppContext.BaseDirectory, "BenchmarkDotNet.Artifacts", "results");
+        if (!Directory.Exists(artifactsDir)) Directory.CreateDirectory(artifactsDir);

-        var filePath = Path.Combine(artifactsDir, "manual_report.txt");
+        string filePath = Path.Combine(artifactsDir, "manual_report.txt");
         File.WriteAllText(filePath, _log.ToString());
         logger.LogInformation("Report saved to: {FilePath}", filePath);
     }
@@ -1,6 +1,6 @@
+using System.Text;
 using BenchmarkDotNet.Attributes;
 using BenchmarkDotNet.Configs;
-using BenchmarkDotNet.Jobs;
 using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Core.Collections;
 using ZB.MOM.WW.CBDD.Core.Compression;
@@ -16,6 +16,15 @@ namespace ZB.MOM.WW.CBDD.Tests.Benchmark;
 [JsonExporterAttribute.Full]
 public class MixedWorkloadBenchmarks
 {
+    private readonly List<ObjectId> _activeIds = [];
+    private DocumentCollection<Person> _collection = null!;
+
+    private string _dbPath = string.Empty;
+    private int _nextValueSeed;
+    private StorageEngine _storage = null!;
+    private BenchmarkTransactionHolder _transactionHolder = null!;
+    private string _walPath = string.Empty;
+
     /// <summary>
     /// Gets or sets whether periodic online compaction is enabled.
     /// </summary>
@@ -28,14 +37,6 @@ public class MixedWorkloadBenchmarks
     [Params(800)]
     public int Operations { get; set; }

-    private string _dbPath = string.Empty;
-    private string _walPath = string.Empty;
-    private StorageEngine _storage = null!;
-    private BenchmarkTransactionHolder _transactionHolder = null!;
-    private DocumentCollection<Person> _collection = null!;
-    private readonly List<ObjectId> _activeIds = [];
-    private int _nextValueSeed;

     /// <summary>
     /// Prepares benchmark storage and seed data for each iteration.
     /// </summary>
@@ -94,7 +95,7 @@ public class MixedWorkloadBenchmarks

         for (var i = 1; i <= Operations; i++)
         {
-            var mode = i % 5;
+            int mode = i % 5;
             if (mode is 0 or 1)
             {
                 var id = _collection.Insert(CreatePerson(_nextValueSeed++));
@@ -104,7 +105,7 @@ public class MixedWorkloadBenchmarks
             {
                 if (_activeIds.Count > 0)
                 {
-                    var idx = random.Next(_activeIds.Count);
+                    int idx = random.Next(_activeIds.Count);
                     var id = _activeIds[idx];
                     var current = _collection.FindById(id);
                     if (current != null)
@@ -119,20 +120,16 @@ public class MixedWorkloadBenchmarks
             {
                 if (_activeIds.Count > 100)
                 {
-                    var idx = random.Next(_activeIds.Count);
+                    int idx = random.Next(_activeIds.Count);
                     var id = _activeIds[idx];
                     _collection.Delete(id);
                     _activeIds.RemoveAt(idx);
                 }
             }

-            if (i % 50 == 0)
-            {
-                _transactionHolder.CommitAndReset();
-            }
+            if (i % 50 == 0) _transactionHolder.CommitAndReset();

             if (PeriodicCompaction && i % 200 == 0)
-            {
                 _storage.RunOnlineCompactionPass(new CompactionOptions
                 {
                     OnlineMode = true,
@@ -142,7 +139,6 @@ public class MixedWorkloadBenchmarks
                     EnableTailTruncation = true
                 });
             }
-        }

         _transactionHolder.CommitAndReset();
         return _collection.Count();
@@ -155,7 +151,7 @@ public class MixedWorkloadBenchmarks
             Id = ObjectId.NewObjectId(),
             FirstName = $"First_{seed}",
             LastName = $"Last_{seed}",
-            Age = 18 + (seed % 60),
+            Age = 18 + seed % 60,
             Bio = BuildPayload(seed),
             CreatedAt = DateTime.UnixEpoch.AddSeconds(seed),
             Balance = seed,
@@ -170,7 +166,7 @@ public class MixedWorkloadBenchmarks

     private static string BuildPayload(int seed)
     {
-        var builder = new System.Text.StringBuilder(1800);
+        var builder = new StringBuilder(1800);
         for (var i = 0; i < 64; i++)
         {
             builder.Append("mixed-");
@@ -1,4 +1,5 @@
 using System.IO.Compression;
+using System.Text;
 using System.Text.Json;
 using Microsoft.Extensions.Logging;
 using ZB.MOM.WW.CBDD.Bson;
@@ -20,15 +21,15 @@ internal static class PerformanceGateSmoke
     public static void Run(ILogger logger)
     {
         var compaction = RunCompactionProbe();
-        var compressionOff = RunCompressionGcProbe(enableCompression: false);
-        var compressionOn = RunCompressionGcProbe(enableCompression: true);
+        var compressionOff = RunCompressionGcProbe(false);
+        var compressionOn = RunCompressionGcProbe(true);

         var report = new PerformanceGateReport(
             DateTimeOffset.UtcNow,
             compaction,
             compressionOff,
             compressionOn);
-        var reportPath = WriteReport(report);
+        string reportPath = WriteReport(report);

         logger.LogInformation("Performance gate smoke report written to {ReportPath}", reportPath);

@@ -52,8 +53,8 @@ internal static class PerformanceGateSmoke

     private static CompactionProbeResult RunCompactionProbe()
     {
-        var dbPath = NewDbPath("gate_compaction");
-        var walPath = Path.ChangeExtension(dbPath, ".wal");
+        string dbPath = NewDbPath("gate_compaction");
+        string walPath = Path.ChangeExtension(dbPath, ".wal");

         try
         {
@@ -62,18 +63,12 @@ internal static class PerformanceGateSmoke
             var collection = new DocumentCollection<Person>(storage, transactionHolder, new PersonMapper());

             var ids = new List<ObjectId>(CompactionDocumentCount);
-            for (var i = 0; i < CompactionDocumentCount; i++)
-            {
-                ids.Add(collection.Insert(CreatePerson(i, includeLargeBio: true)));
-            }
+            for (var i = 0; i < CompactionDocumentCount; i++) ids.Add(collection.Insert(CreatePerson(i, true)));

             transactionHolder.CommitAndReset();
             storage.Checkpoint();

-            for (var i = 0; i < ids.Count; i += 3)
-            {
-                collection.Delete(ids[i]);
-            }
+            for (var i = 0; i < ids.Count; i += 3) collection.Delete(ids[i]);

             for (var i = 0; i < ids.Count; i += 5)
             {
@@ -117,8 +112,8 @@ internal static class PerformanceGateSmoke

     private static CompressionGcProbeResult RunCompressionGcProbe(bool enableCompression)
     {
-        var dbPath = NewDbPath(enableCompression ? "gate_gc_on" : "gate_gc_off");
-        var walPath = Path.ChangeExtension(dbPath, ".wal");
+        string dbPath = NewDbPath(enableCompression ? "gate_gc_on" : "gate_gc_off");
+        string walPath = Path.ChangeExtension(dbPath, ".wal");
         var compressionOptions = enableCompression
             ? new CompressionOptions
             {
@@ -140,16 +135,13 @@ internal static class PerformanceGateSmoke
         GC.WaitForPendingFinalizers();
         GC.Collect();

-        var g0Before = GC.CollectionCount(0);
-        var g1Before = GC.CollectionCount(1);
-        var g2Before = GC.CollectionCount(2);
-        var allocBefore = GC.GetTotalAllocatedBytes(true);
+        int g0Before = GC.CollectionCount(0);
+        int g1Before = GC.CollectionCount(1);
+        int g2Before = GC.CollectionCount(2);
+        long allocBefore = GC.GetTotalAllocatedBytes(true);

         var ids = new ObjectId[CompressionDocumentCount];
-        for (var i = 0; i < CompressionDocumentCount; i++)
-        {
-            ids[i] = collection.Insert(CreatePerson(i, includeLargeBio: true));
-        }
+        for (var i = 0; i < CompressionDocumentCount; i++) ids[i] = collection.Insert(CreatePerson(i, true));

         transactionHolder.CommitAndReset();

@@ -166,17 +158,17 @@ internal static class PerformanceGateSmoke

         transactionHolder.CommitAndReset();

-        var readCount = collection.FindAll().Count();
+        int readCount = collection.FindAll().Count();
         transactionHolder.CommitAndReset();

         GC.Collect();
         GC.WaitForPendingFinalizers();
         GC.Collect();

-        var g0After = GC.CollectionCount(0);
-        var g1After = GC.CollectionCount(1);
-        var g2After = GC.CollectionCount(2);
-        var allocAfter = GC.GetTotalAllocatedBytes(true);
+        int g0After = GC.CollectionCount(0);
+        int g1After = GC.CollectionCount(1);
+        int g2After = GC.CollectionCount(2);
+        long allocAfter = GC.GetTotalAllocatedBytes(true);

         return new CompressionGcProbeResult(
             enableCompression,
@@ -198,11 +190,11 @@ internal static class PerformanceGateSmoke

     private static string WriteReport(PerformanceGateReport report)
     {
-        var outputDirectory = Path.Combine(Directory.GetCurrentDirectory(), "BenchmarkDotNet.Artifacts", "results");
+        string outputDirectory = Path.Combine(Directory.GetCurrentDirectory(), "BenchmarkDotNet.Artifacts", "results");
         Directory.CreateDirectory(outputDirectory);

-        var reportPath = Path.Combine(outputDirectory, "PerformanceGateSmoke-report.json");
-        var json = JsonSerializer.Serialize(report, new JsonSerializerOptions { WriteIndented = true });
+        string reportPath = Path.Combine(outputDirectory, "PerformanceGateSmoke-report.json");
+        string json = JsonSerializer.Serialize(report, new JsonSerializerOptions { WriteIndented = true });
         File.WriteAllText(reportPath, json);
         return reportPath;
|
return reportPath;
|
||||||
}
|
}
|
||||||
@@ -214,7 +206,7 @@ internal static class PerformanceGateSmoke
|
|||||||
Id = ObjectId.NewObjectId(),
|
Id = ObjectId.NewObjectId(),
|
||||||
FirstName = $"First_{i}",
|
FirstName = $"First_{i}",
|
||||||
LastName = $"Last_{i}",
|
LastName = $"Last_{i}",
|
||||||
Age = 20 + (i % 50),
|
Age = 20 + i % 50,
|
||||||
Bio = includeLargeBio ? BuildBio(i) : $"bio-{i}",
|
Bio = includeLargeBio ? BuildBio(i) : $"bio-{i}",
|
||||||
CreatedAt = DateTime.UnixEpoch.AddMinutes(i),
|
CreatedAt = DateTime.UnixEpoch.AddMinutes(i),
|
||||||
Balance = 100 + i,
|
Balance = 100 + i,
|
||||||
@@ -239,7 +231,7 @@ internal static class PerformanceGateSmoke
|
|||||||
|
|
||||||
private static string BuildBio(int seed)
|
private static string BuildBio(int seed)
|
||||||
{
|
{
|
||||||
var builder = new System.Text.StringBuilder(4500);
|
var builder = new StringBuilder(4500);
|
||||||
for (var i = 0; i < 150; i++)
|
for (var i = 0; i < 150; i++)
|
||||||
{
|
{
|
||||||
builder.Append("bio-");
|
builder.Append("bio-");
|
||||||
@@ -253,14 +245,13 @@ internal static class PerformanceGateSmoke
|
|||||||
}
|
}
|
||||||
|
|
||||||
private static string NewDbPath(string prefix)
|
private static string NewDbPath(string prefix)
|
||||||
=> Path.Combine(Path.GetTempPath(), $"{prefix}_{Guid.NewGuid():N}.db");
|
{
|
||||||
|
return Path.Combine(Path.GetTempPath(), $"{prefix}_{Guid.NewGuid():N}.db");
|
||||||
|
}
|
||||||
|
|
||||||
private static void TryDelete(string path)
|
private static void TryDelete(string path)
|
||||||
{
|
{
|
||||||
if (File.Exists(path))
|
if (File.Exists(path)) File.Delete(path);
|
||||||
{
|
|
||||||
File.Delete(path);
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
private sealed record PerformanceGateReport(
|
private sealed record PerformanceGateReport(
|
||||||
@@ -21,7 +21,7 @@ public class ArchitectureFitnessTests
     [Fact]
     public void Solution_DependencyGraph_ShouldRemainAcyclic_AndFollowLayerDirection()
     {
-        var repoRoot = FindRepositoryRoot();
+        string repoRoot = FindRepositoryRoot();
         var projectGraph = LoadSolutionProjectGraph(repoRoot);

         // Explicit layer rules
@@ -30,14 +30,13 @@ public class ArchitectureFitnessTests
         projectGraph[CoreProject].ShouldBe(new[] { BsonProject });
         projectGraph[FacadeProject]
             .OrderBy(v => v, StringComparer.Ordinal)
-            .ShouldBe(new[] { BsonProject, CoreProject, SourceGeneratorsProject }.OrderBy(v => v, StringComparer.Ordinal));
+            .ShouldBe(new[] { BsonProject, CoreProject, SourceGeneratorsProject }.OrderBy(v => v,
+                StringComparer.Ordinal));

         // Source projects should not depend on tests.
         foreach (var kvp in projectGraph.Where(p => p.Key.StartsWith("src/", StringComparison.Ordinal)))
-        {
             kvp.Value.Any(dep => dep.StartsWith("tests/", StringComparison.Ordinal))
                 .ShouldBeFalse($"{kvp.Key} must not reference test projects.");
-        }

         HasCycle(projectGraph)
             .ShouldBeFalse("Project references must remain acyclic.");
@@ -51,7 +50,7 @@ public class ArchitectureFitnessTests
     {
         var lowLevelTypes = new[] { typeof(BsonSpanReader), typeof(BsonSpanWriter) };

-        var collectionOffenders = typeof(DocumentCollection<,>)
+        string[] collectionOffenders = typeof(DocumentCollection<,>)
             .GetMethods(BindingFlags.Public | BindingFlags.Instance | BindingFlags.Static | BindingFlags.DeclaredOnly)
             .Where(m => lowLevelTypes.Any(t => MethodUsesType(m, t)))
             .Select(m => m.Name)
@@ -61,7 +60,7 @@ public class ArchitectureFitnessTests

         collectionOffenders.ShouldBeEmpty();

-        var dbContextOffenders = typeof(DocumentDbContext)
+        string[] dbContextOffenders = typeof(DocumentDbContext)
             .GetMethods(BindingFlags.Public | BindingFlags.Instance | BindingFlags.Static | BindingFlags.DeclaredOnly)
             .Where(m => lowLevelTypes.Any(t => MethodUsesType(m, t)))
             .Select(m => m.Name)
@@ -84,22 +83,23 @@ public class ArchitectureFitnessTests
             typeof(BTreeIndex),
             typeof(CollectionIndexManager<,>),
             typeof(CollectionSecondaryIndex<,>),
-            typeof(VectorSearchIndex),
+            typeof(VectorSearchIndex)
         };

-        var fieldOffenders = targetTypes
+        string[] fieldOffenders = targetTypes
             .SelectMany(t => t.GetFields(BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public)
                 .Where(f => f.FieldType == typeof(StorageEngine))
                 .Select(f => $"{t.Name}.{f.Name}"))
             .OrderBy(v => v)
             .ToArray();

-        fieldOffenders.ShouldBeEmpty("Collection/index orchestration should hold IStorageEngine instead of concrete StorageEngine.");
+        fieldOffenders.ShouldBeEmpty(
+            "Collection/index orchestration should hold IStorageEngine instead of concrete StorageEngine.");
     }

     private static Dictionary<string, List<string>> LoadSolutionProjectGraph(string repoRoot)
     {
-        var solutionPath = Path.Combine(repoRoot, "CBDD.slnx");
+        string solutionPath = Path.Combine(repoRoot, "CBDD.slnx");
         var solutionDoc = XDocument.Load(solutionPath);

         var projects = solutionDoc
@@ -115,11 +115,11 @@ public class ArchitectureFitnessTests
             _ => new List<string>(),
             StringComparer.Ordinal);

-        foreach (var project in projects)
+        foreach (string project in projects)
         {
-            var projectFile = Path.Combine(repoRoot, project);
+            string projectFile = Path.Combine(repoRoot, project);
             var projectDoc = XDocument.Load(projectFile);
-            var projectDir = Path.GetDirectoryName(projectFile)!;
+            string projectDir = Path.GetDirectoryName(projectFile)!;

             var refs = projectDoc
                 .Descendants()
@@ -127,7 +127,8 @@ public class ArchitectureFitnessTests
                 .Select(e => e.Attribute("Include")?.Value)
                 .Where(v => !string.IsNullOrWhiteSpace(v))
                 .Select(v => v!.Replace('\\', '/'))
-                .Select(v => NormalizePath(Path.GetRelativePath(repoRoot, Path.GetFullPath(Path.Combine(projectDir, v)))))
+                .Select(v =>
+                    NormalizePath(Path.GetRelativePath(repoRoot, Path.GetFullPath(Path.Combine(projectDir, v)))))
                 .Where(projects.Contains)
                 .Distinct(StringComparer.Ordinal)
                 .OrderBy(v => v, StringComparer.Ordinal)
@@ -143,30 +144,20 @@ public class ArchitectureFitnessTests
    {
        var state = graph.Keys.ToDictionary(k => k, _ => 0, StringComparer.Ordinal);

-       foreach (var node in graph.Keys)
-       {
+       foreach (string node in graph.Keys)
            if (state[node] == 0 && Visit(node))
-           {
                return true;
-           }
-       }

        return false;

        bool Visit(string node)
        {
            state[node] = 1; // visiting
-           foreach (var dep in graph[node])
+           foreach (string dep in graph[node])
            {
-               if (state[dep] == 1)
-               {
-                   return true;
-               }
+               if (state[dep] == 1) return true;

-               if (state[dep] == 0 && Visit(dep))
-               {
-                   return true;
-               }
+               if (state[dep] == 0 && Visit(dep)) return true;
            }

            state[node] = 2; // visited
@@ -176,30 +167,19 @@ public class ArchitectureFitnessTests

    private static bool MethodUsesType(MethodInfo method, Type forbidden)
    {
-       if (TypeContains(method.ReturnType, forbidden))
-       {
-           return true;
-       }
+       if (TypeContains(method.ReturnType, forbidden)) return true;

        return method.GetParameters().Any(p => TypeContains(p.ParameterType, forbidden));
    }

    private static bool TypeContains(Type inspected, Type forbidden)
    {
-       if (inspected == forbidden)
-       {
-           return true;
-       }
+       if (inspected == forbidden) return true;

-       if (inspected.HasElementType && inspected.GetElementType() is { } elementType && TypeContains(elementType, forbidden))
-       {
-           return true;
-       }
+       if (inspected.HasElementType && inspected.GetElementType() is { } elementType &&
+           TypeContains(elementType, forbidden)) return true;

-       if (!inspected.IsGenericType)
-       {
-           return false;
-       }
+       if (!inspected.IsGenericType) return false;

        return inspected.GetGenericArguments().Any(t => TypeContains(t, forbidden));
    }
@@ -209,11 +189,8 @@ public class ArchitectureFitnessTests
        var current = new DirectoryInfo(AppContext.BaseDirectory);
        while (current != null)
        {
-           var solutionPath = Path.Combine(current.FullName, "CBDD.slnx");
-           if (File.Exists(solutionPath))
-           {
-               return current.FullName;
-           }
+           string solutionPath = Path.Combine(current.FullName, "CBDD.slnx");
+           if (File.Exists(solutionPath)) return current.FullName;

            current = current.Parent;
        }
@@ -222,5 +199,7 @@ public class ArchitectureFitnessTests
    }

    private static string NormalizePath(string path)
-       => path.Replace('\\', '/');
+   {
+       return path.Replace('\\', '/');
+   }
}
@@ -32,10 +32,10 @@ public class BsonDocumentAndBufferWriterTests

        var wrapped = new BsonDocument(doc.RawData.ToArray(), reverseMap);

-       wrapped.TryGetString("name", out var name).ShouldBeTrue();
+       wrapped.TryGetString("name", out string? name).ShouldBeTrue();
        name.ShouldBe("Alice");

-       wrapped.TryGetInt32("age", out var age).ShouldBeTrue();
+       wrapped.TryGetInt32("age", out int age).ShouldBeTrue();
        age.ShouldBe(32);

        wrapped.TryGetObjectId("_id", out var id).ShouldBeTrue();
@@ -86,16 +86,13 @@ public class BsonDocumentAndBufferWriterTests
        }

        var builder = new BsonDocumentBuilder(keyMap);
-       for (int i = 1; i <= 180; i++)
-       {
-           builder.AddInt32($"k{i}", i);
-       }
+       for (var i = 1; i <= 180; i++) builder.AddInt32($"k{i}", i);

        var doc = builder.Build();
        doc.Size.ShouldBeGreaterThan(1024);

        var wrapped = new BsonDocument(doc.RawData.ToArray(), reverseMap);
-       wrapped.TryGetInt32("k180", out var value).ShouldBeTrue();
+       wrapped.TryGetInt32("k180", out int value).ShouldBeTrue();
        value.ShouldBe(180);
    }

@@ -125,7 +122,7 @@ public class BsonDocumentAndBufferWriterTests
        writer.EndDocument(rootSizePos);
        int rootEnd = writer.Position;

-       var bytes = output.WrittenSpan.ToArray();
+       byte[] bytes = output.WrittenSpan.ToArray();
        PatchDocumentSize(bytes, childSizePos, childEnd);
        PatchDocumentSize(bytes, arraySizePos, arrayEnd);
        PatchDocumentSize(bytes, rootSizePos, rootEnd);
@@ -172,10 +169,10 @@ public class BsonDocumentAndBufferWriterTests
        var singleByteReader = new BsonSpanReader(new byte[] { 0x2A }, new ConcurrentDictionary<ushort, string>());
        singleByteReader.ReadByte().ShouldBe((byte)0x2A);

-       var cstring = Encoding.UTF8.GetBytes("hello\0");
+       byte[] cstring = Encoding.UTF8.GetBytes("hello\0");
        var cstringReader = new BsonSpanReader(cstring, new ConcurrentDictionary<ushort, string>());
        var destination = new char[16];
-       var written = cstringReader.ReadCString(destination);
+       int written = cstringReader.ReadCString(destination);

        new string(destination, 0, written).ShouldBe("hello");
    }
@@ -1,14 +1,83 @@
 using ZB.MOM.WW.CBDD.Bson;
 using ZB.MOM.WW.CBDD.Core.Collections;
-using Xunit;
-using System.Collections.Generic;
-using System;
-using System.Linq;

 namespace ZB.MOM.WW.CBDD.Tests;

 public class BsonSchemaTests
 {
+    /// <summary>
+    /// Verifies schema generation for a simple entity.
+    /// </summary>
+    [Fact]
+    public void GenerateSchema_SimpleEntity()
+    {
+        var schema = BsonSchemaGenerator.FromType<SimpleEntity>();
+
+        schema.Title.ShouldBe("SimpleEntity");
+        schema.Fields.Count.ShouldBe(4);
+
+        var idField = schema.Fields.First(f => f.Name == "_id");
+        idField.Type.ShouldBe(BsonType.ObjectId);
+
+        var nameField = schema.Fields.First(f => f.Name == "name");
+        nameField.Type.ShouldBe(BsonType.String);
+
+        var ageField = schema.Fields.First(f => f.Name == "age");
+        ageField.Type.ShouldBe(BsonType.Int32);
+    }
+
+    /// <summary>
+    /// Verifies schema generation for collection fields.
+    /// </summary>
+    [Fact]
+    public void GenerateSchema_Collections()
+    {
+        var schema = BsonSchemaGenerator.FromType<CollectionEntity>();
+
+        var tags = schema.Fields.First(f => f.Name == "tags");
+        tags.Type.ShouldBe(BsonType.Array);
+        tags.ArrayItemType.ShouldBe(BsonType.String);
+
+        var scores = schema.Fields.First(f => f.Name == "scores");
+        scores.Type.ShouldBe(BsonType.Array);
+        scores.ArrayItemType.ShouldBe(BsonType.Int32);
+    }
+
+    /// <summary>
+    /// Verifies schema generation for nested document fields.
+    /// </summary>
+    [Fact]
+    public void GenerateSchema_Nested()
+    {
+        var schema = BsonSchemaGenerator.FromType<NestedEntity>();
+
+        var parent = schema.Fields.First(f => f.Name == "parent");
+        parent.Type.ShouldBe(BsonType.Document);
+        parent.NestedSchema.ShouldNotBeNull();
+        parent.NestedSchema.Fields.ShouldContain(f => f.Name == "_id");
+    }
+
+    /// <summary>
+    /// Verifies schema generation for collections of complex types.
+    /// </summary>
+    [Fact]
+    public void GenerateSchema_ComplexCollection()
+    {
+        var schema = BsonSchemaGenerator.FromType<ComplexCollectionEntity>();
+
+        var items = schema.Fields.First(f => f.Name == "items");
+        items.Type.ShouldBe(BsonType.Array);
+        // items.ArrayItemType.ShouldBe(BsonType.Document); // Wait, my generator logic might return Array here? No, item type logic...
+
+        // Let's verify generator logic for complex array item type
+        // In generator: (BsonType.Array, itemNested, itemBsonType)
+        // itemBsonType for SimpleEntity should be Document
+
+        items.ArrayItemType.ShouldBe(BsonType.Document);
+        items.NestedSchema.ShouldNotBeNull();
+        items.NestedSchema.Fields.ShouldContain(f => f.Name == "_id");
+    }
+
     public class SimpleEntity
     {
         /// <summary>
@@ -32,27 +101,6 @@ public class BsonSchemaTests
         public bool IsActive { get; set; }
     }

-    /// <summary>
-    /// Verifies schema generation for a simple entity.
-    /// </summary>
-    [Fact]
-    public void GenerateSchema_SimpleEntity()
-    {
-        var schema = BsonSchemaGenerator.FromType<SimpleEntity>();
-
-        schema.Title.ShouldBe("SimpleEntity");
-        schema.Fields.Count.ShouldBe(4);
-
-        var idField = schema.Fields.First(f => f.Name == "_id");
-        idField.Type.ShouldBe(BsonType.ObjectId);
-
-        var nameField = schema.Fields.First(f => f.Name == "name");
-        nameField.Type.ShouldBe(BsonType.String);
-
-        var ageField = schema.Fields.First(f => f.Name == "age");
-        ageField.Type.ShouldBe(BsonType.Int32);
-    }
-
     public class CollectionEntity
     {
         /// <summary>
@@ -66,23 +114,6 @@ public class BsonSchemaTests
         public int[] Scores { get; set; } = Array.Empty<int>();
     }

-    /// <summary>
-    /// Verifies schema generation for collection fields.
-    /// </summary>
-    [Fact]
-    public void GenerateSchema_Collections()
-    {
-        var schema = BsonSchemaGenerator.FromType<CollectionEntity>();
-
-        var tags = schema.Fields.First(f => f.Name == "tags");
-        tags.Type.ShouldBe(BsonType.Array);
-        tags.ArrayItemType.ShouldBe(BsonType.String);
-
-        var scores = schema.Fields.First(f => f.Name == "scores");
-        scores.Type.ShouldBe(BsonType.Array);
-        scores.ArrayItemType.ShouldBe(BsonType.Int32);
-    }
-
     public class NestedEntity
     {
         /// <summary>
@@ -91,20 +122,6 @@ public class BsonSchemaTests
         public SimpleEntity Parent { get; set; } = new();
     }

-    /// <summary>
-    /// Verifies schema generation for nested document fields.
-    /// </summary>
-    [Fact]
-    public void GenerateSchema_Nested()
-    {
-        var schema = BsonSchemaGenerator.FromType<NestedEntity>();
-
-        var parent = schema.Fields.First(f => f.Name == "parent");
-        parent.Type.ShouldBe(BsonType.Document);
-        parent.NestedSchema.ShouldNotBeNull();
-        parent.NestedSchema.Fields.ShouldContain(f => f.Name == "_id");
-    }
-
     public class ComplexCollectionEntity
     {
         /// <summary>
@@ -112,25 +129,4 @@ public class BsonSchemaTests
         /// </summary>
         public List<SimpleEntity> Items { get; set; } = new();
     }
-
-    /// <summary>
-    /// Verifies schema generation for collections of complex types.
-    /// </summary>
-    [Fact]
-    public void GenerateSchema_ComplexCollection()
-    {
-        var schema = BsonSchemaGenerator.FromType<ComplexCollectionEntity>();
-
-        var items = schema.Fields.First(f => f.Name == "items");
-        items.Type.ShouldBe(BsonType.Array);
-        // items.ArrayItemType.ShouldBe(BsonType.Document); // Wait, my generator logic might return Array here? No, item type logic...
-
-        // Let's verify generator logic for complex array item type
-        // In generator: (BsonType.Array, itemNested, itemBsonType)
-        // itemBsonType for SimpleEntity should be Document
-
-        items.ArrayItemType.ShouldBe(BsonType.Document);
-        items.NestedSchema.ShouldNotBeNull();
-        items.NestedSchema.Fields.ShouldContain(f => f.Name == "_id");
-    }
 }
@@ -1,6 +1,5 @@
-using ZB.MOM.WW.CBDD.Bson;
-using Xunit;
 using System.Collections.Concurrent;
+using ZB.MOM.WW.CBDD.Bson;

 namespace ZB.MOM.WW.CBDD.Tests;

@@ -15,8 +14,12 @@ public class BsonSpanReaderWriterTests
     public BsonSpanReaderWriterTests()
     {
         ushort id = 1;
-        string[] initialKeys = ["name", "age", "active", "_id", "val", "dec", "timestamp", "int32", "int64", "double", "data", "child", "value", "0", "1"];
-        foreach (var key in initialKeys)
+        string[] initialKeys =
+        [
+            "name", "age", "active", "_id", "val", "dec", "timestamp", "int32", "int64", "double", "data", "child",
+            "value", "0", "1"
+        ];
+        foreach (string key in initialKeys)
         {
             _keyMap[key] = id;
             _keys[id] = key;
@@ -33,7 +36,7 @@ public class BsonSpanReaderWriterTests
        Span<byte> buffer = stackalloc byte[256];
        var writer = new BsonSpanWriter(buffer, _keyMap);

-       var sizePos = writer.BeginDocument();
+       int sizePos = writer.BeginDocument();
        writer.WriteString("name", "John");
        writer.WriteInt32("age", 30);
        writer.WriteBoolean("active", true);
@@ -42,29 +45,29 @@ public class BsonSpanReaderWriterTests
        var documentBytes = buffer[..writer.Position];

        var reader = new BsonSpanReader(documentBytes, _keys);
-       var size = reader.ReadDocumentSize();
+       int size = reader.ReadDocumentSize();

        size.ShouldBe(writer.Position);

        var type1 = reader.ReadBsonType();
-       var name1 = reader.ReadElementHeader();
-       var value1 = reader.ReadString();
+       string name1 = reader.ReadElementHeader();
+       string value1 = reader.ReadString();

        type1.ShouldBe(BsonType.String);
        name1.ShouldBe("name");
        value1.ShouldBe("John");

        var type2 = reader.ReadBsonType();
-       var name2 = reader.ReadElementHeader();
-       var value2 = reader.ReadInt32();
+       string name2 = reader.ReadElementHeader();
+       int value2 = reader.ReadInt32();

        type2.ShouldBe(BsonType.Int32);
        name2.ShouldBe("age");
        value2.ShouldBe(30);

        var type3 = reader.ReadBsonType();
-       var name3 = reader.ReadElementHeader();
-       var value3 = reader.ReadBoolean();
+       string name3 = reader.ReadElementHeader();
+       bool value3 = reader.ReadBoolean();

        type3.ShouldBe(BsonType.Boolean);
        name3.ShouldBe("active");
@@ -82,7 +85,7 @@ public class BsonSpanReaderWriterTests

        var oid = ObjectId.NewObjectId();

-       var sizePos = writer.BeginDocument();
+       int sizePos = writer.BeginDocument();
        writer.WriteObjectId("_id", oid);
        writer.EndDocument(sizePos);

@@ -91,7 +94,7 @@ public class BsonSpanReaderWriterTests

        reader.ReadDocumentSize();
        var type = reader.ReadBsonType();
-       var name = reader.ReadElementHeader();
+       string name = reader.ReadElementHeader();
        var readOid = reader.ReadObjectId();

        type.ShouldBe(BsonType.ObjectId);
@@ -112,8 +115,8 @@ public class BsonSpanReaderWriterTests

        var reader = new BsonSpanReader(buffer, _keys);
        var type = reader.ReadBsonType();
-       var name = reader.ReadElementHeader();
-       var val = reader.ReadDouble();
+       string name = reader.ReadElementHeader();
+       double val = reader.ReadDouble();

        type.ShouldBe(BsonType.Double);
        name.ShouldBe("val");
@@ -129,13 +132,13 @@ public class BsonSpanReaderWriterTests
        var buffer = new byte[256];
        var writer = new BsonSpanWriter(buffer, _keyMap);

-       decimal original = 123456.789m;
+       var original = 123456.789m;
        writer.WriteDecimal128("dec", original);

        var reader = new BsonSpanReader(buffer, _keys);
        var type = reader.ReadBsonType();
-       var name = reader.ReadElementHeader();
-       var val = reader.ReadDecimal128();
+       string name = reader.ReadElementHeader();
+       decimal val = reader.ReadDecimal128();

        type.ShouldBe(BsonType.Decimal128);
        name.ShouldBe("dec");
@@ -156,7 +159,7 @@ public class BsonSpanReaderWriterTests
        var expectedTime = new DateTime(now.Year, now.Month, now.Day,
            now.Hour, now.Minute, now.Second, now.Millisecond, DateTimeKind.Utc);

-       var sizePos = writer.BeginDocument();
+       int sizePos = writer.BeginDocument();
        writer.WriteDateTime("timestamp", expectedTime);
        writer.EndDocument(sizePos);

@@ -165,7 +168,7 @@ public class BsonSpanReaderWriterTests

        reader.ReadDocumentSize();
        var type = reader.ReadBsonType();
-       var name = reader.ReadElementHeader();
+       string name = reader.ReadElementHeader();
        var readTime = reader.ReadDateTime();

        type.ShouldBe(BsonType.DateTime);
@@ -182,7 +185,7 @@ public class BsonSpanReaderWriterTests
        Span<byte> buffer = stackalloc byte[256];
        var writer = new BsonSpanWriter(buffer, _keyMap);

-       var sizePos = writer.BeginDocument();
+       int sizePos = writer.BeginDocument();
        writer.WriteInt32("int32", int.MaxValue);
        writer.WriteInt64("int64", long.MaxValue);
        writer.WriteDouble("double", 3.14159);
@@ -217,7 +220,7 @@ public class BsonSpanReaderWriterTests

        byte[] testData = [1, 2, 3, 4, 5];

-       var sizePos = writer.BeginDocument();
+       int sizePos = writer.BeginDocument();
        writer.WriteBinary("data", testData);
        writer.EndDocument(sizePos);

@@ -226,8 +229,8 @@ public class BsonSpanReaderWriterTests

        reader.ReadDocumentSize();
        var type = reader.ReadBsonType();
-       var name1 = reader.ReadElementHeader();
|
string name = reader.ReadElementHeader();
|
||||||
var readData = reader.ReadBinary(out var subtype);
|
var readData = reader.ReadBinary(out byte subtype);
|
||||||
|
|
||||||
type.ShouldBe(BsonType.Binary);
|
type.ShouldBe(BsonType.Binary);
|
||||||
name.ShouldBe("data");
|
name.ShouldBe("data");
|
||||||
@@ -244,10 +247,10 @@ public class BsonSpanReaderWriterTests
|
|||||||
Span<byte> buffer = stackalloc byte[512];
|
Span<byte> buffer = stackalloc byte[512];
|
||||||
var writer = new BsonSpanWriter(buffer, _keyMap);
|
var writer = new BsonSpanWriter(buffer, _keyMap);
|
||||||
|
|
||||||
var rootSizePos = writer.BeginDocument();
|
int rootSizePos = writer.BeginDocument();
|
||||||
writer.WriteString("name", "Parent");
|
writer.WriteString("name", "Parent");
|
||||||
|
|
||||||
var childSizePos = writer.BeginDocument("child");
|
int childSizePos = writer.BeginDocument("child");
|
||||||
writer.WriteString("name", "Child");
|
writer.WriteString("name", "Child");
|
||||||
writer.WriteInt32("value", 42);
|
writer.WriteInt32("value", 42);
|
||||||
writer.EndDocument(childSizePos);
|
writer.EndDocument(childSizePos);
|
||||||
@@ -256,7 +259,7 @@ public class BsonSpanReaderWriterTests
|
|||||||
|
|
||||||
var documentBytes = buffer[..writer.Position];
|
var documentBytes = buffer[..writer.Position];
|
||||||
var reader = new BsonSpanReader(documentBytes, _keys);
|
var reader = new BsonSpanReader(documentBytes, _keys);
|
||||||
var rootSize = reader.ReadDocumentSize();
|
int rootSize = reader.ReadDocumentSize();
|
||||||
|
|
||||||
rootSize.ShouldBe(writer.Position);
|
rootSize.ShouldBe(writer.Position);
|
||||||
|
|
||||||