Compare commits

..

10 Commits

Author SHA1 Message Date
Denis Angell
b418b416b0 add claude 2026-02-06 19:31:07 +01:00
Ayaz Salikhov
2305bc98a4 chore: Remove CODEOWNERS (#6337) 2026-02-06 11:39:23 -05:00
Bart
677758b1cc perf: Remove unnecessary caches (#5439)
This change removes the cache in `DatabaseNodeImp` and simplifies the caching logic in `SHAMapStoreImp`. Since NuDB and RocksDB already maintain internal caches of their own, the additional caches in the code add little value and may even be unnecessary, as preliminary performance analyses also confirmed.
2026-02-06 09:42:35 -05:00
Bart
25d7c2c4ec chore: Restore unity builds (#6328)
In certain cases, such as when headers used by many compilation units are modified, a unity build is slower than a regular build with `ccache` enabled. However, a unity build has the benefit that it can detect issues such as macro redefinitions within the group of files that are compiled together as a unit. This change therefore restores the ability to perform unity builds. Instead of running every configuration with and without unity enabled, though, it is now enabled for only a single configuration to keep computational use low.

As part of restoring the code, it became clear that two configurations currently have coverage enabled, because the check does not specifically target Debian Bookworm and therefore also matches Debian Trixie. This change fixes that as well.
2026-02-06 14:12:45 +00:00
Bart
0a626d95f4 refactor: Update secp256k1 to 0.7.1 (#6331)
The latest secp256k1 release, 0.7.1, contains bug fixes that we may benefit from; see https://github.com/bitcoin-core/secp256k1/blob/master/CHANGELOG.md.
2026-02-05 16:45:57 +00:00
Niq Dudfield
6006c281e2 fix: Increment sequence when accepting new manifests (#6059)
The `ManifestCache::applyManifest` function was returning early without incrementing `seq_`. `OverlayImpl` uses this sequence to identify/invalidate a cached `TMManifests` message, which is exchanged with peers on connection. Depending on network size, startup sequencing, and topology, this can cause syncing issues. This change therefore increments `seq_` when a new manifest is accepted.
2026-02-05 10:40:27 -05:00
Vito Tumas
e79673cf40 fix typo in LendingHelpers unit-test (#6215) 2026-02-05 10:23:44 +00:00
Ayaz Salikhov
7f41012e59 chore: Update secp256k1 and openssl (#6327) 2026-02-04 18:27:10 +00:00
Bart
b449a6ee84 chore: Remove unnecessary script (#6326) 2026-02-04 11:30:16 -05:00
Bart
34ef577604 refactor: Replace include guards by '#pragma once' (#6322)
This change replaces all include guards in the `src/` and `include/` directories by `#pragma once`.
2026-02-04 09:50:21 -05:00
701 changed files with 1586 additions and 3354 deletions

54
.claude/CLAUDE.md Normal file

@@ -0,0 +1,54 @@
# Workflow Orchestration
## 1. Plan Mode Default
- Enter plan mode for ANY non-trivial task (3+ steps or architectural decisions)
- If something goes sideways, STOP and re-plan immediately - don't keep pushing
- Use plan mode for verification steps, not just building
- Write detailed specs upfront to reduce ambiguity
## 2. Subagent Strategy
- Use subagents liberally to keep main context window clean
- Offload research, exploration, and parallel analysis to subagents
- For complex problems, throw compute at it via parallelizing
- One task per subagent for focused execution
## 3. Self-Improvement Loop
- After ANY correction from the user: update `tasks/lessons.md` with the pattern
- Write rules for yourself that prevent the same mistake
- Ruthlessly iterate on these rules until the mistake rate drops
- Review lessons at session start for the relevant project
## 4. Verification Before Done
- Never mark a task complete without proving it works
- Diff behavior between main and your changes when relevant
- Ask yourself: "Would a staff engineer approve this?"
- Run tests, check logs, demonstrate correctness
## 5. Demand Elegance (Balanced)
- For non-trivial changes: pause and ask "is there a more elegant way?"
- If a fix feels hacky: "Know now, do later, implement the elegant solution"
- Skip this for simple, obvious fixes - don't over-engineer
- Challenge your own first solution
## 6. Autonomous Bug Fixing
- When given a bug report: just fix it. Don't ask for hand-holding
- Point at logs, errors, failing tests - then resolve them
- Zero context switching required from the user
- Go fix failing CI tests without being told how
# Task Management
1. **Plan First**: Write plan to `tasks/todo.md` with checkable items
2. **Verify Plans**: Check in before starting implementation
3. **Track Progress**: Mark items complete as you go
4. **Explain Changes**: High-level summary at each step
5. **Document Results**: Add review section to `tasks/todo.md`
6. **Capture Lessons**: Update `tasks/lessons.md` after corrections
# Core Principles
**Simplicity First**: Make every change as simple as possible. Impact minimal code.
**No Boilerplate**: Skip documentation. No README/docs bloat. Test only if requested.
**Minimal Impact**: Changes should only touch what's necessary. Avoid introducing bugs.

530
.claude/skills/amendment.md Normal file

@@ -0,0 +1,530 @@
# XRPL Amendment Creation Skill
This skill guides you through creating XRPL amendments, whether for brand new features or fixes/extensions to existing functionality.
## Amendment Types
There are two main types of amendments:
1. **New Feature Amendment** (`feature{Name}`) - Entirely new functionality
2. **Fix/Extension Amendment** (`fix{Name}`) - Modifications to existing functionality
## Step 1: Determine Amendment Type
Ask the user:
- Is this a **brand new feature** (new transaction type, ledger entry, or capability)?
- Or is this a **fix or extension** to existing functionality?
---
## For NEW FEATURE Amendments
### Checklist:
#### 1. Feature Definition in features.macro
**ONLY FILE TO EDIT:** `include/xrpl/protocol/detail/features.macro`
- [ ] Add to TOP of features.macro (reverse chronological order):
```
XRPL_FEATURE(YourFeatureName, Supported::no, VoteBehavior::DefaultNo)
```
- [ ] This creates the variable `featureYourFeatureName` automatically
- [ ] Follow naming convention: Use the feature name WITHOUT the "feature" prefix
- [ ] Examples: `Batch` → `featureBatch`, `LendingProtocol` → `featureLendingProtocol`
#### 2. Support Status
- [ ] Start with `Supported::no` during development
- [ ] Change to `Supported::yes` when ready for network voting
- [ ] Use `VoteBehavior::DefaultNo` (validators must explicitly vote for it)
#### 3. Code Implementation
- [ ] Implement new functionality (transaction type, ledger entry, etc.)
- [ ] Add feature gate check in preflight:
```cpp
if (!env.current()->rules().enabled(feature{Name}))
{
return temDISABLED;
}
```
#### 4. Disable Route Handling
- [ ] Ensure transaction returns `temDISABLED` when amendment is disabled
- [ ] Implement early rejection in the preflight/preclaim phase (see the sketch below)
- [ ] Add appropriate error messages
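A minimal sketch of this early rejection inside a transactor, assuming rippled's standard `preflight(PreflightContext const& ctx)` signature; the class and feature names are placeholders, as elsewhere in this document:
```cpp
// Reject the new transaction type outright while the amendment is disabled.
NotTEC
{Name}Transactor::preflight(PreflightContext const& ctx)
{
    if (!ctx.rules.enabled(feature{Name}))
        return temDISABLED;

    // ... remaining preflight validation for the new transaction type ...
    return preflight2(ctx);
}
```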
#### 5. Test Implementation
Create comprehensive test suite with this structure:
```cpp
class {FeatureName}_test : public beast::unit_test::suite
{
public:
void testEnable(FeatureBitset features)
{
testcase("enabled");
// Test with feature DISABLED
{
auto const amendNoFeature = features - feature{Name};
Env env{*this, amendNoFeature};
env(transaction, ter(temDISABLED));
}
// Test with feature ENABLED
{
Env env{*this, features};
env(transaction, ter(tesSUCCESS));
// Validate new functionality works
}
}
void testPreflight(FeatureBitset features)
{
testcase("preflight");
// Test malformed transaction validation
}
void testPreclaim(FeatureBitset features)
{
testcase("preclaim");
// Test signature and claim phase validation
}
void testWithFeats(FeatureBitset features)
{
testEnable(features);
testPreflight(features);
testPreclaim(features);
// Add feature-specific tests
}
void run() override
{
using namespace test::jtx;
auto const all = supported_amendments();
testWithFeats(all);
}
};
```
#### 6. Test Coverage Requirements
- [ ] Test amendment DISABLED state (returns `temDISABLED`)
- [ ] Test amendment ENABLED state (returns `tesSUCCESS`)
- [ ] Test malformed transactions
- [ ] Test signature validation
- [ ] Test edge cases specific to feature
- [ ] Test amendment transition behavior
#### 7. Documentation
- [ ] Create specification document (XLS_{NAME}.md)
- [ ] Document new transaction types, ledger entries, or capabilities
- [ ] Create test plan document
- [ ] Document expected behavior when enabled/disabled
---
## For FIX/EXTENSION Amendments
### Checklist:
#### 1. Fix Definition in features.macro
**ONLY FILE TO EDIT:** `include/xrpl/protocol/detail/features.macro`
- [ ] Add to TOP of features.macro (reverse chronological order):
```
XRPL_FIX(YourFixName, Supported::no, VoteBehavior::DefaultNo)
```
- [ ] This creates the variable `fixYourFixName` automatically
- [ ] Follow naming convention: Use the fix name WITHOUT the "fix" prefix (it's added automatically)
- [ ] Examples: `TokenEscrowV1` → `fixTokenEscrowV1`, `DirectoryLimit` → `fixDirectoryLimit`
- [ ] Start with `Supported::no` during development, change to `Supported::yes` when ready
#### 2. Backward Compatibility Implementation
**Critical**: Use enable() with if/else to preserve existing functionality
```cpp
// Check if fix is enabled
bool const fixEnabled = env.current()->rules().enabled(fix{Name});
// Conditional logic based on amendment state
if (fixEnabled)
{
// NEW behavior with fix applied
// This is the corrected/improved logic
}
else
{
// OLD behavior (legacy path)
// Preserve original functionality for backward compatibility
}
```
**Alternative pattern with ternary operator:**
```cpp
auto& viewToUse = sb.rules().enabled(fix{Name}) ? sb : legacyView;
```
#### 3. Multiple Fix Versions Pattern
For iterative fixes, use version checking:
```cpp
bool const fixV1 = rv.rules().enabled(fixXahauV1);
bool const fixV2 = rv.rules().enabled(fixXahauV2);
switch (transactionType)
{
case TYPE_1:
if (fixV1) {
// Behavior with V1 fix
} else {
// Legacy behavior
}
break;
case TYPE_2:
if (fixV2) {
// Behavior with V2 fix
} else if (fixV1) {
// Behavior with only V1
} else {
// Legacy behavior
}
break;
}
```
#### 4. Test Both Paths
Always test BOTH enabled and disabled states:
```cpp
void testFix(FeatureBitset features)
{
testcase("fix behavior");
for (bool withFix : {false, true})
{
auto const amend = withFix ? features : features - fix{Name};
Env env{*this, amend};
// Setup test scenario
env.fund(XRP(1000), alice);
env.close();
if (!withFix)
{
// Test OLD behavior (before fix)
env(operation, ter(expectedErrorWithoutFix));
// Verify old behavior is preserved
}
else
{
// Test NEW behavior (after fix)
env(operation, ter(expectedErrorWithFix));
// Verify fix works correctly
}
}
}
```
#### 5. Security Fix Pattern
For security-critical fixes (like fixBatchInnerSigs):
```cpp
// Test vulnerability exists WITHOUT fix
{
auto const amendNoFix = features - fix{Name};
Env env{*this, amendNoFix};
// Demonstrate vulnerability
// Expected: Validity::Valid (WRONG - vulnerable!)
BEAST_EXPECT(result == Validity::Valid);
}
// Test vulnerability is FIXED WITH amendment
{
Env env{*this, features};
// Demonstrate fix
// Expected: Validity::SigBad (CORRECT - protected!)
BEAST_EXPECT(result == Validity::SigBad);
}
```
#### 6. Test Coverage Requirements
- [ ] Test fix DISABLED (legacy behavior preserved)
- [ ] Test fix ENABLED (new behavior applied)
- [ ] Test amendment transition
- [ ] For security fixes: demonstrate vulnerability without fix
- [ ] For security fixes: demonstrate protection with fix
- [ ] Test edge cases that triggered the fix
- [ ] Test combinations with other amendments
#### 7. Documentation
- [ ] Document what was broken/suboptimal
- [ ] Document the fix applied
- [ ] Document backward compatibility behavior
- [ ] Create test summary showing both paths
---
## Best Practices for All Amendments
### 1. Naming Conventions
- New features: `feature{DescriptiveName}` (e.g., `featureBatch`, `featureHooks`)
- Fixes: `fix{IssueDescription}` (e.g., `fixBatchInnerSigs`, `fixNSDelete`)
- Use CamelCase without underscores
### 2. Feature Flag Checking
```cpp
// At the point where behavior diverges:
bool const amendmentEnabled = env.current()->rules().enabled(feature{Name});
// Or in view/rules context:
if (!rv.rules().enabled(feature{Name}))
return {}; // or legacy behavior
```
### 3. Error Codes
- New features when disabled: `temDISABLED`
- Fixes may return different validation errors based on state
- Document all error code changes
### 4. Test Structure Template
```cpp
class Amendment_test : public beast::unit_test::suite
{
public:
// Core tests
void testEnable(FeatureBitset features); // Enable/disable states
void testPreflight(FeatureBitset features); // Validation
void testPreclaim(FeatureBitset features); // Claim phase
// Feature-specific tests
void test{SpecificScenario1}(FeatureBitset features);
void test{SpecificScenario2}(FeatureBitset features);
// Master orchestrator
void testWithFeats(FeatureBitset features)
{
testEnable(features);
testPreflight(features);
testPreclaim(features);
test{SpecificScenario1}(features);
test{SpecificScenario2}(features);
}
void run() override
{
using namespace test::jtx;
auto const all = supported_amendments();
testWithFeats(all);
}
};
BEAST_DEFINE_TESTSUITE(Amendment, app, ripple);
```
### 5. Documentation Files
Create these files:
- **Specification**: `XLS_{FEATURE_NAME}.md` - Technical specification
- **Test Plan**: `{FEATURE}_COMPREHENSIVE_TEST_PLAN.md` - Test strategy
- **Test Summary**: `{FEATURE}_TEST_IMPLEMENTATION_SUMMARY.md` - Test results
- **Review Findings**: `{FEATURE}_REVIEW_FINDINGS.md` (if applicable)
### 6. Amendment Transition Testing
Test the moment an amendment activates:
```cpp
void testAmendmentTransition(FeatureBitset features)
{
testcase("amendment transition");
// Start with amendment disabled
auto const amendNoFeature = features - feature{Name};
Env env{*this, amendNoFeature};
// Perform operations in disabled state
env(operation1, ter(temDISABLED));
// Enable amendment mid-test (if testing mechanism supports it)
// Verify state transitions correctly
// Perform operations in enabled state
env(operation2, ter(tesSUCCESS));
}
```
### 7. Cache Behavior
If amendment affects caching (like fixBatchInnerSigs):
- [ ] Test cache behavior without fix
- [ ] Test cache behavior with fix
- [ ] Document cache invalidation requirements
### 8. Multi-Amendment Combinations
Test interactions with other amendments:
```cpp
void testMultipleAmendments(FeatureBitset features)
{
// Test all combinations
for (bool withFeature1 : {false, true})
for (bool withFeature2 : {false, true})
{
auto amend = features;
if (!withFeature1) amend -= feature1;
if (!withFeature2) amend -= feature2;
Env env{*this, amend};
// Test interaction behavior
}
}
```
### 9. Performance Considerations
- [ ] Minimize runtime checks (cache the `rules().enabled()` result if it is used multiple times; see the sketch below)
- [ ] Avoid nested feature checks where possible
- [ ] Document performance impact
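For example, a sketch of caching the amendment check once and reusing it; `applyNewBehavior`, `needsExtraStep`, and `applyExtraStep` are hypothetical helpers used only for illustration:
```cpp
// Query the rules once and reuse the cached flag, rather than calling
// rules().enabled() at every decision point on the same code path.
bool const featureEnabled = ctx_.view().rules().enabled(feature{Name});

if (featureEnabled)
    applyNewBehavior();             // first use of the cached flag

if (featureEnabled && needsExtraStep())
    applyExtraStep();               // reused, no second rules() lookup
```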
### 10. Code Review Checklist
- [ ] Both enabled/disabled paths are tested
- [ ] Backward compatibility is preserved (for fixes)
- [ ] Error codes are appropriate
- [ ] Documentation is complete
- [ ] Security implications are considered
- [ ] Cache behavior is correct
- [ ] Edge cases are covered
---
## Common Patterns Reference
### Pattern: New Transaction Type
```cpp
// In transactor code:
TER doApply() override
{
if (!ctx_.view().rules().enabled(feature{Name}))
return temDISABLED;
// New transaction logic here
return tesSUCCESS;
}
```
### Pattern: New Ledger Entry Type
```cpp
// In ledger entry creation:
if (!view.rules().enabled(feature{Name}))
return temDISABLED;
auto const sle = std::make_shared<SLE>(ltNEW_TYPE, keylet);
view.insert(sle);
```
### Pattern: Behavioral Fix
```cpp
// At decision point:
bool const useFix = view.rules().enabled(fix{Name});
if (useFix)
{
// Corrected behavior
return performCorrectValidation();
}
else
{
// Legacy behavior (preserved for compatibility)
return performLegacyValidation();
}
```
### Pattern: View Selection
```cpp
// Select which view to use based on amendment:
auto& applyView = sb.rules().enabled(feature{Name}) ? newView : legacyView;
```
---
## Example Workflows
### Workflow 1: Creating a New Feature Amendment
1. User requests: "Add a new ClaimReward transaction type"
2. Skill asks: "What should the amendment be called? (e.g., BalanceRewards - without 'feature' prefix)"
3. Add to features.macro:
```
XRPL_FEATURE(BalanceRewards, Supported::no, VoteBehavior::DefaultNo)
```
4. Implement transaction with `temDISABLED` gate using `featureBalanceRewards`
5. Create test suite with testEnable, testPreflight, testPreclaim
6. Run tests with amendment enabled and disabled
7. When ready, update to `Supported::yes` in features.macro
8. Create specification document
9. Review checklist
### Workflow 2: Creating a Fix Amendment
1. User requests: "Fix the signature validation bug in batch transactions"
2. Skill asks: "What should the fix be called? (e.g., BatchInnerSigs - without 'fix' prefix)"
3. Add to features.macro:
```
XRPL_FIX(BatchInnerSigs, Supported::no, VoteBehavior::DefaultNo)
```
4. Implement fix with if/else using `fixBatchInnerSigs` to preserve old behavior
5. Create test demonstrating vulnerability without fix
6. Create test showing fix works when enabled
7. When ready, update to `Supported::yes` in features.macro
8. Document both code paths
9. Review checklist
---
## Quick Reference: File Locations
- **Amendment definitions (ONLY place to add)**: `include/xrpl/protocol/detail/features.macro`
- **Feature.h (auto-generated, DO NOT EDIT)**: `include/xrpl/protocol/Feature.h`
- **Feature.cpp (auto-generated, DO NOT EDIT)**: `src/libxrpl/protocol/Feature.cpp`
- **Test files**: `src/test/app/` or `src/test/protocol/`
- **Specifications**: Project root (e.g., `XLS_SMART_CONTRACTS.md`)
- **Test plans**: Project root (e.g., `BATCH_COMPREHENSIVE_TEST_PLAN.md`)
## How the Macro System Works
The amendment system uses C preprocessor macros:
1. **features.macro** - Single source of truth (ONLY file you edit):
```
XRPL_FEATURE(Batch, Supported::yes, VoteBehavior::DefaultNo)
XRPL_FIX(TokenEscrowV1, Supported::yes, VoteBehavior::DefaultNo)
```
2. **Feature.h** - Auto-generated declarations from macro:
```cpp
extern uint256 const featureBatch;
extern uint256 const fixTokenEscrowV1;
```
3. **Feature.cpp** - Auto-generated registrations from macro:
```cpp
uint256 const featureBatch = registerFeature("Batch", ...);
uint256 const fixTokenEscrowV1 = registerFeature("fixTokenEscrowV1", ...);
```
**DO NOT** modify Feature.h or Feature.cpp directly - they are generated from features.macro automatically.
---
## When to Use This Skill
Invoke this skill when:
- Creating a new XRPL amendment
- Adding a new transaction type or ledger entry
- Fixing existing XRPL functionality
- Need guidance on amendment best practices
- Setting up amendment tests
- Reviewing amendment implementation
The skill will guide you through the appropriate workflow based on amendment type.

8
.github/CODEOWNERS vendored

@@ -1,8 +0,0 @@
# Allow anyone to review any change by default.
*
# Require the rpc-reviewers team to review changes to the rpc code.
include/xrpl/protocol/ @xrplf/rpc-reviewers
src/libxrpl/protocol/ @xrplf/rpc-reviewers
src/xrpld/rpc/ @xrplf/rpc-reviewers
src/xrpld/app/misc/ @xrplf/rpc-reviewers

View File

@@ -70,7 +70,7 @@ that `test` code should _never_ be included in `xrpl` or `xrpld` code.)
## Validation
The [levelization](generate.py) script takes no parameters,
The [levelization](generate.sh) script takes no parameters,
reads no environment variables, and can be run from any directory,
as long as it is in the expected location in the rippled repo.
It can be run at any time from within a checked out repo, and will
@@ -104,7 +104,7 @@ It generates many files of [results](results):
Github Actions workflow to test that levelization loops haven't
changed. Unfortunately, if changes are detected, it can't tell if
they are improvements or not, so if you have resolved any issues or
done anything else to improve levelization, run `generate.py`,
done anything else to improve levelization, run `levelization.sh`,
and commit the updated results.
The `loops.txt` and `ordering.txt` files relate the modules
@@ -128,7 +128,7 @@ The committed files hide the detailed values intentionally, to
prevent false alarms and merging issues, and because it's easy to
get those details locally.
1. Run `generate.py`
1. Run `levelization.sh`
2. Grep the modules in `paths.txt`.
- For example, if a cycle is found `A ~= B`, simply `grep -w
A .github/scripts/levelization/results/paths.txt | grep -w B`

View File

@@ -1,369 +0,0 @@
#!/usr/bin/env python3
"""
Usage: generate.py
This script takes no parameters, and can be run from any directory,
as long as it is in the expected location in the repo.
"""
import os
import re
import subprocess
import sys
from collections import defaultdict
from pathlib import Path
from typing import Dict, List, Tuple, Set, Optional
# Compile regex patterns once at module level
INCLUDE_PATTERN = re.compile(r"^\s*#include.*/.*\.h")
INCLUDE_PATH_PATTERN = re.compile(r'[<"]([^>"]+)[>"]')
def dictionary_sort_key(s: str) -> str:
"""
Create a sort key that mimics 'sort -d' (dictionary order).
Dictionary order only considers blanks and alphanumeric characters.
This means punctuation like '.' is ignored during sorting.
"""
# Keep only alphanumeric characters and spaces
return "".join(c for c in s if c.isalnum() or c.isspace())
def get_level(file_path: str) -> str:
"""
Extract the level from a file path (second and third directory components).
Equivalent to bash: cut -d/ -f 2,3
Examples:
src/xrpld/app/main.cpp -> xrpld.app
src/libxrpl/protocol/STObject.cpp -> libxrpl.protocol
include/xrpl/basics/base_uint.h -> xrpl.basics
"""
parts = file_path.split("/")
# Get fields 2 and 3 (indices 1 and 2 in 0-based indexing)
if len(parts) >= 3:
level = f"{parts[1]}/{parts[2]}"
elif len(parts) >= 2:
level = f"{parts[1]}/toplevel"
else:
level = file_path
# If the "level" indicates a file, cut off the filename
if "." in level.split("/")[-1]: # Avoid Path object creation
# Use the "toplevel" label as a workaround for `sort`
# inconsistencies between different utility versions
level = level.rsplit("/", 1)[0] + "/toplevel"
return level.replace("/", ".")
def extract_include_level(include_line: str) -> Optional[str]:
"""
Extract the include path from an #include directive.
Gets the first two directory components from the include path.
Equivalent to bash: cut -d/ -f 1,2
Examples:
#include <xrpl/basics/base_uint.h> -> xrpl.basics
#include "xrpld/app/main/Application.h" -> xrpld.app
"""
# Remove everything before the quote or angle bracket
match = INCLUDE_PATH_PATTERN.search(include_line)
if not match:
return None
include_path = match.group(1)
parts = include_path.split("/")
# Get first two fields (indices 0 and 1)
if len(parts) >= 2:
include_level = f"{parts[0]}/{parts[1]}"
else:
include_level = include_path
# If the "includelevel" indicates a file, cut off the filename
if "." in include_level.split("/")[-1]: # Avoid Path object creation
include_level = include_level.rsplit("/", 1)[0] + "/toplevel"
return include_level.replace("/", ".")
def find_repo_root(start_path: Path, depth_limit: int = 10) -> Path:
"""
Find the repository root by looking for .git directory or src/include folders.
Walks up the directory tree from the start path.
"""
current = start_path.resolve()
# Walk up the directory tree
for _ in range(depth_limit): # Limit search depth to prevent infinite loops
# Check if this directory has src or include folders
has_src = (current / "src").exists()
has_include = (current / "include").exists()
if has_src or has_include:
return current
# Check if this is a git repository root
if (current / ".git").exists():
# Check if it has src or include nearby
if has_src or has_include:
return current
# Move up one level
parent = current.parent
if parent == current: # Reached filesystem root
break
current = parent
# If we couldn't find it, raise an error
raise RuntimeError(
"Could not find repository root. "
"Expected to find a directory containing 'src' and/or 'include' folders."
)
def get_scan_directories(repo_root: Path) -> List[Path]:
"""
Get the list of directories to scan for include files.
Returns paths that actually exist.
"""
directories = []
for dir_name in ["include", "src"]:
dir_path = repo_root / dir_name
if dir_path.exists() and dir_path.is_dir():
directories.append(dir_path)
if not directories:
raise RuntimeError(f"No 'src' or 'include' directories found in {repo_root}")
return directories
def main():
# Change to the script's directory
script_dir = Path(__file__).parent.resolve()
os.chdir(script_dir)
# If the shell is interactive, clean up any flotsam before analyzing
# Match bash behavior: check if PS1 is set (indicates interactive shell)
# When running a script, PS1 is not set even if stdin/stdout are TTYs
if os.environ.get("PS1"):
try:
subprocess.run(["git", "clean", "-ix"], check=False, timeout=30)
except (subprocess.TimeoutExpired, KeyboardInterrupt):
print("Skipping git clean...")
except Exception:
# If git clean fails for any reason, just continue
pass
# Clean up and create results directory
results_dir = script_dir / "results"
if results_dir.exists():
import shutil
shutil.rmtree(results_dir)
results_dir.mkdir()
# Find the repository root by searching for src and include directories
try:
repo_root = find_repo_root(script_dir)
scan_dirs = get_scan_directories(repo_root)
print(f"Found repository root: {repo_root}")
print(f"Scanning directories:")
for scan_dir in scan_dirs:
print(f" - {scan_dir.relative_to(repo_root)}")
except RuntimeError as e:
print(f"Error: {e}", file=sys.stderr)
sys.exit(1)
print("\nScanning for raw includes...")
# Find all #include directives
raw_includes: List[Tuple[str, str]] = []
rawincludes_file = results_dir / "rawincludes.txt"
# Write to file as we go to avoid storing everything in memory
with open(rawincludes_file, "w", buffering=8192) as raw_f:
for dir_path in scan_dirs:
print(f" Scanning {dir_path.relative_to(repo_root)}...")
for file_path in dir_path.rglob("*"):
if not file_path.is_file():
continue
try:
rel_path_str = str(file_path.relative_to(repo_root))
# Read file with larger buffer for better performance
with open(
file_path,
"r",
encoding="utf-8",
errors="ignore",
buffering=8192,
) as f:
for line in f:
# Quick check before regex
if "#include" not in line or "boost" in line:
continue
if INCLUDE_PATTERN.match(line):
line_stripped = line.strip()
entry = f"{rel_path_str}:{line_stripped}\n"
print(entry, end="")
raw_f.write(entry)
raw_includes.append((rel_path_str, line_stripped))
except Exception as e:
print(f"Error reading {file_path}: {e}", file=sys.stderr)
# Build levelization paths and count directly (no need to sort first)
print("Build levelization paths")
path_counts: Dict[Tuple[str, str], int] = defaultdict(int)
for file_path, include_line in raw_includes:
level = get_level(file_path)
include_level = extract_include_level(include_line)
if include_level and level != include_level:
path_counts[(level, include_level)] += 1
# Sort and deduplicate paths (using dictionary order like bash 'sort -d')
print("Sort and deduplicate paths")
paths_file = results_dir / "paths.txt"
with open(paths_file, "w") as f:
# Sort using dictionary order: only alphanumeric and spaces matter
sorted_items = sorted(
path_counts.items(),
key=lambda x: (dictionary_sort_key(x[0][0]), dictionary_sort_key(x[0][1])),
)
for (level, include_level), count in sorted_items:
line = f"{count:7} {level} {include_level}\n"
print(line.rstrip())
f.write(line)
# Split into flat-file database
print("Split into flat-file database")
includes_dir = results_dir / "includes"
included_by_dir = results_dir / "included_by"
includes_dir.mkdir()
included_by_dir.mkdir()
# Batch writes by grouping data first to avoid repeated file opens
includes_data: Dict[str, List[Tuple[str, int]]] = defaultdict(list)
included_by_data: Dict[str, List[Tuple[str, int]]] = defaultdict(list)
# Process in sorted order to match bash script behavior (dictionary order)
sorted_items = sorted(
path_counts.items(),
key=lambda x: (dictionary_sort_key(x[0][0]), dictionary_sort_key(x[0][1])),
)
for (level, include_level), count in sorted_items:
includes_data[level].append((include_level, count))
included_by_data[include_level].append((level, count))
# Write all includes files in sorted order (dictionary order)
for level in sorted(includes_data.keys(), key=dictionary_sort_key):
entries = includes_data[level]
with open(includes_dir / level, "w") as f:
for include_level, count in entries:
line = f"{include_level} {count}\n"
print(line.rstrip())
f.write(line)
# Write all included_by files in sorted order (dictionary order)
for include_level in sorted(included_by_data.keys(), key=dictionary_sort_key):
entries = included_by_data[include_level]
with open(included_by_dir / include_level, "w") as f:
for level, count in entries:
line = f"{level} {count}\n"
print(line.rstrip())
f.write(line)
# Search for loops
print("Search for loops")
loops_file = results_dir / "loops.txt"
ordering_file = results_dir / "ordering.txt"
loops_found: Set[Tuple[str, str]] = set()
# Pre-load all include files into memory to avoid repeated I/O
# This is the biggest optimization - we were reading files repeatedly in nested loops
# Use list of tuples to preserve file order
includes_cache: Dict[str, List[Tuple[str, int]]] = {}
includes_lookup: Dict[str, Dict[str, int]] = {} # For fast lookup
# Note: bash script uses 'for source in *' which uses standard glob sorting,
# NOT dictionary order. So we use standard sorted() here, not dictionary_sort_key.
for include_file in sorted(includes_dir.iterdir(), key=lambda p: p.name):
if not include_file.is_file():
continue
includes_cache[include_file.name] = []
includes_lookup[include_file.name] = {}
with open(include_file, "r") as f:
for line in f:
parts = line.strip().split()
if len(parts) >= 2:
include_name = parts[0]
include_count = int(parts[1])
includes_cache[include_file.name].append(
(include_name, include_count)
)
includes_lookup[include_file.name][include_name] = include_count
with open(loops_file, "w", buffering=8192) as loops_f, open(
ordering_file, "w", buffering=8192
) as ordering_f:
# Use standard sorting to match bash glob expansion 'for source in *'
for source in sorted(includes_cache.keys()):
source_includes = includes_cache[source]
for include, include_freq in source_includes:
# Check if include file exists and references source
if include not in includes_lookup:
continue
source_freq = includes_lookup[include].get(source)
if source_freq is not None:
# Found a loop
loop_key = tuple(sorted([source, include]))
if loop_key in loops_found:
continue
loops_found.add(loop_key)
loops_f.write(f"Loop: {source} {include}\n")
# If the counts are close, indicate that the two modules are
# on the same level, though they shouldn't be
diff = include_freq - source_freq
if diff > 3:
loops_f.write(f" {source} > {include}\n\n")
elif diff < -3:
loops_f.write(f" {include} > {source}\n\n")
elif source_freq == include_freq:
loops_f.write(f" {include} == {source}\n\n")
else:
loops_f.write(f" {include} ~= {source}\n\n")
else:
ordering_f.write(f"{source} > {include}\n")
# Print results
print("\nOrdering:")
with open(ordering_file, "r") as f:
print(f.read(), end="")
print("\nLoops:")
with open(loops_file, "r") as f:
print(f.read(), end="")
if __name__ == "__main__":
main()

130
.github/scripts/levelization/generate.sh vendored Executable file

@@ -0,0 +1,130 @@
#!/bin/bash
# Usage: generate.sh
# This script takes no parameters, reads no environment variables,
# and can be run from any directory, as long as it is in the expected
# location in the repo.
pushd $( dirname $0 )
if [ -v PS1 ]
then
# if the shell is interactive, clean up any flotsam before analyzing
git clean -ix
fi
# Ensure all sorting is ASCII-order consistently across platforms.
export LANG=C
rm -rfv results
mkdir results
includes="$( pwd )/results/rawincludes.txt"
pushd ../../..
echo Raw includes:
grep -r '^[ ]*#include.*/.*\.h' include src | \
grep -v boost | tee ${includes}
popd
pushd results
oldifs=${IFS}
IFS=:
mkdir includes
mkdir included_by
echo Build levelization paths
exec 3< ${includes} # open rawincludes.txt for input
while read -r -u 3 file include
do
level=$( echo ${file} | cut -d/ -f 2,3 )
# If the "level" indicates a file, cut off the filename
if [[ "${level##*.}" != "${level}" ]]
then
# Use the "toplevel" label as a workaround for `sort`
# inconsistencies between different utility versions
level="$( dirname ${level} )/toplevel"
fi
level=$( echo ${level} | tr '/' '.' )
includelevel=$( echo ${include} | sed 's/.*["<]//; s/[">].*//' | \
cut -d/ -f 1,2 )
if [[ "${includelevel##*.}" != "${includelevel}" ]]
then
# Use the "toplevel" label as a workaround for `sort`
# inconsistencies between different utility versions
includelevel="$( dirname ${includelevel} )/toplevel"
fi
includelevel=$( echo ${includelevel} | tr '/' '.' )
if [[ "$level" != "$includelevel" ]]
then
echo $level $includelevel | tee -a paths.txt
fi
done
echo Sort and deduplicate paths
sort -ds paths.txt | uniq -c | tee sortedpaths.txt
mv sortedpaths.txt paths.txt
exec 3>&- #close fd 3
IFS=${oldifs}
unset oldifs
echo Split into flat-file database
exec 4<paths.txt # open paths.txt for input
while read -r -u 4 count level include
do
echo ${include} ${count} | tee -a includes/${level}
echo ${level} ${count} | tee -a included_by/${include}
done
exec 4>&- #close fd 4
loops="$( pwd )/loops.txt"
ordering="$( pwd )/ordering.txt"
pushd includes
echo Search for loops
# Redirect stdout to a file
exec 4>&1
exec 1>"${loops}"
for source in *
do
if [[ -f "$source" ]]
then
exec 5<"${source}" # open for input
while read -r -u 5 include includefreq
do
if [[ -f $include ]]
then
if grep -q -w $source $include
then
if grep -q -w "Loop: $include $source" "${loops}"
then
continue
fi
sourcefreq=$( grep -w $source $include | cut -d\ -f2 )
echo "Loop: $source $include"
# If the counts are close, indicate that the two modules are
# on the same level, though they shouldn't be
if [[ $(( $includefreq - $sourcefreq )) -gt 3 ]]
then
echo -e " $source > $include\n"
elif [[ $(( $sourcefreq - $includefreq )) -gt 3 ]]
then
echo -e " $include > $source\n"
elif [[ $sourcefreq -eq $includefreq ]]
then
echo -e " $include == $source\n"
else
echo -e " $include ~= $source\n"
fi
else
echo "$source > $include" >> "${ordering}"
fi
fi
done
exec 5>&- #close fd 5
fi
done
exec 1>&4 #close fd 1
exec 4>&- #close fd 4
cat "${ordering}"
cat "${loops}"
popd
popd
popd

30
.github/scripts/rename/include.sh vendored Executable file

@@ -0,0 +1,30 @@
#!/bin/bash
# Exit the script as soon as an error occurs.
set -e
# This script checks whether there are no new include guards introduced by a new
# PR, as header files should use "#pragma once" instead. The script assumes any
# include guards will use "XRPL_" as prefix.
# Usage: .github/scripts/rename/include.sh <repository directory>
if [ "$#" -ne 1 ]; then
echo "Usage: $0 <repository directory>"
exit 1
fi
DIRECTORY=$1
echo "Processing directory: ${DIRECTORY}"
if [ ! -d "${DIRECTORY}" ]; then
echo "Error: Directory '${DIRECTORY}' does not exist."
exit 1
fi
find "${DIRECTORY}" -type f \( -name "*.h" -o -name "*.hpp" -o -name "*.ipp" \) | while read -r FILE; do
echo "Processing file: ${FILE}"
if grep -q "#ifndef XRPL_" "${FILE}"; then
echo "Please replace all include guards by #pragma once."
exit 1
fi
done
echo "Checking complete."

View File

@@ -196,11 +196,22 @@ def generate_strategy_matrix(all: bool, config: Config) -> list:
# Enable code coverage for Debian Bookworm using GCC 15 in Debug on
# linux/amd64
if (
f"{os['compiler_name']}-{os['compiler_version']}" == "gcc-15"
f"{os['distro_name']}-{os['distro_version']}" == "debian-bookworm"
and f"{os['compiler_name']}-{os['compiler_version']}" == "gcc-15"
and build_type == "Debug"
and architecture["platform"] == "linux/amd64"
):
cmake_args = f"-Dcoverage=ON -Dcoverage_format=xml -DCODE_COVERAGE_VERBOSE=ON -DCMAKE_C_FLAGS=-O0 -DCMAKE_CXX_FLAGS=-O0 {cmake_args}"
cmake_args = f"{cmake_args} -Dcoverage=ON -Dcoverage_format=xml -DCODE_COVERAGE_VERBOSE=ON -DCMAKE_C_FLAGS=-O0 -DCMAKE_CXX_FLAGS=-O0"
# Enable unity build for Ubuntu Jammy using GCC 12 in Debug on
# linux/amd64.
if (
f"{os['distro_name']}-{os['distro_version']}" == "ubuntu-jammy"
and f"{os['compiler_name']}-{os['compiler_version']}" == "gcc-12"
and build_type == "Debug"
and architecture["platform"] == "linux/amd64"
):
cmake_args = f"{cmake_args} -Dunity=ON"
# Generate a unique name for the configuration, e.g. macos-arm64-debug
# or debian-bookworm-gcc-12-amd64-release.
@@ -217,6 +228,8 @@ def generate_strategy_matrix(all: bool, config: Config) -> list:
config_name += f"-{build_type.lower()}"
if "-Dcoverage=ON" in cmake_args:
config_name += "-coverage"
if "-Dunity=ON" in cmake_args:
config_name += "-unity"
# Add the configuration to the list, with the most unique fields first,
# so that they are easier to identify in the GitHub Actions UI, as long

View File

@@ -20,7 +20,7 @@ jobs:
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Check levelization
run: python .github/scripts/levelization/generate.py
run: .github/scripts/levelization/generate.sh
- name: Check for differences
env:
MESSAGE: |
@@ -32,7 +32,7 @@ jobs:
removed from loops.txt, it's probably an improvement, while if
something was added, it's probably a regression.
Run '.github/scripts/levelization/generate.py' in your repo, commit
Run '.github/scripts/levelization/generate.sh' in your repo, commit
and push the changes. See .github/scripts/levelization/README.md for
more info.
run: |

View File

@@ -31,6 +31,8 @@ jobs:
run: .github/scripts/rename/namespace.sh .
- name: Check config name
run: .github/scripts/rename/config.sh .
- name: Check include guards
run: .github/scripts/rename/include.sh .
- name: Check for differences
env:
MESSAGE: |

5
.gitignore vendored

@@ -69,8 +69,3 @@ DerivedData
# AI tools.
/.augment
/.claude
/CLAUDE.md
# Python
__pycache__

View File

@@ -575,10 +575,16 @@ See [Sanitizers docs](./docs/build/sanitizers.md) for more details.
| `assert` | OFF | Enable assertions. |
| `coverage` | OFF | Prepare the coverage report. |
| `tests` | OFF | Build tests. |
| `unity` | OFF | Configure a unity build. |
| `xrpld` | OFF | Build the xrpld application, and not just the libxrpl library. |
| `werr` | OFF | Treat compilation warnings as errors |
| `wextra` | OFF | Enable additional compilation warnings |
[Unity builds][5] may be faster for the first build (at the cost of much more
memory) since they concatenate sources into fewer translation units. Non-unity
builds may be faster for incremental builds, and can be helpful for detecting
`#include` omissions.
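To make the mechanism concrete, a unity translation unit is essentially a generated wrapper that `#include`s several ordinary source files. A sketch of what CMake emits when `CMAKE_UNITY_BUILD` is on follows; the listed paths are illustrative placeholders, not the actual batch contents:
```cpp
// Illustrative unity source; CMake generates files of roughly this shape.
#include "src/libxrpl/basics/Log.cpp"
#include "src/libxrpl/basics/Number.cpp"
#include "src/libxrpl/basics/StringUtilities.cpp"
// ...more sources concatenated into this single translation unit...
```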
## Troubleshooting
### Conan
@@ -645,6 +651,7 @@ If you want to experiment with a new package, follow these steps:
[1]: https://github.com/conan-io/conan-center-index/issues/13168
[2]: https://en.cppreference.com/w/cpp/compiler_support/20
[3]: https://docs.conan.io/en/latest/getting_started.html
[5]: https://en.wikipedia.org/wiki/Unity_build
[6]: https://github.com/boostorg/beast/issues/2648
[7]: https://github.com/boostorg/beast/issues/2661
[gcovr]: https://gcovr.com/en/stable/getting-started.html

View File

@@ -940,23 +940,7 @@
#
# path Location to store the database
#
# Optional keys
#
# cache_size Size of cache for database records. Default is 16384.
# Setting this value to 0 will use the default value.
#
# cache_age Length of time in minutes to keep database records
# cached. Default is 5 minutes. Setting this value to
# 0 will use the default value.
#
# Note: if neither cache_size nor cache_age is
# specified, the cache for database records will not
# be created. If only one of cache_size or cache_age
# is specified, the cache will be created using the
# default value for the unspecified parameter.
#
# Note: the cache will not be created if online_delete
# is specified.
# Optional keys for NuDB and RocksDB:
#
# fast_load Boolean. If set, load the last persisted ledger
# from disk upon process start before syncing to
@@ -964,8 +948,6 @@
# if sufficient IOPS capacity is available.
# Default 0.
#
# Optional keys for NuDB or RocksDB:
#
# earliest_seq The default is 32570 to match the XRP ledger
# network's earliest allowed sequence. Alternate
# networks may set this value. Minimum value of 1.

View File

@@ -4,7 +4,12 @@
include(target_protobuf_sources)
# Protocol buffers cannot participate in a unity build,
# because all the generated sources
# define a bunch of `static const` variables with the same names,
# so we just build them as a separate library.
add_library(xrpl.libpb)
set_target_properties(xrpl.libpb PROPERTIES UNITY_BUILD OFF)
target_protobuf_sources(xrpl.libpb xrpl/proto LANGUAGE cpp IMPORT_DIRS include/xrpl/proto
PROTOS include/xrpl/proto/xrpl.proto)
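As a generic illustration of the clash described in the comment above (the symbol name below is invented, not actual protobuf output): two generated sources that compile fine on their own collide once a unity build concatenates them into one translation unit.
```cpp
// --- from a.pb.cc ---
static const int kGeneratedDescriptorTable = 4;

// --- from b.pb.cc, pulled into the SAME unity translation unit ---
static const int kGeneratedDescriptorTable = 7;  // error: redefinition
```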

View File

@@ -30,6 +30,14 @@ if (tests)
endif ()
endif ()
option(unity "Creates a build using UNITY support in cmake." OFF)
if (unity)
if (NOT is_ci)
set(CMAKE_UNITY_BUILD_BATCH_SIZE 15 CACHE STRING "")
endif ()
set(CMAKE_UNITY_BUILD ON CACHE BOOL "Do a unity build")
endif ()
if (is_clang AND is_linux)
option(voidstar "Enable Antithesis instrumentation." OFF)
endif ()

View File

@@ -6,11 +6,11 @@
"sqlite3/3.49.1#8631739a4c9b93bd3d6b753bac548a63%1765850149.926",
"soci/4.0.3#a9f8d773cd33e356b5879a4b0564f287%1765850149.46",
"snappy/1.1.10#968fef506ff261592ec30c574d4a7809%1765850147.878",
"secp256k1/0.7.0#9c4ab67bdc3860c16ea5b36aed8f74ea%1765850147.928",
"secp256k1/0.7.1#3a61e95e220062ef32c48d019e9c81f7%1770306721.686",
"rocksdb/10.5.1#4a197eca381a3e5ae8adf8cffa5aacd0%1765850186.86",
"re2/20230301#ca3b241baec15bd31ea9187150e0b333%1765850148.103",
"protobuf/6.32.1#f481fd276fc23a33b85a3ed1e898b693%1765850161.038",
"openssl/3.5.4#1b986e61b38fdfda3b40bebc1b234393%1768312656.257",
"openssl/3.5.5#05a4ac5b7323f7a329b2db1391d9941f%1769599205.414",
"nudb/2.0.9#0432758a24204da08fee953ec9ea03cb%1769436073.32",
"lz4/1.10.0#59fc63cac7f10fbe8e05c7e62c2f3504%1765850143.914",
"libiconv/1.17#1e65319e945f2d31941a9d28cc13c058%1765842973.492",
@@ -23,7 +23,7 @@
"date/3.0.4#862e11e80030356b53c2c38599ceb32b%1765850143.772",
"c-ares/1.34.5#5581c2b62a608b40bb85d965ab3ec7c8%1765850144.336",
"bzip2/1.0.8#c470882369c2d95c5c77e970c0c7e321%1765850143.837",
"boost/1.90.0#d5e8defe7355494953be18524a7f135b%1765955095.179",
"boost/1.90.0#d5e8defe7355494953be18524a7f135b%1769454080.269",
"abseil/20250127.0#99262a368bd01c0ccca8790dfced9719%1766517936.993"
],
"build_requires": [
@@ -31,7 +31,7 @@
"strawberryperl/5.32.1.1#707032463aa0620fa17ec0d887f5fe41%1765850165.196",
"protobuf/6.32.1#f481fd276fc23a33b85a3ed1e898b693%1765850161.038",
"nasm/2.16.01#31e26f2ee3c4346ecd347911bd126904%1765850144.707",
"msys2/cci.latest#1996656c3c98e5765b25b60ff5cf77b4%1764840888.758",
"msys2/cci.latest#eea83308ad7e9023f7318c60d5a9e6cb%1770199879.083",
"m4/1.4.19#70dc8bbb33e981d119d2acc0175cf381%1763158052.846",
"cmake/4.2.0#ae0a44f44a1ef9ab68fd4b3e9a1f8671%1765850153.937",
"cmake/3.31.10#313d16a1aa16bbdb2ca0792467214b76%1765850153.479",

View File

@@ -23,6 +23,7 @@ class Xrpl(ConanFile):
"shared": [True, False],
"static": [True, False],
"tests": [True, False],
"unity": [True, False],
"xrpld": [True, False],
}
@@ -31,8 +32,8 @@ class Xrpl(ConanFile):
"grpc/1.72.0",
"libarchive/3.8.1",
"nudb/2.0.9",
"openssl/3.5.4",
"secp256k1/0.7.0",
"openssl/3.5.5",
"secp256k1/0.7.1",
"soci/4.0.3",
"zlib/1.3.1",
]
@@ -54,6 +55,7 @@ class Xrpl(ConanFile):
"shared": False,
"static": True,
"tests": False,
"unity": False,
"xrpld": False,
"date/*:header_only": True,
"ed25519/*:shared": False,
@@ -166,6 +168,7 @@ class Xrpl(ConanFile):
tc.variables["rocksdb"] = self.options.rocksdb
tc.variables["BUILD_SHARED_LIBS"] = self.options.shared
tc.variables["static"] = self.options.static
tc.variables["unity"] = self.options.unity
tc.variables["xrpld"] = self.options.xrpld
tc.generate()

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_ARCHIVE_H_INCLUDED
#define XRPL_BASICS_ARCHIVE_H_INCLUDED
#pragma once
#include <boost/filesystem.hpp>
@@ -16,5 +15,3 @@ void
extractTarLz4(boost::filesystem::path const& src, boost::filesystem::path const& dst);
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_BASICCONFIG_H_INCLUDED
#define XRPL_BASICS_BASICCONFIG_H_INCLUDED
#pragma once
#include <xrpl/basics/contract.h>
@@ -369,5 +368,3 @@ get_if_exists<bool>(Section const& section, std::string const& name, bool& v)
}
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_BLOB_H_INCLUDED
#define XRPL_BASICS_BLOB_H_INCLUDED
#pragma once
#include <vector>
@@ -11,5 +10,3 @@ namespace xrpl {
using Blob = std::vector<unsigned char>;
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_BUFFER_H_INCLUDED
#define XRPL_BASICS_BUFFER_H_INCLUDED
#pragma once
#include <xrpl/basics/Slice.h>
#include <xrpl/beast/utility/instrumentation.h>
@@ -213,5 +212,3 @@ operator!=(Buffer const& lhs, Buffer const& rhs) noexcept
}
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_BYTEUTILITIES_H_INCLUDED
#define XRPL_BASICS_BYTEUTILITIES_H_INCLUDED
#pragma once
namespace xrpl {
@@ -20,5 +19,3 @@ megabytes(T value) noexcept
static_assert(kilobytes(2) == 2048, "kilobytes(2) == 2048");
static_assert(megabytes(3) == 3145728, "megabytes(3) == 3145728");
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_COMPRESSIONALGORITHMS_H_INCLUDED
#define XRPL_COMPRESSIONALGORITHMS_H_INCLUDED
#pragma once
#include <xrpl/basics/contract.h>
@@ -133,5 +132,3 @@ lz4Decompress(InputStream& in, std::size_t inSize, std::uint8_t* decompressed, s
} // namespace compression_algorithms
} // namespace xrpl
#endif // XRPL_COMPRESSIONALGORITHMS_H_INCLUDED

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_COUNTEDOBJECT_H_INCLUDED
#define XRPL_BASICS_COUNTEDOBJECT_H_INCLUDED
#pragma once
#include <xrpl/beast/type_name.h>
@@ -134,5 +133,3 @@ public:
};
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_DECAYINGSAMPLE_H_INCLUDED
#define XRPL_BASICS_DECAYINGSAMPLE_H_INCLUDED
#pragma once
#include <chrono>
#include <cmath>
@@ -130,5 +129,3 @@ private:
};
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_EXPECTED_H_INCLUDED
#define XRPL_BASICS_EXPECTED_H_INCLUDED
#pragma once
#include <xrpl/basics/contract.h>
@@ -229,5 +228,3 @@ public:
};
} // namespace xrpl
#endif // XRPL_BASICS_EXPECTED_H_INCLUDED

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_FILEUTILITIES_H_INCLUDED
#define XRPL_BASICS_FILEUTILITIES_H_INCLUDED
#pragma once
#include <boost/filesystem.hpp>
#include <boost/system/error_code.hpp>
@@ -18,5 +17,3 @@ void
writeFileContents(boost::system::error_code& ec, boost::filesystem::path const& destPath, std::string const& contents);
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_INTRUSIVEPOINTER_H_INCLUDED
#define XRPL_BASICS_INTRUSIVEPOINTER_H_INCLUDED
#pragma once
#include <concepts>
#include <cstdint>
@@ -485,4 +484,3 @@ dynamic_pointer_cast(TT const& v)
}
} // namespace intr_ptr
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_INTRUSIVEPOINTER_IPP_INCLUDED
#define XRPL_BASICS_INTRUSIVEPOINTER_IPP_INCLUDED
#pragma once
#include <xrpl/basics/IntrusivePointer.h>
#include <xrpl/basics/IntrusiveRefCounts.h>
@@ -703,4 +702,3 @@ SharedWeakUnion<T>::unsafeReleaseNoStore()
}
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_INTRUSIVEREFCOUNTS_H_INCLUDED
#define XRPL_BASICS_INTRUSIVEREFCOUNTS_H_INCLUDED
#pragma once
#include <xrpl/beast/utility/instrumentation.h>
@@ -461,4 +460,3 @@ partialDestructorFinished(T** o)
//------------------------------------------------------------------------------
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_KEYCACHE_H
#define XRPL_BASICS_KEYCACHE_H
#pragma once
#include <xrpl/basics/TaggedCache.h>
#include <xrpl/basics/base_uint.h>
@@ -9,5 +8,3 @@ namespace xrpl {
using KeyCache = TaggedCache<uint256, int, true>;
} // namespace xrpl
#endif // XRPL_BASICS_KEYCACHE_H

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_LOCALVALUE_H_INCLUDED
#define XRPL_BASICS_LOCALVALUE_H_INCLUDED
#pragma once
#include <boost/thread/tss.hpp>
@@ -107,5 +106,3 @@ LocalValue<T>::operator*()
lvs->values.emplace(this, std::make_unique<detail::LocalValues::Value<T>>(t_)).first->second->get());
}
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_LOG_H_INCLUDED
#define XRPL_BASICS_LOG_H_INCLUDED
#pragma once
#include <xrpl/basics/UnorderedContainers.h>
#include <xrpl/beast/utility/Journal.h>
@@ -258,5 +257,3 @@ beast::Journal
debugLog();
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_MATHUTILITIES_H_INCLUDED
#define XRPL_BASICS_MATHUTILITIES_H_INCLUDED
#pragma once
#include <algorithm>
#include <cassert>
@@ -45,5 +44,3 @@ static_assert(calculatePercent(50'000'001, 100'000'000) == 51);
static_assert(calculatePercent(99'999'999, 100'000'000) == 100);
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_NUMBER_H_INCLUDED
#define XRPL_BASICS_NUMBER_H_INCLUDED
#pragma once
#include <xrpl/beast/utility/instrumentation.h>
@@ -819,5 +818,3 @@ public:
};
} // namespace xrpl
#endif // XRPL_BASICS_NUMBER_H_INCLUDED

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_RANGESET_H_INCLUDED
#define XRPL_BASICS_RANGESET_H_INCLUDED
#pragma once
#include <xrpl/beast/core/LexicalCast.h>
@@ -173,5 +172,3 @@ prevMissing(RangeSet<T> const& rs, T t, T minVal = 0)
}
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_RESOLVER_H_INCLUDED
#define XRPL_BASICS_RESOLVER_H_INCLUDED
#pragma once
#include <xrpl/beast/net/IPEndpoint.h>
@@ -45,5 +44,3 @@ public:
};
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_RESOLVERASIO_H_INCLUDED
#define XRPL_BASICS_RESOLVERASIO_H_INCLUDED
#pragma once
#include <xrpl/basics/Resolver.h>
#include <xrpl/beast/utility/Journal.h>
@@ -18,5 +17,3 @@ public:
};
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_SHAMAP_HASH_H_INCLUDED
#define XRPL_BASICS_SHAMAP_HASH_H_INCLUDED
#pragma once
#include <xrpl/basics/base_uint.h>
#include <xrpl/basics/partitioned_unordered_map.h>
@@ -98,5 +97,3 @@ extract(SHAMapHash const& key)
}
} // namespace xrpl
#endif // XRPL_BASICS_SHAMAP_HASH_H_INCLUDED

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_SHAREDWEAKCACHEPOINTER_H_INCLUDED
#define XRPL_BASICS_SHAREDWEAKCACHEPOINTER_H_INCLUDED
#pragma once
#include <memory>
#include <variant>
@@ -113,4 +112,3 @@ private:
std::variant<std::shared_ptr<T>, std::weak_ptr<T>> combo_;
};
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_SHAREDWEAKCACHEPOINTER_IPP_INCLUDED
#define XRPL_BASICS_SHAREDWEAKCACHEPOINTER_IPP_INCLUDED
#pragma once
#include <xrpl/basics/SharedWeakCachePointer.h>
@@ -164,4 +163,3 @@ SharedWeakCachePointer<T>::convertToWeak()
return false;
}
} // namespace xrpl
#endif

View File

@@ -1,7 +1,6 @@
// Copyright (c) 2022, Nikolaos D. Bougalis <nikb@bougalis.net>
#ifndef XRPL_BASICS_SLABALLOCATOR_H_INCLUDED
#define XRPL_BASICS_SLABALLOCATOR_H_INCLUDED
#pragma once
#include <xrpl/basics/ByteUtilities.h>
#include <xrpl/beast/type_name.h>
@@ -386,5 +385,3 @@ public:
};
} // namespace xrpl
#endif // XRPL_BASICS_SLABALLOCATOR_H_INCLUDED

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_SLICE_H_INCLUDED
#define XRPL_BASICS_SLICE_H_INCLUDED
#pragma once
#include <xrpl/basics/contract.h>
#include <xrpl/basics/strHex.h>
@@ -231,5 +230,3 @@ makeSlice(std::basic_string<char, Traits, Alloc> const& s)
}
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_STRINGUTILITIES_H_INCLUDED
#define XRPL_BASICS_STRINGUTILITIES_H_INCLUDED
#pragma once
#include <xrpl/basics/Blob.h>
#include <xrpl/basics/strHex.h>
@@ -132,5 +131,3 @@ bool
isProperlyFormedTomlDomain(std::string_view domain);
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_TAGGEDCACHE_H_INCLUDED
#define XRPL_BASICS_TAGGEDCACHE_H_INCLUDED
#pragma once
#include <xrpl/basics/IntrusivePointer.h>
#include <xrpl/basics/Log.h>
@@ -298,5 +297,3 @@ private:
};
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_TAGGEDCACHE_IPP_INCLUDED
#define XRPL_BASICS_TAGGEDCACHE_IPP_INCLUDED
#pragma once
#include <xrpl/basics/IntrusivePointer.ipp>
#include <xrpl/basics/TaggedCache.h>
@@ -784,5 +783,3 @@ TaggedCache<Key, T, IsKeyCache, SharedWeakUnionPointer, SharedPointerType, Hash,
}
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_TOSTRING_H_INCLUDED
#define XRPL_BASICS_TOSTRING_H_INCLUDED
#pragma once
#include <string>
#include <type_traits>
@@ -44,5 +43,3 @@ to_string(char const* s)
}
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_UNORDEREDCONTAINERS_H_INCLUDED
#define XRPL_BASICS_UNORDEREDCONTAINERS_H_INCLUDED
#pragma once
#include <xrpl/basics/hardened_hash.h>
#include <xrpl/basics/partitioned_unordered_map.h>
@@ -99,5 +98,3 @@ template <
using hardened_hash_multiset = std::unordered_multiset<Value, Hash, Pred, Allocator>;
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_UPTIMETIMER_H_INCLUDED
#define XRPL_BASICS_UPTIMETIMER_H_INCLUDED
#pragma once
#include <atomic>
#include <chrono>
@@ -46,5 +45,3 @@ private:
};
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_ALGORITHM_H_INCLUDED
#define XRPL_ALGORITHM_H_INCLUDED
#pragma once
#include <utility>
@@ -90,5 +89,3 @@ remove_if_intersect_or_match(FwdIter1 first1, FwdIter1 last1, InputIter2 first2,
}
} // namespace xrpl
#endif

View File

@@ -32,8 +32,7 @@
*/
#ifndef XRPL_BASICS_BASE64_H_INCLUDED
#define XRPL_BASICS_BASE64_H_INCLUDED
#pragma once
#include <cstdint>
#include <string>
@@ -53,5 +52,3 @@ std::string
base64_decode(std::string_view data);
} // namespace xrpl
#endif

View File

@@ -3,8 +3,7 @@
// Distributed under the MIT/X11 software license, see the accompanying
// file license.txt or http://www.opensource.org/licenses/mit-license.php.
#ifndef XRPL_BASICS_BASE_UINT_H_INCLUDED
#define XRPL_BASICS_BASE_UINT_H_INCLUDED
#pragma once
#include <xrpl/basics/Expected.h>
#include <xrpl/basics/Slice.h>
@@ -644,5 +643,3 @@ struct is_uniquely_represented<xrpl::base_uint<Bits, Tag>> : public std::true_ty
};
} // namespace beast
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_CHRONO_H_INCLUDED
#define XRPL_BASICS_CHRONO_H_INCLUDED
#pragma once
#include <xrpl/beast/clock/abstract_clock.h>
#include <xrpl/beast/clock/basic_seconds_clock.h>
@@ -99,5 +98,3 @@ stopwatch()
}
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_COMPARATORS_H_INCLUDED
#define XRPL_BASICS_COMPARATORS_H_INCLUDED
#pragma once
#include <functional>
@@ -53,5 +52,3 @@ using equal_to = std::equal_to<T>;
#endif
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_CONTRACT_H_INCLUDED
#define XRPL_BASICS_CONTRACT_H_INCLUDED
#pragma once
#include <xrpl/beast/type_name.h>
@@ -48,5 +47,3 @@ Throw(Args&&... args)
LogicError(std::string const& how) noexcept;
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_HARDENED_HASH_H_INCLUDED
#define XRPL_BASICS_HARDENED_HASH_H_INCLUDED
#pragma once
#include <xrpl/beast/hash/hash_append.h>
#include <xrpl/beast/hash/xxhasher.h>
@@ -93,5 +92,3 @@ public:
};
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef JOIN_H_INCLUDED
#define JOIN_H_INCLUDED
#pragma once
#include <string>
@@ -80,5 +79,3 @@ public:
};
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_MAKE_SSLCONTEXT_H_INCLUDED
#define XRPL_BASICS_MAKE_SSLCONTEXT_H_INCLUDED
#pragma once
#include <boost/asio/ssl/context.hpp>
@@ -20,5 +19,3 @@ make_SSLContextAuthed(
std::string const& cipherList);
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_MULDIV_H_INCLUDED
#define XRPL_BASICS_MULDIV_H_INCLUDED
#pragma once
#include <cstdint>
#include <limits>
@@ -22,5 +21,3 @@ std::optional<std::uint64_t>
mulDiv(std::uint64_t value, std::uint64_t mul, std::uint64_t div);
} // namespace xrpl
#endif

View File

@@ -1,5 +1,4 @@
#ifndef XRPL_BASICS_PARTITIONED_UNORDERED_MAP_H
#define XRPL_BASICS_PARTITIONED_UNORDERED_MAP_H
#pragma once
#include <xrpl/beast/hash/uhash.h>
#include <xrpl/beast/utility/instrumentation.h>
@@ -393,5 +392,3 @@ private:
};
} // namespace xrpl
#endif // XRPL_BASICS_PARTITIONED_UNORDERED_MAP_H

View File

@@ -1,5 +1,4 @@
-#ifndef XRPL_BASICS_RANDOM_H_INCLUDED
-#define XRPL_BASICS_RANDOM_H_INCLUDED
+#pragma once
 #include <xrpl/beast/utility/instrumentation.h>
 #include <xrpl/beast/xor_shift_engine.h>
@@ -174,5 +173,3 @@ rand_bool()
 /** @} */
 } // namespace xrpl
-#endif // XRPL_BASICS_RANDOM_H_INCLUDED

View File

@@ -1,5 +1,4 @@
-#ifndef XRPL_BASICS_ROCKSDB_H_INCLUDED
-#define XRPL_BASICS_ROCKSDB_H_INCLUDED
+#pragma once
 #if XRPL_ROCKSDB_AVAILABLE
 // #include <rocksdb2/port/port_posix.h>
@@ -28,5 +27,3 @@
 #include <rocksdb/write_batch.h>
 #endif
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef XRPL_BASICS_SAFE_CAST_H_INCLUDED
-#define XRPL_BASICS_SAFE_CAST_H_INCLUDED
+#pragma once
 #include <type_traits>
@@ -69,5 +68,3 @@ unsafe_cast(Src s) noexcept
 }
 } // namespace xrpl
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef XRPL_BASICS_SCOPE_H_INCLUDED
-#define XRPL_BASICS_SCOPE_H_INCLUDED
+#pragma once
 #include <xrpl/beast/utility/instrumentation.h>
@@ -220,5 +219,3 @@ template <class Mutex>
 scope_unlock(std::unique_lock<Mutex>&) -> scope_unlock<Mutex>;
 } // namespace xrpl
-#endif

View File

@@ -1,7 +1,6 @@
 // Copyright (c) 2022, Nikolaos D. Bougalis <nikb@bougalis.net>
-#ifndef XRPL_BASICS_SPINLOCK_H_INCLUDED
-#define XRPL_BASICS_SPINLOCK_H_INCLUDED
+#pragma once
 #include <xrpl/beast/utility/instrumentation.h>
@@ -201,5 +200,3 @@ public:
 /** @} */
 } // namespace xrpl
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef XRPL_BASICS_STRHEX_H_INCLUDED
-#define XRPL_BASICS_STRHEX_H_INCLUDED
+#pragma once
 #include <boost/algorithm/hex.hpp>
 #include <boost/endian/conversion.hpp>
@@ -27,5 +26,3 @@ strHex(T const& from)
 }
 } // namespace xrpl
-#endif

View File

@@ -1,7 +1,6 @@
 // Copyright (c) 2014, Nikolaos D. Bougalis <nikb@bougalis.net>
-#ifndef BEAST_UTILITY_TAGGED_INTEGER_H_INCLUDED
-#define BEAST_UTILITY_TAGGED_INTEGER_H_INCLUDED
+#pragma once
 #include <xrpl/beast/hash/hash_append.h>
@@ -202,4 +201,3 @@ struct is_contiguously_hashable<xrpl::tagged_integer<Int, Tag>, HashAlgorithm>
 };
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_ASIO_IO_LATENCY_PROBE_H_INCLUDED
-#define BEAST_ASIO_IO_LATENCY_PROBE_H_INCLUDED
+#pragma once
 #include <xrpl/beast/utility/instrumentation.h>
@@ -226,5 +225,3 @@ private:
 };
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CHRONO_ABSTRACT_CLOCK_H_INCLUDED
-#define BEAST_CHRONO_ABSTRACT_CLOCK_H_INCLUDED
+#pragma once
 namespace beast {
@@ -89,5 +88,3 @@ get_abstract_clock()
 }
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CHRONO_BASIC_SECONDS_CLOCK_H_INCLUDED
-#define BEAST_CHRONO_BASIC_SECONDS_CLOCK_H_INCLUDED
+#pragma once
 #include <chrono>
@@ -33,5 +32,3 @@ public:
 };
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CHRONO_MANUAL_CLOCK_H_INCLUDED
-#define BEAST_CHRONO_MANUAL_CLOCK_H_INCLUDED
+#pragma once
 #include <xrpl/beast/clock/abstract_clock.h>
 #include <xrpl/beast/utility/instrumentation.h>
@@ -75,5 +74,3 @@ public:
 };
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_AGED_CONTAINER_H_INCLUDED
-#define BEAST_CONTAINER_AGED_CONTAINER_H_INCLUDED
+#pragma once
 #include <type_traits>
@@ -12,5 +11,3 @@ struct is_aged_container : std::false_type
 };
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_AGED_CONTAINER_UTILITY_H_INCLUDED
-#define BEAST_CONTAINER_AGED_CONTAINER_UTILITY_H_INCLUDED
+#pragma once
 #include <xrpl/beast/container/aged_container.h>
@@ -24,5 +23,3 @@ expire(AgedContainer& c, std::chrono::duration<Rep, Period> const& age)
 }
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_AGED_MAP_H_INCLUDED
-#define BEAST_CONTAINER_AGED_MAP_H_INCLUDED
+#pragma once
 #include <xrpl/beast/container/detail/aged_ordered_container.h>
@@ -18,5 +17,3 @@ template <
 using aged_map = detail::aged_ordered_container<false, true, Key, T, Clock, Compare, Allocator>;
 }
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_AGED_MULTIMAP_H_INCLUDED
-#define BEAST_CONTAINER_AGED_MULTIMAP_H_INCLUDED
+#pragma once
 #include <xrpl/beast/container/detail/aged_ordered_container.h>
@@ -18,5 +17,3 @@ template <
 using aged_multimap = detail::aged_ordered_container<true, true, Key, T, Clock, Compare, Allocator>;
 }
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_AGED_MULTISET_H_INCLUDED
-#define BEAST_CONTAINER_AGED_MULTISET_H_INCLUDED
+#pragma once
 #include <xrpl/beast/container/detail/aged_ordered_container.h>
@@ -17,5 +16,3 @@ template <
 using aged_multiset = detail::aged_ordered_container<true, false, Key, void, Clock, Compare, Allocator>;
 }
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_AGED_SET_H_INCLUDED
-#define BEAST_CONTAINER_AGED_SET_H_INCLUDED
+#pragma once
 #include <xrpl/beast/container/detail/aged_ordered_container.h>
@@ -17,5 +16,3 @@ template <
 using aged_set = detail::aged_ordered_container<false, false, Key, void, Clock, Compare, Allocator>;
 }
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_AGED_UNORDERED_MAP_H_INCLUDED
-#define BEAST_CONTAINER_AGED_UNORDERED_MAP_H_INCLUDED
+#pragma once
 #include <xrpl/beast/container/detail/aged_unordered_container.h>
@@ -19,5 +18,3 @@ template <
 using aged_unordered_map = detail::aged_unordered_container<false, true, Key, T, Clock, Hash, KeyEqual, Allocator>;
 }
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_AGED_UNORDERED_MULTIMAP_H_INCLUDED
-#define BEAST_CONTAINER_AGED_UNORDERED_MULTIMAP_H_INCLUDED
+#pragma once
 #include <xrpl/beast/container/detail/aged_unordered_container.h>
@@ -19,5 +18,3 @@ template <
 using aged_unordered_multimap = detail::aged_unordered_container<true, true, Key, T, Clock, Hash, KeyEqual, Allocator>;
 }
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_AGED_UNORDERED_MULTISET_H_INCLUDED
-#define BEAST_CONTAINER_AGED_UNORDERED_MULTISET_H_INCLUDED
+#pragma once
 #include <xrpl/beast/container/detail/aged_unordered_container.h>
@@ -19,5 +18,3 @@ using aged_unordered_multiset =
 detail::aged_unordered_container<true, false, Key, void, Clock, Hash, KeyEqual, Allocator>;
 }
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_AGED_UNORDERED_SET_H_INCLUDED
-#define BEAST_CONTAINER_AGED_UNORDERED_SET_H_INCLUDED
+#pragma once
 #include <xrpl/beast/container/detail/aged_unordered_container.h>
@@ -18,5 +17,3 @@ template <
 using aged_unordered_set = detail::aged_unordered_container<false, false, Key, void, Clock, Hash, KeyEqual, Allocator>;
 }
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_DETAIL_AGED_ASSOCIATIVE_CONTAINER_H_INCLUDED
-#define BEAST_CONTAINER_DETAIL_AGED_ASSOCIATIVE_CONTAINER_H_INCLUDED
+#pragma once
 namespace beast {
 namespace detail {
@@ -33,5 +32,3 @@ struct aged_associative_container_extract_t<false>
 } // namespace detail
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_DETAIL_AGED_CONTAINER_ITERATOR_H_INCLUDED
-#define BEAST_CONTAINER_DETAIL_AGED_CONTAINER_ITERATOR_H_INCLUDED
+#pragma once
 #include <iterator>
 #include <type_traits>
@@ -146,5 +145,3 @@ private:
 } // namespace detail
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_DETAIL_AGED_ORDERED_CONTAINER_H_INCLUDED
-#define BEAST_CONTAINER_DETAIL_AGED_ORDERED_CONTAINER_H_INCLUDED
+#pragma once
 #include <xrpl/beast/clock/abstract_clock.h>
 #include <xrpl/beast/container/aged_container.h>
@@ -1687,5 +1686,3 @@ expire(
 }
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_CONTAINER_DETAIL_AGED_UNORDERED_CONTAINER_H_INCLUDED
-#define BEAST_CONTAINER_DETAIL_AGED_UNORDERED_CONTAINER_H_INCLUDED
+#pragma once
 #include <xrpl/beast/clock/abstract_clock.h>
 #include <xrpl/beast/container/aged_container.h>
@@ -2226,5 +2225,3 @@ expire(
 }
 } // namespace beast
-#endif

View File

@@ -4,8 +4,7 @@
 // Official repository: https://github.com/boostorg/beast
 //
-#ifndef BEAST_CONTAINER_DETAIL_EMPTY_BASE_OPTIMIZATION_H_INCLUDED
-#define BEAST_CONTAINER_DETAIL_EMPTY_BASE_OPTIMIZATION_H_INCLUDED
+#pragma once
 #include <boost/type_traits/is_final.hpp>
@@ -89,5 +88,3 @@ public:
 } // namespace detail
 } // namespace beast
-#endif

View File

@@ -2,8 +2,7 @@
 // Copyright (c) 2013 - Raw Material Software Ltd.
 // Please visit http://www.juce.com
-#ifndef BEAST_CORE_CURRENT_THREAD_NAME_H_INCLUDED
-#define BEAST_CORE_CURRENT_THREAD_NAME_H_INCLUDED
+#pragma once
 #include <boost/predef.h>
@@ -53,5 +52,3 @@ std::string
 getCurrentThreadName();
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_MODULE_CORE_TEXT_LEXICALCAST_H_INCLUDED
-#define BEAST_MODULE_CORE_TEXT_LEXICALCAST_H_INCLUDED
+#pragma once
 #include <xrpl/beast/utility/instrumentation.h>
@@ -208,5 +207,3 @@ lexicalCast(In in, Out defaultValue = Out())
 }
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_INTRUSIVE_LIST_H_INCLUDED
-#define BEAST_INTRUSIVE_LIST_H_INCLUDED
+#pragma once
 #include <iterator>
@@ -574,5 +573,3 @@ private:
 };
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_INTRUSIVE_LOCKFREESTACK_H_INCLUDED
-#define BEAST_INTRUSIVE_LOCKFREESTACK_H_INCLUDED
+#pragma once
 #include <atomic>
 #include <iterator>
@@ -268,5 +267,3 @@ private:
 };
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_MODULE_CORE_DIAGNOSTIC_SEMANTICVERSION_H_INCLUDED
-#define BEAST_MODULE_CORE_DIAGNOSTIC_SEMANTICVERSION_H_INCLUDED
+#pragma once
 #include <string>
 #include <vector>
@@ -95,5 +94,3 @@ operator<(SemanticVersion const& lhs, SemanticVersion const& rhs)
 }
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_HASH_HASH_APPEND_H_INCLUDED
-#define BEAST_HASH_HASH_APPEND_H_INCLUDED
+#pragma once
 #include <boost/container/flat_set.hpp>
 #include <boost/endian/conversion.hpp>
@@ -440,5 +439,3 @@ hash_append(HashAlgorithm& h, std::error_code const& ec)
 }
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_HASH_UHASH_H_INCLUDED
-#define BEAST_HASH_UHASH_H_INCLUDED
+#pragma once
 #include <xrpl/beast/hash/hash_append.h>
 #include <xrpl/beast/hash/xxhasher.h>
@@ -25,5 +24,3 @@ struct uhash
 };
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_HASH_XXHASHER_H_INCLUDED
-#define BEAST_HASH_XXHASHER_H_INCLUDED
+#pragma once
 #include <boost/endian/conversion.hpp>
@@ -152,5 +151,3 @@ public:
 };
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_INSIGHT_COLLECTOR_H_INCLUDED
-#define BEAST_INSIGHT_COLLECTOR_H_INCLUDED
+#pragma once
 #include <xrpl/beast/insight/Counter.h>
 #include <xrpl/beast/insight/Event.h>
@@ -120,5 +119,3 @@ public:
 } // namespace insight
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_INSIGHT_COUNTER_H_INCLUDED
-#define BEAST_INSIGHT_COUNTER_H_INCLUDED
+#pragma once
 #include <xrpl/beast/insight/CounterImpl.h>
@@ -94,5 +93,3 @@ private:
 } // namespace insight
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_INSIGHT_COUNTERIMPL_H_INCLUDED
-#define BEAST_INSIGHT_COUNTERIMPL_H_INCLUDED
+#pragma once
 #include <cstdint>
 #include <memory>
@@ -21,5 +20,3 @@ public:
 } // namespace insight
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_INSIGHT_EVENT_H_INCLUDED
-#define BEAST_INSIGHT_EVENT_H_INCLUDED
+#pragma once
 #include <xrpl/beast/insight/EventImpl.h>
@@ -61,5 +60,3 @@ private:
 } // namespace insight
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_INSIGHT_EVENTIMPL_H_INCLUDED
-#define BEAST_INSIGHT_EVENTIMPL_H_INCLUDED
+#pragma once
 #include <chrono>
 #include <memory>
@@ -21,5 +20,3 @@ public:
 } // namespace insight
 } // namespace beast
-#endif

View File

@@ -1,5 +1,4 @@
-#ifndef BEAST_INSIGHT_GAUGE_H_INCLUDED
-#define BEAST_INSIGHT_GAUGE_H_INCLUDED
+#pragma once
 #include <xrpl/beast/insight/GaugeImpl.h>
@@ -124,5 +123,3 @@ private:
 } // namespace insight
 } // namespace beast
-#endif

Some files were not shown because too many files have changed in this diff.
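Every hunk shown above applies the same mechanical edit: the `#ifndef`/`#define` pair at the top of a header and the matching `#endif` at the bottom are removed, and a single `#pragma once` takes their place. A minimal sketch of the pattern on a hypothetical header (the guard name and file contents below are illustrative, not taken from any file in this diff):

@@ -1,5 +1,4 @@
-#ifndef EXAMPLE_WIDGET_H_INCLUDED
-#define EXAMPLE_WIDGET_H_INCLUDED
+#pragma once
 
 #include <string>
 
@@ -9,5 +8,3 @@ struct Widget
     std::string name;
 };
 } // namespace example
-
-#endif

Note that `#pragma once` is not part of the C++ standard but is supported by all major compilers (GCC, Clang, MSVC), and it avoids the risk of mistyped or colliding guard macro names.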