Compare commits


313 Commits

Author SHA1 Message Date
ZhangxiangHu-Ripple
f615764e8a Merge remote-tracking branch 'upstream/ripple/se/supported' into zhang/groth16 2025-10-23 15:18:21 -07:00
Mayukha Vadari
d735f742a2 Merge branch 'ripple/smart-escrow' into ripple/se/supported 2025-10-23 15:42:08 -04:00
Mayukha Vadari
566b85b3d6 Merge branch 'ripple/se/fees' into ripple/smart-escrow 2025-10-23 15:41:53 -04:00
Mayukha Vadari
af6beb1d7c fix ledger used for rules 2025-10-23 15:41:32 -04:00
Mayukha Vadari
7d22fe804d Merge branch 'ripple/se/fees' into ripple/smart-escrow 2025-10-23 15:38:46 -04:00
Mayukha Vadari
ce19c13059 Merge branch 'ripple/wamr-host-functions' into ripple/se/fees 2025-10-23 15:38:37 -04:00
Mayukha Vadari
286dc6322b Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-10-23 15:38:28 -04:00
Mayukha Vadari
c9346cd40d Merge branch 'develop' into ripple/wamr 2025-10-23 15:38:04 -04:00
Pratik Mankawde
2bf77cc8f6 refactor: Retire fix1515 amendment (#5920)
Amendments activated for more than 2 years can be retired. This change retires the fix1515 amendment.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-23 13:35:54 +00:00
Ayaz Salikhov
5e33ca56fd Use "${ENVVAR}" instead of ${{ env.ENVVAR }} syntax in GitHub Actions (#5923) 2025-10-22 18:43:04 +00:00
Pratik Mankawde
7c39c810eb Moved fix1513 to retire state (#5919)
Signed-off-by: Pratik Mankawde <pmankawde@ripple.com>
2025-10-22 14:50:43 +00:00
ZhangxiangHu-Ripple
32f9c09536 update gmp and libsodium in conan.lock 2025-10-21 13:57:36 -07:00
ZhangxiangHu-Ripple
fe132244c3 Merge ripple/se/supported 2025-10-21 12:52:01 -07:00
Mayukha Vadari
ce6169d7b1 Merge remote-tracking branch 'upstream/ripple/smart-escrow' into ripple/se/supported 2025-10-21 14:00:15 -04:00
Valon Mamudi
a7792ebcae Add configurable NuDB block size feature (#5468)
As XRPL network demand grows and ledger sizes increase, the default 4K NuDB block size becomes a performance bottleneck, especially on high-performance storage systems. Modern SSDs and enterprise storage often perform better with larger block sizes, but rippled previously had no way to configure this parameter. This change therefore implements configurable NuDB block size support, allowing operators to optimize storage performance based on their hardware configuration. The feature adds a new `nudb_block_size` configuration parameter that enables block sizes from 4K to 32K bytes, with comprehensive validation and backward compatibility.

Specific changes are:
- Implements `parseBlockSize()` function with validation.
- Adds `nudb_block_size` configuration parameter.
- Supports block sizes from 4K to 32K (power of 2).
- Adds comprehensive logging and error handling.
- Maintains backward compatibility with 4K default.
- Adds unit tests for block size validation.
- Updates configuration documentation with performance guidance.
- Marks feature as experimental.
- Applies code formatting fixes.
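For illustration, a minimal sketch of the validation rule described above, with a hypothetical stand-in for `parseBlockSize()` (the real signature and error handling in rippled may differ):

```cpp
#include <cstddef>
#include <stdexcept>
#include <string>

// Accept only powers of two between 4K and 32K; fall back to the
// backward-compatible 4K default when the option is unset.
std::size_t
parseBlockSize(std::string const& value)
{
    constexpr std::size_t minSize = 4096;   // 4K, the historical default
    constexpr std::size_t maxSize = 32768;  // 32K upper bound
    if (value.empty())
        return minSize;
    std::size_t const size = std::stoull(value);
    bool const powerOfTwo = size != 0 && (size & (size - 1)) == 0;
    if (size < minSize || size > maxSize || !powerOfTwo)
        throw std::runtime_error(
            "nudb_block_size must be a power of 2 between 4096 and 32768");
    return size;
}
```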

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-21 00:51:44 +00:00
Bronek Kozicki
83ee3788e1 fix: Enforce reserve when creating trust line or MPToken in VaultWithdraw (#5857)
Similarly to other transaction types that can create a trust line or MPToken for the transaction submitter (e.g. CashCheck #5285, EscrowFinish #5185), VaultWithdraw should enforce the reserve before creating a new object. Additionally, the lsfRequireDestTag account flag should be enforced for the transaction submitter.
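A hedged sketch of the reserve rule described above, using stand-in types; the actual checks live in VaultWithdraw's transactor code:

```cpp
#include <cstdint>

enum TER { tesSUCCESS, tecNO_PERMISSION, tecINSUFFICIENT_RESERVE };

// Simplified stand-in for the ledger's fee/reserve schedule.
struct Fees
{
    std::uint64_t base = 1'000'000;     // illustrative base reserve, in drops
    std::uint64_t increment = 200'000;  // illustrative owner reserve increment
    std::uint64_t accountReserve(std::uint32_t ownerCount) const
    {
        return base + ownerCount * increment;
    }
};

TER
checkBeforeCreatingObject(
    Fees const& fees,
    std::uint64_t priorBalance,
    std::uint32_t ownerCount,
    bool requireDestTag,
    bool txHasDestTag)
{
    // Enforce the lsfRequireDestTag-style flag, per the fix.
    if (requireDestTag && !txHasDestTag)
        return tecNO_PERMISSION;
    // Creating a trust line or MPToken adds one owned object, so the
    // account must cover the reserve for ownerCount + 1.
    if (priorBalance < fees.accountReserve(ownerCount + 1))
        return tecINSUFFICIENT_RESERVE;
    return tesSUCCESS;
}
```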

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-20 23:07:12 +00:00
Mayukha Vadari
ae719b86d3 refactor: move server_definitions code to its own files (#5890) 2025-10-20 22:24:48 +00:00
Mayukha Vadari
dd722f8b3f chore: remove unnecessary LCOV_EXCL_LINE (#5913) 2025-10-20 22:23:52 +00:00
Mayukha Vadari
9cfb7ac340 fix build issue 2025-10-20 17:20:35 -04:00
Bart
30190a5feb chore: Set explicit timeouts for build and test jobs (#5912)
The default job timeout is 5 hours, while build times are anywhere between 4-20 mins and test times between 2-10 mins. As a runner occasionally gets stuck, we should fail much quicker.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-20 20:49:19 +00:00
Mayukha Vadari
3f71f643fa Merge branch 'ripple/smart-escrow' into ripple/se/supported 2025-10-20 14:51:24 -04:00
Mayukha Vadari
3a0e9aab4f Merge branch 'ripple/se/fees' into ripple/smart-escrow 2025-10-20 14:51:15 -04:00
ZhangxiangHu-Ripple
4818997fd9 remove conan lock file and change the cmake version to 3.5...3.25 2025-10-20 10:48:00 -07:00
ZhangxiangHu-Ripple
7ebeb374a2 Add gmp and libsodium from conan 2025-10-20 09:40:57 -07:00
Mayukha Vadari
43caa1ef29 Merge branch 'ripple/wamr-host-functions' into ripple/se/fees 2025-10-20 11:53:56 -04:00
Mayukha Vadari
1c5683ec78 Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-10-20 11:53:22 -04:00
Mayukha Vadari
9bee155d59 Merge branch 'develop' into ripple/wamr 2025-10-20 11:53:03 -04:00
Bart
afb6e0e41b chore: Set fail fast to false, except for when the merge group is used (#5897)
This PR sets the fail-fast strategy option to false (it defaults to true), unless it is run by a merge group.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-17 16:17:02 +00:00
Bart
5523557226 chore: Clean up Conan variables in CI (#5903)
This change sanitizes inputs by setting them as environment variables, and adjusts the number of CPUs used for building. Namely, GitHub inputs should be sanitized, per recommendation by Semgrep, as using them directly poses a security risk. A recent change further overrode the global configuration by having builds use all cores, but as we have noticed an increased number of job cancellations, this change updates it to use all cores less one.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-17 16:04:58 +00:00
Bart
b64707f53b chore: Add support for RHEL 8 (#5880)
Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-17 14:09:47 +00:00
Ayaz Salikhov
0b113f371f refactor: Update pre-commit workflow to latest version (#5902)
Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-17 13:40:10 +00:00
tequ
b4c894c1ba refactor: Autofill signature for Simulate RPC (#5852)
This change enables autofilling of signature-related fields in the Simulate RPC.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-16 21:18:53 +00:00
Mayukha Vadari
f34b05f4de Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-10-16 12:12:05 -04:00
Mayukha Vadari
97ce25f4ce Merge branch 'develop' into ripple/wamr 2025-10-16 12:11:55 -04:00
Mayukha Vadari
92281a4ede refactor: replace string JSONs with Json::Value (#5886)
Some tests write out JSON as strings instead of using the Json::Value library; this change cleans them up.
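An illustrative before/after of this kind of cleanup, using a jsoncpp-style `Json::Value` (rippled bundles its own implementation with a similar interface):

```cpp
#include <json/value.h>

Json::Value
makeParams()
{
    // Before: building JSON by string, with error-prone manual quoting:
    //   std::string params = "{\"ledger_index\": \"validated\"}";

    // After: composing the same object with Json::Value.
    Json::Value params;
    params["ledger_index"] = "validated";
    return params;
}
```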

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-16 16:02:25 +00:00
Bronek Kozicki
e80642fc12 fix: Fix regression in ConnectAttempt (#5900)
A regression was introduced in #5669 which could cause rippled to dereference a disengaged std::optional when connecting to a peer. This causes UB in release builds and a crash in debug builds.
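A generic illustration of the bug class, not the actual ConnectAttempt code:

```cpp
#include <optional>
#include <string>

// Dereferencing a disengaged std::optional (e.g. *ep without checking
// ep.has_value()) is undefined behavior in release builds and typically
// hits an assert in debug builds.
std::string
endpointName(std::optional<std::string> const& ep)
{
    // Safe pattern: check engagement, or supply a fallback.
    return ep.value_or("unknown");
}
```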

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-16 12:54:36 +00:00
Mayukha Vadari
640ce4988f refactor: replace boost::lexical_cast<std::string> with to_string (#5883)
This change replaces boost::lexical_cast<std::string> with to_string in some of the tests to make them more readable.
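An illustrative before/after of the substitution (rippled types may also use an ADL-found `to_string` overload):

```cpp
#include <string>

std::string
format(int seq)
{
    // Before: return boost::lexical_cast<std::string>(seq);
    // After: shorter and clearer for simple integral conversions.
    return std::to_string(seq);
}
```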

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-16 12:46:21 +00:00
ZhangxiangHu-Ripple
8d8a0f4471 enable smart escrow and change variables 2025-10-15 16:59:16 -07:00
ZhangxiangHu-Ripple
d376f8ae87 Add bn254 curve host functions 2025-10-15 16:02:32 -07:00
Mayukha Vadari
a422855ea7 refactor: replace JSON LastLedgerSequence with last_ledger_seq (#5884)
This change replaces instances of JSON LastLedgerSequence with last_ledger_seq, which makes the tests a bit simpler and easier to read.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-15 20:55:11 +00:00
Jingchen
108f90586c chore: Reduce build log verbosity on Windows (#5865)
Windows is extremely chatty and generates tons of logs when building, making it practically impossible to use the build logs to debug issues. This change sets the verbosity to 'quiet' on Windows.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-15 20:53:01 +00:00
tequ
519d1dbc34 refactor: Replace fee().accountReserve(0) with fee().reserve (#5843)
This PR changes fee().accountReserve(0) to fee().reserve, as the current network reserve amount should be used instead of the account reserve.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-15 20:50:06 +00:00
Bart
3d44758e5a fix: Update tools image shas (#5896)
This change updates the Docker image hashes of the tools-rippled images to fix a missing dependency.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-15 18:23:44 +00:00
Mayukha Vadari
3b6cd22e32 fix 2025-10-14 18:10:11 -04:00
Michael Legleux
97bc94a7f6 feat: Install validator-keys (#5841)
* feat: Install validator-keys

* output validator-keys with everything else
2025-10-14 22:02:38 +00:00
Mayukha Vadari
c9c35780d2 reduce diff more 2025-10-14 18:00:58 -04:00
Mayukha Vadari
17f401f374 reduce diff 2025-10-14 17:58:41 -04:00
Mayukha Vadari
c6c54b3282 resolve todos 2025-10-14 17:56:12 -04:00
zingero
34619f2504 docs: Fix typo in JSON writer documentation (#5881)
Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-14 19:10:19 +00:00
tequ
3509de9c5f refactor: Add paychan namespace and update related tests (#5840)
This change adds a paychan namespace to the TestHelpers and implementation files, improving organization and clarity. Additionally, it updates the AMM test to use the new `paychan::create` function for payment channel creation.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-14 18:37:39 +00:00
Ayaz Salikhov
459d0da010 chore: Support CMake 4 without workarounds (#5866) 2025-10-14 11:18:34 -04:00
Mayukha Vadari
91455b6860 respond to comments 2025-10-13 15:20:03 -04:00
Olek
9e14c14a26 Use xrplf conan repo for wamr (#5862) 2025-10-13 15:11:21 -04:00
Mayukha Vadari
4b32cd16e7 Merge branch 'ripple/smart-escrow' into ripple/se/supported 2025-10-13 14:02:39 -04:00
Mayukha Vadari
f16f243c22 Merge branch 'ripple/se/fees' into ripple/smart-escrow 2025-10-13 14:01:11 -04:00
Mayukha Vadari
fe601308e7 Merge branch 'ripple/wamr-host-functions' into ripple/se/fees 2025-10-13 13:58:09 -04:00
Mayukha Vadari
c507880d8f Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-10-13 13:57:22 -04:00
Mayukha Vadari
3f8328bbf8 Merge branch 'develop' into ripple/wamr 2025-10-13 13:55:07 -04:00
Mayukha Vadari
8637d606a4 chore: Exclude code/unreachable transaction code from Codecov (#5847)
This change excludes from Codecov unreachable/difficult-to-test transaction code (such as `tecINTERNAL`) and old code (from amendments that have been enabled for a long time that are only around for ledger replay reasons). This removes about 200 lines of misses and increases the Codecov coverage by 0.3% (79.2% to 79.5%).
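For context, a hedged sketch of the lcov exclusion markers this kind of change relies on (a single line can instead be excluded with `LCOV_EXCL_LINE`); the regions actually excluded in rippled differ:

```cpp
// Hypothetical transactor snippet with defensive, normally unreachable code.
enum TER { tesSUCCESS, tecINTERNAL };

TER
applyStep(bool internalStateOk)
{
    if (!internalStateOk)
    {
        // LCOV_EXCL_START
        return tecINTERNAL;  // defensive path, unreachable in normal processing
        // LCOV_EXCL_STOP
    }
    return tesSUCCESS;
}
```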
2025-10-13 14:56:18 +00:00
Bart
8456b8275e chore: Add wildcard to support triggering for release pipelines (#5879)
This change adds a wildcard to the release branch in the CI pipeline spec. Namely, after adopting an improved release process, with release branches that now look like release-X.Y, the trigger pipeline was no longer running as it only searched for an exact match to release.
2025-10-10 12:22:42 -04:00
Bart
3c88786bb0 refactor: Downgrades OpenSSL to 3.5.4 (#5878)
This change downgrades OpenSSL from 3.6.0 to 3.5.4. To avoid potential zero-day issues in a new major version of OpenSSL, it is safer to stick with 3.5.4. While 3.6.0 has some nice new features, such as improved SHA512 hashing, it also introduces new features that could contain bugs. In contrast, 3.5.4 has seen quite a few bug fixes over 3.5.0 and has been used in the wild for a while now.
2025-10-10 14:18:24 +00:00
Mayukha Vadari
3bd8a4721f Merge branch 'ripple/smart-escrow' into ripple/se/supported 2025-10-09 17:13:18 -04:00
Mayukha Vadari
e41f6a71b7 Merge branch 'ripple/se/fees' into ripple/smart-escrow 2025-10-09 17:13:13 -04:00
Mayukha Vadari
51f1be7f5b Merge branch 'ripple/wamr-host-functions' into ripple/se/fees 2025-10-09 17:10:52 -04:00
Mayukha Vadari
c10a5f9ef6 Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-10-09 17:10:31 -04:00
Mayukha Vadari
3c141de695 Merge branch 'develop' into ripple/wamr 2025-10-09 16:52:25 -04:00
Bart
46ba8a28fe refactor: Update Conan dependencies: OpenSSL (#5873)
This change bumps OpenSSL from 1.1.1w to 3.6.0.
2025-10-09 13:27:26 -04:00
Bronek Kozicki
5ecde3cf39 Add vault invariants (#5518)
This change adds invariants for SingleAssetVault #5224 (XLS-065), which had been intentionally skipped earlier to keep the SAV PR size manageable.
2025-10-08 15:04:02 +00:00
tequ
620fb26823 test: Add more tests for Simulate RPC metadata (#5827) 2025-10-08 14:36:09 +00:00
Bronek Kozicki
6b6b213cf5 chore: Fix release build error (#5864)
This change fixes a release build error with GCC 15.2.

The `fields` variable is only used in `XRPL_ASSERT`, which evaluates to nothing in a Release build, leaving the variable unused. This change silences the build warning.
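A generic illustration of the warning and the usual fix, using plain `assert` as a stand-in for `XRPL_ASSERT`:

```cpp
#include <cassert>

// With NDEBUG defined (a Release build), assert expands to nothing, so
// `fields` would otherwise trigger -Wunused-variable.
void
example(int input)
{
    [[maybe_unused]] auto const fields = input * 2;
    assert(fields >= 0);
}
```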

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-08 13:45:44 +00:00
Bart
f61086b43c refactor: Update CI strategy matrix to use new RHEL 9 and RHEL 10 images (#5856)
This change uses the new RHEL 9 and 10 images to build and test the binary, and adds support for having different Docker image SHAs per distro-compiler combination.

Instead of supporting each RHEL minor version, we are simplifying our pipelines by only supporting RHEL major versions. Our CI Docker images have already been updated accordingly, and we recently added support for RHEL 10 as well. Up until now, the CI Docker images had all been rebuilt at the same time, but that is no longer guaranteed: the most recent push to the CI repo resulted in the RHEL images having a different SHA than the Debian and Ubuntu ones.

Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
2025-10-08 13:15:24 +00:00
Mayukha Vadari
176fd2b6e4 chore: exclude all UNREACHABLE blocks from codecov (#5846) 2025-10-08 09:25:51 +01:00
Bart
2df730438d Set version to 3.0.0-b1 (#5859) 2025-10-07 20:28:19 +00:00
Mayukha Vadari
0fd535496b Merge branch 'ripple/smart-escrow' into ripple/se/supported 2025-10-06 17:01:38 -04:00
Mayukha Vadari
db263b696c Merge branch 'ripple/se/fees' into ripple/smart-escrow 2025-10-06 17:01:32 -04:00
Mayukha Vadari
f57b855d74 Merge branch 'ripple/wamr-host-functions' into ripple/se/fees 2025-10-06 17:01:17 -04:00
Mayukha Vadari
da2b9455f2 fix: remove get_ledger_account_hash and get_ledger_tx_hash host functions (#5850)
* remove `get_ledger_account_hash` and `get_ledger_tx_hash`

* fix build+tests
2025-10-06 16:38:40 -04:00
Mayukha Vadari
86525d8583 test: add tests for fee voting (#5747) 2025-10-06 16:32:04 -04:00
Mayukha Vadari
c41e52f57a Move Smart Escrow tests to separate file (#5849) 2025-10-06 16:27:21 -04:00
Mayukha Vadari
55772a0d07 add sfData preflight checks + tests (#5839) 2025-10-02 17:50:43 -04:00
Mayukha Vadari
f8651a266c Merge branch 'ripple/smart-escrow' into ripple/se/supported 2025-10-02 15:52:38 -04:00
Mayukha Vadari
965a9e89ac Merge remote-tracking branch 'upstream/ripple/se/fees' into ripple/smart-escrow 2025-10-02 15:51:35 -04:00
Mayukha Vadari
51ee06429b Merge branch 'ripple/wamr-host-functions' into ripple/se/fees 2025-10-02 14:35:36 -04:00
Mayukha Vadari
cb622488c0 Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-10-02 14:35:25 -04:00
Mayukha Vadari
32f971fec6 Merge branch 'develop' into ripple/wamr 2025-10-02 14:35:13 -04:00
Bronek Kozicki
5d79bfc531 Remove bogus coverage warning (#5838) 2025-10-02 11:54:09 +01:00
Ed Hennis
51ef35ab55 fix: Transaction sig checking functions do not get a full context (#5829)
Fixes a (currently harmless) bug introduced by PR #5594
2025-10-01 20:58:43 +00:00
Valentin Balaschenko
330a3215bc fix: FD/handle guarding + exponential backoff (#5823) 2025-10-01 12:57:33 +01:00
Ed Hennis
85c2ceacde Merge tag '2.6.1' into ximinez/merge261
2.6.1

* tag '2.6.1':
  Set version to 2.6.1
  Set version to 2.6.1-rc2
  Mark PermissionDelegation as unsupported
2025-09-30 19:10:51 -04:00
yinyiqian1
8e4fda160d Rename flags for DynamicMPT (#5820) 2025-09-30 18:49:53 +00:00
Mayukha Vadari
ca85d09f02 Merge branch 'ripple/wamr-host-functions' into ripple/se/fees 2025-09-30 14:43:24 -04:00
Mayukha Vadari
8dea76baa4 Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-09-30 14:42:49 -04:00
Mayukha Vadari
299fbe04c4 Merge branch 'develop' into ripple/wamr 2025-09-30 14:42:24 -04:00
Bart
072b1c442c chore: Set free-form CI inputs as env vars (#5822)
This change moves CI values that could be user-provided into environment variables.
2025-09-30 19:46:10 +02:00
Ayaz Salikhov
294e03ecf5 ci: Upload artifacts during build and test in a separate job (#5817) 2025-09-30 16:15:24 +00:00
Ed Hennis
550f90a75e refactor: Add support for extra transaction signatures (#5594)
* Restructures Transactor signature checking code to be able to handle a `sigObject`, which may be the full transaction, or may be an object field containing a separate signature. Either way, the `sigObject` can be a single- or multi-sign signature.
2025-09-29 22:11:53 +00:00
Ed Hennis
d67dcfe3c4 refactor: Restructure Transactor::preflight to reduce boilerplate (#5592)
* Restructures `Transactor::preflight` to create several functions that will remove the need for error-prone boilerplate code in derived classes' implementations of `preflight`.
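A hedged sketch of the boilerplate pattern being factored out, with stand-in types; rippled's real helpers are `preflight1` (common field checks) and `preflight2` (signature checks) on a `PreflightContext`:

```cpp
enum NotTEC { tesSUCCESS, temDISABLED, temMALFORMED };

struct PreflightContext
{
    bool featureEnabled = true;
    bool fieldsValid = true;
};

NotTEC preflight1(PreflightContext const& ctx)  // stand-in: common checks
{
    return ctx.fieldsValid ? tesSUCCESS : temMALFORMED;
}

NotTEC preflight2(PreflightContext const&)      // stand-in: signature checks
{
    return tesSUCCESS;
}

// Every derived transactor currently repeats this framing; the refactor
// hoists it so subclasses only supply the transaction-specific middle part.
NotTEC
examplePreflight(PreflightContext const& ctx)
{
    if (!ctx.featureEnabled)
        return temDISABLED;
    if (auto const ret = preflight1(ctx); ret != tesSUCCESS)
        return ret;
    // ... transaction-specific checks would go here ...
    return preflight2(ctx);
}
```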
2025-09-29 17:31:42 -04:00
Mayukha Vadari
57fc1df7d7 switch from wasm32-unknown-unknown to wasm32v1-none (#5814) 2025-09-29 15:43:22 -04:00
Mayukha Vadari
8d266d3941 remove STInt64 (#5815) 2025-09-29 15:43:10 -04:00
Mayukha Vadari
0fd2f715bb switch fixIncludeKeyletFields to Supported::yes (#5819) 2025-09-27 09:04:04 +02:00
Mayukha Vadari
0f6a55826b Mark SmartEscrow as Supported::yes 2025-09-26 17:57:11 -04:00
Mayukha Vadari
a865b4da1c Merge branch 'ripple/se/fees' into ripple/smart-escrow 2025-09-26 17:10:10 -04:00
Mayukha Vadari
e59f5f3b01 fix tests 2025-09-26 17:09:53 -04:00
Mayukha Vadari
8729688feb Merge remote-tracking branch 'upstream/ripple/se/fees' into ripple/smart-escrow 2025-09-26 16:56:51 -04:00
Mayukha Vadari
c8b06e7de1 Merge branch 'ripple/wamr-host-functions' into ripple/se/fees 2025-09-26 16:50:34 -04:00
Mayukha Vadari
eaba76f9e6 Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-09-26 16:37:25 -04:00
Mayukha Vadari
cb702cc238 Merge branch 'develop' into ripple/wamr 2025-09-26 16:37:04 -04:00
Mayukha Vadari
807462b191 Add STInt32 as a new SType (#5788)
This change adds `STInt32` as a new `SType` under the `STInteger` umbrella, with `SType` value `12`. This is the first and only `STInteger` type that supports negative values.
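A simplified stand-in sketching the pattern: rippled's integer STypes are instantiations of an `STInteger` template, and `STInt32` adds a signed instantiation (the real class also carries serialization logic and field metadata):

```cpp
#include <cstdint>

template <typename Integer>
struct STInteger
{
    Integer value{};
};

using STUInt32 = STInteger<std::uint32_t>;  // existing unsigned variant
using STInt32 = STInteger<std::int32_t>;    // new signed variant, SType 12
```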
2025-09-26 20:13:15 +00:00
Mayukha Vadari
f1f798bb85 Merge branch 'ripple/se/fees' into ripple/smart-escrow 2025-09-26 15:52:22 -04:00
Mayukha Vadari
c3fd52c177 Merge branch 'ripple/wamr-host-functions' into ripple/se/fees 2025-09-26 15:51:56 -04:00
Mayukha Vadari
b69b4a0a4a Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-09-26 15:51:48 -04:00
Mayukha Vadari
50d6072a73 Merge branch 'develop' into ripple/wamr 2025-09-26 15:51:40 -04:00
Ayaz Salikhov
19c4226d3d ci: Call all reusable workflows reusable (#5818) 2025-09-26 18:33:42 +00:00
Mayukha Vadari
d02c306f1e test: add more comprehensive tests for FeeVote (#5746)
This change adds more comprehensive tests for the `FeeVote` module, which previously only checked the basics, and not the more comprehensive flows in that class.
2025-09-26 17:40:19 +00:00
Jingchen
cfd26f444c fix: Address http header case sensitivity (#5767)
This change makes the regex in `HttpClient.cpp` that matches the Content-Length HTTP header case insensitive to improve compatibility, as HTTP headers are case insensitive.
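An illustration of the technique, not the exact regex from `HttpClient.cpp`:

```cpp
#include <regex>
#include <string>

// Match the Content-Length header case-insensitively, since HTTP header
// names are case-insensitive per the spec.
bool
isContentLength(std::string const& headerLine)
{
    static std::regex const re(
        "^content-length:", std::regex_constants::icase);
    return std::regex_search(headerLine, re);
}
```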
2025-09-26 11:40:43 +00:00
tequ
2c3024716b change fixPriceOracleOrder to Supported::yes (#5749) 2025-09-26 12:07:48 +01:00
Bart
a12f5de68d chore: Pin all CI Docker tags (#5813)
To avoid surprises and ensure reproducibility, this change pins all CI Docker image tags to the latest version in the XRPLF/CI repo.
2025-09-25 16:08:07 +00:00
Bronek Kozicki
51c5f2bfc9 Improve ValidatorList invalid UNL manifest logging (#5804)
This change raises logging severity from `INFO` to `WARN` when handling a UNL manifest signed with an unexpected or invalid key. It also changes the internal error code for an invalid format of UNL manifest to `invalid` (from `untrusted`).

This is a follow-up to problems experienced by a UNL node due to an old manifest key configured in `validators.txt`, which would have been easier to diagnose with improved logging.

It also replaces a log line with `UNREACHABLE` for an impossible situation where a UNL manifest key is matched against a configured key that has an invalid type (we cannot configure such a key because of checks when loading configured keys).
2025-09-25 16:14:29 +02:00
Olek
d24cd50e61 Switch to own wamr fork (#5808) 2025-09-23 16:39:21 -04:00
Mayukha Vadari
85bff20ae5 Merge branch 'ripple/se/fees' into ripple/smart-escrow 2025-09-23 10:22:30 -04:00
Valentin Balaschenko
73ff54143d docs: Add warning about using std::counting_semaphore (#5595)
This adds a comment to avoid using `std::counting_semaphore` until the minimum compiler versions of GCC and Clang have been updated to no longer contain the bug that is present in older compilers.
2025-09-23 13:26:26 +02:00
Mayukha Vadari
737fab5471 fix issues 2025-09-22 23:55:57 -04:00
Mayukha Vadari
e6592e93a9 Merge branch 'ripple/se/fees' into ripple/smart-escrow 2025-09-22 18:32:36 -04:00
Mayukha Vadari
5a6c4e8ae0 Merge branch 'ripple/wamr-host-functions' into ripple/se/fees 2025-09-22 18:23:54 -04:00
Mayukha Vadari
9f5875158c Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-09-22 18:23:45 -04:00
Mayukha Vadari
c3dc33c861 Merge branch 'develop' into ripple/wamr 2025-09-22 18:23:35 -04:00
Mayukha Vadari
7420f47658 SmartEscrow fee voting changes 2025-09-22 18:22:44 -04:00
Bart
08b136528e Revert "Update Conan dependencies: OpenSSL" (#5807)
This change reverts #5617, because it will require extensive testing that will take up more time than we have before the next scheduled release.

Reverting this change does not mean we are abandoning it. We aim to pick it back up once there's a sufficient time window to allow for testing on multiple distros running a mixture of OpenSSL 1.x and 3.x.
2025-09-22 18:27:02 +00:00
Mayukha Vadari
6b8a589447 test: Add STInteger and STParsedJSON tests (#5726)
This change is to improve code coverage (and to simplify #5720 and #5725); there is otherwise no change in functionality. It adds basic tests for `STInteger` and `STParsedJSON`, making it easier to test smaller changes to those types, and removes `STParsedJSONArray`, since it is not used anywhere (including in Clio).
2025-09-22 20:00:31 +02:00
Olek
6be8f2124c Latest HF perf test (#5789) 2025-09-18 15:51:39 -04:00
Mayukha Vadari
edfed06001 fix merge issues 2025-09-18 15:39:49 -04:00
Mayukha Vadari
1c646dba91 Merge remote-tracking branch 'upstream/ripple/wamr' into wamr-host-functions 2025-09-18 15:29:02 -04:00
Mayukha Vadari
6781068058 Merge branch 'develop' into ripple/wamr 2025-09-18 15:27:54 -04:00
Ed Hennis
ffeabc9642 refactor: Simplify STParsedJSON with some helper functions (#5591)
- Add code coverage for STParsedJSON edge cases

Co-authored-by: Denis Angell <dangell@transia.co>
2025-09-18 19:04:40 +00:00
Mayukha Vadari
cfe57c1dfe Merge branch 'ripple/wamr' into ripple/wamr-host-functions 2025-09-18 14:37:58 -04:00
Mayukha Vadari
c34d09a971 Merge branch 'develop' into ripple/wamr 2025-09-18 14:24:34 -04:00
Ed Hennis
3cbdf818a7 Miscellaneous refactors and updates (#5590)
- Added a new Invariant: `ValidPseudoAccounts`, which checks that all pseudo-accounts behave consistently through creation and updates, and that no "real" accounts look like pseudo-accounts (which means they don't have a 0 sequence).
- `to_short_string(base_uint)`. Like `to_string`, but only returns the first 8 characters. (Similar to how a git commit ID can be abbreviated.) Used as a wrapped sink to prefix most transaction-related messages. More can be added later. (See the sketch after this list.)
- `XRPL_ASSERT_PARTS`. Convenience wrapper for `XRPL_ASSERT`, which takes the `function` and `description` as separate parameters.
- `SField::sMD_PseudoAccount`. Metadata option for `SField` definitions marking that the field, if set on an `AccountRoot`, indicates that the account is a pseudo-account. Removes the need for hard-coded field lists all over the place. Added the flag to `AMMID` and `VaultID`.
- Added functionality to the `SField` ctor to detect both code and name collisions using asserts, and to require all SFields to have a name.
- Convenience type aliases `STLedgerEntry::const_pointer` and `STLedgerEntry::const_ref`. (`SLE` is an alias to `STLedgerEntry`.)
- Generalized `feeunit.h` (`TaggedFee`) into `unit.h` (`ValueUnit`) and added new "BIPS"-related tags for future use. Also refactored the type restrictions to use Concepts.
- Restructured `transactions.macro` to do two big things:
    1. Include the `#include` directives for transactor header files directly in the macro file. Removes the need to update `applySteps.cpp` and the resulting conflicts.
    2. Added a `privileges` parameter to the `TRANSACTION` macro, which specifies some of the operations a transaction is allowed to do. These `privileges` are enforced by invariant checks. Again, this removes the need to update scattered lists of transaction types in various checks.
- Unit tests:
    1. Moved more helper functions into `TestHelpers.h` and `.cpp`.
    2. Cleaned up the namespaces to prevent / mitigate random collisions and ambiguous symbols, particularly in unity builds.
    3. Generalized `Env::balance` to add support for `MPTIssue` and `Asset`.
    4. Added a set of helper classes to simplify `Env` transaction parameter classes: `JTxField`, `JTxFieldWrapper`, and a bunch of classes derived or aliased from it. For an example of how much simpler this is, check the changes in `src/test/jtx/escrow.h` for the definitions of `finish_time`, `cancel_time`, `condition`, and `fulfillment`.
    5. Generalized several of the amount-related helper classes to understand `Asset`s.
    6. `env.balance` for an MPT issuer will return a negative number (or 0) for consistency with IOUs.
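A hedged sketch of the `to_short_string` idea from the list above; rippled's version operates on `base_uint`, while this stand-in takes the already-formatted hex string:

```cpp
#include <string>

// Abbreviate a hex ID to its first 8 characters, similar to how a git
// commit ID can be abbreviated.
std::string
to_short_string(std::string const& fullHex)
{
    return fullHex.substr(0, 8);
}
```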
2025-09-18 17:55:49 +00:00
Ed Hennis
bd834c87e0 Merge tag '2.6.1-rc1' into ximinez/merge-261rc1
2.6.1-rc1

* tag '2.6.1-rc1':
  Set version to 2.6.1-rc1
  Downgrade to boost 1.83
2025-09-18 11:46:22 -04:00
Jingchen
dc8b37a524 refactor: Modularise ledger (#5493)
This change moves the ledger code to libxrpl.
2025-09-18 11:12:24 -04:00
Bronek Kozicki
617a895af5 chore: Add unit tests dir to code coverage excludes (#5803)
This change excludes unit test code from code coverage reporting.
2025-09-18 06:30:34 -04:00
Bart
1af1048c58 chore: Build and test all configs for daily scheduled run (#5801)
This change re-enables building and testing all configurations, but only for the daily scheduled run. Previously all configurations were run for each merge into the develop branch, but that overwhelmed both the GitHub runners and the Conan remote, and thus they were limited to just a subset of configurations. Now that the number of jobs is limited via `max-parallel: 10`, we should be able to safely enable building all configurations again. However, building them all once a day instead of for each PR merge should be sufficient.
2025-09-17 19:17:48 -04:00
Ed Hennis
f07ba87e51 Merge tag '2.5.1' into upstream--develop
- Ensures the commits don't get orphaned, even though the relevant code
  changes are already included.

* tag '2.5.1':
  Set version to 2.5.1
  Fix: Don't flag consensus as stalled prematurely (#5658)
2025-09-17 19:05:14 -04:00
Bart
e66558a883 chore: Limits CI build and test parallelism to reduce resource contention (#5799)
GitHub runners have a limit on how many concurrent jobs they can actually process (even though they will try to run them all at the same time), and similarly the Conan remote cannot handle hundreds of concurrent requests. Previously, the Conan dependency uploading was already limited to max 10 jobs running in parallel, and this change applies the same limit to the build+test workflow.
2025-09-17 22:55:00 +00:00
Mayukha Vadari
510314d344 fix(amendment): Add missing fields for keylets to ledger objects (#5646)
This change adds a fix amendment (`fixIncludeKeyletFields`) that adds:
* `sfSequence` to `Escrow` and `PayChannel`
* `sfOwner` to `SignerList`
* `sfOracleDocumentID` to `Oracle`

This ensures that all ledger entries hold all the information needed to determine their keylet.
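A hedged illustration of why these fields matter, using a stand-in hash (rippled actually derives keylets with SHA-512Half over a keylet space prefix plus the inputs):

```cpp
#include <cstdint>
#include <functional>
#include <string>

// An Escrow's ledger key is derived from the owner account and the sequence
// number of the creating transaction; without sfSequence stored on the
// entry, the key cannot be recomputed from the entry alone.
std::size_t
escrowKeyletFor(std::string const& ownerAccount, std::uint32_t sequence)
{
    return std::hash<std::string>{}(
        ownerAccount + ":" + std::to_string(sequence));
}
```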
2025-09-17 21:34:47 +00:00
yinyiqian1
37b951859c Rename mutable flags (#5797)
This is a minor change on top of #5705
2025-09-17 21:43:04 +01:00
Jingchen
9494fc9668 chore: Use self hosted windows runners (#5780)
This change switches from the GitHub-managed Windows runners to self-hosted runners to significantly reduce build time.
2025-09-17 09:29:15 -04:00
Vito Tumas
17a2606591 Bugfix: Adds graceful peer disconnection (#5669)
XRPL nodes establish connections in three stages: first a TCP connection, then a TLS/SSL handshake to secure the connection, and finally an upgrade to the bespoke XRP Ledger peer-to-peer protocol. During connection termination, xrpld directly closes the TCP connection, bypassing the TLS/SSL shutdown handshake. This makes peer disconnection diagnostics more difficult: abrupt TCP termination appears as if the peer crashed rather than disconnected gracefully.

This change refactors the connection lifecycle with the following changes:
- Enhanced outgoing connection logic with granular timeouts for each connection stage (TCP, TLS, XRPL handshake) to improve diagnostic capabilities
- Updated both PeerImp and ConnectAttempt to use proper asynchronous TLS shutdown procedures for graceful connection termination
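A hedged sketch of the graceful-shutdown idea using Boost.Asio directly; details differ from rippled's PeerImp and ConnectAttempt implementations:

```cpp
#include <boost/asio.hpp>
#include <boost/asio/ssl.hpp>

using SslStream = boost::asio::ssl::stream<boost::asio::ip::tcp::socket>;

// Run the TLS close_notify exchange asynchronously before closing the TCP
// socket, instead of closing the socket outright.
void
gracefulClose(SslStream& stream)
{
    stream.async_shutdown([&stream](boost::system::error_code const& ec) {
        // Once the TLS shutdown handshake completes (or fails), it is safe
        // to close the underlying TCP socket.
        boost::system::error_code ignored;
        stream.lowest_layer().close(ignored);
    });
}
```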
2025-09-16 10:51:55 +01:00
yinyiqian1
ccb9f1e42d Support DynamicMPT XLS-94d (#5705)
* extends the functionality of the MPTokenIssuanceSet transaction, allowing the issuer to update fields or flags that were explicitly marked as mutable during creation.
2025-09-15 19:42:36 +00:00
Bart
3e4e9a2ddc Only notify clio for PRs targeting the release and master branches (#5794)
Clio should only be notified when releases are about to be made, instead of for all PRs, so this change only notifies Clio when a PR targets the release or master branch.
2025-09-15 13:28:47 -04:00
Bart
4caebfbd0e refactor: Wrap GitHub CI conditionals in curly braces (#5796)
This change wraps all GitHub conditionals in `${{ .. }}`, both for consistency and to reduce unexpected failures, because it was previously noticed that not all conditionals work without those curly braces.
2025-09-15 16:26:08 +00:00
Denis Angell
37c377a1b6 Fix: EscrowTokenV1 (#5571)
* resolves an accounting inconsistency in MPT escrows where transfer fees were not properly handled when unlocking escrowed tokens.
2025-09-15 14:48:47 +00:00
Jingchen
bd182c0a3e fix: Skip processing transaction batch if the batch is empty (#5670)
Avoids an assertion failure in NetworkOPsImp::apply in the unlikely event that all incoming transactions are invalid.
2025-09-15 13:51:19 +00:00
Ayaz Salikhov
406c26cc72 ci: Fix conan secrets in upload-conan-deps (#5785)
- Accounts for some variables that were changed and missed when the reusable workflow was removed.
2025-09-12 17:09:42 +00:00
Jingchen
9bd1ce436a Fix code coverage error (#5765)
* Fix the issue where COVERAGE_CXX_COMPILER_FLAGS is never used
2025-09-12 15:13:27 +00:00
Mayukha Vadari
ebd90c4742 chore: remove unneeded float stuff (#5729) 2025-09-11 18:41:24 -04:00
Mayukha Vadari
ba52d34828 test: improve codecov in HostFuncWrapper.cpp (#5730) 2025-09-11 18:09:08 -04:00
Mayukha Vadari
2f869b3cfc Merge branch 'develop' into ripple/smart-escrow 2025-09-11 16:41:14 -04:00
Mayukha Vadari
ffa21c27a7 fix test 2025-09-11 16:40:59 -04:00
Mayukha Vadari
1b6312afb3 rearrange files 2025-09-11 16:34:03 -04:00
Mayukha Vadari
bf32dc2e72 add fixtures files 2025-09-11 16:28:11 -04:00
Mayukha Vadari
a15d65f7a2 update tests 2025-09-11 16:20:33 -04:00
Mayukha Vadari
2de8488855 add temBAD_WASM 2025-09-11 16:02:17 -04:00
Mayukha Vadari
129aa4bfaa bring out IOUAmount.h 2025-09-11 13:18:42 -04:00
Ayaz Salikhov
f69ad4eff6 docs: Add remote to conan lock create command (#5770)
* docs: Add remote to `conan lock create` command
* Document error resolution for conan package issues
* Update BUILD.md
* Add more info about lockfiles
2025-09-11 15:42:27 +00:00
Mayukha Vadari
6fe0599cc2 refactor: clean up CTID.h (#5681) 2025-09-11 14:49:26 +00:00
tequ
e6f8bc720f Add additional metadata to simulate response (#5754) 2025-09-11 15:17:06 +01:00
Ayaz Salikhov
fbd60fc000 ci: Use pre-commit reusable workflow (#5772) 2025-09-11 13:58:11 +01:00
Mayukha Vadari
b1d70db63b limits 2025-09-10 15:05:06 -04:00
Mayukha Vadari
f03c3aafe4 misc host function files 2025-09-10 15:02:48 -04:00
Mayukha Vadari
51a9f106d1 CODEOWNERS 2025-09-10 14:59:09 -04:00
Mayukha Vadari
bfc048e3fe add tests 2025-09-10 14:57:23 -04:00
Mayukha Vadari
83418644f7 add host functions 2025-09-10 14:56:21 -04:00
Mayukha Vadari
dbc9dd5bfc Add WAMR integration code 2025-09-10 14:56:08 -04:00
Mayukha Vadari
5c480cf883 Merge branch 'develop' into ripple/smart-escrow 2025-09-10 14:51:45 -04:00
Mayukha Vadari
45ab15d4b5 add WAMR dependency 2025-09-10 14:40:48 -04:00
yinyiqian1
61d628d654 fix: Add restrictions to Permission Delegation: fixDelegateV1_1 (#5650)
- Amendment: fixDelegateV1_1
- In DelegateSet, disallow invalid PermissionValues like 0, as well as transaction-type values when that transaction's amendment is not enabled. The transaction then acts as if it doesn't exist, which is the same thing older versions without the amendment will do.
- Payment burn/mint should disallow DEX currency exchange.
- Support MPT for Payment burn/mint.
2025-09-10 17:47:33 +00:00
Mayukha Vadari
adc64e7866 Merge branch 'develop' into ripple/smart-escrow 2025-09-10 10:46:51 -04:00
Ayaz Salikhov
3d92375d12 ci: Add missing dependencies to workflows (#5783) 2025-09-10 08:20:45 +00:00
Ayaz Salikhov
cdbe70b2a7 ci: Use default conan install format (#5784) 2025-09-10 07:35:58 +00:00
Bronek Kozicki
f6426ca183 Switch CI pipeline bookworm:gcc-13 from arm64 to amd64 (#5779) 2025-09-09 21:23:07 +00:00
Ayaz Salikhov
e5f7a8442d ci: Change when the upload-conan-deps workflow is run (#5782)
- Don't run upload-conan-deps in PRs, unless the PR changes the workflow file.
- Change the cron schedule for uploading Conan dependencies to run after work hours for most devs.
2025-09-09 16:21:12 -04:00
Mayukha Vadari
4d4a1cfe82 Merge branch 'develop' into ripple/smart-escrow 2025-09-09 16:19:27 -04:00
Ayaz Salikhov
e67e0395df ci: Limit number of parallel jobs in "upload-conan-deps" (#5781)
- This should prevent Artifactory from being overloaded by too many requests at a time.
- Uses "max-parallel" to limit the build job to 10 simultaneous instances.
- Only run the minimal matrix on PRs.
2025-09-09 19:47:06 +00:00
Mayukha Vadari
f2c7da3705 Merge branch 'develop' into ripple/smart-escrow 2025-09-09 14:51:25 -04:00
Ed Hennis
148f669a25 chore: "passed" fails if any previous jobs fail or are cancelled (#5776)
For the purposes of being able to merge a PR, GitHub Actions jobs count as passed if they ran and passed, or were skipped.

With this change, if any of the jobs that "passed" depends on fails or is cancelled, then "passed" will fail. If they all succeed or are skipped, then "passed" is skipped, which does not prevent a merge.

This saves spinning up a runner in the usual case where things work, and will simplify our branch protection rules, so that only "passed" will need to be checked.
2025-09-09 18:07:04 +00:00
yinyiqian1
f1eaa6a264 enable fixAMMClawbackRounding (#5750) 2025-09-09 15:57:28 +00:00
Ayaz Salikhov
da4c8c9550 ci: Only run build-test/notify-clio if should-run indicates to (#5777)
- Fixes an issue introduced by #5762 which removed the transitive `should-run` check from these two jobs.
2025-09-09 11:25:41 -04:00
Mayukha Vadari
3ab0a82cd3 Merge branch 'develop' into ripple/smart-escrow 2025-09-08 15:20:01 -04:00
Wo Jake
bcde2790a4 Update old links & descriptions in README.md (#4701) 2025-09-08 18:03:20 +00:00
Ayaz Salikhov
9ebeb413e4 feat: Implement separate upload workflow (#5762)
* feat: Implement separate upload workflow
* Use cleanup-workspace
* Name some workflows reusable
* Add dependencies
2025-09-08 15:15:59 +00:00
Bronek Kozicki
6d40b882a4 Switch on-trigger to minimal build (#5773) 2025-09-08 13:54:50 +00:00
tzchenxixi
9fe0a154f1 chore: remove redundant word in comment (#5752) 2025-09-08 13:13:32 +00:00
Ayaz Salikhov
cb52c9af00 fix: Remove extra @ in notify-clio.yml (#5771) 2025-09-05 14:08:17 +01:00
Mayukha Vadari
6bf8338038 chore: Add conan.lock to workflow file checks (#5769)
* Add conan.lock to workflow file checks
* Add conan.lock to on-trigger.yml
2025-09-04 22:32:23 +00:00
Mayukha Vadari
a46d772147 fix build and tests (#5768)
* fix conan.lock

* add conan.lock to triggers

* update on-trigger.yml too

* fix tests

* roll back unrelated changes
2025-09-04 17:05:39 -04:00
Ayaz Salikhov
b0f4174e47 chore: Use tooling provided by pre-commit (#5753) 2025-09-04 20:30:54 +00:00
Ayaz Salikhov
3865dde0b8 fix: Add missing info to notify-clio workflow (#5761)
* Add missing info to the notify-clio workflow, such as `conan_ref`
2025-09-04 19:26:57 +00:00
Mayukha Vadari
f3c50318e8 Merge branch 'develop' into ripple/smart-escrow 2025-09-04 13:42:59 -04:00
Ayaz Salikhov
811c980821 ci: Use cleanup-workspace action (#5763)
* ci: Use cleanup-workspace action
* Use latest version
2025-09-04 16:27:30 +01:00
Bronek Kozicki
cf5f65b68e Add Scale to SingleAssetVault (#5652)
* Add Scale to VaultCreate
* Add round-trip calculation to VaultDeposit, VaultWithdraw, and VaultClawback
* Implement Number::truncate() for VaultClawback
* Add rounding to DepositWithdraw
* Disallow zero shares withdraw or deposit with tecPRECISION_LOSS
* Return tecPATH_DRY on overflow when converting shares/assets
* Remove empty shares MPToken in clawback or withdraw (except for vault owner)
* Implicitly create shares MPToken for vault owner in VaultCreate
* Review feedback: defensive checks in shares/assets calculations

---------

Co-authored-by: Ed Hennis <ed@ripple.com>
2025-09-04 08:54:24 +00:00
Mayukha Vadari
e7aa924c0e Merge branch 'develop' into ripple/smart-escrow 2025-09-03 15:54:57 -04:00
Jingchen
c38f2a3f2e Fix coverage parameter (#5760) 2025-09-03 16:08:02 +00:00
Ed Hennis
16c2ff97cc Set version to 2.5.1 2025-09-03 10:20:12 -04:00
Ed Hennis
32043463a8 Fix: Don't flag consensus as stalled prematurely (#5658)
Fix stalled consensus detection to prevent false positives in situations where there are no disputed transactions.

Stalled consensus detection was added to 2.5.0 in response to a network consensus halt that caused a round to run for over an hour. However, it has a flaw that makes it very easy to have false positives. Those false positives are usually mitigated by other checks that prevent them from having an effect, but there have been several instances of validators "running ahead" because there are circumstances where the other checks are "successful", allowing the stall state to be checked.
2025-09-03 10:12:30 -04:00
Ayaz Salikhov
724e9b1313 chore: Use conan lockfile (#5751)
* chore: Use conan lockfile
* Add windows-specific dependencies as well
* Add more info about lockfiles
* Update lockfile to latest version
* Update BUILD.md with conan install note
2025-09-03 10:24:07 +00:00
Bronek Kozicki
2e6f00aef2 Add required disable_ccache option (#5756) 2025-09-03 09:25:52 +01:00
Mayukha Vadari
5266f04970 chore: rollback unrelated changes (#5737) 2025-09-02 18:26:01 -04:00
Mayukha Vadari
db957cf191 Merge branch 'develop' into ripple/smart-escrow 2025-08-29 16:54:17 -04:00
Mayukha Vadari
e0b9812fc5 Refactor ledger_entry RPC source code and tests (#5237)
This is a major refactor of LedgerEntry.cpp. It adds a number of helper functions to make the code easier to maintain.

It also splits up the ledger and ledger_entry tests into different files, and cleans up the ledger_entry tests to make them easier to write and maintain.

This refactor also caught a few bugs in some of the other RPC processing, so those are fixed along the way.
2025-08-29 15:52:09 -04:00
Mayukha Vadari
8ac514363d get new fees and reserves working (#5714) 2025-08-29 10:58:52 -04:00
Vito Tumas
e4fdf33158 adds additional logging to differentiate why connections were refused (#5690)
This is a follow-up to PR #5664 that further improves the specificity of logging for refused peer connections. The previous changes did not account for several key scenarios, leading to potentially misleading log messages.

It addresses the following:

- Inbound Disabled: Connections are now explicitly logged as rejected when the server is not configured to accept inbound peers. Previously, this was logged as the server being "full," which was technically correct but lacked diagnostic clarity.
- Duplicate Connections: The logging now distinguishes between two types of duplicate connection refusals:
    - When a peer with the same node public key is already connected (duplicate connection).
    - When a connection is rejected because the limit for connections from a single IP address has been reached.

These changes provide more accurate and actionable diagnostic information when analyzing peer connection behavior.
2025-08-29 00:00:38 +00:00
Ed Hennis
6e814d7ebd chore: Run CI jobs in more situations, and add "passed" job (#5739)
Test jobs will run if
* Either the PR is non-draft or has the "DraftRunCI" label set *AND*
* One of the following:
	* Certain files were changed *OR*
	* The PR is non-draft and has the "Ready to merge" flag *OR*
	* The workflow is being run from the merge queue.

Additionally, a meta "passed" job was added that is dependent on all the other test jobs, so the required jobs list under branch protection rules only needs to specify "passed" to ensure that *either* all the test jobs pass *or* all the test jobs are skipped because they don't need to be run.

This allows PRs that don't affect the build or binary to be merged without overriding.
2025-08-28 20:33:11 +00:00
Ayaz Salikhov
1e37d00d6c ci: Use XRPLF/prepare-runner action (#5740)
* ci: Use XRPLF/prepare-runner action
* Remove some old boost workaround
2025-08-28 19:32:49 +00:00
Mayukha Vadari
c2ea68cca4 Merge branch 'develop' into ripple/smart-escrow 2025-08-28 14:02:10 -04:00
Michael Legleux
87ea3ba65d Merge remote-tracking branch 'upstream/release' into merge2.6.0 2025-08-28 13:51:17 -04:00
Bronek Kozicki
dedf3d3983 Remove extraneous // LCOV_EXCL_START, and fix CMake warning (#5744)
* Remove extraneous // LCOV_EXCL_START
* Fix "At least one COMMAND must be given" CMake warning
2025-08-28 10:15:17 -04:00
Mayukha Vadari
3d86881ce7 Merge branch 'develop' into ripple/smart-escrow 2025-08-27 13:58:38 -04:00
Mayukha Vadari
697d1470f4 change: adjust the function signatures for get_ledger_sqn and get_parent_ledger_time (#5733) 2025-08-27 13:58:27 -04:00
Alex Kremer
1506e65558 refactor: Update to Boost 1.88 (#5570)
This updates Boost to 1.88, which is needed because Clio wants to move to 1.88 as that fixes several ASAN false positives around coroutine usage. In order for Clio to move to newer boost, libXRPL needs to move too. Hence the changes in this PR. A lot has changed between 1.83 and 1.88, so there are lots of changes in the diff, especially with regard to Boost.Asio and coroutines.
2025-08-27 09:34:50 +00:00
Bart
808c86663c fix: Add codecov token to trigger workflow (#5736)
This change adds the Codecov token to the on-trigger workflow.
2025-08-26 19:07:23 -04:00
Mayukha Vadari
0b5f8f4051 Merge remote-tracking branch 'upstream/develop' into ripple/smart-escrow 2025-08-26 17:59:43 -04:00
Bart
92431a4238 chore: Add support for merge_group event (#5734)
This change adds support for the merge_group CI event, which will allow us to enable merge queues.
2025-08-26 17:12:37 -04:00
Bart
285120684c refactor: Replace 'on: pull_request: paths' by 'changed-files' action (#5728)
This PR moves the list of files from the `paths:` section in the `on: pull_request` into a separate job.
2025-08-26 16:00:00 -04:00
Mayukha Vadari
0fed78fbcc Merge branch 'develop' into ripple/smart-escrow 2025-08-26 15:04:33 -04:00
Mayukha Vadari
8c38ef726b chore: exclude a bunch of code that doesn't need to be tested from codecov (#5721)
* exclude the bulk of HostFunc.h from codecov

* fix codecov (maybe)

* adjust

* more codecov excl
2025-08-26 15:04:27 -04:00
Bart
77fef8732b fix: Simplify PR pipeline trigger rules (#5727)
This change removes `labeled` and `unlabeled` as pipeline trigger actions, and instead adds `reopened` and `ready_for_review`. The logic that decides whether to run the pipeline jobs is then simplified, although to get a draft PR with the `DraftCIRun` label to run, it can be necessary to close and reopen the PR.
2025-08-25 13:32:07 -04:00
Mayukha Vadari
2399d90334 Merge branch 'develop' into ripple/smart-escrow 2025-08-25 10:42:48 -04:00
Ed Hennis
7775c725f3 Merge remote-tracking branch 'upstream/release' into ximinez/merge-release 2025-08-22 19:56:21 -04:00
Bart
c61096239c chore: Remove codecov token check to support tokenless uploads on forks (#5722) 2025-08-22 23:31:01 +00:00
Mayukha Vadari
6367d68d1e Merge branch 'develop' into ripple/smart-escrow 2025-08-22 14:02:12 -04:00
Bart
c14ce956ad chore: Update clang-format and prettier with pre-commit (#5709)
The change updates how clang-format is called in CI and locally, and adds prettier to the pre-commit hook. Proto files are now also formatted, while external files are excluded.
2025-08-22 17:37:11 +00:00
Mayukha Vadari
155a84c8a3 Merge branch 'develop' into ripple/smart-escrow 2025-08-22 13:24:43 -04:00
Mayukha Vadari
095dc4d9cc fix(test): handle null metadata for unvalidated tx in Env::meta (#5715)
This change handles errors better when calling `env.meta`. It prints some debug help and throws an error if `env.meta` is going to return a `nullptr`.
2025-08-22 16:15:03 +00:00
Bronek Kozicki
2e255812ae chore: Workaround for CI build errors on arm64 (#5717)
CI builds with `clang-20` on `linux/arm64` are failing due to boost 1.86. This is hopefully fixed in version 1.88.
2025-08-22 10:58:36 -04:00
Mayukha Vadari
10558c9eff Merge remote-tracking branch 'upstream/develop' into develop5.5 2025-08-22 10:38:10 -04:00
Bart
896b8c3b54 chore: Fix file formatting (#5718) 2025-08-22 10:02:56 -04:00
Bart
58dd07bbdf fix: Skip notify-clio when running in a fork, reorder config fields (#5712)
This change will skip running the notify-clio job when a PR is created from a fork, and reorders the strategy matrix configuration fields so GitHub will more clearly show which configuration is running.
2025-08-21 16:32:04 -04:00
Bart
b13370ac0d chore: Reverts formatting changes to external files, adds formatting changes to proto files (#5711)
This change reverts the formatting applied to external files and adds formatting of proto files.

As clang-format will complain if a proto file is modified or moved while the .clang-format file does not explicitly contain a section for proto files, that change has been included in this PR as well.
2025-08-21 15:22:25 -04:00
Bart
f847e3287c Update Conan dependencies: OpenSSL (#5617)
This change updates OpenSSL from 1.1.1w to 3.5.2. The code works as-is, but many functions have been marked as deprecated and thus will need to be rewritten. For now we explicitly add the `-DOPENSSL_SUPPRESS_DEPRECATED` to give us time to do so, while providing us with the benefits of the updated version.
2025-08-21 07:41:00 -04:00
Bart
56c1e078f2 fix: Correctly check for build_only when deciding whether to run tests (#5708)
This change modifies the `build_only` check used to determine whether to run tests. For easier debugging in the future it also prints out the contents of the strategy matrix.
2025-08-20 19:25:40 -04:00
Bart
afc05659ed fix: Adjust the CI workflows (#5700) 2025-08-19 12:46:38 -04:00
Bart
b04d239926 fix: Modify jobs to use '>>' instead of 'tee' for GITHUB_OUTPUT (#5699) 2025-08-18 10:49:55 -04:00
Bart
dc1caa41b2 refactor: Revamp CI workflows (#5661)
This change refactors the CI workflows to leverage the new CI Docker images for Debian, Red Hat, and Ubuntu.
2025-08-18 10:21:43 -04:00
Jingchen
ceb0ce5634 refactor: Decouple net from xrpld and move rpc-related classes to the rpc folder (#5477)
As a step of modularisation, this change moves code from `xrpld` to `libxrpl`.
2025-08-15 23:27:13 +00:00
Mayukha Vadari
dd30d811e6 feat: last set of host functions (#5674) 2025-08-15 16:51:30 -04:00
Mayukha Vadari
293d8e4ddb test: store Rust source code in rippled, instead of just opaque hex strings (#5653) 2025-08-15 15:32:36 -04:00
Mayukha Vadari
77875c9133 fix: actually return int instead of bool (#5651) 2025-08-15 13:58:38 -04:00
Olek
647b47567e Fix float point binary format (#5688) 2025-08-15 09:51:40 -04:00
Olek
b0a1ad3b06 Disable float point instructions (#5679) 2025-08-14 14:59:55 -04:00
Mayukha Vadari
1d141bf2e8 chore: move WASM files to separate folder (#5666) 2025-08-14 11:46:10 -04:00
Olek
0d0e279ae2 Float point Hostfunctions unit tests (#5656)
* Added direct unittests for float hostfunctions
2025-08-14 10:21:21 -04:00
Mayukha Vadari
5dc0cee28a cache data instead of setting it in updateData (#5642)
Co-authored-by: Oleksandr <115580134+oleks-rip@users.noreply.github.com>
2025-08-11 15:52:21 -04:00
Mayukha Vadari
c15947da56 fix CI 2025-08-06 12:41:12 -04:00
Mayukha Vadari
9bc04244e7 Merge remote-tracking branch 'upstream/ripple/smart-escrow' into develop5 2025-08-06 12:36:25 -04:00
Mayukha Vadari
38c7a27010 clean up WASM functions a bit (#5628) 2025-08-05 18:06:34 -04:00
Mayukha Vadari
58741d2791 feat: return an int instead of boolean from finish, display in metadata (#5641)
* create STInt32 and STInt64, use it for sfWasmReturnCode in metadata

* get it actually working

* add tests

* update comment

* change type

* respond to comments
2025-08-04 09:25:47 -04:00
Mayukha Vadari
8426470506 feat: add other misc host functions (#5574) 2025-07-31 18:39:21 -04:00
Olek
ccc3280b1a Update wamr to 2.4.1 (#5640) 2025-07-31 13:42:53 -04:00
Mayukha Vadari
2847075705 fix: ensure GasUsed shows up in the metadata even on tecWASM_REJECTED (#5633)
* always set gas used

* fix

* add tests

* clean up
2025-07-31 10:57:56 -04:00
Olek
3108ca0549 Float point HF (#5611)
- Added support for 8-byte floating point
2025-07-30 14:38:03 +00:00
Mayukha Vadari
3b849ff497 Add unit tests for host functions (#5578) 2025-07-29 17:54:48 -04:00
Mayukha Vadari
66776b6a85 test: codecov for WasmHostFuncWrapper.cpp (#5601) 2025-07-29 16:06:21 -04:00
Mayukha Vadari
c8c241b50d Merge remote-tracking branch 'upstream/develop' into develop4.6 2025-07-29 11:34:44 -04:00
Bronek Kozicki
44cb588371 Build options cleanup (#5581)
As we no longer support old compiler versions, we are bringing back some warnings by removing no longer relevant `-Wno-...` options.
2025-07-28 13:02:41 -04:00
Oleksandr
6f91b8f8d1 Fix windows 2025-07-28 12:29:39 -04:00
Mayukha Vadari
3d93379132 add header 2025-07-25 15:10:47 -04:00
Mayukha Vadari
84fd7d0126 Merge remote-tracking branch 'upstream/develop' into develop4.6 2025-07-25 14:56:36 -04:00
Mayukha Vadari
7f52287aae rename variables (#5609)
* rename variables

* instanceWrapper -> runtime

* size -> srcSize

* begin -> ptr
2025-07-25 11:54:31 -04:00
Mayukha Vadari
98b8986868 fix merge issues (mostly with Conan2 upgrade) 2025-07-23 15:52:03 -04:00
Mayukha Vadari
250f2842ee Merge remote-tracking branch 'upstream/develop' into develop4.5 2025-07-23 13:43:47 -04:00
Olek
9eca1a3a0c MPT and IOU support for amount and issue (#5573)
* MPT and IOU support for amount and issue

* Fix tests
* Update wasm code to the latest version
* Remove deprecated tests
* Remove deprecated wasm
2025-07-22 17:43:21 +00:00
Mayukha Vadari
24b7a03224 feat: add more keylet host functions (#5522) 2025-07-17 12:37:41 -04:00
Mayukha Vadari
9007097d24 Simplify host function boilerplate (#5534)
* enum for HF errors

* switch getData functions to be templates

* getData<SField> working

* Slice -> Bytes in host functions

* RET -> helper function instead of macro

* get template function working

* more organization/cleanup

* fix failures

* more cleanup

* Bytes -> Slice

* SFieldParam macro -> type alias

* fix return type

* fix bugs

* replace std::make_index_sequence

* remove `failed` from output

* remove complex function

* more uniformity

* respond to comments

* enum class HostFunctionError

* rename variable

* respond to comments

* remove templating

* [WIP] basic getData tests

* weird linker error

* fix issue
2025-07-15 04:28:59 +05:30
Olek
bc445ec6a2 Add hostfunctions schedule table
Remove opcode schedule table from wamr
2025-07-11 18:08:36 -04:00
Mayukha Vadari
4fa0ae521e disallow a computation allowance of 0 (#5541) 2025-07-09 00:34:17 +05:30
Olek
7bdf5fa8b8 Fix build.md wamr version (#5535) 2025-07-04 00:48:03 +05:30
Olek
65b0b976d9 Sync error codes (#5527)
* Sync error codes
2025-07-02 17:33:39 -04:00
Elliot.
a0d275feec chore: Clear CODEOWNERS (#5528) 2025-07-02 10:39:57 -07:00
Mayukha Vadari
ece3a8d7be Merge branch 'develop' into develop4 2025-06-30 21:33:30 +05:30
Olek
463acf51b5 preflight checks for wasm (#5517) 2025-06-30 09:34:38 -04:00
Olek
1cd16fab87 Host-functions perf test fixes (#5514) 2025-06-27 09:59:28 -04:00
Olek
add55c4f33 Host functions gas cost for wasm_runtime interface (#5500) 2025-06-25 14:04:04 +00:00
Olek
51a9c0ff59 Host function gas cost (#5488)
* Update Wamr to 2.3.1
* Add gas cost per host-function
* Fix windows build
* Fix wasm test
* Add no import test
2025-06-12 15:54:49 -04:00
Mayukha Vadari
6e8a5f0f4e fix: make host function traces easier to use, fix get_NFT bug (#5466)
Co-authored-by: Olek <115580134+oleks-rip@users.noreply.github.com>
2025-06-05 14:24:13 -04:00
Mayukha Vadari
8a33702f26 fix merge issues 2025-06-05 12:38:26 -04:00
Mayukha Vadari
a072d49802 Merge remote-tracking branch 'upstream/ripple/smart-escrow' into develop3.5 2025-06-05 11:51:53 -04:00
Mayukha Vadari
a0aeeb8e07 Merge remote-tracking branch 'upstream/develop' into develop3.5 2025-06-05 11:50:38 -04:00
Olek
383b225690 Fix processing nonexistent field (#5467) 2025-06-04 17:32:11 -04:00
Mayukha Vadari
ace2247800 Merge remote-tracking branch 'upstream/ripple/smart-escrow' into develop3.5 2025-06-04 14:15:17 -04:00
Olek
6a6fed5dce More hostfunctions (#5451)
* Bug fixes:
  - Fix bugs found during schedule table tests
  - Add more tests
  - Add parameter passing for the runEscrowWasm function

* Add new host-functions:
  - Fix wamr logging
  - Add runtime passing through HF
  - Fix the runEscrowWasm interface

* Improve logs

* Fix logging bug

* Set 4k limit for update_data HF

* allHF wasm module fixes
2025-05-30 19:01:27 -04:00
Mayukha Vadari
1f8aece8cd feat: add a GasUsed parameter to the metadata (#5456) 2025-05-29 16:36:55 -04:00
Mayukha Vadari
6c6f8cd4f9 Merge remote-tracking branch 'upstream/develop' into develop3 2025-05-29 13:05:11 -04:00
Mayukha Vadari
fb1311e013 uncomment???? 2025-05-28 14:00:50 -04:00
Mayukha Vadari
ce31acf030 debug comments 2025-05-28 13:48:38 -04:00
Mayukha Vadari
31ad5ac63b Merge remote-tracking branch 'upstream/ripple/smart-escrow' into develop3 2025-05-27 18:29:41 -04:00
Mayukha Vadari
1ede0bdec4 fix: fix fixtures (#5445) 2025-05-23 17:37:14 -04:00
Mayukha Vadari
aef32ead2c better WASM logging to match rippled (#5395)
* basic logging

* pass in Journal

* log level based on journal level

* clean up

* attempt at adding WAMR logging properly

* improve logline

* maybe_unused

* fix

* fix

* fix segfault

* add test
2025-05-23 10:31:02 -04:00
Mayukha Vadari
5b43ec7f73 refactor: switch function name from ready to finish (#5430) 2025-05-20 16:12:19 -04:00
Olek
1e9ff88a00 Fix CI build issues
* Mac build fix
* Windows build fix
* Windows instruction counter fix
2025-05-08 12:39:37 -04:00
Mayukha Vadari
bb9bb5f5c5 Merge branch 'ripple/smart-escrow' into develop2 2025-05-01 18:44:06 -04:00
Mayukha Vadari
c533abd8b6 Update size and compute cap defaults (#5417) 2025-05-01 18:41:51 -04:00
Olek
bb9bc764bc Switch to WAMR (#5416)
* Switch to WAMR
2025-05-01 18:02:06 -04:00
Mayukha Vadari
b4b53a6cb7 Merge branch 'ripple/smart-escrow' into develop2 2025-04-29 15:25:54 -04:00
Mayukha Vadari
9c0204906c fix reference fee tests 2025-04-29 15:25:00 -04:00
Mayukha Vadari
4670b373c1 try to fix tests 2025-04-29 14:10:27 -04:00
Mayukha Vadari
f03b5883bd More host functions (#5411)
* getNFT

* escrow keylet

* account keylet

* credential keylet

* oracle keylet

* hook everything in

* fix stuff
2025-04-29 12:39:12 -04:00
Mayukha Vadari
f8b2fe4dd5 fix imports 2025-04-28 17:43:15 -04:00
Mayukha Vadari
be4a0c9c2b Merge remote-tracking branch 'upstream/ripple/smart-escrow' into develop2 2025-04-28 17:14:28 -04:00
Mayukha Vadari
f37d52d8e9 Set up fees for WASM processing (#5393)
* set up fields

* throw error if allowance is too high

* votable gas price

* fix comments

* hook everything together

* make test less flaky (hopefully)

* fix other tests

* fix some tests

* fix tests

* clean up

* add more tests

* uncomment other tests

* respond to comments

* fix build

* respond to comments
2025-04-24 08:47:13 -04:00
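
This commit introduces a votable gas price and rejects allowances that are too high. A hedged sketch of the resulting fee shape, with all names and the cap invented for illustration (not the actual calculateBaseFee logic):

#include <cstdint>
#include <stdexcept>

// Sketch: the extra WASM-processing fee scales with the requested
// computation allowance at a gas price validators can vote on, and an
// oversized allowance is rejected outright.
std::uint64_t
wasmFee(
    std::uint64_t baseFee,
    std::uint64_t allowance,     // gas the submitter is willing to pay for
    std::uint64_t gasPrice,      // votable, like the reference fee
    std::uint64_t maxAllowance)  // hypothetical upper bound
{
    if (allowance > maxAllowance)
        throw std::invalid_argument("computation allowance too high");
    return baseFee + allowance * gasPrice;
}
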
Mayukha Vadari
177cdaf550 Connect votable gas limit into VM (#5360)
* [WIP] add gas limit

* [WIP] host function escrow tests

* finish test

* uncomment out tests
2025-03-25 10:55:33 -04:00
pwang200
1573a443b7 smart escrow devnet 1 host functions (#5353)
* devnet 1 host functions

* clang-format

* fix build issues
2025-03-24 17:07:17 -04:00
Mayukha Vadari
911c0466c0 Merge develop into ripple/smart-escrow (#5357)
* Set version to 2.4.0

* refactor: Remove unused and add missing includes (#5293)

The codebase is filled with includes that are unused, and which thus can be removed. At the same time, the files often do not include all headers that contain the definitions used in those files. This change uses clang-format and clang-tidy to clean up the includes, with minor manual intervention to ensure the code compiles on all platforms.

* refactor: Calculate numFeatures automatically (#5324)

Manually updating `numFeatures` is an annoying process that is easily forgotten and leads to frequent merge conflicts. This change takes advantage of the `XRPL_FEATURE` and `XRPL_FIX` macros, and adds a new `XRPL_RETIRE` macro, to automatically set `numFeatures`.
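
A minimal sketch of the X-macro counting idea, assuming a features.macro file whose lines are XRPL_FEATURE/XRPL_FIX/XRPL_RETIRE invocations (the exact mechanism in rippled may differ):

#include <cstddef>

// Each registration macro expands to "+1", so including the macro file
// inside the initializer counts every feature automatically.
#define XRPL_FEATURE(...) +1
#define XRPL_FIX(...) +1
#define XRPL_RETIRE(...) +1

static constexpr std::size_t numFeatures = 0
#include "features.macro"
    ;

#undef XRPL_FEATURE
#undef XRPL_FIX
#undef XRPL_RETIRE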

* refactor: Improve ordering of headers with clang-format (#5343)

Removes all manual header groupings from source and header files by leveraging clang-format options.

* Rename "deadlock" to "stall" in `LoadManager` (#5341)

What the LoadManager class does is stall detection, which is not the same as deadlock detection. Under severe CPU starvation, LoadManager will currently intentionally crash rippled, reporting `LogicError: Deadlock detected`. This error message is misleading, as the condition being detected is not a deadlock. This change fixes and refactors the code in response.

* Adds hub.xrpl-commons.org as a new Bootstrap Cluster (#5263)

* fix: Error message for ledger_entry rpc (#5344)

Changes the error to `malformedAddress` for `permissioned_domain` in the `ledger_entry` RPC when the account is not a string, making it clearer to the user what is wrong with their request.

* fix: Handle invalid marker parameter in grpc call (#5317)

The `end_marker` is used to limit the range of ledger entries to fetch. If `end_marker` is less than `marker`, a crash can occur. This change adds an additional check.
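
A hypothetical illustration of the added guard (parameter names and types are assumed, not taken from the actual gRPC handler):

#include <cstdint>
#include <optional>

// Sketch of the added validation: a request is only valid if end_marker,
// when present, does not precede marker. An end_marker smaller than marker
// would previously walk an invalid range and could crash.
bool
validMarkerRange(
    std::optional<std::uint64_t> marker,
    std::optional<std::uint64_t> endMarker)
{
    return !(marker && endMarker && *endMarker < *marker);
}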

* fix: trust line RPC no ripple flag (#5345)

The trust line RPC `no_ripple` flag was set based on the `lsfDefaultRipple` flag, which is a flag of the account root, not of the trust line. The `lsfDefaultRipple` flag provides no insight into whether this particular trust line has the `lsfLowNoRipple` or `lsfHighNoRipple` flag set, so it should not be used here at all. This change simplifies the logic.
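
A minimal sketch of the corrected lookup, assuming the standard RippleState flag values (shown here for illustration; the real code reads them from the trust line ledger entry):

#include <cstdint>

// Flag values as defined for RippleState ledger entries.
constexpr std::uint32_t lsfLowNoRipple = 0x00100000;
constexpr std::uint32_t lsfHighNoRipple = 0x00200000;

// no_ripple depends on whether the viewing account is the low or the high
// account on the trust line, not on the account root's lsfDefaultRipple.
bool
noRipple(std::uint32_t lineFlags, bool viewerIsLowAccount)
{
    return (lineFlags &
            (viewerIsLowAccount ? lsfLowNoRipple : lsfHighNoRipple)) != 0;
}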

* refactor: Updates Conan dependencies: RocksDB (#5335)

Updates RocksDB to version 9.7.3, the latest version supported in Conan 1.x. A patch for 9.7.4 that fixes a memory leak is included.

* fix: Remove null pointer deref, just do abort (#5338)

This change removes the existing undefined behavior from `LogicError`, so we can be certain that there will always be a stacktrace.

De-referencing a null pointer is an old trick to generate `SIGSEGV`, which would typically also create a stacktrace. However, it is undefined behaviour, and compilers are free to do something else entirely. A more robust way to create a stacktrace while crashing the program is to use `std::abort`, which we have also used in this location for a long time. Combining the two does not give the expected behaviour: with certain compiler versions, the null-pointer dereference followed by `std::abort` may not immediately cause a crash. We have observed the stacktrace being wiped instead, the thread being put in an indeterminate state, and a stacktrace then being created without any useful information.
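
A minimal sketch of the idea, not the rippled implementation itself:

#include <cstdlib>
#include <iostream>

// Fail via std::abort alone, which reliably produces a usable stacktrace,
// instead of also dereferencing a null pointer, which is undefined behavior.
[[noreturn]] void
logicError(char const* message)
{
    std::cerr << "LogicError: " << message << '\n';
    std::abort();  // well-defined crash; no null-pointer trick needed
}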

* chore: Add PR number to payload (#5310)

This PR adds one more payload field to the libXRPL compatibility check workflow - the PR number itself.

* chore: Update link to ripple-binary-codec (#5355)

The link to ripple-binary-codec's definitions.json appears to be outdated. The updated link is also documented here: https://xrpl.org/docs/references/protocol/binary-format#definitions-file

* Prevent consensus from getting stuck in the establish phase (#5277)

- Detects if the consensus process is "stalled". If it is, then we can declare a 
  consensus and end successfully even if we do not have 80% agreement on
  our proposal.
  - "Stalled" is defined as:
    - We have a close time consensus
    - Each disputed transaction is individually stalled:
      - It has been in the final "stuck" 95% requirement for at least 2
        (avMIN_ROUNDS) "inner rounds" of phaseEstablish,
      - and either all of the other trusted proposers or this validator, if proposing,
        have had the same vote(s) for at least 4 (avSTALLED_ROUNDS) "inner
        rounds", and at least 80% of the validators (including this one, if
        appropriate) agree about the vote (whether yes or no).
- If we have been in the establish phase for more than 10x the previous
  consensus establish phase's time, then consensus is considered "expired",
  and we will leave the round, which sends a partial validation (indicating
  that the node is moving on without validating). Two restrictions avoid
  exiting prematurely, or taking too long to exit, in extreme situations.
  - The 10x time is clamped to be within a range of 15s
    (ledgerMAX_CONSENSUS) to 120s (ledgerABANDON_CONSENSUS).
  - If consensus has not had an opportunity to walk through all avalanche
    states (defined as not going through 8 "inner rounds" of phaseEstablish),
    then ConsensusState::Expired is treated as ConsensusState::No.
- When enough nodes leave the round, any remaining nodes will see they've
  fallen behind, and move on, too, generally before hitting the timeout. Any
  validations or partial validations sent during this time will help the
  consensus process bring the nodes back together.
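
As a condensed, hypothetical sketch of the per-dispute stall test described above (the real consensus code tracks these quantities per disputed transaction and is considerably more involved):

#include <cstddef>

// Constant names mirror the description: at least 2 inner rounds at the
// final 95% requirement, at least 4 inner rounds with unchanged votes.
constexpr std::size_t avMIN_ROUNDS = 2;
constexpr std::size_t avSTALLED_ROUNDS = 4;

bool
disputeStalled(
    std::size_t roundsAtFinalRequirement,  // inner rounds spent at the 95% bar
    std::size_t roundsWithUnchangedVotes,  // inner rounds with no vote changes
    double agreementFraction)              // share of validators voting alike
{
    return roundsAtFinalRequirement >= avMIN_ROUNDS &&
        roundsWithUnchangedVotes >= avSTALLED_ROUNDS &&
        agreementFraction >= 0.80;
}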

---------

Co-authored-by: Michael Legleux <mlegleux@ripple.com>
Co-authored-by: Bart <bthomee@users.noreply.github.com>
Co-authored-by: Ed Hennis <ed@ripple.com>
Co-authored-by: Bronek Kozicki <brok@incorrekt.com>
Co-authored-by: Darius Tumas <Tokeiito@users.noreply.github.com>
Co-authored-by: Sergey Kuznetsov <skuznetsov@ripple.com>
Co-authored-by: cyan317 <120398799+cindyyan317@users.noreply.github.com>
Co-authored-by: Vlad <129996061+vvysokikh1@users.noreply.github.com>
Co-authored-by: Alex Kremer <akremer@ripple.com>
2025-03-20 16:47:14 -04:00
Mayukha Vadari
b6a95f9970 PoC Smart Escrows (#5340)
* wasmedge in unittest

* add WashVM.h and cpp

* accountID comparison (vector<u8>) working

* json decode tx and ledger object with two buffers working

* wasm return a buffer working

* add a failure test case to P2P3

* host function return ledger sqn

* instruction gas and host function gas

* basics

* add scaffold

* add amendment check

* working PoC

* get test working

* fix clang-format

* prototype #2

* p2p3

* [WIP] P4

* P5

* add calculateBaseFee

* add FinishFunction preflight checks (+ tests)

* additional reserve for sfFinishFunction

* higher fees for EscrowFinish

* rename amendment to SmartEscrow

* make fee voting changes, add basic tests

* clean up

* clean up

* clean up

* more cleanup

* add subscribe tests

* add more tests

* undo formatting

* undo formatting

* remove bad comment

* more debugging statements

* fix clang-format

* fix rebase issues

* fix more rebase issues

* more rebase fixes

* add source code for wasm

* respond to comments

* add const

---------

Co-authored-by: Peng Wang <pwang200@gmail.com>
2025-03-20 14:08:06 -04:00
1195 changed files with 259063 additions and 14123 deletions


@@ -1,4 +1,20 @@
---
BreakBeforeBraces: Custom
BraceWrapping:
AfterClass: true
AfterControlStatement: true
AfterEnum: false
AfterFunction: true
AfterNamespace: false
AfterObjCDeclaration: true
AfterStruct: true
AfterUnion: true
BeforeCatch: true
BeforeElse: true
IndentBraces: false
KeepEmptyLinesAtTheStartOfBlocks: false
MaxEmptyLinesToKeep: 1
---
Language: Cpp
AccessModifierOffset: -4
AlignAfterOpenBracket: AlwaysBreak
@@ -18,20 +34,7 @@ AlwaysBreakBeforeMultilineStrings: true
AlwaysBreakTemplateDeclarations: true
BinPackArguments: false
BinPackParameters: false
BraceWrapping:
AfterClass: true
AfterControlStatement: true
AfterEnum: false
AfterFunction: true
AfterNamespace: false
AfterObjCDeclaration: true
AfterStruct: true
AfterUnion: true
BeforeCatch: true
BeforeElse: true
IndentBraces: false
BreakBeforeBinaryOperators: false
BreakBeforeBraces: Custom
BreakBeforeTernaryOperators: true
BreakConstructorInitializersBeforeComma: true
ColumnLimit: 80
@@ -66,8 +69,6 @@ IndentFunctionDeclarationAfterType: false
IndentRequiresClause: true
IndentWidth: 4
IndentWrappedFunctionNames: false
KeepEmptyLinesAtTheStartOfBlocks: false
MaxEmptyLinesToKeep: 1
NamespaceIndentation: None
ObjCSpaceAfterProperty: false
ObjCSpaceBeforeProtocolList: false
@@ -96,7 +97,7 @@ TabWidth: 8
UseTab: Never
QualifierAlignment: Right
---
Language: JavaScript
---
Language: Json
Language: Proto
BasedOnStyle: Google
ColumnLimit: 0
IndentWidth: 2


@@ -33,5 +33,6 @@ slack_app: false
ignore:
- "src/test/"
- "src/tests/"
- "include/xrpl/beast/test/"
- "include/xrpl/beast/unit_test/"


@@ -12,3 +12,5 @@ fe9a5365b8a52d4acc42eb27369247e6f238a4f9
9a93577314e6a8d4b4a8368cc9d2b15a5d8303e8
552377c76f55b403a1c876df873a23d780fcc81c
97f0747e103f13e26e45b731731059b32f7679ac
b13370ac0d207217354f1fc1c29aef87769fb8a1
896b8c3b54a22b0497cb0d1ce95e1095f9a227ce

.github/CODEOWNERS

@@ -1,8 +1,2 @@
# Allow anyone to review any change by default.
*
# Require the rpc-reviewers team to review changes to the rpc code.
include/xrpl/protocol/ @xrplf/rpc-reviewers
src/libxrpl/protocol/ @xrplf/rpc-reviewers
src/xrpld/rpc/ @xrplf/rpc-reviewers
src/xrpld/app/misc/ @xrplf/rpc-reviewers

.github/actions/build-deps/action.yml (new file)

@@ -0,0 +1,44 @@
name: Build Conan dependencies
description: "Install Conan dependencies, optionally forcing a rebuild of all dependencies."
# Note that actions do not support 'type' and all inputs are strings, see
# https://docs.github.com/en/actions/reference/workflows-and-actions/metadata-syntax#inputs.
inputs:
verbosity:
description: "The build verbosity."
required: false
default: "verbose"
build_dir:
description: "The directory where to build."
required: true
build_type:
description: 'The build type to use ("Debug", "Release").'
required: true
force_build:
description: 'Force building of all dependencies ("true", "false").'
required: false
default: "false"
runs:
using: composite
steps:
- name: Install Conan dependencies
shell: bash
env:
BUILD_DIR: ${{ inputs.build_dir }}
BUILD_OPTION: ${{ inputs.force_build == 'true' && '*' || 'missing' }}
BUILD_TYPE: ${{ inputs.build_type }}
VERBOSITY: ${{ inputs.verbosity }}
run: |
echo 'Installing dependencies.'
mkdir -p "${BUILD_DIR}"
cd "${BUILD_DIR}"
conan install \
--output-folder . \
--build="${BUILD_OPTION}" \
--options:host='&:tests=True' \
--options:host='&:xrpld=True' \
--settings:all build_type="${BUILD_TYPE}" \
--conf:all tools.build:verbosity="${VERBOSITY}" \
--conf:all tools.compilation:verbosity="${VERBOSITY}" \
..


@@ -1,34 +0,0 @@
name: build
inputs:
generator:
default: null
configuration:
required: true
cmake-args:
default: null
cmake-target:
default: all
# An implicit input is the environment variable `build_dir`.
runs:
using: composite
steps:
- name: configure
shell: bash
run: |
cd ${build_dir}
cmake \
${{ inputs.generator && format('-G "{0}"', inputs.generator) || '' }} \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE=${{ inputs.configuration }} \
-Dtests=TRUE \
-Dxrpld=TRUE \
${{ inputs.cmake-args }} \
..
- name: build
shell: bash
run: |
cmake \
--build ${build_dir} \
--config ${{ inputs.configuration }} \
--parallel ${NUM_PROCESSORS:-$(nproc)} \
--target ${{ inputs.cmake-target }}


@@ -1,38 +0,0 @@
name: dependencies
inputs:
configuration:
required: true
# Implicit inputs are the environment variables `build_dir`, CONAN_REMOTE_URL,
# CONAN_REMOTE_USERNAME, and CONAN_REMOTE_PASSWORD. The latter two are only
# used to upload newly built dependencies to the Conan remote.
runs:
using: composite
steps:
- name: add Conan remote
if: ${{ env.CONAN_REMOTE_URL != '' }}
shell: bash
run: |
echo "Adding Conan remote 'xrplf' at ${{ env.CONAN_REMOTE_URL }}."
conan remote add --index 0 --force xrplf ${{ env.CONAN_REMOTE_URL }}
echo "Listing Conan remotes."
conan remote list
- name: install dependencies
shell: bash
run: |
mkdir -p ${{ env.build_dir }}
cd ${{ env.build_dir }}
conan install \
--output-folder . \
--build missing \
--options:host "&:tests=True" \
--options:host "&:xrpld=True" \
--settings:all build_type=${{ inputs.configuration }} \
..
- name: upload dependencies
if: ${{ env.CONAN_REMOTE_URL != '' && env.CONAN_REMOTE_USERNAME != '' && env.CONAN_REMOTE_PASSWORD != '' && github.ref_type == 'branch' && github.ref_name == github.event.repository.default_branch }}
shell: bash
run: |
echo "Logging into Conan remote 'xrplf' at ${{ env.CONAN_REMOTE_URL }}."
conan remote login xrplf "${{ env.CONAN_REMOTE_USERNAME }}" --password "${{ env.CONAN_REMOTE_PASSWORD }}"
echo "Uploading dependencies."
conan upload '*' --confirm --check --remote xrplf

.github/actions/print-env/action.yml (new file)

@@ -0,0 +1,43 @@
name: Print build environment
description: "Print environment and some tooling versions"
runs:
using: composite
steps:
- name: Check configuration (Windows)
if: ${{ runner.os == 'Windows' }}
shell: bash
run: |
echo 'Checking environment variables.'
set
echo 'Checking CMake version.'
cmake --version
echo 'Checking Conan version.'
conan --version
- name: Check configuration (Linux and macOS)
if: ${{ runner.os == 'Linux' || runner.os == 'macOS' }}
shell: bash
run: |
echo 'Checking path.'
echo ${PATH} | tr ':' '\n'
echo 'Checking environment variables.'
env | sort
echo 'Checking CMake version.'
cmake --version
echo 'Checking compiler version.'
${{ runner.os == 'Linux' && '${CC}' || 'clang' }} --version
echo 'Checking Conan version.'
conan --version
echo 'Checking Ninja version.'
ninja --version
echo 'Checking nproc version.'
nproc --version

.github/actions/setup-conan/action.yml (new file)

@@ -0,0 +1,46 @@
name: Setup Conan
description: "Set up Conan configuration, profile, and remote."
inputs:
conan_remote_name:
description: "The name of the Conan remote to use."
required: false
default: xrplf
conan_remote_url:
description: "The URL of the Conan endpoint to use."
required: false
default: https://conan.ripplex.io
runs:
using: composite
steps:
- name: Set up Conan configuration
shell: bash
run: |
echo 'Installing configuration.'
cat conan/global.conf ${{ runner.os == 'Linux' && '>>' || '>' }} $(conan config home)/global.conf
echo 'Conan configuration:'
conan config show '*'
- name: Set up Conan profile
shell: bash
run: |
echo 'Installing profile.'
conan config install conan/profiles/default -tf $(conan config home)/profiles/
echo 'Conan profile:'
conan profile show
- name: Set up Conan remote
shell: bash
env:
CONAN_REMOTE_NAME: ${{ inputs.conan_remote_name }}
CONAN_REMOTE_URL: ${{ inputs.conan_remote_url }}
run: |
echo "Adding Conan remote '${CONAN_REMOTE_NAME}' at '${CONAN_REMOTE_URL}'."
conan remote add --index 0 --force "${CONAN_REMOTE_NAME}" "${CONAN_REMOTE_URL}"
echo 'Listing Conan remotes.'
conan remote list


@@ -50,7 +50,7 @@ that `test` code should _never_ be included in `ripple` code.)
## Validation
The [levelization.sh](levelization.sh) script takes no parameters,
The [levelization](generate.sh) script takes no parameters,
reads no environment variables, and can be run from any directory,
as long as it is in the expected location in the rippled repo.
It can be run at any time from within a checked out repo, and will
@@ -72,15 +72,15 @@ It generates many files of [results](results):
desired as described above. In a perfect repo, this file will be
empty.
This file is committed to the repo, and is used by the [levelization
Github workflow](../../.github/workflows/levelization.yml) to validate
Github workflow](../../workflows/reusable-check-levelization.yml) to validate
that nothing changed.
- [`ordering.txt`](results/ordering.txt): A list showing relationships
between modules where there are no loops as they actually exist, as
opposed to how they are desired as described above.
This file is committed to the repo, and is used by the [levelization
Github workflow](../../.github/workflows/levelization.yml) to validate
Github workflow](../../workflows/reusable-check-levelization.yml) to validate
that nothing changed.
- [`levelization.yml`](../../.github/workflows/levelization.yml)
- [`levelization.yml`](../../workflows/reusable-check-levelization.yml)
Github Actions workflow to test that levelization loops haven't
changed. Unfortunately, if changes are detected, it can't tell if
they are improvements or not, so if you have resolved any issues or
@@ -111,4 +111,4 @@ get those details locally.
1. Run `levelization.sh`
2. Grep the modules in `paths.txt`.
- For example, if a cycle is found `A ~= B`, simply `grep -w
A Builds/levelization/results/paths.txt | grep -w B`
A .github/scripts/levelization/results/paths.txt | grep -w B`


@@ -1,6 +1,6 @@
#!/bin/bash
# Usage: levelization.sh
# Usage: generate.sh
# This script takes no parameters, reads no environment variables,
# and can be run from any directory, as long as it is in the expected
# location in the repo.
@@ -19,7 +19,7 @@ export LANG=C
rm -rfv results
mkdir results
includes="$( pwd )/results/rawincludes.txt"
pushd ../..
pushd ../../..
echo Raw includes:
grep -r '^[ ]*#include.*/.*\.h' include src | \
grep -v boost | tee ${includes}


@@ -7,12 +7,6 @@ Loop: test.jtx test.unit_test
Loop: xrpld.app xrpld.core
xrpld.app > xrpld.core
Loop: xrpld.app xrpld.ledger
xrpld.app > xrpld.ledger
Loop: xrpld.app xrpld.net
xrpld.app > xrpld.net
Loop: xrpld.app xrpld.overlay
xrpld.overlay > xrpld.app
@@ -25,15 +19,9 @@ Loop: xrpld.app xrpld.rpc
Loop: xrpld.app xrpld.shamap
xrpld.app > xrpld.shamap
Loop: xrpld.core xrpld.net
xrpld.net > xrpld.core
Loop: xrpld.core xrpld.perflog
xrpld.perflog == xrpld.core
Loop: xrpld.net xrpld.rpc
xrpld.rpc ~= xrpld.net
Loop: xrpld.overlay xrpld.rpc
xrpld.rpc ~= xrpld.overlay


@@ -2,6 +2,12 @@ libxrpl.basics > xrpl.basics
libxrpl.crypto > xrpl.basics
libxrpl.json > xrpl.basics
libxrpl.json > xrpl.json
libxrpl.ledger > xrpl.basics
libxrpl.ledger > xrpl.json
libxrpl.ledger > xrpl.ledger
libxrpl.ledger > xrpl.protocol
libxrpl.net > xrpl.basics
libxrpl.net > xrpl.net
libxrpl.protocol > xrpl.basics
libxrpl.protocol > xrpl.json
libxrpl.protocol > xrpl.protocol
@@ -19,11 +25,11 @@ test.app > test.unit_test
test.app > xrpl.basics
test.app > xrpld.app
test.app > xrpld.core
test.app > xrpld.ledger
test.app > xrpld.nodestore
test.app > xrpld.overlay
test.app > xrpld.rpc
test.app > xrpl.json
test.app > xrpl.ledger
test.app > xrpl.protocol
test.app > xrpl.resource
test.basics > test.jtx
@@ -42,8 +48,8 @@ test.consensus > test.unit_test
test.consensus > xrpl.basics
test.consensus > xrpld.app
test.consensus > xrpld.consensus
test.consensus > xrpld.ledger
test.consensus > xrpl.json
test.consensus > xrpl.ledger
test.core > test.jtx
test.core > test.toplevel
test.core > test.unit_test
@@ -61,10 +67,10 @@ test.json > xrpl.json
test.jtx > xrpl.basics
test.jtx > xrpld.app
test.jtx > xrpld.core
test.jtx > xrpld.ledger
test.jtx > xrpld.net
test.jtx > xrpld.rpc
test.jtx > xrpl.json
test.jtx > xrpl.ledger
test.jtx > xrpl.net
test.jtx > xrpl.protocol
test.jtx > xrpl.resource
test.jtx > xrpl.server
@@ -73,7 +79,7 @@ test.ledger > test.toplevel
test.ledger > xrpl.basics
test.ledger > xrpld.app
test.ledger > xrpld.core
test.ledger > xrpld.ledger
test.ledger > xrpl.ledger
test.ledger > xrpl.protocol
test.nodestore > test.jtx
test.nodestore > test.toplevel
@@ -109,7 +115,6 @@ test.rpc > test.toplevel
test.rpc > xrpl.basics
test.rpc > xrpld.app
test.rpc > xrpld.core
test.rpc > xrpld.net
test.rpc > xrpld.overlay
test.rpc > xrpld.rpc
test.rpc > xrpl.json
@@ -133,7 +138,11 @@ test.toplevel > test.csf
test.toplevel > xrpl.json
test.unit_test > xrpl.basics
tests.libxrpl > xrpl.basics
tests.libxrpl > xrpl.net
xrpl.json > xrpl.basics
xrpl.ledger > xrpl.basics
xrpl.ledger > xrpl.protocol
xrpl.net > xrpl.basics
xrpl.protocol > xrpl.basics
xrpl.protocol > xrpl.json
xrpl.resource > xrpl.basics
@@ -149,6 +158,8 @@ xrpld.app > xrpld.consensus
xrpld.app > xrpld.nodestore
xrpld.app > xrpld.perflog
xrpld.app > xrpl.json
xrpld.app > xrpl.ledger
xrpld.app > xrpl.net
xrpld.app > xrpl.protocol
xrpld.app > xrpl.resource
xrpld.conditions > xrpl.basics
@@ -158,14 +169,8 @@ xrpld.consensus > xrpl.json
xrpld.consensus > xrpl.protocol
xrpld.core > xrpl.basics
xrpld.core > xrpl.json
xrpld.core > xrpl.net
xrpld.core > xrpl.protocol
xrpld.ledger > xrpl.basics
xrpld.ledger > xrpl.json
xrpld.ledger > xrpl.protocol
xrpld.net > xrpl.basics
xrpld.net > xrpl.json
xrpld.net > xrpl.protocol
xrpld.net > xrpl.resource
xrpld.nodestore > xrpl.basics
xrpld.nodestore > xrpld.core
xrpld.nodestore > xrpld.unity
@@ -186,9 +191,10 @@ xrpld.perflog > xrpl.basics
xrpld.perflog > xrpl.json
xrpld.rpc > xrpl.basics
xrpld.rpc > xrpld.core
xrpld.rpc > xrpld.ledger
xrpld.rpc > xrpld.nodestore
xrpld.rpc > xrpl.json
xrpld.rpc > xrpl.ledger
xrpld.rpc > xrpl.net
xrpld.rpc > xrpl.protocol
xrpld.rpc > xrpl.resource
xrpld.rpc > xrpl.server

.github/scripts/strategy-matrix/generate.py (new executable file)

@@ -0,0 +1,197 @@
#!/usr/bin/env python3
import argparse
import itertools
import json
from pathlib import Path
from dataclasses import dataclass
THIS_DIR = Path(__file__).parent.resolve()
@dataclass
class Config:
architecture: list[dict]
os: list[dict]
build_type: list[str]
cmake_args: list[str]
'''
Generate a strategy matrix for GitHub Actions CI.
On each PR commit we will build a selection of Debian, RHEL, Ubuntu, MacOS, and
Windows configurations, while upon merge into the develop, release, or master
branches, we will build all configurations, and test most of them.
We will further set additional CMake arguments as follows:
- All builds will have the `tests`, `werr`, and `xrpld` options.
- All builds will have the `wextra` option except for GCC 12 and Clang 16.
- All release builds will have the `assert` option.
- Certain Debian Bookworm configurations will change the reference fee, enable
codecov, and enable voidstar in PRs.
'''
def generate_strategy_matrix(all: bool, config: Config) -> list:
configurations = []
for architecture, os, build_type, cmake_args in itertools.product(config.architecture, config.os, config.build_type, config.cmake_args):
# The default CMake target is 'all' for Linux and MacOS and 'install'
# for Windows, but it can get overridden for certain configurations.
cmake_target = 'install' if os["distro_name"] == 'windows' else 'all'
# We build and test all configurations by default, except for Windows in
# Debug, because it is too slow, as well as when code coverage is
# enabled as that mode already runs the tests.
build_only = False
if os['distro_name'] == 'windows' and build_type == 'Debug':
build_only = True
# Only generate a subset of configurations in PRs.
if not all:
# Debian:
# - Bookworm using GCC 13: Release and Unity on linux/amd64, set
# the reference fee to 500.
# - Bookworm using GCC 15: Debug and no Unity on linux/amd64, enable
# code coverage (which will be done below).
# - Bookworm using Clang 16: Debug and no Unity on linux/arm64,
# enable voidstar.
# - Bookworm using Clang 17: Release and no Unity on linux/amd64,
# set the reference fee to 1000.
# - Bookworm using Clang 20: Debug and Unity on linux/amd64.
if os['distro_name'] == 'debian':
skip = True
if os['distro_version'] == 'bookworm':
if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-13' and build_type == 'Release' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/amd64':
cmake_args = f'-DUNIT_TEST_REFERENCE_FEE=500 {cmake_args}'
skip = False
if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-15' and build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/amd64':
skip = False
if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-16' and build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/arm64':
cmake_args = f'-Dvoidstar=ON {cmake_args}'
skip = False
if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-17' and build_type == 'Release' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/amd64':
cmake_args = f'-DUNIT_TEST_REFERENCE_FEE=1000 {cmake_args}'
skip = False
if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-20' and build_type == 'Debug' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/amd64':
skip = False
if skip:
continue
# RHEL:
# - 9 using GCC 12: Debug and Unity on linux/amd64.
# - 10 using Clang: Release and no Unity on linux/amd64.
if os['distro_name'] == 'rhel':
skip = True
if os['distro_version'] == '9':
if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-12' and build_type == 'Debug' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/amd64':
skip = False
elif os['distro_version'] == '10':
if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-any' and build_type == 'Release' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/amd64':
skip = False
if skip:
continue
# Ubuntu:
# - Jammy using GCC 12: Debug and no Unity on linux/arm64.
# - Noble using GCC 14: Release and Unity on linux/amd64.
# - Noble using Clang 18: Debug and no Unity on linux/amd64.
# - Noble using Clang 19: Release and Unity on linux/arm64.
if os['distro_name'] == 'ubuntu':
skip = True
if os['distro_version'] == 'jammy':
if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-12' and build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/arm64':
skip = False
elif os['distro_version'] == 'noble':
if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-14' and build_type == 'Release' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/amd64':
skip = False
if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-18' and build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/amd64':
skip = False
if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-19' and build_type == 'Release' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/arm64':
skip = False
if skip:
continue
# MacOS:
# - Debug and no Unity on macos/arm64.
if os['distro_name'] == 'macos' and not (build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'macos/arm64'):
continue
# Windows:
# - Release and Unity on windows/amd64.
if os['distro_name'] == 'windows' and not (build_type == 'Release' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'windows/amd64'):
continue
# Additional CMake arguments.
cmake_args = f'{cmake_args} -Dtests=ON -Dwerr=ON -Dxrpld=ON'
if not f'{os['compiler_name']}-{os['compiler_version']}' in ['gcc-12', 'clang-16']:
cmake_args = f'{cmake_args} -Dwextra=ON'
if build_type == 'Release':
cmake_args = f'{cmake_args} -Dassert=ON'
# We skip all RHEL on arm64 due to a build failure that needs further
# investigation.
if os['distro_name'] == 'rhel' and architecture['platform'] == 'linux/arm64':
continue
# We skip all clang-20 on arm64 due to boost 1.86 build error
if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-20' and architecture['platform'] == 'linux/arm64':
continue
# Enable code coverage for Debian Bookworm using GCC 15 in Debug and no
# Unity on linux/amd64
if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-15' and build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/amd64':
cmake_args = f'-Dcoverage=ON -Dcoverage_format=xml -DCODE_COVERAGE_VERBOSE=ON -DCMAKE_C_FLAGS=-O0 -DCMAKE_CXX_FLAGS=-O0 {cmake_args}'
cmake_target = 'coverage'
build_only = True
# Generate a unique name for the configuration, e.g. macos-arm64-debug
# or debian-bookworm-gcc-12-amd64-release-unity.
config_name = os['distro_name']
if (n := os['distro_version']) != '':
config_name += f'-{n}'
if (n := os['compiler_name']) != '':
config_name += f'-{n}'
if (n := os['compiler_version']) != '':
config_name += f'-{n}'
config_name += f'-{architecture['platform'][architecture['platform'].find('/')+1:]}'
config_name += f'-{build_type.lower()}'
if '-Dunity=ON' in cmake_args:
config_name += '-unity'
# Add the configuration to the list, with the most unique fields first,
# so that they are easier to identify in the GitHub Actions UI, as long
# names get truncated.
configurations.append({
'config_name': config_name,
'cmake_args': cmake_args,
'cmake_target': cmake_target,
'build_only': build_only,
'build_type': build_type,
'os': os,
'architecture': architecture,
})
return configurations
def read_config(file: Path) -> Config:
config = json.loads(file.read_text())
if config['architecture'] is None or config['os'] is None or config['build_type'] is None or config['cmake_args'] is None:
raise Exception('Invalid configuration file.')
return Config(**config)
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument('-a', '--all', help='Set to generate all configurations (generally used when merging a PR) or leave unset to generate a subset of configurations (generally used when committing to a PR).', action="store_true")
parser.add_argument('-c', '--config', help='Path to the JSON file containing the strategy matrix configurations.', required=False, type=Path)
args = parser.parse_args()
matrix = []
if args.config is None or args.config == '':
matrix += generate_strategy_matrix(args.all, read_config(THIS_DIR / "linux.json"))
matrix += generate_strategy_matrix(args.all, read_config(THIS_DIR / "macos.json"))
matrix += generate_strategy_matrix(args.all, read_config(THIS_DIR / "windows.json"))
else:
matrix += generate_strategy_matrix(args.all, read_config(args.config))
# Generate the strategy matrix.
print(f'matrix={json.dumps({"include": matrix})}')


@@ -0,0 +1,184 @@
{
"architecture": [
{
"platform": "linux/amd64",
"runner": ["self-hosted", "Linux", "X64", "heavy"]
},
{
"platform": "linux/arm64",
"runner": ["self-hosted", "Linux", "ARM64", "heavy-arm64"]
}
],
"os": [
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "gcc",
"compiler_version": "12",
"image_sha": "6948666"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "gcc",
"compiler_version": "13",
"image_sha": "6948666"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "6948666"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "gcc",
"compiler_version": "15",
"image_sha": "6948666"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "clang",
"compiler_version": "16",
"image_sha": "6948666"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "clang",
"compiler_version": "17",
"image_sha": "6948666"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "clang",
"compiler_version": "18",
"image_sha": "6948666"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "clang",
"compiler_version": "19",
"image_sha": "6948666"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "clang",
"compiler_version": "20",
"image_sha": "6948666"
},
{
"distro_name": "rhel",
"distro_version": "8",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "10e69b4"
},
{
"distro_name": "rhel",
"distro_version": "8",
"compiler_name": "clang",
"compiler_version": "any",
"image_sha": "10e69b4"
},
{
"distro_name": "rhel",
"distro_version": "9",
"compiler_name": "gcc",
"compiler_version": "12",
"image_sha": "10e69b4"
},
{
"distro_name": "rhel",
"distro_version": "9",
"compiler_name": "gcc",
"compiler_version": "13",
"image_sha": "10e69b4"
},
{
"distro_name": "rhel",
"distro_version": "9",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "10e69b4"
},
{
"distro_name": "rhel",
"distro_version": "9",
"compiler_name": "clang",
"compiler_version": "any",
"image_sha": "10e69b4"
},
{
"distro_name": "rhel",
"distro_version": "10",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "10e69b4"
},
{
"distro_name": "rhel",
"distro_version": "10",
"compiler_name": "clang",
"compiler_version": "any",
"image_sha": "10e69b4"
},
{
"distro_name": "ubuntu",
"distro_version": "jammy",
"compiler_name": "gcc",
"compiler_version": "12",
"image_sha": "6948666"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "gcc",
"compiler_version": "13",
"image_sha": "6948666"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "6948666"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "clang",
"compiler_version": "16",
"image_sha": "6948666"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "clang",
"compiler_version": "17",
"image_sha": "6948666"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "clang",
"compiler_version": "18",
"image_sha": "6948666"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "clang",
"compiler_version": "19",
"image_sha": "6948666"
}
],
"build_type": ["Debug", "Release"],
"cmake_args": ["-Dunity=OFF", "-Dunity=ON"]
}


@@ -0,0 +1,22 @@
{
"architecture": [
{
"platform": "macos/arm64",
"runner": ["self-hosted", "macOS", "ARM64", "mac-runner-m1"]
}
],
"os": [
{
"distro_name": "macos",
"distro_version": "",
"compiler_name": "",
"compiler_version": "",
"image_sha": ""
}
],
"build_type": ["Debug", "Release"],
"cmake_args": [
"-Dunity=OFF -DCMAKE_POLICY_VERSION_MINIMUM=3.5",
"-Dunity=ON -DCMAKE_POLICY_VERSION_MINIMUM=3.5"
]
}


@@ -0,0 +1,19 @@
{
"architecture": [
{
"platform": "windows/amd64",
"runner": ["self-hosted", "Windows", "devbox"]
}
],
"os": [
{
"distro_name": "windows",
"distro_version": "",
"compiler_name": "",
"compiler_version": "",
"image_sha": ""
}
],
"build_type": ["Debug", "Release"],
"cmake_args": ["-Dunity=OFF", "-Dunity=ON"]
}


@@ -1,64 +0,0 @@
name: clang-format
on:
push:
pull_request:
types: [opened, reopened, synchronize, ready_for_review]
jobs:
check:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
runs-on: ubuntu-24.04
container: ghcr.io/xrplf/ci/tools-rippled-clang-format
steps:
# For jobs running in containers, $GITHUB_WORKSPACE and ${{ github.workspace }} might not be the
# same directory. The actions/checkout step is *supposed* to checkout into $GITHUB_WORKSPACE and
# then add it to safe.directory (see instructions at https://github.com/actions/checkout)
# but that's apparently not happening for some container images. We can't be sure what is actually
# happening, so let's pre-emptively add both directories to safe.directory. There's a
# Github issue opened in 2022 and not resolved in 2025 https://github.com/actions/runner/issues/2058 ¯\_(ツ)_/¯
- run: |
git config --global --add safe.directory $GITHUB_WORKSPACE
git config --global --add safe.directory ${{ github.workspace }}
- uses: actions/checkout@v4
- name: Format first-party sources
run: |
clang-format --version
find include src tests -type f \( -name '*.cpp' -o -name '*.hpp' -o -name '*.h' -o -name '*.ipp' \) -exec clang-format -i {} +
- name: Check for differences
id: assert
shell: bash
run: |
set -o pipefail
git diff --exit-code | tee "clang-format.patch"
- name: Upload patch
if: failure() && steps.assert.outcome == 'failure'
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: clang-format.patch
if-no-files-found: ignore
path: clang-format.patch
- name: What happened?
if: failure() && steps.assert.outcome == 'failure'
env:
PREAMBLE: |
If you are reading this, you are looking at a failed Github Actions
job. That means you pushed one or more files that did not conform
to the formatting specified in .clang-format. That may be because
you neglected to run 'git clang-format' or 'clang-format' before
committing, or that your version of clang-format has an
incompatibility with the one on this
machine, which is:
SUGGESTION: |
To fix it, you can do one of two things:
1. Download and apply the patch generated as an artifact of this
job to your repo, commit, and push.
2. Run 'git-clang-format --extensions cpp,h,hpp,ipp develop'
in your repo, commit, and push.
run: |
echo "${PREAMBLE}"
clang-format --version
echo "${SUGGESTION}"
exit 1


@@ -1,37 +0,0 @@
name: Build and publish Doxygen documentation
# To test this workflow, push your changes to your fork's `develop` branch.
on:
push:
branches:
- develop
- doxygen
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
documentation:
runs-on: ubuntu-latest
permissions:
contents: write
container: ghcr.io/xrplf/rippled-build-ubuntu:aaf5e3e
steps:
- name: checkout
uses: actions/checkout@v4
- name: check environment
run: |
echo ${PATH} | tr ':' '\n'
cmake --version
doxygen --version
env | sort
- name: build
run: |
mkdir build
cd build
cmake -Donly_docs=TRUE ..
cmake --build . --target docs --parallel $(nproc)
- name: publish
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: build/docs/html


@@ -1,53 +0,0 @@
name: levelization
on:
push:
pull_request:
types: [opened, reopened, synchronize, ready_for_review]
jobs:
check:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
runs-on: ubuntu-latest
env:
CLANG_VERSION: 10
steps:
- uses: actions/checkout@v4
- name: Check levelization
run: Builds/levelization/levelization.sh
- name: Check for differences
id: assert
run: |
set -o pipefail
git diff --exit-code | tee "levelization.patch"
- name: Upload patch
if: failure() && steps.assert.outcome == 'failure'
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: levelization.patch
if-no-files-found: ignore
path: levelization.patch
- name: What happened?
if: failure() && steps.assert.outcome == 'failure'
env:
MESSAGE: |
If you are reading this, you are looking at a failed Github
Actions job. That means you changed the dependency relationships
between the modules in rippled. That may be an improvement or a
regression. This check doesn't judge.
A rule of thumb, though, is that if your changes caused
something to be removed from loops.txt, that's probably an
improvement. If something was added, it's probably a regression.
To fix it, you can do one of two things:
1. Download and apply the patch generated as an artifact of this
job to your repo, commit, and push.
2. Run './Builds/levelization/levelization.sh' in your repo,
commit, and push.
See Builds/levelization/README.md for more info.
run: |
echo "${MESSAGE}"
exit 1


@@ -1,91 +0,0 @@
name: Check libXRPL compatibility with Clio
env:
CONAN_REMOTE_URL: https://conan.ripplex.io
CONAN_LOGIN_USERNAME_XRPLF: ${{ secrets.CONAN_REMOTE_USERNAME }}
CONAN_PASSWORD_XRPLF: ${{ secrets.CONAN_REMOTE_PASSWORD }}
on:
pull_request:
paths:
- "src/libxrpl/protocol/BuildInfo.cpp"
- ".github/workflows/libxrpl.yml"
types: [opened, reopened, synchronize, ready_for_review]
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
publish:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
name: Publish libXRPL
outputs:
outcome: ${{ steps.upload.outputs.outcome }}
version: ${{ steps.version.outputs.version }}
channel: ${{ steps.channel.outputs.channel }}
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/rippled-build-ubuntu:aaf5e3e
steps:
- name: Wait for essential checks to succeed
uses: lewagon/wait-on-check-action@v1.3.4
with:
ref: ${{ github.event.pull_request.head.sha || github.sha }}
running-workflow-name: wait-for-check-regexp
check-regexp: "(dependencies|test).*linux.*" # Ignore windows and mac tests but make sure linux passes
repo-token: ${{ secrets.GITHUB_TOKEN }}
wait-interval: 10
- name: Checkout
uses: actions/checkout@v4
- name: Generate channel
id: channel
shell: bash
run: |
echo channel="clio/pr_${{ github.event.pull_request.number }}" | tee ${GITHUB_OUTPUT}
- name: Export new package
shell: bash
run: |
conan export . ${{ steps.channel.outputs.channel }}
- name: Add Conan remote
shell: bash
run: |
echo "Adding Conan remote 'xrplf' at ${{ env.CONAN_REMOTE_URL }}."
conan remote add xrplf ${{ env.CONAN_REMOTE_URL }} --insert 0 --force
echo "Listing Conan remotes."
conan remote list
- name: Parse new version
id: version
shell: bash
run: |
echo version="$(cat src/libxrpl/protocol/BuildInfo.cpp | grep "versionString =" \
| awk -F '"' '{print $2}')" | tee ${GITHUB_OUTPUT}
- name: Try to authenticate to Conan remote
id: remote
shell: bash
run: |
# `conan user` implicitly uses the environment variables CONAN_LOGIN_USERNAME_<REMOTE> and CONAN_PASSWORD_<REMOTE>.
# https://docs.conan.io/1/reference/commands/misc/user.html#using-environment-variables
# https://docs.conan.io/1/reference/env_vars.html#conan-login-username-conan-login-username-remote-name
# https://docs.conan.io/1/reference/env_vars.html#conan-password-conan-password-remote-name
echo outcome=$(conan user --remote xrplf --password >&2 \
&& echo success || echo failure) | tee ${GITHUB_OUTPUT}
- name: Upload new package
id: upload
if: (steps.remote.outputs.outcome == 'success')
shell: bash
run: |
echo "conan upload version ${{ steps.version.outputs.version }} on channel ${{ steps.channel.outputs.channel }}"
echo outcome=$(conan upload xrpl/${{ steps.version.outputs.version }}@${{ steps.channel.outputs.channel }} --remote ripple --confirm >&2 \
&& echo success || echo failure) | tee ${GITHUB_OUTPUT}
notify_clio:
name: Notify Clio
runs-on: ubuntu-latest
needs: publish
env:
GH_TOKEN: ${{ secrets.CLIO_NOTIFY_TOKEN }}
steps:
- name: Notify Clio about new version
if: (needs.publish.outputs.outcome == 'success')
shell: bash
run: |
gh api --method POST -H "Accept: application/vnd.github+json" -H "X-GitHub-Api-Version: 2022-11-28" \
/repos/xrplf/clio/dispatches -f "event_type=check_libxrpl" \
-F "client_payload[version]=${{ needs.publish.outputs.version }}@${{ needs.publish.outputs.channel }}" \
-F "client_payload[pr]=${{ github.event.pull_request.number }}"


@@ -1,112 +0,0 @@
name: macos
on:
pull_request:
types: [opened, reopened, synchronize, ready_for_review]
push:
# If the branches list is ever changed, be sure to change it on all
# build/test jobs (nix, macos, windows, instrumentation)
branches:
# Always build the package branches
- develop
- release
- master
# Branches that opt-in to running
- "ci/**"
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
# This part of Conan configuration is specific to this workflow only; we do not want
# to pollute conan/profiles directory with settings which might not work for others
env:
CONAN_REMOTE_URL: https://conan.ripplex.io
CONAN_REMOTE_USERNAME: ${{ secrets.CONAN_REMOTE_USERNAME }}
CONAN_REMOTE_PASSWORD: ${{ secrets.CONAN_REMOTE_PASSWORD }}
# This part of the Conan configuration is specific to this workflow only; we
# do not want to pollute the 'conan/profiles' directory with settings that
# might not work for other workflows.
CONAN_GLOBAL_CONF: |
core.download:parallel={{os.cpu_count()}}
core.upload:parallel={{os.cpu_count()}}
tools.build:jobs={{ (os.cpu_count() * 4/5) | int }}
tools.build:verbosity=verbose
tools.compilation:verbosity=verbose
jobs:
test:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
strategy:
matrix:
platform:
- macos
generator:
- Ninja
configuration:
- Release
runs-on: [self-hosted, macOS, mac-runner-m1]
env:
# The `build` action requires these variables.
build_dir: .build
NUM_PROCESSORS: 12
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: install Conan
run: |
brew install conan
- name: install Ninja
if: matrix.generator == 'Ninja'
run: brew install ninja
- name: install python
run: |
if which python > /dev/null 2>&1; then
echo "Python executable exists"
else
brew install python@3.13
ln -s /opt/homebrew/bin/python3 /opt/homebrew/bin/python
fi
- name: install cmake
run: |
if which cmake > /dev/null 2>&1; then
echo "cmake executable exists"
else
brew install cmake
fi
- name: install nproc
run: |
brew install coreutils
- name: check environment
run: |
env | sort
echo ${PATH} | tr ':' '\n'
python --version
conan --version
cmake --version
nproc --version
echo -n "nproc returns: "
nproc
system_profiler SPHardwareDataType
sysctl -n hw.logicalcpu
clang --version
- name: configure Conan
run: |
echo "${CONAN_GLOBAL_CONF}" > $(conan config home)/global.conf
conan config install conan/profiles/ -tf $(conan config home)/profiles/
conan profile show
- name: build dependencies
uses: ./.github/actions/dependencies
with:
configuration: ${{ matrix.configuration }}
- name: build
uses: ./.github/actions/build
with:
generator: ${{ matrix.generator }}
configuration: ${{ matrix.configuration }}
cmake-args: "-Dassert=TRUE -Dwerr=TRUE ${{ matrix.cmake-args }}"
- name: test
run: |
n=$(nproc)
echo "Using $n test jobs"
cd ${build_dir}
./rippled --unittest --unittest-jobs $n
ctest -j $n --output-on-failure


@@ -1,60 +0,0 @@
name: missing-commits
on:
push:
branches:
# Only check that the branches are up to date when updating the
# relevant branches.
- develop
- release
jobs:
up_to_date:
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Check for missing commits
id: commits
env:
SUGGESTION: |
If you are reading this, then the commits indicated above are
missing from "develop" and/or "release". Do a reverse-merge
as soon as possible. See CONTRIBUTING.md for instructions.
run: |
set -o pipefail
# Branches ordered by how "canonical" they are. Every commit in
# one branch should be in all the branches behind it
order=( master release develop )
branches=()
for branch in "${order[@]}"
do
# Check that the branches exist so that this job will work on
# forked repos, which don't necessarily have master and
# release branches.
if git ls-remote --exit-code --heads origin \
refs/heads/${branch} > /dev/null
then
branches+=( origin/${branch} )
fi
done
prior=()
for branch in "${branches[@]}"
do
if [[ ${#prior[@]} -ne 0 ]]
then
echo "Checking ${prior[@]} for commits missing from ${branch}"
git log --oneline --no-merges "${prior[@]}" \
^$branch | tee -a "missing-commits.txt"
echo
fi
prior+=( "${branch}" )
done
if [[ $( cat missing-commits.txt | wc -l ) -ne 0 ]]
then
echo "${SUGGESTION}"
exit 1
fi


@@ -1,422 +0,0 @@
name: nix
on:
pull_request:
types: [opened, reopened, synchronize, ready_for_review]
push:
# If the branches list is ever changed, be sure to change it on all
# build/test jobs (nix, macos, windows)
branches:
# Always build the package branches
- develop
- release
- master
# Branches that opt-in to running
- "ci/**"
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
env:
CONAN_REMOTE_URL: https://conan.ripplex.io
CONAN_REMOTE_USERNAME: ${{ secrets.CONAN_REMOTE_USERNAME }}
CONAN_REMOTE_PASSWORD: ${{ secrets.CONAN_REMOTE_PASSWORD }}
# This part of the Conan configuration is specific to this workflow only; we
# do not want to pollute the 'conan/profiles' directory with settings that
# might not work for other workflows.
CONAN_GLOBAL_CONF: |
core.download:parallel={{ os.cpu_count() }}
core.upload:parallel={{ os.cpu_count() }}
tools.build:jobs={{ (os.cpu_count() * 4/5) | int }}
tools.build:verbosity=verbose
tools.compilation:verbosity=verbose
# This workflow has multiple job matrixes.
# They can be considered phases because most of the matrices ("test",
# "coverage", "conan", ) depend on the first ("dependencies").
#
# The first phase has a job in the matrix for each combination of
# variables that affects dependency ABI:
# platform, compiler, and configuration.
# It creates a GitHub artifact holding the Conan profile,
# and builds and caches binaries for all the dependencies.
# If an Artifactory remote is configured, they are cached there.
# If not, they are added to the GitHub artifact.
# GitHub's "cache" action has a size limit (10 GB) that is too small
# to hold the binaries if they are built locally.
# We must use the "{upload,download}-artifact" actions instead.
#
# The remaining phases have a job in the matrix for each test
# configuration. They install dependency binaries from the cache,
# whichever was used, and build and test rippled.
#
# "instrumentation" is independent, but is included here because it also
# builds on linux in the same "on:" conditions.
jobs:
dependencies:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
strategy:
fail-fast: false
matrix:
platform:
- linux
compiler:
- gcc
- clang
configuration:
- Debug
- Release
include:
- compiler: gcc
compiler_version: 12
distro: ubuntu
codename: jammy
- compiler: clang
compiler_version: 16
distro: debian
codename: bookworm
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/ci/${{ matrix.distro }}-${{ matrix.codename }}:${{ matrix.compiler }}-${{ matrix.compiler_version }}
env:
build_dir: .build
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: check environment
run: |
echo ${PATH} | tr ':' '\n'
lsb_release -a || true
${{ matrix.compiler }}-${{ matrix.compiler_version }} --version
conan --version
cmake --version
env | sort
- name: configure Conan
run: |
echo "${CONAN_GLOBAL_CONF}" >> $(conan config home)/global.conf
conan config install conan/profiles/ -tf $(conan config home)/profiles/
conan profile show
- name: archive profile
# Create this archive before dependencies are added to the local cache.
run: tar -czf conan.tar.gz -C ${CONAN_HOME} .
- name: build dependencies
uses: ./.github/actions/dependencies
with:
configuration: ${{ matrix.configuration }}
- name: upload archive
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02
with:
name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
path: conan.tar.gz
if-no-files-found: error
test:
strategy:
fail-fast: false
matrix:
platform:
- linux
compiler:
- gcc
- clang
configuration:
- Debug
- Release
include:
- compiler: gcc
compiler_version: 12
distro: ubuntu
codename: jammy
- compiler: clang
compiler_version: 16
distro: debian
codename: bookworm
cmake-args:
-
- "-Dunity=ON"
needs: dependencies
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/ci/${{ matrix.distro }}-${{ matrix.codename }}:${{ matrix.compiler }}-${{ matrix.compiler_version }}
env:
build_dir: .build
steps:
- name: download cache
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093
with:
name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
- name: extract cache
run: |
mkdir -p ${CONAN_HOME}
tar -xzf conan.tar.gz -C ${CONAN_HOME}
- name: check environment
run: |
env | sort
echo ${PATH} | tr ':' '\n'
conan --version
cmake --version
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: dependencies
uses: ./.github/actions/dependencies
with:
configuration: ${{ matrix.configuration }}
- name: build
uses: ./.github/actions/build
with:
generator: Ninja
configuration: ${{ matrix.configuration }}
cmake-args: "-Dassert=TRUE -Dwerr=TRUE ${{ matrix.cmake-args }}"
- name: check linking
run: |
cd ${build_dir}
ldd ./rippled
if [ "$(ldd ./rippled | grep -E '(libstdc\+\+|libgcc)' | wc -l)" -eq 0 ]; then
echo 'The binary is statically linked.'
else
echo 'The binary is dynamically linked.'
exit 1
fi
- name: test
run: |
cd ${build_dir}
./rippled --unittest --unittest-jobs $(nproc)
ctest -j $(nproc) --output-on-failure
reference-fee-test:
strategy:
fail-fast: false
matrix:
platform:
- linux
compiler:
- gcc
configuration:
- Debug
cmake-args:
- "-DUNIT_TEST_REFERENCE_FEE=200"
- "-DUNIT_TEST_REFERENCE_FEE=1000"
needs: dependencies
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/ci/ubuntu-jammy:gcc-12
env:
build_dir: .build
steps:
- name: download cache
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093
with:
name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
- name: extract cache
run: |
mkdir -p ${CONAN_HOME}
tar -xzf conan.tar.gz -C ${CONAN_HOME}
- name: check environment
run: |
env | sort
echo ${PATH} | tr ':' '\n'
conan --version
cmake --version
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: dependencies
uses: ./.github/actions/dependencies
with:
configuration: ${{ matrix.configuration }}
- name: build
uses: ./.github/actions/build
with:
generator: Ninja
configuration: ${{ matrix.configuration }}
cmake-args: "-Dassert=TRUE -Dwerr=TRUE ${{ matrix.cmake-args }}"
- name: test
run: |
cd ${build_dir}
./rippled --unittest --unittest-jobs $(nproc)
ctest -j $(nproc) --output-on-failure
coverage:
strategy:
fail-fast: false
matrix:
platform:
- linux
compiler:
- gcc
configuration:
- Debug
needs: dependencies
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/ci/ubuntu-jammy:gcc-12
env:
build_dir: .build
steps:
- name: download cache
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093
with:
name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
- name: extract cache
run: |
mkdir -p ${CONAN_HOME}
tar -xzf conan.tar.gz -C ${CONAN_HOME}
- name: check environment
run: |
echo ${PATH} | tr ':' '\n'
conan --version
cmake --version
gcovr --version
env | sort
ls ${CONAN_HOME}
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: dependencies
uses: ./.github/actions/dependencies
with:
configuration: ${{ matrix.configuration }}
- name: build
uses: ./.github/actions/build
with:
generator: Ninja
configuration: ${{ matrix.configuration }}
cmake-args: >-
-Dassert=TRUE
-Dwerr=TRUE
-Dcoverage=ON
-Dcoverage_format=xml
-DCODE_COVERAGE_VERBOSE=ON
-DCMAKE_CXX_FLAGS="-O0"
-DCMAKE_C_FLAGS="-O0"
cmake-target: coverage
- name: move coverage report
shell: bash
run: |
mv "${build_dir}/coverage.xml" ./
- name: archive coverage report
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02
with:
name: coverage.xml
path: coverage.xml
retention-days: 30
- name: upload coverage report
uses: wandalen/wretry.action@v1.4.10
with:
action: codecov/codecov-action@v4.5.0
with: |
files: coverage.xml
fail_ci_if_error: true
disable_search: true
verbose: true
plugin: noop
token: ${{ secrets.CODECOV_TOKEN }}
attempt_limit: 5
attempt_delay: 210000 # in milliseconds
conan:
needs: dependencies
runs-on: [self-hosted, heavy]
container:
image: ghcr.io/xrplf/ci/ubuntu-jammy:gcc-12
env:
build_dir: .build
platform: linux
compiler: gcc
compiler_version: 12
configuration: Release
steps:
- name: download cache
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093
with:
name: ${{ env.platform }}-${{ env.compiler }}-${{ env.configuration }}
- name: extract cache
run: |
mkdir -p ${CONAN_HOME}
tar -xzf conan.tar.gz -C ${CONAN_HOME}
- name: check environment
run: |
env | sort
echo ${PATH} | tr ':' '\n'
conan --version
cmake --version
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: dependencies
uses: ./.github/actions/dependencies
with:
configuration: ${{ env.configuration }}
- name: export
run: |
conan export . --version head
- name: build
run: |
cd tests/conan
mkdir ${build_dir} && cd ${build_dir}
conan install .. \
--settings:all build_type=${configuration} \
--output-folder . \
--build missing
cmake .. \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=./build/${configuration}/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE=${configuration}
cmake --build .
./example | grep '^[[:digit:]]\+\.[[:digit:]]\+\.[[:digit:]]\+'
instrumentation-build:
needs: dependencies
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/ci/debian-bookworm:clang-16
env:
build_dir: .build
steps:
- name: download cache
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093
with:
name: linux-clang-Debug
- name: extract cache
run: |
mkdir -p ${CONAN_HOME}
tar -xzf conan.tar.gz -C ${CONAN_HOME}
- name: check environment
run: |
echo ${PATH} | tr ':' '\n'
conan --version
cmake --version
env | sort
ls ${CONAN_HOME}
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: dependencies
uses: ./.github/actions/dependencies
with:
configuration: Debug
- name: prepare environment
run: |
mkdir -p ${build_dir}
echo "SOURCE_DIR=$(pwd)" >> $GITHUB_ENV
echo "BUILD_DIR=$(pwd)/${build_dir}" >> $GITHUB_ENV
- name: build with instrumentation
run: |
cd ${BUILD_DIR}
cmake -S ${SOURCE_DIR} -B ${BUILD_DIR} \
-Dvoidstar=ON \
-Dtests=ON \
-Dxrpld=ON \
-DCMAKE_BUILD_TYPE=Debug \
-DSECP256K1_BUILD_BENCHMARK=OFF \
-DSECP256K1_BUILD_TESTS=OFF \
-DSECP256K1_BUILD_EXHAUSTIVE_TESTS=OFF \
-DCMAKE_TOOLCHAIN_FILE=${BUILD_DIR}/build/generators/conan_toolchain.cmake
cmake --build . --parallel $(nproc)
- name: verify instrumentation enabled
run: |
cd ${BUILD_DIR}
./rippled --version | grep libvoidstar
- name: run unit tests
run: |
cd ${BUILD_DIR}
./rippled -u --unittest-jobs $(( $(nproc)/4 ))
ctest -j $(nproc) --output-on-failure

.github/workflows/on-pr.yml

@@ -0,0 +1,133 @@
# This workflow runs all workflows to check, build and test the project on
# various Linux flavors, as well as on macOS and Windows, on every push to a
# user branch. However, it will not run if the pull request is a draft unless it
# has the 'DraftRunCI' label.
name: PR
on:
merge_group:
types:
- checks_requested
pull_request:
types:
- opened
- reopened
- synchronize
- ready_for_review
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
defaults:
run:
shell: bash
jobs:
# This job determines whether the rest of the workflow should run. It runs
# when the PR is not a draft (which should also cover merge-group) or
# has the 'DraftRunCI' label.
should-run:
if: ${{ !github.event.pull_request.draft || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Determine changed files
# This step checks whether any files have changed that should
# cause the next jobs to run. We do it this way rather than
# using `paths` in the `on:` section, because all required
# checks must pass, even for changes that do not modify anything
# that affects those checks. We would therefore like to make the
# checks required only if the job runs, but GitHub does not
# support that directly. By always executing the workflow on new
# commits and by using the changed-files action below, we ensure
# that GitHub considers any skipped jobs to have passed, and in
# turn the required checks as well.
id: changes
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
# These paths are unique to `on-pr.yml`.
.github/scripts/levelization/**
.github/workflows/reusable-check-levelization.yml
.github/workflows/reusable-notify-clio.yml
.github/workflows/on-pr.yml
# Keep the paths below in sync with those in `on-trigger.yml`.
.github/actions/build-deps/**
.github/actions/build-test/**
.github/actions/setup-conan/**
.github/scripts/strategy-matrix/**
.github/workflows/reusable-build.yml
.github/workflows/reusable-build-test-config.yml
.github/workflows/reusable-build-test.yml
.github/workflows/reusable-strategy-matrix.yml
.github/workflows/reusable-test.yml
.codecov.yml
cmake/**
conan/**
external/**
include/**
src/**
tests/**
CMakeLists.txt
conanfile.py
conan.lock
- name: Check whether to run
# This step determines whether the rest of the workflow should
# run. The rest of the workflow will run if this job runs AND at
# least one of:
# * Any of the files checked in the `changes` step were modified
# * The PR is NOT a draft and is labeled "Ready to merge"
# * The workflow is running from the merge queue
id: go
env:
FILES: ${{ steps.changes.outputs.any_changed }}
DRAFT: ${{ github.event.pull_request.draft }}
READY: ${{ contains(github.event.pull_request.labels.*.name, 'Ready to merge') }}
MERGE: ${{ github.event_name == 'merge_group' }}
run: |
echo "go=${{ (env.DRAFT != 'true' && env.READY == 'true') || env.FILES == 'true' || env.MERGE == 'true' }}" >> "${GITHUB_OUTPUT}"
cat "${GITHUB_OUTPUT}"
outputs:
go: ${{ steps.go.outputs.go == 'true' }}
check-levelization:
needs: should-run
if: ${{ needs.should-run.outputs.go == 'true' }}
uses: ./.github/workflows/reusable-check-levelization.yml
build-test:
needs: should-run
if: ${{ needs.should-run.outputs.go == 'true' }}
uses: ./.github/workflows/reusable-build-test.yml
strategy:
fail-fast: false
matrix:
os: [linux, macos, windows]
with:
os: ${{ matrix.os }}
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
notify-clio:
needs:
- should-run
- build-test
if: ${{ needs.should-run.outputs.go == 'true' && contains(fromJSON('["release", "master"]'), github.ref_name) }}
uses: ./.github/workflows/reusable-notify-clio.yml
secrets:
clio_notify_token: ${{ secrets.CLIO_NOTIFY_TOKEN }}
conan_remote_username: ${{ secrets.CONAN_REMOTE_USERNAME }}
conan_remote_password: ${{ secrets.CONAN_REMOTE_PASSWORD }}
passed:
if: failure() || cancelled()
needs:
- build-test
- check-levelization
runs-on: ubuntu-latest
steps:
- name: Fail
run: false

.github/workflows/on-trigger.yml

@@ -0,0 +1,75 @@
# This workflow runs all workflows to build the dependencies required for the
# project on various Linux flavors, as well as on macOS and Windows, on a
# scheduled basis, on merge into the 'develop', 'release', or 'master' branches,
# or manually. The missing commits check is only run when the code is merged
# into the 'develop' or 'release' branches, and the documentation is built when
# the code is merged into the 'develop' branch.
name: Trigger
on:
push:
branches:
- "develop"
- "release*"
- "master"
paths:
# These paths are unique to `on-trigger.yml`.
- ".github/workflows/reusable-check-missing-commits.yml"
- ".github/workflows/on-trigger.yml"
- ".github/workflows/publish-docs.yml"
# Keep the paths below in sync with those in `on-pr.yml`.
- ".github/actions/build-deps/**"
- ".github/actions/build-test/**"
- ".github/actions/setup-conan/**"
- ".github/scripts/strategy-matrix/**"
- ".github/workflows/reusable-build.yml"
- ".github/workflows/reusable-build-test-config.yml"
- ".github/workflows/reusable-build-test.yml"
- ".github/workflows/reusable-strategy-matrix.yml"
- ".github/workflows/reusable-test.yml"
- ".codecov.yml"
- "cmake/**"
- "conan/**"
- "external/**"
- "include/**"
- "src/**"
- "tests/**"
- "CMakeLists.txt"
- "conanfile.py"
- "conan.lock"
# Run at 06:32 UTC on every day of the week from Monday through Friday. This
# will force all dependencies to be rebuilt, which is useful to verify that
# all dependencies can be built successfully. Only the dependencies that
# are actually missing from the remote will be uploaded.
schedule:
- cron: "32 6 * * 1-5"
# Run when manually triggered via the GitHub UI or API.
workflow_dispatch:
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
defaults:
run:
shell: bash
jobs:
check-missing-commits:
if: ${{ github.event_name == 'push' && github.ref_type == 'branch' && contains(fromJSON('["develop", "release"]'), github.ref_name) }}
uses: ./.github/workflows/reusable-check-missing-commits.yml
build-test:
uses: ./.github/workflows/reusable-build-test.yml
strategy:
fail-fast: ${{ github.event_name == 'merge_group' }}
matrix:
os: [linux, macos, windows]
with:
os: ${{ matrix.os }}
strategy_matrix: ${{ github.event_name == 'schedule' && 'all' || 'minimal' }}
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

.github/workflows/pre-commit.yml

@@ -0,0 +1,15 @@
name: Run pre-commit hooks
on:
pull_request:
push:
branches: [develop, release, master]
workflow_dispatch:
jobs:
# Call the workflow in the XRPLF/actions repo that runs the pre-commit hooks.
run-hooks:
uses: XRPLF/actions/.github/workflows/pre-commit.yml@a8d7472b450eb53a1e5228f64552e5974457a21a
with:
runs_on: ubuntu-latest
container: '{ "image": "ghcr.io/xrplf/ci/tools-rippled-pre-commit:sha-a8c7be1" }'

.github/workflows/publish-docs.yml

@@ -0,0 +1,60 @@
# This workflow builds the documentation for the repository, and publishes it to
# GitHub Pages when changes are merged into the default branch.
name: Build and publish documentation
on:
push:
paths:
- ".github/workflows/publish-docs.yml"
- "*.md"
- "**/*.md"
- "docs/**"
- "include/**"
- "src/libxrpl/**"
- "src/xrpld/**"
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
defaults:
run:
shell: bash
env:
BUILD_DIR: .build
jobs:
publish:
runs-on: ubuntu-latest
container: ghcr.io/xrplf/ci/tools-rippled-documentation:sha-a8c7be1
permissions:
contents: write
steps:
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Check configuration
run: |
echo 'Checking path.'
echo ${PATH} | tr ':' '\n'
echo 'Checking environment variables.'
env | sort
echo 'Checking CMake version.'
cmake --version
echo 'Checking Doxygen version.'
doxygen --version
- name: Build documentation
run: |
mkdir -p "${BUILD_DIR}"
cd "${BUILD_DIR}"
cmake -Donly_docs=ON ..
cmake --build . --target docs --parallel $(nproc)
- name: Publish documentation
if: ${{ github.ref_type == 'branch' && github.ref_name == github.event.repository.default_branch }}
uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e # v4.0.0
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ${{ env.BUILD_DIR }}/docs/html

.github/workflows/reusable-build-test-config.yml

@@ -0,0 +1,69 @@
name: Build and test configuration
on:
workflow_call:
inputs:
build_dir:
description: "The directory where to build."
required: true
type: string
build_only:
description: 'Whether to only build or to build and test the code ("true", "false").'
required: true
type: boolean
build_type:
description: 'The build type to use ("Debug", "Release").'
type: string
required: true
cmake_args:
description: "Additional arguments to pass to CMake."
required: false
type: string
default: ""
cmake_target:
description: "The CMake target to build."
type: string
required: true
runs_on:
description: Runner to run the job on as a JSON string
required: true
type: string
image:
description: "The image to run in (leave empty to run natively)"
required: true
type: string
config_name:
description: "The configuration string (used for naming artifacts and such)."
required: true
type: string
secrets:
CODECOV_TOKEN:
description: "The Codecov token to use for uploading coverage reports."
required: true
jobs:
build:
uses: ./.github/workflows/reusable-build.yml
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ inputs.build_type }}
cmake_args: ${{ inputs.cmake_args }}
cmake_target: ${{ inputs.cmake_target }}
runs_on: ${{ inputs.runs_on }}
image: ${{ inputs.image }}
config_name: ${{ inputs.config_name }}
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
test:
needs: build
uses: ./.github/workflows/reusable-test.yml
with:
run_tests: ${{ !inputs.build_only }}
verify_voidstar: ${{ contains(inputs.cmake_args, '-Dvoidstar=ON') }}
runs_on: ${{ inputs.runs_on }}
image: ${{ inputs.image }}
config_name: ${{ inputs.config_name }}

.github/workflows/reusable-build-test.yml

@@ -0,0 +1,58 @@
# This workflow builds and tests the binary for various configurations.
name: Build and test
# This workflow can only be triggered by other workflows. Note that the
# workflow_call event does not support the 'choice' input type, see
# https://docs.github.com/en/actions/reference/workflows-and-actions/workflow-syntax#onworkflow_callinputsinput_idtype,
# so we use 'string' instead.
on:
workflow_call:
inputs:
build_dir:
description: "The directory where to build."
required: false
type: string
default: ".build"
os:
description: 'The operating system to use for the build ("linux", "macos", "windows").'
required: true
type: string
strategy_matrix:
# TODO: Support additional strategies, e.g. "ubuntu" for generating all Ubuntu configurations.
description: 'The strategy matrix to use for generating the configurations ("minimal", "all").'
required: false
type: string
default: "minimal"
secrets:
CODECOV_TOKEN:
description: "The Codecov token to use for uploading coverage reports."
required: true
jobs:
# Generate the strategy matrix to be used by the following job.
generate-matrix:
uses: ./.github/workflows/reusable-strategy-matrix.yml
with:
os: ${{ inputs.os }}
strategy_matrix: ${{ inputs.strategy_matrix }}
# Build and test the binary for each configuration.
build-test-config:
needs:
- generate-matrix
uses: ./.github/workflows/reusable-build-test-config.yml
strategy:
fail-fast: ${{ github.event_name == 'merge_group' }}
matrix: ${{ fromJson(needs.generate-matrix.outputs.matrix) }}
max-parallel: 10
with:
build_dir: ${{ inputs.build_dir }}
build_only: ${{ matrix.build_only }}
build_type: ${{ matrix.build_type }}
cmake_args: ${{ matrix.cmake_args }}
cmake_target: ${{ matrix.cmake_target }}
runs_on: ${{ toJSON(matrix.architecture.runner) }}
image: ${{ contains(matrix.architecture.platform, 'linux') && format('ghcr.io/xrplf/ci/{0}-{1}:{2}-{3}-sha-{4}', matrix.os.distro_name, matrix.os.distro_version, matrix.os.compiler_name, matrix.os.compiler_version, matrix.os.image_sha) || '' }}
config_name: ${{ matrix.config_name }}
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
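For illustration, the `format()` expression in the job above assembles the container image reference as follows; the concrete values below are assumptions for this sketch, not taken from the actual strategy-matrix files:

```bash
# Illustrative values only; the real ones come from the generated matrix.
distro_name=ubuntu distro_version=jammy
compiler_name=gcc compiler_version=12 image_sha=abc1234
echo "ghcr.io/xrplf/ci/${distro_name}-${distro_version}:${compiler_name}-${compiler_version}-sha-${image_sha}"
# -> ghcr.io/xrplf/ci/ubuntu-jammy:gcc-12-sha-abc1234
```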

.github/workflows/reusable-build.yml

@@ -0,0 +1,122 @@
name: Build rippled
on:
workflow_call:
inputs:
build_dir:
description: "The directory where to build."
required: true
type: string
build_type:
description: 'The build type to use ("Debug", "Release").'
required: true
type: string
cmake_args:
description: "Additional arguments to pass to CMake."
required: true
type: string
cmake_target:
description: "The CMake target to build."
required: true
type: string
runs_on:
description: Runner to run the job on as a JSON string
required: true
type: string
image:
description: "The image to run in (leave empty to run natively)"
required: true
type: string
config_name:
description: "The name of the configuration."
required: true
type: string
secrets:
CODECOV_TOKEN:
description: "The Codecov token to use for uploading coverage reports."
required: true
defaults:
run:
shell: bash
jobs:
build:
name: Build ${{ inputs.config_name }}
runs-on: ${{ fromJSON(inputs.runs_on) }}
container: ${{ inputs.image != '' && inputs.image || null }}
timeout-minutes: 60
steps:
- name: Cleanup workspace
if: ${{ runner.os == 'macOS' }}
uses: XRPLF/actions/.github/actions/cleanup-workspace@3f044c7478548e3c32ff68980eeb36ece02b364e
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Prepare runner
uses: XRPLF/actions/.github/actions/prepare-runner@638e0dc11ea230f91bd26622fb542116bb5254d5
with:
disable_ccache: false
- name: Print build environment
uses: ./.github/actions/print-env
- name: Setup Conan
uses: ./.github/actions/setup-conan
- name: Build dependencies
uses: ./.github/actions/build-deps
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ inputs.build_type }}
- name: Configure CMake
shell: bash
working-directory: ${{ inputs.build_dir }}
env:
BUILD_TYPE: ${{ inputs.build_type }}
CMAKE_ARGS: ${{ inputs.cmake_args }}
run: |
cmake \
-G '${{ runner.os == 'Windows' && 'Visual Studio 17 2022' || 'Ninja' }}' \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE="${BUILD_TYPE}" \
${CMAKE_ARGS} \
..
- name: Build the binary
shell: bash
working-directory: ${{ inputs.build_dir }}
env:
BUILD_TYPE: ${{ inputs.build_type }}
CMAKE_TARGET: ${{ inputs.cmake_target }}
run: |
cmake \
--build . \
--config "${BUILD_TYPE}" \
--parallel $(nproc) \
--target "${CMAKE_TARGET}"
- name: Upload rippled artifact
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: rippled-${{ inputs.config_name }}
path: ${{ inputs.build_dir }}/${{ runner.os == 'Windows' && inputs.build_type || '' }}/rippled${{ runner.os == 'Windows' && '.exe' || '' }}
retention-days: 3
if-no-files-found: error
- name: Upload coverage report
if: ${{ inputs.cmake_target == 'coverage' }}
uses: codecov/codecov-action@18283e04ce6e62d37312384ff67231eb8fd56d24 # v5.4.3
with:
disable_search: true
disable_telem: true
fail_ci_if_error: true
files: ${{ inputs.build_dir }}/coverage.xml
plugins: noop
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true

.github/workflows/reusable-check-levelization.yml

@@ -0,0 +1,46 @@
# This workflow checks if the dependencies between the modules are correctly
# indexed.
name: Check levelization
# This workflow can only be triggered by other workflows.
on: workflow_call
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-levelization
cancel-in-progress: true
defaults:
run:
shell: bash
jobs:
levelization:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Check levelization
run: .github/scripts/levelization/generate.sh
- name: Check for differences
env:
MESSAGE: |
The dependency relationships between the modules in rippled have
changed, which may be an improvement or a regression.
A rule of thumb is that if your changes caused something to be
removed from loops.txt, it's probably an improvement, while if
something was added, it's probably a regression.
Run '.github/scripts/levelization/generate.sh' in your repo, commit
and push the changes. See .github/scripts/levelization/README.md for
more info.
run: |
DIFF=$(git status --porcelain)
if [ -n "${DIFF}" ]; then
# Print the differences to give the contributor a hint about what to
# expect when running levelization on their own machine.
git diff
echo "${MESSAGE}"
exit 1
fi
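A minimal sketch of the local fix suggested by the message above, assuming the regenerated results land under `.github/scripts/levelization/results/`:

```bash
# Regenerate the levelization results, then commit whatever changed.
.github/scripts/levelization/generate.sh
git add .github/scripts/levelization/results
git commit -m "Update levelization results"
git push
```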

.github/workflows/reusable-check-missing-commits.yml

@@ -0,0 +1,62 @@
# This workflow checks that all commits in the "master" branch are also in the
# "release" and "develop" branches, and that all commits in the "release" branch
# are also in the "develop" branch.
name: Check for missing commits
# This workflow can only be triggered by other workflows.
on: workflow_call
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-missing-commits
cancel-in-progress: true
defaults:
run:
shell: bash
jobs:
check:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
with:
fetch-depth: 0
- name: Check for missing commits
env:
MESSAGE: |
If you are reading this, then the commits indicated above are missing
from the "develop" and/or "release" branch. Do a reverse-merge as soon
as possible. See CONTRIBUTING.md for instructions.
run: |
set -o pipefail
# Branches are ordered by how "canonical" they are. Every commit in one
# branch should be in all the branches behind it.
order=(master release develop)
branches=()
for branch in "${order[@]}"; do
# Check that the branches exist so that this job will work on forked
# repos, which don't necessarily have master and release branches.
echo "Checking if ${branch} exists."
if git ls-remote --exit-code --heads origin \
refs/heads/${branch} > /dev/null; then
branches+=(origin/${branch})
fi
done
prior=()
for branch in "${branches[@]}"; do
if [[ ${#prior[@]} -ne 0 ]]; then
echo "Checking ${prior[@]} for commits missing from ${branch}."
git log --oneline --no-merges "${prior[@]}" \
^$branch | tee -a "missing-commits.txt"
echo
fi
prior+=("${branch}")
done
if [[ $(cat missing-commits.txt | wc -l) -ne 0 ]]; then
echo "${MESSAGE}"
exit 1
fi
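A reverse-merge, as referenced in the message above, merges each more canonical branch back into the ones behind it. A minimal sketch; see CONTRIBUTING.md for the project's exact procedure:

```bash
# Bring master's commits into release, then release's into develop,
# restoring the invariant this workflow checks.
git fetch origin
git checkout release && git merge origin/master
git checkout develop && git merge origin/release
```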

.github/workflows/reusable-notify-clio.yml

@@ -0,0 +1,91 @@
# This workflow exports the built libxrpl package to the Conan remote on
# a channel named after the pull request, and notifies the Clio repository about
# the new version so it can check for compatibility.
name: Notify Clio
# This workflow can only be triggered by other workflows.
on:
workflow_call:
inputs:
conan_remote_name:
description: "The name of the Conan remote to use."
required: false
type: string
default: xrplf
conan_remote_url:
description: "The URL of the Conan endpoint to use."
required: false
type: string
default: https://conan.ripplex.io
secrets:
clio_notify_token:
description: "The GitHub token to notify Clio about new versions."
required: true
conan_remote_username:
description: "The username for logging into the Conan remote."
required: true
conan_remote_password:
description: "The password for logging into the Conan remote."
required: true
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-clio
cancel-in-progress: true
defaults:
run:
shell: bash
jobs:
upload:
if: ${{ github.event.pull_request.head.repo.full_name == github.repository }}
runs-on: ubuntu-latest
container: ghcr.io/xrplf/ci/ubuntu-noble:gcc-13-sha-5dd7158
steps:
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Generate outputs
id: generate
env:
PR_NUMBER: ${{ github.event.pull_request.number }}
run: |
echo 'Generating user and channel.'
echo "user=clio" >> "${GITHUB_OUTPUT}"
echo "channel=pr_${PR_NUMBER}" >> "${GITHUB_OUTPUT}"
echo 'Extracting version.'
echo "version=$(cat src/libxrpl/protocol/BuildInfo.cpp | grep "versionString =" | awk -F '"' '{print $2}')" >> "${GITHUB_OUTPUT}"
- name: Calculate conan reference
id: conan_ref
run: |
echo "conan_ref=${{ steps.generate.outputs.version }}@${{ steps.generate.outputs.user }}/${{ steps.generate.outputs.channel }}" >> "${GITHUB_OUTPUT}"
- name: Set up Conan
uses: ./.github/actions/setup-conan
with:
conan_remote_name: ${{ inputs.conan_remote_name }}
conan_remote_url: ${{ inputs.conan_remote_url }}
- name: Log into Conan remote
env:
CONAN_REMOTE_NAME: ${{ inputs.conan_remote_name }}
run: conan remote login "${CONAN_REMOTE_NAME}" "${{ secrets.conan_remote_username }}" --password "${{ secrets.conan_remote_password }}"
- name: Upload package
env:
CONAN_REMOTE_NAME: ${{ inputs.conan_remote_name }}
run: |
conan export --user=${{ steps.generate.outputs.user }} --channel=${{ steps.generate.outputs.channel }} .
conan upload --confirm --check --remote="${CONAN_REMOTE_NAME}" xrpl/${{ steps.conan_ref.outputs.conan_ref }}
outputs:
conan_ref: ${{ steps.conan_ref.outputs.conan_ref }}
notify:
needs: upload
runs-on: ubuntu-latest
steps:
- name: Notify Clio
env:
GH_TOKEN: ${{ secrets.clio_notify_token }}
PR_URL: ${{ github.event.pull_request.html_url }}
run: |
gh api --method POST -H "Accept: application/vnd.github+json" -H "X-GitHub-Api-Version: 2022-11-28" \
/repos/xrplf/clio/dispatches -f "event_type=check_libxrpl" \
-F "client_payload[conan_ref]=${{ needs.upload.outputs.conan_ref }}" \
-F "client_payload[pr_url]=${PR_URL}"

.github/workflows/reusable-strategy-matrix.yml

@@ -0,0 +1,41 @@
name: Generate strategy matrix
on:
workflow_call:
inputs:
os:
description: 'The operating system to use for the build ("linux", "macos", "windows").'
required: false
type: string
strategy_matrix:
# TODO: Support additional strategies, e.g. "ubuntu" for generating all Ubuntu configurations.
description: 'The strategy matrix to use for generating the configurations ("minimal", "all").'
required: false
type: string
default: "minimal"
outputs:
matrix:
description: "The generated strategy matrix."
value: ${{ jobs.generate-matrix.outputs.matrix }}
jobs:
generate-matrix:
runs-on: ubuntu-latest
outputs:
matrix: ${{ steps.generate.outputs.matrix }}
steps:
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.13
- name: Generate strategy matrix
working-directory: .github/scripts/strategy-matrix
id: generate
env:
GENERATE_CONFIG: ${{ inputs.os != '' && format('--config={0}.json', inputs.os) || '' }}
GENERATE_OPTION: ${{ inputs.strategy_matrix == 'all' && '--all' || '' }}
run: ./generate.py ${GENERATE_OPTION} ${GENERATE_CONFIG} >> "${GITHUB_OUTPUT}"
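For reference, the env mapping above expands to an invocation like the following when run locally; the `linux.json` name mirrors the `--config={0}.json` format string and is an assumption for this sketch:

```bash
cd .github/scripts/strategy-matrix
# Generate every configuration described by the Linux config file.
./generate.py --all --config=linux.json
```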

.github/workflows/reusable-test.yml

@@ -0,0 +1,70 @@
name: Test rippled
on:
workflow_call:
inputs:
verify_voidstar:
description: "Whether to verify the presence of voidstar instrumentation."
required: true
type: boolean
run_tests:
description: "Whether to run unit tests"
required: true
type: boolean
runs_on:
description: Runner to run the job on as a JSON string
required: true
type: string
image:
description: "The image to run in (leave empty to run natively)"
required: true
type: string
config_name:
description: "The name of the configuration."
required: true
type: string
jobs:
test:
name: Test ${{ inputs.config_name }}
runs-on: ${{ fromJSON(inputs.runs_on) }}
container: ${{ inputs.image != '' && inputs.image || null }}
timeout-minutes: 30
steps:
- name: Download rippled artifact
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: rippled-${{ inputs.config_name }}
- name: Make binary executable (Linux and macOS)
shell: bash
if: ${{ runner.os == 'Linux' || runner.os == 'macOS' }}
run: |
chmod +x ./rippled
- name: Check linking (Linux)
if: ${{ runner.os == 'Linux' }}
shell: bash
run: |
ldd ./rippled
if [ "$(ldd ./rippled | grep -E '(libstdc\+\+|libgcc)' | wc -l)" -eq 0 ]; then
echo 'The binary is statically linked.'
else
echo 'The binary is dynamically linked.'
exit 1
fi
- name: Verify presence of instrumentation
if: ${{ inputs.verify_voidstar }}
shell: bash
run: |
./rippled --version | grep libvoidstar
- name: Test the binary
if: ${{ inputs.run_tests }}
shell: bash
run: |
./rippled --unittest --unittest-jobs $(nproc)
ctest -j $(nproc) --output-on-failure

.github/workflows/upload-conan-deps.yml

@@ -0,0 +1,94 @@
name: Upload Conan Dependencies
on:
schedule:
- cron: "0 3 * * 2-6"
workflow_dispatch:
inputs:
force_source_build:
description: "Force source build of all dependencies"
required: false
default: false
type: boolean
force_upload:
description: "Force upload of all dependencies"
required: false
default: false
type: boolean
pull_request:
branches: [develop]
paths:
# This allows testing changes to the upload workflow in a PR
- .github/workflows/upload-conan-deps.yml
push:
branches: [develop]
paths:
- .github/workflows/upload-conan-deps.yml
- .github/workflows/reusable-strategy-matrix.yml
- .github/actions/build-deps/action.yml
- .github/actions/setup-conan/action.yml
- ".github/scripts/strategy-matrix/**"
- conanfile.py
- conan.lock
env:
CONAN_REMOTE_NAME: xrplf
CONAN_REMOTE_URL: https://conan.ripplex.io
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
# Generate the strategy matrix to be used by the following job.
generate-matrix:
uses: ./.github/workflows/reusable-strategy-matrix.yml
with:
strategy_matrix: ${{ github.event_name == 'pull_request' && 'minimal' || 'all' }}
# Build and upload the dependencies for each configuration.
run-upload-conan-deps:
needs:
- generate-matrix
strategy:
fail-fast: false
matrix: ${{ fromJson(needs.generate-matrix.outputs.matrix) }}
max-parallel: 10
runs-on: ${{ matrix.architecture.runner }}
container: ${{ contains(matrix.architecture.platform, 'linux') && format('ghcr.io/xrplf/ci/{0}-{1}:{2}-{3}-sha-{4}', matrix.os.distro_name, matrix.os.distro_version, matrix.os.compiler_name, matrix.os.compiler_version, matrix.os.image_sha) || null }}
steps:
- name: Cleanup workspace
if: ${{ runner.os == 'macOS' }}
uses: XRPLF/actions/.github/actions/cleanup-workspace@3f044c7478548e3c32ff68980eeb36ece02b364e
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Prepare runner
uses: XRPLF/actions/.github/actions/prepare-runner@638e0dc11ea230f91bd26622fb542116bb5254d5
with:
disable_ccache: false
- name: Setup Conan
uses: ./.github/actions/setup-conan
with:
conan_remote_name: ${{ env.CONAN_REMOTE_NAME }}
conan_remote_url: ${{ env.CONAN_REMOTE_URL }}
- name: Build dependencies
uses: ./.github/actions/build-deps
with:
build_dir: .build
build_type: ${{ matrix.build_type }}
force_build: ${{ github.event_name == 'schedule' || github.event.inputs.force_source_build == 'true' }}
# The verbosity is set to "quiet" for Windows to avoid an excessive amount of logs, while it
# is set to "verbose" otherwise to provide more information during the build process.
verbosity: ${{ runner.os == 'Windows' && 'quiet' || 'verbose' }}
- name: Log into Conan remote
if: ${{ github.repository_owner == 'XRPLF' && github.event_name != 'pull_request' }}
run: conan remote login "${CONAN_REMOTE_NAME}" "${{ secrets.CONAN_REMOTE_USERNAME }}" --password "${{ secrets.CONAN_REMOTE_PASSWORD }}"
- name: Upload Conan packages
if: ${{ github.repository_owner == 'XRPLF' && github.event_name != 'pull_request' && github.event_name != 'schedule' }}
env:
FORCE_OPTION: ${{ github.event.inputs.force_upload == 'true' && '--force' || '' }}
run: conan upload "*" --remote="${CONAN_REMOTE_NAME}" --confirm ${FORCE_OPTION}

.github/workflows/windows.yml

@@ -1,106 +0,0 @@
name: windows
on:
pull_request:
types: [opened, reopened, synchronize, ready_for_review]
push:
# If the branches list is ever changed, be sure to change it on all
# build/test jobs (nix, macos, windows, instrumentation)
branches:
# Always build the package branches
- develop
- release
- master
# Branches that opt-in to running
- "ci/**"
# https://docs.github.com/en/actions/using-jobs/using-concurrency
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
env:
CONAN_REMOTE_URL: https://conan.ripplex.io
CONAN_REMOTE_USERNAME: ${{ secrets.CONAN_REMOTE_USERNAME }}
CONAN_REMOTE_PASSWORD: ${{ secrets.CONAN_REMOTE_PASSWORD }}
# This part of the Conan configuration is specific to this workflow only; we
# do not want to pollute the 'conan/profiles' directory with settings that
# might not work for other workflows.
CONAN_GLOBAL_CONF: |
core.download:parallel={{os.cpu_count()}}
core.upload:parallel={{os.cpu_count()}}
tools.build:jobs=24
tools.build:verbosity=verbose
tools.compilation:verbosity=verbose
jobs:
test:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
strategy:
fail-fast: false
matrix:
version:
- generator: Visual Studio 17 2022
runs-on: windows-2022
configuration:
- type: Release
tests: true
- type: Debug
# Skip running unit tests on debug builds, because they
# take an unreasonable amount of time
tests: false
runtime: d
runs-on: ${{ matrix.version.runs-on }}
env:
build_dir: .build
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: choose Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065
with:
python-version: 3.13
- name: learn Python cache directory
id: pip-cache
shell: bash
run: |
python -m pip install --upgrade pip
echo "dir=$(pip cache dir)" | tee ${GITHUB_OUTPUT}
- name: restore Python cache directory
uses: actions/cache@5a3ec84eff668545956fd18022155c47e93e2684
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-${{ hashFiles('.github/workflows/windows.yml') }}
- name: install Conan
run: pip install wheel conan
- name: check environment
run: |
dir env:
$env:PATH -split ';'
python --version
conan --version
cmake --version
- name: configure Conan
shell: bash
run: |
echo "${CONAN_GLOBAL_CONF}" > $(conan config home)/global.conf
conan config install conan/profiles/ -tf $(conan config home)/profiles/
conan profile show
- name: build dependencies
uses: ./.github/actions/dependencies
with:
configuration: ${{ matrix.configuration.type }}
- name: build
uses: ./.github/actions/build
with:
generator: "${{ matrix.version.generator }}"
configuration: ${{ matrix.configuration.type }}
# Hard code for now. Move to the matrix if varied options are needed
cmake-args: "-Dassert=TRUE -Dwerr=TRUE -Dreporting=OFF -Dunity=ON"
cmake-target: install
- name: test
shell: bash
if: ${{ matrix.configuration.tests }}
run: |
cd ${build_dir}/${{ matrix.configuration.type }}
./rippled --unittest --unittest-jobs $(nproc)
ctest -j $(nproc) --output-on-failure

.gitignore

@@ -37,10 +37,9 @@ Release/*.*
*.gcov
# Levelization checking
Builds/levelization/results/rawincludes.txt
Builds/levelization/results/paths.txt
Builds/levelization/results/includes/
Builds/levelization/results/includedby/
.github/scripts/levelization/results/*
!.github/scripts/levelization/results/loops.txt
!.github/scripts/levelization/results/ordering.txt
# Ignore tmp directory.
tmp
@@ -111,4 +110,4 @@ bld.rippled/
.vscode
# Suggested in-tree build directory
/.build/
/.build*/

.pre-commit-config.yaml

@@ -1,6 +1,39 @@
# .pre-commit-config.yaml
# To run pre-commit hooks, first install pre-commit:
# - `pip install pre-commit==${PRE_COMMIT_VERSION}`
#
# Then, run the following command to install the git hook scripts:
# - `pre-commit install`
# You can run all configured hooks against all files with:
# - `pre-commit run --all-files`
# To manually run a specific hook, use:
# - `pre-commit run <hook_id> --all-files`
# To run the hooks against only the staged files, use:
# - `pre-commit run`
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: 3e8a8703264a2f4a69428a0aa4dcb512790b2c8c # frozen: v6.0.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: mixed-line-ending
- id: check-merge-conflict
args: [--assume-in-merge]
- repo: https://github.com/pre-commit/mirrors-clang-format
rev: v18.1.8
rev: 7d85583be209cb547946c82fbe51f4bc5dd1d017 # frozen: v18.1.8
hooks:
- id: clang-format
args: [--style=file]
"types_or": [c++, c, proto]
- repo: https://github.com/rbubley/mirrors-prettier
rev: 5ba47274f9b181bce26a5150a725577f3c336011 # frozen: v3.6.2
hooks:
- id: prettier
exclude: |
(?x)^(
external/.*|
.github/scripts/levelization/results/.*\.txt|
conan\.lock
)$

.prettierignore

@@ -0,0 +1 @@
external

BUILD.md

@@ -39,17 +39,12 @@ found here](./docs/build/environment.md).
- [Python 3.11](https://www.python.org/downloads/), or higher
- [Conan 2.17](https://conan.io/downloads.html)[^1], or higher
- [CMake 3.22](https://cmake.org/download/)[^2], or higher
- [CMake 3.22](https://cmake.org/download/), or higher
[^1]:
It is possible to build with Conan 1.60+, but the instructions are
significantly different, which is why we are not recommending it.
[^2]:
CMake 4 is not yet supported by all dependencies required by this project.
If you are affected by this issue, follow [conan workaround for cmake
4](#workaround-for-cmake-4)
`rippled` is written in the C++20 dialect and includes the `<concepts>` header.
The [minimum compiler versions][2] required are:
@@ -132,7 +127,7 @@ higher index than the default Conan Center remote, so it is consulted first. You
can do this by running:
```bash
conan remote add --index 0 xrplf "https://conan.ripplex.io"
conan remote add --index 0 xrplf https://conan.ripplex.io
```
Alternatively, you can pull the patched recipes into the repository and use them
@@ -158,6 +153,10 @@ updated dependencies with the newer version. However, if we switch to a newer
version that no longer requires a patch, no action is required on your part, as
the new recipe will be automatically pulled from the official Conan Center.
> [!NOTE]
> You might need to add `--lockfile=""` to your `conan install` command
> to avoid automatic use of the existing `conan.lock` file when you run `conan export` manually on your machine, as sketched below.
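For example, a minimal sketch of exporting a patched recipe while bypassing the lockfile; the recipe path and version are placeholders, not a specific recipe this repository requires:

```bash
# --lockfile="" stops Conan from consulting the repository's conan.lock.
conan export external/snappy --version 1.1.10 --lockfile=""
```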
### Conan profile tweaks
#### Missing compiler version
@@ -278,21 +277,6 @@ sed -i.bak -e 's|^arch=.*$|arch=x86_64|' $(conan config home)/profiles/default
sed -i.bak -e 's|^compiler\.runtime=.*$|compiler.runtime=static|' $(conan config home)/profiles/default
```
#### Workaround for CMake 4
If your system CMake is version 4 rather than 3, you may have to configure Conan
profile to use CMake version 3 for dependencies, by adding the following two
lines to your profile:
```text
[tool_requires]
!cmake/*: cmake/[>=3 <4]
```
This will force Conan to download and use a locally cached CMake 3 version, and
is needed because some of the dependencies used by this project do not support
CMake 4.
#### Clang workaround for grpc
If your compiler is clang, version 19 or later, or apple-clang, version 17 or
@@ -466,6 +450,33 @@ tools.build:cxxflags=['-DBOOST_ASIO_DISABLE_CONCEPTS']
The location of `rippled` binary in your build directory depends on your
CMake generator. Pass `--help` to see the rest of the command line options.
#### Conan lockfile
To achieve reproducible dependencies, we use a [Conan lockfile](https://docs.conan.io/2/tutorial/versioning/lockfiles.html).
The `conan.lock` file in the repository contains a "snapshot" of the current dependencies.
It is used implicitly when running `conan` commands; you don't need to specify it.
You have to update this file every time you add a new dependency, or change a revision or version of an existing dependency.
> [!NOTE]
> Conan uses the local cache by default when creating a lockfile.
>
> To ensure that lockfile creation works the same way on all developer machines, you should clear the local cache before creating a new lockfile.
To create a new lockfile, run the following commands in the repository root:
```bash
conan remove '*' --confirm
rm conan.lock
# This ensures that the xrplf remote is the first to be consulted
conan remote add --force --index 0 xrplf https://conan.ripplex.io
conan lock create . -o '&:jemalloc=True' -o '&:rocksdb=True'
```
> [!NOTE]
> If some dependencies are exclusive to a particular OS, you may need to re-run the last command for them with `--profile:all <PROFILE>`, as sketched below.
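For example, a hedged sketch of regenerating the lockfile with one profile applied to both the host and build contexts; the profile name `windows` is illustrative:

```bash
# Apply the same profile to the host and build contexts so that
# OS-exclusive dependencies are resolved into the lockfile.
conan lock create . -o '&:jemalloc=True' -o '&:rocksdb=True' --profile:all windows
```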
## Coverage report
The coverage report is intended for developers using compilers GCC
@@ -564,7 +575,13 @@ After any updates or changes to dependencies, you may need to do the following:
```
3. Re-run [conan export](#patched-recipes) if needed.
4. Re-run [conan install](#build-and-test).
4. [Regenerate lockfile](#conan-lockfile).
5. Re-run [conan install](#build-and-test).
#### ERROR: Package not resolved
If you're seeing an error like `ERROR: Package 'snappy/1.1.10' not resolved: Unable to find 'snappy/1.1.10#968fef506ff261592ec30c574d4a7809%1756234314.246' in remotes.`,
please add the `xrplf` remote or re-run `conan export` for the [patched recipes](#patched-recipes).
### `protobuf/port_def.inc` file not found

CMakeLists.txt

@@ -99,6 +99,8 @@ add_subdirectory(external/secp256k1)
add_library(secp256k1::secp256k1 ALIAS secp256k1)
add_subdirectory(external/ed25519-donna)
add_subdirectory(external/antithesis-sdk)
add_subdirectory(external/libff)
add_library(libff::ff ALIAS ff)
find_package(gRPC REQUIRED)
find_package(lz4 REQUIRED)
# Target names with :: are not allowed in a generator expression.
@@ -120,6 +122,96 @@ endif()
find_package(nudb REQUIRED)
find_package(date REQUIRED)
find_package(xxHash REQUIRED)
find_package(wasm-xrplf REQUIRED)
# Silence noisy warnings just inside libff
# Treat vendored headers as SYSTEM so their warnings don't bubble up.
# Remove it if printing warnings is desired
target_include_directories(ff SYSTEM PRIVATE
${CMAKE_SOURCE_DIR}/external/libff/src
${CMAKE_SOURCE_DIR}/external/libff/depends/xbyak/xbyak
${CMAKE_SOURCE_DIR}/external/libff/depends/ate-pairing/include
)
# Suppress GCC array-bounds false positives (and related) only for libff.
# Remove it if printing warnings is desired
if (CMAKE_CXX_COMPILER_ID MATCHES "GNU")
target_compile_options(ff PRIVATE
-Wno-array-bounds
-Wno-stringop-overflow
-fno-strict-aliasing
)
elseif (CMAKE_CXX_COMPILER_ID MATCHES "Clang")
target_compile_options(ff PRIVATE
-Wno-array-bounds
)
endif()
# Silence redundant constexpr data member warnings when using C++17
# Remove it if printing warnings is desired
set(LIBFF_GF_SOURCES
${PROJECT_SOURCE_DIR}/external/libff/libff/algebra/fields/binary/gf32.cpp
${PROJECT_SOURCE_DIR}/external/libff/libff/algebra/fields/binary/gf64.cpp
${PROJECT_SOURCE_DIR}/external/libff/libff/algebra/fields/binary/gf128.cpp
${PROJECT_SOURCE_DIR}/external/libff/libff/algebra/fields/binary/gf192.cpp
${PROJECT_SOURCE_DIR}/external/libff/libff/algebra/fields/binary/gf256.cpp
)
if (CMAKE_CXX_COMPILER_ID MATCHES "GNU|Clang")
set_source_files_properties(${LIBFF_GF_SOURCES}
PROPERTIES COMPILE_OPTIONS "-Wno-deprecated;-Wno-redundant-decls")
endif()
# --- Try config-mode first (works if someone provides GMPConfig.cmake) ---
find_package(GMP CONFIG QUIET)
# --- If no config package, synthesize imported targets with find_* and pkg-config ---
if (NOT TARGET GMP::gmp)
find_package(PkgConfig QUIET)
if (PkgConfig_FOUND)
pkg_check_modules(PC_GMP QUIET gmp)
pkg_check_modules(PC_GMPXX QUIET gmpxx)
endif()
# headers
find_path(GMP_INCLUDE_DIR
NAMES gmp.h
HINTS ${PC_GMP_INCLUDE_DIRS} ${PC_GMPXX_INCLUDE_DIRS}
)
# libraries
find_library(GMP_LIBRARY
NAMES gmp
HINTS ${PC_GMP_LIBRARY_DIRS}
)
find_library(GMPXX_LIBRARY
NAMES gmpxx
HINTS ${PC_GMPXX_LIBRARY_DIRS} ${PC_GMP_LIBRARY_DIRS}
)
if (GMP_INCLUDE_DIR AND GMP_LIBRARY)
add_library(GMP::gmp UNKNOWN IMPORTED)
set_target_properties(GMP::gmp PROPERTIES
IMPORTED_LOCATION "${GMP_LIBRARY}"
INTERFACE_INCLUDE_DIRECTORIES "${GMP_INCLUDE_DIR}"
)
endif()
if (GMP_INCLUDE_DIR AND GMPXX_LIBRARY)
add_library(GMP::gmpxx UNKNOWN IMPORTED)
set_target_properties(GMP::gmpxx PROPERTIES
IMPORTED_LOCATION "${GMPXX_LIBRARY}"
INTERFACE_INCLUDE_DIRECTORIES "${GMP_INCLUDE_DIR}"
INTERFACE_LINK_LIBRARIES GMP::gmp
)
endif()
endif()
# Final guard
if (NOT TARGET GMP::gmp OR NOT TARGET GMP::gmpxx)
message(FATAL_ERROR "GMP/GMPXX not found. Install libgmp-dev (and gmpxx) or set CMAKE_PREFIX_PATH/GMP_DIR.")
endif()
# Link them transitively through ripple_libs
target_link_libraries(ripple_libs INTERFACE GMP::gmp GMP::gmpxx)
target_link_libraries(ripple_libs INTERFACE
ed25519::ed25519
@@ -129,6 +221,7 @@ target_link_libraries(ripple_libs INTERFACE
secp256k1::secp256k1
soci::soci
SQLite::SQLite3
libff::ff
)
# Work around changes to Conan recipe for now.
@@ -153,4 +246,4 @@ include(RippledValidatorKeys)
if(tests)
include(CTest)
add_subdirectory(src/tests/libxrpl)
endif()
endif()

CONTRIBUTING.md

@@ -81,7 +81,7 @@ If you create new source files, they must be organized as follows:
The source must be formatted according to the style guide below.
Header includes must be [levelized](./Builds/levelization).
Header includes must be [levelized](.github/scripts/levelization).
Changes should usually be squashed down into a single commit.
Some larger or more complicated change sets make more sense,

README.md

@@ -6,7 +6,7 @@ The [XRP Ledger](https://xrpl.org/) is a decentralized cryptographic ledger powe
## XRP
[XRP](https://xrpl.org/xrp.html) is a public, counterparty-free asset native to the XRP Ledger, and is designed to bridge the many different currencies in use worldwide. XRP is traded on the open-market and is available for anyone to access. The XRP Ledger was created in 2012 with a finite supply of 100 billion units of XRP.
[XRP](https://xrpl.org/xrp.html) is a public, counterparty-free crypto-asset native to the XRP Ledger, and is designed as a gas token for network services and to bridge different currencies. XRP is traded on the open-market and is available for anyone to access. The XRP Ledger was created in 2012 with a finite supply of 100 billion units of XRP.
## rippled
@@ -23,26 +23,26 @@ If you are interested in running an **API Server** (including a **Full History S
- **[Censorship-Resistant Transaction Processing][]:** No single party decides which transactions succeed or fail, and no one can "roll back" a transaction after it completes. As long as those who choose to participate in the network keep it healthy, they can settle transactions in seconds.
- **[Fast, Efficient Consensus Algorithm][]:** The XRP Ledger's consensus algorithm settles transactions in 4 to 5 seconds, processing at a throughput of up to 1500 transactions per second. These properties put XRP at least an order of magnitude ahead of other top digital assets.
- **[Finite XRP Supply][]:** When the XRP Ledger began, 100 billion XRP were created, and no more XRP will ever be created. The available supply of XRP decreases slowly over time as small amounts are destroyed to pay transaction costs.
- **[Responsible Software Governance][]:** A team of full-time, world-class developers at Ripple maintain and continually improve the XRP Ledger's underlying software with contributions from the open-source community. Ripple acts as a steward for the technology and an advocate for its interests, and builds constructive relationships with governments and financial institutions worldwide.
- **[Finite XRP Supply][]:** When the XRP Ledger began, 100 billion XRP were created, and no more XRP will ever be created. The available supply of XRP decreases slowly over time as small amounts are destroyed to pay transaction fees.
- **[Responsible Software Governance][]:** A team of full-time developers at Ripple & other organizations maintain and continually improve the XRP Ledger's underlying software with contributions from the open-source community. Ripple acts as a steward for the technology and an advocate for its interests.
- **[Secure, Adaptable Cryptography][]:** The XRP Ledger relies on industry standard digital signature systems like ECDSA (the same scheme used by Bitcoin) but also supports modern, efficient algorithms like Ed25519. The extensible nature of the XRP Ledger's software makes it possible to add and disable algorithms as the state of the art in cryptography advances.
- **[Modern Features for Smart Contracts][]:** Features like Escrow, Checks, and Payment Channels support cutting-edge financial applications including the [Interledger Protocol](https://interledger.org/). This toolbox of advanced features comes with safety features like a process for amending the network and separate checks against invariant constraints.
- **[Modern Features][]:** Features like Escrow, Checks, and Payment Channels support financial applications atop the XRP Ledger. This toolbox of advanced features comes with safety features like a process for amending the network and separate checks against invariant constraints.
- **[On-Ledger Decentralized Exchange][]:** In addition to all the features that make XRP useful on its own, the XRP Ledger also has a fully-functional accounting system for tracking and trading obligations denominated in any way users want, and an exchange built into the protocol. The XRP Ledger can settle long, cross-currency payment paths and exchanges of multiple currencies in atomic transactions, bridging gaps of trust with XRP.
[Censorship-Resistant Transaction Processing]: https://xrpl.org/xrp-ledger-overview.html#censorship-resistant-transaction-processing
[Fast, Efficient Consensus Algorithm]: https://xrpl.org/xrp-ledger-overview.html#fast-efficient-consensus-algorithm
[Finite XRP Supply]: https://xrpl.org/xrp-ledger-overview.html#finite-xrp-supply
[Responsible Software Governance]: https://xrpl.org/xrp-ledger-overview.html#responsible-software-governance
[Secure, Adaptable Cryptography]: https://xrpl.org/xrp-ledger-overview.html#secure-adaptable-cryptography
[Modern Features for Smart Contracts]: https://xrpl.org/xrp-ledger-overview.html#modern-features-for-smart-contracts
[On-Ledger Decentralized Exchange]: https://xrpl.org/xrp-ledger-overview.html#on-ledger-decentralized-exchange
[Censorship-Resistant Transaction Processing]: https://xrpl.org/transaction-censorship-detection.html#transaction-censorship-detection
[Fast, Efficient Consensus Algorithm]: https://xrpl.org/consensus-research.html#consensus-research
[Finite XRP Supply]: https://xrpl.org/what-is-xrp.html
[Responsible Software Governance]: https://xrpl.org/contribute-code.html#contribute-code-to-the-xrp-ledger
[Secure, Adaptable Cryptography]: https://xrpl.org/cryptographic-keys.html#cryptographic-keys
[Modern Features]: https://xrpl.org/use-specialized-payment-types.html
[On-Ledger Decentralized Exchange]: https://xrpl.org/decentralized-exchange.html#decentralized-exchange
## Source Code
Here are some good places to start learning the source code:
- Read the markdown files in the source tree: `src/ripple/**/*.md`.
- Read [the levelization document](./Builds/levelization) to get an idea of the internal dependency graph.
- Read [the levelization document](.github/scripts/levelization) to get an idea of the internal dependency graph.
- In the big picture, the `main` function constructs an `ApplicationImp` object, which implements the `Application` virtual interface. Almost every component in the application takes an `Application&` parameter in its constructor, typically named `app` and stored as a member variable `app_`. This allows most components to depend on any other component.
### Repository Contents


@@ -5,7 +5,7 @@ then
name=$( basename $0 )
cat <<- USAGE
Usage: $name <username>
Where <username> is the GitHub username of the upstream repo. e.g. XRPLF
USAGE
exit 0
@@ -83,4 +83,3 @@ fi
_run git fetch --jobs=$(nproc) upstreams
exit 0


@@ -5,7 +5,7 @@ then
name=$( basename $0 )
cat <<- USAGE
Usage: $name workbranch base/branch user/branch [user/branch [...]]
* workbranch will be created locally from base/branch
* base/branch and user/branch may be specified as user:branch to allow
easy copying from GitHub PRs
@@ -66,4 +66,3 @@ git push $push HEAD:$b
git fetch $repo
-------------------------------------------------------------------
PUSH

cfg/rippled-example.cfg

@@ -396,8 +396,8 @@
# true - enables compression
# false - disables compression [default].
#
# The rippled server can save bandwidth by compressing its peer-to-peer communications,
# at a cost of greater CPU usage. If you enable link compression,
# the server automatically compresses communications with peer servers
# that also have link compression enabled.
# https://xrpl.org/enable-link-compression.html
@@ -975,6 +975,47 @@
# number of ledger records online. Must be greater
# than or equal to ledger_history.
#
# Optional keys for NuDB only:
#
# nudb_block_size EXPERIMENTAL: Block size in bytes for NuDB storage.
# Must be a power of 2 between 4096 and 32768. Default is 4096.
#
# This parameter controls the fundamental storage unit
# size for NuDB's internal data structures. The choice
# of block size can significantly impact performance
# depending on your storage hardware and filesystem:
#
# - 4096 bytes: Optimal for most standard SSDs and
# traditional filesystems (ext4, NTFS, HFS+).
# Provides good balance of performance and storage
# efficiency. Recommended for most deployments.
# Minimizes memory footprint and provides consistent
# low-latency access patterns across diverse hardware.
#
# - 8192-16384 bytes: May improve performance on
# high-end NVMe SSDs and copy-on-write filesystems
# like ZFS or Btrfs that benefit from larger block
# alignment. Can reduce metadata overhead for large
# databases. Offers better sequential throughput and
# reduced I/O operations at the cost of higher memory
# usage per operation.
#
# - 32768 bytes (32K): Maximum supported block size
# for high-performance scenarios with very fast
# storage. May increase memory usage and reduce
# efficiency for smaller databases. Best suited for
# enterprise environments with abundant RAM.
#
# Performance testing is recommended before deploying
# any non-default block size in production environments.
#
# Note: This setting cannot be changed after database
# creation without rebuilding the entire database.
# Choose carefully based on your hardware and expected
# database size.
#
# Example: nudb_block_size=4096
#
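A minimal sketch of the constraint above (a power of two between 4096 and 32768), handy for checking a candidate value before writing it into the config; this illustrates the rule only and is not the server's own validation code:

```bash
# Returns success only for powers of two in [4096, 32768].
is_valid_block_size() {
  local n=$1
  [ "$n" -ge 4096 ] && [ "$n" -le 32768 ] && [ $(( n & (n - 1) )) -eq 0 ]
}
for n in 4096 8192 12288 32768 65536; do
  is_valid_block_size "$n" && echo "$n: ok" || echo "$n: invalid"
done
```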
# These keys modify the behavior of online_delete, and thus are only
# relevant if online_delete is defined and non-zero:
#
@@ -1011,7 +1052,7 @@
# that rippled is still in sync with the network,
# and that the validated ledger is less than
# 'age_threshold_seconds' old. If not, then continue
# sleeping for this number of seconds and
# checking until healthy.
# Default is 5.
#
@@ -1113,7 +1154,7 @@
# page_size Valid values: integer (MUST be power of 2 between 512 and 65536)
# The default is 4096 bytes. This setting determines
# the size of a page in the transaction.db file.
# See https://www.sqlite.org/pragma.html#pragma_page_size
# for more details about the available options.
#
# journal_size_limit Valid values: integer
@@ -1249,6 +1290,39 @@
# Example:
# owner_reserve = 2000000 # 2 XRP
#
# extension_compute_limit = <gas>
#
# The extension compute limit is the maximum amount of gas that can be
# consumed by a single transaction. The gas limit is used to prevent
# transactions from consuming too many resources.
#
# If this parameter is unspecified, rippled will use an internal
# default. Don't change this without understanding the consequences.
#
# Example:
# extension_compute_limit = 1000000 # 1 million gas
#
# extension_size_limit = <bytes>
#
# The extension size limit is the maximum size of a WASM extension in
# bytes. The size limit is used to prevent extensions from consuming
# too many resources.
#
# If this parameter is unspecified, rippled will use an internal
# default. Don't change this without understanding the consequences.
#
# Example:
# extension_size_limit = 100000 # 100 KB
#
# gas_price = <number>
#
# The gas price is the conversion between WASM gas and its price in drops.
#
# If this parameter is unspecified, rippled will use an internal
# default. Don't change this without understanding the consequences.
#
# Example:
# gas_price = 1000000 # 1 drop per gas
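To illustrate the conversion, a hedged sketch assuming (per the example above, where 1000000 corresponds to 1 drop per gas) that `gas_price` is expressed in millionths of a drop per unit of gas; the variable names are illustrative:

```bash
gas_price=1000000   # millionths of a drop per unit of gas (assumed)
gas_used=250000     # gas consumed by a hypothetical transaction
fee_drops=$(( gas_used * gas_price / 1000000 ))
echo "fee: ${fee_drops} drops"   # 250000 drops = 0.25 XRP
```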
#-------------------------------------------------------------------------------
#
# 9. Misc Settings
@@ -1471,6 +1545,7 @@ secure_gateway = 127.0.0.1
[node_db]
type=NuDB
path=/var/lib/rippled/db/nudb
nudb_block_size=4096
online_delete=512
advisory_delete=0

cmake/CodeCoverage.cmake

@@ -101,6 +101,14 @@
# 2025-05-12, Jingchen Wu
# - add -fprofile-update=atomic to ensure atomic profile generation
#
# 2025-08-28, Bronek Kozicki
# - fix "At least one COMMAND must be given" CMake warning from policy CMP0175
#
# 2025-09-03, Jingchen Wu
# - remove the unused functions append_coverage_compiler_flags and append_coverage_compiler_flags_to_target
# - add a new function add_code_coverage_to_target
# - remove some unused code
#
# USAGE:
#
# 1. Copy this file into your cmake modules path.
@@ -109,10 +117,8 @@
# using a CMake option() to enable it just optionally):
# include(CodeCoverage)
#
# 3. Append necessary compiler flags for all supported source files:
# append_coverage_compiler_flags()
# Or for specific target:
# append_coverage_compiler_flags_to_target(YOUR_TARGET_NAME)
# 3. Append necessary compiler flags and linker flags for all supported source files:
# add_code_coverage_to_target(<target> <PRIVATE|PUBLIC|INTERFACE>)
#
# 3.a (OPTIONAL) Set appropriate optimization flags, e.g. -O0, -O1 or -Og
#
@@ -201,67 +207,69 @@ endforeach()
set(COVERAGE_COMPILER_FLAGS "-g --coverage"
CACHE INTERNAL "")
set(COVERAGE_CXX_COMPILER_FLAGS "")
set(COVERAGE_C_COMPILER_FLAGS "")
set(COVERAGE_CXX_LINKER_FLAGS "")
set(COVERAGE_C_LINKER_FLAGS "")
if(CMAKE_CXX_COMPILER_ID MATCHES "(GNU|Clang)")
include(CheckCXXCompilerFlag)
include(CheckCCompilerFlag)
include(CheckLinkerFlag)
set(COVERAGE_CXX_COMPILER_FLAGS ${COVERAGE_COMPILER_FLAGS})
set(COVERAGE_C_COMPILER_FLAGS ${COVERAGE_COMPILER_FLAGS})
set(COVERAGE_CXX_LINKER_FLAGS ${COVERAGE_COMPILER_FLAGS})
set(COVERAGE_C_LINKER_FLAGS ${COVERAGE_COMPILER_FLAGS})
check_cxx_compiler_flag(-fprofile-abs-path HAVE_cxx_fprofile_abs_path)
if(HAVE_cxx_fprofile_abs_path)
set(COVERAGE_CXX_COMPILER_FLAGS "${COVERAGE_COMPILER_FLAGS} -fprofile-abs-path")
set(COVERAGE_CXX_COMPILER_FLAGS "${COVERAGE_CXX_COMPILER_FLAGS} -fprofile-abs-path")
endif()
check_c_compiler_flag(-fprofile-abs-path HAVE_c_fprofile_abs_path)
if(HAVE_c_fprofile_abs_path)
set(COVERAGE_C_COMPILER_FLAGS "${COVERAGE_COMPILER_FLAGS} -fprofile-abs-path")
set(COVERAGE_C_COMPILER_FLAGS "${COVERAGE_C_COMPILER_FLAGS} -fprofile-abs-path")
endif()
check_cxx_compiler_flag(-fprofile-update HAVE_cxx_fprofile_update)
check_linker_flag(CXX -fprofile-abs-path HAVE_cxx_linker_fprofile_abs_path)
if(HAVE_cxx_linker_fprofile_abs_path)
set(COVERAGE_CXX_LINKER_FLAGS "${COVERAGE_CXX_LINKER_FLAGS} -fprofile-abs-path")
endif()
check_linker_flag(C -fprofile-abs-path HAVE_c_linker_fprofile_abs_path)
if(HAVE_c_linker_fprofile_abs_path)
set(COVERAGE_C_LINKER_FLAGS "${COVERAGE_C_LINKER_FLAGS} -fprofile-abs-path")
endif()
check_cxx_compiler_flag(-fprofile-update=atomic HAVE_cxx_fprofile_update)
if(HAVE_cxx_fprofile_update)
set(COVERAGE_CXX_COMPILER_FLAGS "${COVERAGE_COMPILER_FLAGS} -fprofile-update=atomic")
set(COVERAGE_CXX_COMPILER_FLAGS "${COVERAGE_CXX_COMPILER_FLAGS} -fprofile-update=atomic")
endif()
check_c_compiler_flag(-fprofile-update HAVE_c_fprofile_update)
check_c_compiler_flag(-fprofile-update=atomic HAVE_c_fprofile_update)
if(HAVE_c_fprofile_update)
set(COVERAGE_C_COMPILER_FLAGS "${COVERAGE_COMPILER_FLAGS} -fprofile-update=atomic")
set(COVERAGE_C_COMPILER_FLAGS "${COVERAGE_C_COMPILER_FLAGS} -fprofile-update=atomic")
endif()
endif()
set(CMAKE_Fortran_FLAGS_COVERAGE
${COVERAGE_COMPILER_FLAGS}
CACHE STRING "Flags used by the Fortran compiler during coverage builds."
FORCE )
set(CMAKE_CXX_FLAGS_COVERAGE
${COVERAGE_COMPILER_FLAGS}
CACHE STRING "Flags used by the C++ compiler during coverage builds."
FORCE )
set(CMAKE_C_FLAGS_COVERAGE
${COVERAGE_COMPILER_FLAGS}
CACHE STRING "Flags used by the C compiler during coverage builds."
FORCE )
set(CMAKE_EXE_LINKER_FLAGS_COVERAGE
""
CACHE STRING "Flags used for linking binaries during coverage builds."
FORCE )
set(CMAKE_SHARED_LINKER_FLAGS_COVERAGE
""
CACHE STRING "Flags used by the shared libraries linker during coverage builds."
FORCE )
mark_as_advanced(
CMAKE_Fortran_FLAGS_COVERAGE
CMAKE_CXX_FLAGS_COVERAGE
CMAKE_C_FLAGS_COVERAGE
CMAKE_EXE_LINKER_FLAGS_COVERAGE
CMAKE_SHARED_LINKER_FLAGS_COVERAGE )
check_linker_flag(CXX -fprofile-update=atomic HAVE_cxx_linker_fprofile_update)
if(HAVE_cxx_linker_fprofile_update)
set(COVERAGE_CXX_LINKER_FLAGS "${COVERAGE_CXX_LINKER_FLAGS} -fprofile-update=atomic")
endif()
check_linker_flag(C -fprofile-update=atomic HAVE_c_linker_fprofile_update)
if(HAVE_c_linker_fprofile_update)
set(COVERAGE_C_LINKER_FLAGS "${COVERAGE_C_LINKER_FLAGS} -fprofile-update=atomic")
endif()
endif()
get_property(GENERATOR_IS_MULTI_CONFIG GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
if(NOT (CMAKE_BUILD_TYPE STREQUAL "Debug" OR GENERATOR_IS_MULTI_CONFIG))
message(WARNING "Code coverage results with an optimised (non-Debug) build may be misleading")
endif() # NOT (CMAKE_BUILD_TYPE STREQUAL "Debug" OR GENERATOR_IS_MULTI_CONFIG)
if(CMAKE_C_COMPILER_ID STREQUAL "GNU" OR CMAKE_Fortran_COMPILER_ID STREQUAL "GNU")
link_libraries(gcov)
endif()
# Defines a target for running and collection code coverage information
# Builds dependencies, runs the given executable and outputs reports.
# NOTE! The executable should always have a ZERO as exit code otherwise
@@ -446,23 +454,24 @@ function(setup_target_for_coverage_gcovr)
# Show info where to find the report
add_custom_command(TARGET ${Coverage_NAME} POST_BUILD
COMMAND ;
COMMAND echo
COMMENT "Code coverage report saved in ${GCOVR_OUTPUT_FILE} formatted as ${Coverage_FORMAT}"
)
endfunction() # setup_target_for_coverage_gcovr
function(append_coverage_compiler_flags)
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
set(CMAKE_Fortran_FLAGS "${CMAKE_Fortran_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
message(STATUS "Appending code coverage compiler flags: ${COVERAGE_COMPILER_FLAGS}")
endfunction() # append_coverage_compiler_flags
function(add_code_coverage_to_target name scope)
separate_arguments(COVERAGE_CXX_COMPILER_FLAGS NATIVE_COMMAND "${COVERAGE_CXX_COMPILER_FLAGS}")
separate_arguments(COVERAGE_C_COMPILER_FLAGS NATIVE_COMMAND "${COVERAGE_C_COMPILER_FLAGS}")
separate_arguments(COVERAGE_CXX_LINKER_FLAGS NATIVE_COMMAND "${COVERAGE_CXX_LINKER_FLAGS}")
separate_arguments(COVERAGE_C_LINKER_FLAGS NATIVE_COMMAND "${COVERAGE_C_LINKER_FLAGS}")
# Setup coverage for specific library
function(append_coverage_compiler_flags_to_target name)
separate_arguments(_flag_list NATIVE_COMMAND "${COVERAGE_COMPILER_FLAGS}")
target_compile_options(${name} PRIVATE ${_flag_list})
if(CMAKE_C_COMPILER_ID STREQUAL "GNU" OR CMAKE_Fortran_COMPILER_ID STREQUAL "GNU")
target_link_libraries(${name} PRIVATE gcov)
endif()
endfunction()
# Add compiler options to the target
target_compile_options(${name} ${scope}
$<$<COMPILE_LANGUAGE:CXX>:${COVERAGE_CXX_COMPILER_FLAGS}>
$<$<COMPILE_LANGUAGE:C>:${COVERAGE_C_COMPILER_FLAGS}>)
target_link_libraries (${name} ${scope}
$<$<LINK_LANGUAGE:CXX>:${COVERAGE_CXX_LINKER_FLAGS} gcov>
$<$<LINK_LANGUAGE:C>:${COVERAGE_C_LINKER_FLAGS} gcov>
)
endfunction() # add_code_coverage_to_target

View File

@@ -16,13 +16,16 @@ set(CMAKE_CXX_EXTENSIONS OFF)
target_compile_definitions (common
INTERFACE
$<$<CONFIG:Debug>:DEBUG _DEBUG>
$<$<AND:$<BOOL:${profile}>,$<NOT:$<BOOL:${assert}>>>:NDEBUG>)
# ^^^^ NOTE: CMAKE release builds already have NDEBUG
# defined, so no need to add it explicitly except for
# this special case of (profile ON) and (assert OFF)
# -- presumably this is because we don't want profile
# builds asserting unless asserts were specifically
# requested
#[===[
NOTE: CMAKE release builds already have NDEBUG defined, so no need to add it
explicitly except for the special case of (profile ON) and (assert OFF).
Presumably this is because we don't want profile builds asserting unless
asserts were specifically requested.
]===]
$<$<AND:$<BOOL:${profile}>,$<NOT:$<BOOL:${assert}>>>:NDEBUG>
# TODO: Remove once we have migrated functions from OpenSSL 1.x to 3.x.
OPENSSL_SUPPRESS_DEPRECATED
)
if (MSVC)
# remove existing exception flag since we set it to -EHa

View File

@@ -65,8 +65,14 @@ target_link_libraries(xrpl.imports.main
xrpl.libpb
xxHash::xxhash
$<$<BOOL:${voidstar}>:antithesis-sdk-cpp>
wasm-xrplf::wasm-xrplf
)
if (WIN32)
target_link_libraries(xrpl.imports.main INTERFACE ntdll)
endif()
include(add_module)
include(target_link_modules)
@@ -99,9 +105,24 @@ target_link_libraries(xrpl.libxrpl.protocol PUBLIC
add_module(xrpl resource)
target_link_libraries(xrpl.libxrpl.resource PUBLIC xrpl.libxrpl.protocol)
# Level 06
add_module(xrpl net)
target_link_libraries(xrpl.libxrpl.net PUBLIC
xrpl.libxrpl.basics
xrpl.libxrpl.json
xrpl.libxrpl.protocol
xrpl.libxrpl.resource
)
add_module(xrpl server)
target_link_libraries(xrpl.libxrpl.server PUBLIC xrpl.libxrpl.protocol)
add_module(xrpl ledger)
target_link_libraries(xrpl.libxrpl.ledger PUBLIC
xrpl.libxrpl.basics
xrpl.libxrpl.json
xrpl.libxrpl.protocol
)
add_library(xrpl.libxrpl)
set_target_properties(xrpl.libxrpl PROPERTIES OUTPUT_NAME xrpl)
@@ -121,6 +142,8 @@ target_link_modules(xrpl PUBLIC
protocol
resource
server
net
ledger
)
# All headers in libxrpl are in modules.

View File

@@ -33,6 +33,8 @@ setup_target_for_coverage_gcovr(
FORMAT ${coverage_format}
EXECUTABLE rippled
EXECUTABLE_ARGS --unittest$<$<BOOL:${coverage_test}>:=${coverage_test}> --unittest-jobs ${coverage_test_parallelism} --quiet --unittest-log
EXCLUDE "src/test" "include/xrpl/beast/test" "include/xrpl/beast/unit_test" "${CMAKE_BINARY_DIR}/pb-xrpl.libpb"
EXCLUDE "src/test" "src/tests" "include/xrpl/beast/test" "include/xrpl/beast/unit_test" "${CMAKE_BINARY_DIR}/pb-xrpl.libpb"
DEPENDENCIES rippled
)
add_code_coverage_to_target(opts INTERFACE)

View File

@@ -18,7 +18,9 @@ install (
xrpl.libxrpl.json
xrpl.libxrpl.protocol
xrpl.libxrpl.resource
xrpl.libxrpl.ledger
xrpl.libxrpl.server
xrpl.libxrpl.net
xrpl.libxrpl
antithesis-sdk-cpp
EXPORT RippleExports
@@ -36,7 +38,7 @@ install(CODE "
set(CMAKE_MODULE_PATH \"${CMAKE_MODULE_PATH}\")
include(create_symbolic_link)
create_symbolic_link(xrpl \
\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}/ripple)
\$ENV{DESTDIR}\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}/ripple)
")
install (EXPORT RippleExports
@@ -70,7 +72,7 @@ if (is_root_project AND TARGET rippled)
set(CMAKE_MODULE_PATH \"${CMAKE_MODULE_PATH}\")
include(create_symbolic_link)
create_symbolic_link(rippled${suffix} \
\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_BINDIR}/xrpld${suffix})
\$ENV{DESTDIR}\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_BINDIR}/xrpld${suffix})
")
endif ()

View File

@@ -28,15 +28,11 @@ target_compile_options (opts
$<$<AND:$<BOOL:${is_gcc}>,$<COMPILE_LANGUAGE:CXX>>:-Wsuggest-override>
$<$<BOOL:${is_gcc}>:-Wno-maybe-uninitialized>
$<$<BOOL:${perf}>:-fno-omit-frame-pointer>
$<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-g --coverage -fprofile-abs-path>
$<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-g --coverage>
$<$<BOOL:${profile}>:-pg>
$<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${profile}>>:-p>)
target_link_libraries (opts
INTERFACE
$<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-g --coverage -fprofile-abs-path>
$<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-g --coverage>
$<$<BOOL:${profile}>:-pg>
$<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${profile}>>:-p>)

View File

@@ -118,7 +118,7 @@ option(beast_no_unit_test_inline
"Prevents unit test definitions from being inserted into global table"
OFF)
option(single_io_service_thread
"Restricts the number of threads calling io_service::run to one. \
"Restricts the number of threads calling io_context::run to one. \
This can be useful when debugging."
OFF)
option(boost_show_deprecated

View File

@@ -1,4 +1,4 @@
option (validator_keys "Enables building of validator-keys-tool as a separate target (imported via FetchContent)" OFF)
option (validator_keys "Enables building of validator-keys tool as a separate target (imported via FetchContent)" OFF)
if (validator_keys)
git_branch (current_branch)
@@ -6,17 +6,15 @@ if (validator_keys)
if (NOT (current_branch STREQUAL "release"))
set (current_branch "master")
endif ()
message (STATUS "tracking ValidatorKeys branch: ${current_branch}")
message (STATUS "Tracking ValidatorKeys branch: ${current_branch}")
FetchContent_Declare (
validator_keys_src
validator_keys
GIT_REPOSITORY https://github.com/ripple/validator-keys-tool.git
GIT_TAG "${current_branch}"
)
FetchContent_GetProperties (validator_keys_src)
if (NOT validator_keys_src_POPULATED)
message (STATUS "Pausing to download ValidatorKeys...")
FetchContent_Populate (validator_keys_src)
endif ()
add_subdirectory (${validator_keys_src_SOURCE_DIR} ${CMAKE_BINARY_DIR}/validator-keys)
FetchContent_MakeAvailable(validator_keys)
set_target_properties(validator-keys PROPERTIES RUNTIME_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}")
install(TARGETS validator-keys RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR})
endif ()

View File

@@ -14,12 +14,6 @@ find_package(Boost 1.82 REQUIRED
add_library(ripple_boost INTERFACE)
add_library(Ripple::boost ALIAS ripple_boost)
if(XCODE)
target_include_directories(ripple_boost BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
target_compile_options(ripple_boost INTERFACE --system-header-prefix="boost/")
else()
target_include_directories(ripple_boost SYSTEM BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
endif()
target_link_libraries(ripple_boost
INTERFACE
@@ -30,6 +24,7 @@ target_link_libraries(ripple_boost
Boost::date_time
Boost::filesystem
Boost::json
Boost::process
Boost::program_options
Boost::regex
Boost::system

59
conan.lock Normal file
View File

@@ -0,0 +1,59 @@
{
"version": "0.5",
"requires": [
"zlib/1.3.1#b8bc2603263cf7eccbd6e17e66b0ed76%1733936244.862",
"xxhash/0.8.3#681d36a0a6111fc56e5e45ea182c19cc%1756234289.683",
"wasm-xrplf/2.4.1-xrplf#dc67c558e283593ef0edd7eb00e9fa0d%1759862247.891",
"sqlite3/3.49.1#8631739a4c9b93bd3d6b753bac548a63%1756234266.869",
"soci/4.0.3#a9f8d773cd33e356b5879a4b0564f287%1756234262.318",
"snappy/1.1.10#968fef506ff261592ec30c574d4a7809%1756234314.246",
"rocksdb/10.0.1#85537f46e538974d67da0c3977de48ac%1756234304.347",
"re2/20230301#dfd6e2bf050eb90ddd8729cfb4c844a4%1756234257.976",
"protobuf/3.21.12#d927114e28de9f4691a6bbcdd9a529d1%1756234251.614",
"openssl/3.5.4#a1d5835cc6ed5c5b8f3cd5b9b5d24205%1760106486.594",
"nudb/2.0.9#c62cfd501e57055a7e0d8ee3d5e5427d%1756234237.107",
"lz4/1.10.0#59fc63cac7f10fbe8e05c7e62c2f3504%1743433196.251",
"libsodium/1.0.20#d2baa92ed999abe295ff63e2ee25b4f3%1743063880.072",
"libiconv/1.17#1e65319e945f2d31941a9d28cc13c058%1756223727.64",
"libbacktrace/cci.20210118#a7691bfccd8caaf66309df196790a5a1%1722218217.276",
"libarchive/3.8.1#5cf685686322e906cb42706ab7e099a8%1756234256.696",
"jemalloc/5.3.0#e951da9cf599e956cebc117880d2d9f8%1729241615.244",
"grpc/1.50.1#02291451d1e17200293a409410d1c4e1%1756234248.958",
"gmp/6.3.0#76a423d206f8aedd6bf8fc4e271467d2%1756240296.179",
"doctest/2.4.11#a4211dfc329a16ba9f280f9574025659%1756234220.819",
"date/3.0.4#f74bbba5a08fa388256688743136cb6f%1756234217.493",
"c-ares/1.34.5#b78b91e7cfb1f11ce777a285bbf169c6%1756234217.915",
"bzip2/1.0.8#00b4a4658791c1f06914e087f0e792f5%1756234261.716",
"boost/1.88.0#8852c0b72ce8271fb8ff7c53456d4983%1756223752.326",
"abseil/20230802.1#f0f91485b111dc9837a68972cb19ca7b%1756234220.907"
],
"build_requires": [
"zlib/1.3.1#b8bc2603263cf7eccbd6e17e66b0ed76%1733936244.862",
"strawberryperl/5.32.1.1#707032463aa0620fa17ec0d887f5fe41%1756234281.733",
"protobuf/3.21.12#d927114e28de9f4691a6bbcdd9a529d1%1756234251.614",
"nasm/2.16.01#31e26f2ee3c4346ecd347911bd126904%1756234232.901",
"msys2/cci.latest#5b73b10144f73cc5bfe0572ed9be39e1%1751977009.857",
"m4/1.4.19#b38ced39a01e31fef5435bc634461fd2%1700758725.451",
"cmake/3.31.8#dde3bde00bb843687e55aea5afa0e220%1756234232.89",
"b2/5.3.3#107c15377719889654eb9a162a673975%1756234226.28",
"automake/1.16.5#b91b7c384c3deaa9d535be02da14d04f%1755524470.56",
"autoconf/2.71#51077f068e61700d65bb05541ea1e4b0%1731054366.86"
],
"python_requires": [],
"overrides": {
"protobuf/3.21.12": [
null,
"protobuf/3.21.12"
],
"lz4/1.9.4": [
"lz4/1.10.0"
],
"boost/1.83.0": [
"boost/1.88.0"
],
"sqlite3/3.44.2": [
"sqlite3/3.49.1"
]
},
"config_requires": []
}

6
conan/global.conf Normal file
View File

@@ -0,0 +1,6 @@
# Global configuration for Conan. This makes Conan non-interactive and sets
# the number of parallel downloads, uploads, and build jobs.
core:non_interactive=True
core.download:parallel={{ os.cpu_count() }}
core.upload:parallel={{ os.cpu_count() }}
tools.build:jobs={{ os.cpu_count() - 1 }}
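A minimal sketch of one way to activate this file locally, assuming Conan 2's default home directory (`~/.conan2`):

```bash
# Copy the repository's Conan configuration into the Conan home.
cp conan/global.conf ~/.conan2/global.conf
```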

View File

@@ -26,12 +26,6 @@ tools.build:cxxflags=['-Wno-missing-template-arg-list-after-template-kw']
{% if compiler == "apple-clang" and compiler_version >= 17 %}
tools.build:cxxflags=['-Wno-missing-template-arg-list-after-template-kw']
{% endif %}
{% if compiler == "clang" and compiler_version == 16 %}
tools.build:cxxflags=['-DBOOST_ASIO_DISABLE_CONCEPTS']
{% endif %}
{% if compiler == "gcc" and compiler_version < 13 %}
tools.build:cxxflags=['-Wno-restrict']
{% endif %}
[tool_requires]
!cmake/*: cmake/[>=3 <4]

View File

@@ -2,6 +2,7 @@ from conan import ConanFile, __version__ as conan_version
from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout
import re
class Xrpl(ConanFile):
name = 'xrpl'
@@ -27,9 +28,10 @@ class Xrpl(ConanFile):
'grpc/1.50.1',
'libarchive/3.8.1',
'nudb/2.0.9',
'openssl/1.1.1w',
'openssl/3.5.4',
'soci/4.0.3',
'zlib/1.3.1',
'wasm-xrplf/2.4.1-xrplf',
]
test_requires = [
@@ -100,15 +102,20 @@ class Xrpl(ConanFile):
def configure(self):
if self.settings.compiler == 'apple-clang':
self.options['boost'].visibility = 'global'
if self.settings.compiler in ['clang', 'gcc']:
self.options['boost'].without_cobalt = True
def requirements(self):
# Conan 2 requires transitive headers to be specified
transitive_headers_opt = {'transitive_headers': True} if conan_version.split('.')[0] == '2' else {}
self.requires('boost/1.83.0', force=True, **transitive_headers_opt)
self.requires('boost/1.88.0', force=True, **transitive_headers_opt)
self.requires('date/3.0.4', **transitive_headers_opt)
self.requires('lz4/1.10.0', force=True)
self.requires('protobuf/3.21.12', force=True)
self.requires('sqlite3/3.49.1', force=True)
self.requires('gmp/6.3.0', force=True)
self.requires('libsodium/1.0.20', force=True)
if self.options.jemalloc:
self.requires('jemalloc/5.3.0')
if self.options.rocksdb:
@@ -131,6 +138,7 @@ class Xrpl(ConanFile):
self.folders.generators = 'build/generators'
generators = 'CMakeDeps'
def generate(self):
tc = CMakeToolchain(self)
tc.variables['tests'] = self.options.tests
@@ -175,6 +183,7 @@ class Xrpl(ConanFile):
'boost::filesystem',
'boost::json',
'boost::program_options',
'boost::process',
'boost::regex',
'boost::system',
'boost::thread',
@@ -187,6 +196,7 @@ class Xrpl(ConanFile):
'protobuf::libprotobuf',
'soci::soci',
'sqlite3::sqlite',
'wasm-xrplf::wasm-xrplf',
'xxhash::xxhash',
'zlib::zlib',
]

View File

@@ -5,8 +5,8 @@ skinparam roundcorner 20
skinparam maxmessagesize 160
actor "Rippled Start" as RS
participant "Timer" as T
participant "NetworkOPs" as NOP
participant "Timer" as T
participant "NetworkOPs" as NOP
participant "ValidatorList" as VL #lightgreen
participant "Consensus" as GC
participant "ConsensusAdaptor" as CA #lightgreen
@@ -20,7 +20,7 @@ VL -> NOP
NOP -> VL: update trusted validators
activate VL
VL -> VL: re-calculate quorum
hnote over VL#lightgreen: ignore negative listed validators\nwhen calculating quorum
VL -> NOP
deactivate VL
NOP -> GC: start round
@@ -36,14 +36,14 @@ activate GC
end
alt phase == OPEN
alt should close ledger
GC -> GC: phase = ESTABLISH
GC -> CA: onClose
activate CA
alt sqn%256==0
CA -[#green]> RM: <font color=green>getValidations
CA -[#green]> CA: <font color=green>create UNLModify Tx
hnote over CA#lightgreen: use validations of the last 256 ledgers\nto figure out UNLModify Tx candidates.\nIf any, create UNLModify Tx, and add to TxSet.
end
CA -> GC
GC -> CA: propose
@@ -61,14 +61,14 @@ else phase == ESTABLISH
CA -> CA : build LCL
hnote over CA #lightgreen: copy negative UNL from parent ledger
alt sqn%256==0
CA -[#green]> CA: <font color=green>Adjust negative UNL
CA -[#green]> CA: <font color=green>apply UNLModify Tx
end
CA -> CA : validate and send validation message
activate NOP
CA -> NOP : end consensus and\n<b>begin next consensus round
deactivate NOP
deactivate CA
hnote over RM: receive validations
end
else phase == ACCEPTED
@@ -76,4 +76,4 @@ else phase == ACCEPTED
end
deactivate GC
@enduml

View File

@@ -4,7 +4,7 @@ class TimeoutCounter {
#app_ : Application&
}
TimeoutCounter o-- "1" Application
TimeoutCounter o-- "1" Application
': app_
Stoppable <.. Application
@@ -14,13 +14,13 @@ class Application {
-m_inboundLedgers : uptr<InboundLedgers>
}
Application *-- "1" LedgerReplayer
Application *-- "1" LedgerReplayer
': m_ledgerReplayer
Application *-- "1" InboundLedgers
Application *-- "1" InboundLedgers
': m_inboundLedgers
Stoppable <.. InboundLedgers
Application "1" --o InboundLedgers
Application "1" --o InboundLedgers
': app_
class InboundLedgers {
@@ -28,9 +28,9 @@ class InboundLedgers {
}
Stoppable <.. LedgerReplayer
InboundLedgers "1" --o LedgerReplayer
InboundLedgers "1" --o LedgerReplayer
': inboundLedgers_
Application "1" --o LedgerReplayer
Application "1" --o LedgerReplayer
': app_
class LedgerReplayer {
@@ -42,17 +42,17 @@ class LedgerReplayer {
-skipLists_ : hash_map<u256, wptr<SkipListAcquire>>
}
LedgerReplayer *-- LedgerReplayTask
': tasks_
LedgerReplayer o-- LedgerDeltaAcquire
': deltas_
LedgerReplayer o-- SkipListAcquire
': skipLists_
TimeoutCounter <.. LedgerReplayTask
InboundLedgers "1" --o LedgerReplayTask
InboundLedgers "1" --o LedgerReplayTask
': inboundLedgers_
LedgerReplayer "1" --o LedgerReplayTask
LedgerReplayer "1" --o LedgerReplayTask
': replayer_
class LedgerReplayTask {
@@ -63,15 +63,15 @@ class LedgerReplayTask {
+addDelta(sptr<LedgerDeltaAcquire>)
}
LedgerReplayTask *-- "1" SkipListAcquire
LedgerReplayTask *-- "1" SkipListAcquire
': skipListAcquirer_
LedgerReplayTask *-- LedgerDeltaAcquire
': deltas_
TimeoutCounter <.. SkipListAcquire
InboundLedgers "1" --o SkipListAcquire
InboundLedgers "1" --o SkipListAcquire
': inboundLedgers_
LedgerReplayer "1" --o SkipListAcquire
LedgerReplayer "1" --o SkipListAcquire
': replayer_
LedgerReplayTask --o SkipListAcquire : implicit via callback
@@ -83,9 +83,9 @@ class SkipListAcquire {
}
TimeoutCounter <.. LedgerDeltaAcquire
InboundLedgers "1" --o LedgerDeltaAcquire
InboundLedgers "1" --o LedgerDeltaAcquire
': inboundLedgers_
LedgerReplayer "1" --o LedgerDeltaAcquire
LedgerReplayer "1" --o LedgerDeltaAcquire
': replayer_
LedgerReplayTask --o LedgerDeltaAcquire : implicit via callback
@@ -95,4 +95,4 @@ class LedgerDeltaAcquire {
-replayer_ : LedgerReplayer&
-dataReadyCallbacks_ : vector<callback>
}
@enduml

View File

@@ -38,7 +38,7 @@ deactivate lr
loop
lr -> lda : make_shared(ledgerId, ledgerSeq)
return delta
lr -> lrt : addDelta(delta)
lrt -> lda : addDataCallback(callback)
return
return
@@ -62,7 +62,7 @@ deactivate peer
lr -> lda : processData(ledgerHeader, txns)
lda -> lda : notify()
note over lda: call the callbacks added by\naddDataCallback(callback).
lda -> lrt : callback(ledgerId)
lrt -> lrt : deltaReady(ledgerId)
lrt -> lrt : tryAdvance()
loop as long as child can be built
@@ -82,4 +82,4 @@ deactivate peer
deactivate peer
@enduml

View File

@@ -1,3 +1,3 @@
---
DisableFormat: true
SortIncludes: false
SortIncludes: Never

3
external/README.md vendored
View File

@@ -1,7 +1,6 @@
# External Conan recipes
The subdirectories in this directory contain copies of external libraries used
by rippled.
The subdirectories in this directory contain external libraries used by rippled.
| Folder | Upstream | Description |
| :--------------- | :------------------------------------------------------------- | :------------------------------------------------------------------------------------------- |

View File

@@ -1,12 +1,12 @@
[ed25519](http://ed25519.cr.yp.to/) is an
[Elliptic Curve Digital Signature Algorithm](http://en.wikipedia.org/wiki/Elliptic_Curve_DSA),
developed by [Dan Bernstein](http://cr.yp.to/djb.html),
[Niels Duif](http://www.nielsduif.nl/),
[Tanja Lange](http://hyperelliptic.org/tanja),
[Peter Schwabe](http://www.cryptojedi.org/users/peter/),
and [Bo-Yin Yang](http://www.iis.sinica.edu.tw/pages/byyang/).
This project provides performant, portable 32-bit & 64-bit implementations. All implementations are
of course constant time in regard to secret data.
#### Performance
@@ -52,35 +52,35 @@ are made.
#### Compilation
No configuration is needed **if you are compiling against OpenSSL**.
##### Hash Options
If you are not compiling against OpenSSL, you will need a hash function.
To use a simple/**slow** implementation of SHA-512, use `-DED25519_REFHASH` when compiling `ed25519.c`.
This should never be used except to verify the code works when OpenSSL is not available.
To use a custom hash function, use `-DED25519_CUSTOMHASH` when compiling `ed25519.c` and put your
custom hash implementation in ed25519-hash-custom.h. The hash must have a 512-bit digest and implement
struct ed25519_hash_context;
void ed25519_hash_init(ed25519_hash_context *ctx);
void ed25519_hash_update(ed25519_hash_context *ctx, const uint8_t *in, size_t inlen);
void ed25519_hash_final(ed25519_hash_context *ctx, uint8_t *hash);
void ed25519_hash(uint8_t *hash, const uint8_t *in, size_t inlen);
##### Random Options
If you are not compiling against OpenSSL, you will need a random function for batch verification.
To use a custom random function, use `-DED25519_CUSTOMRANDOM` when compiling `ed25519.c` and put your
custom random implementation in ed25519-randombytes-custom.h. The random function must implement:
void ED25519_FN(ed25519_randombytes_unsafe) (void *p, size_t len);
Use `-DED25519_TEST` when compiling `ed25519.c` to use a deterministically seeded, non-thread safe CSPRNG
variant of Bob Jenkins [ISAAC](http://en.wikipedia.org/wiki/ISAAC_%28cipher%29)
##### Minor options
@@ -91,79 +91,80 @@ Use `-DED25519_FORCE_32BIT` to force the use of 32 bit routines even when compil
##### 32-bit
gcc ed25519.c -m32 -O3 -c
##### 64-bit
gcc ed25519.c -m64 -O3 -c
##### SSE2
gcc ed25519.c -m32 -O3 -c -DED25519_SSE2 -msse2
gcc ed25519.c -m64 -O3 -c -DED25519_SSE2
clang and icc are also supported
#### Usage
To use the code, link against `ed25519.o -mbits` and:
#include "ed25519.h"
#include "ed25519.h"
Add `-lssl -lcrypto` when using OpenSSL (some systems may not need `-lcrypto`; it can take some trial and error).
To generate a private key, simply generate 32 bytes from a secure
cryptographic source:
ed25519_secret_key sk;
randombytes(sk, sizeof(ed25519_secret_key));
To generate a public key:
ed25519_public_key pk;
ed25519_publickey(sk, pk);
To sign a message:
ed25519_signature sig;
ed25519_sign(message, message_len, sk, pk, sig);
To verify a signature:
int valid = ed25519_sign_open(message, message_len, pk, sig) == 0;
To batch verify signatures:
const unsigned char *mp[num] = {message1, message2..}
size_t ml[num] = {message_len1, message_len2..}
const unsigned char *pkp[num] = {pk1, pk2..}
const unsigned char *sigp[num] = {signature1, signature2..}
int valid[num]
/* valid[i] will be set to 1 if the individual signature was valid, 0 otherwise */
int all_valid = ed25519_sign_open_batch(mp, ml, pkp, sigp, num, valid) == 0;
**Note**: Batch verification uses `ed25519_randombytes_unsafe`, implemented in
`ed25519-randombytes.h`, to generate random scalars for the verification code.
The default implementation now uses OpenSSL's `RAND_bytes`.
Unlike the [SUPERCOP](http://bench.cr.yp.to/supercop.html) version, signatures are
not appended to messages, and there is no need for padding in front of messages.
Additionally, the secret key does not contain a copy of the public key, so it is
32 bytes instead of 64 bytes, and the public key must be provided to the signing
function.
##### Curve25519
Curve25519 public keys can be generated thanks to
[Adam Langley](http://www.imperialviolet.org/2013/05/10/fastercurve25519.html)
leveraging Ed25519's precomputed basepoint scalar multiplication.
curved25519_key sk, pk;
randombytes(sk, sizeof(curved25519_key));
curved25519_scalarmult_basepoint(pk, sk);
Note the name is curved25519, a combination of curve and ed25519, to prevent
name clashes. Performance is slightly faster than short message ed25519
signing due to both using the same code for the scalar multiply.
@@ -179,4 +180,4 @@ with extreme values to ensure they function correctly. SSE2 is now supported.
#### Papers
[Available on the Ed25519 website](http://ed25519.cr.yp.to/papers.html)

View File

@@ -1,78 +1,78 @@
This code fuzzes ed25519-donna (and optionally ed25519-donna-sse2) against the ref10 implementations of
[curve25519](https://github.com/floodyberry/supercop/tree/master/crypto_scalarmult/curve25519/ref10) and
[ed25519](https://github.com/floodyberry/supercop/tree/master/crypto_sign/ed25519/ref10).
Curve25519 tests that generating a public key from a secret key produces the same result in each implementation.
# Building
## *nix + PHP
`php build-nix.php (required parameters) (optional parameters)`
Required parameters:
* `--function=[curve25519,ed25519]`
* `--bits=[32,64]`
Optional parameters:
* `--with-sse2`
Also fuzz against ed25519-donna-sse2
* `--with-openssl`
Build with OpenSSL's SHA-512.
Default: Reference SHA-512 implementation (slow!)
* `--compiler=[gcc,clang,icc]`
Default: gcc
* `--no-asm`
Do not use platform specific assembler
example:
php build-nix.php --bits=64 --function=ed25519 --with-sse2 --compiler=icc
## Windows
Create a project with access to the ed25519 files.
If you are not using OpenSSL, add the `ED25519_REFHASH` define to the project's
"Properties/Preprocessor/Preprocessor Definitions" option
Add the following files to the project:
* `fuzz/curve25519-ref10.c`
* `fuzz/ed25519-ref10.c`
* `fuzz/ed25519-donna.c`
* `fuzz/ed25519-donna-sse2.c` (optional)
* `fuzz-[curve25519/ed25519].c` (depending on which you want to fuzz)
If you are also fuzzing against ed25519-donna-sse2, add the `ED25519_SSE2` define for `fuzz-[curve25519/ed25519].c` under
its "Properties/Preprocessor/Preprocessor Definitions" option.
# Running
If everything agrees, the program will only output occasional status dots (every 0x1000 passes)
and a 64-bit progress count (every 0x20000 passes):
fuzzing: ref10 curved25519 curved25519-sse2
................................ [0000000000020000]
................................ [0000000000040000]
................................ [0000000000060000]
................................ [0000000000080000]
................................ [00000000000a0000]
................................ [00000000000c0000]
If any of the implementations do not agree with the ref10 implementation, the program will dump
the random data that was used, the data generated by the ref10 implementation, and diffs of the
ed25519-donna data against the ref10 data.
## Example errors
@@ -83,21 +83,21 @@ These are example error dumps (with intentionally introduced errors).
Random data:
* sk, or Secret Key
* m, or Message
Generated data:
* pk, or Public Key
* sig, or Signature
* valid, or if the signature of the message is valid with the public key
Dump:
sk:
0x3b,0xb7,0x17,0x7a,0x66,0xdc,0xb7,0x9a,0x90,0x25,0x07,0x99,0x96,0xf3,0x92,0xef,
0x78,0xf8,0xad,0x6c,0x35,0x87,0x81,0x67,0x03,0xe6,0x95,0xba,0x06,0x18,0x7c,0x9c,
m:
0x7c,0x8d,0x3d,0xe1,0x92,0xee,0x7a,0xb8,0x4d,0xc9,0xfb,0x02,0x34,0x1e,0x5a,0x91,
0xee,0x01,0xa6,0xb8,0xab,0x37,0x3f,0x3d,0x6d,0xa2,0x47,0xe3,0x27,0x93,0x7c,0xb7,
@@ -107,66 +107,67 @@ Dump:
0x63,0x14,0xe0,0x81,0x52,0xec,0xcd,0xcf,0x70,0x54,0x7d,0xa3,0x49,0x8b,0xf0,0x89,
0x70,0x07,0x12,0x2a,0xd9,0xaa,0x16,0x01,0xb2,0x16,0x3a,0xbb,0xfc,0xfa,0x13,0x5b,
0x69,0x83,0x92,0x70,0x95,0x76,0xa0,0x8e,0x16,0x79,0xcc,0xaa,0xb5,0x7c,0xf8,0x7a,
ref10:
pk:
0x71,0xb0,0x5e,0x62,0x1b,0xe3,0xe7,0x36,0x91,0x8b,0xc0,0x13,0x36,0x0c,0xc9,0x04,
0x16,0xf5,0xff,0x48,0x0c,0x83,0x6b,0x88,0x53,0xa2,0xc6,0x0f,0xf7,0xac,0x42,0x04,
sig:
0x3e,0x05,0xc5,0x37,0x16,0x0b,0x29,0x30,0x89,0xa3,0xe7,0x83,0x08,0x16,0xdd,0x96,
0x02,0xfa,0x0d,0x44,0x2c,0x43,0xaa,0x80,0x93,0x04,0x58,0x22,0x09,0xbf,0x11,0xa5,
0xcc,0xa5,0x3c,0x9f,0xa0,0xa4,0x64,0x5a,0x4a,0xdb,0x20,0xfb,0xc7,0x9b,0xfd,0x3f,
0x08,0xae,0xc4,0x3c,0x1e,0xd8,0xb6,0xb4,0xd2,0x6d,0x80,0x92,0xcb,0x71,0xf3,0x02,
valid: yes
ed25519-donna:
pk diff:
____,____,____,____,____,____,____,____,____,____,____,____,____,____,____,____,
____,____,____,____,____,____,____,____,____,____,____,____,____,____,____,____,
sig diff:
0x2c,0xb9,0x25,0x14,0xd0,0x94,0xeb,0xfe,0x46,0x02,0xc2,0xe8,0xa3,0xeb,0xbf,0xb5,
0x72,0x84,0xbf,0xc1,0x8a,0x32,0x30,0x99,0xf7,0x58,0xfe,0x06,0xa8,0xdc,0xdc,0xab,
0xb5,0x57,0x03,0x33,0x87,0xce,0x54,0x55,0x6a,0x69,0x8a,0xc4,0xb7,0x2a,0xed,0x97,
0xb4,0x68,0xe7,0x52,0x7a,0x07,0x55,0x3b,0xa2,0x94,0xd6,0x5e,0xa1,0x61,0x80,0x08,
valid: no
In this case, the generated public key matches, but the generated signature is completely
different and does not validate.
### Curve25519
Random data:
* sk, or Secret Key
Generated data:
* pk, or Public Key
Dump:
sk:
0x44,0xec,0x0b,0x0e,0xa2,0x0e,0x9c,0x5b,0x8c,0xce,0x7b,0x1d,0x68,0xae,0x0f,0x9e,
0x81,0xe2,0x04,0x76,0xda,0x87,0xa4,0x9e,0xc9,0x4f,0x3b,0xf9,0xc3,0x89,0x63,0x70,
ref10:
0x24,0x55,0x55,0xc0,0xf9,0x80,0xaf,0x02,0x43,0xee,0x8c,0x7f,0xc1,0xad,0x90,0x95,
0x57,0x91,0x14,0x2e,0xf2,0x14,0x22,0x80,0xdd,0x4e,0x3c,0x85,0x71,0x84,0x8c,0x62,
curved25519 diff:
0x12,0xd1,0x61,0x2b,0x16,0xb3,0xd8,0x29,0xf8,0xa3,0xba,0x70,0x4e,0x49,0x4f,0x43,
0xa1,0x3c,0x6b,0x42,0x11,0x61,0xcc,0x30,0x87,0x73,0x46,0xfb,0x85,0xc7,0x9a,0x35,
curved25519-sse2 diff:
____,____,____,____,____,____,____,____,____,____,____,____,____,____,____,____,
____,____,____,____,____,____,____,____,____,____,____,____,____,____,____,____,
In this case, curved25519 is totally wrong, while curved25519-sse2 matches the reference
implementation.

10
external/libff/AUTHORS vendored Executable file
View File

@@ -0,0 +1,10 @@
SCIPR Lab:
Eli Ben-Sasson
Alessandro Chiesa
Eran Tromer
Madars Virza
Howard Wu
External contributors:
Alexander Chernyakhovsky (Google Inc.)
Aleksejs Popovs

52
external/libff/CHANGELOG.md vendored Normal file
View File

@@ -0,0 +1,52 @@
## v0.3.0
This update introduces new field and curve APIs, and enforces that they are used consistently across the library. Furthermore, it makes it possible to use fields without having to initialize elliptic curves.
### Breaking Changes
- #23 Remove unused exponent param of curves
- #58 Add a defined API for every field type, and have minor tweaks to all fields to implement it (Thanks @alexander-zw)
- #79 Separate field initialization from curves (Thanks @alexander-zw)
### Features
- #71 Add BLS12-381 (Thanks @yelhousni)
- #80 Add clang-tidy checks to library and CI
- #82 Convert tests to use Google test (Thanks @alexander-zw)
- #83 Make run-clang-tidy return an error when linting fails
- #85 Add more unit tests for fields (Thanks @alexander-zw)
- #86 Add binary fields from [libiop](https://github.com/scipr-lab/libiop)
- #100 Move utils in from [libiop](https://github.com/scipr-lab/libiop)
### Bug fixes
- #75 Get rid of warning for unused constant PI, in complex field
- #78 Reduce prints when inhibit_profiling_info is set
- #79 & #87 Use std::size_t for all code, fix bugs introduced by #58
- #94 & #96 Fix bugs that make libff incompatible with
[libfqfft](https://github.com/scipr-lab/libfqfft) and [libiop](https://github.com/scipr-lab/libiop)
- #103 Fix bugs that make libff incompatible with [libsnark](https://github.com/scipr-lab/libsnark)
and add more tests for the field utils
## v0.2.0
_Special thanks to all downstream projects upstreaming their patches!_
### Breaking Changes
- File structure changed: All field utils are now in `libff/algebra/field_utils/`, `Fp_model` is
now in `libff/algebra/fields/prime_base/`, and all other F_p^n fields in
`libff/algebra/fields/prime_extension/`.
- The function `base_field_char()` of all fields and curves have been renamed to `field_char()`.
- The provided fields used in curves have been moved to separate files so that they can be imported
separately from `[field name]_fields.hpp`. However they are still accessible from the init file.
### Features
- #20 Improve operator+ speed for alt_bn, correct the corresponding docs, and reduce code duplication.
- #50 Add mul_by_cofactor to elliptic curve groups
- #50 Add sage scripts for altbn and mnt curves, to verify cofactors and generators
- #52 Change default procps build flags to work with Mac OS
### Bug fixes
- #19 Fix is_little_endian always returning true
- #20 Fix operator+ not contributing to alt_bn_128 profiling opcount
- #26 Remove unused warnings in release build
- #39 Update Travis Config for newer Ubuntu mirror defaults
- #50 Fix incorrect mnt4 g2 generator
- #54 Fix is_power_of_two for n > 2^32
- #55 Throw informative error for division by zero in div_ceil

236
external/libff/CMakeLists.txt vendored Executable file
View File

@@ -0,0 +1,236 @@
cmake_minimum_required(VERSION 3.5...3.25)
project (libff)
# Default to RelWithDebInfo configuration if no configuration is explicitly specified.
if(NOT CMAKE_BUILD_TYPE AND NOT CMAKE_CONFIGURATION_TYPES)
set(CMAKE_BUILD_TYPE RelWithDebInfo CACHE STRING "Build type on single-configuration generators" FORCE)
endif()
set(
CURVE
"BN128"
CACHE
STRING
"Default curve: one of BLS12_381, ALT_BN128, BN128, EDWARDS, MNT4, MNT6"
)
option(
DEBUG
"Enable debugging mode"
OFF
)
option(
LOWMEM
"Limit the size of multi-exponentiation tables, for low-memory platforms"
OFF
)
option(
MULTICORE
"Enable parallelized execution, using OpenMP"
OFF
)
option(
BINARY_OUTPUT
"In serialization, output raw binary data (instead of decimal), which is smaller and faster."
ON
)
option(
MONTGOMERY_OUTPUT
"Use Montgomery representations for serialization and words representation of Fp elements (faster but not human-readable)"
ON
)
option(
USE_PT_COMPRESSION
"Use point compression"
ON
)
option(
PROFILE_OP_COUNTS
"Collect counts for field and curve operations"
OFF
)
option(
USE_MIXED_ADDITION
"Convert each element of the key pair to affine coordinates"
OFF
)
# This option does not work on macOS, since there is no /proc
option(
WITH_PROCPS
"Use procps for memory profiling"
OFF
)
option(
CPPDEBUG
"Enable debugging of C++ STL (does not imply DEBUG)"
OFF
)
option(
PERFORMANCE
"Enable link-time and aggressive optimizations"
OFF
)
option(
USE_ASM
"Use architecture-specific optimized assembly code"
ON
)
option(
IS_LIBFF_PARENT
"Install submodule dependencies if caller originates from here"
ON
)
if(CMAKE_COMPILER_IS_GNUCXX OR "${CMAKE_CXX_COMPILER_ID}" STREQUAL "Clang")
# Common compilation flags and warning configuration
set(
CMAKE_CXX_FLAGS
"${CMAKE_CXX_FLAGS} -std=c++11 -Wall -Wextra -Wfatal-errors"
)
if("${MULTICORE}")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fopenmp")
endif()
endif()
find_path(GMP_INCLUDE_DIR NAMES gmp.h)
find_library(GMP_LIBRARY gmp)
if(GMP_LIBRARY MATCHES ${CMAKE_SHARED_LIBRARY_SUFFIX})
set(gmp_library_type SHARED)
else()
set(gmp_library_type STATIC)
endif()
message(STATUS "GMP: ${GMP_LIBRARY}, ${GMP_INCLUDE_DIR}")
add_library(GMP::gmp ${gmp_library_type} IMPORTED)
set_target_properties(
GMP::gmp PROPERTIES
IMPORTED_LOCATION ${GMP_LIBRARY}
INTERFACE_INCLUDE_DIRECTORIES ${GMP_INCLUDE_DIR}
)
if("${WITH_PROCPS}")
if(APPLE)
message(WARNING "WITH_PROCPS must be set to OFF for macOS build")
endif()
include(FindPkgConfig)
pkg_check_modules(
PROCPS
REQUIRED
libprocps
)
else()
add_definitions(
-DNO_PROCPS
)
endif()
add_definitions(
-DCURVE_${CURVE}
)
enable_testing()
if(${CURVE} STREQUAL "BN128")
include_directories(depends)
add_definitions(
-DBN_SUPPORT_SNARK=1
)
endif()
if("${DEBUG}")
add_definitions(-DDEBUG=1)
endif()
if("${LOWMEM}")
add_definitions(-DLOWMEM=1)
endif()
if("${MULTICORE}")
add_definitions(-DMULTICORE=1)
endif()
if("${BINARY_OUTPUT}")
add_definitions(-DBINARY_OUTPUT)
endif()
if("${MONTGOMERY_OUTPUT}")
add_definitions(-DMONTGOMERY_OUTPUT)
endif()
if(NOT "${USE_PT_COMPRESSION}")
add_definitions(-DNO_PT_COMPRESSION=1)
endif()
if("${PROFILE_OP_COUNTS}")
add_definitions(-DPROFILE_OP_COUNTS=1)
endif()
if("${USE_MIXED_ADDITION}")
add_definitions(-DUSE_MIXED_ADDITION=1)
endif()
if("${CPPDEBUG}")
add_definitions(-D_GLIBCXX_DEBUG -D_GLIBCXX_DEBUG_PEDANTIC)
endif()
if("${PERFORMANCE}")
add_definitions(-DNDEBUG)
set(
CMAKE_CXX_FLAGS
"${CMAKE_CXX_FLAGS} -flto -fuse-linker-plugin"
)
set(
CMAKE_EXE_LINKER_FLAGS
"${CMAKE_EXE_LINKER_FLAGS} -flto"
)
endif()
if("${USE_ASM}")
add_definitions(-DUSE_ASM)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -mpclmul -msse4.1") # Used for binary fields.
endif()
# Configure CCache if available
find_program(CCACHE_FOUND ccache)
if(CCACHE_FOUND)
set_property(GLOBAL PROPERTY RULE_LAUNCH_COMPILE ccache)
set_property(GLOBAL PROPERTY RULE_LAUNCH_LINK ccache)
endif(CCACHE_FOUND)
if ("${IS_LIBFF_PARENT}")
find_program(
MARKDOWN
markdown_py
DOC "Path to markdown_py binary"
)
if(MARKDOWN-NOTFOUND)
else()
add_custom_target(
doc
${MARKDOWN} -f ${CMAKE_CURRENT_BINARY_DIR}/README.html -x toc -x extra --noisy ${CMAKE_CURRENT_SOURCE_DIR}/README.md
WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}
COMMENT "Translating from markdown to HTML" VERBATIM
)
endif()
# Add a `make check` target that builds and tests
add_custom_target(check COMMAND ${CMAKE_CTEST_COMMAND})
# Add a `make profile` target that builds and profiles
add_custom_target(
profile
COMMAND ${CMAKE_COMMAND}
-E echo 'Built target finished'
)
add_subdirectory(depends)
# Add a 'make clang-tidy' target that runs clang-tidy using checks specified in .clang-tidy
include(clang-tidy.cmake)
endif()
add_subdirectory(libff)

72
external/libff/CONTRIBUTING.md vendored Normal file
View File

@@ -0,0 +1,72 @@
# Contributing
Thank you for considering making contributions to libff!
Contributing to this repo can be done in several forms, such as participating in discussion or proposing code changes.
To ensure a smooth workflow for all contributors, the following general procedure for contributing has been established:
1) Either open or find an issue you'd like to help with
2) Participate in thoughtful discussion on that issue
3) If you would like to contribute:
* If the issue is a feature proposal, ensure that the proposal has been accepted
* Ensure that nobody else has already begun working on this issue.
If they have, please try to contact them to collaborate
* If nobody has been assigned for the issue and you would like to work on it, make a comment on the issue to inform the community of your intentions to begin work. (So we can avoid duplication of efforts)
* We suggest using standard Github best practices for contributing: fork the repo, branch from the HEAD of develop, make some commits on your branch, and submit a PR from the branch to develop.
More detail on this is below
* Be sure to include a relevant change log entry in the Pending section of CHANGELOG.md (see file for log format)
Note that for very small or clear problems (such as typos), or well isolated improvements, it is not required to open an issue to submit a PR. But be aware that for more complex problems/features touching multiple parts of the codebase, if a PR is opened before an adequate design discussion has taken place in a GitHub issue, that PR runs a higher risk of being rejected.
Looking for a good place to start contributing? How about checking out some ["good first issues"](https://github.com/scipr-lab/libff/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22)
## Branch Structure
Libff's default branch is `develop`, which is where PRs are merged. The `master` branch should only contain code that is part of a release. Releases will be made periodically. All other branches should be assumed to be miscellaneous feature development branches.
All downstream users should be using tagged versions of the library.
## How to work on a fork
Please skip this section if you're familiar with contributing to open-source github projects.
First, fork the repo from the github UI, and clone your fork locally (we denote
by `<path-to-libff>` the path to your `libff` fork on your machine).
After cloning your fork on your machine, it may be useful to add the upstream
project as a git remote (to periodically update your fork for instance).
You can do so by running the following command:
```bash
# Go to your fork cloned repository (replacing <path-to-libff> by the appropriate path)
cd <path-to-libff>
# Add the upstream project to your remotes
git remote add upstream git@github.com:scipr-lab/libff.git
# (Optional) Check the addition by inspecting your remotes
git remote -vv
```
Then the way you make code contributions is to first think of a branch name that describes your change.
Then do the following:
```bash
# Make sure your "develop" branch is up to date with the upstream repository
git pull upstream develop
# Create a branch for your contribution (replacing <your-branch-name> by the appropriate value)
git checkout -b <your-branch-name>
```
and then work as normal on that branch, PR'ing to upstream `develop` when you're done =)
## Updating documentation
All PRs should aim to leave the code more documented than it started with.
Please don't assume that it's easy to infer what the code is doing,
as that is usually not the case for these complex protocols unless one has been working with them recently.
(Even if you understand the paper!)
It's often very useful to describe the high-level view of what a code block is doing,
and either refer to the relevant section of a paper or include a short proof/argument for why it makes sense before the actual logic.
## Performance improvements
All performance improvements should be accompanied by improved benchmark results, or otherwise make it clear that things have improved.
For some areas of the codebase, performance roughly follows the number of field multiplications, but there are also many areas where
low level system effects such as cache locality and superscalar operations become important for performance.
Thus performance can often become very non-intuitive / diverge from minimizing the number of arithmetic operations.

BIN
external/libff/LICENSE vendored Executable file

Binary file not shown.

137
external/libff/README.md vendored Normal file
View File

@@ -0,0 +1,137 @@
<h1 align="center">libff</h1>
<h4 align="center">C++ library for Finite Fields and Elliptic Curves</h4>
___libff___ is a C++ library for finite fields and elliptic curves. The library is developed by [SCIPR Lab] and contributors (see [AUTHORS] file) and is released under the MIT License (see [LICENSE] file).
## Table of contents
- [Directory structure](#directory-structure)
- [Elliptic curve choices](#elliptic-curve-choices)
- [Build guide](#build-guide)
## Directory structure
The directory structure is as follows:
* [__libff__](libff): C++ source code, containing the following modules:
* [__algebra__](libff/algebra): fields and elliptic curve groups
* [__common__](libff/common): miscellaneous utilities
* [__depends__](depends): dependency libraries
## Elliptic curve choices
The libff library currently provides three options:
* `edwards`:
an instantiation based on an Edwards curve, providing 80 bits of security.
* `bn128`:
an instantiation based on a Barreto-Naehrig curve, providing 128
bits of security. The underlying curve implementation is
\[ate-pairing], which has incorporated our patch that changes the
BN curve to one suitable for SNARK applications.
* This implementation uses dynamically-generated machine code for the curve
arithmetic. Some modern systems disallow execution of code on the heap, and
will thus block this implementation.
For example, on Fedora 20 at its default settings, you will get the error
`zmInit ERR:can't protect` when running this code. To solve this,
run `sudo setsebool -P allow_execheap 1` to allow execution,
or use `make CURVE=ALT_BN128` instead.
* `alt_bn128`:
an alternative to `bn128`, somewhat slower but avoids dynamic code generation.
Note that `bn128` requires an x86-64 CPU while the other curve choices
should be architecture-independent.
## Build guide
The library has the following dependencies:
* [Boost](http://www.boost.org/)
* [CMake](http://cmake.org/)
* [GMP](http://gmplib.org/)
* [libsodium](https://libsodium.gitbook.io/doc/)
* [libprocps](http://packages.ubuntu.com/trusty/libprocps-dev) (turned off by default)
The library has been tested on Linux, but it is compatible with Windows and MacOS.
### Installation
On Ubuntu 14.04 LTS:
```
sudo apt-get install build-essential git libboost-all-dev cmake libgmp3-dev libssl-dev libprocps3-dev pkg-config libsodium-dev
```
On MacOS, all of the libraries from the previous section can be installed with `brew`, except for `libprocps`, which is turned off by default.
Fetch dependencies from their GitHub repos:
```
git submodule init && git submodule update
```
### Compilation
To compile, starting at the project root directory, create the build directory and Makefile:
```
mkdir build && cd build
cmake ..
```
If you are on macOS, change the cmake command to be
```
cmake .. -DOPENSSL_ROOT_DIR=$(brew --prefix openssl)
```
Other build flags include:
| Flag | Value | Description |
| ---- | ----- | ----------- |
| CMAKE_INSTALL_PREFIX | (your path) | Specifies the desired install location. |
| CMAKE_BUILD_TYPE | Debug | Enables asserts. Note that tests now use gtest instead of asserts. |
| WITH_PROCPS | ON | Enables `libprocps`, which is by default turned off since it is not supported on some systems such as MacOS. |
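As an illustrative sketch, a configure invocation combining these flags might look like the following (install path hypothetical):

```
cmake .. -DCMAKE_INSTALL_PREFIX=/install/path -DCMAKE_BUILD_TYPE=Debug -DWITH_PROCPS=OFF
```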
Then, to compile and install the library, run this within the build directory:
```
make
make install
```
This will install `libff.a` into `/install/path/lib`; so your application should be linked using `-L/install/path/lib -lff`. It also installs the requisite headers into `/install/path/include`; so your application should be compiled using `-I/install/path/include`.
## Testing
To build and execute the tests for this library, run:
```
make check
```
## Code formatting and linting
To run clang-tidy on this library, specify the variable `USE_CLANG_TIDY` (e.g. `cmake .. -D USE_CLANG_TIDY=ON`).
Then, run:
```
make clang-tidy
```
One can specify which clang-tidy checks to run and which files to run clang-tidy on using the `.clang-tidy` file in the root directory of the project.
## Profile
To compile the multi-exponentiation profiler in this library, run:
```
make profile
```
The resulting profiler is named `multiexp_profile` and can be found in the `libff` folder under the build directory.
[SCIPR Lab]: http://www.scipr-lab.org/ (Succinct Computational Integrity and Privacy Research Lab)
[LICENSE]: LICENSE (LICENSE file in top directory of libff distribution)
[AUTHORS]: AUTHORS (AUTHORS file in top directory of libff distribution)

30
external/libff/clang-tidy.cmake vendored Normal file
View File

@@ -0,0 +1,30 @@
option(
USE_CLANG_TIDY
"Use clang-tidy if the program is found."
OFF
)
if(USE_CLANG_TIDY)
find_program(CLANG_TIDY clang-tidy)
if(CLANG_TIDY)
file(DOWNLOAD
https://raw.githubusercontent.com/llvm-mirror/clang-tools-extra/master/clang-tidy/tool/run-clang-tidy.py
${PROJECT_BINARY_DIR}/run-clang-tidy.py
)
find_program(RUN_CLANG_TIDY run-clang-tidy.py)
if(RUN_CLANG_TIDY)
message("Using clang-tidy. Creating target... To run, use: make clang-tidy")
add_custom_target(
clang-tidy
COMMAND python3 run-clang-tidy.py ../libff/algebra ../libff/common -quiet 2>&1
WORKING_DIRECTORY ${PROJECT_BINARY_DIR}
)
else()
message(
FATAL_ERROR
"run-clang-tidy.py not found. (Download and place in PATH). Aborting...")
endif()
else()
message(FATAL_ERROR "clang-tidy not found. Aborting...")
endif()
endif()

13
external/libff/depends/CMakeLists.txt vendored Executable file
View File

@@ -0,0 +1,13 @@
add_subdirectory(gtest EXCLUDE_FROM_ALL)
if(${CURVE} STREQUAL "BN128")
include_directories(ate-pairing/include)
include_directories(xbyak)
add_library(
zm
STATIC
ate-pairing/src/zm.cpp
ate-pairing/src/zm2.cpp
)
endif()

View File

@@ -0,0 +1,7 @@
*~
\#*\#
.\#*
*.omc
.omakedb*
lib/
CVS

View File

@@ -0,0 +1,10 @@
all:
$(MAKE) -C src
$(MAKE) -C test
clean:
$(MAKE) -C src clean
$(MAKE) -C test clean
check:
$(MAKE) -C test check

View File

@@ -0,0 +1,38 @@
Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio Express 2012 for Windows Desktop
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "zmlib", "proj\11\zmlib\zmlib.vcxproj", "{2F9B80B9-D6A5-4534-94A3-7B42F5623193}"
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "sample", "proj\11\sample\sample.vcxproj", "{C8EC3A26-31ED-4764-A6F1-D5EBD2D09AFF}"
ProjectSection(ProjectDependencies) = postProject
{2F9B80B9-D6A5-4534-94A3-7B42F5623193} = {2F9B80B9-D6A5-4534-94A3-7B42F5623193}
EndProjectSection
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "test_bn", "proj\11\test_bn\test_bn.vcxproj", "{F4B8F9CD-2E24-4374-9BF6-6886347E861B}"
ProjectSection(ProjectDependencies) = postProject
{2F9B80B9-D6A5-4534-94A3-7B42F5623193} = {2F9B80B9-D6A5-4534-94A3-7B42F5623193}
EndProjectSection
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|x64 = Debug|x64
Release|x64 = Release|x64
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{2F9B80B9-D6A5-4534-94A3-7B42F5623193}.Debug|x64.ActiveCfg = Debug|x64
{2F9B80B9-D6A5-4534-94A3-7B42F5623193}.Debug|x64.Build.0 = Debug|x64
{2F9B80B9-D6A5-4534-94A3-7B42F5623193}.Release|x64.ActiveCfg = Release|x64
{2F9B80B9-D6A5-4534-94A3-7B42F5623193}.Release|x64.Build.0 = Release|x64
{C8EC3A26-31ED-4764-A6F1-D5EBD2D09AFF}.Debug|x64.ActiveCfg = Debug|x64
{C8EC3A26-31ED-4764-A6F1-D5EBD2D09AFF}.Debug|x64.Build.0 = Debug|x64
{C8EC3A26-31ED-4764-A6F1-D5EBD2D09AFF}.Release|x64.ActiveCfg = Release|x64
{C8EC3A26-31ED-4764-A6F1-D5EBD2D09AFF}.Release|x64.Build.0 = Release|x64
{F4B8F9CD-2E24-4374-9BF6-6886347E861B}.Debug|x64.ActiveCfg = Debug|x64
{F4B8F9CD-2E24-4374-9BF6-6886347E861B}.Debug|x64.Build.0 = Debug|x64
{F4B8F9CD-2E24-4374-9BF6-6886347E861B}.Release|x64.ActiveCfg = Release|x64
{F4B8F9CD-2E24-4374-9BF6-6886347E861B}.Release|x64.Build.0 = Release|x64
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
EndGlobal


@@ -0,0 +1,14 @@
<?xml version="1.0" encoding="shift_jis"?>
<VisualStudioPropertySheet
ProjectType="Visual C++"
Version="8.00"
Name="ate"
>
<Tool
Name="VCCLCompilerTool"
AdditionalIncludeDirectories="$(SolutionDir)../xbyak;$(SolutionDir)src;$(SolutionDir)include;$(SolutionDir)test"
PreprocessorDefinitions="WIN32;NOMINMAX;WIN32_LEAN_AND_MEAN"
WarningLevel="4"
DisableSpecificWarnings="4996"
/>
</VisualStudioPropertySheet>


@@ -0,0 +1,55 @@
# common definition for Makefile
# for GNU c++
#CCACHE=$(shell eval ls /usr/local/bin/ccache 2>/dev/null)
#CXX = g++
#CC = gcc
#LD = g++
CP = cp -f
AR = ar r
MKDIR=mkdir -p
RM=rm -f
CFLAGS = -fPIC -O3 -fomit-frame-pointer -DNDEBUG -msse2 -mfpmath=sse -march=native
CFLAGS_WARN=-Wall -Wextra -Wformat=2 -Wcast-qual -Wcast-align -Wwrite-strings -Wfloat-equal -Wpointer-arith #-Wswitch-enum -Wstrict-aliasing=2
CFLAGS_ALWAYS = -D_FILE_OFFSET_BITS=64 -DMIE_ATE_USE_GMP
LDFLAGS = -lm -lzm $(LIB_DIR) -lgmp -lgmpxx
AS = nasm
AFLAGS = -f elf -D__unix__
ifeq ($(SUPPORT_SNARK),1)
CFLAGS += -DBN_SUPPORT_SNARK
endif
ifneq ($(VUINT_BIT_LEN),)
CFLAGS += -D"MIE_ZM_VUINT_BIT_LEN=$(VUINT_BIT_LEN)"
endif
# for only 64-bit
BIT=-m64
#BIT=-m32
#ifeq ($(shell uname -s),x86_64)
#BIT=-m64
#endif
#ifeq ($(shell uname -s),Darwin)
#BIT=-m64
#endif
ifeq ($(shell uname -s),Cygwin)
# install mingw64-x86_64-gcc-g++
CXX=x86_64-w64-mingw32-g++
LD=x86_64-w64-mingw32-g++
AR=x86_64-w64-mingw32-ar r
#LDFLAGS+=-L/usr/x86_64-w64-mingw32/sys-root/mingw/lib
endif
ifeq ($(DBG),on)
CFLAGS += -O0 -g3 -UNDEBUG
LDFLAGS += -g3
endif
.SUFFIXES: .cpp
.cpp.o:
$(CXX) -c $< -o $@ $(CFLAGS) $(CFLAGS_WARN) $(CFLAGS_ALWAYS) $(INC_DIR) $(BIT)
.c.o:
$(CC) -c $< -o $@ $(CFLAGS) $(CFLAGS_WARN) $(CFLAGS_ALWAYS) $(INC_DIR) $(BIT)
INC_DIR+= -I../src -I../../xbyak -I../include
LIB_DIR+= -L../lib


@@ -0,0 +1,27 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ImportGroup Label="PropertySheets" />
<PropertyGroup Label="UserMacros" />
<PropertyGroup>
<OutDir>$(SolutionDir)bin\</OutDir>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<AdditionalIncludeDirectories>$(SolutionDir)../xbyak;$(SolutionDir)include</AdditionalIncludeDirectories>
</ClCompile>
</ItemDefinitionGroup>
<ItemDefinitionGroup>
<ClCompile>
<WarningLevel>Level4</WarningLevel>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
<PrecompiledHeaderFile />
<PrecompiledHeaderOutputFile />
<PreprocessorDefinitions>_MBCS;%(PreprocessorDefinitions);NOMINMAX</PreprocessorDefinitions>
<OpenMPSupport>false</OpenMPSupport>
</ClCompile>
<Link>
<AdditionalLibraryDirectories>$(SolutionDir)lib\</AdditionalLibraryDirectories>
</Link>
</ItemDefinitionGroup>
<ItemGroup />
</Project>


@@ -0,0 +1,18 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ImportGroup Label="PropertySheets" />
<PropertyGroup Label="UserMacros" />
<PropertyGroup>
<TargetName>$(ProjectName)d</TargetName>
</PropertyGroup>
<ItemDefinitionGroup>
<ClCompile>
<RuntimeLibrary>MultiThreadedDebugDLL</RuntimeLibrary>
</ClCompile>
</ItemDefinitionGroup>
<ItemGroup />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<LocalDebuggerWorkingDirectory>$(SolutionDir)bin</LocalDebuggerWorkingDirectory>
<DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
</PropertyGroup>
</Project>

File diff suppressed because it is too large


@@ -0,0 +1,169 @@
#pragma once
/**
@file
@brief measure exec time of function
@author MITSUNARI Shigeo
*/
#if defined(_MSC_VER) && (_MSC_VER <= 1500)
#include <cybozu/inttype.hpp>
#else
#include <stdint.h>
#endif
#include <stdio.h>
#if defined(_M_IX86) || defined(_M_X64) || defined(__i386__) || defined(__x86_64__)
#define CYBOZU_BENCH_USE_RDTSC
#endif
#ifdef CYBOZU_BENCH_USE_RDTSC
#ifdef _MSC_VER
#include <intrin.h>
#endif
#else
#include <cybozu/time.hpp>
#endif
namespace cybozu {
#ifdef CYBOZU_BENCH_USE_RDTSC
class CpuClock {
public:
static inline uint64_t getRdtsc()
{
#ifdef _MSC_VER
return __rdtsc();
#else
unsigned int eax, edx;
__asm__ volatile("rdtsc" : "=a"(eax), "=d"(edx));
return ((uint64_t)edx << 32) | eax;
#endif
}
CpuClock()
: clock_(0)
, count_(0)
{
}
void begin()
{
clock_ -= getRdtsc();
}
void end()
{
clock_ += getRdtsc();
count_++;
}
int getCount() const { return count_; }
uint64_t getClock() const { return clock_; }
void clear() { count_ = 0; clock_ = 0; }
void put(const char *msg = 0, int N = 1) const
{
double t = getClock() / double(getCount()) / N;
if (msg && *msg) printf("%s ", msg);
if (t > 1e6) {
printf("%7.3fMclk", t * 1e-6);
} else if (t > 1e3) {
printf("%7.3fKclk", t * 1e-3);
} else {
printf("%6.2f clk", t);
}
if (msg && *msg) printf("\n");
}
// ad hoc constants for CYBOZU_BENCH
static const int loopN1 = 1000;
static const int loopN2 = 1000000;
static const uint64_t maxClk = (uint64_t)3e8;
private:
uint64_t clock_;
int count_;
};
#else
class CpuClock {
cybozu::Time t_;
uint64_t clock_;
int count_;
public:
CpuClock() : clock_(0), count_(0) { t_.setTime(0, 0); }
void begin()
{
if (count_ == 0) t_.setCurrentTime(); // start
}
/*
@note QQQ ; this is not the same API as the rdtsc version
*/
void end()
{
cybozu::Time cur(true);
int diffSec = (int)(cur.getTime() - t_.getTime());
int diffMsec = cur.getMsec() - t_.getMsec();
const int diff = diffSec * 1000 + diffMsec;
clock_ = diff;
count_++;
}
int getCount() const { return count_; }
uint64_t getClock() const { return clock_; }
void clear() { t_.setTime(0, 0); clock_ = 0; count_ = 0; }
void put(const char *msg = 0, int N = 1) const
{
double t = getClock() / double(getCount()) / N;
if (msg && *msg) printf("%s ", msg);
if (t > 1) {
printf("%6.2fmsec", t);
} else if (t > 1e-3) {
printf("%6.2fusec", t * 1e3);
} else {
printf("%6.2fnsec", t * 1e6);
}
if (msg && *msg) printf("\n");
}
// ad hoc constants for CYBOZU_BENCH
static const int loopN1 = 1000000;
static const int loopN2 = 1000;
static const uint64_t maxClk = (uint64_t)500;
};
#endif
namespace bench {
static CpuClock g_clk;
#ifdef __GNUC__
#define CYBOZU_UNUSED __attribute__((unused))
#else
#define CYBOZU_UNUSED
#endif
static int CYBOZU_UNUSED g_loopNum;
} // cybozu::bench
/*
loop counter is automatically determined
CYBOZU_BENCH(<msg>, <func>, <param1>, <param2>, ...);
if msg == "" then only set g_clk, g_loopNum
*/
#define CYBOZU_BENCH(msg, func, ...) \
{ \
const uint64_t maxClk = cybozu::CpuClock::maxClk; \
cybozu::CpuClock clk; \
for (int i = 0; i < cybozu::CpuClock::loopN2; i++) { \
clk.begin(); \
for (int j = 0; j < cybozu::CpuClock::loopN1; j++) { func(__VA_ARGS__); } \
clk.end(); \
if (clk.getClock() > maxClk) break; \
} \
if (msg && *msg) clk.put(msg, cybozu::CpuClock::loopN1); \
cybozu::bench::g_clk = clk; cybozu::bench::g_loopNum = cybozu::CpuClock::loopN1; \
}
/*
loop counter N is given
CYBOZU_BENCH_C(<msg>, <counter>, <func>, <param1>, <param2>, ...);
if msg == "" then only set g_clk, g_loopNum
*/
#define CYBOZU_BENCH_C(msg, _N, func, ...) \
{ \
cybozu::CpuClock clk; \
clk.begin(); \
for (int j = 0; j < _N; j++) { func(__VA_ARGS__); } \
clk.end(); \
if (msg && *msg) clk.put(msg, _N); \
cybozu::bench::g_clk = clk; cybozu::bench::g_loopNum = _N; \
}
} // cybozu
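
A quick usage sketch for the two macros above (not part of the diff; it assumes the header is reachable on the include path under its upstream name `cybozu/benchmark.hpp`):

```
// bench_sample.cpp : exercise CYBOZU_BENCH / CYBOZU_BENCH_C
#include <cybozu/benchmark.hpp>
#include <math.h>

static void target(double x)
{
	volatile double y = sqrt(x); // volatile keeps the call from being optimized away
	(void)y;
}

int main()
{
	CYBOZU_BENCH("sqrt", target, 2.0);             // loop count chosen automatically
	CYBOZU_BENCH_C("sqrt x100", 100, target, 2.0); // fixed loop count
	return 0;
}
```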


@@ -0,0 +1,121 @@
#pragma once
/**
@file
@brief int type definition and macros
Copyright (C) 2008 Cybozu Labs, Inc., all rights reserved.
*/
#if defined(_MSC_VER) && (_MSC_VER <= 1500) && !defined(CYBOZU_DEFINED_INTXX)
#define CYBOZU_DEFINED_INTXX
typedef __int64 int64_t;
typedef unsigned __int64 uint64_t;
typedef unsigned int uint32_t;
typedef int int32_t;
typedef unsigned short uint16_t;
typedef short int16_t;
typedef unsigned char uint8_t;
typedef signed char int8_t;
#else
#include <stdint.h>
#endif
#ifdef _MSC_VER
#ifndef CYBOZU_DEFINED_SSIZE_T
#define CYBOZU_DEFINED_SSIZE_T
#ifdef _WIN64
typedef int64_t ssize_t;
#else
typedef int32_t ssize_t;
#endif
#endif
#else
#include <unistd.h> // for ssize_t
#endif
#ifndef CYBOZU_ALIGN
#ifdef _MSC_VER
#define CYBOZU_ALIGN(x) __declspec(align(x))
#else
#define CYBOZU_ALIGN(x) __attribute__((aligned(x)))
#endif
#endif
#ifndef CYBOZU_FORCE_INLINE
#ifdef _MSC_VER
#define CYBOZU_FORCE_INLINE __forceinline
#else
#define CYBOZU_FORCE_INLINE __attribute__((always_inline))
#endif
#endif
#ifndef CYBOZU_ALLOCA
#ifdef _MSC_VER
#include <malloc.h>
#define CYBOZU_ALLOCA(x) _malloca(x)
#else
#define CYBOZU_ALLOCA(x) __builtin_alloca(x)
#endif
#endif
#ifndef CYBOZU_NUM_OF_ARRAY
#define CYBOZU_NUM_OF_ARRAY(x) (sizeof(x) / sizeof(*x))
#endif
#ifndef CYBOZU_SNPRINTF
#if defined(_MSC_VER) && (_MSC_VER < 1900)
#define CYBOZU_SNPRINTF(x, len, ...) (void)_snprintf_s(x, len, len - 1, __VA_ARGS__)
#else
#define CYBOZU_SNPRINTF(x, len, ...) (void)snprintf(x, len, __VA_ARGS__)
#endif
#endif
#define CYBOZU_CPP_VERSION_CPP03 0
#define CYBOZU_CPP_VERSION_TR1 1
#define CYBOZU_CPP_VERSION_CPP11 2
#if (__cplusplus >= 201103) || (_MSC_VER >= 1500) || defined(__GXX_EXPERIMENTAL_CXX0X__)
#if defined(_MSC_VER) && (_MSC_VER <= 1600)
#define CYBOZU_CPP_VERSION CYBOZU_CPP_VERSION_TR1
#else
#define CYBOZU_CPP_VERSION CYBOZU_CPP_VERSION_CPP11
#endif
#elif (__GNUC__ >= 4 && __GNUC_MINOR__ >= 5) || (__clang_major__ >= 3)
#define CYBOZU_CPP_VERSION CYBOZU_CPP_VERSION_TR1
#else
#define CYBOZU_CPP_VERSION CYBOZU_CPP_VERSION_CPP03
#endif
#if (CYBOZU_CPP_VERSION == CYBOZU_CPP_VERSION_TR1)
#define CYBOZU_NAMESPACE_STD std::tr1
#define CYBOZU_NAMESPACE_TR1_BEGIN namespace tr1 {
#define CYBOZU_NAMESPACE_TR1_END }
#else
#define CYBOZU_NAMESPACE_STD std
#define CYBOZU_NAMESPACE_TR1_BEGIN
#define CYBOZU_NAMESPACE_TR1_END
#endif
#ifndef CYBOZU_OS_BIT
#if defined(_WIN64) || defined(__x86_64__)
#define CYBOZU_OS_BIT 64
#else
#define CYBOZU_OS_BIT 32
#endif
#endif
#ifndef CYBOZU_ENDIAN
#define CYBOZU_ENDIAN_UNKNOWN 0
#define CYBOZU_ENDIAN_LITTLE 1
#define CYBOZU_ENDIAN_BIG 2
#if defined(_M_IX86) || defined(_M_AMD64) || defined(__x86_64__) || defined(__i386__)
#define CYBOZU_ENDIAN CYBOZU_ENDIAN_LITTLE
#else
#define CYBOZU_ENDIAN CYBOZU_ENDIAN_UNKNOWN
#endif
#endif
namespace cybozu {
template<class T>
void disable_warning_unused_variable(const T&) { }
template<class T, class S>
T cast(const S* ptr) { return static_cast<T>(static_cast<const void*>(ptr)); }
template<class T, class S>
T cast(S* ptr) { return static_cast<T>(static_cast<void*>(ptr)); }
} // cybozu
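
A minimal example (not in the diff) showing the portability macros this header defines in use:

```
// inttype_sample.cpp
#include <cybozu/inttype.hpp>
#include <stdio.h>

int main()
{
	CYBOZU_ALIGN(16) char buf[64] = { 0 }; // 16-byte alignment on both MSVC and GCC/Clang
	int tbl[] = { 2, 3, 5, 7 };
	printf("entries=%d, %d-bit target\n", (int)CYBOZU_NUM_OF_ARRAY(tbl), (int)CYBOZU_OS_BIT);
	cybozu::disable_warning_unused_variable(buf);
	return 0;
}
```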


@@ -0,0 +1,2 @@
see
https://github.com/herumi/cybozulib/


@@ -0,0 +1,371 @@
#pragma once
/**
@file
@brief unit test class
Copyright (C) 2008 Cybozu Labs, Inc., all rights reserved.
*/
#include <stdio.h>
#include <string.h>
#include <string>
#include <list>
#include <iostream>
#include <utility>
#if defined(_MSC_VER) && (MSC_VER <= 1500)
#include <cybozu/inttype.hpp>
#else
#include <stdint.h>
#endif
namespace cybozu { namespace test {
class AutoRun {
typedef void (*Func)();
typedef std::list<std::pair<const char*, Func> > UnitTestList;
public:
AutoRun()
: init_(0)
, term_(0)
, okCount_(0)
, ngCount_(0)
, exceptionCount_(0)
{
}
void setup(Func init, Func term)
{
init_ = init;
term_ = term;
}
void append(const char *name, Func func)
{
list_.push_back(std::make_pair(name, func));
}
void set(bool isOK)
{
if (isOK) {
okCount_++;
} else {
ngCount_++;
}
}
std::string getBaseName(const std::string& name) const
{
#ifdef _WIN32
const char sep = '\\';
#else
const char sep = '/';
#endif
size_t pos = name.find_last_of(sep);
std::string ret = name.substr(pos + 1);
pos = ret.find('.');
return ret.substr(0, pos);
}
int run(int, char *argv[])
{
std::string msg;
try {
if (init_) init_();
for (UnitTestList::const_iterator i = list_.begin(), ie = list_.end(); i != ie; ++i) {
std::cout << "ctest:module=" << i->first << std::endl;
try {
(i->second)();
} catch (std::exception& e) {
exceptionCount_++;
std::cout << "ctest: " << i->first << " is stopped by exception " << e.what() << std::endl;
} catch (...) {
exceptionCount_++;
std::cout << "ctest: " << i->first << " is stopped by unknown exception" << std::endl;
}
}
if (term_) term_();
} catch (std::exception& e) {
msg = std::string("ctest:err:") + e.what();
} catch (...) {
msg = "ctest:err: catch unknown exception";
}
fflush(stdout);
if (msg.empty()) {
std::cout << "ctest:name=" << getBaseName(*argv)
<< ", module=" << list_.size()
<< ", total=" << (okCount_ + ngCount_ + exceptionCount_)
<< ", ok=" << okCount_
<< ", ng=" << ngCount_
<< ", exception=" << exceptionCount_ << std::endl;
return 0;
} else {
std::cout << msg << std::endl;
return 1;
}
}
static inline AutoRun& getInstance()
{
static AutoRun instance;
return instance;
}
private:
Func init_;
Func term_;
int okCount_;
int ngCount_;
int exceptionCount_;
UnitTestList list_;
};
static AutoRun& autoRun = AutoRun::getInstance();
inline void test(bool ret, const std::string& msg, const std::string& param, const char *file, int line)
{
autoRun.set(ret);
if (!ret) {
printf("%s(%d):ctest:%s(%s);\n", file, line, msg.c_str(), param.c_str());
}
}
template<typename T, typename U>
bool isEqual(const T& lhs, const U& rhs)
{
return lhs == rhs;
}
// avoid warning about comparison of integers of different signs
inline bool isEqual(size_t lhs, int rhs)
{
return lhs == size_t(rhs);
}
inline bool isEqual(int lhs, size_t rhs)
{
return size_t(lhs) == rhs;
}
inline bool isEqual(const char *lhs, const char *rhs)
{
return strcmp(lhs, rhs) == 0;
}
inline bool isEqual(char *lhs, const char *rhs)
{
return strcmp(lhs, rhs) == 0;
}
inline bool isEqual(const char *lhs, char *rhs)
{
return strcmp(lhs, rhs) == 0;
}
inline bool isEqual(char *lhs, char *rhs)
{
return strcmp(lhs, rhs) == 0;
}
// avoid comparing float directly
inline bool isEqual(float lhs, float rhs)
{
union fi {
float f;
uint32_t i;
} lfi, rfi;
lfi.f = lhs;
rfi.f = rhs;
return lfi.i == rfi.i;
}
// avoid comparing double directly
inline bool isEqual(double lhs, double rhs)
{
union di {
double d;
uint64_t i;
} ldi, rdi;
ldi.d = lhs;
rdi.d = rhs;
return ldi.i == rdi.i;
}
} } // cybozu::test
#ifndef CYBOZU_TEST_DISABLE_AUTO_RUN
int main(int argc, char *argv[])
{
return cybozu::test::autoRun.run(argc, argv);
}
#endif
/**
alert if !x
@param x [in]
*/
#define CYBOZU_TEST_ASSERT(x) cybozu::test::test(!!(x), "CYBOZU_TEST_ASSERT", #x, __FILE__, __LINE__)
/**
alert if x != y
@param x [in]
@param y [in]
*/
#define CYBOZU_TEST_EQUAL(x, y) { \
bool eq = cybozu::test::isEqual(x, y); \
cybozu::test::test(eq, "CYBOZU_TEST_EQUAL", #x ", " #y, __FILE__, __LINE__); \
if (!eq) { \
std::cout << "ctest: lhs=" << (x) << std::endl; \
std::cout << "ctest: rhs=" << (y) << std::endl; \
} \
}
/**
alert if fabs(x - y) >= eps
@param x [in]
@param y [in]
*/
#define CYBOZU_TEST_NEAR(x, y, eps) { \
bool isNear = fabs((x) - (y)) < eps; \
cybozu::test::test(isNear, "CYBOZU_TEST_NEAR", #x ", " #y, __FILE__, __LINE__); \
if (!isNear) { \
std::cout << "ctest: lhs=" << (x) << std::endl; \
std::cout << "ctest: rhs=" << (y) << std::endl; \
} \
}
#define CYBOZU_TEST_EQUAL_POINTER(x, y) { \
bool eq = x == y; \
cybozu::test::test(eq, "CYBOZU_TEST_EQUAL_POINTER", #x ", " #y, __FILE__, __LINE__); \
if (!eq) { \
std::cout << "ctest: lhs=" << static_cast<const void*>(x) << std::endl; \
std::cout << "ctest: rhs=" << static_cast<const void*>(y) << std::endl; \
} \
}
/**
alert if x[] != y[]
@param x [in]
@param y [in]
@param n [in]
*/
#define CYBOZU_TEST_EQUAL_ARRAY(x, y, n) { \
for (size_t i = 0, ie = (size_t)(n); i < ie; i++) { \
bool eq = cybozu::test::isEqual((x)[i], (y)[i]); \
cybozu::test::test(eq, "CYBOZU_TEST_EQUAL_ARRAY", #x ", " #y, __FILE__, __LINE__); \
if (!eq) { \
std::cout << "ctest: i=" << i << std::endl; \
std::cout << "ctest: lhs=" << (x)[i] << std::endl; \
std::cout << "ctest: rhs=" << (y)[i] << std::endl; \
} \
} \
}
/**
always alert
@param msg [in]
*/
#define CYBOZU_TEST_FAIL(msg) cybozu::test::test(false, "CYBOZU_TEST_FAIL", msg, __FILE__, __LINE__)
/**
verify message in exception
*/
#define CYBOZU_TEST_EXCEPTION_MESSAGE(statement, Exception, msg) \
{ \
int ret = 0; \
std::string errMsg; \
try { \
statement; \
ret = 1; \
} catch (const Exception& e) { \
errMsg = e.what(); \
if (errMsg.find(msg) == std::string::npos) { \
ret = 2; \
} \
} catch (...) { \
ret = 3; \
} \
if (ret) { \
cybozu::test::test(false, "CYBOZU_TEST_EXCEPTION_MESSAGE", #statement ", " #Exception ", " #msg, __FILE__, __LINE__); \
if (ret == 1) { \
std::cout << "ctest: no exception" << std::endl; \
} else if (ret == 2) { \
std::cout << "ctest: bad exception msg:" << errMsg << std::endl; \
} else { \
std::cout << "ctest: unexpected exception" << std::endl; \
} \
} else { \
cybozu::test::autoRun.set(true); \
} \
}
#define CYBOZU_TEST_EXCEPTION(statement, Exception) \
{ \
int ret = 0; \
try { \
statement; \
ret = 1; \
} catch (const Exception&) { \
} catch (...) { \
ret = 2; \
} \
if (ret) { \
cybozu::test::test(false, "CYBOZU_TEST_EXCEPTION", #statement ", " #Exception, __FILE__, __LINE__); \
if (ret == 1) { \
std::cout << "ctest: no exception" << std::endl; \
} else { \
std::cout << "ctest: unexpected exception" << std::endl; \
} \
} else { \
cybozu::test::autoRun.set(true); \
} \
}
/**
verify statement does not throw
*/
#define CYBOZU_TEST_NO_EXCEPTION(statement) \
try { \
statement; \
cybozu::test::autoRun.set(true); \
} catch (...) { \
cybozu::test::test(false, "CYBOZU_TEST_NO_EXCEPTION", #statement, __FILE__, __LINE__); \
}
/**
append auto unit test
@param name [in] module name
*/
#define CYBOZU_TEST_AUTO(name) \
void cybozu_test_ ## name(); \
struct cybozu_test_local_ ## name { \
cybozu_test_local_ ## name() \
{ \
cybozu::test::autoRun.append(#name, cybozu_test_ ## name); \
} \
} cybozu_test_local_instance_ ## name; \
void cybozu_test_ ## name()
/**
append auto unit test with fixture
@param name [in] module name
*/
#define CYBOZU_TEST_AUTO_WITH_FIXTURE(name, Fixture) \
void cybozu_test_ ## name(); \
void cybozu_test_real_ ## name() \
{ \
Fixture f; \
cybozu_test_ ## name(); \
} \
struct cybozu_test_local_ ## name { \
cybozu_test_local_ ## name() \
{ \
cybozu::test::autoRun.append(#name, cybozu_test_real_ ## name); \
} \
} cybozu_test_local_instance_ ## name; \
void cybozu_test_ ## name()
/**
setup fixture
@param Fixture [in] class name of fixture
@note the constructor of Fixture is called before the test and its destructor is called after the test
*/
#define CYBOZU_TEST_SETUP_FIXTURE(Fixture) \
Fixture *cybozu_test_local_fixture; \
void cybozu_test_local_init() \
{ \
cybozu_test_local_fixture = new Fixture(); \
} \
void cybozu_test_local_term() \
{ \
delete cybozu_test_local_fixture; \
} \
struct cybozu_test_local_fixture_setup_ { \
cybozu_test_local_fixture_setup_() \
{ \
cybozu::test::autoRun.setup(cybozu_test_local_init, cybozu_test_local_term); \
} \
} cybozu_test_local_fixture_setup_instance_;
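
Because the header supplies `main()` whenever `CYBOZU_TEST_DISABLE_AUTO_RUN` is not defined, a single translation unit is already a complete, self-running test binary. A small sketch (not part of the diff):

```
// test_sample.cpp : compiles to a self-running test program
#include <cybozu/test.hpp>
#include <math.h>
#include <stdexcept>

CYBOZU_TEST_AUTO(arithmetic)
{
	CYBOZU_TEST_ASSERT(1 + 1 == 2);
	CYBOZU_TEST_EQUAL(6 * 7, 42);
	CYBOZU_TEST_NEAR(3.14159, 3.1416, 1e-4);
	CYBOZU_TEST_EXCEPTION(throw std::runtime_error("boom"), std::runtime_error);
}
```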

File diff suppressed because it is too large


@@ -0,0 +1,432 @@
#pragma once
/**
Fp : finite field with a 254-bit prime characteristic
t = - 2^62 - 2^55 + 2^0
p = 36*t*t*t*t + 36*t*t*t + 24*t*t + 6*t + 1
*/
#include "zm.h"
#ifdef MIE_ATE_USE_GMP
#include <gmpxx.h>
#endif
namespace mie {
class Fp : public local::addsubmul<Fp,
local::comparable<Fp,
local::hasNegative<Fp,
local::inversible<Fp> > > > {
public:
typedef mie::Unit value_type;
/*
double size of Fp
*/
enum {
N = 32 / sizeof(Unit)
};
Fp()
{
}
MIE_FORCE_INLINE Fp(int x)
{
set(x);
}
MIE_FORCE_INLINE explicit Fp(const std::string& str)
{
set(str);
}
MIE_FORCE_INLINE explicit Fp(const mie::Unit *x)
{
std::copy(x, x + N, v_);
}
Fp(const mie::Vuint& rhs)
{
set(rhs);
}
void set(int x)
{
if (x == 0) {
clear();
} else if (x == 1) {
const mie::Vuint& r = getMontgomeryR();
assert(r.size() == N);
std::copy(&r[0], &r[0] + N, v_);
} else if (x > 0) {
v_[0] = x;
std::fill(v_ + 1, v_ + N, 0);
mul(*this, *this, montgomeryR2_);
} else {
v_[0] = -x;
std::fill(v_ + 1, v_ + N, 0);
mul(*this, *this, montgomeryR2_);
neg(*this, *this);
}
}
void set(const std::string& str)
{
set(mie::Vuint(str));
}
void set(const mie::Vuint& x)
{
assert(x < getModulo());
mie::Vuint y(x);
// count++;std::cout << "count=" << count << ", x=" << x << std::endl;
y *= getMontgomeryR();
y %= getModulo();
setDirect(*this, y);
}
static inline int compare(const Fp& x, const Fp& y)
{
return mie::local::PrimitiveFunction::compare(&x[0], N, &y[0], N);
}
static void (*add)(Fp& out, const Fp& x, const Fp& y);
// add without mod
static void (*addNC)(Fp& out, const Fp& x, const Fp& y);
static void (*subNC)(Fp& out, const Fp& x, const Fp& y);
static void (*shr1)(Fp& out, const Fp& x);
static void (*shr2)(Fp& out, const Fp& x);
static void (*sub)(Fp& out, const Fp& x, const Fp& y);
static void (*neg)(Fp& out, const Fp& x);
static void (*mul)(Fp& out, const Fp& x, const Fp& y);
static int (*preInv)(Fp& r, const Fp& x);
/*
z = 3z + 2x
*/
static inline void _3z_add_2xC(Fp& z, const Fp& x)
{
addNC(z, z, x);
addNC(z, z, z);
addNC(z, z, x);
fast_modp(z);
}
/*
z = 2z + 3x
*/
static inline void _2z_add_3x(Fp& z, const Fp& x)
{
addNC(z, x, z);
addNC(z, z, z);
addNC(z, z, x);
fast_modp(z);
}
inline friend std::ostream& operator<<(std::ostream& os, const Fp& x)
{
return os << x.toString(os.flags() & std::ios_base::hex ? 16 : 10);
}
inline friend std::istream& operator>>(std::istream& is, Fp& x)
{
std::string str;
mie::local::getDigits(is, str);
x.set(str);
return is;
}
MIE_FORCE_INLINE bool isZero() const
{
Unit t = 0;
for (size_t i = 0; i < N; i++) {
t |= v_[i];
}
return t == 0;
}
MIE_FORCE_INLINE void clear()
{
std::fill(v_, v_ + N, 0);
}
static inline void fromMont(Fp& y, const Fp& x)
{
mul(y, x, one_);
}
static inline void toMont(Fp& y, const Fp& x)
{
mul(y, x, montgomeryR2_);
}
// return real low value
Unit getLow() const
{
Fp t;
fromMont(t, *this);
return t.v_[0];
}
bool isOdd() const
{
return (getLow() & 1) != 0;
}
mie::Vuint get() const
{
Fp t;
fromMont(t, *this);
mie::Vuint ret(t.v_, N);
return ret;
}
static inline void inv(Fp& out, const Fp& x)
{
#ifdef MIE_USE_X64ASM
Fp r;
int k = preInv(r, x);
#else
static const Fp p(&p_[0]);
Fp u, v, r, s;
u = p;
v = x;
r.clear();
s.clear(); s[0] = 1; // s is real 1
int k = 0;
while (!v.isZero()) {
if ((u[0] & 1) == 0) {
shr1(u, u);
addNC(s, s, s);
} else if ((v[0] & 1) == 0) {
shr1(v, v);
addNC(r, r, r);
} else if (v >= u) {
subNC(v, v, u);
addNC(s, s, r);
shr1(v, v);
addNC(r, r, r);
} else {
subNC(u, u, v);
addNC(r, r, s);
shr1(u, u);
addNC(s, s, s);
}
k++;
}
if (r >= p) {
subNC(r, r, p);
}
assert(!r.isZero());
subNC(r, p, r);
#endif
/*
xr = 2^k
R = 2^256
get r2^(-k)R^2 = r 2^(512 - k)
*/
mul(out, r, invTbl_[k]);
}
void inverse()
{
inv(*this, *this);
}
static inline void divBy2(Fp &z, const Fp &x)
{
unsigned int i = x[0] & 0x1;
shr1(z, x);
addNC(z, z, halfTbl_[i]);
}
static inline void divBy4(Fp &z, const Fp &x)
{
unsigned int i = x[0] & 0x3;
shr2(z, x);
addNC(z, z, quarterTbl_[i]);
}
/* z <- z mod p for z in [0, 6p] */
static inline void fast_modp(Fp &z)
{
uint64_t t = z.v_[3] >> 61;
z -= getDirectP((int)t);
}
template<class T>
static MIE_FORCE_INLINE void setDirect(Fp& out, const T& in)
{
const size_t n = in.size();
// assert(n <= N);
if (n < N) {
std::copy(&in[0], &in[0] + n, out.v_);
std::fill(out.v_ + n, out.v_ + N, 0);
} else {
// ignore in[i] for i >= N
std::copy(&in[0], &in[0] + N, out.v_);
}
}
std::string toString(int base = 10) const { return get().toString(base); }
MIE_FORCE_INLINE const Unit& operator[](size_t i) const { return v_[i]; }
MIE_FORCE_INLINE Unit& operator[](size_t i) { return v_[i]; }
MIE_FORCE_INLINE size_t size() const { return N; }
static void setModulo(const mie::Vuint& p, int mode, bool useMulx = true, bool definedBN_SUPPORT_SNARK =
#ifdef BN_SUPPORT_SNARK
true
#else
false
#endif
);
static inline const mie::Vuint& getModulo() { return p_; }
static const Fp& getDirectP(int n); /* n = 0..6 */
static inline const mie::Vuint& getMontgomeryR() { return montgomeryR_; }
private:
MIE_ALIGN(16) Unit v_[N];
static mie::Vuint p_;
static mie::Fp invTbl_[512];
public:
static mie::Fp *halfTbl_; // [2] = [0, 1/2 mod p]
private:
static mie::Fp *quarterTbl_; // [4] = [0, 1/4, 2/4, 3/4]
static mie::Vuint montgomeryR_; // 1 = 1r
static mie::Vuint p_add1_div4_; // (p + 1) / 4
static mie::Fp montgomeryR2_; // m(x, r^2) = xr ; x -> xr
static mie::Fp one_; // 1
// m(xr, r^(-2)r) = xr^(-1) ; xr -> xr^(-1)
static void setTablesForDiv(const mie::Vuint& p);
public:
static inline void square(Fp& out, const Fp& x) { mul(out, x, x); }
#ifdef MIE_ATE_USE_GMP
static void toMpz(mpz_class& y, const Fp& x)
{
mpz_import(y.get_mpz_t(), N, -1, sizeof(Unit), 0, 0, x.v_);
}
static void fromMpz(Fp& y, const mpz_class& x)
{
size_t size;
mpz_export(y.v_, &size, -1, sizeof(Unit), 0, 0, x.get_mpz_t());
for (size_t i = size; i < N; i++) {
y.v_[i] = 0;
}
}
#endif
static bool squareRoot(Fp& y, const Fp& x)
{
Fp t;
t = mie::power(x, p_add1_div4_);
if (t * t != x) return false;
y = t;
return true;
}
struct Dbl : public local::addsubmul<Dbl,
local::comparable<Dbl,
local::hasNegative<Dbl> > > {
enum {
SIZE = sizeof(Unit) * N * 2
};
static MIE_FORCE_INLINE void setDirect(Dbl &out, const mie::Vuint &in)
{
const size_t n = in.size();
if (n < N * 2) {
std::copy(&in[0], &in[0] + n, out.v_);
std::fill(out.v_ + n, out.v_ + N * 2, 0);
} else {
// ignore in[i] for i >= N * 2
std::copy(&in[0], &in[0] + N * 2, out.v_);
}
}
static MIE_FORCE_INLINE void setDirect(Dbl &out, const std::string &in)
{
mie::Vuint t(in);
setDirect(out, t);
}
template<class T>
void setDirect(const T &in) { setDirect(*this, in); }
MIE_FORCE_INLINE void clear()
{
std::fill(v_, v_ + N * 2, 0);
}
Unit *ptr() { return v_; }
const Unit *const_ptr() const { return v_; }
mie::Vuint getDirect() const { return mie::Vuint(v_, N * 2); }
MIE_FORCE_INLINE const Unit& operator[](size_t i) const { return v_[i]; }
MIE_FORCE_INLINE Unit& operator[](size_t i) { return v_[i]; }
MIE_FORCE_INLINE size_t size() const { return N * 2; }
std::string toString(int base = 10) const
{
return ("Dbl(" + getDirect().toString(base) + ")");
}
friend inline std::ostream& operator<<(std::ostream& os, const Dbl& x)
{
return os << x.toString(os.flags() & std::ios_base::hex ? 16 : 10);
}
Dbl() {}
explicit Dbl(const Fp &x)
{
mul(*this, x, montgomeryR2_);
}
explicit Dbl(const std::string &str) { setDirect(*this, str); }
static inline int compare(const Dbl& x, const Dbl& y)
{
return mie::local::PrimitiveFunction::compare(&x[0], N * 2, &y[0], N * 2);
}
typedef void (uni_op)(Dbl &z, const Dbl &x);
typedef void (bin_op)(Dbl &z, const Dbl &x, const Dbl &y);
/*
z = (x + y) mod px
*/
static bin_op *add;
static bin_op *addNC;
static uni_op *neg;
/*
z = (x - y) mod px
*/
static bin_op *sub;
static bin_op *subNC;
static void subOpt1(Dbl &z, const Dbl &x, const Dbl &y)
{
assert(&z != &x);
assert(&z != &y);
addNC(z, x, pNTbl_[1]);
subNC(z, z, y);
}
/*
z = x * y
*/
static void (*mul)(Dbl &z, const Fp &x, const Fp &y);
/*
z = MontgomeryReduction(x)
*/
static void (*mod)(Fp &z, const Dbl &x);
/*
x <- x mod pN
*/
static Dbl *pNTbl_; // [4];
private:
MIE_ALIGN(16) Unit v_[N * 2];
};
};
namespace util {
template<>
struct IntTag<mie::Fp> {
typedef size_t value_type;
static inline value_type getBlock(const mie::Fp&, size_t)
{
err();
return 0;
}
static inline size_t getBlockSize(const mie::Fp&)
{
err();
return 0;
}
static inline void err()
{
printf("Use mie::Vuint intead of Fp for the 3rd parameter for ScalarMulti\n");
exit(1);
}
};
} // mie::util
} // mie
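
The class above keeps every element in Montgomery form: `set` multiplies by `montgomeryR2_` so the stored words are `x*R mod p`, `mul` is a multiply followed by a Montgomery reduction (see `Dbl::mod`), and `fromMont` multiplies by `one_` (plain 1) to divide the extra factor of R back out. A self-contained toy model of that scheme (not the library's code; a 61-bit modulus instead of 254 bits, and it assumes a compiler with `__int128`, e.g. GCC or Clang):

```
// montgomery_sketch.cpp : toy model of the x -> x*R mod p representation
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

typedef uint64_t u64;
typedef unsigned __int128 u128;

static const u64 P = (1ULL << 61) - 1; // toy odd prime modulus; R = 2^64

static u64 negInvP() // -P^{-1} mod 2^64 via Newton iteration
{
	u64 inv = 1;
	for (int i = 0; i < 6; i++) inv *= 2 - P * inv; // correct bits double each step
	return 0 - inv;
}
static const u64 NP = negInvP();

static u64 redc(u128 t) // t * R^{-1} mod P, valid for t < P*R
{
	u64 m = (u64)t * NP;                        // chosen so t + m*P == 0 (mod 2^64)
	u64 r = (u64)((t + (u128)m * P) >> 64);     // exact shift: low 64 bits cancel
	return r >= P ? r - P : r;
}
static u64 montMul(u64 a, u64 b) { return redc((u128)a * b); }

int main()
{
	u128 rModP = ((u128)1 << 64) % P;
	u64 R2 = (u64)(rModP * rModP % P);          // plays the role of montgomeryR2_
	u64 a = 123456789, b = 987654321;
	u64 aM = montMul(a, R2);                    // toMont: a -> a*R mod P
	u64 bM = montMul(b, R2);
	u64 c = redc(montMul(aM, bM));              // fromMont: one extra REDC by 1
	assert(c == (u64)((u128)a * b % P));
	printf("ok: %llu\n", (unsigned long long)c);
	return 0;
}
```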


@@ -0,0 +1,95 @@
import java.io.*;
import mcl.bn254.*;
public class BN254Test {
static {
System.loadLibrary("bn254_if_wrap");
}
public static void main(String argv[]) {
try {
BN254.SystemInit();
Fp aa = new Fp("12723517038133731887338407189719511622662176727675373276651903807414909099441");
Fp ab = new Fp("4168783608814932154536427934509895782246573715297911553964171371032945126671");
Fp ba = new Fp("13891744915211034074451795021214165905772212241412891944830863846330766296736");
Fp bb = new Fp("7937318970632701341203597196594272556916396164729705624521405069090520231616");
Ec1 g1 = new Ec1(new Fp(-1), new Fp(1));
Ec2 g2 = new Ec2(new Fp2(aa, ab), new Fp2(ba, bb));
System.out.println("g1=" + g1);
System.out.println("g2=" + g2);
assertBool("g1 is on EC", g1.isValid());
assertBool("g2 is on twist EC", g2.isValid());
Mpz r = BN254.GetParamR();
System.out.println("r=" + r);
{
Ec1 t = new Ec1(g1);
t.mul(r);
assertBool("orgder of g1 == r", t.isZero());
}
{
Ec2 t = new Ec2(g2);
t.mul(r);
assertBool("order of g2 == r", t.isZero());
}
Mpz a = new Mpz("123456789012345");
Mpz b = new Mpz("998752342342342342424242421");
// scalar-multiplication sample
{
Mpz c = new Mpz(a);
c.add(b);
Ec1 Pa = new Ec1(g1); Pa.mul(a);
Ec1 Pb = new Ec1(g1); Pb.mul(b);
Ec1 Pc = new Ec1(g1); Pc.mul(c);
Ec1 out = new Ec1(Pa);
out.add(Pb);
assertEqual("check g1 * c = g1 * a + g1 * b", Pc, out);
}
Fp12 e = new Fp12();
// calc e : G2 x G1 -> G3 pairing
e.pairing(g2, g1); // e = e(g2, g1)
System.out.println("e=" + e);
{
Fp12 t = new Fp12(e);
t.power(r);
assertEqual("order of e == r", t, new Fp12(1));
}
Ec2 g2a = new Ec2(g2);
g2a.mul(a);
Fp12 ea1 = new Fp12();
ea1.pairing(g2a, g1);
Fp12 ea2 = new Fp12(e);
ea2.power(a); // ea2 = e^a
assertEqual("e(g2 * a, g1) = e(g2, g1)^a", ea1, ea2);
Ec1 q1 = new Ec1(g1);
q1.mul(new Mpz(12345));
assertBool("q1 is on EC", q1.isValid());
Fp12 e1 = new Fp12();
Fp12 e2 = new Fp12();
e1.pairing(g2, g1); // e1 = e(g2, g1)
e2.pairing(g2, q1); // e2 = e(g2, q1)
Ec1 q2 = new Ec1(g1);
q2.add(q1);
e.pairing(g2, q2); // e = e(g2, q2)
e1.mul(e2);
assertEqual("e = e1 * e2", e, e1);
} catch (RuntimeException e) {
System.out.println("unknown exception :" + e);
}
}
public static void assertBool(String msg, Boolean b) {
if (b) {
System.out.println("OK : " + msg);
} else {
System.out.println("NG : " + msg);
}
}
public static void assertEqual(String msg, Object lhs, Object rhs) {
if (lhs.equals(rhs)) {
System.out.println("OK : " + msg);
} else {
System.out.println("NG : " + msg + ", lhs = " + lhs + ", rhs = " + rhs);
}
}
}


@@ -0,0 +1,48 @@
MODULE_NAME=bn254
JAVA_NAME=BN254
IF_NAME=$(MODULE_NAME)_if
WRAP_CXX=$(IF_NAME)_wrap.cxx
include ../common.mk
LDFLAGS=-lgmp -lgmpxx
ifeq ($(UNAME_S),Darwin)
JAVA_HOME=$(shell /usr/libexec/java_home)
JAVA_INC=$(addprefix -I$(JAVA_HOME)/include,/ /darwin)
LIB_SUF=dylib
else
JAVA_HOME=$(realpath $(dir $(realpath $(shell which javac)))..)
JAVA_INC=-I$(JAVA_HOME)/include
LIB_SUF=so
CFLAGS+=-z noexecstack
LIB+=-lrt
endif
CFLAGS+= -shared -fPIC $(JAVA_INC)
PACKAGE_NAME=mcl.$(MODULE_NAME)
PACKAGE_DIR=$(subst .,/,$(PACKAGE_NAME))
TARGET=../bin/lib$(IF_NAME)_wrap.$(LIB_SUF)
JAVA_EXE=cd ../bin && LD_LIBRARY_PATH=./:$(LD_LIBRARY_PATH) java -classpath ../java
all: $(TARGET)
$(IF_NAME)_wrap.cxx: $(IF_NAME).i $(IF_NAME).hpp
$(MKDIR) $(PACKAGE_DIR)
swig -java -package $(PACKAGE_NAME) -outdir $(PACKAGE_DIR) -c++ -Wall $<
$(TARGET): $(IF_NAME)_wrap.cxx
$(MKDIR) ../bin
$(PRE)$(CXX) $? -o $@ $(CFLAGS) $(LDFLAGS) -I../include ../src/zm2.cpp ../src/zm.cpp -I../../xbyak
%.class: %.java
javac $<
$(JAVA_NAME)Test.class: $(JAVA_NAME)Test.java $(TARGET)
jar:
jar cvf $(MODULE_NAME).jar mcl
test: $(JAVA_NAME)Test.class $(TARGET)
$(JAVA_EXE) $(JAVA_NAME)Test
clean:
rm -rf *.class $(TARGET) $(PACKAGE_DIR)/*.class $(IF_NAME)_wrap.cxx


@@ -0,0 +1,282 @@
#pragma once
/**
@file
@brief api for Java
@author herumi
@note modified new BSD license
http://opensource.org/licenses/BSD-3-Clause
*/
#ifndef MIE_ATE_USE_GMP
#define MIE_ATE_USE_GMP
#endif
#include "bn.h"
inline void SystemInit() throw(std::exception)
{
::bn::Param::init();
}
class Fp2;
class Fp12;
class Ec1;
class Ec2;
class Mpz {
mpz_class self_;
friend class Fp;
friend class Fp2;
friend class Fp12;
friend class Ec1;
friend class Ec2;
public:
Mpz() {}
Mpz(const Mpz& x) : self_(x.self_) {}
Mpz(int x) throw(std::exception) : self_(x) {}
Mpz(const std::string& str) throw(std::exception)
{
set(str);
}
void set(int x) throw(std::exception) { self_ = x; }
void set(const std::string& str) throw(std::exception)
{
self_.set_str(str, 0);
}
std::string toString() const throw(std::exception)
{
return self_.get_str();
}
bool equals(const Mpz& rhs) const { return self_ == rhs.self_; }
int compareTo(const Mpz& rhs) const { return mpz_cmp(self_.get_mpz_t(), rhs.self_.get_mpz_t()); }
void add(const Mpz& rhs) throw(std::exception) { self_ += rhs.self_; }
void sub(const Mpz& rhs) throw(std::exception) { self_ -= rhs.self_; }
void mul(const Mpz& rhs) throw(std::exception) { self_ *= rhs.self_; }
void mod(const Mpz& rhs) throw(std::exception) { self_ %= rhs.self_; }
};
class Fp {
::bn::Fp self_;
friend class Fp2;
friend class Ec1;
public:
Fp() {}
Fp(const Fp& x) : self_(x.self_) {}
Fp(int x) : self_(x) {}
Fp(const std::string& str) throw(std::exception)
{
self_.set(str);
}
void set(int x) { self_ = x; }
void set(const std::string& str) throw(std::exception)
{
self_.set(str);
}
std::string toString() const throw(std::exception)
{
return self_.toString();
}
bool equals(const Fp& rhs) const { return self_ == rhs.self_; }
void add(const Fp& rhs) throw(std::exception) { self_ += rhs.self_; }
void sub(const Fp& rhs) throw(std::exception) { self_ -= rhs.self_; }
void mul(const Fp& rhs) throw(std::exception) { self_ *= rhs.self_; }
void power(const Mpz& x)
{
self_ = mie::power(self_, x.self_);
}
};
class Fp2 {
::bn::Fp2 self_;
friend class Ec2;
public:
Fp2() {}
Fp2(const Fp2& x) : self_(x.self_) {}
Fp2(int a) : self_(a) {}
Fp2(int a, int b) : self_(a, b) {}
Fp2(const Fp& a, const Fp& b) throw(std::exception)
: self_(a.self_, b.self_)
{
}
Fp2(const std::string& a, const std::string& b) throw(std::exception)
: self_(Fp(a).self_, Fp(b).self_)
{
}
Fp& getA() { return *reinterpret_cast<Fp*>(&self_.a_); }
Fp& getB() { return *reinterpret_cast<Fp*>(&self_.b_); }
void set(const std::string& str) throw(std::exception)
{
self_.set(str);
}
std::string toString() const throw(std::exception)
{
return self_.toString();
}
bool equals(const Fp2& rhs) const { return self_ == rhs.self_; }
void add(const Fp2& rhs) throw(std::exception) { self_ += rhs.self_; }
void sub(const Fp2& rhs) throw(std::exception) { self_ -= rhs.self_; }
void mul(const Fp2& rhs) throw(std::exception) { self_ *= rhs.self_; }
void power(const Mpz& x)
{
self_ = mie::power(self_, x.self_);
}
};
class Fp12 {
::bn::Fp12 self_;
public:
Fp12() {}
Fp12(const Fp12& x) : self_(x.self_) {}
Fp12(int x) : self_(x) {}
void set(const std::string& str) throw(std::exception)
{
std::istringstream iss(str);
iss >> self_;
}
std::string toString() const throw(std::exception)
{
std::ostringstream oss;
oss << self_;
return oss.str();
}
bool equals(const Fp12& rhs) const { return self_ == rhs.self_; }
void add(const Fp12& rhs) throw(std::exception) { self_ += rhs.self_; }
void sub(const Fp12& rhs) throw(std::exception) { self_ -= rhs.self_; }
void mul(const Fp12& rhs) throw(std::exception) { self_ *= rhs.self_; }
void pairing(const Ec2& ec2, const Ec1& ec1);
void power(const Mpz& x)
{
self_ = mie::power(self_, x.self_);
}
};
class Ec1 {
::bn::Ec1 self_;
friend class Fp12;
public:
Ec1() { self_.clear(); }
Ec1(const Ec1& x) : self_(x.self_) {}
Ec1(const Fp& x, const Fp& y) throw(std::exception)
{
set(x, y);
}
Ec1(const Fp& x, const Fp& y, const Fp& z) throw(std::exception)
{
set(x, y, z);
}
bool isValid() const { return self_.isValid(); }
void set(const Fp& x, const Fp& y) throw(std::exception)
{
self_.set(x.self_, y.self_);
}
void set(const Fp& x, const Fp& y, const Fp& z) throw(std::exception)
{
self_.set(x.self_, y.self_, z.self_);
}
void set(const std::string& str) throw(std::exception)
{
std::istringstream iss(str);
iss >> self_;
}
std::string toString() const throw(std::exception)
{
std::ostringstream oss;
oss << self_;
return oss.str();
}
bool equals(const Ec1& rhs) const { return self_ == rhs.self_; }
bool isZero() const { return self_.isZero(); }
void clear() { self_.clear(); }
void dbl() { ::bn::Ec1::dbl(self_, self_); }
void neg() { ::bn::Ec1::neg(self_, self_); }
void add(const Ec1& rhs) { ::bn::Ec1::add(self_, self_, rhs.self_); }
void sub(const Ec1& rhs) { ::bn::Ec1::sub(self_, self_, rhs.self_); }
void mul(const Mpz& rhs) { ::bn::Ec1::mul(self_, self_, rhs.self_); }
Fp& getX() { return *reinterpret_cast<Fp*>(&self_.p[0]); }
Fp& getY() { return *reinterpret_cast<Fp*>(&self_.p[1]); }
Fp& getZ() { return *reinterpret_cast<Fp*>(&self_.p[2]); }
};
class Ec2 {
::bn::Ec2 self_;
friend class Fp12;
public:
Ec2() {}
Ec2(const Ec2& x) : self_(x.self_) {}
Ec2(const Fp2& x, const Fp2& y) throw(std::exception)
{
set(x, y);
}
Ec2(const Fp2& x, const Fp2& y, const Fp2& z) throw(std::exception)
{
set(x, y, z);
}
bool isValid() const { return self_.isValid(); }
void set(const Fp2& x, const Fp2& y) throw(std::exception)
{
self_.set(x.self_, y.self_);
}
void set(const Fp2& x, const Fp2& y, const Fp2& z) throw(std::exception)
{
self_.set(x.self_, y.self_, z.self_);
}
void set(const std::string& str) throw(std::exception)
{
std::istringstream iss(str);
iss >> self_;
}
std::string toString() const throw(std::exception)
{
std::ostringstream oss;
oss << self_;
return oss.str();
}
bool equals(const Ec2& rhs) const { return self_ == rhs.self_; }
bool isZero() const { return self_.isZero(); }
void clear() { self_.clear(); }
void dbl() { ::bn::Ec2::dbl(self_, self_); }
void neg() { ::bn::Ec2::neg(self_, self_); }
void add(const Ec2& rhs) { ::bn::Ec2::add(self_, self_, rhs.self_); }
void sub(const Ec2& rhs) { ::bn::Ec2::sub(self_, self_, rhs.self_); }
void mul(const Mpz& rhs) { ::bn::Ec2::mul(self_, self_, rhs.self_); }
Fp2& getX() { return *reinterpret_cast<Fp2*>(&self_.p[0]); }
Fp2& getY() { return *reinterpret_cast<Fp2*>(&self_.p[1]); }
Fp2& getZ() { return *reinterpret_cast<Fp2*>(&self_.p[2]); }
};
void Fp12::pairing(const Ec2& ec2, const Ec1& ec1)
{
::bn::opt_atePairing(self_, ec2.self_, ec1.self_);
}
inline const Mpz& GetParamR()
{
static Mpz r("16798108731015832284940804142231733909759579603404752749028378864165570215949");
return r;
}
#ifdef _MSC_VER
#if _MSC_VER == 1900
#ifdef _DEBUG
#pragma comment(lib, "14/mpird.lib")
#pragma comment(lib, "14/mpirxxd.lib")
#else
#pragma comment(lib, "14/mpir.lib")
#pragma comment(lib, "14/mpirxx.lib")
#endif
#elif _MSC_VER == 1800
#ifdef _DEBUG
#pragma comment(lib, "12/mpird.lib")
#pragma comment(lib, "12/mpirxxd.lib")
#else
#pragma comment(lib, "12/mpir.lib")
#pragma comment(lib, "12/mpirxx.lib")
#endif
#else
#ifdef _DEBUG
#pragma comment(lib, "mpird.lib")
#pragma comment(lib, "mpirxxd.lib")
#else
#pragma comment(lib, "mpir.lib")
#pragma comment(lib, "mpirxx.lib")
#endif
#endif
#endif


@@ -0,0 +1,59 @@
%module BN254
%include "std_string.i"
%include "std_except.i"
%{
#include "bn254_if.hpp"
%}
%typemap(javacode) Mpz %{
public boolean equals(Object obj) {
if (this == obj) return true;
if (!(obj instanceof $javaclassname)) return false;
return equals(($javaclassname)obj);
}
%}
%typemap(javacode) Fp %{
public boolean equals(Object obj) {
if (this == obj) return true;
if (!(obj instanceof $javaclassname)) return false;
return equals(($javaclassname)obj);
}
%}
%typemap(javacode) Fp2 %{
public boolean equals(Object obj) {
if (this == obj) return true;
if (!(obj instanceof $javaclassname)) return false;
return equals(($javaclassname)obj);
}
%}
%typemap(javacode) Fp12 %{
public boolean equals(Object obj) {
if (this == obj) return true;
if (!(obj instanceof $javaclassname)) return false;
return equals(($javaclassname)obj);
}
%}
%typemap(javacode) Ec1 %{
public boolean equals(Object obj) {
if (this == obj) return true;
if (!(obj instanceof $javaclassname)) return false;
return equals(($javaclassname)obj);
}
%}
%typemap(javacode) Ec2 %{
public boolean equals(Object obj) {
if (this == obj) return true;
if (!(obj instanceof $javaclassname)) return false;
return equals(($javaclassname)obj);
}
%}
%include "bn254_if.hpp"


@@ -0,0 +1,126 @@
# Java class files (under construction)
## Build
* Install [swig](http://www.swig.org/) and Java.
### Windows
* set SWIG to the path to swig in make_wrap.bat
* set JAVA_DIR to the path to java in set-java-path.bat.
* Use the following commands:
```
> cd java
> make_wrap.bat
```
* bin/bn254_if_wrap.dll is a DLL for Java.
### Linux
* set JAVA_INC to the path to Java in Makefile
> make test
* bin/libbn254_if_wrap.so is a shared library for Java.
## API and Class
### Setup
* First, call these functions:
```
> System.loadLibrary("bn254_if_wrap");
> BN254.SystemInit();
```
### class Mpz
* A wrapper class for GMP's mpz_class.
* Mpz(int), Mpz(String)
* void set(int x)
* Set x to this.
* void set(String x)
* Set x to this.
* void add(Mpz x)
* Set (this + x) to this.
* void sub(Mpz x)
* Set (this - x) to this.
* void mul(Mpz x)
* Set (this * x) to this.
* void mod(Mpz x)
* Set (this % x) to this.
### class Fp, Fp2, Fp12
* Wrapper classes for bn::Fp, bn::Fp2, and bn::Fp12.
#### common method
* Fp(int), Fp(String)
* void set(int x)
* Set x to this.
* void set(String x)
* Set x to this.
* The format of Fp is "123", "0xabc"
* The format of Fp2 is "[a,b]" ; a, b are the format of Fp
* The format of Fp12 is "[[[a0,b0],[a1,b1],[a2,b2]], [[a3,b3],[a4,b4],[a5,b5]]]"
* void add(Fp x)
* Set (this + x) to this.
* void sub(Fp x)
* Set (this - x) to this.
* void mul(Fp x)
* Set (this * x) to this.
* void power(Mpz x)
* Set (this ^ x) to this.
#### Fp2
* Fp2(int a, int b)
* Set (a, b) to this.
* Fp2(Fp a, Fp b)
* Set (a, b) to this.
* Fp2(String a, String b)
* Set (a, b) to this.
* Fp getA()
* Return the reference to a where this = (a, b).
* Fp getB()
* Return the reference to b where this = (a, b).
#### Fp12
* pairing(Ec2 ec2, Ec1 ec1)
* Set opt_ate_pairing(ec2, ec1) to this.
### Ec1, Ec2
* Wrapper classes for bn::Ec1 and bn::Ec2.
* Ec1(Fp x, Fp y)
* Set (x, y, 1) to this.
* Ec1(Fp x, Fp y, Fp z)
* Set (x:y:z) to this.
* Ec1(String x)
* Set x to this.
* The format of Ec1 is "x_y" or "0" ; x, y are the format of Fp. "0" is the infinity point.
* The format of Ec2 is "x_y" or "0" ; x, y are the format of Fp2.
* Boolean isValid()
* Is (x:y:z) on the curve?
* Boolean isZero()
* Is this equal to the infinity point?
* void clear()
* Set this to the infinity point.
* dbl()
* Set (this * 2) to this.
* neg()
* Set (-this) to this.
* add(Ec1 x)
* Set (this + x) to this.
* sub(Ec1 x)
* Set (this - x) to this.
* mul(Mpz& x)
* Set (this * x) to this.
* Fp getX()
* Return the value of x.
* Fp getY()
* Return the value of y.
* Fp getZ()
* Return the value of z.


@@ -0,0 +1,19 @@
@echo off
call set-java-path.bat
set JAVA_INCLUDE=%JAVA_DIR%\include
set SWIG=..\..\swig\swigwin-3.0.2\swig.exe
set PACKAGE_NAME=mcl.bn254
set PACKAGE_DIR=%PACKAGE_NAME:.=\%
echo [[run swig]]
mkdir %PACKAGE_DIR%
echo %SWIG% -java -package %PACKAGE_NAME% -outdir %PACKAGE_DIR% -c++ -Wall bn254_if.i
%SWIG% -java -package %PACKAGE_NAME% -outdir %PACKAGE_DIR% -c++ -Wall bn254_if.i
echo [[make dll]]
mkdir ..\bin
cl /MD /DNOMINMAX /DNDEBUG /LD /Ox /EHsc bn254_if_wrap.cxx ../src/zm.cpp ../src/zm2.cpp -I%JAVA_INCLUDE% -I%JAVA_INCLUDE%\win32 -I../include -I../../cybozulib_ext/mpir/include -I../../xbyak /link /LIBPATH:../../cybozulib_ext/mpir/lib /OUT:../bin/bn254_if_wrap.dll
call run-bn254.bat
echo [[make jar]]
%JAVA_DIR%\bin\jar cvf bn254.jar mcl


@@ -0,0 +1,8 @@
@echo off
echo [[compile BN254Test.java]]
%JAVA_DIR%\bin\javac BN254Test.java
echo [[run BN254Test]]
pushd ..\bin
%JAVA_DIR%\bin\java -classpath ..\java BN254Test %1 %2 %3 %4 %5 %6
popd


@@ -0,0 +1,8 @@
@echo off
if "%JAVA_HOME%"=="" (
set JAVA_DIR=c:/p/Java/jdk
) else (
set JAVA_DIR=%JAVA_HOME%
)
echo JAVA_DIR=%JAVA_DIR%
rem set PATH=%PATH%;%JAVA_DIR%\bin


@@ -0,0 +1,71 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|x64">
<Configuration>Debug</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|x64">
<Configuration>Release</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{C8EC3A26-31ED-4764-A6F1-D5EBD2D09AFF}</ProjectGuid>
<RootNamespace>sample</RootNamespace>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<PlatformToolset>v110</PlatformToolset>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<PlatformToolset>v110</PlatformToolset>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="$(SolutionDir)common.props" />
<Import Project="$(SolutionDir)debug.props" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="$(SolutionDir)common.props" />
<Import Project="$(SolutionDir)release.props" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<TargetName>sample</TargetName>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<TargetName>sample</TargetName>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<ClCompile>
<Optimization>Disabled</Optimization>
</ClCompile>
<Link>
<GenerateDebugInformation>true</GenerateDebugInformation>
<SubSystem>Console</SubSystem>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<Link>
<AdditionalDependencies>kernel32.lib;user32.lib;gdi32.lib;winspool.lib;comdlg32.lib;advapi32.lib;shell32.lib;ole32.lib;oleaut32.lib;uuid.lib;odbc32.lib;odbccp32.lib;%(AdditionalDependencies);zmlib.lib</AdditionalDependencies>
<SubSystem>Console</SubSystem>
</Link>
</ItemDefinitionGroup>
<ItemGroup>
<ClCompile Include="$(SolutionDir)test\sample.cpp" />
</ItemGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>


@@ -0,0 +1,71 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|x64">
<Configuration>Debug</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|x64">
<Configuration>Release</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{F4B8F9CD-2E24-4374-9BF6-6886347E861B}</ProjectGuid>
<RootNamespace>test_bn</RootNamespace>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<PlatformToolset>v110</PlatformToolset>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<PlatformToolset>v110</PlatformToolset>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="$(SolutionDir)common.props" />
<Import Project="$(SolutionDir)debug.props" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="$(SolutionDir)common.props" />
<Import Project="$(SolutionDir)release.props" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<TargetName>test_bn</TargetName>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<TargetName>test_bn</TargetName>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<ClCompile>
<Optimization>Disabled</Optimization>
</ClCompile>
<Link>
<GenerateDebugInformation>true</GenerateDebugInformation>
<SubSystem>Console</SubSystem>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<Link>
<AdditionalDependencies>kernel32.lib;user32.lib;gdi32.lib;winspool.lib;comdlg32.lib;advapi32.lib;shell32.lib;ole32.lib;oleaut32.lib;uuid.lib;odbc32.lib;odbccp32.lib;%(AdditionalDependencies);zmlib.lib</AdditionalDependencies>
<SubSystem>Console</SubSystem>
</Link>
</ItemDefinitionGroup>
<ItemGroup>
<ClCompile Include="$(SolutionDir)test\bn.cpp" />
</ItemGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>

Some files were not shown because too many files have changed in this diff.